Econometrics formula sheet, Cheat Sheet of Economics

Formula sheet covering data and causality, OLS formulas, the Gauss-Markov assumptions, statistics basics, hypothesis testing, and the classical linear model.

Document type: Cheat Sheet

2021/2022

Uploaded on 02/07/2022 by myboy


Econometrics Cheat Sheet

Data & Causality
Basics about data types and causality.

Types of data
• Experimental data: data from a randomized experiment.
• Observational data: data collected passively.
• Cross-sectional: multiple units, one point in time.
• Time series: single unit, multiple points in time.
• Longitudinal (or panel): multiple units followed over multiple time periods.

Experimental data
• Correlation ==> causality
• Very rare in the social sciences

Statistics basics
We examine a random sample of data to learn about the population.
• Random sample: representative of the population.
• Parameter ($\theta$): some number describing the population.
• Estimator of $\theta$: rule assigning a value of $\theta$ to each sample, e.g. the sample average $\bar{Y} = \frac{1}{N}\sum_{i=1}^{N} Y_i$.
• Estimate of $\theta$: what the estimator spits out for a particular sample ($\hat{\theta}$).
• Sampling distribution: distribution of estimates across all possible samples.
• Bias of estimator $W$: $E(W) - \theta$.
• Efficiency: $W$ is efficient relative to $\widetilde{W}$ if $\mathrm{Var}(W) < \mathrm{Var}(\widetilde{W})$.
• Consistency: $W$ is consistent if $\hat{\theta} \to \theta$ as $N \to \infty$.

Hypothesis testing
The way we answer yes/no questions about our population using a sample of data, e.g. "Does increasing public school spending increase student achievement?"
• Null hypothesis ($H_0$): typically $H_0: \theta = 0$.
• Alternative hypothesis ($H_a$): typically $H_a: \theta \neq 0$.
• Significance level ($\alpha$): tolerance for making a Type I error (e.g. 10%, 5%, or 1%).
• Test statistic ($T$): some function of the sample of data.
• Critical value ($c$): value such that we reject $H_0$ if $|T| > c$; $c$ depends on $\alpha$ and on whether the test is one- or two-sided.
• p-value: largest $\alpha$ at which we fail to reject $H_0$; reject $H_0$ if $p < \alpha$.

Simple Regression Model
Regression is useful because we can estimate a ceteris paribus relationship between some variable $x$ and our outcome $y$:
$y = \beta_0 + \beta_1 x + u$
We want to estimate $\beta_1$, which gives us the effect of $x$ on $y$.

OLS formulas
To estimate $\beta_0$ and $\beta_1$, we make two assumptions:
1. $E(u) = 0$
2. $E(u \mid x) = E(u)$ for all $x$
When these hold, we get the following formulas:
• $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$
• $\hat{\beta}_1 = \mathrm{Cov}(y, x) / \mathrm{Var}(x)$
• Fitted values ($\hat{y}_i$): $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$
• Residuals ($\hat{u}_i$): $\hat{u}_i = y_i - \hat{y}_i$
• Total sum of squares: $SST = \sum_{i=1}^{N} (y_i - \bar{y})^2$
• Explained sum of squares: $SSE = \sum_{i=1}^{N} (\hat{y}_i - \bar{y})^2$
• Residual sum of squares: $SSR = \sum_{i=1}^{N} \hat{u}_i^2$
• R-squared: $R^2 = SSE/SST$, the fraction of the variation in $y$ explained by $x$.

Algebraic properties of OLS estimates
• $\sum_{i=1}^{N} \hat{u}_i = 0$ (mean and sum of the residuals are zero)
• $\sum_{i=1}^{N} x_i \hat{u}_i = 0$ (zero covariance between $x$ and the residuals)
• The OLS line (SRF) always passes through $(\bar{x}, \bar{y})$
• $SSE + SSR = SST$
• $0 \le R^2 \le 1$

Interpretation and functional form
Our model is restricted to be linear in parameters, but not linear in $x$. Other functional forms can give a more realistic model.

Model        DV          RHS           Interpretation of $\beta_1$
Level-level  $y$         $x$           $\Delta y = \beta_1 \Delta x$
Level-log    $y$         $\log(x)$     $\Delta y = (\beta_1/100)\,[1\%\,\Delta x]$
Log-level    $\log(y)$   $x$           $\%\Delta y = (100\,\beta_1)\,\Delta x$
Log-log      $\log(y)$   $\log(x)$     $\%\Delta y = \beta_1\,\%\Delta x$
Quadratic    $y$         $x + x^2$     $\Delta y = (\beta_1 + 2\beta_2 x)\,\Delta x$
Note: DV = dependent variable; RHS = right-hand side.

Multiple Regression Model
Multiple regression is more useful than simple regression because we can more plausibly estimate ceteris paribus relationships (i.e. $E(u \mid x) = E(u)$ is more plausible):
$y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + u$
• $\hat{\beta}_1, \ldots, \hat{\beta}_k$: partial effect of each of the $x$'s on $y$
• $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}_1 - \cdots - \hat{\beta}_k \bar{x}_k$
• $\hat{\beta}_j = \mathrm{Cov}(y, \text{residualized } x_j) / \mathrm{Var}(\text{residualized } x_j)$
where "residualized $x_j$" means the residuals from an OLS regression of $x_j$ on all the other $x$'s (i.e. $x_1, \ldots, x_{j-1}, x_{j+1}, \ldots, x_k$).
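As a quick illustration of the OLS formulas above, here is a minimal sketch in Python (not part of the original sheet; the data and variable names are simulated and purely illustrative). It computes the simple-regression slope as $\mathrm{Cov}(y, x)/\mathrm{Var}(x)$ and then recovers a multiple-regression coefficient with the "residualized $x_j$" formula.

```python
# Minimal sketch of the simple-regression and residualized-x_j OLS formulas.
# All data here are simulated; variable names are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N = 500
x1 = rng.normal(size=N)
x2 = 0.5 * x1 + rng.normal(size=N)                   # regressor correlated with x1
y = 2.0 + 0.8 * x1 - 0.3 * x2 + rng.normal(size=N)   # true partial effect of x1 is 0.8

# Simple regression of y on x1: beta1_hat = Cov(y, x1) / Var(x1)
b1_simple = np.cov(y, x1, ddof=1)[0, 1] / np.var(x1, ddof=1)
b0_simple = y.mean() - b1_simple * x1.mean()

# Multiple regression coefficient on x1 via the "residualized x_j" formula:
# regress x1 on the other regressors (a constant and x2), keep the residuals,
# then apply the simple-regression slope formula to those residuals.
Z = np.column_stack([np.ones(N), x2])
gamma, *_ = np.linalg.lstsq(Z, x1, rcond=None)
x1_resid = x1 - Z @ gamma
b1_multiple = np.cov(y, x1_resid, ddof=1)[0, 1] / np.var(x1_resid, ddof=1)

print(b1_simple, b1_multiple)   # b1_multiple should be close to 0.8
```

The two slope estimates differ because the simple regression omits $x_2$, which is why the sheet describes the multiple-regression ceteris paribus interpretation as more plausible.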
Gauss-Markov Assumptions
1. $y$ is a linear function of the $\beta$'s
2. $y$ and the $x$'s are randomly sampled from the population
3. No perfect multicollinearity
4. $E(u \mid x_1, \ldots, x_k) = E(u) = 0$ (unconfoundedness)
5. $\mathrm{Var}(u \mid x_1, \ldots, x_k) = \mathrm{Var}(u) = \sigma^2$ (homoskedasticity)
When (1)-(4) hold: OLS is unbiased, i.e. $E(\hat{\beta}_j) = \beta_j$.
When (1)-(5) hold: OLS is the Best Linear Unbiased Estimator (BLUE).

Variance of u (a.k.a. "error variance")
$\hat{\sigma}^2 = \dfrac{SSR}{N - K - 1} = \dfrac{1}{N - K - 1} \sum_{i=1}^{N} \hat{u}_i^2$

Variance and Standard Error of $\hat{\beta}_j$
$\mathrm{Var}(\hat{\beta}_j) = \dfrac{\sigma^2}{SST_j (1 - R_j^2)}, \quad j = 1, 2, \ldots, k$
where
• $SST_j = (N - 1)\,\mathrm{Var}(x_j) = \sum_{i=1}^{N} (x_{ij} - \bar{x}_j)^2$
• $R_j^2$ is the $R^2$ from a regression of $x_j$ on all the other $x$'s
Standard deviation: $\sqrt{\mathrm{Var}}$
Standard error: $se(\hat{\beta}_j) = \sqrt{\dfrac{\hat{\sigma}^2}{SST_j (1 - R_j^2)}}, \quad j = 1, \ldots, k$

Classical Linear Model (CLM)
Add a 6th assumption to Gauss-Markov:
6. $u$ is distributed $N(0, \sigma^2)$
We need this to know the distribution of $\hat{\beta}_j$; otherwise we cannot conduct hypothesis tests about the $\beta$'s.

Testing Hypotheses about the $\beta$'s
Under assumptions (1)-(6), we can test hypotheses about the $\beta$'s.

t-test for simple hypotheses
To test a simple hypothesis like
$H_0: \beta_j = 0$
$H_a: \beta_j \neq 0$
use a t-test:
$t = \dfrac{\hat{\beta}_j - 0}{se(\hat{\beta}_j)}$
where 0 is the null hypothesized value. Reject $H_0$ if $p < \alpha$ or if $|t| > c$. (See: Hypothesis testing.)
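To connect the variance and t-test formulas above, here is another minimal sketch (again not from the original sheet; data and names are simulated and illustrative, and it assumes NumPy and SciPy are available). It computes $\hat{\sigma}^2 = SSR/(N - K - 1)$, $se(\hat{\beta}_j)$ from $SST_j$ and $R_j^2$, and the t-statistic and p-value for $H_0: \beta_j = 0$.

```python
# Minimal sketch of sigma^2_hat, se(beta_j_hat), and the t-test from this sheet.
# Simulated data; variable names are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N, k = 200, 2
X = rng.normal(size=(N, k))                            # two regressors x1, x2
y = 1.0 + 0.8 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(size=N)

Xc = np.column_stack([np.ones(N), X])                  # add an intercept column
beta_hat, *_ = np.linalg.lstsq(Xc, y, rcond=None)      # OLS estimates

u_hat = y - Xc @ beta_hat                              # residuals
sigma2_hat = (u_hat @ u_hat) / (N - k - 1)             # SSR / (N - K - 1)

j = 1                                                  # test the coefficient on x1
xj = X[:, j - 1]
others = np.column_stack([np.ones(N), np.delete(X, j - 1, axis=1)])
gamma, *_ = np.linalg.lstsq(others, xj, rcond=None)    # regress x_j on the other x's
SSTj = np.sum((xj - xj.mean()) ** 2)                   # SST_j
Rj2 = 1 - np.sum((xj - others @ gamma) ** 2) / SSTj    # R_j^2

se_j = np.sqrt(sigma2_hat / (SSTj * (1 - Rj2)))        # se(beta_j_hat)
t_stat = beta_hat[j] / se_j                            # t = (beta_j_hat - 0) / se
p_value = 2 * stats.t.sf(abs(t_stat), df=N - k - 1)    # two-sided p-value
print(beta_hat[j], se_j, t_stat, p_value)
```

In practice a package such as statsmodels reports these same coefficients, standard errors, t-statistics, and p-values; the sketch only shows that they follow directly from the formulas on this sheet.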