Quiz 2
Let the joint distribution of positively valued random variables \(\left(X_1, Y\right)\) have the following properties for \(\alpha_0>0\) and \(\alpha_1>0\):
- \(\mathbb{E}\left(Y|X_1\right) = \alpha_0+\alpha_1 X_1\)
- \(\mathsf{Var}\left(Y|X_1\right) = \alpha_0+\alpha_1 X_1\)
A random sample \(\{\left(X_{1t},Y_t\right)\}_{t=1}^{n}\) of size \(n\) was drawn from the joint distribution described above.
Determine whether the following statements are True or False.
- Assumption 3.1 is satisfied.
- Assumption 3.2 is satisfied.
- Assumption 3.4 is satisfied.
- Assumption 3.5 is satisfied.
- The least squares estimator for the slope is an unbiased estimator for the regression slope.
Explanation:
First, note that because the sample \(\{\left(X_{1t},Y_t\right)\}_{t=1}^{n}\) is IID, observation \(t\) is independent of all other observations, so conditioning on the full set of regressors reduces to conditioning on \(X_{1t}\) alone:
\[
\begin{eqnarray}
\mathbb{E}\left(Y_t|X_{11},\ldots, X_{1n}\right) &=& \mathbb{E}\left(Y_t|X_{1t}\right) \\
\mathsf{Var}\left(Y_t|X_{11},\ldots, X_{1n}\right) &=& \mathsf{Var}\left(Y_t|X_{1t}\right)
\end{eqnarray}
\]
Second, note that from the given information, we also must have \[
\begin{eqnarray}
\mathbb{E}\left(Y_t|X_{1t}\right) &=& \alpha_0+\alpha_1 X_{1t}\\
\mathsf{Var}\left(Y_t|X_{1t}\right) &=& \alpha_0+\alpha_1 X_{1t}
\end{eqnarray}
\]
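These two points determine the key properties of the CEF error \(\varepsilon_t := Y_t-\mathbb{E}\left(Y_t|X_{1t}\right)\). Its conditional mean is zero by construction, and because \(\alpha_0+\alpha_1 X_{1t}\) is a function of \(X_{1t}\), subtracting it leaves the conditional variance unchanged:
\[
\begin{eqnarray}
\mathbb{E}\left(\varepsilon_t|X_{1t}\right) &=& \mathbb{E}\left(Y_t|X_{1t}\right)-\left(\alpha_0+\alpha_1 X_{1t}\right) = 0 \\
\mathsf{Var}\left(\varepsilon_t|X_{1t}\right) &=& \mathsf{Var}\left(Y_t|X_{1t}\right) = \alpha_0+\alpha_1 X_{1t}
\end{eqnarray}
\]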
With these facts in hand, we can now explain each answer:
- True. We can always decompose \(Y_t=\mathbb{E}\left(Y_t|X_{1t}\right)+\varepsilon_t=\alpha_0+\alpha_1 X_{1t}+\varepsilon_t\), where \(\varepsilon_t\) is the CEF error. Combined with IID sampling, this gives a model that is linear in the sense of Assumption 3.1.
- True. Because \(\varepsilon_t\) is a CEF error, \(\mathbb{E}\left(\varepsilon_t|X_{1t}\right)=0\), and IID sampling extends this to \(\mathbb{E}\left(\varepsilon_t|X_{11},\ldots,X_{1n}\right)=0\), so Assumption 3.2 holds.
- False. Since \(\mathsf{Var}\left(Y_t|X_{1t}\right)=\alpha_0+\alpha_1 X_{1t}\) and the conditional mean \(\alpha_0+\alpha_1 X_{1t}\) is a function of \(X_{1t}\), the derivation above gives \(\mathsf{Var}\left(\varepsilon_t|X_{1t}\right)=\alpha_0+\alpha_1 X_{1t}\). Because \(\alpha_1>0\), this conditional variance is not constant: it depends on the value of \(X_{1t}\), so Assumption 3.4 fails.
- False. Assumption 3.5 includes Assumption 3.4, so once Assumption 3.4 fails, Assumption 3.5 fails automatically. Beyond the nonconstant conditional variance, the given information also tells us nothing about the form of the distribution of \(\varepsilon_t|X_{1t}\).
- True. Assumptions 3.1 and 3.2 are satisfied, and provided the least squares estimator is unique, those two assumptions are enough for it to be unbiased for the regression slope; heteroskedasticity affects the estimator's variance, not its unbiasedness. See Theorem 3.4 in the textbook (or recall Lectures 6-8). The simulation sketch after this list illustrates the point.
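To make the last point concrete, here is a minimal Monte Carlo sketch (not part of the quiz). It assumes, purely for illustration, that \(Y_t|X_{1t}\sim\mathrm{Gamma}\left(\alpha_0+\alpha_1 X_{1t},\,1\right)\), one of many positively valued distributions whose conditional mean and variance both equal \(\alpha_0+\alpha_1 X_{1t}\); the particular values of \(\alpha_0\), \(\alpha_1\), the sample size, and the distribution of \(X_1\) are arbitrary choices. Across replications, the average OLS slope estimate should be close to \(\alpha_1\) despite the heteroskedasticity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values (not from the quiz).
alpha0, alpha1 = 1.0, 2.0
n, n_reps = 200, 5000

slope_estimates = np.empty(n_reps)
for r in range(n_reps):
    # Positively valued regressor; the exact distribution is an arbitrary choice.
    x = rng.uniform(0.5, 3.0, size=n)
    # Gamma(shape=lam, scale=1) has mean lam and variance lam, matching
    # E(Y|X1) = Var(Y|X1) = alpha0 + alpha1 * X1 from the quiz setup.
    lam = alpha0 + alpha1 * x
    y = rng.gamma(shape=lam, scale=1.0)
    # OLS slope for a simple regression: sample Cov(x, y) / Var(x).
    x_c = x - x.mean()
    slope_estimates[r] = (x_c @ (y - y.mean())) / (x_c @ x_c)

# The Monte Carlo average should be close to alpha1 = 2.0, illustrating
# unbiasedness even though the errors are heteroskedastic.
print(f"mean slope estimate: {slope_estimates.mean():.4f} (true slope: {alpha1})")
```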