Econometrics 1 lecture

(Data) A panel is called balanced if all micro-units (cross-sectional units) have measurements in all periods.
TRUE
(Components of the regression model) In the model: yi = β1 + β2xi + ei, the variable x can be called a dependent variable.
FALSE, In this model y is the dependent variable, while x is the independent (explanatory) variable.
(Components of the regression model) In the model: yi = β1 + β2xi + ei, β1 is the slope.
FALSE, β1 is the intercept in this model, while β2 is the slope.
(Assumptions of the regression model) Multicollinearity of explanatory variables is one of the assumptions underlying a multiple regression model.
FALSE, Multicollinearity is not an assumption of the regression model; the assumption is that there is no exact (perfect) collinearity among the explanatory variables. Strong but imperfect correlation between explanatory variables is a data problem, not a violated assumption.
(The Gauss-Markov theorem) The Gauss-Markov theorem states that the OLS estimator is best because, under specific assumptions, it is unbiased.
FALSE, The theorem says OLS is best in the sense of having the smallest variance among all linear unbiased estimators; it is not best because it is unbiased.
(Ordinary least squares) OLS estimates are selected in such a way that the sum of residuals is the smallest.
FALSE, OLS minimizes the sum of the squared residuals, not just the sum of the residuals.
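A minimal numeric sketch of this point, using NumPy and made-up data (all numbers hypothetical): with an intercept the plain sum of residuals is essentially zero, while the sum of squared residuals is smallest at the OLS estimates and rises for any perturbed coefficients.

```python
import numpy as np

# Hypothetical data, for illustration only
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(0, 1, size=50)

X = np.column_stack([np.ones_like(x), x])   # design matrix with an intercept column
b = np.linalg.lstsq(X, y, rcond=None)[0]    # OLS estimates (b1, b2)

resid = y - X @ b
print("sum of residuals (about 0 with an intercept):", resid.sum())
print("sum of squared residuals at the OLS estimates:", (resid ** 2).sum())

# Any other coefficient values give a larger sum of squared residuals
b_alt = b + np.array([0.1, -0.05])
print("sum of squared residuals at perturbed estimates:", ((y - X @ b_alt) ** 2).sum())
```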
(Coefficient of determination) If the model does not contain an intercept parameter, SST ≠ SSR+SSE.
TRUE
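For reference, the decomposition behind this card, written out as the standard textbook identities (not part of the original deck); with an intercept the residuals sum to zero and are uncorrelated with the fitted values, which is what makes SST split cleanly into SSR and SSE.

```latex
SST = \sum_i (y_i - \bar{y})^2, \qquad
SSR = \sum_i (\hat{y}_i - \bar{y})^2, \qquad
SSE = \sum_i \hat{e}_i^2, \qquad
SST = SSR + SSE \ \text{only if the model includes an intercept}
```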
(Statistical tests) The level of significance of a test is the probability of committing the error of rejecting a null hypothesis that is true.
TRUE
(t-tests) When testing the null hypothesis H0: βk = c against the alternative hypothesis H1: βk > c, you should reject the null hypothesis if the test statistic t ≤ t_(1-α; N-K).
FALSE, You reject the null hypothesis if the test statistic t is greater than or equal to the critical value t_(1-α; N-K) in this one-tailed (right-tail) test, not less than or equal to it.
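A small sketch of the decision rule with SciPy (all numbers hypothetical): for H0: βk = c against H1: βk > c the rejection region is in the right tail of the t distribution with N − K degrees of freedom.

```python
from scipy import stats

# Hypothetical numbers: N = 50 observations, K = 3 estimated coefficients
alpha, N, K = 0.05, 50, 3
t_crit = stats.t.ppf(1 - alpha, df=N - K)   # right-tail critical value t_(1-alpha; N-K)

t_stat = 2.10                               # hypothetical computed t statistic
reject = t_stat >= t_crit                   # reject H0 only when t is large and positive
print(f"critical value = {t_crit:.3f}, reject H0: {reject}")
```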
(Prediction) For a simple regression model: the variance of the forecast error depends on the variation in the explanatory variable.
TRUE
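The standard simple-regression formula behind this card (a textbook result, not from the deck), where x0 is the value of x used for the prediction; a larger spread of the xi makes the forecast-error variance smaller.

```latex
\operatorname{var}(f) = \sigma^2 \left[ 1 + \frac{1}{N} + \frac{(x_0 - \bar{x})^2}{\sum_i (x_i - \bar{x})^2} \right]
```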
(F-tests) In general, an F-test statistic value depends on restricted estimation results only.
FALSE, The F-test statistic depends on both restricted and unrestricted models since it compares the two.
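For reference, the usual F statistic for testing J restrictions, which uses the sums of squared errors from both the restricted and the unrestricted model:

```latex
F = \frac{(SSE_R - SSE_U)/J}{SSE_U/(N - K)}
```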
(Restricted estimation) The restricted least squares estimator stays unbiased, even if the constraints that are imposed are false.
FALSE, If the imposed constraints are incorrect, the estimator will generally be biased because the true model is mis-specified.
(Nonlinear models) In the log-log model the slope is constant.
FALSE, In a log-log model, the elasticity (percentage change in y with respect to percentage change in x) is constant, but the slope itself is not constant.
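A one-line derivation of the distinction (standard calculus, not from the deck): the slope dy/dx depends on the point (x, y), while the elasticity is the constant β2.

```latex
\ln y = \beta_1 + \beta_2 \ln x
\;\Rightarrow\;
\frac{dy}{dx} = \beta_2 \, \frac{y}{x}
\quad\text{(slope varies)}, \qquad
\frac{dy/y}{dx/x} = \beta_2
\quad\text{(constant elasticity)}
```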
(The Jarque-Bera test) The Jarque-Bera test statistic depends on skewness and kurtosis of the data.
TRUE
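The statistic itself (standard form), with S the skewness and K the kurtosis of the series being tested:

```latex
JB = \frac{N}{6} \left( S^2 + \frac{(K - 3)^2}{4} \right)
```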
(Specification errors) The omitted-variable bias occurs if the omitted variable is correlated with the variables included in the model.
TRUE
(Collinearity) One of the consequences of strong linear dependencies between explanatory variables is that the standard errors are small.
FALSE, Strong multicollinearity actually leads to inflated (large) standard errors, making it harder to detect significant relationships.
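The variance formula that explains the inflation (a standard result, not from the deck): R_k² is the R² from regressing x_k on the other explanatory variables, so strong collinearity pushes (1 − R_k²) toward zero and blows up the variance and standard error of b_k.

```latex
\operatorname{var}(b_k) = \frac{\sigma^2}{(1 - R_k^2) \sum_i (x_{ik} - \bar{x}_k)^2}
```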
(Heteroskedasticity) The Breusch-Pagan test uses a variance function including all explanatory variables from the model under investigation.
TRUE
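A minimal sketch with statsmodels (hypothetical data): het_breuschpagan regresses the squared OLS residuals on the supplied explanatory variables, here the same regressors (plus constant) as in the original model.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# Hypothetical data whose error variance grows with x
rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=200)
y = 1.0 + 0.8 * x + rng.normal(0, 0.3 * x)   # heteroskedastic errors

X = sm.add_constant(x)                       # constant plus the explanatory variable
res = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(res.resid, X)
print(f"LM statistic = {lm_stat:.2f}, p-value = {lm_pvalue:.4f}")
```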
(Dummy variables) A slope-indicator variable allows for a change in the intercept.
FALSE, A slope-indicator variable allows for a change in the slope, not the intercept. It interacts with an explanatory variable to change the slope for different groups.
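A compact way to see the distinction, with D a 0/1 indicator (notation assumed, not from the deck): the stand-alone indicator shifts the intercept (δ), while the slope-indicator (interaction) term shifts the slope (γ).

```latex
y_i = \beta_1 + \delta D_i + \beta_2 x_i + \gamma (D_i x_i) + e_i
```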
(Dummy variables) The value 0 for a dummy variable defines the reference group, or base group.
TRUE
(Autocorrelation) One consequence of autocorrelated errors is that the least squares estimator is no longer best.
TRUE
(Types of data) Annual profit for each of 400 randomly chosen micro enterprises from Poland for the year 2022 is an example of cross-sectional data.
TRUE
(Components of the regression model) The regressand can also be referred to as an explanatory variable.
FALSE, Regressand refers to the dependent variable, not the explanatory variable
(Components of the regression model) In the model: yi = β1 +β2xi +ei, β1 and β2 are random variables.
FALSE, β1 and β2 are parameters (fixed, unknown constants), not random variables.
(Assumptions of the regression model) Homoskedasticity of the error term is one of the assumptions underlying a multiple regression model.
TRUE
(The Gauss-Markov theorem) The Gauss-Markov theorem implies that the OLS estimator is better than any nonlinear unbiased estimator.
FALSE, The Gauss-Markov theorem only applies to linear unbiased estimators, and it does not state that OLS is better than any nonlinear estimator.
(Ordinary least squares) Standard errors are square roots of estimated variances of the OLS estimators.
TRUE
(Coefficient of determination) The value of R2 can decrease if we add an insignificant explanatory variable to the model.
FALSE, The value of R² cannot decrease when an explanatory variable is added to the model, even if it is insignificant.
(Confidence intervals) For a given dataset and model, a 99% interval estimate of a parameter of the model is wider than a 95% interval.
TRUE
(t-tests) Using a t-test we can test whether all the variables in the multiple regression model are jointly insignificant.
FALSE, A t-test tests the significance of individual variables, while an F-test is used to test whether all variables in the model are jointly insignificant
(Prediction) For a simple regression model: the variance of the forecast error depends on the value of the explanatory variable used to compute the prediction.
TRUE
(Testing) In an F-test a p-value of 0.02 leads to the rejection of the null hypothesis at 5% significance level.
TRUE
(Scaling the variables) In the simple regression model: if the scale of y and x is changed by the same factor then the estimated intercept will change.
TRUE
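A quick check of the scaling claim (standard algebra): multiplying the original equation yi = β1 + β2xi + ei through by the common factor c shows the slope is unchanged while the intercept is rescaled.

```latex
y_i^{*} = c\, y_i, \quad x_i^{*} = c\, x_i
\;\Rightarrow\;
y_i^{*} = c\,\beta_1 + \beta_2\, x_i^{*} + c\, e_i,
\qquad b_2^{*} = b_2, \quad b_1^{*} = c\, b_1
```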
(Nonlinear models) In the model ln(yi) = β1 + β2ln(xi) + ei, the parameter β2 is the elasticity of y with respect to x.
TRUE
(The Jarque-Bera test) The null hypothesis in the Jarque-Bera test concerns the normal distribution of the variable being tested.
TRUE
(Specification errors) Including some unnecessary regressors in the multiple regression model produces biased estimators of the coefficients of the regressors that belong in the equation.
FALSE, Adding unnecessary variables to a regression model increases the variance of the estimates but does not affect the accuracy (unbiasedness) of the estimates for the important variables already in the model.
(Multicollinearity) It is not possible to estimate the model by least squares when there is exact multicollinearity.
TRUE
(Model selection) The AIC would choose, from models with the same sum of squared residuals, the model with the smallest number of parameters.
TRUE, With the same sum of squared residuals (and the same sample size), the AIC penalty for extra parameters means the model with the fewest parameters has the smallest AIC, so it would be chosen.
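One common textbook form of the criterion (other equivalent forms exist): with SSE and N held fixed, AIC increases in the number of parameters K, so the model with the fewest parameters has the smallest AIC.

```latex
AIC = \ln\!\left( \frac{SSE}{N} \right) + \frac{2K}{N}
```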
(Heteroskedasticity) Heteroskedasticity tests include: the Breusch-Pagan test and the Durbin-Watson test.
FALSE, The Durbin-Watson test is for autocorrelation, not heteroskedasticity. Breusch-Pagan is a test for heteroskedasticity
(Heteroskedasticity) One consequence of heteroskedasticity is that the usual standard errors are incorrect and should not be used.
TRUE
(Dummy variables) A dummy variable trap means that the model cannot be estimated using ordinary least squares because of an incorrect use of indicator variables.
TRUE
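A minimal illustration of the trap, assuming two exhaustive 0/1 indicators D1 and D2 (e.g., one per group): included together with an intercept they are exactly collinear with the constant column, so X'X is singular and the OLS estimates cannot be computed.

```latex
D_{1i} + D_{2i} = 1 \ \ \forall i
\;\Rightarrow\;
\text{the columns } \mathbf{1},\ D_1,\ D_2 \text{ are linearly dependent and } X'X \text{ is singular}
```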
