Suppose you are estimating parameters of the following regression model:
Ŷt = 9941 + 0.25 X2t + 15125 X3t
        (6114)   (0.121)    (7349)
R² = 0.87, RSS = 10310
(The figures in parentheses are the estimated standard errors; RSS is the residual sum of squares.)
(i) Comment on the signs of the variables in the model.
(ii) Interpret and explain individual coefficients.
(iii) Suppose X3 increases by 0.25; what is the expected impact of this change on Y?
(iv) Comment on the explanatory power of the regression.
(v) Using t-tests show whether individual coefficients are significantly different from zero at 5% level of significance.
(vi) Test whether the coefficient of X2 is significantly different from 1 at 5% level of significance.
(vii) Carry out an appropriate test to check if coefficients are jointly significant.
(i)
The signs of all the estimated coefficients are positive.
Conclusion: The positive signs indicate a positive relationship between the dependent variable and each of the independent variables. Since both slope coefficients are positive, Y is expected to increase when either X2 or X3 increases.
(ii)
From the regression equation:
b0 (intercept) = 9941,
b2 (coefficient of X2) = 0.25,
b3 (coefficient of X3) = 15125.
Conclusion:
Interpretation of b0: the estimated value of Y when X2 and X3 are both zero is 9941.
Interpretation of b2: for a one-unit increase in X2, Y is expected to increase by 0.25 units, holding X3 constant.
Interpretation of b3: for a one-unit increase in X3, Y is expected to increase by 15125 units, holding X2 constant.
(iii)
When X3 increases by 0.25, the expected change in Y is
0.25 × 15125 = 3781.25.
Conclusion: Therefore the expected change in Y is an increase of 3781.25, holding X2 constant.
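As a quick numerical check of (ii) and (iii), here is a minimal sketch in Python; the X2 and X3 values used below are hypothetical, chosen only to illustrate the calculation.

```python
# Minimal check of parts (ii)-(iii): the fitted equation as a function.

def y_hat(x2, x3):
    """Fitted regression: Y-hat = 9941 + 0.25*X2 + 15125*X3."""
    return 9941 + 0.25 * x2 + 15125 * x3

x2, x3 = 50000, 2.0             # hypothetical regressor values
base = y_hat(x2, x3)
bumped = y_hat(x2, x3 + 0.25)   # increase X3 by 0.25, holding X2 constant

print(bumped - base)            # 3781.25 = 0.25 * 15125
```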
(iv)
The coefficient of determination R² is 0.87, i.e. 87% of the variation in the response variable is explained by the fitted regression model based on X2 and X3.
So the explanatory power of the regression is good.
(v)
For β2 (coefficient of X2):
H0: β2 = 0 vs H1: β2 ≠ 0
Test statistic:
t = b2 / se(b2) = 0.25 / 0.121 = 2.066
The p-value of the test statistic for a two-tailed test with n − 3 degrees of freedom (three coefficients are estimated) is 2·P(t > 2.066).
The sample size n is not given. Provided the sample is reasonably large (roughly 30 or more observations, so that the 5% two-tailed critical value is about 2.05), |t| = 2.066 exceeds the critical value and the p-value is below 0.05, so we reject the null hypothesis and conclude that the coefficient of X2 is significantly different from zero. In a very small sample the result would only be marginal.
For β3 (coefficient of X3):
H0: β3 = 0 vs H1: β3 ≠ 0
Test statistic:
t = b3 / se(b3) = 15125 / 7349 = 2.058
The p-value of the test statistic for a two-tailed test with n − 3 degrees of freedom is 2·P(t > 2.058).
Again, provided the sample is reasonably large (roughly 30 or more observations), |t| = 2.058 exceeds the 5% two-tailed critical value of about 2.05, so we reject the null hypothesis and conclude that the coefficient of X3 is significantly different from zero; in a very small sample the result would only be marginal.
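The conclusions above depend on the unknown sample size. The sketch below is a rough check, not part of the original answer: it uses scipy with an assumed n = 30 (the problem does not state n) to show how the critical value and two-tailed p-values would be obtained.

```python
# t-tests of part (v): H0 that each slope coefficient equals zero.
from scipy import stats

n = 30                         # assumed sample size (not given in the problem)
df = n - 3                     # three estimated coefficients: b0, b2, b3

tests = {"beta2 (X2)": (0.25, 0.121), "beta3 (X3)": (15125, 7349)}
t_crit = stats.t.ppf(1 - 0.025, df)   # 5% two-tailed critical value

for name, (b, se) in tests.items():
    t = b / se                                 # test statistic
    p = 2 * (1 - stats.t.cdf(abs(t), df))      # two-tailed p-value
    print(f"{name}: t = {t:.3f}, p = {p:.3f}, reject H0: {abs(t) > t_crit}")
```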
(vi)
For β2 (coefficient of X2):
H0: β2 = 1 vs H1: β2 ≠ 1
Test statistic:
t = (b2 − 1) / se(b2) = (0.25 − 1) / 0.121 = −6.198
The p-value of the test statistic for a two-tailed test with n − 3 degrees of freedom is 2·P(t > |−6.198|) = 2·P(t > 6.198). For any feasible sample size this p-value is far below the 5% significance level, so we reject the null hypothesis and conclude that the coefficient of X2 is significantly different from 1.
(vii)
Joint significance of the regressors is tested with an F-test of H0: β2 = β3 = 0 against H1: at least one slope coefficient is non-zero. Using the R² form of the statistic,
F = [R² / (k − 1)] / [(1 − R²) / (n − k)] = (0.87 / 2) / (0.13 / (n − 3)),
which follows an F distribution with 2 and n − 3 degrees of freedom under H0. Because R² = 0.87 is high, F is very large for any plausible sample size (for example, n = 30 gives F ≈ 90, far above the 5% critical value of about 3.35), so we reject H0 and conclude that the coefficients are jointly significant.
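As a final illustration of (vi) and (vii), the sketch below computes the t statistic for H0: β2 = 1 and the R²-based F statistic, again under the assumed sample size n = 30 (the problem does not give n).

```python
# Parts (vi)-(vii): test of beta2 = 1 and joint F-test from R^2.
from scipy import stats

n, k = 30, 3                   # assumed n; k = number of estimated coefficients
df = n - k

# (vi) H0: beta2 = 1 against H1: beta2 != 1
t = (0.25 - 1) / 0.121                     # = -6.198
p = 2 * (1 - stats.t.cdf(abs(t), df))
print(f"t = {t:.3f}, two-tailed p = {p:.4f}")

# (vii) joint significance: H0: beta2 = beta3 = 0
R2 = 0.87
F = (R2 / (k - 1)) / ((1 - R2) / (n - k))
F_crit = stats.f.ppf(0.95, k - 1, n - k)   # 5% critical value, F(2, n-3)
print(f"F = {F:.1f}, 5% critical value = {F_crit:.2f}")
```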