Homework 3
STATS 4A03
Due on Crowdmark by Friday, March 15 at 11:59pm
Guidelines: Unless otherwise specified, you are required to justify and prove all your answers.
You are welcome and encouraged to collaborate with other students on homework assignments, and you should feel free to discuss the problems and talk about how to come up with solutions with each other. However, you are expected to write all your solutions independently of any collaborators and you should not share written solutions with other students before the deadline. If you collaborate with other students, you must cite any collaborators that you had on any given problem.
You may use the textbook and lecture slides. You are discouraged from using outside resources (online, Math stack, etc.), but if you decide to do so, you must cite all your sources. If your solution is too similar to the cited one, you may lose credit on the problem.
Your homework grade will be based on completeness plus the correctness of a random subset of four (4) problems.
Exercise 1. Consider an observed series Y_1, Y_2, ..., Y_{100} from an AR(1) model Y_t = 0.7 Y_{t-1} + e_t. Would it be unusual if r_1 = 0.6? What about if r_{10} = -0.15? Justify your answers.
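A quick numerical sanity check is sketched below. It assumes the large-sample approximation Var(r_k) ≈ [(1 + φ^2)(1 - φ^{2k})/(1 - φ^2) - 2k φ^{2k}] / n for the sample autocorrelations of an AR(1) series, together with E[r_k] ≈ φ^k; if your written argument uses a different approximation, adapt accordingly.

```python
# Sketch (not a proof): large-sample check of r_k for an AR(1) with phi = 0.7, n = 100.
# Assumes Var(r_k) ~ [(1 + phi^2)(1 - phi^(2k))/(1 - phi^2) - 2k*phi^(2k)] / n and E[r_k] ~ phi^k.
import math

phi, n = 0.7, 100

def approx_sd(k: int) -> float:
    """Approximate standard deviation of r_k under the AR(1) model."""
    var = ((1 + phi**2) * (1 - phi**(2 * k)) / (1 - phi**2) - 2 * k * phi**(2 * k)) / n
    return math.sqrt(var)

for k, r_obs in [(1, 0.6), (10, -0.15)]:
    mean = phi**k              # rho_k = phi^k for an AR(1)
    sd = approx_sd(k)
    z = (r_obs - mean) / sd    # rough standardized distance of the observed value
    print(f"k={k}: expected {mean:.3f}, sd {sd:.3f}, observed {r_obs}, z = {z:.2f}")
```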
Exercise 2. Suppose {Y_t} is a stationary process and we observe the values Y_1 = 6, Y_2 = 5, Y_3 = 4, Y_4 = 6, Y_5 = 4. Find the method of moments estimators of µ, γ_0, and ρ_1.
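The method of moments estimators here are just the sample analogues of µ, γ_0, and ρ_1; the sketch below computes them numerically, assuming the convention that γ_0 is estimated with divisor n and that ρ_1 is estimated by the usual lag 1 sample autocorrelation r_1.

```python
# Sketch: sample analogues of mu, gamma_0, and rho_1 from the five observed values.
# Assumes gamma_0 uses a divisor of n and rho_1 is the usual lag-1 sample autocorrelation.
y = [6, 5, 4, 6, 4]
n = len(y)

mu_hat = sum(y) / n
ss = sum((yt - mu_hat) ** 2 for yt in y)
gamma0_hat = ss / n
rho1_hat = sum((y[t] - mu_hat) * (y[t - 1] - mu_hat) for t in range(1, n)) / ss

print(mu_hat, gamma0_hat, rho1_hat)
```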
Exercise 3. Consider an MA(1) model and suppose we have the observed values Y_1 = 0, Y_2 = -1, and Y_3 = 1/3. Find the conditional least squares estimator of θ and the method of moments estimator of σ_e^2.
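A brute-force numerical check of the conditional least squares part is sketched below. It assumes the parameterization Y_t = e_t - θ e_{t-1} and the usual conditioning e_0 = 0 (flip the sign of θ if your notes use Y_t = e_t + θ e_{t-1}); it simply evaluates the conditional sum of squares on a grid of θ values and reports the minimizer.

```python
# Sketch: grid search for the conditional least squares estimate of theta in an MA(1).
# Assumes the parameterization Y_t = e_t - theta * e_{t-1} and the conditioning e_0 = 0.
y = [0.0, -1.0, 1.0 / 3.0]

def cond_sum_squares(theta: float) -> float:
    """Conditional sum of squares S_c(theta) with e_0 = 0."""
    e_prev, s = 0.0, 0.0
    for yt in y:
        e_t = yt + theta * e_prev   # invert Y_t = e_t - theta * e_{t-1}
        s += e_t ** 2
        e_prev = e_t
    return s

grid = [i / 1000 for i in range(-999, 1000)]   # search over the invertible range (-1, 1)
theta_hat = min(grid, key=cond_sum_squares)
print(theta_hat, cond_sum_squares(theta_hat))
```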
Exercise 4. Consider the following parameterizations of the AR(1) process with nonzero mean:
Model 1: Y_t - µ = φ(Y_{t-1} - µ) + e_t
Model 2: Y_t = φ Y_{t-1} + θ_0 + e_t.
Suppose we want to estimate φ and µ (in Model 1) or φ and θ_0 (in Model 2) via conditional least squares, conditioning on Y_1. Show that with Model 1 we need to solve nonlinear equations to obtain the estimates, while with Model 2 we only need to solve linear equations.
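To see the contrast concretely: for Model 2 the conditional sum of squares Σ_{t=2}^{n} (Y_t - φ Y_{t-1} - θ_0)^2 is an ordinary least squares criterion in (φ, θ_0), so its normal equations are linear, whereas for Model 1 the analogous equations contain products of the unknowns (such as φµ). The sketch below only illustrates the linear case on an arbitrary simulated series; the series and its parameter values are illustrative, not part of the exercise.

```python
# Sketch: Model 2's conditional least squares fit is an ordinary linear regression of Y_t
# on (Y_{t-1}, 1), so the estimates solve a 2x2 *linear* system of normal equations.
import numpy as np

rng = np.random.default_rng(0)
n, phi_true, theta0_true = 200, 0.5, 2.0     # illustrative values only
y = np.zeros(n)
for t in range(1, n):                        # simulate an arbitrary AR(1) with intercept
    y[t] = phi_true * y[t - 1] + theta0_true + rng.normal()

X = np.column_stack([y[:-1], np.ones(n - 1)])    # regressors: Y_{t-1} and a constant
b = y[1:]                                        # response: Y_t for t = 2, ..., n
phi_hat, theta0_hat = np.linalg.lstsq(X, b, rcond=None)[0]
print(phi_hat, theta0_hat)
```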
Exercise 5. For an AR(1) model with φ = 0.6 and n = 100, the lag 1 sample autocorrelation of the residuals is found to be r̂_1 = 0.15. Is this result unusual? Justify your answer.
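If helpful, a short numerical check is sketched below; it assumes the large-sample approximation Var(r̂_1) ≈ φ^2 / n for the lag 1 residual autocorrelation of a correctly specified AR(1), and simply standardizes the observed value against it.

```python
# Sketch: standardize the lag-1 residual autocorrelation from an AR(1) fit.
# Assumes the large-sample approximation Var(rhat_1) ~ phi^2 / n for a correctly specified AR(1).
import math

phi, n, r1_resid = 0.6, 100, 0.15
sd = math.sqrt(phi**2 / n)     # approximate standard error of rhat_1
print(f"approx sd = {sd:.3f}, z = {r1_resid / sd:.2f}")
```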
Exercise 6. Based on a series of length n = 100, we fit an AR(2) model and obtain residual autocorrelations of r̂_1 = 0.14, r̂_2 = 0.15, and r̂_3 = 0.18. If the maximum likelihood estimates are φ̂_1 = 1 and φ̂_2 = -1.1, do these residual autocorrelations support the AR(2) specification? Individually? Jointly? Justify your answers.
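For the joint part of the question, one standard diagnostic is the modified Ljung–Box statistic Q* = n(n+2) Σ_{k=1}^{K} r̂_k^2 / (n - k), referred to a χ² distribution whose degrees of freedom are K minus the number of estimated ARMA parameters (here K - 2). The sketch below computes it for K = 3 under that assumption; the lag-by-lag (individual) comparisons are not sketched here.

```python
# Sketch: modified Ljung-Box statistic for the first K = 3 residual autocorrelations of an
# AR(2) fit with n = 100; reference distribution assumed chi-square with K - 2 degrees of freedom.
from scipy.stats import chi2

n = 100
r = [0.14, 0.15, 0.18]        # residual autocorrelations rhat_1, rhat_2, rhat_3
K, p_fitted = len(r), 2       # K lags used, 2 AR parameters estimated

q_star = n * (n + 2) * sum(rk**2 / (n - k) for k, rk in enumerate(r, start=1))
df = K - p_fitted
print(f"Q* = {q_star:.2f}, 5% critical value (df={df}) = {chi2.ppf(0.95, df):.2f}")
```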