
Alastair Hall ECON61001: Semester 1, 2020-21 Econometric Methods
Problem Set for Tutorial 5
To develop our large sample analysis, we assumed in class that plim T−1X′X = Q, a nonsingular matrix of finite constants. In this question you consider the behaviour of X′X in a simple linear regression model whose regressor is a time trend.
1. Suppose that xt = [1, t]′. Evaluate whether the following matrices converge to a finite, nonsingular matrix as T → ∞: (a) T−1X′X; (b) T−2X′X; (c) T−3X′X. Hint: Σ_{t=1}^{T} t = T(T + 1)/2, Σ_{t=1}^{T} t^2 = T(T + 1)(2T + 1)/6.
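As a numerical illustration of the scaling involved, the minimal numpy sketch below prints T−kX′X for k = 1, 2, 3 at increasing sample sizes (the values of T used are arbitrary choices); it is no substitute for the analytical argument the question asks for.

    import numpy as np

    # Illustrative check of how T^{-k} X'X behaves when x_t = [1, t]'.
    for T in (10, 100, 1000, 10000):
        t = np.arange(1, T + 1)
        X = np.column_stack([np.ones(T), t])      # T x 2 matrix with columns [1, t]
        XtX = X.T @ X                             # [[T, sum t], [sum t, sum t^2]]
        for k in (1, 2, 3):
            print(T, k, np.round(XtX / T**k, 4))

Comparing the printed matrices across T shows which elements stabilise and which diverge or vanish.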
The time series εt is known as “white noise” if it has the following properties: (i) E[εt] = 0 for all t; (ii) Var[εt] = σ2 for all t; (iii) Cov[εt, εs] = 0 for t ≠ s.
2. Let εt be a white noise process, and define the following three time series: ut = εt, vt = (−1)^t εt and wt = I(t = 10) + εt, where I(·) is an indicator function that equals one if the event in parentheses (here, t = 10) occurs and zero otherwise.
(a) Is ut weakly stationary? Explain.
(b) Is vt weakly stationary? Explain.
(c) Is wt weakly stationary? Explain.
(d) Is vt strongly stationary? Explain.
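The minimal numpy sketch below simulates one realisation of ut, vt and wt from Gaussian white noise (the Gaussian distribution, seed, σ and sample size are arbitrary illustrative choices). Inspecting or plotting the three series may help build intuition, but the answers should rest on the definitions above.

    import numpy as np

    rng = np.random.default_rng(0)           # illustrative seed
    T, sigma = 200, 1.0
    t = np.arange(1, T + 1)
    eps = rng.normal(0.0, sigma, size=T)     # one white-noise realisation
    u = eps                                  # u_t = eps_t
    v = (-1.0) ** t * eps                    # v_t = (-1)^t eps_t
    w = (t == 10).astype(float) + eps        # w_t = 1{t = 10} + eps_t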
If the time series yt follows an Autoregressive Moving Average model of orders (p, q), denoted ARMA(p, q), then it is generated as follows:
yt = c + θ1yt−1 + θ2yt−2 + … + θpyt−p + εt + φ1εt−1 + φ2εt−2 + … + φqεt−q,
where εt is a white noise process. Special cases of this class of models are Autoregressive processes and Moving Average processes. If yt follows an Autoregressive process of order p – denoted AR(p) – then it is generated via:
yt = c + θ1yt−1 + θ2yt−2 + … + θpyt−p + εt.
If yt follows a Moving Average process of order q – denoted MA(q) – then it is generated by the
equation:
yt = c + εt + φ1εt−1 + φ2εt−2 + … + φqεt−q,
where εt is a white noise process. In the next two questions, you consider the properties of two simple ARMA models, and your analysis provides an example of how the time series properties of MA and AR models are different.
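The minimal numpy sketch below generates one realisation from each class, using ARMA(1,1), AR(1) and MA(1) as the simplest examples (the parameter values c, θ, φ, σ and the seed are arbitrary illustrative choices); it can be used to visualise how AR, MA and ARMA paths behave.

    import numpy as np

    rng = np.random.default_rng(0)                     # illustrative seed
    T, c, theta, phi, sigma = 500, 0.0, 0.7, 0.4, 1.0  # arbitrary parameter choices
    eps = rng.normal(0.0, sigma, size=T)

    # MA(1): y_t = c + eps_t + phi * eps_{t-1}  (eps_0 set to zero)
    y_ma = c + eps + phi * np.concatenate(([0.0], eps[:-1]))

    # AR(1): y_t = c + theta * y_{t-1} + eps_t, generated recursively from y_0 = 0
    y_ar = np.zeros(T)
    for s in range(1, T):
        y_ar[s] = c + theta * y_ar[s - 1] + eps[s]

    # ARMA(1,1): y_t = c + theta * y_{t-1} + eps_t + phi * eps_{t-1}
    y_arma = np.zeros(T)
    for s in range(1, T):
        y_arma[s] = c + theta * y_arma[s - 1] + eps[s] + phi * eps[s - 1]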

3. Consider the MA(1) model:
yt = εt + φεt−1
where εt is white noise with variance σ2.
(a) Show that E[yt] = 0.
(b) Show that Var[yt] = σ2(1 + φ2).
(c) Show that Cov[yt, yt−1] = φσ2 and Cov[yt, yt−s] = 0 for all s > 1.
(d) Is yt a weakly stationary process?
(e) What is the long run variance of yt?
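Once parts (a)–(c) have been derived analytically, a quick simulation check is possible. The minimal numpy sketch below (φ, σ, the seed and the sample size are arbitrary illustrative choices) compares sample moments of a simulated MA(1) with the stated formulas.

    import numpy as np

    rng = np.random.default_rng(0)                     # illustrative seed
    T, phi, sigma = 200_000, 0.5, 1.0                  # arbitrary parameter choices
    eps = rng.normal(0.0, sigma, size=T)
    y = eps[1:] + phi * eps[:-1]                       # y_t = eps_t + phi * eps_{t-1}

    print(np.mean(y))                                  # compare with E[y_t] = 0
    print(np.var(y), sigma**2 * (1 + phi**2))          # compare with Var[y_t]
    print(np.mean(y[1:] * y[:-1]), phi * sigma**2)     # compare with Cov[y_t, y_{t-1}]
    print(np.mean(y[2:] * y[:-2]))                     # compare with Cov[y_t, y_{t-2}] = 0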
4. Consider the AR(1) model:
yt = θyt−1 + εt    (1)
where εt is white noise with variance σ2. We assume |θ| < 1, which is known in this context as the stationarity condition. Via back substitution in (1), we can show that:
yt = θyt−1 + εt
   = θ{θyt−2 + εt−1} + εt
   = θ^m yt−m + Σ_{i=0}^{m−1} θ^i εt−i    (2)
If the stationarity condition holds, so that lim_{m→∞} θ^m = 0, then by letting m → ∞ in (2) we can argue that yt also has the MA(∞) representation given by¹
yt = Σ_{i=0}^{∞} θ^i εt−i.
Using the MA(∞) representation, answer the following questions.
(a) Show that E[yt] = 0.
(b) Show that Var[yt] = σ2/(1 − θ2).
(c) Show that Cov[yt, yt−s] = θ^s σ2/(1 − θ2).
(d) Is yt weakly stationary?
(e) What is the long run variance of yt?
Hint: If |h| < 1 then Σ_{i=0}^{∞} h^i = (1 − h)^{−1}.
¹The details of the argument need not concern us here.
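As with Question 3, the formulas in parts (b) and (c) can be checked numerically. The minimal numpy sketch below builds yt from a truncated version of the MA(∞) representation (θ, σ, the seed, the sample size and the truncation order m are arbitrary illustrative choices) and compares sample moments with the stated expressions.

    import numpy as np

    rng = np.random.default_rng(0)                     # illustrative seed
    T, theta, sigma, m = 100_000, 0.6, 1.0, 60         # arbitrary choices; m truncates the MA(inf) sum
    eps = rng.normal(0.0, sigma, size=T + m)
    weights = theta ** np.arange(m)                    # theta^0, theta^1, ..., theta^{m-1}
    # y_t = sum_{i=0}^{m-1} theta^i * eps_{t-i}, computed for T consecutive dates
    y = np.array([weights @ eps[s:s + m][::-1] for s in range(T)])

    print(np.var(y), sigma**2 / (1 - theta**2))                        # compare with Var[y_t]
    print(np.mean(y[1:] * y[:-1]), theta * sigma**2 / (1 - theta**2))  # compare with Cov[y_t, y_{t-1}]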
