SF2943 Time Series Analysis 7.5 cr, Spring 2017
Current information.
[Back to the main page]
Please remember to fill out the course survey: http://www.math.kth.se/cgi-bin/evaluation/evaluation?e=459.
Monday 4/9
All exams from August 18 have now been graded and the results have been reported in Rapp. The grade thresholds are the same as for the May exam:
22 (F), 23 (Fx), 25 (E), 27 (D), 32 (C), 37 (B), 42 (A).
If you have received the grade "Fx", contact Pierre if you wish to try to earn a passing grade. Your final grade, "E" or "F", is decided by an oral exam; the deadline for this is October 18.
Saturday 19/8
A set of solutions to the exam on August 18th can be found here. These have not yet been proofread and may contain (minor) typos.
Sunday 11/6
Some more detailed statistics for the exam are available here.
Saturday 10/6
All exams have now been graded and results will be reported in Ladok early next week. The grade thresholds are:
22 (F), 23 (Fx), 25 (E), 27 (D), 32 (C), 37 (B), 42 (A).
Grade distribution: F: 13%; Fx: 5%; E: 7%; D: 27%; C: 20%; B: 14%; A: 14%.
More detailed statistics (performance on the different problems etc.) will be available next week.
Tuesday 6/6
Your projects and exams are currently being graded. For the projects, if you do not hear anything then you have received a "Pass". Your exams should be graded by the end of the week.
Some comments on the provided solutions (now updated): In Problem 3 the signs of the second AR coefficient were given in the reverse order: series CBD has a second coefficient less than 0 and AEF has a second coefficient greater than 0 (in fact, they have the same absolute value, 0.7). Moreover, you do not need to give as detailed an explanation/motivation for your solution as the one provided in order to receive full credit. Next, in Problem 4 we allow solutions (for part C) that use a linear filter producing a series with nonzero mean and then go on to state that the process is causal. A comment on this is now included in the provided solution.
In case you notice any other typos or discrepancies in the solutions please let either Salah or me know.
Monday 29/5
A set of solutions to today's exam is available here. These have not yet been proofread and may contain (minor) typos. More information about the exam and the solutions will be made available in the next couple of days.
Friday 26/5
A course survey is available at http://www.math.kth.se/cgi-bin/evaluation/evaluation?e=459. It would be greatly appreciated if you could take a few minutes to answer this short questionnaire, as it helps us improve both the course and our teaching in general.
Thursday 25/5
Another minor typo was found in the lecture notes on spectral densities; thanks to Mattias Nilsson for bringing this to my attention. The issue was that the comments for AR(2) processes stated that we always find a maximum of the spectral density, which is not true. Rather, there is a formula for the extreme point of the denominator, and we must then check whether this corresponds to a minimum of the denominator, and thus a maximum of the spectral density, or the other way around. The new version is available via the same links as before.
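To see this point concretely, here is a small numerical sketch (the parameter values are illustrative choices of my own, not taken from the notes): it evaluates the AR(2) spectral density on a grid and locates the interior extremum, which is a maximum of the density for one parameter choice but a minimum for the other.

```python
import cmath
import math

def ar2_spectral_density(phi1, phi2, lam, sigma2=1.0):
    # f(lam) = sigma2 / (2*pi) * 1 / |1 - phi1*e^{-i*lam} - phi2*e^{-2i*lam}|^2
    z = cmath.exp(-1j * lam)
    return sigma2 / (2 * math.pi) / abs(1 - phi1 * z - phi2 * z * z) ** 2

grid = [math.pi * k / 1000 for k in range(1001)]

# Pseudo-cyclical AR(2) (phi2 < 0): the denominator has an interior minimum,
# so the spectral density has an interior maximum -- a spectral peak.
f_peak = [ar2_spectral_density(1.0, -0.9, lam) for lam in grid]
peak_at = grid[f_peak.index(max(f_peak))]

# With phi2 > 0 the interior extreme point of the denominator is a maximum,
# so the spectral density has an interior MINIMUM instead.
f_trough = [ar2_spectral_density(-0.2, 0.5, lam) for lam in grid]
trough_at = grid[f_trough.index(min(f_trough))]
```

In both cases the extreme point of the denominator, viewed as a quadratic in cos(lambda), sits at cos(lambda0) = -phi1(1 - phi2)/(4 phi2); whether it gives a peak or a trough of the density depends on the sign of phi2.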
Monday 22/5
Regarding the preliminary plan, Chapter 8 is NOT included in the course and thus not relevant for the final exam.
Thursday 18/5
The following note summarizes the lecture on ARCH and GARCH processes: GARCH.pdf. Note that the textbook does not contain everything that was discussed in that lecture.
Tuesday 16/5
Regarding Wednesday's schedule: Presentations of Problem 7, i.e., old exam problems, are held 13-15, followed by an exercise session 15-17.
Monday 15/5
Today's class focused on multivariate time series. We started by defining an m-variate time series, its second-order properties and the notion of stationarity. We then discussed some properties of matrix-valued covariance functions and the definition of the matrix-valued spectral density function of an m-variate time series. Next, we discussed the m-dimensional analogues of the estimators of the mean and acvf in the univariate setting. The class ended with the definition of multivariate ARMA processes and conditions for causality and invertibility in such models; we covered much of Sections 7.1-7.4 (albeit somewhat briefly).
Saturday 13/5
Regarding the upcoming two weeks: I will be traveling from the 19th until the day of the exam. In addition to being out of the office, I will have limited time to answer emails during this period. Therefore, if you want to come see me in preparation for the exam, you should try to schedule a time next week; there will be regular office hours on Monday (10-12).
Friday 12/5
Some minor errors were spotted in the document on spectral densities first posted on April 20th. These mainly pertained to the example with an AR(12) process and have now been corrected; the updated version is available: SpectralDensity.pdf. Please let me know if you notice any additional errors.
Thursday 11/5
Thank you to the groups who presented their solutions on Wednesday, you all did a very good job.
During the second lecture Salah solved the following exercises: Problem 4 on the 05/16 exam, 7.1, 7.2 and a combination of problems 1 and 2 on the following exam: Exam University of Orléans
Regarding previous exams: I have previously stated that Timo's and Tobias' exams would probably be on the more difficult side. Having looked more closely at them, this really only applies to Timo's exam from June 2010, mostly because its topics are not completely aligned with what we have focused on during the course. Moreover, the problems on Tobias' exams are in fact both aligned with current course topics and at a comparable level of difficulty. I therefore have to revise my previous statement on those exams (December 2010 to August 2012), and similarly for Timo's exams from 2013, thereby greatly increasing the pool of suitable practice exams.
Tuesday 9/5
Regarding the presentation seminar on Wednesday 10/5: The seminar will be held 13-15, followed by an exercise session 15-17; most likely we will not need the full two hours for the presentations. Even if your group has not been asked to present, you may be asked questions on your work during the seminar. It might therefore be beneficial to have a copy of your report available during the seminar.
Monday 8/5
Today's lecture started with a recap of ARCH and GARCH processes, particularly the properties of GARCH(1,1). Notes on this topic will be posted during the week, and those who did not attend either lecture should consider them mandatory reading. After the recap we looked at some simulations of a GARCH(1,1) process. Particularly interesting is the behavior of the conditional variance and the GARCH process under different assumptions on the "noise" sequence. For example, we saw how using a Student's t-distribution gives rise to heavy tails.
After the recap we covered Sections 6.1 and 6.3 on ARIMA models and unit roots in time series models. For Section 6.1, we defined ARIMA processes and discussed the intuition behind them, and looked at simulations of an ARIMA(1,1,0): plots of the time series, acf and pacf, the same plots for the differenced series, and fitting AR models using MLE and Yule-Walker for both the original and the differenced series. Next, we discussed the presence of unit roots in time series models, how unit roots relate to the need to difference (or to the process being overdifferenced), and how to test for unit roots in either an AR or MA model, together with the corresponding conclusions regarding processing the data. Here we specifically discussed the Augmented Dickey-Fuller test and used it on the simulated ARIMA(1,1,0) data.
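As a small illustration of why differencing helps here (an illustrative sketch, not the code from the lecture; all parameter values are my own choices), one can simulate an ARIMA(1,1,0) series and check that fitting an AR(1) to the differenced data recovers the autoregressive coefficient:

```python
import random

random.seed(1)
phi, n = 0.6, 20_000

# Simulate Y_t = phi * Y_{t-1} + Z_t (a causal AR(1)), then integrate:
# X_t = X_{t-1} + Y_t, so X is ARIMA(1,1,0) and differencing X recovers Y.
y, x = [0.0], [0.0]
for _ in range(n):
    y.append(phi * y[-1] + random.gauss(0, 1))
    x.append(x[-1] + y[-1])

# Difference the observed series and fit AR(1) by Yule-Walker:
# phi_hat = rho_hat(1) = gamma_hat(1) / gamma_hat(0).
d = [x[t] - x[t - 1] for t in range(1, len(x))]
mean = sum(d) / len(d)
c0 = sum((v - mean) ** 2 for v in d) / len(d)
c1 = sum((d[t] - mean) * (d[t + 1] - mean) for t in range(len(d) - 1)) / len(d)
phi_hat = c1 / c0
```

The estimate phi_hat should land close to the true value 0.6; trying the same fit on the undifferenced series instead produces a lag-one autocorrelation near 1, the telltale sign of a unit root.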
Wednesday 3/5
The topic of the first lecture was ARCH and GARCH processes (Section 10.3.5). We began by looking at Nasdaq data from the early 2000s to motivate the need for time series models with certain properties not achievable by the types of models previously considered. We then moved on to defining ARCH(p) processes and looking carefully at ARCH(1). Having derived some properties (unconditional and conditional variance, covariance structure, likelihood function etc.), we looked at some simulations of such a process. In particular, we observed the effect that changing one of the parameters in the model has on the behavior of the process (e.g., volatility clustering).
Next we defined the GARCH(p,q) process and studied GARCH(1,1) in some detail. For example, we observed the ARMA structure inherent in the definition of GARCH(1,1), the impact of the choice of parameter values on the properties of the process, forecasts of the conditional variance, and the log-likelihood function when the "noise sequence" has a Gaussian distribution.
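For those who want to experiment on their own, here is a minimal simulation sketch of GARCH(1,1) (the parameter values are my own illustrative choices), which also checks the stationary-variance formula a0 / (1 - a1 - b1):

```python
import math
import random

random.seed(7)
a0, a1, b1, n = 0.2, 0.1, 0.8, 200_000

# GARCH(1,1): X_t = sigma_t * Z_t with Z_t iid N(0,1) and
#   sigma_t^2 = a0 + a1 * X_{t-1}^2 + b1 * sigma_{t-1}^2.
# When a1 + b1 < 1 the process is stationary with unconditional
# variance a0 / (1 - a1 - b1), here 0.2 / 0.1 = 2.0.
sigma2 = a0 / (1 - a1 - b1)  # start the recursion at the stationary variance
xs = []
for _ in range(n):
    x = math.sqrt(sigma2) * random.gauss(0, 1)
    xs.append(x)
    sigma2 = a0 + a1 * x * x + b1 * sigma2

sample_var = sum(v * v for v in xs) / n  # the process has mean zero
```

Replacing random.gauss with a (rescaled) Student's t draw reproduces the heavier tails discussed in class; with b1 = 0 the same recursion reduces to ARCH(1).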
The topics of the lecture went slightly beyond the textbook. Thus, a short note summarizing the lecture will be posted here in due time.
In the second lecture the following exercises were solved: 5.1, 5.3, 5.4 and 5.11.
Important: (i) Regarding the project report, you can hand in a paper copy at Teknikringen or email me an electronic copy. (ii) Make sure to register for the exam; any questions should be directed to the student affairs office.
Wednesday 26/4
During the first lecture we continued with the topics of Chapter 5: maximum likelihood estimation (5.2), diagnostic checking (5.3) and forecasting (5.4). Regarding forecasting of ARMA processes we also covered Section 3.3 on how the innovations algorithm can produce the best linear predictor for such processes. We only wrote down the expression for one-step predictions; please review the derivation for multiple steps on your own. Moreover, there is a nice example on forecasting for an ARMA(1,1) (Example 3.3.3) that I encourage you to look at.
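A minimal sketch of the innovations algorithm for one-step prediction (my own illustrative implementation of the recursions, applied to an MA(1) where the answer is easy to check):

```python
def innovations(gamma, n):
    """Innovations algorithm for one-step prediction of a zero-mean
    stationary series with acvf gamma(h). Returns theta, with
    theta[m][j] = theta_{m,j} (index 0 unused), and the MSEs v[0..n].
    The predictor is
      Xhat_{m+1} = sum_{j=1}^m theta_{m,j} * (X_{m+1-j} - Xhat_{m+1-j})."""
    v = [gamma(0)]
    theta = [[0.0]]
    for m in range(1, n + 1):
        row = [0.0] * (m + 1)
        for k in range(m):                      # computes theta_{m,m-k}
            s = gamma(m - k)
            for j in range(k):
                s -= theta[k][k - j] * row[m - j] * v[j]
            row[m - k] = s / v[k]
        v.append(gamma(0) - sum(row[m - j] ** 2 * v[j] for j in range(m)))
        theta.append(row)
    return theta, v

# MA(1) with theta = 0.5 and unit noise variance:
# gamma(0) = 1.25, gamma(1) = 0.5, gamma(h) = 0 otherwise.
gamma = lambda h: {0: 1.25, 1: 0.5}.get(h, 0.0)
theta, v = innovations(gamma, 30)
# v[m] decreases toward the noise variance 1, theta[m][1] tends to 0.5,
# and all higher coefficients theta[m][j], j >= 2, are zero for an MA(1).
```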
During the second lecture the following problems were covered: 3.1 (a) and (e), 4 on exam 6/15, 3.4, 3.11, 5 on exam 5/16.
Monday 24/4
Today's lecture covered the first part of Chapter 5: the Yule-Walker equations and the Hannan-Rissanen algorithm. Specifically, we focused on Sections 5.1.1, including the brief comments on order selection using asymptotic properties of the Yule-Walker estimates, and 5.1.4. We then moved on to maximum likelihood estimation and will continue with this topic on Wednesday 26/4; today we defined the likelihood function associated with a sample from a Gaussian process and began discussing how to express this function in terms of the quantities appearing in the innovations algorithm.
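As a sketch of the Yule-Walker procedure for p = 2 (illustrative code and parameter values of my own choosing): estimate the acvf from a simulated series and solve the 2x2 Yule-Walker system for the AR coefficients and the noise variance.

```python
import random

random.seed(3)
phi1, phi2, n = 0.5, 0.3, 50_000

# Simulate a causal AR(2): X_t = phi1 * X_{t-1} + phi2 * X_{t-2} + Z_t.
x = [0.0, 0.0]
for _ in range(n):
    x.append(phi1 * x[-1] + phi2 * x[-2] + random.gauss(0, 1))
x = x[2:]

def acvf_hat(x, h):
    m = sum(x) / len(x)
    return sum((x[t] - m) * (x[t + h] - m) for t in range(len(x) - h)) / len(x)

g0, g1, g2 = acvf_hat(x, 0), acvf_hat(x, 1), acvf_hat(x, 2)

# Yule-Walker for p = 2: solve
#   [g0 g1] [phi1]   [g1]
#   [g1 g0] [phi2] = [g2],   sigma2 = g0 - phi1*g1 - phi2*g2.
det = g0 * g0 - g1 * g1
phi1_hat = (g0 * g1 - g1 * g2) / det
phi2_hat = (g0 * g2 - g1 * g1) / det
sigma2_hat = g0 - phi1_hat * g1 - phi2_hat * g2
```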
If you are consulting the (preliminary) lecture plan, as of today we are roughly half a lecture behind, the topics not covered thus far being stochastic volatility and GARCH processes (Section 10.3.5).
There is now a list of recommended exercises (Chapter 3 and onward) on the main page. As stated in the document, previous exams are also a terrific source of problems to work on. As always, if you have any trouble solving the exercises you are of course welcome to come talk to either me or Salah, during lectures or office hours.
Thursday 20/4
As a complement to the (mainly) theoretical treatment of spectral analysis during the lectures on April 19, the following set of notes studies some specific examples: SpectralDensity.pdf. The aim is to give you a better understanding of what information is encoded in the spectral density of a process, what the spectral density looks like for some ARMA processes, and illustrate the periodogram for realizations of the processes under consideration.
Wednesday 19/4
Today we covered spectral analysis of time series; good job by those of you who persevered through the entire afternoon. This was the last Wednesday with lectures in both time slots.
We covered most of Chapter 4 of the textbook and some additional results, primarily rigorous results for the periodogram that are not explicitly stated and/or proved in the book. The focus was on the theory of spectral analysis rather than specific examples. In the upcoming days I will post some notes that deal with examples. I also encourage you to experiment with spectral densities and associated periodograms in, say, R or Matlab.
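If you want to experiment without R or Matlab, here is a minimal pure-Python sketch of the periodogram at the Fourier frequencies (an illustrative implementation, not code from the lecture):

```python
import cmath
import math
import random

def periodogram(x):
    """Periodogram I_n(lambda_k) = |sum_t x_t e^{-i t lambda_k}|^2 / n
    at the Fourier frequencies lambda_k = 2*pi*k/n, k = 0, ..., n-1
    (computed with a naive DFT for clarity, not speed)."""
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        out.append(abs(s) ** 2 / n)
    return out

random.seed(5)
x = [random.gauss(0, 1) for _ in range(256)]
I = periodogram(x)

# By Parseval's identity the periodogram ordinates average exactly to the
# sample second moment; for white noise each ordinate fluctuates around
# sigma^2 = 2*pi*f(lambda), i.e. a flat spectrum.
avg = sum(I) / len(I)
msq = sum(v * v for v in x) / len(x)
```

Replacing the white noise input with a simulated AR or MA series lets you compare the periodogram against the spectral densities computed in class.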
Note that on the main page there is now a link to the "Formulas and survey" document that you are allowed to bring to the final exam.
Sunday 9/4
As I have been away for a few days I am a bit behind on my emails; my apologies for this. As I mentioned in class, I will not reply to the "group name" emails, but if you have an urgent question that has yet to receive an answer, please send it again and I will get to it sooner than if I go through the emails of the past couple of days in order.
An update on the topics covered in the previous lectures and exercise session:
Wed 29/3: Problems 2.1, 2.4 (c), 2.8, 2.11 and Problem 1 from the June 2015 exam.
Mon 3/4: Sample mean and autocorrelation and their properties. Projections of time series. Recursive algorithms, the Durbin-Levinson algorithm. [Sections 2.4 and 2.5, up to the section on the innovations algorithm].
Wed 5/4: The innovations algorithm for one-step prediction. The prediction operator and its properties. The innovations algorithm for h-step prediction. ARMA(p,q) processes, causality and invertibility. Different methods for computing the acf of an ARMA(p,q) process. The partial autocorrelation function (pacf). Examples of the acf and pacf for some ARMA processes: MA(q), AR(p). The material corresponded to (parts of) Sections 2.5, 3.1 and 3.2. For forecasting of ARMA processes, please read the section on h-step prediction using the innovations algorithm (mentioned in class but not covered; the method simplifies when considering ARMA processes). We ended the class by looking at seven different (simulated) time series and their sample mean, acf and pacf.
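The Durbin-Levinson recursion mentioned above fits in a few lines (my own illustrative implementation); applied to an AR(1) it reproduces the familiar pacf, which vanishes beyond lag 1:

```python
def durbin_levinson(gamma, n):
    """Durbin-Levinson recursion for one-step linear prediction of a
    zero-mean stationary series with acvf gamma(h). Returns the coefficient
    lists phi (phi[m][j-1] = phi_{m,j}, j = 1..m) and the MSEs v[0..n].
    The pacf at lag m is phi[m][m-1]."""
    v = [gamma(0)]
    phi = {0: []}
    for m in range(1, n + 1):
        a = (gamma(m) - sum(phi[m - 1][j - 1] * gamma(m - j)
                            for j in range(1, m))) / v[-1]
        phi[m] = [phi[m - 1][j - 1] - a * phi[m - 1][m - 1 - j]
                  for j in range(1, m)] + [a]
        v.append(v[-1] * (1 - a * a))
    return phi, v

# Causal AR(1) with phi = 0.6 and unit noise variance:
# gamma(h) = 0.6**h / (1 - 0.36); pacf is 0.6 at lag 1, zero beyond.
gamma = lambda h: 0.6 ** h / 0.64
phi, v = durbin_levinson(gamma, 5)
```

As a check, v[1] and beyond equal the noise variance 1, since for an AR(1) one past value already gives the best linear predictor.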
Project work: The time series for Problem 4 are available on the main page.
Wednesday 29/3
In today's class we finished Section 2.2 by discussing the existence and uniqueness of a stationary solution to the equation defining an AR(1) process, focusing on the causal case; the noncausal case was only mentioned briefly. We then covered ARMA(1,1) processes: existence and uniqueness, causality and invertibility, and the acvf. The class ended with a few minutes on ARMA(p,q) processes (definition and condition for stationarity).
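A quick way to sanity-check the ARMA(1,1) acvf computations (an illustrative sketch with parameter values of my own choosing) is to truncate the causal MA(infinity) representation and compare with the closed-form expressions:

```python
# ARMA(1,1): X_t - phi*X_{t-1} = Z_t + theta*Z_{t-1}, causal when |phi| < 1.
# The causal representation X_t = sum_j psi_j Z_{t-j} has weights
#   psi_0 = 1,  psi_j = (phi + theta) * phi**(j-1) for j >= 1,
# and acvf gamma(h) = sigma2 * sum_j psi_j * psi_{j+h}.
phi, theta, sigma2 = 0.5, 0.4, 1.0

N = 200  # truncation point; psi_j decays like phi**j, so the tail is negligible
psi = [1.0] + [(phi + theta) * phi ** (j - 1) for j in range(1, N)]

def gamma_trunc(h):
    return sigma2 * sum(psi[j] * psi[j + h] for j in range(N - h))

# Closed-form expressions for comparison:
g0 = sigma2 * (1 + (phi + theta) ** 2 / (1 - phi ** 2))
g1 = sigma2 * ((phi + theta) + phi * (phi + theta) ** 2 / (1 - phi ** 2))
# and gamma(h) = phi**(h-1) * gamma(1) for h >= 1.
```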
Project work: The problem set for your projects is now available (see the main page). Note that the second presentation has been moved from May 8th to May 10th to allow you at least one weekday to prepare for any presentation you are assigned.
Monday 27/3
Today's class covered topics on stationary processes: properties of autocovariance functions, nonnegative definite functions and acvf's, MA(q) processes, and linear processes (roughly Sections 2.1 and 2.2, excluding the material on AR(1) processes). The lecture ended with the statement and proof of Proposition 2.2.1 on applying a linear filter to a stationary, zero-mean process.
Sunday 26/3
Note that next week's classes have been moved from B2 to M2 (Monday) and K1 (Wednesday).
Wednesday 22/3
During today's lecture (part 1) we covered most of Sections 1.4 and 1.5: MA(1) and AR(1) processes and their acf's, the sample mean, acvf and acf, estimation of trends and seasonal components, and differencing. We did not cover differencing for the full classical decomposition model (trend and seasonal component); see Section 1.5.2.2.
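As a small illustration of the sample acf versus the theoretical acf for an MA(1) (illustrative code and parameter values of my own choosing):

```python
import random

random.seed(11)
theta, n = 0.8, 100_000

# Simulate MA(1): X_t = Z_t + theta * Z_{t-1}; its theoretical acf is
#   rho(1) = theta / (1 + theta**2),  rho(h) = 0 for h >= 2.
z = [random.gauss(0, 1) for _ in range(n + 1)]
x = [z[t] + theta * z[t - 1] for t in range(1, n + 1)]

def acf_hat(x, h):
    """Sample acf at lag h: sample acvf divided by the sample variance."""
    m = sum(x) / len(x)
    c0 = sum((v - m) ** 2 for v in x)
    return sum((x[t] - m) * (x[t + h] - m) for t in range(len(x) - h)) / c0

rho1_theory = theta / (1 + theta ** 2)  # = 0.8 / 1.64
```

For a long simulated series the sample acf at lag 1 lands close to rho1_theory, while the sample acf at lags 2 and beyond fluctuates around zero, matching the cut-off discussed in class.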
During Part 2 of today's lecture the following problems were solved: 1.1, 1.4, 1.5 and 1.8(a).
Some comments about the 2nd and 3rd editions of the textbook. From the table of contents, the first five chapters are close to identical. The 3rd edition has some additional subsections but this seems to be mostly a matter of organization rather than new material. The main difference for this course is with Chapter 10 (2nd ed.) and Chapter 7 (3rd), specifically regarding stochastic volatility and GARCH processes. When we reach that topic I will provide more detailed instructions for the two editions. Note also that the 3rd edition is available online via the library.
Monday 20/3
Today's opening lecture consisted of general course information (summarized in a set of slides: Intro.pdf) and parts of Sections 1.2, 1.3 and 1.4 in the textbook. Specifically: some simple zero-mean models; the definitions of the mean, autocovariance and autocorrelation functions; (weakly) stationary and strictly stationary processes; and the definitions of IID noise and white noise. We ended by computing the ACVF for IID noise and WN and discussed the subtle difference between the two types of sequences (we will return to this next time).
Important: Note the changed location for next lecture, which now takes place in K1.
Friday 10/2
Welcome to SF2943! The first session is on Monday March 20 at 8 am in room B2 (Brinellvägen 23) [Map].
More information will be available closer to the course start. For now, if you have any questions about the course you can reach out to either me (Pierre) or Salah; see the main page for contact information. For administrative questions please contact the Student Affairs Office.
