# High-Yield Practice from the Curriculum for Quantitative Methods

We are in the final few months before the exam, so I hope your exam preparation is in full swing!

As you may know, my mantra is “Practice, Practice, Practice”…and I always tell my students that the first place to practice from is the curriculum itself. Ideally, you should do all the EOC questions, i.e., the questions at the end of each reading in the curriculum. However, if you are short of time, or if you are doing a second review of the EOC questions, then I have prepared a list of the questions you must do!

So this is a list of Level II questions (and some curriculum examples) that cover key and testable concepts. Be sure to do these high-yield questions at least once before the exam, and look at them again as part of your final revision to reinforce key concepts. I have posted the Quantitative Methods questions here and will add a few more topics as separate posts. For all the topics, or for more information, be sure to visit the IFT website and go HIGH-YIELD!

My best wishes for your last months of study.

Note: In the tables below, PP refers to the Practice Problems found at the end of each reading in the curriculum. In some cases curriculum examples are referenced, and in some cases the IFT Q-bank.

IFT strongly recommends that you do all the practice problems in the Quantitative Methods curriculum, but if you are severely time constrained, do at least the following.

## Fintech

| Question # | Concept Tested |
|---|---|
| PP1 | Fintech |
| PP2 | Big data |
| PP3 | Machine learning |
| PP4 | Text analytics |
| PP5 | Robo-advisory services |
| PP6 | Risk analysis |
| PP7 | Algorithmic trading |
| PP8 | DLT |
| PP9 | DLT |
## Correlation and Regression

| Question # | Concept Tested |
|---|---|
| PP5 - PP10 | |
| PP5 | Coefficient of determination = R-squared |
| PP6 | Effect of deleting observations on R-squared and SEE |
| PP7 | Correlation coefficient = Multiple R |
| PP8 | F-stat formula |
| PP9 | Predicting the dependent variable using the regression equation |
| PP10 | Interpreting p-values |
| PP11 - PP16 | |
| PP11 | Testing the significance of the correlation coefficient |
| PP12 | Time series vs cross sectional data |
| PP13 | Predicting the dependent variable using the regression equation |
| PP14 | Interpreting R-squared |
| PP15 | Interpreting SEE |
| PP16 | Interpreting t-stats |
| PP17 - PP26 | |
| PP17 | Scatter plots |
| PP18 | Calculating sample covariance |
| PP19 | Calculating sample correlation |
| PP20 | Interpreting regression results |
| PP21 | Dependent vs independent variable |
| PP22 | Degrees of freedom |
| PP23 | Calculating confidence intervals |
| PP24 | Interpreting t-stats |
| PP25 | Predicting the dependent variable using the regression equation |
| PP26 | Calculating F-stat |
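Most of the calculations listed above (sample covariance, sample correlation, R-squared, SEE, confidence intervals) follow from a handful of formulas. As a refresher, here is a minimal sketch in plain Python; the data are hypothetical, not taken from any practice problem:

```python
# Core simple-regression calculations (hypothetical data).

n = 6
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # independent variable
y = [2.1, 2.9, 3.6, 4.4, 5.1, 5.9]   # dependent variable

mean_x = sum(x) / n
mean_y = sum(y) / n

# Sample covariance and sample correlation (divide by n - 1)
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / (n - 1)
var_x = sum((xi - mean_x) ** 2 for xi in x) / (n - 1)
var_y = sum((yi - mean_y) ** 2 for yi in y) / (n - 1)
corr = cov_xy / (var_x ** 0.5 * var_y ** 0.5)

# OLS slope and intercept
b1 = cov_xy / var_x
b0 = mean_y - b1 * mean_x

# Predicted values, R-squared, and standard error of estimate (SEE)
y_hat = [b0 + b1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))   # unexplained variation
sst = sum((yi - mean_y) ** 2 for yi in y)               # total variation
r_squared = 1 - sse / sst
see = (sse / (n - 2)) ** 0.5   # n - 2 degrees of freedom in simple regression

print(round(corr, 4), round(b0, 4), round(b1, 4), round(r_squared, 4))
```

A useful check for the exam: in simple regression, R-squared equals the squared sample correlation between the two variables.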
## Multiple Regression and Machine Learning

| Question # | Concept Tested |
|---|---|
| PP17 - PP22 | |
| PP17 | Predicting the dependent variable using the regression equation |
| PP18 | Confidence interval for the regression coefficient |
| PP19 | Testing the significance of the correlation coefficient |
| PP20 | Interpreting multiple R-squared |
| PP21 | Problems in regression analysis - Heteroskedasticity |
| PP22 | Model misspecification issues - omitted variable |
| PP29 - PP36 | |
| PP29 | Calculating F-statistic |
| PP30 | Qualitative independent variables - interpreting coefficients |
| PP31 | Problems in regression analysis - multicollinearity |
| PP32 | Qualitative independent variables - setting up the model |
| PP33 | Problems in regression analysis - Heteroskedasticity |
| PP34 | Effects of positive serial correlation |
| PP35 | Durbin–Watson statistic |
| PP36 | Qualitative dependent variables - when to use probit and logit models |
| PP37 - PP45 | |
| PP37 | Testing the significance of the correlation coefficient |
| PP38 | Interpreting p-values |
| PP39 | Interpreting p-values |
| PP40 | Predicting the dependent variable using the regression equation |
| PP41 | R-squared and adjusted R-squared |
| PP42 | Interpreting F-stat |
| PP43 | Interpreting F-stat |
| PP44 | Assumptions of multiple regression |
| PP45 | Adjusted R-squared |
| Example 17 | Major types of machine learning |
| 1 | Classification problem vs regression problem |
| 2 | Penalized regression |
| 3 | CART |
| 4 | Neural networks |
| 5 | Clustering |
| 6 | Dimension reduction |
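Several of the multiple-regression questions above (calculating and interpreting the F-stat, R-squared vs. adjusted R-squared) turn on the ANOVA quantities. A minimal sketch of those two calculations, using hypothetical numbers:

```python
# F-statistic and adjusted R-squared from ANOVA quantities
# (hypothetical numbers, not taken from any specific practice problem).

n = 60        # observations
k = 3         # independent variables (slope coefficients)
rss = 120.0   # regression (explained) sum of squares
sse = 180.0   # residual (unexplained) sum of squares
sst = rss + sse

# F-stat tests whether all slope coefficients are jointly zero:
# F = (RSS / k) / (SSE / (n - k - 1))
f_stat = (rss / k) / (sse / (n - k - 1))

r_squared = rss / sst
# Adjusted R-squared penalizes additional independent variables:
adj_r_squared = 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

print(round(f_stat, 2), round(r_squared, 4), round(adj_r_squared, 4))
```

Note that adjusted R-squared is always at or below R-squared, and can fall when a variable with little explanatory power is added.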
## Time-Series Analysis

| Question # | Concept Tested |
|---|---|
| PP20 - PP26 | |
| PP20 | Forecasting using a linear trend model |
| PP21 | Forecasting using a log-linear trend model |
| PP22 | Interpreting the Durbin–Watson statistic |
| PP23 | Covariance stationary time series |
| PP24 | Forecasting using the chain rule |
| PP25 | Interpreting autocorrelations in an AR model |
| PP26 | Mean-reverting level |
| PP27 - PP35 | |
| PP27 | Properties of random walk & covariance stationary time series |
| PP28 | Covariance stationary time series |
| PP29 | Unit root |
| PP30 | Dickey–Fuller test |
| PP31 | Interpreting autocorrelations in an AR model |
| PP32 | Forecasting using a first differenced model |
| PP33 | ARCH |
| PP34 | Working with two time series |
| PP35 | Selecting an appropriate time series model |
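Two calculations above that reward practice are the chain-rule (multiperiod) forecast and the mean-reverting level of an AR(1) model. A minimal sketch with hypothetical coefficients:

```python
# Chain-rule forecast and mean-reverting level of an AR(1) model:
#   x_t = b0 + b1 * x_{t-1} + e_t
# (Hypothetical coefficients, not from the curriculum.)

b0 = 1.2
b1 = 0.7      # |b1| < 1, so the series is covariance stationary
x_t = 5.0     # current value of the series

# One-period-ahead forecast, then chain forward for two periods ahead
x_t1 = b0 + b1 * x_t
x_t2 = b0 + b1 * x_t1

# Mean-reverting level: the value x* that satisfies x* = b0 + b1 * x*
mean_reverting_level = b0 / (1 - b1)

print(round(x_t1, 4), round(x_t2, 4), round(mean_reverting_level, 4))
```

Since the current value (5.0) is above the mean-reverting level (4.0), the forecasts decline toward that level, which is the behavior a covariance stationary AR(1) series exhibits.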
## Simulations (Online Assessment: Jason Yang Case Scenario)

| Question # | Concept Tested |
|---|---|
| Q1 | To compare scenario analysis with simulations |
| Q2 | To define the probability distributions for the variables |
| Q3 | How to treat correlation across variables? |
| Q4 | To define the probability distribution for the simulation variables |
| Q5 | To explain the results of a simulation |
| Q6 | What are the issues in simulation? |
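To make the simulation steps concrete, here is a minimal Monte Carlo sketch in Python. The revenue distribution and cost ratio are hypothetical assumptions, not taken from the case:

```python
# Minimal Monte Carlo sketch: define a probability distribution for an
# input variable, run many trials, and summarize the resulting outcomes.
# (Hypothetical setup, loosely mirroring the steps the case questions cover.)

import random
import statistics

random.seed(42)  # reproducible runs

N_TRIALS = 10_000
revenue_mean, revenue_sd = 100.0, 15.0   # assumed normal distribution for revenue
cost_ratio = 0.6                         # costs assumed to be 60% of revenue

profits = []
for _ in range(N_TRIALS):
    revenue = random.gauss(revenue_mean, revenue_sd)  # draw from the distribution
    profits.append(revenue * (1 - cost_ratio))

# Unlike scenario analysis, which examines a handful of discrete cases,
# a simulation produces a full distribution of outcomes to analyze.
print(round(statistics.mean(profits), 2), round(statistics.stdev(profits), 2))
```

Correlated inputs (Q3) would additionally require drawing the variables jointly rather than independently, for example from a multivariate distribution.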