S2000magician, QM is killing me

I’m really looking forward to when you simplify QM on your website. Thanks in advance.

I’m getting to it.

Take a look at the derivatives stuff I recently posted: pricing and valuing forwards and futures, currency forwards, FRAs, and plain vanilla interest rate swaps. (I’ll get to the other swap material and options soon, I hope. I need to get some Level III stuff finished up and posted.)

Hello sir S2m,

My query is related to QM so I thought I should post this here rather than starting a new topic.

My question is: aren’t the multiple regression and simple regression readings in QM similar to the Cartesian plane material in geometry and basic differential calculus (finding slopes, functions, etc.) that we did in high school?

Sorry, my conceptual understanding of maths/stats is zero; however, I can do all the calculations. Also, can you please explain the meaning/definition of “regression,” in layman’s terms, in the context of linear and multiple regression?

I did search for the meaning of regression online and it says “to go back,” but to go back to WHAT?

Thank you, like always

I would be interested in reading it; can I have the address?

Regression means looking at past data and coming up with an equation that captures the relationship. (This is what I understood from the readings.)

Simple example: supply of money and inflation.

The supply of money affects inflation, so you come up with a relationship between the two based on past data.

You have to decide which variable is dependent and which is independent.

The government can control the supply of money directly through its monetary policy (so no dependency).

The government cannot control inflation 100% directly; it can only try to control it indirectly via the supply of money (so inflation is the dependent variable in this relationship):

inflation = A + B(supply of money)

This will not hold exactly every time. Inflation will be close to the value you get by plugging the supply of money into this equation, but there will be slight ups and downs. (Those tell you how much room for error you have to consider in the future.)

You then use all the techniques covered in chapters 11, 12, and 13 to verify whether the equation you came up with is correct or not. (A and B in the above equation are written as b0 and b1 in the readings.)
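To make the example concrete, here’s a minimal Python sketch that fits inflation = b0 + b1 × (money supply growth) by ordinary least squares. The numbers are made up purely for illustration:

```python
# Hypothetical data (made-up numbers, not real statistics):
money_growth = [2.0, 3.5, 5.0, 6.5, 8.0]   # independent variable (x)
inflation    = [1.1, 1.9, 2.8, 3.9, 4.6]   # dependent variable (y)

n = len(money_growth)
x_bar = sum(money_growth) / n
y_bar = sum(inflation) / n

# OLS slope and intercept (the b1 and b0 from the readings)
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(money_growth, inflation)) \
     / sum((x - x_bar) ** 2 for x in money_growth)
b0 = y_bar - b1 * x_bar

print(f"inflation ≈ {b0:.2f} + {b1:.2f} * money_growth")
```

The fitted line won’t pass through every point; the leftover ups and downs are the residuals, which the techniques in chapters 11–13 examine.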

Got it. But how is this “regression” different from the Cartesian plane material and the basic differential calculus taught in high school?

y = a + bx + u

where a = alpha (the y-intercept),

b = beta (the slope), a.k.a. rise/run, and

u = the error term.

The equation is the same for the Cartesian stuff and this regression stuff.

Is regression just another name for it?

Also, this regression relates to differential calculus: the slope is the rate of change, and Y (the dependent variable) is a function of X (the independent variable). Right?

I wouldn’t try to understand the mathematical background of the theory, but rather what regression analysis is trying to produce. We regress data in order to create an equation that will predict future values from a set of observed values with a certain degree of confidence.

Actually, I am good with what Regression does/does not do. (and yes I realize that would suffice for exam)

But I am confused about how regression relates to the material I mentioned earlier. I saw that material in high school and it was very similar to regression. It can’t just be a coincidence.

I am trying to develop a framework about all these Mathematical/Statistical concepts so I can get the bigger picture.

Where is the site?

I would like to visit.

Magician’s website?



You use differential calculus because, in regression, you are trying to minimize the mean squared error, and differentiation achieves that; that’s why it’s called the least squares method.

Multiple regression uses the same concept as simple regression, but has greater explanatory power because of the additional variables. There are also more potential issues, because of interactions between variables, overfitting, etc., which is why there is a battery of new tests you have to learn.

Oh, okay. I get the picture now, but it’s blurry. Can you please expand a little on “because in regression, you are trying to minimize the mean squared error and differentiation achieves that”?

Massive thanks. smiley

In calculus, by setting the derivative of a function to zero, one can find its minimum. And since the point of the regression line is to minimize the error term (MSE), using differentiation achieves that.

In simple (linear) regression, you are trying to compute the slope (m) and intercept (b) of a line y = mx + b that minimizes the sum of the squared (vertical) distances between the data points and the line. So, if your data points are (x1, y1), (x2, y2), . . . , (xn, yn), then, for each data point (xi, yi), you have the point on the line (xi, yi*), where yi* = mxi + b. Then the function you’re trying to minimize is:

f(m, b) = Σ(yi – yi*)²

= Σ[yi – (mxi + b)]²

This is a function of two variables: m and b. To determine the values of m and b that minimize this function, you compute the partial derivatives ∂f/∂m and ∂f/∂b, set them both equal to zero, and solve the two (linear) equations simultaneously for m and b.
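To see where those two equations lead, here’s a quick plain-Python sketch (made-up data points): setting the partials to zero gives the so-called normal equations, a 2×2 linear system in m and b, solved here by Cramer’s rule:

```python
# Normal equations from ∂f/∂m = 0 and ∂f/∂b = 0:
#   m·Σxi² + b·Σxi = Σxi·yi
#   m·Σxi  + b·n   = Σyi
xs = [1.0, 2.0, 3.0, 4.0]          # made-up data
ys = [2.1, 3.9, 6.2, 7.8]

n = len(xs)
Sx  = sum(xs)
Sy  = sum(ys)
Sxx = sum(x * x for x in xs)
Sxy = sum(x * y for x, y in zip(xs, ys))

det = Sxx * n - Sx * Sx            # determinant of the 2x2 system
m = (Sxy * n - Sx * Sy) / det      # slope
b = (Sxx * Sy - Sx * Sxy) / det    # intercept

print(f"y ≈ {m:.2f}x + {b:.2f}")
```

Solving the same system symbolically gives the familiar textbook formulas: m = [nΣxy − ΣxΣy] / [nΣx² − (Σx)²] and b = ȳ − m·x̄.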

For multiple (linear) regression, we do the same thing, but now we have two or more slopes (but still only one intercept). We’ll have a similar function f (of three or more variables, one more than the number of independent variables), we’ll compute the partials of f with respect to each of the variables, set all of those partials equal to zero, and solve the set of (linear) equations simultaneously.
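The multiple-regression version of the same minimization can be sketched with NumPy; `np.linalg.lstsq` solves the least-squares problem (equivalently, the system you get from setting all the partials to zero). The data below are fabricated so that the true relationship is exactly y = 1 + 2·x1 + 3·x2:

```python
import numpy as np

# Two independent variables, so f(b0, b1, b2) = Σ[yi − (b0 + b1·x1i + b2·x2i)]².
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = 1.0 + 2.0 * x1 + 3.0 * x2      # exactly linear, so the fit recovers the coefficients

# Design matrix: a column of ones for the intercept, then one column per regressor.
X = np.column_stack([np.ones_like(x1), x1, x2])

# The b = (b0, b1, b2) that minimizes ‖y − Xb‖².
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)                        # recovers [1., 2., 3.]
```

With real data the fit is never exact, and that’s where the battery of tests (multicollinearity, overfitting, etc.) comes in.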

As calculus problems go, these are rather trivial.

More accurately, we’re minimizing the sum of the squared errors (SSE). It amounts to the same thing, however.

s2k is a beast