course Mth 272
assignment #031
031. `query
Applied Calculus II
05-11-2009
......!!!!!!!!...................................
19:14:21
Query problem 7.7.4 points (1,0), (2,0), (3,0), (3,1), (4,1), (4,2), (5,2), (6,2)
......!!!!!!!!...................................
RESPONSE -->
To find the best linear model you use the linear model equation f(x) = ax + b and minimize the sum of squared errors [f(x1) - y1]^2 + ....
(a+b-0)^2 + (2a+b-0)^2 + (3a+b-0)^2 + (3a+b-1)^2 + (4a+b-1)^2+ (4a+b-2)^2+ (5a+b-2)^2 + (6a+b-2)^2
=116a^2+8b^2+56ab-60a-16b+14 then diff. with respect to a
when expanded your expression comes out to
116a^2+8b^2+56ab-74a-16b+14,
which changes the derivatives and the simultaneous equations slightly (see given solution below).
dS/da= 232a+56b-60 = 0
a= 29/9
dS/db = 16b + 56a-16
b= - 185/18
the least squares regression line eqn is then
29/9x - 185/18
the sum of squared errors is
S= (-7.05-0)^2+ (-3.83-0)^2+ (-.611-0)^2....
S=134.8714
confidence assessment: 3
Your approach is correct except for one term of your expansion (see note above); compare your values of a and b and your sum of squared errors with the given solution below.
Given solution:
** The text gives you equations related to the sum of the x terms, sum of y values, sum of x^2, sum of y^2 etc, into which you can plug the given information.
You can also use partial derivatives to get the same results. The strategy is to assume that the equation is y = a x + b, write an expression for the sum of the squared errors, then minimize this expression with respect to a and b, which are treated as variables.
If y = a x + b then the errors at the eight points are respectively
| (a * 1 + b) - 0 |,
| (a * 2 + b) - 0 |,
| (a * 3 + b) - 0 |,
| (a * 3 + b) - 1 |,
| (a * 4 + b) - 1 |,
| (a * 4 + b) - 2 |,
| (a * 5 + b) - 2 |, and
| (a * 6 + b) - 2 |.
The sum of the squared errors is therefore
sum of squared errors: ( (a * 1 + b) - 0 )^2+( (a * 2 + b) - 0 )^2+( (a * 3 + b) - 0 )^2+( (a * 3 + b) - 1 )^2+( (a * 4 + b) - 1 )^2+( (a * 4 + b) - 2 )^2+( (a * 5 + b) - 2 )^2+( (a * 6 + b) - 2 )^2.
It is straightforward, if a little tedious, to simplify this expression: after squaring each term and collecting like terms we get
116•a^2 + 2•a•(28•b - 37) + 8•b^2 - 16•b + 14.
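As a quick check of this algebra, here is a minimal sketch using Python with sympy (an assumed tool, not part of the text's method) that builds the sum of squared errors for the eight points, expands it, and prints the two partial derivatives used below.

from sympy import symbols, expand, diff

a, b = symbols('a b')
points = [(1, 0), (2, 0), (3, 0), (3, 1), (4, 1), (4, 2), (5, 2), (6, 2)]

# Sum of squared errors for the model y = a*x + b, expanded and collected
S = expand(sum(((a*x + b) - y)**2 for x, y in points))

print(S)            # 116*a**2 + 56*a*b + 8*b**2 - 74*a - 16*b + 14 (term order may vary)
print(diff(S, a))   # 232*a + 56*b - 74
print(diff(S, b))   # 56*a + 16*b - 16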
We minimize this expression by finding its partial derivatives.
The derivatives with respect to b and a are respectively
56•a + 16•b - 16 and 232•a + 56•b - 74.
Setting both derivatives equal to zero we get the system
56•a + 16•b - 16 = 0
232•a + 56•b - 74 = 0.
Solving this system for a and b we get
a = 1/2, b = - 3/4.
To see that this is a minimum we evaluate the expression f_aa * f_bb - f_ab^2.
f_aa = 232 and f_bb = 16, while f_ab = 56, so f_aa * f_bb - f_ab^2 = 232 * 16 - 56^2 = 576 is positive; since f_aa is also positive, this tells us we have a minimum.
Thus our equation is
y = a x + b or
y = 1/2 x - 3/4. *&*&
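As a further check (a sketch assuming numpy, not part of the given solution), a standard least-squares fit of the same eight points reproduces this slope and intercept and gives the minimized sum of squared errors, which answers the question below about the sum of the squared errors.

import numpy as np

x = np.array([1, 2, 3, 3, 4, 4, 5, 6], dtype=float)
y = np.array([0, 0, 0, 1, 1, 2, 2, 2], dtype=float)

a, b = np.polyfit(x, y, 1)        # least-squares line y = a*x + b
sse = np.sum((a*x + b - y)**2)    # sum of squared errors at the minimum

print(a, b)   # 0.5 -0.75
print(sse)    # 1.5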
.................................................
......!!!!!!!!...................................
19:14:30
Give the equation of the least squares regression line and explain how you obtained the equation.
......!!!!!!!!...................................
RESPONSE -->
answered
confidence assessment: 3
.................................................
......!!!!!!!!...................................
19:14:36
What is the sum of the squared errors?
......!!!!!!!!...................................
RESPONSE -->
answered
confidence assessment: 3
.................................................
......!!!!!!!!...................................
19:50:45
Query problem 7.7.6 (was 7.7.16) use partial derivatives, etc., to find the least-squares line for (-3,0), (-1,1), (1,1), (3,2)
......!!!!!!!!...................................
RESPONSE -->
confidence assessment: 3
.................................................
......!!!!!!!!...................................
19:51:16
Give the equation of the desired line.
......!!!!!!!!...................................
RESPONSE -->
least squares eqn: y = .3x + 1
confidence assessment: 3
** If y = a x + b then the errors at the four points are respectively
| (a * -3 + b) - 0 |,
| (a * -1 + b) - 1 |,
| (a * 1 + b) - 1 | and
| (a * 3 + b) - 2 |. The sum of the squared errors is therefore
( (a * -3 + b) - 0 )^2 + ( (a * -1 + b) - 1 )^2 + ( (a * 1 + b) - 1 )^2 + ( (a * 3 + b) - 2 )^2 =
[ 9 a^2 - 6 ab + b^2 ] + [ (a^2 - 2 a b + b^2) - 2 ( -a + b) + 1 ] + [ a^2 + 2 ab + b^2 - 2 ( a + b) + 1 ] + [ 9 a^2 + 6 ab + b^2 - 4 ( 3a + b) + 4 ] =
20•a^2 - 12•a + 4•b^2 - 8•b + 6.
This expression is to be minimized with respect to variables a and b.
The derivative with respect to a is 40 a - 12 and the derivative with respect to b is 8 b - 8.
40 a - 12 = 0 if a = 12/40 = .3.
8b - 8 = 0 if b = 1.
The second derivatives with respect to a and b are both positive, and the mixed partial derivative (with respect to a and then b) is zero. So the test for max, min or saddle point yields a max or min, and since both second derivatives are positive the critical point gives a min.
The least-squares line is therefore
y = .3 x + 1.**
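The same kind of check works here (again a sketch assuming numpy, not part of the given solution): fitting the four points reproduces a = .3 and b = 1 and gives the minimized sum of squared errors.

import numpy as np

x = np.array([-3, -1, 1, 3], dtype=float)
y = np.array([0, 1, 1, 2], dtype=float)

a, b = np.polyfit(x, y, 1)        # least-squares line y = a*x + b
sse = np.sum((a*x + b - y)**2)    # .01 + .09 + .09 + .01

print(a, b)   # 0.3 1.0
print(sse)    # 0.2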
.................................................
......!!!!!!!!...................................
19:57:19
What was your expression for the sum of the squared errors?
......!!!!!!!!...................................
RESPONSE -->
(.1-0)^2+ (.7-1)^2 + (1.3-1)^2 + (1.9-2)^2 = .2 = S
confidence assessment: 3
.................................................
......!!!!!!!!...................................
19:58:20
How did you minimize this expression (be specific)?
......!!!!!!!!...................................
RESPONSE -->
I used the calculator to make sure the equation was correct, and graphed the lines that were not correct until I found the best-fitting one.
confidence assessment: 3
.................................................
"
I believe you understand the process well. See my notes for a couple of corrections and details.