
## Exercise 3

### (a)

Let $y_i$ be the measurement at time $t_i$.

Let $m$ be the number of measurements. In this case, $m = 100$.

The residual at time $t_i$ is
$$
r_i(x) = (x_1 + x_2 t_i^2)\exp(-x_3 t_i) - y_i
$$
The objective function is
$$
f(x) = \frac{1}{2}\sum_{i=1}^m r_i^2(x) = \frac{1}{2}\sum_{i=1}^m \left((x_1 + x_2 t_i^2)\exp(-x_3 t_i) - y_i\right)^2
$$
Our goal is to minimize $f(x)$; this is a nonlinear least-squares problem. The partial derivatives of the residual are
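The residual and objective can be computed directly; a minimal NumPy sketch, where the sample times `t` and the noise-free `y` are synthetic stand-ins for the actual measurements:

```python
import numpy as np

def residuals(x, t, y):
    # r_i = (x1 + x2*t_i^2) * exp(-x3*t_i) - y_i
    return (x[0] + x[1] * t**2) * np.exp(-x[2] * t) - y

def objective(x, t, y):
    # f(x) = 1/2 * sum of squared residuals
    r = residuals(x, t, y)
    return 0.5 * (r @ r)

t = np.linspace(0, 5, 100)               # m = 100 sample times (placeholder)
y = (3 + 150 * t**2) * np.exp(-2 * t)    # noise-free stand-in for the measurements
print(objective(np.array([3.0, 150.0, 2.0]), t, y))  # exactly 0 at the generating parameters
```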
$$
\frac{\partial r_i}{\partial x_1} = \exp(-x_3 t_i)
$$

$$
\frac{\partial r_i}{\partial x_2} = t_i^2 \exp(-x_3 t_i)
$$

$$
\frac{\partial r_i}{\partial x_3} = -(x_1 + x_2 t_i^2)\, t_i \exp(-x_3 t_i)
$$

The Jacobian matrix, with one row per measurement and one column per parameter, is
$$
J(x) = \left[\frac{\partial r_i}{\partial x_j}\right]_{ij} = \begin{pmatrix}
\exp(-x_3 t_1) & t_1^2 \exp(-x_3 t_1) & -(x_1 + x_2 t_1^2)\, t_1 \exp(-x_3 t_1) \\
\exp(-x_3 t_2) & t_2^2 \exp(-x_3 t_2) & -(x_1 + x_2 t_2^2)\, t_2 \exp(-x_3 t_2) \\
\vdots & \vdots & \vdots \\
\exp(-x_3 t_{100}) & t_{100}^2 \exp(-x_3 t_{100}) & -(x_1 + x_2 t_{100}^2)\, t_{100} \exp(-x_3 t_{100})
\end{pmatrix}
$$
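Assembled row by row, the Jacobian is easy to code and to sanity-check against finite differences. A NumPy sketch, with placeholder sample times `t` and an arbitrary evaluation point (the measurements $y_i$ drop out of the derivatives, so only the model term is needed):

```python
import numpy as np

def model(x, t):
    # Model part of the residual: (x1 + x2*t^2) * exp(-x3*t)
    return (x[0] + x[1] * t**2) * np.exp(-x[2] * t)

def jacobian(x, t):
    # One row per measurement, one column per parameter.
    e = np.exp(-x[2] * t)
    return np.column_stack([
        e,                               # dr_i/dx1
        t**2 * e,                        # dr_i/dx2
        -(x[0] + x[1] * t**2) * t * e,   # dr_i/dx3
    ])

# Forward-difference check at an arbitrary point (illustration only).
t = np.linspace(0, 5, 100)
x = np.array([1.0, 1.0, 1.0])
h = 1e-6
J_fd = np.column_stack([(model(x + h * np.eye(3)[j], t) - model(x, t)) / h
                        for j in range(3)])
print(np.max(np.abs(jacobian(x, t) - J_fd)))  # prints a value near zero
```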

### (b)

#### Gauss-Newton

##### Parameters

| Name    | Value    |
| ------- | -------- |
| x0      | [1,1,1]' |
| descent | 'gauss'  |
| alpha0  | 0.05     |
| tol     | 0.00001  |
| maxIter | 10000    |
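Each Gauss-Newton iteration solves the linearized least-squares subproblem $\min_p \lVert J p + r \rVert^2$ for the step. A minimal sketch under these settings; the original code is not shown, so the fixed step length `alpha0` here is an assumption standing in for whatever step-size rule the actual `'gauss'` implementation uses:

```python
import numpy as np

def gauss_newton(x0, t, y, alpha0=0.05, tol=1e-5, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = (x[0] + x[1] * t**2) * np.exp(-x[2] * t) - y
        e = np.exp(-x[2] * t)
        J = np.column_stack([e, t**2 * e, -(x[0] + x[1] * t**2) * t * e])
        # Gauss-Newton step: least-squares solution of J p = -r
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        if np.linalg.norm(p) < tol:
            break
        x = x + alpha0 * p   # fixed step length (assumed; see lead-in)
    return x
```

On noise-free synthetic data generated from known parameters, a full-step run (`alpha0=1.0`) recovers them quickly, since the model is linear in $x_1$ and $x_2$.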

##### Result

| $x_1$ | $x_2$ | $x_3$ | $f$ |
| ------ | -------- | ------ | ------- |
| 3.3976 | 147.2555 | 1.9922 | 88.0913 |

##### Plot

![](GN.png)

#### Levenberg-Marquardt

##### Parameters

| Name    | Value    |
| ------- | -------- |
| x0      | [1,1,1]' |
| Delta   | 1        |
| eta     | 0.001    |
| tol     | 0.00001  |
| maxIter | 10000    |
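Levenberg-Marquardt replaces the Gauss-Newton subproblem with a damped one, $(J^\top J + \lambda I)\,p = -J^\top r$, growing the damping when a step fails to reduce $f$ and shrinking it when a step succeeds. The `Delta` and `eta` above belong to the assignment's trust-region variant; the sketch below uses the simpler adaptive-$\lambda$ formulation, so its update rule is an illustrative assumption, not the assignment's code:

```python
import numpy as np

def levenberg_marquardt(x0, t, y, lam=1.0, tol=1e-5, max_iter=10000):
    def resid(x):
        return (x[0] + x[1] * t**2) * np.exp(-x[2] * t) - y
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = resid(x)
        e = np.exp(-x[2] * t)
        J = np.column_stack([e, t**2 * e, -(x[0] + x[1] * t**2) * t * e])
        g = J.T @ r
        # Damped normal equations: (J^T J + lam*I) p = -g
        p = np.linalg.solve(J.T @ J + lam * np.eye(3), -g)
        if np.linalg.norm(p) < tol:
            break
        r_new = resid(x + p)
        if 0.5 * (r_new @ r_new) < 0.5 * (r @ r):
            x = x + p      # step accepted: relax the damping
            lam *= 0.5
        else:
            lam *= 2.0     # step rejected: increase damping, retry
    return x
```

Large $\lambda$ pushes the step toward small gradient-descent moves; small $\lambda$ recovers the Gauss-Newton step.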

##### Result

| $x_1$ | $x_2$ | $x_3$ | $f$ |
| ------ | -------- | ------ | ------- |
| 3.3984 | 147.2763 | 1.9922 | 88.0908 |

##### Plot

![](LM.png)

#### Discussion

The parameter estimates from Gauss-Newton and Levenberg-Marquardt are very similar, and the objective value achieved by Levenberg-Marquardt is slightly lower than Gauss-Newton's (88.0908 versus 88.0913).

The fit plots likewise show no obvious difference between the two methods: both follow the noisy measurements closely, and the estimated parameters $x$ are close to the actual values.
