A classic observation regarding polynomial fitting is Runge's phenomenon: blindly increasing the polynomial degree when fitting a data set can sometimes lead to wild oscillation, or even blowup, of the fitted polynomial.

One lesson from this observation is that when doing polynomial fitting, you should be careful about simply "throwing more firepower" at the problem by increasing the polynomial degree.

I'd like you to reimplement the experiment as follows. Start with 10 evenly spaced $x$-values in $[-1,1]$ using `linspace`. Then, compute a set of $y$-values at those points using the classic Runge function:

$$ f(x) = \frac{1}{1 + 25 x^2} $$

Now, try doing a polynomial fit of degree 10 on this data set and plotting it against the function. What do you notice near the endpoints? Try increasing the number of nodes used and see what happens.

Compare this to the fit when using a spline to interpolate the curve.
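Here is a minimal sketch of the whole experiment. It assumes the classic Runge function $1/(1+25x^2)$ and uses `np.polyfit`/`np.polyval` for the polynomial fit and `scipy.interpolate.CubicSpline` for the spline; instead of only plotting, it also prints each fit's maximum error on a dense grid so the difference is visible numerically.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def runge(x):
    # The classic Runge function (an assumption -- use whatever f the text intends).
    return 1.0 / (1.0 + 25.0 * x**2)

# 10 evenly spaced nodes on [-1, 1]
x_nodes = np.linspace(-1, 1, 10)
y_nodes = runge(x_nodes)

# Degree-10 fit. (With only 10 nodes the system is rank-deficient -- a
# degree-9 polynomial already interpolates -- so polyfit may emit a
# RankWarning; the oscillation near the endpoints shows up either way.)
coeffs = np.polyfit(x_nodes, y_nodes, 10)

# Compare both fits to the true function on a dense grid.
x_dense = np.linspace(-1, 1, 1001)
poly_err = np.max(np.abs(np.polyval(coeffs, x_dense) - runge(x_dense)))

spline = CubicSpline(x_nodes, y_nodes)
spline_err = np.max(np.abs(spline(x_dense) - runge(x_dense)))

print(f"max polynomial error: {poly_err:.4f}")
print(f"max spline error:     {spline_err:.4f}")
```

Increasing the number of nodes (and the degree along with it) makes the polynomial error near $\pm 1$ worse, while the spline error keeps shrinking.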

Recall that the discrete Laplacian is the "tridiagonal" matrix:

$$ \begin{bmatrix} 2 & -1 \\ -1 & 2 & -1 \\ & -1 & 2 & -1 \\ & & \ddots & \ddots & \ddots \\ & & & -1 & 2 \end{bmatrix} $$

One way this shows up is in the finite differences approximation of the 1D Poisson equation:

$$ -\Delta u = -u_{xx} = f $$

on the interval $[0,1]$ with boundary conditions $u(0)=0$ and $u(1)=0$.

We can approximate $u_{xx}$ by creating a uniform grid of points in the interval $[0,1]$ which are spaced apart by distance $h$. The finite differences formula for this is given by:

$$ -u_{xx} \approx -\frac{u_{i-1} - 2 u_i + u_{i+1}}{h^2} $$

Notice that, based on this formula, the entries of the discrete Laplacian should really be scaled by $1/h^2$. So, we'll make that slight change and work with the matrix:

$$A = \frac{1}{h^2} \cdot \begin{bmatrix} 2 & -1 \\ -1 & 2 & -1 \\ & -1 & 2 & -1 \\ & & \ddots & \ddots & \ddots \\ & & & -1 & 2 \end{bmatrix} $$

Now, it's easy to check that the eigenfunctions of the original equation are $\sin(n \pi x)$, with eigenvalues $(n\pi)^2$, for all $n\ge 1$.

The goal of this problem is to use the `eigvalsh` function in `scipy.linalg` to check how closely we're approximating the eigenvalues of the continuous problem using the $n \times n$ discrete Laplacian as $n$ grows.

For simplicity, check how well the first 6 eigenvalues are approximated as $n$ grows.

Note: `eigvalsh` *should* give you the eigenvalues in sorted order! So, you can just take the first 6 entries.
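A sketch of that check, under the assumption that the $n \times n$ matrix corresponds to $n$ interior grid points on $[0,1]$, so the spacing is $h = 1/(n+1)$; the continuous eigenvalues to compare against are $(k\pi)^2$ for $k = 1, \dots, 6$.

```python
import numpy as np
from scipy.linalg import eigvalsh

def laplacian_eigs(n, k=6):
    # n interior points on [0, 1] => spacing h = 1/(n+1)
    h = 1.0 / (n + 1)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    # eigvalsh returns the eigenvalues in ascending order,
    # so the first k entries are the ones we want
    return eigvalsh(A)[:k]

exact = (np.arange(1, 7) * np.pi) ** 2
for n in (10, 100, 1000):
    errs = np.abs(laplacian_eigs(n) - exact)
    print(f"n = {n:5d}  max error in first 6 eigenvalues: {errs.max():.2e}")
```

The errors should shrink roughly like $O(h^2) = O(1/n^2)$, which is the accuracy of the finite differences formula itself.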

One of the original uses of splines was in the design of ships. Flexible rulers called "splines" were held in place by lead weights called "ducks". Since the rulers were elastic, they relaxed into a shape of "least energy" which defined a curve meeting the duck constraints.

See if you can find some images of "splines and ducks" being used in ship design.

*Yes; that's really all I'm asking. No "real" problem to work on. :-)*

Consider the numerical behavior of the function:

$$ f(x) = \frac{1 - \cos(x)}{x^2} $$

on the interval $(0,a]$ as $a$ gets very small. To avoid a divide-by-zero issue, just take the left endpoint to be nearly zero; for example, $10^{-16}$.
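To see the trouble numerically, a minimal sketch (the grid here is just illustrative):

```python
import numpy as np

x = np.logspace(-16, 0, 200)      # (0, 1] with left endpoint 1e-16
f_naive = (1.0 - np.cos(x)) / x**2

# The true value tends to 1/2 as x -> 0, but for x below roughly 1e-8
# the subtraction 1 - cos(x) cancels to zero (or pure rounding noise).
print(f_naive[0])                 # at x = 1e-16, cos(x) rounds to exactly 1, so f = 0.0
```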

Using the identity $\sin^2(x) = \frac{1 - \cos(2x)}{2}$, try rewriting the function $f(x)$ in a way which avoids the dangerous subtraction.

As another alternative, try using a second-order Taylor expansion of $\cos(x)$ instead to see if that helps the behavior. (One advantage of this is that you can get rid of the division by $x$ entirely.)
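Both fixes can be sketched as follows (my arrangement, not a prescribed solution): the identity gives $1 - \cos(x) = 2\sin^2(x/2)$, so $f(x) = 2\sin^2(x/2)/x^2$, while the second-order Taylor expansion $\cos(x) \approx 1 - x^2/2$ collapses $f$ to the constant $1/2$ near zero, with no division at all.

```python
import numpy as np

def f_stable(x):
    # Half-angle rewrite: 1 - cos(x) = 2 sin^2(x/2), no cancellation.
    return 2.0 * np.sin(x / 2.0)**2 / x**2

def f_taylor(x):
    # Second-order Taylor: cos(x) ~ 1 - x^2/2, so f(x) ~ 1/2 -- the x^2
    # cancels symbolically, removing the division entirely.
    return np.full_like(np.asarray(x, dtype=float), 0.5)

print(f_stable(1e-16))   # 0.5 -- the correct limit, where the naive formula gives 0
print(f_taylor(1e-16))   # 0.5
```

Note that the Taylor version is only accurate for small $x$; the half-angle rewrite is accurate on the whole interval.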