Teaching

Teaching with Julia vs. teaching with MATLAB

In fall 2020, I was set to teach our first-semester computational math course using the book I co-authored, with its couple hundred MATLAB codes. Then an honors student approached me about doing an add-on honors section.

Jekyll for clicker questions

For a few years, I’ve been a fan of clickers (aka personal response systems) for large lecture sections. Clickers are a simple and scalable way to bring a little active learning to the whole classroom.

Trefethen & Bau & MATLAB & Julia, Lectures 24-29: Eigenvalue stuff

Part V of T&B is on dense methods for eigenvalue and singular value problems. For my course, this is the part of the text that I condense most severely. In part that’s because I need to leave room for unconstrained nonlinear solving and optimization later on.

Trefethen & Bau & MATLAB & Julia: Lectures 20, 21, 23: Solving square systems

Three in one this time: Lecture 20, on Gaussian elimination and LU factorization; Lecture 21, on row pivoting; and Lecture 23, on Cholesky factorization. I mostly skipped Lecture 22, on the curious case of the stability of pivoted LU, but its main example is folded into the end of my coverage of pivoting.
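
For orientation, here is roughly what those factorizations look like using Julia’s LinearAlgebra standard library; the matrix is a made-up example, not one from the notebooks.

```julia
using LinearAlgebra

# Illustrative random matrix (not from the notebooks).
A = rand(4, 4)

# LU with partial (row) pivoting: returns L, U, and the row permutation p.
F = lu(A)
@show norm(F.L * F.U - A[F.p, :])   # residual should be near machine epsilon

# Cholesky needs a symmetric positive definite matrix; A'A + I is a handy example.
S = Symmetric(A' * A + I)
C = cholesky(S)
@show norm(C.U' * C.U - S)
```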

Trefethen & Bau & MATLAB & Julia: Lecture 19, Stability of least squares

Here are the notebooks in MATLAB and Julia. The new wrinkle in these codes is extended precision. In MATLAB, extended precision requires the Symbolic Math Toolbox, in the form of vpa.
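
For comparison, the Julia route to extended precision is BigFloat, with the working precision set in bits via setprecision. Here is a minimal sketch, not taken from the notebooks:

```julia
# Extended precision in Julia via BigFloat (the analogue of MATLAB's vpa).
# setprecision sets the significand size in bits; 113 bits ≈ IEEE quad precision.
setprecision(BigFloat, 113) do
    x = BigFloat("0.1")      # construct from a string to avoid double rounding
    println(x)
    println(eps(BigFloat))   # spacing of BigFloat values near 1
end
```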

Trefethen & Bau & MATLAB & Julia, Lectures 12-13: Conditioning and floating point

I’ve run into trouble managing gists with lots of files in them, so I’m back to doing one per lecture. Here are Lecture 12 and Lecture 13. We’ve entered Part III of the book, which is about conditioning and stability.
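
As a quick reminder of the two themes, here is a tiny Julia illustration; the example matrix is my own toy, not one from the notebooks.

```julia
using LinearAlgebra

# Floating point: eps(x) is the spacing between x and the next Float64.
@show eps(1.0) eps(2.0^52)

# Conditioning: cond(A) bounds how much relative perturbations in the data
# can be amplified in the solution of Ax = b.
A = [1.0 1.0; 1.0 1.000001]
@show cond(A)
```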

Trefethen & Bau & MATLAB & Julia, Lecture 11: Least squares

This week’s notebooks (MATLAB and Julia; all the lectures are now gathered together for each language) are about least squares polynomial fitting. The computational parts are almost identical, except for how polynomials are represented.
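
Here is a minimal sketch of the idea in Julia, assuming a plain Vandermonde-matrix setup and toy data; the notebooks use their own data and polynomial representation.

```julia
using LinearAlgebra

# Least squares fit of a cubic to noisy samples (toy data, not the notebook's).
t = range(0, 1, length=40)
y = cos.(4t) .+ 0.05 .* randn(length(t))

# Build a Vandermonde-style matrix for a degree-3 polynomial and solve the
# overdetermined system in the least squares sense with backslash.
V = [ti^j for ti in t, j in 0:3]
c = V \ y                     # coefficients of c0 + c1*t + c2*t^2 + c3*t^3
@show norm(V * c - y)         # residual of the least squares fit
```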

Trefethen & Bau & MATLAB & Julia, Lecture 8: Gram-Schmidt

This lecture is about the modified Gram-Schmidt method and flop counting. The notebooks are here. Almost as an afterthought, I decided to add a demonstration comparing the timing of Gram-Schmidt to the asymptotic flop count.
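
Here is a rough sketch of that kind of experiment in Julia, assuming a textbook-style modified Gram-Schmidt and the usual ~2mn² flop count; the notebook’s version differs in the details.

```julia
using LinearAlgebra

# Modified Gram-Schmidt QR, written the straightforward textbook way.
function mgs(A)
    m, n = size(A)
    Q = float(copy(A))
    R = zeros(n, n)
    for j in 1:n
        R[j, j] = norm(Q[:, j])
        Q[:, j] /= R[j, j]
        for k in j+1:n
            R[j, k] = dot(Q[:, j], Q[:, k])
            Q[:, k] -= R[j, k] * Q[:, j]
        end
    end
    return Q, R
end

# Time MGS on tall matrices and compare to the ~2mn^2 asymptotic flop count.
for n in (200, 400, 800)
    m = 2n
    A = randn(m, n)
    t = @elapsed mgs(A)
    println("n = $n: $(round(t, digits=3)) s, apparent flop rate ≈ $(round(2m*n^2/t, sigdigits=3))")
end
```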

Trefethen & Bau & MATLAB & Julia, Lectures 6-7

Here are the Jupyter notebooks for Lecture 6 and Lecture 7. (I finally noticed that a Gist can hold more than one notebook…duh.) Not much happened in Lecture 6, but I got gobsmacked in Lecture 7.

Trefethen & Bau & MATLAB & Julia Lecture 5: More on the SVD

Notebooks are viewable for MATLAB and Julia. This is one of my favorite demos: it uses low-rank approximation by the SVD to reveal patterns in voting behavior in the U.S. Congress.
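
To give the flavor, here is a minimal Julia sketch of rank-k approximation by the SVD, with a synthetic low-rank-plus-noise matrix standing in for the voting data.

```julia
using LinearAlgebra

# Rank-2 approximation via the SVD. The real demo uses a Congress voting
# matrix; a synthetic rank-2 signal plus noise stands in for it here.
A = randn(100, 2) * randn(2, 50) + 0.1 * randn(100, 50)

U, s, V = svd(A)
k = 2
Ak = U[:, 1:k] * Diagonal(s[1:k]) * V[:, 1:k]'   # best rank-k approximation

@show s[1:4]                      # singular values drop sharply after k = 2
@show opnorm(A - Ak) / opnorm(A)  # relative error equals s[k+1] / s[1]
```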