Some changes arrive with great drama and attention. Others creep in stealthily, like the gradually warming water that boils the proverbial frog (a myth, as it happens).
I can’t think of the last time I welcomed the sight of a printed journal in my mailbox. Though I might print the occasional article that needs to be pored over slowly, I’ll recycle that copy after a bit; the electronic version is the one that I archive.

I’ve spent the last two spring semesters teaching ODEs (ordinary differential equations) to a total of about 170 biomedical and chemical engineering majors. The content is dictated by a number of constraints: the perceived desires of the client departments; multiple instructors, all of whom have more experience with the course than I do; and traditional expectations. Based on a limited survey of popular textbooks (this, this, and our choice, Brannan and Boyce), many courses like this are quite similar.

I’ve used MATLAB for over 25 years. (And before that, I even used MATRIXx, a late, unlamented attempt at a spinoff, or maybe a ripoff.) It’s not the first language I learned to program in, but it’s the one that I came of age with mathematically. Knowing MATLAB has been very good to my career.
However, it’s impossible to ignore the rise of Python in scientific computing. MathWorks must feel the same way: not only did they add the ability to call Python directly from within MATLAB, but they’ve borrowed some of its language features, such as more aggressive broadcasting for operands of binary operators.
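For reference, here’s a minimal NumPy sketch of the broadcasting behavior in question: operands whose shapes differ are virtually expanded along their singleton dimensions before the elementwise operation is applied. (MATLAB calls its version of this implicit expansion.)

```python
import numpy as np

# A 3x1 column plus a 1x2 row: each operand is virtually expanded
# along its singleton dimension, yielding a 3x2 result.
col = np.array([[1.0], [2.0], [3.0]])
row = np.array([[10.0, 20.0]])
total = col + row
# total[i, j] == col[i, 0] + row[0, j], e.g. total[2, 1] == 23.0
```

In older MATLAB, the same operation would have required an explicit `repmat` (or `bsxfun`) to match the shapes first.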

At long last, I’ve refreshed the look for this site. Previously it was based on a “Metro UI” style for HTML, which looked nice to me at the time. Actually it still looks pretty nice, but it was named for the Metro design introduced with Windows 8, which tells you that it wasn’t exactly a modern look.
More importantly to me, I’ve ditched writing raw HTML in favor of generating the site with Hugo, a static site generator.

For a few years, I’ve been a fan of clickers (aka personal response systems) for large lecture sections. Clickers are a simple (and scalable) way to incorporate a bit of widespread active learning in the classroom. They can’t work miracles, but they do allow me to reward attendance, rouse the students once in a while, and give good feedback to all of us about how well the latest concepts are sinking in. I like the accountability: If you got the question wrong when 80% of the class got it right, that’s on you, but if only 20% of the class got it right, that’s on me.

I’m going to wrap up the long-paused MATLAB versus Julia comparison on Trefethen & Bau by chugging through all the lectures on iterative methods in one post.
I’m back to using gists; I’m not thrilled with any of the mechanisms for sharing this stuff.
The notebooks cover Lecture 32 (sparse matrices and simple iterations), Lecture 33 (Arnoldi iteration), and Lecture 34 (Arnoldi eigenvalues). These are remarkable mainly in that they have such striking similarity in both languages.
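As a reminder of what those lectures implement, here is a minimal NumPy sketch of the Arnoldi iteration. The actual notebooks follow Trefethen & Bau in MATLAB and Julia; this is just an illustrative translation (no breakdown handling, so it assumes the Krylov space has full dimension):

```python
import numpy as np

def arnoldi(A, b, n):
    """Build an orthonormal basis Q for span{b, Ab, ..., A^(n-1) b}
    and the (n+1) x n upper Hessenberg H with A @ Q[:, :n] == Q @ H."""
    m = len(b)
    Q = np.zeros((m, n + 1))
    H = np.zeros((n + 1, n))
    Q[:, 0] = b / np.linalg.norm(b)
    for k in range(n):
        v = A @ Q[:, k]
        for j in range(k + 1):          # modified Gram-Schmidt step
            H[j, k] = Q[:, j] @ v
            v -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        Q[:, k + 1] = v / H[k + 1, k]   # assumes no breakdown
    return Q, H
```

The eigenvalues of the leading n x n block of H (the Ritz values) are what Lecture 34 uses to approximate eigenvalues of A.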

I’ve just finished one of the most remarkable fiction reading experiences I’ve had in quite some time: It Can’t Happen Here, by Sinclair Lewis.
ICHH is a satirical novel written and set in 1935 America. It describes the rise of a populist dictatorship modeled closely on the rise of the Nazis in Germany. (Lewis’ wife, Dorothy Thompson, was the first American journalist expelled from Nazi Germany, and she was clearly responsible for much of the shape of the book.)

Part V of T&B is on dense methods for eigenvalue and singular value problems. For my course, this is the part of the text that I condense most severely. In part that’s due to the need to cover unconstrained nonlinear solving and optimization stuff later on. But I also find that this is the least compelling part of the text for my purposes.
It’s heavily weighted toward the Hermitian case. That’s the cleanest situation, so I see the rationale.

Three in one this time: Lecture 20, on Gaussian elimination / LU factorization; Lecture 21, on row pivoting; and Lecture 23, on Cholesky factorization. I mainly skipped over Lecture 22, about the curious case of the stability of pivoted LU, but its main example is dropped into the end of my coverage of pivoting.
The Julia surprises are, not surprisingly, coming less frequently. In Lecture 20 I had some fun with rational representations.
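In the same spirit, elimination in exact rational arithmetic is easy to play with. Here is a sketch using Python’s `fractions.Fraction` as a stand-in for Julia’s `Rational` type (no pivoting, so it assumes the pivots are nonzero):

```python
from fractions import Fraction

def lu_exact(A):
    """LU factorization without pivoting in exact rational arithmetic.
    Returns unit lower triangular L and upper triangular U with L*U == A."""
    n = len(A)
    U = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]      # exact multiplier, no rounding
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]
    return L, U
```

Because every multiplier is stored exactly, the factors reproduce the original matrix with no roundoff at all, which makes the stability discussion stand out all the more clearly against floating point.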

Here are the notebooks in MATLAB and Julia.
The new wrinkle in these codes is extended precision. In MATLAB you need the Symbolic Math Toolbox, which provides it in the form of vpa. In Julia, you have to use version 0.5 or (presumably) later, which had a surprising side effect I’ll get to below.
The reason for extended precision is that this lecture presents experiments on the accuracy of different algorithms for linear least squares problems.
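To see why extended precision matters for that kind of experiment, here is a small stdlib illustration in Python. The `decimal` module plays roughly the role that vpa does in MATLAB or BigFloat does in Julia (the post itself uses those tools, not `decimal`):

```python
from decimal import Decimal, getcontext

# In binary64, 1e-17 is below half of machine epsilon (about 1.1e-16),
# so adding it to 1.0 rounds back to 1.0 and the tiny term is lost.
lost = (1.0 + 1e-17) - 1.0        # exactly 0.0

# With 30 significant digits, the same computation keeps the term.
getcontext().prec = 30
kept = (Decimal(1) + Decimal("1e-17")) - Decimal(1)
```

When an algorithm’s error is down near machine precision, you need something like this extended-precision result to serve as the “exact” answer in accuracy measurements.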
