GOAL: teach practical computational techniques using Python programming in 2 weeks to someone with little or no programming experience.
Anyone can learn basic Python programming and data structures from the material for the first 2-3 days posted below. For the rest, you will occasionally need concepts from linear algebra and calculus.
This is the material from a 2 week Python programming bootcamp for our students heading to internships for the summer as part of the NSF funded PI4 Program for Interdisciplinary and Industrial Internships at Illinois. The camp was held June 7 -- 18, 2016 and met 9 am to 5 pm. Each day we: (i) learned new computational concepts; (ii) tested our understanding on short programming exercises; (iii) presented and critiqued student project solutions; and (iv) coded the next project. Most of the time each day was spent programming.
Discussed printing output, basic expressions, conditionals, functions, looping and lists. Also briefly introduced the assert statement as a rudimentary way to test code.
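A minimal sketch covering these first-day ideas in one place (the particular function and values are made up for illustration, not taken from the course exercises):

```python
# A function, a conditional, a loop over a list, and an assert as a tiny test.
def classify(n):
    """Return 'even' or 'odd' for an integer n."""
    if n % 2 == 0:
        return "even"
    return "odd"

labels = []
for n in [1, 2, 3, 4]:
    labels.append(classify(n))

print(labels)                      # a list of strings
assert classify(10) == "even"      # the assert statement as a rudimentary test
```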
Introduced a few fundamental data structures in Python: lists, tuples, dictionaries and sets. Also introduced generator- and iterator-based techniques. Along the way we saw some compact ways to construct lists and sets, and how to use generators with functions like min, max and sum.
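For instance, comprehensions give a compact way to build these containers, and a generator expression can feed sum or max without materializing a list (the data here is illustrative):

```python
squares = [n * n for n in range(6)]                 # list comprehension
residues = {n % 3 for n in range(6)}                # set comprehension: duplicates collapse
lengths = {word: len(word) for word in ("a", "bc")} # dict comprehension

# A generator expression is consumed lazily by sum/max: no intermediate list
total = sum(n * n for n in range(6))
largest = max(n * n for n in range(6))
```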
We saw the basics of opening files and reading and writing their contents. Along the way we covered basic string processing and formatting. Then we covered the array and matrix objects of NumPy, including how to create arrays of various kinds and some simple operations one can do on arrays. This included helper functions for common tasks like uniform subdivision of an interval. We closed the day with some basic plotting of single-variable functions and some simple analysis and histograms of data. The new packages used today were NumPy, Matplotlib and a little bit of SciPy.
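A small sketch tying the file I/O and NumPy pieces together (the file name and data are invented for the example; the Matplotlib plotting itself is omitted here):

```python
import os
import tempfile
import numpy as np

# Write formatted lines to a text file, then read them back
path = os.path.join(tempfile.gettempdir(), "samples.txt")
with open(path, "w") as f:
    for x in [0.0, 0.5, 1.0]:
        f.write("x = {:.2f}\n".format(x))

with open(path) as f:
    lines = f.read().splitlines()

# linspace: uniform subdivision of [0, 1] into 5 points
xs = np.linspace(0.0, 1.0, 5)
ys = xs ** 2          # an elementwise operation on the whole array
```

With Matplotlib one would then plot `ys` against `xs` via `plt.plot(xs, ys)`.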
Today we continued with more operations on NumPy arrays. We saw some basic restructuring methods like stacking and reshaping, and how to create the arrays needed for plotting. Finally, we saw how to sample scalar functions of two variables on a grid and plot the result using contour, wireframe and surface plots. Most of the rest of the day was spent working in groups of three, debugging each other's code.
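A sketch of the restructuring and grid-sampling steps (the function f(x, y) = x² + y² is an illustrative choice; the plotting call itself is left out):

```python
import numpy as np

a = np.arange(6)
m = a.reshape(2, 3)            # reshape a flat array into a 2x3 matrix
stacked = np.vstack([m, m])    # stack two copies vertically -> 4x3

# Sample f(x, y) = x**2 + y**2 on a grid, as one does before a contour plot
x = np.linspace(-1.0, 1.0, 3)
y = np.linspace(-1.0, 1.0, 3)
X, Y = np.meshgrid(x, y)       # coordinate matrices for every grid point
Z = X**2 + Y**2                # samples of f at all grid points at once
```

`Z` is exactly what `plt.contour(X, Y, Z)` or a surface plot would consume.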
We studied the (sometimes surprising) impact of finite precision arithmetic on computations. Then we worked with the dense and sparse matrix types available in NumPy and SciPy. We saw direct solvers, and the idea of and need for matrix factorizations for solving linear systems; the factorizations we looked at, and used, were LU and Cholesky. The matrix condition number and how it affects the quality of the solution was then introduced: for small perturbations in the input we want the answer to change as little as possible, but systems whose columns are nearly linearly dependent tend to be poorly conditioned. For very large systems with sparse matrices we looked at some iterative methods: the Jacobi method, the conjugate gradient (CG) method for symmetric positive definite matrices, and the GMRES method for general matrices.
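A compressed sketch of several of these ideas, using a small made-up SPD system (the matrices here are illustrative, not the course's examples):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cho_factor, cho_solve
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# Finite precision surprise: 0.1 + 0.2 is not exactly 0.3 in floating point
surprise = (0.1 + 0.2 == 0.3)

# A small dense SPD system solved via LU and via Cholesky
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_lu = lu_solve(lu_factor(A), b)
x_chol = cho_solve(cho_factor(A), b)

# Condition number: large values signal sensitivity to input perturbations
kappa = np.linalg.cond(A)

# A sparse SPD tridiagonal system solved iteratively with CG
S = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(5, 5)).tocsr()
x_cg, info = cg(S, np.ones(5))   # info == 0 signals convergence
```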
Covered linear least squares problems and two methods for solving them: the normal equations and the QR factorization. We looked at the difference between the reduced and full QR factorizations. The very simple power iteration for finding the dominant eigenvalue was introduced, and it was shown how to turn it into a method for finding any eigenvalue using shifted inverse iteration. Finally, combining the eigenvalue update with the shifted iteration, we arrived at the Rayleigh quotient iteration. The SciPy and NumPy tools for solving eigenvalue problems using more modern and standard methods were then introduced. Finally we looked at the SVD (singular value decomposition).
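A sketch of least squares via reduced QR, a bare-bones power iteration, and the SVD call (the data and matrix are invented; the points chosen lie exactly on a line so the fit is exact):

```python
import numpy as np

# Least squares fit of a line c0 + c1*t through the points (0,0), (1,1), (2,2)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([0.0, 1.0, 2.0])
Q, R = np.linalg.qr(A)                 # reduced QR by default
coeffs = np.linalg.solve(R, Q.T @ b)   # solve the small triangular system

# Power iteration: repeated multiply-and-normalize finds the dominant eigenvector
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])             # eigenvalues 3 and 1
v = np.array([1.0, 0.0])
for _ in range(50):
    v = M @ v
    v = v / np.linalg.norm(v)
lam = v @ M @ v                        # Rayleigh quotient eigenvalue estimate

# The standard library routine for the SVD
U, s, Vt = np.linalg.svd(M)
```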
We worked on Project 6 from the day before and any remaining unfinished coding.
Looked at how to solve a single nonlinear equation and systems of nonlinear equations using scipy.optimize. Then we saw an overview of the landscape of numerical optimization and how to classify and find algorithms based on the type of objective function and types of constraints. We saw how to use the linear programming function and nonlinear unconstrained and constrained optimization functions in scipy.optimize.
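A sketch of each of these uses of scipy.optimize on small made-up problems (the equations, objective and constraints are illustrative only):

```python
import numpy as np
from scipy.optimize import brentq, root, linprog, minimize

# Single nonlinear equation: solve x**2 - 2 = 0 with a bracketing method
r = brentq(lambda x: x**2 - 2.0, 0.0, 2.0)

# System of nonlinear equations: x + y = 3, x*y = 2
sol = root(lambda v: [v[0] + v[1] - 3.0, v[0] * v[1] - 2.0], x0=[0.0, 1.0])

# Linear program: minimize -x - y subject to x + 2y <= 4 and x, y >= 0
lp = linprog(c=[-1.0, -1.0], A_ub=[[1.0, 2.0]], b_ub=[4.0],
             bounds=[(0, None), (0, None)])

# Unconstrained nonlinear minimization of a quadratic bowl
res = minimize(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2, x0=np.zeros(2))
```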
We worked on Project 7 from the day before and any remaining unfinished coding.
Outline of directed and undirected graphs and how to create and draw them using NetworkX. Introduced a few fundamental graph algorithms such as traversal, shortest path, minimal spanning tree and maximum flow. Then we saw how to use odeint from scipy.integrate to solve a single ordinary differential equation (ODE) and a system of such equations, and how to draw planar vector fields and streamplots for visualizing the flow of an ODE. We then looked at the advection equation, a simple linear hyperbolic partial differential equation (PDE), and its solution using finite differencing. The finite element method for 1D problems was introduced, with the FEniCS software presented as a tool for working with elliptic PDEs. Finally we looked at how to use scipy.interpolate to interpolate data in 1 and 2 dimensions.
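A sketch of the odeint and interpolation steps on a toy problem (the ODE y' = -y, whose exact solution is exp(-t), is chosen here only so the answer can be checked; the graph and PDE material is not reproduced):

```python
import numpy as np
from scipy.integrate import odeint
from scipy.interpolate import interp1d

# Solve the scalar ODE y' = -y with y(0) = 1 on [0, 1]
def rhs(y, t):
    return -y

t = np.linspace(0.0, 1.0, 11)
y = odeint(rhs, 1.0, t).ravel()     # samples of the solution at the times t

# Interpolate the discrete samples and evaluate between them
f = interp1d(t, y, kind="cubic")
mid = float(f(0.55))                # an off-grid value, close to exp(-0.55)
```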
This bootcamp was made possible by funding from NSF Grant 1345032 (in the Mentoring Through Critical Transition Points program). In addition I'd like to acknowledge the following people: The PI4 PIs (Yuliy Baryshnikov, Rick Laugesen, Lee DeVille) for suggesting the idea of such a bootcamp; the Math IT team of Department of Mathematics for imagining and creating the physical and computational infrastructure and for their systems help throughout; Pat Szuta of the Math IT for teaching the introduction to Unix and git; the TA Stefan Klajbor for his enthusiasm and for his help; Dr. Kaushik Kalyanaraman for discussions about the material and help with solutions; and Sean Shahkarami for creating the 2015 iteration of this bootcamp.