Courses

Below is a list of courses acceptable for fulfilling the Applied Math requirements. It is not intended to be exhaustive, and not every course is offered every year. Students should consult the Director of Graduate Studies for more information. For this year’s courses, please visit: Applied Math Graduate Courses

AMTH 500, Spectral Graph Theory & Apps:   An applied approach to spectral graph theory. The combinatorial meaning of the eigenvalues and eigenvectors of matrices associated with graphs. Applications to optimization, numerical linear algebra, error-correcting codes, computational biology, and the discovery of graph structure.
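As a small illustration of this material (not course code), the eigenvectors of a graph Laplacian can be computed directly in numpy; in this hypothetical example on a path graph, the second eigenvector (the Fiedler vector) recovers the ordering of the vertices, and its sign pattern gives a balanced cut.

```python
import numpy as np

# Hypothetical example: Laplacian eigenvectors of a path graph on n vertices.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):              # edges of the path: i -- i+1
    A[i, i + 1] = A[i + 1, i] = 1
L = np.diag(A.sum(axis=1)) - A      # combinatorial Laplacian L = D - A

eigvals, eigvecs = np.linalg.eigh(L)
print(eigvals[0])                   # ~0: one zero eigenvalue per connected component
print(eigvecs[:, 1])                # Fiedler vector: monotone along the path;
                                    # its sign pattern yields a balanced graph cut
```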

MATH 520, Measure Theory and Integration:  Construction and limit theorems for measures and integrals on general spaces; product measures; Lp spaces; integral representation of linear functionals. 

AMTH 525, Seminar in Applied Mathematics:  This course consists of weekly seminar talks given by a wide range of speakers. Required of all first-year students.  

AMTH 552 / CPSC 663, Deep Learning Theory and Applications:   Deep neural networks have gained immense popularity in the past decade due to their outstanding success in many important machine-learning tasks such as image recognition, speech recognition, and natural language processing. This course provides a principled and hands-on approach to deep learning with neural networks. Students master the principles and practices underlying neural networks, including modern methods of deep learning, and apply deep learning methods to real-world problems including image recognition, natural language processing, and biomedical applications. Course work includes homework and a final project—either group or individual, depending on enrollment—with both a written and oral (i.e., presentation) component. The course assumes basic prior knowledge in linear algebra and probability.
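A minimal sketch of the kind of model the course starts from: a one-hidden-layer ReLU network trained by gradient descent on a toy XOR-like task. This is a hypothetical numpy illustration, not course material.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                    # toy inputs
y = (X[:, 0] * X[:, 1] > 0).astype(float)        # XOR-like labels

W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

for step in range(2000):
    h = np.maximum(X @ W1 + b1, 0.0)             # ReLU hidden layer
    probs = 1 / (1 + np.exp(-(h @ W2 + b2)))     # sigmoid output
    g = (probs.ravel() - y)[:, None] / len(y)    # grad of cross-entropy wrt logits
    dW2 = h.T @ g; db2 = g.sum(0)
    dh = (g @ W2.T) * (h > 0)                    # backprop through the ReLU
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad                      # vanilla gradient descent

h = np.maximum(X @ W1 + b1, 0.0)
probs = 1 / (1 + np.exp(-(h @ W2 + b2)))
print(((probs.ravel() > 0.5) == y).mean())       # accuracy well above chance
```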

AMTH 553 / CPSC 553 / CB&B 555 / GENE 555, Unsupervised Learning for Big Data: This course focuses on machine-learning methods well-suited to tackling problems associated with analyzing high-dimensional, high-throughput noisy data including: manifold learning, graph signal processing, nonlinear dimensionality reduction, clustering, and information theory. Though the class goes over some biomedical applications, such methods can be applied in any field. Prerequisites: knowledge of linear algebra and Python programming.
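As a hypothetical illustration of one method on this list, here is nonlinear dimensionality reduction via Laplacian eigenmaps on toy data: build a Gaussian affinity matrix, form the normalized graph Laplacian, and embed using its bottom nontrivial eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(1)
t = rng.uniform(0, 3 * np.pi, 300)               # noisy 1-D spiral in the plane
X = np.c_[t * np.cos(t), t * np.sin(t)] + rng.normal(scale=0.1, size=(300, 2))

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
W = np.exp(-D2 / np.median(D2))                        # Gaussian affinity kernel
d = W.sum(axis=1)
# normalized Laplacian L = I - D^{-1/2} W D^{-1/2}
L = np.eye(len(X)) - W / np.sqrt(d)[:, None] / np.sqrt(d)[None, :]

vals, vecs = np.linalg.eigh(L)
embedding = vecs[:, 1:3]          # bottom nontrivial eigenvectors: "unrolled" coordinates
```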

AMTH 640 / CPSC 640, Topics in Numerical Computation: This course discusses several areas of numerical computing that often cause difficulties for non-numericists, from the ever-present issue of condition numbers and ill-posedness to the algorithms of numerical linear algebra to the reliability of numerical software. The course also provides a brief introduction to “fast” algorithms and their interactions with modern hardware environments. The course is addressed to Computer Science graduate students who do not necessarily specialize in numerical computation; it assumes an understanding of calculus and linear algebra and familiarity with (or willingness to learn) either C or FORTRAN. Its purpose is to prepare students to use elementary numerical techniques when and if the need arises.
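The issue of condition numbers can be seen in a few lines; a hypothetical numpy experiment (not course material) with the notoriously ill-conditioned Hilbert matrix:

```python
import numpy as np

n = 12
# Hilbert matrix H[i, j] = 1 / (i + j + 1), with 0-based indices
H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1)
x = np.ones(n)
b = H @ x

print(np.linalg.cond(H))          # ~1e16: at the edge of double precision
x_hat = np.linalg.solve(H, b)
print(np.abs(x_hat - x).max())    # substantial error in the recovered solution
```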

AMTH 666 / ASTR 666 / MATH 666, Classical Statistical Thermodynamics:   Classical thermodynamics is derived from statistical thermodynamics. Using the multi-particle nature of physical systems, we derive ergodicity, the central limit theorem, and the elemental description of the second law of thermodynamics. We then develop kinetics, transport theory, and reciprocity from the linear thermodynamics of irreversible processes. Topics of focus include Onsager reciprocal relations, the Fokker-Planck equation, stability in the sense of Lyapunov, and time invariance symmetry. We explore phenomena that are of direct relevance to astrophysical and geophysical settings. No quantum mechanics is necessary as a prerequisite.
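The central limit theorem the course derives is easy to observe numerically; a hypothetical numpy sketch, standardizing the mean of many independent uniform variables:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                          # number of "particles" per sample
# uniform(-1, 1) has variance 1/3, so sqrt(3N) standardizes the sample mean
samples = rng.uniform(-1, 1, size=(5000, N)).mean(axis=1) * np.sqrt(3 * N)

print(samples.mean(), samples.var())    # ~0 and ~1, as for a standard Gaussian
```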

AMTH 667 / CPSC 576 / ENAS 576, Advanced Computational Vision:   Advanced view of vision from a mathematical, computational, and neurophysiological perspective. Emphasis on differential geometry, machine learning, visual psychophysics, and advanced neurophysiology. Topics include perceptual organization, shading, color, and texture.

MATH 670, Topics on Random Graphs:   We discuss a variety of topics in the theory of random graphs. We introduce the standard models of random graphs and focus on the threshold phenomenon for graph properties: for many interesting and natural graph properties, the probability that a random graph enjoys the property moves from near 0 to near 1 within a relatively small interval of the edge density. We investigate this for the properties of (1) containing fixed-size or spanning subgraphs (such as a perfect matching or a Hamiltonian cycle); (2) chromatic number; (3) transfer of classical theorems in extremal combinatorics. This course is open to students from Statistics and Computer Science as well. Yale College juniors and seniors are also welcome. Some background in discrete probability and graph theory is helpful, but the course is self-contained.
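The threshold phenomenon is easy to observe in simulation; a hypothetical sketch (assuming networkx is available) of the connectivity threshold of G(n, p) at p ≈ ln(n)/n:

```python
import math
import networkx as nx

n = 2000
p_star = math.log(n) / n          # connectivity threshold for G(n, p)
for c in (0.8, 1.0, 1.2):
    hits = sum(nx.is_connected(nx.fast_gnp_random_graph(n, c * p_star))
               for _ in range(20))
    print(f"p = {c:.1f} * ln(n)/n: connected in {hits}/20 trials")
# below the threshold the graph is almost never connected;
# above it, almost always: the transition is sharp in c
```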

MATH 674, Extremal Combinatorics:   The course is a stand-alone introduction to extremal combinatorics. We focus on algebraic and combinatorial techniques (e.g., combinatorial Nullstellensatz, tensor constructions, duality shifting) as applied to a variety of discrete settings including hypergraphs and set systems, designs, graphs, posets, and arithmetic combinatorics. Time permitting, we may also discuss topics from eigenvalue methods, entropy, linear programming, finite geometry, or coding theory. No prerequisites. Interested undergraduates are encouraged to contact the instructor.

MATH 675, Numerical Methods for Partial Differential Equations:  (1) Review of the classical qualitative theory of ODEs; (2) Cauchy problem. Elementary numerical methods: Euler, Runge-Kutta, predictor-corrector. Stiff systems of ODEs: definition and associated difficulties, implicit Euler, Crank-Nicolson, barrier theorems. Richardson extrapolation and deferred corrections; (3) Boundary value problems. Elementary theory: finite differences, finite elements, abstract formulation and related spaces, integral formulations and associated numerical tools, nonlinear problems; (4) Partial differential equations (PDEs). Introduction: counterexamples, Cauchy–Kowalevski theorem, classification of second-order PDEs, separation of variables; (5) Numerical methods for elliptic PDEs. Finite differences, finite elements, Richardson and deferred corrections, Lippmann–Schwinger equation and associated numerical tools, classical potential theory, “fast” algorithms; (6) Numerical methods for parabolic PDEs. Finite differences, finite elements, Richardson and deferred corrections, integral formulations and related numerical tools; (7) Numerical methods for hyperbolic PDEs. Finite differences, finite elements, Richardson and deferred corrections, time-invariant problems and Fourier transform.
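As a taste of item (2), here is a hypothetical numpy comparison of explicit and implicit Euler on the stiff test equation y' = -λy: the explicit method diverges once hλ > 2, while the implicit method remains stable for any step size.

```python
import numpy as np

lam, h, T = 50.0, 0.05, 1.0       # stiff test problem y' = -lam * y, y(0) = 1
steps = int(T / h)

y_exp = y_imp = 1.0
for _ in range(steps):
    y_exp = y_exp + h * (-lam * y_exp)   # explicit Euler: unstable, h*lam = 2.5 > 2
    y_imp = y_imp / (1 + h * lam)        # implicit Euler: unconditionally stable

print(y_exp)                      # oscillates and grows, ~(-1.5)^steps
print(y_imp, np.exp(-lam * T))    # small and positive, like the true solution
```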

AMTH 710 / MATH 710, Harmonic Analysis on Graphs and Applications:  This class covers basic methods of classical harmonic analysis that can be carried over to graphs and data analysis. We cover the fundamentals of nonlinear Fourier analysis, including functional approximations in high dimensions. We intend to cover foundational material useful for data organization and geometries.

MATH 724, Heat Kernel and Analysis on Manifolds: Topics include the Laplace operator on Riemannian manifolds, the heat equation, maximum principles and regularity theory, spectral properties, the distance function, Gaussian estimates, Davies-Gaffney estimates, the Green function, ultracontractive estimates, and pointwise Gaussian estimates. The goal is to work through Heat Kernel and Analysis on Manifolds by Alexander Grigor’yan.

AMTH 745 / CB&B 745 / CPSC 745, Advanced Topics in Machine Learning and Data Mining: An overview of advances in the past decade in machine learning and automatic data-mining approaches for dealing with the broad scope of modern data-analysis challenges, including deep learning, kernel methods, dictionary learning, and bag of words/features. This year, the focus is on a broad scope of biomedical data-analysis tasks, such as single-cell RNA sequencing, single-cell signaling and proteomic analysis, health care assessment, and medical diagnosis and treatment recommendations. The seminar is based on student presentations and discussions of recent prominent publications from leading journals and conferences in the field. Prerequisite: basic concepts in data analysis (e.g., CPSC 545 or 563) or permission of the instructor.

AMTH 765 / CB&B 562 / ENAS 561 / INP 562 / MB&B 562 / MCDB 562 / PHYS 562, Modeling Biological Systems II:   This course covers advanced topics in computational biology. How do cells compute, how do they count and tell time, how do they oscillate and generate spatial patterns? Topics include time-dependent dynamics in regulatory, signal-transduction, and neuronal networks; fluctuations, growth, and form; mechanics of cell shape and motion; spatially heterogeneous processes; diffusion. This year, the course spends roughly half its time on mechanical systems at the cellular and tissue level, and half on models of neurons and neural systems in computational neuroscience. Prerequisite: a 200-level biology course, or permission of the instructor.
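As a hypothetical taste of the neural half of the course (not course code), the FitzHugh-Nagumo neuron model integrated with forward Euler in numpy; for a suitable constant input current, the voltage settles into a periodic spike train.

```python
import numpy as np

dt, T = 0.01, 200.0
n = int(T / dt)
v, w = -1.0, 1.0                  # membrane voltage and recovery variable
I = 0.5                           # constant input current in the oscillatory regime
trace = np.empty(n)
for i in range(n):
    dv = v - v**3 / 3 - w + I     # FitzHugh-Nagumo equations
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    v, w = v + dt * dv, w + dt * dw
    trace[i] = v

print(np.ptp(trace[-5000:]))      # well above 0: sustained oscillation (limit cycle)
```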

AMTH 797 / MATH 797, Geometry of Data: Technological developments have enabled the acquisition and storage of increasingly large-scale, high-resolution, and high-dimensional data in many fields. These datasets pose a challenge for classical methods relying on parametric modeling, statistical estimation, or linear methods. However, it has been shown that many real-world data represented in high dimension (audio, images) lie on or near low-dimensional manifolds. This course covers manifold learning: a class of nonparametric kernel-based methods that extract low-dimensional structures from high-dimensional data. The course also reviews the role graphs play in modern signal processing and machine learning: signal processing on graphs and learning representations with deep neural networks. Topics include: linear data analysis (PCA, ICA, CCA); kernel-based methods, kernel PCA; affinity and Laplacian matrices; spectral clustering; manifold learning and nonlinear dimensionality reduction; graph signal processing; and representation learning with neural networks.
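The first topic on the list, linear dimensionality reduction by PCA, fits in a few lines of numpy; a hypothetical sketch via the SVD of the centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) @ rng.normal(size=(2, 10))   # 10-D data on a 2-D subspace

Xc = X - X.mean(axis=0)                  # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
print(S[:4] ** 2 / len(X))               # component variances: two large, the rest ~0
Z = Xc @ Vt[:2].T                        # project onto the top two principal components
```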

MATH 801, Cocycles, Lyapunov Exponents, and Spectral Theory: We develop cocycles over an ergodic base, prove the basic ergodic theorems (Kingman), and discuss Lyapunov exponents, Oseledets’ theorem, and Furstenberg’s theorem in the random case, as well as its higher-dimensional version. We connect these results to the spectral theory of Schrödinger operators with potentials defined by an ergodic process, introduce the avalanche principle, and establish Hölder regularity of the Lyapunov exponent in the energy.
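Furstenberg’s theorem concerns the top Lyapunov exponent of products of i.i.d. random matrices, which can be estimated numerically by renormalized iteration; a hypothetical numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
v = np.array([1.0, 0.0])
log_growth = 0.0
N = 100_000
for _ in range(N):
    A = rng.normal(size=(2, 2))          # i.i.d. random matrix cocycle
    v = A @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)           # accumulate the log of the norm growth
    v /= norm                            # renormalize to avoid overflow
print(log_growth / N)                    # estimate of the top Lyapunov exponent (> 0)
```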