All of the coursework I have taken at Kansas State University in Math, CS, and Physics; I received an A in every course listed below.
Introduction to the theory of analytic functions. Holomorphic functions, contour integrals, residue theory, conformal mapping and other topics.
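As a quick illustration (not part of the catalog description), the residue theorem at the heart of such a course: for f holomorphic inside and on a positively oriented simple closed contour γ except for isolated singularities a_k inside γ,

```latex
\oint_{\gamma} f(z)\,dz \;=\; 2\pi i \sum_{k} \operatorname{Res}(f, a_k)
```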
Functions of one variable; limits, continuity, differentiability, Riemann-Stieltjes integral, sequences, series, power series, improper integrals.
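For illustration, the Riemann-Stieltjes integral mentioned above generalizes the Riemann integral by integrating f against an integrator function α; when α(x) = x it reduces to the ordinary Riemann integral:

```latex
\int_a^b f\,d\alpha \;=\; \lim_{\|P\| \to 0} \sum_{i=1}^{n} f(t_i)\,\bigl[\alpha(x_i) - \alpha(x_{i-1})\bigr],
\qquad t_i \in [x_{i-1}, x_i]
```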
Combinatorics and graph theory. Topics selected from counting principles, permutations and combinations, the inclusion-exclusion principle, recurrence relations, trees, graph coloring, Eulerian and Hamiltonian circuits, block designs, and Ramsey Theory.
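A small worked example of the inclusion-exclusion principle named above (illustrative only; the numbers are my own, not course material): counting integers in 1..1000 divisible by 2, 3, or 5, checked against brute force.

```python
# |A ∪ B ∪ C| = |A| + |B| + |C| − |A∩B| − |A∩C| − |B∩C| + |A∩B∩C|
n = 1000
count = (n // 2 + n // 3 + n // 5      # singles
         - n // 6 - n // 10 - n // 15  # pairwise intersections
         + n // 30)                    # triple intersection

# Brute-force check of the same count
brute = sum(1 for k in range(1, n + 1)
            if k % 2 == 0 or k % 3 == 0 or k % 5 == 0)
assert count == brute == 734
```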
Topics include optimization of multivariable functions, tangent spaces, tangent vector fields, Lagrange multipliers, integration in n-dimensions, integration of differential forms, integral transforms, Picard's Theorem, the generalized Stokes' Theorem and its corollaries.
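The generalized Stokes' Theorem referenced above, from which the divergence theorem and the classical Stokes and Green theorems follow as corollaries: for a compact oriented n-manifold M with boundary ∂M and an (n−1)-form ω,

```latex
\int_{M} d\omega \;=\; \int_{\partial M} \omega
```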
Topics include vector spaces, vector valued functions, paths and curves, linear transformations, systems of linear equations, vector fields, linear ordinary differential equations, determinant, rank, diagonalization, stability behavior of linear ODE, dual spaces, cotangent vector fields (1-forms), inner product spaces, quadratic forms, exterior algebra, derivatives of vector valued functions.
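For illustration, the diagonalization mentioned above: when an n×n matrix A has n linearly independent eigenvectors, collecting them as the columns of P gives

```latex
A \;=\; P D P^{-1},
\qquad D = \operatorname{diag}(\lambda_1, \dots, \lambda_n),
```

which underlies the stability analysis of linear ODE systems x' = Ax, since the solutions grow or decay like e^{λ_i t}.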
An introduction to mathematical analysis of time- and space-complexity of algorithms, including worst-case, average-case, and amortized complexity. An examination of various algorithmic designs, such as greedy algorithms, divide-and-conquer algorithms, and dynamic programming algorithms. Techniques for proving correctness of algorithms.
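The divide-and-conquer design mentioned above can be sketched with a classic merge sort (an illustrative example of mine, not course material): split the input, recursively sort each half, then merge in linear time, giving O(n log n) worst-case complexity.

```python
def merge_sort(xs):
    """Divide-and-conquer sort: split, recursively sort halves, merge."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # Merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```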
Logical formalisms used to model and reason about computer systems. Propositional and predicate logic, syntax, semantics, and proof theory; soundness and completeness issues. Mathematical induction. Program verification: invariants and program logics.
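A tiny illustration (my own, not from the course) of the semantics side of propositional logic: a formula is a tautology exactly when it is true under every truth assignment, which can be checked by brute force over the truth table.

```python
from itertools import product

def is_tautology(formula, n_vars):
    """Semantic check: true under every assignment of truth values."""
    return all(formula(*vals)
               for vals in product([False, True], repeat=n_vars))

# Modus ponens as a tautology: (p ∧ (p → q)) → q,
# encoding "a → b" as (not a) or b.
modus_ponens = lambda p, q: (not (p and ((not p) or q))) or q
```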
A study of common data and program structures together with associated algorithms. Topics include interfaces, design patterns, arrays, stacks, queues, lists, trees, hash tables, recursion, binary search, and tree traversals. Experience with both use and implementation of these structures and algorithms using a modern programming language. Discussion of tradeoffs involving performance and software maintainability.
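The binary search listed above, as a short sketch (illustrative, not course material): halving the search interval of a sorted list each step gives O(log n) lookups.

```python
def binary_search(xs, target):
    """Return the index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```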
An introduction to electricity and magnetism. The first of a two semester study of Maxwell’s equations in both integral and differential forms. Topics include electrostatics with vector calculus; electrostatic potential solutions in rectangular, cylindrical, and spherical coordinates; dielectrics; electrostatic energy and capacitance; magnetostatics with vector calculus; Biot-Savart law; vector and scalar potentials for magnetism; magnetic permeability; Faraday’s law in integral and differential form; magnetic energy and inductance; displacement current; lumped oscillations and LCR systems; impedance.
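The Maxwell's equations this two-semester sequence builds toward, in differential form (SI units):

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J}
  + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```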
Principles of statics and dynamics of systems of particles and rigid bodies. Topics include Newton’s laws for one particle, non-inertial reference frames, central forces, systems of particles, rigid body statics and motion in a plane and in three dimensions, Lagrangian mechanics and Hamilton’s equations, oscillating systems and normal coordinates.
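For illustration, the two formulations named above: the Euler-Lagrange equations for a Lagrangian L(q, q̇, t), and Hamilton's equations for the corresponding Hamiltonian H(q, p, t):

```latex
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = 0,
\qquad
\dot{q}_i = \frac{\partial H}{\partial p_i}, \quad
\dot{p}_i = -\frac{\partial H}{\partial q_i}
```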
Courses I am currently taking at Kansas State University in Math and CS:
Groups, rings, fields, vector spaces and their homomorphisms. Elementary Galois theory and decomposition theorems for linear transformations on a finite dimensional vector space.
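A representative decomposition result from this material, the first isomorphism theorem: for a group homomorphism φ: G → H,

```latex
G / \ker \varphi \;\cong\; \operatorname{im} \varphi
```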
Analysis of numerical methods for linear algebra. Perturbation theory and error analysis, matrix factorizations, solutions to linear systems, least-squares problems, techniques for special matrix structures, symmetric and nonsymmetric eigenvalue problems, iterative and direct methods.
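A minimal sketch of one of the iterative methods above, the power method for estimating a dominant eigenvalue (my own pure-Python illustration, assuming the matrix has a unique eigenvalue of largest magnitude):

```python
def power_iteration(A, iters=200):
    """Estimate the dominant eigenvalue magnitude of square matrix A
    (given as a list of rows) by repeated multiplication and scaling."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # infinity-norm as the scale factor
        v = [x / lam for x in w]
    return lam
```

For example, [[2, 1], [1, 2]] has eigenvalues 3 and 1, and the iteration converges to 3.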
Descriptive statistics; probability concepts and laws; sample spaces; random variables; binomial, uniform, normal, and Poisson distributions; two-dimensional variates; expected values; confidence intervals for a binomial parameter, median, mean, and variance; testing simple hypotheses using confidence intervals and chi-square statistics; goodness of fit. Numerous applications.
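For illustration (my own example, not course material), the binomial probability mass function underlying several of the topics above:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)
```

For example, the probability of exactly 5 heads in 10 fair coin flips is C(10, 5)/2^10 = 252/1024.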
Foundations of deep learning, with focus on different types of modern neural network architectures (such as convolutional neural networks, recurrent neural networks, and transformers). Regularization and optimization in the context of deep learning. Frameworks and tools for building and training deep neural networks.
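A minimal sketch of the optimization loop at the core of training neural networks (pure Python with made-up data, standing in for what a framework would do): gradient descent fitting a single linear "neuron" y = w·x to points generated by y = 2x, using the mean-squared-error gradient.

```python
# Hypothetical training data from the target function y = 2x
data = [(x, 2.0 * x) for x in [1.0, 2.0, 3.0]]

w, lr = 0.0, 0.01          # initial weight and learning rate
for _ in range(500):
    # d/dw of mean((w*x - y)^2) over the data
    grad = sum(2.0 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad          # gradient-descent update
```

After training, w is close to the true slope of 2.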