Fa20 - GEOMETRIC FNDTNS OF DATA SCI (51240)

Instructor:

Professor Chandrajit Bajaj

  • Lecture hours: Mon, Wed 9:30 - 11:00am (online)
  • Office hours: Mon, Wed 1:30 - 3:00pm, or by appointment
  • Zoom (office hours and 1-1 meetings): https://utexas.zoom.us/my/cbajaj
  • Contact: bajaj at cs.utexas.edu

NOTE: Most questions should be submitted via Canvas rather than by email to the instructor. Please reserve an office-hour slot a day in advance to avoid conflicts.

Teaching Assistant

  • Office hours – TBD
  • Contact: TBD


Course Motivation and Synopsis

This course covers the fundamental algorithmic and computational aspects of data science, machine (deep) learning, optimal control, and statistical inference. The machine learning techniques you will encounter include Multi-Layer Perceptrons, Convolutional Nets, Residual Nets, Recurrent Nets, Variational Autoencoders, Generative Adversarial Nets, Actor-Critic, and Reservoir Nets. The foundational topics span dimensionality reduction (PCA, KDA, LDA, Gaussian Mixtures, Johnson-Lindenstrauss), transformation maps (differentiable, diffeomorphic, triangular, and normalizing flows), and geometric optimization (convex and non-convex, including polynomial). You will learn to dance between discrete and continuous mathematics, computer science, and statistics, and see the intimate connections between deep reinforcement learning and optimal control. Issues of measurement error, noise, and outliers shall be central to bounding the precision, bias, and accuracy of data analysis. The geometric insight and characterization gained provide the basis for designing and improving approximation algorithms for NP-hard problems with better accuracy/speed tradeoffs.

An initial listing of lecture topics is given in the syllabus below; it is subject to some modification, given the background and the speed at which we cover ground. Homework exercises shall be given roughly bi-weekly. Assignment solutions turned in late suffer a 10% per day reduction in credit, and a 100% reduction once solutions are posted. There will be an in-class mid-term exam whose content is similar to the homework exercises. A list of machine (deep, reinforcement, imitation) learning project topics will also be assigned as individual (or pair-group) data science projects, with a written report and an oral presentation at the end of the semester. The project results shall be scored in lieu of a final exam.

The course is aimed at senior undergraduate students; students in the 5-year master's programs, especially in CS, CSEM, ECE, PHYS, STAT, and MATH, are welcome. You will need mathematics, computer science, and probability/statistics at the first-year graduate level, plus linear algebra and geometry, together with introductory functional analysis and numerical optimization (e.g., for CS and ECE students) or combinatorial optimization (e.g., for CSEM and Math students).

Course Material

  1. [B1] Chandrajit Bajaj. A Mathematical Primer for Computational Data Sciences (frequently updated).
  2. [BHK] Avrim Blum, John Hopcroft, and Ravindran Kannan. Foundations of Data Science.
  3. [MU] Michael Mitzenmacher and Eli Upfal. Probability and Computing: Randomized Algorithms and Probabilistic Analysis.
  4. [ZLLS] Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. Dive into Deep Learning.
  5. [SB2] Richard Sutton and Andrew Barto. Reinforcement Learning: An Introduction.
  6. [CVX] Stephen Boyd and Lieven Vandenberghe. Convex Optimization.
  7. [JK] Prateek Jain and Purushottam Kar. Non-Convex Optimization for Machine Learning.
  8. [HF] Hermann Flaschka. Principles of Analysis.
  9. Extra reference materials.

 

COURSE OUTLINE 

Each entry below lists the lecture date, the topic(s), the assigned reading, and assignment milestones.

Wed 08-26-2020
1. Introduction to Data Science, Machine Learning, Geometry of Data, High Dimensional Spaces [notes]; Supplementary [Slides1] [Slides2]
Reading: [BHK] Ch 1, Ch 12 (Appendix); [B1] Ch 1, 2; [CVX] Appendix; [ZLLS] Introduction

 

Mon 08-31-2020
2. Perceptrons and Deep Learning: Models, Gradient Descent, Backpropagation [see readings]; Linear Regression, Classification, Detection and Localization I [Slides1] [Slides2]
Reading: [HF] Sec 1, 2, 3; [BHK] Ch 5, Ch 12 (Appendix); [ZLLS] Intro, Ch 3, 5
Assignments: A1 posted
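
To make the gradient descent and backpropagation theme concrete, here is a minimal sketch (the data, learning rate, and iteration count are illustrative, not taken from the course notes) that fits a one-variable linear regression by batch gradient descent on the mean-squared error:

    import numpy as np

    # Synthetic 1-D regression data: y = 3x + 1 plus Gaussian noise (illustrative).
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=200)
    y = 3.0 * x + 1.0 + 0.1 * rng.standard_normal(200)

    w, b, lr = 0.0, 0.0, 0.1
    for _ in range(500):
        y_hat = w * x + b                      # forward pass
        grad_w = 2 * np.mean((y_hat - y) * x)  # d(MSE)/dw
        grad_b = 2 * np.mean(y_hat - y)        # d(MSE)/db
        w -= lr * grad_w                       # descent step
        b -= lr * grad_b
    print(w, b)  # approaches (3.0, 1.0)

The two gradient lines are the entire "backward pass" for this one-layer model; deeper networks compute the same chain-rule quantities layer by layer.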

 

Wed 09-02-2020
3. Model Selection: Learning Models from Data, Underfitting, Overfitting [see readings] [Slides]; Geometry of Vector, Matrix, and Functional Norms, Loss Functions and Optimization II [notes]
Reading: [MU] Ch 1-4; [B1] Appendix; [ZLLS] Ch 4

Wed 09-09-2020
4. Probability Theory: Bayes Theorem, Maximum Likelihood, MAP Estimates I [notes]; Markov, Chebyshev, Hoeffding Bounds [notes]; Softmax, Cross-Entropy Loss Functions [notes; see readings]
Reading: [MRT] Ch 1, 2; [MU] Ch 1; [ZLLS] Ch 3, Appendix
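
As a companion to the softmax and cross-entropy topic, a minimal sketch (NumPy only; the logits and label are illustrative) of the numerically stable softmax and the cross-entropy loss of one labeled example:

    import numpy as np

    def softmax(z):
        # Shift by the max for numerical stability; softmax is shift-invariant.
        z = z - z.max(axis=-1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def cross_entropy(probs, label):
        # Negative log-likelihood of the true class.
        return -np.log(probs[label])

    logits = np.array([2.0, 0.5, -1.0])  # illustrative 3-class scores
    p = softmax(logits)
    print(p, cross_entropy(p, label=0))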

 

Mon 09-14-2020
5. Probability Theory: Noisy Stochastic Regression, Bayes Theorem, Maximum Likelihood, MAP Estimates II [notes]
Reading: [ZLLS] Ch 4; [CVX] Ch 1-5
Assignments: A1 due

 

 

Wed 09-16-2020
6. Convolutional Neural Networks [notes]; Model Selection, Mini-Batch Sampling, Regularization in Deep Networks [slides, notes; see readings]
Reading: [ZLLS] Ch 4, Ch 6
Assignments: A2 posted; A1 solution; A1 MLP (G. Kluber)

Mon 09-21-2020
7. Monte Carlo Sampling [notes]; Low Discrepancy Quasi-Monte Carlo Sampling [slides]; Model Selection, Underfitting, Overfitting [Supp-notes]
Reading: [BHK], [CVX] Appendix; see references in notes
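
A minimal illustration of plain Monte Carlo estimation (the sample sizes are illustrative): the fraction of uniform points in the unit square that land in the quarter disk converges to pi/4, with error decaying like O(1/sqrt(n)):

    import numpy as np

    rng = np.random.default_rng(1)
    for n in (10**3, 10**5, 10**6):
        pts = rng.uniform(0.0, 1.0, size=(n, 2))
        inside = (pts**2).sum(axis=1) <= 1.0   # x^2 + y^2 <= 1
        print(n, 4.0 * inside.mean())          # estimate of pi

Quasi-Monte Carlo replaces the pseudo-random points with a low-discrepancy sequence to obtain a faster-decaying error for smooth integrands.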

Wed 09-23-2020
8. Sampling Multivariate Gaussians in High Dimensions, Separating Mixtures of Gaussians I [notes]
Reading: [BHK] Ch 2; see references in notes

Mon 09-28-2020
9. Transform Sampling [notes] [Supp]; Normalizing Flows [notes]; Random Projections and Johnson-Lindenstrauss [notes]; Separating Mixtures of Gaussians II
Reading: [BHK] Ch 3; [CVX] Ch 1, 2, 3; see references in notes
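
A quick companion sketch for the Johnson-Lindenstrauss topic (the sizes n, d, k and the seed are illustrative): a random Gaussian projection scaled by 1/sqrt(k) approximately preserves pairwise distances:

    import numpy as np

    rng = np.random.default_rng(2)
    n, d, k = 100, 10_000, 500                    # illustrative sizes
    X = rng.standard_normal((n, d))               # n points in dimension d
    R = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection map
    Y = X @ R                                     # points in dimension k

    i, j = 0, 1
    orig = np.linalg.norm(X[i] - X[j])
    proj = np.linalg.norm(Y[i] - Y[j])
    print(orig, proj, proj / orig)                # ratio close to 1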

 

Wed 09-30-2020
10. Low Rank Matrix Approximation with Applications [notes]; Deep Learning Object Recognition, Deep Conv. Nets [notes; see readings]
Reading: [B2] Ch 5; [CVX] Ch 5, Ch 8; [ZLLS] Ch 4, Ch 6
Assignments: A2 due; A2 solution (Friday Oct 2)

Mon 10-05-2020
11. Spectral Methods for Learning I: PCA, Kernel PCA [notes], Eigenfaces [notes]; Modern Convolutional Kernel Neural Networks II [notes]
Reading: [CVX] Ch 5, Ch 8; [B2] Ch 5
Assignments: A3 posted Friday Oct 2
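
A minimal PCA-via-SVD sketch for this lecture (the synthetic data and the choice k = 2 are illustrative): center the data, take the top right singular vectors as the principal directions, and report how much variance they capture:

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 20))  # correlated data
    Xc = X - X.mean(axis=0)                  # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    k = 2
    Z = Xc @ Vt[:k].T                        # k-dimensional PCA embedding
    explained = (S[:k]**2).sum() / (S**2).sum()
    print(Z.shape, explained)                # fraction of variance captured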

Wed 10-07-2020
12. Optimization Methods I: Method of Lagrange Multipliers [notes]; Projected Gradient Descent [notes]; Classification-KSVM [notes]
Reading: [K17]; [ZLLS] Ch 6
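
To illustrate projected gradient descent, a sketch (the problem sizes and the 1/L step size are illustrative) for least squares constrained to the unit Euclidean ball: take a gradient step, then project back onto the feasible set:

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)

    def project_ball(v, radius=1.0):
        # Euclidean projection onto {x : ||x||_2 <= radius}.
        nrm = np.linalg.norm(v)
        return v if nrm <= radius else v * (radius / nrm)

    x = np.zeros(10)
    L = 2 * np.linalg.norm(A, 2)**2          # smoothness constant of ||Ax - b||^2
    for _ in range(300):
        grad = 2 * A.T @ (A @ x - b)
        x = project_ball(x - grad / L)       # gradient step + projection
    print(np.linalg.norm(x), np.linalg.norm(A @ x - b))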

 

 

 

Mon 10-12-2020
13. Big Data Machine Learning: Matrix Sampling, Matrix Sketching Algorithms [notes]; RIP Matrices, Compressive Sensing [notes]
Reading: see references in notes

 

Wed 10-14-2020
14. Optimization Methods II: Non-Convex Optimization, Alternating Minimization, Robust Regression Recovery [notes]
Reading: [JK] and references in notes
Assignments: A3 due (Friday)

 

Mon 10-19-2020
15. Spectral Methods for Learning II: Connections between AMRR, Compressive Sensing, Matrix Sketching [notes]
Reading: see references in notes
Assignments: A3 solution

 

Wed 10-21-2020
MIDTERM
Assignments: A4 posted

Mon 10-26-2020
16. Statistical Machine Learning I: Expectation Maximization (Mixed Regression) [notes]
Reading: [CVX] Ch 1-5; [JK] Ch 5

Wed 10-28-2020
17. Statistical Machine Learning: Expectation Maximization II (Latent Variable Models, Soft Clustering, Mixed Regression) [notes]; Stochastic Noisy Gradient Descent for Deep Network Parameter Training [notes]
Reading: [BHK] Ch 7; [JK] Ch 5
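
A minimal EM sketch for the mixture topic (assumptions: 1-D data, two components, mixing weights fixed at 1/2 for brevity): the E-step computes soft responsibilities, the M-step re-estimates means and variances:

    import numpy as np

    rng = np.random.default_rng(5)
    x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1, 300)])

    mu = np.array([-1.0, 1.0])               # illustrative initial means
    var = np.array([1.0, 1.0])
    for _ in range(50):
        # E-step: responsibility of each component for each point.
        dens = np.exp(-(x[:, None] - mu)**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted mean and variance updates.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
    print(mu, var)                           # means approach (-2, 3)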

 

Mon 11-02-2020
18. Statistical Machine Learning, Separating Mixtures of Gaussians III: Variational Inference and Deep Variational Autoencoders [notes]
Reading: [BHK] Ch 2; Kingma & Welling [Intro to VAE]
Assignments: FINAL PROJECT LIST POSTED

 

Wed 11-04-2020
19. Statistical Machine Learning, Separating Mixtures of Gaussians IV: Deep Generative Adversarial Networks [notes]
Reading: [K17] Kingma & Welling [Intro to VAE]; [ZLLS] Ch 17
Assignments: A4 due; A5 posted (Fri)

Mon 11-09-2020
20. Optimizing and Selecting Loss Functions, Energy-Based Analysis [paper]; Bias-Variance Tradeoff [notes]
Reading: see references in notes

 

Wed 11-11-2020
21. Deep Neural Networks III: OdeNets, Adjoints and Back Propagation [notes] [notes1]
Reading: see references in notes

 

Mon 11-16-2020
22. Dynamical Systems: Koopman Theory, Dynamic Mode Decomposition, with Applications to Reduced Models [notes] [slides]
Reading: see references in notes
Assignments: Part 1 of Project Report due

 

Wed 11-18-2020
23. Stochastic Gaussian Processes II: Uncertainty Propagation, Kalman Filtering [notes] [KF-Dynam-paper]
Reading: see references in notes
Assignments: A5 due Nov 20; solutions out Nov 22

Mon 11-23-2020
24. Recurrent and Recursive Neural Networks, ODENets [notes]; Attention Mechanisms [notes]
Reading: [ZLLS] Ch 10; see references in notes

 

 

Wed 11-25-2020
Thanksgiving Break

Mon 11-30-2020
26. Geometry of Deep Reinforcement Learning I: Markov Decision Processes, Connections to Optimal Control [notes] [notes]
Reading: [SB2] Ch 3

 

Wed 12-02-2020
27. Geometry of Deep Reinforcement Learning II: Value and Policy Iterations [notes]
Reading: [SB2] Chap
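
To ground the value iteration discussion, a sketch on a toy two-state, two-action MDP (the transition probabilities and rewards are made up for illustration): iterate the Bellman optimality update V <- max_a [R(s,a) + gamma * sum_s' P(s'|s,a) V(s')] and read off the greedy policy:

    import numpy as np

    P = np.array([                           # P[a, s, s'] transition probabilities
        [[0.9, 0.1], [0.2, 0.8]],            # action 0
        [[0.5, 0.5], [0.0, 1.0]],            # action 1
    ])
    R = np.array([[1.0, 0.0],                # R[a, s] immediate rewards
                  [0.0, 2.0]])
    gamma = 0.9

    V = np.zeros(2)
    for _ in range(200):
        Q = R + gamma * (P @ V)              # Q[a, s], one Bellman backup
        V = Q.max(axis=0)
    print(V, Q.argmax(axis=0))               # optimal values and greedy policy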

 

 

Mon 12-07-2020
28. Geometry of Deep Reinforcement Learning III: Q-function, Actor-Critic [notes]
Reading: [SB2]

Thu 12-10-2020
29. Final Presentations Day
SUBMIT VIDEOS TO UTBOX

Mon 12-14-2020
30. Final Project Reports Due
SUBMIT REPORTS/MODELS TO CANVAS

 

Project FAQ

1. How long should the project report be?

Answer: See the directions in the Class Project List. For full points, please address each of the evaluation questions as succinctly as possible. Note the report deadline listed under Tests below. You will get feedback on your presentations, which should also be incorporated into your final report.

Tests

There will be one in-class midterm exam and one final project. The important deadline dates are:

  • Midterm: Wednesday, October 21, 9:30am - 11:00am 
  • Final Project Written Report, Due: Dec 11, by 11:59pm

 

Assignments

There will be five written HW assignments, one initial project report, and one final project report. Please refer to the schedule above for assignment and project report due dates.

Course Requirements and Grading

Grades will be based on the following factors:

  • In-class attendance and participation (5%)
  • HW assignments (50%, with the potential for extra credit)

Five assignments in total; some may include extra questions for extra points (these will be specified in each assignment sheet).

  • In-class midterm exam (15%) 
  • First Presentation & Report (10%) 
  • Final Presentation & Report (20%)  

Students with Disabilities. Students with disabilities may request appropriate academic accommodations from the Division of Diversity and Community Engagement, Services for Students with Disabilities, 471-6259, http://www.utexas.edu/diversity/ddce/ssd.

 

Accommodations for Religious Holidays. By UT Austin policy, you must notify the instructor of your pending absence at least fourteen days prior to the date of observance of a religious holiday. If you must miss a class or an examination in order to observe a religious holiday, you will be given an opportunity to complete the missed work within a reasonable time before or after the absence, provided proper notification is given.

 

Statement on Scholastic Dishonesty. Anyone who violates the rules for the HW assignments or who cheats in the in-class tests or on the final project is in danger of receiving an F for the course. Additional penalties may be levied by the Computer Science department, CSEM, and the University. See http://www.cs.utexas.edu/academics/conduct/

 
