Sp18 - Algo. Found. of Data Sciences [51800] Stat/Discrete/Continuous Methods in Machine Learning (63625); also Mathematics of ML 393C (54377)

Instructor:

Professor Chandrajit Bajaj

  • Office hours: Mon, Tue, Wed, 1:00 p.m. - 2:00 p.m., POB 2.324A
  • Contact: bajaj at cs.utexas.edu

NOTE: Most questions should be posted on Canvas rather than emailed to the instructor. Please reserve an office-hour slot a day in advance to avoid conflicts.

 

Teaching Assistant

Yi Wang

  • Office hours: Tue, Thu, 3:00 p.m. - 4:30 p.m., POB 2.102
  • Contact: panzer.wy@gmail.com

Note: Please reserve an office-hour slot a day in advance to avoid conflicts.

Lecture Time and Location: MW 11:00 a.m. - 12:30 p.m. in GDC 5.304

 

Course Motivation and Synopsis

As businesses and academic enterprises gather ever-increasing amounts of data, new challenges arise for data analysts. With this comes a growing demand for reliable software that can parse large, diverse data sets and make robust inferences from the information.

This course covers the foundational algorithmic and computational aspects of data science, machine learning, and statistical inference. The topics span scalable data analysis and geometric optimization, weaving together computer and computational science, discrete and continuous mathematics, and statistics. Students shall explore, in breadth and depth, dimensionality, sparsity, resolution, resolvability, recovery, and prediction for a variety of data (sequence, stream, graph-based, time-series, image, video, hyper-spectral) emanating from multiple sensors (big and small, slow and fast) and accumulated via the interactive WWW. Issues of measurement error, noise, and outliers shall be central to bounding the precision, bias, and accuracy of the data analysis. We shall learn to characterize and measure differences in high dimensions, along with the computational concepts that underlie dimension reduction, sampling, sketching, and optimization in machine learning (and deep learning). The geometric insight and characterization gained provide the basis for designing and improving approximation algorithms for NP-hard problems with better accuracy/speed tradeoffs.

An initial listing of lecture topics is given in the syllabus below; it is subject to modification, given the background and speed at which we cover ground. Homework exercises shall be given roughly bi-weekly. Assignment solutions that are turned in late shall suffer a 10% per day reduction in credit, and a 100% reduction once solutions are posted. There will be a mid-term exam in class, with content similar to the homework exercises. A list of topics will also be assigned as individual (or pair-group) data science projects, with a written/oral presentation at the end of the semester. This project shall be graded and serves in lieu of a final exam.

The course is aimed at graduate students, especially in the CS, CSEM, ECE, and MATH programs, but others are welcome. You will need mathematics at the level of a senior undergraduate or incoming graduate student, plus linear and modern algebra, introductory functional analysis, and probability and statistics (e.g., for CS and ECE students) or more discrete mathematics (e.g., for CSEM students).

Course Material.

  1. [B1] Chandrajit Bajaj. Molecular Modeling: Computational Science Perspective.
  2. [B2] Chandrajit Bajaj. A Mathematical Primer for Computational Data Sciences (frequently updated).
  3. [BHK] Avrim Blum, John Hopcroft, and Ravindran Kannan. Foundations of Data Science.
  4. [CVX] Stephen Boyd and Lieven Vandenberghe. Convex Optimization.
  5. [GBC] Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning.
  6. [JK] Prateek Jain and Purushottam Kar. Non-Convex Optimization for Machine Learning.
  7. [MU] Michael Mitzenmacher and Eli Upfal. Probability and Computing: Randomized Algorithms and Probabilistic Analysis.
  8. Extra reference materials.

 

TENTATIVE COURSE OUTLINE (in flux)

Date Topic Reading Assignments
01-17-2018  1. Introduction to Data Science, Geometry of Data, High-Dimensional Spaces, Models, Applications [notes]
Readings: [BHK] Ch 1; [B2] Ch 1
 
01-22-2018  2. Applications and Solution Approaches, Convex and Non-Convex Optimization, Geometry of Optimization [notes]
Readings: [BHK] Ch 2, Ch 12 (Appendix); [B2] Ch 2; [JK] Ch 1
Assignment: [A1], due 02-05-2018, 11:59pm

01-24-2018  3. Convex and Non-Convex Projected Gradient Descent [Notes1]; Geometry of Vector, Matrix, and Functional Norms and Balls [Notes]
Readings: [BHK] Ch 2, Ch 12 (Appendix); [B2] Ch 2; [JK] Ch 2, 3
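
To make the topic concrete, here is a minimal NumPy sketch of projected gradient descent for a least-squares objective constrained to the unit l2 ball (an illustrative example, not taken from the course notes; the problem data and step size are arbitrary):

```python
import numpy as np

def projected_gradient_descent(grad, project, x0, step, iters=200):
    # Repeat: take a gradient step, then project back onto the feasible set.
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Example: minimize ||Ax - b||^2 subject to ||x||_2 <= 1.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
grad = lambda x: 2.0 * A.T @ (A @ x - b)
project = lambda x: x / max(1.0, np.linalg.norm(x))  # Euclidean projection onto the unit l2 ball
x_hat = projected_gradient_descent(grad, project, np.zeros(5), step=0.01)
print(np.linalg.norm(x_hat))  # <= 1 by construction
```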

 
01-29-2018  4. Projection Operators; Non-Convex Optimization Guarantees [Notes1]; Lagrange Multipliers [Notes2]
Readings: [JK] Ch 3, 4; [B2] Ch 2, 9

01-31-2018  5. Alternating Minimization with Applications to Machine Learning and Signal Processing; Robust Linear Regression [Notes]
Readings: [B2] Ch 9; [JK] Ch 9

 
02-05-2018  6. Expectation Maximization, Maximum Likelihood [notes1]; High-Dimensional Probability and Geometry [notes]; Probability Primer [notes]
Readings: [BHK] Ch 12; [B2] Ch 10; [B1] Appendix
Posted: [A1 Solution]

 

02-07-2018  7. Alternating Maximization for Expectation Maximization, Gaussian Mixture Models [notes]; Bayes Rule [notes]
Readings: [BHK] Ch 12; [B2] Ch 10; [JK] Ch 5
Assignment: [A2], due 02-21-2018
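
As a small illustration of EM for Gaussian mixtures, here is a self-contained NumPy sketch for a two-component one-dimensional mixture (illustrative only; the initialization and synthetic data are assumptions, not course material):

```python
import numpy as np

def em_gmm_1d(x, iters=100):
    # Two-component 1-D Gaussian mixture fit by expectation maximization.
    mu = np.array([x.min(), x.max()])            # crude initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities r[i, k] = P(component k | x_i).
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood parameter updates.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(x)
    return pi, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_gmm_1d(x))  # weights ~ (0.6, 0.4), means ~ (-2, 3)
```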

02-12-2018  8. Concentration Theorems [notes]; Chernoff Bounds for Provably (Probabilistic) Approximation Algorithms [notes]
Readings: [BHK] Ch 12; [B2] Ch 10; [MU] Ch 2-4
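
A quick numerical sanity check of a multiplicative Chernoff bound, in its standard form P(X >= (1+d)mu) <= exp(-mu d^2/3) for 0 < d <= 1 (the parameters below are arbitrary illustrations):

```python
import numpy as np

# For X ~ Binomial(n, p) with mean mu = n*p, the multiplicative Chernoff
# bound gives P(X >= (1 + d) * mu) <= exp(-mu * d**2 / 3) for 0 < d <= 1.
rng = np.random.default_rng(0)
n, p, d = 1000, 0.1, 0.3
mu = n * p
samples = rng.binomial(n, p, size=200_000)
empirical = np.mean(samples >= (1 + d) * mu)
bound = np.exp(-mu * d ** 2 / 3)
print(f"empirical tail = {empirical:.5f}  <=  Chernoff bound = {bound:.5f}")
```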

 

02-14-2018  9. Sampling Distributions [notes]; Random Sampling, Johnson-Lindenstrauss, Compressive Sensing, Sparse Recovery [notes]; Transformation of Random Variables [notes]; Low-Discrepancy Sampling [slides]
Readings: [BHK] Ch 12; [B2] Ch 5
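
To see the Johnson-Lindenstrauss phenomenon numerically, the sketch below projects high-dimensional points through a scaled Gaussian random matrix and checks that a pairwise distance is roughly preserved (the dimensions are chosen arbitrarily for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 10_000, 400                  # n points in d dims, target dim k
X = rng.normal(size=(n, d))
R = rng.normal(size=(d, k)) / np.sqrt(k)   # scaled Gaussian projection map
Y = X @ R
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"distance ratio after projection: {proj / orig:.3f}")  # close to 1
```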

 

02-19-2018  10. Importance Sampling [notes]; Pairwise Independence [notes]; MC & QMC Integration, the Koksma-Hlawka Bound, Hölder Inequalities [notes]
Readings: [BHK] Ch 2, Appendix; [B2] Ch 5
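
A minimal importance-sampling sketch for a rare-event probability, using a shifted Gaussian proposal (the target event and the proposal are illustrative choices, not from the course notes):

```python
import numpy as np

# Estimate P(X > 4) for X ~ N(0, 1): direct Monte Carlo almost never hits
# the event, so sample from the shifted proposal N(4, 1) and reweight.
rng = np.random.default_rng(0)
y = rng.normal(4.0, 1.0, 100_000)                          # proposal samples
w = np.exp(-0.5 * y ** 2) / np.exp(-0.5 * (y - 4.0) ** 2)  # density ratio p/q
print(f"IS estimate = {np.mean((y > 4.0) * w):.3e} (exact ~ 3.17e-5)")
```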

 

02-21-2018  11. Markov Chain Monte Carlo, Metropolis and Gibbs Sampling [notes]
Readings: [BHK] Ch 4; [MU] Ch 7, 10
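
A compact random-walk Metropolis sketch (the target density and proposal scale are arbitrary illustrations):

```python
import numpy as np

def metropolis(log_density, x0, n_samples, step=1.0, rng=None):
    # Random-walk Metropolis: propose x + step*N(0,1), accept with
    # probability min(1, p(proposal)/p(current)).
    rng = rng or np.random.default_rng(0)
    x, lp, out = x0, log_density(x0), []
    for _ in range(n_samples):
        prop = x + step * rng.normal()
        lp_prop = log_density(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out.append(x)
    return np.array(out)

# Target: standard normal, known only up to its normalizing constant.
samples = metropolis(lambda x: -0.5 * x ** 2, x0=0.0, n_samples=20_000)
print(samples.mean(), samples.var())  # approximately 0 and 1
```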

 

02-26-2018  12. Stochastic / Noisy Gradient Descent Algorithms [notes]
Readings: [BHK] Ch 5; [JK] Ch 6
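
A minimal stochastic gradient descent sketch for least squares, using the gradient of one randomly chosen example per step (the problem sizes and step-size schedule are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.1 * rng.normal(size=n)

x = np.zeros(d)
for t in range(50_000):
    i = rng.integers(n)                    # pick one example at random
    g = 2.0 * (A[i] @ x - b[i]) * A[i]     # gradient of that single term
    x -= 0.01 / (1 + t / 5000) * g         # slowly decaying step size
print(np.linalg.norm(x - x_true))          # small residual error
```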

 

02-28-2018  13. Proximal Gradient Descent, Matrix Completion, ADMM [notes]
Readings: [Additional Ref]
Posted: [A2 Solution]
Assignment: [A3], due 03-16-2018
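
An illustrative proximal gradient (ISTA) sketch for the lasso, where the proximal step is soft thresholding (the synthetic problem data are assumptions):

```python
import numpy as np

def ista(A, b, lam, step, iters=500):
    # Proximal gradient for min_x 0.5*||Ax - b||^2 + lam*||x||_1:
    # a gradient step on the smooth part, then soft thresholding
    # (the proximal operator of lam*||.||_1).
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * A.T @ (A @ x - b)
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 50))
x_true = np.zeros(50)
x_true[:5] = 1.0                                    # sparse ground truth
b = A @ x_true + 0.01 * rng.normal(size=100)
step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1/L, L = Lipschitz constant of the gradient
print(ista(A, b, lam=0.1, step=step)[:8].round(2))  # first 5 entries near 1, rest near 0
```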

03-05-2018  14. Matrix Sampling, Matrix Sketching (PAC) Algorithms [notes]

03-07-2018  15. Best-Fit Linear Spaces, Spectral Decomposition, PCA, Applications [notes]; Applications of Low-Rank Matrix Approximation [notes]
Readings: [BHK] Ch 6
Posted: [A3 Solutions]
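
A short sketch of PCA computed via the SVD of the centered data matrix (the synthetic low-rank data set is an assumption for the demo):

```python
import numpy as np

def pca(X, k):
    # Principal directions = top right singular vectors of the centered data.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]        # scores, principal directions

rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))           # two latent factors
X = Z @ rng.normal(size=(2, 10))        # embedded in 10 dimensions
scores, components = pca(X, k=2)
print(components.shape)                 # (2, 10): two directions capture ~all the variance
```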

03-10-2018 till 03-18-2018  SPRING BREAK

 

03-19-2018  16. Review of Lectures 1-15; Tying Up Loose Ends [notes]
Readings: [BHK] Ch 1-4, 10, 12, Appendix; [JK] Ch 1-5

03-21-2018  MIDTERM IN CLASS [solutions]
Assignment: [A4], due 04-04-2018

03-26-2018  17. Geometry of Machine Learning, VC Dimension [notes]
Readings: [BHK] Ch 5

 

03-28-2018  18. Geometry of Kernel SVM, Primal-Dual Optimization [notes]
Readings: [BHK] Ch 5; [CVX] Ch 6

04-02-2018  19. Dimension Reduction (JL), Compressed Sensing [notes]
Readings: [BHK] Ch 10; Valiant

04-04-2018  20. Geometry of Machine Learning, PCA, SVD Revisited [notes]
Readings: [BHK] Ch 5; see refs in notes
Posted: [A4 Solutions]
Assignment: [A5], due 04-20-2018

 

04-09-2018  21. Spectral Methods for Learning: Sparse and Kernel PCA, Applications [notes]
Readings: [BHK] Ch 5; see refs in notes

04-11-2018  22. Geometry of Spectral Methods for Learning: Fisher LDA, KDA, Applications [notes]
Readings: [BHK] Ch 5; see refs in notes

04-16-2018  23. Geometry of Spectral Methods for Learning: CCA, QR, Applications [notes]
Readings: see refs in notes
Posted: [Final Assignment Project Topics]

04-18-2018  24. Geometry of Clustering I: Spectral Analysis, Kernel, Optimization [notes]
Readings: [BHK] Ch 7; see refs in notes
Posted: [A5 Solutions]
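
A small sketch of the spectral step in spectral clustering: embed the graph nodes using the bottom eigenvectors of the normalized Laplacian, then cluster the embedded rows (the toy two-block graph is an illustrative assumption):

```python
import numpy as np

def spectral_embedding(W, k):
    # Normalized Laplacian L = I - D^{-1/2} W D^{-1/2}; its bottom-k
    # eigenvectors give coordinates that separate weakly connected blocks.
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, :k]                     # feed these rows to k-means

# Two dense 5-node blocks joined by weak cross edges.
W = np.block([[np.ones((5, 5)), 0.01 * np.ones((5, 5))],
              [0.01 * np.ones((5, 5)), np.ones((5, 5))]])
np.fill_diagonal(W, 0.0)
print(spectral_embedding(W, 2).round(2))   # second column flips sign across the blocks
```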

04-23-2018  25. Geometry of Clustering II: Balanced Graph Cuts [Notes]
Readings: [BHK] Ch 7; see refs in notes

 

04-25-2018  26. Geometry of Tensors, Rank-1 Decompositions, Optimization, Applications [notes]
Readings: [B2] Ch 7; see refs in notes
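
An illustrative alternating power-iteration sketch for the best rank-1 approximation of a 3-way tensor (the planted rank-1 tensor is a synthetic test case, not course material):

```python
import numpy as np

def rank1_power_iteration(T, iters=100, rng=None):
    # Alternating power updates for the best rank-1 approximation
    # lam * u (x) v (x) w of a 3-way tensor T.
    rng = rng or np.random.default_rng(0)
    u = rng.normal(size=T.shape[0]); u /= np.linalg.norm(u)
    v = rng.normal(size=T.shape[1]); v /= np.linalg.norm(v)
    w = rng.normal(size=T.shape[2]); w /= np.linalg.norm(w)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w

# Recover a planted rank-1 tensor 5 * a (x) b (x) c.
rng = np.random.default_rng(1)
a, b, c = (x / np.linalg.norm(x) for x in rng.normal(size=(3, 6)))
T = 5.0 * np.einsum('i,j,k->ijk', a, b, c)
lam, u, v, w = rank1_power_iteration(T)
print(round(float(lam), 3), round(float(abs(u @ a)), 3))  # ~5.0 and ~1.0
```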

04-30-2018  27. Geometry of Tensors, Tucker and SVD Decompositions, Applications [notes2]
Readings: [B2] Ch 7; see refs in notes

05-02-2018  28. Algebra & Geometry of Deep Learning: CNNs, RNNs, etc., Applications, Next Steps [notes]
Readings: [BHK] Ch 5; [GBC] Ch 9, 14

05-04-2018  Final PROJECT PRESENTATION DAY (see Project List and Teams below), POB 2.402, 9:30am - 3pm

05-11-2018  Final Project Report due by 11:59pm

 

Class Project Topics List

Class Project Teams 

Grp. No. | Team Members | Project Topic | PPT Day
1 | Yan Han | Hyperspectral Image Analysis: Tensor Decomposition | May 4
2 | Jialin Wu, Yajie Niu | Image Object Detection | May 4
3 | Yuege Xie | Image Object Detection | May 4
4 | Shuangquan Feng, Xinrui Hua | Image Object Detection | May 4
5 | Andrew J DuPlissis | Image Reconstruction/Deblurring: Primal-Dual Optimization | May 4
6 | Qiujiang Jin | Image Reconstruction/Deblurring: Primal-Dual Optimization | May 4
7 | William Ruys | Image Reconstruction/Deblurring: Primal-Dual Optimization | May 4
8 | Meghana Palukuri, Akshay Kumar Varanasi | Hyperspectral Image Analysis: Denoising, Dictionary Learning and Classification | May 4
9 | Xiao Xu, Wenbo Zhang | Image Segmentation: Markov Random Field | May 4
10 | Michael Griffin, Tao Zhang | Image Segmentation: Markov Random Field | May 4
11 | Kazbek Kazhyken | Knowledge Base Completion | May 4
12 | Tai Yin Chu | Knowledge Base Completion | May 4
13 | Dany Haddad | Knowledge Base Completion | May 4
14 | Ethan Leeman | Structured Low-Rank Decomposition of Multivariate Hankel Matrices | May 4
15 | Supawit Chockchowwat | 3D Point Cloud Segmentation and Classification | May 4
16 | Caleb Chuck | 3D Point Cloud Segmentation and Classification | May 4

(Remark: the group numbering is not the order of presentation.)

 

Project FAQ

1. How long should the project report be?

Answer: See the directions in the Class Project List. For full points, please address each of the evaluation questions as succinctly as possible. Note that the report deadline is May 11, 11:59pm. You will receive feedback on your presentation, which should also be incorporated into your final report.

Tests

There will be one in-class midterm exam and one final project. The important deadline dates are:

  • Midterm: Wednesday, March 21, 11:00 a.m. - 12:30 p.m.
  • Final Project Due: May 11, 11:59pm

 

Assignments

There will be five written HW assignments and one final project report. Please refer to the schedule above for assignment and final project due dates.

 

Course Requirements and Grading

Grades will be based on the following factors:

  • In-class attendance and participation (10%)
  • HW assignments (45% and with potential to get extra credit) 

Five assignments in total. Some assignments may include extra questions for additional points; these will be specified in each assignment sheet.

  • In-class midterm exam (20%) 
  • Final Presentation & Report (25%) 

 

 

Students with Disabilities. Students with disabilities may request appropriate academic accommodations from the Division of Diversity and Community Engagement, Services for Students with Disabilities, 471-6259, http://www.utexas.edu/diversity/ddce/ssd.

 

Accommodations for Religious Holidays. By UT Austin policy, you must notify the instructor of your pending absence at least fourteen days prior to the date of observance of a religious holiday. If you must miss a class or an examination in order to observe a religious holiday, you will be given an opportunity to complete the missed work within a reasonable time before or after the absence, provided proper notification is given.

 

Statement on Scholastic Dishonesty. Anyone who violates the rules for the HW assignments, or who cheats on the in-class midterm or the final project, is in danger of receiving an F for the course. Additional penalties may be levied by the Computer Science department, the CSEM program, and the University. See http://www.cs.utexas.edu/academics/conduct/
