Rubric

Keep in mind that 27 students have already been assessed using this rubric. Changing it will affect their evaluations.
A3 (1)
Criteria Ratings Pts
Problem 1, question 1
3 to >0.0 pts Full Marks Step 1: Find the lifting function $\Phi$ such that $K(x, y) = \langle \Phi(x), \Phi(y) \rangle$.
0 pts No Marks
3 pts
--
Problem 1, question 1
3 to >0.0 pts Full Marks Step 2: Write $A$ in the form $B^T B$, where $B = [\Phi(a_1) \cdots \Phi(a_n)]$ with $\Phi$ from step 1.
0 pts No Marks
3 pts
--
Problem 1, question 1
4 to >0.0 pts Full Marks Step 3: Show $y^T A y = y^T B^T B y = \|By\|^2$ with $B$ from step 2.
0 pts No Marks
4 pts
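The three steps above assemble into the positive-semidefiniteness argument; the full chain, for reference:

```latex
y^T A y \;=\; y^T B^T B y \;=\; (By)^T (By) \;=\; \|By\|^2 \;\ge\; 0
\quad \text{for all } y,
```

so $A$ is positive semidefinite.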
--
Problem 1, question 2
10 to >6.0 pts Full Marks Correct steps 1, 2, and 3. Step 3: Compute $B = (\Phi(a_1)\ \Phi(a_2)\ \Phi(a_3)) = \sqrt{\Sigma}\, U^T$.
6 to >3.0 pts Partial Marks Correct steps 1 and 2. Step 2: Compute $A = U \Sigma U^T$.
3 to >0 pts Partial Marks Correct step 1. Step 1: Compute the Gram matrix.
10 pts
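The three graded steps can be checked numerically. The points `a` and the kernel `K` below are placeholders for illustration only; the actual points and kernel come from the problem statement.

```python
import numpy as np

# Hypothetical 1-D points and kernel (not the problem's actual data).
a = np.array([0.0, 1.0, 2.0])
K = lambda x, y: (1.0 + x * y) ** 2  # example polynomial kernel

# Step 1: Gram matrix A_ij = K(a_i, a_j).
A = np.array([[K(x, y) for y in a] for x in a])

# Step 2: eigendecomposition A = U Sigma U^T (A is symmetric).
Sigma, U = np.linalg.eigh(A)

# Step 3: B = sqrt(Sigma) U^T, so that B^T B = U Sigma U^T = A.
# Clip tiny negative eigenvalues caused by floating-point roundoff.
B = np.diag(np.sqrt(np.clip(Sigma, 0.0, None))) @ U.T

print(np.allclose(B.T @ B, A))  # B^T B recovers the Gram matrix
```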
--
Problem 2, question 1
+ Step 1: Show the support of (Y_1, Y_2) is a square.
7.5 to >0.0 pts Full Marks Step 1: Show the support of (Y_1, Y_2) is a square.
0 pts No Marks
7.5 pts
--
Problem 2, question 1
+ Step 2: Give the correct joint pdf of (Y_1, Y_2).
7.5 to >0.0 pts Full Marks Step 2: Give the correct joint pdf of (Y_1, Y_2).
0 pts No Marks
7.5 pts
--
Problem 2, question 2
10 to >1.0 pts Full Marks Step 1: Compute the correct determinant of the Jacobian matrix.
1 to >0 pts Partial Marks Express x_1, x_2 in terms of y_1, y_2.
10 pts
--
Problem 2, question 2
5 to >0.0 pts Full Marks Step 2: Show (Y_1, Y_2) has joint pdf p(y_1, y_2).
0 pts No Marks
5 pts
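Both graded steps of this question are instances of the standard change-of-variables formula; for reference:

```latex
p_{Y_1, Y_2}(y_1, y_2)
  \;=\; p_{X_1, X_2}\bigl(x_1(y_1, y_2),\, x_2(y_1, y_2)\bigr)
  \left| \det \frac{\partial(x_1, x_2)}{\partial(y_1, y_2)} \right|,
```

valid on the support of $(Y_1, Y_2)$ and zero outside it.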
--
Problem 3, question 1
5 to >0.0 pts Full Marks Step 1: Compute $p(z_i \mid y_i, \Theta^{(t)})$.
0 pts No Marks
5 pts
--
Problem 3, question 1
5 to >0.0 pts Full Marks Step 2: Write the expected complete-data log-likelihood using the $p(z_i \mid y_i, \Theta^{(t)})$ computed in step 1.
0 pts No Marks
5 pts
--
Problem 3, question 1
5 to >0.0 pts Full Marks Step 3: Take the derivative of the quantity in step 2, set it to 0, and solve for $\Theta^{(t+1)}$.
0 pts No Marks
5 pts
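For a generic mixture model with parameters $\Theta$ (the concrete model is fixed by the problem statement; this is only a reference template), the three steps are the standard EM recipe:

```latex
\text{Step 1 (E-step responsibilities):}\quad
r_{ik} \;=\; p(z_i = k \mid y_i, \Theta^{(t)})
      \;=\; \frac{\pi_k\, p(y_i \mid z_i = k, \Theta^{(t)})}
                 {\sum_j \pi_j\, p(y_i \mid z_i = j, \Theta^{(t)})},
```

```latex
\text{Step 2:}\quad
Q(\Theta, \Theta^{(t)}) \;=\; \sum_i \sum_k r_{ik} \log p(y_i, z_i = k \mid \Theta),
\qquad
\text{Step 3:}\quad
\Theta^{(t+1)} \;=\; \arg\max_{\Theta} Q(\Theta, \Theta^{(t)}).
```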
--
Problem 3, question 2
5 to >0.0 pts Full Marks Step 1: Generate 1000 random 1-D points from the ground-truth parameters.
0 pts No Marks
5 pts
--
Problem 3, question 2
5 to >0.0 pts Full Marks Step 2: Implement EM to estimate the parameters from the points sampled in step 1.
0 pts No Marks
5 pts
--
Problem 3, question 2
5 to >0.0 pts Full Marks Step 3: Report the number of iterations and compare the estimated parameters with the ground truth.
0 pts No Marks
5 pts
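The three steps of this question can be sketched end to end. The ground-truth parameters, the choice of a two-component 1-D Gaussian mixture, and the initialization below are illustrative assumptions, not the problem's actual values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth parameters (placeholders for the assignment's values).
pi_true = np.array([0.4, 0.6])
mu_true = np.array([-2.0, 3.0])
sigma_true = np.array([0.5, 1.0])

# Step 1: generate 1000 random 1-D points from the ground truth.
z = rng.choice(2, size=1000, p=pi_true)
y = rng.normal(mu_true[z], sigma_true[z])

def normal_pdf(y, mu, sigma):
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Step 2: EM, initialized away from the truth.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
for it in range(1, 501):
    # E-step: responsibilities r_ik = p(z_i = k | y_i, Theta^(t)).
    r = pi * normal_pdf(y[:, None], mu, sigma)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: maximize the expected complete-data log-likelihood.
    Nk = r.sum(axis=0)
    pi_new = Nk / len(y)
    mu_new = (r * y[:, None]).sum(axis=0) / Nk
    sigma_new = np.sqrt((r * (y[:, None] - mu_new) ** 2).sum(axis=0) / Nk)
    converged = np.allclose(mu, mu_new, atol=1e-6)
    pi, mu, sigma = pi_new, mu_new, sigma_new
    if converged:
        break

# Step 3: report the number of iterations and compare with the ground truth.
print(f"converged after {it} iterations")
print("pi:", pi, "mu:", mu, "sigma:", sigma)
```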
--
Problem 3, question 3
5 to >0.0 pts Full Marks Step 1: Give a correct setup of the problem, including the observations and latent variables.
0 pts No Marks
5 pts
--
Problem 3, question 3
5 to >0.0 pts Full Marks Step 2: Compute the conditional probability of the latent variables given the observations and the parameters at the current iteration.
0 pts No Marks
5 pts
--
Problem 3, question 3
5 to >0.0 pts Full Marks Step 3: Update the parameters using the expected complete-data log-likelihood and the conditional probability computed in step 2.
0 pts No Marks
5 pts
--
Problem 1, question 3
5 to >0.0 pts Full Marks Step 1: Generate 50 sample locations uniformly in $[0, 5]$.
0 pts No Marks
5 pts
--
Problem 1, question 3
5 to >0.0 pts Full Marks Step 2: Generate noisy training samples.
0 pts No Marks
5 pts
--
Problem 1, question 3
5 to >0.0 pts Full Marks Step 3: Compute the predictive posterior $p(y^* \mid D, x^* = 2.5)$.
0 pts No Marks
5 pts
--
Problem 1, question 3
5 to >0.0 pts Full Marks Step 4: Plot the true function $f(x)$, the noisy training samples $\{x_i, y_i\}$, and the predictive posterior.
0 pts No Marks
5 pts
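Steps 1–3 of this question can be sketched as a Gaussian-process regression; the true function `f = sin`, the RBF kernel with length-scale 1, and the noise level 0.1 below are illustrative assumptions (the assignment fixes its own choices), and the plotting in step 4 is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions, not the assignment's actual specification.
f = np.sin
noise_std = 0.1

def rbf(x1, x2, ell=1.0):
    """RBF kernel matrix between two sets of 1-D locations."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell ** 2)

# Step 1: 50 sample locations uniformly in [0, 5].
x = rng.uniform(0.0, 5.0, size=50)

# Step 2: noisy training samples y_i = f(x_i) + eps_i.
y = f(x) + noise_std * rng.normal(size=50)

# Step 3: predictive posterior p(y* | D, x* = 2.5) is Gaussian with
# mean k*^T (K + s^2 I)^{-1} y and variance k** - k*^T (K + s^2 I)^{-1} k* + s^2.
xs = np.array([2.5])
K = rbf(x, x) + noise_std ** 2 * np.eye(len(x))
ks = rbf(x, xs)                                   # cross-covariances, shape (50, 1)
kss = rbf(xs, xs)                                 # prior variance at x*, shape (1, 1)
mean = float(ks.T @ np.linalg.solve(K, y))
var = float(kss - ks.T @ np.linalg.solve(K, ks) + noise_std ** 2)
print(f"p(y* | D, x*=2.5) ~ N({mean:.3f}, {var:.4f})")
```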
--