
# M16 Past Exam Problems

📗 Enter your ID (your wisc email ID without @wisc.edu) in the box and click the button (or hit the enter key).
📗 If the questions are not generated correctly, try refreshing the page using the button at the top left corner.
📗 The same ID should generate the same set of questions. Your answers are not saved when you close the browser. You could print the page, solve the problems on paper, then enter all your answers at the end.
📗 Please do not refresh the page: your answers will not be saved.

# Warning: please enter your ID before you start!


# Question 1


📗 [3 points] Suppose there are three classifiers $f_1, f_2, f_3$ to choose from (i.e. the hypothesis space has three elements), and the activation values from these classifiers based on a training set of three items are listed below. Which classifier is the best one if loss is used for comparison? (Enter a number 1 or 2 or 3).
📗 Reminder: zero-one loss means $\sum_{i=1}^{n} 1_{\{a_i \neq y_i\}}$, square loss means $\sum_{i=1}^{n} \left(a_i - y_i\right)^2$, cross entropy loss means $-\sum_{i=1}^{n} \left(y_i \log\left(a_i\right) + \left(1 - y_i\right) \log\left(1 - a_i\right)\right)$.

| Items | 1 | 2 | 3 |
| --- | --- | --- | --- |
| $y$ |  |  |  |
| $f_1$ |  |  |  |
| $f_2$ |  |  |  |
| $f_3$ |  |  |  |

📗 Answer: .
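
📗 The table values and the choice of loss are generated per ID, but the comparison itself is mechanical; the sketch below (Python, with hypothetical labels and activations) computes all three losses from the reminder above, so the classifier with the smallest value of the chosen loss can be read off.

```python
import math

# Hypothetical labels and activation values; the exam generates the real ones per ID.
y = [0, 1, 1]
activations = {
    "f1": [0.2, 0.8, 0.6],
    "f2": [0.6, 0.4, 0.9],
    "f3": [0.1, 0.9, 0.7],
}

def zero_one_loss(a, y):
    # Literal form of the reminder: sum of 1{a_i != y_i}.
    return sum(1 for ai, yi in zip(a, y) if ai != yi)

def square_loss(a, y):
    return sum((ai - yi) ** 2 for ai, yi in zip(a, y))

def cross_entropy_loss(a, y):
    return -sum(yi * math.log(ai) + (1 - yi) * math.log(1 - ai) for ai, yi in zip(a, y))

for name, a in activations.items():
    print(name, zero_one_loss(a, y), square_loss(a, y), cross_entropy_loss(a, y))
# The best classifier is the one with the smallest value of the loss being compared.
```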

# Question 2


📗 [3 points] Which of the following functions are equal to the squared error for deterministic binary classification, $C = \sum_{i=1}^{n} \left(f\left(x_i\right) - y_i\right)^2$, where $f\left(x_i\right) \in \{0, 1\}$ and $y_i \in \{0, 1\}$? Note: $I_S$ is the indicator notation on the set $S$.
📗 Note: the question is asking for the functions that are identical in value.
📗 Choices:
$\sum_{i=1}^{n}$
$\sum_{i=1}^{n}$
$\sum_{i=1}^{n}$
$\sum_{i=1}^{n}$
$\sum_{i=1}^{n}$
None of the above
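
📗 The summands in the choices are generated per ID, so only the underlying identity can be illustrated here: when $f(x_i)$ and $y_i$ are both 0/1-valued, the squared error coincides with several other forms, as in the derivation below.

```latex
% For f(x_i), y_i \in \{0, 1\}, the difference f(x_i) - y_i lies in \{-1, 0, 1\},
% so its square is 1 exactly when f(x_i) \neq y_i and 0 otherwise.
\begin{aligned}
C &= \sum_{i=1}^{n} \left(f(x_i) - y_i\right)^2
   = \sum_{i=1}^{n} \left|f(x_i) - y_i\right|
   = \sum_{i=1}^{n} I_{\{f(x_i) \neq y_i\}} \\
  &= \sum_{i=1}^{n} \left( f(x_i) \left(1 - y_i\right) + y_i \left(1 - f(x_i)\right) \right).
\end{aligned}
```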

# Question 3


📗 [3 points] In one step of gradient descent for an L2 regularized logistic regression, suppose $w$ = , $b$ = , $\dfrac{\partial C}{\partial w}$ = , and $\dfrac{\partial C}{\partial b}$ = . If the learning rate is $\alpha$ = and the regularization parameter is $\lambda$ = , what is $w$ after one iteration? Use the loss $C\left(w, b\right)$ and the regularization $\dfrac{\lambda}{2} \left\| \begin{bmatrix} w \\ b \end{bmatrix} \right\|_2^2 = \dfrac{\lambda}{2} \left(w^2 + b^2\right)$.
📗 Answer: .
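
📗 A minimal sketch of the update this question exercises, assuming the objective being minimized is $C(w, b) + \frac{\lambda}{2}\left(w^2 + b^2\right)$ so that the regularizer contributes $\lambda w$ and $\lambda b$ to the gradients; all numeric values below are hypothetical placeholders.

```python
# One gradient-descent step on C(w, b) + (lam / 2) * (w**2 + b**2).
w, b = 1.0, -0.5           # hypothetical current weight and bias
dC_dw, dC_db = 0.3, 0.1    # hypothetical partial derivatives of the unregularized loss C
alpha, lam = 0.1, 0.01     # hypothetical learning rate and regularization parameter

# The regularizer adds lam * w (resp. lam * b) to each partial derivative.
w_new = w - alpha * (dC_dw + lam * w)
b_new = b - alpha * (dC_db + lam * b)
print(w_new)  # the question asks for w after one iteration; b_new is computed the same way
```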

# Question 4


📗 [2 points] Consider a single sigmoid perceptron with bias weight $w_0$ = , a single input $x_1$ with weight $w_1$ = , and the sigmoid activation function $g\left(z\right) = \dfrac{1}{1 + \exp\left(-z\right)}$. For what input $x_1$ does the perceptron output the value $a$ = ?
📗 Note: Math.js does not accept "ln(...)", please use "log(...)" instead.
📗 Answer: .
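
📗 Inverting $a = g\left(w_0 + w_1 x_1\right)$ gives $x_1 = \left(\log\dfrac{a}{1 - a} - w_0\right) / w_1$; the sketch below checks this numerically with hypothetical values of $w_0$, $w_1$, and $a$.

```python
import math

# Hypothetical parameters and target output; the exam generates the real values per ID.
w0, w1, a = -1.0, 2.0, 0.75

# a = 1 / (1 + exp(-(w0 + w1 * x1)))  =>  w0 + w1 * x1 = log(a / (1 - a))
x1 = (math.log(a / (1 - a)) - w0) / w1

# Verify by plugging x1 back into the sigmoid.
check = 1 / (1 + math.exp(-(w0 + w1 * x1)))
print(x1, check)  # check should reproduce a (0.75 here) up to floating-point error
```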

# Question 5


📗 [2 points] Consider a single sigmoid perceptron with bias weight $w_0$ = , a single input $x_1$ with weight $w_1$ = , and the sigmoid activation function $g\left(z\right) = \dfrac{1}{1 + \exp\left(-z\right)}$. For what input $x_1$ does the perceptron output the value $a$ = ?
📗 In the accompanying plot, the red curve is the activation function; given the y-value of the green point, the question is asking for its x-value.
📗 Note: Math.js does not accept "ln(...)", please use "log(...)" instead.
📗 Answer: .

# Question 6


📗 [1 point] Blank.
📗 Answer: .

# Question 7


📗 [1 point] Blank.
📗 Answer: .

# Question 8


📗 [1 point] Blank.
📗 Answer: .

# Question 9


📗 [1 point] Blank.
📗 Answer: .

# Question 10


📗 [1 point] Blank.
📗 Answer: .

# Question 11


📗 [1 point] Blank.
📗 Answer: .

# Question 12


📗 [1 point] Blank.
📗 Answer: .

# Question 13


📗 [1 point] Blank.
📗 Answer: .

# Question 14


📗 [1 point] Blank.
📗 Answer: .

# Question 15


📗 [1 point] Blank.
📗 Answer: .

# Question 16


📗 [1 point] Blank.
📗 Answer: .

# Question 17


📗 [1 point] Blank.
📗 Answer: .

# Question 18


📗 [1 point] Blank.
📗 Answer: .

# Question 19


📗 [1 point] Blank.
📗 Answer: .

# Question 20


📗 [1 point] Blank.
📗 Answer: .

# Question 21


📗 [1 point] Blank.
📗 Answer: .

# Question 22


📗 [1 point] Blank.
📗 Answer: .

# Question 23


📗 [1 point] Blank.
📗 Answer: .

# Question 24


📗 [1 point] Blank.
📗 Answer: .

# Question 25


📗 [1 point] Blank.
📗 Answer: .



# Grade




📗 You could save the text in the text box above to a file using the button, or copy and paste it into a file yourself.
📗 You could load your answers from the text (or a txt file) in the text box below using the button. The first two lines should be "##m: 16" and "##id: your id", and the format of the remaining lines should be "##1: your answer to question 1", newline, "##2: your answer to question 2", etc. Please make sure that your answers are loaded correctly before submitting them.
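
📗 For example, an answer file following the format above might look like the sketch below; the ID and the answer values are hypothetical placeholders.

```
##m: 16
##id: bbadger
##1: 1
##2: 2
##3: 0.5
```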


📗 You can find videos going through the questions on Link.





Last Updated: April 07, 2025 at 1:55 AM