
# Q2 Quiz Instruction

📗 The quizzes must be completed during the lectures and submitted on TopHat: Link. No Canvas submissions are required. The grades will be updated on Canvas by the end of the week.
📗 Please submit a regrade request if (i) you missed a few questions because you arrived late or had to leave during the lecture; or (ii) you selected obviously incorrect answers by mistake (one or two of these shouldn't affect your grade): Link

| Answer | Points | Out of |
|---|---|---|
| Correct | 1 | Number of Questions |
| Plausible but Incorrect | 1 | - |
| Obviously Incorrect | 0 | - |


Slides: PDF

The following questions may appear as quiz questions during the lecture. If the questions are not generated correctly, try refreshing the page using the button at the top left corner.


# Question 1

📗 [3 points] Move the sliders below to change the green plane normal so that as many of the blue points as possible are above the plane and as many of the red points as possible are below the plane.

The current number of mistakes is ???.
📗 Answers:
\(w_{1}\) = 0
\(w_{2}\) = 0
\(w_{3}\) = 1
\(b\) = 0
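
📗 Note: the mistake count can be checked directly: a blue point \(x\) is correct when \(w^{\top} x + b > 0\) and a red point when \(w^{\top} x + b < 0\). A minimal Python sketch, with made-up sample points standing in for the randomly generated ones in the quiz:

```python
import numpy as np

# Made-up sample points; the quiz generates its own blue and red points.
blue = np.array([[0.2, 0.5, 1.0], [0.1, 0.9, 0.8]])   # should end up above the plane
red = np.array([[0.3, 0.4, -0.7], [0.6, 0.1, -1.2]])  # should end up below the plane

def count_mistakes(w, b, blue, red):
    """Count points on the wrong side of the plane w . x + b = 0."""
    blue_wrong = (blue @ w + b <= 0).sum()  # blue points not strictly above
    red_wrong = (red @ w + b >= 0).sum()    # red points not strictly below
    return int(blue_wrong + red_wrong)

w = np.array([0.0, 0.0, 1.0])  # the answer above: normal along the third axis
b = 0.0
print(count_mistakes(w, b, blue, red))  # 0 for these sample points
```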
# Question 2

📗 [3 points] Move the sliders below to change the green plane normal so that the total loss from the blue points below the plane and the red points above the plane is minimized.

The current total cost is ???.
📗 Answers:
\(w_{1}\) = 0
\(w_{2}\) = 0
\(w_{3}\) = 1
\(b\) = 0
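
📗 Note: the quiz widget computes the cost internally; one plausible form is the total margin violation, sketched below under that assumption (the function name total_cost and the sample points are illustrative, not taken from the quiz):

```python
import numpy as np

def total_cost(w, b, blue, red):
    """Assumed margin-violation loss: blue points should satisfy
    w . x + b > 0 and red points w . x + b < 0; each violation
    contributes its magnitude. The quiz's actual loss may differ."""
    blue_scores = blue @ w + b
    red_scores = red @ w + b
    cost = np.maximum(0.0, -blue_scores).sum()  # blue points on or below the plane
    cost += np.maximum(0.0, red_scores).sum()   # red points on or above the plane
    return float(cost)

# Reusing the sample points and answer from Question 1:
blue = np.array([[0.2, 0.5, 1.0], [0.1, 0.9, 0.8]])
red = np.array([[0.3, 0.4, -0.7], [0.6, 0.1, -1.2]])
print(total_cost(np.array([0.0, 0.0, 1.0]), 0.0, blue, red))  # 0.0
```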
# Question 3

📗 [3 points] Which of the following functions are equal to the squared error for deterministic binary classification, \(C = \displaystyle\sum_{i=1}^{n} \left(f\left(x_{i}\right) - y_{i}\right)^{2}\), where \(f\left(x_{i}\right) \in \left\{0, 1\right\}\) and \(y_{i} \in \left\{0, 1\right\}\)? Note: \(I_{S}\) is the indicator function of the set \(S\).
📗 Note: the question asks for the functions that are identical in value.
📗 Choices:
\(\displaystyle\sum_{i=1}^{n} \ldots\)
\(\displaystyle\sum_{i=1}^{n} \ldots\)
\(\displaystyle\sum_{i=1}^{n} \ldots\)
\(\displaystyle\sum_{i=1}^{n} \ldots\)
\(\displaystyle\sum_{i=1}^{n} \ldots\)
None of the above
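
📗 Note: since \(f\left(x_{i}\right)\) and \(y_{i}\) are both in \(\left\{0, 1\right\}\), each term satisfies \(\left(f\left(x_{i}\right) - y_{i}\right)^{2} = \left|f\left(x_{i}\right) - y_{i}\right| = I_{\left\{f\left(x_{i}\right) \neq y_{i}\right\}}\), so the squared error equals the count of misclassified points. A quick check over all four binary cases:

```python
import itertools

# For binary f and y, (f - y)^2, |f - y|, and the indicator I{f != y}
# all take the same value, so the sums over i are identical too.
for f, y in itertools.product([0, 1], repeat=2):
    squared = (f - y) ** 2
    absolute = abs(f - y)
    indicator = 1 if f != y else 0
    assert squared == absolute == indicator
    print(f, y, squared, absolute, indicator)
```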
# Question 4

📗 [3 points] Which one of the following is the gradient descent step for \(w\) if the activation function is \(\ldots\) and the cost function is \(\ldots\)?
📗 Choices:
\(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) \ldots\)
\(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) \ldots\)
\(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) \ldots\)
\(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) \ldots\)
\(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) \ldots\)
None of the above
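
📗 Note: the activation and cost in this question are filled in when it is generated; for the common pairing of a sigmoid activation with the cross-entropy cost, the chain rule collapses to the \(\left(a_{i} - y_{i}\right)\) factor and the step is \(w = w - \alpha \displaystyle\sum_{i=1}^{n} \left(a_{i} - y_{i}\right) x_{i}\). A sketch under that assumption:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(w, b, X, y, alpha):
    """One gradient descent step, assuming sigmoid activation and
    cross-entropy cost, with activations a_i = sigmoid(w . x_i + b)."""
    a = sigmoid(X @ w + b)           # activations a_i, shape (n,)
    w = w - alpha * X.T @ (a - y)    # w <- w - alpha * sum_i (a_i - y_i) x_i
    b = b - alpha * (a - y).sum()    # b <- b - alpha * sum_i (a_i - y_i)
    return w, b
```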





Last Updated: November 30, 2024 at 4:34 AM