
# Q1 Quiz Instruction

📗 The quizzes must be completed during the lectures and submitted on TopHat: Link. No Canvas submissions are required. Grades will be updated on Canvas by the end of the week.
📗 Please submit a regrade request if (i) you missed a few questions because you arrived late or had to leave during the lecture, or (ii) you selected obviously incorrect answers by mistake (one or two of these shouldn't affect your grade): Link

| Answer | Points | Out of |
| --- | --- | --- |
| Correct | 1 | Number of Questions |
| Plausible but Incorrect | 1 | - |
| Obviously Incorrect | 0 | - |


Slides: PDF

The following questions may appear as quiz questions during the lecture. If the questions are not generated correctly, try refreshing the page using the button at the top left corner.


# Question 1

Code:


# Question 2

Code:


# Question 3

Code:


# Question 4

Code:


📗 [3 points] Move the sliders below to change the green plane normal so that all the blue points are above the plane and all the red points are below the plane. The current number of mistakes is ???.

📗 Answers:
\(w_{1}\) = 0
\(w_{2}\) = 0
\(w_{3}\) = 1
\(b\) = 0
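The answer above says a plane with normal \(w = (0, 0, 1)\) and bias \(b = 0\) (the horizontal plane \(x_{3} = 0\)) separates the two classes. A minimal sketch of how the "number of mistakes" counter could work, using hypothetical sample points (the actual points on the interactive diagram are not shown here):

```python
import numpy as np

# Hypothetical 3D points: blue should end up above the plane, red below.
blue = np.array([[0.2, 0.5, 1.0], [0.8, 0.1, 0.9]])   # want w.x + b > 0
red = np.array([[0.3, 0.4, -0.8], [0.7, 0.9, -0.5]])  # want w.x + b < 0

def count_mistakes(w, b, blue, red):
    """Count points on the wrong side of the plane w.x + b = 0."""
    above = blue @ w + b   # should be positive for blue points
    below = red @ w + b    # should be negative for red points
    return int((above <= 0).sum() + (below >= 0).sum())

w = np.array([0.0, 0.0, 1.0])  # the slider answer: normal pointing up
b = 0.0
print(count_mistakes(w, b, blue, red))  # 0 mistakes for these sample points
```

Moving the sliders just changes \(w\) and \(b\); the goal is a setting where this count reaches zero.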
📗 [3 points] Find the Perceptron weights by using the Perceptron algorithm: select a point on the diagram and click to run one iteration of the Perceptron algorithm.
📗 You can set the learning rate here: .

📗 Answer: 0,0.1,0

📗 [4 points] Consider a Linear Threshold Unit (LTU) perceptron with initial weights \(w\) = and bias \(b\) = trained using the Perceptron Algorithm. Given a new input \(x\) = with label \(y\) = and learning rate \(\alpha\) = , compute the updated weights and bias \(w', b'\) = :
📗 Answer (comma separated vector): .
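Since the specific values of \(w\), \(b\), \(x\), \(y\), and \(\alpha\) are generated on the page, here is a sketch of one LTU update step with hypothetical values, assuming the common \(\{0, 1\}\)-label convention where the update is \(w' = w + \alpha (y - a) x\) and \(b' = b + \alpha (y - a)\), with \(a\) the thresholded activation:

```python
import numpy as np

def ltu_update(w, b, x, y, alpha):
    """One Perceptron-algorithm step for an LTU: predict with a hard
    threshold, then adjust the weights only if the prediction is wrong."""
    a = 1 if np.dot(w, x) + b >= 0 else 0  # LTU activation (0/1 labels)
    w_new = w + alpha * (y - a) * x
    b_new = b + alpha * (y - a)
    return w_new, b_new

# Hypothetical values, since the question's blanks are filled in on the page:
w = np.array([0.5, -0.5])
b = 0.0
x = np.array([1.0, 2.0])
y = 1
alpha = 0.1
w2, b2 = ltu_update(w, b, x, y, alpha)
print(w2, b2)  # activation is 0, label is 1, so w moves toward x
```

With these numbers, \(w \cdot x + b = -0.5 < 0\) gives \(a = 0 \neq y\), so the update fires: \(w' = (0.6, -0.3)\), \(b' = 0.1\). If the prediction had matched the label, the weights would be unchanged.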
📗 [0 points] To be added.





Last Updated: April 29, 2024 at 1:11 AM