`df` has 5 columns and 10 rows. After applying `p = PCA(3)` and `p.fit(df)`, what is the shape of `p.components_`? Note: the rows of `p.components_` are the principal components.

- `(3, 5)`
- `(5, 3)`
- `(3, 10)`
- `(10, 3)`
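One way to check the shape is to run it; a minimal sketch, assuming a random 10x5 DataFrame as a stand-in for `df`:

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

# stand-in for df: 10 rows, 5 columns
df = pd.DataFrame(np.random.rand(10, 5), columns=list("ABCDE"))

p = PCA(3)   # keep 3 components
p.fit(df)

# components_ has one row per principal component
# and one column per original feature
print(p.components_.shape)
```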
The `explained_variance_ratio_` of a PCA model is `array([0.4, 0.3, 0.2, 0.1])`. How many components (at least) do we need to explain 80 percent (or more) of the variance of the original data?
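One way to reason about this kind of question is to take the cumulative sum of the ratios; a short sketch, with the ratios copied from the question:

```python
import numpy as np

ratios = np.array([0.4, 0.3, 0.2, 0.1])   # explained_variance_ratio_
cumulative = np.cumsum(ratios)             # variance explained by the first k components
print(cumulative)                          # [0.4 0.7 0.9 1. ]

# smallest k whose cumulative ratio reaches 0.8
k = int(np.argmax(cumulative >= 0.8)) + 1
print(k)
```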
Given the 1-D points `[1, 2, 3, 4]` and starting centroids `[0]` and `[5]`, what are the centroids after the first iteration of assigning points and updating centroids, using the iterative K-Means Clustering algorithm with Manhattan distance?

- `[1.5, 3.5]`
- `[0, 5]`
- `[2, 4]`
- `[1, 3]`
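A sketch of one assign-and-update step, assuming points are assigned to the closest centroid by Manhattan (absolute) distance and each centroid is then updated to the mean of its assigned points:

```python
import numpy as np

points = np.array([1, 2, 3, 4])
centroids = np.array([0, 5])

# assignment step: index of the closest centroid for each point
dists = np.abs(points[:, None] - centroids[None, :])   # Manhattan distance in 1-D
labels = dists.argmin(axis=1)

# update step: each centroid becomes the mean of its assigned points
new_centroids = np.array([points[labels == k].mean() for k in range(len(centroids))])
print(new_centroids)   # centroids after one iteration
```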
Using `PCA`, which of the following approximately reconstructs the original dataframe `df` using the first three components? Assume `p = PCA()`, then `W = p.fit_transform(df)` and `C = p.components_`.

- `W[:, :3] @ C[:3, :] + p.mean_`
- `W[:, :3] @ C[:, :3] + p.mean_`
- `W[:3, :] @ C[:3, :] + p.mean_`
- `W[3:, :] @ C[:, :3] + p.mean_`
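A minimal sketch for experimenting with rank-k reconstruction and checking which shapes line up; the random 10x5 DataFrame is only a stand-in for `df`:

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

df = pd.DataFrame(np.random.rand(10, 5))   # stand-in data

p = PCA()
W = p.fit_transform(df)    # shape (10, 5): one row of weights per sample
C = p.components_          # shape (5, 5): one row per principal component

k = 3
approx = W[:, :k] @ C[:k, :] + p.mean_     # keep only the first k components
print(np.abs(approx - df.values).max())    # reconstruction error using k of 5 components
```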
- `LinearRegression`
- `KMeans`
- `SVC` (support vector machine)
- `PCA`
The gradient `dw` at `[w1, w2, w3, w4] = [1, -1, 2, -2]` is `[-2, 2, -1, 1]`. If gradient descent `w = w - alpha * dw` is used, which variable will increase by the largest amount in the next iteration?

- `w1`
- `w2`
- `w3`
- `w4`
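A sketch of one gradient descent step on these numbers; the learning rate `alpha = 0.1` is an arbitrary choice for illustration:

```python
import numpy as np

w = np.array([1.0, -1.0, 2.0, -2.0])     # [w1, w2, w3, w4]
dw = np.array([-2.0, 2.0, -1.0, 1.0])    # gradient at w
alpha = 0.1                              # illustrative learning rate

w_new = w - alpha * dw
print(w_new - w)   # change in each variable; positive entries increased
```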
The linear program `max 2*w1 - w2` subject to `w1 - w2 <= 1` and `w1 + w2 >= 0`, with `w1, w2 >= 0`, is written in the standard form `max c * x` subject to `A x <= b` and `x >= 0`. What is the matrix `A`? Assume `c = [2, -1]` and `b = [1, 0]`.

- `[[1, -1], [-1, -1]]`
- `[[1, -1], [1, 1]]`
- `[[-1, 1], [-1, -1]]`
- `[[-1, 1], [1, 1]]`
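A `>=` constraint fits the `<=` standard form after multiplying both sides by -1. One way to check a candidate `A` numerically is to compare `A @ w <= b` against the original constraints on random nonnegative points; the matrix below is simply the first listed option, so swap in whichever choice you want to test:

```python
import numpy as np

# original constraints: w1 - w2 <= 1  and  w1 + w2 >= 0  (plus w1, w2 >= 0)
def satisfies_original(w):
    w1, w2 = w
    return (w1 - w2 <= 1) and (w1 + w2 >= 0)

b = np.array([1, 0])
A = np.array([[1, -1],     # candidate A: replace with the option being checked
              [-1, -1]])

# the candidate is equivalent iff A @ w <= b agrees with the
# original constraints for every nonnegative point w
for _ in range(1000):
    w = np.random.rand(2) * 4            # random point with w1, w2 >= 0
    assert np.all(A @ w <= b) == satisfies_original(w)
print("candidate A matches the original constraints")
```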