Quiz 5

Question 1

Let \(x_1, x_2, x_3\) represent three features. Which of the following are NOT linear combinations of these features?

Answer

  • \(0.4x_1 + 0.3x_2 + 0.6x_3\)
  • \(4x_1^2 + 3x_2^2 + x_3^2\)
  • \(4^2 x_1 + 3^2 x_2 + 6^2 x_3\)
  • \(4x_1 + 3x_2 + 6x_3\)
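As a quick sanity check (not part of the quiz itself), a linear combination must satisfy additivity: \(f(a+b) = f(a) + f(b)\). A minimal NumPy sketch, with illustrative feature values, shows that the squared option fails this test while the plain weighted sum passes:

```python
import numpy as np

# Two sample feature vectors (x1, x2, x3); values are illustrative.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

def linear(x):
    # 4*x1 + 3*x2 + 6*x3 -- a linear combination of the features
    return 4 * x[0] + 3 * x[1] + 6 * x[2]

def squared(x):
    # 4*x1^2 + 3*x2^2 + x3^2 -- quadratic in the features, not linear
    return 4 * x[0]**2 + 3 * x[1]**2 + x[2]**2

# Additivity holds for the linear form but not the quadratic one
print(np.isclose(linear(a + b), linear(a) + linear(b)))    # True
print(np.isclose(squared(a + b), squared(a) + squared(b)))  # False
```

The same check distinguishes \(4x_1^2 + 3x_2^2 + x_3^2\) from the other three options, all of which are linear (constant coefficients such as \(4^2\) are still just scalars).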

Question 2

Which one of the following statements about PCA is false?

Answer

  • PCA projects the attributes into a space where the covariance matrix is diagonal
  • The first Principal Component points in the direction of maximum variance
  • PCA is a non-linear dimensionality reduction technique
  • PCA is useful for exploratory data analysis
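The diagonal-covariance property from the first option can be verified directly (a small NumPy sketch with synthetic correlated data, not part of the quiz): projecting centered data onto the eigenvectors of its covariance matrix yields coordinates whose covariance is diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic correlated 2-D data (illustrative mixing matrix)
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0], [1.5, 0.5]])
Xc = X - X.mean(axis=0)                  # center the data

cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # PCA: eigendecomposition of covariance

Z = Xc @ eigvecs                         # project onto the principal components
cov_proj = np.cov(Z, rowvar=False)

# Off-diagonal entries vanish: the projected covariance is diagonal
print(np.allclose(cov_proj, np.diag(np.diag(cov_proj)), atol=1e-8))  # True
```

This also illustrates why the "non-linear" option is the false statement: the whole transformation is a single linear map (matrix multiplication by the eigenvector matrix).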

Question 3

Which one of the following statements about PCA is false?

Answer

  • PCA works well for circular data
  • The first PC points in the direction of maximum variance
  • PCA computes the eigenvalue-eigenvector decomposition of the covariance matrix
  • PCA works well for ellipsoidal data
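The circular-data option can be made concrete with a small sketch (synthetic data, illustrative scales): for isotropic ("circular") data the covariance eigenvalues are nearly equal, so there is no meaningful direction of maximum variance, whereas for ellipsoidal data one eigenvalue clearly dominates and PCA is informative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Isotropic ("circular") data: equal variance in every direction
circ = rng.normal(size=(2000, 2))
# Ellipsoidal data: one axis dominates (scales are illustrative)
ellip = rng.normal(size=(2000, 2)) * np.array([5.0, 0.5])

for name, X in [("circular", circ), ("ellipsoidal", ellip)]:
    Xc = X - X.mean(axis=0)
    vals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))
    # Eigenvalue ratio: ~1 means no preferred direction for PCA
    print(name, vals[1] / vals[0])
```

For the circular cloud the ratio stays close to 1; for the ellipsoidal cloud it is large, which is why "PCA works well for circular data" is the false statement.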

Question 4

The magnitude of the projection of a vector \(x\) onto a unit vector \(u\) is

Answer

  • \(x \times u\)
  • \((x - \mu_x) \cdot (u - \mu_u)\)
  • \(x\cdot u\)
  • \(\|x\| \, \|u\|\)
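A quick numeric illustration (vectors chosen arbitrarily): because \(\|u\| = 1\), the scalar projection \(\|x\|\cos\theta\) reduces to the dot product \(x \cdot u\).

```python
import numpy as np

x = np.array([3.0, 4.0])
u = np.array([1.0, 1.0])
u = u / np.linalg.norm(u)            # normalize: u is now a unit vector

proj_len = x @ u                     # scalar projection of x onto u

# Equivalent form: ||x|| * cos(theta), with theta the angle between x and u
cos_theta = (x @ u) / (np.linalg.norm(x) * np.linalg.norm(u))
print(np.isclose(proj_len, np.linalg.norm(x) * cos_theta))  # True
```

Here `proj_len` is \(7/\sqrt{2} \approx 4.95\); the dot-product form only gives the projection magnitude because \(u\) has unit norm.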

Question 5

Feature selection is:

Answer

  • selecting a subset of attributes
  • selecting principal components with maximum variance
  • combining many features into one
  • selecting principal components that are not orthogonal to each other
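To make the distinction concrete (a toy sketch, data values arbitrary): feature selection keeps a subset of the original columns unchanged, in contrast to feature extraction methods such as PCA, which build new features as combinations of all original ones.

```python
import numpy as np

X = np.arange(20.0).reshape(4, 5)    # toy data: 4 samples, 5 features

# Feature selection: keep a subset of the original attribute columns
selected = X[:, [0, 2, 4]]           # e.g. keep features 0, 2 and 4

# The selected columns are the original features, untouched
print(selected.shape)                # (4, 3)
print(np.array_equal(selected[:, 0], X[:, 0]))  # True
```

The other answer options describe feature extraction (combining features, choosing principal components), not selection.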