
Q1. Imagine you have 1000 input features and 1 target feature in a machine learning problem. You have to select the 100 most important features based on the relationship between the input features and the target feature.

Do you think this is an example of dimensionality reduction?

Yes

No
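(Selecting the top-k features by their relationship to the target can be sketched with scikit-learn's univariate feature selection. The dataset here is synthetic and all sizes are illustrative.)

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression

# Synthetic problem: 1000 input features, 1 target (sizes are illustrative)
X, y = make_regression(n_samples=500, n_features=1000,
                       n_informative=100, random_state=0)

# Keep the 100 features most related to the target (univariate F-test)
selector = SelectKBest(score_func=f_regression, k=100)
X_selected = selector.fit_transform(X, y)

print(X_selected.shape)  # (500, 100)
```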

Q2. I have 4 variables in the dataset: A, B, C and D. I have performed the following actions:

Step 1: Using the above variables, I have created two more variables, namely E = A + 3 * B and F = B + 5 * C + D.

Step 2: Then using only the variables E and F I have built a Random Forest model.

Could the steps performed above represent a dimensionality reduction method?

True

False
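(The two steps above can be sketched in a few lines of numpy; the data is random and purely illustrative. Four original columns are replaced by two derived ones, so the model sees a lower-dimensional input.)

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = rng.normal(size=(4, 100))  # four illustrative variables

# Step 1: derive two new variables from the original four
E = A + 3 * B
F = B + 5 * C + D

# Step 2 would train a model on E and F only: 4 columns reduced to 2
X_reduced = np.column_stack([E, F])
print(X_reduced.shape)  # (100, 2)
```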

Q3. Which of the following techniques would work well for reducing the dimensionality of a dataset?

Removing columns which have too many missing values

Removing columns which have high variance in data

Removing columns with dissimilar data trends

None of these

Q4. Is dimensionality reduction one of the possible ways to reduce the computation time required to build a model?

Yes

No

Q5. Which of the following algorithms cannot be used for reducing the dimensionality of data?

t-SNE

PCA

LDA

None

Q6. The most popularly used dimensionality reduction algorithm is Principal Component Analysis (PCA). Which of the following is/are true about PCA?

1. PCA is an unsupervised method

2. It searches for the directions in which the data has the largest variance

3. The maximum number of principal components is <= the number of features

4. All principal components are orthogonal to each other

1 and 2

1 and 3

2 and 4

1, 2 and 4

All of the above
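(The PCA properties listed above can be checked directly with scikit-learn on synthetic data; the data and sizes are illustrative. No target is passed to `fit`, the number of components is bounded by the number of features, and the component vectors are mutually orthogonal.)

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

pca = PCA()   # no target variable is passed: PCA is unsupervised
pca.fit(X)

# At most as many principal components as input features
assert pca.components_.shape[0] <= X.shape[1]

# Components are orthonormal: their Gram matrix is the identity
gram = pca.components_ @ pca.components_.T
assert np.allclose(gram, np.eye(gram.shape[0]), atol=1e-8)
```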

Q7. Suppose we are using dimensionality reduction as a pre-processing technique, i.e., instead of using all the features, we reduce the data to k dimensions with PCA and then use these PCA projections as our features. Which of the following statements is correct?

Higher ‘k’ means more regularization

Higher ‘k’ means less regularization

Cannot say
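(The intuition: a smaller k retains less of the data's variance, which constrains the downstream model more strongly. A minimal sketch on synthetic correlated data, assuming scikit-learn; the sizes and k values are illustrative.)

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Random linear mixing makes the 20 features correlated
X = rng.normal(size=(300, 20)) @ rng.normal(size=(20, 20))

# Fraction of total variance retained for increasing k
retained = [PCA(n_components=k).fit(X).explained_variance_ratio_.sum()
            for k in (2, 5, 10)]

# Smaller k keeps less variance: a stronger constraint on the model
assert retained[0] < retained[1] < retained[2]
print([round(r, 2) for r in retained])
```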

Q8. In which of the following scenarios is t-SNE better to use than PCA for dimensionality reduction while working on a local machine with minimal computational power?

Dataset with 1 million entries and 300 features

Dataset with 100,000 entries and 310 features

Dataset with 10,000 entries and 8 features

Dataset with 10,000 entries and 200 features

Q9. Which of the following statements is true for the t-SNE cost function?

It is asymmetric in nature.

It is symmetric in nature.

It is the same as the cost function for SNE.

Q10. Imagine you are dealing with text data. To represent the words, you are using word embeddings (Word2vec), so you end up with 1000 dimensions. Now, you want to reduce the dimensionality of this high-dimensional data such that words with similar meanings stay close together in nearest-neighbor space. In such a case, which of the following algorithms are you most likely to choose?

t-SNE

PCA

LDA

None

Q11. Which of the following statements is correct for t-SNE and PCA?

t-SNE is linear whereas PCA is non-linear

t-SNE and PCA both are linear

t-SNE and PCA both are nonlinear

t-SNE is nonlinear whereas PCA is linear

Q12. What will happen in PCA when the eigenvalues are roughly equal?

PCA will perform outstandingly

PCA will perform badly

Cannot say

None
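(When the eigenvalues of the covariance matrix are roughly equal, no direction dominates, so dropping any component loses about as much information as keeping it. A minimal numpy sketch on isotropic synthetic data; the sizes are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))   # isotropic: no preferred direction
Xc = X - X.mean(axis=0)

# Eigenvalues of the covariance matrix, largest first
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]
ratios = eigvals / eigvals.sum()

# All ratios near 0.25: PCA has no dominant component to keep
print(np.round(ratios, 3))
```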

Q13. PCA works better if:

1. There is a linear structure in the data

2. The data lies on a curved surface, not on a flat surface

3. The variables are scaled in the same unit

1 and 2

2 and 3

1 and 3

1, 2 and 3
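(Why scaling matters: a feature measured on a much larger scale dominates the variance, so PCA's first component simply tracks that feature. A minimal sketch assuming scikit-learn; the data and the factor of 1000 are illustrative.)

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
X[:, 0] *= 1000    # one feature on a much larger scale

raw_ratio = PCA().fit(X).explained_variance_ratio_[0]
scaled_ratio = PCA().fit(StandardScaler().fit_transform(X)).explained_variance_ratio_[0]

# Unscaled: the first component is almost entirely the large-scale feature
assert raw_ratio > 0.99
# After standardizing, variance spreads across the components
assert scaled_ratio < 0.5
print(round(raw_ratio, 4), round(scaled_ratio, 4))
```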

Q14. What happens when you get features in lower dimensions using PCA?

1. The features will still have interpretability

2. The features will lose interpretability

3. The features must carry all the information present in the data

4. The features may not carry all the information present in the data

1 and 3

1 and 4

2 and 3

2 and 4

Q15. Under which condition do SVD and PCA produce the same projection result?

When data has zero median

When data has zero mean

Both are always the same

None
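(The equivalence can be checked numerically: on mean-centered data, the right singular vectors of the data matrix equal the eigenvectors of its covariance matrix, so the projections agree up to a sign per component. A numpy-only sketch on illustrative synthetic data.)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) + 10.0   # nonzero mean on purpose
Xc = X - X.mean(axis=0)                # centered: zero mean

# PCA projection via eigenvectors of the covariance matrix
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]      # largest eigenvalue first
pca_proj = Xc @ eigvecs[:, order]

# SVD projection of the centered data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
svd_proj = Xc @ Vt.T

# Up to a sign flip per component, the two projections agree
assert np.allclose(np.abs(pca_proj), np.abs(svd_proj), atol=1e-6)
```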

Great job!
