1. What is the difference between AI, Machine Learning, and Deep Learning?
2. What is supervised learning?
3. What is unsupervised learning?
4. What is overfitting, and how can it be avoided?
5. What is the difference between classification and regression?
6. What is a confusion matrix?
7. What is precision and recall?
8. What is the bias-variance tradeoff?
9. What is cross-validation, and why is it important?
10. What is the purpose of regularization in ML models?
11. What is the difference between bagging and boosting?
12. What is the difference between L1 and L2 regularization?
13. What is Gradient Descent?
14. Explain the working of a Decision Tree.
15. What is the difference between a Decision Tree and a Random Forest?
16. What is a support vector machine (SVM)?
17. Explain the k-means clustering algorithm.
18. What is Principal Component Analysis (PCA)?
19. What is the difference between a generative and discriminative model?
20. What is the kernel trick in SVM?
21. What are Convolutional Neural Networks (CNNs)?
22. Explain the difference between RNNs and LSTMs.
23. What are Generative Adversarial Networks (GANs)?
24. What is Transfer Learning?
25. Explain Q-learning in Reinforcement Learning.
26. What are autoencoders?
27. What is the vanishing gradient problem, and how can it be solved?
28. What is the difference between stochastic and batch gradient descent?
29. What are activation functions in neural networks?
30. What is the purpose of dropout in neural networks?
AI/ML Interview Questions
Are you preparing for an AI/ML interview? Here are 30 essential questions to help you succeed and demonstrate your mastery of AI/ML.
Top AI/ML Interview Questions & Answers
1. What is the difference between AI, Machine Learning, and Deep Learning?
AI is the broader concept of creating machines that simulate human intelligence. Machine Learning is a subset of AI where machines learn from data. Deep Learning is a subset of Machine Learning that uses neural networks with many layers.
2. What is supervised learning?
Supervised learning is a type of machine learning where the model is trained on labeled data, meaning each training example is paired with an output label.
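As a minimal sketch (assuming scikit-learn and a made-up toy dataset), a classifier is fit on feature/label pairs and then predicts labels for new inputs:

from sklearn.linear_model import LogisticRegression

# Labeled training data: each row of X is paired with a label in y
X = [[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]]
y = [0, 0, 1, 1]

model = LogisticRegression()
model.fit(X, y)                      # learn the mapping from features to labels
print(model.predict([[3.5, 3.5]]))   # predict a label for an unseen example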
3. What is unsupervised learning?
Unsupervised learning involves training a model on data without labeled responses, aiming to find hidden patterns or structures (e.g., clustering).
4. What is overfitting, and how can it be avoided?
Overfitting occurs when a model performs well on training data but poorly on unseen data. It can be avoided using regularization, simplifying the model, increasing training data, or applying cross-validation.
5. What is the difference between classification and regression?
Classification predicts a label (e.g., spam or not spam), while regression predicts a continuous value (e.g., predicting house prices).
6. What is a confusion matrix?
A confusion matrix shows the performance of a classification model by comparing actual and predicted values, including metrics like True Positives, False Positives, False Negatives, and True Negatives.
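For illustration, scikit-learn can build the matrix directly from actual and predicted labels (the label arrays below are made up):

from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Rows are actual classes, columns are predicted classes.
# For binary labels {0, 1} the layout is:
# [[TN, FP],
#  [FN, TP]]
print(confusion_matrix(y_true, y_pred))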
7. What is precision and recall?
Precision measures the accuracy of positive predictions: TP / (TP + FP). Recall measures the ability to find all positive samples: TP / (TP + FN).
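A quick check of both formulas with hypothetical counts (TP=40, FP=10, FN=20):

tp, fp, fn = 40, 10, 20          # hypothetical counts from a confusion matrix

precision = tp / (tp + fp)       # 40 / 50 = 0.80
recall    = tp / (tp + fn)       # 40 / 60 = about 0.67
print(precision, recall)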
8. What is the bias-variance tradeoff?
Bias is error from overly simplistic assumptions, which leads to underfitting; variance is error from sensitivity to small fluctuations in the training data, which leads to overfitting. The tradeoff is choosing a model complexity that balances the two to minimize total error.
9. What is cross-validation, and why is it important?
Cross-validation assesses how well a model generalizes to unseen data by splitting data into folds and evaluating performance across subsets.
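A minimal sketch using scikit-learn's cross_val_score with 5-fold cross-validation (the iris dataset and logistic regression are just convenient choices):

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# Train and evaluate on 5 different train/validation splits
scores = cross_val_score(model, X, y, cv=5)
print(scores.mean(), scores.std())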
10. What is the purpose of regularization in ML models?
Regularization techniques like L1 and L2 penalize model complexity to prevent overfitting and improve generalization.
11. What is the difference between bagging and boosting?
Bagging reduces variance by training models on different data subsets (e.g., Random Forest). Boosting reduces bias by sequentially improving model errors (e.g., AdaBoost).
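As a sketch of the two families in scikit-learn (the synthetic dataset and hyperparameters here are arbitrary):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier

X, y = make_classification(n_samples=500, random_state=0)

bagging = RandomForestClassifier(n_estimators=100, random_state=0)   # parallel trees on bootstrap samples
boosting = AdaBoostClassifier(n_estimators=100, random_state=0)      # sequential weak learners focusing on errors

# Training-set accuracy, printed only as a smoke test
print(bagging.fit(X, y).score(X, y))
print(boosting.fit(X, y).score(X, y))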
12. What is the difference between L1 and L2 regularization?
L1 regularization (Lasso) encourages sparsity by penalizing the absolute value of coefficients, while L2 (Ridge) penalizes their square, keeping all features.
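A minimal illustration with scikit-learn's Lasso (L1) and Ridge (L2) on synthetic data where only the first feature matters; L1 typically drives the irrelevant coefficients exactly to zero:

import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.RandomState(0)
X = rng.randn(100, 10)
y = X[:, 0] * 3.0 + rng.randn(100) * 0.1   # only feature 0 is informative

print(Lasso(alpha=0.1).fit(X, y).coef_)    # sparse: most coefficients exactly 0
print(Ridge(alpha=0.1).fit(X, y).coef_)    # small but non-zero coefficients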
13. What is Gradient Descent?
Gradient Descent is an optimization algorithm used to minimize a loss function by iteratively adjusting parameters in the opposite direction of the gradient.
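A bare-bones sketch in plain Python, minimizing f(w) = (w - 3)^2, whose gradient is 2(w - 3):

# Minimize f(w) = (w - 3)**2; its gradient is 2 * (w - 3)
w = 0.0
learning_rate = 0.1

for _ in range(100):
    gradient = 2 * (w - 3)
    w = w - learning_rate * gradient   # step in the direction opposite the gradient

print(w)   # converges toward the minimum at w = 3.0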
14. Explain the working of a Decision Tree.
A Decision Tree splits data based on feature values to make decisions, using metrics like Gini impurity or entropy to select the best split.
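A minimal scikit-learn example using Gini impurity as the split criterion (the iris dataset and depth limit are arbitrary choices for illustration):

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(criterion="gini", max_depth=2, random_state=0).fit(X, y)

print(export_text(tree))   # prints the feature thresholds chosen at each split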
15. What is the difference between a Decision Tree and a Random Forest?
A Decision Tree is a single tree structure, while a Random Forest is an ensemble of trees that uses bagging to reduce variance and improve accuracy.
16. What is a support vector machine (SVM)?
An SVM is a supervised learning algorithm that finds the hyperplane separating classes with the maximum margin, defined by the training points closest to the boundary (the support vectors).
17. Explain the k-means clustering algorithm.
K-means clustering partitions data into k clusters by minimizing variance within clusters, iteratively assigning points and updating centroids.
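A small sketch with scikit-learn; the data below are two made-up blobs of points:

import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1, 1], [1.5, 2], [1, 1.5],      # one blob
              [8, 8], [8.5, 9], [9, 8]])       # another blob

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)           # cluster assignment of each point
print(kmeans.cluster_centers_)  # final centroids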
18. What is Principal Component Analysis (PCA)?
PCA is a dimensionality reduction technique that transforms data into a new coordinate system, with variance maximized along principal components.
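A minimal sketch with scikit-learn, projecting the 4-dimensional iris data onto its first two principal components:

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)            # project onto the top-2 components
print(pca.explained_variance_ratio_)   # share of variance captured by each component
print(X_2d.shape)                      # (150, 2)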
19. What is the difference between a generative and discriminative model?
A generative model learns joint probability distributions (e.g., Naive Bayes), while a discriminative model learns conditional probabilities (e.g., Logistic Regression).
20. What is the kernel trick in SVM?
The kernel trick allows SVM to perform non-linear classification by implicitly mapping data into a higher-dimensional space.
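For illustration, a linear SVM struggles on concentric-circle data, while an RBF kernel separates it without ever computing the high-dimensional mapping explicitly (make_circles and the training-set scores are only a toy demonstration):

from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=300, noise=0.05, factor=0.4, random_state=0)

print(SVC(kernel="linear").fit(X, y).score(X, y))  # poor: no linear separator exists
print(SVC(kernel="rbf").fit(X, y).score(X, y))     # high: implicit non-linear mapping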
21. What are Convolutional Neural Networks (CNNs)?
CNNs are neural networks designed for processing images, using convolutional layers for feature extraction and pooling layers for downsampling.
22. Explain the difference between RNNs and LSTMs.
RNNs handle sequential data with hidden states, while LSTMs address the vanishing gradient problem with gates to control information flow.
23. What are Generative Adversarial Networks (GANs)?
GANs consist of a Generator that creates synthetic data and a Discriminator that evaluates its authenticity, training in a competitive manner.
24. What is Transfer Learning?
Transfer Learning involves using a pre-trained model on a related problem, fine-tuning it on new data instead of training from scratch.
25. Explain Q-learning in Reinforcement Learning.
Q-learning is an off-policy reinforcement learning algorithm in which an agent learns a Q-value for each state-action pair, estimating the expected cumulative reward. As the agent explores, it repeatedly updates Q(s, a) toward the received reward plus the discounted value of the best action in the next state.
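A sketch of the tabular Q-learning update rule; the tiny chain environment and the step function below are hypothetical stand-ins, and the state/action counts and hyperparameters are arbitrary:

import numpy as np

n_states, n_actions = 5, 2          # made-up tiny environment
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def step(state, action):
    # Toy environment: action 1 moves right, action 0 moves left; reward at the last state
    next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward

state = 0
for _ in range(1000):
    # epsilon-greedy action selection
    action = np.random.randint(n_actions) if np.random.rand() < epsilon else int(np.argmax(Q[state]))
    next_state, reward = step(state, action)
    # Q-learning update: move Q(s, a) toward reward + gamma * max_a' Q(s', a')
    Q[state, action] += alpha * (reward + gamma * np.max(Q[next_state]) - Q[state, action])
    state = 0 if next_state == n_states - 1 else next_state   # reset once the goal is reached

print(Q)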
26. What are autoencoders?
Autoencoders are neural networks trained to reconstruct their own input: an encoder compresses the data into a lower-dimensional latent space and a decoder reconstructs it. They are commonly used for dimensionality reduction, denoising, and anomaly detection.
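A minimal sketch in PyTorch (the layer sizes and dummy batch are arbitrary), trained to reconstruct its input with a mean-squared-error loss:

import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, input_dim))

    def forward(self, x):
        z = self.encoder(x)          # compress into the latent space
        return self.decoder(z)       # reconstruct the input

model = Autoencoder()
x = torch.rand(16, 784)              # dummy batch of flattened inputs
loss = nn.MSELoss()(model(x), x)     # reconstruction error to minimize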
27. What is the vanishing gradient problem, and how can it be solved?
The vanishing gradient problem occurs when gradients become too small during backpropagation. Solutions include using ReLU, LSTMs, or batch normalization.
28. What is the difference between stochastic and batch gradient descent?
Stochastic Gradient Descent updates parameters after each training example, giving fast but noisy updates, while Batch Gradient Descent computes the gradient over the entire dataset before each update. Mini-batch gradient descent, which updates on small batches, is the usual compromise in practice.
29. What are activation functions in neural networks?
Activation functions introduce non-linearity. Examples include Sigmoid, ReLU, Tanh, and Softmax.
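Minimal NumPy definitions of the activation functions mentioned above, as a reference sketch:

import numpy as np

def sigmoid(x): return 1 / (1 + np.exp(-x))   # squashes values into (0, 1)
def relu(x):    return np.maximum(0, x)        # zero for negatives, identity otherwise
def tanh(x):    return np.tanh(x)              # squashes values into (-1, 1)
def softmax(x):                                # turns scores into probabilities summing to 1
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))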
30. What is the purpose of dropout in neural networks?
Dropout is a regularization technique that prevents overfitting by randomly ignoring neurons during training.
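A sketch of "inverted dropout" in NumPy: during training each activation is kept with probability 1 - p and scaled so its expected value is unchanged, while at inference dropout is disabled:

import numpy as np

def dropout(activations, p=0.5, training=True):
    # Randomly zero out activations with probability p during training
    if not training:
        return activations
    mask = (np.random.rand(*activations.shape) > p).astype(activations.dtype)
    return activations * mask / (1 - p)   # rescale so the expected activation is unchanged

a = np.ones((2, 4))
print(dropout(a, p=0.5))            # roughly half the units zeroed, survivors scaled to 2.0
print(dropout(a, training=False))   # unchanged at inference time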