Month 20: Hyperparameter Tuning and Optimization

Week 1: Introduction to Hyperparameter Tuning and Optimization

  • Overview of Hyperparameter Tuning and Optimization
  • Types of Hyperparameters
  • Grid Search and Random Search
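
Grid search and random search can be sketched in a few lines of plain Python. The objective below is a hypothetical stand-in for a real train-and-validate run, and the hyperparameter names (`lr`, `batch_size`) and ranges are illustrative assumptions:

```python
import itertools
import random

def objective(lr, batch_size):
    # Toy validation loss standing in for an actual training run;
    # its minimum is at lr=0.1, batch_size=32 by construction.
    return (lr - 0.1) ** 2 + (batch_size - 32) ** 2 / 1000

lrs = [0.001, 0.01, 0.1, 1.0]
batch_sizes = [16, 32, 64]

# Grid search: exhaustively evaluate every combination.
grid_best = min(itertools.product(lrs, batch_sizes),
                key=lambda p: objective(*p))

# Random search: spend the same budget (12 trials) on random draws,
# sampling the learning rate on a log scale.
random.seed(0)
samples = [(10 ** random.uniform(-3, 0), random.choice(batch_sizes))
           for _ in range(12)]
rand_best = min(samples, key=lambda p: objective(*p))

print("grid best:", grid_best)
print("random best:", rand_best)
```

Random search often wins when only a few hyperparameters matter, because it does not waste trials revisiting the same values of the important dimension.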

Week 2: Gradient-Based Optimization Techniques

  • Gradient Descent
  • Stochastic Gradient Descent (SGD)
  • Mini-Batch Gradient Descent
  • Momentum and Adaptive Methods (Nesterov, Adagrad, Adadelta, Adam)
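
These optimizers differ only in how they turn the gradient into an update. A minimal pure-Python sketch on the toy objective f(x) = x² (stochastic and mini-batch variants would simply replace `grad` with a noisy estimate computed on a sample of the data):

```python
def grad(x):
    # Gradient of the toy objective f(x) = x^2.
    return 2 * x

lr = 0.1

# Plain gradient descent: step directly against the gradient.
x = 5.0
for _ in range(100):
    x -= lr * grad(x)

# Momentum: accumulate a velocity term that smooths successive updates.
y, v, beta = 5.0, 0.0, 0.9
for _ in range(100):
    v = beta * v + grad(y)
    y -= lr * v

# Adam: per-parameter step sizes from bias-corrected first (m) and
# second (s) moment estimates of the gradient.
z, m, s = 5.0, 0.0, 0.0
b1, b2, eps = 0.9, 0.999, 1e-8
for t in range(1, 201):
    g = grad(z)
    m = b1 * m + (1 - b1) * g
    s = b2 * s + (1 - b2) * g * g
    z -= lr * (m / (1 - b1 ** t)) / ((s / (1 - b2 ** t)) ** 0.5 + eps)

print(x, y, z)
```

All three drive the parameter toward the minimum at 0; the constants (`lr`, `beta`, `b1`, `b2`) are typical defaults, not tuned values.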

Week 3: Bayesian Optimization Techniques

  • Introduction to Bayesian Optimization
  • Gaussian Processes
  • Acquisition Functions
  • Bayesian Optimization Libraries (e.g., Hyperopt, Optuna)
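
The loop those libraries automate can be sketched from scratch: fit a Gaussian-process surrogate to the evaluations so far, then pick the next point by minimizing an acquisition function. This is a teaching sketch, not how one would call Hyperopt or Optuna; the kernel lengthscale, candidate grid, and lower-confidence-bound acquisition are illustrative choices:

```python
import math

def rbf(a, b, ls=0.3):
    # Squared-exponential (RBF) kernel with an assumed lengthscale.
    return math.exp(-((a - b) ** 2) / (2 * ls ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting (fine for small systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(x, xs, ys, noise=1e-6):
    # GP posterior mean and variance at x given observations (xs, ys).
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    k = [rbf(x, a) for a in xs]
    mean = sum(ki * ai for ki, ai in zip(k, solve(K, ys)))
    var = rbf(x, x) - sum(ki * vi for ki, vi in zip(k, solve(K, k)))
    return mean, max(var, 0.0)

def objective(x):
    # Toy function to minimize; stands in for a validation-loss measurement.
    return (x - 0.7) ** 2

xs = [0.0, 0.5, 1.0]                  # initial design points
ys = [objective(x) for x in xs]
candidates = [i / 100 for i in range(101)]

for _ in range(10):
    # Lower-confidence-bound acquisition: sample where mean - kappa*std
    # is lowest, trading off exploitation against exploration.
    def lcb(x, kappa=2.0):
        m, v = gp_posterior(x, xs, ys)
        return m - kappa * math.sqrt(v)
    nxt = min(candidates, key=lcb)
    xs.append(nxt)
    ys.append(objective(nxt))

best_y, best_x = min(zip(ys, xs))
print("best x:", best_x, "value:", best_y)
```

Real libraries add noise handling, lengthscale fitting, and better acquisition functions (expected improvement, entropy search), but the surrogate-then-acquire loop is the same.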

Week 4: Automated Machine Learning

  • Introduction to Automated Machine Learning
  • AutoML Tools (e.g., TPOT, H2O AutoML, Auto-Keras)
  • Model Selection and Stacking
  • Challenges and Future Directions of AutoML
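
The model-selection half of what AutoML systems automate can be illustrated with a toy search: fit each candidate model family on training data and keep the one with the best validation score. The dataset and the two candidate families below are invented for illustration; real systems such as TPOT and Auto-Keras also search preprocessing steps, architectures, and stacked ensembles:

```python
import random

random.seed(0)
# Toy dataset: y = 3x + 2 plus noise, split into train/validation halves.
xs = [i / 20 for i in range(40)]
ys = [3 * x + 2 + random.gauss(0, 0.1) for x in xs]
train = list(zip(xs[::2], ys[::2]))
val = list(zip(xs[1::2], ys[1::2]))

def fit_mean(data):
    # Baseline: always predict the training mean.
    m = sum(y for _, y in data) / len(data)
    return lambda x: m

def fit_linear(data):
    # Ordinary least squares for a single feature.
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    w = (sum((x - mx) * (y - my) for x, y in data)
         / sum((x - mx) ** 2 for x, _ in data))
    return lambda x: w * x + (my - w * mx)

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# Model selection: fit every candidate family, keep the best validation score.
candidates = {"mean": fit_mean, "linear": fit_linear}
scores = {name: mse(fit(train), val) for name, fit in candidates.items()}
best = min(scores, key=scores.get)
print(best, scores[best])
```

Stacking extends this by training a meta-model on the base models' validation predictions instead of simply picking one winner.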