Junyeop Na Dev Blog
  • Archives
  • All Categories
  • All Tags

Optuna

1 article

<Deep Learning Intermediate> Optuna for Effective Hyperparameter Tuning


Sunday, May 4, 2025

Hyperparameter tuning is an important part of training ML models. Machine learning engineers used to tune hyperparameters by hand or with Grid Search; these days there are more powerful tools that can find good hyperparameters in less time. We're going to look into an easy-to-use tool called Optuna. Problems with Grid Search: it takes too long, because the search space explodes as more hyperparameters are added to the grid (with 86,000 grid points already, adding one hyperparameter with just two options doubles the count to 172,000); it doesn't use previous search results, even though earlier runs give a sense of which hyperparameter ranges are worth exploring; and it only supports discrete grids, e.g. after doing 10 experiments with a learning rate between 0. …
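Since the excerpt only names the tool, here is a minimal, hedged sketch of what an Optuna study typically looks like, assuming a toy quadratic stand-in for the validation loss rather than the article's actual training loop:

```python
# Minimal Optuna sketch (assumption: a toy objective, not the article's model).
import optuna


def objective(trial):
    # Sample a continuous learning rate on a log scale instead of a fixed discrete grid.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    # Stand-in for validation loss; a real objective would train and evaluate a model.
    return (lr - 1e-3) ** 2


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)  # each trial is guided by earlier results
print(study.best_params)
```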
Jun Yeop(Johnny) Na
5 minute read

© 2024 - 2025 Junyeop Na Dev

🌱 Powered by Hugo with theme Dream.
