Scaled hyperparameter optimisation is a term from the world of artificial intelligence, big data and automation. Hyperparameters are settings of an AI model that are chosen before training begins and have a major influence on how well the model performs. Examples of hyperparameters include the learning rate or the amount of data the model processes in each training step (the batch size).
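To make this concrete, here is a minimal sketch in Python, assuming the scikit-learn library is available. The point it illustrates is that hyperparameters are handed to the model up front rather than learned from the data:

```python
from sklearn.linear_model import SGDClassifier

# Hyperparameters are fixed before training starts; the model does not
# learn them from the data. Here, eta0 sets the learning rate and
# max_iter caps the number of passes over the training data.
model = SGDClassifier(learning_rate="constant", eta0=0.01, max_iter=1000)
```

Whether eta0 should be 0.01 or something else is exactly the kind of question hyperparameter optimisation answers.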
Scaled hyperparameter optimisation means that not just a few settings are tested, but many different combinations are tried out automatically and in parallel - often across several computers or with the help of cloud services. This saves a great deal of time and can significantly improve the quality of AI models.
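A minimal sketch of this pattern in plain Python (standard library only): many randomly chosen settings are scored in parallel worker processes, and the best one wins. The score function here is a hypothetical stand-in; in practice it would train a model and return its accuracy on held-out data. At larger scale, the same pattern runs across machines or a cloud service.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def score(learning_rate: float) -> float:
    """Stand-in for training a model and measuring its quality.
    This toy objective peaks at a learning rate of 0.01."""
    return 1.0 - (learning_rate - 0.01) ** 2

def main() -> None:
    # Sample many candidate settings instead of testing a few by hand.
    candidates = [random.uniform(0.0001, 0.1) for _ in range(200)]
    # Evaluate them simultaneously across several worker processes.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(score, candidates))
    best_score, best_lr = max(zip(results, candidates))
    print(f"best learning rate: {best_lr:.4f} (score {best_score:.4f})")

if __name__ == "__main__":
    main()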
Suppose an online shop wants to use artificial intelligence to predict which products will sell particularly well. To get the best predictions, the AI system must be tuned. With scaled hyperparameter optimisation, hundreds of variants are tested automatically in a short space of time until the best settings are found - without time-consuming manual work.
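One possible way this scenario could look in code, again assuming scikit-learn (the shop's real sales history is replaced here by synthetic data): RandomizedSearchCV draws 100 hyperparameter combinations from a search space of 150, and n_jobs=-1 spreads the trials across all available CPU cores.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in for the shop's historical sales data.
X, y = make_regression(n_samples=1000, n_features=20, random_state=0)

# The search space: many candidate settings instead of a handful.
param_distributions = {
    "n_estimators": [50, 100, 200, 300, 400],
    "max_depth": [None, 5, 10, 15, 20, 30],
    "min_samples_leaf": [1, 2, 4, 8, 16],
}

# Try 100 random combinations, evaluated in parallel on all CPU cores.
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions,
    n_iter=100,
    n_jobs=-1,
    random_state=0,
)
search.fit(X, y)
print("best settings:", search.best_params_)
```

The same search run sequentially by hand would take many times longer, which is the practical argument for scaling it out.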
This approach enables companies to develop powerful, reliable AI models more quickly and thus make their processes more efficient.