The term "overparameterised models" comes from the fields of artificial intelligence, Industry 4.0 and smart factories, as well as Big Data and Smart Data. It refers to mathematical models, for example in AI, that have more parameters (i.e. adjustable values) than would strictly be necessary to solve a given problem.
Imagine a factory robot that has to learn to sort different parts. If the model behind the robot's learning has a very large number of parameters, this overparameterised model can fit far more patterns than the task actually contains. The robot then learns not only the genuine features of the parts but also random details, such as noise in its sensor data, that are irrelevant to the task; this is known as overfitting.
Overparameterised models are used because they offer a high degree of flexibility and often achieve better results on large amounts of data, provided there is enough data and the models are properly controlled, for example through regularisation. The aim is to find a balance: the model should be complex enough to solve difficult tasks, but not so complex that it learns everything, including the unimportant details.
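This trade-off can be made concrete with a small, purely illustrative sketch (not from the original text): fitting a simple and a heavily parameterised polynomial to the same noisy data. The overparameterised model matches the training points almost perfectly, including the noise, while the simple model generalises better to unseen data. All names and numbers below are chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# The true relationship is linear; the added noise plays the role of
# the "random details" an overparameterised model can latch onto.
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.1, size=x_train.size)

x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test  # noise-free ground truth for evaluation

# A simple model (degree 1, two parameters) versus an overparameterised
# one (degree 9, ten parameters for ten points: it can fit the noise).
simple = np.polynomial.Polynomial.fit(x_train, y_train, deg=1)
flexible = np.polynomial.Polynomial.fit(x_train, y_train, deg=9)

def mse(model, x, y):
    """Mean squared error of a fitted polynomial on the data (x, y)."""
    return float(np.mean((model(x) - y) ** 2))

print("train error, simple:  ", mse(simple, x_train, y_train))
print("train error, flexible:", mse(flexible, x_train, y_train))
print("test error,  simple:  ", mse(simple, x_test, y_test))
print("test error,  flexible:", mse(flexible, x_test, y_test))
```

The flexible model's training error is essentially zero, yet its error on the unseen test points is worse than its training error suggests, which is exactly the balance described above: enough capacity for the task, but controlled so the noise is not memorised.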















