Difference Between Model Parameters VS HyperParameters

Model parameters and hyperparameters are two of the most commonly confused terms in machine learning. In this post, we will look at what each term means and how the two differ.

What is a Model Parameter?

A model parameter is an internal variable of the selected model whose value is estimated by fitting the model to the given data.
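
For a concrete picture, here is a minimal sketch (assuming NumPy and scikit-learn are installed; the toy data is made up for illustration) showing that the slope and intercept of a linear regression are model parameters: they do not exist before training and are estimated when the model is fit to the data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data drawn from y = 3*x + 2 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X.ravel() + 2 + rng.normal(0, 0.5, size=50)

model = LinearRegression()
model.fit(X, y)  # fitting the model estimates its parameters

# The learned parameters live inside the fitted model object
print("coef_     :", model.coef_)       # close to [3.0]
print("intercept_:", model.intercept_)  # close to 2.0
```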

What is a Model Hyperparameter?

A model hyperparameter is a parameter whose value is set before the model starts training. Hyperparameters cannot be learned by fitting the model to the data.
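
Hyperparameters, by contrast, are passed in before training even begins. Here is a minimal sketch using scikit-learn's SGDRegressor (the specific values are arbitrary illustrations, not recommendations):

```python
from sklearn.linear_model import SGDRegressor

# Hyperparameters are fixed up front; calling fit() never changes them
model = SGDRegressor(
    learning_rate="constant",  # step-size schedule for gradient descent
    eta0=0.01,                 # the learning rate itself
    max_iter=1000,             # how many passes the optimizer may take
    alpha=0.0001,              # regularization strength
)
# The parameters (model.coef_, model.intercept_) only come into existence
# after model.fit(X, y) estimates them from data.
```

Searching over candidate values for settings such as eta0 or alpha (for example, with cross-validated grid search) is what the table below calls hyperparameter tuning.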

Table of difference between Model Parameters and HyperParameters

| Parameters | Hyperparameters |
| --- | --- |
| They are required for making predictions. | They are required for estimating the model parameters. |
| They are estimated by optimization algorithms (Gradient Descent, Adam, Adagrad). | They are estimated by hyperparameter tuning. |
| They are not set manually. | They are set manually. |
| The final parameters found after training decide how the model will perform on unseen data. | The choice of hyperparameters decides how efficient the training is. In gradient descent, the learning rate decides how efficient and accurate the optimization process is in estimating the parameters. |
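
The last row of the table is easiest to see in a bare-bones gradient descent loop. The sketch below (plain NumPy, with a made-up toy problem and illustrative values) fixes the learning rate and the number of epochs up front as hyperparameters, while the weight w and bias b are the parameters that the optimization estimates.

```python
import numpy as np

# Toy data drawn from y = 3*x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=100)
y = 3 * x + 2 + rng.normal(0, 0.05, size=100)

learning_rate = 0.1  # hyperparameter: chosen before training, never updated by it
n_epochs = 2000      # hyperparameter: how many gradient steps to take

w, b = 0.0, 0.0      # parameters: estimated by the loop below
for _ in range(n_epochs):
    error = w * x + b - y
    # Gradients of the mean squared error with respect to the parameters
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    # The learning rate scales how far each step moves the parameters
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"estimated w ~ {w:.2f}, b ~ {b:.2f}")  # approaches 3 and 2
```

A learning rate that is too large makes the updates diverge, and one that is too small makes them crawl, which is exactly why this hyperparameter has to be chosen or tuned rather than learned by the same loop that estimates w and b.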