Question: Explain the difference between batch gradient descent and stochastic gradient descent.

Answer: Batch gradient descent computes the gradient over the entire dataset for each parameter update, giving stable but computationally expensive steps. Stochastic gradient descent (SGD) updates the parameters using a single randomly selected data point at a time, making each step cheap but noisy. Mini-batch gradient descent is a compromise between the two, using a small random subset of the data for each update.
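A minimal sketch of the three update schemes, assuming a linear model with squared-error loss; the function names, learning rate, and batch size here are illustrative choices, not a fixed recipe:

```python
import numpy as np

def batch_gd(X, y, w, lr=0.01, epochs=100):
    """Batch gradient descent: each update uses the full dataset."""
    n = len(y)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n          # gradient averaged over all n samples
        w -= lr * grad
    return w

def sgd(X, y, w, lr=0.01, epochs=100):
    """Stochastic gradient descent: each update uses one random sample."""
    n = len(y)
    for _ in range(epochs):
        for i in np.random.permutation(n):    # visit samples in random order
            grad = X[i] * (X[i] @ w - y[i])   # gradient from a single sample
            w -= lr * grad
    return w

def minibatch_gd(X, y, w, lr=0.01, epochs=100, batch_size=32):
    """Mini-batch gradient descent: each update uses a small random subset."""
    n = len(y)
    for _ in range(epochs):
        idx = np.random.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad
    return w
```

Note that all three loop structures compute the same kind of gradient; they differ only in how many samples contribute to each step, which is exactly the trade-off between per-update cost and update noise described above.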
Content users rated most helpful:
- Explain the concept of feature engineering.
- What is the purpose of regularization in machine learning?
- Explain the term 'hyperparameter' in the context of machine learning.
- What is the purpose of the activation function in a neural network?
- Explain the term 'precision' in the context of classification.