Data Scientist Question:

Tell me, do gradient descent methods always converge to the same point?


Answer:

No, they do not. On a non-convex loss surface, gradient descent can settle in a local minimum (a local optimum) rather than the global optimum. Which point it reaches depends on the shape of the loss surface (i.e., the data and the model) and on the starting conditions, such as the initial parameter values and the learning rate.
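To make this concrete, here is a minimal sketch (a hypothetical example, not part of the original answer) of plain gradient descent on the non-convex function f(x) = x^4 - 4x^2 + x, which has two minima. The same update rule with the same learning rate ends at different minima depending only on the starting point:

```python
# Hypothetical example: plain gradient descent on a non-convex function
# f(x) = x**4 - 4*x**2 + x, which has two minima. The converged point
# depends entirely on the starting value x0.

def grad(x):
    # Derivative of f(x) = x**4 - 4*x**2 + x
    return 4 * x**3 - 8 * x + 1

def gradient_descent(x0, lr=0.01, steps=1000):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # standard gradient descent update
    return x

# Same algorithm, same learning rate -- only the start point differs.
print(gradient_descent(x0=-2.0))  # ends near x ~ -1.47 (the global minimum)
print(gradient_descent(x0=+2.0))  # ends near x ~ +1.35 (a local minimum)
```

Because the updates only follow the local slope, each run rolls downhill into whichever basin its starting point lies in; nothing in the update rule can pull it across the hill separating the two minima.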
