Data Scientist Question:
Do gradient descent methods always converge to the same point?
Answer:
No, they do not. For a non-convex objective, gradient descent can settle in a local minimum (or a saddle point) rather than the global optimum. Which point it reaches depends on the shape of the loss surface, the initialization, and the learning rate. Only for convex problems, where every local minimum is also global, do different runs converge to the same optimum (given a suitable step size).
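A minimal sketch of this behavior (an illustrative example, not from the source): running gradient descent on the non-convex function f(x) = (x² − 1)², which has two minima at x = −1 and x = +1, from two different starting points:

```python
def gradient_descent(x, lr=0.05, steps=500):
    """Plain gradient descent on f(x) = (x^2 - 1)^2 from starting point x."""
    for _ in range(steps):
        grad = 4 * x * (x**2 - 1)  # derivative of (x^2 - 1)^2
        x -= lr * grad
    return x

# Different initializations reach different minima of the same function:
print(round(gradient_descent(0.5), 4))   # → 1.0  (right basin)
print(round(gradient_descent(-0.5), 4))  # → -1.0 (left basin)
```

Both runs use identical hyperparameters; only the starting point differs, yet they converge to different minima.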