Numerical optimization based on the L-BFGS method
March 22, 2022, 1:22 p.m. | Utpal Kumar
Towards Data Science - Medium towardsdatascience.com
We will inspect the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization method using a minimization example on the Rosenbrock function. We will then compare the performance of the L-BFGS method with the gradient-descent method. The L-BFGS approach, along with several other numerical optimization routines, is at the core of machine learning.
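The kind of experiment described above can be sketched with SciPy, which ships both the Rosenbrock function and an L-BFGS implementation; the starting point `[-1.2, 1.0]` is a conventional choice for this benchmark, not one taken from the article:

```python
# Minimal sketch: minimizing the Rosenbrock function with L-BFGS via SciPy.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0])  # conventional starting point for the Rosenbrock benchmark
result = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B")
print(result.x)  # converges to the global minimum at [1, 1]
```

`rosen_der` supplies the analytic gradient; L-BFGS then builds a low-memory approximation of curvature information from successive gradients, which is why it typically needs far fewer iterations than plain gradient descent on this problem.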
Introduction
Optimization problems aim to find the minima or maxima of a given objective function. There are two deterministic approaches to optimization problems: first-order derivative methods (such as …
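A first-order method of the kind mentioned above can be illustrated with plain gradient descent on the same Rosenbrock function; the step size and iteration count here are illustrative choices, not values from the article:

```python
# Sketch of a first-order method: gradient descent on the Rosenbrock function.
import numpy as np

def rosenbrock_grad(x):
    # Gradient of f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
    dx = -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2)
    dy = 200 * (x[1] - x[0] ** 2)
    return np.array([dx, dy])

x = np.array([-1.2, 1.0])
lr = 1e-3  # small step size; Rosenbrock's curved valley makes larger steps unstable
for _ in range(50_000):
    x = x - lr * rosenbrock_grad(x)
print(x)  # creeps toward [1, 1], but far more slowly than L-BFGS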
gradient-descent numerical numerical-analysis optimization python