Feb. 19, 2024, 6:47 p.m. | /u/theloneliestsoulever | Machine Learning (www.reddit.com)

I have this:

σ : R → R is a non-decreasing, L-Lipschitz function, W ∈ R^{k×d}, and b ∈ R^k. Then there exists a convex, L||W||_2^2-smooth function F_{W,b} such that

∇_x F_{W,b}(x) = W^T σ(Wx + b)
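
Here is my attempt at the smoothness part (taking the gradient formula above as given; I'm not sure this argument is airtight, which is partly why I'm asking):

```latex
% My attempt: smoothness = Lipschitz gradient, so bound the difference directly.
\begin{aligned}
\|\nabla F_{W,b}(x) - \nabla F_{W,b}(y)\|_2
  &= \big\|W^\top\big(\sigma(Wx+b) - \sigma(Wy+b)\big)\big\|_2 \\
  &\le \|W\|_2 \,\|\sigma(Wx+b) - \sigma(Wy+b)\|_2
      && \text{(operator norm)} \\
  &\le \|W\|_2 \, L \,\|W(x-y)\|_2
      && \text{($\sigma$ is $L$-Lipschitz coordinatewise)} \\
  &\le L\|W\|_2^2 \,\|x-y\|_2 .
\end{aligned}
```

For convexity I believe ⟨∇F_{W,b}(x) − ∇F_{W,b}(y), x − y⟩ = ⟨σ(Wx+b) − σ(Wy+b), W(x−y)⟩ ≥ 0, since σ is non-decreasing and applied coordinatewise, but I'd appreciate someone checking both steps.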

and we have a convex potential layer

z = x - (2 / ||W||_2^2) W^T σ(Wx + b)
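
As a numerical sanity check I tried the special case σ = ReLU (so L = 1; this choice and the potential s below are my own assumptions, just to have something concrete to differentiate):

```python
import numpy as np

rng = np.random.default_rng(0)
k, d = 5, 3
W = rng.standard_normal((k, d))
b = rng.standard_normal(k)
x = rng.standard_normal(d)

sigma = lambda t: np.maximum(t, 0.0)          # ReLU: non-decreasing, 1-Lipschitz (L = 1)
s = lambda t: 0.5 * np.maximum(t, 0.0) ** 2   # an antiderivative of ReLU, s' = sigma

# Candidate potential: F(x) = sum_i s(w_i . x + b_i), so grad F = W^T sigma(Wx + b)
F = lambda v: s(W @ v + b).sum()

# Central finite differences agree with the closed-form gradient
eps = 1e-6
fd_grad = np.array([(F(x + eps * e) - F(x - eps * e)) / (2 * eps) for e in np.eye(d)])
print(np.allclose(fd_grad, W.T @ sigma(W @ x + b), atol=1e-5))  # True

# The convex potential layer from above (||W||_2 = spectral norm)
z = x - (2.0 / np.linalg.norm(W, 2) ** 2) * (W.T @ sigma(W @ x + b))
print(z)
```

At least in this case the gradient identity checks out numerically; that proves nothing, but it convinced me the statement is consistent.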


Now, can anyone help me with a rigorous proof that F_{W,b} is L||W||_2^2-smooth? And is z differentiable?
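
For the second question, my current guess (assuming the chain rule applies wherever σ is differentiable at every coordinate of Wx + b; since σ is Lipschitz it is differentiable almost everywhere by Rademacher's theorem):

```latex
% My guess at the Jacobian of z, valid only where \sigma' exists at each
% coordinate of Wx + b (hypothetical; please correct me if wrong):
\frac{\partial z}{\partial x}
  = I - \frac{2}{\|W\|_2^2}\, W^\top \operatorname{diag}\!\big(\sigma'(Wx+b)\big)\, W .
```

So e.g. with σ = ReLU, z should be differentiable except where some coordinate of Wx + b is exactly zero.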

If not a full solution, can someone suggest some reading material …

