From 64c1550eb1314364b3da9825a99c74646b1b5451 Mon Sep 17 00:00:00 2001
From: "Steven G. Johnson"
Date: Fri, 28 Jan 2022 15:30:00 -0500
Subject: [PATCH] video link

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 2725756..29fd194 100644
--- a/README.md
+++ b/README.md
@@ -115,7 +115,7 @@ In physics, first and second derivatives of eigenvalues and first derivatives of
 
 * part 1: continued [Hessian notes](https://www.dropbox.com/s/tde5cow6wuais8y/Hessians.pdf?dl=0): using Hessians for optimization (Newton & quasi-Newton & Newton–Krylov methods), cost of Hessians
 * part 2: adjoint differentiation of ODEs (guest lecture by Dr. Chris Rackauckas), notes coming soon
-* video: coming soon
+* [video](https://mit.zoom.us/rec/share/4yJfjUXsAhykKGebU8lw1wqOb_TIwK-WISEHwEbG8WyB2bLlyYLjkZXQHonT2Tte.lLaSTRDjVxYocEmz?startTime=1643385485000)
 
 **Further reading (part 1)**: See e.g. these Stanford notes on [sequential quadratic optimization](https://web.stanford.edu/class/ee364b/lectures/seq_notes.pdf) using trust regions (sec. 2.2). See 18.335 [notes on BFGS quasi-Newton methods](https://github.com/mitmath/18335/blob/spring21/notes/BFGS.pdf) (also [video](https://mit.zoom.us/rec/share/naqcRgSkZ0VNeDp0ht8QmB566mPowuHJ8k0LcaAmZ7XxaCT1ch4j_O4Khzi-taXm.CXI8xFthag4RvvoC?startTime=1620241284000)). The fact that a quadratic optimization problem in a sphere has [strong duality](https://en.wikipedia.org/wiki/Strong_duality) and hence is efficiently solvable is discussed in section 5.2.4 of the [*Convex Optimization* book](https://web.stanford.edu/~boyd/cvxbook/). There has been a lot of work on [automatic Hessian computation](https://en.wikipedia.org/wiki/Hessian_automatic_differentiation), but for large-scale problems you can ultimately only compute Hessian–vector products efficiently in general, which are equivalent to a directional derivative of the gradient, and can be used e.g. for [Newton–Krylov methods](https://en.wikipedia.org/wiki/Newton%E2%80%93Krylov_method).
 
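The "Further reading" paragraph in the patched README observes that a Hessian–vector product is equivalent to a directional derivative of the gradient, which is why matrix-free Newton–Krylov methods never need the full Hessian. Below is a minimal sketch of that identity, not taken from the course materials: it uses a hypothetical quadratic test objective f(x) = ½xᵀAx − bᵀx, chosen because its gradient Ax − b and Hessian A are known exactly, so the finite-difference product Hv ≈ (∇f(x + εv) − ∇f(x))/ε can be checked against Av. The names `grad` and `hessian_vector_product` are illustrative, not part of any library.

```python
import numpy as np

# Hypothetical quadratic test problem: f(x) = 0.5*x'Ax - b'x,
# so grad f(x) = A x - b and the Hessian is exactly A.
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # symmetric positive-definite Hessian
b = rng.standard_normal(n)

def grad(x):
    """Analytic gradient of the quadratic objective."""
    return A @ x - b

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v as a directional derivative of the gradient:
    H v ~ (grad(x + eps*v) - grad(x)) / eps, without ever forming H."""
    return (grad(x + eps * v) - grad(x)) / eps

x = rng.standard_normal(n)
v = rng.standard_normal(n)
Hv = hessian_vector_product(grad, x, v)
print(np.allclose(Hv, A @ v, atol=1e-4))   # matches the exact product A @ v
```

One gradient evaluation per product (plus the base gradient) is all a Krylov solver such as conjugate gradient needs at each inner iteration, which is what makes Newton–Krylov viable for large-scale problems where forming or storing the Hessian is impractical.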