This note describes how to implement and solve a quadratic programming problem using a shallow neural network in Keras. A single linear layer is used with a custom one-sided loss to impose the inequality constraints, and a custom kernel regularizer is used to impose the optimization objective, yielding a form of penalty method. This provides a useful exercise in augmenting the losses, metrics, and callbacks used in Keras. It also potentially allows exploitation of the Keras and TensorFlow back-end implementations on GPUs and distributed storage. We demonstrate the method on large-scale computational image reconstruction with compressed sensing simulations.
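For concreteness, the following is a minimal sketch of the idea described above, not the paper's own code, for a QP of the form minimize (1/2) x^T Q x + c^T x subject to A x <= b. The kernel of a single bias-free Dense layer holds the QP variable x, the rows of A and entries of b serve as the training inputs and targets, a custom one-sided loss penalizes constraint violations, and a custom kernel regularizer adds the quadratic objective. The problem data (Q, c, A, b), the penalty weight mu, and the optimizer settings are illustrative assumptions.

# Minimal sketch (assumed example, not the paper's implementation):
#   minimize (1/2) x^T Q x + c^T x   subject to   A x <= b
# solved with a single linear Keras layer whose kernel is the QP variable x.
import numpy as np
import tensorflow as tf
from tensorflow import keras

n, m = 10, 50                       # number of variables, inequality constraints
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))
Q = (M.T @ M).astype("float32")     # random positive-semidefinite objective matrix
c = rng.standard_normal((n, 1)).astype("float32")
A = rng.standard_normal((m, n)).astype("float32")
b = (A @ rng.standard_normal((n, 1)) + 1.0).astype("float32")  # ensures a feasible point

mu = 100.0                          # penalty weight on constraint violations (assumed)

class QPObjective(keras.regularizers.Regularizer):
    """Kernel regularizer encoding the objective (1/2) x^T Q x + c^T x."""
    def __init__(self, Q, c):
        self.Q = tf.constant(Q)
        self.c = tf.constant(c)
    def __call__(self, x):
        # x is the (n, 1) kernel of the Dense layer, i.e. the QP variable.
        quad = 0.5 * tf.reduce_sum(x * tf.matmul(self.Q, x))
        lin = tf.reduce_sum(self.c * x)
        return quad + lin

def one_sided_loss(y_true, y_pred):
    # Penalize only violations A x > b (squared hinge on the residual).
    return mu * tf.reduce_mean(tf.square(tf.nn.relu(y_pred - y_true)))

model = keras.Sequential([
    keras.Input(shape=(n,)),
    keras.layers.Dense(1, use_bias=False, kernel_regularizer=QPObjective(Q, c)),
])
model.compile(optimizer=keras.optimizers.Adam(0.01), loss=one_sided_loss)

# Each "sample" is a row of A with target b_i, so the forward pass computes A x.
model.fit(A, b, batch_size=m, epochs=2000, verbose=0)

x_est = model.get_weights()[0]      # the lone trainable kernel is the QP solution estimate

Because Keras adds regularizer losses to the training loss automatically, the quantity minimized during fitting is the penalized QP objective; increasing mu over the course of training (for example with a callback) would recover the classical penalty-method schedule suggested by the abstract.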

www.keithdillon.com/papers_preprints/Quadratic Programming with Keras.pdf

Quadratic Programming with Keras