A Master of Science thesis in Mathematics by Baha Khalil entitled "Neural Networks as a Convex Problem," submitted in September 2016. Thesis advisor: Dr. Dmitry Efimov. Soft and hard copies available.
We reformulated the problem of training a neural network model as a convex optimization problem by performing a local quadratic expansion of the cost function and adding the necessary constraints. We designed a new algorithm that extends the backpropagation algorithm for parameter estimation by using second-order optimization methods. We computed the second-order mixed partial derivatives of the cost function for a single-hidden-layer neural network model to construct the Hessian matrix. For higher-order neural network topologies, we used the Gauss-Newton approximation in place of the Hessian matrix to avoid the analytical computation of the second-order derivative terms. To compare the accuracy and computational complexity of our proposed algorithm against standard backpropagation, we tested both algorithms on different applications such as regression, classification, and ranking.
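To illustrate the core idea described above, the following is a minimal sketch (not the thesis implementation) of a Gauss-Newton update for a single-hidden-layer regression network with a squared-error cost: the Hessian of the cost is approximated by J^T J, where J is the Jacobian of the residuals with respect to the parameters. The network architecture, toy data, damping constant, and finite-difference Jacobian are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed for illustration): 20 samples, 3 inputs
X = rng.normal(size=(20, 3))
y = np.sin(X[:, 0])

# Single hidden layer: 3 inputs -> 4 tanh units -> 1 linear output
n_in, n_h = 3, 4
n_params = n_in * n_h + n_h + n_h + 1  # W1, b1, w2, b2 flattened

def unpack(theta):
    i = 0
    W1 = theta[i:i + n_in * n_h].reshape(n_in, n_h); i += n_in * n_h
    b1 = theta[i:i + n_h]; i += n_h
    w2 = theta[i:i + n_h]; i += n_h
    b2 = theta[i]
    return W1, b1, w2, b2

def predict(theta, X):
    W1, b1, w2, b2 = unpack(theta)
    return np.tanh(X @ W1 + b1) @ w2 + b2

def residuals(theta):
    return predict(theta, X) - y

def jacobian(theta, eps=1e-6):
    # Finite-difference Jacobian of the residuals w.r.t. the parameters
    # (the thesis derives these derivatives analytically; this is a stand-in)
    r0 = residuals(theta)
    J = np.zeros((r0.size, theta.size))
    for k in range(theta.size):
        t = theta.copy()
        t[k] += eps
        J[:, k] = (residuals(t) - r0) / eps
    return J

theta = rng.normal(scale=0.5, size=n_params)
cost0 = 0.5 * residuals(theta) @ residuals(theta)

for _ in range(15):
    r = residuals(theta)
    J = jacobian(theta)
    H = J.T @ J                    # Gauss-Newton approximation of the Hessian
    g = J.T @ r                    # gradient of the cost 0.5 * ||r||^2
    # Damped (Levenberg-Marquardt-style) step for numerical stability
    theta -= np.linalg.solve(H + 1e-2 * np.eye(n_params), g)

cost = 0.5 * residuals(theta) @ residuals(theta)
print(cost0, cost)
```

The appeal of the Gauss-Newton form is visible here: J^T J needs only first derivatives of the network output, yet it provides curvature information that a plain gradient step lacks.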