Michael V. Solodov
Instituto de Matematica Pura e Aplicada
Estrada Dona Castorina 110
Jardim Botanico
Rio de Janeiro
RJ, CEP 22460-320, Brazil
Tel: (5521) 529-5228
Fax: (5521) 512-4115
Email: solodov@impa.br, solodov@cs.wisc.edu
Web: http://www.cs.wisc.edu/~solodov/solodov.html
PhD Computer Sciences, University of Wisconsin-Madison, 1995.
(Advisor: Olvi L. Mangasarian, John von Neumann Professor of Mathematics and Computer Sciences)
MS Computer Sciences, University of Wisconsin-Madison, 1992.
Diploma with Distinction in Applied Mathematics and Cybernetics, Moscow State University, 1991.
Optimization in Neural Networks
One area of my research concerns applications of optimization
theory and algorithms to problems arising in the neural networks
field of artificial intelligence. Neural networks is a large
interdisciplinary area of research that has already found
applications in many branches of science and technology. However, much
of the work in the area has been based on heuristic concepts and
trial-and-error experimentation. It is therefore of great importance
to provide a rigorous mathematical foundation for neural network theory
and algorithms, wherever possible.
I also believe that new, more effective machine learning
methods can be developed by applying general optimization techniques
in conjunction with artificial intelligence paradigms, and by taking
advantage of problem structure.
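To give a flavor of the setting studied in the incremental-gradient paper listed below, here is a minimal sketch (not taken from any of the papers; all names and parameter values are illustrative assumptions) of an incremental gradient method: the iterate is updated after the gradient of each individual error term, as in backpropagation, using a constant stepsize bounded away from zero.

```python
def incremental_gradient(data, w0, stepsize=0.05, epochs=100):
    """Minimize f(w) = sum_i (x_i * w - y_i)^2 by taking a gradient
    step for one term at a time (an incremental, backpropagation-style
    sweep) with a constant stepsize."""
    w = w0
    for _ in range(epochs):
        for x, y in data:
            grad_i = 2.0 * (x * w - y) * x  # gradient of the i-th error term
            w -= stepsize * grad_i          # incremental update, fixed stepsize
    return w

# Toy data generated by y = 2 * x, so the minimizer is w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = incremental_gradient(data, w0=0.0)
```

On this toy problem every term shares the minimizer w = 2, so the iterates converge to it exactly even with a constant stepsize; in general, the cited papers analyze when such non-vanishing stepsizes still yield convergence.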
The following papers contain some of the work in this direction.
Relevant Publications
M.V. Solodov.
Incremental gradient algorithms with stepsizes bounded away from zero.
Computational Optimization and Applications 11 (1998), 23-35.
O.L. Mangasarian and M.V. Solodov.
Serial and Parallel Backpropagation Convergence Via Nonmonotone
Perturbed Minimization.
Optimization Methods and Software 4 (1994), 103-116.
M.V. Solodov and S.K. Zavriev.
Error stability properties of generalized gradient-type algorithms.
Journal of Optimization Theory and Applications 98 (3), September 1998.
O.L. Mangasarian and M.V. Solodov.
Backpropagation Convergence Via Deterministic Perturbed Minimization.
Advances in Neural Information Processing Systems 6,
J.D. Cowan, G. Tesauro and J. Alspector (eds), Morgan Kaufmann
Publishers, San Francisco, CA, 1994, 383-390.
My other research papers in mathematical programming are available from
my home page.