This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.
Re: minimization of functions of several variables
- To: "Pavel Krotkov" <krotkov at drfmc dot ceng dot cea dot fr>
- Subject: Re: minimization of functions of several variables
- From: Brian Gough <bjg at network-theory dot co dot uk>
- Date: Thu, 1 Feb 2001 12:47:26 +0000 (GMT)
- Cc: <gsl-discuss at sources dot redhat dot com>
- References: <002501c08c31$0994b280$410ba884@ceng.cea.fr>
- Reply-To: gsl-discuss at sources dot redhat dot com
> I have encountered a problem of finding a minimum of a function of
> several variables in a situation when finding the derivatives is
> complicated (impossible). In GSL there seems to be only the simulated
> annealing algorithm implemented that suits. Has anybody tried it in
> the multidimensional case?
It's true, the multidimensional minimizers are all gradient-based.
You can use them if you provide a function which computes the gradient
numerically. We should provide some derivative-free minimization
routines, but no one has implemented any yet.