I’m working on a couple of problems for which the AI technique of Evolution Strategies is a perfect match. My own AI/EC background is mostly in genetic algorithms (Holland) and genetic programming (Koza); ESs were always “that German thing”. But ESs have gained in popularity as a general problem-solving paradigm, and for my current specific problem this approach is great.
Here’s the general form of the problem: estimate the mean and standard deviation of several weakly correlated outputs in the range [-1.0, 1.0], assuming you know nothing about the output values to begin with. Experimental results will (should?) tell you everything you need to know.
So, start with a uniform input distribution. Run the inputs through the model and calculate the outputs. Evolve the distribution of the next round of inputs based on the feedback. Wash, rinse, repeat. You may eventually get to a normal or pseudo-normal distribution if there is a single “correct” output.
ES allows you to coevolve the mutation function(s) (the mean and standard deviation of the inputs) as you go. The idea is that I can present more and more specific sets of inputs and arrive at very neat, very precise measurements of the output parameter(s) using this method.
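The loop above can be sketched as a minimal (μ, λ)-ES with self-adaptive step sizes. This is an illustrative sketch, not my actual code: the `sphere` objective, the hidden `target` vector, and all parameter values are assumptions chosen just to make the example runnable. The key ES idea is that each individual carries its own mutation sigma, which is itself mutated (log-normally) each generation, so the search distribution tightens as the run converges.

```python
import math
import random

def evolve(objective, n_dims=3, pop_size=20, n_parents=5,
           generations=200, seed=0):
    """Minimal (mu, lambda) Evolution Strategy with self-adaptive sigma."""
    rng = random.Random(seed)
    tau = 1.0 / math.sqrt(2.0 * n_dims)  # common learning rate for sigma

    # Start from a uniform distribution over [-1, 1] with a broad sigma.
    pop = [([rng.uniform(-1.0, 1.0) for _ in range(n_dims)], 0.5)
           for _ in range(pop_size)]

    for _ in range(generations):
        # Rank by fitness (lower is better) and keep the best mu parents.
        scored = sorted(pop, key=lambda ind: objective(ind[0]))
        parents = scored[:n_parents]
        pop = []
        for _ in range(pop_size):
            x, sigma = rng.choice(parents)
            # Mutate the strategy parameter first (log-normal update),
            # then use the new sigma to mutate the object variables.
            new_sigma = sigma * math.exp(tau * rng.gauss(0.0, 1.0))
            child = [xi + new_sigma * rng.gauss(0.0, 1.0) for xi in x]
            pop.append((child, new_sigma))

    return min(pop, key=lambda ind: objective(ind[0]))

# Hypothetical stand-in for "run the inputs through the model": distance
# to a hidden target vector in [-1, 1]^3.
target = [0.3, -0.6, 0.1]
def sphere(x):
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

best_x, best_sigma = evolve(sphere)
```

By the end of the run `best_sigma` has typically shrunk far below its starting value of 0.5, which is the self-adaptation at work: the mutation distribution narrows on its own as the population closes in on the target.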
Representation is a little heavier than normal with this approach – roughly 3X, since each gene carries its mean and standard deviation alongside its value – but as the man said, if you don’t particularly care about the results, you can get a program to run as fast as you want.
In case my rich multimillionaire great-uncle is reading this post, here’s a $128.00 book that I want to buy: “Noisy Optimization With Evolution Strategies”.