Revisiting Artificial Neural Networks (ANNs)
Though I researched and toyed with artificial neural networks for a few weeks last summer, the speed limitations of AHK prevented me from pursuing a generative module based on the technique. Now, however, with the core pieces of a C++ library complete, I am able to leverage the speed of C++ in designing plugins. The time has come to revisit ANNs...this time wielding roughly 100x the speed!
The basic theory of artificial neural networks is simple: create layers of "units" that each perform a single-function calculation, then create numerous connections between those units to carry input/output signals. Finally, sandwich the unit layers between input and output layers that carry signals between an external interface and the internal units. ANNs essentially rely on the tendency of composed functions to become complex, even chaotic, when plotted against a varying input. One can, however, shape the output with user-defined parameters or with any sort of evolutionary optimization run against a "training set" of data.
I spent the past two days writing a basic artificial neural network library in C++ to experiment with the applicability of neural networks to music. Currently, I am only playing with settings and observing graphical output in order to get a feel for the capabilities of ANNs. When I feel that I understand their behavior, I will map their outputs to music and see what happens!
Here are some output graphs from different neural networks. Notice that many interesting effects, including discontinuities, can be achieved with the composition of relatively simple functions (breaks in the curves are possible with either ternary or modulus functions).
The neural networks each contained 100 units (neurons) and 100 to 500 neural connections (synapses) between the units. A single input node and a single output node were used to communicate with the network.