Posts In: ANN

Cranking out Engines

March 9, 2010 General News

It's almost mid-March already. I don't like the fact that the samples haven't improved appreciably in a while. As I noted earlier, that's mostly because I've been upgrading internals rather than working on sound. Still, it's time to step on it.

Over the past few weeks, I've been working like mad to re-code all the old engines in c++, taking advantage of the massive optimizations possible therein.  So far, the following engines are now at least partially functional as part of the new c++ library:

  • Artificial Neural Network Engine (was actually never implemented in AHK and has yet to be used in a plugin)
  • Contour Grammar
  • Evolutionary Engine
  • Fraccut
  • Markov Engine
  • Multi-State Engine

ALL of the new implementations beat their predecessors in both efficiency and ease of use.  Certain complex engines, such as the Markov engine, may see speed increases of over a thousandfold thanks to the redesign.

By the end of the month, these myriad engines should be coming together to form some really powerful new plugins.  All it takes is code.

Revisiting Artificial Neural Networks (ANNs)

February 6, 2010 General News

Though I researched and toyed with artificial neural networks for a few weeks last summer, the speed drawbacks of AHK prevented me from pursuing a generative module based on neural networks. Now, however, with the core pieces of a c++ library complete, I can leverage the speed of c++ in designing plugins. With that done, the time has come to revisit ANNs...this time wielding roughly 100x the speed!

The basic theory of artificial neural networks is simple: create layers of "units" that perform single-function calculations, then create numerous connections between said units that carry input/output signals between them. Finally, sandwich the layers of units with layers of inputs and outputs that carry signals between an external interface and the internal unit layers. ANNs essentially rely on the ability of function compositions to quickly become chaotic when plotted against a varying input. One can, however, shape the output with user-defined parameters or any sort of evolutionary optimization run against a "training set" of data.
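
The layered structure described above can be sketched in a few lines of c++. This is an illustrative toy, not the library I'm actually writing — the names, the tanh activation, and the dense layer layout are all assumptions for the sake of the example:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One layer of "units": each unit takes a weighted sum of the previous
// layer's outputs and applies a single nonlinear function to it.
struct Layer {
    std::vector<std::vector<double>> weights; // weights[out][in]

    static double activate(double x) { return std::tanh(x); }

    std::vector<double> forward(const std::vector<double>& in) const {
        std::vector<double> out;
        for (const auto& row : weights) {
            double sum = 0.0;
            for (size_t i = 0; i < in.size(); ++i)
                sum += row[i] * in[i];
            out.push_back(activate(sum));
        }
        return out;
    }
};

// Stacking layers composes their functions; that composition is what
// produces the complex output curves when swept over a varying input.
struct Network {
    std::vector<Layer> layers;

    std::vector<double> forward(std::vector<double> x) const {
        for (const auto& l : layers)
            x = l.forward(x);
        return x;
    }
};
```

Shaping the output then amounts to adjusting the weight values, whether by hand or by an evolutionary search against a training set.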

I spent the past two days writing a basic artificial neural network library in c++ to experiment with the applicability of neural networks to music. Currently, I am only playing with settings and observing graphical output in order to get a feel for the capabilities of ANNs. When I feel that I understand their behavior, I will map their outputs to music and see what happens!

Here are some output graphs from different neural networks. Notice that many interesting effects, including discontinuities, can be achieved with the composition of relatively simple functions (breaks in the curves are possible with either ternary or modulus functions).

The neural networks each contained 100 units (neurons) and 100 to 500 neural connections (synapses) between the units. A single input node and a single output node were used to communicate with the network.
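
A rough sketch of that experiment: units wired together by weighted synapses, with one unit treated as the input node and one as the output node. Using a modulus function as the activation (one of the two options mentioned above) is what introduces the breaks in the curves. The wiring scheme, unit indices, and pass count here are assumptions, not the actual test setup:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A directed, weighted connection between two units.
struct Synapse { int from, to; double weight; };

// Propagate an input value through a wired-up network for a few passes
// and read the result off the last unit.
double evaluate(double input, int nUnits,
                const std::vector<Synapse>& synapses, int passes = 3) {
    std::vector<double> act(nUnits, 0.0);
    act[0] = input;                           // unit 0 is the input node
    for (int p = 0; p < passes; ++p) {
        std::vector<double> next(nUnits, 0.0);
        next[0] = input;
        for (const auto& s : synapses)
            next[s.to] += s.weight * act[s.from];
        for (int u = 1; u < nUnits; ++u)
            next[u] = std::fmod(next[u], 1.0); // modulus activation: the
        act = next;                            // source of discontinuities
    }
    return act[nUnits - 1];                   // last unit is the output node
}
```

Sweeping `input` over a range and plotting `evaluate` reproduces the kind of broken curves shown in the graphs: the output jumps wherever a weighted sum crosses a multiple of 1.0.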

Heading Towards c++

July 25, 2009 General News

My investigations into neural networks have led me back to the conclusion that I can't hide from c++ forever. As well as AutoHotKey has served me over the past few years, I fear its power limits may finally be upon me. The fact of the matter is, c++ boasts a raw speed and power with which AHK can't compete. AHK wins for ease of use any day...but in algorithmic programming, the speed of c++ wins. So I'm bringing out the old c++ compilers again and learning my way around. I used to be fairly fluent in the language, but I've grown rusty and too content with the ease of AHK programming.

I'm reworking the neural network engine in c++, and looking to gain roughly a 100x performance increase (based on some numbers I've seen around the AHK forums). If there is justice in the world, that could mean 100x better function approximations, 100x larger neural nets, or even 100x better music. Who knows.

If c++ starts working for me again, I have a good compromise in mind: design the GUIs in AHK, because they generally don't have to do much complicated work. They just have to look friendly. Then do the actual plugin processing in c++ to get the speed advantage. That part doesn't have to look good.
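
One simple way to glue the two halves together would be for the AHK GUI to write a small parameter file and launch a compiled c++ backend that parses it and does the heavy lifting. This is only a sketch of that idea — the key=value format and the "tempo" parameter are hypothetical, not anything mGen actually uses:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Parse a hypothetical key=value parameter dump written by an AHK front
// end, returning the "tempo" setting (or a default if it's absent).
// The heavy plugin processing would follow, driven by these values.
double readTempo(const std::string& paramText) {
    std::istringstream in(paramText);
    std::string line;
    double tempo = 120.0;                    // assumed default
    while (std::getline(in, line)) {
        auto eq = line.find('=');
        if (eq != std::string::npos && line.substr(0, eq) == "tempo")
            tempo = std::stod(line.substr(eq + 1));
    }
    return tempo;
}
```

The GUI side stays trivial: AHK writes the file, runs the executable, and reads the results back, so neither half ever needs to be good at the other's job.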

This should also address another thing I've been worried about: protecting my code from theft. If I really do release mGen, it'd be reverse engineered, cracked, and modified to the world's heart's content by even the least adept hacker. AHK code is far too insecure because it's an interpreted language, meaning the whole script sits right there in memory, waiting to be stolen. In c++, however, everything is compiled down to machine code, which would take far, far longer to reverse engineer. It could still be done. BUT, with c++, I could purchase obfuscation software, which would be the final step in preventing reverse engineering. It wouldn't keep out professionals who really want to crack my program, but obfuscated c++ would be about 1000x harder to reverse engineer than an AHK script (no obfuscation tools even exist for AHK). So that's something that's been sitting in the back of my mind.

Not to mention, it's nice to have an OOP language back. I've missed classes, and my mind has matured to the point that I can now understand all the basic features of c++, including classes, pointers & references, inheritance, etc. Hopefully my new neural net engine will prove it.

Artificial Neural Networks (ANNs)

July 23, 2009 General News

I've recently begun working with computational structures called artificial neural networks (ANNs). I have read many articles about them and am now working through several books covering the basics of ANNs. I'm generally not much for such micro-scale computation (ANNs comprise tiny, tiny computational blocks that must be assembled by the hundreds to perform real-world functions), but I figured I needed experience with each and every technique of algorithmic composition. My background in systems that learn has been relatively weak until now.

Surprisingly, I managed to achieve some very interesting graphical results with some rudimentary neural nets. Unfortunately, as is the case with fractals, interesting imagery does not necessarily translate into interesting music. I'll try not to fall into the trap of sacrificing musical quality for interesting computational structures.

Neural Networks - Emulating Human Creativity

Upon finding the article Algorithmic Composition and Reductionist Analysis: Can a Machine Compose? I immediately got excited. The author gives a great overview of algorithmic composition and details his own endeavors in the field.

In particular, the author touches on the concept of heuristic algorithms - including both genetic algorithms and neural networks - that slowly approach a desired solution by having a human evaluator determine the fitness of the system. I found the following quote astounding:

"A researcher trained a neural network to recognize makes of car from a photograph, and he decided to look inside the network at the individual neurons, rather than regarding it as a "black box" that somehow worked for some incomprehensible reason. He found that certain areas of the network were specializing into recognizing certain features of the car, and, by introducing a level of random "noise" into the network, got the network to design its own cars."

And finally, the author's take on algorithmic composition and creativity:

"I have always felt very uneasy about throwing any musical ideas away, as it would amount to destroying something that I think is unique. But, if computer composition took over to a degree, would the 'preservation people' be content with the idea that the music exists, somewhere, within the set of possibilities? May I delete Clara Empricost's symphony with impunity, once it has generated it? Should I preserve the algorithm and the random number seeds somewhere? An interesting set of problems."