Tag Archives: MSE

Grammar + MSE = gMSE

Both the multi-state engine and the Markov engine have been finished in the past week.  Now, to tackle a new implementation of GGrewve with maximum ease of use, I've dreamed up yet another type of engine.  It's a rather simple hybrid that will combine grammar with multi-state analysis.  Not surprisingly, I will call it gMSE (grammatical multi-state engine).

One can think of gMSE as a valuation dictionary based on a plurality of states.  The gMSE answers queries of the form "What quantitative value would [word] receive based on past analysis, given that the current state is [state plurality]?"  Perhaps more importantly, gMSE can answer queries of the form "Which word would receive the highest (or lowest) quantitative score based on past analysis, given that the current state is [state plurality]?"  In this way, the grammatical multi-state engine embeds grammatical data in the analysis of state pluralities.

This new engine will, ideally, make a newer and better version of GGrewve quite easy to create.  Since GGrewve is based on probabilistic grammar analysis, it is easy to see how gMSE could accommodate the GGrewve engine plus added levels of depth thanks to the multi-state analysis.  All of this can be accomplished with just a single object: a gMSE space.  The gMSE space itself contains a grammar dictionary, an MSE space, and an MSE statestream, all wrapped into a single, easily-manageable object.
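As a rough sketch of how such a space might answer those queries (the class name, scoring scheme, and example words below are my own illustration, not the actual XIAS interface):

```python
from collections import defaultdict

class GMSESpace:
    """Illustrative sketch of a gMSE space: word scores conditioned on a
    plurality of states. Not the actual XIAS implementation."""

    def __init__(self):
        self.evidence = defaultdict(list)  # (word, state) -> observed values

    def observe(self, word, states, value):
        # Credit the observed value to each (word, state) pair seen together.
        for state in states:
            self.evidence[(word, state)].append(value)

    def query(self, word, states):
        # "What value would [word] receive given the current state plurality?"
        values = [v for s in states for v in self.evidence[(word, s)]]
        return sum(values) / len(values) if values else 0.0

    def best_word(self, words, states):
        # "Which word scores highest given the current state plurality?"
        return max(words, key=lambda w: self.query(w, states))
```

A GGrewve successor built on top of this would simply feed (word, states, value) observations into the space during analysis and call something like `best_word` during generation.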

Cranking out Engines

It's almost mid-March already.  I don't like the fact that the samples haven't improved appreciably in a while.  As I noted earlier, that's mostly because I've been upgrading internals rather than working on sound.  Still, it's time to step on it.

Over the past few weeks, I've been working like mad to re-code all the old engines in C++, taking advantage of the massive optimizations possible therein.  So far, the following engines are at least partially functional as part of the new C++ library:

  • Artificial Neural Network Engine (was actually never implemented in AHK and has yet to be used in a plugin)
  • Contour Grammar
  • Evolutionary Engine
  • Fraccut
  • Markov Engine
  • Multi-State Engine

ALL of the new implementations are better than their predecessors, both in efficiency and in ease of use.  Certain complex engines, such as the Markov engine, may see speed increases of over a thousandfold thanks to the redesign.

By the end of the month, these myriad engines should be coming together to form some really powerful new plugins.  All it takes is code.

SandBox: Lightning Fast Idea Development

Over the past week most of the work I've done has been in the SandBox with the multi-state analysis engine.  MSE is having mixed results with chord tension prediction and I may need to think about developing a more advanced mathematical engine to produce better results (for example, create large matrices describing relations and use algorithms to explicitly solve for partial relations rather than using weak value correlations).
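For example, a toy version of that explicit solve might treat each rated chord as a linear equation over unknown per-state contributions and solve the resulting normal equations, rather than relying on weak value correlations (all names and numbers below are invented for illustration):

```python
def solve_state_values(situations, values, states):
    """Toy 'explicit solve': least-squares fit of per-state contributions.

    Each situation (a set of states) with an observed value contributes one
    equation: the sum of its states' unknown contributions equals the value.
    """
    m, n = len(situations), len(states)
    A = [[1.0 if s in sit else 0.0 for s in states] for sit in situations]
    # Normal equations (A^T A) x = A^T b, built as an augmented matrix M.
    M = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         + [sum(A[k][i] * values[k] for k in range(m))] for i in range(n)]
    # Gauss-Jordan elimination with partial pivoting.
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i and M[i][i] != 0.0:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]
```

With invented tension ratings for three two-note chords, the solver recovers a consistent per-note-offset contribution exactly, which is the kind of "explicit solve for partial relations" described above.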

The SandBox, on the other hand, is proving more valuable every minute.  Testing ideas is now easier than it has ever been, and I've had less downtime over the past week than I can remember having in a long time.  I'm also working on an extension of SandBox called ScratchPad that will further simplify the process of getting ideas into program form.  ScratchPad provides a very simple, four-part coding interface that automatically injects the four blocks of custom code into different areas of a predefined template, letting the template and ScratchPad do all the housekeeping work.
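In spirit, the template-injection scheme might look something like this (the four slot names are my own invention; I'm only illustrating the idea of custom blocks dropped into a housekeeping template, not ScratchPad's actual layout):

```python
# Hypothetical four-slot template; slot names are invented for illustration.
TEMPLATE = """\
# -- predefined housekeeping: initialization, data handling --
{setup}
# -- predefined housekeeping: main generation loop entry --
{generate}
# -- predefined housekeeping: per-note processing --
{process}
# -- predefined housekeeping: output, shutdown --
{finalize}
"""

def render_module(setup="", generate="", process="", finalize=""):
    """Inject the four blocks of custom code into different areas of the
    predefined template, which handles the housekeeping itself."""
    return TEMPLATE.format(setup=setup, generate=generate,
                           process=process, finalize=finalize)
```

The user writes only the four blocks; everything around them is boilerplate that the template supplies on every render.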

To put it in perspective, I just timed myself to see how long it would take to put a very, very simple idea into action.  I opened ScratchPad and made a simple Brownian-motion generative module for melody.  It took about 55 seconds from opening a blank ScratchPad to having the code rendered and activated (set as the default running code) within the SandBox module.  I then opened mGen and tested the SandBox module just to make sure it worked - and indeed it did: a Brownian-motion melody greeted the screen when the render finished.  Sure, it's not pretty, and it probably doesn't sound good.  Nonetheless, it demonstrates how ScratchPad and SandBox let me go from idea to code to music in almost no time at all, with no wasted effort.  ScratchPad whittles the task of code-writing down to only the code that is directly specific to the idea being implemented, not general tasks such as data handling, initialization, shutdown, and all the other time-consuming chores in between.
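A Brownian-motion melody module really is only a few lines of idea-specific code; here is a minimal sketch of the same concept (parameter names and ranges are my own choices, not the actual module):

```python
import random

def brownian_melody(length, start=60, max_step=2, low=48, high=84, seed=None):
    """Melody as Brownian motion: each MIDI pitch is a small random step
    away from the previous one, clamped to a playable range."""
    rng = random.Random(seed)
    pitch, melody = start, [start]
    for _ in range(length - 1):
        pitch += rng.randint(-max_step, max_step)   # small random step
        pitch = max(low, min(high, pitch))          # keep it in range
        melody.append(pitch)
    return melody
```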

I have no doubt that a great many ideas will be tried in the upcoming weeks with the help of SandBox and ScratchPad.

Multi-State Analysis Engine

Yet another generalized algorithmic engine has been added to the XIAS library. Multi-state analysis (abbreviated MSE, for "multi-state engine") is a way of describing the intrinsic "value" of certain state combinations based on the partial analysis of relationships between individual states. The engine is, more or less, a creative implementation of an inference algorithm. But here's the best part - the engine takes categorical states as input and transforms them into an arbitrary quantitative measurement.

It's easier to explain how the algorithm works in practice. Suppose the user wishes for the machine to know that eating ice-cream on a warm school day is "good." In particular, it's better than eating ice-cream on a cold school day. Still, a non-school day is better than either of these situations, regardless of temperature and presence of ice-cream. The following statements might be made to the engine:


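The statements might plausibly take a form like the following, using the comma-separated state syntax that appears in the next paragraph (the numeric values here are purely illustrative):

```
icecream,warm,schoolday  = 0.8
icecream,cold,schoolday  = 0.5
!schoolday               = 0.9
```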
From these statements, we might ask the engine to infer the intrinsic "value" of ice-cream, the weather, or a school day vs. a non-school day. More interestingly, we might ask the engine to infer the value of a situation in which we have no ice-cream, but it is a warm non-school day ("no_icecream,warm,!schoolday"). This second inference is the richer one, because the engine would base its response not just on the sum of predicted individual state values, but also on the predicted values of the partial relationships between the states (no_icecream to warm, warm to !schoolday, and no_icecream to !schoolday). Notice that we can use any syntax for the categorical inputs - the engine is not concerned with symbols such as ! and _; it treats each state uniquely. The names of states are for user convenience only (and a great convenience it is to have this flexibility!)
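To make the partial-relationship idea concrete, here is a toy sketch of such an engine (my own illustration, not the XIAS code): it learns values for single states and for state pairs, then scores new combinations - seen or unseen - from the average value of their known parts:

```python
from collections import defaultdict
from itertools import combinations

class MultiStateSketch:
    """Toy multi-state analysis: score a situation by the average observed
    value of its single states and state pairs (partial relationships)."""

    def __init__(self):
        self.evidence = defaultdict(list)  # frozenset of states -> values

    def train(self, states, value):
        # Credit the situation's value to every single state and state pair.
        for r in (1, 2):
            for combo in combinations(sorted(states), r):
                self.evidence[frozenset(combo)].append(value)

    def infer(self, states):
        # Predict a (possibly unseen) situation's value from its known parts.
        parts = (frozenset(c) for r in (1, 2)
                 for c in combinations(sorted(states), r))
        known = [sum(v) / len(v) for p in parts if (v := self.evidence.get(p))]
        return sum(known) / len(known) if known else 0.0
```

Trained on the ice-cream statements above, this sketch rates the warm school day above the cold one and the non-school day above both, and it can still produce a score for the never-observed "no_icecream,warm,!schoolday" combination from the parts it does know.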

For this particular situation, the ability to analyze first- and second-order relationships between states doesn't contribute much to the quality of the overall analysis. When the engine is applied to the analysis of music, however, it is easy to see how useful nth-order relationships become. What is the contribution to tension of a seventh chord played on the third beat of the measure? How pleasing is it to have [y] follow [x] in a melody, given that the root is [z]? These kinds of questions are difficult to answer without highly specialized analysis engines. But the XIAS MSE can do just that - place quantitative values on multiple-state situations based on partial relationships between the states, inferred from given situational values.

I am already in the process of evaluating the abilities of the multi-state analysis engine in predicting the tension of chords based on individual note offsets. With any luck, MSE will prove to be a valuable addition to XIAS.