Grammatical Representation of Contour

January 4, 2010

Though the mathematical melodies of Entropic Root provide excellent and oftentimes interesting background patterns, they still lack the coherence of a genuine, human-crafted melody. I do not wish to imply that no progress has been made towards that ultimate goal of realistic melodies; however, I must be realistic in admitting that a mathematical method like Entropic Root cannot compose at the level of a human, because Entropic Root, at its core, has no concept of what it is creating: it is not intentional.

Contour seems to be a reasonable way to mold a melody before determining actual pitches. In the past, great success has been achieved with grammatical representations of music. GrammGen, the first grammatical engine I ever wrote, implemented a very simple system for generating melodies. Though not up to par with the performance of modern plugins, GrammGen often provided interesting melodies. Like Entropic Root, however, GrammGen had very little deliberateness in its actions. To resolve this, I hope to combine the premeditative abilities of contour shaping with the structural simplicity of grammar engines to achieve a deliberate and coherent melody generation engine.

Essentially, I am introducing yet another grammatical engine. This engine, however, will be based on a “contour as a language” paradigm. In other words, it will treat the words of the grammar as having contextual significance because of a contour correspondence. Rather than assigning progressive, numerical values to randomly-generated words and jumbling them together to create a phrase, this “contour grammar” engine will recognize the contextual significance of each word by its parent type. There are eight parent types, which correspond directly to fundamental units of contour: two jump contours, two linear contours, and four “bounce” or “pivot” contours. Words will then be randomly generated under parent types, but the engine will know to which type of contour a given word belongs. A Markov engine or a simple multi-state engine will determine the contextual significance of contour units.
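
To make the idea concrete, here is a minimal Python sketch of the “contour as a language” scheme. The type names, the transition weights, and the word-generation rules are all hypothetical placeholders rather than the actual mGen implementation; the point is only to show words being generated under parent contour types while a simple Markov step chooses which unit comes next.

```python
import random
from enum import Enum, auto

# Hypothetical names for the eight fundamental contour units:
# two jumps, two linear motions, and four bounce/pivot shapes.
class Contour(Enum):
    JUMP_UP = auto()
    JUMP_DOWN = auto()
    LINEAR_UP = auto()
    LINEAR_DOWN = auto()
    BOUNCE_UP_DOWN = auto()   # rise, then fall (pivot at the top)
    BOUNCE_DOWN_UP = auto()   # fall, then rise (pivot at the bottom)
    PIVOT_UP = auto()         # brief dip before a net rise
    PIVOT_DOWN = auto()       # brief lift before a net fall

# Illustrative first-order Markov rows; weights are placeholders.
# Units without a row fall back to a uniform choice below.
TRANSITIONS = {
    Contour.LINEAR_UP:   {Contour.BOUNCE_UP_DOWN: 0.5, Contour.JUMP_DOWN: 0.3, Contour.LINEAR_UP: 0.2},
    Contour.LINEAR_DOWN: {Contour.BOUNCE_DOWN_UP: 0.5, Contour.JUMP_UP: 0.3, Contour.LINEAR_DOWN: 0.2},
}

def next_unit(current: Contour) -> Contour:
    """Pick the next contour unit from the Markov table (uniform fallback)."""
    options = TRANSITIONS.get(current)
    if not options:
        return random.choice(list(Contour))
    units, weights = zip(*options.items())
    return random.choices(units, weights=weights, k=1)[0]

def generate_word(unit: Contour, length: int = 4) -> list[int]:
    """Randomly generate a 'word' (a list of signed interval steps)
    whose overall shape belongs to the given parent contour type."""
    if unit is Contour.JUMP_UP:
        return [random.randint(5, 9)]                  # one large leap up
    if unit is Contour.JUMP_DOWN:
        return [-random.randint(5, 9)]                 # one large leap down
    if unit is Contour.LINEAR_UP:
        return [random.randint(1, 2) for _ in range(length)]
    if unit is Contour.LINEAR_DOWN:
        return [-random.randint(1, 2) for _ in range(length)]
    half = max(1, length // 2)
    rise = [random.randint(1, 3) for _ in range(half)]
    fall = [-random.randint(1, 3) for _ in range(half)]
    if unit is Contour.BOUNCE_UP_DOWN:
        return rise + fall
    if unit is Contour.BOUNCE_DOWN_UP:
        return fall + rise
    if unit is Contour.PIVOT_UP:
        return [-1] + rise                             # brief dip, then rise
    return [1] + fall                                  # PIVOT_DOWN

def generate_contour(n_units: int = 6, start: Contour = Contour.LINEAR_UP) -> list[int]:
    """Chain contour units with the Markov step and concatenate their words."""
    unit, steps = start, []
    for _ in range(n_units):
        steps += generate_word(unit)
        unit = next_unit(unit)
    return steps
```

In this sketch a generated word carries its parent type with it, so a later stage that assigns actual pitches still knows which fundamental contour the word was drawn from.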

Below is a quick sketch of the eight fundamental units of contour.

If we imagine that each line segment has an arbitrary magnitude (but that segments drawn at equal size in the diagram have equal magnitudes), then it is not hard to see that these eight fundamental units characterize most possible monophonic melodies, especially once certain transformations such as stretching and splitting are allowed; a sketch of those two transformations follows.
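
The short sketch below gives one plausible reading of those two transformations, continuing the earlier Python example: “stretching” scales a unit’s magnitude, and “splitting” divides each interval into several smaller steps in the same direction. These definitions are my own assumption from the terms, not necessarily the exact transformations intended for mGen.

```python
def stretch(word: list[int], factor: float) -> list[int]:
    """Scale every interval in a contour word by the same factor."""
    return [round(step * factor) for step in word]

def split(word: list[int], parts: int = 2) -> list[int]:
    """Replace each interval with `parts` smaller intervals summing to it."""
    out = []
    for step in word:
        base, rem = divmod(step, parts)
        out += [base + (1 if i < rem else 0) for i in range(parts)]
    return out

# e.g. a linear-up word [2, 2] stretched by 2 becomes [4, 4],
# and split into two parts becomes [1, 1, 1, 1]: the same rising
# contour realized at a different scale and granularity.
```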

I am hopeful that the method of contour grammar will be a historic step towards intentionality in output for mGen, starting with melodies.