Scoring Sound
Thor Magnusson

Preface

SuperCollider is one of the most expressive, elegant, and popular programming languages used in today's computer music. It has become well known for its fantastic sound, strong object-oriented design, useful primitives for musical composition, and openness in terms of aims and objectives. Unlike much commercial software, one of the key design ideas of SuperCollider is to be a tabula rasa, a blank page, that allows musicians to design their own musical compositions or instruments with as few technological constraints as possible. There are other environments that operate in a similar space, such as Pure Data, CSound, Max/MSP, ChucK, Extempore, and now JavaScript's Web Audio, but I believe that SuperCollider excels at the important aspects required for such work, and for this reason it has been my key platform for musical composition and instrument design since 2001, although I have enjoyed working in other environments as well.

This book is an outcome of teaching SuperCollider in various British higher education institutions since 2005, in particular at the Digital Music and Sound Arts programme at the University of Brighton, Music Informatics at the University of Sussex, Sonic Arts at Middlesex University, and Music Informatics at the University of Westminster. Lacking the ideal course book, I created a tutorial that I have used for teaching synthesis and algorithmic composition in SuperCollider. The tutorial's focus was not on teaching SuperCollider as a programming language, but on exploring key concepts, from different synthesis techniques and algorithmic composition to user interfacing with graphical user interfaces or hardware. I have subsequently used this tutorial in diverse workshops given around the world, from Istanbul to Reykjavik; from Madrid to Rovaniemi.

An earlier version of this book was published on the DVD of The SuperCollider Book, published by MIT Press. The SuperCollider Book is an excellent source for the study of SuperCollider and is highly recommended, but it has different aims than the current book: it goes deeper into more specific areas, whilst this book aims to present a smoother introduction, a general overview, and a specific focus on practical uses of SuperCollider. The original tutorial was initially released as .sc files, then moved over to the new .scd document format, and finally ported to the .html format that became the standard help file format of SuperCollider. SuperCollider has since gained a new and fantastic documentation system, which is well worth exploring on its own. With this updated tutorial, however, I have decided to port the material to a more modern ebook format that is accessible to readers on different operating systems and devices. I have chosen Lean Publishing as the publication platform for this rewriting, as I can write the book in the attractive Markdown format and use GitHub for revision control. Furthermore, I can publish the book ad hoc, get real-time feedback from readers, and disseminate the book in the typical modern ebook formats appropriate to most ebook readers.

The aim of this book is the same as that of my initial tutorials written in 2005, i.e., to serve as a good undergraduate introduction to SuperCollider programming, audio synthesis, algorithmic composition, and interface building for innovative creative audio systems. I do hope that my past and future students will find this work useful, and I sincerely hope that it is also beneficial to anyone who decides to embark upon the exciting expedition into the fantastic and enticing world of SuperCollider: the ideal workshop for people whose creative material is sound.

I encourage any reader who finds bugs, errors, or simply would like a better explanation of a topic to give me feedback through this book’s Discord Server.

Brighton, June 2013 - Reykjavik, April 2021.

Introduction

Embarking upon learning SuperCollider can seem daunting at first. The software environment the user is faced with might look confusing, but rest assured that this feeling is quickly overcome. Learning SuperCollider is in many ways similar to learning an acoustic instrument: it takes practice to become good at it. However, it should be noted right at the beginning that becoming a code ninja in SuperCollider need not be the goal. Indeed, one can write some very good music knowing only a few chords on a guitar or the piano!

The SuperCollider IDE (Integrated Development Environment) is the same on the Linux, Mac, and Windows operating systems. There might be minor differences, but it looks roughly like this (and this picture contains some labels for further explanation):

A screenshot of the SuperCollider IDE

You will see a coding window on the left, a documentation window, and a post window where the SuperCollider language informs you what it is up to. So let's dive straight into the first exercise, the famous "Hello World" print into a console (the post window). Simply type "Hello World".postln; (or "Hola Mundo".postln; if you like) into the coding window, highlight that text, and hit Shift + Return (or go to the Language menu and select "Evaluate Selection or Line"). If you look at the post window, a "Hello World" has been posted there. Now try to write the same with a spelling mistake, such as "Hello World".possstln; and you will see an error message appearing.
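Here are those two lines, ready to evaluate one at a time:

"Hello World".postln; // posts Hello World to the post window
"Hello World".possstln; // a deliberate typo - evaluating this posts an error message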

SuperCollider is case sensitive, which means that it understands "SinOsc" but has no clue what "Sinosc" means. You will also notice the semicolon (;) at the end of every line written. This is how the SuperCollider interpreter (the language parser) understands that the current statement has ended. The SuperCollider environment consists of three different elements (or processes): the IDE (the text editor that you see in front of you), the language (sclang, the programming language), and the synth (the audio server that will generate the sound). Later chapters will explain how different languages (such as Java, C++, or Pd) can communicate with the audio server.

Now, let’s dive straight into making some sound, as that’s really the reason you are reading this book. First boot the audio server from the Server menu and then type:

{SinOsc.ar(440)}.play;

into the code window. Now evaluate that line (and if your code is only one line, you can simply place the cursor somewhere in that line and hit Shift+Return). You will hear a sound in the left speaker of your system (yes, all oscillators are mono by nature). It might be loud, and you will need to stop it. Hit Cmd+. or Ctrl+. to stop the sound. There is also a menu item in the Language menu to stop the sound, but it is recommended that you simply write these key commands into the motor memory of your fingers.

Let us play a little with this code (hit Cmd/Ctrl+period to stop the sound after every line):

// Octave higher
{SinOsc.ar(880, 0, 1)}.play;
// Half the amplitude
{SinOsc.ar(880, 0, 0.5)}.play;
// Add another oscillator to multiply the frequency
{SinOsc.ar(880 * SinOsc.ar(2), 0, 0.5)}.play;
// Or multiply the amplitude
{SinOsc.ar(880, 0, 0.5 * SinOsc.ar(2) )}.play;

What happened here? We are listening to a sine wave oscillator of 880 Hz, or cycles per second (cps). The sine wave oscillator is one of many examples of what is called, in most sound programming languages, a "unit generator". The UGen outputs samples according to a specific algorithm, depending upon the desired wave form or filter functionality. So a SinOsc will output samples in a different way than a Saw. Furthermore, in the code above we are using the output of one oscillator to multiply the parameters of another. But the question arises: which parameters? What is that comma after 880 and the stuff appearing after it?

Finally, what we have listened to is a sine wave of 880 Hz, with respective amplitudes of 1 and 0.5. And this is important: signals sent to the sound card of your computer typically consist of samples with values between −1 and 1 in amplitude. If the signal goes above 1 or below −1, you typically get what is called "clipping", and the sound most likely becomes distorted.
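To hear what clipping means, here is a small sketch; turn your volume down first, and stop each line with Cmd+. before running the next:

{SinOsc.ar(440, 0, 0.4)}.play; // within the -1 to 1 range: a clean tone
{SinOsc.ar(440, 0, 1.5)}.play; // beyond the range: the peaks are clipped off and the tone distorts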

You might also have noticed the information given to you at the bottom of the IDE: the number of UGens (u), synths (s), groups (g), and SynthDefs (d) in use. This will be explained in the following chapters, but for now: congratulations on having made some sound in SuperCollider!

About the Installation

You have now installed and explored SuperCollider on your system. This book does not cover how to install SuperCollider on the different operating systems, but we should note that on any SuperCollider installation, a user specific area is created where you can install your classes, find the synth definitions you have created, and install SC-plugins. This is in your user directory, which can be found by running the line Platform.userExtensionDir; the path will be posted in the post window. For example, on the Mac: ~/Library/Application Support/SuperCollider
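For example, evaluating these lines posts the relevant paths (they differ per operating system):

Platform.userExtensionDir.postln; // where your own classes and plugins go
Platform.userAppSupportDir.postln; // the user support directory mentioned above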

Part I

Chapter 1 - The SuperCollider language

This chapter will introduce the fundamentals for creating and running a simple SuperCollider program, along with the basic concepts needed for further exploration. We will learn the basic orientation practices of SuperCollider, that is, how to run code, post into the post window, and use the documentation system. We will also discuss the fundamental things needed to understand and write SuperCollider code, namely: variables, arrays, functions, and basic data flow syntax. Having grasped the topics introduced in this chapter, you should be able to write practically anything that you want, although later we will go into Object Orientated Programming, which will make things considerably more effective and perhaps easier.

The semicolon, brackets and running a program

The semicolon ";" is what divides one instruction from the next. It marks the end of a statement. After the semicolon, the interpreter moves on to the next statement. There has to be a semicolon after each statement; forgetting it will give you errors printed in the post console.

This code will work fine if you evaluate only this line:

"Hello World".postln

But not this, if you evaluate both lines (by highlighting both and evaluating them with Shift+Return):

"Hello World".postln
"Goodbye World".postln;

Why not? Because the interpreter (the SC language) will not understand

"Hello World".postln "Goodbye World".postln; 

However, this will work:

"Hello World".postln; "Goodbye World".postln; 

It is up to you how you format your code, but you’d typically want to keep it readable for yourself in the future and other readers too. There is however a style of SC coding used for Tweeting, where the 140 character limit introduces interesting constraints for composers. Below is a Twitter composition by Tim Walters, but as you can see, it is not good for human readability although it sounds good (The language doesn’t care about human readability, but we do):

play{HPF.ar(({|k|({|i|SinOsc.ar(i/96,Saw.ar(2**(i+k))/Decay.ar(Impulse.ar(0.5**i/k),[k*i+1,k*i+1*2],3**k))}!6).product}!32).sum/2,40)}

It can get tiring having to select many lines of code, and this is where brackets come in handy, as they can create a scope for the interpreter. So the following code:

var freq = 440;
var amp = 0.5;
{SinOsc.ar(freq, 0, amp)}.play;

will not work unless you highlight all three lines. Imagine if these were 100 lines: you would have to do some tedious scrolling up and down the document. So using brackets, you can simply double click after or before a bracket, and it will highlight all the text between the matching brackets.

(
var freq = 440;
var amp = 0.5;
{SinOsc.ar(freq, 0, amp)}.play;
)

Matching brackets

Often when writing SuperCollider code, you will experience errors whose origin you can't figure out. Double clicking next to a bracket and observing whether the brackets match properly is one of the key methods of debugging SuperCollider code.

(
"you ran the program and ".post; 
(44+77).post; " is the sum of 44 and 77".postln;
"the next line - the interpreter posts it twice as it's the last line".postln;
)

The following will not work. Why not? Look at the post window.

(
(44+77).postln
55.postln;
)

Note that the • sign is where the interpreter finds the error.

The post window

You have already posted into the post window (many other languages use a “print” and “println” for this purpose). But let’s explore the post window a little further.

(
"hello".post; // post something
"one, two, three".post;
)


(
"hello there".postln; // post something and make a line break
"one, two, three".postln;
)

1+4; // returns 5

Scale.minor.degrees // returns an array with the degrees in the minor scale

You can also use postf:

"the first value is %, and the second one is % \n".postf(1111, 9999);

If you are posting a long list you might not get the whole content using .postln, as SC truncates long data structures, like lists, when posting.

For this purpose use the following:

Post << "hey"

Example

Array.fill(1000, {100.rand}).postln; // you see you get ...etc...

Whereas,

Post << Array.fill(1000, {100.rand}) // you get the whole list

The Documentation system (The help system)

The documentation system in SuperCollider is a good source for information and learning. It includes introductory tutorials, overviews, and documentation for almost every class in SuperCollider. The documentation files typically contain examples of how to use the specific class/UGen, and thus serve as a great source for learning and understanding. Many SC users go straight into the documentation when they start writing code, using it as a template and copy-pasting the examples into their projects.

So if you highlight the word Array in an SC document and hit Cmd+d or Ctrl+d (d for documentation), you will get the documentation for that class. You will see the superclasses/subclasses and learn about all the methods that the Array class has. With no text highlighted, you can search the documentation by hitting Cmd+D (capital d) and you will get a menu asking “Search documentation for” where you can type in your item, such as “LFNoise0”.

Also, if you want to read and browse all the documentation, you can open a help browser: Help.gui.

Comments

Comments are information written for humans, but ignored by the language interpreter. It is good practice to write comments where you think you might forget what a certain block of code does. They are also a communication to any other programmer who might read your code. Feel free to write as many comments as you want, although it is often better practice to name your variables and functions (we'll learn later in this section what these words mean) such that you don't need to add a comment.

// This is a comment

/*
And this is 
also a comment
*/

Comments are red by default, but can be any colour (in the Format menu choose ‘syntax colorize’)

Variables

Here is a mantra to memorise: Variables are containers of some value. They are names or references to values that could change (their value can vary). So we could create a variable that is a property of yourself called age. Every year this variable will increase by one integer (a whole number). So let us try this now:

var age = 33;
age = age + 1; // here the variable ‘age’ gets a new value, or 33 + 1
age.postln; // and it posts 34

SuperCollider is not strongly typed, so there is no need to declare the data type of variables. Data types (in other languages) include: integer, float, double, string, custom objects, etc. In SuperCollider you can create a variable that contains an integer at one stage, but later contains a reference to a string or a float. This can be handy, but one has to be careful, as it can introduce bugs into your code.

Above we created a variable 'age', and we declared that variable by writing 'var' in front of it. All variables have to be declared before you can use them. There are two exceptions: single lowercase letters from a to z (note that 's' is a special variable that is by default used as a reference to the SC server) can be used without declaration; and so-called environmental variables (which can be considered global variables within a certain context), which start with the '~' symbol. More on those later.

a = 3; // we assign the number 3 to the variable "a"
a = "hello"; // we can also assign a string to it.
a = 0.333312; // or a floating point number;
a = [1, 34, 55, 0.1, "string in a list", \symbol, pi]; // or an array with mixed types

a // hit this line and we see in the post window what "a" contains

SuperCollider has scope, so if you declare a variable within a certain scope, such as a function, it has a local value within that scope. Try running this code (by double clicking behind the first bracket):

(
var v, a;
v = 22;
a = 33;
"The value of a is : ".post; a.postln;
)
"The value of a is now : .post; a.postln; // then run this line 

The value of ‘a’ will be from the code block above this one. So ‘a’ is a global variable, but because you declared it with a var in a scope (the brackets) it did not override the global variable. This is good for prototyping and testing, but not recommended as a good software design. A variable with the name ‘myvar’ could not be global – only single lowercase characters.

If we want longer variable names, we can use environmental variables (using the ~ symbol): they can be seen as global variables, accessible from anywhere in your code:

~myvar = 333;

~myvar // post it;

But typically we just declare the variable (var) in the beginning of the program and assign its value where needed. Environmental variables are not necessary, although they can be useful, and this book will not use them extensively.

But why use variables at all? Why not simply write the numbers or the value wherever we need it? Let’s take one example that should demonstrate clearly why they are useful:

{
 // declare the variables
var freq, oscillator, filter, signal;
freq = 333; // set the frequency variable
 // create a Saw wave oscillator with two channels
oscillator = Saw.ar([freq, freq+2]);
// use a resonant low pass filter on the oscillator
filter = RLPF.ar(oscillator, freq*4, 0.25);
// multiply the signal by 0.5 to lower the amplitude 
signal = filter * 0.5;
}.play;

As you can see, the 'freq' variable is used in various places in the above synthesizer. You can now change its value to something like 500, and the frequency will 'automatically' become 500 Hz in the left channel and 502 Hz in the right, and the filter cutoff frequency will be 2000 Hz. So instead of changing these values throughout the code, you change the variable in one place and its value is magically plugged into every location where the variable is used.

Functions

Functions are an important feature of SuperCollider and most other programming languages. They are used to encapsulate algorithms or functionality that we only want to write once, but use in different places at various times. They can be seen as a black box or a factory that takes some input, parses it, and returns some output. Just as a sophisticated coffee machine might take coffee beans and water as input, it then grinds the beans, boils the water, brews the coffee, and finally outputs a lovely drink. The key point is that you don't need (or want) to know precisely how all this happens. It is enough to know where to fill up the beans and water, and then how to operate the buttons of the machine (strength, number of cups, etc.). The coffee machine is a [black box](http://en.wikipedia.org/wiki/Black_box).

Functions in SuperCollider are notated with curly brackets ‘{}’

Let’s create a function that posts the value of 44. We store it in a variable ‘f’, so we can call it later.

f = { 44.postln };

When you run this line of code, you see that the SuperCollider post window notifies you that it has been given a function. It does not post 44 into the post window. For that we have to call the function, i.e., to ask it to perform its calculation and return some value to us.

f.value // to call the function we need to get its value

Let us write a more complex function:

f = {
    69 + ( 12 * log( 220/440 ) / log(2) )
};
f.value // returns the MIDI note 57 (the MIDI note for 220 Hz)

This is a typical function that calculates the MIDI note of a given frequency in Hz (or cycles per second). Most electronic musicians know that MIDI note 60 is C, that 69 is A, and that A is 440 Hz. But how is this calculated? Well, the function above returns the MIDI note of 220 Hz. But this is a function without any input (or argument, as it is called in the lingo). Let's open up this input channel by drilling a hole into the black box, and let's name this argument 'freq', as that's what we want to put in.

f = { arg freq;
    69 + ( 12 * log( freq/440 ) / log(2) )
}

We have now an input into our function, an argument named ‘freq’. Note that this argument has been put into the right position inside the calculation. We can now put in any frequency and get the relevant MIDI note.

f.value(440) // returns 69
f.value(880) // returns 81
f.value(261) // returns 59.958555396543 (a fractional MIDI note, close to C (or 60))

The above is a good example of why functions are so great. The algorithm of calculating the MIDI note from frequency is somewhat complex (or nasty?), and we don’t really want to memorise it or write it more than once. We have simply created a black box that we put in to the ‘f’ variable and now we can call it whenever we want without knowing what is inside the black box.
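As it happens, SuperCollider ships with this very conversion built in, as methods on numbers, so we can check our function against it:

440.cpsmidi // 69.0 - the same result as f.value(440)
69.midicps // 440.0 - the inverse, from MIDI note to frequency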

We will be using functions all the time in the coming chapters. It is vital to understand how they receive arguments, process the data, and return a value.

The final thing to say about functions at this stage is that they can have default values in their arguments. This means that we don’t have to pass in all the arguments of the function.

f = { arg salary, tax=20;
    var aftertax;
    aftertax = salary - (salary * (tax/100) )
}

So here above is a function that calculates the pay after tax, with the default tax rate set at 20%. Of course we can’t be sure that this is the tax rate forever, or in different countries, so this needs to be an argument that can be set in the different contexts.

f.value(2000) // here we use the default 20% tax rate
f.value(2000, 35) // and here the tax rate has become 35%

Arguments can be of any type. Here we pass a string into the function:

f = { arg string; string.postln; } // we will post the string that comes into the function
f.value("hi there") // and here we call the function passing "hi there” as the argument.

Often written in this form:

f = {|string| string.postln;} // arguments can be defined within two pipes ‘|’
f.("hi there") // and you can skip the .value and just write a dot (.)

Arrays, Lists and Dictionaries

Arrays are one of the most useful things to understand and use in computer music. This is where we can store a bunch of data (whether pitches, scales, synths, or any other information you might want to reference). A common thing a novice programmer typically does is to create lots of variables for data that could be stored in an array, so let's dive straight into learning how to use arrays and lists.

An array can be seen as a storage space for things that you need to use later. Like a bag or a box where you keep your things. We typically keep the reference to the array in a variable so we can access it anywhere in our code:

a = [11, 22, 33, 44, 55]; // we create an array with these five numbers

You will see that the post window posts the array there when you run this line. Now let us try to play a little with the array:

a[0]; // we get at the first item in the array (most programming languages index at zero)
a[4] // returns 55, as index 4 into the array contains the value 55
a[1]+a[4] // returns 77 as 22 plus 55 equals 77
a.reverse // we can reverse the array
a.maxItem // the array can tell us what is the highest value

and so on. The array we created above had five defined items in it. But we can create arrays differently, filling them algorithmically with any data we'd be interested in:

a = Array.fill(5, { 100.rand }); // create an array with five random numbers from 0 to 100

What happened here is that we tell the Array class to fill a new array with five items, but then we pass it a function (introduced above) and the function will be evaluated five times. Compare that with:

a = Array.fill(5, 100.rand ); // create an array with ONE random number from 0 to 100

We can now play a little bit with that function that we pass to the array creation:

a = Array.fill(5, { arg i; i }); // create a function with the iterator (‘i’) argument
a = Array.fill(5, { arg i; (i+1)*11 }); // the same as the first array we created
a = Array.fill(5, { arg i; i*i });
a = Array.series(5, 10, 2); // a new method (series). 
// Fill the array with 5 items, starting at 10, adding 2 in every step.

You might wonder why this is so fantastic or important. The fact is that arrays are used everywhere in computer music. The sound file you will load later in this book will be stored in an array, with each sample in its own slot. Then you can jump back and forth in the array, scratching, cutting, break beating or whatever you would like to do, but the point is that this is all done with data (the samples of your soundfile) stored in an array. Or perhaps you want to play a certain scale.

m = Scale.minor.degrees; // the Scale class will return the degrees of the minor scale

m is here an array with the following values: [ 0, 2, 3, 5, 7, 8, 10 ]. So in a C scale, 0 would be C, 2 would be D (two half steps above C), 3 would be E flat, and so on. We could represent those values as MIDI notes, where 60 is the C note (~261 Hz). And we could even look at the actual frequencies in Hertz of those MIDI notes. (Those frequencies would be passed to the oscillators, as they expect frequencies and not MIDI notes as arguments.)

m = Scale.minor.degrees; // Scale class returns the degrees of the minor scale
m = m.add(12); // you might want to add the octave (12) into your array
m = m+60 // here we simply add 60 to all the values in the array
m = m.midicps // and here we turn the MIDI notes into their frequency values
m = m.cpsmidi // but let’s turn them back to MIDI values for now

We could now play with the ‘m’ array a little. In an algorithmic composition, for example, you might want to pick a random note from the minor scale

n = m.choose; // choose a random MIDI note and store it in the variable ’n’
x = m.scramble; // we could create a melody by scrambling the array
x = m.scramble[0..3] // scramble the list and select the first 4 notes
p = m.mirror // mirror the array (like an ascending and descending scale)

You will note that in ‘x = m.scramble’ above, the ‘x’ variable contains an array with a scrambled version of the ‘m’ array. The ‘m’ array is still intact: you haven’t scrambled that one, you’ve simply said “put a scrambled version of ‘m’ into variable ‘x’.” So the original ‘m’ is still there. If you really wanted to scramble ‘m’ you would have to do:

m = m.scramble; // a scrambled version of the ‘m’ array is put back into the ‘m’ variable
// But now it’s all scrambled up. Let’s sort it into ascending numbers again:
m = m.sort

Arrays can contain anything, and in SuperCollider, they can contain values of mixed types, such as integers, strings, floats, and so on.

a = [1, "two”, 3.33, Scale.minor] // we mix types into the array.
// This can be dangerous as the following
a[0]*10 // will work
a[1]*10 // but this won't, as you can't multiply the word "two" by 10 

Arrays can contain other arrays, nested to any depth.

// a function that will create a 5 item array with random numbers from 0 to 10
f = { Array.fill(5, { 10.rand }) }; // array generating function 
a = Array.fill(10, f.value);  // create another array with 10 items of the above array
// But the above was evaluated only once. Why? 
// Because, you need to pass it a function to get a different array every time. Like this:
a = Array.fill(10, { f.value } );  // create another array with 10 items of the above array
// We can get at the first array and see it’s different from the second array
a[0]
a[1]
// We could put a new array into a[0] (that slot contains an array)
a[0] = f.value
// We could put a new array into a[0][0] (an integer)
a[0][0] = f.value

Above we added 12 to the minor scale.

m = Scale.minor.degrees;
m.add(12) // but try to run this line many times, the array won’t grow forever

Lists

It is here that the List class becomes useful.

l = List.new;
l.add(100.rand) // try to run this a few times and watch the list grow

Lists are like arrays - and implement many of the same methods - but they are slightly more expensive than arrays. In the example above you could simply do 'a = a.add(100.rand)' if 'a' was an array, but many people like lists for reasons we will discuss later.
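A minimal comparison of the two approaches:

a = Array.new;
a = a.add(100.rand); // with an Array we must reassign to keep the result
l = List.new;
l.add(100.rand); // a List simply grows in place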

Dictionaries

A dictionary is a collection of items where keys are mapped to values. Here, keys are keywords that are identifiers for slots in the collection. You can think of this like names for values. This can be quite useful. Let’s explore two examples:

a = Dictionary.new
a.put(\C, 60)
a.put(\Cs, 61)
a.put(\D, 62)
a[\Ds] = 63 // same as .put
// and now, let's get the values
a.at(\D)
a[\D] // same as .at

a.keys
a.values
a.getPairs
a.findKeyForValue(60)

Imagine how you would do this with an Array. One way would be

a = [\C, 60, \Cs, 61, \D, 62, \Ds, 63]
// we find the slot of a key:
x = a.indexOf(\D) // 4
a[x+1]
// or simply
a[a.indexOf(\D)+1]

but using an array you need to keep track of how things are organised and indexed.

Another Dictionary example:

b = Dictionary.new
b.put(\major, [ 0, 2, 4, 5, 7, 9, 11 ])
b.put(\minor, [ 0, 2, 3, 5, 7, 8, 10 ])
b[\minor]

Methods?

We have now seen things like 100.rand and a.reverse. How do .rand and .reverse work? Well, SuperCollider is an Object Orientated language and these are methods of the respective classes. So an integer (like 100) has methods like .rand, .midicps, or .neg. It does not have a .reverse method. Why not? Because you can't reverse a number. However, an array (like [11,22,33,44,55]) can be reversed or added to. We will explore this later in the chapter about Object Orientated programming in SC, but for now it is enough to think that the object (an instantiation of the class) has relevant methods. Or to use an analogy: let's say we have a class called Car. This class is the information needed to build the car. When we build a Car, we instantiate the class and we have an actual Car. This car can then have some methods, for instance: start, drive, turn, putWipersOn. And these methods could have arguments, like speed(60) or turn(-60). You could think about the object as the noun, the method as the verb, and the argument as the adverb. (As in: John (object) walks (method) fast (adverb).)

// a hypothetical Car class: we create a new car, 4 indicating for example the number of seats
c = Car.new(4); 
c.start;
c.drive(40); // the car drives 40 miles per hour
c.turn(-60); // the car turns 60 degrees to the left

So to really understand a class like Array or List you need to read the documentation and explore the methods available. Note also that Array subclasses the ArrayedCollection class (i.e., gets methods from its superclass). This means that it has all the methods of its superclass, just as a class "Car" might have a superclass called "Vehicle", of which a "Motorbike" would also be a subclass (a sibling to "Car"). You can explore this by peeking under the hood of SC a little:

Array.openHelpFile // get the documentation of the Array class
Array.dumpInterface // get the interface or the methods of the Array class
Array.dumpFullInterface // get the methods of Array’s superclasses as well.

You can see that the .dumpFullInterface method will tell you all the methods Array inherits from its superclasses.

Now, this might give you a bit of a brainache, but don't worry, you will gradually learn this terminology and what it means for your musical or sound practice with SuperCollider. Wikipedia is a good place to start reading about [Object Oriented Programming](https://en.wikipedia.org/wiki/Object-oriented_programming).

Conditionals, data flow and control

The final thing we should discuss before we start to make sounds with SuperCollider is how we control data and make decisions. This is about logic, about human thinking, and how to encode decisions in the form of code. Such logic is the basis of all clever systems, for example in artificial intelligence. In short, it is about establishing conditions and then deciding what to do with them. For example: if it is raining and I'm going out, I take my umbrella with me; else I leave it at home. It's the kind of basic logic that humans apply all the time throughout the day. And programming languages have ways to formalise such conditions, most typically with an if-else statement.

In pseudocode it looks like this: if( condition, { then do this }, { else do this }); as in: if( rain, { umbrella }, { no umbrella });

So the condition represents a state that is either true or false. If it is true (there is rain), the first function is evaluated; if false (no rain), the second function is evaluated.

Another form is a simple if statement where you don’t need to specify what to do if it’s false: if( hungry, { eat } );

So let’s play with this:

if( true, { "condition is TRUE".postln;}, {"condition is FALSE".postln;});
if( false, { "condition is TRUE".postln;}, {"condition is FALSE".postln;});

You can see that true and false are keywords in SuperCollider. They are so called Boolean values. You should not use those as variables (well, you can’t). In digital systems, we operate in binary code, in 1s and 0s. True is associated with 1 and false with 0.

true.binaryValue;
false.binaryValue;

Boolean logic is named after George Boole, who wrote an important paper in 1848 ("The Calculus of Logic") on expressions and reasoning. In short, it involves the operations AND, OR, and NOT.

A simple Boolean truth table might look like this

true AND true = true
true AND false = false
false AND false = false
true OR true = true
true OR false = true
false OR false = false

And also

true AND not false = true

etc. Let's try this in SuperCollider code and observe the post window. But first we need to learn the basic syntax for the Boolean operators:

== stands for equal
!= stands for not equal
&& stands for and
|| stands for or

And we also use comparison operators

> stands for more than
< stands for less than
>= stands for more than or equal to
<= stands for less than or equal to

true == true // returns true
true != true // returns false (as true does indeed equal true)
true == false // returns false
true != false // returns true (as true does not equal false)
3 == 3 // yes, 3 equals 3
3 != 4 // true, 3 does not equal 4
true || false // returns true, as one of the elements is true
false || false // returns false, as both of the elements are false
3 > 4 // false, as 3 is less than 4
3 < 4 // true
3 < 3 // false
3 <= 3 // true, as 3 is indeed less than or equal to 3

You might not realise it yet, but knowing what you now know is very powerful and it is something you will use all the time for synthesis, algorithmic composition, instrument building, sound installations, and so on. So make sure that you understand this properly. Let’s play with this a bit more in if-statements:

if( 3==3, { "condition is TRUE".postln;}, {"condition is FALSE".postln;});
if( 3==4, { "condition is TRUE".postln;}, {"condition is FALSE".postln;});
// and things can be a bit more complex:
if( (3 < 4) && (true != false), {"TRUE".postln;}, {"FALSE".postln;});

What happened in that last statement? It asks: is 3 less than 4? Yes. AND is true not equal to false? Yes. Then both conditions are true, and that's what it posts. Note that, of course, the values in the string (inside the quotation marks) could be anything; we're just posting here. So you could write:

if( (3 < 4) && (true != false), {"VERDAD".postln;}, {"FALSO".postln;}); 

in Spanish if you’d like, but you could not write this:

verdad == verdad

as the SuperCollider language is in English.

But what if you have lots of conditions to compare? Here you could use a switch statement:

(
a = 4.rand; // a will be a number from 0 to 4;
switch(a)
	{0} {"a is zero".postln;} // runs this if a is zero
	{1} {"a is one".postln;} // runs this if a is one
	{2} {"a is two".postln;} // etc.
	{3} {"a is three".postln;};
)

Another way is to use the case statement, and it might be faster than the switch.

(
a = 4.rand; // a will be a number from 0 to 4;
case
	{a == 0} {"a is zero".postln;} // runs this if a is zero
	{a == 1} {"a is one".postln;} // runs this if a is one
	{a == 2} {"a is two".postln;} // etc.
	{a == 3} {"a is three".postln;};
)

Note that in both switch and case, the semicolon comes only after the last testing condition (so evaluation runs from the 'switch' or 'case' keyword all the way to that semicolon).

Looping and iterating

The final thing we need to learn in this chapter is looping. Looping is one of the key tricks used in programming. Say we want to generate 1000 synths at once. It would be tedious to write and evaluate 1000 lines of code one after another, but it’s easy to loop one line of code 1000 times!

In many programming languages this is done with a [for-loop](http://en.wikipedia.org/wiki/For_loop):

for(int i = 0; i < 10; i++) {
	println("i is now " + i);
}

Similar code will work in Java, C, JavaScript and many other languages. But SuperCollider is a fully object orientated language where everything is an object, and objects can have methods: an integer has methods like .neg or .midicps, but also .do (which is our loop).

So in SuperCollider we can simply do:

10.do({ "SCRAMBLE THIS 10 TIMES".scramble.postln; })

What happens is that the loop evaluates the function (which scrambles and posts the string we wrote) 10 times. We could then make a counter:

(
var counter = 0;
10.do({ 
	counter = counter + 1;
	"counter is now: ".post; 
	counter.postln; 
})
)

But instead of such a counter, we can use the argument passed into the function by the loop:

10.do({arg counter; counter.postln;});
// you can call this argument whatever you want:
10.do({arg num; num.postln;});
// and the typical convention is to use the character "i" (for iteration):
10.do({arg i; i.postln;});

Let's now try to make a small program that gives us all the prime numbers from 0 to 10000. There is a method of the Integer class called isPrime which comes in handy here. We will use many of the things learned in this chapter, i.e., creating a List, making a do loop with a function that has an iterator argument, and then asking, in an if-statement, whether the iterator is a prime number. If it is (i.e. true), we add it to the list. Finally we post the result to the post window. But note that we only post after all 10000 iterations are done.

(
p = List.new;
10000.do({ arg i; // i is the iteration from 0 to 10000
	if( i.isPrime, { p.add(i) }); // no else condition - we don't need it
});
Post << p;
)

We can also loop through an Array or a List. The do-loop will then pick up all the items of the array and pass them into the function that you write inside the do loop. Additionally, it will pass an iterator. So we have two arguments to the function:

(
[ 11, 22, 33, 44, 55, 66, 77, 88, 99 ].do({arg item, counter; 
	item.post; " is in the array at slot: ".post; counter.postln;
});
)

So it posts each item and its slot (the counter/iterator always starts at zero). You can call the arguments whatever you want, of course. Example:

[ 11, 22, 33, 44, 55, 66, 77, 88, 99 ].do({arg aa, bb; aa.post; " is in the array at slot: ".post; bb.postln });

Another looping technique is to use the for-loop:

for(startValue, endValue, function); // this is the syntax
for(100, 130, { arg i; i = i+10; i.postln; }) // example

We might also want to use the forBy-loop:

forBy(startValue, endValue, stepValue, function); // the syntax
forBy(100, 130, 4, { arg i; i = i+10; i.postln; }) // example

While is another type of loop:

while (testFunc, bodyFunc); // syntax
(
i = 0;
while ({ i < 30 }, {  i = i + 1; i.postln; });
)

This is enough about the language. Now is the time to dive into making sounds and explore the synthesis capabilities of SuperCollider. But first let us learn some tricks of peeking under the hood of the SuperCollider language:

Peeking under the hood

Each UGen or class in SuperCollider has a class definition in a class file. These files are compiled every time SuperCollider is started and become the application environment we are using. SC is an "interpreted" language (as opposed to a "compiled" language like C or C++). If you add a new class to SuperCollider, you need to recompile the language (there is a menu item for that), or simply restart SuperCollider.

UGen.dumpSubclassList // UGen is a class. Try dumping LFSaw for example

UGen.browse  // examine methods interactively in a GUI (OSX)

SinOsc.dumpFullInterface  // list all methods for the class hierarchically
SinOsc.dumpMethodList  // list instance methods alphabetically
SinOsc.openHelpFile

Chapter 2 - The SuperCollider Server

The SuperCollider Server, or SC Synth as it's also known, is an elegant and great sounding audio engine. As mentioned earlier, SuperCollider is traditionally separated into a server and a client, that is, an audio server (the SC Synth) and the SuperCollider language client (sc-lang). When the server is booted, it connects to the default audio device (such as internal or external audio cards), but you can set it to any audio device available to your computer (for example using virtual audio routing software like Jack).

A figure illustrating the structure of SuperCollider: 1) the server (scsynth); 2) the language; 3) an interpreter for the language; 4) the client for the server; and 5) the SuperCollider IDE - from http://doc.sccode.org/Guides/ClientVsServer.html

The SC Synth renders audio and has an elegant structure of busses, groups, synths and UGens. It works a bit like a modular synth, where the output of a certain chain of oscillators and filters can be routed into another module. Audio is created by building graphs called synth definitions, or SynthDefs. These are definitions of synths in a wide sense, as they can do practically anything audio related (for example performing audio analysis rather than synthesis).

The SC Synth is a program that runs independently from the SuperCollider IDE or language. You can use any software to control it, like C/C++, Java, Python, Lua, Pure Data, Max/MSP or any other.

A figure illustrating how different clients can communicate with the sc server - from http://doc.sccode.org/Guides/ClientVsServer.html

This chapter will introduce the SuperCollider server for the most basic purposes of getting started with this amazing engine for audio work. This section will be fundamental for the succeeding chapters.

Booting the Server

When you “boot the server”, you are basically starting a new process on your computer that does not have a GUI (Graphical User Interface). If you observe the list of running processes of your computer, you will see that when you boot the server, a new process will appear (try typing “top” into a Unix Terminal). The server can be booted through a menu command (Menu-> Server -> Boot Server), or through code (s.boot). You can also boot it from the command line if you know where the server is on your system, as it is independent of the SuperCollider application.

// let us explore the 's' variable, that stands for the server:
s.postln; // we see that it contains a localhost server
s.addr // the address of the server (IP address and port)
s.name // the localhost server is the default server (see Main.sc file)
s.serverRunning // is it running?
s.avgCPU // how much CPU is it using right now?

// Let's boot the server. Look at the post window
s.boot

We can explore creating our own servers with specific ports and IP addresses:

n = NetAddr("127.0.0.1", 57200); // IP address (127.0.0.1 is the local machine) and port
p = Server.new("hoho", n); // create a server with the specific net address
p.makeWindow; // make a GUI window
p.boot; // boot it

// try the server:
{SinOsc.ar(444)}.play(p);
// stop it
p.quit;

From the above you might start to think about the possibilities of having the server running on a remote computer, with various clients communicating with it over a network, and yes, that is precisely one of the innovative ideas of SuperCollider 3. You could put any server (with a remote IP address and port) into your server variable and communicate with it over a network. Or have many servers on diverse computers, instructing each of them to render audio. All this is common in SuperCollider practice, but the most common setup is using the SuperCollider IDE to write SC Lang code to control a localhost audio server (localhost meaning "on the same computer"). And that is what we will focus on for a while.
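Before that, here is a sketch of the remote idea; the IP address is hypothetical and the remote machine must already be running an scsynth on that port:

r = Server.remote(\remoteServer, NetAddr("192.168.1.20", 57110)); // connect to a running remote server
{SinOsc.ar(550)}.play(r); // once connected, play to it as to any other server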

The Unit Generators

Unit Generators have been the key building blocks of digital synthesis systems, since Max Matthews’ Music N systems in the 1960s. Written in C++ and compiled as plugins for the SC Server, they encapsulate complex calculations into a simple black box that returns to us - the synth builders or musicians - what we are after, namely an output that could be in the form of a wave or a filter. The Unit Generators, or UGens as they are commonly called, are modular and the output of one can be the input of another. You can think of them like units in a modular synthesizer, for example the Moog:

A Moog Modular Synth

UGens typically have audio rate (.ar) and control rate (.kr) methods. Some have initialization rate as well. The difference is that an audio rate UGen outputs as many samples per second as the sample rate: a computer with a 44.1 kHz sample rate will require each audio rate UGen to calculate 44100 samples per second. Control rate is much lower (by default one value per block of 64 samples) and gives the synth designer the possibility of saving computational power (or CPU cycles) if used wisely.

// Here is a sine wave unit generator
// it has an audio rate method (the .ar)
// and its argument order is frequency, phase and multiplication
{SinOsc.ar(440, 0, 1)}.play 
// now try to run a SinOsc with control rate:
{SinOsc.kr(440, 0, 1)}.play // and it is inaudible

The control rate SinOsc is inaudible, but it is running fine on the server. We use control rate UGens to control other UGens, for example frequency, amplitude, or filter frequency. Let’s explore that a little:

// A sine wave of 1 Hz modulates the 440 Hz frequency
{SinOsc.ar(440*SinOsc.kr(1), 0, 1)}.play 
// A control rate sine wave of 3 Hz modulates the amplitude
{SinOsc.ar(440, 0, SinOsc.kr(3))}.play 
// An audio rate sine wave of 3 Hz modulates the amplitude
{SinOsc.ar(440, 0, SinOsc.ar(3))}.play
// and as you can hear, there is no difference
 
// 2 Hz modulation of the cutoff frequency of a Low Pass Filter (LPF)
// we add 1002, so the filter does not go into negative range
// which might blow up the filter
{LPF.ar(Saw.ar(440), SinOsc.kr(2, 0, 1000)+1002)}.play 

The beauty of UGens is how one can connect the output of one to the input of another. Oscillator UGens typically output values between -1 and 1, in a certain pattern (e.g., sine wave, saw wave, or square wave) and in a certain frequency. Other UGens such as filters or FFT processing do calculations on an incoming signal and output a new signal. Let’s explore one more example of connected UGens that demonstrates their modular power:

{
	// we create a slow oscillator in control rate
	a = SinOsc.kr(1);
	// the output of 'a' is used to multiply the frequency of a saw wave
	// resulting in a frequency from 220 to 660. Why?
	b = Saw.ar(220*(a+2), 0.5);
	// and here we use 'a' to control amplitude (from -0.5 to 0.5)
	c = Saw.ar(110, a*0.5);
	// we add b and c, and use a to control the filter cutoff frequency
	// we simply added a .range method to a so it now outputs
	// values between 100 and 2000 at 1 Hz
	d = LPF.ar(b+c, a.range(100, 2000));
	Out.ar(0, Pan2.ar(d, 0));
}.play

This is a simple case study of how UGens can be added (b+c) and used in any calculation of the signal (such as a*0.5, an amplitude modulation creating a tremolo effect). For a bit of fun, let's use a microphone and make a little effect with your voice:

{
	// we take sound in from the sound card
	a = SoundIn.ar(0);
	// and we ring modulate using the mouse to control frequency
	b = a * SinOsc.ar(MouseX.kr(100, 3000));
	// we also use the mouse (vertical) to control delay
	c = b + AllpassC.ar(b, 1, MouseY.kr(0.001, 0.2), 2);
	// and here, instead of Pan2, we simply use an array [c, c]
	Out.ar(0, [c, c]);
}.play

A good way to explore UGens is to browse them in the documentation.

UGen.browse; // browse the available UGens in a GUI

The SynthDef

Above we explored UGens by wrapping them in a function and calling .play on that function ({}.play). What this does is turn the function (indicated by {}, as we learned in chapter 1) into a synth definition that is sent to the server and then played. The {}.play (or Function:play, if you want to peek into the source code by highlighting "Function:play", hitting Cmd+I, and exploring how SC compiles the function into a SynthDef under the hood) is how many people sketch sound in SuperCollider, and it's good for demonstration purposes, but for all real synth building we need to create a synth definition, or a SynthDef.

A SynthDef is a pre-compiled graph of unit generators. This graph is written to a binary file and sent to the server over OSC (Open Sound Control - See chapter 4). This file is stored in the “synthdefs” folder on your system. In a way you could see it as your own VST plugin for SuperCollider, and you don’t need the source code for it to work (although it does not make sense to throw that away).
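As a small sketch of the two routes a definition can take (the SynthDef syntax itself is explained below): .writeDefFile writes the compiled graph to the synthdefs folder, while .add, used throughout this book, sends it to the running server:

SynthDef(\saw440, { Out.ar(0, Saw.ar(440, 0.2)) }).writeDefFile; // writes saw440.scsyndef into the synthdefs folder
SynthDef(\saw440, { Out.ar(0, Saw.ar(440, 0.2)) }).add; // sends the definition to the server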

It is recommended that the SynthDef help file is read carefully and properly understood. The SynthDef is a key class of SuperCollider and very important. It adds synths to the server or writes synth definition files to the disk, amongst many other things. Let’s start by exploring how we can turn a unit generator graph function into a synth definition:

// this simple synth
{Saw.ar(440)}.play
// becomes equivalent to this synth definition
SynthDef(\mysaw, {
	Out.ar(0, Saw.ar(440));
}).add;

You notice that we have done two things: given the function a name (\mysaw), and we’ve wrapped our saw wave in an ‘Out’ UGen which defines which ‘Bus’ the audio is sent to. If you have an 8 channel sound card, you could send audio to any bus from 0 to 7. You could also send it to bus number 20, but we would not be able to hear it then. However, we could put another synth there that routes the audio back onto audio card busses, for example 0-7.
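As a quick sketch of that routing idea (chapter 14 covers it properly): one synth writes onto private bus 20 and another reads it back onto bus 0:

SynthDef(\sawToBus, { Out.ar(20, Saw.ar(440, 0.2)) }).add; // write a saw onto private bus 20
SynthDef(\busToOut, { Out.ar(0, In.ar(20, 1)) }).add; // read bus 20 and route it to bus 0
a = Synth(\sawToBus);
b = Synth(\busToOut, addAction: \addToTail); // the reader must come after the writer in the node order
a.free; b.free; // clean up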

// you can use the 'Out' UGen in Function:play
{Out.ar(1, Saw.ar(440))}.play // out on the right speaker

NOTE: There is a difference between the Function-play code and the SynthDef, in that we need the Out UGen in a synth definition to tell the server which audio bus the sound should go out of (0 is left, 1 is right).

But back to our SynthDef, we can now try to instantiate it, and create a Synth. (A Synth is an instantiation (child) of a SynthDef). This synth can then be controlled if we reference it with a variable.

// create a synth and put it into variable 'a'
a = Synth(\mysaw);
// create another synth and put it into variable 'b'
b = Synth(\mysaw);
a.free; // kill a
b.free; // kill b

This is obviously not a very interesting synth. It is ‘hardcoded’, i.e., the parameters in it (such as frequency and amplitude) are static and we can’t change them. This is only done in very specific situations, as normally we would like to specify the values of our synth both when initialising the synth and after it has been started.

In order to open the SynthDef up for specified parameters and enabling it to be changed, we need to put arguments into the UGen function graph. Remember in chapter 1 how we created a function with arguments:

f = {arg a, b; 
	c = a + b; 
	postln("c is now: " + c)
};
f.value(2, 3);

Note that you cannot write 'f.value', as you will get an error trying to add 'nil' to 'nil' ('a' and 'b' are both nil in the arg slots of the function). To solve that we can give them default values:

f = {arg a=2, b=3; 
	c = a + b; 
	postln("c is now: " + c)
};
f.value(22, 33);
f.value;

So we add the arguments for the synthdef, and we add a Pan2 UGen that enables us to pan the sound from the left (-1) to the right (1). The centre is 0:

SynthDef(\mysaw, { arg freq=440, amp=0.2, pan=0;
	Out.ar(0, Pan2.ar(Saw.ar(freq, amp), pan));
}).add;
// this now allows us to create a new synth:
a = Synth(\mysaw); // explore the Synth help file
// and control it, using the .set, method of the Synth:
a.set(\freq, 220);
a.set(\amp, 0.8);
a.set(\freq, 555, \amp, 0.4, \pan, -1);

This synth definition could be written in a better and more understandable way. Say we were to add a filter to the synth; it might look like this:

SynthDef(\mysaw, { arg freq=440, amp=0.2, pan=0, cutoff=880, rq=0.3;
	Out.ar(0, Pan2.ar(RLPF.ar(Saw.ar(freq, amp), cutoff, rq), pan));
}).add;

But this is starting to be hard to read. Let us make the SynthDef easier to read (although for the computer it is the same, as it only cares about where the semicolons (;) are).

// the same as above, but more readable
SynthDef(\mysaw, { arg freq=440, amp=0.2, pan=0, cutoff=880, rq=0.3;
	var signal, filter, panned;
	signal = Saw.ar(freq, amp);
	filter = RLPF.ar(signal, cutoff, rq);
	panned = Pan2.ar(filter, pan);
	Out.ar(0, panned);
}).add;

This is roughly how you will write, and see other people write, synth definitions from now on. The individual parts of a UGen graph are typically put into variables to be more human readable and easier to understand. The exception is SuperCollider tweets (#supercollider), where the 280 character limit applies. We can now explore the synth definition a bit more:

a = Synth(\mysaw); // we create a synth with the default arguments
b = Synth(\mysaw, [\freq, 880, \cutoff, 12000]); // we pass arguments
a.set(\cutoff, 500);
b.set(\freq, 444);
a.set(\freq, 1000, \cutoff, 1200);
b.set(\cutoff, 4000);
b.set(\rq, 0.1);

Observing server activity (Poll, Scope and FreqScope)

SuperCollider has various ways to explore what is happening on the server, in addition to the most obvious one: sound itself. Because of the separation between the SC server and sc-lang, data has to be sent from the server back to the language, since it is the language that prints or displays data. The server is just a lean, mean sound machine and doesn't care about anything else. Firstly, we can try to poll (get) the data from a UGen and post it to the post window:

// we can explore the output of the SinOsc
{SinOsc.ar(1).poll}.play // you won't be able to hear this
// and compare to white noise:
{WhiteNoise.ar(1).poll}.play // the first arg of noise is amplitude
// we can explore the mouse:
{MouseX.kr(10, 1000).poll}.play // nothing to hear

// we can poll the frequency of a sound:
{SinOsc.ar(LFNoise2.ar(1).range(100, 1000).poll)}.play
// or we poll the amplitude of it
{SinOsc.ar(LFNoise2.ar(1).range(100, 1000)).poll}.play
// and we can add a label (first arg is poll rate, second is label)
{SinOsc.ar(LFNoise2.ar(1).range(100, 1000).poll(10, "freq"))}.play

People often use poll to explore what is happening in the synth, to debug, or to try to understand why something is not working. But it is typically not used in software that is to be shipped or used in performance, as it takes some computing power to send the messages from the server to the language. Another way to explore the server state is to use scope:

// we can explore the output of the SinOsc
{SinOsc.ar(1)}.scope // you won't be able to hear this
// and compare to white noise:
{WhiteNoise.ar(1)}.scope // the first arg of noise is amplitude
// we can scope the mouse state (but note the control rate):
{MouseX.kr(-1, 1)}.scope // nothing to hear
// the range method maps the output from -1 to 1 into 100 to 1000
{SinOsc.ar(LFNoise2.ar(1).range(100, 1000))}.scope;
// same here, we explore the saw wave form at different frequencies
{Saw.ar(220*SinOsc.ar(0.5).range(1, 10))}.scope

The scope shows amplitude over time: the horizontal axis is time and the vertical axis is amplitude. This is often called a time-domain view of the signal. But we can also explore the frequency content of the sound, a view we call the frequency-domain view. This is achieved by performing an FFT analysis of the signal, which is then displayed in the scope (don’t worry, this happens ‘under the hood’ and we’ll learn about it in chapter 13). Now let’s explore the freqscope:

// we see the wave at 1000 Hz, with amplitude modulated
{SinOsc.ar(1000, 0, SinOsc.ar(0.25))}.freqscope
// some white noise again:
{WhiteNoise.ar(1)}.freqscope // random values throughout the spectrum
// and we can now experience the power of the scope
{RLPF.ar(WhiteNoise.ar(1), MouseX.kr(20, 12000), MouseY.kr(0.01, 0.99))}.freqscope
// we can now explore various wave forms:
{Saw.ar(440*XLine.ar(1, 10, 5))}.freqscope // check the XLine helpfile
// LFTri is a non-bandlimited UGen, so explore the mirroring or 'aliasing'
{LFTri.ar(440*XLine.ar(1, 10, 25))}.freqscope

Furthermore, there is a Spectrogram Quark that shows a spectrogram view of the audio signal, but this is not part of the SuperCollider distribution. However, it is easy to install and we will cover this in the chapter on Quarks.
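If you want to try it right away, quarks can be installed from the language. A minimal sketch, assuming a working internet connection and that the quark is listed under the name “Spectrogram”:

Quarks.install("Spectrogram"); // download and install the quark
// then recompile the class library for the new classes to become available
Quarks.gui; // alternatively, browse all available quarks in a GUI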

A quick intro to busses and multichannel expansion

Chapter 14 will go deeper into busses, groups, and how to route the audio signals through the SC Server. However, it is important at this stage to understand how the server works in terms of channels (or busses). Firstly, all oscillators are mono. Many newcomers to SuperCollider find it strange that they only hear a signal in their left ear when using headphones running a SinOsc. Well, it would be strange to have it in stereo, quadrophonic, 5.1 or any other format, unless we specifically ask for that! We therefore need to copy the signal into the next bus if we want stereo. The image below shows a rough sketch of how the sc synth works.

A sketch illustrating busses in the SC Synth

By default SuperCollider has 8 output channels, 8 input channels, and 112 private audio bus channels (where we can run effects and other things). This means that if you have an 8 channel sound card, you can send a signal out on any of the first 8 busses. If you have a 16 channel sound card, you need to set the ‘numOutputBusChannels’ variable of the ServerOptions class to 16. More on that later.
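A minimal sketch of that change (the new value only takes effect when the server is rebooted):

s.options.numOutputBusChannels = 16; // assuming a 16 channel sound card
s.reboot; // reboot the server so the option takes effect

With that out of the way, let’s now look at some examples: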

// sound put out on different busses
{ Out.ar(0, LFPulse.ar(220, 0, 0.5, 0.3)) }.play; // left speaker (bus 0)
{ Out.ar(1, LFPulse.ar(220, 0, 0.5, 0.3)) }.play; // right speaker (bus 1)
{ Out.ar(2, LFPulse.ar(220, 0, 0.5, 0.3)) }.play; // third speaker (bus 2)

// Pan2 takes the signal and converts it into an array of two signals
{ Out.ar(0, Pan2.ar(PinkNoise.ar(1), 0)) }.scope(8)
// or we can play it out on bus 6 (and you probably won't hear it)
{ Out.ar(6, Pan2.ar(PinkNoise.ar(1), 0)) }.scope(8)
// but the bus 0 version above is the same as:
{ a = PinkNoise.ar(1); Out.ar(0, [a, a]) }.scope(8)
// and the bus 6 version is the same as this (the first six channels are silent):
{ a = PinkNoise.ar(1); Out.ar(0, [0, 0, 0, 0, 0, 0, a, a]) }.scope(8)
// however, it's not the same as:
{ Out.ar(0, [PinkNoise.ar(1), PinkNoise.ar(1)]) }.scope(8)
// why not? -> because we now have TWO signals rather than one

It is thus clear how the busses of the server are represented by an array containing signals (as in: [signal, signal, signal, signal, etc.]). We can now take a mono signal and ‘expand’ it into other busses. This is called multichannel expansion:

{ SinOsc.ar(440) }.scope(8)
{ [SinOsc.ar(440), SinOsc.ar(880)] }.scope(8)
// same as:
{ SinOsc.ar([440, 880]) }.scope(8)
// a trick to 'expand into an array'
{ SinOsc.ar(440) ! 2 }.scope(8)
// if that was strange, check this:
123 ! 30

Enough of this for now; we will explore busses and audio signal routing in depth in chapter 14. Still, it is important to understand these basics at the current stage.

Getting values back to the language

As we have discussed, the SuperCollider language and server are two separate applications that communicate through the OSC protocol. This means that the communication between the two is asynchronous; in other words, you can’t know precisely how long it takes for a message to arrive. Nor can you know in which order things will happen if you depend on a value from the server in a code block in the language. However, if we would like to do something with audio data in the language, such as visualising it or posting it, we need to send a message to the server and wait for it to respond. This can happen in various ways, but a typical way of doing this is to use the SendTrig UGen:

// this is happening in the language
OSCdef(\listener, {arg msg, time, addr, recvPort; msg.postln; }, '/tr', n);
// and this happens in the server
{
	var freq;
	freq = LFSaw.ar(0.75, 0, 100, 900);
	SendTrig.kr(Impulse.kr(10), 0, freq);
	SinOsc.ar(freq, 0, 0.5)
}.play 

What we see above is SendTrig sending 10 messages every second to the language (the Impulse UGen triggers those messages). It sends a ‘/tr’ OSC message to port 57120 locally. (Don’t worry, we’ll explore this in the chapter on OSC.) The OSCdef in the language then has a function that posts the message from the server.

A slightly more complex example might involve a GUI (Graphical User Interfaces are part of the language) and synthesis on the server:

(
// this is happening in the language
var win, freqslider, mouseslider;
win = Window.new.front;
freqslider = Slider(win, Rect(20, 10, 40, 280));
mouseslider = Slider2D(win, Rect(80, 10, 280, 280));

OSCdef(\sliderdef, {arg msg, time, addr, recvPort; 
	{freqslider.value_(msg[3].linlin(600, 1400, 0, 1))}.defer; 
}, '/slider', n); // the OSC message we listen to
OSCdef(\sliderdef2D, {arg msg, time, addr, recvPort; 
	{ mouseslider.x_(msg[3]); mouseslider.y_(msg[4]); }.defer; 
}, '/slider2D', n); // the OSC message we listen to
	
// and this happens on the server
{
	var mx, my, freq;
	freq = LFSaw.ar(0.75, 0, 400, 1000); // outputs 600 to 1400 Hz. Why?
	mx = LFNoise2.kr(2).range(0,1);
	my = LFNoise2.kr(2).range(0, 1);
	SendReply.kr(Impulse.kr(10), '/slider', freq); // sending the OSC message 
	SendReply.kr(Impulse.kr(10), '/slider2D', [mx, my]); 
	(SinOsc.ar(freq, 0, 0.5)+RLPF.ar(WhiteNoise.ar(0.3), mx.range(100, 3000), my))!2 ;
}.play;
 )

We could also write values to a control bus on the server, from which we can read in the language. Here is an example:

b = Bus.control(s,1); // we create a control bus
{Out.kr(b, MouseX.kr(20,22000))}.play // and we write the output of some UGen to the bus
b.get({arg val; val.postln;}); // we poll the bus from the language
// or even:
fork{loop{ b.get({arg val; val.postln;});0.1.wait; }}

Check the source of Bus (by hitting Cmd+I) and locate the .get method. You will see that the Bus .get method uses an OSCresponder underneath. It is therefore “asynchronous”, meaning that it will not happen in the linear order of your code. (The language asks the server for the value, and the server then sends it back to the language. This takes time.)

Here is a program that demonstrates the asynchronous nature of b.get. The {}.play from above has to be running. Note how the numbered lines of code appear in the post window “in the wrong order”! (Instead of a synchronous posting of 1, 2 and 3, we get the order of 1, 3 and 2). It takes between 0.1 and 10 milliseconds to get the value on a 2.8 GHz Intel computer.

(
x = 0; y= 0;
b = Bus.control(s,1); // we create a control bus
{Out.kr(b, MouseX.kr(20,22000))}.play;
t = Task({
	inf.do({
		"1 - before b.get : ".post; x = Main.elapsedTime.postln;
		b.get({|val| 	
			"2 - ".post; val.postln; 
			y = Main.elapsedTime.postln;
			"the asynchronious process took : ".post; (y-x).post; " seconds".postln;
		}); //  this value is returned AFTER the next line
		"3 - after b.get : ".post;  Main.elapsedTime.postln;
		0.5.wait;
	})
}).play;
)

This type of communication from the server to the language is not very common; the other way (from language to server) is, however. This section is therefore not vital for your work in SuperCollider, but at some point you will stumble into the question of synchronous and asynchronous communication with the server, and this section should prepare you for that.

ProxySpace

SuperCollider is an extremely wide and flexible language. It is profoundly deep and you will find new things to explore for years to come. Typically SC users find their own way of working in the language and then explore new areas when they need to, or become curious.

ProxySpace is one such area. It makes live coding and various kinds of on-the-fly coding extremely flexible: effects can be routed in and out of proxies, and the source can be changed while everything is running. Below you will find quick examples that are useful when testing UGens or making prototypes for synths that you will later write as synthdefs. ProxySpace is often used in live coding. Evaluate the code below line by line:

p= ProxySpace.push(s.boot)

~signal.play;
~signal.fadeTime_(2) // fading in and out in 2 secs
~signal= {SinOsc.ar(400, 0, 1)!2}
~signal= {SinOsc.ar([400, 404], 0, LFNoise0.kr(4))}
~signal= {Saw.ar([400, 404],  LFNoise0.kr(4))}
~signal= {Saw.ar([400, 404],  Pulse.ar(2))}
~signal= {Saw.ar([400, 404],  Pulse.ar(Line.kr(1, 30, 20)))}
~signal= {LFSaw.ar([400, 404],  LFNoise0.kr(4))}
~signal= {Pulse.ar([400, 404],  LFNoise0.kr(4))}
~signal= {Blip.ar([400, 404],  12, Pulse.ar(2))}
~signal= {Blip.ar([400, 404],  24, LFNoise0.kr(4))}
~signal= {Blip.ar([400, 404],  4, LFNoise0.kr(4))}
~signal= {Blip.ar([400, 404],  MouseX.kr(4, 40), LFNoise0.kr(4))}
~signal= {Blip.ar([200, 204],  5, Pulse.ar(1))}

// now let's try to add some effects 

~signal[1] = \filter -> {arg sig; (sig*0.6)+FreeVerb.ar(sig, 0.85, 0.86, 0.3)}; // reverb
~signal[2] = \filter -> {arg sig; sig + AllpassC.ar(sig, 1, 0.15, 1.3 )}; // delay
~signal[3] = \filter -> {arg sig; (sig * SinOsc.ar(2.1, 0, 5.44, 0))*0.5}; // tremolo
~signal[4] = \filter -> {arg sig; PitchShift.ar(sig, 0.008, SinOsc.ar(2.1, 0, 0.11, 1))}; // pitchshift
~signal[5] = \filter -> {arg sig; (3111.33*sig.distort/(1+(2231.23*sig.abs))).distort*0.2}; // distort
~signal[1] = nil;
~signal[2] = nil;
~signal[3] = nil;
~signal[4] = nil;
~signal[5] = nil;

Another ProxySpace example:

p = ProxySpace.push(s.boot);
~blipper = { |freq=20, nHarm=30, amp=0.1| Blip.ar(freq, nHarm, amp)!2 };
~blipper.play;
~lfo = { MouseX.kr(10, 100, 1) };
~blipper.map(\freq, ~lfo);
~blipper.set(\nHarm, 50)
~lfn = { LFDNoise3.kr(15, 30, 40) };
~blipper.map(\nHarm, ~lfn);
~lfn = 30;
~blipper.set(\nHarm, 50);

Ndef

Ndef is an alternative and more dynamic way of working than using SynthDefs. Ndefs can be rewritten on the fly whilst running, and they use the same node proxy system as the ProxySpace code above. An example (from the documentation) is below:

Ndef(\sound).play;
Ndef(\sound).fadeTime = 1;
Ndef(\sound, { SinOsc.ar([600, 635], 0, SinOsc.kr(2).max(0) * 0.2) });
Ndef(\sound, { SinOsc.ar([600, 635] * 3, 0, SinOsc.kr(2 * 3).max(0) * 0.2) });
Ndef(\sound, { SinOsc.ar([600, 635] * 2, 0, SinOsc.kr(2 * 3).max(0) * 0.2) });
Ndef(\sound, Pbind(\dur, 0.17, \freq, Pfunc({ rrand(300, 700) })) );

Ndef(\lfo, { LFNoise1.kr(3, 400, 800) });
Ndef(\sound).map(\freq, Ndef(\lfo));
Ndef(\sound, { arg freq; SinOsc.ar([600, 635] + freq, 0, SinOsc.kr(2 * 3).max(0) * 0.2) });
Ndef(\lfo, { LFNoise1.kr(300, 400, 800) });

Ndef.clear; //clear all Ndefs

Chapter 3 - Controlling the Server

This chapter explores how we use the SuperCollider language to control the SC server. From a certain perspective, the server with its synth definitions can be seen as an instrument and the language as the performer or the score. The SuperCollider language is an interpreted, object-oriented and functional language, written in C/C++ and inspired by Smalltalk. In many ways it is similar to Python, Ruby, Lua or JavaScript, but these are all different languages, and for good reasons: there is no point in creating a programming language that is the same as another.

SuperCollider is a powerful language, and as its author James McCartney writes in a 2003 paper:

Different languages are based on different paradigms and lead to different types of approaches to solve a given problem. Those who use a particular computer language learn to think in that language and can see problems in terms of how a solution would look in that language. (McCartney 2003)

SuperCollider is very open and allows us to do things in multiple different ways. We could talk about different coding or compositional styles. And none are better than others. It depends on what people get used to and what practices are in line with how they already think or would like to think.

Music is a time-based art form. It is largely about scheduling events in time (which is a notational practice) or about performing those events yourself (which is an instrumental practice). SuperCollider is good for both practices, and it provides the user with specific functionalities that make sense in a musical programming language but might seem strange in a general one. This chapter and the next will introduce diverse ways to control the server: through automated loops, through patterns, through graphical user interfaces, and through other protocols such as MIDI or OSC.

Tasks, Routines, forks and loops

We have learned to design synth graphs with UGens and wrap them in a SynthDef. We have started and stopped a Synth on the server, but we might ask: then what? How do we make music with SuperCollider? How do we schedule things to happen repeatedly in time?

The most basic way of scheduling things is to create a process that loops and runs the same code repeatedly. In chapter 1 we looked at the .do function (which loops N times, e.g. 10.do({arg i; i.postln;})). Such a process can count, so we can use the counter as an index into an array into which we can write anything, perhaps a melody. The problem with .do is that it can’t pause or wait, so all the events would be played at the same time (or very quickly after each other). We therefore need to wrap the .do loop in a Routine. Let us look at a basic routine:

Routine({
	inf.do({arg i;
		"iteration: ".post; i.postln;
		0.25.wait; 
	})
}).play

This could also be written as:

fork{
	inf.do({arg i;
		"iteration: ".post; i.postln;
		0.25.wait; 
	})
}

but the key thing is that we have a routine that serves as an engine that can be paused and woken up again after a certain wait. Try to run the do-loop without a fork:

// this won't work, as there is no routine involved
100.do({arg i; "iteration: ".post; i.postln; 0.25.wait; });
// but this will work, as we are not asking the loop to wait:
100.do({arg i; "iteration: ".post; i.postln; })

A routine can be played with different clocks (TempoClock, SystemClock, and AppClock) and we will explore them later in this chapter. But here is how we can ask different clocks to play the routines:

(
r = Routine.new({
	10.do({ arg a;
		a.postln;
		1.wait;
	});
	0.5.wait;
	"routine finished!".postln;
});
)

SystemClock.play(r); // and then we run it
r.reset // we have to reset the routine to start it again:
AppClock.play(r); // here we tell AppClock to play routine r
r.play(AppClock) // or we can use this syntax
r.stop; // stop the routine
r.play; // try to start the routine again... It won't work.

In the last line above we experience that we can’t restart a routine after it has stopped. Here is where Tasks come in handy: they are pauseable processes that behave like routines (check the Task helpfile).

(
t = Task({
	inf.do({arg i;
		"iteration is: ".post; i.postln;
		0.25.wait;
	})
});
)

t.play;
t.pause;
t.resume;
t.stop;
t.play;
t.reset; 

Let’s make some music with a Task. We can put some note values into an array and then ask a Task to loop through that array, repeating the melody we make. First we create a SynthDef that we would like to use for this piece of music:

SynthDef(\ch3synth1, {arg freq=333, amp=0.4, pan=0.0, dur=0.41; // the arguments
	var signal, env;
	env = EnvGen.ar(Env.perc(0.001, dur), doneAction:2); // doneAction gets rid of the synth
	signal = LFTri.ar(freq, 0, amp) * env; // the envelope multiplies the signal
	signal = Pan2.ar(signal, pan);
	Out.ar(0, signal);
}).add;

And here we create a composition to play it:

(
m = ([ 0, 1, 5, 6, 10, 12 ]+48).midicps;
m = m.scramble; // try to re-evaluate only this line
t = Task({
	inf.do({arg i;
		Synth(\ch3synth1, [\freq, m.wrapAt(i)]);
		0.25.wait;
	})
});
t.play;
)

In fact we could create a loop that re-evaluates the m.scramble line:

f = fork{
	inf.do({arg i;	
		m = m.scramble; 
		"SCRAMBLING".postln;
		4.8.wait; // why did I choose a 4.8 second wait?
	})
}

BTW. Nobody said this was going to be good music, but music it is.

Patterns

Patterns are a useful way of creating musical structures efficiently. They are high-level abstractions of keys and values that can be ‘bound’ together to control synths, and they use the TempoClock of the language to send control messages to the server. Patterns are closely related to Events, which are collections of keys and values (such as \freq and 440) describing how a synth should be played.

All this might seem very convoluted, but the key point is that we are operating with default keys and values that can be used to control synths. A principal pattern to understand is the Pbind: a pattern that binds keys to values, such as \freq (a key) to 440 (a value).

().play; // run this Event and we observe the posting of default arguments
Pbind().play; // the event arguments are used in the Pbind.

The Pbind above uses the default arguments to play the ‘default’ synth (one that is defined by SuperCollider): a frequency of 261.6256 Hz (middle C), an amplitude of 0.1, and so on.
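We can also override individual keys directly in an Event, which is a handy way to test values (here using the same ‘default’ synth as above):

(freq: 880, amp: 0.2, dur: 0.5).play; // an Event with our own values for some keys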

// here we have a Pattern that binds the frequency key to the value of 1000
Pbind(\freq, 1000, \dur, 0.25).play;

The keys that the patterns play match the arguments of the SynthDef. Let’s create a SynthDef that we can fully control with a pattern:

// the synthdef has the conventional 'freq' and 'amp' arguments, but also our own 'cutoff'
SynthDef(\patsynth1, { arg out=0, freq=440, amp=0.1,  pan=0, cutoff=1000, gate = 1;
    var signal = MoogFF.ar( Saw.ar(freq, amp), cutoff, 3);
    var env = EnvGen.kr(Env.adsr(), gate, doneAction: 2);
    Out.ar(out, Pan2.ar(signal, pan, env) );
}).add;
// we play our 'patsynth1' instrument, and control the cutoff parameter
Pbind(\instrument, \patsynth1, \freq, 100, \cutoff, 300, \amp, 0.6).play;
// try this as well:
Pbind(\instrument, \patsynth1, \freq, 100, \cutoff, 3000, \amp, 0.6).play;

Patterns have some default timing mechanism, so we can control the duration until the next event, and we can also set the sustain of the note:

Pbind(\instrument, \patsynth1, \freq, 100, \amp, 0.6, \dur, 0.5).play;
Pbind(\instrument, \patsynth1, \freq, 100, \amp, 0.6, \dur, 0.5, \sustain, 0.1).play;

All this is quite musically boring, but here is where patterns start to get exciting. There are diverse list patterns that allow us to operate with lists, for example by going sequentially through the list (Pseq), picking random values from the list (Prand), shuffling the list (Pshuf), and so on:

// here we format it differently, into pairs of keys and values
Pbind(
	\instrument, \patsynth1, 
	\freq, Pseq([100, 200, 120, 180], inf), // sequencing frequency
	\amp, 0.6, 
	\dur, 0.5
).play;

// we can use list patterns for values to any keys:
Pbind(
	\instrument, \patsynth1, 
	\freq, Prand([100, 200, 120, 180], inf), 
	\amp, Pseq([0.3, 0.6], inf),
	\dur, Pseq([0.125, 0.25, 0.5, 0.25], inf), 
).play;

Pbind(
	\instrument, \patsynth1, 
	\freq, Pseq([100, 200, 120, 180], inf), 
	\cutoff, Pseq([1000, 2000, 3000], inf), // only 3 items in the list - it loops 
	\amp, Pseq([0.3, 0.6], inf),
	\dur, Pseq([0.125, 0.25, 0.5, 0.25], inf), 
).play;

There will be more on patterns later, but at this stage it is a good idea to play with the pattern documentation files, for example the ones found under Streams-Patterns-Events. There is also a fantastic ‘A Practical Guide to Patterns’ in the SuperCollider documentation, under Streams-Patterns-Events>A-Practical-Guide.

To end this section on patterns, let’s simply play a little with Pdefs:

// here we put a pattern into a variable "a"
(
a = Pdef.new(\example1, 
		Pbind(\instrument, \patsynth1, // using our filtered saw synthdef
			\freq, Pseq([220, 440, 660, 880], inf), // freq arg
			\dur, Pseq([0.25, 0.5, 0.25, 0.5], inf)  // dur arg
		)
);
)

a.play;
a.pause;
a.resume;

// but we don't need to:
(
Pdef(\example2, 
	Pbind(\instrument, \patsynth1, // using our filtered saw synthdef
		\freq, Pseq.new([720, 770, 990, 880], inf), // freq arg
		\dur, Pseq.new([0.25, 0.5, 0.25, 0.5], inf)  // dur arg
	)
);
)

Pdef(\example2).play;
Pdef(\example2).pause;
Pdef(\example2).resume;

// Now, let's play them both together with a bit of timeshift

(
Pdef(\example1).quant_([2, 0, 0]);
Pdef(\example2).quant_([2, 0.5, 1]); // offset by half a beat
Pdef(\example1).play;
Pdef(\example2).play;
)

// and without stopping we redefine the example1 pattern:
(
Pdef(\example1, 
	Pbind(\instrument, \patsynth1, // using our filtered saw synthdef
		\freq, Pseq.new([
			Pseq.new([220, 440, 660, 880], 4),
			Pseq.new([220, 440, 660, 880], 4) * 1.5], // transpose the melody
			inf),
		\dur, Pseq.new([0.25, 0.125, 0.125, 0.25, 0.5], inf)  // dur arg
	)
);
)

The TempoClock

TempoClock is one of three clocks available for timing organisation in SuperCollider; the others are SystemClock and AppClock. TempoClock is a scheduler like SystemClock, but it schedules in beats rather than in seconds. AppClock is less accurate, but it can call GUI primitives, and it should therefore be used when GUIs need updating from a clock-controlled process.
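A quick sketch of why this matters: a process running on the SystemClock cannot touch the GUI directly, but it can defer the GUI update to the AppClock with .defer, as we did in the OSCdefs of chapter 2:

(
var win = Window("clock", Rect(100, 100, 200, 60)).front;
var text = StaticText(win, Rect(10, 10, 180, 40));
SystemClock.sched(0, {
	if(win.isClosed, { nil }, { // returning nil stops the scheduling when the window closes
		{ text.string_(Main.elapsedTime.round(0.1).asString) }.defer; // GUI update deferred to the AppClock
		0.1 // reschedule after 0.1 seconds
	});
});
)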

Let’s start by creating a clock, giving it a tempo of 1 beat per second (that’s 60 bpm), and scheduling a function to be played in 4 beats’ time. The beat count and the seconds elapsed since SuperCollider started are passed as arguments into the function, and we post them.

t = TempoClock.new;
t.tempo = 1;
t.sched(4, { arg beat, sec; [beat, sec].postln; }); // wait for 4 beats (4 secs);

You will note that the beat is a fractional number. This is because sched schedules relative to the current beat time of the clock’s thread, whatever fraction that happens to be. If you prefer to have the beats in whole numbers, you can use the schedAbs method, which schedules at an absolute beat of the clock:

t = TempoClock.new;
t.tempo = 4; // we make the tempo 240 bpm (240/60 = 4)
t.schedAbs(4, { arg beat, sec; [beat, sec].postln; }); // wait for 4 beats (1 sec);

If we would like to schedule the function repeatedly, we return a number at the end of the function, representing how many beats to wait before it runs again.

t = TempoClock.new;
t.tempo = 1;
t.schedAbs(0, { arg beat, sec; [beat, sec].postln; 1}); 

And with this knowledge we can start to make some music:

t = TempoClock.new;
t.tempo = 1;
t.schedAbs(0, { arg beat, sec; [beat, sec].postln; 1}); 
t.schedAbs(0, { arg beat, sec; "_Scramble_".scramble.postln; 0.5});

We can now try to make a rhythmic pattern with the TempoClock. Let us use a simple synth like the one we had above, but now we call it ‘clocksynth’.

// our synth
SynthDef(\clocksynth, { arg out=0, freq=440, amp=0.5,  pan=0, cutoff=1000, gate = 1;
    var signal = MoogFF.ar( Saw.ar(freq, amp), cutoff, 3);
    var env = EnvGen.kr(Env.perc(), gate, doneAction: 2);
    Out.ar(out, Pan2.ar(signal, pan, env) );
}).add;
// the clock
t = TempoClock.new;
t.tempo = 2;
t.schedAbs(0, { arg beat, sec; 
	Synth(\clocksynth, [\freq, 440]);
	if(beat%4==0, { Synth(\clocksynth, [\freq, 440/4, \amp, 1]); });
	if(beat%2==0, { Synth(\clocksynth, [\freq, 440*4, \amp, 1]); });
1}); 

Yet another trick to play sounds in SuperCollider is to use “fork” and schedule a pattern through looping. If you look at the source of .fork (by hitting Cmd+I) you will see that it is essentially a Routine (like above), but it is making our lives easier by wrapping it up in a method of Function.

(
var clock, waitTime;
waitTime = 2;
clock = TempoClock(2, 0);

{ // a fork
	"starting the program".postln;
	{ // and we fork again (play 10 sines)
		10.do({|i|
			Synth(\clocksynth, [\freq, 1000+(rand(1000))]);
			"synth nr : ".post; i.postln;
			(waitTime/10).wait; // wait for 100 milliseconds
		});
		"end of 1st fork".postln;
	}.fork(clock);
	waitTime.wait;
	"finished waiting, now we play the 2nd fork".postln;
	{ // and now we play another fork where the frequency is lower
		20.do({|i|
			Synth(\clocksynth, [\freq, 100+(rand(1000))]);
			"synth nr : ".post; i.postln;
			(waitTime/10).wait;
		});
		"end of 2nd fork".postln;
	}.fork(clock);
	"end of the program".postln;
}.fork(clock);
)

Note that the interpreter reaches the end of the program before the last fork is finished playing.

This is enough about the TempoClock at this stage. We will explore it in more depth later.

GUI Control

Graphical user interfaces are a very common way for musicians to control their compositions. They serve as a control board for things that the language can do, and for controlling the server. In the next chapter we will explore interfaces in SuperCollider, but this example is provided here to give an indication of how the language works.

// we create a synth (here an oscillator with 16 harmonics)
(
SynthDef(\simpleSynth, {|freq, amp|
	var signal, harmonics;
	harmonics = 16;
	signal = Mix.fill(harmonics, {arg i;
		SinOsc.ar(freq*(i+1), 1.0.rand, amp * harmonics.reciprocal/(i+1))
	});
	Out.ar(0, signal ! 2);
}).add;
)

(
var synth, win, freqsl, ampsl;
// create a GUI window
win = Window.new("simpleSynth", Rect(100, 100, 300, 90), false).front;
// and place the frequency and amplitude sliders in the window
StaticText.new(win, Rect(10,10, 160, 20)).font_(Font("Helvetica", 9)).string_("freq");
freqsl = Slider.new(win, Rect(40,10, 160, 24)).value_(1.0.rand)
	.action_({arg sl; synth.set(\freq, sl.value*1000;) });
StaticText.new(win, Rect(10,46, 160, 20)).font_(Font("Helvetica", 9)).string_("amp");
ampsl = Slider.new(win, Rect(40,46, 160, 24)).value_(1.0.rand)
	.action_({arg sl; synth.set(\amp, sl.value) });
// a button to start and stop the synth. If the button value is 1 we start it, else stop it
Button.new(win, Rect(220, 10, 60, 60)).states_([["create"], ["kill"]])
	.action_({arg butt;
		if(butt.value == 1, {
			// the synth is created with freq and amp values from the sliders
			synth = Synth(\simpleSynth, [\freq, freqsl.value*1000, \amp, ampsl.value]);
		},{
			synth.free;
		});
	});
)

Chapter 4 - Interfaces and Communication (GUI/MIDI/OSC)

SuperCollider is a very open environment. It can be used for practically anything sound related, whether it is scientific study of sound, instrument building, DJing, generative composition, or creating interactive installations. For these purposes we often need real-time interaction with the system and this can be achieved in many ways, but typically through screen-based or hardware interaction. This section will introduce the most common ways of interacting with the SuperCollider language.

MIDI - Musical Instrument Digital Interface

MIDI: A popular 80s technology (SC2 Documentation)

MIDI is one of the most common protocols for hardware and software communication. It is a simple protocol that has proven valuable, although it is now arguably past its prime. The key point of using MIDI in SuperCollider is to be able to interact with hardware controllers, synthesizers, and other software. SuperCollider has a strong MIDI implementation and should support everything you might want to do with MIDI.

// we initialise the MIDI client and the post window will output your devices
MIDIClient.init;
// the sources are the input devices you have plugged in
MIDIClient.sources;
// the destinations are the devices that can receive MIDI
MIDIClient.destinations;

Using MIDI Controllers (Input)

Let’s start with exploring MIDI controllers. The MIDI methods that you will use will depend on what type of controller you’ve got. The following are the available messages of MIDIIn:

  • noteOn
  • noteOff
  • control
  • bend
  • touch
  • polyTouch
  • program
  • sysex
  • sysrt
  • smpte

If you were to use a relatively good MIDI keyboard, you would be able to use most of these methods. In the following example we will explore the interaction with a simple MIDI keyboard.

MIDIIn.connectAll; // we connect all the incoming devices
MIDIFunc.noteOn({arg ...x; x.postln; }); // we post all the args

On the device I’m using now (Korg NanoKEY), I get an array formatted thus [127, 60, 0, 1001096579], where the first item is the velocity (how hard I hit the key), the second is the MIDI note, the third is the MIDI channel, and the fourth is the device number (so if you have different devices, you can differentiate between them using this ID).

For the example below, we will use the convenient MIDIdef class to register the definition we want to use for the incoming MIDI messages. Making such definitions is common in SuperCollider, as with SynthDefs, OSCdefs and HIDdefs (Human Interface Device definitions). Let’s hook the incoming note and velocity values up to the freq and amp arguments of a synth that we create. Note that the MIDIdef contains two things: its name and the function it triggers on every incoming MIDI note-on. We simply create a Synth inside that function.

//First we create a synth definition for this example:
SynthDef(\midisynth1, {arg freq=440, amp=0.1;
	var signal, env;
	signal = VarSaw.ar([freq, freq+2], 0, XLine.ar(0.7, 0.9, 0.13));
	env = EnvGen.ar(Env.perc(0.001), doneAction:2); // this envelope will die out
	Out.ar(0, signal*env*amp);
}).add;

Synth(\midisynth1) // let's try it

// and now we can play the synth
MIDIdef.noteOn(\mydef, {arg vel, key, channel, device; 
	Synth(\midisynth1, [\freq, key.midicps, \amp, vel/127]);
	[key, vel].postln; 
});

But the above is not a common synth-like behaviour. Typically you’d hold a key down and the note would not be released until you lift your finger off the keyboard key. We therefore need to use an ADSR envelope (see the ADSR article on Wikipedia).

//First we create a synth definition for this example:
SynthDef(\midisynth2, {arg freq=440, amp=0.1, gate=1;
	var signal, env;
	signal = VarSaw.ar([freq, freq+2], 0, XLine.ar(0.7, 0.9, 0.13));
	env = EnvGen.ar(Env.adsr(0.001), gate, doneAction:2);
	Out.ar(0, signal*env);
}).add;
// since we added default freq and amp arguments we can try it:
a = Synth(\midisynth2) // playing 440 Hz
a.release // and the synth will play until we release it (gate = 0)
// the adsr envelope in the synth keeps the gate open as long as note is down

// now let's connect the MIDI
MIDIIn.connectAll; // we connect all the incoming devices
MIDIdef.noteOn(\mydef, {arg vel, key, channel, device; 
	Synth(\midisynth2, [\freq, key.midicps, \amp, vel/127]);
	[key, vel].postln; 
});

What’s going on here? The synth definition uses a common trick to create a slight detuning in the frequency in order to make the sound more “analogue” or imperfect. We use a VarSaw whose waveform width we modulate with the XLine UGen. The synth def has an amp argument for the volume and a gate argument that keeps the synth playing until we tell it to stop.

But what happened? We play and we get a cacophony of sound. The notes are piling up on top of each other as they are not released. How would you solve this?

You could put the note into a variable:

MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device; 
	a = Synth(\midisynth2, [\freq, key.midicps, \amp, vel/127]);
	[key, vel].postln; 
});
MIDIdef.noteOff(\myOffdef, {arg vel, key, channel, device; 
	a.release;
	[key, vel].postln; 
});

And it will release the note when you release your finger. However, the problem now is that if you press another key whilst holding the first one down, the synth for the second key will replace the one in variable ‘a’, so you have lost the reference to the first one. You can’t release it! Here is where SuperCollider excels as a programming language and makes things simple compared to data-flow programming environments like Pd or Max/MSP: we just create an array and put our synths into it. Every note has a slot in the array, and we turn the synths on and off depending on the MIDI message. (The example below assumes a \moog synthdef with freq, amp, cutoff and gate arguments, much like \patsynth1 in the previous chapter.)

a = Array.fill(127, { nil });
g = Group.new; // we create a Group to be able to set cutoff of all active notes
c = 6;
MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device; 
	// we use the key as index into the array as well
	a[key] = Synth(\moog, [\freq, key.midicps, \amp, vel/127, \cutoff, c], target:g);
	
});
MIDIdef.noteOff(\myOffdef, {arg vel, key, channel, device; 
	a[key].release;
});
MIDIdef.cc(\modulation, { arg val; c=val.linlin(0, 127, 6, 20); g.set(\cutoff, c) });

MIDI Communication (Output)

It is equally easy to control external hardware or software with SuperCollider’s MIDI functionality. Just as above we initialise the MIDI client and check which devices are available:

// we initialise the MIDI client and the post window will output your devices
MIDIClient.init;
// the destinations are the devices that can receive MIDI
MIDIClient.destinations;

// the default device is selected
m = MIDIOut(0); 
// or select your own device from the list of destinations
m = MIDIOut(0, MIDIClient.destinations[0].uid); 
// we now have a MIDIOut object stored in variable 'm'.
// now we can use the object to send out MIDI messages:
m.latency = 0; // we set the latency to 0 (the default is 0.2)
m.noteOn(0, 60, 100); // note on
m.noteOff(0, 60, 100); // note off

And you could control your device using Patterns:

Pbind(
	\type, \midi, 
	\midiout, m, 
	\midinote, Prand([60, 62, 63, 66, 69], inf), 
	\chan, 1, 
	\amp, 1, 
	\dur, 0.25
).play;

or for example a Task:

a =[72, 76, 79, 71, 72, 74, 72, 81, 79, 84, 79, 77, 76, 77, 76];
t = Task({
 	inf.do({arg i; // i is the counter and wrapAt can wrap the array
	m.noteOff(0, a.wrapAt(i-1), 100); // note off
	m.noteOn(0, a.wrapAt(i), 100); // note on
	0.25.wait;
 	})
}).play; 

You might have recognised the beginning of a Mozart melody there, but perhaps not, as the note lengths were not correct. How would you solve that? Try to fix the timing of the notes as an exercise. Tip: create a duration array (in a variable ‘d’, for example) and use it instead of the fixed “0.25.wait;” above, using wrapAt(i) to get the correct duration slot.
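One possible shape of the solution, with guessed note lengths (these durations are invented for illustration, not the actual Mozart rhythm):

t.stop; // stop the previous task first
a = [72, 76, 79, 71, 72, 74, 72, 81, 79, 84, 79, 77, 76, 77, 76];
d = [0.5, 0.25, 0.25, 0.5, 0.25, 0.25, 0.5, 0.25, 0.25, 0.5, 0.25, 0.25, 0.25, 0.25, 1];
t = Task({
	inf.do({arg i;
		m.noteOff(0, a.wrapAt(i-1), 100); // note off for the previous note
		m.noteOn(0, a.wrapAt(i), 100); // note on for the current note
		d.wrapAt(i).wait; // wait for this note's duration instead of a fixed 0.25
	})
}).play;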

OSC - Open Sound Control

Open Sound Control has become the principal protocol replacing MIDI in the 21st century. It is a fast and flexible network protocol that can be used to communicate between applications (like the SC server and sc-lang), between computers (on a local network or over the internet), or with hardware (that supports OSC). It is used by musicians and media artists all over the world, and it has become so popular that commercial software companies now support it in their products. In many ways it could have been called OMC (Open Media Control), as it is used in graphics, video, 3D software, games, and robotics as well.

OSC is a protocol of communication (how to send messages), but it does not define a standard of what to communicate (that’s the open bit). Unlike MIDI, it can send all kinds of information through the network (integers, floats, strings, arrays, etc.), and the user can define the message names (or address spaces as they are also called).

There are two things that the user needs to know: the computer’s IP address and the listening port.

  • IP address: typically something like “194.81.199.106”, or locally “127.0.0.1” (localhost)
  • Port: you can use any port, but ideally choose one above 10000

You have already used OSC in the SendTrig example of chapter 2, but there it was ‘under the hood’, so to speak, as the communication took place in the SuperCollider classes.

n = NetAddr("127.0.0.1", 57120);
a = OSCdef(\test, { arg msg, time, addr, recvPort; msg.postln; }, '/hello', n);
n.sendMsg('/hello', 4000.rand); // run this line a few times
n.sendMsg('/hola', 4000.rand); // try this, but it won't work. Why not?
a.free;

OSC messages make use of Unix-like address spaces. You know what that is already, as you are used to how the web uses ‘/’ to indicate a folder in web addresses. For example, the OSCdef.html document lies in a folder called ‘Classes’ at http://doc.sccode.org/Classes/OSCdef.html, together with lots of other documents. The address in the example above is ‘/hello’ (not ‘/hola’), which is why the ‘/hola’ message was never received.

The idea here is that we can send messages directly deep into the internals of our synthesizers or systems, for example like this:

'/synth1/oscillator2/lowpass/cutoff', 12000
'/synth1/oscillator2/frequency', 300
'/light3/red/intensity', 10
'/robot/leftarm/upper/xdegrees', 90

and so on. We are here sending direct messages that are human-readable as well as specific for the machine. This is very different from how people used to use MIDI, where you cannot name things: you have to resort to mapping your devices onto only 16 channels, often with messages constrained to numbers from 0 to 127.

Try to open Pure Data and create a new patch with the following in it:

[dumpOSC 12000]
|
[print]

Then send the messages to Pd with this code in SC:

n = NetAddr("127.0.0.1", 12000);
n.sendMsg('/hello', 4000.rand); // Pure Data will print this message

Try to do the same with another computer on the same network, but then send it to that computer’s IP address:

n = NetAddr(other_computer_IP_address, 12000);
n.sendMsg('/hello', 4000.rand);

Use the same Pd patch on that computer, but then run the following lines in SuperCollider:

a = OSCdef(\test, { arg msg, time, addr, recvPort; msg.postln; }, '/hello', nil);

You will notice that there is now ‘nil’ in the sender address slot. This allows any computer on the network to send to your computer. If you were to limit that to a specific net address (for example NetAddr("192.9.12.199", 3000)), you would only receive OSC messages from that specific address/computer.

Hopefully you have now been able to send OSC messages to another software on your computer, to Pd on another computer, and to SuperCollider on another computer. These examples were on the same network. You might have to change settings in your firewall for this to work over networks, and if you are on an institutional network (such as a University network) you might even have to ask the system administrators to open up for a specific port if the incoming message is coming from outside the network (Internally it works without admin changes).

We could end this section by creating a little program that is typical of how people use OSC over networks, between applications on the same or on different computers. Below we have the listener:

// synth definition used in this example
SynthDef(\osc, {arg freq=220, cutoff=1200;
	Out.ar(0, LPF.ar(Saw.ar(freq, 0.5), cutoff));
}).add;
// the four OSC defs, that represent the program functionality
OSCdef(\createX, { arg msg, time, addr, recvPort; 
	x = Synth(\osc);
}, '/create', nil); 
OSCdef(\releaseX, { arg msg, time, addr, recvPort;
	x.free; 
	}, '/free', nil);
OSCdef(\freqX, { arg msg, time, addr, recvPort; 
	x.set(\freq, msg[1]);
	}, '/freq', nil);
OSCdef(\cutoffX, { arg msg, time, addr, recvPort;
	x.set(\cutoff, msg[1]);
	}, '/cutoff', nil);

And the other system (another software or another computer) will send something like this:

n = NetAddr("127.0.0.1", 57120);
n.sendMsg('/create')
n.sendMsg('/freq', rrand(100, 2000))
n.sendMsg('/cutoff', rrand(100, 2000))
n.sendMsg('/free')

The messages could be wrapped into functionality that is plugged to some GUI, a hardware sensor (pressure sensor and motion tracker for example), or perhaps algorithmically generated together with some animated graphics.
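As a minimal sketch of the GUI case, here is a slider that sends the ‘/freq’ message to the listener above (using the same local address and port):

(
var win = Window("OSC control", Rect(100, 100, 300, 50)).front;
n = NetAddr("127.0.0.1", 57120); // the local sclang port used above
Slider(win, Rect(10, 10, 280, 30)).action_({arg sl;
	n.sendMsg('/freq', sl.value.linexp(0, 1, 100, 2000)); // map 0..1 to 100..2000 Hz
});
)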

GUI - Graphical User Interfaces

Note that we are creating a number of synths (defined in the variable “nrSynths”) and putting them all into one array. That way we can access and control them individually from the GUI. Look at how the sliders and buttons of the GUI directly control their respective synths by accessing synthList[i] (where “i” is the index of the synth in the array).

TIP: change the nrSynths variable to some other number (10, 16, etc) and see what happens.

(
var synthList, nrSynths;
nrSynths = 6;

synthList = Array.fill(nrSynths, {0});

w = SCWindow("SC Window", Rect(400, 64, 650, 360)).front;

nrSynths.do({arg i;

	// we create the buttons
	Button(w, Rect(10+(i*(w.bounds.width/nrSynths)), 20, (w.bounds.width/nrSynths)-10, 20))
		.states_([["on",Color.black,Color.clear],["off",Color.white,Color.black]])
		.action_({arg butt;
			if(butt.value == 1, {
				synthList.put(i, Synth(\GUIsine)); // a \GUIsine synthdef (with freq and amp args) is assumed to be defined
				synthList.postln;
			}, {
				synthList[i].free;
			})
		});
		
	// frequency slider
	Slider(w, Rect(10+(i*(w.bounds.width/nrSynths)), 60, (w.bounds.width/nrSynths)-10, 20))
		.action_({arg sl;
				synthList[i].set(\freq, sl.value*1000); // simple mapping (check ControlSpec)
		});
		
	// amplitude slider
	Slider(w, Rect(10+(i*(w.bounds.width/nrSynths)), 100, (w.bounds.width/nrSynths)-10, 20))
		.action_({arg sl;
				synthList[i].set(\amp, sl.value);
		});
});
)

ControlSpec - Scaling/mapping values

In the examples above we have used a very crude mapping of a slider onto a frequency argument in a synth. A slider in the SuperCollider GUI outputs values from 0 to 1.0, at a resolution defined by yourself and by the size of the slider (the longer the slider, the higher the resolution). So above we are using part of the slider range to control frequency values from 0 to 20 Hz that we are most likely not interested in. And we might want an exponential or a negative mapping instead.

The ControlSpec is the equivalent to [scale] in Pd or Max. Check the helpfile.

The ControlSpec takes the following arguments: minval, maxval, warp, step, default, units

a = ControlSpec.new(20, 22050, \exponential, 1, 440);
a.warp
a.default

// so any value we pass to the ControlSpec is mapped to our specification above
a.map(0.1)
a.map(0.99)

// we could constrain the mapping
a.constrain(16000)
a.map(1.66) // clips at max frequency (22050)

// we can also unmap values
a.unmap(11025) // we get a high value as pitch is exponential

// let's see what this maps to on a linear scale (yes you guessed right)
a = ControlSpec.new(20, 22050, \lin, 1, 440);
a.unmap(11025).round(0.1)

// TIP: An array can be cast into a ControlSpec with the method .asSpec
[300, 3000, \exponential, 100].asSpec

// TIP2: Take a look at the source file for ControlSpec (CMD+i)
// You will see lots of different warps, like db, pan, midi, rq, etc.

(
var w, c, d, warparray, stringarray;
w = SCWindow("control", Rect(128, 64, 340, 960)).front;
warparray = [\unipolar, \bipolar, \freq, \lofreq, \phase, \midi, \db, \amp, \pan, \delay, \\
beats];
stringarray = [];

warparray.do({arg warpmode, i;
	a = warpmode.asSpec;
	StaticText(w, Rect(10, 30+(i*50), 300, 20)).string_(warparray[i].asString);
	stringarray = stringarray.add(StaticText(w, Rect(80, 30+(i*50), 300, 20)));
	Slider(w, Rect(10, 10+(i*50), 300, 20))
		.action_({arg sl;
			stringarray[i].string = "unmapped value"
			+ sl.value.round(0.01) 
			+ "......" 
			+ "mapped to:" 
		+ warpmode.asSpec.map(sl.value).round(0.01)
		})
});
)

Now we finish by taking the example above and mapping the slider to pitch. Try exploring different warp modes for the pitch, and create an amplitude slider as an exercise.

(
var spec, synth;
w = SCWindow("SC Window", Rect(128, 64, 340, 360)).front;
spec = [100, 1000, \exponential].asSpec;

Button(w, Rect(20,20, 100, 30))
	.states_([["on",Color.black, Color.clear],["off",Color.black, Color.green(alpha:0.2)]])
	.action_({ arg button; if(button.value == 1, { synth = Synth(\GUIsine)}, {synth.free }) });
	
Slider(w, Rect(20, 100, 200, 20))
	// HERE WE USE THE SPEC !!! - we map the spec to the value of the slider (0 to 1.0)
	.action_({arg sl; synth.set(\freq, spec.map(sl.value)) }); 
)

Other Views (but not all)

(

w = Window("SC Window", Rect(400, 64, 650, 360)).front;
a = Button(w, Rect(20,20, 60, 20))
	.states_([["on",Color.black,Color.clear],["off",Color.black,Color.clear]])
	.action_({arg butt; butt.value.postln;});

b = Slider(w, Rect(20, 50, 60, 20))
	.action_({arg sl;
		sl.value.postln;
	});

e = Slider(w, Rect(90, 20, 20, 60))
	.action_({arg sl;
		sl.value.postln;
	});
	
c = Slider2D(w, Rect(20, 80, 60, 60))
	.action_({arg sl;
		[\x, sl.x.value, \y, sl.y.value].postln;
	});

d = RangeSlider(w, Rect(20, 150, 60, 20))
	.action_({arg sl;
		[\lo, sl.lo.value, \hi, sl.hi.value].postln;
	});

f = NumberBox(w, Rect(130, 20, 100, 20))
	.action_({
		arg numb; numb.value.postln;	
	});

g = StaticText(w, Rect(130, 50, 100, 20))
	.string_("some text");
	
h = ListView(w,Rect(130,80,80,50))
	.items_(["aaa","bbb", "ccc", "ddd", "eee", "fff"])
	.action_({ arg sbs;
		[sbs.value, sbs.item].postln;	// .value returns the integer
	});

i = MultiSliderView(w, Rect(130, 150, 100, 50))
	.action_({arg xb; ("index: " ++ xb.index ++" value: " ++ xb.currentvalue).postln});

j = PopUpMenu(w, Rect(20, 178, 100, 20))
	.items_(["one", "two", "three", "four", "five"])
	.action_({ arg sbs;
		sbs.value.postln;	// .value returns the integer
	});

k = EnvelopeView(w, Rect(20, 220, 200, 80))
	.drawLines_(true)
	.selectionColor_(Color.red)
	.drawRects_(true)
	.resize_(5)
	.action_({arg b; [b.index,b.value].postln})
	.thumbSize_(5)
	.value_([[0.0, 0.1, 0.5, 1.0],[0.1,1.0,0.8,0.0]]);

)

HID - Human Interface Devices

SuperCollider has good support for joysticks, gamepads, drawing tablets and other interfaces that work with the HID protocol (a subset of the USB protocol, using the USB port of the computer).

HID.findAvailable; // check which devices are attached
HID.postAvailable; // post the available devices
~myhid = HID.open( 1103, 53251 ); // adapt this line for  the device that you want to open!

// filter all events coming from the x-axis of a mouse
HIDdef.usage( \example, { |...args| args.postln; }, \X, \Mouse );
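As a sketch of how this might be used musically, assuming a device with an \X usage (such as a joystick axis) has been opened: the first argument passed into the HIDdef function should be the element’s normalized value between 0 and 1 (you can verify the argument list by posting ...args as above), which we can map onto a synth parameter:

x = {arg freq=440; SinOsc.ar(freq.lag(0.1), 0, 0.2)!2}.play;
HIDdef.usage(\xtofreq, {arg value; x.set(\freq, value.linexp(0, 1, 100, 1000)); }, \X);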

Hardware - Serial port (for example using Arduino)

Before USB, Bluetooth and other such “modern” protocols, there was the serial port. This would send data - in series - between the computer and external devices, such as sensors or actuators (e.g. a motor or a printer).

Arduino is a popular microcontroller board used in embedded computing, often as an external board that interfaces with sensors and actuators and communicates with the computer over the serial port. The SuperCollider documentation explains this well in the SerialPort help file.

Part II

Chapter 5 - Additive Synthesis

In 1822, the mathematician Joseph Fourier published a work on heat containing a theory that implies that any periodic sound can be described as a sum of pure sine waves. This is a very important statement for computer music: it means that we can recreate any sound that we hear by adding a number of sine waves together with different frequencies, phases and amplitudes. Obviously this was a costly technique in the times of modular synthesis, as one would have to apply multiple oscillators to get the desired sound. This has changed with digital sound, where innumerable oscillators can be added together at little cost. Here is a demonstration:

// we add 500 oscillators together and the CPU is less than 20% 
{({SinOsc.ar(4444.4.rand, 0, 0.005)}!500).sum}.play

Adding waves

Adding waves together seems simple, and indeed it is. By using the plus operator we can add two signals together, and their values at each point in time sum to the combined value. In the following images we can see how simple sinusoidal waves add up:

Adding two waves of 440Hz together

You can see how two sine waves that each go from -1 to 1 will, when added up, span from -2 to 2.

{[SinOsc.ar(440), SinOsc.ar(440), SinOsc.ar(440)+SinOsc.ar(440)]}.plot
// try this as well
{a = SinOsc.ar(440, 0, 0.5); [a, a, a+a]}.plot

You see that two waves of the same frequency added together give twice the amplitude. When two waves with an amplitude of 1 are added together we get an amplitude of 2, and in the graph the wave is clipped at 1. This causes distortion, and the resulting wave form approaches a different shape, namely a square wave. You can explore this by giving a sine wave an amplitude of 10 and then clipping the signal at, say, -0.75 and 0.75.

{SinOsc.ar(440, 0, 10).clip(-0.75, 0.75)}.scope
Adding a 440Hz and a 220Hz wave together
{[SinOsc.ar(220), SinOsc.ar(440), SinOsc.ar(220)+SinOsc.ar(440)]}.plot
Adding two 440Hz waves together, one with inverted phase
{[SinOsc.ar(440), SinOsc.ar(440, pi), SinOsc.ar(440)+SinOsc.ar(440, pi)]}.plot

The phase of the wave is important as it can either cancel the sound out or double its amplitude. Recording engineers are familiar with the problems of phasing in microphone placements where certain frequencies of a sound can be phased out if two mics are badly placed.

Most instrumental sounds can be roughly described as a combination of sine waves. Those sinusoidal waves are called partials (the horizontal lines you see in a spectrogram when you analyse a sound). In the example below we mix ten sine waves of frequencies between 200 and 2000. You might well be able to detect a pitch in the example if you run it many times, but since these are random frequencies they are not necessarily lining up to give us a solid pitch.

{Mix.fill(10, {SinOsc.ar(rrand(200,2000), 0, 0.1)})}.freqscope
// {Mix.fill(10, {SinOsc.ar(rrand(200,2000), 0, 0.1)})}.spectrogram // requires the Spectrogram Quark (see above)

In harmonic sounds, like those of the piano, guitar, or violin, we get partials that are whole-number multiples of the fundamental (the lowest) partial. When the partials are whole-number multiples of the fundamental, they are called harmonics. The harmonics can vary in amplitude, phase, envelope form, and duration. A saw wave is a wave form with all the harmonics present, decreasing in amplitude:

{Saw.ar(880)}.freqscope

Try playing with adding waves together in various ways. Explore what happens when you add harmonics together (integer multiples of a fundamental frequency):

// adding two waves - the second is the octave (second harmonic) of the first
{(SinOsc.ar(440,0, 0.4) + SinOsc.ar(880, 0, 0.4))!2}.play
// here we add four harmonics (of equal amplitude) together
(
{	
var freq = 200;
SinOsc.ar(freq, 0, 0.2)   + 
SinOsc.ar(freq*2, 0, 0.2) +
SinOsc.ar(freq*3, 0, 0.2) + 
SinOsc.ar(freq*4, 0, 0.2) 
!2}.play
)

The harmonic series is something we all know intuitively and have heard many times (swing a flexible tube around your head and you will get a sound in the harmonic series). The Blip UGen in SuperCollider allows you to dynamically control the number of harmonics of equal amplitude:

{Blip.ar(440, MouseX.kr(1, 20))}.scope // using the Mouse
{Blip.ar(440, MouseX.kr(1, 20))}.freqscope
{Blip.ar(440, Line.kr(1, 22, 3) )}.play

Creating wave forms out of sinusoids

In SuperCollider you can create all kinds of wave forms out of combinations of sine waves. By adding SinOscs together, you can arrive at your own unique wave forms to use in your synths. In this section we will look at how we can use additive synthesis to arrive at diverse wave forms.

// a) here is an array with 5 items:
Array.fill(5, {arg i; i.postln;});
// b) this is the same as (using a shortcut):
{arg i; i.postln;}.dup(5)
// c) or simply (using another shortcut):
{arg i; i.postln;}!5

// d) we can then sum the items in the array (add them together):
Array.fill(5, {arg i; i.postln;}).sum;
// e) we could do it this way as well:
sum({arg i; i.postln;}.dup(5));
// f) or this way:
({arg i; i.postln;}.dup(5)).sum;
// g) or this way:
({arg i; i.postln;}!5).sum;
// h) or simply this way:
sum({arg i; i.postln;}!5);

Above we created a Saw wave which contains harmonics up to the [Nyquist rate](http://en.wikipedia.org/wiki/Nyquist_rate), which is half of the sample rate SuperCollider is running at. The Saw UGen is “band-limited”, which means that it does not alias and mirror back into the audible range. (Compare with LFSaw, which will alias - you can both hear and see the harmonics mirror back into the audio range.)

{Saw.ar(MouseX.kr(100, 1000))}.freqscope
{LFSaw.ar(MouseX.kr(100, 1000))}.freqscope

We can now try to create a saw wave out of sine waves. There is a simple algorithm for this, where each partial is an integer multiple of the fundamental frequency and decreases in amplitude by the reciprocal of the harmonic’s number (1/harmonicNumber).

A ‘Saw’ wave with 30 harmonics:

(
f = {
        ({arg i;
                var j = i + 1;
                SinOsc.ar(300 * j, 0,  j.reciprocal * 0.5);
        } ! 30).sum // we sum this function 30 times
!2}; // and we make it a stereo signal
)

f.plot; // let's plot the wave form
f.play; // listen to it
f.freqscope; // view and listen to it

By inverting the phase (using pi), we get an inverted wave form.

(
f = {
        Array.fill(30, {arg i;
                var j = i + 1;
                SinOsc.ar(300 * j, pi,  j.reciprocal * 0.5) // note pi
        }).sum // we sum this function 30 times
!2}; // and we make it a stereo signal
)

f.plot; // let's plot the wave form
f.play; // listen to it
f.freqscope; // view and listen to it

A square wave is a type of pulse wave (if the length of the on time of the pulse is equal to the length of the off time - also known as a duty cycle of 1:1 - then the pulse wave may also be called a square wave). The square wave can be created from sine waves if we skip all the even harmonics and only add the odd ones.

(
f = {
        ({arg i;
                var j = i * 2 + 1; // the odd harmonics (1,3,5,7,etc)
                SinOsc.ar(300 * j, 0, 1/j)
        } ! 20).sum;
};
)

f.plot;
f.play;
f.freqscope;

Let’s quickly look at the regular Pulse wave in SuperCollider:

{ Pulse.ar(440, MouseX.kr(0, 1), 0.5) }.scope;
// we could also recreate this with an algorithm on a sine wave:
{ if( SinOsc.ar(122)>0 , 1, -1 )  }.scope; // a square wave
{ if( SinOsc.ar(122)>MouseX.kr(0, 1) , 1, -1 )  }.scope; // MouseX controls the period
{ if( SinOsc.ar(122)>MouseX.kr(0, 1) , 1, -1 ) * 0.1 }.scope; // amplitude down

A triangle wave is a wave form similar to the square wave above in that it ignores the even harmonics, but it has a different algorithm for the phase and the amplitude (each harmonic’s amplitude falls off with the square of its number):

(
f = {
        ({arg i;
                var j = i * 2 + 1;
                SinOsc.ar(300 * j, pi/2, 0.7/j.squared) // cosine wave (phase shift)
        } ! 20).sum;
};
)
f.plot;
f.play;
f.freqscope;

We have now created various wave forms using sine waves, and here is how to wrap them up in a SynthDef for future use:

SynthDef(\triwave, {arg freq=400, pan=0, amp=1;
	var wave;
	wave = ({arg i;
                	var j = i * 2 + 1;
                	SinOsc.ar(freq * j, pi/2, 0.6 / j.squared);
        	} ! 20).sum;
	Out.ar(0, Pan2.ar(wave * amp, pan));
}).add;

a = Synth(\triwave, [\freq, 300]);
a.set(\amp, 0.3, \pan, -1);
b = Synth(\triwave, [\freq, 900]);
b.set(\amp, 0.4, \pan, 1);
s.freqscope; // if the freqscope is not already running
b.set(\freq, 1400); // not band limited as we can see 

We have created various typical wave forms above in order to show how they are sums of sinusoidal waves. A good idea is to play with this further and create your own waveforms:

(
f = {
        ({arg i;
                var j = i * 2.cubed + 1;
                SinOsc.ar(MouseX.kr(20,800) * j, 0, 1/j)
        } ! 20).sum;
};
)
f.plot;
f.play;
(
f = {
        ({arg i;
                var j = i * 2.squared.distort + 1;
                SinOsc.ar(MouseX.kr(20,800) * j, 0, 0.31/j)
        } ! 20).sum;
};
)
f.plot;
f.play;

Bell Synthesis

Not all sounds are harmonic. Many musical instruments are inharmonic, for example timpani, xylophones, and bells. Here the partials of the sound are not in a harmonic relationship (they are not whole-number multiples of some fundamental frequency). This does not mean that we can’t detect pitch, as certain partials will have stronger amplitude and longer duration than others. Since we know bells are inharmonic, the first thing we might try is to generate a sound with, say, 15 partials:

{ ({ SinOsc.ar(rrand(80, 800), 0, 0.1)} ! 15).sum }.play

Try to run this a few times. What we hear is a wave form that might be quite similar to a bell at first, but then the resemblance disappears, because the partials do not fade out. If we add an envelope to each of these sinusoids, we get a different sound:

{
Mix.fill( 10, { 	
	SinOsc.ar(rrand(200, 700), 0, 0.1) 
	* EnvGen.ar(Env.perc(0.0001, rrand(2, 6))) 
});
}.play

Above we are using Mix.fill instead of creating an array with ! and then .summing it. These two idioms do the same thing, but it is good for the student of SuperCollider to learn different ways of reading and writing code.

You will notice that there is a “new” bell every time we run the above code. But what if we wanted the “same” bell? One way to do that is to “hard-code” the frequencies, durations, and amplitudes of the bell.

{
var freq = [333, 412, 477, 567, 676, 890, 900, 994];
var dur = [4, 3.5, 3.6, 3.1, 2, 1.4, 2.4, 4.1];
var amp = [0.4, 0.2, 0.1, 0.4, 0.33, 0.22, 0.13, 0.4];
Mix.fill( 8, { arg i;
	SinOsc.ar(freq[i], 0, amp[i]) // use the hard-coded amplitudes
	* EnvGen.ar(Env.perc(0.0001, dur[i])) 
});
}.play

Generating a SynthDef using a non-deterministic algorithm (such as random) in the SC-lang will also generate a SynthDef that is the “same” bell every time. Why? This is because the random values (430.rand) are set when the synth definition is compiled. Try to recompile the SynthDef and you get a new sound:

(
SynthDef(\mybell, {arg freq=333, amp=0.4, dur=2, pan=0.0;
	var signal;
	signal = Mix.fill(10, {
		SinOsc.ar(freq+(430.rand), 1.0.rand, 10.reciprocal) 
		* EnvGen.ar(Env.perc(0.0001, dur), doneAction:2) }) ;
	signal = Pan2.ar(signal * amp, pan);
	Out.ar(0, signal);
}).add
)
// let's try our bell
Synth(\mybell) // same sound all the time
Synth(\mybell, [\freq, 444+(400.rand)]) // new frequency, but same sound
// try to redefine the SynthDef above and you will now get a different bell:
Synth(\mybell) // same sound all the time

Another way of generating this bell sound would be to use the simple sine SynthDef from a previous chapter, here adding a duration to the envelope:

(
SynthDef(\sine, {arg freq=333, amp=0.4, dur, pan=0.0;
	var signal, env;
	env = EnvGen.ar(Env.perc(0.01, dur), doneAction:2);
	signal = SinOsc.ar(freq, 0, amp) * env;
	signal = Pan2.ar(signal, pan);
	Out.ar(0, signal);
}).add
);

(
var numberOfSynths;
numberOfSynths = 15;
Array.fill(numberOfSynths, {
	Synth(\sine, [	// the SynthDef we just defined above
		\freq, 300+(430.rand),
		\amp, numberOfSynths.reciprocal, // reciprocal here means 1/numberOfSynths
		\dur, 2+(1.0.rand)]);
});
)

This style is powerful if you really want to be able to define all the parameters of the sound from the language, for example when sonifying complex information from gestural or other data.

The Klang UGen

Another interesting way of achieving this is to use the Klang UGen. Klang is a bank of sine oscillators that takes arrays of frequencies, amplitudes and phases as arguments.

{Klang.ar(`[ [430, 810, 1050, 1220], [0.23, 0.13, 0.23, 0.13], [pi,pi,pi, pi]], 1, 0)}.play

And we create a SynthDef with the Klang UGen:

(
SynthDef(\saklangbell, {arg freq=400, amp=0.4, dur=2, pan=0.0; // we add a new argument
	var signal, env;
	env = EnvGen.ar(Env.perc(0.01, dur), doneAction:2); // doneAction gets rid of the synth
	signal = Klang.ar(`[freq * [1.2,2.1,3.0,4.3], [0.25, 0.25, 0.25, 0.25], nil]) * env;
	signal = Pan2.ar(signal, pan);
	Out.ar(0, signal);
}).add
)
Synth(\saklangbell, [\freq, 100])

Xylophone Synthesis

Additive synthesis is good for various types of sound, but it suits xylophones, bells and other metallic instruments (typically inharmonic sounds) particularly well, as we saw with the bell example above. Harmonic wave forms, such as the saw, square or triangle waves, would not be useful here, since their partials are whole-number multiples of the fundamental (as we know from the section above).

In additive synthesis, people often analyse the sound they’re trying to synthesise by generating a spectrogram of its frequencies.

A spectrogram of a xylophone sound

The information the spectrogram gives us is three dimensional. It shows us the time on the horizontal x-axis, the frequencies present in the sound on the vertical y-axis, and amplitude as colour (which we could imagine as the z-axis). We see that the partials don’t have the same type of envelopes: some have a strong attack, others come in smoothly; some have much amplitude, others less; some have a long duration whilst others are shorter; and some of them vibrate in frequency. These parameters can mix: a loud partial could die out quickly while a soft one can live for a long time.

{ ({ SinOsc.ar(rrand(180, 1200), 0.5*pi, 0.1) // the partial
		*
	// each partial gets its own envelope of 0.5 to 5 seconds
	EnvGen.ar(Env.perc(rrand(0.00001, 0.01), rrand(0.5, 5)))
} ! 12).sum }.play

Analysing the bell above we can detect the following partials:

  • partial 1: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 2: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 3: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 4: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 5: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 6: xxx Hz, x sec. long, with amplitude of ca. x
  • partial 7: xxx Hz, x sec. long, with amplitude of ca. x

We can now try to synthesize those harmonics:

{ SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)
}.play

And we get a decent inharmonic sound (inharmonic meaning that the partials are not whole-number multiples of a fundamental frequency). We would now need to set the right amplitudes as well; we are still guessing from the spectrogram we made, but more importantly we should be using our ears.

{ SinOsc.ar(xxx, 0, xxx)+
SinOsc.ar(xxx, 0, xxx)+
SinOsc.ar(xxx, 0, xxx)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)+
SinOsc.ar(xxx, 0, 0.1)
}.play

Some of the partials have a bit of vibration and we could simply turn the oscillator into a ‘detuned’ oscillator by adding two sines together:

// a regular 880 Hz wave at full amplitude
{SinOsc.ar(880)!2}.play
// a vibrating 880Hz wave (vibration at 3 Hz), where each is amp 0.5
{SinOsc.ar([880, 883], 0, 0.5).sum!2}.play
// the above is the same as (note the .sum):
{(SinOsc.ar(880, 0, 0.5)+SinOsc.ar(883, 0, 0.5))!2}.play
{ SinOsc.ar([xxx, xxx], 0, xxx).sum+
SinOsc.ar([xxx, xxx], 0, xxx).sum+
SinOsc.ar([xxx, xxx], 0, xxx).sum+
SinOsc.ar([xxx, xxx], 0, xxx).sum+
SinOsc.ar([xxx, xxx], 0, xxx).sum+
SinOsc.ar([xxx, xxx], 0, xxx).sum
}.play

And finally, we need to create envelopes for each of the partials:

{ (SinOsc.ar([xxx, xxx], 0, xxx).sum *
EnvGen.ar(Env.perc(0.00001, xxx))) +
 (SinOsc.ar([xxx, xxx], 0, xxx).sum *
EnvGen.ar(Env.perc(0.00001, xxx))) +
 (SinOsc.ar([xxx, xxx], 0, xxx).sum *
EnvGen.ar(Env.perc(0.00001, xxx))) +
 (SinOsc.ar([xxx, xxx], 0, xxx).sum *
EnvGen.ar(Env.perc(0.00001, xxx))) +
 (SinOsc.ar([xxx, xxx], 0, xxx).sum *
EnvGen.ar(Env.perc(0.00001, xxx)))
}.play

And let’s listen to that. You will note that parentheses have been put around each sine wave and its envelope multiplication. This is because SuperCollider evaluates expressions strictly from left to right, without giving * and / precedence over + and -, unlike common maths and many other programming languages.

TIP: Operator Precedence - explore how these equations result in different outcomes

2+2*8 // you would expect 18 as the result, but SC returns what?
100/2-10 // here you would expect to get 40, and you get the same in SC. Why?
// now, for this reason it's a good practice to use parenthesis, e.g.,
2+(2*8)
100/(2-10) // if that's what you were trying to do

We have now created a reasonable representation of the bell sound that we listened to. The next thing to do is to turn that into a synth definition and make it stereo. Note that we add a general envelope with a doneAction:2, which will remove the synth from the server when it has stopped playing.

SynthDef(\bell, xxxx

// and we can play our new bell
Synth(\bell)

This bell has a specific frequency, but it would be nice to be able to pass a new frequency as a parameter. This could be done in many ways; one would be to pass the frequencies of each of the oscillators as arguments to the Synth. This would make the instrument quite flexible, but on the other hand it would weaken its unique character (now that so many more types of bell sounds - with their respective harmonic relationships - could be made with it). So here we decide to keep the same ratios between the partials for this unique bell sound, but make a sound that can change in pitch. We find the ratios by dividing all the frequencies by the lowest frequency.

[xxx, xxx2, xxx3, xxx4]/xxx
// which gives us this array:
[xxxxxxxxxxxxxxxxxxxxxxxxx]
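To make this concrete (here borrowing the hard-coded frequencies from the bell example earlier in this chapter, rather than real xylophone measurements), the calculation would look like this:

[333, 412, 477, 567, 676, 890, 900, 994] / 333
// posting something like: [ 1.0, 1.237, 1.432, 1.703, 2.03, 2.673, 2.703, 2.985 ]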

We can now use those ratios in our synth definition:

SynthDef(\bell, xxxx

// and we can play the bell with different frequencies
Synth(\bell, [\freq, 440])
Synth(\bell, [\freq, 220])
Synth(\bell, [\freq, 590])
Synth(\bell, [\freq, 1000.rand])

Harmonics GUI

Below you find a Graphical User Interface that allows you to control the harmonics of a fundamental frequency (the slider on the right is the fundamental freq). Here we are also introduced to the Osc UGen, which is a wavetable oscillator that reads its samples from a waveform stored in a buffer.

// we create a SynthDef
SynthDef(\oscsynth, { arg bufnum, freq = 440, ts = 1; 
	var signal; // a local variable rather than the interpreter variable 'a'
	signal = Osc.ar(bufnum, freq, 0, 0.2) * EnvGen.ar(Env.perc(0.01), timeScale: ts, doneAction: 2);
	Out.ar(0, signal ! 2);
}).add;

// and then we fill the buffer with our waveform and generate the GUI 
(
var bufsize, ms, slid, cspec, freq;
var harmonics;

freq = 220;
bufsize=4096; 
harmonics=20;

b=Buffer.alloc(s, bufsize, 1);

x = Synth(\oscsynth, [\bufnum, b.bufnum, \ts, 0.1]);

// GUI :
w = Window("harmonics", Rect(200, 470, 20*harmonics+140,150)).front;
ms = MultiSliderView(w, Rect(20, 20, 20*harmonics, 100));
ms.value_(Array.fill(harmonics,0.0));
ms.isFilled_(true);
ms.valueThumbSize_(1.0);
ms.canFocus_(false);
ms.indexThumbSize_(10.0);
ms.strokeColor_(Color.blue);
ms.fillColor_(Color.blue(alpha: 0.2));
ms.gap_(10);
ms.action_({ b.sine1(ms.value, false, true, true) }); // set the harmonics
slid=Slider(w, Rect(20*harmonics+30, 20, 20, 100));
cspec= ControlSpec(70,1000, 'exponential', 10, 440);
slid.action_({	
	freq = cspec.map(slid.value); 	
	[\frequency, freq].postln;
	x.set(\freq, cspec.map(slid.value)); 
	});
slid.value_(0.3); 
slid.action.value;
Button(w, Rect(20*harmonics+60, 20, 70, 20))
	.states_([["Plot",Color.black,Color.clear]])
	.action_({	a = b.plot });
Button(w, Rect(20*harmonics+60, 44, 70, 20))
	.states_([["Start",Color.black,Color.clear], ["Stop!",Color.black,Color.clear]])
	.action_({arg sl;
		if(sl.value ==1, {
			x = Synth(\oscsynth, [\bufnum, b.bufnum, \freq, freq, \ts, 1000]);
			},{x.free;});
	});	
Button(w, Rect(20*harmonics+60, 68, 70, 20))
	.states_([["Play",Color.black,Color.clear]])
	.action_({
		Synth(\oscsynth, [\bufnum, b.bufnum, \freq, freq, \ts, 0.1]);
	});	
Button(w, Rect(20*harmonics+60, 94, 70, 20))
	.states_([["Play rand",Color.black,Color.clear]])
	.action_({
		Synth(\oscsynth, [\bufnum, b.bufnum, \freq, rrand(20,100)+50, \ts, 0.1]);
	});	
)

The “Play” and “Play rand” buttons on the interface allow you to hit Enter repeatedly whilst changing the harmonic energy of the sound. Can you synthesise a clarinet or an oboe this way? Can you find the sound of a trumpet? You can get close, but of course each of the harmonics would ideally have their own envelope and amplitude (as we saw in the xylophone synthesis above).
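As a rough sketch of that idea (assuming the buffer b from the patch above is still allocated), a clarinet-like spectrum can be approximated by writing energy into the odd harmonics only - the amplitude values here are guesses, not measurements:

// odd harmonics only, with decaying amplitudes
b.sine1([1, 0, 0.33, 0, 0.2, 0, 0.14, 0, 0.11], false, true, true);
Synth(\oscsynth, [\bufnum, b.bufnum, \freq, 220, \ts, 2]);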

Some Additive SynthDefs with routines playing them

The examples above might have raised the question whether all the parameters of the synth could be set from the outside as arguments passed to the synth in the form of arrays. This is possible, of course, but it requires that the arrays are created as inputs when the SynthDef is compiled. In the example below, the partials and the amplitudes of 15 oscillators are set on compilation as the default arguments in respective arrays.

Note the # in front of the arrays in the arguments. It means that they are literal (fixed size) arrays.

(
SynthDef(\addSynthArray, { arg freq=300, dur=0.5, mul=100, addDiv=8, 
	partials = #[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], 
	amps = #[0.30, 0.15, 0.10, 0.07, 0.06, 0.05, 0.04, 0.03, 0.03, 0.03, 0.02, 0.02, 0.02, 0.02, 0.02]; 	
	var signal, env;
	env = EnvGen.ar(Env.perc(0.01, dur), doneAction: 2);
	signal = Mix.arFill(partials.size, {arg i;
				SinOsc.ar(
					freq * partials[i], // note: the argument is called partials
					0,
					amps[i]	
				)});
	
	Out.ar(0, signal.dup * env)
	}).add
)

// a saw-wave-like sound with 15 harmonics 
Synth(\addSynthArray, [\freq, 200])
Synth(\addSynthArray, [\freq, 300])
Synth(\addSynthArray, [\freq, 400])

The three synths above differ only in pitch; the timbre stays the same because each synth is using the default arguments of the SynthDef. Let’s try to pass a partials array:

Synth(\addSynthArray, [\freq, 400, \partials, {|i| (i+1)+rrand(-0.2, 0.2)}!15])

What happened here? Let’s scrutinize the partials argument.

{|i| (i+1)+rrand(-0.2, 0.2)}!15
// breaks down to
{|i|i}!15
// or 
{arg i; i } ! 15
// but we don't want a frequency of zero, so we add 1
{|i| (i+1) }!15
// and then we add random values from -0.2 to 0.2
{|i| (i+1) + rrand(-0.2, 0.2) }!15
// resulting in frequencies such as 
{|i| (i+1) + rrand(-0.2, 0.2) * 440 }!15

We can now create a piece that sets new partial frequencies and their amplitudes on every note. As mentioned above, this could be carefully decided, or simply done randomly. If it is completely random, it might be worth looking into the Rand UGens, as they allow a new random value to be generated within every synth instance.

// test the routine here below: uncomment and comment the variables f and a
(
fork {  // fork is basically a Routine
        100.do({
        		// partial frequencies:
         		// f = Array.fill(15, {arg i; i=i+1; i}).postln; // harmonic spectra (saw wave)
         		f = Array.fill(15, {10.0.rand}); // inharmonic spectra (a bell?)
         		// partial amplitudes:
         		// a = Array.fill(15, {arg i; i=i+1; 1/i;}).normalizeSum.postln; // saw wave amps
         		a = Array.fill(15, {1.0.rand}).normalizeSum.postln; // random amp on each harmonic
         	  	Synth(\addSynthArray, [\partials, f, \amps, a]); // pass the arrays as arguments
            		1.wait;
        });
      }  
)
(
n = rrand(10, 15);
{ Mix.arFill(n , { 
		SinOsc.ar( [67.0.rrand(2000), 67.0.rrand(2000)], 0, n.reciprocal)
		*
		EnvGen.kr(Env.sine(rrand(2.0, 10) ) )
	}) * EnvGen.kr(Env.perc(11, 6), doneAction: 2, levelScale: 0.75)
}.play;
)

fork {  // fork is basically a Routine
        100.do({
		n = rrand(10, 45);
		"Number of UGens: ".post; n.postln;
		{ Mix.fill(n , { 
			SinOsc.ar( [67.0.rrand(2000), 67.0.rrand(2000)], 0, n.reciprocal)
			*
			EnvGen.kr(Env.sine(rrand(4.0, 10) ) )
		}) * EnvGen.kr(Env.perc(11, 6), doneAction: 2, levelScale: 0.75)
		}.play;
		rrand(5, 10).wait;
		})
}

Using Control to set multiple parameters

There is another way to store and control arrays within a SynthDef. This is using the Control class. The controls are good for passing arrays into running Synths. In order to do this we use the Control UGen inside our SynthDef.

SynthDef("manySines", {arg out=0;
	var sines, control, numsines;
	numsines = 20;
	control = Control.names(\array).kr(Array.rand(numsines, 400.0, 1000.0));
	sines = Mix(SinOsc.ar(control, 0, numsines.reciprocal)) ;
	Out.ar(out, sines ! 2);
}).add;

Here we make an array of 20 frequency values inside a Control and pass this array to the SinOsc UGen, which makes a “multichannel expansion”, i.e., it creates a sine wave on each of 20 successive audio busses. (If you had a sound card with 20 channels, you’d get a sine out of each channel.) But here we mix the sines into one signal. Finally, in the Out UGen we use “! 2”, which is a multichannel expansion trick that makes this a 2 channel signal (we could have used signal.dup).

b = Synth("manySines");

And here below we can change the frequencies of the Control:

// our control name is "array"
b.setn(\array, Array.rand(20, 200, 1600)); 
b.setn(\array, {rrand(200, 1600)}!20); 
b.setn(\array, {rrand(200, 1600)}.dup(20));
// NOTE: All three lines above do exactly the same, just different syntax

Here below we use DynKlang (dynamic Klang) in order to change the synth at runtime:

(
SynthDef(\dynklang, { arg out=0, freq=110;
	var klank, n, harm, amp;
	n = 9;
	// harmonics
	harm = Control.names(\harm).kr(Array.series(4,1,4));
	// amplitudes
	amp = Control.names(\amp).kr(Array.fill(4,0.05));
	klank = DynKlang.ar(`[harm,amp], freqscale: freq);
	Out.ar(out, klank);
}).add;
)

a = Synth(\dynklang, [\freq, 230]);

a.set(\harm,  Array.rand(4, 1.0, 4.7))
a.set(\freq, rrand(30, 120))
a.set(\amp, Array.rand(4, 0.005, 0.1))

Klang and DynKlang

It can be laborious to build an array of synths and set the frequencies and amplitudes of each. For this we have a UGen called Klang. Klang is a bank of sine oscillators. It is more efficient than DynKlang, but less flexible. (Don’t confuse these with Klank and DynKlank, which we will explore in the next chapter.)

// bank of 12 oscillators of frequencies between 600 and 1000
{ Klang.ar(`[ Array.rand(12, 600.0, 1000.0), nil, nil ], 1, 0) * 0.05 }.play;
// here we create synths every 2 seconds
(
{
loop({
	{ Pan2.ar( 
		Klang.ar(`[ Array.rand(12, 200.0, 2000.0), nil, nil ], 0.5, 0)
		* EnvGen.kr(Env.sine(4), 1, 0.02, doneAction: 2), 1.0.rand2) 	
	}.play;
	2.wait;
})
}.fork;
)

Klang cannot receive updates to its frequencies, nor can it be modulated. For that we use DynKlang (Dynamic Klang).

(
{ 
	DynKlang.ar(`[ 
		[800, 1000, 1200] + SinOsc.kr([2, 3, 0.2], 0, [130, 240, 1200]),
		[0.6, 0.4, 0.3],
		[pi,pi,pi]
	]) * 0.1
}.freqscope;
)

// amplitude modulation
(
{ 
	DynKlang.ar(`[ 
		[800, 1600, 2400, 3200],
		[0.1, 0.1, 0.1, 0.1] + SinOsc.kr([0.1, 0.3, 0.8, 0.05], 0, [1, 0.8, 0.8, 0.6]),
		[pi, pi, pi, pi]
	]) * 0.1
}.freqscope;
}.freqscope;
)

The following patch shows how a GUI is used to control the amplitudes of the DynKlang oscillator array:

(	// create controls directly with literal arrays:
SynthDef(\dynsynth, {| freqs = #[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], 
	amps = #[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0], 
	rings = #[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]|
	Out.ar(0, DynKlang.ar(`[freqs, amps, rings]))
}).add
)

(
var bufsize, ms, slid, cspec, rate;
var harmonics = 20;
GUI.qt;

x = Synth(\dynsynth).setn(
				\freqs, Array.fill(harmonics, {|i| 110*(i+1)}), 
				\amps, Array.fill(harmonics, {0})
				);

// GUI :
w = Window("harmonics", Rect(200, 470, 20*harmonics+40,140)).front;
ms = MultiSliderView(w, Rect(20, 10, 20*harmonics, 110));
ms.value_(Array.fill(harmonics,0.0));
ms.isFilled_(true);
ms.indexThumbSize_(10.0);
ms.strokeColor_(Color.blue);
ms.fillColor_(Color.blue(alpha: 0.2));
ms.gap_(10);
ms.action_({
	x.setn(\amps, ms.value*harmonics.reciprocal);
}); 
)

Chapter 6 - Subtractive Synthesis

The previous chapter introduced additive synthesis, where the idea is to start with silence and add partials together until we arrive at the sound we are after. Subtractive synthesis works from the opposite direction: we start with a rich sound - a broadband signal rich in partials/harmonics or noise - and then filter the unwanted frequencies out. WhiteNoise and Saw waves are typical sound sources, as noise has equal energy at all frequencies, whilst the saw wave has a natural-sounding harmonic structure with energy at every harmonic.
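As a first taste of the idea, here is a minimal sketch: the two typical sources, each passed through a low pass filter whose cutoff frequency is on the mouse:

{ LPF.ar(WhiteNoise.ar(0.3), MouseX.kr(100, 10000, 1)) ! 2 }.play;
{ LPF.ar(Saw.ar(220, 0.3), MouseX.kr(100, 10000, 1)) ! 2 }.play;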

Noise Sources

The definition of noise is a signal that is aperiodic, i.e., there is no periodic repetition of some form in the signal. If there were such repetition, we would talk about a wave form and then a frequency of those repetitions; the frequency becomes pitch or musical notes. Not so in the world of noise: there are no repetitions that we can detect and thus we perceive it as the opposite of a signal; the antithesis of meaning. Some of us might remember the white noise of a dead analogue TV channel. Anyway, although noise might have negative connotations for some, it is a very useful musical element, in particular as a rich input signal for synthesis.

// WhiteNoise
{WhiteNoise.ar(0.4)}.plot(1)
{WhiteNoise.ar(0.4)}.play
{WhiteNoise.ar(0.4)}.scope
{WhiteNoise.ar(0.4)}.freqscope

// PinkNoise 
{PinkNoise.ar(1)}.plot(1)
{PinkNoise.ar(1)}.play
{PinkNoise.ar(1)}.freqscope

// BrownNoise
{BrownNoise.ar(1)}.plot(1)
{BrownNoise.ar(1)}.play
{BrownNoise.ar(1)}.freqscope

// Take a look at the source file called Noise.sc (or hit Cmd+i on WhiteNoise)
// You will find lots of interesting noise generators. For example these:

{ Crackle.ar(XLine.kr(0.99, 2, 10), 0.4) }.freqscope.scope;

{ LFDNoise0.ar(XLine.kr(1000, 20000, 10), 0.1) }.freqscope.scope;

{ LFClipNoise.ar(XLine.kr(1000, 20000, 10), 0.1) }.freqscope.scope;

// Impulse
{ Impulse.ar(80, 0.7) }.play
{ Impulse.ar(4, 0.7) }.play

// Dust (random impulses)
{ Dust.ar(80) }.play
{ Dust.ar(4) }.play

We can now start to sculpt sound with the use of filters and envelopes. For example, what would this remind us of:

{WhiteNoise.ar(1) * EnvGen.ar(Env.perc(0.001,0.3), doneAction:2)}.play

We can add a low pass filter (LPF) to the noise, so we cut off the high frequencies:

{LPF.ar(WhiteNoise.ar(1), 3300) * EnvGen.ar(Env.perc(0.001,0.5), doneAction:2)}.play

And here we use mouse movements to control the cutoff frequency (the x-axis) and the envelope duration (y-axis):

(
fork{
	100.do({
		{LPF.ar(WhiteNoise.ar(1), MouseX.kr(200,20000, 1)) 
			* EnvGen.ar(Env.perc(0.00001, MouseY.kr(1, 0.1, 1)), doneAction:2)}.play;
		1.wait;
	});
}
)

But what did that low pass filter do? It passes through the low frequencies, thus the name. A high pass filter will pass through the high frequencies. And a band pass filter (BPF) will pass through the frequencies of a frequency band that you specify. We can view the functionality of the low pass filter with the use of a frequency scope. Note also the quality parameter in the resonant low pass filter:

{LPF.ar(WhiteNoise.ar(0.4), MouseX.kr(100, 20000).poll(20, "cutoff"))}.freqscope;
{RLPF.ar(WhiteNoise.ar(0.4), MouseX.kr(100, 20000).poll(20, "cutoff"), MouseY.kr(0.01, 1).poll(20, "quality"))}.freqscope

Note how the Y location of the mouse affects the quality of the resonance in the resonant low pass filter (RLPF).

Filter Types

Filters are algorithms that are typically applied in the time domain of an audio signal. This might, for example, include adding a delayed copy of the signal to the original signal.

Here is a very primitive such filter:

{
var signal;
var delaytime = MouseX.kr(0.000022675, 0.001); // from one sample period (at 44.1 kHz) up to 1 ms
signal = Saw.ar(220, 0.5);
d =  DelayC.ar(signal, 0.6, delaytime); 
(signal + d).dup
}.play

Let us try some of the filter UGens of SuperCollider:

// low pass filter
{LPF.ar(WhiteNoise.ar(0.4), MouseX.kr(40,20000,1)!2) }.play;

// low pass filter with XLine
{LPF.ar(WhiteNoise.ar(0.4), XLine.kr(40,20000, 3, doneAction:2)!2) }.play;

// high pass filter
{HPF.ar(WhiteNoise.ar(0.4), MouseX.kr(40,20000,1)!2) }.play;

// band pass filter (the Q is controlled by the MouseY)
{BPF.ar(WhiteNoise.ar(0.4), MouseX.kr(40,20000,1), MouseY.kr(0.01,1)!2) }.play;

// Mid EQ filter attenuates or boosts a frequency band
{MidEQ.ar(WhiteNoise.ar(0.024), MouseX.kr(40,20000,1), MouseY.kr(0.01,1), 24)!2 }.play;

// what's happening here?
{
var signal = MidEQ.ar(WhiteNoise.ar(0.4), MouseX.kr(40,20000,1), MouseY.kr(0.01,1), 24);
BPF.ar(signal, MouseX.kr(40,20000,1), MouseY.kr(0.01,1)) !2
}.play;

Resonating filters

A resonant filter does what it says on the tin: it resonates certain frequencies. The bandwidth of this resonance can vary, so with a WhiteNoise input one could go from a very wide resonance (where the “quality” - the Q - of the filter is low) to a very narrow band resonance where the noise almost sounds like a sine wave. Let’s explore this with WhiteNoise and a band pass filter:

{BPF.ar(WhiteNoise.ar(0.4), MouseX.kr(100, 10000).poll(20, "cutoff"), MouseY.kr(0.01, 0.9999).poll(20, "rQ"))}.freqscope

Move your mouse around and explore how the Q factor, when increased, results in a narrower resonating bandwidth.

In low pass and high pass resonant filters, the energy at the cutoff frequency can be increased or decreased by setting the Q factor (or, in SuperCollider, the reciprocal (inverse) of Q, called rq).

// resonant low pass filter
{RLPF.ar(
	Saw.ar(222, 0.4), 
	MouseX.kr(100, 12000).poll(20, "cutoff"), 
	MouseY.kr(0.01, 0.9999).poll(20, "rQ")
)}.freqscope;
// resonant high pass filter
{RHPF.ar(
	Saw.ar(222, 0.4), 
	MouseX.kr(100, 12000).poll(20, "cutoff"), 
	MouseY.kr(0.01, 0.9999).poll(20, "rQ")
)}.freqscope;

There are bespoke resonance filters in SuperCollider, such as Resonz, Ringz and Formlet.

// resonant filter
{ Resonz.ar(WhiteNoise.ar(0.5), MouseX.kr(40,20000,1), 0.1)!2 }.play

// a short impulse won't resonate
{ Resonz.ar(Dust.ar(0.5), 2000, 0.1) }.play

// for that we use Ringz
{ Ringz.ar(Dust.ar(2, 0.6), MouseX.kr(200,6000,1), 2) }.play

// X is frequency and Y is ring time
{ Ringz.ar(Impulse.ar(4, 0, 0.3),  MouseX.kr(200,6000,1), MouseY.kr(0.04,6,1)) }.play

{ Ringz.ar(Impulse.ar(LFNoise2.ar(2).range(0.5, 4), 0, 0.3),  LFNoise2.ar(0.1).range(200, 3000), LFNoise2.ar(2).range(0.04, 6)) }.play

{ Mix.fill(10, {Ringz.ar(Impulse.ar(LFNoise2.ar(rrand(0.1, 1)).range(0.5, 1), 0, 0.1),  LFNoise2.ar(0.1).range(200, 12000), LFNoise2.ar(2).range(0.04, 6)) })}.play

{ Formlet.ar(Impulse.ar(4, 0.9), MouseX.kr(300,2000), 0.006, 0.1) }.play;

{ Formlet.ar(LFNoise0.ar(4, 0.2), MouseX.kr(300,2000), 0.006, 0.1) }.play;

Klank and DynKlank

Just as Klang is a bank of fixed frequency oscillators, i.e., additive synthesis, Klank is a bank of fixed frequency resonators: an input signal is passed through the bank, and energy outside the resonant frequencies is subtracted from the signal.

{ Ringz.ar(Dust.ar(3, 0.3), 440, 2) + Ringz.ar(Dust.ar(3, 0.3), 880, 2) }.play

//  using only one Dust UGen to trigger all the filters:
(
{ 
var trigger, freq;
trigger = Dust.ar(3, 0.3);
freq = 440;
Ringz.ar(trigger, freq, 2, 0.3) 		+ 
Ringz.ar(trigger, freq*2, 2, 0.3) 	+ 
Ringz.ar(trigger, freq*3, 2, 0.3) !2
}.play
)

// but there is a better way:

// Klank is a bank of resonators like Ringz, but the frequency is fixed. (there is DynKlank)

{ Klank.ar(`[[800, 1071, 1153, 1723], nil, [1, 1, 1, 1]], Impulse.ar(2, 0, 0.1)) }.play;

// whitenoise input
{ Klank.ar(`[[440, 980, 1220, 1560], nil, [2, 2, 2, 2]], WhiteNoise.ar(0.005)) }.play;

// AudioIn input
{ Klank.ar(`[[220, 440, 980, 1220], nil, [1, 1, 1, 1]], AudioIn.ar([1])*0.001) }.play;

Let’s explore the DynKlank UGen. It does the same as Klank, but it allows us to change the values after the synth has been instantiated.

{ DynKlank.ar(`[[800, 1071, 1353, 1723], nil, [1, 1, 1, 1]], Dust.ar(8, 0.1)) }.play;

{ DynKlank.ar(`[[200, 671, 1153, 1723], nil, [1, 1, 1, 1]], PinkNoise.ar([0.007,0.007])) }.play;

{ DynKlank.ar(`[[200, 671, 1153, 1723]*XLine.ar(1, [1.2, 1.1, 1.3, 1.43], 5), nil, [1, 1, 1, 1]], PinkNoise.ar([0.007,0.007])) }.play;

SynthDef(\dynklanks, {arg freqs = #[200, 671, 1153, 1723]; 
	Out.ar(0, 
		DynKlank.ar(`[freqs, nil, [1, 1, 1, 1]], PinkNoise.ar([0.007,0.007]))
	)
}).add

a = Synth(\dynklanks)
a.set(\freqs, [333, 444, 555, 666])
a.set(\freqs, [333, 444, 555, 666].rand)

We know resonant filters when we hear them. The typical cry-baby wah wah guitar pedal is a band pass filter, for example. In the examples below we use a SinOsc to “move” the band pass frequency up and down the frequency spectrum. The SinOsc is here effectively working as a LFO (Low Frequency Oscillator - usually with a frequency below 20 Hz).

{ BPF.ar(Saw.ar(440), 440+(3000* SinOsc.kr(2, 0, 0.9, 1))) ! 2 }.play;
{ BPF.ar(WhiteNoise.ar(0.5), 1440+(300* SinOsc.kr(2, 0, 0.9, 1)), 0.2) ! 2}.play;

Bell Synthesis using Subtractive Synthesis

The desired sound that you are trying to synthesize can be achieved through different methods. As an example, we could explore how to synthesize a bell sound with subtractive synthesis.

(
{
var chime, freqSpecs, burst, harmonics = 10;
var burstEnv, burstLength = 0.001;
freqSpecs = `[
	{rrand(100, 1200)}.dup(harmonics), //freq array
	{rrand(0.3, 1.0)}.dup(harmonics).normalizeSum, //amp array
	{rrand(2.0, 4.0)}.dup(harmonics)]; //decay rate array
burstEnv = Env.perc(0, burstLength); //envelope times
burst = PinkNoise.ar(EnvGen.kr(burstEnv, gate: Impulse.kr(1))*0.4); //Noise burst
Klank.ar(freqSpecs, burst)!2
}.play
)

This bell will be triggered every second, because the Impulse UGen triggers the opening of the gate in the EnvGen (envelope generator) that uses the percussion envelope defined in the ‘burstEnv’ variable. If we wanted this to happen only once, we could set the frequency of the Impulse to zero. And if we add a general envelope that frees the synth after it has played (with a doneAction of 2), we can run a task that triggers new bells at short random intervals:

(
Task({
	inf.do({
		{
		var chime, freqSpecs, burst, harmonics = 30.rand;
		var burstEnv, burstLength = 0.001;
		freqSpecs = `[
			{rrand(100, 8000)}.dup(harmonics), //freq array
			{rrand(0.3, 1.0)}.dup(harmonics).normalizeSum, //amp array
			{rrand(2.0, 4.0)}.dup(harmonics)]; //decay rate array
		burstEnv = Env.perc(0, burstLength); //envelope times
		burst = PinkNoise.ar(EnvGen.kr(burstEnv, gate: Impulse.kr(0))*0.5); //Noise burst
		Klank.ar(freqSpecs, burst)!2 * EnvGen.ar(Env.linen(0, 4, 0), doneAction: 2) 
		}.play;
		[0.125, 0.25, 0.5, 1].choose.wait;
	})
}).play
)

Simulating the Moog

The much loved MiniMoog is a typical subtractive synthesis synthesizer. A few oscillator types can be mixed together and subsequently passed through a characteristic resonant low pass filter. We could try to simulate a setting on the MiniMoog, using the MoogFF UGen, which simulates the Moog VCF (Voltage Controlled Filter) low pass filter, and choosing, say, a saw wave form (the MiniMoog also has triangle, square, and two pulse waves).

We would typically start by sketching our synth by hooking up the UGens in a .play or .freqscope:

{MoogFF.ar(Saw.ar(440), MouseX.kr(400, 16000), MouseY.kr(0.01, 4))}.freqscope

A common trick when simulating analogue equipment is to try to recreate the detuned oscillators of the analog synth (they are typically out of tune due to temperature differences within the synth itself). We can do this by adding another oscillator with a few Hz difference in frequency:

// here we add two Saws and split the signal into two channels
{ MoogFF.ar(Saw.ar(440, 0.4) + Saw.ar(442, 0.4), 4000 ) ! 2 }.freqscope
// like this:
{ ( SinOsc.ar(220, 0, 0.4) + SinOsc.ar(330, 0, 0.4) ) ! 2 }.play

// here we "expand" the input of the filter into two channels (the array)
{ MoogFF.ar([Saw.ar(440, 0.4), Saw.ar(442, 0.4)], 4000 )  }.freqscope
// like this - so different frequencies in each speaker:
{ [ SinOsc.ar(220, 0, 0.4), SinOsc.ar(330, 0, 0.4) ] }.play

// here we "expand" the saw into two channels, but sum them and then split into two
{ MoogFF.ar(Saw.ar([440, 442], 0.4).sum, 4000 ) ! 2 }.freqscope
// like this - and this is the one we'll use, although they're all fine:
{ SinOsc.ar( [220, 333], 0, 0.4) ! 2 }.play

We can then start to add arguments and prepare the synth graph for turning it into a SynthDef:

{ arg out=0, freq = 440, amp = 0.3, pan = 0, cutoff = 2, gain = 2, detune=2;
	var signal, filter;
	signal = Saw.ar([freq, freq+detune], amp).sum;
	filter = MoogFF.ar(signal, freq * cutoff, gain );
	Out.ar(out, Pan2.ar(filter, pan));
}.play

The two synth graphs above are pretty much the same, except we have removed the mouse input in the latter one. You can see that the frequency, amp, pan, and filter cutoff values are derived from the default arguments in the top line. There are only two things left for us to do in order to have a good working general synth: add an envelope, and wrap the graph up in a named SynthDef:

SynthDef(\moog, { arg out=0, freq = 440, amp = 0.3, pan = 0, cutoff = 2, gain = 2, detune = 2, gate=1;
	var signal, filter, env;
	signal = Saw.ar([freq, freq+detune], amp).sum;
	env = EnvGen.ar(Env.adsr(0.01, 0.3, 0.6, 1), gate: gate, doneAction:2);
	filter = MoogFF.ar(signal * env, freq * cutoff, gain );	
	Out.ar(out, Pan2.ar(filter, pan));
}).add;

a = Synth(\moog);
a.set(\freq, 222); // set the frequency of the synth
a.set(\cutoff, 4); // set the cutoff (this would cut off at the 4th harmonic. Why?)
a.set(\gate, 0); // kill the synth

We can now hook up a keyboard and play the \moog synth that we’ve designed. The MiniMoog is monophonic (only one note at a time), and it could be written like this:

(
MIDIIn.connectAll;
c = 4;
MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device;
	if(a.isNil, { // only create a new synth if the previous one has been freed
		a = Synth(\moog, [\freq, key.midicps, \amp, vel/127, \cutoff, c]);
	});
	[key, vel].postln; 
});
MIDIdef.noteOff(\myOffdef, {arg vel, key, channel, device; 
	a.release; 
	a = nil;
	[key, vel].postln; 
});
)
c = 10; // change the cutoff frequency at a later point 
// the 'c' variable could be set from a GUI or a MIDI controller

The “a == nil”, or “a.isNil”, check is there to make sure that we don’t press another note and overwrite the variable ‘a’ with another synth. If that happened, the noteOff method would free only the last synth put into variable ‘a’ and not the prior ones. Try to remove the condition and see what happens.

Finally, we might want to improve the MiniMoog and add a polyphonic feature. As we saw in an earlier chapter, we simply create an array with a slot for each of the 128 possible MIDI notes and turn them on and off:

a = Array.fill(128, { nil });
MIDIIn.connectAll;
MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device; 
	// we use the key as index into the array as well
	a[key] = Synth(\moog, [\freq, key.midicps, \amp, vel/127, \cutoff, 4]);
});
MIDIdef.noteOff(\myOffdef, {arg vel, key, channel, device; 
	a[key].release;
});

We will leave it up to you to decide how you want to control the cutoff and gain parameters of the MoogFF filter UGen. This could be done through knobs or sliders on a MIDI interface, on a GUI, or you could even decide to explore mapping key press velocity to the cutoff frequency, such that the note sounds brighter (or dimmer?) the harder you press the key.
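For example, a minimal sketch of the velocity-to-cutoff mapping (the range of 1 to 8 is an arbitrary choice here):

MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device; 
	// harder key presses open the filter further, giving a brighter sound
	a[key] = Synth(\moog, [\freq, key.midicps, \amp, vel/127, \cutoff, vel.linlin(0, 127, 1, 8)]);
});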

Chapter 7 - Modulation

Modulating one signal with another is one of the oldest and most common techniques in sound synthesis. Here, any parameter of an oscillator can be modulated by the output of another oscillator. Filters, PlayBufs (sound file players) and other things can also be modulated. In this chapter we will explore modulation, and in particular amplitude modulation (AM), ring modulation (RM) and frequency modulation (FM).

LFOs (Low Frequency Oscillators)

As mentioned, most parameters or controls of an oscillator can be controlled by the output of another. Low frequency oscillators (LFOs) are oscillators that typically operate under 20 Hz, although in SuperCollider there is little point in trying to define oscillators strictly as LFOs, as we might always want to increase that frequency to 40 or 400 Hz!

Here below are examples of an oscillator that has different controls modulated by another UGen.

In the first example we have the frequency of one oscillator modulated by the output (amplitude) of another:

{ SinOsc.ar( 440 * SinOsc.ar(1), 0, 0.4) }.play

We hear that the modulation is 2 Hz, not one. This is because the output of the modulating oscillator goes up to 1 and down to -1 in one second, and a frequency argument with a negative sign is effectively turned into a positive one (negative frequency does not make sense here), so the sweep happens twice per second. For one cycle of modulation per second, you would have to give the modulator a frequency of 0.5 Hz.
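We can verify this by halving the modulator frequency:

{ SinOsc.ar( 440 * SinOsc.ar(0.5), 0, 0.4) }.play // one modulation cycle per second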

Let’s try the same for amplitude:

{ SinOsc.ar( 440, 0, 0.4 * SinOsc.ar(1)) }.play
// or perhaps using LFPulse (which outputs 1 and 0s if the amp is 1)
{ SinOsc.ar( 440, 0, 0.4 * LFPulse.ar(2)) }.play

We thus get the familiar effects of vibrato (modulation of frequency) and tremolo (modulation of amplitude), commonly defined as:

// vibrato
{SinOsc.ar(440+SinOsc.ar(4, 0, 10), 0, 0.4) }.play
// tremolo
{SinOsc.ar(440, 0, SinOsc.ar(3, 0, 1)) }.play

In modulation synthesis we talk about a “modulator” (the oscillator that does the modulation) and the “carrier” which is the main signal being modulated.

// mouseX is the power of the vibrato
// mouseY is the frequency of the vibrato
{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseY.kr(20, 5), 0, MouseX.kr(5, 20)); 
	carrier = SinOsc.ar(440 + modulator, 0, 1);
	carrier ! 2 // the output
}.play

There are special Low Frequency Oscillators (LFOs) in SuperCollider. They are typically not band limited, which means that they start to alias (or mirror back) into the audible frequency range. Consider the difference between Saw (band-limited) and LFSaw (non-band-limited) here:

{Saw.ar(MouseX.kr(100, 10000), 0.5)}.freqscope
{LFSaw.ar(MouseX.kr(100, 10000), 0.5)}.freqscope

When you move your mouse, you can see how the band-limited Saw only gives you the harmonics above the fundamental frequency set by the mouse. With LFSaw, on the other hand, you get the harmonics mirroring back into the audible range at the Nyquist frequency (half the sampling rate, very often 22,050 Hz).

But the LF UGens are good for modulation, and we can typically run them at control rate (using .kr rather than .ar), which involves 64 times fewer calculations per second - that is, if the block size is set to 64 samples.

// LFSaw
{ SinOsc.ar(LFSaw.kr(4, 0, 200, 400), 0, 0.7) }.play

// LFTri
{ SinOsc.ar(LFTri.kr(4, 0, 200, 400), 0, 0.7) }.play
{ Saw.ar(LFTri.kr(4, 0, 200, 400), 0.7) }.play

// LFPar
{ SinOsc.ar(LFPar.kr(0.2, 0, 400,800),0, 0.7) }.play

// LFCub
{ SinOsc.ar(LFCub.kr(0.2, 0, 400,800),0, 0.7) }.play

// LFPulse
{ SinOsc.ar(LFPulse.kr(3, 1, 0.3, 200, 200),0, 0.7) }.play
{ SinOsc.ar(LFPulse.kr(3, 1, 0.3, 2000, 200),0, 0.7) }.play

// LFOs can also perform at audio rate
{ LFPulse.ar(LFPulse.kr(3, 1, 0.3, 200, 200),0, 0.7) }.play
{ LFSaw.ar(LFSaw.kr(4, 0, 200, 400), 0, 0.7) }.play
{ LFTri.ar(LFTri.kr(4, 0, 200, 400), 0, 0.7) }.play
{ LFTri.ar(LFSaw.kr(4, 0, 200, 800), 0, 0.7) }.play

Finally, we should note at the end of this section on LFOs that the LFO frequency can of course go as high as you would like, but then it ceases being an LFO and starts to do a different type of synthesis, which we will look at below. In the examples here, you will start to hear strange artefacts appearing when the oscillation goes up over 20 Hz (observe the post window).

{SinOsc.ar(440+SinOsc.ar(XLine.ar(4, 200, 10).poll(20, "mod freq:"), 0, 20), 0, 0.4) }.play
{SinOsc.ar(440, 0, SinOsc.ar(XLine.ar(4, 200, 10).poll(20, "mod freq:"), 0, 1)) }.play

Theremin

We have now obviously found the technique to create a Theremin using vibrato and tremolo:

// Using the MouseX to control amplitude
	{
		var f;
		f = MouseY.kr(4000, 200, 'exponential', 0.8);
		SinOsc.ar(
			freq: f+ (f*SinOsc.ar(7,0,0.02)),
			mul: MouseX.kr(0, 0.9)
		)
	}.play

// Using the MouseX to control vibrato speed
	{
		var f;
		f = MouseY.kr(4000, 200, 'exponential', 0.8);
		SinOsc.ar(
			freq: f+ (f*SinOsc.ar(3+MouseX.kr(1, 6),0,0.02)),
			mul: 0.3
		)
	}.play

Amplitude Modulation (AM synthesis)

In one of the examples above, the XLine UGen took the LFO frequency up over 20 Hz and we started to get some exciting artefacts in the sound. What was happening was that “sidebands” were appearing, i.e., partials on either side of the sine. Amplitude modulation modulates the amplitude of the carrier with unipolar values (that is, values between 0 and 1 - not bipolar (-1 to 1)).

In amplitude modulation, the sidebands are the sum and the difference of the carrier and the modulator frequency. For example, a 300 Hz carrier and 160 Hz modulator would generate 140 Hz and 460 Hz sidebands. However, the carrier frequency is always present.
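We can verify those numbers on the frequency scope, here with the values fixed rather than on the mouse:

// a 300 Hz carrier and a 160 Hz unipolar modulator: peaks at 140, 300 and 460 Hz
{ SinOsc.ar(300, 0, SinOsc.ar(160, 0, 0.5, 0.5)) ! 2 }.freqscope;

And with the mouse controlling both the modulator and the carrier frequencies: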

{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseX.kr(2, 20000, 1), 0, mul: 0.5, add: 0.5); // unipolar: 0 to 1
	carrier = SinOsc.ar(MouseY.kr(300, 2000), 0, modulator);
	carrier ! 2;
}.play

If there are harmonics in the wave being modulated, each of the harmonics will get sidebands as well - check the saw wave:

{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseX.kr(2, 2000, 1), mul: 0.5, add: 0.5); // unipolar: 0 to 1
	carrier = Saw.ar(533, modulator);
	carrier ! 2 // the output
}.play

In digital synthesis we can apply all kinds of mathematical operators to the sound, for example using .abs to calculate the absolute value of the modulator (this results in many sidebands - try also using .cubed and other unary operators on the signal).

{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseX.kr(2, 20000, 1)).abs;
	carrier = SinOsc.ar(MouseY.kr(200,2000), 0, modulator);
	carrier!2 // the output
}.play

Ring Modulation

As mentioned above, ring modulation uses bipolar modulation values (-1 to 1) whereas AM uses unipolar modulation values (0 to 1). This is why ordinary amplitude modulation outputs the original carrier frequency as well as the two sidebands for each of the spectral components of the carrier and modulation signals. Ring modulation, however, cancels out the carrier frequencies and simply outputs the sidebands.
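Comparing with the fixed-value AM example above, the 300 Hz carrier now disappears from the spectrum and only the 140 Hz and 460 Hz sidebands remain:

// bipolar modulator: the carrier is cancelled out
{ SinOsc.ar(300, 0, SinOsc.ar(160)) ! 2 }.freqscope;

And with the modulator frequency on the mouse: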

{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseX.kr(2, 200, 1));
	carrier = SinOsc.ar(333, 0, modulator);
	carrier!2;
}.play

Ring modulation was used much in the early electronic music studios, for example in Cologne, the BBC Radiophonic Workshop and so on. The Barrons used the technique in the music for Forbidden Planet, and so did Stockhausen in his Mikrophonie II, where voices are modulated with the sound of a Hammond organ. Let’s try to ring modulate a voice:

b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");
{
	var modulator, carrier;
	modulator = SinOsc.ar(MouseX.kr(20, 200, 1));
	carrier = PlayBuf.ar(1, b, 1, loop:1) * modulator;
	carrier ! 2;
}.play;

Here a sine wave is modulating the voice of a girl saying “Columbia this is Houston, over…”. We could use one sound file to ring modulate the output of another:

b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");
c = Buffer.read(s, "yourSound.wav");
c.play
{
	var modulator, carrier;
	modulator = PlayBuf.ar(1, c, 1, loop:1);
	carrier = PlayBuf.ar(1, b, 1, loop:1) * modulator;
	carrier ! 2;
}.play;

Frequency Modulation (FM Synthesis)

FM Synthesis is a popular synthesis technique that works well for a number of sounds. It became popular with the Yamaha DX7 synthesizer in the 1980s, but it was invented in the 1970s when John Chowning, musician and researcher at Stanford University, discovered the power of FM synthesis. He was working in the lab one day when he accidentally plugged the output of one oscillator into the frequency input of another and he heard a sound rich with partials (or sidebands, as we call them in modulation synthesis). It’s important to realise that at the time, an oscillator was expensive equipment, and the possibility of getting so many partials out of only two oscillators was very exciting in musical, engineering, and economical terms.

Chowning’s famous FM synthesis piece is called Stria and can be found on the interwebs. The piece was an eye opener for many musicians, as its sounds were so unusual in timbre, rendering the texture of the piece surprising and novel. Imagine being there at the time and hearing these “unnatural” sounds for the first time!

1980s synth pop music is of course full of the sounds of FM synthesis, from when musicians began using the DX7 synthesizer. They very often used the pre-installed sounds of the synth itself rather than making their own. One reason for this could be that FM synthesis is quite hard to learn, as there are so many parameters at play in any sound. Another explanation is that the user interface of the DX7 prevented people from designing sounds in an effective and ergonomic way, hence the lack of new and exploratory sound design using that synth.

{SinOsc.ar(1400 + SinOsc.ar(MouseX.kr(2,2000,1), 0, MouseY.kr(1,1000)), 0, 0.5)!2}.freqscope

Monitoring the frequency scope in the example above, you will see that when you move your mouse around, sidebands appear, evenly spaced around the carrier, and the more amplitude the modulator has, the more sidebands you get. Let’s explore the above example with comments, in order to get the terminology right:

// the same as above - with explanations:
{
SinOsc.ar(2000 	// the carrier and the carrier frequency
	+ SinOsc.ar(MouseX.kr(2,2000,1),  // the modulator and the modulator frequency
		0, 					  // the phase of the modulator
		MouseY.kr(1,1000) 		  // the modulation depth (index)
		), 
0,		// the carrier phase 
0.5)	// the carrier amplitude
}.play

What is happening is that we have a carrier oscillator (the first SinOsc) with a frequency of 2000 Hz. We then add to this frequency the output of another oscillator. Note that the amplitude of the modulator is very high: it goes up to 1000, which would become uncomfortable for your ears were you to play that on its own. So when you move the mouse across the x-axis, you notice that around the carrier frequency partial (of 2000Hz) there are appearing sidebands with the distance of the modulator frequency. That is, if the modulator frequency is 250 Hz, you get sidebands of 1750 and 2250; 1500 and 2500; 1250 and 2750, etc. The stronger the modulation depth, or the index, of the modulator (its amplitude basically), the louder the sidebands will become.
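To see exactly that spacing, we can fix the modulator frequency at 250 Hz and keep the modulation depth on the mouse (a variation of the patch above):

// sidebands at 2000 ± 250, 2000 ± 500, 2000 ± 750, etc.
{ SinOsc.ar(2000 + SinOsc.ar(250, 0, MouseY.kr(1, 1000)), 0, 0.5) ! 2 }.freqscope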

We could of course create all those sidebands with oscillators in an additive synthesis style, but note the efficiency of FM compared to Additive synthesis:

// FM
{PMOsc.ar(1000, 800, 12, mul: EnvGen.kr(Env.perc(0, 0.5), Impulse.kr(1)))}.play;
 // compared with additive synthesis:
{ 
Mix.ar( 
 SinOsc.ar((1000 + (800 * (-20..20))),  // we're generating 41 oscillators
  mul: 0.1*EnvGen.kr(Env.perc(0, 0.5), Impulse.kr(1))) 
)}.play 

Below are two patches that serve well to explore the power of simple FM synthesis. In the first one, an LFNoise0 UGen generates a new value four times per second; with a mul of 20 and an add of 60, its output lies between 40 and 80. This number will be a floating point number (a fractional number), so it is rounded to an integer. The number is then turned into a frequency value using .midicps (where a MIDI note value is turned into cycles per second).

{ var freq, ratio, modulator, carrier;
freq = LFNoise0.kr(4, 20, 60).round(1).midicps; 
ratio = MouseX.kr(1,4); 
modulator = SinOsc.ar(freq * ratio, 0, MouseY.kr(0.1,10));
carrier = SinOsc.ar(freq + (modulator * freq), 0, 0.5);
carrier	
}.play

// let's fork it and create a perc Env!
{	
	40.do({
			{ var freq, ratio, modulator, carrier;
			freq = rrand(60, 72).midicps; 
			ratio = MouseX.kr(0.5,2); 
			modulator = SinOsc.ar(freq * ratio, 0, MouseY.kr(0.1,10));
			carrier = SinOsc.ar(freq + (modulator * freq), 0, 0.5);
			carrier * EnvGen.ar(Env.perc(0, 1), doneAction:2)
		}.play;
		0.5.wait;
	});
}.fork

The PMOsc - Phase modulation

Frequency modulation and phase modulation are pretty much the same. In SuperCollider we have a PMOsc (Phase Modulation Oscillator), and we can try to make the above example using that:

{PMOsc.ar(1400, MouseX.kr(2,2000,1), MouseY.kr(0,1), 0)!2}.freqscope

You will note a feature in phase modulation, in that when the modulating frequency is low (< 20Hz), you don’t get the vibrato-like effect of the frequency modulation synth.

The magic of the PMOsc can be studied if we look under the hood. PMOsc is a pseudo-UGen, i.e., it is not written in C and compiled as a plugin for the SC-server, but rather defined when the class library of SuperCollider is compiled (on startup or if you hit Shift+Cmd+l)

How does the PMOsc work? Let’s check the source file (Cmd+i or Ctrl+i). You will see that the PMOsc.ar method simply returns (with the ^ symbol) a SinOsc with another SinOsc in the phase argument slot.

PMOsc  {
	*ar { arg carfreq,modfreq,pmindex=0.0,modphase=0.0,mul=1.0,add=0.0; 
		^SinOsc.ar(carfreq, SinOsc.ar(modfreq, modphase, pmindex),mul,add)
	}	
	*kr { arg carfreq,modfreq,pmindex=0.0,modphase=0.0,mul=1.0,add=0.0; 
		^SinOsc.kr(carfreq, SinOsc.kr(modfreq, modphase, pmindex),mul,add)
	}
}

Here are a few examples for studying the PM oscillator:

{ PMOsc.ar(MouseX.kr(500,2000), 600, 3, 0, 0.1) }.play; // modulate carfreq
{ PMOsc.ar(2000, MouseX.kr(200,1500), 3, 0, 0.1) }.play; // modulate modfreq
{ PMOsc.ar(2000, 500, MouseX.kr(0,10), 0, 0.1) }.play; // modulate index

The SuperCollider documentation presents a nice demonstration of the UGen that looks a bit like this:

e = Env.linen(2, 5, 2);
fork{
    inf.do({
        { LinPan2.ar(EnvGen.ar(e) 
			*
			PMOsc.ar(2000.0.rand,800.0.rand, Line.kr(0, 12.0.rand,9),0,0.1), 
			1.0.rand2)
			}.play;
        2.wait;
    })
}

Other examples of PM synthesis:

{ var freq, ratio;
freq = LFNoise0.kr(4, 20, 60).round(1).midicps; 
ratio = MouseX.kr(1,4); 
SinOsc.ar(freq, 				// the carrier and the carrier frequency
		SinOsc.ar(freq * ratio, 	// the modulator and the modulator frequency
		0, 					// the phase of the modulator
		MouseY.kr(0.1,10) 		// the modulation depth (index)
		), 
0.5)		// the carrier amplitude
}.play

Same patch without the comments, and with modulator and carrier put into variables:

{ var freq, ratio, modulator, carrier;
	freq = LFNoise0.kr(4, 20, 60).round(1).midicps; 
	ratio = MouseX.kr(1,4); 
	modulator = SinOsc.ar(freq * ratio, 0, MouseY.kr(0.1,10));
	carrier = SinOsc.ar(freq, modulator, 0.5);
	carrier	
}.play

The use of Envelopes in FM synthesis

Frequency modulation is a complex technique and Chowning’s initial research paper shows a wide range of applications of this synthesis method. For example, in the patch below, we have a much lower modulation amplitude (between 0 and 1) but we multiply the carrier frequency with the modulator.

(
var carrier, carFreq, carAmp, modulator, modFreq, modAmp; 
carFreq = 2000; 
carAmp = 0.2;		
modFreq = 327; 
modAmp = 0.2; 
{
	modAmp = MouseX.kr(0, 1); 	// choose normalized range for modulation
	modFreq = MouseY.kr(10, 1000, 'exponential');
	modulator = SinOsc.ar( modFreq, 0, modAmp);			
	carrier = SinOsc.ar( carFreq + (modulator * carFreq), 0, carAmp);
	[ carrier, carrier, modulator ]
}.play
)

And we can compare that technique with our initial FM example. In short, the frequency of the carrier is used as a parameter in the index (amplitude) of the modulator. These are design details, and there are multiple ways of using FM synthesis to arrive at the sound that you are after.

// current technique 
{ SinOsc.ar( 1400 + (SinOsc.ar( MouseY.kr(10, 1000, 1), 0, MouseX.kr(0, 1)) * 1400), 0, 0.5) ! 2 }.play
// our first example
{ SinOsc.ar(1400 + SinOsc.ar(MouseY.kr(10, 1000,1), 0, MouseX.kr(1,1000)), 0, 0.5) ! 2 }.play

One of the key techniques in FM synthesis is to use envelopes to control the parameters of the modulator. By changing the width and amplitude of the sidebands, we can get many interesting sounds, for example trumpets, mallets or bells.

Let us first create a basic FM synthesis synth definition and try to play it with diverse arguments:

SynthDef(\fmsynth, {arg outbus = 0, freq=440, carPartial=1, modPartial=1, index=3, mul=0.2, ts=1;
	var mod, car, env;
	// modulator frequency
	mod = SinOsc.ar(freq * modPartial, 0, freq * index );
	// carrier frequency
	car = SinOsc.ar((freq * carPartial) + mod, 0, mul );
	// envelope
	env = EnvGen.ar( Env.perc(0.01, 1), doneAction: 2, timeScale: ts);
	Out.ar( outbus, car * env)
}).add;

Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \ts, 1]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 2.5, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 3.5, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 4.0, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 300.0, \carPartial, 1.5, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 0.5, \ts, 2]);

Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \modPartial, 1, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 300.0, \carPartial, 1.5, \modPartial, 1, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 400.0, \carPartial, 1.5, \modPartial, 1, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 800.0, \carPartial, 1.5, \modPartial, 1, \ts, 2]);

Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \modPartial, 1, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \modPartial, 1.1, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \modPartial, 1.15, \ts, 2]);
Synth(\fmsynth, [ \outbus, 0, \freq, 600.0, \carPartial, 1.5, \modPartial, 1.2, \ts, 2]);
And here is a version where envelopes control the modulator’s frequency and index over the course of the note:

SynthDef(\fmsynthenv, {arg outbus = 0, freq=440, carPartial=1, modPartial=1, index=3, mul=0.2, ts=1;
	var mod, car, env;
	var modfreqenv, modindexenv;
	modfreqenv = EnvGen.kr(Env.perc(0.1, ts/10, 0.125))+1; // add 1 so we're not starting from zero
	modindexenv = EnvGen.kr(Env.sine(ts, 1))+1;
	mod = SinOsc.ar(freq * modPartial * modfreqenv, 0, freq * index * modindexenv);
	car = SinOsc.ar((freq * carPartial) + mod, 0, mul );
	env = EnvGen.ar( Env.perc(0.01, 1), doneAction: 2, timeScale: ts);
	Out.ar( outbus, Pan2.ar(car * env))
}).add;

Synth(\fmsynthenv, [ \freq, 440.0, \ts, 10]);
Synth(\fmsynthenv, [ \freq, 440.0, \ts, 1]);
Synth(\fmsynthenv, [ \freq, 110.0, \ts, 2]);

Chapter 8 - Envelopes and shaping sound

In both analog and digital synthesis, we typically operate with sound sources that are constantly running - whether those are analog oscillators or digital unit generators. This is great fun of course, and we can delight in altering parameters by turning knobs or setting control values, sculpting the sound we are after. However, this sound is not very musical. Hardly any musical instruments produce infinite sound; in instrumental sounds we typically get an initial burst of energy, after which the sound reaches some sort of equilibrium until it fades out.

The way we shape these sounds in both analog and digital synthesis is to use so-called “envelopes.” They wrap around our sound and give it the shape we’re after. Most people have, for example, heard about the ADSR envelope (where the shape is Attack, Decay, Sustain, and Release), which is one of the available envelopes in SuperCollider:

The shape of an ADSR envelope

Envelopes in SuperCollider come in two types: sustaining (un-timed) and non-sustaining (timed) envelopes. A gate is a trigger (a positive number) that holds the envelope open until it gets a message to close it (such as 0 or less). This is like a finger pressing down a key on a MIDI keyboard. If we were using an ADSR envelope, when the finger presses the key we would run the A (attack) and the D (decay), but the S (sustain) would then last as long as the finger is pressed. On R (release), when the finger releases the key, the R argument defines how long it takes for the sound to fade out. Synths with gated envelopes can therefore be of indefinite duration, i.e., their length is not set at the point of initialising the synth.

However, using a non-gated envelope, or a timed one, we set the duration of the sound at the time of triggering the synth. Here we don’t need to use a gate to trigger and release a synth.
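A minimal sketch of the two types side by side (the gate argument name is our own choice here):

// gated (un-timed): the envelope sustains until the gate closes
a = { arg gate = 1; SinOsc.ar(440, 0, 0.3) * EnvGen.kr(Env.adsr(0.01, 0.3, 0.5, 1), gate, doneAction: 2) ! 2 }.play;
a.set(\gate, 0); // "lift the finger": the release stage fades the sound out

// timed: the duration is fixed when the synth starts
{ SinOsc.ar(440, 0, 0.3) * EnvGen.kr(Env.perc(0.01, 2), doneAction: 2) ! 2 }.play;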

Envelope types

Envelopes are powerful as we can define precisely the shape of a sound. This could be the amplitude of a sound, but it could also be a definition of frequency, filter cutoff, and so on. Let’s look at a few common envelope types in SuperCollider:

Env.linen(1, 2, 3, 0.6).test.plot;
Env.triangle(1, 1).test.plot;
Env.sine(1, 1).test.plot;
Env.perc(0.05, 1, 1, -4).test.plot;
Env.adsr(0.2, 0.2, 0.5, 1, 1, 1).test.plot;
Env.asr(0.2, 0.5, 1, 1).test.plot;
Env.cutoff(1, 1).test(2).plot;
// using .new you can define your own envelope with as many points as you like
Env.new([0, 1, 0.3, 0.8, 0], [2, 3, 1, 4],'sine').test.plot;
Env.new([0,1, 0.3, 0.8, 0], [2, 3, 1, 4],'linear').test.plot;
Env.new({1.0.rand}!10, {2.0.rand}!9).test.plot;
Env.new({1.0.rand}!100, {2.0.rand}!99).test.plot;

Different sounds require different envelopes. For example, if we wanted to synthesise a snare sound, we might choose to use the .perc method of Env.

{ LPF.ar(WhiteNoise.ar(0.5), 2000) * EnvGen.ar(Env.perc(0.001, 0.5)) ! 2 }.play

// And more bespoke envelopes can be created with the .new method:
{ Saw.ar(EnvGen.ar(Env.sine(0.3).range(140, 120))) * EnvGen.ar(Env.new([0, 1, 0, 0.5, 0], [0.3, 0, 0.1, 0])) ! 2 }.play

// Note that above we are using a .sine envelope to modulate the frequency argument of the Saw UGen.

Envelopes define points in time that have a target value, duration and shape. So we can define the value, length and shape of each of the nodes. The .new method expects arrays for the value, duration and shape arguments. This can be very useful, as through a very simple syntax you can create complex transitions of value through time:

Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], \welch).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], \step).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], \sqr).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], [2, 0, 5, 3]).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], [0, 0, 0, 0]).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], [5, 5, 5, 5]).plot;
Env.new([0, 1, 0.5, 1, 0], [1, 2, 3, 2], [20, -20, 20, 20]).plot;

The last array defines the curve where 0 is linear, positive number curves the segment up, and a negative number curves it down. Check the Env documentation for further explanation.

The EnvGen - Envelope Generator

The envelope itself does nothing. It is simply a description of a form: values in time and the shape of the line between those values. If we want to apply this envelope to a signal, we need to use the EnvGen UGen to play the envelope within a synth graph. Note that EnvGen runs either at audio rate (.ar) or control rate (.kr). The envelope generator’s arguments are the following:

EnvGen.ar(envelope, gate, levelScale, levelBias, timeScale, doneAction)

where the first argument is the envelope (for example Env.perc(0.1, 1)); the second is the gate (not used with timed envelopes, but since the gate argument defaults to 1 it triggers the envelope on synth creation); the third is levelScale, which scales the levels (such as amplitude) of the envelope; the fourth is levelBias, which offsets the envelope’s breakpoints; the fifth is timeScale, which can shorten or stretch the envelope (so a one-second Env.sine(1) could become ten seconds long); and finally we have the doneAction, which defines what happens to the synth instance after the envelope has done its job.
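As a minimal sketch of the levelScale and timeScale arguments (the values here are arbitrary):

// a one-second Env.sine stretched to five seconds and scaled to half level
{ SinOsc.ar(440) * EnvGen.kr(Env.sine(1), 1, levelScale: 0.5, timeScale: 5, doneAction: 2) ! 2 }.play;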

doneActions

The doneActions are an important aspect of how the SC server works. One of the key strengths of SuperCollider is how cheaply a synth can be created and removed, making it useful for granular synthesis or the playback of notes. Here a grain or a note can be a synth that exists for 20 milliseconds or 20 minutes. Users of data-flow languages, such as Pure Data, will appreciate how useful this is, as synths can be spawned at will and don’t need to be hard-wired beforehand.

When the synth has exceeded its lifetime through the function of the envelope, it will typically become silent. However, we don’t want silent synths to pile up after they have played; we want to free the server of them. Unused synths still run and use up processing power (CPU), and can eventually cause distortion in the sound, for example if hundreds of synths have not been freed from the server and are still running.
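You can watch the synth count in the server status bar at the bottom of the IDE, or query the server’s status fields from code:

s.numSynths // the number of synths currently on the server
s.avgCPU // the server's average CPU usage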

The doneActions are the following:

  • 0 - Do nothing when the envelope has ended.
  • 1 - Pause the synth; it remains resident on the server.
  • 2 - Remove the synth and deallocate it.
  • 3 - Remove and deallocate both this synth and the preceding node.
  • 4 - Remove and deallocate both this synth and the following node.
  • 5 - Same as 3. If the preceding node is a group then free all members of the group.
  • 6 - Same as 4. If the following node is a group then free all members of the group.
  • 7 - Same as 3. If the synth is part of a group, free all preceding nodes in the group.
  • 8 - Same as 4. If the synth is part of a group, free all following nodes in the group.
  • 9 - Same as 2, but pause the preceding node.
  • 10 - Same as 2, but pause the following node.
  • 11 - Same as 2, but if the preceding node is a group then free its synths.
  • 12 - Same as 2, but if the following node is a group then free its synths.
  • 13 - Frees the synth and all preceding and following nodes.

DoneActions are used with the EnvGen UGen all the time, and it is important not to forget them. However, there are other UGens in SuperCollider that can also free their enclosing synth when an event has happened - such as finishing the playback of a sample buffer. These UGens are the following (a short example follows the list):

  • PlayBuf and RecordBuf - doneAction when the buffer has been played or recorded.
  • Line and XLine - doneAction when a line has ended.
  • Linen - doneAction when the envelope is finished.
  • LFGauss - doneAction after the completion of a cycle.
  • DemandEnvGen - Similar to EnvGen.
  • DetectSilence - doneAction when the UGen detects silence below a threshold.
  • Duty and TDuty - doneAction evaluated when a duty stream ends.
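For example, a Line can free its enclosing synth when it reaches its end value; a minimal sketch:

// fade out over 2 seconds, then free the synth when the line ends
{ SinOsc.ar(440, 0, Line.kr(0.2, 0, 2, doneAction: 2)) ! 2 }.play;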
Let’s define a simple SynthDef to explore the different doneActions in practice:

SynthDef(\sine, {arg freq=440, amp=0.1, gate=1, dA = 2;
	var signal, env;
	signal = SinOsc.ar(freq, 0, amp);
	env = EnvGen.ar(Env.adsr(0.2, 0.2, 0.5, 0.3, 1, 1), gate, doneAction: dA);
	Out.ar(0, Pan2.ar(signal * env, 0));
}).add

s.plotTree // watch the nodes appearing on the server tree

In the examples below, when you add a node it is always added at the top (head) of the node tree; this is what the SC server does by default. Synths can be added anywhere in the tree though, but that will be discussed later, in the chapter on busses, nodes and groups. [xxx, 15. ]

// doneAction = 0
a = Synth(\sine, [\dA, 0])
a.release
a.set(\gate, 1)

// doneAction = 1
a = Synth(\sine, [\dA, 1])
a.release
a.set(\gate, 1)
a.run(true)

// doneAction = 2
a = Synth(\sine, [\dA, 2])
a.release
a.set(\gate, 1) // it's gone! (see server synth count)

// doneAction = 3
a = Synth(\sine, [\dA, 3])
b = Synth(\sine, [\freq, 660, \dA, 3])
a.release

// doneAction = 3
a = Synth(\sine, [\dA, 3])
b = Synth(\sine, [\freq, 660, \dA, 3], addAction:\addToTail)
b.release

// doneAction = 3
a = Synth(\sine, [\freq, 440, \dA, 3])
b = Synth(\sine, [\freq, 660, \dA, 3])
c = Synth(\sine, [\freq, 880, \dA, 3])
b.release // will release b and c

// doneAction = 4
a = Synth(\sine, [\freq, 440, \dA, 4])
b = Synth(\sine, [\freq, 660, \dA, 4])
c = Synth(\sine, [\freq, 880, \dA, 4])
b.release // will release a and b

// doneAction = 5
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 5])
c.release // will only free c (itself)

// doneAction = 5
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 5], addAction:\addToTail)
c.release // will free itself and the preceding group

// doneAction = 6
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 6])
c.release // will free itself and the following group

// doneAction = 7
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g )
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 7], target:g)
d = Synth(\sine, [\freq, 1100, \dA, 0], target:g)
e = Synth(\sine, [\freq, 1300, \dA, 0], target:g)
c.release // will free itself and preceding nodes in a group

// doneAction = 8
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 8], target:g)
d = Synth(\sine, [\freq, 1100, \dA, 0], target:g)
e = Synth(\sine, [\freq, 1300, \dA, 0], target:g)
c.release // will free itself and following nodes in the group

// doneAction = 9
a = Synth(\sine, [\freq, 440, \dA, 9])
b = Synth(\sine, [\freq, 660, \dA, 0])
a.release // will free itself and pause the preceding node
b.run(true) // it was only paused

// doneAction = 10
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 10])
d = Synth(\sine, [\freq, 1100, \dA, 0])
c.release // will free itself and pause the following node (here the group g)
g.run(true) // it was only paused

// doneAction = 11
a = Synth(\sine, [\freq, 440, \dA, 11])
b = Synth(\sine, [\freq, 660, \dA, 0])
a.release // will free itself and the preceding node

// doneAction = 12
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 12])
d = Synth(\sine, [\freq, 1100, \dA, 0])
c.release // will free itself; the following node is a group, so its synths are freed

// doneAction = 13
g = Group.new;
a = Synth(\sine, [\freq, 440, \dA, 0], target:g)
b = Synth(\sine, [\freq, 660, \dA, 0], target:g)
c = Synth(\sine, [\freq, 880, \dA, 13])
d = Synth(\sine, [\freq, 1100, \dA, 0])
x = Synth(\sine, [\freq, 2100, \dA, 0])
e = Synth(\sine, [\freq, 1300, \dA, 0])
c.release // will free itself and all preceding and following nodes

Triggers and Gates

The difference between a gated and timed envelope has become clear in the above examples, but to put it in very simple terms, think of the piano as having a timed envelope (as the note dies after a while), but the organ as having a gated envelope (as the note only stops when the key is released). For user input it is good to be able to keep the envelope open as long as the user wants and free it at some event, such as releasing a key (or a person exiting a room in a sound installation).

Gates

Gates are typically used to start a sound that contains an envelope of some sort. They ‘open up’ for a flow of values to pass through for a period of time (timed or untimed). When a gate closes, it typically runs the release part of the envelope being used.

d = Synth(\sine, [\freq, 1100]) // key down
d.release // key up

// compare with
d = Synth(\sine, [\freq, 840]) // key down
d.free // kill immediately

// gate holds the EnvGen open. Here using Dust (random impulses) to trigger a new envelope
{EnvGen.ar(Env.adsr(0.001, 0.8, 1, 1), Dust.ar(1)) *  Saw.ar(55)!2}.play

// Here using Impulse (periodic impulses)
{EnvGen.ar(Env.adsr(0.001, 0.8, 1, 1), Impulse.ar(2)) * SinOsc.ar(LFNoise0.ar(2).range(200, 1000))!2}.play

// With a doneAction: 2 we kill the synth after the first envelope
{EnvGen.ar(Env.adsr(0.001, 0.8, 0.1, 0.1), Impulse.ar(2), doneAction:2) * SinOsc.ar(2222)!2}.play

// but if we increase the release time of the envelope, it will be retriggered before the doneAction can kill it
{EnvGen.ar(Env.adsr(0.001, 0.8, 0.1, 1), Impulse.ar(2), doneAction:2) * SinOsc.ar(1444)!2}.play

Triggers are similar to gates in that they start a process, but they do not have the release function that gates have. They are simply used to trigger envelopes.

trigger rate - Arguments that begin with “t_” (e.g. t_trig), or that are specified as \tr in the def’s rates argument (see below), will be made as a TrigControl. Setting the argument will create a control-rate impulse at the set value. This is useful for triggers.

Triggers

In the example above we saw how Dust and Impulse could be used to trigger an envelope. The trigger can be set from anywhere (code, GUI, system, etc.), but we need to prefix trigger arguments with “t_”.

(
a = { arg t_gate = 1;
	var freq;
	freq = EnvGen.kr(Env.new([200, 200, 800], [0, 1.6]), t_gate);
     SinOsc.ar(freq, 0, 0.2) ! 2 
}.play;
)

a.set(\t_gate, 1)  // try to evaluate this line repeatedly
a.free // if you observe the server window you see this synth disappearing

(
a = { arg t_gate = 1;
	var env;
	env = EnvGen.kr(Env.adsr, t_gate);
     SinOsc.ar(888, 0, 1 * env) ! 2 
}.play;
)

a.set(\t_gate, 1)  // repeat this
a.free // free the synth (since it didn't have a doneAction:2)

// If you are curious about what doneAction:2 would have done, try this:
(
a = { arg t_gate = 1;
	var env;
	env = EnvGen.kr(Env.adsr, t_gate, doneAction:2);
     SinOsc.ar(888, 0, 1 * env) ! 2 
}.play;
)

a.set(\t_gate, 1)  // why does this line not retrigger the synth?
// Now try the same with doneAction:0

If you want to keep the same synth running on the server and trigger it from another process than the one controlling its synthesis parameters, you can use gates and triggers for the envelope. Use doneAction:0 so the synth stays on the server before and after the envelope is triggered.

Let’s turn the examples above into SynthDefs and explore the concept of gates:

SynthDef(\trigtest, {arg freq, amp, dur=1, gate;
	var signal, env;
	env = EnvGen.ar(Env.adsr(0.01, dur, amp, 0.7), gate, doneAction:0); 
	signal = SinOsc.ar(freq) * env;
	Out.ar(0, signal);
}).add

a = Synth(\trigtest, [\freq, 333, \amp, 1, \gate, 0]) // gate is 0, no sound
a.set(\gate, 1)
a.set(\gate, 0)

// the synth is still running, even if it is silent
a.set(\freq, 788) // change the frequency

a.set(\gate, 1)
a.set(\gate, 0)

The example below does the same, but here with a fixed-time envelope. Since such an envelope runs through all its segments once and then finishes, it does not work with gates. We need a trigger to bring it back to life.

// here we use a t_trig to retrigger the synth
SynthDef(\trigtest2, {arg freq, amp, dur=1, t_trig;
	var signal, env;
	env = EnvGen.ar(Env.perc(0.01, dur, amp), t_trig, doneAction:0); 
	signal = SinOsc.ar(freq) * env;
	Out.ar(0, signal);
}).add

a = Synth(\trigtest2, [\freq, 333, \amp, 1, \t_trig, 1])

a.set(\freq, 788)
a.set(\t_trig, 1);
a.set(\amp, 0.28)
a.set(\t_trig, 1);

a.set(\freq, 588)
a.set(\t_trig, 1);
a.set(\amp, 0.8)
a.set(\t_trig, 1);

Exercise: Explore the difference between a gate and a trigger.

MIDI Keyboard Example

The techniques we’ve been exploring above are useful when creating user interfaces for your synths. As an example, we could create a synth definition to be controlled by a MIDI controller. Other uses could be networked communication, input from other software, or running musical patterns within SuperCollider itself. In the example below we build upon the example we did in chapter 4, but here we add pitch bend and vibrato.

MIDIIn.connectAll; // we connect all the incoming devices
MIDIFunc.noteOn({arg ...x; x.postln; }); // we post all the args

//First we create a synth definition for this example:
SynthDef(\moog, {arg freq=440, amp=1, gate=1, pitchBend=0, cutoff=20, vibrato=0;
	var signal, env;
	signal = LPF.ar(VarSaw.ar([freq, freq+2]+pitchBend+SinOsc.ar(vibrato, 0, 1, 1), 0, XLine.ar(0.7, 0.9, 0.13)), (cutoff * freq).min(16000));
	env = EnvGen.ar(Env.adsr(0), gate, levelScale: amp, doneAction:2);
	Out.ar(0, signal*env);
}).add;

a = Array.fill(127, { nil }); // create an array of nils, where the Synths will live
g = Group.new; // we create a Group to be able to set cutoff of all active notes
c = 6;
MIDIdef.noteOn(\myOndef, {arg vel, key, channel, device; 
	// we use the key as index into the array as well
	a[key] = Synth(\moog, [\freq, key.midicps, \amp, vel/127, \cutoff, 10], target:g);
});
MIDIdef.noteOff(\myOffdef, {arg vel, key, channel, device; 
	a[key].release;
	a[key] = nil; // we put nil back in the array as we use it in the if-statements below
});

MIDIdef.bend(\myPitchBend, { arg val; 
	c = val.linlin(0, 16383, -10, 10); // pitch bend values range from 0 to 16383
	"Pitch Bend : ".post; val.postln;
	a.do({arg synth; 
		if( synth != nil , { synth.set(\pitchBend, c ) }); 
	});	
});

MIDIdef.cc(\myVibrato, { arg val; 
	c = val.linlin(0, 127, 1, 20); 
	"Vibrato : ".post; val.postln;
	a.do({arg synth; 
		if( synth != nil , { synth.set(\vibrato, c ) }); 
	});	
}, 1); // here listening to continuous controller 1 (the modulation wheel)

Chapter 9 - Samples and Buffers

SuperCollider offers multiple ways of working with recorded sound. Sampling is one of the key techniques of computer music programming today, originating in tape-based instruments such as the Chamberlin or the Mellotron, but popularised in digital systems with samplers like the E-mu Emulator and the Akai S-series. Sampled sound is also the source of more recent techniques, such as granular and concatenative synthesis.

The first thing we need to know is that a sampled sound is a collection of amplitude values (samples) in an array. At a 44.1 kHz sample rate we would have 44100 samples in the array if our sound was one second long, and twice that amount if our sound was stereo.

We could therefore generate one second of white noise like this:

Array.fill(44100, {1.0.rand2});

The interesting question then is: how do we play these samples? What mechanism will read this and send it to the sound card? For that we use Buffers and UGens that can read them, such as PlayBuf.

Buffers

In short, a buffer is a collection of values in the memory of the computer. In SuperCollider, buffers are loaded onto the server not the language, so in our white noise example above, we would have to find a way to move our collection of values from the language to the server (as that’s where they would be played). Buffers can be used to contain all kinds of values in addition to sound, for example control data, gestural data from human movement, sonification data, and so on.

Allocating a Buffer

In order to create a buffer, we need to allocate it on the server. This is done through an .alloc method:

b = Buffer.alloc(s, 44100 * 4.0, 1); // 4 seconds of sound on a 44100 Hz system, 1 channel

// in the post window we get this information:
//  - > Buffer(0, 176400, 1, 44100, nil) // bufnum, number of samples, channels, sample-rate, path

// If you run the line again, you will see that the bufnum has increased by 1.

// and we can get to this information by calling the server:
b.bufnum;

c = Buffer.alloc(s, 44100 * 4.0, 2); // same but now 2 channels

// This means that we now have twice the amount of samples, but same amount of frames
b.numFrames;
c.numFrames;

// and the number of channels
b.numChannels;
c.numChannels;

// It's clear though that 'c' has twice the amount of samples, even if both buffers have an equal amount of frames

b.numFrames * b.numChannels;
c.numFrames * c.numChannels;

As mentioned, buffers are collections of values in the RAM (Random Access Memory) of the computer. This means that the playhead can jump back and forth in the sound, play it fast or slow, backwards or forwards, and so on. But it also means that, unlike soundfile playback from disk (where sound is buffered in at regular intervals), the whole sound is stored in the memory of the computer. Try opening your Terminal and running the following command, then evaluate the SuperCollider line below it:

top

a = Array.fill(10, {Buffer.alloc(s,44100 * 8.0, 2)});

// You will see how the memory of the process called scsynth increases
// (scsynth is the name of the SuperCollider server process)

// now run the following line and watch when the memory is de-allocated.
10.do({arg i; a[i].free;})

We have now allocated some buffers on the server, but they only contain zeros. Try playing one:

b.play
// We can load the samples from the server into an array ('a') in the language to check
// This means that the server will send the values from the server to the language over OSC.
b.loadToFloatArray(action: {arg array; a = array; a.postln;})

a.postln // and we see lots of 0s.

If we wanted to listen to the noise we created above, we could simply load the array into the buffer.

a = Array.fill(44100, {1.0.rand2}); // 1 second of noise (in an array in the language)
b = Buffer.loadCollection(s, a); // this line loads the array into the buffer (on the server)
b.play // and now we have a beautiful noise!

// We could then observe the samples by getting it back to the language like we did above:
a = Array.fill(44100, {arg i; i=i/10; sin(i)}); // fill a buffer with a sine wave
b = Buffer.loadCollection(s, a); // load the array onto the server
b.play // and now we have a beautiful sine!
b.loadToFloatArray(action: {arg array; a = array; Post << a}) // lots of samples

Reading a soundfile into a Buffer

We can read a sound file into a buffer simply by providing the path to it. The path is either absolute or relative to the SuperCollider application (so ‘hello.aif’ could be loaded if it sat next to the SuperCollider application). Note that the IDE allows you to drag a file from your file system into the code document, and the full path appears.

b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");
b.bufnum; // let's check its bufnum

{ PlayBuf.ar(1, b) ! 2 }.play // the first argument is the number of channels

// We can wrap this into a SynthDef, of course
(
SynthDef(\playBuf,{ arg out = 0, bufnum;
	var signal;
	signal = PlayBuf.ar(1, bufnum, BufRateScale.kr(bufnum));
	Out.ar(out, signal ! 2)
}).add
)
x = Synth(\playBuf, [\bufnum, b.bufnum]) // we pass in either the buffer or the buffer number

x.free; // free the synth 
b.free; // free the buffer

// for many buffers, the typical thing to do is to load them into an array:
b = Array.fill(10, {Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav")});

// and then we can access it from the index in the array
x = Synth(\playBuf, [\bufnum, b[2].bufnum])

Since PlayBuf needs the number of channels of the sound file to be fixed in the SynthDef, users need to make sure the channel count is right, so people often come up with systems like this in their code:

b = Buffer.read(s, Platform.userAppSupportDir+/+"sounds/a11wlk01.wav");

SynthDef(\playMono, { arg out=0, buffer, rate=1;
	Out.ar(out, PlayBuf.ar(1, buffer, rate, loop:1) ! 2)
}).add;

SynthDef(\playStereo, { arg out=0, buffer, rate=1;
	Out.ar(out, PlayBuf.ar(2, buffer, rate, loop:1)) // no "! 2"
}).add;

// And then
if(b.numChannels == 1, {
	x = Synth(\playMono, [\buffer, b]) // we pass in either the buffer or the buffer number
}, {
	x = Synth(\playStereo, [\buffer, b]) // we pass in either the buffer or the buffer number
});

Note that we don’t need the “! 2” in the stereo version: duplicating the stereo signal would make the left channel expand into the right channel’s bus (adding to the right channel), whereas the right channel would expand onto Bus 3:

[ Bus 1, Bus 2, Bus 3, Bus 4, Bus 5, etc. ]
[ Left , Right ]
        [ Left , Right ]

Let us play a little with Buffer playback in order to get a feel for the possibilities of sound stored in random access memory.

// Change the playback speed
{Pan2.ar(PlayBuf.ar(1, b, MouseX.kr(-1,2), loop:1))}.play

// Scratch around in the file
{ PlayBuf.ar(1, b, MouseX.kr(-1.5, 1.5), loop: 1) ! 2 }.play

// Or perhaps a bit more excitingly 
{
	var speed;
	speed = MouseX.kr(-10, 10);
	speed = speed - DelayN.kr(speed, 0.1, 0.1);
	speed = MouseButton.kr(1, 0, 0.3) + speed ;
	PlayBuf.ar(1, b, speed, loop: 1) ! 2;
}.play

// Another version
{BufRd.ar(1, b, Lag.ar( K2A.ar( MouseX.kr(0,1)) * BufFrames.ir(b), 1))!2}.play

// Jumping to a random location in the buffer using LFNoise0
{PlayBuf.ar(1, b, 1, LFNoise0.ar(12)*BufFrames.ir(b), loop:1)!2}.play

// And so on ….

Recording live sound into a Buffer

Live sound can of course be fed directly into a Buffer for further manipulation. This could be useful if you are recording the sound, transforming it, overdubbing, cutting it up, scratching, and so on. However, in many cases a simple SoundIn UGen might be sufficient (and no Buffers used).

b = Buffer.alloc(s, 44100 * 4.0, 1); // 4 second mono buffer
// Warning, you might get feedback if you're not using headphones
{ RecordBuf.ar(SoundIn.ar(0), b); nil }.play; // run this for at least 4 seconds
{ PlayBuf.ar(1, b) }.play; // play it back

SuperCollider really makes this simple. However, RecordBuf does more than simply record: since it loops, you can also overdub onto the data already in the buffer using the preLevel argument. preLevel is the factor that the existing buffer contents are multiplied by before the incoming sound is added to them. We can now explore this in a more SuperCollider way of doing things, with SynthDefs and Synths.

SynthDef(\recBuf, { arg buffer=0, recLevel=0.5, preLevel=0.5;
	var in;
	in = SoundIn.ar(0);
	RecordBuf.ar(in, buffer, 0, recLevel, preLevel, loop:1);
}).add;

// we record into the buffer
x = Synth(\recBuf, [\buffer, b, \preLevel, 0]);
x.free;

// and we can play it back using the playBuf synthdef we created above
z = Synth(\playMono, [\buffer, b])
z.free;

// We could also explore the overdubbing of sound (leave this running)
(
x = Synth(\recBuf, [\buffer, b]); // here preLevel is 0.5 by default
z = Synth(\playMono, [\buffer, b, \rate, 1.5]); 
)

// Change the playback rate of the buffer
z.set(\rate, 0.75);

// if we like what we have recorded, we can easily write it to disk as a soundfile:
b.write("myBufRecording.aif", "AIFF", 'int16');

It is clear that playing with the recLevel and preLevel of a buffer recording can create interesting layers of sound, where instrumentalists record on top of what they have already recorded. People could also engage in an “I Am Sitting in a Room” exercise à la Lucier.

Finally, as mentioned at the beginning of this chapter, buffers can contain any data and are not necessarily bound to audio content. In the example below we use the buffer to record mouse values at control rate (which is sample rate / block size) and write that mouse movement to disk in the form of an audio file.

b = Buffer.alloc(s, (s.sampleRate/s.options.blockSize) * 5, 1); // 5 secs of control rate
{RecordBuf.kr(MouseY.kr, b); SinOsc.ar(1000*MouseY.kr) }.play // recording the mouse
b.write("mouse.aif") // write the buffer to disk, aif is as good format as any

// play it back
b = Buffer.read(s, "mouse.aif")
{SinOsc.ar(1000*PlayBuf.kr(1, b))}.play

BufRd and BufWr

There are other UGens that can be helpful when playing back buffers. BufRd (buffer read) and BufWr (buffer write) are good examples of this, and so is LoopBuf (from the sc3-plugins extensions distribution).

In the example below we use a Phasor to ‘drive’ the reading of the buffer. The Phasor outputs the index of the sample to read, so the buffer is read sample by sample, here sweeping between the start and end frames we provide:

{ BufRd.ar(1, b, Phasor.ar(0, 1, 0, BufFrames.kr(b))) }.play;

// This way we can easily use SinOsc to modulate the play rate
{ BufRd.ar(1, b, Phasor.ar(0, SinOsc.ar(1).range(0.5, 1.5), 0, BufFrames.kr(b))) }.play;

// And we can also use the mouse to drive the reading 
b = Buffer.read(s, "sounds/a11wlk01.wav");

// Move the mouse!
SynthDef(\scratch, {arg bufnum, pitch=1, start=0, end;
	var signal;
	signal = BufRd.ar(1, bufnum, Lag.ar(K2A.ar(MouseX.kr(1, end)), 0.4));
	Out.ar(0, signal!2);
}).play(s, [\bufnum, b.bufnum, \end, b.numFrames]);

Streaming from disk

If your soundfile is very long, it is probably a good idea to stream the sound from disk, like most digital audio workstations do. This is because long stereo files would quickly fill up your RAM if working with many sound files.

// We still need a buffer (but we are cueing it, i.e. not filling)
b = Buffer.cueSoundFile(s, Platform.resourceDir +/+ "sounds/a11wlk01-44_1.aiff", 0, 1);

SynthDef(\playcuedBuf,{ arg out = 0, bufnum;
	var signal;
	signal = DiskIn.ar(1, bufnum, loop:1);
	Out.ar(out, signal ! 2)
}).add;

x = Synth(\playcuedBuf, [\bufnum, b]);

Wavetables and wavetable look-up oscillators

Wavetable synthesis is a classic method of sound synthesis. It works similarly to reading a Buffer with BufRd above, but here we create a bespoke wavetable (which can often be visualised for manipulation) and use wavetable look-up oscillators to play back the content of the wavetable. In fact many of the oscillators in SuperCollider use wavetable look-up under the hood, SinOsc being a good example.

Let’s start with creating a SynthDef with an Osc (which is a wavetable look-up oscillator). It expects to get a signal in the form of a SuperCollider Wavetable, which is a special format for interpolating oscillators.

(
SynthDef(\wavetable,{ arg out = 0, buffer;
	var signal;
	signal = Osc.ar(buffer, MouseX.kr(60,300)); // mouseX controlling pitch
	Out.ar(out, signal ! 2)
}).add
)

// we then allocate a Buffer with 512 samples (the buffer size must be a power of 2)
b = Buffer.alloc(s, 512, 1); 
b.sine1(1.0, true, true, true); // and we fill it with a sine wave

b.plot // notice something strange?
b.getToFloatArray(action: { |array|  { array[0, 2..].plot }.defer }); // check this

// let's listen to it
a = Synth(\wavetable, [\buffer, b])
a.free;

// You can hear that it sounds very different from a PlayBuf trying to play the same file (and here we get aliasing), since the PlayBuf is not band limited:

{PlayBuf.ar(1, b, MouseX.kr(-1, 10), loop:1)}.play;

// We can then create different waveforms
b.sine1(1.0/[1,2,3,4], true, true, true); //
b.getToFloatArray(action: { |array|  { array[0, 2..].plot }.defer }); // view the wave
a = Synth(\wavetable, [\buffer, b])
a.free;

// A saw wave
b.sine1(0.3/Array.series(90,1,1)*2, false, true, true);
b.getToFloatArray(action: { |array|  { array[0, 2..].plot }.defer });
a = Synth(\wavetable, [\buffer, b])
a.free;

// Random numbers
b.sine1(Array.fill(50, {1.0.rand}), true, true, true);
b.getToFloatArray(action: { |array|  { array[0, 2..].plot }.defer });

a = Synth(\wavetable, [\buffer, b])
a.free;

// We can also use an envelope to fill a buffer
a = Env([0, 1, 0.2, 0.3, -1, 0.3, 0], [0.1, 0.1, 0.1, 0.1, 0.1, 0.1], \sin);
a.plot; // view this envelope 

// But we need to turn the envelope into a signal and then into a wavetable
c = a.asSignal(256).asWavetable;
c.size; // the size of the wavetable is twice the size of the signal... 512

// now we need to put this wavetable into a buffer:
b = Buffer.alloc(s, 512);
b.setn(0, c);

// play it
a = Synth(\wavetable, [\buffer, b])
a.free;

// try to load the above without turning the data into a wavetable, i.e.,
a = Env([0, 1, 0.2, 0.3, -1, 0.3, 0], [0.1, 0.1, 0.1, 0.1, 0.1, 0.1], \sin);
c = a.asSignal(256);
b = Buffer.alloc(s, 512);
b.setn(0, c);
a = Synth(\wavetable, [\buffer, b])

// and you will hear aliasing where the partials of the sound mirror back into the audio range

Above we saw how an envelope was turned into a Signal, which was then converted to a Wavetable. Signals are a type of numerical collection in SuperCollider that allows for various math operations. They can be useful for FFT manipulation of data arrays or simply for writing data to a file, as in this example:

f = SoundFile.new;
f.openWrite( Platform.userAppSupportDir +/+ "sounds/writetest.wav");
d = Signal.fill(44100, { |i| // one second of sound  
	// 1.0.rand2;  // white noise
	// sin(i/10); // a sine wave
	sin(i/10).cubed;
});
f.writeData(d);
f.close;

Below we explore further how Signals can be used with wavetable oscillators.

x = Signal.sineFill(512, [0,0,0,1]);
// We can now operate in many ways on the signal
[x, x.neg, x.abs, x.sign, x.squared, x.cubed, x.asin.normalize, x.exp.normalize, x.distort].flop.flat.plot(numChannels: 9);

c = x.asWavetable;

b = Buffer.alloc(s, 512);
b.setn(0, c); // set the wavetable into the buffer so Osc can read it.

// play it
a = Synth(\wavetable, [\buffer, b])
a.free;

// And the following lines will load a different wavetable into the buffer
c = x.exp.normalize.asWavetable;
b.setn(0, c);
c = x.abs.asWavetable;
b.setn(0, c);
c = x.squared.asWavetable;
b.setn(0, c);
c = x.asin.normalize.asWavetable;
b.setn(0, c);
c = x.distort.asWavetable;
b.setn(0, c);

// try also COsc (Chorusing wavetable oscillator)
{COsc.ar(b, MouseX.kr(60,300))!2}.play

// OscN
{OscN.ar(b, MouseX.kr(60,300))!2}.play // works better with the non-asWavetable example above

// Variable OSC - which can morph between wavetables
b = {Buffer.alloc(s, 512)} ! 9;
x = Signal.sineFill(512, [0,0,0,1]);
[x, x.neg, x.abs, x.sign, x.squared, x.cubed, x.asin.normalize, x.exp.normalize, x.distort].do({arg signal, i; b[i].setn(0, signal.asWavetable)});

{ VOsc.ar(b[0].bufnum + MouseX.kr(0,7), [120,121], 0, 0.3) }.play

// change the content of the wavetables to something random
9.do({arg i; b[i].sine1(Array.fill(512, {1.0.rand2}), true, true, true); })

// VOsc3 
{ VOsc3.ar(b[0].bufnum + MouseX.kr(0,7), [120,121], 0, 0.3) }.play

People often want to draw their own sound in a wavetable. We can end this excursion into wavetable synthesis by creating a graphical user interface that allows for the drawing of wavetables.

(
var size = 512;
var canvas, wave, lastPos, lastVal;

w = Window("Wavetable", Rect(100, 100, 1024, 500)).front;
wave = Signal.sineFill(size, [1]);
b = Buffer.alloc(s, size * 2); // double the size for the wavetable

Slider(w, Rect(0, 5, 1024, 20)).action_({|sl| x.set(\freq, sl.value*1000)});  
  UserView(w, Rect(0, 30, 1024, 470))
    .background_(Color.black)
    .animate_(true)
    .mouseMoveAction_({ |me, x, y, mod, btn|
       var pos = (size * (x / me.bounds.width)).floor;
       var val = (2 * (y / me.bounds.height)) - 1;
       val = min(max(val, -1), 1);
       wave.clipPut(pos, val);
       if(lastPos != nil, {
           for(lastPos + 1, pos - 1, { |i|
               wave.clipPut(i, lastVal + (((i - lastPos) / (pos - lastPos)) * (val - lastVal)));
           });
           for(pos + 1, lastPos - 1, { |i|
               wave.clipPut(i, lastVal + (((i - lastPos) / (pos - lastPos)) * (val - lastVal)));
           });
       });
       lastPos = pos;
       lastVal = val;
       b.loadCollection(wave.asWavetable);
       })
       .mouseUpAction_({
           lastPos = nil;
          lastVal = nil;
       })
       .drawFunc_({ |me|
	         Pen.color = Color.white;
           Pen.moveTo(0@(me.bounds.height * (wave[0] + 1) / 2));
           for(1, size - 1, { |i, a|
               Pen.lineTo((me.bounds.width * i /size)@(me.bounds.height * (wave[i] + 1)/2))
           });
           Pen.stroke;
       });
b.loadCollection(wave.asWavetable);
x = {arg freq=440; Osc.ar(b, freq) *0.4 ! 2 }.play;
)

Pitch and duration changes

If you would like to change the pitch but keep the duration of the sampled sound you are playing, you cannot simply change the rate of the PlayBuf, as the duration gets shorter as the rate increases (an octave up speeds the sound up by a factor of 2).

b = Buffer.read(s, Platform.userAppSupportDir +/+ "sounds/a11wlk01-44_1.aiff");

// The most common way
(
SynthDef(\playBuf,{ arg out = 0, bufnum;
	var signal;
	signal = PlayBuf.ar(1, bufnum, MouseX.kr(0.2, 4), loop:1);
	Out.ar(out, signal ! 2)
}).add
)

x = Synth(\playBuf, [\bufnum, b.bufnum])
x.free

We could use PitchShift to change the pitch without changing the time. PitchShift is a granular-synthesis pitch shifter (other techniques include phase vocoders):

(
SynthDef(\playBufWPitchShift,{ arg out = 0, bufnum;
	var signal;
	signal = PlayBuf.ar(1, bufnum, 1, loop:1);
	signal = PitchShift.ar(
		signal,	// mono audio input
		0.1, 			// grain size
		MouseX.kr(0,2),	// mouse x controls pitch shift ratio
		0, 				// pitch dispersion
		0.004			// time dispersion
	);
	Out.ar(out, signal ! 2)
}).add
)

x = Synth(\playBufWPitchShift, [\bufnum, b.bufnum])
x.free

For time stretching, check out the Warp0 and Warp1 UGens.

Chapter 10 - Granular and Concatenative Synthesis

Granular synthesis is a synthesis technique that only became practical with digital computer music software. Early pioneers were Barry Truax and Iannis Xenakis, but the technique has been explored in depth in the work of Curtis Roads, both in his musical output and in a fine book called Microsound.

The idea in granular synthesis is to synthesize a sound using small grains, typically of 10-50 millisecond duration, that are wrapped in envelopes. These grains can then result in a continuous sound or more discontinuous ‘grain clouds’. Here the individual grains become the building blocks, almost atoms, of a more complex structure.
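To make the idea concrete, here is a minimal sketch of a single grain - a sine tone wrapped in a 50 millisecond envelope (we will build a full granular engine out of such grains later in this chapter):

{ SinOsc.ar(880, 0, 0.2) * EnvGen.kr(Env.sine(0.05), doneAction: 2) ! 2 }.play; // one 50 ms grain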

Granular Synthesis

Granular synthesis is used in many pitch shifting and time stretching features of commercial software, so most people will be well aware of its functionality and power. Let us explore pitch shifting through the use of a native SuperCollider UGen, PitchShift. In the examples below, the grains are 100 ms windows that overlap. What is really happening is that the sample is played at a variable rate (where a rate of 2 is an octave higher), but the grains are layered on top of each other in order to maintain the original duration of the sound.

An example of a grain
b = Buffer.read(s, Platform.userAppSupportDir+/+"sounds/a11wlk01.wav");

// MouseX controls the pitch
{ PitchShift.ar(PlayBuf.ar(1, b, 1, loop:1), 0.1, MouseX.kr(0,2), 0, 0.01) ! 2}.play;
// Same as above, but here MouseY gives random pitch
{ PitchShift.ar(PlayBuf.ar(1, b, 1, loop:1), 0.1, MouseX.kr(0,2), MouseY.kr(0, 2), 0.01) ! 2}.play;

The grains are windows with a specific envelope (typically a Hanning envelope) and they overlap in order to create the continuous sound. Play around with the parameters of window size and overlap to explore how they result in different sound. The above examples used PitchShift for the purposes of changing the pitch but keeping the same playback rate. Below we use Warp1 to time stretch sound where the pitch remains the same.

// speed up the sound (with same pitch)
{Warp1.ar(1, b, Line.kr(0,1, 1), 1, 0.1, -1, 8, 0.1, 2)!2}.play

// slow down the sound (with the same pitch)
{Warp1.ar(1, b, Line.kr(0,1, 10), 1, 0.09, -1, 8, 0.1, 2)!2}.play

// use the mouse to read the sound (at the same pitch)
{Warp1.ar(1, b, MouseX.kr(0,1), 1, 0.1, -1, 8, 0.1, 2)!2}.play

// A SinOsc reading the sound (at the same pitch)
{Warp1.ar(1, b, SinOsc.kr(0.07).range(0,1), 1, 0.1, -1, 8, 0.1, 2)!2}.play

// use the mouse to read the sound (and control the pitch)
{Warp1.ar(1, b, MouseX.kr(0,1), MouseY.kr(0.5,2), 0.1, -1, 8, 0.1, 2)!2}.play

TGrains

The TGrains UGen - or Trigger Grains - is a handy UGen for quick and basic granular synthesis. Here we can pass arguments such as the number of grains per second, the grain duration, the rate (which is pitch), etc.

// mouse Y controlling number of grains per second
{TGrains.ar(2, Impulse.ar(MouseY.kr(1, 30)), b, 1, MouseX.kr(0,BufDur.kr(b)), 2/MouseY.kr(1, 10), 0, 0.8, 2)}.play

// mouse Y controlling pitch
{TGrains.ar(2, Impulse.ar(20), b, MouseY.kr(0.5, 2), MouseX.kr(0,BufDur.kr(b)), 2/MouseY.kr(1, 10), 0, 0.8, 2)}.play

// random pitch location, with mouse X controlling number 
// of grains per second and mouse Y controlling grain duration
{
TGrains.ar(2, 
	Impulse.ar(MouseX.kr(1, 50)), 
	b, 
	LFNoise0.ar(40, add:1), 
	LFNoise0.ar(40).abs * BufDur.kr(b), 
	MouseY.kr(0.01, 0.05), 
	0, 
	0.8, 
	2)
}.play

GrainIn

GrainIn enables you to granulate incoming audio. This UGen is part of a collection of granular UGens, including GrainSin, GrainFM, and GrainBuf. Take a look at the documentation of these UGens and explore their functionality.

SynthDef(\sagrain, {arg amp=1, grainDur=0.1, grainSpeed=10, panWidth=0.5;
	var pan, granulizer;
	pan = LFNoise0.kr(grainSpeed, panWidth);
	granulizer = GrainIn.ar(2, Impulse.kr(grainSpeed), grainDur, SoundIn.ar(0), pan);
	Out.ar(0, granulizer * amp);
}).add;

x = Synth(\sagrain)

x.set(\grainDur, 0.02)
x.set(\amp, 0.02)
x.set(\amp, 1)

x.set(\grainDur, 0.1)
x.set(\grainSpeed, 5)
x.set(\panWidth, 1)

Custom built granular synthesis

Having explored some features of granular synthesis above, the best way to really understand what granular synthesis is would be to make our own granular synth engine that spawns grains of some sort according to our own rate, pitch, wave form, and envelope.

In the examples above we have continued the chapter on Buffers and used sampled sound as the source of our granular synthesis. Here below we will explore the technique with simpler waveforms, such as the sine wave.

SynthDef(\sineGrain, { arg freq=800, amp=0.4, dur=0.1, pan=0;
	var signal, env;
	// A sine-shaped (Hanning) envelope; doneAction: 2 frees the synth after playback
	env = EnvGen.kr(Env.sine(dur, amp), doneAction: 2);
	signal = FSinOsc.ar(freq, 0, env);
	OffsetOut.ar(0, Pan2.ar(signal, pan)); 
}).add;

Synth(\sineGrain, [\freq, 500, \dur, 0.05]) // 50 ms grain duration

// we can then trigger 1000 grains, one every 10 ms
(
Task({
   1000.do({ 
   		Synth(\sineGrain, 
			[\freq, rrand(440, 1600),
			\amp, rrand(0.1,0.3),
			\dur, rrand(0.02, 0.1)
			]);
		0.01.wait;
	});
}).start;
)

If our grains all have the same pitch, we should be able to generate a continuous sine wave out of the grains, as they will be overlapping, as shown in this image:

[image]

Task({
   1000.do({ 
   		Synth(\sineGrain, 
			[\freq, 440,
			\amp, 0.4,
			\dur, 0.1
			]);
		0.05.wait; // density
	});
}).start;

But the sound is not perfectly continuous. This is because when we create a Synth, the message is sent to the server as quickly as possible. As the language-server communication is asynchronous, there can be slight differences in the time it takes each OSC message to arrive at the server, and this causes the fluctuation. We therefore need to timestamp our messages, which can be done either through messaging-style communication with the server or by using s.bind (which makes an OSC bundle under the hood and sends a time-stamped message to the server).

Task({
   1000.do({ 
		s.sendBundle(0.2, 
			["/s_new", \sineGrain, x = s.nextNodeID, 0, 1], 
			["/n_set", x, \freq, 440, \amp, 0.4, \dur, 0.1]
		);
		0.05.wait; // density
	});
}).start;

// Or simply (and probably more readably)
Task({
   1000.do({
		s.bind{
			Synth(\sineGrain, 
				[\freq, 440,
				\amp, 0.4,
				\dur, 0.1
			]);
		};
		0.05.wait; // density
	});
}).start;

There can be different envelopes in the grains. Here we use a Perc envelope:

SynthDef(\sineGrainWPercEnv, { arg freq = 800, amp = 0.1, envdur = 0.1, pan=0;
	var signal;
	signal = FSinOsc.ar(freq, 0, EnvGen.kr(Env.perc(0.001, envdur), doneAction: 2)*amp);
	OffsetOut.ar(0, Pan2.ar(signal, pan)); 
}).add;

Task({
   1000.do({
		s.bind{
			Synth(\sineGrainWPercEnv, 
				[\freq, rrand(1300, 4000),
				\amp, rrand(0.1, 0.2),
				\envdur, rrand(0.1, 0.2),
				\pan, 1.0.rand2
			]);
		};
		0.01.wait; // density
	});
}).start;

// Or doing the same using the Pbind Pattern
Pbind(
	\instrument, \sineGrainWPercEnv,
	\freq, Pfunc({rrand(1300, 4000)}),
	\amp, Pfunc({rrand(0.1, 0.2)}),
	\envdur, Pfunc({rrand(0.1, 0.2)}),
	\dur, 0.01, // density
	\pan, Pfunc({1.0.rand2})
).play;

The two examples above serve as a good illustration of how Patterns and Tasks work. We’ve got the same SynthDef and the same arguments, but Patterns operate with default keywords (like \instrument, \freq, \amp, and \dur). We therefore had to make sure that our envelope argument was not called \dur, since Pbind uses that key to control the density (the time until the next event is fired - so “\dur, 0.01” in the pattern is the same as “0.01.wait” in the Task).

Pbind(
	\instrument, \sineGrainWPercEnv,
	\freq, Pseq([1000, 2000, 4000], inf), // try to add 3000 in here
	\amp, Pfunc({rrand(0.1, 0.2)}),
	\envdur, Pseq([0.01, 0.02, 0.04], inf),
	\dur, Pseq([0.01, 0.02, 0.04], inf), // density
	\pan, Pseq([0.9, -0.9],inf)
).play;

Finally, let’s try this out with a buffer.

b = Buffer.read(s, Platform.userAppSupportDir+/+"sounds/a11wlk01-44_1.aiff");

SynthDef(\bufGrain,{ arg out = 0, buffer, rate=1.0, amp = 0.1, dur = 0.1, startPos=0;
	var signal;
	signal = PlayBuf.ar(1, buffer, rate, 1, startPos) * EnvGen.kr(Env.sine(dur, amp), doneAction: 2);
	OffsetOut.ar(out, signal ! 2)
}).add;

Synth(\bufGrain, [\buffer, b]); // try it

Task({
   1000.do({ arg i;
   		Synth(\bufGrain, 
			[\buffer, b,
   			\rate, rrand(0.8, 1.2),
			\amp, rrand(0.05,0.2),
			\dur, rrand(0.06, 0.1),
			\startPos, i*100 // jumping 100 samples per grain
		]);
		0.01.wait;
	});
}).start;

Concatenative Synthesis

Concatenative synthesis is a relatively recent technique of data-driven synthesis, where source sounds are analysed and segmented into units in a database, and a target sound (for example live audio input) is then analysed and matched with the closest unit in the database, which is then played. This is done at a very granular level, prompting Zils and Pachet to call the technique musaicing, from musical mosaicing, as it enables the synthesis of a sound that is coherent at the macro level but built up of smaller units of sound, just like a traditional mosaic. The technique is therefore quite related to granular synthesis in the sense that a macro-sound is built out of micro-sounds.

The technique can be quite complex to work with as users might have to analyse and build up a database of source sounds. However, people have built plugins and classes in SuperCollider that help with this purpose and in this section here we will explore some of the work done in this area by Nick Collins, a long time SuperCollider user and developer.

b = Buffer.read(s,Platform.userAppSupportDir+/+"sounds/a11wlk01.wav");


{Concat2.ar(SoundIn.ar(0),PlayBuf.ar(1, b, 1, loop:1),1.0,1.0,1.0,0.1,0,0.0,1.0,0.0,0.0)}.play

// mouse X used to control the match length
{Concat2.ar(SoundIn.ar(0),PlayBuf.ar(1, b, 1, loop:1),1.0,1.0,1.0,MouseX.kr(0.0,0.1),0,1.0,0.0,1.0,1.0)}.play

Chapter 11 - Physical Modelling

Physical modelling is a common synthesis technique where a mathematical model is built of some physical object. The maths here can be quite complex and is outside the scope of this book. However, it is worth exploring the technique, as there are physical modelling UGens in SuperCollider and many musical instruments can easily be built using simple physical models, using filters and the like. Waveguide synthesis can model the physics of an acoustic instrument or other sound-generating object: it simulates the travelling of waves through a string or a tube. The physical structures of an instrument can thus be thought of as waveguides or transmission lines.

In physical modelling, as opposed to traditional synthesis types (AM, FM, granular, etc.), we are not imitating the sound of an instrument, but rather simulating the instrument itself and the physical laws that are involved in the creation of its sound. In physical modelling of sound we typically operate with an excitation and a resonant body. The excitation represents the material and weight of the thing that hits, whilst the resonant body is what is being hit and resonates. In many cases it does not make mathematical sense to separate the two this way, but from a user perspective we can think of material bodies of wood, glass, metal, or a string, as examples, being hit by a finger, a plectrum, a metal hammer, or anything imaginable, for example falling sand. Further resolution can be designed into the model of the instrument, for example defining the bridge of a guitar, the type of strings, the type of body, the room the instrument is in, etc.

For a good text on physical modelling, check Julius O. Smith’s Physical Audio Signal Processing.

Karplus-Strong synthesis (named after its authors) is a precursor of physical modelling and is good for synthesising strings and percussion sounds.

// we generate a short burst (the excitation)
{ Decay.ar(Impulse.ar(1), 0.1, WhiteNoise.ar) }.play

// we then wrap that noise in a fast repeating delay
{ CombL.ar(Decay.ar(Impulse.ar(1), 0.1, WhiteNoise.ar), 0.02, 0.001, 3, 1) }.play

The repeat rate of the delay becomes the pitch of the string: delay time and frequency are in a reciprocal relationship, so a delay time of 0.001 gives 1000 Hz. We could therefore write 440.reciprocal in the delayTime argument of the CombL, and it would give us a string sound of 440 Hz. The duration of the string is controlled by the decayTime argument. This is the basic ingredient of a string synthesizer, but for further development you might want to consider applying filters to the noise, or perhaps use another type of noise. Also, the duration of the burst (100 ms above) will affect the sound heavily.
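As a minimal sketch of that reciprocal relationship:

// a 440 Hz string: the delay time is the reciprocal of the frequency
{ CombL.ar(Decay.ar(Impulse.ar(1), 0.1, WhiteNoise.ar), 0.05, 440.reciprocal, 3) }.play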

SynthDef(\ks_string, { arg note, pan, rand, delayTime;
	var x, y, env;
	env = Env.new(#[1, 1, 0],#[2, 0.001]);
	// A simple exciter x, with some randomness.
	x = Decay.ar(Impulse.ar(0, 0, rand), 0.1+rand, WhiteNoise.ar); 
 	x = CombL.ar(x, 0.05, note.reciprocal, delayTime, EnvGen.ar(env, doneAction:2)); 
	x = Pan2.ar(x, pan);
	Out.ar(0, LeakDC.ar(x));
}).add;

{ // and play the synthdef
	20.do({
		Synth(\ks_string, 
			[\note, [48, 50, 53, 58].midicps.choose, 
			\pan, 1.0.rand2, 
			\rand, 0.1+0.1.rand, 
			\delayTime, 2+1.0.rand]);
		[0.125, 0.25, 0.5].choose.wait;
	});
}.fork;

// here using patterns
Pdef(\kspattern, 
	Pbind(\instrument, \ks_string, // using our Karplus-Strong synthdef
			\note, Pseq.new([60, 61, 63, 66], inf).midicps, // the note arg (converted to Hz)
			\dur, Pseq.new([0.25, 0.5, 0.25, 1], inf),  // dur arg
			\rand, Prand.new([0.2, 0.15, 0.15, 0.11], inf),  // rand arg
			\pan, 1.0.rand2,
			\delayTime, 2+1.0.rand  // delayTime arg
		)
).play;

Compare using white noise and pink noise as an exciter, as well as using a resonant filter to filter the burst:

// white noise
{  
	var burstEnv, burst; 
	burstEnv = EnvGen.kr(Env.perc(0, 0.01), gate: Impulse.kr(1.5));
	burst = WhiteNoise.ar(burstEnv);
	CombL.ar(burst, 0.2, 0.003, 1.9, add: burst);  
}.play;

// pink noise
{  
	var burstEnv, burst; 
	burstEnv = EnvGen.kr(Env.perc(0, 0.01), gate: Impulse.kr(1.5));
	burst = PinkNoise.ar(burstEnv);
	CombL.ar(burst, 0.2, 0.003, 1.9, add: burst);  
}.play;

// here we use RLPF (resonant low pass filter) to filter the white noise burst
{  
	var burstEnv, burst; 
	burstEnv = EnvGen.kr(Env.perc(0, 0.01), gate: Impulse.kr(1.5));
	burst = RLPF.ar(WhiteNoise.ar(burstEnv), MouseX.kr(100, 12000), MouseY.kr(0.001, 0.999));
	CombL.ar(burst, 0.2, 0.003, 1.9, add: burst);  
}.play;

SuperCollider comes with a UGen called Pluck, which is an implementation of Karplus-Strong synthesis. This should be more efficient than the examples above, but of similar sound.

{Pluck.ar(WhiteNoise.ar(0.1), Impulse.kr(2), MouseY.kr(220, 880).reciprocal, MouseY.kr(220, 880).reciprocal, 10, coef:MouseX.kr(-0.1, 0.5)) !2 }.play(s)

We could create a SynthDef with Pluck.

SynthDef(\pluck, {arg freq=440, trig=1, time=2, coef=0.1, cutoff=2, pan=0;
	var pluck, burst;
	burst = LPF.ar(WhiteNoise.ar(0.5), freq*cutoff);
	pluck = Pluck.ar(burst, trig, freq.reciprocal, freq.reciprocal, time, coef:coef);
	Out.ar(0, Pan2.ar(pluck, pan));
}).add;

Synth(\pluck);
Synth(\pluck, [\coef, 0.01]);
Synth(\pluck, [\coef, 0.3]);
Synth(\pluck, [\coef, 0.7]);

Synth(\pluck, [\coef, 0.3, \time, 0.1]);
Synth(\pluck, [\coef, 0.3, \time, 5]);

Synth(\pluck, [\coef, 0.2, \time, 5, \cutoff, 1]);
Synth(\pluck, [\coef, 0.2, \time, 5, \cutoff, 2]);
Synth(\pluck, [\coef, 0.2, \time, 5, \cutoff, 5]);
Synth(\pluck, [\coef, 0.2, \time, 5, \cutoff, 15]);

// A guitar that might need a little distortion
Pbind(\instrument, \pluck,
	\freq, Pseq([72, 70, 67,65, 63, 60, 48], inf).midicps,
	\dur, Pseq([0.5, 0.5, 0.375, 0.125, 0.5, 2], 1),
	\cutoff, Pseq([15, 10, 5, 2, 10, 10, 15], 1)	
).play

Biquad filter

In SuperCollider, the SOS UGen is a second-order biquad filter that can be used to create various interesting sounds. We could start with a simple glass-like sound:

{SOS.ar(Impulse.ar(2), 0.0, 0.05, 0.0, MouseY.kr(1.45, 1.998, 1), MouseX.kr(-0.999, -0.9998, 1))!2}.play

And with slight changes we have a more woody type of sound:

SynthDef(\marimba, {arg out=0, amp=0.1, t_trig=1, freq=100, rq=0.006;
	var env, signal;
	var rho, theta, b1, b2;
	b1 = 1.987 * 0.9889999999 * cos(0.09);
	b2 = 0.998057.neg;
	signal = SOS.ar(K2A.ar(t_trig), 0.3, 0.0, 0.0, b1, b2);
	signal = RHPF.ar(signal*0.8, freq, rq) + DelayC.ar(RHPF.ar(signal*0.9, freq*0.99999, rq*0.999), 0.02, 0.01223);
	signal = Decay2.ar(signal, 0.4, 0.3, signal);
	DetectSilence.ar(signal, 0.01, doneAction:2);
	Out.ar(out, signal*(amp*0.4)!2);
}).add;

Pbind(
	\instrument, \marimba, 
	\midinote, Prand([[1,5], 2, [3, 5], 7, 9, 3], inf) + 48, 
	\dur, 0.2 
).play;

// Or perhaps
SynthDef(\wood, {arg out=0, amp=0.3, pan=0, sustain=0.5, t_trig=1, freq=100, rq=0.06;
	var env, signal;
	var rho, theta, b1, b2;
	b1 = 2.0 * 0.97576 * cos(0.161447);
	b2 = 0.9757.squared.neg;
	signal = SOS.ar(K2A.ar(t_trig), 1.0, 0.0, 0.0, b1, b2);
	signal = Decay2.ar(signal, 0.4, 0.8, signal);
	signal = Limiter.ar(Resonz.ar(signal, freq, rq*0.5), 0.9);
	env = EnvGen.kr(Env.perc(0.00001, sustain, amp), doneAction:2);
	Out.ar(out, Pan2.ar(signal, pan)*env);
}).add;

Pbind(
	\instrument, \wood, 
	\midinote, Prand([[1,5], 2, [3, 5], 7, 9, 3], inf) + 48, 
	\dur, 0.2 
).play;

Waveguide synthesis

Waveguide synthesis is the most common form of physical modelling, often using delay lines, filtering, feedback and other non-linear elements. The waveguide flute below is based upon Hans Mikelson’s Csound slide flute, ultimately derived from Perry Cook’s STK slide flute physical model. The SuperCollider port is by John E. Bower, who kindly allowed the flute’s inclusion in this tutorial.

SynthDef("waveguideFlute", { arg scl = 0.2, pch = 72, ipress = 0.9, ibreath = 0.09, ifeedbk1 = 0.4, ifeedbk2 = 0.4, dur = 1, gate = 1, amp = 2, vibrato=0.2;	
	var kenv1, kenv2, kenvibr, kvibr, sr, cr, block, poly, signalOut, ifqc,  fdbckArray;
	var aflow1, asum1, asum2, afqc, atemp1, ax, apoly, asum3, avalue, atemp2, aflute1;
	
	sr = SampleRate.ir;
	cr = ControlRate.ir;
	block = cr.reciprocal;
	ifqc = pch.midicps;	
	// noise envelope
	kenv1 = EnvGen.kr(Env.new( 
		[ 0.0, 1.1 * ipress, ipress, ipress, 0.0 ], [ 0.06, 0.2, dur - 0.46, 0.2 ], 'linear' )
	);
	// overall envelope
	kenv2 = EnvGen.kr(Env.new(
		[ 0.0, amp, amp, 0.0 ], [ 0.1, dur - 0.02, 0.1 ], 'linear' ), doneAction: 2 
	);
	// vibrato envelope
	kenvibr = EnvGen.kr(Env.new( [ 0.0, 0.0, 1, 1, 0.0 ], [ 0.5, 0.5, dur - 1.5, 0.5 ], 'linear') )*vibrato;
	// create air flow and vibrato
	aflow1 = LFClipNoise.ar( sr, kenv1 );
	kvibr = SinOsc.ar( 5, 0, 0.1 * kenvibr );
	asum1 = ( ibreath * aflow1 ) + kenv1 + kvibr;
	afqc = ifqc.reciprocal - ( asum1/20000 ) - ( 9/sr ) + ( ifqc/12000000 ) - block;
	fdbckArray = LocalIn.ar( 1 );
	aflute1 = fdbckArray;
	asum2 = asum1 + ( aflute1 * ifeedbk1 );
	//ax = DelayL.ar( asum2, ifqc.reciprocal * 0.5, afqc * 0.5 );
	ax = DelayC.ar( asum2, ifqc.reciprocal - block * 0.5, afqc * 0.5 - ( asum1/ifqc/cr ) + 0.001 );
	apoly = ax - ( ax.cubed );
	asum3 = apoly + ( aflute1 * ifeedbk2 );
	avalue = LPF.ar( asum3, 2000 );
	aflute1 = DelayC.ar( avalue, ifqc.reciprocal - block, afqc );
	fdbckArray = [ aflute1 ];
	LocalOut.ar( fdbckArray );
	signalOut = avalue;
	OffsetOut.ar( 0, [ signalOut * kenv2, signalOut * kenv2 ] );	
}).add;

// Test the flute
Synth(\waveguideFlute, [\amp, 0.5, \dur, 5, \ipress, 0.90, \ibreath, 0.00536, \ifeedbk1, 0.4, \ifeedbk2, 0.4, \pch, 60, \vibrato, 0.2] );

// test the flute player's skills:
Routine({
	var pitches, durations, rhythm;
	pitches = Pseq( [ 47, 49, 53, 58, 55, 53, 52, 60, 54, 43, 52, 59, 65, 58, 59, 61, 67, 64, 58, 53, 66, 73 ], inf ).asStream;
	durations = Pseq([ Pseq( [ 0.15 ], 17 ), Pseq( [ 2.25, 1.5, 2.25, 3.0, 4.5 ], 1 ) ], inf).asStream;
	17.do({
		rhythm = durations.next;		
		Synth(\waveguideFlute, [\amp, 0.6, \dur, rhythm, \ipress, 0.93, \ibreath, 0.00536, \ifeedbk1, 0.4, \ifeedbk2, 0.4, \pch, pitches.next] );
		rhythm.wait;	
	});
	5.do({
		rhythm = durations.next;		
		Synth(\waveguideFlute, [\amp, 0.6, \dur, rhythm + 0.25, \ipress, 0.93, \ibreath, 0.00536, \ifeedbk1, 0.4, \ifeedbk2, 0.4, \pch, pitches.next] );		
		rhythm.wait;
	});	
}).play;

Filters

Filters are a vital element in physical modelling. The main concepts here are some kind of an exciter (where in SuperCollider we might use triggers such as Impulse, Dust, or filtered noise) and a resonator (such as the Resonz and Klank resonators, Delays, Reverbs, etc.)

Ringz

Ringz is a powerful ringing filter with a decay time, so a single impulse can ring for N seconds. Let’s explore some examples:

// triggering a ringing filter by one impulse
{ Ringz.ar(Impulse.ar(0), 2000, 2) }.play

// one impulse per second
{ Ringz.ar(Impulse.ar(1), 2000, 2) }.play

// here using an envelope to soften the attack
{ Ringz.ar(EnvGen.ar(Env.perc(0.01, 1, 1), Impulse.ar(1)), 2000, 2) }.play

// playing with the frequency
{ Ringz.ar(Impulse.ar(4)*0.2, LFNoise0.ar(4)*2000, 0.1) }.play

// using XLine to increase rate and frequency
{ Ringz.ar(Impulse.ar(XLine.ar(1, 10, 4))*0.2, LFNoise0.ar(XLine.ar(1, 10, 4))*2000, 0.1) }.play

// using Dust instead of Impulse
{ Ringz.ar(Dust.ar(3, 0.3), 2000, 2) }.play

// here we use an Impulse to trigger the incoming sound
{ Ringz.ar(Impulse.ar(MouseX.kr(1, 100, 1)), 1800, MouseY.kr(0.05, 1), 0.4) }.play;

// control frequency as well
{ Ringz.ar(Impulse.ar(10)*0.5, MouseY.kr(100,1000), MouseX.kr(0.001,1)) }.play

// you could also use an envelope to soften the attack
{ Ringz.ar(EnvGen.ar(Env.perc(0.001, 1), Impulse.kr(MouseX.kr(1, 100, 1))), 1800, MouseY.kr(0.05, 1), 0.4) }.play;

// here resonating white noise instead of a trigger
{ Ringz.ar(WhiteNoise.ar(0.005), 600, 4) }.play

// would this be useful in synthesizing a flute?
{ Ringz.ar(LPF.ar(WhiteNoise.ar(0.005), MouseX.kr(100, 10000)), 600, 1) !2}.play

// a modified example from the documentation
{({Ringz.ar(WhiteNoise.ar(0.001),XLine.kr(exprand(100.0,5000.0), exprand(100.0,5000.0), 20), 0.5)}!10).sum}.play

// The Formlet UGen is a type of Ringz filter, useful for formant control:
{ Formlet.ar(Blip.ar(MouseX.kr(10, 400), 1000, 0.1), MouseY.kr(10, 1000), 0.005, 0.04) }.play;

Resonz, Klank and DynKlank

The Resonz, Klank and DynKlank filters can be used in physical modeling. Some examples here below:

// mouse Y controlling frequency - mouse X controlling bandwidth ratio
{ Resonz.ar(Impulse.ar(10)*1.5, MouseY.kr(40,10000), MouseX.kr(0.001,1)) }.play

// here with white noise - mouse Y controlling frequency - mouse X controlling bandwidth ratio
{ Resonz.ar(WhiteNoise.ar(0.1), MouseY.kr(40,10000), MouseX.kr(0.001,1)) }.play

// playing with Ringz and Resonz
{ Ringz.ar(Resonz.ar(Dust.ar(20)*1.5, MouseY.kr(40,10000), MouseX.kr(0.001,1)), MouseY.kr(40,10000), 0.04) }.play;

// let's explore the resonance using the freqscope
{ Resonz.ar(WhiteNoise.ar(0.1), MouseY.kr(40,10000), MouseX.kr(0.001,1)) }.freqscope

// Klank is a bank of Resonz filters 
{ Klank.ar(`[[800, 1071, 1153, 1723], nil, [1, 0.9, 0.1, 2]], Impulse.ar(1, 0, 0.2)) }.play;

// Klank filtering WhiteNoise
{ Klank.ar(`[[800, 1200, 1600, 200], [1, 0.8, 0.4, 0.8], [1, 1, 1, 1]], WhiteNoise.ar(0.001)) }.play;

// DynKlank is dynamic - using the mouse to change frequencies and ring times
{   var freqs, ringtimes;
    freqs = [800, 1071, 1153, 1723] * MouseX.kr(0.5, 2);
    ringtimes = [1, 1, 1, 1] * MouseY.kr(0.001, 5);
	DynKlank.ar(`[freqs, nil, ringtimes ], PinkNoise.ar(0.001))
}.play;

Decay

{ Decay.ar(Impulse.ar(XLine.kr(1,50,20), 0.25), 0.2, FSinOsc.ar(600), 0)  }.play;

{ Decay2.ar(Impulse.ar(XLine.kr(1,50,20), 0.25), 0.1, 0.3, FSinOsc.ar(600)) }.play;

SynthDef(\clap, {arg out=0, pan=0, amp=0.3, filterfreq=50, rq=0.01;
	var env, signal, attack, noise, hpf1, hpf2;
	noise = WhiteNoise.ar(1)+SinOsc.ar([filterfreq/2, filterfreq/2+4], pi*0.5, XLine.kr(1,0.01,4));
	hpf1 = RLPF.ar(noise, filterfreq, rq);
	hpf2 = RHPF.ar(noise, filterfreq/2, rq/4);
	env = EnvGen.kr(Env.perc(0.003, 0.00035));
	signal = (hpf1+hpf2) * env;
	signal = CombC.ar(signal, 0.5, 0.03, 0.031)+CombC.ar(signal, 0.5, 0.03016, 0.06);
	signal = Decay.ar(signal, 1.5);
	signal = FreeVerb.ar(signal, 0.23, 0.1, 0.12);
	Out.ar(out, Pan2.ar(signal * amp, pan));
	DetectSilence.ar(signal, doneAction:2);
}).add;

Synth(\clap, [\filterfreq, 1700, \rq, 0.14, \amp, 0.1]);

TBall, Spring and Friction

Physical modelling can involve the mathematical modelling of all kinds of phenomena, from wind to water to the simulation of moving or falling objects, where gravity, speed, surface type, etc., are all parameters. The popular Box2D library (of Angry Birds fame) is one such library that simulates physical systems. In SuperCollider there are UGens that do this too, for example TBall (Trigger Ball) and Spring.

// arguments are trigger, gravity, damp and friction
{TBall.ar(Impulse.ar(0), 0.1, 0.2, 0.01)*20}.play

// a light ball falling on a bouncy surface on the moon?
{TBall.ar(Impulse.ar(0), 0.1, 0.1, 0.001)*20}.play

// a heavy ball falling on a soft surface?
{TBall.ar(Impulse.ar(0), 0.1, 0.2, 0.1)*20}.play

Having explored the qualities of the TBall as a system that outputs impulses according to a physical system, we can now apply these impulses in some of the resonant filters that we have explored above.

// here using Ringz to create a metal ball falling on a marble table
{Ringz.ar(TBall.ar(Impulse.ar(0), 0.09, 0.1, 0.01)*20, 3000, 0.08)}.play

// many balls falling randomly (using Dust)
{({Ringz.ar(TBall.ar(Dust.ar(2), 0.09, 0.1, 0.01)*20, rrand(2000,3000), 0.07)}!5).sum}.play

// here using Decay to create a metal ball falling on a marble table
{Decay.ar(TBall.ar(Impulse.ar(0), 0.09, 0.1, 0.01)*20, 1)}.play

// a drummer on the snare?
{LPF.ar(WhiteNoise.ar(0.5), 4000)*Decay.ar(TBall.ar(Impulse.ar(0), 0.2, 0.16, 0.003)*20, 1)!2}.play

{SOS.ar(TBall.ar(Impulse.ar(0), 0.09, 0.1, 0.01)*20, 0.6, 0.0, 0.0, rrand(1.991, 1.994), -0.9982)}.play

// Txalaparta? 
{({|x| SOS.ar(TBall.ar(Impulse.ar(1, x*0.1*x), 0.8, 0.2, 0.02)*20, 0.6, 0.0, 0.0, rrand(1.992, 1.99), -0.9982)}!6).sum}.play

The Spring UGen is a physical model of a resonating spring. Considering the wave properties of a spring, this can be very useful for synthesis.

{
	var trigger =LFNoise0.ar(1)>0;
	var signal = SinOsc.ar(Spring.ar(trigger,1,4e-06)*1220);
	var env = EnvGen.kr(Env.perc(0.001,5),trigger);
	Out.ar(0, Pan2.ar(signal * env, 0));
}.play

// Two springs:
{
	var trigger = LFNoise0.ar(1)>0;
	var springs = Spring.ar(trigger,1,4e-06) * Spring.ar(trigger,2,4e-07);
	var signal = SinOsc.ar(springs*1220);
	var env = EnvGen.kr(Env.perc(0.001,5),trigger);
	Out.ar(0, Pan2.ar(signal * env, 0));
}.play

// And here are two tweets (less than 140 characters) simulating timpani drums. 

play{{x=LFNoise0.ar(1)>0;SinOsc.ar(Spring.ar(x,4,3e-05)*(70.rand+190)+(30.rand+90))*EnvGen.kr(Env.perc(0.001,5),x)}!2}

// here heavy on the tuning pedal
play{{x=LFNoise0.ar(1)>0;SinOsc.ar(Spring.ar(x,4,3e-05)*(70.rand+190)+LFNoise2.ar(1).range(90,120))*EnvGen.kr(Env.perc(0.001,5),x)}!2}

In the SC3-plugins you’ll find the Friction UGen, a physical model of a mass resting on a belt. The documentation of the UGen is good, but two examples are provided here for fun:

{Friction.ar(Ringz.ar(Impulse.ar(1), [400, 412]), 0.0002, 0.2, 2, 2.697)}.play

{Friction.ar(Klank.ar(`[[400, 412, 340]], Impulse.ar(1)), 0.0002, 0.2, 2, 2.697)!2}.play

The MetroGnome

How about trying to synthesise an old-fashioned wooden metronome?

(
SynthDef(\metro, {arg tempo=1, filterfreq=1000, rq=1.0;
var env, signal;
	var rho, theta, b1, b2;
	theta = MouseX.kr(0.02, pi).poll;
	rho = MouseY.kr(0.7, 0.9999999).poll;
	b1 = 2.0 * rho * cos(theta);
	b2 = rho.squared.neg;
	signal = SOS.ar(Impulse.ar(tempo), 1.0, 0.0, 0.0, b1, b2);
	signal = RHPF.ar(signal, filterfreq, rq);
	Out.ar(0, Pan2.ar(signal, 0));
}).add
)

// Move the mouse to find your preferred metronome (low left works best for me). We are here polling the MouseX and MouseY UGens, so you will be able to follow their output in the post window.

a = Synth(\metro) // we create our metronome
a.set(\tempo, 0.5.reciprocal) // 120 bpm (0.5.reciprocal = 2 bps)
a.set(\filterfreq, 4000) // try 1000 (for example)
a.set(\rq, 0.1) // try 0.5 (for example)

// Let's reinterpret the Poème symphonique, composed by György Ligeti in 1962
// http://www.youtube.com/watch?v=QCp7bL-AWvw

SynthDef(\ligetignome, {arg tempo=1, filterfreq=1000, rq=1.0;
var env, signal;
	var rho, theta, b1, b2;
	b1 = 2.0 * 0.97576 * cos(0.161447);
	b2 = 0.97576.squared.neg;
	signal = SOS.ar(Impulse.ar(tempo), 1.0, 0.0, 0.0, b1, b2);
	signal = RHPF.ar(signal, filterfreq, rq);
	Out.ar(0, Pan2.ar(signal, 0));
}).add;

// and we create 10 different metronomes running in different tempi
// (try with 3 metros or 30 metros)
(
10.do({
	Synth(\ligetignome).set(
		\tempo, (rrand(0.5,1.5)).reciprocal, 
		\filterfreq, rrand(500,4000), 
		\rq, rrand(0.3,0.9) )
});
)

The STK synthesis kit

Many years ago, Paul Lansky ported the STK physical modelling kit by Perry Cook and Gary Scavone to SuperCollider. This collection of UGens can be found in the SC3-plugins, but it has not been maintained and the code might be in bad shape, although some of the UGens still work. It could be a good project for someone wanting to explore a classic physical modelling codebase to update these UGens for SuperCollider 3.7+.

Here below we have a model of a xylophone:

SynthDef(\xylo, { |out=0, freq=440, gate=1, amp=0.3, sustain=0.5, pan=0|
	var sig = StkBandedWG.ar(freq, instr:1, mul:3);
	var env = EnvGen.kr(Env.adsr(0.0001, sustain, sustain, 0.3), gate, doneAction:2);
	Out.ar(out, Pan2.ar(sig, pan, env * amp));
}).add;

Synth(\xylo)

Pbind(\instrument, \xylo, \freq, Pseq(({|x|x+60}!13).mirror).midicps, \dur, 0.2).play

Part III

Chapter 12 - Time Domain Audio Effects

In this book, we divide the section on audio effects into two separate chapters, on time domain and frequency domain effects, respectively. This is for a good reason, as the two are completely different techniques of manipulating audio: the former, time domain effects, are well known from the world of analogue audio, whereas the latter, manipulation in the frequency domain, is only realistically possible through the use of computers running Fast Fourier Transform (FFT) algorithms. This will be explained later.

Most of the audio effects that we know (you can roughly think of the range of guitar pedal boxes, where each box contains the implementation of some audio effect) are familiar and easy to understand, and they were often discovered by accident or invented through some form of serendipitous exploration. There are diverse stories of John Lennon and George Martin discovering flanging on an Abbey Road tape machine, but earlier examples exist, although the technique had not yet been given this name. Time domain effects are either manipulations of samples in time (typically where the signal is split and something is done to one copy, such as delaying it, before the two are added together again) or in amplitude (where sample values are changed, for example to get a distortion effect). This chapter will explore the diverse audio effects that can easily be created using the UGens available in SuperCollider.

Delay

When we delay a signal, we can achieve various effects, from a simple echo to a more complex reverb. Typical variables are delay time (how long it takes before the sound appears again) and decay time (how long it keeps repeating). In SuperCollider, there are three main types of delays: Delay, Comb and Allpass:

  • DelayN/DelayL/DelayC are simple echoes with no feedback.
  • CombN/CombL/CombC are comb delays with feedback (decay time).
  • AllpassN/AllpassL/AllpassC die out faster than the combs, but have feedback as well.

All of these delays come with different interpolation algorithms (N, L and C stand for No interpolation, Linear interpolation, and Cubic interpolation). Interpolation is about what happens between two discrete values, for example samples. Will you get a jump when the next value appears (N), a line from one value to the next (L), or a curvy shape between the two (C) that better simulates analogue signal behaviour? These are all good for different purposes: N is computationally cheap, but C is good if you are sweeping the delay time and want more nuanced interpolation that can deal with values between two samples, as in the sketch below.
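
A minimal sketch for hearing the difference (the sweep rate and range are arbitrary choices): swap DelayN for DelayL or DelayC while the delay time sweeps.

(
{
	var sig = Impulse.ar(10);
	// swap DelayN for DelayL or DelayC and listen to the swept delay
	(sig + DelayN.ar(sig, 0.01, SinOsc.kr(0.2).range(0.0001, 0.01))) ! 2
}.play
)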

Generally, we can talk about three types of time when using Delays, resulting in different types of effects:

1 Short ( < 10 ms)
2 Medium ( 10 - 50 ms)
3 Long ( > 50 ms)

A short delay (1-2 samples) can create a FIR (Finite Impulse Response) lowpass filter. Increase the delay time (1-10 ms) and a comb filter materialises. Medium delays result in a thin signal, but can also add ambience and width to the sound. Long delays create discrete echoes, imitating sound bouncing off hard walls.

Delays can also have a variable delay time, which can result in the following effects: phase shifting, flanging, and chorus. These effects are explained in dedicated sections below.

Short Delays (< 10 ms)

Let’s explore what a short delay means. This is a delay that is hardly perceivable by the human ear if you, for example, delay a click sound or an impulse.

{
	x = Impulse.ar(1);
	d =  DelayN.ar(x, 0.001,  MouseX.kr(s.sampleRate.reciprocal, 0.001).poll);
	(x+d)!2
}.play 

In the example above we have a delay ranging from one sample (e.g., 44100.reciprocal, or 0.000022675 seconds, or 0.023 ms) up to 1 millisecond. The impulse is the shortest sound possible (one sample of amplitude 1), so it serves well in this experiment. When you move the mouse from the left to the right of the screen you will probably perceive the sound as one event, but you will notice that the sound changes slightly in timbre. It is filtered. And indeed, as we will see in the filter chapter, most filters work by delaying samples and multiplying the feedback or feedforward samples by different values.
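
We can build the simplest such filter by hand: a one-sample feedforward delay averaged with the input gives a first-order FIR lowpass (a minimal sketch; the 0.5 gains are simply averaging coefficients):

// y(n) = 0.5 * (x(n) + x(n-1)) - a gentle first-order lowpass
{ var x = WhiteNoise.ar(0.3); ((x + Delay1.ar(x)) * 0.5) ! 2 }.play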

We could try the same with a more continuous signal, for example a Saw wave. You will hear the timbre of the wave change when you move the mouse around, as it is effectively being filtered (adding two signals together where one is slightly delayed):

{
	x = Saw.ar(440, 0.4);
	d =  DelayC.ar(x, 0.001,  MouseX.kr(s.sampleRate.reciprocal, 0.001).poll);
	(x+d)!2
}.play

Note that in the example above I’m using DelayC, as opposed to the DelayN in the impulse code. This is because the delay time is so small, at sample level, that interpolation becomes important. Try changing the DelayC to DelayN (no interpolation) and listen to what happens, particularly when moving the mouse at the left of the screen, where the delay times are shortest. The best way to explore the filtering effect might be to use WhiteNoise:

{
	x = WhiteNoise.ar(0.1);
	d =  DelayN.ar(x, 0.001,  MouseX.kr(s.sampleRate.reciprocal, 0.001));
	(x+d)!2
}.play

In the examples above we have been adding the two signals together (the original and the delayed signal) and then duplicating the sum (!2) into an array of two channels, for two-speaker output. Adding the signals creates the filtering effect, but if we instead put each signal into its own speaker, we get a completely different effect, namely spatialisation:

{
	x = WhiteNoise.ar(0.1);
	d =  DelayC.ar(x, 0.006,  MouseX.kr(s.sampleRate.reciprocal, 0.006));
	[x, d]
}.play

We have now entered the realm of psychoacoustics, but this can be explained quickly by the fact that sound travels at around 343 metres per second, or 34 cm per millisecond, which gives roughly a 0.6 millisecond difference in arrival time between the two ears of a typical head if the sound is coming from one side. This is called the Interaural Time Difference (ITD) and is one of the key factors in sound localisation. We can explore this in the following example, where one channel is delayed from 1 ms before to 1 ms after the other. Try this with headphones; you should get some impression of the sound moving from the left to the right ear.

{
	x = Impulse.ar(1);
	l =  DelayC.ar(x, 1.001,  1+MouseX.kr(-0.001, 0.001));
	r =  DelayC.ar(x, 1.001,  1+MouseX.kr(0.001, -0.001));
	[l, r] // left and right channels
}.play

// load some sound files into buffers (use your own)
d = Buffer.read(s,"sounds/digireedoo.aif");
e = Buffer.read(s,"sounds/holeMONO.aif");
e = Buffer.read(s, "sounds/a11wlk01.wav"); // this one is in the SC sounds folder

In the example below, explore the different algorithms implemented in Delay, Comb and Allpass. The Delay does not have a decay time, and therefore does not result in the Karplus-Strong type of sound that we get with the other two. The details of the difference in the internal implementation of Comb and Allpass are too complex for this book, but they have to do with how the gain coefficients are calculated: a combined feedback and feedforward comb equals an allpass.

{
	var signal, delaytime = MouseX.kr(0.00022675, 0.01, 1);
	signal = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	// signal = Saw.ar(440,0.3);
	// signal = WhiteNoise.ar(0.3);
	d =  DelayC.ar(signal, 0.6, delaytime);
	// d =  AllpassC.ar(signal, 0.6, delaytime, MouseY.kr(0.001,1, 1));
	// d =  CombC.ar(signal, 0.6, delaytime, MouseY.kr(0.001,1, 1));
	(signal + d).dup
}.play

Is this familiar?

{CombC.ar(SoundIn.ar(0), 0.6, LFPulse.ar(0.25).range(0.0094,0.013),  0.9)!2}.play

{
	var signal, delay, delaytime = MouseX.kr(0.00022675, 0.02, 1);
	signal = PlayBuf.ar(1, e, 1, loop:1);
	delay =  DelayC.ar(signal, 0.2, delaytime);
	[signal, delay]
}.play

Any number of delays can be added together to create the desired sound, of course - something we will explore when we discuss reverbs:

{
	var signal;
	var delaytime = MouseX.kr(0.1,0.4, 1);
	signal = Impulse.ar(1);	
	Mix.fill(14, {arg i; DelayL.ar(signal, 1, delaytime*(1+(i/10))) });
}.play

The old Karplus-Strong in its most basic form:

{
	var delaytime = MouseX.kr(0.001,0.2, 1);
	var decaytime = MouseY.kr(0.1,2, 1);
	var signal = Impulse.ar(1);
	CombL.ar(signal, 0.6, delaytime, decaytime)!2
}.play

Medium Delay time ( 10 - 50 ms)

The examples above, with delays under 10 ms, resulted in changes of timbre or spatial location, but we always felt that this was the same sonic event, even when using a one-sample impulse. It depends on subject and context, but it can be said that we start to perceive a delayed event as two events when there is more than a 20 ms delay between them. This code demonstrates that:

{x=Impulse.ar(1); y=DelayC.ar(x, 0.04, MouseX.kr(0.005, 0.04).poll); (x+y)!2}.play

The post window shows the delay time in seconds. A drummer who was more than 20 ms off when trying to hit the exact beat would be giving a disappointing performance (of course, part of the art of a good percussionist is to be slightly ahead of or behind the beat, so the comment is not about intention), and any hardware interface with a latency of more than 20 ms would be considered a rather poor interface.

Longer delays can also generate a spatialisation effect, although this is not modelling the interaural time difference (ITD), but rather creating the sensation of a wide sonic image.

e = Buffer.read(s,"sounds/holeMONO.aif");

{
	var signal, delay, delaytime = MouseX.kr(0.00022675, 0.05, 1).poll;
	signal = PlayBuf.ar(1, e, 1, loop:1);
	delay =  DelayC.ar(signal, 0.2, delaytime);
	[signal, delay]
}.play
// Using microphone input
{
	var signal, delay, delaytime = MouseX.kr(0.00022675, 0.05, 1).poll;
	signal = SoundIn.ar(0);
	delay =  DelayC.ar(signal, 0.2, delaytime);
	[signal, delay]
}.play

Longer Delays ( > 50 ms)

// load a sound file into buffer f (the same file is read into f again in the FFT chapter)
f = Buffer.read(s, "sounds/a11wlk01.wav");

(
{
var signal;
var delaytime = MouseX.kr(0.05, 2, 1); // between 50 ms and 2 seconds - exponential.
signal = PlayBuf.ar(1, f.bufnum, BufRateScale.kr(f.bufnum), loop:1);

// compare DelayL, CombL and AllpassL

//d =  DelayL.ar(signal, 0.6, delaytime);
//d = CombL.ar(signal, 0.6, delaytime, MouseY.kr(0.1, 10, 1)); // decay using mouseY
d =  AllpassL.ar(signal, 0.6, delaytime, MouseY.kr(0.1,10, 1));

(signal+d).dup
}.play(s)
)
// same as above, here using AudioIn for the signal instead of the NASA irritation
(
{
var signal;
var delaytime = MouseX.kr(0.05, 2, 1); // between 50 ms and 2 seconds - exponential.
signal = AudioIn.ar(1);

// compare DelayL, CombL and AllpassL

//d =  DelayL.ar(signal, 0.6, delaytime);
//d = CombL.ar(signal, 0.6, delaytime, MouseY.kr(0.1, 10, 1)); // decay using mouseY
d =  AllpassL.ar(signal, 0.6, delaytime, MouseY.kr(0.1,10, 1));

(signal+d).dup
}.play(s)
)

Random experiments

Server.default = s = Server.internal
FreqScope.new;
{CombL.ar(Impulse.ar(10), 6, 1, 1)}.play(s)


(
{
var signal;
var delaytime = MouseX.kr(0.01,6, 1);
var decaytime = MouseY.kr(1,2, 1);

signal = Impulse.ar(1);

d =  CombL.ar(signal, 6, delaytime, decaytime);

d!2
}.play(s)
)


// we can see the Comb effect by plotting the signal.

(
{
a = Impulse.ar(1);
d =  CombL.ar(a, 1, 0.001, 0.9);
d
}.plot(0.1)
)

// a little play with AudioIn
(
{
var signal;
var delaytime = MouseX.kr(0.001,2, 1);
signal = AudioIn.ar(1);

a = Mix.fill(10, {arg i; var dt;
		dt = delaytime*(i/10+0.1).postln;
		DelayL.ar(signal, 3.2, dt);});

(signal+a).dup
}.play(s)
)

/*
TIP: if you get this line printed ad infinitum:
exception in real time: alloc failed
the server has run out of real-time memory. You can increase the memory size, e.g.:
	s.options.memSize = 32768; // the default is 8192 (kilobytes)
and then reboot the server (s.reboot), allowing it to use more memory (RAM)
*/

(
{ // watch your ears !!! Use headphones and lower the volume !!!
var signal;
var delaytime = MouseX.kr(0.001,2, 1);
signal = AudioIn.ar(1);

a = Mix.fill(13, {arg i; var dt;
		dt = delaytime*(i/10+0.1).postln;
		CombL.ar(signal, 3.2, dt);});

(signal+a).dup
}.play(s)
)


// Source code for a comb filter might look something like this (in C):

int i, j;
float s;

for (i = 0; i <= delay_size; i++) {
    if (i >= delay)
        j = i - delay;                  // work out the delayed buffer position
    else
        j = i - delay + delay_size + 1; // wrap around the start of the circular buffer
    // add the delayed sample (scaled by the decay coefficient) to the input sample
    s = input[i] + (delay_buffer[j] * decay);
    // store the result in the delay buffer, and output it
    delay_buffer[i] = s;
    output[i] = s;
}

Phaser (phase shifting)

In a phaser, a signal is sent through an allpass filter which does not filter out any frequencies, but simply shifts the phase of the sound by delaying it. This signal is then added to the original. If the phase is 180 degrees, the sound is cancelled out, but if it is less than that, it will create variations in the spectrum.
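
As a minimal demonstration of the cancellation principle (not a phaser yet), a sine wave added to a copy of itself that is pi radians (180 degrees) out of phase produces silence:

{ (SinOsc.ar(440, 0, 0.2) + SinOsc.ar(440, pi, 0.2)) ! 2 }.play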

// phaser with a soundfile
e = Buffer.read(s, "sounds/a11wlk01.wav");

(
{
var signal;
var phase = MouseX.kr(0.000022675,0.01, 1); // from a sample resolution to 10 ms delay line

var ph;

signal = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);

ph = AllpassL.ar(PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1), 4, phase+(0.01.rand), 0);
/* // try 4 phasers
ph = Mix.ar(Array.fill(4, 
			{ AllpassL.ar(PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1), 4, phase+(0.01.rand), 0)}
		));
*/

(signal + ph).dup 
}.play
)


// try it with a sine wave (the mouse is shifting the phase of the input signal)
(
{
var signal;
var phase = MouseX.kr(0.000022675,0.01); // from a sample to 10 ms delay line
var ph;

signal = SinOsc.ar(444,0,0.5);
//signal = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
ph = AllpassL.ar(SinOsc.ar(444,0,0.5), 4, phase, 0);

 (signal + ph).dup 

}.play
)


// using an oscillator to control the phase instead of MouseX
// here using the .range trick:
{SinOsc.ar(SinOsc.ar(0.3).range(440, 660), 0, 0.5) }.play

(
{
var signal;
var ph;

// base signal
signal = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
// phased signal
ph = AllpassC.ar(
		PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1), 
		4, 
		LFPar.kr(0.1, 0, 1).range(0.000022675,0.01), // a cycle every 10 seconds 
		0); // experiment with what happens if you increase the decay length

 (signal + ph).dup // we add them together and route to two speakers
}.play
)

/*
NOTE: Theoretically, you could use DelayC or CombC instead of AllpassC.
In the case of DelayC, you would have to delete the last argument (0),
as DelayC doesn't have a decay argument.
*/

Flanger

In a flanger, a delayed signal is added to the original signal with a continuously variable delay (usually under 10 ms), creating a phasing effect. The term comes from the days when tape machines were used in studios and an operator would place a finger on the flange of one of the reels to slow it down, thus causing the flanging effect.

A flanger is like a phaser with a dynamic delay filter (allpass), but it usually also has a feedback loop.

(
SynthDef(\flanger, { arg out=0, in=0, delay=0.1, depth=0.08, rate=0.06, fdbk=0.0, decay=0.0; 

	var input, maxdelay, maxrate, dsig, mixed, local;
	maxdelay = 0.013;
	maxrate = 10.0;
	input = In.ar(in, 1);
	local = LocalIn.ar(1);
	dsig = AllpassL.ar( // the delay (you could use AllpassC (put 0 in decay))
		input + (local * fdbk),
		maxdelay * 2,
		LFPar.kr( // very similar to SinOsc (try to replace it) - Even use LFTri
			rate * maxrate,
			0,
			depth * maxdelay,
			delay * maxdelay),
		decay);
	mixed = input + dsig;
	LocalOut.ar(mixed);
	Out.ar([out, out+1], mixed);
}).add;
)

// audioIn on audio bus nr 10
{Out.ar(10, AudioIn.ar(1))}.play(s, addAction:\addToHead)

a = Synth(\flanger, [\in, 10], addAction:\addToTail)
a.set(\delay, 0.04)
a.set(\depth, 0.04)
a.set(\rate, 0.01)
a.set(\fdbk, 0.08)
a.set(\decay, 0.01)

// or if you prefer a buffer:
b = Buffer.read(s, "sounds/a11wlk01.wav"); // replace this sound with a nice sounding one !!!
{Out.ar(10, PlayBuf.ar(1, b.bufnum, BufRateScale.kr(b.bufnum), loop:1))}.play(addAction:\addToHead)

a = Synth(\flanger, [\in, 10], addAction:\addToTail)
a.set(\delay, 0.04)
a.set(\depth, 0.04)
a.set(\rate, 1)
a.set(\fdbk, 0.08)
a.set(\decay, 0.01)

// a parameter explosion results in a chorus-like effect:
a.set(\decay, 0)
a.set(\delay, 0.43)
a.set(\depth, 0.2)
a.set(\rate, 0.1)
a.set(\fdbk, 0.08)

// or just go mad:
a.set(\delay, 0.93)
a.set(\depth, 0.9)
a.set(\rate, 0.8)
a.set(\fdbk, 0.8)

Chorus

The chorus effect happens when we add a delayed signal to the original with a time-varying delay. The delay has to be short in order not to be perceived as an echo, but above 5 ms to be audible. If the delay is too short, it will destructively interfere with the un-delayed signal and create a flanging effect. Often, the delayed signals will be pitch shifted to create a harmony with the original signal.

There is no definitive algorithm for creating a chorus; there are many different ways to achieve it. As opposed to the flanger above, this chorus does not have a feedback loop. But you could create a chorus effect out of a flanger by using a longer delay time (20-30 ms instead of the 1-10 ms in the flanger).

// a simple chorus
SynthDef(\chorus, { arg inbus=10, outbus=0, predelay=0.08, speed=0.05, depth=0.1, ph_diff=0.5;
	var in, sig, modulators, numDelays = 12;
	in = In.ar(inbus, 1) * numDelays.reciprocal;
	modulators = Array.fill(numDelays, {arg i;
      	LFPar.kr(speed * rrand(0.94, 1.06), ph_diff * i, depth, predelay);}); 
	sig = DelayC.ar(in, 0.5, modulators);  
	sig = sig.sum; //Mix(sig); 
	Out.ar(outbus, sig!2); // output in stereo
}).add


// try it with audio in
{Out.ar(10, AudioIn.ar(1))}.play(addAction:\addToHead)
// or a buffer:
b = Buffer.read(s, "sounds/a11wlk01.wav"); // replace this sound with a nice sounding one !!!
{Out.ar(10, PlayBuf.ar(1, b.bufnum, BufRateScale.kr(b.bufnum), loop:1))}.play(addAction:\addToHead)

a = Synth(\chorus, addAction:\addToTail)
a.set(\predelay, 0.02);
a.set(\speed, 0.22);
a.set(\depth, 0.5);
a.set(\ph_diff, 0.7);
a.set(\predelay, 0.2);

Reverb

Achieving realistic reverb is a science of its own, too deep to delve into here. The most common reverb technique in digital acoustics is to use parallel comb delays that are fed into a few allpass delays.

Reverb can be analysed into three stages:

  • Direct sound (from the sound source).
  • Early reflections (discrete first-generation reflections from walls).
  • Reverberation (Nth-generation reflections that take time to build up, and fade out slowly).

SynthDef(\reverb, {arg inbus=0, outbus=0, predelay=0.048, combdecay=15, allpassdecay=1, revVol=0.31;
	var sig, y, z;
	sig = In.ar(inbus, 1); 
	
	// predelay
	z = DelayN.ar(sig, 0.1, predelay); // max 100 ms predelay
	
	// 7 length modulated comb delays in parallel :
	y = Mix.ar(Array.fill(7,{ CombL.ar(z, 0.05, rrand(0.03, 0.05), combdecay) })); 

	6.do({ y = AllpassN.ar(y, 0.050, rrand(0.03, 0.05), allpassdecay) });
	Out.ar(outbus, sig + (y * revVol) ! 2); // as fxlevel is 1 then I lower the vol a bit
}).add; 


{Out.ar(10, AudioIn.ar(1))}.play(addAction:\addToHead)

b = Buffer.read(s, "sounds/a11wlk01.wav"); // replace this sound with a nice sounding one !!!

{Out.ar(10, PlayBuf.ar(1, b.bufnum, BufRateScale.kr(b.bufnum), loop:1))}.play(addAction:\addToHead)

a = Synth(\reverb, [\inbus, 10], addAction:\addToTail)

a.set(\predelay, 0.048)
a.set(\combdecay, 2.048)
a.set(\allpassdecay, 1.048)
a.set(\revVol, 0.048)

Tremolo

Tremolo is a fluctuation in the amplitude of a signal, well known from analogue guitar amplifiers, and heard in surf music or garage punk such as The Cramps.

SynthDef(\tremolo, {arg inbus=0, outbus=0, freq=1, strength=1; 
   var fx, sig; 
   sig = In.ar(inbus, 1); 
   fx = sig * SinOsc.ar(freq, 0, strength, 0.5); 
   Out.ar(outbus, (fx+ sig).dup ) 
}).add; 

{Out.ar(10, AudioIn.ar(1))}.play(addAction:\addToHead)

b = Buffer.read(s, "sounds/a11wlk01.wav"); // replace this sound with a nice sounding one !!!
{Out.ar(10, PlayBuf.ar(1, b.bufnum, BufRateScale.kr(b.bufnum), loop:1))}.play(addAction:\addToHead)


a = Synth(\tremolo, [\inbus, 10], addAction:\addToTail)

a.set(\freq, 4.8)
a.set(\strength, 0.8)

Distortion

Distortion can be achieved through diverse algorithms, but the most basic one is to raise the amplitude of the signal so much that it starts to clip (cutting off samples below -1 and above 1), gradually turning a sine wave into a square wave and thereby adding harmonics.
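
A minimal sketch of this idea (the gain range is an arbitrary choice): turn up a sine wave and hard-clip it, and the further right the mouse goes, the more square-like and harmonically rich the tone becomes.

{ ((SinOsc.ar(220) * MouseX.kr(1, 20)).clip(-1, 1) * 0.2) ! 2 }.play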

(
{
var in, gain;
	in = AudioIn.ar(1);
	gain = MouseX.kr(1,100);
	in=in.abs;
	((in.squared + (gain*in))/(in.squared + ((gain-1)*in) + 1))
!2}.play
)

SuperCollider also has a .distort method, a soft-clipping waveshaper.
(
{		// mouseX is pregain, mouseY is postgain
		var in, distortion, fx, y, z;
		in = AudioIn.ar(1);
		distortion = ((in * MouseX.kr(1,10)).distort * MouseY.kr(1,10)).distort;
		fx = Compander.ar(distortion, distortion, 1, 0, 1 ); // sustain
		Out.ar(0, LeakDC.ar(fx + in ) !2 );
}.play
)

// Here not using AudioIn:
b = Buffer.read(s, "sounds/a11wlk01.wav"); // replace this sound with a nice sounding one !!!
{Out.ar(10, PlayBuf.ar(1, b.bufnum, BufRateScale.kr(b.bufnum), loop:1))}.play(addAction:\addToHead)

(
{		// mouseX is pregain, mouseY is postgain
			var in, distortion, fx, y, z;
			in = In.ar(10);
			distortion = ((in * MouseX.kr(1,10)).distort * MouseY.kr(1,10)).distort;
			fx = Compander.ar(distortion, distortion, 1, 0, 1 ); // sustain
			Out.ar(0, LeakDC.ar(fx + in ) !2 );

}.play(addAction:\addToTail) // for addAction, see Synth helpfile or tutorial 13
)

Compressor

The compressor reduces the dynamic range of a signal when it exceeds a certain threshold. The compression ratio determines how much the signal exceeding the threshold is lowered: a 4:1 compression ratio means that for every 4 dB the input rises above the threshold, the output rises by only 1 dB. For example, with a threshold of -20 dB and an input peaking at -12 dB (8 dB over), the output would peak at -18 dB (2 dB over).
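
In SuperCollider's Compander, the ratio is expressed through the slopeAbove argument, where slopeAbove = 1/ratio, so 4:1 compression is a slopeAbove of 0.25. A minimal sketch, assuming the buffer e loaded earlier in this chapter:

(
{
	var in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop: 1);
	// threshold 0.2 (in amplitude), slopeBelow 1, slopeAbove 0.25 (i.e., 4:1), then clamp and relax times
	Compander.ar(in, in, 0.2, 1, 0.25, 0.01, 0.1) ! 2
}.play
)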

(
{
	var in, compander;
	in = AudioIn.ar(1);
	compander = Compander.ar(in, in, MouseX.kr(0.001, 1, 1), 1, 0.5, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

e = Buffer.read(s, "sounds/a11wlk01.wav");

(
{
	var in, compander;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	compander = Compander.ar(in, in, MouseX.kr(0.0001, 1, 1), 1, 0.5, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

Limiter

The limiter does essentially the same as the compressor, but it looks at the signal’s peaks, whereas the compressor looks at the average energy level. A limiter will not let the signal pass the threshold, while the compressor does, to a degree determined by the ratio settings.

The difference is in the slopeAbove argument of the Compander (0.5 in the compressor above, but 0.1 in the limiter).

(
// limiter - Audio In
{
	var in, compander;
	in = AudioIn.ar(1);
	compander = Compander.ar(in, in, MouseX.kr(0.001, 1, 1), 1, 0.1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

(
// limiter - Soundfile
{
	var in, compander;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	compander = Compander.ar(in, in, MouseX.kr(0.0001, 1, 1), 1, 0.1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

Sustainer

The sustainer works like an inverted compressor: it exaggerates the low amplitudes and raises them up towards the defined threshold.

(
// sustainer - Audio In
{
	var in, compander;
	in = AudioIn.ar(1);
	compander = Compander.ar(in, in, MouseX.kr(0.001, 1, 1), 0.1, 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

(
// sustainer - Soundfile
{
	var in, compander;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	compander = Compander.ar(in, in, MouseX.kr(0.0001, 1, 1), 0.1, 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

// for comparison, here is the file without sustain:
{PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1)!2}.play

Noise gate

The noise gate allows a signal to pass through only when it is above a certain threshold. If the energy of the signal is below the threshold, no sound is allowed through. It is often used in settings where there is background noise and one wants to record only the signal, not the (in this case) uninteresting noise.

(
// noisegate - Audio In
{
	var in, compander;
	in = AudioIn.ar(1);
	compander = 	Compander.ar(in, in, MouseX.kr(0.005, 1, 1), 10, 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

(
// noisegate - sound file
{
	var in, compander;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	compander = 	Compander.ar(in, in, MouseX.kr(0.001, 1), 10, 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)

The noise gate needs a bit of parameter tweaking to get what you want, so here is the same version as above, just with MouseY controlling the slopeAbove parameter.

(
// noisegate - Audio In
{
	var in, compander;
	in = AudioIn.ar(1);
	compander = 	Compander.ar(in, in, MouseX.kr(0.005, 1, 1), MouseY.kr(1,20), 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)


(
// noisegate - soundfile
{
	var in, compander;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	compander = 	Compander.ar(in, in, MouseX.kr(0.001, 1), MouseY.kr(1,20), 1, 0.01, 0.01);
	compander ! 2 // stereo
}.play
)


(
// for fun: a noisegater with a bit of reverb (controlled by mouseY)
// better use headphones - danger of feedback!
{
	var in, compander;
	var predelay=0.048, combdecay=3.7, allpassdecay=0.21, revVol=0.21;
	in = AudioIn.ar(1);
	compander = 	Compander.ar(in, in, MouseX.kr(0.005, 1, 1), 10, 1, 0.01, 0.01);
	z = DelayN.ar(compander, 0.1, predelay);
	y = Mix.ar(Array.fill(7,{ CombL.ar(z, 0.05, rrand(0.03, 0.05), MouseY.kr(1,20, 1)) })); 
	6.do({ y = AllpassN.ar(y, 0.050, rrand(0.03, 0.05), allpassdecay) });
	y!2
}.play
)

Normalizer

The Normalizer uses a buffer to delay the sound slightly so that it can look ahead in the audio. It will not overshoot like a Compander, but the downside is the delay. The Normalizer normalises the input amplitude to a given level.

(
// normalizer - Audio In
{
	var in, normalizer;
	in = AudioIn.ar(1);
	normalizer = Normalizer.ar(in, MouseX.kr(0.1, 0.9), 0.01);
	normalizer ! 2 // stereo
}.play
)

(
// normalizer - sound file
{
	var in, normalizer;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	normalizer = Normalizer.ar(in, MouseX.kr(0.1, 0.9), 0.01);
	normalizer ! 2 // stereo
}.play
)

Limiter (Ugen)

Like the Normalizer, the Limiter uses a small delay buffer to look ahead in the audio. It will not overshoot like the Compander, but you have to put up with a slight delay. The Limiter limits the input amplitude to a given level.

(
// limiter - Audio In
{
	var in, normalizer;
	in = AudioIn.ar(1);
	normalizer = Limiter.ar(in, MouseX.kr(0.1, 0.9), 0.01);
	normalizer ! 2 // stereo
}.play
)

(
// limiter - sound file
{
	var in, normalizer;
	in = PlayBuf.ar(1, e.bufnum, BufRateScale.kr(e.bufnum), loop:1);
	normalizer = Limiter.ar(in, MouseX.kr(0.1, 0.9), 0.01);
	normalizer ! 2 // stereo
}.play
)

Amplitude

Amplitude tracks the peak amplitude of a signal. It is not really an audio effect in itself, but it can be a key element in the design of effects (for example adaptive audio effects) and is therefore included in this section.

In the example below, we map the input amplitude onto the frequency of a sine wave:

{SinOsc.ar(Amplitude.kr(AudioIn.ar(1), 0.1, 0.1, 12000, 0), 0, 0.3)}.play;


// with a noise gater as explained above
(
{
var noisegate, in;
in = AudioIn.ar(1);
noisegate = Compander.ar(in, in, MouseX.kr(0.005, 1, 1), MouseY.kr(1,20), 1, 0.01, 0.01);
SinOsc.ar(Amplitude.kr(noisegate, 0.1, 0.1, 12000, 0), 0, 0.3) ! 2
}.play;
)


// Compare the two following examples 

{SinOsc.ar(
	MouseX.kr(100, 6000, 1),
	0,
	Amplitude.kr(AudioIn.ar(1), 0.1, 0.1, 1, 0)
)!2}.play

// -- huh? --

{SinOsc.ar(
	MouseX.kr(100, 6000, 1),
	0,
	AudioIn.ar(1)
)!2}.play

Pitch

Pitch tracks the pitch of a signal. If the pitch tracker has found a pitch, the hasFreq output will be 1 (true); if it hasn’t found one, it will be 0 (false). (Read the helpfile about how it works.)

NOTE: it can be useful to pass the input signal through a low pass filter, as it is easier to detect the pitch of a signal with fewer harmonics.

Tip: People often ask about the hash (#) in front of the freq and hasFreq variables. This is multiple assignment: a way of assigning values from an array to two or more variables at once.

# a, b = [444, 555];
a // -> 444
b // -> 555

The simplest of patches - mapping pitch to the frequency of the sine

(
{
	var env, in, freq, hasFreq;
	
	// the audio input
	in = AudioIn.ar(1); 
	
	// the pitch variable and the hasFreq (Pitch.kr returns a list like this [freq, hasFreq])
	# freq, hasFreq = Pitch.kr(in, ampThreshold: 0.2, median: 7);
	
	// when the hasFreq is true (pitch is found) we generate a ADSR envelope that is open until
	// the hasFreq is false again or the amplitude is below the ampThreshold of the Pitch.
	env = EnvGen.ar(Env.adsr(0.51, 0.52, 1, 0.51, 1, -4), gate: hasFreq);
	
	// we plug the envelope into the volume argument of the sine
	SinOsc.ar(freq, 0, env * 0.5) ! 2

}.play;
)

// a bit more complex patch where we use Amplitude to control volume:

(
{
	var env, in, freq, hasFreq, amp;
	
	// the audio input
	in = AudioIn.ar(1); 
	amp = Amplitude.kr(in, 0.25, 0.25);
	
	// the pitch variable and the hasFreq (Pitch.kr returns a list like this [freq, hasFreq])
	# freq, hasFreq = Pitch.kr(in, ampThreshold: 0.2, median: 7);
	
	// when the hasFreq is true (pitch is found) we generate a ADSR envelope that is open until
	// the hasFreq is false again or the amplitude is below the ampThreshold of the Pitch.
	env = EnvGen.ar(Env.adsr(0.51, 0.52, 1, 0.51, 1, -4), gate: hasFreq);
	
	// we plug the envelope into the volume argument of the sine
	SinOsc.ar(freq, 0, env * amp) ! 2
	
}.play;
)

(
SynthDef(\pitcher,{
	var in, amp, freq, hasFreq, out, gate, threshold;
	
	threshold = 0.05; // change 
	
	// using a LowPassFilter to remove high harmonics
	in = LPF.ar(Mix.new(AudioIn.ar([1,2])), 2000);
	amp = Amplitude.kr(in, 0.25, 0.25);
	
	# freq, hasFreq = Pitch.kr(in, ampThreshold: 0.1, median: 7);
	gate = Lag.kr(amp > threshold, 0.01);	

	// -- to look at the values, uncomment the following lines 
	// -- (you need a recent build with the Poll class)
	//Poll.kr(Impulse.kr(10), freq, "frequency:");
	//Poll.kr(Impulse.kr(10), amp, "amplitude:");
	//Poll.kr(Impulse.kr(10), hasFreq, "hasFreq:");
	
	out = VarSaw.ar(freq, 0, 0.2, amp*hasFreq*gate);
	
	// uncomment (3 sines (octave lower, pitch and octave higher mixed into one signal (out)))
	//out = Mix.new(SinOsc.ar(freq * [0.5,1,2], 0, 0.2 * amp*hasFreq*gate));
	6.do({
		out = AllpassN.ar(out, 0.040, [0.040.rand,0.040.rand], 2)
	});
	Out.ar(0,out)
}).play(s);
)

In the example below we use the Tartini UGen by Nick Collins. In my experience it performs better than Pitch, and it is part of the SC3-plugins external plugins.

(
SynthDef(\pitcher,{
	var in, amp, freq, hasFreq, out, threshold, gate;

	threshold = 0.05; // change 
	in = LPF.ar(Mix.new(AudioIn.ar([1,2])), 2000);
	amp = Amplitude.kr(in, 0.25, 0.25);

	# freq, hasFreq = Tartini.kr(in);
	gate = Lag.kr(amp > threshold, 0.01);	
	
	// -- to look at the values, uncomment the following lines 
	// -- (you need a recent build with the Poll class)
	//Poll.kr(Impulse.kr(10), freq, "frequency:");
	//Poll.kr(Impulse.kr(10), amp, "amplitude:");
	//Poll.kr(Impulse.kr(10), hasFreq, "hasFreq:");
		
	out = Mix.new(VarSaw.ar(freq * [0.5,1,2], 0, 0.2, gate* hasFreq *amp ));
	//out = Mix.new(SinOsc.ar(freq * [0.5,1,2], 0, 0.2 * amp*hasFreq*gate));
	6.do({
		out = AllpassN.ar(out, 0.040, [0.040.rand,0.040.rand], 2)
	});
	Out.ar(0,out)
}).play(s);
)

Filters

The filter UGens in SuperCollider work in the time domain, typically as recursive difference equations over recent input and output samples (as the SOS example below shows), to achieve the desired frequency response.

Low Pass Filter

(
{
	var in;
	in = AudioIn.ar(1);
	LPF.ar(in, MouseX.kr(80, 4000));
}.play
)

(
{
	var in;
	in = Blip.ar(440);
	LPF.ar(in, MouseX.kr(80, 24000));
}.play
)

Resonant Low Pass Filter

(
{
	var in;
	in = Blip.ar(440);
	RLPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

(
{
	var in;
	in = WhiteNoise.ar(1);
	RLPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

High Pass Filter

(
{
	var in;
	in = Blip.ar(440);
	HPF.ar(in, MouseX.kr(80, 22000));
}.play
)

(
{
	var in;
	in = WhiteNoise.ar(1);
	HPF.ar(in, MouseX.kr(80, 22000));
}.play
)

Resonant High Pass Filter

(
{
	var in;
	in = Blip.ar(440);
	RHPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

(
{
	var in;
	in = WhiteNoise.ar(1);
	RHPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

Band Pass Filter

(
{
	var in;
	in = Blip.ar(440);
	BPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

(
{
	var in;
	in = WhiteNoise.ar(1);
	BPF.ar(in, MouseX.kr(80, 22000), MouseY.kr(0.0001, 1));
}.play
)

Band Reject Filter

{ BRF.ar(Saw.ar(200,0.1), FSinOsc.kr(XLine.kr(0.7,300,20),0,3800,4000), 0.3) }.play;

{ BRF.ar(Saw.ar(200,0.5), MouseX.kr(100, 10000, 1), 0.3) }.play;

SOS - A biquad filter

A second order filter section, also known as a biquad filter. The helpfile shows the algorithm itself:

out(i) = (a0 * in(i)) + (a1 * in(i-1)) + (a2 * in(i-2)) + (b1 * out(i-1)) + (b2 * out(i-2))

Here you can see that the filter reaches back to the two samples before the current one (in both input and output), and uses the coefficients (a0, a1, a2, b1 and b2) to determine the behaviour of the filter.

(
{
	var rho, theta, b1, b2;
	theta = MouseX.kr(0.2pi, pi);
	rho = MouseY.kr(0.6, 0.99);
	b1 = 2.0 * rho * cos(theta);
	b2 = rho.squared.neg;
	SOS.ar(WhiteNoise.ar(0.1 ! 2), 1.0, 0.0, 0.0, b1, b2)
}.play
)

Resonant filter

This filter resonates frequencies around the set frequency. The bwr parameter is the bandwidth ratio (bandwidth divided by centre frequency, the reciprocal of Q), that is, how much energy is passed on each side of the centre frequency.

{ Resonz.ar(WhiteNoise.ar(0.5), 2000, XLine.kr(1, 0.001, 8)) }.play

// high amp input (from Impulse) and low RQ makes a note 

{Resonz.ar(Impulse.ar(1.5, 0, 50), Rand(200,2000), 0.03) }.play

// try putting 500 in amp and 0.003 in RQ
{Resonz.ar(Impulse.ar(1.5, 0, 500), Rand(200,2000), 0.003) }.play


// for fun ( if you don't like the polyrhythm, put 1 instead of trig)
// or if you like it, then put some more tempi in there and appropriate weights

(
var trig;
var wait = 4;
Task({
	20.do({
		trig = [1, 1.5].wchoose([0.7, 0.3]);
		{Resonz.ar(Impulse.ar(trig, 0, 50*rrand(5,10)), Rand(200,2000), 0.003) ! 2}.play;
		(wait + rrand(0.1,1)).wait;
		wait = wait - rrand(0.01, 0.2);
	})
}).play
)

Chapter 13 - Fast Fourier Transform (FFT)

Most well-known audio effects process audio in the time domain, typically varying samples in amplitude (ring modulation, waveshaping, distortion) or in time (filters and delays). The Fast Fourier Transform (FFT) is a computational algorithm that allows us to manipulate sound in the frequency domain instead, performing various calculations on the individual frequency bins of the signal.

In FFT, windows are taken from the sound signal and analysed one by one. (The window size is typically 512 or 1024 samples, yielding a list of 256 or 512 bins: values of magnitude and phase.) The processing (using the PV plugins of SC) is done in the frequency domain, and the result is converted back to the time domain before playback. The windows are normally overlapped and shaped with a Hanning window to prevent smearing between frequencies.

To use FFT in SuperCollider, you first do the FFT analysis using the FFT UGen; then diverse PV_Ugens (Phase Vocoder UGens) can be applied to operate mathematically on the signal; and finally the resulting signal needs to be converted back into the time domain using the Inverse Fast Fourier Transform (IFFT).

Or, in short: FFT -> PV_Ugens -> IFFT

where FFT translates the signal from the time domain into the frequency domain, the PV_UGens perform some functions on the sound and then we use Inverse Fast Fourier Transform (IFFT) to translate the signal back to the time domain.
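
A minimal sketch of the bare round trip, with no PV processing in between (it should sound the same as the input signal):

(
{
	var in, chain;
	in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in); // into the frequency domain
	Out.ar(0, IFFT(chain) ! 2); // and straight back into the time domain
}.play
)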

Each frequency bin is a pair of magnitude and phase values. The larger the window, the better the pitch resolution, but the worse the precision in time. The smaller the window, the worse the pitch resolution, but the better the precision in time.

The bin spacing is the sample rate divided by the window size:

44100/512  // = 86.1328125, so the first (lowest) frequency of a 512-sample window is 86.13 Hz
44100/1024 // = 43.06640625, so the first (lowest) frequency of a 1024-sample window is 43.07 Hz

For a window size of 1024 samples we get 512 bins. These are the frequencies for which we will get the magnitude and phase:

Post << 512.collect({|i| (22050/512)*(i+1)})

(And we would need a 1024-frame Buffer to store that: a magnitude and a phase for each frequency.)

The full list of frequencies, including DC, that a 1024-point FFT theoretically generates:

a = 1024.collect({|i| (44100/1024)*i});

Except we ignore the bins above the Nyquist frequency, since they are redundant:

a = a[..512];
a.postcs;

NOTE: some of the examples below use FFT plugins from the library of Bhob Rainey.

So, in general, it is important to understand that FFT analysis of a sound gives you two arrays: bins (the frequencies, depending upon the size of the window) and mags (the magnitudes/amplitudes of those frequencies). The FFT UGens manipulate either the bins or the mags.

Fast Fourier Transform examples

// load the buffers (and place your sounds into the buffers)
(
b = Buffer.alloc(s,2048,1);
c = Buffer.alloc(s,2048,1);
//d = Buffer.read(s,"sounds/oceanMONO.aif");
//d = Buffer.read(s,"sounds/insand/camina.aif");
d = Buffer.read(s,"sounds/digireedoo.aif");
e = Buffer.read(s,"sounds/holeMONO.aif");
f = Buffer.read(s, "sounds/a11wlk01.wav");
)

MagAbove

Passes only bins whose magnitude is above a given threshold.

(
SynthDef(\pvmagabove, { arg out=0, soundBufnum1;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	//in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_MagAbove(chain, MouseY.kr(30, 1)); 
	Out.ar(out, 0.5 * IFFT(chain)!2);
}).play(s,[\out,0, \soundBufnum1, e.bufnum]);
)

BrickWall

Clears bins above or below a cutoff point (so it works as a lowpass or highpass filter).

(
SynthDef(\pvbrickwall, { arg out=0, soundBufnum1;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	//in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_BrickWall(chain, MouseX.kr(-1,1)); 
	Out.ar(out, 0.5 * IFFT(chain)!2);
}).play(s,[\out,0, \soundBufnum1, e.bufnum]);
)

RectComb

Generates a series of gaps in a spectrum

(
SynthDef(\pvrectcomb, { arg out=0, soundBufnum1;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	//in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_RectComb(chain, 8, LFTri.kr(0.097,0,0.4,0.5), 
		LFTri.kr(0.24,0,-0.5,0.5)); 
	Out.ar(out, 0.5 * IFFT(chain)!2);
}).play(s,[\out, 0, \soundBufnum1, e.bufnum]);
)

RectComb, controllable with the mouse:

(
SynthDef(\pvrectcomb, { arg out=0, soundBufnum1;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	//in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_RectComb(chain,  MouseX.kr(0, 32), MouseY.kr, 0.2); 
	Out.ar(out, 0.5 * IFFT(chain)!2);
}).play(s,[\out,0, \soundBufnum1, e.bufnum]);
)

MagFreeze

Freezes magnitudes at current levels when freeze > 0

(
SynthDef(\pvmagfreeze, { arg out=0, soundBufnum1;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	//in = WhiteNoise.ar(0.2);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_MagFreeze(chain, MouseX.kr(-1, 1) ); // on the right side it freezes
	Out.ar(out, 0.5 * IFFT(chain)!2);
}).play(s,[\out,0, \soundBufnum1, f.bufnum]);
)

CopyPhase

Combines the magnitudes of the first input with the phases of the second input.

(
SynthDef(\pvcopy, { arg out=0, soundBufnum=2;
	var inA, chainA, inB, chainB, chain;
	inA = PlayBuf.ar(1, soundBufnum, BufRateScale.kr(soundBufnum), loop: 1);
	inB =  SinOsc.ar(SinOsc.kr(SinOsc.kr(0.08, 0, 6, 6.2).squared, 0, 100, 800));
	chainA = FFT(LocalBuf(2048), inA);
	chainB = FFT(LocalBuf(2048), inB);
	chain = PV_CopyPhase(chainA, chainB); 
	Out.ar(out,  0.5 * IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum, d.bufnum]);
)

Magnitude smear

Averages a bin’s magnitude with those of its neighbours.

(
SynthDef(\pvmagsmear, { arg out=0, soundBufnum=2;
	var in, chain;
	in = PlayBuf.ar(1, soundBufnum, BufRateScale.kr(soundBufnum), loop: 1);
	chain = FFT(LocalBuf(2048), in);
	chain = PV_MagSmear(chain, MouseX.kr(0, 100)); 
	Out.ar(out, 0.5 * IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum, e.bufnum]);
)

Morph

Morphs between two buffers.

(
SynthDef(\pvmorph, { arg out=0, soundBufnum1=2, soundBufnum2=3;
	var inA, chainA, inB, chainB, chain;
	inA = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	inB = PlayBuf.ar(1, soundBufnum2, BufRateScale.kr(soundBufnum2), loop: 1);
	chainA = FFT(LocalBuf(2048), inA);
	chainB = FFT(LocalBuf(2048), inB);
	chain = PV_Morph(chainA, chainB, MouseX.kr); 
	Out.ar(out,  IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum1, d.bufnum, \soundBufnum2, e.bufnum]);
)

XFade

Interpolates bins between two buffers.

(
SynthDef(\pvmorph, { arg out=0, soundBufnum1=2, soundBufnum2=3;
	var inA, chainA, inB, chainB, chain;
	inA = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	inB = PlayBuf.ar(1, soundBufnum2, BufRateScale.kr(soundBufnum2), loop: 1);
	chainA = FFT(LocalBuf(2048), inA);
	chainB = FFT(LocalBuf(2048), inB);
	chain = PV_XFade(chainA, chainB, MouseX.kr); 
	Out.ar(out,  IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum1, d.bufnum, \soundBufnum2, e.bufnum]);
)

Softwipe

Copies the low bins from one input and the high bins from the other.

(
SynthDef(\pvsoftwipe, { arg out=0, soundBufnum1=2, soundBufnum2=3;
	var inA, chainA, inB, chainB, chain;
	inA = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	inB = PlayBuf.ar(1, soundBufnum2, BufRateScale.kr(soundBufnum2), loop: 1);
	chainA = FFT(LocalBuf(2048), inA);
	chainB = FFT(LocalBuf(2048), inB);
	chain = PV_SoftWipe(chainA, chainB, MouseX.kr); 
	Out.ar(out,  IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum1, d.bufnum, \soundBufnum2, e.bufnum]);
)

MagMinus

Subtracting spectral energy - Subtracts buffer B’s magnitudes from buffer A.

(
SynthDef(\pvmagminus, { arg out=0, soundBufnum1=2, soundBufnum2=3;
	var inA, chainA, inB, chainB, chain;
	inA = PlayBuf.ar(1, soundBufnum1, BufRateScale.kr(soundBufnum1), loop: 1);
	inB = PlayBuf.ar(1, soundBufnum2, BufRateScale.kr(soundBufnum2), loop: 1);
	chainA = FFT(LocalBuf(2048), inA);
	chainB = FFT(LocalBuf(2048), inB);
	chain = PV_MagMinus(chainA, chainB, MouseX.kr(0, 1)); 
	Out.ar(out,  IFFT(chain).dup);
}).play(s,[\out, 0, \soundBufnum1, d.bufnum, \soundBufnum2, e.bufnum]);
)

Language manipulation of bins

The PV_ UGens are black boxes. We can read their helpfiles, but we don’t see clearly what they do unless we look at their C++ source code. But what if we want to manipulate the bins on the language side?

The pvcollect method (phase vocoder collect) in SuperCollider allows this, so instead of:

FFT -> PV_Ugens -> IFFT

as we looked at above, we can now do:

FFT -> our bin calculations -> IFFT

We do this through pvcollect (compare the collect method in the Collection helpfile). pvcollect processes each bin of an FFT chain separately (see the pvcollect helpfile); it takes a function, and it is inside this function that we can have fun with the magnitude and the phase of the signal (as taken into the frequency domain).

We have magnitude, phase and index to play with; the function we pass to pvcollect returns an array of [mag, phase] for each bin. We can then use all kinds of algorithms to play with the mag and the phase, for example using the index as a parameter in the calculations.

(
s.boot.doWhenBooted{
c = Buffer.read(s,"sounds/a11wlk01.wav");
}
)
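
Before manipulating anything, it can be useful to verify the round trip: an identity pvcollect that returns [mag, phase] unchanged should sound just like the original (a minimal sketch):

(
{
	var in, chain;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	// return the magnitude and phase untouched for every bin
	chain = chain.pvcollect(1024, {|mag, phase, index| [mag, phase] });
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)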

Spectral delay - here we use a DelayN UGen to delay the bins according to MouseX location

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	v = MouseX.kr(0.1, 1);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		mag + DelayN.kr(mag, 1, v);
	}, frombin: 0, tobin: 256, zeroothers: 1);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Another type of spectral delay, where the high frequencies get longer delay times. This is the trick:

250.do({|i|(i*(250.reciprocal)).postln;})
(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	v = MouseX.kr(0.1, 2);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		mag + DelayN.kr(mag, 1, v*(index*256.reciprocal));
	}, frombin: 0, tobin: 256, zeroothers: 0);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Yet another spectral delay, where each bin gets a random delay time

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	v = MouseX.kr(0.1, 2);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		mag + DelayN.kr(mag, 1, v*1.0.rand);
	}, frombin: 0, tobin: 256, zeroothers: 0);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Spectral delay where the delaytimes are modulated by an oscillator

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	v = MouseX.kr(0.1, 2);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		mag + DelayN.kr(mag, 1, v*SinOsc.ar(0.5).range(0.1,1));// play with Tri or LFSaw, etc.
	}, frombin: 0, tobin: 256, zeroothers: 0);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Amplitude controlled with MouseX and phase manipulation with MouseY

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
		
	chain = chain.pvcollect(1024, {|mag, phase, index|
		[mag * MouseX.kr(0.5, 2), phase / MouseY.kr(0.5, 30)]
	}, frombin: 0, tobin: 250, zeroothers: 0);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Here we replace the phase with noise

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
		
	chain = chain.pvcollect(1024, {|mag, phase, index|
		[mag, LFNoise0.kr.range(0, 3.14)];
	}, frombin: 0, tobin: 250, zeroothers: 1);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Take the square root of the magnitude and use a random phase (from 0 to pi (3.14))

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		[mag.sqrt, pi.rand];
	}, frombin: 0, tobin: 256, zeroothers: 1);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Here we use the index, subtracting an LFPar on a slow sweep from it, to make a swept bandpass

(
{
	var in, chain, v;
	in = PlayBuf.ar(1, c, BufRateScale.kr(c), loop: 1);
	chain = FFT(LocalBuf(1024), in);
	
	chain = chain.pvcollect(1024, {|mag, phase, index|
		if((index-LFPar.kr(0.1).range(2, 1024/20)).abs < 10, mag, 0); // swept bandpass
	}, frombin: 0, tobin: 250, zeroothers: 0);
	
	Out.ar(0, 0.5 * IFFT(chain).dup);
}.play(s);
)

Chapter 14 - Busses, Nodes, Groups and Signalflow

The SuperCollider server is an extremely well designed application which allows us to structure nodes on busses and add effects before or after them, just as we would on a good hardware mixer. This chapter will explore the ins and outs of the server.

Busses in SC (Audio and Control Busses)

What are busses? They are virtual placeholders for signals. A good description is to be found in the Server-Architecture helpfile:

Audio Buses Synths send audio signals to each other via a single global array of audio buses. Audio buses are indexed by integers beginning with zero. Using buses rather than connecting synths to each other directly allows synths to connect themselves to the community of other synths without having to know anything about them specifically. The lowest numbered buses get written to the audio hardware outputs. Immediately following the output buses are the input buses, read from the audio hardware inputs. The number of bus channels defined as inputs and outputs do not have to match that of the hardware.

Control Buses Synths can send control signals to each other via a single global array of control buses. Buses are indexed by integers beginning with zero.

If you look at the source file of ServerOptions, you will see that there is a default number of audio and control busses assigned to the server on booting. You can change these values, of course:

var <>numAudioBusChannels=128;
var <>numControlBusChannels=4096;
var <>numInputBusChannels=8;
var <>numOutputBusChannels=8;

We see that we’ve got 128 audio busses and 4096 control busses. This should be more than enough in most cases, but if you need more you can:

a) question why you need more! Are you designing your program correctly? b) change the number via ServerOptions before booting (for example s.options.numAudioBusChannels), or edit the ServerOptions file and recompile.

We also see that by default we have 8 output and 8 input busses. This means that with these settings bus index 8 is actually the first input channel (indices 0-7 being the outputs). Change this to fit your sound card if you want.
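
These values can also be set from code before the server boots (a minimal sketch; the channel counts here are arbitrary):

s.options.numOutputBusChannels = 2; // the first two busses go to the hardware outputs
s.options.numInputBusChannels = 2; // the next two busses read from the hardware inputs
s.reboot; // the options take effect when the server (re)boots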

Busses are not exactly the same as audio channels. Channels as we normally think of them are physical channels, as on a mixer or a sound card, whereas a bus is more like an abstract representation of a channel. Thus a bus can be mono, stereo or even 5-channel, depending on your needs.
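
The Bus class wraps this up for us (a minimal sketch):

b = Bus.audio(s, 2); // allocate a two-channel (stereo) audio bus
b.index; // the first of the two allocated bus indices
b.numChannels; // -> 2
b.free; // release the indices when done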

Audio Busses

Audio busses run at audio rate (e.g., 44.1 kHz).

Here below is some code that shows how the busses work. The figure in chapter 2 can be helpful here, although it is simple.

(
SynthDef(\bustest, {arg outbus=0, freq=440;
	Out.ar(outbus, SinOsc.ar(freq, 0, 0.3));
}).add
)

a = Synth(\bustest, [\outbus, 0]) // left speaker
b = Synth(\bustest, [\outbus, 1]) // right speaker
c = Synth(\bustest, [\outbus, 2]) // channel 3?

// now we free a and b
a.free; b.free; 

// but c is still running on bus 2 - we just can't hear it (assuming you're in stereo)

// so we create a synthdef that can listen to any bus and output on any bus
(
SynthDef(\bustest2, {arg inbus=10, outbus=0;
	Out.ar(outbus, In.ar(inbus, 1));
}).add
)

// and we listen to bus 2 and output on bus 0. - don't worry about addAction now.
d = Synth(\bustest2, [\inbus, 2, \outbus, 0], addAction:\addToTail);

// If you were wondering about the comment on inbusses and outbusses, you can try
// to listen to the audio in bus (by default on bus 8) (if you have an active mic that is)

d = Synth(\bustest2, [\inbus, 8, \outbus, 0], addAction:\addToTail);

Control Busses

Here signals run at control rate (e.g., 689 times per second, which is the sample rate divided by the default block size of 64).

A control bus can be mapped to control values in many synths. Let’s make a control synth that maps the freq value of the synth above.

(
SynthDef(\lfo, {arg ctrlbus = 2, freq=4, mul=100;
        Out.kr(ctrlbus, SinOsc.kr(freq, 0, mul: mul, add: 200)); // note the .kr
}).add;
)

// we create our synth
a = Synth(\bustest);

// we make a control bus that will be controlling the freq of our synth
b = Bus.control(s, 1);
b.value = 200;

// then we map the freq parameter of the synth to that control bus

a.map(\freq, b.index);

// and we can try to put different values into the control bus

b.value = 600;
b.value = 400;

// and, of course, the values of the control bus can be dynamic

c = Synth(\lfo, [\ctrlbus, b.index]);
c.set(\freq, 7);
c.set(\freq, 2);
c.set(\mul, 200);

// let's change the lfo to a LFSaw

(
SynthDef(\lfosaw, {arg ctrlbus = 2, freq=4, mul=100;
        Out.kr(ctrlbus, LFSaw.kr(freq, 0, mul: mul, add: 200)); // note the .kr
}).add;
)

c.free;
d = Synth(\lfosaw, [\ctrlbus, b.index]);
d.set(\freq, 7);
d.set(\freq, 2);
d.set(\mul, 200);

This way you can really plug synths into each other just like on an old fashioned modular synth. For a different take on modular coding, check the JIT lib, the use of ProxySpace and Ndefs.

Nodes

We have already been using nodes in this tutorial. Creating a synth like this: a = Synth(\bustest); is creating a node. We can then set the frequency of the node a.set(\freq, 880); or just free it: a.free;

Nodes live in a tree on the server. The tree can be seen as a mythic monster with a head facing up and a tail facing down. A synth (a node) can take audio in from one bus and output onto another bus (the SynthDef handles that), and the server calculates the nodes in order, from the head down to the tail. You put a sound-generating synth towards the head so that synths below it can process its signal, and a listening synth (such as an effect) towards the tail, where it receives the signal written to the busses above it.

When you start SC there is a default group that receives all nodes:

s.queryAllNodes; // note the RootNode (ID 0) and the default Group (ID 1)

By default synths are added to the HEAD of a group (in this instance the default group)

So in the following program you don't hear anything (but you can see the 2 synths in the server window):

(
{Out.ar(2, PinkNoise.ar(0.3)!2)}.play;
{In.ar(2, 2)}.play // added to the head of the group, above the PinkNoise, so it reads bus 2 before anything is written to it
)

But in the example below you will hear sound, because the PinkNoise is put onto the head AFTER the listener (In), and therefore writes to the bus before the In reads it:

(
{In.ar(2, 2)}.play;
{Out.ar(2, PinkNoise.ar(0.3)!2)}.play;
)

Or better: be specific and simply add the In listener to the tail of the default group and we hear:

(
{Out.ar(2, PinkNoise.ar(0.3)!2)}.play;
{In.ar(2, 2)}.play(addAction:\addToTail)
)

This is the meaning of \addToHead, \addToTail, \addBefore, and \addAfter.
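Function.play also takes a target and an addAction argument, so we can be explicit about where a synth lands relative to another node. A minimal sketch:

x = {Out.ar(2, PinkNoise.ar(0.3)!2)}.play;
y = {In.ar(2, 2)}.play(x, addAction: \addAfter); // the listener lands right after x, so we hear it
x.free; y.free;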

And if we keep these synths running we can see that they have been added to the Group (default)

s.queryAllNodes; 

{Out.ar(2, SinOsc.ar(200)!2)}.play; // adding to head by default

s.queryAllNodes; 

Here is a practical example using a reverb and a delay for a snare

(
SynthDef(\reverb12, {arg inbus=0, outbus=0, predelay=0.048, combdecay=5, allpassdecay=1, revVol=0.31;
	var sig, y, z;
	sig = In.ar(inbus, 2); 
	z = DelayN.ar(sig, 0.1, predelay); // max 100 ms predelay
	y = Mix.ar(Array.fill(7,{ CombL.ar(z, 0.05, rrand(0.03, 0.05), combdecay) })); 
	6.do({ y = AllpassN.ar(y, 0.050, rrand(0.03, 0.05), allpassdecay) });
	Out.ar(outbus, sig + (y * revVol)); 
}).add; 

SynthDef(\delay12, {arg inbus=0, outbus=0, maxdelaytime=6, delaytime=0.3, decaytime=2.31;
	var sig, y, z;
	sig = In.ar(inbus, 2); 
	sig = CombN.ar(sig, maxdelaytime, delaytime, decaytime);
	Out.ar(outbus, sig); 
}).add; 

SynthDef(\snare12, { arg out=0, tempo=2;
	var snare, base, hihat;
	tempo = Impulse.ar(tempo); // for a drunk drummer replace Impulse with Dust !!!

	snare = 	WhiteNoise.ar(Decay2.ar(PulseDivider.ar(tempo, 4, 2), 0.005, 0.5));

	Out.ar(out, snare * 0.4 !2)
}).add;
)

A snare on bus 0 - no effects yet (note that the SynthDef's argument is called out):
a = Synth(\snare12, [\out, 0]);

We create a reverb synth reading from audio bus 20 and a delay reading from audio bus 22 (stereo signals):

b = Synth(\reverb12, [\inbus, 20, \outbus, 0]);
c = Synth(\delay12, [\inbus, 22, \outbus, 0]);

s.queryAllNodes; 

a.set(\out, 20) // route the snare into the reverb
a.moveBefore(b)
s.queryAllNodes; 

a.set(\out, 22) // route the snare into the delay
a.moveBefore(c)
s.queryAllNodes; 


{Out.ar(20, SoundIn.ar(0))}.play(addAction:\addToHead) // we add the audio input to the snare's reverb bus (SoundIn.ar(0) is input 1)

And we could add a synth AFTER the delay:

a = Synth(\snare12, [\out, 22, \tempo, 4], addAction:\addToTail)
a.free;

Or we add it BEFORE the delay:

a = Synth(\snare12, [\out, 22, \tempo, 4], addAction:\addToHead)
a.free;

Groups

Groups can be useful if you are making complex things and you want to group certain things together. You can think of it like grouping in Photoshop (i.e. making a group that you can move around without having to move every line). For a good explanation of Groups, check Mark Polishook’s tutorial which can be found in the distribution of SC

Group example (check the Group and Node helpfiles for more)

g = Group.new // we create a new group

And a few synths that respond to the freq argument, but multiply it differently:

{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq,0,0.12))}.play(g)
{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq*1.2,0,0.12))}.play(g)
{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq*1.4,0,0.12))}.play(g)

g.set(\freq, 255) // we change the frequency and ALL the synths get a new frequency
g.set(\out, 10) // we move the output to bus 10

s.queryAllNodes; 

Here we could try to listen to bus 10, but the listener is added to the head of the group (so we hear nothing):

{Out.ar(0, In.ar(10,1))}.play(g)
s.queryAllNodes; 

// so we explicitly add the synth to the tail 
{Out.ar(0, In.ar(10,1))}.play(g, addAction:\addToTail)
s.queryAllNodes; 

We see that we now have 5 synths in a Group (called g)

h = Group.new // we create a new group

{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq,0,0.12))}.play(h)
{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq*1.2,0,0.12))}.play(h)
{arg freq=333, out=0; Out.ar(out, SinOsc.ar(freq*1.4,0,0.12))}.play(h)

h.set(\freq, 255) // we change the frequency and ALL the synths get a new frequency
h.set(\freq, 955) // we change the frequency and ALL the synths get a new frequency

s.queryAllNodes; 

h.moveAfter(g) // we can move h (not that it matters here, but when making effects, it's useful)

s.queryAllNodes

Part IV

Chapter 14 - Musical Patterns on SC Server

The SC Server is a highly streamlined, small and functional piece of software. It does not have the whole of the SuperCollider language to do timing, data flow, etc., but it does have unit generators that can do much of the same.

Stepper and Select

The Stepper is a pulse counter that outputs the count as a signal.

A scale of frequencies from 400 to 1600 in steps of 100 (the counter runs from 4 to 16 and is multiplied by 100):

{SinOsc.ar( Stepper.kr(Impulse.kr(10), 0, 4, 16, 1) * 100, 0, 0.2)}.play;

And here the step size is -3, wrapping between 5 and 15, so we get a more interesting step sequence:

{SinOsc.ar(Stepper.kr(Impulse.kr(6), 0, 5, 15, -3).poll(6, "stepper") * 80, 0, 0.2)}.play;

We poll the Stepper to see the output.

And here we use Lag (generating a line from the current value to the next in specified time) for the frequency.

{SinOsc.ar(Lag.kr(Stepper.kr(Impulse.kr(6), 0, 5, 25, -4) * 90, 6.reciprocal), 0, 0.2)}.play;

// perhaps more understandable like this:
(
{
	SinOsc.ar( 			// the sine
		Lag.kr( 			// our lag
			Stepper.kr(Impulse.kr(6), 0, 5, 25, -4) * 90, // the stepper
			6.reciprocal),// the time of the lag
		0,  				// phase of the sine
		0.2) 			// amplitude of the sine
}.play;
)

NOTE: the lag time is the reciprocal of the Impulse frequency, i.e. the impulse happens 6 times per second, i.e. every 0.16666666666667 seconds. If you check the reciprocal of 6, you get that number. In this case it doesn’t matter whether we use 0.16666666666667 or 6.reciprocal, but if Impulse frequency is in a variable, it could be useful, as in:

f = {arg rate;	
	{SinOsc.ar(Lag.kr(Stepper.kr(Impulse.kr(rate), 0, 5, 25, -4) * 90, rate.reciprocal), 0, 0.2)}.play;
}

f.(6)
f.(12)
f.(24)

Select

(
{
	var scale, cycle;
	//scale = Array.fill(12,{ arg i; 60 + i }).midicps; // we fill an array with a scale
	scale = [60, 61, 63, 64, 65, 67, 68, 69, 70].midicps; // we fill an array with a scale
	cycle = scale.size / 2;

	SinOsc.ar(
			Select.kr( 
				LFSaw.kr(0.4, 1, cycle, cycle),
				scale
			)
	);
}.play;
)

Select and Stepper together

Here we use the Stepper to do what the LFSaw did above: it simply steps through the pitchArray rather than generating the pitches directly as in the Stepper examples above.

(
var pitchArray; //Declare a variable to hold the array
	//load the array with midi pitches
pitchArray = [60, 62, 64, 65, 67, 69, 71, 72].midicps; 
{
	SinOsc.ar(
		Select.kr(
			Stepper.kr(Impulse.kr(8), max: pitchArray.size-1), // try with Dust
			pitchArray),
		mul: 0.5)
}.play
)

PulseCount and PulseDivider

We could also use PulseCount to get at the items of the array

(
{
	var scale, cycle;
	//scale = Array.fill(12,{ arg i; 60 + i }).midicps; // we fill an array with a scale
	scale = [60, 61, 63, 64, 65, 67, 68, 70].midicps; // we fill an array with a scale
	cycle = scale.size / 2;

	SinOsc.ar(
			Select.kr( 
				PulseCount.ar(Impulse.ar(scale.size), Impulse.ar(1)), // we go through the scale in 1 sec
				scale
			)
	);
}.play;
)

PulseDivider is also an interesting UGen: it outputs an impulse when it has received a certain number of impulses.
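A minimal sketch of PulseDivider on its own - the left channel carries the full 8 Hz impulse train, the right channel only every 4th impulse:

{ [Impulse.ar(8, 0, 0.3), PulseDivider.ar(Impulse.ar(8), 4) * 0.3] }.play;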

Here we use it to create a drummer in one synth definition (quite primitive, and just for fun, but look at the CPU!):

(
SynthDef(\drummer, { arg out=0, tempo=4;
	var snare, base, hihat;
	tempo = Impulse.ar(tempo); // for a drunk drummer replace Impulse with Dust !!!

	snare = 	WhiteNoise.ar(Decay2.ar(PulseDivider.ar(tempo, 4, 2), 0.005, 0.5));
	base = SinOsc.ar(Line.ar(120,60, 1), 0, Decay2.ar(PulseDivider.ar(tempo, 4, 0), 0.005, 0.5));
	hihat = 	HPF.ar(WhiteNoise.ar(1), 10000) * Decay2.ar(tempo, 0.005, 0.5);
	
	Out.ar(out,(snare + base + hihat) * 0.4!2)
}).add;
)

a = Synth(\drummer);
a.set(\tempo, 6);
a.set(\tempo, 18);
a.set(\tempo, 180); // check the CPU! no increase.

Demand UGens

In chapter 2 we saw how we could use Patterns to control the server. Patterns are language side streams used to control the server. The Demand UGens are server side and don’t need the SC language. So you could use this from languages like Python, Java, etc.

The Demand UGens follow the logic of the Pattern classes of the SCLang - we will look further at Patterns in the next chapter.

(
{
	var freq, trig, reset, seq1;
	trig = Impulse.kr(10);
	seq1 = SinOsc.ar(2, mul: 200, add: 700); 
	freq = Demand.kr(trig, 0, seq1);
	SinOsc.ar(freq + [0,0.7]).cubed.cubed * 0.1;
}.play;
)

// same as above, but here we Demand more frequently and the sine is slower
// and we poll the freq
(
{
	var freq, trig, reset, seq1, trigrate;
	trigrate = 20;
	trig = Impulse.kr(trigrate);
	seq1 = SinOsc.ar(1, mul: 200, add: 700).poll(trigrate.reciprocal, "freq"); 
	freq = Demand.kr(trig, 0, seq1);
	SinOsc.ar(freq + [0,0.7]).cubed.cubed * 0.1;
}.play;
)

Using LFSaw instead of a SinOsc

(
{
	var freq, trig, reset, seq1, seq2;
	trig = Impulse.kr(10);
	seq1 = LFSaw.ar(1, mul: 200, add: 700); 
	freq = Demand.kr(trig, 0, seq1);
	SinOsc.ar(freq + [0,0.7]).cubed.cubed * 0.1;
}.play;
)

Using LFTri, now with the mouse controlling the mul and add of the frequency oscillator:

(
{
	var freq, trig, reset, seq1, seq2;
	trig = Impulse.kr(10);
	seq1 = LFTri.ar(1, mul: MouseX.kr(200,1000), add: MouseY.kr(200,1000)).poll(10.reciprocal, "freq");
	freq = Demand.kr(trig, 0, seq1);
	SinOsc.ar(freq + [0,0.7]).cubed.cubed * 0.1;
}.play;
)

There are useful UGens like Dseq and Drand (compare to Pseq and Prand):

(
{
	var freq, trig, reset, seq1, seq2;
	trig = Impulse.kr(10);
	seq1 = Drand([72, 75, 79, 82]-12, inf).midicps; 
	seq2 = Dseq([72, 75, 79, Drand([82,84,86])], inf).midicps; 
	freq = Demand.kr(trig, 0, [seq1, seq2]);
	SinOsc.ar(freq + [0,0.7]).cubed.cubed * 0.1;
}.play;
)

Dseries

(
{ 
	var a, freq, trig;
	a = Dseries(0, 1.4, 20); // we build a series of values
	trig = Impulse.kr(MouseX.kr(1, 40, 1));
	freq = Demand.kr(trig, Impulse.kr(0.5), a) * 30 + 340; 
	SinOsc.ar(freq) * 0.1

}.play;
)

Dgeom

(
{ 
	var a, freq, trig;
	a = Dgeom(1, 1.4, 20); // we build a series of values
	trig = Impulse.kr(MouseX.kr(1, 40, 1));
	freq = Demand.kr(trig, Impulse.kr(0.5), a) * 30 + 340; 
	SinOsc.ar(freq) * 0.1

}.play;
)

The Dbrown and Dibrown UGens

These UGens are good for random walk (drunken walk)

(
{ 
	var a, freq, trig;
	a = Dibrown(0, 20, 2, inf);
	trig = Impulse.kr(MouseX.kr(1, 40, 1));
	freq = Demand.kr(trig, 0, a) * 30 + 340; 
	SinOsc.ar(freq) * 0.1
}.play;
)

Dwhite is white noise (here we use Diwhite, the integer version) - not drunk anymore but jumping around madly:

(
{ 
	var a, freq, trig;
	a = Diwhite(0, 15, inf);
	trig = Impulse.kr(MouseX.kr(1, 40, 1));
	freq = Demand.kr(trig, 0, a) * 30 + 340; 
	SinOsc.ar(freq) * 0.1

}.play;
)

Using TDuty to demand results from demand rate UGens

(
{
	var minDur = 0.1, delta = 0.01;
	var trig = TDuty.ar(Dbrown(minDur, minDur+delta), 0, Dwhite(0, 1));
	Ringz.ar(trig, TRand.ar(2000, 4050, trig), 0.1)!2
}.play
)

Chapter 15 - Musical Patterns in the SCLang

Throughout this tutorial we have been creating synthesizers, effects, routing them through busses, putting them into groups and more, but for many the question is how to make musical patterns or arrange events in time. For this we need some kind of a representation of the events, for example stored in an array, or generated algorithmically on the fly. Chapter 3 introduced some basic ways of controlling synths, but in this section we will explore in a bit more detail how to arrange musical events in time.

The SynthDefs

For now we’ll use two synth definitions.

SynthDef(\sine, {arg out=0, amp=0.1, freq=440, envdur=1, pan=0.0;
	var signal;
	signal = Pan2.ar(SinOsc.ar(freq, 0, amp**amp).cubed, pan); // note the pan
	signal = signal * EnvGen.ar(Env.perc(0.01, envdur), doneAction:2);
	Out.ar(out, signal);
}).add;

SynthDef(\synth1, {arg out=0, freq=440, envdur=1, amp=0.4, pan=0;
    var x, env;
    env = EnvGen.kr(Env.perc(0.001, envdur, amp), doneAction:2);
    x = Mix.ar([FSinOsc.ar(freq, pi/2, 0.5), Pulse.ar(freq,Rand(0.3,0.7))]);
    x = RLPF.ar(x,freq*4,Rand(0.04,1));
    x = Pan2.ar(x,pan);
    Out.ar(out, x*env);
}).add; 

Routines and Tasks

We have already explored how to play a melody using a Task and a Routine (check the documentation for each, but in short a Task is a Routine that can be paused).

Function has a method called "fork" which will turn the function into a Routine (a co-routine; some might think of it as a "thread", although technically it is not). This allows a process to run independently of what is happening elsewhere in the program.

Routine({
	1.postln; 
	Synth(\sine, [\freq, 220]);
	0.5.wait;
	2.postln;
	Synth(\sine, [\freq, 220*2]);
	0.5.wait;
	3.postln;
	Synth(\sine, [\freq, 220*3]);
	0.5.wait;
}).play

This could also be written as:

{ 3.do({arg i; (i+1).postln; Synth(\sine, [\freq, 220*(i+1)]); 0.5.wait }) }.fork

Or unpacked:

{ 
	3.do({arg i; 
		(i+1).postln; 
		Synth(\sine, [\freq, 220*(i+1)]); 
		0.5.wait;
	}) 
}.fork

So with a little melody stored in an array we could play it repeatedly:

m = [60, 63, 64, 61];

{ inf.do({arg i; Synth(\sine, [\freq, m.wrapAt(i).midicps]); 0.5.wait }) }.fork

The “fork” is running a routine and the routine is played by SuperCollider’s default TempoClock.

If you keep that code running and then evaluate:

TempoClock.default.tempo = 2

You will see how the tempo changes, as the 0.5.wait in the Routine is half a beat of the tempo clock that has now changed its tempo.

Clocks in SuperCollider

All temporal tasks in SuperCollider run from one of the language’s clocks. There are 3 clocks in SuperCollider:

1 - SystemClock (the main clock that starts when you launch SC)
2 - TempoClock (same as SystemClock but counts in beats as opposed to seconds)
3 - AppClock (musically unreliable, but good for communicating with GUIs or external hardware)

Routines, Tasks and Patterns can all be run by these 3 different clocks. You pass the clocks as arguments to them.

SystemClock

Let’s have a quick look at the SystemClock:

(
SystemClock.sched(2.0,{ arg time;  
	time.postln; 
	0.5 // wait between next scheduled event
});
)

(
SystemClock.sched(2.0,{ arg time;  
	"HI THERE! Long wait".postln; 
	nil // no wait - no next scheduled event
});
)

// You can also schedule an event for an absolute time:
(
SystemClock.schedAbs( (thisThread.seconds + 4.0).round(1.0),{ arg time;
	("the time is exactly " ++ time.asString 
		++ " seconds since starting SuperCollider").postln;
});
)

AppClock

The AppClock works pretty much the same but uses a different source clock (on macOS, NSTimer).

You could try to create a GUI which is updated by a clock.

w = Window.new("oo", Rect(100, 100, 240, 100)).front;
x = Slider.new(w, Rect(20, 20, 200, 40));

// This works
{inf.do({x.value_(1.0.rand); 0.4.wait})}.fork(AppClock)

// However this won't work (as it's using the TempoClock by default)
{inf.do({x.value_(1.0.rand); 0.4.wait})}.fork

You will get an error message that could become familiar:

“Operation cannot be called from this Process. Try using AppClock instead of SystemClock.”

You can also get this done by “deferring” the command to the AppClock using .defer.

{inf.do({ {x.value_(1.0.rand)}.defer; Synth(\sine); 0.4.wait})}.fork

So here we are using the default TempoClock to play the \sine synth, but deferring the updating of the GUI to the AppClock.

TempoClock

TempoClocks are typically used for musical tasks. You can run many tempo clocks at the same time, at different tempi or in different meters. TempoClocks are ideal for high priority scheduling of musical events, and if there is a need for external communication, such as MIDI, GUI or Serial communication, the trick is to defer that message with a “{}.defer”.

Let’s explore the tempo clock:

t = TempoClock(2); // tempo is 2 beats per second (120 bpm);

Many people who think in BPM (beats per minute) typically set the argument to the tempo clock as “120/60” (where 120 bpm equals 2 beats per second), or “60/60” (which is 1 bps, and SuperCollider’s “default” tempo).
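As a quick check of that arithmetic (a throwaway clock in the variable u, so we don't overwrite t below):

u = TempoClock(132/60); // 132 bpm, i.e. 2.2 beats per second
u.beatDur; // -> circa 0.4545 seconds per beat
u.stop; // remove the test clock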

The clock above is now in a variable “t” and we can use it to schedule events (at a particular beat in the future):

t.schedAbs(t.beats.ceil, { arg beat, sec; [beat, sec].postln; 1});
t.schedAbs(t.beats.ceil, { arg beat, sec; "ho ho --".post; [beat, sec].postln; 1 });

And we can change the tempo:

t.tempo_(4)

t.beatDur // we can ask the clock the duration of the beats
t.beats // the beat time of the clock

t.clear

Polyrhythm of 3/4 against 4/4

(
t = TempoClock(4);
t.schedAbs(t.beats.ceil, { arg beat, sec;
	beat.postln;
	if (beat % 2==0, {Synth(\sine, [\freq, 444])});
	if (beat % 4==0, {Synth(\sine, [\freq, 333])});
	if (beat % 3==0, {Synth(\sine, [\freq, 888])});
	1; // repeat
});
)
t.tempo_(6)

Polyrhythm of 5/4 against 4/4

(
t = TempoClock(4);
t.schedAbs(t.beats.ceil, { arg beat, sec;
	if (beat % 2==0, {Synth(\sine, [\freq, 444])});
	if (beat % 4==0, {Synth(\sine, [\freq, 333])});
	if (beat % 5==0, {Synth(\sine, [\freq, 888])});
	1; // repeat
});

)

Or perhaps a polyrhythm of 5/4 against 4/4 where the bass line is in 4/4 and the high synth in 5/4.

(
t = TempoClock(4);

t.schedAbs(t.beats.ceil, { arg beat, sec;
	if (beat % 2==0, {Synth(\sine, [\freq, 60.midicps])});
	if (beat % 4==0, {Synth(\sine, [\freq, 64.midicps])});
	if (beat % 5==0, {Synth(\synth1, [\freq, 72.midicps])});
	if (beat % 5==3, {Synth(\synth1, [\freq, 77.midicps])});
	1; // repeat
});
)

Another version

(
t = TempoClock(4);

t.schedAbs(t.beats.ceil, { arg beat, sec;
	if (beat % 4==0, {"one".postln; Synth(\sine, [\freq, 60.midicps])});
	if (beat % 4==2, {"two".postln; Synth(\sine, [\freq, 72.midicps])});
	if ((beat % 4==1) || (beat % 4==3), {Synth(\sine, [\freq, 84.midicps])});
	
	if (beat % 5==0, {Synth(\synth1, [\freq, 89.midicps, \amp, 0.2])});
	if (beat % 5==2, {Synth(\synth1, [\freq, 96.midicps, \amp, 0.2])});
	1; // repeat
});
)

We can try to make this a bit more interesting by creating another synth:

(
SynthDef( \klanks, { arg freqScale = 1.0, amp = 0.1;
	var trig, klan;
	var  p, exc, x, s;
	trig = Impulse.ar( 0 );
	klan = Klank.ar(`[ Array.fill( 16, { linrand(8000.0) + 60 }), nil, Array.fill( 16, { rrand( 0.1, 2.0)})], trig, freqScale );
	klan = (klan * amp).softclip;
	DetectSilence.ar( klan, doneAction: 2 );
	Out.ar( 0, Pan2.ar( klan ));
}).store;
)

And play the same polyrhythm.

(
t = TempoClock(4);

t.schedAbs(t.beats.ceil, { arg beat, sec;
	if (beat % 4==0, {"one".postln; Synth(\klanks, [\freqScale, 40.midicps])});
	if (beat % 4==2, {"two".postln; Synth(\klanks, [\freqScale, 52.midicps])});
	if ((beat % 4==1) || (beat % 4==3), {Synth(\klanks, [\freqScale, 43.midicps])});
	
	if (beat % 7==0, {Synth(\synth1, [\freq, 88.midicps, \amp, 0.2])});
	if (beat % 7==3, {Synth(\synth1, [\freq, 96.midicps, \amp, 0.2])});
	if (beat % 7==5, {Synth(\synth1, [\freq, 86.midicps, \amp, 0.2])});

	1; // repeat
});
)

t.tempo_(8)

// an example showing tempo changes 

(
t = TempoClock(80/60); // 80 bpm
// schedule an event at next whole beat
t.schedAbs(t.beats.ceil, { arg beat, sec; 
	"beat : ".post; beat.postln;
	if (beat % 4==0, { Synth(\sine, [\freq, 60.midicps]) });
	if (beat % 4==2, { Synth(\sine, [\freq, 67.midicps]) });
	if (beat % 1==0, { Synth(\sine, [\freq, 72.midicps]) }); // on every beat
	1 // 1 here means that we are repeating/looping this
});
t.schedAbs(16, { arg beat, sec; 
	" ****  tempochange on beat : ".post; beat.postln; 
	t.tempo_(150/60); // 150 bpm
});
5.do({ |i| // on beats 32, 36, 40, 44, 48 
	t.schedAbs(32+(i*4), { arg beat, sec;
		" ****  tempo is now : ".post; (150-(10*(i+1))).post; " BPM".postln; 
		t.tempo_((150-(10*(i+1)))/60); // going down by 10 bpm each time
	});
});
t.schedAbs(60, { arg beat; t.tempo_(200/60) }); // 200 bpm
t.schedAbs(76, { arg beat;
	t.clear;
	t.schedAbs(t.beats.ceil, { arg beat, sec; 
		"beat : ".post; beat.postln;
		if (beat % 4==0, { Synth(\sine, [\freq, 67.midicps]) });
		if (beat % 4==2, { Synth(\sine, [\freq, 74.midicps]) });
		if (beat % 1==0, { Synth(\sine, [\freq, 79.midicps]) }); // on every beat
		1 // 1 here means that we are repeating/looping this
	});
	t.schedAbs(92, { arg beat; t.stop }); // stop it!
}); // 200 bpm
t.schedAbs(92, { arg beat; t.stop }); // if we tried to stop it here, it would have been "cleared"
)

A survey of Patterns

We can try to play the above synth definitions with Patterns and it will play using the default arguments of patterns (see the Event source file). Let’s start by exploring the Pbind pattern. As we saw in chapter 3, if you run the code below:

().play // "()" is an empty Event dictionary
Pbind().play // Pbind plays an empty Event

You can hear that there are default arguments: a note played every second, a default instrument (SuperCollider's \default) and a default pitch (middle C, ca. 261.6 Hz).

In the example below, we use Pbind (a pattern that binds keys (synth def arguments) to value patterns). Here we pass the \sine synth def as the value for the \instrument key (again as defined in the Event class).

Pbind(\instrument, \sine).play // it plays our synth definition

Pbind(\instrument, \sine, \freq, Pseq([60, 65, 57, 62].midicps)).play // it plays our synth definition

Our \sine synth has a frequency argument, and we are sending the frequency directly. However, if we wanted we could also send 'note' or 'midinote' arguments; the values are then converted internally to the \freq argument of \sine.

Pbind(\instrument, \sine, \note, Pseq([0, 5, 7, 2])).play // it plays our synth definition

Pbind(\instrument, \sine, \midinote, Pseq([60, 65, 57, 62])).play // it plays our synth definition

Pattern definitions (Pdef) are a handy way to define and play patterns. They are a bit like Synth definitions in that they have a unique name and can be recompiled on the fly.

(
Pdef(\scale, Pbind(
	\instrument, \sine,
	\freq, Pseq([62, 64, 67, 69, 71, 74], inf).midicps,
	\dur,  Pseq([0.25, 0.5, 0.25, 0.25, 0.5, 0.5], inf)
));
)

a = Pdef(\scale).play;
a.pause // pause the stream
a.resume // resume it
a.stop 	// stop it (resets it)
a.play 	// start again

Then we can set variables in our instrument using .set

Pdef(\scale).set(\out, 20); // outbus 20 
Pdef(\scale).set(\out, 0); // outbus 0 

// here we set the duration of the envelope in our instrument
Pdef(\scale).set(\envdur, 0.2);

Patterns use default keywords defined in the Event class, so take care not to use those keywords in your synth definitions. If we had used dur instead of envdur for the envelope in our instrument, setting it would change the time between the events of the pattern rather than the envelope:

Pdef(\scale).set(\dur, 0.1);

because dur is a keyword of Patterns (the main ones are \dur, \freq, \amp, \out and \midinote).

Resetting the freq value this way is not possible, however:

Pdef(\scale).set(\freq, Pseq([72,74,72,69,71,74], inf).midicps); // this does not work

One solution would be to resubmit the Pattern Definition:

(
Pdef(\scale, Pbind( \instrument, \sine,
				\freq, Pseq([72,74,72,69,71,74], inf).midicps, // different sequence
				\dur,  Pseq([0.25, 0.5, 0.25, 0.25, 0.5, 0.5], inf)
)); 
)
// and it's still in our variable "a", it's just the definition that's different
a.pause
a.resume

Patterns and environmental variables

We could also use Pdefn (read the helpfiles to compare Pdef and Pdefn); here we are using environment variables to refer to patterns.

We use a Pdefn to hold the scale

Pdefn(\scaleholder, { |arr| Pseq(arr.freqarr) });

And we add an array to it

Pdefn(\scaleholder).set(\freqarr, Array.fill(6, {440 +(300.rand)} ));

Then we play a Pdef with the Pdefn

Pdef(\scale,
	Pbind(
		\instrument, \synth1,
		\freq, Pn(Pdefn(\scaleholder), inf), // loop
		\dur, 0.4
	)
);
a = Pdef(\scale).play;

And we can reset our scale

Pdefn(\scaleholder).set(\freqarr, Array.fill(3, {440 +(300.rand)} ));

Another example

(
Pdefn(\deg, Pseq([0, 3, 2],inf));

Pset(\instrument, \synth1, 
	Ppar([
		Pbind(\degree, Pdefn(\deg)),
		Pbind(\degree, Pdefn(\deg), \dur, 1/3)
])
).play;
)

Pdefn(\deg, Prand([0, 3, [1s, 4]],inf));
Pdefn(\deg, Pn(Pshuf([4, 3, 2, 7],2),inf));
Pdefn(\deg, Pn(Pshuf([0, 3],2),inf));

(
Pdefn(\deg, Plazy { var pat;
	pat = [Pshuf([0, 3, 2, 7, 6],2), Pshuf([3, 2, 6],2), Pseries(11, -1, 11)].choose;
	Pn(pat, inf)
});
)

Or perhaps:

(
Pdef(\player).set(\instrument, \synth1);

Pdef(\player,
	Pbind(
		\instrument, 	Pfunc({ |e| e.instrument }),
		\midinote, 	Pseq([45,59,59,43,61,43,61,61,45,33,31], inf),
		\dur, 		Pseq ([0.25,1,0.25,0.5,0.5,0.5,0.125,0.125,0.5,0.25,0.25], inf),
		\amp, 		Pseq([1,0.1,0.2,1,0.1125,0.1125,1,0.1125,0.125,0.25,1,0.5], inf)
	)
);
)

Pdef(\player).play;

Pdef(\player).set(\instrument, \synth1);
Pdef(\player).set(\envdur, 0.1);
Pdef(\player).set(\envdur, 0.25);
Pdef(\player).set(\envdur, 1);
Pdef(\player).set(\instrument, \sine);

(
~scale = [62, 67, 69, 77];

c = Pdef(\p04b,
	Pbind(
		\instrument, \synth1,
		\freq, Pseq.new(~scale, inf).midicps, // freq arg
		\dur, Pseq.new([1, 1, 1, 1], inf) // dur arg
	)
);

c = Pdef(\p04c,
	Pbind(
		\instrument, \synth1,
		\freq, Pseq.new(~scale, inf).midicps, // freq arg
		\dur, Pseq.new([1, 1, 1, 1], inf) // dur arg
	)
);
)

Pdef(\p04b).quant([2, 0, 0]);
Pdef(\p04c).quant([2, 0.5, 0]); // offset by half a beat
Pdef(\p04b).play;
Pdef(\p04c).play;

// quant can't be reset in real-time, so we use align to align patterns;
// align takes the same arguments as quant (see the Pdef helpfile)

Pdef(\p04c).align([4, 0, 0]);
Pdef(\p04c).align([4, 0.75, 0]); // offset by 3/4 of a beat

Another useful pattern is Tdef (the Task pattern):

Tdef(\x, { loop({ Synth(\sine, [\freq, 200+(440.rand)]); 0.25.wait; }) });

TempoClock.default.tempo = 2; // it runs on the default tempo clock

Tdef(\x).play(quant: 1);
Tdef(\x).stop;

// and we can redefine the definition "x" in realtime whilst playing
Tdef(\x, { loop({ Synth(\synth1, [\freq, 200+(440.rand)]); 1.wait; }) });

Tdef(\y, { loop({ Synth(\synth1, [\freq, 1200+(440.rand)]); 1.wait; }) });
Tdef(\y).play(quant: 1);

Tdef(\y).stop;

// to change the values in a pattern in realtime, use List instead of Array:

~notes = List[63, 61, 64, 65];

Pbind(
	\midinote, Pseq(~notes, inf),
	\dur, Pseq([0.4, 0.2, 0.1, 0.2], inf)
).play;

~notes[1] = 80

// yet another (known?) melody
(
Pbind(
	\midinote, Pseq([72, 76, 79, 71, 72, 74, 72, 81, 79, 84, 79, 77, 76, 77, 76], 1),
	\dur, Pseq([4, 2, 2, 3, 0.5, 0.5, 4, 4, 2, 2, 2, 1, 0.5, 0.5, 2]/4, 1)
).play
)

Using Pfx (effects patterns)

// make the synthdef and add it

SynthDef(\testenv2, { arg in=0, dur=2;
	var env;
	env = EnvGen.kr(Env.sine(dur), doneAction:2).poll;
	XOut.ar(0, 1, (In.ar(in, 1) + WhiteNoise.ar(0.1)) * env); // add noise for clarity
}).add;

p = Pbind(\degree, Pseq([0, 4, 4, 2, 8, 3, 2, 0]), \dur, 0.5);
p.play;
q = Pfx(p, \testenv2, \dur, 4); // all working (sine env is 4 secs)
q.play;

// now write the def to disk

SynthDef(\testenv2, { arg in=0, dur=2;
	var env;
	env = EnvGen.kr(Env.sine(dur), doneAction:2).poll;
	XOut.ar(0, 1, (In.ar(in, 1) + WhiteNoise.ar(0.1)) * env); // add noise for clarity
}).writeDefFile;

// quit SuperCollider, open it again and now try this
p = Pbind(\degree, Pseq([0, 4, 4, 2, 8, 3, 2, 0]), \dur, 0.5);
q = Pfx(p, \testenv2, \dur, 4); // not working (sine env is 2 secs, the synthdef default)
q.play;

// but here is the trick, read the SynthDescLib and try again!

SynthDescLib.global.read;
q = Pfx(p, \testenv2, \dur, 4); // now working again (sine env is 4 secs)
q.play;

// rendering the pattern as soundfile to disk (it will be written to your SuperCollider folder)

q.render("ixi_tutorial_render_test.aif", 4, sampleFormat: "int16",
	options: Server.default.options.numOutputBusChannels_(2)
);

TempoClock and Patterns

Should we want to change the tempo of the above Patterns, we can use the default TempoClock (as we didn't register a TempoClock for the pattern):

TempoClock.default.tempo = 1.2

But to have each pattern play on a different TempoClock, you need to create two clocks and use one to drive each pattern (this way one can do some nice phasing/polyrhythmic stuff).

(
t = TempoClock.new;
u = TempoClock.new;
Pdef(\p04b).play(t);
Pdef(\p04c).play(u);
u.tempo = 1.5
)

It is hard to hear this clearly as they are running the same pitch patterns, so let's redefine one of them:

(
Pdef(\p04c, 
	Pbind(
		\instrument, \synth1,
		\freq, (Pseq.new(~scale.scramble, inf)).midicps*2, // freq arg
		\dur, Pseq.new([1, 1, 1, 1], inf)  // dur arg
	)
)
)
// and try to change the tempo 
u.tempo = 1;
u.tempo = 1.2;
u.tempo = 1.8;
u.tempo = 3.2;

Popcorn

An example of making a tune using patterns. For an excellent example take a look at spacelab, in examples/pieces/spacelab.scd

SynthDescLib.global.read;
// the popcorn

(
~s1 = [72, 70, 72, 67, 64, 67, 60];
~s2 = [72, 74, 75, 74, 75, 74, 72, 74, 72, 74, 72, 70, 72, 67, 64, 67, 72];

~t1 = [0.25, 0.25, 0.25, 0.25, 0.125, 0.25, 0.625];
~t2 = [0.25, 0.25, 0.25, 0.125, 0.25, 0.125, 0.25, 0.25, 0.125, 0.25, 0.125, 0.25, 0.25, 0.25, 0.125, 0.25, 0.5 ];

c = Pdef(\moogy, 
	Pbind(
		\instrument, \synth1, // using our synth1 synthdef
		\freq, 
			Pseq.new([
				Pseq.new([
					Pseq.new(~s1.midicps, 2),
					Pseq.new(~s2.midicps, 1)
				], 2),
				Pseq.new([
					Pseq.new((~s1+7).midicps, 2),
					Pseq.new((~s2+7).midicps, 1)
				], 2)	
			], inf),
		\dur, Pseq.new([ 
			Pseq.new(~t1, 2),
			Pseq.new(~t2, 1)
			], inf)
		)
);
Pdef(\moogy).play
)

Mozart

A little transcription of Mozart’s Piano Sonata No 16 in C major. Here the instrument has been put into a variable called “instr” so it’s easier to quickly change the instrument.

(
var instr = \default;
Ppar([
// right hand - using the Event-style notation
Pseq([
        (\instrument: instr, \midinote: 72, \dur: 1),
        (\instrument: instr, \midinote: 76, \dur: 0.5),
        (\instrument: instr, \midinote: 79, \dur: 0.5),
        (\instrument: instr, \midinote: 71, \dur: 0.75),
        (\instrument: instr, \midinote: 72, \dur: 0.125),
        (\instrument: instr, \midinote: 74, \dur: 0.125),
        (\instrument: instr, \midinote: 72, \dur: 1),
        (\instrument: instr, \midinote: 81, \dur: 1),
        (\instrument: instr, \midinote: 79, \dur: 0.5),
        (\instrument: instr, \midinote: 84, \dur: 0.5),
        (\instrument: instr, \midinote: 79, \dur: 0.5),
        (\instrument: instr, \midinote: 77, \dur: 0.25),
        (\instrument: instr, \midinote: 76, \dur: 0.125),
        (\instrument: instr, \midinote: 77, \dur: 0.125),
        (\instrument: instr, \midinote: 76, \dur: 1)
], 1),

// left hand - array notation
Pbind(\instrument, instr, 
        \midinote, Pseq([60, 67, 64, 67, 60, 67, 64, 67, 62, 67, 65, 67, 60, 67, 64, 67,
	                 60, 69, 65, 69, 60, 67, 64, 67, 59, 67, 62, 67, 60, 67, 64, 67 ], 1),
        \dur, 0.25
        )], 1).play
)

Syncing Patterns and TempoClocks

SynthDef(\string, {arg out=0, freq=440, pan=0, sustain=0.5, amp=0.3;
	var pluck, period, string;
	pluck = PinkNoise.ar(Decay.kr(Impulse.kr(0.005), 0.05));
	period = freq.reciprocal;
	string = CombL.ar(pluck, period, period, sustain*6);
	string = LeakDC.ar(LPF.ar(Pan2.ar(string, pan), 12000)) * amp;
	DetectSilence.ar(string, doneAction:2);
	Out.ar(out, string)
}).add;

SynthDef(\impulse, {
	Out.ar(0, Impulse.ar(0)!2);	
}).add

Synth(\impulse)

Pbind(
	\instrument, \impulse,
	\dur, 1
).play(TempoClock.default, quant:1)

// not working reliably
TempoClock.default.play({
	Synth(\impulse); // this is the problem: the message is sent immediately, without latency
	1.0
	}, quant:[1, Server.default.latency] );

// working
TempoClock.default.play({
	s.sendBundle(0.2, ["/s_new", \impulse, s.nextNodeID, 0, 1]);
	1.0
	}, quant:[1, 0] );

TempoClock.default.tempo = 2.5

Pbind(
	\instrument, \string,
	\freq, Pseq([440, 880], inf),
	\dur, 1
).play(TempoClock.default, quant:1);

TempoClock.default.play({arg i;
	s.sendBundle(0.2, ["/s_new", \string, s.nextNodeID, 0, 1, \freq, if(i.asInteger.even, {660}, {770}), \amp, 0.3]);
	1.0
	}, quant:[1, 0] );

Chapter 16 - JIT lib and ProxySpace

JIT lib, or the Just In Time library, is a system that allows people to write UGen graphs (signal processing on the SC server) and rewrite them in real time. This is ideal for live coding, teaching, experimenting and all kinds of compositional work.

ProxySpace

In order to use the JIT lib you create a ProxySpace, which becomes the environment or reference space for the synths that will live in it.

p = ProxySpace.new;
p.fadeTime = 3; // fadeTime is the time of crossfading

p[\snd].play; // we create a reference \snd in the environment
p[\snd] = { SinOsc.ar(440) };
p[\snd] = { Saw.ar(333, 0.4) };

p[\ctrl] = {LFSaw.ar(2)}
p[\snd] = {WhiteNoise.ar(p[\ctrl])}
p[\ctrl] = {LFNoise2.ar(2)}
p[\snd] = {Saw.ar([p[\ctrl]*1000, p[\ctrl]*1000+1], 0.3)}
p[\ctrl] = {LFSaw.ar(0.4)}

Or from the ProxySpace examples file. Here the "~" symbol is used to reference signals in the dictionary (note that this style requires the ProxySpace to be pushed as the current environment first, with p.push):

~lfo = { LFNoise2.kr(30, 300, 500) };
~out = { SinOsc.ar(~lfo.kr, 0, 0.15)  };
~out = { SinOsc.ar(~lfo.kr * [1, 1.2], 0, 0.1) * Pulse.ar(~lfo.kr * [0.1, 0.125], 0.5) };
~lfo = { LFNoise1.kr(30, 40) + SinOsc.kr(0.1, 0, 200, 500) };
~out = { SinOsc.ar(~lfo.kr * [1, 1.2], 0, 0.1)  };
~lfo = 410;

Ndef
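Ndef offers the same node proxy functionality without an explicit ProxySpace; the proxy is referenced by name. A minimal sketch (the fade time and UGen choices here are arbitrary):

Ndef(\snd).fadeTime = 2;
Ndef(\snd, { SinOsc.ar([440, 443], 0, 0.2) }).play;
Ndef(\snd, { Saw.ar([221, 220], 0.2) }); // redefining crossfades over 2 seconds
Ndef(\snd).clear(2); // fade out and remove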

Tdef

Tdef holds tasks (timed functions) in the same way that Ndef holds UGen functions; see the Tdef examples in the previous chapter.

Chapter 18 - Tuning Systems and Scales

In this chapter we will look at how we can explore tuning systems, scales and microtonal composition using algorithmic means to generate tunings and scales.

The SynthDefs

For this chapter we want as pure a waveform as possible so we can hear the ratios between the notes.

(
// We include two envelopes to choose from.
SynthDef(\pure, {arg freq=440, pan=0.0, vol=0.5, envdur=0.5, envType=0;
	var signal, envArray, env;
	env = EnvGen.ar(Env.perc(0.01, envdur), doneAction:2);
	signal = Pan2.ar(SinOsc.ar(freq), pan) * env  * vol;
	Out.ar(0, signal);
}).add;

// and another one almost identical that plays a sample
SynthDef(\puresample, {arg bufnum, rate=1, pan=0.0, vol=0.5, envdur=0.5, envType=0;
	var signal, envArray, env;
	envArray = [	
		EnvGen.kr(Env.linen(0.05, envdur, 0.1, 1), doneAction:2), 
			EnvGen.kr(Env.perc(0.01, envdur), doneAction:2)
			];
	env = Select.kr(envType, envArray);
	signal = Pan2.ar(PlayBuf.ar(1, bufnum, rate), pan) * env * vol;
	Out.ar(0, signal);
}).add;
)

Tuning systems are generally called "temperaments". There are many different temperaments, but since the invention of the piano, equal temperament has become the most common and is typically used in computer music software.

For a bibliography and further information on scales and tunings, visit:

http://www.huygens-fokker.org/scala/

The scales can be found here: http://www.huygens-fokker.org/docs/scales.zip

A good source for microtonal theory is the Tonalsoft Encyclopedia of Microtonal Music Theory: http://tonalsoft.com/enc/

// NOTE: Tuning systems are not scales. We can have scales in different tuning systems.

Equal Temperament

Equal temperament is the most common tuning system in Western music. The octave is divided logarithmically into a series of equal steps, most commonly the twelve tone octave. Other systems are also used, such as the nineteen tone equal temperament (19-TET) or the thirty-one tone equal temperament (31-TET).

Indian and Arabic music often uses a twenty-four tone equal temperament (24-TET), although the instruments are frequently tuned using just intonation. Javanese Gamelan music is mainly tuned in a 5-TET.

About the cent: the cent is a logarithmic unit (of equal steps) where 1200 cents represent an octave. In 12-TET, the semitone (two adjacent keys on a keyboard) is 100 cents.
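Converting between ratios and cents is straightforward (a sketch using SuperCollider's number methods):

(3/2).log2 * 1200; // a just fifth is circa 701.955 cents
2.pow(700/1200); // 700 cents (the equal tempered fifth) as a ratio: circa 1.4983
1.5.ratiomidi; // the same interval in floating point semitones: circa 7.02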

The frequency of the nth note in 12-tone equal temperament is given by the formula fundFreq * 2.pow(n/12); the ratio between adjacent notes, 2.pow(1/12), is roughly 1.05946309.

For Equal temperament of 12 notes in an octave these are the values we multiply the fundamental key with:

Array.fill(12, {arg i; 2.pow(i/12);})

// -> returns : [ 1, 1.0594630943593, 1.1224620483094, 1.1892071150027, 1.2599210498949, 1.33483985417, 1.4142135623731, 1.4983070768767, 1.5874010519682, 1.6817928305074, 1.7817974362807, 1.8877486253634 ]

(
var freq, n_TET;
n_TET = 25; // try 5, 19, 24, 31, 72...
freq = 440;
~eqTempFreqlist = Array.fill(n_TET, {arg i; freq * 2.pow(i/n_TET);});

~eqTempFreqlist = ~eqTempFreqlist.add(freq*2); // let's add the octave finally

[\freqlist, ~eqTempFreqlist].postln;

Task({
	~eqTempFreqlist.do({ arg freq, i; // first arg = item in the list, next arg = the index (i)
		Synth(\pure, [\freq, freq]);
		0.5.wait;
	});
}).start;
)

// now compare the list we've got (run the block above with n_TET = 12 first)
~eqTempFreqlist.size
// to this:
[69,70,71,72,73,74,75,76,77,78,79,80].midicps // the midi notes in an octave starting with A

// and further... you can check the MIDI notes of a say 19-TET equal temperament:
~eqTempFreqlist.cpsmidi // where we get floating point MIDI notes

NOTE (SC LANG): If you are wondering about the ~eqTempFreqlist.do, compare this:

a = [111,222,333,444,555,666,777,888];

a.do({arg item, i; [\item, item, \i, i].postln;}) // a is the array

To this:

a.size.do({arg i; [\item, a[i], \i, i].postln;}) // a.size is an integer.

Just Intonation

Just intonation is a very natural system frequently used by vocalists or instrumentalists who can easily tune the pitch. Instruments tuned in just intonation will have to be retuned in order to play in a different scale. This is the case with the harpsichord, for example.

Just intonation is a method of tuning intervals based exclusively on ratios of whole numbers. It is built on the intervals of the harmonic series. Depending on context, the ratio might be different for the same note (e.g. 9/8 or 10/9 for the major second). Any interval tuned as a ratio of whole numbers is a just interval, but usually only ratios of small numbers are used.

Examples of intervals:

2/1 = octave
3/2 = fifth
4/3 = fourth
5/4 = major third
6/5 = minor third
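To hear how a just interval differs from its equal tempered neighbour, we can play both fifths at once and listen for the slow beating (a quick sketch with plain sine waves):

(
{
	var fund = 220;
	// a just fifth (330 Hz) against the equal tempered fifth (circa 329.63 Hz)
	SinOsc.ar([fund * (3/2), fund * 2.pow(7/12)], 0, 0.2).sum.dup
}.play;
)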

Many composers (e.g. La Monte Young and Terry Riley) prefer to compose for instruments tuned in just intonation.

A major scale:

~justIntFreqlist8 = [1, 9/8, 5/4, 4/3, 3/2, 5/3, 15/8];

A whole 12 note octave

~justIntFreqlist = [1/1, 135/128, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 27/16, 9/5, 15/8, 2/1];

And we put in a fundamental note (A)

~justIntFreqlist = ~justIntFreqlist * 440

Let’s play the scale:

(

Task({
	~justIntFreqlist.do({ arg freq, i; // 1st arg is the item in the list, 2nd is the index (i)
		Synth(\pure, [\freq, freq]);
		0.7.wait;
	});
}).start;
)

// test some versions of just intonation

~justIntFreqlist1 = [1/1, 135/128, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 27/16, 9/5, 15/8, 2/1];
~justIntFreqlist2 = [1/1, 16/15, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 5/3, 16/9, 15/8, 2/1];
~justIntFreqlist3 = [1/1, 25/24, 9/8, 6/5, 5/4, 4/3, 45/32, 3/2, 8/5, 5/3, 9/5, 15/8, 2/1];
~justIntFreqlist4 = [1/1, 16/15, 9/8, 6/5, 5/4, 4/3, 17/12, 3/2, 8/5, 5/3, 9/5, 15/8, 2/1];

~justIntFreqlist1 = ~justIntFreqlist1 * 440
~justIntFreqlist2 = ~justIntFreqlist2 * 440
~justIntFreqlist3 = ~justIntFreqlist3 * 440
~justIntFreqlist4 = ~justIntFreqlist4 * 440

// here we listen to different tunings in parallel and we can hear the difference:

(
Task({
	13.do({ arg i; // Integer.do passes the index as the argument
		Synth(\pure, [\freq, ~justIntFreqlist1[i], \envdur, 1.56]);
		Synth(\pure, [\freq, ~justIntFreqlist2[i], \envdur, 1.56]);
		//Synth(\pure, [\freq, ~justIntFreqlist3[i], \envdur, 1.56]); // try these as well
		//Synth(\pure, [\freq, ~justIntFreqlist4[i], \envdur, 1.56]);
		1.4.wait;
	});
}).start;
)

Pythagorean tuning

Pythagorean tuning is attributed to the Greek philosopher Pythagoras (6th century BC). He was interested in harmony, geometry and beans. Pythagorean tuning is based on perfect fifths, fourths and octaves.

~pythFreqlist8 = [1, 9/8, 81/64, 4/3, 3/2, 27/16, 243/128, 2/1]; // a major scale

~pythFreqlist = [1, 256/243, 9/8, 32/27, 81/64, 4/3, 729/512, 3/2, 128/81, 27/16, 16/9, 243/128, 2/1];

~pythFreqlist = ~pythFreqlist * 440;

(

Task({
	~pythFreqlist.do({ arg freq, i; // first arg is the item in the list, next arg is the index (i)
		Synth(\pure, [\freq, freq]);
		0.7.wait;
	});
}).start;
)

Now let’s compare Equal Temperament to the Pythagorean tuning.

First we make the equal temperament scale array

~eqTempFreqlist = Array.fill(12, {arg i; 440 * 2.pow(i/12);});
~eqTempFreqlist = ~eqTempFreqlist.add(440*2); // let's add the octave finally

(
Task({
	12.do({ arg i; // Integer.do passes the index as the argument
		Synth(\pure, [\freq, ~pythFreqlist[i], \envdur, 1]);
		Synth(\pure, [\freq, ~eqTempFreqlist[i], \envdur, 1]);
		1.4.wait;
	});
}).start;
)

// and here we compare Just Intonation with Pythagorean tuning.

(
Task({
	12.do({ arg i; // Integer.do passes the index as the argument
		Synth(\pure, [\freq, ~pythFreqlist[i], \envdur, 1]);
		Synth(\pure, [\freq, ~justIntFreqlist[i], \envdur, 1]);
		1.4.wait;
	});
}).start;
)

Scales

Scales are usually, but not necessarily, designated for an octave, so they repeat themselves over all octaves. There are countless scales with different note counts; the most common in Western music is the diatonic scale (7 notes). Other common scales (defined by note count) are chromatic (12 notes), whole tone (6 notes), pentatonic (5 notes) and octatonic (8 notes).

A Dictionary of Scales

James McCartney wrote this dictionary of scales (the values are equal tempered semitone steps above the root - no microtones).

(
z = (
// 5 note scales
	minorPentatonic: [0,3,5,7,10],
	majorPentatonic: [0,2,4,7,9],
	ritusen: [0,2,5,7,9], // another mode of major pentatonic
	egyptian: [0,2,5,7,10], // another mode of major pentatonic
	
	kumoi: [0,2,3,7,9],
	hirajoshi: [0,2,3,7,8],
	iwato: [0,1,5,6,10], // mode of hirajoshi
	chinese: [0,4,6,7,11], // mode of hirajoshi
	indian: [0,4,5,7,10],
	pelog: [0,1,3,7,8],
	
	prometheus: [0,2,4,6,11],
	scriabin: [0,1,4,7,9],
	
// 6 note scales
	whole: (0,2..10),
	augmented: [0,3,4,7,8,11],
	augmented2: [0,1,4,5,8,9],
	
	// hexatonic modes with no tritone
	hexMajor7: [0,2,4,7,9,11],
	hexDorian: [0,2,3,5,7,10],
	hexPhrygian: [0,1,3,5,8,10],
	hexSus: [0,2,5,7,9,10],
	hexMajor6: [0,2,4,5,7,9],
	hexAeolian: [0,3,5,7,8,10],
	
// 7 note scales
	ionian: [0,2,4,5,7,9,11],
	dorian: [0,2,3,5,7,9,10],
	phrygian: [0,1,3,5,7,8,10],
	lydian: [0,2,4,6,7,9,11],
	mixolydian: [0,2,4,5,7,9,10],
	aeolian: [0,2,3,5,7,8,10],
	locrian: [0,1,3,5,6,8,10],
	
	harmonicMinor: [0,2,3,5,7,8,11],
	harmonicMajor: [0,2,4,5,7,8,11],
	
	melodicMinor: [0,2,3,5,7,9,11],
	bartok: [0,2,4,5,7,8,10], // jazzers call this the hindu scale
	
	// raga modes
	todi: [0,1,3,6,7,8,11], // maqam ahar kurd
	purvi: [0,1,4,6,7,8,11],
	marva: [0,1,4,6,7,9,11],
	bhairav: [0,1,4,5,7,8,11],
	ahirbhairav: [0,1,4,5,7,9,10],
	
	superLocrian: [0,1,3,4,6,8,10],
	romanianMinor: [0,2,3,6,7,9,10], // maqam nakriz
	hungarianMinor: [0,2,3,6,7,8,11],	
	neapolitanMinor: [0,1,3,5,7,8,11],
	enigmatic: [0,1,4,6,8,10,11],
	spanish: [0,1,4,5,7,8,10],
	
	// modes of whole tones with added note:
	leadingWhole: [0,2,4,6,8,10,11],
	lydianMinor: [0,2,4,6,7,8,10],
	neapolitanMajor: [0,1,3,5,7,9,11],
	locrianMajor: [0,2,4,5,6,8,10],
	
// 8 note scales
	diminished: [0,1,3,4,6,7,9,10],
	diminished2: [0,2,3,5,6,8,9,11],
	
// 12 note scales
	chromatic: (0..11)
);
)
z.at('chromatic').postln;

// now we try one of those scales

(
x = z.at('hirajoshi').copy; // test the scales above by replacing the name
x = x.add(12); // add the octave
x = x.mirror;

Task({
	x.do({ arg ratio, i; // first arg is the item in the list, next arg is the index (i)
		Synth(\pure, [\freq, (69+ratio).midicps, \envdur, 0.94]);
		0.41.wait;
	});
}).start;
)

(
// do we get a nice melody?
Task({
	x.mirror.do({ arg ratio, i; // first arg is the item in the list, next arg is the index (i)
		Synth(\pure, [\freq, (69+x.choose).midicps, \envdur, 0.84]);
		0.18.wait;
	});
}).start;
)


The Scala Library

For a proper exploration of scales we will use the Scala project and the SCL class written in SuperCollider to use the Scala files.

The Scale Archive can be found here (with over 3000 scales): http://www.huygens-fokker.org/docs/scales.zip

And a SuperCollider class that interfaces with the archive can be found here (XiiScala.sc) https://github.com/thormagnusson/TuningTheory

Note that you have to provide the path to where you have installed your Scala library, for example "~/scwork/scl/".

a = XiiScala("bohlen-p_9");
a.tuning.octaveRatio;
a.degrees;
a.semitones;
a.pitchesPerOctave;

z = a.degrees.mirror;

(
Task({
	z.do({ arg ratio, i;
		Synth(\pure, [\freq, 440*ratio]);
		0.3.wait;
	});
}).start;
)

(
x = SCL.new("cairo.scl".standardizePath, 440);
z = x.getRatios.mirror;

Task({
	z.do({ arg ratio, i;
		Synth(\pure, [\freq, 440*ratio]);
		0.3.wait;
	});
}).start;
)

(
x = SCL.new("kayolonian_s.scl".standardizePath, 440);
z = x.getRatios.mirror;

Task({
	z.do({ arg ratio, i;
		Synth(\pure, [\freq, 440*ratio]);
		0.3.wait;
	});
}).start;
)

Using Samples

We can of course control the pitch of sampled sounds too; here the playback rate will control the pitch of the sample.

First we load a sound file containing a simple tone (replace this with your own sound):

b = Buffer.read(s, "sounds/xylo/02.aif");

The Pythagorean scale:

~pythFreqlist8 = [1, 9/8, 81/64, 4/3, 3/2, 27/16, 243/128, 2/1]; // a major scale

~pythFreqlist8 = ~pythFreqlist8.mirror;

(
{
	~pythFreqlist8.do({arg item;
		Synth(\puresample, [\bufnum, b.bufnum, \rate, item, \envType, 1]);
		Synth(\pure, [\freq, 752*item]); // 752 is just rough freq of the 02.aif sample
		0.5.wait;
	});
}.fork
)

x = SCL.new("degung5.scl".standardizePath, 440);

x = SCL.new("diaconv6144.scl".standardizePath, 440);

x = SCL.new("bagpipe1.scl".standardizePath, 440);

x.name
x.steps // how many notes are there in the scale
x.getRatios


Synth(\puresample, [\bufnum, b.bufnum, \rate, 1, \envType, 0]);
Synth(\pure, [\freq, 752]);


// p is our scale
p = x.getRatios
p.size

p = p.mirror // up and down again! (See Array helpfile for .mirror)
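For reference, .mirror reflects an array around its last element, which is what makes the scale go up and then back down:

[1, 2, 3].mirror; // -> [ 1, 2, 3, 2, 1 ]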

(
{
	p.do({arg item;
		Synth(\puresample, [\bufnum, b.bufnum, \rate, item, \envType, 1]);
		Synth(\pure, [\freq, 752*item]); // 752 is just rough freq of the 02.aif sample
		0.5.wait;
	});
}.fork
)

The Scale and Tuning Classes

SuperCollider comes with Scale and Tuning classes. They encapsulate and simplify the things we have done above into easy-to-use methods.

An example - we choose a minor scale:

a = Scale.minor;
a.degrees; 
a.semitones;	
a.cents;	
a.ratios;	

d = Pdef(\minor_et12, Pbind(\scale, a, \degree, Pseq((0..7) ++ (6..0), inf), \dur, 0.5, \amp, 0.1)).play;

// we choose a tuning:
t = Tuning.just; // just intonation
a.tuning_(t);

e = Pdef(\minor_just, Pbind(\scale, a, \degree, Pseq((0..7) ++ (6..0), inf), \dur, 0.5, \amp, 0.1)).play;


// So let's listen to equal tempered and just intonation together. You can hear the beating
(
a = Scale.minor;
t = Tuning.et12; // equal tempered tuning 
a.tuning_(t);
d = Pdef(\minor_et12, Pbind(\scale, a, \degree, Pseq((0..7) ++ (6..0), inf), \dur, 0.5, \amp, 0.1)).play;

b = Scale.minor;
t = Tuning.just; // just intonation
b.tuning_(t);
e = Pdef(\minor_just, Pbind(\scale, b, \degree, Pseq((0..7) ++ (6..0), inf), \dur, 0.5, \amp, 0.1)).play;
)

Check the scale directory

Scale.directory

And the available tunings

Tuning.directory

Part V

Chapter 19 - Creating Classes

In object oriented programming, classes serve as the blueprint for objects - the genotype information that results in instances, the phenotypes. Like a recipe for cookies. This can be extremely useful when creating data structures that have properties (parameters or variables) and behaviours (methods or functions).

For further information on what a class is, a good start is Wikipedia:

// run this line in SuperCollider
"open 'http://en.wikipedia.org/wiki/Class_(computer_science)'".unixCmd;

Creating Classes

A good introduction to writing classes is in the Help documentation:

"Writing-Classes".openHelpFile

So below we have two test classes, the second subclassing the first (like a guitar is a subclass of string instrument).

Save both classes in a document that could be called "TestClass.sc" and make sure it is saved in the classpath of SuperCollider (where it is compiled on recompilation or startup). On the Mac the classpath for third party classes is ~/Library/Application Support/SuperCollider/Extensions.

TestClass {
	
	classvar <>myvar; // classvariables
	var <>addnr, >addnrSet, <addnrGet; // instance variables
	// this is a normal constructor method
	*new { arg argaddnr; 
		^super.new.initTest(argaddnr) 
	}
	
	initTest { arg argaddnr;
		addnr = argaddnr ? 3;
	    // do initiation here
	}
	
	calc {arg a, b;
		var c;
		c = a+b;
		^c // return
	}

}

TestClass2 : TestClass {
	calc { arg a, b;
		var c;
		c = a * b + addnr;
		^c;
	}
	
	setAddNr_ { arg newnr;
		addnr = newnr;
	}
	
	getAddNr {
		^addnr;
	}
}

When the classes have been saved and compiled we can now test the class:

t = TestClass.new // addnr defaults to 3
t.calc(3,4)

r = TestClass.new(9)
r.addnr

v = TestClass2.new
v.calc(3,4)

v.addnr_(55)
v.addnr // our new class
t.addnr // t of course still has the default 3

v.addnrSet = 33 // we can set this number (because of > (a setter) )
v.addnrSet_(33) // another way of setting a variable (same as = )


v.addnrGet = 33 // Wrong! we cannot set this number ( because it is a getter < )

// proper object orientated programming uses setter and getter methods 
// (rather than accessing variables directly)

// here we use the setAddNr_ method to set our variable.
v.setAddNr_(333)
// and we can look at it:
v.addnr 
// but should really look at it with the getter method we made:
v.getAddNr

You can see that the < > symbols in the class are so called setters (>) and getters (<), which, if specified in front of the property specification, can serve instead of methods that set the properties. Therefore you can equally write

v.addnr_(44) // using the setter
v.setAddNr_(44) // using a method

Another test class

SonicArtsClass {
	
	var win, textfield, textfield2, rect; // get text but set text2
	var name, <>profession; // a getter and setter variable
	var friends;

	*new { arg name, rect, color; 
		^super.new.initSAClass(name, rect, color);
		}
	
	initSAClass { arg argname, argrect, color;
		var scramblebutton;
		
		rect = argrect;
		name = argname;
		win = Window(name, rect, resizable: false).front;
		win.view.background_(color);
		textfield = StaticText(win, Rect(10, (rect.height/2)-30, rect.width, 30));
		textfield.string_("");
		textfield.font_(Font("Helvetica-Bold", 24));
		textfield2 = StaticText(win, Rect(10, (rect.height/2)+30, rect.width, 30));
		textfield2.string_("");
		textfield2.font_(Font("Helvetica-Bold", 14));
		scramblebutton = Button(win, Rect(10,10, 200, 30))
				.states_([
				["change friends color",Color.black,Color.clear]]
				)
				.action_({
				friends.do({arg friend; friend.changeColor(Color.rand)});
				});

		friends = List.new;
	}
	
	speak_{arg string;
		textfield.string_(string);
	}

	speak2_{arg string;
		textfield2.string_(string);
	}
	
	updateGUI {
		win.refresh;
	}
	
	addFriend {arg friend;
		friends.add(friend);
	}
	
	getName {
		^name; // note the return symbol
	}
	
	setName_ {arg newname; // note the underscore used when you are setting
		name = newname;
	}
	
	removeFriend {arg friend;
		var friendindex;
		friendindex = friends.indexOfEqual(friend);
		friends.removeAt(friendindex); // removeAt removes the item at that index
	}
	
	showFriends {
		var namesOfFriends;
		namesOfFriends = List.new;
		friends.do({arg friend; namesOfFriends.add(friend.getName)});
		textfield2.string_(namesOfFriends.asString);
	}
	
	getFriends {
		^friends
	}
	
	getFriendNames {
		var namesOfFriends;
		namesOfFriends = List.new;
		friends.do({arg friend; namesOfFriends.add(friend.getName)});
		^namesOfFriends;
	}
	
	changeColor {arg color;
		win.view.background_(color);
		win.refresh;
	}
}


And here is some code to try the class

a = SonicArtsClass("john", Rect(50, 800, 300, 200), Color.red)
a.speak_("Hi! I'm John")
a.profession = "singer"
a.speak2_("I am a" + a.profession)

b = SonicArtsClass("george", Rect(350, 800, 300, 200), Color.blue)
b.speak_("Hi! I'm george")
b.profession = "bass player"
b.speak2_("I am a" + b.profession)

c = SonicArtsClass("paul", Rect(650, 800, 300, 200), Color.green)
c.speak_("Hi! I'm paul")
c.profession = "guitarist"
c.speak2_("I am a" + c.profession)

// let's fix the roles

b.profession = "guitarist"
b.speak2_("I am a" + b.profession)
c.profession = "bass player"
c.speak2_("I am a" + c.profession)

a.addFriend(b)
a.addFriend(c)
a.showFriends

b.showFriends
c.showFriends

b.addFriend(a)
b.addFriend(c)
b.showFriends // check his friends

// what if john wants to change his name?

a.setName_("ringo");
a.speak_("Hi! I'm"+a.getName)
// we can get the name like this
a.getName
// but not like this:
a.name
// however, we can get the profession like this
a.profession
// WHY?
// the reason is the < (get) and > (set) properties of the profession variable

Chapter 20 - Functional Programming

SuperCollider is an object oriented programming (OOP) language inspired by Smalltalk. In Smalltalk (and unlike languages such as C++ or Java) everything is an object, so it's possible to create methods and subclass practically all data structures. Thus we find in SuperCollider the methods .toLower for a string or a char ("HEY".toLower and $A.toLower) or .neg and .cos for a number (10.neg and 2.cos). Here the actual number is an object that has methods (e.g. .neg).

We could for example create a .double method for SimpleNumber. We’d simply open the SimpleNumber class and create a method like this:

double { ^this * 2 }

But don’t do that. If you are creating your own methods and classes, keep them in your own Extensions folder, so the next time you update SuperCollider, you still have your classes and methods at hand.
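A sketch of such an extension file (save it, for example, as MyExtensions.sc - a name chosen here just for illustration - inside the Extensions folder and recompile the class library):

// extension files add methods to existing classes with the + syntax
+ SimpleNumber {
	double { ^this * 2 }
}

After recompiling, 10.double returns 20.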

However, there is another way of thinking that differs from OOP, a bit like the difference between Plato (reality as static ideal types) and Heraclitus or the Buddha (reality as flow). Let’s explore functional programming.

SuperCollider is also a functional programming language. You can program solely in the functional paradigm, and we will look at functional techniques below.

Functional Programming

In functional programming (FP) the idea is not to create classes and instantiate objects that exist in the computer’s memory, responding to messages and calls. It’s not a world of things, but a world of movement, behaviour, events.

So it’s not “John.drives(work)” but rather “drives(work, John)”. The function is to drive, and it’s John who is going to work.

So the doubling of the number above would be a function:

~double = {arg num; num*2}

In short, the idea is to avoid state and mutable objects, since mutation (a side effect) is a typical source of bugs and errors in imperative and object oriented programming languages.

Functions as first class citizens

In a functional programming language it is important to be able to pass functions into other functions as arguments; a language that supports this is said to treat “functions as first class citizens”.

~double = {arg num; num.value * 2 } // .value lets the argument be a number or a function

~double.value({ 3 + 3 }) // a function is passed as the argument; returns 12

~square = {arg num; num.value * num.value }

~double.value( ~square.value({3}) ) // returns 18

Above, the result of the function ~square was passed to the function ~double.

Recursion

Iteration or looping is typically done with recursion in functional programming. So instead of the C or Java loop:

fact = 1;
for (int i = 1; i <= 5; i++) {
	fact = fact * i;
	println(fact);
}

Or normal SuperCollider:

var fact = 1;
5.do({arg i; (fact = fact*(i+1)).postln;})

The Scheme function for the above would be:

(define factorial 
  (lambda (n)
     (if (= n 0)
         1
         (* n (factorial (- n 1))))))

(factorial 5)

And a Python version could be written with some explanations like this:

def factorial(n):
    print("entering factorial, n is %i" % n)
    if n == 1:
        print("deepest level of recursion reached! n is now: %s" % n)
        return 1
    else:
        r = factorial(n - 1)
        x = n * r
        print("n is: %s and r is: %s" % (n, r))
        return x

level = 5
print("top level function returns %i when n is %i" % (factorial(level), level))

The SuperCollider recursion would be something like:

f = { arg n;
	"n is: ".post; n.postln;
	if(n == 0, {
		"n is now zero".postln;
		1
	}, {
		(n * f.value(n-1)).postln;
	});
};

f.value(5);

Chapter 21 - Live Coding

Live coding needs no introduction, but in summary it comes with an imperative that performers project their screens so that the audience is able to participate in the musical creation. Some people argue that this should be done from a clean slate, with the code written in real time; others use prewritten code and change parameters on the fly (and are sometimes called “CJs”, or code jockeys). A dedicated forum for practitioners exists, called TOPLAP, and various papers have been written on live coding, with MIT Press publishing a handbook on the topic in 2021.

A typical problem for the live coder is the high level of expertise such performance requires. Very few performers are able to exhibit those skills without consistent, dedicated practice.

The level of abstraction at which we live code therefore becomes important. Are we coding in C/C++ or at higher levels? Here, languages such as Tidal, Sonic Pi and ixi lang have proposed higher-level solutions better suited to real-time performance.
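
Within SuperCollider itself, the JITLib node proxies lend themselves to this kind of on-the-fly redefinition. A minimal sketch (evaluate line by line, as one would in performance):

Ndef(\drone, { SinOsc.ar([220, 221], 0, 0.2) }).play; // start a sound
Ndef(\drone).fadeTime = 4; // crossfade whenever the source is replaced
Ndef(\drone, { Pulse.ar([55, 55.3], 0.3, 0.1) }); // redefine it while it plays
Ndef(\drone).free; // stop and remove it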

Chapter 22 - Other clients

Clients for the scsynth server other than sclang include Tidal (written in Haskell), Sonic Pi (written in Ruby), ixi lang (written in sclang itself), and many others.

Creating a client is very easy: it simply involves sending OSC messages from your language of choice to scsynth. The list below contains the totality of the commands you need to implement a fully functional scsynth client.
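
To get a feel for the protocol, here is a minimal sketch in sclang that bypasses the Server and Synth abstractions and talks to the server directly over OSC; it assumes a server booted from sclang (so that the “default” synth def is loaded). Any other language with an OSC library does exactly the same:

n = NetAddr("127.0.0.1", 57110); // scsynth listens on UDP port 57110 by default
n.sendMsg("/s_new", "default", 2001, 0, 0, "freq", 440); // start a synth with node ID 2001
n.sendMsg("/n_set", 2001, "freq", 330); // set one of its controls
n.sendMsg("/n_free", 2001); // remove the node again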

The following is a list of all the server commands; each is described in more detail in the Server Command Reference document.

/quit - Quit program. Exits the synthesis server.

/notify - Register to receive notifications from server

/status - Query the status. Replies to the sender with a /status.reply message.

/dumpOSC - Display incoming OSC messages.

/sync - Notify when async commands have completed.

/clearSched - Clear all scheduled bundles. Removes all bundles from the scheduling queue.

/error - Enable/disable error message posting.

/version - Query the SuperCollider version.

/d_recv - Receive a synth definition file.

/d_load - Load synth definition.

/d_loadDir - Load a directory of synth definitions.

/d_free - Delete synth definition.

/n_free - Delete a node.

/n_run - Turn node on or off.

/n_set - Set a node’s control value(s).

/n_setn - Set ranges of a node’s control value(s).

/n_fill - Fill ranges of a node’s control value(s).

/n_map - Map a node’s controls to read from a bus.

/n_mapn - Map a node’s controls to read from buses.

/n_mapa - Map a node’s controls to read from an audio bus.

/n_mapan - Map a node’s controls to read from audio buses.

/n_before - Place a node before another.

/n_after - Place a node after another.

/n_query - Get info about a node.

/n_trace - Trace a node.

/n_order - Move and order a list of nodes.

/s_new - Create a new synth.

/s_get - Get control value(s).

/s_getn - Get ranges of control value(s).

/s_noid - Auto-reassign synth’s ID to a reserved value.

/g_new - Create a new group.

/p_new - Create a new parallel group.

/g_head - Add node to head of group.

/g_tail - Add node to tail of group.

/g_freeAll - Delete all nodes in a group.

/g_deepFree - Free all synths in this group and all its sub-groups.

/g_dumpTree - Post a representation of this group’s node subtree.

/g_queryTree - Get a representation of this group’s node subtree.

/u_cmd - Send a command to a unit generator.

/b_alloc - Allocate buffer space.

/b_allocRead - Allocate buffer space and read a sound file.

/b_allocReadChannel - Allocate buffer space and read channels from a sound file.

/b_read - Read sound file data into an existing buffer.

/b_readChannel - Read sound file channel data into an existing buffer.

/b_write - Write sound file data.

/b_free - Free buffer data.

/b_zero - Zero sample data.

/b_set - Set sample value(s).

/b_setn - Set ranges of sample value(s).

/b_fill - Fill ranges of sample value(s).

/b_gen - Call a command to fill a buffer.

/b_close - Close soundfile.

/b_query - Get buffer info.

/b_get - Get sample value(s).

/b_getn - Get ranges of sample value(s).

/c_set - Set bus value(s).

/c_setn - Set ranges of bus value(s).

/c_fill - Fill ranges of bus value(s).

/c_get - Get bus value(s).

/c_getn - Get ranges of bus value(s).

/done - An asynchronous message has completed.

/fail - An error occurred.

/late - A command was received too late. (Not yet implemented.)

/n_go - A node was started. This command is sent to all registered clients when a node is created.

/n_end - A node ended. This command is sent to all registered clients when a node ends and is deallocated.

/n_off - A node was turned off. This command is sent to all registered clients when a node is turned off.

/n_on - A node was turned on. This command is sent to all registered clients when a node is turned on.

/n_move - A node was moved. This command is sent to all registered clients when a node is moved.

/n_info - Reply to /n_query. This command is sent to all registered clients in response to an /n_query command.

/tr - A trigger message.

There is much more to learn about all this in the Server Command Reference file. The list is provided here simply to show how the server commands map onto the constructs of the SuperCollider language that we have been learning in this book.
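
You can also watch this protocol in action from sclang by asking a booted server (s) to post every OSC message it receives:

s.dumpOSC(1); // scsynth now posts all incoming OSC messages
x = { SinOsc.ar(440, 0, 0.1) }.play; // an /s_new message appears in the post window
x.free; // followed by an /n_free
s.dumpOSC(0); // turn the posting off again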

Chapter 23 - Twitter code

Twitter code is a musical miniature form: writing pieces for SuperCollider in under 280 characters, the current character limit of Twitter. Earlier Twitter pieces (#sctweets) were 140 characters, which was the limit until recently.

Of course, no sane SuperCollider user would write code this way. The Twitter constraint of 280 characters forces people to consider how code can be compressed as much as possible, for example writing 999 instead of 1000 (thus saving a character), or 9e10 instead of 90000000000. This is not the way to learn how to write music in SuperCollider.
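
For illustration, here is a tiny patch written long-hand and then squeezed in typical sctweet style (a made-up example, not from any particular tweet):

// long-hand:
{ Pan2.ar(SinOsc.ar(900, 0, 0.1), 0) }.play;

// squeezed: play{} saves characters, 9e2 is 900, !2 duplicates to stereo, /9 scales the amplitude
play{SinOsc.ar(9e2)!2/9}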

Notable sctweet composers: https://twitter.com/redFrik

My own Twitter pieces can be found here: https://github.com/thormagnusson/sctweets

Some favourites:

// headcube

play{l=LFSaw;SinOsc.ar(15**(l.kr(-4.8,1)*l.kr(-1.8,1))*20).sqrt+(99**l.kr(-0.6,0.5)/99*Cusp\
L.ar)+Blip.ar(0.8,1+LFNoise0.kr(0.2)*3e3,4)!2/4}

// redfrik
Pbind(\freq,Pseq("SUPERCOLLIDER".ascii,inf)*Pstutter(64,Pseq([3,4,5],inf))*[1,2.045],\dur,0\
.03,\amp,Pseq([0,0.1],inf)).play//

play{a=LFPar;GVerb.ar(VarSaw.ar(a.ar(1,0,5,a.ar([0.05,0.04],0,50,160).round(50)),0,a.ar(0.2\
,0,0.5,a.ar(3,0,0.2,0.5)))/8,80)}//#SuperCollider

{n=16;Splay.ar(Ringz.ar(Impulse.ar(4/{8.rand+1}!n),{exprand(80,1.2e3).round(80)}!n,{rrand(0\
.01,2.0)}!n,4/n))}.play #SuperCollider

play{Ringz.ar(CoinGate.ar([5,3,9,6]/20,Impulse.ar(1<<[1,2,3,4])),[30,9e3,40,7e3],[9,7,3,6]/\
10,[8,3,6,2]).sin.sum.tanh!2}

@rukano
{n=16;Splay.ar(Ringz.ar(Impulse.ar(4/{8.rand+1}!n),{rrand(80,2e3).round(80)}!n,{rrand(0.01,\
2.0)}!n,4/n))}.play // #SuperCollider

@rukano
{SinOsc.ar(Duty.ar(1/8,0,Dshuf("TAKEKO".ascii.midicps,inf))*[1,1.01]/[1,2,4].choose)*Decay2\
.ar(Impulse.ar(8),0.1)}.play // #SuperCollider

@rukano
{GVerb.ar(Limiter.ar(Line.ar(0,2**[8,12,16].choose,20)>>[4,8,24,32].choose<<LFSaw.ar([4,3]/\
8,0,8,8)%24)/4,99,4)}.play

@rukano
{GVerb.ar(Limiter.ar(Line.ar(0,2**[8,12,16].choose,20)|[4,8,24,32].choose<<LFSaw.ar([4,3]/8\
,0,8,8)%24)/4,99,4)}.play // #SuperCollider

@rukano
{MoogFF.ar([Pulse,Saw].choose.ar([60,65,67].choose.midicps/LFPulse.kr(1).range(2,4)),LFNois\
e0.kr(8,1e3,500),1.0.rand)!2}.play // #SuperCollider

@rukano
{MoogFF.ar(LFSaw.ar(64*LFPulse.kr(1/8,0,1/2,3).midiratio.lag)!2,LFPar.kr(1/8,0,400,500)*LFN\
oise0.kr(8,1,1))}.play #SuperCollider

@rukano
{MoogFF.ar(LFSaw.ar([64,64.1]*Duty.kr(4,0,Dseq([0,3].midiratio,inf)).lag),LFPar.kr(1/8,0,40\
0,500)*LFNoise2.kr(8,0.3,1))}.play #SuperCollider

@rukano
{Blip.ar(Duty.kr(1/4,0,Dshuf([60,61,63,64].midicps/8,inf))*[1,1.01]*Duty.kr(4,0,Dseq([0,-3,\
3,5].midiratio,inf)),6)}.play // #SuperCollider

@rukano
{Blip.ar(Duty.kr(1/4,0,Dseq([60,61,63,64].midicps/4,inf))*[1,1.01]*Duty.kr(4,0,Dseq([0,-3,3\
,5].midiratio,inf)),4)}.play // #SuperCollider

@rukano
{Blip.ar(Duty.kr(1/4,0,Dseq([60,61,63,64].midicps,inf))*[1,1.01]*Duty.kr(4,0,Dseq([0,-3,3,5\
].midiratio,inf)),2)}.play // #SuperCollider

Andre Bartetzki compiled this collection:

/*
Here's a quick way to save your ears while trying out potentially
exploding tweets:

(
f = { play { ReplaceOut.ar(0,In.ar(0,2).clip2)} };
ServerTree.add(f,s);
)
*/


/* by  dan stowell */
{ Klank.ar(`[(33,44..77).midicps,nil,0.1], SinOsc.ar(LFSaw.kr(1/60).exprange(0.01, 200)).ex\
prange(100,1000)).dup/9 }.play //#supercollider

/* by  tim walters */
play{HPF.ar(GVerb.ar(({|k|({|i|y=SinOsc;y.ar(i,y.ar(i+k**i)/Decay.kr(Impulse.kr(0.5**i/k),[\
1,2]+i,k))}!6).sum}!16).sum,1,1)/180,40)}

/* by  tim walters */
play{HPF.ar(({|k|({|i|SinOsc.ar(i/96,Saw.ar(2**(i+k))/Decay.ar(Impulse.ar(0.5**i/k),[k*i+1,\
k*i+1*2],3**k))}!6).product}!32).sum/2,40)}

/* by  miquel parera jaques */
{Pan2.ar(SinOsc.ar(Demand.kr(Impulse.kr(rrand(2,33)),0,Dseq([0.3.rand,0.3.rand]+rrand(99,20\
0),inf)),0,SinOsc.ar(rrand(6,33))),0,0.08)}.play;

/* by  miquel parera jaques */
{Pan2.ar(FreeVerb.ar(SinOsc.ar(Demand.kr(Impulse.kr(Rand(10,100)),0,Dseq([Rand(100,333),Ran\
d(100,333)],inf)),0,0.08),1,1,1),0)}.play;

/* by  miquel parera jaques */
{a=rrand(25,50);Pan2.ar(SinOsc.ar(Demand.ar(Impulse.ar(1/a),0,Dwhite(100,333)))*EnvGen.ar(E\
nv.perc(a/2,a/2,0.1),Impulse.ar(1/a)),0)}.play;

/* by  redfrik */
r{99.do{|i|x={Pan2.ar(SinOsc.ar(i+1,SinOsc.ar((i%9).div(3)*100+(i%9)+500),0.03),1.0.rand2)}\
.play;2.wait;x.release(25)}}.play//#SuperCollider

/* by  redfrik */
r{99.do{x={Pan2.ar(BPF.ar(Impulse.ar(18.linrand+0.5),9999.linrand,0.3.linrand,5),1.0.rand2)\
}.play;3.wait;x.release(9)}}.play//#SuperCollider

/* by  redfrik */
r{loop{x=play{t=SinOsc.ar(999.rand).abs;Formlet.ar(TDuty.ar(t,0,t),4e3.linrand,t,1-t)!2};wa\
it(9.rand+1);x.release(39)}}.play//#SuperCollider

/* by  redfrik */
r{loop{z=20.rand+6;x={y=LFTri.ar(z).abs/9/z;RLPF.ar(TDuty.ar(y,0,y),z*600,0.06,9)!2}.play(s\
,0,z);wait(26-z);x.release}}.play//#SuperCollider

/* by  redfrik */
r{loop{z=60.rand+1;x={y=LFTri.ar(z).abs/z;RLPF.ar(TDuty.ar(y,0,y),z*99+y,0.01,6+y)!2}.play(\
s,0,z);wait(z/3);x.release}}.play//#SuperCollider

/* by  redfrik */
r{loop{x={GVerb.ar(MoogFF.ar(ClipNoise.ar*0.4,LFPar.kr({0.3.rand}!2,0,600,990)),9,9,1)}.pla\
y(s,0,19);3.wait;x.release}}.play//#SuperCollider

/* by  redfrik */
r{loop{x={BPF.ar(Pluck.ar(Crackle.ar([1.9,1.8]),Impulse.ar(5.rand+1),0.05,0.05.linrand),120\
0.rand)}.play(s,0,9);wait(9);x.release(69)}}.play

/* by  redfrik */
play{x=LFNoise1.ar(0.5!2);Formlet.ar(Crackle.ar(x.range(1.8,1.98)),TExpRand.ar(200,2e3,x).l\
ag(2),x.range(5e-4,1e-3),0.0012)}//#SuperCollider

/* by  redfrik */
{|i|x=i+6.rand;Pbind(\dur,0.06,\sustain,1,\amp,0.01,\degree,Pgauss(x,sin(x+Ptime()%6/6e3)*9\
),\pan,Pkey(\degree)-x*9).play}!6//#SuperCollider

/* by  redfrik */
play{a=SinOsc;LeakDC.ar(a.ar(a.ar(0.31),a.ar(a.ar(0.21),a.ar(a.ar(0.11,a.ar(0.01)),0,a.ar([\
2,3],0,400))),a.ar([0.3,0.21])))}//#SuperCollider

/* by  minkepatt */
(f=Duty.kr(_,16,Dseq("berlin_calling".ascii,inf));play{RHPF.ar(LFTri.ar(f.(5/34)*7),f.(1/13\
)*16,(LFSaw.ar(0.25,1)))};)

/* by  minkepatt */
(f=Duty.kr(_,3,Dseq("berlin_calling".ascii.mirror,inf));play{RLPF.ar(LFTri.ar(f.(5/34)*7),f\
.(1/13)*16,(LFSaw.ar(0.5,1)))};)

/* by  minkepatt */
(play{x=SinOsc.ar(gcd (441*SinOsc.ar(5),882*SinOsc.ar(3))); (x.distort % x)};)

/* by  minkepatt */
(play{LFTri.ar(CombL.ar(Dust.kr(GrayNoise.ar)*44.1,1e-17,1e-6).round(441/3.5))};) 

/* by  minkepatt */
(9.rand.do{{LFCub.ar(441/3+441*5.rand)*LFGauss.kr(0.25+2.25.rand.round(0.125));}.play})

/* by  minkepatt */
(play{Schmidt.ar(Saw.ar([48,50,60].midicps), (LFPar.kr(0.87.rand)), (LFPar.ar(CuspL.ar(0.45\
.rand))))};)

/* by  minkepatt */
(play{Ringz.ar((Impulse.kr(4))+(Dust.kr(9))*2/3*(HPF.ar(ClipNoise.ar(1/6),3300)),[34,21].sq\
uared)};)

/* by  minkepatt */
(play{var a,f,t;a=Dseq([1,3,5,8,2,5]*[2,0,0.5],inf);t=(Impulse.kr([5,4,7]));f=Demand.kr(t,0\
,a)*[13,89];LFTri.ar(f)*0.2**Dust.ar(2)};)

/* by  minkepatt */
(play{var a,f,t;a=Dseq([1,8]*[2,0,0.5],inf);t=(Saw.kr([5,4]));f=Demand.kr(t,0,a)*[13,89];LF\
Tri.ar(f)*0.2**Dust.ar(2)};)

/* by  minkepatt */
({PanB2.ar(Gendy1.ar(3,4,2,2,305,500,0.33)*LFNoise2.kr(0.2,0.5),Saw.kr(0.2,3))}.play;) 

/* by  minkepatt */
({PanB2.ar(Gendy1.ar(6,2,3,4,rrand(15,250),5000,0.1)*LFNoise2.kr(0.2,2),LFNoise0.kr(0.2))}.\
play;)

/* by  lfsaw */
{Splay.ar({RHPF.ar(Saw.ar(Rand(0.5,9),LFSaw.ar(BrownNoise.ar(1299,500))),LFNoise2.ar(f=LFNo\
ise2.ar(9,9,1),999,0))*((f)**10).clip2}!9)}.play;

/* by  Monocular Sludgebath the 7th */
{Mix.fill(99,{|i| var a=LFNoise2.kr(i*[0.001,0.01]); a*RLPF.ar(Saw.ar(40+(i*10).round(5.73)\
,0.02),(a*2e3).max(100),(i+1)*0.001)})}.play

/* by  Monocular Sludgebath the 7th */
{b=99;GVerb.ar(Mix.fill(b,{|i|a=Saw.ar([b.rand+i,LFNoise0.ar(i*0.5.rand,b,rrand(b,2e3))]);R\
LPF.ar(Saw.ar(a,a)*0.05,(1000*a).max(b))}))}.play

/* by  Monocular Sludgebath the 7th */
{a=99; Mix.fill(a,{|i| e=EnvGen.ar(Env({1.0.rand}!a,{rrand(1,7)}!98,i.neg,97,0));e*Blip.ar(\
LFNoise2.ar(7*e,7,e*[a,1e3]+8),e*a%7,0.1)})}.play

/* by  juan alzate romero */
play{Splay.ar(SinOsc.ar(Latch.ar(SinOsc.ar(1.3,0,4e2,5e2),Impulse.ar([1,1.0001])*[1,2,3,4])\
.lag(0.05),0,0.9))}

/* by  juan alzate romero */
fork{loop{d=rrand(1,9.0);play{f=500.rand+60;Splay.ar(SinOsc.ar(XLine.kr(f,(f,f+1..f+f),d,1,\
0,2),0,0.1))};d.wait}}

/* by  juan alzate romero */
play{Resonz.ar(Crackle.ar!2,Duty.kr(Dseq([1,1,4,2,2]/8,inf),0,Dseq([99,Dwhite(99,9e3,1)],in\
f)),TExpRand.kr(0.001,1,Impulse.kr(8)))*4}

/* by  juan alzate romero */
play{a=LFNoise0;b=FBSineC.ar(a.kr(4,2e4,2e4),a.kr(10,16,17),1,1.005,0.7)!2;Latch.ar(b,Impul\
se.ar(a.kr(0.5,5e3,4e3)))*a.kr(5,0.4,0.5)}

/* by  juan alzate romero */
play{x=LFDNoise3;Blip.ar(x.kr(1,[400,100],500),x.kr([1,2],6,5))*Gendy3.ar(1,1,1,1,x.kr(1,99\
,91),1,0.1)*x.ar(0.5,0.5,0.5)}

/* by  pyoungryang ko */
Routine({inf.do{|i|{SinOsc.ar(263.rand-i%97.nthPrime*7,0,EnvGen.kr(Env.sine(9.9.rand,0.01.r\
and),doneAction:2))}.play;(i%0.13).wait}}).play

/* by  pyoungryang ko */
Routine({inf.do{|i|{SinOsc.ar(57.rand+i%181.nthPrime*5,0,EnvGen.kr(Env.sine(9.rand,0.01.ran\
d),doneAction:2))}.play;(i%0.17).wait}}).play

/* by  matt rogalsky */
{Splay.ar(Array.fill(100, { Pulse.ar(LFNoise2.kr(1,2500,10000)) + BPF.ar(Dust.ar(rrand(0.00\
1,0.009)),LFNoise2.kr(0.01,50,5000),0.01,500) }))}.play;

/* by  lfsaw */
{Splay.ar({SinOscFB.ar(LFSaw.ar(Sweep.ar(Impulse.ar(Rand()),2)).range(Rand(99,999), Rand())\
,Sweep.ar(Impulse.ar(Rand())))}!10)}.play

/* by  arthur carabott */
{a=99;Limiter.ar(FreeVerb.ar(Mix.fill(a, {|i|Saw.ar(55*i, Lag.kr(Impulse.kr(i*0.04), 1.0.ra\
nd))}),SinOsc.kr(6), FSinOsc.kr(0.01,0.5)))}.play

/* by  micromoog */
play{LFCub.ar(LFSaw.kr(LFPulse.kr(1/4,1/4,1/4)*2+2,1,-20,50))+(WhiteNoise.ar(LFPulse.kr(4,0\
,LFPulse.kr(1,3/4)/4+0.05))/8)!2}//#supercollider

/* by  micromoog */
play{BPF.ar(VarSaw.ar(LFNoise1.kr(3,40,200),0,0.25)+PinkNoise.ar(0.1),LFNoise2.kr(12,700,10\
00),0.3)!2};//#supercollider

/* by  micromoog */
4.do({{PanB2.ar(Gendy1.ar(1,1,1,1,rrand(150,250),500,0.1)*LFNoise1.kr(1,0.25,0.75),LFNoise1\
.kr(1))}.play})//#supercollider

/* by  micromoog */
play{a=VarSaw.ar(SinOsc.ar(1/20,7/3,80,80),0,LFNoise1.kr(1,1/2,1/2))*Line.ar(0,1)!2;CombN.a\
r(a,2,2,20,1,a).softclip}//#supercollider

/* by  micromoog */
play{VarSaw.ar((Hasher.ar(Latch.ar(SinOsc.ar((1..4)!2),Impulse.ar([5/2,5])))*300+300).round\
(60),0,LFNoise2.ar(2,1/3,1/2))/5}//#supercollider

/* by  micromoog */
{BPF.ar(DynKlang.ar(`[[3,5,[4,6]]],Demand.kr(Impulse.kr(1/3),0,Dseq([9,8,6,4],inf))*7).floo\
r,LFPar.ar(1,2,911,999))/3}.play;//#supercollider

/* by  micromoog */
play{a=LFNoise1;SinOsc.ar(round(VarSaw.ar(a.kr(10,0.1),0,1)*a.kr(1!2!2,400),a.kr(1/50,32,76\
))).tanh}//#supercollider

/* by  mstep */
{LeakDC.ar((GVerb.ar(GrainIn.ar(2,LFSaw.ar(LFSaw.ar(10)),Trig.ar(SinOsc.ar(4),1)*200,SinOsc\
.ar([60, 70]),0,-1),15)).tanh*0.2) }.play

/* by  nathaniel virgo */
p={|f,a=5|GVerb.ar(LFPulse.ar(f)*a)+f};play{tanh(HPF.ar(p.(99-p.(1/2,20)*(1+p.(2,1/5))+p.(4\
+p.(1/2)),0.5),80,XLine.kr(4e-4,1/8,61,1,0,2)))}

/* by  nathaniel virgo */
Ndef(\,{LPF.ar(x=DelayN.ar(LeakDC.ar(Ndef(\).ar,1-2e-6)*0.99,1,0.01)+Dust.ar(0.5!2);x+(Trig\
1.ar(x<(x.mean.lag(30)),4e-3)*0.05),800)}).play

/* by  nathaniel virgo */
Ndef(\,{x=DelayL.ar(n=Ndef(\);n.ar,2,LFNoise0.kr(0.03*_!20)+1)+Blip.ar(0.5);LeakDC.ar(LPF.a\
r(x+x.mean*0.15,4e3)).sin});play{Splay.ar(n.ar)}

/* by  nathaniel virgo */
play{w=LFSaw;a=w.ar(-3,1)+1/2;f=Sweep.ar(0,3).floor;f=(f**3+f%8+4)*(f%3+3)%49*3;CombN.ar(RL\
PF.ar(w.ar(f)*a,f**a*30,0.3).tanh,5/6,5/6,6)!2}

/* by  nathaniel virgo */
play{PitchShift.ar(CombN.ar(Formant.ar(101,4**LFNoise1.kr(0.5)*450,200),1,0.5,99),1,Duty.kr\
(4,0,Dseq([[6,8,10],[6,7.2,7]]/8,inf))).sum/25!2}

/* by  nathaniel virgo */
p=Impulse;play{mean({|i|Pluck.ar(LFSaw.ar([102,101]),x=p.ar(1,i/10)+p.ar(0),1,1/Latch.ar(1.\
015**Sweep.ar(0,1)*64%1+1*200,x),4,0.2)}!10)}

/* by  nathaniel virgo */
play{x=Saw.ar([50,50.1]);8.do{|i|f=2**(8-i);x=BRF.ar(AllpassN.ar(x,1,0.1/(12-i),2),80**TRan\
d.ar(0,1,Impulse.ar(f/32,1/2)).lag(1/f)*80,2)};x}

/* by  nathaniel virgo */
play{Splay.ar({|i|f=1.9**i/128;BPF.ar(PinkNoise.ar(1!2),4**LFNoise2.kr(1.2**i/16)*300,0.15)\
*(5**LFNoise2.ar(f)/(i+8)*20)}!15)}

/* by  nathaniel virgo */
play{x=Splay.ar({|i|RLPF.ar(0.6**i*40*Impulse.ar(2**i/32,1/2),4**LFNoise0.kr(1/16)*300,5e-3\
).sin}!8);2.do{x=FreeVerb2.ar(*x++[0.1,1,1])};x}

/* by  nathaniel virgo */
play{({|i|x=Dbufrd(b=LocalBuf(5).clear,i);x=x**x-LFNoise0.ar(1/(2**i),50).floor%16;Pulse.ar\
(Duty.ar(1/8,0,Dbufwr(x,b,i))*20)}!5).mean!2}

/* by  nathaniel virgo */
play{b=LocalBuf(4e5,2).clear;BufCombL.ar(b,LeakDC.ar(LPF.ar(PlayBuf.ar(2,b,16/15,0,0,1),300\
))+Blip.ar([20,21],1),2,40)/20}//#supercollider

/* by  nathaniel virgo */
play{b=LocalBuf(4e5,2).clear;BufCombL.ar(b,LeakDC.ar(BufRd.ar(2,b,LFNoise1.ar(0.25)+1*2e5)*\
0.98)+Blip.ar(2!2,10),2,20)/10}//#supercollider

/* by  nathaniel virgo */
play{b=LocalBuf(1e5,2).clear;x=BufRd.ar(2,b,Phasor.ar(0,1,0,1e5))*0.6;BufWr.ar(Blip.ar([1,1\
.01],10)/5+x,b,LFNoise1.ar(0.2)+1*5e4);x}//#sc

/* by  nathaniel virgo */
Ndef(\,{x=DelayN.ar(LeakDC.ar(Ndef(\).ar),1,z=1e-2);LPF.ar(Trig1.ar(Amplitude.kr(x,5,120)*1\
.5+x+z-Dust.ar(2),4e-3)*0.1+x*0.99,1200)}).play

/* by  nathaniel virgo */
f=g=0;Routine({loop{g=g+1e-3;f=f+g%1;play{l=Line.kr(1,0,3,doneAction:2);h=2**f*100;e=Pluck.\
ar(CuspL.ar,1,i=1/h,i,2,0.3)!2};0.15.wait}}).play

/* by  nathaniel virgo */
Ndef('x',{x=Ndef('x').ar+0.01;a=BPF.ar(x,6**Latch.ar(x,Dust.ar(x))*200,0.1).sin;9.do{a=Allp\
assN.ar(a,0.2,{0.2.rand}!2,9)};a+a.mean}).play;

/* by  nathaniel virgo */
Ndef('x',{x=(Ndef('x').ar*1.8).tanh;BPF.ar(x+[0.01,0.1],12**Latch.ar(x.mean,Impulse.ar(3)).\
lag(0.1)*200)}).play//#supercollider

/* by  nathaniel virgo */
b=Buffer.read(s,"sounds/a11wlk01.wav");play{t=Impulse.kr(5);PlayBuf.ar(1,b,1,t,Demand.kr(t,\
0,Dseq(1e3*[103,41,162,15,141,52,124,190],9)))!2}

/* by  nathaniel virgo */
Ndef(\x,{DelayN.ar(BRF.ar(Saw.ar(20!2)*0.01+Rotate2.ar(*(Ndef(\x).ar*2).tanh++0.1),20**LFNo\
ise1.kr(0.6)*500,1),1,1)}).play//#supercollider

/* by  nathaniel virgo */
Ndef('x',{Normalizer.ar(FreqShift.ar(Rotate2.ar(*Ndef('x').ar++1/8).tanh,20*[-3,0.995])+Dus\
t.ar(1!2,0.005),1,0.5)}).play//#supercollider

/* by  nathaniel virgo */
play{p=PinkNoise.ar(1!2);BRF.ar(p+Blip.ar(p+2,400),150,2,0.1)+LPF.ar(FreeVerb2.ar(*LPF.ar(p\
+0.2*Dust.ar(0.1),60)++[1,1,0.2,1e4]).tanh,2000)}

/* by  nathaniel virgo */
x=0;Pbind(*[type:\set,id:{|freq=10|f=freq;LPF.ar(Saw.ar(f),f.lag(1)*3)!2}.play.nodeID,freq:\
Pfunc{x=x+32%35;x%12+1*40},dur:1/6]).play

/* by  nathaniel virgo */
x=0;Pbind(*[type:\set,id:{|freq=10|LFTri.ar(freq.lag(0.1))!2}.play.nodeID,freq:Pfunc{x=x+32\
%355;x%12+1*40},dur:1/6]).play//#supercollider

/* by  nathaniel virgo */
n={|r,f,n=0,d=1|round(r**LFNoise0.ar([4,1,8,2]!d)*f,n)};play{Splay.ar(d=n.(3,0.6);Ringz.ar(\
d*0.01,n.(2,n.(20,400),40,20),d).mean.tanh)}

/* by  nathaniel virgo */
n={|r,f,d=1|2**LFNoise0.kr(1!d,r)*f};{p=n.(4,1e3);CombN.ar(Ringz.ar(LFPulse.ar(1,0,0.01),n.\
(2,p,80),0.6).sum,8/5,8/5,60)*4e-4!2}.play

/* by  nathaniel virgo */
x=LFPulse;d={|l,h,f,p,n|sum({Ringz.ar(x.ar(f,p,0.01),exprand(l,h).round(n),0.6)}!40)};{d.(3\
0,150,2,[0,0.3],[1,x.kr(1/8)*10+40])*3e-4!2}.play

/* by  nathaniel virgo */
d={|l,h,f,p,n|sum({Ringz.ar(LFPulse.ar(f,p,0.01),exprand(l,h).round(n),0.5)}!20)};{d.(50,15\
0,[2,1,1],[0,1/4,3/4],[1,40,50])*3e-4!2}.play

/* by  nathaniel virgo */
d={|l,h,f,p|({Ringz.ar(LFPulse.ar(f,p,0.01),exprand(l,h),0.5)}!20).sum};{d.(50,100,2,[0,1/4\
])+d.(3e3,1e4,4,0)+d.(2e2,3e3,1,0.5)*3e-4!2}.play

/* by  nathaniel virgo */
{LocalOut.ar(a=CombN.ar(BPF.ar(LocalIn.ar(2)*7.5+Saw.ar([32,33],0.2),2**LFNoise0.kr(4/3,4)*\
300,0.1).distort,2,2,40));a}.play//#supercollider

/* by  nathaniel virgo */
f=0;Routine({inf.do{|i|f=i/12+f%[4,3];{Formant.ar(2**f*100,2**(i%8*f*0.2)*100,100)*Line.kr(\
0.1,0,1)}.play;0.25.wait;}}).play//#supercollider

/* by  nathaniel virgo */
p={|f,a=1|LFPulse.ar(f)*a*[1,1.01]};{p.(p.(100-p.(1/16,20))+p.(2,1+p.(1/4))-0.5*200)+p.(100\
-p.(1/8,20),p.(8))*0.1}.play//#supercollider

/* by  nathaniel virgo */
{a=LFTri.ar(1);20.do{a=BAllPass.ar(a,80,1);a=((a+0.02)*LFNoise0.kr(1/2)*8).tanh;a=LeakDC.ar\
(a,0.995)};a*0.1!2}.play//#supercollider

/* by  nathaniel virgo */
{a=PinkNoise.ar(1!2);50.do{a=BBandStop.ar(a,LFNoise1.kr(0.05.rand).exprange(40,15000),expra\
nd(0.1,2))};LPF.ar(a,1e5)}.play//#supercollider

/* by  andrea valle */
{"|i|{Saw.ar(i.ascii.midicps,Line.kr(1,0,1,1,0,2))}.play;0.1.wait".do{|i|{Saw.ar(i.ascii.mi\
dicps,Line.kr(1,0,1,1,0,2))}.play;0.1.wait}}.fork

/* by  batuhan bozkurt */
{a=LocalIn.ar;LocalOut.ar(Mix.ar(x=SinOsc.ar((Decay.ar(Impulse.ar([4,4.005]),1e3*a.abs)*50)\
, a).distort));x;}.play;//tryingharder_to_noavail

/* by  batuhan bozkurt */
{f=LocalIn.ar(2).tanh;k=Latch.kr(f[0].abs,Impulse.kr(0.5));LocalOut.ar(f+AllpassN.ar(Pulse.\
ar([2,3],k*0.01+1e-6,0.9),1,k*0.3,100*k));f}.play

/* by  batuhan bozkurt */
play{f=LocalIn.ar(2).tanh;k=Latch.kr(f[0].abs,Impulse.kr(1/4));LocalOut.ar(f+CombC.ar(Blip.\
ar([4,6],100*k+50,0.9),1,k*0.3,50*f));f}//44.1kHz

/* by  batuhan bozkurt */
play{t=Impulse.ar(75);Sweep.ar(t,150).fold(0,1)*PlayBuf.ar(1,Buffer.read(s,"s*/*".pathMatch\
[2]),1,t,Demand.ar(t,0,Dbrown(0,2e5,2e3,inf)))!2}

/* by  batuhan bozkurt */
play{f={LocalBuf(512)};r={|k,m|RecordBuf.ar(Pulse.ar(8,m,6e3),k)};r.(a=f.(),0.99);r.(b=f.()\
,0.99001);Out.ar(0,IFFT([a,b]).tanh)};//44.1kHz:)

/* by  batuhan bozkurt */
play{AllpassC.ar(SinOsc.ar(55).tanh,0.4,TExpRand.ar(2e-4, 0.4,Impulse.ar(8)).round([2e-3,4e\
-3]),2)};// #supercollider with bass please...

/* by  batuhan bozkurt */
play{Mix({a=LFNoise1.ar(0.2.rand);DelayC.ar(BPF.ar(WhiteNoise.ar(Dust2.ar(a*a*4**2).lag(8e-\
3)),10e3.rand+300,0.09),3,a*1.5+1.5,45)}!80).dup}

/* by  batuhan bozkurt */
play{a=BPF.ar(Saw.ar([40,40.001]),LFNoise0.kr(128)+1*4e3+146,LFNoise1.kr(1)+1*5e-2+0.01).ta\
nh;CombC.ar(a,9,a.abs.lag(2)*9,a.abs.lag(1)*100)}

/* by  batuhan bozkurt */
play{LocalOut.ar(x=DelayC.ar(LPF.ar(LFNoise0.ar(8)**2+LocalIn.ar(2).tanh.round(0.05),6e3),1\
,LFNoise0.ar(8!2).range(1e-4,0.02)));x.tanh}//#sc

/* by  batuhan bozkurt */
play{t=Impulse.ar(8)*LFNoise1.ar(2);CombL.ar(Saw.ar([3,4],Decay.ar(t,0.1)).tanh,1,TRand.ar(\
0,0.01,t).round(15e-4),TRand.ar(-30,30,t))};//#sc

/* by  batuhan bozkurt */
play{LocalOut.ar(x=LFNoise1.ar(0.5*LocalIn.ar(1)+0.1,0.5,0.5));PitchShift.ar(PitchShift.ar(\
Pulse.ar([90,90.01],x),10,x*4,x),10,4-(x*4),1-x)}

/* by  batuhan bozkurt */
play{q=[0,3,5,7,10];t=Impulse.kr(4)*LFNoise0.kr>0;PitchShift.ar(Saw.ar(Demand.kr(t,0,Drand(\
(q+12++q+33).midicps,inf)),Decay.kr(t,3)),7,2)!2}

/* by  batuhan bozkurt */
play{(HPF.ar(LFNoise1.ar(2),[10,10.1])*100).tanh}// #supercollider yay! (be very careful wi\
th this one, very loud)

/* by  batuhan bozkurt */
play{t=[0,3,5,7,10,12]+30;a=Demand.kr(Impulse.kr(8),0,Drand(t+24++t,inf));(BHiPass.ar(LFNoi\
se1.ar(8)**3,[a,a+0.2].midicps,a/2e3,67-a)).tanh}

/* by  batuhan bozkurt */
play{t=[0,0,0,1,5,7,10,12,12,12]+30;a=Duty.kr(1/8,0,Dxrand(t+24++t++t,inf));(BHiPass.ar(LFN\
oise1.ar(8)**6,[a,a+7].midicps,a/3e3,67-a)).tanh}

/* by  batuhan bozkurt */
play{LeakDC.ar(BRF.ar(Saw.ar(8,Decay2.kr(x=Duty.kr(1/8,0,Drand([0,Drand((0.4,0.5..1))],inf)\
),0.01,0.3))**1.5,x*20+[45.1,45],0.1)).tanh}//#sc

/* by  batuhan bozkurt */
play{t=Impulse.kr(1/4);k=Duty.kr(2,0,Dxrand((50,52..90),inf));r={|a|Saw.ar(TIRand.kr(1,6,t)\
,a+1*3)}!3;BPF.ar(r.sum,[k,k+0.5],1/9,50).tanh}

/* by  batuhan bozkurt */
play{Mix({|k|k=k+1/2;2/k*Mix({|i|i=i+1;Blip.ar(i*XLine.kr(rand(2e2,4e2),87+LFNoise2.kr(2)*k\
,15),2,1/(i/a=XLine.kr(0.3,1,9))/9)}!9)}!40)!2*a}

/* by  batuhan bozkurt */
play{t=[0,3,5,7,10,12]+40;p=Duty.ar(1/4,0,Drand((t+12++t).midicps,inf));Blip.ar([b=TRand.ar\
(1500,2e3,Impulse.ar(16)).lag,b+p],1).mean!2**2}

/* by  batuhan bozkurt */
play{n=LFNoise0.kr(_);v=Blip.ar([2e4,2e4-9],1,n.(16)*0.5+0.5**9);42.do{v=LeakDC.ar(AllpassC\
.ar(v,1,n.(5)*5e-2+(5e-2+1e-3),100))};tanh(v*99)}

/* by  batuhan bozkurt */
play{Mix(HPF.ar(MidEQ.ar(Limiter.ar(GVerb.ar(HPF.ar(Pulse.ar([[0.1,0.11],[0.12,0.13]],0.6,5\
e-3),99),[[1,5/4],[1.5,2]],99)),9e3,0.9,9),200))}

/* by  batuhan bozkurt */
{r=44100;BufRd.ar(1,Buffer.read(s,"s*/*".pathMatch[2]),[r,r+500]*LFNoise0.ar(8,9,9)+(Sweep.\
ar(t=Impulse.ar(8),r)%TRand.ar(100,r/2,t)))}.play

/* by  batuhan bozkurt */
play{i=Impulse.ar(_);SinOsc.ar(i.(2).lagud(0,0.4)*360,Integrator.ar(Integrator.ar(i.(64).la\
g(LFNoise1.ar(2!2,2,2))*99,0.9),0.99).fold2(pi))}

/* by  batuhan bozkurt */
play{Splay.ar(Integrator.ar(LFNoise0.ar(5!3,k=Duty.ar(6.4,0,Dseq([0.05,Drand([0.04,0.08],1)\
],inf))).round(k/10)).sin.sqrt.tanh,0.3)}//#sc...

/* by  batuhan bozkurt */
play{LocalOut.ar(k=LeakDC.ar(Integrator.ar(Pulse.ar(pi/[1,2.57],[0.54,0.46]),(1-LocalIn.ar(\
2)).fold(0,[0.9,0.85])).wrap2(pi).sin));k.mean!2}

/* by  batuhan bozkurt */
{n=LFNoise0.ar(_);f=[60,61];tanh(BBandPass.ar(max(max(n.(4),l=n.(6)),SinOsc.ar(f*ceil(l*9).\
lag(0.1))*0.7),f,n.(1).abs/2)*700*l.lag(1))}.play

/* by  batuhan bozkurt */
play{LocalOut.ar(k=tanh(500*(PinkNoise.ar(1!2)**(8/(LagUD.ar(y=LocalIn.ar(2).range(0, 0.9)+\
0.1,0,0.2))))));Splay.ar(k,1-y)}

/* by  batuhan bozkurt */
play{i=Impulse.ar(8).lag(0.3)!2;10.do{i=LeakDC.ar(AllpassC.ar(i,1,LFNoise0.ar(8).range(1e-5\
,0.2),-0.15,LFNoise0.ar(8).range(1,3))).tanh};i}

/* by  batuhan bozkurt */
play{n=LFNoise0.kr(*_);v=Blip.ar([60,61],5,n.(4)**8);12.do{v=LeakDC.ar(CombC.ar(v,1,n.([1,0\
.05,0.06]).lag(5e3),9))};Limiter.ar(v,0.9,1)}

/* by  batuhan bozkurt */
play{k=Pulse.ar([3,3.01,2,2.01]/2,[0.4,0.6]).lag(0.8,0);k=[k[0],k[1]]*[k[2],k[3]];Splay.ar(\
(k*Blip.ar([80,81],2)*6.5).tanh,0.5)}

/* by  julian rohrhuber */
play { [0,0.08] * Pulse.ar(2020) * product(LFPulse.ar([8,1])) + [PinkNoise,WhiteNoise].sum(\
_.ar(XLine.kr(0.4, 0.02, 20) ! 2)) };

/* by  julian rohrhuber */
play { [0,0.08] * Pulse.ar(2020) * product(LFPulse.ar([8,1])) + Dust.ar(XLine.kr(10000, 10,\
 20) ! 2, 2) };

/* by  taylan cihan */
play{{Mix.fill(8,{a=LFDNoise0.kr(9);b=a.range(1e-4,0.2);c=a.abs*2e3;d=b.sqrt;RLPF.ar(CombL.\
ar(FBSineL.ar(c,a,a,a),b,b,d),c,d).tanh*0.2})}!2}

/* by  taylan cihan */
play{Mix.fill(8,{a=LFDNoise1.kr(18);b=a.range(1e-4,0.1);AllpassC.ar(CombC.ar(FBSineC.ar(a.a\
bs*2e4,a,a,a),b,b,b**a),b,b,a).distort*0.2}!2)}

/* by  chad mckinney */
play{{|i|r=LFNoise2.ar(0.3,0.5,0.5);n=DelayC.ar(In.ar(i*r),1,i);Out.ar(i*r,Ringz.ar(n+Impul\
se.ar(),Pitch.kr(n)[0]*i,200).softclip)}!30};

/* by  pyoungryang ko */
play{Mix.fill(64,{Resonz.ar(Impulse.ar(0.1.rand,0.8,9.9.rand),9+99.rand.nthPrime*SinOsc.kr(\
[1,1.7]*0.001).range([59,2],[2,11]),0.0004,33)})}

/* by  johannes quint */
{z=300;e=EnvGen.kr(Env([20,16000],[z],8),doneAction:2);Pan2.ar(RLPF.ar(RHPF.ar(BrownNoise.a\
r(0.5),e),e),Line.kr(-1,1,z))}.play

/* by  scott wilson */
{GVerb.ar(Formant.ar(Duty.kr({Drand([0.1,0.17],inf)},0,["SCBerlin10!","Vielen Dank,Andre!"]\
.collect({|t|Dseq(t.ascii/2,inf)})))/4).sum}.play

/* by  scott wilson */
{Formant.ar(Duty.kr({Drand([0.01,0.3],inf)},0,["SCBerlin10!","Vielen Dank,Andre!"].collect(\
{|t|Dseq(t.ascii/Drand([2,9]),inf)})))*0.4}.play

/* by  scott wilson */
{f=Duty.kr({Drand([0.1,0.13],inf)},0,["SCBerlin10!","Vielen Dank,Andre!"].collect({|t|Dseq(\
t.ascii/2,inf)}));Formant.ar(f,f*19,f*50)/4}.play

/* by  andre bartetzki */
{SinOsc.ar([20,23.7]*3,0,0.4).sqrt.thresh(LFNoise0.ar(11.3)).scaleneg(WhiteNoise.ar(0.4)).r\
ing2(Impulse.ar([12.7, 10.1]))}.play

/* by  andre bartetzki */
{FSinOsc.ar([11,11.3],0,100).sqrt.thresh(LFNoise0.ar(1.3,70)).scaleneg(Pulse.ar([7,8]*1000)\
).distort.ring4(LFNoise2.ar([0.3,0.3]))}.play

/* by  tedthetrumpet */
Pbind(\freq,Pseq((40,43..400).pyramid(3)),\dur,0.01).play;

/* by  julian rohrhuber */
Ndef(\y, { a = { Drand([Dshuf({0.005.rand}!rrand(2,12),SinOsc.kr(0.1)*40), 0.1], inf)}; TDu\
ty.ar(a!2,0,0.1)  }).play;

/* by  julian rohrhuber */
Ndef(\y, { a = {{ Drand([Dshuf({0.0001.rand}!rrand(2,11),SinOsc.kr(0.1)*40), 0.05], inf) }}\
; TDuty.ar(a!2!8,0,0.1).sum  }).play;

/* by  sciss */
play{{|i|b=LocalBuf(512);d=HPZ1.ar(Dust.ar(99));x=(IFFT(FFT(b,PlayBuf.ar(1,b)+d,0.5,0,1-d))\
).wrap;RecordBuf.ar(InFeedback.ar(i),b);x}!2}

/* by  sciss */
play{{|i|b=LocalBuf(64);d=HPZ1.ar(Dust.ar(99));x=CombN.ar(IFFT(FFT(b,PlayBuf.ar(1,b)+d,1,0,\
1-d)))%1;RecordBuf.ar(InFeedback.ar(i),b);x}!2}

/* by  xe kondo */
play{LPF.ar(SelectX.ar(LFNoise0.kr(0.2,20,20)lag:5,{Ringz.ar(TDuty.ar(Dseq([2,3,5].scramble\
*0.1,120)),Rand(110,440))*0.5}!40),999)!2}

/* by  tweety on a power cord */
{SinOsc.ar(SinOsc.ar([0.1, 0.2],0.5,{LFNoise2.kr(25, 0.5, 2.5)}!2)/SampleDur.ir)}.play;

/* by  mr proxxxy */
(q=4;q.do{|i|{BPF.ar(PinkNoise.ar(Impulse.kr([2*q,2*q+(0.01*q)])), 400*q,0.4/q)}.play;}; {S\
inOsc.ar([60,61],mul:0.25)}.play)

/* by  mr proxxxy */
(q=10;q.do{|i|{SinOscFB.ar([75+i,85+i],LFPulse.kr([60/q,60.05/q],add:0,mul:LFTri.kr(20/q,mu\
l:20/q)),mul:1/q)}.play})

/* by  tedthetrumpet */
Ndef(\x,Pbind(\midinote,Prand((20..60),inf),\amp,20,\dur,0.2));Ndef(\y,{Ndef.ar(\x).clip2(0\
.1)}).play;

/* by  jonatan liljedahl */
play{a=PinkNoise.ar(LFNoise1.ar(9,0.05));9.do{a=LeakDC.ar(CombN.ar(a,0.3,{0.3.rand+0.03}!2,\
7))};LPF.ar(tanh(LPF.ar(a+a.mean,777)),7777)}

/* by  jonatan liljedahl */
n=LFNoise1;Ndef('x',{a=SinOsc.ar(65,Ndef('x').ar*n.ar(0.1,3),n.ar(3,6)).tanh;9.do{a=Allpass\
L.ar(a,0.3,{0.2.rand+0.1}!2,5)};a.tanh}).play

/* by  jonatan liljedahl */
play{tanh((z=Ringz).ar((i=Impulse).ar(1,[0,5.7/8,1.7/8],[4,1,0.2]).sum,[400,300],0.3)+z.ar(\
i.ar(1/2,3/4),50,0.7,1+PinkNoise.ar))}

/* by  jonatan liljedahl */
play{v=Saw.ar([40,41])*4;6.do{|n|v=LeakDC.ar(AllpassL.ar(v,0.2,n*0.004+0.014+Saw.ar(0.25+n*\
0.25,0.01),1))};v.clip2(1)}

/* by  jonatan liljedahl */
play{v=Dust2.ar(4!2);7.do{|n|v=LeakDC.ar(CombL.ar(v,0.2,n*0.001+0.03+LFNoise1.kr(10,0.001),\
3))};v.clip2(1)}

/* by  tristan strange */
play{GVerb.ar(Saw.ar(GbmanN.ar([4,9])*XLine.kr(30,1600,120))/3,10).tanh};// robotic guitari\
sts and violinists were never likely to get on

/* by  marcus wrangoe */
play{GVerb.ar(CombC.ar(Dust.ar(100)/Blip.ar(200,Line.kr(5,50,200),0.1),Dust.ar(1),0.01,0.02\
),9,9,mul:0.0001)}

/* by  marcus wrangoe */
play{GVerb.ar(CombC.ar(Gendy1.ar(6,6,3,Dust.ar([100,101],1),20,400),1,0.01,0.02),9,9,mul:0.\
1)}

/* by  marcus wrangoe */
play{x=1500;b=SinOsc;d=Dust;t=Trig.ar(d.ar(x),b.ar(0.01));f=t*b.ar(x+b.ar*t);Limiter.ar(GVe\
rb.ar(GrainIn.ar(2,t,f,f,b.ar(x)),9,9)*0.2,0.4)}

/* by  marcus wrangoe */
play{x=6666;b=SinOsc;d=Dust;t=Trig.ar(d.ar(x),b.ar(0.001));f=t*b.ar(x+b.ar*t);Limiter.ar(GV\
erb.ar(GrainIn.ar(2,t,f,f,b.ar(x)),9,9)*0.2,0.6)}

/* by  jonatan liljedahl */
play{i=Impulse;LocalOut.ar(b=DelayL.ar(HPF.ar(LocalIn.ar(2),2e2)+Ringz.ar(i.ar(1/2),TExpRan\
d.kr(50,99,i.kr(1/2)),2,4).sin,7/6,7/6).tanh);b}

/* by  jonatan liljedahl */
play{GVerb.ar(Mix.fill(14,{|i|BPF.ar(Decay.ar(Impulse.ar(i*0.1+0.5),0.5,1.2),800*i+40)/(i+1\
)}),2.7,9).tanh}

/* by  jonatan liljedahl */
play{GVerb.ar(Mix.fill(10,{|i|Ringz.ar(Impulse.ar(i*0.05+0.5),300*i+40,0.12,3+PinkNoise.ar(\
3)).sin/(i+1)*0.6}),3.4,7).tanh}

/* by  josh parmenter */
{r={|y,z|50.collect{Rand(y,z)}};Pluck.ar(Crackle.ar*0.01,Dust.kr(r.(9,12)),0.01,SinOsc.kr(1\
/r.(50,20),r.(0,6),4e-4,5e-4),2,0.01).sum}.play

/* by  aucotsi */
play{c=LFPulse.kr(0.5);b=SinOsc.kr(0.0034);a=Line.kr(0.1,2pi,37);FreeVerb.ar(Blip.ar(a*340!\
2*Impulse.kr(b*34/pi),c*TIRand.kr(0,113,c),c))}

/* by  aucotsi */
play{b=SinOsc.kr(5);a=Line.kr(0.1,2pi,37);FreeVerb.ar(SinOsc.ar(a*341!2*Impulse.kr(b*680/pi\
),LFPulse.kr(2,0,pi/2),LFPulse.kr(3,0,1/pi*b)))}

/* by  aucotsi */
play{c=LFPulse.kr(50);a=Line.kr(0.1,2pi,37000);b=SinOsc.kr(a/pi);FreeVerb.ar(Blip.ar(a*340!\
2*Impulse.kr(b*340/pi),c*TIRand.kr(0,999,c),c))}

/* by  josh parmenter */
3.do({{b=Decay2.kr(Dust.kr(0.1),5,7);Pan2.ar(HPF.ar(LPF.ar(WhiteNoise.ar(b+0.4),b*1500+200)\
,b*100+100)*5,1.0.rand2)}.play})//surf's up

/* by  aucotsi */
Array.fib(7,0,4)!2.do({|i|{SinOsc.kr(i)*Line.ar(0.3,0,36,1,0,2)*Blip.ar(340!2*LFPulse.kr(2,\
0,1/i),TRand.kr(-2pi,2pi,Impulse.kr(9)))}.play})

/* by  aucotsi */
play{b=SinOsc.kr(1);a=Line.kr(0.1,2*pi,37);FreeVerb.ar(SinOsc.ar(b*340!2*Impulse.kr(b*680),\
LFPulse.kr(2,0,pi/2),LFPulse.kr(3,0,1/pi*a)))}

/* by  marcus wrangoe */
play{Klank.ar(`[[99,98,953,923],nil,[1,1,1,1]],GVerb.ar(CombC.ar(Gendy1.ar(6,6,3,Dust.ar([9\
9,98],1),20,400),1,0.01,0.02),9,9,mul:0.005))}

/* by  tim walters */
play{({|i|Resonz.ar({Gendy1.ar}!2,i*2+1*50,Blip.ar(i+1/99,i+8).squared*0.005*LFTri.kr(i+1*0\
.001).abs*25)*(i+1).sqrt}!15).sum/4}

/* by  jonatan liljedahl */
play{p=LFPulse;tanh(p.ar(40)*p.kr([0.5,1])+mean({|n|(p.ar(n*2e2+50*p.kr(2-n/[1,3,5],(1..3)/\
10).sum+4e2)*p.kr(n+1*6,0,0.8).lag)}!2)/2)}

/* by  jonatan liljedahl */
play{p=LFPulse;tanh(p.ar([50,52])*p.kr([2,1]/4)+mean({|n|(p.ar(n*3e2+100*p.kr(2-n/[1,5,7],(\
0..2)/10).sum+2e2)*p.kr(n+1*6,0,0.8).lag)}!2)/2)}

/* by  aucotsi */
play{#a,b,c=[LFSaw,TRand,SinOsc];a.ar(c.kr(5),b.kr(-2pi,2pi),b.kr(0.1,1.0,c.kr(340)))*c.ar(\
680!2*c.kr(b.kr(87,393,c.kr(7))),0,c.kr(pi/13))}

/* by  lucas samaruga */
q=play{SinOsc.ar({\freq.kr.lag(0.9.rand)}!2)*0.1};fork{inf.do{5.do{fork{var x=1.7.rand;(180\
0.rand!10).do{|i|q.set(\freq,i);x.wait}}};7.wait}}

/* by  lucas samaruga */
q=play{SinOsc.ar({\freq.kr.lag(0.9.rand)}!2)*0.1};fork{20.do{5.do{fork{(1800.rand!10).do{|i\
|q.set(\freq,i);0.9.rand.wait}}};7.wait};q.release}

/* by  aucotsi */
play{#a,b=[SinOsc,TIRand];FreeVerb.ar(a.kr(2)*Formant.ar(68!2*Stepper.kr(LFTri.kr(3),0,4*b.\
kr(-4,12,a.kr(3).ceil)),a.kr(8*11),222,16))}

/* by  aucotsi */
play{#a,b=[SinOsc,TIRand];FreeVerb.ar(a.kr(1.1)*Formant.ar(44!2*Stepper.kr(LFTri.kr(2.9),0,\
b.kr(-4,12,a.kr(0.61).ceil)/4),a.kr(0.1),99,8))}

/* by  josh parmenter */
{t=Dust.kr(1);FreeVerb.ar(GrainSin.ar(1,t,0.1,Select.kr(LFNoise1.kr(1,3,2),[[4,6,7,9,11,12]\
,[0,4,4,7,7,7]].flop+84).midicps),1,1)*0.1}.play

/* by  tim walters */
play{GVerb.ar(({|i|Blip.ar(i+1/99,i+1).exprange(1/9999,0.008)*Gendy1.ar(6,6,1,Blip.ar(i+1*[\
2,3],3).abs,i+2*50,i+3*50)}!16).sum,3,9,0.3)}

/* by  julian rohrhuber */
Ppar({|i| Pbind(\dur, 0.01, \note, sin(i+1*0.002.rand*Pseries()+2.0.rand)*40.rand-10) } ! 5\
).play;

/* by  stefan nussbaumer */
play{e=Drand((3..65),inf);d={|i|p=Impulse.ar(i+1*2)}!2;GVerb.ar(Blip.ar(Demand.kr(A2K.kr(p)\
,0,e),TExpRand.ar(2e4,20,d*10)),100,0.02,0.02)}

/* by  nathaniel virgo */
x=Ndef(\x,Pbind(\freq,Pseq(a=(3..5);a/.x a*.x[40,80],8)));Ndef(\,{Limiter ar:GVerb.ar(Pitch\
Shift.ar(Ndef ar:\,1,2,0,0.1),30,9)/4+x.ar}).play

/* by  nathaniel virgo */
x=Ndef(\x,Pbind(\freq,Pseq(a=(3..5);a*.x a*.x[4,8],8)));Ndef(\,{Limiter ar:GVerb.ar(PitchSh\
ift.ar(Ndef ar:\,1,2,0,0.1),20,20)/4+x.ar}).play

/* by  david granstroem */
play{z=SinOsc;y=222;GVerb.ar(z.ar(y+z.kr([27,81],0,y*LFDNoise1.kr(0.4,1.4).abs))*0.03+GVerb\
.ar(PinkNoise.ar(Decay2.kr(Dust.kr(1!2))*0.02)))}

/* by  david granstroem */
play{z=LFDNoise0;x=Blip.ar(Duty.kr(8,0,Dseq([220,174],inf)).lag,4);5.do{x=BPF.ar(x,z.kr(8!2\
,400,700).abs,z.kr(0.8!2).range(0.1,1).abs)};x*2}

/* by  david granstroem */
play{x=if(LFPulse.kr(2,0,LFNoise0.kr(1).range(0,1)),SinOsc.ar(1.5)*200,Saw.ar(4,Duty.kr(8,0\
,Dseq([400,800],inf))));LFTri.ar([120+x,480+x])}

/* by  alexandra cardenas */
{a=Dust;b=TRand;Splay.ar(Blip.ar(b.kr(1,40,a.kr(2.5)),b.kr(1,150,a.kr(0.5))!3),1,b.kr(0,1,a\
.kr(2)),b.kr(-1,1,a.kr(0.7)))}.play

/* by  alexandra cardenas */
{d=Dust;r=TRand.kr(1,2,d.kr(1));z=TRand.kr(1,2,d.kr(2));l=LFClipNoise;Crackle.ar([z,r], 0.6\
)+SinOsc.ar([l.ar(r,60,90),l.ar(z,40,100)])}.play

/* by  jonatan liljedahl */
play{GVerb.ar({|i|Ringz.ar(LPF.ar(Impulse.ar(2/(1.2+i)),6e3),1e3/(i+1),0.3)}.dup(20).scramb\
le.sum/2,18,5).tanh}

/* by  jonatan liljedahl */
play{x=0;(100..110).do{|f|f=f/2;x=SinOsc.ar(f+[0,1],x*LFTri.kr(0.4,2).range(1.2,1.6))};x}

/* by  jonatan liljedahl */
play{sum({|i|GrainSin.ar(1,Impulse.ar(LFTri.kr(0.05*i+[0.2,0.201]).range(1,30)),0.02, 200*i\
+50)}!9).tanh}

/* by  jonatan liljedahl */
play{sum({|i|GrainSin.ar(1,Impulse.ar(LFNoise1.kr(0.5!2).range(1,30)),0.025, 200*i+100)}!7)\
.tanh}

/* by  jonatan liljedahl */
play{a=ar(PinkNoise,5e-4);ar(GVerb,({ar(Ringz,a*LFNoise1.kr(0.2),exprand(60,8000),3)}!40).s\
um,50,99).tanh}

/* by  jonatan liljedahl */
play{GVerb.ar(GVerb.ar(Impulse.ar([8,6,4]/100,[0,0.2,0.6])*8,[2,3,1.5],90,drylevel:0).sum.s\
in.sum*0.5,62,24).tanh}

/* by  jonatan liljedahl */
play{LeakDC.ar(GVerb.ar(Mix.fill(14,{|i|BPF.ar(Decay.ar(Impulse.ar(i*0.1+0.5),0.5,1.5),1000\
*i+40)/(i+1)}),2.7,7)).tanh}

/* by  jonatan liljedahl */
play{GVerb.ar(Mix.fill(10,{|i|Ringz.ar(Impulse.ar(i*0.05+0.5),300*i+40,0.12,3+PinkNoise.ar(\
3)).sin/(i+1)*0.6}),3.4,7).tanh}

/* by  chad mckinney */
play{b=LocalBuf(8192,4);d=Dust.kr(9)+Dust2.kr([4,4]);r=PlayBuf.ar(2,b,TExpRand.kr(0.01,19,d\
),d,0,1)/4;FFT(b,d+(d*r));Out.ar(0,r.clip2(1))};

/* by  dan stowell */
d=(TChoose.kr(Impulse.kr(0.5),_));{([LPF,BPF,HPF].choose.ar(Saw.ar(d.((3..6)/3**2)*[44,45])\
,LFTri.ar(d.((1..8)))*9**3)*9).tanh}.play

/* by  andrea valle */
{t="With my prayers";999.do{|i|{MoogFF.ar(Saw.ar(t[i.postln%15].ascii),i%99*99)*EnvGen.kr(E\
nv.perc)}.play;0.2.wait}}.fork

/* by  scacinto */
{var w,x,z;x=LFNoise0.kr(5,0.3,0.3);w=Mix(SinOsc.ar(x*99%50)!4);z=SinOsc.ar(x*500*w,w%pi+pi\
).clip(x,x*3)*w.clip;z!2}.play

/* by  alexandra cardenas */
{d=Dust;r=TRand.kr(40,100,d.kr(1));Splay.ar(Ringz.ar(LFPulse.ar([0.1,0.2,1,1.2]),[r,r*1.2,r\
*3,r*7],[0.6,0.4,0.2,0.1]),1,0.03,0.0)}.play

/* by  alexandra cardenas */
play{d=Dust;r=TRand.kr(50,800,d.kr(15));t=TRand.kr(100,700,d.kr(0.2));Splay.ar(SinOsc.ar({|\
i|LFNoise2.kr(rrand(10,200),r,t)}!9),1,0.3,0)}

/* by  redfrik */
play{f={|o,i|if(i>0,{SinOsc.ar([i,i+1e-4]**2*f.(o,i-1),f.(o,i-1)*1e-4,f.(o,i-1))},o)};f.(60\
,6)/60}//#SuperCollider

/* by  redfrik */
play{a=SinOscFB;sum({|i|a.ar(a.ar(a.ar(a.ar(i+1,1/9,999),1/9,a.ar(1/9,1,1/9)),a.ar(0.1,3),i\
+2*999),a.ar(1/9,1/9),1/9)}!9)!2}//#SuperCollider

/* by  redfrik */
play{b=LocalBuf(9e4,2).clear;i=Sweep.ar(BufRd.ar(2,b,Saw.ar(12,3e4,4e4)),9e4);BufWr.ar(Saw.\
ar([8,9]),b,i);BufRd.ar(2,b,i)/2}//#SuperCollider

/* by  redfrik */
play{b=LocalBuf(8e4,2).clear;i=Sweep.ar(BufRd.ar(2,b,Saw.ar(3.1,4e4,4e4)),8e4);BufWr.ar(Bli\
p.ar([2,3]),b,i);BufRd.ar(2,b,i)}//#SuperCollider

/* by  redfrik */
play{b=LocalBuf(5e3,2).clear;i=Sweep.ar(BufRd.ar(2,b,Saw.ar(50,2e3,5e3)),6e4);BufWr.ar(Saw.\
ar([4,3]),b,i);BufRd.ar(2,b,i)/6}//#SuperCollider

/* by  redfrik */
play{b=LocalBuf(1e4,2).clear;i=Sweep.ar(BufRd.ar(2,b,Saw.ar(1,2e3,5e3)),5e5);BufWr.ar(Saw.a\
r([8,50]),b,i);BufRd.ar(2,b,i)/3}//#SuperCollider

/* by  redfrik */
play{a=LFPulse;b=(1..4);Mix(a.ar(a.ar(a.ar(a.ar(b/32)+1/8)+1*b)+(Mix(a.ar(b/64))+a.ar(4/b)*\
(a.ar(a.ar(b/8))*2+b))*100))/8!2}//#SuperCollider

/* by  redfrik */
r{{|j|a=play{sin(Decay.ar(Duty.ar(1/50,0,Dseq(flat({|i|asBinaryDigits(j+1*i)}!8),4),2),j+1*\
0.008))/2!2};5.12.wait}!256}.play//#SuperCollider

/* by  redfrik */
play{a=1/(2..5);GVerb.ar(Splay.ar(Ball.ar(LPF.ar(Impulse.ar(a),500),7-(1/a),1e-5,LFNoise2.k\
r(a/5,2e-4,12e-4))/2),5,0.5,0.9)}//#SuperCollider

/* by  redfrik */
play{Splay.ar({|i|f=i+5*99;RHPF.ar(Ringz.ar(Ball.ar(Saw.ar(i+1)>0,SinOsc.kr(0.1,0,1/5,0.3),\
0.05,0.02)/99,f,0.05),f,0.1)}!5)}//#SuperCollider

/* by  redfrik */
{|j|r{{|i|x=sin(i/5+(j*5));Ndef(i%5+(j*5),{Pan2.ar(LFCub.ar(j*2+x*40+400+i)/15,i%5/2-1)}).p\
lay;wait(x.abs+0.5)}!500}.play}!5//#SuperCollider

/* by  redfrik */
{CombL.ar(In.ar(8).tanh/8,1,1,8)!2}.play;Pbind(\amp,8,\dur,1/4,\degree,Pseq(List.fib(32)%(L\
ist.fib(64)%12),inf),\out,8).play//#SuperCollider

/* by  redfrik */
play{GVerb.ar(ceil(In ar:8*4+4)-4/10)};Pbind(\dur,2,\legato,Pgeom(0.5,1.1),\degree,Pseq(Lis\
t fib:8+[[1,4]]-9,9),\out,8).play//#SuperCollider

/* by  redfrik */
play{MoogFF.ar(LFTri.ar(CombN.ar(Duty.ar(1/8,0,Dseq(Dshuf(List.fib(16)%8*99,8),inf)),4,4,16\
))/4,LFTri.kr(1/16,0,2e3,3e3))!2}//#SuperCollider

/* by  redfrik */
play{{|i|CombC.ar(In.ar(8),3+i,LFTri.ar(0.5,0,1,2+i),99)}!2};Pbind(\out,8,\note,Pstutter(8,\
Pseq(List.fib(32)%9/3,inf))).play//#SuperCollider

/* by  redfrik */
play{a=LFPar;GVerb.ar(VarSaw.ar(a.ar(1,0,5,a.ar([0.05,0.04],0,50,160).round(50)),0,a.ar(0.2\
,0,0.5,a.ar(3,0,0.2,0.5)))/8,80)}//#SuperCollider

/* by  redfrik */
x=0;{|i|Pbind(\dur,i+1/4,\lag,i/6/6,\octave,i+3,\legato,i+1/6,\degree,Pn(Plazy{x=x+1%6;Pseq\
(asDigits(x+1*142857))})).play}!6//#SuperCollider

/* by  alexandra cardenas */
{p=Pan2;n=LFNoise2;p.ar({Blip.ar(n.kr(0.1).range(12),n.kr(0.1).range(10,1200))}!8,0,0.2)}.p\
lay 

/* by  alexandra cardenas */
play{n=LFDNoise3;Limiter.ar(Splay.ar(VarSaw.ar([n.kr(1).range(60,180),80,n.kr(2).range(70,1\
20)])*n.ar(50).wrap2(n.kr(1.7)),1,1,0),0.5,0.01)} 

/* by  juan alzate romero */
play{GVerb.ar(Blip.ar((Duty.kr(1/[1,2,4],0,Dseq([0,3,7,12,17]+24,inf))).midicps*[1,4,8],LFN\
oise1.kr(1/4,3,4)).sum,200,8)} // #sc

/* by  juan alzate romero */
play{d=Duty;t=d.kr(1,0,Dseq(0-[0,7,5,2],inf));f=d.kr(1/3,0,Dseq([0,3,7,12,8]+60,inf));GVerb\
.ar(Blip.ar(f.midicps*t.midiratio,4),99,4)} //#sc

/* by  juan alzate romero */
play{d=Duty;f=d.kr(1/[1,2,3],0,Dseq([0,3,7,12,8]+36,inf));GVerb.ar(Blip.ar(f.midicps*[1,2,3\
],LFNoise1.kr(1,8,8)).sum,99,4)} // #sc

/* by  juan alzate romero */
play{t=Impulse.kr(6);f=lag(Demand.kr(t,0,Drand(midicps((10,13..40)),inf)));RLPF.ar(Pulse.ar\
(f),1+Decay2.kr(t)*LFNoise2.kr(2,8,9)*f,1/9)!2}

/* by  juan alzate romero */
9.do{|i|Pbind(\scale,#[0,3,5,7,10],\octave,3.rand+3,\dur,(12.rand+1)/5,\degree,Pn(Plazy{Pse\
q(18.partition(8.rand+2)-1,4)})).play}

/* by  adina izarra */
({(Mix.fill(98,{SinOsc.ar(5000*rrand(1.0,1.9),0,1/18)}))*EnvGen.ar(Env.perc(0.01 * 0.5, 0.0\
1 * 0.9),Dust.ar(6))!2}.play)

/* by  steel stylianou */
play{x=SinOsc;y=LFNoise1;a=y.ar(4);(MoogFF.ar(LFPulse.ar(y.ar(1,96,768))+x.ar(Pulse.ar(6)*6\
4)+x.ar(8+(a*96)),a*XLine.ar(1,8,9,96,9,0)))!2/3}

/* by  scott wilson */
{f=Duty.kr({Drand([0.1,0.11],inf)}!2,0,Dseq("AllWorkAndNoPlayMakesPAaDullBoy".ascii/2,inf))\
;Formant.ar(f,f*19,LFNoise2.kr(1!2,f*70))/4}.play

/* by  juan alzate romero */
Buffer.allocConsecutive(8,s,1024,1,{|b,i|b.sine1Msg(1/(1..((i+1)*6)))},0);{f=[50,75,99];VOs\
c3.ar(LFNoise1.kr(1/4,3,4),*f).lag(3e-3)!2}.play

/* by  juan alzate romero */
play{a=Decay2.ar(Impulse.ar(2),1e-4)*SinOsc.ar(40);Compander.ar(Blip.ar([40,40.5]*TChoose.k\
r(a>0.6,(1..8)),4),a,8e-3,1,0.001,0.1,1/3)+a}

/* by  juan alzate romero */
play{RandSeed.kr(1,5);e=ExpRand;Splay.ar(Decay2.ar(Impulse.ar({e.new(1,8).round/3}!9),5e-3,\
{e.new(0.1,1)}!9,LFCub.ar({e.new(66,666)}!9)))}

/* by  juan alzate romero */
play{e=ExpRand;Splay.ar(Decay2.ar(Impulse.ar({e.new(1,8).round/3}!9),5e-3,{e.new(0.1,1)}!9,\
LFCub.ar({e.new(66,666).round(20)}!9)))}

/* by  juan alzate romero */
play{o=LFCub.kr(1/[24,8,12,16]);PitchShift.ar(LFPulse.ar(o[0]*50+100,0,0.1).lag(1/1000),o[3\
]*0.299+0.3,o[2]*8+8,o[1],o[0]*0.1+0.1)!2} 

/* by  juan alzate romero */
play{o=LFPulse.kr(1/[4,7,13,19]);PitchShift.ar(LFPar.ar(o@0*1e3+1e3,0,0.1).lag(1/1000),o@3*\
0.299+0.3,o@2*2+2,o@1,o[0]*0.1+0.1)!2}

/* by  juan alzate romero */
play{f=FreeVerb;x=Decay2.ar(Impulse.ar([3,4]),1/99,1/9,LFCub.ar({rrand(40,400)}!4,0,0.1)).s\
um;5.do{x=f.ar(x,LFTri.kr(1/16,0,1/4,0.3))};x!2}

/* by  juan alzate romero */
play{Splay.ar(LFCub.ar(LFTri.kr([1,1.1],0,LFPar.kr(1/7,0,10),{(100,150..500).choose}!5),0,L\
FSaw.kr(4/[1,3,4,6,8])**LFPar.kr(1/8,0,7,8)))}

/* by  juan alzate romero */
fork{loop{x=play{l=LFPar;Splay.ar(l.ar(l.kr(rrand(1,8)).wrap(l.kr(12/{rrand(1,9)}!9))*100+{\
(40,80..400).choose}!9))/9};16.wait;x.free}}//#sc

/* by  juan alzate romero */
play{l=LFCub;Splay.ar(l.ar(l.kr(9.1,0,100,(200,300..500)),0,(l.kr(9,0,1,l.kr(1/[2,3,5,7],0,\
0.5,0))>(l.kr(1/2,0,0.4,0.5))).lag/2))} // #sc

/* by  juan alzate romero */
play{l=LFTri;l.ar(l.kr(1/4,1/1.5,l.kr(1,0,10,10),{|i|50+i}!8+l.kr(1/3,0,3,40)),0,l.kr((1..8\
),0,0.2).excess(0.01)*({|i|16-i/20}!16)).sum!2}

/* by  juan alzate romero */
play{l=LFSaw;l.ar(l.kr(l.kr([1,3],0,4,[5,6]),l.kr(1/[4,8]),l.kr(1,0,99,100),l.kr(1/2,0,50,l\
.kr(1/4,1/3,50,100))))/2} // #sc LFSaw galore

/* by  juan alzate romero */
play{l=LocalIn.ar+Decay2.ar(Impulse.ar(3),3e-2,0.05,LFCub.ar(LFNoise2.kr(19,500,600)));l=De\
layN.ar(l,1,1/2);LocalOut.ar(l/3);l!2}

/* by  juan alzate romero */
fork{loop{r=rrand(-9,9);9.do{play{Pan2.ar(SinOsc.ar(LinRand(40,900,r))*EnvGen.ar(Env.sine(9\
),1,0.1,0,1,2),rrand(-1,1.0))}};rrand(6,8).wait}}

/* by  juan alzate romero */
play{Splay.ar(Logistic.ar(LFCub.kr([1,2,4,8],0,1.9,2),LFNoise0.kr(1/[16,12,8,4],400,999).ro\
und(150)))}

/* by  juan alzate romero */
play{a={rrand(-1,1.0)}!1e3;e=InterplEnv(a,[1e-3],a*3);IEnvGen.ar(e,LFSaw.kr(LFCub.kr(8,0,30\
0,300),0,e.times.sum)*LFNoise2.kr(99,1,10))!2}

/* by  juan alzate romero */
l=Line;fork{loop{play{Splay.ar(SinOsc.ar({x=rrand(1,7)*99;l.kr(x,x/(0.25,0.5..4).choose,9)}\
!9,0,1/9)*l.ar(1,0,9,0.5,0.5,2))};1.wait}}

/* by  matt barber */
{g=LFNoise2.ar(3,11,12);f=LFNoise0.ar(g,4,6).octcps;x=Resonz.ar(LFNoise0.ar(f),f/6,(g**2.3)\
/f).clip;AllpassC.ar([x,0-x],1,g/23,99)}.play

/* by  matt barber */
{g=LFNoise2.ar(3,11,12);f=LFNoise0.ar(g,4,6).octcps;x=Resonz.ar(LFNoise0.ar(f),f/6,(g**2.3)\
/f).clip;AllpassC.ar([x,0-x],1,g/230,99)}.play

/* by  matt barber */
{g=LFNoise2.ar(3,11,12);f=LFNoise0.ar(g,4,6).octcps;x=Resonz.ar(LFNoise0.ar(f),f/6,(g**2.3)\
/f).clip;AllpassC.ar([x,0-x],1,g/230,g)}.play

/* by  matt barber */
{g=LFNoise2.ar(3,11,12);f=LFNoise0.ar(g,4,6).octcps;x=Resonz.ar(LFNoise0.ar(f),f/6,(g**2.3)\
/f).clip;AllpassC.ar([x,0-x],1,g/230,g/9)}.play

/* by  alexandra cardenas */
play{l=LFDNoise3.kr(0.5);Splay.ar(MoogFF.ar(SinOsc.ar([150,200]*BrownNoise.ar(1)*LFPulse.ar\
([10,20])),l.range(10,5000),l.range(0,4)),1,1,0)}

/* by  alexandra cardenas */
{Splay.ar({Integrator.ar(LFPulse.ar(rrand(0.1,42.0),0.3,4e-4),0.999,VarSaw.ar(LFDNoise3.kr(\
2.1).range(100, 3600)),0)}!22,1,0.7,0)}.play

/* by  thor */
play{x=Decay;d=Dust.ar(4);FreeVerb.ar(LPF.ar(x.ar(d,0.5,WhiteNoise.ar),2000)+x.ar(d,0.15,Si\
nOsc.ar([40,47],pi,5)), 0.4,0.6,TRand.ar(0,1,d))}

/* by  nathaniel virgo */
p={|x,a,f|9.do{x=BAllPass.ar(x*a,f).tanh};x};play{i=Dust.ar(2);p.(Dust.ar(EnvGen.ar(Env.per\
c,i,1e3,0,1).poll),1.4,50)/9+p.(i,1.8,40)!2}

/* by  aliaksandr tsurko */
{Mix.fill(100, {arg a = 1 + a; Pan2.ar(SinOsc.ar(70 * a, 0, Dust.kr(0.01 + (a * 0.009), 0.5\
)), {rrand(-0.2, 0.2)})})}.play

/* by  jonathan siemasko */
{CombC.ar(Klang.ar(`[[100,101,1000,1001]],1,0)*0.1,0.33,LFTri.ar(0.1, 0, 0.1, 0.11)+LFTri.a\
r(0.17, 0, 0.1, 0.22),10)!2}.play;


/* by  thor */
play{a=SinOsc.ar(0.15).clip(0,1);(Decay2.ar(LFSaw.ar(13)*a, 0.1)+DelayC.ar(RLPF.ar(PinkNois\
e.ar(0.3)*a,2300,0.1),3,3))!2}

/* by  thor */
play{x=SinOsc;y=LFNoise0;a=y.ar(8);(x.ar(Pulse.ar(1)*24)+x.ar(90+(a*90))+MoogFF.ar(Saw.ar(y\
.ar(4,333,666)),a*XLine.ar(1,39,99,99,0,2)))!2/3}

/* by  thor */
b=Buffer.read(s,"sounds/a11wlk01.wav");play{t=Dust.kr(99);r=TRand;o=Warp1.ar(1,b,r.kr(0,4e4\
,t),r.kr(0.5,1,t),1)*Linen.kr(t,1);GVerb.ar(o)}

/* by  mark hadman */
play{o=SinOsc;a=[2,3,5,7,11,13,17];Splay.ar(o.ar(o.ar(a,0,50,o.ar(0.119,0,5,32))*a,0,o.ar(0\
.027,-pi/2,9,9).dbamp/(1,2..7))).tanh}//#sc140

/* by  juan alzate romero */
play{a=Blip.ar(60,4,LFGauss.ar(4,1/8));a=a/4+LocalIn.ar(2);a=FreqShift.ar(a,LFNoise0.kr(1/4\
,90));LocalOut.ar(DelayC.ar(a,1,0.1,0.9));a}

/* by  juan alzate romero */
play{Splay.ar(SinOsc.ar(LastValue.kr(LFNoise0.kr(3!4,250,220),200).round(50)*LFPar.kr((1..4\
)/8,0,{0.02.rand}!4,1),0,1/2))}

/* by  juan alzate romero */
play{a=0;6.do{x=Sweep.ar(Dust2.kr(0.1+2.0.rand),9.rand+9)+LFNoise1.kr(0.1,60,80);a=a+Pan2.a\
r(Gendy1.ar(1,1,1,1,x,x+9),LFNoise2.kr(1))};a}

/* by  juan alzate romero */
play{GVerb.ar(Mix(9.collect{SetResetFF.ar(Dust.ar(8.rand+1),Dust.ar(8.rand+1)).lag*SinOsc.a\
r(LFNoise2.kr(0.1,25,exprand(40,1500)))/9}))}

/* by  juan alzate romero */
play{i=Impulse;a=Splay.ar((PulseCount.ar(i.ar((1..8)),i.ar(1/3))>4)*Blip.ar((40,44..62).mid\
icps,2));a*0.1+GVerb.ar(HPF.ar(a,4e3),300,14)*4}

/* by  juan alzate romero */
play{f=LFPar.kr(0.22,0,100,150).round(50).lag;Gendy2.ar(6,6,0.5,0.01,f,f+[1,2],1/2,1/2,19,f\
/4)*SetResetFF.ar(Impulse.ar(4),Dust.ar(4)).lag}

/* by  juan alzate romero */
play{l=LFNoise2;FreeVerb2.ar(*XFade2.ar(SinOscFB.ar([80,81],l.kr(1)+1/2),SinOscFB.ar([121,1\
60],l.kr(1)+1/2),l.kr(1)))}

/* by  juan alzate romero */
fork{loop{a=play{p=Pulse;p.ar(p.ar([p.kr(1)*3+4,4.01]*10,1/[3,4],[98,99],199),p.kr(1/4,1/2,\
1/2,1/2).lag)*p.kr([2,3],0.9)};4.wait;a.free;}}

/* by  juan alzate romero */
play{t=TDuty.kr(Dshuf([4,2,1]/8,inf));c=TChoose;Pan2.ar(SinOscFB.ar(c.kr(t,(50,75..300)),De\
cay2.kr(t)+0.8),c.kr(t,[-0.5,0,0.5]).lag)}

/* by  juan alzate romero */
play{r=LFNoise0.kr(1/4,6,6).round.midiratio;Splay.ar({ToggleFF.ar(Impulse.ar(8.rand+1)).lag\
(1/99)*SinOsc.ar((50,100..800).choose*r)}!40)}

/* by  juan alzate romero */
fork{loop{h=[100,800,3000].choose;play{Splay.ar({SinOsc.ar(exprand(60,h),0,0.1)}!40)*LFGaus\
s.ar(19,1/4,0,0,2)};8.wait}}; 

/* by  juan alzate romero */
play{l=LFNoise2;RLPF.ar(Crackle.ar,SpecPcile.kr(FFT(LocalBuf(1024),LFGauss.ar(1/[6,8,20].ch\
oose,0.01)),0.5),l.kr(1/8,0.1,0.15),2).fold2!2}

/* by  juan alzate romero */
play{a=Decay2.ar(Impulse.ar(2),1e-4)*SinOsc.ar(40);Compander.ar(Blip.ar([40,40.5]*TChoose.k\
r(a>0.6,(1..8)),4),a,8e-3,1,0.001,0.1,1/3)+a}

/* by  juan alzate romero */
play{e=ExpRand;Splay.ar(Decay2.ar(Impulse.ar({e.new(1,8).round/3}!9),5e-3,{e.new(0.1,1)}!9,\
LFCub.ar({e.new(66,666).round(20)}!9)))}

/* by  juan alzate romero */
play{o=LFCub.kr(1/[24,8,12,16]);PitchShift.ar(LFPulse.ar(o[0]*50+100,0,0.1).lag(1/1000),o[3\
]*0.299+0.3,o[2]*8+8,o[1],o[0]*0.1+0.1)!2} 

/* by  redfrik */
play{f=LFPar.ar(1/14).round*20+80;Splay.ar(LFPar.ar({|i|[i+1*f,i*f+(i+1/3)]}!4)>BrownNoise.\
ar(Pulse.ar({|i|i+1}!4,0.35))/3)}//#SuperCollider

/* by  redfrik */
play{x=CombN.ar(Phasor.ar(0,{|i|i+1/20}!22),0.042,0.042);y=Phasor.ar(LPF.ar(x,LFPar.ar(1/99\
,0,400,500)),x);Splay.ar(y)*0.25}//#SuperCollider

/* by  redfrik */
play{x=CombC.ar(Phasor.ar(0,{|i|i+1/4}!5),0.2,LFPar.ar(0.09,0,0.05,0.1).round(0.022));Splay\
.ar(Phasor.ar(BPF.ar(x,50),x)/4)}//#SuperCollider

/* by  redfrik */
play{a=LFCub;n=8;Splay.ar(a.ar({|i|pow(i+1,a.kr(1/n,i/n,1/n,1))}!n*150,0,a.kr({|i|pow(i+1,a\
.kr(i+0.5/n,i/n))}!n).max(0))/4)}//#SuperCollider

/* by  redfrik */
play{a=SinOsc;Splay.ar({|i|i=i+1;a.ar(a.ar(i)+1**a.ar(2**a.ar(i/500)*(9-i))*a.ar(9*i).expra\
nge(90,2**a.ar(i/20)*800))}!5)/4}//#SuperCollider

/* by  redfrik */
play{o=SinOsc.ar(1/RunningMax.ar(Sweep.ar(LocalIn.ar(6)),Impulse.ar([1,0.749,6,12,3,4])));L\
ocalOut.ar(o);Splay.ar(o).tanh/2}//#SuperCollider

/* by  redfrik */
play{c=[97,99];l=3**9;a=LocalBuf(l,2).clear;BufWr.ar(Saw.ar(c/5),a,BPF.ar(VarSaw.ar(c),98,0\
.1)*l);PlayBuf.ar(2,a,1/4,1,0,1)}//#SuperCollider

/* by  redfrik */
play{a=SinOsc;Limiter.ar(LeakDC.ar(a.ar(0.11,BRF.ar(a.ar(a.ar(0.12).exprange(1,1e4),2pi),1/\
a.ar(0.13).range(1,[99,100])))))}//#SuperCollider

Nathaniel writes:

Here are some new ones from me:

play{f={|i|DelayC.ar(0-LPF.ar(InFeedback.ar(2+i),2000).tanh,1,1.5**i/4e3)}!6;Splay.ar(f)++(\
Dust.ar(f*2e4+1)*LFSaw.kr(3/4**(0..5)/8)/5+f)}

play{f=({|i|DelayC.ar(0-LPF.ar(1-SinOsc.ar(1/2**i)*0.53*InFeedback.ar(2+i),1e3).sin,1,0.9**\
i/3e2)}!9);Splay.ar(f)++(WhiteNoise.ar/7+f)}

play{x=mean({|i|LPF.ar(DelayC.ar(LeakDC.ar(InFeedback ar:2,1-y=1e-4),2,40**LFNoise2.ar(0.8*\
*i)/20),1e4.rand)}!40);[x,x,x+Trig1.ar(y+x,2*y)]}

play{(x=AllpassN.ar(mean(DelayL.ar(LeakDC.ar(InFeedback ar:2),1,1+((a=LFNoise2.ar(0.1))*[1,\
-1])/50)),1,[1,1.01],20))/5++(x+SinOsc.ar(24,x))}

play{CombN.ar(x=mean(DelayL.ar(LeakDC.ar(InFeedback.ar(2)),1,1+(LFNoise0.kr(1/6)*[1,-1])/50\
)),1,1,[30,-30])/8++(CuspN.ar(800)+x)}

play{CombN.ar(x=LPF.ar(mean(DelayL.ar(LeakDC.ar(InFeedback.ar(2)),1,1+(LFNoise0.kr(4)*[1,-1\
])/180)),1e3),1,1/2,[8,4])*9++(x+Impulse.ar(4))}

play{b=LFTri.kr(1/6);CombN.ar(a=sum({BPF.ar(b<0+(b>0.3)*PinkNoise.ar(1!2)*b,b>0.29+0.2*expr\
and(30,1e4),0.3**2.0.rand/8)}!30),1,1/40,-0.02)}

play{DelayN.ar(x=RLPF.ar(Saw ar:Duty.kr(1/4,0,Dseq(2**([0,3,5,7]+.x flat([5,5,0,3]!8)/12)*9\
0,inf)),0.1**LFSaw.kr(4,1)*200),1,1/8)*1.4+x/2!2}

play{Splay.ar({BPF.ar(LFNoise1.kr(1/16)>0.5*9*Saw.ar(4**6.0.rand)*LFPulse.ar(2**11.rand/32,\
8.rand/8,0.5.rand),2**9.0.rand*20,0.5.rand)}!99)}

play{Splay.ar({BPF.ar(PinkNoise.ar(20)*LFPulse.ar(2**rrand(-2,2),rand(4)/4,0.5.rand)*(LFNoi\
se1.kr(1/4)>0),exprand(50,15000),1.0.rand)}!10)}

Ndef(\,{CombN.ar(BPF.ar(Saw.ar(1/8)*1e5+CuspN.ar*1e-3-Ndef(\).ar/40,200,4),1,LFNoise0.ar(1/\
8).lag+2/[2.02,3]/99++4,9).sum.tanh!2}).play

play{l=LFSaw;SinOsc.ar(15**(l.kr(-4.8,1)*l.kr(-1.8,1))*20).sqrt+(99**l.kr(-0.6,0.5)/99*Cusp\
L.ar)+Blip.ar(0.8,1+LFNoise0.kr(0.2)*3e3,4)!2/4}

play{o=SinOsc;f=Duty.kr(0.8,0,Dseq([5,5,9,8]*9,inf));RLPF.ar(Pulse ar:f,4**o.ar(4**LFNoise1\
.kr(1.2)*4)*4*f)+o.ar(9**LFSaw.kr(-2.5,1)*9)/4!2}

p={|x,a,f|9.do{x=BAllPass.ar(x*a,f).tanh};x};play{i=Dust.ar(2);p.(Dust.ar(EnvGen.ar(Env.per\
c,i,1e3,0,1).poll),1.4,50)/9+p.(i,1.8,40)!2}

play{x = PinkNoise.ar(0.1!2);x+FreqShift.ar(x, [0.15,0.1501])} // homesick for good old Bri\
ghton-by-the-sea #supercollider

f=0;{inf.do{|i|f=f+log2(2*i%6+1+floor(f)/(i%5+1))%2;play{SyncSaw.ar(2**f*99+[0,1],i%8+2*52)\
*Line.kr(0.1,0,1,1,0,2)};0.3.wait}}.r.play

play{GVerb.ar(VarSaw.ar(Duty.ar(1/5,0,Dseq(x=[[4,4.5],[2,3,5,6]];flat(x*.x allTuples(x*.x x\
)*4).clump(2)++0)),0,0.9)*LFPulse.ar(5),99,5)/5}

t={|u,d,a|u.ar(Duty.ar(d/5,0,Dseq(a++0))*300)};play{t.(Saw,1,x=[6,5,9,8];flat(y=allTuples(x\
/.t x)[(127..0)+[0,127]]%1))+t.(LFTri,4,y*2)!2/6} 

play{GVerb.ar(Pulse.ar(Duty.ar(1/8,0,Dseq(x=[5,2,7,3];1/flat(allTuples(x/.t x).reject(any(_\
,{|i|i%1==0}))%1)*.x[1,3,2,6]*40++0))),165,7)/5}

play{GVerb.ar(Saw.ar(Duty.ar(1/8,0,Dseq(x=[5,2,9,3];1/(flat(allTuples(x/.t x).reject(any(_,\
{|i|i%1==0}))/.-1 x)%1)*30,inf))),165,5)/5}

play{GVerb.ar(Saw.ar(Duty.ar(1/8,0,Dseq(x=[5,2,[9,7],3];1/(flat(allTuples(x/.t x).reject(an\
y(_,{|i|i%1==0}))/.-1 x)%1)*30++0))),165,1)/5}

x=Ndef(\x,Pbind(\freq,Pseq(a=(3..5);a*.x a*.x[4,8],8)));Ndef(\,{Limiter ar:GVerb.ar(PitchSh\
ift.ar(Ndef ar:\,1,2,0,0.1),20,20)/4+x.ar}).play

And Fredrik writes:

//--tweet0045
play{a=SinOsc;a.ar(a.ar(a.ar(0.11)),a.ar(a.ar(95*a.ar(0.01,0,1,1),0,a.ar(5e-3,0,50),100),a.\
ar([98,97]),pi+a.ar(5e-4))).tanh}//#SuperCollider

//--tweet0046
play{a=LFTri;GVerb.ar(Mix(Limiter.ar(BRF.ar(a.ar(50,1e-4),a.ar(a.ar([1.01,1.0111])*a.ar(8e3\
)*1e3+4e3,55),a.ar(0.01)*3))))/9}//#SuperCollider

//--tweet0047
play{CombN.ar(Limiter.ar(BRF.ar(LFSaw.ar(10,0,0.01),LFTri.ar([5,6]*0.1))),0.1,LFTri.kr(0.1,\
0,0.05,0.05).round(0.01))}//#SuperCollider#SC2012

//--tweet0048
play{a=Impulse;b=SinOsc;c=b.ar(0,BRF.ar(a.ar([7,8]),a.ar(9).lag2(1e-3),1.5,2pi));Ringz.ar(c\
,b.ar(0.02,0,99,150),1/9)+c*0.02}//#SuperCollider

//--tweet0049
play{Splay.ar(SinOsc.ar(9,SinOsc.ar(midicps((Sweep.ar(0,(33..3))%128&(Sweep.ar(0,(3..9))%(L\
FSaw.ar(3)*9+99)))+33),0,pi)))/3}//#SuperCollider

//--tweet0050
play{a=Saw;b=(2..12);c=0.015;GVerb.ar(Splay.ar(Klank.ar(`[b*50+b,c,c],Hasher.ar(a.ar(b/4pi,\
a.ar(c)*b+b).ceil)))/9,5.rand+1)}//#SuperCollider

//--tweet0051
play{a=Saw;GVerb.ar(Splay.ar(BBandPass.ar(a.ar("sunday".ascii),a.ar(9/"slow".ascii)*400+500\
,a.ar(7/"coding".ascii)+1.1)/5))}//#SuperCollider

//--tweet0052
{Splay.ar(BLowPass.ar(Impulse.ar("sunday".ascii),LFTri.ar(3/"live".ascii)*1800+1900,LFTri.a\
r(4/"coding".ascii)+1.01))}.play// #SuperCollider

//--tweet0053
Pbind(\freq,Pseq("SUPERCOLLIDER".ascii,inf)*Pstutter(64,Pseq([3,4,5],inf))*[1,2.045],\dur,0\
.03,\amp,Pseq([0,0.1],inf)).play// #SuperCollider

//--tweet0054
play{CombN.ar(SyncSaw.ar(Saw.ar([3,4],32,64),Saw.ar([4,3],99,Duty.kr(1,0,flop(Dseq(2!6++4++\
3,99)*(4**(0..4))))))/9,1,1/6,2)}//#SuperCollider

//--tweet0055
play{a=Pulse;CombN.ar(Slope.ar(a.ar(a.ar([1,2]/3,1/9,50,[50,150])),a.ar([3,4],1/3)+a.ar([2,\
3],1/4)/10+0.005).cos/5,1,1/6,2)}//#SuperCollider

//--tweet0056
play{MantissaMask.ar(Pulse.ar(LFPulse.ar(1/8,0,0.55,15,76)+LFSaw.ar([0.1,0.11]),Saw.ar(10))\
,LFPar.ar(1/16,[0,0.5],3,3),0.7)}//#SuperCollider

//--tweet0057
a=GVerb;fork{loop{z=play{#b,c,d,e,f,g,h,i=(1..50).scramble;a.ar(a.ar(a.ar(a.ar(Dust.ar(1),b\
,c),d,e),f,g),h,i)/20};6.wait;z.release(5)}}//#sc

//--tweet0058
play{CombN.ar(SinOsc.ar(Saw.ar(3,64,99),Saw.ar([3,4],Saw.ar(1,32,128),Duty.ar(1,0,flop(Dseq\
([0,8,1,5])*[1,4,8]))))/9,1,1/6)}//#SuperCollider

//--tweet0059
a=LFTri;play{CombN.ar(SinOsc.ar(Saw.ar(3,128,128),Saw.ar([3,4],a.ar(a.kr(0.1,0,8,12),0,32,1\
28)).sin)/4,1,1/6,a.kr(1/32)+1)}// #SuperCollider

//--tweet0060
a=LFSaw;play{FreeVerb.ar(CombN.ar(VarSaw.ar(a.ar([32,48],0,42*a.ar(1/[16,24]),8),0,a.ar([18\
,12],0,1/64,1/64)).sin/2,1,1,2))}//#SuperCollider

//--tweet0061
a=Demand;b=SinOsc;play{b.ar(a.ar(t=Saw.ar([9,9.01]),0,Dseq(0!6++500,inf)),b.ar(a.ar(t,0,Dsh\
uf((0..7)*99,inf)).lag(0.04)))/2}//#SuperCollider

//--tweet0062
play{a=SinOsc;b=(1..9);Splay.ar(a.ar(b*55).clip(a.ar(2/b,0,0.5),a.ar(3/b,0,0.5,1))*a.ar(b*5\
5+(4/b),0,a.ar(1/b,0,6)).tanh)/5}//#SuperCollider

//--tweet0063
format(a="c=SinOsc;play{FreeVerb.ar(c.ar(0,c.ar(Duty.ar(v=1/8,0,Dseq("+($%!96)+",inf)!2))),\
v,1)}",*a.ascii-96*96).interpret// #SuperCollider

//--tweet0064
format(a="play{GVerb.ar(SinOsc.ar(0,SinOsc.ar(Duty.ar(1/8,0,Dseq("+($%!16)+",inf))))/8,20,1\
/8)}",*a.ascii.midicps).interpret//#SuperCollider

//--tweet0065
format(a="play{SinOsc.ar(%/[%,%],LPF.ar(LFSaw.ar(Duty.ar(16/%,0,Dseq("+($%!96)+",inf)),%),%\
,%))}",*a.ascii).postln.interpret//#SuperCollider

//--tweet0066
tr(a="play{VarSaw.ar(Duty.ar(0.1,0,Dseq("+($%!8)+".flat.midicps,inf)!2).lag3(0.03),0,0.3)}"\
,$%,a.ascii%64+36).post.interpret//#SuperCollider

//--tweet0067
("nj_wy_;JDRpg,_p&.,./*.*.,/*0ng'9QglMqa,_p&77)_*Quccn,_p&Q_u,_p&Y/*/,./03[(_'*2..(_'#_',r_\
lf-0{".ascii+2).asAscii.interpret//#SuperCollider

//--tweet0068
play{a=LocalIn.ar(2);LocalOut.ar(a=Hasher.ar(a.round(LFTri.ar(LFTri.ar(1e-4)/4+1e-3,0,LFTri\
.ar(1e-3)).round(2e-4))));a*0.45}//#SuperCollider
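
tweet0068 to tweet0070 share one idea: LocalIn reads back the previous block's output, the feedback is quantised with .round, and Hasher scrambles the result into noise, so the quantisation grid becomes the main audible parameter. tweet0068 unpacked (same graph; the names are mine):

(
{
    var fb = LocalIn.ar(2);                          // previous block's output
    var grid = LFTri.ar(LFTri.ar(1e-4) / 4 + 1e-3,   // very slowly drifting rate
        0, LFTri.ar(1e-3)                            // drifting depth
    ).round(2e-4);
    var sig = Hasher.ar(fb.round(grid));             // hash the quantised feedback
    LocalOut.ar(sig);                                // write it back into the loop
    sig * 0.45;
}.play;
)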

//--tweet0069
play{a=LocalIn.ar(2);LocalOut.ar(a=Hasher.ar(a.round(LFPar.ar(4e-3).round(3e-3)/3+a)));Free\
Verb2.ar(a[0],a[1],0.33,1,1,0.4)}//#SuperCollider

//--tweet0070
play{a=LocalIn.ar(2);LocalOut.ar(a=Hasher.ar(a.round(SinOsc.ar(3.3e-4,a*2pi).round(5e-4))))\
;a/3+CombN.ar(a,1,[1,0.9],3,0.4)}//#SuperCollider

//--tweet0071
play{a=LFTri;b=(2..5);Splay.ar(a.ar(abs(a.ar(b/9/9/9).round(a.ar(9-b*99,9-b/9,a.ar(b/9,b/99\
)))*a.ar(9,0,9-b*99,99*b),b/9)))}//#SuperCollider

//--tweet0072
play{a=Pulse;b=(1..8-1);GVerb.ar(Limiter.ar(Splay.ar(a.ar(abs(a.ar(b,1/8,8-b/8)).round(a.ar\
(b*8,b/8,a.ar(b))))))/8,8,1,0.8)}//#SuperCollider

//--tweet0073
play{a=Pulse;b=(1..8);CombN.ar(Splay.ar(a.ar(a.ar(b,a.ar(b/9),b*9,b*99+99),1/3,a.ar(b/9+a.a\
r(1,2/3,8,10)/9)).tanh),1,2/3,4)}//#SuperCollider

//--tweet0074
play{a=Pulse;BLowPass4.ar(a.ar(a.ar(2,0.2,a.ar(3,0.3)*500,[600,606]*a.ar(5))).sin,LFPar.ar(\
0.07)*4e3+5e3,LFPar.ar(0.1)+1.3)}//#SuperCollider

//--tweet0075
play{a=SinOsc;b=(1..16)*8;a.ar(a.ar(b).sum+[2,3]+a.ar(1/8)*99*a.ar(b/(a.ar(1/6)*2+2.05),0,4\
+a.ar(1/8)).reduce('bitOr'))*0.5}//#SuperCollider

//--tweet0076
play{a=SinOsc;a.ar(a.ar([1,2,4,8]/4*999).sum*50+[2,1]/3,a.ar(60,0,a.ar([1,2]/3)*a.ar(1/8,0,\
a.ar(1/8)*8)).tanh*a.ar(4)*6)/2}// #SuperCollider

//--tweet0077
play{a=SinOsc;b=a.ar(a.ar(1/[5,6])+[798,912],a.ar(1/16)*19+99*a.ar([9,8]),a.ar(a.ar(6)*a.ar\
(0.009)));a.ar([201,301],b).tanh}//#SuperCollider

//--tweet0078
play{a=GrayNoise.ar;b=(1..9);CombL.ar(a,1,b/Duty.ar(3,0,Dseq([0.5,1,2,3]*99,99)).lag3(1)).m\
ean/2+Ringz.ar(a/99,b*99).mean!2}//#SuperCollider

//--tweet0079
play{Saw.ar((99,111..999),LFSaw.ar(1.1/(1..76))).mean.distort.distort.distort.distort.disto\
rt.distort.distort.distort*3.5!2}//#SuperCollider

//--tweet0080
play{a=SinOsc;b=a.ar(1/3);Duty.ar(SampleDur.ir,0,Dseq([0,1],inf)).bitXor(a.ar(b>0*30+60,0,a\
.ar(4,0,a.ar([3,2]/9,b*3,9))))/9}//#SuperCollider

//--tweet0081
fork{inf.do{t=3.0.linrand;play{{XLine.ar(1.0.rand,0.5.rand,t)}!2*SinOsc.ar(XLine.ar(999.ran\
d+99,999.rand,t,1,0,2))};t.wait}}//#SuperCollider
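
tweet0081 works at the language level rather than inside a single synth graph: a Routine repeatedly spawns a short-lived synth and waits for its duration. The same logic, unpacked (the names are mine):

(
fork {
    inf.do {
        var t = 3.0.linrand;                        // random duration, biased short
        play {
            var fades = { XLine.ar(1.0.rand, 0.5.rand, t) } ! 2;             // per-channel fade
            var gliss = XLine.ar(999.rand + 99, 999.rand, t, doneAction: 2); // frees the synth
            fades * SinOsc.ar(gliss);
        };
        t.wait;
    };
};
)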

//--tweet0082
play{a=LFTri.ar(1/9)*0.07+0.0708;CombN.ar(Decay2.ar(Duty.ar(Dseq([1e-4,a/2],inf),0!2,Dseq([\
-1,0,1,0],inf)),a/9,a)/5,1,1,12)}//#SuperCollider

//--tweet0083
play{a=LFCub;Splay.ar({|i|i=i+1;Formant.ar(*Sweep.ar(a.ar(i/[1,2,3])>a.ar(i/9,i/9,1/6,1/3),\
0.05)*99*i+99*i)*a.ar(0.1/i)}!6)}//#SuperCollider

//--tweet0084
play{a=Saw;Splay.ar(Formant.ar(a.ar((5,7..15)*19)*99+199,a.ar((1,3..13)*29)*199+299,a.ar((3\
,5..11)*a.ar(3,2,3))*299+399))/3}//#SuperCollider

//--tweet0085
play({Duty.ar(1/9600,0,Dseq((0..255).collect{|i|[1]++(1-i.asBinaryDigits.reverse)++[0]}.fla\
t,inf),2)!2},s,0,0)// #SuperCollider talks serial

//--tweet0086
play{a=LFNoise2.kr(1/(9..17));Splay.ar(Ringz.ar(BPF.ar(Dust2.ar(a.abs*1e4),a.exprange(99,1e\
4),1.1-a),(9..1)*99,a+1.1,a)/5)}// #SuperCollider

//--tweet0087
play{BLowPass4.ar(Splay.ar(VarSaw.ar(200*Duty.kr(1/(1..5),0,Dseq(flat({|x|{|y|y+1/(x+1)}!8}\
!8),inf)))),5e3,LFTri.kr(9)+1.1)}//#SuperCollider

//--tweet0088
play{a=SinOsc;LPF.ar(LeakDC.ar(a.ar([98,99],a.ar([8,9],a.ar(1/[88,99],0,2pi),pi).lag(a.ar([\
9,8])),a.ar(1/[8,9])*9)%1),9e3)}// #SuperCollider

//--tweet0089
play{GVerb.ar(Splay.ar(Ringz.ar(Blip.ar(a=[4,14,5,15,6,16,8],LFNoise0.ar(4/a)*99,LFNoise1.a\
r(4/a).max(0)),a*99,4/a))/6,200)}//#SuperCollider

//--tweet0090
play{FreeVerb.ar(Splay.ar(BBandPass.ar(Blip.ar(b=(1..8)+1,LFTri.ar(1/b)*9e3,LFTri.ar(3/4/b)\
.max(0)),b*999,1/b),2,3),0.3,1)}// #SuperCollider

//--tweet0091
play{a=LFPulse;Splay.ar(Pulse.ar((1..10)*a.ar(1/24+a.ar(1/3)*12,0,1/9,a.ar(1/12,0,0.5,9,48)\
).abs+6).reduce(\mod).softclip)}// #SuperCollider

//--tweet0092
play{Mix(Pan2.ar(Formlet.ar(Dust.ar(b=(1..8)),b*99,b/99,b/9),SinOsc.ar(b),LFSaw.ar(9.5-b,b/\
9,LFTri.ar(b/5)*4).max(0)).sin)}// #SuperCollider

//--tweet0093
play{x=SinOsc;a=LocalIn.ar(2);z=x.ar([3.1,4.2]+a)-Balance2.ar(a[0],a[1],x.ar(a*x.ar(a)*999)\
);LocalOut.ar(CombN.ar(z/3));z/5}//#SuperCollider

//--tweet0094
play{a=Blip;b=LFSaw;CombN.ar(a.ar(a.ar(b.ar(1/[9,99])*1e3+4e3,b.ar(1/[23,24])*4+5,b.ar(1/[5\
,6])+b.ar(1/[8,9])*9)),0.3,0.3)}// #SuperCollider

//--tweet0095
{|i|a=VarSaw;b=i/8;play{Pan2.ar(a.ar(b*666+a.ar(b+0.03,b),0,b+0.06,a.ar(b+1,0,b+0.1,6+b,7+b\
)).sin.tanh,a.ar(b+1,b),0.2)}}!8// #SuperCollider

By Jonathan Siemasko:

Pspawner{|sp|6.do{|i|sp.par(Pbind(*[degree:Pseq(((0..i)*2),inf),octave:7-i,dur:0.2*(2**i)])\
)};sp.seq}.play
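
Pspawner hands its function a Spawner whose par method embeds patterns in parallel while the routine runs. The one-liner above, spelled out (the same six layers; formatting and comments are mine):

(
Pspawner({ |sp|
    6.do { |i|
        // layer i plays a widening whole-tone fragment, one octave lower
        // and twice as slow as the previous layer
        sp.par(Pbind(
            \degree, Pseq((0..i) * 2, inf),
            \octave, 7 - i,
            \dur, 0.2 * (2 ** i)
        ));
    };
    sp.seq;   // as in the original tweet
}).play;
)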

{h={|f|1-LFTri.ar(f)};l={|s,e|Line.ar(s,e,1200,1,0,2)};FreeVerb.ar(h.(l.(147,5147))*h.(l.(1\
117,17))*h.(100)*h.([55,55.1])*0.05,0.7,1)}.play

{CombC.ar(Klang.ar(`[[100,101,1000,1001]],1,0)*0.1,0.33,LFTri.ar(0.1, 0, 0.1, 0.11)+LFTri.a\
r(0.17, 0, 0.1, 0.22),10)!2}.play;

{CombC.ar(Mix(SinOsc.ar((1..20)*6.12))*SinOsc.ar([SinOsc.ar(15.4,0,20),SinOsc.ar(1.9,0,37)]\
)*SinOsc.ar([500,400]),1,0.01,10)*0.01}.play

{CombC.ar(Mix(SinOsc.ar((1..20)*145.12))*SinOsc.ar([SinOsc.ar(0.14,0,40),SinOsc.ar(0.19,0,3\
7)])*SinOsc.ar([0.023,0.012]),1,0.1,10)*0.09}.play

{(CombC.ar([Mix(SinOsc.ar((40..50)*7.23,(1..10)/10)),Mix(SinOsc.ar((40..50)*6.41,(1..10)/10\
))],10,SinOsc.ar(0.0001,0,10),2)*0.02)}.play

play{a=Mix(Array.fill(75,{|i|SinOsc.ar(rrand(1,50)*i+10,0,LFNoise1.kr([1.8,2.3]))}))*0.02;C\
ombL.ar(a,15,SinOsc.ar([0.1,0.11],0,0.5,0.6),10)}

{CombC.ar(GrainFM.ar(2,Impulse.kr(LFTri.kr(0.08,0,10,10)),0.1,LFTri.kr(0.04,0,40,400),500,L\
FNoise1.kr.range(1,10),0,-1),0.3,3,5)}.play;

fork{y=[Blip,Saw,Pulse,SinOsc];loop{x=play{z=_.ar(rand(2000));z.(y.choose)*z.(y.choose)!2*0\
.4};(rand(0.2)+0.05).wait;x.free;}}

fork{y=[Blip,Saw,Pulse,SinOsc];loop{x=play{z=_.ar(rand(999));z.(y.choose)*z.(y.choose)*z.(y\
.choose)!2};(rand(0.1)+0.05).wait;x.free;}}

play{x=FreeVerb;y=SinOsc;DFM1.ar(x.ar(x.ar(GVerb.ar(Dust.ar(9),10,3,0.5),1,1)*y.ar(60),1,1)\
*y.ar(150),y.kr([0.1,0.17]).range(900,9999))}

play{x={|a,b,c|Pulse.kr(a).range(b,c)};(Blip.ar(x.(0.87,4,80)*x.(1.7,1,10)*x.(1.13,1,10))*S\
inOsc.ar(LFTri.ar(0.006,0,9999)))*SinOsc.ar(50)}