live visuals, Tom Luyten and OSC


Tom Luyten

Watching a whole show by a guy playing the double bass is likely to be a little bit boring, visually speaking at least. Right from the beginning I had the idea to combine and control visual elements with my setup.

The programming of the setup itself took so much time that I realised at an early stage that I wouldn't be able to do this on my own. Luckily, I met Tom Luyten, a visual artist and teacher at ZUYD University of Applied Sciences.
I told him about my ideas and he agreed to work with me for my graduation concert.

I am beyond happy that he will be part of the show.

Concept

I imagined visuals generated by my audio signals, which I have as separate files on my computer.
Tom works in the field of computer-generated art. But to generate art, you need data. So I thought about how I could provide him with data from my setup that he could work with.

data sources

These are possible kinds of data an audio signal can provide:

  • Note pitch
  • Velocity
  • Frequency spectrum

OSC

I talked about OSC earlier; it is stable, easy to use, and I have it integrated in my setup.

Transforming audio into data

I found free plugins from Showsync which analyse audio and transform it into OSC, for example an EQ with which you can analyse the frequency spectrum and generate OSC data from it.

vimeo video

This data can be sent over Wi-Fi to Tom, and he feeds it into his system to generate visuals. We recently ran a test to see if it works, and it did right away!

Yippee.
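To illustrate the principle, here is a minimal sketch of the idea in Python. It is not the Showsync plugins themselves; it assumes the python-osc package, made-up OSC addresses and a made-up network address for Tom's machine:

# Minimal sketch: sending analysed audio features as OSC messages over Wi-Fi.
# Requires the python-osc package; IP, port and OSC addresses are examples only.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.50", 9000)  # hypothetical IP/port of Tom's machine

def send_features(pitch, velocity, spectrum):
    """ send one frame of analysed audio data as three OSC messages """
    client.send_message("/bass/pitch", pitch)        # note pitch, e.g. MIDI note number
    client.send_message("/bass/velocity", velocity)  # note velocity 0-127
    client.send_message("/bass/spectrum", spectrum)  # list of band magnitudes

# example frame: note D1, medium velocity, 8-band spectrum
send_features(38, 90, [0.8, 0.6, 0.4, 0.3, 0.2, 0.1, 0.05, 0.02])

In my setup the analysis itself happens in the Showsync devices inside Ableton; the sketch only shows what the resulting OSC stream looks like conceptually.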

go to the Master Research Overview (https://schlapbe.de/master-thesis-overview/)


Pedalosophy

PEDAL-O-SOPHY

Why so many pedals?

"Playing is the foundation of learning, creativity, self-expression, and constructive problem-solving. It’s how children wrestle with life to make it meaningful. ”
Susan Linn - Contemporary American psychiatrist

When I play the bass, I'm often focused on using the right technique, intonation and timing. This comes from my classical training where the focus lies on the mistakes, with the goal to improve, of course. But it's this conditioning which sometimes keeps me from getting into a creative and playful flow.

When I use pedals and move my focus (partly) towards them, I'm a child again, playing with my favourite toys and forgetting about the world. It is this mindset which is great for creativity.
I like to think of effect pedals as an extension of my instrument and a toolkit for musical expression. They do not replace any of the traditional ways of expression.

Pedals inspire me. The feel of twisting knobs is something which triggers that playful instinct. It is a direct link between cause and effect.

But there is also a downside. Pedals and technology in general can lead to distraction and, even worse, to procrastination. There is even a term for this:
G.A.S.
It stands for Gear Acquisition Syndrome. It means you buy a pedal or piece of gear only for the satisfaction of having it. It happened to me in the past, too. It takes discipline and an attentive mindset to find a balance.

It is an omni-directional process:

pedals helping to be creative

For this master research I was looking for a particular sort of pedal: sample/loop manipulation, granular and modulated delay/reverb, and pitch shifting.

These are the three categories I focused on.

Reverb creates a virtual room by imitating a real one. By changing the reverb, I can change the location. Imagine a sad melody played in a closet or in a big church: it makes a big difference. Not only for the listener; above all, it influences the way I play.

Delay repeats something I played. It makes me play fewer notes. Another name for delay is echo. When I hear an echo of what I've played, it feels as if somebody else is playing with me. The great thing about old tape or drum echoes is that each repeat sounds different. Imagine a mirror image which imitates your moves, but later and every time a little differently.
Delay is my favourite effect.

I use pitch shifting to play more than one note at the same time. Like this I can play chords and extend the range of the instrument dramatically.

As my main research question is about loops and their modulation or variation, I use 6 different pedals with different approaches and workflows.

go to the Master Research Overview


Basic concept for my setup

taking control

When I started developing my setup with ClyphX PRO, the basic concept was to have a MIDI controller on which I could choose how many bars I wanted to record and where. I could easily capture ideas and re-arrange them on the spot, without looking at a computer monitor or touching a mouse/trackpad.

I took a Launchpad X from Novation and built a basic structure.

  • Bars: I can select the length in bars I want to record. (Short press for a small number, long press for a higher number.)
  • Slot: I can select the slot (scene) the loop gets recorded into. (Short press for a small number, long press for a higher number.)
  • Numbers 1-8: These pads each represent an audio track for audio loop recordings.
  • Loop 1-4: I have assigned two functions:
    • With a short press, I select how many loops I want to record after each other (1-4 loops).
    • By holding one down and pressing one of the pads 1-8, I select which audio track the loop gets recorded into (e.g. Loop 1 into loop track 3, Loop 2 into track 4, etc.).
  • Kick, SN1, SN2 and T4 represent 4 drum tracks, which I can trigger with the piezo pickups attached to my bass.
  • With Mute, Play, Stop NQ and Stop Q, I can control all 8 loop tracks as well as the drum tracks. NQ stands for Non-Quantised: it stops a loop immediately, whereas Q stops a loop on the "one" of the next bar.
  • With the pads (none, quarter, eighth, triplet and sixteenth notes) I select the quantisation grid applied after recording. (This currently works only for beats.)

There are some more buttons I will explain at a later point.
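To give an impression of how a pad press turns into looper behaviour, here is a simplified sketch written in the style of my ClyphX Pro user actions. It is not my actual recording code (that works with named clips and variables, see the source code post); the track name "LOOP3" and the use of the RECFIX action are assumptions for illustration only.

# Simplified sketch: a user action that arms one loop track and records a fixed
# number of bars on it. The track name and the RECFIX action are illustrative.
def record_loop(self, action_def, args):
    """ args: '<loop track number> <bars>', e.g. '3 4' records 4 bars into loop track 3 """
    vars = args.split()
    # arm the chosen loop track, then trigger a fixed-length recording
    action_list = '"LOOP%s"/ARM ON; RECFIX %s' % (vars[0], vars[1])
    self.canonical_parent.clyphx_pro_component.trigger_action_list(action_list)

Registered with add_global_action inside create_actions, such an action could then be mapped to a pad on the Launchpad.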

Effect pedals

The second key ingredient of my setup is guitar effect pedals. Why? I explain this in a post here: PEDALOSOPHY

Each pedal is connected through a separate input and output of my audio interface.
In keeping with the concept of a low-threshold, accessible setup, I had to address these pedals on my control surface as well. On my main Launchpad X I have 11 pads representing those pedals. I can send each of my loop or beat tracks to any pedal from within Ableton. I have since moved this feature to another controller, though, and will reuse these pads for other purposes in the future, such as arming/unarming tracks and mixer control. I gathered my ideas of what I want to program into my setup in the future here: Future plans

However, after the first experiments the idea of a mixer matrix emerged, in which I can send everything everywhere. There is a dedicated post about The Matrix.

For each pedal that is capable of MIDI, I have a dedicated MIDI track routed to it. This allows me to send Program Changes (preset selection), modulate parameters with LFOs, or turn knobs (on the desk) remotely with a (foot) expression pedal on the floor. All of this can be assigned on the spot or scripted in advance.
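As a small illustration of what these MIDI tracks make possible, here are two action lists written the way they appear in my user actions code. The track and device names come from my setup; the bank/parameter numbers and the value 64 are arbitrary examples.

# turn a knob on the CXM 1978 remotely by setting a parameter of the CC device
# sitting on its MIDI track (bank 1, parameter 7, value 0-127; values are examples)
self.canonical_parent.clyphx_pro_component.trigger_action_list('"M-CXM" / DEV("CC-CXM-1") B1 P7 64')

# bind one of the FCB expression pedals to that same parameter, so it can be played live
self.canonical_parent.clyphx_pro_component.trigger_action_list('BIND fcb-exp1 "M-CXM" / DEV("CC-CXM-1") p7 b1')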

Setup complexity and Improvisation

It became obvious at an early stage of building this setup that it would be highly complex. Nevertheless, it was important to me that operating it remained intuitive. I think and feel about it more as an instrument than as a tool. As a trained musician I know that learning an instrument takes time, but establishing a "muscle memory" is a key competence for being able to improvise. The tactile aspect, the colouring, labelling and visual feedback of a Launchpad helps a great deal.

an idea becoming reality

How did I do all of this and how does it work?
I won't explain every aspect and detail (because of the sheer amount), but I do explain my use of ClyphX Pro and give some examples in the next posts:

go to the Master Research Overview


an interview with KNOBS (a.k.a. Scott Harper)

This content is password protected.



the holy quest of the perfect looper

First of all, there is no such thing as THE PERFECT LOOPER. But there is a perfect looper for ME and a perfect looper for YOU. And they are most likely very different. But first, I want to tell you about my search for the perfect looper and something about the history of looping.

(my) history of looping

I started live looping back in 2001 at a local steady gig where the band played without a drummer. Drum loops came from Ableton Live, triggered by our keyboard player. I looped my bass line and played guitar parts on top. It was super hard to get the loop to line up with the computer; there was no such thing as MIDI sync on loopers at that time. But it was great practice to stomp on the footswitch at exactly the right moment. I believe I used a Headrush E1 from AKAI back then.
I had some other loopers like the BOSS RC-2 and the bigger version, the BOSS RC-20XL. But I was looking for a looper which could sync to a MIDI clock or be synced in any other way.
In 2002 the company "Electrix Pro" released a unit called "Repeater".

ELECTRIX REPEATER

It had everything I was looking for: MIDI, four independent tracks with faders, and loops stored on a Compact Flash card. The best feature, though, was an effect send/return. You could record a loop, send it through external effects afterwards, manipulate it and even re-record it. It was also possible to connect a standard two-way footswitch for recording and stopping, and to place the unit upright next to you.

In 2004 for my diploma graduation concert I did a solo performance with it which was filmed:

Solo performance from 2004, "Casa, dolce casa"

Actually, the story could end here: this unit was versatile, flexible and had almost every feature I could ask for (full feature list). It was acclaimed as the "best looper unit of all time".
Here comes the sad part: unfortunately, it broke after a short time. The company was out of business and the product discontinued. I had got the unit second hand, and it had some issues with the power supply. After some research I sent it to "Condor Electronics" in Oregon, USA. They fixed it and installed some improvements. When I got it back from customs, I plugged it in with great excitement. It turned on, but after 30 seconds it made a puff noise and there was a funny smell of burned capacitors. Having spent a ridiculous amount of money and being left with electronic scrap in my hands, I'd had enough of looping. I took everything and threw it in the bin. That was in 2005.
When I started my master study in Maastricht, I rediscovered my interest in technology and looping. I wanted to explore the possibilities of the 2020s in regard to live looping. Developing a one-man-show setup also seemed logical to me because of the covid pandemic.

What attributes does my perfect looper have to have?

I started developing an idea into a concept. I had very limited resources to buy new gear, so I worked with the gear I already had and saw what I could achieve. It was basically some pedals, an audio interface and a computer.

Of course, looper units have become much more sophisticated over the last 15 years. But they still all lack an important aspect for me: flexible audio routing for multitrack output and, more importantly, various send/return loops for external effects.

And they are very expensive.

I have been working with ABLETON LIVE since 2003. There are many reasons why ABLETON could be the core/heart/brain of a perfect looping machine:

  • It is designed to work with loops in a totally different way compared to other DAWs.
  • It is extremely stable and designed for live performance.
  • It has the ability to connect to remote hardware, e.g. launchpads to control all sorts of functionality
  • It can handle both MIDI and audio signals
  • It is capable of complex audio and midi routings.
  • It has built-in effects (optimised for live performance)
  • With Max for Live integrated, you have an incredibly powerful platform for custom effects and tools of all kinds.

In a perfect looping machine I want possibilities like:
- different loop lengths
- pitch shifting of individual loops
- independent effects for each loop
- quantisation of recorded material

BINKLOOPER and scripted looping

My dear teacher, Matthias Nowak, recommended that I check out Frank Wienk, aka BINKBEATS, an artist who does complex loop performances.

Fortunately, he gives some insight into how he uses ABLETON to record his loops and do all the effects.
He uses a MAX for LIVE device called BINK LOOPER with which he can "script" the recording process. Here is a link to a video where he explains everything: Binkbeats: Live Setup
I used this device (which he gives away for free here) to build a loop performance of a tune called "Mo'Better Blues":

MoBetterBlues Performance

I reflect on the process of using BINK LOOPER here: how to program it, its specialities and its frustrations. Everything needs to be planned in advance, and it is very fussy to program.
I switched to ClyphX Pro as soon as I found out about that software.

ClyphX Pro

Improvisation is a very important aspect to my musical identity. Being flexible, spontaneous and fast with my looping setup was a key ingredient. I need the operation to be low-threshold and intuitive.

Funnily enough, it was this BINK LOOPER tutorial video by Connor Shafran where I found out about ClyphX Pro: he mentioned it in the comments. This software is so powerful and gives me everything I need to bend ABLETON LIVE to my needs. It became the centrepiece of my setup, and it still has a steep learning curve. Because it is very complex and extensive, it has its own chapter in my Master Thesis:

ALL POSTS regarding Clyphx Pro

Summary

The quest is fulfilled. I found the ultimate, perfect looping machine. For me. At least I see the path I have to follow further. The possibilities it already offers me are not only more than enough, there are actually too many, and I need limitations in order not to be overwhelmed.

go to the Master Research Overview


CONCLUSION

Reflection and Discussion

I feel that by now I have reached a milestone. I could not have imagined that I would start programming in a proper language like Python. I recall a conversation I had with Laurent Peckels at the beginning of our study in Maastricht when we talked about my project. He said: "It could be very useful if you did some programming in Python…."
I replied: "I will never, never ever start to program in Python for my setup. Because if I do, I'll open Pandora's box and I won't have enough time to use my setup and make great music with it."

Well, I did open Pandora's box wide and spent countless hours getting my ideas formed into working code. Before I started making music, I had already programmed some software when I was 11 years old. I remember it as a rewarding and deeply satisfying process. At the same time, I also spent a whole week in front of a monitor; if my mother had not provided regular meals, I probably would have starved.
When I encounter a technical problem, the curiosity to find a solution is very powerful in me. This has been a driving force for me ever since. It has led to many great solutions, and my current setup works in a way I would never have imagined. The simultaneous possibilities and the stability of my setup are most satisfying.

That being said, I have not spent enough time making music compared with how much time I spent developing the setup. I would estimate that for every one part of making music, I invested seven parts in programming.
That is why it feels like I am more at the beginning of my project, rather than at a point of drawing a final conclusion.

The good thing in this: the fun part lies ahead of me. When playing and experimenting with my setup, it is easy to get lost for hours just jamming and making soundscapes. At some point, I had to force myself to forget about all future plans for improving and re-programming my setup, and instead work with what I have got. That strategy worked greatly in my favour.

choices and limits

I learned about the beauty and effectiveness of setting limits. In many conversations I had with Matthias Nowack, Scott Harper and Markus Birkle, it was a recurring motif that I should limit my possibilities to give my creativity appropriate room to unfold.
It lies in my character to first look at all possibilities and collect them. Setting limits is an acquired competence; it does not come naturally or instinctively to me. In regard to my development as an artist, this technique is the most helpful and effective one I have learned over the course of the last two years.
It helps me focus on the music. I read somewhere that our brain is not capable of processing more than three parameters simultaneously. It definitely works best when focusing on only one thing.

side effects

The world of pedals is a small one when you put the few big brands aside. There is a vast number of small companies that build the quaintest pedals. I was able to make connections with some companies and the people behind them. It feels like a familial community. For example, I became a beta tester for 3degrees audio. I also did an interview with Scott Harper, the brain behind the YouTube channel KNOBS and co-author of the book "PEDAL CRUSH".

I had Zoom calls with other members of the ClyphX Pro community to exchange solutions to problems. I also became an alpha tester for the new version, which works with ABLETON LIVE 11.

Reflection

I am very satisfied with how far I've come and I am thrilled to see where it will go. I have had a few occasions to test my setup in a setting where I can improvise freely with it. Due to the open concept and flexibility of ABLETON LIVE and ClyphX Pro, it is easy and fast to integrate and realise new ideas. ABLETON LIVE is designed to connect to other software, so integrating my setup into an existing band that also uses ABLETON LIVE is relatively seamless.

go to the Master Research Overview


swelling into the unknown - morning jam

freeze with reverb and delay

Recently, I had a session with Markus Birkle in which he gave me the great advice to use one of the expression pedals to control the volume of my mic going into a reverb.

Instead of turning the reverb on and off, this is a volume swell BEFORE the effect. It enables me to "freeze" layers or tones on the fly while playing.
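Here is a minimal sketch of how this can be wired up, using the same BIND mechanism as my exp_bind user action. The track name "MIC" and the gain device "SwellGain" sitting in front of the reverb are hypothetical placeholders:

# bind an expression pedal to a (hypothetical) gain device placed BEFORE the reverb,
# so the swell happens before the effect and layers can be "frozen" on the fly
self.canonical_parent.clyphx_pro_component.trigger_action_list('BIND moogexp "MIC"/DEV("SwellGain") p1 b1')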

different signal flows

I played around with this today and here is the result:

Overall I think this works really well, especially in an improvised context. I recorded two basic loops on D and C and switch back and forth between them to shake things up. Later I recorded an ostinato on the flageolets D and A and sent it into a delay. I also recorded a beat with my triggers, which did not work so well: they are attached in a new fashion, and their instability and wrong thresholds meet my inability to play a beat tightly with my fingers.

There is also a quadravox algorithm going on in the H9:

the quadravox algorithm in the eventide H9

user action source code

# Import UserActionsBase to extend it.
from ClyphX_Pro.clyphx_pro.UserActionsBase import UserActionsBase
## from variables.py import *
source_vars=[
["position","sourcename","SENDLetter","MIDI CH","MIDI CC","source-led--note"],
[1,"input","-",1,102],
[2,"Pedalboard","K",1,104,12],
[3,"Distortion","I",1,105,24],
[4,"microcosm","E",1,106],
[5,"blooper","C",1,108],
[6,"mtlasm.856","B",1,109],
[7,"MOOD","J",2,105],
[8,"tensor","A",1,105],
[9,"ottobit jr.","D",2,108],
[10,"H9 exevntide(DESK)","G",2,110],
[11,"CXM 1978","F",2,112],
[12,"Volante","I",2,102],
[13,"MIC1 - Input 9","-",1,110],
[14,"MIC2 - Input 10","-",1,111],
[15,"MIC3 - Input 11","-",1,112],
[16,"MIC3 - Input 12","-",1,113],
]
submix_vars=[
["Position","NAME","mono(1)/stereo(2)","MIDI CH","MIDI CC","Ableton IN","RME IN"],
[1,"re-record",2,13,104],
[2,"Pedalboard",1,13,106],
[3,"Filter",1,14,108],
[4,"microcosm",2,13,108],
[5,"blooper",1,13,110],
[6,"856mtlasm",1,13,111],
[7,"MOOD",1,14,107],
[8,"tensor",1,13,107],
[9,"ottobit",2,14,110],
[10,"H9 Desk",2,14,112],
[11,"CXM 1978",2,14,114],
[12,"VOLANTE",2,14,104],
[13,"PHONES",2,13,112],
[14,"MAIN OUT",2,13,116],
[15,"PHONES2",2,13,114],
[16,"ATEM",2,14,102],
]
## transition Hotfix for old mixer matrix
mic_matrix=[0]*4

SOURCES=('INPUT','Pedalboard','Dist','Microcosm','blooper','856','MOOD','Tensor','ottobit','H9','CXM','Volante')
TARGETS=('Rec','Pedalboard','Dist','Microcosm','blooper','856','MOOD','Tensor','ottobit','H9','CXM','Volante')
SOURCELIGHTS=('11','9','0','10','19','16','18','17','8','2','1','3','')
VOLLIGHTS=('G0','D#0','E0','F0','F#0')
INPUTS=('BASS MIC (1)','INP2 (Pickup)', 'INP3', 'Mic84 (4)')
INPLIGHTS=('G#0','A0', 'A#0', 'B0')
EXPSOURCES=("fcb-exp1", "fcb-exp2", "fcb-exp3", "fcb-exp4", "moogexp")
EXPTARGETS=('\"M-CXM\" / DEV(\"CC-CXM-1\") p7 b1','\"M-Trem\" / DEV(\"CC-Trem-1\") p1 b1','\"M-Phaser\" / DEV(\"CC-Phaser\") p2 b1','\"M-Strymon\" / DEV(\"CC-Strymon-1\") p8 b1','\"M-Moog\" / DEV(\"CC-MOOG-MP201\") b1 p1','\"M-Whammy\" / DEV(\"CC-Whammy\") p1 b1','\"M-H9\" / DEV(\"CC-H9-p2\") p8 b1;','\"M-microcosm\" / DEV(\"CC-microcosm-1\") p3 b1','\"M-Tensor\" / DEV(\"CC-Tensor-1\") p8 b1','856','\"M-MOOD\" / DEV(\"CC-MOOD-1\") p8 b1','\"M-blooper\" / DEV(\"CC-blooper-2\") p8 b1','\"M-otto\" / DEV(\"CC-otto-1\") p8 b1', '\"M-Trem\" / DEV(\"CC-Trem-1\") p2 b1', 'L/DEV(\"Delay\") p8 b1', '\"BASStoFX\"/DEV(\"EXPtoFX\") p8 b1', '5', '6', '7','8')
EXPSTRINGS=('CXM Exp','Tremotron depth','Phaser exp','Volante exp','Filter lowpass','Whammy Pedal','H9 expression','Microcosm exp','Tensor expression','mtl.asm 856','Mood expression','Blooper exp','ottobit exp','Tremotron rate','int. DELAY Feedback','BASSFX SENDS','5','6','7','8')
EXPLIGHTS=('F1','F#1', 'G1', '')
LPX_LOOPLIGHTS = ('C-1','C#-1','D-1','D#-1','G#-1','A-1','A#-1','B-1')
LPX_BAR_SLOT_LED = ('C-2','C#-2','D-2','D#-2','G#-2','A-2','A#-2','B-2')
LPX_BEAT_LED = ('C1','C#1','D1','D#1')
LPX_LEDCLIP = "\"LPX\" / CLIP(\"LPXSTATE1\") ";
loopstatus_matrix = [0] * 16
routes=['']*15
matrix = [0] * 192
submixtotal = 12
mixstateclip = "\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) "
fxsends = (7,5,4,8,3,10,9,11,2,6,1)
sendletters = ("A","B","C","D","E","F","G","H","I","J","K","L","M")
# Array to save Sends of a track
sends = [0] * 12
otto_cc_notes = (1,13,17,21,25,29,33,37,41,46,50,54,58,74,78,83,87,91,95,99,103,107,111,115,119,127,0)
otto_cc_names = ('-12','-11','-10','-9','-8','-7','-6','-5','-4','-3','-2','-1','0','1','2','3','4','5','6','7','8','9','10','11','12','MUTE','OFF')
otto_cc_matrix = [12] * 6

# Your class must extend UserActionsBase.
class ExampleActions(UserActionsBase):

# Your class must implement this method.
def create_actions(self):
self.add_global_action('namerecclip', self.name_recclip)
self.add_global_action('qntclip', self.name_quant_beat_clip)
self.add_global_action('setvar', self.set_vars)
self.add_global_action('oscmatrix', self.osc_matrix)
self.add_global_action('matrixstate', self.matrix_state)
self.add_clip_action('matrixsave', self.matrix_save)
self.add_clip_action('matrixload', self.matrix_load)
self.add_global_action('matrixrecall', self.matrix_recall)
self.add_global_action('submixrecall', self.submix_recall)
self.add_global_action('selectphones', self.select_phones_submix)
self.add_global_action('expbind', self.exp_bind)
self.add_global_action('volled', self.vol_led)
self.add_global_action('inpled', self.inp_led)
self.add_global_action('moodstate', self.moodstate)
self.add_global_action('loopstatusled', self.loopstatusled)
self.add_global_action('barslotled', self.barslotled)
self.add_global_action('muteled', self.muteled)
self.add_global_action('muteledoff', self.muteledoff)
self.add_global_action('montorec', self.mon_to_rec)

self.add_global_action('micmatrix', self.mic_matrix)
self.add_global_action('selplayingscene', self.selplayingscene)

self.add_global_action('ottostep_inc', self.ottostep_inc)
self.add_global_action('ottostep_dec', self.ottostep_dec)
self.add_global_action('ottostep_matrix', self.ottostep_matrix)
self.add_global_action('ottostep_mute', self.ottostep_mute)
self.add_global_action('ottostep_reset', self.ottostep_reset)
self.add_global_action('ottostep_matrix_send', self.ottostep_matrix_send)

self.add_track_action('montosends', self.mon_to_sends)
self.add_track_action('copysends', self.copy_sends)
self.add_track_action('clearsends', self.clear_sends)
self.add_track_action('loopstatus', self.loopstatus)

def name_recclip(self, action_def, args):
""" determines the name of the desired recordingclip and triggers it"""
#login: [0] loops [1] looplength beat or bars
self.canonical_parent.show_message('loopnaming')
vars = args.split();
quant_clip_name = '\"QUANT_BEAT_%sbar\"' % vars[1]
self.canonical_parent.clyphx_pro_component.trigger_action_list('%%quantbeatclip%%=%s;' % quant_clip_name)

if vars[1].find("beat")>0:
rec_clip_name = '\"rec-%sloops-%s\"' % (vars[0], vars[1])
ua_clip_name = '\"unarm-%sloops-%s\"' % (vars[0], vars[1])
else:
rec_clip_name = '\"rec-%sloops-%sbar\"' % (vars[0], vars[1])
rec_clip_name_l2 = '\"rec-loopx-%sbar\"' % vars[1]
ua_clip_name = '\"unarm-%sloops-%sbar\"' % (vars[0], vars[1])
ua_clip_name_l2 = '\"unarm-loopx-%sbar\"' % vars[1]

self.canonical_parent.clyphx_pro_component.trigger_action_list('%%recclip%%=%s; %%uaclip%%=%s; %%recclip2%%=%s; %%uaclip2%%=%s;' % (rec_clip_name, ua_clip_name, rec_clip_name_l2,ua_clip_name_l2))
self.canonical_parent.log_message('action def: %s' % action_def)
self.canonical_parent.log_message('args: %s' % args)
## self.canonical_parent.show_message('first arg: %s , 2nd arg: %s, loopname: %s' % (vars[0], vars [1],rec_clip_name_l2))

def name_quant_beat_clip(self, action_def, args):
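""" builds the name of the quantise-beat clip for the given bar length and stores it in the %quantbeatclip% variable """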
quant_clip_name = '\"QUANT_BEAT_%sbar\"' % args
self.canonical_parent.clyphx_pro_component.trigger_action_list('%%quantbeatclip%%=%s;' % quant_clip_name)
self.canonical_parent.show_message('beatclipname: %s' % quant_clip_name)

def set_vars(self,__,args):
    vars=args.split()
    i=0
    while i<len(vars):
        self.canonical_parent.clyphx_pro_component.trigger_action_list('%%%s%%=%s;' % (vars[i],vars[i+1]))
        ## self.canonical_parent.show_message('first arg: %s , 2nd arg: %s' % (vars[0], vars [1]))
        print(vars[i],vars[i+1])
        i=i+2

def osc_matrix(self,__,args):
    vars=args.split()
    matrixrow=int(vars[0])//12+1
    mb=int(vars[0])
    ms=int(vars[1])
    vol=int(vars[2])
    cell=mb+ms-1
    matrix[cell]=int(vars[2])
    routes[matrixrow]=''
    for x in range(12):
        if (int(matrix[x+mb])>0):
            routes[matrixrow]=routes[matrixrow]+SOURCES[x]+" "+str(matrix[x+mb])+", "
    message="osc int /matrix/%s/%s %s " % (matrixrow, vars[1], vars[2])
    message=str(routes[matrixrow])
    message2="osc str /routes%s '%s' " % (matrixrow, routes[matrixrow])

    ## self.canonical_parent.show_message(message2)
    self.canonical_parent.clyphx_pro_component.trigger_action_list(message)
    self.canonical_parent.clyphx_pro_component.trigger_action_list(message2)

def mic_matrix(self,__,args):
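""" remembers the phones-submix volume of a mic (args: mic number, volume, status) when the status byte is 144 """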
vars=args.split()
if (int(vars[2])==144):
mic_matrix[int(vars[0])-1]=int(vars[1])
## log_message="mic %s set to %s vol" % (vars[0],vars[1])
## self.canonical_parent.show_message(log_message)
## self.canonical_parent.log_message(log_message)
## else:
## self.canonical_parent.show_message(" mic not on phones submix vasr 2 is"+vars[2])

def matrix_state(self,__,args):
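""" updates the 12 source LEDs on the LPMINI3 for the matrix row starting at the given offset """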
vars=int(args);
x = 0
for x in range(12):
cell=(x+vars)
if int(matrix[cell])>0:
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) OFF" % (SOURCELIGHTS[x])
##self.canonical_parent.show_message("LIGHT ON: "+cmd)
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
else:
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (SOURCELIGHTS[x])
##self.canonical_parent.show_message("LIGHT OFF : "+cmd)
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

def matrix_save(self,action_def,args):
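""" saves the current mixer matrix by writing it into the name of the given clip """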
clip = action_def["clip"]
self.canonical_parent.show_message("matrixsaved")
if clip:
if action_def["xtrigger"] != clip:
clip.name = str(matrix)
else:
self.canonical_parent.log_message("Error: Tried to rename the Xtrigger clip itself")

def submix_recall(self,action_def,args):
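""" selects one RME submix and re-sends the stored matrix volumes of its 12 sources via MIDI CC """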
self.canonical_parent.log_message("SUBMIX RECALL")
global matrix
if int(args) in range(1,17):
self.canonical_parent.log_message("only SUBMIX %s recall" % (str(args)))
submix_recall=int(args)
submix_ch=str(submix_vars[submix_recall][3])
submix_cc=str(submix_vars[submix_recall][4])
submix_cmd="MIDI CC %s %s 127" % (submix_ch, submix_cc)
self.canonical_parent.clyphx_pro_component.trigger_action_list(submix_cmd)
#self.canonical_parent.log_message(submix_cmd)
for x in range(1,submixtotal+1):
input_ch=str(source_vars[x][3])
input_cc=str(source_vars[x][4])
matrix_pos=(submix_recall-1)*12+x-1
matrix_vol_value=str(matrix[matrix_pos])
midi_cmd="MIDI CC %s %s %s" % (input_ch,input_cc,matrix_vol_value)
self.canonical_parent.clyphx_pro_component.trigger_action_list(midi_cmd)
#self.canonical_parent.log_message(midi_cmd)

def select_phones_submix(self,action_def,args):
self.canonical_parent.log_message("select Phones Submix")
submix_ch=str(submix_vars[13][3])
submix_cc=str(submix_vars[13][4])
submix_cmd="MIDI CC %s %s 127" % (submix_ch, submix_cc)
self.canonical_parent.clyphx_pro_component.trigger_action_list(submix_cmd)
return

def matrix_recall(self,action_def,args):
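""" recalls all 16 submixes from the matrix and selects the phones submix afterwards """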
for x in range(1,17):
self.submix_recall(action_def,x)
self.canonical_parent.log_message("Matrix recall function %s" % x)
self.select_phones_submix(action_def, args)

def matrix_load(self,action_def,args):
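""" loads a previously saved mixer matrix back from a clip name into the global matrix list """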
global matrix
clip = action_def["clip"]
if clip:
if action_def["xtrigger"] != clip:
matrixstring = eval(clip.name)
matrix=list(map(int, matrixstring))
#self.canonical_parent.log_message("matrix loaded: %s" % matrix)
else:
self.canonical_parent.log_message("Error: Tried to rename the Xtrigger clip itself")

def exp_bind(self,__,args):
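""" binds an expression pedal to a target parameter and stores a readable target name in a ClyphX variable """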
vars=args.split()
cmd="BIND %s %s" % (EXPSOURCES[int(vars[0])], EXPTARGETS[int(vars[1])])
cmd2='%%%s_target%%=%s;' % (EXPSOURCES[int(vars[0])],EXPSTRINGS[int(vars[1])])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.show_message("EXP COMMAND "+cmd)
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd2)

def vol_led(self,__,args):
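""" updates the volume step LEDs on the LPMINI3 according to the selected step (0 = off) """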
volled=int(args)
length=len(VOLLIGHTS)
if volled == 0 :
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) OFF" % (VOLLIGHTS[0])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) OFF" % (VOLLIGHTS[1])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
for x in range(2,length):
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (VOLLIGHTS[x])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
else:
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (VOLLIGHTS[0])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (VOLLIGHTS[1])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
for x in range(1,length-1):
if x == volled :
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) OFF" % (VOLLIGHTS[x+1])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
else:
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (VOLLIGHTS[x+1])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

return

def inp_led(self,__,args):
inpled=int(args)
for x in range(0,4):
if x == inpled :
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) OFF" % (INPLIGHTS[x])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
else :
cmd="\"LPMINI3\"/CLIP(\"MIXSTATE\") NOTES(%s) ON" % (INPLIGHTS[x])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

def moodstate(self,__,args):
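""" combines the two channel arguments and sends the MOOD pedal state as MIDI CC 103 on channel 11 """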
channels=args.split()
midi_cc = int(channels[0])+int(channels[1])
cmd = "MIDI CC 11 103 %s" % (midi_cc)
self.canonical_parent.show_message('action: %s' % cmd)
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

def mon_to_sends(self, action_def, args):
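""" copies one row of the mixer matrix into the 11 sends of the given track """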
track = action_def['track']
track_index = list(self.song().tracks).index(action_def['track'])+1
for x in range(0,11):
cell = 144+fxsends[x]
# hier kommt der wert aus der MIX-Matrix, if schleife min-MAX?
sends[x] = matrix[cell]
action='%s/SEND %s %s' % (track_index, sendletters[x], sends[x])
self.canonical_parent.show_message('action: %s' % action)
self.canonical_parent.clyphx_pro_component.trigger_action_list(action)

def mon_to_rec(self, action_def, args):
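""" copies the current monitor mix to the re-record submix of the RME and switches back to the phones submix afterwards """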
self.canonical_parent.show_message("Monitor to rec")
## select submix re-record
submix_midi_action="MIDI CC %s %s 127" % (submix_vars[1][3],submix_vars[1][4])
self.canonical_parent.clyphx_pro_component.trigger_action_list(submix_midi_action)
for x in range(1,13):
vol=0
cell_row=143
vol=int(matrix[x+cell_row]);
RME_mixer_action="MIDI CC %s %s %s \n" % (source_vars[x][3],source_vars[x][4],vol)
## self.canonical_parent.clyphx_pro_component.log_message(RME_mixer_action)
if (source_vars[x][3]!=0):
self.canonical_parent.clyphx_pro_component.trigger_action_list(RME_mixer_action)
for i in range(0,4):
RME_mixer_action="MIDI CC %s %s %s" % (source_vars[13+i][3],source_vars[13+i][4],mic_matrix[i])
self.canonical_parent.clyphx_pro_component.trigger_action_list(RME_mixer_action)
## select MON submix
submix_midi_action="MIDI CC %s %s 127" % (submix_vars[13][3],submix_vars[13][4])
self.canonical_parent.clyphx_pro_component.trigger_action_list(submix_midi_action)

def copy_sends(self, action_def, args):
track_index = list(self.song().tracks).index(action_def['track'])
sends = list(self.ChainMixerDevice().Sends)
self.canonical_parent.log_message('sends: %s' % sends)
self.canonical_parent.log_message('args: %s' % args)

def clear_sends(self, action_def, args):
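""" sets all 11 sends of the given track to zero """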
track = action_def['track']
track_index = list(self.song().tracks).index(action_def['track'])+1
for x in range(0,11):
action='%s/SEND %s 0' % (track_index, sendletters[x])
self.canonical_parent.show_message('action: %s' % action)
self.canonical_parent.clyphx_pro_component.trigger_action_list(action)

def loopstatus(self, action_def, args):
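""" stores the playing slot index of the 8 loop tracks (starting at the given track) in the loopstatus matrix """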
matrix_string = "slots playing : ";
track_index = list(self.song().tracks).index(action_def['track'])
for x in range (0,8):
loopstatus_matrix[x]=self.song().tracks[track_index+x].playing_slot_index+1
matrix_string += "+ Track(%s): %s +" % (x,loopstatus_matrix[x])
## self.canonical_parent.show_message('loopmatrix: %s' % matrix_string)

def loopstatusled(self, action_def, args):
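""" updates the 8 loop LEDs on the Launchpad X from the loopstatus matrix """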
for x in range(0,8):
if int(loopstatus_matrix[x])>0:
cmd="%s NOTES(%s) OFF" % (LPX_LEDCLIP, LPX_LOOPLIGHTS[x])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
else:
cmd="%s NOTES(%s) ON" % (LPX_LEDCLIP, LPX_LOOPLIGHTS[x])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
message = ' count total: %s - last loopmatrix clip etc %s' % (x, cmd)
## self.canonical_parent.show_message(message)

def barslotled(self, action_def, args):
    leds=args.split()
    if int(leds[0]) <=4 :
        for x in range (0,4):
            cmd="%s NOTES(%s) GATE >16" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
            self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            if (int(leds[0])-1) == x:
                cmd="%s NOTES(%s) OFF" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            else:
                cmd="%s NOTES(%s) ON" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
    elif int(leds[0]) <=8:
        for x in range (0,4):
            cmd="%s NOTES(%s) GATE <16" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
            self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            if (int(leds[0])-1) == x+4:
                cmd="%s NOTES(%s) OFF" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            else:
                cmd="%s NOTES(%s) ON" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

    if int(leds[1]) <=4 :
        for x in range (0,4):
            cmd="%s NOTES(%s) GATE >16" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
            self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            if (int(leds[1])-1) == x:
                cmd="%s NOTES(%s) OFF" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            else:
                cmd="%s NOTES(%s) ON" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
    elif int(leds[1]) <=8:
        for x in range (0,4):
            cmd="%s NOTES(%s) GATE <16" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
            self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            if (int(leds[1])-1) == x+4:
                cmd="%s NOTES(%s) OFF" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
            else:
                cmd="%s NOTES(%s) ON" % (LPX_LEDCLIP,LPX_BAR_SLOT_LED[x+4])
                self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)

def muteled(self, action_def, args):
return
def muteledoff(self, action_def, args):
return

def selplayingscene(self, action_def, args):
sceneid=1;
self.canonical_parent.show_message("select actual playing scene")

def ottostep_inc(self, action_def, args):
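""" raises one Ottobit Jr. sequencer step to the next value and sends the corresponding CC """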
step=int(args)
if otto_cc_matrix[step] < 24:
otto_cc_matrix[step]=otto_cc_matrix[step]+1
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (step+1,otto_cc_notes[otto_cc_matrix[step]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

def ottostep_dec(self, action_def, args):
step=int(args)
if otto_cc_matrix[step] in range(1,25):
otto_cc_matrix[step]=otto_cc_matrix[step]-1
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (step+1,otto_cc_notes[otto_cc_matrix[step]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

def ottostep_matrix(self, action_def, args):
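""" sets all six Ottobit Jr. sequencer steps from a list of pitch names (or MUTE/OFF) and sends them """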
steps=list(args.split())
maxargs = len(steps)
self.canonical_parent.log_message("steps list : %s, len: %s" % (steps,maxargs))
for x in range (0,int(maxargs)):
if "MUTE" in steps[x].upper():
otto_cc_matrix[x] = 25
elif "OFF" in steps[x].upper():
otto_cc_matrix[x] = 26
else:
for y in range (0,25):
if steps[x] == otto_cc_names[y]:
otto_cc_matrix[x] = y
for x in range(0,6):
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (x+1,otto_cc_notes[otto_cc_matrix[x]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

def ottostep_mute(self, action_def, args):
step=int(args)
otto_cc_matrix[step]=25
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (step+1,otto_cc_notes[otto_cc_matrix[step]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

def ottostep_reset(self, action_def, args):
step=int(args)
otto_cc_matrix[step]=12
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (step+1,otto_cc_notes[otto_cc_matrix[step]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

def ottostep_matrix_send(self, action_def, args):
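""" re-sends all six stored Ottobit Jr. step values """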
for x in range(0,6):
cmd="\"M-otto\" / DEV(\"CC-otto-steps\") B1 P%s %s" % (x+1,otto_cc_notes[otto_cc_matrix[x]])
self.canonical_parent.clyphx_pro_component.trigger_action_list(cmd)
self.canonical_parent.log_message("command : %s " % cmd)

 

go to the Master Research Overview


Future Plans

things to do

After writing so much about what I did, I now want to tell you about what ideas I have got for future developments.

From an artistic standpoint, my aim is to play more with this setup and make music. Because it is big and takes some time to set up and pack up, I imagine there is potential for an online community where I can perform.

However, there are many things I want to re-program and implement:

Matrix 2.0

I already wrote about how the first matrix came together. Just like in the movie, it needs an update from the ground up. I want to store and handle all the data either in a two-dimensional array or a dictionary, or I might even rewrite it as classes.

Either way, this would give me much more flexibility. I could even store preset names, presets for pedals etc. with one matrix setting. This way I could have a "total recall" possibility.
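A first sketch of what such a structure could look like, assuming Python dataclasses; none of this is implemented yet and all names are placeholders:

# Sketch for Matrix 2.0 (not implemented yet): one object per route plus a preset
# container that could also hold pedal presets for "total recall".
from dataclasses import dataclass, field

@dataclass
class Route:
    source: str          # e.g. "microcosm"
    target: str          # e.g. "CXM"
    volume: int = 0      # 0-127 send level

@dataclass
class MatrixPreset:
    name: str
    routes: list = field(default_factory=list)          # list of Route objects
    pedal_presets: dict = field(default_factory=dict)   # e.g. {"H9": 12, "CXM 1978": 3}

ambient_start = MatrixPreset(
    name="ambient start",
    routes=[Route("input", "CXM", 100), Route("microcosm", "blooper", 80)],
    pedal_presets={"CXM 1978": 3, "microcosm": 7},
)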

But it also means I have to go through EVERYTHING I have programmed so far. I can only do this if I take two weeks off in which I don't have to make music with it.

Retrospective looping

The way my loop recording works at the moment: I press a button, then my machine records. With retrospective looping, it is the other way round. The machine records all the time; when I press a button, it makes a loop out of what I just played.
The first person I saw doing this was Lucky Paul, a percussionist from New Zealand.
It takes away a lot of stress when you improvise and think, "Oh, that was nice": one press of a button later, it is preserved.
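As a toy illustration of the principle (this is not code from my setup; in practice this would live inside a Max for Live device or a looper plugin), retrospective looping boils down to a ring buffer that is always recording:

# Toy sketch of retrospective looping: the buffer records continuously, and a
# button press copies the last few seconds of audio out as the new loop.
SAMPLE_RATE = 44100
BUFFER_SECONDS = 30

buffer = [0.0] * (SAMPLE_RATE * BUFFER_SECONDS)  # circular buffer, always recording
write_pos = 0

def record_block(block):
    """ continuously write incoming audio samples into the circular buffer """
    global write_pos
    for sample in block:
        buffer[write_pos] = sample
        write_pos = (write_pos + 1) % len(buffer)

def capture_loop(loop_seconds):
    """ on a button press, return the last loop_seconds of audio as a new loop """
    n = int(loop_seconds * SAMPLE_RATE)
    start = (write_pos - n) % len(buffer)
    if start + n <= len(buffer):
        return buffer[start:start + n]
    # the requested span wraps around the end of the circular buffer
    return buffer[start:] + buffer[:(start + n) % len(buffer)]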

Presets for ambient improvisations

Similar to the matrix 2.0 I plan to have pre-made starting environments for my setup.

OSC / Tablet

At the moment I use my tablet only to display information, as the connection of my OSC system was unreliable at the beginning. I became a member of Beardyman's Patreon. He has built and re-built his live-looping and beatboxing setup over the last 12 years. He uses four iPads, all running Lemur (a program similar to TouchOSC), together with Ableton Live.
With an iPad you have many more possibilities to control things, as well as much more visual feedback.
This, too, will require a kind of total re-work.

but for now

Let's make some great music, shall we?

go to the Master Research Overview


session with Markus Birkle

Questions:

  • is the overall concept doable?
  • strategies for ambient improvisation
  • modes of the ct5
  • What can be improved in the setup
  • tips & tricks?
  • Level management

Answers:

  • volume swell into reverb and/or delay -> solved
  • rule of thumb: one pedal -> one function
  • ct5 explained how to use it
  • drone bass
  • few functions that are ALWAYS available and accessible
  • fade level control of RME (to do)
  • distortion
  • album recommendation (guitar player from Nils Petter Molvaer)
  • more dirty sounds
  • Interview Bugge & Hendrik Schwarz LINK

resulting morning jam

after this session I did a morning jam

go to the Master Research Overview