29 Tips Learned at Ableton Loop 2018 Day 3 – Music / Software / Instruments
Ableton Live Loop 2018 Day 3
Before reading this, I want you to know that I have added affiliate links to some of the gear I learned about at the Ableton Loop 2018 conference. I earn a commission if you click one of the affiliate links below and buy something, at no extra charge to you. Thank you and cheers!
Sunday Morning
Well, it's the third day at Loop. I am more open, and I'm meeting a lot more people. It is really starting to sink in how powerful this event is. I am having a blast. I am not carrying anything with me except a small notepad to take notes and contact information. I know I will have to stand in the EastWest Studios line for a little while for the machine learning course. I'm really excited about this. I love computers and music, so it will be exciting to hear what machine learning can do for us creative musicians.
EarthMoments Live Packs
While waiting in line at the EastWest Studios, I met Kris Karra, a co-founder of the EarthMoments plugin company. He told me how EarthMoments started out as a label in India. Now they have plugin packs on the Ableton website. Their best-known Ableton Live Pack is Zen Pads; Lé Slow and Waterworxs are a couple of their other best sellers. I thought I'd check them out. It was interesting to talk to him about his company in Vancouver.
Machine Learning and the creative practice: Computer as Collaborator
We were let into the EastWest Studios for the machine learning seminar. I sat next to Luke Hunter. He is from Miami and has a label called Type 3 Records. We talked a bit about music before the session started. Machine learning is a branch of artificial intelligence in which software learns from data and makes decisions the way a human would.
YACHT
One of the most interesting sessions was the presentation on machine learning by YACHT (a band from the Northwest). They talked about how they used machine learning to create a song. They are an interesting band; they once sent out their album cover over fax (haha, even the album cover was created from sound and tones).
YACHT made a song with machine learning for the demonstration in this class. They created the song's melodies, lyrics, and drums with a computer using machine learning software. They started with a melody. It was very interesting, and it sounded very human. They had some rules and limitations for what the computer spit out: they couldn't alter it, but they could cut it into certain lengths. The machine learning software also created a list of lyrics. They applied the same rules there; for example, they couldn't change the lyrics but could cut out certain parts if they were a minimum length. The song that came out was incredible, so interesting and unique and very human. I was surprised that machine learning can create such humanistic melodies and lyrics.
Google’s Magenta
The next session was from the Google Magenta team. Magenta is the Google department that makes machine learning tools for musicians. Jesse Engel and Adam Roberts from Magenta talked about some machine learning algorithms they turned into four Max patches. One of the Max patches they demonstrated morphed one melody into another over time using machine learning. The melody went from a chord to arpeggios, slowly evolving from one into the other. It was so human-like.
I was really impressed by how human machine learning sounded. The Magenta team also talked about a Max drum patch that they created. They took a quantized drum pattern and put it through their machine learning Max for Live patch, and it gave the pattern an incredibly human feeling. You can check out the Max for Live patches here: Magenta Max for Live Machine Learning Patches.
Once I got out of that machine learning class, I went right back in line at the EastWest Studios. There was a “designing a live show” class by Laura Escudé coming up that I couldn’t miss.
I met a few people, including some young Ableton Live users from Texas. They were talking about renting scooters from Lime to get around the city. We don't have those in Las Vegas, and it sounded like an interesting idea. Unfortunately, I didn't get a chance to try it out.
I also met somebody who uses PreSonus Studio One to track vocals. A lot of people seem to use Apple's Logic Pro or another platform for vocals because you can comp vocal tracks easily. Studio One has another cool feature where you can create a clipboard for your arrangements. I was always interested in this and asked him about it. We also talked about iZotope Nectar and some other things. His name was Koshlo. You can check out his cool music here: Koshlo.
Tips I Learned
- Machine Learning is a form of Artificial Intelligence that is making a lot of headway in creating humanistic musical patterns with computers
- Google’s Magenta team made 4 Max for Live patches that use machine learning to create really cool melodies and drum patterns
- Lime is a fun way to get around town using rentable scooters
Laura Escudé and the Art of Designing a Live Show
Everybody was excited for Laura Escudé's presentation on performing live. The room was packed. She talked about using technology to elevate your performance and not feeling ashamed about it. For example, don't feel ashamed to use vocal Auto-Tune or other tools, because they can enhance your performance.
Roli
She had a Roli Seaboard Block MIDI keyboard controller. The Roli Block can connect to other Roli Blocks. You have probably seen them before: they are all-black keyboards that look like they have a black plastic cover over all the keys. Roli's Seaboard Blocks are wireless, and they have a lot of neat expressive functions, including Strike, Glide, Slide, Press, and Lift. I noticed a lot of demonstrators were using these Rolis at Loop. I haven't tried one out myself but am very interested.
Realtime Redundancy Back Up Laptops and Switches
In the beginning, she went over all the cool controllers and instruments she was going to use to perform on the stage. Laura started playing a loop and then deliberately pulled the audio output out of the laptop. The loop kept going. She was showing us how her redundancy backup setup worked.
She used a redundancy switch that would switch over to a second laptop's audio if the sound of the first stopped. This way, if the power goes out or the audio fails on the main laptop, the switch would change over to the second laptop. I didn't catch the exact name of the redundancy switch she used. I thought she mentioned "magic box," but I couldn't find it. Here is a redundancy switch I did find, made by Radial Engineering, called the SW8.
Radial SW8 Redundancy Switch
Let me explain how a redundancy backup setup would work with the Radial Engineering SW8 8-Channel Passive Auto Switcher. The SW8 has 16 inputs (8 A inputs and 8 B inputs) and 8 outputs. Your master system would go into the A inputs, and your redundancy system (backup laptop) would go into the B inputs. The 8 outputs of the SW8 would go to your PA.
The Radial SW8 has an auto switching input for a drone signal. The drone signal would come from your master laptop. It could be a sine wave that you set up to playback on a separate channel with your backing tracks. Any time the sine wave is not heard by the SW8, it automatically switches over to the B inputs.
The B inputs are where your redundancy laptop plays into. You can also send SMPTE time code into the auto-switching drone signal input. The SW8 has a filter that can turn the SMPTE time code into audio for a drone signal. Anytime the SMPTE drops, boom, the SW8 switches to the B inputs.
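The switching logic itself is simple enough to sketch in code. Here is a hypothetical Python illustration of the idea (not Radial's actual firmware, just my own sketch): watch the level of the drone channel block by block, and fall back to the B inputs whenever it drops below a threshold.

```python
import math

def rms(block):
    """Root-mean-square level of one block of audio samples."""
    return (sum(s * s for s in block) / len(block)) ** 0.5

def select_inputs(drone_blocks, threshold=0.01):
    """Yield 'A' while the drone tone is present, 'B' once it drops out."""
    for block in drone_blocks:
        yield "A" if rms(block) >= threshold else "B"

# A healthy sine-wave drone, then silence (as if the main laptop died):
alive = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(512)]
dead = [0.0] * 512
states = list(select_inputs([alive, alive, dead]))
# states == ["A", "A", "B"]
```

The threshold and block size are made-up numbers; a real unit like the SW8 does this with analog detection circuitry rather than software.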
Now, you would want to sync your two laptops so that they are at the same point of playback for your show at all times. This being the case, you may need an external device to sync both laptops to the same point via MTC or SMPTE. I suppose you could use Ableton Link from the main laptop, but if that laptop crashes, your second laptop won't sync; Link would only protect you if there was an audio problem on the main laptop. It seems that using an external master time code would be best. With Ableton Live, you would have to sync via MTC, MIDI Clock, or Link. Unfortunately, Ableton doesn't support SMPTE.
Laura’s Visuals
Laura used Unreal Engine for visual projection effects. It created bubbles that would trigger off her violin playing and its intensity. She had a lot of controllers. Each one sat on a light-up box that lit up when she played her Ableton Push.
She used a Max for Live patch to create DMX from the MIDI coming out of her Push. I think you could use something like DMXis with a USB-to-DMX interface to control lighting from Ableton. Here are some USB-to-DMX interfaces: Enttec USB to DMX interfaces. Every time she played notes on her Push, the different boxes lit up in different colors.
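To give a rough idea of the MIDI-to-DMX concept, here is a hypothetical Python sketch (my own illustration, not the Max for Live patch Laura used): each MIDI note is assigned a DMX channel, and the note's velocity is scaled up to the DMX level range. The note-to-channel map is invented for the example.

```python
# Hypothetical note-to-channel map; DMX addresses run 1-512.
NOTE_TO_CHANNEL = {60: 1, 62: 2, 64: 3}  # C4, D4, E4 each drive one fixture

def midi_to_dmx_frame(notes):
    """Build a 512-channel DMX frame from (note, velocity) pairs.

    MIDI velocity runs 0-127 and DMX levels run 0-255, so the
    velocity is doubled (and capped) to fill the DMX range.
    """
    frame = [0] * 512
    for note, velocity in notes:
        channel = NOTE_TO_CHANNEL.get(note)
        if channel is not None:
            frame[channel - 1] = min(255, velocity * 2)
    return frame

frame = midi_to_dmx_frame([(60, 100), (64, 127)])
# frame[0] == 200 and frame[2] == 254
```

In practice a USB-to-DMX interface (like the Enttec ones above) would send a frame like this out to the lights many times per second.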
Controllers and Software
She used a Livid Instruments CNTRL:R to load Ableton Live scenes. I guess some of them are rare; she mentioned that it was really hard to get. It looked like a really cool controller with a lot of sliders, buttons, and knobs. Laura mentioned using MIDI Merlin to convert audio to MIDI. She would take an audio signal from her violin, and as its volume increased, it created MIDI information that would change the sounds of her synths. Some other people mentioned using MIDI Merlin at Loop, so it is definitely something worth checking out.
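I don't know how MIDI Merlin is implemented internally, but the basic audio-to-MIDI idea is an envelope follower: track the amplitude of the incoming audio and map it onto a MIDI CC value (0-127). A minimal Python sketch of that idea, with made-up smoothing coefficients:

```python
import math

def envelope_to_cc(samples, attack=0.9, release=0.999):
    """Follow the amplitude of an audio signal and emit one MIDI CC
    value (0-127) per sample. attack/release are smoothing
    coefficients: closer to 1.0 means slower movement.
    """
    env = 0.0
    cc_values = []
    for s in samples:
        level = abs(s)
        # Rise quickly when the input gets louder, fall back slowly.
        coeff = attack if level > env else release
        env = coeff * env + (1.0 - coeff) * level
        cc_values.append(min(127, int(env * 127)))
    return cc_values

# A violin swell from silence to full level produces rising CC values:
swell = [math.sin(2 * math.pi * 110 * n / 44100) * (n / 4410) for n in range(4410)]
ccs = envelope_to_cc(swell)
```

You could route CC values like these to a synth's filter cutoff so that playing louder opens the filter, which is roughly the effect Laura described.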
Laura mentioned that she used nativeKONTROL's ClyphX to trigger the key of each song for her Push. A lot of people at Loop were raving about ClyphX this year; it is definitely something worth checking out. ClyphX lets you add scripts to clips in Ableton's Session View to do all kinds of neat controls.
Laura also used a Wii controller with Ableton to control vocal effects. This was really cool: she would wave the Wii controller to send MIDI messages to Ableton. You could use OSCulator or a Max for Live device to use the Wii with Ableton.
After explaining all her equipment, Laura performed two songs with vocals, violin, and her array of controllers. It was amazing.
Tips I Learned
- A lot of people use PreSonus Studio One or Apple's Logic for comping vocal tracks; it is just easier than using Ableton for comping
- I saw a few people using Roli Seaboard Block controllers at Loop for their sessions and lectures
- Don't be afraid to use technology to elevate your performance
- Redundancy backup switches are great for live performance with backing tracks; check out the Radial Engineering SW8 8-Channel Passive Auto Switcher
- Unreal Engine can create some amazing visuals that respond to audio for a visual performance
- You can use Ableton and Max to create DMX from MIDI to control lights in your performance (try these DMXis & Enttec USB to DMX interfaces).
- Livid Instruments makes some unique Ableton controllers, but they are hard to find. Check out this one: Livid Instruments CNTRL:R.
- MIDI Merlin was mentioned a few times at Loop 2018; it can take audio amplitude and change it to MIDI
- You can control Ableton with a Wii controller using Max or OSCulator.
Speed Up Your Workflow with 23 new Ableton 10 Keyboard Shortcuts
Sign up for my Ableton tips newsletter, offers and promotions and get a free Ableton 10 keyboard shortcuts PDF with 23 new shortcuts
Sunday Afternoon
Steve Duda on Making Music Software
Next, I went to Steve Duda's session about creating software for musicians. It was so interesting, but I didn't realize that I had a studio session with Scientist halfway through, so I only caught a bit of Steve Duda's session.
Steve Duda is the creator of one of the most popular synths, Serum. He mentioned that it took 3 years to create Serum. We then watched a video of all the different phases of Serum's UI. Steve talked about a lot of interesting people he has worked with, but I had to catch my studio session with Scientist, so I couldn't stay. He has made some other very popular plugins like LFO Tool and Cthulhu. You can check them out at his site: Xfer Records.
Studio Session with Dub Master Scientist
I went back into the EastWest Studios control room for the Scientist dub studio session. Scientist, Dub Robot, Dani Deahl, and Photay all came to the session in Dub Robot's unique car. You can watch the video here; it's hilarious.
Scientist started off by going over preamp circuitry in a program called Multisim. You can use Multisim to create electronic circuits and test them out in software. Then, once you design a circuit, you can send it off to be printed (in China, I believe). He talked about how people are so overwhelmed or scared about overloading preamps. He demonstrated in Multisim that these amp circuits can take a lot more input than you think.
Mixing the Kick
Scientist started mixing a song. He talked about using EQ and how he never liked to use compression. You can use EQ to separate tracks. He started mixing the kick in real tight and showed us how he creates a lot of bass in the kick. He used a notch filter and boosted the gain way up at around 150 Hz, I think; it was way up there. I would try boosting the gain on the kick with a notch filter with a small Q and sweeping it to where it sounds right. I think this is one of his tricks to get that massive dub kick.
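If you want to experiment with that kind of boost outside a DAW, a peaking EQ can be built from the well-known RBJ Audio EQ Cookbook biquad formulas. This is my own illustrative Python sketch (not Scientist's actual chain); the 12 dB gain and Q of 1 are assumptions you would sweep by ear.

```python
import math

def peaking_eq(samples, fs=44100, f0=150.0, gain_db=12.0, q=1.0):
    """Apply a peaking EQ boost/cut (RBJ cookbook biquad) around f0 Hz."""
    a = 10 ** (gain_db / 40)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = 1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a
    a0, a1, a2 = 1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        # Direct form I difference equation, normalized by a0.
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x1, x2, y1, y2 = x, x1, y, y1
        out.append(y)
    return out

# One second of a 150 Hz tone comes out roughly 12 dB (about 4x) louder:
tone = [math.sin(2 * math.pi * 150 * n / 44100) for n in range(44100)]
boosted = peaking_eq(tone)
```

Sweeping `f0` while listening is the code equivalent of sweeping the notch on the console until the kick sounds right.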
He told us you must use the big speakers to hear that kick. He compared the big monitors to the little Yamaha monitors; you can't realistically hear the kick with little 6- or 8-inch speakers. Now I realize how beneficial it is to use big monitors for the low end. Maybe even run your mix through a PA if you don't have large monitors to see how it sounds.
Scientist hates Compression
Scientist compared compression to putting crappy wheels on a Ferrari. He demonstrated how compression really ruined the sound of the kick. He used EQ to separate everything, putting each track in its place with the EQ, and he didn't touch compression.
Snare
He used a gate on the snare to take the overheads out. He ended up using the digital gate inside Pro Tools because it sounded a lot better than the one on the SSL board; it was more accurate. He really EQed each track so that each instrument sat in its own separate world. He kept saying how important this was. No compression, no effects, just EQ.
Sample Drums if Needed to Make Each Track More Separated
He mentioned, and Dub Robot explained how, if needed, he would use a sampler to sample the drum parts and then re-program them via MIDI. This would really clean up the tracks so there was not a lot of bleed from the overheads. It resolves the phasing issue of two mics picking up the same sound at different phases; out-of-phase sounds can create cancellations, which decrease the punch and clarity of each track. You can use the "Slice to MIDI" command in Ableton Live to easily create samples out of a drum track; it creates easy-to-play samples from the transients in the track.
Overheads
As I remember, he would roll the kick out of the overheads with a high-pass filter. Again, this helped separate the tracks and reduce phasing. He showed how having the kick in the overheads took all the punch out of the kick track because of the phasing. You could hear the difference.
Mixing, Trim and Gain Staging
Once he had mixed the drums, he started gain staging the SSL board. He would use the trims so that each track's fader sat right around 70% at the point where he felt it sounded good in the mix. This made the song a lot easier to mix. He also said he prefers a digital console to analog boards because it is more accurate.
Somebody asked him if he would rather record four horns together or separately. He said he would record them separately; he likes to keep everything real distinct on its own channel.
Delay
Delay: this is what we've all been waiting for, one of the signatures of dub. He started routing delay through the SSL board. Traditionally, you would route the same signal through a couple of channels of the analog board; this would create a delay naturally because of the time it takes the audio to travel through the different channels. But for this demonstration they used the Waves H-Delay Hybrid Delay. Dub Robot mentioned that 220 milliseconds is the magic delay number; it seems to work for everything. They set up two mono delays without EQ filter cuts or anything, and he fed it back into the SSL.
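A basic feedback delay like the one they dialed in can be sketched in a few lines. This is a generic digital delay line in Python (my own illustration, not the H-Delay algorithm), with the delay time set to that 220 ms "magic number"; the feedback and mix amounts are assumptions.

```python
def feedback_delay(samples, fs=44100, delay_ms=220.0, feedback=0.5, mix=0.5):
    """Mix a feedback delay into a dry signal.

    delay_ms=220 is the 'magic number' Dub Robot mentioned;
    feedback sets how long the repeats ring out, mix blends wet/dry.
    """
    delay_samples = int(fs * delay_ms / 1000)  # 220 ms -> 9702 samples
    buffer = [0.0] * delay_samples
    pos = 0
    out = []
    for x in samples:
        wet = buffer[pos]                 # echo from 220 ms ago
        buffer[pos] = x + wet * feedback  # feed the echo back on itself
        pos = (pos + 1) % delay_samples
        out.append((1 - mix) * x + mix * wet)
    return out

# A single click produces an echo every 220 ms, each repeat quieter:
click = [1.0] + [0.0] * 44099
echoed = feedback_delay(click)
# echoed[0] == 0.5 (dry), echoed[9702] == 0.5 (first echo), echoed[19404] == 0.25
```

Riding the feedback and mix amounts by hand, the way Scientist rode the faders, is what gives dub delays their live feel.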
Reverb
We were running out of time, and Dub Robot mentioned at the end that he would put the delays through a reverb to create a reverb pop effect. Unfortunately, we didn't have enough time to go into this.
Playing the SSL Console like an instrument
Scientist started playing the SSL console like it was an instrument. They grouped certain tracks together, and he played the delay by moving faders up and down on the group tracks. It was like he was playing the song in real time on the SSL console. Unfortunately, our time was up in the mix room, but I learned so much.
Tips I Learned
- You can make, test and print electronic circuits with a program called Multisim.
- Scientist boosted the gain of a notch filter on the kick way up at around 150 Hz (I think it was around 150 but couldn't tell; try a notch gain boost on your kick and sweep it around until it sounds right)
- Scientist compared compression to putting crappy wheels on a Ferrari (hahaha).
- Use EQ to make each track separate instead of compression
- Don't let phasing ruin your song; use high- and low-pass filters so that tracks don't phase each other out
- Sample drum tracks if needed to avoid phasing. Ableton Live can do this easily: right-click on the drum clip and select "Slice to MIDI".
- Large monitors or sub monitors can really help your mix with the low end
- Use a gate on snare and kick mics to separate the tracks and avoid phasing
- Use the trim to gain stage each track so the mix sounds good when faders are at 70%, and then mix from that point. In Ableton you can use the Utility device to trim the gain.
- The Waves H-Delay Hybrid Delay is a great sounding plugin used by Dub Robot and Scientist for the Dub delay sounds
- Push a delay into a reverb to create a reverb pop like in dub
- Play the mix board like an instrument to create your mix
- A digital mix console is way more accurate than analog and is preferred by Scientist
Sunday Evening
After the studio session I walked back to the Ivar Theater for a music jam session. I had fun playing guitar. There were a lot of synths and other instruments hooked up, but we only had a few moments before I had to get back to the Montalbán Theatre for the closing ceremonies. I caught the tail end of them.
The Kid live performance by Kaitlyn Aurelia Smith
Right after the closing ceremonies, the lights went out and the curtains raised for a show: a live performance by Kaitlyn Aurelia Smith performing her album The Kid. Wow, this sounded so cool. It was a mix of synths with vocals and harmonies. She sounded like Boards of Canada meets Enya and Kate Bush. Honestly, it was hard to compare to anything because her music was so original; it was its own world.
She had 3 laptops on stage and a headset mic which she sang through. I think she was controlling filters through an analog synth. Occasionally she pushed a foot pedal; I couldn't figure out if it was a sync for the synths or not. The vocals had a lot of harmonies, which sounded cool and distinct. I think she had delay on her vocals too; it sounded like a slap-back delay.
Someone mentioned to me that the geometric projections in the background were probably digital. They were so sharp, and the blacks were deep black. I was really transported to a new world by this performance. You can check out Kaitlyn's album The Kid here: The Kid.
The Chris’ and Farewell Taco Stand
Leaving the Montalbán Theatre, the outside was crammed with people whom I'd met throughout the weekend. I was sad but also excited to have all this new inspiration. I said farewell to a lot of friends, and I met a couple of new ones. They were both named Chris: one from Utah, "Elchrisso," and one from San Francisco, "Idea Unsound."
We started talking about raves, Ableton, and VJ software. Idea Unsound told me about Resolume and TouchDesigner for live-show visuals. I was always curious how people create great visual projections for raves and DJs. I want to incorporate something like that when I perform "Valley of Sapphire," so I am going to check them out.
The three of us ended up going out for tacos. They both talked about Flying Lotus. I was talking to Idea Unsound about automation in Ableton. He mentioned how he uses clips to copy and paste automation in the edit window; you can also lengthen the automation by using clips.
They both raved about a book called Homo Deus. They talked about how Ableton started out as a Max patch, and we chatted about the Live Object Model chart for Max, "The LOM."
A great discussion came up about releasing more music. I mentioned how The Songwriting Academy has helped me write more songs by holding a monthly songwriting challenge. I also mentioned body rhythms and how at certain times of the day I am just not productive, so I usually do brainless activities then, especially between 5 and 7 pm.
ElChrisso headed back to his place, and Idea Unsound and I took an Uber back to my place, where I picked up my car and gave him a ride back to his. Then I came back to my hotel and ate the Thai food leftovers I had in my fridge.
Tips I Learned
- Kaitlyn Aurelia Smith's performance showed the power and uniqueness of analog synths
- Resolume and TouchDesigner are great ways to produce visuals for live music
- Automation in Ableton Live is easier to copy and to change in length when you work from the clip edit view
- Homo Deus by Yuval Noah Harari is a book worth checking out, especially if you're a technology enthusiast
Loop Conclusion
What an incredible experience; I'll never forget Loop. I plan to go every year. Ableton has been my chosen DAW for the past 15 years because I really strive to perform innovative music with computers. Creating my album "Valley of Sapphire" with Ableton has given me an ease of performing it that no other DAW could. Because I created the album with Ableton, all the album's automation is already in the project file. All I had to do was take out the guitar track's audio and play my guitar live through that track, with all of its automation already in place. And this album has very complicated automation on the guitar, which makes it unique: effects sweeps, delays, reverb, phasers, etc.
Ableton works so well for performing, but to my amazement, Loop wasn't about Ableton Live. No, it was about creating innovative music no matter the means. Loop brought some of the smartest, most technologically advanced musicians, programmers, and visual artists together for three days to learn, connect, and explore possibilities, no matter the DAW. It was about more than that. Please don't miss Loop next year, and look me up if you go, so we can chat over tacos about the LOM and automation, etc. Hahaha.
Tips I Learned
- Machine Learning is a form of Artificial Intelligence that is making a lot of headway in creating humanistic musical patterns with computers
- Google’s Magenta team made 4 Max for Live patches that use machine learning to create really cool melodies and drum patterns
- Lime is a fun way to get around town using rentable scooters
- A lot of people use PreSonus Studio One or Apple's Logic for comping vocal tracks; it is just easier than using Ableton for comping
- I saw a few people using Roli Seaboard Block controllers at Loop for their sessions and lectures
- Don't be afraid to use technology to elevate your performance
- Redundancy backup switches are great for live performance with backing tracks; check out the Radial Engineering SW8 8-Channel Passive Auto Switcher
- Unreal Engine can create some amazing visuals that respond to audio for a visual performance
- You can use Ableton and Max to create DMX from MIDI to control lights in your performance (try these DMXis & Enttec USB to DMX interfaces).
- Livid Instruments makes some unique Ableton controllers, but they are hard to find. Check out this one: Livid Instruments CNTRL:R.
- MIDI Merlin was mentioned a few times at Loop 2018; it can take audio amplitude and change it to MIDI
- You can control Ableton with a Wii controller using Max or OSCulator.
- You can make, test and print electronic circuits with a program called Multisim.
- Scientist boosted the gain of a notch filter on the kick way up at around 150 Hz (I think it was around 150 but couldn't tell; try a notch gain boost on your kick and sweep it around until it sounds right)
- Scientist compared compression to putting crappy wheels on a Ferrari (hahaha).
- Use EQ to make each track separate instead of compression
- Don't let phasing ruin your song; use high- and low-pass filters so that tracks don't phase each other out
- Sample drum tracks if needed to avoid phasing. Ableton Live can do this easily: right-click on the drum clip and select "Slice to MIDI".
- Large monitors or sub monitors can really help your mix with the low end
- Use a gate on snare and kick mics to separate the tracks and avoid phasing
- Use the trim to gain stage each track so the mix sounds good when faders are at 70%, and then mix from that point. In Ableton you can use the Utility device to trim the gain.
- The Waves H-Delay Hybrid Delay is a great sounding plugin used by Dub Robot and Scientist for the Dub delay sounds
- Push a delay into a reverb to create a reverb pop like in dub
- Play the mix board like an instrument to create your mix
- A digital mix console is way more accurate than analog and is preferred by Scientist
- Kaitlyn Aurelia Smith's performance showed the power and uniqueness of analog synths
- Resolume and TouchDesigner are great ways to produce visuals for live music
- Automation in Ableton Live is easier to copy and to change in length when you work from the clip edit view
- Homo Deus by Yuval Noah Harari is a book worth checking out, especially if you're a technology enthusiast
Tools
- EarthMoments
- Lé Slow Ableton Live Pack
- Waterworxs Ableton Live Pack
- Magenta
- Magenta Max for Live Machine Learning Patches
- Ableton Live 10 Suite DAW
- PreSonus Studio One 4 Professional DAW
- Apple’s Logic Pro
- iZotope Nectar 3
- Roli Seaboard Block Modular Wireless Midi Touch Interface
- Radial Engineering Sw8 8-Channel Passive Auto Switcher
- Livid Instruments CNTRL:R
- Unreal Engine
- Enttec USB to DMX interfaces.
- DMXis
- Livid Instruments MIDI controllers
- MIDI Merlin
- nativeKONTROL ClyphX
- OSCulator
- Max for Live devices
- Xfer Records
- Serum
- LFO Tool
- Cthulhu
- Multisim
- Waves H Delay plugin
- SSL Console
- Resolume
- TouchDesigner
- The LOM
- The Songwriting Academy
- Lime
People / Music
- Kris Karra
- EastWest Studios
- Type 3 Records
- YACHT
- Jesse Engel
- Adam Roberts
- Laura Escudé
- Koshlo
- Steve Duda & Xfer Records
- Scientist
- Dub Robot
- Dani Deahl
- Photay
- Kaitlyn Aurelia Smith
- Idea Unsound
- Elchrisso
- Flying Lotus
- Yuval Noah Harari (Author of “Homo-Deus”)