Charlie Ellam_DJPD51_HardwareBlog

16/12/2019

Final Compositions

Techno Performance,

After my session with the Prophet 8, I used the improvised samples in my final performance. This performance took a lot more preparation as there were 12 different tracks, so I automated a selection of clips beforehand, leaving me free to focus on the rest of the track. For example, some drum clips were pre-filtered for build-ups, and for other elements such as the acid lead I varied the volume and length of certain clips so they develop over time. This gave me time to focus on the structure and to automate other components by hand, rather than in the box.

Using empty clips, I was able to scroll down through my clips and drop different elements of the track out at certain points. This was especially useful when taking out the pad, as it created a darker atmosphere in the main sections.

Using the Push gave me far more control over what I could achieve in my performance, and it is a workflow I will adopt in future productions as it proved very productive. Unfortunately, I made a slight mistake with the structure during the performance, so in my final submission I edited this slightly.

Breakbeat Composition,

This composition required minimal processing; all the sounds recorded in lessons wholly suited the tempo and style of the track. The recordings were mainly taken from the Drumbrute, Pico modular system and Prophet 8. Using the Eventide Space I was able to create a nice sense of space for different elements of the track, which helped give an epic stadium effect and made the overall sound larger.

This track showed me the capabilities of different hardware: all the sounds required minimal to no processing. Hands-on use of equipment is a productive workflow when improvising in the context of a track, and I found that using different hardware opened the door to being more experimental with my music.

16/12/2019

The Prophet 8 helped with the development of both of my tracks; the sounds it generated helped to glue elements of the tracks together. Using jack cables from the main output of the Prophet, I patched into the ASP and made sure the level wasn't peaking. I then created an audio track in Ableton, selected external input 3, and set the monitoring section to IN.

I then played the WIP of my track and improvised over it, recording into Ableton. I used the LFOs and filter on the Prophet to keep the sounds generated in sync with the desired BPM, for musicality.
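Matching a free-running LFO to the track tempo comes down to a small piece of arithmetic: one cycle per bar at a given BPM implies a specific rate in Hz. A minimal sketch of that conversion (illustrative only, not part of any Prophet or Ableton API):

```python
# Convert a tempo and a note division into an LFO rate in Hz,
# so a free-running hardware LFO can be dialled in to match the track.

def lfo_rate_hz(bpm: float, beats_per_cycle: float) -> float:
    """Rate in Hz for an LFO completing one cycle every
    `beats_per_cycle` beats at the given tempo."""
    seconds_per_beat = 60.0 / bpm
    return 1.0 / (seconds_per_beat * beats_per_cycle)

# At 128 BPM, one LFO cycle per 4/4 bar (4 beats) is roughly 0.53 Hz:
rate = lfo_rate_hz(128, 4)
```

The same formula works for faster divisions: one cycle per beat is just `beats_per_cycle=1`.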

This workflow proved productive for me, as I could quickly cycle through sounds while using the WIP as a reference. The Prophet provides both the distorted guitar sound in my breakbeat composition and the acid lead in my techno performance.

16/12/2019

Today I gathered my samples and arranged them in clip view. I then used my Akai MPK Mini, mapping the pads and knobs to trigger certain clips and automate different parameters. With MIDI mapping in Ableton I was able to trigger certain drum loops and other aspects of the composition. MIDI mapping also lets me set a sensible range for each parameter I am automating, via a minimum and maximum value; this ensures the sounds stay musical and that I don't apply too much automation. This was effectively a warm-up for my final performance, giving me a basic song structure I could then improvise over later in the studio.
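The min/max idea above amounts to scaling an incoming 7-bit CC value into a restricted range, mirroring the Min/Max fields in Ableton's MIDI-map browser. A minimal sketch (the function name is my own, not an Ableton API):

```python
# Scale an incoming MIDI CC value (0-127) into a restricted
# parameter range, so a mapped knob can never push the sound
# outside a musical window.

def map_cc(value: int, lo: float, hi: float) -> float:
    """Linearly map a 7-bit CC value into [lo, hi]."""
    value = max(0, min(127, value))       # clamp to the valid CC range
    return lo + (hi - lo) * (value / 127)

# e.g. a filter-cutoff knob limited to 200 Hz - 2 kHz, so even the
# knob's extremes stay usable:
print(map_cc(0, 200, 2000))    # 200.0
print(map_cc(127, 200, 2000))  # 2000.0
```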

This was a very helpful and productive workflow as it pushed me to create a fundamental structure for my composition. However, I felt limited using the Akai MPK Mini compared with the Push, as there are only 8 pads and 8 macros; you can access 16 pads, but flicking between banks A and B is not efficient under the pressure of a performance.

16/12/2019

Today's lecture was particularly interesting: we used a range of abstract techniques to generate sounds. First we used a Sega Mega Drive, whose cartridge was a multi-operator FM synth shaped by MIDI messages sent via the controller inputs.

Later, in the studio, we played around with circuit-bent kids' toys; variable resistors used for pitch control allowed a drastic change in the timbre of each sound, while switches enabled different distortion effects.

At the end of the session, we walked around each production room to see other esoteric techniques. One that especially caught my eye was the use of FM transmitters: a laptop sent a radio signal via the FM transmitter, giving a crunchy sound that could be desirable in compositions. Using a contact mic they were also able to get some interesting sounds, feeding the signal back by holding the mic up to the speaker.

Although not many of these samples were used in my final compositions, the session opened my eyes to another interesting workflow in which there are no real limits on how you generate sounds. This mindset proved effective in future studio sessions.

16/12/2019

After the introductory sessions with the modules and no-input mixing, I decided to book out the equipment by myself so that the sounds I generated were more focused on the context of my compositions.

I used the Akai Rhythm Wolf as the external MIDI clock and for a 4x4 drum pattern to improvise over. Due to some technical difficulties with the clock I needed to refer to the user guide (see the comments). I again used the Pico module, as in previous sessions, having struggled to get useful content from the Make Noise module. Finally, I used the Allen & Heath mixer for feedback loops and ran the master through the Waldorf 2-Pole filter to create different distortion effects. I also used the 2-Pole as an external effect for sounds generated in previous sessions: I routed the channel in Ableton out via the external output of the ID and into the Waldorf, then came back in through the ASP and into a new track in Ableton via the external inputs. All other instruments were also run through the ASP and recorded separately in Ableton.

This session was helpful and some of the samples were used in the final compositions; again, this was a productive workflow for sample generation. On the other hand, I struggled to make proper use of the Waldorf, so I didn't exploit its full potential.

15/12/2019

This week's session involved using the MIDI clock function on the Drumbrute to send tempo information to each module and synth synchronously, so that the audio from each unit plays in time, ensuring the musicality of the mix. On the Pico system we used the Eloquencer and its complex sequencing to trigger the oscillator unit: the gate outputs tell the oscillator on/off information, while the control voltage (CV) outputs control the pitch. We referenced the Eloquencer manual (link in comments) to adapt the pitch, swing and probability of the sequence.
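The MIDI clock the Drumbrute broadcasts is simply a stream of ticks at 24 pulses per quarter note (PPQN), so the interval between ticks fixes the tempo every slaved unit follows. A small sketch of that timing arithmetic:

```python
# MIDI beat clock runs at 24 pulses per quarter note (PPQN);
# the gap between successive ticks encodes the tempo that the
# master (here, the Drumbrute) sends to every slaved device.

PPQN = 24  # MIDI clock resolution, fixed by the MIDI spec

def tick_interval_s(bpm: float) -> float:
    """Seconds between successive MIDI clock ticks at `bpm`."""
    return 60.0 / (bpm * PPQN)

def bpm_from_interval(seconds: float) -> float:
    """Tempo implied by a measured tick interval."""
    return 60.0 / (seconds * PPQN)

# At 120 BPM a tick arrives roughly every 20.8 ms:
interval = tick_interval_s(120)
```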

To hear the audio, we patched out of the oscillator unit and into the voltage-controlled amplifier (VCA) to amplify our signal, using the level pot to adjust the gain and ensure a healthy signal. To measure the signal we then patched out of the VCA and into the ASP, using its meter to set a good level. Once the signal wasn't clipping, we routed an audio channel in Ableton to receive the external input from the ASP and selected the channel's IN function, which allowed us to hear the module. We then experimented with sounds, breaking the signal between the oscillator and the VCA to patch in the filter; to complete the signal path we came out of the filter and into the VCA. The filter allowed us to drastically change the audio being generated, giving us another range of useful sounds.
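The patch above is the classic oscillator → filter → VCA chain. As a software analogue, here is a tiny pure-Python sketch of the same signal path (illustrative only, not tied to any specific Eurorack module):

```python
import math

SR = 44100  # sample rate in Hz

def saw_osc(freq, n):
    """Naive sawtooth oscillator: n samples in [-1, 1)."""
    return [2.0 * ((i * freq / SR) % 1.0) - 1.0 for i in range(n)]

def one_pole_lp(samples, cutoff):
    """One-pole low-pass filter (6 dB/oct), a very basic VCF."""
    a = math.exp(-2.0 * math.pi * cutoff / SR)
    out, y = [], 0.0
    for x in samples:
        y = (1.0 - a) * x + a * y   # smooth the signal toward each input
        out.append(y)
    return out

def vca(samples, gain):
    """VCA modelled as a simple gain stage after the filter."""
    return [gain * x for x in samples]

# osc -> filter -> VCA, mirroring the patch order in the session:
signal = vca(one_pole_lp(saw_osc(110.0, 1024), cutoff=800.0), gain=0.5)
```

Breaking the chain and re-patching, as we did in the studio, corresponds to reordering or swapping these stages.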

This session proved productive, as many of the samples generated were used in my breakbeat-style composition. Working as a group was again beneficial: we could help each other figure out the modules, and one person had a good knowledge of the sequencer. One downside was that with five people in the studio, we each had limited time to focus on creating sounds for our own compositions.

15/12/2019

For this week's session, we used the Link function in Ableton over WiFi to create a synchronous multi-DAW performance. We used two Scarlett soundcards for the laptops, routing them directly into the ASP and recording the mix into Ableton on the dBs Mac. Each of us used an Ableton Push both to trigger MIDI clips we had previously created in our DAWs and to improvise with different synth and vocal sounds. Using the scale function and transposition on the Push, we were able to select a key and ensure that every sound stayed in F minor. The Link function synced our BPMs and ensured each MIDI clip was triggered on the relevant bar.
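Triggering "on the relevant bar" is launch quantisation: a clip doesn't start the instant you press it, it waits for the next bar boundary so every laptop stays phase-aligned. A sketch of that arithmetic (illustrative only, not the Ableton Link API):

```python
import math

def next_launch_beat(current_beat: float, quantum: float = 4.0) -> float:
    """First beat >= current_beat lying on a quantum (bar) boundary,
    e.g. quantum=4.0 for a 4/4 bar."""
    return math.ceil(current_beat / quantum) * quantum

# Pressing launch at beat 9.3 in 4/4 schedules the clip for beat 12:
print(next_launch_beat(9.3))  # 12.0
```

Because every peer quantises against the same shared beat grid, clips launched on different laptops still land together.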

We originally decided on a lo-fi house style track but ended up going more upbeat as we started to experiment. For the drums, I used the 909 Core Kit in Ableton and, with the sequencer on the Push, built a range of drum loops for build-ups, drops and so on, giving the song structure. I also created a bell-like sub pluck complementing the groove and automated its frequency throughout the performance.

I enjoyed this session as it pushed me to adopt a new workflow and to work in a group, as I haven't had much experience with collaboration. Unfortunately, not all of the sounds made the final compositions, but I learned useful techniques. See below for a video of the performance.

15/12/2019

In this session we experimented with feedback loops using the Behringer MX 1604A. This was achieved by patching the auxiliary send into the line input of channel one. As you slowly raise the aux send, you begin to see the signal on the output meter; once it was at a healthy level, we brought the monitoring fader up to hear the feedback. Using the EQ section on the channel you can adjust the frequency of the loop, heard as the clicks speeding up and slowing down. Next, we patched the headphone output to the line input on channel 2 and increased the headphone level, creating another signal that manipulates the original sound.
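Why the send level matters so much in no-input mixing: the aux send sets the loop gain, and the loop only sits at a "healthy level" while that gain is below 1; above 1 it runs away until the mixer clips. A toy sketch of that behaviour:

```python
# Toy model of a no-input feedback loop: each pass around the
# loop multiplies the level by the loop gain, and the mixer's
# output stage hard-clips once the level hits full scale.

def feedback_peak(loop_gain: float, seed: float = 0.1, passes: int = 50) -> float:
    """Signal level after `passes` trips around the loop."""
    level = seed
    for _ in range(passes):
        level = min(1.0, level * loop_gain)  # clip at full scale
    return level

print(feedback_peak(0.8))  # gain < 1: the loop decays towards silence
print(feedback_peak(1.2))  # gain > 1: the loop saturates at full scale
```

The EQ and inserted effects in the session change *what* circulates in the loop, but this gain condition still decides whether it sustains, dies or screams.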

After this, we patched the Drumbrute Impact into channel 3 and created a drum sequence; sent via the auxiliary, it begins to modulate the feedback signal, giving a variety of interesting timbres.

Finally, we inserted the Lexicon MX300 at the end of the signal chain, after the mixer output and before the soundcard. The Lexicon gives an epic reverberation effect, and its parameters can drastically change the timbre of the sound. As this only applies reverb to the output, we then inserted the reverb between the aux send and the input of channel 1, so that it modulates the feedback signal rather than just reverberating the mix output.

To summarise the session: we were able to acquire a diverse range of sounds across the frequency spectrum with a simple and easy workflow, and the reverb created interesting textures that could be used for atmospheric sounds. However, as we were taking it in turns to experiment for the first time, we didn't manage to capture many useful samples.

Videos

FinalPost
IdeaGen Performance