Sunday, March 26, 2006

Week 4


Audio Arts

In Audio Arts this week we looked at gates and gating. A gate’s main function is to remove background noise from a recorded source, particularly for drums, although it has other applications. In the gate plug-in in Pro Tools there are five parameters you can adjust:
1. Threshold sets the level at which the gate opens and the signal is passed through.
2. Attack sets the rate at which the gate opens, smoothing the opening of the gate.
3. Hold defines how long the gate remains open after the signal falls below the threshold.
4. Decay is the inverse of attack, smoothing the closing of the gate.
5. Range sets how much the signal is attenuated when the gate is closed, letting a little bleed come through rather than cutting the sound off completely.

Setting these parameters correctly can drastically affect the sound of a drum mix. Gating has also been used to keep the stabs of an instrumental section sounding tight by keying (triggering) the section’s gates from the lead player.
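
To make the parameters a bit more concrete, here is a minimal sketch of a gate-like effect in SuperCollider (not the Pro Tools plug-in itself) built with the Compander unit generator; the threshold, clamp and relax values are arbitrary and only there to show which setting does what.

    (
    // A gate-like expander: signal below the threshold is pushed down hard,
    // signal above it passes through unchanged.
    {
        var in = SoundIn.ar(0);    // live input, e.g. a snare microphone
        Compander.ar(
            in, in,
            thresh:     0.1,       // level at which the gate opens
            slopeBelow: 10,        // heavy downward expansion below the threshold
            slopeAbove: 1,         // unity gain above the threshold
            clampTime:  0.01,      // roughly the attack setting
            relaxTime:  0.1        // roughly the hold/decay settings
        )
    }.play;
    )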

Grice, David. 2006. Tutorial on Gating. University of Adelaide, 21 March.


Creative Computing

In Creative Computing, we looked over a section of the SuperCollider help files to give us a grounding in the basics of the program. We looked at the different data types used in its message format, including long, int, float, double, string and bytes, as well as the type tags which accompany them. The backbone of SC is its unique and sophisticated client/server architecture: there is the internal server (which runs inside the application and is used for the graphical scoping functions) and the local server (a separate audio process), and a server needs to be booted, or switched on, before there can be any audible outcome. Christian then explained to us that Open Sound Control (OSC) is the protocol used to control the synthesis server in SC. As a group we then looked at examples in the help files to explain function code, which is found between the curly brackets; the function encapsulates the unit generators, or synths, that create the sound.
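
As a small sketch of these ideas: booting the local server, playing some function code whose curly brackets wrap a unit generator, and sending a raw OSC message to the server by hand (the frequency and amplitude values are just placeholders).

    // Boot the local (audio) server before anything can be heard.
    s = Server.local;
    s.boot;

    // Function code: the curly brackets encapsulate a unit generator.
    x = { SinOsc.ar(440, 0, 0.2) }.play;
    x.free;

    // Under the hood the language controls the server with OSC messages;
    // this creates a node from the built-in "default" synth definition by hand.
    s.sendMsg("/s_new", "default", n = s.nextNodeID, 0, 1);
    s.sendMsg("/n_free", n);    // and frees it again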

Haines, Christian. 2006. Tutorial on Creative Computing. University of Adelaide, 23 March.


Workshop

This week in workshop David Harris brought in a selection of music by north-east American composers. The first piece we listened to was titled Surf Music II, composed by Jack Vees. Vees’s specialty is electric bass, which was used in the composition. To me the piece had an underlying drone with swirling water noises over the top of it, much like the feeling of the swell of an ocean rising and falling, waiting for a wave to break whilst surfing. This outcome was created using a combination of bowed bass guitar, electric guitar, delays, feedback and filters, which made for a sonically interesting composition. We then listened to a composition titled Fog Tropes II by Ingram Marshall, composed for string quartet and tape. I liked how the tape was used to create a delay which layered the sounds of the strings over each other, and how the tape output interacted with the strings. The next piece we listened to was by composer Michael Gordon, titled Trance IV; the recording we heard was performed by a group called Icebreaker. While I could see its place in the scheme of composition of that time, I found this piece uninteresting and a little ‘cheesy’, mainly due to the progressive rock feel. David Harris then played us a piece he composed titled Piano Piece, which explored the harmonies created by the instrument’s resonant frequencies. The piece focused on the process of the instrument rather than the process of the composition, and it was interesting to hear how cupping my hands over my ears changed what I heard. David finished the session by playing another piece he composed, titled Compossible 14, which drew on bird sounds for Pierre Boulez and was a homage to John Cage, with an obviously Cage-influenced use of the space in between notes, like the Japanese concept of ma.

Harris, David. 2006. Tutorial on Listening. University of Adelaide, 23 March.

Sunday, March 19, 2006

Week 3


Audio Arts

In Audio Arts this week we continued on from last week with the topic of stereo micing techniques. As a group we set up the mid-side technique for piano. When using this technique it is best to use two of the same microphone, so we used two Neumann U87s, both in their cradles. The first was set up just inside the piano, as close as possible to the soundboard, with its polar pattern set to figure 8. We then experimented with the direction of the microphone to find the sweet spot where the bottom end sounded fullest. The second microphone was placed upside-down directly above the first, about 0.5 cm away, with its polar pattern set to omni. We then recorded the piano into Pro Tools, creating two tracks. The recording from the figure-8 microphone was duplicated and the duplicate phase-inverted; the original was panned right and the inverted copy panned left. The result of this process was quite striking, with a sound similar to that of actually being seated at the piano.
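
The decode we built by hand in Pro Tools is just a sum and a difference. A minimal SuperCollider sketch of the same idea, assuming the omni (mid) and figure-8 (side) microphones arrive on the first two hardware inputs:

    (
    // Mid-side decode: duplicate the side signal, invert one copy,
    // and pan the pair hard left/right around the centred mid signal.
    {
        var mid, side;
        mid  = SoundIn.ar(0);    // omni microphone (mid)
        side = SoundIn.ar(1);    // figure-8 microphone (side)
        // left = mid + side, right = mid - side (the inverted duplicate)
        [mid + side, mid - side] * 0.5
    }.play;
    )

Scaling by 0.5 simply keeps the summed channels from clipping.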

Grice, David. 2006. Tutorial on Mid-side Micing. University of Adelaide, 14 March.


Creative Computing

We started off the class by looking at what a node is in SuperCollider. As Christian told us, a node is an object on the server that is represented by a chunk of code; these objects are widely known as synths or groups. A synth can be an actual synthesiser, a sound modifier or a trigger, and in SC an actual synthesiser is a collection of unit generators. A group is a bunch of synths. Each node is accompanied by an identifying number, or ID. Each synth reads and writes audio and control data via buses, known as audio and control buses. We also touched on buffers, which are blocks of memory on the server used to store information such as sound files; the sound data is held as 32-bit floating-point samples.
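
A short sketch of these server-side objects, using a made-up synth definition name (\ping) purely for illustration:

    (
    // A synth definition: a named collection of unit generators.
    SynthDef(\ping, { |out = 0, freq = 440|
        var sig = SinOsc.ar(freq) * EnvGen.kr(Env.perc(0.01, 0.3), doneAction: 2);
        Out.ar(out, sig);
    }).send(s);
    )

    x = Synth(\ping, [\freq, 660]);    // a synth node on the server
    x.nodeID.postln;                   // every node has an identifying number

    g = Group.new;                     // a group: a node that holds other nodes
    Synth(\ping, [\freq, 880], g);     // place a synth inside the group

    c = Bus.control(s, 1);             // a control bus for control-rate data
    b = Buffer.alloc(s, 44100, 1);     // a buffer: server memory for sample data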

Haines, Christian. 2006. Tutorial on Creative Computing. University of Adelaide, 16 March.


Workshop

This was our first week of workshop with David Harris, a composition lecturer at the University of Adelaide. David showed us some examples of electro-acoustic composition, the first of them being an installation piece by Robert Ashley. Ashley’s piece was also an example of sound poetry, in which he used tape-like techniques to create an effect that sounded like he was never taking a breath. This gave the piece a feeling of uneasiness and awkwardness. The second piece we listened to was Gloria from Glenn Branca’s Symphony No. 3. This piece was composed for keyboards, guitars and drums. That may sound like the instrumentation of a rock band, however there were 6 keyboards that were tuned for overtones over 7 octaves and 7 guitars that were bowed instead of plucked or strummed. Throughout the piece the changing combinations of overtones gave it a spatial feel that swirled around like waves, creating something that was interesting to listen to.

Harris, David. 2006. Tutorial on Listening. University of Adelaide, 16 March.


Forum

This week in forum we had composer and mathematician Gordon Monro speak to us. He explained that his area of specialty is generative artwork, which involves creating a system that is then used to create the art: the system sets rules and boundaries for the piece, but the exact outcome is unknown. An example he showed us was a program he created called Evochord, a genetic algorithm that tries to evolve a harmonious chord. This was represented visually by blob-like objects that moved and changed colour to represent changes in pitch, modulation, etc. Monro also showed us examples of some other works, including Red Grains and What Are You Really Thinking. It was interesting to hear about Monro’s ways of using mathematics to compose music and his thoughts on whether musicians should have a strong knowledge of maths.

Monro, Gordon. 2006. Tutorial on Generative Artworks. University of Adelaide, 16 March.

Sunday, March 12, 2006

Week 2


Audio Arts

This session focussed on micing techniques, so firstly we discussed microphone types and polar patterns. The main polar pattern types are omni, cardioid, hyper-cardioid and figure 8. We then talked about various stereo micing techniques such as spaced pair, X-Y and mid-side. The mid-side technique was new to me; it involves the use of two mics, one with a figure-8 pattern and the other an omni. The two mics are placed on top of each other, then positioned above the edge of the piano by the mid strings, with the sweet spot of the figure-8 mic aimed over the lower strings. In post-production, the recording of the figure-8 mic is copied, then the copy is inverted and panned left while the original is panned right, adding another spatial element.

Grice, David. 2006. Tutorial on Stereo Micing. University of Adelaide, 7 March.


Creative Computing

This week Christian introduced us to the program SuperCollider and its main components. The post window was discussed first, with Christian describing it as similar to the Max window in Max/MSP: it is used to post data and output to the screen and is used for debugging. Christian explained that coding for SC is done in a text document saved in RTF format, and the pros and cons of that format were also covered. Christian then showed us the hello world coding example, demonstrating .post and .postln and how they are similar to the print object in Max/MSP. Christian also told us that when debugging code we should print it out and highlight the mistakes to improve our understanding of the code.
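
The hello world example only takes a couple of lines, with the results appearing in the post window:

    // Each line prints its receiver to the post window, much like
    // the print object in Max/MSP.
    "Hello, World!".postln;    // posts the string followed by a newline
    "Hello, World!".post;      // posts the string with no trailing newline
    (1 + 2).postln;            // expressions can be posted too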

Haines, Christian. 2006. Tutorial on an Introduction to SuperCollider. University of Adelaide, 9 March.


Forum

Composer Warren Burt was brought in to talk to us. Burt’s presentation followed his history with music technology and was accompanied by photos from the past 30-plus years. Burt explained that he had a background in electronics as well as music, which allowed him to be involved in synthesiser construction. He also had a strong interest in tuning, creating his own tuning forks and composing for them, as well as creating his own scales with the aid of the computer. Burt believes in deep structure in music, and given his education he has a liking for algorithmic composition and the use of maths in composing. He spent some time explaining randomness and chaos and how they relate to his music. Throughout his presentation Burt had some interesting photos of his pieces, similar to Minard in the previous week, particularly the installation-style use of large objects such as advertising signs as speakers. Burt also spoke about what he would be doing as part of the festival, which was interesting: he explained how he uses a program called Scala to create an audio and visual piece, with Scala being triggered via a PlayStation controller. I found Burt’s presentation interesting, although I wasn’t as inspired as I was by Minard’s presentation; that may have been influenced by the longer length of this week’s session. It was good to see the ideas other technology-based composers use in their creative process.

Burt, Warren. 2006. Tutorial on Warren Burt and Music Technology. University of Adelaide, 9 March.

Week 1

Audio Arts

In Audio Arts this week we spent some time getting acquainted with David, discussing his history as well as our backgrounds, future plans and current musical/audio experience. This was a good way to hear about David’s credentials and also to hear about other classmates’ ambitions and current projects. We then spent the remainder of the class in the control room, where we looked at session management, particularly Playlists and Groups. Although parts of this were revision, it was good to see them applied at a high level, as the session David had was larger than anything I had worked with. This showed me how important it is to use Playlists and Groups to achieve efficient session management.

Grice, David. 2006. Tutorial on Session Management. University of Adelaide, 28 February.


Creative Computing

Christian introduced us to SuperCollider, which we will be spending this year learning. He started off by explaining textual, or text-based, programming, in this case procedural programming, which includes languages such as Java, C++, etc. SuperCollider was created by James McCartney and is built in C/C++; the current server-based version was released as open source in 2002. SC has a higher level of abstraction that requires the programmer to develop a new way of thinking. Because the coding is text based, a fine granularity of control is available to the user, which allows a high level of precision and accuracy, particularly in the areas of music, audio and visual production. We also discussed the idea of programs being extensible. This concept allows the user to expand the software by adding script, thus modifying the software within certain constraints; this is known as an open system. A closed system is the opposite of this, with Microsoft’s Windows an example.
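
SuperCollider itself is a good example of an open system: a user can extend an existing class with a class-extension file. A tiny sketch (the method name twice is made up for illustration; the code has to be saved as a .sc file in the Extensions folder and the class library recompiled before it takes effect):

    // MyExtensions.sc -- a class extension adding a method to a built-in class.
    + Integer {
        // returns twice the receiver, e.g. 4.twice -> 8
        twice { ^this * 2 }
    }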

Haines, Christian. 2006. Tutorial on Procedural Programming. University of Adelaide, 2 March.


Forum

This week we had musician and sound installation artist Robin Minard speak to us about the development of the sound installation idea. Robin gave us a brief talk about his history, then focussed on how he came up with the idea of a sound installation. He explained that he lived underground in Montreal, an environment where Muzak is played all the time. This prompted Minard to think about the function of music in relation to the space it occupies: how sound changes the perception of the space it is performed in, and how traditional music loses its meaning when performed in a public space like the tunnels in Montreal. Further experimentation by Minard led to the use of small speakers to add an extra dimension to a space and then, later, the use of sculpture to add to the visual element of the installation. Minard’s presentation was accompanied by some detailed photos of his works, particularly those that use multiple output sources to create a sense of movement in the space. He also looked at how colour influences your perception. Minard explained that he uses synthetic sounds that sound natural to accompany the space and plays them at a level such that the listener is unsure whether the sounds are coming from the space or the speakers. I found this talk very interesting, especially the examples of works that Minard had constructed.

Minard, Robin. 2006. Lecture on Sound Installation. University of Adelaide, 3 March.