Reflection: Taking inspiration from Soma and The CymArtist… can the CymaScope app respond to techno intuitively, and is it actually a good thing?

As I continue researching the feasibility of cymatic and Faraday formations for intuitive, real-time music-to-visual performance, I have discovered two very different but inspiring sources. The paper by Bergstrom and Lotto (2016) offers a particularly interesting perspective on the connections and limitations of combined music and visual performance. In comparing how visuals are coordinated and gesturally performed, Bergstrom and Lotto develop their own software to address the limitations of existing VJ tools. It is interesting to see how the authors emphasise that while the visual response in many software applications is intuitive, the sound analysis is often limited to tracking beat and amplitude.
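To make that limitation concrete, the sketch below (in Python, my own illustration rather than anything from the paper or from any particular VJ tool) shows roughly what "tracking the beat and amplitude" amounts to: an RMS loudness envelope plus a naive energy-based onset detector, with everything else about the sound discarded.

```python
# My own illustration (not Bergstrom and Lotto's code) of the amplitude-and-beat
# tracking that many VJ tools reduce sound analysis to.
import numpy as np

SR = 44100    # sample rate in Hz
FRAME = 1024  # analysis window in samples

def rms_envelope(signal: np.ndarray) -> np.ndarray:
    """Per-frame root-mean-square amplitude: the 'loudness' a VJ tool maps to visuals."""
    n_frames = len(signal) // FRAME
    frames = signal[:n_frames * FRAME].reshape(n_frames, FRAME)
    return np.sqrt((frames ** 2).mean(axis=1))

def crude_beats(envelope: np.ndarray, ratio: float = 1.5) -> np.ndarray:
    """Frames whose energy jumps well above the local average: a naive onset detector."""
    local_mean = np.convolve(envelope, np.ones(8) / 8, mode="same") + 1e-9
    return np.flatnonzero(envelope > ratio * local_mean)

# Synthetic test signal: a 440 Hz tone gated on and off at 120 BPM.
t = np.linspace(0.0, 4.0, 4 * SR, endpoint=False)
gate = (np.sin(2 * np.pi * 2.0 * t) > 0.5).astype(float)  # 2 pulses/s = 120 BPM
signal = gate * np.sin(2 * np.pi * 440.0 * t)

env = rms_envelope(signal)
print(f"{len(crude_beats(env))} frames flagged as beats")
```

Pitch, timbre and harmonic movement never enter the picture here, which is exactly the gap I feel cymatic visualisation could fill.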

This is where I feel cymatic and Faraday formations sourced from a physical cymascope installation would add extra capability and versatility to a VJ set, either mixed with other visuals or as a solo element. As I have discussed in previous posts, the limitations of reproducing cymatic and Faraday formations in software are apparent: there is a lack of transitional fluidity between frequency changes. A physical cymascope installation appeals to me more because of the organic character of the cymatic formations themselves. I stumbled across the video below on Facebook, which gives a great example of a live music performance being visually represented by a cymascope via a live camera feed and projection.

Il Cambia-Mondo (n.d.)

The formations occurring in the water demonstrate fluid transitions (no pun intended) between the different frequencies being visualised, unlike the CymaScope app, which, as previously discussed, cuts off any audio or visual tail in order to play the next sound and display its associated frequency formation. The interesting consequence of finding this video on Facebook was the trail it opened up to further experiments on YouTube. The next video, taken from the channel of 'The CymArtist', shows real creative merit, with varied lighting colours adding to the effectiveness of the visualisation. Other videos from the same channel demonstrate further creative qualities, reverse-engineering the visuals to the sound in post-production.


The CymArtist (2015)

Together, these two sources of experimentation with music and visuals, with and without cymatic content, suggest to me that a physical installation with a cymascope and a live camera feed would be the better option. This has some disadvantages, primarily cost and transportation; however, software replication is not yet intuitive enough to provide a convincing display of fluid motion. As I reflect on my research, I still feel troubled by the lack of music production, filming and editing I have done outside of this research project. Am I living a dream, or dreaming a dream, about cymatics in visualisation, and straying beyond my discipline and capabilities?

Throughout this research project I have been eager to connect a live feed from a DJ mixer to a cymascope to see the results with electronic music. There are plenty of experiments using Chladni plates and water online; however, these frequently use classical music, voice and chant sounds to establish very clear, distinct patterns. The only method currently at my disposal is the CymaScope app on my smartphone, so I will have to use that.

The tricky aspect of using CymaScope is that there are minimal options for audio input: these are limited to the piano roll and the smartphone's microphone. My only option, therefore, was to play my music loud through my studio monitors, so the microphone could clearly track the signal and the app could respond. How could I capture the visualisation appearing on the app to document this experiment? My initial thought was to set up my GoPro and film my smartphone's screen, until I remembered from previous experimentation that you can mirror an iPhone's screen with QuickTime Player on a Mac via the USB charging cable. I gave myself a victorious pat on the back for that.

So, with my iPhone SE connected to my MacBook Pro, I selected 'New Screen Recording' in QuickTime Player and hit record. I then loaded up the CymaScope app, selected the microphone option and played one of my recent tracks, loud, through my iMac and studio monitor speakers. At the end of the song, I imported the footage into Final Cut Pro X and dubbed over the audio with the original file of the chosen track. This gave a clean audio signal while retaining the same timing, and therefore the same visualisations, as the microphone signal. Finally, I zoomed in on the visual frame to crop out the musical note text CymaScope displays underneath the visualisation, added some captions and fades, and hey presto!


Jones (2017)

As I mentioned previously, the CymaScope app is visually satisfying to witness, but the lack of fluid transitions makes it seem somewhat laboured. There are definite repeating formations in the above example; however, I am sceptical of their accuracy due to their lack of consistency. Patterns recur at similar points every few bars, but not always, which leads me to believe that the app is matching the incoming sound to the nearest frequency formation in its data.
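If that suspicion is right, the app's behaviour would resemble a nearest-neighbour lookup against a fixed table of pre-rendered formations. The sketch below is entirely my own guesswork: the FORMATION_FREQS table and the nearest_formation function are hypothetical, not the app's actual code. It shows why such a design would produce abrupt visual cuts rather than fluid transitions.

```python
# Hypothetical sketch of the nearest-frequency matching I suspect the CymaScope
# app performs; the formation table and function names are my own invention.
import numpy as np

# Imagined library of frequencies for which formation imagery exists.
FORMATION_FREQS = np.array([261.6, 293.7, 329.6, 349.2, 392.0, 440.0, 493.9])

def nearest_formation(detected_hz: float) -> float:
    """Snap a detected pitch to the closest pre-rendered formation frequency."""
    return float(FORMATION_FREQS[np.abs(FORMATION_FREQS - detected_hz).argmin()])

# A smooth glide from 300 Hz to 420 Hz...
glide = np.linspace(300.0, 420.0, 10)
# ...comes out as a staircase of discrete formations: abrupt visual cuts
# instead of the fluid transitions a physical cymascope produces.
print([nearest_formation(f) for f in glide])
```

A smooth glide in pitch comes out as a staircase of discrete formations, which would account for both the recurring patterns and their inconsistency.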

All of this reinforces my earlier view that an actual physical cymascope, given the programming limitations of software, would be the best representation of cymatic and Faraday formations. So why go to the effort of conducting an experiment at all? I have yet to explore Max/MSP, and given my lack of experience it might be a long time before I reach a successful test phase; in my current circumstances, this was the best I could do. So where do I go from here? Has my research been thorough enough? As I reflect on the last six weeks or so, I am sceptical of how I have spent my time and energy. Has this been a wild goose chase after an ambitious dream which may never become reality? I would hope not, because I intend to make cymatic and Faraday formations a reality in the realm of real-time music-visual performance.

References:
Bergstrom, I. and Lotto, R.B. (2016) 'Soma: Live Musical Performance in Which Congruent Visual, Auditory and Proprioceptive Stimuli Fuse to Form a Combined Aesthetic Narrative', Leonardo, 49, pp. 397–404. doi:10.1162/LEON_a_00918.
Il Cambia-Mondo (n.d.) Un momento straordinario con la voce… [online] Available at: https://www.facebook.com/ilcambiamondo/videos/1348846065177115/ [Accessed 20 November 2017].
The CymArtist (2015) Cimatica/Cymatics – Experiment 12 (432 Hz) [online] Available at: https://youtu.be/NFFJXxkILHk [Accessed 20 November 2017].
Jones, C. (2017) Sejon – Peace Out (Visual Cymatic Interpretation by CymaScope App) [online] Available at: https://youtu.be/n5YOU0kI94Q [Accessed 23 November 2017].

#Reflection
