Reflection: Why are immersive visuals with cymatic frequency necessary for music performance?

As I continue to research the potential of cymatic frequency being visually formed as an intuitive response to music performance, I have to ask myself: why? At times over the past month it has been somewhat frustrating trying to find sources that demonstrate the direction I intend to take my future work, perhaps because it is unknown territory for many visual artists and musicians alike. As much as I would like to think I am being very original in this concept, I am hesitant to grant myself the gratification of such an idea.

My intention in bringing the phenomenon of cymatic frequency and the visual delights of Faraday waves into real-time music performance is to offer stimulation, immersion, education and therapy in the moment of that performance. As I have specified before, anecdotal evidence suggests to me that the principle of frequency, sound and vibration underpinning all matter, emotion and perceived reality is beyond the grasp of many people. Perhaps being able to visualise sound in many music performance scenarios would enable an audience to be amazed, intrigued and educated by music and sound, rather than solely consuming the medium as great art for entertainment and distraction.

Looking beyond visual artists who embrace cymatics in their performance art has enabled me to discover some great alternatives. One particular visual artist whose work is unprecedented, in my opinion, is Mathieu Le Sourd, otherwise known as Maotik.


(Maotik, 2013)

Maotik often showcases his works in 360° immersive installations, using real-time live visuals, which prove to be powerful, stimulating and stunning. This could be considered great visual art, but it does not offer the audience anything particularly educational, as cymatics would in comparison. Maotik’s work is undeniably a great example for visual artists looking to create an immersive experience that entertains and offers potential therapeutic qualities. How is this relevant to my creative aspirations? If this approach could be adopted in a live music performance environment, with a hybrid inclusion of cymatic and Faraday formations from a physical installation, I feel it would gain greater audience attention and curiosity towards questioning the concept.

Having discovered a visually stunning book, Water Sound Images, I am strongly beginning to think that effective cymatic, or Faraday, formations would ideally be executed as a physical medium in a live performance setting. My current research has not provided definitive insight into the versatility of visual software for this practice, even though the team behind KIMA achieved this very feat. Water Sound Images demonstrates the effective capture of water’s reactions to various vibrations, with additional ambient lighting that naturally enhances the resulting Faraday wave formations.

(Lauterwasser, 2006)

This principle of capturing the results of a physical Faraday wave experiment is also present in the ‘CymaScope’ app, available for Apple and Android devices. The concept behind CymaScope is an interactive piano surrounding a virtual water container: when any piano key is played, the associated cymatic/Faraday pattern emerges, and it translates whenever a new note is played. Each visual formation response to a musical note was physically captured with a real cymascope and camera feed, then programmed into the app to respond accordingly to user interaction. Varying tuning references are available, with options of 440 Hz, 432 Hz and 444 Hz all producing distinct alterations to the visual formations.


(CymaScope, 2016)
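The tuning options mentioned above can be illustrated with a short sketch. This is not how the app itself is implemented (its patterns were captured photographically from a real cymascope), and `note_frequency` is my own illustrative name; the sketch simply shows, via the standard equal-temperament formula, how changing the A4 reference from 440 Hz to 432 Hz or 444 Hz shifts the frequency of every note, and therefore the vibration driving the water.

```python
def note_frequency(midi_note: int, reference: float = 440.0) -> float:
    """Equal-temperament frequency of a MIDI note, given a chosen
    tuning reference for A4 (MIDI note 69)."""
    return reference * 2 ** ((midi_note - 69) / 12)

# Middle C (MIDI note 60) under each tuning reference the app offers:
for ref in (440.0, 432.0, 444.0):
    print(f"A4 = {ref} Hz  ->  C4 = {note_frequency(60, ref):.2f} Hz")
# C4 comes out at roughly 261.63 Hz, 256.87 Hz and 264.00 Hz respectively
```

Even these few hertz of difference change the drive frequency of every note, which is consistent with the app showing visibly different formations per tuning.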

At first look this seems like a great idea, with the addition of a microphone input also generating a formation response. However, being a music producer accustoms one’s ears to smooth transitions, of which there are none when using the piano keys. A new note bluntly cuts off any decay from the previous note, making the transitions seem abrupt. Considering this, using the app solely with third-party audio input and visual output could be a better option.
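One way that abruptness might be softened, at least conceptually, is to crossfade between the previous and next pattern frames over the old note's decay instead of cutting hard. The sketch below is a minimal illustration of that idea, using NumPy arrays as stand-in pattern frames; it is not how the CymaScope app actually behaves, and `crossfade` is a hypothetical helper of my own.

```python
import numpy as np

def crossfade(prev_frame: np.ndarray, next_frame: np.ndarray, steps: int):
    """Yield frames blending linearly from prev_frame to next_frame,
    mimicking a note's decay rather than a hard visual cut."""
    for i in range(1, steps + 1):
        t = i / steps
        yield (1.0 - t) * prev_frame + t * next_frame

# Toy 2x2 "patterns": a hard cut would jump straight from a to b,
# whereas the crossfade passes through intermediate blends.
a = np.zeros((2, 2))
b = np.ones((2, 2))
frames = list(crossfade(a, b, 4))
```

In practice the step count would be tied to the previous note's release time, so longer decays produce slower visual transitions.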

Figure 3: Singing into the CymaScope app

So where can I move forward from here? I will test the app with contemporary electronic music to see how intuitively responsive it is to an audio input signal. Nonetheless, an actual cymascope with a live camera feed in a music performance setting would perhaps showcase the best formation responses. That may be a bold statement, as I have not experimented with either medium, so I must reserve judgement until I have.

References:
Maotik (2013) ‘DROMOS – An immersive performance by Maotik and Fraction’. [Online]. Available at: https://vimeo.com/80542263. [Accessed: 31 October 2017]
Lauterwasser, A. (2006) Water Sound Images: The Creative Music of the Universe. Illustrated ed. USA: Macromedia Publishing.
CymaScope (2016) ‘CymaScope MusicMadeVisible app demo’. [Online]. Available at: https://youtu.be/mvWdIHUo9SU. [Accessed: 3 November 2017]

#Reflection
