Reflection: Critical analysis – KIMA – so far so good, or not?

As I continue to write my critical analysis on KIMA (Gingrich et al., 2013), I am feeling restricted by the consecutive two-page limit on this section of the assignment and the inevitable unease of my monkey-mind. Having written almost 1,000 words on a chosen two-page section, which covers some aspects of the conceptual and technical approach to KIMA, it has been suggested by my tutor that this section is perhaps not the wisest choice. However, I have also been informed that the work I have written is good. So where do I go from here?

It is Sunday afternoon, the weather is beautiful and I have been sitting inside the Learning Resource Centre (LRC) for almost an hour, attempting to decide which direction to take. So naturally I have decided to express my reflective thoughts with a blog post, in a bid to find the clarity to proceed. I can understand and almost agree with my tutor's comments suggesting that pages focusing on the specific technical components of KIMA would be ideal for critical analysis. However, I am also confident my initial choice has a good degree of relevance to the concepts and technical perspective I wish to implement in my future works.

The added difficulty is the format of the paper itself: information spills over beyond two pages, whichever section I choose. As the only paper I have discovered which specifically utilises cymatic frequencies as visualisations, I felt I had to choose KIMA purely out of relevance. However, it is by no means the best-written source I have found; it is one of the most confusing.

Its journey into the art of telepresence, mixing the authors' vision with a scientific perspective, often makes the paper a minefield to understand. So why did I choose and retain this text? Perhaps it is a good thing to challenge my understanding constantly until I finally grasp their concept and technical setup. However, I would argue that Gingrich et al. could have presented the technical setup nearer the beginning, to avoid confusing readers with an overly artistic, descriptive tone which dips in and out of scientific discussion.

Perhaps it is fair to say that confusion and a lack of clarity are abundant for me in this module, but maybe that is part of gaining an expert understanding of one's disciplinary field? Having spent some time reflecting with this blog post, I have for the moment decided to run with my plan of analysing the initial pages I chose. Maybe this will change in due course, but for now I will continue as I started out.

References:
Gingrich, O., Renaud, A., Emets, E. (2013) 'KIMA — A Holographic Telepresence Environment Based on Cymatic Principles'. Leonardo 46, 332–343. [Online]. Available at: https://muse.jhu.edu/article/512222 [Accessed: 26 November 2017]

#Reflection


Reflection: Taking inspiration from Soma and The CymArtist… can the CymaScope app respond to techno intuitively, and is it actually a good thing?

As I continue pursuing my research into the feasibility of cymatic and Faraday formations for intuitive, real-time music-to-visual performance, I have discovered two very different but inspiring sources. The paper by Bergstrom and Lotto (2016) offers a particularly interesting perspective on the connection and limitations of dual music and visual performance. In exploring how visuals are coordinated and gesturally performed, Bergstrom and Lotto embark on developing their own software to address the limitations of VJ software capabilities. It is interesting to see the authors emphasise that, in many software applications, the visual response to sound is often limited to tracking beat and amplitude.

Personally, this is where I feel cymatic and Faraday formations sourced from a physical installation of a cymascope would add extra capability and versatility to a VJ, either mixed with other visuals or as a solo entity. As I have discussed in previous posts, the limitations in reproducing cymatic and Faraday formations in software are apparent: there is a lack of transitional fluidity between frequency changes. A cymascope installation is personally more appealing due to the characteristics of its cymatic formations. I stumbled across the below video on Facebook, which gives a great example of a live music performance being visually represented by a cymascope via a live camera feed and projections.

(Il Cambia-Mondo, n.d.)

As can be seen from the formations that occur in the water, these demonstrate fluid transitions (no pun intended) between the different frequencies being visualised, unlike the CymaScope app, which, as previously discussed, cuts off any audio or visual tail when playing the next sound to display the associated frequency formation. The interesting aspect of finding this video on Facebook was the trail it led me on to discovering more experiments on YouTube. The next video, taken from the channel of 'The CymArtist', shows good creative merit, with varied lighting colours adding to the effectiveness of the visualisation. Other videos from the same channel demonstrate further creative qualities, reverse engineering visuals to sound in post-production.


(The CymArtist, 2015)

The combination of these two different sources of experimentation with music and visuals, with and without cymatic frequency, suggests to me that having a physical installation with a cymascope and live camera feed would be the better option. This will have some disadvantages, primarily cost and transportation; however, software replication is not intuitive enough to provide a convincing display of fluidity in motion. As I reflect on my research, I still feel troubled by the lack of music production, filming and editing I have done outside of this research project. Am I living a dream, or dreaming a dream about cymatics in visualisation and straying beyond my discipline and capabilities?

Throughout the duration of this research project I have been eager to connect a live feed from a DJ mixer to a cymascope to see the results with electronic music. There are plenty of experiments using Chladni plates and water online; however, these frequently utilise classical music, voice and chant sounds to establish very clear, distinct patterns. The only method I currently have at my disposal is the CymaScope app on my smartphone, so I will have to use that.

The tricky aspect of using the CymaScope app is that there are minimal options for audio input, limited to the piano keys and one's smartphone microphone. Therefore my only option was to play my music loud through my studio monitors, so the microphone could clearly track the signal and the app could respond. How would I capture the visualisation appearing on the app to document this experiment? My initial thought was to set up my GoPro and film my smartphone's screen, until I remembered from previous experimentation that you can mirror an iPhone's screen in QuickTime Player on a Mac via the USB charger cable. I gave myself a victorious pat on the back for that.

So with my iPhone SE connected to my MacBook Pro, I selected 'New Screen Recording' in QuickTime Player and hit record. I then loaded up the CymaScope app, selected the microphone option and played one of my recent tracks through my iMac and studio monitor speakers, loud. At the end of the song, I imported the video footage into Final Cut Pro X and dubbed over the audio with the original audio file of the chosen track. This gave an enhanced audio signal, while keeping the same alignment and subsequent visualisations as the microphone signal. I then zoomed in on the visual frame to remove the musical note text the CymaScope app provides underneath the visualisation, added some captions and fades and hey presto!


(Jones, 2017)

As I mentioned previously, the CymaScope app is visually satisfying to witness, but the lack of fluid transitions makes it seem somewhat laboured. There are definite repeating formations in the above example; however, I am sceptical of their accuracy due to their lack of consistency. Repeat patterns form at similar points every few bars, but this is not always the case, which leads me to believe that the CymaScope app is attempting to find the nearest frequency response based on its data.
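To make that hypothesis concrete: the behaviour I suspect would amount to snapping whatever pitch the microphone detects to the nearest equal-tempered note and displaying that note's pre-captured formation. The Python sketch below is purely illustrative of my guess; the function names and the 440 Hz reference are my own assumptions, not the app's actual code.

```python
# Hypothetical sketch of the "nearest frequency response" behaviour I
# suspect the CymaScope app uses: snap a detected pitch to the closest
# equal-tempered note, then show that note's pre-captured water formation.
import math

A4 = 440.0  # assumed tuning reference in Hz

def nearest_note(freq_hz):
    """Return the nearest MIDI note number for a detected frequency."""
    return round(69 + 12 * math.log2(freq_hz / A4))

def pattern_for(freq_hz, pattern_bank):
    """Look up the pre-captured formation for the nearest note."""
    return pattern_bank.get(nearest_note(freq_hz))

# A detected 448 Hz tone snaps to MIDI note 69 (A4), so the same formation
# would appear even as the input drifts in pitch, which could explain the
# inconsistent repeats I observed.
print(nearest_note(448.0))  # -> 69
```

If something like this is happening, the inconsistency I noticed would simply be the quantisation jumping between neighbouring notes as the detected pitch wavers.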

This upholds my previous thoughts that an actual physical cymascope would be the best representation of cymatic and Faraday formations, due to programming limitations in software. So why go to the effort of conducting an experiment? I have yet to explore Max/MSP, but due to my lack of experience it might be a long time before I achieve a successful test phase. In my current circumstances, this is the best I could do. So where do I go from here? Has my research been thorough enough? As I reflect on the last six weeks or so, I am sceptical of how I have spent my time and energy. Has this been a wild goose chase into an ambitious dream which may never become reality? I would hope not, because I intend to make cymatic and Faraday formations a reality in the realm of real-time music-visual performance.

References:
Bergstrom, I., Lotto, R.B. (2016) 'Soma: Live Musical Performance in Which Congruent Visual, Auditory and Proprioceptive Stimuli Fuse to Form a Combined Aesthetic Narrative'. Leonardo 49, 397–404. doi:10.1162/LEON_a_00918
Il Cambia-Mondo (n.d.) 'Un momento straordinario con la voce…' [Online]. Available at: https://www.facebook.com/ilcambiamondo/videos/1348846065177115/ [Accessed: 20 November 2017]
The CymArtist (2015) 'Cimatica/Cymatics – Experiment 12 (432 Hz)' [Online]. Available at: https://youtu.be/NFFJXxkILHk [Accessed: 20 November 2017]
Jones, C. (2017) 'Sejon – Peace Out (Visual Cymatic Interpretation by CymaScope App)' [Online]. Available at: https://youtu.be/n5YOU0kI94Q [Accessed: 23 November 2017]

#Reflection


Reflection: Enhancing emotional immersion when the eye listens – is music more powerful with visualisation?

Having broadened my search away from focusing specifically on the presence of cymatic and Faraday formations in music-visual performance, I have discovered some interesting research papers which deserve attention. Primarily focused on the psychological aspects of enhancing emotion and immersion in music, they provide an interesting insight into human response when visual performance is present.

Platz and Kopiez (2012) embark on a thorough investigative analysis in their research paper, 'When the Eye Listens'. Delving into multiple experimental publications investigating the connection between music and visualisation from 1940 to 2011, they attempt to provide a definitive outcome, which is precisely what the paper lacks. Although this is an interesting avenue of research, my personal opinion is that they needed to focus on a specific area, without casting their net so wide. Their resulting meta-analysis suggests a stronger connection to music when visuals are involved, but not definitively for any particular genre of music or sub-type of concert performance.

In comparison, the work of Baltes and Miu (2014) provides a detailed, practical experiment on human responsiveness to music performance in real time. Taking a focused approach towards participants' responsiveness to a live opera performance, the results showed an enhancement in emotion at key moments during the show. Their research is a good, practical indicator of increased immersion and mood fluctuation under dual, simultaneous aural and visual stimulation, as opposed to music alone.

So how does this connect with my creative practice and aspirations? Having studied both of these papers, I am reflecting back on my earlier comparison of how electronic music is presented in clubs and concerts. Together with my previous blog post regarding Awakenings, it is apparent from my research, plus anecdotal evidence, that audiences attending music events prefer dual audio-visual stimulation. It is evident from my own experience that a venue which is either predominantly dark, or streamlined with effective lighting and visuals, provides greater captivation and lifts the mood of the audience. There have been many times I have entered an event which promised potentially good music, but where minimal effort had been spent on decor, visuals and lighting. The result is audience members who are neither intrigued by mystery and drawn in by what they see, nor experiencing an impactful transition in how they feel. It could be agreed that many in this situation want to see and hear but not be seen or heard while expressing their emotions, whether dancing, watching or both simultaneously.

One could compare this with visiting the cinema to watch a film without the auditorium turning the house lights down. This would undeniably steer focus away from the screen and be distracting for the audience. It may be obvious to some, but not to others, that creating a mood suited to the musical performance, and steering focus towards the stage or screen, will increase immersive qualities.

What can I take from these papers? From my perspective, it remains a challenging prospect to include cymatic and Faraday formations in a real-time music-to-visual display in my future projects. Perhaps it is one I will not succeed in making a reality; however, I will continue researching its feasibility in the knowledge that its inclusion could only be a powerful, positive, immersive factor.

References:
Platz, F., Kopiez, R. (2012) 'When the Eye Listens: A Meta-Analysis of How Audio-Visual Presentation Enhances the Appreciation of Music Performance'. Music Perception 30, 71–83.
Baltes, F.R., Miu, A.C. (2014) 'Emotions During Live Music Performance: Links With Individual Differences in Empathy, Visual Imagery, and Mood'. Psychomusicology 24, 58–65.

#Reflection


Reflection: Moving forward, taking inspiration from KIMA, CymaScope technology and Awakenings…

As I begin to write my critical analysis for this module on my chosen source, naturally I am still questioning my grasp of cymatic frequency and Faraday waves as applied to an interpreted live performance scenario. Taking what I have learnt from researching multiple sources, primarily scientific rather than artistic, the next step is to fuse that knowledge with my own creative ambitions for visualised audio.

I realised a short time ago that I have not launched my DAW, picked up my camera to film anything, or edited any footage in Final Cut for two weeks now. Understandably, studying for an MSc means certain things get pushed to the back of the queue. However, with my head submerged in scientific discourse for the last few weeks, I feel the need to actually be creative with music to refresh my perspective and remind myself of the question: why did I choose this path of research? In an attempt to inspire myself into a moderate amount of studio time focused on creating some techno music, I have taken to researching events relevant to the scene which may have scope for visualised audio based on cymatic principles.

So who is at the forefront of music-visual experiences in the techno music scene? One such regular event, renowned in the underground techno scene, is Awakenings. Based in Amsterdam, Netherlands, Awakenings is a techno event held frequently at the Gashouder venue, in the old gasworks district of the city. The venue's sheer expansive size in all dimensions provides an opportunity to deliver an immersive audio and visual experience for all attending.


Fig 1: Awakenings at Gashouder
(KBK Visuals, n.d.)

Having not yet personally experienced Gashouder or Awakenings, it would be unjustified for me to judge the immersive quality of the event critically. However, the quantity of broadcasts streamed live to YouTube from the event gives a good indication of the level of aural and visual stimulation that awaits any visitor. From the outside looking in, it appears that Awakenings events use the expanse of Gashouder's structural beams to showcase a very effective, 360° use of lighting and lasers. This includes a moderate use of visual screens, depending on which DJs or record labels are co-hosting. As I mentioned in an earlier post about visualising techno, akin to the Detroit tradition, a dark room with simple lighting can be very effective for immersion.


Fig 2: Gashouder, Amsterdam
(Resident Advisor, n.d.)

How could my ambitions for immersive cymatic visualisation be brought to life in such an environment? Awakenings, whose visuals are provided by KBK Visuals, clearly aspire to embrace many different objectives with their lighting show, and without suggesting a large focal-point screen, I cannot help but envision a live feed from a cymascope adding an extra dynamic to the Gashouder atmosphere. Perhaps it could be hoisted at a 45° angle on the circular light gantry in the middle of the arena, or circulated 360° around the walls, as opposed to being placed behind the DJ, as is typical at electronic music events. In theory this would offer some diversity to the predominant lighting show format without making a drastic change to a formula that clearly works well.


(BE-AT.TV, 2017)

How would this be done? KIMA (Gingrich et al., 2013) demonstrates that software algorithms can be used; however, I am somewhat critical of the output resolution on display. In the age of high definition (HD), there seems to be a distinct lack of resolution in the cymatic visuals being displayed, as is evident from YouTube videos. Without seeming too critical, this is clearly an avenue which requires more focus in future development. Personally, I would prefer to see a direct audio feed from the DJ mixer to a physical installation of a cymascope, as per Lauterwasser (2006), with a return visual feed back to the VJ in the arena. This installation would likely have to be isolated away from the stage area, free from vibrational bass discharge, in order to generate an equalised, accurate Faraday wave depiction of the vibrational frequencies occurring in real time.

This is an avenue I am keen to pursue during my Masters studies, perhaps as a music-visual hybrid album or DJ performance. However, for the time being I will try not to jump too far ahead and will focus on continuing to research cymascope technology for real-time performance.

References:
KBK Visuals (n.d.) Awakenings at Gashouder [Online image]. Available at: http://www.kbkvisuals.com/project/awakenings/ [Accessed: 08 November 2017]
Resident Advisor (n.d.) Gashouder [Online image]. Available at: https://www.residentadvisor.net/club.aspx?id=3154 [Accessed: 08 November 2017]
BE-AT.TV (2017) Ben Klock @ ADE 2017 – Awakenings x Klockworks present Photon [Online]. Available at: https://youtu.be/BjGE79i6YPg [Accessed: 08 November 2017]
Gingrich, O., Renaud, A., Emets, E. (2013) 'KIMA — A Holographic Telepresence Environment Based on Cymatic Principles'. Leonardo 46, 332–343. [Online]. Available at: https://muse.jhu.edu/article/512222 [Accessed: 4 October 2017]
Lauterwasser, A. (2006) Water Sound Images: The Creative Music of the Universe. Illustrated ed. USA: Macromedia Publishing.

#Reflection

Reflection: Rupert Sheldrake, Russell Brand & Faraday Waves…

As I continue to research the plausibility of cymatic frequency in real-time, music-visualised performance, perhaps feeling a recurrence of concern about my own scientific and creative capabilities to achieve such a feat, I have discovered a very compelling, newly published paper detailing thorough, extensive experiments into Faraday wave behaviour in water. Having searched frequently on the University of Hertfordshire's online library catalogue, sometimes being overwhelmed by the level of scientific complexity in many peer-reviewed sources, I was relieved to be able to coherently follow Sheldrake and Sheldrake's (2017) paper.

The unusual factor is how I discovered this paper: not via the university's online library but from a page I follow on Facebook, called 'Cymatics'. Personally I am not a huge advocate of Facebook, but at its best it enables the free flow of interesting artistic and scientific information, and this occasion was a true representation of that. Reading this paper provided me with an understandable insight into the complexities of Faraday waves that my previous blog posts have touched on, without going very deep. Perhaps equally coincidental is that upon discovering this paper I happened to be listening to comedian Russell Brand's latest 'Under The Skin' podcast, with guest Rupert Sheldrake.


(Brand, 2017)

In this episode, Brand and Sheldrake discuss morphic resonance, consciousness and the universe in great depth, which could be considered beyond the typical range of conventional science. Listening to Brand discuss Sheldrake's work and acknowledge the controversy surrounding much of his research methodology reminded me of the similar stance some academics and those in the scientific community took towards Hans Jenny's experimental theories of cymatics in the 1960s. Would I be frowned upon for including this exciting new paper in my annotated bibliography? There is definitely scope for acknowledging Sheldrake's work on the need to bridge the gap between sound, matter, physicality, therapy and consciousness. However, that is an entirely separate debate.

Without becoming too tangential in this post, a key factor in this paper for my objective to include cymatic frequency and/or Faraday waves in real-time music-visualised performance was the methodology for accurately creating Faraday waves. Sheldrake and Sheldrake (2017) determine that an analogue, physical approach provides a level of certainty in the resulting wave patterns that equations and technological algorithms cannot match, due to their complexity and environmental randomisation. I am not convinced I fully understand the nature of these experiments; however, this informs me that real-time music visualisations, in cymatic or Faraday formations, would be more effectively executed with a physical installation using a cymascope or Chladni plate, as with Lauterwasser's (2006) work.


(Zen Sound, 2014)

It seems that software applications such as Max/MSP with Jitter, or TouchDesigner, could very well be programmed or reverse engineered to react to certain frequencies in sound. However, the randomised complexity of electronic music may make this a lengthy, scripted process that never embraces the beauty of the real-time formations a physical installation could aspire to. Nonetheless, I am still somewhat speculating. My research continues.

References:
Sheldrake, M., Sheldrake, R. (2017) 'Determinants of Faraday Wave-Patterns in Water Samples Oscillated Vertically at a Range of Frequencies from 50-200 Hz'. WATER – Online Multidisciplinary Research Journal [Online]. Available at: http://waterjournal.org/volume-9/sheldrake [Accessed: 04 November 2017]
Brand, R. (2017) ‘Under The Skin: Rupert Sheldrake’. [Online]. Available at: https://youtu.be/dAS-QzWvj8g [Accessed: 07 November 2017]
Lauterwasser, A. (2006) Water Sound Images: The Creative Music of the Universe. Illustrated ed. USA: Macromedia Publishing.
Zen Sound (2014) ‘Water Sound Images Of A Gong by Alexander Lauterwasser’. [Online]. Available at: https://youtu.be/XXkZPfL4qLo [Accessed: 05 November 2017].

#Reflection

Reflection: Why are immersive visuals with cymatic frequency necessary for music performance?

As I continue to research the potential for cymatic frequency to be visually formed in an intuitive response to music performance, I have to ask myself: why? At times over the past month it has been somewhat frustrating attempting to find sources that demonstrate the direction I intend to take my future work, perhaps because it is unknown territory for many visual artists and musicians alike. As much as I would like to think I am being very original with this concept, I am reluctant to offer myself the gratification of such an idea.

My intention in bringing the phenomenon of cymatic frequency and the visual delights of Faraday waves into real-time music performance is to offer stimulation, immersion, education and therapy in the moment of that performance. As I have specified before, anecdotal evidence suggests to me that the very principle of frequency, sound and vibration being the backbone of all matter, emotion and perceived reality is beyond the grasp of many people. Perhaps being able to visualise sound in many music performance scenarios will enable an audience to be amazed, intrigued and educated by music and sound, rather than solely consuming the medium as great art for entertainment and distraction.

Looking away from attempting to find visual artists embracing cymatics in their performance art has enabled me to discover some great alternatives. One particular visual artist whose work is unprecedented, in my opinion, is Mathieu Le Sourd, otherwise known as Maotik.


(Maotik, 2013)

Maotik often showcases his works in 360° immersive installations, using real-time live visuals, which prove to be powerful, stimulating and stunning. This could be considered great visual art, but it is not offering the audience anything particularly educational, as cymatics would in comparison. Maotik's work is undeniably a great example for visual artists looking to create an immersive experience which entertains and provides potential therapeutic qualities. How is this relevant to my creative aspirations? If this approach could be adopted in a live music performance environment, with a hybrid inclusion of cymatic and Faraday formations from a physical installation, I feel it would gain greater audience attention and curiosity towards questioning the concept.

Having discovered a visually stunning book, Water Sound Images, I am strongly beginning to think that effective cymatic, or Faraday, formations would be ideally executed as a physical medium in a live performance setting. My current research has not provided definitive insight into the versatility of using visual software for this practice, even though the team behind KIMA succeeded in this very feat. Water Sound Images demonstrates the effective capture of water reacting to various vibrations, with additional ambient lighting that naturally enhances the resulting Faraday wave formations.

(Lauterwasser, 2006)

This principle of capturing the results of a physical Faraday wave experiment is also present in the CymaScope app, available for Apple and Android devices. The concept behind the app is an interactive piano surrounding a virtual water container: when any piano key is played, the associated cymatic/Faraday pattern emerges, changing whenever a new note is played. Each visual formation was physically captured with a real cymascope and camera feed, then programmed into the app to respond accordingly to user interaction. Varying ranges of pitch are attainable, with tuning options of 440 Hz, 432 Hz and 444 Hz all displaying diverse alterations to the visual formations.
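Those three options are simply different reference pitches for A4: changing the reference shifts the physical frequency of every note, so each captured formation corresponds to a slightly different vibration. A quick back-of-the-envelope check in Python, assuming the standard equal-temperament formula (the app's internal mapping is not documented, so this is an assumption on my part):

```python
# Equal-temperament note frequencies under the three tuning references the
# app offers. Assumes the standard 12-TET formula; the app's internal
# mapping is undocumented, so this is only a back-of-the-envelope check.
def note_freq(midi_note, ref_a4):
    """Frequency in Hz of a MIDI note, given the A4 reference pitch."""
    return ref_a4 * 2 ** ((midi_note - 69) / 12)

for ref in (432.0, 440.0, 444.0):
    # Middle C (MIDI note 60) shifts with the reference pitch.
    print(f"A4={ref:.0f} Hz -> C4={note_freq(60, ref):.2f} Hz")
# A4=432 Hz -> C4=256.87 Hz
# A4=440 Hz -> C4=261.63 Hz
# A4=444 Hz -> C4=264.00 Hz
```

A shift of a few hertz per note is small, but as the app's three tuning modes show, even small frequency changes can alter the resulting formation.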


(CymaScope, 2016)

Upon first look, this seems like a great idea, with the addition of a microphone input also generating a formation response. However, being a music producer accustoms one's ears to smooth transitions, of which there are none when using the piano keys. New notes bluntly cut off any decay from the previous note, making the transitions seem somewhat abrupt. Considering this, utilising the app solely with third-party audio input and visual output could be a better option.

Singing into the CymaScope app

So where can I move forward from here? I will attempt to test this with contemporary electronic music to see how intuitively responsive the app is to an audio input signal. Nonetheless, an actual cymascope with a live camera feed in a music performance setting would perhaps showcase the best formation responses. That could be a bold statement, as I have not experimented with either medium, so I must reserve judgement before reaching conclusions.

References:
Maotik (2013) ‘DROMOS – An immersive performance by Maotik and Fraction’. [Online]. Available at: https://vimeo.com/80542263. [Accessed: 31 October 2017]
Lauterwasser, A. (2006) Water Sound Images: The Creative Music of the Universe. Illustrated ed. USA: Macromedia Publishing.
CymaScope (2016) 'CymaScope MusicMadeVisible app demo'. [Online]. Available at: https://youtu.be/mvWdIHUo9SU [Accessed: 3 November 2017]

#Reflection

Reflection: Jitter or TouchDesigner with Max/MSP – is there some scope for cymatic frequencies and Faraday waves in software?

Following on from a very helpful tutorial session in which I shared my research ideas and prospective future plans with classmates, I was kindly prompted to speak to another lecturer within the university. I was led into a lecture theatre and introduced to Rob Godman, to whom I explained the basis of my research. After summarising my intention to utilise cymatic frequency as a form of intuitive music-to-visual display, he replied reassuringly, with great shared interest.

We discussed the phenomenon of cymatics and Dr Hans Jenny's work, about which Godman was somewhat sceptical, given the nature of its revelation and interpretation in the new age, spiritual community. He introduced me to another experimental term: 'Faraday waves'. This process is specific to the vibrating of liquid; as with sand on a Chladni plate, the medium is vibrated, but here the liquid is enclosed, allowing resonating waves to reflect around the container. This in turn produces a myriad of interesting original waves, plus their reflections, which are themselves still subject to new vibrations. The result is geometrical, symmetrical patterns formed of lines, squares and hexagons in lattice formations. Whether my interpretation explains this clearly holds a level of uncertainty; however, this process has been undertaken by those experimenting with cymatic vibrations, as previous examples in this blog have demonstrated.
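For my own notes, the defining mathematical signature of Faraday waves, as I understand it from the literature, is that the surface pattern oscillates subharmonically, at half the frequency of the vertical drive:

```latex
f_{\text{wave}} = \tfrac{1}{2}\, f_{\text{drive}}
```

The spacing of the pattern then follows from the dispersion relation of the liquid surface, which is why the driving frequency selects the scale and character of the lattice that appears. I state this tentatively, as my grasp of the underlying physics is still forming.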


(Godman, 2017)

Godman showed me a piece of his work, which incorporated his music accompanied by footage he had filmed of Faraday waves during a practical experiment. He explained that a visual artist then reverse engineered the film footage to react to his music in an intuitive manner. Personally, I think this is a beautiful piece that demonstrates the organic methodology of utilising a physical medium to capture visuals, but also the potential of software to fuse them with music. Godman expressed his preference for this practice over using software entirely to replicate Faraday waves, primarily due to the organic nature of practical experimentation.

Considering this philosophy against my intention of achieving real-time music-to-visual intuition in a live, immersive performance setting, perhaps Faraday waves are more plausible in physical practice? I felt positively overwhelmed after spending a short time with Rob Godman, and upon reflection it made me consider thoughtfully what lies ahead.

The prospective idea of having a physical installation with a live camera feed, mixed with visuals sourced from a chosen software application, could be considered a strong contender in achieving this objective in a real-time, live performance setting.

Having spent time exploring the potential of physical installations, such as a Chladni plate or Faraday waves, in a real-time scenario, I have come to the conclusion that this could be plausible in a controlled environment with the correct tools. However, I have still to answer the question of whether the visual formation of cymatic frequencies, or Faraday waves, can be replicated with intuitive, algorithmic software.

Until now my focus has been primarily on researching various scientific experiments through peer-reviewed sources, to gain an understanding of whether such physical practice can be adopted in a music-to-visual performance setting. Although this approach would be feasible with a physical installation plus live camera feed, software would need to be incorporated to enhance the colour and texture and offer additional visual variation. Researching software usage specifically for replicating cymatic frequency and Faraday waves has yielded few results; however, there is potential in the visual applications available.

Jitter, released in 2003, is an additional facet of the programming-language capabilities of Max/MSP. Designed for 2D and 3D video and graphical composition, it is notably capable of building algorithmic image generators and audio visualizers for real-time display (Products of Interest, 2003). Reading this small account in the Computer Music Journal led me to research more detailed publications on the specific programming process of generating intuitive audio visualizers. Having discovered the book 'Multimedia Programming Using Max/MSP and TouchDesigner', it is apparent that Patrik Lechner (2014) has constructed an easily understandable text for the benefit of readers wishing to learn this style of programming.


(Landon Speers, 2009)

However, after studying Lechner's content on Jitter and TouchDesigner, it became apparent to me that the latter is perhaps the superior application, with advanced capabilities and the potential to execute cymatic frequency formations. The only issue is that there are no examples within the book to suggest this is plausible. Perhaps the area I have chosen to research is too obscure? Maybe that in itself has pioneering strength when considering my future work in audio and visual performance. However, without prior programming knowledge or experience in Max/MSP, Jitter or TouchDesigner, it is difficult for me to comprehend the practicality of generating intuitive audio visualizers based on cymatic principles. Nonetheless, Lechner's work is of definite value for future reference; therefore, I will include it within my annotated bibliography.
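To ground my reading, the core analysis step these environments perform, reducing incoming audio to a handful of numbers that visual parameters can follow, can at least be sketched. The snippet below uses Python and NumPy purely as a stand-in for a Max/MSP or TouchDesigner patch; the sample rate, window size and band edges are arbitrary illustrative choices of mine.

```python
# Generic sketch of the analysis stage behind an audio visualizer: window
# the incoming samples, FFT them, and reduce the spectrum to a few band
# energies that visual parameters (scale, colour, mode) can react to.
# Python/NumPy stands in for a Max/MSP or TouchDesigner patch here.
import numpy as np

SAMPLE_RATE = 44100                                         # Hz, assumed
BANDS = [(20, 120), (120, 500), (500, 2000), (2000, 8000)]  # Hz, arbitrary

def band_energies(window):
    """Return one energy value per band for a mono sample window."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / SAMPLE_RATE)
    return [float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in BANDS]

# Demo: a test tone placed exactly on an FFT bin (~86 Hz) should put
# most of its energy into the lowest band.
t = np.arange(1024) / SAMPLE_RATE
f0 = 2 * SAMPLE_RATE / 1024
print(band_energies(np.sin(2 * np.pi * f0 * t)))
```

Even this toy version highlights the gap I keep running into: extracting band energies is straightforward, but nothing in those numbers dictates the geometry a genuine cymatic formation would take at those frequencies.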

The following video shows a 3D render developed in TouchDesigner. Although it is named 'Cymatics Landscape', my opinion is that it is a poor imitation of actual cymatic frequency formations. To the naked eye the animation is of a high standard, visually captivating in motion, colour and texture, but it lacks the geometric complexity, definition and resilient strength present in practical cymatic formation experiments.


(Derivative, 2017)

References:

Products of Interest (2003) Computer Music Journal 27, 112–120. [Online]. Available at: http://www.jstor.org/stable/3681600. [Accessed: 30 October 2017].
Lechner, P. (2014) Multimedia Programming Using Max/MSP and TouchDesigner. 1st ed. Birmingham: Packt Publishing.
Landon Speers (2009) ‘Alva Noto – u_07-  Mutek_10’. [Online]. Available at: https://youtu.be/Lghn2FGHgOI. [Accessed: 30 October 2017].
Derivative (2017) ‘Matthew Ragan’s Cymatics Landscape’. [Online]. Available at: https://vimeo.com/207380152. [Accessed: 30 October 2017].

#Reflection

Reflection: Cymatic displays from software algorithms or physical installation?

Research has suggested that many captured visualisations of cymatic frequencies have been generated by practical experiments utilising the Chladni plate with dry particulates, or a petri dish with water. Aside from the KIMA installation, there are few examples where software algorithms have been successfully used to generate accurate renditions of cymatic frequency intuitively, in real time with source music in a performance setting.

Woolman (2002) provides early-century examples of software-generated visualisation for music, with Akira Rabelais' concept for 'Aural Mutations' being particularly interesting. A key factor to note with this example, and the other contributing artists in this publication, is the common hybridity of technology and physical art. Texture and colour are typically rendered together using a fusion of physical art pieces and projected lighting, as with Glenn McKay's piece, 'Liquid Sound'.

None of the works within utilise cymatic frequency, but the principle is present. I have decided to include this publication in my annotated bibliography because it details early-millennium concepts of visually represented music. Although it is perhaps neither particularly current nor relevant compared to modern capabilities, it demonstrates a fusion of practical and technological concepts that may be necessary for cymatic frequency in a performance setting.

Another good example of mixed visual sources is the utilisation of filmed media in unison with multiple software applications, as demonstrated in the video below.


(Vucinic, 2012)

Here Jovan Vucinic displays his experience with software applications, encompassing Maya, Quartz Composer and Unity together with VDMX, to produce an effective visual performance for 'Steeper' by Fis. This is great to watch, with every beat and timbre generating a swift, flashing transition across the diverse visual material on show. However, as Vucinic specifies, this is triggered by human interaction in time with the music using a MIDI controller. I expect certain elements are intuitively generated by the software in use, as VDMX itself is capable of reacting to frequency and volume. Nonetheless, this does not necessarily include the kind of intuitive software triggers inherent to a cymatic frequency response.

Again I begin to question whether software applications are capable of generating visualised cymatic responses to audio. Do such algorithms exist? Can computer programming and mathematical algorithms replicate these formations?

Further research has led me to understand that the visual formations typically associated with frequencies applied to a Chladni plate can be accurately replicated using concise mathematical equations. Unfortunately, this initial finding is not from a peer-reviewed paper; however, other sources are beginning to confirm that the origin of this information is indeed accurate.


‘Cymatic System Supplementary’. (Plunkett, 2015)

Plunkett (2015) has produced a series of digital magazines to support his cause for various art installation and architectural projects, some of which have the principles of cymatics at their very foundation. His Burning Man issue caught my eye, with particular interest in the double-page spread showcasing 2D and 3D examples (see featured image) of cymatic visual patterns generated from formulae (Plunkett, 2015). Personally, I do not regard my understanding of mathematical equations highly; they appear to me as another language on paper. However, attempting to grasp the concept of the experiment has given me a stronger understanding of the principles of scientific research.

As Rossing (1982) presented in his research paper from the early eighties, and Zhou et al. (2016) more recently, precise cymatic patterns can be achieved on a Chladni plate by applying equations during practical experiments. Whether this methodology is simply a way to accurately define which shape will form on the Chladni plate, or whether it can be reverse engineered into a software application, remains to be seen.
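To give a flavour of what 'generated from formulae' can mean: Rossing's paper concerns Chladni's law, which relates a plate's mode numbers (m, n) to its resonant frequencies, roughly f = C(m + 2n)² for a flat circular plate. For a square plate there is a classic textbook idealisation in which the nodal lines, where the sand would settle, are the zero set of a superposition of two standing waves. The Python sketch below renders that idealisation as ASCII art; it is illustrative only, and not the method used by Plunkett, Rossing or Zhou et al., since a real plate's modes depend on its material and boundary conditions.

```python
# Idealised Chladni figure for a square plate: nodal lines (where sand
# settles) are the zero set of a superposition of two standing-wave modes.
# A textbook approximation, illustrative only; real plates depend on
# material and boundary conditions.
import numpy as np

def chladni_pattern(m, n, size=60, threshold=0.05):
    """ASCII render of the nodal lines for mode numbers (m, n), m != n."""
    x = np.linspace(0, 1, size)
    X, Y = np.meshgrid(x, x)
    u = (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
         - np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))
    u /= np.max(np.abs(u)) + 1e-12          # normalise to [-1, 1]
    return '\n'.join(''.join('#' if abs(v) < threshold else '.' for v in row)
                     for row in u)

print(chladni_pattern(m=1, n=4))  # prints a classic Chladni-style figure
```

Even to my non-mathematical eye, the fact that a dozen lines of code can draw a recognisable Chladni figure suggests the reverse-engineering question is worth pursuing.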

Reflecting upon this information really has me questioning my capability to understand the complexities of cymatic frequency, mathematics and software algorithms. Is this beyond my capabilities as a creative? Is this relevant to my work? I will continue my research and, as I stated before, look towards the capabilities of software applications that can create intuitive visuals on the principles of cymatics, not necessarily involving specific frequency formations.

References:
Woolman, M. (2002) Sonic Graphics: Seeing Sound. 1st ed. UK: Thames & Hudson.
Vucinic, J. (2012) Vimeo. [Online]. Available at: https://vimeo.com/56292984 [Accessed: 18 October 2017]
Plunkett, T. (2015) 'Cymatic System Supplementary'. issuu. [Online]. Available at: https://issuu.com/tobyplunkett/docs/systemsupplementary_150 [Accessed: 21 October 2017]
Rossing, T.D. (1982) 'Chladni's law for vibrating plates'. American Journal of Physics 50, 271–274. doi:10.1119/1.12866
Zhou, Q., Sariola, V., Latifi, K., Liimatainen, V. (2016) 'Controlling the motion of multiple objects on a Chladni plate'. Nature Communications 7, ncomms12764. doi:10.1038/ncomms12764

#Reflection

 

Reflection: Is cymatic frequency in visual performance a limited peer-reviewed research avenue?

My research is progressing, but it is still in its early stages. Having discovered a paper written in 1983 exploring the potential of cymatic music, my thoughts have once again been triggered into questioning why there are not more widely accessible, mainstream scientific sources regarding cymatic frequency in general. The paper I refer to, although not specifically relevant to the focal point of my research, expresses the beginning of a potentially exciting future for the utilisation of emerging electronic art technologies and their capabilities in sound and light (Pellegrino, 1983). However, there remains limited widespread knowledge of the potentially informative and educational prospects of cymatic frequency, particularly regarding matter.

My research is primarily aimed towards its use in visual art, and contrary to the author's anticipation, thirty-four years later there seems to be a lack of evident experimentation in mainstream art. That could be a bold, foolish statement at this stage. Perhaps I am not looking hard enough and have yet to find more examples that merit peer-reviewed publication, or millions of plays on YouTube. It is apparent, from my perspective, that most mainstream, popular or accessible music genres focus their visual art on the individual as opposed to the music or sound element. This could be due to the high presence of vocals in songs. Many artists gain the listener's attention by singing beautiful, edgy, distasteful or poetic words, and perhaps this reduces the musical element to a mere enhancement, secondary to the vocals.

Personally, I am a fan of only a minority of singers, and generally those I like are ones where the music is as prominent in the performance as the vocals. This could explain why I primarily love electronic, techno, soundtrack and instrumental music. Perhaps many music fans are actually fans of the artist's ego, story or image instead of the music itself? If we took some popular songs and removed the vocal element, would fans still love the instrumental piece as much? Are we capable of being immersed in music enough to generate our own story, vision and escape through imagination, or do we require a voiced rendition of an artist's experience? Undeniably a vocalist can generate beautiful tones, moods and emotion with song, which is visible in cymatic response, but is it more powerful than the music can be itself?


(Stanford, 2014)

A great example of experimenting with cymatic frequencies in a contemporary, innovative concept is the above video from Nigel Stanford. With over fourteen million views on YouTube, it could be considered the most widely seen cymatic frequency experiment on the Internet. Utilising the Chladni plate, ferrofluid and other devices, Stanford presents a brilliant experiment showcasing the wonder of cymatic frequencies in real time. If other music videos followed this example, perhaps fans would be investigating science through artistic expression instead of embracing ego? That is an entirely different debate, but perhaps in being tangential I am merely expressing thoughts that occurred while exploring Pellegrino's and Stanford's work.


As we head towards the fourth week of this module, I have to confront the plausibility of actually being able to research the question I have set myself thoroughly. This is perhaps not due to an inability to choose a focal point for research, or not knowing where to look, but to the ever more apparent circumstance that there are limited peer-reviewed or published sources of a high enough calibre to merit a place in my annotated bibliography.

I am aware that I have expressed my thoughts on the lack of public awareness and the minority of academic research on cymatic frequency in general. However, even as a minority topic there are still plenty of viable sources on cymatics itself, just not on its integration into real-time, music-to-visual, synaesthesia-style performance. On this basis I feel inclined to broaden my research topic slightly, away from solely focusing on cymatic frequency as a modern, digital performance initiative. This would mean extending my reach to include real-time, sound-representative visuals based on the principles of cymatics, not necessarily inclusive of the frequency dynamic of Dr Hans Jenny's experiments.

Even though I am reluctant to make this change, due to my fascination with cymatics and my keen artistic intention to integrate them into my future music and visual hybrid work, I have to make a realistic decision which will positively impact my completion of this module. Having discussed likely topics with fellow classmates, many were also concerned at the lack of specific peer-reviewed, published sources that could form the backbone of a current and relevant research focal point within the discipline of music technology.

When compared to other artistic disciplines, the creative distinction and technological aspects of music are constantly evolving, perhaps at a rate that has not yet led to the abundance of expert research which other fields may have. That could be a foolish statement, but nonetheless, having discussed this with fellow classmates, we agreed that we need to be at the forefront of research in our discipline and continue with rigorous intent.

References:
Pellegrino, R.A. (1983). ‘Cymatic Music: Towards a Metatheory of Harmonic Phenomena: My Interactive Compositions and Environments’. Leonardo 16, 120–123. doi:10.2307/1574798
Stanford, N. (2014) ‘CYMATICS: Science Vs. Music – Nigel Stanford’. [Online]. Available at: https://youtu.be/Q3oItpVa9fs. [Accessed: 14 October 2017]

#Reflection

Reflection: KIMA, does this mean it can be done?

To reiterate the nature of my research, I have presented myself with this question:

Is cymatic visual sound representation feasible in a live techno music performance environment and will it have educational and therapeutic benefits?

Perhaps as I continue my research the very extent of this question may reduce to a finer focal point. To understand the actuality of cymatic frequencies being visualised in real time may be enough to focus on at this stage. The merit of extending this research to uncover the educational and therapeutic benefits could be considered an entirely separate avenue to explore on another occasion. As I begin to research scholarly sources specific to the cymatic phenomenon, it is becoming apparent that the complexity of this practice greatly exceeds my creative and technical experience.

One such example of this is KIMA, an interactive art installation that projects real-time visual representations of sound generated by users and performers alike (Gingrich et al., 2013). Having read this particular paper twice, I intend to read and refer back to it repeatedly to gain a thorough understanding of how they successfully made this concept a reality. No doubt there is a great level of thorough research, knowledge and expertise at the forefront of this project, which I find astounding. If anyone asked me how this concept works, I honestly would not be able to explain it beyond a vague summary.


(Emets, 2015)

Attempting to make sense of KIMA's dense, technical language is currently proving challenging. Compared to my typically 'organic' approach to visual work, it seems I have been working in a dated style, or perhaps just a different style. Nonetheless, my desire to understand the very nature of how KIMA functions has led me on a lengthy, tangential deviation.

I commenced writing this post at around 2 pm today, Sunday, breaking away to take a reflective walk in the late afternoon sunshine to refresh and refocus. I returned and read the paper again, out loud, attempting to decrypt the terminology and understand its context. I think they used programming in Max/MSP to template certain cymatic shapes triggered by definitive sound types? Maybe? I am not certain.

I have only been to one programming lecture so far, and that seemed beyond my coding capabilities. It is now 11:15 pm, and having researched Max/MSP and Max for Live, initially thinking they were the same thing, I find they share the same environment (Max for Live, it seems, embeds Max within Ableton Live) and yet serve different purposes. My confusion is enhanced. At this moment I feel behind the times with my technological expertise and awareness, which is daunting and somewhat troubling. In summary, I have established that visual representation of cymatic frequencies can be done in a real-time performance setting, but whether it could be done by me in the future is another matter. Such practice will require complex skill development in programming and mathematics; for now, I will continue researching other examples and its general feasibility.

References:
Gingrich, O., Renaud, A., Emets, E. (2013) ‘KIMA —A Holographic Telepresence Environment Based on Cymatic Principles’. Leonardo 46, 332–343.
Emets, E. (2015) ‘KIMA by Analema Group and Union Chapel Organ Project’. [Online]. Available at: https://youtu.be/PatNkvxByzc. [Accessed: 13 October 2017]

#Reflection