Reflection: KIMA, does this mean it can be done?

To reiterate the nature of my research, I have presented myself with this question:

Is cymatic visual sound representation feasible in a live techno music performance environment, and will it have educational and therapeutic benefits?

Perhaps as I continue my research this question will narrow to a finer focal point. Understanding whether cymatic frequencies can actually be visualised in real time may be enough to focus on at this stage; the merit of extending this research to uncover the educational and therapeutic benefits could be an entirely separate avenue to explore on another occasion. As I begin to research scholarly sources specific to the cymatic phenomenon, it is becoming apparent that the complexity of this practice greatly exceeds my creative and technical experience.

One such example is KIMA, an interactive art installation that projects real-time visualisations of sound generated by users and performers alike (Gingrich et al., 2013). Having read this particular paper twice, I intend to return to it repeatedly to gain a thorough understanding of how the team successfully made this concept a reality. There is no doubt a great deal of thorough research, knowledge and expertise behind this project, which I find astounding. If anyone asked me how the concept works, I honestly could not explain it in any more detail than a vague summary.


[Embedded video: KIMA by Analema Group and Union Chapel Organ Project (Emets, 2015)]

Attempting to make sense of KIMA’s dense, technical language is currently proving challenging. Compared with my typically ‘organic’ approach to visual work, it seems I have been working in a dated style, or perhaps just a different one. Nonetheless, my desire to understand the very nature of how KIMA functions has led me on a lengthy, tangential deviation.

I commenced writing this post at around 2pm today, Sunday, breaking away in the late afternoon to take a reflective walk in the sunshine to refresh and refocus. I returned and read the paper again, out loud, attempting to decrypt the terminology and understand its context. I think they used programming in Max/MSP to template certain cymatic shapes, triggered by particular sound types? Maybe? I am not certain.
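To test whether I have grasped even the gist, I tried sketching the idea for myself, in Python rather than Max/MSP, since I cannot yet patch in Max. This is purely my own guess at the general mechanism, not KIMA’s actual implementation: estimate the dominant frequency of an incoming audio buffer, map it to a vibration mode of a square plate, and draw the classic Chladni nodal pattern for that mode. The frequency-to-mode mapping and all the names here (`dominant_frequency`, `chladni_pattern` and so on) are my own placeholder assumptions.

```python
import numpy as np

SAMPLE_RATE = 44100  # Hz; assumed sample rate for the incoming audio


def dominant_frequency(audio_buffer, sample_rate=SAMPLE_RATE):
    """Return the strongest frequency (in Hz) in a mono audio buffer."""
    spectrum = np.abs(np.fft.rfft(audio_buffer))
    freqs = np.fft.rfftfreq(len(audio_buffer), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]


def frequency_to_mode(freq, low=20.0, high=2000.0, max_mode=8):
    """Crudely map a frequency onto a pair of plate mode numbers (m, n).

    This mapping is my own placeholder; the paper does not spell out
    how KIMA relates sound types to shapes.
    """
    t = np.clip((freq - low) / (high - low), 0.0, 1.0)
    m = 1 + int(t * (max_mode - 1))
    return m, m + 1  # m != n, otherwise the pattern cancels to nothing


def chladni_pattern(m, n, size=256):
    """Intensity image of the nodal lines for a square plate's (m, n) mode.

    Uses the textbook approximation: nodal lines are where
    cos(n*pi*x)*cos(m*pi*y) - cos(m*pi*x)*cos(n*pi*y) = 0.
    """
    x = np.linspace(0.0, 1.0, size)
    X, Y = np.meshgrid(x, x)
    field = (np.cos(n * np.pi * X) * np.cos(m * np.pi * Y)
             - np.cos(m * np.pi * X) * np.cos(n * np.pi * Y))
    # Sand collects where the plate barely vibrates, i.e. near zero
    # crossings, so brighten pixels where the field is close to zero.
    return np.exp(-50.0 * field ** 2)


if __name__ == "__main__":
    # Stand-in for one buffer of live input: a 440 Hz sine instead of a mic.
    t = np.arange(2048) / SAMPLE_RATE
    buffer = np.sin(2 * np.pi * 440.0 * t)
    freq = dominant_frequency(buffer)
    m, n = frequency_to_mode(freq)
    image = chladni_pattern(m, n)
    print(f"{freq:.0f} Hz -> plate mode ({m}, {n}), image {image.shape}")
```

In a live setting this would presumably run continuously on small buffers from a microphone or mixer feed, redrawing the pattern each frame; I imagine KIMA’s Max/MSP patch performs an analogous analysis with far more sophistication.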

I have only been to one programming lecture so far, and even that seemed beyond my coding capabilities. It is now 11:15pm. Having researched Max/MSP and Max for Live, initially thinking they were the same thing, I gather that Max for Live is essentially Max/MSP embedded within Ableton Live, so they are closely related but not quite identical; my confusion has only partly lifted. At this moment I feel behind the times in my technological expertise and awareness, which is daunting and somewhat troubling. In summary, I have established that visual representation of cymatic frequencies can be done in a real-time performance setting, but whether it could be done by me in the future is another matter. Such a practice will require complex skill development in programming and mathematics; for now I will continue researching other examples and its general feasibility.

References:
Gingrich, O., Renaud, A. and Emets, E. (2013) ‘KIMA: A Holographic Telepresence Environment Based on Cymatic Principles’, Leonardo, 46(4), pp. 332–343.
Emets, E. (2015) KIMA by Analema Group and Union Chapel Organ Project [Online video]. Available at: https://youtu.be/PatNkvxByzc (Accessed: 13 October 2017).
#Reflection

 
