I've built an experience where the image gets distorted with the beats of the sound, creating a wave effect that changes with the frequencies of the audio. It gives the user the feeling of swimming, which inspired me to create an underwater swimming experience to end the show.
I've used the p5.sound library to read and analyse the audio data, and created a simple sine wave in the fragment shader, using the bass frequencies of the audio track to control its frequency and the mid frequencies to control its amplitude.
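For anyone curious how the pieces fit together, here's a simplified version of the idea. The file names, uniform names and mapping ranges below are placeholders rather than my exact code:

```javascript
// A simplified version of the idea - the file names, uniform names and
// mapping ranges here are placeholders, not my exact code.
let song, fft, img, waveShader;

// standard p5.js full-canvas vertex shader
const vert = `
precision highp float;
attribute vec3 aPosition;
attribute vec2 aTexCoord;
varying vec2 vTexCoord;
void main() {
  vTexCoord = aTexCoord;
  vec4 pos = vec4(aPosition, 1.0);
  pos.xy = pos.xy * 2.0 - 1.0;   // stretch the quad across the whole canvas
  gl_Position = pos;
}`;

// fragment shader: offset the texture lookup with a sine wave
const frag = `
precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uTexture;
uniform float uTime;
uniform float uFrequency;   // driven by the bass energy
uniform float uAmplitude;   // driven by the mid energy
void main() {
  vec2 uv = vTexCoord;
  uv.y = 1.0 - uv.y;          // flip so the image isn't upside down
  uv.x += sin(uv.y * uFrequency + uTime) * uAmplitude;
  gl_FragColor = texture2D(uTexture, uv);
}`;

function preload() {
  song = loadSound('assets/track.mp3');   // placeholder file names
  img = loadImage('assets/ocean.jpg');
}

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);
  noStroke();
  waveShader = createShader(vert, frag);
  fft = new p5.FFT();
}

function mousePressed() {
  if (!song.isPlaying()) song.loop();     // browsers need a user gesture for audio
}

function draw() {
  fft.analyze();
  const bass = fft.getEnergy('bass');     // 0-255
  const mid = fft.getEnergy('mid');       // 0-255

  shader(waveShader);
  waveShader.setUniform('uTexture', img);
  waveShader.setUniform('uTime', millis() / 1000.0);
  waveShader.setUniform('uFrequency', map(bass, 0, 255, 2.0, 25.0));
  waveShader.setUniform('uAmplitude', map(mid, 0, 255, 0.0, 0.08));
  rect(0, 0, width, height);              // the shader fills the whole canvas
}
```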
In the second sketch, I've created an underwater swimming experience which starts by clicking the button "let's go deep!". I've used poseNet to identify the ears, eyes and nose so I can place swimming goggles on the face, and created bubbles that move using the functions bubbles() and moveLoopY().
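Here's a stripped-down version of how that fits together. The goggle drawing and the bubble numbers are simplified placeholders, not my exact code:

```javascript
// A stripped-down version of the idea - the goggle drawing and the
// bubble numbers are simplified placeholders, not my exact code.
// Needs the ml5.js library loaded in index.html for poseNet.
let video, poseNet, pose;
let bubbleX = [];
let bubbleY = [];
const numBubbles = 20;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  poseNet = ml5.poseNet(video, () => console.log('poseNet ready'));
  poseNet.on('pose', gotPoses);
  for (let i = 0; i < numBubbles; i++) {
    bubbleX[i] = random(width);
    bubbleY[i] = random(height);
  }
}

function gotPoses(poses) {
  if (poses.length > 0) pose = poses[0].pose;   // keep the latest detected pose
}

function draw() {
  image(video, 0, 0, width, height);
  if (pose) {
    // scale the goggles with the ear-to-ear distance and
    // centre one lens on each eye (the nose could anchor a snorkel)
    const d = dist(pose.leftEar.x, pose.leftEar.y,
                   pose.rightEar.x, pose.rightEar.y);
    noFill();
    stroke(0);
    strokeWeight(4);
    ellipse(pose.leftEye.x, pose.leftEye.y, d * 0.4);
    ellipse(pose.rightEye.x, pose.rightEye.y, d * 0.4);
    line(pose.leftEye.x, pose.leftEye.y, pose.rightEye.x, pose.rightEye.y);
  }
  bubbles();
}

// draw every bubble and drift it upwards
function bubbles() {
  noStroke();
  fill(255, 255, 255, 120);
  for (let i = 0; i < numBubbles; i++) {
    circle(bubbleX[i], bubbleY[i], 15);
    bubbleY[i] = moveLoopY(bubbleY[i], 2);
  }
}

// move a y position upwards and wrap it back to the bottom
function moveLoopY(y, speed) {
  y -= speed;
  if (y < 0) y = height;
  return y;
}
```

In the actual sketch this only starts after the "let's go deep!" button is clicked, which can be wired up with createButton() and its mousePressed() callback.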
It's unbelievable, looking back from week 1 to week 12, that I was able to create this. I actually asked Ellen in one of the sessions what I would end up creating, and she said I would know soon. I'm very happy with the outcome and I'd like to take this opportunity to thank Ellen for making it happen. All credit for my project goes to Ellen.
I struggled to combine my sketches into one seamless experience, because the shader fills the entire expanse of the sketch in WEBGL mode, which made it hard to layer the rest of the drawing on top.
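One direction I might explore later (I didn't get it working in time for the show) is rendering the shader into an offscreen WEBGL buffer with createGraphics() and drawing that buffer as an image underneath the 2D elements. A rough sketch, reusing the vert and frag strings from the wave sketch above:

```javascript
// Not something I got working for the show - just one possible approach.
// Reuses the vert and frag strings from the wave sketch above.
let shaderLayer, waveShader, img;

function preload() {
  img = loadImage('assets/ocean.jpg');   // placeholder file name
}

function setup() {
  createCanvas(windowWidth, windowHeight);            // main canvas stays in 2D mode
  shaderLayer = createGraphics(width, height, WEBGL); // offscreen WEBGL buffer
  shaderLayer.noStroke();
  waveShader = shaderLayer.createShader(vert, frag);
}

function draw() {
  shaderLayer.shader(waveShader);
  waveShader.setUniform('uTexture', img);
  waveShader.setUniform('uTime', millis() / 1000.0);
  waveShader.setUniform('uFrequency', 10.0);    // fixed values just for this demo
  waveShader.setUniform('uAmplitude', 0.05);
  shaderLayer.rect(0, 0, width, height);        // the shader only fills the buffer
  image(shaderLayer, 0, 0);                     // layer the buffer into the 2D sketch
  // the goggles and bubbles could then be drawn on top in normal 2D mode
}
```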
Here are the links to my sketches:
- Wave effect | audio-based image distortion: https://editor.p5js.org/ShauryaSeth/sketches/fgt81f8kx
- Underwater swimming experience: https://editor.p5js.org/ShauryaSeth/sketches/QTkfqk2w4
Going forward I will keep using coding in my projects, as it allows me to experiment with and combine colour, video and sound, poseNet, and much more.
User experience links: IMG_6805, IMG_6809
Credits:
1. p5.js
2. p5.sound
3. Neundex
4. Sound by Freesound
5. Images by Pixabay