University of Applied Sciences and Arts Northwestern Switzerland, CHE
Video 3'30"
Keywords: Machine learning, p5.js, Creative coding, Generative art
This project explores the use of artificial intelligence to create video material and the control of images by sound. It is a music video that shows a journey from the countryside into a city. It aims to convey the emotions of someone travelling through landscapes and cityscapes, staring out of the window, maybe listening to music, daydreaming, while the world passes by.

The images are generated in RunwayML with four different StyleGAN models I trained on datasets of selected found footage of landscapes, highways, cities and city streets. Since each image is generated from an array of numbers (a latent vector), I was able to control the images with code in the p5.js web editor. The code maps the tempo of the sound (bpm) to the image changes, so that new target images appear in time with the beats, for example every 2 seconds (one four-beat bar) at 120 bpm. How quickly the image morphs between these target images is controlled by an "animation factor" in the code, which let me give the movements a kind of rhythm.

With this code I generated and downloaded thousands of images from the different models and with different animation factors, then assembled them in After Effects into several sequences that I exported as videos. The last step was to cut these video snippets together and arrange them to the sound.
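The beat-synchronized interpolation described above can be illustrated with a minimal p5.js sketch. This is only a sketch of the idea, not the project's actual code: the bpm, latentSize and animationFactor values are assumptions, newLatent() is a hypothetical helper, and the request to a StyleGAN model hosted in RunwayML is only indicated as a comment (as a self-contained stand-in, the sketch visualizes the latent vector as grayscale bars).

```javascript
// Minimal sketch: pick a new target latent vector on a beat grid derived
// from the bpm, and ease the current vector toward it each frame.

const bpm = 120;               // assumed tempo of the track
const beatsPerTarget = 4;      // new target every bar: 4 beats at 120 bpm = 2 s
const latentSize = 512;        // assumed StyleGAN latent vector length
const animationFactor = 0.05;  // how fast the current vector eases toward the target

let current = [];              // latent vector currently shown
let target = [];               // latent vector we are moving toward
let lastSwitch = 0;            // time (ms) when the target was last replaced

const intervalMs = (60000 / bpm) * beatsPerTarget;  // 2000 ms in this example

function setup() {
  createCanvas(512, 512);
  current = newLatent();
  target = newLatent();
}

function draw() {
  // replace the target vector in time with the beat grid
  if (millis() - lastSwitch >= intervalMs) {
    target = newLatent();
    lastSwitch = millis();
  }

  // ease every component toward the target; a larger animationFactor
  // gives faster, more nervous motion, a smaller one a slow drift
  for (let i = 0; i < latentSize; i++) {
    current[i] = lerp(current[i], target[i], animationFactor);
  }

  // In the project, `current` would be sent to a StyleGAN model hosted in
  // RunwayML and the returned image drawn and saved here. As a stand-in,
  // draw the vector as vertical grayscale bars.
  background(0);
  noStroke();
  const w = width / latentSize;
  for (let i = 0; i < latentSize; i++) {
    fill(map(current[i], -1, 1, 0, 255));
    rect(i * w, 0, w, height);
  }
}

// hypothetical helper: random latent vector with values in [-1, 1]
function newLatent() {
  return Array.from({ length: latentSize }, () => random(-1, 1));
}
```

With this kind of easing, the animation factor sets how quickly the image settles on each new target: values near 1 snap the image to every beat, while small values produce the slow, daydream-like drift described above.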
Sound design: Joshua Stofer. Mentored by: Ludwig Zeller.
The Future Vision Online Exhibition was available from 11 February to 11 March 2021.
What you find here is a fleeting glimpse of the 21 works – or, put better, 21 future visions – coming from artists, designers, researchers, professionals, and students who explored complex questions from a critical and creative perspective.
We extend our warmest thanks to everyone who participated in the PCD21.