
Latent Frames

Realtime Generative Personalization

Machine learning, social media, LED display, locative media
2020

Computational technologies reframe the way we understand and experience our surrounding spaces, constantly proposing new schemes for communication and interaction. The Internet of Things suggests a range of methods for upgrading physical objects into augmented digital devices that offer new functions and choreographies for navigating our hybrid environments.

This work demonstrates a series of wall-mounted media frames that respond only to their owners. Using geolocation awareness (built with a mobile-phone tracker), each frame knows its owner's distance from the spot where it hangs. When that distance falls below a set threshold, the frame scrapes the latest images from its owner's social media accounts. The extracted images are then fed into a StyleGAN machine learning model, and the resulting visuals are displayed on each frame independently.
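The loop described above (check owner proximity, scrape recent posts, generate with StyleGAN, display) can be sketched in a few lines of Python. The sketch below is illustrative only: the frame coordinates, proximity threshold, and all helper functions (the phone-tracker reading, the social media scraper, the StyleGAN call, and the LED display driver) are hypothetical placeholders, not the project's actual code, and the distance check is assumed to be a simple great-circle calculation.

```python
import math
import time

# Hypothetical frame configuration: the frame's fixed position and the
# proximity threshold (in metres) within which the owner counts as "near".
FRAME_LAT, FRAME_LON = 40.6401, 22.9444
PROXIMITY_THRESHOLD_M = 50.0


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


def get_owner_location():
    """Placeholder for the mobile-phone tracker feed; returns (lat, lon)."""
    return 40.6403, 22.9446  # stubbed reading for illustration


def scrape_latest_images(account):
    """Placeholder for pulling the owner's most recent social media images."""
    return ["post_001.jpg", "post_002.jpg"]  # stubbed file list


def stylegan_generate(images):
    """Placeholder for feeding the scraped images to a StyleGAN model
    and returning a generated visual to display."""
    return f"latent_blend_of_{len(images)}_images.png"


def display_on_frame(visual):
    """Placeholder for pushing the generated visual to the LED display."""
    print("Displaying:", visual)


def run_frame_loop(account, poll_seconds=30):
    """Main loop: poll the owner's location and regenerate when nearby."""
    while True:
        lat, lon = get_owner_location()
        distance = haversine_m(FRAME_LAT, FRAME_LON, lat, lon)
        if distance < PROXIMITY_THRESHOLD_M:
            images = scrape_latest_images(account)
            visual = stylegan_generate(images)
            display_on_frame(visual)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_frame_loop("owner_account")
```

Each frame would run its own instance of this loop with its owner's account and its own coordinates, which is what allows the frames to respond independently.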

The main objective of this case study is to consider how A.I. aesthetics can be used to create dynamic environments that blend the physical and the virtual, and to contemplate how personalization is framed through creative applications of computational media.