AUTOCONSTRUCCION
In this digital artwork, we explore the creation of collaborative imaginary 3D scenarios through machine learning. We utilize large datasets of images to visually synthesize a global phenomenon. To avoid internet biases and reflect the reality of peripheral areas and major cities, we create a customized dataset that represents the dreams and desires of non-architects. This results in a unique form of digital hybridization.
Self-built construction has been evolving for decades and, driven by economic crises, real estate speculation, and the growing resource pressures caused by climate change, has proven to be one of the most feasible ways for the majority of Earth's citizens to secure a roof over their heads and a livable space. As a result, this phenomenon has expanded beyond marginalized neighborhoods and countries considered to be developing.
This phenomenon can now be explored like never before through the vast eyes of Google Maps. A significant part of this research has involved understanding the evolving "styles" that are emerging, styles that accumulate complex reinterpretations of different architectural traditions.
By utilizing StyleGAN as the primary pattern perceptor, we have created various typologies through this sophisticated data visualization technique.
This chapter explores the need to expand beyond the limits of megacities, even in today's era, into areas where nature remains relatively untouched. The visual elements consist of screens that display images generated by machine learning, inspired by the informal styles observed in self-built architectures in peripheral areas, categorized into different typologies.
In the analysis of the images, the landscape around the buildings is mostly land that still has trees, near roads, and very frequently at the foot of large hills and mountains, in places where natural rivers still run and where the urban layout begins intuitively and, in the best of cases, with collective planning.
Through this selection of videos, a synthesis of these styles is proposed, evoking the multitude of possible facades that could emerge in these scenarios yet to be built, configuring the urbanism of the future.
This chapter explores the concept of the floating population, individuals who do not have a fixed abode and may transiently inhabit different cities. The cityscape is constructed using results generated by the AI, which selects the most compelling facades that hybridize informal architectural styles from various countries. The empty character of the cityscape reflects the reality of many peripheral cities that are deserted during the day.
The floating population arises from the evolution of cities in response to urbanization phenomena, metropolization, and the emergence of large marginalized populations that lack access to decent housing and urban services. In recent times, the term also encompasses individuals who temporarily share a city as a result of globalization and information technology incorporation, without being permanent residents.
This scene is experienced through an avatar that navigates a tower-like structure, which accumulates urban gestures and facades from diverse Latin American locations, resulting in a portrayal of a slum-like environment with indeterminate construction. This scene serves as a transitional moment leading to the final scene.
As a concluding reflection, this chapter presents a study created within the context of these unfinished works, where the authors' exploration is intertwined with technological waste. The space reflects deeply on the concept of creation in different parts of the world where technology is not only not produced but rather is re-appropriated by its users.
It emphasizes the fight against obsolescence and the search for unique aesthetics that do not romanticize precariousness, but approach it from a hacker and biohacker perspective, exploring the limitless possibilities of human creativity expanded through technology.
Live Cinema Coding (Procedural animation)
The development uses live coding to sequence cinematic shots, transform the 3D scene, animate actors, control lighting, process video, and more. We draw on the efficiency and possibilities of the Tidal Cycles pattern system to create sound-visual synesthesia and musical visualization by refactoring musical patterns into visual patterns.
The solution is based on a server developed in Processing (Java) that acts as a bridge between the algorithmic music platform Tidal Cycles and the Unreal Engine video game engine.
We also created a dictionary of terms to visualize musical composition in 3D gaming environments through live coding, allowing us to hybridize the cinematographic and virtual reality languages. The result is an audiovisual synesthesia and a control over the moments and intentions of the work that is both written and generative.
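The authors' bridge is a Processing (Java) server; as a minimal illustration of the same idea, the sketch below hand-encodes OSC messages (per the OpenSoundControl 1.0 format: NUL-terminated, 4-byte-aligned strings and big-endian arguments) and forwards them over UDP. The addresses, port, and parameter names here are assumptions for illustration, not the work's actual protocol.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode_osc(address: str, *args) -> bytes:
    """Build a big-endian OSC message supporting ints, floats, and strings."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):           # bool is an int subclass; reject it
            raise TypeError("booleans not supported in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, str):
            tags += "s"
            payload += osc_string(a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return osc_string(address) + osc_string(tags) + payload

def forward_to_unreal(address: str, *args,
                      host: str = "127.0.0.1", port: int = 8000) -> None:
    """Send one OSC message toward the game engine (host/port are assumptions)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(encode_osc(address, *args), (host, port))
```

In the actual setup, the Processing server would listen for Tidal Cycles' OSC output and re-emit messages of this kind to an OSC listener inside Unreal Engine.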
Guiding or driving animation and cinematic narratives with musical algorithms involves ordering the flow and temporality of the images according to the properties of the music, by which we mean the structures of the musical composition. This methodology of visualizing music from data and generative digital processes makes it possible to generate visual variations linked to the disaggregated structure of the composition, and consequently to guide the eye toward, or emphasize, isolated sections of the musical work.
This visualization, based on the values of the musical algorithms at runtime, makes it possible to automate visual changes for specific parts of the musical work: changes in melodic contour, a single timbre or instrument, or only certain rhythmic accents. These subtleties visualize the pre-processed sound, which is very different from audio-reactive visualization of the post-processed sound's intensity or frequency.
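As a sketch of this distinction, a runtime pattern event (already structured into instrument, pitch, and gain fields, in the style of Tidal's event data) can be routed to visual controls by its structure rather than by analyzing the rendered audio. All field names and visual parameters below are hypothetical, not the work's actual dictionary of terms.

```python
def visual_params(event: dict) -> dict:
    """Map one structured musical event to visual controls.

    Routes by timbre/instrument and by accent, so each visual parameter
    responds to a specific part of the composition's structure.
    (Illustrative mapping only; names are assumptions.)
    """
    params = {}
    if event.get("s") == "bd":            # kick drum -> camera shake
        params["camera_shake"] = 1.0
    if event.get("s") == "arp":           # melodic line -> facade hue from pitch class
        params["facade_hue"] = (event.get("n", 0) % 12) / 12.0
    if event.get("gain", 1.0) > 1.2:      # accented notes -> lighting pulse
        params["light_pulse"] = event["gain"]
    return params
```

Because the mapping reads the composition's data before it becomes sound, a single instrument or a single accent can own its own visual channel, which an FFT of the mixed audio signal could not cleanly separate.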
"TidalCycles is a live coding environment which is designed for musical improvisation and composition. In particular, it is a domain-specific language embedded in Haskell, and is focused on the generation and manipulation of audiovisual patterns". [Wikipedia] [official webpage]
The following are some of the functions written in Haskell, added to Tidal Cycles to manipulate Unreal Engine game environments through live coding.
Computational infrastructures have a direct impact on the experiences and narratives of digital works. In cases where image and sound are essential, selecting the appropriate technological infrastructure ensures that technical artifice virtually disappears and the focus on the work is deep, emotional, and without distractions. In works based on program execution and therefore generated in real-time, infrastructure becomes even more relevant, as it defines the scope in processing visual fiction (2D or 3D) and the complexity of sound algorithms for composition and improvisation.
Autoconstrucción utilizes the processing capacity of the Unreal Engine gaming platform, which relies heavily on the video card's GPU to compile and process shaders. Additionally, we use platforms such as Tidal Cycles, SuperCollider, and Ableton Live 7 for the live interpretation of electronic music. For this reason, we decided to distribute the computational load across two computers synchronized over a local network, communicating via the OSC (Open Sound Control) protocol.
The diagram simplifies the infrastructure used for the concert:
1 Computer running the video game platform, primarily for real-time visuals (an algorithmic-music-driven video game). It also contributes some incidental sounds using Unreal Engine's MetaSounds system for sound physics and interactive sound design.
2 Computer where the live music programming takes place, which serves as the score guiding the cinematic dynamics and the visual manipulation on the other computer.
3–4 OSC connection via a router on a local network dedicated exclusively to the concert.
5 Multi-screen device to mix the video signals from both computers and allow projection of the audiovisual narrative together with its algorithmic composition (live coding).
6 Audio mixer to output the sounds from both computers.
7 Projection or large-format LED screen to reinforce the reference or theme: architecture.
8 Full-range audio system (PA), tailored to the concert hall, auditorium, or venue.