MALITZIN CORTES (CNDSD) + IVAN ABREU

AUTOCONSTRUCCION

2022-2024
AV CONCERT / SCREENING / VIDEO INSTALLATION
LIVE CODED AUDIOVISUAL CONCERT AND VIDEO INSTALLATION, ALGORITHMIC-MUSIC-DRIVEN VIDEO GAME ANIMATION AND MACHINE LEARNING.
Video.1 - 00:03:30 | Concept, live acts and synthesis of the process.
Video.2 - 00:26:14 | Screening (the live cinema coding is executed in real time).
AUTOCONSTRUCCION is a live-coded audiovisual concert and a video game animation executed by algorithms in real time. Through fictions of speculative architecture, the AV concert narrates the phenomenon of informal housing in Mexico, the United States, Latin America, Asia, India, and some European peripheries. We are interested in the ability of writing and live editing (coding) to enunciate, create, and tell audiovisual stories in a liquid and granular way. Self-construction represents the most realistic option for the majority of the popular classes that inhabit megacities: it defies the rigid limitations imposed by traditional architecture, real estate speculation, and economic crisis, transforming living spaces into a constant work in progress and giving rise to new forms of self-expression where flexibility, informality, and pragmatism reign, reflecting the true essence of humanity.
PROCESS
Trainings in search of the essence of informal styles around the world
image
Data set (extract). A collection of 5,000 images compiled by the project's architects and assistants.
Video.3 - 00:01:04 | Obra negra en bruto I (raw unfinished construction I). AI-generated latent space of "non-architect" typologies of houses in Edomex, Toluca, Mexico City, Tijuana, and Bogotá.
Video.4 - 00:00:47 | ITERATIONS OF AN UNFINISHED SYSTEM. NFTs, prints, and GANs around a specific training: a fusion of obra negra (raw, unfinished construction) from different geographies and intermediate finishes in Mexico City and Tijuana.

In this digital artwork, we explore the creation of collaborative imaginary 3D scenarios through machine learning. We utilize large datasets of images to visually synthesize a global phenomenon. To avoid internet biases and reflect the reality of peripheral areas and major cities, we create a customized dataset that represents the dreams and desires of non-architects. This results in a unique form of digital hybridization.

Self-taught construction has been evolving for decades and has proven to be one of the most feasible ways for the majority of Earth's citizens to have a roof over their heads and a livable space. This is due to economic crises, real estate speculation, and the growing resource problems caused by climate change. As a result, this phenomenon has expanded beyond marginalized neighborhoods and countries considered to be developing.

This phenomenon can now be explored like never before through the vast eyes of Google Maps. A significant part of this research has involved understanding the evolving "styles" that are emerging and accumulating complex interpretations of different architectural styles.

By utilizing StyleGAN as the primary pattern recognizer, we have created various typologies through this sophisticated data visualization technique.

image
Volume transformation process (3D models) based on the houses proposed by analysis with artificial intelligence
Award, Support, Residency & Festival Selection
image
Selected and supported by the SONART Shanghai Festival 2023, "A Coder and Violin," by Yao Dajuin, Curator and Director of the Futurology Center, Open Media Lab, China Academy of Art.
image
Selected and supported by ICLC 2023, the International Conference on Live Coding, organized by Creative Coding Utrecht.
image
2023 Gold Award Winner, The Lumen Prize
image
Selected and supported by MediaLab Matadero, Madrid, for Laboratory #2, The Metabolic Sublime (2022), on energy sovereignty, material circularity, and ecosystemic governance.
image
Selected and supported by MUTEK-MX 2022, International Festival of Digital Creativity and Electronic Music, by Curator and Festival Director Damian Romero.
image
Residency and support from ZKM | Center for Art and Media through the On-the-fly program, which promotes live coding, a performative technique focused on writing algorithms in real time. Screening in the sound dome, The Cube, Karlsruhe, Germany, 2022.
Summary (Award, Support, Residency & Festival Selection)
- SONART, A Coder And Violin. SHANGHAI (CHN) 2023. AV Concert
- 2023 Gold Award Winner, The Lumen Prize
- Media Architecture Biennale. TORONTO (CA) 2023. AV Concert
- ICLC, International Conference on Live Coding, Creative Coding Utrecht (NL) 2023. AV Concert
- CutOut Fest 2022, International Animation and Digital Arts Festival, Arte Abierto Gallery, Mexico City (MX) 2023. AV Concert
- Medialab Matadero. MADRID (ES) 2023. AV Concert
- PIKSEL Festival | 20th ANNIVERSARY. Bergen (NO) 2022. AV Concert
- MUTEK MX. Mexico City (MX) 2022. AV Concert
- ZKM, On The Fly. Zentrum für Kunst und Medien. Karlsruhe (DE) 2022. Screening in the sound dome, The Cube
Episodes
The audiovisual piece is divided into four chapters that are performed live and exhibit variations due to the algorithmic and generative nature of the musical composition, which controls the entire experience.
EPISODE 1
Informal Style
image

This chapter explores the need to expand beyond the limits of megacities, even in today's era, into areas where nature remains relatively untouched. The visual elements consist of screens that display images generated by machine learning, inspired by the informal styles observed in self-built architectures in peripheral areas, categorized into different typologies.

In the analysis of the images, the landscape around the buildings is mostly undeveloped land that still has trees, near roads and very frequently at the foot of large hills and mountains, in places with natural rivers where the urban layout begins intuitively and, in the best of cases, with collective planning.

Through this selection of videos, a synthesis of these styles is proposed, evoking the multitude of possible facades that could emerge in these scenarios yet to be built, configuring the urbanism of the future.

EPISODE 2
Floating City - Floating Population
image

This chapter explores the concept of the floating population, individuals who do not have a fixed abode and may transiently inhabit different cities. The cityscape is constructed using results generated by the AI, which selects the most compelling facades that hybridize informal architectural styles from various countries. The empty character of the cityscape reflects the reality of many peripheral cities that are deserted during the day.

The floating population arises from the evolution of cities in response to urbanization phenomena, metropolization, and the emergence of large marginalized populations that lack access to decent housing and urban services. In recent times, the term also encompasses individuals who temporarily share a city as a result of globalization and information technology incorporation, without being permanent residents.

EPISODE 3
Agglomeration
image

This scene is experienced through an avatar that navigates a tower-like structure, which accumulates urban gestures and facades from diverse Latin American locations, resulting in a portrayal of a slum-like environment with indeterminate construction. This scene serves as a transitional moment leading to the final scene.

EPISODE 4
Anti-Obsolescence Shelter
image

As a concluding reflection, this chapter presents a study created within the context of these unfinished works, where the authors' exploration is intertwined with technological waste. The space reflects deeply on the concept of creation in parts of the world where technology is not produced but rather re-appropriated by its users.

It emphasizes the fight against obsolescence and the search for unique aesthetics that do not romanticize precariousness, but approach it from a hacker and biohacker perspective, exploring the limitless possibilities of human creativity expanded through technology.

Technical Process. Development of Tools
GITHUB REPOSITORY

Live Cinema Coding (Procedural animation)

Development uses live coding to sequence cinematic shots, transform a 3D scene, animate actors, control lighting, process video, etc. We use the efficiency and possibilities of the Tidal Cycles pattern system to create sound-visual synesthesia and musical visualization by refactoring musical patterns into visual patterns.

The solution is based on a Processing (Java) server that acts as a bridge between the algorithmic music platform Tidal Cycles and the Unreal Engine video game engine.

We also created a dictionary of terms to visualize musical composition within 3D game environments through live coding, allowing us to hybridize cinematographic and virtual-reality language and obtaining, as a result, an audiovisual synesthesia and control over the moments and intentions of the work with a written and also generative character.
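The messages crossing such a bridge travel as OSC datagrams. As an illustration of the wire format involved (the address `/camera` and the port are hypothetical examples, not the project's actual message names), a minimal pure-Python OSC 1.0 message encoder:

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message supporting int, float, and string arguments."""
    tags = ","                       # type-tag string always starts with ','
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # int32, big-endian
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # float32, big-endian
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC type: {type(a)}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Sending over UDP to the other machine (IP and port are illustrative):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/camera", 8), ("192.168.1.20", 7000))
```

Any OSC-capable endpoint (such as the bridge server described above) can decode packets built this way.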

Algorithmic-music-driven video game animation: Tidal Cycles patterns for VR and visual storytelling, and new functions in Haskell.

Guiding or driving animation and cinematic narratives with musical algorithms means ordering the flow and temporality of images according to the properties of the music, that is, the structures of the musical composition. This methodology of visualizing music from data and generative digital processes generates visual variations linked to the disaggregated structure of the composition, and consequently makes it possible to guide the eye toward, or emphasize, isolated sections of the musical work.

This visualization, based on the values of the musical algorithms at runtime, makes it possible to automate visual changes for specific parts of the musical work, such as changes in melodic contour, a certain timbre or instrument, or only certain rhythmic accents. These subtleties allow visualization of the pre-processed sound, which is very different from audio-reactive visualization of the post-processed sound based on intensity or frequency.
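The distinction can be sketched in a few lines. The following toy function (its names and data shapes are hypothetical, not the project's actual code) derives camera cues from symbolic score events, firing only on harmonic change rather than reacting to audio amplitude:

```python
def camera_cues(events, chord_to_camera):
    """Turn symbolic note events (pre-processed score data) into camera cues.

    `events` is a list of (cycle_position, chord) pairs; a cue fires only
    when the chord changes, not on every onset and not on audio loudness.
    """
    cues, last_chord = [], None
    for t, chord in events:
        if chord != last_chord:          # trigger only on harmonic change
            cues.append((t, chord_to_camera[chord]))
            last_chord = chord
    return cues

# Example: two alternating chords mapped to cameras 8 and 2
score = [(0.0, "c4'min"), (0.25, "c4'min"), (0.5, "cs5'min"), (0.75, "cs5'min")]
print(camera_cues(score, {"c4'min": 8, "cs5'min": 2}))  # [(0.0, 8), (0.5, 2)]
```

Because the cues come from the score rather than the audio signal, the visuals can follow a single instrument or accent even when it is masked in the final mix.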

Tidal Cycles patterns for VR and visual storytelling

"TidalCycles is a live coding environment which is designed for musical improvisation and composition. In particular, it is a domain-specific language embedded in Haskell, and is focused on the generation and manipulation of audiovisual patterns". [Wikipedia] [official webpage]

image
This algorithm expresses an instrument that activates four times over two musical cycles or measures and alternates once with the C minor chord in the fourth octave and the other time with the C# minor chord in the fifth octave. This occurs in compositional layer 1 (d1).
image
This algorithm is a rewrite (refactoring) of the first, adding a camera change on each chord change. To prevent the camera-change order from executing four times over two cycles, we create a list that stacks algorithms running in parallel: the first deals only with the sound, while the second only calls cameras 8 and 2 at the beginning of each musical cycle, so it synchronizes only with the first beat of the instrument.
image
If we wanted the cameras to change with each beat, the algorithm could be rewritten as the code in this image shows: it switches to cameras 8, 7, 6, and 5 while the C minor chord in the fourth octave sounds, and to cameras 1, 2, 3, and 4 while the C# minor chord in the fifth octave sounds.
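Assembled from the descriptions above, the three stages might read roughly as follows in Tidal mini-notation. This is a reconstruction, not the project's actual code: `camera` stands in for one of the custom Haskell functions described below, and the exact pattern strings and sample name are assumptions. The fragments only run inside a live Tidal session, not as standalone Haskell.

```haskell
-- Stage 1: one instrument, four onsets over two cycles, alternating
-- the C minor chord (octave 4) with C# minor (octave 5) on layer d1
d1 $ n "c4'min cs5'min" # s "superpiano"

-- Stage 2: stack sound and camera layers so the camera pattern
-- fires once per cycle, synchronized with the first beat only
d1 $ stack
  [ n "c4'min cs5'min" # s "superpiano"
  , camera "8 2"          -- hypothetical custom function, see below
  ]

-- Stage 3: one camera per beat, tied to the chord being heard
d1 $ stack
  [ n "c4'min cs5'min" # s "superpiano"
  , camera "[8 7 6 5] [1 2 3 4]"
  ]
```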
New functions in Haskell

The following are some of the functions written in Haskell, added to Tidal Cycles to manipulate Unreal Engine game environments through live coding.

image
Computing infrastructure for the AV concert (hardware, software and custom development)

Computational infrastructures have a direct impact on the experiences and narratives of digital works. In cases where image and sound are essential, selecting the appropriate technological infrastructure ensures that technical artifice virtually disappears and the focus on the work is deep, emotional, and without distractions. In works based on program execution and therefore generated in real-time, infrastructure becomes even more relevant, as it defines the scope in processing visual fiction (2D or 3D) and the complexity of sound algorithms for composition and improvisation.

Autoconstrucción relies on the processing capacity of the Unreal Engine gaming platform, which makes heavy use of the video card's GPU to compile and process shaders. Additionally, we use platforms such as Tidal Cycles, SuperCollider, and Ableton Live for live interpretation of electronic music. For this reason, we decided to distribute the computational load across two computers synchronized over a local network, communicating via the OSC (Open Sound Control) protocol.

image
Basic diagram of the hardware setup for the audiovisual concert.

The diagram simplifies the infrastructure used for the concert:

1. Computer running the video game platform, primarily for real-time visuals (algorithmic-music-driven video game). It also contributes some incidental sounds using Unreal Engine's MetaSounds system for sound physics and interactive sound design.

2. Computer where live music programming takes place, which serves as the score guiding the cinematic dynamics and visual manipulation of the other computer.

3-4. OSC connection via a router for a local network dedicated exclusively to the concert.

5. Multi-screen device to mix the video signals from both computers and allow projection of the audiovisual narrative and its algorithmic composition (live coding).

6. Audio mixer to output sounds from both computers.

7. Projection or large-format LED screen to enhance the reference or theme: architecture.

8. Full-range audio system (PA), tailored to the concert hall, auditorium, or venue.
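On the receiving side of that network link, the bridge only needs to decode incoming datagrams and dispatch them to the game engine. A minimal pure-Python decoder for integer-argument OSC messages (the port number and the idea of a listener loop are illustrative, not the project's actual setup):

```python
import struct

def parse_osc(packet: bytes):
    """Decode the address and int32 arguments of a simple OSC message."""
    def read_str(buf, i):
        end = buf.index(b"\x00", i)          # strings are null-terminated
        s = buf[i:end].decode()
        i = end + 1
        while i % 4:                         # skip padding to 4-byte boundary
            i += 1
        return s, i

    address, i = read_str(packet, 0)
    tags, i = read_str(packet, i)            # e.g. ",i" for one int argument
    args = []
    for t in tags[1:]:
        if t == "i":
            args.append(struct.unpack(">i", packet[i:i + 4])[0])
            i += 4
        # (float and string arguments omitted in this sketch)
    return address, args

# A minimal UDP listener loop (port is hypothetical):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(("0.0.0.0", 7000))
# while True:
#     data, _ = sock.recvfrom(1024)
#     print(parse_osc(data))
```

In practice, each decoded address would be looked up in the dictionary of terms described earlier and routed to the corresponding Unreal Engine action.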

Conclusion
Overall, the piece invites contemplation of the intersection of art, technology, and the ever-evolving urban landscape, challenging our perceptions of architecture, aesthetics, and social dynamics in the digital age. It provokes thought about the future of cities, the impact of technology on urbanism, and the evolving relationship between humans and their built environment. As a dynamic and generative experience, the audiovisual piece captivates audiences with an immersive, thought-provoking exploration of the interplay between art, technology, and society, presenting a vision of the future shaped by the creative potential of technology and inviting audiences to reflect on the evolving relationship between humans, non-architects, architecture, and urban environments.
Credits

Original artwork by
CNDSD (Malitzin Cortes) & Ivan Abreu
Lead Programmer
Ivan Abreu
Music and Sound Design
CNDSD
Programming
CNDSD, Mariana Mena
3D Modelling, Texture and CGI Art
CNDSD, Amuleto Studio, Mariana Mena
Procedural and real time animation
Ivan Abreu, CNDSD, Mariana Mena