Culture as Infrastructure: AI, Music, and the Shape of Human Presence

published on 05 January 2026


As we step into a new year, AI feels faster than ever. More capable. More present. More unavoidable. And yet, alongside this acceleration, something else becomes increasingly visible: a sense of disorientation. A quiet feeling that while systems are becoming smarter, our collective experience is not necessarily becoming richer, more meaningful, or more human.

This tension is not new. Félix Guattari was already writing about it decades ago, long before artificial intelligence entered everyday language. For Guattari, technology was never neutral. It always shaped subjectivity: how we feel, how we relate, how we imagine ourselves and others. Machines were not just tools; they were part of what he called assemblages: networks of humans, systems, environments, and desires constantly producing reality.

Seen through this lens, AI is not simply a technological upgrade. It is a cultural force. And culture, whether we acknowledge it or not, is the infrastructure that determines how this force lands in the world.

Culture Is Not Decoration

In many technology conversations, culture appears as an afterthought. Something to be added once the system works. Something aesthetic. Something optional. But culture has always been a form of technology itself.

Music, rituals, gatherings, and collective experiences are systems that organize bodies in space and time. They regulate emotion. They create belonging. They transmit memory. Long before algorithms, humans developed sophisticated ways of coordinating themselves through rhythm, repetition, and shared presence.

One of my mentors, José Miguel Wisnik, often speaks of music as a form of knowledge: not abstract, but embodied. Music teaches us how to listen, how to wait, how to respond. It is intelligence that happens between people, not inside isolated units.

When we talk about culture today, especially in the context of events and communities, we are talking about living systems of coordination. About how people move together, feel together, and remember together. This is where AI becomes both interesting and risky.

The Risk of AI Without Rhythm

Most AI systems are designed around optimization: speed, efficiency, scale. These are not inherently bad goals. But when optimization becomes the only value, something essential is lost. Guattari warned about this: he argued that when systems optimize without care for subjectivity, they produce alienation rather than liberation. The machine becomes disconnected from the human rhythms it is meant to serve.

In cultural spaces (festivals, concerts, communities), this risk becomes even more visible. These are environments where people are not looking to be optimized. They are looking to feel, to connect, to be present. An AI system that ignores this becomes noise.

But an AI system that understands rhythm, context, and collective experience can become something else entirely: a quiet orchestrator. A supportive layer. An invisible guide.

Culttech as an Ethical Practice

This is where the idea of culttech begins to take shape: not as a sector, but as a responsibility. Culttech is not about adding technology to culture. It is about designing technology from within culture.

It asks different questions: Does this system respect how people actually gather? Does it amplify presence or fragment it? Does it listen, or only extract? Does it create space for participation, or only consumption?

In music, timing matters. Silence matters. Call and response matters. Wisnik reminds us that meaning in music emerges from relation, not from isolated notes. The same should be true for AI in cultural contexts.

From Control to Orchestration

There is a temptation, especially at scale, to use technology to control behavior. To push, direct, optimize, and measure everything. But orchestration is different from control. Orchestration is about creating conditions where movement flows naturally. Where people feel guided, not commanded. Where intelligence emerges from interaction rather than enforcement.

Guattari’s idea of ecosophy, the interconnectedness of mental, social, and environmental systems, offers a useful framework here. A well-designed AI system in a cultural environment should operate across all three: supporting individual experience, strengthening collective dynamics, and respecting the physical and emotional environment.

When AI works this way, it doesn’t dominate the experience. It disappears into it.

Why This Matters Now

As AI becomes embedded everywhere, the question is no longer whether we will use it, but how and for what. Culture is one of the last spaces where human presence still resists full automation. Where bodies matter. Where memory is built through shared moments. Where technology must negotiate with emotion, not override it.

If we get this wrong, AI will accelerate disconnection. If we get it right, it can help sustain and scale what makes us human. Perhaps the future of AI is not about making machines more human, but about designing systems that leave room for humans to remain whole. And maybe culture (music, gatherings, shared presence) is not something technology should disrupt. Maybe it is what technology should learn from.

Prompt with Sora

by Rods Rodrigues // Membrz.Club General Manager
