In our current wave of techno-optimism and planetary panic, voices like Andres Colmenares feel like grounding forces. Andres is a researcher, curator, and co-founder of IAM—an organization exploring alternative imaginaries for the digital economy—and he leads the Master in Design for Responsible AI at ELISAVA, Barcelona. He is also behind the concept of “Solar-Centered Design.” We invited Andres to The Greenhouse to talk about what it means to be human, how we define intelligence, and how design might move beyond its extractive and anthropocentric logics.
Marta: You proposed that we revisit a very fundamental question: What is humanity?
Andres: It’s a question that has become more urgent in 2025. We are in the middle of a very critical decade. If we consider the facts of the climate emergency, we’re at a tipping point for the conditions that are required for humans to be alive as part of the planet. So essentially, we have to ask: What is a human in 2025? What makes a human? And even more urgently: What definitions of humanity are we using when we govern our societies?
In addition, the Universal Declaration of Human Rights—an agreement meant to protect the most basic rights for all human beings—is being violated, significantly, intentionally, and without shame. At the same time, the socio-technical systems that run our day-to-day lives are leaving many behind—through inequality, through discrimination.

Marta: These are great points. In this context, do you think human-centered design is critically considering what we are?
Andres: I find a lot of dissonance. We go back into our workspaces, our classrooms, our studios—and we see these polished decks and slides on human-centered design, with slogans like “let’s make things human again.” It feels like people are getting lost in abstractions. The version of the human that’s being centered is usually a consumer. A user. A demographic. A type of person whose identity and living conditions do not reflect the full diversity of our ecosystems or our populations. It ends up amplifying inequality, discrimination, injustice. It creates a toxic mindset for how we build our worlds.
I found a story from Wired incredibly telling—about how Wikipedia editors chose a photo to represent the average human. People assume it’s going to be a white man, largely because of the cultural dominance of Hollywood and tech, but demographically, the average human is closer to a rural worker in Southeast Asia. That fact alone should shift how we talk about representation. So what is humanity? It’s not a static definition. It’s not a slide in a design deck. It’s a living, plural, entangled experience. And if we’re serious about humanity, we have to start by unlearning the ideologies that have claimed to define it—and listening more closely to the people and places that have been excluded from that definition all along.
Marta: You currently teach in the field of AI. What actually is AI?
Andres: That’s a question I ask people all the time—especially when they use it in a sentence. What AI are you referring to? For many people, the word AI is equal to—very often—ChatGPT. I like to explain it this way: AI is to ChatGPT what mathematics is to a calculator. ChatGPT is a product. Artificial intelligence is a field of research.
AI, as a term and area of study, emerged in the 1950s. It was shaped by a group of philosophers and mathematicians—mostly white, mostly male, and informed by a very monocultural imagination. At that time, inspiration came from a small set of concentrated sources, especially science fiction. One of the most influential was Metropolis, a 1920s film that popularized the image of a humanoid robot. The teenagers who saw that film grew up to be the researchers who would later coin and define the term artificial intelligence.
Their core speculation was this: the processes we associate with being human—like cognition and creativity—could be modeled in such a precise way that a machine could replicate them through computation. In the last 10 years or so, the convergence of planetary-scale computation and data-driven decision-making has taken that speculative vision and made it operational as products used by billions of people.

Marta: If we take embodied cognition into account, robots would never be able to think like humans, because they lack our bodies. Is that correct, or should we be concerned?
Andres: AI was invented within a conceptual dualism—human vs. machine. With the image of a humanoid robot, it set the stage for how we still frame AI: as something separate from us, something other. Take for example the fear that robots—or now, AI—are coming for our jobs. That idea isn’t new. But if you look closely, it’s not that far from “migrants will take our jobs.” These are fear narratives. They rely on framing some alien, external entity—be it a robot, a foreigner, or an algorithm—as a threat to something valuable we believe we own.
What’s really happening is that tech monopolies are resisting governance, resisting regulation, resisting any limits to their power. And they benefit when people are distracted by the idea that the danger is coming from “the machine.” But machines don’t have intentions. A robot didn’t wake up one day and decide to replace you. There is always a human behind it. A team. A company. An incentive structure. Once we name that, we can begin to hold systems accountable.
Marta: We need more awareness of the systemic issues.
Andres: The costs of AI are very real. Think of the gig workers delivering your groceries, or the content moderators—humans—sifting through the most horrific materials online so others don’t have to. These people are exploited. They are the hidden backbone of generative AI and digital platforms. It’s not just about data; it’s about human trauma, labor, and pain. Recently in Spain, one of Meta’s main moderation contractors shut down after 75% of its staff sued the company for psychological harm. That’s the human cost of running platforms at scale. And yet, we’re told it’s “smart” technology making the decisions. It’s not tech that’s the problem. It’s the broader system. One where we’ve come to associate success with comfort. Where convenience is considered inherently good. And in that system, everything—people, environments, time—is extractable.

Marta: You’ve proposed three design alternatives: human rights-centered, more-than-human rights, and solar-centered design. Could you walk us through each?
Andres: At some point, after nearly a decade of critiquing human-centered design, I just got tired of being the angry guy in the room. So I started thinking of alternatives. And not only thinking, but learning—because the alternatives are already out there. They have been there for a long time. We’re just not paying attention.
One of those alternatives is to ask: What if instead of human-centered design, we practiced human rights-centered design? The difference might sound subtle, but it’s huge. In most design conversations, “human” is this abstract, relative idea. Often, it refers to a privileged subset of humanity—an idealized user, consumer, or citizen. But with Human Rights-Centered Design, we could anchor decisions in something real: the Universal Declaration of Human Rights. That document outlines the conditions needed for people to live with dignity, peace, and security. What if every design process began with a commitment to not harm, and beyond that, to do good? We could build systems that expand empathy, solidarity, and responsibility.
But then, we can go even further. What about everything outside of us humans that is also part of our ecosystem? That’s where More-Than-Human Rights-Centered Design begins. I think humans, in general, are very narcissistic. But the truth is, we are not alone. We are living systems, part of a larger living system. Many Indigenous and ancestral cultures already practice this. They recognize rivers, trees, animals—even stories—as sacred beings. Not just as objects or resources, but as entities with rights, and often with higher value than humans—because they are older, wiser, more essential. And then, there’s the approach that I’ve been focusing on the most recently: Solar-Centered Design.
Marta: I loved it from the moment I came across it. It’s beautifully poetic and embodied.
Andres: It started as a provocation. If I’m against human-centered design—if I believe we shouldn’t put ourselves at the center—then what should be the center? In a moment of meditation during the pandemic, the answer came to me very clearly: the sun. Because, quite literally, that’s our center. As a planet, we revolve around it. Our energy, our food systems, our bodies—all depend on it. It’s a cosmic truth, untinted by ideology. It just is. So Solar-Centered Design asks: What would it look like to orient our decisions around that truth? It’s not about putting solar panels everywhere. That would just be another extraction model. It’s about remembering our sacred bond with the sun—reconnecting to its cycles, its presence, its rhythms. When we lose contact with the sun, we lose contact with ourselves. We start running on chronological time, not solar time. And that disconnection—like putting a plant in a dark office—leads to stress, disorientation, and illness.
Marta: Any tips for designing with the sun at the center?
Andres: Many architects already design with the sun in mind—orientation, shading, light. But this solar-centered design approach is not just technical. It’s philosophical. It’s spiritual. As someone who practices kundalini yoga, I also think of the solar chakra, the source of vital energy. In that tradition, we carry inner suns. The sun outside powers our chemical processes. You can feel it—its presence, or its absence. Solar-Centered Design is a practice of remembering. It’s not about the next innovation. It’s about healing. Looking inward. Reconnecting. And from that place, designing spaces, systems, and cultures that are in rhythm with life—not against it.
Resources
- Taina AI: A project by Indigenous groups developing their own artificial intelligence rooted in ancestral knowledge systems.
- Bio-circular data centers: A concept and movement toward regenerative data infrastructure, reducing environmental impact through circular principles.
- Grow Your Own Cloud: A speculative project that stores digital data in the DNA of living plants.
- Solar Protocol: A networked web platform hosted across solar-powered servers located in different parts of the world.
- Permacomputing: A set of principles for low-impact, resilient, and regenerative computing based on permaculture ethics.
- Ancestral AI: Part of the Slow AI research initiative by AIxDesign, exploring alternative, time-extended relationships with artificial intelligence.