
Game Changers 2020: Itai Palti and Liam Young on the Role of Tech in Creating Responsive Cities

Next in our annual Game Changers series, Itai Palti and Liam Young discuss how tech can help build stronger relationships between cities and their inhabitants, both human and machine.

As era-defining events continue to unfold around the world, digital platforms have catalyzed civic debate as never before. So for our annual spotlight on Game Changers—that is, the practitioners and researchers reshaping the A&D field—we used videoconferencing to connect people from different parts of the globe and from disparate spheres of design. Over one-hour sessions, our Game Changers discussed social justice, health, technology, and urban place-making—all topics implicated in the maelstrom of 2020. In the following text, we recap highlights from those conversations. An important takeaway is that our most pressing problems must be solved through interdisciplinary, networked approaches. 

ITAI PALTI is an architect and the director of Hume, a practice backed by its Human Metrics Lab, a group of researchers using behavioral neuroscience to improve outcomes in architecture and urban design projects. In 2015, he founded the Conscious Cities movement, a field of research for building responsive environments through data analysis, AI, and tech. Palti is the director of the think tank The Centre for Conscious Design. Courtesy Gil Hayon

Advancements in tech are rapidly shifting the ways in which people interact with the built environment, yet many questions remain regarding privacy, power, and the politics of such technology. Here, architects Itai Palti and Liam Young theorize about a future in which science and tech—from behavioral neuroscience to AI—help build stronger relationships between cities and their inhabitants, both human and machine.

LIAM YOUNG is an architect and filmmaker who operates in the spaces between design, fiction, and futures. He is founder of the think tank Tomorrow’s Thoughts Today, a group whose work explores the possibilities of fantastic, speculative, and imaginary urbanisms. He has taught internationally, including at the Architectural Association and Princeton University, and now runs an MS program in Fiction and Entertainment at SCI-Arc. Courtesy Liam Young

ON SPECULATION

LIAM YOUNG: In many ways, I define my practice as a form of speculative architecture. I don’t design buildings, but rather, I tell stories about the ways that emerging technologies are changing our cities and communities.

A lot of these technologies that we talk about when we’re thinking about smart cities or our future cities are what I define as “before-culture technologies.” Since ideology rarely evolves at the same pace as technology, speculative practices and processes are really a form of prototyping cultural responses to and consequences of technologies that aren’t quite here yet.

That’s why speculation and futures are really important. It’s a way of engaging an audience and empowering them to make decisions about the futures that they want to live in because inevitably, the future is a verb, not a noun. The future isn’t something that just washes over us like water. It’s something that we actively shape and define.

ITAI PALTI: A lot of my work is also related to speculation. The Conscious Cities movement is really a critique of the smart cities movement and asks whether we have control over where technology takes us.

I think there’s an element of dystopia that speculative architecture has to bring in to provoke critical thinking about current decision-making. So, for example, when we talk about an AI controlling a city or a street or a building, what kind of dialogue does this AI create between people and the city?

There are a lot of ways to think about this in terms of dystopia—a lot of ways to talk about imbalance of control and privacy, for example. It’s a really important act to speculate, but I also want to question whether speculation has the power to disrupt in the manner that we think it does. I’m interested in knowing how speculation has a real impact on decision-making and on the behavior of people.

Directed by Young and written by Tim Maughan, In the Robot Skies (2016) is a short movie filmed entirely through autonomous drones. The film explores the drone as a cultural object and is an example of Young’s work in examining narratives of the built environment through nonhuman perspectives. Courtesy Liam Young

ON HUMAN-CENTERED DESIGN

ITAI PALTI: In our work at Hume in the Human Metrics Lab, the performance measures that we are looking at for space are based on human experience. We’re talking about how the human experience is central to the project—not just in the design process but in how space keeps up a dialogue with its users.

Architects have a level of responsibility to see how their design intent has affected others and whether that design intent can be calibrated over time to be better for the user experience. Science and technology, and this is my optimistic side, can help us inform those decisions and become more aware of how our decisions affect others. That’s the power of technology as I see it, to bring in responsibility and accountability in certain places.

LIAM YOUNG: I think we share a lot of the same sentiments, though I would challenge you on the terminology. I just did a book called Machine Landscapes: Architectures of the Post-Anthropocene, in which I argue for an end to human-centered design. In many ways, I think this concept has been the problem for quite a long time—it has been a proxy to talk about user-centered design and, in turn, customer-centered design.

The human that we’ve been designing for has been generally a white male, and we’ve been engineering our spaces around the needs and desires of that individual. We need to be much more radical if we’re truly interested in humans in the sense that I think you’re talking about it, which is one part of a really complex ecosystem, humans in a much more equitable sense, and trying to privilege the people that aren’t normally a part of these discussions and conversations.

So often, human-centered design is an alibi, a way to ignore that kind of stuff and just focus on the scale of the body and our own desires and comforts, and I think that’s a real issue. So we’ve done a couple of films, like Where the City Can’t See and In the Robot Skies, that try to look at narratives from nonhuman perspectives.

Palti was an architectural consultant for Urban Thinkscape (2017), a data- and outcome-driven project that addresses playful learning within cities. Built in West Philadelphia, the project explores the role a city can play in boosting healthy child development, from social skills to scientific thinking. Courtesy Sahar Coston-Hardy Photography

ITAI PALTI: I agree with you on the use of the term. I recently spoke on a webinar called “We’re Nowhere Near Human-Centered Design Yet.” Questioning whether human-centered design is right and whether it’s being done right are two different things. I would say it’s not being done right. If you look at [Abraham] Maslow’s pyramid and you say, “Okay, right at the top is self-actualization,” the concept of self-actualization is understood very differently in different cultures. In Western colonialist society, we look at self-actualization as a very individualist idea. When you look at societies that see themselves as much more integrated in nature, self-actualization means something completely different.

To me, human-centered design isn’t just about the individual: using your skills to further your career, making money, and having the American dream or whatever. My understanding of self-actualization, of what human-centered means, is much more integrated with the world around us, with nature, with society. To me, that’s really a question of cultural values.

ON RESPONSIVE CITIES

ITAI PALTI: I coauthored the manifesto for Conscious Cities in 2015 with Professor Moshe Bar, a neuroscientist at Bar-Ilan University, and we really dug into this question: How can neuroscience inform the way that we design spaces? We believe that in order for us to guide technology with the cultural values that we wish to carry into the future, we need to embed knowledge about ourselves into that technology.

There’s a lot of speculation about whether cities can be run by AI completely or whether we’re talking about a consciousness that we’re unable to imagine. I think we often imagine a consciousness in terms of a centralized information processing unit because that’s the way that we work, but there are plenty of other examples of different types of consciousness in the living world and in animals. I think we need to be able to imagine what the city can do for us in the type of consciousness and the type of AI that we can create for it, rather than necessarily replicating in the city the way a person thinks.

Seoul City Machine is a film directed by Young and Alexey Marfin, with a script derived entirely from a conversation with a Mitsuku chatbot. Questioning what these relationships look like, Young comments, “For a long time, the world around us has stood silent, and now we confess suicidal thoughts to Amazon Alexa. We are more intimate with these devices than we are with each other.” Courtesy Liam Young

LIAM YOUNG: It’s interesting. When I think about AI and cities, I’m interested in the ways that cities then become shared spaces between humans and nonhuman inhabitants. A lot of our work, particularly in our short film Seoul City Machine, has been looking at the forms of interaction that might emerge, and in many ways already exist, between us as citizens of a city and the machine inhabitants that we’re actually sharing those spaces with. Now, the metrics through which we design cities (and I’m sure you have a lot to say on this) should also include more than just the poetics of our own forms of occupation and our own modes of being in cities.

In the film, we were looking at the dominant form of interface right now, which is voice-controlled AI, where, for the most part, designers are trying to push human/machine interactions through the lens of human-to-human interactions. The script, which we developed and wrote, is derived entirely from a conversation with a chatbot trained on urban operating systems, and it’s a conversation between a city and its inhabitants.

We’re trying to think about what that relationship might be. What does it mean to have a meaningful conversation with the AI governors of the city? To just squeeze them through the traditions of human-to-human interactions seems to deny other potentials or more productive ways we might interface with this world.

ON TRUSTING TECH

ITAI PALTI: The dialogue between user and space has always existed, but unfortunately, there are a lot of examples where it has existed to create a hierarchy of power. What I think technology is giving us is the ability to have more of a two-way dialogue, rather than the one-way dialogue we’re used to having with architecture.

It brings in the question of trust. When you are interacting with something that has a processing unit, that’s very far from you psychologically, does it understand who you are? Does it have your interests at heart?

I think that’s where the dystopian aspects of this relationship with a sentient environment come in. How do we create trust between person and space? Technology, I think, is making it harder in some ways.

A 2019 report by Hume’s Human Metrics Lab, in collaboration with Harvard’s Center on the Developing Child, explores children’s museums as people-centered learning environments. Hume analyzed the cultural, behavioral, social, and cognitive-emotional states of users in accordance with educational goals. Courtesy Itai Palti

LIAM YOUNG: I’d trust an AI to govern my city much more than I’d trust an idiot politician who is only interested in their reelection and their wealth. I think that all these conversations around trust in technology, privacy, and data collection in many ways are well founded—we need to be talking about regulation around data politics and data privacy. But to do so in the way that we have denies the extraordinary potential of some of the systems and also denies the extraordinary corruption and problems that already exist with humans.

So I think we need different ways of framing that question. Technology is no grand solution to anything. All it does is exaggerate problems, our own biases, our own complications—it just dials the volume up and extrapolates the inequalities, the contradictions, and the complexities that are already present.

The narrative around trust and technology is really a narrative around trust in ourselves. What we should be having is a cultural conversation, and I guess that’s why we use narratives and that’s why I make films. So many of these issues are cultural problems and political problems, and to complicate that conversation by talking about technology as if it has some kind of agency and autonomy is, I think, really dangerous.

ITAI PALTI: I agree with a lot of that. I think it’s a bit of a double-edged sword because technology inherits our prejudices. We know that about AI, and when you think about who is currently writing the code that will guide us into the future, it’s not a representative cross-section of the population doing that.

On the other hand, if we are aware of our own faults in decision-making and in empathy, we might be able to guide technology to do a better job than we can. Working against inequality is not currently part of our narrative as a society, and my worry is that technology is only going to make it worse.

I question why it’s taking so long to create legislation around data, around the use of data, around AI. Why aren’t governments talking about it more, rather than leaving so many of these decisions, and so much power, to the tech companies? I think we agree that this is a cultural dialogue, one about power and about what it means to govern.

Urban Thinkscape (2017). Courtesy Sahar Coston-Hardy Photography

