This article is based on two recent talks I gave—one at Design Korea, the other at Amsterdam University of Applied Sciences. Both were on the same theme: With AI now part of the creative process—whether we like it or not—how will the role of designers change?
After my recent interview with Saleh Kayyali, I’ve been following the work of Meng To, who not only uses generative AI coding tools to make software but then uses the resulting software himself. In this post, he notes of his coding process: “I totally skipped design and went straight to code.”
We can take this as evidence that AI is already replacing human designers. Kayyali discusses this in more detail, and maybe you’ve heard related stories; if you’re a designer yourself, maybe you even have first-hand experience of losing work to AI, or conversely getting more work because you use AI yourself.
So the answer to the question of whether AI should be considered a Designer has to be yes. Questions about the creativity of AI are more nuanced. Below, I discuss how AI is changing what we mean by design, and then, very practically, how designers can work with it.
What is design?
Who am I? In one of my lives, I’m a designer, and I’ve designed software and websites for years, even won a few awards. I’ve run two Master’s programs at leading design schools. Currently I’m attempting to create an AI system to support cross-cultural communication – more on this below.
I’ve always blurred the boundaries between art, design and coding, and in our interview Kayyali details how companies are now merging the roles of designer and coder into new jobs like “Design Engineer”.
Whether you consider yourself more of a designer or a coder, you know that you’ve got to start working with AI. In my experience this is a game-changer: when it works, the making process is not only faster, it becomes truly collaborative, such that I can say that “we” are making something together. If you know me, you know I’m naturally a skeptic, not prone to such Silicon Valley hype-speak.
This is therefore a good moment to go back to basics, and redefine what we mean by design.
One of the most basic, and boring, definitions of design comes from Herbert Simon via Richard Buchanan: Design is “the conception & planning of the artificial” [source]. Design as planning, as in planning for the future. And design as artificial, as in created by humans.
A counterpoint comes from the philosopher Daniel Dennett. In this talk, he discusses a research and development process that exploits information to improve the design of things. In his definition, this includes evolution by natural selection as well as human “intelligent” design. While humans deliberately plan, design by natural selection constitutes “competence without comprehension” as he calls it.
Design by natural selection is, however, costly in terms of time and resources. Therefore, it’s rarely useful at human scale, except as inspiration (e.g. bio-design) or for human use (e.g. synthetic biology).
But given the global effects that human-centred design has had on the environment, and given our impending societal collapse (I told you I was a skeptic), maybe humans should not be the only ones to benefit from design, nor should they be regarded as the only designers. Should we think of AI as an “artificial”, “intelligent” designer?
Design v. design
I have written about reducing design from a noun to a verb: differentiating capital-D “Design” as a profession, from lower-case “design” as a process—one that incorporates artistic strategies, scientific and even journalistic methods of investigation.
I think Dennett would agree, because in evolutionary terms, everything—from individual organisms to social organizations—is temporary, and constantly changing. We should think in terms of verbs, not nouns.
Additionally, much of our contemporary world is not designed by human Designers. Many design decisions in the urban environment, for example, are governed by agreed-upon building conventions and zoning regulations. Design without Designers. (Maybe without competence or comprehension either.) Such design-by-committee results in the ugly, inefficient, unequal cities many of us live in today.
If Design involves planning the artificial, it often fails to account for wide or long-term consequences. According to architects Beatriz Colomina and Mark Wigley, “the planet itself has been completely encrusted by design as a geological layer.”
We can certainly say that in the scope of human history, design as deliberate planning of the artificial applies to technologies, broadly defined from stone tools to smartphones, to serve some purpose and someone's interest.
From planning to programming
If design is planning, it’s a form of instructions. A blueprint, a technical drawing, a design spec are literal versions. In art, there’s a long and interesting history of instructions as artworks, by artists like Yoko Ono and Sol LeWitt. Rules, scripts, recipes, programs, algorithms, games.
I’ve written about design as a form of programming: programming the future, programming people even. If we think of the world as a kind of computer that computes the future, design is the way that we program this computer. If this sounds like technological determinism, consider that back in the 1970s, thinking of the human brain as a computer was just a metaphor. But today, actual programs and algorithms increasingly shape our reality, and not just on social media.
James Bridle notes that whether you live in Singapore or the Australian outback, we all already live in a giant, real computer, one that extends up into space via satellites. And it’s almost invisible to us. Think of what it means to see the world from above: a god’s eye view, a plan view, a map—detached, objective, used generally for planning. For design.
As in Borges’ famous one-paragraph story, the map becomes the territory. But no matter how detailed, the map misses much of what the person on the ground experiences.
Information can be regarded as material for design, directing people’s attention, influencing their behavior, and thereby our collective future. And in a world over-saturated with information, we welcome the algorithms and systems that help us wade through, filter, focus. We welcome AI: it’s here to save us from ourselves, and what we’ve created.
How to design with AI
If design can be considered programming in this way, it means that designers (human or otherwise) have a moral imperative and an ethical responsibility to address the problems Design has already caused, and not to create more of them.
In another recent newsletter article, Laura Herman told me that AI is not necessarily putting creatives out of work. In fact, as social media grows and grows, there’s more demand for content creation.
Yes, AI already curates what we see, and we already create work for our AI curators. AI has certain preferences, built into it by its human creators, who are primarily in Silicon Valley and doing this mainly for their own profit. Human faces on social media, for example, get more likes and attention. When you do a Google search, it’ll give you the result that most matches your query, of course—this means it’s literal and direct.
Herman points out that because AI can produce high-fidelity ideas right at the start of a workflow, the human designer now becomes more like a curator—they can direct and select from a range of options.
At the Design Korea event, Kaya Kim, Design Director of LG Electronics, used the metaphor of moviemaking: the designer becomes like a film director, directing AI as an actor or maybe camera operator, and thinks in terms of storytelling.
Google’s Ovetta Sampson, laughing at my rather abstract and academic take on design, brought things even more down to earth, with some practical ways to design products that have AI in them. Before, she said, designers would design how a product is built; now, they have to think about how a product will behave—infused with AI, it will keep changing after it’s released. It’s stochastic, probabilistic: you won’t know in advance what users will do with your product.
It’s no longer just a user interacting with a device, she said, but a whole system interacting with content. Interaction design becomes “Intelligent System Experience Design”. A user’s context becomes multi-agency contexts of use. Technology use becomes decision-based, and the affordances change over time.
Automation v. imitation
Bringing this back to my admittedly more academic perspective, this means that we have to treat AI as a socio-technical system, one that includes both people and machines—physical systems interacting with social ones.
But to be inside of a system means, to some extent, to be complicit in that system. Again, we need to keep the focus on ethics.
Alan Blackwell argues that there are generally two kinds of AI: practical automation, as in the use of machine learning to analyze, understand, predict and optimize processes; and fictional imitation—this is much of generative AI. All the text, images and video being generated by AI, we should treat simply as entertainment, he says, and nothing more. Like certain politicians, AI can often spout nonsense.
I would add some nuance here: Using AI to generate code is much more on the practical automation side; as I said, I’ve found it truly transformative. This shows how fast the AI landscape is changing, even since Blackwell’s book was published four months ago.
If we keep our focus on practical applications and problem solving (I’ve written plenty of newsletter articles on artistic approaches to AI), there are plenty of good news stories around. In fact, there’s already a backlash against the backlash: Lots of “normal people” have become wary of AI—for good reasons—and sometime skeptics like Dario Amodei are putting out more positive visions.
A positive example: designing spaceship parts. A human might go through a few iterations in a week, the article points out, whereas AI can go through 30 or 40 in an hour.
There’s a trade-off, though: the good thing is that the AI sometimes generates ideas that no human would: “It comes up with things that we wouldn’t think of,” an engineer is quoted as saying. “We wouldn’t be able to model, even if we did think of it.” The bad thing is that it also generates things that are just wrong: filling in a hole where a part might need to attach to a spacecraft, for example.
This mirrors my own experience: the video analysis system I’ve been creating spotted things that I’d missed when looking at the video myself. But it also misidentified a plastic bag as a cat, for example. This reflects the model I used initially: it was trained on a dataset that was pretty US-centric (and also collected without transparency or consent).
Coming back to Kayyali, who I mentioned at the start, he stresses that we need to keep an eye on the economic forces and motives behind the AI that we use, and create. The sci-fi writer Ted Chiang observes that a lot of our fears about AI are actually fears about capitalism.
Getting to work
Given all that, if designers need to start using the new AI tools, how do they wade through all the ethical issues, and practically get to work?
First, keep an eye on the big picture. This means not only being aware of the fast-changing AI landscape, technically and socio-economically, but in a practical sense, making sure you keep a diverse skillset. If AI can take up specific roles, it pays to be a generalist, so that you can move across different areas, in agile fashion.
Business professor Aswath Damodaran writes that “in a world of specialists operating in silos and exhibiting tunnel vision, AI will empower generalists, comfortable across disciplines, who can see the big picture.”
From my experience teaching and practicing design, here are some other approaches that involve thinking differently (to use the old Apple tagline):
1. Make your own tools. This is what I primarily use AI for: not to generate images, video, text or finished work, but as a tool to create tools. I needed a randomized arrangement of circles for a web page, for example, so instead of simply generating some, I generated some code to create new ones each time the page is loaded (a minimal sketch of what such code might look like follows this list).
2. Break the tools. Use an AI image generator not to create yet another hyper-realistic rendering, but something low-res, sketchy, abstract. Push the boundaries of what the system can do—the companies making these tools usually appreciate such efforts of “extreme users”.
3. Use the tools in different ways. Use that code generator to create images, use an image generator to create text. Turn things upside down, subvert their intended purpose. A nice bit of inspiration is Grand Theft Hamlet.
4. Step away from the screen. Go outside, take inspiration from the natural world. Or even create things that might live there (for example this project I worked on).
5. Iterate. Make, test, make, test, constantly. It’s good to reflect and criticize, but I like to think through making, to engage in practice-based research. Sometimes the research actually becomes the work.
6. Know when to stop. Yes, iterate, but know when to publish, exhibit, ship. It’s never going to be perfect—all work is work in progress (verbs, not nouns, remember?). I’ve seen students forever paralyzed from showing work because they’re perfectionists. That’s not necessarily a bad quality—until it becomes an extreme form of self-critique.
Great advice from a colleague, the Head of Fashion at the Royal College of Art, a former designer herself: there are times when you’ve got to open up and explore the design space widely, and there are moments when you need to close down—to produce a collection, in that case. It’s similar to the Double Diamond design methodology. Deadlines and deliverables focus the mind and motivate the hand.
7. Do it anyway. I’ve seen a lot of students abandon an idea because, after a quick internet search, they find that someone has already done it. Do it anyway—everything has been done, there are no new ideas. The way you do it is inevitably going to be different and personal. Put your own spin on it, do it the way you think it should be done.
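To make that first point concrete, here’s a minimal sketch (in TypeScript) of what such generated code might look like: it redraws a random arrangement of circles each time the page loads. This isn’t the actual code from my page; the SVG element id, dimensions and circle count are just illustrative assumptions.

```typescript
// Minimal sketch of the "tool" described in item 1: draw a fresh random
// arrangement of circles on every page load. Assumes an empty
// <svg id="circles" width="400" height="400"> element exists in the page;
// the id, sizes and count are placeholders, not the article's actual code.
function drawRandomCircles(svgId: string, count = 20): void {
  const svg = document.getElementById(svgId);
  if (!svg) return;
  const ns = "http://www.w3.org/2000/svg";
  for (let i = 0; i < count; i++) {
    const circle = document.createElementNS(ns, "circle");
    circle.setAttribute("cx", String(Math.random() * 400)); // random position
    circle.setAttribute("cy", String(Math.random() * 400));
    circle.setAttribute("r", String(5 + Math.random() * 30)); // random radius
    circle.setAttribute("fill", `hsl(${Math.random() * 360}, 70%, 60%)`); // random hue
    svg.appendChild(circle);
  }
}

// Re-run on every page load so each visit gets a new arrangement.
window.addEventListener("load", () => drawRandomCircles("circles"));
```

The point is less the code itself than the workflow: describe the small tool you need, let the AI draft it, then tweak it and reuse it.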
Know when to stop
Taking my own advice, I’ll wind up here with some final thoughts. AI researcher Caterina Moruzzi told me that knowing when to stop is one measure of creativity. AI can generate endless outputs, but it’s often up to the human designer to have intentionality—doing something with a purpose, knowing what you want and where you’re going.
If you agree that design is a form of programming (the future or people, take your pick), remember that this power comes with moral and ethical responsibility. That includes AI: If you agree that AI can be considered a designer, the companies and individuals making it need not only to share in that ethical imperative, but to build it into the tools themselves.
Yes, AI is doing design now. But it also empowers human designers to go further, faster.