The thing nobody tells you about AI design tools is that they have impeccable technique and absolutely no taste.

Last week, someone on X posted that seven out of ten Pinterest results for interior design, fashion, and cooking are now AI-generated. The complaint wasn't that AI images look fake. It was that they look the same. The same lighting, the same compositions, the same eerie cleanliness that signals this came from a text box, not a person. The platform has been aesthetically colonised, and most people scrolling through it don't even notice yet. They just feel vaguely underwhelmed.

That's where we are with AI and design in 2026.

The received wisdom is that AI has democratised design. Anyone can now spin up a landing page, a deck, a brand identity, a social carousel. You don't need a designer anymore, or so the argument goes. Just describe what you want, and the machine builds it. Fast, affordable, good enough.

The "good enough" part is doing a lot of work in that sentence.

Here's what's actually happening. Tools like v0 and Bolt default to their training distribution, which means shadcn/ui components, blue-gradient SaaS layouts, Inter typeface at 16px, cards with too much whitespace, and a hero section that looks exactly like every other startup that launched this month. It's not a failure of the model. It's a failure of the brief. The model is picking its aesthetics from its defaults because nobody told it anything better.

One developer put it plainly this week: their design instincts are the entire reason the output doesn't look like a bootcamp project. The AI executes. The taste still has to come from somewhere human.

This is the gap nobody's selling into. Everyone's racing to show clients that AI can build faster. Almost nobody is showing clients that the speed is worthless if the output looks like it was assembled from a SaaS Starter Kit.

The fix isn't complicated, but it requires actually doing it. The smartest teams we've seen aren't prompting from scratch every time. They're encoding their design DNA into portable files before they touch the tools: typography rules, colour systems, spacing logic, approved components, reference examples of work they admire, explicit anti-examples of things they hate. Then they give that context to the agent and ask it to build within those constraints.

The metaphor that clicked for me: the model is the kitchen. The prompt is the order. But the recipe, the actual creative specification that makes something taste like you and not like everyone else, is what most people are missing. You can have a Michelin-starred kitchen and still produce cafeteria food if you haven't written the recipe down.

For agencies, this has a direct commercial implication. Clients aren't just buying a website anymore. They're buying a reusable creative system, a documented design identity that AI agents can extend without wrecking the brand six months later when you're not in the room. That's the deliverable worth building. That's what has legs.

What this means for you, practically, this week: before your next AI-assisted design sprint, spend two hours writing down what you actually want. Collect five examples of design you love. Note three things you've seen recently that felt cheap or generic. Write down your typography choices and why. Give your agent a brief that has opinions in it. The jump in output quality will be immediate and obvious.

The brands that come out of this period looking distinct aren't the ones with the best AI subscriptions. They're the ones that treated taste as infrastructure, something worth documenting, refining, and protecting, rather than something the machine would figure out on its own.

The machine won't figure it out. It doesn't care how you look. You have to.