The Future of Design and Code - Part 2

Exploring the power of instant feedback in creative mediums and what digital tools can learn from the immediacy of physical creation.

March 25, 2025


In Part 1, I explored how our current tools shape our thinking and questioned whether we're truly evolving our approach to creation. Now, I want to dive deeper into something that might be missing from most digital design tools: the concept of instant feedback.

Think about painting. When you dip a brush into paint and make a stroke on canvas, you immediately see the result. Your hand moves, the brush follows, and the paint responds in real-time. There's no delay, no processing, no waiting for the computer to catch up. The feedback is instant, and that immediacy creates something magical.

The moment your hand moves, the medium responds. This instant feedback loop between intention and result is where inspiration lives.

When you're painting, you're not just executing a pre-conceived plan. You're in a conversation with the medium itself. You make a stroke, see how it looks, and that visual feedback informs your next decision. Sometimes, the paint does something unexpected—a color bleeds, a texture emerges, a shape suggests itself—and that surprise becomes the seed of a new idea.

This isn't unique to painting. Sculptors feel the clay respond to their touch. Musicians hear the note the moment they press a key or pluck a string. Writers see words appear as they type (though even this has been compromised by some modern writing tools). In all these mediums, there's a direct, immediate connection between the creator's action and the medium's response.

Now, think about how we create digital designs today. In Figma, you select a tool, click, drag, and... wait. The interface updates, but there's a layer of abstraction. You're not directly manipulating pixels or shapes—you're manipulating representations of them. The feedback loop exists, but it's slower, more mediated, less immediate.

Many of our current "AI-powered" tools seem to break this feedback loop entirely. You type a prompt, wait for the model to process, and then receive a result. There's often no conversation, no back-and-forth, no opportunity for the medium itself to inspire you. It can feel more like ordering from a menu than creating with your hands.

But here's what's interesting: when you do get that instant feedback in digital tools, something changes. When you're coding and see your changes reflected immediately in a browser, the creative process feels more fluid. When you're sketching on an iPad with Apple Pencil and the strokes appear instantly, it feels closer to drawing on paper. The tool starts to feel like an extension of your hand rather than a separate system you're operating.
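The save-and-see loop described above can be sketched with nothing but the standard library: a poller that watches a file's modification time and fires a callback the instant it changes. This is an illustration of the shape of the loop, not any particular tool's API; the file name and callback in the usage note are hypothetical.

```python
import time
from pathlib import Path


def watch(path, on_change, poll_interval=0.1, max_checks=None):
    """Run on_change the moment the watched file is saved.

    A minimal polling loop: the whole point is that the gap between
    action (saving the file) and feedback (on_change firing) stays tiny.
    max_checks is optional and exists so the loop can terminate in tests.
    """
    path = Path(path)
    last_mtime = path.stat().st_mtime
    checks = 0
    while max_checks is None or checks < max_checks:
        mtime = path.stat().st_mtime
        if mtime != last_mtime:
            last_mtime = mtime
            on_change(path)  # the feedback: fires on every save
        time.sleep(poll_interval)
        checks += 1
```

Used as `watch("sketch.css", lambda p: print(f"reloading {p}"))`, it reloads on every save. Real live-reload tools use OS file events rather than polling, but the structure is the same: collapse the distance between the creator's action and the medium's response.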

The best creative tools seem to understand this. They minimize the gap between intention and result. They make the feedback loop so tight that you forget you're using a tool at all. You're just creating, and the creation is responding to you in real-time.

So what would a design tool look like that truly embraced instant feedback? What if every adjustment you made—every color change, every size tweak, every layout shift—happened immediately, with no lag, no loading, no processing delay? What if the tool could respond to your gestures as naturally as paint responds to a brush?

More importantly, what if the tool could surprise you? What if, like paint bleeding on canvas, the digital medium could offer unexpected suggestions, show you possibilities you hadn't considered, and inspire new directions in the moment of creation?

This is where I think the future of design tools is heading. Not toward more powerful AI that replaces human creativity, but toward tools that create tighter feedback loops, that respond instantly, and that become true partners in the creative process—tools that don't just execute our ideas, but help us discover new ones.

The gap between what we can imagine and what we can create should be as small as possible. In painting, that gap is nearly zero. In the best digital tools, we're getting closer. But we're not there yet.

The Phases of Creation

But I've been thinking about something else: not all creation happens the same way. The type of instant feedback you need might depend on what phase of work you're in. And I wonder if most design tools treat every phase the same way, which could be limiting.

Think about the different phases of building a product:

The blank slate. You're starting from nothing. No constraints, no existing code, no legacy decisions. This might be where you need the most freedom, the most immediate feedback, the most room to explore. Like a painter facing an empty canvas, you might need tools that let you move fast, try things, discard ideas, and iterate quickly. The feedback loop here might need to be wide open—you're discovering what you want to make.

Improving existing work. You have a product that exists, with real data, real users, real constraints. Now you're not creating from scratch—you're refining, optimizing, making things better. The feedback you need here might be different. You might need to see how your changes affect the existing system. You might need to understand context: how does this button change affect the user flow? How does this color adjustment work with the existing brand? The feedback loop might need to show you the relationship between your changes and what already exists.

Technical debt and maintenance. This is often the least glamorous but most common phase. You're fixing bugs, updating dependencies, refactoring code that's become messy over time. The feedback you need here might be about precision and safety. You might need to know immediately if your change broke something. You might need to see the impact of your fix across the entire system. The feedback loop might need to be protective—catching small problems before they become big ones.

Incremental updates from user feedback. You've talked to users, you've seen how they actually use your product, and now you're making small, targeted improvements. This phase might require a different kind of feedback entirely. You might need to see your changes in the context of real user behavior. You might need to understand how this small tweak affects the overall experience. The feedback loop might need to connect your changes to user outcomes.

Each phase of creation might require a different type of feedback, but our tools often force us into the same workflow regardless of what we're actually trying to accomplish.

It seems like most design tools are optimized for the blank slate phase. They're great when you're starting something new, but they can become frustrating when you're working with existing systems. Figma is amazing for creating new designs, but try using it to update a complex, existing interface with hundreds of components and you might feel the friction.

And it seems like most development tools are optimized for the technical debt phase. They're great at catching errors and managing complexity, but they can be frustrating for blank slate exploration. Try starting a new idea in a traditional IDE and you might spend more time setting up the project than actually creating.

What if a tool could adapt its interface and feedback mechanisms based on what phase of work you're in? What if, when you're starting from scratch, the tool got out of your way and gave you maximum freedom? What if, when you're working with existing code, the tool automatically showed you the context and constraints? What if, when you're making incremental updates, the tool connected your changes to user data and outcomes?
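One way to make that phase-dependence concrete is to imagine a tool that keeps a small feedback profile per phase and switches profiles as the work changes. The sketch below is a thought experiment, not a real tool's configuration: the phase names come from the four phases above, while the profile fields and their settings are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Phase(Enum):
    BLANK_SLATE = auto()  # exploring from nothing
    IMPROVING = auto()    # refining an existing product
    MAINTENANCE = auto()  # bug fixes, refactors, upgrades
    INCREMENTAL = auto()  # small tweaks driven by user feedback


@dataclass(frozen=True)
class FeedbackProfile:
    show_context: bool       # surface the surrounding system?
    run_safety_checks: bool  # catch breakage before it lands?
    link_user_data: bool     # connect changes to real usage?
    constraints: str         # how much the tool pushes back


# Hypothetical mapping: each phase gets the feedback it needs,
# instead of one loop forced onto every kind of work.
PROFILES = {
    Phase.BLANK_SLATE: FeedbackProfile(False, False, False, "none"),
    Phase.IMPROVING:   FeedbackProfile(True,  False, False, "soft"),
    Phase.MAINTENANCE: FeedbackProfile(True,  True,  False, "strict"),
    Phase.INCREMENTAL: FeedbackProfile(True,  False, True,  "soft"),
}


def feedback_for(phase: Phase) -> FeedbackProfile:
    """Return the feedback configuration for the current phase of work."""
    return PROFILES[phase]
```

The blank slate gets no context and no constraints (maximum freedom), maintenance gets strict safety checks, and incremental updates are the only phase wired to user data—exactly the distinctions a one-size-fits-all tool flattens.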

Maybe the interface itself should change. Maybe the feedback should change. Maybe the workflow should change. Because the creative process seems to change depending on what you're trying to accomplish.

Perhaps this is why one-size-fits-all tools can feel limiting. They're trying to be everything to everyone, which might mean they're not optimized for any specific phase of creation. Maybe the future of design tools isn't about making one tool that does everything—maybe it's about tools that understand context and adapt accordingly.

Stay tuned for part 3, where I'll explore what this might look like in practice.