Smart experiences beyond the chatbot
Smart experiences don’t just respond. They read the context, anticipate needs, and guide people with simplicity. In this article, we explore how to design truly useful AI interactions: adaptive, consistent, and integrated.

In recent months, there has been a lot of talk about AI and design. However, as our feeds filled up with generic chatbots and flashy demos, one question emerged across several of our projects: What does it mean to design a smart experience?
We’re not talking about an assistant that responds well to prompts. Nor are we talking about a "smart" function tacked onto an already complex interface. We're talking about a system that understands our intentions, interprets context, and assists us proactively.
It's an intelligence that doesn't impose itself but supports. It doesn't surprise; it simplifies. It isn't visible, but it's felt.
Useful AI is contextual and knows its place
Today, the most common risk is prioritizing technological novelty over people’s actual needs. In many digital experiences, AI is introduced as an external add-on: a generic chatbot, a voice command assistant, or a "smart" feature that is poorly adapted to the context.
However, designing with AI doesn't mean adding a new feature. It means rethinking interactions based on what this technology can uniquely do: interpret context, learn from data, and act dynamically. The question is not whether to use AI but where and how it can create real value for users and organizations.
When it delivers value, AI doesn't need to show off. It's most effective when it's invisible yet present. It's most effective when it doesn't interrupt but completes. It acts in context, based on the moment, environment, and users' micro-intentions.
It's an intelligence designed to be perceived, not flaunted. It saves time, reduces friction, and anticipates real needs.
Designing contextual AI requires stepping beyond the "prompt → response" logic to think in terms of collaboration and dialogue between the user and the system. That means feeding the AI realistic examples: not just words, but situations, cues, and implicit signals.
It's also a question of balance between an AI that suggests and supports, and one that decides and acts. In either case, the user must retain a sense of control and understanding. Intelligence should be a silent companion, not a mysterious force.
Intent-based interaction: design for goals, not commands
Many AI interfaces today are built around command-based logic. The user submits a request, which the system then executes with varying effectiveness. However, that’s not how intelligence works in real life. A thoughtful waiter doesn't ask, "Would you like the menu?" They sense when you might need it.
This shift is from UX built on predefined flows to experiences that adapt dynamically to what’s needed in the moment. A useful feature doesn't force you to find the right command: it understands your goal and integrates itself into the experience without friction.
This is where an intent-based approach comes in, designing for the goal rather than the phrasing of the request. This requires understanding context, creating realistic use scenarios, and training AI to suggest rather than respond.
It means rethinking touchpoints instead of adding chatbots everywhere. Sometimes, the most effective approach isn't to revolutionize the entire experience but to focus on areas that matter most.
Smart micro-interactions: silent, effective, valuable
A new interface or full-on conversation isn’t always necessary. Often, AI makes the biggest difference in smart micro-interactions: small automations, contextual suggestions, dynamic completions, and adaptive sorting.
These invisible yet valuable interventions quietly enhance the experience without drawing attention to themselves. Together, they reveal an intelligence that works precisely because it integrates. It's the sum of these moments that shapes the perception of a smart product.
Precisely because these interventions don't impose themselves, they must be visually consistent, logically harmonized, and woven into a fluid narrative. The goal is not to impress, but to simplify.
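One of the micro-interactions mentioned above, adaptive sorting, can be sketched in a few lines (the menu items and usage log are illustrative, not from any real product): frequently used items quietly rise to the top, while ties keep their original order so the interface never feels rearranged at random.

```python
from collections import Counter

def adaptive_sort(items: list[str], usage_log: list[str]) -> list[str]:
    """Reorder a menu so the user's most-used items surface first.
    Python's sort is stable, so ties keep the original, familiar order."""
    counts = Counter(usage_log)
    return sorted(items, key=lambda item: -counts[item])

menu = ["Export", "Share", "Duplicate", "Archive"]
log = ["Share", "Share", "Duplicate", "Share", "Archive"]
print(adaptive_sort(menu, log))  # ['Share', 'Duplicate', 'Archive', 'Export']
```

The stable-sort fallback is the design detail that keeps the automation silent: the layout only changes where there is clear evidence to justify it.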
Assistive or agentive?
When designing AI-powered experiences, it’s helpful to distinguish between:
- Assistive AI: Proposes, suggests, facilitates
- Agentive AI: Decides, guides, automates
There is no single right solution. But there is a balance to be found, which changes depending on the context. In sensitive scenarios, users will want to retain control. In others, they will be happy to delegate.
The task of design is to regulate this relationship: to instil trust, make things understandable, and prevent intelligence from turning into opacity.
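One way this regulation is often operationalized — sketched here with a hypothetical confidence threshold and reversible-action convention, not a prescribed pattern — is to act only when confidence is high and the stakes are low, and to fall back to suggesting otherwise:

```python
def act_or_suggest(action: str, confidence: float, sensitive: bool) -> str:
    """Agentive when confident and low-stakes; assistive otherwise,
    so the user retains control in sensitive scenarios."""
    AUTO_THRESHOLD = 0.9  # hypothetical tuning value
    if confidence >= AUTO_THRESHOLD and not sensitive:
        return f"done: {action} (undo available)"  # agentive: acts, but reversibly
    return f"suggest: {action}?"                   # assistive: proposes, user decides

print(act_or_suggest("archive old drafts", 0.95, sensitive=False))
print(act_or_suggest("send payment", 0.95, sensitive=True))
```

Note that even the agentive branch offers an undo: delegation is acceptable to users when it remains reversible and legible.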

Not just prompts: toward intelligence design
All too often, we mistakenly believe that designing the interface is enough. However, with AI, what matters even more is what’s behind it: the data, examples, and use cases that teach the system to recognize context.
Designers play a key role in defining scenarios, prototyping behaviours, and teaching the system what is expected of it.
Designing AI-powered experiences isn't just about deciding where to implement AI. It means training the system with realistic prompts built from real user needs and scenarios. It means testing, observing, and refining.
Most importantly, it means not letting AI dictate the shape of the experience. Design remains a fundamental tool for shaping intelligence, channelling its potential, resolving ambiguities, and making it usable and trustworthy.
At Tangible, we’re doing this across several projects. We’ve learned that there is no magic formula.
However, there is a guiding principle to which we always return: start with people. Consider their contexts, goals, and expectations.
Only then can AI become a concrete form of innovation.