
The Next Great Interface Might Not Have a Screen

For years, design was about screens, gestures, and pixels. But what if the next leap in product design isn’t visual at all?


Shahi M.

July 20, 2025 · 5 min read

I still remember when Google Search felt like a miracle. A single box. No clutter. Just questions and answers. It was one of the most beautifully designed products I’ve ever used—and probably one of the most influential. It shaped how we interact with the internet. It shaped how we find truth.

But something subtle has been happening. We’re typing less. We’re talking more. And not just to each other—but to machines.

Last month, OpenAI dropped a video. A quiet conversation between Sam Altman and Jony Ive—two people who don’t usually sit in front of cameras like this. It wasn’t flashy. No launch. No keynote. Just a dimly lit bar, a shared drink, and the kind of curiosity that doesn’t feel rehearsed.

They didn’t say much explicitly. But they didn’t have to.

If you’ve worked in product long enough, you could feel it. The ambition wasn’t about features or specs. It was about rethinking the device itself—the very object we use to think, search, speak, work, connect.


From rectangles to presence

For the past 15 years, the dominant paradigm has been the touchscreen. A slab of glass that we poke, swipe, and stare at. Phones got faster. Cameras got sharper. But the core interaction didn’t change.

What Altman and Ive are building seems to reject that. It’s not a phone. It’s not a wearable. It’s not even screen-first. The early prototypes reportedly fit in a pocket or sit on a desk. That’s it.

In other words, it’s a device built not for visibility—but for presence.

It listens. It understands. It responds. Not with buttons or taps, but with intent and context. And maybe that’s what comes next—not a better phone, but a calmer computer. One that knows when not to interrupt.
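To make that concrete, here’s a toy sketch of what “knowing when not to interrupt” might reduce to. To be clear, this is pure speculation on my part: nobody outside the project knows how the device actually works, and every name and threshold below is invented for illustration.

    # A toy sketch of "calm computing": respond only when the intent is
    # clear AND the moment is right. All names and thresholds here are
    # hypothetical; nothing is known about the real device's internals.

    from dataclasses import dataclass

    @dataclass
    class Moment:
        user_in_conversation: bool       # is the user talking to another person?
        seconds_since_last_reply: float  # how long since the device last spoke
        intent_confidence: float         # 0..1, how sure it is what was asked

    def should_respond(m: Moment,
                       min_confidence: float = 0.8,
                       cooldown_s: float = 300.0) -> bool:
        """Stay silent by default; speak only when confident and welcome."""
        if m.user_in_conversation:
            return False  # never talk over a human conversation
        if m.seconds_since_last_reply < cooldown_s:
            return False  # don't become a chatterbox
        return m.intent_confidence >= min_confidence

    # A clear request, a quiet room, a long silence -> respond.
    print(should_respond(Moment(False, 900.0, 0.93)))  # True
    print(should_respond(Moment(True, 900.0, 0.93)))   # False

The interesting part is the default. A screen-first device defaults to showing you something. A presence-first device defaults to silence.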


A quiet shift already underway

If this sounds far-fetched, consider this: we’re already moving away from the search box.

A 2025 survey found that 71.5% of people use AI tools for search, and 14% use them daily. Google’s global search share dropped below 90% for the first time in a decade. And while AI chat still accounts for only about 3% of total search traffic, it’s growing fast: up 80% year over year. If that pace held, today’s 3% would pass 5% within a year and reach roughly 10% within two.

We may not notice it day to day, but the interaction model has changed. Instead of typing keywords and scanning links, we just ask.

What started as an experiment—chatting with AI—has become a habit. And habits, not headlines, are what change markets.


But… what is this new device?

Let me be clear—I don’t think this product, whatever it ends up being, will immediately replace the iPhone. Or that we’ll all carry screenless assistants next year.

But I do believe something new is coming.

A third device. Not a replacement, but a companion. Smaller. More convenient. More passive. One that understands us without always asking for attention. And most importantly, one that’s designed from day one to be AI-first.

Not AI bolted onto an old interface. But something where the intelligence is the interface.


The product design challenge ahead

This shift poses new questions for us as designers and builders:

  • How do you design for voice and presence, not screens?

  • How do you build trust into a device that listens all the time?

  • What does UX look like when it’s ambient—when you’re not looking at it?

These are no longer academic questions. They’re real. They’re near. And they’re going to define the next decade of product design.
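Take the second question, the one about trust. My guess, and it is only a guess, is that trust in an always-listening device ends up being an architectural property rather than a settings toggle: a hard gate between the microphone and the network. The sketch below is hypothetical from top to bottom; every name in it is mine, not anyone’s real API.

    # A hypothetical trust boundary for an always-listening device:
    # audio is analyzed on-device only, and nothing crosses the network
    # until an explicit wake event. Invented names throughout.

    from typing import Iterator

    class MicStream:
        """Stand-in for an on-device audio source."""
        def frames(self) -> Iterator[bytes]:
            yield from ()  # real hardware would yield audio frames here

    def detected_wake_word(frame: bytes) -> bool:
        """Stand-in for a small wake-word model that runs locally."""
        return False

    def send_to_cloud(frame: bytes) -> None:
        """The ONLY code path allowed to transmit audio off-device."""
        ...

    def listen(mic: MicStream, hardware_mute: bool) -> None:
        armed = False
        for frame in mic.frames():
            if hardware_mute:     # a physical switch overrides everything
                armed = False
                continue
            if not armed:
                armed = detected_wake_word(frame)  # local inference only
                continue          # pre-wake audio is never transmitted
            send_to_cloud(frame)  # post-wake: the user opted in

If the only code path that can transmit audio sits behind a local wake-word check and a physical mute switch, “it listens all the time” stops being a threat and starts being a design constraint you can actually reason about.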

We’ve seen this before. Apple didn’t invent the smartphone—but they redefined it. Google didn’t invent search—but they designed it into something people could fall in love with. Now, we’re at the edge of the next shift.

AI has already changed how we search. The next step is changing where we search from.

And someone is going to build the object that makes that feel… human.