
The search revolution is here. And we're just getting started.



I joined Algolia twelve years ago as an intern at a five-person startup. Today we power 1.8 trillion searches a day, serve 18,000+ customers, store 30 billion records, and return results in under 20 milliseconds. But more interesting than those numbers is what that journey taught me about the fundamental tension at the heart of every search experience, and why generative AI is the first technology in decades that has a real shot at resolving it.

At a recent event, AI Day in Paris, I had the opportunity to share my thoughts on this publicly. The full talk is linked below, and what follows is a deeper look at the core thesis.


The lie we all accepted

When Amazon launched, they buried the search bar. It was almost an afterthought. Today it's the front door. But despite moving to the center of every digital product, search has been asking users to adapt to it, not the other way around.

Think about how you actually type into a search box. You strip out articles, crush your intent into two or three keywords, omit context that would be obvious in conversation. We call this "natural" because we've been trained to do it for decades. It isn't natural at all. It's a concession to the machine.


Three moments, three failure modes

To build the right mental model, I use three real shopping experiences from my own life. They map cleanly onto the three zones of the purchase funnel, and each one exposes a different failure mode in how search currently works.


The bottom-of-funnel case is the one search engines were historically designed for, and even there, we fail users constantly. The no-results page is one of the most expensive things a commerce site can show. It ends journeys. Before Algolia, slow, static results pages made it impossible for users to interactively narrow criteria and see results adapt in real time. Speed and relevance working together was our original bet, and it still matters enormously.

But the top and middle of the funnel? That's where traditional search has been structurally blind. These aren't edge cases. They represent the majority of high-value discovery journeys.

Three eras of search

The demo that showed me what's possible

At the event, I ran a live demo (with the expected internet gremlins; anyone who presents live demos knows the feeling). The scenario: I need a complete outfit for a casual wedding, discovered through conversation, not filters.

I typed "casual wedding" as a rough intent. The system guided me toward categories: a blazer, then shoes, suggesting color options. At each step I could go deeper into structured facets if I wanted precision, or stay in natural language if I wanted to keep exploring. One intent, a complete outfit, navigated across multiple categories without me ever having to understand the taxonomy of the site.
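The flow above can be sketched in a few lines: a rough intent expands into an ordered plan of categories, and each step accepts a structured facet refinement. Everything here is hypothetical and illustrative; the names, categories, and functions are assumptions, not Algolia's actual API.

```python
# Hypothetical sketch of the guided demo flow: one rough intent expands into
# per-category steps, each refinable with structured facets. Illustrative only.

OUTFIT_PLAN = {
    # Assumed mapping from a rough intent to the categories to walk through.
    "casual wedding": ["blazer", "shoes", "accessories"],
}

def plan_outfit(intent: str) -> list:
    """Expand a rough intent into the ordered categories to guide the user through."""
    return OUTFIT_PLAN.get(intent, [])

def refine(category: str, facet: str, value: str) -> dict:
    """One drill-down step: the user picks a structured facet within a category."""
    return {"category": category, facet: value}

steps = plan_outfit("casual wedding")          # ["blazer", "shoes", "accessories"]
selection = refine(steps[0], "color", "navy")  # {"category": "blazer", "color": "navy"}
```

The point of the sketch is the shape of the interaction: the user supplies one intent and the system, not the user, carries the burden of knowing the site's taxonomy.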


The same pattern applies far beyond e-commerce. Consider media and knowledge: with the volume of news hitting us daily, being able to ask for "articles covering diverse perspectives on this topic" rather than searching for keywords is a meaningfully different experience. Support and documentation have the same opportunity. This is a horizontal shift in how humans interact with structured information.

The integration problem is the hard part

Here's the engineering challenge I want to be direct about: the chat bolt-on is not the answer. Right now, most AI-augmented search experiences look like a chat widget in the bottom-right corner coexisting with a search bar in the header. These are two separate mental models running in parallel, and users hate choosing between them.

Every time users face a fork ("do I search or chat?"), a meaningful percentage take the third option: they defer and leave. The friction that adding AI was supposed to remove has instead been multiplied. The value is invisible because the surface doesn't expose it.

The hard technical work is unifying the keyword-and-facet paradigm with the conversational paradigm into a single coherent surface. That means understanding how structured queries and natural language coexist, when to switch modes, and how to maintain state across turns. This is where Algolia is investing heavily right now (a direction that analyst firms such as Gartner are also recommending).
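To make the mode-switching idea concrete, here is a minimal sketch of one way a unified surface might route each turn while carrying shared state. The heuristic, the class, and every name are assumptions for illustration; a production system would use a real intent classifier, and this is not Algolia's implementation.

```python
# Minimal sketch of a unified search surface: each turn is routed to either
# the structured (keyword/facet) path or the conversational path, while one
# session object carries state across turns. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SessionState:
    """State shared by both modes: active facets plus conversational history."""
    facets: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

def looks_structured(query: str) -> bool:
    """Crude stand-in for an intent classifier: short keyword-style input
    without conversational phrasing is treated as a structured query."""
    words = query.lower().split()
    conversational_markers = {"i", "me", "my", "need", "looking", "what", "which"}
    return len(words) <= 3 and not conversational_markers.intersection(words)

def handle_turn(state: SessionState, query: str) -> str:
    """Route one turn; both paths read and write the same session state."""
    state.history.append(query)
    if looks_structured(query):
        return "structured"      # would run against the keyword/facet index
    return "conversational"      # would go to the LLM with state.facets as context
```

The design choice the sketch encodes is the one the paragraph argues for: one entry point and one shared state, so the user never has to decide between a search bar and a chat widget.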

We are early. And that's the exciting part.

The technology is real. The capability is there. But the experience hasn't caught up yet. Today's AI search integrations are like the first smartphone browsers: technically functional, but not yet designed around what the medium makes possible.

When deep integration arrives, when conversation and structured search are truly unified, when the AI layer is embedded in every step of the journey rather than appended to it, that's when you'll feel the step function. Not a marginal improvement. A qualitative one.

Twelve years ago search was something companies hid. Today it's the entry point. Tomorrow it will be the expert that guides you through the entire experience. I've never been more certain about where this is going, or more energized about building it.

Watch the full talk from AI Day in Paris, including a live demo of conversational search in action, below. 
