Other Types
  1. All Blogs
  2. Product | Algolia
  3. Ai | Algolia
  4. E-commerce | Algolia
  5. User Experience | Algolia
  6. Algolia
  7. Engineering | Algolia

Ask AI: smarter search for docs, support, and beyond

Published:

If you’ve ever hit ⌘/Ctrl + K on a docs site and seen an Algolia logo in the search box, you’ve probably used DocSearch.

DocSearch started 10 years ago as a simple idea: help developers make open-source documentation instantly searchable. Fast forward to today and DocSearch shows up across ecosystems and beyond open-source (React, Tailwind, Vue, and a long listof others). It’s become part of the muscle memory of reading docs.

But documentation search has changed a lot in the last decade… and honestly, in the last year. Developers now expect LLM-powered answers alongside keyword search — without trading away speed, trust, or the ability to quickly jump to the source.

At Algolia DevCon 2025, I hosted a panel with the folks who’ve been living this evolution day-to-day:

  • Tanya Herman, Product Manager for DocSearch, the Crawler, and Ask AI

  • Natan Yagudayev, Engineering Manager who built the initial Ask AI prototype

  • Sian King, Staff Product Designer who led the DocSearch UI refresh and Ask AI design

In this post I’ll try to summarize the core topics of that conversation: what made DocSearch “stick” with developers, what’s new in DocSearch v4, and how Ask AI fits into the modern docs experience. You can also watch the full discussion here:

Why developers love DocSearch

Tanya framed DocSearch’s origin ten years ago as a gift to the developer community. It was built to solve one job extremely well: make technical documentation searchable.

Over the years, a few DocSearch features have become “non-negotiables” that developers rely on across all doc sites.

Keyboard-first by default

DocSearch helped normalize the Cmd K (or Ctrl K) pattern across developer docs. Once that shortcut becomes instinct, every site that doesn’t support it feels… off.

A recognizable, low-friction UI

Sian called out the details developers notice: the familiar modal layout, the shortcut hints, and the small “powered by Algolia” cues that signal you already know how this works.

Setup that doesn’t turn into a project

DocSearch became popular partly because it’s fast to adopt. You shouldn’t need a sprint to make your docs searchable.

There’s also a behind-the-scenes detail Tanya shared that shows how much the product has grown: in the early days, every DocSearch application was manually reviewed by a human. Great for quality, terrible for scale. Modern DocSearch needed to keep quality high and remove that bottleneck. This fits in with an overarching philosophy of keeping what developers love while modernizing what held us back.

What changed in DocSearch v4

DocSearch v4 modernizes both the developer and user experience across UI, onboarding, and features while trying hard not to mess with what developers loved over the last 10 years.

Self-service onboarding in minutes

Instead of waiting on a manual review loop, DocSearch now supports automated eligibility verification and onboarding so teams can go live quickly. 

A modern, more flexible UI foundation

DocSearch v4 is built on Algolia Autocomplete, and the docs emphasize improvements like accessibility, responsiveness, and overall UX. 

An easy upgrade path

One of the clearest product goals Tanya described: evolve DocSearch without adding friction. The migration path from v3 to v4 is documented, and the intent is to avoid forcing teams to “start over.” 

And then there’s the biggest shift…

Ask AI: conversational answers, inside the DocSearch flow

DocSearch v4 introduces Ask AI as a conversational assistant mode directly in the UI. Ask AI layers conversational answers on top of the search experience developers already trust. But the part that mattered most to us in the panel wasn’t “we added a chatbot.” It was: how do we add answers without breaking the developer workflow?

One input, two modes

Sian walked through a key design decision: keep keyword search and Ask AI in the same modal, so it’s easy to move between the two.

The interaction pattern is intentionally lightweight:

  • You type in the same search box as always.

  • The first result presented is the Ask AI option.

  • Hit Enter to get the conversational answer.

  • Or tab down to the classic keyword results.

That’s subtle, but important: Ask AI becomes an “upgrade path” inside the same motion developers already know.

Natan also pointed out that the conversational flow intentionally doesn’t mimic a generic chat app. For example, the input stays at the top—consistent with DocSearch’s layout—rather than pushing you into a bottom-up chat paradigm. It’s a small design choice to enhance the experience.

Under the hood: making AI answers work on real docs

On the engineering side v4 required new frontend support, plus backend infrastructure for retrieval and answer generation. In the process of building this we learned a few valuable lessons.

Retrieval has to be first-class

Ask AI is designed to use your Algolia index as the source of truth for responses—turning search hits into context-aware responses. In practice, that meant the system needed to reliably retrieve the right chunks before an LLM can do anything useful.

Index structure matters more than you think

Not every index is automatically “LLM-ready.” Records may be too thin, too messy, or shaped in a way that’s hard to turn into grounded answers. One v4 addition here: markdown indexing support (via crawler) to improve chunking and context for LLM consumption.

Tooling + guardrails aren’t optional

Natan talked about iterating on prompting many, many times—partly because people will try to hijack it. The official Ask AI docs also explicitly cover prompting techniques and security/compliance best practices. 

“Bring your own LLM” and cost controls

Ask AI is BYO LLM: you connect your preferred provider using your own API keys, and choose the model you want. (see the Getting Started docs) In the panel, Natan mentioned broad provider/model support, but the core point for developers is the flexibility: you can pick the model that matches your quality/latency/cost needs, and switch as the model landscape changes.

Cost control also came up. Two practical levers matter most:

  • Max tokens per response

  • How many search hits the model can use to construct an answer

Those are the kinds of knobs that keep an “AI feature” from becoming an “AI bill surprise.”

Ask AI, Agent Studio, MCP: a ladder of solutions

One part of the conversation I keep coming back to is Tanya’s “ladder” framing of Algolia’s agent offerings:

  • Ask AI: packaged, plug-and-play RAG for docs/search

  • Agent Studio: more customization and workflows beyond the packaged experience

  • MCP: maximum extensibility for teams orchestrating agents and tooling across systems

That ladder matters because not everyone wants the same thing. Some teams want the fastest path to “answers in docs.” Others want deep orchestration across product data, support systems, and custom UIs.

(And it’s worth mentioning that Ask AI isn’t only for docs—it’s positioned for docs, blogs, and support content, and can be integrated beyond the default DocSearch UI if you want to build your own interface.)

If you’re building docs, this is the new baseline

Docs are an important part of your product surface. The search box has always been an important part of your docs, but more and more agents are how users learn your system.

DocSearch v4 tries to honor what made DocSearch beloved (speed, keyboard-first UX, low setup friction), while acknowledging the reality that developers now want:

  • Fast keyword results

  • Grounded conversational answers

  • A legacy of control, trust, and performance

Check out DocSearch today, and use Cmd/Ctrl-K to learn more!

Recommended

Get the AI search that shows users what they need