
Vibes-based Search

The next big disruption in software is here, and it understands what you’re looking for better than you do



The problem with the internet is that we have too much information and can find too damn little of it. In our palms, we hold the total sum of human knowledge, but most of the time, that isn’t enough to be genuinely useful. Thanks to a combination of SEO-optimized garbage and the technical challenge of organizing the world’s information, surfacing the answers we’re looking for on Google obliges us to navigate by knowledge star charts, triangulating our position in the digital universe with funky search terms like “best hiking trails AND iceland AND reddit.”

But “best hiking trails AND iceland AND reddit” is simply not how humans think. As I’ve discussed previously, much of our intellectual labor happens in the subconscious, in the elements of taste that make up how we process thought. We think in vague terms, like, “Flights to Iceland are cheap, I’ve heard the hiking there is pretty good.” Put simply: There is a vibe to how we think that is lost when we transform our questions into something Google search can understand.

The good news is that LLMs are the first technology explicitly trained on those subconscious elements of thought. Rather than relying on Boolean search—the borderline hieroglyphic set of logic rules you can use to make Google search more precise—you can fine-tune an LLM to think the way you do. That way, instead of mustering up the perfect search terms, you only have to wave in the direction of what you are looking for. You can type, “I’ve heard Iceland hiking is good, anything under five miles that’s kinda epic?” and get a list of solid options instead of getting stuck in SEO hell.
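
To make that concrete, here is a minimal sketch of what “waving in the direction” of an answer looks like in code: the vague Iceland question goes to an LLM as-is, with no Boolean operators or keyword engineering. It uses the OpenAI Python SDK purely as an illustration; the model name, prompt, and setup are my assumptions, not a description of how Google or any other product does this.

```python
# Minimal sketch: hand a vague, vibes-y question straight to an LLM
# instead of engineering a Boolean query. Assumes the OpenAI Python SDK
# (`pip install openai`) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

vague_question = (
    "I've heard Iceland hiking is good, anything under five miles "
    "that's kinda epic?"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat-capable model works
    messages=[
        {"role": "system", "content": "You are a helpful travel-search assistant."},
        {"role": "user", "content": vague_question},
    ],
)

print(response.choices[0].message.content)
```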

I call it “vibes-based search.” And it represents a profound evolution in the way we find and interact with information online. 

By making a large language model the principal technology in a search engine, vibes-based search moves the process beyond carefully chosen keywords and star maps to something more direct and actionable. And because it understands how we think, it can also help us find what we want even when we don’t know exactly what we’re looking for. Put another way, it represents an evolution beyond information and answers, to understanding.

I recognize that this description is a little dense and art-schooly. In plain English, LLMs have three new capabilities that change the nature of search:

  1. Intuitive leaps: LLMs can answer poorly written questions with spooky-accurate answers. Even when users only vaguely gesture towards what they are trying to say, LLMs can intuit what they actually mean.
  2. Ingest and understand new materials: Because it’s easy to integrate an LLM into an existing product, you can add search to essentially any dataset. Even better, the LLM can “understand” the material, letting you surface answers rather than just matching key phrases (see the sketch after this list).
  3. Co-create new answers: Generative AI allows you to generate new insights from existing data or remix multiple datasets together easily. It means search becomes a creative piece of software, as flexible as a programming environment. 
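
To ground the second capability, here is a rough sketch of one common way such a pipeline can be wired up: embed your own documents once, embed the vague query, rank by similarity, and let the LLM compose an answer from the closest matches. The OpenAI SDK calls, model names, and toy “notes” dataset are illustrative assumptions on my part, not a reference to any specific product.

```python
# Rough sketch of "vibes-based" search over your own dataset: an
# embeddings-plus-LLM pipeline, assumed here for illustration only.
import numpy as np
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Toy dataset standing in for "essentially any dataset."
notes = [
    "SaaS growth forecasts this year look muted compared to the 2021 peak.",
    "Fimmvorduhals is a dramatic day hike in southern Iceland.",
    "AI coding assistants meaningfully cut early-stage engineering costs.",
]

def embed(texts: list[str]) -> np.ndarray:
    """Return one embedding vector per input string."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

note_vectors = embed(notes)  # in a real system, compute and cache these once

def vibes_search(query: str, top_k: int = 2) -> str:
    """Rank notes by semantic similarity to a vague query, then answer from them."""
    q = embed([query])[0]
    scores = note_vectors @ q / (
        np.linalg.norm(note_vectors, axis=1) * np.linalg.norm(q)
    )
    best = [notes[i] for i in np.argsort(scores)[::-1][:top_k]]
    context = "\n".join(best)
    answer = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided notes."},
            {"role": "user", "content": f"Notes:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return answer.choices[0].message.content

print(vibes_search("are software companies growing fast this year?"))
```

The same pattern scales from a three-item toy list to a real corpus, and the vague question never has to be rewritten into keywords.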

Vibes-based search is an amalgam of these three capabilities. It means that you can write, code, and create in a way that traditional search can’t match. Let me show you what I mean.

Dumb prompt, smart answer

I have a particular ear for melodies and absolutely no talent for lyrics. I’ll start humming a song that is stuck in my head, but I’ll make up my own words because I have no idea what the song actually is. I imagine that when my wife and I were dating, this was mildly cute. Now, going into year four of our relationship, I am confident it has progressed to annoying. (Though she would never complain.)

While driving last week, I heard screeching guitars in my head and spontaneously started singing, “Na na na na na na na na.” Normally, I would give up at this point without identifying the song. However, this is one problem that vibes-based search can solve. I pulled up Google and queried: “There is a song that came out that went like nah nah nahh nah na nahhh na naa nahhhh. what song is that? i think it was angry and emo.”

Boom—immediate success. Google’s take on LLM search gave me an answer.


It was My Chemical Romance! Great band. Great song, albeit one that is embarrassingly titled “Na Na Na (Na Na Na Na Na Na Na Na Na).”  

LLMs are more intuitive than traditional search because they grasp the meaning behind your words, not just the words themselves. This makes vibes-based search helpful when you’re doing analytical work, too. For example, last week I wrote an article arguing that despite slowing investment, now is a great time to build a SaaS company because AI dramatically reduces costs. In the back of my head, I remembered stumbling across a data point about how software companies weren’t forecasting a ton of growth. Normally—though I admit that I should probably come up with a better note-taking system—it would be an excruciatingly difficult thing to find. I’d probably spend 30 minutes Googling, fumbling with Twitter search, and writing specific search queries in my email inbox. Instead, I just asked xAI’s Grok 2 model, “Are software companies growing fast this year?”

You’ll note that this is a terrible question. It’s incredibly vague, with no clear parameters about what I mean by “fast.” I don’t even say that I’m looking for data on it. But the system was able to intuit what I meant. Grok gave me a five-part overview of various statistics around startup growth rates, followed by a horizontal scroll of tweets related to the topic at hand. And there, like magic, was the tweet I had seen three months ago.

I have an experience like this multiple times a week. I’ll search for something weird or vague, ranging from dishes I can cook with specific ingredients to the definition of a technical concept, and the answer will appear. Hours of work that I used to have to put in just to navigate the internet have evaporated. And part of that comes down to no longer having to rely on the same old datasets.

Comments

Claire Bouvier about 1 year ago

Love this! Just shared it on LinkedIn - Human Centric AI is where it's at! Thank you!!!

Jennie Pakula about 1 year ago

Great post ... but you nearly lost me in the first paragraph, stating that we have 'the total sum of human knowledge in our hands'.

@jtowers349 about 1 year ago

Excellent. Every on a roll this week.

Sacha Horowitz 11 months ago

There is a special vibe to Evan's articles. I wonder about the writing with AI course.