I was thinking about how much searching for information has changed recently.
Before, when you wanted to learn something, you would open your browser and end up with many tabs open at once, collecting bits of information from different websites to piece together a complete answer. It was a slow process, and you always had to sort through a lot of irrelevant content. Sometimes the search would simply end with: “No results match your query.”
Now, many of us just open one tab with a chatbot like ChatGPT or Gemini. This single interface has become the new starting point. And unlike a traditional search, it almost never says, “I don’t know.” It always provides an answer.
Because of this, a new race has started. Anyone who creates content or sells a product knows that to reach people, their information has to be included in the AI’s answers. They need to be visible to the AI.
I noticed something the other day that illustrates this. I was on a major website (TheirStack) and saw a link in their footer to a file called llm.txt. For years, websites have used a file called robots.txt to communicate with search-engine crawlers, but they almost never put a visible link to it in the footer. Seeing llm.txt placed there so openly shows how seriously companies are taking this. They feel there is still time to become a source for these AI models, and they’re acting on it quickly.
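For context, robots.txt is a plain-text convention that tells crawlers which paths they may visit, while files like the one above usually follow the emerging llms.txt proposal: a plain Markdown page that gives language models a curated summary of a site. A minimal sketch of each, assuming the common conventions (example.com and all paths here are placeholders, not TheirStack's actual files):

```text
# robots.txt — directives for search-engine crawlers
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
```

```markdown
# Example Company

> One-line summary of what the site offers, written so an LLM can ingest it.

## Docs

- [Getting started](https://example.com/docs/start): overview of the product
```

The contrast is the point: robots.txt tells machines what *not* to read, while llms.txt hands them exactly what the site wants them to read.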
The path to reaching users has changed. It used to be the wide-open internet, but now it’s becoming the single, direct answer from an AI. Companies are betting that more and more people will get their information this way. As a result, the entire industry is adapting to this reality, and it’s affecting more than just search engines.
Another point is how pervasive these LLMs are. A normal website is isolated; it can’t access or interfere with another website’s content. But LLMs can. They can process and connect information from all over the internet. And because they see our questions and prompts, they are also learning how we think and what problems we are trying to solve.
I’ve always thought that humans had plenty of data. The real challenge was getting the exact data you needed, at the moment you needed it, in a form you could use instantly. LLMs are solving this problem. For the first time, we’ve built a system that combines all of this information with the power to process it. And since we are constantly telling it what we’re working on, it’s learning from us in real time.
