
In recent months, what was until recently a hypothetical conversation has become a tangible change.
The impact of artificial intelligence platforms on web traffic is no longer a question of whether it will happen, but of how much is already happening without many people noticing. David Bell, of the SEO consultancy Previsible, has documented it with data in a recent analysis published in Search Engine Land: between January and May 2025, sessions from assistants such as ChatGPT, Perplexity, Claude, Gemini and Copilot grew from 17,076 to 107,100 across 19 Google Analytics 4 properties, a 527% jump in just five months.
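As a quick sanity check, the 527% figure does match the raw session counts reported; a minimal sketch, assuming simple percentage growth over the two totals:

```python
# Verify the reported growth: sessions from AI assistants, Jan vs. May 2025.
start, end = 17_076, 107_100
increase = (end - start) / start * 100
print(f"{increase:.0f}% increase")  # -> 527% increase
```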
Complex questions, precise answers
The phenomenon is not evenly distributed. Bell shows that the majority of this new traffic is concentrated in sectors where trust and precision are essential. Legal, finance, health, small business and insurance already account for 55% of the visits driven by language models. In some cases, such as certain SaaS sites, more than 1% of all sessions (still small, but growing) arrive from these platforms.
This is no coincidence: users tend to ask AI the complex, highly contextualized questions they previously reserved for a human specialist, such as drafting a contract, checking a medication's compatibility or structuring a payroll.
In this new cast, ChatGPT leads comfortably, generating between 40% and 60% of AI traffic in almost every sector. But the map is diversifying.
Perplexity, for example, has become strong in finance, legal and small business; Copilot stands out especially in finance and legal; Gemini is beginning to emerge in insurance and small business; and Claude, although still with very modest figures, is present in every vertical. For Bell, this diversity is a clear warning: optimizing content with a single platform in mind is a risky strategy.
A change of speed …
But the underlying change goes beyond where the visits come from. The way language models find and surface content does not follow Google's rules. According to the analysis, there is no gradual crawling, no hard competition for a blue link on the first page, no waiting period for a canonical tag to propagate. The criterion is immediate: if the content answers the question clearly, reliably and in a structured way, the model can cite it instantly.
Bell describes this change as the entry into an “era of instant surfacing”, in which an article, a guide or even a product page can be discovered and used by the AI before it has climbed positions in organic results:
“Content does not need to appear at the top of the Google SERP to be found. It needs to be clear, structured and citable by the model, whether in a blog, a help document, a case study or a knowledge base. And this means that the old mentality of publishing, waiting and hoping for Google to discover it is dangerously obsolete.”
This term, “surfacing”, is not especially common (at least you don't hear it much) and refers to the moment when a piece of content is discovered and shown by a platform (in this case an LLM such as ChatGPT, Perplexity or Gemini) in response to a user's query. In traditional SEO, the process for your content to “surface” (pardon the word) goes through several phases:
1. Google crawls your page.
2. The algorithm decides its position in the ranking.
3. The user performs a search.
4. Your page appears in the results and, if you are lucky, the user clicks.
That flow can take days or weeks, and visibility depends on competing with other pages for a position in the results. In the era of instant surfacing (forgive the term again) that Bell describes, by contrast, LLMs can surface your content without waiting for it to rank well on Google. The idea is that if an LLM such as ChatGPT with web browsing, or Google's generative search experience (SGE), has been trained on or has access to your content, it can “select” the relevant information immediately. You don't have to wait for your page to climb position by position.
This shift requires a rethinking of visibility strategies. For Bell, the first step is to measure, even imperfectly, the traffic that comes from these platforms: use tracking parameters, watch for unusual peaks in direct traffic and look for correlations with mentions in AI chats.
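A minimal sketch of what that first measurement step could look like, assuming a GA4 export as a CSV with session_source and sessions columns; both the file name and the referrer hostnames below are illustrative and should be verified against your own reports:

```python
# Classify exported GA4 sessions by AI platform based on referrer source.
import csv
from collections import Counter

# Illustrative referrer hostnames; check the exact strings in your own data.
AI_SOURCES = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "claude.ai": "Claude",
    "gemini.google.com": "Gemini",
    "copilot.microsoft.com": "Copilot",
}

def ai_traffic_by_platform(path: str) -> Counter:
    """Sum sessions per AI platform from a CSV export of traffic sources."""
    totals: Counter = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            source = row["session_source"].lower()
            for host, platform in AI_SOURCES.items():
                if host in source:
                    totals[platform] += int(row["sessions"])
                    break
    return totals

if __name__ == "__main__":
    for platform, sessions in ai_traffic_by_platform("ga4_sources.csv").most_common():
        print(f"{platform}: {sessions} sessions")
```

Unusual peaks in “direct” traffic deserve the same scrutiny, since visits from some AI platforms arrive without a referrer and land in that bucket.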
… And a change of approach
From there, content must be restructured so that it works not only on human screens but also inside the answers a model generates: clarity, concision, well-defined sections and formats that make information easy to extract. What matters is no longer ranking in a specific position, but becoming the source that a model selects and shows. And this applies across the entire funnel, from articles and case studies to product pages and support documentation.
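Bell does not prescribe a specific format, but one common way to make a page's key answers machine-extractable is structured data. A minimal sketch using schema.org FAQPage markup; the question and answer are placeholders, while the schema.org types themselves are real:

```python
# Emit schema.org FAQPage JSON-LD that can be embedded alongside an article.
import json

def faq_jsonld(pairs: list[tuple[str, str]]) -> str:
    """Render question/answer pairs as an embeddable JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }
    return f'<script type="application/ld+json">{json.dumps(data, indent=2)}</script>'

print(faq_jsonld([
    ("What is instant surfacing?",
     "The moment an LLM discovers and cites a piece of content in its answer, "
     "without that content first having ranked in organic search results."),
]))
```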
The underlying change goes beyond where the visits come from. As we have seen, the way language models find and surface content does not follow Google's rules. And this is where Bell's snapshot intersects with the reality we already see in the Google-dominated ecosystem. The company, through its vice president and Head of Search, Liz Reid, argued a few days ago that its new AI-powered search experience maintains web traffic and even increases the “quality” of clicks. The experience of creators and marketers, however, does not always match: impressions may hold steady, but organic clicks fall.
This tension creates an interesting contrast: while Google maintains that its AI experience “highlights the web” and Previsible shows that LLMs can open new paths of discovery, the fact is that in both cases the model of access to information is moving away from the traditional organic click. On Google, visibility no longer guarantees traffic; in LLMs, selection depends on factors such as clarity, structure or the authority the model perceives, not on the SEO ranking of old.
We are in a transition phase
It is clear that we are still adapting to the new era of generative search, and there will be a thousand reports on this in the coming months. But what the Previsible analysis makes clear is that SEO is not disappearing; it is splitting. One part will continue to revolve around positioning in traditional search engines; the other, increasingly relevant, will depend on how AI models find and decide to show information.
And, as with every previous change of rules, those who adapt first will not only keep their visibility but may gain ground before the rest of their competitors react.
Image: ChatGPT