The pressure is real. The direction is not.
We hear it in almost every conversation with the companies we work with: “We need to do something with AI.” The pressure is real. According to McKinsey, 92% of companies plan to increase AI investments over the next two years. Yet only 1% of executives consider their organisation mature in how AI is actually deployed. Most are still in the experimentation or piloting phase.
That gap between spending and readiness is where the problems start, and not only in corporate communications. Without a clear direction, organisations risk deploying tools misaligned with their messaging, building on content that has never been reviewed for accuracy or relevance, or simply ignoring how their stakeholders now look for information.
How people find corporate information is changing
One shift worth paying attention to: a growing share of queries now goes through what we might call answer engines, AI-powered tools that synthesise responses instead of just listing links. ChatGPT counts over 900 million weekly active users (2026). Google’s AI Overviews appear in a quarter of all searches, and when they do, clicks to websites drop by up to 58% (Ahrefs, Feb 2026). In Google’s AI Mode, 93% of sessions end without a single external click (Semrush, 2025).
Fewer visits, but better ones: visitors from AI channels convert at 4.4 times the rate of traditional search (Semrush, 2025).
The catch is that you only benefit if the model picks your content as a source.
GEO: a new discipline, not a new trick
The emerging discipline around this is called Generative Engine Optimization (GEO). But it would be a mistake to treat it as a bag of tricks: the algorithms behind answer engines are opaque, and results can shift rapidly.
This is the reason to focus on fundamentals. The signals that answer engines reward over time, as their algorithms evolve, are rooted in trust: accurate, well-structured, current information backed by credible sources.
What this means in practice
- Map the questions your stakeholders actually ask.
- Organise content into topic clusters deep enough that an AI model can find a complete answer in one place.
- Keep pages current.
- Write in a modular way, where each section can be extracted and understood on its own, because that is how AI models process information.
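The checklist above can be sketched as a crude self-audit: split a page into self-contained sections, then verify that each mapped stakeholder question is answered by exactly one section. Everything here (the helper names, the naive keyword matching, the sample page) is illustrative, not a tool or method described in this article.

```python
import re

def split_sections(page: str) -> dict[str, str]:
    """Split a markdown-style page into {heading: body} sections."""
    sections, current = {}, None
    for line in page.splitlines():
        m = re.match(r"^##\s+(.*)", line)
        if m:
            current = m.group(1).strip()
            sections[current] = ""
        elif current is not None:
            sections[current] += line + "\n"
    return sections

def coverage(questions: list[str], sections: dict[str, str]) -> dict[str, list[str]]:
    """For each stakeholder question, list the sections whose heading plus
    body contain all of the question's longer keywords (crude matching)."""
    report = {}
    for q in questions:
        keywords = [w.lower() for w in re.findall(r"\w{5,}", q)]
        report[q] = [
            h for h, body in sections.items()
            if all(k in (h + " " + body).lower() for k in keywords)
        ]
    return report

page = """\
## Dividend policy
We pay out roughly 40% of net profit as dividend each year.

## Sustainability targets
We aim to reach net-zero operations by 2040.
"""

questions = [
    "What is the dividend policy?",
    "When will operations reach net-zero?",
]
print(coverage(questions, split_sections(page)))
```

A question that maps to no section, or to several, flags exactly the gap or overlap the topic-cluster exercise is meant to surface.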
Wikipedia: a direct input into what AI tells your stakeholders
Wikipedia appears on average in over 12% of AI-generated citations (ChatGPT, Perplexity, AI Overviews), making the Wikipedia article about your company a direct input into what answer engines tell your stakeholders.
Governing that content (keeping it accurate, well-sourced, and up to date) requires sustained work: research, drafting, community engagement. We have been doing this for 18 years, and its relevance has only grown.
When companies build their own AI tools
Then there is the question of companies building their own AI tools: a search engine on the corporate website, an investor relations assistant, an internal chatbot. These systems are only as good as the knowledge base that feeds them. We help define scope, prepare content, and test outputs against tone of voice and strategic priorities before they reach stakeholders.
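A minimal sketch of why the knowledge base matters (illustrative only; the data, function name, and matching logic are invented for this example, not part of our methodology): an assistant can only answer from whatever vetted content it has been fed, and the safest behaviour for anything outside that perimeter is to admit the gap rather than guess.

```python
# Dummy knowledge base with invented example facts.
knowledge_base = {
    "dividend": "FY2024 dividend: EUR 0.45 per share.",
    "headcount": "Employees at year-end 2024: 1,250.",
}

def answer(question: str) -> str:
    """Return the first knowledge-base entry whose topic appears in the
    question; otherwise admit the gap instead of improvising."""
    q = question.lower()
    for topic, fact in knowledge_base.items():
        if topic in q:
            return fact
    return "No vetted content available for this question."

print(answer("What was the dividend last year?"))
print(answer("What is your AI strategy?"))
```

The second question falls outside the defined perimeter, so the sketch returns a refusal: scoping the knowledge base and testing outputs before launch is what keeps that boundary deliberate rather than accidental.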
Start from the need, not from the tool
None of this starts with technology. It starts with clarity about what the organisation needs, a defined perimeter, and then the right tools for the job. That is where we come in.
We are hosting an invitation-only event in Milan in April 2026: “AI in corporate: finding your bearings before setting off. From the simplicity trap to decisions that last.” Leaders from major companies that have built AI solutions for their stakeholders will share what worked, what failed, and what they would do differently.
Places are limited. For information: event@lundquist.it