BrightonSEO: The SEO Manager’s Log

Like most people walking into BrightonSEO this year, we half-joked that it should really be called Brighton AI, or maybe Brighton AEO. Even the main host opened by acknowledging the sheer number of AI-focused talks on the schedule – proof, if it were ever needed, that AI is the industry’s obsession and buzzword right now.

Each speaker brought their own take on how AI fits into our industry – from reporting and audits to content strategy and workflow automation. However, they all landed on the same conclusion: humans are still better at doing the job than AI.

The message across the conference was clear: AI is a great assistant, but it’s not the strategist. It can help you speed up tasks and uncover insights, but the real value still comes from human judgement, understanding intent, and the actual people behind the searches we’re trying to reach.

One question got a lot of us cheering: can we please stop calling it GEO? Because at the end of the day, it’s still just SEO.

My SEO takeaways from BrightonSEO October 2025

For me, the real value of being at BrightonSEO wasn’t the AI talk itself, much as I could geek out on that all day. The sessions I found most exciting were the ones that let me dig into the technical side of SEO, where I could pick up small wins and new ideas to bring back for our clients at Trio.

Surprisingly, the first highlight of the week for me wasn’t a flashy new AI tool or trend; it was something far more fundamental: server log files. Get ready for the techy stuff…

Log analysis for SEO is non-negotiable

The first talk I went to on Thursday really set the tone for me: “What Log Files Tell About Your Visibility in AI Search.” My main note from this session was simply: LOG ANALYSIS IS NON-NEGOTIABLE.

 

Effective log file analysis can help:

  • Identify crawl inefficiencies
  • Prioritise which pages need optimisation
  • Spot trends in how search engines, including AI-driven ones, are interacting with your site

Right now, AI-driven search data is largely a black box. Search Console doesn’t tell you much about how your site is being crawled by AI bots. Log file analysis gives you a clear picture of what’s happening. You can see exactly when bots are crawling your site, which pages they hit most, and how often.

That information can be gold for SEOs. It lets you identify crawl inefficiencies, prioritise which pages to optimise, and spot trends in how search engines (including AI-driven ones) are actually interacting with your website. In short, you can’t manage what you can’t see, and log files give you that visibility, something that’s becoming even more critical as AI continues to change the search landscape.
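To make that concrete, here’s a minimal sketch of the kind of parsing involved, assuming the standard Apache/nginx “combined” log format. The file name access.log and the bot list are illustrative – swap in whatever your server and your own logs actually use:

```python
import re
from collections import Counter

# Substrings that identify common crawlers in the User-Agent header.
# Extend this list with whatever actually shows up in your own logs.
BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Googlebot", "Bingbot"]

# Matches the Apache/nginx "combined" log format.
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()
with open("access.log") as f:
    for line in f:
        m = LINE.match(line)
        if not m:
            continue
        bot = next((b for b in BOTS if b in m.group("ua")), None)
        if bot:
            hits[(bot, m.group("path"))] += 1

# Most-crawled pages per bot: a starting point for crawl-budget decisions.
for (bot, path), count in hits.most_common(20):
    print(f"{bot:15} {count:6} {path}")
```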


Some of the additional insights I picked up from digging into logs:

  • Hallucinated pages: AI or search results sometimes surface URLs that don’t exist but still get clicked. By identifying these in your logs (see the sketch below), you could turn them into real, optimised pages and capture that traffic.
  • Tracking crawls vs. users: Tools like ChatGPT follow redirects, which can make GA4 show two “users” for the same crawl. Log files reveal that it was actually a single bot crawl, giving a more accurate picture of site activity.
  • It takes effort, but it pays off: Analysing and compiling log files isn’t a one-click task. It requires careful mapping and interpretation. But done properly, it produces powerful insights that directly inform optimisation strategy.
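To illustrate the hallucinated-pages point, the same parsed log can be filtered for 404s requested by AI bots. This continues the sketch above (LINE, BOTS and Counter are already defined there):

```python
# Any 404 requested by an AI bot is a candidate "hallucinated" URL:
# a page the model invented that real users may now be clicking.
phantom = Counter()
with open("access.log") as f:
    for line in f:
        m = LINE.match(line)
        if m and m.group("status") == "404" and any(b in m.group("ua") for b in BOTS):
            phantom[m.group("path")] += 1

for path, count in phantom.most_common(10):
    print(f"{count:4} bot hits on missing page {path}")
```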

 

Turning log file analysis into actionable SEO reporting

How to use log files for SEO & AI Insights

  • Regularly review log files to understand how bots are crawling your site.
  • Adjust robots.txt or server rules if one bot is consuming too much crawl budget.
  • Remember that AI agents prioritise new or recently updated pages, so keep key content fresh and crawlable.
  • Identify and fix “hallucinated” or unexpected pages with redirects or content updates.
  • Analyse crawl patterns over time to schedule updates when bots are most active.
  • Regular log analysis gives actionable insights for both SEO and AI-driven search visibility.

This was a talk I knew I had to attend after the first log file session. Log file analysis isn’t just for technical deep-dives. It can also be a tool for ongoing SEO reporting, even if it’s just to your own team. Regularly reviewing logs gives you insight into how different bots are crawling your site and helps you make smarter decisions.

You might notice that a certain bot is using a large portion of your crawl budget. If that’s limiting how often other important bots (like Googlebot) can crawl your key pages, you can adjust your robots.txt or other server rules to prioritise crawls that impact your SEO performance most. Essentially, you’re making sure your crawl budget is spent where it matters.
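As a rough illustration, the robots.txt adjustment for that scenario might look like the snippet below. ExampleVerboseBot is a hypothetical bot name, and note that Googlebot ignores Crawl-delay, so the directive only throttles crawlers that honour it:

```
# Hypothetical: slow down a bot that's eating crawl budget
# and keep it out of low-value internal-search URLs.
User-agent: ExampleVerboseBot
Crawl-delay: 10
Disallow: /internal-search/

# Leave Googlebot unrestricted.
User-agent: Googlebot
Disallow:
```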

AI agents tend to prioritise new or recently updated pages. You can use log file data to identify which pages are being crawled first by AI bots and ensure your most important content is discoverable and fully optimised. It also gives you a chance to correct “hallucinated” pages or unexpected URL hits by adding redirects or improving content.

Log files also reveal crawl patterns over time. You might find that AI bots crawl less during weekends or at night, mirroring or differing from your organic SEO traffic. This is useful for scheduling updates, publishing content, or timing site changes to maximise visibility when bots are most active. For example, releasing a major update when bots are crawling frequently increases the chances it will be indexed faster.
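Continuing the earlier log-parsing sketch, bucketing bot hits by hour is enough to surface that pattern (again reusing LINE and BOTS):

```python
from datetime import datetime
from collections import Counter

# Combined-format timestamps look like "10/Oct/2025:14:03:21 +0000".
by_hour = Counter()
with open("access.log") as f:
    for line in f:
        m = LINE.match(line)
        if not m or not any(b in m.group("ua") for b in BOTS):
            continue
        ts = datetime.strptime(m.group("time").split()[0], "%d/%b/%Y:%H:%M:%S")
        by_hour[ts.hour] += 1

# A crude text histogram of when bots hit the site.
for hour in sorted(by_hour):
    print(f"{hour:02d}:00  {'#' * max(1, by_hour[hour] // 10)}  {by_hour[hour]}")
```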

In short, analysing logs regularly doesn’t just tell you what’s happening, it gives you actions: optimise crawl budget, refine your content strategy, and better align your site for both human and AI interactions.

The role of schema in SEO: what LLMs can’t ignore

As a Technical SEO geek, I’ve probably said the word Schema more times than my clients or colleagues want to hear, and BrightonSEO didn’t change that. There’s always a debate about how LLMs (large language models) interact with structured data.

Technically, LLMs don’t crawl Schema the way Googlebot does. Instead, the orchestrator – the system that feeds the LLM – uses Schema to understand and prioritise content when answering queries. In other words, while the LLM itself doesn’t read your markup, the orchestrator can use it to know what content exists, what’s important, and how it relates to other pages.
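For example, a minimal JSON-LD block (all values here are illustrative) is exactly the kind of machine-readable signal an orchestrator can lean on:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "BrightonSEO: The SEO Manager's Log",
  "author": { "@type": "Person", "name": "Darren" },
  "datePublished": "2025-10-17",
  "about": "Technical SEO takeaways from BrightonSEO October 2025"
}
</script>
```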

One of the biggest takeaways from this discussion also involved the ongoing debate around llms.txt. Some people argue it has no effect, while others are showing early evidence that it can influence how content is picked up for AI responses. The key point: this is new territory.

Even if it doesn’t seem impactful today, llms.txt files and structured data are likely to play a larger role in AI-driven search, as orchestrators increasingly rely on clear, machine-readable signals to determine which content to feed LLMs.
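For reference, the llms.txt proposal is just a markdown file served at the site root (/llms.txt): a title, a short summary, and links to the pages an LLM should read first. A hypothetical example (all names, URLs and descriptions are placeholders):

```
# Example Brand

> A short, plain-language summary of who we are and what the site covers,
> written for machines as much as for people.

## Key pages
- [Services](https://example.com/services): what we offer and for whom
- [Blog](https://example.com/blog): SEO guides and conference write-ups
```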

Tying this back to my earlier highlights, it hammers home a familiar theme: human judgment and technical work matter. Whether it’s log files giving you actionable visibility or Schema feeding orchestrators that power AI answers, the underlying principle is the same: mastering the fundamentals now puts you ahead when the tools and algorithms play catch-up.

Server-Side tracking & sGTM – Better data, better insights

Why would you change to server-side Google Tag Manager?

  • Moving from client-side to server-side tracking could improve page speed by up to 15%, boosting user experience and conversions.
  • Server-side tracking recovers lost data blocked by ad blockers or strict browser settings (especially Safari).
  • Misattributed users: 25–42% of “new” users on Safari may actually be returning visitors.
  • Longer buying cycles: preserves first touchpoints that would otherwise be lost due to cookie expiration, giving accurate attribution.
  • Reduces impact of third-party scripts on page load times by moving them off-page.
  • Setup isn’t cheap – it needs a server and ongoing maintenance – but the improved accuracy, performance, and insight can be worth it.

Day 2 at BrightonSEO continued the AI-heavy talk trend, but my biggest takeaway was all about Server-Side Tracking and sGTM (Server-Side Google Tag Manager). For anyone who hasn’t looked into this yet, it’s one of those changes that can have a surprisingly big impact if implemented well.

 

Faster Page Speed and Conversion Impact

One of the most immediate benefits I learned about is page speed. Moving from client-side tracking to server-side can improve page load times by up to 15%. That’s huge, especially for e-commerce clients, where every second counts. Faster pages don’t just improve user experience; they can directly improve conversions.

 

Recovering Data Lost to Privacy Settings and Ad Blockers

Another critical insight was the impact of ad blockers, browser privacy settings, and particularly Safari’s Intelligent Tracking Prevention. Across modern browsers, we’re often losing 10–20% of actionable data because it simply can’t be captured client-side. On Safari specifically, up to 25–42% of “new” users are actually returning visitors, misattributed due to cookie restrictions and attribution settings. That means you’re not seeing the real picture of your audience or how your marketing channels perform. Server-side tracking helps recover this lost or misattributed data.

One example that stood out is in the case of long buying cycles. If a customer takes weeks or months to convert, the initial touchpoint often disappears because the cookie has expired. In client-side tracking, that conversion is attributed to “Direct,” and the original Google Ad, organic search, or other channel gets no credit. Server-side tracking preserves that first interaction, giving you a much clearer view of channel performance and ROI.

There’s also a performance benefit beyond page speed. With client-side tracking, third-party scripts load in the browser, often slowing things down and leaving you little control over what’s being sent. Moving these scripts server-side gives you that control and ensures tracking doesn’t interfere with site performance.
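As a sketch of what “server-side” means in practice, here’s a GA4 Measurement Protocol event sent from your own server instead of a browser tag. The measurement ID, API secret and client_id below are placeholders; in a real sGTM setup the client_id would typically be read from the user’s first-party _ga cookie:

```python
import requests

MEASUREMENT_ID = "G-XXXXXXX"    # placeholder: your GA4 measurement ID
API_SECRET = "your-api-secret"  # placeholder: created in the GA4 admin UI

payload = {
    "client_id": "123456.7654321",  # placeholder: normally from the _ga cookie
    "events": [{
        "name": "purchase",
        "params": {"currency": "GBP", "value": 49.99, "transaction_id": "T-0001"},
    }],
}

# The hit goes server-to-server, so ad blockers and ITP never see it.
resp = requests.post(
    "https://www.google-analytics.com/mp/collect",
    params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
    json=payload,
    timeout=10,
)
print(resp.status_code)  # a 2xx response means the event was accepted
```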

 

Is Server-side tagging worth it?

Of course, server-side tracking isn’t free – it requires setup, a dedicated server, maintenance, and some investment. However, for the insights, accuracy, and performance gains it offers, it’s often worth the cost. For e-commerce businesses, recovery of lost data and improved page performance alone should justify the investment.

Why AI & LLMs Won’t Replace SEO

  • LLMs excel at reasoning but don’t inherently know everything. They figure things out from external sources.
  • Retrieval and grounding are what make LLMs accurate. If your content isn’t visible to those retrieval layers, it effectively doesn’t exist.
  • Search engines remain the foundation. AI might guide the answers, but without SEO, your content has nowhere to be shown.
  • For brands, the competitive edge comes from controlling the quality and discoverability of the information LLMs access.

A recurring theme across almost every AI talk was the same: LLMs like GPT-5 aren’t built to know everything; they’re built to reason, search, and use tools effectively. OpenAI deliberately designed GPT-5 to prioritise intelligence over encyclopedic knowledge. It’s not stuffed with all the world’s facts; instead, it’s trained to think logically, follow instructions, and work with the information and tools it’s given.

Why does this matter for SEO? Because LLMs need structured, well-referenced information to work effectively. They rely on retrieval layers, web search, plugins, and external sources to get fresh, accurate data. Without this, the models are practically useless for real-world queries. In other words, SEO isn’t just relevant, it’s critical. The better your content is structured to be referenced and discoverable, the more likely it is to be correctly interpreted and surfaced by AI.

Over the week, I saw this principle in action across sessions on log file analysis, crawl optimisation, schema, server-side tracking, and more. Each session reinforced the same point: AI tools can speed things up and uncover insights, but they don’t replace human judgement, strategic thinking, or the technical groundwork that makes content truly discoverable.

GPT-5 and other advanced models change the game, but they don’t make SEO obsolete. If anything, they make it the bridge between AI and the information people actually see. Master the fundamentals, embrace the tools, and keep your focus on strategy – that’s how you stay ahead in an AI search-driven world.

Turn technical SEO into results

From crawl data and server-side tracking to AEO, we help brands translate technical SEO into real-world visibility. Speak to our team today and see how we can make it work for you.
