SEO & GEO Best Practices
Server-Side vs Client-Side Rendering
Server-side rendering (SSR) refers to the practice of generating HTML content on the server. Client-side rendering (CSR) means building DOM elements with JavaScript in the browser, fetching the content asynchronously (often as JSON); single-page apps (SPAs) are a common example.
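For illustration, here is a minimal sketch of the CSR pattern, where the server delivers an empty shell and content appears only after JavaScript runs; the `/api/article.json` endpoint and its response shape are hypothetical:

```ts
// Minimal CSR sketch (browser code): the server ships an empty shell and the
// content only appears after this script runs. The /api/article.json endpoint
// and the response shape are hypothetical.
type Article = { title: string; body: string };

async function renderArticle(): Promise<void> {
  const res = await fetch('/api/article.json');
  const article: Article = await res.json();
  const main = document.querySelector('main');
  if (main) {
    main.innerHTML = `<h1>${article.title}</h1><p>${article.body}</p>`;
  }
}

renderArticle();
```

A crawler that does not execute JavaScript sees only the empty shell, which is precisely what SSR avoids.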
AEM takes a surgical approach: all the canonical content of a page is included in the initial HTML that is served over the wire. Redundant content that doesn't provide a significant signal to a crawler or index is removed by default, optimizing both for the specificity of the canonical content and for experience performance, which has a positive SEO impact.
Header & Footer
Non-canonical content like the header and the footer of a page usually has a very different lifecycle from the pages it appears on. Header and footer are published and fetched separately, then decorated on the client side for performance reasons.
As detailed in our web performance documentation, page loading is separated into three distinct phases: eager, lazy, and delayed. The eager phase should focus on the resources needed to reach the Largest Contentful Paint (LCP). The header and specifically the footer are not in the critical path to the LCP, which is why they are loaded asynchronously. Cumulative Layout Shift (CLS) is not a problem because the space for the header at the top of the page is reserved from the first paint.
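As a rough sketch of this pattern, not the boilerplate's actual implementation (the `/nav.plain.html` and `/footer.plain.html` paths are assumptions):

```ts
// Sketch: decorate header and footer on the client, outside the critical path
// to the LCP. The /nav.plain.html and /footer.plain.html paths are
// illustrative assumptions, not the boilerplate's actual wiring.
async function loadInto(selector: string, path: string): Promise<void> {
  const el = document.querySelector(selector);
  if (!el) return;
  const res = await fetch(path);
  if (res.ok) el.innerHTML = await res.text();
}

// Kicked off after the main content is rendered, so neither request competes
// with LCP-critical resources.
loadInto('header', '/nav.plain.html');
loadInto('footer', '/footer.plain.html');
```

Because the critical CSS reserves the header's space (e.g. a fixed `min-height` on the `header` element) from the first paint, the late-arriving content does not shift the layout.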
Fragments
AEM supports the inclusion of other pages as fragments. It is important to know when to use fragments and when to avoid them. When used correctly, fragments are not canonical content: they are used on a large number of pages and hence do not carry real significance for any particular page.
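Conceptually, a fragment is just another page whose plain HTML rendition is fetched and injected on the client. A simplified sketch (the boilerplate's actual fragment handling also decorates the injected content):

```ts
// Simplified sketch of fragment inclusion: fetch the fragment page's plain
// HTML rendition and inject it where the fragment is referenced.
async function loadFragment(path: string): Promise<HTMLElement | null> {
  const res = await fetch(`${path}.plain.html`);
  if (!res.ok) return null;
  const wrapper = document.createElement('div');
  wrapper.innerHTML = await res.text();
  return wrapper;
}
```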
Sitemap
It is crucial to maintain an up-to-date sitemap for your site. AEM comes with an out-of-the-box sitemap that is updated automatically as you publish content. It helps search engines understand your site's structure and crawl all its relevant pages. You can decide to exclude pages from the sitemap by setting their robots metadata property to noindex.
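In document-based authoring, this is done via the page's metadata, for example with a metadata table along these lines (the exact table layout depends on your authoring setup):

```
| Metadata |         |
| :------- | :------ |
| Robots   | noindex |
```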
Robots.txt
The robots.txt file provides a standardized way of telling crawlers which parts of your site they are free to crawl and which parts are off limits. It also typically contains a link to your sitemap(s). AEM allows you to specify a robots.txt in the site configuration. Note that this file will only be delivered by AEM on your production domain. The aem.page and aem.live URLs of your site are protected from crawlers by a combination of a prohibitive robots.txt and an X-Robots-Tag header to prevent those sites from being indexed.
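A minimal production robots.txt could look like this (the domain and sitemap path are illustrative):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```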
Google (and other Search Engines)
Google's first pass of indexing your page only consumes the initial HTML payload, which is why it is recommended that all of the primary content of your page be part of it. Header, footer, and fragments are typically not considered primary – or canonical – page content.
In a second pass, Google will load your page in a headless browser, allowing it to index the full page content, including secondary content being decorated on the client side. This pass is time-boxed, so it is imperative that your page loads fast and all the relevant secondary content is added in a timely fashion.
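A quick way to see what that first pass consumes is to fetch the raw HTML without executing any JavaScript. A sketch assuming a runtime with top-level await, such as Deno (the URL and marker phrase are illustrative):

```ts
// Fetch the initial HTML payload, roughly what the first indexing pass sees,
// and check that a phrase from the primary content is already present.
const res = await fetch('https://www.example.com/');
const html = await res.text();
console.log('primary content in initial payload:',
  html.includes('a phrase from your primary content'));
```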
Building on the AEM Boilerplate and following the web performance best practices will ensure that your page can be fully indexed by Google and other search engines.
Large Language Models (LLM)
LLM agents and bots (e.g. ChatGPT-User) need access to content as quickly as possible and therefore do not render pages. They need the relevant content to be part of the initial HTML payload when requesting a page. The header and footer of a page are not important to them.
Adobe recommends using LLM Optimizer to ensure your site is represented in the best possible light when users search for relevant keywords in LLMs.
Experiments with server-side header, footer, and fragment inclusion
As explained above, AEM already renders all canonical content that is relevant to the page on the server side and provides an optimal data set for SEO and GEO purposes.
Nevertheless, we frequently find ourselves in conversations involving claims that diverge from our own observations. For instance, using server-side techniques for entire pages – including headers, footers, and fragments – keeps coming back as a recommendation in the SEO domain. We are open to improving the server-side output of AEM through experimentation, provided the outcomes are validated as measurably positive by authoritative tooling.
Interested in experimenting with us?
We are looking for participants who are willing to run server-side rendering experiments on their production environments. If you are interested and agree to the following rules of engagement, we look forward to hearing from you on Slack or Teams:
Rules of engagement
- Agreement on pre-defined goals based on numbers from Google Search Console as the authoritative source
- Experiment runtime should be around 3 months
- Access to Google Search Console must be provided for the duration of the experiment
- During the experiment, the SLO will be reduced from 99.99% to 99.9%
- Revert to boilerplate rendering if pre-defined goals are not met at the end of the experiment
- Anonymized experiment data will be published here on www.aem.live
Experiment #1: www.aem.live includes header and footer on the server side
Goal: Validate the recommendation that the inclusion of headers and footers on the server side improves SEO performance.
Execution: On November 14, 2025, we started including headers and footers on the server side on all of www.aem.live. The following charts show authoritative data from Google Search Console after 60 days of runtime:
Preliminary conclusion: There is no measurable upside on any SEO metric. The data even shows a decline, which could be attributed to a theoretical performance degradation via a slower LCP.