SEO’s Guide to MCP: What SEOs need to know about (and do about) Model Context Protocol for AI Agents

Summary

Your website was built for humans, which makes it less than perfect for AI Agents, and the LLMs behind them, to access. And AI Agents no longer want to access your web offering just to find information; they want to take actions and interact on users’ behalf.

To buy products, fill in forms, book appointments and more on behalf of their humans, AI Agents need a reliable, standardised way to interact with the back end of websites.

MCP (Model Context Protocol) is that standardised solution: a translator between AI Agents and whatever RESTful API, SQL database or other data source or tool you have (or may soon need to have) available.

This guide takes SEOs and site owners through why MCP is something we will be hearing a lot about, and why it is a real opportunity for those who adopt it to gain an advantage.

What Agentic AI means for SEO

Agentic AI is more than just AI Overviews (AIOs) and AI Mode, and it will transform the SEO industry in a way that those add-on AI features in Google Search never could.

If you’re still getting to grips with Agentic Search, you may want to skip straight to fully interactive Agentic AI, of which ‘search’ is only a part at best, and arguably a misnomer for what AI Agents will actually do when finding and processing relevant information.

What we are talking about is a reality where, for some activities they would previously have used a website for, some users will now just use an AI Agent. The AI Agent will replace the searching, the researching and knowledge gathering, and the interaction parts of the user journey.

SEO isn’t dead, because it will take a long time for everyone to move to AI Agents, in the same way that some people were slow to adopt (or still don’t use) e-commerce sites, or still prefer to call a restaurant to book a table.

What is important for us SEOs, though, is that we are best placed to optimise our clients’ sites for AI Agents, at least in the sense that it naturally fits with what we do. In terms of knowledge, however, a lot of SEOs have a long way to go, especially compared to developers and data engineers.

So this guide is going to help you get to grips with what will be an absolutely key part of optimising sites for LLMs (Large Language Models): the accessibility part (‘the new technical SEO’, if you like), and more specifically MCP – Model Context Protocol.

N.B. Why do we say LLMs above and not AI Agents? AI Agents are powered by LLMs, and we will use the terms fairly interchangeably (mainly referring to AI Agents), but in reality it is the LLM behind the AI Agent that does the heavy lifting, using its MCP Client capabilities to communicate with MCP Servers.

If you haven’t heard of MCP before, then I’m proud to be the first to introduce you to something you may hear about and work with so often you end up dreaming about it, especially if you work with e-commerce sites.

Basics of ‘What is MCP’

Model Context Protocol is an open standard that allows AI Agents to interact directly with your site’s tools, data, APIs and more. It is a layer, or translator, between your data or tools and the LLM behind the AI Agent, which allows the LLM to work out how to interact with your data and tools and to get data back in the format it needs.

This could, for example, mean that MCP allows:

  1. An AI Agent to discover that your product database is available for it to access, and
  2. To know how to request data from it and ensure the data comes back in a standard JSON format that the AI Agent can understand (a sketch of such a tool listing follows below).
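To make that concrete, here is a minimal sketch of how an MCP Server might describe such a tool to an AI Agent. The tool name, description and fields are hypothetical; the shape (a name, a human-readable description and a JSON Schema for the inputs) follows how tool listings are described in the MCP documentation, shown here as a TypeScript object.

```typescript
// Hypothetical tool listing an MCP Server could return when an AI Agent asks
// what it can do: a name, a description and a JSON Schema for the inputs.
const getProductDetailsTool = {
  name: "getProductDetails",
  description: "Returns price, availability and key attributes for one product.",
  inputSchema: {
    type: "object",
    properties: {
      sku: { type: "string", description: "The product SKU to look up." },
    },
    required: ["sku"],
  },
};
```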

Even if your database doesn’t follow a popular standard right now, MCP can normally act as a layer that allows access to it without making any changes to the database itself.

Likewise, an LLM can interact with APIs through MCP, meaning that a RESTful API, for example, can have GET and POST requests (and potentially PUT and DELETE) sent to it.

This could allow all sorts of interactions between the AI Agent and the backend of the site, replicating what users could do through the site, such as filling in forms, booking appointments, setting up accounts and even purchasing products (once some outstanding security issues have been addressed).

Alternative Options for Machine Accessibility to Website Data & Actions

MCP is just one way for AI Agents to interact directly with websites, but it seems to be the one preferred by Google (based on comments in the Google I/O 2025 Keynote), and seemingly also by Anthropic (Claude), OpenAI and others.

Nothing else right now is able to act as a universal translator between LLMs and Endpoints like APIs and Databases, which is specifically what MCP is designed to do.

If we think of AI Agents as the next generation of search engines, we may decide to focus on them crawling web pages. They can do that, but it only allows them to see the content of pages, not interact with the site (and that is assuming they can actually see and understand the full context of the content).

Now, though, we have to broaden our thinking from content accessibility to include ‘action’ accessibility as well.

Add Structured Data: Putting your content into structured data in the head section of your source code is still a good idea for content accessibility; alternatively, you can offer a JSON-LD structured data version of your page on a separate URL.

Using structured data can make content, product details etc. quicker and easier to digest and give more context, but it is still not interactive. Even if you add schemas for available actions, it just means AI Agents can tell a user what they can do on a page and direct them there (see the example below).
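As a quick illustration, a product page might carry a JSON-LD block along these lines. The product values are invented and the exact properties depend on your content; it is shown here as a TypeScript object that could be serialised into a script tag.

```typescript
// Hypothetical Schema.org Product markup, built as an object and then
// serialised into a JSON-LD script tag in the page's head.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Trail Running Shoe",
  sku: "TRS-2041",
  offers: {
    "@type": "Offer",
    price: "89.99",
    priceCurrency: "EUR",
    availability: "https://schema.org/InStock",
  },
};

const jsonLdScriptTag =
  `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`;
```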

Accessible Forms: Some AI Agents and browser add-ons allow AI Agents to interact with a site through a user’s browser, for example filling in forms. You can therefore try to make your forms and checkouts easier to interact with, e.g. with structured data and by avoiding SPAs. This can be a lot of work, though, and the user still has to sit and watch the Agent work through a form or checkout for them.

APIs & Databases without MCP: You can of course set up an API or a database without bothering with MCP and hope that LLMs find it and work out how to interact with it. But since there is an open standard, MCP, that makes things easier for LLMs, sites that support it are going to have a major advantage.

A Step-By-Step overview of MCP in action

  1. AI Agents / LLMs that already support Model Context Protocol (MCP) have MCP Client functionality built in.
  2. LLMs can then automatically discover and connect to MCP Servers.
  3. MCP Servers are set up and hosted by site owners, usually one per site. The MCP Server defines the available tool endpoints (e.g. getProductDetails, createAppointment) using JSON Schema; in other words, what an LLM can do via the Server.
  4. The LLM’s MCP Client sends requests to the MCP Server using the standardised MCP format (illustrated in the sketch after this list).
  5. The MCP Server acts as a translator or bridge: it converts the requests based on the setup of the Endpoint (database, API or other tool) and passes them along.
  6. The Endpoint processes the request and sends the relevant information (such as product data, an article or a booking confirmation) back to the MCP Server.
  7. The MCP Server formats the response into a structured JSON response (again using the standardised MCP format) and sends it back to the LLM’s MCP Client.
  8. The LLM interprets the data and may send another request. There may be several steps back and forth between the Client and Endpoints, via the MCP Server, such as when filling out forms.
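As a rough illustration of steps 4 and 7, the messages between MCP Client and MCP Server are JSON (JSON-RPC under the hood). A hypothetical tool call and its result might look something like this; the tool name and data are invented, and the exact fields should be checked against the current MCP specification.

```typescript
// What the LLM's MCP Client might send (step 4): a call to one of the
// tools the MCP Server has advertised, with arguments matching its schema.
const toolCallRequest = {
  method: "tools/call",
  params: {
    name: "getProductDetails",
    arguments: { sku: "TRS-2041" },
  },
};

// What the MCP Server might send back (step 7): a structured result the
// LLM can read directly, regardless of what format the Endpoint produced.
const toolCallResult = {
  content: [
    {
      type: "text",
      text: JSON.stringify({ sku: "TRS-2041", price: 89.99, inStock: true }),
    },
  ],
};
```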

Basic Comparison of MCP vs APIs

MCP is not an alternative to or a replacement for APIs; it is an extra layer that works with a site’s API. So if you work on a site that already has an API set up, you are more than halfway to making data and actions available to AI Agents.

APIs already exist on many websites and are central to making CMSs, checkouts and more work. Right now, though, your API is probably not generally or publicly available online.

How APIs are set up varies hugely. Even among RESTful APIs, the exact way they work and the actions and data that are available depend on how they have been built or which platform is being used. That is necessary when building a website, but it means they aren’t immediately useful to anyone without the documentation.

MCP could be described as a translator, using that documentation to facilitate a conversation between the API and the AI Agent, which would otherwise be unable to understand each other.
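Here is a sketch of that translation, assuming a site with an existing RESTful orders API; the endpoint URL, field names and tool are hypothetical. The MCP Server’s handler takes the arguments the AI Agent supplied, makes the normal API call the site already supports, and reshapes the answer into the standard MCP response.

```typescript
// Hypothetical handler inside an MCP Server: it translates an AI Agent's
// "getOrderStatus" tool call into the site's existing REST API call.
async function handleGetOrderStatus(args: { orderId: string }) {
  // The same GET request the site's own front end would make today.
  const response = await fetch(`https://shop.example/api/v1/orders/${args.orderId}`);
  if (!response.ok) {
    return {
      content: [{ type: "text", text: `Order ${args.orderId} not found.` }],
      isError: true,
    };
  }
  const order = await response.json();

  // Reshape the API's answer into the standard MCP result the LLM expects.
  return {
    content: [
      { type: "text", text: JSON.stringify({ orderId: args.orderId, status: order.status }) },
    ],
  };
}
```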

MCP doesn’t replace…

Model Context Protocol isn’t a database or an API, and it doesn’t store data. Don’t let the term ‘MCP Server’ make you think of a vast server holding your data; the MCP Server may just be a simple Node.js app, or be integrated into your CMS or API.

Of course, the Endpoint itself needs to be suitable for what you want the AI Agent to be able to do (e.g. create an account, book an appointment), and its data needs to be accurate, before MCP is set up; this is your first priority.

One risk of every data source online suddenly setting up MCP access is that a lot of data becomes available that AI Agents may use to make decisions or find insights, even where that data is seriously flawed.

Having MCP set up also doesn’t mean you don’t need to optimise your website’s front end to be as machine accessible as possible. As mentioned earlier, this includes having structured data in place for all of your content, products and actions. It also means avoiding overly complex web apps and not blocking parts of your site, such as account sign-up pages, to crawlers and bots.

MCP has been built, for now, to be pull-based from the AI Agent’s side. When AI Agents evolve to be ‘always on’, they will need to be able to receive push-based notifications (e.g. ‘this order has shipped’), so they will either need to work directly with Webhooks, or MCP will need to be expanded to handle Webhooks and other push notifications.

Why AI Tools prefer MCP

It may seem that basic RESTful APIs or SQL databases would be easy enough for AI Agents to work out and access. There are several challenges, though, and we want to make it as easy as possible for AI Agents to access our data and services.

AI Tools prefer MCP because:

  1. Without MCP, AI Agents and the LLMs behind them would still need to know that Endpoints are there, whereas MCP Servers can be auto-discovered.
  2. Whatever format the Endpoint outputs (e.g. XML, HTML, JSON), with MCP the LLM always receives JSON back.
  3. Because everything in MCP is done in JSON with a standard protocol, the LLM already knows the protocol and is told by the MCP Server how each tool can be used and what to use it for.
  4. This means LLMs know a tool’s purpose straight away, and the order in which tools need to be used, e.g. searchProducts, then addToCart, then checkout.
  5. It also means knowing what inputs are required and in what format; for example, that an order number can be alphanumeric or only numeric, or that all timestamps are in UTC+0 (see the sketch after this list).
  6. The setup of the MCP Server also limits what the LLM can do, so for example you may well want to leave out any PUT or DELETE endpoints to avoid losing data.
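As an example of points 5 and 6, the JSON Schema attached to a tool can spell out exactly what the inputs should look like, and you simply don’t register tools you don’t want the AI Agent to have. The tool, field names and rules below are hypothetical.

```typescript
// Hypothetical read-only tool definition: the schema tells the LLM that the
// order number is strictly numeric and that timestamps must be ISO 8601 (UTC).
// Destructive operations (PUT/DELETE-style tools) are simply never registered.
const getOrderHistoryTool = {
  name: "getOrderHistory",
  description: "Read-only lookup of past orders for a given order number.",
  inputSchema: {
    type: "object",
    properties: {
      orderNumber: {
        type: "string",
        pattern: "^[0-9]{6,10}$", // numeric only, 6-10 digits
        description: "Numeric order reference.",
      },
      since: {
        type: "string",
        format: "date-time", // ISO 8601 timestamp, expected in UTC+0
        description: "Only return orders placed after this UTC timestamp.",
      },
    },
    required: ["orderNumber"],
  },
};
```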

Made for LLMs / AI Agents: At the end of the day, MCP was made to work with LLMs; APIs and other Endpoints never were.

Why MCPs are good for Users

MCPs have the potential to make AI Agents much more useful.

  1. It will be quicker and less resource intensive for LLMs to get information, as they don’t need to crawl a site to find the right pages and then potentially load and render a whole page just to see everything. This is also better for the planet, as far less data needs to be transmitted (though running LLMs is still highly energy intensive).
  2. LLMs being able to access more options for products and services means a better chance of finding just the right option at the best price.
  3. More than this, though, MCP has the potential to allow AI Agents to carry out far more actions and interactions, which could be the true AI revolution.

Why MCPs are good for SEOs & Site Owners

Until MCP was introduced, it was hard for those running websites, or their SEOs, to know how to start optimising for the future of Agentic AI.

  1. Now we have an accepted standard, backed by Google, Anthropic (Claude), OpenAI and others.
  2. On the face of it, giving AI Agents access to our data, and to tools they can use to interact with our sites, just cuts out our websites and Google Search. But that is the way things are going, and if we don’t make it as easy as possible (with MCP) for AI Agents to interact with our web offering, we hand the advantage to our competitors.
    That doesn’t mean, though, that if everyone sets up MCP and allows full machine accessibility, the status quo is maintained.
  3. Agentic AI levels the playing field. Sites like Amazon are starting points for shoppers now because they have established a number of advantages for users (see below), but many of those are, or will be, nullified by Agentic AI taking things further: quickly comparing products, reviews, prices and more from every site, not just Amazon resellers.

So this helps if you are doing SEO or running a site for a less established retailer, but is probably a concern if you work with the Amazons, Walmarts (NAMER), Bols (NL) and JD.coms (CN) of this world.

Use cases for MCP Now:

  • Providing product data, with Schema.org structured data
  • Providing access to articles and content in a structured format
  • Pulling data from analytics and other packages
  • Completing analysis based on access to a database
  • Providing access to official repositories such as legal cases, laws and statutes

Use cases for MCP in Near Future (or in Beta):

  • AI Agents can fill out Forms
  • AI Agents can make Appointments, e.g. Sales Calls, Medical Appointments
  • AI Agents can make Applications such as Job applications, Visa Applications
  • Access to SaaS tools including CRM, ERP, Project Management
  • Book a hotel or other service (with payment on arrival)
  • AI Agents can check the status of orders, deliveries and other services

Use cases for MCP in 1-3 Years:

  • AI Agents can setup Accounts
  • AI Agents can make Purchases of products and services including payments
  • AI Agents can make re-purchases
  • AI Agents can access and check individuals’ protected information such as Medical Records
  • AI Agents can update users based on receiving notifications

Payments and push notifications may be handled through another service, with part of the process going through MCP (TBC).

Preparing for MCP

Platforms

Some platforms are, as of this moment (12th June 2025), easier to set up MCP Servers for than others. Some have invested in adding MCP options directly in the CMS. Wix and Shopify (not everyone’s first choice for SEO-friendliness) are in fact among the most MCP-ready platforms right now.

If you are currently thinking of moving to Magento, you may have to do a chunk of extra work to get things running, or you may need to use a third-party add-on. Amazon, of course, have already set up their own MCP server.

| Platform / System | MCP Support Status | Current Use Cases & Notes |
| --- | --- | --- |
| Shopify | ✅ Full support | Official Storefront MCP server for product/cart/checkout features (shopify.dev) |
| Magento / Adobe Commerce | ✅ Community + Enterprise | MCP via community SDKs; BoldCommerce and open-source GitHub servers enable product and order tools |
| WordPress | ✅ Mature ecosystem | Official Automattic plugin + third-party MCP support via AI-Engine, Zapier, CData |
| Drupal | ✅ Alpha to stable | Drupal.org MCP module + companion servers turning content endpoints into MCP tools |
| Wix | ✅ First-party support | Comprehensive MCP server for e-commerce, CMS, bookings, payments |
| Squarespace | 🔶 Emerging via API | No official MCP; custom integration possible via Zapier/Pipedream |
| Joomla | 🔶 Community-driven | No core MCP yet; could be added via extension frameworks |
| Salesforce | 🔶 Enterprise pilots | Likely via MCP server frameworks (e.g. SDK) but currently limited |
| BigCommerce | 🔶 Marketplace support | MCP emerging via app ecosystem; no official server yet |
| Headless CMS (Contentful, Sanity, Strapi) | ✅ Fully compatible | Can easily wrap existing APIs with MCP servers |
| Enterprise Backend (Postgres, Salesforce, GitHub, Slack) | ✅ Reference examples | Anthropic repo offers sample MCP servers |
| Cloud Platforms (AWS / Azure / GCP) | ✅ Supported | Integrations via Copilot Studio (Azure) and serverless frameworks |

Setting up an MCP Server

Depending on the platform you are working with, it could be as little as one day’s work for an experienced developer (two days on average), but in some cases it could be a lot more: a week or longer.
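To give a rough idea of the shape of the work, here is a minimal sketch of an MCP Server in Node.js. It assumes the official TypeScript SDK (@modelcontextprotocol/sdk) and the zod library, and a hypothetical product API at example.com; treat the exact class and method names as something for your developer to verify against the current SDK documentation.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// One MCP Server for the site, exposing a single read-only tool.
const server = new McpServer({ name: "example-shop", version: "1.0.0" });

server.tool(
  "getProductDetails",
  "Fetch price and availability for a product by SKU.",
  { sku: z.string() },
  async ({ sku }) => {
    // Hypothetical existing REST endpoint; the MCP Server just wraps it.
    const res = await fetch(`https://www.example.com/api/products/${sku}`);
    const product = await res.json();
    return { content: [{ type: "text", text: JSON.stringify(product) }] };
  }
);

// Connect over stdio here; a remote setup would use an HTTP-based transport instead.
const transport = new StdioServerTransport();
await server.connect(transport);
```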

Which MCP developments should you keep an eye on?

Right now there are security issues to overcome to ensure that AI Agents only request and handle information they are authorised to, and do so in a secure way. So there are currently limitations on processing individuals’ information. Then there is the matter of accepting payments for purchases initiated by an AI Agent.

These issues aren’t insurmountable though, and Visa and Mastercard are already working on solutions to accept payments via AI Agents, most likely through a wallet-style system. The role of MCP here may be to process the purchase up to payment and then hand the process over to another solution via a link or key.

Lastly, it is worth mentioning that, while less relevant to what we do as SEOs in making websites accessible to machines, MCP can also be used to connect AI Agents to data sources in closed or local environments. That could mean, for example, connecting an AI Agent a company uses for data analysis to a local source of company data via a locally hosted MCP Server.

Glossary

API (Application programming interface)

An interface between different software systems allowing them to share data and actions.

RESTful API

A common type of API which uses web URLs and simple actions (GET, POST, PUT, DELETE) to allow interactions and data sharing over an internet connection.

Client (MCP Client)

A client is any device or app that requests data or services. In the case of an MCP Client, it is the LLM sending requests to the MCP Server.

Server (MCP Server)

A server is any system that stores and shares data or services when requested by a client. An MCP server is the layer that has the specific function of providing a list of tools and services available and then processing requests and data going back and forth between the Client and Endpoint.

Endpoint

An Endpoint is a specific URL or location in a system where data can be accessed or a specific operation can be performed. In MCP, an MCP Server makes Endpoints available to AI Agents: it defines what the Agent can ask for (e.g. getProductDetails) and how.

Schema (JSON Schema / Schema.org)

A schema is a structured format that defines how data and inputs/outputs are structured. In MCP, tools are described with JSON Schema so the AI Agent can interact correctly. Schema.org is a related but separate concept: the shared vocabulary commonly used for structured data on web pages.

JSON (JavaScript Object Notation)

A notation that formats data in a way that both humans and machines can easily understand, using key-value pairs of a definition and a value, such as { "price": 19.99 }.

JSON-LD

A format for expressing ‘linked data’ in JSON, commonly used for structured data on web pages. Linked Data is where the definition is obtained from a relationship to a global standard, such as Schema.org and other ‘vocabularies’.

Node.js

A tool that allows JavaScript to run on a computer or server rather than only in a web browser.

Webhooks

A way for one system to instantly notify another when something happens — pushing information when relevant so that a system doesn’t have to keep requesting data to check a status (for example a delivery status).

Sources


My main source is the official MCP website and documentation, especially their introduction: https://modelcontextprotocol.io/introduction

There is an MCP Inspector tool you can use: https://modelcontextprotocol.io/docs/tools/inspector

MCP was created and introduced in November 2024 by Anthropic https://www.anthropic.com/news/model-context-protocol

There are also great resources available on YouTube including from IBM https://www.youtube.com/watch?v=7j1t3UZA1TY
