Structured Data is a way of marking up information on a web page so that robots can understand exactly what it means. It is written in accordance with fixed conventions set out at schema.org, so there is absolute clarity for robots as to what information is being shared. For this reason, Google and LLM search tools – such as ChatGPT – use it as a reliable source of truth for their responses.
If you’re in my circle at all on socials, you’ll have heard me harping on about a tool we’ve built to help websites get Structured Data, called AIProfiles.co.uk.
But it’s not a glamorous subject, and it kinda requires a little background to even know what Structured Data is and that you need it.
And that’s why I received this email from a contact last night:
Hi Lisa
Just seen your post. I’m intrigued. But also wonder how https://aiprofiles.co.uk/ is more than just another listings site? If AIO like SEO is all about EEAT then what difference – with respect – will have another listing offer?
So I thought I’d share my response, as it hopefully – when you see the code that actually constitutes Structured Data – explains what it is.
Hi,
Thanks for your email – it’s prompted me to think about answering this in another post / blog post!
Firstly, AIProfiles isn’t a directory.
With that said, let’s forget about AIProfiles for a minute and understand Structured Data…
Structured Data is code, often in a format such as JSON, which follows the “categories”, if you like, of structure defined on schema.org.
It’s a way of letting robots understand what something is on your website.
For example, if you had an event on your website, you might have some text that said:
“Next Monday, we’re holding our annual Christmas Party. It’s at 7 o’clock, at the Avon Gorge Hotel, and tickets are £5.”
A robot – such as Googlebot, or a bot from OpenAI’s ChatGPT – might find that, read it, and just about make sense of it. But Google wouldn’t (or would be very unlikely to) display it as an event in search results, or bring it back for a search of “Christmas parties next week”, because it wouldn’t be 100% sure it had interpreted the date or time correctly.
So instead, adding structured data for events would mean that even though people might see that sentence above on the web page, behind the scenes, robots would see:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Christmas Party",
  "startDate": "2025-12-01T19:00",
  "endDate": "2025-12-01T23:00",
  "location": {
    "@type": "Place",
    "name": "Avon Gorge Hotel"
  },
  "offers": {
    "@type": "Offer",
    "name": "Ticket",
    "price": "5",
    "priceCurrency": "GBP",
    "validFrom": "",
    "url": "",
    "availability": "https://schema.org/InStock"
  }
}
</script>
Everything has its place, and is laid out according to a strict convention – the start date for an event, for example, is always called “startDate” – not sometimes “Start” and sometimes “Starts on:”.
Whilst Google has always liked structured data, it’s been kind of a “nice to have” in SEO; we’ve only really done it for our biggest clients, who get millions of visitors a day and invest really heavily in SEO.
However, research is showing that ChatGPT and other similar LLMs (large language models) are leaning really heavily on structured data. And that’s because they want to give the best answer they can. And by its nature, structured data spells out to them what stuff is, so they can trust they’ve gotten it right.
The case study I shared on LinkedIn yesterday also highlighted how well-known brands just aren’t surfacing in AI searches if they don’t have structured data on their websites. Even though they’re household names – so yes, they’ll have strong EEAT signals and lots of citations across the web (although not necessarily in the right places) – they don’t necessarily spoon-feed robots their data in a way the robots can be certain they’re interpreting correctly.
So – ideally, every website should have all of its important information also presented as structured data.
However, this isn’t feasible for everyone, because it involves a developer or plugins, depending on your platform. You can find tools online to generate the data yourself, but then you still need to get it into the right part (behind the scenes) of your website, or uploaded to your server.
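(If you’re wondering what “the right part” actually means: the JSON-LD typically sits in a script block in the page’s <head> – Google also accepts it in the <body> – so visitors never see it. A rough sketch, with made-up page content:)
<html>
  <head>
    <title>Christmas Party at the Avon Gorge Hotel</title>
    <!-- Sketch only: the structured data lives here, invisible to visitors -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Event",
      "name": "Christmas Party",
      "startDate": "2025-12-01T19:00"
    }
    </script>
  </head>
  <body>
    <!-- This is the bit people actually see on the page -->
    <p>Next Monday, we're holding our annual Christmas Party.</p>
  </body>
</html>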
So that’s why we built AIProfiles. You fill in your company data, and it produces the structured data (like the snippet shown above) for you. And it focuses on the bits of structured data we know are important for AI – business info, services, people, awards, testimonials etc.
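(To give you a flavour, here’s a trimmed-down sketch of the sort of Organization markup this kind of profile might include – every name, address, award and review below is made up purely for illustration:)
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Web Agency Ltd",
  "url": "https://www.example.co.uk",
  "description": "A fictional agency used here purely as an example.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Bristol",
    "addressCountry": "GB"
  },
  "founder": {
    "@type": "Person",
    "name": "Jane Example",
    "jobTitle": "Founder"
  },
  "award": "Fictional Agency of the Year 2025",
  "makesOffer": {
    "@type": "Offer",
    "itemOffered": {
      "@type": "Service",
      "name": "SEO consultancy"
    }
  },
  "review": {
    "@type": "Review",
    "reviewBody": "A made-up testimonial, purely for illustration.",
    "author": {
      "@type": "Person",
      "name": "Sam Example"
    }
  }
}
</script>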
You build a profile, you pay £24pa (if you’ve got a promo code – “2026” will do it – otherwise it’s £60pa) and you link to it from your website footer. So your website is then telling the robots “here’s the structured data version of our company info”.
If you don’t link to it, that data is still accessible to robots, but linking to it from your website reinforces that it’s definitely your data.
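(The link itself is nothing fancy – something like the snippet below in your footer, where the profile URL is just a placeholder, not a real address:)
<footer>
  <!-- Illustrative only: the actual URL is whatever your own profile address is -->
  <a href="https://aiprofiles.co.uk/your-profile">View our AI Profile</a>
</footer>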
I hope that explains it better and helps it make sense? Please do let me know either way, as I need to make sure I’m explaining it clearly!
Thank you!
But then I remembered LLMS.txt files – so I sent a follow up email:
Ooh! And I forgot to add that AIProfiles also gives you an llms.txt file, which is a new file type that’s been proposed (not by us) as a suggested standard for the best way to convey your key information to AI bots.
As we can’t put this on your server, and we can’t inject JSON code into your website (or we really shouldn’t be able to, anyway!), we create you a standalone profile and recommend you link to it. It’s designed to be as easy as possible for non-technical people to use, whilst remaining as cost-effective as possible too.
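For anyone wondering what an llms.txt file actually looks like: under the proposed standard it’s just a plain-text (Markdown) file, usually served at /llms.txt, with a title, a short summary and links to your key pages. Here’s a rough, entirely made-up sketch:
# Example Web Agency Ltd
> A fictional agency, used here purely to illustrate the llms.txt format.
## Key pages
- [Services](https://www.example.co.uk/services): What we do and who we do it for
- [About](https://www.example.co.uk/about): Our team, awards and history
- [Contact](https://www.example.co.uk/contact): How to get in touch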