Google rolls out core updates fairly regularly, and they’ve developed a bit of a reputation for either making people mildly anxious or completely ruining someone’s Tuesday. The March 2026 update was one of the bigger ones in recent memory, and it’s worth taking notice of. Not in a “the sky is falling” way, but in a “here are the things that are genuinely worth your attention” way.
The headline, if you want one: Google is getting better at telling the difference between content that was created because it’s genuinely useful, and content that exists purely to rank. And it is much less tolerant of the latter than it used to be.
Here’s what changed, and what I’d suggest doing about it.
It’s not anti-AI. It’s anti-drivel.
I want to get this out of the way first, because there’s a lot of talk out there suggesting this update punished AI-generated content. That’s not quite right.
What it punished was generic, low-effort, repetitive content. Content that adds nothing new. Content that reads like it was produced by asking an AI to “write a blog post about X” and hitting publish without so much as a read-through. There is a lot of that out there, and Google has become much better at spotting it.
If you’re using AI tools to help you write – and most of us are, in some form – the question isn’t whether AI was involved. It’s whether the finished article is actually good. Does it say something original? Does it reflect real experience? Does it give the reader something they couldn’t get from the first three results they already looked at? If yes, you’re probably fine. If no, this update may well have something to say about it.
Original content and genuine insight went up. Generic stuff went down.
Pages that added something – original research, a real case study, a take that wasn’t already everywhere – saw rankings improve. Pages that were essentially rephrasing what’s already out there did not.
The practical takeaway here is fairly simple but not always easy: what do you know, from your own experience, that your audience can’t easily find somewhere else? That’s the content worth writing. Not just “here are five tips about X” where all five tips are things you copied from somewhere else and restated. But: “here’s what we actually saw happen when we did X with a client, and here’s what surprised us.”
I appreciate this post does kinda fall into the “here are 5 tips about X” category… but I also want to explain it in an easy-to-understand way for the people on my newsletter!
Google got much tougher on “parasite SEO”
There’s a practice in SEO where someone gets their content published on a high-authority website – a news site, a large directory, a well-known brand – in order to borrow that site’s credibility and rank for things they’d never rank for on their own domain.
The March update went after this hard. If a big domain is hosting thin, low-quality third-party content that only exists to manipulate rankings, Google is now much better at discounting it. Which is broadly good news for everyone who’s been doing things properly, and less good news for anyone who’s been paying for guest posts on websites they’d never actually read.
E-E-A-T got even more important – and structured data is how you prove it
E-E-A-T – Experience, Expertise, Authoritativeness, and Trustworthiness – has been Google’s framework for evaluating content quality for a few years now, but this update put verifiable expertise particularly front and centre.
The word “verifiable” is doing a lot of work in that sentence. It’s not enough to be an expert – Google needs to be able to confirm that you are. And the way you help it do that is through structured data on your website.
Specifically, the markup that tells Google (and AI tools) who wrote something, what they know about, and where else on the internet they exist: sameAs links to your LinkedIn profile, your Wikipedia page if you have one (living the SEO dream if you do!), or any published work – anything that corroborates your identity and expertise beyond just your word for it; knowsAbout markup that explicitly lists your areas of specialism; and author and publisher schema that creates a clear, machine-readable picture of the person behind the content.
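If you’ve never seen this markup, it usually lives as a schema.org JSON-LD block in the page’s head. Here’s a minimal sketch of what it can look like – every name and URL below is a placeholder, not a recommendation:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What the March core update changed",
  "author": {
    "@type": "Person",
    "name": "Jane Example",
    "url": "https://example.com/about",
    "sameAs": [
      "https://www.linkedin.com/in/jane-example",
      "https://en.wikipedia.org/wiki/Jane_Example"
    ],
    "knowsAbout": ["SEO", "Core Web Vitals", "structured data"]
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Ltd",
    "url": "https://example.com"
  }
}
```

The sameAs array is doing the “verifiable” part of the work – it points Google at independent places that corroborate the author is who the page says they are.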
This is actually an area where AIProfiles comes in handy – it’s designed specifically to help you build a verified, structured profile that both AI and search engines can read and trust. If E-E-A-T is the test, structured authorship data is part of the answer.
Core Web Vitals thresholds got tighter
Page speed and user experience signals have been part of Google’s ranking considerations for a while, but the March update tightened the expectations around Core Web Vitals – the specific metrics Google uses to measure how a page feels to use.
Things like how quickly the main content loads (Largest Contentful Paint, or LCP), how responsive the page is to interaction (Interaction to Next Paint, INP), and how much the layout shifts around as it loads (Cumulative Layout Shift, CLS).
The thresholds that were previously “good enough” aren’t necessarily good enough anymore. If your site was sitting in the “needs improvement” band rather than passing comfortably, this is worth addressing.
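For reference, Google’s long-published “good” bands are LCP under 2.5 seconds, INP under 200 milliseconds, and CLS under 0.1 – and the point above is that sitting just inside those bands is no longer a comfortable place to be. Here’s a tiny sketch (my own helper, not any official Google tool) showing how a reading maps onto the published bands:

```python
# Google's published Core Web Vitals bands. Anything between the
# "good" and "poor" boundaries falls into "needs improvement".
GOOD = {"LCP": 2500, "INP": 200, "CLS": 0.1}   # LCP/INP in ms, CLS unitless
POOR = {"LCP": 4000, "INP": 500, "CLS": 0.25}

def rate(metric: str, value: float) -> str:
    """Classify a Core Web Vitals reading as good / needs improvement / poor."""
    if value <= GOOD[metric]:
        return "good"
    if value <= POOR[metric]:
        return "needs improvement"
    return "poor"

print(rate("LCP", 1800))   # good
print(rate("CLS", 0.18))   # needs improvement
print(rate("INP", 650))    # poor
```

The bands themselves are per-metric, which is why a site can pass LCP comfortably and still fail overall on a single sluggish interaction.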
The most useful first step is knowing where you actually stand. SiteVitals has a free speed report that checks your Core Web Vitals and other important website metrics – just put your URL in. That gives you a clear starting point.
If you want to go further, SiteVitals’ performance monitoring keeps an eye on these metrics over time and alerts you if something degrades. Page speed problems have a habit of creeping in slowly – a new plugin here, a larger image there – and you don’t always notice until you check. Having something that flags it for you means you can catch the drift before Google does and holds it against you.
Google got stricter about matching content to intent
This one’s subtle but has a really practical impact. Google is better than ever at understanding not just what someone is searching for, but what they’re actually trying to do. And it’s penalising pages that match the keywords but miss the point.
The classic example: someone searches for a comparison between two products. Google knows they want a quick, scannable comparison table. If your page on that topic is a 2,000-word essay with no table and lots of brand-preference language, you’re answering the wrong question – even if your keywords are perfect.
The fix is to think about what the person searching your target phrase is actually trying to accomplish, and make sure your page gives them that, in the format they’d expect. Sometimes that’s a detailed article. Sometimes it’s a table. Sometimes it’s a tool. Match the format to the need.
Reviews had a shake-up too – which matters especially for local SEO
Local SEO also shifted in this update, with Google rewarding Business Profiles that have genuine, high-quality, regularly refreshed content and authentic photos.
And the timing isn’t coincidental – Google also updated its review guidelines around the same period, with some meaningful changes to what you can and can’t do when asking for reviews. I’ve written about that in detail in a separate post here – the short version is: no cherry-picking, no incentives, no telling people what keywords to use, and no review kiosks. Just ask consistently, make it easy, and keep it honest.
Fresh reviews, consistent activity on your Business Profile, and original local content all got rewarded. If your Google Business Profile is something you set up once and haven’t really touched since, it’s worth giving it some attention.
One interesting shift for specific sectors
Worth flagging if you’re in health or anything adjacent to government/official information: the update saw gains for government domains and clinical, research-driven sources on fact-heavy queries. Broad consumer health sites that were covering everything for everyone without particular depth or credentials took a hit.
This is consistent with everything else in the update – specificity, credibility, and genuine expertise won. Breadth without depth didn’t.
To sum up…
Looking at all of this together, the pattern is pretty clear. Google isn’t trying to create a more complicated algorithm. It’s trying to build a better version of the same question a sensible person would ask: is this actually good? Is the person who wrote it credible? Does the page do what it promises? Does the site work properly?
Most of what the March update rewarded are things that have always been worth doing – original thinking, verified expertise, fast pages, honest reviews, content that actually matches what people are looking for. The update just raised the bar on what “good enough” means.
If your website is built on solid foundations and your content reflects real knowledge and experience, you’re probably in reasonable shape. If it’s been a while since you checked your page speed, or you’ve been publishing content mostly to hit a frequency target rather than because you had something genuinely useful to say, this is a decent prompt to revisit both.