I’ve been loath to write on the topic of Search Engine Optimization (or “SEO”) since opening my consultancy and starting this blog. In fact, if you flipped through the pages describing Kairos Media’s services, you’d find that they barely mention SEO at all, save for a passing reference when talking about website planning and architecture.
There are plenty of reasons that I should write more about the subject, to be sure, and yet I’ve stubbornly refused out of what I can only describe as a sense of professional dignity.
Let me try to explain by way of analogy. Say that you’re a tradesperson – specifically, a plumber. Imagine that you’re at a party, meeting someone new, when they ask, “so what do you do for a living?”
“I’m a plumber,” you reply. “Hmm,” they nod, pausing for a moment before adding (in all sincerity and earnestness, mind), “…y’ever do any work with sinks?”
Now, maybe you’d answer by telling them that yes, as a plumber, you actually work on sinks quite often. Maybe you’d have to stifle a laugh before you do. Maybe you’d ask them to repeat the question, just to be sure you heard them correctly the first time.
What you probably wouldn’t do – because you’re a nice person – is stand there, mouth agape, boggling at the utter inanity of that question. I mean, it would be rude of you, but you could boggle. You’d be wholly justified in boggling.
Hearing a stranger ask that question at a party would be one thing. But now, I want you to imagine that you run your own business – a plumbing service – and that roughly one out of every three phone calls you receive is someone asking whether or not you’ve ever worked with sinks before.
Imagine that you’ve lost business in the past, because a client tells you they need “a plumber with sink experience”. Meanwhile, just up the street, your competitor has been making a killing ever since he changed the name of his business from “Jimmy’s Plumbing” to “Jimmy’s Sinks & More”.
Basically, that’s the relationship I have with “SEO” as a digital marketer. I know that it’s one of the most common areas of online marketing that local businesses seek help with, and I’ve handled plenty of SEO projects over the years. On a technical level, I’ve SE-Optimized my own website about as much as it can be without…well, without being especially popular or well-read (and without having bought/earned any decent backlinks as of yet).
Sure, yes, I work with sinks. There are a lot of sinks out there, and they all need plumbing. But I don’t want to be known as just a “sink guy”. I’m a plumber, dammit!
(As a sidenote, this is probably the same logic which leads me to refer to my online advertising services as “AdOps”, even though virtually no one in Winnipeg – not my clientele, not other marketing agencies, nobody – seems to be familiar with the term. Yet still I insist on calling them AdOps services, because… well, that’s what they’re called.)
All of that to say, consider this post my way of slapping a decal on the side of my cargo van, one that reads “YES, WE DO SINKS!!!”.
Inspired by the Internet’s recent collective obsession with a morbidly obese cat named Cinder-Block, and the recent viral success of local artist Alex Plante (plus a line in the Google Doc where I keep all my blog-post ideas, where I’d written “CHONKY CATS & SKINNY SITES”), this post tackles one of the most impactful (and frequently neglected) aspects of search engine optimization – web performance.
I’ll start by discussing why performance matters to your SEO, talk about some of the main factors which impact website performance, and close with some actionable steps you can take that could have an outsized impact on your website’s SEO.
Off we go!
Introductory SEO for Rumsfelds
Let’s begin – as we so often do – with some big-picture, broad-strokes stuff by asking: what is the ultimate purpose of search engine optimization? Well, consider the unique interests of each of the parties involved:
- Search engine users want to find useful responses to their search queries.
- Search engines want to present those users with the best possible answers via their search engine results pages (or SERPs) – that’s how they attract and maintain their user bases.
- Website owners want their own pages to show up more prominently in those SERPs, so that more search engine users will be exposed to (and hopefully land on) their websites.
So in a nutshell, SEO aims to improve the ranking (or “position”) of a website’s various pages in organic search engine results whenever users make search queries relevant to those pages’ intended audience.
Eagle-eyed readers may have noticed that I included a pretty huge “wiggle word” in the preceding bullet points – that word being “best”. Out of the literal trillions of unique web pages crawled and indexed by the search engines, how do they determine which of these to return as the “best” results for any given search query?
That is the question around which the entire, global, multi-billion-dollar SEO industry revolves. And the most honest answer that I (or anybody else in this line of work) can offer you is this:
I’m not sure.
To be clear, we know that search engines rely on a number of different “ranking factors” in deciding which pages appear on the SERPs (and in what order) for a given user’s specific search query. But no one outside of the core search teams actually knows what that full list of ranking factors is, or how each one of them gets weighted. What’s more, that list of ranking factors and their respective weights is under constant change.
That uncertainty is intentional, and it’s necessary to prevent unscrupulous webmasters from “gaming the system” so that their pages rank higher than they otherwise should in organic search results. But it also means that anybody who conducts SEO work has to rely on inferences based on experimentation, observation, and the (infrequent) disclosures made by search engines themselves.
Given this lack of certainty, how can digital marketers know which steps they should be taking in order to improve their search rankings over time?
Ask Not What Google Can Do For Your Website…
While there may not be any single clear answer to that question, it’s useful to bear in mind that search engines aren’t trying to cater to the interests of website owners (when it comes to organic search, at least); they’re catering to the interests of searchers. Google sums up this principle in their Webmaster Guidelines with a handy maxim: “make pages primarily for users, not for search engines”.
In other words, rather than asking ourselves “how can I get Google to rank my pages higher?”, we should ask “what would make my pages more useful to searchers within my desired audience?”
Well, for starters, those searchers should be able to find your pages via the SERPs – that is, search engines should be able to access and crawl your website, quickly updating their search index as new content gets added and changes are made. Technical SEO factors – such as your indexing setup, along with the presence and content of structured data markup – play a major role there.
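To make “structured data markup” a little more concrete: it’s usually embedded in a page as a JSON-LD snippet inside a script tag. Here’s a minimal Python sketch that generates a schema.org LocalBusiness snippet – the business details are invented for illustration, and the exact properties worth including depend on your business type (schema.org and Google’s structured data documentation list the options):

```python
import json

# Hypothetical business details, for illustration only.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Jimmy's Sinks & More",
    "url": "https://example.com",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Winnipeg",
        "addressRegion": "MB",
    },
}

# The output would go inside a <script type="application/ld+json">
# tag in the page's <head>.
print(json.dumps(markup, indent=2))
```

Tools like Google’s Rich Results Test can then confirm whether the search engines can actually read what you’ve embedded.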
Beyond that, you want your website’s audience to find meaningful content that’s relevant to their search queries – that’s where things like content creation and keyword optimization come in.
You also want your site’s content to be seen as trustworthy and authoritative; that largely depends on “off-site SEO” factors such as backlinks, social media engagement/shares, and other brand mentions across the web.
“If It Drops Below 50… It Blows Up”
Alright, pop quiz, hotshot: can you think of anything missing so far from this list of things people want when they’re searching online?
That’s right – it’s speed.
Searchers want answers to their online queries, and they want those answers fast. There’s plenty of evidence out there to back this claim, but I’ll rattle off a few well-worn examples here:
- In 2006, Google’s Marissa Mayer (the company’s 20th-ever employee, and later CEO of Yahoo!) revealed that a half-second increase in the load-times for their search engine results page resulted in a 20 per cent drop in searches made.
- Sometime prior to 2006, Amazon ran tests which found a similar 20 per cent decline in their website traffic when page-load times went up by 500ms, and that even a 100ms increase led to a 1 per cent decline in sales.
- These might actually be the most often-cited numbers you’ll find anywhere online regarding the impact of page-speed, but I still had one Hell of a time tracking down where they came from. Turns out, the original source for both figures was a guest lecture presented by pioneering Amazon engineer Greg Linden to a class of computer science students at Stanford University back in 2006. I managed to track down a link where you can download Greg’s original PowerPoint slides [with presentation notes!] on his personal blog. Yay, Internet history!
- A 2018 study by Pingdom found an average bounce rate of 6 per cent for pages that loaded within 2 seconds. For pages which took 5 seconds to load, that average bounce rate shot up to 38 per cent. Once the load-times hit 7 seconds, more than half of all visitors who landed on a web page would leave without taking any further action.
- A 2016 Google study of mobile page-load data found that more than half of all web traffic now comes from mobile devices, and that 53 per cent of mobile site visits are abandoned when a page takes longer than three seconds to load.
Page speed has long been known to have an impact on search rankings; Google announced that they had begun incorporating it as a ranking signal back in 2010. For a long time, that signal only directly impacted search results on desktop, but following the rollout of Google’s “Speed Update” in July 2018, page speed is now a direct ranking signal for mobile searches as well.
It’s also worth pointing out that search engines tend to reward those pages where users actually find what they’re looking for (rather than having to back out and click through more results). Considering how often we know that users will abandon a slow-loading website, web performance has SEO implications which go far beyond any individual ranking signal.
This might go a ways to explaining why Google added new (experimental) speed reports to its Search Console dashboard this week, just as I was finishing up this blog-post. Talk about serendipity!
Speeding Up A Sluggish Site
If you’re looking for ideas on how you might improve the performance of your website, then I strongly encourage you to run a few tests using Google’s PageSpeed Insights. The tool provides details on how quickly a given website loads via desktop and mobile devices, compares its performance to other websites using HTTPArchive data (plus data from the Chrome User Experience Report, if available), and provides suggestions on how to optimize load-times and overall performance. Other popular (and free!) page-speed testing tools are offered by GTmetrix, Pingdom, and WebPageTest.org.
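If you’d rather pull those numbers programmatically – say, to track a handful of pages over time – PageSpeed Insights also exposes a REST API. The sketch below uses only Python’s standard library, and demonstrates the parsing step against a stubbed response so it runs offline; the endpoint and response fields reflect the v5 API as I understand it, so double-check against Google’s documentation before relying on it:

```python
import json
import urllib.parse
import urllib.request

# Google's PageSpeed Insights REST API (v5).
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_report(url, strategy="mobile"):
    """Fetch a full PageSpeed report for `url` (requires network access)."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{API}?{query}") as resp:
        return json.load(resp)

def performance_score(report):
    """Extract the 0-100 Lighthouse performance score from a report dict."""
    raw = report["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)  # the API reports the score as 0.0-1.0

# Parsing demonstrated on a stubbed response, so this runs offline;
# in practice you'd call performance_score(fetch_report("https://example.com")).
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.93}}}}
print(performance_score(sample))  # 93
```

For occasional use the API works without a key, though Google documents quotas for heavier usage.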
After much testing and fiddling, I’m proud to say that Kairos Media’s homepage now ranks among the fastest of any digital marketing company based in Winnipeg – not bad, considering that this thing currently runs on a free-tier Amazon EC2 instance with a CloudFront CDN, so I’m paying literally pennies in web hosting fees.
Before you crack open a new tab and start testing, though, do you want to know what the answer to “why is my website so slow?” really is, nine times out of ten? Because if you want to know, then I’ll tell you. Are you sure? Okay…
Your website’s too fat. There, I said it. Ya website chonky.
The term “page weight” refers to the overall size of the various sub-resources that make up a given web page. Generally speaking, the larger or “heavier” a website is, the longer it takes to load.
According to historical data from the HTTP Archive, as of 2017 the “average” page weight clocked in at roughly 3MB, a substantial increase from the 1MB averages seen five years prior. 2017 was also the year that Pingdom reported an overall average page weight of 3.4MB, based on 262 billion page tests conducted during that year.
What stands out in both of those datasets isn’t so much that the average web page is getting heavier, but where this “page bloat” comes from. If you check out this breakdown of the Pingdom data, or this one put together by Tammy Everts at SpeedCurve from the HTTP Archive data, you’ll find that image files – and to a lesser degree, video files – have been the main contributors to rising page weights in recent years.
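You don’t need a third-party tool to see where a page’s weight comes from – your browser’s network tab will show you, and so will a small script. Here’s a rough Python sketch that lists a page’s sub-resources using the standard library’s html.parser (the sample HTML and paths are made up); to compute the actual page weight, you’d fetch each asset plus the HTML itself and sum the byte sizes:

```python
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collects URLs of common sub-resources: images, scripts, stylesheets."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(attrs["href"])

# A made-up page, for illustration.
page = """
<html><head>
<link rel="stylesheet" href="/styles/main.css">
<script src="/js/app.js"></script>
</head><body>
<img src="/images/hero-photo.jpg">
</body></html>
"""

collector = AssetCollector()
collector.feed(page)
print(collector.assets)  # ['/styles/main.css', '/js/app.js', '/images/hero-photo.jpg']
```

On a real page, you’d typically find that the image entries in that list dwarf everything else – which is exactly the pattern the Pingdom and HTTP Archive numbers above describe.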
Visual content can absolutely help to capture a user’s attention and keep them engaged with your site – clearly, those are both pluses. But poorly optimized, overweight image and video files are easily the most common cause of “page bloat” (and the slower load-times that come with it) that I encounter in my work – particularly on homepages and blog-posts. My theory is that because these are the areas of a website most frequently accessed and updated by non-developers (be they copywriters, “content marketers”, digital marketing coordinators, etc.), they’re also the places where the majority of under-optimized assets get introduced.
Ironically, a lot of online content being produced for (mainly) SEO purposes winds up creating UX issues which considerably limit that content’s effectiveness.
As for video’s role in contributing to page bloat: do you remember a couple of years back, when it seemed like everybody was adding a “hero video” to their homepage? You know, those embedded, above-the-fold, full-autoplay backgrounds, the kind that typically featured nothing but generic B-roll and bore only a tenuous relationship to the content of the websites they appeared on?
Remind me again, why… did we all do that, exactly? Was it one of those things where a new WordPress theme comes out, and everybody wants to try it out at the same time? Or maybe the big telecoms were paying web designers under-the-table, as part of a massive conspiracy to charge their customers more when they’d go over their monthly data limits?
Either way, man, hero videos. Glad we got over those.
Good Luck, Cinder-Block!
There’s probably at least one web developer out there who is fuming at me right now over my seeming obsession with page weight. And yes, in fairness to that web-dev, what I’m presenting here is an incredibly reductive view of the many factors that contribute to website performance.
But this post isn’t really intended to add to the spirited debates going on amongst developers and UX designers; it’s meant more so for the other folks involved in digital marketing, the ones who might not have a strong technical background, but whose work has direct implications for web performance nonetheless.
If that sounds like you, then as a proactive measure, it’s a good idea to optimize image and video assets for the web before uploading them as part of a blog-post or other section of your website. Some online publishing services will automatically handle image compression/optimization for you (Facebook and YouTube don’t want to pay for hosting and serving files any larger than they have to), but your web server likely doesn’t.
Consider running image files through TinyJPG or TinyPNG, and see how the results turn out; both are free online compression tools which can often reduce the size of image files by more than 70 per cent. On more than one occasion, I’ve used these tools myself as a quick-and-dirty way to carve 30 per cent out of a website’s average page weight. The impact that kind of sudden weight-loss can have on a website’s performance (and your overall SEO efforts) is not insignificant.
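And if you’d rather script that compression yourself – say, to batch-process a whole media folder – the Pillow imaging library can do a serviceable job of it. Here’s a minimal sketch; it uses a synthetic, detail-heavy image so it’s self-contained, and the exact savings will vary from photo to photo:

```python
from io import BytesIO

from PIL import Image  # Pillow: pip install Pillow

def jpeg_bytes(img, quality):
    """Encode `img` as a JPEG at the given quality and return the raw bytes."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()

# A synthetic, detail-heavy stand-in for a real photo.
img = Image.new("RGB", (400, 300))
img.putdata([((i * 7) % 256, (i * 13) % 256, (i * 29) % 256)
             for i in range(400 * 300)])

original = jpeg_bytes(img, quality=95)
compressed = jpeg_bytes(img, quality=70)
savings = 100 * (1 - len(compressed) / len(original))
print(f"{len(original)} B -> {len(compressed)} B ({savings:.0f}% smaller)")
```

A quality setting somewhere in the 70-85 range is usually visually indistinguishable from the original for web use; resizing images down to the largest dimensions they’ll actually be displayed at helps even more.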
Alternatively, you could pay somebody to do that for you. Hell, you could pay me to do that for you. But it’s the kind of issue that’s so prevalent in digital marketing (and so easily fixed) that I don’t really feel like it should require outside help from an “SEO specialist” as often as it does.
After all, your plumber doesn’t get mad at you if you buy yourself a drain snake. A good one would probably want to make sure you know how to use it properly. And I like to consider myself a fairly decent plumber. Marketer. Whatever.