People continue to struggle with this challenge. The available tools are, in my opinion, woefully inadequate. But I can’t do anything about them.
Here are my concerns with what I see in requests for help with server response time:
- People put too much faith in the tool reports
- People assume Google is penalizing their sites for being too slow
- People want to load their pages with junk and make them fast
On the last point: as I prepared to write this article, someone in a Facebook group said that his page scored well when he removed his embedded videos and scored poorly when he put them back on the page.
He didn’t say why it was important to embed videos on the page. His dilemma underscores the compromise we must all strike between the page we want to design and serve and the page that the speed testing tools score well.
1 – The Correct Way to Use Speed Testing Tools
Professional testing standards in many industries recognize that you can never test under identical conditions every time. Assuming a single test is representative is the most common mistake I see people make.
You have control over the conditions of the tool, but you have no control over the conditions of the server. So if you’re using GTmetrix, you can tell the tool to test from different locations around the world. That’s a good thing.
On the other hand, if you’re running these tests during peak traffic times on your server, your results will always disappoint you if you have a busy Website.
*=> Server response time includes how long a request must wait in the processing queue.
If you’ve ever looked at a “CPU Load” statistic, you’ve probably wondered what it means. This stat refers to how many processes are queued (running or waiting for CPU time), averaged over a few short windows – typically the past 1, 5, and 15 minutes. The higher the number, the longer the average wait time.
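If your hosting plan gives you shell access, you can check this yourself. Here is a minimal sketch in Python (os.getloadavg() is in the standard library, but it only works on Unix-like systems – which covers most Web hosts):

```python
import os

# The three load averages: processes running or waiting for CPU,
# averaged over the past 1, 5, and 15 minutes.
one_min, five_min, fifteen_min = os.getloadavg()
cores = os.cpu_count() or 1

print(f"Load averages: {one_min:.2f} / {five_min:.2f} / {fifteen_min:.2f}")

# Rough rule of thumb: sustained load above the core count means
# requests are sitting in the queue waiting for CPU time.
if one_min > cores:
    print(f"Load exceeds {cores} core(s) - requests are likely queuing.")
```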
Web servers never sit idle, even when no traffic is hitting the server. They are always doing something. You could be testing the server when it’s running a lot of maintenance programs.
*=> Server response time includes all router responses in the path.
1 bad router can kill your Website performance – in ONE AREA of the world.
This is why you must:
- Run multiple tests at multiple times
- Run multiple tests from multiple areas
You must average the results to set your benchmarks.
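To illustrate the idea, here is a minimal benchmarking sketch in Python (assumptions: the URL is hypothetical, and the requests library only measures server response time, not the full browser render time a tool like GTmetrix reports):

```python
import time
import statistics
import requests

URL = "https://example.com/"  # hypothetical page to benchmark
RUNS = 5

timings = []
for _ in range(RUNS):
    start = time.perf_counter()
    requests.get(URL, timeout=30)
    timings.append(time.perf_counter() - start)
    time.sleep(60)  # space the runs out so you don't hammer the server

# Average the runs to set a benchmark, then repeat at other times
# of day (and from other locations) before you trust the number.
print(f"mean: {statistics.mean(timings):.3f}s  "
      f"median: {statistics.median(timings):.3f}s")
```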
I don’t recommend testing constantly or even frequently. But testing your site performance once or twice a month, or several times a quarter, should be sufficient for most Websites.
If you’re using a speed testing tool that doesn’t allow you to test from different locations, you’ll need to use more than 1 tool.
2 – Google Doesn’t Care How Slow Your Site Is
Well, they don’t care to the extent many Web marketers believe Google cares.
Website speed was never a major ranking signal. It never will be.
And yet for some reason, people believe we’re all doomed if we don’t score 100 on some speed test tool.
Google NEVER relies on a single speed estimate in calculating your site’s crawl budget (how often it crawls your site in a given period) or crawl priorities (what is crawled, how often, and when).
Google computes average response times for every Website. So if Googlebot thinks the site is running slow today, you’re not doomed forever. Googlebot will clock the site again tomorrow, the day after tomorrow, and for as long as it crawls the site.
Bing handles site speed the same way.
So while keeping sites fast is important, it’s not a guaranteed disaster if someone breaks something that causes a page to hang for 30 seconds. The poor user experience will cost you conversions in the immediate future, but it won’t affect search engine rankings if the problem is resolved soon after it occurs.
*=> Most sites probably have 6-10 days to deal with problems like this.
3 – Decide What Is Important for the User Experience
People are often willing to wait for content. They only need to know in advance they are waiting for it.
Yes, you should also use lazy loading strategies, or use image thumbnails (for both large images and videos), etc. Find ways to create a smooth, streamlined on-page experience – but if you’re creating content that needs all this time-consuming stuff, let visitors know they’ll have to wait.
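As one small example: if you generate or post-process your pages with Python (an assumption – most sites would make this change in their templates instead), native browser lazy loading is a one-attribute change:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = '<p>Intro text</p><img src="large-photo.jpg" alt="Photo">'
soup = BeautifulSoup(html, "html.parser")

# loading="lazy" tells the browser to defer fetching the image
# until it is about to scroll into view.
for img in soup.find_all("img"):
    img["loading"] = "lazy"

print(soup)  # the img tag now carries loading="lazy"
```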
The page title and headings are usually enough to prepare visitors for a short pause. We’re talking about adding 1-2 seconds to a page’s load-and-render time over fast connections. People WILL wait if they know what is coming.
So, “Watch a video interview with Abraham Lincoln” is good – “Abraham Lincoln said [X]” is not good. Many people don’t realize how effective those titles can be in warning visitors of what to expect.
The same is true for meta descriptions. They are SUPPOSED to describe what is on the page. But SEO copywriters often stuff them with keyword phrases (thinking they are optimizing for the queries they want to rank for). The search engines don’t use meta descriptions as ranking signals – so you SHOULD be writing meta descriptions that convince people to click through to the page.
*=> A meta description is a free text ad. Use it that way. But set honest expectations.
Google’s Search Console Crawl Stats Report Is Often Overlooked
When I join discussions about site speed, most people reveal they never looked at the GSC crawl stats report. The new report is more informative than the old one.
Even when people do look at the report, they may miss the context of what Google is telling them.
For example, if you delete a lot of content from a Website (or redirect many URLs to other destinations), your average crawl time should decline. That’s because Googlebot spends less time fetching non-existent or redirected URLs.
The response time for a 301/302 status code and a 404 status code is about the same – unless the server must return a large custom 404 error document. If you don’t know to look for these things, you may not understand why your crawl stats look weird.
*=> Use an HTTP headers checking tool to see what is being returned for URL requests.
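If you prefer to script the check, here is a minimal sketch using Python’s requests library (the URL is hypothetical; redirects are deliberately not followed so you see the raw status code):

```python
import requests

url = "https://example.com/old-page"  # hypothetical URL to check

# allow_redirects=False shows the actual status code (301/302/404)
# instead of silently following the redirect chain.
resp = requests.head(url, allow_redirects=False, timeout=10)

print(resp.status_code)
print(resp.headers.get("Location"))        # redirect target, if any
print(resp.headers.get("Content-Length"))  # a large 404 body shows up here
```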
Google’s “URL Inspection Tool” offers to show you what Googlebot sees when you test a live URL – but that report leaves much to be desired. Validate or expand upon what the tool shows you with other tools.
The crawl stats report will show you when Google is struggling to fetch content from the server. You should match those rough periods to server bandwidth reports. Most if not all Web hosting accounts should provide some kind of bandwidth report.
If you can match the slow crawl rates to peak bandwidth times, you can usually determine what happened.
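One way to line the two up is to compute hourly averages from your own access log. A minimal sketch, under a big assumption: your server is configured to write the response time as the last field of each line (Apache’s %D and Nginx’s $request_time can both be set up this way):

```python
import re
from collections import defaultdict

# Pulls the date and hour out of a timestamp like [12/Mar/2024:14:05:32 ...]
STAMP = re.compile(r"\[(\d+/\w+/\d+):(\d+):")

buckets = defaultdict(list)
with open("access.log") as f:
    for line in f:
        m = STAMP.search(line)
        if not m:
            continue
        hour = f"{m.group(1)} {m.group(2)}:00"
        try:
            # Assumes the last field is the response time in seconds.
            buckets[hour].append(float(line.rsplit(None, 1)[-1]))
        except ValueError:
            continue

for hour, times in sorted(buckets.items()):
    print(f"{hour}  avg {sum(times)/len(times):.3f}s over {len(times)} requests")
```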
SEO Crawling Is A -BIG- Part of the Problem
*=> Some people crawl their (clients’) Websites constantly. Imagine what that must do to server response time.
I strongly urge people NOT to turn SEO crawlers loose on Websites. These crawlers may be convenient and easy to use, but they miss a lot of critical things. AND they add to server load – slowing down site performance, maybe even blocking legitimate visitors from loading pages.
You should not crawl a site as part of an audit unless you have strong reason to suspect there are many broken internal links.
If you can document broken links through a visual inspection of page source code, you may be able to fix a lot of problems quickly without going through a crawl. Schedule the crawl for later to confirm that you fixed the problem.
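If you do need to confirm a short list of suspect links, you can check them politely instead of unleashing a crawler. A minimal sketch (the URLs are hypothetical; the point is the explicit throttle):

```python
import time
import requests

# A handful of internal links you suspect are broken - gathered
# from reading page source, not from a site-wide crawl.
suspect_urls = [
    "https://example.com/page-a",
    "https://example.com/page-b",
]

for url in suspect_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    time.sleep(2)  # throttle so the check adds no meaningful server load
```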
A well-managed firewall – one that blocks rogue crawlers – should save the average Website money every month. It can potentially save a big site hundreds to thousands of dollars a month in mitigation costs.
*=> Competitors crawl other people’s sites.
This is highly unethical behavior in my opinion. I block crawlers that hit our server because they are so poorly managed that they drag our server down to unacceptably slow performance.
Paying for scalable cloud hosting just to handle all the rogue crawlers is a waste of money. You should pay for scalable hosting because your site’s legitimate traffic is increasing. Don’t pay for the bandwidth that competitors suck out of your hosting with their rogue crawls.
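Before you can block rogue crawlers you have to identify them. A minimal sketch that tallies user agents in an access log (assumption: the log uses the common “combined” format, where the user agent is the last quoted field on each line):

```python
import re
from collections import Counter

# In the "combined" log format the user agent is the last
# double-quoted field on each line.
UA = re.compile(r'"([^"]*)"\s*$')

counts = Counter()
with open("access.log") as f:
    for line in f:
        m = UA.search(line)
        if m:
            counts[m.group(1)] += 1

# Anything unfamiliar near the top of this list is a candidate
# for a firewall rule.
for agent, n in counts.most_common(10):
    print(f"{n:8d}  {agent}")
```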
*=> Use Bing and Google console reports to find the 404 errors.
They’ve already done the work for you. Leverage their efforts. These are the errors that matter for SEO because they are the errors the search engines care enough about to report.
1 million 404 errors that Google doesn’t know or care about are not worth the time and money you’ll waste crawling a site.
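A minimal sketch of working from the consoles’ own reports (assumption: you exported the 404 list from Search Console as a CSV with the URL in the first column):

```python
import csv
import time
import requests

# 404 URLs exported from Google Search Console or Bing Webmaster Tools.
with open("gsc-404-export.csv") as f:
    urls = [row[0] for row in csv.reader(f)
            if row and row[0].startswith("http")]

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    # A 404 here still needs a fix or a redirect; a 200 or 301
    # means you've already handled it.
    print(status, url)
    time.sleep(1)  # be polite to your own server
```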
Conclusion
We already have most of the data we need to manage Website speed and performance. Unfortunately, most SEO pundits focus on how to use the many tools that commercial vendors create while ignoring the tools bundled in Web hosting and search engine Webmaster dashboards.
If you cannot find a problem in the Web hosting bandwidth reports, or in the search engine crawl reports, there probably isn’t one. And any problem you find through a 3rd-party tool that isn’t reflected in the Web hosting and search console reports usually isn’t a problem worth your time and money.
3rd-party tools are most valuable when used pre-emptively on new Websites and/or content sections that search engines and visitors haven’t yet found. The tools can reveal potential issues that should be fixed before the rest of the Web finds them.
By the time most people care about Website speed and performance, they’re not thinking about all the reasons why the tools are seeing slow response times. It’s usually not because you’re embedding videos on the page. It’s usually because the server is struggling to keep up with demand. That’s a problem no 3rd-party speed testing or crawling tool will reveal to you.