The most common cause of a sudden drop in website traffic is a recent search algorithm update, but penalties, redirect errors, incorrect robots.txt rules and ranking losses are all legitimate reasons why traffic can fall too. Fortunately, in most cases there are a number of things you can check, and hopefully by the end of this post you’ll be able to diagnose why things have changed.
1. Algorithm updates
Google doesn’t hide the fact that it releases multiple updates throughout the year, some more significant than others. Unfortunately, trying to get solid details of the changes is, quite frankly, like trying to get blood from a stone. One way to gauge whether your site may have been impacted by an algorithm update is to keep a close eye on the changes Google confirms itself, but by far the easiest way to get information on algorithm changes is through tools such as Moz and SEMrush. If you find that there has in fact been a recent update, analyse the sites that have been affected the most, try to spot any correlation between them, and make sure that your site doesn’t suffer the same fate.
2. Tracking errors
Many webmasters and site owners manage to pull their tracking codes from the site and then wonder why traffic nosedives. Fortunately, it’s a mistake that’s easily fixed, but every day it goes unnoticed is data lost, so the quicker you spot it and get it sorted, the better. If there are suddenly no sessions being recorded in Google Analytics, or a tag isn’t firing, then chances are the tracking code either has an error or has been removed entirely. If you have access, check that the code is present and correct.
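If you’d rather not click through every template by hand, a quick script can flag pages where no analytics snippet appears in the HTML. This is only a rough sketch: the URLs are placeholders, and tags injected at runtime by a tag manager won’t show up in the raw source, so treat a miss as a prompt to investigate rather than proof the tag is gone.

```python
# Rough check: fetch each page and look for common Google tracking snippets.
# Requires the third-party "requests" package (pip install requests).
import requests

PAGES_TO_CHECK = [
    "https://www.example.com/",           # placeholder URLs - swap in your own
    "https://www.example.com/blog/",
]

TRACKING_SIGNATURES = [
    "googletagmanager.com/gtag/js",       # gtag.js loader
    "googletagmanager.com/gtm.js",        # Google Tag Manager container
    "google-analytics.com/analytics.js",  # legacy Universal Analytics
]

for url in PAGES_TO_CHECK:
    html = requests.get(url, timeout=10).text
    found = [sig for sig in TRACKING_SIGNATURES if sig in html]
    print(url, "->", ", ".join(found) if found else "no tracking snippet found")
```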
3. Incorrect robots.txt rules
Are you sure that your site isn’t blocking search engines from crawling via its robots.txt file? It isn’t uncommon for developers to leave the robots.txt file unchanged after migrating from a development or staging website, and most of the time it’s completely accidental. Go to your site’s robots.txt file and make sure that the following rule isn’t present:
User-agent: *
Disallow: /
Sitemap: https://www.example.com/sitemap.xml
If it is, you’ll need to remove the Disallow rule and resubmit your robots.txt file through Google Search Console and the robots.txt tester.
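To double-check that the live file isn’t blocking crawlers, you can also test a few key URLs against it programmatically. The sketch below uses Python’s built-in robotparser; the URLs are placeholders for your own.

```python
# Parse the live robots.txt and confirm Googlebot is allowed to crawl key URLs.
# Uses only the standard library.
from urllib import robotparser

ROBOTS_URL = "https://www.example.com/robots.txt"   # placeholder - use your own site
TEST_URLS = [
    "https://www.example.com/",
    "https://www.example.com/important-category/",
]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()

for url in TEST_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```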
4. Redirect errors
Most sites, especially large ones, will have redirects in place. They’re most frequently added via a .htaccess file or, if you’re using WordPress, a plugin to make life a little easier. Whenever you add a new permanent redirect (301) to your site, test it before pushing it to the live environment, even more so if you’re adding redirects in bulk. Ensure that existing redirects are still working as expected by checking their response codes and final destinations, as in the sketch below.
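Here’s a minimal way to run that check in Python with the requests library; the URL list is a placeholder, and in practice you’d load your redirect map from a spreadsheet or the .htaccess file itself.

```python
# Follow each redirect, print the chain of status codes and the final destination.
# Requires the third-party "requests" package.
import requests

REDIRECTED_URLS = [
    "https://www.example.com/old-page/",           # placeholders - use your own
    "https://www.example.com/old-category/post/",
]

for url in REDIRECTED_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = " -> ".join(str(r.status_code) for r in resp.history) or "no redirect"
    print(f"{url}\n  chain: {chain}\n  final: {resp.url} ({resp.status_code})")
```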
5. Crawl errors
Use Search Console’s index coverage report and check for any URLs that have an error; URLs with an error against them won’t be included in the index. Typical errors found in this report include:
- Server errors.
- Redirect errors.
- URLs that are blocked by robots.txt.
- URLs that are marked with a noindex tag.
- Soft 404 errors.
- URLs that return an unauthorised request (401).
- URLs that aren’t able to be located (404s).
- Crawling errors.
More information on these reports can be found in Google’s Search Console Help documentation.
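For a quick spot check outside of Search Console, you can also request an important URL yourself and look for the most common culprits: a non-200 status, an X-Robots-Tag header, or a meta robots noindex tag. A rough sketch follows; the URL is a placeholder, and the meta tag check is a crude substring match rather than proper HTML parsing.

```python
# Spot-check a URL for common indexing problems: non-200 responses,
# an X-Robots-Tag header, or a meta robots noindex tag in the HTML.
import requests

def check_url(url: str) -> None:
    resp = requests.get(url, timeout=10)
    print(f"{url} -> {resp.status_code}")

    robots_header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in robots_header.lower():
        print("  blocked by X-Robots-Tag header:", robots_header)

    # Crude check - an HTML parser such as BeautifulSoup would be more robust.
    if 'name="robots"' in resp.text.lower() and "noindex" in resp.text.lower():
        print("  page appears to contain a meta robots noindex tag")

check_url("https://www.example.com/important-landing-page/")  # placeholder
```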
6. Ranking losses
Another really common reason for a decline in website traffic is a loss of organic rankings. If you’re tracking your performance with a rank tracker, troubleshooting this will be a lot easier; if you’re not, data from Search Console will be your best bet. Use the following process to get an idea of any ranking changes (a short sketch for comparing the exports follows the list):
- Using Google Analytics and Search Console or your preferred rank tracking tool, identify when traffic started to drop.
- Take an export of the ranking keywords before and after the drop.
- Using Excel or G Sheets create a table and paste in the data side by side.
- Compare the change in positions.
- Retarget dropped terms with keyword research and mapping.
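As a sketch of the export-and-compare steps, the snippet below merges two keyword exports and calculates the change in position. It assumes CSVs with "keyword" and "position" columns; adjust the file and column names to match whatever your rank tracker or Search Console export actually produces.

```python
# Compare keyword positions before and after a traffic drop.
# Assumes two CSV exports with "keyword" and "position" columns.
import pandas as pd

before = pd.read_csv("rankings_before.csv")   # export taken before the drop
after = pd.read_csv("rankings_after.csv")     # export taken after the drop

merged = before.merge(after, on="keyword", how="outer",
                      suffixes=("_before", "_after"))
merged["change"] = merged["position_after"] - merged["position_before"]

# A positive change means the keyword now ranks lower (a bigger position number).
print(merged.sort_values("change", ascending=False).head(20))
```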
7. XML sitemap changes
If you’re knowledgeable in SEO you’ll know (hopefully) that only URLs that return a 200 response and are indexable should appear in your sitemaps, unless you’ve purposely left redirected URLs in place so that search engines pick up the redirects quicker. One reason you could be seeing traffic plummet is a change in your XML sitemap. Crawl the sitemap URLs and make sure that they all return a 200 OK response and that any new landing pages or articles are included too. If your site contains 200 URLs and there are only 50 in the sitemap, you’ll want to regenerate and resubmit it using Search Console.
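A basic sitemap check is straightforward to script. The sketch below fetches a sitemap, pulls out every <loc> entry and flags anything that doesn’t return a 200; the sitemap URL is a placeholder, and a sitemap index file (a sitemap of sitemaps) would need an extra loop.

```python
# Fetch an XML sitemap and flag any URL that doesn't return a 200 response.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"   # placeholder
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NAMESPACE)]
print(f"{len(urls)} URLs found in the sitemap")

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
```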
8. Manual actions and penalties
A manual action will be issued against your site if one of Google’s eagle-eyed human reviewers finds content that goes against its webmaster guidelines. You can see whether your site has been affected by using the manual actions report in Search Console.
9. URLs being de-indexed
Search engines like Google are not immune to ‘de-indexing’ bugs that can cause important pages to disappear from the index almost overnight. Discovering that key URLs are no longer available in the search results can be a massive clue when investigating a sudden loss of website traffic.
- Check the index coverage report in Search Console for any errors.
- Using the URL inspection tool, check that important pages are still in the index.
- If not, use the “Request Indexing” option in Search Console.
10. Keyword cannibalisation
If you’ve recently created a lot of new content around a specific topic without considering the keyword targeting, you may have accidentally fallen victim to keyword cannibalisation. Cannibalisation occurs when multiple URLs from the same website compete for the same keyword. If traffic and ranking signals are being spread across several pages or posts, you could be losing valuable organic traffic. There are a number of tools you can use to highlight cannibalisation issues, or you can work directly from Search Console data, as in the sketch below.
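One rough way to spot candidates is to take a Search Console performance export (by query and page) and count how many distinct pages pick up clicks for each query. The column names below are assumptions; adjust them to match your export.

```python
# Flag possible keyword cannibalisation from a Search Console performance export.
# Assumes a CSV with "query", "page" and "clicks" columns.
import pandas as pd

data = pd.read_csv("search_console_export.csv")   # placeholder filename

# Count distinct ranking pages and total clicks for each query.
per_query = (data.groupby("query")
                 .agg(pages=("page", "nunique"), clicks=("clicks", "sum"))
                 .reset_index())

# Queries served by more than one page are candidates for consolidation.
candidates = per_query[per_query["pages"] > 1]
print(candidates.sort_values("clicks", ascending=False).head(20))
```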
11. SERP layout changes
Changes in the way Google and other search engines display organic results can have an impact on your traffic levels, so being adaptable and willing to make changes will go a long way. Google in particular has made a number of changes to how results are displayed: Featured Snippets, Knowledge Graph panels and more prominent ads, to name a few, all of which can make life frustrating for SEO agencies and professionals. Before you see any sign of an organic result, you may have to scroll past ads, a Knowledge Graph panel, a Featured Snippet and Google’s own suggestions.
And that doesn’t even take into consideration a number of other SERP features. Analyse the keywords that you’re targeting; just because they didn’t trigger a SERP feature in the past doesn’t mean they don’t now. The easiest way to check is simply to search the keyword on Google or other search engines. If the keywords you’re targeting trigger featured snippets and instant answers, and you don’t own the featured snippet, you’re going to be losing clicks and traffic to your site.
Summary
Seeing website traffic drop can be very disheartening, but there is always a reason why, and if there’s a reason, it can usually be fixed. If you take one thing away from this post, it’s that a sudden decline could be due to a number of reasons combined, or even just one key traffic-rich page that has fallen from the index. Make sure that you investigate every possible avenue, and you’ll quickly discover the cause and get a recovery plan in place.