9 Factors that Reduce Organic Traffic to Your Site

Organic traffic is the lifeblood of most sites that rely solely on income from selling space for contextual advertising. If organic traffic begins to decline, and the decline continues for several weeks, it is a warning sign, and countermeasures should be taken as soon as possible.

There are many reasons for a decline in organic traffic. Frequent drops occur after moving to a new domain, after a global redesign, or after implementing other sweeping changes; any global change may cause a noticeable decrease in traffic. An SEO specialist with extensive experience helped compile these tips on diagnosing falling organic traffic.

A decrease in organic traffic usually follows in these cases:

  • a change in page URLs;
  • a move to a new CMS;
  • changes to the site structure;
  • updates to page meta tags and other elements (product cards, sections, headings);
  • deletion of essential pages.

Let's consider these and other reasons in more detail.

1. The robots.txt file

During development, a new site is often unintentionally closed from indexing via the robots.txt system file. During a move, technicians often forget to replace this file with the production version, which does not prohibit search engine indexing. Even when everything is done in the correct order, it is tough to keep track of absolutely everything.

Suppose the site's indexing turns out to be restricted in robots.txt. In that case, the problem can be eliminated by regenerating the robots.txt file or by deleting the lines with the Disallow directive that block Google's search bots.
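For reference, a development robots.txt that blocks all crawlers versus a production version might look like this (the paths are illustrative):

```
# Development version — blocks ALL crawlers. Remove before launch!
User-agent: *
Disallow: /

# Production version — allows indexing, hides only service paths
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```

The single `Disallow: /` line in the first variant is enough to keep the whole site out of the index, which is why this file is so easy to overlook during a move.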

2. Mass or partial URL changes

If you change the URL structure, a temporary traffic dip is inevitable. The fact is that link weight and other link metrics take far longer than a single day to transfer to the new pages.

You can minimize the adverse effects of link changes by setting up 301 redirects from all obsolete URLs to new ones. 
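On an Apache server, for example, such redirects can be declared in an .htaccess file (the paths below are hypothetical):

```
# .htaccess on Apache — hypothetical paths for illustration
Redirect 301 /old-page.html https://example.com/new-page/

# Pattern-based redirect for a whole renamed section
RewriteEngine On
RewriteRule ^blog/(.*)$ https://example.com/articles/$1 [R=301,L]
```

The first directive redirects a single page; the RewriteRule maps every URL under /blog/ to the same path under /articles/ in one rule.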

Lost meta tags on web pages

This situation can occur when:

  • a CMS change;
  • edits to system files;
  • a change of the site theme;
  • a move to a new domain;
  • plenty of other situations.

Before making any global changes to your site, be sure to make a backup! If you do not know how to do this, write to your hosting provider's support; I will not describe the process here because it varies from host to host.

The previously configured page meta tags can be restored manually. The algorithm is as follows:

  • Find the meta tags of all the old pages and copy them into a separate document, for example, an Excel spreadsheet. Note that you need to copy the meta tags not only for regular web pages but also for product cards, headings, brands, and categories. An alternative is to generate meta tags automatically from the data that was on the outdated web pages. This requires some coding, so the work should involve a qualified programmer.
  • Transfer the resulting meta tags to the new pages. Do not carry over meta tags that are clearly over-spammed (more than two keywords stuffed into one title). Such "optimization" will only harm the site after the move.
  • Set up all necessary 301 redirects.
  • Make sure all updated pages return only a 200 status code.
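The redirect-mapping step above can be sketched in a few lines of Python. This is a minimal example, not a full migration tool, and the URLs in it are hypothetical; it pairs old URLs with new ones and emits Apache-style redirect rules:

```python
# Minimal sketch: build 301 redirect rules from an old-URL -> new-URL mapping.
# The example URLs are hypothetical.
from urllib.parse import urlparse

def build_redirect_rules(url_map):
    """Turn {old_url: new_url} into Apache 'Redirect 301' lines."""
    rules = []
    for old, new in url_map.items():
        old_path = urlparse(old).path  # Apache matches on the path only
        rules.append(f"Redirect 301 {old_path} {new}")
    return rules

url_map = {
    "https://example.com/old-page.html": "https://example.com/new-page/",
    "https://example.com/old-catalog/widgets.html": "https://example.com/catalog/widgets/",
}
for rule in build_redirect_rules(url_map):
    print(rule)
```

Keeping the mapping in one spreadsheet or dictionary like this also gives you a checklist for verifying that every updated page really returns a 200 code afterwards.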

3. Slow loading speed

If, after upgrading or moving to a new domain or hosting, the site began to slow down and page loading time increased significantly, you need to check how the hosting copes with the existing load. To do this, contact your web host's support.

Analyze your hosting plan. Is there enough disk space for your site? Also, pay attention to the server load the site creates on your hosting. If the load is very high, check whether the correct PHP version is selected for the site in the hosting administration panel.

Other reasons for a high server load on hosting:

  • the site uses a large number of plugins;
  • a lot of PHP and Perl scripts are involved;
  • the site serves a large amount of static content (for example, it has many images);
  • DDoS attacks, HTTP floods, or too many requests from one or more IP addresses.

4. Some of the new URLs cannot be indexed

This is quite a common situation after a move. It occurs when internal linking is set up incorrectly: for example, old links still point to pages that have been deleted or moved. Google does not like this.

Sometimes problems with indexing new URLs occur when a page (and, accordingly, its content) looks different to the crawler than to a live visitor. To check this, disable JavaScript in your browser and compare what you see.

5. The menu in the site header

If you have partially changed the site theme or design, you may have removed critical, traffic-bringing URLs from the site header. Links removed from the header automatically lose weight. An excessive number of URLs in the header is also not welcomed by Google: the overall weight is reduced, and in particularly severe cases a filter is imposed on the site.

6. Forgot to move tags, categories, filters

It is challenging to take everything into account during a move, especially if the site is old and voluminous. For example, seemingly insignificant elements were not moved to the new version of the site: rubrics, subcategories, page tags. Yet these elements could be essential precisely for search engine optimization.

The above situation arises when several independent professionals, rather than a team that coordinates all actions, work on the site.

7. 301 redirects are not working

If you’ve moved recently, you need to make sure your 301 redirects are working correctly. If they haven’t been set up at all, now is the time to fix it.

The main task is to set up redirects from all obsolete URLs to new ones.

The main mistake is changing URLs en masse without creating an appropriate redirect.

The main reasons for a decrease in organic traffic here are manual URL changes, switching to human-readable (semantic) URLs, and mass changes in page titles.

Even if the new URLs are already in Google’s index, you can solve the problem. To do this, as I mentioned above, you need to make 301 redirects from all obsolete URLs to updated URLs.

Redirects should be configured not only for regular web pages but also for all site elements that have an address: filters, cards, categories, sections. That is the ideal; sometimes such mass work is almost impossible, especially if the site is vast and the number of pages runs into the thousands.

To set up the redirects, you will need to match the outdated URLs with the current ones. To do this, export both sets along with the page titles. Current page URLs can be exported in the following ways:

  • extract URLs and page titles manually (not always possible if there are a lot of pages);
  • extract URLs and page names from the database; you will need a developer's help for this;
  • extract addresses from the site's XML sitemap.
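The last option, pulling addresses out of an XML sitemap, takes only a few lines of Python. A sketch, with an inline stand-in for the sitemap (in practice you would read the file or fetch it over HTTP):

```python
# Sketch: pull all <loc> URLs out of a standard XML sitemap.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> value from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

# Stand-in sitemap content for demonstration.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/catalog/</loc></url>
</urlset>"""

print(extract_urls(sample))
```

The namespace prefix matters: sitemap files declare the sitemaps.org namespace, so a plain search for `loc` would find nothing.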

You may have a harder time getting the outdated URLs. The following methods can help:

  • Extract obsolete URLs and page names from older versions of the database. These databases are not always accessible. You will need help from a web programmer.
  • Extract obsolete URLs and page titles with the help of the Internet Archive (Wayback Machine): specify the domain and select the period of interest at the top of the screen.

8. URL errors in Google Search Console

The Google Search Console reports can help you find problem pages. Open Search Console and select the "Coverage" section.

The "Server Errors" and "Not Found" reports should be analyzed. If there are more than 10 problem pages, it makes sense to fix the errors; if there are only 2-3 such pages, you may not need to bother with them at all.

The "Not Found" report shows all URLs that used to return a 200 response but now produce a 404 code. The report is convenient because its URLs are displayed by hierarchy: the "heaviest" pages at the top and the links with the least weight at the bottom.

You can also export problematic URLs with the help of Google Analytics. Open Google Analytics and select "Traffic Sources," then click "All Traffic." Open the "Channels" section and under Default Channel Grouping, select "Organic."

Now select "Landing Page" as the primary dimension, and just below, specify how many rows per page you want to show. In the upper right corner, enter the time interval for the statistics.

The resulting list of URLs should be checked. If all or most pages return a 200 code, everything is fine. If most URLs return 301, make sure each redirect leads to its corresponding page and not, for example, to the home page. If most URLs return 404, set up 301 redirects for all problem pages.
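The triage logic above can be sketched as a small Python function; the URL list is a hypothetical example standing in for your exported data:

```python
# Sketch: classify checked pages by HTTP status code and suggest the fix.
def triage(code: int) -> str:
    """Suggest an action for a page based on its HTTP status code."""
    if code == 200:
        return "ok"
    if code in (301, 302):
        return "check that the redirect leads to the matching page"
    if code == 404:
        return "set up a 301 redirect"
    return "investigate"

# Hypothetical results from checking the exported URL list.
results = {
    "https://example.com/new-page/": 200,
    "https://example.com/old-page.html": 301,
    "https://example.com/deleted.html": 404,
}
for url, code in results.items():
    print(url, "->", triage(code))
```

Running every exported URL through a rule like this turns a thousand-row spreadsheet into a short to-do list grouped by action.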

9. Incorrect installation of SSL certificate

If some pages open over the HTTP protocol and others over HTTPS, there is a mixed-content problem. In short:

  • Fix all URLs on the site: replace HTTP with HTTPS in all links, including HTML and XML sitemaps, robots.txt, and images.
  • Set up 301 redirects from HTTP to HTTPS.
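The first step can be partially automated. A minimal sketch that upgrades same-domain links in an HTML string; a real migration would also cover the database, sitemaps, and edge cases such as protocol-relative URLs:

```python
# Sketch: rewrite insecure http:// references to https:// for one domain.
import re

def upgrade_links(html: str, domain: str) -> str:
    """Replace http:// links pointing at the given domain with https://."""
    # Lookahead keeps external http:// links untouched.
    pattern = rf"http://(?=(?:www\.)?{re.escape(domain)})"
    return re.sub(pattern, "https://", html)

page = '<img src="http://example.com/logo.png"> <a href="http://other.org/x">x</a>'
print(upgrade_links(page, "example.com"))
```

Note that links to other hosts are deliberately left alone; external resources you embed must already support HTTPS, or they will still trigger mixed-content warnings.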


A sharp decline in organic traffic is an important signal that should not be missed. Finding the source of the problem can be difficult, but by checking every point in this guide, you will track down the error sooner or later.


Elissa Smart is the omnipotent demiurge behind PaperHelp's blog. Driven by seething creativity, not only does she help students with particular research and writing requests, but she also finds the energy to share her extensive expertise via blog posts.
