Seven reasons why your rankings dropped and how to fix them

Do you know that feeling of triumph when your content finally hits the first page of Google and starts attracting significant traffic? Unfortunately, nobody is safe from a sudden drop in rankings, and the reasons behind it can vary and are often far from obvious.

In this post, you’ll discover what could cause a sudden drop in traffic and how to fix the issue.

The tip of the iceberg

Unfortunately, there’s no one-size-fits-all solution when it comes to SEO. A drop in your rankings or traffic is just the tip of the iceberg, so get ready to check lots of issues before you identify the problem.

Graph on issues that cause ranking drops

Note: Percentages assigned in the above graph are derived from personal observation.

I’ve illustrated the most common reasons for a plummet above. Start by checking these parameters to find out how you can recover your rankings and drive traffic back to your website.

Algorithm tests

First of all, check the SERP. What if it’s not only your website that changed its positions in search results? These sharp shifts may happen when Google tests its algorithms. In this case, you don’t even have to take any further steps, as the rankings will be restored soon.

If you track your rankings with Serpstat, you can analyze your competitors’ positions as well. It’ll help you understand whether the SERP was changing a lot lately. From the moment you create a new project, the tool starts tracking the history of top-100 search rankings’ changes for the selected keywords. The “Storm” graph illustrates the effect of the changes that have occurred in the search results.

The "Storm" graph that illustrates the factors causing the ranking drop

On this chart, you can see that for the “cakes for dads” keyword, the storm score was pretty high on 21st March. Now, let’s look at how the top 10 positions were changing on that date.

Graph showing a phrase-wise rise and drop in the SERP

The graph shows a sharp drop and rise that occurred in most of the positions. In a few days, all the rankings were back to normal again.

This example tells us that whenever you witness a significant drop in your search rankings, you should start by analyzing the whole SERP. If there’s a high storm score, all you need to do is wait a bit.

If you’ve checked your competitors’ positions and haven’t seen any movement, here’s the next step for you.

Technical issues

Technical SEO affects how search robots crawl and index your site’s content. Even if your website is technically well optimized, trouble can occur every time you add or remove files or pages. So, make sure you’re aware of any technical SEO issues on your site. With Google’s URL Inspection tool, you can check how search engines see your website.

These are the main factors crucial for your rankings:

1. Server overload

If your server isn’t prepared for traffic surges, it can take your site down at any minute. To fix this problem, you can add a CDN to your website, cache your content, set up a load balancer, or move to cloud hosting.

2. Page speed

The more images, files, and pop-ups you add to your content, the longer your pages take to load. Keep in mind that page speed isn’t only a ranking factor; it also influences user experience. To quickly check for issues, you can use Google’s PageSpeed Insights. And to speed up your website, you can:

  • Minimize HTTP requests or minify and combine files
  • Use asynchronous loading for CSS and JavaScript files
  • Defer JavaScript loading
  • Minimize time to first byte
  • Reduce server response time
  • Enable browser caching
  • Reduce image sizes
  • Use a CDN (as mentioned above)
  • Optimize CSS delivery
  • Prioritize above-the-fold content (lazy loading)
  • Reduce the number of plugins you use on your site
  • Reduce redirects and external scripts
  • Monitor mobile page speed

3. Redirections

Missing redirects are one of the most common causes of lost rankings. When you migrate to a new server or change the structure of your site, never forget to set up 301 redirects. Otherwise, search engines will either fail to index your new pages or even penalize your site for duplicate content.
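As a rough illustration, a migration’s redirect plan can be kept as a simple old-to-new URL map and turned into server rules in one pass. The URLs and the nginx rewrite format below are a hedged sketch, not a prescription; adapt them to your own server and migration plan.

```python
# Hypothetical old -> new URL mapping from a site migration.
OLD_TO_NEW = {
    "/blog/old-post": "/articles/new-post",
    "/products/widget": "/shop/widget",
}

def nginx_301_rules(mapping):
    """Emit one nginx 'rewrite' line per moved URL as a permanent (301) redirect."""
    return [f"rewrite ^{old}$ {new} permanent;" for old, new in mapping.items()]

for rule in nginx_301_rules(OLD_TO_NEW):
    print(rule)
```

Generating the rules from a single mapping keeps the redirect plan reviewable in one place, which makes it harder to forget a page during the migration.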

Detecting site errors can be quite difficult, especially if an error is confined to a single page. Inspecting every page manually would be time-consuming, and costly if you’re running a business. To speed up the process of identifying such errors, you can use site audit tools like Serpstat or OnCrawl.


Wrong keywords

Are you using the right keywords? If you didn’t consider user intent when collecting your keywords, that alone may have caused problems. Even if your site was ranking high for these queries for some time, Google could have changed the way it understands your site’s intent.

I’ll provide two examples to illustrate the issue.

Case one

There’s a website of an Oxford Summer School named “”. The site contained service pages but no long-form descriptions. Once Google began to rank the website for queries with informational intent, SEO experts noticed that traffic dropped. After they added more text to the service pages, they succeeded in fixing the problem.

Case two

This case occurred at a flower delivery agency. While the website was ranking for transactional queries, everything was fine. Then Google decided the site was a better fit for informational intent. To restore the site’s rankings, SEOs had to add keywords with high transactional intent, such as “order” and “buy”.

To collect the keywords that are right for your business goals, you can use KWFinder. With the tool, you can identify relevant keywords that you can easily rank for.

Screenshot of a suitable keywords' list in KWFinder

Outdated content

This paragraph doesn’t require long introductions. If your content isn’t fresh and up-to-date anymore, people won’t stay long on your site. Moreover, outdated content doesn’t attract shares and links. All these aspects may become good reasons for search engines to reduce your positions.

There’s an easy way to fix it. Update your content regularly and promote it not to lose traffic. The trends keep changing, and if you provided a comprehensive guide on the specific topic, you don’t want it to become outdated. Instead of creating a new guide every time, update the old one with new data.

Lost links

Everybody knows your link profile is a crucial part of your site’s SEO. Website owners put a lot of effort into building quality links to new pieces of content. However, once you’ve managed to earn a large number of backlinks, you shouldn’t stop monitoring your link profile.

To discover whether your link profile has undergone any changes over the last few weeks, turn to Moz or Majestic. These tools will provide you with data on your lost and discovered links for the selected period.

Screenshot of discovered and lost linking domains in Moz

If you find out you’ve lost links from trustworthy sources, try to identify why those links were removed. If they’re broken, you can always fix them. If website owners removed your links accidentally (for example, when updating their websites), ask them to restore the links. If they did it intentionally, no one can stop you from building new ones.
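If you export your own snapshots of referring domains, the lost/discovered comparison these tools make is just set arithmetic. A minimal sketch; the domain names below are made up for illustration:

```python
# Two hypothetical snapshots of domains linking to your site.
last_month = {"example.com", "news.site", "blog.dev", "trusted.org"}
this_month = {"example.com", "blog.dev", "fresh.io"}

lost = last_month - this_month        # domains that no longer link to you
discovered = this_month - last_month  # new referring domains

print("lost:", sorted(lost))
print("discovered:", sorted(discovered))
```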

Poor user experience

User experience is one more thing crucial to your site’s rankings. If a search engine started ranking your page high in search results and then noticed it didn’t meet users’ expectations, your rankings could suffer a lot.

Search engines usually rely on metrics such as the click-through rate, time spent on your page, bounce rate, the number of visits, and more. That’s why you should remember the following rules when optimizing your site:

1. Provide relevant metadata

As metadata is used to form snippets, it should contain relevant descriptions of your content. If your snippets aren’t engaging enough, users won’t click through and land on your site. On the other hand, if your snippets make false promises, your bounce rate will increase.

2. Create an effective content structure

It should be easy for users to extract the necessary information. Most of your visitors pay attention to your content structure when deciding whether they’ll read the post.

Break your texts into paragraphs and state the main ideas in subheadings. This will help you engage visitors looking for answers to their specific questions.

3. Avoid complicated design and pop-ups

Content isn’t the only thing your audience looks at. People may also decide to leave your website because of irritating colors, fonts, or pop-up ads. Keep the design simple and minimize the number of annoying windows.

Competition from other websites

What if none of the steps worked? It might mean that your rankings dropped because your competitors were performing better. Monitor changes in their positions and identify the SERP leaders.

You can analyze your competitors’ strategies with Serpstat or Moz. With these tools, you can discover their backlink sources, keywords they rank for, top content, and more. This step will help you come up with ideas of how you could improve your own strategy.

Never stop tracking

You can’t predict whether your rankings will drop one day. It’s much better to notice the problem before you’ve already lost traffic and conversions. So, always keep tracking your positions and be ready to react to any changes quickly.

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.

The post Seven reasons why your rankings dropped and how to fix them appeared first on Search Engine Watch.

Amazon Listing Optimization – How to Increase Sales on Amazon

If you want to understand the principles of Amazon listing optimization and how to increase sales on Amazon, this article will give you the strategies and best practices to help your listings succeed. From a high-level view of strategy down to the application of specific elements, you will come away with a breadth of knowledge that is sure to take your Amazon sales to the next level.

The importance of typography

For a blog, it’s of great importance that people can read the text of your posts properly. Reading from screens is hard, so make sure you don’t make it any harder than it already is. In this post, we’ll give tips on how to improve the typography of your blog.

Typography and readability

The readability of a particular text depends both on its content (for example, the complexity of its vocabulary) and its typography. In my previous post about readability, I gave tips to make sure the complexity of your text is adapted to your audience.

Typography is the science of arranging letters in order to make written text readable and appealing. Before digitization kicked in, typography was a specialized occupation; nowadays, it’s something everybody has to deal with, at least everybody who owns or maintains a website. Typography involves selecting a typeface (font family), font size, line length, line spacing and letter spacing.


Font size

Make sure you use at least 14 pixels for your font size. That size is a good read on both the larger desktop screens and our mobile screens. The preferred font size for a website nowadays is much larger than it was ten years ago. Back then, a font of 10 pixels allowed you to add more text to a page and made your page look more like a book. With the growth of computer screens, nowadays 16 pixels is very normal.

Font color

What font color to use depends largely on the type of blog you have and the design your website uses. In general, we say that a black font on a white background is still the best read: outlines are sharper, and letters are easier to distinguish and identify.

The one thing you should do regarding font and background color is test the contrast between them. There are a number of tools available, each with its own kind of contrast checker. A really easy and good one is Colorable, which lets you enter the foreground (text) color and the background color. It will tell you immediately whether the contrast passes and what score the combination of colors gets. Colorable is based on the WCAG accessibility guidelines.
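For reference, the score such checkers compute follows the WCAG 2.x contrast formula: the relative luminance of each color, then (lighter + 0.05) / (darker + 0.05). A minimal Python sketch:

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) tuple of 0-255 sRGB values."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB gamma curve.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background scores the maximum.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

WCAG AA asks for a ratio of at least 4.5:1 for normal body text, which is the threshold tools like Colorable check against.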


Whitespace

Next to font size, you also want to make sure that the text has sufficient room to breathe. If you use a larger font size but forget to add whitespace around headings and paragraphs, your text will still be unreadable. Whitespace is especially important on mobile devices.

Besides adding enough whitespace between headings and paragraphs, you should also add enough whitespace between the lines. If you neglect to add a proper line height, your well-constructed sentences become one big block of letters. This is far from user-friendly and will make your page very unappealing for a visitor.

using whitespace to improve readability

Typography of links

The design of the links within your texts is important too. Of course, backlinks are important for SEO, but the design of links is important for usability. Make it absolutely clear what is a link and what isn’t. You can do so by picking a different color or by adding an underline.

To emphasize the link you can easily combine the two options above. Also, make sure you change the style when hovering your mouse cursor over the link. Remove the underline or change the color.

Line length

A final thing to consider when it comes to typography is the length of your text lines. In Readability: the Optimal Line Length, Christian Holt cites suggestions that lines should be between 50 and 75 characters long. Ilene Strizver states that non-justified text should have 9 to 12 words per line, and justified text 12 to 15 words. From our experience, 10 to 15 words is indeed a good read.


Making sure your text is nicely written and not too difficult is only one aspect of readability. For a text to be read properly, its typography should be OK too. Use a decent font size, think about the contrast of the colors you use, and add whitespace. Focusing on both aspects of readability (typography and difficulty) will ensure that people start and keep reading your blog posts.

Read on: 5 tips to improve the readability of your post »

The post The importance of typography appeared first on Yoast.

Scraping the SERPs to Determine Timing of Journalist’s Topic Coverage

You researched the right websites and the right contacts, wrote an exciting email, and now you’re waiting for a story to be picked up. This is the moment where a lot of us get anxious, because the outcome is out of our control. Some journalists will get in touch right away, some will open your emails over and over and do nothing, and some will publish a story days or weeks later. If the story isn’t picked up quickly, we start questioning whether the campaign will fail.

Sometimes you can have a good story, but the timing is wrong. Have you ever wondered when journalists are likely to publish about a certain topic? Well, now you can have that answer.

Using a crawler and some easy XPath rules, you can scrape Google News and find out the specific dates those topics have hit the news in the past. When does the Christmas season start in the press? When is the new GoT season likely to become trendy? Keep reading to find out!

Guide and template for SERP scraping

We’ll show you how to run this using Screaming Frog, but I imagine other crawlers could do the same. The first step is to learn some XPath rules, which I learned from this guide published by BuiltVisible.

The goal is to scrape the top 100 (or more if you like) results for a certain topic and extract the date when these articles were published. This will give you a view for recurrent events (such as holidays) or even general behaviour towards a topic.

I very much recommend learning those rules because there’s much more you can do with XPath, but for the sake of this exercise, you just need to configure Screaming Frog with a few settings.

Spider Configuration:

  • Untick all boxes in the Basic Configuration tab
  • Set Rendering to JavaScript (Chrome)
  • Set Mode to List

Configuration > Custom Extraction:

  • XPath
  • Add //h3, //div/span and choose “Extract Text”
  • Add //h3/a and choose “Extract Inner HTML”

If you want to save the trouble on all of the above, here are the configured files for Google universal and Google News.
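If you prefer to script the extraction yourself, the //h3/a rule can be approximated with Python’s standard library alone. The sample markup below is simplified and hypothetical; real SERP HTML is far more complex and changes often, so treat this as a sketch of the technique rather than a working scraper.

```python
from html.parser import HTMLParser

class H3LinkExtractor(HTMLParser):
    """Collect the text of every <a> nested inside an <h3>."""
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.in_h3_link = False
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True
        elif tag == "a" and self.in_h3:
            self.in_h3_link = True
            self.links.append("")  # start a new result title

    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False
        elif tag == "a":
            self.in_h3_link = False

    def handle_data(self, data):
        if self.in_h3_link:
            self.links[-1] += data

# Simplified, made-up result markup.
html = '<div><h3><a href="/url?q=https://a.ie/story">St Paddys parade guide</a></h3></div>'
parser = H3LinkExtractor()
parser.feed(html)
print(parser.links)
```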

Now, we need to find the exact URL we want to scrape. This is where we narrow down our target. Since I work for Wolfgang Digital and we’re an Irish agency, we picked St Paddy’s as a test.

The event happens every year in March, so I decided to filter for anything available on Google News published between 1st February and 31st March. You can filter the content directly on Google.

Usually shorter periods work better: you’re looking at the 100 most important pieces of coverage on that particular topic, and a larger date range will simply leave out many articles.

Once you have the URL, just add &num=100 at the end, so you can see the top 100 results in one page.
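The finished URL can also be assembled in code. This is a hedged sketch using Python’s urllib; the q, tbm (news vertical) and tbs (custom date range) parameter values shown here are illustrative, and Google may change how these parameters behave at any time.

```python
from urllib.parse import urlencode

params = {
    "q": "st patricks day",
    "tbm": "nws",                                      # Google News vertical
    "tbs": "cdr:1,cd_min:2/1/2019,cd_max:3/31/2019",   # custom date range filter
    "num": 100,                                        # 100 results on one page
}
url = "https://www.google.com/search?" + urlencode(params)
print(url)
```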

All configured? Then hit Start on Screaming Frog and go to the Custom tab (the very last one), where you’ll find the extracted data you just requested. The view is not great as all the data is in a horizontal line, so the best thing to do is to export to a csv file.

Making your data pretty

Once exported, organise your data for a better view. This blog post is focused on the publication date, but since you have the page title and URL, feel free to deep dive into what has been published on that topic. Have your target publications covered something along the lines of your campaign? Is there a pattern?

To clean up your results, there are a few handy Excel spreadsheet rules to use. This is the way I do it, but if you’re familiar with Excel or G-Sheets, feel free to ignore the below and do it your own way!

Date tab – Use “Text to Columns” and a “-” as a separator. This will split the publication name and date in two columns.

URL tab – select the whole column and replace “<a href="/url?q=” with nothing. Then use “Text to Columns” with “&” as a separator.
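If you’d rather script the cleanup than do it in Excel, both steps can be sketched in a few lines of Python. The sample cell values below only mimic the export format described above; your real export will vary.

```python
def split_source_and_date(cell):
    """'Irish Times - 15 Mar 2019' -> ('Irish Times', '15 Mar 2019')."""
    source, _, date = cell.partition(" - ")
    return source.strip(), date.strip()

def clean_url(cell):
    """Strip the '<a href="/url?q=' prefix and everything from the first '&'."""
    cell = cell.replace('<a href="/url?q=', "")
    return cell.split("&", 1)[0]

print(split_source_and_date("Irish Times - 15 Mar 2019"))
print(clean_url('<a href="/url?q=https://a.ie/story&sa=U&ved=xyz'))
```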

This will give you a nice visualisation like the below:

Almost there! Your final step is to make your data look good. If you have a huge list, let’s say 100 results, the raw list will still make it difficult to find the specific dates when articles were published.

Simply copy your dates into a separate column, remove duplicates, and use a =COUNTIF formula to count the number of publications per day. Then put the counts on a chart to make visualisation much easier!
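The =COUNTIF step has a one-line Python equivalent using collections.Counter; the dates below are invented for illustration:

```python
from collections import Counter

# Hypothetical publication dates pulled from the cleaned export.
dates = ["14 Feb", "13 Feb", "13 Feb", "06 Feb", "13 Feb", "14 Feb"]
per_day = Counter(dates)

# Most-covered days first, like a sorted COUNTIF column.
for day, n in per_day.most_common():
    print(day, n)
```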

More examples

We took the first 100 results on Google News for a few topics and did the exercise above. Note that some results will be different depending on the time frame and location, so these may vary.

Valentine’s day

Location: Google News Ireland

Time frame: Jan 1 – Feb 18 2019

What we learned: the first week of February seems to be a good time to reach out. Stories started to pick up on the 6th and peaked the day before Valentine’s Day (14th). Reaching out on the day of the event leaves you a smaller chance of getting coverage.

Game of Thrones

Location: Google News Ireland

Time frame: 3rd to 17th July (2 weeks) and 18th July to 1st August (2 weeks)

What we learned: Two weeks before Game of Thrones’ 7th season was released, the press was already quite interested, so your campaigns can go out quite early. After that, you will still find days with high coverage, but the numbers can vary a lot.


Scraping the SERPs can give you good insights into a topic and what type of content is being published, plus a bit more confidence when reaching out. It’s always important to listen to your gut, but if you’re not familiar with a given topic, this can help you time your outreach perfectly.

Crafting strong linkable assets takes a lot of time and effort, so make sure you don’t reach out too early or too late and miss a huge opportunity!

The post Scraping the SERPs to Determine Timing of Journalist’s Topic Coverage appeared first on BuzzStream.