Technical SEO Checklist – The Roadmap to a Complete Technical SEO Audit

While only some of us work on technical SEO rigorously, it touches everyone who runs a website. After all, if you look at it closely, which part of SEO isn’t technical?

 

SEO issues, mistakes, tips and recommendations are all included in today’s technical checklist. We wanted to cover, as effectively as possible, all the elements that make your website user-friendly, efficient, visible in the SERPs, functional and easy to understand. So gather all the information you have on your site and let’s get to work.

 

Technical SEO – The Complete List

 

I. Website Loading Speed Time

  1. Improve Server Response Time
  2. Optimize & Reduce Image Size Without Affecting the Visual Appearance
  3. Minimize the Render-Blocking Javascript and CSS
  4. Limit the Number of Resources & HTTP Requests
  5. Set a Browser Cache Policy
  6. Reduce the Number of Redirects & Eliminate Redirect Loop
  7. Avoid Loading Your Site With Too Much Stuff

 

II. Website Functionality & Usability

  1. Make Sure Your Website Is Mobile Friendly
  2. Build Search Engine Friendly URLs
  3. Use the Secure Protocol – HTTPs
  4. Set Preferred Version
  5. Set up Correctly the 301 Redirects After Site Migration
  6. Make Sure Your Resources Are Crawlable
  7. Test Your Robots.Txt File to Show Google the Right Content
  8. Verify the Indexed Content
  9. Review Your Sitemap to Avoid Being Outdated
  10. Review Blocked Resources (Hashbang URLs) with Fetch as Google
  11. Optimize Your Crawl Budget
  12. Avoid Meta Refresh for Moving a Site
  13. Use Redirect for Flash Site to the HTML Version
  14. Use Hreflang for Language and Regional URLs
  15. Make Sure Your Tracking Is Working Properly

 

III. Content Optimization

  1. Redirect/Replace Broken Links & Resources
  2. Audit Internal Links to Improve Your Chances to Rank Higher
  3. Get Rid of Duplicate Content
  4. Use Structured Data to Highlight Your Content
  5. Keep a Reasonable Number of Links On-Page
  6. Avoid Canonicalizing Blog Pages to the Root of the Blog

 

IV. User-Friendlier Website

  1. Set up Your AMP the Right Way – Mobile Friendlier
  2. Add Breadcrumbs for a Better Navigation
  3. Test On as Many Platforms and Devices as Possible
 

I. Website Loading Speed Time

 

On the web, time is of the essence. Websites around the world load rather slowly, taking an average of 19 seconds on a 3G mobile connection. Testing has confirmed that around 50% of users abandon a website if it takes more than 3 seconds to load.

 

If your website loads slowly, you can lose a lot of visitors.

 

Disclaimer & Warning: Playing with PHP, servers, databases, compression, minification and other similar things can really mess up your website if you don’t know what you’re doing. Make sure you have a proper backup of the files and the database before you start playing with these options.

 

When we talk about speed, there are a few things we need to consider for making your site efficient and easy to access for your users. A faster loading speed time means higher conversion and lower bounce rates. For that, we’ve selected some mandatory speed optimization suggestions. Using Google’s Speed Test, you can perform easy and short analyses of your website’s loading speed time.


The tool has improved over the years and now you can see helpful charts for large websites to understand how each website is performing. One example is the Page Load Distributions.

 

Page load distribution

 

The Page Load Distribution uses two user-centric performance metrics: First Contentful Paint (FCP) and DOMContentLoaded (DCL). First Contentful Paint marks the moment when the browser renders the first bit of content on the screen. DOMContentLoaded marks the moment when the HTML document has been loaded and parsed and is no longer blocked by stylesheets or JavaScript execution. Together, these two metrics show which share of page loads is fast and which needs improvement, by looking at the pages with average and slow speeds on the chart.

 

Another example is the speed and optimization indicators, which show where each website stands. In the picture shown below, we can see the FCP and DCL scores. These two metrics use data from the Chrome User Experience Report. It indicates that the page’s median FCP (1.8s) and DCL (1.6s) place it in the middle third of all pages. The page also has a low level of optimization because most of its resources are render-blocking.

 

Speed and optimization

 

1. Improve Server Response Time

 

Server response time refers to the time it takes for your server to return the HTML needed to begin rendering the page. Basically, when you access a page, the browser sends a request to the server, and the time it takes to get that information back is the server response time.

 

There are lots of reasons why a website has a slow response time. Google names just some of them:

There are dozens of potential factors which may slow down the response of your server: slow application logic, slow database queries, slow routing, frameworks, libraries, resource CPU starvation, or memory starvation.
Google Developers
 

The server response time also determines how quickly Googlebot can access your data. Whether it takes 1, 2, 3 seconds or more can decide whether your visitor converts or leaves. Google says you should keep the server response time under 200ms.

 

There are 3 steps you need to follow to test and improve the server response time:

  1. Firstly, you need to collect the data and inspect why the server response time is high.
  2. Secondly, measure your server response time to identify and fix any future performance bottlenecks.
  3. Lastly, monitor any regression.
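
If you want a quick way to perform the measurement in step 2 without extra tooling, you can check the time to first byte (TTFB) from the command line. This is a minimal sketch assuming curl is installed; the URL is a placeholder:

# Print DNS lookup, connection and time-to-first-byte (server response) timings
curl -o /dev/null -s -w "dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s\n" https://www.example.com/

Run it a few times at different hours of the day; a single measurement can be misleading.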

 

Many times, the reason why a website loads slowly is the server itself. It’s very important to choose a high-quality server from the beginning. Moving a site from one server to another might sound easy in theory, but it can be accompanied by a series of problems such as file size limits, wrong PHP versions and so on.

 

Choosing the right server can be difficult because of pricing. If you’re a multinational corporation, you probably need dedicated servers, which are expensive. If you’re just starting out with a blog, shared hosting services will probably be enough, which are usually cheap.

 

However, there are good shared hosting servers and bad dedicated ones and vice versa. Just don’t go after the cheapest or the most renowned. For example, Hostgator has excellent shared hosting services for the US, but not so excellent VPS ones.

 

2. Optimize & Reduce Image Size Without Affecting the Visual Appearance

 

If a website loads really slowly, one of the first things that come to mind is images. Why? Because they’re big. And we’re not talking about size on screen but size on disk.

 

Besides all the information an image carries, it also adds lots of bytes to a page, making the browser take more time than it should to download everything. If we optimize the images instead, the page will load faster because we’ve removed the extra bytes and irrelevant data. The fewer bytes the browser has to download, the faster it can fetch and render content on the screen.

 

Since GIF, PNG, and JPEG are the most common image formats, there are lots of solutions for compressing images.

 

image-compressor

Source: www.cssscript.com

Here are a few tips and recommendations to optimize your images:

  • Use PageSpeed Insights;
  • Compress images automatically in bulk with dedicated tools (tinypng.com, compressor.io, optimizilla.com) and plugins (WP Smush, CW Image Optimizer, SEO Friendly Images) and so on;
  • Use GIF and PNG formats where possible because they are lossless; PNG is usually the preferred format, as it achieves the best compression ratio with better visual quality;
  • Convert GIF to PNG if the image is not an animation;
  • Remove transparency if all of the pixels are opaque for GIF and PNG;
  • Reduce quality to 85% for JPEG formats; that way you reduce the file size and don’t visually affect the quality;
  • Use progressive format for images over 10k bytes;
  • Prefer vector formats because they are resolution and scale independent;
  • Remove unnecessary image metadata (camera information and settings);
  • Use the option to “Save for Web” from dedicated editing programs.
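
If you prefer to compress images outside of plugins, command-line tools can batch-process a whole folder. Here is a minimal sketch assuming ImageMagick is installed; the folder and quality value are just examples following the 85% and progressive-format tips above:

# Re-encode every JPEG in the current folder as progressive, at ~85% quality, and strip metadata (EXIF, camera info)
mogrify -strip -interlace Plane -quality 85 *.jpg

Keep a backup of the originals, since mogrify overwrites the files in place.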

 

Compress images

Source: www.quora.com

 

If you’re using WordPress, you can choose a simple solution, such as the Smush Image Compression Plugin.

 

Update: As of 2019, Google PageSpeed Insights recommends serving images in next-gen formats such as JPEG 2000 or WebP. However, not all browsers and devices display these formats well yet, so regular image compression is still recommended, despite Google’s efforts to push the change.

 

You can see which images are the biggest on your website with the Site Audit by CognitiveSEO. Simply head to the Images section, under Content. There you can see a list of images over 500kb (keep in mind that for a photographer’s website these images might be relatively small; in that case, it’s a good idea to offer the full HD version under a separate download link).

 

The only real issue with PageSpeed Insights is that you can only check one page at a time.

 

We, here at CognitiveSEO, know that many of you want to check the PageSpeed Insights in bulk. So that’s why we’ve developed our tool to be able to bulk check the Page Speed Insights scores on multiple pages at the same time:

 

Check PageSpeed Insights in Bulk

 

However, note that if you have a very big website, this process might take a long time. It’s better to opt out of the PageSpeed check for the first analysis (so that you get all the other data and can start fixing issues) and run the PageSpeed process later. It can take up to 10 seconds per page, so if you have 60,000 pages it can take a week.

 

 

3. Minimize the Render-Blocking Javascript and CSS & Structure HTML Accordingly

 

When you perform a speed test with Google’s PageSpeed Insights, you will see this message: Eliminate render-blocking JavaScript and CSS in above-the-fold content in case you have some blocked resources that cause a delay in rendering your page. Besides pointing out the resources, the tool also offers some great technical SEO tips regarding:

  • Removing render-blocking JavaScript;
  • Optimizing CSS delivery.

 

You can remove render-blocking JavaScript by following Google’s guidelines and avoid or minimize the use of blocking JavaScript using three methods: 

  • Inline JavaScript;
  • Make JavaScript Asynchronous;
  • Defer loading of JavaScript.
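
As a quick illustration of the last two methods, here is a minimal sketch of how a script tag can be made asynchronous or deferred; the file names are placeholders:

<!-- Downloads in parallel and executes as soon as it is ready (order not guaranteed) -->
<script src="analytics-widget.js" async></script>
<!-- Downloads in parallel but executes only after the HTML document has been parsed -->
<script src="slider.js" defer></script>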

 

If Google detects a page which delays the time to first render because it contains blocking external stylesheets, then you should optimize CSS delivery. In this case, you have two options:

  • For small external CSS resources, it is recommended to inline a small CSS file and help the browser to render the page;
  • For large CSS files, you have to use Prioritize Visible Content to reduce the size of the above-the-fold content, inline CSS necessary for rendering it and then defer loading the remaining style.
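
For the second option, a common pattern is to inline the critical above-the-fold CSS and load the full stylesheet without blocking the first render. A minimal sketch (class names and file names are placeholders, not a definitive implementation):

<style>
  /* Critical above-the-fold rules inlined in the <head> */
  .hero { font-family: sans-serif; margin: 0 auto; max-width: 960px; }
</style>
<!-- Load the remaining styles without blocking rendering -->
<link rel="preload" href="main.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="main.css"></noscript>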

 

PageSpeed also shows which files need to be optimized through minification. When we talk about resources, we mean HTML, CSS, and JavaScript files. Basically, the tool will list the HTML, CSS, and JavaScript resources that need minifying, depending on the situation. Below you can see such an example:

 

Minify JavaScript Resources

 

For each kind of resource, you get individual recommendations.

 

Below you can see an example of how to minify your CSS:

css minifier

Source: www.keycdn.com

 

There are 3 processes that need to be followed when minifying, explained by Ilya Grigorik, web performance engineer at Google:

  1. Optimize the resources. Depending on what sort of information you want to provide on your site, make an inventory of your files and keep only what is relevant, so you don’t carry irrelevant data. Once you decide which information is relevant, you’ll be able to see what kind of content-specific optimizations you have to do.

Let’s take, for example, a photography website that needs pictures with a lot of information, such as camera settings, camera type, date, location and author. That information is crucial for that particular website, while for another website it might be irrelevant.

  2. Compress the data. After you eliminate the unnecessary resources, you need to compress the ones the browser still has to download. The process consists in reducing the size of the data to help the website load the content faster.

  3. Apply Gzip compression, which is best used for text-based data. In this process, web pages and style sheets are compressed before being sent to the browser. It works wonders for CSS and HTML files because these types of resources contain a lot of repeated text and white space. The nice part of Gzip is that it temporarily replaces similar strings within a text file to make the overall file size smaller.
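
If your site runs on an Apache server, Gzip can typically be enabled with mod_deflate. This is a minimal sketch assuming mod_deflate is available (check with your host before relying on it):

# Compress text-based responses before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript application/json
</IfModule>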

 

 

For WordPress users there are simpler solutions:

  1. Autoptimize plugin to fix render-blocking scripts and CSS. You need to install the plugin and afterwards you can find it under Settings » Autoptimize to configure the settings. All you have to do is check the boxes for JavaScript and CSS, in our case, and click on Save Changes.

 

Autoptimize CSS and JavaScript

Source: www.webid-online.com

 

  2. W3 Total Cache to fix render-blocking JavaScript. This is another tool provided for WordPress users and it requires a little more work. After you install it, go to Performance » General Settings and look for the Minify section.

 

Minify scripts with W3 Total Cache

Source: www.factoriadigital.com

 

Check the Enable box in the Minify section and select Manual mode. In the end, click on Save all settings and add the scripts and CSS files that you want to minify. After that, you’re set.

 

However, don’t get tricked by Google. The truth is that PageSpeed Insights is just a guideline. For example, PageSpeed Insights flags Analytics and Tag Manager as JS that blocks the loading of important content, yet Google asks you to put those scripts in the <head> section.

 

You can follow this guide to better set up the W3 Total Cache Plugin.

 

Never remove something that is essential for tracking or for your site’s functionality just to get 100% score on PageSpeed Insights or GT Metrix.

 

4. Limit the Number of Resources & HTTP Requests

 

One of the first actions that come to mind when we talk about website speed is reducing the number of resources. When a user enters your website, a call is made to the server to access the requested files. The larger those files are, the longer it will take to respond to the requested action.

 

Rapid, multiple requests always slow down a server. It’s a combination of factors, but you can compare it to copying one large file on a hard disk versus copying a very large number of small files. The small files usually take longer to copy because the disk head has to keep moving. With SSD technology there is no moving head, but there’s still more overhead in copying many small files than in copying one larger file.

 

To check your HTTP requests, open an Incognito tab in Chrome (to make sure no requests are served from the cache), right click and hit Inspect (at the bottom). Then go to the Network tab and hit F5 to refresh the page. This will start recording the requests and, at the end, you’ll see their total number.

 

Check HTTP Requests Chrome

 

There’s no universal number, but as a rule of thumb you should try to keep this count under 100. It really depends on the page: if it’s a huge page, it can have more requests, but then again, it might be a good idea to paginate it.

 

The best thing you can do is delete unnecessary resources (like sliders) and then minimize the overall download size by compressing the remaining resources.

 

Another thing you can do is combine the CSS and JS files in a single file so that 1 single request is being made. Plugins such as Autoptimize and W3 Total Cache (both mentioned above) can do this. Through the combine option, the plugin basically takes all the CSS and JS files and merges them into a single file.

 

This way, the browser will only have to make one request to the server for all those files instead of one request for each file.

 

However, be careful! This option can usually break an entire site or make it display really messed up, so make sure you have a proper backup of the files and database before you start making any changes.

 

5. Set a Browser Cache Policy

 

The browser cache automatically saves resources on the visitor’s computer the first time they visit a new website. When they return to that page, those saved resources help deliver the content faster. This way, the page load speed is improved for returning visitors.

 

For visitors who want to return to a page, or to visit a page that can’t be accessed at a given moment, there’s also the option to view the cached version directly from the SERP.

 

Cached website in SERP

 

The best way to significantly improve the page speed load is to leverage the browser cache and set it according to your needs.
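
If you aren’t using a caching plugin, you can set expiry headers at server level instead. This is a minimal sketch assuming an Apache server with mod_expires enabled; the lifetimes are only examples and should be adapted to how often each type of resource changes:

# Tell returning browsers how long they may reuse cached copies of static files
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>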

 

Most minification, compression and combination plugins are actually cache plugins, so they all have this function. You can use W3 Total Cache or whichever caching plugin suits you best. A combination of W3 Total Cache’s caching and Autoptimize’s compression and combining works well.

 

Using a cache will also make changes harder to spot. If you make a change to your website, open an Incognito tab to see it, and go to the plugin settings from time to time to reset the cache.


 

6. Reduce the Number of Redirects & Eliminate Redirect Loop 

 

Redirects can save you from a lot of trouble regarding link equity (link juice) and broken pages, but they can also cause lots of problems if you have tons of them. A large number of redirects will make your website load slower: the more redirects, the more time a user must wait to reach the landing page.

Plain and simple, WordPress redirects slow down your site. That’s why it’s worth taking the time to minimize the number of redirects visitors to your site experience. There are times that it’s appropriate to intentionally create and use redirection, but limit the use of redirection to necessary instances and make sure your visitors have the fastest experience possible when browsing your WordPress website.
Jon Penland
Support Engineer at Kinsta, @jonrichpen

 

One other thing worth mentioning is that you should have only one redirect per page, otherwise you risk creating a redirect chain or even a redirect loop. A redirect loop is a chain of redirects that circles back on itself, so the browser never knows which page to show and ends up giving a pretty nasty error.

 

Redirect loop

Source: www.matrudev.com

 

If you have 404 pages, there are lots of ways to customize them and guide users so you don’t lose them. Design a friendly page and send the user back to your homepage or to another relevant, related piece of content.

 

To find the broken pages on your website, you can use Google Search Console: go to Crawl » Crawl Errors, then click on Not found (if any).

 

Crawl errors – Not found

 

Site Explorer offers a similar feature, pointing out the link juice you are losing (the number of referring domains and links for each broken page).

 

Broken pages

 

You can also use the new Technical SEO Site Audit Tool to analyze all your site’s redirects. After you set up the campaign and the tool finishes crawling and analyzing your site, simply head to Architecture > Redirects.

 

Fix 301 Redirects

 

7. Avoid Loading Your Site With Too Much Stuff

 

Over time, sites tend to get clogged up with useless images, plugins and functions that are never used. Why?

 

If you use WordPress, for example, you might test a lot of plugins and install them on your website, only to find out that you don’t really need them. Sure, you can disable them and eventually uninstall them but the problem with WordPress uninstalls is that they’re often dirty, leaving traces in your Database which can make it a little slower.

 

websites with bad UX and very many ads usually load slow

Try not to let your site end up looking like this; it’s probably not the best UX.

 

Another very common type of plugin that webmasters use is the slider. Sliders used to be popular, but recent testing has shown over and over again that they kill conversions.

 

Not only that, but Sliders also usually load your site with a lot of things you don’t need. The first one is usually the Javascript file which tends to load on all pages (either in the footer or the head section of your HTML). However, the slider is most probably used only on the homepage.

 

Also, if you have 6 slides on your homepage, with 6 big pretty images, your site can be 2 or 3 times slower because of the size in bytes of the images. Unfortunately, nobody is probably going to look past the second image, if it auto-slides, of course.

 

A good workaround is having some sort of development environment where you can test 5-10 plugins until you find exactly what you need. Then, make an implementation plan so that you install only the essentials on the live version.

 

After that, you can reset the development environment by deleting it and copying the updated live version over it. This way, the live version will not get clogged either, and the development version will stay close to the live one.


 

II. Website Functionality & Usability

 

After you make sure your website loads fast for your users, it’s time to see what you can do to improve your visibility in the search engines. Many aspects go into this, but the following are a mixture of the most important ones and the most common mistakes webmasters make.

 

8. Make Sure Your Site Is Mobile Friendly

 

There’s not much to say here. Since more than 50% of users worldwide browse the internet on their mobile devices, Google has prioritized mobile indexing. You should make sure that your website is optimized for mobile devices.

 

This usually refers to design, but also to speed and functionality. Generally, a responsive design is preferred over a fully separate mobile version, as an m.site.com subdomain requires extra steps to be implemented correctly using the rel=alternate tag.
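
If you do run a separate m. subdomain, the usual annotation pattern looks roughly like the sketch below; the example.com URLs and the 640px breakpoint are placeholders:

<!-- On the desktop page: point to the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page/">
<!-- On the mobile page: point back to the desktop version -->
<link rel="canonical" href="https://www.example.com/page/">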

 

You can ensure that your site is mobile friendly by testing it on Google’s Mobile Friendly Test Page.

 

Page is mobile-friendly

 

 

9. Build Search Engine Friendly URLs

 

URLs are very important, and it’s best not to change them, which means you have to get them right the first time. Descriptive, keyword-rich URLs are useful for both users and search engines.

 

However, many people forget this and build websites with dynamic URLs that aren’t optimized at all. It’s not that Google doesn’t accept them; they can rank. But eventually you’ll reach the point where you have to migrate to new URLs to improve performance, UX and search engine visibility, and that’s going to be a struggle.

 

Changing page URLs frequently results in issues with search engines. It’s always better to get them right the first time.

 

We’ve covered this topic several times before because it is important to have easy-to-follow URLs. Avoid query parameters in your URLs: you can’t keep track of such URLs properly in Analytics, Search Console and so on, and link building becomes more difficult. You might even lose linking opportunities because of the way your URLs look.

 

URL-structure-query-parameter

Source blogspot.com

 

If you’re a WordPress user, you have the option to personalize and set up your permalink structure. If you take a look at the next picture, you can see the options you have for your URL structure.

 

wordpress-permalink-settings

 

Building friendly URLs is not hard; you can follow these 3 tips:

  • use dashes (-) instead of underscores (_);
  • make them shorter;
  • use the keyword (focus keyword).

 

By building easy-to-read, keyword-focused URLs you are thinking about your users and therefore focusing on user experience. David Farkas has the same view on the matter:

If you focus on user experience, you’ll be building sustainable links – and building trust with users. To build a truly great link, you have to look at every aspect of the link from the user’s perspective.
David Farkas
Founder & CEO, TheUpperRanks

 

You can always check your ‘unfriendly’ URLs using the CognitiveSEO Site Audit. After you set up your campaign, you just have to go to Architecture > URLs.

 

user friendly URLs for SEO

 

Then you’ll be able to see a list of your URLs that don’t contain any keywords. You can also identify other issues using this feature. For example, in the following screenshot, although the URLs are blurred to protect the client’s identity, we’ve identified a hreflang problem: the titles and content for some secondary languages were generated in the main language when proper content in the secondary language was not provided.

 

Descriptive URLs for SEO

 

This means that the URLs were actually OK, just not descriptive due to the content being generated in the wrong language.

 

10. Use the Secure Protocol – HTTPS

 

On August 6, 2014, Google announced that the HTTPS protocol was added to its list of ranking factors and recommended that all sites move from HTTP to HTTPS.

 

HTTPS (Hypertext Transfer Protocol Secure) encrypts the data and doesn’t allow it to be modified or corrupted during transfer, protecting it against man-in-the-middle attacks. Besides improved data security, it has other benefits, such as:

 

  • It helps your website get a boost in rankings, since it is a ranking factor.
  • It preserves referrer details that would otherwise show up under the “Direct” traffic source in Google Analytics.
  • It assures users that the website is safe to use and that the data they provide is encrypted to prevent hacking or data leaks.

 

If you use HTTPS, you will see a lock before the URL in the navigation bar:

 

https protocol

 

In case your website doesn’t use the HTTPS protocol, you’ll see an information icon instead; if you click on it, a message will alert you that the connection is not safe and therefore the website is not secure.

 

http protocol

 

While it is best to move from HTTP to HTTPS, it is crucial to plan how to recover all your data after moving your website. For instance, lots of users complained they lost all of their social shares after migrating, and the same thing happened to us.

 

After we experienced the same issue, we’ve created a guideline on how to recover Facebook (and Google+) shares after an https migration that you could easily follow:

  • Find out how many Facebook shares you have at a URL;
  • Set both your HTTP and HTTPs social shares to zero;
  • Update rel=”canonical”;
  • Identify Facebook’s Crawler.

 

Again, this issue is related to URLs so, every time you need to do mass redirects, issues can occur. It’s always a good idea to have your URLs well set up from the beginning. However, if you really need to migrate the site from HTTP to HTTPS, you can check out this HTTP to HTTPS migration guide.

 

11. Set Your Preferred Version

 

You also want to make sure that all your other versions are pointing to the correct, preferred version of your site. If people access one version they should automatically be redirected to the correct version.

 

These are all the versions:

  • http://site.com
  • https://site.com
  • http://www.site.com
  • https://www.site.com

 

So, if your preferred version is https://www.site.com, all other versions should 301 directly to that version. You can also test if this is alright in the SEO Audit Tool. Simply head to Indexability > Preferred Domain. Look for the Everything is OK message. If you can’t find it, then guess what: not everything is OK.
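
As an illustration, here is roughly what such a redirect can look like on an Apache server with mod_rewrite enabled; treat the domain as a placeholder and test carefully before deploying:

# Force the https://www. version for every request (301 = permanent redirect)
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]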

 

Migrate HTTP to HTTPS

 

12. Set up Correctly the 301 Redirects After Site Migration

 

Site migration is a recommended operation when the website changes completely and the same domain won’t be used anymore. Setting up 301 redirects also applies when you switch from HTTP to HTTPS and want to preserve the link equity.

 

In case of a site migration, it is crucial to set up the redirects correctly. To avoid losing lots of links and ending up with broken pages on your site, it is best to follow a correct 301 redirection procedure. For that, take into consideration the following recommendations (we’ve already covered some of them in the previous steps):

  • Set up the 301 redirect code from the old URLs to the new URLs;
  • Avoid redirection loops;
  • Remove invalid characters in URLs;
  • Verify the preferred version of your new domain (www vs. non-www);
  • Submit a change of address in Search Console;

 

Google-Search-Console-Change-address

 

  • Submit the new sitemap in Google; 
  • Check for broken links and resources.
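
For reference, individual old-to-new URL redirects on an Apache server can be written as in the sketch below; the paths and domain are placeholders, and most CMSs and server stacks have their own equivalent:

# Permanently redirect old URLs to their new equivalents
Redirect 301 /old-page/ https://www.example.com/new-page/
Redirect 301 /old-category/old-post/ https://www.example.com/new-category/new-post/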

 

 

13. Make Sure Your Resources Are Crawlable

 

Having non-crawlable resources is a critical technical SEO issue. Crawling is the first step, right before indexing, which is what eventually puts your content in front of users. Basically, Googlebot crawls the data and sends it to the indexer, which renders the page; after that, if you’re lucky, you’ll see that page ranking in the SERPs.

 

how search works

www.slideshare.net/ryanspoon

 

It is very important that the users see the same content that the Googlebot does.

 

If your CSS files are blocked from crawling, Google won’t be able to see the pages the way a user does. The same applies to JavaScript if it isn’t crawlable. With JavaScript it is a little more complicated, especially if your site is heavily built on AJAX: you may need server-side code that sends an accurate version of the site to Google.

 

If you’re not blocking Googlebot from crawling your JavaScript or CSS files, Google will be able to render and understand your web pages like modern browsers.

 

Google recommends using Fetch as Google to let Googlebot crawl your JavaScript.

 

Fetch as Google in Google Webmaster Tools

Source www.webnots.com

 

Update: As of 2019, the Google Search Console has launched a new version which doesn’t have many of the features the old version had. Luckily, you can still access the old version if you need those features. However, they are likely to be completely removed at some point, who knows.

 

New Google Search Console

 

14. Test Your Robots.Txt File to Show Google the Right Content

 

Crawlability issues are usually related to the robots.txt file. The robots.txt file tells Googlebot which pages to crawl and which not to crawl; testing it makes sure you’re giving Google access to the right content.

 

You can view your robots.txt file online by visiting http://domainname.com/robots.txt. Make sure the order of your directives is right. It should look similar to what you can see in the following picture:

 

View robots.txt file online
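
For reference, a minimal robots.txt might look like the sketch below; the disallowed paths and sitemap URL are placeholders, so adapt them to your own site:

# Applies to all crawlers
User-agent: *
# Keep private sections out of the crawl
Disallow: /admin/
Disallow: /cart/
# Everything else may be crawled
Allow: /
# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml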

 

Use the robots.txt Tester tool from Search Console to write or edit robots.txt files for your site. The tool is easy to use and shows you whether your robots.txt file blocks Google web crawlers from specific URLs. The ideal situation would be to have no errors:

 

robots.txt Tester no errors

 

The errors appear when Google is unable to crawl the specific URL due to a robots.txt restriction. There are multiple reasons for that and Google names just some of them:

 

For instance, your robots.txt file might prohibit the Googlebot entirely; it might prohibit access to the directory in which this URL is located; or it might prohibit access to the URL specifically.
Google
 

 

The most common reasons why Googlebot is blocked from accessing a website are:

  • There are DNS issues and Google can’t communicate with the DNS server;
  • The firewall or DoS protection system is misconfigured;
  • Googlebot is intentionally blocked from reaching the website.

 

After you’ve checked the issues and found the blocked resources pointed out in the Tester tool, you can test again and see if your website is OK.

 

The site’s crawlability can be verified better on a larger scale using the CognitiveSEO Audit Tool. You simply have to go to Indexability > Indexable Pages and look for the Disallowed in Robots.txt links. Click on the red line and it will show you a list of URLs that have been disallowed.

 

Crawlability Issues

 

15. Verify the Indexed Content

 

James Parsons, expert in content marketing and SEO, explains in an article on AudienceBloom the crucial significance of the indexing phase for a website.

Indexed pages are those that are scoured by Google search engines for possible new content or for information it already knows about. Having a web page indexed is a critical part of a website’s Internet search engine ranking and web page content value.
James Parsons
Blogger at JamesParsons.com

Search Console can provide lots of insightful information regarding the status of your indexed pages. The steps are simple: go to Google Index, then to Index Status, and you’ll see a chart similar to the one shown below:

 

Index status in Search Console

 

Ideally, the number of indexed pages equals the total number of pages on your website, minus the ones you don’t want indexed. Verify that you’ve set proper noindex tags. If there is a big difference, review the pages and check for blocked resources. If everything looks OK there, check whether some of the pages were never crawled and therefore never indexed.

 

If you didn’t notice anything out of the ordinary, test your robots.txt file and check your sitemap (see the robots.txt step above and the sitemap step below).

 

You can also use the Site Audit tool to view the URLs that have been marked with the noindex tag. They’re in the same section as the URLs blocked by robots.txt (Indexability > Indexable Pages).

 

 

16. Review Your Sitemap to Avoid Being Outdated

 

An XML sitemap explains to Google how your website is organized. You can see an example in the picture below:

 

Sitemap example

Source: statcounter.com

 

Crawlers will read it and understand how the website is structured in a more intelligible way. A good structure means better crawling. Use dynamic XML sitemaps for bigger sites, and don’t try to manually keep everything in sync between robots.txt, meta robots and the XML sitemaps.
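
For reference, a bare-bones sitemap file follows the structure sketched below; the URLs and dates are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/technical-seo-checklist/</loc>
    <lastmod>2019-05-10</lastmod>
  </url>
</urlset>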

 

Search Console comes to the rescue once again. In the Crawl section, you can find the Sitemaps report, where you can add, manage and test your sitemap file.

 

At this point, you have two options: test a new sitemap or test a previously submitted one. In the first case:

  • Add the Sitemap;
  • Enter the URL of the sitemap;
  • Click on Test sitemap and then refresh the page if needed;
  • When the test is completed, click Open Test Results to check for errors. Fix your errors;
  • After you fix your errors, click Submit Sitemap.

 

Google-Webmaster-Tools-Add-a-Sitemap

 

In the second case, you can test an already submitted sitemap; click on Test and check the results.

 

Sitemap Tester in Search Console

 

There are three things you need to do in the second situation:

  • Update the Sitemap when new content is added to your site or once in a while;
  • Clean it from time to time, eliminating outdated and bad content;
  • Keep it shorter so that your important pages get crawled more frequently or break the sitemap into smaller parts. A sitemap file can’t contain more than 50,000 URLs and must not be larger than 50 MB uncompressed.

 

Using a sitemap doesn’t guarantee that all the items in your sitemap will be crawled and indexed, as Google processes rely on complex algorithms to schedule crawling. However, in most cases, your site will benefit from having a sitemap, and you’ll never be penalized for having one.
Google
 

 

 

17. Review Blocked Resources (Hashbang URLs) with Fetch as Google

 

Hashbang URLs (URLs that have the #! in them) can now be checked and tested in Fetch as Google. John Mueller acknowledged that Google has the ability to fetch & render hashbang URLs via the Search Console.

 

Google stopped supporting them on March 30, 2014, and then announced on October 14, 2015 that it was deprecating its AJAX crawling scheme. At the moment, hashbang URLs can be tested.

 

Below you can see two situations for the same website. In the first picture, you can see the list of resources before using the fetch and render feature with the hashbang URL and in the second one you can see the situation after the fetch and render action was performed.

 

Before and After crawling hashbang

Source: www.tldrseo.com

 

18. Optimize Your Crawl Budget

 

The term “crawl budget” started getting more attention when Gary Illyes explained on January 16, 2017 how Google uses it.

 

Crawl budget means how many resources a server allocates to crawling, or how many pages the search engines crawl on a site in a specific period of time. Google says there is nothing to worry about if your pages tend to be crawled every day; the issues appear on bigger sites, which is why it is very important to optimize your crawl budget.

 

Maria Cieślak, search engine optimization expert, explains in an article on DeepCrawl the importance of optimizing your crawl budget.

Google is crawling only a particular number of pages on your website, and may sort the URLs incorrectly (I mean differently than you wish). For example, the “About us” page (that doesn’t drive sales) can gain more hits than the category listings with the new products. Your aim is to present to Google the most relevant and fresh content.
Maria Cieślak
SEO Specialist at Elephate

 

The crawl rate limit also comes into the discussion: it limits the maximum fetching rate for a given site.

 

The actions recommended for optimizing the crawl budget are:

  • Check the soft 404s and fix them using a personalized message and a custom page;
  • Get rid of duplicate content to avoid wasting crawl budget;
  • Remove hacked pages;
  • Prevent indexation for low quality and spam content;
  • Keep your sitemap up to date;
  • Correct infinite space issues;

 

 

19. Avoid Meta Refresh for Moving a Site

 

Since we’ve talked about the redirection plan for migrating a site, it is best to understand why Google doesn’t recommend using meta refresh for moving a website. There are three ways to define redirects:

  • HTTP responses with a status code of 3xx;
  • HTML redirections using the <meta> element;
  • JavaScript redirections using the DOM.
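
To illustrate the difference, here are minimal sketches of the three forms; the destination URL is a placeholder, and the HTTP response is the option Google prefers:

HTTP response (preferred):
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page/

HTML meta refresh (discouraged for moving a site):
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page/">

JavaScript redirect (only followed by clients that render JS):
<script>window.location.replace("https://www.example.com/new-page/");</script>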

 

Aseem Kishore, owner of Help Desk Geek.com, explains why it is better not to use this meta refresh technique: 

Although not particularly dangerous, Meta Refreshes are often used by unscrupulous webpage programmers to draw you into a web page using one piece of content and then redirect you to another page with some other content. Referred to as a black hat technique, most of the major search engines are smart enough not to fall for this method of “cloaking” web content.
Aseem Kishore
Owner and Editor-in-Chief at Help Desk Geek.com

When possible, always use HTTP redirects rather than a <meta> element. HTTP redirection is the preferred option, but sometimes the web developer doesn’t have control over the server and must use other methods. HTML redirection is one of them, but Google strongly discourages web developers from using it.

 

If a developer later changes the HTTP redirects but forgets the HTML ones, the two are no longer identical and the page might end up in an infinite loop, which leads to other problems.

 

In case you want to move a site, Google’s guidelines recommend the following steps:

  • Read and understand the basics of moving a website;
  • Prepare the new site and test it thoroughly;
  • Prepare a URL mapping from the current URLs to the new ones;
  • Correctly configure the server to perform the redirects and move the site;
  • Monitor the traffic for both the old and the new URLs.
 

20. Use Redirect for Flash Site to the HTML Version

 

Creating a Flash site without a redirect to the HTML version is a big SEO mistake. Flash content might look appealing, but just like JavaScript and AJAX, it is difficult to render. The crawler needs all the help it can get to crawl the data and send it to the indexer, so the Flash site must redirect to an HTML version.

 

What’s the point of a pretty site if Google can’t read it and show it the way you want? Flash websites might tell a beautiful story, but it’s all for nothing if Google can’t render them. HTML is the answer: build an HTML version with SWFObject 2.0, a tool that helps you serve Flash content alongside an HTML alternative.

 

21. Use Hreflang for Multi-Language Websites

 

Hreflang tags are used for language and regional URLs. It is recommended to use the rel="alternate" hreflang="x" attributes to serve the correct language or regional URL in search results in the following situations:

 

  • You keep the main content in a single language and translate only the template (navigation and footer). This is common for user-generated content.
  • You have small regional variations with similar content in a single language; for example, an English-language website targeted at the US, GB, and Ireland.
  • Your site content is fully translated; for example, you have multiple language versions of each page.

 

Maile Ohye, former Developer Programs Tech Lead at Google, explains how site owners can expand to new language variations and keep them search-engine friendly:

 

 

Based on these options, you can apply multiple hreflang tags to a single URL. Make sure, though, that the provided hreflang is valid:

  1. It doesn’t have missing confirmation links: if page A links to page B, page B must link back to page A.
  2. It doesn’t have incorrect language codes: the language codes must be in ISO 639-1 format and, optionally, the region in ISO 3166-1 Alpha 2 format.
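
A minimal sketch of reciprocal hreflang annotations between two language versions (the URLs are placeholders) could look like this:

<!-- On https://www.example.com/en/page/ -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/">

<!-- On https://www.example.com/de/page/ the same set is repeated, confirming the return link -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/">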

 

We’ve documented a complete guide to the vital hreflang and multi-language website mistakes that most webmasters make, which we recommend you follow.

 

Also, you can use the Site Audit to quickly analyze and identify hreflang issues on your website. Simply head to Content > Hreflang/Languages to get a list of your implementation issues. In the following screenshot you can see that this site has a lot of missing confirmation links, which means that Language A Page points to the Language B Page but Language B Page doesn’t point back to Language A Page.

 

Hreflang Technical Issues

 

22. Make Sure Your Tracking Is Working Properly

 

Tracking your website is really important. Without tracking your results, you won’t be able to see any improvements.

 

Tracking issues are common after migrating from HTTP to HTTPS or after minifying and combining JS files. These changes can break the tracking code, resulting in a loss of data.

 

You need to make sure that everything is working properly so that you can track the results of the improvements you’re making over time.


 

III. Content Optimization

 

Now that you’ve fixed the general issues that can create crawlability and indexability issues, you can focus more on specific issues regarding your content, such as broken pages, internal linking and so on.

 

This is very important if you really want to surpass your competition, especially in highly competitive markets.

 

23. Redirect/Replace Broken Links & Resources

 

Sometimes the images on a webpage aren’t available, so a broken image icon is displayed in the visitor’s browser. It can happen to anybody, for lots of reasons, and it is not a pretty situation. You know the saying: a picture is worth a thousand words. A missing picture, replaced by an ugly icon and an error message, says something about your site as well…

 

Broken images

 

A solution would be to add an error handler on the IMG tag: 

<img src="http://www.example.com/broken_url.jpg" onerror="this.src='path_to_default_image'" />

Some webmasters say that Chrome and Firefox recognize when images aren’t loaded and log it to the console, while others have other opinions.

 

Sam Deering, web developer specialized in JavaScript & jQuery, offers some great steps to resolve these issues:

  1. Firstly, search for some information on the current images on page;
  2. Secondly, use AJAX to test if the image exists;
  3. Then refresh the image;
  4. Fix broken images using AJAX;
  5. Check the Non-AJAX function version.
In most browsers, the ALT tag is shown if the image is not found. This could be a problem if the image is small and the ALT tag is long as it seems the output width of the element is not forced by the length of the alt tag.
Sam Deering
Front-end Web Developer

 

This is also the case with broken URLs. Although nothing weird will be displayed on the site, if the user clicks on a broken link, it will lead to a bad experience. You can view which resources are broken on your website by heading to the Architecture section in the Site Audit tool.

 

Broken URLs and Images are bad for SEO

 

 

24. Audit Internal Links to Improve Your Chances to Rank Higher

 

Internal links are the connections between your pages; through them, you can build a strong website architecture by spreading link juice, or link equity, as others call it.

 

Creating connections between similar pieces of content is known as building content silos. This method groups topics and content based on keywords and defines a hierarchy.

 

Benefits of internal linking

Source: www.seoclarity.net

 

There are a lot of advantages to building internal links, because internal linking:

  • opens the road for search engine spiders by making content accessible;
  • transfers link juice;
  • improves user navigation and offers extra information to the user;
  • organizes pages based on the keyword used as anchor text;
  • highlights the most important pages and passes this information to the search engines;
  • organizes the site architecture.

 

The more relevant pages are combined with each other when crawled repeatedly, and as the crawling frequency rises, so does the overall rank in search engines.

Kasia Perzyńska
Content Marketer Unamo

 

When you audit internal links, there are four things that need to be checked:

  • Broken links;
  • Redirected links;
  • Click depth;
  • Orphan pages;

 

You can easily do all of those using the CognitiveSEO Site Audit Tool under Architecture > Linking Structure.

 

Internal Linking Structure Audit Tool for SEO

 

 

25. Get Rid of Duplicate Content

 

When we talk about technical SEO, we also think of duplicate content, which is a serious problem. Be prepared and review your HTML Improvements from Search Console to remove the duplicates.

 

Keep unique and relevant title tags and meta descriptions across your website by looking in Search Console at Search Appearance » HTML Improvements.

 

HTML Improvements in Google Webmaster Tools

 

In Search Console you can find a list of all the duplicate content, leading you to the pages that need improvement. Remove or review each element and craft new titles and meta descriptions. Google loves fresh and unique content; the Panda algorithm confirms it.

 

Another option is to apply the canonical tag to pages with duplicate content. The rel=canonical tag shows the search engines which URL is the original source. Canonicalizing irrelevant URLs to avoid content duplication is a recommended practice.

 

Jayson DeMers, Founder & CEO of AudienceBloom, considers that duplicate content can hurt your website and discourage search engines from ranking it, and that it can also lead to a bad user experience, as he says on Forbes.

Just a few instances of duplicate content can trigger Google to rank your site lower in search results, leaving you unable to recover until those content duplication issues are addressed. Duplicate content can also interfere with your user experience, leaving your site visitors feeling that your site is more fluff than substance.
Jayson DeMers
 Founder & CEO of AudienceBloom

 

The CognitiveSEO Site Audit Tool can not only easily identify Duplicate Content, but it also has a feature to identify near duplicate content, which are pages that are very similar in content but not quite the same.

 

SEO Duplicate Content Issues

 

Fixing duplicate content issues is critical, especially for eCommerce websites where this practice/issue is common. The tool makes it very easy to fix.

 

 

26. Use Structured Data to Highlight Your Content

 

Structured data is the way to make Google understand your content and help the user choose and get directly on the page they are interested in through rich search results. If a website uses structured markup data, Google might display it in SERP as you can see in the following picture:

 

Rich snippets example

 

Beside rich snippets, structured data can be used for:

  • Getting featured in the Knowledge graph;
  • Gaining beta releases and having advantages in AMP, Google News, etc.;
  • Helping Google offer results from your website based on contextual understanding;

 

Structured data

Source: www.link-assistant.com

 

The language for structured data is schema.org. You can highlight your content using structured data. Schema.org helps webmasters mark up their pages in ways that can be understood by the major search engines.

 

If you want to get in rich search results your site’s page must use one of three supported formats:

  • JSON-LD (recommended);
  • Microdata;
  • RDFa.

 

After you highlight your content using structured data, it is recommended to test it using the Google Structured Data Testing Tool. Testing it will give you great directions to see if you set it right or if you didn’t comply with Google’s guidelines because you can get penalized for spammy structured markup.

 

Google doesn’t guarantee the appearance of each content highlighted using structured data markup.

 

27. Keep a Reasonable Number of Links On-Page

 

People in the web community often associate pages with 100 or more links with “link farms”. UX also plays a big role in how many links a single page should carry: a piece of content overloaded with links distracts users and fails to inform them, because most of the text is linked. Add links only where they are relevant, where they offer extra information, or where you need to credit a source.

 

Patrick Sexton, Googlebot Whisperer, explains in an article on Varvy why it is important to keep a reasonable amount of links per page:

You may also wish to consider how well your webpage is linked to. If a webpage has many quality links pointing to it, that webpage can have many links (even more than 100) but it is important to remember the reasons why you shouldn’t have a huge amount of links on any given page.
Patrick Sexton
Googlebot Whisperer at Outspoken Media

In general, the more links on a page, the more organized that page needs to be so the user can find the information they came for. Also, look for natural ways to add links and don’t violate Google’s guidelines for building links. The same recommendation applies to internal links.

 

28. Avoid Canonicalizing Blog Pages to the Root of the Blog

 

John Mueller said in a Google Webmaster Hangout that Google doesn’t encourage canonicalizing blog subpages to the root of the blog as a preferred version. Subpages aren’t a true copy of the blog’s main page, so doing that makes no sense.

 

You can listen to the whole conversation from minute 16:28:

 

 

Even if Google sees the canonical tag, it will ignore it because it thinks it’s a webmaster’s mistake.  

 

Setting up blog subpages with a canonical blog pointed to the blog’s main page isn’t a correct set up because those pages are not an equivalent, from Google’s point of view.
John Mueller
Webmaster Trends Analyst at Google

 

Canonical links are often misunderstood and incorrectly implemented, so make sure you check all your URLs for bad canonical implementation. You can do this easily with the Technical SEO Audit tool by CognitiveSEO.

 

Canonical URLs Technical SEO

 

Remember that it’s a good idea to always have a self-referencing canonical tag pointing to the page itself. This will ensure there are fewer duplicate content issues.
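
For reference, a self-referencing canonical tag is just one line in the page’s <head>; the URL below is a placeholder:

<!-- In the <head> of https://www.example.com/blog/technical-seo-checklist/ -->
<link rel="canonical" href="https://www.example.com/blog/technical-seo-checklist/">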


 

IV. User-Friendlier Website

 

Google cares about UX, so why wouldn’t you? Many experts think that UX is crucial to the future of SEO, especially with the evolution of machine learning technology. David Freeman, Search Engine Land columnist, has a strong opinion on the role of UX:

 

Part of Google’s philosophy has always been focused on delivering the best user experience. With recent technological advances, Google and other search engines are now better placed than ever to deliver this vision.
David Freeman
Group Head of SEO at Treatwell & Search Engine Land Columnist
 

29. Set up Your AMP the Right Way – Mobile Friendlier

 

Google recommends using AMP (Accelerated Mobile Pages) to improve the user experience, which it values highly. Since the Google AMP change affects a lot of sites, it is best to understand how it works and the right way to set it up on different platforms: WordPress, Drupal, Joomla, Concrete5, OpenCart, or a custom AMP implementation.

 

On this topic, we created a guideline on how to implement AMP because it is a process which needs full understanding. Google AMP doesn’t directly affect SEO, but indirect factors that result from AMP can.

Historically, Google has acted as an index that points people away from Google to other websites. With its AMP search results, Google is amassing content on its own servers and keeping readers on Google.
Klint Finley
Content writer at Wired Business / @klintron

 

AMP is pretty difficult to implement correctly. You can always run into issues. Miss one tag closing bracket and you risk your AMP version not displaying at all. You can test the format of all your site’s AMP pages quickly using the CognitiveSEO Tool.

 

In the following example there are no AMP pages set up, but if you have any, you may want to take a look at the Incorrectly set up AMP Pages section to identify the problematic ones:

 

Test AMP pages SEO validator

 

 

30. Add Breadcrumbs for a Better Navigation

 

Breadcrumbs, used by Hansel and Gretel to find their way back home, are implemented on websites with the same purpose: to guide the user through the website. They help visitors understand where they are located on the site and make it easier to navigate.

 

location-based-breadcrumb-example-sitepoint

Source: www.smashingmagazine.com

 

Breadcrumbs can improve the user experience and help search engines get a clear picture of the site structure. They fulfill the need for a secondary navigation, but they shouldn’t replace the primary navigation.

 

Another advantage is that they reduce the number of actions and clicks a user must take on a page. Instead of going back and forth, they can simply use the level/category link to go where they want. The technique works especially well for big websites and e-commerce sites.

 

W3Schools exemplifies how to add breadcrumbs in two steps.

  1. Add HTML
<ul class="breadcrumb"> <li><a href="#">Home</a></li> <li><a href="#">Pictures</a></li> <li><a href="#">Summer 15</a></li> <li>Italy</li> </ul>

 

  2. Add CSS

/* Style the list */
ul.breadcrumb {
  padding: 10px 16px;
  list-style: none;
  background-color: #eee;
}

/* Display list items side by side */
ul.breadcrumb li {
  display: inline;
  font-size: 18px;
}

/* Add a slash symbol (/) before/behind each list item */
ul.breadcrumb li+li:before {
  padding: 8px;
  color: black;
  content: "/\00a0";
}

/* Add a color to all links inside the list */
ul.breadcrumb li a {
  color: #0275d8;
  text-decoration: none;
}

/* Add a color on mouse-over */
ul.breadcrumb li a:hover {
  color: #01447e;
  text-decoration: underline;
}
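If you also want to spell the trail out for search engines, the visible breadcrumb can be paired with schema.org BreadcrumbList structured data. A minimal JSON-LD sketch mirroring the example above (the URLs are placeholders) might look like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Pictures", "item": "https://example.com/pictures/" },
    { "@type": "ListItem", "position": 3, "name": "Summer 15", "item": "https://example.com/pictures/summer-15/" },
    { "@type": "ListItem", "position": 4, "name": "Italy" }
  ]
}
</script>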

 

If you want a simpler solution, you can use plugins for WordPress, such as Breadcrumb NavXT Plugin or Yoast SEO.

 

yoast-seo-breadcrumbs

 

Go to SEO in the Dashboard, click on Advanced, select Enable Breadcrumbs and hit Save changes. This method applies the default settings for your breadcrumbs.

 

31. Test On as Many Platforms and Devices as Possible

 

People use different devices. If you want your users to have a good experience, you need to test on multiple devices. As many as you can!

 

You can start in Chrome by right-clicking the page and hitting Inspect. Then toggle the device toolbar and select the type of device you want to view your site on.

 

SEO Testing on multiple devices

 

You can also use third-party tools, such as ScreenFly.

 

However, keep in mind that these tools only take the screen width into account. For example, if you don’t own an iOS device, you’ll never notice that WebM videos don’t play in the Safari browser.
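A common workaround (sketched here with hypothetical file names) is to serve an MP4 fallback alongside the WebM source, so browsers that can’t play WebM still get a working video:

<video controls width="640">
  <!-- Browsers pick the first source they can play; Safari falls back to the MP4 -->
  <source src="product-demo.webm" type="video/webm">
  <source src="product-demo.mp4" type="video/mp4">
  Your browser does not support HTML5 video.
</video>

But you would only learn that this fallback is needed by actually testing in Safari, which is exactly the point of this step.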

 

You really need to test on different devices and browsers. Test on Windows, iOS, Linux, Safari, Firefox, Edge, Chrome, Opera and even the sad, old and forgotten Internet Explorer.

 

If you don’t own an Android or an iPhone/iPad, go to a store if needed or find a friend. Whenever you can get your hands on a new device, take a minute or two to browse your website.

Audit & Fix Your Site Now

Conclusion

 

Firstly, this SEO guide offers solutions and directions for making a website fast and decreasing loading time by following the recommendations from Google PageSpeed Insights and Google’s developer guidelines.

 

Secondly, we went through the functional elements of a website, checking and resolving issues related to crawl errors, indexing status, redirects and making a website accessible to Google.

 

Thirdly, we looked at improving and optimizing content by resolving critical technical SEO issues. We discussed how to remove duplicate content, replace missing information and images, build a strong site architecture, and highlight content to make it visible to Google.

 

Lastly, we covered how to make your website more mobile-friendly and easier to navigate.

The post Technical SEO Checklist – The Roadmap to a Complete Technical SEO Audit appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

Link reclamation: A practical guide for turning unlinked brand mentions into links

Link reclamation: A practical guide for turning unlinked brand mentions into links

Your latest content campaign has been covered by a top-tier global publication… but there’s no link! Your brand (or your client) has been mentioned, but that’s all.

At this stage, do you simply accept the brand value of a mention and move on to target your next link prospect? Or is there a process you can follow to at least try to get a link added in?

Sadly, unlinked brand mentions are one of the biggest challenges when building links through content marketing and digital PR. They’re more common than many link builders would like to admit.

But getting a link added to an article after it’s been published can be easier to achieve than many assume.

You just need to know when it’s right to ask for a link, who you need to reach out to and what you should say. We’ll cover all these things below.

Content-led link building is hard — don’t let anyone tell you otherwise.

It often takes blood, sweat and tears to launch a campaign which earns significant numbers of links. And it’s for that reason that link reclamation should be a tactic which is executed as standard. After all, if you’ve put the effort in to land coverage in the first place, it makes sense to follow additional steps to secure a link if that’s what it takes.

How many people use link reclamation?

To demonstrate a point: I recently reached out to my Twitter and LinkedIn networks. I asked a simple question: ‘Do you use link reclamation alongside your content marketing campaigns?’

The responses surprised me…

Always: 29%
Sometimes: 47%
Never: 24%

Of those who took the time to respond, only three in ten are executing link reclamation as standard on every campaign.

Perhaps more surprising is that one in four aren’t using it at all.

Given some of the comments left alongside the poll, this is something which many turn to if they have time — rather than working it into a wider link building process.

Think of the links that could have been earned but were simply left lying around as brand mentions.

What is link reclamation?

Link reclamation is a simple but efficient tactic to turn brand mentions into links; usually those earned as part of a content marketing or digital PR campaign.

As SEOs, we understand the value of authoritative, editorial links and the impact which they can have upon our campaigns — just as much as we understand how hard it is to even earn coverage from top-tier publications in the first place.

That’s why it can be frustrating when we discover a brand mention which doesn’t link.

What’s really important to remember, however, is that in many instances journalists aren’t purposefully avoiding linking to you or your client. For one reason or another (whether that’s trying to speed up the publishing process, a question as to whether a link is really needed to tell the story or others…), articles sometimes get published without a link.

As an industry we need to accept that there’s little we can do to change a journalist’s own processes and publication criteria. What we can do is take action and follow a series of tried and tested steps to try to land that link.

After all, the hard work of getting the coverage in the first place is already done. Turning a brand mention into a link is surely easy in comparison, right?

I’d like to say yes. And in many cases it is. However, I’ve also seen some horrendous examples of link reclamation gone wrong, usually because of a lack of understanding as to whether a link is actually deserved or not.

How do you find unlinked brand mentions?

One way to find unlinked brand mentions is to use Ahrefs’ Content Explorer and follow their tutorial, combining a CSV export with Screaming Frog to compile a list of web pages that mention your brand but don’t link to it.

If you’re actively promoting a content marketing or digital PR campaign, however, you’ll undoubtedly already be looking for the latest coverage.

One of the easiest ways to find this is through Google News. Filter by ‘Past 24 hours’ to see coverage picked up in the past day, or set to ‘Past week’ if you’re looking to find additional articles and features.

screenshot of Google News filtered by past 24 hours or past week, to be used when finding relevant content for link building

This will often throw up a number of unlinked brand mentions which you can then try to turn into links using link reclamation tactics.

Don’t forget to set up Google Alerts for both your brand name and your campaign headlines so you’re easily alerted to further unlinked mention opportunities to explore.

When should you ask for a brand mention to be linked?

It’s not always right to ask for an unlinked brand mention to be turned into a link.

Despite what many may say, a journalist doesn’t owe you a link. Not even if they cover your campaign.

A link’s purpose is to take a user from A to B, and for a link to make sense, it typically needs to add value of some sort.

To put this into a working context, let’s look at a few different scenarios here.

  1. Your brand (or client) has been mentioned in an article in reference to a study which you conducted and which the article directly mentions. There’s no link but many of the statistics and findings have been revealed.
  2. A journalist has featured an infographic which you produced (and embedded it) but hasn’t linked. They have credited your brand.
  3. Your brand has been referenced alongside a quote which you supplied to a journalist to add further weight to their story around a subject.
  4. Your brand has been mentioned out of context. In this case, let’s base it around a Tweet which circulated last year; one of your physical stores has been mentioned in an online newspaper, only in reference to a robbery taking place over the road from it.

In which scenario would you say the link is truly deserved?

Scenario one. 

When there’s a clear opportunity to add value with what’s on the other end of the link, there’s no debating that a link should be in place, and it’s easy to justify why. The good news is that, in many cases, when there’s clear value to the user and the link is an important part of the wider article, it will already be in place.

Scenarios two and three are where most link reclamation activity happens: those where a link would reference the original campaign or the brand that supplied a quote. In most cases the link isn’t already in place because it isn’t vital to the story; however, it is in context and can be requested as a way to cite a source.

Scenario four is where link reclamation should be avoided. The link is of no value to readers and doesn’t make contextual sense.

Always be mindful as to whether it makes sense for a link to be added to an article. Ask yourself: “would a link add value to a reader?” Otherwise, you’re wasting your time trying to reclaim an out-of-context mention.

You need to be able to clearly outline where a link should point to.

Note: requesting links to homepage, category or service pages is often less successful than requesting links to content pages, as it can come across as overly commercial.

You also need to be able to justify why the link makes sense, which maximizes your link reclamation success rate.

Who should you approach with your request?

You need to make sure you’re making your request to the right person to increase your chances of seeing a brand mention turned into a link.

Your options of who to approach are usually:

  1. The journalist who wrote and published the article
  2. Their editor
  3. The publication’s corrections desk

You see, most go straight back to the journalist they pitched the original story to; however, this isn’t always as successful as it could be.

Why?

Journalists are busy people.

Once they’ve hit publish, there’s a good chance they’ve moved on to writing their next article and have more or less forgotten about what they last put together. And we simply have to accept that. They have new priorities, and they’re not about to drop everything to add your link in.

Of course, that’s not to say that reaching back out to a journalist doesn’t work, simply that they’re not always the best option.

You could reach out to the editor of the section. However, again, they’re busy individuals and adding your link in likely doesn’t come as a high priority.

A corrections desk’s role is to make amendments to articles which have already been published.

This makes them, at least for me, the first people to reach out to.

You’ll find corrections contacts listed for most publications. If we take a look at Metro’s ‘Contact Us’ page (found in their footer), we see:

contact info for Metro.co.uk, shows email for a corrections desk which can be the best option for reviewing unlinked brand mentions

The address clearly states that the purpose is for complaints or corrections. Sending your link reclamation request here often ensures both quick action and an increased chance of success.

Say you send your request to the corrections desk and either get no reply after three days or don’t see the link added. (Note that you often won’t be notified when a link has been added to the article after you request it through the corrections desk — so be sure to keep checking yourself.) In this case, you might go back and reach out to either the journalist or the editor (or both), essentially giving you three chances at getting that link.

What should you say to maximize your chances of getting the link?

I’ve spent hours in the past reworking emails, but am confident that the approach which I now take works well, at least across my own clients and campaigns.

I’ve learned that a successful link reclamation email includes the following:

  • A polite THANK YOU for covering your campaign, brand or client (manners really do go a long way)
  • A clear reference to the title of the article which contains the brand mention
  • A link to the article which contains the mention
  • The link which you want added in to the article
  • A simple justification as to why the link adds value to readers

And, in practice, here’s what that looks like for me:

example email of how to ask for a backlink to a currently unlinked brand mention, particularly where it adds value to the reader

It’s simple, straight to the point and polite; however what it does perfectly is justify why a link would add value to the article and should be added in.

In this particular example, the link was added into an article on USA Today within 2 hours of sending the email.

Earning extra links for your brand

Link reclamation is something that, as far as I’m concerned, should be done alongside all content marketing and digital PR campaigns to help you maximize the number of quality links earned.

Once you understand what works (and what doesn’t) in terms of who to approach and what to say, you’ll find that it’s something you can spend half an hour on each day and see results from.

At the end of the day, links still work in SEO. And there’s every argument to be made to put in that extra bit of effort to earn more from your (already put in) hard work.

James Brockbank is Managing Director of Digitaloft, a multi-award winning SEO, PPC & Content Marketing agency. You can find him on Twitter @BrockbankJames.

The post Link reclamation: A practical guide for turning unlinked brand mentions into links appeared first on Search Engine Watch.

We Analyzed 912 Million Blog Posts. Here’s What We Learned About Content Marketing

We Analyzed 912 Million Blog Posts. Here’s What We Learned About Content Marketing


We analyzed 912 million blog posts to better understand the world of content marketing right now.

Specifically, we looked at how factors like content format, word count and headlines correlate with social media shares and backlinks.

With the help of our data partner BuzzSumo, we uncovered some very interesting findings.

And now it’s time to share what we discovered.

Here is a Summary of Our Key Findings:

1. Long-form content gets an average of 77.2% more links than short articles. Therefore, long-form content appears to be ideal for backlink acquisition.

2. When it comes to social shares, longer content outperforms short blog posts. However, we found diminishing returns for articles that exceed 2,000 words.

3. The vast majority of online content gets few social shares and backlinks. In fact, 94% of all blog posts have zero external links.

4. A small percentage of “Power Posts” get a disproportionate amount of social shares. Specifically, 1.3% of articles generate 75% of all social shares.

5. We found virtually no correlation between backlinks and social shares. This suggests that there’s little crossover between highly-shareable content and content that people link to.

6. Longer headlines are correlated with more social shares. Headlines that are 14-17 words in length generate 76.7% more social shares than short headlines.

7. Question headlines (titles that end with a “?”) get 23.3% more social shares than headlines that don’t end with a question mark.

8. There’s no “best day” to publish a new piece of content. Social shares are distributed evenly among posts published on different days of the week.

9. List posts are heavily shared on social media. In fact, list posts get an average of 218% more shares than “how to” posts and 203% more shares than infographics.

10. Certain content formats appear to work best for acquiring backlinks. We found that “Why Posts”, “What Posts” and infographics received 25.8% more links compared to videos and “How-to” posts.

11. The average blog post gets 9.7x more shares than a post published on a B2B site. However, the distribution of shares and links for B2B and B2C publishers appears to be similar.

We have detailed data and information of our findings below.

Long-Form Content Generates More Backlinks Than Short Blog Posts

When it comes to acquiring backlinks, long-form content significantly outperforms short blog posts and articles.

Long-form content generates more backlinks than short blog posts

You may have seen other industry studies, like this one, that found a correlation between long-form content and first page Google rankings.

However, to our knowledge no one has investigated why longer content tends to perform so well. Does the Google algorithm inherently prefer long content? Or perhaps longer content is best at satisfying searcher intent.

While it’s impossible to draw any firm conclusions from our study, our data suggests that backlinks are at least part of the reason that long-form content tends to rank in Google’s search results.

Key Takeaway: Content longer than 3000 words gets an average of 77.2% more referring domain links than content shorter than 1000 words.

The Ideal Content Length For Maximizing Social Shares Is 1,000-2,000 Words

According to our data, long-form content generates significantly more social shares than short content.

However, our research indicates that there are diminishing returns once you reach the 2,000-word mark.

The ideal content length for maximizing social media shares is 1,000 to 2,000 words

In other words, 1,000-2,000 words appears to be the “sweet spot” for maximizing shares on social media networks like Facebook, Twitter, Reddit and Pinterest.

In fact, articles between 1k-2k words get an average of 56.1% more social shares than content that’s less than 1000 words.

Key Takeaway: Content between 1k-2k words is ideal for generating social shares.

The Vast Majority of Content Gets Zero Links

It’s no secret that backlinks remain an extremely important Google ranking signal.

Google recently reiterated this fact in their “How Search Works” report.

Google – How search works

And we found that actually getting these links is extremely difficult.

In fact, our data showed that 94% of the world’s content gets zero external links.

94% of content published gets zero external links

It’s fair to say that getting someone to link to your content is tough. And we found that getting links from multiple websites is even more challenging.

In fact, only 2.2% of content generates links from multiple websites.

Only 2.2% of content generates links from multiple websites

Why is it so hard to get backlinks?

While it’s impossible to answer this question from our data alone, it’s likely due to a sharp increase in the amount of content that’s published every day.

For example, WordPress reports that 87 million posts were published on their platform in May 2018, which is a 47.1% increase compared to May 2016.

Number of posts published (WordPress)

That’s an increase of 27 million monthly blog posts in a 2 year span.

It appears that, due to the sharp rise in content produced, building links from content is harder than ever.

A 2015 study published on the Moz blog concluded that, of the content in their sample, “75% had zero external links”. Again: our research from this study found that 94% of all content has zero external links. This suggests that getting links to your content is significantly harder compared to just a few years ago.

Key Takeaway: Building links through content marketing is more challenging than ever. Only 6% of the content in our sample had at least one external link.

A Small Number of “Power Posts” Get a Large Proportion of Shares

Our data shows that social shares aren’t evenly distributed. Not even close.

We found that a small number of outliers (“Power Posts”) receive the majority of the world’s social shares.

Specifically, 1.3% of articles get 75% of the social shares.

And a small subset of those Power Posts tend to get an even more disproportionate amount of shares.

In fact, 0.1% of articles in our sample got 50% of the total amount of social shares.

Top subset of

In other words, approximately half of all social shares go to an extremely small number (0.1%) of viral posts.

For example, this story about shoppers buying and returning clothes from ecommerce sites received 77.3 thousand Facebook shares.

This single article got more Facebook shares than the rest of the top 20 posts about ecommerce combined.

Key Takeaway: The majority of social shares are generated from a small number of posts. 75% of all social shares come from only 1.3% of published content.

There’s Virtually No Correlation Between Social Shares and Backlinks

We found no correlation between social shares and backlinks (Pearson correlation coefficient of 0.078).

In other words, content that receives a lot of links doesn’t usually get shared on social media.

(And vice versa)

And when content does get shared on social media, those shares don’t usually result in more backlinks.

This may surprise a lot of publishers as “Sharing your content on social media” is considered an SEO best practice. The idea being that social media helps your content get in front of more people, which increases the likelihood that someone will link to you.

While this makes sense in theory, our data shows that this doesn’t play out in the real world.

That’s because, as Steve Rayson put it: “People share and link to content for different reasons”.

So it’s important to create content that caters to your goals.

Do you want to go viral on Facebook? Then list posts might be your best bet.

Is your #1 goal to get more backlinks? Then you probably want to publish infographics and other forms of visual content.

We will outline the differences between highly-linkable and highly-shareable content below.

But for now, it’s important to note that there’s very little overlap between content that gets shared on social media and content that people link to.

Key Takeaway: There’s no correlation between social media shares and links.

Long Headlines are Correlated With High Levels of Social Sharing

Previous industry studies have found a relationship between “long” headlines and social shares.

Our data found a similar relationship. In fact, we discovered that “very long” headlines outperform short headlines by 76.7%:

Long headlines are correlated with increased social sharing

We defined “very long” headlines as headlines between 14-17 words in length. As you can see in the chart, there appears to be a linear relationship between headline length and shares.

And this same relationship played out when we analyzed the headlines in our dataset by character count.

Long headlines (100+ characters) are correlated with social shares

As you might remember from 2014, clickbait-style headlines worked extremely well for publishers like Buzzfeed and Upworthy.

And their posts tended to feature headlines that were significantly longer than average.

Although clickbait isn’t as effective as it once was, it appears that long headlines continue to be an effective tactic for boosting social shares.

There are, of course, exceptions to this rule. For example, this post with a 6-word headline received over 328k social shares.

Keto no-bake cookies post

But when you look at the headlines across our dataset of 912 million posts, it’s clear that content with longer headlines gets more social shares.

Why long headlines work so well is anyone’s guess. However, I have two theories that may partly explain things.

First, it could be the fact that longer headlines pack more information in them compared to short headlines. This “extra” information may push people to read a piece of content or watch a video that they otherwise wouldn’t, increasing the odds that it goes viral.

Also, longer headlines contain more terms that can “match” keyword searches in Google and on social media sites where people commonly search (like Twitter). Again, this results in more eyeballs, which can lead to more shares.

Twitter search

Key Takeaway: Very long headlines (14-17 words in length) get 76.7% more social shares than short headlines.

Titles That End With a “?” Get an Above Average Amount of Social Shares

One interesting nugget from our data was that “question headlines” seem to be working well right now.

In fact, headlines with a question mark get 23.3% more social shares than non-question headlines.

For example, here’s a post with a question headline that boasts 3.3M shares:

Question titles may work because they add an element of intrigue that’s well-documented to increase click-through-rate. Put another way, you might decide to read a post in order to answer the question posed in the headline.

Obviously, question titles aren’t a magic bullet. But using questions in certain headlines may help increase shares and traffic.

Key Takeaway: Question headlines get 23.3% more social shares than non-question headlines.

There’s No “Best Day” to Publish New Content

What’s the best day to publish a blog post?

Well, according to our data, the day that you publish doesn’t make much of a difference.

(At least in terms of social shares)

Social shares by day of the week

We did find that Sunday had a slight edge over other days of the week. However, the difference in shares from content published on Sunday vs. the other 6 days of the week was only 1.45%.

Several industry studies and case studies have set out to answer the “best time to publish content” question. But most are either old (one of the most-cited industry studies I found was published back in 2012) or used a small sample size.

And this is likely the reason that the findings from those studies are so conflicting.

Considering that there’s no advantage to publishing content on a certain day, I recommend researching and testing the best publishing time for your industry and audience.

For example, after extensive testing, we found that publishing on Tuesday morning (Eastern) works best for the Backlinko blog. But I’ve heard from other bloggers that publishing on Saturday works best for them.

So the “best” day to publish is ultimately whenever your audience is available to consume and share your content, something that’s best determined by testing.

Key Takeaway: There’s no “best” day for new content to come out. Shares are essentially equal across different days of the week.

List Posts and “Why Posts” Get a High Level Of Shares Compared to Other Content Formats

We investigated the relationship between content format and social shares.

Our data shows that list posts and “Why Posts” tend to get more shares than other content formats.

List posts and

For example, this Why Post from Inc.com was shared on Facebook 164 thousand times:

Why reading books should be your priority

On the other hand, how-to posts and infographics don’t get shared on social media very often.

That’s not to say you should avoid any particular content format. There are infographics and how-to posts out there that generate tens of thousands of shares.

However, our data does suggest that focusing on list posts and Why Posts may increase the odds of your content getting shared on social media.

Key Takeaway: List posts perform well on social media compared to other popular content formats. Our study found that list posts generate 203% more shares than infographics and 218% more shares than how-to articles.

“Why Posts”, “What Posts” and Infographics Are Ideal Content Formats for Acquiring Backlinks

We found that “Why Posts”, “What Posts” and infographics get linked to more often than other content formats.

What’s interesting is that, while there’s some overlap, there’s a significant difference in the content formats that people share and link to.

Referring domains .vs. Average social shares

While our study found that list posts were the top content format for social sharing, they’re dead last in terms of getting backlinks from other websites.

For example, this list post has 207.8k social shares.

20 amazing writing prompts

But according to BuzzSumo, despite all those shares, this article has zero backlinks:

BuzzSumo – boredpanda.com – Shares

It’s a similar situation with infographics. Our data shows that infographics tend to get very few shares relative to list posts, “what posts” and videos.

However, when it comes to links, infographics are a top 3 content format.

This supports our other finding from this research that there’s no correlation between shares and links.

My theory on this is that certain formats are primed to get shared on social networks like Facebook and Twitter, while other formats are designed to get linked to by the small group of “Linkerati” that run and contribute content to websites.

Infographics illustrate this contrast perfectly.

Although the occasional infographic may go viral, it’s fair to say that their novelty has worn off in recent years. Which may explain why infographics aren’t shared very much compared to other formats (like list posts).

However, due to the fact that infographics contain highly-citable data, they work as an effective form of “link bait”.

Also, unlike a list post or how-to post, infographics can be easily embedded in blog content. This further increases the chances of acquiring links.

Key Takeaway: “Why Posts”, “What Posts” and infographics appear to be ideal for link building. These three formats receive an average of 25.8% more referring domain links than how-to posts and videos.

B2B and B2C Content Have a Similar Share and Link Distribution

We analyzed a subset of content from our dataset that was published on B2B websites. Our goal was to find out if share and link behavior differed in the B2B and B2C spaces.

First, we did find that “normal” content generates significantly more shares than B2B content. In fact, the average amount of shares for all the content in our dataset is 9.7x higher than content published in the B2B space.

B2C content gets shared 9.7X more than B2B content

This finding wasn’t surprising. B2C content tends to cover topics with broad appeal, like fitness, health and politics. On the other hand, B2B content on hiring, marketing and branding appeals only to a relatively small group. So it makes sense that B2C content would get shared more often.

However, when we analyzed the distribution of B2B shares and links vs. all published content, we found that they largely overlapped.

For example, 93% of B2B content gets zero links from other websites.

93% of B2B content gets zero external links

The amount of B2B content without any links (93%) is similar to the figure (94%) from our full dataset.

The percentage of B2B posts that get linked to from multiple websites also overlaps with B2C.

Only 3% of B2B content gets linked to from more than one website.

Only 3% of B2B content generates links from multiple websites

This largely matches the 2.2% that we found in our mixed dataset of B2B and B2C content.

Overall, B2B and B2C link distribution largely overlaps.

Similar share and link distribution

When it comes to B2B social shares, we found that 0.5% of B2B articles get 50% of social shares.

B2B

And 2% of B2B articles get 75% of social shares.

B2B subset of

Like with B2C content, B2B publishers have a small number of “Power Posts” that drive the majority of social sharing.

B2B and B2C shares stem from a small number of

Key Takeaway: Although B2B content doesn’t get shared as often, the distribution of shares and links in B2B and B2C appears to be similar.

Conclusion

I learned a lot about content marketing from this study, and I hope you did too.

I’d like to again thank BuzzSumo (in particular Henley Wing) for providing the data that made this research possible.

For those that are interested, here is a PDF of how we collected and analyzed the data for this research.

And now I’d like to hear from you:

What’s your #1 takeaway lesson from this study?

Or maybe you have a question.

Either way, leave a comment below right now.

The post We Analyzed 912 Million Blog Posts. Here’s What We Learned About Content Marketing appeared first on Backlinko.

Missed YoastCon 2019? Learn from our takeaways!

Missed YoastCon 2019? Learn from our takeaways!

YoastCon 2019 was amazing. The atmosphere was great, the venue inspiring, the speakers extraordinary and the attendees incredible. We had such a great time! Personally, this is my kind of conference. Where else can you discuss the importance of structured data and the power of entities in semantic search than at a conference like this? In case you missed it, or if you want to relive it — we asked some of our colleagues their SEO-related takeaways from this year’s YoastCon.

YoastCon was two days of SEO goodness

First, let’s take a step back. YoastCon 2019 was our two-day SEO and online marketing conference held on February 7 and 8 in Nijmegen, The Netherlands. In two days, attendees saw talks about many topics, from link building to site migrations and SEO copywriting to artificial intelligence in search. There were super smart guys from Bing and Google, the latter declaring their love for the WordPress CMS. Some of the talks will be available on our YouTube channel soon, while others will be exclusive material for subscribers to our online SEO training courses.

Joost de Valk and Alberto Medina from Google talking about WordPress at YoastCon

This is what stuck with us the most:

Marieke van de Rakt – CEO of Yoast

“Rand Fishkin’s talk got me really thinking about brand and the importance of a good brand. Throughout the conference that resonated in many of the other talks as well. Having a strong brand and a clear mission is really important. For me, that’s going to be a big focus in 2019.”

Joost de Valk – CPO of Yoast

“What surprised me most is how everybody seemed to be on the same track again, using the website as “home” and social accounts as “outposts”. This was different from a couple of years ago and it’s a trend that I fully welcome.”

Omar Reiss – CTO of Yoast

“My favourite takeaway from YoastCon was when Wolfgang Blau presented SEO as a global interest that lacks real organization. There is no network of stakeholders and experts protecting the interest of “findability” or “information availability” on the internet. This got me really intrigued and I’m sure I’ll spend some more thoughts on this topic in the future.”

Willemien Hallebeek – Content team lead

“I love this writing hack from Kate Toon: “If you don’t know how to start your article or get stuck quickly, write with a white font. By doing this you can’t start editing before you finish your draft.” So simple, but effective!”

Sjardo Janssen – Front-end developer

“I liked Jason Barnard’s talk about getting in the knowledge graph. His advice: Make sure other sites confirm facts about you, if you want to get a knowledge graph in Google and Bing. The more sites point out facts about you, the bigger the chance! And don’t forget to claim the knowledge graph!”

Melina Reintjens – Content manager

“My main takeaway was that SEO is a lot of work and that you’re never done optimizing your site. There’s always more to improve — both technically as well as content-wise. And that’s not all: you need to invest time into building your brand, managing your reputation, all while the big players on the internet keep changing the rules. So big kudos to everyone who’s working tirelessly on their site: you’ll get there!”

Alexander Botteram – React engineer

“I loved Jono Alderson’s dystopian future. We explored a world where each of us competes over finite resources, and where an all-seeing, all-powerful AI called Global decides who wins and who loses. The main rules to live by?

– Rule 1: You should be healthy.

– Rule 2: You should be creative.

– Rule 3: You should be popular.”

Caroline Geven – Creative online marketeer

“My main takeaway was not directly related to SEO, but I did get to practice something I learnt at the conference. I was so impressed by Geraldine DeRuiter’s talk on online harassment. A week later I received my first piece of hate online, but instead of it getting me down I got inspiration for a new post on dealing with this kind of stuff.”

My takeaway

I was impressed by how many of the people I spoke to were talking about Schema structured data and the underlying connectedness of everything. It seems that more people understand how important this technology is in helping search engines figure out what it all means. Search engines have figured this content thing out pretty well — they can assess quality and make assumptions based on what a piece of content is about. What they miss is how everything fits together in the grand scheme of things. We can help them with that. There was even a search engine at YoastCon whose lead engineer showed how to do that — here’s Bing’s Arnold Overwijk:

“If you use markup data, you make our life much easier. We can show them as results, but we can also use it for machine learning.”

At Yoast, we share this view, and that’s one of the reasons why we’re rebuilding and improving our structured data content blocks for the new WordPress editor. Soon, you’ll be able to create content powered by structured data simply by dragging a block into your content and filling in the details. Job postings, reviews, recipes — you name it!
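As a rough illustration of what structured data for a recipe looks like, here’s a minimal schema.org Recipe snippet in JSON-LD (all values are made up for the example):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Pancakes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "recipeIngredient": ["2 eggs", "1 cup flour", "1 cup milk"],
  "recipeInstructions": [
    { "@type": "HowToStep", "text": "Whisk the eggs, flour and milk into a smooth batter." },
    { "@type": "HowToStep", "text": "Fry small pancakes in a hot pan until golden." }
  ]
}
</script>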

YoastCon 2019 takeaways

YoastCon 2019 was a great ride. We’re sure every attendee had a fantastic time and left with a list full of SEO tips. We’d like to thank each and every one of them for coming and making it such a memorable two days. Hope to see you at the next YoastCon!

The post Missed YoastCon 2019? Learn from our takeaways! appeared first on Yoast.

Google Ads 2019: What to look out for

Google Ads 2019: What to look out for

2018 was an eventful year for Google Ads. We saw a number of big changes and improvements including:

  • A re-branding – Google Ads was re-branded from Google AdWords, and according to Google this is more encompassing and makes it easier for businesses to advertise across its platforms.
  • A new interface – according to Google, it is faster and easier to use than the previous interface and includes more features to help you reach your advertising goals.

However, 2019 promises to be an even better year as advertisers take full advantage of the previous year’s changes and the new ones that are about to be released:

15-second non-skippable video ads

Advertisers running TrueView in-stream ads on YouTube can now benefit from 15-second non-skippable video ads. This has previously only been available via YouTube reservation, but it will now be available to advertisers running auction campaigns in Google Ads.

In an announcement, Google said:

“Today we’re expanding access to advertisers running auction campaigns. Recognizing that advertisers should have access to the full range of creative options regardless of how they buy – whether in advance via reservation or in the Google Ads auction.”

Google has started to roll this out to all advertisers, and full availability will be coming in the next few weeks. Google has also noted that it will cap the number of ads a user sees to ensure they have a great experience while watching YouTube videos.

As an advertiser you can now head over to Google Ads and set up a Video 360 or Display campaign, so you have a wider range of creative lengths and viewer experiences to achieve your advertising goals.

Paying for conversions

An exciting new update for advertisers in 2019 is the option to pay for conversions for display ads. This is available for Display campaigns only and means you won’t pay for clicks, but only when visitors convert on your website or app.

This option is only available if you’re using a Target Cost Per Acquisition (CPA) bid strategy in a Display campaign. The main benefit is that you’re not charged for clicks or impressions like traditional PPC campaigns, but only for sales or leads that your campaigns receive.

Also, you won’t pay more than your Target CPA bid. For example, if your Target CPA is $5 and you drove 30 conversions over the week, you’ll pay only $150. This is a welcome update for advertisers on the Display Network where conversion rates are historically lower than Search campaigns.

So, during campaign setup you can opt to pay for conversions, as seen below:

opting for conversion bidding in Google Ads

Responsive search ads

Responsive search ads are the latest update to text ads. These larger text ads are the future of search ads, as they take up more space in the search results and give advertisers more prominence.

responsive search ads on mobile

They include up to 15 headlines and 4 descriptions, which is far more than expanded text ads allow.

The Google Ads system rotates the headlines and descriptions in your ads and shows the combination most relevant to the search term. This saves you a lot of time too, because you don’t have to create many ad assets and can instead focus on performance.

Rising competition

Competition on Google is on the rise. According to Google there are now over 4 million businesses advertising their products and services on the Google Ads platform. So, as an advertiser you have many competitors to contend with.

Research from WordStream reveals that click costs have increased too. It’s now common to see costs higher than $10 per click, and not just for insurance, loan and mortgage keywords.

With new advertisers joining Google Ads each year, it’s inevitable that costs will continue to rise. So, you should take action like updating ads, search terms and keywords to make them highly targeted and differentiated.

Also, look to achieve high ad positions by improving quality scores and not by increasing your bids.

Mobile search increase

According to Statista, mobile searches in 2018 accounted for over 51% of searches. And the number of people in the US using smartphones for search will increase to 220 million by 2020.

So the implications are clear. As an advertiser you should be working to offer a good user experience for your visitors.

That includes:

  • providing a fully responsive website
  • providing a mobile website that you can direct traffic to
  • using mobile bid modifiers to increase ad position on mobile devices (ranking at the top is especially important on mobile)

Although conversion rates are still lower on mobile than desktop, that is set to change too. Users are becoming more comfortable completing transactions on mobile devices, and this is set to increase in 2019 and beyond.

Cross-device attribution reporting

Cross-device reporting is nothing new to Google Ads, and reports like the Devices report have long allowed advertisers to attribute conversions to devices. However, cross-device reporting is now coming to Google Analytics.

You can now segment and visualize your data across devices and see how each device is helping achieve conversions for your Google Ads campaigns.

These reports live within your Audiences section in Google Analytics and help you see how many devices are used to access your website. You can also see the last five devices that contributed to a conversion, as well as the relationship between acquisitions and conversions.

cross-device attribution reporting in Google Ads

Summary

2019 promises to be an interesting year for Google Ads. Google will be releasing new features and updating existing ones like the new interface released in 2018. As an advertiser you should look out for new releases to help you reach your goals.

The post Google Ads 2019: What to look out for appeared first on Search Engine Watch.