Publishing original content to your website is, of course, critical for building your audience and boosting your SEO.
The benefits of unique and original content are twofold:
Original content delivers a superior user experience.
Original content helps ensure that search engines aren’t forced to choose between multiple pages of yours that have the same content.
However, when content is duplicated, whether accidentally or on purpose, search engines will not be duped and may respond with lower search rankings. Unfortunately, many businesses publish repeated content without being aware that they’re doing so. This is why auditing your site with a duplicate content checker is so valuable: it helps you recognize and replace such content as necessary.
This article will help you better understand what is considered duplicate content, and steps you can take to make sure it doesn’t hamper your SEO efforts.
How does Google define “duplicate content”?
Duplicate content is described by Google as content “within or across domains that either completely matches other content or are appreciably similar”. Content fitting this description can be repeated either on more than one page within your site, or across different websites. Common places where this duplicate content might be hiding include duplicated copy across landing pages or blog posts, or harder-to-detect areas such as meta descriptions that are repeated in a webpage’s code. Duplicate content can be produced erroneously in a number of ways, from simply reposting existing content by mistake to allowing the same page content to be accessible via multiple URLs.
When visitors come to your page and begin reading what seems to be newly posted content only to realize they’ve read it before, that experience can reduce their trust in your site and the likelihood that they’ll seek out your content in the future. Search engines have an equally confusing experience when faced with multiple pages with similar or identical content and often respond to the challenge by assigning lower search rankings across the board.
At the same time, there are sites that intentionally duplicate content for malicious purposes, scraping content from other sites that don’t belong to them or duplicating content known to deliver successful SEO in an attempt to game search engine algorithms. However, most commonly, duplicated content is simply published by mistake. There are also scenarios where republishing existing content is acceptable, such as guest blogs, syndicated content, intentional variations on the copy, and more. These techniques should only be used in tandem with best practices that help search engines understand that this content is being republished on purpose (described below).
Source: Alexa.com SEO Audit
An automated duplicate content checker tool can quickly and easily help you determine where such content exists on your site, even if hidden in the site code. Such tools should display each URL and meta description containing duplicate content so that you can methodically perform the work of addressing these issues. While the most obvious practice is to either remove repeated content or add original copy as a replacement, there are several other approaches you might find valuable.
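A duplicate content checker boils down to comparing pages after stripping away superficial differences. Below is a minimal sketch of that comparison step: it assumes you have already extracted the body text for each URL (the crawling and extraction parts are left out), normalizes the text, and groups URLs whose content is identical after normalization.

```python
import hashlib
import re

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting
    differences don't hide duplicated copy."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def find_duplicates(pages: dict) -> list:
    """Group URLs whose body text is identical after normalization.

    `pages` maps URL -> extracted page text. How you extract the text
    (e.g., with a crawler) is up to you; this sketch only performs the
    comparison step.
    """
    groups = {}
    for url, text in pages.items():
        digest = hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()
        groups.setdefault(digest, []).append(url)
    # Only groups containing more than one URL are duplicates.
    return [urls for urls in groups.values() if len(urls) > 1]
```

Exact-match hashing only catches fully duplicated copy; commercial tools typically also score near-duplicates, which requires fuzzier comparison than this sketch performs.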
Google Search Console’s URL Parameters tool helps you tell Google not to crawl pages with specific parameters. This can be a good solution if your site uses parameters to deliver content that is mostly the same with only minor changes (e.g., headline or color variations). The tool makes it simple to let Google know that your duplicated content is intentional and should not be considered for SEO purposes.
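On your own side, the same idea can be applied by collapsing parameterized URLs to a single canonical form before comparing or reporting them. The sketch below strips query parameters that only tweak presentation; the parameter names in `IGNORED_PARAMS` are hypothetical examples, not a standard list, so adapt them to your site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical presentation-only parameters; replace with your site's own.
IGNORED_PARAMS = {"color", "utm_source", "utm_medium", "sessionid"}

def canonicalize(url: str) -> str:
    """Drop presentation-only query parameters so variant URLs
    collapse to a single canonical form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))
```

For example, `canonicalize("https://example.com/shoes?color=red&size=9")` collapses to `https://example.com/shoes?size=9`, so two color variants of the same product page count as one URL.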
Source: Alexa.com SEO Audit
By actively checking your site for duplicated content and addressing any issues satisfactorily, you can improve not only the search rankings of your site’s pages but also make sure that your site visitors are directed to fresh content that keeps them coming back for more.
Got any effective tips on how you deal with on-site content duplication? Share them in the comments.
You’ve probably noticed we’re doubling down on our Schema structured data implementation. In Yoast SEO 11.0, we rewrote what we output and how we do that. Not only that, we put every piece of structured data in a neat, interconnected graph. We’re not done yet! In Yoast SEO 11.1, we’re introducing proper image markup and tying our Video SEO output into the graph.
Video SEO tied into the Yoast SEO Schema graph
Our Video SEO add-on for Yoast SEO helps your videos to show up in video search. In this new release, we make the plugin even more useful by adding the correct structured data. Not only that, we’ll also tie everything into the main graph as generated by Yoast SEO.
In the structured data code, you’ll find everything search engines need to make sense of the video, from duration to embed URL, and from video thumbnail to description. Search engines like Google may use this information to get your video into a carousel or give it a badge so it can be distinguished as a video in image search thumbnails.
We use Schema’s VideoObject to output the correct structured data and made video a real entity in our graph. The beauty of it is that you don’t need to do anything out of the ordinary to get search engines to pick up your video. Simply give it good meta data like titles and descriptions, add an attractive thumbnail and you’re good to go! The plugin will automatically generate all the valid Schema code in the background.
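The plugin generates this markup for you, but for illustration, here is a hedged sketch of what minimal `VideoObject` JSON-LD looks like if you were to emit it yourself. The property names (`name`, `description`, `thumbnailUrl`, `uploadDate`, `duration`, `embedUrl`) are real schema.org `VideoObject` properties; the values are made-up examples.

```python
import json

def video_object(name, description, thumbnail_url, upload_date,
                 duration, embed_url):
    """Build a minimal schema.org VideoObject as a JSON-LD dict."""
    return {
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": name,
        "description": description,
        "thumbnailUrl": thumbnail_url,
        "uploadDate": upload_date,   # ISO 8601 date
        "duration": duration,        # ISO 8601 duration, e.g. "PT2M30S"
        "embedUrl": embed_url,
    }

markup = video_object(
    name="How to plant a tree",
    description="A short how-to video.",
    thumbnail_url="https://example.com/thumb.jpg",
    upload_date="2019-05-01",
    duration="PT2M30S",
    embed_url="https://example.com/embed/123",
)
# Embed the JSON-LD in the page head.
print('<script type="application/ld+json">' + json.dumps(markup) + "</script>")
```

Yoast SEO additionally links this node into the page’s wider graph with `@id` references, which this standalone sketch omits.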
Another thing that was not final in Yoast SEO 11.0, was the way we handle images on a page and how we tie those into the graph. In Yoast SEO 11.1, we’re introducing a proper way to handle single images on a page. For the next version, we are also looking at ways of handling multiple images and how to determine the main one. Read all about how we generate the image parts for the Schema output.
Exposing the ImageObject is very helpful for image SEO purposes. Google has said many times that adding structured data to your images is beneficial. Now, you can give search engines loads of context for your images. As we know, they still struggle to figure out what’s in an image, so they need every bit of help they can get. Schema provides the context by describing what an image is and what its properties and metadata are. Keep this in mind when working on your image SEO — which you should do naturally, of course.
Yoast SEO retrieves the image caption if set, or falls back to the alt text if that’s set. It is easy to forget, but the caption and/or alt text are incredibly important for search engines. Please make use of them! Also, make sure that the filenames of your images are descriptive and recognizable. We have an extensive guide with loads of tips on image SEO; please read it.
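The caption-then-alt fallback described above is easy to mirror if you generate your own markup. This is an illustrative sketch, not Yoast’s implementation; `contentUrl` and `caption` are real schema.org `ImageObject` properties.

```python
def image_object(url, caption=None, alt=None):
    """Build a minimal schema.org ImageObject dict, preferring the
    image caption and falling back to the alt text, as described above."""
    obj = {
        "@context": "https://schema.org",
        "@type": "ImageObject",
        "contentUrl": url,
    }
    text = caption or alt  # caption wins; alt is the fallback
    if text:
        obj["caption"] = text
    return obj
```

If neither a caption nor alt text is set, the node simply has no `caption` property, which is one more reason to fill those fields in.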
Yoast SEO Premium: Better recognition of German keyphrases
In Yoast SEO Premium 10.1, we introduced word form support for the German language. This made it the second language, after English, to receive the full language support. In Yoast SEO 11.1, we’ve fine-tuned the language support. The plugin is now better at recognizing German keyphrases that include words with an i or e in between vowels (e.g., schrieen, schreien, speie). In addition, we’ve also improved the recognition of German 3rd person singular verb forms (e.g., “arbeitet”).
Update to Yoast SEO 11.1
Yoast SEO 11.1 not only features a number of Schema enhancements, improvements to our Video SEO add-on and better German language support, but also several bug fixes. You can find every change in the changelog of this release.
For the past couple of weeks, we’ve been improving our structured data support with an innovative implementation that includes a full graph. We’re not done yet! There’s still a lot to do and you can expect much more from us in the near future. Remember our structured data content blocks for WordPress’ new block editor?
In this video, Ross Tavendale has some great insights for you about Keyword Cannibalization. He explains the problems that could arise, how to determine if Keyword Cannibalization is a problem, which tools you can use, and how to evaluate issues.
Once upon a time, Greenlane had a client launch a new site in WordPress. It was in a staging area before launch, where the environment was properly blocked from Googlebot. Instinctively, upon hearing the news of the website going live, we decided to look for a robots <meta> tag.
Sure enough, every page was marked as “noindex, nofollow”. The developer forgot to remove the noindex tag. This tiny oversight could have easily cost the client hundreds of thousands of dollars if not caught. A major SEO issue averted!
Above I said “instinctively” because, well, this isn’t the first time I’ve seen a site launch set to block search engines. It’s probably not even the 50th. I worked on an eCommerce platform where many sites launched with this SEO issue. WordPress, as fine a platform as it is, makes it super easy to launch set to noindex. Since developers often build sites in staging areas, they’re wise to block bots from discovering their playground. But, in the hustle to push live an update or new design, they can forget a tiny (yet crucial) check box.
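The check itself takes a few lines. Here is a minimal sketch using only the Python standard library: it parses a page’s HTML for a `<meta name="robots">` tag and reports whether a `noindex` directive is present. Fetching the HTML (with `urllib` or similar) and looping over your key URLs is left out.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Run it daily against your homepage and a handful of templates and you catch the forgotten staging checkbox before Google does. Note this only covers the meta tag; the `X-Robots-Tag` HTTP header and robots.txt need separate checks.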
It’s the role of an SEO to monitor websites to ensure proper crawlability at all times. I recommend monitoring daily; with these tools, the time commitment is very low. Make it part of your early morning coffee and donuts routine.
I’ve gathered up four different ways you can monitor your sites for SEO problems without the use of server logs or an education in server administration. There are different kinds of advanced website monitoring (e.g., active, passive), but I’m keeping it simple and applicable for anyone. I wanted to pick a few that were diverse, free or affordable.
1. SEORadar
SEORadar is a great tool from Mark Munroe. The purpose of the tool is to alert you when technical SEO changes occur. You can set monitoring to be daily or longer, and cover changes in robots.txt, meta robots, internal links, schema, etc. Plus, you can get these alerts emailed to you.
From the website’s homepage: “SEORadar examines changes to pages and alerts users to issues with potential dire SEO consequences; including title changes, noindex tags, broken canonicals, 302s and much more. SEORadar checks for over 100 distinct site changes and warns users when they occur. Users can configure their own tests if we missed ones they need!”
Here’s an example of the dashboard for one of our enterprise clients. All is (mostly) calm today with only 2 minor changes to pay attention to. When they’re critical, we jump:
The interface is smart, allowing you to drill deeper into the alerts for identifying the SEO issue. I like being able to view two different source code documents together with the changes highlighted:
2. Little Warden
Little Warden’s tagline is “Vital Alerts for Hidden Issues.” Little Warden is in the same class as SEORadar. I’m fond of the dashboard that gives you a quick glance at discovered issues:
The notification controls are pretty smart. Little Warden even integrates into Slack, allowing for instant notifications when an issue is found.
I also really like the “change history” interface:
3. Uptime Robot
While SEORadar and Little Warden can track a lot of technical SEO details, the biggest problem of all is website downtime. I’m sure we’ve all been there: it’s 2pm, you’re relaxed after lunch, when somebody finally notices the site is down. You do some digging, and it turns out the site has been down since 4am.
Most companies don’t think to look at their own website routinely. We all just assume it’s alive and well. All webhosts suffer from periodic downtime, but poor hosts – even in this day and age – can really screw you on this one. We’ve seen Google essentially bail on websites that are full of server errors.
Website monitoring services exist to keep you in the loop, from SaaS services like Pingdom and SiteUptime, to downloadable applications. I’m particularly fond of Uptime Robot because it’s easily configurable and free for 50 monitors. If your site goes down, not only do you get an email alert, but you’ve got a great interface to follow along:
Plus you can change the intervals, starting at 5 minutes. Keeps the control in your hands:
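Under the hood, an uptime monitor is just a scheduled HTTP probe. Here is a hedged, minimal sketch of one probe using only the standard library; the URL is an example, and in practice you would run this from cron every few minutes and send an alert whenever it returns "down".

```python
import urllib.request
import urllib.error

def classify(status_code):
    """Treat 2xx and 3xx responses as up; 4xx and 5xx as down."""
    return "up" if 200 <= status_code < 400 else "down"

def check(url, timeout=10):
    """One probe: return ('up'|'down', status code or error string)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify(resp.status), resp.status
    except urllib.error.HTTPError as e:
        # Server answered, but with an error status (e.g., 500).
        return "down", e.code
    except (urllib.error.URLError, OSError) as e:
        # DNS failure, refused connection, timeout, etc.
        return "down", str(e)
```

A hosted service like Uptime Robot adds the parts that matter in practice: probing from multiple locations, retrying before alerting, and keeping the uptime history for you.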
(Bonus) Daily Rank Tracking and Analytics
The truth is, you don’t necessarily need the above tools if you are not interested in reacting quickly. If you’re old-school and don’t want more tools in your arsenal, this tip is for you.
I was once pitched by an enterprise-level all-in-one SEO solution. When they discussed their rank tracking features, I asked, “can you track rankings on a daily basis?” There was a smug response: “no, we do weekly rank updates – but I can’t see a reason why you would ever want daily rank tracking.”
Well, on very large sites where popular pages are crawled every couple days, and many updates are being made, daily rank tracking can help you quickly discover where an issue lies. It can lead you directly to the page that has the unexpected problem. Not to mention, if you’ve been stung by a penalty, daily rank tracking (and analytics) will often be your first indicator. I like to know that day – not in a week.
Analytics isn’t any different. The data pours in fast on modern analytics platforms. In less than 24 hours, most sites will be able to provide enough clues that a problem is occurring. You just have to monitor routinely.
Daily Rank Tracking
Honestly, this isn’t my favorite method of monitoring for SEO problems, but it’s an easy one. If you have a page Google favors, it will get crawled often – as often as daily. If it’s a deeper, less important page (in Google’s perspective), I’ve seen it take months to be revisited. So, if you’re going to rely on rank tracking, choose daily tracking, and monitor your oft-visited pages. Clearly, this leaves a lot of monitoring on the floor – be aware, there’s plenty of room for error and latency.
Most analytics packages allow you to create a custom dashboard. Most also allow for daily emails. To make this work, your new job is to simply check your analytics a few times a day. For over a decade, this was the first thing I’d do when I got into the office. I wasn’t just looking at revenue; I was looking for hiccups and atypical patterns.
It’s almost like a heartbeat monitor in a hospital. Clearly, this is not as simple as allowing SEORadar, Little Warden or Uptime Robot to send you an alert, but useful nonetheless.
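If you’d rather not eyeball the dashboard yourself, the "atypical pattern" check can be automated with a very simple statistical test. This is a sketch, not a full anomaly detector: it flags today’s session count when it deviates from the recent average by more than a few standard deviations, and it assumes you feed it counts from comparable days (e.g., the same weekday).

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Flag today's sessions if they deviate from the recent mean by
    more than `threshold` standard deviations.

    `history` is a list of daily session counts from comparable recent
    days; `today` is today's count so far (or at the same hour cutoff).
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > threshold
```

For example, with a recent history hovering around 1,000 daily sessions, a day at 400 gets flagged while 1,005 does not. Real traffic has weekly and seasonal cycles, so a production version would compare like-for-like days rather than a raw rolling window.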
What do you use to monitor your website? Let us know in the comments.
After testing the new Search Console for more than a year, Google announced its release from beta last year.
Over that year, Google slowly rolled out the beta, eventually opening it to all Search Console users and migrating features from the old version to the new one. From the new UI to the new features, the tool is currently performing at its best.
But it’s difficult to keep up with Google Search Console updates, let alone integrate them into your search marketing mix. However, because SEO is ever evolving, these updates always come at a good time. The following is a guide on the newest features you might not have heard of yet, and how to make use of them to improve your search marketing.
1. Improved UI
The main source of confusion surrounding the new version of the Search Console has been how Google is handling the transition. For starters, not all features have been moved directly into the new version. All features and the reports they provide are being evaluated so they can be modified and presented to handle the modern challenges facing the SEO manager. Google even published a guide to explain the differences between the two versions.
Overall, the tool has been redesigned to provide a premium-level UI. As a marketer, this benefits you in one major way: without the clutter, you’re able to remain more focused and organized.
You can focus on the reports that matter most, and even the reports themselves demand less of your time because they’ve been made briefer. Monitoring and navigation are also more time-efficient.
These may not seem like a direct boost to your SEO efforts, but with this improved UI, you can get more work done in less time. This freed up time can then be channeled to other search marketing strategies.
2. Test live for URL inspection
The URL Inspection tool got an important update that allows real-time testing of your URL. With the “Test Live” feature, Google allows you to run live tests against your URL and gives a report based on what it sees in real time, not just the last time that URL was indexed.
Google says this is useful “for debugging and fixing issues in a page or confirming whether a reported issue still exists in a page. If the issue is fixed on the live version of the page, you can ask Google to recrawl and index the page.”
The URL Inspection tool is fairly new. It’s a useful tool as it gives you a chance to fix issues on your page. So Google doesn’t just notice what’s wrong with your page — it also tells you, allows you to fix it, and reindexes the page.
URL Inspection has other features: Coverage and Mobile Usability.
i. Coverage – This has three sub-categories:
Discovery shows the sitemap and referring page.
Crawl shows the last time Google crawled the page and whether the fetch was successful.
Indexing shows whether indexing is allowed and which canonical URL Google selected.
Overall, URL Inspection is a handy feature for easily identifying issues with your site. Afterward, you can then send a report to Google to help in debugging and fixing the identified issues. The feature is also useful for checking performance and making sure your site is SEO-optimized and your pages, indexed.
3. Manual actions report
From the menu bar, you can see the “Manual actions” tab. This is where you find the new Manual Actions report that shows you the various issues found on your web page.
As you’d expect, the report is brief and only shows the most important information. It can even be viewed as part of the report summary on the Overview page. If any issues are found, you can fix them and request a review from Google. The major errors that can be found and fixed from here are mobile usability issues and crawl errors.
This feature helps you, as a search marketer, to minimize the amount of time you take to review your website performance. It’s one more step to improving your website speed and overall performance because issues are quickly detected and fixed. And of course, it’s no news that speed is one of the key attributes of an SEO-friendly website.
4. Performance report
The “Performance” report feature was the first to be launched in the beta version of the Search Console so it’s been around for more than a year. It replaces Search Analytics and comes in a new and improved UI.
Compared to Search Analytics, the main strength of the new report is the amount of search traffic data. Instead of 3 months, the Performance report incorporates 16 months of data. This data includes clicks, CTR, impressions, and average ranking metrics at all levels (device, page, query, and country).
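The same 16-month window is available programmatically through the Search Console Search Analytics API (`searchanalytics.query`). The sketch below only builds the request body; authenticating and sending it with the `google-api-python-client` library, and your actual property URL, are left out, and the 30-days-per-month date arithmetic is a deliberate simplification.

```python
from datetime import date, timedelta

def performance_query(end=None, months=16):
    """Build a request body for the Search Console Search Analytics
    API covering roughly the full 16-month window the Performance
    report exposes."""
    end = end or date.today()
    start = end - timedelta(days=months * 30)  # rough month arithmetic
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query", "page", "device", "country"],
        "rowLimit": 5000,
    }
```

Pulling the data yourself lets you archive it beyond the 16-month window and join it with your analytics, which the web UI cannot do.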
You can use this data to optimize your website, improve mobile SEO, evaluate your keywords, check content performance and more. All these activities help improve your SEO.
5. Index coverage report
Index Coverage was launched alongside the Performance report. It’s an evolution of the previous Index Status and Crawl Errors reports.
Providing site-level insights, the Index Coverage report flags problems with pages submitted in a sitemap and shows trends for both the pages that have been indexed and those that could not be.
By allowing you to see the indexing and crawling issues from Google’s perspective, the report pinpoints the problems limiting your ability to rank high on the SERP.
The Google Search Console will continue to be one of the best free SEO tools out there. Every new feature adds to it a new ability to help marketers better manage their SERP appearances. If you care about where and how you appear on search engines, these and any future updates, including how to use them, will be of much interest to you.
Joseph is the Founder and CEO of Digitage. He can be found on Twitter @josephchukwube.
To highlight just how much harder it can be to compete in mobile search, and how big of a deal it is to keep up, we’ve conducted research. We’ve studied 50,000 random keywords in the US to find out just how different the SERPs are for the same search query on different platforms.