How to Find Soft 404 Errors on Your Website

A soft 404 is a type of error where a web server returns a 200 OK status code (indicating that the request succeeded), even though the delivered page doesn’t contain the expected content, and a 404 Not Found status would have been the appropriate response.

Think of a page with little or no content, a page showing an error message, or a search results page without any results – that's what a soft 404 error looks like in the browser, even though the server sends a 200 status code in the HTTP response headers as if there were no problem.

Why Are Soft 404 Errors Problematic?

Soft 404 errors create a bad user experience, just like regular 404 errors. Clicking on a link, waiting for the page to load, and then not finding the expected content is frustrating and gravely damages the website’s credibility.

It can also impact the site's search rankings if users encounter a soft 404 and quickly leave the page. Bounce Rate and Time on Page are two important metrics that influence a website's SEO performance and signal to search engines how relevant and valuable the content is. In addition, soft 404s waste valuable crawl resources: search engines keep crawling worthless pages instead of important ones, which can reduce crawl frequency, hurt indexation, and ultimately diminish the website's search visibility.

How to Detect Soft 404 Errors?

Detecting soft 404 errors is tricky. You can’t trust the HTTP status code returned by the server but have to examine the page content. Standard link checkers don’t do this and therefore fail to identify soft 404s.
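
As an illustration of this content-based approach, here is a minimal Python sketch. The URL, the length threshold, and the error phrases are placeholder assumptions, and the requests package must be installed; it simply flags a page as a potential soft 404 when it returns 200 OK but looks empty or error-like:

import requests

# Phrases that often appear on error pages that still return 200 OK.
# This short list is illustrative only; reliable detection requires a
# much larger set of content patterns.
ERROR_PHRASES = ["page not found", "no results found", "nothing was found"]

def looks_like_soft_404(url):
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        return False  # a real error status was returned, so not a *soft* 404
    text = response.text.lower()
    too_thin = len(text) < 512  # hardly any content
    has_error_phrase = any(p in text for p in ERROR_PHRASES)
    return too_thin or has_error_phrase

print(looks_like_soft_404("https://www.example.com/some-page"))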

Our link checking solution, on the other hand, relies on a large database of content patterns to automatically identify different kinds of soft 404s on a website. Starting with the Professional plan, detected soft errors are reported under the “Soft errors” tab in the sidebar.

Soft Errors in Dr. Link Check

An alternative (and free) way to identify at least some of the soft 404 errors is to check out the site’s “Indexing → Pages” report in Google Search Console. This report lists crawl errors, including soft 404s, that Google encountered when indexing your pages.

Google Search Console: Soft 404

Another resource you should take a look at is your website’s analytics data. Try finding pages with particularly high bounce rates or low time-on-page values as these are indicators of soft 404 errors.

Last but not least, verify that your server actually sends a 404 status code if a non-existent resource is requested:

  • Open a new browser window or tab.
  • Enter a URL with your website’s domain that you are certain should result in a 404 Not Found error (such as https://www.example.com/this-page-does-not-exist).
  • Open the browser’s developer tools (Control + Shift + I on Windows or Linux, Command + Option + I on macOS).
  • Select the “Network” tab and press Control + R (or Command + R on macOS) to reload the page.
  • Check which code was returned for the page in the “Status” column.

Chrome DevTools: 404

If the request was not redirected to a different URL and the server responded with code 200, you have stumbled upon a soft 404 error.
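
If you prefer to script this check, the following Python sketch performs the same test. The URL is a placeholder, and the requests package is assumed to be installed:

import requests

# A URL on your domain that should not exist
url = "https://www.example.com/this-page-does-not-exist"

# Don't follow redirects, so we see the status code of the requested URL itself
response = requests.get(url, allow_redirects=False, timeout=10)

if response.status_code == 200:
    print("Soft 404: the server returned 200 OK for a missing page")
else:
    print(f"The server responded with status code {response.status_code}")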

What Causes Soft 404 Errors?

Soft 404 errors are frequently the result of an incorrect server configuration or a programming error. Here are two real-life examples:

A website hosted on an Apache web server had a line similar to this in its .htaccess file to configure a custom 404 error page:

ErrorDocument 404 https://www.example.com/404.html

Instead of serving the content of the 404.html file directly, the server redirected to the URL https://www.example.com/404.html and returned the 404.html file with a 200 OK status. Changing the line to

ErrorDocument 404 /404.html

fixed the issue.

In a different case, a website had a custom “404 Not Found” page with the following PHP code at the top:

<?php header("Status: 200 OK"); ?>

This line resulted in 200 OK being sent instead of the correct 404 code.

Sometimes soft 404s are also remnants of changed website structures or removed content. Products that are no longer available may leave behind empty search result pages, and moved blog posts may leave behind empty categories. In situations like these, it can be a good idea to simply remove the empty pages or the links pointing to them.

If that’s not possible or practical, you can keep search engines away from these pages by adding a disallow rule to your site’s robots.txt file, or exclude them from the index by including a meta robots tag with the “noindex” value (<meta name="robots" content="noindex">) in your pages’ HTML code.

Conclusion

Soft 404 errors can significantly impact a website’s user experience and search engine visibility. Website owners can identify these errors through the use of tools such as Dr. Link Check and Google Search Console and by carefully examining the website’s analytics. Resolving soft 404s may involve reviewing the server’s configuration files and delving into the website’s source code.


How to Reduce Bounce Rate and Increase Session Duration

When you’re sifting through your website analytics, one of the most important metrics you’ll find is your bounce rate. Bounce rate is the percentage of visitors who leave your website after viewing only a single page. This generally means that they didn’t find the page interesting enough to continue browsing your site, let alone buy anything from your business.

Bounce Rate in Google Analytics

Entrepreneurs should always look for ways to lower the bounce rate of their web pages. After all, a lower bounce rate means that visitors are spending more time browsing your content and your online store, which will lead to more customer conversions and more sales. Fortunately, you can use many smart strategies to reduce your bounce rate and keep visitors around for longer. Here are 11 ways to lower your bounce rate and increase session duration.

1. Improve Your Website Loading Speed

Improving the loading speed of your website is one of the best things you can do. This simple change can almost instantly reduce your bounce rate, increase the average session duration of viewers, and enhance your search engine rankings. Of course, this will also have a positive impact on how people react to your website and how many viewers turn into customers.

Tricks such as compressing your content, minimizing the number of HTTP requests, and loading certain files asynchronously can help speed up your website. It’s also important to get high-quality hosting. However, to ensure that your website is as efficient as possible, you might want to ask a professional web developer to help you out. Even a one-second difference in average loading time can have a significant impact on session duration and conversions.

2. Make Your Website Easier to Navigate

Another way to reduce your bounce rate and keep visitors around longer is to make your website easier to navigate. If people can’t instantly find what they need on your website, they’re likely to get frustrated and leave. As such, you’ll want to ensure that everything is easy to find and that potential customers have no problem finding what they’re looking for.

Many websites handle this by using large navigation buttons for important parts of their website, such as their online shop and their FAQ page. Providing internal links between pages to link people to things they might be interested in can also help. Asking people to test your website for usability can help you tackle potential problems and improve the ease of navigation.

3. Fix Broken Links

Encountering a page with missing images or a “404 Not Found” message when clicking on a link is an immediate turnoff for many visitors. Errors like these make your site look unprofessional and unmaintained.

Use our broken link checker service to identify dead links and fix them before they affect your reputation and drive away potential customers.

4. Improve the Aesthetics of Your Website

Sometimes keeping visitors on your website is all about aesthetics. If someone visits your site and finds that it looks like a website from the 1990s, they’ll probably think that your business is old and outdated. Even though less is more sometimes, a visually unappealing website can cause visitors to swiftly leave.

While your website doesn’t need to be too ostentatious, a few visual upgrades and an attractive template can go a long way. You might even want to ask a web design service to help you make your website look as good as possible while still ensuring that it loads fast and is easy to navigate.

5. Use Internal Links in Your Content

Using internal links throughout your website has all kinds of benefits. Internal links can help you improve your Google ranking for certain keywords, which will help you gain more visitors. What’s more, if you use internal linking appropriately, visitors are much more likely to click on these links and keep exploring your website, leading to a huge reduction in your bounce rate.

You should use internal links in your blog posts to link relevant keywords to other helpful pages on your website. You should also include a call-to-action (CTA) on each page that leads viewers to your online store. Adding internal links between relevant products in your online store can also help you keep users browsing and boost your customer conversions.

6. Add Interactive Content to Your Blog Pages

Adding some enticing interactive content to your blog posts is an excellent way to increase the average session duration of visitors to your website. After all, people will naturally stay on your website longer if they’re watching a video, doing a quiz, or exploring a fascinating interactive infographic.

These features can also help you reduce your bounce rate. When your blog posts offer engaging features like videos, quizzes, and infographics, people will get invested and read more of your content. You can even enhance your sales by using these interactive features to lead people to your online store and including some interactive content on your product pages.

7. Ensure That Your Website Is Mobile-Friendly

One of the biggest causes of high bounce rates is websites that aren’t mobile-friendly. Many consumers nowadays use their smartphones, tablets, and other portable devices to browse the internet. If your website doesn’t cater to these devices, you’ll lose tons of visitors who would simply rather use a website they can read and browse on their phone.

Making your website more mobile-friendly involves enhancing your layout, making text readable on small devices, and breaking content into small paragraphs to make it easier to read. Once again, you might want to ask a professional web design service for help to make your website more mobile-friendly, especially as it’ll boost session duration and reduce your bounce rate.

8. Create an Appealing Online Store

If you want to turn more of your website viewers into customers, you need to make sure your online store is fun, appealing, and easy to navigate. The more time people spend browsing your online store, the more likely they are to ultimately buy from you. Naturally, this will also have a great impact on your bounce rate and session duration.

There are a few tricks you can use to keep people engaged in your online store. Adding image links to related products on every product page can catch the attention of viewers. You should also include high-quality product images and even product videos to demonstrate your products. You should make sure your product pages load fast and are easy to navigate on all devices. Including large “Buy Now” or “Add to Cart” buttons can also help boost sales.

9. Eliminate Off-Putting Pop-Ups

Some websites blast visitors with unwanted pop-ups as soon as they visit. Between advertisements, pop-up boxes asking them to accept all cookies, and requests to sign up to an email list, visitors can become frustrated, and these features may cause them to instantly leave a website. As such, you’ll want to avoid them as much as possible.

While you need to ask visitors to accept cookies, you should do so with a small footer rather than a huge pop-up box. You should also avoid big, annoying ads in favor of organic links in your content. Instead of using pop-up boxes to ask people to join your mailing list or check out your products, add these CTAs to your blog pages or somewhere on your website where they’re less obtrusive.

10. Ask People to Give Feedback on Your Website

Collecting feedback from website visitors is one of the best ways to improve your website. This helps you instantly discover and solve problems with your website usability. For instance, you might find out that mobile users find it hard to browse your website. You can then work on making your website easier to browse on portable devices.

By finding and tackling these problems, you can impress more viewers, resulting in a lower bounce rate and higher average session duration. You might want to send out feedback surveys to your customers to ask them how easy it was to use your website. You could also pay for usability testing, where impartial testers thoroughly test your website and give you tips on how to improve its usability.

11. Optimize Your Product Pages with Images and Videos

Encouraging visitors to stick around longer on your product pages is one of the best things you can do. The more time they spend browsing your products, the more likely they are to ultimately buy something. As such, you’ll want to optimize your product pages as much as possible to prevent visitors from leaving.

High-quality images of your products can help. Offer pictures from every angle so customers can check out each product thoroughly. Product videos can also help, especially as these can make visitors invest a few minutes into discovering more about each product. Detailed product descriptions and product reviews can also keep people reading and entice them to make a purchase.

Conclusion

If you want to boost your Google search ranking and enhance your sales, lowering your bounce rate and increasing your average session duration can help. By focusing on these analytics and improving them, you’ll keep people around on your website much longer. This will result in a higher rate of customer conversions as well as a significant boost in future traffic.

These 11 strategies can help you significantly improve these analytics and enhance the success of your website. Not only can these tips help online businesses make more sales, but even if you’re not trying to sell anything, decreasing your bounce rate can help you bring more visitors to your site and build a bigger following.


How to Perform an SEO Audit on Your Site

If your website isn’t getting many views despite high-quality content, or if you’re seeing a steady drop in organic traffic, odds are that poor SEO practices are to blame. While plenty of great SEO companies and consultants are available to help you improve your page ranking factors, you might be surprised to learn how much you can do yourself with a minimal understanding of WordPress or HTML.

Before you spend any money on professional services, use this quick DIY audit guide to gauge the quality of your SEO.

Optimize Page Speed

SEO involves much more than just keywords. Site speed has been one of the most crucial factors for quite some time – most notably since Google introduced a dedicated page speed update back in 2018. If your site takes more than two seconds to load, it’s likely causing your page rankings to suffer. Data released directly by Google reveals that every extra second that your page takes to load dramatically increases bounce rate (the percentage of users who leave the site after viewing only one page).

Luckily, Google’s PageSpeed Insights tool can tell you exactly how long your loading times are, as well as point out specific areas for improvement so that you or your web developer can make the necessary changes to boost performance, improving both your SEO and user experience.
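
PageSpeed Insights can also be queried programmatically. As a rough sketch (the endpoint and response fields reflect version 5 of the API at the time of writing, and heavier use requires an API key), something like the following Python snippet retrieves the mobile performance score for a page:

import requests

api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

# For more than occasional requests, add your API key: params["key"] = "..."
data = requests.get(api, params=params, timeout=60).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")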

Test Mobile Performance

If your site isn’t optimized for mobile use, it’s time to make that a priority; the majority of online traffic is mobile, and that trend isn’t changing any time soon. In fact, Google now actually indexes the mobile version of websites first, so the lack of an adaptive, responsive, or mobile-first design that makes it easy for people to navigate on their devices will have a sizable negative impact on your ranking.

As with page speed, Google offers free tools to analyze mobile performance and fix any potential problems. Just use Google’s Mobile-Friendly Test to see how your site stacks up.

Look for Duplicate Sites

One of the most basic and most important checks you can run is to ensure that only a single version of your site is being indexed by Google. In an extreme case, search engines could see four different versions of your site:

  • http://www.example.com
  • https://www.example.com
  • http://example.com
  • https://example.com

While this makes no difference to a user browsing your site, it can cause big SEO problems by making it difficult for search engines to know which version to index and rank for query results. In many cases, separate versions of a site can even be interpreted as duplicate content, which further impacts your content’s visibility and rankings.

If you run a manual check and discover a mix of site versions, the easiest fix is to simply set up a 301 redirect on the “duplicate” versions to let search engines know which one to index and rank. Another option is to use a rel="canonical" tag on your individual web pages, which is just as effective as a 301 redirect but may require less time for you or your web developer to implement.
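
To see how your site currently behaves before (or after) making such changes, you can request each variant and check where it ends up. Here is a small Python sketch; example.com stands in for your own domain, and the requests package is assumed to be installed:

import requests

variants = [
    "http://www.example.com",
    "https://www.example.com",
    "http://example.com",
    "https://example.com",
]

# Each variant should either be the canonical version itself or send a
# 301 redirect to it.
for url in variants:
    response = requests.get(url, allow_redirects=True, timeout=10)
    first = response.history[0].status_code if response.history else response.status_code
    print(f"{url} -> {response.url} (first response: {first})")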

Clean Up On-Page SEO

While your site’s performance is a major ranking factor, it’s still important to consider ways of improving your traditional on-page SEO. In addition to creating great, unique content that provides value to the user, you’ll want to look at optimizing title tags, meta descriptions, and image alt tags, as well as improving your internal linking and pulling any bad outbound links.
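
Before reaching for a dedicated tool, you can spot the most obvious gaps yourself. The following simplified Python sketch (a placeholder URL; the requests and beautifulsoup4 packages are assumed to be installed) checks a single page for a title tag, a meta description, and images without alt text:

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Title tag present and reasonably short?
title = soup.title.string.strip() if soup.title and soup.title.string else ""
print("Title:", title or "MISSING", f"({len(title)} characters)")

# Meta description present?
description = soup.find("meta", attrs={"name": "description"})
print("Meta description:", "present" if description else "MISSING")

# Images without alt text?
missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
print("Images without alt text:", len(missing_alt))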

A number of great tools can help you find any problem areas or opportunities you have missed. Popular options that can quickly provide a list of actionable items include SE Ranking, SEO Tester, and SEMrush’s On-Page SEO Checker tool.

Another great SEO audit tool that focuses specifically on internal and outbound links is Dr. Link Check. It crawls your website and gives you a complete list of all its links. You can then filter that list down to show only the links you are interested in, such as broken links or dofollow links to external websites.

Check Your Backlinks

Backlinks are one of Google’s top three factors when determining page rankings, and they can be enormously beneficial when it comes to increasing traffic, as long as they’re legitimate, high-quality links. But there are also so-called “toxic” backlinks that can negatively impact your organic traffic and site rankings. In extreme cases, they can even result in a manual action being taken against your site by Google.

These harmful backlinks are usually the direct result of trying to game the system, whether it’s paying for links, joining shady private blog networks, submitting your site to low-quality directories, or blatantly spamming your links all over the web. While it’s an uncommon tactic, there are also cases of unscrupulous webmasters deliberately using toxic backlinks as a way to sabotage competitors. Even if you aren’t using any of these tactics yourself, it’s still worth doing a periodic check to make sure everything’s above board.

By using a combination of SEMrush’s Backlink Audit tool and the Google Search Console, you can determine whether you have any backlink issues to address. If you do find any problematic backlinks, the two main options are sending out removal requests via email and hoping they get removed, or using the Google Disavow Tool to tell Google that you want them to ignore certain links.

Know When to Hire an Expert

Keep in mind that the point of a DIY SEO audit is to conduct some simple checks that anyone who knows their way around WordPress or basic HTML can handle. If you’ve done everything on this list and still seem to be struggling to gain traction, or if you’re seeing a steady drop in organic traffic, the problem might be something more complex.

In that case, it’s important to know the limits of your own abilities. If you start poking around in code that you aren’t familiar with, there’s a very real risk of doing more harm than good. Hiring someone to fix mistakes is a costly headache that nobody wants to deal with. Instead, consider hiring an SEO specialist with more in-depth knowledge on the subject as soon as you know the problem is something you’re not sure how to fix. The good news is that if you’ve already done a basic audit, that means less work (and billable hours) that a professional needs to do before they can diagnose the issues.

While professional assistance is certainly necessary in some cases, the truth is that many of the most common SEO mistakes can be corrected without much technical know-how, thanks to the robust analytical tools and guidelines available. So before you spend your hard-earned money, walk through the steps outlined in this guide to identify your issues and see if it’s something that can be handled with a few minutes of your or your web developer’s time.


Great Web Design is More Than Just Pretty Pixels

Many users hit the back button within seconds of loading a web page, so it’s no surprise that lots of websites are designed primarily to look nice. Humans are inherently visual creatures, but a pretty design won’t make users overlook a site that works poorly. An ugly website that works great might turn away most visitors, but a pretty website that doesn’t work well won’t keep anyone around either. To create a successful website, it’s critical to not just make it look good, but also ensure that it’s easy to use, fast, secure, and in compliance with every applicable law.

Usability

Users probably landed on your website for a reason. Maybe they were interested in purchasing your product; maybe they needed to call your customer support. Regardless, your website should make it easy for them to find what they’re looking for. Navigation should be intuitive, clear, and efficient so that visitors don’t get frustrated and leave. If you want to encourage visitors to do something (like sign up for your service), include a clear call to action on the relevant pages.

Your site has to work well on every device a potential customer might be using. Mobile-friendliness is a basic expectation in today’s world. Accessibility and internationalization aren’t nearly as hard as they sound, and they make a huge difference for users who aren’t from your country and those with disabilities.

Professionalism is also a big part of usability. Not proofreading content is a surefire way to lose anyone who reads carefully. Broken links are a more subtle issue, but they can tarnish your reputation and users’ trust when a link goes somewhere unexpected. Periodically clicking every single link on your website is a waste of time, so services like Dr. Link Check make it easy to avoid giving visitors an unwelcome surprise.

Searchability

Even the most easy-to-use website won’t attract any customers if it isn’t ranked highly by search engines. Following basic SEO guidelines will significantly boost the number of hits your site receives. Simple tweaks like including <meta> tags on every page, using expressive <title> elements, and putting descriptive text in the alt attributes of images make it much easier for search engines to index and rank your pages.

Instead of manually adjusting the HTML on every page, your CMS or static site generator should offer some configuration options to automate these fixes. Additionally, a machine-readable XML sitemap is an easy automated addition that will help crawlers find every page you host on your site.
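
If your CMS or site generator can’t produce a sitemap for you, generating a basic one yourself is straightforward. Here is a minimal Python sketch; the URL list is a placeholder, and real sitemaps can also carry optional fields such as lastmod:

# Write a minimal XML sitemap for a hand-maintained list of URLs.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/contact",
]

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        f.write(f"  <url><loc>{url}</loc></url>\n")
    f.write("</urlset>\n")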

Performance

People are impatient when it comes to waiting around for web pages to load. Google found that 53% of visitors left when a page took more than three seconds to load. Making sure your website is fast (especially on mobile devices and connections) is an important and relatively easy way to significantly reduce the number of people who hit the back button before seeing any content.

Assets like images, JavaScript, and CSS files are some of the largest files your visitors will have to load from your site. In the case of images, be sure to downsize and compress them appropriately. For scripts and stylesheets, don’t forget to minify them. If page load times are still too high, try using a CDN, which will cache these large files near your users. If you’re using a CMS like WordPress, unnecessary plugins are another frequently overlooked source of slow load times: besides increasing render times, they can inject additional JavaScript and CSS that your users have to download.

Security

Not using HTTPS is like publicly advertising that you don’t wear a seatbelt: it’s risky, it looks bad, and it doesn’t give you any benefits. Search engines will rank HTTP-only pages lower than sites that allow secure connections. More importantly, you’re needlessly risking the security of your users’ data and most likely violating regulations like the GDPR.

If your site has any kind of login functionality, be sure to treat user data carefully. Use strong hashing functions to prevent attackers from stealing passwords if your server is compromised. Regardless of what your site does, follow security best practices on your server: use strong passwords, update server software frequently (and plugins, if applicable), use a correctly-configured firewall, and limit remote access over SSH and similar protocols.
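
As a rough illustration of salted, deliberately slow password hashing, here is a sketch using only Python’s standard library; dedicated schemes such as bcrypt, scrypt, or Argon2 are generally preferable in production:

import hashlib
import hmac
import os

# Store a salted, slow hash instead of the password itself, so that a
# leaked database doesn't immediately expose user credentials.
def hash_password(password, salt=None, iterations=600_000):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, expected_digest):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True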

Legal Requirements

It might be easy to forget about legal and compliance issues when your business is just getting off the ground, but a single violation could cost you a ton of time and money. Be sure to write a privacy policy that complies with regulations like the GDPR and CCPA for customers in the EU and California, and store user data appropriately. It’s far cheaper to get a lawyer involved while writing a privacy policy than it is to defend yourself from a lawsuit. As much as they are a nuisance, cookie consent messages are a requirement in many locations and are relatively easy to implement. Last but not least, be sure to adequately credit photographers and other content producers to avoid copyright hassles in the future.

Marketing Strategy

Not every lead comes from a Google search. Social media marketing is especially important today, and most successful sites use it to attract customers. Regardless of where it’s shared, good content marketing is also an important tool in your toolbox. Instead of just promoting your service, content marketing such as articles and videos also offers something of value to viewers. After learning something useful, they are more likely to check out your product. You can use A/B testing, where different visitors are shown different content, to gauge the effectiveness of your marketing or any other element of your website.

Sometimes, however, you do need to resort to direct advertising. Marketing tools like Google AdWords let you put your ads right where your most interested customers are already looking.

Maintenance and Monitoring

An unreliable or broken site is certainly a source of frustration for visitors. Too many site owners fail to implement a good backup policy, aren’t immediately notified when their site goes down, and don’t keep everything up-to-date and secure. Cloud providers like AWS and GCP offer easy ways to save snapshots of your servers, preventing you from losing everything if something goes haywire. Try a service like UptimeRobot to get a message as soon as your site goes down. You don’t want to find out that your site doesn’t work when customers start calling you.
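
For a very basic do-it-yourself alternative, a small script can poll your site and report when it stops responding. Here is a minimal Python sketch with a placeholder URL (the requests package is assumed); a hosted monitoring service is usually the more robust choice, since this only works while the machine running the script is itself online:

import time
import requests

URL = "https://www.example.com/"

# Check the site every five minutes and report failures
while True:
    try:
        ok = requests.get(URL, timeout=10).status_code < 400
    except requests.RequestException:
        ok = False
    if not ok:
        print(f"{time.ctime()}: {URL} appears to be down")
    time.sleep(300)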

Conclusion

As much as a good visual design is an important part of creating a successful website, everything under the skin is equally important. Keeping your site easy to find and use, fast and secure, and in compliance with regulations is less obvious than a pretty façade, but these things are just as important if you want to retain your customers and attract new ones.


Noindex vs. Nofollow vs. Disallow: What Are the Differences?

These days, most website owners are keenly aware of the important role that high quality content plays in getting noticed by Google. To that end, businesses and digital marketers are spending increasingly large amounts of time and resources to ensure that websites are spotted by search engine robots and therefore found by their target audiences.

But while every website owner wants high search engine rankings and the corresponding increases in traffic, there are certain areas of a site that are best hidden from the search engine crawlers completely.

Why hide parts of your website from search engines?

You might wonder why it’s good to keep crawlers from indexing parts of your website. In short, it can actually help your overall rankings. If you’ve spent lots of time, money and energy crafting high quality content for your audience, you need to make sure that search engine crawlers understand that your blog posts and main pages are much more important than the more “functional” areas of your website.

Here are a few examples of web pages that you might want the search robots to ignore:

  • Landing pages: Obviously, landing pages are super important for generating leads and even selling products directly. However, you might have certain landing pages that contain seasonal offers or are designed for specific (paid) advertising campaigns.
  • Thank you pages: Once your digital marketing expands, your website will probably contain multiple “thank you” pages where visitors are redirected after downloading a lead magnet or signing up to a mailing list. You’ll almost certainly want to keep these pages away from search robots, as they can appear thin in content and be interpreted as “spammy.”
  • PDF downloads: Following on from the above example, you’ll also want to ensure that any giveaway or download pages and their file attachments are hidden from search engines, as you certainly wouldn’t want them to be easily accessible without collecting an email address first.
  • Membership login pages: If your site has a member’s forum or client area, you’ll probably want to hide those pages from search engines too.

As you can see, there are plenty of instances where you should be actively dissuading search engines from listing certain areas of your site. Hiding these pages helps to ensure that your homepage and cornerstone content gets the attention it deserves.

How to hide parts of your website from search engines?

So how can you instruct search engine robots to turn a blind eye to certain pages of your website? The answer lies in noindex, nofollow, and disallow. These instructions allow you to define exactly how you want your website to be crawled by search engines.

Let’s dive right in and find out how they work.

The noindex instruction

As you can probably imagine, adding a noindex instruction to a web page tells a search engine to “not index” that particular area of your site. The web page will still be visible if a user clicks a link to the page or types its URL directly into a browser, but it will never appear in a Google search, even if it contains keywords that users are searching for.

The noindex instruction is typically placed in the <head> section of the page’s HTML code as a meta tag:

<meta name="robots" content="noindex">

It’s also possible to change the meta tag so that only specific search engines ignore the page. For example, if you only want to hide the page from Google, allowing Bing and other search engines to list the page, you’d alter the code in the following way:

<meta name="googlebot" content="noindex">

A bit more difficult to configure, and therefore less often used, is delivering the noindex instruction as part of the server’s HTTP response headers:

HTTP/1.1 200 OK
…
X-Robots-Tag: noindex

These days, most people build sites using a content management system like WordPress, which means you won’t have to fiddle around with complicated HTML code to add a noindex instruction to a page. The easiest way to add noindex is by downloading an SEO plugin such as All in One SEO or the ever-popular alternative from Yoast. These plugins allow you to apply noindex to a page by simply ticking a checkbox.
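
If you want to check by hand whether a page carries a noindex instruction in either place, a small script can do it. Here is a Python sketch with a placeholder URL; the requests and beautifulsoup4 packages are assumed to be installed:

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"
response = requests.get(url, timeout=10)

# noindex can arrive via the X-Robots-Tag HTTP header ...
header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()

# ... or via a <meta name="robots"> tag in the HTML
soup = BeautifulSoup(response.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
meta_noindex = bool(meta) and "noindex" in meta.get("content", "").lower()

print("Excluded from indexing:", header_noindex or meta_noindex)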

The nofollow instruction

Adding a nofollow instruction to a web page doesn’t stop search engines from indexing it, but it tells them that you don’t want to endorse anything linked from that page. For example, if you are the owner of a large, high authority website and you add the nofollow instruction to a page containing a list of recommended products, the companies you have linked to won’t gain any authority (or rank increase) from being listed on your site.

Even if you’re the owner of a smaller website, nofollow can still be useful:

  • If you’re a creative agency, you might have other companies’ logos embedded on your case study pages that could be confusing in image searches.
  • If you’re a blogger, your comments section might contain links that you don’t want to support.

Even if your pages only contain internal links to other areas of your website, it can be useful to include a nofollow instruction to help search engines understand the importance and hierarchy of the pages within your site. For example, every page of your site might contain a link to your “Contact” page. While that page is super important and you’d like Google to index it, you might not want the search engine to place more weight on that page than other areas of your site, just because so many of your other pages link to it.

Adding a nofollow instruction works in exactly the same way as adding the noindex instruction introduced earlier, and can be done by altering the page’s HTML <head> section:

<meta name="robots" content="nofollow">

If you only want certain links on a page to be tagged as nofollow, you can add rel="nofollow" attributes to the links’ HTML tags:

<a href="https://www.example.com/" rel="nofollow">example link</a>

WordPress website owners can also use the aforementioned All in One SEO or Yoast plugins to mark the links on a page as nofollow.

The disallow instruction

The last of the instructions we are discussing in this blog post is “disallow.” You might be thinking that this sounds a lot like noindex, but while the two are very similar, there are slight differences:

  • Noindex: Search robots will look at the page and any links it contains, but won’t add the page to search results.
  • Nofollow: Search robots will add the page to results, but will ignore the links within the page for ranking purposes.
  • Disallow: Search robots won’t look at the page at all.

As you can see, disallowing a page means that you’re telling the search engine robots not to crawl it at all, which signifies that it has no use at all for SEO. Disallow is best used for the pages on your site that are completely irrelevant to most search users, such as client login areas or “thank you” pages.

Unlike noindex and nofollow, the disallow instruction isn’t included in a page’s HTML code or HTTP response; instead, it goes into a separate file named “robots.txt.”

A robots.txt file is a simple plain text file that can be created with any basic text editor and sits at the root of your site (www.example.com/robots.txt). Your site doesn’t need a robots.txt file for search engines to crawl it, but you will need one if you want to use the disallow directive to block access to certain pages. To do that, simply list the relevant parts of your site in the robots.txt file like this:

User-agent: *
Disallow: /path/to/your/page.html

WordPress website owners can use the All in One SEO plugin to quickly generate their own robots.txt file without the need to access the content management system’s underlying file structure directly.
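
To verify that a disallow rule actually covers the URL you have in mind, you can test it against your robots.txt file. Python’s standard library includes a parser for this; the URLs below are placeholders:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt file

url = "https://www.example.com/path/to/your/page.html"
print("Googlebot may crawl this URL:", parser.can_fetch("Googlebot", url))
print("Any crawler may crawl it:", parser.can_fetch("*", url))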

How to scan your site for noindex, nofollow, and disallow instructions

Do you know for certain which parts of your website are marked as noindex and nofollow or are excluded from being indexed by a disallow rule? If you are not sure, you might consider taking an inventory and reviewing your past decisions.

One way to do such an inventory is to go to www.drlinkcheck.com, enter the URL of your site’s homepage, and hit the Start Check button.

Dr. Link Check’s primary function is to reveal broken links, but the service also provides detailed information on working links.

Report of noindex links

After the crawl of your site is complete, switch to the All Links report and create a filter to only show page links tagged as noindex:

  1. Double-click on “Filter” at the top of the report in order to turn the filter bar into text mode.
  2. Enter NoIndex = true into the text field.
  3. Press Enter to apply the filter.

Add noindex filter

You now have a custom report that shows you the pages that contain a noindex tag or have a noindex X-Robots-Tag HTTP header.

Report of nofollow links

If you want to see all links that are marked as nofollow, switch to the All Links report, click on Add… in the filter bar, and select Nofollow/Dofollow from the drop-down menu.

Nofollow filter

Report of disallowed links

By default, the Dr. Link Check crawler ignores all links disallowed by the rules found in the site’s robots.txt file. You can change that in the project settings:

  1. Open the Account menu at the top-right corner and select Project Settings.
  2. Click on Advanced Settings.
  3. Tick the checkbox next to Ignore robots.txt.
  4. Click Update Project to save the settings.

Ignore robots.txt setting

Now switch to the Overview report and hit the Rerun check button to start a new crawl with the updated settings.

Restart crawl

After the crawl has finished, open the All Links report, click on Add… in the filter section, and select robots.txt status to limit the list to links disallowed by your website’s robots.txt file.

Disallowed by robots.txt filter

Summing up

While the vast majority of website owners are far more interested in getting the search engines to notice the pages of their websites, the noindex, nofollow and disallow instructions are powerful tools to help crawlers better understand a site’s content, and they indicate which sections of the site should be hidden from search engine users.


Click Depth Optimization: How To Improve Your Website’s Click Depth

When building a website, it’s important to consider how easy it is for visitors to navigate to subpages from the homepage. The homepage, of course, will likely generate the most traffic. If visitors can’t easily navigate to lower-level pages from there, your website’s performance will suffer. You can help visitors access relevant subpages by improving your website’s click depth.

What Is Click Depth?

Also known as page depth, click depth refers to the total number of internal links, starting from the homepage, visitors must click through to access a given page on the same website. Each click adds another level of click depth to the respective page. The more links a visitor must click through to access a page, the higher the page’s click depth will be.

Your website’s homepage has a click depth level of zero. Any subpages linked directly from the homepage have a click depth level of one, meaning visitors must click a single internal link to access them from the homepage.
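
Because click depth is simply the number of link hops from the homepage, it can be computed with a breadth-first crawl. The following Python sketch illustrates the idea; the homepage URL is a placeholder, the requests and beautifulsoup4 packages are assumed, and the crawl is deliberately capped at a couple hundred pages:

from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Breadth-first crawl from the homepage; the level at which a page is first
# discovered is its click depth (homepage = 0).
def click_depths(homepage, max_pages=200):
    domain = urlparse(homepage).netloc
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue and len(depths) < max_pages:
        page = queue.popleft()
        try:
            html = requests.get(page, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(page, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

for url, depth in click_depths("https://www.example.com/").items():
    print(depth, url)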

How Does Click Depth Impact SEO?

Click depth is a metric that affects user experience and, therefore, search rankings. Visitors typically want to access subpages easily, with as few links as possible. If a subpage requires a half-dozen or more clicks to access from the homepage, visitors may abandon your website in favor of a competitor’s site.

Google has confirmed that it uses click depth as a ranking signal. John Mueller, Senior Webmaster Trends Analyst at Google, talked about the impact of click depth during a Q&A session. According to Mueller, subpages with a low click depth are considered more important by Google than those with a high click depth. When visitors can access a subpage in just a few clicks from the homepage, it tells Google that the subpage is highly relevant. As a result, Google will give the subpage greater weight in the search engine results pages (SERPs).

How To Identify Pages With High Click Depth?

An easy way to determine if a site suffers from high click depth is to run a crawl with Dr. Link Check. Even though Dr. Link Check’s primary function is to find broken links, the service can also be used for filtering links based on their depth:

  • Go to the Dr. Link Check website, enter your site’s URL, and hit the Start Check button.
  • Wait for the crawl to complete and open the All Links report.
  • In the Filter panel at the top, click on Add… and select Link depth from the menu. Switch the filter condition from equals to greater than and enter the value 5.
  • Add a second filter option: Direction: Internal
  • Add a third filter option: Media type: HTML

Filter by link depth

Now you have a list of all internal page links with a click depth higher than five.

5 Tips To Improve a Website’s Click Depth

If the crawl revealed pages with a high click depth, it’s time to decide what to do about it. Here are five tips that will help you create a strategy for improving your site’s link structure.

1. Use a Narrow Hierarchy for Navigation Menu

The hierarchy of your website’s navigation menu affects your site’s average click depth. If the menu has only a few top-level categories with many nested levels below them, visitors must click through multiple category levels to reach lower-level subpages, which drives the average click depth up.

Using a narrower, flatter hierarchy for your website’s navigation menu, on the other hand, promotes a lower average click depth. With more top-level categories and fewer nested levels, visitors can reach subpages in fewer clicks.

2. Include Internal Links in Content

When creating articles, guides, blog posts, or other content for your website, include internal links to relevant subpages. Without internal links embedded in the content, visitors have to rely on your website’s navigation menu to locate subpages. Internal links in content offer a faster way for visitors to find and access subpages, which helps keep your website from suffering from a high average click depth.

Keep in mind that internal links are most effective at improving click depth when published on subpages with a low click depth. You can add internal links to all your website’s subpages, but those published on subpages with a click depth level of one to three are most beneficial because they are close to the homepage.

3. Use Breadcrumbs for Secondary Navigation

Another way to improve your website’s click depth is to use breadcrumbs for supplemental navigation. What are breadcrumbs? In the context of web development, the term “breadcrumbs” refers to links in a user-friendly navigation system that shows visitors the depth of a subpage’s location in relation to the homepage. An e-commerce website, for instance, may use the following breadcrumbs on the product page for a pair of men’s jeans: Homepage > Men’s Apparel > Jeans > Product Page. Visitors to the product page can click the breadcrumb links to go up one or more levels.

Breadcrumbs shouldn’t be used as a substitute for your website’s navigation menu. Rather, you should use them as a supplemental form of navigation. Add breadcrumbs to each subpage to show visitors where they are currently located on your website in relation to the homepage. You can add breadcrumbs manually, or if your website is built on WordPress, you can use a plugin to add them automatically. Yoast SEO and Breadcrumb NavXT are two popular plugins that feature breadcrumbs. Once activated, either of these plugins can be configured to automatically integrate breadcrumbs into your website’s pages and posts.

4. Create a Visitor Sitemap

You can also use a visitor sitemap to lower your website’s average click depth. Not to be confused with search engine sitemaps, visitor sitemaps live up to their name by targeting visitors. Like search engine sitemaps, they contain links to all of a website’s pages, including the homepage and all subpages. The difference is that visitor sitemaps feature a user-friendly HTML format, whereas search engine sitemaps feature a user-unfriendly XML format.

After creating a visitor sitemap, add a site-wide link to it somewhere in your website’s template, such as the footer. Once published, the visitor sitemap will instantly lower the click depth of most or all of your website’s subpages.

5. Don’t Overdo It

While optimizing your website for a lower average click depth can improve its performance, you shouldn’t overdo it. Linking to all your website’s subpages directly from the homepage won’t work. Depending on the type of website you operate, as well as its age, your site may have hundreds or even thousands of subpages. Linking to each one creates a messy and cluttered homepage without any sense of structure.

Final Words

A high average click depth sends the message that your website’s subpages aren’t important. At the same time, it fosters a negative user experience by forcing visitors to click through an excessive number of internal links. The good news is you can lower your website’s average click depth by using a narrow hierarchy for the navigation menu, including internal links in content, using breadcrumbs and creating a visitor sitemap. These strategies will help you improve your site rankings as well as improve your visitors’ experience.

