A new SEMrush study highlights the most common SEO errors of 2020, errors that can play an important role in hurting your website's ranking in search engines, mainly Google. Recognizing and avoiding these errors gives you a competitive advantage and helps you reduce technical problems, which improves both the performance and the visibility of your site across the web. In this article, we will detail these SEO errors, with statistical data, so that you are aware of them and can do the right thing.
Please note that some of these errors are basics, and many of you may feel that such problems don't count for much when optimizing a website for search engines. Yet according to the SEMrush statistical study, most of the websites penalized in 2020 have these issues in common.
Here are the eight most common SEO errors that may cost you your precious positions:
Ignoring HTTP status and server issues
The most critical technical problems with a site are often related to HTTP status codes and server behavior, which can cause traffic loss and degrade your Google positioning over the long term because content becomes inaccessible. Most content creators don't pay close attention to these errors, especially when they are busy running large websites and don't monitor their Google Webmaster Tools (Search Console) data.
The errors of this type are as follows:
404 errors (4xx codes): an HTTP error code returned by your website's server to indicate that it could not find a page at the requested address, so the page is inaccessible.
Pages not crawled: pages that robots cannot reach because your website's response time is too high or because your server refused access to the page. (I personally had this issue and wasn't aware of it for about a month, until I discovered by accident that many of my pages had been deindexed due to an XML sitemap plugin conflict. It took me months to get all the pages indexed and ranked again!)
Broken internal and external links: internal links that lead to pages that do not exist on your site, or external links that lead to pages that no longer exist on another site. In addition to hurting SEO, they also create a bad UX (user experience).
Broken internal images: an image file that no longer exists, or an image URL that is misspelled.
Under-optimizing and underestimating META tags
Your meta tags help search engines identify the content of your pages and associate them with the keywords and phrases that users are searching for. You must customize these tags for your topic and for each of your pages in order to stand out in the SERPs.
The most common Meta tag errors are:
Duplicate Title, Meta Description, and H1 tags: you must not have duplicates. These tags must be unique and tailored to each page.
Missing H1 tags: the H1 tag is the main title of your page. If it is missing, it is harder for Google to understand what your page is about.
Missing Meta Descriptions: well-written meta descriptions help Google understand a page's relevance and encourage users to click on your result in the SERP.
Missing ALT attributes: ALT attributes provide search engines with a description of the images in your content (you probably all know that, it's basic stuff). Still, a missing ALT text only hurts the visibility of your images in the SERPs in 2020.
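As a minimal sketch (assuming a hypothetical page at https://example.com with invented titles and file names), here is what unique, well-formed tags can look like on a single page:

    <head>
      <!-- One unique, descriptive title and meta description per page -->
      <title>Technical SEO Checklist for 2020 | Example Site</title>
      <meta name="description" content="A practical checklist of the technical SEO issues that hurt rankings the most, and how to fix them.">
    </head>
    <body>
      <!-- A single H1 that matches the page topic -->
      <h1>Technical SEO Checklist for 2020</h1>
      <!-- ALT text describing what the image shows -->
      <img src="/images/seo-audit-report.png" alt="Sample SEO audit report showing crawl errors">
    </body>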
Duplicated content
Duplicated content can penalize your site's ranking in search engines. You should avoid copying content from other sites, whether or not they are direct competitors. You are probably sick of hearing this, but there are still many site owners who copy or spin content and then complain about being hit by a new Google update. (I have done SEO audits for websites and was shocked by business owners who believed that duplicated content is good content as long as it brings direct visits!)
Duplicate content: to avoid duplicate pages in the SERPs, you can use rel="canonical" or 301 redirects.
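For example, as a minimal sketch (assuming a hypothetical article at https://example.com available under two URLs), the duplicate version can point search engines to the preferred one from its head section:

    <!-- Placed in the <head> of the duplicate page, e.g. a print or tracking-parameter version -->
    <link rel="canonical" href="https://example.com/common-seo-errors/">

Alternatively, a 301 redirect from the duplicate URL to the preferred URL removes the duplicate from circulation entirely.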
Neglecting the optimization of internal and external links
The most common link problems that can affect your SEO:
Links leading to HTTP pages on an HTTPS site: links to old HTTP pages can send mixed signals to search engines. All links must be checked and kept up to date.
URLs containing underscores: search engines may misinterpret underscores and not index your site correctly. It is better to use hyphens.
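As a minimal sketch (assuming a hypothetical page at https://example.com), an internal link should point to the HTTPS version of a hyphen-separated URL:

    <!-- Preferred: HTTPS target and hyphens in the slug -->
    <a href="https://example.com/technical-seo-checklist/">Technical SEO checklist</a>
    <!-- Avoid: http://example.com/technical_seo_checklist/ -->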
Forgetting elements that matter for crawling by robots
You should take care of any technical crawling issues you encounter, as some pages of your site may otherwise lose visibility to the robots that crawl it.
Problems encountered by robots on websites:
Nofollow attributes in internal links: internal links with the nofollow attribute prevent robots from circulating correctly through your site.
Sitemap.xml not found: a missing sitemap makes it more difficult for search engines to crawl and index your site's pages.
Invalid pages in the sitemap.xml: your sitemap.xml must not list pages that cannot be found. Make sure every URL in it returns a 200 status code.
Sitemap.xml not specified in robots.txt: in your robots.txt file, it is better to add a link to your sitemap.xml so that search engines can fully understand the architecture of your site.
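As a minimal sketch (assuming a hypothetical site at https://example.com), a robots.txt file that allows crawling and declares the sitemap location looks like this:

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml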
Ignoring the indexing of your site
Indexing issues can be caused by your tags, content, or the Hreflang attribute for sites with multiple languages.
Common problems with indexing:
Short / long Title tags: tags longer than about 60 characters get truncated in the SERPs, while very short titles waste an opportunity to tell search engines what the page is about.
Hreflang source code conflicts: Multilingual websites can mislead search engines if the Hreflang attribute conflicts with the source code of a specific page.
Incorrect Hreflang links: broken Hreflang links can cause indexing problems, for example when relative URLs are used instead of absolute URLs.
Low word count: thin pages should be avoided; pages should have substantial content and be as informative as possible.
Missing Hreflang and lang attributes: a page on a multilingual site does not contain the links or attributes search engines need to determine which language version to serve to users.
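As a minimal sketch (assuming a hypothetical site at https://example.com with English and French versions of the same page), hreflang annotations in the head should use absolute URLs and list every language variant, including the page itself:

    <link rel="alternate" hreflang="en" href="https://example.com/en/pricing/">
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/tarifs/">
    <!-- Fallback for users whose language is not listed -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/en/pricing/">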
Neglecting the validity of AMP pages
If you offer pages in AMP format, you must check their validity via Google Search Console. There is a report in Search Console that allows you to check whether your AMP pages are valid and to correct any errors.
AMP HTML problems: the HTML code does not conform to AMP standards. These issues can be related to your styles, layout, scripts, or page templates. Even if you use AMP plugins in a CMS like WordPress, you should still check the validity of your pages: the plugins are not perfect, are sometimes out of date, and can cause indexing and duplicate content issues.
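One thing worth verifying, shown here as a minimal sketch (assuming a hypothetical article at https://example.com), is that the regular page and its AMP version reference each other, which helps avoid the duplicate content issues mentioned above:

    <!-- In the <head> of the regular (canonical) page -->
    <link rel="amphtml" href="https://example.com/article/amp/">

    <!-- In the <head> of the AMP page -->
    <link rel="canonical" href="https://example.com/article/">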
Not optimizing site performance
Page loading time is very important in SEO. The loading speed of your pages is an essential element to take into account: the slower your site, the more you risk lower rankings, as well as losing visitors who are not willing to wait. On average, if your page takes more than 5 seconds to load, you have a problem.
The most common issues related to website speed performance are:
Slow page loading speed (HTML): you have to decrease the loading time of your pages by compressing the HTML code and correcting W3C code errors, because speed directly affects your ranking.
JavaScript and CSS files not cached: browser caching must be specified in the response headers.
Unminified JavaScript and CSS files: minifying these resources by removing unnecessary lines, comments, and white space can improve page loading speed.
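As a minimal sketch (assuming an Apache server; equivalent settings exist for Nginx and most hosts), browser caching for static assets can be enabled with mod_expires in the site configuration or .htaccess file:

    <IfModule mod_expires.c>
      ExpiresActive On
      # Let browsers cache CSS and JavaScript for a month
      ExpiresByType text/css "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
      # Cache common image formats for a year
      ExpiresByType image/png "access plus 1 year"
      ExpiresByType image/jpeg "access plus 1 year"
    </IfModule>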
As I mentioned at the beginning of this article, for many SEO experts and gurus these are basics of website optimization for search engines, yet they remain the most common errors behind poor website performance. Most of the time we chase advanced techniques to beat the competition and forget the simple basics, which end up costing us dearly.