Since its founding in 1998, Google's main mission has been to return the best results for users' searches. It pursued this by reviewing and analyzing searches against several criteria; among the most important were how often a query was made and whether its keywords appeared in a website's content.
Naturally, when a query and its keywords appeared frequently, the websites that set out to answer that need received higher rankings.
It can be said that with this method, everything was progressing peacefully until…
Businesses discovered that higher rankings in Google's results could significantly improve revenue, brand awareness, popularity, and more. And that was the beginning of a great war.
The battle for Google rankings
Business owners in every field wanted higher rankings, so they filled their pages with keywords, related keywords, search terms, and so on, hoping to pull more traffic to their sites.
This undermined Google's main mission of improving search results around the needs of the audience, because people were trying to trick Google's algorithms.
And so began a war between Google and SEO experts.
Google's solution to this fundamental problem was to update its algorithms periodically, and often without warning.
These updates usually focus on content and how well it serves users.
Note: among the most recent updates is the Google Core Update of May 4, 2020.
For example, Google now thoroughly evaluates content quality: the range of related keywords, keyword repetition, the share of copied content, writing quality, sentence length, and so on. It also looks for unnatural links pointing to your site from other websites.
Meanwhile, website architecture matters more and more as competition for the top ranks gets heavier, because a user should not have to click ten times to reach what they want!
This is why Google updates have been very effective in improving the quality of search results.
The problem for many webmasters, marketers, bloggers, and online businesses, however, is that they can still be penalized by Google, even when they stay inside Google's red lines.
These penalties can cripple a business and remove it from Google's search results.
What is the Google penalty?
A Google penalty means that a site, under the influence of a Google update, has dropped in rankings, or has been punished for violating the Webmaster Guidelines.
Google often does not clearly explain this change and drop in rankings, and it can depend on various factors.
As a webmaster, you should work out the nature of an update by comparing notes with other webmasters, reading SEO articles, and similar methods, and then improve your website accordingly.
Next, take some time to analyze what happened, and check whether your ranking change was really caused by this update or not!
Another reason your site may be penalized is a manual action, meaning your site was probably reviewed by a Google employee after being flagged for not following the guidelines and red lines. While these penalties can be serious and very dangerous, they are relatively easy to fix, because the warnings appear as reports in your Search Console account.
What are the reasons for website penalties?
Below, we review some of the most important issues that can put your site at risk of a penalty:
1) Buying backlinks or external links
Buying backlinks can raise your Google ranking quickly, but it can just as easily work against you; in those cases it usually leads to a ranking drop, removal from search results, and finally a penalty.
The most commonly purchased links are known as site-wide backlinks, placed in the header, footer, sidebar, or another template area of the linking site. Because these external links repeat on every page with the same anchor text, your site receives as many links as the linking site has indexed pages, all with identical anchor text.
2) Too many outgoing links
If a page has an unusually large number of outgoing links to another site, its ranking will drop, and the drop is worse when the anchor text contains a keyword. It is better to mark non-essential outbound links on the page, such as links in the comments section, source references, and so on, as nofollow.
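As a rough illustration (not an official Google tool), a small script can audit a page's outbound links for missing rel="nofollow" attributes. The class name, domain, and sample HTML below are all invented for the sketch:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkAuditor(HTMLParser):
    """Collects external <a> links that are missing rel="nofollow"."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.flagged = []  # external links without nofollow

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        # Only external links matter here; internal links have no netloc.
        if host and host != self.own_domain:
            rels = (attrs.get("rel") or "").split()
            if "nofollow" not in rels:
                self.flagged.append(href)

html = (
    '<a href="/about">About</a>'
    '<a href="https://example.org/page">partner</a>'
    '<a rel="nofollow" href="https://other.net/x">source</a>'
)
auditor = OutboundLinkAuditor("mysite.com")
auditor.feed(html)
print(auditor.flagged)  # → ['https://example.org/page']
```

Links the auditor flags are candidates for a rel="nofollow" attribute or removal; the final call is still editorial.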
3) Copied content
The Panda algorithm is responsible for penalizing sites whose content production relies on duplication, or whose share of copied content exceeds their unique content.
This algorithm is one of the strongest and smartest algorithms of Google.
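Google has not published Panda's internals, but a common generic technique for spotting near-duplicate text is Jaccard similarity over word "shingles" (overlapping word groups). The sketch below uses that technique purely for illustration:

```python
def shingles(text, k=3):
    """Break text into overlapping k-word shingles for comparison."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of two texts' shingle sets (0.0 .. 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "google updates its ranking algorithms several times a year"
copied = "google updates its ranking algorithms several times a year"
fresh = "write unique useful articles for your readers instead"
print(similarity(original, copied))  # → 1.0 (identical)
print(similarity(original, fresh))   # → 0.0 (no shared shingles)
```

A score near 1.0 between your page and another indexed page is the kind of overlap that duplicate-content filters are designed to catch.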
4) 404 error pages
It is better to clean up the site's 404 pages: if links still point to a deleted page, that page may be indexed again, which in turn puts worthless, duplicate content in the index.
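To find internal links that still point at deleted pages, you can cross-reference a crawl of your own links with the HTTP status of each target. The function and sample data below are a hypothetical sketch, not a real crawler:

```python
def dead_link_report(links, statuses):
    """links: {source_page: [target_url, ...]} from an internal crawl.
    statuses: {url: http_status} from fetching each target.
    Returns (source, target) pairs where the target is a dead (404) page,
    i.e. candidates for a 301 redirect or link cleanup."""
    return [
        (src, tgt)
        for src, targets in links.items()
        for tgt in targets
        if statuses.get(tgt) == 404
    ]

links = {"/blog/post-1": ["/old-page", "/contact"],
         "/about": ["/old-page"]}
statuses = {"/old-page": 404, "/contact": 200}
print(dead_link_report(links, statuses))
# → [('/blog/post-1', '/old-page'), ('/about', '/old-page')]
```

Each reported pair means either the link should be updated or the dead target should be redirected to a live replacement.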
5) Excessive use of keywords
Gone are the days of repeatedly using the keyword in the text to get a better ranking!
According to the updates of recent years, the more you repeat a keyword in the text, the further the page's ranking for that word will fall.
Instead, it is better to use keywords related to the main term. You can find these words through Google's related searches and tools such as Ahrefs, Moz, and others.
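Google publishes no official "safe" keyword-density threshold, but a quick density check can expose obvious stuffing. A minimal sketch (the function name and sample text are invented):

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

text = ("buy shoes online, best shoes, cheap shoes, shoes shoes shoes "
        "for every occasion")
print(round(keyword_density(text, "shoes"), 2))  # → 0.46 (heavy stuffing)
```

A density of 46% is a caricature, of course; the point is to measure repetition rather than guess at it, and to rewrite pages where one term dominates the text.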
6) Using broken links
One or two broken, non-functioning links on a site is normal, but as their number grows they become a problem for users, and that runs against Google's policies.
7) Improper redirects
Using 301 redirects can, in some situations, improve the ranking of the destination page, but if the following three points are not observed, the site's standing will be seriously damaged:
– After redirecting page A to page B, every internal link that points to page A must be edited to point to page B, because any redirected link inside the site slows loading and wastes Google's crawl budget, and both factors directly hurt rankings.
– Roughly 70 to 80% of the redirected page's authority is transferred to the destination page, and page A passes no further authority to other pages of the site.
– For redirects, it is better to use only the 301 type, and keep in mind that fewer redirects mean a better-optimized site.
Be as meticulous as possible when using redirects.
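Following the first and third points, an internal redirect map can be collapsed so that every old URL jumps straight to its final destination instead of passing through a chain. A hypothetical sketch:

```python
def flatten_redirects(redirects):
    """redirects: {old_url: new_url}. Returns a map where every old URL
    points directly at its final destination, so a chain like
    /a -> /b -> /c becomes /a -> /c and /b -> /c."""
    flat = {}
    for src in redirects:
        seen = {src}
        dst = redirects[src]
        while dst in redirects:   # follow the chain to its end
            if dst in seen:       # guard against redirect loops
                break
            seen.add(dst)
            dst = redirects[dst]
        flat[src] = dst
    return flat

chain = {"/a": "/b", "/b": "/c"}
print(flatten_redirects(chain))  # → {'/a': '/c', '/b': '/c'}
```

The flattened map can then be written back into the server configuration so each request triggers at most one redirect hop.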
8) Hosting problems
Site loading speed is a very important factor, and it depends in part on the quality of the site's hosting. If your host's uptime is below 99%, Google's crawlers will run into repeated errors, which ultimately leads to a ranking drop and server-error reports in the site's Search Console.
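It is worth quantifying what an uptime percentage actually allows: 99% uptime already permits several hours of downtime per month. A small calculation, assuming a 30-day month:

```python
def downtime_hours(uptime_percent, days=30):
    """Hours of downtime implied by an uptime percentage over `days` days."""
    return days * 24 * (100 - uptime_percent) / 100

print(downtime_hours(99.0))              # → 7.2 hours down per 30-day month
print(round(downtime_hours(99.9), 2))    # → 0.72 hours (about 43 minutes)
```

Seen this way, the jump from 99% to 99.9% uptime is the difference between crawlers hitting a dead server for hours versus minutes each month.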
9) Hidden links
If your site contains links hidden from users with CSS, it is better to delete them or make them visible right away, because hiding them amounts to deceiving Google and breaks the rules of its algorithms.
10) Page loading speed
Loading speed is one of the factors Google has officially announced as important for ranking. It can be said that speed and mobile-friendliness were the two most important factors for improving a site's SEO in 2020.
11) Site hacking
Websites are hacked for abuse: either phishing and stealing user information, or exploiting the site's authority for black-hat SEO, such as redirecting pages or hiding links to sites the hacker wants to promote.
12) Link exchange
Exchanging links within articles or page-to-page does not by itself cause a penalty, but done in bulk it makes Google suspicious of the site.
In practice, though, such links are usually simply ignored by Google.
13) Cloaking
Deceiving the user and Google, known as cloaking, is a black-hat technique in which the user is directed to one page while the search-engine crawler is sent to another. In short: with cloaking, the user sees one piece of content and the search engine sees another.
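One rough way to check a site (your own, or one you suspect was hacked) for cloaking is to fetch the same URL once with a normal browser user-agent and once with a crawler user-agent such as Googlebot's, then compare the two responses. The comparison step might look like this sketch; note that legitimately dynamic pages can also differ, so a mismatch is only a hint, not proof:

```python
import hashlib

def fingerprint(html):
    """Stable fingerprint of a page body, ignoring case and extra whitespace."""
    normalized = " ".join(html.split()).lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

def looks_cloaked(browser_html, crawler_html):
    """True if the page served to a browser differs from the page
    served to a crawler user-agent – a hint (not proof) of cloaking."""
    return fingerprint(browser_html) != fingerprint(crawler_html)

print(looks_cloaked("<p>Welcome!</p>", "<p>Welcome!</p>"))         # → False
print(looks_cloaked("<p>Welcome!</p>", "<p>keyword keyword</p>"))  # → True
```

In a real check, the two HTML strings would come from two `urllib.request` fetches that differ only in their `User-Agent` header.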