Myths, Spam and Penalties – Blackhat SEO

These days most people (at least those who are interested) have a reasonable idea of how search engines work. It wasn’t always like this, and the misconceptions that circulated caused real confusion for those just entering the SEO world.

This section covers some SEO history, a few SEO myths, what effective SEO actually requires, and how to avoid problems with spam and illicit practices.

Search Engines

The major search engines include Google, YouTube, Bing, Yahoo!, Yandex, and DuckDuckGo.

Submission Forms

Back in the 1990s, search engine submission forms were the typical SEO process. Webmasters just had to tag their sites with keywords and submit them to the search engines through those forms.

It’s hard not to be nostalgic about the days when all it took was submitting a form for the bot to crawl and index a page.

As we might recall, the form submission process didn’t scale well, as a lot of spam started being submitted. This led to the system being replaced by the crawl-based process used today.

Even the search engines admit that submitting URLs is a rare practice today, so earning links is actually the best way to expose our site and content to the search engines. It’s a natural process that happens when our content has quality, as we saw in Chapter 7 of this guide.

Modern SEO no longer needs submission forms. If an SEO service offers them, that’s a clear signal it’s an outdated service that will not serve our purposes.

Asking to be crawled will never return the exposure we aim for. To be indexed and ranked, we need quality content and the votes that come from quality links and shares.

Meta Tags

Back in the day, meta tags were also a very important part of SEO, especially the keywords meta tag. It was as simple as listing the keywords we wanted, and when people searched for those keywords, our site would come up in the search results.

Once more, spam caused problems, and the keywords meta tag is no longer treated as an important indexing signal by the major search engines. Title tags, meta description tags, and meta robots tags are still an important part of a quality SEO process, but while they remain useful tools, they are no longer central to it.
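
As an illustration, here is a minimal sketch of how these tags typically sit in a page’s <head>; the site name, description, and keywords are hypothetical.

    <head>
      <!-- Title tag: still shown as the clickable headline in most search results -->
      <title>Affordable Used Cars | Example Motors</title>

      <!-- Meta description: often reused as the snippet under the title -->
      <meta name="description" content="Browse inspected, affordable used cars with a 12-month warranty.">

      <!-- Meta robots tag: tells crawlers how to treat the page (index it, follow its links) -->
      <meta name="robots" content="index, follow">

      <!-- The old keywords meta tag: largely ignored by the major engines today -->
      <meta name="keywords" content="cheap cars, used cars">
    </head>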

Keyword Stuffing

Keyword stuffing means cramming a page with the same keyword over and over again until it looks spammy. We have all seen pages like this: “Cheap Cars, Best Cheap Vehicles in the Business, as Cheap as it can get”, or something similar.

In the past, people thought that higher keyword density meant better search results, but that’s not the case. Using keywords intelligently gives far better results than producing spammy pages, since today’s search engines have other ways to determine relevance and calculate rankings.

It bears repeating that usability should be the focus of our site. Good content and good links are worth far more than stuffed pages.

Search Engines: Paid Searches

This one is a simple myth to bust: paying for ads will not improve our organic SEO. There is no evidence for such a thing, and if our site sits at the top of the search results while we pay for ads, it can drop right back down as soon as we stop paying.

There is nothing like investing in good SEO rather than ads to improve our position in the search rankings. Moreover, all the major search engines, including Google and Bing, keep their advertising and organic search operations separate, so there is no crossover between them.

This means that even a company paying Google millions for ads gets no better rankings and no special attention from the Google search teams.

Search Engines: Spam

It’s hard to imagine search, or the internet in general, without spam. It’s not a pleasant reality, but it’s the one we have, and it has been growing ever since search engines appeared in the 1990s.

The reason for spam is obvious: it generates affiliate revenue that can run into the millions, and everyone wants a piece of it. Pages built specifically to abuse search engine algorithms are having a harder time now, though, for two main reasons: users hate them, and search engines have become very good at catching them.

People hate Spam

Spam is almost treated as an online crime, and practically every user hates it. Search engines invest heavily in fighting it, and Google’s investment has paid off: the proof is that we see hardly any spam in our Google searches.

The effort spammers put into creating those pages is now likely greater than the effort of making a quality page, so spam may actually be reaching a turning point, and hopefully we will start seeing less of it.

The evolution of search engines has been impressive, and the fight against spam has kept pace with it. The algorithms are now largely spam-proof, which has to be frustrating for spam webmasters who see no reward for their efforts.

The fact is that Google’s developers have more skill and more resources to fight spam than spam creators have to produce it, so these learning machines are getting more and more sophisticated against not only spam but also low-quality pages.

The growing number of webmasters seeking help with penalties caused by their own spam makes it clear that trying to manipulate search engines is not worth it. It will not help, and we are better off using legitimate SEO to earn positive results for our pages.

Search Engines: Analyzing Spam at the Page Level

It’s still quite common to find pages stuffed with keywords, with repeated sentences or terms meant to deceive search engines. This ends up being a waste of the spammer’s time: scanning for it is an easy task for search engines, and keeping pages full of irrelevant keywords out of the results is relatively simple to program.

How do search engines fight spam effectively? They look for spammy practices both on individual pages and across entire sites. Here’s what they analyze.

Manipulative Linking

To gain visibility, our sites need popular links, and a common spammy practice is trying to manipulate this link acquisition process. This is far harder to fight than keyword stuffing, precisely because manipulative linking takes many forms.

We are talking about sites that link back to each other with the goal of inflating each other’s popularity. These are known as reciprocal link exchange programs, and search engines discount them as soon as they detect the pattern.

In this category of manipulative linking we find other kinds of link schemes, some already mentioned in this guide, such as link farms. In a link farm, fake sites are created purely to host links that inflate the popularity of other sites. Search engines analyze site registrations and link overlap, among other signals, to uncover them.

Some people think that higher rankings can simply be bought, so they pay site owners willing to host their links. This can grow into large networks of people buying and selling links, which is in fact hard for search engines to stop.

Paid links can indeed add value to rankings, but they are obviously a manipulation of the system and will be penalized when caught.

There are also link directories created exclusively to manipulate the search market. Needless to say, these are low quality and are just another version of paid links in a more elaborate form. Identifying these fake directories is no easy task, but search engines do often take action against them.

We have covered the most common manipulative link tactics here, but there are others, and Google and the other search engines keep trying to reduce their negative impact while continuously improving their algorithms. When a new form of spam appears, a new algorithm change follows, and human reviews and spam reports are also a precious help.

Cloaking

Crawlers should see the same content as people visiting the site, so showing them something different is not considered good practice. Cloaking happens when a page serves crawlers content its visitors never see, such as text hidden in the HTML code; if search engines spot it, the page will not appear in the search results. There is also the concept of white hat cloaking, which leads to a more positive user experience and is even allowed by the search engines, so at the end of the day it’s all about the intention, more than the technical details behind the pages.
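
To make the idea concrete, here is a hypothetical fragment of the kind of hidden text search engines look for: the keyword block is invisible to visitors but still present in the code that crawlers read, and that mismatch is exactly what gets a page flagged.

    <!-- Content visible to visitors -->
    <p>Welcome to our car dealership.</p>

    <!-- Hidden block: invisible in the browser, but crawlers still read the keywords -->
    <div style="display:none">
      cheap cars best cheap vehicles cheap car deals
    </div>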

Search Engines: Analyzing Spam at the Domain Level

Search engines scan not only individual pages but also whole domains and subdomains when looking for spammy properties.

Low-Value Pages

High positions in the search results are associated with quality, so unique content can certainly get our pages there. The engines have methods to detect such content, as well as duplicate content which, even when it isn’t spam, doesn’t add much value to a page. Those pages are often left out of the search results, screened out once again by the algorithms.

Linking Practices

Issues with linking practices occur on individual pages as well as at the domain level, and both are monitored by search engines. The penalties can be heavy: search traffic may be reduced, or the site banned completely, so each of us has to decide whether it’s worth the risk.

Earning Trust

Some SEOs have noticed what appears to be a double standard on the part of search engines when it comes to big brand sites versus smaller independent ones.

This impression is misleading, though, because search engines grant trust according to earned links, so it’s only natural that a big brand domain (say Apple or Samsung) attracts more links than a small company that has just appeared.

Moreover, having links from spammy blogs and directories can cause us problems, while the exact same links from credible sites can make us rank very well. That is the importance of earning links from quality sites: sites people trust, sites that are an authority on a given subject.

So there is nothing like avoiding duplicate content and bad linking to gain good rankings and trust. If on top of this we can earn high-value links, we are definitely heading in the right direction towards the top of the search results.

Valuable Content

Just as with a single page, the value of a domain is judged by how unique its content is. Pages full of common, repeated content are often simply not ranked: even if their owners attempt quality SEO, the engines don’t want duplicated content and use both manual reviews and algorithms to fight it.

Search engines are good at what they do, and they are constantly testing the results they offer to users. For example, suppose our site earns a good position for a query, but as soon as users click through to it, they hit the browser’s back button and leave. All of this is noted by the search engines, which readjust the rankings according to that click behavior. Earning a ranking is only halfway to success; after that, usability and quality content take over.

Search Engines: Are we being penalized?

The rule here is simple: if our site is ranking high, we are collecting the benefits of our work; if it isn’t, we are doing something wrong, and that in itself is our penalty. It’s not always crystal clear, though. Sometimes the search algorithm changed, or we made a change to the site that hurt our ranking, but a drop can also be the result of illicit practices. The following steps help us understand what’s going on and what we can do about it.

Ruling out issues

To understand what’s happening with our SEO and our search results, we first need to rule out errors on our site that might be preventing crawling. Google Search Console is a very good place to check for such errors. We should also review whether recent changes to our pages (links, content, and so on) might have changed the way search engines see our content. Another thing worth verifying is whether sites with similar backlink profiles or duplicate content exist, and whether their rankings have changed too.
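
One self-inflicted crawling error worth ruling out first is an overly broad robots.txt rule; the example below is hypothetical, but a single misplaced line like this is enough to keep crawlers off the entire site.

    # Hypothetical robots.txt that accidentally blocks every crawler from the whole site
    User-agent: *
    Disallow: /

    # What was probably intended: block only a private area
    # User-agent: *
    # Disallow: /private/

If we find something like this, narrowing the Disallow rule and re-checking the affected pages in Google Search Console usually resolves the crawling problem.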

Follow simple steps

  • Once we have ruled out those situations, we should dig deeper into what’s happening with our ranking, to determine whether we have suffered a penalty or are simply losing ground to the competition.
  • The first thing to check is whether our site is still indexed; a quick site:ourdomain.com search will tell us. If it’s not, we have most likely been banned, and we need to follow the procedures to be included again, namely cleaning the spam from our pages and then making the re-inclusion request.
  • If our site is still indexed, though, we need to check whether searches for our domain name or other branded terms still rank. If they don’t, we have most likely been penalized for link manipulation, spam, or some other illicit activity.
  • Removing those bad links, and any paid link campaigns, is the first step before politely asking to be included again.
  • But if our domain name still ranks, we can keep digging. For example, when searching for four or five strong keywords from our site, do we still appear in the first 20 results?
  • If we don’t, Google has most likely devalued some of our links. The best course is to remove the bad links, request re-inclusion, and earn natural quality links to help us climb the rankings again.
  • If, on the other hand, we are still ranking among those top results, then we don’t have a penalty; we are simply losing ground to the competition, and the fix is good quality SEO with high-quality links and content.
  • Naturally, these steps are only indicative and won’t cover every situation, but the underlying point is that we need to develop the skill of telling the difference between a spam penalty and a simple ranking drop.

Search Engines: Lifting the Penalties

If we resorted to illicit practices as webmasters and got penalized, we will have to work to get the penalties lifted, and this is not an easy task. Re-inclusion requests are often denied by the search engines, or receive no feedback at all. Even so, the following steps increase our chances of success.

If we haven’t already, we should register our site with Google’s and Bing’s Webmaster Tools. Not only are the tools helpful, we are also creating a connection with the search engine and therefore more trust. Following the information provided in those tools will certainly help us solve issues such as broken links or crawl errors, and we may be surprised how often something perceived as spam turns out to be an accessibility or crawling issue.

Sending the re-inclusion request through the Webmaster Tools creates more trust and a better chance of feedback than using the public form. We should also be completely open when making the request: explaining how we spammed and which links we might have bought gives Google precious details for improving its algorithm, and that honesty shows we deserve to be back in.

This process can take weeks or even months. With hundreds or thousands of sites receiving this sort of penalty every week, we can imagine the amount of work involved in analyzing all the re-inclusion requests, so we need to be patient. Ultimately, we need to realize that search engines have no obligation to lift any penalty. Being included is a privilege, and one we should strive to maintain. Bad SEO can make us lose that privilege and all the benefits that come with ranking high in the search results.