Indexing and Ranking Your Website

SEO has a dual function. It's important to make our site easy for users to view and understand, but it's just as important to make it easy for search engine robots to understand and index; otherwise the site is invisible. Search engines have reached a high level of sophistication, but even so they cannot perceive web pages the way people do. Search engines are not that good! SEO is therefore what helps search engines understand what our pages are about and how they should be listed, so that the users interested in finding them actually can.

Search Engines: A reciprocal relation

It's common to hear people say that the engineers behind search engines should not build a search engine that forces sites to follow specific requirements in order to be indexed.

Sure, it seems only logical that something as advanced as a search engine should be able to crawl through any code or architecture and return the most relevant results. By that reasoning, the burden should fall on effective crawling and indexing, not on the site.

On the other hand, search engines, no matter how well conceived they are, cannot analyze certain things the way humans do. Let's say I have posted a picture of my cat online. I can describe it as a medium-sized black cat playing in the kitchen, but the question remains: how can a search engine, on its own, understand a picture?

This is one of the reasons SEO exists, and why search engine companies like Google publish tutorials. They exist not only so that webmasters can learn SEO, but also to give webmasters the power to help search engines read everything on their sites as accurately as possible. It's a reciprocal relationship: a site optimized to be read well by search engines will appear in better positions on the results page. Webmasters therefore play an important role.

They provide the clues that help search engines understand, with a high level of detail, what is going on in a site. These clues are nothing more than adequate, well-structured content. Knowing what search engines are capable of, and knowing their limitations, enables the webmaster to create, format, and annotate the site's content in the best possible way for the search engine to rank the site high, instead of leaving it invisible.
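
To make one of these clues concrete: an image can carry a short text description in its alt attribute, which crawlers can read even though they cannot interpret the picture itself. The file name and wording below are just a hypothetical sketch for the cat photo mentioned above:

    <img src="black-cat-kitchen.jpg"
         alt="Medium-sized black cat playing in the kitchen">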

Search Engines: Limits and difficulties

Chapter one of this guide covered the basic principles behind how search engines work, including the whole crawling and indexing process, so if you need a refresher on these concepts you can always refer back to it. Crawling and indexing are, of course, easier said than done. We should never forget the billions of pages, images, and documents the bots have to crawl and index into those giant databases. The search engines' artificial intelligence gets the job done, but, much like our own intelligence, it has its limitations. Being aware of these technical limitations means being aware of what can be done to work around them.

Crawling and indexing limitations

When building our pages, we need to be aware that search engines have difficulty getting past online forms, such as the ones used to log in. Content that sits behind these forms may remain hidden from the crawlers.

Sites that use a Content Management System (CMS) sometimes create duplicate pages, and this turns out to be a problem because search engines do not treat duplicates as original content. As we know, original content is one of the things search engines value when indexing.
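
One common way of dealing with CMS duplicates, although this guide does not go into it, is to mark the preferred version of the page with a canonical link element in the page's head; the URL below is purely illustrative:

    <link rel="canonical" href="https://www.example.com/products/ovens/">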

Webmasters build their pages so that search engine bots can crawl freely through the site, but if the site has issues with a file called robots.txt, those bots may end up being blocked, with the obvious losses that come with not being indexed. Paying attention to the crawling directives is therefore vital if this process is to run smoothly.
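
To make the idea concrete, here is a minimal sketch of such directives; the folder name is just an example. The first rule accidentally blocks crawlers from the entire site, while the second lets them reach everything except one private folder:

    # Blocks every crawler from the whole site; a single misplaced character can cause this
    User-agent: *
    Disallow: /

    # Lets crawlers reach everything except a private folder
    User-agent: *
    Disallow: /private/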

Another very important factor for good crawling and indexing is the site's link structure. The links have to reach the whole site effectively, or else some content may never be found.
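
As a simplified, hypothetical illustration: a plain HTML link is easy for a crawler to follow, while a link that only works through a script may leave the target page unreachable:

    <!-- Easy for crawlers to follow -->
    <a href="/products/ovens/">Ovens</a>

    <!-- The URL is hidden inside a script, so a crawler may never reach the page -->
    <span onclick="window.location='/products/ovens/'">Ovens</span>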

Finally, text locked inside rich media formats is another difficulty for search engines, so whenever we use such content, say video, photos, Flash files, and so on, we need to take into account that search engines might not be able to read that non-HTML text.
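
A simple precaution, sketched below with a made-up file name, is to accompany rich media with plain HTML text, such as a caption or transcript, so there is always something the crawler can read:

    <video src="cat-in-kitchen.mp4" controls></video>
    <p>Video: a medium-sized black cat playing in the kitchen.</p>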

Search Engines: It’s all about visibility

Webmasters have to anticipate these technical difficulties inherent to search engines if they want their sites to appear at the top of the search results pages, but these are only the basics as far as SEO is concerned, the tip of the iceberg. Correct web development, or search engine friendly sites if we prefer, covers our needs only up to a certain point.

From that point on, content marketing is necessary. There is no magic formula that search engines use to determine whether one site's content has more quality than another's. They simply rely on metrics that help them assess the relevance of a given piece of content.

These metrics are obtained by tracking and analyzing people's behavior: comments, reactions, links, and so on. It's not just about overcoming technical limitations and having great content; it's also necessary that our content is shared and discussed by users.

Difficulties associating queries with content

Apart from the crawling and indexing limitations, there are sometimes also difficulties associating certain content with specific queries. This happens when uncommon terms are used, for example when we write “food heating units” instead of “oven”.

Language questions are another situation that can be problematic, especially when the same word is written in different ways depending on the country, which means not all results are matched correctly. A good example is the word “color” versus “colour”.

Content targeting and contextual signals are two other issues that tend to confuse the artificial intelligence of search engines. As far as content targeting is concerned, let's say we have a site in English but we want people from Mexico to visit it; that will be a problem, since many people from Mexico might not know the language.
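
One way of sending a clearer targeting signal, not covered in detail in this guide, is the hreflang annotation, which tells search engines which language and region each version of a page is meant for; the URLs below are purely illustrative:

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/en/">
    <link rel="alternate" hreflang="es-mx" href="https://www.example.com/es-mx/">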

The same goes for misleading titles. Take a title like “Best US gambling” on a page that is actually about a US-based casino opening in Brazil: that is a wrong contextual signal, and it can lead to poor search engine results.

How competitive can it get?

Every single site wants to be in the top results of the search engines, and they make their best efforts to get there. Search marketing is big business: the demand from companies is already high, and it will continue to grow as they realize that investing in this area brings them a good return.

On average, each search results page has ten positions. Needless to say, the pages appearing on the first results page are the top-ranked ones, and the higher our page sits, the more clicks it gets and the more people it attracts. The first three positions on the results page receive far more traffic than the other results on page one, and far more still than anything on the following pages.

The fact that those top results generate so much traffic makes them a desirable spot, and companies are willing to invest to be visible there, to raise awareness of their brand and its perceived quality.

Search Engines: constantly changing

Search engine optimization began in the mid-nineties. Some might still recall that the meta keywords tag, manual submissions, and keyword stuffing were the top practices at the time for achieving a positive ranking.

A decade later, in 2004, the common practices were nothing less than buying spammy links. Yes, automated spam comments on blogs actually counted as good SEO. Anchor text linking and sites linking to one another, the so-called link farms, were also very popular SEO practices, and in fact they did return more traffic.

Another decade, more changes. In 2011 we find search engines with such refined mathematical equations that using the 2004 SEO practices today would actually hurt our SEO. Social media marketing and vertical search are some of the practices still used today for good SEO.

2017 brings the concept of trust into the spotlight: trust in the indexed age of our site and our content, trust in our authority profile, which is nothing more than quality linking, and trust in the underlying content itself, which should not have errors or be a duplicate. Lengthy, well-written, and engaging content is appreciated by search engines.

As for the future, we can be certain that changes will keep on happening, along with the evolution of search engines and web pages. This simply means that everyone working in SEO will have to keep investing in the area if they want to keep appearing in the top results.

In the end, it's all a matter of return on investment. Sites compete for good placement in search engine results, and for that they need to invest in SEO. It's a no-brainer for them, because they know that the return on this investment will be more traffic and more visibility.