engine optimizers”, a term that refers to the consultants and consulting firms who carry out optimization projects on behalf of their clients.
The beginning of search engines
In the mid-1990s, webmasters began optimizing websites for search engines. At first this meant simply submitting a site to the various engines, which would run programs called ‘spiders’ that collected information by downloading each page and storing it on the search engine’s server. There, another program called an ‘indexer’ extracted information about the page, such as the words it contained and its links, and placed newly discovered links in a scheduler to be crawled later.
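To make this crawl-and-index loop concrete, the sketch below shows one possible, greatly simplified version in Python: it downloads a page, extracts its words and links, adds the words to an index, and queues the links in a scheduler for later crawling. The function names, the in-memory index and the queue used as a scheduler are illustrative assumptions, not how any particular engine is implemented.

```python
# A minimal sketch of the spider/indexer loop described above.
# All names here (crawl, index, scheduler) are illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the words and links that the 'indexer' would extract."""
    def __init__(self):
        super().__init__()
        self.words, self.links = [], []

    def handle_data(self, data):
        self.words.extend(data.lower().split())

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    index = {}                      # word -> set of pages containing it
    scheduler = deque([seed_url])   # pages queued for later crawling
    seen = set()
    while scheduler and len(seen) < max_pages:
        url = scheduler.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue                # unreachable pages are simply skipped
        parser = PageParser()
        parser.feed(html)
        for word in parser.words:   # the 'indexer' step: words into the index
            index.setdefault(word, set()).add(url)
        for link in parser.links:   # discovered links go to the scheduler
            scheduler.append(urljoin(url, link))
    return index
```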
In the beginning, the search engines depended on webmasters for information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provided a guide to each page’s content, but webmasters gradually began misusing them, causing pages to rank for irrelevant searches. To prevent this, the search engines developed more sophisticated ranking algorithms that took into account factors such as the following (a toy example of how such signals might be combined appears after the list):
- The domain name
- Text within the title tag
- URL directories and file names
- HTML tags used to highlight headings and other strongly emphasized terms
- Keyword sequence
- Keyword proximity
- Keyword adjacency
- The content of the page itself
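The toy scorer below illustrates how signals like these might be combined into a single relevance score. The weights, field names and the phrase-match bonus are purely illustrative assumptions; real ranking algorithms are far more elaborate and are not publicly documented.

```python
# A toy on-page scorer combining a few of the signals listed above.
# Weights and field names are illustrative assumptions, not any
# search engine's actual formula.
def score_page(query, page):
    """page is a dict with 'url', 'title', 'headings' and 'body' fields."""
    terms = query.lower().split()
    score = 0.0
    for term in terms:
        if term in page["title"].lower():          # text in the title tag
            score += 3.0
        if term in page["url"].lower():            # URL directories and file names
            score += 2.0
        if any(term in h.lower() for h in page["headings"]):  # emphasized headings
            score += 1.5
        if term in page["body"].lower():           # the page content itself
            score += 1.0
    # crude proximity/adjacency bonus: the query appears as an exact phrase
    if query.lower() in page["body"].lower():
        score += 2.0
    return score


example = {
    "url": "https://example.com/seo/basics.html",
    "title": "SEO basics",
    "headings": ["What is SEO?"],
    "body": "SEO basics explained for beginners.",
}
print(score_page("seo basics", example))
```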
In spite of these refinements, the search engines still depended on webmasters to a considerable extent, and webmasters, in pursuit of better rankings, kept manipulating the engines. To give users genuinely good results, it was important that the SERPs showed the most relevant pages rather than manipulated or redundant ones; this need gave rise to a new kind of search engine.
Google addressed this problem with a more sophisticated algorithm, PageRank, which works in a rather different way: it rates the importance of a page on the basis of its incoming links. PageRank estimates the likelihood that a user who surfs the web at random, following links from one page to another, will arrive at a given page. This means that some links are stronger, and therefore more valuable, than others, since a page with a higher PageRank is more likely to be reached by the random surfer.
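The random-surfer idea can be expressed in a few lines of code. The sketch below is a deliberately simplified version of the iterative calculation: each page repeatedly passes a share of its score along its outgoing links, while a damping factor models the surfer occasionally jumping to a random page. The damping value of 0.85 and the three-page example graph are assumptions chosen for illustration.

```python
# A compact sketch of the random-surfer calculation behind PageRank:
# repeatedly redistribute each page's score along its outgoing links.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # dangling pages are ignored in this simplified version
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] = new_rank.get(target, 0.0) + share
        rank = new_rank
    return rank


graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))  # C ends up with the highest rank in this toy graph
```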
The PageRank algorithm proved very successful, and users turned to Google because it returned the most relevant results. Strong recommendations from users and developers made Google the most popular and effective search engine on the Internet.
A wide range of methods and processes is used in SEO, including changes to a site’s code and acquiring links from other sites. These techniques fall into two broad categories: those that search engines regard as part of good design, and those they do not approve of and whose effect they try to minimize. The latter practice is popularly known as spamdexing. The two approaches are also referred to as ‘white hat SEO’ and ‘black hat SEO’.
If SEO is employed as part of a marketing strategy, it can bring very good results. However, search engines are not paid for the traffic they send from organic search, so the algorithms they use can and often do change, and a site’s rankings can change with them; there is no guarantee of success in either the short term or the long term. Because of this uncertainty, SEO is often compared to traditional public relations, with PPC advertising closer to traditional advertising.