History and Evolution of SEO
Search engine optimization (SEO), as the name suggests, is the process of improving the quality and quantity of traffic to a website when people search for products or services. So how does SEO improve the flow of traffic to a website? Before answering that, we should ask a more basic question: how do users (traffic) reach your website in the first place? There are numerous ways people end up on a website: through a direct search in a search engine, through social media ads, or through links visible in numerous places on the internet.
SEO is the process of improving traffic to a website by capturing the users who search for a specific query in a search engine, which is done by ranking the website at the top of the results on the search engine results page (SERP).
Before we jump into the factors that determine the rank your website gets in the SERP, or the processes we can use to improve that rank, let us go back in time to where it all started.
The History of SEO and Search Engines
It is believed that the term SEO became relevant in the early to mid-90s. To be precise, we could say SEO was born in 1991, the very same year the world's first website was launched. A bloom of websites followed, and as their number grew there was a need for structure and accessibility; hence the world's first search engine was created.
Excite, launched in 1993, introduced revolutionary changes in search engines and was followed by AltaVista, Yahoo, and others in 1994.
In 1996 the world witnessed the beginning of a new era of search engines. Two Stanford University Ph.D. students, Sergey Brin and Larry Page, began building what would become the biggest and most used search engine, Backrub, which is now known as Google.
How Google was born
Sergey Brin and Larry Page, the founders of Google, met at Stanford University as students and began working on a dissertation project that detailed the mathematical properties of the internet. In other words, the duo set out to prove that as the number of quality backlinks pointing to a site increases, so does the site's relevancy to a specific keyword or query. This was in contrast to the existing search engines, which prioritized keyword density above all else. They initially called the project Backrub, and later renamed it Google. (The name was formed from the mathematical term "googol," which refers to the number one followed by 100 zeros.)
Fun fact:
Did you know that Scott Hassan was an unofficial third founder of Google (then Backrub)?
How do Google and other search engines work?
Search engines
A search engine can be defined as a program or service that allows Internet users to search for content on the World Wide Web (WWW). When a user enters a keyword or key phrase into the search engine, it scans the information available on the internet and presents the relevant websites on the search engine results page (SERP).
Google, Yahoo, Bing, DuckDuckGo, etc. are examples of search engines.
Processes of Search Engines
So how do search engines work? Search engines are programs that fetch the results you need from all the data available on the internet. There are countless websites and vast amounts of data on the web. When you enter a query in the search box, the search engine must understand what you asked for, search for related and relevant data, and present you with a results page containing the data you desired.
Since there is so much content on the web, search engines need to comb through every corner of it and understand every piece of content. This is made possible by automated software known as spiders, crawlers, or bots. Even though the three names are used synonymously, there are differences between them:
Spider: A program run by the search engine to collect and build a summary of what a site contains. Spiders create a text-based summary of the website and a URL for each webpage.
Crawler: A crawler visits every page and analyzes every hyperlink that exists on each page.
Bot: Bots are automated programs designed to do specific tasks; their job is to understand how to crawl and index the pages of a website.
This whole process can be categorized into three stages:
1. Crawling
2. Indexing
3. Ranking
Crawling
Crawling can be considered a discovery process in which search engines find new and updated content on websites. This is done using the spiders, crawlers, or bots described above. These bots set out to find new or updated content of any type, and when they find it, the content is indexed in the search engine's database of URLs, to be retrieved later when a searcher is seeking information on that specific topic.
Indexing
The crawled pages or websites are then categorized into different topics and stored in the URL database based on the keywords the content encapsulates. This process is called indexing. We could compare indexing to a librarian categorizing and organizing books into the different sections of a library.
Ranking
When a user searches for something in Google or another search engine, it scans the indexed webpages to provide the user with the best possible results, showing the pages relevant to the query in the SERP so that the page the engine finds most relevant ranks at the top. This process is called ranking. The ranking depends on many factors, such as location, language, and device.
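The crawl, index, and rank stages above can be sketched in miniature. This is a toy illustration only: the page contents and the word-match scoring below are made-up assumptions for the example, not how a real search engine works.

```python
# Toy illustration of the crawl -> index -> rank pipeline.
# The pages and scoring are invented; real engines use far more signals.

pages = {
    "site-a.com": "fresh coffee beans and brewing guides",
    "site-b.com": "coffee brewing equipment reviews",
    "site-c.com": "travel photography tips",
}

def crawl(pages):
    """'Discover' each page and return its raw content."""
    return {url: text for url, text in pages.items()}

def index(crawled):
    """Build an inverted index: keyword -> set of URLs containing it."""
    inverted = {}
    for url, text in crawled.items():
        for word in text.split():
            inverted.setdefault(word, set()).add(url)
    return inverted

def rank(inverted, query):
    """Score pages by how many query words they match, best first."""
    scores = {}
    for word in query.split():
        for url in inverted.get(word, set()):
            scores[url] = scores.get(url, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

idx = index(crawl(pages))
print(rank(idx, "coffee brewing"))  # only the two coffee sites match
```

Notice that site-c.com never appears in the results: no query word is indexed for it, which mirrors how an engine only retrieves pages relevant to the query.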
SEO Explained
SEO as we now know it covers a wide spectrum of elements, but essentially it is the process of optimizing a website so that it ranks higher in search engines like Google. Taking Google as an example: Google ranks the pages it finds relevant and authoritative. The process focuses on surfacing the web pages that are most relevant and trustworthy to the user, which means that, more than anything else, Google focuses on user satisfaction. In a nutshell, if your webpage has quality content with high-authority links, and the content is well structured with a healthy link profile, the chances of your website ranking high in the SERP are good.
But as simple as it seems, in the early days the quality of SERP results was badly affected by many unethical practices.
Factors affecting SEO
You might have heard of Google's ranking factors, a huge list of 200 or more signals said to determine the rank you get on the SERP. Are there really 200+ factors? What are they? How do they affect our website's ranking? Honestly, no one has a solid answer. Some pages claim to have listed them all, but of the factors listed, some are proven, some are controversial, and some are random guesses.
Here are some proven methods you can try to improve your website's rank:
- High-Quality Content
- High-quality high authority backlinks
- Internal links
- Device and browser-friendly website
- Good user experience
- Interactive website
- On-page optimization
- Page load speed
Types of SEO techniques
SEO techniques can be classified into two groups:
1. Recommended techniques that follow search engine guidelines
2. Techniques that don't abide by search engine guidelines
These types are distinguished by whether an SEO practitioner follows the tactics allowed by Google's Webmaster Guidelines. Here are the different types of SEO techniques.
1. White Hat SEO
In this type of SEO, strategy plays a significant role: strategies are designed to rank at the top of the organic results while abiding by the Google Webmaster Guidelines. The basic techniques include:
- Ethical backlinking
- Link building
- Valuable and relevant content creation
- Keyword research
- Guest blogging
It is considered the safest and most legitimate way to boost SEO and rankings. Even though white hat methods take time to show results, those results last far longer than the ones produced by black or grey hat methods.
2. Black Hat SEO
Black hat SEO is a type of technique that exploits weaknesses in Google's search algorithm to rank higher in the SERP. It uses aggressive methods and does not abide by any of the rules or guidelines provided by Google. Even though black hat techniques provide instant results, they can severely damage a website if detected by Google.
The techniques include:
- Keyword stuffing
- Link spamming
- Paid link building
- Cloaking
- Spin content
- Hidden doorways
3. Grey Hat SEO
Grey hat SEO is not well understood and is often mistaken for something it is not. It is a type of SEO in which traffic is built by bending the rules: the guidelines are broken, but indirectly, so that the repercussions are not as severe as they are for black hat techniques.
They include
- Paid content marketing and reviews
- Purchasing old or expired domains
- Careful keyword stuffing
- Many social media accounts
- Cloaking
Evolution of Google and SEO
The process of SEO still seems a little mysterious to many people. Google has become the most popular and trusted search engine, and SEO has evolved alongside it.
Niche-Specific or Content-Specific
Google was initially a niche-specific, keyword-based search engine: the more keywords a site contained, the higher it ranked in the SERP. Knowing this, webmasters began exploiting the opportunity to rank their websites higher through unethical methods like keyword stuffing. This severely affected the user experience and the trustworthiness of Google.
Link Specific
So Google changed its algorithm from niche-specific to link-specific. Link-specific means that the more hyperlinks a site receives from other sites, the higher its chance of ranking well in the SERP. This led to the unethical use of links to boost rankings: buying and selling links, and creating backlinks from networks of one's own sites.
Quality link specific
To put an end to this, Google changed its algorithm again, from link-specific to quality-link-specific, and introduced the term PageRank for the first time with this update. PageRank was a parameter that measured the quality of a website using different signals and scored it on a scale of 0-10, where 0 indicates poor quality and 10 indicates the best and most trustworthy. Google started counting only hyperlinks from sites with high PageRank. But people then started selling links from their high-PageRank websites, and to curb this Google introduced a concept called passing the juice.
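The core idea behind PageRank (a page is important if important pages link to it) can be sketched with a small power iteration. The three-page link graph and the damping factor of 0.85 below are illustrative assumptions; Google's production system is vastly larger and blends in many other signals.

```python
# Toy PageRank power iteration over a three-page link graph.
# Simplified sketch of the idea only, not Google's real algorithm.

links = {               # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
damping = 0.85
ranks = {p: 1 / len(links) for p in links}  # start with equal scores

for _ in range(50):     # iterate until the scores settle
    new = {}
    for page in links:
        # Each linking page passes a share of its rank ("juice"),
        # split evenly across its outgoing links.
        incoming = sum(
            ranks[src] / len(outs)
            for src, outs in links.items() if page in outs
        )
        new[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new

# "c" receives links from both "a" and "b", so it scores highest.
print(max(ranks, key=ranks.get))
```

Each iteration redistributes rank along the links, which is the "passing the juice" idea in numerical form: a link from a high-rank page contributes more than a link from a low-rank one.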
Passing the juice/Link Juice
As the term suggests, link juice is a casual term for the authority or power of a website that is transferred to another site through internal or external linking. If a site has more links pointing towards it from high-authority sites, it will rank higher than sites with fewer such links; similarly, when you hyperlink to another website, a part of your authority is passed down to the linked site. This put an end to practices like link farming, paid links, etc.
Along with this, Google also introduced the nofollow attribute for hyperlinks. It simply tells Google to treat your hyperlink only as a reference, not as a recommendation, so none of your site's authority is passed down to the linked site.
Bounce rate
Bounce rate is the percentage of visitors who land on your website and leave immediately without spending much time, out of the total number of visitors. It may be caused by poor content, a poor user experience, or poor performance (slow loading or bad rendering on different devices). It indirectly affects the ranking of the website: the lower the bounce rate, the better the rank.
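As a formula, bounce rate is simply single-visit departures divided by total visits. The visit counts below are hypothetical numbers for illustration.

```python
# Bounce rate as a percentage (hypothetical visit counts).

def bounce_rate(bounced_visits, total_visits):
    """Percentage of visits that left without further interaction."""
    return 100 * bounced_visits / total_visits

print(bounce_rate(350, 1000))  # 35.0 -> a 35% bounce rate
```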
Dwell time
The amount of time a user spends on a page before returning to the SERP.
Pogo sticking
The movement of a user from one SERP result to another, hoping to find better content or a better answer. It increases bounce rate and may hurt a site's rank, since other websites with better content receive more dwell time.
In 2010, social media signals started playing a major role in SERP rankings. It was observed that sites with greater influencer outreach and social engagement began ranking higher.