H.V.P.M's College of Engineering and Technology, Amravati
1. INTRODUCTION:
If the results returned for a search turned out to be false or manipulated, users would turn to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
History of SEO:
SEO Theory was born in the popular Virtual Promote Gazette e-newsletter and the associated Search Engine Forums founded by Jim Wilson (whose Web properties were handed over to his successors after his death in May 2003). Wilson brought together the first organized community of Web site developers, marketers, and promoters with the purpose of sharing information about how Web search works and how it can be used to benefit the Internet community. Wilson's forums and newsletters often documented new tools and promotional ideas along with techniques. Within a few years the SEO community began to branch out into younger forum and e-newsletter ventures. SEO Theory shifted its emphasis from on-page elements to off-page elements with the ascendancy of Google, as practitioners acknowledged the necessity of obtaining links from other documents. Today SEO Theory is less emphasized by the best-practices community, who generally rely upon well-established Web site marketing and search engine webmaster guidelines to structure their Web promotion campaigns. Nonetheless, SEO Theory has made significant contributions to widely accepted and advocated Web site marketing methodologies.
Historically, SEO Theory was often associated with Black Hat SEO, or deceiving search engines. In addition to Web document design, and because of the importance placed upon link analysis by several major search engines (including Google, Ask, Live Search, and Yahoo), SEO Theory has evolved to include the study and analysis of linking practices, patterns, and placement. Although its history may be more strongly identified with the darker side of search engine optimization, SEO Theory today helps to guide the implementation of best-practices marketing. Web content providers used to manipulate a number of attributes in HTML code to rank well in search engines. By relying so heavily on those attributes, search engines suffered abuse and ranking manipulation, so relevance-based ranking came into existence and search engines began looking for genuinely relevant web pages.
2. SEO TERMINOLOGIES:
2.1 A Search Engine:
Basically, every search engine consists of three parts:
A Web Crawler
An Indexer
A Query Processor
A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. Other terms for Web crawlers are ants, automatic indexers, bots, Web spiders, or Web robots. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. A crawler starts with a selected set of URLs called seeds, fetches those pages, extracts the links on them, and continues in this way; a selection policy decides which links to follow next. Following is a list of search engines and their corresponding bots (a minimal crawl loop is sketched after the list).
Google – GoogleBot
MSN – MSNBot
Yahoo- Slurp
Fast Search & Transfer - FAST
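The following is a minimal, illustrative sketch of such a crawl loop in Python. It is not any real search engine's crawler: it simply starts from seed URLs, fetches pages with the standard library, extracts links, and follows them breadth-first as a simple selection policy.

```python
# Minimal breadth-first crawler sketch (illustrative only; a real crawler
# also needs politeness delays, robots.txt handling, and error recovery).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href attributes from <a> tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seeds, max_pages=50):
    """Start from seed URLs, fetch pages, and follow links breadth-first."""
    frontier = deque(seeds)          # selection policy: first-in, first-out
    visited = set()
    pages = {}                       # url -> raw HTML, handed to the indexer
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                 # skip unreachable or non-HTML pages
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))
    return pages
```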
Bots give the indexer the full text of the pages they find. These pages are stored in Google's index database. This index is sorted alphabetically by search term, with each index entry storing a list of documents in which the term appears and the locations within the text where it occurs. This data structure allows rapid access to documents that contain the user's query terms.
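As a rough illustration of the indexer and query processor described above (a simplified assumption, not Google's actual data structure), the following sketch builds an inverted index mapping each term to the documents and positions where it appears, and answers a query by intersecting the document sets of its terms.

```python
import re
from collections import defaultdict

def build_index(pages):
    """Build a simple inverted index: term -> {url: [positions]}.
    `pages` maps each URL to its plain text (e.g. crawler output with tags stripped)."""
    index = defaultdict(lambda: defaultdict(list))
    for url, text in pages.items():
        for position, term in enumerate(re.findall(r"[a-z0-9]+", text.lower())):
            index[term][url].append(position)
    return index

def search(index, query):
    """Return the URLs containing every query term (a minimal query processor)."""
    results = None
    for term in query.lower().split():
        docs = set(index.get(term, {}))
        results = docs if results is None else results & docs
    return results or set()
```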
2.2 Page Rank:
PageRank is Google's way of giving a specific value to how popular your website is (as mentioned earlier, we will mostly focus on Google). It is based on the number of "votes" other websites cast for your website. A "vote" is simply another website placing a link on its pages that points to your website. Generally, the more "votes" or links you have pointing to your website, the higher your PageRank (PR) will be. PageRank is one of the many factors that Google takes into account when ranking websites.
In the given figure Page C has a higher PageRank than Page E, even though it has
fewer links to it; the link it has is of a much higher value. A web surfer who chooses a
random link on every page (but with 15% likelihood jumps to a random page on the
whole web) is going to be on Page E for 8.1% of the time. The 15% likelihood of
jumping to an arbitrary page corresponds to a damping factor of 85%. Without
damping, all web surfers would eventually end up on Pages A, B, or C, and all other
pages would have PageRank zero. Page A is assumed to link to all pages in the web,
because it has no outgoing links. PageRank was named after Larry Page.
Your page's PR = 0.15 + 0.85 × (a "share" of the PR of every page that links to it)
For example: Page A (PR 1), Page B (PR 1), and Page C (PR 1).
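Below is a minimal sketch of how this formula can be applied iteratively until the values settle. The link structure used in the example (A and B each linking to C, and C linking back to A) is a hypothetical illustration, not taken from the figure above.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively apply PR(p) = (1 - d) + d * sum(PR(q) / outdegree(q))
    over all pages q that link to p, matching the formula above.
    `links` maps each page to the list of pages it links to; pages with no
    outgoing links simply contribute no share in this simplified sketch."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    pr = {page: 1.0 for page in pages}          # every page starts at PR 1
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            share = 0.0
            for source, targets in links.items():
                if page in targets:
                    share += pr[source] / len(targets)
            new_pr[page] = (1 - damping) + damping * share
        pr = new_pr
    return pr

# Example: A and B both link to C, so C accumulates a share of their PageRank.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```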
2.3 Optimizers:
2.3.1 White Hat SEOs:
They are the search engine optimizers who use legitimate, guideline-compliant techniques to improve the ranking of your Web site. They are also known as the Best Practices community. A best-practices SEO contacts a few webmasters with similar content and requests a link exchange.
2.3.2 Black Hat SEOs:
They are the SEOs who try to deceive the search engines by using tricks. Sites developed by Black Hats often get penalised or banned. Some of the tricks used are:
Cloaking / Doorway Pages :
Cloaking is the practice of showing one page of content to a search engine and a different page to actual people. Cloaking is achieved through a variety of means, and some people argue that there are legitimate reasons for it, but best-practices people advocate not cloaking. Google has been criticised for allegedly permitting certain groups (such as newspapers, academic paper archives, and SEO forums where Google employees have participated) to engage in cloaking while banning other sites for the same behaviour. Doorway (also called presell or gateway) pages are specially designed, minimal-content pages created solely for the purpose of receiving traffic from search engines and directing it to a destination page.
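As a purely illustrative sketch of how user-agent cloaking works in principle (and why search engines penalise it), the following hypothetical server returns one page to known crawler user-agents and a different page to everyone else. The bot signatures and page contents are assumptions made up for the example.

```python
# Illustrative sketch only: serving different content to bots and people
# is against search engine guidelines and can get a site banned.
from http.server import BaseHTTPRequestHandler, HTTPServer

BOT_SIGNATURES = ("googlebot", "msnbot", "slurp")   # crawler user-agents (assumed list)

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "").lower()
        if any(bot in agent for bot in BOT_SIGNATURES):
            body = b"<html>Keyword-rich page shown only to crawlers</html>"
        else:
            body = b"<html>Completely different page shown to visitors</html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```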
Meta Stuffing:
The meta tag is analysed by almost all search engines while crawling. Meta elements are HTML or XHTML elements used to provide structured metadata about a Web page. Such elements must be placed as tags in the head section of an HTML or XHTML document. Meta elements can be used to specify the page description, keywords, and any other metadata not provided through the other head elements and attributes. However, Black Hat SEOs started giving wrong information in the following elements of the meta tag:
o Description:
The description attribute provides a concise explanation of a Web page's content. Almost all search engines recommend keeping it shorter than 155 characters of plain text. Black Hat SEOs supply a description that differs from the page's actual content.
o Keyword:
This attribute lists the keywords under which the web page should be found.
However, search engine providers realized that information stored in meta
elements, especially the keywords attribute, was often unreliable and
misleading, and at worst, used to draw users into spam sites.
o Robots:
The robots attribute tells search engine crawlers whether to index the page and whether to follow its links, using values such as index/noindex and follow/nofollow.
o Language:
The language attribute tells search engines what natural language the web site is written in, as opposed to its coding language.
The title is the text that appears at the top left of your Web browser window and as the headline of your listing in search results. Black Hat SEOs also stuff the title tag with keywords; this can get your site banned as well.
Keyword Stuffing:
What I'm referring to here is when people throw in thousands of the same
exact keyword into their meta tags.
For example, the following website is trying to rank well for "tents".
<META NAME="KEYWORDS" CONTENT="tents, TENTS, Tents, tents
tents supplies, tents, tents tent, tent, Tent, TENTS, tents, Tents,tents, TENTS,
Tents, tents tents tent supplies, tents, tents tent, tent, Tent, TENTS, tents,
Tents, tents, TENTS, Tents, tents tents tent supplies, tents, tents tent, tent,
Tent, TENTS, tents, Tents, tents, TENTS, Tents, tents tents tent supplies,
tents, tents tent, tent, Tent, TENTS, tents, Tents tents, TENTS, Tents, tents
tents tent supplies">
Hidden Text :
Hidden text is simply text that users cannot see when they visit your webpage (for example, text in the same colour as the background); selecting everything with Ctrl+A reveals the trick. The hidden text might be the keywords of the number-1 site for such queries, in the hope that it will raise your page's ranking.
3. TYPES OF OPTIMIZATION:
Search engine optimization is divided into two parts, based upon the characteristics of the factors used in optimization:
On-Page Optimization
Off-Page Optimization
3.1 On-Page Optimization:
On-page optimization includes the tricks and techniques that are used within the Web page itself in order to secure a good ranking. Following are the techniques used:
One on-page technique is keyword research, using a free tool such as Good Keywords (http://www.goodkeywords.com).
The No. 1 entry in its results is "weight loss", which was searched 1,413,194 times in the Overture search engine last month. The Words column shows the specific keyword that was
searched. If you enter "weight loss", the Good Keywords tool will bring back the 100
keywords containing the word "weight loss" that were searched for last month. The
"count" column will then show us how many times the specific keyword has been
searched for the previous month within the Overture.com search engine. Generally,
you can take that number times 3, in order to estimate the number of times that
keyword has been searched within Google for the previous month. Already, I can see many people making a BIG mistake, and I'll admit, I was one of these people when I first began my online endeavours: they target only the broad, highly competitive keyword. If we scroll down, we can find more specific keyword phrases like "weight loss story", "weight loss picture", and "safe weight loss", for which the competition is much lower.
Crawlers read a page from top to bottom, so naturally your navigational links get crawled first and your content afterwards, whereas you would really want Google to read your content first. A good solution is to rearrange the page layout, for example using a table whose first column is left empty and whose navigation column comes last in the HTML source: the search engine then reads an empty column first, then your text, and the navigational links last.
A site map (or sitemap) is a list of pages of a web site accessible to crawlers or users.
It can be either a document in any form used as a planning tool for web design, or a
web page that lists the pages on a web site, typically organized in hierarchical fashion.
This helps visitors and search engine bots find pages on the site. While some
developers argue that site index is a more appropriately used term to relay page
function, web visitors are used to seeing each term and generally associate both as one
and the same. However, a site index is often used to mean an A-Z index that provides
access to particular content, while a site map provides a general top-down view of the
overall site contents.
If a sitemap grows large, with 100 or more links, Google recommends breaking it into multiple smaller pages. RSS stands for Rich Site Summary, also known as Really Simple Syndication. RSS feeds help other sites to distribute your headlines and content.
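As a small, assumed example of such a sitemap page (the page titles and URLs below are hypothetical), the sketch writes a simple HTML site map listing a site's pages for visitors and crawlers.

```python
# Minimal sketch: generate a simple HTML sitemap page from a list of pages.
# The page list here is a hypothetical example.
site_pages = [
    ("Home", "/index.html"),
    ("Products", "/products/index.html"),
    ("Tents", "/products/tents.html"),
    ("Contact", "/contact.html"),
]

items = "\n".join(
    f'  <li><a href="{url}">{title}</a></li>' for title, url in site_pages
)
sitemap_html = f"<html><body>\n<h1>Site Map</h1>\n<ul>\n{items}\n</ul>\n</body></html>"

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write(sitemap_html)          # publish this page and link to it site-wide
```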
3.2 Off-Page Optimization:
Off-page optimization basically consists of all of the ranking factors that are not located on your Web page but that the search engines look at when ranking a website.
These include:
Which websites link to you
The number of websites linking to you
A useful tool for checking backlinks is the Class C Checker at http://www.webrankinfo.com/english/tools/class-c-checker.php.
While looking at its report view, you'll notice that www.diet-i.com has hundreds of backlinks that contain the words "diet information" within their anchor text. This is a BIG plus for them and something you'll want to duplicate when creating your own site related to diet information. If you think about it, it makes sense that Google gives
priority to websites that have links on many IP Addresses rather than many links all
on the same IP Address. This helps eliminate the possibility of people controlling the
search engines. If Google didn't look at IP addresses, I could simply create one website with thousands of pages and link from all of those pages to another one of my websites. I would then have thousands of links pointing to my website, which would probably
result in a #1 ranking. Similarly, observe the PageRank and other metrics of the sites linking to your competitors.
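The sketch below illustrates the idea behind such a class-C check (the backlink hosts are hypothetical placeholders): it resolves each linking host to an IP address and groups the hosts by their /24 network block, so that links from many different blocks count as more diverse.

```python
import socket
from collections import defaultdict

# Hypothetical list of domains that link to your site.
backlink_hosts = ["example.com", "example.org", "example.net"]

def class_c(ip):
    """Return the class-C (/24) block of an IPv4 address, e.g. '93.184.216'."""
    return ".".join(ip.split(".")[:3])

blocks = defaultdict(list)
for host in backlink_hosts:
    try:
        ip = socket.gethostbyname(host)       # resolve the linking host
    except socket.gaierror:
        continue                              # skip hosts that don't resolve
    blocks[class_c(ip)].append(host)

# Fewer hosts per block means the links come from more diverse networks.
for block, hosts in blocks.items():
    print(f"{block}.x: {len(hosts)} linking host(s) -> {hosts}")
```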
DMOZ is the world's largest directory, used by Google and AOL to create their own directories. Getting a place in it practically means getting indexed by Google. Getting listed in the Yahoo directory costs $299 a year, but it is still worth getting listed, because once the search engines have recognized your site within that year, you won't have to pay again.
Another way to increase the number of links to your website is by writing articles related to your topic and then submitting them to article directories. In the footer of each article you write, you should include a "blurb" about yourself and also a LINK back to your website, using your main keyword in the "anchor text", of course. This is a
great way to get 1-way links to your website. A 1-way link is when another website
links to you without you linking back to them. Google and other search engines view
1-way links as being much more important than reciprocal links, so the more 1-way
links you can get for your website, the better rankings you will have. A great place to
find a list of many article directories to submit your articles to is the following
website:
http://www.pro-marketing-online.com/submit-articles.html
Try to find quality sites whose topics are compatible with your site's topic and ask their webmasters for a link exchange. A link to your site is a vote for you. Just type your keyword into Google, get the site names, and contact them requesting a link exchange. Be careful about getting involved in link farms: if your site has a link from a banned or penalised site, Google might ban your site as well. In a recent high-profile case, BMW Germany was delisted temporarily for engaging in a form of cloaking.
4. SEO MYTHS:
5. CONCLUSION:
A Web presence is essential for any Web site in today's world, and search engines are the most important way for users to find your site. The SEO community will continue to evolve as search engines find new ways to index and promote Web-based content. SEO theory will also continue to drive Black
Hat SEO practices as they react to the constantly changing criteria for
inclusion in search engine databases. But SEO theory should also remain a
viable part of White Hat or Best Practices SEO because it embraces the
holistic approach that White Hats take to web design and promotion.
6. REFERENCES:
BOOKS:
Search Engine Optimization Made Easy By Brad Callen.
WEB SITES:
http://www.googleguide.com/google_works
http://en.wikipedia.org/wiki/Sitemap
http://en.wikipedia.org/wiki/Meta_tag
http://en.wikipedia.org/wiki/Web_crawler
www.google.stanford.edu
www.1stQuery.com