The Basics of Search Engine Optimization
Our founder, Rand Fishkin, made a similar pyramid to explain the way folks
should go about SEO, and we've affectionately dubbed it "Mozlow's
hierarchy of SEO needs."
As you can see, the foundation of good SEO begins with ensuring crawl
accessibility, and moves up from there.
Using this beginner's guide, we can follow these seven steps to successful
SEO:
1. Crawl accessibility so engines can read your website
2. Compelling content that answers the searcher’s query
3. Keyword optimization to attract searchers & engines
4. Great user experience including a fast load speed and compelling UX
5. Share-worthy content that earns links, citations, and amplification
6. Title, URL, & description to draw high CTR in the rankings
7. Snippet/schema markup to stand out in SERPs
Chapter 1
SEO 101
What is it, and why is it important?
If you already have a solid understanding of SEO and why it's important,
you can skip to Chapter 2 (though we'd still recommend skimming the best
practices from Google and Bing at the end of this chapter; they're useful
refreshers).
For everyone else, this chapter will help build your foundational SEO
knowledge and confidence as you move forward.
What is SEO?
SEO stands for “search engine optimization.” It’s the practice of increasing
both the quality and quantity of website traffic, as well as exposure to your
brand, through non-paid (also known as "organic") search engine results.
If knowing your audience’s intent is one side of the SEO coin, delivering it
in a way search engine crawlers can find and understand is the other. In
this guide, expect to learn how to do both.
Some queries are answered directly on the results page. For example, if you search for "Denver weather," you'll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for "pizza Denver," you'll see a "local pack" result made up of Denver pizza places. Convenient, right?
It's worth noting that there are many other search features that, even
though they aren't paid advertising, can't typically be influenced by SEO.
These features often have data acquired from proprietary data sources,
such as Wikipedia, WebMD, and IMDb.
Organic search results cover more digital real estate, appear more credible
to savvy searchers, and receive way more clicks than paid advertisements.
For example, of all US searches, only ~2.8% of people click on paid
advertisements.
SEO is also one of the only online marketing channels that, when set up
correctly, can continue to pay dividends over time. If you provide a solid
piece of content that deserves to rank for the right keywords, your traffic
can snowball over time, whereas advertising needs continuous funding to
send traffic to your site.
Search engines are getting smarter, but they still need our help.
Optimizing your site will help deliver better information to search engines so
that your content can be properly indexed and displayed within search
results.
If you end up looking for expert help, it's important to know that many
agencies and consultants "provide SEO services," but can vary widely in
quality. Knowing how to choose a good SEO company can save you a lot
of time and money, as the wrong SEO techniques can actually harm your
site more than they will help.
While webmaster guidelines vary from search engine to search engine, the
underlying principles stay the same: Don’t try to trick search engines.
Instead, provide your visitors with a great online experience. To do that,
follow search engine guidelines and fulfill user intent.
If you'll be focusing on ranking in the Bing search engine, get to know their
guidelines, as well. It's only polite (and good sense!).
Your job as an SEO is to quickly provide users with the content they desire
in the format in which they desire it.
Also evaluate what content your top-ranking competitors are providing that
you currently aren’t. How can you provide 10X the value on your website?
Providing relevant, high-quality content on your website will help you rank
higher in search results, and more importantly, it will establish credibility
and trust with your online audience.
Before you do any of that, you have to first understand your website’s goals
to execute a strategic SEO plan.
What will your KPIs (Key Performance Indicators) be to measure the return
on SEO investment? More simply, what is your barometer to measure the
success of your organic search efforts? You'll want to have it documented,
even if it's this simple:
Sales
Downloads
Email signups
Contact form submissions
Phone calls
And if your business has a local component, you’ll want to define KPIs for
your Google My Business listings, as well. These might include:
Clicks-to-call
Clicks-to-website
Clicks-for-driving-directions
You may have noticed that things like “ranking” and “traffic” weren’t on the
KPIs list, and that’s intentional.
"But wait a minute!" you say. "I came here to learn about SEO because I heard it could help me rank and get traffic, and you're telling me those aren't important goals?"
Not at all! You’ve heard correctly. SEO can help your website rank higher in
search results and consequently drive more traffic to your website, it’s just
that ranking and traffic are a means to an end. There’s little use in ranking if
no one is clicking through to your site, and there’s little use in increasing
your traffic if that traffic isn’t accomplishing a larger business objective.
For example, if you run a lead generation site, would you rather have:
1,000 monthly visitors and 3 people fill out a contact form? Or...
300 monthly visitors and 40 people fill out a contact form?
If you’re using SEO to drive traffic to your site for the purpose of
conversions, we hope you’d pick the latter! Before embarking on SEO,
make sure you’ve laid out your business goals, then use SEO to help you
accomplish them — not the other way around.
SEO accomplishes so much more than vanity metrics. When done well, it
helps real businesses achieve real goals for their success.
Chapter 2
HOW SEARCH ENGINES WORK: CRAWLING, INDEXING,
AND RANKING
First, show up.
It’s possible to block search engine crawlers from part or all of your site, or
instruct search engines to avoid storing certain pages in their index. While
there can be reasons for doing this, if you want your content found by
searchers, you have to first make sure it’s accessible to crawlers and is
indexable. Otherwise, it’s as good as invisible.
By the end of this chapter, you’ll have the context you need to work with the
search engine, rather than against it!
For more accurate results than a quick "site:yourdomain.com" search, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search
Console account if you don't currently have one. With this tool, you can
submit sitemaps for your site and monitor how many submitted pages have
actually been added to Google's index, among other things.
If you're not showing up anywhere in the search results, there are a few possible reasons why:
Your site is brand new and hasn't been crawled yet.
Your site isn't linked to from any external websites.
Your site's navigation makes it hard for a robot to crawl it effectively.
Your site contains crawler directives that are blocking search engines.
Your site has been penalized by search engines for spammy tactics.
Most people think about making sure Google can find their important
pages, but it’s easy to forget that there are likely pages you don’t want
Googlebot to find. These might include things like old URLs that have thin
content, duplicate URLs (such as sort-and-filter parameters for e-
commerce), special promo code pages, staging or test pages, and so on.
To direct Googlebot away from certain pages and sections of your site, use
robots.txt.
Robots.txt
Robots.txt files are located in the root directory of websites (ex.
yourdomain.com/robots.txt) and suggest which parts of your site search
engines should and shouldn't crawl, as well as the speed at which they
crawl your site, via specific robots.txt directives.
You can read more details about this in the robots.txt portion of our
Learning Center.
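As a quick illustration, here's a minimal sketch of a robots.txt file; the folder names and crawl-delay value are hypothetical:

# Rules for all crawlers
User-agent: *
# Keep thin or private sections out of the crawl
Disallow: /staging/
Disallow: /promo-codes/
# Ask engines that honor it (e.g., Bing) to wait 10 seconds between requests
Crawl-delay: 10
# Point crawlers at your sitemap
Sitemap: https://www.yourdomain.com/sitemap.xml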
URL parameters
Some sites (most commonly e-commerce) make the same content available at multiple different URLs by appending parameters. For example, the same green dress might live at all of these URLs:
https://www.example.com/products/women/dresses/green.htm
https://www.example.com/products/women?category=dresses&color=green
https://example.com/shopindex.php?product_id=32&highlight=green+dress&cat_id=1&sessionid=123&affid=43
How does Google know which version of the URL to serve to searchers?
Google does a pretty good job at figuring out the representative URL on its
own, but you can use the URL Parameters feature in Google Search
Console to tell Google exactly how you want them to treat your pages. If
you use this feature to tell Googlebot “crawl no URLs with ____ parameter,”
then you’re essentially asking to hide this content from Googlebot, which
could result in the removal of those pages from search results. That’s what
you want if those parameters create duplicate pages, but not ideal if you
want those pages to be indexed.
Common navigation mistakes that can keep crawlers from seeing all of your site include menus whose links aren't in standard HTML (such as JavaScript-only navigation that some crawlers parse poorly), mobile navigation that shows different results than your desktop navigation, personalized navigation that shows different links to different visitor types, and forgetting to link to a primary page at all. This is why it's essential that your website has clear navigation and helpful URL folder structures.
Sitemaps
A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. Ensure that you've only included URLs that you want indexed by search engines, and be sure to give crawlers consistent directions. For example, don't include a URL in your sitemap if you've blocked it via robots.txt, and don't include URLs that are duplicates rather than the preferred, canonical version (we'll provide more information on canonicalization in Chapter 5!).
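For reference, here's a minimal sketch of an XML sitemap containing a single, hypothetical URL:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- List only the preferred, canonical URL for each page -->
    <loc>https://www.example.com/preferred-page/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
</urlset>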
Are crawlers getting errors when they try to access your URLs?
In the process of crawling the URLs on your site, a crawler may encounter
errors. You can go to Google Search Console’s “Crawl Errors” report to
detect URLs on which this might be happening - this report will show you
server errors and not found errors. Server log files can also show you this,
as well as a treasure trove of other information such as crawl frequency,
but because accessing and dissecting server log files is a more advanced
tactic, we won’t discuss it at length in the Beginner’s Guide, although you
can learn more about it here.
Before you can do anything meaningful with the crawl error report, it’s
important to understand server errors and "not found" errors.
4xx Codes: When search engine crawlers can’t access your content due to a client
error
4xx errors are client errors, meaning the requested URL contains bad
syntax or cannot be fulfilled. One of the most common 4xx errors is the
“404 – not found” error. These might occur because of a URL typo, deleted
page, or broken redirect, just to name a few examples. When search
engines hit a 404, they can’t access the URL. When users hit a 404, they
can get frustrated and leave.
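If you want to spot-check a URL's status code yourself, one quick way (assuming you have the curl command-line tool available) is to request only the response headers:

# -I fetches just the HTTP headers, including the status line
curl -I https://www.example.com/some-old-page
# An illustrative first line of a failing response: HTTP/1.1 404 Not Found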
5xx Codes: When search engine crawlers can’t access your content due to a server
error
5xx errors are server errors, meaning the server the web page is located on
failed to fulfill the searcher or search engine’s request to access the page.
In Google Search Console’s “Crawl Error” report, there is a tab dedicated to
these errors. These typically happen because the request for the URL
timed out, so Googlebot abandoned the request. View Google’s
documentation to learn more about fixing server connectivity issues.
Thankfully, there is a way to tell both searchers and search engines that
your page has moved — the 301 (permanent) redirect.
Why does that matter? Compare what a 301 does with what happens when you let a page 404:

Indexing
301 redirect: Helps Google find and index the new version of the page.
404: The presence of 404 errors on your site alone doesn't harm search performance, but letting ranking/trafficked pages 404 can result in them falling out of the index, with rankings and traffic going with them — yikes!

User experience
301 redirect: Ensures users find the page they're looking for.
404: Allowing your visitors to click on dead links will take them to error pages instead of the intended page, which can be frustrating.
The 301 status code itself means that the page has permanently moved to
a new location, so avoid redirecting URLs to irrelevant pages — URLs
where the old URL’s content doesn’t actually live. If a page is ranking for a
query and you 301 it to a URL with different content, it might drop in rank
position because the content that made it relevant to that particular query
isn't there anymore. 301s are powerful — move URLs responsibly!
You also have the option of 302 redirecting a page, but this should be
reserved for temporary moves and in cases where passing link equity isn’t
as big of a concern. 302s are kind of like a road detour. You're temporarily
siphoning traffic through a certain route, but it won't be like that forever.
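As a sketch of how both redirect types are commonly declared on an Apache server (the paths here are made up), an .htaccess file might contain:

# Permanent move: consolidates signals at the new URL
Redirect 301 /old-page/ https://www.example.com/new-page/
# Temporary detour: the original URL is expected to return
Redirect 302 /sale/ https://www.example.com/holiday-sale/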
Read on to learn about how indexing works and how you can make sure
your site makes it into this all-important database.
You can view what your cached version of a page looks like by clicking the
drop-down arrow next to the URL in the SERP and choosing "Cached":
You can also view the text-only version of your site to determine if your
important content is being crawled and cached effectively.
The URL is returning a "not found" error (4XX) or server error (5XX) –
This could be accidental (the page was moved and a 301 redirect
was not set up) or intentional (the page was deleted and 404ed in
order to get it removed from the index)
The URL had a noindex meta tag added – This tag can be added by
site owners to instruct the search engine to omit the page from its
index.
The URL has been manually penalized for violating the search
engine’s Webmaster Guidelines and, as a result, was removed from
the index.
The URL has been blocked from crawling with the addition of a
password required before visitors can access the page.
If you believe that a page on your website that was previously in Google’s
index is no longer showing up, you can use the URL Inspection tool to learn
the status of the page, or use Fetch as Google which has a "Request
Indexing" feature to submit individual URLs to the index. (Bonus: GSC’s
“fetch” tool also has a “render” option that allows you to see if there are any
issues with how Google is interpreting your page).
You can tell search engine crawlers things like "do not index this page in
search results" or "don’t pass any link equity to any on-page links". These
instructions are executed via Robots Meta Tags in the <head> of your
HTML pages (most commonly used) or via the X-Robots-Tag in the HTTP
header.
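For example, a robots meta tag telling engines not to index a page or follow its links would look like this in the page's <head>, with the equivalent HTTP response header shown beneath it (values are illustrative):

<head>
  <!-- Keep this page out of the index and don't follow its links -->
  <meta name="robots" content="noindex, nofollow" />
</head>

X-Robots-Tag: noindex, nofollow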
When you might use: If you run an e-commerce site and your prices
change regularly, you might consider the noarchive tag to prevent
searchers from seeing outdated pricing.
For example, you could easily exclude entire folders or file types (like
moz.com/no-bake/old-recipes-to-noindex):
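A minimal sketch of how this might look in an Apache .htaccess file, using the X-Robots-Tag header on a hypothetical file type:

<FilesMatch "\.pdf$">
  # Keep every PDF on the site out of the index
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>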
WordPress tip:
In Dashboard > Settings > Reading, make sure the "Search Engine Visibility" box is not checked. Checking that box blocks search engines from coming to your site via your robots.txt file!
Understanding the different ways you can influence crawling and indexing
will help you avoid the common pitfalls that can prevent your important
pages from getting found.
Why does the algorithm change so often? Is Google just trying to keep us
on our toes? While Google doesn’t always reveal specifics as to why they
do what they do, we do know that Google’s aim when making algorithm
adjustments is to improve overall search quality. That’s why, in response to
algorithm update questions, Google will answer with something along the
lines of: "We’re making quality updates all the time." This indicates that, if
your site suffered after an algorithm adjustment, compare it
against Google’s Quality Guidelines or Search Quality Rater Guidelines,
both are very telling in terms of what search engines want.
When search engines were just beginning to learn our language, it was
much easier to game the system by using tricks and tactics that actually go
against quality guidelines. Take keyword stuffing, for example. If you
wanted to rank for a particular keyword like “funny jokes,” you might add
the words “funny jokes” a bunch of times onto your page, and make it bold,
in hopes of boosting your ranking for that term, like this: "Funny jokes are funny! Come read our funny jokes, the funniest funny jokes on the web."
Links have historically played a big role in SEO. Very early on, search
engines needed help figuring out which URLs were more trustworthy than
others to help them determine how to rank search results. Calculating the
number of links pointing to any given site helped them do this.
The more natural backlinks you have from high-authority (trusted) websites,
the better your odds are to rank higher within search results.
Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher's intent), and RankBrain.
What is RankBrain?
RankBrain is the machine learning component of Google’s core algorithm.
Machine learning is a computer program that continues to improve its
predictions over time through new observations and training data. In other
words, it’s always learning, and because it’s always learning, search results
should be constantly improving.
Like most things with the search engine, we don’t know exactly what
comprises RankBrain, but apparently, neither do the folks at Google.
Many tests, including Moz’s own ranking factor survey, have indicated that
engagement metrics correlate with higher ranking, but causation has been
hotly debated. Are good engagement metrics just indicative of highly
ranked sites? Or are sites ranked highly because they possess good
engagement metrics?
Testimony from Udi Manber, who formerly led search quality at Google, suggests the latter:
"The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we'll switch it."
Another comment from former Google engineer Edmond Lau corroborates
this:
“It’s pretty clear that any reasonable search engine would use click data on
their own results to feed back into ranking to improve the quality of search
results. The actual mechanics of how click data is used is often proprietary,
but Google makes it obvious that it uses click data with its patents on
systems like rank-adjusted content items.”
Because Google needs to maintain and improve search quality, it seems
inevitable that engagement metrics are more than correlation, but it would
appear that Google falls short of calling engagement metrics a “ranking
signal” because those metrics are used to improve search quality, and the
rank of individual URLs is just a byproduct of that.
Since user engagement metrics are clearly used to adjust the SERPs for
quality, and rank position changes as a byproduct, it’s safe to say that
SEOs should optimize for engagement. Engagement doesn’t change the
objective quality of your web page, but rather your value to searchers
relative to other results for that query. That's why, after no changes to your page or its backlinks, it could decline in rankings if searchers' behavior indicates they like other pages better.
Google's SERPs today include many features beyond the classic organic listings, such as:
Paid advertisements
Featured snippets
People Also Ask boxes
Local (map) pack
Knowledge panel
Sitelinks
And Google is adding new ones all the time. They even experimented with
“zero-result SERPs,” a phenomenon where only one result from the
Knowledge Graph was displayed on the SERP with no results below it
except for an option to “view more results.”
The addition of these features caused some initial panic for two main
reasons. For one, many of these features caused organic results to be
pushed down further on the SERP. Another byproduct is that fewer
searchers are clicking on the organic results since more queries are being
answered on the SERP itself.
So why would Google do this? It all goes back to the search experience.
User behavior indicates that some queries are better satisfied by different
content formats. Notice how the different types of SERP features match the
different types of query intents.
A transactional query, for example, is likely to surface Shopping results.
We’ll talk more about intent in Chapter 3, but for now, it’s important to know
that answers can be delivered to searchers in a wide array of formats, and
how you structure your content can impact the format in which it appears in
search.
Localized search
A search engine like Google has its own proprietary index of local business
listings, from which it creates local search results.
If you are performing local SEO work for a business that has a physical
location customers can visit (ex: dentist) or for a business that travels to
visit their customers (ex: plumber), make sure that you claim, verify,
and optimize a free Google My Business Listing.
When it comes to localized search results, Google uses three main factors
to determine ranking:
1. Relevance
2. Distance
3. Prominence
Relevance
Relevance is how well a local business matches what the searcher is
looking for. To ensure that the business is doing everything it can to be
relevant to searchers, make sure the business’ information is thoroughly
and accurately filled out.
Distance
Google uses your geo-location to better serve you local results. Local search
results are extremely sensitive to proximity, which refers to the location of
the searcher and/or the location specified in the query (if the searcher
included one).
Prominence
With prominence as a factor, Google is looking to reward businesses that
are well-known in the real world. In addition to a business’ offline
prominence, Google also looks to some online factors to determine local
ranking, such as:
Reviews
The number of Google reviews a local business receives, and the
sentiment of those reviews, have a notable impact on their ability to rank in
local results.
Citations
A "business citation" or "business listing" is a web-based reference to a
local business' "NAP" (name, address, phone number) on a localized
platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).
Curious about a certain local business' citation accuracy? Moz has a free tool that can help out, aptly named Check Listing.
Undoubtedly now more than ever before, local results are being influenced
by real-world data. This interactivity is how searchers interact with and
respond to local businesses, rather than purely static (and game-able)
information like links and citations.
Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real-time engagement metrics to determine quality and relevance. In the coming chapters, you'll learn keyword research and on-page best practices that will help Google and users better understand your content.
Chapter 3
KEYWORD RESEARCH
Understand what your audience wants to find.
Now that you’ve learned how to show up in search results, let’s determine
which strategic keywords to target in your website’s content, and how to
craft that content to satisfy both users and search engines.
Keyword research provides you with specific search data that can help you answer questions like:
What are people searching for?
How many people are searching for it?
In what format do they want that information?
In this chapter, you'll get tools and strategies for uncovering that
information, as well as learn tactics that'll help you avoid keyword research
foibles and build strong content. Once you uncover how your target
audience is searching for your content, you begin to uncover a whole new
world of strategic SEO!
This is where corners are often cut. Too many people bypass this crucial
planning step because keyword research takes time, and why spend the
time when you already know what you want to rank for?
The answer is that what you want to rank for and what your audience
actually wants are often two wildly different things. Focusing on your
audience and then using keyword data to hone those insights will make for
much more successful campaigns than focusing on arbitrary keywords.
Imagine, for example, that you run an ice cream business. Your keyword research might start with questions like:
What types of ice cream, desserts, snacks, etc. are people searching for?
Who is searching for these terms?
When are people searching for ice cream, snacks, desserts, etc.?
- Are there seasonality trends throughout the year?
How are people searching for ice cream?
- What words do they use?
- What questions do they ask?
- Are more searches performed on mobile devices?
Why are people seeking ice cream?
- Are individuals looking for health-conscious ice cream specifically, or just looking to satisfy a sweet tooth?
Where are potential customers located — locally, nationally, or internationally?
And finally — here's the kicker — how can you help provide the best
content about ice cream to cultivate a community and fulfill what all those
people are searching for? Asking these questions is a crucial planning step
that will guide your keyword research and help you craft better content.
Discovering keywords
You likely have a few keywords in mind that you would like to rank for.
These will be things like your products, services, or other topics your
website addresses, and they are great seed keywords for your research, so
start there! You can enter those keywords into a keyword research tool to
discover average monthly search volume and similar keywords. We’ll get
into search volume in greater depth in the next section, but during the
discovery phase, it can help you determine which variations of your
keywords are most popular amongst searchers.
Once you enter your seed keywords into a keyword research tool, you will begin to discover other keywords, common questions, and topics for your content that you might have otherwise missed.
Typing "wedding" and "florist" into a keyword research tool, you may discover highly relevant, highly searched-for related terms such as:
Wedding bouquets
Bridal flowers
Wedding flower shop
In the process of discovering relevant keywords for your content, you will
likely notice that the search volume of those keywords varies greatly. While
you definitely want to target terms that your audience is searching for, in
some cases, it may be more advantageous to target terms with lower
search volume because they're far less competitive.
Typically, the higher the search volume, the greater the competition and
effort required to achieve organic ranking success. Go too low, though, and
you risk not drawing any searchers to your site. In many cases, it may be
most advantageous to target highly specific, lower competition search
terms. In SEO, we call those long-tail keywords.
It's wonderful to deal with keywords that have 50,000 searches a month, or
even 5,000 searches a month, but in reality, these popular search terms
only make up a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent; targeting those terms could put you at risk of drawing visitors to your site whose goals don't match the content your page provides.
Does the searcher want to know the nutritional value of pizza? Order a
pizza? Find a restaurant to take their family? Google doesn’t know, so they
offer these features to help you refine. Targeting “pizza” means that you’re
likely casting too wide a net.
If you're searching for "pizza," Google thinks you may also be interested in
"cheese." They're not wrong...
Was your intent to find a pizza place for lunch? The "Discover more places"
SERP feature has that covered.
Popular "head" terms account for only a minority of all queries; the remaining 75% lie in the "chunky middle" and "long tail" of search.
Don’t underestimate these less popular keywords. Long tail keywords with
lower search volume often convert better, because searchers are more
specific and intentional in their searches. For example, a person searching
for "shoes" is probably just browsing. On the other hand, someone
searching for "best price red womens size 7 running shoe" practically has
their wallet out!
Questions are SEO gold!
Discovering what questions people are asking in your space — and adding
those questions and their answers to an FAQ page — can yield incredible
organic traffic for your website.
Keywords by competitor
You’ll likely compile a lot of keywords. How do you know which to tackle
first? It could be a good idea to prioritize high-volume keywords that your
competitors are not currently ranking for. On the flip side, you could also
see which keywords from your list your competitors are already ranking for
and prioritize those. The former is great when you want to take advantage
of your competitors’ missed opportunities, while the latter is an aggressive
strategy that sets you up to compete for keywords your competitors are
already performing well for.
Keywords by season
Knowing about seasonal trends can be advantageous in setting a content
strategy. For example, if you know that “christmas box” starts to spike in
October through December in the United Kingdom, you can prepare
content months in advance and give it a big push around those months.
Keywords by region
You can more strategically target a specific location by narrowing down
your keyword research to specific towns, counties, or states in the Google
Keyword Planner, or evaluate "interest by subregion" in Google Trends.
Geo-specific research can help make your content more relevant to your
target audience. For example, you might find out that in Texas, the
preferred term for a large truck is “big rig,” while in New York, “tractor
trailer” is the preferred terminology.
While there are thousands of possible search types, let's take a closer look at five major categories of intent:
Informational queries: The searcher needs information, such as the name of a band or the height of the Empire State Building.
Navigational queries: The searcher wants to go to a particular place on the Internet, such as Facebook or the homepage of the NFL.
Transactional queries: The searcher wants to do something, such as buy a plane ticket or listen to a song.
Commercial investigation: The searcher wants to compare products and find the best one for their specific needs.
Local queries: The searcher wants to find something locally, such as a nearby coffee shop, doctor, or music venue.
If you're enjoying this chapter so far, be sure to check out the Keyword
Research episode of our One-Hour Guide to SEO video series!
Google has a wide array of result types it can serve up depending on the
query, so if you’re going to target a keyword, look to the SERP to
understand what type of content you need to create.
Now that you know how your target market is searching, it’s time to dive
into on-page SEO, the practice of crafting web pages that answer
searcher’s questions. On-page SEO is multifaceted, and extends beyond
content into other things like schema and meta tags, which we’ll discuss
more at length in the next chapter on technical optimization. For now, put
on your wordsmithing hats — it’s time to create your content!
1. Survey your keywords and group those with similar topics and intent.
Those groups will be your pages, rather than creating individual
pages for every keyword variation.
2. If you haven't done so already, evaluate the SERP for each keyword or group of keywords to determine what type and format your content should be. Some characteristics of ranking pages to take note of:
- Are they image- or video-heavy?
- Is the content long-form or short and concise?
- Is the content formatted in lists, bullets, or paragraphs?
3. Ask yourself, “What unique value could I offer to make my page
better than the pages that are currently ranking for my keyword?”
On-page SEO allows you to turn your research into content your audience
will love. Just make sure to avoid falling into the trap of low-value tactics
that could hurt more than help!
Thin content
While it’s common for a website to have unique pages on different topics,
an older content strategy was to create a page for every single iteration of
your keywords in order to rank on page 1 for those highly specific queries.
For example, if you were selling bridal dresses, you might have created
individual pages for bridal gowns, bridal dresses, wedding gowns, and
wedding dresses, even if each page was essentially saying the same thing.
A similar tactic for local businesses was to create multiple pages of content
for each city or region from which they wanted clients. These “geo pages”
often had the same or very similar content, with the location name being
the only unique factor.
Tactics like these clearly weren’t helpful for users, so why did publishers do
it? Google wasn’t always as good as it is today at understanding the
relationships between words and phrases (or semantics). So, if you wanted
to rank on page 1 for “bridal gowns” but you only had a page on “wedding
dresses,” that may not have cut it.
This practice created tons of thin, low-quality content across the web, which
Google addressed specifically with its 2011 update known as Panda. This
algorithm update penalized low-quality pages, which resulted in more
quality pages taking the top spots of the SERPs. Google continues to
iterate on this process of demoting low-quality content and promoting high-
quality content today.
Search engines ask that you show them the same content a human visitor would see. When this guideline is broken, they call it "cloaking" and take action to prevent these pages from ranking in search results. Cloaking can be accomplished in any number of ways and for a variety of reasons, both positive and negative. Below is an example of an instance where Spotify showed different content to users than to Google.
Users were presented with a login screen in Spotify when searching for the
National Philharmonic orchestra.
Viewing Google's cached version of the page shows the content Spotify
provided to the search engine.
In some cases, Google may let practices that are technically cloaking pass
because they contribute to a positive user experience. For more on the
subject of hidden content and how Google handles it, see our Whiteboard
Friday entitled How Does Google Handle CSS + Javascript "Hidden" Text?
Keyword stuffing
If you’ve ever been told, “You need to include {critical keyword} on this
page X times,” you’ve seen the confusion over keyword usage in action.
Many people mistakenly think that if you just include a keyword within your
page’s content X times, you will automatically rank for it. The truth is,
although Google looks for mentions of keywords and related concepts on
your site’s pages, the page itself has to add value outside of pure keyword
usage. If a page is going to be valuable to users, it won’t sound like it was
written by a robot, so incorporate your keywords and phrases naturally in a
way that is understandable to your readers.
Auto-generated content
Arguably one of the most offensive forms of low-quality content is the kind
that is auto-generated, or created programmatically with the intent of
manipulating search rankings and not helping users. You may recognize some auto-generated content by how little it makes sense when read; it's technically made of words, but they're strung together by a program rather than a human being.
For example, a business with several locations might create a page for each one:
example.com/seattle
example.com/tacoma
example.com/bellevue
Each page should be uniquely optimized for that location, so the Seattle
page would have unique content discussing the Seattle location, list the
Seattle NAP, and even testimonials specifically from Seattle customers. If
there are dozens, hundreds, or even thousands of locations, a store locator
widget could be employed to help you scale.
In this scenario, the coffee shop should optimize their website for their
physical location, whereas Moz would target “SEO software” without a
location-specific modifier like “Seattle.”
How you choose to optimize your site depends largely on your audience,
so make sure you have them in mind when crafting your website content.
Hope you still have some energy left after handling the difficult-yet-
rewarding task of putting together a page that is 10x better than your
competitors’ pages, because there are just a few more things needed
before your page is complete! In the next sections, we’ll talk about the other
on-page optimizations your pages need, as well as naming and organizing
your content.
How can I control what title and description show up for my page in search results? We'll cover exactly that in the title tag and meta description sections later in this chapter.
Header tags
Header tags are an HTML element used to designate headings on your
page. The main header tag, called an H1, is typically reserved for the title
of the page. It looks like this:
<h1>Page Title</h1>
There are also sub-headings that go from H2 to H6 tags, although using all
of these on a page is not required. The hierarchy of header tags goes from
H1 to H6 in descending order of importance.
Each page should have a unique H1 that describes the main topic of the page; it is often automatically created from the title of the page. As the main descriptive title of the page, the H1 should contain that page's primary keyword or phrase. You should avoid using header tags to mark up non-heading elements, such as navigational buttons and phone numbers. Use header tags to introduce what the following content will discuss.
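Put together, a sensible heading outline might look something like this (placeholder text):

<h1>Page Title</h1>
  <h2>First major section</h2>
    <h3>A supporting detail within that section</h3>
  <h2>Second major section</h2>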
Internal links
In Chapter 2, we discussed the importance of having a crawlable website.
Part of a website’s crawlability lies in its internal linking structure. When you
link to other pages on your website, you ensure that search engine
crawlers can find all your site’s pages, you pass link equity (ranking power)
to other pages on your site, and you help visitors navigate your site.
Link accessibility
Links that require a click (like a navigation drop-down to view) are often
hidden from search engine crawlers, so if the only links to internal pages on
your website are through these types of links, you may have trouble getting
those pages indexed. Opt instead for links that are directly accessible on
the page.
Anchor text
Anchor text is the text with which you link to pages. Below, you can see an
example of what a hyperlink without anchor text and a hyperlink with
anchor text would look like in the HTML.
<a href="http://www.example.com/">http://www.example.com/</a>
<a href="http://www.example.com/">Keyword Text</a>
On live view, that would look like this:
http://www.example.com/
Keyword Text
The anchor text sends signals to search engines regarding the content of
the destination page. For example, if I link to a page on my site using the
anchor text “learn SEO,” that’s a good indicator to search engines that the
targeted page is one at which people can learn about SEO. Be careful not to overdo it, though. Too many internal links using the same, keyword-stuffed anchor text can signal to search engines that you're trying to manipulate a page's ranking. It's best to make anchor text natural rather than formulaic.
Link volume
In Google’s General Webmaster Guidelines, they say to “limit the number
of links on a page to a reasonable number (a few thousand at most).” This
is part of Google’s technical guidelines, rather than the quality guideline
section, so having too many internal links isn’t something that on its own is
going to get you penalized, but it does affect how Google finds and
evaluates your pages.
The more links on a page, the less equity each link can pass to its
destination page. A page only has so much equity to go around.
So it’s safe to say that you should only link when you mean it! You
can learn more about link equity from our SEO Learning Center.
Aside from passing authority between pages, a link is also a way to help
users navigate to other pages on your site. This is a case where doing
what’s best for search engines is also doing what’s best for searchers. Too
many links not only dilute the authority of each link, but they can also be
unhelpful and overwhelming. Consider how a searcher might feel landing on a page crammed with hundreds of links and little else.
Redirection
Removing and renaming pages is a common practice, but in the event that
you do move a page, make sure to update the links to that old URL! At the
very least, you should make sure to redirect the URL to its new location,
but if possible, update all internal links to that URL at the source so that
users and crawlers don’t have to pass through redirects to arrive at the
destination page. If you choose to redirect only, be careful to avoid redirect
chains that are too long (Google says, "Avoid chaining redirects... keep the
number of redirects in the chain low, ideally no more than 3 and fewer than
5.")
Bad (a chain): example.com/location1 → example.com/location2 → example.com/location3
Better (direct): example.com/location1 → example.com/location3
Image optimization
Images are the biggest culprits of slow web pages! The best way to solve for this is to compress your images. There is no one-size-fits-all approach to image compression, so test various options, such as "save for web" export settings, different image dimensions, and compression tools like Optimizilla or ImageOptim for Mac (or Windows alternatives), and evaluate what works best.
Another way to help optimize your images (and improve your page speed)
is by choosing the right image format.
Alt text
Alt text (alternative text) within images is a principle of web accessibility,
and is used to describe images to the visually impaired via screen readers.
It’s important to have alt text descriptions so that any visually impaired
person can understand what the pictures on your website depict.
Search engine bots also crawl alt text to better understand your images,
which gives you the added benefit of providing better image context to
search engines. Just ensure that your alt descriptions read naturally for people, and avoid stuffing keywords for search engines.
For example (the file name and descriptions are illustrative):
Bad: <img src="cat.gif" alt="cat cat cats kitten funny cats">
Good: <img src="cat.gif" alt="A grumpy gray cat scowling at the camera">
Text size and color - Avoid fonts that are too tiny. Google
recommends 16-point font and above to minimize the need for
“pinching and zooming” on mobile. The text color in relation to the
page’s background color should also promote readability. Additional
information on text can be found in the website accessibility
guidelines and via Google’s web accessibility fundamentals.
Headings - Breaking up your content with helpful headings can help
readers navigate the page. This is especially useful on long pages
where a reader might be looking only for information from a particular
section.
Bullet points - Great for lists, bullet points can help readers skim
and more quickly find the information they need.
Paragraph breaks - Avoiding walls of text can help prevent page
abandonment and encourage site visitors to read more of your page.
Supporting media - When appropriate, include images, videos, and
widgets that would complement your content.
Bold and italics for emphasis - Putting words in bold or italics can
add emphasis, so they should be the exception, not the rule.
Appropriate use of these formatting options can call out important
points you want to communicate.
There is no special code that you can add to your page to show up here,
nor can you pay for this placement, but taking note of the query intent can
help you better structure your content for featured snippets. For example, if
you’re trying to rank for “cake vs. pie,” it might make sense to include a
table in your content, with the benefits of cake in one column and the
benefits of pie in the other. Or if you’re trying to rank for “best restaurants to
try in Portland,” that could indicate Google wants a list, so formatting your
content in bullets could help.
Title tags
A page's title tag is a descriptive HTML element that specifies the title of a particular web page. Title tags are nested within the head tag of each page and look like this (placeholder text):
<head>
<title>Example Title</title>
</head>
Your page's title also shows up in browser tabs and when you share the link to your page on certain external websites.
Your title tag has a big role to play in people’s first impression of your
website, and it’s an incredibly effective tool for drawing searchers to your
page over any other result on the SERP. The more compelling your title
tag, combined with high rankings in search results, the more visitors you’ll
attract to your website. This underscores that SEO is not only about search
engines, but rather the entire user experience.
Meta descriptions
Like title tags, meta descriptions are HTML elements that describe the contents of the page that they're on. They are also nested in the head tag, and look like this (placeholder text):
<meta name="description" content="A brief summary of this page's content." />
Google will sometimes pull text from your page's copy instead of your designated meta description when it deems that text more relevant to the specific search; you might see this on a query like "find backlinks." This often helps to improve your meta descriptions for unique searches. However, don't let this deter you from writing a default page meta description — they're still extremely valuable.
What makes an effective meta description?
The qualities that make an effective title tag also apply to effective meta
descriptions. Although Google says that meta descriptions are not a
ranking factor, like title tags, they are incredibly important for click-through
rate.
Which of these example URLs would you rather click?
example.com/desserts/chocolate-pie
or
example.com/asdf/453?=recipe-23432-1123
Searchers are more likely to click on URLs that reinforce and clarify what
information is contained on that page, and less likely to click on URLs that
confuse them.
The URL is a minor ranking signal, but you cannot expect to rank on the
basis of the words in your domain/page names alone (see Google EMD
update). When naming your pages or selecting a domain name, have your
audience in mind first.
Page organization
If you discuss multiple topics on your website, you should also make sure
to avoid nesting pages under irrelevant folders. For example:
example.com/commercial-litigation/alimony
It would have been better for this fictional multi-practice law firm website to
nest alimony under “/family-law/” than to host it under the irrelevant
"/commercial-litigation/" section of the website.
The folders in which you locate your content can also send signals about
the type, not just the topic, of your content. For example, dated URLs can
indicate time-sensitive content. While appropriate for news-based websites,
dated URLs for evergreen content can actually turn searchers away
because the information seems outdated. For example:
example.com/2015/april/what-is-seo/
vs.
example.com/what-is-seo/
Since the topic “What is SEO?” isn’t confined to a specific date, it’s best to
host on a non-dated URL structure or else risk your information appearing
stale.
As you can see, what you name your pages, and in what folders you
choose to organize your pages, is an important way to clarify the topic of
your page to users and search engines.
URL length
While it is not necessary to have a completely flat URL structure, many
click-through rate studies indicate that, when given the choice between a longer URL and a shorter URL, searchers often prefer shorter URLs. Like title tags
and meta descriptions that are too long, too-long URLs will also be cut off
with an ellipsis. Just remember, having a descriptive URL is just as
important, so don’t cut down on URL length if it means sacrificing the URL's
descriptiveness.
example.com/services/plumbing/plumbing-repair/toilets/leaks/
vs.
example.com/plumbing-repair/toilets/
Minimizing length, both by including fewer words in your page names and
removing unnecessary subfolders, makes your URLs easier to copy and
paste, as well as more clickable.
Keywords in URL
If your page is targeting a specific term or phrase, make sure to include it in
the URL. However, don't go overboard by trying to stuff in multiple
keywords for purely SEO purposes. It’s also important to watch out for
repeat keywords in different subfolders. For example, you may have
naturally incorporated a keyword into a page name, but if located within
other folders that are also optimized with that keyword, the URL could
begin to appear keyword-stuffed.
Example:
example.com/seattle-dentist/dental-services/dental-crowns/
Keyword overuse in URLs can appear spammy and manipulative. If you
aren’t sure whether your keyword usage is too aggressive, just read your
URL through the eyes of a searcher and ask, “Does this look natural?
Would I click on this?”
Static URLs
The best URLs are those that can easily be read by humans, so you should
avoid the overuse of parameters, numbers, and symbols. Using
technologies like mod_rewrite for Apache and ISAPI_rewrite for Microsoft,
you can easily transform dynamic URLs like this:
http://moz.com/blog?id=123
into a more readable static version like this:
https://moz.com/google-algorithm-change
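A minimal sketch of such a rewrite in an Apache .htaccess file (the pattern and target here are hypothetical):

RewriteEngine On
# Serve the readable URL while fetching the dynamic page behind the scenes
RewriteRule ^google-algorithm-change$ /blog?id=123 [L]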
Case sensitivity
Sites should avoid case-sensitive URLs. Instead of
example.com/desserts/Chocolate-Pie-Recipe it would be better to use
example.com/desserts/chocolate-pie-recipe. If the site you're working on
has lots of mixed-case URLs indexed, don't fret — your developers can
help. Ask them about adding a rewrite formula to something known as
the .htaccess file to automatically make any uppercase URLs lowercase.
Geographic modifiers in URLs
Some local business owners omit geographic terms that describe their
physical location or service area because they believe that search engines
can figure this out on their own. On the contrary, it’s vital that local business
websites’ content, URLs, and other on-site assets make specific mention of
city names, neighborhood names, and other regional descriptors. Let both
consumers and search engines know exactly where you are and where you
serve, rather than relying on your physical location alone.
Since the technical structure of a site can have a massive impact on its
performance, it’s crucial for everyone to understand these principles. It
might also be a good idea to share this part of the guide with your
programmers, content writers, and designers so that all parties involved in
a site's construction are on the same page.
Below, we outline the website’s journey from domain name purchase all the
way to its fully rendered state in a browser. An important component of the
website’s journey is the critical rendering path, which is the process of a
browser turning a website’s code into a viewable page.
Knowing this about websites is important for SEOs to understand for a few
reasons:
The steps in this webpage assembly process can affect page load
times, and speed is not only important for keeping users on your site,
but it’s also one of Google’s ranking factors.
Google renders certain resources, like JavaScript, on a "second
pass." Google will look at the page without JavaScript first, then a few
days to a few weeks later, it will render JavaScript, meaning SEO-
critical elements that are added to the page using JavaScript might
not get indexed.
Imagine that the website loading process is your commute to work. You get
ready at home, gather your things to bring to the office, and then take the
fastest route from your home to your work. It would be silly to put on just
one of your shoes, take a longer route to work, drop your things off at the
office, then immediately return home to get your other shoe, right? That’s
sort of what inefficient websites do. This chapter will teach you how to
diagnose where your website might be inefficient, what you can do to
streamline, and the positive ramifications on your rankings and user
experience that can result from that streamlining.
Now that you know how a website appears in a browser, we’re going to
focus on what a website is made of — in other words, the code
(programming languages) used to construct those web pages.
A few notes on how CSS relates to SEO:
Since style directives can live in external stylesheet files (CSS files)
instead of your page’s HTML, it makes your page less code-heavy,
reducing file transfer size and making load times faster.
Browsers still have to download resources like your CSS file,
so compressing them can make your webpages load faster, and
page speed is a ranking factor.
Having your pages be more content-heavy than code-heavy can lead
to better indexing of your site’s content.
Using CSS to hide links and content can get your website manually
penalized and removed from Google’s index.
You’ve definitely seen JavaScript in action — you just may not have known
it! That’s because JavaScript can do almost anything to a page. It could
create a pop-up, for example, or it could request third-party resources like
ads to display on your page.
SEO-critical page elements such as text, links, and tags that are loaded on
the client’s side with JavaScript, rather than represented in your HTML, are
invisible from your page’s code until they are rendered. This means that
search engine crawlers won’t see what’s in your JavaScript — at least not
initially.
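To make that concrete, here's a minimal, hypothetical page where the main copy exists only after JavaScript runs; anything reading the raw HTML sees an empty div:

<div id="content"></div>
<script>
  // Invisible in the raw HTML source; this heading and text appear only
  // once the script executes in a browser (or in Google's render phase)
  document.getElementById('content').innerHTML =
    '<h1>Welcome!</h1><p>All of our SEO-critical copy lives here.</p>';
</script>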
There are also some other things that can go wrong during Googlebot's process of rendering your web pages and prevent Google from understanding what's contained in your JavaScript, such as resources blocked from Googlebot by robots.txt or scripts that time out before rendering completes.
Thankfully, there's a way to check whether Google sees the same thing as your visitors: Google Search Console's "URL Inspection" tool. Simply paste your page's URL into the GSC search bar.
After Googlebot has recrawled your URL, click "View Tested Page" to see
how your page is being crawled and rendered.
Clicking the "Screenshot" tab adjacent to "HTML" shows how Googlebot
smartphone renders your page.
In return, you’ll see how Googlebot sees your page versus how a visitor (or
you) may see the page. In the "More Info" tab, Google will also show you a
list of any resources they may not have been able to get for the URL you
entered.
Understanding the way websites work lays a great foundation for what we’ll
talk about next: technical optimizations to help Google understand the
pages on your website better.
The rel="canonical" tag allows you to tell search engines where the original,
master version of a piece of content is located. You’re essentially saying,
"Hey search engine! Don’t index this; index this source page instead." So, if
you want to republish a piece of content, whether exactly or slightly
modified, but don’t want to risk creating duplicate content, the canonical tag
is here to save the day.
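The tag itself sits in the <head> of the duplicate (or republished) page and points to the master version (the URL is a placeholder):

<head>
  <!-- Consolidate ranking signals to the original URL -->
  <link rel="canonical" href="https://www.example.com/original-page/" />
</head>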
"Avoid duplicate content" is an Internet truism, and for good reason! Google
wants to reward sites with unique, valuable content — not content that’s
taken from other sources and repeated across multiple pages. Because
engines want to provide the best searcher experience, they will rarely show
multiple versions of the same content, opting instead to show only the
canonicalized version, or if a canonical tag does not exist, whichever
version they deem most likely to be the original.
Responsive design
Responsive websites are designed to fit the screen of whatever type of
device your visitors are using. You can use CSS to make the web page
"respond" to the device size. This is ideal because it prevents visitors from
having to double-tap or pinch-and-zoom in order to view the content on
your pages. Not sure if your web pages are mobile friendly? You can use
Google’s mobile-friendly test to check!
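Under the hood, responsive pages pair a viewport meta tag with CSS media queries, along these lines (the breakpoint and class name are illustrative):

<meta name="viewport" content="width=device-width, initial-scale=1" />
<style>
  /* Stack the sidebar below the main content on narrow screens */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>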
AMP
AMP stands for Accelerated Mobile Pages, and it's used to deliver content
to mobile visitors at speeds much greater than with non-AMP delivery. AMP
is able to deliver content so fast because it delivers content from its cache
servers (not the original site) and uses a special AMP version of HTML and
JavaScript.
Mobile-first indexing
As of 2018, Google started switching websites over to mobile-first indexing.
That change sparked some confusion between mobile-friendliness and
mobile-first, so it’s helpful to disambiguate. With mobile-first indexing,
Google crawls and indexes the mobile version of your web pages. Making your website compatible with mobile screens is good for users and your performance in search, but mobile-first indexing happens independently of mobile-friendliness.
This has raised some concerns for websites that lack parity between
mobile and desktop versions, such as showing different content,
navigation, links, etc. on their mobile view. A mobile site with different links,
for example, will alter the way in which Googlebot (mobile) crawls your site
and sends link equity to your other pages.
SRCSET: How to deliver the best image size for each device
The SRCSET attribute allows you to have multiple versions of your image
and then specify which version should be used in different situations. This
piece of code is added to the <img> tag (where your image is located in the
HTML) to provide unique images for specific-sized devices.
This doesn’t just speed up your image load time, it’s also a unique way to
enhance your on-page user experience by providing different and optimal
images to different device types.
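A minimal srcset sketch (the file names, widths, and sizes rule are illustrative):

<img src="photo-800.jpg"
     srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 400px, 800px"
     alt="A description of the photo" />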
By both minifying and bundling the files needed to construct your web
page, you’ll speed up your website and reduce the number of your HTTP
(file) requests.
Language
Sites that target speakers of multiple languages are considered
multilingual websites. These sites should add something called an
hreflang tag to show Google that your page has copy for another
language. Learn more about hreflang.
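For example, a page with English and French versions might declare both in its <head> (the URLs are placeholders):

<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/page/" />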
Country
Sites that target audiences in multiple countries are called multi-
regional websites and they should choose a URL structure that
makes it easy to target their domain or pages to specific countries.
This can include the use of a country code top level domain (ccTLD)
such as “.ca” for Canada, or a generic top-level domain (gTLD) with a
country-specific subfolder such as “example.com/ca” for
Canada. Learn more about locale-specific URLs.
Chapter 6
LINK BUILDING & ESTABLISHING AUTHORITY
Turn up the volume.
You've created content that people are searching for, that answers their
questions, and that search engines can understand, but those qualities
alone don't mean it'll rank. To outrank the rest of the sites with those
qualities, you have to establish authority. That can be accomplished by
earning links from authoritative websites, building your brand, and nurturing
an audience who will help amplify your content.
But what is a link, exactly? How do you go about earning them from other
websites? Let's start with the basics.
Since the late 1990s, search engines have treated links as votes for
popularity and importance on the web.
Internal links, or links that connect internal pages of the same domain,
work very similarly for your website. A high number of internal links pointing
to a particular page on your site will provide a signal to Google that the
page is important, so long as it's done naturally and not in a spammy way.
The engines themselves have refined the way they view links, now using
algorithms to evaluate sites and pages based on the links they find. But
what's in those algorithms? How do the engines evaluate all those links? It
all starts with the concept of E-A-T: Expertise, Authoritativeness, and Trustworthiness.
To earn trust and authority with search engines, you'll need links from
websites that display the qualities of E-A-T. These don't have to be
Wikipedia-level sites, but they should provide searchers with credible,
trustworthy content.
Moz has proprietary metrics to help you determine how authoritative a site
is: Domain Authority, Page Authority, and Spam Score. In general, you'll
want links from sites with a higher Domain Authority than your site's.
Just like it sounds, "nofollow" tells search engines not to follow the link.
Some engines still follow them simply to discover new pages, but these
links don't pass link equity (the "votes of popularity" we talked about
above), so they can be useful in situations where a page is either linking to
an untrustworthy source or was paid for or created by the owner of the
destination page (making it an unnatural link).
Say, for example, you write a post about link building practices, and want to
call out an example of poor, spammy link building. You could link to the
offending site without signaling to Google that you trust it.
Standard links (ones that haven't had nofollow added) look like this:
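<a href="https://example.com">Example anchor text</a>

A nofollowed link simply adds the rel attribute (the URL here is just a placeholder):

<a href="https://example.com" rel="nofollow">Example anchor text</a>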
It's natural for your site to have a balance between nofollowed and followed
backlinks in its link profile (more on link profiles below). A nofollow link
might not pass authority, but it could send valuable traffic to your site and
even lead to future followed links.
Use the MozBar extension for Google Chrome to highlight links on any
page to find out whether they're nofollow or follow without ever having to
view the source code!
A healthy link profile is one that indicates to search engines that you're
earning your links and authority fairly. Just like you shouldn't lie, cheat, or
steal, you should strive to ensure your link profile is honest and earned via
good, old-fashioned hard work.
Naturally earned links require no specific action from you, other than the
creation of worthy content and the ability to create awareness about it.
Linking domains don't have to match the topic of your page exactly, but
they should be related. Avoid pursuing backlinks from sources that are
completely off-topic; there are far better uses of your time.
Consider this: you ask ten separate friends at separate times how their day is going, and each one responds with the exact same phrase. You'd suspect the answers were scripted. The same goes for links; if every backlink to a page uses identical anchor text, search engines may suspect the links were manufactured rather than earned.
A guiding principle for your link building efforts is to never try to manipulate
a site's ranking in search results.
But isn't that the entire goal of SEO? To increase a site's ranking in search
results? And herein lies the confusion. Google wants you to earn links, not
build them, but the line between the two is often blurry. To avoid penalties
for unnatural links (known as "link spam"), Google has made clear what
should be avoided.
⃠ Purchased links
Google and Bing both seek to discount the influence of paid links in their
organic search results. While a search engine can't know which links were
earned vs. paid for from viewing the link itself, there are clues it uses to
detect patterns that indicate foul play. Websites caught buying or selling
followed links risk penalties that can severely drop their rankings.
(By the way, exchanging goods or services for a link is also a form of
payment and qualifies as buying links.)
It is acceptable, and even valuable, to link to people you work with, partner
with, or have some other affiliation with and have them link back to you.
It's the exchange of links at mass scale with unaffiliated sites that can
warrant penalties.
If your site does get a manual penalty, there are steps you can take to get it
lifted.
Be earned/editorial
Publish a blog
This content and link building strategy is so popular and valuable that it's
one of the few recommended personally by the engineers at Google. Blogs
have the unique ability to contribute fresh material on a consistent basis,
generate conversations across the web, and earn listings and links from
other blogs.
Careful, though — you should avoid low-quality guest posting just for the
sake of link building. Google has advised against this and your energy is
better spent elsewhere.
Create unique resources
Creating unique, high-quality resources is no easy task, but it's well worth
the effort. High quality content that is promoted in the right ways can be
widely shared. It can help to create pieces that have the following traits:
Creating a resource like this is a great way to attract a lot of links with one
page. You could also create a highly specific resource, without as broad an appeal, that targets a handful of websites. You might see a higher rate of success, but that approach isn't as scalable.
Users who see this kind of unique content often want to share it with
friends, and bloggers/tech-savvy webmasters who see it will often do so
through links. These high-quality, editorially earned votes are invaluable to
building trust, authority, and rankings potential.
For example, if you were doing link building for a company that made pots
and pans, you could search for:
cooking intitle:"resources"
...and see which pages might be good link targets.
This can also give you great ideas for content creation — just think about
which types of resources you could create that these pages would all like to
reference and link to.
All of these smart and authentic strategies provide good local link
opportunities.
Building linked unstructured citations — references to a business' contact
information on a non-directory platform, like a blog or a news site — is
important for moving the ranking needle for local SEO. It's also a great way
to earn valuable links when you're marketing a local business. Learn more
in our guide, Link Building for Local SEO.
You can also dust off, update, and simply republish older content on the
same platform. If you discover that a few trusted industry websites all linked
to a popular resource that's gone stale, update it and let those industry
websites know — you may just earn a good link.
You can also do this with images. Reach out to websites that are using
your images and not citing you or linking back to you and ask if they'd mind
including a link.
Be newsworthy
Earning the attention of the press, bloggers, and news media is an
effective, time-honored way to earn links. Sometimes this is as simple as
giving something away for free, releasing a great new product, or stating
something controversial. Since so much of SEO is about creating a digital
representation of your brand in the real world, to succeed in SEO, you have
to be a great brand.
Your goal for an initial outreach email is simply to get a response. These
tips can help:
You could also do this with a root domain, subdomain, or a specific page.
Did you create content that was 10x better than anything else out there?
It’s possible that the reason your link building efforts fell flat is that your
content wasn’t substantially more valuable than anything else like it. Take a
look back at the pages ranking for that term you’re targeting and see if
there’s anything else you could do to improve.
Did you promote your content? How?
Promotion is perhaps one of the most difficult aspects of link building, but
letting people know about your content and convincing them to link to you
is what’s really going to move the needle. For great tips on content
promotion, visit Chapter 7 of our Beginner's Guide to Content Marketing.
Once your target audience is familiar with you and you have valuable
content to share, let your audience know about it! Sharing your content on
social platforms will not only make your audience aware of your content,
but it can also encourage them to amplify that awareness to their own
networks, thereby extending your own reach.
Are social shares the same as links? No. But shares to the right people can
result in links. Social shares can also promote an increase in traffic and
new visitors to your website, which can grow brand awareness, and with a
growth in brand awareness can come a growth in trust and links. The
connection between social signals and rankings seems indirect, but even
indirect correlations can be helpful for informing strategy.
That last point is what we're going to focus on here. Reviews of your brand,
its products, or its services can make or break a business.
In your effort to establish authority from reviews, follow these review rules
of thumb:
Authority is built when brands are doing great things in the real-world,
making customers happy, creating and sharing great content, and earning
links from reputable sources.
Chapter 7
TRACKING SEO PERFORMANCE
Set yourself up for success.
They say if you can measure something, you can improve it.
Measuring also helps you pivot your priorities when something isn’t working.
The only way to know what a website’s primary end goal should be is to
have a strong understanding of the website’s goals and/or client needs.
Good client questions are not only helpful in strategically directing your
efforts, but they also show that you care.
Keep the following tips in mind while establishing a website’s primary goal,
additional goals, and benchmarks:
Measuring
Now that you’ve set your primary goal, evaluate which additional metrics
could help support your site in reaching its end goal. Measuring additional
(applicable) benchmarks can help you keep a better pulse on current site
health and progress.
Engagement metrics
How are people behaving once they reach your site? That’s the question
that engagement metrics seek to answer. Some of the most popular
metrics for measuring how people engage with your content include:
Conversion rate
The number of conversions (for a single desired action/goal) divided by the
number of unique visits. A conversion rate can be applied to anything, from
an email signup to a purchase to account creation. Knowing your
conversion rate can help you gauge the return on investment (ROI) your
website traffic might deliver.
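For example, if a page received 1,000 unique visits and 50 of those visitors completed an email signup, the conversion rate for that goal would be 50 ÷ 1,000 = 5%.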
Time on page
How long did people spend on your page? If you have a 2,000-word blog
post that visitors are only spending an average of 10 seconds on, the
chances are slim that this content is being consumed (unless they’re a
mega-speed reader). However, if a URL has a low time on page, that’s not
necessarily bad either. Consider the intent of the page. For example, it’s
normal for “Contact Us” pages to have a low average time on page.
Scroll depth
This measures how far visitors scroll down individual webpages. Are
visitors reaching your important content? If not, test different ways of
providing the most important content higher up on your page, such as
multimedia, contact forms, and so on. Also consider the quality of your
content. Are you omitting needless words? Is it enticing for the visitor to
continue down the page? Scroll depth tracking can be set up in your
Google Analytics.
In Google Analytics, you can set up goals to measure how well your site
accomplishes its objectives. If your objective for a page is a form fill, you
can set that up as a goal. When site visitors accomplish the task, you’ll be
able to see it in your reports.
Search traffic
Ranking is a valuable SEO metric, but measuring your site’s organic
performance can’t stop there. The goal of showing up in search is to be
chosen by searchers as the answer to their query. If you’re ranking but not
getting any traffic, you have a problem.
But how do you even determine how much traffic your site is getting from
search? One of the most precise ways to do this is with Google Analytics.
GA allows you to view traffic to your site by channel. This will mitigate any
scares caused by changes to another channel (ex: total traffic dropped
because a paid campaign was halted, but organic traffic remained steady).
Traffic to your site over time
You can use UTM (urchin tracking module) codes for better
attribution. Designate the source, medium, and campaign, then append the
codes to the end of your URLs. When people start clicking on your UTM-
code links, that data will start to populate in GA’s "campaigns" report.
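For example, a URL tagged for a hypothetical email newsletter campaign could look like this:

https://example.com/blog-post?utm_source=newsletter&utm_medium=email&utm_campaign=spring_launch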
Your CTR from search results to a particular page (meaning the percent of
people that clicked your page from search results) can provide insights on
how well you’ve optimized your page title and meta description. You can
find this data in Google Search Console, a free Google tool.
In addition, Google Tag Manager is a free tool that allows you to manage
and deploy tracking pixels to your website without having to modify the
code. This makes it much easier to track specific triggers or activity on a
website.
Keyword rankings
A website’s ranking position for desired keywords. This should also include
SERP feature data, like featured snippets and People Also Ask boxes that
you’re ranking for. Try to avoid vanity metrics, such as rankings for
competitive keywords that are desirable but often too vague and don’t
convert as well as longer-tail keywords.
Number of backlinks
Total number of links pointing to your website or the number of unique
linking root domains (meaning one per unique website, as websites often
link out to other websites multiple times). While these are both common link
metrics, we encourage you to look more closely at the quality of backlinks
and linking root domains your site has.
The Moz and STAT APIs (among other tools) can also be pulled into
Google Sheets or other customizable dashboard platforms for clients and
quick at-a-glance SEO check-ins. This also allows you to provide more
refined views of only the metrics you care about.
Dashboard tools like Data Studio, Tableau, and PowerBI can also help to
create interactive data visualizations.
While we don’t have room to cover every SEO audit check you should
perform in this guide, we do offer an in-depth Technical SEO Site Audit
course for more info. When auditing your site, keep the following in mind:
Crawlability
Are your primary web pages crawlable by search engines, or are you
accidentally blocking Googlebot or Bingbot via your robots.txt file? Does
the website have an accurate sitemap.xml file in place to help direct
crawlers to your primary pages?
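As a quick sketch (the /admin/ path and sitemap URL are placeholders), compare an accidental sitewide block with a more deliberate configuration:

# Accidentally blocks the entire site from all crawlers
User-agent: *
Disallow: /

# Blocks only a private directory and points crawlers to the sitemap
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml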
Indexed pages
Can your primary pages be found using Google? Doing a site:yoursite.com
OR site:yoursite.com/specific-page check in Google can help answer this
question. If you notice some are missing, check to make sure a meta
robots=noindex tag isn’t excluding pages that should be indexed and found
in search results.
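The tag in question lives in the page's <head>; if it appears on a page that should be indexed, remove it:

<meta name="robots" content="noindex">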
Page speed
How does your website perform on mobile devices and in Lighthouse?
Which images could be compressed to improve load time?
Content quality
How well does the current content of the website meet the target market’s
needs? Is the content 10X better than other ranking websites’ content? If
not, what could you do better? Think about things like richer content,
multimedia, PDFs, guides, audio content, and more.
Website pruning can improve overall quality
Removing thin, old, low-quality, or rarely visited pages from your site can
help improve your website’s perceived quality. Performing a content
audit will help you discover these pruning opportunities.
For example:
                Urgent                                  Not Urgent
Important       Quadrant I: Urgent & Important          Quadrant II: Not Urgent & Important
Not Important   Quadrant III: Urgent & Not Important    Quadrant IV: Not Urgent & Not Important
Putting out small, urgent SEO fires might feel most effective in the short
term, but this often leads to neglecting non-urgent important fixes. The Not
Urgent & Important items are ultimately what will move the needle for a
website’s SEO. Don’t put these off.
Not Important items might include client reports (unrelated to goals), video sitemaps, meta keywords, and vanity keywords.
Measuring your progress along the way via the metrics mentioned above
will help you monitor your effectiveness and allow you to pivot your SEO
efforts when something isn’t working. Say, for example, you changed a
primary page’s title and meta description, only to notice that the CTR for
that page decreased. Perhaps you changed it to something too vague or
strayed too far from the on-page topic — it might be good to try a different
approach. Keeping an eye on drops in rankings, CTRs, organic traffic, and
conversions can help you manage hiccups like this early, before they
become a bigger problem.
Communication is essential for SEO client longevity
Many SEO fixes are implemented without being noticeable to a client (or
user). This is why it’s essential to employ good communication skills
around your SEO plan, the time frame in which you’re working, and your
benchmark metrics, as well as frequent check-ins and reports.
We've put together a quick to-do list you can use to guide your next steps
in the wide, wonderful world of SEO:
1. Figure out your site's information architecture, design, UX, and other
necessary considerations before starting a new project. We
recommend reading Strategic SEO Decisions to Make Before
Website Design and Build and watching our Whiteboard Friday on
the topic, Launching a New Website: Your SEO Checklist.
2. Follow the steps of the Beginner's Guide to SEO as you go along:
1. Understand your goals and the basic rules of SEO
2. Make sure your site is crawlable and indexable in search
3. Conduct thorough keyword research
4. Make sure your on-site optimizations are up to snuff
5. Perform necessary technical SEO optimizations or audits
6. Earn links and establish your site's authority
7. Prioritize effectively and measure the right metrics
3. Test, iterate, and test again! Many SEOs have test sites where they
challenge SEO norms or experiment with new types of optimization
tactics. Implement this today by setting up a website, making up a
gibberish word (one that likely has zero search volume and no
competition), and seeing how quickly you can get it to rank in search
results. From there, you can experiment with all sorts of other SEO
tests.
4. Take on harder tasks, test refurbishing content for other platforms,
set stretch goals, and compete with stronger competitors.
5. Consider going the extra mile and challenging yourself to learn
technical SEO.
6. Find a community where you can safely learn, discuss, share
experiences, and ask for help. Moz’s Q&A
Forum, TrafficThinkTank, Search Engine Journal's SEO Experts to
Follow, and finding SEO meetups near you are all great options to
start.
7. Take the time to evaluate what worked and what didn’t after an SEO
project. How might you do things a bit differently in the future to
improve your performance?
When it comes to tracking your SEO progress, data is your best friend. You
can use Moz Pro's suite of SEO analytics and research tools to keep a
close eye on rankings, link building, technical site health, and more. Put
your new SEO skills into action with a free 30-day trial of Moz Pro!
We know learning all the ins and outs of SEO vocabulary and jargon can
feel like learning another language. To help you get a handle on all the new
terms we're throwing at you, we've compiled a chapter-by-chapter SEO
glossary with definitions and helpful links. You might want to bookmark this
page for future reference!
Intent: In the context of SEO, intent refers to what users really want from
the words they typed into the search bar.
Local pack: A pack of typically three local business listings that appear for
local-intent searches such as “oil change near me.”
SERP: Stands for “search engine results page” — the page you see after
conducting a search.
Traffic: Visits to a website.
URL: Uniform Resource Locators are the locations or addresses for
individual pieces of content on the web.
4xx status codes: A class of status codes that indicate the request for a
page resulted in an error.
5xx status codes: A class of status codes that indicate the server’s
inability to perform the request.
Backlinks: Also called "inbound links," these are links from other websites that point to your website.
Index: A huge database of all the content search engine crawlers have
discovered and deem good enough to serve up to searchers.
Internal links: Links on your own site that point to your other pages on the
same site.
Meta robots tag: Pieces of code that provide crawlers instructions for how
to crawl or index web page content.
Navigation: A list of links that help visitors navigate to other pages on your
site. Often, these appear in a list at the top of your website (“top
navigation”), on the side column of your website (“side navigation”), or at
the bottom of your website (“footer navigation”).
NoIndex tag: A meta tag that instructs a search engine not to index the
page it’s on.
Relevance: In the context of the local pack, relevance is how well a local
business matches what the searcher is looking for.
Sitemap: A list of URLs on your site that crawlers can use to discover and
index your content.
Spammy tactics: Like “black hat,” spammy tactics are those that violate
search engine quality guidelines.
Rel=canonical: A tag that allows site owners to tell Google which version
of a web page is the original and which are the duplicates.
Scraped content: Taking content from websites that you do not own and
republishing it without permission on your own site.
Title tag: An HTML element that specifies the title of a web page.
ccTLD: Short for “country code top level domain,” ccTLD refers to domains
associated with countries. For example, .ru is the recognized ccTLD for
Russia.
CSS: A Cascading Style Sheet (CSS) is the code that makes a website
look a certain way (ex: fonts and colors).
Fetch and Render tool: A tool available in Google Search Console that
allows you to see a web page as Google sees it.
Hreflang: A tag that indicates to Google which language the content is in.
This helps Google serve the appropriate language version of your page to
people searching in that language.
Lazy loading: A way of deferring the loading of an object until it’s needed.
This method is often used to improve page speed.
Pagination: A website owner can opt to split a page into multiple parts in a
sequence, similar to pages in a book. This can be especially helpful on
very large pages. The hallmarks of a paginated page are the rel=”next” and
rel=”prev” tags, indicating where each page falls in the greater sequence.
These tags help Google understand that the pages should have
consolidated link properties and that searchers should be sent to the first
page in the sequence.
Rich snippet: A snippet is the title and description preview that Google
and other search engines show of URLs on their results pages. A “rich”
snippet, therefore, is an enhanced version of the standard snippet. Some
rich snippets can be encouraged by the use of structured data markup, like
review markup displaying as rating stars next to those URLs in the search
results.
Schema.org: Code that “wraps around” elements of your web page to
provide additional information about it to the search engine. Data using
schema.org is referred to as “structured” as opposed to “unstructured” — in
other words, organized rather than unorganized.
Fresh Web Explorer: A Moz tool that allows you to scan the web for
mentions of a specific word or phrase, such as your brand name.
Link profile: A term used to describe all the inbound links to a select
domain, subdomain, or URL.
Google Analytics goals: What actions are you hoping people take on your
website? Whatever your answer, you can set those up as goals in Google
Analytics to track your conversion rate.
Google Tag Manager: A single hub for managing multiple website tracking
codes.
Scroll depth: A method of tracking how far visitors are scrolling down your
pages.
UTM code: An urchin tracking module (UTM) is a simple code that you can
append to the end of your URL to track additional details about the click,
such as its source, medium, and campaign name.