Sat Dec 29 10:03:47 PST 2018

• Sponsor 501(c) tax-exempt organizations (or buy/rent other links) and sponsor other sites.
• Reciprocate links with other quality sites. It is hard to get quality sites
to want to reciprocate until you are somewhat integrated into the web,
have built a brand, or are trusted for other reasons, so it may not be
worth chasing reciprocal links too hard off the start.
• Place advertisements on relevant, related sites.
• To keep your link profile looking natural to Google, make sure you
mix your anchor text heavily and get at least a few links from high-
quality sites.
• When your site is brand new, if you take a less-is-more approach to link
building, and only focus on gaining authoritative high-quality links, that
will help you more than an aggressive campaign where you actively
build many low-quality links.
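The advice above about mixing anchor text can be sanity-checked mechanically: tally each phrase's share of the link profile and flag over-concentration. This is a hypothetical helper with an arbitrary threshold, not anything Google publishes:

```python
from collections import Counter

def anchor_text_report(anchors, max_share=0.5):
    """Tally each anchor phrase's share of a link profile and flag
    profiles where one phrase dominates beyond max_share.
    max_share is an arbitrary illustrative threshold."""
    counts = Counter(anchors)
    total = len(anchors)
    shares = {text: n / total for text, n in counts.items()}
    dominated = any(share > max_share for share in shares.values())
    return shares, dominated

shares, dominated = anchor_text_report(
    ["blue widgets", "blue widgets", "blue widgets", "Acme Co", "click here"]
)
print(dominated)  # True: one phrase holds 60% of the profile
```

A profile where one commercial phrase holds most of the share is exactly the pattern the "mix your anchor text" advice is meant to avoid.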
You will still want to structure your pages properly using heading tags, page titles,
and the other page elements to help you rank well in all search engines, but current,
quality links are what matter most with Google.
Off-topic links still count toward your link popularity. If you already have many
links from within your community, then you may receive a great ranking boost by
sponsoring a few non-profits or by renting a few, strong, inbound links from other
websites. If you are in a competitive field, you need to build many on-topic links.
Be careful with how above the radar you are with link buys though. Established,
trusted, and well-branded sites can get away with using far more aggressive SEO
techniques than a new site could.
Anchor text is important, but throughout 2006 Google deprecated the effect of
anchor text and started placing more weight on link authority and domain-age-related trust.
It can take up to three months for your inbound links to help improve your
Google rankings for competitive search terms. If Google believes your site is a
trusted authority, the delay time will not really exist. With Google, you want to
build linkage data over time to minimize the chances of it appearing unnatural.
Link Searching Tips
The most time-consuming part of SEO is building a linking campaign. Tools or
ideas that help us save time doing this are extremely valuable.
• You can classify domain extension or specific site ideas when
searching for potential links:
o inurl:".edu" and
o "intitle:links" or "intitle:partners" or "intitle:resources" or
"intitle:engines" and
o "searchenginewatch" or "search engine watch"
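Operator combinations like the ones above can be generated programmatically. A small sketch; the helper and query strings are illustrative, and the searches would still be run by hand:

```python
from itertools import product

# Hypothetical helper: combine domain filters with page-type title
# filters to produce advanced-search queries for link prospecting.
def build_link_queries(keyword, extensions=(".edu",),
                       page_types=("links", "partners", "resources")):
    queries = []
    for ext, page in product(extensions, page_types):
        queries.append(f'"{keyword}" inurl:"{ext}" intitle:{page}')
    return queries

for query in build_link_queries("search engine watch"):
    print(query)
```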

Google is harder to manipulate than the other search engines, and Google tends to
trust new sites much less than their competitors do. If you run a new website, do
not expect to rank well in Google for competitive queries until AFTER you rank
well in Yahoo! and MSN.
If you have an old, well-trusted domain but have not focused on SEO yet, then
doing simple things like fixing up your page titles and gaining a few authoritative or
descriptive inbound links might be all you need to do.
Matt Cutts describes his overview of the SEO process in an audio interview.
Note how he states that it is a process and you need a certain amount of authority
before you will be able to compete in Google.
Google is quick to index new sites, usually within a few days to a month
(depending on the quality and quantity of your inbound links). During the first
month or two, it is common for your site to go into and out of their database many
times until you have built up sufficient link popularity. Link building can help get
your site indexed and keep you in the index.
Google primarily focuses its algorithm on linkage data. On-page criteria are
weighted exceptionally low for shorter and highly competitive search phrases. To
do well in Google, you will need to target low-competition phrases using many
different pages, or think of ways to get others within your community to want to
link to your site. Some of the more common ideas for improving your link
reputation are to do the following things:
• Join trade organizations.
• List your site in quality directories.
• Submit press releases via sites such as PR Web; paying $80 can get you distribution in Google News and Yahoo! News. Press releases should usually be newsworthy, although most of them tend to be pure crap. I believe Google News has a 65-character limit on its titles, and Yahoo! has a 150-character limit. Both tend to favor longer copy around the 400- to 600-word range, although a short useful press release is better than a long useless one. You can look at a few current press releases on PR Web, Google News, or Yahoo! News for formatting examples. Don Crowther has good PDF reports offering templates and press release tips, and he also creates press releases for an affordable rate.
• Create a quality, topical directory one level above your category
(slightly broader). By giving links to others in your field, some of them
might take notice of your presence.
• Write an interesting blog about your topic. Link out to recognized
topical authorities.
• Write compelling articles and get them syndicated.

• Look for issues in new pages from your site that are getting indexed by Google
• Find out how rapidly a story is spreading
From this line of thinking have come a couple of SEO tools, namely a free IndexRank
tool and a Website Health Check Tool. The IndexRank tool tracks how regularly
new content on your site is getting indexed in Google. The Website Health Check
Tool allows you to look for duplicate content issues or other technical issues.
Personalized Search
The word Jaguar has drastically different meanings to a person searching for a car
or an animal. Personalization mixes some of the sites you have viewed in the past
near the top of the search results. So if many searchers frequently visit and trust
your site, those searchers will get your site mixed in near the top of the search
results more often.
Universal Search
Google mixes relevant fresh news stories and videos into their search results. If you
are finding it hard to get ranked for some of the more competitive terms in your
industry, try posting a relevant, useful video on YouTube and mentioning it to friends
in your industry. I have also seen PRNewswire press releases that were picked up
by CNN ranking for competitive searches like forex.
Perfect Links
Google using multiple algorithms in conjunction can allow them to place
exceptional positive bias on links that fit most or all of their bolt-on relevancy-boosting
algorithms. This means the best links are those from the following places:
• Well-trusted sites (think of sites that were chosen as seed sites or
linked to from seed sites in the TrustRank algorithm)

• Sites that are on your theme
• Pages on those sites that are about your topic, which are linked to from
many sites on your topic, and also link to other resources similar to
your site
o The anchor text of these pages would include some versions of
your primary keywords
o The link would drive direct, targeted traffic that may convert to
sales or secondary citations from other trusted sites
• Pages that rank well in the search results for your topic
• Sites that are easy to link to and hard to get a link from
• Sites that are well-read
How to Succeed in Google
The easiest way to get natural-looking quality linkage data is to create something people would want to link to without you needing to ask them.

Google is much better than its competitors at counting only quality links as votes.

spikes in search volumes, or when news agencies write a bunch of content about a
topic, Google might be more likely to mix in fresher results.

Supplemental Results Index
When the supplemental index was first launched, Google labeled the supplemental
results. SEOs came up with hacks for figuring out how many supplemental results
different websites had, and after those hacks got popular, Google disabled them and
removed the supplemental results label.
Google has an auxiliary index where it stores some documents it may not trust
enough to include in its regular index. Supplemental search results are used to
provide search results when not enough regular search results are available. Why
do sites get put in the supplemental results?
• Too much content for the amount of PageRank the site has. If a PR2
site has 100,000 pages, Google isn't going to index most of them.
• Newness. If a site is new and does not have many inbound links, some
of its pages may end up in the supplemental results until the site builds
up its popularity and ages a bit. This is typical for some new URLs
that have not been crawled yet.
• Duplicate content. For example, giving a search engine multiple URLs
with the same content, or a directory full of empty categories with
near-identical content on every page
• Too many variables in the URL
• Host was down when Google tried to crawl
• Too many low-quality outbound links
• Too many low-quality inbound links
• Any combination of the above
Since pages in the supplemental results are not trusted as much as pages in the
regular index, it is likely that Google rarely crawls and places low or no weight on
links from many of the pages in the supplemental results.
Date Based Filters, Cache Date & Site Indexing Growth
Many analytics programs show which pages are crawled frequently by search
engines. If search engines crawl some parts of your site frequently, it is safe to say
they trust those parts significantly, because they keep spending bandwidth to visit them.
Google's advanced search page allows you to search for pages that were recently
indexed. These search filters can be used to:
• Find how well your site is getting indexed

other people will be creating thousands of similar sites with similar link profiles and
similar footprints.

Human Review
Google claims everything is done algorithmically, but they need to refine the
algorithms and test the relevancy of their results with humans. In June of 2005,
Henk van Ess of SearchBistro posted a remote search-relevancy-quality rater
document and a spam document on his site.
These documents showed how Google asks remote "raters" to review search
results. Even if the review input is not directly used to alter the search results, the
guidance in the documents shows what Google wants. One of the documents was
dated December 2003, so Google has been using human quality raters for a great
length of time.
I would link to the documents, but I believe that GoogleGuy asked that they not
be posted, and they may be taken down. I did an overview of the documents on my blog. The highlights are the following:
• Most search spam sites are heavily automated and provide little useful,
unique, or compelling content to the end user.
• If people would have no reason to want to view an affiliate site instead
of going directly to the merchant, then the site is to be rated as spam.
• For long-term success of affiliate sites, or bare-bones merchant sites, it
helps to add some value-added service that would make people want to
visit your site instead of your competitors'.

In 2007, a leading Google engineer hand edited one of my websites after it was
mentioned on an SEO blog, in spite of the fact that my site was better
(better designed, better formatted, higher content quality) than the then #1-ranked
website. If you have a successful thin-affiliate-type website and trust Google
enough to reveal your websites to them on SEO blogs, don't be surprised
when they hand edit your website out of the search results.

A June 2007 NYT article about Google's shifting relevancy algorithms further
discussed how humans play a role in Google's relevancy measurements. Around
the same time, Matt Cutts confirmed that Google hires over 10,000 quality raters.
Query Deserves Freshness
In the above-linked NYT article, a concept called query deserves freshness was
described. For some search queries that seem time-dependent or see rapid

There is significant effort being placed on looking for ways to move the PageRank
model to a model based upon trust and local communities.

Link Spam Detection Based on Mass Estimation
TrustRank mainly works to give a net boost to good, trusted links. Link Spam
Detection Based on Mass Estimation was a research paper aimed at killing the
effectiveness of low-quality links. Essentially, the thesis of this paper was that you
could determine what percent of a site's direct and indirect link popularity comes
from spammy locations and automate spam detection based on that.
The research paper is a bit complex, but many people have digested it. I posted an
overview of it on my blog.
Due to the high cost of producing quality information versus the profitability and
scalability of spam, most pages on the web are spam. No matter what you do, if
you run a quality website, you are going to have some spammy websites link to you
and/or steal your content. Because my name is Aaron Wall, some idiots kept
posting links to my sites on their "wall clock" spam site.
The best way to fight this off is not to spend lots of time worrying about spammy
links, but to spend the extra time to build some links that could be trusted to offset
the effects of spammy links.
Algorithms like the spam mass estimation research are going to be based on
relative size. Since quality links typically have more PageRank (or authority by
whatever measure they choose to use) than most spam links, you can probably get
away with having 40 or 50 spammy links for every real, quality link.
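That relative-size intuition can be made concrete. This is a toy sketch of the spam-mass idea using per-host link scores; the actual paper works with full PageRank contribution vectors, and the hosts and scores below are made up:

```python
def spam_mass(link_scores, spam_hosts):
    """Rough spam-mass estimate: the fraction of a site's total incoming
    link score contributed by known-spammy hosts. link_scores maps
    linking host -> score. The paper uses full PageRank contribution
    vectors; this per-host tally is a deliberate simplification."""
    total = sum(link_scores.values())
    spam = sum(score for host, score in link_scores.items() if host in spam_hosts)
    return spam / total if total else 0.0

scores = {"trusted.edu": 8.0, "news.example": 4.0, "wallclock-spam.example": 0.1}
print(spam_mass(scores, {"wallclock-spam.example"}))  # a tiny fraction of authority
```

As long as trusted links carry most of the authority, the spammy fraction stays small, which is the point made above about offsetting spammy links rather than worrying about them.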
Another interesting bit mentioned in the research paper was that generally the web
follows power laws. This quote might be as clear as mud, so I will clarify it shortly.
A number of recent publications propose link spam detection
methods. For instance, Fetterly et al. [Fetterly et al., 2004]
analyze the indegree and outdegree distributions of web pages.
Most web pages have in- and outdegrees that follow a power-
law distribution. Occasionally, however, search engines
encounter substantially more pages with the exact same in- or
outdegrees than what is predicted by the distribution formula.
The authors find that the vast majority of such outliers are spam.
Indegrees and outdegrees above refer to link profiles, specifically to inbound links and
outbound links. Most spam generator software and bad spam techniques leave
obvious mathematical footprints.
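The outlier idea from the quote can be sketched: compare observed indegree counts against a power-law prediction and flag degrees that occur far more often than predicted. The constants c, alpha, and factor below are illustrative, not fitted values:

```python
from collections import Counter

def degree_outliers(indegrees, c=1000.0, alpha=2.1, factor=5.0):
    """Flag indegree values whose observed page counts greatly exceed a
    power-law prediction count(d) ~ c * d**(-alpha). c, alpha, and
    factor are illustrative constants, not fitted values."""
    observed = Counter(indegrees)
    outliers = []
    for degree, count in observed.items():
        expected = c * degree ** (-alpha)
        if count > factor * expected:
            outliers.append(degree)
    return sorted(outliers)

# 500 pages with exactly 40 inbound links apiece is a suspicious footprint:
degrees = [1] * 800 + [2] * 180 + [40] * 500
print(degree_outliers(degrees))  # [40]
```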
If you are using widely hyped and marketed spam site generator software, most of
it is likely going to be quickly discounted by link analysis algorithms since many

Google may also look at how often your site is bookmarked, how frequently
people search for your brand, how frequently people click on your listing, who
your advertisers are, and other various feedback they can get from their toolbar.
Google was awarded a patent on March 31, 2005 covering these types of topics,
but put forth in much greater detail than I have done here. While I do not think
they are already necessarily doing all the things they mention in the patent, I think
they may eventually use many of them. The patent is interesting and worth reading
if you are deeply interested in SEO and information retrieval. If you do not want
to read it, you may want to look at the Threadwatch post that mentioned it and the
follow-up thread.
TrustRank is an algorithm that can be used to bias PageRank by placing additional
authority on human-reviewed, trusted sites. Trust propagates out from the trusted
pages and sites to pages and sites they link to. TrustRank also can be used to
neutralize the effects of some types of low-quality link building from untrusted
sources as well as to flag high-PageRank, low-trust websites for human review.
In the TrustRank research paper, the seed sites fit the following criteria:
• Seed sites linked to many other websites. DMOZ and the Yahoo!
Directory, for example, were most likely seed sites.
• Seed sites were controlled by major corporations, educational bodies,
or governmental bodies.
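The seed-based trust propagation described above can be sketched as PageRank whose random jump is restricted to the seed set, so trust can only flow out from the seeds. A minimal sketch over a made-up link graph:

```python
def trustrank(links, seeds, damping=0.85, iters=50):
    """Simplified TrustRank sketch: power iteration where the random
    jump lands only on hand-picked seed pages, so trust can only flow
    out from the seed set. links maps page -> list of outlinked pages."""
    rank = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in links}
    for _ in range(iters):
        new = {p: ((1 - damping) / len(seeds) if p in seeds else 0.0) for p in links}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# A seed directory links into a small trusted cluster; an isolated
# spam cluster receives no trust at all.
graph = {
    "dmoz": ["a", "b"],
    "a": ["b"],
    "b": ["a"],
    "spam1": ["spam2"],
    "spam2": ["spam1"],
}
ranks = trustrank(graph, seeds={"dmoz"})
print(ranks["a"] > ranks["spam1"])  # True
```

Note that the spam cluster scores exactly zero here: with no inbound path from the seeds and no teleportation, it never receives any trust, which is the neutralizing effect described above.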

I believe TrustRank (or a similar algorithm) is a huge part of Google's
current search algorithm. If you look at a link through the eyes of a search
engineer or search editor, rather than just looking at PageRank, you will do far
better in terms of evaluating the value of a link.
Look at a site and ask yourself questions like, "Is this the type of site I would use as
a seed site?" If the answer to that is "no," then ask, "Is this site directly or
indirectly associated with seed sites?" and "Why would quality seed sites want to
link to this site or to sites linking to this site?"
Topic Sensitive TrustRank
Since TrustRank is topic-independent, it could place too much relevancy on topics
that are overrepresented in the seed set of trusted sites. Thus you could use
DMOZ or the Yahoo! Directory to help extend out the seeded sites to a broader
array of sites and topically bias their TrustRank score based on the categories in
which they are listed. You could then filter out the bottom half of trusted sites in
each category to prevent too many spam sites from being selected as trust sources.

Even if Google is not using LSI, they have filed five patents on phrase-based research,
which are covered in a WMW thread.
Temporal Analysis
Search engines can track how long things (sites, pages, links) have been in existence
and how quickly they change. They can track a huge amount of data, such as:
• How long a domain has been in existence
• How often page copy changes
• How much page copy changes
• How large a site is
• How quickly the site size changes
• How quickly link popularity builds
• How long any particular link exists
• How similar the link text is
• How a site changes in rank over time
• How related linking sites are
• How natural linkage data looks
In some cases, it makes sense for real websites to acquire a bunch of linkage data in
a burst. When news stories about a topic and search volumes on a particular term
are high, it would also make sense that some sites may acquire a large amount of
linkage data. If that link growth is natural, many of those links will be from high-
quality, active, frequently updated sites. Unless you are doing viral marketing,
links that build naturally tend to build more slowly and evenly.
If links build in huge spikes from low-quality sites, then search engines may
discount, or even penalize, the domain receiving that linkage data.
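A burst of the kind described above is easy to spot in monthly new-link counts. A toy detector with an arbitrary spike threshold, purely to illustrate the temporal signal:

```python
def link_growth_bursts(monthly_new_links, burst_ratio=4.0):
    """Flag months where new-link acquisition jumps to more than
    burst_ratio times the trailing average -- the kind of spike a
    temporal-analysis algorithm might scrutinize. burst_ratio is an
    arbitrary illustrative threshold."""
    bursts = []
    for month in range(1, len(monthly_new_links)):
        trailing_avg = sum(monthly_new_links[:month]) / month
        if trailing_avg and monthly_new_links[month] > burst_ratio * trailing_avg:
            bursts.append(month)
    return bursts

print(link_growth_bursts([10, 12, 11, 9, 200, 15]))  # [4]: the spike month
```

Whether such a spike is penalized would depend on the quality of the linking sites, as the surrounding text notes; the timing alone is just a trigger for scrutiny.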
Stale pages may also be lowered in relevancy if other pages in the same topic are
referenced or updated more regularly. A page may be considered fresh if it
changes somewhat frequently or if it continues to acquire linkage data as time
passes. Certain types of queries (like news-related ones, for instance) may improve
scoring for fresh documents.

ones with few words in common to be semantically distant. This
simple method correlates surprisingly well with how a human
being, looking at content, might classify a document collection.
Although the LSI algorithm doesn't understand anything about
what the words mean, the patterns it notices can make it seem
astonishingly intelligent.
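The word-overlap intuition in the quote can be sketched with cosine similarity over term counts. Real LSI adds an SVD step that can also relate documents sharing no words directly, but the overlap part looks like this:

```python
import math
from collections import Counter

def cosine(doc_a, doc_b):
    """Word-overlap similarity between two documents: texts sharing
    many words score near 1, texts sharing none score 0."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[word] * b[word] for word in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

d1 = "search engine optimization and link building"
d2 = "link building for search engine rankings"
d3 = "jaguar cars for sale"
print(cosine(d1, d2) > cosine(d1, d3))  # True: d1 and d2 share vocabulary
```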

Google bought a company called Applied Semantics, which had a natural language
processing technology called CIRCA, used to conceptually understand content for
targeting AdSense ads. They may use it to better understand page content and
refine their search relevancy as well.
Latent semantic indexing is a rather expensive process, and many SEO experts
debate to what extent major search engines may be using the technology. It is
likely Google is not using LSI, but they are using some other technologies to
understand word relationships.
If you are knowledgeable about a topic and write about it naturally, you are far
more likely to write semantically sound copy than if you are writing for keyword
density or some outdated SEO technique.
Most webmasters do not need to know much about LSI or other word-relationship
technologies other than to know that mixing their inbound link anchor text is
important and that any LSI-like algorithms aim to rank natural writing better
than clumsy, machine-written content focused on things like keyword density.
Here is an image that shows some keywords related to SEO.
As you scroll over any of the related words, it also allows you to dig deeper into
words semantically related to those words.

Ranking Search Results by Reranking the Results Based on Local Inter-Connectivity
That subheading probably sounds like a handful, but it is the name of a patent
Google filed. The patent is based on finding a good initial set of results (say, the top
1,000 or so most relevant results), then reranking those results based on how well
sites are linked to from within that community.
If you have many links and have been mixing your anchor text but still cannot
break into the top results, then you likely need to build links from some of the top-ranked
results to boost your LocalRank. Just a few in-community links can make a
big difference to where you rank. A site that has some authority but lacks the in-community
links may get re-ranked to the bottom of the search results. A site that
has abundant authority, like Wikipedia, probably does not need many in-community
links.
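The LocalRank idea can be sketched: score each result by its initial position plus a bonus for links received from other results in the same set. The scoring weights and link graph below are assumptions for illustration, not values from the patent:

```python
def local_rerank(initial_ranking, links, weight=2.0):
    """Rerank an initial result list by in-community links: each result's
    score is a base relevance (higher for earlier positions) plus
    weight * links it receives from other results in the same set.
    weight is an illustrative constant, not a value from the patent."""
    result_set = set(initial_ranking)
    scores = {}
    for pos, page in enumerate(initial_ranking):
        base = len(initial_ranking) - pos
        in_community = sum(1 for src in result_set
                           if page in links.get(src, ()) and src != page)
        scores[page] = base + weight * in_community
    return sorted(initial_ranking, key=lambda p: -scores[p])

# c.com starts last but is linked to by both other top results:
links = {"a.com": ["c.com"], "b.com": ["c.com"], "c.com": []}
print(local_rerank(["a.com", "b.com", "c.com"], links))
```

Here the last-placed result jumps to the top purely on in-community links, mirroring the point above that a few such links can make a big difference.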
Topic-Sensitive PageRank (TSPR)

Topic-Sensitive PageRank biases both the query and the relevancy of returned
documents based upon the perceived topical context of the query. The query
context can be determined based on search history, user-defined input (such as
search personalization), or related information in the document from which the
query came (if people searched Google from a site search box, for example).
Topic-Sensitive PageRank for each page can be calculated offline. Using an
exceptionally coarse topic set (for example, the top-level Open Directory Project
categories) still allows Topic-Sensitive PageRank to significantly enhance relevancy
over using PageRank alone; however, TSPR can be applied more specifically as well.
Since much of it is calculated offline, Topic-Specific PageRank can also be rolled
into other relevancy algorithms that are calculated in near real time.
I do not think it is exceptionally important for most webmasters to deeply
understand TSPR, other than to understand the intent of this algorithm. Instead of
grading the web on the whole, they would prefer to evaluate it based upon local
topical communities.
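The offline/online split can be sketched: per-topic PageRank vectors are precomputed, and at query time they are blended by the query's estimated topic distribution. Every number below is made up for illustration:

```python
def tspr_score(page, topic_ranks, query_topic_weights):
    """Blend precomputed per-topic PageRank scores for a page using the
    query's estimated topic distribution. All values here are made up."""
    return sum(weight * topic_ranks[topic].get(page, 0.0)
               for topic, weight in query_topic_weights.items())

topic_ranks = {
    "autos":   {"jaguar-cars.example": 0.30, "jaguar-zoo.example": 0.01},
    "animals": {"jaguar-cars.example": 0.02, "jaguar-zoo.example": 0.25},
}
# A query classified as 90% "autos" favors the car site:
car = tspr_score("jaguar-cars.example", topic_ranks, {"autos": 0.9, "animals": 0.1})
zoo = tspr_score("jaguar-zoo.example", topic_ranks, {"autos": 0.9, "animals": 0.1})
print(car > zoo)  # True
```

Flip the topic weights toward "animals" and the zoo site wins instead, which is how the same link graph can serve both senses of an ambiguous query like "jaguar."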
Latent Semantic Indexing (LSI)

Latent semantic indexing allows machines to understand language by looking at it
from a purely mathematical viewpoint. Here is a brief description of how it works:
Latent semantic indexing adds an important step to the
document indexing process. In addition to recording which
keywords a document contains, the method examines the
document collection as a whole, to see which other documents
contain some of those same words. LSI considers documents
that have many words in common to be semantically close, and

I mention a number of algorithms and concepts in the following section, including:
Hilltop, TrustRank, Topic-Sensitive PageRank, temporal analysis, and latent
semantic indexing (LSI).
Some of these algorithms may not be part of the current search environment, but
the ideas contained within them are still worth understanding to see where search
may be headed and what search topics search engineers think are important to
improve their overall relevancy scores.
Local Re-ranking Results Based on Inter-Connectivity
Hilltop was an algorithm that reorganized search results based on expert ratings.
In the Hilltop white paper, they talk about how expert documents can be used to
help compute relevancy. An expert document is a non-affiliated page that links to
many related resources. If page A is related to page B and page B is related to page
C, then a connection between A and C is assumed.
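The expert-document idea can be sketched: treat pages with several non-affiliated outbound links as experts, and surface targets endorsed by multiple experts. Reducing the affiliation test to a hostname comparison is a major simplification of the white paper, and the URLs below are made up:

```python
from urllib.parse import urlparse

def expert_targets(pages, min_outlinks=3, min_experts=2):
    """Hilltop-style sketch: an 'expert' page links to at least
    min_outlinks pages on other hosts; targets endorsed by at least
    min_experts distinct expert hosts are returned. Affiliation
    detection is reduced to a hostname check, a big simplification."""
    endorsements = {}
    for url, outlinks in pages.items():
        host = urlparse(url).netloc
        external = [o for o in outlinks if urlparse(o).netloc != host]
        if len(external) >= min_outlinks:          # qualifies as an expert
            for target in external:
                endorsements.setdefault(target, set()).add(host)
    return sorted(t for t, hosts in endorsements.items() if len(hosts) >= min_experts)

pages = {
    "http://hub1.example/links": ["http://a.example/", "http://b.example/", "http://c.example/"],
    "http://hub2.example/resources": ["http://a.example/", "http://d.example/", "http://e.example/"],
    "http://blog.example/post": ["http://a.example/"],
}
print(expert_targets(pages))  # only a.example is endorsed by two experts
```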
Additionally, Hilltop states that it strongly considers page title and page headings in
relevancy scores; in fact, these elements can be considered as important as, or more
important than, link text. It is likely that Hilltop also considers the links pointing
into the page and site that your links come from.
The benefit of Hilltop over raw PageRank (Google) is that it is topic-sensitive, and
is thus generally harder to manipulate than buying some random high-power off-topic
link. The benefits of Hilltop over topic distillation (an algorithm which will be
discussed later) are that Hilltop is quicker and cheaper to calculate and that it tends
to have broader coverage.
When Hilltop does not have enough expert sites, the feature can be turned off, and
results can be organized using a global popularity score, such as PageRank.
Google might be using Hilltop to help sort the relevancy for some of their search
results, but I also see some fairly competitive search queries where three of my sites
rank in the top eight results. On those three sites, it would be fairly obvious for
search engines to know that they were all owned by me.
They may use something like Hilltop to scrub the value of some nepotistic links,
but it will not wipe out all related sites just because they are related. When you
search for things like Microsoft, it makes sense that many of the most relevant
websites are owned by the same company.

• Google wants to rank informational pages.
• Many of these newspapers are well trusted offline within their communities.
• Newspapers have an informational bias, and their articles consist of real,
unique, human-written text.
• Google feels they can rely on long-established businesses and sources
of power more than the average website.
The more your sites (or sections of them) look like a trusted newspaper, the easier
it is going to be to rank well in Google.
Various Data Centers
Google uses groups of data centers to process their search queries. When Google
updates algorithms or refreshes their index, the changes roll from one data
center to the next. When results rapidly change back and forth, sometimes they are
tweaking algorithms, but more frequently you are getting search results from
different data centers. You can use the free Firefox ShowIP extension to find the
IP address of the data center serving your search query.
About PageRank
PageRank is a measure of connectivity. It is a rough approximation of the odds
that a random web surfer will cross your page. PageRank is calculated by following
links throughout the web, and placing more weight on links from pages that many
quality pages link to.
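The random-surfer description above corresponds to the classic power-iteration computation. A minimal sketch over a made-up link graph:

```python
def pagerank(links, damping=0.85, iters=50):
    """Classic power iteration: a page's score approximates the odds a
    random surfer (who follows links but occasionally jumps to a random
    page) lands on it. links maps page -> list of outlinked pages."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: redistribute its weight evenly
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

graph = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "home" receives the most link weight
```

The scores sum to 1, matching the probability interpretation: each page's value is roughly the chance the surfer is there at any moment.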
The problem with PageRank is that most industries and most ideas are not
exceptionally important and well integrated into the web. This means that if
Google did place a heavy emphasis on PageRank, webmasters could simply buy or
rent a few high PageRank links from sites in a more important vertical and
dominate the search results for their niche topic. However, that is not how it works.
PageRank (mentioned in The Anatomy of a Search Engine), as it relates to SEO, is
overrated. By making the concept easy to see and understand, Google allows
more people to talk about it and makes it easier for more people to explain
how search engines work using Google and PageRank as the vocabulary.
Google's technology is not necessarily better/more effective than the technologies
owned by Yahoo!, MSN, or Ask, but they reinforce their market position by being
the default vocabulary. And, as they move on to more elegant and more
sophisticated technologies, many people are still using irrelevant outdated
marketing techniques.

This update caused many sites to go unindexed, be only partially indexed, or get stuck in
the supplemental results. Matt Cutts mentioned on his blog that many sites that
were once well-indexed are no longer indexed, or are reduced to only having a few pages
indexed, due to having primarily low-trust, spammy inbound links, shady outbound
links, or participating in cheesy link exchange networks.
The message worth emphasizing again and again is that Google is looking for
quality editorial links.
Google Sandbox
Many new sites, or sites that have not been significantly developed, have a hard
time ranking right away on Google. Many well-known SEOs have stated that a
good way to get around this problem is to just buy old sites. Another option is to
place a site on a subdomain of a developed site, and after the site is developed and
well-indexed, 301 redirect the site to the new location.
The whole goal of the Sandbox concept is to put sites through a probationary
period until they prove they can be trusted.
There are only a few ways webmasters can get around the Sandbox concept:
• Buying an old site and ranking it
• Placing pages on a long-established, well-trusted domain (through
buying sites, renting full-page ads, paying for reviews, renting a folder,
or similar activity)

• Gaining a variety of natural high-quality links. When a real news story
spreads, some of the links come from news sites or other sites that are
highly trusted. Also note that when real news spreads, some of the
links will come from new web pages on established, trusted sites (new
news stories and new blog posts). It is an unnatural pattern for all your
link popularity to come from pages that have existed for a long time,
especially if they are links that do not send direct traffic and are mostly
from low-trust sites.
• Participating in hyper-niche markets where it is easy to rank without
needing a large amount of well-trusted link popularity

Google & Authoritative Domains
Content on a new domain with limited authority will not rank as well as content on
a trusted domain name. Through 2006 Google placed significant weighting on
trusted authoritative domains. According to Hitwise and the NYT in November
of 2006, search provides roughly 22% of the web traffic to many newspaper
websites, with roughly 2/3 of that traffic coming from Google.
Google is not sending these newspapers so much more traffic because the
newspapers are doing SEO. They are sending more traffic for a variety of concrete

• Toolbar buttons. You can create custom XML buttons to link to
some of your favorite sites. This also has a simple RSS reader
integrated into it. I created buttons on my site to link to many useful
free SEO tools and SEO blogs.
• Saves bookmarks. If you are logged in, it saves your search history
and bookmarks in your Google Account, which is accessible from any computer.
Google Update Florida
In November of 2003, Google performed a major algorithm change. The goal of
the change was to make it harder to manipulate their search results. It is believed
that Google may have significantly incorporated Hilltop, topic-specific PageRank,
and/or a latent semantic indexing like technology into their algorithms.
It is important to get links from the right community. Do not rely on cheesy off-
topic link exchanges. They can hurt you more than they help you. For example, to
a search engine marketer, a link from Search Engine Watch (a search engine
information resource hub) is worth much more than many random off-topic links.
I still have seen significant evidence that off-topic inbound links can improve your
Google rankings significantly, but it is likely that this will eventually change, and
there is an opportunity cost and risk level associated with every activity.
In early 2004, Google also began to block the ability of certain sites to pass
PageRank, even if those same pages showed PageRank when you visited them.
In addition, Google seems to have set up a portion of their algorithm to delay the
effects of some links, or to only pass partial link credit until the links age. These
moves are aimed at curbing manipulation of the Google index through link buying
by making it a much more expensive and much less predictable process.
It may take up to three or so months for the full effect of new links to kick in.
Google Update Jagger
In November of 2005, Google rolled out another major update that caused a roar
on the SEO forums. I believe that the update was most likely related to scrubbing
link quality. Google also appeared to have placed more weight on TrustRank or
another similar technology.
The value of low-quality automated links is going down daily. SEO is becoming
more and more about public relations and viral marketing.
Google Update Big Daddy
In early 2006, Google upgraded their crawl and indexing systems to a new
architecture that leveraged different crawl priorities.

Does Google trust this page? There are several ways in which this question can be answered:
• It ranks for relevant search queries, so that is a good sign.
• It is a useful page, so that is a good sign.
• It is relevant to my site, so that is a good sign.
• It only links to relevant resources, so that is a good sign.

If you are using techniques that fall far outside of Google's recommended
guidelines, I would not recommend using their toolbar, since the feedback the
toolbar provides may make it easy for them to link you to all of your websites.
In October of 2007 Google edited the toolbar PageRank scores of many sites that
were selling links. Most of the sites that had their toolbar PageRank scores edited
did not see any change in traffic. The only thing that changed was their perceived
PageRank scores.
Google Toolbar Broken?
• Sometimes the Google Toolbar gets stuck at 0 when searching the web. If you are unsure of the PageRank of a page, go to a high-PageRank site, and then type the address of where you were just at in the address bar of Internet Explorer. Usually this technique will unstick the PageRank.
• Keep in mind that Google has only been updating toolbar display
PageRank about once every 3 months, so if a site is only a few months
old, it will not be uncommon for it to show a PageRank 0 in the
toolbar. Also remember that PageRank is only a rough approximation
of authority.
• To find out who is linking to your competitors, you can type "link:www.competitorsite.com" in the Google search box. Keep in mind that Google only shows a small sample of inbound links, and other search engines show more/better linkage data.
• The toolbar is just an aid and should be combined with common
sense. If you see sites linking into awful websites or if a site looks
sketchy, then it may not be a good place to get a link from.
• If you use the Safari browser, you can use the PageRank Toolbar
Widget for the Mac from Digital Point.
New Google Toolbar Features
In February 2006, Google introduced the beta version of their 4th Google Toolbar.
Some notable features are the following:
• Search suggest. The toolbar tries to help you complete your search
queries based on other popular searches.

Google Webmaster Central
Google provides only obtuse data to the general web public, but they are willing to show site owners more granular data once you have verified that you own your site.
Inside of Google Webmaster Central, they show you:
• A much larger list of your inbound links, and the associated anchor text
• Keywords you are ranking for, and keywords that drive the most traffic to your site
• Any crawling errors, 404 errors, or pages that are blocked in your robots.txt file
• If your site is penalized in Google, and lets you submit reinclusion requests
• Control of your sitelinks, if your site shows sitelinks for search queries related to your brand
You can use the information from Webmaster Central to help you fix broken links, reclaim link popularity, and ensure the important parts of your site are being indexed.
If you have a site you do not like being associated with, it is recommended that you do not register it with Google Webmaster Central.
How do I Know What Sites are Good?
First off, common sense usually goes pretty far. If a page or site links to a bunch of off-topic or low-quality garbage, you can feel safe assuming the page does not pass much link authority. If you have doubts, you probably do not want the link.
Secondly, Google has a toolbar that shows how it currently views a web page or
website. The Google toolbar is one of the top search engine optimization tools for
a person new to search engine marketing. It works on Windows and is
downloadable at
PageRank is a measure of link popularity, which can come and go. It's not hard for a successful business to rent a few high-PageRank links into their site and then leverage that link popularity for link exchanges. A site with decent PageRank can get penalized just the same as a site with low PageRank. Usually, you will want to err on the side of caution off the start.
Instead of making PageRank your primary criteria when evaluating a page or site,
just think of it as a baseline.

Google threw out that guidance based upon usability ideas. On pages with no link
popularity, they will not want to follow many links. On pages with a large amount
of link popularity, Google will scour thousands of links.
I have one page with over 950K of page copy. Most pages should be smaller than
that from a usability standpoint, but Google has fully indexed that page.
If you ever have questions on any rumors regarding Google and SEO, is one of the most straightforward SEO forums on the web.
What Pages of My Site are Indexed by Google?
You can check to see what pages of your site are indexed by searching Google for "site:mysite.com."
How do I Submit My Site to Google?
While Google also offers a free site submit option, the best way to submit your site
is by having Google?s spider follow links from other web pages.
Google offers a Google Sitemaps program that you can use to help Google set
crawl priorities. In addition to helping Google index your site, the Sitemaps
program also shows you if they have any crawling problems with your site.
Where do I Rank in Google for My Keywords?
I use the free Digital Point keyword ranking tool to determine where I rank in
Google. The Digital Point keyword ranking tool also supports Yahoo! and MSN.
Tracking various sites helps me determine some of the ways Google may be
changing their algorithm.
If you sign up for the Google API service and are doing lots of sketchy stuff, then
it makes it easy for Google to cross connect your websites. Google generally is the
slowest of the major search engines to trust and rank new websites.
Google Backlink Check
Backlinks is another way of saying "links into a page."
When you check backlinks in Google (link:www.site.com), it only shows a small sampling of your total backlinks. Many links that do not show up when you use the "link:" function in Google still count for your relevancy scoring. In addition, there is a time delay between when links are created and when they will show up in search results.
To get a more accurate picture of links, you will also want to check backlinks using Yahoo! or MSN. Yahoo! typically shows many more backlinks than Google. The code to check Yahoo! backlinks to a site is "linkdomain:www.site.com."
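The search operators described here can be collected into a small helper. This is only a sketch in Python: the operator strings are the ones the text mentions, and the domain is a placeholder.

```python
def backlink_queries(domain):
    """Build the classic backlink/index queries for Google and Yahoo!.

    The operator syntax (link:, site:, linkdomain:) is the era-specific
    syntax described in the text; substitute your own domain.
    """
    return {
        "google_backlinks": "link:" + domain,       # Google shows only a small sample
        "google_indexed": "site:" + domain,         # pages of the site in Google's index
        "yahoo_backlinks": "linkdomain:" + domain,  # Yahoo! typically shows far more links
    }

print(backlink_queries("example.com")["google_backlinks"])  # link:example.com
```

Typing each generated string into the relevant engine's search box reproduces the manual checks described above.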

Google shows up to ten pay-per-click AdWords ads on their search results, but they keep them separate from the regular (or organic) listings. There is no direct way to pay Google money to list in their organic search results.

PageRank (PR), Briefly
Google is primarily driven by linkage data.
The Google Toolbar provides a 0-10 logarithmic scale to mimic the link popularity of pages. PageRank gives a quick glance at how important Google thinks a page is.
Google would like you to believe that their PageRank algorithm is the core of their
search technology, but they also use many other technologies to improve their
search relevancy.
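As a rough illustration of what a 0-10 logarithmic scale implies, each step up on the toolbar corresponds to roughly a constant multiple of underlying link popularity. The base below is an assumption for illustration; Google has never published the actual scale.

```python
import math

def toolbar_pr(raw_popularity, base=8.0):
    """Map a raw link-popularity score onto a 0-10 logarithmic scale.

    The base is a guess (commonly speculated to be somewhere around
    5-10); the point is only that each toolbar step represents a
    roughly base-fold jump in underlying popularity.
    """
    if raw_popularity <= 1:
        return 0
    return min(10, int(math.log(raw_popularity, base)))
```

Under this toy scale, moving from toolbar PR 3 to PR 4 requires about eight times the raw popularity, which is why high toolbar scores are so much harder to reach than low ones.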
Many webmasters exchange links with as many people as they can, but there is an
opportunity cost to everything you do. There are algorithms for sorting good links
from bad links. Many link schemes increase your risk profile much quicker than
they increase your potential rewards. When you link into the wrong circles, you
run the risk of being associated with them.
It is important to note that this PageRank value is only one component of the Google search engine algorithm. Many times, a PR 4 site will rank above a PR 6 site because it was optimized better and has a well-defined, descriptive inbound link profile, which means better, more natural links from more (and more related) sites.
Many Myths about Google
There are many myths about Google that are represented as fact by marketers
trying to make money. Misinformation spreads like wildfire because everyone
wants to sound like the smart person with all the answers. One example of the
many myths about Google is that you are limited to 100 links per page.

Search Engines
Search has been consolidated to being in the hands of a couple important players.
In some regional markets, there might be important local players, but for most of
the world, Google, Yahoo!, and MSN control the bulk of search.
The Major Search Engines
The following search engines are reviewed in order of search distribution, to the best of my knowledge. Some of the first-listed search engines may appear to have
more content and more information than the later-listed search engines. There are
several reasons that the top couple search engines have much more data listed in
their sections:
• Much of the data from one section would carry over to the next.
• Companies that have been focused on search the longest are more likely to have plugged algorithmic holes.
• Google is MUCH harder for new webmasters to manipulate than the other engines.
The order of these listings has nothing to do with the relevancy or quality of the
search results. They all provide quality results using similar algorithms.

Google Search Distribution
Currently Google is powering around 70% of U.S. search (Google, AOL, Earthlink, Go, Netscape, and many others). More worldwide search statistics are available online.

How to check Google's cache (Google's cache of my home page)

My Directory Checklist
Some Notes
• Apply to become an editor at JoeAnt, Skaffe, Web Beacon, and DMOZ in categories that interest you. DMOZ will probably reject you. When you do apply to DMOZ, try to apply for a small, ugly, and non-commercial category (maybe a local one) and take your time filling out the application.
• If your site is together and you can afford it, submit your site to the above-listed directories. If you cannot afford it, apply to become an editor and submit your site to whatever directories you can for free.
• If your site is new, and you are working on limited time and a limited budget, submit your site to at least 3 quality directories.

Google Toolbar 4 beta

PageRank for Safari

Prog: free Google PageRank display search tool

RoboForm: form filler

Top25Web PageRank lookup

The Major Search Engines
Search Engine Relationship chart by Bruce Clay

TrustRank: an algorithm for determining the difference between a good link and a bad link

BOTW

DMOZ Resource Zone

DMOZ submission guidelines

Gimpsy

JoeAnt

RubberStamped

Skaffe

Uncover the Net

Web Beacon

WoW Directory

Yahoo! Directory

Directories of Directories
Directory Archives


Search Engine Guide

Other Sites

I think it is safe to say that directories such as Yahoo!, DMOZ, Best of the Web, JoeAnt, and Gimpsy probably count as good links in all the major search engines. Google still ranks many niche sites well, primarily based on general directory links with few other citations.
The Active Web
Search beat out directories as a primary means of navigation due to scalability and
efficiency of the search business model. Directories are nowhere near as efficient
at monetizing traffic and generally are not as relevant as search engines.
In addition to search, there are also other forces killing the margins of directories:
• Social news sites. Bottom-up news sites have virtually no cost and
can quickly build authority because the user becomes the editor, which
means the content is relevant to them and they have a vested interest
in marketing their own work.
• Social bookmarking sites. Similar to social news sites, but users tag
pages they find interesting. There might be some spam marketing
going on in both of these channels, but because we can connect to our
closest friends AND can leverage the user base of the community, we
get a fairly high signal to noise ratio.
• Bloggers. As more and more people maintain websites in active channels that people are actively paying attention to (i.e., that are regularly updated), it will be easier for engines to determine what parts of the web are important, and discount those that are not. Quality blogs also help identify communities and place more context around most links than most directories do.

A Yahoo! Directory link, listings in a few of the top general directories, and listings
in niche specific directories are still worthwhile because they help identify what
communities you belong to and are signs of editorial quality. But registering in
1,000 directories is not a long-term effective link-building solution. The role of
outlying directories on the web is being reduced as a whole due to so many more
people maintaining active websites.
If competing channels are actively participating in, or are actively mentioned in, the active portions of the web, then you are going to need to come up with creative ways for your business to get exposure in those parts of the web as well if you want to compete.
Interactive Elements
Google Toolbar

sites, get a few links from signature files from SEO forums, and submit a few
articles to various article banks.
If your link profile matches that of most SEO websites, then it may be harder to
rank than if you can come up with creative ways to get links from places that few
other SEOs are listed.
Large portions of the web are well trusted and virgin territory to most SEOs. If you can think of creative ways to relate your site to people with well-trusted and rarely abused link equity, results will show far quicker than if you follow the same path taken by many people who are obviously SEOing their sites.
Google Ignoring Some Directories
Some directories have recently been removed from Google?s cache, while others
are not crawled very deeply. Additionally, some of them have not had their cache
dates updated in a great deal of time. Google might be trying to slow the growth
of directories by not counting the links from many of them. Make sure you check
the cache before paying for a listing.
Some of the directories will have a greater effect on relevancy in MSN or Yahoo!
than they do on Google, so even if a directory is not counted by Google, the link
price might still be cheap for its effects on other search relevancy algorithms.
Many directory owners are building multiple related directories. Some search
algorithms such as Google Hilltop are likely to automatically filter out some of the
relevancy score from a second directory if it is easily identifiable as being related to
the first directory.
The one-time listing fees make directories exceptionally appealing, but expect that many directories will eventually be ignored by some of the search engines. From my perspective, that cost is factored into the listing prices. I average out the link costs for links from a number of sites. If I can spend $2,000 and get a dozen or two dozen well-trusted links, then that is going to be a good buy for launching a site.
The reason there are so many different pieces of information associated with
directories is that for a good period of time, they were pure gold. They provided
cheap and easy marketing and a virtually unlimited ROI, but because they were getting so abused, Google had to buck the trend by coming up with ways to lower that ROI.
Instead of just discounting some of the links, I believe Google may even place negative weighting on links from sites they define as low quality. Since Google is so much more complex and harder to manipulate than Yahoo! and MSN (and Yahoo! and MSN still place great weight on directories, even on many of the bad ones), it is hard to say which directories count as quality links; it really depends on your brand strategy, short-term goals, and long-term goals.

If you are using a content management system for your site, make sure you are not accidentally offering search engines the same (or near-same) content at multiple URL variations.
Places to Find Directories

I created the Directory Archives, which should only list directories that parse link popularity to sites listed in them, or directories that look like they might drive traffic to listed sites. In addition, Search Engine Guide and ISEDB each have a large directory of directories (though many of the directories listed in those may not parse link value).
Ensure that the page is in Google?s cache and that the address bar shows the
location of the site the link is going to before paying for placement in any directory.
I also created a Microsoft Excel Directory Checklist sheet so you can track your
submissions to many general directories. Some of them are a bit sketchy, but some
of them are decent, and most of them provide links for free or for a one-time fee.
The directories with a blue background are the ones I believe provide the most
authority in Google or are priced cheaply enough to provide value even if Google
does not count them.
The Value of Directory Listings
The value of a single directory link is usually not very great. Directories add value
after you list in many of them, especially if you are listing in high-quality ones next
to other sites that are in your same vertical.
If you have a keyword-rich domain name, it will help you get descriptive inbound
links from directories. Most sites on the web only have links from a few dozen
sites. By listing your site in a half dozen to a dozen quality directories and also
getting links from other relevant quality sites, you can quickly build up a great
linking campaign at minimal cost.
Known SEO Circles
Most directories are not of amazingly high quality, listing many lousy websites.
Martinibuster, a link building guru, often emphasizes avoiding being heavily located
in known SEO circles.
For example, common SEO strategies for a new site might be to list a site in a
bunch of directories, write a press release or two, trade links with many low-quality

• To check that links are indexed by search engines, scroll over a listing in the directory. The status bar at the bottom of the browser should show the address of the site being linked to. A few good directories happen to show some funky characters for redirects. Yahoo! is the only major directory I know of that shows funky characters and still provides text links that search engines index.

Most directories that show some funky tracking characters are not
providing static, spiderable links. If in doubt, ask questions at SEO
forums before spending any money.
• Some redirect links do get indexed, but there is no simple litmus test to
make sure that they do. You can right-click and copy links from within
the directory and do a server header check on them. If they show a
301 redirect, they will probably add to your link popularity. If they
show a 302 redirect, they probably will not add to your link popularity.
If they show a JavaScript redirect, then they do not count. When in
doubt about whether a link counts or not, ask in a couple SEO forums.
• If you use the Safari browser, you can use a tool from Digital Point to view PageRank.
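The 301/302/JavaScript rules of thumb in the list above can be summarized as a tiny helper. This is only a codification of the heuristics stated here, not a definitive test; assume you have already done the server header check on the copied link.

```python
def likely_passes_link_popularity(status_code, is_javascript_redirect=False):
    """Apply the rules of thumb above to a directory link's server response.

    301 redirects probably pass link popularity, 302 redirects probably
    do not, and JavaScript redirects do not count at all. A plain 200
    (a direct, static, spiderable link) is the ideal case.
    """
    if is_javascript_redirect:
        return False          # JavaScript redirects do not count
    if status_code == 301:
        return True           # permanent redirect: probably passes popularity
    if status_code == 302:
        return False          # temporary redirect: probably does not
    return status_code == 200 # direct static link
```

When the helper (or your own header check) is inconclusive, the text's advice stands: ask in a couple of SEO forums before spending money.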
Crap Directories
Some people buy a directory script, create a ton of categories, and then only add
links if people buy them. The problem with this is that many categories will not
have unique content. Many of these same directories will list any business that
wants to pay for a link.
If there is no unique content, or if the content is all sponsored links, the site does
not add value, and search engines may not want to index it. When looking at a
directory, ask yourself, if you were a search engineer, would you want to count links
from that site? Would you want to allow that site to influence your search
relevancy algorithms?
You can view how many pages Google knows about from a directory by searching Google for "site:directoryname.com."
Some directories (such as SevenSeek) do not have many listings as compared to the number of pages in their site. This will cause some search engines to either avoid indexing the site or only index a small portion of it. The index saturation of a site in Google can be found by searching for "site:site.com <signature text>."
Signature text for a site is any common text that appears on every page, such as the copyright. Sites that consist of many pages with the same or similar content may trip duplicate content filters. If you run a directory, make sure you do not let search engines index the "add URL" pages.
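Keeping the "add URL" pages out of the index is typically done with a robots.txt exclusion. A minimal sketch; the paths below are hypothetical, so substitute whatever submission URLs your directory script actually generates:

```
User-agent: *
Disallow: /addurl/
Disallow: /suggest-listing/
```

Place the file at the root of the domain; compliant crawlers will then skip those submission pages entirely.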

Reciprocal Link Required
Some directories require reciprocal links to be listed in them. I do not recommend
swapping links with most of these types of directories. Directories are intended to
list sites. Sites are not intended to list directories. If you like something, then feel free to link to it; if not, then don't. Some vertical niche directories are of high enough quality to deserve links, but most are not.
Link popularity is a currency, and if you are lacking money (as I was when I started on the web), you may need to reciprocate a few links off the start, but if you get too aggressive with link trades, you will be digging yourself a hole by making your link profile unnatural, and you can waste many hours of time.
The exceptions to this rule are that I am usually willing to reciprocate links with the
following directories:
• Extremely powerful sites that I do not believe are going to get
penalized. Generally this type of site still should be on the same topic
as your site.
• Directories that are well-focused and are defined as an industry hub in
my topic. The SEO Consultants Directory, for example, would not be
a bad directory for SEO sites to link to.
Directory Warnings
Some sites that pose as directories do not provide static text links and/or their
pages do not get indexed. Many of these databases will never provide any direct
traffic or link popularity. Additionally, many directories require reciprocal linking
and use their sites to promote aggressive high-margin products. If you link into
sites that primarily promote off-topic high-margin items, then you are sharing the
business risk that site owner is taking.
If you choose to spend money on directory submission, you should ensure that the directory provides direct traffic or link popularity. You can do this by checking to make sure their directory pages have some PageRank on them and are in Google's cache. Search Google for "site:directoryname.com," and check the links of listed sites. When you scroll over a link in the directory, the status bar at the bottom should indicate the domain that the link is pointing to and not some sort of redirect. You also can right-click on the link to copy the link location and then paste that into the address bar.
• You can check PageRank by downloading the free Google Toolbar.
• To ensure that a page is not showing phantom PageRank, check that the page is in Google's cache. Search Google for "cache:www.site.com."
• Also make sure that the cache date is within the last month. If a page has not been cached for many months, then search engines do not trust that page much.

There are also many industry-specific directories you can find by searching for terms such as "keyword + add URL," "keyword + submit," or "keyword + directory." I usually try to find directories that have one-time submission fees or directories that look as though they are going to be long-standing, quality directories.
Tips to Pick Directory Categories
Oftentimes a site will fit many different categories. When choosing a category to submit to in a directory, I look at a few different ideas:
• Is my site likely to be accepted if I submit to this category? I tend to
be more conservative when submitting to a free directory than if I am
submitting to a paid directory.
• Are there reasons that this organization, or other sites outside of this organization, are going to place extra emphasis on (or otherwise link to) this category?
• How many links are listed in this category?
• Where does this category fit in the directory structure?
• How related is my site to the sites listed in this category?
Reasons Why I Like Second Tier Directories (Great Value)

Since second tier directories are smaller, your link is usually closer to the root page,
and most pages have fewer outbound links than large directories. If the categories
in a large directory are full of hundreds of sites, or are many levels deep into the
directory structure, you may gain greater link popularity in a smaller directory than
in a larger one.
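The reasoning above is the classic PageRank division: whatever a page passes along is damped and split among its outbound links. A simplified sketch follows; the damping factor and even split come from the original PageRank formula, and the example numbers are made up for illustration:

```python
def link_value_passed(page_pr, outbound_links, damping=0.85):
    """Rough link value passed per outbound link: the page's PageRank,
    damped, split evenly among its outbound links (a simplification of
    the original PageRank formula)."""
    if outbound_links == 0:
        return 0.0
    return damping * page_pr / outbound_links

# A small second-tier directory page (PR 4, 20 links) vs. a crowded
# category page in a big directory (PR 5, 300 links):
small = link_value_passed(4, 20)    # 0.17
large = link_value_passed(5, 300)   # ~0.014
```

Even with a lower page PR, the less crowded page passes roughly ten times the per-link value, which is the point being made about smaller directories.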
Keep in mind that the quality of the other sites listed in the directory matters too.
If they list many junk sites, then few quality resources are going to link to their site.
It is probably not a good idea to list your site in hundreds of directories if you have
not built up significant trust. Also remember that some links may count as a
negative, so try to be somewhat conservative with what you consider to be of good
quality if you are hoping to rank well in Google.
Directory Traffic
Directories rarely provide much direct traffic. The bulk of the value of a directory listing is in how search engines evaluate the links. Occasionally, you will find a general directory that does provide good traffic, but that is the exception more than the rule.
Niche directories tend to drive decent traffic. Any site that ranks well in related
search results may be a good place to get a link, because ranking for relevant search
queries means they stand a good chance of being a well-trusted link source and
sending you direct visitors.

I do not necessarily have the best answer for that question. If you are building a
site for temporary profits, then even most of the low-quality ones can help you
build links that will be effective for ranking in MSN or Yahoo!. The problem
comes about when some of those links that help you in less sophisticated
algorithms end up hurting your Google ranking.
From the above picture, you can see that search algorithms are reliant upon linkage data. If you look at a site's inbound links (I will explain how to do this later on) and find few links from quality sites, few or no related sites, and many low-quality links, that is not a good sign for the long-term potential of the link.
When you look through the listings in your category and throughout the directory,
there should be, in general, many high-quality sites that were added for free by an
editor for each site that paid for a listing. You do not need to view the whole
directory to figure out if it is good or bad, just a few categories that you know well.
Are quality sites listed there? If mostly junk sites are listed there, then you probably
do not want to pay for a submission.
If the categories are almost all blank, then wait to see if an editor will be making it
useful. If the directory consists only of paid listings or blank pages it is probably
not worth paying to be listed. Directories with many empty categories often get
flagged by duplicate content filters for having too much similar content, since there
is little content on most of their pages beyond the directory structure.
Another thing to look out for is site-wide or home page ads to high-margin sites in areas like casino, prescription, or debt consolidation. Avoid those types of directories as well, since they are more likely to be above the radar, and search engineers would be more likely to want to discount links from those sites.
I believe TrustRank is not implemented to the point where you get large negative scores for just a few bad links, as scraper sites virtually guarantee all top-ranked sites gain a few bad links, but perhaps it could be used to help figure out the good-link-to-bad-link ratio and flag high-PageRank sites with low trust scores for human review.
Niche Directories
Industry-Specific Directories is a business directory that costs a couple hundred dollars annually to
list your site. In general, is a strong link that most businesses should
buy in on. and some other directories allow you to list multiple links
in your listing.

TrustRank can re-rank the results to account for any improved rankings that would have occurred due to link spamming. They may even impose a penalty greater than the effective gain from the link spamming, which is why it is important to make sure you build some quality links and do not just buy from any page selling links.
TrustRank also flags high-PageRank sites with low trust scores for human reviews. This allows them to assign bad trust scores to sites that are artificially manipulating the search results.
The following is a graphic of the link profile of a typical money-hungry, low-quality
directory. The red X's represent things that should be there, but are not.

Many other low-quality sites exist on the web outside of the directory space, but
the business model of selling links to desperate webmasters has created a glut of
junk directories. This glut of directories and their link profiles makes the concept
of TrustRank easy to understand.
How do I Tell Which Directories are Good and Which are Bad?

RubberStamped, Uncover the Net, JoeAnt, and Skaffe all cost less than $50 each
for submission.
JoeAnt is free if you become an editor, and it only takes a couple minutes to sign
up. Gimpsy is free if you are willing to wait many months. Skaffe is free for
editors. GoGuides has a bulk submission discount program.
If you are going to list your sites in many directories, you may be able to save time
by using RoboForm to save some of your submission details, but make sure you
modify it to make your listings unique at each location.
Mix Things Up!
When links and citations occur naturally, there is no easily definable pattern. If
something is easy for a search engine to do and it will improve search quality, they
probably will do it. As a result, make sure you mix up your anchor text and your
site descriptions so that there is no easily identifiable unnatural pattern.
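One crude way to check whether your own anchor text forms an easily identifiable pattern is to measure how concentrated the distribution is. A sketch; the sample anchors below are hypothetical, and real profiles would come from your backlink data:

```python
from collections import Counter

def anchor_concentration(anchors):
    """Fraction of inbound links using the single most common anchor text.

    A natural profile is usually spread across brand names, bare URLs,
    and varied phrases; a very high concentration on one exact phrase
    is the kind of pattern the text warns against.
    """
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

profile = ["Acme Widgets", "acme widgets", "http://acme.example",
           "cheap widgets", "cheap widgets", "cheap widgets"]
print(round(anchor_concentration(profile), 2))  # 0.5
```

There is no published threshold; the metric just makes "mix up your anchor text" measurable so you can see when one phrase dominates.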
If you start directories yourself and you use common default directory software, you may want to remove the common footprints the script leaves. If other sites using this script are abusing it, you do not want your site to be filtered as well if a search engine decides to penalize sites that are using a commonly abused script.
Junk General Directories
On the web, links are a currency. The problem is, many webmasters want any link
they can get to improve their link popularity. Some webmasters take advantage of
this situation by creating low-quality, general web directories that will link to
anyone willing to give them some money.
This leads to a couple problems, both of which essentially boil down to an unnatural linkage profile. If a directory is not useful to humans, then the inbound links are likely going to lack linkage data from many trusted sites. To build up a high PageRank, the directory will often build lots of links from many low-quality sites.
Additionally, many of these directory owners are lazy and have no desire to create any legitimate value. By not employing editors to add any useful sites, most of the listed sites in the directories are of low quality.
TrustRank as an Equalizer
TrustRank is an algorithm that assigns trust scores to a seed set of a few hundred
human-reviewed websites. From those sites, trust propagates throughout the web.
If most of a site's inbound and outbound links are related to sites with low trust
scores, then that site might have a bad trust score. Good sites
will likely have a higher percentage of trusted links.
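A minimal sketch of the trust-propagation idea, assuming a tiny hand-made link graph and an invented damping factor (the real TrustRank seed set and parameters are not public):

```python
# Illustrative trust propagation in the spirit of TrustRank.
# The graph, seed set, damping value, and iteration count are assumptions.

def propagate_trust(outlinks, seeds, damping=0.85, iterations=20):
    """outlinks maps each page to the pages it links to."""
    pages = set(outlinks) | {p for targets in outlinks.values() for p in targets}
    # Seed pages start with equal shares of the total trust mass.
    seed_share = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(seed_share)
    for _ in range(iterations):
        # Each round, trust is re-seeded at the trusted pages and the rest
        # flows along outbound links, shrinking by the damping factor.
        nxt = {p: (1 - damping) * seed_share[p] for p in pages}
        for page, targets in outlinks.items():
            if targets:
                share = damping * trust[page] / len(targets)
                for t in targets:
                    nxt[t] += share
        trust = nxt
    return trust

web = {
    "seed.example": ["good.example"],
    "good.example": ["spam.example"],
    "spam.example": [],
}
scores = propagate_trust(web, seeds={"seed.example"})
# Trust decays with distance from the seed set:
# seed.example > good.example > spam.example
```

The takeaway matches the text above: the further a site sits from human-reviewed seeds, the less trust reaches it.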

Fri Dec 28 04:03:19 PST 2018

Your site will still list in Yahoo! powered search results even if you do not submit
your site to their directory, but their directory is well worth its cost for most
commercial sites. Yahoo! charges a $299 recurring annual fee for commercial sites
(double that for adult sites), which is a bit expensive for small-time webmasters, but
not a large fee if you are serious about making and marketing a quality website with
a legitimate business model.
A number of people I know have changed their credit card details and found that
their Yahoo! Directory listings still stayed even though they did not re-pay their
recurring annual review fee.
Unlike most directories, Yahoo! shifted their directory to list sites in order of
popularity, instead of alphabetically. They also paginate the results, so if your site is
new and there are 300 sites listed in your category, your site will not be appearing
on the first page of the category unless you also pay a monthly directory category
sponsorship fee, build significant link popularity from a variety of sources, or find a
more niche category to which you can submit.
Non-commercial sites can list in the Yahoo! Directory for free, and I can attest to
the fact that they have listed multiple sites I own for free. I have also submitted
dozens of paid listings and they have yet to reject any of them.
When a site gets submitted to the Yahoo! Directory, an editor checks the quality of
the site. Since Yahoo! controls their own directory, it would be logical for them to
place extra weighting on a Yahoo! Directory-listed site. Many top SEOs have told
me that they have seen significant increases in their Yahoo! Search rankings after
submitting a site to the Yahoo! Directory, and a Yahoo! Directory link seems to be
trusted fairly well in Google as well.
If you submit your site for free, make sure you submit to the most relevant
category. If you pay the Yahoo! Directory review fee, it might be worth it to try to
submit to a somewhat authoritative category. They may place your site in a
different category than that to which you have submitted, but it is worth a shot.
To give you an example, in the Yahoo! Search guidelines, they link to an SEO
resources category. Thus, I decided to submit my site to the authoritative SEO
resources category instead of submitting to the SEO services category. Why?
Because they link to the SEO resources category in their search guidelines, there
are fewer sites in that category, and the co-citation is associated with higher-quality
sites.
Regional Yahoo! Directories
Yahoo! has depreciated the value of many of their own regional directories. They
still accept free submissions, but do not guarantee a review time.
Second Tier Directories
Although more expensive than many other second tier directories, BOTW is one
of the better general directories. Directories such as Gimpsy, GoGuides,


With DMOZ, you do not need to keep resubmitting over and over. In the past,
they allowed webmasters to ask for status checks on submissions, but they
discontinued that in May 2005. If you do not get accepted, it is not worth losing
sleep over. Submit and forget it.
If you have general questions about DMOZ, you may want to ask at Resource
Become a DMOZ Editor
You may want to apply to become an editor if you really enjoy your category. You
should take your time when applying to become an editor. It is easier to become
an editor for a small, non-commercial category than a large, highly commercial one.
After you become an editor and do a good job, you can gain editing privileges over
other categories as well. Also, it is best if you do not disclose that you are
interested in SEO. They would prefer to hear you say you want to help organize
the web and make it a better place. Saying you are a hobbyist, enthusiast, academic,
or retired person is far better than telling them you are the CEO of a company
in your field.
The Value of a DMOZ Listing
The Open Directory Project is syndicated by many other sites and inclusion into it
often provides your site with dozens of inbound links. Many people are quick to
state that the Open Directory is worthless or that it is super important.
The fact is, it is fairly important for some sites and fairly unimportant for others. It
really depends on how many other good places there are that may be willing to link
to your site and how creative you are in making things they would want to link to.
There are many variables that go into the value of a listing. I usually just submit
and forget about it. I do not find that it helps to be preoccupied with a DMOZ
listing. Many high ranking sites are listed in DMOZ and many high ranking sites
are not. Most of my original useful sites were accepted into DMOZ. Most of my
spam sites were not.
No ODP Meta Tag
It is easy to create a compelling meta description tag that emphasizes your brand
strengths, but if a website is listed in the Open Directory Project, search engines
may prefer to use your ODP listing information over your meta description or
page content when displaying your site in the search results. If you do not like the
ODP listing information, you can prevent search engines from displaying it when
your site appears in search results by using the following meta tag:
<meta name="robots" content="noodp">

The Yahoo! Directory


Of course, the goal of forums is to have meaningful conversations, but if you are
reading this e-book, odds are that you may still have some SEO questions.
Forum links are easy to get and forums have many links on the pages though, so
the links probably do not have a large effect on SEO. Forum sig links from
relevant, useful posts have far more direct value in driving sales and building
friendships than in affecting search results directly.
I have found that some search engines such as Yahoo! look at word patterns on
web pages to find what words relate to others. I have the username "seobook" on
many forums. On many forums, there is a button to private message users next to
their username.

By helping others and participating in web communities, you become more
linkworthy and work your name and your brand into the language representative of
your topic. Plus, if you know what people in your community are talking about, it
is much easier to create things they would be interested in and market them to their
needs and wants.
General Directories
Directories Worth Getting Links In
The two most popular directories are DMOZ and the Yahoo! Directory. Just
about any quality search algorithm should trust and place weight on links from
those two sources.

The Open Directory Project
The Open Directory Project (DMOZ) is free, but sometimes it can take a while to
get listed. DMOZ editors work free of charge and are under no obligation to list
your website.
Ensure that you take the time to submit your site to the right category and follow
their directory guidelines. If your site is not in English, make sure you submit it to
the world category. Regional sites should be submitted to their respective regional


On some occasions I have seen mainstream media outlets quote blogs or contact
people who left comments on blogs. If you are actively engaged in the
conversation, you will gain authority much quicker than if you are not.
Many major search engines and blog software vendors came together to make a
link "nofollow" attribute. The "nofollow" tag allows people to leave static links in
the comments of blogs that search engines may not count for relevancy.
Essentially, the tag is designed to be used when allowing others to post unverified
links into your site. It is a way of saying, "I did not provide an editorial vote for the
other page."
You can also use it if you are linking out to shady stuff, but do not want to pass
any link credit to the destination URL.
Many webmasters are likely to be a bit sneaky and create fake blogs and then spam
their own blog with links off to high-margin website affiliate programs.
The "nofollow" feature looks as follows:
<a href="http://www.example.com/" rel="nofollow">Link Text</a>
The rel="nofollow" tag may also make it easier for many webmasters to cheat
their reciprocal link partners. However, I am a big believer in karma, and doing
things like that will likely come back to hurt you.
Also think of the "nofollow" tag as if you were a search engineer. If a site was full
of nothing but unverified links, would you trust that site as much as a site that had
some trusted editorial links to other sites? I wouldn't.
Search engineers, such as Google's Matt Cutts, are trying to push webmasters to
use "nofollow" on ads sold on their sites. If you run a clean business, there is little
risk in using nofollow, but if your site is a thin affiliate site, I would not recommend
using "nofollow." Using it essentially tells search engines that you understand
SEO, and might make them more likely to take editorial action against you if they
think your site is spammy.
Chat, Google Groups & Forums
In forums, people asking and answering questions creates free content for the
person who owns the site. This automated content creation allows the forum
owner to sell advertising space against the work of others.
In exchange for the posts, many SEO forums allow signature links that point to
your website. Since forums change rapidly, they often get indexed frequently. Your
site will get indexed quickly if you ask a few questions at a few of the various SEO
forums.


Sites like CNN are crawled hundreds or thousands of times each day. Since search
engines are constantly adding content to their index, they are in a constant state of
flux.
How Search Engines Evaluate Links
Through the "eyes" of a search engine, you usually cannot control who links to
you, but you can control to whom you link. In most cases, if bad sites link to you,
it does not hurt you. If you link back, it does. So in essence, it usually does not
hurt you to get inbound links. You should be rather selective with whom you
are willing to link out.
Start With Trust
Some search algorithms may look at the good link to bad link ratio as well. If your
site has few well-trusted links and many low-quality ones, they may filter out your
site if they suspect it of overt ranking manipulation.
When you get quality links, you are not only getting the boost those links may give
you, but you are also lowering your risk profile and naturalizing your link profile.
Some links are a sure sign of quality. For example, if you are listed in the Yahoo!
Directory, search engines know that at some point in time an editor working at a
search company reviewed your website.
If you are trying to replicate the success of a competing site, it is important to start
by trying to get a number of higher quality links before getting too many low-
quality links.
If you are unsure whether something is a quality link, ask yourself whether you
would want to trust that link if you were a search engineer. If the answer is "yes," then it is a
quality link. It is still okay to get some low-quality links, as automated scraper sites
and other junk sites give practically all well-ranked sites a bunch of low-quality
links, but the key to doing well in the long term is to try to create a reason why
people would want to give you quality links.
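The good-to-bad link ratio idea described above can be illustrated with a toy calculation. The domains, the trusted/untrusted labels, and the notion of a single risk score are all invented for illustration; real engines weigh far more signals than a simple ratio:

```python
# Toy "risk" score from the share of untrusted inbound links.
# Labels and threshold are illustrative assumptions only.

def link_profile_risk(inbound_links):
    """inbound_links: list of (domain, is_trusted) pairs; returns 0.0-1.0."""
    trusted = sum(1 for _, ok in inbound_links if ok)
    return 1 - trusted / len(inbound_links)

profile = [
    ("dmoz.org", True),            # reviewed directory listing
    ("dir.yahoo.com", True),       # editor-reviewed paid listing
    ("scraper-junk.example", False),
    ("link-farm.example", False),
    ("blog-spam.example", False),
]
risk = link_profile_risk(profile)  # 3 of 5 links untrusted -> 0.6
```

The practical reading: a few strong, reviewed links early on dilutes the inevitable junk links every ranking site accumulates.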
Blogs and Weblog Comment Spam
I recommend viewing the web as a social medium. Find blogs with posts about
topics you are interested in and participate in the community. The whole point of
weblogs is community discussion, so it is not spam to add something useful and
link to your website from it.
Don't expect the link to help you rank better in the search engines, but if you
participate in your community and leave useful comments, it will make some
people more likely to link to your site or pay attention to you.
An even better way to get noticed with blogs is to comment about what other
blogs say on your own blog.


The best way to get your site indexed is through having a search engine follow
a link from another site. This section will focus on how to maximize the speed
and efficiency of this process. I will address paid inclusion (mentioned above) in
more depth toward the end of this book.
Social Interaction and Links
Where to Get Links
? Create content or ideas that important people can identify with and
would likely link to.
? Directories may link to sites you submit.
? You can exchange links with similar websites. If you can afford to, it is
better to create legitimate business partnerships and friendships rather
than just to trade links with whoever is willing.
? Writing articles about your topic and placing them on other websites
can give you inbound links via the article signature. If you submit
articles to other sites, you may want to create unique content just for
the article submission sites, or have a longer or different version of the
article on your site so that you are not fighting against duplicate
content issues when others syndicate your articles.
? Writing press releases can give you inbound links.
? You can participate in forums that provide signature links. If you
participate in communities and leave relevant, useful comments, then
eventually people may want to link to you if they start to like you.
? Buy links or rent advertising space.
? Donate to charities for links.
? People interested in your site may eventually link to you without you
asking. Generally, this is where SEO battles are either won or lost in
competitive markets.
Generally, the easier and cheaper the link is to get, the less a search engine will want
to trust it. Getting other people to want to talk about you or your business (and
link to you) is the golden egg of SEO.
Search engines want to count legitimate editorial citations. They would prefer not
to count other types of links as votes. Some engines, such as Google, have
advanced algorithms to find and discount many artificial links.
How often do Search Engines Crawl?
Search engines constantly crawl the web. Pages that frequently update with strong
link popularity may get crawled many times each day. Pages that do not change
that often, are associated with spammy sections of the web, and/or have little link
popularity may get crawled only once or twice a month.
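One way to picture this crawl-priority idea is a toy revisit-interval formula. The formula and every constant in it are invented assumptions; no engine has published how it actually schedules recrawls:

```python
# Hypothetical recrawl scheduler: pages that change often and have more
# link popularity get shorter revisit intervals. All numbers are invented.

def recrawl_interval_hours(change_rate_per_day, link_popularity):
    base = 24 * 30                       # default: roughly monthly (720 hours)
    freshness = 1 + change_rate_per_day * 10
    popularity = 1 + link_popularity
    # Never revisit more often than hourly in this sketch.
    return max(1, base / (freshness * popularity))

static_page = recrawl_interval_hours(0, 0)      # 720 hours: about monthly
news_homepage = recrawl_interval_hours(24, 100) # clamped to 1 hour
```

The shape, not the numbers, is the point: frequently updated, well-linked pages earn frequent crawls, and stale unpopular pages do not.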
