

All Links are Not Created Equal: 10 Illustrations on Search Engines' Valuation of Links

Posted by randfish on May 27th, 2010 at 2:52 am | IR and Patents

In 1997, Google's founders created an algorithmic method to determine importance and popularity based on several key principles:

  • Links on the web can be interpreted as votes that are cast by the source for the target
  • All votes are, initially, considered equal
  • Over the course of executing the algorithm on a link graph, pages which receive more votes become more important
  • More important pages cast more important votes
  • The votes a page can cast are a function of that page's importance, divided by the number of votes/links it casts

That algorithm, of course, was PageRank, and it changed the course of web search, providing tremendous value to Google's early efforts around quality and relevancy in results. As knowledge of PageRank spread, those with a vested interest in influencing the search rankings (SEOs) found ways to leverage this information for their websites and pages.
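For readers who prefer code to bullet points, the voting process those principles describe can be sketched in a few lines of Python. This is an illustrative toy, not Google's implementation; the 0.85 damping factor and fixed iteration count are the usual textbook choices from the original paper's era:

```python
# Toy PageRank: each link is a vote; a page's voting power is its current
# score divided by the number of links it casts.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for source, targets in links.items():
            for target in targets:
                # votes cast = importance / number of links cast
                new_rank[target] += damping * rank[source] / len(targets)
        rank = new_rank
    return rank

# Page "c" receives the most votes, so it ends up most important.
scores = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

Note how the feedback loop works: "c" gets the most votes, so "c" becomes important, so the single vote "c" casts for "a" is itself worth a lot.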

But, Google didn't stand still or rest on their laurels in the field of link analysis. They innovated, leveraging signals like anchor text, trust, hubs & authorities, topic modeling and even human activity to influence the weight a link might carry. Yet, unfortunately, many in the SEO field are still unaware of these changes and how they impact external marketing and link acquisition best practices.

In this post, I'm going to walk through ten principles of link valuation that can be observed, tested and, in some cases, have been patented. I'd like to extend special thanks to Bill Slawski from SEO By the Sea, whose recent posts on Google's Reasonable Surfer Model and What Makes a Good Seed Site for Search Engine Web Crawls? were catalysts (and sources) for this post.

As you read through the following 10 issues, please note that these are not hard and fast rules. They are, from our perspective, accurate based on our experiences, testing and observation, but as with all things in SEO, this is opinion. We invite and strongly encourage readers to test these themselves. Nothing is better for learning SEO than going out and experimenting in the wild.

#1 - Links Higher Up in HTML Code Cast More Powerful Votes

Link Valuation of Higher vs. Lower Links

Whenever we (or many of the other SEOs we've talked to) conduct tests of page or link features in (hopefully) controlled environments on the web, we find that links higher up in the HTML code of a page seem to pass more ranking value than those lower down. This certainly fits with the recently granted Google patent - Ranking Documents Based on User Behavior and/or Feature Data - which suggests a number of items that may be considered in how link metrics are passed.

Higher vs. Lower Links Principle Makes Testing Tough

Those who've leveraged testing environments also often struggle against the "higher link wins" phenomenon; it can take a surprising amount of on-page optimization to overcome the advantage the higher link carries.

#2 - External Links are More Influential than Internal Links

Internal vs. External Links

There's little surprise here, but if you recall, the original PageRank concept makes no mention of external vs. internal links counting differently. It's quite likely that other, more recently created metrics (post-1997) do reward external links over internal links. You can see this in the correlation data from our post a few weeks back noting that external mozRank (the "PageRank" sent from external pages) had a much higher correlation with rankings than standard mozRank (PageRank):

Correlation of PageRank-Like Metrics

I don't think it's a stretch to imagine Google separately calculating/parsing out external PageRank vs. Internal PageRank and potentially using them in different ways for page valuation in the rankings.

#3 - Links from Unique Domains Matter More than Links from Previously Linking Sites

Domain Diversity of Links

Speaking of correlation data, no single, simple metric is better correlated with rankings in Google's results than the number of unique domains containing an external link to a given page. This strongly suggests that a diversity component is at play in the ranking systems and that it's better to have 50 links from 50 different domains than to have 500 more links from a site that already links to you. Curiously again, the original PageRank algorithm makes no provision for this, which could be one reason sitewide links from domains with many high-PageRank pages worked so well in those early years after Google's launch.
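Part of what makes this metric so appealing is how simple it is to compute. A minimal sketch (the function name and URL handling are my own, hypothetical choices):

```python
from urllib.parse import urlparse

# Diversity metric: count unique linking domains, not raw link totals.
# Note: netloc treats subdomains as distinct; a production system would
# likely normalize to the registered root domain instead.
def unique_linking_domains(backlink_urls):
    """backlink_urls: URLs of the pages that link to the target page."""
    return len({urlparse(url).netloc for url in backlink_urls})

backlinks = [
    "http://a.com/page-1",
    "http://a.com/page-2",   # same domain as above; adds no diversity
    "http://b.org/post",
    "http://c.net/article",
]
diversity = unique_linking_domains(backlinks)  # 3 unique domains, 4 links
```

Under this counting, the 500th link from an already-linking site moves the number not at all, while the first link from a new domain always does.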

#4 - Links from Sites Closer to a Trusted Seed Set Pass More Value

Trust Distance from Seed Set

We've talked previously about TrustRank on SEOmoz and have generally referenced the Yahoo! research paper - Combating Webspam with TrustRank. However, Google's certainly done plenty on this front as well (as Bill covers here), and this patent application on selecting trusted seed sites certainly speaks to the ongoing need for and value of this methodology. Linkscape's own mozTrust score functions in precisely this way, using a PageRank-like algorithm that's biased to flow link juice only from trusted seed sites rather than equally from across the web.
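A trust-biased PageRank of this sort can be sketched by restricting the "teleport" (base) score to the seed set, so that score only ever originates at trusted pages and flows outward along links. This is a toy sketch, not mozTrust's actual computation; the seed choice and damping value are illustrative assumptions:

```python
# Toy TrustRank-style propagation: identical to PageRank except the base
# score is restricted to a hand-picked trusted seed set.
def trustrank(links, seeds, damping=0.85, iterations=50):
    pages = set(links) | {t for targets in links.values() for t in targets}
    trust = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    for _ in range(iterations):
        new_trust = {p: ((1 - damping) / len(seeds) if p in seeds else 0.0)
                     for p in pages}
        for source, targets in links.items():
            for target in targets:
                new_trust[target] += damping * trust[source] / len(targets)
        trust = new_trust
    return trust

graph = {"seed": ["a"], "a": ["b"], "b": [],
         "spam": ["spam2"], "spam2": ["spam"]}
scores = trustrank(graph, seeds={"seed"})
# The spam pair links to itself endlessly but never accumulates any trust.
```

The design point: a spam cluster can trade links among itself forever, but with no path from a seed, its trust stays at exactly zero.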

#5 - Links from "Inside" Unique Content Pass More Value than Those from Footers/Sidebar/Navigation

Link Values Based on Position in Content

Papers like Microsoft's VIPS (Vision Based Page Segmentation), Google's Document Ranking Based on Semantic Distance, and the recent Reasonable Surfer stuff all suggest that valuing links from content more highly than those in sidebars or footers can have net positive impacts on avoiding spam and manipulation. As webmasters and SEOs, we can certainly attest to the fact that a lot of paid links exist in these sections of sites and that getting non-natural links from inside content is much more difficult.

#6 - Keywords in HTML Text Pass More Value than those in Alt Attributes of Linked Images

HTML Link Text vs. Alt Attributes

This one isn't covered in any papers or patents (to my knowledge), but our testing has shown (and testing from others supports) that anchor text carried through HTML is somehow more potent or valued than that from alt attributes in image links. That's not to say we should run out and ditch image links, badges or the alt attributes they carry. It's just good to be aware that Google seems to have this bias (perhaps it will be temporary).

#7 - Links from More Important, Popular, Trusted Sites Pass More Value (even from less important pages)

Link Value Based on Domain

We've likely all experienced the sinking feeling of seeing a competitor with fewer links, from what appear to be less powerful pages, outranking us. This may be partly explained by a domain's ability to pass value via a link in ways not fully reflected in page-level metrics. It can also help search engines combat spam and provide more trusted results in general: if links from sites that rarely link to junk pass significantly more value than links from sites whose linking practices are questionable, the engines can much better control quality.

NOTE: Having trouble digging up the papers/patents on this one; I'll try to revisit and find them tomorrow.

#8 - Links Contained Within NoScript Tags Pass Lower (and Possibly No) Value

Noscript Tag Links

Over the years, this phenomenon has been reported and contradicted numerous times. Our testing certainly suggested that noscript links don't pass value, but that may not be true in every case. This is why we included the ability to filter noscript links in Linkscape, though the overall quantity of links inside this tag on the web is quite small.

#9 - A Burst of New Links May Enable a Document to Overcome "Stronger" Competition Temporarily (or in Perpetuity)

Temporal Link Values

Apart from even Google's QDF (Query Deserves Freshness) algorithm, which may value more recently created and linked-to content in certain "trending" searches, it appears that the engine also uses temporal signals around linking to both evaluate spam/manipulation and reward pages that earn a large number of references in a short period of time. Google's patent on Information Retrieval Based on Historical Data first suggested the use of temporal data, but the model has likely seen revision and refinement since that time.
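One way to picture a temporal signal like this is to compare the rate of new links in a recent window against the page's historical average rate. Everything below - the window size, the formula, the function name - is a hypothetical illustration, not anything drawn from the patent:

```python
from datetime import datetime, timedelta

# Hypothetical burst detector: ratio of links gained in the latest window
# to the page's historical average per window. Values well above 1 mean
# an unusual spike in link acquisition.
def link_burst_ratio(link_dates, window_days=7):
    """link_dates: datetimes at which each new inbound link was first seen."""
    if not link_dates:
        return 0.0
    now = max(link_dates)
    recent = sum(1 for d in link_dates if now - d <= timedelta(days=window_days))
    span_days = max((now - min(link_dates)).days, window_days)
    average_per_window = len(link_dates) * window_days / span_days
    return recent / average_per_window

start = datetime(2010, 1, 1)
steady = [start + timedelta(days=20 * i) for i in range(5)]  # slow trickle
burst = [start + timedelta(days=95 + i) for i in range(6)]   # six links in a week
ratio = link_burst_ratio(steady + burst)
```

A signal of this shape could cut both ways, exactly as the post suggests: a spike from trusted sources looks like breaking relevance, while a spike from junk sources looks like manipulation.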

#10 - Pages that Link to WebSpam May Devalue the Other Links they Host

Spam and its Impact on Link Value

I was fascinated to see Richard Baxter's own experiments on this in his post - Google Page Level Penalty for Comment Spam. Since then, I've been keeping an eye on some popular, valuable blog posts that have received similarly overwhelming spam and, lo and behold, the pattern seems verifiable. Webmasters would be wise to keep up to date on their spam removal to avoid incurring potential ranking penalties from Google (and the possible loss of link value).


But what about classic "PageRank" - the score of which we get a tiny inkling from the Google toolbar's green pixels? I'd actually surmise that while many (possibly all) of the link features discussed above make their way into the ranking process, PR has stayed relatively unchanged from its classic concept. My reasoning? SEOmoz's own mozRank, which correlates remarkably well with toolbar PR (off on average by 0.42, with 0.25 being "perfect" due to the two extra significant digits we display) and is calculated with very similar intuition to that of the original PageRank paper. If I had to guess (and I really am guessing), I'd say Google has maintained classic PR because they find the simple heuristic useful for some tasks (likely including crawling/indexation priority), and have adopted many more metrics to fit into the algorithmic pie.

As always, we're looking forward to your feedback and hope that some of you will take up the challenge to test these on your own sites or inside test environments and report back with your findings.

p.s. I finished this post at nearly 3am (and have a board meeting tomorrow), so please excuse the odd typo or missed link. Hopefully Jen will take a red pen to this in the morning!



User Comments


  • May 27th, 2010 at 3:00 am
    Great, thanks - I'll add this to the checklist in our on-page and off-page optimization documentation at work. I also add nofollow to the checklist when researching the value of possible links.
  • May 27th, 2010 at 3:17 am

    Another useful refresher on links! For me the new idea / info is the page-level penalty for blogposts (it makes sense but never occurred to me).

    Matt Cutts has this video on Yahoo's TrustRank being "completely separate from Google."

    What I would love to know is what is known about Google's seed sites. E.g. is it a fact that the Library of Congress site is one of them, or is this merely speculation? I suspect Wikipedia would be another strong candidate.

    Rand, could you share any info about this please?

    Edited by Philip-SEO on May 27th, 2010 at 3:40 am
  • May 27th, 2010 at 3:18 am

    For me, this is a welcome return to form in your posts Rand.  This is an excellent resource which I for one have certainly learned from both in terms of new material and re-affirming things I already believed to be true.

    I don't have the facility (read: "time") to do a lot of testing, so I can't prove or disprove any of this but your research gives the impression of being thorough and therefore highly useful.  The diagrams really help too!

    I can definitely relate to point 7 regarding the strength of random page links from quality domains. Some months ago I was working on a friend's site and we managed to get a mention on the bbc.co.uk site from a page way down deep, and almost immediately I saw a rankings boost not only for the particular page they linked to but for the site in general. Obviously that's not scientific, but it certainly mirrors your experience here.

    The only minor issue I have with this post is point 2. I'd really like to get a statistician's opinion on whether .18 to .225 can really be described as "a much higher correlation" on a Spearman correlation scale. I used to dabble in this statistical method (a few years ago) and I don't think you can justify making such a bold statement given the numbers you produce. Obviously I don't know your sample size or your full methodology on this one, and I'm far from an expert on it myself.

    That aside, thanks for sharing this work, it will definitely be something I re-read several times.

    (Edit: spelling)

    Edited by Bludge on May 27th, 2010 at 3:19 am
    • May 27th, 2010 at 8:44 am
      We actually addressed this in the comments section of the post cited in point #2. It details the methodology, including the standard error of the correlations. The correlations come from very large n values, and I think it's safe to say that a difference of .005 or greater is significant.
      • May 28th, 2010 at 1:16 am
        Thanks for linking, I hadn't read all of the replies. Some of the explanation is a little beyond my knowledge, but it's good it was addressed so fully. I now accept that there is some statistical significance to the result, but the "much higher correlation" part of the analysis still doesn't sit comfortably with me. Statistically significant, yes; "much higher", no. Possibly I'm just nit-picking over semantics now.
  • May 27th, 2010 at 3:27 am

    Great stuff!

    Now that Google is adding site performance (loading speed) into how a page ranks, does anyone think this could also affect the value of outbound links from these sites?

    Edited by stukerr on May 27th, 2010 at 4:00 am
    • May 27th, 2010 at 5:53 am

      If sites are already being penalized for speed, it would make sense for their links to be worth less. I would also guess that a slow site probably doesn't give the best user experience, and hence might not be the best place to gain outbound links from in the first place.

      Although at present I'm not aware of many sites being impacted by the inclusion of page speed. It's something I'll keep in the back of my mind though.

  • May 27th, 2010 at 3:45 am
    Is it possible that a site with a high amount of traffic can pass greater link value?
    • May 27th, 2010 at 5:31 am

      How could I put it in my words...

      For me (it's not necessarily a bulletproof statement, so throw it away if it doesn't suit you), SEO is all about increasing a website's visibility.

      Visibility = More traffic

      More traffic = Better conversion rate (you get more chances to sell whatever you're selling)

      Better conversion rate = More money in your pockets (or whatever else you want to achieve, depending on your objectives, but it's usually everyone's ultimate goal... meh!)

      From this point of view, traffic is a *result* of the SEO process. Therefore, a high amount of traffic can't be an input to off-site SEO calculations: it would create a vicious cycle in the search engines' algorithms...

    • May 27th, 2010 at 5:56 am
      Well, if link value includes traffic then sure! But this is not technically a part of the SEO value of a link.
  • May 27th, 2010 at 3:59 am

    Your post is rocking and knowledgeable... I really appreciate the way you write. I would like to read more from you.

    Edited by richardbaxterseo on May 27th, 2010 at 4:49 am
  • May 27th, 2010 at 4:01 am

    Another awesome post! 

    This is related to point 10:

    What should be the course of action if one of your domains faces a spammy attack and multiple inbound links are created to the domain? (Maybe a whole post can be dedicated to this.)

    Lately one of our domains got hacked, and an HTML page and a .php page with a script to create umpteen inbound links were uploaded to the server.

    By the time we found out (i.e. within 4 days), the HTML page had been indexed and the hundreds of inbound links were reflected in WMT.

    Needless to say, all the rankings were affected and the domain started ranking for all sorts of irrelevant and embarrassing keywords.

    We have removed the HTML and PHP pages from the Google index using the WMT Remove URLs feature.

    The number of inbound links showing in WMT has since dropped, but what else can be done to reverse the process and get the rankings back? This domain ranked on page 1 for many relevant keywords, and the company was getting good online business through organic search.

    It is very frustrating when your hard work is sabotaged by a spam attack like this.

    In fact, as a matter of policy, WebPro never believes in link building; we advocate earning natural links for the domain, as we commented on the SEOmoz post at http://www.seomoz.org/blog/whiteboard-friday-sitewide-reciprocal-and-directory-links

    • May 27th, 2010 at 4:51 am
      Tell me about it! That's exactly how I felt when I'd been away only to return to find one of my best traffic generating posts (at the time) had dropped out of the rankings due to comment spam / safesearch filtering.

      I think in the long term I learned a lot more from the experience than anything I lost.
      • May 27th, 2010 at 7:02 am
        Richard, I'd love to see a post about your experience and what you learned from it. I've had my sites hacked before, and the only thing I know to do to prevent it is to keep my scripts up to date. If you have more tips, I'd be all ears.
        • May 27th, 2010 at 7:24 am

          Yes, I think all of us would be interested in knowing how to remove unwanted links, rather than how to build links, for a change.

          Looking forward to reading such a post.

          Right now GWT is the only tool available, as far as I know, to undo such harm.

          As SEOs, we should be well equipped with this knowledge too, so we can help a client come out of such a mess.

          Especially when they have spent so much on SEO: when such a situation arises, even though the SEO is not responsible for it, we should have a plan and strategy to help the client's website recover.

           

    • May 27th, 2010 at 6:01 am
      Now that you've removed the offending links, just wait for things to be recrawled and reindexed; you can force this a bit by pinging the URLs from which the bad links were removed. If your rankings are not back to normal in a month or so, you can apply for "site reconsideration" in GWT, explaining what happened, and Google will probably help you. But I think it's enough to just wait, and things will right themselves.
  • May 27th, 2010 at 4:36 am

    Awesome post. Thanks a lot.

     

  • May 27th, 2010 at 4:36 am
    Another great post! This page is directly going to my "cheat-sheet" bookmarks ^^
  • May 27th, 2010 at 4:50 am

    Thanks Rand... very useful post as always. I very much liked the infographics, as they helped me understand what was said even better.

    I would like to underline one corollary of what's explained in point 1 ("Links Higher Up in HTML Code Cast More Powerful Votes"): links in the footer cast less powerful votes than any other links in the HTML code, as they normally come at the end of it.

    And I find the note (also in point 1) that higher-up links can overwhelm on-page factors interesting... as the power of the title tag is almost dogma.

  • May 27th, 2010 at 5:01 am

    I had some incredible success with #9 recently. My site even outranked the keyword domain manufacturer site!

    That alone really demonstrates the worth of a targeted online marketing and SEO campaign.

  • May 27th, 2010 at 5:06 am

    I'm fairly new when it comes to most SEO topics, so this post has been a massive help - thanks!

    Really interesting to see what kinds of links perform better than others, and what I can do to help get my rankings up a bit.

    One thing I have come across a few times is internal linking. I will always link keywords from one page to another, but I never know whether it's best to link every occurrence of a keyword on a page or just one. I have seen arguments for and against.

    Also, you have shown that when a keyword appears first in a title tag it performs better, but is it best to have just the keyword or a short description? Again, I've heard arguments for both sides!

    Would be nice to know some other peoples opinions.

    Thanks again!

    • May 27th, 2010 at 5:23 am

      If you have a couple of links from page A to page B, only the first link and its anchor text will be counted towards page B - the rest are useless in terms of passing juice.

      So the best practice would be to have one link, with your most targeted anchor text, from page A to page B.

      hope that helps!

  • May 27th, 2010 at 5:18 am

    Amazing post, thanks so much for sharing!

     

  • May 27th, 2010 at 5:28 am
    good abstract of the most important things to know about links..
  • May 27th, 2010 at 5:37 am
    Great insights as always - very actionable, and something that can impact SEO activity right away. Thanks for passing this on.
  • May 27th, 2010 at 5:38 am
    No. 10 especially rankles me. Where legacy agency SEO tactics included mass placement on such pages, and those pages later gathered crap links, the client suffers in blissful ignorance. Multiply this by thousands, and just the job of monitoring those legacy links - let alone removing them - becomes a huge task that distracts from genuine, good link development effort.
  • May 27th, 2010 at 5:41 am

    How could we not bookmark this post?

    It's really "chicken soup for the SEO soul" stuff!

    Btw, the burst-of-new-links effect has misled me so many times in the past that I now build a 90-day delay into my procedure, starting the day I see my new links in YSE.

    Thanks for such great stuff Rand!

  • May 27th, 2010 at 5:50 am
    Great Link value refresher.

    I think I'll summarise this into bullet points and just use it as a reminder for myself. Heck, I didn't even know about some of these points!!
  • May 27th, 2010 at 6:32 am
    Very comprehensive! One thing I'm still curious about though is linking to relevant external sites from your site. Does this still help out a smidgen (assuming they are well respected relevant sites)?
  • May 27th, 2010 at 6:54 am
    Thinking of point 3 about links from other domains... how do links from other domains on the same IP play into this? I'd think they're less effective than links from domains on different IPs, but I'm curious what everyone else thinks.
    • May 27th, 2010 at 7:22 am

      I wouldn't have thought it would take into consideration whether the domain is on the same IP address or not... I honestly hope it doesn't!

      But yeah, to be on the safe side, I'm sure it would be best to get links from totally different sources, IPs and domains alike.

    • May 27th, 2010 at 7:31 am

      No scientific data to show you here, but the opinion is quite widespread that links from websites on the same IP address (and C-block) are less powerful than those from totally different IPs.

      Edited by gfiorelli1 on May 27th, 2010 at 7:55 am
  • May 27th, 2010 at 7:20 am
    A great piece - loved the illustrations! Although this is pretty old news to most SEOers, it's still a good refresher, and great if you're new.
  • May 27th, 2010 at 7:32 am
    Most excellent, Rand! I can see why it took you until 3AM to finish this. You've shared a ton of info, yet made it simple to understand with your "Rand Graphics". This would make a superb shiny wall poster. Y'all ought to consider branding some SEOmoz posters that incorporate some of your best "info" or "how to" posts.

    I didn't know about the ability of a higher link to overwhelm on-page factors, especially the title tag. I'll have to look into that more.

    Point #9 says the engine appears to use temporal signals around linking both to evaluate spam/manipulation and to reward pages that earn a large number of references in a short period of time. I wonder where the balance lies when building links. I've always subscribed to the doctrine of building a client's link profile slowly and steadily to avoid any kind of Google penalty, yet it sounds like they allow and even reward a flood of new links. I'd love to see more on this.

    Hope you're able to catch up on your sleep. Perhaps you can use your new door to good effect today and take a nap :)
    • May 27th, 2010 at 9:31 am
      Yeah, I share your views on building links slowly as well. I would have thought a sudden high volume of links to a new website would trigger the paid links/directory submission alarm bells with search engines or something like that.

      I guess it also has to do with where the links are coming from, if it's from a highly trusted source, or not...
      • May 27th, 2010 at 10:05 am
        Yeah Tola. You're probably right. If the links were coming from news sites it would be a cinch for Google to see they're not spam.

        But unless I hear different from the mozplex, as far as link building goes, I'll follow the old nursery rhyme:

        When you're unsure of the way to go, build your links...nice and slow.

        What? Your Nanny didn't used to tell you that rhyme? Huh. Come to think of it, my Nanny was a bit peculiar...
        • May 28th, 2010 at 1:20 am

          I think I would have been worried if my nanny used to sing that to me...lol!!

          But yeah I'll go with the rhyme myself!

  • May 27th, 2010 at 7:35 am

    thanks, awesome post.

    I have noticed that some websites with terrible content but very good links rank better...

    and TrustRank & the age of the website are more and more relevant.

     

     

    • May 27th, 2010 at 9:28 am
      One of my client's main competitors has a site with mostly low-quality links and an old domain. I think the age of the domain matters a lot, but what about the age of the links? Do the engines keep track of how long a link has existed and use that to help determine its voting power?
  • May 27th, 2010 at 8:37 am

    Another nicely presented and informative post, Rand. I think most seasoned SEOs would agree with all of these assumptions. Like you say, there is nothing like testing - that's where the true insights are gained.

    You have a great knack for presenting complicated issues easily and effortlessly with your infographics. A picture really does speak a thousand words!

    Out of interest, which software do you use in order to create them?

     

  • May 27th, 2010 at 8:42 am

    Excellent info. Now, I have seen that many websites are creating lots of blogs on different free blogging platforms and linking from them using content links - what about these kinds of links? They have diversity of domains from authority sites (they make blogs on the best newspapers' sites in Argentina, Spain, etc.).

    They are ranking high for some competitive keywords in the Spanish market with just a few blogs created this way.

     

  • May 27th, 2010 at 8:48 am
    For point #8, I wonder if it's possible that the disparities in your measurements are due to Google now following basic JavaScript links.
  • May 27th, 2010 at 8:58 am
    I currently have 3 .gov links coming into my site.  I've heard they are about 20 times more powerful than most other links.  Is this true?
    • May 27th, 2010 at 9:36 am
      I don't think anyone can quantitatively say how much more or less a link is worth. I would say look at #4: .gov domains may have more authority because they are closer to a "trusted seed", and perhaps not because of their .gov status. I can also imagine that the relevance of the page plays a big factor. Selling lawn tractors on an FCC forum probably won't pass much juice, but a link from a relevant .gov site would definitely help more.
    • nbd
      May 27th, 2010 at 5:18 pm
      Have to agree with mathewshoffner - more likely due to the site's own standing rather than its .gov status. Matt Cutts makes this point over at Google Central. I'd lean towards his view more than your '20 times more powerful' source.
  • May 27th, 2010 at 9:07 am

    SEO to me is like one gigantic jigsaw puzzle, with tiny pieces of the puzzle scattered all over the web (and many false pieces thrown in for good measure). The jigsaw puzzle deliberately has pieces missing, but if you can put enough of the pieces together, you'll get a pretty good picture of what's going on.

    This post for me is like finding a pretty large piece of this puzzle. It's quite exciting to get a glimpse of some pretty good testing around the anatomy of a link.

    Regarding #7, I think a lot of link builders try to piggy-back off trusted domains that allow you to throw up a web page with followed links. Presumably from your comments these types of sites that allow user-generated content won't pass much value given that they probably end up linking to a lot of junk sites.

  • May 27th, 2010 at 9:40 am

    Amazing article, Rand. A few things I knew, but a few I didn't.

    It really helped me a lot to understand things from a link placement & diversity point of view.

    Thanks for sharing this detailed information. Really appreciate it. You always rock :)

     

  • May 27th, 2010 at 9:57 am
    What's a randtom?

    as in randtomtestreggae.com (see #7). Just proof that I actually read your post.
  • May 27th, 2010 at 10:05 am

    Very good! I will translate this on my site: www.webmarketing-seo.com.br

    Edited by chenry on May 27th, 2010 at 1:17 pm
  • May 27th, 2010 at 10:31 am

    Love it!

    Too many people don't understand how to create better links for better ranking. These are all quick and actionable examples of how to create better links.

  • May 27th, 2010 at 10:37 am
    I think I was addicted to Fuzquatziik at one point in my life.

    As always, thanks for sharing your research.
  • May 27th, 2010 at 11:29 am
    Porn Loans and Viagra Poker, very original. 
  • May 27th, 2010 at 11:46 am
    Another great post. I'll be sure to point people to this article when we have a client who still thinks they can buy their way to a great link profile!
  • May 27th, 2010 at 11:50 am
    Phenomenal. I enjoyed what I've been learning from your experiments and experience in regards to link placement and link quality.
  • May 27th, 2010 at 12:00 pm
    Rand - I've been reading this blog since 2006. This is a great summation of how Google has grown and delineated the signals they use for quality control and value. This reminds me of some posts that Jim Boykin put together a few years ago, but yours makes the pecking order even easier to understand. Some of these ranking factors seem intuitive once you understand what Google wants to give the user. Thanks.
  • May 27th, 2010 at 12:44 pm

    Great stuff.

    What do you think about temporary internal links? There are a lot of plugins or widgets that rotate posts and display random links to other posts - for example the plugin that lists "similar posts" at the end of an article.

    Are these temporary links offering any value, or do they even raise some red flags? They are useful for the user, but I imagine the little robot getting dizzy trying to figure out how to count these hide-and-seek links.

    • May 27th, 2010 at 1:17 pm

      A Yahoo patent application from 2006 pointed at how search engines might identify those types of links and avoid crawling them.

      The patent is Consecutive crawling to identify transient links

      In short, it says that the search crawling program might visit a page and make a list of the URLs on that page, then return and refresh the page a minute or so later and make another list of the URLs, and compare the two lists. It may then mark URLs that appear on both lists as ones to possibly crawl, ignoring the URLs that changed from the first visit to the second.

      The URLs that change are referred to as transient links, and the patent notes that they may also visit and crawl the pages those lead to so that the search engine can ignore those links in the future.
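      For the curious, that consecutive-crawl idea can be sketched in a few lines of Python. This is a toy illustration under my own assumptions, not Yahoo's implementation: the function names and regex-based link extraction are invented, and a real crawler would use a proper HTML parser and politeness rules.

```python
import re
import time
import urllib.request

HREF_RE = re.compile(r'href="([^"#]+)"')

def snapshot_links(url):
    """Fetch a page once and return the set of URLs it currently links to."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return set(HREF_RE.findall(html))

def classify_links(first_snapshot, second_snapshot):
    """Compare two snapshots of the same page: links present in both are
    stable (queue them for crawling); links appearing in only one snapshot
    are transient (likely rotating ads or widgets) and can be skipped."""
    first, second = set(first_snapshot), set(second_snapshot)
    stable = first & second
    transient = (first | second) - stable
    return stable, transient

def crawl_candidates(url, delay=60):
    """Visit the page twice, about a minute apart, per the patent's approach."""
    first = snapshot_links(url)
    time.sleep(delay)
    return classify_links(first, snapshot_links(url))
```

      The diffing step is what matters: only the URLs that survive both snapshots get queued, which is exactly the "measure twice" behavior described above.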

       

      Edited by bill slawski on May 27th, 2010 at 1:19 pm
      • May 28th, 2010 at 6:25 am

        Good point Bill, and thanks for reminding me about one of the best, though understated, aspects of this article: the patent references. While I may not grasp the full scientific depth of a patent doc, I believe it's another resource for getting an idea of a search engine's focus. Thanks for refreshing that point for me.

        As for transient links and other visitor-useful tactics, it's important to understand how link juice flows. But in cases like this one, I believe we should also use the tools and tactics that help the visitor, even if they don't directly influence ranking.

        Perhaps the point of this mini-discussion is whether the plug-in is useful, or whether the author should manually link to these related posts. In my mind, if it's a choice between doing something useful for the visitor or dropping it because there's no SEO benefit, I hope we would choose to side with the visitor.

        It's that balance of always doing what's right for the visitor along with doing what's right for SEO and her fickle prophets: Google, Bing and so on.

        Edited by Mike_CN on May 28th, 2010 at 6:27 am
        • May 28th, 2010 at 9:21 am

          Thank you, Mike.

           I had noticed that Rand put this post in his "IR and Patents" category, so I figured that it wouldn't be a bad idea to  respond to icanhazseo's question with something from a patent that was pretty much on point, even though it was from Yahoo instead of Google.

          I've always felt that it was as important, if not more so, to look at the assumptions behind a patent filing rather than just the examples that they provide within the patent. The big takeaway for me from the Reasonable Surfer model patent, for instance, is that the links that appear most likely to be clicked by a visitor stand a good chance of being the ones that carry the most weight.

          On Yahoo's transient link patent application, the main purpose appears to be to keep a search engine from using up its crawling budget by following links to advertisements.  It's sort of a "measure twice, cut once" approach to web crawling - the links that don't change after a minute or so and a refresh are likely the more important links.

          I do like widgets or link sections that might change over time to point to things like "most popular posts," or "recent comments" or "related posts" or on an ecommerce site, something like "featured products," and I think those can be a useful way of providing an alternative way for visitors to experience a web site.

          What I've recommended in the past, based upon the ideas behind that patent filing, is that if a widget or link section like that isn't likely to change over a short interval of time, it isn't likely to be harmful. If it changes every time it's viewed, a search engine might confuse the content and links within it with random advertisements that might not be so important to crawl.

          So, for instance, if you set up a "most popular products" section on a category page for an ecommerce site, it might be a good idea to change those links once a day or once a week instead of every time someone visits. The benefit of the links is to allow a visitor to see some interesting alternatives, to provide a search crawler with links to deeper content on the site, and to keep the search engine from confusing those links with ads.

           A "related content" widget on a blog post will likely show the same links for a post until possibly a new blog post is made that might be more related, so it probably wouldn't be a problem as a "transient" link.
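          A minimal sketch of that recommendation in Python: cache the widget's picks for a fixed interval, so consecutive page views (including a crawler's revisit-and-refresh) see identical links. The class name and interface here are invented for illustration.

```python
import time

class RotatingLinkWidget:
    """Serve a 'most popular products'-style link block that refreshes at
    most once per interval (e.g. daily), not on every page view."""

    def __init__(self, pick_links, interval_seconds=86400, clock=time.time):
        self.pick_links = pick_links   # callable that returns a fresh link list
        self.interval = interval_seconds
        self.clock = clock             # injectable for testing
        self._cached = None
        self._stamp = 0.0

    def links(self):
        now = self.clock()
        if self._cached is None or now - self._stamp >= self.interval:
            self._cached = self.pick_links()   # refresh only when stale
            self._stamp = now
        return self._cached
```

          Because two requests inside the interval return the same list, a "measure twice" crawler would classify these links as stable rather than transient.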

          I agree with you that it's usually a better idea to side with the visitor, but if you can come up with a way to best serve a visitor and make it more likely that a search engine will crawl your pages, that's not a bad approach to take as well. As you called it, a balance.

  • May 27th, 2010 at 1:58 pm

    Hi Rand,

    Thank you for the mention of my posts on the two Google patents.  I think the patents both provide some interesting topics for discussion.

    One thing that I would like to state here, because I've been recently asked about this, is that your post involves PageRank or link equity rather than the hypertext relevance of anchor text associated with links.

    As for your first point on links appearing higher in HTML, interestingly, it's the original Google patent (Method for node ranking in a linked database) that discusses that possibility as a factor, amongst a few others:

    Additional modifications can further improve the performance of this method. Rank can be increased for documents whose backlinks are maintained by different institutions and authors in various geographic locations. Or it can be increased if links come from unusually important web locations such as the root page of a domain.

    Links can also be weighted by their relative importance within a document. For example, highly visible links that are near the top of a document can be given more weight.* Also, links that are in large fonts or emphasized in other ways can be given more weight. In this way, the model better approximates human usage and authors' intentions. In many cases it is appropriate to assign higher value to links coming from pages that have been modified recently since such information is less likely to be obsolete.

    * my emphasis

    While we don't know whether or not those modifications were implemented by Google, the newer patent looks at a much wider range of features involving links, the pages they appear upon, the pages that they point towards, and user behavior data associated with the pages and the links.

    The Reasonable Surfer patent tells us that the search engine would consider all of those features in combination, using a feature vector, rather than considering them in isolation.  Weights for links would be generated based upon both user behavior data and the feature vector.

    So, a link that appears higher in the HTML of a page that appears commercial in nature, uses anchor text that isn't related to the content on the page, isn't clicked upon as much as other links on the page, and so on, may not be given as much weight as other links on the page.
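    To make the feature-vector idea concrete, here's a toy Python sketch of how several link features might combine into a single weight. Every feature name and weight below is invented for illustration - the patent describes the mechanism, not actual values:

```python
# Hypothetical feature weights; positive features raise a link's value,
# negative ones lower it. A real system would learn these from data.
FEATURE_WEIGHTS = {
    "position_from_top":         -0.02,  # links higher in the HTML do better
    "in_main_content":            1.5,
    "font_size_px":               0.05,
    "anchor_matches_page_topic":  1.0,
    "looks_commercial":          -1.0,
    "observed_click_rate":        3.0,   # the user-behavior signal
}

def link_weight(features):
    """Combine a link's feature vector into one weight - features are
    considered in combination, not in isolation."""
    score = sum(FEATURE_WEIGHTS[name] * value
                for name, value in features.items()
                if name in FEATURE_WEIGHTS)
    return max(score, 0.0)  # a link never passes negative value
```

    Scored this way, a prominent, on-topic, frequently clicked link ends up far above a commercial-looking footer link, which matches the behavior described above.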

    The patent does provide some other examples that may challenge some of the other illustrations that you provide.

    Another example from the patent that somewhat challenges your second illustration:

    For example, model generating unit 410 may generate a rule that indicates that a link positioned under the "More Top Stories" heading on the cnn.com web site has a high probability of being selected. Additionally, or alternatively, model generating unit 410 may generate a rule that indicates that a link associated with a target URL that contains the word "domainpark" has a low probability of being selected.

    An interesting twist on illustration number five is that some sites may have more than one "main content" section, such as a news page that has a number of headlines.

    A Microsoft patent, "Method and system for calculating importance of a block within a display page," tells us that it might give different weights to different content blocks based upon looking at features such as:

    1. The size and number of images in the block,
    2. The number of words in a block,
    3. The number of links, and words within those links in the block,
    4. User interaction within blocks, such as forms and the size of input fields.
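    A toy scoring function over those four feature families might look like the following. The weights are invented; the patent names the features, not how they are combined:

```python
def block_importance(block):
    """Score one content block from the features Microsoft's patent lists:
    images, word count, links (and words inside them), and form inputs.
    'block' is a dict of hypothetical extracted features."""
    score = 0.0
    score += 0.5 * block.get("image_count", 0)
    score += 0.01 * block.get("image_area_px", 0) / 1000.0
    score += 0.02 * block.get("word_count", 0)
    links = block.get("link_count", 0)
    if links:
        # More anchor words per link suggests editorial content;
        # many bare links look like navigation or ads.
        score += 0.3 * block.get("link_word_count", 0) / links
    score += 1.0 * block.get("form_field_count", 0)  # user-interaction signal
    return score
```

    Under such a scheme, a text-heavy article block can outscore a link-dense sidebar even when the sidebar sits higher on the page.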

    A multiple topic page may have links in a particular block that are very related to the topic of that block, but are unrelated to the topics in the other blocks.

    In a page that covers multiple topics, one topic block may be seen as more important than another, even if it is lower on the same page.  A link from that block might be viewed as more important than a link from another main content block.

    I think there may potentially be some issues with a number of the other features, but I do appreciate this post tremendously as a jumping off point for discussing how links may vary in value based upon a wide number of factors.  

    Edited by bill slawski on May 27th, 2010 at 2:03 pm
    • May 28th, 2010 at 12:23 pm

      Hey Bill - thanks a ton for the terrific contributions, both here and on your blog. I certainly agree that more nuanced positions likely exist on all of these, and that our observations are, as I mentioned, worth further testing.

      • May 28th, 2010 at 3:49 pm

        Thank you, Rand.

        That's what makes SEO so interesting and fun.

        A really good patent, like this reasonable surfer model one, raises more questions than it answers, giving us interesting things to test and try, and opportunities to challenge our own assumptions and those of others. :)

  • May 27th, 2010 at 2:00 pm
    Great post Rand...Many things to go and mess around with now. ;-)

    Cheers,

    Kevin
  • May 27th, 2010 at 2:12 pm
    well done randfish - keep up the good work!
  • May 27th, 2010 at 2:21 pm
    great post on value of links rand. thx for the direct link to Google's patent on docs as well
  • May 27th, 2010 at 2:57 pm

    #1 - Does this only apply to internal pages, or to other sites linking to you as well? I cannot see the logic, for other sites linking to you, of why a link higher up in the code is more important than something farther down. Take this post for example: if SEOmoz linked to one site under #1 and another site under #5, would the first link get more juice or authority than the second just because it is found first on the page?

    #3 - If site A already links to you on one or two pages, will getting another link from it - such as from another article that you wrote, like YouMoz - not do you much good?

    • May 27th, 2010 at 3:07 pm

      I am a little confused on trying to figure out the value of links from certain web sites/pages.  For example:

      We have a link from site A.  The page that links to us is a PR 6, homepage is 8 and this page is 2 clicks from the homepage or any other page on the site.  We are almost the exclusive external link on the page.  The site doesn't have many external links period.  However, SEOmoz data on this page is Page Authority: 33, Domain: 64, whereas homepage has 77 Page Authority, 7.14 MozRank and 6.83 Domain MozRank.  Based on the Page Authority of that page linking to us with a 33, I assume this link doesn't do much good at all?  There are people's profile pages on SEOmoz that have a much higher Page Authority than that! 

      I also find it interesting how the page authority is so low, but Google gives it a PR of 6.  Does that say PR is really a useless number now?

    • May 27th, 2010 at 5:45 pm

      In answer to your first paragraph (entry #1):

      That is how I've interpreted it as well, but you have to realize this is a blog, so the link weighting algorithm is most likely different from that of a corporate website. If a search engine can break the hierarchical structure of a web page into distinct "blocks," as Bill mentioned, then it stands to reason that a search engine can also group websites, blogs, online news publications, etc. into categories of their own and treat the link juice factor differently depending on which category a site falls into.

      Edited by jpandian on May 27th, 2010 at 5:49 pm
  • May 27th, 2010 at 4:27 pm

    Great post.

    I must admit, SEO has confused me, but I am determined to learn more about it.

    Very easy to read, just got to make sure that I put into practice what you preach.

    Kind regards

    Simon

  • May 27th, 2010 at 4:36 pm
    Nice article.  I was just asking my SEO guy about one of these today.
  • May 27th, 2010 at 6:01 pm
    This is very valuable information and analysis. Thank you for sharing !
  • May 27th, 2010 at 6:34 pm

    One aspect of link valuation that I've tried extensively to find data on is the degree to which links pointing to a subdomain pass authority on to the root domain (and its underlying pages).

    This is extremely important when looking to build link equity for the root domain through company blogs & link bait. Oftentimes, these link assets are easier to build at subdomains - and sometimes we have no choice but to build them there. What are the trade-offs, and how significant are they?

    Any thoughts on this would be much appreciated! 

     

  • May 27th, 2010 at 6:44 pm
    it seems too hard for me
  • May 27th, 2010 at 7:30 pm
    All of this makes sense. The one big surprise for me was the extent to which placement on a page makes such a difference. But, upon reflection, it makes sense. It is easy to throw a link at the end of a page; placing it near the top is a very clear "vote" of confidence in the link.
  • May 27th, 2010 at 8:50 pm

    For someone who is a visual learner, like myself, I appreciate this! Great post Rand.

    I was surprised that there was no mention of follow attributes, although I am sure the majority of readers already have a basic comprehension of them.

     

    Looking forward to tomorrow's WBF

  • May 27th, 2010 at 9:24 pm

    I got a lot out of this post...thx Rand....

     tony

  • May 27th, 2010 at 9:42 pm

    Great post on the multiple aspects of link juice.

    Something I am not clear on - and maybe someone here has had time to test this - is what juice, if any, flows from a nofollow link on a high TrustRank site. No anchor text or PR transfers, I know, but does the TrustRank 'proximity' pass juice in any meaningful way?

  • May 27th, 2010 at 11:31 pm
    Just what we needed. I think that most active webmasters understand that more content and more backlinks lead to better ranking, but have quite a fuzzy idea of how those backlinks work and where they should be placed and with whom.
    I've been following SEOmoz for a while now, and even though I get something out of most posts, this is the one that should be made compulsory reading for all visitors :-)
  • May 27th, 2010 at 11:43 pm

    Question: does every industry have a "seed" site? And if yes, how do you find that seed site?

    For example which site is a seed site for a payday loan website or a debt consolidation website?

    I am a bit unclear about the seed site thing...thanks

    • May 28th, 2010 at 1:59 am
      In the UK, I would guess that something like the Financial Services Authority would be treated as fairly authoritative in the payday loan sector.
  • May 28th, 2010 at 12:23 am
    Great summary! I'll send this over to the link slaves straight away!

    The value of in-content links is very interesting. Placing articles in 3rd party sites will continue to rise as an activity methinks.

    Jeremy.
  • May 28th, 2010 at 3:48 am
    Like everyone else, I would like to thank you for all this shared knowledge. The graphs are simply the best way to illustrate what links are about (and how they work), and they're awesome :)
  • May 28th, 2010 at 5:04 am
    Thank you for this very detailed post.
  • May 28th, 2010 at 10:33 am
    Wonderful read. The illustrations are always a plus in my book. It was interesting to see the theory on the order of links on a page. Thank you for the great material.
  • May 28th, 2010 at 12:23 pm
    Great post Rand. On point #9, you mention the benefit of overcoming stronger competition with bursts of fresh links. I've always been under the impression that slow and steady link acquisition was the way to go. Can you expand on this?
    • May 29th, 2010 at 1:21 am

      Yes, quickly building or getting a burst of links can make a site rank higher, according to the freshness factor. I've seen this with sites I've worked on.

      Normally SEOs suggest slow and steady link acquisition because of the fear of getting a penalty for building links too fast (a tactic many spammy sites have abused). But the only reason there would be a penalty for this is if it worked. So if the links are natural and non-spammy, I don't think you would have to worry about a sudden burst of links.

      Hope that helps!

      Edited by seosean on May 29th, 2010 at 1:21 am
  • May 29th, 2010 at 4:48 am
    Your post is rocking and knowledgeable... I really appreciate the way you write. I would like to read more from you. I must visit again when I am free. herbal hgh