A look inside the world of search from the people of Yahoo!

September 29, 2005

Webmasters, tell us what we don’t know

Having been to a couple of Search Engine Strategies conferences, I’ve realized how much you look to search engines for information on how they index your content. I’ve heard stories of elaborate scripts that scrape search engines, using ‘site:’, ‘link:’ and ‘linkdomain:’ queries to understand your content’s relationship to other pages on the web. Through these queries, Yahoo! provides unique information, but often there is more that you are looking for.

Today we are launching Site Explorer from Yahoo! Search, a webmaster tool we talked about at SES San Jose. WebmasterWorld and Search Engine Roundtable have been tracking it, and so have some folks on My Web. Currently, you can use Site Explorer to:

- see which pages from your site, or from any subpath of it, are in the Yahoo! index (what ‘site:’ queries show)
- see which pages link to a URL or to an entire domain (what ‘link:’ and ‘linkdomain:’ queries show)

Site Explorer is geared toward your needs: it provides 50 results by default, web services APIs, export of the data to a TSV file for further analysis, and free submission for missing URLs.
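For the API side, here is a quick sketch of what fetching inlink data might look like from Python. The inlinkData endpoint and its appid/query/results/start parameters match the example URL quoted in the comments below; the XML element and attribute names are assumptions based on other Yahoo! Search web services, so treat this as illustrative rather than definitive.

import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

BASE = "http://api.search.yahoo.com/SiteExplorerService/V1/inlinkData"

def inlinks(page_url, appid="YahooDemo", results=50, start=1):
    # Build the query string; register your own application ID for real use.
    params = urllib.parse.urlencode({
        "appid": appid,
        "query": page_url,   # the page whose inlinks you want
        "results": results,  # page size, matching the 50-result default
        "start": start,      # 1-based position for paging
    })
    with urllib.request.urlopen(BASE + "?" + params) as resp:
        root = ET.fromstring(resp.read())
    # Paging metadata reported on the ResultSet root.
    print(root.get("firstResultPosition"), root.get("totalResultsReturned"))
    # "{*}" ignores the XML namespace; the Url element name is an assumption.
    return [el.text for el in root.iter("{*}Url")]

# Example: the first 50 inlinks to the Yahoo! Search home page.
# for link in inlinks("http://search.yahoo.com"):
#     print(link)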

Tell us what we don’t know. If you don’t find a URL that you expect to be in the index, use the free submission option. In case you hadn’t heard, we are also accepting lists of URLs, so you don’t have to submit them one at a time.

This is a starting set of features of what we hope becomes a truly valuable tool for you to interact with us. So please send us feedback, tell us how the product works for you, and let us know what else you’d like to see. Enjoy exploring, and tell us what you don’t find!

Priyank Garg
Product Manager, Yahoo! Search

Posted by Yahoo! Search at September 29, 2005 01:05 PM
Comments

This is a helpful and efficient asset for both Webmasters and SEOs.

Thank You :-)

Anticipate using it very often

Really looking forward to more innovations from Yahoo!! (That's Yahoo! - plus exclamation)

Posted by: Search Engines Web at September 29, 2005 01:23 PM

An asset to SEOs? I certainly hope not! I like to see relevant search results, not "optimized" ones.

Posted by: Anonymous at September 29, 2005 02:41 PM

This site is an option for SEO site research, as an alternative to using search.yahoo.com for their analyses.

Yahoo!'s search SERPs are still as relevant as before - it is just that Site Explorer accepts ONLY URLs as queries, and presents the results in a more focused format.

Posted by: Search Engines Web at September 29, 2005 03:32 PM

The 5,000-query rate limit will be a big problem for online tools, so most of them will probably keep scraping your sites in the future, just as they are doing now. Furthermore, I had hoped to discover more than 1,000 links with Site Explorer. At the moment I am a little disappointed.

Posted by: Jojo at September 29, 2005 04:13 PM

Simple. For vBulletin or similar sites, let us submit URLs like this:

http://www.domain.com/forums/showthread.php?t=1
through
http://www.domain.com/forums/showthread.php?t=1001

where you increment the numbers. And you can infer that this number grows over time, so we shouldn't have to revisit all the time to submit.

Maybe give us an interface for telling you, through a file on our site updated daily, the high and low numbers for our various URL schemes.
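A rough sketch of that expansion in Python (the domain and the bounds are placeholders from the example above):

# Hypothetical sketch: expand a numeric thread-ID range into one URL per
# line, ready for Site Explorer's list-of-URLs submission.
base = "http://www.domain.com/forums/showthread.php?t={}"

with open("urls.txt", "w") as f:
    for thread_id in range(1, 1002):  # t=1 through t=1001, inclusive
        f.write(base.format(thread_id) + "\n")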

Posted by: GilbertZ at September 29, 2005 05:56 PM

Thanks! That's a neat toy, very much appreciated. As requested, here is a greedy geek's wish list.

Posted by: Sebastian at September 30, 2005 01:52 AM

Oops, the tags were stripped out ... here is the wish list:
http://www.smart-it-consulting.com/article.htm?node=148&page=104

Posted by: Sebastian at September 30, 2005 04:13 AM

Very cool. I saw this at SES and was anxiously anticipating it. My only feedback would be to allow downloading a TSV of more than 50 items. I tried browsing to the second and third SERPs, hoping this would let me download a TSV of the next and subsequent 50 links, but each page only downloaded the original 50.

Posted by: Ryan Roberts at September 30, 2005 08:59 AM

I am ALL FOR getting info from the SEs' tools. This stuff about looking at your incoming links is cool and all, but what about learning more about who's clicking through?

With personalized search results becoming all the rage, and every person receiving a different set of search results for the same search, how will we know where we ranked? How will the APIs work with that?

With personalized search, the SEs will have lots of info about the people clicking through to our sites. But how will that info be passed on to us?

I'm hoping the information provided to search marketers about their target markets and the places they advertise (free or not, PPC or organic) will be much more than just a site where we can look up some link and page relationships, and will instead advance toward the level of information we get about the people we advertise to in traditional marketing.

Posted by: laura at September 30, 2005 09:25 AM

I'm glad you're paying attention to webmasters a little more.

Posted by: Joeychgo at September 30, 2005 11:29 AM

For those who are interested, I put together a script for IPB and vBulletin forum users to create a list that can be submitted to Site Explorer. The info for this app can be found here:

http://www.bleepingcomputer.com/yahoo_site_explorer.php

Posted by: Lawrence Abrams at September 30, 2005 04:27 PM

The inlink API does not seem to work correctly when specifying a start param.

Try this one:
http://api.search.yahoo.com/SiteExplorerService/V1/inlinkData?appid=YahooDemo&query=http://search.yahoo.com&results=5&start=3

Notice that in the result, firstResultPosition is 1 though it should be 3, and totalResultsReturned is 7 though it should be 5.
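A quick way to see it (assuming firstResultPosition and totalResultsReturned are attributes on the returned ResultSet root):

import urllib.request
import xml.etree.ElementTree as ET

URL = ("http://api.search.yahoo.com/SiteExplorerService/V1/inlinkData"
       "?appid=YahooDemo&query=http://search.yahoo.com&results=5&start=3")

with urllib.request.urlopen(URL) as resp:
    root = ET.fromstring(resp.read())

# These should echo the request (start=3, results=5) ...
print(root.get("firstResultPosition"))   # ... but comes back as 1
print(root.get("totalResultsReturned"))  # ... but comes back as 7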

HELP!

Posted by: Ed at September 30, 2005 05:26 PM

This is an asset for SEOs. We should start using it to look up all our backlinks and the status of web page submissions at Yahoo!

Posted by: Amit Doda at October 2, 2005 09:39 PM

Hey Priyank... neat tool, but it doesn't really change what was available. Long-time webmasters already know how to check links to a page and links to a domain - and they already know how to check indexing for the whole site or for only folders or subdomains.

I see a few advantages - one, there are no descriptions, and two, there is an API - but it is missing some features to make it truly useful (thus, beta).

Give us the ability to condense IBLs from the same domain into one listing (maybe with a number beside it totaling how many are from that domain) so that we can weed out scraper sites. Allow us to download all the results, or at least a bit more than the first page, especially when we can't condense URLs.
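In the meantime, something like this over a TSV export gets close to that condensed view. It's a sketch only: it assumes the linking URL is in the first column, which may not match the actual export layout.

from collections import Counter
from urllib.parse import urlparse
import csv

counts = Counter()
with open("inlinks.tsv", newline="") as f:
    for row in csv.reader(f, delimiter="\t"):
        if row:
            counts[urlparse(row[0]).netloc] += 1

# Scraper domains with piles of near-duplicate inlinks float to the top.
for domain, n in counts.most_common():
    print(f"{n:6d}  {domain}")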

A good move, though, as a show of effort at webmaster relations - more than some of the other engines have made.

Posted by: Rae at October 4, 2005 04:57 AM