September 29, 2005
Webmasters, tell us what we don’t know
Having been to a couple of Search Engine Strategies conferences, I realized how much you look to search engines for information on how your content is indexed by them. I’ve heard stories of elaborate scripts that scrape search engines, using ‘site:’, ‘link:’ and ‘linkdomain:’ queries to understand your content’s relationship to other pages on the web. Through these queries, Yahoo! provides unique information, but often there is more that you are looking for.
Today we are launching Site Explorer from Yahoo! Search, a webmaster tool we talked about at SES San Jose. WebmasterWorld and Search Engine Roundtable have been tracking it, and so have some folks on My Web. Currently, you can use Site Explorer to explore the pages we have indexed for a site and the links pointing to those pages.
Site Explorer is geared towards your needs, providing 50 results by default, web services APIs, the ability to export the data to a TSV file for further analysis, as well as free submission for missing URLs.
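As an illustration of what you might do with an export, here is a minimal sketch that post-processes a downloaded TSV file, tallying results by hostname. The column layout shown (a `Title` and a `URL` column) is an assumption for the example, not the documented export format, so adjust the field names to match the actual file.

```python
import csv
import io
from urllib.parse import urlparse

# Hypothetical Site Explorer TSV export; the real column layout may differ.
sample_tsv = (
    "Title\tURL\n"
    "Home\thttp://example.com/\n"
    "About\thttp://example.com/about\n"
)

def count_pages_by_host(tsv_text):
    """Tally exported result URLs by hostname -- useful, for example,
    to see which domains dominate an inbound-link report."""
    counts = {}
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        host = urlparse(row["URL"]).netloc
        counts[host] = counts.get(host, 0) + 1
    return counts

print(count_pages_by_host(sample_tsv))  # {'example.com': 2}
```

Reading the export with `csv.DictReader` rather than splitting on tabs by hand keeps the script robust if columns are reordered in the file.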
Tell us what we don’t know. If you don’t find a URL that you expect to be in the index, use free submit. In case you hadn’t heard, we are also accepting lists of URLs, so you don’t have to provide us one URL at a time.
This is a starting set of features of what we hope becomes a truly valuable tool for you to interact with us. So please send us feedback, tell us how the product works for you, and let us know what else you’d like to see. Enjoy exploring, and tell us what you don’t find!
Product Manager, Yahoo! Search
Posted by Yahoo!Search at September 29, 2005 01:05 PM
This is a helpful and efficient asset for both Webmasters and SEOs.
Thank You :-)
Anticipate using it very often
Really looking forward to more innovations from Yahoo!! (That's Yahoo! - plus exclamation)
An asset to SEOs? I certainly hope not! I like to see relevant search results, not "optimized" ones.
This site is an option for SEO site research, as opposed to using Search.yahoo.com for their analyses. Yahoo's Search SERPs are still as relevant as before - it is just that Site Explorer only accepts URLs as queries, and presents the results in a more structured format.
The 5,000-query rate limit will be a big problem for online tools, so most will probably prefer to keep scraping your sites in the future, as they are doing now. Furthermore, I had hoped to discover more than 1,000 links with Site Explorer. At the moment I am a little disappointed.
Thanks! That's a neat toy, very much appreciated. As requested, here is a greedy geek's wish list.
Very cool. I saw this at SES and was anxiously anticipating it. My only feedback would be to allow downloading a TSV of more than 50 items. I tried to browse to the second and third SERPs, hoping this would let me download a TSV of the next 50 links and so on, but each page only downloads the original 50.
I am ALL FOR getting info from tools from the SE's. This stuff about looking at your incoming links is cool and all, but what about learning more about who's clicking through?
With personalized search results becoming all the rage, and every person receiving a different set of search results for the same search, how will we know where we ranked? How will the APIs work with that?
With personalized search, the SE's will have lots of info about the people clicking through to our site. But how will that info be passed on to us?
I'm hoping the information provided to search marketers about their target markets and the places they advertise (free or not, PPC or organic) will be much more than just a site where we can look up some link and page relationships - maybe advancing to the level of information we get about the people we advertise to in traditional marketing.
I'm glad you're paying attention to webmasters a little more.
This is an asset for SEOs. We must start using it to figure out all the backlinks and the status of web page submissions at Yahoo!
Hey Priyank... neat tool, but it doesn't really change what was available. Long time webmasters already know how to check links to a page and links to a domain - and they already know how to check indexing relating to the whole site or only folders or subdomains.
I see a few advantages - one, there is no description, and two, there is an API - but it is missing some features to make it truly useful (thus, beta).
Give the ability to condense IBLs from the same domain into one listing (maybe with a number beside it totalling how many are from that domain) so that we can weed out scraper sites. Allow us to download all the results, or at least a bit more than the first page, especially when you can't condense URLs.
Good move though in a show of making an effort at webmaster relations - more than some of the other engines have done.
Disclaimer and Reminder. The opinions expressed here are not necessarily the opinions of Yahoo! and we assume no responsibility for such content. Yahoo! may, in our sole discretion, remove comments that are off topic, inappropriate or otherwise violate our Terms of Service. Please do not post any private information unless you want it to be available publicly and never assume that you are completely anonymous and cannot be identified by