Profile Information

Display Name:
SajeetNair
Job Title:
Senior Project Manager
Company:
SMG Convonix
Type of Work:
Agency
Location:
Mumbai, India
Favorite Thing About SEO:
Google Analytics
Bio:
Senior Project Manager at SMG Convonix. I have been working in the internet marketing domain for over 4 years now, and I am also an analytics enthusiast. You can find me on Twitter at @SajeetNair.
Blog Bio:
With over 5 years of work experience in digital marketing, Sajeet Nair is a Senior Project Manager at SMG Convonix, a digital marketing firm based in India. You can also find him on Twitter.
Favorite Topics:
Analytics, Blogging, Keyword Research, On-page SEO, Technical SEO

Protect Your Website From Becoming a Doorway Page - Google Patent Style

- Posted by SajeetNair to Blogging

I have a confession to make: I hate patents, as I simply do not understand them and probably never will. Maybe it's the lingo, maybe it's the plain formatting, maybe it's the unbelievably complicated language used to describe the simplest of statements, or, even worse, maybe patents are written in a language that is not English and I have not yet realized it.


Understanding the True Power of Language Using Google Analytics

- Posted by SajeetNair to Analytics

Some of the finest marketing messages in the world are made up of simple, unimposing words; their simplicity turns out to be their USP. Of course, in the ever-so-intimidating online world it takes more than a one-liner to get "CONVERSIONS". Evolution is the key to success, exploring new business opportunities is an integral part of that process, and one of the most crucial (and most often neglected) aspects of it is "Language".


Blog Post: The Best of 2012: Top Posts and People of the Year

Seeing my name there (twice) seriously made my day, although I think there is one more comment of mine which got 32 likes and should also be featured here :) (Sorry if I am being a little too greedy.)

This post is surely one to bookmark; there are some amazing posts in there. Congratulations to all the contributors.

Note to Self - High time I submit a Guest post to SEOMOZ

Cheers,
Sajeet 

Blog Post: International SEO: Dropping the Information Dust

Hi Gianluca,

First of all, great post. If you don't mind, I would like to add a few points -

Regarding Alternate Tags - 

The biggest challenges are faced by multinational companies that sell thousands of products and have country-specific pages with the same content. In such cases, adding multiple alternate tags becomes a huge issue. For example, if I have an electronics product page and the same product is sold across multiple countries, each with its own country-specific URL and country-specific content, then the webmaster has to add multiple lines of alternate-tag code to every product page. Imagine the plight of the webmaster when there are thousands of products across 50-60 countries.

However, one cannot simply ignore this issue. Around mid-April and May of 2012, when Penguin and Panda were on a rampage, I noticed a lot of websites fall in rankings; in such cases Google started showing the version of the site it considered the original source. For example, in the UK Google started ranking the US website despite the fact that a UK website existed. The theory here is that, because the US and UK sites were selling the same product with the same specifications (content), Google ranked the version it thought was the original source of that content, i.e. the US site. Why did Google do that? The reasons can be many: higher domain authority, a better link profile, etc. What intrigued me was that it did not remove the domain from the SERP altogether. With the Panda update that deals with content scraping and duplicate content came a problem I call the "Geographic Bundle of Confusion". The problem was not limited to the UK or the US; the same happened in India, where the US website started ranking in place of the Indian version. In South America there were many cases where the Spain version of a website started ranking in place of the Argentina version, simply because both sites had exactly the same content and Google did not consider the latter the original source.

In such cases the significance of alternate tags cannot be stressed enough. One might argue: what is the harm, the domain is ranking anyway, so does it matter whether it is the US domain or the UK domain? The answer is measurability. One should also not ignore the fact that every geo-specific website is designed keeping in mind the local needs of its users.

From a geo perspective, there are a lot of things the webmaster/SEO can do to help the website -

  • Apart from the head section, where the alternate tag normally goes, you can also specify the alternate tags in the XML sitemap. However, I would strongly recommend implementing them in the head section as well (a rough sketch of generating these tags at scale follows this list).
  • But what about PDF files? Google has not officially rolled this out, but implementing alternate tags in the HTTP header response (similar to header-response canonical tags) might help the cause; then again, precisely because it is not official, it might not help at all.
  • One can also add geo tags to the pages with the latitude and longitude; this will help in geo-profiling the web pages.
  • Needless to say, specifying the geographic target in Google WMT is the most important step.
  • Building as many local citations as possible will also help, as will having local pages on Google Places, Yahoo Local, etc.
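Since maintaining these annotations by hand across thousands of product pages is exactly the pain point described above, here is a minimal sketch of generating the tags programmatically. The country/URL data and URL patterns are made up for illustration, not taken from any real site.

    # Minimal sketch: generate hreflang alternate tags for one product sold
    # on several country-specific URLs. All data below is hypothetical.

    country_versions = {
        "en-us": "http://www.example.com/product-123",
        "en-gb": "http://www.example.co.uk/product-123",
        "en-in": "http://www.example.co.in/product-123",
        "es-ar": "http://www.example.com.ar/producto-123",
    }

    def head_tags(versions):
        """The <link> elements to drop into the <head> of every country version."""
        return "\n".join(
            '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
            for lang, url in sorted(versions.items())
        )

    def sitemap_entry(own_url, versions):
        """One <url> entry carrying the same annotations in an XML sitemap
        (the urlset element must declare the xhtml namespace)."""
        links = "\n".join(
            '    <xhtml:link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
            for lang, url in sorted(versions.items())
        )
        return "  <url>\n    <loc>%s</loc>\n%s\n  </url>" % (own_url, links)

    if __name__ == "__main__":
        print(head_tags(country_versions))
        print(sitemap_entry(country_versions["en-us"], country_versions))

Run once per product from the product catalog, this turns the thousands-of-products-times-50-countries problem into a template change rather than a manual edit.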


Hope my comments were useful.

Cheers,
Sajeet

Blog Post: Outranking Google

Larry Page recently announced that any internet marketer who could shout at the top of his voice, in front of the whole world, "I DON'T CARE ABOUT GOOGLE" would receive tickets to Disneyland. A lot of internet marketers did participate in this contest and did exactly as they were told, yet Larry Page did not spend a single penny. You know why? Because all those people were already living in Disneyland. :P

OK, sorry for the bad joke, but the reality is that we don't live in Disneyland but in a real, PRACTICAL world. At least for the next 5 years we cannot ignore the mighty G. Theoretically it's all good, but practically it is not possible.

- Sajeet

Blog Post: The 6 Month Link Building Plan for an Established Website

Good link building plan, but my genuine concern has always revolved around the concept of sustainable link building strategies. I completely understand the concept of a 6-month or 1-year link building plan in cases where the website is pretty new and needs that initial push for building links.

But I feel the purpose gets lost somewhere. In this damn age of Panda and Penguin, the best way to survive is to ensure that the website itself becomes a link machine. What I mean is that the aim of all link building strategies should be to promote the website not for rankings but for awareness: awareness that the website is a great source of great content, which people will naturally link to.

One other concept very few people are aware of is citations. Not sure why, but whenever a content piece is syndicated people always expect a link in return. A mere mention of, say, abc.com is good enough for the website abc.com. Don't believe me? Check your Google WMT data, where mere citations are also reported as links.

Anchor text - A lot of SEO experts will agree that brand linking is the way to go. Instead of focusing on keywords, brand names as anchor text are all that will be needed. The focus should be on building thematic content that convinces the search engine spiders that the website should rank for a certain set of keywords.

Lastly, I have noticed recently that a lot of clients come to me saying the website has been hit by Penguin or Panda, and after inspecting their backlink profiles, the one common thing among them is a poor link profile with loads and loads of forum links, comment links, etc., i.e. links that were built by another SEO agency and that cannot be removed. In such cases I believe there are only two options: either let the client know that overriding the effect of such negative SEO will take a lot of time, or simply BUY a new domain and start afresh.

@James - Would love to know your thoughts on what your response would be to such clients.

Note - I guess you forgot to mention Majestic SEO, a brilliant tool for backlink analysis.

- Sajeet

Blog Post: The New On-Page Optimization - Whiteboard Friday

Great WBF, but one question: in your experience, while implementing the strategies you have discussed here, how much difficulty did you face in managing timelines? Did you set a predefined timeline, or was it handled on the fly? Please note that I totally understand the dynamic aspect of it and that it varies from project to project.

I am asking this question as this is something clients always ask us while discussing strategies.

- Sajeet

Blog Post: The Lowdown on Structured Data and Schema.org - Your Questions Answered!

Great info, Daniel. A few questions from my end as well -

  • What if I implement microdata on some pages and microformats on others? Is there any remote possibility that I might see some weird results?
  • Certain websites show rich snippets without implementing any format. Do you think those websites should go ahead and implement structured markup anyway, to ensure that Google does not remove the rich snippets feature for them?
  • Do you foresee customized markups in the future, where an individual creates a markup and then submits it to schema.org for approval?
  • Lastly, do you think structured markup will play a major role in rankings in the near future, or do you think it will be misused just like meta keywords?


- Sajeet

Blog Post: Unnatural Link Warnings and Blog Networks

Dan,

India?? Really?? Spammers are present everywhere, not just in one country.

Cheap labor does not necessarily mean cheap-quality work. It would be greatly appreciated if a specific country were not singled out and labeled a hub of cheap spammers.

- Sajeet

Blog Post: Building Links with Video Content

Hi Jacob,

Some very good points. However, I wanted to ask: since Google has now started indexing textual content inside Flash, do you think videos should now focus on textual content as well and not just visual content?

Also, do you think the addition of videos (which will eventually increase page load time) causes any negative impact from an SEO perspective?

- Sajeet

Blog Post: Going Viral on Pinterest: Driving Big Traffic and Making Pinterest a Real Marketing Solution

If only I could "Pin" this post on SEOMOZ :)

Thanks for the great post and great info. However, what are your thoughts on the effect of Pinterest's rising popularity on individual content posts?

I ask this as I feel the concept of paid ads or posts is not far away. Also, with more and more people submitting content, the time for which a post stays on the home page will shrink significantly.

What are your thoughts?

- Sajeet

Blog Post: A Visual Guide to Rich Snippets

Hi Selena,

I believe you are talking about the reply just above your last message.

The test was conducted at my end: the same content (say, the address of a location) was marked up across different pages using the different formats. Although the rich snippet tool showed the same result for all three formats, the search engines recognized microdata more quickly than the other formats when the changes were pushed live, and hence I agree with you that microdata is probably the way to go.

Thanks a lot for your insights, really appreciate it.

@incept @adoptionhelp - Thanks for your responses as well.

- Sajeet

The concept and importance of rich snippets is something that really needs a lot of explanation, especially on the client side. A few informed clients will quote this line from Google's help page:

Marking up your data for rich snippets won't affect your page's ranking in search results, and Google doesn’t guarantee to use your markup.

Then explaining the concept of CTR comes into the picture, to which the client replies:

"What if I am ranking on the 5th or 10th page of Google, how will rich snippets help me? Also, Google doesn't guarantee to use the markup."

The infographic that you have provided will truly be useful in tackling such clients and highlighting the advantages.

But here's the deal: in the case of WordPress, yes, we have some amazing plugins, but what about other popular CMSs like Drupal or Joomla?

I also ran a test to identify which markup format is the most useful: microformats, microdata, or RDFa. It turns out that the rich snippet tool identifies all three, but when implemented on a live site, for some strange reason microdata gets picked up more quickly than the other markup formats.

I might not have considered a lot of variables here, but I would like to know which format you would prefer and why: microdata, microformats, or RDFa?

- Sajeet

Blog Post: 5 Awesome Content Ideas for Boring Niches

Hi Mike,

Now, the main concern when it comes to content marketing is measurability. I totally agree with you that there is not a single content piece or niche that can be categorized as boring, but what happens after the content is generated is even more crucial.

So somewhere down the line, after the content piece is generated, we need to look at the avenues where that content will be syndicated or made viral; depending on the niche, different forms of content marketing come into the picture. For example, for the lathe industry, which is basically a subset of the heavy machinery industry, video marketing (which is also a form of content marketing) will probably be a better option than textual content marketing. While strategizing, we will also have to consider the time available to distribute the content and make it viral.

Rand wrote a fantastic post on how to make content go viral, but identifying the target audience is also key here. Taking the example of the lathe industry, the target audience might not be well versed with terms like feeds or social media, but they would definitely watch YouTube and probably, in their spare time, watch some training videos, an avenue that needs to be capitalized on.

Overall, a great video.

Thanks,

Sajeet

Blog Post: The Brand of SEO and the Trend of Inbound Marketing

Hi Rand,

I guess that’s the notion that we probably need to change. People sometimes (kind of) associate SEO with either spamming or just link building.

When we use such a term we need to tell them that brand monitoring also includes monitoring the performance of the website in terms of rankings, because eventually the rankings of highly searched terms also define the BRAND's popularity. Compared to "SEO", it is also easier to explain.

Also, as I stated above, SEO can no longer exist as an independent entity, so we might as well sell it as a subset of something a brand will understand more easily.

- Sajeet

Hi Rand,

The only entity, so to speak, that can exist independently is PPC.

Today's online ecosystem demands that all the other platforms support each other. For example, social factors play an important role in organic search, the social platforms themselves should ideally be reachable via organic search, and there are cases where ranking for brand terms is even more crucial than ranking for non-brand terms.

So to sum it up, I believe we need to define and identify a term that gives a more holistic view of what is done, so I would suggest

ONLINE BRAND MONITORING AND REPUTATION MANAGEMENT.

The name is not scary, it is easily understandable, and it will help brands connect with the digital world. It also covers every platform: search engines, social platforms, and the website itself.

- Sajeet

Blog Post: An Open Letter to New SEOs

Thanks Vinod.


Hey Anil,

I definitely do not get paid by anyone to write comments; since SEOMOZ is such a great platform for discussion, I put forth my opinion quite often.

In fact, you will see me commenting on a lot of blogs, sometimes daily. From your comment I understand that you clearly do not like my comments; hopefully one of these days you will find them useful.

- Sajeet

Thanks Dr Pete for the sound advice, really appreciate it :)

Thanks Asad, appreciate the feedback here :)

Thanks Vahe for the kind words, I am working on a post and will soon be submitting it to Youmoz :)

Hi Dr Pete,

Those are some awesome tips that you have given.

For someone who is starting afresh, the first and foremost thing would be to talk to someone who has been in this field for some time and can explain the history in detail. It's important to identify the key areas an individual needs to focus on, and the different avenues that define SEO need to be explained.

Analytical Tools - Some say Google Analytics is mainly relevant to analytics consulting, but I disagree; in fact, analytics can make or break an SEO's career. One needs sound knowledge of Google Analytics and other popular analytics software like Omniture. The same goes for tools like Google WMT, Bing WMT, etc.

Link Building - The SEO fuel, if I may say so. It's important that right from the beginning the person is made aware of, and taught, the difference between white hat and black hat SEO. If not trained properly, the poor guy will end up doing black hat SEO all the time and, before he realizes it, he will be out of the market.

Content Writing - Some call it content marketing, but either way it is one of the most important elements that define the success of an SEO campaign. The newbie should learn how to write SEO-friendly content that is not keyword-stuffed.

Basic Level of Coding - Of course an SEO need not be a master programmer, but he should be aware of the basic syntax used in some of the popular languages like PHP, ASP, etc.

Identify your weaknesses - Some are good at writing, some may be expert coders, some may understand the algorithms, and some may just know how to optimize on-page elements. In any case, it's important that you understand your weaknesses and strengths and work accordingly.

Follow and learn from the right people - There are some leading experts out there, experts like Dr. Pete, Rand Fishkin, Gianluca Fiorelli, David Mihm, Mike Blumenthal and, very recently, Mike King, who are doing a fantastic job of spreading SEO knowledge, and it's important that the newbie recognizes these people and follows them. You know what they say: always learn from the best.

Lastly, try to build your own niche - If someone advises you to be an expert in all fields of SEO, I would say that's probably not the best idea. Identify where your talent lies; that will come only after working in this field for maybe 1-2 years. It may be analytics, places, content writing, or something else, and once you identify where your skill set lies, work towards being the best you can be in that field.

- Sajeet

Blog Post: The Web Developer's Interactive Cheatsheet for SEO and the Open Graph

Excellent stuff, Ray; I must say, great job.

I was just wondering, could you also please add fields for the language tag and geo tags?

For the language tag we could have two fields, country and language, and for geo tags we could have a field where we enter the location coordinates.

Also, if it's not too much, how about adding the robots tag for the head section, i.e. follow and index?

Cheers,

Sajeet

Blog Post: Visualizing the Marketing Funnel - Whiteboard Friday

Knowing that buses are good places to first meet eventual wives is indeed good information.

I have a strong gut feeling that you will not get food today, or maybe worse, you will have to COOK today.

ROFL :P (J/K)

- Sajeet

Interesting question. In fact, based on user behavior data in terms of the nth visit, we will be able to identify the type of keyword that brought the visit.

For example, for the first visit we can get the number of visits under (not provided) and then segregate them into branded and non-branded. The segregation will be hypothetical, and to be safe I'll assume a 50-50% split.

However, that should ideally not be the case with the second- and third-level visits, as someone who landed on our website the first time via a keyword would most probably land on it again using a brand term rather than a non-branded term.

This way we will actually be able to break the (not provided) data down into non-branded and branded visits. The data will not be accurate, but at least we will have some breakdown and understanding of the number of branded and non-branded visits.
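To make that concrete, here is a back-of-the-envelope sketch of the split; the visit counts and the assumed branded share per visit number are invented for illustration, not real data.

    # Rough split of (not provided) visits into branded vs non-branded by
    # "count of visit". All numbers and ratios are assumptions.

    not_provided_visits   = {1: 4000, 2: 1500, 3: 600}   # visits by visit number
    assumed_branded_share = {1: 0.50, 2: 0.80, 3: 0.90}  # returning visits skew branded

    branded = sum(v * assumed_branded_share[n] for n, v in not_provided_visits.items())
    non_branded = sum(v * (1 - assumed_branded_share[n]) for n, v in not_provided_visits.items())

    print("Estimated branded (not provided) visits:     %.0f" % branded)
    print("Estimated non-branded (not provided) visits: %.0f" % non_branded)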

Excellent point WojKwaski :)

- Sajeet

Blog Post: Influencer Marketing - What it is, and Why YOU Need to be Doing it

Thanks Gianluca,

I am working on a post and when completed will surely submit it to Youmoz. Hopefully people will like it :)

- Sajeet

Hi Eric,

That's one awesome post, and the concept of influencer marketing is very intriguing, but here's the deal (please do not misunderstand my points; I am just trying to state my views here) -

The biggest challenge is not the first step, i.e. identifying an influencer; the biggest hurdle is the second step, i.e. building contact.

A few questions that a marketer should think about before contacting an influencer:

What should I talk to him about?

It's important that you do not talk random crap. It's also important to remember that the first impression should always be the best impression.

Which medium should I use? Facebook, Twitter, LinkedIn, email?

Very crucial: what's the point if your influencer is not very tech savvy and does not understand the meaning of an @mention? Identify their favorite spot and BANG, catch them there :)

What will make the influencer interested in responding to me?

This is again a tricky one; it's important that you showcase yourself as someone who knows what he is doing. One should definitely not look like an idiot while contacting an influencer.

Where should I draw the line between stalking and following?

The most common rookie mistake any marketer makes. Never show desperation; it's important that you give the influencer due importance, but at the same time, for heaven's sake, don't act desperate. Give it time and be patient.

If the influencer replies, how long should I wait before contacting them again?

This is where the word planning comes into the picture. A little understanding of human psychology is important here; if a person is going through a crisis, it is obviously not a good time to send them a mail saying, "look at the awesome blog post that I have written".

What USP does my brand have?

Never approach an influencer if you don't have anything to offer; what's the point? Put yourself in the shoes of the influencer: why would he respond to something that has no value?

Is there anyone who can influence the influencer?

The biggest and most important strategy: identify people who might not be great influencers themselves but are definitely important to the influencer in question. Try to reach out to them, and then via them reach the influencer.

What is the nature of the influencer?

This is a slightly tricky one, and some people may not like it, but the reality is that I would prefer to talk to someone who is well natured and does not think of himself as the KING of the world. Trust me, you wouldn't want to work with such morons, unless of course they truly are the kings and queens of their field.

These are some of the questions a marketer should always have answers to, and the reason is very simple: a potential influencer probably gets something like a thousand emails, and there has to be something that makes you stand out.

Most importantly, classify your influencers. For example, do not go straight for the biggest influencer; start with slightly lesser-known people, interact with them, start building your reputation, and then go for the big fish.

The concept of how an influencer can help is great, and it is something you have explained brilliantly, but I believe it's the approach and strategy that go into contacting an influencer that are MUCH MORE important.

- Sajeet

Blog Post: 12 Things That Will Kill Your Blog Post Every Time

Thanks Michael,

Appreciate the Kind words here.

- Sajeet

Hi Neil,

Thanks for the insightful post, but I believe there are a few more things that define the success of a blog -

Having Kick-ass Images - Let's be honest, "looks do sell". What I mean is that a blog post should not be filled with just text; no matter how intriguing the post may be, the impact an amazing image can create is something a text-only post cannot match. In fact, images can at times make or break the image of a blogger: with funny or cocky images the blogger is displaying not only his writing style but also his persona.

Humor - This might be a slightly controversial one, but let's face it, it takes a lot of skill to get the sense of humor right, and if applied properly it can bring a huge amount of success to the blog and the blogger.

Appreciation - Very few bloggers appreciate the reader and the sponsors; by sponsors I mean people who have text link ads on their page. I mean, what's wrong with a little thank you?

Response to Comments - I won't take names, but unfortunately there are loads of so-called big bloggers out there who just do not respond to comments. Why the hell do you even write the post if you are not interested in the people who are reading it?

Going out of the way - Alright, so you are an expert in your field, but it will not harm you if someone wants to learn something. It might be a small query in the form of a comment; you don't have to charge FEES all the time to help someone. Who knows, that small good deed might get you some awesome referrals :)

Invite People - Everyone loves an invitation, so the next time you are writing a kick-ass post and you know it, send an email blast. There are loads of providers who offer free email blasts.

Spread the love - If there is someone else in your field who is doing good work, show some appreciation. People will, by default, look up to you as the bigger guy.

- Sajeet

Blog Post: How to Leverage Content Communities to Expand Your Brand Reach

Hi Julianne,

Thanks for the information here. When it comes to content marketing, I totally agree with you that one needs a clear understanding of why the content is produced and who the target audience is, something very few people understand. But when it comes to small brands it really does become a herculean task, especially when you are dealing with small brands in a very niche category.

In such cases, before any content marketing initiatives are taken, it becomes imperative to establish the brand as a big player in that niche, and that can be achieved through paid strategies like press releases (e.g. PRWeb), followed by display and banner ads, which will play a significant role here, probably more than conventional PPC ads on Google. Honestly, this might be one of those cases where citations and local results play a bigger role than social media platforms in the initial stages.

The next step would be to identify brand endorsers in that niche, tie up with them, and then leverage their traffic and brand. The bottom line is that an initial investment is required to give that push and highlight the brand.

Having established the brand, the concept of engaging the customers comes into play, and that's where great content needs to be generated. People might not agree with my theory and say,

"Dude you are talking crap, Produce great content and over TIME You will see the results".

Please note the highlighted word, TIME, something that very few clients have or understand; hence, from a practical point of view it probably makes sense to first help the client establish the brand and then shift the focus to content marketing.

All the other viral strategies you have mentioned above will fall into place, be it videos, infographics, etc. BTW, not sure how many of you have tried Panoramio, but it's a brilliant citation tool, the highlight being the geo tag feature.

PS - Regarding content marketing, I am bluntly assuming that the client will have content writers at their end; if they do not, then the overall cost will probably increase when a content writer is hired :)

- Sajeet

Blog Post: Exploring the New Features in Bing Webmaster Tools

Hi Daniel,

Thanks for the post. A few things that I love about Bing WMT -

  • One verification code for all websites in a single account - It's just so convenient, especially for an agency: I can create one account, paste the same code on all the websites I manage, and view the data, with no need to hunt for a new code for every website.
  • Traffic data on a day-to-day basis - I just love this feature where they actually give data on a daily basis, something Google WMT does not have.
  • The same goes for crawl data and pages with crawl errors.
  • The crawl details are also far more organized and are sorted by type.


One feature that I would love to have is linking Bing Analytics (I am assuming here that they will someday launch Bing Analytics) with Bing WMT, something Google has been able to implement successfully.

One amazing feature that I would like to highlight is the Bing Index Explorer (I wish I could attach a snapshot here), where I have the option to block URLs, clear the cache, etc.

With so much amazing data, the only sad part is the traffic: if I had big numbers coming from Bing, analyzing the data would have been so much fun.

- Sajeet

Blog Post: 11 Google Analytics Tricks to Use for Your Website

Hey Eugen,

Thanks for the Great Post!!!!

A few points from my end as well - 

  • Use of Regular expressions to extract the required data
  • Use of custom Variables/User defined variables
  • Use of Advanced segments to segregate data quickly
  • Use of advanced filters to get the required data


- Sajeet

Blog Post: Stop Paying for Stupid Clicks: Negative Keywords for Positive ROI

Hi Keri,

Thanks for the insightful post. A few tips from my end as well -

  • Google Analytics is the best source for finding keywords that are useless: by sorting keywords by bounce rate, we can spot keywords that are not converting.
  • Using regular expressions, I can also identify keywords that are often misspelled and result in a high bounce or exit rate. For example, dat(e|a) analysis: data analysis and date analysis mean different things, and depending on the website one of them needs to be removed from the keyword basket (a small sketch applying such a pattern follows this list).
  • Numbers are at times killers. What I mean is that for e-commerce websites, product numbers/IDs sometimes end up in the keyword basket, and the tragedy is that ads then show up for numbers like 1, 2, 3..., which is pretty irritating. Appropriate rules should be in place to avoid such cases.
  • Generic keywords - this is kind of a tricky one, as it totally depends on your budget. I have noticed cases where generic keywords produce a high bounce rate and very few conversions, but it's very difficult to ignore them as they are so relevant to the client's business. However, the least such keywords can do is build brand awareness.
  • Long-tail keywords - Again, Google Analytics can help identify those supremely awesome keywords, but there are also keywords which do not really help us; individually they might not be expensive, but cumulatively they can burn a big hole in your pocket.
  • I have also noticed ads showing up for geo location terms like "California", "Georgia", etc. A very simple one, but it should be avoided.
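Here is the small sketch mentioned above: applying a misspelling pattern like dat(e|a) to an exported search-terms list so the unwanted variant can be flagged as a negative keyword. The terms are made up for illustration.

    # Flag search terms matching a misspelling pattern such as "dat(e|a) analysis".
    # The exported terms below are invented for illustration.
    import re

    pattern = re.compile(r"\bdat(e|a) analysis\b")
    search_terms = ["data analysis software", "date analysis astrology", "data analyst jobs"]

    for term in search_terms:
        match = pattern.search(term)
        if match:
            print("%-28s -> matches %r" % (term, match.group(0)))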


PS - I have noticed a lot of marketers who place the conversion tracking code on their landing pages but not the GA code, which is kind of a shame, as those pages are a source of some amazing data.

- Sajeet

Blog Post: How Google Makes Liars Out of the Good Guys in SEO

Hi Wil,

Thanks for the post but we really need to ask one question - 

Do we live in an ideal world or a practical world?

At the end of the day, it's important to realize that a search engine robot is a piece of software. It does not have the intelligence that we human beings do.

Taking the example of the SEO company: if you check the link profiles of the companies ranking in the top 3, it would be very harsh to say their link profiles are not good. I am not saying they are excellent, but they are not poor either.

Let me take a simple example: an SEO company goes and asks its clients to place a footer link as follows - "Digital Marketing by SEO Company Company Name", with "SEO Company" as the anchor text.

In an ideal world those clients would not belong to the SEO category, so thinking that way we should not even get link value; in fact, some people might even say that the website should be penalized.

But now let's think logically and apply semantics: the very fact that so many websites are placing the link on their home pages suggests that the SEO company is trusted, and the anchor text "SEO Company" only helps the search engine spiders make an informed decision. Not to forget the 500-odd other signals that the "SOFTWARE" will have to go through. The bigger the brand, (ideally) the bigger the brand value.

I am not saying that one should not generate good content but at the same time generating Good content all the time is also not the solution. 

Every Good SEO agency tells two things to their client - 

  • We cannot guarantee Number 1 rankings
  • SEO is a slow process


The very mention of the second statement scares clients. Now, if the ideal platform were put into place, the word "slow" would get converted to "slower".

I think everybody appreciates the concerns you have raised, but all I am trying to say is that we need to be practical as well; after all, what is the point if we only talk about great content and no client comes to us?

- Sajeet

Blog Post: February Linkscape Update: 66 Billion URLs

Hi Rand,

This is just so freaking awesome.

I was just wondering, would it be possible to have something like "historic link profile data"? You guys remove historic data, but I believe you must store it somewhere, right?

I am only asking because, from an analyst's point of view, it would be very helpful. For example, if one of my competitors' rankings fell drastically, by comparing the historic data with the fresh data I would be able to analyze what went wrong for them, and then refrain from using the same tactics.

- Sajeet

Blog Post: 92 Ways to Get (and Maximize) Press Coverage

Gianluca and Chris,

Thanks for the reply, really appreciate it :)

- Sajeet

Hi Chris,

Thanks for an informative post, but there are certain concerns that I have always had when it comes to PR. For starters, I am really not sure if PR marketing is the right way to go for small businesses; because they are just starting out, it can be difficult for them to get recognition in the early stages. My second concern is time: applying all these strategies will surely take a lot of time, and that time constraint is very difficult for agencies to explain to their clients, especially mid-size clients. My third concern is brand partiality. Let's be honest here, a startup will ideally not be able to convince a journalist; in fact, there is a very high probability that small brands will get ignored and big brands will always be given preference. Also, a lot of PR websites like PRWeb are pretty expensive.

Also, from what I understand, aligning the other internet marketing strategies like SEO, PPC, and maybe to some extent SMM with PR will be a herculean task.

Somewhere down the line I can't help but think that a lot of investment is necessary in order to make a PR strategy a SUCCESS. Please correct me if I am wrong.

Outlining an ideal step-by-step process is always great to read and listen to, but let's be honest, it's the practical aspect that needs to be highlighted.

- Sajeet

Blog Post: Web Site Migration Guide - Tips For SEOs

Hey Modesto,

The sole purpose was to pass the SEO value that these PDF pages might have garnered.

- Sajeet

Excellent post, Modesto; you have covered almost everything. However, a few points from my end.

DNS propagation is a very crucial aspect of any website transition. Unfortunately there is no specific recorded time frame within which the search engines recognize the new IP; in some cases it's 3 days and in some cases it has been as little as 2 hours. The key here is to identify the business hour of the day during which the fewest visits are received - that's when we should initiate the propagation.

There are cases where PDF files are no longer required on the new website, and sometimes it's a tricky situation: ideally, if they are being redirected, they should be 301 redirected to corresponding PDF files on the new website. In such cases mapping becomes difficult, and either they should all be redirected to the home page or blocked via robots.txt.

In cases where the URL extension is going to change, for example .html to .aspx, extra care needs to be taken. Say we try to implement abc.com/index.html >> 301 redirect >> abc.com; here canonical tags must be placed well in advance, as a blanket 301 redirect rule cannot be applied because it would result in a redirect loop. The second option would be abc.com/index.html >> 301 redirect >> abc.com/index.aspx, and then a canonical tag on abc.com/index.aspx pointing to abc.com.
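As a rough illustration of the mapping work involved in an extension change, here is a sketch that turns a list of old URLs into mod_alias style 301 rules; the paths and domain are hypothetical, and the index page is deliberately skipped because, as noted above, it needs separate handling.

    # Sketch: build 301 redirect lines for an .html -> .aspx extension change.
    # Paths and domain are hypothetical.

    old_paths = ["/about-us.html", "/products/widget-1.html", "/index.html"]
    new_domain = "http://www.abc.com"

    for old in old_paths:
        if old == "/index.html":
            continue  # handled separately (canonical tag / direct rule) to avoid a loop
        new = old[:-len(".html")] + ".aspx"
        print("Redirect 301 %s %s%s" % (old, new_domain, new))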

Just to ensure everything is in place, I would also add geo tags and, in the case of multiple geo-targeted websites, place alternate tags to avoid duplicate content issues and language tags to specify the language.

And since we are talking about a complete transition anyway, we might as well ensure that all the JavaScript is called via external files, just to ensure a higher text-to-code ratio.

Once again Congrats on such an awesome debut on SEOMOZ.

Cheers,

Sajeet

Blog Post: Create Crawlable, Link-Friendly AJAX Websites Using pushState()

Hi Rob,

Thanks for the very informative post, but I have a few questions (again, please note that I am not a programmer, so you might find my questions extremely stupid) -

  • For the website http://html5.gingerhost.com/new-york, as you stated, the text on the right-hand side does not change and the video keeps running. My question is: isn't that an example of template-driven content? The same content is present on all the pages; won't such an implementation result in duplicate content?
  • In the example above, i.e. the "ON THIS TEXT" link, from an SEO perspective what I see is that the URL is the same but the content does not change, so isn't that a case of duplicate content again?
  • I understand that you probably used these examples to emphasize the reduction in page load time, but can you provide some examples where this AJAX implementation is used without creating duplicate content issues?
  • Will data-drawing APIs work with this implementation? For example, if I have a finance website that pulls stock data from third-party APIs, will the above-mentioned technique be useful there?
  • Lastly, is there any workaround for websites that do not use HTML5, i.e. PHP, ASP, or classic HTML pages?


- Sajeet

Blog Post: How to Forecast Seasonal Keyword Traffic with Google Insights and Python Scripts

Hi Zack,

Thanks for the post, but I guess the only problem from a user's point of view is that the target audience for this post is pretty small. For someone like me who is horrible with coding, it would be great if there were a step-by-step guide on how to install Python, where to write (or in this case copy-paste) the program, etc.

No offense intended, but I thought I should let you know. After reading this post, I was like, "This seems awesome, but how the hell do I get it in place?"

Cheers,

Sajeet

Blog Post: What Community Builders Can Learn From Research

Hi Thomas,

Excellent post; however, I do not agree with one aspect of it, i.e. that "90% are users/lurkers who do not directly add anything to the community". It's a very subtle way of saying that they are of no use, or "useless".

I understand that it's not your point of view, and correct me if I am wrong, but what if those 90% of so-called "lurkers" stop visiting the site? What would happen if only "synthesizers" and "creators" visited the site?

I think instead of calling them lurkers, the right word would be "audience", and it makes sense, doesn't it? These are the people for whom the community is built: people who are looking for information, people who are trying to learn, and who obviously will not contribute in the initial stages.

Give them time, and if they are groomed properly they will grow into "synthesizers" and "creators".

- Sajeet

Blog Post: Google Analytics Certification and How to Pass the GAIQ Test

Great Post,

A few pointers from my end - 

  • It's an open-book test and you have the option to pause it. Of course I am not encouraging people to pause the test all the time and look up the answers, but at the same time one needs to be a little practical-minded.
  • There are cases where questions related to PPC are asked: not advanced PPC questions, but questions like why there is a difference between the number of clicks and visits, the different ways of tagging, etc. Basically very simple questions.
  • There are a few tricky questions which involve selecting multiple options. The key here is to identify all the correct options.
  • Regarding the new interface, very few questions are asked, but the most common one pertains to using events as goals and how that is set up.
  • Please do check site search in detail. There are questions related to site search which are slightly tricky.


Some key concepts that are not covered in Conversion University -

  • Google Analytics by default provides only 5 custom variables; 50 custom variables are provided in Google Analytics Premium.
  • User-defined variables and custom variables are not the same thing.
  • The Google Analytics language report does not give the language of the user; it gives the default language of the OS of the PC.
  • The maximum number of profiles that can be created for a specific account is 50, but if you contact Google that number can be increased to 200.
  • For regular expressions related to IP addresses, please do not try to be a superman; use Google's tool (an example of the kind of pattern it produces follows this list).
  • Always remember that filters are executed in the order in which they are applied.
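As an example of the kind of pattern that tool produces, a regex covering, say, the range 192.168.1.1 to 192.168.1.25 could look like this (the range and pattern are purely illustrative):

    # Illustrative GA filter regex for the IP range 192.168.1.1 - 192.168.1.25.
    import re

    ip_filter = re.compile(r"^192\.168\.1\.([1-9]|1[0-9]|2[0-5])$")

    for ip in ["192.168.1.7", "192.168.1.25", "192.168.1.26"]:
        print(ip, "->", bool(ip_filter.match(ip)))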


Lastly, to all the people taking the test, ALL THE BEST :)

- Sajeet

Blog Post: The 2 User Metrics That Matter for SEO

Dr Pete,

An interesting post, but I believe that looking at these two metrics purely from a user point of view is not the key here.

I agree with you that Matt Cutts has never been so clear in his statements, but the reality is that if we start blindly believing whatever Google representatives say, it would be like stabbing ourselves with a sharp-edged knife.

Honestly, if I had to predict one of the ways Google could rank websites based on CTR/user behavior, it would involve at least three basic steps -

Step 1

Google was granted a patent that talks about classifying websites into categories. To give everyone the gist: search engines will be able to classify and categorize websites, which will eventually help them determine relevance, which in turn helps determine (to some extent) rankings. Of course, other factors like backlink profile, social presence, etc. will be taken into consideration. So basically, the way I see it, websites will be classified into specific categories.

Step 2

Bounce rate data is fine; everybody talks about that. What's surprising is that no one talks about the exit rate of a particular page. From an analytics perspective I would definitely look at the exit rate of a page, as that is the page from which a user leaves the website after navigating through it.

Who knows, maybe even in-page analytics data can be used. So if I had to predict, more than bounce rate it is exit rate and in-page analytics data that should ideally be used. One could argue, why not site load time? The answer is no, for the simple reason that it's sampled, and using data collected from a sample set would be very unwise.
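To keep the two metrics apart, a tiny worked example with invented numbers: bounce rate is single-page visits divided by entrances to that page, while exit rate is exits divided by pageviews of that page.

    # Worked example distinguishing bounce rate from exit rate.
    # All numbers are invented for illustration.

    pages = {
        # page: (pageviews, entrances, bounces, exits)
        "/pricing":  (1200, 400, 120, 300),
        "/checkout": (800,  100,  20, 500),
    }

    for page, (pageviews, entrances, bounces, exits) in pages.items():
        bounce_rate = bounces / float(entrances)
        exit_rate = exits / float(pageviews)
        print("%-10s bounce rate: %3.0f%%   exit rate: %3.0f%%"
              % (page, bounce_rate * 100, exit_rate * 100))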

Step 3

Now we have websites categorized and we have the required GA data; combine that with Google's patent on demoting search results. Again, to give you all the gist: this patent states that the search engine might demote websites in similar search results if a user did not click on any of the results provided for a previous search made using a similar query.

They said that they don't crawl Flash or JavaScript, but according to Mike they definitely do.

Similarly, they say that they don't use GA data, but when it's a matter of search supremacy I highly doubt that they won't use it.

Again, I am not saying that the two user metrics above are not important, but I think it's high time we stop limiting ourselves to a few basic metrics.

- Sajeet

Blog Post: Monitor Which Social Networks Your Visitors are Logged Into With Google Analytics

Great Post Tom,

I guess the only problem that I foresee in the near future is the limitation of Google Analytics whereby we can have only 5 custom variables at a time, as a result of which we might not be able to track other social media sites like Pinterest, Orkut, or whatever else comes up in the near future.

Apart from Google Analytics Premium (which offers 50 custom variables, for a highly moderate fee of $150,000 per year), is there any other method that you have in mind?

- Sajeet

Blog Post: Location: A Ranking Factor in Organic SERPs

Hi Michael,

Thanks for the post, but as far as Google Places is concerned I would love to see a post that covers the following points -

  • What triggers an integrated listing i.e. organic + local?
  • What is the influence of citations alone as compared to other link building activities in triggering a places result?
  • Does natural link building, i.e. text links, content links, help the performance of listings?
  • What is the role of on-page elements, like instances of location-specific keywords, geo tags, specifying the geo location in WMT, language tags, etc., in the performance of Places listings?
  • How to integrate two listings into one?
  • What are the main reasons for a listing getting rejected, and how do you get the listing back? (A practical example would be awesome.)
  • What are the common pitfalls of bulk verification? (We all know that bulk verification is not easy and webmasters have reported a lot of bugs/complaints.)
  • How to handle Google places when you only have one physical location but you serve the entire country?
  • Does Google treat listings with individual location specific URLs differently as compared to multiple listings that have the same URL i.e. in most cases Home Page URL?
  • What is the best way to measure visits specifically coming from Google places in Google Analytics? (We all know that the numbers in Google Places Dashboard are not accurate and the problem with manual tagging is that we do not get all the keyword data)
  • Does Google Analytics report visits from maps.google.com and places listing in google.com differently?
  • How do you handle the "others" bug, i.e. search terms that are reported under "others" in the Google Places dashboard? (People who frequently use Google Places will know that the number reported under "others" is normally pretty high compared to visits from other keywords.)
  • Under what conditions does a Google places listing get flagged and how to avoid such cases?


The point I am trying to make here is that it's good to have basic posts, but the expectation is that there will be a follow-up post with some advanced techniques and solutions to problems that SEOs face. Of course, if not a single post, it would be great if someone could suggest links that answer the questions I have listed.

Thanks,

- Sajeet

Blog Post: The Inside Scoop to Finding Link Building Opportunities with Free Alerts

Hi Neil,

Interesting post, and yes, I totally agree with you that it's important to automate work in order to save time.

But I believe we also need to think beyond Google. We have Yahoo Alerts, which I believe also does a decent job. Unfortunately, and quite surprisingly, Bing does not offer an alert service, and the discomfort was very evident in the Bing community. Please correct me if I am wrong, but I believe Bing has not started an alert service of its own.

For Twitter there is a tool called Followerwonk. It's a paid tool, but there are a lot of options for analyzing, tracking, and sorting, and the free section also has a few options that can be helpful in identifying good link building opportunities.

Of course, then there is Twitterfeed, where we can track data for Facebook, Twitter, LinkedIn, StatusNxt, and HelloTxt. We also have CoTweet, which is simply a brilliant tool. They launched it as a free service, and given what an amazing tool it is, I was hardly surprised when they converted it to a paid service. Even as a paid service it's still amazing: you have the option to schedule your tweets, get email alerts, etc.; the list is pretty long.

Please Note - The last day for Cotweet Standard was 15 February

Lastly, I would recommend TweetDeck, which also does a fantastic job of managing your accounts. It is currently free, but I won't be surprised if they convert it into a paid service.

Hopefully the list I have provided above will also be helpful.

It's also really great to see you writing on SEOMOZ so frequently.

- Sajeet

Blog Post: Broken Link Building Guide: From Noob to Novice

Hi Anthony,

First of all, congratulations on your first post; you have pointed out some amazing techniques. As far as operators are concerned, you can also use the expression intext:brandname -site:brandname.com. You will be surprised at the number of pages/websites that contain instances of the brand URL. I certainly miss the "+" operator; it would have been very useful for identifying loads of websites that have instances of the URLs under consideration.

Now for a big problem with Google WMT -

One of the biggest flaws with Google WMT is that even if a URL appears only as plain text, it picks it up and shows it in the crawl errors section. For example, if someone writes www.abc.com/page1 as plain text (not a link) in the comments section of a blog, then even if that page does not exist Google WMT will report it as a crawl error. The only option in this case is to manually write a 301 redirect rule in the htaccess file.

There is also a "virtual" problem that we all face. Let's say we implement virtual pageviews for tracking purposes in Google Analytics, say a URL like abc.com/virtual; for some weird reason Google WMT picks it up and reports it in the crawl errors section.

- Sajeet

Blog Post: Be Careful Using AdWords for Keyword Research

Hi Rand,

I believe there is not a single tool out there that can provide completely accurate information. Besides, I wanted to ask you: did you select "only show ideas closely related to my terms"?

Let's take a simple example here -

Search for a keyword like bail bonds without selecting "only show ideas closely related to my terms", then look for bail bond agent, and you will see it right there. However, if you select "only show ideas closely related to my terms" and then look for the term "bail bond agent", you will not find the keyword.

So one precaution to take is that you DO NOT select "only show ideas closely related to my terms" when you are looking for a broader range of keywords.

There is also a considerable difference in the number of keyword ideas the tool gives when you are logged in with your AdWords account.

For example, for the keyword "bail bonds" I got 100 suggestions when logged out, compared to 800 when I was logged in with my AdWords account.

There are some amazing tools within Google AdWords, like the Placement Tool, the Contextual Targeting Tool, etc., which are some of the most underrated tools as far as keyword analysis is concerned.

Besides, there are others like SEMrush (analyze your competitors' data), Ubersuggest, and Soovle which can be used for keyword analysis.

Of course, then there is Google Webmaster Tools impression data and Google Analytics keyword data.

There are cases where I have noticed a particular keyword showing 0 exact-match searches in the keyword tool while in Google Analytics I am getting over 1,000 visits via that exact same keyword.

Yes, there are a few flaws, but then which tool doesn't have flaws? Please remember, people, that it's a FREE tool. It's not the most perfect tool in the world, but I believe that if we take certain precautions it can be used in an optimal way. It will remain the number 1 keyword analysis tool for me, so I will definitely give it a thumbs up.

Sajeet

Blog Post: 6 Reasons You Need to Charge More

Thanks Dr Pete for the post,

As the great Steve Jobs once said -

Be a Yardstick of quality. Some people aren't used to an environment where excellence is expected.

The very concept of cheap services results in a cheap clientele who will eventually not help the business in any way.

Being expensive ensures that people expect good quality, or should I say high-quality, services.

Lastly, it's important to remember that client referral programs are one of the best ways to get great potential clients.

- Sajeet

Blog Post: Building a Technical SEO Process

Hi Stephanie,

Interesting points; there are certain things that I would love to see automated -

There are many instances where a test site is hosted on a staging server as a replica of the live website. Of course, the robots.txt file of the test site blocks all search engine spiders from crawling those pages. However, there are instances where the site is transferred from the staging server to the live server along with that blocking robots.txt file, after which the site stops getting crawled by search engine spiders, which eventually results in a fall in rankings and traffic.

Wish #1 - A tool/software that would alert the concerned people when the robots.txt file is blocking search engine spiders.

For Google Analytics, many websites have an include.php file into which the GA code can simply be copy-pasted so that it appears on all pages. Problems occur when there are multiple template files. If a website has 200,000-odd pages built from 20 templates, even if the GA code is removed from one of the templates you would not see a major drop in traffic; of course there will be a fall, but chances are it will go unnoticed.

Wish #2 - A tool/software that would alert webmasters when the GA code is removed on sites driven by multiple templates.

There are cases where canonical tags cannot be implemented simply because the CMS does not allow it. In such cases one would go for canonical tag implementation in the HTTP header response. The biggest problem here is when a website has thousands of PDF files, and a corresponding number of HTML/PHP/ASP pages, live with canonical tags present in the HTTP header response.

Wish #3 - A tool/software that would crawl all such pages and alert webmasters in case the canonical tags are removed from the HTTP header response.

It would be great if these pointers could somehow be incorporated while building a streamlined process for the technical audit of a website.
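For what it's worth, a minimal sketch of what those three checks could look like as a script; every URL and the tracking ID are placeholders, and a real monitor would need scheduling and alerting wrapped around it.

    # Sketch of the three "wishes" above as simple checks.
    # URLs and the tracking ID are placeholders; alerting/scheduling is left out.
    import urllib.request

    def fetch(url):
        return urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

    def robots_blocks_everything(site):
        """Wish #1: true if robots.txt disallows the whole site."""
        robots = fetch(site.rstrip("/") + "/robots.txt")
        return any(line.strip().lower() == "disallow: /" for line in robots.splitlines())

    def ga_code_missing(page_url, tracking_id="UA-XXXXXX-1"):
        """Wish #2: true if a template page no longer carries the GA tracking ID."""
        return tracking_id not in fetch(page_url)

    def header_canonical_missing(pdf_url):
        """Wish #3: true if the HTTP response lost its rel="canonical" Link header."""
        headers = urllib.request.urlopen(pdf_url).info()
        return 'rel="canonical"' not in (headers.get("Link") or "")

    if __name__ == "__main__":
        print("robots.txt blocks all :", robots_blocks_everything("http://www.example.com"))
        print("GA code missing       :", ga_code_missing("http://www.example.com/some-template-page"))
        print("canonical header gone :", header_canonical_missing("http://www.example.com/file.pdf"))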

I am sure community members will definitely have a few more suggestions/pointers.

Cheers,

Sajeet

Blog Post: Find Your Site's Biggest Technical Flaws in 60 Minutes

Thanks Robert, Really Appreciate it :)

- Sajeet

Hi Tyler,

The Problem - 

The biggest problem with pagination was the inability of search engine spiders to identify paginated pages as a series. For any website with thousands of pages and pagination in place, the HTML suggestions section of Google WMT would show loads of duplicate titles, duplicate meta descriptions and duplicate meta keywords. The ideal recommendation would be to implement optimized/unique titles and metas for the individual pages of the series, but from a webmaster's perspective that is a huge operational burden, especially where pages get added very frequently.

The concept  - 

The purpose of these tags is very simple: they provide signals to search engine spiders about the paginated series. You can also read the blog post for detailed information on implementation.

When to implement the tags - 

These tags would work very well for, say, a blog, where the signals help Google identify the first page of the paginated series and ideally give importance to that page. However, on an e-commerce website with a "View All" option, one would ideally want the View All page to be the one that ranks. In such cases a canonical tag should be placed on the individual pages of the series pointing to the "View All" page.

Effect of these tags -

We implemented the tags on the pages and noticed that there was a sharp decline in the number of pages showing duplicate titles and duplicate Metas in the HTML suggestions section of Google WMT. This was followed by a sharp improvement in the rankings of the campaign keywords. In terms of statistics, we had around 600 paginated pages that showed duplicate elements in WMT, but after implementation that number was reduced to 176.

One might ask why the number of errors did not reduce to zero; well, there are any number of explanations, ranging from some pages not getting crawled to a delay in Google WMT data getting updated.

Lastly, canonical tags or prev/next tags? It totally depends on your requirement and on feasibility from an implementation point of view. If you want the "View All" page to rank, implement canonical tags; if you want the first page of the series to rank, implement prev/next tags.
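
As a quick sanity check before and after implementation, you can look at which signal a paginated page is actually sending. A rough Python sketch, assuming the requests library, a placeholder URL and a deliberately naive regex:

import re
import requests

url = "https://www.example.com/category/widgets?page=2"  # placeholder: any paginated page

html = requests.get(url, timeout=10).text
has_prev_next = bool(re.search(r'<link[^>]+rel=["\'](prev|next)["\']', html, re.I))
canonical = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)

print("rel=prev/next present:", has_prev_next)
print("canonical points to:", canonical.group(1) if canonical else "none")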

I hope that I was able to answer your question.

Let me know in case you need any more help.

@Gianluca - I don't remember the last time you actually spelled my name correctly. By the way, I think with my two comments here I have written my own mini post :P

Cheers,

Sajeet

Hi Dave,

Thanks for the feedback. I totally agree with you that all the points cannot be covered in 60 minutes, but apart from the points you mentioned, I think the Google Analytics audit and the WMT audit can be completed within the 60-minute time frame.

Please correct me if I am wrong.

- Sajeet 

Thanks for the feedback, Gianluca. I was well aware of the 60-minute clause; most of the pointers I mentioned above are meant to identify certain problems that a website may or may not have. In fact, the pointers mentioned under the Google Analytics audit and the WMT audit should not take a lot of time. Even in the technical audit part, site search, alternate tags and checking home page redirects should not take long. For the others, detailed recommendations can be sent later.

One more point I forgot to mention in the comments was microformats. We can just have a look at the source code and see whether microformats, microdata or RDFa have been implemented, i.e. basic Schema.org markup.
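
The source-code check itself is easy to script. A rough sketch, assuming the requests library and a placeholder URL, that looks for the obvious microdata/RDFa/JSON-LD fingerprints of Schema.org markup:

import requests

url = "https://www.example.com/product/sample-item"  # placeholder URL

html = requests.get(url, timeout=10).text
signals = {
    "microdata": 'itemtype="http://schema.org' in html or 'itemtype="https://schema.org' in html,
    "RDFa": 'vocab="http://schema.org' in html or 'vocab="https://schema.org' in html,
    "JSON-LD": "application/ld+json" in html,
}
for markup, found in signals.items():  # rough string checks, not a full parser
    print(f"{markup}: {'found' if found else 'not found'}")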

Regarding YOUMOZ, thanks for the suggestion. I would definitely consider it, and hopefully I will be able to come up with a good post :-)

- Sajeet

Hi Dave,

Interesting post. However, there are a few more pointers that I would definitely consider as part of my audit, as follows -

Technical Audit - 

  • Does the website have a site search feature?
  • Is the search term appended to the end of the URL for tracking purposes?
  • Are the images on the website optimized (alt tags and filenames)?
  • Are all JavaScript files being called externally?
  • What is the text-to-code ratio?
  • If there is Flash content, are we using SWFObject?
  • Site load time (take the home page as the sample page)
  • Do we have www and non-www versions of the website? If yes, retain only one
  • If the website is hosted on an IIS server, do we have a URL rewriting module (such as ISAPI_Rewrite) in place? For IIS 7 and above Microsoft's URL Rewrite module is readily available; for lower versions a third-party module needs to be installed
  • If it's a region-specific website, do we have alternate and language tags in place?
  • How many standalone pages does the website have?
  • Are there two versions of the home page, i.e. the default "/" and "index.php/html/asp"? If yes, retain only one by implementing canonical tags.

 

Please note - Many people try to implement a 301 redirect, i.e. www.abc.com/index.php >> 301 Redirect >> www.abc.com. In most cases a naive rule like this will not work: the server typically maps "/" back to index.php internally, so the redirect chases itself and results in a redirect loop.
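
If in doubt, this is easy to verify before going live. A rough Python sketch, assuming the requests library and a placeholder URL, that follows the /index.php redirect hop by hop and flags a loop:

from urllib.parse import urljoin
import requests

url = "https://www.example.com/index.php"  # placeholder URL
seen = set()
for _ in range(10):  # hop limit
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code not in (301, 302, 307, 308):
        print("Final stop:", url, "with status", response.status_code)
        break
    location = urljoin(url, response.headers.get("Location", ""))
    if location in seen or location == url:
        print("ALERT: redirect loop detected at", location)
        break
    seen.add(url)
    url = location
else:
    print("ALERT: more than 10 hops, most likely a loop")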

Regarding pagination, why not implement rel=prev and rel=next on the paginated pages? I have tested this at my end and they seem to work really well.

Canonical tags should ideally be implemented when you have a dedicated "view all" page and you want the search engine spiders to give importance to that page.

Keep in mind that implementing canonical tags will result in a substantial reduction in the number of pages indexed.

Google Analytics Audit -

  • Is the Google Analytics tracking code present on the pages of the website? If yes, is it the traditional code or the asynchronous code, and if it is the asynchronous code, is it placed in the head section?
  • If there are multiple subdomains or cross-domains involved, do we have appropriate cross-domain or subdomain tracking in place?
  • If it's an e-commerce website, do we have e-commerce tracking enabled?
  • Is there scope for adding event tracking code or virtual pageviews for downloads?
  • Do we have appropriate goals in place?
  • Check hostnames - the best place to identify where the Google Analytics tracking code has not been implemented properly

 

Webmaster Tools Audit - 

  • Does the website have the Google WMT and Bing WMT verification code in place? (I have noticed that a lot of webmasters neglect Bing WMT)
  • Do we have an optimized XML sitemap in place? (see the sketch below)
  • Do we need sectional sitemaps or a sitemap index?
  • Do we need a video sitemap and an image sitemap?
  • Do we have too many links pointing to us, or have we been hit by site-wide links?
  • How many crawl errors are present? How many pages have duplicate titles and metas?
  • How many pages return a soft 404?
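
As a quick way to act on the sitemap and soft 404 pointers above, here is a rough Python sketch (placeholder sitemap URL, requests library): it parses the sitemap and confirms that a sample of the listed URLs actually returns a 200:

import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NAMESPACE)]
print(f"{len(urls)} URLs listed in the sitemap")

for url in urls[:25]:  # sample only, to keep the check quick
    status = requests.head(url, timeout=10, allow_redirects=False).status_code
    if status != 200:
        print(f"Worth a look: {url} returned {status}")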

 

- Sajeet

Blog Post: Are Your Titles Irresistibly Click Worthy & Viral?!

Brilliant Post,

In fact I will be using these points as part of a presentation where obviously I will give credit to you :P.

Just one point though, regarding Ingredient #7, expectations - it's slightly tricky; sometimes we need to strike a fine balance between intention and outcome. If the intention is just to get exposure, one wouldn't really bother about negative or positive responses; however, if you want a good outcome you need to be very wary of the target audience.

Cheers,

- Sajeet

Blog Post: The 10 Golden Rules to Attracting Authority Links

Hi Neil,

It's great to see you blogging so frequently, and this is another great post that you have written.

Regarding Rule #7 - I have always wanted to toy with the concept of getting backlinks from big media websites, but the bottom line is that it rarely happens. The content has to be exceptionally viral and to some extent promoted by a big brand, but nevertheless one can always dream, right? ;)

Regarding Rule #8 - I am slightly concerned about getting links from government or .edu websites. When you consider relevancy as an important factor, from an SEO perspective these links might not help. In fact, if the anchor text is a keyword it might even have a negative impact, where a competitor goes ahead and cries foul play in the form of paid links.

Using a brand name would probably be a safer option. If the government or edu website is very popular we should see a spike in referral visits.

- Sajeet

Blog Post: How to Increase the Odds of Your Content Going Viral - Whiteboard Friday

The concept of early buy-in is just so amazing. Patience is the key here, and there is absolutely no shortcut to success.

It's always important to remember that there are some well-experienced people out there whose feedback really matters. There will be times when they might not reply; it's not because they are arrogant, it's just that they sometimes don't get time to reply to everyone.

Keep striving, and as I always say, hard work and great content always pay off (sometimes literally).

Thanks for the great post Rand.

- Sajeet

Blog Post: Google Analytics Cross Domain Tracking Made Easy

Hey Martijn,

Thanks for the insightful post. I was just wondering, shouldn't the line

$("form[action*='example-B.co.uk']").attr("onSubmit","_gaq.push(['_linkByPost', this])");

be

$("form[action*='example-B.co.uk']").attr("onSubmit","_gaq.push(['_linkByPost', this.href])");

or is it that, because it's a jQuery function, the href is not required?

Also, in the case of third-party shopping carts, is there any jQuery function that will automatically append utm_nooverride=1 to the Thank You page URL for medium/source attribution purposes?

Let me know.

 - Sajeet

Blog Post: A Peek Under the Hood: How we Manage the SEOmoz Community

Challenge Accepted :)

- Sajeet

This is awesome stuff; it's always amazing to interact with community members. Yes, there are times when there is a difference of opinion (right, Gianluca? :P), but I guess that happens everywhere. More importantly, we get to learn from seniors and from people across the globe.

Also, I am saying this from personal experience: Keri and Jennita are truly amazing people. Despite their super hectic schedules they have always been super supportive, right from the day I became a member.

Lastly, knowledge is shared and people can showcase their talent, and quite frankly I don't see a bigger platform than this one.

- Sajeet

Blog Post: How To Handle Downtime During Site Maintenance

Amazing information, Frederick. I was just wondering: would it be possible to incorporate the functionality of a custom 404 page using this technique?

I mean it will be a 503 error but will look like a custom 404 page. The main advantage here is that Google will not report these pages in the crawl errors section of Google WMT.

Also, I am not sure, but do these 503 pages get reported in Google WMT?
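
For context, this is roughly what I am picturing - a standard-library Python sketch (the port and the HTML are placeholders) where every request gets a friendly maintenance page, but the status stays 503 with a Retry-After header so crawlers treat it as temporary:

from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE_HTML = (b"<html><body><h1>We'll be right back</h1>"
                    b"<p>Scheduled maintenance in progress.</p></body></html>")

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                  # temporarily unavailable, not a 404
        self.send_header("Retry-After", "3600")  # ask crawlers to come back in an hour
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(MAINTENANCE_HTML)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()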

- Sajeet

Blog Post: Super Pac Man Bot vs. Sonic the Hedgehog Bot

Hey Jackson,

I understand your point of view. Whenever there is a new post by Slingshot SEO I always expect great content, but this piece was very disappointing.

But then again let’s not blame the SEOMOZ team. We are all human beings and mistakes do happen.

@Slingshot SEO - I sincerely hope that you guys don’t take this criticism in a negative way but learn from it. You guys have written some amazing content in the past and we expect great content from you.

- Sajeet

Blog Post: Using Social Media Monitoring as an Inbound Marketing Channel - Whiteboard Friday

Hey Himanshu,

When handling social media campaigns, replying to customer queries sometimes becomes part of the deal. Also, it would be a little harsh to say that I was generalizing; rather, I was just giving my point of view. I totally agree with you that building a relationship is the key and one shouldn't just make a cold call or send a cold email. However, once you have that rapport, I guess a good call wouldn't hurt.

- Sajeet

Thank God you wrote the last line :D

It seems that you are feeling bad for your good friend; honestly, even I agree that his comment did not deserve 9 thumbs down. Maybe it's his tone of writing, which comes across as pretty rude, but then again that's just my perspective.

- Sajeet

I never said anything about a linkback request without any communication; all I stated was that a GOOD CALL IS BETTER THAN A GOOD EMAIL where time is of the essence.

All bolded just like you love it.

- Sajeet

Hey Himanshu,

I totally agree with you regarding cold calls, and yes, there are agencies that are willing to work according to US/UK time, but then again consider this scenario -

A customer raises a query on Twitter for brand X at 12:30 PM PST (2:00 AM IST), and the marketing agency for brand X is based in India. By the time the agency sees the query, it is already end of day in the USA. Let's also assume the question is very product-specific, so the agency has to depend on the client for the answer, which results in further delay.

The only way this can work is if the agency is given full authority to answer the questions without the client's consent but then again that is something that very few brands allow.

Regarding calls, I believe a very good call always has an edge over a very good mail especially in cases where time is of the essence. The probability that a user will reply to a call is always higher than the probability of a user replying to a mail.

- Sajeet

A brilliant Whiteboard Friday, and as usual you have covered some awesome white-hat techniques that you have been preaching for what feels like a zillion years.

It's amazing how there exist tons of opportunities which often go unnoticed. All the techniques mentioned above are pretty good, but if you don't mind, as far as the second technique is concerned, i.e. direct contact, I foresee it being most beneficial for an in-house marketing team. If an external agency is involved, or worst case the internet marketing efforts are outsourced, too much to and fro will happen because of the time gap. Being a strictly time-bound activity, for external agencies where work is outsourced this seems like a slightly difficult task, or maybe even (if I may say so, and I definitely mean no DISRESPECT) impractical.

Reaching out privately is also a tricky technique. It is important to understand where to draw the line between stalking and sending a direct message. There is a chance that people will start ignoring the emails, especially at a big brand that receives a lot of them. I guess good old-fashioned "CALLS" might have an advantage over emails.

Some brilliant examples of guest blogging and mentions which will definitely go a long way in establishing a brand. But then again it will be a RACE AGAINST TIME :)

Implementing your suggestions or ideas will truly test an internet marketer or agency.

Rand, you surely know how to push people over the edge :P

Last but not least, wishing everyone at SEOMOZ a great 2012.

- Sajeet

Blog Post: Counting to 10 the Google Way

HA HA HA HA, that was a good one, but I guess in such cases we should just not count those results. Firstly, as you rightly pointed out, it's a discontinued service, and secondly, users will surely click on other results after finding out that the first one is of no use :)

- Sajeet

Dr Pete,

Products - Google Product Feed

Images - Google Image search

Listings - Google Places 

News - Google News

Hell if you search for "Power Meter" guess which URL is ranking at #1 - www.google.com/powermeter

Initially Google Places was free, but then came Boost; keyword data was free, but then came {not provided}.

The way things are moving, it seems that slowly but steadily everything will become paid.

From a user's perspective nothing really changes, because at the end of the day they are still getting the information.

- Sajeet 

Blog Post: Why Link Schemes Fail

Interesting post Dan,

I would like to hear your opinion on this: what if a competitor builds spammy links to my website and then reports it to Google, following which I am penalized?

- Sajeet

Blog Post: 8 Things You Can Give Away to Earn Links + Mentions - Whiteboard Friday

Thanks for the response, Gianluca; looking forward to more such "FUN" conversations/debates with you ;)

Hi Gianluca,

Thank you for your honest feedback; I really appreciate it. I am aware that you are a reputed name in the field of SEO, and I am glad that you took the time to provide a response. I may not be as BIG an expert as you are, but I would just like to clarify that I am NOT here to argue or offend anyone, only to learn. All those posts you have mentioned above, I have gone through each and every one of them, and I also practice those techniques regularly.

I was NOT complaining; I was just highlighting a practical problem that many SEOs around the world are facing today and putting in a request to a leading industry expert to share some of his expertise.

Also, please note that I am an avid follower of SEOMOZ, and I even contribute regularly to YOUMOZ.

- Sajeet

Hi Rand,

Thank you for your insights; all the techniques above point towards building a natural-looking link profile, which is definitely the way to go. But there are so many websites whose link profiles I have analyzed using tools like Open Site Explorer, Majestic SEO and others, where more than 70% of the links are paid links with keywords as the anchor text. These are the same websites that rank very highly for very competitive keywords.

Building relationships and using the above link building techniques (and many more) is definitely the way to go, but these are also time-consuming activities. What's worse is that Google claims to filter out paid links, yet more than 80% of the websites that are ranking on Google.com have a big paid link profile.

An ideal search engine would be one where websites are identified on the basis of their brand popularity and usability. Although Google has been able to achieve that for big brands like Microsoft, Apple etc., that is not the case with medium-scale businesses. Of course, there are patents filed by Google that claim to understand user behavior and change the results accordingly, but I have not yet seen the implementation.

No offense intended, Rand, but your recommendations always point to an ideal scenario. What I and a lot of SEOs around the world would love to see is a Whiteboard Friday edition where you address the topic "How to beat a competitor who is ranking because of PAID links". Also, you will agree that clients always look for quick results, so it would be great if you could incorporate a "time" factor as well.

I am pretty sure that you will probably IGNORE this comment, but what the heck, it's worth a shot.

Last but not least, HAPPY Holidays :)

- Sajeet

Blog Post: Local SEO and Moving Business: 5 Steps, 4 Lessons

Hi Martin,

That's a pretty decent analysis, and yes, I would be the last person to deny that Google Places is one of the most volatile platforms, where absolutely nothing is stable.

Citations are one of the most crucial factors that help Google identify the most trustworthy listing. In your case, one way to promote the right listing would be to start by improving your citations.

You can start with Whitespark; although it is a paid tool, the free version also provides sufficient data.

You can also refer to Rand Fishkin's post: http://www.seomoz.org/blog/how-to-research-local-citations-after-google-removed-them-from-places

Get in touch with local bloggers who blog about web design and SEO and ask them to list your website with the latest address. Create a PRO yellow pages profile.

The more citations you build the better it will be for your website.

I hope that I was of some help to you.

- Sajeet

Blog Post: How I Got The Attention of One of the Top SEO Bloggers With Diet Coke

Charles,

This is by far the most amazing case study I have ever read. As I also stated on Neil's recent post on SEOMOZ, goodwill is of key importance in building successful professional relationships.

Yet it is important that we do not make it obvious that we need something in return for the favor. Keep up the good work.

I also have a gut feeling that the supply of Diet Coke will surely increase in the SEO community after reading this post :)

Cheers,

Sajeet

Blog Post: The Five Marketing Lessons That Took Me a Long Time to Learn

Excellent Points Neil,

One of the most underrated methods of building a successful business is to have/maintain a good relationship with your old clients.

For startups the initial clientele is usually pretty small, but as the brand grows, bigger clients come in, which eventually leads to the old clients being forgotten.

It is important to realize that it is the good work done for the old clients and probably their referrals that led to bigger clients.

Many successful entrepreneurs neglect their old clients simply because they are now too "BUSY" for them. Like it or not, goodwill is equally important for the success of any business.

- Sajeet

Blog Post: How to Launch, the Spotify Way

@Tof, Jennita and Moosahemani,

I was just stating a fact; I am not against these strategies, in fact I support them. All I wanted was to clear up the myth about how they got these tweets.

Let's not mislead people by presenting it as if it were free.

- Sajeet 

Big names tweeted about the brand? Seriously? One need not be an Einstein to realize that obviously there was a marketing budget associated with this strategy. But the trick is to make it look natural to the target audience.

Like it or not, Andrew, the harsh reality is that the big names who tweeted about the brand obviously got their own fair share, if you know what I mean. The same strategy would also have been extended to TechCrunch.

Bottom line: if you invest in the right media and the right people, whether your brand is good or bad, you will always get that initial push. After that it is totally up to the brand to live up to the hype and deliver quality service.

- Sajeet

Blog Post: How to Fix Crawl Errors in Google Webmaster Tools

Hi Joe,

Thanks for the very well-illustrated post.

If you don't mind, a few points from my end as well -

For some godforsaken reason, nonexistent URLs also show up in the crawl errors section, for example - www.seomoz,org/a.......

Such URLs clearly do not exist but somehow always show up in the crawl errors report. Maybe it's because some webmaster has linked to the irrelevant page, or because the URL appears as plain text somewhere and Google picked it up, or for some other reason. Nevertheless, it adds up to a lot of headache. The best solution here obviously would be to implement 301 redirects, but how do you do that? That's where webmasters have to install the ISAPI_Rewrite module for IIS servers ($100) or the mod_rewrite module on Apache servers; fortunately, for IIS 7 and above Microsoft's URL Rewrite module is freely available :). By making changes to the web.config file and writing rules, any such URL can be 301 redirected to the correct URL to pass on the SEO value.
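
Once the rules are in place they are easy to sanity-check. A rough Python sketch, assuming the requests library and a hand-made mapping of broken URLs (as reported in the crawl errors section) to their intended targets:

import requests

REDIRECT_MAP = {  # placeholder URLs: broken URL -> intended target
    "https://www.example.com/a-broken-url": "https://www.example.com/correct-page",
    "https://www.example.com/old-category/": "https://www.example.com/new-category/",
}

for source, expected_target in REDIRECT_MAP.items():
    response = requests.get(source, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    if response.status_code != 301 or location != expected_target:
        print(f"Check {source}: status {response.status_code}, Location {location or 'none'}")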

Now for a virtual problem: not sure how, but somehow Google also picks up the virtual pageviews created for tracking purposes in Google Analytics and adds them to the list of pages showing a 404 error. How do you solve this problem? We have no option but to block them using the robots.txt file.

Cheers,

Sajeet

Blog Post: Protect Your Website From Becoming a Doorway Page - Google Patent Style

I guess when you talk about a redesign, as long as the changes are strictly cosmetic it is fine. By cosmetic changes I mean CSS style sheets, images, etc. However, if the content changes regularly, including the URLs, and 301s are implemented again and again, it will confuse the search engine spiders, which will eventually affect the performance of the website across the major search engines.

Bill, this is simply awesome. I think it's not only me; there are a lot of people out there who depend on you for analysis of complicated patents.

I totally agree with you when you say - "Google might identify a page as a doorway page if it not only changes significantly, but also stops focusing upon a topic that might have been its main focus, and/or adds a number of topics."

This is something that everyone should look out for.

Once again, thank you so much for taking out the time to provide me and the community with such awesome insights.

This is truly awesome :)

- Sajeet

If we have tagged the pages using the URL builder tool, that shouldn't be a problem. One probable case where a page might be considered a doorway page would be when that landing page is linked from somewhere and search engine spiders are able to access it via those links. Otherwise you should be just fine.

It depends on your definition of a microsite; in your case, having a separate site would probably have helped. Again, it is impossible to determine exactly how the algorithm works, but analyzing Google's patents can provide some insight into it.

Blog Post: A Tale Of Two Studies: Google vs. Bing Click-Through Rate

Good infographic, but I would like to disagree on one point -

Header tags do play an important role as far as rankings in Google.com are concerned.

In fact, we have case studies suggesting that for Bing, keywords in a bold tag perform better than keywords in a header tag.

Again, one should note that this is not the case for HTML5. In HTML5, <b> denotes content you want to draw attention to without conveying any extra importance, which is again a whole different concept.

- Sajeet 

Blog Post: A Letter to Google from Inbound Marketers

In all fairness to Google, &limit=N was more of a hack; however, there is an alternative solution - http://www.convonix.com/blog/web-analytics/how-to-export-more-than-500-results-in-new-version-of-google-analytics/

Having said that, as far as {not provided} is concerned, the percentage just keeps on increasing. It's only a matter of time before everyone is using personalized search.

I guess even Google knows that and just wants to monetize everything, which is really sad. A typical "BIG" corporate attitude that will eventually lead to its own downfall.

 

- Sajeet

Blog Post: Duplicate Content in a Post-Panda World

Dr Pete,

Great post, but there are certain things that I would like to point out -

www.abc.com/index.html >> 301 Redirect >> www.abc.com is usually not workable, as a naive redirect rule leads to a redirect loop and the home page goes down; the canonical tag is the safer way.

For international sites with duplicate content there are a lot of options, like -

  • Using geo meta tags
  • Using the rel="alternate" hreflang tag (see the sketch below)
  • Specifying the geographic target in WMT
  • Using language tags
  • Having country-specific terms in SEO elements like titles and metas
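
Since maintaining those alternate tags by hand is painful at scale, they can be generated. A rough Python sketch where the country codes and URL patterns are placeholders for illustration only:

COUNTRY_SITES = {  # placeholder hreflang codes and domains
    "en-us": "https://www.example.com",
    "en-gb": "https://www.example.co.uk",
    "en-in": "https://www.example.in",
}

def hreflang_tags(product_path):
    """Return the <link> block to place in the head of every country version."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{domain}{product_path}" />'
        for lang, domain in COUNTRY_SITES.items()
    )

print(hreflang_tags("/products/widget-2000"))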

- Sajeet

Blog Post: Solving New Content Indexation Issues for Large Websites

Hi Abdul,

Great post! However, I was just wondering: since the site: operator does not provide accurate data, would it make more sense to use the data provided by Google WMT, i.e. the number of pages indexed from our XML sitemap?

Also, for websites with a large number of pages, implementing sectional sitemaps is probably the best way to understand indexation issues at the very root level.
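
To illustrate the sectional approach, a rough Python sketch that stitches per-section sitemaps together with a sitemap index; the section names and file names are placeholders:

from datetime import date

BASE = "https://www.example.com"                          # placeholder domain
SECTIONS = ["products", "categories", "blog", "support"]  # placeholder sections

entries = "\n".join(
    f"  <sitemap>\n"
    f"    <loc>{BASE}/sitemap-{section}.xml</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    f"  </sitemap>"
    for section in SECTIONS
)
sitemap_index = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</sitemapindex>"
)
print(sitemap_index)  # one index file plus one sitemap per section, submitted to WMT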

Let me know your thoughts

- Sajeet