Content versus Links

We read a lot on Leo Di Milo’s blog about the importance of incorporating social factors even in banal niches like toilet seats. How do people attract attention to banal niches in real life? Hot chicks (hot chicks holding toilet seats could go viral), celebrities, and humor. The first two methods of attracting attention cost money, unless you obtain the photos illegally, photoshop them, or happen to be a hot chick yourself. The last one has the most potential, but humor blogs have enough trouble going viral on their own, never mind a toilet seat humor site.

But still, in 2011 I’d say the way to incorporate Leo Di Milo’s social strategies in banal niches is to incorporate humor. A joke page on its own may be enough to garner a few natural links and tweets. However, toilet seat cartoons and videos would be the best way to add a social aspect (and social links) to your niche, and there’s humor in every niche. It does cost money to produce quality cartoons and videos. You could also do a theme song.

This is easier said than done, and advertisers end up spending millions on campaigns that don’t work. Unless you are very creative, SEO has a much higher return on investment.
And SEO is still very much Google. There’s Bing on the scene, but Bing goes after much the same market as Google (the casual internet user). You don’t advertise on Gossip Girl and Parenthood to get tech-savvy traffic. The only search engine that really seems to be going after the tech-savvy market is Blekko, with its slashtags and the ability to view SEO results for your site (the casual internet user definitely doesn’t care about a site’s SEO results).

Since Google and Bing are both targeting the casual internet user (as they are the ones who click on ads), we can use what that target market wants to figure out what direction Google will take in the future.
You’ll note that Google has a Web Spam Team but not a quality-content-finding team, because the casual internet user isn’t as discriminating at finding quality content as the average internet user. Google would rather display a less risky Wikipedia entry at number one even when other entries are more relevant. Usually Wikipedia either has too much information on the page when the user is looking for something specific (even if the user explicitly requests exact info like “list of amino acids”), or the reader needs far more information on a query than Wikipedia can provide.

Most of the time, though, you can afford to look around. But there are times when you need an answer right away, like a virus or other computer issue. Most of the time Google is pretty good at responding to tech support queries, unless none of the suggested methods work. If a search engine can do better at answering and resolving these queries (which can be hidden deep in forums), then it may be able to usurp Google without massive fund-raising.

So, basically, Google doesn’t have to find the best content to satisfy the average internet user.  They just need to be above a certain content floor.
In fact, if Google did find the best content, it might turn away users. The best content on credit cards may be a 10,000-page authority site on how credit card companies screw you over. The best content on insurance may be another large authority site explaining, with actuarial analysis, why all insurance is a bad deal. If Google allows companies to buy rankings with PPC or SEO, then you will get corporate sites near the top, which are the less controversial ones.

Now, web spam. People point to Ezine Articles (an extremely spammy format), Wisegeek, and TheFind as examples of spammy sites that Google allows in its rankings. I think Google is going to let these spammy sites stay, but it only wants large spammy sites. As Nixon once said, “If the president does it, it’s not illegal.” People are willing to take more crap from those with greater authority. If a bank robs you, that’s okay, but it’s not okay for a person to rob a bank. If people get ripped off buying an e-book they cry foul, but if they get ripped off by Best Buy they think maybe they should have read the fine print more carefully.

So Google is going to be cracking down on the smaller spam sites but leaving the larger sites intact.  What does that mean in terms of SEO for 2011?

It means unique content is going to be more important. In addition to spamming forum profile links, you’re going to need to write unique content for the bio section of your forum profiles. Google is going to start requiring more unique content from all sites. The writing is on the wall, with Xomba moving from 50 to 100 words for its bookmarks and Infopirate moving to a 50-word minimum. The emphasis is no longer just on links but on the content surrounding those links. Sometimes brevity is good, but 50 words isn’t much at all, so even if a user has a query that can be answered briefly they won’t be battered with too many irrelevant words.
This will still keep Google game-able, so that people (read: corporations) can buy their way to the top, while preventing the people who write long rants about credit cards from getting there.

There’s also been talk about the anchor text of inbound links no longer being so important.

Grizz, for instance, has said that Google can determine what your site is about regardless of the anchor text. I’m not so sure. Can it determine that adipocytes are relevant to weight loss? The average person doesn’t use the word adipocytes in a weight loss article, and scientific abstracts don’t often use the term weight loss when talking about adipocytes. Google needs to look at anchor text to help determine relevance. For example, a searcher may see an article about adipocytes and link to it using the words weight loss, making the connection for Google.
Google engineers don’t make money by doing things manually; it’s a coding sin to ever have to do a task by hand. They get paid for algorithms, not for doing rankings by hand. Anchor text is an extremely valuable LSI tool for Google: it helps tell Google what your site is about and helps Google find synonyms it wouldn’t ordinarily associate. So in 2011, Google-bombing your anchor text isn’t going to be as important as using your anchor text to help Google associate your site with terms and synonyms it wouldn’t find on its own.
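To make that concrete, here’s a toy sketch (my own illustration, not anything Google has published) of how inbound anchor text could tie a page to terms that never appear in its own body text. The page terms, anchors, and counts are all made up.

```python
from collections import Counter

# Toy illustration (my own, not Google's algorithm): anchor text from inbound
# links can associate a page with terms the page itself never uses.

page_body_terms = {"adipocytes", "lipids", "cells", "metabolism"}

# Hypothetical inbound links: the anchor text other people used.
inbound_anchors = [
    "adipocytes explained",
    "how fat cells store energy",
    "weight loss and fat cells",
    "the science of weight loss",
]

# Count anchor terms that the page itself never mentions -- these are the
# associations the links "teach" the search engine.
new_associations = Counter()
for anchor in inbound_anchors:
    for term in anchor.lower().split():
        if term not in page_body_terms:
            new_associations[term] += 1

print(new_associations.most_common(3))
# e.g. [('weight', 2), ('loss', 2), ('fat', 2)] -- the page can now be tied
# to "weight loss" even though its body never says it.
```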

Content is going to play a larger role in rankings

For example, a recent SEOmoz article mentioned that a brand mention in the New York Times resulted in the search engines crawling and discovering a site. Again, Google engineers don’t like to do anything by hand. This could be something special for very large authority sites, but something may also happen inside Google when it sees a term it doesn’t recognize: Google may start crawling the web for more information about that term so it doesn’t get caught unaware. How is Google supposed to rank content if it doesn’t know all the facts about the content it’s supposed to rank?
There’s also been talk about Twitter and Facebook affecting rankings. Matt Cutts has said that Twitter and Facebook signals are in the testing stage and that they are trying to work out ways to determine the validity of Twitter votes and Facebook likes. The problem with Twitter is that there are very legitimate reasons to use bots, like using Twitterfeed to announce your latest blog posts. It’s not spam, as it still serves a very specific purpose: notifying your followers that you have a new post up. Google will probably require a percentage of unique content that’s much lower than for an ordinary website (I’ve heard 50% thrown around for regular sites). So, say, one unique tweet for every 100 bot tweets may be enough to keep you out of the spam-tweeter zone.
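As a rough sketch of what that might look like, here’s a toy spam check. The 1% floor is just the “one unique tweet per 100 bot tweets” guess above, and the 50% figure is the one I’ve heard for ordinary sites; neither is anything Google has confirmed.

```python
# Toy model of the prediction above -- the thresholds are guesses, not
# anything Google has published.

UNIQUE_TWEET_FLOOR = 0.01  # ~1 unique tweet per 100 bot tweets
                           # (versus a rumored ~50% floor for a normal site)

def in_spam_tweeter_zone(unique_tweets: int, bot_tweets: int) -> bool:
    """Flag an account whose share of unique tweets falls below the floor."""
    total = unique_tweets + bot_tweets
    if total == 0:
        return False
    return unique_tweets / total < UNIQUE_TWEET_FLOOR

# A blog auto-tweeting its feed but adding the occasional hand-written tweet:
print(in_spam_tweeter_zone(unique_tweets=2, bot_tweets=100))   # False -- fine
print(in_spam_tweeter_zone(unique_tweets=0, bot_tweets=500))   # True -- all bot
```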
Google does have a social graph that it uses, and for Twitter it could maybe build some sort of system around non-reciprocal followers, but really, how are Facebook and Twitter different from any other social site, with the exception of the no-follow tag?

So in 2011 there’s going to be a gradual change in how the no-follow tag is treated.

The point of the no-follow tag was to be able to link to spammy sites without giving them votes, but then the top sites on the web started applying it everywhere. No-follow was designed to help people deal with comment spam before Google got better at discounting spam on its end. A person using Facebook isn’t using the no-follow tag so they can link to Decor My Eyes without giving it a vote; they’re having the no-follow thrust upon them. How is spamming Twitter or Facebook for links any different than spamming Amplify or LiveJournal for links? So my prediction is that Google will start selectively following some Twitter and Facebook links as long as they rise above the spam criteria. And if that works well, they’ll apply it to other sites where the writers of the content don’t choose which links are followed and which aren’t.
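Here’s a toy model of that prediction. The fields, cutoff, and logic are all my own assumptions about how such a filter might work, not anything Google has documented.

```python
from dataclasses import dataclass

# Toy model of the prediction: count a no-followed link as a vote when the tag
# was imposed site-wide by the platform (Twitter, Facebook) rather than chosen
# by the page's owner, and the linking account clears a spam bar. Everything
# here is an assumption for illustration.

@dataclass
class Link:
    nofollow: bool
    nofollow_added_by_platform: bool  # site-wide policy vs. the owner's choice
    author_spam_score: float          # 0.0 = clean account, 1.0 = pure spam

SPAM_BAR = 0.5  # assumed cutoff

def counts_as_vote(link: Link) -> bool:
    if not link.nofollow:
        return True   # ordinary followed link
    if not link.nofollow_added_by_platform:
        return False  # blog-comment style: the owner chose not to vouch for it
    return link.author_spam_score < SPAM_BAR  # platform no-follow, clean account

# A tweet from a clean account vs. a link the page owner deliberately no-followed:
print(counts_as_vote(Link(nofollow=True, nofollow_added_by_platform=True, author_spam_score=0.1)))   # True
print(counts_as_vote(Link(nofollow=True, nofollow_added_by_platform=False, author_spam_score=0.1)))  # False
```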

2011 Predictions for SEO:
  • A dancing Leo Di Milo turtleneck video to promote his rulers site, with a follow-up case study on this blog.
  • Unique content is going to be more important, but only in the sense that the content floor is going to rise. It’s going to take more unique content for each link to count.
  • Spammy sites will be consolidated into a few big ones like Wisegeek and TheFind, with the little guy no longer able to compete.
  • Instead of anchor text being a strong ranking signal on its own, it will now matter more for LSI and for helping establish keyword relationships.
  • Google and Bing will never be about finding the best content (after all, they do sell the top three results), because that is not the type of content their target market wants. Look at the type of content in a magazine like Psychology Today. Most magazines sell based on the pictures, and anyone can label a pornographic picture as a picture of Justin Bieber to try to get traffic. Even a link-manipulation scheme will weed out mislabeled sites (most social bookmarking sites at least don’t allow inappropriate tagging). Link building will continue to be essential, as it will always be a way to make sure a site doesn’t violate the terms of service and that it’s above a certain content floor.
  • More no-follow links are going to be selectively followed when the no-follow tag was not added by choice of the page’s owner. So a blog comment might still be no-followed, since the owner of that page is the one saying he doesn’t want it followed, but a Twitter page will be followed, since it isn’t the page owner’s intent for the links to be no-followed.
Tyler Davis actually adheres to a lot of Leo Di Milo’s established principles of internet marketing. The unique selling point of his primary website is trying to find a growing-taller method that actually works. And much of the time the search engines have ranked his site for what they think it should rank for rather than for what he’s built anchor text links toward. Most of the time he has found that social methods work better, but they take time, and link building is a way to vent your frustration that your site isn’t ranking right now! Also, search engine marketing is the only way to target the casual internet user who is only on the internet one minute a week to type a query into Google.
