Monday, February 7, 2011

Two Diametrically Opposed Google Editorial Philosophies


An 'Algorithmic' Approach


When it comes to buying links, Google not only fights it with algorithms, but also ran a 5-year long FUD campaign, introduced nofollow as a proprietary filter, encouraged webmasters to rat on each other, and has engineers hunting for paid links. On top of that, Google's link penalties range from subtle to overt.
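For context on the mechanism itself: nofollow is just a rel attribute that tells crawlers not to pass link equity through a link. Here is a minimal sketch (using BeautifulSoup, with a hypothetical list of advertiser URLs) of how a publisher would flag paid links:

```python
# Minimal sketch: mark known paid links rel="nofollow" so they pass no
# link equity. The advertiser URL list here is hypothetical.
from bs4 import BeautifulSoup

PAID_URLS = {"http://advertiser.example.com/"}  # hypothetical paid placements

def nofollow_paid_links(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if a["href"] in PAID_URLS:
            a["rel"] = "nofollow"  # crawlers should not count this link
    return str(soup)

html = '<a href="http://advertiser.example.com/">great deal</a>'
print(nofollow_paid_links(html))
# <a href="http://advertiser.example.com/" rel="nofollow">great deal</a>
```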


Google claims that they do not want to police low quality content by trying to judge intent, that doing so would not be scalable enough to solve the problem, & that they need to do it algorithmically. At the same time, Google is willing to manually torch some sites and basically destroy the associated businesses. Talk to enough SEOs and you will find stories of carnage - complete decimation.


Economics Drive Everything


Content farms are driven by economics. Make them unprofitable (rather than funding them) and the problem solves itself - just like Google AdWords does with quality scores. Sure you can show up on AdWords where you don't belong and/or with a crappy scam offer, but you are priced out of the market so losses are guaranteed. Hello $100 clicks!
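To make that concrete, here is a toy model of the commonly described quality-score auction (a simplification for illustration, not Google's exact formula), where a low quality score multiplies your effective click price:

```python
# Toy model of a quality-scored second-price ad auction (a common
# simplification, not Google's exact implementation).
def actual_cpc(next_ad_rank: float, my_quality_score: float) -> float:
    """Pay just enough to beat the ad below you, scaled by your quality."""
    return next_ad_rank / my_quality_score + 0.01

# Legitimate advertiser: quality score 10 against a competitor ad rank of 40.
print(round(actual_cpc(40, 10.0), 2))   # 4.01

# Crappy scam offer: quality score 0.4 pays 25x more for the same slot.
print(round(actual_cpc(40, 0.4), 2))    # 100.01 -- hello $100 clicks
```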


How many content farms would Google need to manually torch to deter investment in the category? 5? Maybe 10? 20 tops? Does that really require a new algorithmic approach on a web with tens of millions of websites?


When Google nuked a ton of article banks a few years back the damage was fairly complete and lasted a long time. When Google nuked a ton of web directories a few years back the damage was fairly complete and lasted a long time. These were done in sweeps where in a single day you would see 50 sites lose their toolbar PageRank & take a swan dive in traffic. Yet content farms are a sacred cow that needs an innovative "algorithmic" approach.


One Bad Page? TORCHED


If they feel an outright ban would be too much, they could even dial the sites down over time to deter them without immediately killing them. Some bloggers who didn't know any better got torched based on a single blog post:


The Forrester report discusses a recent “sponsored conversation” from Kmart, but I doubt whether it mentions that even in that small test, Google found multiple bloggers that violated our quality guidelines and we took corresponding action. Those blogs are not trusted in Google’s algorithms any more.


One post and the ENTIRE SITE got torched.


An Endless Sea of Garbage


How many garbage posts have you seen on content farms?


When you look at garbage content there are hundreds of words on the page screaming 'I AM EXPLOITATIVE TRASH.' Yet when you look at links, they are often embedded inline with little surrounding context to tell whether the link is an organic reference or something that was paid for.
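The asymmetry is easy to illustrate. Here is a toy sketch (phrases and thresholds are entirely hypothetical, purely illustrative) of how much more evidence a page of farmed copy exposes than a bare inline link does:

```python
# Toy illustration, not any real classifier: a farmed page exposes
# hundreds of words of evidence, while an inline link exposes only a
# few characters of anchor text plus a rel attribute.
def content_signals(text: str) -> int:
    words = text.lower().split()
    score = 0
    for w in set(words):  # crude keyword-stuffing check (hypothetical 20% cutoff)
        if len(w) > 4 and words.count(w) / max(len(words), 1) > 0.20:
            score += 1
    return score

def link_signals(anchor: str, rel: list[str]) -> int:
    # All a crawler sees inline: the anchor text and the rel attribute.
    # Whether money changed hands is invisible in the markup itself.
    return 0 if "nofollow" in rel else int("buy" in anchor.lower())

print(content_signals("cheap pills cheap deals cheap tips cheap tricks"))  # 1
print(link_signals("read the full review here", []))                       # 0
```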


Why is it that Google is comfortable implying intent with links, but must look the other way when it comes to content?


Purchasing Distribution


Media is a game of numbers, and so content companies mix in various layers of quality to make it harder for Google to separate signal from noise. Yahoo! has fairly solid content in their sports category, but then fluffs it out with top 10 lists and such from Associated Content. Now Yahoo! is hoping they can offset lower quality with a higher level of personalization:


The Yahoo platform aims to draw from a user’s declared preferences, search items, social media and other sources to find and highlight the most relevant content, according to the people familiar with the matter. It will be available on Yahoo’s Web site, but is optimized to work as an app on tablets and smartphones, and especially on Google Android and Apple devices, they said.


AOL made a big splash when they bought TechCrunch for $25 million. When AOL's editorial strategy was recently leaked, it highlighted how they promoted cross-linking their channels to drive SEO. And since the acquisition, TechCrunch has only scaled up the volume of content it produces. In the last 2 days I have seen 2 advertorials on TechCrunch where the conflicting relationship was only mentioned *after* you read the post. One was a Google employee suggesting Wikipedia needs ads, and the other was some social commerce platform guy promoting the social commerce revolution occurring on Facebook.


Being at the heart of technology is a great source of link equity to funnel around their websites. TechCrunch.com already has over 25% as many unique linking domains as AOL.com does. One of the few areas that is more connected on the social graph than technology is politics. AOL just bought Huffington Post for $315 million. The fusion of political bias, political connections, celebrity contributors, and backing a guy who promoted an (ultimately empty) promise of hope and change quickly gave the Huffington Post even more link equity than TechCrunch has.


Thus they have the weight to do all the things that good online journalism is known for, like ads so deeply embedded in content you can't tell them apart, off-topic paginated syndicated duplicate content, and meaningless posts devoid of content written around Google Trends data. As other politically charged mainstream media outlets have shown, you don't need to be factually correct (or even attempt honesty) so long as your bias is consistent.


Ultimately this is where Google's head-in-the-sand approach to content farms backfired. When content farms were isolated websites full of trash, Google could have nuked them without much risk. But now that there is a blended approach and content farms are part of public companies backed by politically powerful individuals, Google can't do anything about them. Their hands are tied.


Trends in Journalism


Much like the middle class has been gutted in the United States, Ireland (and pretty much everywhere that is not Iceland) by economic policies that sacrifice the average person to protect banking criminals, the same thing is happening online to the value of any type of online journalism. As we continue to ask people to do more for less, we suffer through a lower quality user experience with more half-content that leaves out the essential bits.


How to build a brick wall:


  • step 1: get some bricks

  • step 2: stack them in your workplace

  • step 3: build the brick wall

The other thing destroying journalism is not only lean farms competing against thick and inefficient organizations for distribution, but also Google pushing to control more distribution via their various data grabs:

  • Youtube video & music

  • graphical CPA ads in the search results

  • lead generation ads in the search results

  • graphic AdSense ads on publisher sites that drive searches into those lead generation funnels

  • grouping like data from publishers above the organic search results

  • offering branded navigational aids above the organic search results

  • acquiring manufacturer data

  • scraping 3rd party reviews

  • buying sentiment analysis tools

  • promoting Google maps everywhere

  • Google product pages & local review pages

  • extended ad units, etc.

If most growth in journalism is based on SEO & Google is systematically eating the search results, then at some point that bubble will get pricked and there will be plenty of pain to go around.


My guess is that in 3 to 4 years the search results become so full of junk that Google pushes hard to rank chunks of ebooks wrapped in Google ads directly in the search results. Books are already heavily commoditized (it's amazing how much knowledge you can get for $10 or $20), and given that Google already hard-codes their ebooks in the search results, it is not a big jump for them to work on ad deals that pull publishers in. It follows the trend elsewhere: 'Free Music Can Pay as Well as Paid Music, YouTube Says.'


It's Not All Bad


The silver lining is that if you are the employer your margins may grow; but if you are an employee just scraping by on $10 an hour, it increases the importance of doing something on the side to lower your perceived risk & increase your influence. A few years back Marshall Kirkpatrick started out on AOL's content farms. The tips he shared for standing out would be a competitive advantage in almost any vertical outside of technology & politics:


one day Michael Arrington called and hired me at TechCrunch. 'You keep beating us to stories,' he told me. I was able to do that because I was getting RSS feeds from key vendors in our market delivered by IM and SMS. That's standard practice among tech bloggers now, but at the time no one else was doing it, so I was able to cover lots of news first.
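That edge is easy to reproduce today. Here is a minimal sketch of the same workflow using the feedparser library (feed URLs and the alert hook are placeholders; swap in whatever IM/SMS gateway you use):

```python
# Minimal sketch of the "RSS straight to an alert" workflow, using the
# feedparser library. Feed URLs and the alert hook are placeholders.
import time

import feedparser

FEEDS = ["http://example.com/vendor-news.rss"]  # hypothetical vendor feeds
seen = set()

def alert(title: str, link: str) -> None:
    print(f"NEW: {title} -> {link}")  # swap in an IM/SMS gateway here

while True:
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            if entry.link not in seen:
                seen.add(entry.link)
                alert(entry.title, entry.link)
    time.sleep(60)  # a one-minute poll beats waiting for a daily digest
```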


Three big tips on the 'becoming a well-known writer' front for new writers are...


  • if short-form junk content is the standard, then it is easier to stand out by creating long-form, well-edited content

  • it is easier to be a big fish in a small pond than to try to get well known in a saturated area, so it is sometimes better to start working for niche publishers that have a strong spot in a smallish niche

  • if you want to target the bigger communities, the most important thing to them (and the thing they are most likely to talk about) is themselves

Another benefit to publishers is that as the web becomes more polluted, people will become far more likely to pay to access better content and smaller, tighter communities.


Prioritizing User Feedback?


In a Google blog post about web spam, they state the following:


Spam reports are prioritized by looking at how much visibility a potentially spammy site has in our search results, in order to help us focus on high-impact sites in a timely manner. For instance, we’re likely to prioritize the investigation of a site that regularly ranks on the first or second page over that of a site that only gets a few search impressions per month.
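Mechanically that policy is trivial. Here is a sketch of visibility-weighted report triage (the report fields are my assumptions, not Google's schema):

```python
# Sketch of the quoted prioritization rule: the more search visibility
# a reported site has, the sooner it gets reviewed. Fields are assumed.
reports = [
    {"site": "tiny-blog.example", "monthly_impressions": 40},
    {"site": "content-farm.example", "monthly_impressions": 9_000_000},
    {"site": "mid-site.example", "monthly_impressions": 12_000},
]

# Highest-visibility sites first. Under this rule, farms seen by
# millions of searchers every month should sit at the top of the queue.
for r in sorted(reports, key=lambda r: r["monthly_impressions"], reverse=True):
    print(r["site"], r["monthly_impressions"])
```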


Given the widely echoed complaints about content farms, it seems Google takes a different approach with them, especially considering that the top farms are seen by millions of searchers every month.


Implying Intent


If end users can determine when links are paid (with limited context) then why not trust their input on judging the quality of the content as well? The Google Toolbar has a PageRank meter for assessing link authority. Why not add a meter for publisher reputation & content quality? I can hear people saying "people will use it to harm competitors" but I have also seen websites torched in Google because a competitor went on a link buying spree on behalf of their fellow webmaster. At least if someone gives you a bad rating for great content then the content still has a chance to defend its own quality.
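And the "competitors will game it" objection has standard countermeasures. One is a Bayesian average, sketched below (prior values are hypothetical), which keeps a handful of hostile votes from torching a page while letting genuine consensus move the score:

```python
# Sketch: damp a user quality meter with a Bayesian average so a few
# hostile competitor votes can't torch a page. Priors are hypothetical.
PRIOR_MEAN = 3.0    # treat every page as average until proven otherwise
PRIOR_WEIGHT = 50   # hostile voters must outweigh 50 "average" votes

def damped_quality(ratings: list[float]) -> float:
    total = PRIOR_MEAN * PRIOR_WEIGHT + sum(ratings)
    return total / (PRIOR_WEIGHT + len(ratings))

print(round(damped_quality([1, 1, 1]), 2))   # 2.89: three 1-star votes barely move it
print(round(damped_quality([5] * 200), 2))   # 4.6: broad real consensus wins
```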


With links there is a final opinion and that is it. Not only do particular techniques carry varying levels of risk, but THE prescribed analysis of intent depends on who is doing it!


A Google engineer saw an SEO blog about our affiliate program passing link juice, and our affiliate links stopped passing weight. (I am an SEO, so the presumed intent is spam.) Then something weird happened. A few months later a Google engineer *publicly* stated that affiliate links should count. A few years later Google invested in a start-up which turns affiliate links into direct links while hiding the paid compensation in the background. (Since Google is doing it, the intent is NOT spam.)


Implying Ignorance


Some of the content mills enjoy the benefit of the doubt. Jason Calacanis lied repeatedly about 'experimental pages' and other such nonsense, but when his schemes were highlighted he was offered the benefit of the doubt. eHow also enjoys it. It doesn't matter that Demand Media's CEO was the chairman of an SEO consulting company which sold for hundreds of millions of dollars. What matters is the benefit of the doubt (even as his company flagrantly violates quality guidelines by doing bulk 301 redirects of EXPIRED domains into eHow ... something where a lesser act can put you up for vote on a Google engineer's blog for public lynching).
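For what it's worth, that redirect pattern is trivially detectable. Here is a sketch using the requests library (the domain list is hypothetical) that checks whether expired domains 301 into a target site:

```python
# Sketch: detect bulk 301 redirects of expired domains into one target
# site, using the requests library. The domain list is hypothetical.
import requests

EXPIRED_DOMAINS = ["http://old-crafts-blog.example", "http://dead-diy-site.example"]
TARGET = "ehow.com"

for url in EXPIRED_DOMAINS:
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
    except requests.RequestException:
        continue
    if any(h.status_code == 301 for h in resp.history) and TARGET in resp.url:
        print(f"{url} 301s into {resp.url}")  # link equity quietly consolidated
```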


The algorithm. They say. It has opinions.


What Other Search Engines Are Doing


A Bing engineer accused Google of funding web pollution. Blekko invites end users to report spam in their index, and the first thing end users wanted booted out was the content mills.


But Google needs to be "algorithmic" when the problems are obvious and smack them in the face. And they need to "imply intent" where the problems are less pervasive & nowhere near as overt.


Makes sense, almost!


"
