Thursday, March 1, 2012

Goldilocks SEO



Source: http://www.seobook.com/goldilocks-seo


From AdSense to SpamSense to Spam Cents

Google announced they rolled out their anti-overly-aggressive-ads algorithm. They didn't give a specific % on how much of the above the fold content can be ads, but suggested using their browser preview tool. Using that tool on Google.com's search results would of course score it as a spam site, but for some small AdSense webmasters that avoided Panda, Google may have drawn first blood.

Much Quicker Updates

With a limited number of recoveries nearly a year after Panda, the first bite might seem like a big concern; however, the "too many ads" algorithm updates far more frequently than Panda does:

If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.

And for those who got hit by Panda then tried to make up for those lower ad revenues with more AdSense ad units, they probably just got served round #2 of Panda Express. ;)

Is it Screen Layout, or Something Else?

In the past Google suggested to a nuked AdWords advertiser that more of his above-the-fold real estate should be content than ads.

However Google has such a rich data set with AdSense that I don't think they would just look at layout. If I were them I would factor in all sorts of metrics like

  • AdSense CTR
  • average page views per visitor
  • repeat visits & brand searches
  • bounce rate
  • clickstream data from Chrome & the Google toolbar (so even if you are using other ad networks, they can still sample the data)

Some sites are primarily driven off of mobile views while other sites might be seen on large monitors. When Google sees every page load & measures the CTRs, tracking actual user response is better than guesstimating it.

They could come up with some pretty good metrics from those & then for any high traffic/high earning site they could manually review them to see if they deserve to get hit or not & adjust + refine the "algorithm" until those edge cases disappeared. Google's lack of credible competition in contextual & display ads means they can negotiate pretty tough terms with publishers that they feel are not adding enough value to the ecosystem.
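To make the speculation above concrete, here is a minimal sketch of how engagement metrics like those listed could be blended into a single "ad aggressiveness" score. Everything here is hypothetical: the field names, weights, and cutoff are invented for illustration and are not Google's actual signals or values.

```python
# A hypothetical sketch of blending engagement metrics into a single
# "ad aggressiveness" score. All field names, weights, and thresholds
# are invented for illustration; they are NOT Google's actual signals.

from dataclasses import dataclass

@dataclass
class SiteMetrics:
    adsense_ctr: float          # fraction of pageviews that click an ad
    pages_per_visitor: float    # average pageviews per visit
    repeat_visit_rate: float    # fraction of visitors who return
    bounce_rate: float          # fraction of single-page visits
    above_fold_ad_ratio: float  # estimated share of above-the-fold area that is ads

def ad_aggressiveness_score(m: SiteMetrics) -> float:
    """Higher score = more likely the site monetizes too aggressively."""
    score = 0.0
    score += 3.0 * m.above_fold_ad_ratio   # layout itself
    score += 2.0 * m.adsense_ctr           # unusually high ad CTR can mean accidental clicks
    score += 1.5 * m.bounce_rate           # visitors leaving immediately
    score -= 1.0 * min(m.pages_per_visitor / 5.0, 1.0)  # engagement offsets the penalty
    score -= 1.0 * m.repeat_visit_rate     # a loyal audience offsets it too
    return score

if __name__ == "__main__":
    site = SiteMetrics(adsense_ctr=0.08, pages_per_visitor=1.2,
                       repeat_visit_rate=0.05, bounce_rate=0.85,
                       above_fold_ad_ratio=0.6)
    print(round(ad_aggressiveness_score(site), 2))  # flag sites above some manually tuned cutoff
```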

It's Not Just Algorithms Cleaning Up AdSense

In addition to these sorts of algorithms, over the past year they have manually hit networks of sites with the doorway pages label & disabled ad serving on sites or entire accounts where they felt there was a bit too much arbitrage. One of our SEO Book members pointed me to this thread where a lot of Pakistani AdSense accounts got torched last October & another sent me a sample termination email from Google similar to this one:

Notice that in the above:

  • There was no claim of click fraud, copyright issues, or anything like that.
  • There was no claim of advertiser complaints.
  • Google offers no customer support phone number, no "you might want to work on this" advice, doesn't list which of the sites in the account they felt could be improved, and RETROACTIVELY nuked past "earnings" ... depending on where it is in the schedule that can amount to anywhere from 30 to 50+ days (I remember Teeceo mentioned how they waited until the day before the AdSense payday to smoke his stuff way back in the day to have maximum impact!)

On Google's latest quarterly earnings call they highlighted how year on year Google's revenues were up 25% but the network revenues only grew at 15%. They also explained the slower network revenue growth as being associated with improved search quality & algorithm updates like Panda.

Left unsaid in such a statement was that until those algorithms rolled out, Google admitted they funded spam. ;) The whole AdSense & content farm problem was created through incentive structures with unintended consequences.

Is the Garbage Disappearing, or Just Moving to a New Landfill?

If you track what is going on with the Google+ over-promotion (long overdue post coming on that front shortly!) or how Google is still pre-paying Demand Media to upload video "content" to Youtube, Google still may be funding the same model, but doing so while gaining a tighter control of relevancy so they can better sort good stuff from crap (when you host content & track user response you have all the metrics in the world to determine how relatively good you think it is). If they over-promote these sites then in the short run they create the same skewed business model problem.

Sure hosting the user experience makes it easier to sort the wheat from the chaff, but the other big risk here is the impact on the rest of the publishing ecosystem. There will be lots of thin spam from popular people on Google+ (anyone launched a celebrity-focused Pay-Per-Plus site yet?) & in-depth editorial content might not be economically feasible in certain categories where there literally is no organic SERP above the fold.

I will compliment them on their efforts to clean up some of the worst offenses (from the prior generation of "bad incentives"). If you were hit by it, Panda was every bit as big/brutal as the famous Florida update. If this update is anything near as significant as the Panda update (in how it impacts smaller independent webmasters) then it is going to force more of them/us to move up the value chain.

That may mean pain in the short run, but (for those who take it as a wake up call to develop brand & organic non-search traffic streams) far more rewards in the long run for those who remain after the herd is thinned.

Working for "The Company"

Larry Page's view on working for the company:

My grandfather was an autoworker, and I have a weapon he manufactured to protect himself from the company that he would carry to work. It's a big iron pipe with a hunk of lead on the head. I think about how far we've come as companies from those days, where workers had to protect themselves from the company.

I think for many SEOs the idea of starting over is painful, but the best SEOs often enjoy the forced evolution & the game of it all. They don't roll over & play dead or forget SEO. And if Google didn't put hard resets in every once in a while, then the big hedge funds would be mopping up the SERPs and cleaning our clocks with the help of Helicopter Ben.

Areas For Improvement

Of course this could be taken as a positive post toward Google (and it mostly is), but I don't want to come across as a fanboi, so I thought I should do a shout out to a couple things they still need to fix in order to be consistent:

  • If Google is going to tell people that thick deep content is needed to gain sustainable exposure then they shouldn't be ranking thin Google+ pages in the SERPs just because it is a Google product. Even people who have *always* given Google the benefit of the doubt (full on fanbois) found the Google+ placement in the SERPs distasteful.
  • Google's AdSense is still sending out some of those automated "you are leaving money on the table" styled emails reminding publishers to use 3 ad units. If such behavior may lead to a smoke job, then the recommendation shouldn't be offered in the first place. Right below the "use 3 ad units" there needs to be a "proceed with caution" styled link (in red) that links to the recent "too many ads" post.
  • Old case studies that are no longer in line with best practices in the current market should have some sort of notice/notification added to them so new webmasters don't get the wrong idea.
  • Some of the AdSense heatmaps are roadmaps to penalization. These should have been fixed before yesterday's announcement, but if they are still up there next week then Google is willfully & intentionally trying to destroy any small business owner that follows that "best practice" advice.

Your Feedback Needed

Since this update impacted far fewer sites than the Panda update, there are fewer sample/example sites. Did any of your websites get hit? If so, how would you describe ...

  • your ad layout
  • your ad CTR
  • your mode of monetization (AdSense, other, both)
  • the level of impact on your site from the update

Source: http://www.seobook.com/spamsense


Cloaking: Survey Says?

In the below video Matt Cutts states that "there is no such thing as white hat cloaking" ...

... yet Google is testing a new ad unit where users have to fill out a survey before they can view the content.

How long until the surveys include something like:

  • did you vote in 2008
  • what presidential candidate did you vote for
  • how do you feel about issue x
  • how strongly do you feel about your opinion on x

Then after the survey: "Thanks for your feedback. Candidate y supports your views on issue x."

Advertisers then get a report like: "in Ohio, 84% of the 289,319 swing voters with an average household income between $32,400 and $67,250 think issue x is vitally important and have a 6:1 bias toward option A. They respond to it more strongly if you phrase it as "a c b" and are twice as likely to share your view if you phrase it that way. The bias is even stronger amongst women & voters under 50, where they prefer option A by a factor of 9:1."

Couple that ability to flagrantly violate their own editorial guidelines with...

... & Google is in an amazing position politically.

It is thus not surprising to see how politicians have a hard time being anything but pro-Google, as they are the new Western Union.

This isn't the first time Google experimented with cloaking either. Threadwatch had a post on Google cloaking their help files years ago & YouTube offers users a screw you screen if they are in a country where the content isn't licensed - yet they still show those cloaked pages ranking in the search results.

"The most perfidious way of harming a cause consists of defending it deliberately with faulty arguments." - Friedrich Nietzsche

It is common knowledge that you shouldn't mix business and politics, however if one looks at history, many of those who gave us those sage words did precisely the opposite - and often illegally so - selling us down the river.

What is so obnoxious about Google's survey trial is that a big site that was hit by Panda was hit because they used scroll cloaking & didn't let the users get to the content right away. Googlers suggested users didn't like it & voted against it, and then rolled out the same sort of "wait 1 moment please" stuff themselves as a custom beta ad unit.

And today Google just announced that they might create an algorithm which looks at ad placements on a website as a spam signal outside of Panda:

"If you have ads obscuring your content, you might want to think about it," asking publishers to consider, "Do they see content or something else that's distracting or annoying?"

On the one hand they tell you to optimize your ad placements & on the other they tell you that those were not optimal & are so aggressive that they are spam.

For a while you could use something like "would Google do this" as a rule of thumb for gray area behavior.

In the current market that won't work.

"No man has the right to dictate what other men should perceive, create or produce, but all should be encouraged to reveal themselves, their perceptions and emotions, and to build confidence in the creative spirit." - Ansel Adams

As ad units get more interactive & Google keeps eating more verticals the line between spam vs not will keep blurring.

Perception is everything.

"We are all in the gutter, but some of us are looking at the stars." - Oscar Wilde


Source: http://www.seobook.com/survey-says


Focus on The Business Model

Google's Take on Search Plus Your World

A few weeks ago Google announced the launch of Search Plus Your World, which deeply integrates social sites (especially Google+) into the Google search experience to make it more personalized.

While Google claimed that the socialization was rather broad-based, the lack of inclusion of Facebook & Twitter along with the excessive promotion of Google+ raised eyebrows. While the launch was framed as social personalization of results, the Google+ promotions appeared on queries where they were clearly not the most relevant result, even when users were not logged into a Google account.

Google+ Over-promotion

A couple weeks ago when Google announced Google Search Plus Your World competitors collectively complained about Google over-promoting their own affiliated websites.

Twitter was perhaps the loudest complainer, highlighting how Google basically eats all the above-the-fold real estate with self promotion on this @WWE search.

It is no surprise that folks like Ben Edelman, Scott Cleland & Fair Search chimed in with complaints, as this is just a continuation of Google's path. But the complaints came from a far wider cast of characters on this move: the mainstream press like CNN, free market evangelists like the Economist, Google worshipers indoctrinated in their culture who wrote a book on Google & even ex-Googlers now call into question Google's transparently self-serving nature:

I think Google as an organization has moved on; they're focussed now on market position, not making the world better. Which makes me sad.

Google is too powerful, too arrogant, too entrenched to be worth our love. Let them defend themselves, I'd rather devote my emotional energy to the upstarts and startups. They deserve our passion.

The FTC's Google antitrust probe is to expand to include a review of Google+ integration in the search results.

Facebook & Twitter launched a don't be evil plugin named Focus On The User, which replaces Google+ promotion with promotion of profiles from Facebook & Twitter.

For the top-tier broad social networks, framing the issue as one of integrating promotion of their networks directly into the search results is a natural & desirable conclusion, but is that just a convenient answer to the wrong question?

  • Whether Google ranks any particular organic result above the corresponding Bing ranking in Google's now below-the-fold organic results is a bit irrelevant when the above the fold results are almost entirely Google.com. But is the core problem that we are under-representing social media in the search results? According to Compete.com, Facebook & YouTube combine to capture about 16% of all downstream Google clicks. Do we really need to increase that number until the web has a total of 5 websites on it? What benefit do we get out of a web that is just a couple big walled gardens?
  • If Facebook is already getting something like 20% of US pageviews & users are still looking for information elsewhere, doesn't that indicate that they probably desire something else? Absolutely Facebook should rank for Facebook navigational queries, but given all their notes spam, I don't like seeing them in the search results much more than seeing a site like eHow.
  • The he said / she said data deals are also highly irrelevant. What is really needed is further context. Before Google inserted Google+ in their search results the Google+ social network was far less successful than MySpace (which recently sold for only $35 million). If social media is added as an annotation to other 3rd party listings then I think that has the opportunity to add valuable context, but where a thin "me too" styled social media post replaces the publisher content it lowers the utility of the search results & wastes searchers' time. Further, when those social media results are little more than human-powered content scrapers it also destroys the business models of legitimate online publishers.

Over-promotion vs "Search Spam"

At any point Google can promote one of their new verticals in a prominent location in the search results & if they are anywhere near as good as the market leader eventually they can beat them out of nothing more than the combination of superior search placement, monopoly search marketshare, account bundling & user laziness. What's more, they can make paid products free and/or partner with competitors 2 through x in an attempt to destroy the business model of anyone they couldn't acquire (talk to Groupon).

Amit Singhal is obviously a brilliant guy, but I thought some of the answers he gave during a recent interview by Danny Sullivan were quite evasive & perhaps a bit inauthentic. In particular, ...

  • "The overall takeaway that I have in my mind is that people are judging a product and an overall direction that we have in the first two weeks of a launch, where we are producing a product for the long term." If the product wasn't ready for prime time you were not required to mix it directly into the organic search results right off the bat. It could have been placed at the bottom of the search results, like the "Ask on Google" links were. Bing has been working on social search for 18 months & describes their moves as "being very conservative."
  • "The user feedback we have been getting has been almost the other side of the reaction we?ve seen in the blogosphere." Of course publishers who see their content getting scraped & see the scraped copy outranking the original have a financial incentive to care about a free & automated scraper site displacing their work. They don't get those pageviews, they don't get that referrer data, and they don't get those ad impressions. Google's PR team is anything but impressed when another company dares do that to Google.
  • "The users who have seen this in the wild are liking it, and our initial data analysis is showing the same." Much like the Google Webmaster Tools shows that pages with a +1 in the search results get a higher CTR, this Google+ social stuff also suffers from the same type of sampling bias & giving the listings a larger and more graphical stand out further help them pull in much more clicks. Any form of visual highlighting & listing differentiation can lift CTR. I might be likely to click on some of my own results more, but when I do so you might just be grabbing a slice of navigational searches I was going to do anyway where I was looking for something else I posted on Google+ or my Google+ account or the account of a friend & so on. Further, aggregate data hides many data points that are counter to the general trend. I have seen instances of branded searches where the #1 organic site was getting a CTR above 70% (it even had organic sitelinks, further indicating it was a navigational search) and for such a search in some cases there were 2 Adwords ads above the organic results & then the Google+ page for a brand outranked the associated brand in the SERPs for those who followed it! That is a terrible user experience, particularly since the + page hasn't even had any activity for months.
  • "Every time a real user is getting those results, they really are delighted. Given how personal this product is, you can only judge it based on personal experiences or by aggregate numbers you can observe through click-through." First, publishers are not fake users. Secondly, as mentioned above, there is a sampling bias & the + listings stand out with larger & more graphical listings. If they didn't get a higher CTR that would mean they were *really* irrelevant.
  • "out of the gate, whereas we had limited users to train this system with, I?m actually very happy with the outcome of the personal results." They could have been placed at the bottom of the search results or off to the side or some such until there was greater confidence in the training set.
  • "People are coming to a conclusion about the product today, within the first two weeks, and they?re not fully seeing the potential where we can build this product around real identities and real relationships." If a publisher promotes a site to the top of the search results & then says something like 'we will improve quality later' they are branded as spammers. In the past Google has justified penalizing a site based on its old content that no longer exists on the site. Investing in depth, quality & volume is a cycle. If others get prohibited from evolving through the cycles due to algorithms like Panda then it becomes quite hard to compete as a new start up when Google can just insert whatever it wants right near the top & then work on quality after the fact.
  • "We don?t think of this as a promotional unit now. This is a place that you would find people with real identities who would be interesting for your queries." If this is the case then why does it only promote Google+?
  • "We?re very open to incorporating information from other services, but that needs to be done on terms that wouldn?t change in a short period of time and make our products vanish." The problem is, if a company builds a reputation as a secretive one that clones the work of its partners & customers then people don't want to do open-ended transparent relationships. Naive folks might need to see the blood and tears 3 or 4 times to pick up on the trend, but even the slowest of the slow notice it after a dozen such moves.
  • "I?m just very wary of building a product where the terms can be changed." Considering Google's lack of transparency & self-promotional bias on the social networking front, would you be fully transparent and open with Google? If so, then aren't the search algorithms complex enough that it would make sense to make those transparent as well? How can you ask other social networks to increase transparency at the same time Google is locking down their search data on claims of protecting user privacy?
  • "It?s not just about content. It?s about identity, and when you start talking about these things and what it takes to build this, the data needed is much more than we can publicly crawl." This is where being trustworthy is so crucial. Past interactions with Yelp, TripAdvisor & Groupon likely make future potential partners more risk adverse & cautious. Outrageous "accidents" like those that happened with Mocality & Open Street Map from playing fast and loose further erode credibility. And even when Google hosts the media & has full access to user data they still rank inferior stuff sometimes (like the recent Santorum YouTube cartoon fiasco), even on widely searched core/head keywords.

The big issue is that if people feel the game is rigged they won't have much incentive to share on Google+. I largely only share stuff that is irrelevant or only tangentially relevant to our business interests & won't share stuff that is directly relevant, because I don't want to be forced to compete against an inferior version of my own work when the deck is stacked so the inferior version wins simply because it is hosted on Google.

As we move into the information age a lot of physical stores are shutting down. Borders went bust last year. Sears announced the closure of many stores. And many of the people shopping in the physical stores that remain are using cell phones for price comparisons. Given Google's mobile OS share this is another area where they can build trust or burn it. A friend mentioned today that the prices Google Product Search shows near the header are almost always lower than the lowest price actually available in the list - sometimes by a substantial margin.

Identity vs Anonymous Contractors

In the past we have mentioned that transparency is often a self-serving & hypocritical policy by those atop power systems who want to limit the power of those whom they aim to control.

When Google was caught promoting illegal drug ads there was no individual who took the blame for it. When the Mocality scraping & the Open Street Map vandalism issues happened, all that we were told was that Google "was mortified" and it was "a contractor." If people who did hit jobs could just place all the blame on "the contractor" then the world would be a pretty crappy place!

Eric Schmidt warned that "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place." That sage advice came from the same Eric Schmidt that blackballed CNET for posting personal information about him. Around the same time Eric offered the above quote, Google was engaged in secret & illegal backdoor deals with direct competitors to harm their own employees.

What happened to Google recruiters who dared to go against the illegal pact? They were fired on the hour:

"Can you get this stopped and let me know why this is happening?" Schmidt wrote.

Google's staffing director responded that the employee who contacted the Apple engineer "will be terminated within the hour."

When Google+ launched they demanded that you use your real name or don't use the product. They later claimed that you can use a nickname on your account as well, but there is a difference between a nickname and pseudonyms.

What is so outrageous about the claims for this need for real identities is that past studies have shown that pseudonymous comments are best & Bruce Schneier highlighted how we lose our individuality if we are under an ever-watchful eye:

Cardinal Richelieu understood the value of surveillance when he famously said, "If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged." Watch someone long enough, and you'll find something to arrest -- or just blackmail -- with. Privacy is important because without it, surveillance information will be abused: to peep, to sell to marketers and to spy on political enemies -- whoever they happen to be at the time.

Privacy protects us from abuses by those in power, even if we're doing nothing wrong at the time of surveillance.

In many markets ads and content are blended in a way that makes it hard to distinguish between them. Whenever Google wants to enter they can demand greater transparency to participate (and then use the standard formatted data from that transparency to create a meta-competitor in the market.)

Increasingly Google is placing more of their search data & their webmaster-related functions behind a registration wall. If you are rich & powerful they will sell you the data. If you are the wrong type of webmaster that aggregate data can be used in *exceptionally* personal ways.

User Privacy

Ahead of updating their privacy policy, Google has directed a large portion of their ad budget toward ads about how they protect users online.

What better way to ensure user privacy than to allow them to register their accounts under pseudonyms? The real name policy on Google+ was part of what made Google want to stop providing referrer data for logged in users who search on Google. This has had a knock-on effect where other social sites are framing everything, requiring registration to read more of public user generated content & sending outbound traffic through redirects.

Google's new privacy policy allows them to blend your user data from one service into refining the experience (and ads) on another:

If you're signed into Google, we can do things like suggest search queries - or tailor your search results - based on the interests you've expressed in Google+, Gmail, and YouTube. We'll better understand which version of Pink or Jaguar you're searching for and get you those results faster.

Google & Facebook's war (against) user privacy is catching media and governmental attention. Microsoft highlighted some of Google's issues in their "putting people first" ad campaign & the blowback has caused Google not only to publish PR-spin "get the facts" styled blog posts, but to launch yet another ad campaign.

EU regulators have asked Google to pause their privacy policy changes.

Bogus Testimonials & Social Payola

Is social media a cleaner signal than links? If search engines put the same weight on social media that they put on links it would get spammed to bits. It won't be long until a firm like Ad.ly offers sponsored Google+ posts.

Some have suggested that you won't be able to buy Google+ followers, however Google already includes user pictures on AdWords ads (even when those users would prefer not to appear & even when they didn't endorse the product that Google suggests they endorsed). In due time I expect Google will indeed sell followers & other user interactions as ad units (just like Twitter & Facebook do).

Further, celebrities sell Tweets to advertisers. When they are hot their rates go up:

When Ad.ly introduced self-destructing Charlie Sheen to Twitter, he was paid about $50,000 per tweet. It was worth it. Sheen's tweet for Internships.com generated 95,333 clicks in the first hour and 450,000 clicks in 48 hours, created a worldwide trending topic out of #tigerbloodintern, attracted 82,148 internship applications from 181 countries, and added 1 million additional visits to Internships.com.

Search engines might consider these to be clean signals if those same search engines were not busy buying the manipulation of said "relevancy" signals.

Attention is purchased to create demand. It isn't comfortable to put it this way, but we are trained to obey authority & to like what others like:

The average Facebook user has 130 friends, which equates with four degrees of separation to thousands of people, Mr. Fischer said. Metrics like that led him to believe that if Facebook could figure out a way to capitalize on "social endorsements," it would be like creating a word-of-mouth campaign that could reach millions of people simultaneously. Since the campaigns would come from a friend, they would theoretically be taken more seriously than, say, a TV commercial, he said.

On an individual basis reviews and ratings get faked everywhere. Even stodgy old slow-moving institutions like colleges game their ranking systems.

There was recently a question raised about how Google's rating systems skew high relative to the underlying data. Surely Overstock (the same Overstock Google penalized earlier this year) wouldn't promote Google's trusted stores aggressively on their own site if it made their business appear worse than it actually is, thus a positive bias must be baked into the system.

Entire categories of demand are created by those tied in with power cost shifting to create bubbles. The federal reserve helped spark a real estate bubble with low interest rates. FBI warnings of mortgage fraud were ignored. Consumers were constantly fed propaganda about "real estate only goes up." Then when that bubble popped, the US government bailed out those who caused it & burned trillions of dollars propping up home prices. The government even bailed out a company that is now shorting the housing market (when that company was about to get bailed out, the Secretary of the Treasury leaked that material non-public information to some of his criminal investor buddies).

Does all the above sound circular, conflicting, corrupt & confusing? It should, because that is how power works & comes off as seeming semi-legitimate when acting in illegitimate ways. The perception of reality is warped to create profitable opportunities that are monetized on the way up and the way down.

Millions of kids take drugs that address the symptoms of being a child full of energy, imagination & enthusiasm. In some cases they may need them, but in most cases they probably don't. The solution with the highest economic return gets the largest ad budget, even if it only treats symptoms.

Web Scrape Plus+ (Now With More Scraping)

When the +1 button & Google+ launched, Google highlighted how they would use the + button usage as a "relevancy" signal. Google recently started inserting + pages directly into the search results for brands & right from the very start they were using it as a scraper website that would outrank the original content source.

Google used the buy in from their promised relevancy signal to create a badge-based incentivized system which acts as a glorified PageRank funnel to further juice the rankings of these new pages on a domain name that already had a PageRank 10.

I recently read a blog post about how anyone could do the above & the opportunity is open to everyone. But the truth is, I can't state that something will become a relevancy signal that manipulates the search results in order to get buy in. Or, if I did something which actually had the same net effect, Google would likely chop my legs off for promoting a link scheme.

Recently the topic of Google+ as a scraper site came up yet again via Read Write Web & on Hacker News a Googler stated that it was "childish" to place any of the blame on Google!!!!!!

Google determines how much information is shown near each listing & can create "relevancy" signals in ways that ensure things tied to Google get over-represented (look at the +1 count here). When they do that & it destroys other business models *of course* Google deserves 100% of the blame.

Thin Content & Scraper Sites

Remember the whole justification for Panda was that thin content was a poor user experience?

In spite of sites like eHow getting hit, Google is still pre-paying them to upload content to Youtube.

Now that the (non-Google hosted) thin content has been disappeared (and the % of downstream traffic from Google to Youtube has more than doubled in the past year) it is time for Google to take another slice of the search traffic stream with Search Plus Your World:

The Google vs Facebook locked down walled garden contest will retard innovation. As the corporate internet silos grow larger the independent web withers. Them going after each other may leave room for Twitter, but it doesn't leave lots of room for others, as the economics of publishing have to work or the publishers die.

Start ups that were on a successful trajectory were killed by Panda:

The startup had been on a roll up until last February when Google altered its ranking algorithm with the release of "Panda." The changes decimated TeachStreet's traffic, and the company never quite recovered.

"We lost a lot of our traffic, and overnight we started talking to partners for biz dev, not for acquisition," he said. However, many of the potential partners wanted to know about an outright acquisition.

About.com was also smoked by Google:

The biggest worry, though, is that the decline of About.com itself may be irreversible. Fewer people are clicking on About ads placed by Google and the site's own display ads have dropped in value.

The company has attributed this decline in value to Google's decision last year to downgrade About pages in its search results. With more than 80% of traffic coming from search, the Google denigration was indeed a blow but About's problems may be rooted in something deeper.

Keep in mind that the reason these websites were hit was that they were claimed to be thin & thus a poor user experience. When the NYT bought About.com one of the top competing bidders was Google!

Now that the "thin content" has been demoted in the search results Google can integrate deep content silos from Google+, like this one:

That is an 8-word Google+ post about how short another blog post is. I like Todd & do like to read his writings, but here Google is clearly favoring the same sort of content they would have torched if it was done on an independent webmaster's website.

Google's rater guidelines judge websites that redirect traffic elsewhere based upon whether those sites offer a substantial value add. Clearly in the above example there was nothing added to the interaction beyond sharing a bookmark with a punchy tagline.

If Google wants to use the + notation to pull up that other referenced page then perhaps that can make sense, but to list an 8-word Google+ page in the search results nearly a year after the Panda algorithm is outrageous. This sort of casual mention integration in the search results occurs on expensive keywords as well. Not only do they list your own Google+ posts...

...but they also list them from anyone you follow...

In addition to information pollution, the other big issue here is time. Google wants to make forms more standardized to make filling them out faster & they give regular sermons on the importance of fast search results. Yet when I do a navigational search, Google delivers two AdWords ads, a huge Google+ promotion, and then the navigational search result barely above the fold.*

*Since I thought the above was obnoxious, I renamed our Google+ company page to S_E_O Book to help Google fix their relevancy problems.

Can anyone explain how Google's speed bias is aligned with putting plus junk right at the top, even on brand searches? Yahoo! has been pretty aggressive with putting shopping ads in the search results, but their implementation is still a better user experience than what Google did above.

And Bing offers an even cleaner experience than that.

Due to how Google integrates Google+ in such a parasitic way I see no incentive for participating on their network except when I have something that is outside of my domain of expertise, something that I am not targeting commercially, something that is thin, or something irrelevant to say! That incentive structure combined with Google's photo meme feature will ensure that content marketers will help plenty of people see Star Wars stuff ranking for mortgage loan search queries.

When you own search/navigation you own language. That position can easily be extended into any other direction/market in a way a social graph cannot:

"The only technology I?d rather own than Windows would be English," McNealy said. "All of those who use English would have to pay me a couple hundred dollars a year just for the right to speak English. And then I can charge you upgrades when I add new alphabet characters like ?n? and ?t.? It would be a wonderful business."

Further, Google can choose at any point to respond to or ignore market regulations in accordance with whatever makes them the most money. They can also fund 3rd parties doing the same (like undermining copyright) to force others to strike an official deal with Google to be "open."

A lot of businesses live on small profit margins, so Google's ability to insert itself & fund criminal 3rd parties aligned with Google's internal longterm interests is a big big big deal. Companies will learn that you either work with Google on Google's terms or you die.

When a public relations issue brews they can quickly change their approach and again position themselves as the white knight.

Brand Equity & Forcing the Brand Buy

Yahoo! put out a research paper highlighting activity bias, stating that the efficacy of online advertising is often overstated because people who see ads about a topic were already more closely tied in with that particular network & that particular topic before they even saw the ad. As an example, any person who sees an AdWords ad for hemorrhoid treatment was already searching for hemorrhoid-related topics before they saw your ad (thus they were in the subset of individuals that might have come across your site in some way whether you were in the search ad ecosystem or not).

This sort of activity bias-driven selection bias (homophily) exists on social networks online & offline.
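A toy example makes the bias easier to see: comparing exposed users against never-exposed users wildly overstates the ad's effect, while a randomized holdout isolates the incremental lift. All numbers below are made up for illustration; this is not from the Yahoo! paper.

```python
# Toy illustration of activity bias: people who were shown the ad were already
# more likely to convert, so comparing exposed vs. unexposed users overstates
# the ad's effect. A randomized holdout (control) group isolates the true lift.
# All numbers are hypothetical.

exposed_conversion_rate   = 0.050   # users who saw the ad
unexposed_conversion_rate = 0.010   # users who never saw it (a different population!)
holdout_conversion_rate   = 0.042   # eligible users randomly withheld from the ad

naive_lift = exposed_conversion_rate / unexposed_conversion_rate - 1
true_lift = exposed_conversion_rate / holdout_conversion_rate - 1

print(f"Naive exposed-vs-unexposed lift: {naive_lift:.0%}")  # 400%: mostly activity bias
print(f"Randomized incremental lift:     {true_lift:.0%}")   # ~19%: the ad's actual effect
```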

Google did research on incrementality of ads & they came to the opposite conclusion as Yahoo! did. Google suggested you should buy, buy, buy, even on your own branded keywords. They suggested that testing was expensive (no mention that the only reason it is expensive is because Google chooses not to make such tools easily accessible to advertisers) & that the clicks were so cheap on branded keywords that you should buy, buy, buy. Many advertisers who mix brand & non-brand keywords together don't realize that they are using the "returns" from bidding on their own brand to subsidize over-paying for other keywords.

Google Analytics is the leading & most widely used web analytics program. They can share whatever metrics help them sell more ads (defaulting to crediting the last click for conversions, even if it was on a navigational search to your site) & pull back on features that are not aligned with their business interests (SEO referral data anyone?)

This goes back to Scott McNealy's quote: "The only technology I'd rather own than Windows would be English. All of those who use English would have to pay me a couple hundred dollars a year just for the right to speak English. And then I can charge you upgrades when I add new alphabet characters like 'n' and 't.' It would be a wonderful business."

Analysts didn't understand why Google CPC rates were down 8% last quarter while overall search clicks were up 34%. The biggest single reason was likely more clicks on adlinks on branded AdWords ads. While a brand buying its own keyword typically pays far less per click than what some of the biggest keywords go for, the branded keywords typically have an exceptionally high CTR. Those additional clicks dragged down Google's average CPC, but the extra revenue they offered was a big part of the reason why Google was able to grow at 25% even though their display network only grew at 15%.
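As a rough back-of-the-envelope check (assuming, as a simplification, that both reported percentages apply to the same paid-click revenue base), the two figures are consistent with the reported growth:

```python
# If paid clicks rise 34% while average CPC falls 8%, paid-click revenue
# grows by roughly (1.34 * 0.92) - 1, i.e. about 23%, in the same ballpark
# as the ~25% overall revenue growth Google reported. This assumes both
# figures apply to the same revenue base, which is a simplification.

clicks_growth = 0.34
cpc_change = -0.08

revenue_growth = (1 + clicks_growth) * (1 + cpc_change) - 1
print(f"Implied paid-click revenue growth: {revenue_growth:.1%}")  # ~23.3%
```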

That slow growth of display is in spite of Youtube now serving over 4 billion video streams per day & Google adding display ads to log out pages.

Online views are not the same as TV views. A comScore study found that 31% of display ads are never seen. In spite of that, US online advertising will reach nearly $40 billion this year.

Google wants to insert itself as a needed cost of business in the same way credit card companies have.

On Google Maps they put an ad inside your location box.

Even if most people don't participate on Google+, Google can still force advertiser buy in through over-promotion of the network in the search results. On your branded keywords they may drive your organic listing below the fold & put Google+ front & center.

Facebook earnings are still growing much faster than Google's & Facebook encourages advertisers to advertise their Facebook pages, so even when you pay for the click Facebook still keeps the user. Facebook is adding apps to the timeline & is trying to win VEVO music video hosting from YouTube.

While Google is primarily known as a search company, it is getting harder to get off of Google through any channel other than a toll booth. Google keeps driving the organic search results downward, while Google verticals fill up many of the organic results that remain. Many companies already buy Google ads on their own YouTube content. Some buy ads on Google to drive them to their Youtube videos & then buy ads on their own Youtube video to promote their websites. Soon Google will try to push you to buy them on your Google+ page as well. Google is becoming a walled garden:

Google wants to control more elements of your social world now. They don't just want to be a search engine.

Is that so bad? Maybe not. It's certainly no different from how other companies, from AOL, to Microsoft, to Apple, to Disney, to Facebook, have viewed the world - as ideally a walled garden, an all-consuming platform that most people use for pretty much every form of entertainment and social interaction.

A lot of people thought that Google was somehow different. They were, of course, wrong.
...
To move forward either as the old Google or Google+, Google needs to be capable of making fair deals with the partner ecosystem. It needs to curb its instinct to kill competing media companies that were actually producing great content that Google helped you find.

I suspect there will be plenty of bloodshed before Google figures that one out.

"This is the path we?re headed down ? a single unified, ?beautiful? product across everything. If you don?t get that, then you should probably work somewhere else." - Larry Page

Google no longer believes in the concept of the open web. Blame it on Larry Page becoming the CEO, blame it on him talking to Steve Jobs & Steve telling him to make fewer and tighter products, blame it on Google funding eHow, or blame it on basically anything. But if you go back far enough, much of the stuff that is going on now was clearly envisioned a decade ago:

"I was lucky enough to chat with Larry one-to-one about his expectations for Google back in 2002. He laid out far-reaching views that had nothing to do with short-term revenue goals, but raised questions about how Google would anticipate the day sensors and memory became so cheap that individuals would record every moment of their lives. He wondered how Google could become like a better version of the RIAA - not just a mediator of digital music licensing - but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it. I would rather jump on board that bullet train than ride a local that never missed a revenue stop but never." - Douglas Edwards

What happens when the Google+ version of your content outranks the version on your own site? And what happens when your branded channel and/or your fans become a vertical ad silo Google sells to your competitors?

I tested submitting a couple of posts to Google+ about a Wordtracker top keywords list & valuable keywords (on a CPC*traffic basis). Those posts rank #2 or #3 in Google for many people who follow me. No harm to me since those posts were irrelevant to this site, but if they were about my theme & topic I just would have out-competed myself. When Google outranks you (even with a copy of your content) they get to taste the data again and sell off the attention another time. You only get a slice of that monetization, even when it is your work that is being monetized. Maybe it is great for stuff that is somewhat less relevant and/or keywords that are so competitive that you otherwise wouldn't score for them, but we have to be really careful we don't out-compete ourselves. Though if Google keeps this up they won't be the only ones monetizing it. Give it a few months and celebrities will be selling sponsored Google+ posts based on some metric created by multiplying search volume, CPC & how many followers they have.

Is Bing Better? Will Enough People Ask That Question to Matter?

For years Google built their reputation as being the search engine that offered the cleanest & fastest search results. They were known for monetizing less aggressively than the competition. But over the past couple years Google has dialed up their ads to where they now send a greater ratio of ad traffic than organic search traffic. One Google engineer recently described the ability to rank highly in Google without buying their ads as being a bug that was getting fixed!

Google's big risk in their coupling of aggressive monetization, aggressive self-promotion & changing how users feel about user privacy is that they can create the perception that users should go elsewhere for an honest or trustworthy search. This not only builds momentum for smaller search services like DuckDuckGo & Blekko, but has also won praise for Bing from Gizmodo, Dave Winer & The Next Web.


Source: http://www.seobook.com/focus


YouTube Creator Playbook Version 2 Indicates Two-Dimensional Thinking

YouTube has updated the YouTube Creator Playbook, which compiles important tips, best practices, and strategies to build greater audiences on YouTube. While content and audience are covered smartly, one dimension is missing: video advertising.

Source: http://feeds.searchenginewatch.com/~r/sew/~3/6XH3Bfyu5uk/YouTube-Creator-Playbook-Version-2-Indicates-Two-Dimensional-Thinking


Wednesday, February 29, 2012

The 10-5-10 social media routine: You must matter

These are the annotated slides from my presentation at SEMPDX Searchfest today. I talked about social media, having a productive routine and building a powerful campaign without sucking up 4 hours a day: Your daily social media routine – stuff that matters…

Source: http://feedproxy.google.com/~r/conversationmarketing/MRJI/~3/7duZxB80tOg/the-10-5-10-social-media-routine-you-must-matter.htm


How Mobile, Social & Trust Are Shaping Local Search Usage [Study]

Online local listings are the most trusted and relevant results to access local business information, especially as consumers are adopting new technologies, applications and social networks, according to a new study from comScore and Localeze.

Source: http://feeds.searchenginewatch.com/~r/sewblog/~3/mOSuyoaN4GE/How-Mobile-Social-Trust-Are-Shaping-Local-Search-Usage-Study


Non-Google Link Strategy: An Example of Stealth Link Marketing

Link opportunities are everywhere. You just have to stop thinking about Google in order to see them. It isn't easy. Yes, Google is important, but nothing is better than doing significant business regardless of which way the Google winds blow.

Source: http://feeds.searchenginewatch.com/~r/sewblog/~3/1BCGuQVZTvg/Non-Google-Link-Strategy-An-Example-of-Stealth-Link-Marketing


Website Auditor Review: A Full-Featured On-Page Optimization Tool


Website Auditor is one of the 4 tools found in Link-Assistant's SEO Power Suite. Website Auditor is Link-Assistant's on-page optimization tool.

We recently reviewed 2 of their other tools, SEO Spyglass and Rank Tracker. You can check out the review of SEO Spyglass here and Rank Tracker here.

What Does Website Auditor Do?

Website Auditor crawls your entire site (or any site you want to research) and gives you a variety of on-page SEO data points to help you analyze the site you are researching.

We are reviewing the Enterprise version here; some options may not be available if you are using the Professional version.

In order to give you a thorough overview of a tool we think it's best to look at all the options available. You can compare versions here.

Getting Started with Website Auditor

To get started, just enter the URL of the site you want to research:

website-auditor-enter-url

I always like to enable the expert options so I can see everything available to me. The next step is to select the "page ranking factors":

wa-select-page-factors

Here, you have the ability to get the following data points from the tool on a per-page basis:

  • HTTP status codes
  • Page titles, meta descriptions, meta keywords
  • Total links on the page
  • Links on the page to external sites
  • Robots.Txt instructions
  • W3C validation errors
  • CSS validation errors
  • Any canonical URLs associated with the page
  • HTML Code Size
  • Links on the page with the no-follow attribute
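As an illustration of the kind of per-page data these options produce, here is a minimal sketch of gathering a few of the points listed above for a single URL. This is not Website Auditor's code; it is a hypothetical example that relies on the third-party requests and beautifulsoup4 packages, and the example.com URL is just a placeholder.

```python
# A minimal sketch (not Website Auditor's implementation) of gathering a few
# per-page data points: HTTP status, title, meta description, total links,
# external links, nofollow links, and HTML size.
# Requires the third-party `requests` and `beautifulsoup4` packages.

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> dict:
    response = requests.get(url, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    page_host = urlparse(url).netloc

    links = soup.find_all("a", href=True)
    external = [a for a in links
                if urlparse(urljoin(url, a["href"])).netloc not in ("", page_host)]
    nofollow = [a for a in links if "nofollow" in a.get("rel", [])]
    description = soup.find("meta", attrs={"name": "description"})

    return {
        "url": url,
        "status_code": response.status_code,
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "meta_description": description.get("content") if description else None,
        "total_links": len(links),
        "external_links": len(external),
        "nofollow_links": len(nofollow),
        "html_size_bytes": len(response.content),
    }

if __name__ == "__main__":
    print(audit_page("https://example.com/"))
```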

Your next option is to select the crawl depth. For deep analysis you can certainly select no crawl limit and click the option to find unlinked-to pages in the index.

wa-step-3

If you want to go nuts with the crawl depth frequently, I'd suggest looking into a VPS to house the application so you can run it remotely. Deep, deep crawls can take quite a while.

I know HostGator's VPS plans as well as a Rackspace Cloud Server can be used with this and I'm sure most VPS hosting options will allow for this as well.

I'm just going to run 2 clicks deep here for demonstration purposes.

Next up is filtering options. Maybe you only want to crawl a certain section or sections of a site. For example, maybe I'm just interested in the auto insurance section of the Geico site for competitive research purposes.

Also, for e-commerce sites (or any site for that matter) you may want to exclude certain parameters in the URL to avoid mucked-up results. There is also an option (see below) where you can have Website Auditor treat pages that are similar but might have odd parameters as the same page.
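Conceptually, those two filtering options boil down to keeping only URLs under a chosen section and stripping noisy query parameters so near-duplicate URLs collapse to one page. The sketch below is a hypothetical illustration of that idea, not Website Auditor's implementation; the section prefix and the ignored parameter names are invented.

```python
# A hypothetical illustration of crawl filtering: keep only URLs under a
# chosen section and strip noisy query parameters so near-duplicate URLs
# collapse to one page. Not Website Auditor's code; the prefix and the
# parameter list are made up for the example.

from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

SECTION_PREFIX = "/quote/"                      # e.g. only crawl the quote section
IGNORED_PARAMS = {"utm_source", "utm_medium", "sessionid", "sort"}

def in_section(url: str) -> bool:
    return urlparse(url).path.startswith(SECTION_PREFIX)

def canonicalize(url: str) -> str:
    """Drop ignored query parameters so similar URLs count as the same page."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "https://example.com/quote/boat?utm_source=ad&color=blue",
    "https://example.com/quote/boat?color=blue",
    "https://example.com/blog/post-1",
]
seen = {canonicalize(u) for u in urls if in_section(u)}
print(seen)  # both boat-quote URLs collapse to one entry; the blog post is filtered out
```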

Another option I like to use is pulling up just the blog section of a site to look for popular posts link-wise and social media wise. Whatever you want to do in this respect, you do it here:

wa-step-4-filtering-options

So here, I'm including all the normal file extensions and extension-less files in the report and I'm looking for all the stuff under their quote section (as I'm researching the insurance quote market).

The upfront filtering is one of my favorite features because I exclude unnecessary pages from the crawl and only get exactly what I'm looking for, quickly. Now, click next and the report starts:

wa-step-5-searching

Working With the Results

Another thing I like about Link-Assistant products is the familiar interface between all 4 of their products. If you saw our other reviews, you are familiar with the results pane below.

Before that, Website Auditor will ask you about getting more factors. When I do the initial crawl I do not include stuff that will cause captchas or require proxies, like cache dates and PR. But here, you can update and add more factors if you wish:

wa-more-factors

Once you click that, you are brought to the settings page and given the option to add more factors. I've specifically highlighted the social ones:

wa-social-factors

I'll skip these for now and go back to the initial results section. This displays your initial results and I've also highlighted all the available options with colored arrows:

wa-results-pane-large

Your arrow legend is as follows:

  • Orange - You can save the current project or all projects, start a new project, close the project, or open another project
  • Green - you can build a white-labeled Optimization report (with crawl, domain, link, and popularity metrics plugged in), analyze a single page for on-page optimization, update a workspace or selected pages or the entire project for selected factors, rebuild the report with the same pages but different factors, or create an XML sitemap for selected webpages.
  • Yellow - Search for specific words inside the report (I use this for narrowing down to a topic)
  • Red - Create and update Workspaces to customize the results view
  • Purple - Flip between the results pane, the white-label report, or with specific webpages for metric updates

Workspaces for Customizing Results

The Workspaces tab allows you to edit current Workspaces (add/remove metrics) or create new ones that you can name whatever you want and which will show up in the Workspaces drop-down:

wa-workspaces

Simply click on the Workspaces icon to get to the Workspaces preference option:

wa-workspaces-options

You can create new workspaces, edit or remove old ones, and also set specific filtering conditions relative to the metrics available to you:

wa-eric-workspace

Spending some time upfront playing around with the Workspace options can save you loads of time on the backend with respect to drilling down to either specific page types, specific metrics, or a combination of both.

Analyzing a Page

When you go to export a Website Auditor file (you can also just control/command + a to select everything in the results pane and copy/paste to a spreadsheet) you'll see 2 options:

  • Page Ranking Factors (the data in the results pane)
  • Page Content Data

You can analyze a page's content (or multiple pages at once) for on-page optimization factors relative to a keyword you select.

There are 2 ways you can do this. You can highlight a page in the Workspace, right click and select analyze page content. Or, you can click on the Webpages button above the filter box then click the Analyze button in the upper left. Here is the dialog box for the second option:

wa-analyze-page-content

The items with the red X's next to them denote which pages can be analyzed (the pages just need to have content; often you see duplicates for /page and /page/).

So, wanting to see how the boat page looks, I highlight it and click next to get to the area where you can enter your keywords:

wa-keywords-content-analysis

Enter the keywords you want to evaluate the page against (I entered boat insurance and boat insurance quotes) then select what engine you want to evaluate the page against (this pulls competition data in from the selected engine).

wa-choose-engines

The results pane here shows you a variety of options related to the keywords you entered and the page you selected:

wa-analysis-results

You have the option to view the results by a single keyword (insurance) or multi-word keywords (boat insurance) or both. Usually I'm looking at multi-word keyphrases so that's what I typically select and the report tells you the percentage the keyword makes up of a specific on-page factor.

The on-page factors are:

  • Total page copy
  • Body
  • Title tag, meta description, and meta keywords
  • H1 and H2-H6 (H2-H6 are grouped)
  • Link anchor text
  • % in bold and in italics
  • Image text

Website Auditor takes all that to spit out a custom Score metric which is meant to illustrate what keyword is most prominent, on average, across the board.
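As a rough sketch of how a percentage-per-factor report could roll up into a single prominence score, consider the example below. The factor weights are invented for illustration; Website Auditor does not publish its actual Score formula, so this is only a conceptual analogue.

```python
# A rough sketch of computing keyword share per on-page factor and rolling it
# into one prominence score. The factor weights are invented; this is NOT
# Website Auditor's actual Score formula.

FACTOR_WEIGHTS = {"title": 3.0, "h1": 2.0, "body": 1.0, "anchor_text": 1.5}

def keyword_share(text: str, keyword: str) -> float:
    """Fraction of the words in `text` accounted for by occurrences of `keyword`."""
    words = text.lower().split()
    kw_words = keyword.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for i in range(len(words) - len(kw_words) + 1)
               if words[i:i + len(kw_words)] == kw_words)
    return hits * len(kw_words) / len(words)

def prominence_score(factors: dict, keyword: str) -> float:
    return sum(weight * keyword_share(factors.get(name, ""), keyword)
               for name, weight in FACTOR_WEIGHTS.items())

page = {
    "title": "Boat Insurance Quotes - Example Insurer",
    "h1": "Boat insurance",
    "body": "Get a boat insurance quote online in minutes from our example insurer",
    "anchor_text": "boat insurance quotes",
}
print(round(prominence_score(page, "boat insurance"), 3))
```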

You can create a white-label report off of this as well, in addition to being able to export the data the same way as the Page Factor data described above (CSV, HTML, XML, SQL, Cut and Paste).

Custom Settings and Reports

You have the option to set both global and per project preferences inside of Website Auditor.

Per Project Preferences:

  • Customer information for the reports
  • Search filters (extensions, words/characters in the URL, etc)
  • Customizing Workspace defaults for the Website reports and the Web page report
  • Setting up custom tags
  • Selecting default Page Ranking Factors
  • Setting up Domain factors (which appear on the report) like social metrics, traffic metrics from Compete and Alexa, age and IP, and factors similar to the Page Factors but for the domain
  • XML publishing information

Your Global preferences cover all the application specific stuff like:

  • Proxy settings
  • Emulation settings and Captcha settings
  • Company information for reports
  • Preferred search engines and API keys
  • Scheduling
  • Publishing options (ftp, email, html, etc)

Website Auditor also offers detailed reporting options (all of which can be customized in the Preferences area of the application). You can get customized reports for both Page Factor metrics and Page Content Metrics.

I would like to see them improve the reporting access a bit. The reports look nice and are helpful, but customizing the text or inputting your own narratives is accessed via a somewhat arcane dialog box, which makes it hard to fix things if you screw up the code.

Give Website Auditor a Try

There are other desktop on-page/crawling tools on the market and some of them are quite good. I like some of the features inside of Website Auditor (report outputting, custom crawl parameters, social aspects) enough to continue using it in 2012.

I've asked for clarification on this but I believe their Live Plan (which you get free for the first 6 months) must be renewed in order for the application to interact with a search engine.

I do hope they consider changing that. I understand that some features won't work once a search engine changes something, and that is worthy of a charge, but tasks like pulling a ranking report or executing a site crawl shouldn't be lumped in with that.

Nonetheless, I would still recommend Website Auditor as it's a good product and the support is solid, but I think it's important to understand the pricing upfront. You can find pricing details here for both their product fees and their Live Plan fees.

Source: http://www.seobook.com/website-auditor-review-full-featured-page-optimization-tool
