Traffick - The Business of Search Engines & Web Portals

Archive for July, 2011

Yahoo Adobe Research In Motion -Zilla

Tuesday, July 19th, 2011

Every so often I have a crazy thought like this.

It usually comes when a player in this digital space gets so far ahead of everyone else that the others seem to stop trying. And paradoxically, “trying” can mean admitting your limitations and moving into new fields by giving up your independence and merging with someone else, or at the very least partnering.

Google’s getting pretty far ahead. Sure, I’ll use their browser. But is it so hard for these other guys to at least *try*?

Microsoft let Google Analytics take over the world by shutting down its own analytics project. Sigh.

Lately, when I start using an application, I tend to check which browser I’m using. I think “I might as well use IE for this one, since Firefox crashed the last few times I tried this adventure on a site like this.” Or: “Google stuff. Better use Chrome.” And then I think, well at least someone *has* a browser of their own. Yahoo never did. Although some Canadians think they did, since there was a Rogers-or-another-ISP version of Yahoo browser which was probably IE but I don’t really remember.

So anyway, why don’t these guys do something like the following.

* Yahoo needs a browser and someday an OS, just to feel techie again. Why doesn’t Yahoo “acquire” Firefox somehow, in spite of Mozilla’s apparent belief that it can carry on forever on its own? Failing that, buy another remaining browser company and get developing. I mean, both eBay and Microsoft bought Skype.

* Research In Motion is suddenly in such deep trouble (peering into the future, with Apple and Android mopping the floor with them) that it almost makes sense for them to give up and offer Android phones to grow market share. Why not? Think outside the box, guys. Failing that, they need to lock in distribution and just plain figure out a way to gain allies in the world. Why not merge with Yahoo, with the combined entity selling off the spare parts and keeping what makes sense?

* Adobe seems to be a successful company that likes RIM and hates Apple. That merger-and-gut could take place a year after the previous one.

The combined company needs great relationships with wireless carriers, etc.; it needs great browser(s) and OS(‘s); and it should take any cash it has after the dust settles, and any debt it can incur, and start buying up fiber and investing in data capacity, like Google did.

The web search technology might be something called Yahoo Search that gets resuscitated and developed, or a better Bing might do.

For a social network, they’d have to build one of their own too, like Google did, since Facebook’s IPO price likely values them way too high. So to facilitate something like that, probably the new YARMzilla entity would need to do something like acquire Twitter, which may come available cheap as its weak finances begin to sink in.

For local search, that’s easy. I still think Yahoo (YARMzilla) should acquire Yelp, given their failure to do a deal with Google.

The hot IPO market is making a mess of transactions like this, of course (Pandora, LinkedIn, Groupon, crazy valuations), which is why larger-market-cap companies like RIM need to get involved and use cash to help with acquisitions.

Ready, set… GO!

Google+ and SEO

Thursday, July 14th, 2011

Each generation and branch of SEO tacticians, like any established order in any industry, has built a set of assumptions and — even worse for their ability to remain strategically nimble — developed sunk costs in infrastructure best suited to a fixed (past) era… in this case, a past era in search engine ranking algorithms.

What increasingly matters (whoops, has always mattered) is whether (a) your content is relevant and high quality, and (b) it is popular and/or authoritative. Search engines attempt to assess these raw qualities in different ways in different eras, and they move the goalposts when the nature of information consumption and sharing changes, and when spammers catch up with their measurement techniques.

There are many great ways to sum this up; to illustrate, for common-sense purposes, the difference between what search engines actually measure at any given time and what they are trying to capture for user benefit. But perhaps one of the most succinct is Hugh MacLeod’s notion of social objects. If you’re shouting about the benefits of Maxwell House coffee (yawn), you’ll eventually get through if you spend enough. But if coffee enthusiasts are really discussing coffee and really helping one another — as they do on this thing called the Internet — surely there’s a relevancy algorithm waiting to happen to that process of relatively spontaneous buzz. If the right community of people are retweeting Richard Florida’s tweets about a certain article about urban transportation, that helps us understand more about the value of the article itself, but also about the trust patterns and interest patterns within the community. It should also help us understand which publications and authors themselves are reputable. With rel=author and other mechanisms, search engines will have more and more available cues so that we don’t have to sift through counterfeit crapola. It’s a long-term battle, but one that each generation of spammers will lose after their initial successes.

There’s no question that this is what Google PageRank (and other search engines, like Teoma) was already attempting to tap into. It’s just that the proxy for community interest and heartfelt recommendation — the backlink structure of the whole web — is outdated and endlessly gameable today.

SEO tacticians have invested more than anything else in the infrastructure around backlinks, because “Google PageRank” originally centered around the authority conferred on a website or page by the authority and volume of links pointing to them… along with anchor text and other relevancy factors to attempt to match the aboutness of a page with the aboutness/intent of the user’s query.

With every passing year, traditional link signals become less useful. Backlink-obsessed SEO tacticians (including the companies that spider the whole web and give you ponderous information about your site’s linkage skills) have had a longer-than-usual heyday precisely because Google itself has had so many sunk costs invested in its link-centric ranking paradigm that it has been slow to wind down this archaic methodology. One key problem was that it didn’t have enough to replace it with yet. And part of the reason for that was that “Google didn’t get social,” so it was reliant on deals with companies like Twitter so it could assess social signals. It then began acquiring small reputation management companies like PostRank. Facebook, of course, won’t play ball.

The biggest change in Google’s ability to understand sharing behavior just happened. They launched Google+.

Social signals and user clickstreams are two emerging types of signals search engines look at to help with ranking. Google will soon have more social and sharing signals than anyone but Facebook.

The environment for SEO is going to change radically. Google won’t be afraid to shift their ranking algorithms more dramatically in the future — away from their old backlinks analysis — because they’re no longer impressed by their own sunk costs. Google’s costs and revenues both keep rising dramatically, and in Google+ itself, Google has a huge new sunk cost. Think they won’t mine that data to make search work better? Think the SEO world is only going to change incrementally, and it will be more or less business as usual? Think again.
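To make the shift concrete, here is a toy sketch (the weights, signal names, and scores are all invented for illustration, not any real Google formula) of how a ranker might blend classic backlink authority with social-sharing signals. As the social weight rises, heavily shared pages overtake pages that are strong only on links — which is the kind of dramatic re-weighting argued for above.

```python
# Toy illustration only: invented weights and signals, not a real
# ranking algorithm. Both inputs are assumed normalized to [0, 1].
def toy_score(backlink_authority, social_shares, social_weight=0.4):
    """Blend a classic link signal with a social-sharing signal."""
    link_part = (1 - social_weight) * backlink_authority
    social_part = social_weight * social_shares
    return link_part + social_part

# A page strong on backlinks but rarely shared, vs. a page with
# modest links that gets shared heavily.
old_guard = toy_score(0.9, 0.1)   # wins while links dominate
shared_hit = toy_score(0.3, 0.9)  # wins once social_weight rises
```

Nothing about the numbers matters; the point is that once the platform controls a rich sharing signal of its own, turning the `social_weight` knob up is cheap.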

When a big company comes to me and asks a 2007 question like “if we pay you guys $2,000 a month for a few months, how many quality backlinks do you think we can get for that?,” it makes me want to cry. It would be a fair question if you could get quality backlinks (incremental to your company’s already huge online footprint) by simply contacting a few people, as we used to do in 2001. Admittedly, I am a rink rat who grew up playing shinny for 9 hours straight without lunch on the outdoor rinks of Ottawa; later, inhaling the Zamboni fumes in suburban Toronto. Maybe I’m still high on the fumes, which is odd given that it’s July. But when it comes to SEO, I advise companies to head where the puck is going… not where it was years ago.

A talented outside agency can assist with that process, of course. But maybe you don’t need “SEO”.

Reflections on ‘In the Plex’

Monday, July 11th, 2011

It’s always a bit tough to digest the latest Important Book on Google when you work in the industry. The concern is that you’ll be forced to rehearse and rehash warmed-over stories. And at times, these books make you want to exclaim: “Do you believe everything they tell you?”

Fortunately, Steven Levy has created such a rich, engaging portrait of Google’s development that these concerns melt away in the first chapters of In the Plex: How Google Thinks, Works, and Shapes Our Lives. Having had the time to dig fully into it, here are some of the themes that really stood out for me:

  • Entrepreneurs-cum-pan-technologists like Page and Brin have a lot to teach other prospective entrepreneurs. It’s clear that in balancing outside input and media noise with a drive to create superior, groundbreaking products that elegantly solve problems, the latter needs to win out. Many of us might be distracted by the slightest little criticism, and go ’round in circles with non-fans, hoping to win them over, or worse, hear their naysaying point of view! New products and new ways of showing advertising will be inherently unfamiliar to the masses. Many objections will be of the knee-jerk variety. While Page and Brin both had exacting performance standards and were extremely critical of products until they were ready for prime time, when it came to buying into unusual concepts, they waved broader objections away (such as the notion that ads in GMail would be ‘creepy’) without so much as a thought.
  • That skill must get easier to master over time, mainly because many outsiders are not well-meaning, but either competitors, protectors of the status quo, or in many cases, idiots. Proposed California legislation against targeted advertising in email came from a Democrat who seemed to get randomly up in arms over any type of commercial offer, online or offline. It took some serious lobbying, and a meeting at the Ritz-Carlton with none other than Google Special Adviser Al Gore (plus post-it notes, hand puppets, etc.) to convince the legislator to water down the bill some. It never became law.
  • One error several recent Google book authors — Levy included — seem to drift into, is to credit Google with trends that are really inherent to the Internet, where Google’s solution came along and got adopted by many, but not *first*. Case in point being the concept of using “cloud” based email to take care of all your messages and to change one’s whole way of working. Personally, I moved exclusively to web-based email through Yahoo (down to having it manage all of my other email accounts) somewhere around 2002. I did switch to GMail early on, when it proved better. The point is that Hotmail and Yahoo Mail were much earlier, and Google soft-innovated and improved on them. But it did not invent the category, nor even pioneer the relevant user behavior. But because competitors like Microsoft and Yahoo had no momentum and even less cool factor, no one writes books about them. History is written by the victors, as always.
  • In “In the ‘Plex,” I learned that one of the early architects of the AdWords auction was an engineer from Sarnia, Ontario (where I lived for three years, and where my sister was born). The takeaway seems to be that if you can survive the toxic soils of Sarnia, you are at least a very special person, and possibly, superintelligent. There are quite a few Canadians lurking in the Google honor roll… so many, in fact, it’s difficult to keep track!
  • Back to GMail. Many people are aware of the bold move Google made to add virtually unlimited storage, based on an insight about Moore’s Law that should really have been available to anyone. But few would have known that Paul Buchheit wanted to build the application in JavaScript, just because it was a challenge that might have a payoff if it worked well in the real world. This was seen as odd at the time, coming as it did before the widespread popularity of AJAX, a technology that greatly speeds up workflow for online applications (making them work more like desktop apps). In the Plex is chock full of stories like this: little nuggets that seem so familiar because they now form part of how we work and live our everyday lives. Without the freedom to come up with innovations like this, Google — and our daily working lives — simply wouldn’t be what they are today. This is why Buchheit still feels that Google is Awesome, despite moving on to other ventures.
  • Indeed, there are a couple hundred compelling, for-the-ages business stories lurking in the ‘Plex (literally the Plex, not just the book). To me, many of them are far more riveting than comparable yarns you’ll read about GE, Enron, or whatnot (on those, enjoy Christopher Byron, Testosterone, Inc.) Undoubtedly, that is because Google has created something — many things — of lasting value, to scale, in record time.

Display Advertising’s Sea Change: It’s About Audiences, Not Publications

Sunday, July 3rd, 2011

2011 is the Year of Remarketing, thanks in large part to Google validating and popularizing the space, and creating a stable and easy-to-interpret platform for showing banner ads to members of a set audience online. By “audience,” we’re talking about a highly targeted group of people who have “raised their hands” in some way — often by stopping by your website (but it could be extended to things like viewing your YouTube video anywhere on the web, including as shared inside Facebook). That’s not “permission” in the classic sense of “opting in” to receive targeted messages, but it works the same way. We also mean an audience the advertiser themselves builds and maintains with the aid of the ad platform provider — not an audience built and sold by the publisher with the claim “well, these people like our type of content on vacation properties, so you ought to expect them to like your vacation-properties ads.”

The difference in mindset shifts from *where* can I place ads online that will be likely to convert well for my type of prospective customers, to *who* is already much more likely to convert based on what I know about them (a past customer, has visited a lot of volleyball related websites in the past year, etc.). If that user is cookied, and I know a fair bit about them (anonymously, but with pertinent behavioral information), then I don’t need to care all that much what they’re doing or what site they’re on. I can show them ads pretty much anywhere, and measure the results of that type of advertising.

How this works “on the ground” is interesting; Google advertisers, in particular, are going to have to unlearn some old habits. Many Google advertisers had a built-in suspicion of the so-called “content network,” so were hyper-vigilant about excluding certain “irrelevant” publishers from showing their ads. When “managed placements” became available, many advertisers were all over this, assuming that a hand-built portfolio of “relevant websites” would be the safest and best targeting method. Sometimes it was, but that doesn’t always scale.

So now, let’s say you’re poring through your stats for a Remarketing campaign. You’re getting good results for an audience you’ve defined: “people who have made a purchase from your site in the past 180 days.” Turns out, repeat purchasing is much more likely if you target those folks — duh — with ads. To be less annoying, you could create a custom combination that avoided showing ads to people who just bought less than a month ago. It’s getting pretty sophisticated, right? Sneakily so, as it looks simple on the surface… but the targeting ability is improving markedly.
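The logic behind such a custom combination is essentially a set difference: everyone who purchased in the past 180 days, minus anyone who purchased in the past 30 days. A rough sketch (the cookie IDs, dates, and `audience` helper are invented for illustration, not any ad platform’s API):

```python
from datetime import date, timedelta

today = date(2011, 7, 3)

# Toy data: cookie ID -> date of most recent purchase (all invented).
last_purchase = {
    "cookie-roadrunner": date(2011, 2, 14),
    "cookie-coyote": date(2011, 6, 20),
    "cookie-acme": date(2010, 9, 1),
}

def audience(max_age_days):
    """Cookies whose last purchase falls within the past max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    return {c for c, d in last_purchase.items() if d >= cutoff}

purchased_180d = audience(180)   # base remarketing list
purchased_30d = audience(30)     # recent buyers, excluded as too fresh
custom_combination = purchased_180d - purchased_30d
```

Here the cookie that bought in February stays in the target set, the one that bought two weeks ago is suppressed, and the one from last September has aged out of the list entirely.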

Now you look at what publications your ad converted on. And lo and behold, they aren’t targeted. Our target customer — let’s call him Roadrunner — is going about his day on a whole bunch of random websites he likes, and is seeing your ad for Delicious Bird Feed. The Roadrunner buys because the Roadrunner realized he needed a refill of the delicious bird feed. The “relevance” of the content — say, the daily news website for Iron Mountain, MI — is, well, not relevant to the Roadrunner’s decision to purchase that day.

So you want to make sure that some newbie doesn’t get into your advertising account and start excluding websites in the display network willy-nilly using the “eyeball method” (the site doesn’t “look” relevant).

Of course, another way to describe the shift is that we’re heading towards an era of “behavioral targeting”. To date, this has been in a building phase, but it’s set to take off.

Out in the wild court of public opinion, debate will continue about these ads that “seem to follow you around.” Yes, it can be creepy. And yes, the main pushback against this superior targeting will be around privacy policies. There is going to need to be a multi-stakeholder dialogue on how best to strike a balance here.

With the advent of new Google initiatives like Interest Categories (*not* similar to Facebook’s, but audience-based) and of course, Google+, we’re hurtling quickly into the era of the audience… not audience as a publisher defines it, but rather, how advertisers and users (more precisely) define it in a tighter (but mediated) ecosystem.

Markets as conversations, anyone?


© 1999 - 2013, All Rights Reserved