Traffick - The Business of Search Engines & Web Portals

Archive for August, 2010

Canadian Real Estate Brokerages to Challenge MLS Site?

Tuesday, August 31st, 2010

In the U.S., real estate search sites like Zillow have attracted enormous user bases and record venture capital funding rounds. This was an idea whose time had come, given widespread consumer distaste for having real estate sales information controlled by the Multiple Listing Service, a far cry from the Internet’s open ethos.

Not so in Canada. Canadians have tended to be stuck with Realtor.ca (also accessible at MLS.ca). Some upstarts have come along, but they’ve faced resistance, and legal challenges over their habit of scraping listing information. The best known is Zoocasa, a real estate search site backed by funding from Rogers Ventures.

Now, the nation’s three largest real estate brokerages may be launching a “challenge from within.” They’re bent on creating a branded website that offers better functionality than the old MLS site, but maintains some of the data integrity and conservatism they charge is lacking on “syndicated information” sites. Representatives from Royal LePage, Century 21, and Re/Max are meeting today, according to the Globe and Mail.

A quick peek at traffic rankings on Alexa indicates that the brokerages have little to fear from Zoocasa, which has failed to gain major user traction. Meanwhile, the MLS.ca site continues to lead the pack in its vertical by a considerable margin. The launch of a new broker-driven branded site, therefore, has to be mainly a challenge to MLS.ca; consumers frequently complain about its outdated approach and clunkiness.

Can this possibly work? Won’t consumers be limited to listings from just three major real estate companies, and therefore still dependent on other sources of information? This branded site could be squeezed on the “open, universal” side by Zoocasa, and on the “closed, nearly universal” side by MLS.

Still, the new site could have unique advantages. First, it would pool the resources of three major national real estate companies that were going to go toe-to-toe anyway, each with its own branded website. Collectively, they’ll provide original, canonical content, so they may fare better in search engines than the growing legions of de facto scraper sites (which search engines don’t like).

It seems like a uniquely Canadian sort of compromise. It remains to be seen what the outcome will look like, but promoting the website could be a costly venture. Reaching “brand recall” status is key for a real estate site. A cursory look at the results for many real estate queries shows Zillow doing fine (first-page listings) but by no means dominating the page or consistently reaching the #1 or #2 spots. Zillow could not survive on search discoverability alone; many consumers have simply heard of it through frequent press coverage and partnerships.

It does not seem, in any case, that Zillow and the like have much interest in entering the Canadian market. So for now, there will be plenty of room in the search results for new-generation ways of providing real estate information.

The Google AdWords ‘Tourist Tax’

Monday, August 30th, 2010

On a recent trip to Italy, I was struck by the pricing gap between “insider deals” you might get if you were a local, and the prices paid by harried newbies who just got off the plane.

It’s price discrimination against the “rich,” or anyone who is new to the scene.

It extends to everything from real estate to pizza slices. And of course, there are probably two sets of grocery stores in some towns. The “tourist” grocery store carries only a limited selection of marked-up wine.

Some towns have free beaches. In “touristy” towns, you mostly have to pay. The water-clarity-and-bikini-density quotient is roughly the same at both, so you figure most people who live there don’t pay for beach space.

Some time ago, Google figured out the “tourist tax” phenomenon, presumably to the benefit of its profit margins. Some have called it a “bozo filter,” but that’s not fair. Tourists aren’t stupid, they’re just trapped, vulnerable, and in a bit too much of a hurry.

Long-standing, loyal advertisers would be unimpressed by arbitrary measures to ratchet up click prices, especially when they’ve invested heavily in optimizing their accounts, so that relationship settles into a relatively happy equilibrium. Patient advertisers who are willing to build up their “local knowledge” over time can look forward to bargain rates. It’s important to stress, of course, that they’ve optimized a marketing channel over time, and that’s the primary reason their ROI improves as time progresses. The “tourist tax” I refer to is specifically the initial hump (lasting anywhere from a week to two months), which is particularly daunting for the newbie.

Many new advertisers are unsettled by the initial response to the first draft of their campaigns. There are some common outcomes:

  • The advertiser is deterred by the high apparent costs of clicks, or other “scary” metrics like low quality scores, and pauses the campaign (leaves AdWords Nation on the next flight out).
  • The advertiser is unsettled or even freaked out by their inability to make things work, and tries to fix things all at once. This might include taking dogmatic views of matching options, cranking bids up and down, etc. No progress occurs, so eventually they have to leave as well.
  • The advertiser exhibits more patience, but misses a number of key optimization techniques, and buys into subtle cues in the interface (such as quality scores, “first page bids,” the bid simulator, and “impression share” reports) that make it seem like paying more is almost a requirement. These advertisers may show slight improvement over time but are accepting worse ROI than someone who is relentless in optimization, and who doesn’t take certain click price ranges as “inevitable.”

Seasoned advertisers, on the other hand, exhibit five key traits:

  • Much like natives of a “costly” country, they get out of the high-price district as quickly as they can. There are ways to accelerate progress in the short term, so why not compress some of that activity into the early going and escape the weird “new account spiral,” a negative feedback loop that feeds you low volume and bad metrics? Some “smart kickoff” activities might include: leaving the optimize setting on for ads at first; starting with granular, highly relevant terms and going broad later; bidding high (but not too high) for one to two days to build up data on particularly relevant parts of the account; enabling content targeting and optimizing that as well; taking care to walk bids down crisply as progress is made, so as not to exhibit too much irrational demand for your keywords (a rough sketch of one such walk-down appears after this list); and more.
  • They gain organizational buy-in for paid search, setting reasonable expectations that an initial “tourist tax” will be required, and that the days of 10-cent clicks are long gone. You don’t just set up a big campaign and “lowball” at 10 cents, hoping to cherry-pick the odd click. Chances are such a campaign will just sit there.
  • They plan to stay the course and run campaigns based on stable brand names and relatively consistent campaign structures, understanding that one factor Google looks at in quality score is the performance of the display URL.
  • They manage accounts over a long period of time, with one eye on ROI and one eye on quality score friendliness, and enjoy the “green light” advantage of an established account, which makes new campaigns and ad groups more likely to gain positive momentum from the outset.
  • They learn to tell the difference between helpful and boilerplate Google advice.
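For the “walk bids down crisply” tactic mentioned in the first item above, here is a minimal sketch of what such a rule might look like. It is purely illustrative: the thresholds, step size, and keyword stats are invented, and this is not the AdWords API or any official Google tooling.

    # A rough, generic sketch of the "bid high early, then walk down" kickoff
    # tactic. Thresholds, step size, and keyword stats are invented for
    # illustration; this is not the AdWords API or an official Google tool.

    KICKOFF_BID = 2.00   # aggressive opening bid, to build data quickly
    FLOOR_BID = 0.40     # never walk below this
    STEP = 0.85          # cut the bid by 15% at each walk-down step

    def next_bid(current_bid, clicks, conversions):
        # Hold steady until there's enough data; then walk the bid down
        # crisply, or drop to the floor if the keyword just doesn't convert.
        if clicks < 30:
            return current_bid
        if conversions == 0:
            return FLOOR_BID
        return max(FLOOR_BID, round(current_bid * STEP, 2))

    # Simulated weekly snapshots of (cumulative clicks, conversions):
    bid = KICKOFF_BID
    for clicks, conversions in [(12, 0), (35, 2), (60, 3), (90, 4)]:
        bid = next_bid(bid, clicks, conversions)
        print(bid)

The point of the sketch is the shape of the curve: pay the tourist premium briefly and deliberately, then ratchet down on a schedule rather than leaving early bids in place out of inertia.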

Google prefers the seasoned advertisers to the tourists, at the end of the day. But if the tourists are going to keep barging in to disrupt the flow of AdWords Nation, the least Google can do is profit from their short-term stays. Your job is to minimize the damage by thinking like a native.

Good luck on your long-term quest to attain that peaceful, reasonably-priced existence that is eventually extended to full citizens of AdWords Nation.

This column originally appeared at ClickZ on July 16, 2010. Reprinted by permission.

Google vs. 3M

Sunday, August 29th, 2010

Looking for investment ideas?

In this corner: Founded in 1902 and relocated to Duluth, Minnesota, in 1905, this was an unsuccessful mining company at first, but found early wins in areas like the first waterproof sandpaper and a little invention called masking tape. Now the legendary producer of products like Scotch Transparent Duct Tape, the company has been paying a dividend to its investors since 1916. Product diversification and soft innovation are the company’s hallmarks.

And in this corner: Founded in 1998 in Silicon Valley, Google is the legendary provider of a free voice calling service similar to Skype. It’s also a popular search engine that makes 99% of its revenues on advertising. The company is known for its kickass computer code and its free gourmet food on site. From a one-trick pony, Google has reached a stage of rampant diversification and huge user bases for its many products in less than a decade.

Other than that, the companies have very little in common.

As investments, both are well-known, well-researched securities with public market valuations above $50 billion. Based on valuation, Google is about 2.5X the size of 3M.

Google’s price-to-earnings ratio has sunk to a lowly 19, in part due to the market’s doubts about the growth prospects for its primary business, search and related online advertising. 3M, a methodical, steady-growth innovator, has a modest P/E ratio of 14.

A crucial difference between the two companies: Google does not, of course, pay a dividend. 3M’s dividend yield is around 2.5%, it has hovered around that level for years, and it will likely continue to do so for years to come. That means the company sends you checks, even if you don’t have an AdSense account.
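For readers who want the arithmetic behind those figures spelled out, here is a minimal sketch. The share prices, earnings, and dividend amounts below are illustrative placeholders chosen to reproduce the ratios cited above, not actual quotes for either stock.

    # Illustrative arithmetic only; prices, earnings, and dividends are
    # placeholder numbers, not real quotes for Google or 3M.

    def pe_ratio(share_price, earnings_per_share):
        # Price-to-earnings: dollars paid per dollar of annual earnings.
        return share_price / earnings_per_share

    def dividend_yield(annual_dividend, share_price):
        # Annual dividend as a fraction of the share price.
        return annual_dividend / share_price

    print(pe_ratio(475.0, 25.0))       # 19.0, the P/E cited for Google
    print(pe_ratio(84.0, 6.0))         # 14.0, the P/E cited for 3M
    print(dividend_yield(2.10, 84.0))  # 0.025, i.e. 3M's ~2.5% yield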

If you like delicious food, Google is a better place to work. As an investment, you might be better off with 3M.

More in the next post, though, on Google’s many-tentacled efforts to break through its perceived “upside limits”.

Blekko Update: Mechanical Turk Meets Algorithmic Web Search

Friday, August 13th, 2010

Blekko, the stealth search engine startup co-founded by Rich Skrenta (of ODP fame), has recently launched a limited private beta. I’ve been able to give it a spin and can confirm that it feels like early days for the project, but it has its core metaphors and functions nailed down solidly.

As close observers now know, the service lets users apply pre-set (or custom) “slashtags” to reduce a whole-web search to a narrower slice of relevant websites. “Pre-sets” include /people, /stock, /weather, /define, etc. These work somewhat like certain Google pre-sets, which are activated either by knowing the nomenclature or automatically, based on the style of your query. Blekko’s pre-sets already go beyond some of what Google offers; as they develop, they could constitute a genuinely useful toolkit.

Some quirky areas they’re working on are near and dear to the founder’s heart: because he’s always aimed to pitch the product to search pros and SEOs, Blekko has spent considerable time building out SEO-candy features. For example, a slashtag called /rank (and no doubt other features in development) reveals the factors in ranking for certain phrases, which would allow one to understand why some sites rank higher than others. Go ahead and game Blekko if you want; people won’t be using it to search the whole web, so they’re unlikely to include your site in a slashtag site list if it’s spammy.

More germane to the spirit of the project are “topic” slashtags (from /arts to /discgolf to /yoga) and “user” slashtags (anything users customize). The idea is basically to metasearch all the sites included in the list, but none of the rest of the web. (Call it a mini-index or a multi-site search if you will.)
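As a rough mental model (mine, not Blekko’s published architecture), a slashtag behaves like a whole-web query run through a domain allowlist. A toy sketch:

    # A toy illustration of the slashtag idea as I understand it; Blekko's
    # actual implementation is proprietary. The site list and the "index"
    # below are made up for the example.

    SLASHTAGS = {
        "ugc": {"yelp.com", "chowhound.com", "tripadvisor.com"},
    }

    # Stand-in for a whole-web index: (domain, page title) pairs.
    FULL_WEB_INDEX = [
        ("yelp.com", "Best pizza in Toronto - reviews"),
        ("example-spam.com", "Best pizza CHEAP CHEAP CHEAP"),
        ("tripadvisor.com", "Toronto pizza - traveller reviews"),
    ]

    def slashtag_search(query, tag):
        # Match the query against the whole index, then keep only hits
        # whose domain is on the slashtag's curated site list.
        allowed = SLASHTAGS[tag]
        words = query.lower().split()
        return [(domain, title) for domain, title in FULL_WEB_INDEX
                if domain in allowed
                and all(w in title.lower() for w in words)]

    # Roughly what a query along the lines of "toronto pizza /ugc" asks for:
    print(slashtag_search("toronto pizza", "ugc"))

The spam domain never surfaces, not because it was out-ranked, but because it was never on the list. That is the whole curation bet in one line of filtering.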

It’s important to note that Blekko needs to maintain its own full web index to offer this to users and partners as original technology; after all, when you enter a site to add to a list, it has to be spidered and already included in the master list of sites. One of the key guiding principles Skrenta mentioned to me in the early going was that so few search startups are going into the Google arena of whole-web search that the field is starved of healthy competition. He’s right. To keep costs down, Blekko’s index is of course much smaller than Google’s. That makes it less comprehensive, but it also makes the project feasible and further limits the incentives for spammers.

So many other search startups have been content to set their sights lower, tinkering with mere “features.” Yet somewhat disturbingly, in a recent TechCrunch interview, Skrenta’s emphasis is on slashtags as a “feature.” It’s still unclear, then, whether Blekko is a big idea or a small one.

A few points are worth mentioning as far as this core functionality goes.

1. There is some question as to who Blekko is competing with and where its positioning lies. Clearly, the identified shortcomings in mega web indexes are junk, spam, and an inadequate commitment to personalization.

2. To address that, some return to the curated web must be contemplated. Fixed directories had their day and were susceptible to claims of corruption. And as Steve Thomas of a startup called Moreover once noted, directories and other curated sources suffer from the “fixed taxonomy problem.” What if your own list or your own approach would be different? Shouldn’t you be able to follow an “editor” who “slashes” the web in a way you approve of? Shouldn’t you be able to contribute your own value as an editor to the community, without being stuck with categories you don’t actually love because someone else got there first?

3. The large number of pre-set slashtags is interesting. It suggests yet another attempt could be made to intelligently curate the web, using people who are good at putting together the lists. Someone would need to admit that it was opinionated categorization, of course. There’s something disingenuous about Google News’s helpful reminder that the whole thing is “generated by a computer program” with no editorial judgment involved. Presumably Blekko suggests we might (or must inevitably) go in a different direction. Admitting to curation might be a start!

So Blekko is a way to customize the web and shrink the universe. It’s a cousin of wave 1 of “peer to peer” search (OpenCola) and of shared bookmarking services (Backflip, HotLinks).

Here’s the rub at this stage. When I try to use certain slashtags in an intuitive way to find content, it doesn’t work very well.

I built my own slashtag called /ugc to encompass a number of user review sites I like, such as Yelp and Chowhound (and Tripadvisor and…). When I tried to find certain restaurants, no matter what I typed, I still got a mess of irrelevant results. And this is arguably not an advanced search problem, but one of the simpler tasks you’ll throw at an engine. I’d be far better off going directly to Yelp or Chow.com, or to their mobile apps. Then I discovered the pre-set slashtag called /reviews. It didn’t work any better. At the end of the day, I know how Yelp, Tripadvisor, and HomeStars work when I visit them directly. I don’t know how Blekko works, or it currently just doesn’t do the job, or both. In many important ways, it offers a less rich experience than the individual sources, which offer rich navigation, community, and so on. It’s a search engine, of course, so it helps you find stuff, which in this day and age isn’t all that impressive to the average person, especially if it does a poor job of that.

Similar problems came up when I tried intuitive-seeming searches using the /vegan slashtag. With a very specific intent, the tag might work. 90% of the time, it’s safe to say, you’ll be just as frustrated as you would be using Google.

One generous reviewer noted that “slashtags aren’t perfect.” In my opinion, they don’t really work as billed, and might never work intuitively. As a user, when I see a slashtag like /liberal or /local or /review, I think of an attribute, not a notation indicating that the web universe will be shrunk to a curated list of websites, however small or large. The state of the semantic web is in shambles; there is no question about that. So it is indeed pie-in-the-sky to expect key attributes of pages (such as “this is a consumer review”) to be widely and universally available to search engines and users. But I get overexcited when I see a slashtag like /funny or /green. I expect it to work magically. Of course, it doesn’t.

Granted, the project is still in beta, and even the individual site searches on the underlying sites are often weak and cluttered until you become a power user. At HomeStars, we quibbled for a couple of years over the best way of treating the fact that 40% or more of queries are on a company name, while generic words in company names often overlap with review content, producing the “wrong” rank order for that user’s intent. Those are solvable problems to an extent, so with more curation, more tweaking, more feedback on the beta, and so on, Blekko can perhaps solve many of them.
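To make that tension concrete, here is a generic sketch of field-weighted ranking, the usual first response to the company-name-versus-review-content problem. It is not HomeStars’ actual code; the companies, weights, and matching logic are all invented for illustration.

    # A generic sketch of field-weighted ranking, not HomeStars' actual
    # code. The companies, weights, and matching logic are invented.

    COMPANIES = [
        {"name": "Reliable Roofing",
         "reviews": "They fixed our chimney flashing promptly."},
        {"name": "Acme Plumbing",
         "reviews": "More reliable than the roofing guys we tried first."},
    ]

    NAME_WEIGHT, REVIEW_WEIGHT = 10.0, 1.0  # favor name matches heavily

    def score(query, company):
        q = query.lower()
        s = 0.0
        if q in company["name"].lower():
            s += NAME_WEIGHT    # query hits the company-name field
        if q in company["reviews"].lower():
            s += REVIEW_WEIGHT  # query only appears inside review text
        return s

    def rank(query):
        return sorted(COMPANIES, key=lambda c: score(query, c), reverse=True)

    # "reliable" matches one company's name and another's review text;
    # the name match should win the top spot.
    print([c["name"] for c in rank("reliable")])

The catch, of course, is that the weights themselves encode an opinion about intent, which is exactly why we quibbled for years instead of shipping a one-line fix.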

I definitely empathize with the challenge, having been involved in a site that often fails at even just making the search and navigation experience work for a narrow subset of users. Novices come and go who think they can write code that will “fix” search and make it “work.” It does work, for 0.5% of users. Then it breaks for the other 99.5%. Everyone has opinions about rank order, and pretty much everything else to do with search. You don’t “solve” it with a couple of pointers; it’s incredibly complex and subjective.

Ruminating on all of that, it’s clear that Blekko is an enormous idea, not a small idea or a simple feature. As a result, it opens up enormous cans of worms — as it should.

Other (Googly) ways of assessing relevance and quality are eventually going to need to be built into an engine like Blekko, regardless. Google has a light-year’s worth of head start in gathering data about user behavior, click trails and paths, and other signals that help propel one page or site higher than another. Even if you cut out 95% of Google’s index, or 99%, Google’s proprietary data and ranking technologies would be immensely helpful to ranking. Heck, you can use Google site search for your own site and tap into the same technology.
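As a trivial illustration of that last point (using the long-standing site: query operator rather than Google’s paid site search product; the domain is a placeholder):

    # Minimal illustration: a Google query restricted to one domain via the
    # long-standing "site:" operator. The domain is a placeholder.
    from urllib.parse import urlencode

    def site_search_url(domain, query):
        return ("https://www.google.com/search?"
                + urlencode({"q": f"site:{domain} {query}"}))

    print(site_search_url("example.com", "deck builder reviews"))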

Meanwhile, Google is moving forward to aggregate content and serve up types of content with certain attributes, drawing from qualifying lists of underlying data providers. One such area is user review aggregation. They’re making a botch of it now, and are heavy-handedly trying to funnel users into their own review app. But they’re iterating fast.

In other words, Google is going hard after solving this type of problem in areas where it really matters. Blekko is experimenting with how the problem itself is posed, and where to take it next. And that is bound to be strikingly different in tone and execution than Google’s method.

Rich Skrenta is right. The field of large scale search engines is in desperate need of innovation and genuine competition. Blekko can act as an incubator for better ideas about search, but perhaps just as importantly, different ideas about search.

Google’s Shuttered Projects: Does it Come Down to Trust?

Monday, August 9th, 2010

Danny Sullivan offers a thoughtful piece “celebrating” and reviewing Google’s failed (closed) projects, accompanied by “celebratory” (rationalizing, explaining) quotes from Google.

Of course, there can be any number of reasons why pilot projects fail to take off. On the flip side, Google’s dominance in some of the lines of business that started out in Google Labs, or as otherwise alpha-level functionality, is nothing short of astounding. Think Google Maps.

But is there a deeper reason for Google’s inability to gain traction with what often seemed to be promising initiatives? With its tech savvy and reach, Google should have a promotional advantage, as well as the deep pockets to ride projects through to completion.

One thing that comes to mind is user trust. Is there an unconscious upper threshold of total company product adoption in many users’ minds? For business users, is that perception particularly acute? As in: “I love Google Search and YouTube, and Google Maps I can’t live without. I’m willing to try Google 411… but when it comes to this latest invention, I think I’m going to take a pass and stick with [independent provider x].”

In the old school of American political science (pluralism, neo-pluralism, Robert Dahl’s Who Governs?), the idea was that contemporary America was reasonably democratic even though there were elites (C. Wright Mills). As long as no one elite had total control over all facets of an economic region (for Dahl, it was New Haven, CT), there was something analogous to the checks and balances the Founding Fathers envisioned. That concept (wish?) still resonates with many people. In fact, it’s baked into the American psyche.

Why wouldn’t people approach their business relationships or technology adoption decisions the same way? (“I’m using five Google products all the time, but something about this new one bothers me. I think I’ll take my business to the other guys for a change.”) It’s hugely different when a startup starts doing well with a similar product. Why? Well, the underdog syndrome has consumers cheering that entity, community, or product on to greater heights. When a product is Google-owned, you can no longer cheer for Page and Brin (though the company would like us to still see them all as underdogs). And the developers of Google’s products (even the widely loved Gmail and Chrome) are rarely charismatic, or even visible. They are software developers: that’s what they do. They don’t feel the need to be hybrid entrepreneur-evangelist-developers. With a new or acquired Google product (say, Aardvark?), it’s not cool to recommend it to friends anymore, because you’re recommending “just another Google product.” You make a conscious or unconscious choice not to work too hard at adopting that thing. The underdog syndrome, too, is baked into the American psyche.

Beyond trust, there is the matter of being top of mind in a category. Foursquare gets to be top of mind for something specific. If the same product is owned by Google, the positioning gets muddied.

So is any of this true? Do new Google initiatives now fail because people don’t trust Google to be a powerful elite across too many categories of the ecosystem? Do users consciously want to spread the love, if not to the tiniest garage startups, then at least to other elites?

Let’s go through Danny’s list product by product, to find out.

Google Wave: Beyond the product being potentially too advanced and new to achieve quick adoption by the average (especially non-technical) user, there is a huge trust component to sharing internal company messaging across different communication styles, with different objects, etc. At the risk of oversimplifying, the product plays in the general category of the old-school “office intranet.” Those kinds of systems worked in an atmosphere akin to “lockdown,” and the service providers took security seriously. While similar players in the category, like 37signals, strike some as off-puttingly casual, the fact remains that Wave entered a paranoid market category with a freewheeling, experimental attitude. If people are going to be freewheeling in their work and project environments, they’ll do it in ways they’ve already grown accustomed to (even using Skype or various cobbled-together solutions, or multiple types of solution). Despite the Cluetrain 2.0-style lectures by “how to set up a wiki” evangelists near the end of the Web 2.0 hype phase, it’s clear that many aren’t adopting. For many, the alternative to going whole hog into a single, consolidated system appears to be using multiple solutions clumsily and sometimes badly. The single-system route, though, is akin to totalitarianism. Inhibited by trust issues? Yes.

SearchWiki: Did Google ever seriously intend to keep this feature? Was adoption really the problem, or was this one of many weak experiments in the Google Search world intended as a trial balloon, to see how much genuine behavior would result as opposed to self-dealing and spam? Leaving the feature in wouldn’t have hurt anything, but Google must have felt it was a distraction to users. I’m sure they collected enough data to learn a few things while it was running. Trust issues? No.

Google Audio Ads: This was a very ambitious initiative. Nothing has changed in Google’s long-term thinking: they’d like to broaden the idea of advertising auctions to any conceivable medium. There’s a slight snag, though. This works best on sites Google owns. To enable ad auctions on publishing platforms and content Google does not own, Google needs to sign partnerships. The end result of such partnerships may be that the publishers and providers won’t partner with Google for prime inventory, but only for remnant inventory. Take the analogy of a billboard-owning company like CBS or Pattison. The best-case scenario for Google would be to run an auction where advertisers could bid on the inventory digitally, and run the ads digitally. But they’re still at the mercy of the providers. Would the providers be resistant to such partnerships? Absolutely. They don’t see their interests as aligned with Google’s, especially if in the short term their revenues are not threatened by the old model, and especially if they believe they could build their own bidding technology once their world becomes digitized. Google, in short, faces competition from ad networks and other technology platform providers, and resistance from publishers and media sources. They’ll continue to study ways to control more of their inventory directly, in partnerships where the other party is eager to try something out long term. Closing Google Audio Ads doesn’t at all reflect Google backing away from the larger strategy. It does point to some of the roadblocks they’ll face in attempting to play a middleman or platform role for media assets they do not own. Did trust issues derail this one? In a strange way, yes.
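For context on the mechanics Google would be exporting to other media, here is a minimal sketch of a single-slot second-price auction, the family of mechanisms behind AdWords. The bidders and amounts are invented, and real ad auctions layer quality scoring and multiple slots on top of this basic idea.

    # Minimal single-slot second-price auction: the top bidder wins but pays
    # the runner-up's bid. Bidders and amounts are invented; real ad auctions
    # add quality scores and multiple slots on top of this basic idea.

    def second_price_auction(bids):
        # bids: dict mapping bidder -> bid amount.
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner = ranked[0][0]
        price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
        return winner, price

    # "acme" wins the slot but pays "zenith"'s 2.75 bid.
    print(second_price_auction({"acme": 3.50, "zenith": 2.75, "apex": 1.10}))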

Google Video: Google Video got its butt whipped by YouTube, so Google bought YouTube. The user culture at YouTube was much more vibrant. Google Video got hammered by a high spam-to-content ratio, in both videos and comments. On YouTube, when it came to the user base, there was a lot of “there there”; on Google Video, it felt like no one was home. YouTube’s culture was edgy and fast-moving, and they moved quickly to become the standard for embedded videos. As a result of all this, Google Video failed to gain momentum while YouTube’s momentum was huge. Trust issues? Pretty much. But also, users simply couldn’t strongly identify Google’s brand with video (not until after the YouTube acquisition).

Dodgeball: Why don’t we chalk this one up primarily to the “first draft effect”? In a first attempt to identify opportunities in a category, services don’t always find their feet. The valuable experience (and fundability) founders gain from being acquired by Google helps them succeed much bigger on their next venture, especially if it plays in a similar space. On top of that, Foursquare’s being identifiable as a standalone thing, and coming along at a time when tweets helped augment what it does, boosted its momentum as an agenda-setter in the very space it wanted to define for itself. One part experience, one part positioning. You can’t entirely rule out the possibility that users feel better about sharing their location with a “cool” service that isn’t owned by the big guys. Typical early adopters of mobile and social apps are not exactly careful about their privacy; but like everything else, they’d prefer the exploitation of their willingness to trust to at least be spread around a bit. Trust issues? Not entirely, but somewhat.

Jaiku: Google decided not to devote engineering resources to a Twitter clone, which just might indicate that it’s the user base and the data, not amazing innovation or the code base, that hold the key to Twitter’s current market value. In other words, there is room in the world for only one “show about nothing.” Trust issues? Hard to say, but don’t rule them out as a factor that would have slowed adoption had Jaiku gotten far enough along.

That’s enough examples to make the point. The many advantages Google’s reach, brand, and resources bring to a project can be offset by users’ unspoken “single overlord adoption threshold”. Both impulses will be at work when Google rolls out its mega social networking initiative near the end of this year. And indeed, it’s hard to completely untangle the issue of “I don’t trust you” from the positioning issue of “I trust you to do certain things really well, and I don’t believe you’re actually good at these other things.” These issues are inevitable as a company grows very large. It doesn’t mean anyone at Google is actually untrustworthy. It speaks more to an inherent fact of contemporary customer psychology, where the very big will do anything to mask or soften that bigness in order to appear humble and scrappy.

 

