If you’re looking for advice to boost your holiday sales, please have a look at our post, Cyber Monday Is Coming.
Archive for November, 2009
We’re all familiar with 80-20 problems, where the last 20% of the solution is 80% of the work. Search is a 90-10 problem. Today, we have a 90% solution: I could answer all of my unanswered Saturday questions, not ideally or easily, but I could get it done with today’s search tool. (If you’re curious, the answers are below.) However, that remaining 10% of the problem really represents 90% (in fact, more than 90%) of the work.
Well, Dr Search agrees with the philosophy, if not the timelines.
Microsoft’s Director of Product Planning Stefan Weitz also said in an Ars Technica interview that we’re early in the game of search:
“’We’re not at where we’d like to be,’ Weitz began, and then dove in to explain that people are generally happy with how their search engine is working, until the data shows that they are not.”
So, there seems to be consensus that there’s a lot to do to improve web search. The question is, what does that improvement look like? A blog post by author and industry pundit John Battelle caught my attention:
I describe my frustration with search as it relates to helping me make a complicated decision: how to buy a classic car. From it:
So first, how would I like to decide about my quest to buy a classic car? Well, ideally, I’d have a search application that could automate and process the tedious back and forth required to truly understand what the market looks like. After all, if I’m looking for classic Camaro or Porsche convertibles from the mid to late 1960s, there are only so many of them for sale, and they can be categorized by any number of important variables—price, model, region, color, features, etc. And while a number of sites do a fair job with a portion of the market, I don’t trust any of them to give me a general overview of what’s really out there. That’s where an intelligent search agent can really help.
So here, Battelle hits on the idea of search assisting in complex decisions. And then, from our own Search 2010 series of interviews, usability expert Jakob Nielsen voiced a similar concern:
I think we can see a change, maybe being more of a usefulness relevance ranking. I think there is a tendency now for a lot of not very useful results to be dredged up that happen to be very popular, like Wikipedia and various blogs. They’re not going to be very useful or substantial to people who are trying to solve problems.
In the same series of interviews, I talked to Marissa Mayer about where search may go, and she envisioned a more interactive set of search results:
We will be able to have much more rich interaction with the search results pages. There might be layers of search results pages: take my results and show them on a map, take my results and show them to me on a timeline. It’s basically the ability to interact in a really fast way, and take the results you have and see them in a new light. So I think that that kind of interaction will be possible pretty easily and pretty likely. I think it will be, hopefully, a layout that’s a little bit less linear and text based, even than our search results today and ultimately uses what I call the ‘sea of whiteness’ more in the middle of the page, and lays out in a more information dense way all the information from videos to audio reels to text, and so on and so forth. So if you imagine the results page, instead of being long and linear, and having ten results on the page that you can scroll through to having ten very heterogeneous results, where we show each of those results in a form that really suits their medium, and in a more condensed format.
The common theme, it seems to me, is aspiring to move beyond relevancy as the metric by which a list of search results is ordered, to providing us with information that we can do something with. For that quest, there seem to be two different approaches.
I believe Bing is on the right track, but they’re still too bound to the typical search format. Even searches in these targeted categories don’t usually deliver a search page that offers substantially more useful results than Google.
Gather all the relevant information and then guide me through it. This is what Bing promises to do. Let’s see how it delivers.
I don’t like jazz, because you never know what’s going to happen next…I’ve been calling this problem ‘user interface jazz.’ This result looks this way, and that result looks that way [something much different], and it really does slow you down.
* Do you think it improves usability?
* How do you think it will impact website owners?
* Will you be doing anything to leverage the improved access to other modes, e.g. video and local?
“A lot of people within Google think that the web should be fast, it should be a good experience and so it’s fair to say if you’re a fast site, maybe you should get a little bit of a bonus, or if you have an awfully slow site, maybe users don’t want that as much.”
In 2008, Cyber Monday spending hit a record high, with consumers spending a whopping $846 million online.
So the big question is: are you ready? The key to improving your sales during this period is to focus on marketing that can deliver instant results.
In the online world, this typically includes:
* Google AdWords / PPC Advertising
* Local Search Listings
* Featured Listings on smaller search engines
Google AdWords (PPC) Advertising
Google AdWords advertising would definitely be the number one way to target holiday shoppers. It offers pinpoint targeting and instant exposure, enabling you to get on the first page of Google when customers are searching for your products and services.
* Campaigns can be live within hours.
* Ability to target customers via keyword and location.
* First page placement on Google.
Local Search Listings
If you’re targeting local customers, a local search listing across Google, Yahoo and Bing is another way to get on the first page of organic search results. It’s simple to set up and there’s no limit to the number of clicks on your listing.
* Once verified, listings are live almost immediately.
* Can be included on the first page of results.
* Free organic traffic.
Featured Listings on Smaller Search Engines
If your customers use a search engine besides the top 3, there’s no harm in being found there either. Top 10 featured listings can help boost the efforts of your PPC and Local campaigns.
* Listings are live within 48 hours.
* Traffic is free – no click fees.
* Keyword targeted traffic.
This year, Cyber Monday falls on November 30th, so there are only a few weeks left to get prepared. But don’t leave it to the last minute!
“I know that webmasters can get anxious around this time of year, so I wanted to reassure site owners that the full Caffeine roll out will happen after the holidays. Caffeine will go live at one data center so that we can continue to collect data and improve the technology, but I don’t expect Caffeine to go live at additional data centers until after the holidays are over.”
Google’s Caffeine update is a major upgrade to their search technology which is expected to improve the speed and accuracy of search results.
Users have been testing the Caffeine-powered search results via the Google sandbox for the past few months, but this has now been pulled, with a message that Caffeine is ready for a larger audience.
We’ll be sure to update our blog as soon as this starts being pushed out across more data centers.
1. Clear unequivocal geo-targeted signal
To own a country code domain, or ccTLD (referred to in this article as a local domain), you actually need to go and buy it and register with a local authority. As such, the local domain has always represented the best controlled and strictest identifier of a specific geography. There are some exceptions of course, but these mostly involve certain domains, such as .tv (the tiny island state of Tuvalu), whose particular geography turned out to be a gold-mine domain name it could use to generate revenue.
On several occasions I have been approached by engineers employed by search engines who were working specifically on geo-targeting of their results. In all cases they have given the local domain as the first and best signal they would look for in determining a local result.
2. Good site architecture
The argument is often put forward that it’s far too expensive to switch an existing dot com website with zillions of pages over to the relevant local domains in the various countries its owners wish to target. It can, of course, be expensive to switch the domain used, and this needs to be done with great care.
Many great SEOs will repeat to you over and over again how important it is to have good site architecture. I’m a firm believer that using local domains for your site is a very good place to start when structuring your site.
3. People generally buy locally
Purist SEOs may not see conversion factors as the most important consideration when recommending which steps a client should take. However, I firmly believe users read URLs in the search engine results, and that this has a direct impact on how many of them click on links. Say you’re looking for a “second hand car” and you live in Germany. If you know nothing else about a website, which is more likely to be compelling: “secondhandcar.com” or “secondhandcar.de?” To me, it is clearly the latter.
Even beyond the results page, the local domain plays in the mind of the user. “If this is a .de and I live in Munich, then they’re more likely to deliver” is a reasonable conclusion for most folks to draw.
4. Link attractiveness
Having a local domain also helps in your link building programs. Other sites in the same country are much more likely to link to you if you have a local domain. But it’s especially true that they’ll be more interested in receiving links from you if you’re local—after all, they need local links too. Many local directories will only accept local domain names in any case.
5. More powerful internal linking
Links between sites of the same dot com are less valuable, in my view, than links between truly international versions using local domains. So a site which splits its dot com into many countries has an opportunity to reap some benefits from the many different domains it now controls—subject to the normal caveats such as having quality content and offering a good experience to the user.
6. Resistance to the shifting sands of algorithms
I can’t prove this one to you, but after more than a decade of experience I’m convinced that local domain sites tend to be more stable in results than dot coms which move up and down when search engine algorithms change.
Enchantment from the dot com sirens
Why do so many talented SEOs conclude that dot coms are just as acceptable as local domains when they first start working in the international field? The first issue is that many look at the situation in the UK as a test case for what happens internationally.
Second, the structure of a site’s geo-selector—the method by which countries and languages are chosen—plays a key role in sharing link values around the site. Dot coms have an advantage here, but only because using local domains shows up the poor structure of the geo-selector. With an improved geo-selector, local domains will easily overtake the dot com.
The third reason is that SEOs just love research and data. So they head into the search engines, check some keywords, and then assess how many dot coms or local domains show up. I have seen this so many times.