SEARCH CLINIC

Search engine online marketers

Archive for November, 2009

Black Friday searches up 20% on last year

November 30, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Google have released search figures that show queries for “Black Friday” rose by more than 20% compared to last year.

Furthermore, searches for “black Friday sales” and “black Friday ads” also showed impressive growth, rising 50% compared to 2008. Black Friday (the Friday after Thanksgiving) typically marks the start of the holiday shopping season for consumers in the U.S.
The figures show that more consumers are taking advantage of the Black Friday deals offered by various retailers. The Google retail blog also shared some of the fastest rising search terms on the day, which included:
    * “Walmart Black Friday”
    * “Kohls Black Friday Ad”
    * “Sears Black Friday Sales”
    * “Target Black Friday Deals Online”

Hitwise also published their own stats about Black Friday, which show that big retailers like Amazon, Walmart and Apple saw a surge in traffic on the day.

    The top visited retail website on Black Friday 2009 was Amazon, receiving 13.55% of U.S. visits among the top 500 retail websites.

    Walmart was the second most visited with 11.18% of visits, followed by Target.com with 5.65%, BestBuy.com with 4.62% and Sears with 2.95%.

    Among the top 20 sites visited on Black Friday 2009, the Apple Store saw the largest increase in visits compared to Thanksgiving Day 2009, up 110%. Staples saw a 47% increase year on year and Dell saw a 40% increase, while Amazon had a 9% increase.

If you’re looking for advice to boost your holiday sales, please have a look at our post on Cyber Monday Is Coming.

Search marketing: what’s in the future?

November 27, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

If there’s one thing that both Google and Microsoft agree on, it’s that search marketing isn’t solved yet. 
Google’s Vice President of Search Products and User Experience, Marissa Mayer, has said:

    We’re all familiar with 80-20 problems, where the last 20% of the solution is 80% of the work. Search is a 90-10 problem. Today, we have a 90% solution: I could answer all of my unanswered Saturday questions, not ideally or easily, but I could get it done with today’s search tool. (If you’re curious, the answers are below.) However, that remaining 10% of the problem really represents 90% (in fact, more than 90%) of the work. 

    Coming up with elegant, fitting and relevant solutions to meet the challenges of mobility, modes, media, personalization, location, socialization, and language will take decades. Search is a science that will develop and advance over hundreds of years. Think of it like biology and physics in the 1500s or 1600s: it’s a new science where we make big and exciting breakthroughs all the time. However, it could be a hundred years or more before we have microscopes and an understanding of the proverbial molecules and atoms of search. Just like biology and physics several hundred years ago, the biggest advances are yet to come. That’s what makes the field of Internet search so exciting.

Well, Dr Search agrees with the philosophy, if not the timelines. 

Information discovery and dissemination is a science that is already hundreds of years old. Google, in its present state, is a small but significant wrinkle in that timeline. What is exciting is that it marks an important change in how we look at information. 
What Google has done is introduce a “Just in Time” information economy. It’s a little presumptuous to say that we’re at the beginning and that internet search marks an entirely new science. Really, this still comes down to how we seek and use information. The internet and search have represented a monumental shift, yes, but it’s not a new ball game. And I certainly hope we don’t have to wait hundreds of years for significant advancements in the state of search.

Microsoft’s Director of Product Planning Stefan Weitz also said in an Ars Technica interview that we’re early in the game of search:

    “’We’re not at where we’d like to be,’ Weitz began, and then dove in to explain that people are generally happy with how their search engine is working, until the data shows that they are not.”

So, there seems to be consensus that there’s a lot to do to improve web search. The question is, what does that improvement look like? A blog post by author and industry pundit John Battelle caught my attention:

    I describe my frustration with search as it relates to helping me make a complicated decision: How to possibly buy a classic car. From it:

    So first, how would I like to decide about my quest to buy a classic car? Well, ideally, I’d have a search application that could automate and process the tedious back and forth required to truly understand what the market looks like. After all, if I’m looking for classic Camaro or Porsche convertibles from the mid to late 1960s, there are only so many of them for sale, and they can be categorized by any number of important variables—price, model, region, color, features, etc. And while a number of sites do a fair job with a portion of the market, I don’t trust any of them to give me a general overview of what’s really out there. That’s where an intelligent search agent can really help.

So here, Battelle hits on the idea of search assisting in complex decisions. And then, from our own Search 2010 series of interviews, usability expert Jakob Nielsen voiced a similar concern:

    I think we can see a change, maybe being more of a usefulness relevance ranking. I think there is a tendency now for a lot of not very useful results to be dredged up that happen to be very popular, like Wikipedia and various blogs. They’re not going to be very useful or substantial to people who are trying to solve problems.

In the same series of interviews, I talked to Marissa Mayer about where search may go, and she envisioned a more interactive set of search results:

    We will be able to have much more rich interaction with the search results pages. There might be layers of search results pages: take my results and show them on a map, take my results and show them to me on a timeline. It’s basically the ability to interact in a really fast way, and take the results you have and see them in a new light. So I think that that kind of interaction will be possible pretty easily and pretty likely. I think it will be, hopefully, a layout that’s a little bit less linear and text based, even than our search results today and ultimately uses what I call the ‘sea of whiteness’ more in the middle of the page, and lays out in a more information dense way all the information from videos to audio reels to text, and so on and so forth. So if you imagine the results page, instead of being long and linear, and having ten results on the page that you can scroll through to having ten very heterogeneous results, where we show each of those results in a form that really suits their medium, and in a more condensed format.

The common theme, it seems to me, is aspiring to move beyond relevancy as the metric by which a list of search results is ordered, towards providing us with information that we can do something with. For that quest, there seem to be two different approaches. 

Microsoft, with Bing, appears to be favoring Battelle’s “online valet” model: an all-knowing wizard that helps guide us through complex decisions. Indeed, the branding of Bing as a “decision engine” reiterates that aspiration. Bing’s strategy, still in its nascent stages, is to pick the categories where complex decisions and the need for more useful information abound: shopping, local, travel and health.

I believe Bing is on the right track, but they’re still too bound to the typical search format. Even searches in these targeted categories don’t usually deliver a search page that offers substantially more useful results than Google. 

If the goal of Bing is to be a decision engine, it should rise to the challenge more boldly. For example, I’m thinking of buying a Prius which, with the trade-off between a higher sticker price and potentially lower operating costs, certainly qualifies as a complex decision. To echo John Battelle’s wish, I’d love a digital valet to go out and gather all the relevant information and then guide me through it. This is what Bing promises to do. Let’s see how it delivers.

Firefox double celebration: turns 5 and reaches 25% market share

November 26, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Firefox from Mozilla turned 5 this month, and celebrated by reaching 25% of the global browser usage market share. 

NetMarketShare.com from Net Applications tracks global and local usage market share for browsers, operating systems, search engines and mobile systems.

“What an exciting milestone for Mozilla, particularly as we are celebrating five years of Firefox this week,” said Mitchell Baker, Chair, Mozilla. “The momentum around Firefox adoption has been truly astounding.”

[Image: Firefox/Mozilla browser market share, 2009]
When Firefox entered the browser wars, we asserted it needed to achieve 10% usage market share to be considered a true competitor to Microsoft’s Internet Explorer. Mozilla crossed that threshold back in March of 2006 and they have grown their usage share fairly steadily since then. 
Now 1 in 4 people globally are browsing the Internet with Firefox.

That’s impressive, especially when you consider the advantages the other browser providers have enjoyed over the years:
    * Internet Explorer has been the default browser on Windows systems, and often the only browser supported by many companies’ IT departments
    * Safari has been the default browser on Macs and iPhones
    * Opera has often been first to introduce new browser features, and has supported many mobile and gaming devices
    * Chrome comes from Mozilla’s primary source of revenue, Google, and has performance advantages over other browsers

The competition has been heated, but Mozilla has focused on a formula of:
    * Free (this may seem obvious now, but there was a difficult decision to be made between charging for the browser, as Opera did early on, and finding another revenue model)
    * Open source
    * Extensibility
    * Excellent user experience
    * Frequent updates and innovation

That formula has been successful so far, but the war is far from decided. Microsoft’s release of Windows 7 has seen a very impressive early adoption rate, and with a major new operating system available, computer users face two big decisions.
First, do they upgrade to Windows 7, or is this the time to consider an alternative such as Mac OS or a Linux-based system? New Mac users will most likely lead to new Safari users. Second, with a new operating system many people will have to decide on a default browser again. This gives IE a great opportunity to win back some of its lost market share. But it also gives Chrome, Opera and other browsers an opportunity to become the alternative browser of choice over Firefox.

Another major force in the browser wars is the move to mobile. The iPhone has proven that people will browse from their mobile device if the browser and device can provide a similar user experience to a computer. 
Mobile browsing is projected to grow substantially in the years to come, so this may be the next big battle ground for browser providers.

See current usage market share and trends at NetMarketShare.com.

New Google search results layout preview

November 25, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

For Google, 2009 has been the year of UI testing. There have been more tests on Google’s Search Engine Results Page this year than in any other I can recall.

Now it appears the testing is paying off. Many of the changes which were tested this year look to be securing permanent positions and will be appearing in early 2010.

The overhaul of Google’s Search Engine Results Page is aimed at tackling the issue of “User Interface Jazz” that Marissa Mayer raised with Danny Sullivan from Search Engine Land.

It seems that all the UI testing Google has been conducting has been creating some discontent in the Google ranks, with Marissa describing it as “jazz”. As she puts it:

I don’t like jazz, because you never know what’s going to happen next…I’ve been calling this problem ‘user interface jazz.’ This result looks this way, and that result looks that way [something much different], and it really does slow you down.

So Google will probably look to consolidate some of 2009’s testing with permanent changes to roll out in 2010. And Search Options appears to be one of the most noticeable changes. Courtesy of Search Engine Land, here are some screenshots of the new interface being tested.
[Image: new Google search results layout]
The Search Options area on the left-hand side now has additional filters (or “modes” as Google call them), making it a one-stop navigation point for Google’s results, including video, blogs, local and more.

Search Options will be permanently displayed, unlike the current incarnation, where users have to click to display the options.

If you want an in-depth review of the new interface, I recommend reading over Danny’s post at Search Engine Land.

The Search Clinic would love to hear what you think about the new interface:
  • do you think it improves usability?
  • how do you think it will impact website owners?
  • will you be doing anything to leverage the improved access to other modes, e.g. video and local?

Speed: another reminder from Google

November 24, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Google have given webmasters a strong indication that the speed of your website may become an even bigger organic ranking factor in 2010!

Here’s what Google’s Matt Cutts has to say on the issue:

“A lot of people within Google think that the web should be fast, it should be a good experience and so it’s fair to say if you’re a fast site, maybe you should get a little bit of a bonus, or if you have an awfully slow site, maybe users don’t want that as much.”

Google already considers page load times as part of the AdWords quality score, so it’s a logical step for this to be included in organic listings too.

While it’s unlikely to have a huge impact on rankings (there are over 200 other factors), it would be a smart decision to prepare for its introduction ahead of time and get your site loading as fast as possible. Here are a couple of tools which can help you get started, plus a quick measurement sketch after the list:
  • Google’s Page Speed Tool – This is a Firefox plugin from Google that enables you to see which areas of your site need improvement.
  • WebPageTest.org – Another speed tool which provides optimization recommendations and other interesting stats.
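
If you just want a rough baseline before diving into those tools, a few lines of Python can time how long your pages take to download. This is a minimal illustrative sketch only: it measures raw HTML fetch time rather than full page rendering, and the URLs are placeholders to swap for your own pages.

    # Minimal page-fetch timer - an illustrative sketch only, not a
    # substitute for Page Speed or WebPageTest (it ignores images,
    # scripts, CSS and rendering time).
    import time
    import urllib.request

    def fetch_seconds(url):
        """Return the seconds taken to download a page's raw HTML."""
        start = time.time()
        with urllib.request.urlopen(url) as response:
            response.read()
        return time.time() - start

    if __name__ == "__main__":
        # Placeholder URLs - replace with your own pages
        for url in ("http://www.example.com/", "http://www.example.com/shop"):
            print("%-40s %.2fs" % (url, fetch_seconds(url)))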

How you can create a UK Bing local listing

November 23, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Dr Search has had some questions from our UK readers who pointed out that the Bing local listing center was only accepting US-based businesses.

While UK listings were showing in search results, these were sourced from other data providers, and there didn’t appear to be any way to get your own information included.

Well, thanks to one of our loyal readers, it appears you can get your business included! Microsoft gets its UK local business listings from a company called Market Location.

Here’s a screenshot from multimap.com (owned by Microsoft) showing how to get your business listed:
[Image: local search listings with Bing through Multimap]
All you have to do is visit http://www.marketlocation.com/changereq/ and add your details!

There’s been a bit of discussion on the issue over at the Bing community forums, and some readers have confirmed the inclusion of previously unlisted businesses by using the form above.

So if you’ve got a UK business, add your details with the form above and please leave a comment if/when you get included! 
The Search Clinic wishes you Good Luck!

Cyber Monday is coming: is your website ready?

November 20, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Cyber Monday (the first Monday after US Thanksgiving) marks the start of the online holiday shopping season for most retailers. 
With a flurry of online activity expected on this day, it’s important to ensure your website’s going to capture the attention of as many shoppers as possible.

In 2008, Cyber Monday spending hit a record high, with consumers spending a whopping $846 million online. 

The good news for retailers is that Cyber Monday is only the start – with strong online sales expected to continue right through until the New Year.

So the big question is: are you ready? The key to improving your sales during this period is to focus on marketing that can deliver instant results.

In the online world, this typically includes:
    * Google AdWords / PPC Advertising
    * Local Search Listings
    * Featured Listings on smaller search engines

Google AdWords (PPC) Advertising
Google AdWords advertising would definitely be the number one way to target holiday shoppers. It offers pinpoint targeting and instant exposure, enabling you to get on the first page of Google when customers are searching for your products and services.

Key Benefits:
    * Campaigns can be live within hours.
    * Ability to target customers via keyword and location.
    * First page placement on Google.

Local Search Listings
If you’re targeting local customers, a local search listing across Google, Yahoo and Bing is another way to get on the first page of organic search results. It’s simple to set up and there’s no limit to the number of people who can click on your listing.

Key Benefits:
    * Once verified, listings are live almost immediately.
    * Can be included on the first page of results.
    * Free organic traffic.

Featured Listings
If your customers use a search engine besides the top 3, there’s no harm in being found there either. Top 10 featured listings can help boost the efforts of your PPC and Local campaigns.

Key Benefits:
    * Listings are live within 48 hours.
    * Traffic is free – no click fees.
    * Keyword targeted traffic.

This year, Cyber Monday falls on 30th November, so there’s little time left to get prepared. But don’t leave it to the last minute!

Google’s major Caffeine update coming after US holidays

November 19, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Google’s Matt Cutts has announced on his blog that Google’s “Caffeine” update will be going live after the US holiday period. 
Matt explains that this timing is good for webmasters who don’t want to deal with any unexpected updates during the busy holiday sales period.

“I know that webmasters can get anxious around this time of year, so I wanted to reassure site owners that the full Caffeine roll out will happen after the holidays. Caffeine will go live at one data center so that we can continue to collect data and improve the technology, but I don’t expect Caffeine to go live at additional data centers until after the holidays are over.”

Google’s Caffeine update is a major upgrade to their search technology which is expected to improve the speed and accuracy of search results. 

Google has said they don’t expect this to be a dramatic change to existing results; rather it is “the first step in a process that will let us push the envelope on size, indexing speed, accuracy, comprehensiveness and other dimensions.”

Users have been testing the Caffeine-powered search results via the Google sandbox for the past few months, but this has now been pulled, with a message that Caffeine is ready for a larger audience.

We’ll be sure to update our blog as soon as this starts being pushed out across more data centers.

Google adds gold stars to AdSense presentation

November 18, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Google has added yellow stars to some Google AdSense ads; these ads are being called “featured ads”.

Dr Search didn’t believe it. I thought maybe this was some type of AdSense hack that placed gold stars on the ads. I just received confirmation from a Google representative that these ads are, indeed, real.

Here is a picture of the ad:
[Image: Google AdWords ad with review stars]

Google told us: “We are currently running a limited test in which a small number of users are seeing ads that are marked based on signals related to quality and relevance. This experiment is part of our ongoing efforts to help users find what they’re looking for, and we’re closely monitoring feedback.”

This sounds like Google is showing a star near ads that are contextually more relevant than the others. I asked Google for confirmation on that point.

Clearly, many AdSense publishers will find these ads hypocritical, since Google disallows publishers from placing images near ads. But at the same time, I doubt publishers will mind Google placing these stars on the ads, since it will lead to higher clicks and more money for the publisher.

6 ways local URLs beat .com in international SEO

November 17, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

We’ll look at the top six reasons why local domains are winners and dot coms runners-up. 
Dr Search will also attempt to put on record why newbies to the promotion of international sites are seduced by the songs of the dot com siren and how to spot those who have been enchanted away on magical waves of sound.
 
So here are the six reasons why I believe local domains are the clear winners when trying to promote international sites.

1. Clear unequivocal geo-targeted signal
To own a country code domain, or ccTLD (referred to in this article as a local domain), you actually need to go and buy it and register with a local authority. As such, the local domain has always represented the best controlled and strictest identifier of a specific geography. There are some exceptions of course, but these are mostly to do with certain domains, such as .tv (the tiny island state of Tuvalu), whose geography happened to hand them a gold mine of a domain name that could be used to generate revenue.

On several occasions I have been approached by engineers employed by search engines who were working specifically on the geo-targeting of their results. In all cases they have given the local domain as the first and best signal they would look for in determining a local result.
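
To make that concrete, here is a toy sketch of how a ranking system might weight such signals. To be clear, the weights, signal names and the geo_score function are entirely my own invention for illustration; no search engine publishes its actual formula.

    # Toy illustration of the ccTLD as the strongest geo-targeting signal.
    # All weights and signals here are invented for illustration only.
    CCTLD_WEIGHT = 0.6      # strongest signal, per the engineers quoted above
    HOSTING_WEIGHT = 0.25   # server hosted in the target country
    LANGUAGE_WEIGHT = 0.15  # on-page language matches the target country

    # ccTLDs commonly sold generically rather than geographically (e.g. .tv)
    GENERIC_EXCEPTIONS = {"tv", "me", "co"}

    def geo_score(domain, target_tld, local_hosting, language_match):
        """Crude 0-1 score of how strongly a site targets one country."""
        tld = domain.rsplit(".", 1)[-1].lower()
        score = 0.0
        if tld == target_tld and tld not in GENERIC_EXCEPTIONS:
            score += CCTLD_WEIGHT
        if local_hosting:
            score += HOSTING_WEIGHT
        if language_match:
            score += LANGUAGE_WEIGHT
        return score

    # A .de site hosted in Germany with German content wins comfortably
    print(geo_score("secondhandcar.de", "de", True, True))   # 1.0
    print(geo_score("secondhandcar.com", "de", True, True))  # 0.4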

2. Good site architecture
The argument is often put forward that it’s far too expensive to switch an existing dot com website with zillions of pages over to the relevant local domains in the various countries its owners wish to target. It can, of course, be expensive to switch domains, and this needs to be done with great care. 

However, when corporations calculate the cost of making the change, they tend to undervalue the ongoing cost of SEO and of compensating for not having the relevant local domain. This could mean additional local hosting costs or even substantial link building to overcome the inherent disadvantages of the dot com.

Many great SEOs will repeat to you over and over again how important it is to have good site architecture. I’m a firm believer that using local domains for your site is a very good place to start when structuring your site.

3. People generally buy locally

Purist SEOs may not see conversion factors as the most important consideration when recommending which steps a client should take. However, I firmly believe users read URLs in the search engine results and that this has a direct impact on how many of them click on links. Say you’re looking for a “second hand car” and you live in Germany. If you know nothing else about a website, which is likely to be the more compelling: “secondhandcar.com” or “secondhandcar.de”? To me, it is clearly the latter.

Even beyond the results page, the local domain plays in the mind of the user. “If this is a .de and I live in Munich, then they’re more likely to deliver” is a reasonable conclusion for most folks to draw.

4. Link attractiveness

Having a local domain also helps in your link building programs. Other sites in the same country are much more likely to link to you if you have a local domain. It’s especially true that they’ll be more interested in receiving links from you if you’re local; after all, they need local links too. Many local directories will only accept local domain names in any case.

5. More powerful internal linking
Links between country versions of the same dot com are less valuable, in my view, than links between truly international versions using local domains. So a site which splits its dot com into many country-specific local domains has an opportunity to reap some benefits from the many different domains it now controls, subject to the normal caveats such as having quality content and offering a good experience to the user.

6. Resistance to the shifting sands of algorithms
I can’t prove this one to you, but after more than a decade of experience I’m convinced that local domain sites tend to be more stable in results than dot coms, which move up and down when search engine algorithms change.

Enchantment from the dot com sirens

Why do so many talented SEOs conclude that dot coms are just as acceptable as local domains when they first start working in the international field? The first issue is that many look at the situation in the UK as a test case for what happens internationally. 

This is not a good idea, as the UK is a very odd example indeed, where US sites are often as acceptable to British folks as home-grown UK ones. The balance between .co.uk and dot com in the UK is NOT typical of how it works in the wider world.

Second, the structure of a site’s geo-selector (the method by which countries and languages are chosen) plays a key role in sharing link values around the site. Dot coms have an advantage here, but only because using local domains shows up the poor structure of the geo-selector. With an improved geo-selector, local domains will easily overtake the dot com.

The third reason is that SEOs just love research and data. So they head into the search engines, check some keywords and then assess how many dot coms or local domains show up. I have seen this so many times. 

The problem with this approach is that you would have to check a huge number of keywords to get a sensible result, you’d have to check keywords in the correct language and you’d have to work out how competitive the sector is. If it’s relatively uncompetitive, more dot coms will show up. And if you use the wrong keywords… well, that’s a story for another column.
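
For what it’s worth, that quick-and-dirty tally usually amounts to something like the sketch below, and its weakness is exactly the one just described: the answer depends entirely on which keywords, in which language, you happened to feed it. The result URLs here are made up for illustration.

    # Sketch of the naive "count the TLDs in the results" check.
    # The sample results are invented; in practice you would paste in
    # real result URLs for each keyword you checked.
    from collections import Counter
    from urllib.parse import urlparse

    results_by_keyword = {
        "second hand car": [
            "http://www.secondhandcar.de/",
            "http://www.autohaus.de/",
            "http://www.usedcars.com/",
        ],
        "gebrauchtwagen": [
            "http://www.mobile.de/",
            "http://www.autoscout24.de/",
        ],
    }

    tally = Counter()
    for urls in results_by_keyword.values():
        for url in urls:
            host = urlparse(url).hostname
            tally["local" if host.endswith(".de") else "dot com"] += 1

    print(tally)  # Counter({'local': 4, 'dot com': 1})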