SEARCH CLINIC

Search engine online marketers

Archive for October, 2009

Bing and Google keep gaining search market share in September

October 30, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

The September search share statistics are out and Yahoo! is the biggest loser yet again as Google and Bing increase their traffic.

According to statistics released by Compete, Yahoo’s search share continues to take a steady dive. It lost another 1% market share last month and has now lost a total of 5% since September 2008.

It’s a whole other story for the new kid in town, Bing, whose search share continues to rise and which, unlike many of the other major search engines, also saw an increase in the number of total queries.

Here is a rundown of the market share for each search engine.

(Chart: search engine market share, September 2009.)
The almighty Google remained fairly stable from last month, but has grown over 4% from this time last year. I wonder if we will see the same growth this time next year, or will Bing cause Google’s growth to slow in 2010?

The introduction of Bing to the market seems to have had little effect on AOL and Ask. Both search engines’ market share has remained quite constant over the last few months. Although with less than 4% market share combined, they are hardly going to cause any sleepless nights for Bing.

So how did search do overall in September? Not too well, actually. Searchers submitted 200 million fewer queries than in August. There’s no cause for concern yet, though, as this will likely bounce back with Halloween, Thanksgiving and Christmas all just around the corner.

The science of rating your search engine optimisation (SEO)

October 29, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Search engine optimisation (SEO) is a science. Crafting rewrite rules and so forth is pretty geeky stuff. The science side of SEO is where Dr Search spends most of his time.

A dichotomy is that SEO is both subjective and objective. The point at which a title tag, URL, or headline is “good enough”, such that moving on to the next task is warranted, is certainly subjective. Also consider what the optimal URL structure might be: does it end in / (slash) or a file extension like .html? Again, subjective.

In my view, SEO for the most part is cut-and-dried; it’s objective. That’s because it can all be boiled down to an algorithm, and in fact it already has been. The algorithm I speak of, of course, is Google’s (or Yahoo’s, or Bing’s).

The SEO practitioner’s challenge is to reverse-engineer that algorithm to the best of their ability. But it shouldn’t stop there. Why not write your own algorithm — an approximation of the search engine’s own algorithm, one that teases out the various signals and accurately assesses the quality, relevance and importance of these signals without human intervention/assistance?

Running algorithmic analysis on a site-by-site and a page-by-page basis will then allow you to ascertain a site’s SEO health, and more importantly, the subsequent actions required in this never-ending process known as optimization. That is data-driven decision-making, my friends, and it will be a key driver in the next stage in the evolution of SEO.

To be effective, SEO scoring has to get granular. Knowing you scored an 89 out of 100, or a B+, overall with your SEO may be reassuring, but no next steps follow from that knowledge. The same is true even if you individually score each of the major SEO areas of focus.

In my SEO Report Card column for Practical Ecommerce, I (arbitrarily) chose the following areas of focus: Home Page Content, Inbound Links and PageRank, Indexation, Internal, Hierarchical Linking Structure, HTML Templates and CSS, Secondary Page Content, Keyword Choices, Title Tags, and URLs. I don’t claim that these are the best “buckets”. Nonetheless, scoring such broad areas is still not actionable, really.

Score the title tags, internal anchor text, keyword prominence, H1s, meta descriptions and so forth separately, and on a page-by-page basis, and now you’re talking!

SEO effectiveness can be deconstructed into its many components. It can be benchmarked against competitors. Inferences can be made, priorities can be set, content can be massaged, link juice can be directed. Consequently, the SEO practitioner relies less on their gut and more on the data to drive their actions.
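To make the page-by-page approach concrete, here is a minimal sketch in Python of the kind of granular signal scoring described above. The signals, thresholds and weights are arbitrary illustrations of my own choosing, not any search engine’s actual criteria:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect the on-page signals we want to score."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self.meta_description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def score_page(html):
    """Score a handful of on-page signals, 0-100 each.
    The thresholds below are illustrative, not Google's."""
    audit = PageAudit()
    audit.feed(html)
    scores = {
        # Titles between roughly 10 and 70 characters tend to display in full.
        "title": 100 if 10 <= len(audit.title.strip()) <= 70 else 40,
        # Exactly one H1 per page is the conventional target.
        "h1": 100 if audit.h1_count == 1 else 30,
        # A meta description of a length that fits the snippet.
        "meta_description": 100 if audit.meta_description is not None
                            and 50 <= len(audit.meta_description) <= 160 else 20,
    }
    scores["overall"] = round(sum(scores.values()) / len(scores))
    return scores
```

Run against any page’s HTML, the per-signal breakdown tells you what to fix (a missing H1, an overlong title) rather than just how well you scored overall.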

DMOZ- the Open Directory- everything you wanted to know part 2

October 28, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Dr Search continues yesterday’s DMOZ Open Directory interview, in which Debra Mastaler of Search Engine Land spoke with Bob Keating, Editor in Chief of the Open Directory Project, who graciously agreed to be interviewed.

Debra: Why can’t DMOZ notify webmasters when their sites are included or rejected? Has there been any discussion on being able to pay for this feature?

Bob: Because the ODP is not designed to be a site listing service, creating a notification system has not been a priority. In the past, there was a “check my site status” thread offered via the editor-run public forums at Resource Zone (www.resource-zone.com). It was not hosted or administered by AOL. It was a good faith effort to reach out to the webmaster community.

However, the thread got quite unruly and unmanageable, so it was taken down. Moreover, some editors felt the “check status” thread conflicted with their other editing pursuits. Nonetheless, I can see us adding this as a feature in the future. As with any feature associated with DMOZ, it would be provided at no cost.

Debra: What are the top three reasons sites don’t make it into the ODP?


Bob: They are:
   1. The site was submitted to the incorrect category. Editors may move these submissions to the correct category (which can significantly delay review); or delete them from the submission queue.
   2. The site is incomplete, under construction, returns an HTTP error, or lacks adequate or unique content.
   3. The site’s content mirrors a URL that is already listed in the directory.

Debra: Mention DMOZ to a group of webmasters or read forum posts discussing the directory, and you’ll usually find the negative comments far outweigh the positive. How is the ODP dealing with its legacy issues?

Bob: Webmaster angst stems from the fact that the ODP is not designed to be a site listing service for webmasters. Webmasters have worked very hard to make the ODP work for them, and the editors have worked equally hard against Webmaster tactics that are contrary to how the directory operates. As a result, this conflict has created a cloud of distrust and negativity between both camps.

The Webmasters feel shut out of a community that was intended to be open to all types of contributions. For a while now, our challenge has been to create a system that allows Webmasters to contribute to the ODP in a mutually beneficial and meaningful way, while preserving editorial quality.

The solution is not as simple as turning the ODP into a submission service or maintaining the status quo. Rather, the solution involves expanding the ODP’s scope, offerings and levels of participation. This is at the heart of what we are working on today.

Debra: It’s great to hear the ODP is working to expand its scope, offerings and participation levels. Can you tell us a little more about your plans and when we can expect to see them implemented?

Bob: ODP is committed to expanding its scope, offerings and participation levels, but I can’t share any details with you at this point. When we are ready to announce more details, we will be sure to let you know.

Debra: Do you think people would be so passionate about being included in the directory if it wasn’t used by Google?

Bob: It depends if you are talking about Webmasters or editors. Clearly, webmasters would not care much about DMOZ if it weren’t for its influence on search engines. Editors, on the other hand, have a different perspective. The reasons editors participate in the ODP are as diverse as the global makeup of its participants.

Debra: There was a post on the DMOZ Blog recently where an editor (crowbar) outlined what makes content unique by ODP standards. It listed a number of points but seemed to dwell on the issue of mirror sites, or that “A site should not mirror content available on other sites”. Since this is a strong criterion for inclusion (or not) in the Directory, why does DMOZ give away its content through the dump program? On one hand, DMOZ admits to deleting submitted sites that don’t have unique content, and yet it provides mirror content to anyone who asks. Is this a case of do as I say and not as I do?

Bob: There are two separate issues here. One is content distribution and syndication, which the ODP does as do billions of other websites. Sites that include syndicated content are not considered “mirror sites” simply because they include syndicated content.

The other issue is content that an entity replicates over different branded domains. This is a common tactic in e-commerce, and is the issue the guidelines around “mirror sites” are intended to address.

The interview ended there. Here are my tidbits and takeaways:

The tidbits

When I heard Bob make this comment:

    “the lion’s share of agency sites are directed and listed in the Regional area of the site, which is where a lot of the editors in this area are spending their time and effort.”

The word “regional” caught my ear. I’ve been following Tim Armstrong since he came on board as AOL’s CEO and understand he (and now AOL) has a strong interest in Patch.com. It’s interesting to note Patch.com is a regional, community-specific platform showing news and events from specific cities and towns. Seeding Patch.com with regional results from a respected directory would make a lot of sense, so if you’re bricks-and-mortar based, now might be a good time to submit your business to DMOZ.

The second tidbit worth noting is the comment about the notification service. Notifying webmasters why their sites aren’t being added to the directory would go a long way toward eliminating the frustration many feel about the ODP; after all, education is preferable to being ignored. I sincerely hope this project moves along much faster than the DMOZ 2.0 project they dropped hints about back in June 2008.

The last and most notable takeaway from this interview, IMO, is the response to my question on why sites don’t make it into the DMOZ. Bob’s answer is informative and also very unsettling because it speaks directly to what I feel is the core problem with the DMOZ – a lack of editors.

Here’s what he said when I asked “What are the top three reasons sites don’t make it into the DMOZ?”

    “The site was submitted to the incorrect category. Editors may move these submissions to the correct category (which can significantly delay review) or delete them from the submission queue.”

I’ve spoken to many DMOZ editors who all say the same thing: they delete submissions made to incorrect categories rather than send them along. Why? I’m told it’s because so much of the directory is without editors and/or because they have the authority to do so.

Hmm. This attitude is interesting, especially since DMOZ states “fairness and objectivity prevail here” in its editor requirements. It doesn’t seem “fair” or “objective” to simply delete a submission added to the wrong category, but hey, that’s the way things go at the DMOZ.

Say anything and even top management is quick to point out “the ODP is not designed to be a site listing service for webmasters”. I think you’ll find a lot of webmasters support that statement and want a quality DMOZ maintained; they just don’t always get it right when they submit. Submitting your site to the wrong category should not preclude you from being added to the directory.

One of the reasons for doing this interview was to find out what the DMOZ was going to do about recruiting editors to fill its very empty ranks. While Bob reaffirmed the DMOZ’s commitment to quality editing, he didn’t address the issue of recruitment, even though I asked the question twice.

How can the directory maintain quality content with so many categories missing editors? Case in point: when I view the page dedicated to the hot topic of H1N1/Swine Flu, see no editor, and note the page was last updated October 18, 2009, I wonder whether the DMOZ is really a serious search source.

Add to that, I don’t see popular sites such as the Mayo Clinic, the World Health Organization, MedicineNet or FluView listed, and now I’m also wondering about the ability of the ODP to provide relevant information. It’s not hard to list the top health sites on the Web for the term H1N1/Swine Flu, but it’s impossible when you don’t have an editor working the category.

Yes, yes – I know section editors can and do come in to edit but they’re obviously not doing that here, are they? For topics in the news or representing financial/health issues, every effort should be made to fill those categories with qualified editors and keep the category updated. To do anything less is a disservice to the public and the directory.

I sincerely hope DMOZ doesn’t become invisible like the Great Pumpkin, as it has been an integral part of the search industry for 11 years and deserves respect for its contributions. A hand-edited directory of 4.5 million websites is an accomplishment no one else can claim and I support the stringent admission standards they have in place. 

But I also hope the directory makes every effort to utilize the vast resources AOL has to recruit quality editors to its empty categories. The H1N1/Swine Flu category is a classic example of how out-of-date the directory is and how important editors are to keeping it current. I believe once editors are in place, many of the other issues will take care of themselves.

DMOZ- the Open Directory- everything you wanted to know

October 27, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

The Open Directory Project (ODP or DMOZ) always seems to creep into the conversation when we’re discussing links and SEO. 
Check any forum, social news or answer site and you’ll see a wide variety of opinions on the 11 year old directory and how it’s managed.

Since DMOZ is not a search giant, and seemingly does little to promote itself or the core values of the directory, you have to wonder why editors and SEOs even bother with it.

Debra Mastaler of Search Engine Land recently posted an interview with Bob Keating, Editor in Chief of the Open Directory Project, who graciously agreed to be interviewed. Below and tomorrow, Dr Search carries the interview:

Debra: Tell us a little about yourself Bob, how did you get started with the Open Directory Project (ODP) and how long have you been there?

Bob: I have been working on the ODP since I joined AOL in June 1999. Initially, I was brought on to work on a directory solution for AOL Search. I joined the ODP team shortly afterward to help develop the ontology and the community self-governance model. About a year later, the ODP Founders appointed me Editor in Chief.

Over the years, I’ve worked on a number of search and publishing projects at AOL. In 2004, I left full-time employment with AOL and took a position with the Federal government to start up a new search engine program, but I remained as a consultant on the ODP. Since 2006, I’ve worked in the strategy consulting space, helping Federal clients develop product strategies around search, social media, and web-based services.

But through all these career changes, the ODP has been a constant in my life. For the last five years, my involvement has been more focused on overseeing the community and advising the ODP team at AOL on everything from the project’s history to community interaction.

Debra: Why is the directory sometimes referred to as the ODP and other times DMOZ? Is there a difference?

Bob: The directory’s “official” name is DMOZ: The Open Directory Project. DMOZ means “Directory Mozilla” – the idea was to align the directory with the Mozilla brand, even though it was not actually part of that group. DMOZ and ODP are now used interchangeably to refer to the directory.

Debra: Most of us know that DMOZ is owned/operated by AOL, but the site still lists Netscape as “hosts and administrators”. Who ultimately makes the “big decisions” at DMOZ?

Bob: By design, it is the community that makes the “big decisions.” But in terms of the corporate entity that is ultimately responsible for DMOZ, it is AOL.

Debra: Can you explain the chain of command at DMOZ?

Bob: DMOZ is essentially a meritocracy in which editors are granted higher permissions based on their interest and the quality of their participation. There are two general types of permissions: those that allow one to edit anywhere, and those that allow one to participate in community management. An editor with the former permission is known as an “Editall.” Editalls can edit anywhere and are engaged in discussions around taxonomy and the editorial guidelines.

An editor with the latter permission is known as a “Meta Editor.” Meta Editors are community managers and are responsible for reviewing editor applications, investigating and resolving abuses, and leading editor discussions. For all intents and purposes, Meta Editors and Editalls are “equal” permissions but focus on different aspects of the community.

The “Administrator” permission is the highest community management permission, and is granted to a few, trusted editors to oversee the day-to-day operations of the community. They ensure that Meta Editors and Editalls are being fair and equitable, and that the guidelines are kept current.

The ODP’s governance model is intended to be self-regulating, so there are checks and balances in place to ensure all topics and all points of view are represented, and to foster an inclusive environment in which any editor who wants to contribute is encouraged to contribute. This model doesn’t always work perfectly, but it has been very successful in creating a self-regulating environment, which actually has less to do with the model and more to do with the extraordinary group of editors who contribute to the directory’s governance.

Debra: How do you respond to the allegations some DMOZ editors accept money for submissions?

Bob: Accepting money for submissions is strictly against the community codes of conduct. In cases where we have confirmed this is happening, we revoke the editor’s account. That said, in more cases than not, the allegations are just that… allegations. Still, accepting money in exchange for submission is a consequence of an open directory operating in a closed community.

As I mentioned previously, our challenge is to create a system that allows Webmasters to contribute to the ODP, rather than feeling disconnected from it, which gives one incentive to abuse the system. This solution involves expanding the ODP’s scope, offerings and participation levels. I can’t promise the solution will rid the ODP of nefarious activity, but I think becoming more inclusive while still retaining the directory’s self-governance model will be a significant improvement.

I think it’s important to note that our editor application review process is very thorough. From a directory quality perspective, the best time to identify potential abusers is before they get a foot in the door. We ask that applicants provide a thorough listing of site affiliations and we use full disclosure (as opposed to the affiliations themselves) as a criterion for selection along with general editing quality of the sample sites they provide. While this may mean that we occasionally reject good applicants, the end result is that we keep out many potential abusers. That’s good for everyone.

We unfortunately do sometimes encounter editors who abuse their editing privileges for personal gain. We have a system of community policing to help weed out these “bad eggs.” The public, as well as other editors themselves, are able to report suspected abuse via our abuse reporting tool. When a report comes in, meta editors investigate these allegations fully and if we find that they have merit, we revoke the editor’s account. In the case that a meta editor is suspected of abuse, the case will be investigated by an admin.

We recently did a blog post about what editor abuse really is and what information we need to have in order to fully investigate it.

Debra: There are a lot of categories at DMOZ without editors. I know there is an open invitation for anyone to apply, but what is DMOZ doing to recruit people to fill the empty categories?

Bob: Even though there are lots of categories without listed editors, anyone listed in a parent category or with directory-wide editing permissions can edit those categories. So the fact that no editor is listed in a category does not mean the category is not being maintained.

We are an all-volunteer force, so recruitment is primarily through word of mouth from our current editors and through data users themselves. The editors reach out to others within their own communities and this has produced tremendous growth in some areas.

We also get new editors who find us via the DMOZ data attribution badge on other sites or because they learn about us by seeing our results in Google or another search engine. DMOZ gets hundreds of applications daily, and routinely accepts those most likely to edit well and contribute more than just their own site.

Debra: Yes, I understand category editors can/do pitch in, but when I look at a major category like Real Estate and notice seven of the first nine categories are without editors and one category shows 2007 as the last date the page was updated, I have to wonder what the Directory is doing to keep its results fresh. How can a handful of people in a major category like Real Estate keep that section of the Directory current?

Bob: The date at the bottom of the page can be misleading. It’s not always an indicator of freshness. Some pages are not updated frequently simply because they are directional pages (i.e., they direct users to categories where sites are listed); or because the kind of site listed in the category is so specific that few sites are listed at that particular level. http://www.dmoz.org/Business/Real_Estate/Agents_and_Agencies/ is a good example.

The category description page explains how agency sites are listed. The lion’s share of agency sites are directed and listed in the Regional area of the site, which is where a lot of the editors in this area are spending their time and effort.

Debra: Has there been any discussion about the ODP offering a paid review program?

Bob: This issue has been raised and discussed many times. Paid review really goes against the whole idea behind the ODP. In fact, our Social Contract with the web community takes an especially firm position on this issue.

Dr Search hopes that you found this post educational. For more, please see tomorrow’s post.

Link building- 5 simple tips for you

October 26, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Link building- when it comes to SEO, increasing your inbound links is one of the most effective ways to improve your ranking.
If you’ve recently launched a new website, or are looking for some advice to get started, here are 5 simple (but effective) link building methods.

1. Internal Linking
One of the easiest methods simply involves linking to relevant pages from other pages on your website. By using content based links, you can control the anchor text of each link to ensure you get the maximum SEO benefit.

2. Directory Submission
Submitting your website to free online directories doesn’t take much time and can be a very effective way to increase your inbound links. Many directories also allow you to write your own title and description which is another chance to include relevant anchor text.

3. Social Media Links
Bookmarking your website across some popular social media sites is another way to ensure you receive a keyword based incoming link. To get you started, here’s a list of some of the more popular sites:

    * www.delicious.com
    * www.folkd.com
    * www.diigo.com
    * www.reddit.com
    * www.weblinkr.com

4. Article Distribution
Writing informative articles and distributing them online is another powerful link building tactic. Write original articles, keep them to around 500 words and submit them to popular sites like www.ezinearticles.com and www.articlebase.com.

5. Start Writing a Blog
Set up a blog using WordPress or Blogger and get into the habit of writing one new post per week. Make sure to link to your website using targeted anchor text where possible. Aside from the direct linking benefit, your blog posts might also get syndicated on other sites, which will create another incoming link.

If you want to build more incoming links but don’t have the time, you can also consider our new link building service which can help to improve your ranking.
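For readers who like to automate, tip 1 lends itself to a quick audit script. Here is a sketch in Python that lists a page’s internal links and flags anchor text (like “click here”) that wastes the keyword opportunity; the hostname and the set of “generic” phrases are placeholders to adapt to your own site:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Record every <a href> together with its anchor text."""
    def __init__(self):
        super().__init__()
        self.links = []      # list of (href, anchor_text) pairs
        self._href = None
        self._text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = ""

    def handle_data(self, data):
        if self._href is not None:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, self._text.strip()))
            self._href = None

def internal_links(html, site_host):
    """Keep links that stay on site_host (or are relative), flagging
    empty or generic anchor text that carries no keyword value."""
    collector = LinkCollector()
    collector.feed(html)
    report = []
    for href, text in collector.links:
        host = urlparse(href).netloc
        if host and host != site_host:
            continue  # external link, out of scope for this audit
        report.append({"href": href, "anchor": text,
                       "generic": text.lower() in {"", "click here", "read more"}})
    return report
```

Anything the script flags as generic is a spot where a content-based link with descriptive anchor text would do more SEO work.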

More on the Twitter, Google deal

October 23, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Google’s Marissa Mayer has announced at the Web 2.0 Summit that Google Social Search will be launching in the coming weeks. 
I’ve seen an early release of it. It’s way cool. Below, what details we have now about this plus some follow up on today’s Google-Twitter search deal that was announced.

I’m in a bind because I can’t say more about the product than what Mayer released today. I wasn’t able to make it to Web 2.0 nor were her remarks on the product broadcast live. TechCrunch was there and summarizes what she said this way:
 

There’s a new Google product called Social Search that is launching in Google Labs. This is a new feature that allows you to see results for queries from people in your social network. It works by using your Google Profile. If you fill it out with the other social networks you’re a member of, such as FriendFeed, Google will scan who you are connected to and give you results from those people.

For example, I have a Google Profile here. On that page, I’ve listed my Twitter account. This means when I’m signed into Google, it can tell who I am and what my Twitter account is with certainty. Then when I search, it can offer to show me web pages that are related to other people in my Twitter profile.

More specifically, if I were to do a search relating to journalism matters, because I follow a number of people in the journalism field (not everyone might see this Twitter List yet), I’d get back both “regular” search results as well as those that are from people I follow. News.com notes that Mayer said these would appear at the bottom of regular search pages.

Other links from social sites such as Facebook or LinkedIn could also be added to your profile (any link can be added to it). To the degree Google can see your network, those can be used to filter your results.

The social search product also predates today’s news that Google has a partnership with Twitter to tap into its data. That means Social Search doesn’t depend on the Twitter deal, but it certainly should help.
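Google hasn’t published how the blending works, but the behaviour Mayer described, regular results first with results from your network appended at the bottom of the page, can be sketched in a few lines. The data shapes here are my own assumption, purely illustrative:

```python
def social_rerank(results, following):
    """Split results into regular ones and ones authored by people you
    follow, then append the social block after the regular results,
    matching the placement Mayer described.

    `results` is a list of (url, author) pairs; `following` is the set
    of author handles gathered from your connected profiles."""
    regular, social = [], []
    for url, author in results:
        (social if author in following else regular).append(url)
    # Results from your network appear at the bottom of the page.
    return regular + social
```

The interesting engineering problem is not this reordering but building `following` reliably from the networks listed on a Google Profile, which is exactly what the profile-linking step is for.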

What exactly will Google do with the Twitter data? Are we getting a dedicated Twitter search engine like the Bing Twitter Search launched today?

Google’s kind of cagey on that front. Mayer said at Web 2.0 that it will be integrated into regular results. But what does that mean? Integrated only via Universal Search, which could mean there’s also a standalone Twitter search engine out there (just as there’s a standalone image search, news search, blog search and so on)? Integrated to use Twitter data as part of the core ranking data?

I couldn’t get clarity on whether there will be a standalone Twitter search. Personally, I think there will be, or that there will be a combined microblog search service. We know Google has at least gotten people to translate a name for that service.

Whether that type of dedicated search for microblogged content service gets integrated into the completely different Social Search service that refines results on your social network remains to be seen.

Certainly Google sees the microblogged content as something that needs to be gathered and somehow integrated alongside web pages. Johanna Wright, director of product management at Google, talked to me today about this.

“There are things on Twitter that you can only find on Twitter,” she said, especially local happenings that might never see an actual news article written about them.

One example Wright gave, of stories she says Google is collecting, was about an art project where 2,000 “invisible dog” leashes were handed out in Manhattan. You know, those solid leashes that look like you have an invisible dog holding them up? No one wrote a news article about this, but if you were trying to figure out what was happening when you saw people with them, the information was being shared on Twitter.
 

Take That, Twitter: Google Hot Trends Integrated Into Google Search is another article from us that covers a primary signal Google already has for whether something is a hot topic: actual searches happening on Google. And while Google has a “query deserves freshness” algorithm that can very quickly find new pages and rank them in top results, Twitter’s data could potentially make that even faster.

Google and Twitter also agree tie up

October 22, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

Following hot on the heels of the Bing / Twitter partnership which we announced yesterday, Google has now also announced that it, too, has struck a deal with Twitter to include real-time tweets in Google’s search results.
 
“We believe that our search results and user experience will greatly benefit from the inclusion of this up-to-the-minute data, and we look forward to having a product that showcases how tweets can make search better in the coming months.”
“That way, the next time you search for something that can be aided by a real-time observation, say, snow conditions at your favorite ski resort, you’ll find tweets from other users who are there and sharing the latest and greatest information.”

That was published minutes ago on the official Google blog by VP Marissa Mayer. Not coincidentally, she’s due to speak shortly at the Web 2.0 Summit — where Bing made its announcements earlier today. We plan to live blog her appearance just as we did earlier when Microsoft’s Qi Lu was on stage.

Bing to do deal with Twitter as it launches its own Twitter Search

October 21, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

It has been reported that Microsoft will announce a deal with Twitter today to gather its real time data. We’re able to confirm that from a source as well and provide some additional details.

The deal will make Bing the first major search engine to have access to Twitter’s “Firehose” of tweets. It’s not exclusive, however. Google could potentially still do a deal, too.

We’re told that:
    * The deal will be announced today shortly after Microsoft’s Qi Lu takes the stage at the Web 2.0 conference at 11:30 Pacific Time today. Some sessions are being broadcast live here, and Lu’s might be one of them.
    * There will be a standalone Twitter search service offered at Bing, with some ranking technology involved beyond simple sort-by-date, and shortened URLs will be expanded. That service should go live today.
    * There will be some integration within the regular Bing service itself.

Discussions to gather data from Facebook are also continuing, and there’s a chance a deal might be concluded for announcement today.

We’ll update as we learn more. To understand the importance of Twitter and Facebook data to the major search engines, see my What Is Real Time Search? Definitions & Players. It covers what Bing currently does with limited Twitter data it’s able to get now.

We hope more working relationships with organizations in the search business will mean even more variety for users.

Social media: top tips to avoid getting burned

October 20, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

If you’re not already tapping into social media, you should be, as the channel offers brands tremendous opportunities to foster community and engagement.
But many marketers jump into social media efforts with little more than a “cool” idea. This is a mistake. Not only could it carry serious risks for a brand, but it could also obliterate the value you sought to derive from social media in the first place. To effectively leverage social media, you first need to devise a plan. Here are six tips to help you get started.

1) Develop your vision. Get creative and develop a vision of what your brand looks like in social media. For example, will your brand be personified, or will it have a catchy tagline? Will the user get special deals or coupons if they connect with you in this space? What is the message you wish to have transcend the brand? 

During this process, be sure to think through the different sites you are interested in using, such as YouTube, Facebook, and Twitter, and define how you envision your brand acting within these sites. Give thought to what you might tweet about, the discussions you might create, how to best use Facebook notes, or what to include on your YouTube channel. When the sky’s the limit—as it is in social media—it is essential to take the time to first create this vision.

2) Understand your goals. Outline what you are trying to achieve with social media and be specific about your goals. And note that jumping into the fray just because your competitors are doing so is not a valid reason. Instead, perhaps you want to leverage social media to interact with your customers, or drive sales, or simply to reinforce your branding efforts in alternative channels. 

Whatever the case, you first need to have clarity on what you are hoping to achieve so you can put together an effective strategy to get the job done.

3) Identify your success metrics. Decide how you will gauge the success of your social media efforts and the specific metrics you will use. For example, if the goal of your campaign is to create awareness for a contest you are running, it’s important to measure contest and brand impressions, numbers of fans or followers, video views and interactions with the site. In addition to identifying your success metrics, be sure to have analytics in place so that you can track on-page site interactions and monetary value.

4) Define how you will communicate value. Identify the value you are offering your audience and how you will communicate it. In doing so, be sure to make the connection between cool and valuable. Why? Because while creative content generates initial interest, the communication of value will keep the discussion with your audience going longer. 

Regardless of the means you choose—maybe it’s a contest, or an exclusive coupon for fans or followers—be sure to give your audience a reason to stay connected. Not only will it facilitate engagement, but it will also boost the longevity of your social media effort.

5) Integrate your efforts. From the outset, you should plan to integrate your social media efforts with the rest of your marketing initiatives as it can produce a symbiotic effect. For example, by integrating social media with your offline programs, you can create “buzz” for the launch of a new commercial, or solicit feedback about your latest magazine feature. Likewise, by integrating social media with search, you can leverage SEO tactics to help your social content rank in the search engines, build a PPC campaign to capture the demand created by you or your competitors and leverage optimized press releases to promote your efforts.

6) Identify sufficient resources.
Give thought to the effort and resources necessary to launch your social media initiatives and keep them going. Remember, just showing up in social media won’t suffice. Instead, you need to update your presence daily. For example, you can’t just create a Facebook fan page and walk away from it.

You need to invest the time necessary to leverage it as a creative means to interact with your audience and expand the conversation. As you develop your social media plan, make it a priority to identify who will be responsible for updates and for communications with your constituents. Otherwise, your social media vision will stall.

Social media is a growing channel that offers brand marketers creative ways to interact with their audience and keep the conversation going, but getting started requires more than a cool idea. To derive the most value from it, you first need a plan, and these tips should help you get started.

The correct content for a sales landing page

October 19, 2009 By: Dr Search Principal Consultant at the Search Clinic Category: Uncategorized

What do the best landing pages have for potential customers?
Can an ecommerce store’s product detail pages bog a visitor down in too much detail? Can you provide the wrong information and leave people with unanswered questions?

My friend and ongoing client Michael runs Very Colourful Jewellery, an online store for handmade fashion accessories. He recently asked me for some online marketing consulting to help him increase his conversion rate. I thought I’d share this mini-usability review to help Mike and other store owners who may be struggling with these issues.

First, let’s check out the product detail page.

The page gets the general info down fine. It obviously matches the keywords likely to deliver visitors, and like the rest of the site, there’s shopping cart info in the top right.


Possible solutions to test:

    * By far the easiest solution is to offer no alternative colors. By making the color question a simple yes-or-no decision, momentum is a lot easier to maintain.
    * A better solution is to offer a very limited range of popular colors. You could probably copy The Gap and go with blue, pink, gray, red and black. This avoids leaving money on the table from people who think, “No, I don’t like the default color.”
    * Add pictures of the product in the alternative available colors.
    * Have models wear the product, and say what size they’re wearing. Tests typically show that actual-use pictures convert better.

Shipping questions for detail pages. Two common questions visitors have are:

    * When will the product arrive? (Sometimes phrased as, “When will it ship?”)
    * What will the price of shipping be?

The product arrival date info is automatically estimated, which is a great piece of functionality. Unfortunately, this too is hidden in the discreet “Additional Information” box below the product image.
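That arrival-estimate functionality is simple enough to sketch. As a purely hypothetical illustration (the handling and transit day counts below are made-up assumptions, not the store’s real figures), it might work like this:

```python
from datetime import date, timedelta

# Hypothetical sketch of an arrival-date estimate: order date plus
# handling time plus transit time. All day counts are illustrative.
HANDLING_DAYS = 2                          # time to make and pack the item
TRANSIT_DAYS = {"Standard": 5, "Rush": 2}  # courier transit by service level

def estimated_arrival(order_date: date, shipping: str) -> date:
    """Return the estimated arrival date to show on the detail page."""
    return order_date + timedelta(days=HANDLING_DAYS + TRANSIT_DAYS[shipping])

print(estimated_arrival(date(2009, 10, 19), "Standard"))  # 2009-10-26
```

The point of surfacing such an estimate prominently, rather than tucking it in a box below the fold, is that it answers the “when will it arrive?” question before the visitor has to hunt for it.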

As to the price of shipping, this is nowhere to be found on this detail page or any others.

Normally this emphasis on moving visitors towards the checkout is good, but in this case it will create a lot of scenarios like this:

    * Add to cart
    * Check cart info
    * Continue to checkout

Then when people move on to the billing page, the ‘Standard’ and ‘Rush’ shipping options don’t provide any more info on price.

So what?

So the net effect of this lack of information on shipping times and rates is anxiety. Again, this slows momentum towards conversion.

Possible solutions

    * Embed a simple shipping calculator in a reasonably prominent part of the product detail page. For example, some of the whitespace on the right-hand side could be used without affecting how clean the page looks. Of course, that’s just a hunch – you’d have to test it to know for sure.
    * Since most products have a standard weight and size, Mike could use USPS’ “If it fits, it ships” product and just automatically list shipping rates on his product detail page according to product type.
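As a sketch of that second option, a flat rate per product type is nothing more than a lookup table the detail page template can read from. The product types and prices below are illustrative assumptions, not real USPS rates:

```python
# Hypothetical flat-rate table, in the spirit of flat-rate shipping:
# every product of a given type ships for the same known price, so the
# rate can be printed on the detail page with no calculator needed.
FLAT_RATES = {
    "necklace": 4.95,
    "bracelet": 4.95,
    "earrings": 3.50,
}

def shipping_rate(product_type: str) -> float:
    """Look up the flat shipping rate to display on the detail page."""
    return FLAT_RATES[product_type]

print(f"Ships for ${shipping_rate('earrings'):.2f}")  # Ships for $3.50
```

Because the rate is fixed per product type, it can be rendered statically alongside the price, which removes the shipping-cost anxiety before the visitor ever reaches the checkout.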

The fundamental role of a product detail page is to decrease anxiety by spelling out clearly what the product offer is. It should offer enough information to answer visitors’ questions, without overwhelming them and making them bounce.