SEARCH CLINIC

Search engine online marketers

Google profits increased by PPC sales

April 25, 2015 By: Dr Search Principal Consultant at the Search Clinic Category: Dr Search, Google, Pay Per Click, Pay Per Click Advertising, Pay Per Click Marketing, Search Clinic, Search Engine Marketing, Search Engine Optimisation, Search Engine Results, Uncategorized

Google has reported a 4% increase in profits to £2.38 billion, as strong PPC advertising sales helped boost the firm’s accounts.

Google said advertising sales for the first three months of 2015 were £10 billion, an 11% increase from the same period a year earlier.

Total revenue also increased by 12% to £11.53 billion, but like other US firms, the company was hurt by the strong dollar.

Shares in the firm rose more than 3% in trading after markets had closed.

There had been fears on Wall Street that profits would be weaker due to investment in new businesses and weaker advertising revenue as more people access Google via mobile devices, where advertising rates are lower.

But the fears turned out to be unfounded – a fall in the average price of an advert was offset by an increase in the number of adverts.

In a statement accompanying the results, chief financial officer Patrick Pichette said the company continued “to see great momentum in our mobile advertising business and opportunities with brand advertisers”.

However, Google did suffer from the stronger dollar. Taking out the impact of currency movements, Mr Pichette said revenue grew by 17% in the quarter compared with a year earlier.

The results also showed the firm continued to hire new staff at a high rate, with employee numbers up 9,000 over the past year.

Google’s mobilegeddon for non-responsive websites

April 20, 2015 By: Dr Search Principal Consultant at the Search Clinic Category: Google, Mobile Marketing, mobile phones, Search Clinic, Search Engine Marketing, Search Engine Optimisation, search engines, SEO, smart phones, Uncategorized

Google is launching “mobilegeddon” by making changes to the way its search engine ranks websites.

Google regularly changes its algorithms as it battles with Search Engine Optimisation (SEO) specialists who try to understand the system on behalf of their clients, and as it makes ongoing technical changes.

But this is a big change - dubbed “mobilegeddon” - which is designed to prioritise websites that are optimised for the mobile internet.

Google gave plenty of warning, telling developers about the change in a blog post in February and providing a simple tool to check whether sites were mobile friendly.
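Google’s own tool inspects many signals. As a rough illustration only - not Google’s actual test - one of the most basic mobile-friendliness signals, the presence of a viewport meta tag, can be checked with Python’s standard-library HTML parser:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Looks for <meta name="viewport" ...>, a basic mobile-friendliness signal."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "meta" and dict(attrs).get("name", "").lower() == "viewport":
            self.has_viewport = True

def has_viewport_tag(html: str) -> bool:
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport

page = '<html><head><meta name="viewport" content="width=device-width"></head></html>'
print(has_viewport_tag(page))  # True
```

A real audit would also consider tap-target sizes, font sizes and content width; this sketch only demonstrates the idea of programmatically testing a page for a mobile signal.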

The search firm is trying to reassure website owners that this won’t be an earthquake which turns their businesses upside down but quite a subtle evolution.

But SEO specialists say this looks like the biggest change since 2011 – and for some that will unearth some unpleasant memories.

For any online retailer, appearing on page one of Google’s search results can make all the difference between a profitable business and one heading for the scrapyard.

Google’s move to make mobile capabilities more important in search rankings seems eminently sensible as our smart phones and tablets become the key route to finding goods and services online.

But over the next few weeks we can expect cries of pain from those whom the all powerful search algorithm has deemed less worthy.

And, coming just days after the European Commission accused Google of abusing its dominance, it will be another illustration of just how important a role the Californian company plays in every corner of the global economy.

So if you need help with optimising your website then please contact us now, either by clicking the contact us button or by ringing us on 01242 521967.

Facebook announces Graph Search - a social search tool for users

January 11, 2013 By: Dr Search Principal Consultant at the Search Clinic Category: Customer Service, Dr Search, Facebook, internet, Search Clinic, Search Engine Marketing, Search Engine Results, search engines, Social Media, Technology Companies, Uncategorized

Facebook has announced a major addition to its social network - a smart search engine it has called Graph Search. The feature allows users to make “natural” searches of content shared by their friends.

Search terms could include phrases such as “friends who like Star Wars and Harry Potter”.
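Conceptually, such a query is an intersection over the social graph: filter the user’s friends by each stated interest. A toy sketch with hypothetical data (not Facebook’s actual API or data model):

```python
# Toy social graph: who is friends with whom, and what each user "likes".
friends = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob"},
}
likes = {
    "alice": {"Star Wars", "Harry Potter"},
    "bob": {"Star Wars"},
    "carol": {"Star Wars", "Harry Potter"},
}

def friends_who_like(user, *interests):
    """'friends who like X and Y': keep friends who like every listed interest."""
    return {f for f in friends[user] if all(i in likes[f] for i in interests)}

print(sorted(friends_who_like("bob", "Star Wars", "Harry Potter")))  # ['alice', 'carol']
```

The real system resolves far richer predicates (places, photos, dates), but the core shape - graph traversal plus filtering, rather than web indexing - is the same.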

Founder and chief executive Mark Zuckerberg insisted it was not a web search, and therefore not a direct challenge to Google. However, the company was integrating Microsoft’s Bing search engine for situations when Graph Search itself could not find answers.

Mr Zuckerberg said he “did not expect” people to start flocking to Facebook to do web search.

“That isn’t the intent,” he said. “But in the event you can’t find what you’re looking for, it’s really nice to have this.”

Earlier speculation had suggested that the world’s biggest social network was about to make a long anticipated foray into Google’s search territory.

“We’re not indexing the web,” explained Mr Zuckerberg at an event at Facebook’s headquarters in California.

“We’re indexing our map of the graph – the graph is really big and it’s constantly changing.”

In Facebook’s terms, the social graph is the name given to the collective pool of information shared between friends that are connected via the site.

It includes things such as photos, status updates, location data as well as the things they have “liked”.

Until now, Facebook’s search had been widely criticised as limited and ineffective.

The company’s revamped search was demonstrated to be significantly more powerful. In one demo, Facebook developer Tom Stocky showed a search for queries such as “friends of friends who are single in San Francisco”.

The same technology could be used for recruitment, he suggested, using graph search to find people who fit criteria for certain jobs – as well as mutual connections.

Such queries are a key function of LinkedIn, the current dominant network for establishing professional connections.

“We look at Facebook as a big social database,” said Mr Zuckerberg, adding that social search was Facebook’s “third pillar” and stood beside the news feed and timeline as the foundational elements of the social network.

Perhaps mindful of privacy concerns highlighted by recent misfires on policies for its other services such as Instagram, Facebook stressed that it had put limits on the search system.

Google is demoting the rankings of websites that host pirated content

August 17, 2012 By: Dr Search Principal Consultant at the Search Clinic Category: Customer Service, data security, Google, internet, Search Engine Marketing, Search Engine Results, search engines, Technology Companies, Uncategorized

Google has announced that it is changing its search engine algorithms to crack down on the search rankings of websites that receive a large number of copyright infringement notices. Google has a web page dedicated to showing how many requests it receives from copyright holders and reporting organisations to remove certain websites from its search engine due to piracy, and soon it will start demoting the rankings of those websites that receive high volumes of copyright-infringement notices.

Addressing the copyright issue, Google’s senior vice president of engineering, Amit Singhal, said in a blog post:

“In fact, we’re now receiving and processing more copyright removal notices every day than we did in 2009, more than 4.3 million URLs in the last 30 days alone,”

“We will be using this data as a signal in our search rankings.”

Google is moving to mostly penalise those websites potentially hosting pirated entertainment.

It is a positive step that Google’s search algorithms will now rerank websites - pushing them lower in search results - based on the number of copyright removal notices that Google receives against them.

The search engine said it will not demote the ranking of any of the websites from its search results unless it receives a valid copyright-removal notice from the rights owner.

Google claimed that it has already taken numerous actions against pirate websites - between July and December 2011, it removed 97 per cent of the search results specified in copyright removal notices.

Yahoo to axe non-core services to improve profits

May 15, 2012 By: Dr Search Principal Consultant at the Search Clinic Category: AdWords, bing, Customer Service, Ecommerce, internet, Microsoft, Pay Per Click, Pay Per Click Marketing, Search Engine Marketing, Search Engine Optimisation, Search Engine Results, search engines, Technology Companies, Uncategorized

Yahoo has confirmed plans to shut down dozens of services which are not seen as core to the firm. As a result, it said that it would be “shutting down or transitioning roughly 50 properties that don’t contribute meaningfully to engagement or revenue”.

The CEO Mr Thompson did not identify which units would be abandoned, but noted that news, finance, sports, entertainment and mail were safe.

“Each of our products and services may individually generate more engagement than most start-ups or even mid-sized companies in certain markets, but that does not mean that we should continue to do everything we currently do,” he was quoted as saying in a transcript of the conference call by Seeking Alpha.

The chief executive also noted that its search alliance with Microsoft was “not yet delivering” what had been expected.

The two firms agreed to team up in 2009. The idea was that Microsoft would provide Yahoo with the search results produced by its Bing service, which Yahoo would tailor to its audience. In addition Yahoo’s salesforce would target “premium” advertisers on behalf of both firms.

Mr Thompson said the UK and France were currently being moved to Microsoft’s search algorithm, and that other parts of the EU and Asia would follow.

However, he added that Yahoo was “working hard with Microsoft” to address the fact that the software firm’s AdCenter technology was still not delivering the sort of revenue it had hoped for.

For the time being Yahoo is protected against the shortfall by a “revenue per search” guarantee signed by Microsoft that is due to expire next March.

Mr Thompson was also quizzed for more detail about his promise to make better use of the company’s “vast data”.

He explained that the firm would use cookies to personalise its news content.

He added that the data would also be used to help advertisers understand how visitors used the site and to request “almost real-time” analytics data.

This is the latest in a series of turnaround plans promised for the web portal.

The key will be in getting the search and banner advert revenues higher.

Yahoo quarterly profits fall 26% with shrinking revenue

October 24, 2011 By: Dr Search Principal Consultant at the Search Clinic Category: Customer Service, data security, Email, Online Marketing, Search Engine Optimisation, search engines, SEO, Technology Companies, Uncategorized, YouTube

Yahoo has reported a quarterly profits fall of 26% as it struggled to boost earnings from online advertising. Net profits in the third quarter were £188 million, compared with £247 million during the same period last year.

Last month, Yahoo sacked chief executive Carol Bartz after its online earnings failed to keep pace with those of rivals Google and Facebook.

However, its performance beat market expectations, and its shares ended 3% higher.

Yahoo’s net revenue in the three months to September was £668 million, compared with £700 million the year before.

“My focus, and that of the whole company, is to move the business forward with new technology, partnerships, products and premium personalised content,” said interim chief executive Tim Morse.

Yahoo has been looking for a new chief executive since firing Ms Bartz in September amid mounting frustration at failed efforts to turn the firm around.

Analysts say that in recent weeks there has been increasing speculation that Yahoo, or parts of its business, might be sold to an assortment of buyout firms.

There have been rumours that Microsoft is considering a second attempt at a takeover. Microsoft last offered to buy Yahoo for £29 billion in 2008.

China’s internet firm Alibaba has already said it might be interested in buying Yahoo. However, American political sensitivities will complicate any Chinese purchase, given data-spying concerns around the Yahoo email system.

How link building can help your online business

September 07, 2011 By: Dr Search Principal Consultant at the Search Clinic Category: bing, Dr Search, Ecommerce, Google, internet, Links Building, Search Clinic, Search Engine Marketing, Search Engine Optimisation, search engines, SEO, Technology Companies, Uncategorized, Website Design, Yahoo

You’ve got a wonderful new website - so the world is going to be knocking your door down. Or so you hope. After the initial disappointment comes the realisation that it’s a big world out there.

If your website starts with the word “welcome”, congratulations - there are over three billion other websites listed by Google making the same mistake.

Go on- ask yourself how often do you search for “welcome”?

Getting online traffic depends on Search Engine Optimisation (SEO), and SEO has a number of factors:

1. Technical (how well can it be crawled by the search engines)
2. On page (what’s on the page being crawled)
3. Off page (mainly building links)
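As a rough illustration of the second factor, a minimal on-page check - a sketch using Python’s standard library, not a real SEO audit tool - might extract a page’s title and count its h1 headings:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the <title> text and counts <h1> tags - two basic on-page signals."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = "<html><head><title>Search Clinic</title></head><body><h1>Welcome</h1></body></html>"
audit = OnPageAudit()
audit.feed(page)
print(audit.title, audit.h1_count)  # Search Clinic 1
```

A fuller audit would also look at meta descriptions, alt attributes and internal links, but the principle - parse the page, inspect the signals the crawler sees - is the same.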

Google and most other search engines use links to determine reputation. A site’s ranking in Google search results is partly based on analysis of those sites that link to it.

Link-based analysis is an extremely useful way of measuring a site’s value and has greatly improved the quality of web search.

Both the quantity and, more importantly, the quality of links count towards this rating.

Of course Google does not just use links; it uses over 200 indicators, such as:

  • domain name
  • meta tags
  • alt tags
  • directory names
  • filenames
  • heading tags
  • link popularity (how many links back to you there are)
  • link text (anchor text indicating the subject of the link)
  • page title
  • Page Rank

PageRank™ is Google’s patented method of assigning a numerical weighting to each element of a hyperlinked set of documents, providing a rough estimate of the overall importance of a web page.
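The published PageRank formula - a sketch of the original academic idea, not Google’s production ranking, which uses those 200-plus signals - can be demonstrated as a simple power iteration over a tiny link graph:

```python
# Minimal PageRank power iteration over a toy link graph.
# links maps each page to the set of pages it links out to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum the rank flowing in from every page that links to p,
            # divided by each linker's number of outgoing links.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new
    return rank

# Toy web: A and C both link to B, and B links back to A.
ranks = pagerank({"A": {"B"}, "B": {"A"}, "C": {"B"}})
print(max(ranks, key=ranks.get))  # B - the most linked-to page wins
```

The result illustrates the article’s point: B, with two incoming links, outranks A (one incoming link), and C, with none, scores lowest - quantity and placement of links drive the weighting.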

In short, the whole basis of Google’s success is that, for website pages, success breeds success.

If you think that buying links is the key to your future success- hold on!

To find the pitfalls of blindly building links, just have a quick look at Google’s own link building rules.

As the article on paid links makes clear: “Buying or selling links that pass PageRank is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.”

So, what you need are good quality links and lots of them.

But what is meant by good quality links?

The seven most important factors for link quality are:

Contextual links. Contextual simply means appearing within the natural flow of a page’s text. If the link is reciprocal, Google sometimes discounts some of the value of that link.

High authority websites. Getting one link from the Telegraph or DMOZ is possibly worth more than 10,000 links from your new site. Authority does not just mean high PageRank - it means a site that is also well established as one of the most important sites for a topic.

Relevant / related links. If you have loads of links from sites that are irrelevant or unrelated to your site’s topic, you will probably lose out. In natural linking, people tend to link to one another within the same topics and industry. Spammers, like addicts, don’t care where they get their fix - or link.

Diversity of link sources. Having many links from another site is good but it’s probably better to have one link each from many sites. The former could be spam. The latter is harder to achieve.

Deep links. If you only have links pointing to your homepage and no deep links to other pages in your site then you will probably have less success than with a proportion of deep links.

Different anchor texts - the actual text of the link. For example, “Dr Search is an online marketing professional” is something we might be able to place with small variations, but the hundreds and thousands of individuals who may link to us will naturally vary the text they use, as they all think slightly differently.

Consistent links growth. Link building is a marathon, not a sprint. Acquiring 20,000 links on one day unless your marketing goes viral is very unlikely for the vast majority of sites, so any search engine will rightly be suspicious.

Google’s Panda review- another reranking coming soon

June 22, 2011 By: Dr Search Principal Consultant at the Search Clinic Category: Google, Search Engine Marketing, Search Engine Optimisation, search engines, Technology Companies, Uncategorized

Matt Cutts - Google’s unofficial spokesperson - made the Panda reranking announcement in a question and answer session at the SMX Advanced conference. With Danny Sullivan of Search Engine Land in attendance, he mentioned that Google has updated and approved the changes in the Panda algorithm but has not rolled them out yet - that should happen soon.

Addressing the frequent complaints from webmasters that web content scraping sites still outrank the original content sites, Cutts said that the Google Panda algorithm update (version 2.2) will tackle this issue.

He also said that Google will keep working on the Panda algorithm until it has a robust algorithm that can penalise content farming and related sites and display improved search results.

This means that websites that were penalised by the original Panda update could soon look forward to regaining their original rankings or even getting higher rankings in the SERPs.

The difference with Panda update 2.2 lies in its implementation - Cutts said that Google will be manually scanning for those particular websites that are involved in content farming and/or content re-publishing.

While Panda Update 2.1 was only released in early May, it was a minor update and received no fanfare from Google.

The fact that Matt Cutts has mentioned 2.2 at one of the biggest SEO conferences of the year indicates that it’s probably going to be big (and have quite an impact).

However, to avoid any great impact to your website’s ranking, ensure that your site is kept up to date with good quality content.

This will reduce the likelihood of any penalty or if you have already been penalized in previous updates then the penalty should soon be lifted after the 2.2 launch.

Google hasn’t given a specific date on the launch of the Panda update 2.2, but as soon as we hear anything more we will be sure to let you know.

Top Google search result gets 36.4pc of clicks

June 07, 2011 By: Dr Search Principal Consultant at the Search Clinic Category: Customer Service, Ecommerce, Google, Search Clinic, Search Engine Marketing, Search Engine Optimisation, search engines, Technology Companies, Uncategorized

New research has shown the importance of Google’s search results rankings - especially the top three organic positions, which receive 58.4 percent of all clicks from users, according to Optify.

Websites ranked number one received an average click-through rate (CTR) of 36.4 percent; number two had a CTR of 12.5 percent; and number three had a CTR of 9.5 percent. Being number one in Google, according to Optify, is the equivalent of all the traffic going to the sites appearing in the second through fifth positions.

Here’s Optify’s look at click-through rates across the top 20 rankings (chart: CTRs on the top 20 Google rankings).

Basically, Optify concludes that moving up to the top spot in Google from second or third could triple visits to your website.
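That “triple” claim can be checked directly from the CTR figures reported above:

```python
# Optify's reported average CTR by organic position (percent).
ctr = {1: 36.4, 2: 12.5, 3: 9.5}

# Uplift from moving to position 1 from positions 2 and 3.
for pos in (2, 3):
    print(f"Position {pos} -> 1: {ctr[1] / ctr[pos]:.1f}x more clicks")
```

Moving from second place roughly triples clicks (about 2.9x), and moving from third place nearly quadruples them (about 3.8x), consistent with Optify’s conclusion.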

Optify’s study of U.S. Google search engine results pages, conducted in December 2010, analyzed organic keyword visits for B2B and B2C websites. Optify analyzed data from 250 randomly chosen sites and an initial set of 10,000 keywords.

Here are some of Optify’s other findings:

The average CTR on Page 1 of Google was 8.9 percent, but the average CTR on Page 2 was 1.5 percent. Ranking first on Page 2 had a slight benefit over ranking in the last spot on Page 1 (2.6 percent vs. 2.2 percent CTR).

Optify concludes that, because predicting which position your site will appear in Google is basically impossible, your SEO efforts should first focus on getting on Page 1, and then on investing in working your way up to one of the top three spots. Also, ranking beyond Page 2, while good for tracking trends, has almost no business value, Optify noted.

Optify’s study defined head terms as keywords with more than 1,000 monthly Google searches and long tail terms as keywords with fewer than 100 monthly searches.

Head terms had a higher CTR (32 percent) in the number one position than long tail terms (25 percent). However, long tail terms had a higher overall CTR on Page 1 of Google than head terms (9 percent vs. 4.6 percent).

Optify concluded that you won’t see “huge benefits until you get to the top few positions” with head terms. However, long tail terms can see decent CTR almost anywhere on the first page, though there is less benefit of moving up to higher positions.

Bottom line: let your business goals shape your SEO strategy.

The research was analysed at: Top-Google-Result-Gets-36.4-of-Clicks-Study

Apple’s Mac victim of MACDefender fake malware security software

May 20, 2011 By: Dr Search Principal Consultant at the Search Clinic Category: Apple, Cyber Security, Ecommerce, internet, Microsoft, Search Engine Optimisation, search engines, Uncategorized

A fake security program for Apple computers called MACDefender is infecting a growing number of victims. Hundreds of people who installed the software are turning to Apple’s forums for help to remove it.

The program’s tactic of populating users’ screens with pornographic pictures has increased the urgency of victims to find a solution quickly.

MACDefender seems to have been successful because of the work its creators did to make it appear high up in search results through search engine optimisation.

The number of people seeking help was uncovered by ZDNet journalist Ed Bott. Ironically, he is a Microsoft specialist, but in his blog post he wrote about finding more than 200 separate discussions on Apple’s official forums about the MACDefender malware.

The volume of reports about the problem was “exceptional” in his experience, he said.

The fake Mac anti-virus software, which goes by the name of both MACDefender and Mac Security, began circulating in early May and has been steadily racking up new victims.

Such programs, often called scareware, urge people to install software that then pretends to scan a machine for security problems. It then fabricates a list of threats it has found and asks for cash before it will fix these non-existent problems.

One trick the software uses to make people cough up cash quicker is to launch the browser of unattended machines and call up one of several different pornographic websites.

The vast majority of malware that security firms see is aimed at Windows users, and there is much less malware in existence for Mac OS X than there is for Windows.

But that’s no reason to blithely think that there are no Mac threats.