
Manage high traffic keywords & low traffic keywords better - cut your keyword costs

High traffic keywords have the potential to generate a lot of business on your website but they also generate a lot of costs very quickly because they get so many clicks. Learn to manage your high traffic keywords carefully... lest they eat your lunch.

Once you have developed a long list of advertising keywords you will inevitably find that a small handful of these advertising keywords generates the most traffic to your website. Which is usually good news. But these high traffic keywords do not always deliver the most conversions. Which can be very bad news.

That is why you should...

Isolate your high traffic keywords

Actually, your first step should be to separate your advertising keywords into campaign groupings based on context and theme. For more detail on this step read:
Pay Per Click Campaign Management - Get Better Quality Scores, Spend Less Money. However, after you have separated your advertising keywords into logical divisions using your adwords campaign structure, you still need to go a step further to achieve better keyword cost efficiency.

High traffic keywords

High traffic keywords (also called fat head keywords) are usually less targeted, consisting of single words or very short phrases. Being less targeted, these high traffic keywords tend to deliver lower conversion rates than lower traffic keywords, which tend to be more highly targeted.

So, your bidding strategy for search targeted results will be different for high traffic keywords and low traffic keywords. You may need to bid lower on high traffic keywords to make them work with their lower conversion rates. But even with lower conversion rates, these high traffic keywords are very important and tend to deliver the most gross conversions because of sheer volume. The key concept is that the conversion rate tends to be different for high traffic keywords, so they should be managed separately.

High traffic keywords will float to the top of your reports if you sort them by volume in descending order. This sorting makes it easier to spot your high traffic keywords, peel them off, and move them into separate campaign groups. This separation greatly facilitates applying your unique bidding strategy to these words, and it also allows you to focus on that handful of high traffic keywords without getting lost in a long list of mixed high and low traffic keywords.
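To make that concrete, here is a minimal sketch, assuming a hypothetical keyword report and an arbitrary click threshold, of sorting by volume and peeling the high traffic keywords off into their own group. None of the data or field names come from a real account:

```python
# Minimal sketch: split a keyword report into high and low traffic groups
# so each can live in its own campaign. The report rows, field names, and
# the click threshold are all invented for illustration.

keyword_report = [
    {"keyword": "copiers", "clicks": 4200, "conversions": 35},
    {"keyword": "office furniture", "clicks": 3800, "conversions": 22},
    {"keyword": "konica copier toner cartridge", "clicks": 90, "conversions": 7},
    {"keyword": "ergonomic mesh office chair with lumbar support", "clicks": 60, "conversions": 5},
]

CLICK_THRESHOLD = 1000  # arbitrary cut-off; tune it to your own account

# Sort descending by clicks so the high traffic keywords float to the top,
# then peel them off into their own group.
sorted_report = sorted(keyword_report, key=lambda row: row["clicks"], reverse=True)
high_traffic = [row for row in sorted_report if row["clicks"] >= CLICK_THRESHOLD]
low_traffic = [row for row in sorted_report if row["clicks"] < CLICK_THRESHOLD]

for label, group in (("HIGH", high_traffic), ("LOW", low_traffic)):
    for row in group:
        rate = row["conversions"] / row["clicks"]
        print(f"{label:<5} {row['keyword']:<50} conversion rate {rate:.2%}")
```

Notice how, in this invented data, the fat head terms win on raw volume while the long tail terms win on conversion rate; that difference is exactly why the two groups deserve separate campaigns and separate bids.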

Low traffic keywords

Low traffic keywords (also called long tail keywords) tend to be more specific: they contain longer phrases and more qualifying words. Because these low traffic keywords are more specific and qualified, they generally have higher conversion rates and create high quality leads. Low traffic keywords can be managed more at the group level and less as individual words. Low volume keywords will also require less frequent tinkering.

How will this adwords campaign separation look?

Let’s say you are promoting a web site that sells office products. You have created your campaign categories to include office furniture, copiers, printers, paper, etc. Using this method of separating high traffic keywords from low traffic keywords for search targeted results, you would now have your campaigns for each of these categories set up like this:

  1. Office furniture—content
  2. Office furniture—search (high traffic keywords)
  3. Office furniture—search (low traffic keywords)
  4. Copiers—content
  5. Copiers—search (high traffic keywords)
  6. Copiers—search (low traffic keywords)
  7. Etc.

Then when you add in the adword group sub-sections, your campaign structure might look like this (see the sketch just after this list):

  1. Copiers—content category level
    1. Konica ad group level
    2. Canon ad group level
    3. Xerox ad group level
  2. Copiers—search (high volume keywords) category level
    1. Konica ad group level
    2. Canon ad group level
    3. Xerox ad group level
  3. Copiers—search (low volume keywords) category level
    1. Konica ad group level
    2. Canon ad group level
    3. Xerox ad group level
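Purely as an illustration, the same layout can be expressed as a small data structure. The category names, brands, and the content/search split simply mirror the example above; nothing here calls a real AdWords API:

```python
# Illustrative sketch of the campaign layout described above, expressed as a
# plain data structure. Categories, brands, and the content/search split are
# taken from the example; everything else is invented.

categories = ["Office furniture", "Copiers", "Printers", "Paper"]
brands = {"Copiers": ["Konica", "Canon", "Xerox"]}  # brand-level ad groups where known

variants = ("content", "search (high traffic keywords)", "search (low traffic keywords)")

campaigns = {}
for category in categories:
    for variant in variants:
        name = f"{category} - {variant}"
        # Fall back to a single generic ad group when no brand breakdown exists yet.
        campaigns[name] = {"ad_groups": brands.get(category, [category])}

for name, campaign in campaigns.items():
    print(name, "->", ", ".join(campaign["ad_groups"]))
```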

Separate your adword campaign budgets

Another advantage of separating high traffic keywords from low traffic keywords is the ability to separate your ad budgets and manage them accordingly. Because Google applies budgets at the campaign level, this separation must happen at the campaign level, rather than the adword group level, if it is going to give you separate ad budgets.

When the budgets for low traffic keywords and high traffic keywords are mixed, the high traffic keywords will tend to burn up the available cash on the front end and possibly limit your exposure for the low traffic keywords that are so highly targeted, convert well, and create high quality leads. Separating the budgeting for these two types of advertising keywords will help you balance the two.
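Here is a toy sketch of what that separation buys you, with invented figures: each campaign carries its own daily cap, so a hot head-term campaign can exhaust its own budget without starving the long tail:

```python
# Toy sketch: separate daily budgets per campaign. All campaign names and
# dollar amounts are invented; this does not read from any advertising API.

daily_budgets = {
    "Copiers - search (high traffic keywords)": 150.00,  # capped so head terms can't eat everything
    "Copiers - search (low traffic keywords)": 75.00,    # protected spend for the long tail
}

spend_so_far = {
    "Copiers - search (high traffic keywords)": 152.30,
    "Copiers - search (low traffic keywords)": 12.40,
}

for campaign, budget in daily_budgets.items():
    remaining = budget - spend_so_far.get(campaign, 0.0)
    status = "budget exhausted" if remaining <= 0 else f"${remaining:.2f} left today"
    print(f"{campaign}: {status}")
```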

Your organizational structure for campaigns is destiny. How you organize, categorize and budget your campaigns will have a dramatic impact on your success with pay per click advertising.


Pay Per Click Campaign Management - Get Better Quality Scores, Spend Less Money

Manage your pay per click campaigns better and improve your quality score with Google (and with Yahoo! or MSN). You'll spend less money to get more traffic to your website.

Pay Per Click Campaign Management For Better Quality Score

Getting a better quality score means you don't have to bid as much to get your text ads shown. You could even bid less than your competitors and still get better placement of your ad. How does it work...

In addition to your click-through rate, quality score is determined by factors such as:

  • the relevance of your ad to the keyword being searched
  • the relevance of the landing page to the keyword being searched

…you may find you get better results when you organize your pay per click advertising campaign around word concepts and types.

An Example:
If you are advertising for office products you might want to organize your campaigns by the same categories that you use to organize your website navigation. So you may find yourself with separate campaigns for furniture, copiers, paper, printers, organizers etc. rather than one large campaign for office supplies.

So instead of having one campaign for office supplies with 2,000 keywords that lumps together all sorts of keywords for office products, such as ergonomic chairs together with copier toner in one long contiguous list… you will be better off with twenty campaigns of 100 words each, where the keywords and ads are organized and separated by concept, theme, or product line.
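One hedged way to picture that reorganization: sort a mixed keyword list into themed campaigns by matching simple trigger words. The themes, trigger words, and keywords below are all made up, and a real account needs smarter matching, but the structural idea, many small themed campaigns instead of one big one, is the same:

```python
# Hypothetical sketch: bucket a mixed keyword list into themed campaigns by
# matching trigger words. Everything here is invented for illustration.

themes = {
    "Furniture": ["chair", "desk", "furniture"],
    "Copiers": ["copier", "toner"],
    "Paper": ["paper"],
    "Printers": ["printer", "ink"],
}

keywords = [
    "ergonomic office chair",
    "copier toner refill",
    "a4 paper bulk",
    "laser printer ink cartridges",
    "office supplies",          # too generic; flag it for manual review
]

campaigns = {theme: [] for theme in themes}
unassigned = []

for kw in keywords:
    for theme, triggers in themes.items():
        if any(trigger in kw for trigger in triggers):
            campaigns[theme].append(kw)
            break
    else:
        unassigned.append(kw)

print(campaigns)
print("Needs manual review:", unassigned)
```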

Form leads to relevance
When you manage your pay per click campaigns with this new categorization method, you will find that you are no longer writing very general text ads like “Get office products, large selection available”. Instead you will be creating targeted, unique copy and targeted, unique ads for each of your twenty campaigns.

Writing text ad copy that is highly targeted to the keywords being searched will greatly improve your quality score and lower the amount of money you are required to bid to get that coveted top placement. You'll find the added work you put into your pay per click management is well worth the effort.

Now go one step deeper
When you have organized your pay per click campaign by the larger concepts you can then use ad groups as sub-sections of your campaigns to further subdivide your keywords and achieve even higher relevancy and improve your quality score.

Take your campaign for copiers… break it down into adword subsections of very specific keywords that match up with some very specifically worded advertising copy. Your campaign for copiers might have ad groups broken out by manufacturer (see the sketch after this list):

  • Xerox
  • Brother
  • Konica
  • Canon
  • Etc.
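For illustration only, here is a sketch that builds those per-manufacturer ad groups, each with brand-specific keywords and brand-specific ad copy. The copy templates are invented and are not pulled from any real account:

```python
# Hypothetical sketch: one ad group per manufacturer, each with its own
# brand-specific keywords and headline. Templates are invented examples.

manufacturers = ["Xerox", "Brother", "Konica", "Canon"]

ad_groups = {}
for brand in manufacturers:
    ad_groups[brand] = {
        "keywords": [f"{brand.lower()} copier", f"{brand.lower()} copier toner"],
        "headline": f"{brand} Copiers In Stock",   # names the brand, not "office products"
        "description": f"Genuine {brand} copiers and supplies. Fast shipping.",
    }

for brand, group in ad_groups.items():
    print(f"{brand}: {group['headline']} | {', '.join(group['keywords'])}")
```

The targeted headline is the whole point: a searcher typing "canon copier" sees an ad that says Canon, which is what lifts click-through rate and, with it, quality score.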

You can see the number of ad groups you end up with is going to be rather large… but that’s the general idea of being more targeted and relevant. It’s better to have 100 ad groups with twenty keywords each than to have everything loaded into one big lump. And as I noted earlier, you'll find the added work you put into your pay per click management is well worth the effort when you see your quality score go up and your minimum bid requirements go down.

Breaking your keywords up in this manner forces you to write ad copy that is specific to each keyword group and will provide the structure and discipline you need to write more targeted copy and improve your quality score.

And a better quality score means you will spend less money to get more clicks.

Ranking Google Ranking Factors By Importance

Rand Fishkin and SEOmoz polled 132 SEO experts with data from over 10,000 Google search results, and have attempted to rank the importance of ranking signals. It’s not confirmed fact, obviously. Google won’t provide such information, but I suppose the next best thing is the collective opinion of a large group of people who make their livings getting sites to rank in search engines, and Fishkin has put together an impressive presentation.

Do you think Google is ranking search results effectively?

You can view the entire presentation here, but I’ve pulled out a few key slides that basically sum up the findings.

The factors are actually broken down into the following subsets, where each is ranked against other related factors: overall algorithmic factors, page-specific link signals, domain-wide link signals, on-page signals, domain name match signals, social signals, and highest positively + negatively correlated metrics overall.

The results find that page-level link metrics are the top algorithmic factors (22%), followed by domain-level link authority features (21%). This is similar to SEOmoz's 2009 poll, but there is a huge difference in the numbers, indicating that experts are less certain that page-level link metrics are as important. In 2009, they accounted for 43%.

[Slide: Search Ranking Factors]

Page-specific link signals are cited as metrics based on links that point specifically to the ranking page. This is how the results panned out there:

[Slide: Page-specific linking factors]

According to Fishkin, the main takeaways here are that SEOs believe the power of links has declined, that diversity of links matters more than raw quantity, and that exact match anchor text appears slightly less well-correlated than partial anchor text in external links.

Domain-wide link signals are cited as metrics based on links that point to anywhere on the ranking domain. Here is what the poll looked like in this department:

[Slide: Domain-level linking factors]

The report compares followed vs. nofollowed links to the domain and page, finding that nofollow links may indeed help with rankings:

[Slide: Nofollow]

On-page signals are cited as metrics based on keyword usage and features of the ranking document. Here’s what the poll looked like on these:

[Slide: On-page factors]

Fishkin determines that while it’s tough to differentiate with on-page optimization, longer documents tend to rank better (possibly as a result of Panda), long titles and URLs are still likely bad for SEO, and using keywords earlier in tags and docs “seems wise”.

Here is how the domain name extensions in search results shook out:

[Slide: Domain extensions]

Here are the poll results on social-media-based ranking factors (which Google has seemingly been putting more emphasis on of late):

[Slide: Social Factors]

Fishkin suggests that Facebook may be more influential than Twitter, or that it might simply be that Facebook data is more robust and available for URLs in SERPs. He also determines that Google Buzz is probably not in use directly, as so many users simply have their tweet streams go to Buzz (making the data correlation lower). He also notes that there is a lot more to learn about how Google uses social.

Andy Beard has been testing whether links posted in Google Buzz pass PageRank or help with indexing of content since February 2010. He is now claiming evidence that Buzz is used for indexing.

Danny Sullivan asked Google’s Matt Cutts about the SEOmoz ranking factors survey in a Q&A session at SMX Advanced this week – specifically about the correlation between Facebook shares and Google rankings. Cutts is quoted as saying, “This is a good example of why correlation doesn’t equal causality because Google doesn’t get Facebook shares. We’re blocked by that data. We can see fan pages, but we can’t see Facebook shares.”

The SEOmoz presentation itself has a lot more info about the methodology used and how the correlation worked out.

All of the things covered in the presentation should be taken into consideration, particularly for sites that have experienced significant drops in rankings (because of things like the Panda update or other algorithm tweaks). We recently discussed with Dani Horowitz of Daniweb a number of other things sites can also do that may help rankings in the Post-panda Google search index. DaniWeb had been hit by Panda, but has seen a steady uptick in traffic since making some site adjustments, bringing up the possibility of Panda recovery.

Barry Schwartz at Search Engine Roundtable polled his readers about Panda recovery, and 4% said they had fully recovered, while more indicated that they had recovered partially. Still, the overwhelming majority had not recovered, indicating that Google probably did its job right for the most part (that’s not to say that some sites that didn’t deserve to get hit didn’t get hit). In that same Q&A session, Cutts said, “The general rule is to push stuff out and then find additional signals to help differentiate on the spectrum. We haven’t done any pushes that would directly pull things back. We have recomputed data that might have impacted some sites. There’s one change that might affect sites and pull things back.”

A new adjustment to the Panda update has been approved at Google, but has not rolled out yet, he says. This adjustment will be aimed at keeping scraped content from ranking over original content.

Home Page Content

There have also been other interesting bits of search-related information coming out of Google this week. Cutts posted a Webmaster Central video talking about the amount of content you should have on your homepage.

“You can have too much,” said Cutts. “So I wouldn’t have a homepage that has 20MB. You know, that takes a long time to download, and users who are on a dial-up or a modem, a slow connection, they’ll get angry at you.”

“But in general, if you have more content on a home page, there’s more text for Googlebot to find, so rather than just pictures, for example, if you have pictures plus captions – a little bit of textual information can really go a long way,” he continued.

“If you look at my blog, I’ve had anywhere from 5 to 10 posts on my main page at any given time, so I tend to veer towards a little more content when possible,” he added.

Who You Are May Count More

Who you are appears to be becoming more important in Google. Google announced that it’s supporting authorship markup, which it will use in search results. The company is experimenting with using the data to help people find content from authors in results, and says it will continue to look at ways it could help the search engine highlight authors and rank results. More on this here.

Search Queries Data from Webmaster Tools Comes to Google Analytics

Google also launched a limited pilot for search engine optimization reports in Google Analytics, tying Webmaster Central data to Google Analytics, after much demand. It will use search queries data from WMT, which includes:

  • Queries: The total number of search queries that returned pages from your site results over the given period. (These numbers can be rounded, and may not be exact.)
  • Query: A list of the top search queries that returned pages from your site.
  • Impressions: The number of times pages from your site were viewed in search results, and the percentage increase/decrease in the daily average impressions compared to the previous period. (The number of days per period defaults to 30, but you can change it at any time.)
  • Clicks: The number of times your site’s listing was clicked in search results for a particular query, and the percentage increase/decrease in the average daily clicks compared to the previous period.
  • CTR (clickthrough rate): The percentage of impressions that resulted in a click to your site, and the increase/decrease in the daily average CTR compared to the previous period.
  • Avg. position: The average position of your site on the search results page for that query, and the change compared to the previous period. Green indicates that your site’s average position is improving. To calculate average position, we take into account the ranking of your site for a particular query (for example, if a query returns your site as the #1 and #2 result, then the average position would be 1.5). A short worked example follows this list.
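To make the arithmetic behind two of those metrics explicit, here is a tiny sketch with invented numbers; the formulas simply restate the definitions above:

```python
# Worked example of the CTR and average position definitions above.
# The impression, click, and position figures are invented.

impressions = 1200
clicks = 54
ctr = clicks / impressions
print(f"CTR: {ctr:.1%}")                 # 4.5%

# If a query returns your site at positions 1 and 2,
# the average position is (1 + 2) / 2 = 1.5.
positions = [1, 2]
avg_position = sum(positions) / len(positions)
print(f"Avg. position: {avg_position}")  # 1.5
```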
This week, we also ran a very interesting interview between Eric Enge and Bill Slawski addressing Google search patents and how they might relate to the Google Panda update.

Back to the SEOmoz data. Do you think the results reflect Google’s actual algorithm well? Tell us what you think.

Google Buys PostRank to Boost Social Analytics

Google bought PostRank, a social analytics service that tracks and measures social posts. With this acquisition, Google has gained a resource to better understand the social web, and that resource can be applied to great impact in several of Google's key social arenas.

What Is (or Was) PostRank?

PostRank, in their own words, "is the largest aggregator of social engagement data in the industry." The service allowed users to get real-time data on trending topics, high-volume conversations around the web, and their own posts. The company was established in 2007 under the name "AideRSS." They re-branded as PostRank in October of 2008 in preparation for more socially oriented features. In July of 2009, the company released a service to track social trends (i.e. for datamining). Over the next year they expanded into brand management, blogging services, trend analysis, and consulting services.

The company was acquired by Google earlier this month, but beyond that no details have been released regarding the transaction. A Google representative gave a fairly typical statement to TechCrunch about the company's excitement to be working with the PostRank team, stating that PostRank has "developed an innovative approach to measuring web engagement, and we think they can help us improve our products for our users and advertisers." Which is almost exactly the same as what they said about buying Slide.


While we don't know exactly what the PostRank team will be doing, it is known that they're moving from their Waterloo, Ontario home to Mountain View, California, and statements from both Google and PostRank indicate that the team's focus will continue to be on social tracking and analytics. As was the case with the Slide, SayNow, and Fflick acquisitions, Google is most likely buying the talent of the development team rather than the product.

What Google's Acquisition Means

It's fairly common knowledge that Google is going more social. Since Google's first foray into the social scene (aka, the disaster that was Buzz), the company has been attempting, with mixed success, to implement social features. However, if one thing is certain, it's that Google is committed to changing its poor position when it comes to the social web. They even went so far as tying the bonuses of all Googlers to the company's success in social.

Here are a few areas where PostRank's product and know-how may be used:

  • Google +1: The +1 button, a feature that mirrors Facebook's "Like", hopes to make a big impact by a) establishing Google as a more social brand, b) making the Google SERP a more social experience, and c) reducing the link-based gaming of the Google SERP. PostRank's technology can rapidly be applied to improve Google's tracking of social impact, making it easier to position and analyze +1's success.
  • Social Search: The recent addition of a "social promotion" element to the Google SERP, where stories that have been tweeted or +1'd by users connected to your Google account are promoted on the results page, relies on understanding how people connect on the social web and how links are shared. PostRank will help with both elements.
  • Google's Social Network: The rumored 'Google Me' or 'Google Circles' social network, which will hypothetically rival Facebook, isn't ready, probably because Google doesn't have enough social reputation or understanding of the social platform. PostRank can help them get that footing.

The acquisition is by no means a game-changer. However, it's one more resource to add to Google's stack, and the move continues to hammer in the company's social focus.
