That which can be measured can be improved, and in search engine optimization, measurement is critical to success. Professional SEOs track data about rankings, referrals, links and more to help analyze their SEO strategy and create road maps for success.
Although every business is unique and every website has different metrics that matter, the following list is nearly universal in appeal. Note that we’re only covering those metrics critical to SEO – optimizing for the search engines – and as such, more general but still important metrics may not be included. For a more comprehensive look at web analytics overall, check out Choosing Web Analytics Key Performance Indicators from Avinash Kaushik’s excellent Web Analytics Blog.
Every month, it’s critical to keep track of the contribution of each traffic source for your site. Broadly, these include:
- Direct Navigation (type in traffic, bookmarks, email links without tracking codes, etc.)
- Referral Traffic (from links across the web or in trackable email, promotion & branding campaign links)
- Search Engines (queries that sent traffic from any major or minor web search engine)
Knowing the percentage and exact numbers will help you identify strengths and weaknesses and serve as a comparison over time for trend data. If, for example, you see that traffic has spiked dramatically but it comes from referral links with low relevance while search engine and direct type-ins fell, you’ll know you’re actually in much more trouble than the raw numbers would suggest. You should use this data to track your marketing efforts and to serve as a broad measurement for your traffic acquisition efforts.
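As a rough illustration, the bucketing above can be sketched in a few lines of code. The referrer-matching rules and the set of search engine domains here are simplified assumptions, not a complete implementation (real analytics packages use far more elaborate referrer classification):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical, abbreviated list of search engine referrer hosts
SEARCH_DOMAINS = {"google.com", "bing.com", "search.yahoo.com"}

def classify_visit(referrer: str) -> str:
    """Bucket a single visit into direct, search, or referral traffic."""
    if not referrer:
        return "direct"  # type-ins, bookmarks, untagged email links
    host = urlparse(referrer).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in SEARCH_DOMAINS:
        return "search"
    return "referral"

def traffic_mix(referrers):
    """Return each source's share of total visits as a percentage."""
    counts = Counter(classify_visit(r) for r in referrers)
    total = sum(counts.values())
    return {source: round(100 * n / total, 1) for source, n in counts.items()}

visits = ["", "https://www.google.com/search?q=seo", "https://example.com/blog", ""]
print(traffic_mix(visits))  # {'direct': 50.0, 'search': 25.0, 'referral': 25.0}
```

Running this monthly and keeping the resulting percentages side by side gives you exactly the trend comparison described above.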
Three major engines make up 95%+ of all search traffic in the US (Yahoo!, Bing & Google), and for most countries outside the US (with the notable exceptions of Russia, China, Japan, Korea & the Czech Republic) 80%+ of search traffic comes solely from Google. Measuring the contribution of your search traffic from each engine is critical for several reasons:
Compare Performance vs. Market Share
By tracking not only search engines broadly, but by country, you’ll be able to see exactly the contribution level of each engine in accordance with its estimated market share. Keep in mind that in sectors like technology and Internet services, demand is likely to be higher on Google (given its younger, more tech-savvy demographic) than in areas like cooking, sports or real estate (where the percentages might be closer to the estimates from firms like Comscore).
Get Visibility Into Potential Drops
If your search traffic should drop significantly at any point, knowing the relative and exact contributions from each engine will be essential to diagnosing the issue. If all the engines drop off equally, the problem is almost certainly one of accessibility. If Google drops while the others remain at previous levels, it’s more likely to be a penalty or devaluation of your SEO efforts by that singular engine.
Uncover Strategic Value
It’s very likely that some efforts you undertake in SEO will have greater positive results on some engines than others. For example, we frequently notice that on-page optimization tactics like better keyword inclusion and targeting have more benefit with Bing & Yahoo! than Google, while gaining specific anchor text links from a large number of domains has a more positive impact on Google than the others. If you can identify the tactics that are having success with one engine (or that are failing to succeed with others), you’ll better know how to focus your efforts.
If you find your site underperforming at one of the engines (based on broad market share numbers), don’t immediately panic. Remember that search engines have demographics and biases just like any other referral source. For example, in the US, Google’s market share is supposedly between 65% and 70%, yet the vast majority of sites we’ve worked with (and those reported by our friends and colleagues in the search marketing industry) actually see Google supply 80-85% of their search traffic.
Don’t just rely on Comscore, Hitwise or Compete.com data to tell you what percentage of share an engine should provide – make sure to investigate. You can do this by running PPC ads on the various engines (and comparing impression data), checking rankings across the engines (if your Yahoo! rankings are just as good or better than your Google rankings, it’s not missed opportunity, it’s lower volume), and making sure you haven’t made any dumb mistakes (blocking other engines’ spiders, using the meta robots NOODP to control listings at Google, but forgetting to use NOYDIR at Yahoo!, etc.).
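One way to operationalize this check is to compare each engine’s observed share of your search referrals against third-party market-share estimates and flag large shortfalls for investigation. The function below is a hedged sketch with made-up numbers and an arbitrary tolerance, not a definitive benchmark:

```python
def share_gaps(engine_visits: dict, market_share: dict, tolerance: float = 0.15):
    """Compare each engine's observed share of search referrals to its
    estimated market share and flag large shortfalls worth investigating.
    The 15% tolerance is an illustrative assumption, not a standard."""
    total = sum(engine_visits.values())
    flags = {}
    for engine, expected in market_share.items():
        observed = engine_visits.get(engine, 0) / total
        if observed < expected * (1 - tolerance):
            flags[engine] = {"observed": round(observed, 3),
                             "expected": expected}
    return flags

# Hypothetical monthly referral counts vs. estimates (e.g., from Comscore)
visits = {"google": 8200, "yahoo": 900, "bing": 900}
estimates = {"google": 0.67, "yahoo": 0.17, "bing": 0.11}
print(share_gaps(visits, estimates))
```

In this fabricated example, Google overdelivers relative to its estimated share (consistent with what we see in practice), while the shortfalls at the other engines would prompt the PPC-impression and rankings comparisons described above.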
The terms and phrases that send traffic are another important piece of your analytics pie. You’ll want to keep track of these on a regular basis to help identify new trends in keyword demand, gauge your performance on key terms and find terms that are bringing significant traffic you’re potentially under-serving (e.g., you rank well and get visits, but don’t have content that helps the searcher accomplish their goal).
You may also find value in tracking search referral counts for terms outside the “top” terms/phrases – those that are important and valuable to your business. If the trend lines are pointing in the wrong direction, you know efforts need to be undertaken to course correct. Search traffic worldwide has consistently risen over the past 15 years, so a decline in quantity of referrals is troubling – check for seasonality issues (keywords that are only in demand certain times of the week/month/year) and rankings (have you dropped, or has search volume ebbed?).
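To separate seasonality from a genuine decline, one simple approach is to compare each keyword’s referrals against the same month a year earlier, rather than the previous month. This sketch assumes you’ve exported at least 13 months of keyword referral counts; the 20% drop threshold is illustrative:

```python
def declining_keywords(monthly_referrals: dict, drop_threshold: float = 0.2):
    """monthly_referrals maps keyword -> list of monthly visit counts,
    oldest first. Compares the latest month to the same month last year
    (13 data points back) so seasonal terms aren't falsely flagged."""
    declining = []
    for keyword, series in monthly_referrals.items():
        if len(series) < 13:
            continue  # need same-month-last-year data to compare
        current, year_ago = series[-1], series[-13]
        if year_ago and (year_ago - current) / year_ago > drop_threshold:
            declining.append(keyword)
    return declining

data = {"seo tools": [100] * 12 + [60],   # down 40% year-over-year
        "link building": [50] * 13}       # flat
print(declining_keywords(data))  # ['seo tools']
```

Terms this flags are candidates for the rankings-vs-search-volume diagnosis described above.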
When it comes to the bottom line for your organization, few metrics matter as much as conversion. However, analytics software often attributes a conversion entirely to the last referral, clouding the true picture of what brought a visitor who “converted.” For example, in the graphic to the right, 4.46% of visitors who reached SEOmoz with the query “check backlinks” signed up to become members during that visit. What we don’t know (at least, from this simple analysis) is how many of those visitors had already signed up, how many signed up during a later visit, or even what percentage of those visits were first-time visitors.
The real value from this sort of simplistic tracking comes from the “low-hanging fruit” – seeing terms/phrases that continually send visitors who convert and increasing focus on both rankings and traffic from that keyword referral as well as improving the landing pages that visitors reach. While conversion rate tracking from keyword phrase referrals is certainly important, it’s never the whole story. Dig deeper and you can often uncover far more interesting and applicable data about how conversion starts and ends on your site.
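Digging for that low-hanging fruit can start with something as simple as ranking referral terms by conversion rate, filtered to those already sending meaningful traffic. The data shape and thresholds below are hypothetical:

```python
def best_converting_terms(referrals, min_visits=100, min_rate=0.03):
    """referrals: iterable of (keyword, visits, conversions) tuples.
    Returns high-traffic, high-converting terms sorted by conversion
    rate. The cutoffs are illustrative, not recommendations."""
    winners = []
    for keyword, visits, conversions in referrals:
        rate = conversions / visits if visits else 0.0
        if visits >= min_visits and rate >= min_rate:
            winners.append((keyword, round(rate, 4)))
    return sorted(winners, key=lambda kv: kv[1], reverse=True)

# Fabricated sample: visits and sign-ups per referring query
data = [("check backlinks", 1500, 67),  # converts well, good volume
        ("seo guide", 90, 10),          # converts well, too little traffic yet
        ("link tools", 400, 3)]         # traffic without conversions
print(best_converting_terms(data))  # [('check backlinks', 0.0447)]
```

Terms that pass both filters are the ones worth extra ranking and landing-page investment; terms with traffic but no conversions point to under-serving content.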
Knowing the number of pages that receive search engine traffic is an essential metric for monitoring overall SEO performance. From this number, we can get a glimpse into indexation (how many pages the engines are keeping in their indices from our site), and, more importantly, watch trends over time. For most large websites (50,000+ pages), mere inclusion is essential to earning traffic, and this metric delivers a trackable number that’s indicative of success or failure. As you work on issues like site architecture, link acquisition, XML Sitemaps, uniqueness of content and meta data, etc., the trend line should rise, showing that more and more pages are earning their way into the engines’ results. Pages receiving search traffic is, quite possibly, the best long tail metric around.
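From raw visit data, this metric reduces to counting the unique landing pages that received at least one search visit in each period. A minimal sketch, assuming a pre-classified visit log (the field names are hypothetical):

```python
from collections import defaultdict

def pages_receiving_search_traffic(visit_log):
    """visit_log: iterable of (month, landing_page, source) tuples, where
    source is 'search', 'direct', or 'referral'. Returns a month -> count
    of unique pages that earned at least one search visit."""
    pages_by_month = defaultdict(set)
    for month, page, source in visit_log:
        if source == "search":
            pages_by_month[month].add(page)
    return {month: len(pages) for month, pages in sorted(pages_by_month.items())}

log = [("2010-01", "/blog/a", "search"), ("2010-01", "/blog/a", "search"),
       ("2010-01", "/tools", "direct"),  ("2010-02", "/blog/a", "search"),
       ("2010-02", "/blog/b", "search")]
print(pages_receiving_search_traffic(log))  # {'2010-01': 1, '2010-02': 2}
```

A rising month-over-month count is the trend line you want as your indexation and long-tail efforts pay off.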
While other analytics data points are also of great importance, those mentioned above should be universally applied to get the maximum value from your SEO campaigns.
The Right Tools for the Job
- Sawmill Analytics
- Lyris / Clicktracks
- Unica Affinium NetInsight
- Yahoo! Web Analytics (formerly IndexTools)
- Google Analytics
- Clicky Web Analytics
- Piwik Open Source Analysis
- Woopra Website Tracking
While choosing can be tough, at the time of publication, our top recommendation is for Google Analytics (so long as you have few privacy concerns and don’t mind the brief data delays), followed closely by Clicky. Yahoo! Web Analytics also has a solution worth considering. If you cannot use tracking code on your web pages and need a log-file based solution, AWStats is our top recommendation, though any log file based tracking will suffer from the inability to track clickstream paths, first-time vs. returning visits and other important metrics as accurately as cookie/session based software.
No matter which analytics software you decide is right for you, we also strongly recommend testing different versions of pages on your site and making conversion rate improvements based on the results. Testing pages on your site can be as simple as using a free tool to test two versions of a page header or as complex as using expensive multivariate software to simultaneously test hundreds of variants of a page. There are many testing platforms out there, but if you’re looking to put a first toe in the testing waters, one free, easy to use solution we recommend is Google’s Website Optimizer. It’s a great way to get started running tests that can inform powerful conversion rate improvements.
Search Engine Optimization
In organic SEO, it can be difficult to track the specific elements of the engines’ algorithms effectively given that this data is not public, nor is it even well researched. However, a combination of tactics has become best practice, and new data is constantly emerging to help track direct ranking elements and positive/negative ranking signals. The data points covered below are ones we occasionally recommend for tracking campaigns, and they have proven to add value when used in concert with analytics.
We’ve already discussed many of the data points provided by services such as Google’s Webmaster Tools, Yahoo! Site Explorer and Microsoft’s Webmaster Tools. In addition to these, the engines provide some insight through publicly available queries and competitive intelligence. Below is a list of queries/tools/metrics from the engines, along with their respective applications.
Employing these queries & tools effectively requires that you have an informational need with an actionable solution. The data itself isn’t valuable unless you have a plan of what to change/build/do once you learn what you need to know (this holds true for competitive analysis as well).
Google Site Query
e.g., site:seomoz.org – useful to see the number and list of pages indexed on a particular domain. You can expand the value by adding additional query parameters. For example – site:seomoz.org/blog inurl:tools – will show only those pages in Google’s index that are in the blog and contain the word “tools” in the URL. While this number fluctuates, it’s still a good rough measurement. You can read more about this on this blog post.
Google Trends
Available at Google.com/Trends – this shows keyword search volume/popularity data over time. If you’re logged into your Google account, you can also get specific numbers on the charts, rather than just trend lines.
Google Trends for Websites
Available at Trends.Google.com/websites – this shows traffic data for websites according to Google’s data sources (toolbar, ISP data, analytics and others may be part of this). A logged-in user will see numbers in the chart indicating estimated traffic levels.
Google Insights for Search
Available at google.com/insights/search – this tool provides data about regional usage, popularity and related queries for keywords.
Yahoo! Site Query
e.g., site:seomoz.org – note that a standard site query will automatically redirect to Yahoo!’s Site Explorer, but advanced queries that include additional parameters such as site:seomoz.org inurl:rand will show Yahoo!’s standard results format. You can use these much in the same way as the Google site query to see the number and list of pages Yahoo! has in their index for a particular site.
Yahoo! Link & Linkdomain Queries
e.g., linkdomain:seomoz.org – as with site queries, these will redirect to Yahoo! Site Explorer unless additional parameters are employed. For example, to see only links to SEOmoz.org that have the word “google” in the title tag, you’d use the query – linkdomain:seomoz.org intitle:google. Yahoo!’s link queries are the most robust and accurate of the major engines, but unfortunately they do include nofollow links which they don’t separately mark.
Bing Site Query
e.g., site:seomoz.org – just like Yahoo! and Google, Bing allows for queries to show the number and list of pages in their index from a given site. Unfortunately, Bing’s counts are prone to wild fluctuation and massive inaccuracy, often rendering the counts themselves useless.
Bing IP Query
e.g., ip:188.8.131.52 – this query will show pages that Microsoft’s engine has found on the given IP address. This can be useful in identifying shared hosting and seeing what other sites are hosted on a given IP address.
Microsoft AdCenter Labs
Available at adlab.microsoft.com/alltools.aspx – a great variety of keyword research and audience intelligence tools are provided by Microsoft, primarily for search and display advertising. This guide won’t dive deep into the value of each individual tool, but they are worth investigating and many can be applied to SEO.
Ask Site Query
e.g., site:seomoz.org inurl:www – Ask.com is a bit picky in its requirements around use of the site query operator. To function properly, an additional query must be used (although generic queries such as the example above are useful to see what a broad “site” query would normally return).
Blog Search Link Query
e.g., link:www.seomoz.org/blog – Although Google’s normal web search link command is not always useful, their blog search link query shows generally high quality data and can be sorted by date range and relevance. You can read more about this on this blog post.
Page Specific Metrics
Page Authority – Page Authority predicts the likelihood of a single page to rank well, regardless of its content. The higher the Page Authority, the greater the potential for that individual page to rank.
mozRank – mozRank refers to SEOmoz’s general, logarithmically scaled 10-point measure of global link authority (or popularity). mozRank is very similar in purpose to the measures of static importance (which means importance independent of a specific query) that are used by the search engines (e.g., Google’s PageRank or FAST’s StaticRank). Search engines often rank pages with higher global link authority ahead of pages with lower authority. Because measures like mozRank are global and static, this ranking power applies to a broad range of search queries, rather than pages optimized specifically for a particular keyword.
mozTrust – Like mozRank, mozTrust is distributed through links. First, trustworthy “seeds” are identified to feed the calculation of the metric. (These include the homepages of major international university, media and governmental websites.) Websites that earn links from the seed set are then able to cast (lesser) trust-votes through their links. This process continues across the web and the mozTrust of each applicable link decreases as it travels “farther” from the original trusted seed site.
# of Links – The total number of pages that contain at least one link to this page. For example, if the Library of Congress homepage (http://www.loc.gov/index.html) linked to the White House’s homepage (http://www.whitehouse.gov) in both the page content and the footer, this would still be counted as only a single link.
# of Linking Root Domains – The total number of unique root domains that contain a link to this page. For example, if topics.nytimes.com and http://www.nytimes.com both linked to the homepage of SEOmoz (http://www.seomoz.org), this would count as only a single linking root domain.
External mozRank – Whereas mozRank measures the link juice (ranking power) of both internal and external links, external mozRank measures only the amount of mozRank flowing through external links (links located on a separate domain). Because external links can play an important role as independent endorsements, external mozRank is an important metric for predicting search engine rankings.
Domain Specific Metrics
Domain Authority – Domain Authority predicts how well a web page will rank on a domain. The higher the Domain Authority, the greater the potential for an individual page on that domain to rank well.
Domain mozRank – Domain-level mozRank (DmR) quantifies the popularity of a given domain compared to all other domains on the web. DmR is computed for both subdomains and root domains. This metric uses the same algorithm as mozRank but applies it to the “domain-level link graph” (a view of the web that only looks at domains as a whole and ignores individual pages). Viewing the web from this perspective offers additional insight about the general authority of a domain. Just as pages can endorse other pages, a link which crosses domain boundaries (e.g., from a page on searchengineland.com to a page on http://www.seomoz.org) can be seen as an endorsement by one domain for another.
Domain mozTrust – Just as mozRank can be applied at the domain level (Domain-level mozRank), so can mozTrust. Domain-level mozTrust is like mozTrust, but instead of being calculated between web pages, it is calculated between entire domains. New or poorly linked-to pages on highly trusted domains may inherit some natural trust by virtue of being hosted on the trusted domain. Domain-Level mozTrust is expressed on a 10-point logarithmic scale.
# of Links – the quantity of pages that contain at least one link to the domain. For example, if http://www.loc.gov/index.html and http://www.loc.gov/about both contained links to http://www.nasa.gov, this would count as two links to the domain.
# of Linking Root Domains – the quantity of different domains that contain at least one page with a link to any page on this site. For example, if http://www.loc.gov/index.html and http://www.loc.gov/about both contained links to http://www.nasa.gov, this would count as only a single linking root domain to nasa.gov.
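The two counting rules in these examples (deduplicate links by source page, and pages by root domain) can be illustrated with a short sketch. Note that the root-domain extraction here is deliberately naive and would mishandle suffixes like .co.uk; this is an illustration of the counting logic, not of how SEOmoz computes these metrics:

```python
from urllib.parse import urlparse

def root_domain(url: str) -> str:
    """Naive root-domain extraction: keep the last two host labels."""
    host = urlparse(url).netloc.lower()
    return ".".join(host.split(".")[-2:])

def count_links(link_pairs):
    """link_pairs: iterable of (source_page_url, target) tuples.
    Returns (# of linking pages, # of linking root domains), counting
    each source page once and each root domain once."""
    pages = {src for src, _ in link_pairs}
    domains = {root_domain(src) for src, _ in link_pairs}
    return len(pages), len(domains)

links = [("http://www.loc.gov/index.html", "http://www.nasa.gov"),
         ("http://www.loc.gov/about", "http://www.nasa.gov"),
         ("http://topics.nytimes.com/a", "http://www.seomoz.org"),
         ("http://www.nytimes.com/b", "http://www.seomoz.org")]
print(count_links(links))  # (4, 2): four linking pages, two root domains
```

This mirrors the examples above: two loc.gov pages count as two links but one root domain, just as the two nytimes.com hosts do.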
Applying That Data To Your Campaign
Just knowing the numbers won’t help unless you can effectively interpret and apply changes to course-correct. Below, we’ve taken a sample of some of the most common directional signals provided by tracking data points and how to respond with actions to improve or execute on opportunities.
Fluctuation in Search Engine Page and Link Count Numbers
The numbers reported in “site:” and “link:” queries are rarely precise, and thus we strongly recommend not getting too worried about fluctuations showing massive increases or decreases unless they are accompanied by traffic drops. For example, on any given day, Yahoo! reports between 800,000 and 2 million links to the SEOmoz.org domain. Obviously, we don’t gain or lose hundreds of thousands of links each day, but the variability of Yahoo!’s indices means that these reports provide little guidance about our actual link growth or shrinkage.
If you do see significant drops in links or pages indexed accompanied by similar traffic referral drops from the search engines, you may be experiencing a real loss of link juice (check to see if important links that were previously sending traffic/rankings boosts still exist) or a loss of indexation due to penalties, hacking, malware, etc. A thorough analysis using your own web analytics and Google’s Webmaster Tools can help to identify potential problems.
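One way to encode that advice is to treat reported counts as noisy, and only raise an alarm when the latest reading falls well below a rolling baseline and search traffic has fallen too. The thresholds below are illustrative assumptions, not recommendations:

```python
from statistics import median

def real_link_loss(link_counts, traffic, count_drop=0.5, traffic_drop=0.2):
    """link_counts, traffic: recent readings, oldest first. Flags a probable
    real loss only when the newest link count sits far below the median of
    prior readings AND search traffic dropped as well (hypothetical cutoffs)."""
    count_baseline = median(link_counts[:-1])
    traffic_baseline = median(traffic[:-1])
    count_down = link_counts[-1] < count_baseline * (1 - count_drop)
    traffic_down = traffic[-1] < traffic_baseline * (1 - traffic_drop)
    return count_down and traffic_down

# Yahoo!-style noisy link counts paired with daily search referrals
print(real_link_loss([1_500_000, 800_000, 1_200_000, 300_000],
                     [10_000, 9_800, 10_100, 7_000]))  # True: both fell
```

A drop in the link count alone (the common case with Yahoo!’s variability) stays quiet; only the combination triggers the deeper analysis described above.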
Falling Search Traffic from a Single Engine
If referrals from a single engine have dropped while the others hold steady, it’s likely that engine has penalized your site or devalued your SEO efforts (as discussed in the traffic-drop diagnosis above). Review your recent tactics against that engine’s guidelines and check its webmaster tools console for any warnings or notifications.
Falling Search Traffic from Multiple Engines
Chances are good that you’ve done something on your site to block crawlers or stop indexation. This could be something in the robots.txt or meta robots tags, a problem with hosting/uptime, a DNS resolution issue or a number of other technical breakdowns. Talk to your system administrator, developers and/or hosting provider and carefully review your Webmaster Tools accounts and analytics to help determine potential causes.
Individual Ranking Fluctuations
Gaining or losing rankings for a particular term/phrase or even several happens millions of times a day to millions of pages and is generally nothing to be concerned about. Ranking algorithms fluctuate, competitors gain and lose links (and on-page optimization tactics) and search engines even flux between indices (and may sometimes even make mistakes in their crawling, inclusion or ranking processes). When a dramatic rankings decrease occurs, you might want to carefully review on-page elements for any signs of over-optimization or violation of guidelines (cloaking, keyword stuffing, etc.) and check to see if links have recently been gained or lost. Note that with sudden spikes in rankings for new content, a temporary period of high visibility followed by a dramatic drop is common (in the SEO field, we refer to this as the “freshness boost”).
“Don’t panic over small fluctuations. With large drops, be wary against making a judgment call until at least a few days have passed. If you run a new site or are in the process of link acquisition and active marketing, these sudden spikes and drops are even more common, so simply be prepared and keep working.”
Increases in Link Metrics Without Rankings Improvements
Many site owners assume that once they’ve done some “classic” SEO – on-page optimization, link acquisition, etc. – they can expect instant results. This, sadly, is not the case. Particularly for new sites, pages and content competing in very difficult results, rankings take time, and even earning lots of great links is not a sure recipe to instantly reach the top. Remember that the engines need to not only crawl all those pages where you’ve acquired links, but also index and process them – given the almost certain use of delta indices by the engines to help with freshness, the metrics and rankings you’re seeking may be days or even weeks behind the progress you’ve made.