Over the past week, the ever-present issue of (not provided) has been given a forceful shove from the edge of the ring right into the spotlight. Google’s recent move to encrypt all keyword referral data, even for users who aren’t signed in, has signalled the loss of some useful information that SEOs have traditionally used to quantify site performance in SERPs. But what are the actual implications of this change? Is it the end of the world, or simply a case of marketers needing to adopt a fresher perspective?
What is (not provided)?
The (not provided) label denotes searches made via a secure connection (https://), where keyword data is not passed to the destination URL. These keywords are displayed in Analytics as “(not provided)”. Google’s stated reasoning for its introduction was increased user privacy, but other theories bill it as an attempt to drive engagement with Adwords, to prevent the NSA spying on user search terms, or to limit the amount of what Google considers proprietary data that is available to competitors. The speculation regarding motives stems from the fact that Google doesn’t extend the same privacy protection to users clicking on paid search ads, as keyword data is still passed to Adwords advertisers.
When it was originally introduced for logged-in users in October 2011, Matt Cutts, Google’s Head of Webspam, stated that even at full roll-out the (not provided) count would not surpass a single-digit percentage. Despite this, over the last couple of years the figure has been steadily increasing as secure search has become standard in a number of places, including as the default setting for the majority of popular browsers.
In recent weeks, however, the figure has deviated from its pattern of gradual increase to a steep climb that is fast approaching 100%. At the time of writing, notprovidedcount.com predicts that we will hit 100% on 21st November 2013, although now that Google has adopted SSL as standard, it is effectively 100% already.
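As a rough illustration of how this figure is tracked, the (not provided) share can be computed from an exported keyword report. The column names and sample figures below are hypothetical — adapt them to however your own Analytics keyword data is exported:

```python
import csv
from io import StringIO

# Hypothetical keyword export: one row per keyword with a visit count.
# In practice this would be a CSV file exported from your analytics tool.
sample_export = StringIO(
    "keyword,visits\n"
    "(not provided),8500\n"
    "blue widgets,300\n"
    "acme widgets,1200\n"
)

total = 0
not_provided = 0
for row in csv.DictReader(sample_export):
    visits = int(row["visits"])
    total += visits
    if row["keyword"] == "(not provided)":
        not_provided += visits

share = 100.0 * not_provided / total
print(f"(not provided) share: {share:.1f}%")  # 85.0% for this sample
```

Tracking this percentage over time on your own site gives a sense of how much organic keyword visibility you have left.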
What Does this Mean for Website Owners?
The main implications of this change are that you can no longer see granular organic keyword data from Google and use it in the analysis of your site’s performance in SERPs. Without running paid search campaigns, you can’t see accurate conversion rates for particular keywords or even a clear brand/non-brand split. Where previously Analytics keyword data was an essential aspect of measuring the performance of an SEO campaign – for example, conversion rates could be attributed to specific organic keywords – other methods will now have to be used to determine the success (or failure) of efforts.
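Where keyword data does survive — in a paid search report, say, or a Webmaster Tools export — a crude brand/non-brand split can still be approximated by matching against a list of brand terms. This is a sketch only; the brand terms, keywords, and visit figures below are invented for illustration:

```python
# Hypothetical brand terms — substitute your own brand names and variants.
BRAND_TERMS = ("acme", "acme widgets")

def is_brand(keyword):
    """Classify a keyword as brand if it contains any brand term."""
    kw = keyword.lower()
    return any(term in kw for term in BRAND_TERMS)

# Illustrative keyword -> visits data from a report that still carries keywords.
keywords = {
    "acme widgets": 1200,
    "buy blue widgets": 300,
    "widget reviews": 450,
}

brand = sum(v for k, v in keywords.items() if is_brand(k))
non_brand = sum(v for k, v in keywords.items() if not is_brand(k))
print(brand, non_brand)  # 1200 750
```

Simple substring matching will misclassify edge cases (misspellings, partial overlaps), so a real implementation would want a more careful rule set.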
With this loss of precision for organic keywords, there is a blurring of lines between the effects of different marketing channels on organic performance. SEO and offline branding activities will become more difficult to distinguish from one another and, while a move towards more integrated campaign analytics will in some respects be positive, there will still be a need for clarity when it comes to differentiating which mediums were most effective.
Despite such a significant shake-up, all is not lost: there are several ways to extract meaningful keyword data and adequately measure organic campaign results.
Webmaster Tools is the obvious place to turn, as it already reports keyword and ranking data. Accuracy is a real issue – many figures are rounded up or down – but Google has been incrementally improving the precision and scope of what’s on offer, and you can now view data from the last 12 months as opposed to the previously available 3 months. Dr Pete over at Moz also posted a report on the increasing accuracy of the ranking data available in Webmaster Tools, with some fairly positive conclusions.
As previously mentioned, keyword data is still fully available to paid search advertisers in Adwords, so running paid search campaigns can allow you to determine important metrics such as impression data and conversion rates, and to identify opportunities via the search terms report. The drawback here is the additional cost, but when effectively employed, the value of the data retrieved may easily outweigh the expense.
In addition to these methods, extensive landing page and user behaviour analysis, coupled with a renewed emphasis on rankings, will also feature heavily. The latter, however, is something of an inexact science due to considerable SERP personalisation.
The Marketer’s Point of View
From a marketer’s point of view, while this change removes a chunk of the data that we use for a wide range of tasks including certain aspects of reporting and opportunity identification, it doesn’t necessarily have to be a problem. Testing a number of proposed solutions and taking a new and broader approach to the metrics we use, including considering multiple data sources, is a positive thing and will likely refresh and benefit our perspective as a whole.