Firefox 14 & The Reclamation of (not provided)

Posted on 28 Aug 2012 in Digital Marketing



Mozilla’s recent announcement that Firefox will now use SSL-encrypted Google search by default for all users has stoked the flames of the online privacy debate initially sparked by Google’s introduction of automatic secure connections for logged-in users last October. Does this development signal the beginning of the end for organic keyword data?

Extending beyond Google’s own encryption efforts, Firefox 14 will encrypt all Google organic searches by default, regardless of whether the user is signed in. Furthermore, Mozilla has declared its intent to roll this functionality out to additional search engines in the future, so in effect we, as digital professionals, will be unable to see keyword information for any natural search click made from the Firefox browser.

However, early analysis suggests the impact of Firefox 14 has been relatively muted in comparison to Google’s privacy rollout last October. Earlier posts have discussed both the initial effects of Google’s introduction of SSL encryption and its continued impact on organic search performance analysis over the subsequent months. In a similar vein, we have examined data from a sample of LBi clients operating across a range of verticals in order to assess the initial effects of the release of Firefox 14 on the prevalence of “not provided” search terms. Firstly, it must be noted that Firefox 14 rapidly became the most utilised of all versions of the browser: as of the 15th August, 69% of total Firefox visits to the sample client sites were using version 14.0.1, and this trend will only continue. Secondly, whilst some sources place Firefox at around a quarter of the worldwide browser market, our research into the effects of Firefox SSL encryption revealed that only around 14% of visitors to the sample of client sites used Mozilla’s browser.

Looking at instances of “not provided”, there was, as would be expected, a significant rise in the level of encrypted search terms from Firefox traffic in the wake of the release of version 14. In one case, comparing the week prior to the release of version 14.0.1 with the week commencing 30th July, the proportion of total Firefox visits generated by keyword “not provided” increased by 21%. However, looking at traffic across all browsers, the impact of version 14’s automatic encryption of natural search terms appears much less alarming. Below are the top-line results from a focussed case study of four clients. The average number of visits from “not provided”, as a proportion of total visits, was calculated for both the short period preceding the 17th July and the period immediately following:
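
By way of illustration, the sketch below shows how such a proportion might be calculated from a keyword-level analytics export. The column names and figures are invented for the example and are not the case-study data.

```python
# Illustrative sketch only: the rows below are invented, and the column
# names assume a simple keyword-level export (date, keyword, visits)
# such as one might pull from a web analytics package.
import pandas as pd

visits = pd.DataFrame([
    ("2012-07-10", "(not provided)", 180),
    ("2012-07-10", "lawnmowers",     420),
    ("2012-07-24", "(not provided)", 260),
    ("2012-07-24", "lawnmowers",     400),
], columns=["date", "keyword", "visits"])
visits["date"] = pd.to_datetime(visits["date"])

release = pd.Timestamp("2012-07-17")  # Firefox 14.0.1 release date
visits["period"] = visits["date"].apply(
    lambda d: "pre-release" if d < release else "post-release")

# Share of organic visits reported as "(not provided)" in each period
share = (
    visits[visits["keyword"] == "(not provided)"]
    .groupby("period")["visits"].sum()
    / visits.groupby("period")["visits"].sum()
)
print(share.round(3))
```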

In the case of Client D, the average number of visits from “not provided” as a proportion of total visits rose by only 1%. Furthermore, if we visualise the data and compare trends to those previously reported in the wake of Google’s move to encrypt logged-in users’ search terms, we can see a clear discrepancy in the reach and impact of the two privacy-related developments:

As previously mentioned, Firefox’s share amongst visitors to the sample sites was found over the course of this research to have been consistently around the 14% mark over the last seven months. In addition, usage of version 14 amongst Firefox users of client sites has yet to reach 100%. In sum, the impact of the release of version 14 has so far been limited to a smaller, albeit not insignificant, segment of internet usage. When Google rolled out SSL encryption last October, on the other hand, the effects were much wider reaching, as they applied to anyone signed in to Google, regardless of browser or version.

Whilst the release of Firefox 14 will cause some degree of concern amongst those working within the search industry, it doesn’t spell the end of data pertaining to free clicks. Any implementation of search encryption does serve to muddy the waters panned by search professionals, but all is not lost if we employ an intuitive combination of filters, segments, benchmarks and common sense.

Indeed, Avinash Kaushik has perhaps made the strongest argument in favour of using historical data to guide our ongoing interpretation of “not provided” traffic. If, in the past, the proportion of brand keywords driving search traffic to a given website was consistently around 30%, why should that suddenly cease to be the case? Finger-in-the-air measurement, yes, but a sensible assumption nonetheless.
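
As a rough, hedged illustration of this approach (all figures below are invented, and the 30% brand share simply echoes the hypothetical benchmark above), the hidden brand component of “not provided” traffic might be estimated like so:

```python
# Finger-in-the-air reallocation of "(not provided)" visits using a
# historical benchmark. All numbers are illustrative.
historical_brand_share = 0.30     # brand terms historically ~30% of organic visits

monthly_organic_visits = 10_000   # hypothetical month of organic traffic
not_provided_visits = 3_500       # of which "(not provided)"
visible_brand_visits = 1_800      # brand visits still visible in keyword reports

# Assume the hidden traffic behaves like the historical brand/non-brand split
estimated_total_brand = monthly_organic_visits * historical_brand_share
estimated_hidden_brand = min(
    max(estimated_total_brand - visible_brand_visits, 0),
    not_provided_visits,
)

print(f"Estimated brand visits hidden inside '(not provided)': "
      f"{estimated_hidden_brand:.0f} of {not_provided_visits}")
```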

Furthermore, we may glean additional insight by combining instances of keyword encryption with landing page data. We may rationally deduce that the majority of “not provided” keywords that generated landings on a site’s homepage were brand terms. We may go further and propose that homepage landings, and landings on focussed hub pages such as /lawnmowers, were driven by ‘head’ rather than long-tail keyword terms. In addition, we can QA assumptions of this nature, to an extent, by running a test PPC campaign in parallel at sporadic intervals. Since paid search advertising sidesteps the “not provided” issue, it is possible to obtain accurate snapshots of how the brand/non-brand and long/short-tail splits are behaving.
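
A minimal sketch of this landing-page deduction, with invented paths and purely assumed classification rules, might look as follows:

```python
# Classify "(not provided)" visits by landing page. The paths and rules
# below are hypothetical; each site would need its own mapping.
from collections import Counter

not_provided_landings = [
    "/", "/", "/lawnmowers", "/lawnmowers/petrol-rotary-mower-x200",
    "/", "/garden-tools", "/blog/how-to-stripe-a-lawn",
]

def classify(path: str) -> str:
    if path == "/":
        return "likely brand"        # homepage landings: mostly brand searches
    if path.count("/") == 1:         # top-level hub pages, e.g. /lawnmowers
        return "likely head term"
    return "likely long tail"        # deeper pages: long-tail queries

print(Counter(classify(p) for p in not_provided_landings))
```

Any real classification would, of course, need to reflect the site’s own URL structure and be sanity-checked against the keywords that remain visible.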

Combining sensible estimations of the prevalence of certain keyword terms, based on historical data and on trends seen amongst visible keywords, with more insightful landing page data will not quite restore the levels of accuracy previously available to interested parties, but it nonetheless rescues a significant chunk of search data from the analytical scrapheap.

How helpful do you find the solutions highlighted by the pioneers above? Have you any innovative suggestions of your own? Or are you of the opinion that the war on free clicks is well under way and that the threat of Google following Mozilla’s lead will render such efforts untenable?
