Definition and history of Digital analytics


The purpose of this study is to retrace the history of digital analytics and the main events that shaped its evolution, and to try to predict the upcoming trends. It focuses on the quantitative analysis of data for advertisers.

It is intended for both beginners and advanced marketers – with strong or no technical skills at all – who wish to develop their digital analytics culture by (re)discovering its history.

It is composed of 16 main parts, which you can access using these links:

- Digital Analytics, a Core Activity for Advertisers
- Definition of Digital Analytics
- Presentation of the Digital Analytics Market
- Digital Analytics Evolution from 1993 to Today
- 1993: Beginning of the Log-Based Analysis of Digital Trends
- Simplified Diagram of Log-based Data Collection
- 1997: First Use of the JavaScript Tag-based Data Collection Method
- Definition and Functioning of JavaScript Tags
- Simplified Diagram of Tag-based Data Collection
- The Future: Server-Side
- Major Digital Analytics Solutions
- Succeeding in Digital Analytics, is it a Matter of Solutions?
- Paid Solutions’ Added Value
- Paid Solutions’ Price
- Tip to Calculate a License’s Cost
- Report Examples
- Evolution of Major Digital Analytics Solutions from 1993 to Today
- Positioning of Major Digital Analytics Solutions
- Communities around Digital Analytics
- 2010: Launch of Tag Management Systems (TMS)
- Definition and Functioning of a TMS
- Before and After TMSs
- Advanced TMS Features
- Major TMS Solutions
- Evolution of Major TMS Solutions From 2007 to Today
- How to Choose a TMS?
- 2012: Creation of the European Cookie Law
- What Does the Law Say Exactly?
- How to Be Compliant with Regulation?
- Solutions to be Compliant with Regulation
- Do you Still Have Doubts about Being Compliant?
- The Technological Revolution that Needs to Take Place
- Evolution of Data Storage Technologies
- The Next Big Trends

Digital Analytics, a Core Activity for Advertisers

Advertisers have come to understand that being present online is indispensable to market and sell their products and services, and that such presence does not “cannibalize” their traditional business. They also now comprehend that their digital success does not come only from traffic acquisition, but mostly from monetization and from the users they manage to transform into clients or subscribers, or who engage with the content they serve.

However, leads and clients nowadays use a myriad of devices (PCs, smartphones, tablets, TVs, etc.) to connect to several platforms (desktop websites, mobile websites, applications, etc.) through an increasingly varied number of online communication channels (sponsored links, emailing, etc.). Offline media (printed press, TV, radio, etc.) coexist with their digital counterparts, and advertisers need to harmonize their interaction.

Altogether, they constitute a complex ecosystem that makes marketing and selling products/services more difficult:

Digital, a complex ecosystem

Luckily, one of the particularities of online communication channels and mediums (compared to their offline counterparts) is that they are fully measurable and can be managed according to performance. The major objective of advertisers today is to understand their ecosystem in order to manage it, optimize it semi-automatically, and offer their leads and clients the most personalized experience possible.

Digital analytics is one of the components allowing that objective to be met.

Definition of Digital Analytics

Digital analytics is the measurement, monitoring and analysis of visitor behavior with the aim of improving advertisers’ performance.

It is a discipline common to these digital marketing universes:

Digital analytics, a cross disciplinary subject

Digital analytics relies on the use of one or more audience measurement solutions, which gather data on visitors and generate reports for analysis.

Presentation of the Digital Analytics Market

The digital analytics market is mature. It has existed for about 20 years in Europe and North America. There are three major actors in it:

Relationship between advertisers agencies and editors

Advertisers are aware of the added value digital analytics brings, regardless of their economic model. Most of them, if not all, have one or more such solutions (according to the French consultancy firm Converteo, in the second half of 2015, 90% of the 200 largest French websites, in terms of traffic, were using a digital analytics solution).

Several agencies – specialized or not – have ventured into digital analytics consulting over the years and today assist a growing number of advertisers. Their advice can range from the choice of a solution to data analysis and optimization recommendations. Some agencies also do media buying: they purchase advertising space, sponsored links, etc. for their clients, with the objective of attracting qualified visitors to the advertiser’s website. In most cases, they retain a percentage of the amount allocated to the media acquisition as payment. Agencies seldom combine their roles as digital analytics consultants and media buyers, in order to be, and seem, more neutral and trustworthy.

Agencies have strongly contributed to the fast-paced evolution of the market’s maturity, thanks to the skill and experience they acquired by accompanying their clients. In addition to these players, solution editors also play an important role in the market’s maturation: they must answer, and anticipate, advertisers’ needs through adapted and easy-to-use features in their products.

Advertisers usually work with one or more agencies that help them set up their strategy and manage the relationship with editors (for instance, Criteo will take care of media buying while Adobe Analytics will take care of digital analytics, and so on). There are many types of agencies: generalists, which can advise an advertiser on both their offline and online marketing strategies, and online marketing specialists (with a main focus on certain areas such as digital analytics or media buying). Generalist agencies can meet global requirements, whereas specialized agencies have a deeper mastery of particular aspects of online marketing. There is no best type of agency, but rather agencies better suited to given needs. To make a good choice, you will need to know and anticipate your requirements. It is also important for advertisers to have skilled internal staff to select, manage and monitor the agencies’ work and ensure they meet the established goals.

Digital Analytics Evolution from 1993 to Today

Digital analytics evolution

Digital analytics’ evolution is intrinsically related to that of the World Wide Web.

Three major Internet evolutions have had a significant impact on digital analytics. The first two are technological: the arrival of JavaScript and that of new data storage technologies. The third one is legal and consists of the European Directive on Cookies, better known as the Cookie Law.

The appearance of JavaScript revolutionized the discipline by allowing new types of data to be collected more precisely (please refer to the chapter “1997: First Use of the JavaScript Tag-based Data Collection Method”).

The widespread use of new data storage technologies made it possible to access and modify data in real time with the aim of optimizing digital performance, both in terms of acquisition and personalization (please refer to the chapter “The Technological Revolution that Needs to Take Place”).

Ever since the European Cookie Law entered into force, advertisers have been compelled to ask for visitors’ approval to collect data. They now have to strike a balance between regulatory compliance and their data collection needs (please refer to the chapter “2012: Creation of the European Cookie Law”).

Regarding the digital analytics universe itself, two major events shaped it after the first digital analytics solution – webtrends – surfaced and gave birth to the digital analytics marketplace: the appearance of Google’s free audience measurement tool and the creation of Tag Management Systems (TMS). The Google Analytics platform contributed significantly to the development of the discipline; it is well known, and very few advertisers have not yet used it (according to the French consulting firm Converteo, 73% of France’s top 200 websites, in terms of traffic, use Google Analytics). TMSs are almost indispensable and greatly simplify the everyday work of marketing teams (please refer to the chapters “Major Digital Analytics Solutions” and “2010: Launch of Tag Management Systems (TMS)”).

1993: Beginning of the Log-Based Analysis of Digital Trends

Digital analytics appeared thanks to the HTTP protocol: web servers record every interaction between a user and a website in a log file. This is what made the analysis of user behavior possible.

A protocol is a series of rules defined for a type of communication. If we extrapolate this to our daily lives, the protocol would be the language we use to communicate with one another: there would then be a French protocol, an English protocol, or even a Canadian French protocol. HTTP is the protocol we use every day to display the web pages we visit.

We use it without even noticing, whenever we enter an address in our browser’s address bar to visit a website:


Location of the HTTP protocol and WWW prefix in a classic web address

You certainly know other protocols, such as HTTPS, which is a secure version of HTTP and is increasingly used, or FTP, for example.

HTTP was invented in 1990 by Tim Berners-Lee, along with URLs and the HTML language, to create the World Wide Web (which is what the WWW acronym visible in web addresses stands for).

Simplified Diagram of Log-based Data Collection

Functioning of the log based data collection method

Here is a log extract:

Log extract


The first mission of audience measurement solutions that rely on log analysis is to allocate every query (every line in the log is a request, also called a “hit”, from the browser) to the corresponding user. Thanks to this, users’ activity can be tracked based on their IP address.
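This allocation step can be sketched as follows, assuming hits are stored in the Common Log Format; the sample entries are invented for illustration:

```javascript
// A minimal sketch of log-based collection: parse lines in the Common
// Log Format and group hits by IP address, the way early audience
// measurement tools reconstructed user activity.
function parseLogLine(line) {
  // ip - - [timestamp] "METHOD path HTTP/x" status size
  const m = line.match(/^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)/);
  if (!m) return null; // malformed line: ignored
  const [, ip, timestamp, method, path, status] = m;
  return { ip, timestamp, method, path, status: Number(status) };
}

function groupHitsByIp(lines) {
  const hitsByIp = new Map();
  for (const line of lines) {
    const hit = parseLogLine(line);
    if (!hit) continue;
    if (!hitsByIp.has(hit.ip)) hitsByIp.set(hit.ip, []);
    hitsByIp.get(hit.ip).push(hit);
  }
  return hitsByIp;
}

const sample = [
  '203.0.113.7 - - [10/Oct/2016:13:55:36 +0200] "GET /index.html HTTP/1.1" 200 2326',
  '203.0.113.7 - - [10/Oct/2016:13:55:48 +0200] "GET /products.html HTTP/1.1" 200 1042',
];
// groupHitsByIp(sample) yields one entry: two hits for 203.0.113.7
```

This also illustrates the method’s limits described below: two people behind the same proxy IP are merged into one “user”, while a dynamic IP splits one user into several.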

At the beginning of the Web (1990 to 1996), web pages were “static”, made almost exclusively of text and links. A hit basically corresponded to a page being displayed. As time passed, pages became much richer with the inclusion of images, videos and animations, made possible by technologies such as AJAX or HTML5.

Given that a hit is recorded for every page element that is called, the number of hits has increased exponentially over the years, making it difficult for digital analytics solutions to retrace user visits through log analysis. In addition, measuring interaction with dynamic content (video plays, etc.) was impossible, since nothing was captured in the logs for that type of action.

Then, independently of the evolution of websites themselves, new technologies further degraded the quality of log-based analysis: search engines and their robots, proxy servers allowing anonymous browsing, the assignment of dynamic IP addresses by Internet Service Providers (ISPs), and caching techniques for content served by Content Management Systems (CMS).

The evolution of pages and the Web as a whole made log analysis irrelevant.

Marketing departments also became acquainted with the Web and adopted it because of its added value. But log analysis was too technical for them to use comfortably.

Audience measurement through logs had become inadequate.

Editors of digital analytics solutions had to modify the ways in which they collected data so it could be compatible with the Web’s evolution.

Nowadays, data collection through log analysis is almost never used. Google’s Urchin, one of the last log-analysis-based technologies, was acquired in 2005 to create Google Analytics. It has not been sold or supported since March 28, 2012.

1997: First Use of the JavaScript Tag-based Data Collection Method


Digital analytics solutions’ editors created the JavaScript (JS) tag-based data collection method two years after the language was created.

Definition and Functioning of JavaScript Tags

A tag is a JavaScript code snippet.

Below is a tag example for Google Analytics:

Example of Google analytics tag


Google Analytics was taken as an example because it is the most widely known and used solution.

Digital analytics solutions’ tags need to be placed on all of a website’s pages.

Tags are executed by the browser once the page has finished loading; they collect information about visitors and their online behavior: information related to the page being viewed, browser details, geographical data, screen resolution, etc. The information is then sent to a remote server or, in rare cases, to one located on the advertiser’s premises, depending on the audience measurement tool that is used.

The solution then matches visits with the corresponding visitors as soon as it receives the information. This is possible thanks to the transmission of a visitor id when the tag is executed. Every visitor has a unique id that is stored in a cookie (a text file present in the user’s browser). Because the id is stored in a cookie, the same id is sent systematically every time the user visits the page (unless, of course, they delete their cookies or change browsers: in these cases, a new visitor id is created and the user is considered new, so they are counted twice). Finally, the solution stores the data and returns it in real time (or within 24 hours at most) in the form of charts.
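The visitor id mechanism can be sketched like this; the cookie name `_vid` and the id format are invented for illustration and are not any solution’s actual implementation:

```javascript
// Sketch of visitor identification: reuse the id stored in the cookie if
// present, otherwise create a new one. Deleting cookies or switching
// browsers removes the stored id, so the visitor is counted as new.
function getOrCreateVisitorId(cookieString, setCookie) {
  const match = cookieString.match(/(?:^|;\s*)_vid=([^;]+)/);
  if (match) return match[1]; // returning visitor: same id sent again

  const newId = 'v-' + Math.random().toString(36).slice(2); // hypothetical id format
  setCookie('_vid=' + newId + '; max-age=' + 60 * 60 * 24 * 365); // one-year cookie
  return newId;
}

// In a browser, the tag would call it with the real cookie jar:
// const visitorId = getOrCreateVisitorId(document.cookie, c => { document.cookie = c; });
```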

Simplified Diagram of Tag-based Data Collection

Functioning of the tag based data collection method

Most solutions’ tags are made of two elements:

- Variable initialization and function calls.

- A JavaScript library containing the definition of the different functions that are called by the tag to collect and send the information.

Let us take the example that was presented previously, that of the Google Analytics tag.

Example of Google analytics tag

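For reference, the tag in the screenshot corresponds to a representative version of Google’s classic analytics.js snippet, reproduced below as text (UA-XXXXX-Y is a placeholder to be replaced by the advertiser’s own property id):

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');

</script>
```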

Lines one and ten are the opening and closing JavaScript tags, respectively; they tell the browser that the code block in between must be interpreted as JavaScript.

Lines seven and eight define the Google Analytics property (there is usually one property per website) and report that a page has been viewed. In addition to data about the page view, Google Analytics will send further information about the user and their visit to its servers (e.g. screen resolution, browser, among others).

Lines two to five load the JavaScript library containing the definitions of all available functions.

Below is an extract of Google Analytics’ JavaScript library:


Extract of Google Analytics’ JavaScript library

The definition of the function called above is underlined (please refer to the complete Google Analytics JavaScript library). The library has been deliberately minified by Google (deletion of line breaks, indentation, etc.) to lower its size (about 25 KB) and reduce loading time, and obfuscated (variable names have been replaced by letters). It is therefore hardly readable as is.

If you wish to make it more readable by un-minifying it, you can use the online service jsbeautifier: simply copy and paste the contents of the JavaScript library on the site and click the “Beautify JavaScript or HTML” button.

For certain editors, such as Google, the library is common to all users and hosted on the editor’s servers. For others, such as Adobe Analytics, the library is hosted on the advertiser’s servers and can be specific to each advertiser the solution works with. Whether local or remote hosting is chosen depends on whether the solution supports it and on the advertiser’s needs. Local hosting is preferred mostly because it offers security and the possibility to personalize the solution’s library, whereas remote hosting guarantees that the latest version of the library, along with the corresponding new features, is always used.

Every audience measurement solution has its own tag and library.

The tag is usually placed right before the closing </body> or </head> HTML tags.

The values taken by the variables included in the tag are contextual. They can depend on the page being visited (its content category, etc.), on the visit itself (engagement level, etc.) or on the visitor (gender, age, etc.). Each variable’s value, and thus the data sent, differs from one page to another.

Once data is collected by the tag, it needs to be sent to the audience measurement tool’s servers. The execution of the transmission function (“send” for Google Analytics) included in the tag sends all the information that was collected about a visitor and their visit. This data comes from different sources: the first, presented above, consists of the tag’s variables; the second consists of cookies (text files inherent to the web browser) that store specific information about a user; and the third is the browser itself, which also holds information about the user (browser name and version, screen resolution, etc.).

Data is transferred through a request (a message sent by the browser, the client, to a server) made of several parameters, each containing one or more pieces of information about the visitor.

The request is sent using the GET method (through the HTTP or HTTPS protocols depending on which one is used by the page on which the tag is executed).

Below is an extract of a request issued by Google Analytics:

Extract of Google Analytics Request (displayed on Google Chrome’s Console)


We can see that the tid parameter takes the Google Analytics account id as value, as it was defined in the tag’s body (please refer to the tag’s code block above).

It is possible to visualize the requests issued by all of these solutions with the browser’s debugging console (Firefox’s Firebug and Chrome’s Chrome Developer Tools are often used). Information about the user and the visit is generally sent through the GET method in the form of parameters appended to the call of a 1×1 transparent pixel. When the audience measurement solution’s servers receive the information, it is stored and processed.
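As a sketch, the transmission boils down to serializing the collected values into query-string parameters and loading a pixel carrying them; the endpoint and parameter names below are illustrative (though `tid` mirrors the Google Analytics parameter mentioned above):

```javascript
// Simplified sketch of tag-to-server transmission: encode each piece of
// collected information as a short parameter and build the URL of a 1x1
// transparent pixel; loading that pixel sends the data via HTTP GET.
function buildHitUrl(endpoint, data) {
  return endpoint + '?' + new URLSearchParams(data).toString();
}

const hit = buildHitUrl('https://collect.example.com/pixel.gif', {
  tid: 'UA-XXXXX-Y',  // account/property id
  dp: '/index.html',  // page being viewed
  sr: '1920x1080',    // screen resolution
});
// hit: https://collect.example.com/pixel.gif?tid=UA-XXXXX-Y&dp=%2Findex.html&sr=1920x1080

// In a browser, the request is fired by loading the pixel:
// new Image(1, 1).src = hit;
```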

Chrome Developer Tools screenshot (shortcut: Ctrl + Shift + I)

The names of the parameters present in the requests solutions issue to their servers differ from the names of the tag’s variables; this is due to limits on request size. These limits vary from one browser to another, but the typical limit is around 3,000 characters (for GET requests). If a request exceeds this limit, it is truncated by the browser and the server only receives part of the information. This is why the parameter names used by solutions in their requests are very short most of the time.

Verifying sent data is quite tedious due to this difference in naming.

This is also why solutions often provide a table describing every parameter, so you can verify that values match (example: Google Analytics parameters).

If the solution was implemented correctly and its tag was placed on all of the website’s pages, a request will be issued for every page that is viewed. If an advertiser wishes to measure one or more specific elements on a given page (ex: number of times a file is downloaded, number of video plays, etc.), there will be as many requests (issued to the solutions’ servers) as there are elements to track.
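With Google Analytics’ classic tag, for instance, each tracked element interaction triggers one extra `ga('send', 'event', …)` call, and thus one extra request. The sketch below stubs the global `ga()` function (normally provided by the snippet on the page) so the counting is visible; the category, action and label values are illustrative:

```javascript
// Each tracked interaction produces one additional request. The ga()
// stub below only records calls; on a real page it is defined by the
// Google Analytics snippet and actually sends the hit.
const sentHits = [];
function ga(...args) { sentHits.push(args); } // stand-in for the real ga()

function trackDownload(fileName) {
  // category 'Downloads', action 'click', label = file name (illustrative values)
  ga('send', 'event', 'Downloads', 'click', fileName);
}

trackDownload('whitepaper.pdf');
trackDownload('pricing.pdf');
// sentHits.length is now 2: one hit per tracked interaction
```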

Collecting data with JavaScript tags allows measuring everything, from e-commerce transactions to the number of clicks on any page element, and even mouse movements.

However, the “unique visitors” metric remains slightly inaccurate and would be better called “unique browsers” (a term used in ComScore’s Digital Analytix, acquired in 2015 by Adobe). Indeed, solutions create and use a cookie (specific to the browser and stored in it) to identify every visitor. But if the same user goes to the same site using two different browsers (or deletes all cookies before returning to the site), the solution will count two unique visitors rather than one.

The tag-based collection method has been a victim of its own success: it is not uncommon to find websites with tens of tags executing on every page.

This slows pages down, affects user experience and reduces conversion rates.

In addition, one of the main challenges for advertisers is to comply with data protection regulation, as well as to control data transfers to their partners and avoid data breaches. Unfortunately, it is common for some tags to piggyback other tags (second-level tags) that collect data for solutions the advertiser is unaware of.

The Future: Server-Side

Between 2017 and 2018, we should witness the emergence of a new data collection technology: server-side. Server-side data collection should be the third generation of data collection methods, after log analysis and JavaScript tags. The purpose is to send information only once, to a Tag Management System (TMS), which in turn transmits it to partners, rather than sending the same pieces of information as many times as there are solutions.

Every time a page/screen is displayed, the website/application issues a call towards the TMS with data from the data layer that is to be shared with partners. The TMS then triggers the solutions’ tags based on rules defined by the user.
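A conceptual sketch of that fan-out step, with invented partner endpoints and firing rules:

```javascript
// Server-side sketch: the data layer is received once, then forwarded to
// each partner whose firing rule matches. Partner names, endpoints and
// rules are illustrative, not real solutions.
const partners = [
  { name: 'analytics',
    endpoint: 'https://collect.example-analytics.com/hit',
    rule: () => true },                      // fire on every page
  { name: 'retargeting',
    endpoint: 'https://px.example-ads.com/c',
    rule: (d) => d.pageType === 'product' }, // product pages only
];

function fanOut(dataLayer, send) {
  // send(endpoint, payload) stands for the server-to-server call
  const fired = [];
  for (const p of partners) {
    if (p.rule(dataLayer)) {
      send(p.endpoint, dataLayer);
      fired.push(p.name);
    }
  }
  return fired;
}

// One incoming call from the page, several outgoing partner calls:
// fanOut({ pageType: 'product' }, send) fires both partners;
// fanOut({ pageType: 'home' }, send) fires only 'analytics'.
```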

Data collection evolution from 1993 to today

Server-side strengths:

- Higher conversion rates and reduced loading time for pages (only one tag is called, as opposed to dozens today).

- Better decision-making thanks to more reliable data: tag firing no longer depends on user interaction with the page (e.g. leaving the conversion page before it has finished loading and the tags are executed). Transfer rates are nearly 100%, whereas rates for the current technology vary greatly.

- Regaining control over data transferred to partners: executing tags on the client side favors data leakage, since solutions can collect data without letting the advertiser know and share it with third parties (through second-level tags). The server-side approach helps control the transfer of such data.

The adoption of server-side is taking some time due to the lack of compatibility of several solutions with this technology. However, more and more advertisers are aware of the benefits it brings and are putting pressure on their partners to make their tags compatible.

Major Digital Analytics Solutions

Paid and free major Digital Analytics solutions


All of these solutions work according to the Software as a Service model (SaaS). You can log in to any of them as you log in to your webmail, for instance.

Some editors, like webtrends or IBM, propose installed (“on-premise”) versions of their solutions, hosted on the advertiser’s servers, to meet specific requirements. But this type of demand is on the decline.

When Google launched its paid solution, Google Analytics Premium, it became the only editor to offer both a paid and a free solution. Google’s strategy of offering a free solution first and a paid counterpart later provided it with a large base of users, and thus with as many potential clients. Google stays close to advertisers throughout their evolution and growth, and lets them switch from the free to the paid version smoothly, for both technical and marketing teams (the technical implementation and the interface differ only slightly from one version to the other).

Mature advertisers with advanced needs use paid solutions, whereas beginners usually start with Google Analytics.

Succeeding in Digital Analytics, is it a Matter of Solutions?

First of all, there isn’t really a “best” digital analytics solution; there are instead solutions more adapted than others to a specific need.

In order to make the best choice, it is important to list every crucial need. To do so, think in terms of use cases: what would you like the solution to let you do? Which questions do you wish to answer? These use cases must be included in the objectives you would like every potential partner to meet during a “Proof of Concept” (POC). Choosing a solution represents an important investment, financially (unless you decide to use a free solution) and in terms of the human resources needed to implement and exploit it. Feel free to seek advice from a specialized agency in this endeavor.

In addition, setting up an efficient digital analytics strategy depends more on you and your staff than on the solution. The latter is no more than one key element among others in achieving your goals. Success lies in your ability to ensure that the implementation of the solution matches your needs, that the collected data is of sufficient quality, that the right people are trained, that you have concise dashboards and, above all, that you use the data you collect to improve decision-making!

Very often, advertisers have one or more well-performing solutions but only use a small part of their capabilities due to a lack of time and money to recruit a digital analyst or to hire an agency to guide them.

Paid Solutions’ Added Value

Five main reasons differentiate paid solutions from free solutions:

  • A contractual warranty providing support 24/7 (local language).
  • A contractual warranty guaranteeing the availability of the data collection and reporting servers (over 99% most often) and their performance (time to process requests until results are displayed in the reports).
  • A contractual warranty applying to data property.
  • Consultancy and commercial account management.
  • Advanced features included in a suite of complementary products to meet advertisers’ requirements throughout the evolution of their maturity.

Paid Solutions’ Price

Paid solutions’ pricing is generally based on the purchase of an annual license comprising a request package.

Additional products to the main one can be acquired depending on the advertiser’s needs. Additional fixed or variable annual costs have to be expected.

A license usually lasts one year.

If the solution was implemented correctly, at least one request will be issued per page viewed.

If the advertiser also measures clicks on outbound links, download links, etc. with their solution, we can expect as many as 4 or 5 requests on certain pages.

Tip to Calculate a License’s Cost

Here is a basis for estimating your annual number of requests:

(Number of page views over the previous year, increased by the page views growth forecast (%) for the current year) + (25% of the previous year’s page views × 3).

The last term accounts for pages that generate more than one request: we assume here that pages representing 25% of the previous year’s page views will each generate three additional requests.

Example: if a website had one million page views in the previous year and expects them to grow by 10% in the current year:

(1,000,000 + (10% of 1,000,000)) + (25% of 1,000,000 * 3) = 1,850,000 requests.
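The same estimate, written as a small function (the 25% share and the three requests per page are the rule-of-thumb assumptions from the formula above):

```javascript
// Estimate the annual number of requests for license sizing:
// forecast page views plus a buffer for pages that fire several requests.
function estimateAnnualRequests(previousYearPageViews, growthRate) {
  const forecastPageViews = previousYearPageViews * (1 + growthRate);
  // rule of thumb: pages worth 25% of last year's page views fire 3 requests each
  const multiRequestBuffer = 0.25 * previousYearPageViews * 3;
  return Math.round(forecastPageViews + multiRequestBuffer);
}

estimateAnnualRequests(1000000, 0.10); // → 1850000, matching the example above
```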

Avoid underestimating the number of requests: if the package you purchased is exceeded, an extra cost per request can be invoiced. The unit cost of a request depends on the solution and on the request package you purchased (the bigger the package, the lower the unit cost of a request).

Prices range from EUR 5,000 per year for a website with low traffic (under 500k page views per month), to EUR 50,000 for a website with regular traffic (several million page views per month), and up to EUR 100,000 for a website with high traffic.

Please note that Google’s paid solution, Google Analytics Premium, differentiates itself from its competitors by offering a single license, independent of the number of page views, at EUR 140k per year.

Report Examples

Here are three report examples: the first is associated with traffic acquisition, the second with navigation and the third with conversion. They are all taken from Google Analytics and are meant to serve as examples; other digital analytics solutions have similar reports.

You will find the different reports below:

Example of Google Analytics’ traffic acquisition performance report


Example of Google Analytics’ report on users’ browsing paths


Example of Google Analytics’ traffic funnel report


Evolution of Major Digital Analytics Solutions from 1993 to Today

Evolution of the main digital analytics solutions

The digital analytics market was created by webtrends in 1993.

Between 1996 and 2000, six other firms followed and launched their own digital analytics solutions: WebSideStory, Omniture, Unica, Coremetrics, NedStat and XiTi.

The market evolved slowly until March 2005, when Google entered it and set the pace by acquiring the Urchin software, then launching Google Analytics in November of that same year.

Google’s reputation represented a major advantage over its competitors and allowed Google Analytics to be widely adopted, not to mention that it is free and that its features evolve very quickly. The competition proved beneficial for advertisers, because it compelled editors to make their products evolve faster, but also for editors, as the market matured more rapidly and the number of potential clients increased. Google then launched Google Analytics Premium, a more powerful, paid version of Google Analytics, in September 2011. Since then, Google has progressively gained market share and secured major accounts.

An important wave of acquisitions started when Adobe purchased Omniture in September 2009. The market consolidated as Coremetrics and Unica were acquired by IBM, and NedStat by ComScore. Thanks to the latter acquisition, ComScore, known for its panel-based measurement, completed its portfolio with site-centric digital measurement, which gave it a unique and interesting market position compared to its competitors. The market’s consolidation continued in November 2015, when Adobe took hold of ComScore’s digital analytics solution (which had existed for three and a half years).

These acquisitions helped Adobe and IBM build the first blocks of what they call today their digital marketing suite: Adobe Marketing Cloud and IBM Enterprise Marketing Management (EMM).

The evolution was also marked by the abrupt disappearance of Microsoft’s and Yahoo!’s digital analytics solutions in March 2009 and June 2012, respectively (they had existed for three and five years). In both cases, this was due to a series of cuts to unprofitable activities (the two solutions were free).

Three other solutions exist independently in the market today: webtrends, AT Internet and Webtrekk. These solutions differentiate themselves from their peers: webtrends through its pursuit of innovation, AT Internet as the leading provider on the French market, and Webtrekk on the German market.

Piwik’s open-source initiative is also worth mentioning; it has two packages: a free one, launched in March 2008, and a professional one, launched in September 2013.

Positioning of Major Digital Analytics Solutions

Seven Major Solutions


“Magic quadrant” study, “Digital Marketing Analytics” category, by advisory company Gartner (September 2015)


This study features the Adobe, Google, IBM, ComScore and webtrends solutions; AT Internet and Webtrekk are not included. Only extensively tested digital analytics solutions are included in this analysis, each belonging to a suite that aims to meet all of advertisers’ needs, from data collection to analysis and optimization.

Google is the leading editor in the study thanks to its paid solution Google Analytics Premium and to the other products in its suite (DoubleClick, AdWords, retargeting, etc.), which make its offer the most complete in the marketplace.

Adobe also appears as a leader in this study. After acquiring Omniture, Adobe created the Adobe Marketing Cloud suite, which contains all the solutions of Omniture’s suite. It was then enriched with the acquisition of French provider Neolane, in a bid to optimize traffic acquisition.

Neolane became Adobe Campaign within the Adobe Marketing Cloud suite. In addition, the Adobe Analytics measurement tool and the testing and customization solution Adobe Target from Adobe Marketing Cloud can be integrated into the Adobe Creative Suite through extensions.

IBM appears as a challenger in this study. After acquiring Coremetrics and Unica, among others, it created the Enterprise Marketing Management (EMM) suite, which became part of the “IBM Smarter Commerce” offer. That offer also includes the WebSphere Commerce e-commerce framework, among other components.

ComScore is presented as a visionary in this study. Adobe purchased ComScore Digital Analytix in November 2015, and this operation should improve its ranking in the next edition of the study, both in terms of ability to deliver and of the suite’s comprehensiveness.

webtrends is a niche player in this study, as are other firms specialized in specific domains (such as Visual IQ in the attribution field). Indeed, webtrends only offers a digital analytics tool and a testing and customization tool, whereas other editors also provide traffic acquisition optimization tools, for example.

In sum, Google and Adobe are the market’s leaders; IBM is a challenger; and webtrends is a niche player, as are AT Internet and Webtrekk, which are absent from this study.

Communities around Digital Analytics

In its early days, digital analytics was a niche, rarely practiced discipline in North America and Europe. It was difficult for the first enthusiasts to find the necessary resources and to share their experience, and the maturity level of the discipline progressed slowly.

This encouraged some of them to gather and establish a community. The first initiative was launched in 2003 in the United States with the Web Analytics Association (WAA, since renamed the Digital Analytics Association).

It was created by Jim Sterne, Andrew Edwards and Bryan Eisenberg (from left to right).


It was organized around several working groups, including:

  • Evangelization
  • Education
  • Events
  • Internationalization of the discipline
  • Members management/sponsoring
  • Research
  • Standards definition

Each working group has multiple objectives; a common one is evangelizing the public and demonstrating the value of digital analytics through concrete cases.

The community is more active in the United States but it is also present in Europe.

In 2004, Eric Peterson, a renowned digital analytics consultant in the United States, launched the first forum dedicated to the discipline. It is supported and moderated by the Digital Analytics Association. The forum was very active between 2005 and 2011 (with up to 661 messages in 2008). As the market has matured, it is somewhat less active today; it hosts job postings, practical questions related to solution implementation or migration, etc.

In 2007, Eric Peterson continued his efforts to move the discipline forward with the launch of Web Analytics Wednesdays (WAW): free events allowing digital analytics enthusiasts to meet and share. Many events have been organized in the United States, but also in Australia, France, Poland, etc. Today, WAWs are less popular, although similar events still take place in the United States and England.

Finally, in 2009, in a bid to help digital analytics enthusiasts grow and gain experience, Eric Peterson launched “Analysis Exchange” with the support of two other renowned digital analytics consultants, John Lovett and Aurélie Pols. The objective of “Analysis Exchange” is to help associations and non-profit organizations progress in the digital analytics universe, for free, with the help of someone wishing to gain experience (a student, for example) guided by an experienced mentor. In 2014, five years after its launch, over 400 organizations had benefited from it.

In Europe, two major initiatives have been launched: Measure Bowling and Measure Camp.

The purpose of “Measure Bowling” is to gather digital analytics enthusiasts for a bowling session to have fun and share knowledge.

The Measure Bowling event was launched following a Twitter thread.

The idea became a reality, and the event was created thanks to Peter O’Neill and Nicolas Malo.

Measure Bowling events are now organized throughout Europe… and they all take place at the same time!

Shortly after the first Measure Bowling took place in London, Measure Camp was created in the same city in September 2012 by Peter O’Neill.

Measure Camps are “unconferences”: the content is created by the participants themselves at the beginning of the event, in a spirit of sharing and exchanging ideas. Each participant can then go to the meeting room where the topic they are interested in is being discussed. “Unconferences” already existed in many areas, such as Web development, in the form of bar camps (the first BarCamp took place in 2005).

The concept was imported to Paris, where the first Measure Camp was organized in June 2015. In France, Measure Bowling and Measure Camp events are supported by the French-speaking Digital Analysts Association (French acronym: AADF).

As time went by, many resources were created, in the form of books and blog articles. Among others, we can cite Avinash Kaushik’s books (“Web Analytics: An Hour a Day” and “Web Analytics 2.0”); Bryan Eisenberg’s books and articles (“Call to Action”, “Waiting for Your Cat to Bark?” and “Always Be Testing”); Jim Sterne’s “Social Media Metrics” and “The Devil’s Data Dictionary”; and Stéphane Hamel’s analytics maturity model, the “Online Analytics Maturity Model”.

Here are the links to get news on the aforementioned events:

-        Digital Analytics Association:

-        Forum dedicated to digital analytics: link

-        “Analysis Exchange” website:

-        Measure Bowling:

-        Measure Camp London:

-        Measure Camp Paris:


Brice Bottégal
About the author, Brice Bottégal :

Brice Bottégal started his career at the digital analytics consulting agency Hub’Sales as a digital analytics consultant. He later took charge of pre-sales and participated in the launch of the quality assurance solution Hub’Scan. He then joined TagCommander, a major tag and data management platform in Europe, as Product Manager. Brice Bottégal has also been teaching digital analytics at HETIC for the past six years.


7 Responses to “Definition and history of Digital analytics”

  1.  Angel Vázquez says:

    Thank you for share and spread the knowledge, greetings from México City


  2.  Patrick LeMay says:

    Well explained the history of web analytics. Thank you very much Brice.


  3.  Ken Quandt says:

    Thank you for this history, Brice. Well done. May I blogroll you on my blog?


Brice Bottégal replied on March 24th, 2010 at 08:28:

    Of course ! Thank you :) I will create a blogroll too but not now. Good continuation in writing your blog !


Ken Quandt replied on March 25th, 2010 at 20:29:

    Thanks, Brice.

    Take care.

    Kind regards,



  4.  Justin Kistner says:

    Great history of web analytics post, Brice. If only the Wikipedia version was this good… ;)


Brice Bottégal replied on March 23rd, 2010 at 16:01:

    Thank you Justin :) English is not my mother tongue so…
    I tried to do my best !


