3 Pillars of Digital Experimentation

The bridge between academia and industry is narrow, and it's hard to cross over meaningfully.

Sometimes, though, you stop chewing your toast mid-bite and your coffee spills from your hand, because you've found an academic paper that blows you away.

Today I’m going to share one of those papers with you! It's readable (enough) and deeply aligned with both the theory and the practical reality of being a digital practitioner.


Evan Rollins

Co-Founder

May 31, 2024

At Drumline, we're united with academia in one very specific way - we love experiments! This paper is exceptional in that it outlines what you actually need for digital experimentation, far removed from the 'pitch' you'll hear from folks trying to sell experimentation (yes, even us). However, academics will be academics, and at times it's an impenetrable read.

In this article we’ll take on the role of translator for what experimentation practices look like on the ground for businesses across ANZ. We’ll examine how the ‘building blocks of experimentation’ appear practically for two types of organisations:

  • Starters: Teams just starting their journey with experimentation

  • Scalers: Organisations looking to move from a few teams leading the charge with experimentation to a business-wide approach

So what does a business need to activate experimentation? Bojinov and Gupta propose 3 capability pillars:

  • The Data Platform

  • The Experimentation Platform

  • The People & Process

Diagram: the three pillars - The Data Platform, The Experimentation Platform, and People & Process

“Platform” in this context is a capability, not a technology tool with a login: it includes a suite of technologies, the teams that use them, and the processes that govern them (or a mix of any of these!).

The Data Platform

While most folks running experimentation probably don't think about a 'data platform', it's an essential capability - one that's often rolled into the packaging of experimentation tools. The basic needs of a data platform can also be met by Google Analytics, Adobe Analytics or a variety of other analytics tools - so don't let the name intimidate you!

The Data Platform needs 4 ingredients:

  • Data capture for specifics like behaviour of users on the product, and which experiment they're exposed to

  • Analytics and reporting on that data captured

  • Ability to augment data and monitor quality of tests

  • A governance layer of security and privacy

Practically, most Starter organisations running experimentation use their existing technology to cover the majority of these needs. If you have Optimizely Web or Kameleoon, for example, these technologies provide the governance layer and flexible data capture for most use cases out of the box - you can easily create a metric for ecommerce transactions to use as a measure of increasing revenue in any of these tools.
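To make that concrete, here's a minimal sketch of what any of those tools is effectively doing under the hood when you define a revenue metric: joining experiment exposure events to transaction events and summing revenue per variation. The event shapes and field names below are made up for illustration - this isn't any vendor's actual API.

```python
from collections import defaultdict

# Hypothetical raw events a data platform might capture (field names are illustrative).
exposures = [
    {"user_id": "u1", "experiment": "checkout_cta", "variation": "control"},
    {"user_id": "u2", "experiment": "checkout_cta", "variation": "treatment"},
    {"user_id": "u3", "experiment": "checkout_cta", "variation": "treatment"},
]
transactions = [
    {"user_id": "u1", "revenue": 40.0},
    {"user_id": "u3", "revenue": 120.0},
]

def revenue_per_variation(exposures, transactions):
    """Join exposure events to transactions and sum revenue for each variation."""
    variation_of = {e["user_id"]: e["variation"] for e in exposures}
    totals = defaultdict(float)
    for t in transactions:
        if t["user_id"] in variation_of:  # only count users who saw the experiment
            totals[variation_of[t["user_id"]]] += t["revenue"]
    return dict(totals)

print(revenue_per_variation(exposures, transactions))
# {'control': 40.0, 'treatment': 120.0}
```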

Diagram: a Starter setup, where experimentation tech solutions (Optimizely, Kameleoon, ABTasty) plus digital analytics tools cover the Data Platform's four ingredients - capture, analytics, augment and monitor, and governance.

More mature businesses, Scalers, where experimentation is folded into how products and features are developed, often have more sophisticated systems for managing their experiment data. This usually involves a warehouse or data lake as the repository for reporting and augmentation, and requires well-developed data engineering and IT infrastructure to support it. Needless to say, complexity can increase quickly, so this is where your solution architects earn their keep.
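One 'augment and monitor' job that warehouse-backed Scaler teams often automate is a sample ratio mismatch (SRM) check: if an intended 50/50 split is delivering something meaningfully different, the results can't be trusted until the cause is found. Here's a rough, dependency-free sketch of that check (the function name is ours, and the 3.84 threshold is the chi-square critical value for one degree of freedom at the 5% level):

```python
def sample_ratio_mismatch(control_count: int, treatment_count: int,
                          expected_split: float = 0.5) -> bool:
    """Chi-square check that the observed traffic split matches the intended split.

    Returns True if the mismatch is statistically suspicious (p < 0.05),
    i.e. the experiment data should be investigated before trusting results.
    """
    total = control_count + treatment_count
    expected_control = total * expected_split
    expected_treatment = total * (1 - expected_split)
    chi_square = ((control_count - expected_control) ** 2 / expected_control
                  + (treatment_count - expected_treatment) ** 2 / expected_treatment)
    return chi_square > 3.84  # critical value for 1 degree of freedom at p = 0.05

# 10,000 vs 9,400 users on a 50/50 split is a red flag worth investigating.
print(sample_ratio_mismatch(10_000, 9_400))  # True
```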

Diagram: a Scaler setup, where product and digital analytics (e.g. Heap), CDPs and feature flags (e.g. LaunchDarkly), and a data warehouse with IT IAM (AWS, Google Cloud, Azure) together cover capture, analytics, augment and monitor, and governance.

Looking at the logo-soup above, you might think, “Well I have one (or more) of those pieces of kit. I must be sorted!” As always, the answer is “it depends”.

If you have a source of data that you trust, that's consistent, and that allows you to measure an experiment, then that's all you need to be moving in the right direction.

The Experimentation Platform

An Experimentation Platform is synonymous with tech vendors these days. What should a piece of dedicated experimentation technology provide? If you've spoken to vendors before, you might have seen propositions around integrations, ease of use or speed to value. Let's look past the names of features and consider the core capabilities required.

The components of The Experimentation Platform are:

  • Test configuration: choosing which users see your test and setting up your test variations

  • Test execution: randomly assigning users to the test variations, and safely turning tests on and off

  • Statistical analysis: analysing your test results with the correct statistics, even when you want to test non-standard metrics (a rough sketch of both assignment and analysis follows this list)
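To make 'randomly assign' and 'the correct statistics' less abstract, here's a hedged sketch of the two workhorses behind most experimentation platforms: deterministic hash-based assignment (so a user always lands in the same variation) and a simple two-proportion z-test for a conversion-rate metric. The names and numbers are illustrative - this is the underlying idea, not any vendor's implementation.

```python
import hashlib
from math import sqrt, erf

def assign_variation(user_id: str, experiment: str,
                     variations=("control", "treatment")) -> str:
    """Deterministically bucket a user: the same user + experiment always gets the same variation."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variations[int(digest, 16) % len(variations)]

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail

print(assign_variation("user-42", "checkout_cta"))       # stable across calls
print(two_proportion_p_value(480, 10_000, 540, 10_000))  # ≈ 0.054, borderline
```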

The vast majority of businesses will be buying an Experimentation Platform off the shelf from a vendor - almost all of which tick these capabilities to varying degrees. When choosing a tech vendor here, remember that the decision should be based on your circumstances, not just the longest feature list! One of the better top-level views of the platforms available to market is Speero's AB Testing Tool comparison.

Diagram: experimentation tech solutions (Optimizely, Kameleoon, ABTasty) covering the Experimentation Platform's three components - test configuration, test execution, and statistical analysis.

Most platforms can cover the gamut of use cases from starting AB testing on landing pages and forms through to a scaled multi-team experimentation practice that drives product feature decisions.

You might have heard about digital product companies like Spotify and Netflix building their own experimentation platforms in-house as well! This practice tends to work for businesses with a very strong engineering-led practice and the resources to create, manage and enforce their own technical infrastructure. If you're reading this in a business with your own experimentation platform, you probably spend more time thinking about people and process.

The People and Process

The biggest sticking point for most businesses in making experimentation just part of operations is the change management required for processes and practices. You can buy a piece of tech and run an AB test in an afternoon - getting folks to use experimentation as a tool for better decision making is hard and takes work.

The components of People and Process to deliver experimentation are:

  • Leadership on the role of experimentation in the business, and direction in the KPIs and incentives that will be used to measure success across teams

  • Expertise in the business to manage and deliver experiments successfully

  • Workflow practices for involving stakeholders and teams: contributing experiment ideas, approving designs and features, and communicating results back to the business

  • Experimentation as a service, which means clear documentation, training and support to bring the business along for the ride

As with any human practices, crafting a culture of experimentation takes ongoing, consistent effort. For most of our clients, People and Process becomes the biggest barrier to success, and so we spend a large amount of time focused on advocacy, sharing of results, and inviting input to the experimentation process.

Building the capabilities in People and Process also looks incredibly different for Starters and Scalers. Starter organisations almost always have experimentation relegated to a single team (or a single person in a team!).

Diagram: a Starter organisation, where a single team builds experimentation capability for one website, app or product, underpinned by leadership, expertise, workflow practice and experimentation as a service.

Scalers can represent many different organisational models, ranging from a centre-of-excellence model, to a hub-and-spoke design through to distributed experimentation practices across teams.

Diagram: a Scaler organisation running an experimentation centre of excellence, where a central team supports multiple teams and digital properties, underpinned by leadership, expertise, workflow practice and experimentation as a service.

These models all have benefits and drawbacks, and the choice usually reflects what works for the organisation, not necessarily what's best for the practice of experimentation.

The key principles for broadening the reach of experimentation across a business, especially as a single team, come down to sharing impact, getting leadership buy-in, and making it easy to invite collaboration:

  • Build (and maintain) a high level of expertise in the core team

  • Share your wins (and your losses) broadly

  • Invite collaboration from other teams to be part of the process

  • Make time to advocate to leadership on the value of experimentation

Summary

The ANZ market is seeing a growth in maturity of experimentation practices, and the ability to evaluate practice through these 3 pillars provides an outside-in view of how to continually assess capability and excellence.

There is still a long way to go in our market, as well as a need for better adoption of non-traditional experimental methods, due to 3 big challenges:

  • Smaller budgets

  • Less traffic to test on (the sample-size sketch after this list shows why this bites)

  • Fewer local examples of excellent practice
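That traffic challenge is easy to underestimate. As a rough illustration - using the standard two-proportion sample-size approximation with the usual defaults of 5% significance and 80% power, with numbers chosen purely for illustration - here's how quickly the users required per variation grows when baselines and expected lifts are small:

```python
from math import ceil

def users_per_variation(baseline_rate: float, relative_lift: float,
                        z_alpha: float = 1.96, z_power: float = 0.84) -> int:
    """Approximate users needed per variation for a two-proportion test
    (5% significance and 80% power by default)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# A 3% conversion rate and a hoped-for 10% relative lift needs roughly 53,000
# users per variation - a real constraint on lower-traffic ANZ sites.
print(users_per_variation(0.03, 0.10))
```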

Even so, businesses in ANZ are ripe to take advantage of experimentation as a tool for building better digital platforms and customer experiences, with the earliest adopters on this journey seeing the greatest rewards.

  1. Bojinov, I. & Gupta, S. Online Experimentation: Benefits, Operational and Methodological Challenges, and Scaling Guide. Harvard Data Science Review, Issue 4.3, Summer 2022 (mit.edu)