Today's stakes are too high for "Lies, damned lies, and statistics"
Collecting data is certainly not new. However, with the advent of "Big Data", there is an almost blind faith that if we can collect more data and link it all together, we will suddenly become more intelligent and make better decisions. The challenge with all the data available today is how to make sense of it. The typical answers are "data mining" and analytics. Just because you put mountains of data through a computer to generate more elegant views of it doesn't necessarily mean you'll be any smarter! In fact, more analysis often yields paralysis. Today's business decisions often do not require more data … they require asking the 7 Critical Questions for Decision Precision.
Why this is important: We don't lack data today; we are drowning in it. The key to better decisions is not data analysis to create more views, but asking the right questions to narrow the focus on how to accurately measure outcomes.
Lies, damned lies, and statistics …
No one is exactly sure who should be credited with the phrase "There are three kinds of lies: lies, damned lies, and statistics." The phrase was certainly popularized by Mark Twain to describe the persuasive power of numbers. Regardless of the source, it is particularly relevant in describing the use of statistics and analytics to bolster weak arguments.
I recently received an email from Gregory Yankelovich at Customer Experience IQ. Gregory always has great insights and a very practical view of developing intelligence for the consumer-centric world we live in. He shared a definition that captures some of the cynicism associated with big data overreach and the unsubstantiated claims coming from data mining:
Data Scientist (n): A machine for turning data you don't have into infographics you don't care about.
The power of intelligence developed through testing and measurement
The point is that data cannot lie to us. Data is not inherently good or bad … it is merely descriptive of elements at points in time. Data mining and analytics are also neither good nor bad. They are designed to "cut through the fog" to find some relationship or pattern.
The key to making better decisions is NOT sifting through data after the fact. Determining what works, and where, requires planning metrics and measurements that will pinpoint both outcomes and what impacts results. Said another way, data is of little value and can be very misleading without a plan and a design focused on measuring what works and what doesn't.
7 Critical Questions for Decision Precision
1. Are you measuring the right thing(s)?
To make valid decisions you need data that is "valid". For example, it is hard to decide whether a diet makes you "healthy" if you only measure your weight. If critical decisions are being made about store design based on consumer behavior, then there needs to be valid data on consumer footfall: where shoppers go and how long they stay in each area.
Measuring the right things for the omni-channel consumer experience will increasingly require data beyond the traditional store and the in-store cash register.
2. Are you measuring consistently?
Consistency refers to the reliability of the data. If two observers watch a consumer shop and record different levels of "engagement", then the data collection or the tools are unreliable. Unreliable data can mask real differences and lead to erroneous conclusions. Sales data at the cash register tends to be highly reliable, but lacks validity as a measure of the quality of the store experience. Accurate decisions require data that is both valid and reliable.
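One hedged way to put a number on this kind of reliability is an inter-rater agreement statistic such as Cohen's kappa. The sketch below assumes two observers have rated the same shoppers' engagement on a simple low/medium/high scale; the ratings are made up for illustration.

```python
# A minimal sketch of checking inter-rater reliability with Cohen's kappa.
# The ratings below are hypothetical; in practice they would come from two
# observers scoring the same shoppers on the same engagement scale.
from sklearn.metrics import cohen_kappa_score

observer_a = ["low", "high", "medium", "high", "low", "medium", "high", "low"]
observer_b = ["low", "medium", "medium", "high", "low", "low", "high", "medium"]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values near 1.0 indicate consistent observers
```

A kappa well below 1.0 is a signal to fix the observation tool, or the observers' training, before trusting the data.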
3. Are you measuring accurately enough?
Measuring accurately refers to precision. If you are a carpenter, measuring in inches may do for building a house, but building a fine furniture cabinet requires fractions of an inch. Measuring gross revenue or profit is simply not precise enough to determine whether a training program increases "add-on sales". Web traffic is not precise enough to determine whether marketing is attracting new customers (unique visitors who purchase).
4. Are you measuring often enough?
It was once sufficient to measure the impact of a weekly print ad by measuring weekly sales. Web sales fluctuate wildly by the hour, and the best online retailers track sales by the minute to measure the impact of variables such as pricing changes. One of the biggest misses in measuring retail store results is failing to measure the impact of hourly staffing patterns.
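To illustrate how much the measurement interval matters, the sketch below aggregates the same synthetic transaction log weekly, hourly, and by hour of day using pandas; the data and frequencies are illustrative assumptions, not a prescription.

```python
# A minimal sketch of viewing the same sales data at different frequencies.
# The transaction log is synthetic and purely illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
timestamps = pd.date_range("2014-03-03 09:00", "2014-03-09 21:00", freq="15min")
sales = pd.Series(rng.poisson(40, len(timestamps)).astype(float),
                  index=timestamps, name="amount")

weekly = sales.resample("W").sum()   # the traditional view: smooths intra-week swings
hourly = sales.resample("h").sum()   # exposes hour-by-hour patterns

# Average sales by hour of day makes staffing-pattern effects visible.
by_hour_of_day = hourly.groupby(hourly.index.hour).mean()
print(by_hour_of_day)
```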
5. Are you measuring the right variables?
Retailers are typically very good at measuring "internal" variables, especially sales. But understanding which variables impact results often requires analyzing "external" variables such as customer characteristics, competitive factors, or even the weather. The best insights come from analyzing a mix of outcome data and the environmental attributes that can affect implementation and ongoing execution.
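As a hedged sketch of what mixing internal and external variables can look like, the example below fits a simple regression of weekly store sales on one internal driver (staff hours) and two external ones (a competitor promotion flag and average temperature). The variable names and figures are hypothetical.

```python
# A minimal sketch of relating an outcome to both internal and external variables.
# All column names and values are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

stores = pd.DataFrame({
    "weekly_sales":     [120, 135, 128, 150, 110, 142, 155, 117, 138, 146],
    "staff_hours":      [300, 320, 310, 360, 280, 340, 370, 290, 330, 350],
    "competitor_promo": [1,   0,   1,   0,   1,   0,   0,   1,   0,   0],
    "avg_temperature":  [55,  60,  58,  70,  50,  65,  72,  52,  62,  68],
})

model = smf.ols(
    "weekly_sales ~ staff_hours + competitor_promo + avg_temperature",
    data=stores,
).fit()
print(model.summary())  # coefficients hint at which variables move results
```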
6. Is your sample large enough?
It is extremely dangerous to draw conclusions from a case of one. From countless experiments, scientists have documented the "Hawthorne Effect": people simply behave and perform differently when something new is going on (as with a new store design, new training, or even new website pages).
To limit false conclusions and misinterpretations, the sample needs to include at least "several" implementations in the same period. The larger the financial decision or risk, the larger the sample needs to be for both test and control stores.
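One rough way to gauge "large enough" is a statistical power calculation. The sketch below uses statsmodels to estimate how many test stores (with an equal number of controls) would be needed to detect a medium-sized difference; the effect size, significance level, and power targets are illustrative assumptions, not recommendations.

```python
# A minimal sketch of estimating how many test and control stores are needed.
# effect_size=0.5 (a "medium" standardized difference), alpha=0.05, and
# power=0.8 are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Roughly {n_per_group:.0f} test stores and {n_per_group:.0f} control stores")
```

Detecting a smaller lift requires a smaller assumed effect size and therefore a larger sample, which is why bigger bets demand bigger tests.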
7. How can you be sure it's not something else?
If you search long enough with enough data mining, you will find something statistically significant. For example, researchers have found a statistical correlation between the height of heels on women's shoes and the state of the economy. But in the case of retail business decisions, we need to know whether some variable, like training, caused a change in an outcome, like consumer satisfaction. This requires a design that compares "tests" to "controls".
With the realities of retail, it is often impossible to randomly assign stores to test and control groups. However, it is possible to identify and measure "comp stores" that have similar demographics but do not get the implementation. The single greatest source of "lies" in retail is trying to draw conclusions from data without measuring enough comp stores during the same time periods.
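To make the test-versus-control comparison concrete, a minimal sketch might look like the following: sales lifts from hypothetical test stores and comp (control) stores measured over the same period, compared with a two-sample t-test. The numbers are made up, and a real design would also confirm that the comp stores truly match on demographics.

```python
# A minimal sketch of comparing test stores to comp (control) stores measured
# over the same period. The sales-lift figures are hypothetical.
from scipy import stats

test_store_lift = [4.1, 5.3, 3.8, 6.0, 4.7, 5.1]     # % change, stores that got the change
control_store_lift = [3.9, 4.2, 3.5, 4.0, 4.4, 3.7]  # % change, comp stores that did not

t_stat, p_value = stats.ttest_ind(test_store_lift, control_store_lift)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the lift in test stores is unlikely to be chance alone,
# though it still cannot rule out every other difference between the groups.
```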
Lies are most often told by those most heavily invested
In business, decisions are rarely made in a vacuum. Managers making the investments believe that they will "work". The larger the investment, the more heavily vested the sponsors become in "proving" the value and outcomes. While very few managers set out to "lie", failing to adequately address the 7 Critical Questions opens the door to inaccurate results and to interpreting data in ways that mislead.
The single greatest challenge facing retailers today is that, in the rush for rapid change, they don't make the time to set up a measurement design before spending the money to implement change.
The antidote for lies and damned statistics: eliminate bias upfront
The point is that data cannot inherently lie to us. As business pressures mount, there is a great risk of using analytics to grasp at straws to find "something". With "big data" there is an even greater risk of using today's sophisticated tools to mine everything until some "significant" relationship turns up.
The simple but tough solution: don't guess and pray to find something. The best of breed design their measurement upfront to address the 7 Critical Questions for Decision Precision.
To receive more information and sound bites from IMS, follow IMS Results Count on Twitter, Facebook, Pinterest and Google+.
Sources:
- Wikipedia: Lies, Damn Lies, and Statistics
- Customer Experience IQ; www.cx-iq.com, Gregory Yankelovich