It started back in the Woodstock era [1] – small, like the Big Bang. And like the Big Bang, Big Data exploded across the void to saturate every aspect of life on Earth.
Each day, without exception, people around the globe create 2.5 quintillion bytes of data (one quintillion = 1 followed by 18 zeros – 1,000,000,000,000,000,000). Google all by itself processes 24 petabytes of data daily – enough to store the DNA of the entire human race three times. Even a typical single-rooftop U.S. auto dealership creates or stores about a terabyte of data each day, equal to roughly 2,000 hours of CD-quality recording. There is simply no place on Earth, our micro-universe, where data does not affect, control or manage the human experience.
It’s only just beginning. Any way you look at it, whatever you might think of it, big data is here to stay, and is growing exponentially minute by minute, hour by hour, day by day.
How to navigate this new universe, how to survive and thrive in it, is a mind-boggling task. The management, storage, cleansing and proper use of data is a massive, expensive job for the largest global enterprises, and an almost impossible one for mid-sized and smaller businesses.
Take, for example, the U.S. Postal Service, which in 2013 lost 6 billion pieces of mail at a cost of about $1.5 billion, all due to bad data. A Gartner survey [3] put the annual cost of bad data at $13.3 million per enterprise among Fortune 1000 companies. (More than 25% of their critical data is flawed.) For typical franchised auto dealerships, the cost of bad data has been pegged at over $480,000 per year per rooftop, right off the bottom line. (More on this later.) Add it all up, and incorrect, inconsistent, fraudulent, redundant and stale data cost the U.S. economy over $3 trillion per year [2] – more than five times the total 2016 U.S. Defense budget!
Bad data arises in any number of ways. Some of the most typical causes: incorrect data entry; stale data due to aging; changes of address; death; expired permissions; and more. Data requires constant updating and maintenance. As the ocean erodes the land, time forever erodes data quality. Without close, diligent attention, data cannot survive uncorrupted.
- Using the J.D. Power & Associates figure of a 12% national closing ratio, and
- Applying it to a dealership average of 1,000 leads per month, a typical dealership
- Will not close 880 of the prospects who enter its CRM every month.
- Now, take an average ad budget of $50,000 per month, or $600,000 per year.
- At 12,000 total CRM entries per year (1,000 x 12),
- The direct cost per lead to the dealership is $50.
Now, let’s say we have an above-average dealership that’s closing at 20%, and is losing only 800 sales per month. The math is:
800 lost leads per month
x $50 cost per lead
x 12 months
= $480,000 in Direct Cost per year, or $2.4 million lost over five years.
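The arithmetic above can be sketched in a few lines. The figures are the article's illustrative assumptions (1,000 leads and $50,000 in ad spend per month, a 20% close rate), not live dealership data:

```python
# Direct cost of unclosed CRM leads, using the article's illustrative figures.

def direct_cost_of_lost_leads(leads_per_month, closing_ratio, ad_budget_per_month):
    """Return (cost_per_lead, lost_leads_per_month, annual_cost)."""
    cost_per_lead = ad_budget_per_month / leads_per_month   # $600,000 / 12,000 entries = $50
    lost_leads = leads_per_month * (1 - closing_ratio)      # prospects that will not close
    annual_cost = lost_leads * cost_per_lead * 12           # direct cost per year
    return cost_per_lead, lost_leads, annual_cost

# Above-average store: 20% close rate, 1,000 leads/month, $50,000/month ad budget.
per_lead, lost, annual = direct_cost_of_lost_leads(1_000, 0.20, 50_000)
print(per_lead, lost, annual)   # 50.0 800.0 480000.0
```

At the typical 12% closing ratio, the same function returns 880 lost leads and $528,000 per year, which is why even the "above-average" store in the example is leaving $480,000 on the table.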
That’s just the beginning of the expense. You can attribute the lost sales – these missed opportunities – to a galaxy of causes, including poorly managed showroom processes, poor follow-up and drip campaigns that end too soon.
Whatever the cause and whatever you name it (dead/lost/unsold), that CRM data is turning into junk – a worthless, unrecoverable expense that degrades everything around it and makes the entire CRM suspect. Again, you can blame sloppy data entry, prospects moving, changing phone numbers and emails, death, naturally occurring opt-outs, or automatic legal compliance opt-outs due to a lack of written opt-in consumer consent. Every year, time and lack of care turn thousands upon thousands of missed opportunities that a dealership already paid for into unusable, toxic junk.
Let’s turn to the Department of Defense one more time to understand the Indirect Costs of bad data.
The D.O.D.'s data quality standards categorize indirect dirty-data costs into three verticals: Operational, Tactical and Strategic. These verticals apply equally well to car dealerships:
- Operational – Employee and customer dissatisfaction, and increased operating expenses due to misallocated resources. Advertising and marketing efforts perform poorly, reducing their ROI.
- Tactical – Bad data leads to bad decisions and causes channel conflict, resulting in wasted time and effort to re-engineer processes. This creates mistrust and loss of confidence within the organization.
- Strategic – Bad data makes it difficult to define and organize organizational strategies. Uncertainty distracts ownership and management from critical strategic arenas such as customers and competition.
There is a Solution
You can fix bad data. But you can’t improve what you can’t measure.
Here are the qualities of good data: completeness; uniqueness; timeliness; validity; accuracy; and consistency.
Here are the steps to creating quality data: cleansing; matching and linkage; data dimensions; enhancement; standardization; profiling; auditing; and monitoring.
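As a minimal sketch of the first few steps – cleansing, standardization, matching and de-duplication – applied to CRM-style contact records. The field names and validity rules here are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical sketch of basic CRM record hygiene: standardize fields,
# drop records that fail simple validity checks, and de-duplicate on a match key.
import re

def standardize(record):
    """Normalize name/email/phone formatting (illustrative rules only)."""
    return {
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
        "phone": re.sub(r"\D", "", record.get("phone", "")),  # keep digits only
    }

def is_valid(record):
    """Completeness and validity: require a plausible email and 10-digit phone."""
    return "@" in record["email"] and len(record["phone"]) == 10

def cleanse(records):
    seen, clean = set(), []
    for r in map(standardize, records):
        if not is_valid(r):
            continue                          # validity: discard junk records
        key = (r["email"], r["phone"])        # simple matching/linkage key
        if key not in seen:                   # uniqueness: drop duplicates
            seen.add(key)
            clean.append(r)
    return clean

raw = [
    {"name": "jane smith ", "email": "Jane@Example.com", "phone": "(555) 010-1234"},
    {"name": "JANE SMITH", "email": "jane@example.com", "phone": "555-010-1234"},
    {"name": "Bad Record", "email": "not-an-email", "phone": "123"},
]
print(cleanse(raw))   # one clean, de-duplicated Jane Smith record
```

Real-world matching and enhancement go far beyond an exact-key comparison – fuzzy name matching, address standardization against postal data, and third-party append services – which is part of why the work is rarely done well in-house.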
Data quality solutions vary dramatically depending on the organization, industry and data needs. There is no one-size-fits-all data quality solution in the market.
As a rule, dealerships cannot keep their CRM and DMS data clean in-house without hiring dedicated staff and contracting with – and managing – multiple outside vendors. And even after the data is cleansed, the dealership still needs a strategy and the means to use it profitably.
Yet, a dealership with a CRM of approximately 30,000 records – properly cleansed, appended and verified – can realize an additional 10 to 15 incremental unit sales every month. It’s worth it. Ultimately, the quality of a dealership’s data can be its competitive advantage or disadvantage.
Smart dealerships are catching on to the costs – and potential – of the missed opportunities in their CRM. They are seeking out reputable, automotive-centric, data and technology firms with the latest innovations in data hygiene and implementation to turn garbage into gold. For those dealerships, what was once a dead cost has become a consistent profit center.
FOOTNOTES
[1.] There are many ways to set a birth date for the Big Bang of Big Data. For the purposes of this article, I have selected 1969, the year the first communications were sent across ARPANET, the precursor to today’s Internet, which itself gave rise to cloud computing.
[2.] Bad Data Costs the U.S. $3 Trillion Per Year, Thomas C. Redman, Harvard Business Review, September 22, 2016. https://hbr.org/2016/09/bad-data-costs-the-u-s-3-trillion-per-year
[3.] Dirty Data is a Business Problem, Not an IT Problem, Gartner, March 2007. http://www.gartner.com/newsroom/id/501733
OTHER SOURCES
Auto Dealers that Embrace Technology Deliver a More Satisfying Sales Experience, J.D. Power and Associates, 2015
A Survey on Data Quality: Classifying Poor Data, IEEE Computer Society, a paper presented to the 2015 IEEE 21st Pacific Rim International Symposium on Dependable Computing
The ROI of Data Quality, by William McKnight (McKnight and Associates), and Firstlogic, for Experian, 2014