The Effects Of Sub-Optimal Data On Modern-Day Corporate Strategy.







“In God we trust – but all others bring data”


These are the words of Eric Schmidt, former CEO of Google, from a 2012 speech on Collective Intelligence at MIT’s Sloan School of Management.  The context was the ever-changing, dynamic culture of the Internet as the planet’s largest data engine.

The costs of sub-optimal financial and non-financial data-collection are possibly best revealed in numerous past corporate strategy blunders that emanated from the absence of, inaccuracy of, or misuse of, Data (Big or Small).

Many will remember well the decision of Coca-Cola in 1985 to completely revamp its recipe and brand under the “New Coke” label.  The Company’s share lead over its chief competitor, in its flagship market, with its flagship product, had been gradually declining for 15 years.  Believing that it needed change to spur growth and interest, the Company tested its new secret recipe in 200,000 taste tests across the country.

While the data revealed that most consumers taking the test preferred the taste of the new Coke to the traditional recipe, what was not taken into account was the deep emotional attachment consumers had with the Coca-Cola brand.

Amid the outcry of the new look of cans and, by association, the new taste, calls flooded in to 800-GET-COKE and to Coca-Cola offices across the country demanding to know why the disastrous change was made.  Consumers held employees and executives alike responsible for the change, and the whole change process was tagged as the “marketing blunder of the century”.

The official rollout began on April 11, 1985, and by July the Company had reversed the whole strategy, introducing “Coca-Cola Classic” to quell the fears of its committed army of followers.

This is an example of insufficient data, in this case non-financial.  The power of the brand could easily have been revealed through data-collection efforts such as surveys asking whether change was necessary, whether consumers regularly purchase other carbonated drinks and why, and how important the Coke brand is to them.

Consider Kodak.  The Company actually developed the first digital camera in 1975, but collected insufficient data to accurately reveal its potential, and consequently shelved the project.  By the time it realized the market shift and the potential of the technology, it was too late; the companies that had established the market were on their second generation of product offerings before Kodak got started.

The need to get data right and comprehensive enough to facilitate correct strategic decisions is prevalent in companies of all sizes; the effects of short-cutting the data quality exercise are self-evident.  A study from DemandGen Reports tabulating the impact of bad data on the Enterprise reports that 62% of organizations rely on marketing and prospecting data that is 20 to 40% incomplete or inaccurate.[1]  Additionally, almost 85% of businesses said they are operating customer relationship management (CRM) and sales force automation databases with between 10 to 40% bad records.

These inaccuracies lead to substantially increased costs and lost production time.  Such insufficient, poor-quality, or poorly managed data produces email campaigns with response rates below 3% and mail-return rates of 15 to 20%.

Compounding the problem, the research reveals that 68% of executives allow sales and marketing staff to access external sources to “supplement” their databases and information assets, summarily discounting the substantial costs already incurred to provide a centralized hub of information for compiling sales and marketing metrics.  Even more sobering, by doing so these executives cast a tacit vote of no confidence in the accuracy and sufficiency of the data they spend so much money collecting and working with.

Companies also undermine the transforming power of their data by placing pockets of it in silos spread across different departments.  This decentralization results in duplicated data, an inability to execute comprehensive cleansing and optimization routines, and excess staff time spent coordinating the data into a form that is meaningful to sales staff.

To enhance data quality and reduce data-transformation times, a good Data Quality (DQ) process attends to three main areas:

  1. Data Accuracy
  2. Data Cleansing
  3. Data Optimization






Data Accuracy

Data accuracy is a function of the strategic plan the Company employs.  If the strategic plan does not align with the data collected, no data-optimization exercise will cure the misalignment.

The preferred approach to ensuring compatibility between strategy and data utilization is to first assess internally which drivers could lead to the success of the strategy, and then collect meaningful data to ascertain what those drivers actually are.  For example, Coca-Cola should have widened its pool of drivers to explain consumer response to the new Coke, but the pool was limited to taste.  Expanding it to include customers’ relationship to the brand would have revealed a strong attachment.  It is the responsibility of C-level personnel to assess the parameters of the strategy to be employed.

Equally important is targeting the right market to obtain the right data.  In the new Coke example, knowing the age group and demographic that purchases the most Coke is necessary in order to assess how impressionable they are and how they would respond to any changes to the product.


Data Cleansing

Once there is consensus on what data is needed to correctly facilitate the strategy, the data must be cleansed of duplications, errant attributes, and missing information.  The DemandGen report reveals that 30% of respondents have no strategy for cleaning their raw data (via software, processes before data enters the funnel, etc.), and more than a third of respondents leave inaccurate or incomplete records in their databases, requiring sales teams to update them as often as possible.  When asked how they update their in-house customer and prospect databases, the most common response was “manually.”

This means that substantial employee time is wasted on non-value-added activities, risking runaway costs and reduced value to the Company.

If done right, much of the data-cleansing burden can be eliminated at the collection stage, provided data collection is aligned with the strategy and the collection process is comprehensive and sufficient for the needs of the strategic deployment.
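As a minimal sketch of what a cleansing pass over CRM-style records might look like, the routine below deduplicates on a normalized key and rejects records with missing or malformed attributes.  The field names ("name", "email") and the rejection rules are illustrative assumptions for the example, not drawn from the DemandGen study.

```python
def cleanse(records):
    """Deduplicate by normalized email; reject rows with missing fields.

    Returns (clean, rejected) lists so rejected records can be reviewed
    rather than silently discarded.
    """
    seen = set()
    clean, rejected = [], []
    for rec in records:
        # Normalize the key attribute so near-duplicates collapse together.
        email = (rec.get("email") or "").strip().lower()
        # Reject records with missing or malformed key attributes.
        if "@" not in email or not rec.get("name"):
            rejected.append(rec)
            continue
        # Drop duplicates that differ only in casing or whitespace.
        if email in seen:
            rejected.append(rec)
            continue
        seen.add(email)
        clean.append({**rec, "email": email})
    return clean, rejected

raw = [
    {"name": "Ann Lee", "email": "Ann.Lee@example.com"},
    {"name": "Ann Lee", "email": "ann.lee@example.com "},  # duplicate after normalizing
    {"name": "", "email": "bob@example.com"},               # missing name
]
clean, rejected = cleanse(raw)
```

In practice the same two ideas — normalize before comparing, and quarantine rather than delete questionable records — carry over regardless of whether the cleansing is done in software or as a process gate before data enters the funnel.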


Data Optimization

To transform data into usable information, software analytics complemented by the experience and insights of decision makers and executive personnel will optimize the use of the data, but only if the data arriving at this stage is clean and current.  Optimization of the data also needs to be reconciled to the strategy by using insights, which provide the “how” and the “why” behind the “what” of analytics.  The adage “Garbage In, Garbage Out” remains a call to due diligence at all stages of strategy implementation.

When a well-considered and researched strategy is combined with grounded insights and resolute management, data can be correctly and confidently used as an important component to reveal optimal strategic alternatives.


Consider Southland Corporation in the 1970s, known for pioneering the concept of the convenience-store chain with its 7-Eleven shops.

Toshifumi Suzuki, the first CEO of 7-Eleven Japan, decided that the key to the profitability of the Company’s stores would be rapid inventory turnover.  So he placed responsibility for ordering in the hands of the stores’ 200,000 mostly part-time sales clerks.  The strategy was based not just on the data Suzuki had, but also on his insights and common-sense approach to the space in which Southland participated.[2]

Intending to get the right data to the stores in a timely manner, and in a format that could be most easily used to make the decisions necessary to carry out his profit-increasing strategy, Suzuki sent each store daily sales reports and supplemental information such as weather forecasts.  The reports detailed what had sold the previous day, what had sold the previous year on the same date, what had sold the last day the weather was similar, and what was selling in other stores.  In addition, he connected the clerks with suppliers to encourage the development of items that would suit local customers’ tastes.
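The report logic described above — yesterday’s sales, the same date last year, and the most recent day with similar weather — can be sketched as a simple lookup over a sales history.  The data layout (dates mapped to weather and units sold) and the figures are assumptions invented for the illustration, not 7-Eleven Japan’s actual system.

```python
from datetime import date, timedelta

# Hypothetical per-store sales history: date -> weather and units sold.
history = {
    date(1984, 7, 10): {"weather": "rain", "units": 120},
    date(1985, 7, 10): {"weather": "sun",  "units": 150},
    date(1985, 7, 11): {"weather": "rain", "units": 90},
}

def daily_report(history, today, forecast):
    """Assemble the three comparison points a clerk would order from."""
    prev_day = history.get(today - timedelta(days=1))
    same_date_last_year = history.get(today.replace(year=today.year - 1))
    # Most recent past day whose weather matches today's forecast.
    similar = max(
        (d for d, row in history.items()
         if d < today and row["weather"] == forecast),
        default=None,
    )
    return {
        "previous_day": prev_day,
        "same_date_last_year": same_date_last_year,
        "last_similar_weather": history.get(similar),
    }

report = daily_report(history, date(1985, 7, 12), forecast="rain")
```

The point of the sketch is the design choice the article credits to Suzuki: rather than hand clerks raw data, the report pre-selects the few comparisons that matter for an ordering decision.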

The result?  7-Eleven Japan has been the most profitable retailer in Japan for the last 30 years.  Why?  Because Suzuki determined the correct data to put in the hands of the people responsible for using it, ensuring that the top-level strategy was realized.  Suzuki also made sure the data was clean and reliable before it was disseminated to the stores, and was adamant about retaining primary control over the definitions of the data to be deployed.  Lastly, he aligned responsibility for inventory management with those most able to carry it out successfully.

Maintaining a Data Quality strategy by managing collection, cleansing, and optimization procedures is an easily recognizable step in the process of facilitating strategic success.  Just as more time spent planning software results in fewer customer complaints after the sale, more time spent on data optimization results in a higher success rate for the strategic plan.


Nicholas Kilpatrick is a partner at the accounting firm of Burgess Kilpatrick.  He leads the firm’s consulting and strategy practice and works with companies to enhance their analytics, forecasting, and data-optimization functions.  The practice’s focus includes quantitative forecasting and corporate and unit strategy and planning.  Please visit our website or Facebook page for more information on our firm.


[1] DemandGen Reports, Assessing the Impact of Dirty Data on Sales & Marketing Performance, Waltham, MA, July 2013.

[2] Ross, Jeanne W., You May Not Need Big Data After All – Learn how lots of little data can inform everyday decision making; Harvard Business Review; Harvard University Press; pp. 92-93.
