Harnessing big data

Addressing the analytics challenges in supply chain management. 


A changing workforce and a lack of convergence between information technology (IT) and the business may be preventing many companies from joining the big-data revolution. Defined as very large data sets, but more commonly used to refer to the rapid growth in the amount of data generated in recent years, big data will divide companies into two groups over the next decade: those able to benefit from its potential and those that cannot. Companies that build capabilities for capturing, processing, analyzing, and distributing data to make better decisions in real time will likely outperform their competition and respond more quickly to their customers’ needs. The data avalanche is coming from a number of sources, such as enterprise resource planning systems, orders, shipments, Web logs, GPS data, radio-frequency identification, mobile devices, and social channels; and there is value to be created in every area of a business by adopting a data-driven culture.

However, in discussions about big data’s arrival, we sometimes forget to ask how effectively we’re converting the data into value. Too often, huge investments in IT infrastructure, coupled with sophisticated analytical and reporting software, have delivered little value. Why? We often find it’s because companies are understaffed, or they lack analytics talent that knows how to build links between the data and the value drivers. There is also a gap between finding insights in data and applying those insights to create value. That is where the training and experience of a company’s analysts enter the equation.

One area of particular concern is supply chain management (SCM). A company’s SCM organization makes decisions about build plans, stocking locations, inventory levels, and so forth based on the conversion of raw data about demand, sales, and inventory on hand. And when there’s a shortage of analytics talent, SCM is typically one of the first areas affected. Traditionally, analytical innovation happens in two ways: either through an internal-pipeline process of developing junior analysts into senior analysts or by periodically bringing in external experts to seed knowledge. But big data is challenging both approaches.
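The data-to-decision conversion described above can be as simple as turning demand history into a restocking trigger. A minimal sketch of one common approach, a reorder-point calculation; the demand figures, lead time, and service-level parameter are hypothetical, not from the article:

```python
import statistics

def reorder_point(daily_demand, lead_time_days, z=1.65):
    """Convert raw demand history into a stocking decision.

    z = 1.65 corresponds to roughly a 95% service level under a
    normal-demand assumption.
    """
    mean_d = statistics.mean(daily_demand)
    sd_d = statistics.stdev(daily_demand) if len(daily_demand) > 1 else 0.0
    # Expected demand over the replenishment lead time...
    cycle_stock = mean_d * lead_time_days
    # ...plus a safety buffer that grows with demand variability.
    safety_stock = z * sd_d * lead_time_days ** 0.5
    return cycle_stock + safety_stock

# Hypothetical raw data: units sold per day over two weeks.
demand = [42, 38, 51, 47, 39, 44, 50, 41, 46, 43, 52, 40, 45, 48]
rop = reorder_point(demand, lead_time_days=5)
# When on-hand inventory falls below rop, the SCM team reorders.
```

The point is not the formula itself but the pipeline: someone must know both where the demand data lives and why the z-parameter matters to the business, which is exactly the end-to-end skill the article argues is in short supply.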

The internal pipeline is challenged by a workforce marked by shorter tenures. Shorter tenures result in more generalists in the workforce, often in place of the specialists needed for analytical innovation. Younger workers, such as millennials, are significantly less likely to settle into a long career at a company. According to a survey by Future Workplace, 91% of millennials (born in the 1980s and ’90s) expect to stay in a job for less than three years (Meister 2012). That means those in analytical roles are usually in the job only long enough to execute established analytics, and not long enough to develop a holistic understanding of how data can be applied to drive business value. As a result, those on the business side and those on the IT side don’t always learn to make the end-to-end connections between raw data and measurable value. The internal-pipeline approach is further challenged by companies themselves: frustrated by high turnover, they are less likely to invest in developing their people, only to watch those people leave for higher-paying positions.

The second approach, periodically bringing in external experts to rebuild a process or implement the latest software package, is also starting to show wear. The evolution cycle of new analytical techniques is rapidly speeding up as big data brings opportunities to better integrate internal and external data sources. Traditionally, companies could implement a software solution, or bring in experts to install the latest offering, and then profit from that investment for five to seven years; the initial cost was justified by the continued value in the years to come. Now, however, the volume, variety, and velocity of the new data being generated are changing the business landscape, calling for a more rapid cycle of analytical-tool introduction, and that landscape itself now shifts every two or three years. As a result, the days of big-bang projects appear to be coming to an end.

What can be done? Companies should look across the entire supply chain, or across any function, for that matter, and measure the amount of data being generated. Then they should weigh that measurement against the value actually realized. If data volumes are growing more rapidly than the corresponding increase in value, there may be an analytics talent challenge.
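This comparison can be expressed as a simple growth-ratio check. A minimal sketch, assuming year-over-year figures for data volume and realized value are available; all numbers below are hypothetical:

```python
def talent_gap_signal(data_volumes, value_realized):
    """Compare growth in data generated against growth in value realized.

    Returns True when data is growing faster than value -- the article's
    rough indicator of a possible analytics talent challenge.
    """
    data_growth = data_volumes[-1] / data_volumes[0]
    value_growth = value_realized[-1] / value_realized[0]
    return data_growth > value_growth

# Hypothetical figures: data volume (TB) and value realized ($M) by year.
volumes = [10, 40, 160]    # data roughly quadrupling each year
value = [2.0, 2.6, 3.4]    # value growing about 30% each year
flag = talent_gap_signal(volumes, value)
```

Here data has grown 16x while value has grown only 1.7x, so the signal fires; the gap between the two curves is the measurement the article recommends watching.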

Three methods of creating value have proved effective in today’s rapidly changing market.

1. Outsourcing portions of analytic requirements

Companies can approach analytics outsourcing in a variety of ways, ranging from a data prep model—in which a company hires a third party to process raw data to the point where an analyst can consume it—all the way to a fully outsourced model, in which a third party processes and analyzes the data, potentially adds other proprietary data, and sends back fully actionable information. The data prep model enables a company to focus a limited pool of analysts on the critical knowledge-capture portion of the process and thereby free up time spent on non-value-added processes. The fully outsourced model enables companies to stay up-to-date on the latest technologies and software without having to make up-front investments to purchase the latest software and technology.

2. Creating central analytics teams

Companies that rely heavily on converting data to knowledge can set up an analytics group focused solely on solving analytical issues across the company. Such companies have adopted analytics as a core differentiator and encourage analysts to develop the holistic view that facilitates insight. Central analytics groups tend to perform better than embedded groups, especially when they report through the business side. Of course, maintaining a group dedicated to analytics is an investment that some companies may hesitate to make, but there is tremendous value in having such in-house expertise.

3. Partnering with academic or not-for-profit institutions

Academic and nonprofit organizations are often-overlooked resources. For instance, the brand-new Center for Supply Chain Management at the University of Pittsburgh intends to connect students and faculty with industry representatives who will promote experience-based learning within the university’s supply chain management courses. To improve the center’s effectiveness, the university plans to create a Supply Chain Management Industry Council composed of member companies dedicated to SCM. The council members, along with tenured faculty specializing in teaching SCM, will foster interest and excellence in SCM and analysis. Other institutions offer training, certifications, and conferences that encourage and enable analysts to further develop and share ideas. The Institute for Operations Research and the Management Sciences recently introduced the Certified Analytics Professional certification, giving companies an option for developing their people without having to make hefty investments in training organizations.

Big data is fundamentally transforming the way business operates. It is enabling management to track the previously untrackable, forecast the previously unpredictable, and understand interactions between suppliers and customers—all of it with unprecedented clarity. And winning organizations will invest in the necessary infrastructure and people to harness the transformative power of data.



2 thoughts on “Harnessing big data”

  1. Rahul Saxena

    Great article and a pointer to analytics resources being under-staffed and under-skilled. I’d add that the analysts often lack the skills to build the connection between analytics and results (converting data to value) because they don’t understand how their insights are converted into decisions and actions. This is the gap that I’m trying to address, and some of my thoughts are in my book and on Slideshare (http://www.slideshare.net/rahsaxen).

    Incidentally, I moved from the Bay Area to Bangalore to solve for a US firm’s analytics skills shortage using offshoring.

    Rahul Saxena
    LinkedIn: http://in.linkedin.com/in/rahulsaxena
    SlideShare: http://www.slideshare.net/rahsaxen

  2. Victor Resendez

    Outsourcing potentially has risks regarding safeguarding that proprietary data. Differentiating between proprietary information and shared or general information can cause other issues for the companies providing those services. How do those companies protect themselves to avoid legal repercussions?



