APIs and the High Value of ‘Useless’ Data

Global enterprises spend a lot of money on technology: up to 7% of revenues go toward hardware and software to communicate, gather, store, curate, and analyze the organization's data. However, a robust IT infrastructure is only as valuable as a company's ability to use it to transform data into actionable insights. It is the insights, not the raw data, that unlock answers. Yet despite significant investments in data management, most companies struggle to extract insights from their data, even though doing so yields substantial benefits to both the top and bottom lines. When it comes to supply chains, there are two main reasons for this.

Why Companies Can’t Gain Insights from Data

First, most companies still try to manage and analyze their data manually, even though today's expansive supply chains generate far too much data for that to be feasible. Relying on manual analysis is like using messenger pigeons in an age of web conferencing, email, and single-click mass communication. Second, data is often siloed by business unit, department, team, region, or simply by whichever technology the organization was using when the data was captured. Most enterprises have more than 400 different applications supporting their operations. Without a centralized means of control, the risks of data disorganization, corruption, and loss are acute.

Because the data is siloed, each time the organization mines it for crucial insights, it draws conclusions based on, at best, a small slice of the data. In reality, the best insights often come from cross-referencing data sets by financial reporting period, region, business unit, or vendor, and silos are precisely what prevent that kind of cross-referencing, as the simple example below illustrates. It is like setting out on a transcontinental journey with only a sliver of a map as a guide. The money is spent and the data exists, but if it can't yield insights, it's virtually useless.
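As a concrete illustration of what cross-referencing silos makes possible, here is a minimal sketch in Python (using pandas, with invented vendor IDs, column names, and figures) that joins a procurement extract with a logistics extract on a shared vendor key and surfaces a pattern neither system could reveal on its own.

```python
import pandas as pd

# Hypothetical extract from a procurement system (one silo)
procurement = pd.DataFrame({
    "vendor_id": ["V001", "V002", "V003"],
    "region": ["EMEA", "APAC", "EMEA"],
    "spend_usd": [120_000, 85_000, 43_000],
})

# Hypothetical extract from a logistics system (another silo)
logistics = pd.DataFrame({
    "vendor_id": ["V001", "V002", "V003"],
    "on_time_delivery_rate": [0.97, 0.81, 0.92],
})

# Cross-reference the two silos on the shared vendor key
combined = procurement.merge(logistics, on="vendor_id")

# A simple insight neither silo could yield on its own:
# spend concentrated in vendors with weak delivery performance
at_risk = combined[combined["on_time_delivery_rate"] < 0.90]
print(at_risk[["vendor_id", "region", "spend_usd"]])
```

In practice an integration platform, rather than an analyst's one-off script, would perform this join, but the principle is the same: the insights emerge where the silos meet.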

Solving the Twin Challenges of Manual Analysis and Siloed Data

The only real way to solve the twin challenges of manual analysis and siloed data is to use a platform that simultaneously breaks down data silos and transforms the resulting data deluge into actionable insights. Application programming interfaces (APIs) are a crucial part of solving this challenge. An API is a software interface that enables two different applications (including their data streams) to communicate with each other, in much the same way that an interpreter breaks down communication barriers between people who speak different languages. Whether we realize it or not, most of us use APIs every day, for instance whenever we use a search engine, ride-sharing app, or social media platform. Without these interfaces, we'd be searching databases manually, one by one, spending hours on what takes less than a minute to accomplish with APIs. APIs offer an efficient means of bridging data silos, and the best technology solutions will include APIs that integrate critical internal and external data streams.
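To make the idea tangible, here is a minimal sketch of what an API-based integration might look like, using Python's requests library. The endpoints, tokens, and field names below are hypothetical; the point is that two systems' data streams can be pulled and linked programmatically instead of through manual exports.

```python
import requests

# Hypothetical REST endpoints exposed by two internal systems;
# the URLs, tokens, and field names are illustrative only.
ERP_API = "https://erp.example.com/api/v1/purchase-orders"
TMS_API = "https://tms.example.com/api/v1/shipments"

def fetch_json(url, token):
    """Call an API endpoint and return its JSON payload."""
    response = requests.get(
        url, headers={"Authorization": f"Bearer {token}"}, timeout=30
    )
    response.raise_for_status()
    return response.json()

# Pull data from both systems through their APIs instead of manual exports
orders = fetch_json(ERP_API, token="ERP_TOKEN")
shipments = fetch_json(TMS_API, token="TMS_TOKEN")

# Index shipments by order number so the two data streams can be linked
shipments_by_order = {s["order_id"]: s for s in shipments}
for order in orders:
    shipment = shipments_by_order.get(order["order_id"])
    status = shipment["status"] if shipment else "no shipment yet"
    print(order["order_id"], status)
```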

To achieve the highest return on data, though, companies need an end-to-end software platform that also automates essential analytics and processes across the enterprise. The solution should be broad, covering demand planning, sourcing and supplier selection, production management, and logistics, all the way through export, import, and trade compliance, and it should deliver the most current data and the highest accuracy rates available. Such a solution also lets companies lay the groundwork for other high-level functions, such as demand sensing and demand planning, that will solidify competitive advantage over the long term.

Data quality has always been important for enterprise efficiency, but it's increasingly important for compliance too. The US Treasury Department's Office of Foreign Assets Control (OFAC) has begun issuing fines and penalties for violations that result from software errors. OFAC recently fined FedEx $500,000 for transacting with a restricted party even though FedEx used screening software intended to prevent exactly that. The software failed to alert FedEx to the restricted party because it flagged only exact word or phrase matches, not spelling variations. The effect of enforcement actions like this is clear: "reasonable care" now requires the most accurate, most comprehensive compliance software solution available.
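The exact-match failure mode is easy to see in code. The following sketch uses illustrative party names and a deliberately simplified similarity test from Python's standard difflib; it is not how any vendor's screening engine actually works, but it shows how a slight spelling variation slips past an exact comparison while a fuzzy comparison catches it.

```python
from difflib import SequenceMatcher

# Illustrative denied-party list; real screening uses the official OFAC lists
DENIED_PARTIES = ["Global Trading Compania", "Acme Export House"]

def exact_match(name, denied):
    """The brittle approach: flags only identical strings."""
    return name in denied

def fuzzy_match(name, denied, threshold=0.85):
    """Flag names whose similarity to a listed party exceeds a threshold."""
    name = name.lower()
    return any(
        SequenceMatcher(None, name, d.lower()).ratio() >= threshold
        for d in denied
    )

candidate = "Global Trading Company"   # spelling variation of a listed party

print(exact_match(candidate, DENIED_PARTIES))  # False: variation slips through
print(fuzzy_match(candidate, DENIED_PARTIES))  # True: variation flagged for review
```

Production screening engines use far more sophisticated matching, but the underlying lesson is the same: the accuracy of the matching logic is itself a compliance exposure.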

Breaking down data silos and automating data analysis isn't just something companies must do to stay relevant. It can create significant competitive advantage, not only by enabling cost reductions but also by generating new insights that drive market share and margin improvements. In almost every case, it makes more sense for an enterprise to integrate its various data streams onto a single platform that automates business-critical data analysis. This process of digitization equips companies to reduce long-run operating expenses, free up personnel for other business-critical tasks, see and seize opportunities faster while gaining scalability, and weigh a multiplicity of factors when evaluating strategic initiatives and analyzing trends.

Daniel Smith is Product Marketing Leader at Amber Road, now part of E2open
