Realizing Big Data’s Promise and Sharing the Benefits

A forward-looking CIO is using internal and customer-facing big data analytics to drive efficiencies, innovation, and revenue growth for his company – and for his customers.

By John Kochavatr, Chief Information Officer, SUEZ Water Technologies & Solutions

Leveraging big data holds tremendous promise for legacy industrial companies like ours. At the same time, there are significant challenges – for example, ensuring that your advanced analytics solutions are in fact delivering measurable economic benefits. But challenges notwithstanding, we have made real strides in our efforts to harness data to deliver efficiencies, cost savings, and innovation—for our customers and for ourselves.

Our business, SUEZ Water Technologies & Solutions—formerly GE Water & Process Technologies, before being acquired by Suez in October 2017—provides chemical and equipment solutions to help customers manage and optimize water resources and processes. We serve a wide range of industries, from power to food and beverage to utilities, in more than 130 countries.

Our customers face many challenges, including increasing regulatory requirements and corporate sustainability commitments. We provide the technology, chemistry, and related services to help customers meet those challenges within their businesses. Increasingly, we are leveraging digital data and analytics to help customers meet evolving operational needs—not just growing compliance requirements and environmental pressures, but also the need to improve efficiency and drive profitability.

In recent years, our analytics initiatives have matured from projects designed to improve our own productivity and efficiency to digital services we provide to our customers to help them innovate and grow.

Data, Data Everywhere

Most companies, including ours, already have access to many terabytes of data. We have manufacturing data from our factory machines and on product quality, financial data on how we run our business, commercial data on how well our products and services sell, and performance data from our field service teams.

To foster innovation using that data, we take a three-fold approach. First, we analyze the data we currently have to understand where we can help drive favorable outcomes. Second, we gather new data from critical operating assets, either from existing systems or by enabling them with sensors. Third, we marry these sets of data to derive algorithms that can uncover new sources of value. 

In some cases, we have been able to identify benefits using data we already had. To take a simple example, we started looking at costs for one of our high-transaction-volume products. We had outsourced the billing and collections process and were paying a service provider nearly $15 per invoice to help us collect the cash. Some basic data analysis showed that these products accounted for 25% of our total enterprise invoices and that each invoice averaged $400. We were spending as much effort on these invoices as we would on million-dollar invoices. As a consequence, we deployed a credit card payment system for our customers, many of them small businesses with no accounts payable departments. Customers were happy. And even with the 2% credit card fee, we reduced our payment processing costs by almost half – and we could collect the cash immediately.
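As a rough back-of-envelope check of those figures (a sketch using only the numbers cited above, not our actual billing data), the savings work out as follows:

```python
# Back-of-envelope check of the invoice example above.
# Dollar figures and the fee rate come from the article; everything else is illustrative.
old_cost_per_invoice = 15.00   # outsourced billing/collections fee per invoice
avg_invoice_value = 400.00     # average invoice amount for these products
card_fee_rate = 0.02           # credit card processing fee

new_cost_per_invoice = card_fee_rate * avg_invoice_value    # 0.02 * 400 = $8.00
savings = 1 - new_cost_per_invoice / old_cost_per_invoice   # ~0.47

print(f"New cost per invoice: ${new_cost_per_invoice:.2f}")  # $8.00
print(f"Reduction vs. outsourced fee: {savings:.0%}")        # ~47%, i.e. almost half
```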

Leveraging legacy data for insight is not always that straightforward. In our laboratory information management system, we have millions of data points associated with the samples we’ve tested from customers over the years. But they weren’t collected with advanced analytics in mind. Some of the data lacks the context required to make it useful for analytics. That’s forced us to go through all that data to determine where we can provide the context we need to analyze it.

In many cases, the outcomes we try to achieve for customers require that additional data be collected—we may have lab data collected six months ago, and customers may need new samples or new sensors to provide the additional insight. That’s where domain expertise has been critical in our analytics efforts. We can’t just throw a data scientist at the problem and demand an algorithm or regression that will work; you need to understand the customer’s processes.

Connecting the Data Dots

One area where we have made significant progress is with our asset monitoring program, launched a decade ago. We have since introduced an asset performance management platform called InSight, which is deployed to more than 4,500 global customers and supports over 45,000 active assets. 

The program enables us to measure our equipment effectiveness and maintain asset health. But what we are finding is that while our customers expect their operating assets to be reliable, what they really care about is overall performance at their plants, whether that’s an oil refinery or a power generation plant.

To address that pain point, we had to start looking at data from assets we didn’t manufacture and run analytics across all of a customer’s assets to monitor performance. Our asset performance management platform receives data from non-SUEZ assets operating on our customers’ SCADA systems. In some cases, we install sensors on customers’ older assets that lack them.

Previously, we would take water samples from cooling towers every week or so to look for signs of deposits, corrosion, or biological agents; apply the necessary chemical treatments; and come back a week or two later to do it all over again. We still perform the weekly on-site tests, but with our InSight platform we can now also monitor certain parameters in near real time. So when the acidity or conductivity of a cooling tower deviates from specified limits, we can make sure the customer is aware of the issue and help address it before it disrupts their operations.
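To make the pattern concrete, here is a minimal illustration of the kind of limit check this sort of near-real-time monitoring relies on. The parameter names, limits, and alert handling are hypothetical placeholders, not the actual InSight implementation:

```python
# Minimal illustration of a limit check on cooling-tower readings.
# Parameter names, limit values, and the alert path are hypothetical.
LIMITS = {
    "pH": (6.5, 9.0),             # acceptable acidity range (illustrative)
    "conductivity_uS": (0, 3000)  # acceptable conductivity range (illustrative)
}

def within_limits(parameter: str, value: float) -> bool:
    """Return True if the reading falls inside its specified limits."""
    low, high = LIMITS[parameter]
    return low <= value <= high

def on_new_reading(parameter: str, value: float) -> None:
    if not within_limits(parameter, value):
        # In practice this would notify the account team and the customer
        # before the deviation disrupts plant operations.
        print(f"ALERT: {parameter} = {value} outside limits {LIMITS[parameter]}")

on_new_reading("conductivity_uS", 3450)  # example deviation -> triggers an alert
```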

Outcomes-as-a-Service

Initially, our data and analytics helped our account managers and field teams work smarter and more efficiently. A few years in, we began analyzing the data to identify benchmarks for “best-in-class” performance across the assets we monitor, from boilers to reverse osmosis machines. That has led us to develop products, water treatment methods, and services that help our customers become more competitive in their own markets.

For instance, one way our oil and gas customers make money is by producing distillates from the oil they process; naturally, they would like to produce more of the most profitable distillates, such as diesel fuel. To assist with that, we can combine our domain expertise, monitoring solutions, and data analytics to employ the right chemistries and optimize the tower-top temperatures of the crude unit so that our refining customers can maximize their yield of those profitable distillates.

Taking an outcomes-as-a-service approach lets our customers pay only for results, which is very different from simply buying our equipment or chemicals. That is the direction we’d like to go to maximize the value of digital data and services for our customers in the future.

Revenue generation is a new role for IT. As we were developing all these terrific tools, we weren’t sure we had the right resources internally to commercialize them. So we looked to our IT service providers for help. We re-examined our partner portfolio and decided to work only with vendors willing to help us sell digital solutions.

As IT has shifted from a cost center to a strategic lever (we have already generated several significant orders this year), we have earned a much bigger seat at the table. Over the past five years, my role has evolved from managing infrastructure and ERP systems to playing a critical role in digital transformation.

That has required me to develop my business acumen and my understanding of our customer base. I focus much more on business outcomes than technologies. I spend a lot of time identifying and eliminating friction points across the value chain. My team has also had to become much more savvy about our business, our customers, and how we develop and commercialize our products.

What Is Success?

With that new mindset comes a crucial question: “How do we define success with our data-driven initiatives?” And whatever answer we devise must be measurable. The return doesn’t have to be economic, but we need to understand the outcomes we are seeking.

I always ask business owners how they define success before we embark on any major work. I’m looking for a commitment – a revenue increase, margin expansion, or an increase in customer retention. In addition to ensuring that project benefits are tracked, this step gives leaders skin in the game, because they will be held accountable for the results. That helps ensure they will make the necessary business or process changes to realize the benefits.

We can pour investment dollars into developing algorithms and advanced analytics all day long. But if those investments don’t tie back to a committed return on investment or aren’t aligned with a customer outcome, they are likely to fail before they start. We want our investments to convert into products and services that enhance customer outcomes and ultimately translate into economic returns for our business.

***

The Takeaways 
To drive advanced analytics initiatives, IT organizations must examine and analyze legacy sources of data, identify potential new sources of data, and connect the two to deliver insight.
Analytics efforts designed to improve internal productivity and efficiency can pave the way for customer-facing analytic services that generate new revenues.
When IT is able to harness data to generate new revenues, it gets a bigger seat at the corporate table. But with that expanded role comes additional responsibilities.