By Straight Talk Editors

[Read Part 1 and Part 2]

Can We Get the Data to Work for Us?

To move from the Internet of Things to the Internet of Experiences, products and services should be developed not only with an emphasis on making technology fade into the background, but also with data and its analysis at their core. The value of the “things” in the Internet of Experiences is measured by how well the data they collect is analyzed and how quickly useful feedback based on this analysis is delivered to the user. The problem with the current crop of wearables is not only the poor design of the user interface but also the absence of relevant data and analysis. “For the mass market,” says Donna Hoffman, the co-director of the Center for the Connected Consumer, “you need to think about how to get this information to people so they are not overwhelmed but they can use it to improve their lives.”

Collecting all the relevant data also means tapping into other sources of information rather than limiting collection to a single “thing” or a single user. Hal Varian provides an example from his home state of California, which is suffering a severe drought. “Yet I still see lawn sprinkler systems running while it is raining,” he says. “Certainly sprinkler systems should be smart enough to check the weather forecast and current conditions before turning on the water!” Or, as Aziza, the CMO and self-described Quantified Self geek, says: “We work for the data today; instead, the data needs to work for us.”
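Varian’s smart-sprinkler example can be sketched in a few lines of code. This is an illustrative toy, not a real sprinkler or weather API; the thresholds and field names are invented for the sketch.

```python
# A minimal sketch of the "smart sprinkler" logic: consult current
# conditions and the forecast before watering. The 50% rain-chance and
# 0.3 soil-moisture thresholds are hypothetical, chosen for illustration.

def should_water(is_raining: bool, rain_chance_next_24h: float,
                 soil_moisture: float) -> bool:
    """Skip watering if it is raining, rain is likely soon,
    or the soil is already wet enough."""
    if is_raining:
        return False
    if rain_chance_next_24h >= 0.5:  # rain likely; let the weather do the work
        return False
    return soil_moisture < 0.3       # water only genuinely dry soil

# Raining right now, so the sprinklers stay off
print(should_water(is_raining=True, rain_chance_next_24h=0.1, soil_moisture=0.1))
```

The point of the sketch is that the decision depends on data from outside the device itself, exactly the kind of cross-source awareness Varian is asking for.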

For the data to work for us, we need to follow a few Internet of Experiences principles: The data should be collected in non-intrusive ways; it should be collected from all relevant sources; the user should be involved in deciding what data is collected and how it is analyzed; the analysis should be provided to the user (or another “thing”) in a timely fashion; and the analysis should provide value (e.g., benchmarks, recommendations, predictions), delighting users and encouraging further use.
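The last two principles, timely analysis that delivers value such as benchmarks and recommendations, can be illustrated with a tiny feedback function. Everything here (the step count, the 8,000-step benchmark, the wording) is a hypothetical stand-in for whatever a real wearable would report.

```python
# A toy "data works for us" loop: turn a raw reading into a
# benchmark-based recommendation the user can act on immediately.
# The 8,000-step benchmark is an invented example value.

def analyze(steps_today: int, benchmark: int = 8000) -> str:
    """Compare a raw step count against a benchmark and return
    a concrete recommendation rather than the raw number."""
    if steps_today >= benchmark:
        return "Goal met - nice work."
    remaining = benchmark - steps_today
    return f"{remaining} steps to go; a 15-minute walk would close the gap."

print(analyze(6500))
```

A raw count of 6,500 steps is just data the user works for; the suggestion of a 15-minute walk is the data working for the user.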

Separating the Signal from the Noise

These principles apply not only to consumers, but also to enterprises. The Internet of Things promises to improve the internal operations of enterprises everywhere, cutting waste and improving productivity in numerous activities, from inventory management to supply chain logistics to customer relations. Just as with consumers, however, enterprises will not realize the true potential of increased connectivity if their workers are overwhelmed by data. If the data collected and transmitted by the Internet of Things is not relevant, if it is not provided in a timely fashion, if the analysis does not suggest ways to improve a work activity or process, then enterprises will not benefit from the Internet of Experiences. If employees, managers, and senior executives in enterprises big and small, private and public, don’t have positive experiences with this abundance of data, they won’t take advantage of it.

Mike Cavaretta, the Technical Leader for Predictive Analytics and Data Mining at Ford Research and Innovation Center, highlights this importance of separating the signal from the noise: “I don’t think the problem is the storage of the data. It’s ingesting the data, the analysis of the data, and the understanding of the data. Machines don’t drown in data; people drown in data.”

Cavaretta is in a unique position to assess the impact of the current data deluge, and of the coming data tsunami unleashed by the Internet of Things, on both consumers and enterprises. Ford’s Fusion Energi car model already generates about 25 gigabytes of data every hour, which is used to improve fuel efficiency and reduce emissions. But Ford’s research labs are experimenting with vehicles that produce 250 gigabytes of data per hour. And the number of cars connected to the Internet worldwide will grow to 152 million in 2020, up from 23 million today, according to IHS Automotive. That will be a small slice of the embedded systems market that IDC says will generate 4.4 trillion gigabytes of data worldwide in 2020.

Staying afloat in all this data becomes even more difficult if you attempt to increase relevance and value by integrating data from different sources, a task that’s particularly challenging for businesses. Says Cavaretta: “As you take more and more data sets and mash them together, you find that the value of the data can go up quite significantly. But it’s a huge perennial challenge to look across different and disparate data sets. Big data technologies such as Hadoop, however, allow you to put everything in one large repository.” And there’s value to be found in data from outside the enterprise and its ecosystem: “One data source that is really important is government, and more generally, the open data movement. There’s a lot of value here,” says Cavaretta.

Data merging and mashing, while increasing the value of the analysis, raise another issue: my data does not always talk to your data. Yes, they both consist of ones and zeros, but they are formatted differently and may use different codes, names, and labels to describe the same thing. Semantics in general, and the Semantic Web (a framework of common data formats that allows data to be shared and reused across Web pages and applications) in particular, promise to overcome this difficulty. “I would like to see these technologies take off,” says Cavaretta. “Google Trends is a great example of how semantic technology can rationalize things across different domains.”
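The “different codes, names, and labels for the same thing” problem can be shown with a toy example: two data sources describing the same vehicles under different field names, reconciled through a small shared vocabulary. This is the basic idea behind Semantic Web vocabularies, reduced to a sketch; all the field names here are invented.

```python
# Two hypothetical sources label the same facts differently. A shared
# vocabulary maps both sets of labels onto one canonical schema, so the
# merged records can finally "talk" to each other.

CANONICAL = {
    "veh_id": "vehicle_id", "vin": "vehicle_id",   # same thing, two names
    "fuel_l": "fuel_liters", "fuel": "fuel_liters",
}

def normalize(record: dict) -> dict:
    """Rename each source-specific field to its canonical name,
    passing through any field the vocabulary does not cover."""
    return {CANONICAL.get(key, key): value for key, value in record.items()}

source_a = {"veh_id": "X1", "fuel_l": 42.0}
source_b = {"vin": "X1", "fuel": 41.5}

print(normalize(source_a))
print(normalize(source_b))
```

Real semantic technologies (RDF, shared ontologies) do this at Web scale with formally defined vocabularies, but the mapping step is the same in spirit.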

Steven Gustafson, the Manager of the Knowledge Discovery Lab at GE Global Research, established his lab several years ago when he realized what semantics can do: capture, retain, and share the domain knowledge of GE’s engineers. “As data geeks, one of the things we first spend time on is talking to engineers, understanding the data and how it gets processed. I felt that with the advent of Semantic Web there is a better, more efficient way we can achieve things like self-describing data, and that would make finding patterns so much easier. We found a way to allow domain experts to capture their own knowledge in semantics and give them access to things like reasoning and tools for data integration by developing what we call semantic application design language, or SADL.”

Using an example from aviation, where GE estimates that servicing aircraft engines involves 205 million man-hours per year at a cost of $10 billion, Gustafson explains the utility of semantics. “We want to proactively manage engine service to prevent unexpected repairs,” he says. “We use semantic technology to extract data from maintenance records, where somebody wrote down what they did to return the engine to service, turning textual unstructured data into knowledge, eliminating uncertainty and providing enhanced understanding of an asset’s health.” This now-accessible knowledge about prior maintenance allows GE’s services teams to deliver better operational performance.
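The step Gustafson describes, turning free-text maintenance notes into structured facts, can be caricatured in a few lines. GE’s actual approach uses semantic models (via SADL), not the naive keyword matching below; the action words and part names here are invented purely to illustrate the idea.

```python
# A toy stand-in for text-to-knowledge extraction: scan an unstructured
# maintenance note for known action words and part names, and emit
# structured (action, part) facts that can be queried later.
# The vocabularies below are hypothetical examples.

ACTIONS = {"replaced": "replace", "inspected": "inspect", "cleaned": "clean"}
PARTS = ["fan blade", "fuel nozzle", "bearing"]

def extract(note: str) -> list:
    """Pull structured (action, part) facts out of a free-text note."""
    text = note.lower()
    facts = []
    for word, action in ACTIONS.items():
        if word in text:
            for part in PARTS:
                if part in text:
                    facts.append({"action": action, "part": part})
    return facts

print(extract("Inspected and replaced fan blade after vibration alert."))
```

Once notes become records like these, questions such as “how often has this part been replaced?” become simple queries instead of archaeology.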

“We will start moving this technology closer and closer to the asset,” says Gustafson. “When the data that is coming off of engines is already semantically tagged, we can start connecting things to give them more autonomy, to schedule when maintenance occurs, for example.”

Semantically tagged data could give enterprises a much more granular view of their operations, allowing them to move from tracking assets to servicing assets to managing assets, even to giving assets more autonomy to manage themselves. Web standards provide a layer that masks the idiosyncrasies of specific devices, especially those that have been developed and installed over the past 20 or 30 years and communicate in their own specialized language, thus creating data that can be shared by all other things on the Internet. The use of web standards, says Gustafson, will “enable complex assets and data to work more easily with people.” In the Internet of Experiences, my data will talk to your data, no matter what you are.

The Principle of Great Experiences

In Forrester’s latest global survey of executives, 74% said their company has a digital strategy. Still, 93% believed “digital” will disrupt their business in the next 12 months. Most important, only 15% believed their firms have the skills and resources to execute their digital strategies.

With the emergence of the Internet of Things, “digital” is taking on new dimensions, encompassing not only the growing number of people connected to the Internet but also the even faster-growing number of connected things.

To take advantage of the opportunities and overcome the challenges presented by the Internet of Things, product and service developers must re-evaluate product design and the role of data and analysis. They should follow the Principle of Great Experiences: The more inconspicuous and data-centric the technology, the faster it will spread among a growing network of delighted users. If we’re able to get technology to recede into the background and analyze the onslaught of data in a way that reveals its significance, the Internet of Things will become a true Internet of Experiences.

Originally published in CTO Straight Talk, No. 1 (August 2014)