Nicholas Ismail
Global Head of Brand Journalism
HCL Technologies Ltd.

Professional background: Nick Ismail is the Global Head of Brand Journalism at HCL Technologies, responsible for delivering the company's editorial and content strategy. He previously spent six years leading content for Information Age, a B2B technology publication headquartered in London.

Education: MA (TV Journalism), City University; BA (English Literature), University of Manchester

By Nicholas Ismail, Global Head of Brand Journalism, HCL Technologies Ltd.

 

Data is being generated at an unprecedented scale from a variety of sources, including the expanding presence of IoT devices. It’s estimated that 175 zettabytes will be produced by 2025. This data, mostly composed of unstructured information, is stored across a multitude of hard-to-access and disparate silos. A significant portion will also be stored in public cloud environments, making it ripe for analysis.

The first job is locating the data so it can be effectively identified, managed and stored. Analytics and associated technologies, such as AI, can then analyze this data, deriving value from it by producing unique and transformative insights, recommendations and actions.

This article will explore the crucial role analytics plays in deriving value from data, often termed the lifeblood of an enterprise.

We will cover how to get data ready for analysis, the technologies that can advance analytics, such as AI and machine learning, and how to communicate insights so that business stakeholders can understand and engage with the analysis to inform and transform decisions.

Getting data ready for analysis
  1. Identify and classify

    There are many different types of data, such as customer data, IoT data and HR data, which come from a variety of sources. The first step is to identify what type of data needs to be analyzed and then determine where this data is stored. Once identified, organizations should classify (index and tag) the data for future analysis and monetization.

  2. Ingest and manage

    The next step is to move the identified data, both structured and unstructured, into one location where the analysis tools can be put to work. Organizations can deploy a data management tool to assist with this process, or achieve the same goal in-house, provided an efficient IT team is in place. Some tools allow this to take place seamlessly both on-premises and in the cloud. A minimal code sketch of steps 2 to 4 follows this list.

  3. Cleanse and format

    Data sets may be incomplete or inaccurate, impairing the ability to analyze them. Cleansing and formatting the data lets data scientists and engineers ensure their projects are ready for analysis, and unnecessary data can be deleted at this stage. Software automation tools may be necessary to cleanse and format large data sets before they can be analyzed.

    It’s important to save the cleansing and formatting processes for future projects on similar data sets. This way, data scientists and engineers don’t have to start from square one each time.

  4. Analysis

    It’s now time to gain value from the data and create useful, key insights, recommendations, and actions that organizations can take to drive their digital transformation objectives. To do this, enterprises have a range of software tools and algorithms at their disposal.

    At this final stage, it’s important not to change the data, as this could skew the outcomes, and to establish an effective method of translating the analysis so the results resonate with business users. It’s not just an IT exercise.
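
To make steps 2 to 4 concrete, here is a minimal Python sketch using pandas. The landing-zone path, file format and column names (customer_id, order_value, order_date, region) are assumptions made for illustration; in practice these steps would be driven by the organization’s own data catalog and management tooling.

```python
# Minimal sketch of steps 2-4: ingest scattered CSV extracts into one place,
# cleanse and format them with a reusable function, then run a first analysis.
# The landing-zone path and column names below are illustrative assumptions.
from pathlib import Path

import pandas as pd

LANDING_ZONE = Path("data/extracts")  # hypothetical location of collected extracts


def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Reusable cleansing/formatting step, kept so future projects on similar
    data sets do not have to start from square one."""
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["order_value"] = pd.to_numeric(df["order_value"], errors="coerce")
    # Drop rows that are still incomplete after formatting.
    return df.dropna(subset=["customer_id", "order_date", "order_value"])


# Step 2: ingest - gather every extract into a single DataFrame.
orders = pd.concat(
    (pd.read_csv(path) for path in LANDING_ZONE.glob("*.csv")),
    ignore_index=True,
)

# Step 3: cleanse and format (the source files themselves are left untouched).
orders = cleanse(orders)

# Step 4: analysis - for example, monthly revenue by region as a first insight.
monthly_revenue = (
    orders.groupby([orders["order_date"].dt.to_period("M"), "region"])["order_value"]
    .sum()
)
print(monthly_revenue.head())
```

Keeping the cleansing logic in a named, reusable function is one simple way to avoid starting from square one on the next project with a similar data set.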

Deriving value from data analytics for transformation

There are some scenarios where black-box analytics projects fail to deliver effective insights and actions. This can happen for a variety of reasons: perhaps one of the above steps was missed or carried out incorrectly, errors crept in during a manually run analytics project, or the results lacked interpretability.

To drive transformational insights and actions, organizations can incorporate technologies and new ways of working to scale up the value of their data analysis.

Augmented analytics is one such approach. It uses AI and machine learning to automatically surface insights and actions from the data.

Gartner provides an example of this in action at a fast-food chain. During a store remodel, the restaurant moved the location of its fountain drinks. The augmented analytics system in place detected a 20% increase in sales and profitability as a result, shaping plans for future store design across the chain.
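
As a rough illustration of the mechanics, rather than of Gartner’s example or any vendor’s product, the sketch below scans a store’s daily sales either side of a remodel date and flags a statistically meaningful shift, the kind of finding an augmented analytics tool is designed to surface without anyone asking the question. The file name, column names, remodel date and thresholds are hypothetical.

```python
# Illustrative sketch: compare daily sales before and after a remodel date
# and flag a meaningful shift. All names and thresholds are assumptions.
import pandas as pd
from scipy import stats

daily = pd.read_csv("store_101_daily_sales.csv", parse_dates=["date"])
remodel_date = pd.Timestamp("2024-03-01")

before = daily.loc[daily["date"] < remodel_date, "sales"]
after = daily.loc[daily["date"] >= remodel_date, "sales"]

# Relative lift in average daily sales, plus a simple significance check.
lift = (after.mean() - before.mean()) / before.mean()
t_stat, p_value = stats.ttest_ind(after, before, equal_var=False)

if p_value < 0.05 and abs(lift) >= 0.05:
    print(f"Sales shifted {lift:.0%} after the remodel (p={p_value:.3f})")
```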

Analytics tools that operationalize AI, machine learning and automation are fundamentally changing the game when it comes to deriving value from data.

With the rise of Explainable AI, the tools and frameworks that help users understand the reasoning behind predictions and decisions, leaders are now able to see the factors that influenced the recommended insights and actions. This decision intelligence is crucial to scaling analytics throughout an organization in a way that improves business outcomes.

Another example is the use of predictive analytics in human resource management (HRM).

Writing in his featured blog, Naveen Joshi – founder and CEO at Allerin – explains that predictive analytics in HRM empowers human resource managers to recruit highly skilled candidates and even recognize disengaged employees, ensuring that the balance of the workforce is never disturbed.

Predictive analytics makes use of historical data and current factors to determine future outcomes.

In the context of HRM, predictive analytics looks at resumes, job skills, likes and dislikes, and items such as employee engagement and employee productivity to predict future outcomes of the candidate or employee. It helps organizations bridge the skills gap, boost employee productivity and retain talent, while determining whether a candidate is the right fit for an organization.
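
A rough sketch of what this can look like in code follows: a simple classifier trained on historical employee records to flag attrition risk, with the contributing factors listed afterwards, echoing the explainability point above. The data file, feature names and model choice are assumptions for illustration, not a description of any particular HR product.

```python
# Hedged sketch of predictive analytics in HRM: train a model on historical
# employee records to flag attrition risk, then inspect the driving factors.
# The file name, feature names and target column are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

hr = pd.read_csv("employee_history.csv")      # historical HR data (assumed)
features = ["engagement_score", "productivity_index", "tenure_years",
            "training_hours", "days_since_promotion"]
X, y = hr[features], hr["left_company"]       # 1 = employee later left

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}")

# Which factors drive the predictions, so HR can understand and act on them.
importances = pd.Series(model.feature_importances_, index=features)
print(importances.sort_values(ascending=False))
```

Surfacing the feature importances alongside the predictions is one simple way to keep such a model explainable to the HR managers who have to act on it.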

Augmented and predictive analytics are joined by cognitive, conversational, descriptive, diagnostic, prescriptive and visual analytics that apply across a variety of use cases, including IoT analysis and threat analysis.

Self-service analytics

The whole point of adopting technology solutions is that the insights and actions go beyond the IT department.

Organizations need to empower business users with the tools, skills and training to understand the outcomes being generated by the analysis of data, in whatever form this occurs. A culture shift is needed.

To accommodate this, enterprises are investing in analytics software with self-service capabilities, including data visualization features and low-code or no-code application builders. These non-technical features create transparency, democratize an organization’s data and empower business users to put the analysis to work in transformed decision-making.