CIO

The top 10 data and analytics technology trends for 2019 - and how to make them work for you

Incorporate these top trends into your strategy and roadmap, but prioritise based on business value, advises Gartner

The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change

Donald Feinberg, Gartner

Augmented analytics, continuous intelligence and explainable artificial intelligence (AI) are among the top trends in data and analytics technology that have significant disruptive potential over the next three to five years, reports Gartner.

Rita Sallam, research vice president at Gartner, says data and analytics leaders must assess the potential business impact of these trends and adjust business models and operations accordingly, or risk losing competitive advantage to those who do.

“The story of data and analytics keeps evolving, from supporting internal decision making to continuous intelligence, information products and appointing chief data officers,” she says. 

“It’s critical to gain a deeper understanding of the technology trends fuelling that evolving story and prioritise them based on business value.”

Donald Feinberg, vice president and distinguished analyst at Gartner, notes the very challenge created by digital disruption — too much data — has also created an unprecedented opportunity. 

The vast amount of data, together with increasingly powerful processing capabilities enabled by the cloud, means it is now possible to train and execute algorithms at the large scale necessary to finally realise the full potential of AI.

“The size, complexity, distributed nature of data, speed of action and the continuous intelligence required by digital business means that rigid and centralised architectures and tools break down,” says Feinberg. 

“The continued survival of any business will depend upon an agile, data-centric architecture that responds to the constant rate of change.”

Gartner calls on data and analytics leaders to talk with senior business leaders about their critical business priorities and explore how the following top trends can enable them:

Augmented analytics

Augmented analytics is the next wave of disruption in the data and analytics market. It uses machine learning (ML) and AI techniques to transform how analytics content is developed, consumed and shared.

By 2020, augmented analytics will be a dominant driver of new purchases of analytics and BI, as well as data science and ML platforms, and of embedded analytics. Data and analytics leaders should plan to adopt augmented analytics as platform capabilities mature.

Augmented data management 

Augmented data management applies ML capabilities and AI engines to make enterprise information management categories (data quality, metadata management, master data management, data integration and database management systems, or DBMSs) self-configuring and self-tuning. It automates many manual tasks, allowing less technically skilled users to work more autonomously with data and freeing highly skilled technical staff to focus on higher-value tasks.

Augmented data management converts metadata from being used for audit, lineage and reporting only, to powering dynamic systems. Metadata is changing from passive to active and is becoming the primary driver for all AI/ML.

Through to the end of 2022, Gartner predicts that manual data management tasks will be reduced by 45 per cent through the addition of ML and automated service-level management.

Continuous intelligence

By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.

Continuous intelligence is a design pattern in which real-time analytics are integrated within a business operation, processing current and historical data to prescribe actions in response to events. It provides decision automation or decision support. Continuous intelligence leverages multiple technologies such as augmented analytics, event stream processing, optimisation, business rule management and ML.
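As a rough illustration of the pattern, consider a minimal Python sketch: each incoming event is scored against a rolling window of recent history the moment it arrives, and an action is prescribed. The class name, window size and threshold here are illustrative, not from Gartner.

```python
from collections import deque
from statistics import mean

class ContinuousIntelligence:
    """Toy sketch: score each incoming event against recent history."""

    def __init__(self, window_size=5, threshold=1.5):
        self.history = deque(maxlen=window_size)  # rolling historical context
        self.threshold = threshold

    def process(self, reading):
        """Prescribe an action for the current reading as it arrives."""
        if len(self.history) == self.history.maxlen:
            baseline = mean(self.history)
            # Decision support: flag readings well above the recent baseline.
            action = "alert" if reading > baseline * self.threshold else "ok"
        else:
            action = "warming_up"  # not enough history to judge yet
        self.history.append(reading)
        return action

ci = ContinuousIntelligence()
for value in [10, 11, 9, 10, 10, 30, 10]:
    print(value, ci.process(value))  # the sixth reading, 30, triggers "alert"
```

A production continuous-intelligence pipeline would sit on an event stream processor rather than a loop, but the shape is the same: every event is evaluated in context the moment it happens, not in a later batch.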

“Continuous intelligence represents a major change in the job of the data and analytics team,” says Sallam. “It’s a grand challenge — and a grand opportunity — for analytics and BI (business intelligence) teams to help businesses make smarter real-time decisions in 2019. It could be seen as the ultimate in operational BI.”

Explainable AI

AI models are increasingly deployed to augment and replace human decision making. However, in some scenarios, businesses must justify how these models arrive at their decisions. To build trust with users and stakeholders, application leaders must make these models more interpretable and explainable.

Unfortunately, most of these advanced AI models are complex black boxes that cannot explain why they reached a specific recommendation or decision. Explainable AI in data science and ML platforms, for example, auto-generates a natural-language explanation of models in terms of accuracy, attributes, model statistics and features.
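One common family of explainability techniques attributes a model’s output to its input features. A minimal sketch of the idea, assuming a toy weighted-sum scoring model — the features, weights and baseline are hypothetical, not any vendor’s API:

```python
def score(applicant):
    # Toy credit-style model: a weighted sum of named features.
    weights = {"income": 0.5, "debt": -0.8, "tenure": 0.2}
    return sum(weights[f] * applicant[f] for f in weights)

def explain(applicant, baseline):
    """Attribute the score difference from a baseline to each feature."""
    contributions = {}
    for feature in applicant:
        probe = dict(applicant)
        probe[feature] = baseline[feature]  # neutralise one feature at a time
        contributions[feature] = score(applicant) - score(probe)
    # Render the attributions as a plain-language explanation.
    for feature, delta in sorted(contributions.items(),
                                 key=lambda kv: -abs(kv[1])):
        direction = "raised" if delta > 0 else "lowered"
        print(f"{feature} {direction} the score by {abs(delta):.1f}")

applicant = {"income": 60, "debt": 30, "tenure": 4}
baseline = {"income": 50, "debt": 20, "tenure": 5}
explain(applicant, baseline)
```

Real explainable-AI tooling works on far more complex models, but the output has the same shape: which inputs moved this decision, in which direction, and by how much.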

Graph

Graph analytics is a set of analytic techniques that allows for the exploration of relationships between entities of interest such as organisations, people and transactions.

The application of graph processing and graph DBMSs will grow at 100 per cent annually through to the end of 2022 to continuously accelerate data preparation and enable more complex and adaptive data science.

Graph data stores can efficiently model, explore and query data with complex interrelationships across data silos, but the need for specialised skills has limited their adoption to date, according to Gartner.

Graph analytics will grow in the next few years due to the need to ask complex questions across complex data, which is not always practical or even possible at scale using SQL queries.  
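A minimal sketch of why the graph model fits these questions, assuming a toy graph of illustrative entities: a breadth-first search finds the chain of relationships linking two entities, a question that would take an open-ended chain of SQL joins.

```python
from collections import deque

# Hypothetical entities and relationships (people, organisations,
# transactions); the names are illustrative.
edges = [
    ("alice", "acme_ltd"), ("bob", "acme_ltd"),
    ("bob", "txn_1042"), ("carol", "txn_1042"),
    ("carol", "globex"),
]

graph = {}
for a, b in edges:  # build an undirected adjacency list
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def shortest_path(start, goal):
    """Shortest chain of relationships linking two entities."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no connection found

print(shortest_path("alice", "globex"))
# -> ['alice', 'acme_ltd', 'bob', 'txn_1042', 'carol', 'globex']
```

In SQL the same question requires one self-join per hop, and the number of hops is unknown in advance; a graph store treats the traversal itself as the query primitive.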

Data fabric

Data fabric enables frictionless access and sharing of data in a distributed data environment. It enables a single and consistent data management framework, which allows seamless data access and processing by design across otherwise siloed storage.

Through to the end of 2022, bespoke data fabric designs will be deployed primarily as static infrastructure, forcing organisations into a new wave of cost to redesign completely for more dynamic data mesh approaches.

NLP/Conversational analytics

By 2020, 50 per cent of analytical queries will be generated via search, natural language processing (NLP) or voice, or will be automatically generated. The need to analyse complex combinations of data and to make analytics accessible to everyone in the organisation will drive broader adoption, allowing analytics tools to be as easy as a search interface or a conversation with a virtual assistant.
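A minimal sketch of the idea, assuming a deliberately tiny grammar — real conversational analytics layers are far richer, and the data here is illustrative:

```python
import re
from collections import defaultdict

rows = [
    {"region": "north", "sales": 120},
    {"region": "south", "sales": 80},
    {"region": "north", "sales": 50},
]

def answer(question, data):
    """Translate 'total <measure> by <dimension>' into an aggregation."""
    match = re.match(r"total (\w+) by (\w+)", question.lower())
    if not match:
        return "Sorry, I only understand 'total <measure> by <dimension>'."
    measure, dimension = match.groups()
    totals = defaultdict(int)
    for row in data:
        totals[row[dimension]] += row[measure]
    return dict(totals)

print(answer("Total sales by region", rows))  # {'north': 170, 'south': 80}
```

The point is the interface, not the parser: the user asks a question in words and the tool decides which query to run, which is what makes analytics usable beyond trained analysts.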

Commercial AI and ML 

Gartner predicts that by 2022, 75 per cent of new end-user solutions leveraging AI and ML techniques will be built with commercial solutions rather than open source platforms.

Commercial vendors have now built connectors into the open source ecosystem, and they provide the enterprise features necessary to scale and democratise AI and ML that open source technologies lack, such as project and model management, reuse, transparency, data lineage, and platform cohesiveness and integration.

Blockchain

The main value of blockchain and distributed ledger technologies is providing decentralised trust across a network of untrusted participants. The potential ramifications for analytics uses are significant, especially those leveraging participant relationships and interactions.

However, it will be several years before four or five major blockchain technologies become dominant, says Gartner. 

Until that happens, technology end users will be forced to integrate with the blockchain technologies and standards dictated by their dominant customers or networks, including integration with their existing data and analytics infrastructure. The costs of integration may outweigh any potential benefit. Blockchains are a data source, not a database, and will not replace existing data management technologies.

Persistent memory servers

New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. 

Gartner says the technology has the potential to improve application performance, availability, boot times, clustering methods and security practices, while keeping costs under control. It will also help organisations reduce the complexity of their application and data architectures by decreasing the need for data duplication.

“The amount of data is growing quickly and the urgency of transforming data into value in real-time is growing at an equally rapid pace,” says Feinberg. “New server workloads are demanding not just faster CPU performance, but massive memory and faster storage.”

Focus on the non-tech trends too

The Gartner analysts say that, hand in hand with these trends, there are areas beyond technology that data and analytics leaders need to consider.

“Place an equal emphasis on investing in nontechnology trends,” they advise.

Sallam and Feinberg list steps leaders can take around the technology and non-technology trends above:

  • Engage, educate and ideate with senior business leaders about their strategically relevant priorities.

  • Explore how top trends can enable those business priorities.

  • Identify capabilities and gaps to leveraging top trends.

  • Incorporate top trends into your strategy and roadmap, but prioritise based on business value.

  • Take action over the next three to five years by monitoring, experimenting with, or exploiting each trend.
