
Why real-time AI remains a challenge

Raise the possibility of real-time Artificial Intelligence (AI) and the issue of processing power inevitably follows hot on its heels. 

But while extracting actionable insights in the shortest possible time frame is a business imperative for organisations that hope to remain competitive, computing power is only part of the challenge.

According to the 2016 Gartner Market Guide for Self-Service Data Preparation, analytics users spend the majority of their time either preparing data for analysis or waiting for data to be prepared for them.

This exposes a crucial but often overlooked aspect of the data processing function – data quality. It is imperative that the insights gained from analytics and AI are not just quick, but also accurate and reliable. 

Staying ahead

Why does speed matter? Quite simply because markets move fast and databases decay – at the rate of more than 20% a year, according to one report. To get relevant actionable insights, businesses need to keep up by replacing obsolete data with current data as quickly as possible.

With data informing every aspect of operations – it’s already been deployed in some form by nine in 10 leaders of large Australian businesses, according to 2018 research from Infosys – ensuring data quality is paramount. Real-time analytics and AI help organisations adjust offers and pricing to current market events, target new and existing customers with tailored options and respond to competitors’ offerings. 

The ability for employees to access data on demand is key to a successful AI strategy. This is why self-service analytics, cloud-based data access, data preparation and data integration are so important for today’s businesses. With more data in the cloud than ever before, companies need to take a new, cloud-first approach to data management. In an ideal world, this involves creating a carefully balanced data environment which ensures data privacy and protection, while offering the right people the access they need, when they need it. 

Eliminating poor data

Gartner estimates poor data costs organisations, on average, $15 million every year. The Harvard Business Review attributes this to what it dubs “hidden data factories”, where departments end up having to check and correct data that’s already ‘in the system’. 

In a world where data has been termed the ‘new oil’, it makes sense for businesses to ensure their information is top quality, but current processes for doing so are tedious and inefficient. Data deluge, in the form of vast amounts of unstructured, unvetted data, means organisations mine only a fraction of what’s available to them. And if data scientists do invest significant time refining and preparing data, the insights gleaned may no longer be timely.

Adopting an augmented intelligence approach

When it comes to exploiting structured data, de-duplication of databases is a critical but time-consuming task. AI can slash the time this takes by identifying and cleansing duplicate records, provided the data is perfect. Unfortunately, in the real world, it rarely is. In many scenarios, augmented intelligence – a combination of AI and human smarts – is likely to be called for.
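As a rough illustration of that division of labour, the sketch below flags likely duplicate records and leaves the final call to a person. It uses only the Python standard library; the customer records, field names and similarity threshold are invented for illustration, and a production pipeline would use a dedicated matching tool.

```python
from difflib import SequenceMatcher

def normalise(record: dict) -> str:
    """Collapse case and whitespace so trivial differences don't mask matches."""
    return " ".join(" ".join(str(v).lower().split()) for v in record.values())

def likely_duplicates(records: list, threshold: float = 0.9):
    """Yield pairs of records whose normalised text is near-identical.

    The machine narrows thousands of rows to a short candidate list;
    a human reviewer makes the final merge decision - augmented
    intelligence rather than full automation.
    """
    keys = [normalise(r) for r in records]
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = SequenceMatcher(None, keys[i], keys[j]).ratio()
            if score >= threshold:
                yield records[i], records[j], round(score, 2)

# Invented sample data: two records differ only in case and spacing.
customers = [
    {"name": "Jane Citizen", "phone": "02 9555 0101"},
    {"name": "JANE  CITIZEN", "phone": "02 9555 0101"},
    {"name": "John Smith", "phone": "03 9555 0202"},
]

dupes = list(likely_duplicates(customers))  # one candidate pair for review
```

The point of the threshold is exactly where human judgement re-enters: borderline scores are queued for review rather than merged automatically.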

Here’s an example of how it can work. Natural Language Processing (NLP) makes it possible to teach machines to understand natural human language, whether verbal or written, within unstructured data sets. By extracting information, including names and phone numbers, from unstructured data sets such as email threads and notes pages, NLP can contribute to a data quality strategy.
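A minimal, hypothetical sketch of that extraction step: production NLP would use a trained named-entity model, but here a standard-library regular expression stands in for the phone-number extractor, and the email text is invented for illustration.

```python
import re

# Matches Australian-style 10-digit numbers such as "02 9555 0101"
# or "0400 555 123" (digits optionally separated by spaces or dashes).
PHONE_PATTERN = re.compile(r"\b0\d(?:[ -]?\d){8}\b")

# Invented email thread standing in for an unstructured data set.
email_thread = """
Hi team, spoke with Jane Citizen today - best number is 02 9555 0101.
Her colleague prefers mobile: 0400 555 123. Follow up next week.
"""

phones = PHONE_PATTERN.findall(email_thread)
```

Extracting names reliably is the harder half of the job and is where a trained NLP model earns its keep; a pattern-matching approach like the one above only works for fields with predictable structure.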

This practice can be employed to good effect in what could be dubbed the ‘lazy sales rep’ scenario. The Salesforce system is commonly used to store useful information such as phone numbers and job titles, but until now this data has been largely unexploited. Harnessing AI can change this. By labelling words within an unstructured data set, users can give analytics programs the information needed to automate tasks and extend the outcomes of certain labels across larger data sets. 

The upshot of this is that previously unexploited data can be put to use within the business. It’s an example of humans helping machines to help humans – or augmented intelligence in action.

Article by Talend A/NZ Country Manager Steve Singer
