
Why real-time AI remains a challenge

Raise the possibility of real-time Artificial Intelligence (AI) and the issue of processing power inevitably follows hot on its heels. 

But while extracting actionable insights in the shortest possible time frame is a business imperative for organisations that hope to remain competitive, computing power is only part of the challenge.

According to the 2016 Gartner Market Guide for Self-Service Data Preparation, analytics users spend the majority of their time either preparing data for analysis or waiting for data to be prepared for them.

This exposes a crucial but often overlooked aspect of the data processing function – data quality. It is imperative that the insights gained from analytics and AI are not just quick, but also accurate and reliable. 

Staying ahead

Why does speed matter? Quite simply because markets move fast and databases decay – at the rate of more than 20% a year, according to one report. To get relevant actionable insights, businesses need to keep up by replacing obsolete data with current data as quickly as possible.

With data informing every aspect of operations – it has already been put to work in some form by nine in 10 leaders of large Australian businesses, according to 2018 research from Infosys – ensuring data quality is paramount. Real-time analytics and AI help organisations adjust offers and pricing to current market events, target new and existing customers with tailored options, and respond to competitors’ offerings. 

The ability for employees to access data on demand is key to a successful AI strategy. This is why self-service analytics, cloud-based data access, data preparation and data integration are so important for today’s businesses. With more data in the cloud than ever before, companies need to take a new, cloud-first approach to data management. In an ideal world, this involves creating a carefully balanced data environment which ensures data privacy and protection, while offering the right people the access they need, when they need it. 

Eliminating poor data

Gartner estimates poor data costs organisations, on average, $15 million every year. The Harvard Business Review attributes much of this to what it dubs “hidden data factories”, where departments end up having to check and correct data that’s already ‘in the system’. 

In a world where data has been termed the ‘new oil’, it makes sense for businesses to ensure their information is top quality, but current processes for doing so are tedious and inefficient. Faced with a deluge of unstructured, unvetted data, organisations end up mining only a fraction of what’s available to them. And if data scientists do invest significant time refining and preparing data, the insights gleaned may no longer be timely.

Adopting an augmented intelligence approach

When it comes to exploiting structured data, de-duplication of databases is a critical but time-consuming task. AI can slash the time this takes by automatically cleansing duplicate records, but only if the data is perfect. Unfortunately, in the real world, it rarely is. In many scenarios, augmented intelligence – a combination of AI and human smarts – is likely to be called for.
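
To make that division of labour concrete, here is a minimal Python sketch of the triage idea – an illustration only, not Talend’s tooling. The similarity scoring uses only Python’s standard library, and the thresholds and sample records are assumptions chosen for the example: pairs the machine is confident about are merged automatically, while the ambiguous middle band is routed to a person.

```python
from difflib import SequenceMatcher

# Thresholds are illustrative assumptions, not recommended values.
AUTO_MERGE = 0.95   # confident enough for the machine to act alone
NEEDS_HUMAN = 0.80  # ambiguous band: route to a person for review

def normalise(record):
    """Lower-case and collapse punctuation/whitespace so near-duplicates compare cleanly."""
    return " ".join(record.lower().replace(",", " ").split())

def similarity(a, b):
    """Similarity ratio in [0, 1] from the standard library's sequence matcher."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

def triage_pairs(records):
    """Score every pair and sort it into machine vs. human work queues."""
    auto, review = [], []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = similarity(records[i], records[j])
            if score >= AUTO_MERGE:
                auto.append((records[i], records[j], round(score, 2)))
            elif score >= NEEDS_HUMAN:
                review.append((records[i], records[j], round(score, 2)))
    return auto, review

customers = [
    "Acme Pty Ltd, 02 9555 0100",
    "ACME Pty Ltd  02 9555 0100",
    "Acme Pty Limited, 02 9555 0100",
    "Acme Plumbing, 02 9555 0199",
]
auto, review = triage_pairs(customers)
print("auto-merge:", auto)      # exact duplicates after normalisation
print("human review:", review)  # plausible but uncertain matches
```

The ambiguous band is the crux of the design: rather than forcing the machine to guess on borderline matches, they become a human work queue – which is precisely the ‘augmented’ part of augmented intelligence.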

Here’s an example of how it can work. Natural Language Processing (NLP) makes it possible to teach machines to understand natural human language, whether verbal or written, within unstructured data sets. By extracting information, including names and phone numbers, from unstructured data sets such as email threads and notes pages, NLP can contribute to a data quality strategy.
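
As a rough sketch of that extraction step, the snippet below combines a simple phone-number pattern with named-entity recognition from the open-source spaCy library (assuming spaCy and its small English model are installed); the regular expression is a deliberately loose, illustrative assumption rather than a production-grade validator.

```python
import re

import spacy  # assumes: pip install spacy && python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")

# Loose Australian-style phone pattern - an illustrative assumption; a real
# pipeline would use a dedicated library such as phonenumbers instead.
PHONE = re.compile(r"(?:\+61|0)[\d\s-]{8,12}\d")

def extract_contacts(note):
    """Pull candidate names and phone numbers out of one free-text note."""
    doc = nlp(note)
    names = [ent.text for ent in doc.ents if ent.label_ == "PERSON"]
    phones = [match.strip() for match in PHONE.findall(note)]
    return {"names": names, "phones": phones}

note = ("Spoke to Jane Citizen about the renewal; she asked us to "
        "call her back on 02 9555 0172 next week.")
print(extract_contacts(note))
# e.g. {'names': ['Jane Citizen'], 'phones': ['02 9555 0172']}
```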

This practice can be employed to good effect in what could be dubbed the ‘lazy sales rep’ scenario. Salesforce systems commonly hold useful information such as phone numbers and job titles buried in free-text notes, where until now it has gone largely unexploited. Harnessing AI can change this. By labelling words within an unstructured data set, users can give analytics programs the information needed to automate tasks and extend the outcomes of certain labels across larger data sets. 
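
A minimal sketch of that labelling step, again in Python: the rule patterns, field names and record layout below are hypothetical stand-ins for whatever a real CRM export would contain, and a production system might learn such labels from examples rather than hand-writing them.

```python
import re

# Hypothetical labelling rules mapping a label to a pattern. These are
# illustrative assumptions, not a real product's rule set.
RULES = {
    "PHONE": re.compile(r"(?:\+61|0)[\d\s-]{8,12}\d"),
    "JOB_TITLE": re.compile(r"\b(CEO|CTO|CFO|Head of \w+|\w+ Manager)\b"),
}

def label_note(note):
    """Apply every rule to one free-text note and return the labelled spans."""
    return {label: rule.findall(note) for label, rule in RULES.items()}

def enrich(records, text_field="Notes"):
    """Extend the labels across a whole set of CRM-style records, promoting
    matches into a structured field alongside the raw text."""
    for record in records:
        hits = label_note(record.get(text_field, ""))
        record["extracted"] = {label: spans for label, spans in hits.items() if spans}
    return records

crm_rows = [
    {"Account": "Acme", "Notes": "Met their CTO; best number is 0411 555 017."},
    {"Account": "Widgets Co", "Notes": "Lunch went well, no follow-up yet."},
]
for row in enrich(crm_rows):
    print(row["Account"], "->", row["extracted"])
# Acme -> {'PHONE': ['0411 555 017'], 'JOB_TITLE': ['CTO']}
# Widgets Co -> {}
```

Once promoted into structured fields, the same records can feed the de-duplication triage shown earlier – the two techniques compound.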

The upshot of this is that previously unexploited data can be put to use within the business. It’s an example of humans helping machines to help humans – or augmented intelligence in action.

Article by Talend A/NZ Country Manager Steve Singer
