Why real-time AI remains a challenge

Raise the possibility of real-time Artificial Intelligence (AI) and the issue of processing power inevitably follows hot on its heels. 

But while extracting actionable insights in the shortest possible time frame is a business imperative for organisations that hope to remain competitive, computing power is only part of the challenge.

According to the 2016 Gartner Market Guide for Self-Service Data Preparation, analytics users spend the majority of their time either preparing data for analysis or waiting for data to be prepared for them.

This exposes a crucial but often overlooked aspect of the data processing function – data quality. It is imperative that the insights gained from analytics and AI are not just quick, but also accurate and reliable. 

Staying ahead

Why does speed matter? Quite simply because markets move fast and databases decay – at a rate of more than 20% a year, according to one report. To get relevant, actionable insights, businesses need to keep up by replacing obsolete data with current data as quickly as possible.
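To see how quickly that compounds, here is a minimal sketch in Python; the 100,000-record starting point is a hypothetical figure for illustration, while the rate is the one cited above. At 20% annual decay, barely half of a database remains current after three years.

```python
# Compounding decay of a contact database at 20% per year.
# The starting size is hypothetical; the decay rate is the one cited above.
records = 100_000
decay_rate = 0.20

for year in range(1, 4):
    records = int(records * (1 - decay_rate))
    print(f"After year {year}: ~{records:,} records still current")

# After year 1: ~80,000 records still current
# After year 2: ~64,000 records still current
# After year 3: ~51,200 records still current
```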

With data informing every aspect of operations – nine in 10 leaders of large Australian businesses report it has already been deployed in some form, according to 2018 research from Infosys – ensuring data quality is paramount. Real-time analytics and AI help organisations adjust offers and pricing to current market events, target new and existing customers with tailored options, and respond to competitors’ offerings.

The ability for employees to access data on demand is key to a successful AI strategy. This is why self-service analytics, cloud-based data access, data preparation and data integration are so important for today’s businesses. With more data in the cloud than ever before, companies need to take a new, cloud-first approach to data management. In an ideal world, this involves creating a carefully balanced data environment which ensures data privacy and protection, while offering the right people the access they need, when they need it. 

Eliminating poor data

Gartner estimates poor data costs organisations, on average, $15 million every year. The Harvard Business Review attributes this to what it dubs “hidden data factories”, where departments end up having to check and correct data that’s already ‘in the system’.

In a world where data has been termed the ‘new oil’, it makes sense for businesses to ensure their information is top quality, but current processes for doing so are tedious and inefficient. Faced with a deluge of unstructured, unvetted data, organisations end up mining only a fraction of what’s available to them. And if data scientists do invest significant time refining and preparing data, the insights gleaned may no longer be timely.

Adopting an augmented intelligence approach

When it comes to exploiting structured data, de-duplication of databases is a critical but time-consuming task. AI can slash the time this takes by automatically cleansing duplicate records, provided the data is perfect. Unfortunately, in the real world, it rarely is. In many scenarios, augmented intelligence – a combination of AI and human smarts – is likely to be called for.
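As a concrete illustration of that division of labour, here is a minimal sketch in Python using only the standard library: high-confidence matches are merged automatically, while borderline pairs are escalated to a person. The sample records, similarity weights and thresholds are all assumptions for illustration, not a production recipe.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical customer records containing a near-duplicate pair.
records = [
    {"id": 1, "name": "Acme Pty Ltd",  "phone": "02 9555 0100"},
    {"id": 2, "name": "ACME Pty. Ltd", "phone": "0295550100"},
    {"id": 3, "name": "Acme Plumbing", "phone": "02 9555 0199"},
]

def normalise(rec):
    # Strip punctuation and spacing differences before comparing.
    return (rec["name"].lower().replace(".", ""),
            "".join(ch for ch in rec["phone"] if ch.isdigit()))

def similarity(a, b):
    (name_a, phone_a), (name_b, phone_b) = normalise(a), normalise(b)
    name_score = SequenceMatcher(None, name_a, name_b).ratio()
    phone_score = 1.0 if phone_a == phone_b else 0.0
    return 0.6 * name_score + 0.4 * phone_score  # weights are assumptions

AUTO_MERGE, NEEDS_REVIEW = 0.9, 0.7  # thresholds are assumptions

for a, b in combinations(records, 2):
    score = similarity(a, b)
    if score >= AUTO_MERGE:
        print(f"Auto-merge records {a['id']} and {b['id']} (score {score:.2f})")
    elif score >= NEEDS_REVIEW:
        print(f"Escalate records {a['id']} and {b['id']} for review (score {score:.2f})")
```

Pairs that fall between the two thresholds go to a person, which is the human half of augmented intelligence at work: the model handles the clear-cut cases and people resolve the ambiguous ones.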

Here’s an example of how it can work. Natural Language Processing (NLP) makes it possible to teach machines to understand natural human language, whether verbal or written, within unstructured data sets. By extracting information, including names and phone numbers, from unstructured data sets such as email threads and notes pages, NLP can contribute to a data quality strategy.
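As a minimal sketch of that idea, the snippet below uses spaCy’s pretrained English pipeline for named-entity recognition plus a simple regular expression for phone numbers. The sample note, the phone pattern, and the availability of the en_core_web_sm model are all assumptions for illustration.

```python
import re
import spacy

# Assumes spaCy and its small English model are installed:
#   pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# A hypothetical free-text CRM note.
note = (
    "Spoke with Jane Citizen today - she's the new procurement lead. "
    "Best number for her is 0412 345 678; follow up next Tuesday."
)

doc = nlp(note)

# Named-entity recognition picks out the people mentioned in free text.
names = [ent.text for ent in doc.ents if ent.label_ == "PERSON"]

# An illustrative pattern for Australian-style mobile numbers.
phones = re.findall(r"\b04\d{2} ?\d{3} ?\d{3}\b", note)

print("Names:", names)    # expected: ['Jane Citizen']
print("Phones:", phones)  # expected: ['0412 345 678']
```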

This practice can be employed to good effect in what could be dubbed the ‘lazy sales rep’ scenario. Salesforce is commonly used to store useful information such as phone numbers and job titles, but until now this data has been largely unexploited. Harnessing AI can change this. By labelling words within an unstructured data set, users can give analytics programs the information needed to automate tasks and extend the outcomes of those labels across larger data sets.
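Here is a minimal sketch of how a handful of human labels can be extended across a larger data set, using scikit-learn; the notes, labels and task (spotting job-title updates in CRM notes) are hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical hand-labelled CRM notes:
# 1 = mentions a job-title change, 0 = does not.
notes = [
    "Sarah is now CFO at Initech",
    "Promoted to head of engineering last month",
    "Left a voicemail, no answer",
    "Sent the proposal, awaiting feedback",
    "New title: regional sales director",
    "Meeting moved to Thursday",
]
labels = [1, 1, 0, 0, 1, 0]

# Train on the small labelled sample...
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

# ...then extend those labels across unlabelled notes at scale.
unlabelled = ["She was appointed VP of operations", "Invoice paid in full"]
print(model.predict(unlabelled))  # toy-scale accuracy will vary
```

At realistic scale the same loop applies: humans label a representative sample, the model propagates those labels across the rest, and the business gets usable structure out of previously untouched text.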

The upshot of this is that previously unexploited data can be put to use within the business. It’s an example of humans helping machines to help humans – or augmented intelligence in action.

Article by Talend A/NZ Country Manager Steve Singer
