ChannelLife Australia - Industry insider news for technology resellers

Gradient and Ethical AI to take on biased AI algorithms

The non-profit Gradient Institute and consultancy Ethical AI Advisory have signed an alliance to collaborate on the development and deployment of Artificial Intelligence (AI) that is both ethical and trustworthy.

AI is the fastest-growing technology sector in the world, with more than US$35 billion invested in 2019 – an 18% increase over 2018, according to analyst group IDC.

About 40% of organisations are predicted to use AI by the end of this year.

In a joint statement, the companies liken the impact of AI to the introduction of electricity on business and society – AI will increase efficiency, improve analytics and decision-making, and automate many tasks for which humans were once required. Essentially, AI will make things better, faster and cheaper, for better or worse.

Despite the rapid pace of growth of the AI sector, there are few laws, regulations, practical guidelines or software tools to assist companies developing or deploying AI.

Recent high-profile examples of AI gone wrong include credit limit discrimination against women and racially biased medical algorithms.

The common misconception that algorithms are neutral has been disproved repeatedly in recent history, as unconscious biases and inequitable practices have permeated them.

Both Gradient and Ethical AI Advisory contributed to the Federal Government's development of Australia's AI Ethics Framework released by the Minister for Industry, Science and Technology, Karen Andrews, in November 2019, and the Human Rights Commission's AI Discussion Paper launched in December 2019.

Gradient provides deep expertise in AI systems consulting focused on evaluation, design, implementation and measurement of AI systems to check them against ethical goals.

In addition, it is focused on research and collaboration to advance the science of ethical AI, and provides training and education to professionals responsible for the technical, managerial, policy and decision-making aspects of AI-based decision systems.

Ethical AI Advisory provides training and education in AI and ethics for boards, executives and leaders in organisations interested in developing a strong governance and leadership capability in ethical AI, plus the design and development of ethical AI frameworks and toolkits as well as keynote speaking and coaching in the field.

“We are very pleased to be partnering with Gradient. Bill Simpson-Young, Tiberio Caetano and the team at Gradient Institute are the best in their fields, highly knowledgeable and capable in research, training and operationalising ethical AI. We are experts in executive education, organisational design and implementation of ethical AI principles – so the two organisations coming together provides the ideal end-to-end service,” says Ethical AI Advisory founder and CEO Dr Catriona Wallace.

These combined services are ideal for organisations that recognise AI will be an important part of their infrastructure, but also want to ensure it is built to maintain the trust of their clients and customers while reducing risk and unintended harm.

Both organisations believe that Australia has a great opportunity to position itself as a global leader in ethical AI, giving Australian companies a competitive advantage while also doing the right thing by customers and society at large.

“We are thrilled to be partnering with Dr Catriona Wallace and the Ethical AI Advisory team. Catriona has been a vocal advocate of the ethical development and use of AI for a long time, and we have always considered her a key member of Australia's ethical AI ecosystem. We think that Gradient's expertise in the technical aspects of ethical AI and Ethical AI Advisory's expertise in governance and organisational design for AI are highly complementary. We are looking forward to together helping establish Australia's global leadership in the ethical development and use of AI,” says Gradient CEO Bill Simpson-Young.
