
How gaming graphics changed the world: Nvidia AI Conference Sydney

14 Sep 18

The Singularity, the hypothetical moment in time when artificial intelligence becomes capable of recursive self-improvement, could radically change human civilisation almost overnight. 

According to the boffins at Nvidia, we may owe the next big leap in human technological advancement to gamers’ desire for better PC graphics.

Last week, graphics card powerhouse Nvidia hosted their AI event at Sydney’s International Convention Centre, inviting developers to check out their advances in deep learning and their new Turing-class GPUs.

In a relatively short amount of time, machine learning has gone from theory to reality. Nvidia’s DGX systems are giving scientists the tools they need to train the artificial intelligence that is going to revolutionise the world.

And it’s all thanks to a couple of GeForce GTX580 graphics cards. 

In his keynote speech, Marc Hamilton, Nvidia’s Vice President of Solutions Architecture and Engineering, looked back at how the power of GPU CUDA cores was first harnessed for AI.

In 2012, University of Toronto students Alex Krizhevsky and Ilya Sutskever, along with their professor, Geoffrey Hinton, entered the ImageNet Large Scale Visual Recognition Challenge with a convolutional neural network programmed in CUDA, running on two off-the-shelf Nvidia graphics cards. The trio won the competition and set up a company that was promptly purchased by Google.

Professor Hinton is also the great-great-grandson of George Boole (the English mathematician who came up with Boolean algebra), proving that the apple doesn’t fall far from the tree.

But why GPUs? Nvidia’s Marc Hamilton explained that whilst CPUs are good at taking a packet of information and processing it, they do so sequentially, one task at a time. GPU technology, on the other hand, was developed with the need to render thirty images a second and get them up on the screen, which demands doing enormous amounts of work in parallel. It’s this parallel architecture that, very basically speaking, makes GPUs so efficient for training convolutional neural networks.
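The difference Hamilton describes can be sketched in a toy example (this is an illustration of the idea, not Nvidia’s code): the same image convolution at the heart of a convolutional neural network can be written as a pixel-by-pixel loop, the way a CPU naturally works, or as one big data-parallel operation over every pixel at once, the way a GPU works. Both produce identical results; the parallel form is what maps onto thousands of CUDA cores.

```python
import numpy as np

# A batch of 32 small greyscale "images" and a 3x3 edge-detection kernel.
rng = np.random.default_rng(0)
images = rng.random((32, 64, 64))
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)

def convolve_sequential(images, kernel):
    """CPU-style: visit every output pixel one at a time."""
    n, h, w = images.shape
    out = np.zeros((n, h - 2, w - 2))
    for i in range(n):
        for y in range(h - 2):
            for x in range(w - 2):
                out[i, y, x] = np.sum(images[i, y:y + 3, x:x + 3] * kernel)
    return out

def convolve_parallel(images, kernel):
    """GPU-style: express the same work as one data-parallel operation."""
    windows = np.lib.stride_tricks.sliding_window_view(
        images, (3, 3), axis=(1, 2))
    return np.einsum('nhwij,ij->nhw', windows, kernel)

# Same answer either way -- only the execution strategy differs.
assert np.allclose(convolve_sequential(images, kernel),
                   convolve_parallel(images, kernel))
```

On a GPU, each of the roughly 123,000 output pixels in this batch could be computed by its own thread simultaneously, which is why the parallel formulation wins so decisively at scale.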

And it is the application of this cutting-edge technology that Nvidia was showing off to the media at the conference. 

The most visually impressive application has to be real-time ray tracing. Nvidia’s showcase demo, running on RTX-infused Quadro GPUs, had a photoreal model of a Porsche with ray-traced lighting that could be adjusted on the fly. The detail was amazing, right down to the refraction in the headlight lenses.

Apart from looking very cool, real-time ray tracing allows designers to examine their products in real-world lighting situations rather than rely on software tricks. Amazingly, the upcoming Shadow of the Tomb Raider and Battlefield V games will both feature this technology courtesy of GeForce RTX GPUs.
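At its core, a ray tracer fires a ray from the camera through each pixel and works out what geometry it hits, then bounces follow-up rays toward lights and reflective surfaces. The sketch below shows the most basic building block, a ray-sphere intersection test; it is a toy illustration of the maths involved, nothing to do with Nvidia’s actual RTX implementation, which performs billions of such tests per second in dedicated hardware.

```python
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - centre|^2 = radius^2 for t >= 0
    via the quadratic formula.
    """
    ox, oy, oz = (origin[i] - centre[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearer of the two roots
    return t if t >= 0 else None

# A ray from the origin straight down the z-axis toward a sphere at z=5:
# it strikes the front surface 4 units away (sphere radius is 1).
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Repeat that test for every pixel, every object, and every light bounce, and the appeal of massively parallel GPU hardware becomes obvious.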

Another mind-blowing visual application Nvidia was showing off came in the form of a couple of Adobe Photoshop plug-ins using the machine learning of Turing-based RTX and Quadro GPUs. Super Rez makes the formerly impossible, much-ridiculed CSI-style image enhancement almost a reality: a low-resolution image can be cleaned up, with the plug-in actually adding in information. The result is an image with better clarity and more detail than the original photograph. The other plug-in, InPaint, allows users to literally paint objects and people out of photographs, which is both amazing and slightly unnerving.

Nvidia’s AI technology is also revolutionising medicine by enabling what would ordinarily be a series of 2D scans to be realised as 3D models. Medical professionals will be able to use this technology to better diagnose ailments and more effectively treat patients.

Another demonstration, less glamorous but still impressive, showed how Nvidia’s AI technology monitors their office parking garage. Using hundreds of cameras, the GPUs sift through the camera images, monitoring users’ parking habits and watching out for suspicious or erratic driver behaviour.

Poetically, the technology has come full circle in that the upcoming GeForce RTX 2080 gaming GPUs are built upon the hardware advances that have come out of Nvidia’s machine-learning R&D. Very soon gamers will be able to experience the holy grail of 3D graphics, real-time ray tracing, in their games courtesy of these Turing-based GPUs.

With applications ranging from entertainment to autonomous vehicles, machine learning is already having a huge impact on our day-to-day lives. It is said that we will not be aware that the Singularity is occurring; we will only know once it has happened. That day may be sooner than we think.
