Blog: 3 Takeaways from GTC 2019
Coming out of the NVIDIA GPU Technology Conference (GTC), it is hard not to be consumed by the hype and showmanship of the event. The wow factor of self-driving cars, robotics, and hyper-realistic graphics can steal the show during the week, so I wanted to take a different approach and emphasize some of the innovation that has the potential to significantly impact the enterprise IT landscape.
Operationalizing AI
AI and machine learning (ML) have become common industry buzzwords for IT professionals. The grand promises of AI solving 'unsolvable' problems and predicting the future were initially appealing to businesses. However, professionals are beginning to see past the hype and are asking important questions like 'What problems can we actually solve using AI?' and 'What will it take to implement AI in our business?' While the wow factor associated with AI was certainly on display at GTC, the exhibit hall showed that IT professionals are just as concerned with how to operationalize AI. A number of vendors, such as Skymind, Iguazio, Determined AI, and DotScience, are simplifying the process of putting AI and ML into production by providing automated model definition, deployment, and monitoring. These exhibits were flooded with attendees seeking to learn how they can put AI to work in their business.
Accelerating AI Workloads
As the name of GTC would suggest, the GPU was front and center throughout the conference. One of the more compelling enterprise use cases for the GPU was the acceleration of AI workloads. The massively parallel architecture of GPUs allows data pipelines and queries to be executed at a dramatically higher rate. A demo during the keynote presentation displayed the power of pairing a GPU-enabled data preparation tool (Datalogue) with a GPU-enabled data analytics platform (OmniSci) to process and visualize terabytes of data in seconds. This demo created a buzz in the crowd during the presentation, and both companies' booths were heavily attended afterwards in the exhibit hall. Other vendors using GPUs to accelerate workloads included Fastdata, BlazingDB, and SQream.
Training Data / Synthetic Data
One of the major hurdles to developing new AI (machine learning, deep learning, computer vision, etc.) models is having enough high-quality data to train them. These models have a tremendous appetite for data, and model accuracy depends heavily on how well they are trained. For more complex models, such as those for video and image recognition, the quality of the training data is of the utmost importance. Video and images must be accurately labeled to ensure model accuracy, which can be a tedious and time-consuming process. Vendors such as Samasource, Understand.ai, and Handl.ai aim to take the pain out of gathering and grooming this data by offering services and platforms for tagging training datasets. Other vendors, like Mostly.ai, take a slightly different approach by generating synthetic training data based on other datasets.
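To make the synthetic-data idea concrete: none of these vendors publish their generators, so the following is only a toy sketch of the general concept (all names and the per-column Gaussian approach are my own assumptions, not any vendor's method). It summarizes each numeric column of a small "real" dataset by its mean and standard deviation, then samples new rows that mimic those statistics:

```python
import random
import statistics

def fit_column(values):
    """Summarize a numeric column by its mean and standard deviation."""
    return statistics.mean(values), statistics.stdev(values)

def synthesize(real_rows, n_synthetic, seed=0):
    """Generate synthetic rows whose columns mimic the real data's
    per-column mean/std. This is a toy stand-in: production tools
    model joint distributions and correlations, not just marginals."""
    rng = random.Random(seed)
    columns = list(zip(*real_rows))              # column-major view of the data
    stats = [fit_column(col) for col in columns]
    return [
        tuple(rng.gauss(mu, sigma) for mu, sigma in stats)
        for _ in range(n_synthetic)
    ]

# Hypothetical "real" data: (age, income) pairs
real = [(34, 52000), (29, 48000), (41, 61000), (37, 55000), (45, 67000)]
synthetic = synthesize(real, n_synthetic=100)
```

The appeal is that the synthetic rows carry the statistical shape of the original data without reproducing any individual record, which is why this approach is attractive when privacy or data scarcity makes using the raw dataset impractical.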