Qualcomm today at its Snapdragon Summit 2021 announced a collaboration with Google Cloud to bring the latter’s Neural Architecture Search to Qualcomm platforms. The move is designed to speed up the development of AI models at the edge.
Qualcomm claims this makes it the first system-on-chip (SoC) vendor to offer Google Cloud's Vertex AI Neural Architecture Search service. The service will first be available on the Snapdragon 8 Gen 1 Mobile Platform, followed by the rest of the Snapdragon portfolio across mobile, IoT, automotive, and XR platforms.
As AI/ML hardware has become more widespread, attention has turned to the software stack, which often consists of point solutions. With this collaboration, Qualcomm aims to streamline MLOps workflows and speed up the development of AI models for Snapdragon at the edge.
Google Cloud announced Vertex AI in May as a unified platform for developing, deploying, and maintaining AI models. According to Google, training models with Vertex AI requires almost 80% fewer lines of code compared to other platforms. Google claims it's the same toolkit used internally to power Google products, spanning computer vision, language, and structured data.
Vertex AI consists of various tools, but Qualcomm specifically called out Neural Architecture Search (NAS). As the name implies, NAS automatically searches a space of candidate network designs to find architectures that balance accuracy against on-device constraints such as latency. Vertex AI NAS will be integrated into the Qualcomm Neural Processing SDK and will run on the Qualcomm AI Engine.
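To illustrate the core idea behind neural architecture search (not the Vertex AI API itself), here is a minimal sketch using random search over a toy search space. The search space, the `proxy_score` trade-off, and all numbers are invented for illustration; production NAS systems like Vertex AI use far richer search spaces and evaluate candidates with real training runs and on-device latency measurements.

```python
import random

# Hypothetical search space: each candidate architecture is described
# by a depth (number of layers) and a width (channels per layer).
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
}

def proxy_score(arch):
    """Invented stand-in for a NAS objective: reward predicted accuracy,
    penalize predicted on-device latency. Real NAS measures both."""
    accuracy = arch["depth"] * 0.1 + arch["width"] * 0.001
    latency = arch["depth"] * arch["width"] * 0.0001
    return accuracy - latency

def random_search(trials=20, seed=0):
    """Sample architectures at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(trials):
        arch = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        score = proxy_score(arch)
        if score > best_score:
            best, best_score = arch, score
    return best, best_score

if __name__ == "__main__":
    best_arch, score = random_search()
    print(best_arch, round(score, 4))
```

Random search is only the simplest NAS strategy; practical systems typically use reinforcement learning, evolutionary search, or differentiable relaxations to explore the space more efficiently.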
“With this collaboration, Qualcomm Technologies will now be able to build and optimize new AI models in weeks rather than months, and we’re thrilled at the impact this will have on people using Snapdragon-powered devices,” June Yang, vice president of Cloud AI and Industry Solutions at Google Cloud, said in a statement.