Presented by Qualcomm Technologies
With the global rollout of 5G, AI is moving to the edge, opening up all of the possibilities that intelligent connection can bring — all the time. From smartphones to vehicles, factory cameras, VR/AR devices and more, products with on-device AI capabilities have prompted a surge of genuine digital innovation. Developers have all the building blocks they need to drive digital transformation in every industry, but they still need the right tools to bring it all together.
“A major challenge we’re seeing for developers and OEMs is access to AI software that can generate best-in-class performance from the hardware,” says Ziad Asghar, vice president of product management at Qualcomm Technologies, Inc.
The other challenge is scale. How do you make it easy and cost-effective for developers to take an AI feature developed for one product and expand it across a portfolio of products — or use it to open up a whole new opportunity in a new line of business? Right now, it takes a great deal of effort and cost to recreate a model from scratch for each new use case — time that could be better spent on innovation.
That’s where the modern unified AI stack comes in, transforming how developers and OEMs approach AI innovation and business growth.
Giving developers and OEMs the competitive advantage
A unified AI software stack, which connects the hardware required to build AI applications with the developer environment, gives developers a single place to view, manage and develop consistently across a full range of applications. For a company like Qualcomm, a unified portfolio of proprietary tools makes it far easier for its partners to develop across product lines.
One of the most lucrative benefits of a unified AI software stack is how it puts resources like time and budget directly back into the developer’s pocket, Asghar says.
Once a model has been completed and optimized to the point where it can be deployed, developers can take that model, adapt it for a different vantage point in another AI-capable product, and deploy it there quickly as well, rather than wrestling with a different stack or redeveloping from scratch.
Or a company that has only ever developed for smartphones and handsets can take its AI-powered image enhancement neural network and apply it to a home security camera that identifies and approves visitors, or to an autonomous car to improve its collision sensors.
Across different Qualcomm products, every new AI feature developed for one product can be an opportunity to transform another product or open up new research areas, Asghar says. Qualcomm would much rather see developers spend their resources on that kind of innovation, rather than just redoing the same work for different products.
“If they’re not spending as much time and effort on essentially creating new models for each of these different Qualcomm products, that same R&D spend can now be used to enhance those experiences much more,” Asghar says. “It’s a very big advantage for customers who may have felt that the increased R&D they would need to go into a new area would be prohibitive.”
“These could be use cases and experiences that we might not even be looking at yet,” he adds. “That’s the true promise and benefit of making this very simple for OEMs and developers.”
Why build a proprietary AI stack?
In the connected world, every node now speaks the same language and can leverage the same technology. Qualcomm was already enabling this from a hardware perspective, and it has designed the Qualcomm AI Stack to extend that advantage to the software side for its developers and OEMs.
“It’s an industry-leading software supporting the connected edge side and the devices that Qualcomm powers, and it’s going to make the lives of developers and OEMs extremely easy,” Asghar says. “It’s going to allow them to develop innovative, best-in-class products.”
The company’s AI stack unifies its existing AI software offerings into a single, end-to-end package. OEM customers and developers can create, optimize and deploy their AI applications across a span of devices powered by Qualcomm solutions and connected intelligent edge products. That includes smartphones, IoT, automotive, XR, cloud and mobile PC.
“That scale in itself is a huge competitive edge. We have the understanding of different vantage points which others do not have,” he says. “But on top of that, having the Qualcomm AI stack allows us to be able to do a lot of work with lesser effort, which is a huge competitive edge too, for us and for our partners.”
A closer look at the stack
The Qualcomm AI Stack, from top to bottom, supports popular AI frameworks such as TensorFlow, PyTorch and ONNX, and runtimes including TensorFlow Lite, TensorFlow Lite Micro and ONNX RT. In addition, it includes inferencing SDKs such as the Qualcomm Neural Processing SDK, with a recently announced Windows version. The developer libraries and services support the latest programming languages, virtual platforms based on open-source QEMU, and compilers including LLVM and TVM.
At a lower level, the system software includes system interfaces, emulation support and drivers. It also includes OS support across different product lines, such as Android, Windows, Linux and QNX, and infrastructure software like Prometheus, Kubernetes and Docker.
As part of the new offering, the Qualcomm AI Engine Direct will now scale across every AI accelerator inside a broad range of products. The offering also includes popular tools such as the Qualcomm AI Model Efficiency Toolkit (AIMET), the AIMET Model Zoo, model analyzers and Neural Architecture Search (NAS), including support for Google Cloud Vertex AI NAS.
“Our strong technology road map allows us to take the leadership position in new business areas very quickly,” Asghar says. “That’s exactly what we’re doing on the AI software side, for our developers and OEMs, with the Qualcomm AI Stack. We believe it’s the leading edge offering for the connected intelligent edge.”
Qualcomm AI Stack is globally available today. To learn more about the Qualcomm AI Stack, go here.