Qualcomm announces new deep learning SDK with support for Snapdragon 820, heterogeneous compute
Qualcomm announced this morning that it's building its first deep learning software development kit (SDK) for Snapdragon 820 processors. The new SDK (the Snapdragon Neural Processing Engine) runs on top of Qualcomm's Zeroth Machine Intelligence Platform and is designed to leverage the heterogeneous compute capabilities of the Snapdragon 820.
Before we dive into this topic in more detail, let's clear up one point of confusion. We first reported on Zeroth more than a year ago, when Qualcomm was discussing including Zeroth as a physical hardware core known as an NPU, or Neural Processing Unit. This core was rumored to be included as standard on all Snapdragon 820 devices. We now know that Qualcomm opted not to ship an NPU with the Snapdragon 820, and the Zeroth brand name refers to a software machine learning platform rather than a specific processing block on the SoC.
Defining deep learning
Deep learning is a subset of machine learning, which is a method of teaching a computer how to do something rather than programming it to do something. Early neural networks were fairly shallow, with an input layer, a few hidden layers, and an output layer. A deep learning network, as the name implies, uses far more layers to compute the relationships between variables.
Neural networks are widely used in computer vision and have been deployed in that field for several decades, but much of the research into fields like self-driving cars has been made possible by advances in deep learning. A conventional neural network might have a single hidden layer where "weights" are computed for the purpose of facial recognition, speech recognition, or handwriting analysis:
In an example like this, data is fed in, the network weights it (according to the parameters it has learned through training runs), and then the output is displayed. A deep learning network, in contrast, looks more like this:
Deep learning networks have more hidden layers than conventional neural networks
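The difference can be sketched in a few lines of code. Below is a minimal forward pass written with NumPy; the layer sizes, random weights, and ReLU activation are arbitrary choices for illustration, not details from Qualcomm's SDK. The only distinction between the "shallow" and "deep" networks is how many hidden layers are stacked between input and output:

```python
import numpy as np

def relu(x):
    # Simple nonlinearity applied between layers.
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# A shallow network: input -> one hidden layer -> output.
shallow_weights = [rng.normal(size=(4, 8)), rng.normal(size=(8, 3))]

# A deep network: the same idea, but with several hidden layers stacked.
deep_weights = ([rng.normal(size=(4, 16))]
                + [rng.normal(size=(16, 16)) for _ in range(5)]
                + [rng.normal(size=(16, 3))])

def forward(x, weights):
    # Each layer weights its inputs and passes the result through a
    # nonlinearity; the output of one layer is the input to the next.
    for w in weights[:-1]:
        x = relu(x @ w)
    return x @ weights[-1]  # final layer: raw output scores

x = rng.normal(size=(1, 4))               # one input sample with 4 features
print(forward(x, shallow_weights).shape)  # (1, 3)
print(forward(x, deep_weights).shape)     # (1, 3)
```

Both networks map the same 4-feature input to 3 output scores; the deep version simply recombines intermediate results through more stages, which is what lets it model more complicated relationships.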
Qualcomm, for instance, currently uses Zeroth for two technologies: Snapdragon Scene Detect, which classifies objects, items, and people within a visual scene, and Snapdragon Smart Protect, which uses machine learning to look for suspicious behavior that could be a sign that a smartphone has been compromised.
Snapdragon Smart Protect
If you're having trouble grasping how deep learning is useful, consider the following example. Imagine you're walking down the street and you see a house with the front door standing open. How you interpret this will depend on a great many additional data points: Is there a vehicle obviously being loaded or unloaded? Are there any people visible in or around the entryway? Do you hear shouting, laughter, or music? Are there any lights on inside the house, and if there are, can you see anything? Is it 5 AM, 12 noon, or 11:30 PM?
The answers to these questions determine how you respond to the situation. If there are people moving in and out of the house and loud music playing, it's probably a party. If no one is visible and the house is dark, you might be witnessing a break-in, or someone may simply have forgotten to latch the door properly. We assign "weights" to these probabilities and evaluate the situation appropriately, and we do it unconsciously and at boggling speed compared with a conventional computer. Conventional neural networks try to duplicate this process. Deep learning networks expand on the basic principles of neural networks, but add more hidden layers and, as a result, are capable of evaluating more complicated scenarios and making more sophisticated determinations.
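That weighting process can be made concrete with a toy score-based classifier. The features and weight values below are invented purely for this example (a real network would learn its weights from training data rather than have them hand-assigned):

```python
# Toy "open front door" classifier: combine weighted evidence into a score.
# Positive weights push toward "party"; negative ones toward "break-in".
# These hand-picked weights are illustrative only, not learned values.
WEIGHTS = {
    "loud_music": 2.0,
    "people_visible": 1.5,
    "lights_on": 1.0,
    "late_night": -1.5,
    "house_dark": -2.0,
}

def interpret(observations):
    # Sum the weights of every feature we observed, then threshold.
    score = sum(WEIGHTS[f] for f in observations)
    return "probably a party" if score > 0 else "possible break-in"

print(interpret({"loud_music", "people_visible", "lights_on"}))  # probably a party
print(interpret({"house_dark", "late_night"}))                   # possible break-in
```

A single weighted sum like this is roughly what one neuron in one layer does; deep networks chain thousands of them so the evidence can interact in far richer ways than a flat sum allows.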
Features and markets
According to Qualcomm, the Snapdragon Neural Processing Engine contains the following features:
- Accelerated runtime for on-device execution of convolutional and recurrent neural networks on the Snapdragon 820 cores (Qualcomm Kryo CPU, Qualcomm Adreno GPU, Qualcomm Hexagon DSP);
- Support for common deep learning model frameworks, including Caffe and CudaConvNet;
- A lightweight, flexible platform designed to use Snapdragon heterogeneous cores to deliver optimal performance and power consumption;
- Support for companies in a broad range of industries, including healthcare, automotive, security, and imaging, to run their own proprietary trained neural network models on portable devices.
Qualcomm is clearly interested in emerging markets like self-driving vehicles, as is Nvidia. The "intelligence" of deep learning has profound implications for how we interface with technology, however, and could potentially lead to a revolution in human-computer interaction.
One of the differences between computers on shows like Star Trek: The Next Generation and our own technology is that Star Trek (and plenty of other sci-fi) depicts a computer that's both conversationally fluent and capable of interpreting less-than-perfectly-clear statements. The replicator knows that when Captain Picard says "Tea, Earl Grey, hot," he wants his tea served at a specific temperature and does not ask him to explain what "hot" means. (There's an interesting StackExchange thread on syntax and speech as depicted on Star Trek, for the truly nerdy.)
Deep learning networks could help us build computer programs that are far more capable of parsing human speech than current software. I suspect it's also the basis for much of the work companies like Facebook and Microsoft are doing on bot research, though Tay's implosion last month also shows the perils of such research.
With the Zeroth Machine Intelligence Platform and the Snapdragon Neural Processing Engine, Qualcomm is throwing its hat into the ring and betting developers will use the capabilities of the Snapdragon 820's CPU, DSP, and GPU to build heterogeneous networks that leverage the capabilities of all three processing blocks. The SDK is expected to be available in the second half of this year.
Source: https://www.extremetech.com/computing/227634-qualcomm-announces-new-deep-learning-sdk-with-support-for-snapdragon-820-heterogeneous-compute
