[Embedded Vision Summit 2020] Perceive’s Steve Teig Describes How to Do Data Center-Class Inference in Edge Devices at Low Power


Steve Teig, CEO of Perceive, presents the “Ergo: Perceive’s Chip – Data Center-Class Inference in Edge Devices at Ultra-Low Power” tutorial at the September 2020 Embedded Vision Summit.


To date, people seeking to deploy machine learning-based inference within consumer electronics have had only two choices, both unattractive. The first option entails transmitting voluminous raw data, such as video, to the cloud, potentially violating customers’ privacy, tempting hackers, and incurring substantial energy, monetary, and latency costs. The second option runs inference at the edge, but on severely limited hardware that can implement only tiny, relatively inaccurate neural networks (e.g., MobileNet), and even those tiny networks run at low frame rates.


Solving this dilemma, Perceive’s new chip, Ergo, runs large, advanced neural networks at high speed for imaging, audio, language, and other applications inside edge devices, without any off-chip RAM. Even large networks, such as YOLOv3 with more than 64 million weights, can run at ~250 fps (with batch size 1). Moreover, Ergo can run YOLOv3 at 30 fps at about 20 mW, i.e., with more than 50x the power efficiency of competing devices.
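As a rough sanity check on those figures (this arithmetic is not from the talk, just derived from the numbers quoted above), the claimed power and frame rate imply the following energy per inference:

```python
# Back-of-the-envelope energy per inference for the quoted figures:
# YOLOv3 at 30 fps drawing ~20 mW.
# Assumption (not stated in the talk): the 20 mW figure covers the
# entire inference workload, not just part of the chip.
power_w = 0.020          # ~20 mW total power
frames_per_s = 30        # YOLOv3 frame rate at that power
energy_per_frame_j = power_w / frames_per_s
print(f"~{energy_per_frame_j * 1e3:.2f} mJ per inference")  # ~0.67 mJ
```

At well under a millijoule per inference for a 64-million-weight network, this is the scale of efficiency the talk attributes to running without off-chip RAM.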


Watch the full video here.