Lattice sensAI Delivers 10x Performance Boost for AI on Edge Devices
Posted 05/20/2019 by Hussein Osman
A year ago we launched the Lattice sensAI solutions stack. Since then, the need for AI at the Edge has continued to grow. Consider this statistic from Tractica: by 2025, the market for Edge-based AI chipsets is forecast to hit $51.6 billion, more than three times the forecasted revenue for cloud-based AI chipsets. Why all the interest in chips that support AI at the Edge? According to Tractica, “…AI is best fulfilled when the chipsets are optimized to provide the appropriate amount of compute capacity at the right power budget for specific AI applications.” Here at Lattice, we agree with Tractica’s point of view, and giving our customers an AI solution with the right mix of compute and power for Edge devices is exactly why we created Lattice sensAI.
Which leads me to some exciting news. Today Lattice released a new version of our sensAI solution stack with a major performance increase, plus design flow and ecosystem enhancements that make it easier than ever for system designers to support low power AI inferencing in their Edge devices. The enhancements include:
- A 10x performance boost over the previous generation of the Lattice sensAI stack, driven by updates to the CNN IP and neural network compiler and by features such as 8-bit activation quantization, smart layer merging and a dual-DSP engine (a minimal sketch of the quantization arithmetic appears after this list). The performance gain enables real-world application benefits such as analyzing images at a higher frame rate or resolution while still maintaining low power consumption.
- A more seamless user experience that accelerates and simplifies working with sensAI, including:
- Expanded support of ML frameworks to include Keras
- Support for quantization and fraction setting schemes during neural network training that eliminate the need for iterative post-processing
- Simple neural network debugging via USB
- New customizable reference designs that customers can use to add support for popular use cases like object counting and presence detection
- A growing design service partner ecosystem, including full product design capability from partners such as Pixcellence, helping customers get to market faster.
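For readers who want a concrete feel for what 8-bit activation quantization means, below is a minimal Python sketch. It builds a small Keras CNN of the kind an Edge inferencing flow might start from, then quantizes one layer’s activations to signed 8-bit fixed point using a chosen fraction (fractional-bit) setting. The model architecture, helper names and fraction value are illustrative assumptions, not the sensAI neural network compiler’s actual API or flow.

```python
# Illustrative sketch only: the model, helper names and fraction value are
# assumptions chosen to show the arithmetic behind 8-bit activation
# quantization; this is not the sensAI neural network compiler's API.
import numpy as np
from tensorflow import keras

# A small Keras CNN of the kind an Edge inferencing flow might start from.
model = keras.Sequential([
    keras.Input(shape=(32, 32, 1)),
    keras.layers.Conv2D(8, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(2, activation="softmax"),
])

def quantize_activations_int8(x, frac_bits):
    """Map float activations to signed 8-bit fixed point.

    frac_bits is the 'fraction setting': how many of the 8 bits represent
    the fractional part. More fractional bits give finer resolution but a
    smaller representable range before clipping.
    """
    scale = 2.0 ** frac_bits
    q = np.clip(np.round(x * scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Convert int8 values back to floats to measure quantization error."""
    return q.astype(np.float32) / scale

# Pull the first conv layer's activations for one synthetic image
# (standard Keras feature-extraction pattern).
sample = np.random.rand(1, 32, 32, 1).astype(np.float32)
feature_extractor = keras.Model(inputs=model.inputs,
                                outputs=model.layers[0].output)
activations = feature_extractor.predict(sample, verbose=0)

q, scale = quantize_activations_int8(activations, frac_bits=5)
max_err = np.abs(activations - dequantize(q, scale)).max()
print(f"max activation quantization error at 5 fractional bits: {max_err:.4f}")
```

The tradeoff the fraction setting controls is visible in the helper: more fractional bits give finer resolution but shrink the representable range, which is presumably the kind of choice the new training-time fraction setting support takes off the designer’s iterative post-processing plate.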
sensAI lets OEMs add low power AI inferencing to new products or easily update their existing device designs. As our new reference designs show, this capability is particularly helpful for smart camera applications in markets like industrial, automotive and consumer. Integrating local AI support in smart cameras has a number of advantages:
- Reduced data cost – Smart cameras without local AI support constantly send video back to the datacenter when they think they’ve detected a person or object, and many of these detections are false positives. For example, a cat walking by a smart doorbell would trigger the doorbell to query the datacenter to determine whether the object in the image is actually a person. If the camera supports human presence detection locally, it can make this determination on its own without transmitting data, greatly reducing the costs associated with cloud-based analytics subscriptions.
- Increased data privacy – By conducting simple smart vision functions at the Edge, OEMs reduce the amount of video data sent to a datacenter for analysis. This keeps more customer data from potential exposure as it travels to and from the datacenter.
- Increased data security – With sensAI’s support for local inference, less device data needs to be collected and stored remotely, reducing the risk of customer data theft from the datacenter.
We look forward to seeing new and existing sensAI users capitalize on these new performance and ease-of-use features to enable AI inferencing in Edge devices of all types.
To learn more about the Lattice sensAI solutions stack, please visit www.latticesemi.com/sensAI.