Lattice Blog

Architecting Low Power AI

System Architecture Options for On-Device AI

Posted 11/14/2018 by Deepak Boppana

How often is low power the determining factor for success? When designing solutions for AI inferencing in always-on edge devices, it certainly is: power consumption must be measured in milliwatts. Think about it: AI at the edge solves real-world problems, and is – or very soon will be – everywhere.

Home is where AI is

Posted 09/04/2018 by Hussein Osman

Very soon most homes will have Siri, Alexa, Google Home or similar. Many already have all three. The acceptance of such sophisticated AI systems as an everyday, normal addition to the living room says much about the human capacity to imagine, conceptualize, innovate, experiment with...

Meeting Demand for More Intelligence at the Edge

Posted 08/21/2018 by Deepak Boppana

Over recent decades, system design has evolved from one processing topology to another, swinging from centralized to distributed architectures and back again in a constant search for the ideal solution.

AI / Machine Learning

Inferencing Technology Stack Shrinks Time-to-Market for Edge Applications

Posted 06/12/2018 by Deepak Boppana

New Technology Promises to Accelerate Deployment of Machine Learning Inferencing Across Mass Market, Low-power IoT Applications
