Lattice Blog

Accelerating Innovation in Low Power AI Applications with Lattice FPGAs

Posted 08/03/2022 by Hussein Osman, Segment Marketing Director, Lattice Semiconductor

On-device AI inference capability is expected to reach 60% of all devices by 2024, according to ABI Research. This underscores the rapid pace of AI innovation over the last few years, which has required engineers to adopt flexible design models as workloads transition from the cloud to the edge. The trend is driven by requirements for ultra-low latency, security, and privacy, as well as bandwidth limitations. Lattice FPGAs and software solutions help enable acceleration of future models with exist...

Lattice sensAI Stack Enables Next Generation Edge AI Experiences

Posted 11/10/2021 by Hussein Osman

The AI/ML revolution continues to gain traction across multiple applications, particularly Edge applications. Edge devices like security cameras, robots, industrial equipment, client PCs, and even toys can now support AI/ML features that provide users with new capabilities and experiences. According to industry analyst firm ABI Research, the Edge AI chipset market “has experienced strong growth in the past and is expected to continue to grow to US$71 billion by 2024, with a CAGR of 31%...

Lattice Certus-NX: Reinventing the Low Power, General Purpose FPGA

Posted 06/24/2020 by Juju Joyce

Lattice Certus-NX FPGAs reinvent the general purpose, low power FPGA by delivering twice the I/O density per mm² of similar competing FPGAs.

System Architecture Options for On-Device AI

Posted 11/14/2018 by Deepak Boppana

How often is low power the determining factor for success? Certainly when designing solutions for AI inferencing in always-on edge devices, where power consumption must be measured in milliwatts. Think about it: AI at the edge solves real-world problems, and it is, or very soon will be, everywhere.
