Imaging Sensors Summit Grenoble


AI close to the sensor

Easics had the privilege of speaking at the MEMS and Imaging Sensors Summit in Grenoble. In this talk we explained how image sensor designers can integrate our AI engine, or accelerator, so they can use deep learning networks to analyze and classify sensor data close to the point of data collection. That is what we call AI close to the sensor!

Send an email to info@easics.be if you would like a demo.

Abstract of the talk:

Easics uses its expertise in system-on-chip design to develop small, low-power and affordable AI engines that run locally, close to your sensors. This results in low and predictable latency at ultra-low power consumption. Easics’ embedded AI solutions integrate tightly with novel and existing sensors such as image sensors capturing light inside and outside the visible spectrum (such as hyperspectral and thermal infrared), 3D scanning lasers (LiDAR), Time-of-Flight (ToF) sensors, radar, microscopy, ultrasound sensors, and microphones. To create such innovative hardware, easics is developing the next generation of its embedded AI framework. This framework automatically generates hardware implementations of the deep neural networks that make your specific application smart. Easics maps these AI engines onto custom ASIC technology and FPGAs. The framework and technology offer enough flexibility in terms of deep learning models and hardware to provide a scalable AI engine that is ready for the future. In this talk we will elaborate on how we offer an easy integration path for image and other sensor manufacturers to add AI to their products.
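
To make "AI close to the sensor" a little more concrete, here is a minimal, generic Python sketch of the kind of 8-bit fixed-point inference such a near-sensor engine typically performs on a single frame. This is not easics' framework or API; all names, sizes and scale values are illustrative assumptions.

```python
# Generic illustration (not easics' framework): classifying a small sensor
# frame with an 8-bit quantized dense layer, the style of integer arithmetic
# a low-power, near-sensor AI engine usually runs.
import numpy as np

def quantize(x, scale):
    """Map float values to int8 with a per-tensor scale (hypothetical scheme)."""
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

rng = np.random.default_rng(0)

# Hypothetical trained weights for a 2-class classifier on a 16x16 frame.
weights = rng.normal(size=(2, 16 * 16)).astype(np.float32)
bias = rng.normal(size=2).astype(np.float32)

# Scales would normally come from calibration on representative sensor data.
w_scale, x_scale = 0.02, 0.5
w_q = quantize(weights, w_scale)

frame = rng.uniform(0.0, 1.0, size=(16, 16)).astype(np.float32)  # sensor frame
x_q = quantize(frame.ravel(), x_scale)

# Integer multiply-accumulate, then rescale back to float for the decision.
acc = w_q.astype(np.int32) @ x_q.astype(np.int32)
logits = acc * (w_scale * x_scale) + bias
print("predicted class:", int(np.argmax(logits)))
```

In a hardware AI engine this multiply-accumulate loop is what gets mapped onto dedicated logic on an FPGA or ASIC, which is why the latency stays low and predictable.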