With AI image sensing and Edge computing technologies on the rise, Etron Technology subsidiary eYs3D Microelectronics has launched its new eSP936 multi-sensor image controller IC.
The eSP936 supports the synchronous processing of data from up to seven visual sensors, providing high image recognition accuracy. Paired with the Sense and React human-machine interaction developer interface, it enables intelligent control through human-machine interaction, becoming a key driver for the implementation of smart applications.
The eSP936 can be integrated with multi-modal vision-language models (VLMs), combining multiple visual sensors with real-time AI Edge computing capabilities. It is well suited to smart application scenarios such as unmanned vehicles, including automated guided vehicles, autonomous mobile robots, and drones.
Furthermore, the eSP936 can process multiple 2D images at high speed and generate 3D depth maps, enhancing precise environmental recognition, while embedded AI chips enable dynamic navigation in complex environments. Industrial and service robots can thereby achieve more precise intelligent perception in complex scenarios, combining real-time computing and automated operations for high-efficiency performance. In immersive human-machine interaction systems, the eSP936 can be integrated with AI SoC platforms to enhance the Sense and React interaction experience, with applications spanning drones, unmanned surface vehicles (USVs), video conferencing, augmented education, and extended reality (XR).
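The 2D-to-3D conversion described above is, in principle, stereo triangulation: depth is recovered from the pixel disparity between two views. A minimal sketch of that relationship in Python (illustrative values only, not eYs3D's implementation):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity (pixels) to metric depth.

    Standard triangulation: Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in metres, and d the
    disparity in pixels between the left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical example: 700 px focal length, 6 cm baseline, 21 px disparity
print(depth_from_disparity(21, 700, 0.06))  # -> 2.0 (metres)
```

A depth map is simply this calculation applied per pixel across the matched image pair; dedicated silicon like the eSP936 performs the matching and conversion in hardware rather than in software.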
Key technology highlights of the eSP936 include support for synchronised processing of data from up to seven visual sensors, a built-in DRAM chip, and wide-angle image de-warping technology, enabling high-precision environmental perception and multi-view 3D depth map generation. It also features high-performance data compression to reduce latency, providing developers with a flexible and efficient platform, while simultaneous MIPI and USB processing ensures high-quality 2D image and 3D depth map output, improving image recognition accuracy. The company hopes to create more opportunities for applications in which customers can leverage the single-chip eSP936 to maximise the potential of AI agents.
At CES 2025, eYs3D Microelectronics is debuting its cutting-edge spatial awareness solution, the YX9170. Powered by the eYs3D multi-sensor image control chip eSP936 and the XINK-II Edge spatial computing platform, the solution leverages advanced multi-sensor fusion and AI-driven technologies to deliver breakthroughs in spatial perception and recognition. It provides robust support for the development of intelligent systems such as industrial and service robots and autonomous vehicles, emerging as a critical driver for smart system innovation.
The core strength of the YX9170 solution lies in its comprehensive sensor fusion capabilities. It integrates dual depth sensors supporting high-definition images at up to 1280×720 resolution and synchronises four RGB cameras for an enhanced perception range and greater recognition accuracy. With support for multiple stereo lens baselines, it overcomes the limitations of traditional stereo vision measurement techniques. This innovation surpasses previous challenges in multi-camera image processing and computational architecture integration, forming a holistic AI-based spatial awareness system.
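The value of combining multiple stereo baselines follows from basic stereo geometry: since Z = f·B/d, a fixed disparity error maps to a depth error that grows quadratically with distance and shrinks with a longer baseline, so short baselines serve the near field and long baselines the far field. A hedged sketch of that error model (illustrative parameters, not eYs3D's specification):

```python
def depth_error(depth_m, focal_length_px, baseline_m, disparity_error_px=1.0):
    """Approximate depth uncertainty of a stereo pair.

    Differentiating Z = f * B / d gives |dZ| ~= Z^2 / (f * B) * |dd|:
    quadratic in distance, inversely proportional to baseline. This is
    why a single fixed baseline limits the usable measurement range.
    """
    return depth_m ** 2 / (focal_length_px * baseline_m) * disparity_error_px

# Hypothetical 700 px lens at 4 m: doubling the baseline halves the error
print(depth_error(4.0, 700, 0.06))  # 6 cm baseline
print(depth_error(4.0, 700, 0.12))  # 12 cm baseline, half the uncertainty
```

Fusing baselines of different lengths, as the YX9170 does, lets one system keep depth error bounded across both near and far ranges.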
eYs3D is also introducing its groundbreaking YX9670 navigation solution for autonomous vehicles. Equipped with the eYs3D multi-sensor image control chip eSP936 and the XINK-II Edge spatial computing platform, this innovative solution revolutionises environmental awareness and navigation capabilities through advanced multi-sensor fusion and AI-driven technology.
The core advantage of the YX9670 solution lies in its comprehensive sensor fusion capabilities. It integrates a dual-depth sensor system supporting high-definition images up to 1280×720 resolution and synchronises four RGB cameras to deliver a 278-degree panoramic field of view. Additionally, a monochrome camera provides a 145-degree overhead view, with an embedded high-efficiency AHRS (Attitude and Heading Reference System) for vessel coordination and posture recognition. The system also incorporates a thermal imaging sensor, creating a highly advanced perception system.
eYs3D Microelectronics will debut its XINK-II Edge Spatial Computing Platform at CES 2025. This “Platform as a Service (PaaS)” development solution is equipped with the eYs3D AI chip eCV5546, featuring ARM Cortex-A and Cortex-M CPU cores along with an NPU (Neural Processing Unit). The platform supports the integration of AHRS, thermal imaging, and millimetre-wave radar sensors while incorporating AI convolutional neural network (CNN) technology to significantly enhance object recognition and detection capabilities for Edge AI devices.
Notably, ARM IoT Capital has provided funding, while ARM technology underpins the XINK-II platform's performance. ARM CPUs with the Neon instruction set support SIMD processing, accelerating vector and matrix operations for better computer vision and signal processing performance. Additionally, the low-power ARM Cortex-M4 processor serves as an MCU for system control, motor operation, and timeline synchronisation. With a next-generation AI accelerator, the platform delivers superior performance compared to its peers, offering programmable development capabilities that support various AI models.
In the rapidly evolving AI landscape, XINK-II is well positioned to meet future computational demands. It supports a wide range of AI development and acceleration frameworks, including ONNX, TensorFlow, TensorFlow Lite, and PyTorch, and provides complete development kits and services to help developers build NPU applications more efficiently.
At CES 2025, Etron Technology Group will be focusing on the theme ‘Innovation in Action, AI Implementation and Connecting MemorAiLink to Shape the Future’.