The myriad video-processing abilities of the Xilinx Zynq SoC will be on display in Alliance member demonstrations in the Xilinx booth at Embedded World 2015 in Nuremberg, Germany, later this month. The demonstrations range from robots to ADAS (Advanced Driver Assistance Systems) to AR (Augmented Reality). These demos include:
- Multi-camera Surround View Auto Calibration with Synchronous Recording: Xilinx Alliance Premier Member Xylon will be showing its Zynq-based, multi-camera Surround View ADAS demo, which automatically stitches as many as six video streams into a surround view while recording them synchronously. (Six cameras are critically important for larger vehicles such as trucks and buses.) Here’s a very cool (silent) video demo of the Xylon Surround View multi-camera system:
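For a rough sense of how surround-view compositing works, here is a minimal, purely illustrative sketch: calibration yields a transform from the common top-down canvas back into each camera’s image, and the canvas is filled by inverse warping. The tiny images, affine transforms, and function names below are made-up stand-ins, not Xylon’s pipeline; a real system would also model lens distortion, use full homographies, and blend the seams between cameras.

```python
# Illustrative surround-view compositing via inverse warping.
# All transforms and images here are hypothetical toy values.

def inverse_warp(canvas_w, canvas_h, cameras):
    """cameras: list of (image, inv_affine) where inv_affine = (a, b, tx, c, d, ty)
    maps a canvas pixel (x, y) back into that camera's image coordinates."""
    canvas = [[0] * canvas_w for _ in range(canvas_h)]
    for y in range(canvas_h):
        for x in range(canvas_w):
            for img, (a, b, tx, c, d, ty) in cameras:
                u = int(round(a * x + b * y + tx))  # source column
                v = int(round(c * x + d * y + ty))  # source row
                if 0 <= v < len(img) and 0 <= u < len(img[0]):
                    canvas[y][x] = img[v][u]
                    break  # first camera that covers this pixel wins
    return canvas

# Two 2x2 "camera" images placed side by side on a 4x2 canvas:
left_cam  = [[1, 1], [1, 1]]
right_cam = [[2, 2], [2, 2]]
cams = [
    (left_cam,  (1, 0, 0, 0, 1, 0)),   # identity: covers canvas x in [0, 2)
    (right_cam, (1, 0, -2, 0, 1, 0)),  # shifted:  covers canvas x in [2, 4)
]
print(inverse_warp(4, 2, cams))  # → [[1, 1, 2, 2], [1, 1, 2, 2]]
```

Auto-calibration, the hard part of the real demo, amounts to estimating those per-camera transforms from the scene itself rather than hand-coding them.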
- Real-Time Semi-Global Matching Stereo Vision: Supercomputing Systems Ltd, a Xilinx Alliance Certified Member, will present a real-time stereo-vision system that performs image rectification and disparity calculation on a stereo camera pair to map the location and distance of objects in view. The demo runs on a Zynq Z-7045 All Programmable SoC development platform.
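The underlying idea is straightforward: in a rectified stereo pair, a scene point appears shifted horizontally between the two images, and that shift (the disparity d, in pixels) yields metric depth via Z = f·B/d, where f is the focal length in pixels and B is the camera baseline. The toy sketch below illustrates this with naive sum-of-absolute-differences matching on a single scanline; every name and number is hypothetical, and the actual demo implements the far more robust semi-global matching algorithm in the Zynq SoC’s programmable logic.

```python
# Toy stereo disparity + depth sketch (hypothetical values; the real demo
# uses semi-global matching, not per-pixel SAD matching like this).

FOCAL_PX = 800.0   # assumed focal length in pixels
BASELINE_M = 0.12  # assumed camera baseline in meters

def disparity_1d(left, right, window=1, max_disp=8):
    """Brute-force SAD matching along one rectified scanline."""
    disps = []
    n = len(left)
    for x in range(n):
        best_d, best_cost = 0, float("inf")
        for d in range(min(max_disp, x) + 1):
            cost = 0
            for w in range(-window, window + 1):
                xl, xr = x + w, x - d + w
                if 0 <= xl < n and 0 <= xr < n:
                    cost += abs(left[xl] - right[xr])
                else:
                    cost += 255  # penalize out-of-bounds samples
            if cost < best_cost:
                best_cost, best_d = cost, d
        disps.append(best_d)
    return disps

def depth_m(disp_px):
    """Triangulate metric depth from disparity: Z = f * B / d."""
    return FOCAL_PX * BASELINE_M / disp_px if disp_px > 0 else float("inf")

# A bright feature at index 6 in the left scanline appears at index 3 in
# the right scanline, i.e. a disparity of 3 pixels.
left  = [10, 10, 10, 10, 10, 10, 200, 10, 10, 10]
right = [10, 10, 10, 200, 10, 10, 10, 10, 10, 10]
d = disparity_1d(left, right)[6]
print(d, depth_m(d))  # disparity 3 → depth = 800 * 0.12 / 3 = 32 m
```

Semi-global matching improves on this kind of local matching by aggregating matching costs along multiple image paths, which is exactly the sort of regular, parallel computation that maps well onto FPGA fabric.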
- Simultaneous Localization and Mapping for Automotive Applications: Metaio, a Xilinx Alliance Member, will be demonstrating its AR expertise in the form of a localization and mapping system. The company’s AREngine 2 accelerator hardware currently runs in the programmable logic of a Xilinx Zynq SoC development board under control of the Linux OS running on the Zynq SoC’s ARM Cortex-A9 MPCore processor. Here’s a video with more information about the Metaio AREngine:
- Embedded Computer Vision for Intelligent Stations: Silicon Software, a Xilinx Alliance Member, will be showing a high-speed inspection system based on image-processing algorithms from the company’s VisualApplets programming environment and toolkit, which makes use of both the programmable-logic and processor resources in the Xilinx Zynq SoC. Here’s a great video of the demo from SPS 2013:
- MoMath Robot Swarm: Perhaps the coolest demo in the booth, the Museum of Mathematics (MoMath) Robot Swarm consists of several LED-encrusted “Trilobots” that form patterns requiring precise positioning of each robot. Each Trilobot features a novel optical tracking system that makes it “aware” of its own position and orientation with sub-mm accuracy, based on high-resolution fiducial patterns. The image analysis is implemented in VHDL and mapped into the fabric of a Xilinx Zynq SoC on Xilinx Alliance Member Knowledge Resources’ KRM3Z20 module. The MoMath exhibit was created by Three Byte Intermedia and premiered at the museum in New York City last December. Here’s a video of the robot swarm in action from The Verge:
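To illustrate the kind of math behind fiducial-based pose tracking (purely hypothetical code, not Three Byte Intermedia’s actual VHDL pipeline): once a robot’s camera detects two landmark points whose coordinates in the fiducial pattern are known, its 2-D rigid pose (x, y, θ) follows from basic trigonometry.

```python
import math

def pose_from_two_landmarks(world_a, world_b, obs_a, obs_b):
    """Recover a 2-D rigid pose (x, y, theta) that maps known pattern
    points onto observed points: obs = R(theta) @ world + t.
    Illustrative stand-in for fiducial-based optical tracking."""
    # Heading: difference between the segment angles in the two frames.
    ang_w = math.atan2(world_b[1] - world_a[1], world_b[0] - world_a[0])
    ang_o = math.atan2(obs_b[1] - obs_a[1], obs_b[0] - obs_a[0])
    theta = ang_o - ang_w
    c, s = math.cos(theta), math.sin(theta)
    # Translation: what remains after rotating world_a onto obs_a.
    tx = obs_a[0] - (c * world_a[0] - s * world_a[1])
    ty = obs_a[1] - (s * world_a[0] + c * world_a[1])
    return tx, ty, theta

# Two fiducial corners 10 cm apart, observed after a pose of
# (0.5 m, 0.2 m) translation and a 90-degree rotation:
x, y, th = pose_from_two_landmarks((0, 0), (0.1, 0), (0.5, 0.2), (0.5, 0.3))
print(round(x, 3), round(y, 3), round(math.degrees(th)))  # 0.5 0.2 90
```

In the real exhibit this kind of analysis runs continuously in the Zynq SoC’s fabric, which is what makes sub-mm accuracy achievable at the frame rates the swarm choreography needs.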