
Renesas DevCon to Feature Highly Automated Driving from Harbrick

Burney Simpson

Renesas DevCon 2015 next month is shaping up as a major conference on autonomous vehicle technology, and it will feature ride-along opportunities in vehicles just coming out of developers’ garages.

Moscow, Idaho-based Harbrick has teamed with Renesas to create several advanced vehicles loaded with autonomous technology that will be available for rides.

Harbrick is the developer of the PolySync platform for autonomous systems used by auto OEMs and tier 1 auto suppliers. Tokyo-based Renesas is a major global supplier of microcontrollers (MCUs) and microprocessors (MPUs).

DevCon runs from October 12-15 at the Hyatt Regency in Orange County, Calif. It offers a slew of seminars, presentations, workshops and panel discussions from experts in autonomous and connected driving technology. Topics include ADAS, automotive embedded controls, automotive tools and ecosystems, safety and security of autonomous systems, new developments in the automotive cockpit, and more.

In the parking lot, Harbrick will show two 2015 SUVs from Roush Enterprises outfitted with an Advanced Driver Assistance System (ADAS) sensor set and a full complement of technology that allows for autonomous driving.

“We will be giving rides in these. They are not actually automated yet but they are fully instrumented with V2X and V2I technology,” said Josh Hartung, Harbrick’s CEO and co-founder. “This is what highly automated driving – HAD – will look like.”

Harbrick’s SUVs will carry eight different LiDAR systems and five different radar systems, along with Vehicle-to-Everything (V2X) and Vehicle-to-Infrastructure (V2I) technology, several cameras, and motion sensors, all working together to create a sensor fusion demonstration, Hartung notes.

SENSOR FUSION
Sensor fusion occurs when data from a number of sensors are combined, or fused, to produce something greater than the information supplied by any single sensor. For autonomous vehicles, sensor fusion can create a simulated three-dimensional space filled with multiple objects, much like what a human driver perceives.
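To make the idea concrete, here is a minimal sketch of one common fusion technique, inverse-variance weighting, in which each sensor’s estimate of the same quantity is weighted by how much it is trusted. This is an illustration only; it is not drawn from PolySync or any Harbrick code, and the sensor names and numbers are hypothetical.

```python
# Minimal sensor fusion illustration (hypothetical values, not PolySync code):
# estimates of the same quantity from different sensors are combined, weighted
# by how much each sensor is trusted, to yield a single more reliable estimate.

def fuse_measurements(measurements):
    """Fuse a list of (value, variance) pairs using inverse-variance weighting."""
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * val for (val, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

# Hypothetical readings of the distance to the same obstacle, in meters,
# with a variance expressing each sensor's noise level.
lidar_reading  = (25.3, 0.04)  # LiDAR: precise range
radar_reading  = (24.8, 0.25)  # radar: noisier range
camera_reading = (26.1, 1.00)  # camera: rough estimate from image size

distance, variance = fuse_measurements([lidar_reading, radar_reading, camera_reading])
print(f"Fused distance: {distance:.2f} m (variance {variance:.3f})")
```

In a real vehicle this kind of combination runs continuously across many tracked objects and sensor types, which is what builds up the three-dimensional model described above.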

The Harbrick team is preparing simulated accidents, or threats of accidents, for those taking a test drive in one of the SUVs.

Indoors at the show, Harbrick will exhibit a Harbrick Kia, a White Knight Mojavaton that was used in DARPA challenges, and a bench demo of its PolySync system, along with some background on Harbrick and its plans for the future.

Harbrick uses the Renesas R-Car development boards in its PolySync platform. Renesas claims the system enables “the delivery of high-resolution camera video through multiple systems while maintaining real-time performance with low latency levels.”

This month, Renesas officially launched its R-Car T2 system-on-a-chip for Ethernet AVB-enabled vehicle camera networks, complementing its R-Car family of devices for infotainment, instrument cluster and ADAS applications.

Safe autonomous driving requires higher-performance processors to deliver continuous, high-resolution, real-time camera video from multiple sources. These systems must also support driver-assist functions such as braking, steering, and obstacle detection.