ADAS, AV, AND SENSOR FUSION: THOUGHTS OF NODAR’S ROSEN


SOMERVILLE, Mass. – The debate over which type of sensing will be the key to truly autonomous vehicle (AV) operation, to achieving the interim AV levels, and to providing advanced driver-assistance systems (ADAS) keeps swirling, with issues like “phantom braking” intensifying the discussion.

The BRAKE Report recently caught up with NODAR Co-Founder and Chief Operating Officer Brad Rosen, who talked through some of these issues, including how the answers may come, at least in part, from a fusion of sensing technologies: his firm’s 3D camera system working in conjunction with radar and LiDAR.

“We assume and support the fusion of other sensors [beyond the camera-based NODAR system], but I think it depends upon the use case,” said Rosen. “We believe that NODAR in conjunction with radar can support Level 2, 2+ and 3 without the use of LiDAR.

“Level 4 is a different discussion. We believe trucking is going to be the first to hit the road unmanned or self-driving, and in that situation we fully expect camera-based systems to be dominant. And we expect there to be LiDARs and radars around the truck. The cameras will face forward, they’ll face rearward, and they’ll face sideways to ensure there are no collisions.”

The Hammerhead™ in the NODAR logo reflects how the company’s sensor system mimics the shark’s physiology to achieve a similar result: long-range vision with extraordinary depth perception.

“The wider the stance of the cameras, the longer the range. So we were looking around, and we found that the hammerhead shark has the widest distance between its pupils of any animal in the animal kingdom, and recent research has shown that their eyes have an overlapping region, which gives them the best depth perception of any animal.”

NODAR takes the images captured by two widely spaced, high-resolution cameras whose fields of view overlap. It then leverages advances in computing, CMOS technology, computer vision, and AI to produce accurate depth maps out to 1,000 meters and to extract specific information about potential obstacles in the vehicle’s path.
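For readers who want the geometry behind those numbers: stereo depth comes from triangulation, where depth equals focal length times baseline divided by pixel disparity, so range error shrinks as the camera “stance” widens. The Python sketch below illustrates the principle with assumed values for focal length, baseline, and disparity noise; none of the figures are NODAR’s actual parameters.

```python
# Stereo triangulation: depth = focal_length_px * baseline_m / disparity_px.
# All values below are assumed for illustration; they are not NODAR's specs.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (m) of a point observed with the given pixel disparity."""
    return focal_px * baseline_m / disparity_px

def depth_uncertainty(focal_px: float, baseline_m: float, depth_m: float,
                      disparity_noise_px: float = 0.25) -> float:
    """First-order range error: dZ ~ Z**2 / (f * B) * d_disparity.
    Error grows with the square of depth and shrinks as the baseline widens."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_noise_px

FOCAL_PX = 4000.0  # assumed high-resolution camera focal length, in pixels
for baseline in (0.12, 1.2):  # narrow behind-the-mirror pair vs. a wide "hammerhead" mount
    err = depth_uncertainty(FOCAL_PX, baseline, depth_m=1000.0)
    print(f"baseline {baseline:.2f} m -> +/-{err:.0f} m uncertainty at 1,000 m")
# A 10x wider baseline cuts the range error at a given depth by the same factor of 10.
```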


Rosen acknowledged the potential limitations that environmental conditions like snow or fog can impose on camera-based systems but said a combination of technology and sensor fusion provides the solution.

“Cameras are susceptible to all of these issues,” he said. “This is why the industry, and we, support sensor fusion: each type of sensor, whether it’s a camera, radar, or LiDAR, has its own strengths and shortcomings.”

He added that the NODAR system uses high-resolution cameras, which mitigate many of the potential weather-related and environmental limitations.
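A common textbook way to combine, say, a camera-based range with a radar range is inverse-variance weighting, in which whichever sensor is more trustworthy under current conditions carries more weight. The sketch below is a generic illustration of that idea under assumed variance figures, not a description of NODAR’s fusion pipeline.

```python
# Minimal inverse-variance fusion of two independent range estimates.
# Variances are assumed for illustration; real systems derive them from
# sensor models and current conditions (e.g., fog inflates camera variance).

def fuse_ranges(camera_range_m: float, camera_var: float,
                radar_range_m: float, radar_var: float) -> tuple[float, float]:
    """Return the fused range and its variance (the standard result for
    combining two independent Gaussian estimates)."""
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    fused = (w_cam * camera_range_m + w_rad * radar_range_m) / (w_cam + w_rad)
    fused_var = 1.0 / (w_cam + w_rad)
    return fused, fused_var

# Clear weather: the tight camera estimate dominates.
print(fuse_ranges(152.0, camera_var=1.0, radar_range_m=150.0, radar_var=9.0))
# Heavy fog: camera variance is inflated, so the radar estimate dominates.
print(fuse_ranges(152.0, camera_var=25.0, radar_range_m=150.0, radar_var=9.0))
```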

How soon does Rosen believe we will see truly autonomous, self-driving vehicles in everyday use?

“So, our general opinion, and mine, is that to get to mainstream autonomy, where a consumer could buy a car that drives itself without having to worry at all, [will take] decades; literally decades, maybe two decades for that particular use case. That would be Level 5. My opinion is that we’ll take baby steps.

“We’ll see trucking [first]. The ROI [return on investment] is really there. It costs $1.67 a mile to run a truck right now. [And] there’s a shortage of drivers, a massive shortage of drivers. The sooner we can get automated trucks on the road, run them 20 hours a day, and reduce the cost to 60 cents, because the bulk of that cost is the driver, plus the increased safety because drivers no longer get drowsy, it’s like a complete win.

“And that’s why we think that trucking will be the first to really, in a mainstream way, have autonomous driving.”
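Rosen’s per-mile figures make the ROI arithmetic easy to sanity-check; in the sketch below, the annual mileage is an assumed round number, not a figure from the interview.

```python
# Back-of-the-envelope check on the per-mile figures quoted above.
# The annual mileage is an assumption for illustration only.

cost_per_mile_today = 1.67      # dollars, as quoted
cost_per_mile_automated = 0.60  # dollars, as quoted
annual_miles = 150_000          # assumed: an automated truck running ~20 hours/day

savings_per_mile = cost_per_mile_today - cost_per_mile_automated
print(f"Savings: ${savings_per_mile:.2f}/mile, "
      f"~${savings_per_mile * annual_miles:,.0f} per truck per year")
# -> Savings: $1.07/mile, ~$160,500 per truck per year
```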


When it comes to phantom braking, Rosen speculated about the issues plaguing a couple of automakers, offering a few thoughts but nothing concrete about other companies’ systems.

He did say a key to avoiding this issue is developing a system that can accurately “understand” the environment around the vehicle.

“We think that being able to create a reliable map of the 3D environment around the car is critical. Our technology does that, and it does it out to exquisite ranges; like I said, 1,250 meters in some situations.”
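One reason a dense 3D map helps here is that braking logic can gate on actual geometry. The toy sketch below, an illustration under assumed thresholds rather than NODAR’s algorithm, flags a braking candidate only when enough 3D points fall inside the vehicle’s forward corridor and above the road surface, so flat shadows or overhead structures that fool 2D detectors never trigger it.

```python
import numpy as np

# Toy obstacle gate over a 3D point cloud derived from a depth map.
# All thresholds and corridor dimensions are assumptions for illustration.

def braking_candidate(points_xyz: np.ndarray,
                      corridor_half_width_m: float = 1.5,
                      max_range_m: float = 150.0,
                      min_height_m: float = 0.3,
                      min_points: int = 50) -> bool:
    """Return True only if enough 3D points sit inside the vehicle's
    forward corridor and above the road surface. A flat shadow (z ~ 0)
    or an overpass (z above the clearance band) never triggers it."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    in_corridor = (np.abs(y) < corridor_half_width_m) & (x > 0) & (x < max_range_m)
    above_road = (z > min_height_m) & (z < 4.0)  # assumed vehicle clearance band
    return np.count_nonzero(in_corridor & above_road) >= min_points

# A shadow on the road 60 m ahead yields points at z ~ 0, so it is ignored:
road_shadow = np.column_stack([np.full(200, 60.0), np.zeros(200), np.zeros(200)])
print(braking_candidate(road_shadow))  # False
```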

The entire interview can be viewed by clicking on the YouTube link above.

Mike Geylin

Mike Geylin is the Editor-in-Chief at Hagman Media. Geylin has been in automotive communications for five decades, working in all aspects of the industry, from OEMs to suppliers to motorsports, as well as reporting on the industry for both newspapers and magazines.