Mobility analysts, urban planners and AI companies bill widespread lidar as a building block for future urban societies, where autonomous vehicles, smart homes and infrastructure work together to create “smart” cities.
Lidar, short for light detection and ranging, is a sensing method that bounces laser pulses off surroundings to measure distances, letting devices infer what an object is from its shape. In theory, when deployed on traffic lights, in parking lots and on enough vehicles, the technology could help contextualise what’s happening outside so cities can better manage energy, security and traffic congestion.
The tech has been around since at least the 1970s. However, it was long considered too expensive and complicated for companies in a broad range of industries to utilise. That is, until now, according to HanBin Lee, founder of South Korea-based Seoul Robotics, a computer vision company.
Prices have come down so much that the tech is found in the latest iPhone models. It’s how robot vacuums see what’s around your home. And it was at the centre of several thought-provoking product announcements to come out of this year’s CES, the large global tech conference that took place last week.
Seoul Robotics said it wants to take the siloed industry of lidar software and expand it to the masses. Essentially, the software company developed what it calls an easy-to-use “plug-and-play” lidar system that allows a wide range of organisations to benefit from 3D sensors.
For instance, retail stores could use it to understand where people are moving and whether patrons are social distancing. Cities could use it on highway offramps to detect vehicles going the wrong way.
Its offering is meant to analyse and interpret 3D data from most available lidar products. It was built to unlock “autonomy through infrastructure,” Lee said.
Seoul Robotics already has a few big-name partnerships under its belt, including BMW and Mercedes-Benz. It also partnered with the lidar company Velodyne on office monitoring tech for Qualcomm. Seoul Robotics’ software has been installed in parking lots to help automate cars. BMW used it to move driverless vehicles via wireless internet connections.
“So basically, this infrastructure takes over the vehicle. And thousands of vehicles can be automated with just a few sensors,” Lee said.
Intel’s Mobileye said at the trade show that it developed a strategy for making highly automated cars safe enough to use on roads across the globe by 2025. The company, a leading player in automotive technology, plans to leverage crowdsourced mapping, a camera-based computer vision system and a lidar suite to achieve its goal.
Mobileye, which Intel snapped up in 2017, has been testing its mapping technology in Munich and plans to use cameras built into production vehicles to map the world. The company claims to have already mapped nearly a billion kilometres, setting a foundation for autonomous cars to follow.
Its project relies on two independent computer vision systems to ensure that vehicles are safer in self-driving mode than if a human were controlling the car. One is a camera-based system that is advanced enough to power the car autonomously, and the other is a lidar and radar-based system that’s strong enough to do the same thing.
The two approaches are fused, along with the 3D maps, allowing “safety-critical performance that is at least three orders of magnitude safer than humans,” according to Mobileye. Pending regulatory approval, Mobileye will expand its fleet of autonomous test vehicles to New York City by the end of the year.
The Munich-based start-up Blickfeld showed two new lidar sensors for cars meant to hit the market in three to four years. The 3D sensors, dubbed Vision Mini and Vision Plus, are designed to produce a surround-view “that is crucial for automated urban traffic as well as robotic vehicles,” according to the company.
The Mini is small, roughly five centimetres long, and is meant to detect closer-range objects around a vehicle. It’s customisable to fit within a vehicle’s design scheme, according to the company. The larger Vision Plus can detect objects 200 metres in front of and behind cars with self-driving features. Together, they’re designed to enable cars to handle more than one automated task at a time.
A combination of six sensors is needed for a 360-degree view, unlocking Level 4 autonomous capabilities, says Florian Petit, founder of Blickfeld. The company is working with production partners to meet what it sees as a rapidly increasing demand.
“We saw that there’s a huge gap between the cars produced to be autonomous eventually, and the number of lidars produced,” Petit said.
The Washington Post