Many autonomous-driving development plans call for deploying a handful of solid-state Lidar sensors on each vehicle, but the Lidar modules used for today’s prototype vehicles all are mechanical systems with moving parts. That’s prompted huge interest in Lidar, with OEMs and suppliers racing to invest in non-mechanical technologies.
Several small companies have developed solid-state Lidar technologies that aren't yet ready for automotive applications, and some of those firms have been gobbled up by major automotive companies over the past 18 months. Ford made a large investment in Velodyne, while ZF bought a 40% stake in Ibeo. Continental acquired Advanced Scientific Concepts. Analog Devices Inc. (ADI) acquired Vescent Photonics Inc.
The interest stems from Lidar's use of emitted laser light to measure the distance to objects, functioning much like radar. The laser lets the system provide high-resolution imagery at night and in rain or snow.
“High-resolution 'flash' Lidar is a necessary technology for autonomous driving because its capabilities are available in all lighting and weather conditions,” said Dean McConnell, Director of Customer Programs, Advanced Driver Assistance Systems, at Continental North America. “We’re capturing images at 30 Hz, constructing 3D point clusters thirty times per second.”
The technology also helps safety systems zero-in on objects of interest. That’s important to determine whether an object is a threat to driving.
“Lidar acts more like the human eye: it views a broad scene, doing a quick scan, then if it sees something interesting, it can focus in on that,” said Chris Jacobs, General Manager of Automotive Safety for ADI.
Lidar providers currently are racing to develop compact solid-state modules because the large mechanical pucks now used by autonomous-driving researchers are too bulky and costly to go into production vehicles. Researchers are striving to shrink sizes and come up with a good combination of distance and field of view.
“Our solid-state box measures 9 x 6 x 6 cm, about the size of two decks of cards,” said Louay Eldada, Quanergy's CEO. “Currently, it has a 120-degree field of view, so with three you have 360 degree coverage. There will always be two in the front, on the right and left sides, and one in the back middle or one on each corner.”
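The coverage arithmetic Eldada describes can be sanity-checked with a short sketch. The module headings below are illustrative choices for a front-left/front-right/rear-center layout, not Quanergy specifications:

```python
def covered(bearing_deg, module_headings_deg, fov_deg=120.0):
    """Return True if a bearing falls inside any module's field of view."""
    for heading in module_headings_deg:
        # Smallest angular difference between bearing and module boresight
        diff = abs((bearing_deg - heading + 180.0) % 360.0 - 180.0)
        if diff <= fov_deg / 2.0:
            return True
    return False

# Hypothetical layout: front-left, front-right, rear-center
modules = [60.0, -60.0, 180.0]

# Three 120-degree modules leave no gap anywhere around the vehicle
assert all(covered(b, modules) for b in range(360))
```

With any two of the three modules removed, gaps appear immediately, which is why the quoted layouts always use at least three units.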
The range at which the vehicle can detect objects, a key parameter for safety, can be increased by narrowing the field of view. Developers are trying to achieve the same distance levels as cameras and radar, with a goal of around 200 m (656 ft). To achieve desirable distance performance, several tradeoffs are being considered. Mounting locations are key parameters that help determine field-of-view coverage; modules looking to the sides, for example, won't need the same range capability as forward-facing units, so their field of view can be wider.
“We’ve demonstrated 70 meters (230 ft) with a 15-degree field of view, which is clearly not sufficient,” said Aaron Jefferson, Director of Product Planning for ZF’s Active and Passive Safety Division. “It needs to go up to 50 or 60 degrees to start. When the cost gets down, it’s conceivable that they could be integrated into taillights and headlights.”
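The range-versus-field-of-view tradeoff can be illustrated with a back-of-envelope model: with a fixed number of measurement points per scan line, a narrower field of view packs those points more densely, so a small object stays resolvable at longer range. All numbers below are made up for the example, not figures from ZF or any other supplier:

```python
import math

def max_resolvable_range(fov_deg, points_per_line, object_width_m, min_hits=3):
    """Farthest range at which an object still returns min_hits points
    on one scan line, given the sensor's angular sampling step."""
    angular_step_rad = math.radians(fov_deg) / points_per_line
    # The object must span at least min_hits angular steps to be resolved
    return object_width_m / (min_hits * angular_step_rad)

# Same 1000-point scan line, 0.5 m object, three different fields of view
for fov in (120, 60, 15):
    r = max_resolvable_range(fov, 1000, 0.5)
    print(f"{fov:>3} deg FOV -> ~{r:.0f} m")
```

Halving the field of view roughly doubles the resolvable range, which is why the narrow 15-degree demonstrator reaches farther than a wide-angle unit ever could with the same sensor budget.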
Lidar will complement cameras and radar, providing information that typically will be "fused" with that from other sensors to create a reliable image of vehicle surroundings. All these sensors generate a huge amount of data, making communications and data management an important factor in overall designs.
“3D Lidar sensing will create a significant amount of data, but similar to radar and camera, there are software techniques to help minimize the amount of data, eliminate useless or unimportant data and extract the detail from the data of concern,” Jefferson said. “Furthermore, the techniques used to filter data, group/cluster data, identify objects, etc. also determine the amount of data that needs to be processed, which is the real concern in terms of managing data volume.”
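The kind of data reduction Jefferson describes, dropping useless points and thinning the rest before object identification, can be sketched minimally as a range filter plus a coarse voxel grid. The parameter values are illustrative only:

```python
def reduce_point_cloud(points, max_range=200.0, voxel=0.5):
    """points: iterable of (x, y, z) tuples in meters.
    Drops points beyond max_range, then keeps at most one point per
    voxel-sized cube so dense nearby returns collapse to one sample."""
    seen = set()
    kept = []
    for x, y, z in points:
        if (x * x + y * y + z * z) ** 0.5 > max_range:
            continue  # outside the sensing range of interest
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        if key in seen:
            continue  # this voxel is already represented
        seen.add(key)
        kept.append((x, y, z))
    return kept

cloud = [(0.1, 0.2, 0.0), (0.2, 0.3, 0.1), (150.0, 0.0, 0.0), (500.0, 0.0, 0.0)]
print(reduce_point_cloud(cloud))  # two nearby points collapse; far point dropped
```

Production systems use far more sophisticated clustering and classification, but the principle is the same: most raw points never need to reach the fusion stage.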
Curiously, no real hurry
Though there’s plenty of development, the market isn’t expected to see much activity for some time. Many engineers say Lidar can develop slowly while waiting for autonomous vehicle designs to solidify. For now, system designers can create prototypes using mechanical components while they wait for next-generation modules.
“Solid-state Lidar will be in production later this year, but for pilots and software development, you don’t need solid-state,” Eldada said. “Though we plan to ship solid state products in Sept., we won’t have automotive-grade parts ready until a year later.”
The rollout of Lidar-equipped vehicles is as murky as the emergence of autonomous cars. Corporate fleet programs like Uber's current autonomous-driving tests in Pittsburgh may expand into market opportunities before mainstream OEMs start ordering Lidar sensors.
“We’re looking at series production in the 2021 timeframe, but it may happen faster in different segments,” McConnell said. “Some fleet-service companies are aggressive about getting vehicles out with automated driving in a geomapped area.”
Once Lidar is in use, many developers don't expect it to displace other sensors. A range of technologies is still required to provide the capability and redundancy needed to drive autonomously in all weather conditions.
“We do not see 3D Lidar as a sensor replacement, but rather as an innovation that can enable the high-resolution sensing needed to realize SAE Level 4-plus automated driving,” Jefferson said. “3D solid-state Lidar, camera, radar, ultrasonic sensing and other technologies will continue to play a role—a combination of these will be necessary to properly sense the vehicle environment in 360 degrees, in real time.”
That’s not a universal conclusion, however.
“Ultrasonics will go away,” Eldada countered. “Video is needed for color, things like seeing traffic lights. Fusing Lidar and cameras 'colorizes' our data so it’s more valuable. Radar is needed for redundancy; you need another sensor before deciding to steer or hit the brakes.”
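The lidar-camera "colorizing" Eldada mentions is, at its core, a projection problem: each lidar point is projected through a camera model and picks up the pixel color at that location. The pinhole intrinsics below and the assumption that points are already in the camera frame are simplifications for illustration, not any real system's calibration:

```python
def colorize(points, image, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """points: (x, y, z) in the camera frame, z pointing forward.
    image: 2D list of RGB tuples. Returns (point, color) pairs for
    every point that lands inside the image."""
    colored = []
    h, w = len(image), len(image[0])
    for x, y, z in points:
        if z <= 0:
            continue  # behind the camera, not visible
        u = int(fx * x / z + cx)  # pinhole projection to pixel column
        v = int(fy * y / z + cy)  # pinhole projection to pixel row
        if 0 <= u < w and 0 <= v < h:
            colored.append(((x, y, z), image[v][u]))
    return colored
```

In a real vehicle the two sensors also need an extrinsic calibration (a rotation and translation between lidar and camera frames) and time synchronization; the payoff is a point cloud whose every return also carries color, which is what makes the fused data "more valuable."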
Author: Terry Costlow
Source: SAE Automotive Engineering Magazine