Delphi will use CES 2017 to demonstrate an advanced new automated-driving technology platform the company plans to make available to automakers by 2019 as a complete system to enable vehicles to operate with SAE Level 4-5 fully autonomous capability.
The Delphi technology platform, called Central Sensing Localization and Planning (CSLP), leverages the latest supercomputing microprocessor chip from Intel, as well as a new sensor-fusion processor and tri-focal camera hardware from Israel-based machine-vision specialist Mobileye.
But apart from the pure microprocessing power Delphi’s assembled for the CSLP system, perhaps its most unique advance is new, Mobileye-developed software called Road Experience Management. REM provides the vehicle with crowd-sourced information to create an ultra-precise, real-time map that Mobileye said “is a prerequisite for safe autonomous driving.”
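The crowd-sourcing step at the heart of REM can be illustrated with a toy aggregation sketch. The data, function name and simple-mean fusion below are illustrative assumptions, not Mobileye's actual pipeline: many vehicles each report a slightly noisy position for the same fixed landmark, and fusing the reports yields a map coordinate far more precise than any single sighting.

```python
from statistics import mean

def aggregate_landmark(reports):
    """Fuse noisy (x, y) landmark sightings from many vehicles into one
    refined map coordinate. A simple mean is used here; a production
    system would use robust, weighted estimation."""
    xs, ys = zip(*reports)
    return (mean(xs), mean(ys))

# Ten passing vehicles each report the same stop sign with small offsets.
reports = [(100.0 + dx / 10, 50.0 - dx / 20) for dx in range(-5, 5)]
refined = aggregate_landmark(reports)
```

Averaging across fleets is what lets modest per-vehicle sensors build an "ultra-precise" shared map over time.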
The Las Vegas demonstration during CES 2017 is on a 6.3-mi (10.1-km) course comprised of public roads that Delphi claimed is “the most complex automated drive ever publicly demonstrated on an urban and highway combined route.”
At a media information background session prior to the CES unveiling of the CSLP system, Glen De Vos, vice-president of services for Delphi, enthused that the CES demonstration is “the first time we will showcase this (combined Delphi, Mobileye and Intel technology) together. We couldn’t be more excited.
“Three factors will separate the leader from the pack in the race to offer driverless vehicles by 2019,” De Vos continued in a release: “best-in-class perception sensors such as cameras, radar and LiDAR, automotive experience and computer processing speed.”
De Vos told the media that the system, because of its advanced vision sensing and software, will be less expensive than others in development that rely on still-costly LiDAR sensors to generate adequate data about the environment around the vehicle.
He said Delphi projects its turnkey system will cost on the order of $5,000, but that figure is of course expected to plummet as costs are reduced and sales volumes increase. He also said Delphi currently has no “committed customers” for the system.
Assembling the players and technologies
For autonomous-driving development, Delphi will continue its established role as a Tier 1 integrator of technology in order to sell a complete system to automakers, many of which either do not have the resources to develop their own automated-driving system or are not inclined to do so, preferring for suppliers to make the investment—and, potentially, to shoulder the early-adoption liabilities.
In August 2016, Delphi announced its partnership with Mobileye to develop the sensor-fusion aspects required for high-level automated driving. A few months later, the company confirmed it had enlisted Intel to supply the advanced processing chipset that will enable trillions of calculations per second.
In 2015, Delphi acquired Ottomatika, an automated-driving software engineering specialist spun off from research at Carnegie Mellon University. It was Ottomatika driving-software algorithms that subsequently helped Delphi achieve a fully autonomous cross-country vehicle trip in 2015.
Not long before its partnership with Delphi, Mobileye, which has contracts with 27 automakers to supply some type of advanced driver-assistance technology, was on the front lines of autonomous driving development’s most notable setback to date: a Tesla car using Mobileye camera vision and software crashed while under autonomous control, killing the driver.
It is Mobileye’s latest vision-sensing hardware and software, however, that is at the center of the Delphi CSLP system’s sensor-fusion and software capabilities—particularly the REM software “overlay” on its EyeQ 4/5 System-on-a-Chip microprocessor. This enables the camera-vision capabilities alone to position the vehicle with 10-cm (3.9-in) accuracy—even in the absence of a Global Positioning System (GPS) signal.
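A toy example can show how matching against known map landmarks yields a position fix without GPS. The sketch below uses simple range measurements to three mapped landmarks and a linearized trilateration solve; this is an illustrative assumption, since REM actually localizes from visual features and bearings rather than plain ranges.

```python
def locate(landmarks, ranges):
    """Estimate vehicle (x, y) from ranges to three landmarks with known
    map coordinates, by subtracting the circle equations pairwise to get
    a 2x2 linear system and solving it by Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = landmarks
    r0, r1, r2 = ranges
    # 2*(xi - x0)*x + 2*(yi - y0)*y = bi  for i = 1, 2
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Vehicle truly at (6, 8); its ranges to the three landmarks are 10, 8, 6.
x, y = locate([(0.0, 0.0), (6.0, 0.0), (0.0, 8.0)], [10.0, 8.0, 6.0])
```

With precise landmark coordinates in the map, even coarse per-landmark measurements pin the vehicle down tightly, which is why map quality directly drives the achievable positioning accuracy.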
Dan Galves, senior vice-president and chief communications officer, said of REM, “This really is what Mobileye is offering to the industry.” He added that in internal development, a test vehicle was able to drive autonomously using only camera vision after just four circuits of a congested portion of I-75 near Detroit supplied the necessary visual data for REM.
The REM software is “really in the validation phase right now,” said Galves, who added that Mobileye also has supplied the REM software to EyeQ-based vision systems being used by GM, Nissan and Volkswagen.
Next-gen accuracy
Real-time understanding of the “local” environment with REM is “the key element of automated driving” using Delphi’s new autonomous-technology platform, said De Vos. He explained that the company’s recent automated mobility on demand (AMOD) pilot program in Singapore will adopt the REM-based system. Delphi also plans AMOD demonstration programs in North America and Europe (likely cities in the U.S.: Pittsburgh and Boston).
Ironically, despite the high-powered processing capacity Delphi is building into its CSLP platform with the help of Mobileye and Intel, the REM software itself does not require much memory capacity. The roadside signs, buildings and countless other fixed landmarks throughout the nation that the system employs for a large portion of its localization “knowledge” need only about 60 GB of data storage.
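A quick back-of-envelope check makes the modest 60-GB figure plausible. The per-landmark record size and total road mileage below are illustrative assumptions, not figures from Delphi or Mobileye:

```python
# How many compact landmark records fit in the ~60 GB cited for REM?
STORAGE_BYTES = 60 * 10**9     # 60 GB storage budget from the article
BYTES_PER_LANDMARK = 100       # assumed record: position, type, small descriptor
US_ROAD_KM = 6_600_000         # rough total length of U.S. public roads

landmarks = STORAGE_BYTES // BYTES_PER_LANDMARK   # total records that fit
per_km = landmarks / US_ROAD_KM                   # records per road-kilometer
```

Under these assumptions, 60 GB holds some 600 million records, roughly 90 landmarks per kilometer of road, which is consistent with storing only compact descriptions of signs, buildings and other fixed markers rather than raw imagery.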
But at least for now, advanced camera vision and REM aren’t replacing the highly accurate onboard mapping capability available from LiDAR sensing. Delphi’s De Vos said the CES demonstration vehicles will have six electromechanical LiDAR sensors, not to mention radar, to augment vision sensing.
Delphi sees AMOD projects as the likely first candidates for high-level (SAE Level 4-5) autonomous driving. De Vos noted that autonomous buses, mobility “pods” and individual ride-share passenger cars probably will be the best initial deployments for high-level autonomy.
Author: Bill Visnic
Source: SAE Automotive Engineering Magazine