A predatory dragonfly’s ability to detect, track and anticipate the escape maneuvers of a juicy target may provide a link to making autonomous driving safer.
Dr. Steven Wiederman, a research supervisor at the University of Adelaide Medical School in Australia, believes that a target-detecting neuron in the dragonfly's tiny brain, which anticipates movement, could provide a link to vehicle vision systems. The dragonfly's visual focus on prey is so strong that it can lock on even when its target is embedded against a background of "clutter."
To demonstrate the neuron's potential for safer autonomous mobility applications, a university research team is using an autonomous wheeled robot platform to test sensing techniques derived from the dragonfly. It's part of a collaborative research project being conducted by Dr. Wiederman's group and a team at Lund University in Sweden.
The Australian researchers discovered that the target-detecting neuron enhances the dragonfly's responses in a small focus area just ahead of the moving object being chased. Even if the dragonfly loses sight of its prey, this focus spreads forward over time, allowing the insect's brain to predict the target's likely track and subsequent reappearance and so re-establish target acquisition.
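The forward-spreading focus described above can be caricatured as simple extrapolation of a lost target's track. The sketch below is our illustration, not the researchers' model: it assumes roughly constant target velocity and projects the last observed motion forward for the steps the target is occluded.

```python
# Hypothetical sketch (not the authors' model): constant-velocity
# extrapolation of a target's position after it is lost from view,
# analogous to the neuron's forward-spreading focus area.

def predict_track(history, steps_ahead):
    """Extrapolate a future position from the last two observations.

    history: list of (x, y) positions seen before the target was lost.
    steps_ahead: how many time steps to project forward.
    """
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = x1 - x0, y1 - y0          # estimated velocity per step
    return (x1 + vx * steps_ahead, y1 + vy * steps_ahead)

# Target moving right and slightly up, then occluded for 3 steps:
seen = [(0.0, 0.0), (1.0, 0.5)]
print(predict_track(seen, 3))  # -> (4.0, 2.0)
```

A real tracker would of course filter noisy observations (e.g. with a Kalman filter); the point here is only the predict-then-reacquire idea.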
A problem facing autonomous vehicles is priority decision making with at least two choices in given traffic situations (or in dragonfly terms, targets). In an interview with Automotive Engineering, Dr. Wiederman explained how the entomological study can help.
“Biological brains have the ability to competitively select one stimulus amidst distracters. We found a neuron in the dragonfly brain that exhibits such selective attention. When presented with two moving targets, the neuron selects just one—sometimes even switching between them mid-trial.
"Sometimes the neuron can ‘lock-on’ to a less salient stimulus," he noted. "We are currently investigating what properties of the target make it the one chosen—is it timing, saliency or trajectory? Is it only attributes of the stimulus, or is the dragonfly choosing the target by some high-order, internal workings in its brain? Finally, when is it appropriate to lock-on, and when is it time to switch to a more salient object?"
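The competitive selection Dr. Wiederman describes can be sketched as a winner-takes-all gate. This is an illustrative toy, with names and salience scores of our own invention, not the study's model: the "neuron" responds only to the most salient stimulus and suppresses the distracters.

```python
# Illustrative winner-takes-all selection among competing stimuli
# (our sketch; scores and names are invented for the example).

def winner_takes_all(saliences):
    """Return (winner_index, response_vector) for competing stimuli."""
    winner = max(range(len(saliences)), key=lambda i: saliences[i])
    # The response follows the winner alone; distracters are gated to zero.
    response = [s if i == winner else 0.0 for i, s in enumerate(saliences)]
    return winner, response

# Two moving targets; the more salient one captures the response:
print(winner_takes_all([0.3, 0.7]))  # -> (1, [0.0, 0.7])
```

The mid-trial switching the researchers observed would correspond to re-running the competition as saliences change, sometimes with a bias toward the currently locked-on target.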
By studying this tractable model system, the researchers hope to gain insight into how more complex brains select among stimuli, e.g. how a human driving along a road chooses from multiple stimuli. They are therefore developing models for autonomous, moving platforms based on the dragonfly's selection processes.
A unique robotic platform
There are three reasons for using the dragonfly, according to Dr. Wiederman, who also heads the Visual Physiology and Neurobotics Laboratory at the Australian Research Council (ARC) Centre for Nanoscale BioPhotonics. First, it's a fine animal model for electrophysiological recordings; second, it's one of the world's most effective predators; and third, it exhibits interesting "high-order" processing, e.g. prediction and selection, that may not be exhibited by simpler insects such as the house fly.
Does the dragonfly's eye have similarities to that of a human? Dr. Wiederman explained that the compound eye has thousands of individual lenses focusing light onto the retina, while the human eye has a single lens doing the same job. The dragonfly has much lower resolution (visual acuity of only ~0.8°), whereas humans have a central fovea of very high acuity. The dragonfly also has only about 10° of binocular overlap, and its eyes are too close together, so it must use other techniques for depth perception (e.g. motion parallax).
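Motion parallax gives depth without wide-set eyes: a sideways move of known size makes a nearby object shift by a larger angle than a distant one. The numbers below are purely illustrative, using the small-angle approximation distance ≈ baseline / angular shift (in radians).

```python
import math

# Hedged sketch of depth from motion parallax (illustrative numbers,
# not measurements from the study): an observer translating sideways
# by a known baseline infers distance from the apparent angular shift.

def depth_from_parallax(baseline_m, angular_shift_deg):
    """Estimate distance (m) via the small-angle approximation."""
    return baseline_m / math.radians(angular_shift_deg)

# A 0.01 m sideways move producing a 0.5 degree shift implies ~1.15 m:
print(round(depth_from_parallax(0.01, 0.5), 2))  # -> 1.15
```

Note the trade-off the approximation makes explicit: the smaller the angular shift an eye can resolve, the farther the depths it can estimate from a given baseline.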
But Dr. Wiederman noted that there are many similarities in how the underlying neurons process visual information across a diverse range of animal species. Using human psychophysics experiments, his team examines which types of processing are evident in both humans and dragonflies.
The camera-equipped robotic platform is a Clearpath Robotics’ Husky A200 using an open-source serial protocol. Configured at the university, it has been designed to replicate the dragonfly’s target-tracking capability via its predictive pursuit of prey. The researchers believe it to be a technology “first” in such a context.
“We use different camera systems dependent on what we are testing. We test movement of the camera to emulate eye and head movements independent of body directions," he said. The use of a larger ground vehicle provides flexibility to test computationally expensive algorithms.
"We test ‘active vision’—how the moving platform itself affects the algorithms in a closed loop. From the computational neuroscience, the team develops models for autonomous selection and pursuit of a moving target.
"We hit ‘go’ and see what the Clearpath Husky ground vehicle autonomously pursues,” Dr. Wiederman asserted.
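The "active vision" closed loop described in the quotes above can be sketched in a few lines. This is our caricature, not the Lund/Adelaide controller: the platform repeatedly senses its bearing error to the chosen target and turns a fraction of that error, so its own motion feeds back into the next measurement.

```python
# Minimal closed-loop pursuit sketch (our illustration): proportional
# steering toward a target bearing. Each iteration the platform turns,
# which changes the error it senses on the next pass.

def pursue(target_bearing_deg, heading_deg, gain=0.5, steps=8):
    """Return the heading after `steps` proportional corrections."""
    for _ in range(steps):
        error = target_bearing_deg - heading_deg
        heading_deg += gain * error   # turn part-way toward the target
    return heading_deg

# Starting 40 degrees off target, the heading converges toward 0:
print(round(pursue(0.0, 40.0), 2))  # -> 0.16
```

With gain 0.5 the residual error halves every step (40° × 0.5⁸ ≈ 0.16°); a real platform would add velocity control and the winner-takes-all target selection discussed earlier.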
The CSTMD1 neuron
While it is one thing for artificial systems to see moving targets, tracking their movement so that the vehicle can then steer out of their path is a critical requirement for self-steering vehicles. The researchers found that the neuron (CSTMD1) in dragonflies not only predicts where a target will reappear but also traces its movement from one eye to the other, even across the brain's hemispheres, Dr. Wiederman reported.
A study of CSTMD1 was recently published in the journal eLife. The article stated that a diverse range of animals have the capability of detecting moving objects within cluttered environments. It added that this discrimination is a complex task, particularly in response to a small target generating very weak contrast as it moves against a highly textured background.
The study refers to the "winner-takes-all" neuron in the dragonfly, which likely promotes such competitive selection of an individual target while ignoring distracters.
More information on the study of the implementation of CSTMD1 into the robot was published in July 2017 by the Journal of Neural Engineering.
The research project is an international collaboration funded by the Swedish Research Council, the Australian Research Council and the Swedish Foundation for International Co-operation in Research and Higher Education.
Author: Stuart Birch
Source: SAE Automotive Engineering Magazine