iMotions uses neuroscience and AI-driven analytics tools to enhance the tracking, assessment and design of human-machine interfaces for in-vehicle systems.
The advancement of vehicles with enhanced safety and infotainment features has made evaluating human-machine interfaces (HMI) in modern commercial and industrial vehicles crucial. Drivers face a steep learning curve due to the complexities of these new technologies. Additionally, the interaction with advanced driver-assistance systems (ADAS) increases concerns about cognitive impact and driver distraction in both passenger and commercial vehicles.
As vehicles incorporate more automation, many clients are turning to biosensor technology to monitor drivers’ attention and the effects of various systems and interfaces. Utilizing neuroscientific principles and AI, data from eye-tracking, facial expressions and heart rate are informing more effective system and interface design strategies. This approach ensures that automation advancements improve rather than hinder the driving experience.
The integration of HMI systems in vehicle design is evolving rapidly, focusing on enhancing user-vehicle interactions. A deep understanding of human-factors engineering is essential for creating safe, efficient and user-friendly driving experiences. Companies like iMotions and Smart Eye are using behavioral research and eye-tracking to pioneer new HMI design principles.
The role of neuroscience in HMI design
Human factors are critical in designing HMIs for transportation systems, with designs needing to accommodate the abilities and limitations of human drivers. This involves ergonomic design, cognitive psychology and user experience (UX) research to develop interfaces that are user-friendly, safe and intuitive.
The aim is to create systems that align with human behavior and cognitive processes, reducing errors and improving usability. Advances in neuroscience have greatly enhanced the tracking, assessment and design of HMIs, moving beyond traditional interview-based methods to more sophisticated, neuroscience-based techniques.
Modern approaches include using camera-based eye trackers like Smart Eye Pro, AI-powered facial expression analysis tools like Affdex, and methods like electrodermal response and electrocardiography (ECG), once limited to research labs. These tools are now applied to develop commercial HMI applications, showcasing their relevance and utility in improving system design.
Vehicle HMI design must manage the driver’s cognitive load, the mental effort required in working memory. Human-factors experts aim to design interfaces that simplify information processing, reducing cognitive load to prevent confusion and potential hazards. This involves organizing information logically, minimizing complexity, and using visual and interactive hierarchies to highlight essential functions.
For example, non-essential controls like A/C, music and cruise control are placed on the steering wheel, allowing drivers to use them without looking away from the road. New features added to the dashboard are strategically positioned for safety and ease of use. iMotions software helps measure how long drivers divert their gaze from the road to interact with these tools.
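Off-road glance time of the kind described above is typically derived from timestamped gaze samples labeled with an area of interest (AOI). The following is a minimal illustrative sketch of such a metric; the sample format, AOI labels and function name are assumptions for illustration, not the iMotions API:

```python
# Hypothetical sketch: total and longest off-road glance duration from
# timestamped gaze samples labeled with an area of interest (AOI).
# Sample format and AOI labels are illustrative, not a real API.

def off_road_glances(samples, road_aoi="road"):
    """samples: list of (timestamp_s, aoi_label) sorted by time.
    Returns (total_off_road_s, longest_glance_s)."""
    total = 0.0
    longest = 0.0
    glance_start = None
    for t, aoi in samples:
        if aoi != road_aoi and glance_start is None:
            glance_start = t                  # glance away begins
        elif aoi == road_aoi and glance_start is not None:
            dur = t - glance_start            # glance away ends
            total += dur
            longest = max(longest, dur)
            glance_start = None
    if glance_start is not None:              # glance still open at end
        dur = samples[-1][0] - glance_start
        total += dur
        longest = max(longest, dur)
    return total, longest

samples = [(0.0, "road"), (0.5, "infotainment"), (1.7, "infotainment"),
           (2.0, "road"), (3.0, "mirror"), (3.4, "road")]
total, longest = off_road_glances(samples)
print(total, longest)  # total ~1.9 s off-road, longest glance 1.5 s
```

The longest-glance figure matters in practice: guidelines such as NHTSA's visual-manual distraction recommendations focus on individual glance durations, not just cumulative time.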
Optimizing user-centric design
Ergonomics play a critical role in HMI design, focusing on the physical interaction between users and vehicle interiors. This includes the strategic placement of controls and displays, ensuring they are easy to operate and provide adequate feedback. A well-executed ergonomic design enhances user comfort and efficiency, reducing the likelihood of strain, errors and accidents. Designing for diverse users involves complex assessments traditionally conducted through iterative testing or focus groups.
The core of this approach lies in user-centered design, which incorporates user feedback from the initial stages to create intuitive and satisfying interfaces. This method helps overcome traditional design challenges like biased or inadequate feedback by using biosensors that record detailed reactions in real time. By capturing these instantaneous responses, developers can discern critical factors that differentiate effective from ineffective designs, leading to better, more user-focused systems.
Human factors in HMI design go beyond just cognitive and physical aspects. They also consider the emotional and psychological impact of interfaces on users. This involves understanding how design elements can affect mood and stress levels and designing interfaces that create positive emotional connections with the user. For instance, the use of color, shape and texture can influence a user's perception and emotional response to a system. Tools such as driver facial expression monitoring, ECG and muscle tension can provide valuable insights into how drivers and passengers react to the cabin environment; once the individual elements of form and function are identified, that environment can be improved.
Safety is paramount in HMI design, where human-factors expertise is crucial for reducing errors and accidents. This requires designing interfaces that are straightforward, predictable and forgiving of user errors. Accessibility is also essential, ensuring interfaces are usable by people of various abilities, including those with disabilities.
Additionally, tools used for measuring user responses are now employed in developing driver monitoring systems. These systems detect states like drowsiness or distraction, helping to mitigate risky driving behaviors and enhance safety. Truck and car companies are increasingly utilizing these technologies to incorporate human-in-the-loop systems that improve overall driving safety.
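One widely used drowsiness indicator in driver monitoring research is PERCLOS, the percentage of time within a window that the eyes are substantially closed. The sketch below is purely illustrative; the thresholds, data format and alert logic are assumptions, not taken from any specific driver-monitoring product:

```python
# Illustrative sketch of PERCLOS, a common drowsiness indicator:
# the fraction of frames in a window where the eyes are nearly closed.
# Threshold values and data format are assumptions for illustration.

def perclos(eye_openness, closed_threshold=0.2):
    """eye_openness: per-frame eyelid openness in [0, 1] (1 = fully open).
    Returns the fraction of frames counted as closed."""
    if not eye_openness:
        return 0.0
    closed = sum(1 for o in eye_openness if o <= closed_threshold)
    return closed / len(eye_openness)

# 60 frames: mostly open, with a long eyelid droop in the middle.
window = [0.9] * 20 + [0.1] * 25 + [0.85] * 15
score = perclos(window)
print(f"PERCLOS = {score:.2f}")
if score > 0.15:                  # alert threshold is illustrative
    print("possible drowsiness: consider alerting the driver")
```

Production systems combine such metrics with head pose, blink rate and gaze data rather than relying on a single score.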
Emotion analytics and advanced eye-tracking
iMotions specializes in integrating emotion analytics into the HMI design process. The approach aims to understand the emotional and cognitive states of drivers in real time, using advanced sensor technologies to capture data on eye movement, facial expressions and physiological responses. This data-driven approach helps designers recognize how drivers interact with various HMI elements, identifying areas of cognitive overload, distraction or stress.
By utilizing such technology, vehicle designers can make informed decisions about the layout, complexity and functionality of HMI systems. For instance, insights into gaze patterns can inform the optimal placement of critical information on dashboards, ensuring that drivers can access the information they need without diverting attention from the road. Similarly, monitoring physiological responses during interactions with infotainment systems can help in designing interfaces that minimize cognitive strain.
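Gaze-pattern insights like these usually come from aggregating fixation durations per dashboard area of interest. A minimal sketch, with invented fixation records and AOI names, might look like this:

```python
# Hypothetical sketch: aggregate fixation durations per dashboard
# area of interest (AOI) to compare where drivers' attention goes.
# Fixation records and AOI names are invented for illustration.
from collections import defaultdict

def dwell_time_by_aoi(fixations):
    """fixations: iterable of (aoi_label, duration_s).
    Returns dict of AOI -> (total dwell s, share of all dwell)."""
    totals = defaultdict(float)
    for aoi, dur in fixations:
        totals[aoi] += dur
    grand = sum(totals.values()) or 1.0
    return {aoi: (t, t / grand) for aoi, t in totals.items()}

fixations = [("speedometer", 0.4), ("infotainment", 1.2),
             ("speedometer", 0.3), ("mirror", 0.5),
             ("infotainment", 0.9)]
for aoi, (total, share) in sorted(dwell_time_by_aoi(fixations).items(),
                                  key=lambda kv: -kv[1][0]):
    print(f"{aoi:12s} {total:.1f}s ({share:.0%})")
```

A disproportionate dwell share on an infotainment AOI, as in this toy data, is exactly the kind of signal that would prompt a layout or complexity review.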
Integrating iMotions’ advanced emotion analytics and biometric sensing within simulators provides insights into drivers’ cognitive loads, emotional states and physiological responses. iMotions Software employs various biometric measurements, including eye movements, facial expressions, heart rate variability and EEG (electroencephalogram) to study how drivers respond to different HMI elements during simulations. This real-time data enriches understanding of driver behavior, helping designers pinpoint which aspects of HMI design enhance intuitive use and which could cause confusion.
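Heart rate variability, one of the measurements mentioned above, is commonly summarized from beat-to-beat (RR) intervals; RMSSD is a standard time-domain HRV metric. The sketch below assumes clean RR-interval input; real ECG pipelines must first detect R-peaks and remove artifacts:

```python
# Minimal sketch of RMSSD, a standard time-domain heart rate
# variability (HRV) metric: the root mean square of successive
# differences between beat-to-beat (RR) intervals. The RR values
# below are invented for illustration.
import math

def rmssd(rr_intervals_ms):
    """rr_intervals_ms: consecutive RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    if not diffs:
        raise ValueError("need at least two RR intervals")
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 798, 820]  # ~74 bpm with moderate variability
print(f"RMSSD = {rmssd(rr):.1f} ms")  # RMSSD = 23.8 ms
```

Lower RMSSD during an interaction is often read as higher sympathetic load, which is why HRV is paired with self-report and performance measures rather than interpreted alone.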
The collaboration with Dutch simulator manufacturer Cruden enhances this process, enabling swift evaluations and adjustments. This method is actively used at the University of Michigan-Dearborn's Driving Simulator Lab, for example, to align interfaces more closely with human capabilities.
Advanced eye-tracking technology provided by the multi-camera Smart Eye Pro system allows HMI designers and researchers to accurately monitor where and for how long a driver looks at different areas of the vehicle interior. It also reveals how a driver interacts with the cabin in real time via 3D wireframe modeling without any distracting influence from wearables. This technology is beneficial in the development of adaptive HMI systems that can dynamically adjust based on the driver’s focus.
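Mapping gaze onto a 3D cabin model boils down to intersecting the driver's gaze ray with modeled interior surfaces. The following is a simplified, hypothetical sketch of that idea using ray-plane intersection; the coordinate frame, AOI planes and names are invented and unrelated to Smart Eye's actual wireframe models:

```python
# Hypothetical sketch: classify which cabin surface a gaze ray hits,
# in the spirit of mapping gaze onto a 3D cockpit model. Geometry,
# AOI names and coordinates are invented for illustration.

def ray_plane_t(origin, direction, plane_point, plane_normal):
    """Return ray parameter t where the ray hits the plane, or None."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:          # ray parallel to plane
        return None
    t = sum((p - o) * n for o, p, n in
            zip(origin, plane_point, plane_normal)) / denom
    return t if t > 0 else None    # only count hits in front of the eye

# Cabin AOIs as infinite planes: (name, point on plane, normal).
aois = [("windshield",   (0.0, 0.0, 1.0), (0.0, 0.0, -1.0)),
        ("center_stack", (0.4, -0.3, 0.6), (-1.0, 0.0, 0.0))]

def classify_gaze(origin, direction):
    hits = [(t, name) for name, p, n in aois
            if (t := ray_plane_t(origin, direction, p, n)) is not None]
    return min(hits)[1] if hits else None   # nearest surface hit wins

print(classify_gaze((0, 0, 0), (0.0, 0.0, 1.0)))  # -> windshield
```

Real systems use bounded meshes rather than infinite planes, but the nearest-intersection logic is the same.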
By combining the strengths of both emotion analytics and eye-tracking technology, and leveraging advanced simulator systems from Cruden, designers can gain a better understanding of the driver’s physical and emotional state. The potential synergy between these technologies holds great promise for the future of HMI design in vehicles.
Nam Nguyen, technical partnership manager and senior neuroscience product specialist for iMotions, contributed this article for SAE Media.