Project Description
Driver observation methods that have been used successfully in highly innovative R&D projects on simple driving tasks will be extended to complex driving tasks such as highway and city driving. DRIVOBS will uniquely combine:
- Task-related metrics such as speed, acceleration, jerk, steering angle, time to collision (TTC), time to line crossing (TLC), and new violation and error parameters predicting accident risk.
- System identification of driver control actions resulting from visual, motion (vestibular) and force stimuli in car following and steering. This will be a major innovation, as only partial identification in simple tasks has been published to date.
- Behaviour observation and physiological measurements regarding driver visual focus (head and gaze tracking), driver actions (e.g. hand motion) and driver state (ECG, EMG, GSR, facial expression, body motion, etc.). Instrumentation will be tailored, analysis of camera recordings will be automated to deal efficiently with the vast amounts of data generated, and results will be interpreted in terms of driver behaviour, focal attention and state.
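As a minimal illustration of the task-related metrics above, time to collision and jerk can be derived from basic vehicle signals. The function names, signal layout and sample values below are assumptions for this sketch, not project code:

```python
# Sketch of two task-related driving metrics; signal names and the
# sample data are illustrative assumptions, not DRIVOBS code.

def time_to_collision(gap_m, closing_speed_mps):
    """TTC: bumper-to-bumper gap divided by the closing speed
    (ego speed minus lead-vehicle speed). Undefined when not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def jerk(accel_mps2, dt_s):
    """Jerk: finite-difference time derivative of acceleration."""
    return [(a1 - a0) / dt_s for a0, a1 in zip(accel_mps2, accel_mps2[1:])]

# Example: 30 m gap, ego at 25 m/s, lead vehicle at 20 m/s.
ttc = time_to_collision(30.0, 25.0 - 20.0)
print(ttc)  # 6.0 s
```

Thresholding such metrics (e.g. TTC below a few seconds) is one common way to define the violation and error parameters that predict accident risk.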
In order to elicit the most realistic driving behaviour, enhanced motion and vision cueing will be provided. System identification will be used to quantify how motion, visual and force cues are used by the driver in various scenarios, and analysis in the frequency domain will disclose the relevance of the various frequency bands in simulator motion, thus supporting motion filter optimisation.
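The frequency-domain identification mentioned above can be sketched as follows: excite the driver with a multisine stimulus and estimate the frequency response only at the excited frequencies. The gain-plus-delay "driver" below is a stand-in assumption used to make the sketch self-contained:

```python
import numpy as np

# Sketch of frequency-domain identification of a driver describing
# function: a multisine excitation, then H(f) = Y(f)/U(f) estimated
# at the excited frequencies only. The gain-plus-delay "driver" is a
# stand-in assumption, not an identified DRIVOBS model.

fs, T = 100.0, 20.0                       # sample rate [Hz], duration [s]
t = np.arange(0, T, 1 / fs)
freqs = np.array([0.2, 0.5, 1.0])         # excited frequencies [Hz]
u = sum(np.sin(2 * np.pi * f * t) for f in freqs)  # multisine stimulus

gain, delay = 1.5, 0.2                    # stand-in driver parameters
y = gain * np.interp(t - delay, t, u, left=0.0)    # delayed response

U, Y = np.fft.rfft(u), np.fft.rfft(y)
f_axis = np.fft.rfftfreq(len(t), 1 / fs)
idx = [int(np.argmin(np.abs(f_axis - f))) for f in freqs]
H = Y[idx] / U[idx]                       # frequency response estimate
mag = np.abs(H)                           # recovers the gain per band
tau = -np.angle(H) / (2 * np.pi * f_axis[idx])  # recovers the delay [s]
```

Comparing such estimates across frequency bands, and across motion/visual/force conditions, is what supports the motion filter optimisation described above.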
- Motion platform and motion filter guidelines providing optimal realism will be developed for the scenarios selected.
- Resolution, timing and quality of the visual field will be varied in order to demonstrate the relevance of high-end visuals for the scenarios tested and to indicate whether medium-fidelity visuals may be acceptable under certain conditions.
- Force cueing techniques such as a motion seat, vibratory elements, a helmet loader, a pressure seat, seat belt tensioning systems, and control loading systems for the steering wheel and pedals will be developed, and their contributions to perceptual and physical realism, as compared to real car driving, will be evaluated.
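The core idea behind the motion filters referred to above can be sketched with a classical washout: a high-pass filter passes acceleration onsets to the platform and "washes out" sustained acceleration so the platform returns within its stroke limits. The filter order, cutoff and input signal below are assumptions for illustration:

```python
import math

# Sketch of the core of a classical washout motion filter: a first-order
# high-pass filter on vehicle acceleration. Cutoff frequency and the
# step input are illustrative assumptions, not DRIVOBS guidelines.

def high_pass(signal, dt, cutoff_hz):
    """First-order discrete high-pass (simple RC form)."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    out = [signal[0]]
    for i in range(1, len(signal)):
        out.append(alpha * (out[-1] + signal[i] - signal[i - 1]))
    return out

dt = 0.01
step = [0.0] * 50 + [2.0] * 450          # sustained 2 m/s^2 braking step
platform = high_pass(step, dt, cutoff_hz=0.5)
# The onset is passed almost fully; the sustained part decays toward
# zero, after which tilt coordination would normally take over.
```

Varying such filter parameters per frequency band, guided by the identification results, is one way the motion filter guidelines could be derived.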
Observation methods and cueing techniques will be developed for several use cases:
- Car following - to develop a system identification model describing human gas pedal control as a function of the visually perceived distance to the lead vehicle, the (haptic) gas pedal force, and the simulator acceleration.
- Steering - to develop a system identification model describing human steering actions as a function of road preview, visual flow, (haptic) steering wheel force and vehicle acceleration.
- Combined identification of car following and steering will be evaluated for more demanding driving tasks such as complex tracks, highway driving and city driving. New methods will be developed to deal with time-varying aspects of such tasks and to assist the experimenter in efficiently analysing results in the various phases of different driving manoeuvres.
- Prolonged driving will be evaluated for C-ACC systems developed in HTAS Connect & Drive. Distraction by the HMI of the C-ACC will be evaluated by monitoring gaze intervals during which the driver is focussing on the C-ACC display. The effects of fatigue and the low workload of this task will be investigated.
- City driving in dense traffic will focus on the time-varying focal attention of the driver across the different lanes and vehicles, including mirror usage. Hazards will be simulated, including crossing pedestrians, crossing vehicles and oncoming traffic turning left.
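The gaze-interval monitoring in the prolonged-driving use case can be sketched as follows: given per-sample gaze labels from the tracker, extract contiguous glances at a target area of interest and report their onsets and dwell times. The labels and sampling rate are illustrative assumptions:

```python
# Sketch of gaze-interval analysis for HMI distraction (cf. the C-ACC
# display use case): contiguous runs of a target gaze label become
# glance intervals. Label names and the 10 Hz rate are assumptions.

def glance_intervals(labels, target, dt):
    """Return (start_s, duration_s) for each contiguous run of `target`."""
    intervals, start = [], None
    for i, lab in enumerate(labels + [None]):   # sentinel closes the last run
        if lab == target and start is None:
            start = i
        elif lab != target and start is not None:
            intervals.append((start * dt, (i - start) * dt))
            start = None
    return intervals

dt = 0.1  # 10 Hz gaze tracker
labels = ["road"] * 5 + ["display"] * 8 + ["road"] * 3 + ["display"] * 4
print(glance_intervals(labels, "display", dt))
# [(0.5, 0.8), (1.6, 0.4)]
```

Summary statistics over such intervals (total eyes-off-road time, longest single glance) are typical distraction measures for an in-vehicle display.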