This research addresses the challenges of using multi-sensory, multi-modal data about drivers and their environment to infer drivers' cognitive states and intentions. We propose several relevant research tasks, including behavioral attention and priming analysis, cue selection, and data fusion and model development. Ultimately, we arm an Intelligent Driver Assistance System with such information to improve decision making, safety, and comfort. Specifically, the objective of this research is to build a holistic driver intent inference system by (1) observing and understanding the most reliable visible behaviors, characteristics, and environmental cues that indicate driver intentions, and (2) analyzing the interactive nature and performance of real-time learning-based driver assistance systems. The thesis contributes to, and documents research interplay among, the fields of electrical engineering, signal processing, cognitive science, human-computer interaction, and psychology. Including human intent information in assistive feedback proves substantially advantageous, especially for reducing risk in safety-critical situations in driving and other task-oriented scenarios. The contributions of this research could be extended to applications in Advanced Driver Assistance Systems, Assistive Living, and Smart Meeting Rooms.