Individuals affected by severe motor impairment lose the ability to move and to interact effectively with their environment. The severity of motor impairment varies across affected individuals, and conditions such as locked-in syndrome (LIS) and amyotrophic lateral sclerosis (ALS) can fluctuate and deteriorate over time, sometimes within the same day. Despite the increasing availability of augmentative and assistive technologies, existing solutions do not easily adapt to these changing capabilities. In this paper, we present Eyehome 3.0, an extensible multimodal input system that builds user interfaces which adapt to the user via recurrent neural networks; in particular, eye gaze data informs the layout of these interfaces. We also present a framework for developing additional layouts and applications for the system, and describe how additional devices can be incorporated. Finally, we discuss how our participatory design work with Amado, a person with LIS, drove the development of the system.
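The abstract states only that a recurrent network uses gaze data to inform interface layout; the sketch below is a hypothetical illustration of that idea, not the authors' implementation. All names (GazeLayoutModel, gaze_dim, n_targets) and the choice of a GRU with a linear scoring head are assumptions.

```python
# Hypothetical sketch: a recurrent network that maps a sequence of gaze
# samples to per-target layout scores. Not the Eyehome 3.0 implementation.
import torch
import torch.nn as nn

class GazeLayoutModel(nn.Module):
    def __init__(self, gaze_dim: int = 2, hidden_dim: int = 32, n_targets: int = 8):
        super().__init__()
        # GRU summarizes the recent gaze trajectory into a hidden state.
        self.rnn = nn.GRU(gaze_dim, hidden_dim, batch_first=True)
        # Linear head scores each candidate UI target; a higher score could
        # mean placing that target where the user can reach it comfortably.
        self.head = nn.Linear(hidden_dim, n_targets)

    def forward(self, gaze_seq: torch.Tensor) -> torch.Tensor:
        # gaze_seq: (batch, time, gaze_dim) of normalized (x, y) gaze points
        _, h = self.rnn(gaze_seq)
        return self.head(h[-1])  # (batch, n_targets) layout scores

# Usage: rank 8 candidate targets from 50 recent gaze samples.
model = GazeLayoutModel()
scores = model(torch.rand(1, 50, 2))
layout_order = scores.argsort(descending=True)  # most- to least-preferred
```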