Natural and artificial acoustics of the built world provide spatial awareness, environmental context, and communicative abilities to people with visual impairments. When the acoustic soundscape is insufficient, assistive technologies -- devices and computational systems designed to augment the visual modality -- can translate visual information into other modalities. In the field of Human-Computer Interaction (HCI), nonvisual assistive technologies seek to bridge the human-computer relationship through speech and non-speech sounds, through touch, and by facilitating interactions through sighted support structures. Current research exploring technological approaches to nonvisual challenges in a sight-first world often emphasizes the transfer of visual information to a single modality. In this dissertation, I examine how individual modalities can be unified through novel computational interactions to alleviate sensory overload and support a broad set of activities. I ask two questions: 1) How can tangible and mixed-ability computational systems be designed to reduce auditory and interaction demands for blind and low vision people in everyday activities? and 2) Do assistive technologies that are less reliant on auditory input improve interactions for blind and low vision users? Through field studies of blind and low vision computer users, I designed and evaluated a platform for augmenting auditory computer interactions with a tangible interface. An experimental evaluation of the combined auditory and tangible control found a thirty-nine percent improvement over traditional audio-only tools in web navigation tasks. Through a multi-year ethnographic study of a blind and low vision outrigger canoeing community, I examined the intersection of sensory modalities and mixed-ability relationships during canoeing activities.
Applying a public-facing participatory co-design methodology, I worked alongside blind and sighted outrigger canoe enthusiasts to design, evaluate, and deploy a shared assistive technology to support blind paddling. Analysis of my work reveals how physicality in the world influences the auditory and tangible interactions of assistive technologies. Beyond my empirical findings, this work demonstrates how attending to context and to the physicality of sound and touch reveals critical insights to guide the design of assistive technology.