The use of autonomous systems is growing rapidly across scientific, industrial, and military applications. At the same time, advances in semiconductor technology have enabled ever smaller, more complex, and application-specific microprocessors and microcontrollers. This work details the design and implementation of an open-source real-time hardware controller for resource-constrained autonomous vehicles and systems. It is intended to be integrated within a distributed control architecture consisting of the real-time hardware controller, a guidance and navigation computer, and an edge tensor processing unit for machine learning inference. While the latter two processors are commercially available, a dedicated, modular real-time controller is not, which provides the motivation for this work. To demonstrate the versatility of our open-source real-time controller, we present several use cases, including a ground vehicle, a marine vessel, a quadcopter, and a fixed-wing aircraft. The power of the distributed architecture lies in its ability to solve complex sensing, guidance, navigation, and control challenges even on resource-constrained systems. One such challenge is the simultaneous localization of an autonomous system while mapping its environment. In this work we develop the components of a novel hybrid sensor, combining a visual camera and a LiDAR, mounted on the ground vehicle. The camera stream is processed by object detection models trained to recognize landmarks in the environment and deployed on the edge tensor processing unit, while the LiDAR provides range and bearing information for objects within its field of view. By combining the two, we obtain fast detections of arbitrary landmarks in the environment together with their positions relative to the sensor, thereby enabling simultaneous localization and mapping functionality.
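To make the fusion step concrete, the following Python sketch shows one way a 2D detection box from the camera pipeline might be combined with LiDAR returns to recover a landmark's range and bearing relative to the sensor; it is a minimal illustration under assumed calibration, and the function name, intrinsic matrix, and sample points are placeholders rather than code from the controller described here.

```python
# Illustrative sketch (not from the paper): associate LiDAR returns with a camera
# detection box to estimate the landmark's range and bearing in the sensor frame.
# Intrinsics, the detection box, and the LiDAR points are placeholder values.
import numpy as np

def landmark_range_bearing(bbox, lidar_points_xyz, K):
    """Estimate range (m) and bearing (rad) of a detected landmark.

    bbox: (u_min, v_min, u_max, v_max) pixel bounds from the object detector.
    lidar_points_xyz: (N, 3) LiDAR points already expressed in the camera frame
                      (x right, y down, z forward); extrinsic calibration assumed.
    K: 3x3 camera intrinsic matrix.
    """
    # Keep only points in front of the camera, then project them into the image.
    pts = lidar_points_xyz[lidar_points_xyz[:, 2] > 0.0]
    uv = (K @ pts.T).T
    uv = uv[:, :2] / uv[:, 2:3]  # normalize homogeneous coordinates by depth

    # Select points whose projection falls inside the detection box.
    u_min, v_min, u_max, v_max = bbox
    inside = (uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) & \
             (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max)
    hits = pts[inside]
    if hits.size == 0:
        return None  # no LiDAR return associated with this detection

    # Take the median of the associated returns as the landmark position,
    # then convert to range and bearing relative to the optical axis.
    landmark = np.median(hits, axis=0)
    rng = float(np.linalg.norm(landmark))
    bearing = float(np.arctan2(landmark[0], landmark[2]))
    return rng, bearing

# Placeholder usage with illustrative values.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
points = np.array([[0.5, 0.0, 4.0],
                   [0.6, 0.1, 4.1],
                   [3.0, 0.0, 2.0]])
print(landmark_range_bearing((380, 200, 440, 280), points, K))
```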