The Program on Understanding Law, Science, and Evidence (PULSE) at UCLA School of Law explores the complex, multi-faceted connections between technology, science, and law. PULSE engages in cutting-edge interdisciplinary research and innovative programming to study how technological advances and scientific knowledge and uncertainties influence law and policy making, and how their impacts can be managed to advance human and societal well-being.
On November 26, 2017, Elon Musk tweeted: “Got to regulate AI/robotics like we do food, drugs, aircraft & cars. Public risks require public oversight. Getting rid of the FAA wdn’t [sic] make flying safer. They’re there for good reason.”
In this and other recent pronouncements, Musk is calling for artificial intelligence (AI) to be governed by traditional regulation, just as we regulate foods, drugs, aircraft, and cars. Putting aside the quibble that food, drugs, aircraft, and cars are each regulated very differently, these calls for regulation seem to envision one or more federal regulatory agencies adopting binding regulations to ensure the safety of AI. Musk is not alone in calling for “regulation” of AI, and some serious AI scholars and policymakers have likewise called for regulation of AI using traditional governmental regulatory approaches.
How can we speak of algorithms as political?
The intuitive answer is to presume that algorithms are not political. They are mathematical functions that operate to accomplish specific tasks. In this regard, algorithms operate independently of any specific belief system or ideological ambition. They may be used for political ends, in the manner in which census data may be used for voter redistricting, but in and of themselves algorithms don’t do anything political.
At the end of the Cold War, the renowned political scientist Samuel Huntington argued that future conflicts were more likely to stem from cultural frictions – ideologies, social norms, and political systems – rather than political or economic frictions. Huntington focused his concern on the future of geopolitics in a rapidly shrinking world. But his argument applies as forcefully (if not more so) to the interaction of technocultures.