- Chen, Yifan;
- Huerta, E. A.;
- Duarte, Javier;
- Harris, Philip;
- Katz, Daniel S.;
- Neubauer, Mark S.;
- Diaz, Daniel;
- Mokhtar, Farouk;
- Kansal, Raghav;
- Park, Sang Eon;
- Kindratenko, Volodymyr V.;
- Zhao, Zhizhen;
- Rusack, Roger
To enable the reusability of massive scientific datasets by humans and machines, researchers aim to adhere to the principles of findability, accessibility, interoperability, and reusability (FAIR) for data and artificial intelligence (AI) models. This article provides a domain-agnostic, step-by-step assessment guide to evaluate whether a given dataset meets these principles. We demonstrate how to use this guide to evaluate the FAIRness of an open simulated dataset produced by the CMS Collaboration at the CERN Large Hadron Collider. This dataset consists of Higgs boson decays and quark and gluon backgrounds, and is available through the CERN Open Data Portal. We use additional available tools to assess the FAIRness of this dataset, and incorporate feedback from members of the FAIR community to validate our results. This article is accompanied by a Jupyter notebook that visualizes and explores this dataset. This study marks the first in a planned series of articles that will guide scientists in the creation of FAIR AI models and datasets in high energy particle physics.