Digitization has facilitated the proliferation of crowd science by lowering the cost of finding individuals willing to participate in science without pay. However, the factors that influence participation and the outcomes of voluntary participation remain unclear. We report two findings from a field experiment on the world's largest crowd science platform that tests how voluntary contributions to science are affected by providing clarifying information on either the desired outcome of a scientific task or the labor requirements for completing the task. First, there is significant heterogeneity in the motivations and abilities of contributors to crowd science. Second, both information interventions lead to significant decreases in the quantity and increases in the quality of contributions. Taken together, our findings are consistent with the information interventions improving match quality between the task and the volunteer. They suggest that science can be democratized by engaging individuals with varying skill levels and motivations through small changes in the information provided to participants.