Open Access Publications from the University of California


The Introductory Statistics Course: A Ptolemaic Curriculum?

As we begin the 21st century, the introductory statistics course appears healthy, with its emphasis on real examples, data production, and graphics for exploration and assumption-checking. Without doubt this emphasis marks a major improvement over introductory courses of the 1960s, an improvement made possible by the vaunted “computer revolution.” Nevertheless, I argue that despite broad acceptance and rapid growth in enrollments, the consensus curriculum is still an unwitting prisoner of history. What we teach is largely the technical machinery of numerical approximations based on the normal distribution and its many subsidiary cogs. This machinery was once necessary, because the conceptually simpler alternative based on permutations was computationally beyond our reach. Before computers, statisticians had no choice. These days we have no excuse. Randomization-based inference makes a direct connection between data production and the logic of inference that deserves to be at the core of every introductory course. Technology allows us to do more with less: more ideas, less technique. We need to recognize that the computer revolution in statistics education is far from over.
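The conceptually simpler alternative the abstract alludes to — permutation-based inference, now computationally trivial — can be sketched in a few lines. This is a hypothetical illustration (the data and variable names are invented, not from the article): a two-sample comparison decided by reshuffling group labels rather than by a normal approximation.

```python
import random
import statistics

# Hypothetical data: measurements from two small groups (invented values).
group_a = [12.1, 9.8, 11.4, 10.9, 12.6, 11.1]
group_b = [10.2, 9.1, 10.7, 9.5, 10.0, 9.9]

observed = statistics.mean(group_a) - statistics.mean(group_b)

# Permutation test: repeatedly re-randomize the group labels and count how
# often the shuffled difference is at least as extreme as the observed one.
pooled = group_a + group_b
n_a = len(group_a)
random.seed(0)
reps = 10_000
extreme = 0
for _ in range(reps):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:])
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / reps
print(f"observed difference = {observed:.2f}, permutation p-value = {p_value:.4f}")
```

The entire inferential logic is visible in the loop: if group membership did not matter, relabeling would routinely produce differences as large as the one observed.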

The Role of Technology in Improving Student Learning of Statistics

This paper provides a broad overview of the role technological tools can play in helping students understand and reason about important statistical ideas. We summarize recent developments in the use of technology in teaching statistics in light of changes in course content, pedagogical methods, and instructional formats. Issues and practical challenges in selecting and implementing technological tools are presented and discussed, and examples of exemplary tools are provided along with suggestions for their use.

On Getting More and Better Data Into the Classroom

The authors’ work to develop capabilities for getting data into the data analysis software Fathom™ is described. Heuristics for detecting data on a web page allow drag-and-drop of a URL into a document. A collaboration with the Minnesota Population Center makes possible sampling from census microdata from 1850 through 2000. With direct support for Vernier sensors, students can build a model during the process of real-time data collection. Finally, a survey capability makes it easy for teachers and students to create simple data entry forms hosted on a web site, such that the collated data are instantly downloadable for data analysis in Fathom. By taking some of the drudgery out of gathering data, these capabilities carry implications for teaching and curriculum development; namely, that students should have experience throughout their learning with data that they individually have chosen to explore. It is argued that the skills they gain by engaging in exploratory data analysis with self-chosen and self-generated data are critically important in our data-driven society and not yet adequately supported in K–14 learning.

  • 4 supplemental videos
  • 1 supplemental ZIP

Using Wiki to Promote Collaborative Learning in Statistics Education

This article makes a case for the use of Wiki to support collaborative learning experiences for students in the statistics classroom. A Wiki is an innovative website that allows all users to add and edit content with relative simplicity. Wiki features, such as learner empowerment and bottom-up organization, enable easy authoring of Web content, open access, and unrestricted collaboration. We first introduce statistics as a collaborative discipline and therefore as compatible with Wiki as a collaborative learning space. We then show evidence that collaboration can improve the learning of individuals in the statistics classroom as well as the whole class. Finally, we demonstrate how Wiki can facilitate collaborative learning and bring about instructional change to improve student learning of statistics. We present several types of Wiki-based activities: collaborative writing, glossaries, discussion and review, statistical projects, self-reflective journals, and assessment.

Computing and Introductory Statistics

Much of the computing that students do in introductory statistics courses is based on techniques that were developed before computing became inexpensive and ubiquitous. Now that computing is readily available to all students, instructors can change the way they teach statistical concepts. This article describes computational ideas that can support teaching George Cobb's Three Rs of statistical inference: Randomize, Repeat, Reject.
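The Three Rs map directly onto a simulation loop. A minimal sketch, under an assumed example (a test of whether a coin producing 62 heads in 100 flips is fair — the scenario and numbers are illustrative, not from the article): randomize under the null hypothesis, repeat to build a null distribution, and reject if the observed result is rare.

```python
import random

# Hypothetical observation (invented for illustration): 62 heads in 100 flips.
observed_heads = 62
n_flips = 100
reps = 10_000
random.seed(1)

# Randomize: simulate flips under the null hypothesis of a fair coin.
# Repeat: accumulate the null distribution of head counts.
extreme = 0
for _ in range(reps):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    # Two-sided: count simulated results at least as far from 50 as observed.
    if abs(heads - 50) >= abs(observed_heads - 50):
        extreme += 1

# Reject: if results this extreme are rare under the null, reject fairness.
p_value = extreme / reps
print(f"p-value = {p_value:.4f}")
```

Nothing here requires the normal distribution; the sampling distribution is generated, not approximated, which is what makes the logic of inference directly visible to students.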

Much Has Changed; Little Has Changed: Revisiting the Role of Technology in Statistics Education 1992-2007

The author of this article reflects on the uses of technology in statistics education, comparing the state of the art as described in her article from 1992 with current developments. She reviews five categories of software: software that uses video as data, Geographical Information Systems, graph construction tools, systems with distribution and data manipulation capabilities, and probability generation tools. Considering how software has changed in the past fifteen years, the author argues that while remarkable technological progress has been made, many of the same pedagogical caveats apply as in 1992. These concerns are an integral part of studying the uses of technology as a learning tool in any content area, so it is important that we put them front and center as this journal begins and keep them there as it grows.