
The Introductory Statistics Course: A Ptolemaic Curriculum?

Abstract

As we begin the 21st century, the introductory statistics course appears healthy, with its emphasis on real examples, data production, and graphics for exploration and assumption-checking. Without doubt this emphasis marks a major improvement over introductory courses of the 1960s, an improvement made possible by the vaunted “computer revolution.” Nevertheless, I argue that despite broad acceptance and rapid growth in enrollments, the consensus curriculum is still an unwitting prisoner of history. What we teach is largely the technical machinery of numerical approximations based on the normal distribution and its many subsidiary cogs. This machinery was once necessary, because the conceptually simpler alternative based on permutations was computationally beyond our reach. Before computers, statisticians had no choice. These days we have no excuse. Randomization-based inference makes a direct connection between data production and the logic of inference that deserves to be at the core of every introductory course. Technology allows us to do more with less: more ideas, less technique. We need to recognize that the computer revolution in statistics education is far from over.
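
To make the contrast concrete, here is a minimal sketch (not from the paper itself) of the kind of randomization-based inference the abstract describes: a two-sample permutation test that builds the null distribution by re-randomizing the observed data, rather than appealing to a normal approximation. The function name and the example data are hypothetical, chosen only for illustration.

```python
import random

def permutation_test(group_a, group_b, n_permutations=10_000, seed=0):
    """Two-sample permutation test for a difference in means.

    Repeatedly re-randomizes the pooled observations into two groups of
    the original sizes and records the difference in group means. The
    p-value is the fraction of re-randomizations that produce a difference
    at least as extreme as the one observed (two-sided).
    """
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # one re-randomization of the original assignment
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_permutations

# Hypothetical data: outcomes under treatment vs. control
p = permutation_test([12.9, 14.1, 13.5, 15.0], [11.2, 12.4, 10.8, 13.1])
print(f"permutation p-value: {p:.3f}")
```

The loop mirrors the data-production step of a randomized experiment, which is the direct connection between design and inference that the abstract argues should sit at the core of the introductory course.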
