About
The Statistics Online Computational Resource (SOCR) designs, validates, and freely disseminates knowledge. Specifically, SOCR provides portable online aids for probability and statistics education, technology-based instruction, and statistical computing. This archive contains a number of training and learning materials developed and disseminated by the SOCR resource and various SOCR collaborators.
Statistics Online Computational Resource
Recent Work (7)
SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit
The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate- and graduate-level statistics courses for seven years now, and these resources have been shown to successfully improve students' learning. First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. Together with the previously implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, and one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category; non-parametric examples include the Wilcoxon rank-sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test, and Fligner-Killeen test. Hypothesis testing models include contingency table analysis, Friedman's test, and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation, and utilization of SOCR Analyses.
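For a concrete point of reference, the sketch below runs a few of the analyses named above (simple linear regression, a two-sample t-test, and the Kruskal-Wallis test) in Python with SciPy. The data are invented for illustration, and this is not the SOCR Analyses Java implementation itself.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simple linear regression (one of the linear models offered by SOCR Analyses)
x = np.arange(20, dtype=float)
y = 2.5 * x + rng.normal(0, 3, size=20)
reg = stats.linregress(x, y)
print("slope:", reg.slope, "intercept:", reg.intercept, "R^2:", reg.rvalue ** 2)

# Two-sample t-test (parametric sample comparison)
a = rng.normal(0.0, 1.0, 30)
b = rng.normal(0.5, 1.0, 30)
print("t-test:", stats.ttest_ind(a, b))

# Kruskal-Wallis test (non-parametric comparison of several groups)
g1, g2, g3 = rng.normal(0, 1, 25), rng.normal(0.3, 1, 25), rng.normal(0.6, 1, 25)
print("Kruskal-Wallis:", stats.kruskal(g1, g2, g3))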
Expectation Maximization and Mixture Modeling Tutorial
This technical report describes the statistical method of expectation maximization (EM) for parameter estimation. Several 1D, 2D, 3D, and n-D examples are presented in this document. Applications of the EM method are also demonstrated in the case of mixture modeling using interactive Java applets in 1D (e.g., curve fitting), 2D (e.g., point clustering and classification), and 3D (e.g., brain tissue classification).
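As a minimal illustration of the EM idea in the 1D mixture-modeling setting, the sketch below fits a two-component Gaussian mixture with NumPy. The synthetic data, component count, and initial guesses are assumptions made for this example; the report's own demonstrations use interactive Java applets.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1D data drawn from two Gaussians (illustrative example data)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

# Initial guesses for mixing weights, means, and variances
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: posterior responsibility of each component for each point
    dens = pi * normal_pdf(x[:, None], mu, var)      # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibility-weighted data
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", pi, "means:", mu, "variances:", var)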
The Rise of Infocracy: Virtualized Human Interplays, Decline of Physical Interactions, and the Adaptation of People’s Social Valuation System
Rapid technological advances, our insatiable appetite for instantaneous rewards, and the massive information overflow are impacting our everyday lives. The long-term effects of social informatification, people’s overreliance on massive amounts of dynamic digital information, remain enigmatic and poorly understood. Our ability to anticipate, prepare for, react to, and adapt to the potential negative consequences of minute-by-minute existence in the new infocratic world, a complete virtual immersion in digital information where most time, energy, and resources are dedicated to rapid acquisition, processing, and inference over large amounts of information, may have a significant long-term impact on mankind. This opinion outlines the scope of our virtualized abilities to manage and interpret exabytes of information and suggests that timely prediction of and appropriate response to the information avalanche will be critical to managing the unavoidable social and cultural changes ahead.