We offer a survey of recent results on covariance estimation for heavy-tailed
distributions. By unifying ideas scattered in the literature, we propose
user-friendly methods that facilitate practical implementation. Specifically,
we introduce element-wise and spectrum-wise truncation operators, as well as
their $M$-estimator counterparts, to robustify the sample covariance matrix.
In contrast to the classical notion of robustness characterized by the breakdown
point, we focus on tail robustness, which is evidenced by the connection between
nonasymptotic deviation bounds and the confidence level. The key observation is
that the estimators need to adapt to the sample size, the dimensionality of the
data, and the noise level to achieve an optimal tradeoff between bias and
robustness. Furthermore, to facilitate their practical use, we
propose data-driven procedures that automatically calibrate the tuning
parameters. We demonstrate their applications to a series of structured models
in high dimensions, including bandable and low-rank covariance matrices, and
sparse precision matrices. Numerical studies lend strong support to the
proposed methods.
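
For concreteness, a minimal sketch of the element-wise truncation (stated under assumed notation, not necessarily the exact form developed in the body of the paper): given centered i.i.d. observations $X_1,\dots,X_n \in \mathbb{R}^d$ and a user-chosen truncation level $\tau > 0$,
\[
\psi_\tau(x) = \operatorname{sign}(x)\min(|x|,\tau), \qquad
\widehat{\sigma}_{jk} = \frac{1}{n}\sum_{i=1}^{n} \psi_\tau\bigl(X_{ij}X_{ik}\bigr),
\]
so that each entry of the sample covariance is replaced by an average of truncated products; the data-driven procedures mentioned above calibrate $\tau$ as a function of the sample size, the dimension, and the noise level.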