Automatic gaze estimation that does not rely on expensive commercial eye-tracking hardware can enable several applications in human-computer interaction (HCI) and human behavior analysis. It is therefore not surprising that several related techniques and methods have been investigated in recent years. However, very few camera-based systems proposed in the literature are both real-time and robust. In this work, we propose a real-time gaze estimation system that requires no person-dependent calibration, copes with illumination changes and head pose variations, and works over a wide range of distances from the camera. Our solution is based on a 3-D appearance-based method that processes the images from a built-in laptop camera. Real-time performance is obtained by combining head pose information with geometrical eye features to train a machine learning algorithm. Our method has been validated on a data set of images of users in a natural environment, showing promising results. The possibility of a real-time implementation, combined with the good quality of gaze tracking, makes this system suitable for various HCI applications.
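The core idea of combining head pose information with geometrical eye features to train a learned gaze predictor can be illustrated with a minimal sketch. This is not the authors' implementation: the feature dimensions, the synthetic data, and the choice of a plain least-squares regressor are all assumptions made purely for illustration.

```python
import numpy as np

# Sketch (synthetic data, hypothetical feature layout): concatenate
# head-pose angles with geometrical eye features into one feature
# vector per frame, then fit a linear regressor mapping features to
# a 2-D gaze point on the screen.
rng = np.random.default_rng(0)

n = 200
head_pose = rng.uniform(-30, 30, size=(n, 3))   # yaw, pitch, roll (degrees)
eye_feats = rng.uniform(0, 1, size=(n, 4))      # e.g. pupil-to-corner offsets

X = np.hstack([head_pose, eye_feats])           # combined feature vectors
W_true = rng.normal(size=(X.shape[1], 2))       # hidden ground-truth mapping
gaze = X @ W_true + rng.normal(scale=0.01, size=(n, 2))  # noisy gaze targets

# Least-squares fit of one weight matrix: features -> gaze coordinates
W, *_ = np.linalg.lstsq(X, gaze, rcond=None)
pred = X @ W
rmse = float(np.sqrt(np.mean((pred - gaze) ** 2)))
print(f"RMSE: {rmse:.4f}")
```

A linear model keeps the sketch short; the paper's method could equally plug the same combined feature vector into any regression algorithm trained offline, which is what makes the per-frame prediction cheap enough for real time.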