People rapidly form first impressions from facial appearance, and these impressions carry significant real-life consequences. Although various computational models have been developed to analyze how facial characteristics shape such impressions, they often suffer from limitations: a narrow set of trait impressions, a restricted range of facial characteristics, reliance on black-box machine learning methods, and dependence on manual annotations. In this study, we address these shortcomings by leveraging recent advances in computer vision to extract human-interpretable, quantitative measures of facial characteristics (e.g., facial morphological features and skin color) and emotional attributes from face images. Using machine learning techniques, we modeled 34 first impressions and validated the model's generalizability and predictive accuracy on out-of-sample faces. Our model reveals the relative importance of facial characteristics and emotional attributes in shaping these 34 first impressions. Together, our results provide a comprehensive account of how facial characteristics and emotional attributes jointly influence social biases.