Previous studies have suggested that the brain responds differently in the temporal domain to real human faces versus virtual agent faces, with differences emerging from 400 ms onward. However, few studies have directly compared the early and late stages of face processing within a single paradigm. Here we conducted an EEG study using real human faces and high-quality virtual agent faces, examining two event-related potentials: the early N170 and the Late Positive Potential (LPP). Results showed identical N170 responses for both face types. The LPP response, however, revealed a nuanced distinction: real human faces evoked a slightly larger LPP than virtual agent faces. These results suggest that although virtual agent faces can approach the level of emotional engagement and higher-order evaluation associated with real human faces, human faces remain the most engaging. These findings shed light on the cognitive processes involved in face perception and on the potential of intelligent virtual agents in AI and education.