Appearance-based gaze tracking algorithms, which compute gaze direction from images of the user's face, are an attractive alternative to infrared-based external devices. Their accuracy has benefited greatly from powerful machine-learning techniques. The performance of appearance-based algorithms is normally evaluated on standard benchmarks, typically involving users fixating on points on a screen. However, these metrics do not translate easily into functional usability characteristics. In this work, we evaluate a state-of-the-art algorithm, FAZE, on a number of tasks of interest to the human-computer interaction community. Specifically, we study how gaze measured by FAZE could be used for dwell-based selection and reading progression (line identification and progression along a line), key functionalities for users with motor or visual impairments. We compared the gaze data quality obtained with FAZE from 7 participants against that of an infrared tracker (Tobii Pro Spark). Our analysis highlights the usability of appearance-based gaze tracking for such applications.