
Traps and pitfalls when learning logical theories: a case study with FOIL and FOCL

Abstract

Supervised concept learning based on Horn clauses is one of the most active areas of machine learning research. Two popular systems in this area are FOIL (First Order Inductive Learner) and FOCL (First Order Combined Learner). This paper points out some conceptual traps and pitfalls into which these two systems fall when coping with tasks taken both from the machine learning literature and from real-world domains. An interpretation of the results is provided, based on a comparison between the search space that these two systems should explore and the search space that they actually explore, and on considerations about the representation of the examples as tuples in a relational database. Theoretically founded solutions to the detected problems are suggested. Moreover, a more manageable practical solution is proposed, and its strengths and weaknesses are evaluated against the theoretically founded ones. This solution has been satisfactorily implemented in a new version of FOCL.
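
To make the tuple-based representation mentioned above concrete, the sketch below (not taken from the paper) shows how a FOIL-style learner stores examples and background knowledge as sets of tuples, one set per relation, and scores a candidate body literal with the standard FOIL information gain. The relation names (`parent`, `female`, `daughter`) and the simplified gain computation are illustrative assumptions only.

```python
# Illustrative sketch of a FOIL-style tuple representation and gain score.
# Simplification: gain is computed over the original example tuples only,
# ignoring the expansion of the tuple space that occurs when a literal
# introduces new variables.
from math import log2

# Background knowledge: each relation is a set of tuples, as in a
# relational database.
parent = {("ann", "mary"), ("ann", "tom"), ("tom", "eve")}
female = {("mary",), ("eve",), ("ann",)}

# Target concept daughter(X, Y): positive and negative example tuples.
positives = {("mary", "ann"), ("eve", "tom")}
negatives = {("tom", "ann"), ("eve", "ann")}


def info(p, n):
    """Bits needed to signal that a covered tuple is positive."""
    return -log2(p / (p + n)) if p else float("inf")


def covers(literal, x, y):
    """Check whether a binding (x, y) satisfies a candidate body literal."""
    if literal == "parent(Y,X)":
        return (y, x) in parent
    if literal == "female(X)":
        return (x,) in female
    raise ValueError(literal)


def foil_gain(literal, pos, neg):
    """FOIL gain of adding `literal` to a currently empty clause body."""
    p0, n0 = len(pos), len(neg)
    pos1 = {t for t in pos if covers(literal, *t)}
    neg1 = {t for t in neg if covers(literal, *t)}
    p1, n1 = len(pos1), len(neg1)
    if p1 == 0:
        return 0.0
    # gain = (positives still covered) * (information before - after)
    return p1 * (info(p0, n0) - info(p1, n1))


for lit in ("parent(Y,X)", "female(X)"):
    print(lit, "gain =", round(foil_gain(lit, positives, negatives), 3))
```

Running the sketch scores each candidate literal against the examples; the learner would greedily add the highest-gain literal and repeat, which is the hill-climbing search whose effective search space the paper compares with the space the systems should explore.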
