ML unit-5 notes

https://drive.google.com/file/d/1uCBFRwJofUK7APLGB6gWoQBJ2OhSseRw/view?usp=sharing

This document introduces Analytical Learning, primarily through Explanation-Based Learning (EBL). Unlike inductive methods, which rely on statistical inference over many examples, EBL uses prior knowledge (a Domain Theory, DT) to logically explain how a single training example satisfies the target concept. The explanation distinguishes the features relevant to the target concept from the irrelevant ones, so generalization is justified deductively rather than statistically. The PROLOG-EBG algorithm is presented as an EBL method for perfect domain theories: it explains a positive example, analyzes the explanation to compute the weakest preimage (the most general justified rule), and refines the current set of learned rules. The notes then turn to combining inductive and analytical learning to gain the advantages of both: requiring minimal prior knowledge and learning from scarce data. Techniques for combining them include using prior knowledge to initialize the hypothesis (KBANN), to alter the search objective (TangentProp), or to augment the search operators (FOCL).
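The explain-then-generalize step of PROLOG-EBG can be illustrated with a minimal sketch. This is a hypothetical propositional simplification (the full algorithm regresses the target through the proof with unification); the `SafeToStack` domain theory, the dictionary encoding, and the helper `explain` are assumptions for illustration. Literals with no defining clause are treated as operational, and the conjunction of operational leaves of the explanation serves as the body of the learned rule:

```python
# Hypothetical propositional sketch of PROLOG-EBG's explain-and-generalize step.
# Domain theory: maps a clause head to its body literals.
# Literals absent from the theory are treated as operational (directly testable).
DOMAIN_THEORY = {
    "SafeToStack(x,y)": ["Lighter(x,y)"],
    "Lighter(x,y)": ["Weight(x,wx)", "Weight(y,wy)", "LessThan(wx,wy)"],
}

def explain(goal, theory, leaves=None):
    """Backward-chain through the domain theory, collecting operational leaves."""
    if leaves is None:
        leaves = []
    body = theory.get(goal)
    if body is None:          # operational literal: no clause expands it
        leaves.append(goal)
    else:                     # expand each antecedent in turn
        for literal in body:
            explain(literal, theory, leaves)
    return leaves

# In this simplified setting, the weakest preimage of the target concept
# w.r.t. the explanation is the conjunction of operational leaves;
# it becomes the body of the learned Horn clause.
preimage = explain("SafeToStack(x,y)", DOMAIN_THEORY)
learned_rule = "SafeToStack(x,y) :- " + ", ".join(preimage)
print(learned_rule)
```

Running this produces a rule whose body mentions only operational predicates, which is exactly what makes the learned clause directly usable without further inference.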

Here are five key points summarizing the topics covered:
  • Analytical Learning (EBL): A method that uses prior knowledge, called the Domain Theory (DT), to analyze or explain how a training example satisfies the target concept, allowing for generalization based on logical, rather than statistical, learning.
  • PROLOG-EBG: An algorithm for EBL, typically assuming a perfect DT, that iteratively explains a positive training example, analyzes the explanation to determine a generalization (the weakest preimage), and refines the current set of learned Horn clauses.
  • Weakest Preimage: Defined as the most general set of initial assertions (A) such that A deductively entails the conclusion (C) according to the proof (P), written A $\vdash$ C according to P. In PROLOG-EBG, it yields the most general rule that is still justified by the domain theory.
  • KBANN (Knowledge-Based Artificial Neural Networks): An approach for combining analytical and inductive learning where prior knowledge, in the form of a DT of propositional Horn clauses, is used to construct an initial neural network. This network is then refined using the Backpropagation algorithm to fit the training data.
  • FOCL (FOIL + Explanation-Based Learning): A system that combines inductive and analytical learning by augmenting the search operators of the inductive algorithm FOIL. FOCL generates additional candidate specializations for a Horn clause rule based on the Domain Theory, allowing it to discover new operational literals.
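KBANN's initialization step can be sketched concretely. The following is a minimal, hypothetical example (the `Cup` clause, the feature names, the weight value `W = 8.0`, and the helpers are assumptions for illustration): each Horn clause becomes a sigmoid unit whose antecedents get large positive weights and whose bias is set so the unit fires only when all antecedents hold; Backpropagation would then refine these weights against the training data.

```python
import numpy as np

# Hypothetical KBANN-style initialization for one propositional Horn clause:
#   Cup :- Liftable, HoldsLiquid
# Input features assumed for the example; "Fragile" is an unused input.
FEATURES = ["Liftable", "HoldsLiquid", "Fragile"]
W = 8.0  # large weight so the unit initially mimics the Horn clause

def kbann_init(antecedents):
    """Weights and bias so the sigmoid unit fires iff all antecedents are true."""
    w = np.array([W if f in antecedents else 0.0 for f in FEATURES])
    # Threshold sits between (n-1) and n satisfied antecedents.
    bias = -(len(antecedents) - 0.5) * W
    return w, bias

def forward(x, w, bias):
    """Sigmoid unit output for input vector x."""
    return 1.0 / (1.0 + np.exp(-(x @ w + bias)))

w, b = kbann_init(["Liftable", "HoldsLiquid"])
print(forward(np.array([1.0, 1.0, 0.0]), w, b))  # clause satisfied: output > 0.9
print(forward(np.array([1.0, 0.0, 0.0]), w, b))  # clause violated: output < 0.1
```

Because the initialized network already computes the domain theory, Backpropagation starts its search from a knowledgeable hypothesis rather than random weights, which is the core idea behind KBANN.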
