
1 Nonmonotonic Abductive – Inductive Learning Oliver Ray Department of Computer Science University of Bristol AIAI'07, September 15th, 2007 Aix-en-Provence

2 Nonmonotonic Abductive – Inductive Learning Oliver Ray Department of Computer Science University of Bristol AIAI'07, September 15th, 2007 Aix-en-Provence (for temporal process modelling in bioinformatics and AI)

3 Motivation: Learning Temporal Theories
Machine Learning: automated methods are needed to handle the volume and complexity of data generated by modern experimental and data-logging techniques.
Inductive Logic Programming: produces expressive, human-understandable hypotheses, exploits prior domain knowledge, and facilitates incremental knowledge update.
Abductive–Inductive Learning: combines explanation and generalisation, allows non-observation predicate learning, and supports non-monotonic inference.
Non-monotonic Learning of Temporal Theories: infer temporal models of systems or processes from (partial) domain knowledge and (partial) observations.

4 Problem: Induction of Process Models
Given: a temporal logic calculus, a partial process model, and scenarios/narratives.
Find: a (more) complete model that extends the partial model and explains the given narratives and scenarios w.r.t. the temporal calculus.

5 Problem: Induction of Process Models
Given: temporal logic calculus … event calculus
       partial process model … events, fluents, time
       scenarios/narratives … happens, holds, initially
Find:  (more) complete model … initiates, terminates
that extends the partial model and explains the given narratives and scenarios w.r.t. the temporal calculus.

6 Problem: Induction of Process Models
Given: temporal logic calculus … event calculus
       partial process model … events, fluents, time
       scenarios/narratives … happens, holds, initially
Find:  (more) complete model … initiates, terminates
that extends the partial model and explains the given narratives and scenarios w.r.t. the temporal calculus.
(The slide annotates these inputs and outputs with the NM-ILP symbols B, E and H.)

7 Example: E. coli Lactose Metabolism
ACTIONS (Events), applied to the E. coli growth medium: add_lactose, sub_lactose, add_glucose, sub_glucose.
EFFECTS (Fluents): pres_lactose, pres_glucose, meta_lactose.
TIME: integers.

8 Transcriptional Regulation of LAC Operon
[Diagram; labels only: polymerase, activator, cAMP, repressor, CAP, allolactose, promoter/operator (Prom, O), lac(z), lac(y), galactosidase, permease]
(a) Lactose metabolising genes not expressed
(b) Lactose metabolising genes expressed (low glucose, high lactose)

9 Event Calculus Axioms
holdsAt(F,T2) ← happens(E,T1), T1 < T2, initiates(E,F,T1), not clipped(T1,F,T2).
holdsAt(F,T2) ← initially(F), not clipped(0,F,T2).
clipped(T1,F,T2) ← happens(E,T), T1 < T, T < T2, terminates(E,F,T).
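As a concrete illustration (not on the slide; it uses the add(lact)/pres(lact) notation of slide 11 for the initiates fact written with underscores on slide 10), the first axiom derives that lactose is present at time 3 of the narrative below, since lactose is added at time 1 and no event terminates pres(lact) strictly between times 1 and 3:

holdsAt(pres(lact),3) ← happens(add(lact),1), 1 < 3,
                        initiates(add(lact),pres(lact),1),
                        not clipped(1,pres(lact),3).    % sub(lact) only happens at time 3, not before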

10 Partial LAC Process Model
% ontology
time(0..9).
event(add_gluc). event(add_lact). event(sub_gluc). event(sub_lact).
fluent(pres_lact). fluent(pres_gluc). fluent(meta_lact).
% behaviour
initiates(add_gluc, pres_gluc, T).
initiates(add_lact, pres_lact, T).
terminates(sub_gluc, pres_gluc, T).
terminates(sub_lact, pres_lact, T).

11 LAC Scenario / Narrative
% actions
initially(pres(gluc)).
happens(add(lact),1). happens(sub(gluc),2). happens(sub(lact),3). happens(add(lact),4).
happens(add(gluc),5). happens(sub(lact),6). happens(sub(gluc),7).
% observations
not holdsAt(meta(lact),1), not holdsAt(meta(lact),2),
holdsAt(meta(lact),3), not holdsAt(meta(lact),4),
holdsAt(meta(lact),5), not holdsAt(meta(lact),6),
not holdsAt(meta(lact),7), not holdsAt(meta(lact),8).
n.b. in general, we could have partial knowledge of actions and/or observations, many actions per timepoint, etc.

12 Language and Search Bias
% domain-specific mode declarations
modeh(2, initiates(#event,#fluent,+time) ).
modeh(2, terminates(#event,#fluent,+time) ).
modeb(3, holdsAt(#fluent,+time) ).
modeb(3, not holdsAt(#fluent,+time) ).
% built-in preference criterion: Occam's Razor
% (greedily) prefer the simplest (i.e., smallest)
% hypothesis that correctly explains the data
n.b. in general, we need ways to constrain the search space both syntactically and semantically
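For instance (an illustrative clause, not taken from the slides), these mode declarations admit hypothesis clauses such as

initiates(add(lact), meta(lact), T) ← not holdsAt(pres(gluc), T).

where the head is an instance of the first modeh with ground event and fluent constants and an input time variable T, and the body is an instance of the negated modeb sharing that time variable.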

13 Abductive–Inductive Learning
Given: B, E, M.   Return: H.
IDEA: construct and generalise an initial ground hypothesis K, called a Kernel Set.
1. Abduction: the head atoms of K form an abductive explanation Δ = {a_1, …, a_n} of the examples E, i.e. B ∪ Δ |= E.
2. Deduction: the body atoms of K are deductive consequences of the theory B ∪ Δ, i.e. B ∪ Δ |= d_j^i, giving
   K = { a_1 ← d_1^1, d_2^1, …, d_{m_1}^1 ;  … ;  a_n ← d_1^n, d_2^n, …, d_{m_n}^n }.
3. Induction: H is a compressive theory subsuming the theory K (i.e. H subsumes K), of the form
   H = { A_1 ← D_1^1, D_2^1, …, D_{m_1}^1 ;  … ;  A_n ← D_1^n, D_2^n, …, D_{m_n}^n }.

14 Abductive Phase
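(The slide's figure is not reproduced in the transcript.) Following the Kernel Set construction of slide 13, the abductive phase computes a set Δ of ground instances of the modeh declarations that explains the observations, i.e. B ∪ Δ |= E. For the LAC narrative of slide 11, one plausible explanation (an illustration, not read off the slide) is

Δ = { initiates(sub(gluc), meta(lact), 2),
      initiates(add(lact), meta(lact), 4),
      terminates(sub(lact), meta(lact), 3),
      terminates(add(gluc), meta(lact), 5) }

whose atoms become the heads of the Kernel Set clauses.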

15 Deductive Phase
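(Figure again omitted from the transcript.) The deductive phase attaches to each abduced head atom the ground modeb literals that are deducible from B ∪ Δ at the corresponding time point. Continuing the illustrative Δ above, two of the resulting Kernel Set clauses would be:

initiates(sub(gluc), meta(lact), 2) ←
    holdsAt(pres(gluc), 2), holdsAt(pres(lact), 2), not holdsAt(meta(lact), 2).
initiates(add(lact), meta(lact), 4) ←
    not holdsAt(pres(gluc), 4), not holdsAt(pres(lact), 4), not holdsAt(meta(lact), 4).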

16 Inductive Phase
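(Figure again omitted from the transcript.) The inductive phase searches for a compressive hypothesis H that subsumes the Kernel Set, variabilising the time points and dropping unnecessary body literals, under the Occam's Razor preference of slide 12. One hypothesis consistent with the narrative of slide 11 (a reconstruction, not quoted from the slides) is:

initiates(add(lact), meta(lact), T) ← not holdsAt(pres(gluc), T).
initiates(sub(gluc), meta(lact), T) ← holdsAt(pres(lact), T).
terminates(sub(lact), meta(lact), T).
terminates(add(gluc), meta(lact), T).

i.e. lactose metabolism starts when lactose is added in the absence of glucose, or when glucose is removed in the presence of lactose, and stops when lactose is removed or glucose is added.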

17 DEMO

18 Related Work
Sakama; Baral; Otero & Lorenzo; Muggleton & Moyle; Inoue, Iwanuma & Nabeshima

19 Conclusion
XHAIL provides a (stable model) semantics and proof procedure for NM-ILP.
It uses mode declarations in the construction of a Kernel Set to reduce the generalisation search space.
It is well suited to learning temporal theories in the Event Calculus (which provides a more intuitive event-based formalism than pure first-order logic).
But we still need to investigate stability, noise, confidence, …

20 END

