Chapter 17
Planning Based on Model Checking

Lecture slides for Automated Planning: Theory and Practice

Motivation
• Actions with multiple possible outcomes
  – Action failures
    • e.g., gripper drops its load
  – Exogenous events
    • e.g., road closed
• Nondeterministic systems are like Markov Decision Processes (MDPs), but without probabilities attached to the outcomes
  – Useful if accurate probabilities aren't available, or if probability calculations would introduce inaccuracies
[Figure: grasp(c) applied to blocks a, b, c — intended outcome vs. unintended outcome]

Example
• Robot r1 starts at location l1
• Objective is to get r1 to location l4
• π1 = {(s1, move(r1,l1,l2)), (s2, move(r1,l2,l3)), (s3, move(r1,l3,l4))}
• π2 = {(s1, move(r1,l1,l2)), (s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
• π3 = {(s1, move(r1,l1,l4))}
[Figure: nondeterministic state-transition system over states s1–s5; Start = s1, Goal = s4]

Execution Structures
• Execution structure for a policy π:
  – The graph of all of π's execution paths
• Notation: Σπ = (Q, T)
  – Q ⊆ S
  – T ⊆ S × S
• π1 = {(s1, move(r1,l1,l2)), (s2, move(r1,l2,l3)), (s3, move(r1,l3,l4))}
• π2 = {(s1, move(r1,l1,l2)), (s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
  [Figure: execution structure over states s1, s2, s3, s4, s5]
• π3 = {(s1, move(r1,l1,l4))}
  [Figure: execution structure over states s1 and s4]
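The execution structure Σπ = (Q, T) can be computed by a simple reachability sweep over the policy. Below is a minimal Python sketch (not from the slides): the dictionary `gamma`, mapping each (state, action) pair to its set of possible outcome states, is a hypothetical hand-encoding of the example's transition diagram.

```python
# Hypothetical encoding of the example's nondeterministic transitions:
# (state, action) -> set of possible successor states.
gamma = {
    ("s1", "move(r1,l1,l2)"): {"s2", "s5"},   # nondeterministic outcome
    ("s2", "move(r1,l2,l3)"): {"s3"},
    ("s3", "move(r1,l3,l4)"): {"s4"},
    ("s5", "move(r1,l5,l4)"): {"s4"},
}

def execution_structure(policy, s0, gamma):
    """Return Sigma_pi = (Q, T): states and transitions reachable
    from s0 when following the policy."""
    Q, T = {s0}, set()
    frontier = [s0]
    while frontier:
        s = frontier.pop()
        a = policy.get(s)
        if a is None:            # no action for s: a leaf of the structure
            continue
        for s2 in gamma.get((s, a), set()):
            T.add((s, s2))
            if s2 not in Q:
                Q.add(s2)
                frontier.append(s2)
    return Q, T

# pi2 from the slide, written as a state -> action dictionary:
pi2 = {"s1": "move(r1,l1,l2)", "s2": "move(r1,l2,l3)",
       "s3": "move(r1,l3,l4)", "s5": "move(r1,l5,l4)"}
Q, T = execution_structure(pi2, "s1", gamma)
```

Under this encoding, π2's execution structure covers all five states, including the unintended landing spot s5.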
Example (strong planning)
• π = failure
• π' = ∅
• Sπ' = ∅
• Sg ∪ Sπ' = {s4}
[Figure: Start = s1, Goal = s4]

• π'' ← PreImage = {(s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
[Figure: s3 and s5 both lead to s4]

• π = {(s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
• π' = {(s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
• Sπ' = {s2, s3, s5}
• Sg ∪ Sπ' = {s2, s3, s4, s5}

• π'' ← {(s1, move(r1,l1,l2))}
• π ← π' = {(s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}

• π = {(s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
• π' = {(s1, move(r1,l1,l2)), (s2, move(r1,l2,l3)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}

Example (weak planning)
• π = failure
• π' = ∅
• Sπ' = ∅
• Sg ∪ Sπ' = {s4}
[Figure: same system; some arcs labeled "Weak"]

• π'' = PreImage = {(s1, move(r1,l1,l4)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
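The strong-planning trace above follows a fixpoint loop: repeatedly add the strong preimage of the already-solved states until the initial state is covered or no progress is made. A minimal Python sketch (an illustration, not the book's pseudocode), where `gamma` is the same hypothetical (state, action) → outcome-set encoding of the example:

```python
# Hypothetical encoding of the example's transition relation.
gamma = {
    ("s1", "move(r1,l1,l2)"): {"s2", "s5"},  # nondeterministic outcome
    ("s2", "move(r1,l2,l3)"): {"s3"},
    ("s3", "move(r1,l3,l4)"): {"s4"},
    ("s5", "move(r1,l5,l4)"): {"s4"},
}

def strong_preimage(states, gamma):
    """(s, a) pairs such that EVERY possible outcome of a in s is in states."""
    return {(s, a) for (s, a), succ in gamma.items() if succ and succ <= states}

def strong_plan(s0, goal, gamma):
    pi = None        # plays the role of "failure"
    pi_prime = {}    # policy under construction: state -> action
    while True:
        covered = goal | set(pi_prime)           # Sg ∪ Sπ'
        if s0 in covered:
            return pi_prime                      # strong solution found
        if pi == pi_prime:
            return None                          # fixpoint without covering s0
        pi = dict(pi_prime)
        for s, a in strong_preimage(covered, gamma):   # π''
            if s not in covered:                 # keep only new states
                pi_prime.setdefault(s, a)

plan = strong_plan("s1", {"s4"}, gamma)
```

On this encoding the loop reproduces the trace: first {s3, s5}, then s2, then s1, yielding the policy π2 from the earlier slide.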
Example (weak planning, continued)
• π = ∅
• π' = {(s1, move(r1,l1,l4)), (s3, move(r1,l3,l4)), (s5, move(r1,l5,l4))}
• Sπ' = {s1, s3, s5}
• Sg ∪ Sπ' = {s1, s3, s4, s5}
[Figure: Start = s1, Goal = s4; "Weak" arcs marked]

Example 1
• π ← ∅
• π' ← {(s,a) : a is applicable to s}
[Figure: system extended with state s6, where at(r1,l6)]

• π ← {(s,a) : a is applicable to s}
• PruneOutgoing(π', Sg) = π'
• PruneUnconnected(π', Sg) = π'
• RemoveNonProgress(π') = ?

• RemoveNonProgress(π') = as shown in the figure

Example 2: no applicable actions at s5
• π ← ∅
• π' ← {(s,a) : a is applicable to s}
• π ← {(s,a) : a is applicable to s}
• PruneOutgoing(π', Sg) = π'

• PruneUnconnected(π', Sg) = as shown in the figure

• π' ← as shown
• π ← π'
• PruneOutgoing(π', Sg) = π'
• PruneUnconnected(π', Sg) = π', so π = π'
• RemoveNonProgress(π') = as shown
• MkDet(as shown) = no change

Planning for Extended Goals
• Here, "extended" means temporally extended
  – Constraints that apply to some sequence of states
• Examples:
  – want to move to l3, and then to l5
  – want to keep going back and forth between l3 and l5
[Figure: cycle between locations using move(r1,l2,l1) and wait actions]
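The PruneOutgoing and PruneUnconnected steps used in the strong-cyclic examples above can both be sketched as small fixpoint computations. The Python below is an illustrative toy (not the book's pseudocode), on a hypothetical `gamma` with a state that may "leak" outside the policy and a state stuck in a self-loop; all state and action names here are invented.

```python
# Hypothetical toy transition relation: (state, action) -> outcome set.
# "a12" may leak to s5 (outside the policy); "a66" loops forever.
gamma = {
    ("s1", "a12"): {"s2", "s5"},
    ("s2", "a23"): {"s3"},
    ("s3", "a34"): {"s4"},
    ("s6", "a66"): {"s6"},
}

def prune_outgoing(pi, goal, gamma):
    """Repeatedly drop (s, a) pairs whose outcomes may leave goal ∪ states(pi)."""
    while True:
        allowed = goal | set(pi)
        bad = {s for s, a in pi.items() if not gamma[(s, a)] <= allowed}
        if not bad:
            return pi
        pi = {s: a for s, a in pi.items() if s not in bad}

def prune_unconnected(pi, goal, gamma):
    """Keep only (s, a) pairs from which some execution path reaches the goal."""
    keep, old = set(), None
    while keep != old:                # grow the goal-connected set to a fixpoint
        old = set(keep)
        keep = {s for s, a in pi.items() if gamma[(s, a)] & (goal | old)}
    return {s: pi[s] for s in keep}

pi = {"s1": "a12", "s2": "a23", "s3": "a34", "s6": "a66"}
pi = prune_outgoing(pi, {"s4"}, gamma)      # drops s1: it may land in s5
pi = prune_unconnected(pi, {"s4"}, gamma)   # drops the s6 self-loop
```

On this toy input the two prunes leave only the pairs for s2 and s3, from which the goal s4 is always reachable.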
Planning for Extended Goals (continued)
• Context: the internal state of the controller
• Plan: (C, c0, act, ctxt)
  – C = a set of execution contexts
  – c0 is the initial context
  – act: S × C → A
  – ctxt: S × C × S → C
• Section 17.3 extends the ideas in Sections 17.1 and 17.2 to deal with extended goals
  – We'll skip the details
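A (C, c0, act, ctxt) plan can be executed by a small interpreter that tracks the current (state, context) pair. The Python sketch below is illustrative only: the two-context plan for the "back and forth between l3 and l5" example is a hypothetical encoding, and `step` stands in for the environment (deterministic here, though in general the environment picks among nondeterministic outcomes).

```python
def run_plan(plan, s0, step, n_steps):
    """Execute a (C, c0, act, ctxt) plan for n_steps starting in state s0."""
    _, c0, act, ctxt = plan
    s, c = s0, c0
    trace = [(s, c)]
    for _ in range(n_steps):
        a = act[(s, c)]            # act: S × C -> A
        s_next = step(s, a)        # environment picks the actual outcome
        c = ctxt[(s, c, s_next)]   # ctxt: S × C × S -> C
        s = s_next
        trace.append((s, c))
    return trace

# Hypothetical plan: keep going back and forth between l3 and l5.
contexts = {"c1", "c2"}
act = {("l3", "c1"): "move(l3,l5)", ("l5", "c2"): "move(l5,l3)"}
ctxt = {("l3", "c1", "l5"): "c2", ("l5", "c2", "l3"): "c1"}
step = lambda s, a: "l5" if a == "move(l3,l5)" else "l3"
trace = run_plan((contexts, "c1", act, ctxt), "l3", step, 4)
```

Contexts earn their keep when the same state must trigger different actions at different points of the goal, as in "move to l3, and then to l5"; here the interpreter simply alternates between the two locations.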