CSC242: Intro To AI


CSC242: Intro to AI
Lecture 18

Quiz
Stop Time: 2:15

Exact Inference in Bayesian Networks

P(X | e) = α P(X, e) = α Σ_y P(X, e, y)

where the joint factors as P(x1, …, xn) = Π_i P(xi | parents(Xi))

Exact Inference in Bayesian Networks
- Exact inference in BNs is NP-hard
- Can be shown to be as hard as computing the number of satisfying assignments of a propositional logic formula (#P-hard)
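The summation above can be made concrete with brute-force enumeration (a Python sketch, not from the slides; it uses the sprinkler-network CPTs that appear later in the lecture, and it is exponential in the number of variables, which is what motivates the approximate methods that follow):

```python
from itertools import product

# CPTs of the sprinkler network used later in the lecture.
def joint(c, s, r, w):
    """Full joint P(c, s, r, w) = product of CPT entries (chain rule for BNs)."""
    def pv(p, val):                      # P(var = val) given P(var = true) = p
        return p if val else 1 - p
    return (pv(0.5, c)
            * pv(0.10 if c else 0.50, s)
            * pv(0.80 if c else 0.20, r)
            * pv({(True, True): 0.99, (True, False): 0.90,
                  (False, True): 0.90, (False, False): 0.00}[(s, r)], w))

def enumerate_query(query_index, evidence):
    """P(X | e) by summing the joint over all worlds consistent with e."""
    dist = {True: 0.0, False: 0.0}
    for world in product([True, False], repeat=4):
        if all(world[i] == v for i, v in evidence.items()):
            dist[world[query_index]] += joint(*world)
    total = dist[True] + dist[False]     # this is P(e); dividing normalizes
    return {v: p / total for v, p in dist.items()}

# Variables indexed (0: Cloudy, 1: Sprinkler, 2: Rain, 3: WetGrass).
print(enumerate_query(2, {1: True}))    # P(Rain | Sprinkler = true)
```

The loop visits all 2^n worlds, so this is only feasible for tiny networks.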

Approximate Inference in Bayesian Networks

The Goal
- Query variable X
- Evidence variables E1, …, Em
- Observed values: e = e1, …, em
- Non-evidence, non-query ("hidden") variables: Y
- Approximate: P(X | e)

A Simpler Goal
- Query variable X
- Evidence variables E1, …, Em
- Observed values: e = e1, …, em
- Non-evidence, non-query ("hidden") variables: Y
- Approximate: P(X)

A Really Simple Goal
Heads
Goal: P(Heads), i.e., P(Heads = true), i.e., P(heads)

A Really Simple Goal
# of flips: N
# of heads: N_heads

A Really Simple Goal
P(heads) ≈ N_heads / N

A Really Simple Goal
P(heads) = lim_{N→∞} N_heads / N

Sampling
- Generating events (possible worlds) from a distribution
- Estimating probabilities as the ratio of observed events to total events
- Consistent estimate: becomes exact in the large-sample limit
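The coin-flip estimate above can be sketched in a few lines of Python (a minimal illustration, not from the lecture; the function name and the fair-coin bias of 0.5 are assumptions):

```python
import random

def estimate_heads(n_flips, p_heads=0.5, seed=0):
    """Estimate P(heads) as the fraction of heads among n_flips samples."""
    rng = random.Random(seed)
    n_heads = sum(rng.random() < p_heads for _ in range(n_flips))
    return n_heads / n_flips

# Consistent estimate: the error shrinks as the number of flips grows.
print(estimate_heads(100))
print(estimate_heads(100_000))
```

With more flips the estimate concentrates around the true bias, which is exactly the large-sample limit on the slide.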

The sprinkler network (Cloudy → Sprinkler, Cloudy → Rain, Sprinkler and Rain → WetGrass):

P(C) = 0.5

C | P(S)        C | P(R)
t | 0.10        t | 0.80
f | 0.50        f | 0.20

S R | P(W)
t t | 0.99
t f | 0.90
f t | 0.90
f f | 0.00

(Sprinkler network CPTs as above.)
Query: P(Rain = true)

Sampling
- Generate assignments of values to the random variables
- Consistent with the full joint distribution encoded in the network
- In the sense that, in the limit, the probability of any event equals the frequency of its occurrence

(Sprinkler network CPTs as above.)
Sample: ⟨Cloudy = true, Sprinkler = false, Rain = true, WetGrass = true⟩

Generating Samples
- Sample each variable in topological order
- A child appears after its parents
- Choose the value for each variable conditioned on the values already chosen for its parents
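The procedure above (often called prior or forward sampling) can be sketched for the sprinkler network; the CPT values come from the slides, while the function and variable names are illustrative:

```python
import random

# CPTs of the sprinkler network (values from the slides):
# probability that each variable is true, given its parents.
P_C = 0.5
P_S = {True: 0.10, False: 0.50}                    # P(Sprinkler=t | Cloudy)
P_R = {True: 0.80, False: 0.20}                    # P(Rain=t | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,    # P(WetGrass=t | S, R)
       (False, True): 0.90, (False, False): 0.00}

def prior_sample(rng):
    """Sample each variable in topological order, given its parents' values."""
    c = rng.random() < P_C
    s = rng.random() < P_S[c]
    r = rng.random() < P_R[c]
    w = rng.random() < P_W[(s, r)]
    return {"Cloudy": c, "Sprinkler": s, "Rain": r, "WetGrass": w}

rng = random.Random(0)
print(prior_sample(rng))
```

Each call yields one possible world, and in the limit each world appears with exactly its joint probability under the network.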

Generating a sample from the sprinkler network, step by step:
1. Sample Cloudy from P(Cloudy) = ⟨0.5, 0.5⟩ → true
2. Sample Sprinkler from P(Sprinkler | Cloudy = true) = ⟨0.1, 0.9⟩ → false
3. Sample Rain from P(Rain | Cloudy = true) = ⟨0.8, 0.2⟩ → true
4. Sample WetGrass from P(WetGrass | Sprinkler = false, Rain = true) = ⟨0.9, 0.1⟩ → true

Result: ⟨Cloudy = true, Sprinkler = false, Rain = true, WetGrass = true⟩
Guaranteed to be a consistent estimate (becomes exact in the large-sample limit)

Sampling
- Generate an assignment of values to the random variables
- Consistent with the full joint distribution encoded in the network
- In the sense that, in the limit, the probability of any event equals the frequency of its occurrence

The Goal
- Query variable X
- Evidence variables E1, …, Em
- Observed values: e = e1, …, em
- Non-evidence, non-query ("hidden") variables: Y
- Approximate: P(X | e)

(Sprinkler network CPTs as above.)
Query: P(Rain | Sprinkler = true)
Sample: ⟨Cloudy = true, Sprinkler = false, Rain = true, WetGrass = true⟩

Rejection Sampling
- Generate samples from the prior distribution specified by the network
- Reject a sample if it is inconsistent with the evidence
- Use the remaining samples to estimate the probability of the event

Estimating P(Rain | Sprinkler = true) from 100 samples:
Sprinkler = false: 73 (rejected); Sprinkler = true: 27
Of the 27 kept: Rain = true: 8; Rain = false: 19
P̂(Rain | Sprinkler = true) = ⟨8/27, 19/27⟩ ≈ ⟨0.296, 0.704⟩

Rejection Sampling
- Generate samples from the prior distribution specified by the network
- Reject a sample if it is inconsistent with the evidence
- Use the remaining samples to estimate the probability of the event
- The fraction of samples consistent with the evidence drops exponentially with the number of evidence variables
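A sketch of rejection sampling for the worked query above (Python; the CPT values are the slides' sprinkler network, the function names are illustrative):

```python
import random

# CPTs of the sprinkler network (values from the slides).
P_C = 0.5
P_S = {True: 0.10, False: 0.50}
P_R = {True: 0.80, False: 0.20}
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}

def prior_sample(rng):
    """One sample from the prior distribution, in topological order."""
    c = rng.random() < P_C
    s = rng.random() < P_S[c]
    r = rng.random() < P_R[c]
    w = rng.random() < P_W[(s, r)]
    return {"Cloudy": c, "Sprinkler": s, "Rain": r, "WetGrass": w}

def rejection_sample(query, evidence, n, seed=0):
    """Estimate P(query=true | evidence), discarding inconsistent samples."""
    rng = random.Random(seed)
    kept = hits = 0
    for _ in range(n):
        sample = prior_sample(rng)
        if all(sample[var] == val for var, val in evidence.items()):
            kept += 1
            hits += sample[query]      # count kept samples with query = true
        # Samples inconsistent with the evidence are simply thrown away.
    return hits / kept if kept else None

# Exact answer is 0.09 / 0.30 = 0.3; the estimate converges to it.
print(rejection_sample("Rain", {"Sprinkler": True}, 100_000))
```

Note how only about 30% of the work survives here; with more evidence variables the kept fraction shrinks exponentially, as the slide warns.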

Likelihood Weighting
- Generate only samples consistent with the evidence, i.e., fix the values of the evidence variables
- Instead of counting 1 for each non-rejected sample, weight the count by the likelihood (probability) of the evidence given the sampled values

Likelihood weighting for P(Rain | Cloudy = true, WetGrass = true), step by step:
1. Initialize the weight: w = 1.0
2. Cloudy is an evidence variable: fix Cloudy = true and multiply w by P(Cloudy = true): w = 1.0 × 0.5 = 0.5
3. Sample Sprinkler from P(Sprinkler | Cloudy = true) → false
4. Sample Rain from P(Rain | Cloudy = true) → true
5. WetGrass is an evidence variable: fix WetGrass = true and multiply w by P(WetGrass = true | Sprinkler = false, Rain = true): w = 0.5 × 0.9 = 0.45

Result: ⟨Cloudy = true, Sprinkler = false, Rain = true, WetGrass = true⟩ with weight w = 0.45

Likelihood Weighting
- Generate each sample using the topological order
- Evidence variable: fix its value to the evidence value and update the weight of the sample using the probability in the network
- Non-evidence variable: sample its value using the probabilities in the network (given its parents)

Likelihood Weighting
Pros:
- Doesn't reject any samples
Cons:
- More evidence → lower weights
- Affected by the order of the evidence variables in the topological sort (later is worse)
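The algorithm above can be sketched as follows (Python; the CPT values are from the slides, everything else is illustrative):

```python
import random

TOPO_ORDER = ["Cloudy", "Sprinkler", "Rain", "WetGrass"]

def p_true(var, event):
    """P(var = true | parents), CPT values from the slides."""
    if var == "Cloudy":
        return 0.5
    if var == "Sprinkler":
        return 0.10 if event["Cloudy"] else 0.50
    if var == "Rain":
        return 0.80 if event["Cloudy"] else 0.20
    return {(True, True): 0.99, (True, False): 0.90,
            (False, True): 0.90, (False, False): 0.00}[
                (event["Sprinkler"], event["Rain"])]

def weighted_sample(evidence, rng):
    """One sample: evidence variables are fixed and weighted, others sampled."""
    event, w = {}, 1.0
    for var in TOPO_ORDER:
        p = p_true(var, event)
        if var in evidence:
            event[var] = evidence[var]
            w *= p if evidence[var] else 1 - p   # weight, don't sample
        else:
            event[var] = rng.random() < p        # sample given parents
    return event, w

def likelihood_weighting(query, evidence, n, seed=0):
    """Estimate P(query=true | evidence) as a weighted frequency."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        event, w = weighted_sample(evidence, rng)
        den += w
        if event[query]:
            num += w
    return num / den

# Exact answer is 0.7272 / 0.7452 ≈ 0.976.
print(likelihood_weighting("Rain", {"Cloudy": True, "WetGrass": True}, 100_000))
```

No sample is discarded, but when the evidence is unlikely most weights end up tiny, which is the con listed above.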

Approximate Inference in Bayesian Networks
- Rejection Sampling
- Likelihood Weighting

(Sprinkler network CPTs as above.)
Query: P(Rain | Sprinkler = true, WetGrass = true)
State: ⟨Cloudy = true, Sprinkler = true, Rain = false, WetGrass = true⟩

Markov Chain Monte Carlo Simulation
- To approximate: P(X | e)
- Generate a sequence of states
- The values of the evidence variables are fixed
- The values of the other variables appear in the right proportion given the distribution encoded by the network

(Figure: a node X with parents U1, …, Um, children Y1, …, Yn, and the children's other parents Z1j, …, Znj, illustrating conditional independence given the Markov blanket.)

Markov Blanket
- The Markov blanket of a node is its parents, its children, and its children's parents
- A node is conditionally independent of all other nodes in the network given its Markov blanket
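The definition above translates directly into code (a sketch; the graph representation, a dict mapping each node to the set of its parents, is an assumption):

```python
def markov_blanket(node, parents):
    """Blanket = the node's parents, its children, and the children's
    other parents.  `parents` maps each node to the set of its parents."""
    children = {n for n, ps in parents.items() if node in ps}
    blanket = set(parents[node]) | children
    for child in children:
        blanket |= parents[child]          # the children's other parents
    blanket.discard(node)                  # the node is not in its own blanket
    return blanket

# Sprinkler network: Cloudy -> Sprinkler, Cloudy -> Rain, both -> WetGrass.
parents = {"Cloudy": set(), "Sprinkler": {"Cloudy"},
           "Rain": {"Cloudy"}, "WetGrass": {"Sprinkler", "Rain"}}
print(markov_blanket("Rain", parents))     # Cloudy, Sprinkler, WetGrass
```

For Rain this yields its parent (Cloudy), its child (WetGrass), and the child's other parent (Sprinkler).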

Gibbs Sampling (MCMC Simulation)
- To approximate: P(X | e)
- Start in a state with the evidence variables set to their evidence values (others arbitrary)
- On each step, sample the non-evidence variables conditioned on the values of the variables in their Markov blankets
- Order irrelevant

Gibbs sampling for P(Rain | Sprinkler = true, WetGrass = true), step by step:
Initial state: ⟨Cloudy = true, Sprinkler = true, Rain = false, WetGrass = true⟩
1. Resample Cloudy from P(Cloudy | Sprinkler = true, Rain = false) → false
2. Resample Rain from P(Rain | Cloudy = false, Sprinkler = true, WetGrass = true) → true
New state: ⟨Cloudy = false, Sprinkler = true, Rain = true, WetGrass = true⟩

Gibbs Sampling
- To approximate: P(X | e)
- Start in a state with the evidence variables set to their evidence values (others arbitrary)
- On each step, sample the non-evidence variables conditioned on the values of the variables in their Markov blankets
- Order irrelevant
- A form of local search!
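The sampler above can be sketched for the running query (Python; CPT values are from the slides, the code structure is illustrative). Each sweep resamples every non-evidence variable from its distribution given its Markov blanket, computed as the product of the variable's own CPT entry and its children's:

```python
import random

CHILDREN = {"Cloudy": ["Sprinkler", "Rain"], "Sprinkler": ["WetGrass"],
            "Rain": ["WetGrass"], "WetGrass": []}

def p_true(var, event):
    """P(var = true | parents), CPT values from the slides."""
    if var == "Cloudy":
        return 0.5
    if var == "Sprinkler":
        return 0.10 if event["Cloudy"] else 0.50
    if var == "Rain":
        return 0.80 if event["Cloudy"] else 0.20
    return {(True, True): 0.99, (True, False): 0.90,
            (False, True): 0.90, (False, False): 0.00}[
                (event["Sprinkler"], event["Rain"])]

def blanket_score(var, value, event):
    """Unnormalized P(var=value | Markov blanket): the variable's own
    CPT entry times the CPT entries of its children."""
    e = dict(event, **{var: value})
    p = p_true(var, e) if value else 1 - p_true(var, e)
    for child in CHILDREN[var]:
        pc = p_true(child, e)
        p *= pc if e[child] else 1 - pc
    return p

def gibbs(query, evidence, n_sweeps, seed=0):
    """Estimate P(query=true | evidence) by Gibbs sampling."""
    rng = random.Random(seed)
    event = dict(evidence)
    hidden = [v for v in CHILDREN if v not in evidence]
    for v in hidden:
        event[v] = rng.random() < 0.5            # arbitrary initial state
    hits = 0
    for _ in range(n_sweeps):
        for var in hidden:                       # resample each hidden variable
            t = blanket_score(var, True, event)
            f = blanket_score(var, False, event)
            event[var] = rng.random() < t / (t + f)
        hits += event[query]
    return hits / n_sweeps

# Exact answer is 0.0891 / 0.2781 ≈ 0.320.
print(gibbs("Rain", {"Sprinkler": True, "WetGrass": True}, 50_000))
```

The fraction of visited states with Rain = true converges to the query probability; successive states differ in one variable at a time, which is the local-search flavor the slide points out.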

Exact Inference in Bayesian Networks
- #P-hard, even for a distribution described as a Bayesian network

Approximate Inference in Bayesian Networks
- Sampling consistent with a distribution
- Rejection Sampling: rejects too much
- Likelihood Weighting: weights get too small
- Gibbs Sampling: an MCMC algorithm (like local search)
- All generate consistent estimates (equal to the exact probability in the large-sample limit)

Project 3: Inference in Bayesian Networks
Implement and evaluate
Full 2/spring2013/projects/project-03.pdf
Due: Fri 5 Apr 23:00

For Next Time: AIMA 15.0-15.3

