Probabilistic Representation And Reasoning


Probabilistic Representation and Reasoning
Alessandro Panella
Department of Computer Science, University of Illinois at Chicago
May 4, 2010

Sources
R. J. Brachman and H. J. Levesque, Knowledge Representation and Reasoning, Elsevier, 2003.
S. Russell and P. Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., Prentice Hall, 2003.

Outline
1. Introduction
   Motivations
   General Concepts
2. Probability and Bayesian Networks
   Probability Theory
   Bayesian Networks
   Beyond BNs: FOL of probability

Introduction: Motivations
What is the problem?
Commonsense knowledge is not always categorical. A sentence such as ∀x (Bird(x) → Flies(x)) fails to capture much of what we could think and/or say about the world: the world is non-categorical! Sources of uncertainty include:
Statistical information.
Personal belief.
Imperfect measurement tools.
Unreliable information.
Fuzzy concepts (vagueness).

Introduction: General Concepts
Uncertainty ≠ vagueness: "John is probably tall" (uncertainty) vs. "John is quite tall" (vagueness).

Introduction: General Concepts
Different flavors of probability
Statistical information about the domain: things are usually (rarely, almost always, ...) a certain way.
"90% of birds fly": ∀x (P(Bird(x) → Flies(x)) = 0.9)

Introduction: General Concepts
Different flavors of probability (cont'd)
Degree of belief, arising from laziness or ignorance:
P(Flies(Tweety)) = 0.75
P(∀x (Bird(x) → Flies(x))) = 0.6

Introduction: General Concepts
From statistics to beliefs
Going from statistical information to a degree of belief, e.g. "90% of birds fly" ⇒ P(Flies(Tweety)) = 0.9.
Usual procedure (direct inference):
1. Find a reference class.
2. Use the statistics for that class to compute the degree of belief.
Problem: multiple reference classes. Example: what is the probability that Eric is tall, given that
(A) 20% of American males are tall,
(B) 25% of Californian males are tall,
(C) Eric is from California,
(D) 13% of computer scientists are tall?
(See the worked note below.)
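A brief worked note, using the common convention of preferring the most specific reference class with reliable statistics (this convention is an addition here, not stated on the slide): among the classes above, Californian males is the most specific one that applies to Eric, so direct inference would give

    P(Tall(Eric)) = 0.25

If we also learned that Eric is a computer scientist, the classes "Californian males" (25%) and "computer scientists" (13%) would conflict and neither contains the other, which is exactly the multiple-reference-class problem the slide points out.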

Introduction: General Concepts
Objective vs. subjective probability: a philosophical debate
Objectivist view: probabilities are real aspects of the universe, propensities of objects to be a certain way, independent of who is assessing the probability. This is the philosophical position supporting frequentist statistics.
Subjectivist view: probability is the degree of belief of the observer and has no physical significance. This is the philosophical position supporting Bayesian statistics.
In the end, this distinction has little practical significance.

Bayesian Networks: Probability Theory
Preliminary Concepts
Sample space U: the set of all possible outcomes.
Events a ⊆ U; B is the set of all possible events.
Probability function Pr : B → [0, 1].
Axioms of probability:
1. Pr(a) ≥ 0
2. Pr(U) = 1
3. If a_1, ..., a_n are disjoint events, then Pr(a_1 ∪ ... ∪ a_n) = Σ_i Pr(a_i)
It follows that:
4. Pr(¬a) = 1 − Pr(a)
5. Pr(∅) = 0
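To make the "it follows" step explicit (a small derivation added here, not on the original slide): a and ¬a = U \ a are disjoint and their union is U, so axioms 2 and 3 give

    Pr(a) + Pr(¬a) = Pr(a ∪ ¬a) = Pr(U) = 1, hence Pr(¬a) = 1 − Pr(a),

and applying this with a = U gives Pr(∅) = Pr(¬U) = 1 − Pr(U) = 0.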

Bayesian Networks: Probability Theory
Conditioning and Bayes' Rule
1. Conditional probability: Pr(a | b) = Pr(a ∧ b) / Pr(b)
2. Conditional independence: Pr(a | s) = Pr(a | s, b); plain independence is the special case S = ∅
3. Conjunction: Pr(a ∧ b) = Pr(a | b) Pr(b)
4. If {b_1, ..., b_n} is a partitioning of U, then Pr(a) = Σ_{i=1}^{n} Pr(a | b_i) Pr(b_i)
5. Bayes' rule: Pr(a | b) = Pr(b | a) Pr(a) / Pr(b)
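A minimal numeric sketch of items 4 and 5; the disease/test numbers below are invented for illustration and do not come from the slides:

    # Total probability (item 4) and Bayes' rule (item 5) on made-up numbers.
    pr_a = 0.01                # Pr(a): prior, e.g. having a disease
    pr_b_given_a = 0.90        # Pr(b | a): test positive given disease
    pr_b_given_not_a = 0.05    # Pr(b | not a): false positive rate

    # Pr(b) = Pr(b | a) Pr(a) + Pr(b | not a) Pr(not a)   ({a, not a} partitions U)
    pr_b = pr_b_given_a * pr_a + pr_b_given_not_a * (1 - pr_a)

    # Pr(a | b) = Pr(b | a) Pr(a) / Pr(b)
    pr_a_given_b = pr_b_given_a * pr_a / pr_b
    print(round(pr_a_given_b, 3))   # about 0.154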

Bayesian Networks: Probability Theory
Random Variables and Joint Probability
(Propositional) random variable (r.v.): a "feature" of the world whose value is uncertain, e.g. X_1 = outcome of the first coin toss. It may be discrete or continuous.
Interpretation I ∈ U: a specification of the value of every r.v.
Joint probability distribution J(I): the degree of belief the agent assigns to interpretation I.
0 ≤ J(I) ≤ 1 and Σ_I J(I) = 1
For any event a, Pr(a) = Σ_{I ⊨ a} J(I)
Not useful for calculation as is: exponential number of interpretations.
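A small sketch of an explicit joint distribution over three Boolean r.v.s. The variable names and numbers follow the dentist example from AIMA (2nd ed.), which later slides also borrow from; any nonnegative table summing to 1 would do:

    # J maps each interpretation (a full assignment to Cavity, Toothache, Catch)
    # to the agent's degree of belief in it.
    variables = ("Cavity", "Toothache", "Catch")
    J = {
        (True,  True,  True):  0.108, (True,  True,  False): 0.012,
        (True,  False, True):  0.072, (True,  False, False): 0.008,
        (False, True,  True):  0.016, (False, True,  False): 0.064,
        (False, False, True):  0.144, (False, False, False): 0.576,
    }
    assert abs(sum(J.values()) - 1.0) < 1e-9   # the J(I) sum to 1

    def pr(event):
        """Pr(a) = sum of J(I) over interpretations I that satisfy event a."""
        return sum(p for interp, p in J.items()
                   if event(dict(zip(variables, interp))))

    print(pr(lambda w: w["Cavity"] or w["Toothache"]))   # about 0.28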

Bayesian Networks
A Bayesian (or belief) network (BN) is a directed acyclic graph where:
nodes P_i are r.v.s;
arcs represent (intuitively) direct dependency relations;
each node has a conditional probability distribution Pr(P_i | Parents(P_i)).
[Figure: the Weather, Cavity, Toothache, Catch network, from AIMA, 2nd ed.]

Bayesian Networks
Basic property
A node is conditionally independent of its non-descendants, given its parents.
[Figure: a node X with parents U_1, ..., U_m, children Y_1, ..., Y_n, and the children's other parents Z_1j, ..., Z_nj; from AIMA, 2nd ed.]
Full joint distribution:
Pr(I) = Pr(p_1, ..., p_n) = ∏_{i=1}^{n} Pr(p_i | Parents(P_i))
Usually requires much less space than the explicit joint distribution.
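To quantify "much less space" (a short note added here; the counting argument is the standard one from AIMA): with Boolean variables, a node with k parents needs 2^k numbers, so a network of n nodes with at most k parents each needs at most n · 2^k numbers, versus 2^n − 1 for the full joint. For instance, the five-node burglary network on the next slide needs 1 + 1 + 4 + 2 + 2 = 10 numbers rather than 2^5 − 1 = 31.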

Bayesian Networks
[Figure: the burglary network (Burglary, Earthquake, Alarm, JohnCalls, MaryCalls) with its CPTs, e.g. P(E) = .002 and P(M | A) = .70 / .01 for A true/false; from AIMA, 2nd ed.]
Pr(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = Pr(j | a) Pr(m | a) Pr(a | ¬b, ¬e) Pr(¬b) Pr(¬e)
                        = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062
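A sketch of this computation in code. The network is stored as a dict of CPTs; the numbers are the standard ones from AIMA's burglary example (2nd ed.), of which only a few survive in the transcription above, so treat the rest as filled in from that source:

    # Each CPT maps an assignment of the parents to Pr(variable = True | parents).
    parents = {"B": [], "E": [], "A": ["B", "E"], "J": ["A"], "M": ["A"]}
    cpt = {
        "B": {(): 0.001},
        "E": {(): 0.002},
        "A": {(True, True): 0.95, (True, False): 0.94,
              (False, True): 0.29, (False, False): 0.001},
        "J": {(True,): 0.90, (False,): 0.05},
        "M": {(True,): 0.70, (False,): 0.01},
    }

    def joint(assignment):
        """Pr(I) = product over nodes of Pr(p_i | Parents(P_i))."""
        p = 1.0
        for var, pars in parents.items():
            prob_true = cpt[var][tuple(assignment[u] for u in pars)]
            p *= prob_true if assignment[var] else 1.0 - prob_true
        return p

    # The event on the slide: j, m, a, not b, not e
    print(joint({"B": False, "E": False, "A": True, "J": True, "M": True}))  # about 0.00062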

Bayesian Networks
Queries in BNs
Query variable X
Set of evidence variables E
Set of nonevidence (hidden) variables Y = (all variables) \ ({X} ∪ E)
Pr(X | e) = α Pr(X, e) = α Σ_y Pr(X, e, y)
Example:
Pr(b | j, m) = α Σ_e Σ_a Pr(b, e, a, j, m)
             = α Σ_e Σ_a Pr(b) Pr(e) Pr(a | b, e) Pr(j | a) Pr(m | a)
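A self-contained sketch of this inference by enumeration, with the AIMA burglary CPTs hard-coded again so the snippet runs on its own; the ≈0.284/0.716 answer is the textbook result, quoted as a sanity check rather than taken from the slides:

    from itertools import product

    def joint(b, e, a, j, m):
        """Full joint of the burglary network, factored as on the slide."""
        def pr(p_true, val):                 # Pr(var = val) given Pr(var = True)
            return p_true if val else 1.0 - p_true
        p_a = {(True, True): 0.95, (True, False): 0.94,
               (False, True): 0.29, (False, False): 0.001}[(b, e)]
        return (pr(0.001, b) * pr(0.002, e) * pr(p_a, a) *
                pr(0.90 if a else 0.05, j) * pr(0.70 if a else 0.01, m))

    # Pr(B | j, m) = alpha * sum over e, a of Pr(B, e, a, j, m)
    unnormalized = {b: sum(joint(b, e, a, True, True)
                           for e, a in product((True, False), repeat=2))
                    for b in (True, False)}
    alpha = 1.0 / sum(unnormalized.values())
    print({b: round(alpha * p, 3) for b, p in unnormalized.items()})
    # roughly {True: 0.284, False: 0.716}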

Bayesian Networks
Computational complexity of BN inference
Naive computation: O(n 2^n).
Depth-first computation: O(2^n), but some terms are recurrent.
Dynamic programming approach (variable elimination):
linear time for polytrees (at most one path between any two nodes);
exponential complexity in general.
Join tree algorithms are usually used in commercial tools.
Not surprising: BNs subsume propositional logic.
Inference is #P-hard.
Need for approximate techniques: many forms of sampling algorithms.
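A minimal sketch of the core operation behind variable elimination, summing a variable out of a factor. This is a generic illustration, not the slides' algorithm; the factor and its numbers are made up:

    from collections import defaultdict

    def sum_out(var, variables, table):
        """Marginalize `var` out of a factor given as {assignment tuple: value}."""
        idx = variables.index(var)
        new_vars = variables[:idx] + variables[idx + 1:]
        new_table = defaultdict(float)
        for assignment, value in table.items():
            new_table[assignment[:idx] + assignment[idx + 1:]] += value
        return new_vars, dict(new_table)

    # e.g. a factor f(A, B), summing out A
    f_vars = ["A", "B"]
    f = {(True, True): 0.3, (True, False): 0.2,
         (False, True): 0.1, (False, False): 0.4}
    print(sum_out("A", f_vars, f))   # (['B'], {(True,): 0.4, (False,): ~0.6})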

Bayesian Networks: FOL
Beyond BNs
BNs are essentially propositional: a finite and fixed set of variables, each with a fixed domain. They don't capture regularities of the domain.
Extend probability to first-order logic: First-Order Probabilistic Languages (FOPL).
Theoretical difficulties [Halpern 1990]; the semantics must be restricted.
Several approaches:
Bayesian Logic (BLOG) [Milch et al.]
Markov Logic Networks [Domingos et al.]

