


The 9th Asian Logic Conference

16-19 August, 2005
Novosibirsk, Russia

Abstracts


Applications of logic in computer science

Logic and Probability synthesis in the prediction notion

Vityaev E.E.

Sobolev Institute of Mathematics SB RAS (Novosibirsk)

The problem stated in the preface to the special issue of the Journal of Applied Logic [1], "Artificial intelligence is one key discipline in which probability theory competes with other logics for application. It is becoming vitally important to evaluate and integrate systems that are based on very different approaches to reasoning …", was addressed at the first symposium on "Combining Probability and Logic". In Artificial Intelligence tasks, logic and probability are used jointly to obtain explanations (predictions). There are three main definitions of explanation, given earlier by Hempel: (1) Deductive-Nomological inference (D-N), consisting in the deductive inference of explanations from facts and theory; (2) Deductive-Statistical inference (D-S), consisting in the inference of explanations from facts and a theory that may contain statistical statements; (3) Inductive-Statistical inference (I-S), designed for the inference of facts from inductive theories.

Ambiguities can arise in the process of the I-S inference: mutually exclusive statements may be inferred from the theory. To circumvent these ambiguities, Hempel introduced the Requirement of Maximal Specificity (RMS) [2], the requirement of incorporating all information pertaining to the explanation (prediction). Hempel did not give a formal definition of the RMS rules, nor did he specify how to derive them. W. Salmon, for example, defined the I-S inference on the basis of the causality concept [3].

Let us define an inference that approximates all three types of inference and thereby provides a synthesis of logic and probability in explanation (prediction). We define the semantic probabilistic inference (SPI) [4], which, according to the probabilistic model M of the class D of models [4], inductively infers a set PT of rules that are, in a certain sense, a probabilistic approximation of the theory Th of the class of models D.
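As a toy illustration of the rules involved, the following sketch evaluates rules of the form A1 & … & Ak -> A0 against a finite boolean data sample and selects those predicting with conditional probability 1 (the sub-theory PT1 defined below). The function names, rule encoding, and data are invented for this sketch; the paper's SPI operates on a probabilistic model M of a class D of models, not on a finite sample.

```python
def cond_prob(data, premises, conclusion):
    """Estimate Prob(A0 | A1 & ... & Ak) from a list of boolean rows."""
    matching = [row for row in data if all(row[a] for a in premises)]
    if not matching:
        return None  # premise never satisfied: probability undefined
    return sum(1 for row in matching if row[conclusion]) / len(matching)

def pt1(rules, data):
    """Select the sub-theory PT1: rules that predict with probability 1."""
    return [r for r in rules if cond_prob(data, r[0], r[1]) == 1.0]

# Invented toy data: every row satisfying A1 & A2 also has A0,
# so the rule (A1 & A2 -> A0) lands in PT1, while (A1 -> A0) does not.
data = [
    {"A0": True,  "A1": True,  "A2": True},
    {"A0": True,  "A1": True,  "A2": True},
    {"A0": False, "A1": True,  "A2": False},
]
rules = [(("A1", "A2"), "A0"), (("A1",), "A0")]
print(pt1(rules, data))  # only the first rule predicts with probability 1
```

The rules kept by `pt1` are exactly those usable in a deterministic (D-N)-style derivation; the remaining rules of PT carry conditional probabilities below 1 and support the statistical inference modes.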
The set PT of rules has the following properties:

1. If we set apart from PT all the rules that predict with conditional probability 1, PT1 = {C = (A1&…&Ak -> A0) in PT | Prob(A0/A1&…&Ak) = 1}, then the theory Th is inferred from the rules PT1 (Theorem 1). Using the rules PT thus allows one to take full advantage of the (D-N) inference of predictions of facts of the class D of models on the basis of the theory Th and the other data of the class D of models.

2. All the RMS rules that can be defined for the class D of models and used in the (I-S) inference are contained in PT, and they are the leaves of the SPI inference (Theorem 2). This allows one to take full advantage of the I-S inference using the rules from PT and the data of the class D of models.

3. The rules PT approximate any D-S inference (Theorem 3).

Theorem 1. PT1 |- Th [5].

Theorem 2. Every rule at the end of a branch of the SPI inference is an RMS rule. Conversely, all the RMS rules belonging to the probabilistic model M of data and formulated in the language of the theory Th are inferred in the process of the SPI inference.

The explanation (prediction) of a fact F in the D-S inference is the inference of the fact from the theory Th and probabilistic facts from the probabilistic model M of data. The estimate v of the probability of the D-S inference can be calculated in the framework of probabilistic logic.

Theorem 3. For any D-S inference of a fact F with estimate v from the theory Th and facts from the probabilistic model M of data, there is an SPI inference of the same fact F on the basis of the same data with an estimate that is not worse than v.

References:

1. Jon Williamson, Dov Gabbay. Editorial. Special issue on Combining Probability and Logic, Journal of Applied Logic, 1 (2003) 135-138.
2. Atocha Aliseda-Llera. Seeking Explanations: Abduction in Logic, Philosophy of Science and Artificial Intelligence. ILLC Dissertation Series, Universiteit van Amsterdam, 1997, 196 pp.
3. W. Salmon. Scientific Explanation and the Causal Structure of the World. Princeton: Princeton University Press, 1984.
4. Vityaev E.E. Semantic approach to knowledge base creation. Semantic probabilistic inference of the best PROLOG programs for prediction by a probabilistic model of data. Logic and Semantic Programming (Computational Systems, v. 146), Novosibirsk, 1992, pp. 19-49. (in Russian)
5. Evgenii Vityaev, Boris Kovalerchuk. Empirical Theories Discovery Based on the Measurement Theory. Minds and Machines, v. 14, no. 4, pp. 551-573, 2004.
