Logic & Formal Reasoning
(a subtopic of Reasoning)

The big problem for AI is what to say not how to say it. The predicate calculus [formal logic] does no more than provide a uniform language in which knowledge about the world can be expressed and reasoned about.

- Nils Nilsson, Artificial Intelligence: A New Synthesis

Primer on Logic and Logical Concepts. By Erwin M. Segal, Department of Psychology and Center for Cognitive Science at the State University of New York at Buffalo. "Logic was an attempt to describe normative or correct reasoning. A widespread belief in AI, Cognitive Psychology, and Cognitive Science is that there are procedures representing the way people think which can be implemented on computers using logic type rules directly (algorithms) or by finding other ways to implement logical principles. The study of reasoning in this case is the study of 'natural' logic."

Stanford Formal Reasoning Group. "The Stanford Formal Reasoning Group concentrates on Logical AI. This is an approach to the development of symbolic AI. It was first proposed by McCarthy in his 1959 paper, Programs with Common Sense. It uses logic to describe the manner in which intelligent machines or people behave. Workers in logic-based AI hope to reach human-level intelligence with a logic-based system. Such a system, described by sentences in logic, would represent what it knew about the world in general, what it knew about the particular situation and what it knew about its goals, as proposed in Programs with Common Sense. Other data structures, e.g. for representing pictures, would be present together with programs for creating them, manipulating them and for describing them in logic. The holistic program would perform actions that it inferred were appropriate for achieving its goals." - from Overview.
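
The architecture sketched in that overview - sentences in logic describing the world, the current situation, and the agent's goals, from which the program infers an appropriate action - can be illustrated in miniature. The sketch below is an illustration under invented assumptions (the At/Goal facts and the single movement rule are made up for the example); it is not code from Programs with Common Sense or the Stanford group.

    # A toy "knowledge base + inference = action" loop in the spirit of the
    # logic-based agent described above.  The predicates and the rule are
    # hypothetical, invented purely for illustration.
    facts = {("At", "robot", "roomA"), ("Goal", "robot", "roomB")}

    def infer_action(facts):
        # Rule: if the agent is At X and its Goal is Y, with X != Y,
        # then the appropriate action is to move from X to Y.
        location = goal = None
        for pred, agent, place in facts:
            if pred == "At" and agent == "robot":
                location = place
            if pred == "Goal" and agent == "robot":
                goal = place
        if location and goal and location != goal:
            return ("Move", location, goal)
        return ("Wait",)

    print(infer_action(facts))   # -> ('Move', 'roomA', 'roomB')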

Propositional, First-Order And Higher-Order Logics: Basic Definitions, Rules of Inference, and Examples. By Stuart C. Shapiro. In Lucja M. Iwanska & Stuart C. Shapiro, Eds., Natural Language Processing and Knowledge Representation: Language for Knowledge and Knowledge for Language, AAAI Press/The MIT Press, Menlo Park, CA, 2000, 379-395. Here are some excerpts (a brief worked illustration of the propositional/first-order distinction follows the list):

  • 1: "Logic is the study of correct reasoning. It is not a particular KRR language. Thus, it is not proper to say 'We are using (or not using) logic as our KRR language.' There are, indeed, many different logics."
  • 2: "A logic consists of two parts, a language and a method of reasoning. The logical language, in turn, has two aspects, syntax and semantics. ... We will use CarPool World as a simple example. In CarPool World, Tom and Betty carpool...."
  • 4.1: "Propositional Logics (sometimes called Sentential Logics) conceptualize domains at, but not below the level of sentences (or propositions)."
  • 5.1: "First-Order Predicate Logics (FOPLs) conceptualize domains at and below the level of propositions, down to the level of individuals, properties, and relations."
  • 6: "Three important properties of logics are soundness, consistency and completeness."
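
The CarPool World domain mentioned in excerpt 2 makes the contrast between excerpts 4.1 and 5.1 concrete. The particular sentences below are illustrative only, written to be consistent with those excerpts rather than quoted from Shapiro's chapter:

    Propositional logic sees whole sentences as unanalyzable atoms:
        TomDrivesBetty,  BettyDrivesTom
        TomDrivesBetty → ¬BettyDrivesTom        (if Tom drives Betty, Betty is not driving Tom)

    First-order logic exposes the individuals and relations inside those sentences:
        Drives(Tom, Betty)
        ∀x ∀y (Drives(x, y) → Passenger(y))     (whoever is driven is a passenger)
        hence, by universal instantiation and modus ponens:  Passenger(Betty)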

Glossary of First-Order Logic. Maintained by Peter Suber, Philosophy Department, Earlham College. "This glossary is limited to basic set theory, basic recursive function theory, two branches of logic (truth-functional propositional logic and first-order predicate logic) and their metatheory." Also see Professor Suber's other logic resources.

Introduction to Logic. A tutorial from Dave Inman, School of Computing, South Bank University, London. Topics covered: What is Logic? Are we logical? An example of logic. What is good about logic? What is it used for? A little history. Simple types of logic. Complex types of logic. How do you represent the world in logic? How does Prolog work? Overview. References.
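
The tutorial's question "How does Prolog work?" comes down to backward chaining over Horn clauses: to prove a goal, either find it among the facts or find a rule whose head matches it and recursively prove everything in the rule's body. Below is a minimal sketch of that idea (in Python, over ground clauses only, so there is no unification or backtracking; the parent/ancestor clauses are invented for the example):

    # Minimal backward chaining over ground (variable-free) Horn clauses.
    # Real Prolog adds variables, unification, and depth-first backtracking;
    # this keeps only the skeleton of the proof procedure.
    facts = {"parent(ann,bob)", "parent(bob,cal)"}

    rules = [                                  # (head, body): prove every body goal
        ("ancestor(ann,bob)", ["parent(ann,bob)"]),
        ("ancestor(bob,cal)", ["parent(bob,cal)"]),
        ("ancestor(ann,cal)", ["parent(ann,bob)", "ancestor(bob,cal)"]),
    ]

    def prove(goal):
        if goal in facts:                      # goal is a known fact
            return True
        for head, body in rules:               # try each rule whose head matches
            if head == goal and all(prove(g) for g in body):
                return True
        return False

    print(prove("ancestor(ann,cal)"))          # -> True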

Computational Intelligence - A Logical Approach. By David Poole, Alan Mackworth and Randy Goebel. 1998. Oxford University Press, New York. "Our theory is based on logic. Logic has been developed over the centuries as a formal (that is, precise not obtuse) way of representing assumptions about a world and the process of deriving the consequences of those assumptions. For simple agents in simple worlds we start with a highly restricted simple logic. Then as our agent/environment requires, we increase the logical power of the formalism. Since a computer is simply a symbol-manipulation engine, we can easily map our formal theories into computer programs that can control an agent or be used to reason about an agent. Everything we describe is implemented that way." - From the Preface, which is available online, as is Chapter 1.
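
The step the authors call "deriving the consequences of those assumptions" can itself be written in a few lines. The sketch below shows forward chaining over definite clauses, one standard such procedure; it is not code from the book, and the rain/wet-grass clauses are invented for illustration:

    # Forward chaining: repeatedly apply any rule whose body is already
    # satisfied, until no new consequences can be derived.
    facts = {"rain"}
    rules = [                          # (body, head) read as: body -> head
        ({"rain"}, "wet_grass"),
        ({"wet_grass"}, "slippery"),
    ]

    def consequences(facts, rules):
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for body, head in rules:
                if body <= derived and head not in derived:
                    derived.add(head)
                    changed = True
        return derived

    print(consequences(facts, rules))  # -> {'rain', 'wet_grass', 'slippery'}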

Logic lecture slides and accompanying transcripts from Professors Tomás Lozano-Pérez & Leslie Kaelbling's Spring 2003 course, Artificial Intelligence. Available from MIT OpenCourseWare.

Logic and Artificial Intelligence. Entry by Richmond Thomason. The Stanford Encyclopedia of Philosophy (Fall 2003 Edition); Edward N. Zalta, editor. "I imagine that the audience for this entry will consist primarily of philosophers who have little or no familiarity with AI. In writing this survey, I have tried to concentrate on the issues that arise when logic is used in understanding problems in intelligent reasoning and guiding the design of mechanized reasoning systems."

The Development of Formal Logic. Section 1.1.3 of Chapter One (available online) of George F. Luger's textbook, Artificial Intelligence: Structures and Strategies for Complex Problem Solving, 5th Edition (Addison-Wesley; 2005). "Once thinking had come to be regarded as a form of computation, its formalization and eventual mechanization were obvious next steps. As noted in Section 1.1.1, Gottfried Wilhelm von Leibniz, with his Calculus Philosophicus, introduced the first system of formal logic as well as proposed a machine for automating its tasks (Leibniz 1887). Furthermore, the steps and stages of this mechanical solution can be represented as movement through the states of a tree or graph."

Computer generates verifiable mathematics proof. By Will Knight. NewScientist.com news (April 19, 2005). "A computer-assisted proof of a 150-year-old mathematical conjecture can at last be checked by human mathematicians. ... A proof of the theorem [the four colour theorem] was announced by two US mathematicians, Kenneth Appel and Wolfgang Haken, in 1976. But a crucial portion of their work involved checking many thousands of maps - a task that can only feasibly be done using a computer. So a long-standing concern has been that some hidden flaw in the computer code they used might undermine the overall logic of the proof. But now Georges Gonthier, at Microsoft's research laboratory in Cambridge, UK, and Benjamin Werner at INRIA in France have proven the theorem in a way that should remove such concerns. They translated the proof into a language used to represent logical propositions - called Coq - and created logic-checking software to confirm that the steps put forward in the proof make sense."
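
The flavour of such machine-checked proofs can be seen even in a toy theorem: every step must be justified to the proof assistant's kernel, which is what removes doubts about hidden gaps in a long argument. The snippet below is written in Lean rather than Coq, purely as an illustration of the genre, and has nothing to do with the four colour proof itself:

    -- A trivial machine-checked proof: the checker accepts it only because
    -- every step reduces to axioms and previously proved lemmas.
    theorem add_comm_example (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b

    -- A concrete instance, closed by computation:
    example : 2 + 3 = 5 := rfl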

Claude E. Shannon: Founder of Information Theory. By Graham P. Collins. Scientific American Explore (October 14, 2002). "Shannon's M.I.T. master's thesis in electrical engineering has been called the most important of the 20th century: in it the 22-year-old Shannon showed how the logical algebra of 19th-century mathematician George Boole could be implemented using electronic circuits of relays and switches. This most fundamental feature of digital computers' design -- the representation of 'true' and 'false' and '0' and '1' as open or closed switches, and the use of electronic logic gates to make decisions and to carry out arithmetic -- can be traced back to the insights in Shannon's thesis."
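
Shannon's observation - that Boole's algebra can be carried out by switching circuits, and that such circuits can then do arithmetic - is easy to reproduce. The half-adder below, built from an XOR gate and an AND gate, is a standard textbook illustration rather than anything specific to Shannon's thesis:

    # Boolean algebra doing arithmetic: a half-adder made of logic gates.
    # Adding two one-bit numbers gives a sum bit (XOR) and a carry bit (AND).
    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b

    def half_adder(a, b):
        return XOR(a, b), AND(a, b)            # (sum, carry)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} = carry {c}, sum {s}")   # e.g. 1 + 1 = carry 1, sum 0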

The Isaac Newton of logic - It was 150 years ago that George Boole published his classic The Laws of Thought, in which he outlined concepts that form the underpinnings of the modern high-speed computer. By Siobhan Roberts. The Globe and Mail (March 27, 2004; page F9). "It was 150 years ago that George Boole published his literary classic The Laws of Thought, wherein he devised a mathematical language for dealing with mental machinations of logic. It was a symbolic language of thought -- an algebra of logic (algebra is the branch of mathematics that uses letters and other general symbols to represent numbers and quantities in formulas and equations). In doing so, he provided the raw material needed for the design of the modern high-speed computer. His concepts, developed over the past century by other mathematicians but still known as 'Boolean algebra,' form the underpinnings of computer hardware, driving the circuits on computer chips. And, at a much higher level in the brain stem of computers, Boolean algebra operates the software of search engines such as Google."
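
The claim that Boolean algebra "operates the software of search engines" refers to Boolean retrieval: each query term stands for the set of documents containing it, and AND, OR and NOT on queries become intersection, union and difference on those sets. A minimal sketch, with a three-document index invented for the example:

    # Boolean retrieval: term -> set of documents containing that term.
    index = {
        "logic":    {1, 2},
        "boole":    {2, 3},
        "computer": {1, 3},
    }

    # Query: logic AND boole AND NOT computer
    result = (index["logic"] & index["boole"]) - index["computer"]
    print(result)   # -> {2}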

Related Pages