Propositional Logic

Author: Ernad Mujakic
Date: 2025-08-02


Propositional logic is a branch of logic that deals with propositions, which are statements that are either true or false. An example of a proposition is the sentence "Tomorrow is Thursday", since it must be either true or false.
Propositional logic is monotonic, meaning that adding knowledge to a propositional Knowledge Base can never lead to the loss of previously established truths.

Syntax

Propositional logic is made up of:

  • Propositional Symbols: Symbols that start with an uppercase letter and refer to a proposition. For example, P, Q, and R are propositional symbols. Each symbol represents a distinct statement that can be true or false. Individual symbols are commonly referred to as literals. A literal is negative if a negation is applied to it (e.g. ¬P); otherwise, it is a positive literal.
  • Logical Connectives: Operators which combine propositional symbols to create complex sentences. The primary logical connectives include:
    1. Negation (NOT, ¬)
    2. Conjunction (AND, ∧)
    3. Disjunction (OR, ∨)
    4. Implication (IMPLIES, →)
    5. Biconditional (IF AND ONLY IF, ↔)

Sentences

  • Atomic Sentence: An atomic sentence consists of a single propositional symbol and represents a basic assertion.
  • Complex Sentence: A complex sentence is made up of propositional symbols connected by parentheses and logical connectives. Complex sentences are also called formulas.

Relation to Other Logics

Propositional logic is often seen as a precursor to more complex logic systems and related areas, including:

  • First-Order Logic: Extends propositional logic by introducing quantifiers and predicates, allowing for the representation of the relationships between objects.
  • Modal Logic: Incorporates modalities like possibility, necessity, or knowledge.
  • Temporal Logic: Allows for reasoning about change over time by introducing temporal operators.
  • Automated Theorem Proving: Uses propositional logic as the basis for many theorem proving techniques.

Logical Connectives

Negation (NOT, ¬)

Negation flips the truth value of the propositional symbol it is applied to. Negating P gives ¬P, which can be read in English as "not P". The truth table for negation is as follows:

P     ¬P
T     F
F     T

Conjunction (AND, ∧)

A conjunction of two conjuncts is true only if both of its conjuncts are true. It corresponds to the English word "and" and is denoted in propositional logic with the symbol ∧. The truth table for conjunction is as follows:

P     Q     P ∧ Q
T     T     T
T     F     F
F     T     F
F     F     F

Disjunction (OR, ∨)

A disjunction of two disjuncts is true if at least one of its disjuncts is true. It corresponds to the English word "or": if one disjunct or the other (or both) is true, the disjunction is true. Disjunction is denoted with the symbol ∨. The truth table for disjunction is as follows:

P     Q     P ∨ Q
T     T     T
T     F     T
F     T     T
F     F     F

Implication (IMPLIES, →)

An implication is a binary operator whose left-hand side is the premise and whose right-hand side is the conclusion. An implication states that if the premise is true, then the conclusion must be true. Implications can be thought of as if-then statements and are denoted with the symbol → (or ⊃). An implication is only false when its premise is true while its conclusion is false. The truth table for implication is as follows:

P     Q     P → Q
T     T     T
T     F     F
F     T     T
F     F     T

As shown in the truth table above, an implication is always true if its premise is false; it is then said to be vacuously true. This occurs because the sentence provides no information. It is like saying, "If P, which is false, were true, then Q". Since P is false, the sentence holds no substantive meaning.
The concept of vacuous truth will be relevant later in the Mathematical Induction subsection, particularly in establishing base cases.

Biconditional (IF AND ONLY IF, ↔)

A biconditional can be thought of as a double implication, meaning each side implies the other. A biconditional is true only if both symbols have the same truth value. The symbols used to denote a biconditional are ↔ or ≡, and it can be read in English as "if and only if". The truth table for a biconditional is:

P     Q     P ↔ Q
T     T     T
T     F     F
F     T     F
F     F     T

NAND (NOT AND, ↑)

A NAND can be thought of as a negation applied to a conjunction, ¬(P ∧ Q). A NAND is true as long as at least one of its operands is false. It is represented symbolically as P ↑ Q or as ¬(P ∧ Q), and has the following truth table:

P     Q     P ↑ Q
T     T     F
T     F     T
F     T     T
F     F     T

XOR (EXCLUSIVE OR, ⊕)

An XOR, or exclusive or, is true when exactly one of its operands is true. It corresponds to the English word "or" in its exclusive sense, when only one thing or the other is the case. XOR is denoted using the ⊕ symbol, and its truth table is as follows:

P     Q     P ⊕ Q
T     T     F
T     F     T
F     T     T
F     F     F

NOR (NOT OR, ↓)

A NOR can be thought of as a negation applied to a disjunction, ¬(P ∨ Q). A NOR is true only when both of its operands are false and is represented symbolically as P ↓ Q. The truth table for NOR is as follows:

P     Q     P ↓ Q
T     T     F
T     F     F
F     T     F
F     F     T

XNOR (EXCLUSIVE NOR, ⊙)

An XNOR can be thought of as a negation applied to XOR, ¬(P ⊕ Q). Where an XOR is true only when its operands differ, an XNOR is true only when its operands are the same. It has the same truth table as the biconditional, which is as follows:

P     Q     P ⊙ Q
T     T     T
T     F     F
F     T     F
F     F     T

Precedence of Logical Connectives

The order of precedence of logical connectives, from highest to lowest, is as follows (a short code illustration follows the list):

  • Negation (NOT, ¬)
  • Conjunction (AND, ∧)
  • Disjunction (OR, ∨)
  • Implication (IMPLIES, →)
  • Biconditional (IF AND ONLY IF, ↔)
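
Because Python's not, and, and or operators follow the same precedence order (negation binds tightest, then conjunction, then disjunction), a short script can illustrate both the precedence rules and the truth tables above. This is only a minimal sketch; the helper truth_table and the formula ¬P ∨ Q ∧ R are illustrative choices, not taken from the text above.

    from itertools import product

    def truth_table(formula, names):
        """Print a truth table for `formula`, a function of boolean arguments."""
        print("  ".join(names) + "  |  result")
        for values in product([True, False], repeat=len(names)):
            row = "  ".join("T" if v else "F" for v in values)
            print(row + "  |  " + ("T" if formula(*values) else "F"))

    # ¬P ∨ Q ∧ R: NOT binds tighter than AND, which binds tighter than OR,
    # so Python parses this as (not P) or (Q and R), matching the precedence order above.
    truth_table(lambda p, q, r: not p or q and r, ["P", "Q", "R"])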

Logical Equivalence

Two sentences are said to be logically equivalent if they have the same truth table. This is denoted using the ≡ symbol. A Rule of Replacement is a logical principle that allows the substitution of one logical expression for another, logically equivalent expression. Rules of replacement are used to construct proofs, simplify logical expressions, and verify the correctness of logical statements (a brute-force equivalence check is sketched after the list below). Propositional equivalence is vital for many domains, including:

  • Digital Logic Design: Used to simplify expressions in Boolean Algebra, thereby optimizing and simplifying digital circuits.
  • Artificial Intelligence: Used in Knowledge Representation as well as in inference algorithms to derive new facts from an existing Knowledge Base.
  • Control Theory: Used to simplify system equations, analyze conditions, and model states and transitions.
  • Mathematical Proofs: Used to transform sentences into equivalent forms which are easier to prove.
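
As a concrete illustration, two formulas can be checked for logical equivalence by enumerating every assignment of truth values, which amounts to comparing their truth tables. This is a minimal sketch; the function name equivalent and the example formulas are illustrative assumptions.

    from itertools import product

    def equivalent(f, g, n):
        """Return True if boolean functions f and g of n variables agree on every assignment."""
        return all(f(*vals) == g(*vals)
                   for vals in product([True, False], repeat=n))

    imply = lambda a, b: (not a) or b

    # Double negation: ¬¬P ≡ P
    print(equivalent(lambda p: not (not p), lambda p: p, 1))            # True
    # Contrapositive: P → Q ≡ ¬Q → ¬P
    print(equivalent(lambda p, q: imply(p, q),
                     lambda p, q: imply(not q, not p), 2))              # True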

Properties and Laws

Double Negation Law

The double negation law is a rule of replacement which states that when negation is applied twice to an expression, the truth table for that expression remains the same. This is because the second negation 'cancels out' the first one, leaving the original expression.

Identity Laws

The identity laws state that any proposition conjoined with a true value is logically equivalent to the proposition alone, and any proposition disjoined with a false value is logically equivalent to the proposition alone. This is because, in both cases, the expression's truth value relies entirely on the value of the proposition.

Domination Laws

The domination laws state that, any proposition conjoined with a false value is always false, and any proposition disjoined with a true value is always true. This is because, in the case of conjunction, the AND operation requires both operands to be true for the result to be true. Since one operand is false, the entire expression cannot be true, regardless of the value of the other operand. In the case of disjunction, the OR operation requires at least one operand to be true for the result to be true. Since one operand is true, the entire expression will always evaluate to true.

Idempotent Laws

The idempotent laws state that a proposition conjoined or disjoined with itself is logically equivalent to the proposition alone. This means that applying conjunction or disjunction to a proposition with itself multiple times yields the same truth table as the original proposition.

Distributive Laws

The distributive laws describe how conjunction (AND) and disjunction (OR) interact with each other in propositional logic, allowing us to distribute one operation over the other.
The first distributive law states that the conjunction of P with the disjunction of Q and R is logically equivalent to the disjunction of P AND Q with P AND R: P ∧ (Q ∨ R) ≡ (P ∧ Q) ∨ (P ∧ R).
The second distributive law states that the disjunction of P with the conjunction of Q and R is logically equivalent to the conjunction of P OR Q with P OR R: P ∨ (Q ∧ R) ≡ (P ∨ Q) ∧ (P ∨ R).

De Morgan's Laws

De Morgan's Laws consist of two rules of replacement which define the relationship between disjunctions and conjunctions through negation.
De Morgan's first law states that the negation of a conjunction is logically equivalent to the disjunction of the negated conjuncts; the NAND operator is a direct implementation of this law.
De Morgan's second law states that the negation of a disjunction is logically equivalent to the conjunction of the negated disjuncts; the NOR operator is a direct implementation of this law.

Commutative Laws

The commutative laws state that the order of the operands in a conjunction or disjunction does not affect its truth value; swapping the operands produces a logically equivalent expression.

Associative Laws

The associative laws state that the grouping of the operands in a conjunction or disjunction does not affect the truth value of the expression.
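
The laws above can be summarized symbolically. The following is a compact restatement in standard notation, with P, Q, and R standing for arbitrary propositions, ⊤ for a true value, and ⊥ for a false value:

    \begin{aligned}
    &\neg\neg P \equiv P && \text{(double negation)}\\
    &P \land \top \equiv P, \qquad P \lor \bot \equiv P && \text{(identity)}\\
    &P \land \bot \equiv \bot, \qquad P \lor \top \equiv \top && \text{(domination)}\\
    &P \land P \equiv P, \qquad P \lor P \equiv P && \text{(idempotence)}\\
    &P \land (Q \lor R) \equiv (P \land Q) \lor (P \land R), \qquad P \lor (Q \land R) \equiv (P \lor Q) \land (P \lor R) && \text{(distributivity)}\\
    &\neg(P \land Q) \equiv \neg P \lor \neg Q, \qquad \neg(P \lor Q) \equiv \neg P \land \neg Q && \text{(De Morgan)}\\
    &P \land Q \equiv Q \land P, \qquad P \lor Q \equiv Q \lor P && \text{(commutativity)}\\
    &(P \land Q) \land R \equiv P \land (Q \land R), \qquad (P \lor Q) \lor R \equiv P \lor (Q \lor R) && \text{(associativity)}
    \end{aligned}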


Clauses and Normal Form

A clause is a disjunction (or, less commonly, a conjunction) of literals. Unless otherwise specified, "clause" usually refers to a disjunctive clause, which is a logical expression formed by connecting literals with the OR operator.
The empty clause is a clause with no literals, commonly denoted as ∅, □, or ⊥. An empty disjunctive clause is always false, making it analogous to a contradiction. This is an important concept in proof by contradiction, as deriving the empty clause indicates that a contradiction has been reached.

Horn Clause

A Horn clause is a disjunctive clause with at most one positive literal.

  • Definite Clause: If a Horn clause has exactly one positive literal, it is a definite clause. For example, ¬P ∨ ¬Q ∨ R.
  • Goal Clause: If it has no positive literals, it is a goal clause. For example, ¬P ∨ ¬Q.

A Horn clause can be represented in disjunctive form: ¬P₁ ∨ ¬P₂ ∨ … ∨ ¬Pₙ ∨ Q.
A Horn clause can also be represented in implicative form: (P₁ ∧ P₂ ∧ … ∧ Pₙ) → Q.
In both cases, Q is the only positive literal.

Horn clauses are computationally efficient for algorithms such as Resolution Theorem Proving and Forward/Backward Chaining. This makes them the basis of many Logic Programming languages, as well as of automated theorem proving and database querying systems.
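
To illustrate why definite clauses are computationally convenient, below is a minimal forward-chaining sketch. It assumes each definite clause is given as a (body, head) pair of propositional symbols; the rule set and symbol names are made up for the example.

    def forward_chain(rules, facts):
        """Apply definite clauses (body -> head) repeatedly until no new facts can be derived."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for body, head in rules:
                if head not in derived and all(b in derived for b in body):
                    derived.add(head)
                    changed = True
        return derived

    # Definite clauses: (P ∧ Q) → R and R → S, with known facts P and Q.
    rules = [(("P", "Q"), "R"), (("R",), "S")]
    print(forward_chain(rules, {"P", "Q"}))   # {'P', 'Q', 'R', 'S'}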

A sentence is considered to be in conjunctive normal form (CNF) if it is a conjunction (AND) of one or more disjunctive clauses.
Every sentence in propositional logic can be converted into an equivalent CNF form using the following steps:

  1. Implication/biconditional elimination (will be discussed later)
  2. Moving Negations Inward using De Morgan's laws.
  3. Double negation elimination
  4. Using the distributive property to distribute OR over AND, for example: P ∨ (Q ∧ R) ≡ (P ∨ Q) ∧ (P ∨ R). (A code sketch follows this list.)
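
As a rough illustration of these steps, SymPy's boolean algebra module can perform the conversion automatically. This sketch assumes SymPy is installed; to_cnf (and to_dnf, used for the disjunctive case below) eliminates implications, pushes negations inward, and distributes as described above.

    from sympy import symbols
    from sympy.logic.boolalg import And, Or, Not, Implies, to_cnf, to_dnf

    P, Q, R = symbols('P Q R')

    expr = And(Implies(P, Q), Implies(Q, R))   # (P → Q) ∧ (Q → R)
    print(to_cnf(expr))                        # equivalent to (¬P ∨ Q) ∧ (¬Q ∨ R)

    print(to_dnf(Not(And(P, Or(Q, R)))))       # equivalent to ¬P ∨ (¬Q ∧ ¬R)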

A sentence is considered to be in disjunctive normal form (DNF) if it is a disjunction (OR) of conjunctions (AND) of literals.
Every sentence in propositional logic can be converted to DNF in the following steps:

  1. Implication/biconditional elimination
  2. Moving Negations Inward using De Morgan's laws.
  3. Double negation elimination
  4. Using the distributive property to distribute AND over OR, for example: P ∧ (Q ∨ R) ≡ (P ∧ Q) ∨ (P ∧ R)

Arguments

In logic, an argument consists of a set of sentences called premises which, if all of them are true, guarantee that the conclusion sentence follows. This is a form of reasoning called Deductive Reasoning, which uses general statements to derive specific conclusions.

Entailment

Logical entailment, or logical consequence, refers to a relationship between premises and a conclusion. The conclusion is said to be entailed by the premises if there is no assignment of truth values under which the premises are true while the conclusion is false. In other words, entailment states that the conclusion follows from the premises; whenever the premises are true, so is the conclusion. Entailment is closely related to the idea of tautologies, as tautologies represent statements that are universally true and can help establish the validity of entailments.
Entailment is represented symbolically as KB ⊨ α, where KB is the set of premises and α is the conclusion. For example, {P, P → Q} ⊨ Q.
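
For a small number of symbols, entailment can be checked by brute force: enumerate every truth assignment and confirm that none makes all premises true while making the conclusion false. A minimal sketch, with the premise and conclusion functions chosen purely for illustration:

    from itertools import product

    def entails(premises, conclusion, n):
        """Return True if every assignment satisfying all premises also satisfies the conclusion."""
        for vals in product([True, False], repeat=n):
            if all(p(*vals) for p in premises) and not conclusion(*vals):
                return False   # found a counterexample
        return True

    # {P, P → Q} ⊨ Q
    print(entails([lambda p, q: p,
                   lambda p, q: (not p) or q],
                  lambda p, q: q, 2))   # True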

Validity, Soundness, and Satisfiability

  • Valid: An argument is valid if its conclusion must follow from its premises, meaning that if the argument's premises are true, then the conclusion must be true as well; otherwise, the argument is invalid.
  • Sound: An argument is sound if it is valid and all of its premises are true. If it is invalid or any of its premises are false, then the argument is unsound.
  • Satisfiable: A formula is satisfiable if there is at least one assignment of truth values which makes the formula true. If there is no assignment of truth values to make the formula true, then it is unsatisfiable, also called a contradiction.

If an argument is sound, then its conclusion must be true. A sound argument is always valid, while a valid argument may be unsound.

  • Argument forms are patterns or templates that represent valid structures for arguments.
  • Syllogisms are argument forms that consist of two premises which support a conclusion.

Modus Ponens is a syllogistic argument form and rule of inference. It has the following structure:

  1. If P then Q.
  2. P.
  3. Therefore, Q.

The first two sentences are the argument's premises, and the final sentence is the conclusion. Modus Ponens is represented in Gentzen notation as: P → Q, P ⊢ Q.

Modus Tollens is another valid syllogistic argument form, similar to Modus Ponens. Unlike Modus Ponens, however, Modus Tollens infers the negation of the antecedent from the negation of the consequent. The structure of Modus Tollens is as follows:

  1. If P then Q.
  2. Not Q.
  3. Therefore, not P.

Like Modus Ponens, the first two sentences are the argument's premises, and the final sentence is the conclusion. Modus Tollens is represented symbolically as: P → Q, ¬Q ⊢ ¬P.

A disjunctive syllogism is a syllogistic argument form with a disjunction as one of its premises. It states that if a disjunction of two disjuncts holds, and one of the disjuncts is known to be false, then the other must be true. The structure of disjunctive syllogisms is as follows:

  1. P or Q.
  2. Not P.
  3. Therefore, Q.

In Gentzen notation: P ∨ Q, ¬P ⊢ Q.

A hypothetical syllogism, also called the chain rule, is an argument form which uses the transitive property of implication. The structure of a hypothetical syllogism is as follows:

  1. If P then Q.
  2. If Q then R.
  3. Therefore, if P then R.

Hypothetical syllogisms are represented in Gentzen notation as: P → Q, Q → R ⊢ P → R.

Rules of Inference

Rules of inference are logical rules that dictate how conclusions may be derived from premises. Rules of inference differ from rules of replacement, which state that two expressions are logically equivalent and can be freely swapped for one another.

Implication Introduction

Implication introduction is a rule of inference which is used to express implications in terms of disjunctions. The rule is as follows: P → Q ≡ ¬P ∨ Q.
Implication introduction is crucial when converting formulas to normal forms like CNF or DNF.

Biconditional Elimination / Introduction

Biconditional introduction is a rule of inference which allows a biconditional to be inferred from two implications: from P → Q and Q → P, infer P ↔ Q.
This transformation can also be done in the opposite direction, called biconditional elimination, where two implications can be inferred from a biconditional: from P ↔ Q, infer P → Q, and likewise Q → P.

Conjunction Elimination / Introduction

Conjunction introduction is a rule of inference which states that if two propositions are known to be true, then their conjunction is also known to be true: from P and Q, infer P ∧ Q.
Conjunction elimination states that if the conjunction of two propositions is known to be true, then each of its conjuncts is also known to be true: from P ∧ Q, infer P (and likewise Q).

Disjunction Elimination / Introduction

Disjunction introduction is a rule of inference which states that if a proposition is known to be true, then any disjunction containing it is also known to be true: from P, infer P ∨ Q.
Disjunction elimination works the opposite way, stating that if you have a disjunction where each disjunct implies a third proposition R, then R can be inferred: from P ∨ Q, P → R, and Q → R, infer R.
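
Each of these rules can be checked mechanically: a rule of inference is valid precisely when its premises entail its conclusion. The following reuses the brute-force idea from the entailment sketch above (the function name and example are again illustrative):

    from itertools import product

    def valid_rule(premises, conclusion, n):
        """A rule is valid if no assignment makes all premises true and the conclusion false."""
        return all(conclusion(*vals)
                   for vals in product([True, False], repeat=n)
                   if all(p(*vals) for p in premises))

    # Disjunction elimination: from P ∨ Q, P → R, and Q → R, infer R.
    print(valid_rule([lambda p, q, r: p or q,
                      lambda p, q, r: (not p) or r,
                      lambda p, q, r: (not q) or r],
                     lambda p, q, r: r, 3))   # True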


Proofs

In propositional logic, proofs are essential for establishing the truth of a statement or the validity of an argument. A proof is a demonstration that a conclusion follows from a set of premises.

Proof Systems

Proof Systems are frameworks which provide the structure and methodology for constructing proofs. Proof systems are made up of:

  • Axioms: Foundational propositions that are accepted as true without having been proven.
  • Rules of Inference: Rules which provide the structure for how new statements can be derived from existing ones.
  • Syntax: The symbols used to represent statements.
  • Semantics: The interpretation of the meaning of the statements within the proof system.

The most common proof systems include Natural Deduction and Sequent Calculus.
A proof system is complete if every statement that is semantically true can be proven within that system. Otherwise, the system is incomplete.

Tautology and Contradiction

  • A tautology is a propositional formula which is true for all possible assignments of truth values.
  • A contradiction is a propositional formula which is false for all possible assignments of truth values.

Many proof techniques rely on tautologies and contradictions to prove that a particular formula entails another, or to prove that a particular formula is unsatisfiable.

Common Proof Techniques

A direct proof is one of the most common and straightforward methods for deriving conclusions. The structure of a direct proof is as follows:

  1. Identify the premises: Clearly state the premises which you will use to derive the conclusion.
  2. State the conclusion: State the conclusion you wish to prove from the premises.
  3. Logical deduction: Use the rules of inference to derive the conclusion from the premises step by step.

Proof by contradiction is an indirect proof that assumes the conclusion is false, then proves that this assumption leads to a contradiction. If assuming the conclusion is false does lead to a contradiction, then the conclusion must be true. The structure of a proof by contradiction is as follows:

  1. Identify the premises: Clearly state the premises which you will use to derive the conclusion.
  2. State the conclusion: State the conclusion you wish to prove from the premises.
  3. Assume the negation: Assume that the conclusion is false.
  4. Logical deduction: Use the rules of inference to derive a contradiction. A contradiction typically manifests as the empty clause ⊥, which is a statement that is always false.
  5. Conclude: If the assumption that the conclusion is false leads to a contradiction, then the original conclusion must be true.

Proof by contrapositive is an indirect proof which proves a conclusion by proving its contrapositive. Since a sentence and its contrapositive are logically equivalent, proving one proves the other. The structure for a proof by contrapositive is as follows:

  1. State the conclusion: State the conclusion you wish to prove, for example, P → Q.
  2. Identify the contrapositive: Identify the contrapositive of the conclusion, for example, ¬Q → ¬P.
  3. Logical deduction: Use the rules of inference to prove the contrapositive.
  4. Conclude the Original Implication: If the contrapositive can be proven, then the original conclusion is also proven.

Resolution resolves two clauses which contain complementary literals. Two literals are complements of one another if one is the negation of the other (e.g. P and ¬P). Resolution is defined as follows: from the clauses P ∨ Q and ¬P ∨ R, infer the clause Q ∨ R.
The example above resolves P ∨ Q and ¬P ∨ R, resulting in a new clause, Q ∨ R, called the resolvent.
In the context of Resolution Theorem Proving, resolution is applied repeatedly to derive a contradiction, thereby proving that the negation of the statement being proven is unsatisfiable. Resolution operates on clauses, so converting all formulas to Conjunctive Normal Form (CNF) first is what makes resolution-based automated theorem proving practical and efficient.
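
Below is a small sketch of a single resolution step. It represents a clause as a frozenset of literal strings, where a leading '~' marks negation; this representation and the helper name resolve are assumptions made for the example.

    def resolve(c1, c2):
        """Return all resolvents of two clauses, each a frozenset of literals like 'P' or '~P'."""
        resolvents = []
        for lit in c1:
            complement = lit[1:] if lit.startswith('~') else '~' + lit
            if complement in c2:
                resolvents.append(frozenset((c1 - {lit}) | (c2 - {complement})))
        return resolvents

    # Resolving (P ∨ Q) with (¬P ∨ R) on the complementary pair P / ¬P gives (Q ∨ R):
    print(resolve(frozenset({'P', 'Q'}), frozenset({'~P', 'R'})))   # [frozenset({'Q', 'R'})]
    # Resolving P with ¬P yields the empty clause, i.e. a contradiction:
    print(resolve(frozenset({'P'}), frozenset({'~P'})))             # [frozenset()]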

Mathematical induction is a proof technique for statements that apply to all natural numbers. It starts by proving the base case, typically n = 0 or n = 1. The inductive hypothesis then assumes that the statement is true for some arbitrary natural number k greater than or equal to the base case. Finally, using this assumption, the inductive step proves that the statement holds for k + 1.
For example, suppose we are trying to inductively prove a statement P(n) for all n ≥ 1. The base case establishes P(1).
Now, we assume the inductive hypothesis is true: P(k) holds for some arbitrary k ≥ 1.
Using the assumption, the inductive step involves proving P(k + 1).
Once the inductive step is complete, P(n) is true for all natural numbers greater than or equal to 1.
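
As a concrete worked example (chosen here for illustration), consider the classic claim that the sum of the first n natural numbers is n(n + 1)/2:

    \begin{aligned}
    \text{Claim: } & 1 + 2 + \dots + n = \frac{n(n+1)}{2} \quad \text{for all } n \ge 1.\\
    \text{Base case } (n = 1): \; & 1 = \frac{1(1+1)}{2}.\\
    \text{Inductive hypothesis: } & 1 + 2 + \dots + k = \frac{k(k+1)}{2} \text{ for some } k \ge 1.\\
    \text{Inductive step: } & 1 + 2 + \dots + k + (k+1) = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
    \end{aligned}

which is the claim for n = k + 1. By induction, the formula therefore holds for all n ≥ 1.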


Applications

Computer Science

  • Artificial Intelligence: Propositional logic is used for knowledge representation, Decision Theory, and inference algorithms in AI agents.
  • Automated Theorem Proving: Uses propositional logic as the basis for many theorem proving techniques.
  • Database Querying: Allows for complex queries using logical operators like AND, NOT, and OR. Queries can also utilize propositional conditional statements.
  • Satisfiability Problems: Propositional logic is central to many SAT solvers.

Computer Engineering

  • Digital Circuit Design: Propositional logic is used to design and analyze digital circuits. Digital Logic Gates correspond to the propositional logical operators.
  • Boolean Algebra: Propositional logic serves as the basis of Boolean algebra, which is essential for circuit design and programming.

Mathematics

  • Proof Techniques: Many proof techniques, including direct proof, proof by contradiction, and mathematical induction, are built upon propositional logic.
  • Set Theory: Used to express relationships and operations on sets.
  • Logical Foundation: Propositional logic serves as the base for more complex logic systems like first-order logic, modal logic, and temporal logic.
  • Strategy Representation: Propositional logic can be used to design strategies for players.
  • Fault Detection: Propositional logic can be employed to detect faults in systems by establishing logical relations between system states and expected behavior.

References

  1. S. J. Russell and P. Norvig, Artificial Intelligence: a Modern Approach, 4th ed. Upper Saddle River: Pearson, 2020.
  2. Wikipedia Contributors, “Propositional calculus,” Wikipedia, Oct. 21, 2024.
  3. GeeksforGeeks, “Propositional Logic,” GeeksforGeeks, Jun. 19, 2015. https://www.geeksforgeeks.org/engineering-mathematics/proposition-logic/
  4. C. Franks, “Propositional Logic,” Stanford Encyclopedia of Philosophy, 2023. https://plato.stanford.edu/entries/logic-propositional/
  5. GeeksforGeeks, “Propositional Equivalences,” GeeksforGeeks, Jun. 22, 2015. https://www.geeksforgeeks.org/engineering-mathematics/mathematical-logic-propositional-equivalences/ (accessed Aug. 03, 2025).
  6. “Propositional Logic: Part I -Semantics 12-0.” Available: https://www.cas.mcmaster.ca/~lawford/2F03/Notes/prop.pdf
  7. Wikipedia Contributors, “Rule of replacement,” Wikipedia, Mar. 03, 2025.
  8. GeeksforGeeks, “Idempotent Laws,” GeeksforGeeks, Sep. 08, 2024. https://www.geeksforgeeks.org/maths/idempotent-laws/ (accessed Aug. 03, 2025).
  9. Wikipedia Contributors, “De Morgan’s laws,” Wikipedia, Sep. 05, 2019. https://en.wikipedia.org/wiki/De_Morgan%27s_laws
  10. “1.2: Basic Notions - Propositions and Arguments,” Humanities LibreTexts, Sep. 26, 2019. https://human.libretexts.org/Bookshelves/Philosophy/Fundamental_Methods_of_Logic_(Knachel)/01%3A_The_Basics_of_Logical_Analysis/1.02%3A_Basic_Notions_-_Propositions_and_Arguments
  11. Dr. Trefor Bazett, “Logical Arguments - Modus Ponens & Modus Tollens,” YouTube. May 23, 2017. Accessed: Sep. 02, 2022. [Online]. Available: https://www.youtube.com/watch?v=NTSZMdGlo4g
  12. Wikipedia Contributors, “Modus ponens,” Wikipedia, Jan. 20, 2019. https://en.wikipedia.org/wiki/Modus_ponens
  13. “Disjunctive syllogism,” Wikipedia, Mar. 03, 2024. https://en.wikipedia.org/wiki/Disjunctive_syllogism
  14. TrevTutor, “[Logic] Proofs and Rules #1,” www.youtube.com. https://www.youtube.com/watch?v=m2j0TX-e8NY (accessed Aug. 4, 2025).
  15. Wikipedia Contributors, “Resolution (logic),” Wikipedia, May 28, 2025.
  16. “Introduction to Logic - Chapter 6,” Stanford.edu, 2025. http://intrologic.stanford.edu/chapters/chapter_06.html (accessed Aug. 05, 2025).
  17. Wikipedia Contributors, “Horn clause,” Wikipedia, Dec. 18, 2019. https://en.wikipedia.org/wiki/Horn_clause
  18. Wikipedia Contributors, “Biconditional introduction,” Wikipedia, Aug. 01, 2023.
  19. “Tautology (logic),” Wikipedia, Sep. 17, 2023. https://en.wikipedia.org/wiki/Tautology_(logic)
  20. “Introduction to Logic - Chapter 6,” Stanford.edu, 2025. http://intrologic.stanford.edu/chapters/chapter_06.html
  21. Wikipedia Contributors, “Resolution (logic),” Wikipedia, May 28, 2025.