Theory of Computation Study Guide
📖 Core Concepts
Theory of Computation – studies what problems can be solved by algorithms and how efficiently they can be solved.
Model of Computation – mathematical abstraction (e.g., Turing machine, lambda calculus) that captures the essential operations of a computer.
Automata – abstract machines (finite automata, pushdown automata, Turing machines) that recognize formal languages.
Formal Language – set of strings over an alphabet generated by grammatical rules; each automaton class corresponds to a language class.
Chomsky Hierarchy – four nested language families: Regular ⊂ Context‑Free ⊂ Context‑Sensitive ⊂ Recursively Enumerable.
Computability – whether a problem has any algorithm that solves it (decidable).
Halting Problem – no TM can decide for every TM+input whether it halts.
Rice’s Theorem – any non‑trivial property of the partial function computed by a TM is undecidable.
Complexity Theory – quantifies resource usage (time, space) of algorithms.
Big‑O – asymptotic upper bound; ignores constant factors and lower‑order terms.
P vs NP – asks if every problem whose solution can be verified in polynomial time can also be solved in polynomial time.
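The "automata recognize languages" idea above can be made concrete with a tiny simulator. This is an illustrative sketch (the machine and its states are my own example, not from the guide): a DFA over {0, 1} accepting strings with an even number of 1s.

```python
# Minimal DFA simulator: a DFA is given by a transition table, a start state,
# and a set of accepting states. Example machine: binary strings with an even
# number of 1s (a classic regular language).
def run_dfa(transitions, start, accepting, string):
    """Return True iff the DFA accepts `string`."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]  # exactly one next state per symbol
    return state in accepting

# States "even"/"odd" track the parity of 1s seen so far.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

print(run_dfa(even_ones, "even", {"even"}, "1011"))  # three 1s -> False
print(run_dfa(even_ones, "even", {"even"}, "1001"))  # two 1s   -> True
```

Feeding strings to `run_dfa` is exactly the "machine as filter" view: the set of accepted strings *is* the language the machine recognizes.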
---
📌 Must Remember
Church–Turing Thesis: Anything computable by a “reasonable” physical device can be computed by a TM.
Equivalence: Deterministic & nondeterministic finite automata (DFA/NFA), regular expressions, and regular languages are interchangeable.
Context‑Free ≡ Nondeterministic Pushdown Automata.
Halting Problem: Undecidable ⇒ No algorithm solves it for all TMs.
Rice’s Theorem: Undecidable for any non‑trivial semantic property (e.g., “does the TM compute a total function?”).
Time Complexity: $T(n)$ = # of steps as a function of input size $n$.
Space Complexity: $S(n)$ = # of memory cells used as a function of $n$.
Big‑O: $f(n) = O(g(n))$ means $\exists c, n_0$ s.t. $f(n) \le c \cdot g(n)$ for all $n \ge n_0$.
P: solvable in polynomial time ($O(n^k)$).
NP: verifiable in polynomial time; includes problems like SAT, Hamiltonian Path.
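"Verifiable in polynomial time" means there is a checker that, given a candidate solution (a certificate), confirms it in time polynomial in the input size. A hedged sketch for SAT, using a DIMACS-style encoding of my own choosing (literal $k$ = variable $k$, $-k$ = its negation):

```python
# Polynomial-time verifier for SAT: the certificate is a truth assignment.
# A CNF formula is a list of clauses; each clause is a list of nonzero ints,
# where literal k means variable k and -k means "not k" (DIMACS-style).
def verify_sat(cnf, assignment):
    """Check in time linear in the formula size that `assignment`
    (a dict var -> bool) satisfies every clause of `cnf`."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in cnf
    )

# (x1 or not x2) and (x2 or x3)
cnf = [[1, -2], [2, 3]]
print(verify_sat(cnf, {1: True, 2: False, 3: True}))    # True
print(verify_sat(cnf, {1: False, 2: False, 3: False}))  # False
```

Note the asymmetry that P vs NP asks about: this checker is fast, but nothing here says a satisfying assignment can be *found* quickly.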
---
🔄 Key Processes
Proving a language regular (or not)
To prove regularity, exhibit a DFA/NFA or construct a regular expression; to prove a language is not regular, apply the Pumping Lemma for regular languages (used contrapositively).
Proving a language context‑free (or not)
To prove context‑freeness, build a CFG or PDA; to prove a language is not context‑free, use the Pumping Lemma for CFLs (used contrapositively).
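The PDA side of this process can be sketched with an explicit stack. An illustrative recognizer (my example) for the canonical CFL $\{a^n b^n \mid n \ge 0\}$, mimicking a deterministic PDA that pushes on `a` and pops on `b`:

```python
# Stack-based recognizer for {a^n b^n : n >= 0}, mimicking a deterministic PDA:
# push one marker per 'a', pop one per 'b', accept iff the stack ends empty.
def accepts_anbn(s):
    stack = []
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an 'a' after a 'b' -> reject
                return False
            stack.append("A")   # push
        elif ch == "b":
            seen_b = True
            if not stack:       # more b's than a's -> reject
                return False
            stack.pop()
        else:
            return False        # symbol outside the alphabet
    return not stack            # accept iff counts matched

print(accepts_anbn("aabb"))  # True
print(accepts_anbn("aab"))   # False
```

No finite automaton can do this: matching the counts requires unbounded memory, which is exactly what the stack provides.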
Reducing problems (computability/complexity)
Transform instance $I$ of known hard problem $A$ to instance $f(I)$ of problem $B$ in polynomial time; prove $A \le_p B$.
Diagonalization (Halting problem)
Assume a decider $H$ exists, construct TM $D$ that on input $x$ runs $H(x,x)$ and flips the answer → contradiction.
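The diagonalization step can be simulated in miniature. In this sketch (an illustration, with the infinite loop replaced by a sentinel value so the code terminates), any candidate decider `halts` is provably wrong about the flipping machine $D$ run on itself:

```python
# Diagonalization in miniature: given any candidate decider halts(prog, inp),
# build D so that D "flips" the decider's answer about itself.
def make_D(halts):
    def D(x):
        if halts(x, x):
            return "loops forever"   # stand-in for an actual infinite loop
        return "halts"
    return D

def is_correct_on_D(halts):
    """Does the candidate decider answer correctly about D run on itself?"""
    D = make_D(halts)
    claimed = halts(D, D)            # what the decider says
    actual = D(D) == "halts"         # what D actually does
    return claimed == actual

# Whichever answer a decider gives about (D, D), it is wrong:
print(is_correct_on_D(lambda prog, inp: True))   # False
print(is_correct_on_D(lambda prog, inp: False))  # False
```

The contradiction is unconditional: no matter how `halts` is implemented, `D` forces a wrong answer on the diagonal input, which is the whole proof in code form.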
Applying Rice’s Theorem
Identify a non‑trivial semantic property $P$; show that if $P$ were decidable, we could decide the Halting Problem → contradiction.
---
🔍 Key Comparisons
Deterministic vs Nondeterministic Finite Automata
Capability: Both recognize exactly the regular languages.
Construction: NFA may have ε‑moves; DFA has a single next state per symbol.
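The DFA/NFA equivalence is witnessed by the subset construction. A sketch (epsilon-moves omitted for brevity; the example NFA, recognizing "second symbol from the end is 1", is my own choice):

```python
# Subset construction: determinize an NFA (without epsilon-moves, for brevity).
# nfa_delta maps (state, symbol) -> set of next states.
def nfa_to_dfa(nfa_delta, start, accepting, alphabet):
    start_set = frozenset([start])
    dfa_delta, todo, seen = {}, [start_set], {start_set}
    while todo:
        S = todo.pop()
        for a in alphabet:
            # DFA state = set of NFA states reachable on symbol a.
            T = frozenset(q2 for q in S for q2 in nfa_delta.get((q, a), set()))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    dfa_accepting = {S for S in seen if S & accepting}
    return dfa_delta, start_set, dfa_accepting

def dfa_accepts(dfa_delta, start, accepting, string):
    state = start
    for a in string:
        state = dfa_delta[(state, a)]
    return state in accepting

# NFA for "second symbol from the end is 1" over {0,1}:
nfa = {("q0", "0"): {"q0"}, ("q0", "1"): {"q0", "q1"},
       ("q1", "0"): {"q2"}, ("q1", "1"): {"q2"}}
delta, s0, F = nfa_to_dfa(nfa, "q0", {"q2"}, "01")
print(dfa_accepts(delta, s0, F, "0110"))  # second-from-end is 1 -> True
print(dfa_accepts(delta, s0, F, "0101"))  # second-from-end is 0 -> False
```

The construction also explains the exam-relevant cost of determinization: DFA states are *sets* of NFA states, so the blow-up can be exponential in the worst case.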
Finite Automata vs Pushdown Automata
Memory: FA = none (only current state); PDA = unbounded stack.
Languages: FA → regular; PDA → context‑free.
Turing Machine vs Primitive Recursive Functions
Power: TMs compute exactly the partial computable functions (their domains are the recursively enumerable sets); primitive recursive functions form a proper subset of the total computable functions and cannot express the Ackermann function.
P vs NP
P: solvable in polynomial time.
NP: solution verifiable in polynomial time; may require exponential time to find.
---
⚠️ Common Misunderstandings
“All undecidable problems are NP‑hard.”
Undecidability and NP‑hardness are different notions: undecidable problems lie outside NP entirely, and while some undecidable problems (e.g., the Halting Problem) are NP‑hard, undecidability by itself does not imply NP‑hardness.
“Regular languages are a subset of context‑free languages.”
True, but the converse is false; many CFLs (e.g., $\{a^n b^n\}$) are not regular.
“If a problem is in NP, it must be hard.”
Some NP problems are easy (e.g., linear‑time verification of a given solution).
“Big‑O gives the exact running time.”
It only provides an upper bound up to constant factors; lower‑order terms are ignored.
---
🧠 Mental Models / Intuition
“Machine ⇄ Language” – Think of each automaton class as a filter: feed it strings; if it accepts, the string belongs to the language class.
“Stack = memory for recursion” – PDA’s stack mirrors the call stack of a recursive program; this is why CFGs capture programming language syntax.
“Diagonalization = self‑reference trap” – Like Gödel’s incompleteness, constructing a machine that asks “What would you answer about yourself?” creates a paradox.
“Reduction = hardness transfer” – If you can solve a known hard problem using a solver for $B$, then $B$ is at least as hard as the known problem.
---
🚩 Exceptions & Edge Cases
Deterministic Context‑Free Languages (DCFLs): Not all CFLs are deterministic; e.g., $\{a^i b^j c^k \mid i=j \text{ or } j=k\}$ is CFL but not DCFL.
Space vs Time Trade‑off: A problem can be solvable in polynomial space while its best known algorithms take exponential time; in general PSPACE ⊆ EXPTIME.
Primitive Recursive vs General Recursive: Primitive recursive functions are total and always halt; some total computable functions (e.g., Ackermann) are not primitive recursive.
---
📍 When to Use Which
Regular Expression vs DFA: Use regex for quick pattern matching in code; use DFA for formal proofs or to compute closure properties.
CFG vs PDA: Use CFG when constructing a grammar (e.g., language design); use PDA when proving recognizability or building parsers.
Big‑O vs Exact Counting: Use Big‑O for asymptotic analysis in exams; give exact counts only when the question asks for precise runtime.
Reduction vs Direct Proof: Use reduction when the problem’s hardness is known; use direct construction when you can exhibit a machine/grammar.
---
👀 Patterns to Recognize
“Pumping Lemma” problems – Look for strings longer than the pumping length; try to pump and show the resulting string leaves the language.
“Self‑reference” in undecidability – Any problem asking “does machine $M$ halt on input $M$?” signals a diagonalization argument.
“Closure properties” – Regular languages closed under union, concatenation, Kleene star; CFLs closed under union and concatenation but not under intersection/complement.
“Polynomial bound” clues – If a problem description mentions “verify in polynomial time,” suspect NP membership.
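The pumping-lemma pattern above can even be machine-checked for small pumping lengths. An illustrative sketch (my example) for $L = \{a^n b^n\}$: take $s = a^p b^p$ and confirm that every legal split $s = xyz$ (with $|xy| \le p$, $|y| \ge 1$) pumps out of the language at $i = 2$:

```python
# Brute-force the regular-language pumping lemma against L = {a^n b^n}:
# for s = a^p b^p, every split s = xyz with |xy| <= p and |y| >= 1
# gives x y^2 z outside L, so no pumping length works and L is not regular.
def in_L(s):
    half = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * half + "b" * half

def pumping_breaks(p):
    """True iff every legal split of a^p b^p leaves L when pumped at i = 2."""
    s = "a" * p + "b" * p
    for xy_len in range(1, p + 1):          # |xy| <= p
        for y_len in range(1, xy_len + 1):  # |y| >= 1
            x = s[: xy_len - y_len]
            y = s[xy_len - y_len : xy_len]
            z = s[xy_len:]
            if in_L(x + y * 2 + z):         # pumping stayed in L: no violation
                return False
    return True

print(all(pumping_breaks(p) for p in range(1, 8)))  # True
```

This mirrors the exam argument exactly: since $|xy| \le p$, $y$ consists only of a's, so pumping unbalances the counts for every split, giving the contradiction for any candidate pumping length $p$.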
---
🗂️ Exam Traps
Confusing “decidable” with “in P”.
Decidable means some algorithm exists (maybe exponential); P requires polynomial time.
Assuming all CFGs are ambiguous.
Ambiguity is a property; many CFGs (e.g., arithmetic expressions with proper precedence) are unambiguous.
Choosing the wrong model for equivalence proofs.
Proving language regularity with a PDA (instead of a DFA/NFA) is insufficient; PDA is more powerful.
Misreading Big‑O direction.
$f(n)=O(g(n))$ is an upper bound, not a lower bound; $Ω$ denotes lower bound.
Over‑applying Rice’s Theorem.
It applies only to semantic properties of the computed partial function, not to syntactic properties of the TM’s description.
---