Theory of Computation Flashcards

Free Theory of Computation flashcards, exportable to Notion

Learn faster with 45 Theory of Computation flashcards. One-click export to Notion.



Theory of Computation

45 flashcards

The Church-Turing thesis states that a function is effectively calculable if and only if it can be computed by a Turing machine.
Time complexity measures how the running time of an algorithm scales with the input size. It is usually expressed using Big O notation.
The halting problem asks whether there exists an algorithm that can determine if a given program halts or runs forever on a given input. It is undecidable, meaning no such algorithm exists.
A finite automaton is a mathematical model of computation that recognizes patterns in strings of symbols. It consists of a set of states, a set of input symbols, a transition function, a start state, and a set of accept states.
In a DFA, for each state and input symbol, there is exactly one transition to a next state. In an NFA, for each state and input symbol, there can be zero, one, or several possible next states.
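As a concrete illustration (a standard textbook example, not one of the cards above), here is a minimal DFA sketch in Python: it accepts binary strings containing an even number of 0s.

```python
# Minimal DFA sketch: accepts binary strings with an even number of 0s.
# States "even" (start, accepting) and "odd"; names are illustrative.
def run_dfa(s):
    transitions = {
        ("even", "0"): "odd",  ("even", "1"): "even",
        ("odd", "0"):  "even", ("odd", "1"):  "odd",
    }
    state = "even"
    for symbol in s:
        # Exactly one next state per (state, symbol): this is what makes it deterministic.
        state = transitions[(state, symbol)]
    return state == "even"

print(run_dfa("1001"))  # True: two 0s
print(run_dfa("10"))    # False: one 0
```

An NFA would instead map each (state, symbol) pair to a *set* of next states and accept if any computation path reaches an accept state.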
The pumping lemma states that for any regular language, there exists a pumping length p such that any string s in the language with length >= p can be written as s = xyz, where |y| >= 1, |xy| <= p, and xy^nz is in the language for every n >= 0.
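As a worked illustration of the lemma (a standard textbook argument, not one of the cards above), here is a sketch showing that {a^n b^n} is not regular:

```latex
\textbf{Claim.} $L = \{a^n b^n : n \ge 0\}$ is not regular.

\textbf{Sketch.} Suppose $L$ were regular with pumping length $p$, and take
$s = a^p b^p \in L$. Any split $s = xyz$ with $|xy| \le p$ and $|y| \ge 1$
forces $y = a^k$ for some $k \ge 1$, since the first $p$ symbols of $s$ are
all $a$'s. Pumping once gives $xy^2z = a^{p+k}b^p$, which has more $a$'s than
$b$'s and so is not in $L$, contradicting the lemma. Hence $L$ is not regular.
```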
A push-down automaton is a finite automaton with an additional stack memory. It is used to recognize context-free languages.
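The stack is exactly what lifts a push-down automaton beyond a finite automaton. A hedged Python sketch of a deterministic PDA for the context-free language {a^n b^n : n >= 0} (illustrative, not from the card above):

```python
# PDA sketch for {a^n b^n : n >= 0}: push a marker per 'a', pop one per 'b'.
def accepts_anbn(s):
    stack = []
    phase = "a"                 # currently consuming a's
    for c in s:
        if c == "a" and phase == "a":
            stack.append("A")   # push one stack marker per 'a'
        elif c == "b":
            phase = "b"
            if not stack:       # a 'b' with nothing to match: reject
                return False
            stack.pop()         # match this 'b' against a pushed marker
        else:                   # an 'a' after a 'b', or a foreign symbol
            return False
    return not stack            # accept iff every 'a' was matched

print(accepts_anbn("aabb"))  # True
print(accepts_anbn("abab"))  # False
```

A finite automaton cannot recognize this language (see the pumping-lemma card above): it has no unbounded memory with which to count the a's.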
P is the class of decision problems that can be solved in polynomial time by a deterministic Turing machine. NP is the class of decision problems whose solutions can be verified in polynomial time by a non-deterministic Turing machine.
The P vs NP problem asks whether every problem in NP can also be solved in polynomial time on a deterministic Turing machine, i.e., whether P = NP. It is one of the most important unsolved problems in computer science.
A Turing machine is an abstract model of computation that manipulates symbols on an infinite tape according to a set of rules. It is used to define computability and complexity classes.
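A Turing machine can be simulated in a few lines. A minimal sketch (tape modeled as a dict over integer positions, blank written as "_"; the example machine and its rule names are illustrative) that increments a binary number stored least-significant-bit first:

```python
# Tiny Turing-machine simulator. rules maps (state, symbol) to
# (next_state, symbol_to_write, head_move), with head_move "L" or "R".
def run_tm(tape_str, rules, start, accept):
    tape = {i: c for i, c in enumerate(tape_str)}
    state, head = start, 0
    while state != accept:
        symbol = tape.get(head, "_")            # unwritten cells read as blank
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# Binary increment, least-significant bit first: carry rightward over 1s,
# then write a 1 at the first 0 or blank.
rules = {
    ("carry", "1"): ("carry", "0", "R"),
    ("carry", "0"): ("done",  "1", "R"),
    ("carry", "_"): ("done",  "1", "R"),
}
print(run_tm("111", rules, "carry", "done"))  # "0001": 7 + 1 = 8, LSB first
```

The simulator itself is a (bounded) universal machine in miniature: it runs any machine described by a rule table, which is the idea behind the universal Turing machine card below.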
A context-free grammar is a formal grammar used to describe context-free languages. It consists of a set of non-terminal symbols, terminal symbols, a start symbol, and a set of production rules.
The Chomsky hierarchy is a containment hierarchy of formal languages, ranging from the most restricted (regular languages) to the most general (recursively enumerable languages).
A regular expression is a sequence of characters that forms a pattern used to match strings. It defines a regular language.
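For instance, the regular language (ab)* can be matched with Python's `re` module (note that practical regex engines also offer non-regular extensions such as backreferences; the pattern below uses only the regular core):

```python
import re

# The regular language (ab)*: zero or more copies of "ab".
# re.fullmatch anchors the pattern to the entire string.
pattern = re.compile(r"(ab)*")

print(bool(pattern.fullmatch("abab")))  # True
print(bool(pattern.fullmatch("aba")))   # False
```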
A language is a set of strings over an alphabet, while a grammar is a set of rules that generates or describes a language.
A universal Turing machine is a Turing machine that can simulate any other Turing machine on arbitrary input. It is a key concept in demonstrating the computational universality of Turing machines.
The Post Correspondence Problem asks whether there exists a non-empty sequence of indices that generates two identical strings from given lists of strings. It is an undecidable problem.
The time complexity of mergesort is O(n log n) in the worst case; quicksort is O(n log n) in the average case but O(n^2) in the worst case, where n is the number of elements to be sorted.
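The O(n log n) bound comes from halving the input O(log n) times and merging in linear time at each level. A minimal mergesort sketch:

```python
# Mergesort: split in half, sort each half recursively, merge in linear time.
def mergesort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: O(n) per level
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(mergesort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```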
Space complexity measures the amount of memory or space required by an algorithm to run as a function of the input size.
Any language that is decidable by a non-deterministic Turing machine is also decidable by a deterministic Turing machine, but not necessarily with the same time or space complexity.
The pumping lemma for context-free languages states that for any context-free language not containing the empty string, there exists a pumping length p such that any string s in the language with length >= p can be written as s = uvxyz, where |vy| >= 1, |vxy| <= p, and uv^nxy^nz is in the language for any n >= 0.
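A worked illustration of this lemma (a standard textbook argument, not one of the cards above), showing that {a^n b^n c^n} is not context-free:

```latex
\textbf{Claim.} $L = \{a^n b^n c^n : n \ge 0\}$ is not context-free.

\textbf{Sketch.} Suppose $L$ were context-free with pumping length $p$, and take
$s = a^p b^p c^p$. In any split $s = uvxyz$ with $|vxy| \le p$ and $|vy| \ge 1$,
the window $vxy$ touches at most two of the three symbol blocks, so pumping to
$uv^2xy^2z$ increases the counts of at most two symbols and leaves the third
unchanged. The three counts are then no longer equal, so $uv^2xy^2z \notin L$,
contradicting the lemma. Hence $L$ is not context-free.
```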
A decidable problem is one for which an algorithm exists that can always determine whether a given input is a member of the problem's set or not. An undecidable problem is one for which no such algorithm can exist.
A Turing machine with an oracle is a Turing machine augmented with a black box that can answer membership queries for some fixed language in a single step. This allows the study of relative computability.
P is the class of decision problems solvable in polynomial time by a deterministic Turing machine, while BPP (Bounded-error Probabilistic Polynomial time) is the class of decision problems solvable in polynomial time by a probabilistic Turing machine with bounded error.
A decision problem is NP-complete if it is in NP and every problem in NP can be reduced to it in polynomial time. If any NP-complete problem can be solved in polynomial time, then all problems in NP can be solved in polynomial time (i.e., P = NP). Many important problems are NP-complete.
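The "verifiable in polynomial time" half of NP is easy to see for SAT: given a CNF formula and a candidate assignment (the certificate), checking it is a single linear pass over the clauses. A sketch (encoding conventions are illustrative):

```python
# SAT certificate verifier: clauses is a list of clauses, each a list of ints,
# where k means variable k and -k means its negation; assignment maps
# variable number -> bool. Runs in time linear in the formula size.
def verify_sat(clauses, assignment):
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
formula = [[1, 2], [-1, 3], [-2, -3]]
print(verify_sat(formula, {1: True, 2: False, 3: True}))  # True
print(verify_sat(formula, {1: True, 2: True, 3: True}))   # False
```

Verifying a given assignment is easy; what is believed hard (and is exactly the P vs NP question) is *finding* a satisfying assignment, or determining that none exists.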
A language is decidable if there exists a Turing machine that can decide, given any input string, whether the string is in the language or not. A language is recognizable if there exists a Turing machine that can recognize strings in the language, but may not halt for strings not in the language.
For any finite automaton, there exists a regular expression that defines the same language as the automaton, and vice versa. This shows the equivalence in expressive power between finite automata and regular expressions.
Deterministic space complexity classes measure the amount of space required by a deterministic Turing machine, while non-deterministic space complexity classes measure the space required by a non-deterministic Turing machine.
The Cook-Levin theorem, also known as Cook's theorem, establishes that the Boolean satisfiability problem (SAT) is NP-complete. This result is fundamental to the theory of NP-completeness.
Time complexity measures the amount of time required by an algorithm as a function of the input size, while space complexity measures the amount of memory or space required by an algorithm as a function of the input size.
Finite automata can recognize exactly the set of regular languages, and vice versa: every regular language can be recognized by a finite automaton.
The Myhill-Nerode theorem provides a necessary and sufficient condition for a language to be regular, based on the existence of a finite number of equivalence classes of strings.
Time complexity classes are used to classify decision problems based on their time requirements, while space complexity classes are used to classify decision problems based on their space requirements.
Context-free grammars and push-down automata have equivalent expressive power, meaning that a language is context-free if and only if it can be recognized by a push-down automaton.
The time hierarchy theorem states that a deterministic Turing machine can decide strictly more languages when given sufficiently more time: for any time-constructible function f, there is a language decidable in time O(f(n)) but not in time o(f(n)/log f(n)). This implies that the deterministic time complexity classes form a strict hierarchy.
Turing machines are a mathematical model that precisely captures the notion of computability. A function is computable if and only if it can be computed by a Turing machine.
A recursive language is one that is decidable by a Turing machine, meaning that the machine can determine whether a given input string is in the language or not. A recursively enumerable language is one that is recognizable by a Turing machine, but not necessarily decidable.
Rice's theorem states that any non-trivial semantic property of programs is undecidable. It has important implications for the limitations of program analysis and verification.
Complexity classes are often defined and related to each other using the concept of reductions. If a problem A can be reduced to a problem B, then the complexity of A is bounded by the complexity of B.
Deterministic time complexity classes measure the time required by a deterministic Turing machine, while non-deterministic time complexity classes measure the time required by a non-deterministic Turing machine.
The Immerman-Szelepcsényi theorem states that non-deterministic space complexity classes are closed under complementation, meaning that if a language is in a non-deterministic space complexity class, then its complement is also in the same class.
Finite automata and regular grammars have equivalent expressive power, meaning that a language is regular if and only if it can be generated by a regular grammar.
Worst-case complexity analysis considers the maximum or worst-case running time or space requirements of an algorithm, while average-case complexity analysis considers the expected or average-case running time or space requirements, assuming some distribution of inputs.
Savitch's theorem states that the deterministic and non-deterministic space complexity classes PSPACE and NPSPACE are equal, implying that any problem that can be solved using polynomial non-deterministic space can also be solved using polynomial deterministic space.
Regular expressions and finite automata have equivalent expressive power, meaning that any regular language can be described by a regular expression and recognized by a finite automaton, and vice versa.
The space hierarchy theorem states that a deterministic Turing machine can decide strictly more languages when given sufficiently more space: for any space-constructible function f, there is a language decidable in space O(f(n)) but not in space o(f(n)). This implies that the space complexity classes form a strict hierarchy.