Computer Science

Theory of Computation

The Theory of Computation is the branch of computer science that studies algorithms, their computational complexity, and the limits of what can be computed. It explores the fundamental principles underlying computation, including automata theory, formal languages, and computability theory. The field is essential for understanding the capabilities and limitations of computers and for designing efficient algorithms.

Written by Perlego with AI-assistance

3 Key excerpts on "Theory of Computation"

  • The Engine of Complexity
    Evolution as Computation

    2 Computation
    What is a computation? Listen to music or surf the Internet on your iPad. Start your car in the morning without touching the accelerator pedal, and your engine “decides” how much gasoline and air flow into the cylinders; as the engine warms, the ratio of gas and air changes. Step on the accelerator and the ratio changes again. How are such things possible? Electronic computers function in each of these devices. It is pretty hard these days to escape the influence of computers. They calculate our bills, produce photographs without film, keep track of your likes and dislikes, and predict the weather. Computers are machines that manipulate information, and computer science is the formal discipline that studies limitations and opportunities afforded by this type of activity. In this chapter we will explore the concept of computation. I hope to convince you that the notion of computation is more general than simply what happens in electronic computers, and that it is not possible to cleanly separate many kinds of physical activity from computation.
    Basic to the discipline of computer science is the idea that information can be encoded in patterns of symbols, usually linear sequences that convey meaning to someone or something. Chapter 1 introduced one specific quantitative measure of information: Shannon information. This measure is very useful for some purposes, such as determining how much physical space you need to store some information you value or how much bandwidth you need to transmit it to a distant location. Later in this chapter I will introduce another measure, algorithmic information, which is particularly useful in computer theory.
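    To make the Shannon measure concrete, here is a minimal sketch, not taken from the excerpt, that computes the Shannon information of a short symbol sequence in bits per symbol; the function name and sample strings are illustrative assumptions. Multiplying the result by the sequence length gives a rough lower bound on the storage the excerpt alludes to.

      # Minimal sketch (illustrative): Shannon information of a symbol
      # sequence, in bits per symbol: H = -sum(p * log2(p)) over symbol
      # frequencies p estimated from the sequence itself.
      from collections import Counter
      from math import log2

      def shannon_bits_per_symbol(sequence):
          counts = Counter(sequence)
          total = len(sequence)
          return -sum((c / total) * log2(c / total) for c in counts.values())

      # A uniform four-symbol alphabet needs 2 bits per symbol;
      # a heavily biased sequence needs far less.
      print(shannon_bits_per_symbol("ACGTACGTACGT"))  # 2.0
      print(shannon_bits_per_symbol("AAAAAAAB"))      # ~0.54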
    A convenient way to understand the notion of computation is to see every computation as a process in which a pattern (usually, but not always, a sequence of symbols) interfaces with a device in such a way that a series of changes occurs within the device, culminating in output. The output may be useful in ways that the input is not. This general idea framed in terms of information is diagramed in figure 2.1
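    As a toy illustration of that framing (an input pattern drives internal changes in a device, culminating in an output), here is a minimal sketch that is not from the book; the device and its names are made up for the example. It holds one bit of internal state, each input symbol can change that state, and the final state is the output.

      # Toy illustration of pattern -> device -> output: the device holds one
      # bit of internal state, each input symbol may flip it, and the final
      # state is reported as the output. (Illustrative only.)
      def parity_device(pattern):
          state = "even"              # internal state of the device
          for symbol in pattern:      # the input pattern drives state changes
              if symbol == "1":
                  state = "odd" if state == "even" else "even"
          return state                # the output

      print(parity_device("1101"))  # "odd"  (three 1s)
      print(parity_device("1100"))  # "even" (two 1s)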
  • Memory and the Computational Brain
    Why Cognitive Science will Transform Neuroscience

    • C. R. Gallistel, Adam Philip King (Authors)
    • 2011 (Publication Date)
    • Wiley-Blackwell (Publisher)
    Could a single conceptual framework handle the many different computational problems that one could face? We saw that a set of primitive data, through combinatorial interaction, could serve all of our symbolic needs. Is there a similar set of computational primitives that can serve all of our computational needs? In the last chapter we gained an intuitive understanding of the nature of procedures that perform computations on symbols. Such concepts have existed for a long time; the word algorithm derives from the Persian mathematician al-Khwarizmi, who lived during the ninth century AD. Formalizing the concept of a procedure and what type of machine could implement these procedures without human aid, however, had to wait until the twentieth century.
    Formalizing Procedures
    Twelve years before Shannon published his paper laying the foundations of information theory, Alan Turing published his paper (Turing, 1936) laying the foundations of the modern understanding of computation. Turing started from the intuition that we know how to compute something if we have a step-by-step recipe that, when carefully followed, will yield the answer we seek. Such a recipe is what we have called a procedure. The notion of a procedure was important for those working on the foundations of mathematics, because it was closely connected to an understanding of what constitutes a rigorous proof. Anyone who has scrutinized a complex and lengthy proof is aware of how difficult it is to be sure that there is not a cheat somewhere in it – a step that is taken that does not follow from the preceding steps according to a set of agreed-upon rules.
    Turing’s enterprise, like Shannon’s, was a mathematical one. He did not intend or attempt to build an actual machine. Turing wanted to specify the elements out of which any procedure could be constructed in such an elementary and precise manner that there could be no doubt that each element (each basic step) could be executed by a mindless machine. The intuition here is that a machine cannot cheat, cannot deceive itself, whereas our minds do routinely deceive themselves about the cogency of their reasoning. Thus, he faced a twofold challenge: first, to specify some very simple operations that could obviously be implemented on a machine; and second, and by far the greater challenge, to make the case that those operations sufficed to construct any possible procedure and were capable of performing all possible computations.
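    Turing's elementary operations (read the symbol under a head, write a symbol, move one cell left or right, change state) can be made concrete with a small simulator. The sketch below is illustrative rather than Turing's own formulation; the example machine, a unary incrementer, and all names are assumptions for the demonstration.

      # Minimal Turing machine sketch: each step reads the symbol under the
      # head, then a transition table says what to write, which way to move,
      # and which state to enter next. (Illustrative only.)
      def run_turing_machine(transitions, tape, state="start", blank="_"):
          cells = dict(enumerate(tape))          # sparse tape: position -> symbol
          head = 0
          while state != "halt":
              symbol = cells.get(head, blank)
              write, move, state = transitions[(state, symbol)]
              cells[head] = write
              head += 1 if move == "R" else -1
          return "".join(cells[i] for i in sorted(cells))

      # (state, read symbol) -> (symbol to write, move, next state)
      unary_increment = {
          ("start", "1"): ("1", "R", "start"),   # scan right across the 1s
          ("start", "_"): ("1", "R", "halt"),    # append one more 1, then stop
      }
      print(run_turing_machine(unary_increment, "111"))  # -> "1111"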
  • Philosophy of Computer Science
    An Introduction to the Issues and the Literature

    • William J. Rapaport (Author)
    • 2023 (Publication Date)
    • Wiley-Blackwell (Publisher)
    Some of the features of computational thinking that various people have cited include abstraction, hierarchy, modularity, problem analysis, structured programming, the syntax and semantics of symbol systems, and debugging techniques. (Note that all of these are among the methods for handling complexity!)
    Denning (2009, p. 33) also recognizes the importance of computational thinking. However, he dislikes it as a definition of CS, primarily on the grounds that it is too narrow:
    Computation is present in nature even when scientists are not observing it or thinking about it. Computation is more fundamental than computational thinking. For this reason alone, computational thinking seems like an inadequate characterization of computer science. (Denning, 2009, p. 30)
    A second reason Denning thinks defining CS as computational thinking is too narrow is that there are other equally important forms of thinking: “design thinking, logical thinking, scientific thinking, etc.” (Denning et al., 2017).

    3.16.5 CS as AI

    Computation … is the science of how machines can be made to carry out intellectual processes.
    —John McCarthy (1963, p. 1, my italics)
    The goal of computer science is to endow these information processing devices with as much intelligent behavior as possible.
    —Juris Hartmanis (1993, p. 5, my italics) (cf. Hartmanis, 1995a, p. 10)
    Computational Intelligence is the manifest destiny of computer science, the goal, the destination, the final frontier.
    —Edward A. Feigenbaum (2003, p. 39)
    These aren't exactly definitions of CS, but they could be turned into one: computer science – note: CS, not AI! – is the study of how to make computers “intelligent” and how to understand cognition computationally.
    As we will see in more detail in Chapter 6, the history of computers supports this: it is a history that began with how to get machines to do some human thinking (in particular, certain mathematical calculations) and then more and more. And (as we will see in Chapter 8) the Turing Machine model of computation was motivated by how humans compute: Turing (1936, Section 9) analyzed how humans compute and then designed what we would now call a computer program that does the same thing. But the branch of CS that analyzes how humans perform a task and then designs computer programs to do the same thing is AI. So, the Turing Machine was the first AI program! But defining …