Computer Science

Complexity Theory

Complexity theory in computer science studies the resources, such as time and space, required to solve computational problems. It explores the inherent difficulty of problems and classifies them into complexity classes according to the resources needed to solve them. This field aims to understand the limits of efficient computation and the relationships between different classes of problems.
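As a minimal, illustrative sketch (our own addition, not part of the excerpts below), the Python snippet that follows counts comparison operations for two ways of searching a sorted list; the function names and the operation-counting scheme are assumptions chosen only to show how two algorithms for the same problem can fall into different growth classes (linear versus logarithmic time).

```python
# Illustrative sketch: comparing how two search strategies consume "time",
# counted here as the number of comparison operations performed.

def linear_search(items, target):
    """Scan every element: O(n) comparisons in the worst case."""
    comparisons = 0
    for value in items:
        comparisons += 1
        if value == target:
            return True, comparisons
    return False, comparisons

def binary_search(items, target):
    """Halve the search range each step: O(log n) comparisons."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return True, comparisons
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, comparisons

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Worst case: the target is absent, so the linear scan touches every item,
    # while the binary search needs only about 20 comparisons.
    print(linear_search(data, -1))
    print(binary_search(data, -1))
```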

Written by Perlego with AI-assistance

4 Key excerpts on "Complexity Theory"

  • Emerging Approaches to Educational Research
    eBook - ePub
    • Tara Fenwick, Richard Edwards, Peter Sawchuk (Authors)
    • 2015 (Publication Date)
    • Routledge
      (Publisher)
    To refer to ‘complexity science’ or Complexity Theory as though it is a singular, monolithic body of knowledge is a misrepresentation. The complexity field embraces diverse developments that have informed one another. These include theories of general systems, cybernetics, chaos, deep ecology and autopoiesis. The historical distinctions among these developments are outlined briefly later in this chapter. However, in the main, the discussion here focuses upon the shared understandings of complexity that appear most frequently in educational writings, using a single term ‘Complexity Theory’.
    What is Complexity Theory? One definition is offered by David Byrne, who authored Complexity Theory and the Social Sciences back in 1998, and more recently described complexity as
    the interdisciplinary understanding of reality as composed of complex open systems with emergent properties and transformational potential. A crucial corollary of Complexity Theory is that knowledge is inherently local rather than universal. Complexity science is inherently dynamic. It is concerned with the description and explanation of change.
    (Byrne 2005: 97)
    Complexity’s definition in educational research, claim Davis and Sumara (2008), varies according to who is defining it and their obsessions. Some focus more on complexity as a way to understand knowledge as emergent and enacted rather than represented and acquired. They may do so to consider alternative approaches to curriculum, or to advise teachers about pedagogies that might generate more open, improvisational and collaborative learning activities. Other educators work with concepts of emergence and self-organization to understand how systems take shape in unexpected ways. They may do so to show how change actually occurs in these systems, to trace the unpredicted consequences of particular changes and actions, or even to advise practitioners about effective ways to intervene in a system to cause change. A system can be any assortment of entities – material and virtual, human and technical, seen or unseen – held together by some kind of interrelations with one another to form a collectivity: a classroom of children, a team of professionals, a Facebook
  • Social Synthesis
    eBook - ePub

    Social Synthesis

    Finding Dynamic Patterns in Complex Social Systems

    • Philip Haynes (Author)
    • 2017 (Publication Date)
    • Routledge
      (Publisher)
    This chapter considers the use of Complexity Theory in the social sciences over the last 25 years. The growing application of Complexity Theory in the social sciences followed the adaptation of ideas from complexity science in the natural sciences. The method of Dynamic Pattern Synthesis (DPS) as proposed, developed and demonstrated in this book has its roots in two theoretical and methodological approaches to social science: Complexity Theory and critical realism. The growing recognition that Complexity Theory has implications for social science is changing the approach to the research methods used by social scientists (Byrne & Callaghan, 2014; Castellani & Hafferty, 2009). Social researchers informed by Complexity Theory see the need for methods that can take account of the strong interactions between people in social systems, and recognise that the way these interactions occur leads to indeterminate and difficult-to-forecast outcomes. Rather than looking for fixed causal laws, researchers turn their attention to less stable patterns of association that might suggest temporal causal patterns: patterns that change over time and can even reverse their interactive effect in future relationships. Kauffman (1995, p. 15), a seminal figure in complexity science, notes that all of life ‘unrolls in an unending procession of change’. It is this constant change that feeds the need for research to understand dynamics. This chapter will explore Complexity Theory and consider how it is influencing social science methodology, its connections with the domain of critical realism, and the mixed methods that result. As Byrne and Callaghan (2014, p. 8) remarked in their seminal review of the influence of complexity science on the social sciences:
    When we say Complexity Theory we mean by theory a framework for understanding which asserts the ontological position that much of the world and most of the social world consists of complex systems and if we want to understand it we have to understand it in those terms.

    Complexity science

    The classical reductionist method

    Complexity Theory has its origins in contemporary scientific methodology and it questions the universality of the assumptions of Newtonian reductionist methods. Sir Isaac Newton, argued to be one of the greatest scientists of all time, and Lucasian Professor of Mathematics at Cambridge University from 1669 to 1701, was much involved in the discovery of some of the most important predictable laws of science, for example, gravitational force and its effect on objects on the earth, including the motion of tides, but also its effect on planets in the solar system.
    Similar to Newtonian laws, reductionist methods are premised on the idea that if research analyses the micro detail of physical matter it will be possible to deduce how higher order entities and their forms function. For example, the discovery of microbiology in medicine explained the causes of bacteria-related diseases, and therefore, if drugs could be discovered to kill the destructive microbes, a cure was guaranteed. This classical scientific approach has certainly had historical influence on the development of the social sciences. As Byrne (2002, p. 6) writes:
  • School Leadership and Complexity Theory
    • Keith Morrison (Author)
    • 2012 (Publication Date)
    • Routledge
      (Publisher)
    This interaction is so intricate that it cannot be predicted by linear equations: there are so many variables involved that the behaviour of the system can only be understood as an ‘emerging consequence’ of the sum of the constituent elements (Levy, 1992: 7). The key elements of Complexity Theory are set out in this chapter (Figure 1.1). Complexity Theory looks at the world in ways which break with simple cause-and-effect models, linear predictability and a dissection approach to understanding phenomena, replacing them with organic, nonlinear and holistic approaches (Santonus, 1998: 3) in which relations within interconnected networks are the order of the day (Youngblood, 1997: 27; Wheatley, 1999: 10). In the physical sciences, Laplacian and Newtonian theories of a deterministic universe have collapsed and have been replaced by theories of chaos and complexity in explaining natural processes and phenomena, the impact of which is being felt in the social sciences (e.g. McPherson, 1995). For Laplace and Newton, the universe was a rationalistic, deterministic and clockwork order; effects were functions of causes, small causes (minimal initial conditions) produced small effects (minimal and predictable) and large causes (multiple initial conditions) produced large (multiple) effects. Predictability, causality, patterning, universality and ‘grand’ overarching theories, linearity, continuity, stability, objectivity – all contributed to the view of the universe as an ordered and internally harmonistic mechanism in an albeit complex equilibrium; a rational, closed and deterministic system susceptible to comparatively straightforward scientific discoveries and laws
  • Managing Complexity of Information Systems
    eBook - ePub
    • Pirmin P. Lemberger, Mederic Morel (Authors)
    • 2013 (Publication Date)
    • Wiley-ISTE
      (Publisher)
    Chapter 2. Complexity, Simplicity, and Abstraction
    Recursion is the root of computation since it trades description for time.
    Alan Jay Perlis — Epigrams on Programming  
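One way to read Perlis's epigram is that a very short recursive description can demand a great deal of computation time. The small Python sketch below is our own illustration, not from the book: the naive Fibonacci recurrence is only a few lines of description yet makes an exponential number of calls, while adding a cache trades slightly more description for far less time.

```python
from functools import lru_cache

def fib_naive(n):
    """A two-branch recursive description; exponentially many calls."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """The same description plus a cache; only linearly many calls."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Both return 9227465 for n = 35, but fib_naive takes visibly longer:
# the short description is paid for in computation time.
print(fib_naive(35), fib_memo(35))
```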

    2.1. What does information theory tell us?

    We start our journey through complexity and simplicity concepts with mathematics or, more precisely, information theory. This might seem an exotic topic if what we have in mind are applications to the IT world. However, the concepts that will be at the core of our future preoccupations (information, randomness, and especially complexity) have all been under close scrutiny by mathematicians for more than half a century now. In their hands, these concepts have evolved into a set of ideas that is both deep and robust. Moreover, information theory is one of those few areas where mathematics has succeeded in rigorously formalizing imprecise, almost philosophical concepts, such as complexity and information, to which it brings a unique insight. It would thus seem unreasonable for us to overlook this body of knowledge altogether. These information theory concepts form a collection of metaphors that will help us build a healthy intuition, one that will prove helpful later when we venture into less rigorous but more practical IT concepts. As we shall see, this first look at the subject, through mathematical glasses, also highlights a number of important issues and limitations that occur as soon as one seriously attempts to define complexity.
    As information theory is a highly technical and abstract topic, we can barely afford here to do more than just scratch the surface. We shall strive to present in plain language the major findings in information theory of relevance to us. The interested reader will find more details in Appendix 1.
    Our quick overview of information theory will focus on only three concepts: Shannon's entropy, K-complexity, and Bennett's logical depth. Assume for simplicity's sake that any object or system, whose complexity we wish to define, is described by a binary sequence s such as 001101110… The three concepts mentioned above have one important point in common: they all evaluate the complexity of a system as the quantity of information that its description s contains, assuming that we have a specific goal in mind for s. This goal, as we shall see, is a foreshadowing, in the restricted mathematical context, of the concept of value that we shall examine in Chapter 3
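To make these measures slightly more concrete, the following Python sketch is our own illustration rather than anything from the book: it computes the Shannon entropy of a 0/1 string from symbol frequencies and uses zlib-compressed length as a crude, computable stand-in for K-complexity, which is itself uncomputable; Bennett's logical depth, which asks how long the shortest description takes to run, has no comparably simple proxy and is only noted in a comment.

```python
import math
import zlib

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per symbol, from symbol frequencies in s."""
    if not s:
        return 0.0
    return -sum(
        (s.count(c) / len(s)) * math.log2(s.count(c) / len(s))
        for c in set(s)
    )

def compressed_length(s: str) -> int:
    """Bytes after zlib compression: a rough, computable stand-in for
    K-complexity (the true measure is uncomputable)."""
    return len(zlib.compress(s.encode("ascii")))

regular = "01" * 50                       # highly patterned sequence
irregular = "0011010111001001101011100010110100101110110001011010" * 2

for label, seq in [("regular", regular), ("irregular", irregular)]:
    print(label, round(shannon_entropy(seq), 3), compressed_length(seq))

# The patterned string has maximal per-symbol entropy (equal 0s and 1s)
# yet compresses to very little, i.e. a low K-complexity proxy, while the
# irregular string resists compression. Bennett's logical depth would
# instead ask how much time the shortest description needs to reproduce s.
```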