102 pages, grade: 1.0
List of Figures
List of Tables
2. Working through the OT discourse
2.2 Developing a metatheoretical tool
2.2.1 Preliminary remarks
2.2.4 Four paradigms
2.3 Describing patterns of discourse progression
2.4 Drafting coherent models for organization and change
2.4.1 Preliminary remarks
2.5 Assessing framework market dynamics
3. Connecting the OT with the EC reform discourse
3.2 Tracing the convergence of rhetoric and practice
3.3 Exploring the divergence between rhetoric and practice
3.4 Explaining different adoption patterns
3.4.1 Preliminary remarks
3.4.2 Three types of organization
3.4.3 Change need, change readiness, and the strategy mix
4. Working through the EC reform discourse
4.2 Defining the European Commission
4.3 Exploring change before Kinnock
4.4 Describing the European Commission in
4.5 Assessing the Kinnock Reforms
4.5.1 Preliminary remarks
4.6 Exploring change after Kinnock
List of References
Figure 1 Organizing the OT discourse
Figure 2 Tracking the progression of the OT discourse
Figure 3 Employing systems theories as a metatheoretical device
Figure 4 Suggesting an integrated model of organization
Figure 5 Categorizing change initiatives
Figure 6 Suggesting a ‘best practice’ change management process
Figure 7 Describing dominant NPM interventions
Figure 8 Changing the informal organization through formal intervention
Figure 9 Tracing NPM interventions across the framework market
Figure 10 Illustrating a defective change management process
Figure 11 Disaggregating the Kinnock reforms
Figure 12 Achieving culture change through formal intervention
Figure 13 Assessing the focus of the Kinnock Reforms
Figure 14 Comparing the Kinnock proposals to dominant NPM measures
Figure 15 Identifying the type of change
Figure 16 Assessing Kinnock’s change management process
Figure 17 Illustrating the gap between planned and actual change
Table 1 Assessing and comparing different paradigms
Table 2 Listing change initiatives within the European Commission (1958 - 1999)
illustration not visible in this excerpt
Since the demise of the Santer Commission, internal reform of the European Commission (EC) has received increased academic and political attention. The more recent academic discourse has focused on assessing outcomes of the Kinnock reforms but is marked by divergent approaches and conflicting interpretations. This text proposes to address these issues by embedding the more practical EC Reform discourse within the more abstract Organization Theory (OT) discourse and applying coherent models of organization and change to EC reform.
The first part of the text organizes the OT field within a metatheoretical matrix, proposes coherent models of organization and change, and assesses the dynamics of the management framework market. The second part connects the OT to the EC Reform discourse by analyzing the transfer of management practices from the private to the public sector and by identifying major adoption types across the private-public sector continuum. The third part uses the established models to investigate the change management history of the European Commission, and systematically relates the concept of internal administrative efficiency to external political effectiveness.
More specifically, the first part argues that the essence of most organization theories and management frameworks can be combined across four different onto-epistemologies. This allows drafting a comprehensive model of organization and change that can accommodate the majority of theoretical perspectives and research questions. The second part concludes that the transfer of private sector practices has accelerated significantly since the 1980s, mostly as a function of increased environmental pressures on public sector organizations. Since international organizations (including the EC) face comparatively low degrees of pressure, they are more likely to be late-stage adopters. The third part confirms the EC as a late-stage adopter, as major internal reform could be avoided for roughly 20 years until increasing environmental pressures culminated in the Kinnock Reforms in 1999/2000. A detailed look at the reform process reveals an intense struggle for decision and definition authority at the intra- and interinstitutional level. While the reform design followed a modern managerial orientation, implementation shifted towards a more traditional, ‘Weberian’ focus on control systems. The net result has been a higher degree of overall strategic coherence and control at the expense of some policy initiative at the service level. At the same time, subsequent reforms, especially at the level of Human Resources, demonstrate an increased organizational capacity for change and a renewed focus on managerial development. In the continuing struggle for administrative efficiency (‘doing things well’) and political effectiveness (‘doing the right things’), the EC will have to focus increasingly on finding the right balance between continuous systems optimization and organizational development in order to prepare for a potential shift to a more complex and demanding governance structure.
In this section, I develop an onto-epistemological framework as an analytical tool, draw up a coherent model of organization and change, and assess the dynamics of the framework market. The methodology rests on the assumption that each act of theory-building can be embedded within a chain of four related levels of abstraction, namely metatheory, theory, frameworks, and practice. Each chain segment represents a discursive field whose participants follow different strategies for sense-making. Put simply, metatheory focuses on defining basic assumptions, theory on putting forward a set of related hypotheses, frameworks on creating coherent sets of prescriptive heuristics, and managerial practice on applying and adapting frameworks. The segments are strongly related to their adjacent levels, so that the basic outline of all chain segments can be established for most acts of abstraction, from scientific to common sense hypotheses.
The missing segments can be construed in both directions (i.e., upstream and downstream). While, for instance, most theoretical choices have clear implications for related frameworks, most sense-making acts by managerial practitioners can be traced back to established frameworks, related theoretical conceptions, and implicit metatheoretical assumptions. The four discursive fields also influence each other in both directions and beyond their adjacent levels. For instance, theory-derived frameworks influence managerial practice, but managerial adaptation also influences the design of new theories and frameworks.
I claim further that frameworks and theories at any level of abstraction are - explicitly or implicitly - based on a small set of distinct metatheoretical choices, i.e., they will stipulate what exists (their ontology) and how we can know about it (their epistemology). While ontological choices specify how categories and their relations can be established, epistemological choices focus on the generation, validation, and organization of related knowledge claims (Tsoukas & Knudsen, 2003: 2). Since ontological choices presuppose and imply epistemological choices, I agree with those researchers who treat the two categories as inseparable and speak of combined onto-epistemologies (e.g., Tsoukas, 2005: 607). For reasons of explanatory clarity, though, I will first discuss them sequentially and later combine them when specifying theoretical perspectives on organization and change.
I will first work through a number of fundamental ontological questions and then explore the opposition between a ‘static’ (or ‘modern’) ontological tradition and a more ‘dynamic’ (or ‘postmodern’) conception. Following that, I will investigate three fundamental epistemological approaches and analyze how the OT discourse has created a division between an ‘external’ and an opposing ‘internal’ orientation. Finally, I describe how the four resulting onto-epistemological perspectives frame OT paradigms and explain theoretical progression.
Materialism vs. idealism
Ontological positions can be grouped along two dimensions, a materialist/idealist and a static/dynamic dichotomy with a range of hybrid positions in-between. Whether an ontology can be classified as materialist, idealist or hybrid (usually dualist) depends on the interrelation of three distinct dimensions that correspond to Karl Popper’s cosmology (1972): the realm of the material, the subjective, and the intersubjective. Simply put, the material level includes all physical objects and phenomena, the subjective level all individual mental states and processes, and the intersubjective level all forms of shared meaning and processes of ‘meaningful’ interaction (Popper, 1972: 31). Contrary to Popper, though, I argue that the categories can be embedded within each of the three ontological conceptions.
Materialist monism argues that the material world is external to the observing subject and that the conscious subject itself is part of that world. On a scale from simple inorganic to complex organic phenomena, human subjects represent highly complex organisms, shaped by evolutionary processes. Social (i.e., intersubjective) processes are an indirect function of the material dimension and are subject to the same assumptions.
Idealist monism asserts that the entire material world is a solipsistic imagination (i.e., at the subjective level) or an illusion brought about by another, bene- or malevolent, ideational entity (e.g., Descartes, 1641/1996; Markie, 2008). The subject is at the center, either as creator or recipient of an illusion. Material and social phenomena are a function of the subject or another ideational being.
Hybrid positions contend that there are two or more substances that may (or may not) be traced back to a single substance (neutral monism). Common dualist positions conceptualize existence as a mix of a material and an ideational substance. The ideational enters the material world as an initial cause, as a property of all matter, or as part of a selective interaction with certain organic forms (i.e., complex animals and humans, or just exclusively at the human level). The intersubjective level is the result of interacting ideational entities that use the physical form as the main medium of communication (see e.g., Popper & Eccles, 1984 and Libet, 1985).
Static vs. dynamic orientation
Whether a theory follows a more static or dynamic ontological conception depends on the theorist’s preference for abstract categories or concrete processes. An ideal-type static (i.e., Platonic, modern) ontology conceptualizes objects and phenomena as atomistic, distinct, and stable and their interrelations as linear, causal, and potentially universally predictable. Categories of objects and phenomena are to be defined at the highest possible level of abstraction, at which point the need for the creation of novel categories is eliminated. An ideal-type dynamic (i.e., Heraclitean, postmodern) ontology, on the other hand, sees existence as a complex web of interdependent events. These interactions form mostly transient dynamic equilibria that constitute objects and phenomena. Continuous re-creation and change renders the creation of universal static categories a futile effort (Tsoukas & Knudsen, 2003: 19).
A dynamic materialist monism as an ontological base
The presence of external and internal objects and phenomena suggests a hybrid ontological stance. In contrast to this intuition, I assert that all relevant theoretical and metatheoretical positions within the OT discourse can be organized around a dynamic, materialist monism. This is preferable for two reasons: First, following the principle of ontological parsimony (i.e., semantic simplicity), I prefer one theory over another - ceteris paribus - if it is based on fewer ontological categories. With respect to the concept proposed in this text, a second substance would be explanatorily idle (Baker, 2004). Second, putting very different theories on similar ontological footing enhances innovative theoretical communication and combination.
I put forward two arguments for justifying this metatheoretical simplification. First, all approaches discussed in this text - explicitly or implicitly - presuppose an external dimension (i.e., a material level) and none explicitly puts forward a hybrid stance. Second, any theory that treats subjective and intersubjective phenomena as independent of the material level can also be captured by conceptualizing subjective dynamics as emergent non-reductive phenomena of underlying complex material processes. This allows me to establish the subjective level as a function of the material level and the intersubjective level as a function of the subjective level (and thus ultimately also of the material level). So, I claim that without the material, there is no subjective and intersubjective, and without the subjective, there is no intersubjective level. With this materialist monist embedding of the three categories I differ from Popper who used the same concepts to argue against monist and dualist conceptions (1972: 157). However, even from a pure materialist stance, I argue that it is useful to maintain the tripartite ontological conception as there are no coherent external approaches that could explain (away) conscious intentionality (i.e., the subjective level).
The ultimate dependence on the material level does not translate into a simplistic sociobiological determinism. While I adopt a moderate realist stance and argue that each level is marked by regularities, I also contend that the nature of these regularities is distinct at each level.
While I treat the underpinning material level as ontologically independent from the two other levels, I argue that the subjective level is initially formed by a combination of nature and nurture, i.e., material (genetic programming, material environment), and intersubjective structures (social environment). Intersubjective structures represent past social constructions that become shaping influences (e.g., primary and secondary socialization). In this way, nature and nurture determine how we preconceive, perceive and judge (Tsoukas & Knudsen, 2005: 9).
A preliminary definition of organization and change
Organization in a general sense - as process and as entity - can be defined as the purposeful coordination of activities by two or more individuals (e.g., March & Simon, 1993: 2; Clegg et al., 2008: 8) and constitutes an object/phenomenon across all three ontological levels. In this text, I focus primarily on private and public sector organizations which constitute functional social systems, i.e., purposeful, enduring patterns of interaction. More specifically, the organization consists of the labor process and related acts of communication (interaction), produces a product or provides a service (purpose), and is built to last (enduring). Organizations include lower level patterns, such as enduring and transitory groups and networks. Change in a general sense can be defined as an enduring modification of interaction (i.e., of the labor process and communication patterns).
The two ontological approaches discussed above lead to different perspectives on organization and change. An ideal-type static ontology defines an organization through formal processes and structures (i.e., the labor process, hierarchies, routines, boundaries, etc.). Change occurs if these predefined processes and structures are modified. An ideal-type dynamic ontology, on the other hand, defines an organization as a continuous (inter)subjective reenactment of structures and processes. Consequently, change could only occur through a modification of mindsets and behavior.
Three epistemological stances
There are three fundamental epistemological approaches that address three key issues in a different manner, namely questions about the source, nature, and limits of knowledge generation. Empiricism argues that all knowledge is generated a posteriori, through direct experience of the senses, i.e., awareness or apprehension of things by sight, hearing, touch, smell and taste (Crane, 2005). The external world fills the (mostly) blank slate of the mind with experiences that also trigger related abstraction processes, i.e., the way we think about perceived phenomena (Hume, 1748/1993; Markie, 2008). Thus, sense experience determines the type, amount and quality of knowledge. Good theory builds hypotheses inductively from a large amount and variety of experience. Knowledge is accumulated progressively through additional observations and related abstractions.
Rationalism argues that most knowledge capacity (the way we perceive and interpret knowledge) is generated a priori, i.e., determined by nature through an innate capacity to reason that allows intellectually grasping a proposition and deducing conclusions through valid arguments (intuition/deduction hypothesis) (Markie, 2008). This enables the development of theoretical models that are (mostly) independent from experience. Knowledge claims are falsified against selected experiments. Thus, contrary to empiricism, the type, amount and quality of knowledge depend mostly on reason and theory-dependent observation. Good theory develops a maximum number of bold, falsifiable hypotheses (Popper, 1972). Knowledge is accumulated through on-going falsification, i.e., the confirmation of bold conjectures and the falsification of cautious conjectures (Chalmers, 1999: 79).
Constructivism argues that most knowledge capacity is innate. At the same time, this approach doubts the reliability of experience and reason as put forward by empiricism and rationalism and consequently denies a universal rationalist or inductivist method for conducting science. Rather, good theory needs to focus on analyzing (inter)subjective conditioning and on deconstructing rationalist and empiricist research. Objective knowledge accumulation in the rationalist and empiricist sense is rejected. Instead, the focus is put on the normative basis of research and the sophistication of deconstructive methods.
Progressive knowledge generation
In the following, I argue that although every type of (scientific) thinking is inherently (inter)subjective, it is possible to conceive basic guidelines for progressive knowledge generation by combining and adjusting the three basic epistemological approaches outlined above.
The notion that (scientific) thinking is based on a ‘world-making’ subject is supported by three main arguments. First, any attempt to use the three discussed epistemological approaches to define an objective, universal method for scientific knowledge generation runs into problems of circular justification (i.e., explaining every induction, rational insight, or deconstruction by yet another induction, rational insight, or deconstruction) (Chalmers, 1999: 49).
Second, neuroscientific and psychological research strongly support the hypothesis that the nervous system is an operationally closed system independent from its surrounding environment (Maturana & Varela, 1992; F. B. Simon, 2008: 45). More specifically, interaction with the environment (e.g., sense perception) only initiates recursive loops of higher-order calculations. Visual perception, for instance, is initially encoded only in quantitative form (e.g., degree of brightness). Qualitative differences are then constructed through subsequent information processing by other areas of the brain (von Foerster, 1973: 29; F. B. Simon, 2008: 44). Various types of sense experiences are combined into concepts or schemes (e.g., ‘apple’, ‘table’, ‘mother’, etc.).
Piaget (1971) was able to show how the developing brain (of a child) uses cognitive scheme formation and extension as a learning strategy he referred to as ‘genetic epistemology’. The child initially creates basic schemes by combining different types of sense perceptions and related (e.g., emotional) connotations and then extends these schemes to novel experiences by either modifying the experience (‘assimilation’) or the original scheme (‘accommodation’) (Piaget, 2000: 5). While the developing brain engages strongly in scheme modification and new scheme generation, the fully developed (adult) brain has a stronger preference for assimilating experience (e.g., Gell-Mann, 1994: 303). Knowledge generation thus depends on a genetically determined ‘a priori’ coding process that is initially co-constructed and then mostly stimulated by ‘a posteriori’ experience at all three ontological levels.
Third, sociological research has demonstrated that scientific knowledge progression does not occur through an ordered process of cumulative induction or ongoing falsification. Instead, every set of theories (i.e., paradigm, disciplinary matrix) represents a system of beliefs (or schemes) that defines its own experimental standards and either ignores contradictory results or develops auxiliary hypotheses to explain them (e.g., Kuhn, 1970; Lakatos, 1965). While researchers disagree about the exact definition and description of knowledge progression, they agree that it has been and will be primarily determined by intersubjective dynamics. Subjective Bayesianism offers an insightful descriptive heuristic by assessing the probabilistic impact of a successful falsification, i.e., whether a researcher will abandon his original hypothesis or doubt the experimental set-up. Following the Bayesian formulas, it is rational behavior to doubt the experimental set-up and to retain the original hypothesis (e.g., by adding auxiliary assumptions) until a large number and variety of successful falsifications has been obtained.
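The subjective-Bayesian trade-off described above can be made concrete with a small numeric sketch. The code below (Python; the source contains no code) updates belief in a hypothesis H and in the reliability of the experimental set-up R after a single falsifying observation E. All priors and likelihoods are invented for the illustration and are not taken from the source or from any particular Bayesian author.

```python
# Sketch of the subjective-Bayesian argument: after a 'falsifying' result E,
# a researcher weighs two explanations -- the hypothesis H is false, or the
# experimental set-up R was unreliable. All numbers are illustrative assumptions.

def posterior(prior_h, prior_r, p_e):
    """Return (P(H|E), P(R|E)) after observing the falsifying result E.

    p_e maps (H_true, R_reliable) -> P(E | H, R).
    """
    p_h, p_r = prior_h, prior_r
    # Likelihood of E under H and under not-H, marginalizing over reliability
    p_e_h = p_r * p_e[(True, True)] + (1 - p_r) * p_e[(True, False)]
    p_e_nh = p_r * p_e[(False, True)] + (1 - p_r) * p_e[(False, False)]
    p_e_total = p_e_h * p_h + p_e_nh * (1 - p_h)
    post_h = p_e_h * p_h / p_e_total
    # Likelihood of E under a reliable and an unreliable set-up
    p_e_r = p_e[(True, True)] * p_h + p_e[(False, True)] * (1 - p_h)
    p_e_nr = p_e[(True, False)] * p_h + p_e[(False, False)] * (1 - p_h)
    post_r = p_e_r * p_r / (p_e_r * p_r + p_e_nr * (1 - p_r))
    return post_h, post_r

# Strong prior belief in H, genuine uncertainty about the set-up
likelihoods = {
    (True, True): 0.05,   # E very unlikely if H holds and the set-up works
    (True, False): 0.50,  # a flawed set-up produces E at chance level
    (False, True): 0.95,  # E expected if H is false and the set-up works
    (False, False): 0.50,
}
post_h, post_r = posterior(prior_h=0.9, prior_r=0.5, p_e=likelihoods)
print(f"P(H|E) = {post_h:.3f}, P(R|E) = {post_r:.3f}")
```

With these assumed numbers, belief in H falls only modestly (from 0.90 to about 0.77), while trust in the experimental set-up collapses from 0.50 to about 0.22: under the formulas it is indeed rational to blame the set-up first and retain the hypothesis, exactly the pattern the text describes.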
In summary, there is strong support for the notion that (scientific) thinking is inherently (inter)subjective. Consequently, it can be argued that any form of knowledge claim has a hermeneutic quality and can be conceptualized in linguistic form (e.g., Chalmers, 1999: 10). This has led a considerable number of researchers to describe any type of theoretical discourse as a collection of ‘language games’ (Wittgenstein, 1953; Astley & Zammuto, 1992; Mauws & Phillips, 1995).
At the same time, however, I reject a relativist stance towards theory and method (e.g., Feyerabend, 1975) and argue instead for clear methodical guidelines by conceiving scientific thinking as a cautious interplay of theory and observation.
While it is not possible to determine the full objective reliability of reason and sense experience, I maintain that it can be established in (inter)subjective form. If the subject is conceived as the product of natural selection, then its knowledge capacity and learning strategies (i.e., Piaget’s genetic epistemology) constitute advanced tools for adaptive sense-making and survival. Both reason and experience form part of these capacities and can therefore be defined as reliable in this ‘functional’ sense. Scientific thinking can thus be conceptualized as the attempt to devise a method for increasing the reliability of reason and experience, and, as such, as a direct extension of common sense.
I argue further that scientific thinking needs to follow a number of clear methodical guidelines that specify how theory and experience should be designed and how knowledge can be accumulated. At the theoretical level, any set of hypotheses should be conceived more cautiously as ‘explanans’, i.e., as a deliberate explanatory construct that also determines the ‘explanandum’, i.e., the object of its research (Hempel & Oppenheim, 1948). Good theory should maximize its explanatory reach through the generation of falsifiable hypotheses while, at the same time, securing semantic (i.e., ontological) and syntactic (i.e., epistemological) simplicity. Further, rival theoretical conceptions should find as much common testing ground as possible. At the empirical level, any technique should maximize intersubjective verifiability by clarifying initial hypotheses, maximizing the generation of observable data, and following strict technical standards (Chalmers, 1999: 247).
Different epistemologies for different ontological categories
As outlined earlier, I argue that the type of knowledge differs for each of the three ontological levels. Consequently, I argue that each level needs to be addressed by a tailored combination of all three epistemological approaches.
Knowledge at the material level can be conceived as single hermeneutic and is best approached by an epistemological combination with a strong rationalist/empiricist focus (Chalmers, 1999). Theories can be devised in mostly universal (e.g., mathematical) language and regularities can often be generalized in the form of laws. This facilitates the joint identification of critical phenomena, the agreement on universal definitions, and the generation of falsifiable hypotheses (Chalmers, 1999).
Empirical verification (falsification) occurs through highly standardized active manipulation (i.e., experiments) as this allows for the deliberate isolation of investigated phenomena. The systematic consideration of alternative explanations and the detailed documentation of standards and results allow for the creation of a stock of experiential knowledge that is useful for contemporary as well as for future research (Mayo, 1996).
In summary, by consistently comparing rival theories, focusing on the consistency of language, and building a flexible stock of experiential knowledge, the influence of (inter)subjective dynamics can be weakened and controlled knowledge progression can be ensured.
Knowledge at the subjective and intersubjective level is at least double hermeneutic (depending on the research method and question) and needs to be approached by an epistemological combination with a stronger constructivist/rationalist focus. In order to conform to the standards specified above, psychological and sociological theories need to produce generalized and falsifiable hypotheses, i.e., explain and predict (verbal and non-verbal) behavior. Since (inter)subjective regularities are (still) too complex to be conceptualized in standardized, structured language, theories differ decisively in the way they construct and combine underlying internal driving forces into an explanans. In addition, hypotheses about (inter)subjective phenomena have to be expressed in probabilistic form and may influence the people under study (e.g., Chalmers, 1999: 147; Durkheim, 1895/1982). Consequently, it is more difficult to assess semantic and syntactic simplicity across rival theories.
Empirical techniques include observing and participatory approaches that can be conducted in isolated experimental or in contextual ‘field’ settings and in an overt or covert manner (Bryman, 2008). Each approach has strengths and weaknesses and no approach reaches the reliability and validity of empirical data generation at the material level.
Research in experimental settings attempts to maximize reliability by conforming to strict methodical standards (e.g., double-blind, randomized, controlled) and isolating phenomena under investigation. In principle, it seems possible to create a large sample of comparable, reliable experiments that manipulate the same independent variable. At the same time, however, the reliability of the method and validity of results cannot be guaranteed. Reliability can be contested as an (inter)subjective research question cannot fully exploit the experimental set-up (as opposed to a typical medical research question, for instance, that has a stronger material focus). More specifically, a double-blind set-up cannot rule out direct and indirect feedback effects (i.e., from researcher to researched and from academic to public), a randomized sample cannot exclude a potential bias (it can always be constructed), the presence of control groups cannot guarantee that differences are meaningful (they may be caused by other variables), and all criteria together cannot guarantee the consistent isolation of (inter)subjective phenomena across a large number of trials. Consequently, the reliability of empirical results (e.g., statistically significant correlations) is undermined and any further interpretation can be attacked as biased or premature.
Validity can be contested as the experimental condition - to study phenomena in isolation - removes the research object from its context. Since the goal of most social research is to explain contextual behavior, this results in a considerable gap between explanans and explanandum that cannot be closed convincingly.
Field research, on the other hand, attempts to maximize the validity of empirical results (and related theory) by approaching real life situations and documenting the research object and process in high detail. In principle, it seems possible to create a reliable sample of cases that maximize the validity of results.
Again, however, the reliability of the method and the validity of results cannot be assured. Reliability can be contested because the complexity of the setting makes it impossible to isolate particular phenomena, work with a randomized group of people, observe independently, establish guidelines for repeating an investigation, and thus to establish a sufficiently large sample of similar cases to conduct ‘randomized field trials’ (Bryman, 2008: 369). In addition, field research is subject to similar direct and indirect feedback effects, which decrease the real life quality of the setting. Altogether, field data is much more likely to contain a bias. A transparent, detailed documentation of the research process can improve but not secure reliability.
While the field setting reduces the gap between explanans and explanandum, the validity of results can also be contested as the complexity of any setting makes it almost impossible to abstract from the particular case. Thus, the most immersive (i.e., participatory) studies generate the ‘richest’ pool of data but also the least reliable and transferable results. A systematic attempt to increase reliability and validity, on the other hand, approaches an experimental set-up and reduces the real life quality of the field setting.
In an attempt to maximize reliability and validity, a number of studies combine a variety of methods (e.g., participatory observation, interviewing, experiments). While it can be argued that different methods compensate for individual weaknesses, different methods (e.g., structured and unstructured interviewing) often lead to different results (Bryman, 2008: 585). In addition, the number of empirical techniques is limited through practical and ethical concerns, such as realizing experimental set-ups for large groups, employing covert observation and participation, and making use of systematic experimental manipulation.
In summary, the high degree of causal complexity at the subjective and intersubjective level makes it difficult to compare rival theories. Theorists define different explanans and explananda, and disagree about the reliability and validity of empirical techniques. Consequently, the generation of knowledge cannot follow the same theoretical and empirical stringency as natural scientific research and it is hardly possible to accumulate a stock of empirical set-ups and results beyond contemporary theory. If an empirical technique confirms one hypothesis over another, a rival theorist can always attack the technique employed. If, on the other hand, there is a continuous mismatch between a hypothesis and empirical results, the researcher can use the same argument in his favor by pointing to a lack of empirical reliability and validity.
Two opposing epistemologies within the OT discourse
As organization and change span all three ontological levels, most research questions address a different mix of all three types of knowledge outlined above. For instance, assessing the impact of a ‘lean manufacturing’ initiative in a private sector organization may seem like a rationalist/empiricist exercise. At the same time, the reorganization of tasks and the increase in worker responsibility pose interrelated questions about intersubjective effects that require a more constructivist approach. Consequently, the combined epistemological approach has to be tailored to the individual research question.
Within the OT discourse, there are two dominant, opposing epistemological combinations that correspond to the division between material and (inter)subjective epistemologies outlined above. Theories of organization that follow an ‘external’ (rationalist/empiricist) epistemology attempt to minimize hermeneutic exposure by either explaining away or by formalizing the subjective level. Organizational dynamics can thus be captured in a manner similar to the natural sciences that exclusively deal with the material level. Theories that follow an ‘internal’ (constructivist/rationalist) epistemology, on the other hand, tend to maximize hermeneutic exposure in order to account for (inter)subjective world making.
Together with the ontological dichotomy established above, this epistemological distinction allows the OT discourse to be organized into a metatheoretical matrix with four quadrants, each corresponding to a distinct set of basic assumptions (see figure 1). In the following, I will first describe each of the four paradigms in sequence, then compare their strengths and weaknesses, and finally sketch a framework that helps exploit theoretical combinations across all four quadrants.
The first quadrant
The first quadrant (Q1) includes theories that follow a static/external onto-epistemology and pursue a ‘subject-object’ orientation as proposed by most natural sciences. Following a traditional ‘Newtonian’ conception, organizational phenomena exhibit invariant structures and (statistical) laws, follow a straightforward linear causality, and can be captured by methods similar to those of the natural sciences. Once this invariant logic of organization has been uncovered, universal management guidelines can be established (e.g., Barnard, 1976; Thompson, 1957). More advanced approaches qualify the universal validity of ‘management laws’ by introducing environmental elements as contingency factors (e.g., Donaldson, 2003) or by qualifying human behavior as ‘bounded rational’ (H. A. Simon, 1997: 72).
Figure 1 Organizing the OT discourse (illustration not visible in this excerpt)
The influence of the (inter)subjective is minimized in two ways. While the functionalist mode of explanation establishes the primacy of structures and explains away the subjective level, the rational choice mode formalizes the subjective level as a simplified set of cognitive rules (Scherer, 2003: 310).
Q1 theories are attributed to the disciplines of engineering (e.g., Fayolism, Taylorism, cybernetics), microeconomics (e.g., rational choice, transaction cost, property rights, resource dependence, and game theory), and sociology (e.g., Parsons’ functionalism, contingency theory). The goal of most static/external organization theory is to understand and improve the functioning of organizations. Broadly speaking, Q1 theories tend to follow a technocratic research agenda, “enhancing the effectiveness of formal organizations in the context of a rationalized society” (Tsoukas & Knudsen, 2005: 14).
Organizations are determined by their purpose and their formal composition, i.e., they exist because they fulfill a function more efficiently (e.g., incur lower collective transaction costs (Williamson & Masten, 1995)) than any alternative form of coordination and are adequately described by their formal components (i.e., strategy, structure, systems, processes, employees) (Tsoukas & Knudsen, 2003: 24). Change is conceived as a deliberate effort to increase the organization’s efficiency and effectiveness by manipulating these formal components.
The second quadrant
Theories of the second quadrant (Q2 theories) follow a static/internal onto-epistemology and oppose a natural scientific conception of organizational research. Instead, they follow a ‘subject-subject’ orientation and define organizations as intersubjective phenomena that need to be approached at the subjective level (Giddens, 1993: 28). At the same time, most Q2 theories presuppose invariant structures of organization and assume a linear type of causality. Similar to Q1 theories, research focuses on uncovering general principles of human behavior (specifically motivation) and on devising universal leadership guidelines.
Most Q2 theories follow an interpretivist mode of explanation and focus on understanding the meaning of behavior (verbal and non-verbal) from the perspective of the investigated subject (Scherer, 2003: 310; Giddens, 1993: 28).
Q2 theories are attributed to the disciplines of psychology (e.g., psychoanalysis, group dynamics, human relations), anthropology (e.g., organizational culture), and sociology (e.g., new institutionalism). In line with static/external theories, the ultimate goal of research is to understand and improve the functioning of organizations within the scope of the encompassing system. At the same time, theories often include secondary emancipatory objectives (i.e., general personality development, empowerment).
Organizations are primarily conceived as historically constituted social collectives that negotiate and reproduce meaning (e.g., Scott, 1992). Consequently, the function and formal components alone cannot provide a sufficient description of organization. Instead, organizational dynamics are determined by internal variables, such as individual and collective motivation and leadership. Change is achieved through a permanent modification of individual and collective (verbal and non-verbal) behavioral routines, particularly in the ranks of top and middle management.
The third quadrant
Theories of the third quadrant (Q3) follow a dynamic/external onto-epistemology and - like Q1 theories - pursue a ‘subject-object’ orientation. As opposed to Q1 theories, however, Q3 theories incorporate more recent advances in the natural sciences, assume varying degrees of nonlinear causality, and put a stronger focus on emergent processes. In short, progress in systems analysis across different disciplines (mostly physics and biology) has led to a better understanding of system behavior at different levels of complexity. More recently, this research has been extended to social phenomena, i.e., the fields of economics and organization studies. Instead of searching for static, invariant organizational laws and structures, researchers concentrate on uncovering regularities in organizational behavior, i.e., the laws of dynamics. Management advice within this paradigm is eclectic, as the applicability of advanced systems and complexity theory to social phenomena is still under-researched.
The rational choice and functionalist modes of explanation (constituting two opposing explanatory modes in the first quadrant) are combined in a joint dualist conception, as structures are explained as (weak and strong) emergent effects of individual interaction (Stacey, 2007: 34). Advances in information technology allow the subjective level to be formalized in a more advanced form, i.e., instead of resorting to a fixed set of simplified cognitive rules in a reduced setting, computer simulations allow the modeling of complex, adaptive agents within a changing environment (Stacey, 2007: 186).
Dynamic/external theories are attributed to physics (e.g., chaos, complexity theory, theory of dissipative structures), biology (e.g., population ecology, ‘autopoietic’ systems theory), economics (e.g., behavioral economics, complexity economics), and mathematics (e.g., network theory). As for most previous theories, the ultimate goal of research includes a better understanding and improved functioning of organizations. The research agenda is almost entirely technocratic, and it is therefore mostly coincidental that a number of management guidelines are in line with emancipatory propositions (e.g., self-organization, empowerment) (Stacey, 2007: 212).
Organizations are mostly conceptualized as complex adaptive systems (CAS), i.e., as aggregates of self-interested agents whose deterministic, iterative, and nonlinear interactions constitute mostly unintended emergent (higher-order) effects (Stacey, 2007: 194). Following the theory of dissipative structures (Prigogine, 1984), a complex system can take one of three possible states: stable equilibrium, explosive instability, and stable instability. While the first state is marked by symmetrical and uniform behavior, the second constitutes complete disaggregation. The third state, on the other hand, occurs far from equilibrium - ‘at the edge of chaos’ - and is marked by patterns of self-organization (i.e., dissipative structures) around an equilibrium state (Stacey, 2007: 194). This dynamic cannot be captured through the formal functioning and composition of an organization. Instead, it is necessary to consider the entire ‘connectome’ (or network map) of the organization, i.e., the complete set of agents as well as the type and strength of all ties between them.
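Such a network map can be made concrete as a weighted graph of agents and ties. The following Python sketch is my own illustration (not drawn from the OT literature); the agent names, tie types, and the hub threshold are hypothetical.

```python
from collections import defaultdict

class Connectome:
    """A complete organizational network map: all agents plus the
    type and strength of every tie between them."""

    def __init__(self):
        # (source, target) -> {tie_type: strength in [0, 1]}
        self.ties = defaultdict(dict)

    def add_tie(self, source, target, tie_type, strength):
        self.ties[(source, target)][tie_type] = strength

    def degree(self, agent):
        """Number of distinct agents this agent has outgoing ties to."""
        return len({t for (s, t) in self.ties if s == agent})

    def hubs(self, threshold=2):
        """Agents linking several others - cf. hubs in network theory."""
        sources = {s for (s, _) in self.ties}
        return {s for s in sources if self.degree(s) >= threshold}

# A hypothetical three-agent organization:
org = Connectome()
org.add_tie("A", "B", "formal reporting", 0.9)
org.add_tie("A", "C", "informal advice", 0.4)
org.add_tie("B", "C", "formal reporting", 0.8)
```

In this toy example, `org.hubs()` identifies agent A as the only hub, since A alone maintains ties to two or more other agents; a real analysis would of course require the complete set of agents and ties.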
Change is defined as a permanent shift in organizational structures. While a system in stable equilibrium requires no effort to retain its structures and a great effort to change them, a system in a ‘dissipative’ state needs a great effort to maintain its structures and only a small one to change them (Stacey, 2007: 193). If an organization is faced with a continuously evolving environment, the natural conclusion within a complexity framework is to move the organization towards a dissipative state ‘at the edge of chaos’.
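The distinction between these regimes has a well-known toy analogue in nonlinear dynamics: the logistic map, whose long-run behavior shifts from a stable fixed point through periodic patterns to bounded irregular motion as its control parameter grows. The following Python sketch is my own illustration; the mapping onto Prigogine's three states is only an analogy, and the classification thresholds are hypothetical.

```python
def trajectory(r, x0=0.4, steps=200, keep=50):
    """Iterate the logistic map x -> r*x*(1-x); return the last `keep` states."""
    x, history = x0, []
    for _ in range(steps):
        x = r * x * (1 - x)
        history.append(x)
    return history[-keep:]

def classify(r):
    """Crude regime classification based on the spread of long-run states."""
    tail = trajectory(r)
    spread = max(tail) - min(tail)
    if spread < 1e-6:
        return "stable equilibrium"       # uniform behavior at a fixed point
    if spread > 0.5:
        return "stable instability"       # bounded irregularity, 'edge of chaos'
    return "periodic self-organization"   # regular patterns away from the fixed point
```

For example, `classify(2.5)` yields a stable equilibrium, `classify(3.2)` a periodic pattern, and `classify(3.9)` bounded irregular motion, echoing (in a purely formal sense) the three states distinguished above.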
The fourth quadrant
Theories of the fourth quadrant (Q4) follow a dynamic/internal onto-epistemology and deny the existence of universal laws and structures as proposed by Q1 and Q3 theories. Like theories from the second quadrant, they pursue a ‘subject-subject’ orientation and treat organization as an intersubjective phenomenon that needs to be approached at the subjective level. At the same time, Q4 theories shift their attention almost entirely to the intersubjective level and allow for nonlinear causality. Management guidelines are targeted towards improving communication and are more personal and concrete than those that follow from the other quadrants.
There are three modes of explanation that conceptualize organizational discourse in different manners. While the ‘complex interpretivist’ mode focuses on revealing the dynamic interplay of discursive processes thought to be similar to those of complex structures, the ‘critical’ mode concentrates almost exclusively on the impact of power relations, and the ‘postmodern’ mode focuses on revealing (deconstructing) the situation-specific construction of meaning.
Theories of all three modes can be attributed to psychological, sociological and philosophical thinking. At the same time, theories that follow a ‘complex interpretivist’ mode (e.g., theory of complex responsive processes, ‘psychological’ systems theory, social network theory) have a stronger link to process philosophy, theories that follow a ‘critical’ mode (e.g., critical management studies, open systems theory) are more strongly connected to psychoanalytic thinking, and theories that follow a postmodern mode (e.g., feminist theory, postmodern management theory) are more strongly linked to linguistics and philosophy of language. Accordingly, the goal of research is threefold. While ‘complex interpretivist’ theories have a stronger technocratic focus and aim at understanding and improving the functioning of organizations, critical theories have a clear emancipatory concern (i.e., identifying and leveling power disparities), and postmodern theories aim primarily at understanding and revealing the contingent nature of (inter)subjective meaning (Scherer, 2003: 310).
Q4 theories shift their focus almost entirely to the intersubjective level and see organization as a large, eclectic ‘discursive field’ that is established through continuous (verbal and non-verbal) acts of communication. Similar to Q2, change is achieved through a permanent modification of (verbal and non-verbal) behavior. As opposed to Q2 theories, however, there is a stronger focus on collective practices at lower levels of the organizational hierarchy.
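With all four paradigms described, the matrix can be summarized as a simple lookup structure over the two onto-epistemological dimensions. This is my own compact sketch; the quadrant labels and example theories are taken from the descriptions above.

```python
# The metatheoretical matrix: (ontology, epistemology) -> quadrant, examples.
QUADRANTS = {
    ("static", "external"):  ("Q1", "functionalist / rational choice "
                                    "(e.g., Taylorism, contingency theory)"),
    ("static", "internal"):  ("Q2", "interpretivist "
                                    "(e.g., human relations, organizational culture)"),
    ("dynamic", "external"): ("Q3", "systems / complexity "
                                    "(e.g., complex adaptive systems, population ecology)"),
    ("dynamic", "internal"): ("Q4", "discursive (e.g., complex responsive "
                                    "processes, critical management studies)"),
}

def locate(ontology, epistemology):
    """Return the quadrant label for a given pair of basic assumptions."""
    return QUADRANTS[(ontology, epistemology)][0]
```

A given theory of organization can thus be located by asking two questions: does it presuppose invariant (static) or emergent (dynamic) structures, and does it approach organization externally or internally?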
Comparing the four paradigms
A brief comparison of strengths and weaknesses reveals that it is impossible to devise a clear ‘hierarchy’ of quadrants, as each theory has to cope with the trade-off between syntactic and semantic complexity on the one hand and explanatory reach on the other. Q1 and Q2 theories achieve a high degree of simplicity but do so at the expense of their explanatory reach. Q3 and Q4 theories, on the other hand, add metatheoretical complexity but are able to justify this step with an extended explanatory reach (see table 1). Consequently, the choice of theory depends on the type, complexity, and scope of the research question.
 I employ the term discourse in a very broad sense, i.e., the concept encompasses all acts of communication (verbal and non-verbal) and can refer to linguistic and social language structures (Fairclough, 2003: 2). Accordingly, a discursive field constitutes all acts of communication for a specific language community (i.e., a group of discourse participants that subscribe to particular conventions of language use) (Widdowson, 2007: 129).
 The concept of sensemaking “focuses attention on the idea that the reality of everyday life must be seen as an ongoing ‘accomplishment,’ which takes particular shape and form as individuals attempt to create order and make retrospective sense of the situations in which they find themselves.” (Morgan et al., 1983: 24; Weick, 1993: 11).
 i.e., ontological categories and their interrelation are based on a subject’s perception and judgment (rational and non-rational), and the interrelation of ontological categories determines certain epistemological characteristics (e.g., perception based on the brain as a material construction).
 With respect to Western metaphysical thinking.
 I prefer the antonymic pair ‘static/ dynamic’ to ‘modern/ postmodern’ as the latter two concepts have been employed in too many differing contexts and would only add ambiguity here.
I conceptualize ‘objects’ as more enduring external and internal entities and ‘phenomena’ as more transient external or internal events. Depending on the ontological stance, either concept can incorporate the other.
 The acceptance of other subjects implies a separating substance (i.e., the material level).
 While the mind/body problem is far from being resolved (see e.g., Popper & Eccles, 1984; Libet, 1985 vs. P.S. Churchland, 1981; Dennett & Kinsbourne, 1992), most neuroscientific research works from the assumption that human consciousness is an instance of ‘strong emergence’ (see e.g., Gaillard R. et al., 2009). With the Multiple Draft Model (MDM), Daniel Dennett (1991) has presented a comprehensive materialist conception of consciousness.
Collapsing the intersubjective level conceptually within the subjective level corresponds to a conflationist stance towards the agency-structure dilemma as proposed by Giddens (1993). Conflationism “rejects the analytical dualism of both reductionism and determinism, and insists on the mutual and equal codetermination of agency and structure” (Tsoukas & Knudsen, 2003: 23).
 Related neuroscientific approaches cannot yet account for intentionality at the psychological level (if psychodynamics are an emergent result, how can there be a conscious decision?).
 A complete mathematical description is the goal of a number of researchers (see e.g., Churchland & Sejnowski, 1994).
 This is mostly in line with Bhaskar’s (1978) ontological conception across the natural and social sciences (more recently promoted as ‘critical realism’). As the specifics of the position are still widely debated, a detailed discussion is beyond the scope of this text.
 I argue that the influence of an observer on material phenomena at the subatomic level (i.e., the wave- particle duality) can be disregarded in the context of this paper.
 Even though the intersubjective is a function of the subjective level, every subject is born into structures that were created by other subjects.
 While all members in a group are interconnected, some members in a network are linked through hubs (or nodes). Enduring groups and networks constitute sub-systems.
I define knowledge as the accumulation of ‘justified true beliefs’ (e.g., Steup, 2008).
 I assume no significant role for pre-stored knowledge and concepts, which is a common additional rationalist hypothesis (Markie, 2008).
 Thomas Kuhn (1970/1996) initially used the notion of ‘paradigms’ but later changed this to the concept of ‘disciplinary matrix’ (1974).
Neither objective nor subjective Bayesianism is able to give a coherent and universal account of paradigm change (Chalmers, 1999: 174). At the same time, subjective Bayesianism retains some explanatory value about the limiting impact of successful falsifications (Chalmers, 1999: 187).
Corresponding to the principle of semantic simplicity (i.e., eliminating ontological categories that are explanatorily idle), the principle of syntactic simplicity states that, ceteris paribus, one theory is preferable to another if it contains fewer and/or less complex hypotheses (Baker, 2008).
 Natural sciences that mostly focus on mathematical expressions have an additional advantage as they can focus on structural progression, i.e., hypotheses may be refuted but sound mathematical concepts will nevertheless be retained (Chalmers, 1999: 243).
This shift towards an almost theory-independent stock of experiential knowledge is promoted as ‘New Experimentalism’ (e.g., Mayo, 1996).
I thus reject the ontological claim for unverifiable categories (e.g., a male or female Oedipus Complex) as unscientific. However, the same categories would be acceptable as part of an explanans if they add value by suggesting additional falsifiable hypotheses.
 Since introspection cannot produce valid (intersubjectively verifiable) empirical material, it can only serve as an input for the construction of the underlying explanans, corresponding to the rationalist intuition/deduction hypothesis.
 Decreasing direct feedback through covert techniques raises strong practical and ethical concerns.
 Most interviewing techniques fall between the experimental and the field setting.
See e.g., Gell-Mann, 1994; Kauffman, 1995; Langton, 1996; Holland, 1998 for approaches initially rooted in physics, and Maturana & Varela, 1992 for an approach initially rooted in biology.
 See e.g., Thietart & Forgues, 1995; Nonaka & Takeuchi, 1995; Pascale et al., 2000; F. B. Simon, 2007, F. B. Simon, 2008.
The term Complex Adaptive System (CAS) was coined by Gell-Mann, Kauffman, Langton, and Holland of the Santa Fe Institute (SFI) (Stacey, 2007: 213).
‘Connectome’ is originally a neuroscientific concept that refers to mapping the entire nervous system, including the type and strength of all ties (see e.g., Lichtman, 2008).
 See e.g., Stacey, 2008; Simon, 2007; 2008; Chia, 1995; 1998; 2002; 2006. While Robert Chia is often associated with a postmodern orientation, his goal to understand and improve organization marks him as a ‘complex’ interpretivist. I reserve the postmodern category for researchers that deny the possibility of knowledge accumulation and focus almost exclusively on the deconstruction of meaning.
 See e.g., Willmott, 2008; Voronov, 2008.
 See e.g., Czarniawska, 1999; Calas, 1997.