Qualitative data analysis using grounded theory

Quoted from: Yanchuk V.A. Methodology, theory and method in modern social psychology and personology: An integrative-eclectic approach. Minsk: Bestprint, 2000.

The grounded theory approach emerged in the 1960s out of the ongoing debate over quantitative versus qualitative methods. The term was introduced by the sociologists Glaser and Strauss (B. Glaser, A. Strauss, 1967), who chose "grounded theory" to express the idea of a theory grounded in qualitative data obtained in specific settings, such as unstructured interviews, participant observation, and archival research. N. Pidgeon and K. Henwood emphasize: "the approach is an attempt to find a means of countering positivism, which is focused on confirming pre-existing theories that have little relation to the specific research area and to descriptions of reality as perceived by the research participants themselves" (1998, pp. 253-254).

Today, the concept of "grounded theory" is used in a wide range of contexts, particularly in relation to Glaser and Strauss's methodology for the systematic analysis of unstructured qualitative data. In essence the approach is qualitative, although it involves some elements of quantitative data processing. Its affiliation with the qualitative approach is manifested in its "commitment to the ideas of local contextual theory; studying meanings in all their complexity and contexts; and interest in reflecting the constructions of the world of research participants" (ibid., p. 254).

The grounded theory approach aims, first of all, at overcoming the constraining and distorting influence of pre-existing theories, which set patterns for conducting research and for collecting and interpreting data, thereby alienating the researcher from the reality actually under study. A pre-existing theory directs the search not toward identifying the features and uniqueness of the phenomenon being studied, but toward confirming or disconfirming a theoretically derived hypothesis, which ultimately leads to distortions of various kinds.

The grounded theory approach is implemented within the metaphor of "discovery," as opposed to the metaphor of "existence" used in classical positivist research, and is the result of a constant interchange between data and research conceptualizations, between ideas and the researcher's life experience.

As Pidgeon and Henwood note, "the value of the grounded theory approach is that it offers a range of procedures to facilitate the use of subjectivity (and especially researcher creativity and interpretation) in the process of analyzing qualitative data." The general scheme for implementing the grounded theory approach is presented in the diagram.

Stages of the grounded theory approach.


In connection with the above, it is important to note that "the introduction of procedures for operating with subjectivity should not be understood as the presence of standard procedures and rules for achieving truth. Rather, they represent ways of putting into practice the requirement of active engagement in a detailed analysis of one's own research material, in order to stimulate and organize work with the materials being studied" (Pidgeon, Henwood, 1998, p. 255).

The main goal of implementing a grounded approach is to move from unstructured data to a set of theoretical codes, concepts and interpretations.

Pandit (N.R. Pandit, 1996) offers a detailed description of five analytical phases (not necessarily sequential) of grounded theory construction: organizing the study; data collection; data ordering; data analysis; and comparison with the literature. These phases are accompanied by corresponding research procedures and involve assessment against the following criteria: construct validity, internal validity, external validity and reliability. Construct validity is ensured by clearly defined operational procedures. Internal validity is ensured by establishing causal relationships. External validity requires a clear definition of the domain to which the results of the study can be generalized. Finally, reliability presupposes that the results obtained can be confirmed [ibid., p. 2]. The general framework for constructing a grounded theory is presented in the table.

The process of constructing grounded theory (Pandit, 1996, pp. 3-4).

Study organization phase

Step 1. Literature review.
Activity: determining the research questions; defining a priori constructs.
Rationale: directs and focuses the research.

Step 2. Case selection.
Activity: theoretical (non-random) sampling.
Rationale: focuses effort on theoretically useful cases.

Data collection phase

Step 3. Developing rigorous data collection protocols.
Activity: creating a case study database; using multiple data collection methods, both qualitative and quantitative.
Rationale: increases reliability and construct validity; strengthens the grounding of the theory through triangulation, raising internal validity; yields a synergistic view of what is being studied.

Step 4. Entering the field.
Activity: overlapping data collection and analysis; flexible, opportunistic data collection methods.
Rationale: speeds up analysis and introduces useful adjustments into data collection; allows the researcher to take advantage of emergent themes and the features of unique cases.

Data ordering phase

Step 5. Ordering the data.
Activity: chronological arrangement of events.
Rationale: facilitates data analysis; allows the process to be examined.

Data analysis phase

Step 6. Analysis of data for the first case.
Activity: open coding; axial coding; selective coding.
Rationale: all forms of coding contribute to internal validity.

Step 7. Theoretical sampling.
Activity: literal and theoretical replication across the studied cases (return to Step 2 until the theory is saturated).
Rationale: confirms, extends and refines the theoretical constructs.

Step 8. Reaching closure.
Activity: theoretical saturation where possible.
Rationale: brings the theory to completion when additional cases yield only marginal improvement.

Literature comparison phase

Step 9. Comparison of the constructed theory with the specialized literature.
Activity: comparison with conflicting frameworks; comparison with similar frameworks.
Rationale: improves the specificity of constructs and thereby internal validity; also enhances external validity by establishing the domain to which the study's results can be generalized.

The grounded theory approach offers extremely broad opportunities for constructing theories that, first, reflect as closely and fully as possible the distinctive phenomenology of the social existence of the individual and his environment; second, help free the researcher from the pressure of stereotypes and the framework of previous theories; and third, contribute to increasing the construct, internal and external validity of research results. At the same time, implementing the approach in practice involves a number of difficulties noted by Pandit: it is time-consuming, entails long periods of uncertainty and incomplete data, and is economically costly (Pandit, 1996, p. 13). The same author notes that successful implementation of the approach depends on certain research qualities and skills, "in particular, competence, creativity, and experience (both in conducting research and in the context under study)" (ibid.).

Particularly noteworthy is the discussion of how the grounded theory approach is to be evaluated. Traditionally, the criteria for evaluating a theory (their analysis was presented in Chapter Three) have been validity, reliability, parsimony, empirical support, internal consistency, and generalizability. Nevertheless, a number of authors take the position that for grounded theory "there are no methodological criteria that guarantee the absolute accuracy of research (whether quantitative or qualitative)" (Pidgeon, Henwood, 1998, p. 269). According to these authors, it is better to speak of practical recommendations that promote progress in knowledge of the area under study. Among these are: closeness to the data; integration of theory at various levels of abstraction; reflexivity; documentation; analysis of theoretical sampling and of negative cases; sensitivity to the reality of research participants (respondent validity); transparency; persuasiveness.

The review presented here of two research methodologies, each an attempt to address the adequacy of the psychological study of the phenomenology of the social existence of the individual and his environment, certainly does not exhaust the whole problem field. As before, searches and debates largely remain within the "better or worse" dichotomy. Another, no less fundamental aspect remains untouched: the applicability of a particular research methodology to the qualitative characteristics of the reality being studied, or of human nature (biological, symbolic, reflexive). An attempt to solve this problem is made in our approach of integrative eclecticism through triangulation, which is described in the next section.

In the theoretical justifications of modern qualitative methods, this problem is considered in the context of the methodological tasks that sociologists set themselves when using qualitative methods in their research. One such methodological approach, developed by researchers such as J. Corbin and A. Strauss, is called "grounded theory." Within this approach, "theoretical understanding of the reality being studied is directly included in the process of collecting, analyzing and interpreting data."1 Researchers address the challenges of adapting established scientific procedures for qualitative research; develop special forms of reporting on the rules and methods of conducting research; and determine criteria for assessing research results.

1 Vasilyeva T.S. Fundamentals of qualitative research: Grounded theory // Methodology and methods of sociological research. (Results of exploratory research projects for 1992-1996.) M., 1996. P. 56.

Grounded theory rests on pragmatism and symbolic interactionism. Pragmatism manifests itself in changing the method in accordance with a changing reality, or more precisely, with changes in the researcher's perception under the influence of changes in the object being studied. Perception plays an extremely important role in the work of the researcher as a participant in communication. Luhmann compared perception to the gateways of a social system, which either admit or reject any message. As a result, in the course of a communication's development a certain bifurcation occurs, in the sense of a state of uncertainty as to whether it will continue or be interrupted. According to N. Luhmann, society is "a flow of self-reproducing information messages in a system that describes itself and observes itself."1

1 Luhmann N. The Concept of Society // Problems of Theoretical Sociology. St. Petersburg, 1994. P. 33.

To ensure that these "gateways" of perception interrupt and distort the flow of information messages as little as possible, it is necessary to abandon strict determinism. N.K. Denzin identifies three basic assumptions of symbolic interactionism. "First, social reality is a social product of feelings, knowledge and understanding. The interaction of individuals creates and determines their own meaning of situations. Second, people are able to assign the meanings they need through self-reflection. They are able to give certain forms to their behavior and to control it and the behavior of others. Third, in the course of social interaction, one adjusts one's view of the behavior of others to the meanings that others give to their behavior."2

2 Denzin N.K. The Research Act. A Theoretical Introduction to Sociological Methods. Englewood Cliffs (New Jersey), 1970. P. 5.

This adjustment is often carried out unconsciously, automatically. It usually occurs when a key phrase or word appears that signals a possible mismatch of meanings. In the earlier example of differing perceptions of the concept of "fine weather," the sociologist and the villager would never have noticed their different interpretations, and would not have tried to adapt the concept to their interlocutor, if the peasant had not asked a clarifying question: "So you have had rain? But for us, for a week and a half there has not been a cloud in the sky; everything is burning up." Had this question not been asked, the interlocutors would have noticed nothing and would have retained the illusion of understanding each other and of receiving reliable information about the weather.

The phenomenological method shows that in everyday life it is not always easy to detect a discrepancy between the meaning of words and actions. "Garfinkeling" reveals differences in "background expectations" and "rules of speaking" most effectively in a laboratory experiment, but in a qualitative field study, creating a situation of artificial anomie most often leads to a breakdown in communication. The "gates" of the respondent's perception slam shut in front of the sociologist, because frequent clarifications such as "What did you mean?" are themselves typified in the "rules of speaking": if the sociologist is not a foreigner with poor command of the language, but a representative of the same culture and generation as the respondent, such questioning is no longer identified as a simple failure to understand but as an unwillingness to understand, or as "provocation, indirect aggression," and so on. For example, it can be read as the "smart" city dweller demonstrating through his incomprehension that the "dark" peasant cannot express his thoughts in coherent, intelligible, literate language. Such an interpretation becomes possible because the "general thesis of the interchangeability of perspectives" is violated; and although the interlocutors can still adjust to each other, such communication will be unnatural and may further obscure the "horizons of typicality" of the other's individual world.

The phenomenological method is a qualitative strategy for "collecting and analyzing data about the phenomenological composition of experience and the meaning that a certain object, situation, event or some aspect of one's life has for a person." The method exists in the "halo" of a broad philosophical and psychological approach, originating in Husserl's phenomenological philosophy, aimed at understanding the complete system of formations of consciousness that constitute the objective world through direct intuitive consideration of the essences of transcendentally pure experiences. The initial basis of phenomenology as a scientific method is the idea that any individual experience is true, and that reliable information about it can be obtained by studying the connection between things, human experiences and meanings, the inseparability of being and the world, and by identifying the intentionality of consciousness as its meaning-forming relationship to the world. "From its very origins, phenomenology appeared in the works of E. Husserl as a form of research into the relationships between signs, object referents, the meanings and structure of our experiences, the ways of our everyday perception of things and the work of consciousness that ensures the coherence, meaningfulness and preservation of our experience over time."

The phenomenological method is used in psychological research practice, psychotherapy, and psychiatry. Often the method is implemented within the framework of a case study, which serves as a vehicle for the in-depth, detailed study of a mental phenomenon. Phenomenological research is designed to bring the researcher closer to the immediacy of empirical data, to reflect the diversity and uniqueness of a person's inner world, and to recover those properties of mental phenomena that are lost when they are formalized statistically. The phenomenological method is focused "on obtaining clear, accurate and systematic descriptions of certain aspects of a person's experience, on revealing the structure of experience and the meaning that a certain object, situation, event or some aspect of one's own life has for a person."

Data collection in phenomenological research is carried out through subjects' reports (oral or written surveys yielding introspective accounts and reflections on a given topic), the researcher's own reflective self-reports, and any texts and documents that contain descriptions of a person's inner life.

Data analysis proceeds in stages: semantic units are identified in the interviews, written reports, observations and other texts obtained; these are then combined into "clusters of meanings," and on this basis a generalized description of the experience of the person being studied is produced.
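The staged procedure above can be sketched in code. This is a minimal, hypothetical illustration only: the semantic units and theme labels below are invented for the example, not drawn from any real study.

```python
# Sketch of phenomenological analysis stages: semantic units tagged with
# themes are grouped into "clusters of meanings", and a generalized
# description is assembled from the clusters. All data is hypothetical.
from collections import defaultdict

# Semantic units extracted from reports, each tagged by the researcher.
semantic_units = [
    ("loss of control over daily routine", "helplessness"),
    ("feeling watched by colleagues", "exposure"),
    ("inability to influence decisions", "helplessness"),
]

# Combine units into clusters of meanings, keyed by theme.
clusters = defaultdict(list)
for unit, theme in semantic_units:
    clusters[theme].append(unit)

# Generalized description: one summary line per cluster.
for theme, units in clusters.items():
    print(f"{theme}: {len(units)} unit(s)")
```

The grouping step is deliberately trivial here; in practice the assignment of units to themes is an interpretive act by the researcher, not a mechanical one.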



To use the method in empirical research, you need to have a deep understanding of the essential aspects of the method, communication skills, and be familiar with the practice of its application and interpretation of results. A novice researcher can master this method only under the guidance of an experienced person who has this knowledge, qualities, and experience. The author of this manual is not one of such specialists, so he cannot recommend its use.

The content and procedures of the phenomenological method are described in sufficient detail in the article by A.M. Ulanovsky.

2.3.2. Grounded Theory Method

Grounded theory is an extensive, theoretically grounded, empirically informed method that includes a number of rational procedures and research techniques; it might better be called an approach than a specific method. Its essence is the researcher's step-by-step construction of his own analytical scheme, a theory of the specific phenomenon being studied, which allows him to interpret events in a certain area of social phenomena.

The creators of the method, A. Strauss and J. Corbin, were inspired by the idea of liberating research from the influence of pre-existing theories, which, in their view, by dictating procedures for collecting and interpreting data, alienate the researcher from the reality being studied and prevent the identification of the features of the unique social phenomenon under investigation. To overcome this influence, they proposed distinguishing a number of analytical phases in conducting research (organization of the study; data collection; data ordering; comparison with the literature; theoretical analysis of the data) and techniques (definition of a priori constructs; overlapping of data collection and analysis; use of "open," "axial" and "selective" coding, etc.), following which (not necessarily in the proposed order) ensures movement from unstructured data to interpretations that reflect the distinctiveness of real phenomena of social life and yield a valid and reliable theory, free from the pressure of existing theories and approaches. The "qualitative" character of this approach lies in the rejection of pre-formulated positions and hypotheses and of research organized as their confirmation; in orienting research toward the "discovery" of propositions and hypotheses instead of following a prescribed order of actions; and in constant reflection on the emerging propositions and the researcher's actions, with their flexible adjustment. At the same time, the construction of a "grounded theory" draws on both quantitative and qualitative methods in collecting, processing and interpreting data.

The method has been actively developed and continues to be developed by researchers, branching out and gradually turning into a set of organizational stages, techniques, validity testing procedures, etc. Researchers also note the disadvantages of the method: time and economic costs, inaccessibility to all researchers (certain research qualities and skills are required), lack of methodological criteria to guarantee the accuracy of the results.

A description of the method by its developers is available in the work.

In most studies, the relationship between religion and economy is examined using quantitative methods, in which the researcher specifies certain sets of categories a priori; the respondent, for the most part, can only agree or disagree. In this study, the emphasis is on the categories of the informants themselves: how they connect their economic activity with the central maxim of Christianity, salvation. A focused semi-structured interview was therefore chosen as the method for collecting field data (interview guide in Appendix 1). Its focus is the topic of work and how a believer should conduct himself there. This relatively loose structure gives the informant greater freedom to express his own categories; in the ideal case, the interviewer imposes practically none of his own categories on the informant. The same method will be used to survey experts, on the basis of which it is planned to compile a list of relevant church documents.

Grounded theory procedures were chosen as the method of data analysis. The method was originally developed by B. Glaser and A. Strauss, and its basic principles were first set out in the book The Discovery of Grounded Theory. Various modifications of grounded theory appeared later: "In addition to the works of B. Glaser, A. Strauss and their direct followers, the best-known are the versions of K. Charmaz and A. Clarke." In interview analysis, three types of data coding are distinguished: (1) open coding, (2) axial coding, and (3) selective coding. Open coding identifies concepts according to their properties and dimensions; axial coding establishes relationships between categories and subcategories, taking into account context, conditions, action/interaction strategies, and consequences. Finally, at the selective coding stage the central category is identified. During this procedure there is also a "systematic linking of the central category with other categories, validation of these connections and filling in of categories that require further refinement and development" (Ibid., p. 97).
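As a purely schematic illustration, the products of the three coding stages can be pictured as plain data structures. The fragments, codes and categories below are invented for the example; this is not a real qualitative-analysis workflow or tool.

```python
# Hypothetical products of the three coding stages described above,
# represented as plain Python data structures.

# Open coding: text fragments labelled with emergent concepts.
open_codes = {
    "I try to be honest with clients": ["honesty", "work ethic"],
    "Work is a path to salvation": ["salvation", "vocation"],
}

# Axial coding: a category related to its subcategories, with
# context and consequences made explicit.
axial = {
    "work ethic": {
        "subcategories": ["honesty", "diligence"],
        "context": "everyday economic activity",
        "consequence": "work experienced as religiously meaningful",
    },
}

# Selective coding: a single central category to which others are linked.
central_category = "salvation through work"
links = {concept: central_category
         for concept in axial["work ethic"]["subcategories"]}
print(links)
```

The point of the sketch is only the shape of the output at each stage: flat labels after open coding, structured relations after axial coding, and a single integrating category after selective coding.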

The two types of interviews (with experts and with laypersons), as well as the analysis of documents and of interview transcripts, are planned to be carried out independently of each other, so as to prevent the mutual influence of categories and the importation of categories from one discourse (official church and lay) into the other.

Methodological design of the study

Regarding sample size: since in qualitative research there is no strict formula for determining the number of objects needed for an in-depth and complete analysis, researchers often appeal to so-called "theoretical saturation" to determine it. The idea is that data collection continues until new interviews yield no new codes. According to one study of this issue, the optimal sample size in this case is 12 interviews. If theoretical saturation is not achieved with this number, the sample is planned to be increased to 20 interviews.
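The stopping rule just described can be sketched as follows. The interview codes and the two-interview window are illustrative assumptions for the sketch, not part of the study's design.

```python
# Sketch of the "theoretical saturation" stopping rule: keep collecting
# interviews until recent interviews contribute no codes that have not
# been seen before. All interview data here is hypothetical.

def is_saturated(coded_interviews: list, window: int = 2) -> bool:
    """True if the last `window` interviews added no previously unseen codes."""
    if len(coded_interviews) <= window:
        return False
    seen = set().union(*coded_interviews[:-window])
    new = set().union(*coded_interviews[-window:]) - seen
    return not new

interviews = [
    {"duty", "vocation"},
    {"vocation", "honesty"},
    {"duty", "salvation"},
    {"honesty", "salvation"},   # no new codes
    {"duty", "vocation"},       # no new codes
]
print(is_saturated(interviews))  # True
```

In a real study the check would be made after each interview, stopping at saturation or at the planned cap (here, 20 interviews).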

Description of empirical material. Compendium of the Social Teaching of the Church

The text to be analyzed must be an official document of the Roman Catholic Church reflecting the position of this institution on the economic life of the individual. To compile a corpus of such texts, two expert interviews were conducted with theologians: a priest who had graduated from seminary, and a seminarian in his final year. On their recommendation, my attention was first drawn to the Compendium of the Social Doctrine of the Church (M.: Paolina, 2006), which contains two chapters concerning the economic life of the individual and society: "Human Labor" and "Economic Life."

The Compendium of the Social Doctrine of the Church is a systematic presentation of the Church's position on various social issues across public spheres. It was compiled in 2005 by a theological commission at the direction of Pope Benedict XVI. The starting point of the so-called doctrina socialis was the pontificate of Pope Leo XIII, who issued the famous encyclical Rerum Novarum (1891; electronic version: http://krotov.info/acts/19/1890/1891rerum.html, accessed 04/27/2016), dedicated to issues of economic justice (Zamagni S. Catholic Social Thought, Civil Economy and the Spirit of Capitalism // The True Wealth of Nations. Catholic Social Thought and Economic Life. Ed. by Daniel K. Finn. Oxford University Press, 2010. P. 90). The document was prompted by a number of socio-economic developments of the late 19th and early 20th centuries: the growing use of machine labor, the opening of new markets, the development of industry and commerce, and increasing urbanization and exodus from the countryside (Teixeira P., Almodovar A. Economics and Theology in Europe from the Nineteenth Century: From the Early Nineteenth Century's Christian Political Economy to Modern Catholic Social Doctrine // The Oxford Handbook of Christianity and Economics. Ed. by Oslington P. Oxford University Press, 2014. P. 114). In addition, the Church was confronted with the French Revolution, the overthrow of the monarchical regime and the rise of liberalism.

In his encyclical the Pontiff speaks out against two ideological currents: "predatory capitalism" and socialism. The first he accuses of the heartlessness of business owners and of the fact that now "...a few rich people can keep many poor people under a yoke that is little better than slavery" (Leo XIII. Rerum Novarum. Electronic version: http://krotov.info/acts/19/1890/1891rerum.html, accessed 04/27/2016). Socialism, which arose as a way of overcoming the problems of "predatory capitalism," turned out to be no better, since it violates basic and natural human rights, above all the right to property. In this regard, Pope Leo XIII decided to answer the "challenge of the time" and to propose, on behalf of the Church, a third way of solving the problem; in essence, it consists in applying Christian moral law to the new economic reality.

Subsequently, the social teaching of the Catholic Church was powerfully developed in the writings of Pope John Paul II, many of which were devoted to labor issues. Three encyclicals had a particular impact on social teaching: Laborem exercens (1981), Sollicitudo rei socialis (1987) and Centesimus annus (1991). The Compendium, in effect, brings together everything said across the various encyclicals, decrees, and the Catechism; thus an official church position was formulated on many issues and problem situations in various spheres of human existence.

In sociological practice, qualitative data are data expressed in non-numerical form. Their carriers can be drawings, photographs, video materials, symbols and signs of various kinds, things and objects, and so on, but most often they take the form of verbal information: text or speech. Qualitative data differ from quantitative data in that their content carries a meaning that directly characterizes its bearer, while quantitative data indicate the scale, volume and intensity of the characteristics of the phenomenon being studied. Qualitative data allow us to reveal the meaning of a social phenomenon; quantitative data show how often it occurs or how intensely it is represented in social reality. Qualitative data indicate the subject; quantitative data show how strongly this subject of research is manifested in the object. Continuing this line of reasoning, we may conclude that the one kind of data is oriented more toward forming a judgment about a social phenomenon, the other toward assessing the significance of, or testing, that judgment. These differences in the nature of the two types of data have led to so-called qualitative research (research based on the collection and analysis of qualitative data) being associated more with the stage of generating or building a theory, and quantitative research with its verification.

Although the methodology of qualitative data analysis is used ever more widely in Russian sociology, the practice is still not universally accepted. At the same time, the problem of the balance between qualitative and quantitative analytic strategies has been raised by sociologists since the beginning of the 20th century, and today the debate about the place and role of qualitative and quantitative methodologies continues to expand. In this paper we trace the historical development of the use of qualitative data in sociology. Given that the issue has a rich history that cannot be confined to a single article, we focus here on two strategies of qualitative analysis, analytic induction and grounded theory, and consider their continuity in historical perspective.

Analytical induction

The first major sociological experience of qualitative data analysis may be considered W. Thomas and F. Znaniecki's five-volume work The Polish Peasant in Europe and America, which has become a classic. It provides a detailed comparative analysis of the impact of social changes on the Polish family, primary groups and communal ties in two environments: Poland and America.

The first volume contains methodological notes that define the study's basic concepts: values, attitudes and the definition of the situation. W. Thomas and F. Znaniecki delimited the subject areas of sociology and social psychology: sociology, in their view, studies systems of values, while social psychology studies attitudes. The authors grounded their analysis of the problems of Polish emigrants in laws of formation that involve both values and attitudes. The changes found in Polish families who emigrated to America are not simply changes in their values or their attitudes, but a synthesis of both, which the authors called the "definition of the situation."

The source of both family and social disorganization was, first of all, the destruction of the integrity of attitudes toward traditional forms of life, and hence the formation of a more pronounced individuality. Changes in the subjective and objective aspects of social life led to fundamentally new interpersonal relationships. Studying the process of these radical changes required the researchers to develop a new type of concept with which to trace psycho-social dynamics. The formation of this concept rested on a detailed study of the subjective dimensions of the life of an individual or group.

The work of W. Thomas and F. Znaniecki used materials from some 50 sets of correspondence between families living in Poland and the United States, as well as newspaper clippings, letters, biographies, autobiographies, magazine clippings, and field notes made in Polish communities and organizations. All these materials, which the authors united under the concept of "human documents," fully corresponded to their methodological aims: they expressed human feelings, were close to everyday experience and perception, and at the same time were "objective," since they could be analyzed without distorting their content or meaning.

The analytical techniques used in this study were later systematically examined by F. Znaniecki in his work The Method of Sociology. That book is largely theoretical and methodological in nature: the author discusses how social knowledge is created and disseminated. In Znaniecki's conception, the discovery and the proof of new knowledge have different natures. Discovery is tied to a psychological process of thinking that is sometimes very difficult to explain rationally. The proof stage that follows discovery is always based on logical laws for determining truth: "Only those conclusions are valid which, once discovered, can be deduced from valid premises in accordance with the rules of logic." Since the proof stage turns out to be more important for convincing others of the reliability of the acquired knowledge, scientific reports primarily reflect the process of deduction and verification of a new idea, which can create the illusion that the latter has priority.

To eliminate this methodological imbalance, the method of analytical induction was proposed, aimed at systematically describing the process of developing hypotheses and defining new concepts. Induction as a logical technique is the process of deriving a general judgment from a certain set of phenomena taken separately for observation. Analytical induction contributes to the development of universal statements about the essential features of a phenomenon, or about the causes or foundations that precede and determine it. The logic of this process can be briefly illustrated as follows.

Suppose there is a set of cases A, B, C, D, E, F, H. Let us take case A and examine its characteristics. It has features P, R, S, which can be written briefly as A (P, R, S). Let us examine the other cases:

B (Q, R, S)
C (Q, P, S)
D (K, R, S)
E (K, P, S)
F (K, Q, S)
H (K, Q, P)

Of the characteristics identified for the cases under study, the one common to and repeated in all of them is S (except in case H). The remaining features are either specific to a single case, or the researcher has not yet been able to identify something significant that unites these differing features. In the first situation (as with case H) the case under study is considered not to belong to this type; in the second, explanations are sought for features K, P, Q, R that generalize them and apply to each case of the class under study. In the end, it turns out that the necessary and sufficient condition for the existence of the phenomenon N under study is the set of characteristics S, x, y. The absence of these characteristics will indicate the absence of the phenomenon N.
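The intersection logic above can be sketched in a few lines of Python. The case names and feature labels mirror the example; representing cases as a dictionary of sets is purely an illustrative convenience, not part of the method itself:

```python
# Cases and their observed features, as in the worked example above.
cases = {
    "A": {"P", "R", "S"},
    "B": {"Q", "R", "S"},
    "C": {"Q", "P", "S"},
    "D": {"K", "R", "S"},
    "E": {"K", "P", "S"},
    "F": {"K", "Q", "S"},
    "H": {"K", "Q", "P"},
}

# Features shared by every case: empty here, because case H lacks S.
common_all = set.intersection(*cases.values())

# Dropping H restores a shared feature, the candidate "essential" characteristic S.
core = set.intersection(*(f for name, f in cases.items() if name != "H"))

# Deviant cases: those missing a feature shared by all the others.
deviant = sorted(name for name, f in cases.items() if not core <= f)

print(common_all)  # set()
print(core)        # {'S'}
print(deviant)     # ['H']
```

The analyst's real work, of course, lies in deciding whether a deviant case like H should be excluded from the type or should force a revision of the type's definition.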

The result of a generalization, according to F. Znaniecki, is the identification of what is "essential in each individual case" of the class of phenomena to which the generalization applies. The more significant the analytical induction, the more features are confirmed as identical or similar across most cases. In this respect, analytical induction can also be called the method of typical cases. On this point F. Znaniecki writes: "When a particular specific case is analyzed as typical (or eidetic), we conclude that the essential features that define it as such are common to all cases of the given class and distinguish it from others."

He contrasts analytical induction with enumerative induction (from the English enumerate), which is based on statistical generalizations. The advantage of the latter is that the conclusions it yields meet the statistical requirements for drawing inferences, since it rests on a sufficiently large sample of cases assumed to represent the social group under study. Criticism of the shortcomings of enumerative induction comes down mainly to the practical impossibility of determining the size of the group from which the sample population is constructed. For example, if housewives are being studied, it is impossible to determine their exact (or even approximately accurate) number until we have a final, complete list of the characteristics of the entire group, which can only be obtained through analytical induction.

"Analytical induction" was later used by followers of the Chicago school. According to P. Manning, the core of modern developments of this method comprises Robert Cooley Angell's The Family Encounters the Depression (1936), Alfred Lindesmith's study of opiate addiction (1947), Donald Cressey's Other People's Money (1953), and two works by Howard Becker describing the same study: "Becoming a Marihuana User" (1953) and "Marihuana Use and Social Control" (1955).

The procedure of “analytical induction” was first operationally described in the work of W. Robinson. He identified six stages in the process of analytical induction.
1. An approximate definition of the phenomenon being studied is given.
2. Hypothetical explanations for this phenomenon are formulated.
3. One case is examined to determine whether the hypothesis matches the facts.
4. If the hypothesis does not correspond to the facts, then either the hypothesis is revised, or the phenomenon is rethought, or the studied case is excluded from those corresponding to the given phenomenon. After this, the definition is clarified.
5. A sufficient level of certainty can be achieved after several cases have been tested, but the discovery by the researcher of isolated facts that contradict the explanations requires reformulation of the hypothesis.
6. The process of testing cases to define a phenomenon and refine a hypothesis should continue until universal relationships are established.
Subsequently, one more item was added to this list:
7. Cases that lie outside the area described by the definition are checked against the final hypothesis. The researcher considers whether all the scientifically established conditions of the phenomenon are always present when it occurs, and always absent when it does not.
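Robinson's stages can be read as a control loop. The sketch below is one interpretation, not Robinson's own formalization; every helper passed in (is_instance, fits, revise_hypothesis, redefine_phenomenon) is a hypothetical stand-in for the analyst's judgment:

```python
# A schematic rendering of Robinson's stages as a single pass over the cases.
def analytic_induction(cases, definition, hypothesis,
                       is_instance, fits, revise_hypothesis, redefine_phenomenon):
    accepted = []
    for case in cases:                            # stages 3 and 5: examine case by case
        if not is_instance(case, definition):     # stage 4: exclude non-instances
            continue
        if not fits(hypothesis, case):            # one negative case suffices...
            hypothesis = revise_hypothesis(hypothesis, case)   # ...to force revision
            definition = redefine_phenomenon(definition, case) # ...or redefinition
        accepted.append(case)
    return definition, hypothesis, accepted       # stage 6: candidate universal relation

# Toy usage: cases are feature sets; the "hypothesis" is the feature set it covers.
d, h, acc = analytic_induction(
    cases=[{"S", "R"}, {"S", "P"}, {"S", "Q"}],
    definition={"S"},                             # phenomenon defined by feature S
    hypothesis={"S", "R"},
    is_instance=lambda c, d: d <= c,              # case must show the defining features
    fits=lambda h, c: c <= h,                     # hypothesis must cover the case
    revise_hypothesis=lambda h, c: h | c,         # broaden hypothesis to cover it
    redefine_phenomenon=lambda d, c: d,           # definition unchanged in this toy
)
print(h)   # the hypothesis has been broadened to cover all accepted cases
```

In actual analytical induction the loop does not terminate after one pass: as stage 6 states, testing continues until no further negative cases force a revision.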

Thus, analytical-inductive research involves two important components: reformulating the hypothesis so that it ultimately covers the negative cases, and changing the definition of the phenomenon itself in the process of eliminating some of them. These two mutually directed processes are carried on until their "closing point," not obvious at the outset, is found.
Revisiting a hypothesis in light of negative evidence is a fundamental characteristic of analytic induction. This position rests on the fact that the mere accumulation of evidence is not an adequate argument: one can always find evidence that contradicts what has been collected. For example, A. Lindesmith, in a study of drug addiction based on approximately fifty interviews with addicts, revised his hypothesis twice as he encountered more and more refuting evidence. His initial assumption that drug use is a psychological problem was transformed in the course of the study. The researcher concluded that addiction arises in the following situation: first a person uses drugs for exploratory (cognitive) purposes, then he realizes that the drug relieves acute suffering, and that this suffering is, in turn, a consequence of the absence of the drug.

Another way of using negative evidence was demonstrated in the work of D. Cressey. The concept of "embezzlement," which he redefined as a "criminal violation of a position of financial trust," was used to review the collected cases. Cases that did not meet this definition were excluded from further analysis: these were interviews with criminals who had not occupied positions requiring a high level of trust, who had lied to the employer, or who had from the outset intended to exploit the opportunities of the job to their own advantage.

“The search for the universal,” as J. Turner called analytical induction, is a search for empirically established causes based on a selected number of cases analyzed in detail. In this process, most of the problematic features of social research must be dealt with realistically.

After the work of the late 1930s, the method of analytical induction began to lose popularity: sociologists turned increasingly to quantitative, formalized methods, which better matched traditional scientific criteria. As R. Faris notes, the debate revolved around the capabilities of statistical methods: how accessible human experience and its meanings are to them. An example of such a debate is the dissertation of Samuel Stouffer, which summarized a study of student attitudes toward alcohol prohibition and prohibition policy in general. An attitude scale was created to explore the issue, and it was hypothesized that it would recreate the same picture as professionally conducted case studies. Students were asked to write autobiographies covering all of their life experience relating to alcohol and the law prohibiting it; in addition, they completed questionnaires based on the Thurstone scale. Judges' rankings of the life stories were then compared with the scale scores. Stouffer found a strong, significant relationship between the two. The study confirmed the assumption that the relatively more laborious task of collecting and analyzing life stories did not provide significantly more knowledge about the set of attitudes captured by the scale. From this point of view, the quantitative approach was justified as the more adequate method for many problems: it was more effective, faster, and easier to use.

The decline in the popularity of analytical induction was also furthered by criticism of the method from a positivist standpoint, often based on the rules for constructing statistical models. Among the arguments was the impossibility of predicting or determining the degree of manifestation and variation of the trait being studied. The claim to causal analysis contained in the method's definition was also criticized: experience with analytical induction showed that it is more productive for forming concepts than for identifying universal causal connections.

Despite the above, the contribution of researchers who used this method is undeniable. Analytical induction, first applied and methodically developed within the Chicago school, laid the foundations of qualitative analysis as an empirical method. Among these foundations: 1. The preservation in unchanged form of the primary data documenting the cases under study; 2. The methodological possibility (and sometimes necessity) of revising hypotheses and the conceptual definitions of basic concepts in the course of analysis; 3. Typology as the main result of such a study, based on the most significant characteristics of the group or phenomenon being studied.

Beyond the methodological and procedural innovations that Chicago sociologists introduced, there is another, no less important point: the establishment and development of a special genre of writing sociological works. Alongside works whose content consisted mainly of abstract theoretical reasoning, works based on detailed studies of real life situations began to appear. These works contained many quotations from interviews, diaries, and other personal documents of the people studied, illustrating the theoretical reflections and making them more vivid and closer to the object of study. The fact that the object of research "spoke" through quotations in sociologists' works largely predetermined the later debates about the respective roles of researcher and subject. The works of Chicago sociologists are a striking example of how the person being studied can be represented in a work as a character with a voice of his own. This use of quotation expanded the rhetorical possibilities of theoretical reasoning.

In the 1960s, qualitative methods gained renewed attention. In our opinion, this was due to the following three reasons.
First, the theoretical-methodological premise lay in the fact that, by the time the first works on "grounded theory" were written, the phenomenological, interpretive approach had become influential in sociology. At this time such theoretical books were published as A. Schutz's The Phenomenology of the Social World (1967), Berger and Luckmann's The Social Construction of Reality (1966), and E. Goffman's The Presentation of Self in Everyday Life (1959) and Interaction Ritual: Essays on Face-to-Face Behavior (1967), among others. Such works created a general theoretical climate in which the understanding of sociological methods was revised and expanded. In step with this interpretive theoretical approach, the theoretical and methodological positions of A. Cicourel and N. Denzin were developed.

Second, the methodological premise was that, on the one hand, a large amount of quantitative data had been accumulated, and on that basis the limits of the cognitive capabilities of survey techniques and content analysis had been recognized. On the other hand, research using a qualitative strategy had also accumulated material for methodological generalization. R. Merton anticipated this situation. In a methodological description of his work he wrote: "This part of our report ... is significant for the sociological community as a practice of including in publications detailed examinations of the ways in which qualitative analysis can actually develop. Only when a significant number of such reports have appeared will it be possible to define the methods of qualitative analysis more clearly." A similar idea was expressed by P. Lazarsfeld in his work "Some Functions of Qualitative Analysis in Social Research."

And finally, the procedural premise followed directly from the previous one. The cognitive limitations of mass surveys prompted researchers to create new techniques that would make it possible to study those aspects of social phenomena that are "unquantifiable." Among the features of the quantitative approach that limit its capacity to study dynamic social phenomena are the rigidity of the order of research procedures, the a priori formulation of a model of the research subject, and the positivist understanding of the hypothesis (a hypothesis is put forward only at the initial stage of research and is then verified, that is, accepted or rejected, but not adjusted or modified).

This situation became the context for the book The Discovery of Grounded Theory, whose authors, Barney Glaser and Anselm Strauss, drawing on the experience of the Chicago school and taking into account the criticism addressed to it, created their own strategy for data analysis.

One of the most consistent and productive critics of the Chicago school was Herbert Blumer. In his critical remarks he wrote that, despite the large number of "human documents" said to have been analyzed, not all concepts were based on empirical data. Many of the ideas presented in The Polish Peasant... were developments of W. Thomas's earlier works; yet the fact that the authors used quotations selectively rather than systematically was never acknowledged. The second shortcoming stems from the first: because many conclusions are drawn without recourse to data, it is difficult to determine how reliable these claims are, even if they seem plausible. Thus, the indirect use of data that does not confirm a hypothesis, but rather encourages its development, is useful for theorizing, not for verifying and establishing universal causal relationships.

In this regard, B. Glaser and A. Strauss note that "F. Znaniecki's response to Blumer's criticism regarding the verification problem is instructive." Znaniecki agrees that the material in his monograph does not always provide solid support for the theoretical formulations, but holds that this is consistent with "the adequacy of the overall conceptual approach to considering the data." Yet although Znaniecki talks about developing a theoretical approach, he never raises the question of methods for generating theory, and this is a sound basis for criticizing analytic induction.

Another important omission of the Chicago social scientists working with qualitative data was the lack of sophistication and technical rigor in their use of the data: common sense and the researcher's own logic of reasoning seemed to dominate the formation of scientific knowledge. A further shortcoming was that "the monographs based on qualitative data consisted of long, detailed descriptions that were summarized in short theoretical discussions," the researchers' main concern being to "make the story coherent." B. Glaser and A. Strauss therefore call such works insufficiently theoretical, or too "impressionistic." In contrast, from the late 1930s and especially after the Second World War, "quantitative researchers made great strides both in developing rigorous evidence and in translating theoretical concepts into research tools. As a result, it became possible to demand more rigorous confirmation of theory. The advantages of quantitative methods thus facilitated the testing of unsupported theories against evidence."

The tilt toward the development of quantitative research techniques meant that the "rhetoric of verification" characteristic of quantitative research began to extend to qualitative research as well. It is therefore quite logical that, from the standpoint of verification, qualitative methods were assigned only a secondary, complementary role, as in the projects of P. Lazarsfeld and S. Stouffer. According to Glaser and Strauss, however, this view of qualitative analysis is inappropriate and significantly narrows its capabilities. These authors argued that qualitative data analysis is characterized by a "rhetoric of theory generation," built on principles different from those of the "rhetoric of verification." Qualitative research requires a different analytical strategy, one that leads to the construction of conceptually "thick" theory based on collected documents of people's lives, a strategy that the Chicago sociologists sorely lacked.

In order to make theory building more systematic, B. Glaser and A. Strauss propose several necessary components of an analysis strategy for qualitative research. First, the research must be iterative: the analytical process must alternate with the process of collecting information, or even run parallel to it. Second, observing this principle allows a theoretical sample to be created in the course of the research, the purpose of which is to represent not the group of people being studied (the object of the study) but the aspects, properties, characteristics, or qualities of the phenomenon under study (the subject of the study). "Theoretical sampling is the process of data collection for generating theory whereby the analyst jointly collects, codes, and analyzes his data and decides what data to collect next and where to find them, in order to develop his theory as it emerges. This process of data collection is controlled by the emerging theory." And finally, the third component is the constant comparative analysis used at different stages of the analytical process. Locating it in the methodological field of the time, the authors place it between the two main analysis strategies then in use. The first is classical content analysis: an encoding model is specified first, and the data are then systematically collected, assessed, and analyzed according to predetermined, unchanging scales common to all of them, which give qualitative (verbal) data a quantifiable form. On the basis of the newly structured data array, previously advanced hypotheses are tested (accepted or refuted) using a numerical model.

B. Glaser and A. Strauss associate the second approach with situations where some preliminary ideas or hypotheses need to be developed. Here the operation of detailed coding can only slow progress toward the goal, so "the analyst merely inspects his data for new properties of his theoretical categories, and writes memos (analytic notes) on these properties." This approach describes, rather, the initial stage of coding and is insufficient for theory building, since the latter requires constant transformation and re-integration of the data as material is accumulated and reviewed. That task corresponds to the third approach, proposed by the authors themselves. In its analytical procedures of constant comparison, it combines the detailed coding procedure of the first approach with the style of theory development of the second. "The purpose of the constant comparative method of joint coding and analysis is to generate theory more systematically than allowed by the second approach, by using explicit coding and analytic procedures." While more systematic than the second approach, the method of constant comparisons is at the same time unlike the first, which is designed for provisional testing, not for discovering theory.

The authors note that a combination of these two analysis strategies was already attempted in "analytic induction," but that method serves other purposes than those of "grounded theory": "In contrast to analytic induction, the constant comparative method is concerned with generating and plausibly suggesting (but not provisionally testing) categories, properties, and hypotheses about general problems. Some of these properties may be causes, as in analytic induction, but unlike it, others are conditions, consequences, dimensions, types, processes, etc. In both approaches, these properties result in an integrated theory. Further, the constant comparative method does not attempt to establish universality or to prove suggested causes or other properties. The constant comparative method, in contrast to analytical induction, requires only saturation of data, not the consideration of all the data obtained, and so is not limited to one type of clearly defined case. (Italics mine. - O.K.) The constant comparative method, in contrast to analytical induction, is more readily applied in research with any kind of qualitative information, including observations, interviews, documents, articles, books, etc."

The comparative method is used at every stage of the analytical process of grounded theory construction. It includes the following procedures: coding, identification of key categories, theoretical selection and theoretical sampling, theoretical saturation, and theory integration.
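The coding-and-comparison cycle can be caricatured in a short sketch: each new data segment is compared with the categories that have emerged so far and is either merged into the closest one or opens a new category. The word-overlap similarity measure and the 0.3 threshold below are arbitrary stand-ins for the analyst's judgment; Glaser and Strauss specify no such metric:

```python
# Jaccard-style word overlap between two text segments (an illustrative proxy
# for the analyst's judgment of similarity, not part of grounded theory itself).
def similarity(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def constant_comparison(segments, threshold=0.3):
    categories = []                       # each category is a list of its segments
    for seg in segments:
        best, best_sim = None, 0.0
        for cat in categories:            # compare with every existing category
            sim = max(similarity(seg, s) for s in cat)
            if sim > best_sim:
                best, best_sim = cat, sim
        if best is not None and best_sim >= threshold:
            best.append(seg)              # integrate into an emerging category
        else:
            categories.append([seg])      # open a new category
    return categories

# Invented toy segments echoing the studies discussed above.
segments = [
    "the drug relieves the pain",
    "drug use relieves withdrawal pain",
    "letters from family in Poland",
]
groups = constant_comparison(segments)
print(len(groups))   # 2: the two drug segments merge, the letters stand alone
```

The point of the sketch is only the control flow: comparison happens continuously, during collection, and categories are reshaped as each new segment arrives.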

The creation of a grounded theory goes through three stages, induction, deduction, and verification, each of which is "absolutely essential" to the formation of a new theory. Moreover, it is important to note that, according to A. Strauss, the three are not applied sequentially but are present to one degree or another at every step of the study. "Induction comes down to actions that lead to the discovery of hypotheses, that is, to the emergence of intuitive hunches or ideas and their subsequent deployment in hypotheses and assessments, even if preliminary and conditional, for the typification of events, actions, relationships, strategies, and so on." This understanding of induction is very close to the description of the process in the works of Charles Sanders Peirce, who distinguished two types of logical inference characteristic of the logic of discovery: qualitative (or analytical) induction and abduction. Induction, according to Peirce, describes a given empirical phenomenon in terms of an already existing category or rule. Abductive inference facilitates the discovery of hitherto unknown concepts or rules, prompted by exploratory surprise and the identification of anomalous cases. Such inference creatively connects new and interesting empirical facts with prior theoretical knowledge. This often requires a reconsideration of earlier viewpoints and theoretical preconceptions: assumptions and beliefs must be set outside the scope of the study or at least modified. Characterizing the three main types of logical inference, Peirce wrote: "Deduction proves that something must be; induction shows that something actually is operative; abduction merely suggests that something may be."

The inductive stage of theory building addresses the question: "Where do insights and hunches, and the generative questions that constitute them, come from?" A. Strauss answers: "They arise from prior experience of interaction with similar phenomena, whether personal experience, experience gained more 'professionally' through purely scientific research of the phenomenon, experience of studying the research already conducted, or theoretical sensitivity based on the researcher's knowledge of the technical literature." Deduction, in his view, "includes drawing out the implications of hypotheses or systems of hypotheses for their further verification." The importance of a detailed description not only of the phenomenon under study but also of the researcher's logical moves during the analysis follows from the fact that qualitative data analysis is more subjective in nature than quantitative techniques. The art of deduction rests on the researcher's ability to conduct comparative analysis of qualitative data, an ability acquired in practical professional activity.

The verification process includes checking the developed categories, hypotheses, and definitions. Emphasizing its importance, A. Strauss speaks of three kinds of verification that a researcher answerable for his results must perform: 1. Verifying the results by correlating them with the primary "raw" data; 2. Correlating the results with personal life and professional experience; 3. Correlating them with the existing results of similar studies.

The authors of "grounded theory" attach great importance to data from the researcher's personal life experience (experiential data, from the English experience: experience, life experience, lived events, knowledge). "Data from personal life experience are essential because they not only provide additional theoretical sensitivity but also supply an abundance of prior assumptions for making comparisons, searching for variables, and broad sampling on a theoretical basis." All this helps the researcher to formulate, on the basis of the events and processes under study, a conceptually "dense" and carefully ordered theory, and thereby removes the confrontation between the researcher and the object of study.

The idea of correlating results with other data, or with information of another type, was developed in more detail by N. Denzin, who proposed the term "triangulation" for this procedure. The term appeared in the context of a debate about the advantages and disadvantages of participant observation. N. Denzin showed that, in contrast to mass surveys, "the field work of a participant observer is not constrained by preliminary judgments about the nature of his problem, by rigid schemes of data collection, or by hypotheses." He also noted, however, that the method is not without difficulties. First, the researcher's focus on the present can obscure important events that happened before he entered the field. Second, the researcher often has no opportunity to study hard-to-reach carriers of the phenomenon. Third, by finding himself inside the situation being studied, the observer may change it by his very presence. And finally, in the successful case, when the observer is naturally included in the reality being studied, he, as D. Silverman later notes, "will gain so much information from interaction with the participants that, like a child learning a lesson, he will not be able to remember everything."

For these reasons, N. Denzin emphasized the importance of triangulation for qualitative researchers, identifying four types of it:
1. Data triangulation: correlating data with respect to time, place, and participants.
2. Investigator triangulation: using data about the same phenomenon from different observers.
3. Theory triangulation: using data obtained from different theoretical perspectives in the study of the same set of objects.
4. Methodological triangulation: using different methods to study one object, and studying the variation of data within a single method.

Conclusion

Let us draw some conclusions about the role of "grounded theory" in the further development of qualitative analysis. Although this concept has its own particular ways of working with qualitative data, it can be concluded that, on the whole, it continues to develop the approach to data analysis identified in "analytic induction": qualitative data can be subjected to theoretical structuring, that is, to the identification of analytical units (the phenomenon itself, its causes, properties, etc.) and the determination of the system of their interrelations.

But "grounded theory," which advocates the generation of conceptually "thick" theories, has a more thorough methodological basis than analytical induction. Its authors developed a conceptual apparatus of qualitative analysis that came to be used both in intuitive (non-computer) methods of analysis and in computer programs. The apparent simplicity and clarity of this analytical strategy created the illusion of high technicality, and its frequent mention even gave rise to talk of a "new orthodoxy." What is meant here is the use of the rules for constructing "grounded theory" in the creation of CAQDAS (Computer Assisted Qualitative Data Analysis Software) programs. Structuring text on the basis of identified categories is the leitmotif of every such program. The NUDIST program contains tools for building hierarchies of code categories; ATLAS/ti developed tools for constructing non-hierarchical connections, networks such as chains or loops. "Grounded theory" has had a terminological as well as a conceptual impact. For example, M. Lonkila notes that the terminology of computer programs for qualitative analysis has much in common with that of "grounded theory." Since the advent of the first such programs it has been common to speak of "coding," although the term "indexing," preferred by some authors (for example, the developers of NUDIST), seems more accurate from the point of view of the origins of computer data-management techniques.
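The point that "coding" is at bottom "indexing" can be made concrete with a toy sketch: codes attached to text segments amount to an index from category labels to the passages that instantiate them, and slash-separated labels loosely mimic a NUDIST-style code hierarchy. All code names and segments here are invented for the illustration:

```python
from collections import defaultdict

def build_index(codings):
    """codings: list of (code, segment) pairs produced by the analyst."""
    index = defaultdict(list)
    for code, segment in codings:
        # Register the segment under the code and under all its ancestors,
        # so hierarchical retrieval ("everything under 'addiction'") works.
        parts = code.split("/")
        for i in range(1, len(parts) + 1):
            index["/".join(parts[:i])].append(segment)
    return dict(index)

codings = [
    ("addiction/withdrawal", "the drug relieved the distress"),
    ("addiction/first-use", "a friend offered it at a party"),
    ("trust-violation", "he held a position of financial trust"),
]
index = build_index(codings)
print(sorted(index))
# ['addiction', 'addiction/first-use', 'addiction/withdrawal', 'trust-violation']
```

Retrieval by code is then just a dictionary lookup, which is why "indexing" describes the underlying data-management operation more precisely than "coding" does.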

However, a closer look at these programs shows that their developers drew on different conceptions of producing knowledge about social reality. For example, John Seidel created his ETHNOGRAPH package on the basis of discourse analysis and of phenomenological and ethnographic approaches; Udo Kuckartz built the MAX and WINMAX programs on Max Weber's concept of ideal types; and the AQUAD program drew on Popper's methodological approach.

Thus coding and rearrangement, proposed in "grounded theory" as an auxiliary technique for generating sociological theories, is an "open technology," a technical approach used in different ways by different authors. As a result, the concept of "grounded theory" is undergoing a complex transformation similar to the one that analytic induction underwent in its time.

* Over the past ten years, several research projects based on qualitative methodology have been carried out in Russian academic sociology. One can note the group projects "A Century of Social Mobility in Russia" (led by D. Bertaux and V. Semenova), "The Social Structure of the Russian Village" (led by T. Shanin), and "Women's Rights in Russia" (the corresponding part of the project led by M. Malysheva), as well as individual studies by V.F. Zhuravlev, O.G. Isupova, I.P. Kiselev, I.M. Kozina, O.M. Maslova, E.V. Meshcherkina, P.V. Romanov, E.R. Yarskaya-Smirnova, and others. In recent years, not only articles and sections of books have appeared, but also separate publications on the methodological problems of collecting and analyzing qualitative data: the books of V.V. Semenova, E.M. Kovalev, and I.E. Steinberg.

** The term "grounded theory" in Russian sociology remains a working term. The rendering used here was the first short and easily pronounced translation of the English phrase "grounded theory" to appear in print and in scholarly discussion. Subsequently, researchers began to look for a term reflecting not the linguistic but the scientific meaning of the concept more adequately. Several more apt translations of "grounded theory" have thus been proposed: for example, V.A. Yadov's renderings (roughly, "rooted theory" and "substantiated theory") and V.V. Semenova's "ascent to theory."

*** In this case, we consider the main stage of coding, which follows pilot coding; the latter, in terms of its research task and procedures, is close to coding in a qualitative study.

11. See: Devyatko I.F. Participant Observation // Devyatko I.F. Methods of Sociological Research. Ekaterinburg: Ural University Press, 1998. Pp. 15-43.
12. See: Romanov P.V. The Ethnographic Method in Sociology. Candidate of Sociological Sciences dissertation. P. 13.
13. Merton R.K. Social Theory and Social Structure. New York: Free Press of Glencoe, 1957.
14. Sociology: the progress of a Decade / Ed. by Lipset S.M., Smelser N.J. Englewood Cliffs, N.J.: Prentice-Hall, 1961.
15. Glaser B., Strauss A. The Discovery of Grounded Theory. Chicago: Aldine Publishing, 1967. P. 15.
16. Blumer H. Appraisal of Thomas and Znaniecki’s The Polish Peasant in Europe and America. New York: Social Science Research Council, 1939.
17. Strauss A. Qualitative Analysis for Social Scientists. New York: Cambridge University Press, 1987.
18. See: Hanson N.R. Patterns of Discovery: An Inquiry into the Conceptual Foundations of Science. Cambridge: CUP, 1965; Computer-Aided Qualitative Data Analysis: Theory, Methods and Practice / Ed. by Kelle U. London: Sage, 1995.
19. Peirce Ch.S. // Posterior Analytics /Ed. by Jenkinson. Oxford: Ed. Ross, vol. V, § 146.
20. Denzin N. The Research Act in Sociology. London: Butterworth, 1970.
21. Silverman D. Interpreting Qualitative Data. Methods for Analyzing Talk, Text and Interaction. London etc.: Sage, 1993.
22. Kelle U. Theory Building in Qualitative Research and Computer Programs for the Management of Textual Data // Sociological Research Online 1997. Vol. 2. No. 2.
23. Lonkila M. Grounded theory as an emerging paradigm for computer-assisted qualitative data analysis // Computer-Aided Qualitative Data Analysis: Theory, Methods and Practice / Ed. by Kelle U. London: Sage, 1995.