Network MMMR – Workshop #5


An Interdisciplinary Network on the Methodology and
Applications of Integrative Research Methods



Videos of selected conference presentations are available here.

From February 5th to 7th, 2020, the fifth workshop of the research network MMMR took place at Helmut Schmidt University Hamburg. International guest experts Kathleen Collins (University of Arkansas) and Martyn Hammersley (The Open University, UK) joined the network to discuss issues of Quality Criteria and Best Practice Standards for Mixed Methods and Multimethod Research.

The meeting opened with keynote talks by Kathleen Collins, Martyn Hammersley, and Judith Schoonenboom (University of Vienna). In her presentation on Enhancing Quality Criteria Design: A Multifaceted Approach, Kathleen Collins gave an overview of her work on the concept of “validity design”. Assuming that the formulation of generalized quality criteria is of limited use in the area of integrative research methods, validity design highlights the importance of selecting and adapting relevant criteria specifically for a given study design. Collins argued that the focus of the methodological discourse on MMMR had shifted from “mixing” (i.e. the linking of more or less independent methods) to “integration” (the merging or meshing of different approaches), and that this shift also implied a more strongly integrated approach to quality criteria. The term “legitimation” (Onwuegbuzie & Johnson) was proposed as an appropriate title for such an integrative perspective, highlighting the methodological potential but also the difficulties of this approach. Among the latter is the necessity of an intensified dialog between researchers of different disciplinary and methodological orientations, which carries a certain potential for misunderstanding or even conflict. Validity design in MMMR, Collins argued, therefore requires a high degree of “philosophical clarity” on the part of researchers, in terms of both self-reflection and mutual understanding. However, Collins emphasized that in addition to clear communication about abstract theoretical assumptions, successful validity design also requires transparency regarding the research methods applied and their respective goals and functions.

Martyn Hammersley’s presentation revolved around the question: Are there Assessment Criteria for Qualitative Research? He began by describing the diversity of objectives and applications in which research methods are used and the different shapes the question of quality criteria may consequently take. Considering that quality criteria have traditionally been a rather controversial and somewhat inconclusive issue in qualitative research, Hammersley introduced a conceptual distinction between standards or dimensions of evaluation (Which aspects of research are regarded as relevant for quality assessment?), the benchmarks or thresholds guiding decisions on research quality (At which point does research turn from sound to unsound with regard to a given aspect?), and the indicators used to operationalize dimensions of evaluation. Hammersley then went on to criticize the popular idea of the “transparency” of indicators for research quality, arguing that quality criteria are usually not applicable in a deterministic and objective manner, be it in qualitative or quantitative research practice. Rather, their application is often a highly interpretive and context-specific act – corresponding with the Wittgensteinian maxim that there are no “self-applying criteria”. However, this does not render quality criteria arbitrary or irrelevant: rather than clear-cut assessment rules, they provide a framework for reflecting on the validity of research. In the context of method integration, where a joint evaluation of qualitative and quantitative components becomes necessary, the distinction between general standards and method-specific indicators becomes especially important. Hammersley argued for universal standards of research quality – ultimately the truth and value-relevance of empirical findings – but for a context- and method-sensitive selection and application of benchmarks and indicators. The research questions and inference goals of a given study, ranging from detailed description to generalized causal explanation, may serve as a guideline for such adaptation.

In her keynote on Developing and Integrating Claims as a Quality Criterion for Mixed Methods Research, Judith Schoonenboom (University of Vienna) presented a model for evaluating research quality based on the concept of “meta-inferences”, i.e. the inferences synthesized from combined methods in MMMR. In contrast to most applications of this concept, Schoonenboom did not locate meta-inferences in the final interpretation phase of an integrative research project, but rather described them as an ongoing process of making and developing claims, starting with the formulation of a research question and constantly evolving over the course of a project. This development usually proceeds in an iterative and often circular manner, in which an initial claim derived from previous research is confronted with inconclusive or contradictory results, followed by adaptation of the claim through further investigation of the empirical material, such as subgroup analyses. She presented an example in which an intervention for international development assistance did not produce the intended results in a standardized evaluation. Subsequent analyses of qualitative data then led to the development of explanatory hypotheses for this finding, and an integration of both qualitative and quantitative findings resulted in adapted claims. With regard to MMMR generally, Schoonenboom argued, this points to the importance of the research process as a basis for quality assessment: it is not only relevant to ask whether initial questions have been answered and hypotheses have been rejected or corroborated, but also to assess the plausibility and consistency of the evolution of claims throughout the project. To make such an assessment, it is essential to combine different (qualitative and quantitative) perspectives on research quality in a comprehensive way.

The afternoon sessions of the first workshop day started with two discussion papers by Udo Kelle and Felix Knappertsbusch. In his paper on Quality Criteria in Mixed Methods Research, Udo Kelle highlighted the idea of method complementarity, i.e. the combination of approaches with “non-overlapping weaknesses” and “complementary strengths” (Brewer & Hunter), as a central orienting principle of MMMR. This assumption of specific strengths and weaknesses in different methods places “error theories” at the center of method integration. However, as Kelle argued, such error theories have so far received limited attention in the MMMR literature, which also leads to uncertainty about the adequate quality criteria for integrative research. Kelle then went on to suggest that the quality criteria traditionally developed in the context of standardized methods (external and internal validity, reliability, objectivity) do provide a suitable framework for all kinds of empirical research, including qualitative methods. However, instead of discussing the method-specific application of these standards in a dialogical manner, researchers still tend to separate into qualitative and quantitative camps with independently developed concepts and terminologies for quality assessment. From a methodological point of view this is an unfortunate situation, because crucial opportunities for (self-)critical reflection are replaced by mutual hostility: criticisms of one’s own methodology are often met with a tit-for-tat strategy of defense, pointing out faults of the opposing approaches. Typically, this leads to quantitative researchers underestimating problems of construct validity and confounding them with matters of reliability, while qualitative researchers tend to ignore issues of objectivity in interpretive analyses as well as the lack of external validity in small-n samples. To avoid such bias, Kelle proposed a general fallibilistic approach towards all knowledge claims, along with a general scientific ethos of radical skepticism and sincerity, especially with regard to the shortcomings of one’s own work.

In his paper Can there be a Methodology of Discovery in Social Research?, Felix Knappertsbusch concentrated on the possibility of “methods of discovery” (Abbott). The development of new and innovative research perspectives beyond established methods and approaches is often named as one of the core aims of MMMR. However, there are considerable methodological and epistemological problems with devising methods for the systematic production of novel knowledge and with the evaluation of such “methods of discovery”. One important reason for these problems, Knappertsbusch argued, is the paradoxical structure of the concept of scientific “discovery”: it implies insight beyond the scope of previous knowledge, but at the same time it must not be pure imaginative play, but warranted knowledge conforming to what we already know. With regard to the issue of “methods of discovery”, this means that systematic approaches to discovery fail exactly when they succeed: if a method of discovery were truly sufficient for producing novel knowledge, the respective knowledge would be implied in the method, thus losing its value as discovery. Knappertsbusch discussed several approaches modelling this “paradox of discovery”, including the epistemology of “thought experiments” and Frankfurt School critical theory. He then pointed out their common reliance on “heuristic” methods that aspire to combine methodical procedures with openness and creativity. Heuristics for discovery, such as thinking in “conceptual metaphors” (Lakoff & Johnson) or “constellations” and “models” (Adorno), have so far received limited attention in social research methodology, including MMMR, which can in part be explained by critical rationalism’s influential exclusion of the “context of discovery” from the philosophy of science. A starting point for investigating methods of discovery more closely could be the metaphorical, evocative, and constellation-building methodology they share, which is closely related to core ideas of MMMR.

The first workshop day concluded with discussions of current and planned network projects, including the foundation of a working group “Mixed Methods” within the German Sociological Association as well as the substantive and organizational preparation of an edited volume summarizing and discussing the results of three years of network activities.

The second workshop day focused on monographs planned as part of a series of mixed methods textbooks edited by Andrea Hense, Susanne Vogl, and Felix Knappertsbusch with Springer VS publishing. Of the four introductory volumes currently in preparation, two were discussed in depth on February 7th: Judith Schoonenboom presented a concept for her introductory volume on “Mixed methods and multi-method research design”, which provides an alternative to popular design typologies and focuses on a process-oriented perspective highlighting the emergent and iterative character of MMMR designs. Leila Akremi and Andrea Hense outlined their concept for an introduction to “Mixed Methods Sampling”, which is also strongly focused on research practice and centers on three core topics: the reconstruction and demarcation of cases and populations, essential techniques of case selection and data collection, and options for case comparison and generalization.


Last modified: 5 March 2020