1st Annual Advances in Qualitative Methods Conference

International Institute for Qualitative Methodology, Edmonton, Canada (February 18-20, 1999)

Reviewer: Mike Dobson, Lessons Learned Project, University of Calgary, Alberta, Canada


This first conference of the International Institute for Qualitative Methodology turned out to be one of the most interesting academic meetings I have ever attended. For an educational technologist like me, the interest in such a meeting was to develop an understanding of methods that might contribute to solving some of the problems of getting high-quality teaching and learning software into the educational system. Educational technologists are a pragmatic lot and respond quickly to invitations to demonstrations of exciting software tools that may help them with their work. This, combined with a recent accidental collision with Elliot Eisner's book 'The Enlightened Eye' and a long-standing interest in the Scandinavian qualitative phenomenography method, persuaded me to investigate.

As it turned out, the experience left me feeling more of a 'qualitative research innocent' than I ever expected. I now have a new list of books to read in the exciting areas of grounded theory, narrative analysis, discourse analysis, and object hermeneutics. These are (some of) the theoretical foundations for the methods represented at the meeting, and they are said to have led to the qualitative data analysis (QDA) software demonstrated there.

Actually, it was Linda Apps, a Calgary artist just getting into computer-based learning, who recently talked to me about 'The Enlightened Eye,' and we decided to run a small study based on Eisner's work. It was quite a thrill to see him speak in person and to hear his invitation to 'slow down the process of research, to heighten the perception of the experience and take time to explore results.' Eisner used the metaphor of three sporting events to illustrate complex assessment problems in education. A mile-long race is simply assessed by who crosses the line first. Judging competitors in a high dive requires measuring many interacting factors, including dive difficulty, elegance, water entry and symmetry. By contrast, ice dancing is a much more complicated sport involving fine-grained distinctive qualities. These qualities even change as new competitors take part in the competition, and the process is much more like fine art than most other sports. Eisner cited John Dewey to reinforce this point and to warn against the tendency in the school system to compare factors that are easy to measure: 'Nowhere is comparison more odious than in the arts.' Eisner proposed arts-based qualitative research as a cure for the current assessment disease, again citing Dewey: 'The arts remove the veil that stops us from seeing.'

Eisner is a discerning arts-based researcher. 'The qualitative researcher may not be like any old artist,' he says. She does have an obligation to communicate the results of a study in a way that is cogent and understandable to her audience. After singing a couple of lines from Cole Porter's song 'Anything Goes,' Eisner reiterated that 'novelty alone does not constitute useful results.' 'All representations are biased, not just qualitative,' he continued. 'Of course, even science errs both by omission and by commission, but fiction has to be true to be great, and in this sense only can fiction constitute good research.' Without explicitly saying as much, Eisner's message was that ill-disciplined or perhaps ill-motivated qualitative methods and poor reporting are not acceptable in genuine research. There are systematic approaches that are simply features of intellectual integrity, for example, 'The anticipation of schema is a form of generalization.' Presumably, repeated observation of that schema would also be a legitimate form of qualitative verification.

Michael Agar presented a stunning survey of the complexity in qualitative methodology. He illustrated some fairly abstract common themes that run through much of the work done in the name of qualitative method. He argued convincingly that some basic problem-solving techniques from artificial intelligence (constraint propagation and search strategies) could help to provide a systematic approach to qualitative research, and that these techniques could be crystallized into 'a theory of noticing.' Qualitative research involves the translation of meaning from one context to another by way of a qualitative researcher and her tools and processes. Agar pointed to the 'abductive logic' of Charles Peirce as a model for the investigative process in qualitative research. Such a model suggests making observations in your data, identifying phenomena that are very often associated with those observations, and then hypothesising that the associated phenomena may also be present. Agar added that these abductively inferred phenomena must also be verifiable.
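To make the abductive step more concrete, here is a minimal sketch in Python of how it might be operationalised in software. The association table, code names and ranking are my own invention for illustration, not anything Agar or the tool vendors presented.

```python
# A minimal sketch (not from the conference) of the abductive step Agar
# describes: observe codes in a data segment, use known associations to
# hypothesise related phenomena, then flag them for verification.

# Hypothetical association table: code -> phenomena often seen alongside it.
ASSOCIATIONS = {
    "turn-taking-breakdown": ["power-assertion", "topic-shift"],
    "hedging-language": ["face-saving", "uncertainty"],
}

def abduce(observed_codes):
    """Return candidate phenomena to look for, ranked by how many observed
    codes suggest them. Each candidate still needs to be verified against
    the data, as Agar insists."""
    candidates = {}
    for code in observed_codes:
        for phenomenon in ASSOCIATIONS.get(code, []):
            candidates[phenomenon] = candidates.get(phenomenon, 0) + 1
    return sorted(candidates, key=candidates.get, reverse=True)

print(abduce(["turn-taking-breakdown", "hedging-language"]))
# ['power-assertion', 'topic-shift', 'face-saving', 'uncertainty']
```

The point of the sketch is only that abduction is a generative, not a conclusive, move: the output is a list of things worth noticing, not findings.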

Finally, I found some presentations covering the process of working with data and particularly the process of 'coding' with software. Several software packages were presented, including ATLAS-ti, NUD*IST, The Ethnograph and HyperRESEARCH. Qualitative research data, including interviews, conversation traces and videotapes, often contain hidden and contextualized concepts, constructs and categories. Each of these abstract knowledge-objects may have properties and dimensions that in turn may have ranges. Theory building and substantiation are achieved through observation and noticing, and to some extent through reflecting on and manipulating the data. Approaches such as breadth-first or depth-first focusing can systematize the process, but there is no guarantee that interesting material is evenly distributed through the data trace. Data can apparently be analysed line by line, sentence by sentence, or even one document at a time. Much of the process involved in qualitative analysis is determined by the researchers themselves.
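As a rough illustration of the breadth-first and depth-first focusing strategies (my reading of them, not a feature demonstrated at the conference), the sketch below contrasts a pass that codes the first segment of every document before returning for a second pass with one that exhausts each document in turn. The document names and the placeholder coder are hypothetical.

```python
# Two focusing strategies over a corpus of segmented documents.

documents = {
    "interview_01.txt": ["para 1 ...", "para 2 ...", "para 3 ..."],
    "interview_02.txt": ["para 1 ...", "para 2 ..."],
}

def breadth_first_pass(docs, coder):
    """Visit one segment of every document before returning to any of them,
    so early impressions are drawn from the whole corpus."""
    codings = []
    depth = 0
    while True:
        advanced = False
        for name, segments in docs.items():
            if depth < len(segments):
                codings.append((name, depth, coder(segments[depth])))
                advanced = True
        if not advanced:
            return codings
        depth += 1

def depth_first_pass(docs, coder):
    """Code every segment of one document before opening the next."""
    return [(name, i, coder(seg))
            for name, segments in docs.items()
            for i, seg in enumerate(segments)]

toy_coder = lambda segment: "uncategorised"  # stands in for the analyst
print(breadth_first_pass(documents, toy_coder))
```

Neither order guarantees that the interesting material turns up early; that, as noted above, depends on how the material happens to be distributed through the data trace.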

Re-presenting the data allows knowledge-system development, theory building and hypothesis testing, eventually leading to closure and dissemination. To support this kind of knowledge building, it is natural to use our understanding of knowledge representations to guide the development of system models. The QDA tools make extensive use of semantic networks and graphical binary-tree diagrams to illustrate the relations between the concepts found in the data. Hearing about the data analysis process of qualitative research, I was struck by how the researcher is empowered by the tools and the methods they allow. I was similarly amazed by the responsibility for intellectual integrity that the methods assume of their users. Selective and biased attention to particular aspects of the data could occur if the researcher were driven by some particular ideological concern. It may be impossible to come to the data without bias, but recognizing this implies that the researcher has the responsibility to keep their attitudes, opinions and beliefs as far from the analysis as possible.

After seeing the tools and listening to some of the reflections on the previous ten years of QDA tools, I was strongly reminded of the opportunities and concerns being discussed in the context of the so-called knowledge economy. I am convinced that these QDA tools and others like them, and the experience of their development and commercialization, will play an important role in defining both how successful this new economy turns out to be and what intellectual integrity we might expect to see there. The rest of this report skims over my reasons for saying so, touching on performance support systems and learning, ontological commitment and design, and cognitive support.

As an educational technologist with only a pragmatic interest in the theoretical nuances of qualitative approaches, I am probably not unlike many researchers in health studies, curriculum development and other application areas. This may raise some interesting issues for the continuing development of QDA software. Qualitative data analysis software developers are now, as Linda Gilbert mentioned late on Saturday afternoon, 'developing performance support systems - whether they like it or not.' This means graduate students and commercial researchers are very likely to mistake the functionality of qualitative research tools for the practice of qualitative research. In response to this we hear that the researcher should be the tool -- but while there are researchers like me, interested in increasing productivity with qualitative research, the tools will continue to be used and the prospects for misuse will systematically increase. Similarly, John Seidel, pioneer of The Ethnograph, provided an anecdote about one of his students that illustrated that using the software may not be the best initial training method. As far as I could see from the demonstrations, these tools are not designed to help users learn 'grounded theory' or any other qualitative method. I predict that without some other intervention new users will not understand the distinction.

The ontological commitment involved in the design of these tools reflects assumptions about the knowledge work they are intended to support. This is not true for all kinds of applications. For example, the word processor may not hinder the poet any more than the author of a software manual. Similarly, high-resolution painting tools such as Adobe Photoshop have been taken up by fine artists as well as by the designers of labels for new brands of beer. By contrast, knowledge productivity tools (and QDA tools are such) often make selective choices from an ambiguous field of study, and will select one set of labels and descriptors at the expense of others. In the field of curriculum design, for example, a tool that supported 'component display theory' would commit to 'principles' and 'facts,' while a tool that supported 'cognitive apprenticeship' might commit to 'errors' and 'rules.' The curricula developed using each tool would be very different. Increases in knowledge productivity can only result from supporting the cognitive processes involved in doing the knowledge work. If QDA tools are to support better and more efficient qualitative data analysis, they may need to look at the specifics of the theoretical base they claim to support and derive more specific functionality from it.
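A toy sketch may make the point about ontological commitment clearer: the schema built into a coding tool fixes which labels an analyst can even record, so two tools 'see' different curricula in the same transcript. The two label sets below are only loosely suggested by the theories named above, and the class itself is entirely hypothetical.

```python
# Hypothetical label vocabularies; the extra labels beyond those named in
# the text ("concept", "procedure", "strategy", "scaffold") are invented.
COMPONENT_DISPLAY_SCHEMA = {"principle", "fact", "concept", "procedure"}
COGNITIVE_APPRENTICESHIP_SCHEMA = {"error", "rule", "strategy", "scaffold"}

class CodingTool:
    def __init__(self, allowed_labels):
        self.allowed_labels = allowed_labels
        self.codings = []

    def code(self, segment, label):
        # The commitment: anything outside the schema simply cannot be recorded.
        if label not in self.allowed_labels:
            raise ValueError(f"'{label}' is not expressible in this tool")
        self.codings.append((segment, label))

cdt_tool = CodingTool(COMPONENT_DISPLAY_SCHEMA)
cdt_tool.code("Students confuse velocity with speed", "fact")

ca_tool = CodingTool(COGNITIVE_APPRENTICESHIP_SCHEMA)
ca_tool.code("Students confuse velocity with speed", "error")
```

The same observation is forced into different categories by each tool, which is exactly the selective choice from an ambiguous field described above.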

Right now, these tools may not closely support the theoretical approaches they are said to be derived from. The current designs appear to support two main activities in qualitative research, neither of which is the sole domain of QDA software: (1) getting the data off the floor, out of the filing cabinet and into an electronic format, and (2) coding. If it is true that these tools provide little specific support for the methods themselves, then perhaps it would be interesting to blend their design with designs from other traditions that do not have a sociological or ethnomethodological lineage. There are several such useful approaches to data interpretation that are not based on controlled experimental design and statistical analysis. Software developers are not in a position to consider these issues on behalf of qualitative researchers. The nature of the software industry makes it hard enough (though easier than understanding the client) just to keep up with changing possibilities in the computing environment (e.g., iconic interfaces, separate and movable windows, direct-manipulation graphics, and support for video and audio data).

Before reading all those books on object hermeneutics, it's too difficult for me to tell what better tools designed to support the method might look like. However, I did get some ideas from Juliet Corbin, Thomas Muhr and Michael Agar about the difficulties people have in doing this kind of research, and as an educational technologist that's where I would usually start.

To supplement the current data management and representation tools, the qualitative researcher may benefit from what might loosely be called meta-cognitive tools that help her think about the data. A few meta-cognitive processes were mentioned by Corbin. The flip-flop approach encourages analysts to look for counter-evidence to a hypothesis that is in development. This approach clearly encourages a self-moderating internal review of developing ideas that can only improve the credibility of the results. The transition from far-in to far-out noticing in the data encourages the analyst to make analogical, metaphorical and similarity-based comparisons with many or few of the key construct properties. This probably helps users see links in the data that were not immediately obvious to them and hence opens the mind of the researcher to links that might be important to the subject. Some of these ideas are already included as open-ended options for link types in ATLAS-ti. However, none provides specific support for argumentation or for links that refer to parts of discourse analysis and grounded theory. The link types from argumentation theory, is-support-for and is-against (and others), may be useful in discerning persuasive aspects of text and could even be used to demonstrate power relationships in discourse. Perhaps the links is-strategy-for, is-context-for, is-property-of, is-dimension-of, and has-range-of could be the beginning of tools to specifically support grounded theory research. The combination of such generic propositional link types with the procedural components of Agar's 'theory of noticing' may take the design of software for qualitative analysis to another level of impact in the emerging knowledge economy.
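As a sketch of what such typed links might look like if built into a QDA tool, the following combines the grounded theory and argumentation link types named above into a small semantic network. The network structure, method names and example codes are hypothetical, not features of ATLAS-ti or any other package.

```python
# The link vocabulary is taken from the text; everything else is invented.
GROUNDED_THEORY_LINKS = {
    "is-strategy-for", "is-context-for", "is-property-of",
    "is-dimension-of", "has-range-of",
}
ARGUMENTATION_LINKS = {"is-support-for", "is-against"}
LINK_TYPES = GROUNDED_THEORY_LINKS | ARGUMENTATION_LINKS

class ConceptNetwork:
    """A semantic network over codes, restricted to the declared link types."""
    def __init__(self):
        self.links = []  # (source_code, link_type, target_code)

    def link(self, source, link_type, target):
        if link_type not in LINK_TYPES:
            raise ValueError(f"unknown link type: {link_type}")
        self.links.append((source, link_type, target))

    def supporting(self, claim):
        """Codes linked as evidence for a claim -- the argumentation view."""
        return [s for s, t, o in self.links
                if t == "is-support-for" and o == claim]

net = ConceptNetwork()
net.link("nurse-controls-topic", "is-support-for", "asymmetric-power")
net.link("patient-interrupts", "is-against", "asymmetric-power")
net.link("deference", "is-property-of", "patient-talk")
print(net.supporting("asymmetric-power"))  # ['nurse-controls-topic']
```

Restricting the link vocabulary is itself an ontological commitment of the kind discussed earlier, which is why such a design would need to be derived explicitly from the method it claims to support.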

The Second Annual Advancing Qualitative Methods Conference will be held January 25-28, 2000, in Auckland, New Zealand.