I guess what I am asking is not only whether this is possible, but also how such a study would be done. What are its advantages and disadvantages? How vulnerable is it to criticism?
The problem is within the initial question and its assumptions. Indeed, it leans towards a quantification of spoken responses (and by extension, observations) that satisfies a positivist (quantitative) rather than a constructivist and/or subjectivist (qualitative) approach. This is what makes a grounded theory analysis attractive to many numerically inclined researchers rather than hermeneutics, content/discourse analysis, observation, and so forth. You also have to remember that the issues of validity and representative sampling are quantitative concerns, whereas key informants, truthfulness, and saturation are the concerns of the qualitative researcher... or as Yoda may have said: step away from the dark side, young Padawan learner!
I agree with the previous scholars who noted that transcription is the first level of 'coding' (even though that term carries additional baggage) and is the first modification or transformation of what your participants stated. Yet we have not been given any information about whether you are proposing a study, have already conducted it and gathered data, or which methods are proposed/employed in the study. I'm assuming that you used the most common approach to qualitative data gathering, conducted semi-structured interviews, recorded the responses (using written notes and/or electronic devices), and are now seeking to analyze the data you collected. I will continue with this assumption.
An analytic approach that may be helpful to you can be found in Attride-Stirling's (2002) thematic network analysis technique, a bottom-up approach to data analysis. She provides a model of analysis whereby a network of themes is built: (1) significant words or phrases within the transcripts are broken down into 'codes'; (2) related codes are then clustered into 'basic themes' that give meaning to the codes; (3) the process is repeated, with the basic themes being clustered into related 'organizing themes' that allow the researcher to analyze both the relation of the basic themes to their respective organizing themes and the interrelations between the organizing themes; and (4) one or more 'global themes' are created that bring together the organizing themes to answer your research questions or hypotheses. In addition to tables and discussions, this method also provides the framework for a visual representation of the analysis and of the creation of the themes at all levels. A rough sketch of the resulting structure is given below.
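To make the hierarchy concrete, here is a minimal sketch in Python of how codes, basic themes, organizing themes, and a global theme might be represented and related. All names and example codes below are invented for illustration; none are taken from Attride-Stirling's paper.

# Hypothetical sketch of a thematic network: codes -> basic themes ->
# organizing themes -> a global theme (all names are invented examples).
from dataclasses import dataclass, field
from typing import List

@dataclass
class BasicTheme:
    name: str
    codes: List[str] = field(default_factory=list)  # significant words/phrases from the transcripts

@dataclass
class OrganizingTheme:
    name: str
    basic_themes: List[BasicTheme] = field(default_factory=list)

@dataclass
class GlobalTheme:
    name: str
    organizing_themes: List[OrganizingTheme] = field(default_factory=list)

# Steps 1-2: cluster related codes into basic themes
coping = BasicTheme("coping strategies", codes=["'I just keep busy'", "'talking to friends helps'"])
support = BasicTheme("sources of support", codes=["'my mom was there'", "'the nurses listened'"])

# Step 3: cluster basic themes into an organizing theme
resilience = OrganizingTheme("resilience in daily life", basic_themes=[coping, support])

# Step 4: bring organizing themes together under a global theme
network = GlobalTheme("managing chronic illness", organizing_themes=[resilience])

# A simple textual rendering of the network for inspection
for ot in network.organizing_themes:
    print(network.name, "->", ot.name)
    for bt in ot.basic_themes:
        print("  ", bt.name, "->", ", ".join(bt.codes))

The only point of the sketch is the nesting: codes sit inside basic themes, basic themes inside organizing themes, and organizing themes inside a global theme, which is what the visual network representation then displays.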
Well, it depends on what we think 'coding' is. I'll drop a few thoughts:
Transcription is a first level of coding... listening and thinking "she is talking about her mom" is a meaning classification. Observing a landscape and thinking "there are vineyards, a church and some houses" is an activity of classification.
From an epistemic point of view, we cannot know without classifying.
Coding is a way to share our classification process with others. Inter-subjectivity is one of the bases of scientific methods.
There are different ways to code and share data, each with its pros and cons.
I am, of course, addressing the analysis of the data. The question is whether it is possible (in the sense of acceptable and valid) to analyze data without coding it extensively, only addressing what 'pops up' intuitively as one reads the accounts, trusting one's capability to notice what is important without having to code it. A more "human-as-instrument" (Maykut & Morehouse, 1984) kind of process.
Yes, it is possible to conduct a qualitative study without coding, depending on the method of analysis you are using. For instance, narrative methodology/analysis does not require coding into themes, because such coding undermines the narrative sequence of events in participants' interview transcripts. @Fabio, transcribing does not necessarily mean coding; it only gives the researcher a rough sketch or idea of the emerging or recurring tropes and themes dominant in the interview data.
Hey Cobi, I go with Ayodeji: coding is, IMHO, connected mainly to grounded theory. So it depends on what kind of analysis method you are using. There are plenty of others (hermeneutics, content analysis, narrative analysis, documentary methods - sorry in case this is not the correct English translation). And there the ("grounded theory") coding is not necessary.
But if you relate to coding in the sense of marking something in the text/material, then I would say no, you cannot do any analysis without setting marks in the material. This is mainly for quality reasons and to document/prove your ideas (it wouldn't be acceptable and valid, to use your words).
@Ogunrotifa I agree; by first-level coding we usually refer to the act of segmenting the transcript and 'tagging' its syntagmas.
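As a rough illustration only (a sketch in Python with made-up segments and tags, not a reference to any particular software or dataset), first-level coding can be pictured as attaching one or more tags to each segment of the transcript:

# Sketch: first-level coding as tagging transcript segments.
# All segments and tags below are invented for illustration only.
transcript_segments = [
    "She is talking about her mom and how she helped during recovery.",
    "We had vineyards, a church and some houses around where I grew up.",
]

# Each segment gets one or more 'codes' (tags) assigned by the researcher.
coded_segments = [
    {"segment": transcript_segments[0], "codes": ["family support", "recovery"]},
    {"segment": transcript_segments[1], "codes": ["place", "childhood memory"]},
]

for item in coded_segments:
    print(item["codes"], "->", item["segment"])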
I was using the word 'coding' in a broad sense :-)
My first answer was more of a provocation, to put emphasis on the fact that any 'transformation' we apply to a human interaction (and the first transformation is the recording) introduces some modification of the meaning. This is quite obvious, but for a correct interpretation it is important to be aware of all the processing done on the original interaction.
What coding is depends on the methods you use for the analysis; I think Ronny Gey explained in detail what I mean. It is not possible to do analysis without classifying the content (which is, anyway, what every human does in his/her normal mental processes). The point is that the process has to be explicit.