Social workers operate in contexts of uncertainty, ambiguity and indeterminacy. In such contexts, will the strict application of empirical or theoretical knowledge be effective? Is there a need for a mixed paradigm to govern social work practice?
Considering the context within which social workers operate, a context of uncertainty, ambiguity and indeterminacy, don't you think that when we strictly employ this "rational technical approach" the 'social' in social work will be lost? Why can't we rather consider practice-based evidence instead of evidence-based practice?
I think the teaching of evidence-based research is appropriate for social workers and for everyone in the social sciences, but it is not enough; the most important thing is personal and social formation.
It is worth looking at the idea of therapeutic assessment - using tools in a way that helps the client - educative, supportive and future-directed - rather than simply listening reflectively without this going anywhere.
Have just returned from an All-Ireland Conference on Advocating for Change in Social Work. A discussion began about evidence, evidence-based practice and what and how we measure, if we should at all.
One of the key conclusions was the importance of 'evidence-led practice': the importance of having 'evidence' but also of keeping a healthy critical view of it. The bottom line is that there is no substitute for the qualified, professional, human decision maker in social work practice. But this individual must always be at least informed by the evidence, though not dictated to by it.
Evidence-based practice as colonised by the medical profession is inappropriate for social work, and Charles and Joseph are both right in drawing attention to its limitations within a social model. Personally I prefer the term evidence-informed practice, which seeks to bring together research evidence, practice wisdom and service user views to help identify effective practice.
EBP is essential for social work. I teach it in my classes because it helps social workers pick the best possible intervention. I'll never forget my first semester teaching social work. A local foundation was pushing a particular youth crime prevention intervention. I had my students assess the evidence, and one student blurted out, "Hey, this intervention claims to be impactful, but the only measure given was self-reported intent to join a gang, and scores changed by 11% after the intervention, with no control group! This doesn't sound very impactful to me." Since there was funding in the pipeline at the student's field site, they put together a proposal on how they would adapt the model to the indigenous migrant population with which they were working.
Resources on EBP may be found online. The key portal for medicine is the Cochrane Collaboration. http://www.cochrane.org/
For social interventions, see the Campbell Collaboration: http://www.campbellcollaboration.org/
For those with access, I recommend that you read Eileen Gambrill's lead article in the most recent issue of the Journal of Social Work Education (US). http://www.tandfonline.com/doi/full/10.1080/10437797.2014.916937#preview
Eileen was one of the key figures who helped "colonize" social work. One item I remember from her graduate seminar was to avoid attacking a "Straw Man" version of evidence-based practice. Specifically, I am not aware of a version of EBP that denies clinical experience or case studies. I would encourage you to debate this issue with your colleagues in the context of specific organizations and scholars.
So Hugh, how do you distinguish between evidence-based and evidence-informed practice? In my view, what you describe as evidence-informed practice is exactly the definition of evidence-based practice: integrating research with clinical experience and patients'/clients' views, values and preferences.
Another interesting definition/distinction is that between research-based knowledge and practice-based knowledge, and how these two concepts can (and should!) be integrated. See for example Nilsen et al: Integrating research-based and practice-based knowledge through workplace reflection. J Workplace Learn 2012, 24:403–415.
Susanne, as you'll have seen, I highlighted the ideological 'medically' driven definition of evidence-based practice which appears to have taken over the Anglo-Saxon world with its promise of an intervention to cure all social ills, neglecting the 'context of uncertainty, ambiguity and indeterminacy' and proposing a highly positivistic hierarchy of knowledge. Whilst I can accept this within a medical setting (and even here there is some dissension), the further you move into a social setting the less helpful it becomes, and practice wisdom and service users' views need to be reconciled alongside research evidence. In the Nordic countries this may be less of an issue than it is in the UK, Australia and the USA.
I would have thought that the organisation and practice of social services are already running largely on the basis of practice-based evidence, i.e. what the practitioners believe to work. But as we all know, we are bad at subjectively assessing how good such practices are in general. The idea is then that we evaluate the effectiveness of these practices, a process that requires comparisons with both no practice (no treatment) and alternative treatments. It is then only in those cases where we find that established practice doesn't work as well as we thought that we need to change the practice. But sometimes we find that the old ways are the best ways, and we keep them, now in the guise of evidence-based practices instead.
Obviously, our evidence practices are fallible, and so bad practice sometimes wins the day. But in the long run it is unlikely to win the war.
And we may disagree whether the intervention model of evidence-based research in medicine works for social work. But I am guessing that what is missing is a clear idea about how to do evidence summaries/syntheses of qualitative research, which is perhaps the better way to evaluate practices in social work. Until we work that out, we will probably have to make do with the quantitative 'randomised controlled studies' syntheses. But, frankly, what is an argument going to look like that says we shouldn't base our practices (in whatever field) on the best evidence available?
Thanks Hugh, I don't think we disagree, and I didn't mean to question your earlier statement. I'm just curious about the distinction between "evidence-based" and "evidence-informed". I've struggled with a similar distinction between "theory-based" and "theory-informed". The latter, to me, COULD be interpreted as "yes, we know what the evidence says, but we don't base our work on it". Otherwise, why wouldn't you call it evidence-based? Joseph brings in another interesting term, "evidence-led", which, in my opinion, might land in between -based and -informed. What do you think?
But I don't think EBP in the medical world claims that an intervention will cure all, or that the concept fails to consider context, uncertainty, etc. Uncertainty is one of the major things addressed in the GRADE system of evaluating the strength of evidence, which is becoming dominant in the medical/healthcare world, and the system emphasises the importance of clinical expertise and patient involvement in all situations where there is uncertainty in the (research-based) evidence. The hierarchy of evidence is positivistic, yes, but the wind is blowing toward a much larger emphasis on experience-based evidence and patient/client/user values.
Rognvaldur, it's a little off topic but there are many methods to synthesize qualitative research. One that I'm familiar with and can recommend is thematic analysis (Thomas & Harden 2008). And yes, your final question summarises this discussion well! :)
Susanne, by synthesizing qualitative research I didn't mean the analysis of qualitative data in individual studies (which, as far as I know, is what thematic analysis is about). I was talking about ways of doing overviews of the sum total of qualitative research on a particular subject, in the way, say, Cochrane systematic reviews do meta-analyses of all the randomised controlled studies on various things. The problem is that in thematic analysis, each study is supposed to generate codes and themes from its particular data. This means that standardisation between studies becomes a problem. So, even if we assume that all the individual qualitative studies are OK, it becomes difficult to summarise the combined evidence from many studies, and that is an essential ingredient in the analysis of the evidence base. As far as I can tell, this is the main difficulty in justifying certain practices on the basis of qualitative research. I do know the Cochrane Collaboration is trying to develop ways to bring qualitative data into the evidence synthesis process, but is finding it difficult.
I did mean thematic synthesis, not thematic analysis, sorry. Thematic synthesis is one of the methods that can be used to synthesise qualitative data. As you say, it is certainly not as easy to standardise and to draw conclusions from, but you can generate new themes from the findings of the individual studies. With thematic synthesis you develop descriptive and analytical themes based on the data you analyse, and it's a very systematic and structured process. And yep, there are many ongoing efforts in the EBM/guideline world at this time to increase the importance of qualitative data. It will be interesting to see how these efforts evolve :)
Although evidence-based research plays a vital role in social work practice, it is important that it is combined with other professional skills and experience. Social work practice occurs under circumstances where information may be incomplete and conflicting, and for various reasons this may lead to biases in the decision-making process.
To assist with decision-making, social workers use instruments. The two most common instruments used in social work practice are consensus instruments (which are flexible and rely upon the practitioner's experience in the field) and actuarial instruments (which adopt a numerical approach and focus upon static/historical factors). Items in actuarial instruments are identified from larger studies or meta-analyses that have an empirical relationship with a particular outcome. Items in consensus instruments are derived from professional opinion.
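To make the actuarial/consensus distinction concrete, here is a minimal, purely illustrative sketch in Python. The item names, weights and cut-offs are invented for the example and do not come from any validated instrument; the point is only that an actuarial tool sums empirically weighted items into a risk band, while a consensus-style rating has no fixed arithmetic and rests on the practitioner's judgement.

```python
# Toy illustration only: items, weights and cut-offs are invented,
# not taken from any validated instrument.

# Actuarial-style instrument: fixed items with empirically derived weights,
# summed and mapped onto a risk band.
ACTUARIAL_ITEMS = {
    "prior_incidents": 2,                  # hypothetical weights
    "age_at_first_contact_under_16": 1,
    "unstable_housing": 1,
}

def actuarial_score(case):
    """Sum the weights of the items present in the case and band the total."""
    total = sum(w for item, w in ACTUARIAL_ITEMS.items() if case.get(item))
    if total >= 3:
        return "high"
    if total >= 2:
        return "moderate"
    return "low"

# Consensus-style instrument: the same information is considered, but the
# rating rests on the practitioner's experience rather than fixed arithmetic.
def consensus_rating(case, practitioner_rating):
    return practitioner_rating

case = {"prior_incidents": True, "unstable_housing": True}
print(actuarial_score(case))               # "high" (2 + 1 = 3)
print(consensus_rating(case, "moderate"))  # whatever the practitioner concludes
```

In an SPJ-style approach of the kind discussed below, the actuarial band would be treated as a starting point that the practitioner can revise in the light of the individual case, rather than as a final answer.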
Many parallels can be identified between cognitive theory and the development of instruments. Historically, it was believed that there are two different forms of cognition: intuitive and analytical. Today, we have a much more sophisticated level of understanding which is based upon the complementary nature of these forms of cognition and to a certain extent, their interdependence.
Consequently, we must develop and use instruments that reflect this understanding. Specifically, the area of violence prediction in the forensic field has been adopting a combined approach for over a decade, with excellent results. Structured Professional Judgement (SPJ) is now the gold standard. SPJ combines evidence-based research and professional judgement, a process that allows practitioners to reflect upon the meaningfulness of a particular risk level (as determined by the evidence-based research), given the context of the individual case being assessed.
I have recently published a paper in the British Journal of Social Work, which further explains the synergies between cognition and the use of instruments used in social work practice - and concludes with the inevitable and vital role of evidence-based research in informing social work practice. (Decision making in social work with families and children: developing decision-aids compatible with cognition - doi:10.1093/bjsw/bcu087).
Just a general comment - professional judgement plays a core role in SPJ. Practitioners incorporate empirical research into their decisions; however, the professional judgement component provides a vehicle for ethical or moral dimensions, or any other factor that may be considered important but has yet to be incorporated into the decision. Whilst the empirical component of the instrument in effect 'standardises' the assessment, the practitioner plays the vital role of contextualising it. The SPJ approach relies upon a combination of the practitioner's experience, expertise and high-level training to carry out assessments - a perspective that ultimately promotes the professionalization of social work practice.
Evidence-based knowledge is critical in the social work field, simply because of the procedures required to address a human mental condition at the time of assessment. There is a need for mixed practice due to the cultural nature of practice. Empirical, theological, spiritual, as well as instinctual knowledge are interdependent within the conceptual framework of social work.
In recent years social work has been dichotomized into evidence-based and relationship-focused practice. Despite paradigm shifts, the use of self in relationship building continues to be central to the profession of social work. We need to guard against dichotomizing. We need to be more encompassing of the broad range of theoretical and philosophical underpinnings in social work practice, which has grown exponentially over the years, especially in relation to our deeper knowledge of and insight into the complexities not only of human beings but of social situations as well. A one-size-fits-all mentality is hardly likely to find favour in social work practice.