by Peter Freebody
The University of Sydney
April 25, 2013
Issue
We ask: ‘How can different types of research be useful in guiding us in setting policy and shaping classroom practice?’ But there is a prior question: ‘Why, after so much research on literacy education, do we feel that we have not “set policy” or “shaped classroom practice” to our satisfaction?’ Why is there a sense of disappointment on this count among researchers, policy-makers, and teachers?
My Take
It seems to me that there are at least two kinds of possible answers, to do with the sites and the nature of our research. In spite of our incessant, forensic analyses of the psychological processes of reading and writing, we have relatively few detailed, well-theorized studies of the two settings we claim we are trying to influence: the settings in which policy is formed, modified, and implemented, and the classrooms in which it is enacted.
Students bring to school increasingly complex and diverse socio-economic, technological, cultural, and language backgrounds, and they leave school heading for more complex, diverse, and unpredictable life, learning, and work trajectories. So reforming both policy and pedagogy, and understanding their relationship, are crucial ongoing tasks for literacy educators.
Where Policy is Formed
How policy affects practice is complex. New policies don’t replace old ones the moment they are introduced, nor do they get acted out the same way in different sites within educational systems. They can rearrange the relationships among educators within systems, and reorder the authority of their expertise. We sense these things (see Elmore, 1996), but how they might happen next time, and with what consequences, we simply don’t know. So aligning our ambitions with our practices is guesswork on each new occasion.
Classrooms
Similarly, there is much research on the application of commercial products, technologies, and strategies to classroom teaching and learning. But teachers’ work is shaped by constraints of time, space, and technology, and it has many simultaneous functions: managing bodies, movements, and attention; maximizing students’ participation and their emotional and physical safety; monitoring progress in their learning; and, of course, teaching syllabus content.
Teachers try to organize classroom activities so that all of these functions operate at the same time. So one crucial question for researchers is: ‘How are these functions best co-ordinated or orchestrated in different sites to maximize the instructional value of the activities?’ (Dillenbourg, 2011).
I-O Causal Connections
From this view comes a second kind of explanation for our ‘disappointment’. Much of the research we conduct in classrooms is based on three central ideas: 1) that an intervention (I) of some sort, say a new curriculum or teaching strategy, will or will not cause a change in learning outcomes (O); 2) that this I-O causal connection holds in general; and 3) that we know this because we can amalgamate data from lots of classrooms or lots of individual studies. This is the powerful logic of drug testing: ‘this chemical causes the death of these bacteria, overall, generally, wherever.’
Open-System Campaigns
But researching the efficacy of educational activities may be more like researching governmental campaigns about the dangers of smoking than testing the effects of drugs on bacteria. These campaigns operate within ‘open systems’, and they may work or not depending on how they interact, in possibly unpredicted ways, with mediating factors (Ms) in families, neighbourhoods, or workplaces. In this light we have a more generative set of research questions about a variety of I-M-O connections (Reimann, in press/2013). We could improve our chances of doing rigorous work that at the same time speaks more powerfully to the sites in and around literacy education that we wish to influence. For a start, this would probably require long-term projects and more extensive collaborations with teachers and policy-makers, and it might even lead us into a refreshing ‘post-disappointment’ phase.
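One way to make the contrast concrete, offered here as a minimal illustrative sketch of my own (the linear form and the coefficient names are assumptions, not a model proposed in the work cited above), is to set the two logics side by side. The drug-testing logic estimates a single pooled effect of the intervention:

    O = b0 + b1*I + e

The open-system logic admits a mediating factor M that both responds to the intervention and conditions its effect:

    M = a0 + a1*I + u
    O = c0 + c1*I + c2*M + c3*(I × M) + e

If the interaction term c3 is not zero, there is no single I-O effect to amalgamate: the pooled estimate b1 is an average over sites with different values of M, which is precisely the sense in which the same campaign can ‘work or not’ from one family, neighbourhood, or workplace to the next.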
References
Dillenbourg, P. (Ed.). (2011). Trends in orchestration: Second research and technology scouting report (D1.5). European Commission: Information, Society, and Media. Retrieved 3 April 2013, from http://www.academia.edu/2863589/Trends_in_orchestration
Elmore, R.F. (1996). School reform, teaching, and learning. Journal of Education Policy, 11, 499-505.
Reimann, P. (in press/2013). Testing times: Data and their (mis)use in schools. In H. Proctor, P. Brownlee, & P. Freebody (Eds.), Educational Heresies: New and enduring controversies over practice and policy. Heidelberg, Germany: Springer Scientific. (Currently available from the author at peter.reimann@sydney.edu.au.)
Reader response is welcomed. Email your comments to LRP@/