"And that's the story of how I got my new shell. It looks just like the one I threw out yesterday, and I found it in the same dumpster. But this one had a live raccoon inside." – Dr Zoidberg
I am still struggling to buy into this whole 21st Century Learner thing, where we claim (in Zoidbergian new-shell-type arguments) that at some point between yesterday and today the learner and/or their process of learning has changed.
"Students today are different from those of yesterday. They think and engage differently." – Hon Steve Maharey, Minister of Education, The New Zealand Curriculum: Draft for Consultation, 2006
What makes it harder for me is that those making the claims don't, or can't, elaborate on the research identifying the point of difference (the live raccoon) between yesterday's and today's learners.
Whilst no one from the MoE is describing the learning experiences of current New Zealand students as impoverished and/or dislocated, many of the statements around the release of the new draft curriculum suggest that its point of difference from the existing curriculum is that it will allow teachers to “engage all students in rich and authentic learning experiences.” This leads one to ask: just what have our students been engaged in before this?
“Engagement” is an interesting notion, as is “rich and authentic”. When I hear schools advocating the use of student inquiry and authentic contexts over other pedagogical approaches on the grounds that it engages (and thus apparently motivates) students, I always want to ask:
- How do you assess engagement?
- How different are these measures when students are learning through inquiry activities compared with when they are learning through other pedagogical approaches?
- And what difference do you find in student learning outcomes that can be causally attributed to your measures of engagement?
And when I think about “rich and authentic” I want to ask: authentic to whom? I want to know why “rich and authentic” is a more popular descriptor of the quantity and quality of the learning experience than “educationally relevant”.
Ailsa’s recent blog post comment neatly captures my problem with depending on notions of student engagement as a necessary and sufficient foundation for choosing the curriculum and pedagogical approaches we use in classrooms.
My education actually did me a great service that I was not aware of at the time. 30 years later I can praise the choice of literature, George Orwell with Animal Farm. I hated it at the time – not all education is, or should be, enjoyed, and teachers wait a long time for praise!
In my workplace the SEPs (student evaluations of a paper) would never be able to pick up such long-term effectiveness of course content or teaching effectiveness.
As Ailsa notes, measures of student satisfaction with a course, subject, or topic are not necessarily related to student learning in the short term or long term.
Learning is not a consumer activity that can be assessed by surveys of satisfaction. “It was heaps of fun” is not a measure of relevant, effective, and/or long-lasting student learning outcomes.
And despite Prensky’s slickly marketable “engage me or enrage me” stuff, engagement is not a self-report measure of wonderment and awe but rather a reflection of the determined and persistent focus that a learner needs to promote learning.
As Brabazon notes in her provocative book Digital Hemlock, “To read, remember, understand, synthesise and interpret knowledge is often drudgery. To learn with effectiveness requires repetition, practice and failure.” (p. 9)
I don’t know how to measure student engagement apart from perhaps self-report surveys of study behaviours. And when I ask teachers what they think engagement is and how they might measure it, they don’t know either. The conversation is all live raccoon stuff.
When unsubstantiated claims about student engagement (or the lack of it) are being promoted as the reason for re-engineering curriculum and/or changing whole schools over to inquiry learning, I reckon we should know more about it than we do.
The concept is deceptively simple; the way we interpret engagement is anything but.
Let loose the raccoons.
It's interesting to observe your scepticism about the 21st Century Reformation of Education from a US perspective, in which the pull is rather more of a push toward direct instruction and narrow control of learning outcomes and methods. I share your doubts - of both pushes and pulls. What bothers me about these visions is the commodification of learning, as though it is a product being packaged for delivery. Government propaganda about Education, whether it be for Inquiry or DI, or whatever they've got on special today, has the ring of advertising more than research.
For Inquiry in education processes to gain traction, teachers should be asking questions, taking action, and reporting back to others with similar interests. But this would confound the curriculum writers, who need to tell us the answers.
Posted by: Doug Noon | December 28, 2006 at 03:33 PM
I think you are right Doug. Inquiry versus direct instruction is a false dichotomy. We need both and other stuff as well. It so frightens me when schools proudly claim that they base all their learning experiences around student inquiry.
I am not the only one intimidated by the prospect of “inquiry” run rampant. The prospect of students learning through inquiry 24/7 was described by Ton de Jong during an ICCE2006 keynote as “a nightmare”, and this is from someone who has extensive research experience in designing effective simulation-based inquiry environments at the University of Twente in the Netherlands.
de Jong, T. (2006). Computer simulations: Technological advances in inquiry learning. Science, 312, 532–533.
de Jong described inquiry as an approach to learning that evokes a process of exploration and leads to asking questions and making discoveries in search of new understandings. He breaks the process down into transformation processes (orientation, hypothesis generation, experimentation, data interpretation, and reaching conclusions) and regulative processes (evaluation, planning, and monitoring).
His research team in Twente has found that secondary-age students have problems with ALL of the inquiry processes listed above.
Students have difficulty choosing the right variables to work with, they find it difficult to state testable hypotheses, and they do not necessarily draw the correct conclusions from experiments. They may have difficulty linking experimental data and hypotheses, because their pre-existing ideas tend to persist even when they are confronted with data that contradict those ideas. Students also struggle with basic experimental processes. They find it difficult to translate theoretical variables from their hypothesis into manipulable and observable variables in the experiment; they design ineffective experiments, for example by varying too many variables at one time; they may use an engineering approach, where they try to achieve a certain state in the simulation instead of trying to test a hypothesis; they fail to make predictions; and they make mistakes when interpreting data. Students also tend to do only short term planning and do not adequately monitor what they have done.
Quite a list. How we support students at all of these process stages, or even identify these stages for them, is something I don’t think we do very well in New Zealand, where student inquiry is extensively promoted for much younger children.
One of the problems de Jong identified was the difficulty of giving effective formative feedback during inquiry when you may have missed the context of what the student has done. Another is that unless students have a strong grounding in the relevant content of the domain, they will struggle with the cognitive load of researching for new meaning. It seems that student inquiry ought to follow direct instruction, not replace it.
I was especially interested in the selective circumstances under which de Jong’s team found inquiry learning to be effective:
1. When the right kind of domain is used (intuitive, deep conceptual knowledge rather than operational, factual, procedural knowledge); direct instruction and practice should be used for factual and procedural knowledge
2. When relevant cognitive processes are triggered and scaffolded (by section)
3. When appropriate prior knowledge is available, either with the co-learner or in the system
4. When learners have a goal to work to – e.g. a hypothesis
It seems inquiry learning needs much more thoughtful planning and discerning implementation than it currently attracts.
Posted by: Artichoke | December 28, 2006 at 04:34 PM
I've collated a range of critiques of Prensky on this page of the learning evolves wiki.
wrt inquiry approaches, in a previous life as a science teacher I used the NZ-developed Learning in Science Project materials and they were brilliant. I think good materials exist but they are not always recognised or systematically implemented.
Posted by: Bill Kerr | December 29, 2006 at 06:04 AM
Thanks for this Bill,
We share a common past – the teaching of secondary science (I taught biology/chemistry/physics) and the excitement of discovering and then implementing LISP materials in classrooms.
LISP electrified me – I still have all the working papers – I cannot bear to throw them out, even though it has been years and years and years since I taught secondary science. I have seen nothing since that captures the thinking in them. They are brilliant. The careful attempts to build on the ideas children have already acquired, the ways in which children’s views are determined, and the planning of the teaching programme still make me excited. It was grand stuff, and exactly the scaffolding that is missing from the constructivist inquiry programmes in our schools today.
Sadly, as the MoE report identifies, LISP has had little impact on the classroom learning experiences of New Zealand kids (or, in many cases, their teachers, who would look at you blankly if you asked them about it).
The degree to which the LISP findings are reflected in the teachers' pedagogical practices and the students' received curriculum is less clear. Despite the pre-service programmes that include the LISP findings, and the in-service programmes run in some parts of the country as part of the 1993 science curriculum implementation work, the actual uptake and use of the LISP findings in classroom pedagogies and supporting resources is thought to be varied and has yet to be widely researched. In addition, the use of recent theorising on learning and mind (Bell, 2000) to develop and evaluate additional effective pedagogies has yet to occur as has the researching of the use of the pedagogies developed in the LISP programmes, without the perceived constraints of national and school curriculum and assessment policies.
and it seems the teaching workforce in New Zealand lacked the expertise to develop materials that allowed for students' prior ideas:
Moreover, the prevailing view of "curriculum" in the early 1990's, and the perceptions of accountability movement in New Zealand education, made it extremely difficult to action these notions in the official curriculum or the classroom curriculum (Bell and Cowie, 2001). It has been difficult for even experienced teachers/writers to write curriculum materials that took into account students' prior ideas.
I know you have been thinking about developing something "LISP like" for computer studies – how is it going?
Posted by: Artichoke | December 29, 2006 at 09:02 AM
Paul Chandler is developing a LISP-like wiki about computing concepts, which I have contributed to. How is it going? I think "slowly" would be the honest answer, unfortunately.
On the home page Paul has a couple of great links to LISP materials. This one provides extensive lists of children's misconceptions about science. Good starting point as a refresher, or for anyone curious about this approach.
I still have all the LISP working papers too, arti – couldn't bear to chuck out such great teaching and learning materials either.
Posted by: Bill Kerr | December 29, 2006 at 02:34 PM
At the same time LISP was coming to the forefront, so was Kelvin Smythe with the Feeling For approach to social studies. Perhaps one needs to relook at the strengths of the approaches of the past. Has inquiry become too wishy-washy? Are we presuming too much of our students? Are we really listening to their initial "before" views and then building on them to create new knowledge? Just pondering.
Posted by: Vicky | December 31, 2006 at 01:49 PM
Bill, I wish you had been with me at Professor Naomi Miyake’s keynote at ICCE06 in Beijing. Miyake talked about Designed Collaboration as a Scaffold for Schematic Knowledge Integration.
My conference notes will not do it justice, and Miyake covered many other ideas and design approaches in her presentation (including something quite remarkable using jigsaw matrices over a 3-year course of learning in the cognitive sciences), but the stuff I would have liked to talk with you about follows.
The bit that reminded me of the Learning in Science Project was when she talked about HYPOTHESIS > EXPERIMENT > INSTRUCTION (HEI), a Japanese approach to learning science. I have not seen this being used in NZ schools, although it may be in use somewhere – have you seen it in Australia?
It starts out very like the LISP project but then develops into something that involves carefully planned for collaboration. It sounded very powerful and from the papers cited has been researched over a number of years.
The standard approach in HEI in Japan is to expose students to a series of tested questions designed to determine their individual views about different science ideas. (This seemed a bit like the LISP questions that determined the views held by children about force, energy, living, etc.)
1. Tested questions
e.g. Which is heaviest?
(a) Person standing on scales
(b) Person standing on one leg on scales
(c) Person squatting on scales
When students make their choices, they are asked to explain why they chose what they did.
2. Pupil responses shown by a show of hands
3. Pupils discuss choices made and reasons for them
4. Pupils are then offered the option to change their minds
5. Pupils then do a demo experiment with the teacher's guidance – done once
6. Pupils then respond to a series of questions designed to identify their new thinking
(a) What if a clay ball is changed into a different shape – a flat pancake, a long sausage?
(b) Would a baby's body weight change if she drinks a bottle of milk?
(c) Would your body weight change if you drank a carton of milk?
(d) Would dissolving sugar in water etc., etc.?
There is no great use of readings/textbooks/direct instruction for teaching concepts – instead they prefer a series of repeated HEI cycles.
And the learning outcome of HEI? Research results show that students gain solid conceptual understanding (Inagaki and Hatano, 1972, 1983, 2003).
The thinking behind the HEI mechanism seems to be that:
Students come with their own ideas about science, and these persist unless their learning experiences are carefully designed
The start-up HEI questions are designed to identify these ideas and to encourage students to examine and explain them
Students who choose the same explanations are then grouped together to elaborate their ideas AND also to examine and falsify the alternative explanations of others
Since the acquisition of robust scientific schemas requires “externalisation” it is important to make each person’s own ideas visible and allow them to be compared.
Students get a chance to compare their own ideas with others' ideas, and an opportunity to modify their own ideas as well as expand them
Once others' ideas are visible, we notice differences – the purpose is not convergence – instead we collaborate to allow the construction of schemas
So in HEI, collaboration is not about convergence. Collaboration is for interaction with differing ideas – for effective individual learning
Miyake argues that the collaboration bit in HEI is important because it:
1. Supports individual construction of adaptive schema
2. Produces data for process analyses
3. Facilitates formative evaluation
4. Encourages students' reflection on their own learning
“Even during highly collaborative comprehension activities, social sharing of the situation does not impede each participant from pursuing individualistic knowledge construction. Rather the interactive process supports each to realise different perspectives to check and modify their own understandings by making explicit the different perspectives which are not within their individual repertoire …
Most interestingly, in enhancing the formation of abstracted solutions, it allows different learning outcomes in each of the paired learners, so even the most able student is able to be advantaged by collaboration.
Posted by: Artichoke | December 31, 2006 at 01:55 PM
hi arti,
I still remember students voting on the nature of electric current after I chalked up different models (diagrams) on the board:
- same return
- no return
- less return
- clashing currents
After the vote I would ask the "same return" supporters to explain why batteries went flat. They couldn't, so they usually changed their vote to "less return". Then we did experiments setting up an ammeter before and after the globe, which provided evidence that "same return" was correct. Then I organised a role play of electric current, illustrating the difference between resistance, current and voltage as a resolution to the issues raised. Hey, I can still remember a lot of this even though I haven't taught it for years!
That example seems to contain most of the elements of the HEI process that you describe. I remember too that other science teachers were reluctant to take up these ideas even though they worked well in practice. After a while I became weary of arguing the case at science faculty meetings and moved on to the new exciting field of computing. Now that we have blogs we can connect worldwide to supporters of good ideas that deserved to do better in the "old days".
Posted by: Bill Kerr | January 02, 2007 at 05:09 PM
The term "engagement" perhaps is not the best term to use. Perhaps, you might consider it as shorthand for the terms "autonomy" and "effective time on task" from self-determination theory and ACT-R theory, both of which have a substantial research base.
Posted by: Charles | January 06, 2007 at 07:14 AM
Hi Charles, thanks for the comment - the problem does seem to lie in the many differing meanings we attribute to "engagement".
I've been checking up on ACT-R theory - I am profoundly ignorant about theories of computer human interaction - and so have much enjoyed the distraction this morning in finding out something completely new to me.
It made me wonder what the ACT-R Production Rules would look like for "engagement"
For example
IF the goal is to determine engagement in learning
and the effective time on task is > 80%
THEN classify the learner as "engaged".
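Out of idle curiosity, here is a rough sketch in Python of what a rule like the one above might look like if you actually coded it up. It is not ACT-R, and both the 80% threshold and the way "effective time on task" gets measured are assumptions of mine, purely for illustration.

def classify_engagement(time_on_task_minutes, total_minutes, threshold=0.80):
    # IF effective time on task exceeds the threshold, THEN call the learner "engaged"
    if total_minutes <= 0:
        raise ValueError("total_minutes must be positive")
    effective_time_on_task = time_on_task_minutes / total_minutes
    return "engaged" if effective_time_on_task > threshold else "not engaged"

print(classify_engagement(50, 60))  # 50/60 is roughly 0.83, so "engaged"

Which of course only restates the problem: where does the 80% come from, and who decides what counts as time on task?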
And I am also enjoying new thinking about the influence of a comment box after reading your blog.
Posted by: Artichoke | January 06, 2007 at 01:26 PM
After doing some more thinking, I believe that Csikszentmihalyi's concept of flow works better at looking at the notion of "engagement." As I don't see trackback capability on your blog, I thought I'd let you know that I've just posted about the fit between flow and engagement on my blog.
Posted by: Charles | January 09, 2007 at 07:12 AM
"When the experience of learning becomes its own reward" -
Well spotted Charles - a great post on engagement - I had forgotten about Csikszentmihalyi's flow, with the tension at the intersection of challenge and skills. Flow explains engagement in a practical way that helps educators planning learning experiences -
low challenge + low skill = apathy
low challenge + high skill = boredom
high challenge + low skill = anxiety
challenge matching skill leads to feelings of control, arousal and then "flow"
This clarifies engagement in an educational context.
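And since I am still playing with rules, here is a toy sketch of those quadrants in Python - my own illustration only, treating challenge and skill as simple low/high labels rather than the continua Csikszentmihalyi describes:

def flow_state(challenge, skill):
    # Map a (challenge, skill) pair, each "low" or "high", to the quadrant named above
    quadrants = {
        ("low", "low"): "apathy",
        ("low", "high"): "boredom",
        ("high", "low"): "anxiety",
        ("high", "high"): "flow",  # challenge matching skill: control, arousal, then flow
    }
    return quadrants[(challenge, skill)]

print(flow_state("high", "low"))  # -> "anxiety"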
Posted by: Artichoke | January 15, 2007 at 10:02 AM