The NDF conference presentation on Web metrics by Seb Chan from the Powerhouse Museum, who blogs at fresh + new(er), had some valuable insights for institutions like NZCER that produce surveys claiming to measure student engagement in school.
Me and My School: student engagement survey
This year we developed and launched a survey tool aimed at finding out what students in years 7-10 think about their school and their learning. The tool was trialled on more than 8000 students before we launched it in the third term this year. Many schools have told us how useful they have found it, so much so that we are looking at expanding it to years 11-13. We hope to have that work in development next year and an extended tool available in 2010. Meanwhile, the years 7-10 version will be available again in 2009, with schools able to run it in the third term. We make it available at the same time each year in order to ensure the national norms are valid.
How we understand engagement is something that I have fretted over before on Artichoke ... I’d much prefer we put our thinking and energy into measuring student versatility and control.
Engagement is also something that I doubt can be measured on a survey, so I was especially alert to the current enthusiasm in New Zealand schools for NZCER’s new measure designed to profile student engagement – and to how principals are talking about using the NZCER engagement data as an evaluative measure for school programmes.
At the NDF conference Seb Chan’s presentation was on the validity of different measurements of visitor satisfaction used to evaluate the success of the work of museums. His presentation provided much to challenge our NZCER survey measurement of student engagement used to evaluate the success of the work of schools.
Chan started by looking at current measures of visitor engagement and how little they really tell us.
For example, Chan claimed that changes in the way people interact in online environments make the traditional Web analytics and metrics that museums have used to measure and track success on the Web for the past decade increasingly inadequate. Occasional user surveys and server-side log analysis can no longer be relied upon by Web teams to guide them towards making museum sites more user-centric and effective.
The “more user-centric and effective” bit reminded me of our NZC claim to “put students at the centre of teaching and learning".
When Chan claimed that whilst basic reporting currently satisfies government and sometimes corporate benefactors, far more complex analysis is required for museums themselves to more effectively evaluate and refine their on-line offerings for their users, I was interested in how this might also relate to the conclusions drawn from a self-report survey on engagement.
Chan was an entertaining conference speaker. He well exposed the flaws and deceit in commonly used web analytics – “Where counting has no point” – through “A Summary Of Old Problems”;
- The Problem With Log File Data,
- The Problem With Page Tagging Data,
- The Problem With 'Unique Visitors',
- The Problem With 'Visits' And 'Time Spent On Site',
- The Problem With 'Page Views'.
Even the number of visitors who click on an interactive such as a video talk, or who download a podcast, was exposed as misleading: more detailed analysis shows how few of them watch the whole video or listen to the whole podcast.
“In many ways the best measure of the success of a podcast is how much feedback and discussion it generates. This is far more valuable than the total number of downloads”.
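To make that concrete for myself, here is a toy sketch of how download counts overstate engagement compared with completion. All the numbers and names are made up, and the 90% "completion" threshold is my own arbitrary choice, not anything Chan proposed:

```python
# Hypothetical playback logs: (listener_id, seconds_played) for one
# 1800-second podcast. Counting downloads alone would report five
# "engaged" listeners.
PODCAST_LENGTH = 1800  # seconds
plays = [("a", 1800), ("b", 120), ("c", 45), ("d", 1750), ("e", 300)]

downloads = len(plays)
# Treat a play as a completion if at least 90% of the episode was heard
# (an arbitrary threshold for illustration).
completions = sum(1 for _, secs in plays if secs >= 0.9 * PODCAST_LENGTH)

print(downloads)    # 5 downloads ...
print(completions)  # ... but only 2 listeners heard (nearly) all of it
```

Five downloads, two real listens: the headline number and the behaviour behind it tell quite different stories.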
Of much more interest to Chan was how we might measure the stuff that really shows visitor satisfaction.
If just turning up on a website was not enough, then ... Seb argued that third-party web metric measures of visitor behaviour gave a better indication of the success of museums, exhibits, and events than numbers of visitors or page views: RSS feed tracking; comments on the museum website but also on other blog posts; tagging and comments on museum content in Flickr Commons photos, and how these are used in other conversations in communities and blogs; Technorati trackback; and Facebook friends, fans, and profile comments.
He suggested combining qualitative and quantitative measures when we measure visitor comments online.
Again, it is far better to measure interactions – comments, trackbacks – and then qualitatively assess them. Blogs should ideally be generating conversation and discussion, and blogs will rank differently depending upon your choice of what to measure (Chan & Spadaccini, 2007).
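A sketch of what that combination might look like. Everything here is hypothetical – the comments are invented, and the tags stand in for the qualitative read of each comment that Chan and Spadaccini argue must accompany the raw counts:

```python
from collections import Counter

# Hypothetical comment records: (item_id, comment_text, hand-assigned tag).
comments = [
    ("podcast-12", "This changed how I run my classroom", "substantive"),
    ("podcast-12", "Great stuff!", "cheer"),
    ("exhibit-3", "The audio link is broken", "housekeeping"),
    ("exhibit-3", "I disagree with the dating of the second object", "substantive"),
]

# Quantitative: how much conversation each item generated.
by_item = Counter(item for item, _, _ in comments)
# Qualitative: what kind of conversation it was.
by_tag = Counter(tag for _, _, tag in comments)
substantive_share = by_tag["substantive"] / len(comments)

print(by_item)            # counts per item
print(substantive_share)  # 0.5 – half the comments were real discussion
```

The counting part is trivial; the tagging is the real work, and it is human judgement, not analytics, that supplies it.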
Chan identified these alternative web analytics as a way of collecting “measures of recommendation” – a kind of “how likely is it that you would recommend [the company/ experience] to a friend or a colleague?” – a broader sense of that net promoter score stuff. He suggested that recommendation (and hence allowing recommendation and sharing) is how we should understand the way people interact with museums.
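For reference, the net promoter score itself is a simple calculation: respondents rate the recommendation question from 0 to 10, promoters (9-10) and detractors (0-6) are counted, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, with made-up ratings:

```python
def net_promoter_score(ratings):
    """Net Promoter Score from 0-10 'would you recommend?' ratings.

    Promoters score 9-10, detractors 0-6 (7-8 are passive);
    NPS = %promoters - %detractors, ranging from -100 to +100.
    """
    if not ratings:
        raise ValueError("no ratings given")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical ratings from ten respondents:
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 3, 10]))  # 10.0
```

The arithmetic is the easy bit; Chan’s point is that the interesting question is where the recommendation behaviour shows up, not how to score it.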
It all made me think of our current excitement in education over measuring engagement. If engagement in learning is important, then counting the numbers of students who claim to be engaged in response to questions in a survey will tell us very little.
We should look carefully at Seb Chan’s museum analytics thinking and look for measures of recommendation.
1. How likely is it that students would recommend [the school, the teacher, the learning experience] to others?
2. How could we find this out using Web metrics?
My thinking starts with mentions of learning on student social networking sites, blogs, Rate my teacher ...
And then
3. How could we use technology to allow for/ enhance the conditions for recommendation and sharing of learning in school?
Sebastian Chan: Towards New Metrics Of Success For On-line Museum Projects
I agree with the belief that students prefer a “more user-centric and effective” learning process that would “put students at the centre of teaching and learning". Researching on Google today can be a frustrating process, but sweetsearch.com is a new educational search engine that weeds out bad sites and only provides users with hand-picked, reliable ones. We hope that students can do research on their own, using the Web, and not feel so hopeless.
Posted by: Josh | January 31, 2009 at 08:38 AM