The infographic below (produced by OnlineUniversities.com) contains some interesting data that lends considerable weight to the argument that we are now in the midst of a major paradigm shift in the higher education space.
I was also pleased to see that there is no reference to ‘lecturer’ or ‘instructor’ (or even ‘teacher’!). This is a major gripe of mine: despite the increasingly technology-enabled, learner-centric environment we work in, many of us don’t seem able to let go of the old terms and labels. Is it appropriate to use words like these when they connote a very different type of pedagogy?
Also, why do we persist with the ‘e’-prefix? This might have been apt in the 1990s, but e-learning just seems so passé to me. Around the time the term arose, we also used to talk about e-banking, and nowadays people just talk about doing their banking. Maybe it’s time we also just talked about learning.
Image source: thechronicleherald.ca
I came across a nice post (via @catspyjamasnz) today that gathers together some of the themes I have been blogging about in recent weeks. Roland Sussex succinctly defines ‘open education’ and then goes on to contextualise what this now means in the wake of the new models of delivery that have been attracting a lot of attention of late. He refers, for example, to the founding of Udacity following the successful trial of a course in artificial intelligence delivered completely free online:
The course attracted 160,000 enrolments (roughly the total enrolment of the UK Open University) from 190 countries, and 23,000 completed it. It provided frequent feedback and tests, with two examinations (mid-semester and end-semester), all handled by software. Students who passed the course received a letter of completion. And all this was totally cost-free to the student.
The students also provided creative input. They established two large Facebook pages for course discussion and interaction, and nearly 2,000 of them translated the course materials into 44 languages. And the solutions to the very frequent quizzes, now on YouTube, provided feedback and commentary. The course designers implemented a version of problem-based learning and made it work online.
Image source: adi-news.com
The key message is that using laptops and tablet computers in class is less about technology and more about effective pedagogy. In the industrial age we had no alternative to ‘factory-style’ education, delivering programmes en masse, invariably catering to the lowest common denominator. Under this model, students with learning difficulties are left behind and able students aren’t stretched, leaving educators with the issue of student disengagement, and the attendant problems of unruly students, truancy, drop-outs, and so on.
In the digital age, with easy access to information and communication technologies (ICTs), there is no reason to continue with this outdated mode of delivery. Yes, it will take time to make the transition, as professional development of educators is imperative, and — as the NYT article makes clear — there will be pitfalls along the way, but this is not a reason for delay.
Image source: telegraph.co.uk
An article published in the UK Sunday Times at the weekend and republished in The Australian yesterday adds another name to the growing band of influential figures seriously challenging the notion of university education — at least as it is currently structured. This time it is Larry Summers, former President of Harvard University, and erstwhile colleague of that other Harvard academic, Clayton Christensen, who has also set the cat amongst the pigeons with his most recent book, The Innovative University.
According to Summers, the explosion of knowledge, and our ability to access it through computers, demands change in the way universities operate. Furthermore, most companies look nothing like they did 50 years ago, yet undergraduate education looks much as it did in the middle of the 20th century. He also argues that:
Universities are going to have to be increasingly about pinpointing principles, ways of thinking, common values and common aspects of experience rather than trying to teach all there is to know because no one can know all there is to know.
This sounds to me like an argument for getting students to analyse rather than memorise, which may not appear a big deal except that it would mean a fundamental shift in the curriculum, pedagogy and assessment practices of a great many tertiary educational institutions around the world.
The image above is a common sight in universities everywhere. It does not resemble any real-world setting in which a graduate might be expected to apply their newly acquired knowledge and analytical skills. The students are also using pens and paper, which, while quaint, is hardly very 21st century.
I read a nice piece in The Globe and Mail this morning (courtesy of @Ronald Beach), in which the author, Margaret Wente, stridently puts forward the case for flexible delivery of higher education. In We’re ripe for a great disruption in higher education, she argues that:
The digital revolution will make higher education better, cheaper, more accessible, more engaging and far more customized than anything that exists today. It’ll also turn our current institutions upside down.
Citing the examples of MITx and WGU, and perhaps emboldened by Clayton Christensen’s most recent work, The Innovative University, Ms Wente certainly rattles the cages of some of the 180+ individuals who commented on her article within the first 24 hours. This, no doubt, will please her greatly, as I suspect it is precisely what she set out to do.
The striking thing about this infographic is not just the stunning penetration of Twitter, but also that of LinkedIn. It is also interesting that the bastion of tradition in the US higher education sector, Harvard University, has the greatest reach in terms of its social networks. Among the challenges noted is that it is not enough simply to have a presence on social media. To maintain credibility — in terms of marketing and communication, at least — an institution’s social media profile requires full-time attention.
It was S.T. Coleridge who made the observation that ‘the willing suspension of disbelief’ is required for the enjoyment of poetry, novels and the dramatic arts — I believe this is also a requirement in education where role play takes centre stage. Role play is the bridge between theory and reality and, done properly, constitutes an effective means for securing deep learning outcomes. The reason, simply, is that the arts stimulate our imaginations which, in turn, stimulate creativity. I might add that ICTs are effective tools for the creation of authentic role plays.
Ken Robinson makes this case very eruditely in his presentation entitled ‘Out of Our Minds’, in which he calls for the abandonment of standardisation and conformity in education, thus allowing individuals to explore their talents through a personalised curriculum.
Michael Wesch of A Vision of Students Today fame gave a TEDx talk last year that I revisited the other day when I found myself (frustratingly) having to justify the case for authentic assessment. “But they’ll cheat, won’t they?” is the classic response I get when I present the case for an open-book, open-web summative assessment. Some research I did with Amy Wong back in 2009 suggests the contrary, but a much more important question is why someone would want to cheat in the first place.
If the test of a person’s ‘knowledge’ is the ability to dredge information from the brain and use this to piece together something resembling a coherent argument in the unnatural setting of an examination hall, then resorting to unethical means for the dredging process will never be beyond temptation.
If, on the other hand, what is being tested is a person’s ability to think on their feet to solve unstructured, real-world problems, generating unique, meaningful and useful responses, how is it possible to cheat?
Wesch uses the term knowledge-able to describe this attribute, which amounts to rather more than being ‘knowledgeable’. The difference between the two is that the latter does not necessarily require you to put your knowledge into practice, going beyond critical thinking to actually create meaning.
Acquiring knowledge-ability effectively ‘future-proofs’ your learning … it is learning that lasts way beyond the ‘test’.
I’ve been monitoring the nascent Occupy Education movement with interest over the last couple of weeks, not least because there are so many dimensions to it. The complexity of it all has been addressed far more coherently than I ever could over at Tenured Radical, but my crude interpretation is that it essentially revolves around access and relevance (or lack thereof!).
These two factors would appear inextricably linked, in my mind, as access to an affordable education is obviously more worthwhile if it also happens to be relevant and useful. Yet few of the Facebook contributions and Tumblr pages I have perused so far make this connection; there is a tendency to focus on one or the other.
I understand how — given the origins of the Occupy movement — the socioeconomic dimension takes primacy, but if there is to be reform to provide equity of access, I hope an equal amount of energy is expended in ensuring it is access to a quality education that accommodates the learning styles and lifestyles of individual learners, and not some factory model that caters to the lowest common denominator.
An Occupy Education movement that focuses on flexible delivery of programmes, allows people to participate fully, and assesses learning outcomes in an authentic and engaging manner is one I would happily sign up for, because I think it would have a good chance of addressing both the inequities and the irrelevancies that currently plague the education system in the US and elsewhere.