Friday, April 30, 2010

Why Predictions Fail.




You can't talk about the future of Tertiary education without making predictions. If you are not prepared to put your neck on the block and say what you think is going to happen, it's a pointless exercise. Predicting puts you in harm's way - almost all forecasts about the future are wrong.
Before I make too many predictions, it's worth reviewing why predictions fail.
1. Failure to account for economics as a key driver, rather than technology. 
Flying cars are a great example here. People think because a thing can be done, it will be. We don't go to work in flying cars because they are impossible - they're not. They're just too expensive, and it's cheaper to go by road.

2. Failure to consider human factors and rates of change. 
A thing can be done a better way, but you have to wait for the old ones who do it the old way to die off first. This is especially important in Universities, where the great old ones live long. The paperless office comes to mind as an example. I've worked in one, and it was great. There weren't any old people there who liked to print things off and scribble on them.

3. Predicting out of area of expertise
People who are experts in one field assume expertise in others. Artists imagining space travel are a nice example. The paintings are lovely, but I'm not flying in that, thanks. A lot of people fail on technology-driven predictions here; often the devil is in the details. Another reason we don't have many paperless offices is that until recently, the screens just weren't good enough, and the software tools for easy annotation weren't either. Generalists sweeping past miss those kinds of details.

4. Failure to account for changes out of area of expertise
This is really the converse of the reason above. There is a tendency to assume that changes you know about will dominate, and changes you do not know about are unimportant. For comparison, consider Ray Kurzweil's Singularity work, focused on technology, against George Friedman's book 'The Next 100 Years', focused on geopolitics. Both are fine pieces of work by experts in their fields, well argued and, like all predictions, probably wrong. Both focus heavily on developments in the author's own area of expertise, and miss, or err on, key topics outside the field.
5. Wishful thinking. 
Confusing 'we can', 'we should' and 'we ought' with what will probably occur. A prediction is not a wish, or a hope; it is a cold, rational analysis of what is likely to occur, whether we like it or not.

6. Predicting the Weather, not the Climate
When people think of predicting, they think of predicting earthquakes, or the stock market, or the weather. You can't predict these in any detail. They are essentially random noise in a pattern. You can predict where earthquakes are likely, that stock markets will exist and be useful, and that there will be weather. Very specific predictions often fail not (just) because they are more specific bets on a random future, but because they are attempting to predict things on too fine a scale. Big trends have a mass, an inertia to them that is often the elephant in the room, too big to see. How the big trends collide and play out is important, but as humans we get lost in the human scale details. It's said no one predicted the First World War (and yet every general staff in Europe had a plan for it for decades). It's true we couldn't predict the details of it, but history tells us that Great Power wars happen a lot, and technology and economics could tell us they would get bigger, and meaner. In the big scale of things, the "Battles and Kings" school of history - who fought, who won and who lost - doesn't matter so much. What was important to predict was that there would be battles and kings.

Number 5, Wishful thinking, is about the only one I'm confident of avoiding. Point 2 (rates of change) and point 6 (scale of prediction) are always going to be tricky, and the others all rely on having the right spread and depth of expertise - knowing enough about enough things to get the big picture right but not miss sneaky details.

Thursday, April 29, 2010

Review: DIY U by Anya Kamenetz

"DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education", by Anya Kamenetz, Chelsea Green Publishing, 2010.

This is a good introduction to the current state of tertiary education in the United States, the Open Education movement, and the potential for technology-driven disruption to the sector.

The book is tightly focused on the US situation; the rest of the world gets an honourable mention.

The first half is a fairly critical overview of the state of Tertiary Education in the US. It's interesting, but the issues are less valid for Europe, where the costs are lower and social inequality less severe. The Gini coefficient (a measure of inequality) for the US is currently around 47, about the same as places like Kenya and Jamaica, compared to 35 in Ireland, where I write, or 24 in egalitarian Denmark. The extent of the problems in the US system is alarming, and a good warning for those who might ape the American system.
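As an aside - this isn't from the book - the Gini index is simple enough to compute yourself from a list of incomes. A minimal sketch in Python, using the standard closed form over sorted incomes, scaled 0-100 as in the figures above:

```python
def gini(incomes):
    # 0 = everyone earns the same; 100 = one person earns everything.
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    # Standard closed form: rank-weighted sum of the sorted incomes.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 100 * (2 * weighted / (n * total) - (n + 1) / n)

print(gini([1, 1, 1, 1, 1]))    # 0.0 -- perfect equality
print(gini([1, 2, 3, 10, 50]))  # ~64 -- Kenya territory
```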

Kamenetz takes the view that much of higher education in the US is a racket. High cost enforces scarcity of space at elite institutions, and filters out students from disadvantaged backgrounds. Graduates from elite institutions, filtered by class and economics before their first day on campus, then proceed to do well and form the next generation of the elite. The institutions even promote themselves on the basis of their selectivity:
 "It's like Weight watchers advertising that they only take skinny people."
The system excludes many on grounds of cost, and many more enter the system, incur student debt, and yet fail to graduate and reap the marginal benefits of a degree at a lesser college. She is critical of a system where everyone aspires to go to college, when not everyone needs to, and every college aspires to compete with Harvard. The cheap credit of the 00s drove massive expansion of student debt as people borrowed vast amounts of easy money for degrees they often never completed, driving spiralling fees as people equated cost with exclusivity and quality.

In the second half of the book she moves on to talk about solutions, driven by technology and the Open Education movement. Why pay fees when you can get the knowledge for free? Why go to such expense to build social networks when we can build networks of people with a common interest faster and cheaper online? She correctly identifies assessment and accreditation as the critical points not easily solved online, and raises the question of whether, in the internet age, online portfolios of work could replace conventional accreditation. She cites the example of open source software, where a potential hire can be checked out in advance by the quality of their work on public projects. Ideas like 'Whuffie' and smart assessments get a mention too.

It's a good overview of the topic. All the main events, players and ideas, from MIT OpenCourseWare through to Personal Learning Environments, the University of the People and Massive Open Online Courses, are covered in brief. If you've been following thinkers like Stephen Downes, David Wiley and George Siemens online, there won't be much here that is new to you, but if you've only heard the terms Edupunk and Open Educational Resources, then it's a quick primer on what's going on in the sector, and what changes it could bring.

It's short, clear and to the point. The author is a journalist, not an academic, and it shows. Few academics write so clearly; most would drag it out to 400 pages to little extra effect. Even if you are familiar with the ground, it's probably worth a read. If you are not, and have an interest, it's a good starting point.






Wednesday, April 28, 2010

The New Centre of the World


"A special report on innovation in emerging markets: The world turned upside down." Adrian Woolridge  The Economist, April 15th 2010.

This article attacks the conventional narrative of globalisation, and suggests that it is the 'emerging markets', and not the old core of the developed world, that are taking the lead on innovation. The old narrative was that we in the west did the smart, clever work, and places like India and China did the boring donkey work. The iPod story is the textbook example: of the total cost of an iPod made in China, supposedly only $4 worth is the actual assembly in China. The lion's share of the cost is clever western engineers and marketing people doing clever things that can't be done in China. Not surprisingly, it wasn't going to stay that way for long. The survey tells us how companies in 'emerging markets', driven by local problems of poverty, poor distribution and so on, are making better, cheaper and smarter products than we do in the West.

It's not in the least bit surprising. I always found the idea that the West somehow had an unassailable lead on cleverness vaguely racist, and sloppy analysis to boot. It was dramatically disproved at Pearl Harbour (or Tsushima if you were a quick learner).

In Education, it is alleged, the old core still has the advantage. The article notes that:
"McKinsey reckons that only 25% of India's engineering graduates...and 10% of those with degrees of any kind are qualified to work for a multinational company."
But that apparent advantage will erode quickly. McKinsey might not think they're good enough, but they are young, hungry and cheap. The sheer volume of graduates being produced is intimidating:
"China produces 75,000 people with higher degrees in engineering or computer science and India produces 60,000 every year"
"Between them, these two counties produce twice as many people with advanced degrees in engineering or computer sciences as the United States every year (more if you allow for the fact the 50% of American engineering degrees are awarded to foreigners, most of them Indians of Chinese)"

India and China see education as a strategic imperative, counting production of graduates as a measure of national power, as the Imperial states of Europe once counted production of Coal, Steel and Dreadnoughts. They have a steep hill to climb to build capacity, but will find workarounds. For example, the article mentions the Infosys Campus in Mysore, the world's largest corporate training facility, training 15,000 people a year. It's "harder than Harvard", notes Fortune magazine, taking only 1% of over 1 million applicants.

This kind of workaround is needed for employers to overcome the poor quality and supply of graduates. The steady flow west to earn degrees in Europe or the US is another workaround, for the wealthiest. I imagine there are a lot of other interesting approaches being taken on the ground - the scale of demand offers no alternatives. Places our parents' generation associated with famine and poverty are now the world's middle class, and in the next generation will transition from educating a small minority to tertiary level to majority, perhaps even universal, tertiary education. Consider the effects of the (much smaller) scale of the GI Bill on tertiary education in the US as a clue of what it will bring.

Even inside the often freshly built walls of conventional universities, a radically different environment and set of drivers as this transition passes will surely create a model of tertiary education very different from what it might be at Harvard, or the Sorbonne.

The scale of the transition will define a new centre of the world in terms of Tertiary Education (and many other things). It is our first world model that will become the outlier - the unusual. Much like English has been adopted as a world language, and become a different, richer thing, so too in education. Innovations in practice from places like Mysore will be brought back to the old core. The language of degrees and credits will be taken up by the new, but beneath the names it will be built from scratch. Perhaps it will not be built as a parrot copy, but as a very different beast indeed.

The survey text is at economist.com and there is also an interview with Adrian Wooldridge, who wrote the piece.


Wednesday, April 21, 2010

The Four Forces: Driving Change to 2100AD

Four great trends will drive change in Tertiary education to 2100. I've introduced them in previous posts, but let's take a minute to line them up:

Demographics: 10 billion, mostly old people. World population will stabilise at around 10 billion people, and they will be increasingly old. An average age of 55 is not unreasonable by 2100. Longer lifespans will bring more people back for second and third dips into tertiary education, or indeed continuous education.  Overall, the sector might be ten times as large as it is today. It is not unreasonable to suppose that a large portion of the population over 18 might be engaged, in some form, in tertiary education.

Economics: The end of scarcity. A continuation of the 20th century trend would bring another tenfold increase in per capita GDP, making the world, on average, as rich as today's richest country (Norway). Only people at the very margins of society will be unable to afford tertiary education. In the first half of the century, vast cohorts in the old 'Third World' will want, and be able to afford, University educations.

Telepresence: The Death of Distance. Increasingly compelling, immersive and reliable telepresent environments will render the idea of bringing people together in one physical space for education or work a quaint anachronism. Teams or classes may come together once or twice a year, for novelty's sake, but true telepresence will make geographic distance as old fashioned an idea as posting personal correspondence in physical mail.

Artificial Intelligence: Smarts too cheap to meter. The steady progress of Moore's law will create machines with processing power to match the human mind relatively early in the century. Distributed processing will allow systems to draw on immense processing power when needed, and present machines as cheaper alternatives for most jobs currently done by humans. User interfaces that can pass a Turing test will make machine staff indistinguishable from humans. Why hire a human receptionist to answer the phone when the phone comes with a processor that can do the job, and doesn't need coffee? When a 1000 euro machine is smarter than anyone you can hire, why hire anyone? The consequences for economics and employment are staggering, and managing the transition will be a huge issue from mid century on.
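For what that last extrapolation looks like on the back of an envelope: the figures below - roughly 10^16 operations per second for the brain, a 10^9 baseline for a cheap 2010 machine, a doubling every two years - are my own rough assumptions, not established facts, and moving any one of them shifts the crossover by a decade or more.

```python
# When does a cheap machine match the brain? All three figures here
# are rough assumptions for illustration, not established facts.
BRAIN_OPS = 1e16        # assumed ops/sec for the human mind
ops, year = 1e9, 2010   # assumed cheap-machine baseline
while ops < BRAIN_OPS:
    ops *= 2    # one doubling...
    year += 2   # ...every two years
print(year)     # 2058 under these particular assumptions
```

A faster doubling time or a lower estimate for the brain pulls the date back towards mid century; the point is that most plausible assumption sets land it comfortably within this century.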

These are simple extrapolations of well established trends, none of which have any major roadblocks in sight. It's difficult to make a compelling case against any of them. As of 2010, these trends have massive inertia behind them - it's difficult to imagine what scale of events could derail them.

That's not to say that nothing else will happen. Few in 1900 would have predicted the ubiquity of Automobiles, air travel or the Internet. But in 1900, the key trends that set the tone of the 20th century - population growth, economic growth and urbanisation - were in motion. Geopolitical events (like the world wars) could not have been forecast in detail, but the logic of industrialisation made it inevitable that great power wars would get bigger, and worse, until they became so destructive and expensive as to be not worth the risk. We could not have predicted the 747, but we could have predicted that economic growth would make international travel relatively cheap and easy - we might have predicted a super Zeppelin.

Having mapped out these trends, the challenge now is to figure out what the consequences of these trends are for Tertiary Education, not just in isolation, but as these trends interact and interlock. The second challenge is that the future is not path independent. We don't just wake up in 2100 with institutions and people perfectly attuned to it, any more than our institutions and people in 2010 are perfect fits for the world today. As the century plays out, existing institutions will adapt, or maladapt, to the changes. People's ideas and preconceptions will change, but only in generational slow time. The future is not a destination everyone arrives at at once; it's kind of smeared out, as William Gibson said:
"The future is already here. It's just not very evenly distributed"

Tuesday, April 20, 2010

Live Long and Prosper: Universities at the end of History.

Gapminder.org is great chart candy. Check out this one. Behind the lovely bouncing balls of that linked chart is a fantastic story. Per capita GDP increased by a factor of 10 between 1900 and 2000, despite a great depression, two world wars, the Spanish flu, and all the other ills and woes of the 20th century. Not only is he richer, but Mr Joe Average Earthling lives much longer than in 1900, and can expect to have fewer children, and have them all outlive him. It's a remarkable achievement that never makes the newspapers. Quiet victories, won at a few percent a year, don't make headlines.

Extrapolate the graph a bit and by 2100, as the world's population begins to shrink, Earth will have about 10 billion people with an average per capita GDP, in today's money, a little shy of 100,000 US Dollars. This is staggering wealth. Only Norway and Luxembourg have numbers like this today. Imagine a whole world, on average, as rich as Norwegians.
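That extrapolation is just compound growth, and the implied annual rate is surprisingly modest. A one-line sanity check:

```python
# A tenfold rise in per capita GDP over a century implies a steady
# annual growth rate of only about 2.3% -- well within the range the
# 20th century actually delivered, despite its wars and depressions.
rate = 10 ** (1 / 100) - 1
print(f"{rate:.2%} per year")  # 2.33% per year
```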

The importance of GDP per capita is hotly debated by economists who live in economies with high GDPs per capita. Drawing on their excellent educations, they argue it isn't really a good measure of social progress. In warm, comfortable, well equipped offices they debate the hidden costs of economic growth. The healthy, long lived, well fed and educated grandchildren of today's 'bottom billion' can debate the matter in 2100.

It sounds like Utopia, the real end of History, the land of plenty. There are plenty of 'black swans': rare events that could be imagined, but given that this vision is just an extrapolation of what happened in the 20th century, we could waste a couple of decades in brutal warfare, have a good plague and throw a few nukes around and still reach the target. Global warming at the extreme end is about the only scenario that could derail the sheer inertia of the trend.

What place will Universities have in this new Utopia? An obvious answer is 'about the same as in Norway', but this isn't so. The currently developed world grew rather slowly at first. Many of its universities existed in seed form for a long time. When mass tertiary education arrived after the second world war, preexisting institutions grew rapidly to soak up the numbers, on a substrate of fairly good infrastructure at secondary and primary level.

The big developing economies are growing faster than the first world did. Places like China, India, Nigeria and Indonesia have huge populations on the cusp of a point where they will need, and their people will demand, tertiary educations. The sizes of the potential cohorts in these countries are staggering. Conventional university models will be simply crushed by the volumes. Even if you could build campuses big enough, fast enough, who would teach the classes? In conventional models, it takes 8 years to turn a smart first year undergraduate into a keen junior lecturer - and the smartest graduates will get plenty of better offers. Expect academics from the first world, with longer pedigrees that may sell well locally, to be poached.

It's in this climate, not in the mature first world markets, that online learning, distance learning and open courseware models will really find traction. Without strong existing cartels fighting for an 18th century status quo, without college educated parents and employers with old fangled notions of what a degree should involve, and with huge incentives to deliver, governments in these countries can, and must, leapfrog the current model of a University into something new.

To see the model for the future of Tertiary Education, look south.

Monday, April 19, 2010

The end of 'College Age'

"The Shock of the Old: Welcome to the Elderly Age" by Fred Pearce, New Scientist, 8 April 2010.

Mature Students are a pain in the neck. Long ago, I was a postgrad demonstrator in a Friday afternoon mineralogy lab. You could count on the regular students tearing through the work and being gone by 3pm. Not the mature students. They asked question after question - it was worse, far worse, than a PhD viva. At 6pm, when the lab officially ended, you had to prise their eyes off the microscopes and chase them out the door.

There's going to be a lot more of them. Worldwide, the human race is having fewer children, and living longer. Countries like Japan lead the charge, but the rest of the first world will follow. Even countries like China won't be far behind. The economic and social impact of this change will be far reaching, perhaps the most significant change of the 21st century.

It's also probably the change Universities are best prepared for. Many campuses teem with life after the day students go home, as the second shift, evening students, pour in.

As ever, economics drives change where ideologies cannot. As the conventional 'college age' market has reached near saturation in the first world, universities seeking growth went for mature students. They are also seen as a cash cow. Where day students are often state funded or supported (in Europe), evening and mature students are typically fee paying, or, better yet, employer funded. The older audience are in many ways easier pickings. They are more likely to be geographically tied by houses, jobs and children's schools. This makes them less mobile, so it's much easier for a University to dominate its physically local market. Their children might be able to travel and study anywhere, but they need somewhere close to the office.

The sheer scale of them will change the shape of Universities. Instead of having them looked after by an Adult and Continuing Education unit tucked away in a distant office, you might think of them as the core business, and imagine a Day Student unit looking after the now unusual requirements of full time, first time, young adult students. The strange timetables designed on the assumption that students will be on the premises 5 days a week, in ever shorter seasonal terms, will finally have to be let go.

Adults with jobs and children will want to pick up modules here and there over years, not put their lives on hold for several years to complete a degree sized chunk at a full time pace. Universities will need to look towards marketing individual modules, often tailored for local industries, rather than selling degree sized packages. In many cases, the adult learners already have degrees, and don't want another - they just want to learn a specific skill.

They won't be much interested in the social side of University life. Where a first time student is interested in building the network of friends they will carry through their lives, or finding a spouse, mature students mostly have all that sorted out. They'll be pleased to meet new people, sure, but it's not going to be a big thing for them. Conventional student life can expect to wither and die unless it can reach out to the mature students.

Finally, older students learn differently. They are more outcomes focused, and don't suffer fools gladly. They know what time in class costs, both in money and lost family time, and want to get the best out of it. Having worked in professional environments, they'll expect higher standards than the 18-26 year olds will, and they won't be afraid to speak their mind if they aren't getting what they are paying for. That might well be the biggest change of all.

Tuesday, April 13, 2010

The Collapse of Universities? Not so fast.


"The Collapse of Complex Business Models" Clay Shirky, www.shirky.com Accessed April 12th 2010


Much talk on the web about Clay Shirky's recent blog post. Shirky draws on Joseph Tainter's book The Collapse of Complex Societies to make inferences for complex business models. The idea goes that societies get more and more complex, and the benefit of each added layer of complexity shrinks until eventually a break point is reached. The layers of complexity make the society inflexible and incapable of change, so when confronted with external strains, it collapses. I haven't read Tainter's work, but Shirky, as ever, provides a sharp précis, and draws inferences for the future of conventional, complex media organisations.

This was widely praised and argued across the web. While Harold Jarche talks about the differences between 'Complex' and 'Complicated', Christopher Sessums draws the inference over into the educational space, moving on to argue that
"The affordances of social media and open educational resources are making the time and space used for formal education nearly worthless. "

Steady on there.

To many commentators working within the walls of a University, it may seem that the University - their University - is in fact a Complex Society in the same sense as Ancient Rome or the Maya, and is thus doomed.

This is not the case. Universities are closer to organisms in an ecosystem than to a self contained and isolated society. An organisation, or an organism, exists in an ecosystem as one of many. A society, almost by definition, occupies an entire ecosystem, and has limited interaction, if any, with other societies. Most of the collapsed societies in History (and I think of those listed in Jared Diamond's work 'Collapse') existed in near isolation.

As organisms in a system, universities evolve. They eat up smaller institutions to dominate a niche, or split off side campuses to enter new spaces. They relentlessly share their DNA, as University heads look over their shoulders and shamelessly copy the innovations of others. Universities fight for resources, funding and students among themselves, where a Society usually co-opts all of the resources in its zone of control and operates without competitive challenge.

Make no mistake, Universities are dinosaurs. They can crush you, outrun you and outbreed you. They dominate their ecosystem to the exclusion of all others, existing in astonishing diversity, and repeatedly adapting to environmental change. What it took to get rid of the dinosaurs wiped out almost everything else as well. The same is true here. If Universities become non viable institutions, then their collapse will be the least of our worries.

Universities are not going to go gently into the night. They won't wave their hands in the air, cry that it's all too complicated (or was it complex?) and shut their doors. Some will no doubt go under, but most will adapt and survive, ruthlessly ripping out the DNA from models that work and re-engineering themselves for the Internet Age. They will do it in University Time, not Internet time, but they have enough inertia for that not to matter. In fact, a slower response to change will insulate them from short timescale fads (Would you wish you had bet the farm on CD-ROMs? WAP?).

It may not seem so from this blog, but overall I'm bullish about the likelihood of Universities surviving the next century. So long as there is a need for people to be educated to a high level, beyond what can be learned in a self directed way, Universities will be doing that business.

Wednesday, April 7, 2010

Open Educational Resources are not Open Education.

There always seems to be chatter on the web about Open Educational Resources, Patent Wars, Copyright wars and so forth. I don't really get it, to be frank. It feels like rearranging the deck chairs on the Queen Mary. Open Educational Resources are only a small step in a journey that began long ago.

The marginal cost of reproducing knowledge has been in a long downward trend since the invention of writing, and with it the cost of accessing and creating knowledge. Gutenberg, Pamphlets, Penny Dreadfuls and Penguin paperbacks each marked a step along the path as knowledge became more widely and cheaply available. From time to time, a hero pushes things on (Carnegie of the Libraries and Berners-Lee of the web come to mind) or pulls things back (the Catholic Church's rearguard action against the reformation, or corporate media's 'War on Piracy'). Neutral figures (Steve Jobs comes to mind) attract unreasonable levels of attention as both sides wonder if they are truly friend or foe.

Each drop in the cost of reproduction brings new providers into the market, improving the range and quality of available knowledge. Printing allowed Johann Carolus and Thomas Archer to create the first newspapers. The Web brought us Wikipedia and YouTube, but it also brought you and me. Each step onwards drives existing monopoly providers into extinction, be they Monasteries or Murdoch.


Each step down in cost makes knowledge accessible to more and more people. In the age of the monasteries, access to knowledge meant being able to afford some years in the monastery. Now it means the cost of a broadband connection and a computer, or a local library with the same, for the poor and determined. The pool of people who can be self taught, and take sole ownership of their learning, expands a little each time as it becomes cheaper and easier to do so. Significantly, the current expansion makes it open to people outside the comfortable first world Universities which house most of the commentators on Open Educational Resources. The greatest benefit of the web will be to the billions of humans accessing it on scrounged hardware from the vast favelas of the emerging world. They are a lot more motivated than we are, and they don't have a lot of other educational options. They want to know how to speak English and do double entry bookkeeping, grow hydroponic khat and make IEDs. Our debates about the relative virtues of Learning Object Repository models and what Blackboard is up to are about as relevant to them as the Council of Trent.


But free content isn't Free Education. Formal Education as most people imagine it typically involves material (the textbook, notes and so on), a Guide figure (teacher, head abbot, whatever) and a peer group (the class). It's getting cheaper, but it isn't free.


The web has dropped the marginal cost of a peer group to near zero. In the Enlightenment era, access to an educational peer group meant having the time and money to hang around in coffee shops in London or Amsterdam. Now it's 4chan and Facebook. If you get stuck, there is always a discussion thread somewhere you can ask on, once you have a web connection.


But the marginal cost of the Guide figure hasn't dropped a penny since Aristotle taught Alexander. Guiding doesn't scale - it's one to one. Teachers usually conflate creating or presenting content (lecturing, which scales very well) with Guiding. Many teachers don't guide at all. Guides are supposed to know how students are doing, assess (as in the old Latin root, 'to sit beside') and help and direct. Our technology can't scale that up.


Historically, each leap forward in technology makes Guides look more and more expensive compared to content and peers, and better access to content and peers makes more and more people able to do without mentors and guides to learning.  Printed bibles made Priests look out of touch and unnecessary:
 "Perhaps we can learn it just by reading the book" wondered the Lutherans. 
"No Way!" said the Catholic Church, knowledge monopolist of the day.
 The resulting pedagogical debate on the relevance of guides vs learning out of the book consumed millions of lives and lasted centuries. One hopes the struggle between the Universities and Edupunks will not involve quite so much blood.


There will always be people who need Guides to give formal structure to learning, and so long as that doesn't scale, there will be a place for them. They will bunch together into Universities or guilds or whatever organisation works best in the economic and technological climate of the day. One day, technology will move to allow Guiding to scale; then, and only then, will we see a real change in how formal education works, and truly Open Education.

Tuesday, April 6, 2010

Machines could never...

"Pigeons outperform humans at the Monty Hall Dilemma", Blogs / Not Exactly Rocket Science Accessed April 6 2010


Apparently, in specific cases, pigeons are smarter than us. Smarter than me. 


Next time you read a gee whizz article about artificial intelligence and scoff, thinking "A Machine could never do that", remember the pigeons. There is some crazy notion that human intelligence is somehow above and beyond anything that can come from the animal or digital kingdoms. It just isn't so. Animal and Digital intelligences are different, optimised for survival in different environments, that's all. It takes us a while to recognise it. I suspect real AIs will be with us for some time, probably in the form of distributed botnets harvesting bank accounts, long before anyone recognises them as such.
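If the result still feels wrong, the Monty Hall game is trivial to simulate. A quick sketch (mine, not the paper's) of why the switching strategy the pigeons learned wins about two times in three:

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car, pick = random.randrange(3), random.randrange(3)
        # Host opens a door that hides a goat and isn't the player's pick.
        opened = next(d for d in range(3) if d not in (pick, car))
        if switch:
            # Move to the one remaining closed door.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

print(play(switch=False))  # ~0.33 if you stick with your first pick
print(play(switch=True))   # ~0.67 if you switch, as the pigeons learned to
```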


Reference: Herbranson, W., & Schroeder, J. (2010). Are birds smarter than mathematicians? Pigeons (Columba livia) perform optimally on a version of the Monty Hall Dilemma. Journal of Comparative Psychology, 124(1), 1-13. DOI: 10.1037/a0017703

Statistics.com: The thin end of the wedge

"The Specialists" by Steve Kolowich, Inside Higher Ed, April 5 2010.
Kolowich hits the nail on the head in this article on a number of themes, taking Statistics.com as an example of the type of entity that I think will be a central player in Education in the 21st century. The article is well worth a read.

Statistics.com seems to operate the Superstar model - I'll confess I don't know who the Superstars of Statistics are (it seems a strange idea!) but the Faculty looks grunty enough to make a land grab for the title. The course programme is comprehensive, and seems to have the relevant approvals and certifications. The Faculty is global, and grading is outsourced to India. One assumes that mentoring and synchronous tutorials are run from there too; if not, they soon will be. Finally, and most importantly, the site operates by distance. It's in direct competition with every statistics department on earth.

Existing Universities will probably fight long and hard against this. It's Turkeys voting for Christmas. The article notes:
"Traditional institutions, however, have been hesitant to open the door to commercial ventures that sell higher education by the course or program."

Hesitant? No Kidding. Only a foolhardy or bankrupt University head would suggest downsizing the Stats department and contracting in a third party provider. It's also culturally incomprehensible and politically intractable. Most Universities think distance education is beneath contempt, unless it's their own lecturers putting their own PowerPoints on Blackboard, in which case it's a dynamic, innovative revenue stream. The idea of contracting out a core activity like teaching is politically impossible for institutions which just about manage to contract out the cleaning and catering.

The best new entrants like this can hope for is that a University might grant exemptions to students who have completed modules with them. It's tempting: would you lose out on fees for a potential MSc student because you won't recognise the modules, when the University of Down the Road will? The article suggests that is indeed the case:

"The [American Council for Education] says that it has never heard of a college refusing to accept credits earned in Statistics.com courses."

In the end though, the success of providers like this hinges on employers. If you were taking on a hire who needed stats and they had paper from Statistics.com, would you give them the time of day? If the market is tight, or the candidate is otherwise strong, you might think it worth the effort to get one of your existing Stats guys to check it out, or run some skill tests on the new hire. If there are twenty strong resumes in the pile, you might not bother and just hire the person who took their Stats from the same department you did. Even so, that's still progress. You probably have existing staff that need to be skilled up in Stats. You could send them down to the local college, but the lecture schedule is insane and won't gel at all with a working day. You could hire in a guy for some onsite training, but that's a major expense and it won't wash for three people. Maybe you'll enroll them with Statistics.com, and see how it works out...

Sunday, April 4, 2010

21st Century Assessment: The University of Farmville




Carnegie Mellon University Professor Jesse Schell's talk on the future of gaming is thought provoking. It gives some interesting insights into what educational assessment might look like by mid 21st Century.

The core idea is that games will become pervasive in our society. In an environment where everything we buy, everything we read or watch, eat or drink can be easily logged and tracked, games will pervade life as we seek to accumulate points - eating and exercising to get discount points from our health insurers, taking the bus to get tax credits and so forth. Watch the presentation, you'll get the idea.

In the talk Schell cites the example of a Professor (in Game Design) who replaced the grade system in his course with an Experience Points (XP) system, very familiar to anyone who has played any kind of non trivial game invented since about 1980. Points are accumulated for attendance, completion of exercises, contributions and so forth. It's easy to fast forward and imagine automated systems recording and tracking your efforts in a course: did you actually read the text (eyeball tracking on your eBook), did you ask a question in class, and so on. The idea would push strongly towards formative assessment, as students work to accumulate XP over the course, and away from old fashioned summative assessment, which largely exists because it is (relatively) easy for humans to grade.
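A toy sketch of the mechanics - my own illustration, not the professor's actual scheme - shows how little machinery grading-as-XP needs: a running total, an activity log and some level thresholds.

```python
LEVELS = [0, 100, 250, 500, 1000]  # XP needed for levels 1-5 (arbitrary)

class CourseXP:
    def __init__(self):
        self.xp, self.log = 0, []

    def award(self, activity, points):
        # Every tracked action adds to the running total.
        self.xp += points
        self.log.append((activity, points))

    @property
    def level(self):
        return sum(self.xp >= need for need in LEVELS)

student = CourseXP()
student.award("attended lecture", 10)
student.award("read chapter 3 (eyeball-tracked)", 25)
student.award("asked a question in class", 15)
print(student.xp, "XP, level", student.level)  # 50 XP, level 1
```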

Humans love slow, incremental reward; it's a bug/feature in our psychology. Games like Farmville and Mafia Wars (check out the wonderful parody 'Progress Wars') offer steady accumulation of points and the promise of 'levelling up' in just another few clicks. They exploit this bug and enjoy massive success, despite gameplay that would not challenge a pigeon. It's a whole lot more compelling than slogging away for a year and then rolling the dice in a big summative exam. It's not a radical or new discovery. Anyone who has tried to work their way up the Tennis or Chess ladder, earned martial arts belts, or gone to Weight Watchers knows the psychology of earning points and levels and getting ahead of someone else. Outfits like Scientology do very well out of the idea of levelling up. The Freemasons have done it since the 1700's (there's always another degree), although the levels in Farmville are smaller and better sized for the Attention Deficit Era. It just takes a few hundred years for new innovations to work their way over into education.

The idea of gathering points for an educational outcome isn't new. Continuous Professional Development programmes have done this for years, requiring a certain number of points to retain professional accreditation. Schell points out that once professional game designers, well versed in what kinds of behaviour and reward systems compel people, get their hands on systems like this, they will make them vastly more engaging and motivate the students. They'll be running down College Road on Monday morning, keen to crack those Quantum Physics problems so they can level up before Johnny down the road. Two more levels and they open up the Quantum Cryptography level, and win a free pass into Gorbys. It's not the rewards though, it's the winning.

Recognition of Prior Learning (RPL) is another area where technology driven monitoring and points accumulation will be a breakthrough tool. Imagine a world where everything you do at work is logged and captured in some way. Management systems map your progress towards outcomes, targets hit and goals achieved. Or perhaps it's more like a Twitter feed where your colleagues, customers and students authenticate your achievements and professional development; entries can be tagged with discipline areas, points weightings and recommendations:

"I got that report finished"
Data Analysis: 270 points
Project Evaluation: 120 points
Technical Writing: 150 points
 View Details?
"I convinced Treasury to sign off on that Report."
Negotiation, 80 points,
Report Writing, 30 points, 
View details?

It's a hybrid mutant of peer assessment and social networking, the grandchild of the Recommendations feature in LinkedIn.

Pretty easy to imagine an App that mines this kind of data, and the details of the negotiation behind it, and tells me I've got half the points I need for an MA in Public Sector Project Evaluation at The University of Z. Better talk to the boss about how best to fill in those blanks - except I'm only a hundred points in Statistics short of an H.Dip at University of Y, and I can do those online by playing 20 levels of Statsville. Decisions, decisions...
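To make the idea concrete, here's a hypothetical sketch of what that App might do with a feed like the one above. Every name, number and degree requirement here is invented for illustration:

```python
# Tally authenticated workplace points by discipline, then compare
# against a (made-up) credential's requirements to find the gaps.
feed = [
    {"claim": "I got that report finished",
     "points": {"Data Analysis": 270, "Project Evaluation": 120,
                "Technical Writing": 150}},
    {"claim": "I convinced Treasury to sign off on that Report",
     "points": {"Negotiation": 80, "Report Writing": 30}},
]
required = {"Project Evaluation": 300, "Data Analysis": 200}  # toy MA rules

earned = {}
for item in feed:
    for area, pts in item["points"].items():
        earned[area] = earned.get(area, 0) + pts

gaps = {area: need - earned.get(area, 0)
        for area, need in required.items() if earned.get(area, 0) < need}
print(gaps)  # {'Project Evaluation': 180} -- what's left to earn
```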

Of course, the recruitment system will cut right through to the details anyway, regardless of whether the points add up to a Degree or not. Sure, I can still earn points by turning up in old fashioned classes, but everybody knows professional experience is worth much more. After all, why pay a University good money to 'Recognise' that my experiential learning is in a big enough chunk of the right shape to be one of those Degree things like my Dad had?

Why indeed.



Update:
I referred above to Statsville as a spurious future online statistics game. Alternatively, if you are stuck in 2010, there is Statistics.com, featured today in Inside Higher Ed.

Friday, April 2, 2010

From Darkness, Light

Emily Howell's new album, From Darkness, Light, is not to my taste. It's a bit too ambient and arty. You might say it's a little soulless. But it's not bad for a machine. I can't imagine our first efforts at writing music for whales would be much better. Her purely derivative works, written in the style of existing composers, are very respectable.
Emily Howell is software. She'll get better. The algorithm will improve, react to market trends, and find the musical keys to move our souls.

Science fiction taught us that artificial intelligence would be invented. It would walk onstage one day, refuse to open the pod bay doors, and announce it would be back. That's not how it's working out. Instead it creeps up on us year by year as the machines take over increasingly complex tasks. Few people remember typing pools, switchboard operators and countless other jobs of the past. We think nothing now of using machines as our research assistants, taking dictation, checking for plagiarism. In five years we'll think nothing of usable machine translation, automated essay grading, machines that write our newspapers, check our diagnoses and fight our wars.

When my eldest daughter starts University in 2023, the computer in her hand will have twice the raw processing power of the one in her head. Distributed computing will put power several orders of magnitude beyond that in her reach. The Turing Test, where a computer can pass for a human, may well be passed by the time she graduates. Profitable niche applications, like generating unique third year history essays at €5 each, or gaming financial systems, will reach a point where they can pass for human much sooner than that.

The implications of this for how we teach, and what we teach, in Universities are profound and largely ignored. We assume the skills we teach in University are magically beyond automation. Since the first water wheel, machines have displaced human effort, and the surplus of labour created found other, better jobs. The plough freed us to be poets; the steelworkers of the 19th century are the knowledge workers of the 21st. But now the island of our cognitive superiority is shrinking as the waters rise exponentially. The 21st century will be a knowledge economy, but it won't be our knowledge.

It will take time for the change to work through. It often takes a generation for an innovation to move from the journals to the shop floor, and institutional change is often generational too. What is possible often takes a decade or two to become commonplace, but it gets there eventually. It isn't science fiction.

The changes will be radical. First, practical degrees like science and engineering will become less and less economically attractive by mid century. Softer skills, which might be more difficult for machines to replicate, will become more economically attractive. Interpersonal disciplines, where humans prefer to deal with humans, like Medicine or Theatre, will be the last refuge of economically useful degrees. By the end of the century, as we become habituated to dealing with machines, and no longer notice, or care, about the difference, that too will vanish. Through the century, an increasing proportion of the jobs in our economy will be unrelated to the production of goods and services. We have already made the journey from having 100% of humans working in food production to having (in the first world) only a handful. Other industries will make that transition too. With nothing left to do, by century's end our University system will become largely an entertainment system, a place for humans to amuse ourselves and pass the time.

My daughter says she wants to be a Dinosaur when she grows up. I think she might be right.