Tuesday, December 30, 2014

Bogus quotes and graphs used in learning conferences

Watson, I presume?
IBM may now be using ‘Watson’, the Jeopardy winner, to bring algorithmic decision-making to medicine and other social goods, but it was not always so virtuous in selling algorithms. More on that later. Also, don’t presume the name ‘Watson’ comes from Sherlock’s foil and companion Dr Watson; it doesn’t. It's named after an early IBM CEO, Thomas Watson.
Over-used futurist quote
He is the source of that oft-repeated tech quote, "I think there is a world market for maybe five computers". Oh how we laugh! How stupid could these old-timers be! But I groan when I hear this quote or see it on a PowerPoint slide from some showboating presenter who describes him or herself as a ‘futurist’. My most recent experience was at an OEB keynote in Berlin, by someone called Mark Stevenson. For me, it has become a touchstone. Whenever I see or hear it, my bullshit alarm rings and I’m eyes down on my laptop doing something else. I groan, first because it’s an overused cliché, beloved of lazy speakers, and second because the quote is a lie.
No evidence
Kevin Maney tried, but failed, to find the quote in any contemporary document, speech or account of Watson or IBM. It doesn’t appear anywhere until over 40 years after it was supposedly said by Watson in 1943, first in a book by Cerf & Navasky (1984), who seem to have lifted it from an earlier book of Facts and Fallacies. It then pops up in a newspaper column (May 1985) and seems to have spread virally on the nascent internet from its first mention on Usenet, in 1986.
The fact that the quote was a myth was discussed as early as 1985, and certainly in the Economist in 1973, "revealing that Watson never made his oft-quoted prediction that there was 'a world market for maybe five computers.'" Of course, the absence of evidence does not mean that it didn’t happen, but the next time an overeager academic quotes this on his or her slides, I’m going to call for a timeout and ask for a citation.
Watson’s no saint
Although we can absolve him from blame on the quote, Watson did something far worse. He flew to meet Hitler in 1937 and sold him a primitive punch-card Learning Management System. As told in the excellent book IBM and the Holocaust by Edwin Black, it stored data on skills, race and sexual orientation. Jews, Gypsies, the disabled and homosexuals were identified and selected for slave labour and the death trains to the concentration camps. I once mentioned this to Elliot Masie at a breakfast meeting in Florida. He went apeshit, as it was a rather uncomfortable truth. (IBM was a sponsor of his conference.)
Einstein
Next up is Einstein. I once wrote a spoof blog post about a course generator I had invented, which automatically generated schemes of words all beginning with 'C', plus an Einstein quote generator. No look at the future is complete without an Einstein quote. Yet many he never said, and some are simply made up. Take "Everyone is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid." He never said it. Or "The definition of insanity is doing the same thing over and over and expecting different results." He never said that either. There are lots of these.
Not just quotes
Ever seen this graph, or one like it? It's still a staple in education, training and teacher-training courses. I've seen it this year in a Vice Chancellor's talk at a university and in a presentation by the CDO of a major learning company. It’s bogus.
A quick glance is enough to raise suspicion. Any study that produces a series of results bang on multiples of ten should look highly suspect to anyone with even the most basic knowledge of statistics.
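To see why, here's a minimal back-of-the-envelope sketch in Python (the six categories, and the assumption of genuine whole-number percentages, are mine, purely for illustration):

# If six measured retention percentages were genuine integer results
# in the range 0-100, how likely is it that ALL of them land exactly
# on a multiple of ten? (Six bars is an assumption, matching typical
# versions of the bogus graph.)
p_round = 11 / 101                     # 0, 10, 20, ..., 100 are 11 of the 101 values
n_categories = 6
p_all_round = p_round ** n_categories
print(f"P(all {n_categories} bars on multiples of 10) = {p_all_round:.2g}")
# Prints roughly 1.7e-06 - about 1 in 600,000 by chance alone.

In other words, the odds that an honest study produced those suspiciously tidy numbers are vanishingly small.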
But it’s worse than nonsense. When Will Thalheimer, who uncovered the deception, contacted the lead author of the cited study, Dr Chi of the University of Pittsburgh, a leading expert on ‘expertise’, the reply was, "I don't recognize this graph at all. So the citation is definitely wrong; since it's not my graph." What’s worse, this image and variations of the data have been circulating in thousands of PowerPoints, articles and books since the 60s.
Further investigations of these graphs by Kinnamon (2002, personal communication, October 25) found dozens of references to these numbers in reports and promotional material. Michael Molenda (2003, personal communications, February and March) did a similar job. Their investigations found that the percentages have even been modified to suit the presenter’s needs.
The one here is from Bersin (recently bought by Deloitte). Categories have even been added to make a point (e.g. that teaching is the most effective method of learning). The root of the problem is Edgar Dale’s depiction of types of learning, from the abstract to the concrete. Dale put no numbers on his ‘cone of experience’ and regarded it as a visual metaphor, implying no hierarchy at all.
Serious-looking histograms can seem scientific, especially when supported by bogus academic research. They create the illusion of good data. This is one of the most famous examples, not of ‘Big’ data but of ‘Bad’ data, in the history of learning.
Conclusion

Lessons from all this? So-called 'futurists' largely just plunder debris from the past. To be honest, even if Watson had said that misused quote, he would have been justified, as it was years before computers were to move beyond mainframes. Interestingly, with the newly resurrected Watson, IBM is going back to a massive computer with a set of algorithms, available on the cloud, to deliver solutions to a myriad of problems. ‘Five’ may also not be too wide of the mark - a giant server farm for each of Google, Facebook, Apple, Microsoft and Amazon?


Monday, December 29, 2014

My tech prediction for 2015 - two small letters…

I shall eschew the usual New Year prognostications and Horizon scans, alternatively known as statements of the bleedin’ obvious (mobile, cloud etc.) mixed in with the idiosyncratic interests of their authors (talent management, performance etc.) and bullshit (21st C skills, mindfulness etc.). Instead I proffer just one prediction, in just two letters – AI.
The term ‘Artificial Intelligence’ was coined in the year I was born, 1956, and after a series of over-optimistic claims, false starts and dead ends, we now see AI exploding and embedding itself in thousands of contexts, especially online. You are unlikely to do anything online that does not utilise AI.
My son, who is doing a degree in AI, says, “I can’t think of one thing you can’t apply AI to. Anything a human can do, AI will be able to do, and better”. When I asked him if teaching could be replaced by AI, he laughed. “That one will be easy,” he said, “…in fact it’s already here”. He went on to explain that ‘search’ is the primary pedagogic technique for anyone wanting to know almost anything, from anywhere, at any time: “that’s AI”.
With 2500 years of theorising behind it, it now has some serious wind in its sails: maths, computer power, neuroscience, economics, cognitive psychology, linguistics and control theory have come together to produce a phenomenal cross-disciplinary effort. Add to that the internet, with its huge numbers of users and gargantuan amounts of data, and it is an unstoppable force. Forget that backstop of PowerPoint futurists, the Gartner curve; this thing will just grow and grow.
Brief history of AI
Much of early AI history comes from my own degree subject, philosophy, starting with the colossal Aristotle, who kicked off logic with his syllogistic work. Hobbes, Pascal, Leibniz and Descartes all contributed to defining the mind as a reasoning machine. This was supplemented by a heavy dose of philosophical empiricism from Bacon, Locke and Hume. We even have the application of a calculus to ethics in Bentham and Mill.
In parallel, often overlapping, progress and significant leaps were made in mathematics. Euclid’s algorithm for calculating the greatest common divisor opened up the algorithmic approach to mathematics, with al-Khowarazmi (from whom we get the word ‘algorithm’) introducing algebra to Europe. This eventually led to conceptual leaps in logic and computational theory by Boole, Frege and Gödel. Add to this advances in probability from Cardano, Pascal, Fermat, Laplace, Bernoulli and Bayes. This potent mixture of algorithmic reasoning, computational theory and probability comes together in the modern era, with its focus on the power of deep, universal algorithms and machine learning.
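As an aside, Euclid’s method survives unchanged into modern code. Here’s a minimal sketch in Python (the 1071 and 462 inputs are just a conventional textbook illustration, not from Euclid):

def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: repeatedly replace the pair (a, b) with
    # (b, a % b); when the remainder hits zero, a is the GCD.
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21

Twenty-three centuries on, it remains the textbook example of an algorithm: a finite, mechanical procedure guaranteed to terminate with the right answer.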
Birthplace of modern AI (1956)
The official birthplace of modern AI was my other alma mater, Dartmouth College, where, in the year of my birth, 1956, McCarthy organised a two-month study of AI. This was where I got my first introduction to the power of programming and computers, which led to a lifetime in the use of technology to enhance human learning. 1956 was a pivotal year in other ways: Khrushchev’s speech denouncing Stalin marked a turning point for global communism, the Suez crisis marked the demise of the British Empire, the first Islamic terror attack came at the Milk Bar in Algiers, Castro landed in Cuba and Elvis broke through with ‘Heartbreak Hotel’.
AI spring
After the AI winter came the AI spring, when the field came together in the late 1980s under a more pragmatic approach that adopted the scientific method, with hypotheses subjected to real experimental trials and analysed statistically for significant results.
Web uplift
But what really took the field to a higher plane was the enormous data thermal that was the web, with its billions of users, massive amounts of data and the acceleration of computer power, storage and capability. The trillions of words available in English, then other languages, led to rapid advances in search and translation, and an explosion of algorithmic services used by billions of people worldwide.
Artificial General Intelligence
The field has moved on from a set of isolated areas of activity and silos of research to a more pragmatic, scientific and problem-solving world. Artificial General Intelligence, the search for universal learning algorithms that will work across different domains and in different environments, is now a real movement.
Age of the Algorithm
I’ve read a stack of books on this subject over 2014, from academic textbooks to more populist works, and each one just confirmed and reinforced my view that we are now in the ‘Age of the Algorithm’. This is a formidable change in human history, where technology moves from being a physical aid, even a cognitive extension, to enhancing and significantly improving human performance. We are on the verge of something very big and very powerful, where machines trump man. The consequences will be enormous and, as usual, there will be forces for good and bad.
Conclusion
Importantly, we are now in the position of having a truly cross-disciplinary field, fed by many disciplines, where the power of hungry algorithms is fuelled and tested by gargantuan amounts of data. The technology is now small, fast and cheap enough to have taken AI into the consumer world, where it is used by us all. Few of the services we use would work without this huge, invisible, beneath-the-surface machine intelligence guiding, shaping and often defining their delivery.

So, rather than skate across surface phenomena – MOOCs, cloud, mobile etc. – AI is my pick, not just for 2015 but for the next 50 years, as the ‘Age of Algorithms’ is a deep, broad, paradigmatic shift that has already delivered powerful pedagogic shifts (search, recommendation and adaptive learning). If I were 20 again, this is where I’d be heading.


Monday, December 22, 2014

'Wellington College" Festival of Education? I think not.

I got an email from an organisation called the Sunday Times Festival of Education. It was an odd, circumspect email asking for a telephone call to discuss - well, I'm not sure what. I asked them what they were really after - it turned out to be sponsorship. Now I've talked at hundreds of conferences all over the world, but I've never been asked to sponsor one! They were under the mistaken impression that I was still the CEO of Epic, a company I sold in 2005 (so much for their research). Even then, we did nothing in schools.
Even odder was the fact that the sender didn’t really say what the whole shebang was about. It was all very vague, as if they couldn't really tell the truth, as if it were all slightly embarrassing. I replied, politely, pointing them in the right direction – away from me.
Google to the rescue
Surely a sign of the ‘Times’ that a ‘Festival’ of education should be held in an English public school - Wellington College. That was a bad start. I know Anthony Seldon well and don't have much time for his mindfulness (mindless) nonsense, nor for The Wellington Academy, which recently sacked its headmaster (whom I also know) for a slight drop in GCSE Maths results. Seldon and co have a political agenda, and that agenda is not mine.
Looking at the 2015 festival site, I saw that it’s heavily sponsored by public school associations: ISBA (Independent Schools Bursars Association), AGBIS (Association of Governing Bodies of Independent Schools), GDST (Girls’ Day School Trust), HMC (Leading Independent Schools) and ASCL, the small, private-sector-leaning ‘union’. This is not good. Sponsors get privileges - speaker nominations and slots. I know, because that was what the sender was offering me. They skew the agenda.
Festival?
Now I’ve always seen a ‘Festival’ as a sort of democratic celebration of something good in life. I'm the Deputy Chair of a large Arts Festival in Brighton, and have had a decade of experience of direct involvement with something that I think is worthy and adds to the culture of my home town. But why celebrate a system where the 7% tail wags the dog? Scrape beneath the surface and you’ll find the usual suspects organising this tawdry affair. You can see exactly why Tristram Hunt has marched in step with Gove in recommending that the private sector take pity on the poor state sector by handing down some baubles of expertise (an old Seldon idea). I’m all for debate and discussion, and no fan of the status quo, but when a ‘Festival of Education’ is sponsored by a Murdoch paper and largely public school associations, held in a public school, and hosted, organised and shaped by a minority with a strong political agenda, you need to probe a little deeper and ask whether it really is a festival or a subtle PR exercise.


Saturday, December 20, 2014

Why are the Maths Zealots in our schools?

There’s a movement stalking our land and many other lands – the Maths Zealots. Driven by PISA envy, they are sure that the solution to all our economic and social woes is knowing the quadratic formula. Before I start, don’t shoot the messenger. I like maths, I’ve taught maths, I helped commission major maths teaching projects. I just don’t like silver-bullet movements that take one subject and treat it as if it’s the pinnacle of educational achievement, when it’s clearly not. Let’s be honest, the last person you’d take most problems in life to, even practical ones, is a maths teacher or prof.
Latin - educational fossil
This is not new. Latin was forced down the throats of generations of kids for no better reason than laziness and snobbery. No, it’s not a sensible way to help you learn other languages. No, it doesn’t give you great insights into English. It’s a long-dead fossil that bores most pupils, with mostly fictional educational benefits.
Coding the new Latin
I have no doubt that coding should be taught in schools, but compulsory coding for all is crazy. I’ve coded, and I've spent a lifetime working with ‘coders’, most of whom will say that it’s not a subject for all. I don’t care how many ‘hackathons’ you’ve organised, coding is a specialist skill, readily purchasable from smart people in cheaper lands. For every coder there are project managers, sales people, marketing people, finance people, graphic artists, video production people… funny how you never hear of the ‘hour of sales’, even though sales is the skill most often absent in start-ups.
STEM - all too easy acronym
Some acronyms create more problems than they’re worth and this is one of them. I’m OK with promoting science, but the ‘T’ is arguably only there because ‘STEM’ is easier to say and remember than ‘SEM’. Is it ‘Technology’ or ‘Technical’? Doesn’t matter – what is the ‘T’ bit anyway? Engineering? Isn’t that technology? Not sure. And what has happened to the design bit of ‘E’ for engineering? It has been ripped out of the school curriculum by Arts graduates like Gove. Then let’s just lob maths onto the end. While we’re at it, let’s add some more letters: ‘A’ for the arts, ‘D’ for drama and design, ‘H’ for history, ‘G’ for geography, ‘L’ for languages, ‘B’ for business, ‘P’ for psychology, ‘M’ for music and so on. Then again, that wouldn’t fit the great desire for all-too-easy-to-remember acronyms. STEAMBLGPHD…?
Numeracy
To be fair, numeracy is a more appropriate approach to maths but, as is so often the case, rather than argue the case for a functional maths qualification, the quangos and charities turn this into a maths crusade. People are so focused on getting their MBEs and CBEs that they forget that this is primarily a political battle, where politicians and civil servants have to be challenged.
Maths, maths and more maths
More maths is likely to cause more problems. Why? Remember what Henry Ford said: “If I had asked people what they wanted, they would have said faster horses.” Hammering home more calculation is a case of trying to shut the proverbial stable door once the horse has bolted. We desperately need a functional maths qualification, based largely on number skills, not just on calculation. Computational devices – we all have one, it’s called a mobile – do the job better. If we are to teach maths, it needs to be more focussed on the practical and on mathematical thinking. The GCSE maths standard is crap. It even has maths mistakes in the specification. Algebra, trigonometry and most of geometry are neither necessary nor desirable for most learners. The vast majority of people in the workplace and in life do NOT need even GCSE maths. We have calculators; some basic number skills and functional maths will suffice. Even worse is the idea that you need GCSE maths in every apprenticeship and vocational qualification. This is patently false and is likely to result in massive and unnecessary failure rates.
Conclusion
The myopia produced by single-subject advocates so often descends into ugly zealotry. I have only ever defriended two people on Facebook, and one was a persistent grammar and spelling nut who confused typos with stupidity. She never said anything remotely interesting on any subject, other than policing posts for spelling and grammar errors, and even then she was often wrong, confusing Latin rules with English ones. I see the same behaviour in the single-subject nuts, who want to funnel us all into a small set of subjects. Rather than look at the breadth of subjects and open up young minds to many possibilities, they want to close those young minds down into human calculators. The common denominator here (to get all mathsy for a moment) is their desire to get the rest of the world to think like them. What THEY learnt, everyone should learn. I’ve met enough of them to know how catastrophic this would be.

PS
To show that I’m not against maths and computer science per se, check out Citizen Maths and OCR's Computer Science MOOC, two projects I'm proud to have helped get started.
