This week I have been reading a report by the cross-party Science and Technology Committee of the British parliament. It focused on the exponential growth of information technology and artificial intelligence (AI), and the lack of preparedness for them in British schools. The committee expressed concern that educators (in the UK at least) are not preparing young people for tomorrow's world and that something needs to be done about it.
I'm not convinced. If there is one thing the IT revolution of the past 40 or so years has shown us, it is that mavericks and unconventional thinkers have been the pioneers. I am reminded that although Bill Gates was accepted for a place at Harvard, he never actually graduated. I also treasure the memory of a review I read in the press of the original iPad: the writer stated that, as it was neither a laptop computer nor a mobile phone, he could see no future for the device.
A while ago, a friend working at a Russell Group university in England told me that a professorship in Computing had remained unfilled for the best part of a year. When I asked why, my friend said that suitable candidates could earn far more in the private sector. Much more telling, however, was his observation that the world of IT was changing so fast that virtually any undergraduate or even Master's course became obsolete within a few years of its inception. If that is the case at university level, what hope is there for meaningful education at secondary school?
The cost of IT is falling all the time, particularly when we consider how much more powerful such devices have become year on year. An industry pundit recently remarked that if the huge fall in the real-terms price of personal computers were applied to the cost of buying a new car, a top-of-the-range Mercedes would now sell for just a few dollars. Anyone who has tried to sell a second-hand computer will know how true this is: you can hardly give them away.
Back in 1987, when I embarked on a Master's degree, I bought my first computer. It cost $1,500 and could do things that any basic mobile phone today performs with ease. More revealing is that a year or so later I took it back to the dealer to have a hard drive fitted. The technician told me that the new disk had the incredible storage capacity of 30 megabytes, and he confidently stated that in my lifetime I would never fill it. It is easy to laugh now, but it gives us an idea of the rapidity of change and development in the IT world.
So, where does that leave educators in places such as KYUEM? We dropped A2 Computing from our curriculum a couple of years ago because western universities were no longer regarding it as seriously as other subjects. Yet despite this (or maybe even because of it), students are using IT more and more. Every year we see our young people taking part in (and frequently winning) external computing competitions, and because the technology comes so naturally to them, it is we, the teachers, who struggle to keep up.

But there is another lesson to be learned here: Bill Gates and co. made their breakthroughs despite their education, not because of it. We need to encourage critical and creative thinking in tomorrow's entrepreneurs and leaders. This is something that cannot be taught; it must be nurtured and allowed to grow of itself. The phrase "thinking outside the box" may have become a cliché, but that doesn't mean it has lost its validity. Although students here must follow the AS and A2 syllabus requirements to gain good grades, we constantly ask them to think for themselves: to question why things are as they are. In this way, independent thought and action are allowed to flourish, and that is a skill that will last a lifetime.