Could classroom technology & Big Data help us understand what great teaching looks like?

I wrote the post below a few months back but have been prompted to revisit it following the confluence of this story about Kevin Pietersen’s use of technology in cricket and Apple’s announced ambitions to improve human health through the big data set its smartwatches will capture. This was complemented by the discussion in the comments of another post between its author and Crispin Weston.

The question all this raises is: what is the potential for technology to tell us things we don’t yet understand about effective teaching and about how people learn?



During a recent visit to a school and a conversation with some Year 9s about their teachers’ use of technology to help them learn, I was struck by how infrequently we truly apply technology in education to the task of improvement.

None of the pupils could identify a time when technology had taken their learning (or their teachers’ teaching) beyond what was possible with traditional methods and, if I’m honest, they were a bit surprised that this was ‘a thing’ at all. An awkward silence descended on the group, who were clearly thinking that this wasn’t panning out as the ‘Get Out of P5 French Free’ card they’d been promised.

“Well, do any of you use tech outside of school to do things ‘better’?” I asked, flailing around wildly in the face of their stony indifference to the alleged transformative power of technology.

I wasn’t hopeful, but one girl, bless her, had about the best answer I’ve ever heard:

“I play cricket for Yorkshire U14s and we use an app with the coaches to improve our batting technique. They film us on the tablet and then compare our stance and grip with top professionals. That way I can see exactly where I’m going wrong and make small changes to what I do. It’s really helping me improve”.

What I love – and hate – most about what she said was that in her ‘real’ life, she could easily and fluently describe how well-applied, function-specific technology was improving her learning. In her school life, this was a totally alien concept. I’m not saying that this precise example isn’t replicated in PE lessons up and down the country – it is, and well done for that – but this form of analysis rarely makes it beyond the school gym.

My point is broader: as a sector, we lack any widespread awareness that one of the key affordances of technology is its ability to tell us things about ourselves we wouldn’t otherwise know.

Then-Minister for Skills and Enterprise @matthancockmp spoke about this at the launch of the @EdnFoundation‘s ‘Technology in Education – A System View’ report. He wondered why, as a profession, we don’t have access in education to the same tools for improvement which are part and parcel of others’ jobs. Drawing a parallel with doctors (admittedly the ‘go to’ comparator for evidence-led thinkers), the Minister asked why, in the age of big data and lesson-capture technologies, we don’t yet benefit from ‘lesson analytics’.

Lesson analytics, achieved with technologies such as Iris Connect or Star Lesson and an anonymised, aggregated approach to data analysis, could help teachers understand which parts of their lessons or teaching techniques were most successful, and adapt their practice as a result. Furthermore, if data about learning were routinely gathered in the same way that Apple’s watches will harvest millions of data points about each user each year, this might also be turned into useful information about what works. Of course, education lacks the mass-market hardware solution that the Apple Watch offers.

You’d also have to build in some kind of user feedback metric (e.g. learners self-reporting which parts they found the most useful) and correlate these data with harder measures (such as actual examination results achieved). You’d obviously want all this moderated by contextual data, so that extrapolations of ‘what works’ for certain pupils are valid and useful. It all sounds quite daunting, longitudinal and fraught with barriers, institutional as well as professional.
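For the technically minded, the core of ‘correlate self-report with harder measures’ is nothing more exotic than a correlation coefficient. Here is a minimal sketch in Python – every pupil rating and exam mark below is invented purely for illustration:

```python
# Toy sketch: do pupils' self-reported usefulness ratings for a teaching
# technique track the exam marks they actually achieve? All data invented.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Self-reported usefulness of weekly retrieval quizzes (1-5, one per pupil)
ratings = [2, 3, 3, 4, 4, 5, 5]
# The same pupils' end-of-year exam marks (%)
marks = [48, 55, 52, 61, 66, 70, 74]

r = pearson(ratings, marks)  # a value near +1 suggests a strong association
```

A real system would, as above, moderate this with contextual data and work at national scale; the point is only that the statistics involved are thoroughly well understood.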

Such an initiative would probably need to be led by government if it is to gain the momentum and acceptance it needs. I recently had the chance to talk to a senior mid-Atlantic type at a well-known search engine alternately renowned and maligned for its collection and use of data at the macro level. When I raised the possibility of their organisation getting involved in learning or lesson analytics, he visibly blanched at the thought of the negative press.

However, the pay-off would be worth it. It’s not that difficult to imagine a future UK where, once lesson analytics have been successfully systematised, teachers would routinely use the information delivered through a national dataset to pick the most effective methods for supporting their pupils, at a class and individual level. The current age of random pursuit of perceived ‘best practice’ would be a thing of the past and the profession would be, well, more professional – combining their judgement with what the evidence tells us.

It has always been our ethical duty to try and improve teaching, and now the tools to do this reliably and repeatably exist, albeit in a raw state. Isn’t it time we did something about this?




  1. Hi Dominic,

    Many thanks for the excellent post. In my opinion, you are spot on.

    Your story about the cricket reminds me of a presentation I saw by Ron Dennis of McLaren on the work they were doing for the British Olympic bobsleigh team, using the sort of data analytics they had used on their F1 team. It was about looking for very sensitive correlations between slight muscle movements and the speed of the bobsleigh at different points in the course, leading to very precise training of “muscle memory”. These techniques took the UK team from the bottom of the Olympic table to the top. Cognitive learning is a bit different of course – it is very heavily networked in a way that I suspect muscle memory is not – but there are still important things to learn from the sportsmen.

    I have been trying to make this argument for three years on my blog, but one of the problems is that most of the ICT community that exists as the afterglow of all the money that Becta spent has been against the argument that we are both making. They believe that technology is all about empowering independent learners, addressing 21st-century skills and the wisdom of the crowd – and not about the technology, which is easy to overlook because it doesn’t exist.

    The ICT community’s latest contribution has been the ETAG report, commissioned originally by Matt Hancock and now landed on the desks of Nick Boles in BIS and Nicky Morgan in the DfE. The great redeeming feature of the ETAG report is that it is *so* bad that it is now clear to everyone that we need to go back to square one and look at this problem afresh. I think we need a really thorough discussion that can start at first principles and step through all the way to making precise, pragmatic recommendations to government, standing on its own feet all the way and not leaning on any existing flaky consensus.

    I am doing some work now on some basic first steps to setting up such a process. I will let you know how I get on and hope that you (and anyone else reading your post) might be interested in getting involved. But without prejudging what such a process might produce, here are some thoughts.

    One problem (of many) with data analytics is that there isn’t any data. And the data needs to come from somewhere. In supermarkets, it’s the barcode reader. In sports training, it is sensors or – a little more labour-intensive – video. In education, I think it has to be digitally mediated learning activities that capture and assess the student’s responses. But activity software is hard to develop (it is generally beyond the capability of the OER community to produce) and most digital courseware is poor. There is an interesting new report on courseware in HE in the US which examines some of the reasons for this.
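To make the ‘where does the data come from’ point concrete, here is a hedged sketch of the kind of record a digitally mediated activity could capture – every field name, identifier and the anonymisation scheme is invented for illustration:

```python
# Minimal sketch of logging assessed pupil responses, anonymised at source.
import hashlib
from dataclasses import dataclass

@dataclass
class ResponseEvent:
    pupil: str      # anonymised pupil identifier, never the real name
    item: str       # question / activity identifier
    answer: str     # what the pupil actually entered
    correct: bool   # the activity software's automatic assessment

def anonymise(name: str) -> str:
    """One-way hash so aggregated analysis never sees real names."""
    return hashlib.sha256(name.encode()).hexdigest()[:12]

log = [
    ResponseEvent(anonymise("pupil-001"), "fractions-q3", "3/4", True),
    ResponseEvent(anonymise("pupil-002"), "fractions-q3", "4/3", False),
    ResponseEvent(anonymise("pupil-003"), "fractions-q3", "3/4", True),
]

def facility(events, item):
    """Proportion of pupils answering a given item correctly."""
    attempts = [e for e in events if e.item == item]
    return sum(e.correct for e in attempts) / len(attempts)
```

Once responses are captured in this shape, the analytics (which items are hard, which teaching approaches preceded success) is aggregation over the log rather than new data collection.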

    If you see courseware as the digital equivalent of textbooks then it is also a problem that, according to figures produced by Harry Webb, only between 4% and 10% of English teachers use textbooks, compared to 68%-70% in Singapore and 94%-95% in Finland. So there is a culture shift that needs to occur in British schools. I think the problem is that too many teachers see their job as deciding what values to impart (meaning that they have to retain control of the Banda machine), and not so much as how to teach students effectively against objectives which are largely set by others.

    A third problem, I think, is the lack of good reviews. The fracas (the word of the moment) over James Theobald’s book review in Schools Week suggests to me that the whole concept of the critical review is somewhat alien to teachers. Maybe this is not surprising when you find that it is generally just as alien to academic educationalists, even though you would have thought that conducting reasoned, evidence-based debates was what they drew their salaries for. And when you look for critical reviews of textbooks (let alone non-existent digital courseware), there just don’t seem to be any at all.

    Fourth is a problem with interoperability. When the data has been captured by an instructional activity, it needs to be passed to different software to do the analytics, the assignment management and sequencing, and creative products passed to portfolio tools to manage and showcase. And all this has to be automated so that it imposes no extra work on teachers. That is why I have long been obsessed with the technical problem of open standards for data interoperability. It is as fundamental to the ed-tech market as the three-pin plug is to the market for electrical appliances – but although a small group of us have been banging this drum for 20 years, no-one in government has been listening. I exaggerate slightly: Charles Clarke listened in 2002, but then moved on; and Becta listened in 2007, when I referred them to the European Commission for screwing up the 2006 Learning Platform procurement – but they were always reluctant collaborators and managed to procrastinate until they were closed in 2010.

    I am just in the process of setting up a W3C Community Group which will work up a specification for modelling the different data formats that are needed to support innovation in software development. The group is now up and running, but I wouldn’t encourage anyone on this list to join unless they are really interested in modelling data in XML. The job of the teaching community is not to write the specifications but to consider how to generate the demand that will incentivise suppliers to write the specifications, and the software that implements them, in tandem. From the teachers’ point of view, the whole process should be “under the hood”.
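As a flavour of what ‘under the hood’ interoperability can look like, the xAPI (‘Tin Can’) specification – one real, existing standard for exchanging learning-activity data – expresses events as actor-verb-object statements that any compliant tool can consume. The specific pupil, activity and addresses below are invented:

```python
# A single xAPI-style statement: "a pupil answered activity fractions-q3,
# successfully". Standardised shape means any analytics tool can read it.
import json

statement = {
    "actor": {"mbox": "mailto:pupil@example.org", "objectType": "Agent"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-GB": "answered"},
    },
    "object": {
        "id": "http://example.org/activities/fractions-q3",
        "objectType": "Activity",
    },
    "result": {"success": True},
}

payload = json.dumps(statement)  # what one system would send to another
```

This is exactly the three-pin-plug idea from the comment above: the value lies not in any one statement but in every supplier agreeing on the socket.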

    So I think the main requirement for addressing the issues you raise is the consultation: ETAG2, if you like. It would be good if you could follow my blog, as that is one place I will make any announcement. Or get in touch on Twitter if you would rather email.

    And thanks for writing this post – I think it is both timely and very helpful.

    Best, Crispin.

    • Thanks for this Crispin – readers should follow your blog (see link in comment)

  2. There has been some excellent research done using technology and data, but it doesn’t always make a splash in the wider press. You have to go searching for research to find it. Here are a couple of examples:

    It has been discovered that understanding of fractions and the division algorithm in Grade 5 is predictive of a student’s success in Algebra in Grade 8. These two skills – fractions and division – require a conceptual understanding of numbers that parallels the type of thinking required in Algebra.

    It has also been discovered that a basic sense of number – counting, knowing more/less – in preschool children is predictive of their success in math. Being able to compute basic addition and subtraction facts in Grade 1 is also predictive of future success.

    These discoveries should set the priorities for our teaching.



