Whether we like it or not, we’re swimming in an ocean of data, much of which is our own. We routinely give our data away, not just through obvious means such as Google searches or sharing our photos on Facebook, but through much more subtle means: the games we play and how we play them, the messages we send each other, the goods we buy and the decisions we make while buying them.
Sometimes it feels like the systems “out there” know us better than we know ourselves – our whims, strengths, weaknesses and dreams. And this last point is critical from a training point of view: a classic blunder for trainers used to be to take the views of learners and trainees as accurate, when in fact learners’ understanding of their own needs was often limited at best. We are biased to believe that we are correct even when we are not, and that our world view is naïvely realistic, when it is a construct of our unique experience and perspective. How, then, do we get to conscious incompetence, where we recognise our learning gaps better? Can technology help us go beyond our human failings?
Two Ufi projects illustrate the benefits of using AI to profile learners and assess their needs – their “start and end states”. Game Academy uses profile data provided by gamers to identify each individual’s career-relevant skills. The gamers don’t have to do anything other than keep playing. The system works out, invisibly in the background, their levels of, for example, team working, collaboration and problem solving. It then identifies the gaps they need to fill and provides guidance on how to fill them. Fluence’s “Passive Accreditation” system takes a wide range of data inputs generated by prisoners – including hand-written submissions, prison staff documentation and reviews – and generates individual learning plans and assessments. Even in such a complex, secure environment, where individual needs are diverse and difficult to ascertain, it has been shown to cut the time prison staff spend producing such assessments to a fraction of what it was, while drastically improving accuracy for each individual prisoner.
Of course, a range of complex ethical issues surround this. It’s fine to use AI to target an advertisement for a new brand of socks at a customer.