I had some time to reflect on the presentations and conversations at the 1st International Learning Analytics and Knowledge Conference in Banff. There were many thought-provoking presentations that gave an inkling of the direction people interested in education, or in educational technology, might take analytics in the coming years. One of the figures I found most interesting came from Abelardo Pardo, who worked with Carlos Delgado Kloos on a 'virtual machine' that they used in their research. They asked their students to install it on their regular computers and use it for their course work, while Abelardo and Carlos tracked student activity. When they looked at the browsing behavior, only 28.51% of the pages accessed for learning activities were LMS-related; all the other pages accessed lay outside this institutionally controlled environment. There are of course likely to be contextual factors that influenced this behavior, but it still points clearly to students making only limited use of the institutional LMS for their learning, and to the conclusion that if analytics are to be meaningful, they will have to include student learning activities outside the LMS. Quite a few of the analytics presented at the conference were related to the institutional LMS. This raises the question: if students use the LMS for only such a limited part of their learning, and data on the rest of their learning is not collected, what is the relevance and value of running analytics on the LMS environment alone?
Some presentations showed that analytics might be used to streamline the processes taking place in educational institutions and enhance their effectiveness in four ways:
1. To support the administration.
2. To adapt learning support services to make up for deficiencies in student performance.
3. To show learners their own analytics, so that they can reflect on their performance and perhaps adapt their learning and learning behavior.
4. To adapt teaching to analytics findings about student learning and learning behavior.
What do I think of these four?
1. Analytics to support the bureaucracy must always be a bad thing: analysis of data requires inputting of data, from which it follows that learners and educators will have to take on this added burden. There is enough evidence that the bureaucratization of the university is a negative rather than a positive development (Foucault, Readings, Delanty).
2. I like the idea that analytics might make it possible for student support services to be better matched to student needs, but coming from a background in adult education and widening access to Higher Education, I have seen my fair share of problems with using a deficiency model to support learners. I feel more comfortable with
3. the analytics model promoted by Erik Duval, who runs analytics on student activities and shows the students the results. This seems more empowering for learners, as it invites them to reflect on their learning.
4. Analytics can also be run as a research tool, so that teaching staff gain a better understanding of the learner experience and of the problems learners come across, and can tailor their teaching accordingly. Caroline Haythornthwaite showed us some of her visualizations of communication and group formation, which highlighted the insights analytics might provide into the ties between learners in learning settings.
If analytics are run solely on LMS-related activity, and the 28.51% figure is in any way generalizable to other institutions, then of course these analytics will only tell roughly a quarter of the story. This means people will have to start using analytics outside the institution, on the network, as Helene Fournier and I have done here at NRC in Moncton. Carrying out analytics on networks is not easy, of course, as people access services in a distributed environment; the analytics would be most meaningful if these could somehow be linked, perhaps by using the same identifier across all of them. It would be cool, though, and could enhance the learning experience of self-directed learners, if they could quickly check whether they were meeting their learning goals through visualizations of their activity. That is one thing I have learned while engaging in analytics: visualisation clarifies activity pretty well.
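To make the linking idea concrete: if activity logs from the LMS and from outside services could be joined on a shared identifier, a figure like the LMS share of a learner's activity becomes a simple aggregation. A minimal sketch in Python (the event format, user IDs, and URLs are all invented for illustration, not from any actual system discussed at the conference):

```python
from collections import defaultdict

# Hypothetical activity records: (user_id, source, url), as might be
# collected from an LMS log and from external services that share one
# identifier for the same learner.
events = [
    ("alice", "lms", "https://lms.example.edu/course/1"),
    ("alice", "web", "https://en.wikipedia.org/wiki/Learning_analytics"),
    ("alice", "web", "https://example.org/peer-blog"),
    ("bob",   "lms", "https://lms.example.edu/quiz/3"),
    ("bob",   "web", "https://example.org/forum/123"),
]

def lms_share(events):
    """Per learner, the fraction of tracked page visits that hit the LMS."""
    totals = defaultdict(int)
    lms_hits = defaultdict(int)
    for user, source, _url in events:
        totals[user] += 1
        if source == "lms":
            lms_hits[user] += 1
    return {user: lms_hits[user] / totals[user] for user in totals}

# alice: 1 of 3 tracked visits via the LMS; bob: 1 of 2.
print(lms_share(events))
```

The hard part, of course, is not the aggregation but getting distributed services to emit events keyed to the same identity in the first place.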
Other developments related to linked data include the research and design of recommender systems for learning. Currently there are problems with testing these, as large data-sets are required to ensure reliability and consistency of results, as Katrien Verbert highlighted in her talk. Some other analytics-related systems are under development at the Open University in the UK, such as Cohere, a discourse argumentation tool that aims for depth in discussion, and iSpot, related to BBC nature programmes, which uses a novel ranking system.
A theme running through several of the presentations at the conference was the ethical dimension. Analytics is about human behavior, and there are of course some important ethical considerations around the collection of human data. This will be the subject of another post before long.
Learning Analytics is clearly a developing field and there is still a lot to learn for all involved! Thanks again George for bringing us all together in such a wonderful location.