Of course, the world changed with social media (SoMe). No longer were we thinking about a central portal that everyone came to; instead, the service became the medium for distributed communication, with distributed cores. Each core, each node, generates intelligence, transmitting and amplifying information. We have started leaving data trails across various Web 2.0 and Web 3.0 services.
It is like marking a path with evidence of where you have been and what you have done. It opens up rich analytic and predictive possibilities for software systems, and far greater interoperability for data exchange between systems using open APIs.
George Siemens talks about these trails in depth, and he has started a Learning Analytics Google Group.
Whereas the first generation of learning analytics featured analysis of base data (page visits, time spent, forum interactions, tool usage, basic user-to-user interactions and so on), the second generation of learning analytics focuses on extracting data from lifestreams. George Siemens describes LA as going beyond web analytics and educational data mining.
Learning analytics is broader, however, in that it is concerned not only with analytics but also with action, curriculum mapping, personalization and adaptation, prediction, intervention, and competency determination.
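To make the contrast concrete, here is a minimal sketch of what first-generation analytics amounted to: rolling raw LMS events up into per-learner counters. The event records and field names here are hypothetical, invented for illustration; they are not from any particular LMS.

```python
from collections import defaultdict

# Hypothetical first-generation LMS event log: (learner, event_type, value).
events = [
    ("alice", "page_visit", 1),
    ("alice", "time_spent_min", 12),
    ("bob", "page_visit", 1),
    ("alice", "page_visit", 1),
    ("bob", "forum_post", 1),
]

def aggregate(events):
    """Roll base events up into simple per-learner totals -- the kind of
    report first-generation learning analytics produced."""
    totals = defaultdict(lambda: defaultdict(int))
    for learner, event_type, value in events:
        totals[learner][event_type] += value
    return {learner: dict(counts) for learner, counts in totals.items()}

print(aggregate(events))
```

Second-generation analytics differs precisely in that it does not stop at counters like these, but links such events to lifestreams, profiles and semantics.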
In George’s vision of the LA process, there are two broad sources of data. One is learner data that we collect through lifestreaming, LMSs and PLEs. The other is contextual or intelligent data: curriculum, linked data and semantic data. While learner data helps us build “profiles”, learner and intelligent data together feed forward, through data trails, into analyses of various types, such as social network analysis (SNA) and Signals. This builds the “basis for prediction, intervention, personalization, and adaptation”.
George’s vision is that LA will be transformative through systems that analyze who the learner is, what her skills are and how they measure up against the state of the art, given a context. This marks a big transition from pre-designed curricula to “a real-time rendering of learning resources and social suggestions based on the profile of a learner, her conceptual understanding of a subject, and her previous experience.” It also changes how we think about competency and performance.
Responding to a question on the Learning Analytics Google Group about scalability and the division between automated analytics and human interventions, George acknowledges that we would need to harmonize the technical and social dimensions of learning and analytics, and to understand better what technology can do and what human interventions can do.
David Wiley also correctly points out that the cost and difficulty of aggregating lifestream information have gone down considerably. At the same time, he worries that learning analytics may become a sort of Behaviorism 2.0, and suggests that:
It seems absolutely critical to me that the results of LA can provide only a portion of the data necessary for making decisions, and that it must be a human with more subtle meaning-making capabilities that ultimately acts on the data coming out of LA.
George also points out nine different dimensions, including:
- Learning Analytics version 1, where traditional analytics are used
- Web Analytics with SNA
- Distributed Network analysis – going from a single tool or platform to analysis across tools and platforms
- Social Integration: where content and connections get semantically linked to lifestreams
- Semantic/Linked Data leverage
- Knowledge analytics: some way to describe the current versus the expected state of learning and knowledge. I talked about Connection Holes from the Connectivist standpoint. Simply put, if learning is the process of making connections, learning is deficient, or has holes, if the right connections are not made.
- Holistic physical/virtual world analytics
- Lastly, a full integration of “what we do on a daily basis”
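The Connection Holes idea from the knowledge-analytics dimension above can be given one very literal reading: model the subject as a concept map with prerequisites, and a “hole” is a concept the learner could connect right now but has not. The concept map and mastered set below are hypothetical toy data, and this is only one possible formalization of the idea, not George’s or mine in any finished form.

```python
# Hypothetical concept map: concept -> prerequisite concepts that should
# already be connected before this one can be.
prereqs = {
    "variables": set(),
    "loops": {"variables"},
    "functions": {"variables"},
    "recursion": {"functions"},
}

def connection_holes(prereqs, mastered):
    """Concepts within reach (all prerequisites mastered) that the learner
    has not yet connected -- a literal reading of 'holes' in the network."""
    return {concept for concept, pre in prereqs.items()
            if concept not in mastered and pre <= mastered}

print(connection_holes(prereqs, {"variables", "loops"}))
# -> {'functions'}
```

A system with such a map could feed the holes straight into the intervention and personalization loop described earlier, suggesting resources that close exactly the missing connections.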