Tagged: W22A9

Week 22 – Activity 9: Relating learning analytics to the concerns of educators and students

Timing: 2 hours

  • Read this paper, focusing particularly on Table 1 and Table 4.
  • The authors include tables that focus on why learning analytics are used. Reflect on the questions asked by teachers (Table 1) and the goals of learning analytics (Table 4).
  • Note in your learning journal or blog which of these questions can be considered:
    • a. data driven
    • b. pedagogy driven.

    Some, particularly those about the learning environment, do not fit easily in either category.

  • In the forum, post a comment containing suggestions for a third classifier, and comment on other people’s suggestions.


Orange: Table 1

Blue: Table 4 Learning analytics are supposed to

Red: Table 4 Educators are supposed to

Grey: Table 4 Students are supposed to


Data driven:

  • How do students like/rate/value specific learning offerings?
  • How difficult/easy is it to use the learning offering?
  • Why do students appreciate the learning offering?
  • When and for how long are students accessing specific learning offerings (during a day)?
  • How often do students use a learning environment (per week)?
  • Are there specific learning offerings that are NOT used at all?
  • By which properties can students be grouped?
  • Do native speakers have fewer problems with learning offerings than non-native speakers?
  • How many (percent of the) learning modules are students viewing?
  • track user activities
  • gather data of different systems
  • establish an early warning system
  • offer possibilities for (peer) comparison
  • find early indicators for success / poor marks / drop-out
  • monitor own activities / interactions / learning process
  • compare own behavior with the whole group / high performing students

Pedagogy driven:

  • How does the acceptance of specific learning offerings differ according to user properties (e.g. previous knowledge)?
  • Are students using specific learning materials (e.g. lecture recordings) in addition to, or instead of, attendance?
  • Will the access of specific learning offerings increase if lectures and exercises on the same topic are scheduled during the same week?
  • Which didactical activities facilitate continuous learning?
  • How do learning offerings have to be provided and combined with support to increase usage?
  • How do low-achieving students profit from continuous learning with e-tests compared to those who have not yet used the e-tests?
  • Is the performance in e-tests somehow related to exam grades?
  • capture the interaction of students with resources / the interactions among students
  • provide educators / students with feedback/information on students’ activities
  • draw the users’ attention to interesting correlations
  • provide decision support
  • monitor learning process / way of learning / students’ effort
  • explore student data / get to know students’ strategies
  • identify difficulties
  • discover patterns
  • draw conclusions about usefulness of certain learning materials and success factors
  • better understand effectiveness of learning environments
  • intervene / supervise / advise / assist
  • improve teaching / resources / environment
  • become aware
  • reflect / self-reflect
  • improve discussion participation / learning behavior / performance
  • become better learners
  • learn
Third classifier ideas:
  • Uncovering unknowns (i.e. discovering correlations).
  • Potentially highly subjective.