Category: Week 22

Week 22 – Activity 12: Checkpoint analytics and process analytics

Timing: 3 hours


Imagine your tutor group has been selected as an advisory group on learning analytics. The group has been asked to identify priorities for implementing learning analytics on the H817 module. It has been allocated funding and technical expertise to add one or two analytics to each week of the module, or to add a set of analytics that runs across one or more blocks.

As a member of the advisory group, you are responsible for analysing one of the four blocks on the module.

  • Select one of the H817 blocks:
    1. What is innovation?
    2. Open education
    3. Learning Design Studio
    4. Learning analytics
  • Post in the tutor group forum to let others know you are working on it. Ideally, each block should be worked on by at least one person in the group.
  • Look through the learning outcomes and learning activities in the block, noting where/why checkpoint analytics might be used and where/why process analytics might be used. You will be writing from an informed perspective, with an eye to what you would have found useful as a learner. Who would benefit from the analytics you propose, and what might be the impact of those analytics?
  • Select the analytics you would prioritise, and share your conclusions in the forum, together with some notes about why you have made your choice. If your decision is that no analytics could usefully be added to the block, note why you have come to that decision.
  • Read through other people’s postings and comment where appropriate. Reflect on whether people have favoured checkpoint analytics or process analytics, and why.
  • In your learning journal or blog use your conclusion and those of other group members to help you set out some recommendations for implementing learning analytics on H817. You may choose to focus on just one block, or to incorporate other people’s comments and deal with the module as a whole.
  • Share your main recommendations in the forum.


Block 3


Weeks | Learning outcomes | Related activities | Proposed analytics

14–15 | Have an understanding of the Learning Design Studio model | Read W14 and W15 overview | Have students read the page.
14–15 | Have an understanding of the block structure, practices and norms | Read W14 and W15 overview | Have students read the page.
14–15 | Have set up the project and learning environments for the block | Activity 3: Role allocation; Activity 4: Set up your team website | Forum posts indicating the website has been set up and roles have been defined; signals of changes on the website; website movement tracking to gauge ease of activity.
14–15 | Have an understanding of the importance of context in Learning Design | Activity 5: Create and share personas | Website page-editing signal.
14–15 | Be able to articulate a Learning Design challenge, in the form of a desired change in a given context | Activity 6: Listing factors and concerns; Activity 7: List the forces; Activity 9: Agreeing on the challenge | Website page-editing signal.
16–17 | Review documented case studies of similar innovations | Activity 10: Review a case study or a theoretical framework | Webpage update; discussion analysis (SNAPP) – forum.
16–17 | Identify suitable theoretical frameworks and derive design principles from them | Activity 10: Review a case study or a theoretical framework | Webpage update; discussion analysis (SNAPP) – forum.
16–17 | Analyse selected case studies, reformulate them as design narratives, and elicit design patterns from these | Activity 11: Elicit design patterns and principles | Webpage update; discussion analysis (SNAPP) – forum.
16–17 | Use the representational forms of design narratives, patterns and principles to inform your work and share design knowledge | Activity 12: Conceptualise | Forum update; discussion analysis (SNAPP) – forum.
16–17 | Use the representational form of a storyboard as a means of articulating design proposals | Activity 13: Creating storyboards | Storyboard created on webpage; webpage changes (by whom, when, how frequently).
16–17 | Use design narratives, patterns and principles to propose a storyboard for a possible solution to a design challenge | Activity 13: Creating storyboards (inspection) | Storyboard updated on webpage; keyword checking, i.e. does the storyboard use the design patterns identified previously?
18–19 | Define a prototype in the context of a techno-pedagogical innovation | Activity 14: Extract features from your storyboard; Activity 15: Select features to include in the prototype | Forum update; discussion analysis (SNAPP) – forum.
18–19 | Design and construct a prototype | Activity 16: Implement the prototype | Prototype page update.
18–19 | List the advantages and limitations of constructing a prototype | Activity 17: Walk through | Discussion analysis (SNAPP) – forum.
18–19 | Design a heuristic evaluation | Activity 18: Define the heuristic evaluation protocol for your prototype | Heuristic page update; comparison of the starting-point SNAPP analysis against various later points. **

* Bold indicates prioritised analytics.

CA = checkpoint analytics

PA = process analytics

General trend: first the CA delivers a signal to the tutor; then additional information about the progress of the activity is fed to them.

Challenges: a fair degree of freedom was given to students regarding their selection of tools, which would make the signals difficult to monitor. However, some central hubs such as the forums, OU Live and website updates would generally have been used, and these could serve as a ‘best possible’ location for LA to be integrated. Changing the course to require that certain tools are used would also help.

** For comparison of overall role changes. Could be useful in identifying groups that haven’t changed structure and whether overall work speed remained steady. This could be explored further, for example to test whether groups that maintained their structure and had predefined roles completed tasks in a more structured manner.
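The webpage-change signals in the table (who edited, when, and how frequently) could be derived from a simple edit log. A minimal sketch, assuming a hypothetical log format exported from the team website's revision history and a made-up "quiet" threshold (neither is specified by the module):

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical edit log: (group, editor, timestamp) tuples.
# The format is an assumption for illustration.
edits = [
    ("group-a", "sam",   datetime(2024, 3, 1, 10, 0)),
    ("group-a", "priya", datetime(2024, 3, 2, 14, 0)),
    ("group-b", "lee",   datetime(2024, 2, 20, 9, 0)),
]

def edit_summary(edits, now, quiet_after=timedelta(days=7)):
    """Per-group edit count, editor spread, last change, and a 'quiet' flag."""
    summary = {}
    for group, editor, when in edits:
        g = summary.setdefault(group, {"count": 0, "editors": Counter(), "last": when})
        g["count"] += 1
        g["editors"][editor] += 1
        g["last"] = max(g["last"], when)
    for g in summary.values():
        # No changes for a while: a checkpoint-style alert for the tutor.
        g["quiet"] = (now - g["last"]) > quiet_after
    return summary

report = edit_summary(edits, now=datetime(2024, 3, 5))
print(report["group-b"]["quiet"])  # True: group-b has gone quiet
```

The checkpoint signal here is simply "has the group edited recently?"; the per-editor counts add a process-style view of who is doing the work.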


Week 22 – Activity 11: Learning analytics and learning design

Timing: 2 hours

  • Begin by reading:
  • The authors identify two broad types of analytic – checkpoint analytics and process analytics. Make notes in your learning journal or blog about these types of analytic and when they can be applied.
  • Consider whether these classifications are more useful than the ones you have considered at other points this week.


What are checkpoint analytics:

A “snapshot of data that indicate a student has met the prerequisites for learning by accessing the relevant resources of the learning design.”

“Its value lies in providing teachers with broad insight into whether or not students have accessed prerequisites for learning and/or are progressing through the planned learning sequences (akin to attendance in a face-to-face class).”

When can they be applied:

Log-ins to the online course site, downloads of a file for reading, or signing up to a group for a collaborative assignment.

(pg. 1448)
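Checkpoint events like these reduce to per-student flags. A minimal Python sketch of how a tutor-facing checkpoint report might be built from an event log (the event names and log format are assumptions, not part of the paper):

```python
# Checkpoint analytics: has each student met the access prerequisites?
# The event-log format and action names are hypothetical illustrations.
events = [
    {"student": "s1", "action": "login"},
    {"student": "s1", "action": "download"},  # e.g. the week's reading
    {"student": "s2", "action": "login"},
]

CHECKPOINTS = ("login", "download", "join_group")

def checkpoint_report(events, students):
    """For each student, which checkpoint actions have been seen at least once."""
    seen = {s: set() for s in students}
    for e in events:
        if e["action"] in CHECKPOINTS and e["student"] in seen:
            seen[e["student"]].add(e["action"])
    return {s: {c: c in seen[s] for c in CHECKPOINTS} for s in students}

report = checkpoint_report(events, ["s1", "s2"])
# s2 has logged in but not yet downloaded the reading or joined a group,
# which is exactly the "attendance-like" signal a tutor would act on.
```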

What are process analytics:

“These data and analysis provide direct insight into learner information processing and knowledge application…within the tasks that the student completes as part of a learning design.”

When can they be applied:

“…social network analysis of student discussion activity on a discussion task” which could identify “level of engagement on a topic, his or her established peer relationships, and therefore potential support structures.”

(pg. 1448)
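The social network analysis quoted above can be sketched as a reply graph: who replies to whom in a discussion task. Counting replies made and received then hints at engagement levels and peer-support structures. A minimal Python sketch (the post data are invented):

```python
from collections import Counter

# Invented forum posts as (author, replied_to) pairs; replied_to is None
# for a thread-starting post.
posts = [
    ("ana", None), ("ben", "ana"), ("ana", "ben"),
    ("cem", "ana"), ("dia", None),
]

def engagement(posts):
    """Replies made (out-degree) and received (in-degree) per author."""
    out_deg, in_deg, authors = Counter(), Counter(), set()
    for author, target in posts:
        authors.add(author)
        if target is not None:
            out_deg[author] += 1
            in_deg[target] += 1
    # Students with no replies either way may lack peer support structures.
    isolated = sorted(a for a in authors if out_deg[a] == 0 and in_deg[a] == 0)
    return out_deg, in_deg, isolated

out_deg, in_deg, isolated = engagement(posts)
print(isolated)  # ['dia']: posted, but unconnected to any peer
```

Tools such as SNAPP automate exactly this kind of reply-network extraction from forum data.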

*Are these classifications more useful than the ones you have considered at other points this week:*

They could work in combination with the other categories (educators are supposed to; students are supposed to; uncovering unknowns; potentially highly subjective) to provide a matrix for understanding LA locations, uses and intentions.

Week 22 – Activity 10: Learning analytics in the library

Timing: 2 hours

  • Explore the Library Impact Data Project blog.
  • Make a list of the different types of data collected in university libraries. These range from swipe card data about student visitors to usage figures for specific websites.

1. number of items borrowed from library (excluding renewals)

2. number of visits to the library

3. number of logins to e-resources

4. which sites were accessed

5. most popular resources

6. time spent browsing (from entry to checkout)

  • Note in your learning journal or blog five ways in which these datasets might be used to support analytics that could lead to the improvement of learning and/or teaching.

(This refers to both the physical library and the university’s online library.)

1. Students’ grades and participation vs. books checked out (physical library)

2. Students’ grades and participation vs. e-resource logins (online library)

3. Time spent in the library vs. contributions to online learning spaces (i.e. is the library an indicator of effort)

4. Resources accessed vs. those chosen as references (student ability to find relevant information). Can assist with embodied learning.

5. Time spent on downloaded resources – i.e. did students download them and read them off screen; were there distractions (were users using social networks etc.) whilst browsing resources?

  • Visit the Library Analytics and Metrics (LAMP) project blog and read the guest post, ‘So what do we mean when we say “analytics”?’ (Showers, 2014). This includes 49 user stories (presented in images of two tables) that identify ways in which library analytics can be used. Does the list include the five uses of data that you noted in your blog/learning journal at the start of this activity?

Not really – there is little discussion of cross-analytics comparison (i.e. triangulation) to gather really valuable usage data, apart from some points such as:

– Merge data from the student registry

– Map e-resource usage events to actual users

  • Refer back to the definition of learning analytics you developed last week. Could all these analytics be classified as learning analytics, or just the ones that the table in the blog categorises as ‘T&L’? Or do they belong to some other subset?

More than just T&L. For example, data analysis could become learning analytics if delivered (in a meaningful way) to students or educators. Recommendation analytics could also be seen as forming part of adaptive teaching at a lower level. Overall, it is difficult to say they do or don’t, as on their own many of the analytics might not be useful (partly due to a lack of explanation in the table). However, if the data is triangulated (those labelled as ‘data’ speak to this point) then they might well have the characteristics of LA.

Week 22 – Activity 9: Relating learning analytics to the concerns of educators and students

Timing: 2 hours

  • Read this paper, focusing particularly on Table 1 and Table 4.
  • The authors include tables that focus on why learning analytics are used. Reflect on the questions asked by teachers (Table 1) and the goals of learning analytics (Table 4).
  • Note in your learning journal or blog which of these questions can be considered:
    • a. data driven
    • b. pedagogy driven.

    Some, particularly those about the learning environment, do not fit easily in either category.

  • In the forum, post a comment containing suggestions for a third classifier, and comment on other people’s suggestions.


Orange: Table 1

Blue: Table 4 – Learning analytics are supposed to

Red: Table 4 – Educators are supposed to

Grey: Table 4 – Students are supposed to


Data driven:

  • How do students like/rate/value specific learning offerings?
  • How difficult/easy is it to use the learning offering?
  • Why do students appreciate the learning offering?
  • When and for how long are students accessing specific learning offerings (during a day)?
  • How often do students use a learning environment (per week)?
  • Are there specific learning offerings that are NOT used at all?
  • By which properties can students be grouped?
  • Do native speakers have fewer problems with learning offerings than non-native speakers?
  • How many (percent of the) learning modules are students viewing?
  • track user activities
  • gather data of different systems
  • establish an early warning system
  • offer possibilities for (peer) comparison
  • find early indicators for success / poor marks / drop-out
  • monitor own activities / interactions / learning process
  • compare own behavior with the whole group / high performing students

Pedagogy driven:

  • How is the acceptance of specific learning offerings differing according to user properties (e.g. previous knowledge)?
  • Are students using specific learning materials (e.g. lecture recordings) in addition or alternatively to attendance?
  • Will the access of specific learning offerings increase if lectures and exercises on the same topic are scheduled during the same week?
  • Which didactical activities facilitate continuous learning?
  • How do learning offerings have to be provided and combined with support to increase usage?
  • How do low-achieving students profit by continuous learning with e-tests compared to those who have not yet used the e-tests?
  • Is the performance in e-tests somehow related to exam grades?
  • capture the interaction of students with resources / the interactions among students
  • provide educators / students with feedback/information on students’ activities
  • draw the user’s attention to interesting correlations
  • provide decision support
  • monitor learning process / way of learning / students’ effort
  • explore student data / get to know students’ strategies
  • identify difficulties
  • discover patterns
  • draw conclusions about usefulness of certain learning materials and success factors
  • better understand effectiveness of learning environments
  • intervene / supervise / advise / assist
  • improve teaching / resources / environment
  • become aware
  • reflect / self-reflect
  • improve discussion participation / learning behavior / performance
  • become better learners
  • learn

Third classifier ideas:
  • Uncovering unknowns (i.e. discovering correlations).
  • Potentially highly subjective


Week 22 – Activity 8: Analytics and pedagogy

Timing: 3 hours

  • Return to the ‘Innovating Pedagogy’ report that you read earlier in the module (Week 5, Activity 14).

    The report identifies ten types of pedagogy that might be used to transform education in the future.

  • Make some initial notes in your learning journal or blog about how three of these pedagogies could be supported by learning analytics. (When considering which three pedagogies to work with, please exclude emotional analytics as this is a form of learning analytics and we are looking for pedagogies that could potentially be supported by learning analytics: not those that embody it already.) You may need to refer back to the definition of learning analytics that you developed last week.
  • In your notes consider the following questions:
    1. What are these pedagogies trying to achieve?
    2. What might learners and teachers need support with?
    3. What sorts of data will be available?

    There is no need to make reference to existing analytic tools or systems. Decide which types of analytic would be valuable even if, as far as you know, they do not exist yet.

  • In your tutor forum, or in OU Live, share your conclusions with your colleagues.
  • Within the group, aim to identify ways in which each of the nine pedagogies could be supported by learning analytics. If the group finds it difficult to identify analytics for one or more of the pedagogies, suggest why this is the case.


Pedagogy 1: Adaptive teaching

  1. What is this pedagogy trying to achieve?
    • Using analytics to change teaching to match individual learning speed and understanding.
  2. What might learners and teachers need support with?
    • Identifying the key analytics to measure performance across different tasks. I.e. how thorough a student’s understanding is.
  3. What sorts of data will be available?
    • Test results; time spent on tasks; user contribution to forums; PLE analytics (i.e. outside learning); speed of answering; communication with others to work out solutions; feedback (qualitative) to support a learner’s perspective.

Pedagogy 2: Context-based learning

  1. What is this pedagogy trying to achieve?
    • To use the environment and context students find themselves in to benefit their learning. Rather than neutralising context for students, it brings in outside experiences and real-life scenarios, exposing students to more situations with the aim of learning from the world around them.
  2. What might learners and teachers need support with?
    • Augmented reality; digital worlds; VR; crowd learning.
  3. What sorts of data will be available?
    • location data; time spent using different tools; interactions between students in digital worlds; what tools are not used (or aspects within tools); tracking students general activity patterns (i.e. VR > group chat > task).

Pedagogy 3: Embodied learning

  1. What is this pedagogy trying to achieve?
    • How do the body and our habits and actions shape learning? It concerns how the body shapes and conditions our cognitive learning, and the physical process of learning, i.e. what steps are taken to learn a task.
  2. What might learners and teachers need support with?
    • Using physical tracking devices (i.e. movement-sensor devices such as smartwatches or Fitbits); analysing movements as they relate to success or challenge; monitoring students’ learning processes through gestures, emotions and eye tracking.
  3. What sorts of data will be available?
    • Sensory data (movement); relations between movements/actions and results; time spent on certain movements (i.e. switching applications; eye movements between question and answer); movements that are difficult for students to master; emotion-tracking data (sensing frustration through webcam or heart rate).

Week 22 – Activity 7: Google Analytics

Timing: 2 hours

  • Browse the Google Analytics website, using the Learn more links to explore beyond the Overview article.
  • Note the various types of data that are collected, and the reports that can be generated.
  • If you have the opportunity, investigate whether Google Analytics are used in your workplace and, if so, what they are used for.
  • Create a table in your learning journal with four columns:
    • Learner
    • Educator
    • Administrator
    • Other.
  • Decide which groups of people could benefit from some of the different types of data reported by Google Analytics, and make a note in the appropriate column(s).
  • Write a reflective piece in your learning journal or blog about how these data could be used to generate learning analytics, together with any problems that might be associated with their use in education.
  • Make reference to any occasions on which you have been presented with or had access to any of these data during your own learning and teaching, and share these experiences in the forum or in OU Live if they are not private or business sensitive.