Category: Complete

Week 25 – Activity 26: Thinking of the future

Timing: Less than 1 hour

  • Look at this Tumblr website that was set up to support reflection on learning analytics:

    Click the arrows at the top right of each picture to view the accompanying text. All the pictures on the site have been used to inspire reflection on analytics, using the tags #dream, #nightmare and #fairydust (for those who believe the promise offered by learning analytics is currently unrealistic). For example, an image of roughly brushed coloured stripes, ranging from pink to blue, was tagged #dream and prompted the reflection:

    • ‘All the separate elements come together to form a complete picture. Presenting them in this way makes it clear that they fall into three main groups, although it is also possible to pick out individual elements and their slight variations. The blue-and-white sky-like quality of the top of the picture suggests that the sky is the limit. Round the edge, the occasional gaps between the paint and the canvas edge serve as a reminder that learning analytics never reveal the whole picture.’

    An image of heavily pollarded trees was tagged #nightmare and prompted the reflection:

    • ‘For me, these trees represent the learners. They could be left to develop freely in all directions, but learning analytics constrain their choices and reduce them to basic outlines, each resembling the other. All their individuality is stripped away by the analytics, and they are set in an impoverished and deteriorating landscape.’
  • Choose one of the pictures on the Tumblr site, or another image of your own, and write a short reflection on learning analytics, which draws on what you have learned in this block and which is shaped by the image.
  • Post your reflection, together with the image or a brief description of the image, in the discussion forum and tag it as dream, nightmare or fairydust.
  • Look at other people’s contributions and comment on at least one of them.



Wassily Kandinsky – Composition VIII, 1923. Link to post here


The chaos of this image reminds me of the different directions LA takes: from the students’ side, to the teachers’, to course administrators and management. Each group has its own set of objectives and interests. There are obvious uses for LA, such as the overall aim of improving retention, which can be seen in the most prominent sections, like the big circle in the top left. However, the rest is still finding its way and being debated.

Furthermore, LA is about individuals. Individuals do not fit neatly into boxes, as can be seen here. There are some crossing points, like the tic-tac-toe grid on the right, but in general there are few consistent patterns to extract.

Tag: nightmare

This is due to the various angles LA could take. Although there is hope, and I do believe we will find a way of really improving learning and teaching, the path is very messy and we must be ready for many challenges, dips and turns amongst the successes.



Week 25 – Activity 25: Evaluation frameworks

Timing: 3 hours

  • Read this report, which has a particular focus on examples of using the framework (Section 3.2):

    The fifth example in the framework is the SNAPP tool that you read about in Activity 14. Compare this with Example 4, which also focuses on a social networking tool. Make notes in your learning journal about how useful the framework (Figure 1) was for understanding and comparing the tools.


The framework was particularly useful for highlighting the key similarities and differences. These are shown in the table below (differences in red, similarities in blue). Without mapping the characteristics of each analytics project against common criteria, one could simply conclude that both projects are similar due to their similar pedagogical aims (networking/socio-connectivism).

Its conclusion was also useful: Example 4 could not be considered analytics because it yielded only ‘diffuse actionable insights’ and thus, by implication, no predictive models.

Example 4 – Research Librarianship Social Network Analysis vs. Example 5 – SNAPP (Social Networks Adapting Pedagogical Practice)

Analysis subjects, objects and clients
  • Example 4: Analysis subjects: journals, articles and associated authors (individually identified). Analysis clients: researchers who study research librarianship, particularly new researchers. Analysis objects: not clearly defined.
  • Example 5: Analysis subjects: principally learners but also teachers (personally identifying online forum interactions). Analysis clients: teacher. Analysis objects: teacher (their learning activity design).

Data origin
  • Example 4: Data was obtained from the Thomson Reuters Social Sciences Citation Index, a subscription service offering comprehensive and quality-controlled data on scholarly works and citations.
  • Example 5: Private data from a learning management system (virtual learning environment). SNAPP uses raw data that is automatically generated in the course of using the online forum. Processing is at the level of a teaching cohort, so the scale of analysis subjects is likely to be small in conventional educational settings, although the interaction count may be modest in scale.

Orientation and objectives
  • Example 4: Orientation: descriptive. Objective type: there is no clear objective but a general aim to “gain deeper insights into the discipline of research librarianship”.
  • Example 5: Orientation: a diagnostic orientation is implied by SNAPP’s creators, i.e. that increasing online interaction is desirable. Objective type: performance (how effective is the learning activity design).

Technical approach
  • Example 4: A descriptive approach to social network analysis is used.
  • Example 5: A descriptive approach to social network analysis is used, with a strong bias towards visual presentation.

Embedded theories and reality
  • Example 4: There are references to literature describing the development of social network analysis; broad theories of social communication and community development are clearly embedded (e.g. “The general assumption behind this genre of studies is, that the more authors are co-cited, the stronger will be the bond they have”).
  • Example 5: SNAPP’s developers are overt in their software being designed to support socio-constructivist practice.

Comment
  • Example 4: The article is typical of applications of social network analysis in being of descriptive character. The research questions posed and the conclusions drawn are essentially informative rather than being action-oriented. The absence of clearly identifiable analysis objects is consistent with the informative outlook.

This example largely fails to match our definition of analytics due to the presence of only diffuse actionable insights, even though it is a competent piece of data processing and probably of interest to researchers.

In contrast to the research librarianship example of social network analysis, the chosen use of SNAPP is a much stronger example of analytics. The technical aspects and visualisations are very similar but the intention towards actionable insights is quite different.

The technical approach in both cases is descriptive and does not surface mechanism. A more evolved approach might attempt to indicate the level of significance of differences between parts of the sociograms or of changes over time. A further elaboration permitting hypotheses about cause and effect between aspects of the learning design and the observed patterns of interaction to be explored/tested is certainly in the realm of research and may only be tractable at much larger scale.

In practical use, it would be important to guard against SNAPP being used as the single lens on the effect of changing learning activity designs. What of interactions that are not captured?

SNAPP can also be used by students to self-regulate, but there is anecdotal evidence to suggest that the tool is too teacher-oriented in its presentation for students to easily understand.
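The descriptive social network analysis that SNAPP applies to forum posts can be illustrated with a short sketch. The reply data, student names and the ‘isolated student’ heuristic below are invented for illustration; this is not SNAPP’s code or data model.

```python
# A minimal sketch of SNAPP-style descriptive social network analysis over
# hypothetical forum reply data (not SNAPP's actual code or data model).
from collections import Counter

# Each tuple is (author_of_reply, author_replied_to) from one forum thread.
replies = [
    ("alice", "teacher"), ("bob", "alice"), ("teacher", "bob"),
    ("carol", "teacher"), ("alice", "carol"),
]
students = {"alice", "bob", "carol", "dave"}  # dave never posted or was replied to

out_degree = Counter(src for src, _ in replies)  # replies made to others
in_degree = Counter(dst for _, dst in replies)   # replies received

# Students with no interactions in either direction are candidates for an
# 'at risk through lack of interaction' flag, which SNAPP's sociograms make
# visible as disconnected nodes.
isolated = sorted(s for s in students
                  if out_degree[s] == 0 and in_degree[s] == 0)
print("isolated students:", isolated)
```

This mirrors the ‘descriptive, visually presented’ character noted above: the measures summarise what happened but say nothing about mechanism or cause.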

  • Now read the second framework:

    The paper goes into detail about how this framework was developed. For the purposes of this activity, though, focus on the framework as set out in Figure 9 and in the section called ‘Outline of the framework’.

    The framework’s authors are working on developing it as a tool for evaluating learning analytics. That work is still in progress and the 20 indicators can currently be regarded as generic headings that can guide evaluation and comparison.

  • Use the indicators identified in Figure 9 as headings to help you make notes about the SNAPP tool that you read about in Bakharia et al. (2009) in Activity 14. As the description of SNAPP in that paper is relatively short, you are likely to find that you cannot complete all the sections of the framework. This is always likely to be the case when you evaluate a tool. The framework helps you to identify questions you need to ask in order to make useful comparisons.


Headings from Figure 9. Answers, as they relate to SNAPP, are in bold. Non-bold text refers to direct quotations or points from the Scheffel et al. (2014) reading. As noted above, I might not have enough information to complete each section as it relates to SNAPP.

  • Objectives:
    • Awareness: makes educators aware of at-risk students; identifies high- and low-performing students; indicates to what extent a learning community is developing in a course
    • Reflection: “Reflection is not an end in itself, but a mechanism for improving teaching and hence maximising learning.” (McAlpine & Weston, 2000) (pg.127); Provides before and after snapshots of interventions.
    • Motivation:
    • Behavioural change: Focus on improving teacher practices, mostly interventions. Behavioural change on the part of the teacher: when to intervene, etc. Can also change the student’s behaviour through benchmarking individual performance against peers.
  • Learning Support:
    • Perceived usefulness: “Educators hesitate to take the calculations of algorithms about learning and education effects as valid while at the same time they hope to gain new insights from those analytics results.” (pg.127). Easily integrated into LMS systems.
    • Recommendation: Does not provide distinct recommendations to teachers, but suggests through self-analysis what might need to occur. This could certainly be extended to include predictions and base models in future iterations.
    • Activity classification:
    • Detection of students at risk: Very much a key aim. Through lack of interaction or not getting responses/engaging.
  • Learning Measures and Output:
    • Comparability: Cross-student comparison of engagement.
    • Effectiveness: “Their [Dyckhoff et al., 2013] findings show that in many cases LA tools do not yet answer all of the questions that teachers have in regards to their educational setting.” (pg.127). There is a danger of excluding/not seeing the other factors that might influence student success (or other forms of communication). In other words, over-emphasising the importance of forum communication.
    • Efficiency:
    • Helpfulness: Helpful to identify at risk students.
  • Data Aspects:
    • Transparency:
    • Data standards:
    • Data ownership: “the last few years have shown an immense rise in the availability of and open access to data sets from the technology-enhanced learning, LA and education data mining domains” (pg.128).
    • Privacy: “The authors suggest four principles that provide good practice when tackling the above-mentioned conflicts [legal, risk, ethics]: (1) clarity, (2) comfort and care, (3) choice and consent, and (4) consequence and complaint.” (128). To improve privacy-related matters, one should consider “(1) transparency, (2) student control over data, (3) right of access/security, and (4) accountability and assessment.” (128). Not given for SNAPP, but given that names are shown (see Figure 1), there could be ethics concerns, e.g. what if students want to opt out?
  • Organisational Aspects:
    • Availability: As a widget integrated directly into browsers. Cross-platform. Unsure about the full range of LMS software/tools supported.
    • Implementation: “…(1) manage the student pipeline, (2) eliminate impediments to retention and student success, (3) utilise dynamic, predictive analytics to respond to at-risk behaviour, (4) evolve learning relationship management systems, (5) create personalised learning environments/learning analytics, (6) engage in large-scale data mining, and (7) extend student success to include learning, workforce, and life success.” (129). Unsure how it will fit into a broader picture, i.e. will the data be fed to a larger data centre for further analysis?
    • Training of educational stakeholders: Up to the institution.
    • Organisational change: “…such intelligent investments from the organisations have a strong and justifiable return on investment; the implementation of enhanced analytics is to be seen as critical for student success on the one hand and achieving institution effectiveness on the other as without it, organisations cannot meet the current gold standard for institutional leadership.” (129) / Arnold et al. (2014) argue that “in order to truly transform education, learning analytics must scale and become institutionalised at multiple levels throughout an educational system” (129). Could change the focus of educators to enhancing forum support/guidance.


  • Compare the structured evaluation that you have produced with the evaluation of SNAPP in Example 5 of Cooper (2012).


A very different focus. I prefer the Cooper (2012) analysis, as it brings in pedagogy and underlying elements (i.e. orientation and objectives), which, in plain terms, can help answer ‘what is the point of the tool, and will it be useful for me?’ However, Cooper’s is also more susceptible to user bias, rather than just reporting facts.


  • In the discussion forum, post a comment stating which framework you found most useful, and how you would suggest amending it for future use.

Week 25 – Activity 24: Engaging stakeholders

Timing: 6 hours

Imagine that your educational institution, or one you know well, has decided to develop a learning analytics programme and, early in the process, intends to run a workshop for stakeholders in order to develop a vision. You have been tasked with organising the one-day workshop that will move this process forward.

  1. Decide how many people you would like to attend the workshop, and which groups they should represent. Who definitely needs to be involved at this stage, and who can be involved later? Your decision will depend on the type of institution you have chosen, and on your view of learning analytics. Make notes in your learning journal or blog, and discuss your options in the discussion forum or using OU Live.
  2. Plan a timetable for the day. Workshops often run from 10 a.m. until 4 p.m., with an hour for lunch and two half-hour coffee breaks – but there is no need to keep to that format if it does not suit your plans or the local situation. You may want to include an introduction to learning analytics, a talk about the importance of a vision, some examples of learning analytics, an expert speaker or two, a chance to share experiences or previous work, an opportunity to brainstorm ideas, a chance to share ideas, and a final discussion. Although there is a lot to think about, your final schedule does not need to be more than a page long. Again, discuss ideas in the discussion forum or within OU Live. Together, you have a great deal of experience of participating in and running workshops, and this is a good opportunity to share that experience.
  3. Finally, use PowerPoint, Keynote or similar software to produce an introductory presentation for the workshop. Focus on why your chosen institution is interested in developing an analytics programme and why participants have been invited to help develop this vision. Make use of resources you have encountered during this block. The SoLAR and LACE websites both contain links to resources and presentations that could help. The LACE YouTube channel contains a series of short videos from learning analytics experts, including some from authors whose work you have read in this block. You might choose to embed one in your presentation. Share a link to your presentation in the discussion forum, and take the opportunity to view and comment on other people’s presentations.


Question 1:

  1. How many people you would like to attend the workshop, and which groups they should represent. 

Based on my organisation (modified), 20 people will attend this workshop.

  1. Who definitely needs to be involved at this stage, and who can be involved later?

Involved at this stage – Management and teachers

Involved later – Parents/Students


Question 2:





An introduction to learning analytics – Presented by: Anna Lea Dyckhoff

Discussing her paper, ‘Supporting Action Research with Learning Analytics,’ with a particular focus on the goals of learning analytics for various stakeholders.


The importance of a vision – Presented by: Professor Eileen Scanlon

Discussing her paper ‘Beyond Prototypes: Enabling innovation in technology-enhanced learning,’ with a particular focus on turning learning analytics aims into reality.


Examples of LA in action – Presented by: Rebecca Ferguson

Discussing her paper ‘Setting learning analytics in context: overcoming the barriers to large-scale adoption,’ with a particular focus on the frameworks used in successful implementations of learning analytics.




Sharing experiences or previous work. In groups of 3, staff share ideas relating to the following questions:

1) Have you ever been surprised by student actions, such as dropping courses, sudden enthusiasm, or failing tests?

2) Do you believe the analytics discussed by our guest speakers could have helped predict and/or prevent the issue occurring?


Brainstorming solutions to a problem. Within the same group, discuss solutions to the following problems:

1) A student asks to ‘opt out’ of analytics. Do you allow this to happen? What do you say to convince him/her otherwise?

2) The analytics developed by your organisation send you a warning signal that a student in your class is on track to fail. What do you need to consider before intervening?


Sharing solutions. A group leader reports back on their solution to the cohort.


Wrap up. A final word from the organisers.



Question 3:



Week 25 – Activity 23: Deploying a vision of learning analytics

Timing: 4 hours

  • Select two implementations of learning analytics from the list below.
    • Case Study 1A: The Open University, UK: Data Wranglers (pp. 131–3).
    • Case Study 1B: The OU Strategic Analytics Investment Programme (pp. 133–7).
    • Case Study 2: The University of Technology, Sydney, Australia (pp. 138–41).
    • Cluster 1 – focused on student retention (pp. 19–21).
    • Cluster 2 – focused on understanding teaching and learning processes (pp. 19–21).
    • An example from your personal experience or from your reading on this subject.
  • Read about the implementations and, with the help of the ROMA framework (see Figure 3 of ‘Setting learning analytics in context’), make notes on how the visions that underpinned these impacted on the implementation of learning analytics.
  • You may find that the vision is not clearly defined. If this is the case, state it as clearly as you can, based on the information that you have.
  • In the list above, the options from Ferguson et al. (2015) are already aligned with the ROMA framework. The options from Colvin et al. (2015) are not single examples but clusters of examples. Looking at these examples may help you to become aware of elements that are not clearly represented in the ROMA framework.
    • I added point 19 (streamlining and improved data understanding/usability) to the end of the cycle as a consideration.
  • In the discussion forum, or in OU Live, compare your findings with those of people who have selected different examples. Which vision aligns best with your view of how learning analytics should be used?


Q1. Make notes on how the visions that underpinned these impacted on the implementation of learning analytics


For Case Study 1A, the ROMA framework was directly referenced, as per Figure 3, although the article’s summary made slight changes to the language used and the steps taken. Between asterisks, I have defined the actual ROMA step myself in the table below (from my point of view).



Impact (related to *ROMA*)

Case Study 1A

1. Using the volume of educational data more effectively. Policy objectives were defined (ROMA 1)

*ROMA 1*

2. Develop a group of staff with expertise in the individual faculty contexts. Policy objectives were defined (ROMA 1)

*ROMA 1*

3. Set up a system for collating, synthesising, and reporting on the available data. Policy objectives were defined (ROMA 1)

*ROMA 1*

4. Produce reports at regular intervals. Policy objectives were defined (ROMA 1)

*ROMA 1*

5. Build strong relationships with the faculties. Policy objectives were defined (ROMA 1)

*ROMA 1*

6. Analyse and influence teaching and learning practice. Map the context (ROMA 2)

*ROMA 1*

7. Senior management in each faculty (responsible for learning, teaching and/or curriculum development), curriculum developers, those responsible for data gathering and curation, and general senior management. Stakeholders were identified (ROMA 3)

*ROMA 2*

8. Key focus on curriculum development and quality enhancement. Learning analytics purposes were identified (ROMA 4)

*ROMA 3*

9. Integrate available data with completion rates, pass rates. Learning analytics purposes were identified (ROMA 4)

*ROMA 4*

10. Conduct extensive consultation and feedback regarding implementation. Strategy development (ROMA 5)

*ROMA 5*

11. Conduct early pilot work. Strategy development (ROMA 5)

*ROMA 5*

12. Decide on an implementation plan, and dates for review. Strategy development (ROMA 5)

*ROMA 4*

13. Provide/achieve content analysis. Capacity analysis, human resources developed (ROMA 6)

*ROMA 6*

14. Develop a full understanding of the faculty teaching and learning context. Capacity analysis, human resources developed (ROMA 6)

*ROMA 6*

15. Deployment of new technical tools (data management software, etc.). Capacity analysis, human resources developed (ROMA 6)

*ROMA 6*

16. Develop an understanding and appreciation of what the data could show, as well as an awareness of how to access it without the mediation of a Data Wrangler. Capacity analysis, human resources developed (ROMA 6)

*ROMA 3*

17. Build feedback from stakeholders into the delivery of reports. A monitoring and learning system was developed (ROMA 7)

*ROMA 6*

18. Gather feedback from key stakeholders from evaluation exercises. A monitoring and learning system was developed (ROMA 7)

*ROMA 6*

19. Undertake reviews with the aim of streamlining. Streamlining the process and improving understanding of data usability (non-ROMA). [Self-added]

Case Study 1B

1. To use and apply information strategically (through specific indicators) to retain students and enable them to progress and achieve their study goals. *ROMA 1*
2. (Macro) Aggregate information about the student learning experience at an institutional level in order to inform strategic priorities that will improve student retention and progression. *ROMA 4*
3. (Micro) Make use of analytics to drive short-, medium- and long-term interventions. *ROMA 4*
4. Stakeholders make use of integrated analytics to inform interventions designed to improve outcomes. *ROMA 3*
5. Evaluate the evidence base for factors that drive student success (post initial intervention data collection). *ROMA 3*
6. Develop models that ensure key stakeholders can implement appropriate support interventions for both short- and long-term benefits. *ROMA 3*
7. To use an analysis of current student performance to identify priority areas for action, both in terms of changes to the curriculum and learning design, and in terms of interventions with the students most at risk. *ROMA 4*
8. Develop a common methodology to evaluate the relative value of interventions through measuring the resulting student behaviours and improvements (to inform the future student experience). *ROMA 4*
9. Create near real-time data visualisations around key performance measures. *ROMA 6*
10. Triangulate different data sources to help identify patterns that influence success in a given context. *ROMA 6*
11. Create an ethics policy that details what data is being collected and its ethical uses. *ROMA 6*
12. Develop machine-learning-based predictive modelling systems, e.g. will a student hand in his/her assignment based on online activity? *ROMA 3*
13. Improve feedback methods from students (shift from end-of-module assessment to in-module assessment) to improve reaction times to issues. *ROMA 4*
14. Measure the success/impact of learning designs through the systematic collection of data. *ROMA 5*
15. Create a CoP ‘evidence hub’ focused on the progression of first-year students to second year. *ROMA 6*
16. Develop ‘small data’ student tools, to allow students to monitor their own progress, visually, to make informed study choices. *ROMA 6*
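Item 12 above imagines a machine-learning model that predicts assignment submission from online activity. As a hedged illustration only, a toy logistic regression fitted by gradient descent on fabricated activity counts might look like the sketch below; a real institutional system would involve far richer features, validation and governance.

```python
# Toy sketch of item 12's idea: predict whether a student will submit an
# assignment from online activity. Data and model are invented illustrations,
# not the OU's actual analytics.
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# (logins_per_week, forum_posts) -> submitted (1) or not (0); fabricated.
data = [((0.5, 0), 0), ((1, 1), 0), ((4, 2), 1), ((6, 5), 1),
        ((2, 0), 0), ((5, 3), 1), ((3, 2), 1), ((0.2, 0), 0)]

# Fit weights by plain stochastic gradient descent on the logistic loss.
w = [0.0, 0.0]
b = 0.0
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y               # gradient of the logistic loss w.r.t. z
        w[0] -= 0.1 * err * x1
        w[1] -= 0.1 * err * x2
        b -= 0.1 * err

def risk_of_no_submission(logins, posts):
    """Estimated probability that the assignment will NOT be handed in."""
    return 1 - sigmoid(w[0] * logins + w[1] * posts + b)

# A quiet student should score as higher risk than an active one.
print(risk_of_no_submission(0.5, 0), risk_of_no_submission(5, 3))
```

The point of the sketch is the shape of the pipeline (activity features in, risk score out), which is what items 12 and 16 would need stakeholders to interpret.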

Q2: Which vision aligns best with your view of how learning analytics should be used?


Vision two, as it had a more defined focus on improving student progress, rather than the loosely scoped ‘Data Wrangling’ discussed in the first case study.

Week 25 – Activity 22: Developing a vision of learning analytics

Timing: 1 hour

  • Return to read Table 4 of the paper below. This identifies goals of learning analytics in general and also from the perspectives of learners and educators.
  • Goals and visions are not the same thing, and some of these goals are too mundane or too succinct to be inspiring. In a blog post, or in your learning journal, combine or develop some of these goals to construct a vision of what learning analytics could be at your institution or at one you know well. Aim for a vision statement that is no longer than two sentences.
  • Once you have constructed a vision, note whether it seems to be a learner’s vision, an educator’s vision, a manager’s vision or a combination of these. How would it need to change, if at all, to inspire other stakeholder groups?


The aim of LA at my institution is to:

Provide both students and educators with a means to monitor their individual progress as it relates to a group, with a key focus on identifying actionable insights into ways to improve their work. These LA insights will aim to bring students and educators closer together, as mutual understandings develop relating to work progress and desired results. 


The above statement appears to be more of a manager’s vision to a team. However, as it encompasses students and educators as the key stakeholders, it is felt to be fairly inclusive.

Week 24 – Activity 21: Policy evaluation

Timing: 2 hours

  • Return to the OU’s policy on the ethical use of student data for learning analytics. Estimate how many words it contains – either by eye, or by copying and pasting the text into your word-processing program and using the Word Count tool. Do the same for the related documents on the same web page: the FAQs and the document on using information to support student learning.
  • Note how long it would take someone to read these documents – either by drawing on your own experience or by using an average reading speed of 300 words per minute to calculate the time required.
  • Now read Slade and Prinsloo (2014). The study that they report on in this paper was designed to inform the development of the OU policy.
  • In your learning journal, note the issues reported in the paper that were raised by students and that you consider to be the most important.
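The reading-time step in this activity is simple arithmetic. As a sketch, with placeholder word counts (not the real lengths of the OU documents), the 300-words-per-minute estimate could be computed as:

```python
# Estimating reading time at ~300 words per minute, as the activity suggests.
# The word counts below are placeholders, not the documents' real lengths.
WPM = 300

doc_words = {
    "policy": 3000,              # hypothetical word count
    "faqs": 1500,                # hypothetical word count
    "using_information": 2400,   # hypothetical word count
}

def minutes_to_read(words, wpm=WPM):
    return words / wpm

total = sum(doc_words.values())
print(f"total words: {total}, reading time: {minutes_to_read(total):.0f} min")
```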


Not raised directly by students:

“Central to the general concerns regarding the protecting of privacy and informed consent, is the notion of “privacy self-management” which has its origins in the Fair Information Practice Principles (1973) which covers, amidst other issues, “individuals’ rights to be notified of the collection and use of personal data; the right to prevent personal data from being used for new purposes without consent; the right to correct or amend one’s records, and the responsibilities of the holders of data to prevent its misuse” (Solove, 2013, p.1882).”

“most of these initiatives to inform individuals don’t work because of the fact that “(1) people do not read privacy policies; (2) if people read them, they do not understand them; (3) if people read and understand them, they often lack enough background knowledge to make an informed choice; and (4) if people read them, understand them, and can make an informed choice, their choice might be skewed by various decision-making difficulties” (Solove, 2013, p.1888).”


““beyond privacy self-management”, we should perhaps rethink issues such as consent and the unequal power-relationship between the institution and students, the advantages of opting in rather than opting out, addressing privacy’s timing and focus and the codification of privacy norms and developing substantive rules for data collection (Solove, 2013).”

General note: the reason for there not being an opt-out clause: the university feels that it has a moral responsibility to help students, versus giving students their own agency to make decisions, despite those decisions potentially being against what the data shows (related to the paragraph before the conclusion).

From students:

“The general view was that more could be done to make clear what data is being collected, how it is being collected, where it is being collected from, the uses for which it being collected and who will have access.” (296)

“There’s a huge difference IMO between anonymised data to observe/monitor large scale trends and the “snooping” variety of data collection tracking the individual. I’m happy for any of my data to be used in the former; with the latter I would be uncomfortable about the prospect that it would be used to label and categorise students in an unhelpful or intrusive way”.

– stating exactly how information is used, with links to the detail;

– providing a basic summary of the key points on the student’s home page;

– communicating the approach at the point that a student is about to supply any data that is to be used;

– providing a fairly inclusive set of examples of what information is gathered and how it may be used” (296)

“[It is difficult] for the University though to flag issues like this [incorrectly targeted emails] to students without holding data about what we do/how well we do/whether we use the forums/need advice…” (297)

“…data could and perhaps should be used to provide a more personalised and relevant support service, with students suggesting that a learning analytics approach applied in conjunction with support delivered by a personal tutor might ameliorate the risks of labelling students incorrectly” (297)

“Others felt that the involvement of tutors could effectively prejudice the tutor:student relationship by impacting on the tutor’s expectations of that student” (297)

“- It is important to have a clear purpose for data collection and to communicate that purpose effectively ; to explain what data will/won’t be used for, and who can see it (e.g. on each student, in aggregate).

– A set of frequently asked questions developed for staff dealing with declaring personal information around diversity could usefully be replicated for students.

– There should be transparent policies about how long data can be held for and what the process is for handling requests for deletion of data.

– Data should only be shared on a ‘need to know basis’ – particularly where it is personal/sensitive.

– There should be strong and transparent governance in this area with a focus on ethics.

– Data handling protocols are important and should be enforced effectively.

– There should be periodic data audits.

– There should be an up-to-date data dictionary.

– It is important to address any concerns about the sharing of information with other organisations or the processing of information by other organisations.

Could add (my own addition): “Lecturers/tutors should not make decisions based solely on data. The student needs to be consulted first.” From: “I have a concern that increased data-richness resulting in over-reliance on data and ‘computer says no’ responses. Catering for the individual is what’s needed. If data collection is used to help appropriate questions to be asked, fine – if it’s providing answers, very much not so.” (298)

“staff involved in data analysis and in the delivery of intervention [should be] appropriately trained.” (298)

Could add: “Students were quick to flag the dangers of data protection and privacy in relation to having their data passed on – e.g., where a third party undertakes a service on behalf of the University.” … “there was also a view expressed that the University should not attempt to draw in information from third party sites for its own purposes”

“No one should feel compelled to provide data if they don’t want to and they should be able to keep their reasons for this, which may be very personal, private.” (299)

“A right to refuse without compromising study ought to be built in” (299)

“The author also noted the argument for a duty of care to advise people against making a potentially costly mistake by continuing on a course they might not complete. S/he concluded this by stating ‘But it is ultimately their choice.’” (299)

  • Return to the one-page summary of key points in the OU policy that you wrote in Activity 20. In the light of what you have read, make any amendments you feel necessary. Can you make your summary interesting enough and short enough to feel confident that students would be likely to read it?


To include:

– an opt-out clause;

– without consent or acknowledgement, no passing on of information to third parties;

– a right to refuse (opt out) that does not jeopardise students’ studies in any way;

– students should have the final say on all analytics interventions/suggestions; however, courses as a whole could benefit from analysis of the data (e.g. a change in how instructions for a task are delivered);

– without consent or acknowledgement, similarly, no receiving of information from third parties;

– tutors/lecturers, etc. cannot make assumptions based on data alone; consultation is needed;

– a clear purpose for data collection is needed and should be communicated;

– the FAQs developed for staff should be replicated for students;

– transparency in how long data is held and how to request deletion;

– data should only be shared on a need-to-know basis;

– transparency and ethics should be key pillars;

– periodic data audits;

– staff involved in analytics must be properly trained;

– if information is shared with a third-party organisation, or processed by a third party (e.g. for research), this should be made clear to students;

– consultation is needed with students where analytics detect a problem or recommend a change (i.e. no ‘computer says no’ situations).

  • In the discussion forum, or using OU Live, discuss ways of developing a policy that people will engage with. How should the policy be shared with different stakeholder groups such as learners, educators and systems administrators?



Could be delivered through videos, following the approach taken by Purdue University when sharing details of their Course Signals analytics.

Could also be delivered in a more Facebook-like approach: although users seldom read the full T&Cs, they are able to select which personal data they are comfortable sharing, or which data the platform can access.


Scheduled workshops where members of the analytics team, together with head teachers/professors, engage a broader cohort of staff. This could be a mandatory session held every six months, providing time to deliver updates to staff and answer questions.

These sessions should cover not only the facts (what the policy says about the use of analytics) but also the reasoning and theoretical background behind it. This will give educators some agency and theory to back up their decisions.

Real-life scenarios could be used as examples and turned into test questions (online MCQ testing) for staff to complete, e.g. ‘If XYZ arises, what do you do?’

System Administrators:

They also need to be made aware of the theory and history of analytics at the university in order to keep the system on track. As they are at the forefront of the analytics system, they should be responsible for attending fairly regular training, through video, testing, and face-to-face sessions.

The policy document could be discussed in groups, and testing could be delivered (where, for example, administrators are placed in the role of students or teachers).

System administrators could also be responsible for delivering training sessions, video guides, etc., enhancing their own knowledge by teaching others.

Week 24 – Activity 20: Ethical use of student data

Timing: 2 hours

  • Read the OU’s policy on the ethical use of student data for learning analytics.
  • Write a one-page summary of the key points in the policy.
  • Aim to phrase the summary so that it would be helpful for someone in your own institution, or in an institution that you know well, who was working to develop a local set of ethical guidelines.
  • There may be published guidelines that you can readily access for your chosen context. If this is the case, you can compare these with those you have written and potentially produce an ‘improved’ set of guidelines.
  • Share your key points in the discussion forum or discuss them in OU Live.


A summarised document is available on my HDD (Readings > Week 24 > Activity 20).