Tagged: W24A21

Week 24 – Activity 21: Policy evaluation

Timing: 2 hours

  • Return to the OU’s policy on the ethical use of student data for learning analytics. Estimate how many words it contains – either by eye, or by copying and pasting the text into your word-processing program and using the Word Count tool. Do the same for the related documents on the same web page: the FAQs and the document on using information to support student learning.
  • Note how long it would take someone to read these documents – either by drawing on your own experience or by using an average reading speed of 300 words per minute to calculate the time required.
  • Now read Slade and Prinsloo (2014). The study that they report on in this paper was designed to inform the development of the OU policy.
  • In your learning journal, note the issues reported in the paper that were raised by students and that you consider to be the most important.
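The word-count and reading-time estimate in the first two bullets is simple arithmetic (words ÷ reading speed). A minimal sketch, using the activity's assumed 300 words per minute and a placeholder text rather than the actual OU policy:

```python
# Rough reading-time estimate for a policy document, using the
# activity's assumed average reading speed of 300 words per minute.
# The sample text below is a stand-in, not the real OU policy.

def reading_time_minutes(text: str, wpm: int = 300) -> float:
    """Estimate the minutes needed to read `text` at `wpm` words/minute."""
    word_count = len(text.split())
    return word_count / wpm

sample_policy = "word " * 4500  # stand-in for a ~4,500-word policy document
minutes = reading_time_minutes(sample_policy)
print(f"~{len(sample_policy.split())} words, about {minutes:.0f} minutes to read")
```

At 300 wpm, a 4,500-word document takes about 15 minutes; lower assumed speeds lengthen the estimate proportionally.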


Not raised directly by students:

“Central to the general concerns regarding the protecting of privacy and informed consent, is the notion of “privacy self-management” which has its origins in the Fair Information Practice Principles (1973) which covers, amidst other issues, “individuals’ rights to be notified of the collection and use of personal data; the right to prevent personal data from being used for new purposes without consent; the right to correct or amend one’s records, and the responsibilities of the holders of data to prevent its misuse” (Solove, 2013, p.1882).”

“most of these initiatives to inform individuals don’t work because of the fact that “(1) people do not read privacy policies; (2) if people read them, they do not understand them; (3) if people read and understand them, they often lack enough background knowledge to make an informed choice; and (4) if people read them, understand them, and can make an informed choice, their choice might be skewed by various decision-making difficulties” (Solove, 2013, p.1888).”


““beyond privacy self-management”, we should perhaps rethink issues such as consent and the unequal power-relationship between the institution and students, the advantages of opting in rather than opting out, addressing privacy’s timing and focus and the codification of privacy norms and developing substantive rules for data collection (Solove, 2013).”

General note: The reason there is no opt-out clause: the university feels that it has a moral responsibility to help students, versus giving students their own agency to make decisions, even when those decisions run against what the data shows (related to the paragraph before the conclusion).

From students:

“The general view was that more could be done to make clear what data is being collected, how it is being collected, where it is being collected from, the uses for which it being collected and who will have access.” (296)

“There’s a huge difference IMO between anonymised data to observe/monitor large scale trends and the “snooping” variety of data collection tracking the individual. I’m happy for any of my data to be used in the former; with the latter I would be uncomfortable about the prospect that it would be used to label and categorise students in an unhelpful or intrusive way”.

“– stating exactly how information is used, with links to the detail;

– providing a basic summary of the key points on the student’s home page;

– communicating the approach at the point that a student is about to supply any data that is to be used;

– providing a fairly inclusive set of examples of what information is gathered and how it may be used” (296)

“[It is difficult] for the University though to flag issues like this [incorrectly targeted emails] to students without holding data about what we do/how well we do/whether we use the forums/need advice…” (297)

“…data could and perhaps should be used to provide a more personalised and relevant support service, with students suggesting that a learning analytics approach applied in conjunction with support delivered by a personal tutor might ameliorate the risks of labelling students incorrectly” (297)

“Others felt that the involvement of tutors could effectively prejudice the tutor:student relationship by impacting on the tutor’s expectations of that student” (297)

“– It is important to have a clear purpose for data collection and to communicate that purpose effectively; to explain what data will/won’t be used for, and who can see it (e.g. on each student, in aggregate).

– A set of frequently asked questions developed for staff dealing with declaring personal information around diversity could usefully be replicated for students.

– There should be transparent policies about how long data can be held for and what the process is for handling requests for deletion of data.

– Data should only be shared on a ‘need to know basis’ – particularly where it is personal/sensitive.

– There should be strong and transparent governance in this area with a focus on ethics.

– Data handling protocols are important and should be enforced effectively.

– There should be periodic data audits.

– There should be an up-to-date data dictionary.

– It is important to address any concerns about the sharing of information with other organisations or the processing of information by other organisations.

Could add (my own addition): lecturers/tutors should not make decisions based solely on data; the student needs to be consulted first. From: “I have a concern that increased data-richness [will result] in over-reliance on data and ‘computer says no’ responses. Catering for the individual is what’s needed. If data collection is used to help appropriate questions to be asked, fine – if it’s providing answers, very much not so.” (298)

“staff involved in data analysis and in the delivery of intervention [should be] appropriately trained.” (298)

Could add: “Students were quick to flag the dangers of data protection and privacy in relation to having their data passed on – e.g., where a third party undertakes a service on behalf of the University.” … “there was also a view expressed that the University should not attempt to draw in information from third party sites for its own purposes”

“No one should feel compelled to provide data if they don’t want to and they should be able to keep their reasons for this, which may be very personal, private.” (299)

“A right to refuse without compromising study ought to be built in” (299)

“The author also noted the argument for a duty of care to advise people against making a potentially costly mistake by continuing on a course they might not complete. S/he concluded this by stating “But it is ultimately their choice.” (299)

  • Return to the one-page summary of key points in the OU policy that you wrote in Activity 20. In the light of what you have read, make any amendments you feel necessary. Can you make your summary interesting enough and short enough to feel confident that students would be likely to read it?


To include:

– an opt-out clause;

– Without consent or acknowledgement: no passing on of information to third parties;

– A right to refuse (opt out) should not jeopardise students’ studies in any way;

– Students should have the final say on all analytic interventions/suggestions. However, courses as a whole could benefit from aggregate analysis (e.g. a change in how instructions for a task are delivered);

– Without consent or acknowledgement: similarly, no receiving of information from third parties;

– Tutors/lecturers, etc. cannot make assumptions from data alone; consultation is needed;

– A clear data collection purpose is needed and should be communicated;

– the FAQs developed for staff should be replicated for students;

– transparency in how long data is held and how to request deletion;

– data should only be shared on a need-to-know basis;

– transparency and ethics should be key pillars;

– periodic audits;

– Staff involved in analytics and interventions must be properly trained;

– if information is shared with a third-party organisation, or information is processed by a third party (e.g. for research), this should be made clear to students;

– consultation is needed with students where analytics detect a problem or recommend a change (i.e. no ‘computer says no’ situations)

  • In the discussion forum, or using OU Live, discuss ways of developing a policy that people will engage with. How should the policy be shared with different stakeholder groups such as learners, educators and systems administrators?



Could be delivered through videos, such as the approach taken by Purdue University when sharing details of their Signals analytics.

Could also be delivered in a more ‘Facebook’-like approach: although users seldom (if ever) read the T&Cs in full, they are able to select which personal data they are comfortable sharing, or which data Facebook may access.
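The ‘Facebook’-like approach above amounts to per-category opt-in preferences rather than a single blanket consent. A hypothetical sketch of how such preferences might be modelled (the category names are invented for illustration, not taken from the OU policy):

```python
# Hypothetical per-category consent preferences, in the spirit of the
# 'Facebook-like' settings described above. Category names are invented
# for illustration and do not come from the OU policy.

from dataclasses import dataclass, field


@dataclass
class ConsentPreferences:
    """A student's opt-in choices for each analytics data category."""
    choices: dict = field(default_factory=dict)

    def opt_in(self, category: str) -> None:
        self.choices[category] = True

    def opt_out(self, category: str) -> None:
        self.choices[category] = False

    def allows(self, category: str) -> bool:
        # Default to False: no data use without an explicit opt-in,
        # matching the 'opting in rather than opting out' principle.
        return self.choices.get(category, False)


prefs = ConsentPreferences()
prefs.opt_in("aggregate_trends")      # anonymised, large-scale trends
prefs.opt_out("individual_tracking")  # per-student 'snooping' monitoring
print(prefs.allows("aggregate_trends"), prefs.allows("third_party_sharing"))
```

Defaulting unanswered categories to `False` mirrors the opt-in stance students favoured: unstated preferences never grant access.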


Scheduled workshops where members of the analytics team, and head teachers/professors, engage a broader cohort of staff. These could be mandatory sessions, held every six months, providing a time for delivering updates to staff and answering questions.

These sessions should detail not only the facts – i.e. what the policy says about analytics use – but also the reasoning and theoretical background behind it. This will give educators some agency and theory to back up their decisions.

Real-life scenarios could be provided as examples and built into test scenarios (online MCQ testing) for staff to complete, e.g. ‘If XYZ arises, what do you do?’

System Administrators:

They also need to be made aware of the theory and history of analytics at the university in order to keep the system on track. As they are at the forefront of the analytics system, they should be responsible for attending fairly regular training, through video, testing, and face-to-face sessions.

The policy document could be discussed in groups, testing could be delivered (where, for example, they are placed in the role of students or teachers). 

System administrators could also be responsible for delivering training sessions, video guides, etc., enhancing their own knowledge through teaching others.