Category: Week 24

Week 24 – Activity 21: Policy evaluation

Timing: 2 hours

  • Return to the OU’s policy on the ethical use of student data for learning analytics. Estimate how many words it contains – either by eye, or by copying and pasting the text into your word-processing program and using the Word Count tool. Do the same for the related documents on the same web page: the FAQs and the document on using information to support student learning.
  • Note how long it would take someone to read these documents – either by drawing on your own experience or by using an average reading speed of 300 words per minute to calculate the time required (a small scripted estimate is sketched after this list).
  • Now read Slade and Prinsloo (2014). The study that they report on in this paper was designed to inform the development of the OU policy.
  • In your learning journal, note the issues reported in the paper that were raised by students and that you consider to be the most important.
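
To speed up the first two steps, the estimate can be scripted. This is a minimal sketch of my own (not part of the activity), assuming the 300 words-per-minute figure given above:

```python
# Minimal sketch: estimate reading time for a pasted policy document,
# assuming an average reading speed of 300 words per minute.
def reading_time_minutes(text: str, words_per_minute: int = 300) -> float:
    word_count = len(text.split())
    return word_count / words_per_minute

# Example with placeholder text; in practice, paste the policy text in.
sample = "word " * 4500  # a hypothetical 4,500-word policy document
print(f"{len(sample.split())} words, "
      f"~{reading_time_minutes(sample):.0f} minutes to read")
```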

Answer:

Not raised directly by students:

“Central to the general concerns regarding the protecting of privacy and informed consent, is the notion of “privacy self-management” which has its origins in the Fair Information Practice Principles (1973) which covers, amidst other issues, “individuals’ rights to be notified of the collection and use of personal data; the right to prevent personal data from being used for new purposes without consent; the right to correct or amend one’s records, and the responsibilities of the holders of data to prevent its misuse” (Solove, 2013, p.1882).”

“most of these initiatives to inform individuals don’t work because of the fact that “(1) people do not read privacy policies; (2) if people read them, they do not understand them; (3) if people read and understand them, they often lack enough background knowledge to make an informed choice; and (4) if people read them, understand them, and can make an informed choice, their choice might be skewed by various decision-making difficulties” (Solove, 2013, p.1888).”

 

““beyond privacy self-management”, we should perhaps rethink issues such as consent and the unequal power-relationship between the institution and students, the advantages of opting in rather than opting out, addressing privacy’s timing and focus and the codification of privacy norms and developing substantive rules for data collection (Solove, 2013).”

General note: the reason there is no opt-out clause is that the university feels it has a moral responsibility to help students, versus giving students their own agency to make decisions, even where those decisions run against what the data shows (related to the paragraph before the conclusion).

From students:

“The general view was that more could be done to make clear what data is being collected, how it is being collected, where it is being collected from, the uses for which it being collected and who will have access.” (296)

“There’s a huge difference IMO between anonymised data to observe/monitor large scale trends and the “snooping” variety of data collection tracking the individual. I’m happy for any of my data to be used in the former; with the latter I would be uncomfortable about the prospect that it would be used to label and categorise students in an unhelpful or intrusive way”.

– stating exactly how information is used, with links to the detail;

– providing a basic summary of the key points on the student’s home page,

– communicating the approach at the point that a student is about to supply any data that is to be used;

– providing a fairly inclusive set of examples of what information is gathered and how it may be used” (296)

“[It is difficult] for the University though to flag issues like this [incorrectly targeted emails] to students without holding data about what we do/how well we do/whether we use the forums/need advice…” (297)

“…data could and perhaps should be used to provide a more personalised and relevant support service, with students suggesting that a learning analytics approach applied in conjunction with support delivered by a personal tutor might ameliorate the risks of labelling students incorrectly” (297)

“Others felt that the involvement of tutors could effectively prejudice the tutor:student relationship by impacting on the tutor’s expectations of that student” (297)

“- It is important to have a clear purpose for data collection and to communicate that purpose effectively; to explain what data will/won’t be used for, and who can see it (e.g. on each student, in aggregate).

– A set of frequently asked questions developed for staff dealing with declaring personal information around diversity could usefully be replicated for students.

– There should be transparent policies about how long data can be held for and what the process is for handling requests for deletion of data.

– Data should only be shared on a ‘need to know basis’ – particularly where it is personal/sensitive.

– There should be strong and transparent governance in this area with a focus on ethics.

– Data handling protocols are important and should be enforced effectively.

– There should be periodic data audits.

– There should be an up-to-date data dictionary.

– It is important to address any concerns about the sharing of information with other organisations or the processing of information by other organisations.

Could add (my own addition): lecturers/tutors should not make decisions based solely on data; the student needs to be consulted first. From: “I have a concern that increased data-richness resulting in over-reliance on data and ‘computer says no’ responses. Catering for the individual is what’s needed. If data collection is used to help appropriate questions to be asked, fine – if it’s providing answers, very much not so.” (298)

“staff involved in data analysis and in the delivery of intervention [should be] appropriately trained.” (298)

Could add: “Students were quick to flag the dangers of data protection and privacy in relation to having their data passed on – e.g., where a third party undertakes a service on behalf of the University.” … “there was also a view expressed that the University should not attempt to draw in information from third party sites for its own purposes”

“No one should feel compelled to provide data if they don’t want to and they should be able to keep their reasons for this, which may be very personal, private.” (299)

“A right to refuse without compromising study ought to be built in” (299)

“The author also noted the argument for a duty of care to advise people against making a potentially costly mistake by continuing on a course they might not complete. S/he concluded this by stating “But it is ultimately their choice.” (299)

  • Return to the one-page summary of key points in the OU policy that you wrote in Activity 20. In the light of what you have read, make any amendments you feel necessary. Can you make your summary interesting enough and short enough to feel confident that students would be likely to read it?

Answer:

To include:

– an opt-out clause;

– Without consent or acknowledgement, no passing on of information to third parties;

– A right to refuse (opt out) should not jeopardise a student’s studies in any way;

– Students should have the final say on all analytics interventions/suggestions; however, courses as a whole could benefit from analytics (e.g. a change in how instructions for a task are delivered);

– Similarly, without consent or acknowledgement, no receiving of information from third parties;

– Tutors/lecturers, etc. cannot make assumptions based on data alone; consultation is needed;

– A clear data collection purpose is needed and should be communicated;

– the FAQs developed for tutors/lecturers should also be provided to students;

– transparency in how long data is held and how to request deletion;

– data should only be shared on a need-to-know basis;

– transparency and ethics should be key pillars;

– periodic audits;

– Staff involved in analytics must be properly trained;

– if information is shared with a third-party organisation, or processed by a third party (e.g. for research), this should be made clear to students.

– consultation is needed with students where analytics detect a problem or recommend a change (i.e. no ‘computer says no’ situations)

  • In the discussion forum, or using OU Live, discuss ways of developing a policy that people will engage with. How should the policy be shared with different stakeholder groups such as learners, educators and systems administrators?

 

Learners:

Could be delivered through videos, similar to the approach taken by Purdue University when sharing details of its Signals analytics.

Could also be delivered in a more ‘Facebook-like’ approach: although users seldom actually read the T&Cs, they are able to select which personal data they are comfortable sharing, or which data the platform has access to.
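
As a minimal sketch of what such granular, opt-in preferences might look like under the hood (my own illustration; the class and data-category names are hypothetical and not drawn from the OU policy or any real platform):

```python
from dataclasses import dataclass, field

# Hypothetical categories of data a learner could opt in to sharing.
DATA_CATEGORIES = ["vle_activity", "forum_posts", "demographics", "assessment_history"]

@dataclass
class ConsentPreferences:
    """A learner's granular, opt-in data-sharing choices (default: nothing shared)."""
    student_id: str
    shared: dict = field(default_factory=lambda: {c: False for c in DATA_CATEGORIES})

    def allows(self, category: str) -> bool:
        # Unknown categories are treated as not consented to.
        return self.shared.get(category, False)

# Example: a student opts in to VLE activity tracking only.
prefs = ConsentPreferences(student_id="A1234567")
prefs.shared["vle_activity"] = True

if prefs.allows("forum_posts"):
    print("Forum data may be analysed.")
else:
    print("Forum data must be excluded from analytics.")
```

Any analytics service would then check these preferences before touching a learner’s data, which keeps the opt-in choice enforceable rather than purely declarative.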

Educators:

Scheduled workshops where members of the analytics team, and head teachers/professors, engage a broader cohort of staff. This could be a mandatory session held every six months, providing time to deliver updates to staff and answer questions.

These sessions should detail not only the facts (what the policy says about analytics use) but also the reasons and theoretical background behind it. This will give educators some agency and theory to back up their decisions.

Real-life scenarios could be used as examples and turned into test scenarios (online MCQ testing) for staff to complete, e.g. ‘If XYZ arises, what do you do?’

System Administrators:

System administrators also need to be made aware of the theory and history of analytics at the university in order to keep the system on track. As they are at the forefront of the analytics system, they should attend fairly regular training through video, testing, and face-to-face sessions.

The policy document could be discussed in groups, and testing could be delivered (where, for example, administrators are placed in the role of students or teachers).

System administrators could also be responsible for delivering training sessions, video guides, etc., enhancing their own knowledge through teaching others.


Week 24 – Activity 20: Ethical use of student data

Timing: 2 hours

  • Read the OU’s policy on the ethical use of student data for learning analytics.
  • Write a one-page summary of the key points in the policy.
  • Aim to phrase the summary so that it would be helpful for someone in your own institution, or in an institution that you know well, who was working to develop a local set of ethical guidelines.
  • There may be published guidelines that you can readily access for your chosen context. If this is the case, you can compare these with those you have written and potentially produce an ‘improved’ set of guidelines.
  • Share your key points in the discussion forum or discuss them in OU Live.

Answer:

A summarised document is available on my HDD. (Readings > Week 24 > Activity 20).

Week 24 – Activity 19: Reflecting on the ROMA framework

Timing: 4 hours

  • Start by reading Sections 1–4 (pp. 121–30) of this paper:

    The paper makes links between the introduction of learning analytics, the TEL Complex and the ROMA framework, so it provides a way of reviewing work you have carried out this week. You will see that Macfadyen and Dawson, who wrote the paper you read in Activity 17, are co-authors of this paper. The 2012 paper by Macfadyen and Dawson identified problems with the implementation of learning analytics; this paper proposes a way of dealing with those problems.

  • In your learning journal, take the problems and barriers you identified and listed in the previous two activities and map them to the steps of the ROMA framework. You could do this in the form of a list, or you could use presentation software, such as PowerPoint or Keynote, to produce visualisations of the links between steps and issues.
  • Make notes of your reflections on this mapping. Did any of the issues you identified in previous exercises map clearly to steps in the framework? Did the framework prompt you to view some of the problems in new ways?
  • In the discussion forum, or in OU Live, discuss ways in which the ROMA framework would help you to find ways to deal with the issues you have identified.

Answers:

The ROMA model consists of the seven steps listed below. The step numbers are added alongside each of the problems and barriers that follow, and a minimal sketch of the mapping as a data structure is given after the list of steps. A full definition of what each of the seven steps entails can be found in the paper.

1. Define a clear set of overarching policy objectives

2. Map the context

3. Identify the key stakeholders

4. Identify learning analytics purposes

5. Develop a strategy

6. Analyse capacity; develop human resources

7. Develop a monitoring and learning system (evaluation)
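
As flagged above, the same mapping can also be captured as a simple data structure rather than a flat list; a minimal sketch of my own, which could later feed a visualisation (e.g. a bipartite graph in presentation software or a plotting library):

```python
# The seven ROMA steps, keyed by the numbers used in the mapping below.
ROMA_STEPS = {
    1: "Define a clear set of overarching policy objectives",
    2: "Map the context",
    3: "Identify the key stakeholders",
    4: "Identify learning analytics purposes",
    5: "Develop a strategy",
    6: "Analyse capacity; develop human resources",
    7: "Develop a monitoring and learning system (evaluation)",
}

# Two entries taken from my mapping below; the rest follow the same pattern.
barrier_to_steps = {
    "Institutional Resistance": [1, 2, 3, 7],
    "Lack of Clear Goal(s)": [1, 2, 4],
}

for barrier, steps in barrier_to_steps.items():
    linked = "; ".join(ROMA_STEPS[s] for s in steps)
    print(f"{barrier} -> {linked}")
```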

 

Problems and barriers identified in previous activities:

Interpretation and Observability of Analytics

3 (esp. related to interpretation), 7 (regarding development of a learning system)

Institutional Resistance

1, 2, 3, 7 (if changes are needed to better accommodate these)

Lack of Clear Goal(s)

1, 2 (esp. evidence to convince), 4,

Investment in Student Outcomes

Senior Staff Reluctance and Poorly Understood Objectives/Aims

1, 2 (esp. evidence to convince), 3, 4 (purposes for their involvement)

Underdeveloped Ed Tech Sector

4, 5, 6 (to identify), 7 (in further improving this area)

Workload

1, 2, 5, 7 (if changes are needed to better accommodate these)

Lack of Support From Individuals

1, 3, 7 (if changes are needed to better accommodate these)

Institutional Culture

1, 2, 3

Funding and Revenue Generation

3,

Pedagogic Res Community 

1, 2, 5, 7 (if changes are needed to better accommodate these)

Student Community

1, 2,

Ecology of Practices

1, 2, 7 (if changes are needed to better accommodate these)

 

Further barriers identified in this paper:

Unwillingness to act on findings outside of own research area

2,

individual preferences for qualitative or quantitative approaches

2, 4,

basing decisions on anecdote rather than on research

2 (esp. the evidence), 4, 5,

the different forms of discourse used by researchers and decision makers

4,

unfamiliarity with statistical methods on the part of the decision makers

2 (esp. links), 4,

different expectations around communication between researchers and those responsible for implementation

1, 2 (esp. political context), 3

different levels of engagement with the research

3,

different expectations about the role and purpose of educational research

1, 2 (esp. political context), 3, 4

Ethical Concerns

5, 7 (if changes are needed to better accommodate these)

Lack of Skills Available

6 (identification of these)

 

Consider:

1. Did any of the issues you identified in previous exercises map clearly to steps in the framework?

Many of them almost did. More problems and barriers were added whilst reading the paper. The points that benefit most from the mapping (or map most accurately) are:

– Funding and revenue generation, matched with considering the stakeholders (3).

– Senior Staff Reluctance and Poorly Understood Objectives/Aims can be combated through extensive explanation of the process and consultation. 

– Lack of clear goals can be combated in a similar way to the above point.

2. Did the framework prompt you to view some of the problems in new ways?

It prompted me to add new problems/barriers. Also, the cyclical nature of the framework is very important, as ultimately we will need to evaluate and adjust, not only based on failures but also on our findings, e.g. if analytic A is very successful, what does this mean for analytic B?

“Evaluation processes are important, not only to track progress, make any necessary adjustments and assess the effectiveness of the approach, but also to learn lessons from the future” (pg.130)

Week 24 – Activity 18: Considering the TEL Complex

Timing: 4 hours

The reading features many quotes and highlights. Good for TMA. 

Scenario

Imagine that an educational institution you know well has asked you to help develop a plan for rolling out learning analytics, either across the institution or across one section of the institution. The management team’s vision is that, in the future, they will be able to claim on their website that ‘learning and teaching at this institution are supported by learning analytics’. At this stage the management team is not sure what would be involved in implementing a change of this sort, so they need an outline of the changes and developments that might be required.

  • Start by reading the introduction to the report below, and Section 6, ‘The process of TEL innovation’.
  • Use the text in Figure 1: The Beyond Prototypes model of the TEL Complex as headings. Under each heading create a series of notes about what the institution would have to take into account.

Answer:

Policy Context:

Student Community

“The communities associated with these different sets of stakeholders [teachers, students, etc.] often have different sets of values, perspectives, objectives and above all, expertise.”

In my organisation this speaks to the parents of children, or the friends of adults, who might well have other ideas about education and practice. It is not only about the student, but also about satisfying the customer’s networks.

Ecology of Practices

“The strong community present within the TEL Complex constitutes a major challenge for TEL innovation, and in many cases exhibits super-stability, meaning that change is extremely difficult to achieve. In particular, current expectations of teachers and students affect the adoption of TEL innovations.”

In my organisation, I would include managers in the decision process; ultimately the decision lies with them. There is also a slight variance between what teachers ‘on the ground’ see and what managers expect or assume. This disconnect can result in large-scale changes to the system being seen as unfavourable.

Environment:

Pedagogic Res Community

“Pedagogy comprises an extremely complex and distinctive process which involved both student and teacher engagement, delivering a set of education services by means of specific channels.”

Consideration would need to be given to how this relationship between student and teacher might change, given the potentially unequal distribution of learning analytics, i.e. with the teacher holding information about the learner. Ethical considerations would need to be taken into account.

An open approach would be needed in the institution (Phoenix) to share with adult learners why certain actions are taking place. It would be challenging to ensure that all students receive a similar level of teacher attention and similar tasks, especially in group contexts.

Revenue Generation:

Technical Communities and Teacher Communities (combined for ease)

These communities will need to take into account the ecology surrounding the practices that are decided on.

“Current practices are not easily altered; they are at the core of super-stability in the overall educational system.”

Both student and teacher practices need to be considered, and how these will be affected by changes in TEL. Training, apprenticeships, and experience are required. The changes cannot be made in isolation; a shift in TEL approaches in my organisation will likely result in shifts in other pedagogical aspects of the classroom.

Tech Context

“They are the technological elements that are used to support the pedagogy with the aim of achieving a vision that is concerned with enhancing learning in a specified way.”

Financial concerns might arise within my institution due to new training and tools being required. Extensive decision-making processes to ascertain which analytics will be deployed, and how, will need to be agreed upon.

Technical

“These complex interdependencies make it difficult to get any one element to work or make a difference by itself without consideration of the whole.”

“Ultimate success depends on the totality of the configuration or bundle, rather than on any single component.” Furthermore, existing technologies often come together to form new innovations. Which of the existing technologies in our institution, such as the NOS or the ETS, could be used in this manner?

As mentioned above, any changes to the system of teaching would surely result in changes being seen or needed in other areas. For example, the shift to e-books over standard paper textbooks brings enhancements such as advanced annotation, but limits the ways a textbook can be used in a teaching context and could raise external concerns, such as distractions.

Funding:

“In the case of TEL, if policy dictates that funding for TEL is subsumed within general educational budgets or within special project funding, then competition with regular demands to the time-limited nature of project funding can work against long-term sustainability and adequacy of support. This is important, because complex innovations typically require decades for effective diffusion.”

Within my organisation, such considerations are important for much the same reasons as given above. Furthermore, employee wage demands based on workload increases could also play a role. This funding might not come from external sources, and the burden of controlling it could fall on management.

  • In your notes, consider these questions:
    1. Which groups and factors fall under each heading?
    2. How are these groups likely to feel about the management vision identified above (would you recommend changing the wording)?
    3. Which barriers will need to be overcome?
    4. What changes would have to be made in order to achieve the vision?

Answers:

1. Done.

2. How would the following feel about the management vision:

Student Community

Changes will need to be made not only to students’ syllabus and teaching style; students’ wider networks, such as friends and family, will also need to be considered. How can the changes be conveyed to those paying, for example?

It is likely that students who have been with us for a longer period of time will have more confidence in the changes.

Pedagogic Res Community

Changes in one aspect can overflow into others. It is important that in my institution we do not assume that TEL innovations/implementations occur in isolation. How will students, teachers, and management be affected by, for example, the introduction of analytics?

Important quote: “[A research group is] lulled into thinking that when they have a successful pilot the next step will be easy. This next step is the hardest step of all. When they go to schools with their piece of kit and their wonderful technology [they fail because] other factors such as curriculum, professional development, sustainability and appropriateness are not taken into consideration.” (pg. 32).

Senior staff would likely show reluctance to change, especially if it requires new workloads. The vision of how such changes will benefit the organisation needs to be clear.

Technical Communities

Training and general experience are needed, both for students and teachers. Extra budget or time would need to be allocated to assist the transition.

Again, the benefits need to be clear for all stakeholders, so as to move away from personal cost-benefit analysis towards a broader consideration of positive effects.

Teacher community

The influence of potential challenges and changes needs to be made clear. Training and general familiarisation will be needed. Concerns in my organisation would relate to the division of labour, and in particular whether hierarchies emerge. A concern could be that younger employees are more accustomed to the technologies, making older employees less welcoming of changes that might see their skills becoming outdated.

  • In a blog post, or in your learning journal, write a short, informal report for the management team about the changes that would need to be made to the ecology of practices, and the technical context at the institution, in order to introduce learning analytics successfully.

Answers:

  • Take into account:
    • access to resources and training
      • The project would need to focus on design-based learning (pg. 34 summary), to take into account all stakeholders’ concerns and wishes.
      • Due to the nature of analytics and other TEL innovations, on-the-ground support would be needed for training. This is especially true for senior staff.
      • Consideration of cost-benefit concerns (where individuals might reject the innovation because time and workload constraints outweigh the benefits they perceive) makes training a very important step in explaining to staff what is occurring and WHY.
    • what the major barriers are likely to be
      • The vision is rarely achieved from the outset, and instead “emerges and evolves through exploration, through networking and through the active engagement in research, development and educational practice.”
      • “Engagement [with students and teachers during the design process] may also be a necessary condition to properly understand the ecology of practices that will be the context for any particular TEL innovation.”
      • Major barriers are likely to occur in the following sections above:
        • Student community/Ecology of practices – as these are the customers, their satisfaction and understanding of the new process is paramount. How it would affect them positively needs to be considered, as does the support of those paying, be it students themselves or parents/others.
        • Teaching community – for the reasons given above, teachers need to be on board with changes and willing to sacrifice some of the status quo.
        • Technical – training will be a large component and would likely need to continue for an extended period of time. Senior staff in particular need to be considered. Furthermore, for newcomers to the company, training schedules will need to be defined, considering how easily they can catch up with developments and why those developments have occurred.
        • Funding – in my organisation this will occur internally. Concerns over staff workload will recur in relation to wages.
    • who would be responsible for dealing with problems.
      • Likely a team leader will be selected, and a team constructed to deal with all aspects of the project/innovation. This leader will need to pay attention to all the factors above, with particular concern for the knock-on effects of a change affecting all sectors of the institution.

    If you conclude that it is not possible to achieve the management vision, make it clear why this is, and how your conclusion is influenced by the ecology of practices and the technical context.

  • Share your conclusions, or a link to your blog post, in the forum. Discuss there, or in OU Live, whether you think your chosen institution will be able to claim in the future that ‘learning and teaching at this institution are supported by learning analytics’.

Week 24 – Activity 17: Why analytics may be ignored 

Timing: 2 hours

  • Start by reading the paragraph headed ‘Outcomes of participant observation’ (p. 157) and then read from the heading ‘Why numbers are not enough’ (p. 159) to the end of the paper.
  • Dawson and Macfadyen group the reasons for lack of uptake under two headings: ‘Perceived attributes of an innovation’ and ‘The realities of university culture’. In a blog post, or in your learning journal, note the reasons they identify for lack of uptake, and choose your own headings to group them under.
  • In the discussion forum, or in OU Live, discuss the headings you have selected. Can you agree on a common set of headings? Do any of these groups of reasons stand out as more important than the others?

Answer:

Copied directly from the text. Cite before use.

Institutional Culture

– We suggest here that this may be the result of lack of attention to institutional culture within higher education.
– lack of understanding of the degree to which individuals and cultures resist innovation and change.
– lack of understanding of approaches to motivating social and cultural change.
– Although social systems such as educational institutions do evolve and change over time, they are inherently resistant to change and designed to neutralize the impact of attempts to bring about change.

Lack of Support From Individuals

– no vision or plan will emerge or be embraced without the support of faculty and staff.
– an individual’s reaction to change reflects their cognitive evaluation of the way in which a new event or context will affect their personal wellbeing
– individuals will assess it situationally for its “relative advantage”.
– They will assess it for “compatibility”: the degree to which it is consistent with existing practice and values, and with needs of potential adopters.
– They will assess it for “complexity”: the degree to which it is perceived to be difficult to understand or to use.

Workload

– Faculty may view the introduction of technologies into teaching as a time-consuming imposition
– Activities may be perceived as being antithetical to the current institutional culture.
– time-commitment needed for quality instructional design

Poorly Understood

– Potential for learning technologies to enhance teaching and learning may be poorly understood and incongruent with individual perceptions and beliefs surrounding good teaching practice.
– Faculty may worry that spending time on technology will actually hamper their career due to poor evaluations of teaching.

Underdeveloped Ed Tech Sector

– academic culture still rewards faculty for verifiable teaching expertise.
– current lack of standardized methods of assessment of online teaching expertise.
– cooperative nature of effective team-based course development mean that incentives are often very low for faculty to invest time in working with technology.

Senior Staff Reluctance

– members of the senior administration participating in committees charged with LMS review and selection—are typically senior faculty members rather than professional managers.
– assessing the degree to which any change will burden themselves and their colleagues with the need to learn how to use complex new tools, and/or the need to redesign change their teaching habits and practices, without offering any appreciable advantage or reward.

Investment in Student Outcomes

– Information technology managers and staff similarly are most likely to assess proposals for new technology innovations from the perspective of workload and technical compatibility with existing systems, and have an even smaller investment in student learning outcomes.

Lack of Clear Goal(s)

– absence of a strategic goal or vision (and of any clear incentives to strive towards such a strategic vision), analytic data reporting on current LMS data have little motivating power.

Institutional Resistance

– “institutional resistance” is found in the very culture of academic institutions
– consensus governance (rather than industrial-style hierarchical management)
– faculty control over the major goal activities (teaching and research)
– an organizational culture that supports change by adding resources rather than by strategically reallocating resources
– a curriculum structure that makes false (though some would argue, necessary) assumptions about learner homogeneity
– any direct interference in faculty democracy is not welcome.

Interpretation and Observability of Analytics

– Not used to highlight progress and room for growth against a backdrop of institutional targets and vision—and if participants are committed to the vision and motivated to achieve it
– Interpretation remains critical
– Greater attention is needed to the accessibility and presentation of analytics processes and findings so that learning analytics discoveries also have the capacity to surprise and compel, and thus motivate behavioural change
– to date, efforts to mine educational data have been hampered by the lack of data mining tools that are easy for non-experts to use.
– Poor integration of data mining tools with e-learning systems; and by a lack of standardization of data and models so that tools remain useful only for specific courses/frameworks.
– Collectively, these difficulties make analytics data difficult for non-specialists to generate (and generate in meaningful context), to visualize in compelling ways, or to understand, limiting their observability and decreasing their impact.