Timing: Less than 1 hour
- Look at this Tumblr website that was set up to support reflection on learning analytics:
Click the arrows at the top right of each picture to view the accompanying text. All the pictures on the site have been used to inspire reflection on analytics, using the tags #dream, #nightmare and #fairydust (for those who believe the promise offered by learning analytics is currently unrealistic). For example, an image of roughly brushed coloured stripes, ranging from pink to blue, was tagged #dream and prompted the reflection:
- ‘All the separate elements come together to form a complete picture. Presenting them in this way makes it clear that they fall into three main groups, although it is also possible to pick out individual elements and their slight variations. The blue-and-white sky-like quality of the top of the picture suggests that the sky is the limit. Round the edge, the occasional gaps between the paint and the canvas edge serve as a reminder that learning analytics never reveal the whole picture.’
An image of heavily pollarded trees was tagged #nightmare and prompted the reflection:
- ‘For me, these trees represent the learners. They could be left to develop freely in all directions, but learning analytics constrain their choices and reduce them to basic outlines, each resembling the other. All their individuality is stripped away by the analytics, and they are set in an impoverished and deteriorating landscape.’
- Choose one of the pictures on the Tumblr site, or another image of your own, and write a short reflection on learning analytics, which draws on what you have learned in this block and which is shaped by the image.
- Post your reflection, together with the image or a brief description of the image, in the discussion forum and tag it as dream, nightmare or fairydust.
- Look at other people’s contributions and comment on at least one of them.
Wassily Kandinsky – Composition VIII, 1923. Link to post here.
The chaos of this image reminds me of the different directions LA takes, i.e. from the students' side, to the teachers', to course administrators and management. Each has their own set of objectives and interests. There are obvious uses for LA, such as the overall aim of improving retention, which can be seen in the most standout sections, like the big circle in the top left. However, the rest is still finding its way and being debated.
Furthermore, LA is about individuals. Individuals do not fit nicely into boxes, as is seen here. There are some crossing points, like the tic-tac-toe box on the right, but in general it is difficult to extract consistent patterns, due to the various angles LA could take. Although there is hope, and I do believe we will find a way of really improving learning and teaching, the path is very messy and we must be ready for many challenges, dips and turns, amongst successes.
Timing: 3 hours
- Read this report, which has a particular focus on examples of using the framework (Section 3.2):
- Cooper (2012), A framework of characteristics for analytics.
The fifth example in the framework is the SNAPP tool that you read about in Activity 14. Compare this with Example 4, which also focuses on a social networking tool. Make notes in your learning journal about how useful the framework (Figure 1) was for understanding and comparing the tools.
The framework was particularly useful for noting the key similarities and differences. These are highlighted in the table below (differences in red, similarities in blue). Without mapping the characteristics of each analytics project against common criteria, one could simply conclude that both projects are similar due to their similar pedagogical aims (networking/socio-connectivism).
It was also useful in its conclusion that Example 4 could not be considered analytics, as it offers only ‘diffuse actionable insights’ and thus, by implication, no predictive models.
|Example 4 – Research Librarianship Social Network Analysis||Example 5 – SNAPP (Social Networks Adapting Pedagogical Practice)|
|Analysis subjects, objects and clients||Analysis subjects: journals, articles and associated authors (individually identified).
Analysis clients: researchers who study research librarianship, particularly new researchers. Analysis objects: not clearly definediii.
|Analysis subjects: principally learners but also teachers (personally-identifying online forum interactions).
Analysis clients: teacher.
Analysis objects: teacher (their learning activity design).
|Data origin||Data was obtained from the Thomson Reuters Social Sciences Citation Index. This is a subscription service offering comprehensive and quality controlled data on scholarly works and citations.||Private data from a learning management system (virtual learning environment). SNAPP uses raw data that is automatically generated in the course of using the online forum. Processing is at the level of a teaching cohort so the scale of analysis subjects is likely to be small in conventional educational settings, although the interaction count may be modest in scale.|
|Orientation and objectives||Orientation: descriptive
Objective type: there is no clear objective but a general aim to “gain deeper insights into the discipline of research librarianship”
|Orientation: a diagnostic orientation is implied by SNAPP’s creators, i.e. that increasing online interaction is desirable
Objective type: performance (how effective is the learning activity design)
|Technical approach||A descriptive approach to social network analysis is used.||A descriptive approach to social network analysis is used, with a strong bias towards visual presentation.|
|Embedded theories and reality||There are references to literature describing the development of social network analysis; broad theories of social communication and community development are clearly embedded (e.g. “The general assumption behind this genre of studies is, that the more authors are co-cited, the stronger will be the bond they have”).||SNAPP’s developers are overt in their software being designed to support socio-constructivist practice.|
|Comment||The article is typical of applications of social network analysis in being of descriptive character. The research questions posed and the conclusions drawn are essentially informative rather than being action-oriented. The absence of clearly identifiable analysis objects is consistent with the informative outlook.
This example largely fails to match our definition of analytics due to the presence of only diffuse actionable insights, even though it is a competent piece of data processing and probably of interest to researchers.
|In contrast to the research librarianship example of social network analysis, the chosen use of SNAPP is a much stronger example of analytics. The technical aspects and visualisations are very similar but the intention towards actionable insights is quite different.
The technical approach in both cases is descriptive and does not surface mechanism. A more evolved approach might attempt to indicate the level of significance of differences between parts of the sociograms or of changes over time. A further elaboration to permit hypotheses about cause and effect between aspects of the learning design and the observed patterns of interaction to be explored/tested is certainly in the realm of research and may only be tractable at much larger scale.
In practical use, it would be important to guard against SNAPP being used as the single lens on the effect of changing learning activity designs. What of interactions that are not captured?
SNAPP can also be used by students to self-regulate, but there is anecdotal evidence to suggest that the tool is too teacher-oriented in its presentation for students to easily understand.
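The descriptive social network analysis that both examples rely on can be illustrated with a minimal sketch. The data below is entirely invented: a tiny who-replied-to-whom edge list of the kind SNAPP extracts from forum logs, from which simple degree counts are computed and non-interacting participants flagged.

```python
# Minimal sketch (hypothetical data) of descriptive SNA on forum replies:
# build a directed who-replied-to-whom edge list, report degree counts,
# and flag participants with no interactions at all.
from collections import Counter

# Each edge is (replier, original_poster) -- invented example data.
replies = [
    ("amy", "ben"), ("ben", "amy"), ("cara", "amy"),
    ("amy", "cara"), ("cara", "ben"),
]
participants = {"amy", "ben", "cara", "dan"}  # dan enrolled but never posted

out_degree = Counter(src for src, _ in replies)   # replies sent
in_degree = Counter(dst for _, dst in replies)    # replies received

def isolated(participants, out_degree, in_degree):
    """Participants with no sent or received replies -- a simple
    'lack of interaction' signal of the sort SNAPP surfaces visually."""
    return sorted(p for p in participants
                  if out_degree[p] == 0 and in_degree[p] == 0)

print(isolated(participants, out_degree, in_degree))  # ['dan']
```

This is purely descriptive, matching the 'Technical approach' row above: it counts and displays, but says nothing about significance or cause and effect.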
- Now read the second framework:
- Scheffel et al. (2014), Quality indicators for learning analytics.
The paper goes into detail about how this framework was developed. For the purposes of this activity, though, focus on the framework as set out in Figure 9 and in the section called ‘Outline of the framework’.
The framework’s authors are working on developing it as a tool for evaluating learning analytics. That work is still in progress and the 20 indicators can currently be regarded as generic headings that can guide evaluation and comparison.
- Use the indicators identified in Figure 9 as headings to help you make notes about the SNAPP tool that you read about in Bakharia et al. (2009) in Activity 14. As the description of SNAPP in that paper is relatively short, you are likely to find that you cannot complete all the sections of the framework. This is always likely to be the case when you evaluate a tool. The framework helps you to identify questions you need to ask in order to make useful comparisons.
Headings from Figure 9. Answers, as they relate to SNAPP, are in bold. Non-bold text refers to direct quotations or points from the Scheffel et al. (2014) reading. As noted above, I might not have enough information to complete each section as it relates to SNAPP.
- Awareness: makes educators aware of at-risk students; identify high- and low-performing students; indicate to what extent a learning community is developing in a course
- Reflection: “Reflection is not an end in itself, but a mechanism for improving teaching and hence maximising learning.” (McAlpine & Western, 2000) (pg.127); Provide before-and-after snapshots of interventions.
- Behavioural change: Focus on improving teacher practices, mostly interventions. Behavioural change on the part of the teacher. When to intervene, etc. Can also change students’ behaviour through benchmarking individual performance against peers.
- Learning Support:
- Perceived usefulness: “Educators hesitate to take the calculations of algorithms about learning and education effects as valid while at the same time they hope to gain new insights from those analytics results.” (pg.127). Easily integrated into LMS systems.
- Recommendation: Does not provide distinct recommendations to teachers, but suggests through self-analysis what might need to occur. This could certainly be extended to include predictions and base models in future iterations.
- Activity classification:
- Detection of students at risk: Very much a key aim. Through lack of interaction or not getting responses/engaging.
- Learning Measures and Output:
- Comparability: Cross student comparison of engagement.
- Effectiveness: “Their [Dyckhoff et al., 2013] findings show that in many cases LA tools do not yet answer all of the questions that teachers have in regards to their educational setting.” (pg.127). There is a danger of excluding/not seeing the other factors that might influence student success (or other forms of communication). In other words, over-emphasising the importance of forum communication.
- Helpfulness: Helpful to identify at risk students.
- Data Aspects:
- Data standards:
- Data ownership: “the last few years have shown an immense rise in the availability of and open access to data sets from the technology-enhanced learning, LA and education data mining domains” (pg.128).
- Privacy: “The authors suggest four principles that provide good practice when tackling the above-mentioned conflicts [legal, risk, ethics]: (1) clarity, (2) comfort and care, (3) choice and consent, and (4) consequence and complaint.” (128). To improve privacy-related matters, one should consider “(1) transparency, (2) student control over data, (3) right of access/security, and (4) accountability and assessment.” (128). Not given, but given that names are available (shown in Figure 1), there could be ethics concerns, i.e. what if students want to opt out?
- Organisational Aspects:
- Availability: As a widget integrated directly into browsers. Cross platform. Unsure about the full range of LMS software/tools supported.
- Implementation: “…(1) manage the student pipeline, (2) eliminate impediments to retention and student success, (3) utilise dynamic, predictive analytics to respond to at-risk behaviour, (4) evolve learning relationship management systems, (5) create personalised learning environments/learning analytics, (6) engage in large-scale data mining, and (7) extend student success to include learning, workforce, and life success.” (129). Unsure how it will fit into a broader picture, i.e. will the data be fed to a larger data centre for further analysis?
- Training of educational stakeholders: Up to the institution.
- Organisational change: “…such intelligent investments from the organisations have a strong and justifiable return on investment; the implementation of enhanced analytics is to be seen as critical for student success on the one hand and achieving institutional effectiveness on the other as without it, organisations cannot meet the current gold standard for institutional leadership.” (129) / Arnold et al. (2014) argue that “in order to truly transform education, learning analytics must scale and become institutionalised at multiple levels throughout an educational system” (129). Could change the focus of educators to enhancing forum support/guidance.
- Compare the structured evaluation that you have produced with the evaluation of SNAPP in Example 5 of Cooper (2012).
A very different focus. I prefer the Cooper (2012) analysis, as it brings in pedagogy and underlying elements (i.e. orientation and objectives), which, in plain terms, can help understand ‘what the point of the tool is, and will it be useful for me.’ However, Cooper’s is also more susceptible to user bias, rather than just reporting facts.
- In the discussion forum, post a comment stating which framework you found most useful, and how you would suggest amending it for future use.
Timing: 6 hours
Imagine that your educational institution, or one you know well, has decided to develop a learning analytics programme and, early in the process, intends to run a workshop for stakeholders in order to develop a vision. You have been tasked with organising the one-day workshop that will move this process forward.
- Decide how many people you would like to attend the workshop, and which groups they should represent. Who definitely needs to be involved at this stage, and who can be involved later? Your decision will depend on the type of institution you have chosen, and on your view of learning analytics. Make notes in your learning journal or blog, and discuss your options in the discussion forum or using OU Live.
- Plan a timetable for the day. Workshops often run from 10 a.m. until 4 p.m., with an hour for lunch and two half-hour coffee breaks – but there is no need to keep to that format if it does not suit your plans or the local situation. You may want to include an introduction to learning analytics, a talk about the importance of a vision, some examples of learning analytics, an expert speaker or two, a chance to share experiences or previous work, an opportunity to brainstorm ideas, a chance to share ideas, and a final discussion. Although there is a lot to think about, your final schedule does not need to be more than a page long. Again, discuss ideas in the discussion forum or within OU Live. Together, you have a great deal of experience of participating in and running workshops, and this is a good opportunity to share that experience.
- Finally, use PowerPoint, Keynote or similar software to produce an introductory presentation for the workshop. Focus on why your chosen institution is interested in developing an analytics programme and why participants have been invited to help develop this vision. Make use of resources you have encountered during this block. The SoLAR and LACE websites both contain links to resources and presentations that could help. The LACE YouTube channel contains a series of short videos from learning analytics experts, including some from authors whose work you have read in this block. You might choose to embed one in your presentation. Share a link to your presentation in the discussion forum, and take the opportunity to view and comment on other people’s presentations.
- How many people you would like to attend the workshop, and which groups they should represent.
Based on my organisation (modified), 20 people will attend this workshop.
- Who definitely needs to be involved at this stage, and who can be involved later?
Involved at this stage – Management and teachers
Involved later – Parents/Students
|An introduction to learning analytics||Presented by: Anna Lea Dyckhoff
Discussing her paper, ‘Supporting Action Research with Learning Analytics,’ with a particular focus on the goals of learning analytics for various stakeholders.
|The importance of a vision||Presentation by: Professor Eileen Scanlon
Discussing her paper ‘Beyond Prototypes: Enabling innovation in technology-enhanced learning,’ with a particular focus on turning learning analytics aims into reality.
|Examples of LA in action||Presented by: Rebecca Ferguson
Discussing her paper ‘Setting learning analytics in context: overcoming the barriers to large-scale adoption,’ with a particular focus on the frameworks used in successful implementations of learning analytics.
|Sharing experiences or previous work.||In groups of 3, staff share ideas relating to the following questions:
1) Have you ever been surprised by student actions, such as dropping courses, sudden enthusiasm, or failing tests?
2) Do you believe the analytics discussed by our guest speakers could have helped predict and/or prevent the issue occurring?
|Brainstorming solutions to a problem.||Within the same group, discuss solutions to the following problems:
1) A student asks to ‘opt out’ of analytics. Do you allow this to happen? What do you say to convince him/her otherwise?
2) The analytics developed by your organisation send you a warning signal that a student in your class is on track to fail. What do you need to consider before intervening?
|Sharing solutions.||A group leader reports back on their solution to the cohort.|
|Wrap up.||A final word from the organisers.|
Timing: 4 hours
- Select two implementations of learning analytics from the list below.
- Ferguson et al. (2015), Setting learning analytics in context:
- Case Study 1A: The Open University, UK: Data Wranglers (pp. 131–3).
- Case Study 1B: The OU Strategic Analytics Investment Programme (pp. 133–7).
- Case Study 2: The University of Technology, Sydney, Australia (pp. 138–41).
- Colvin et al. (2015), Student retention and learning analytics:
- Cluster 1 – focused on student retention (pp. 19–21).
- Cluster 2 – focused on understanding teaching and learning processes (pp. 19–21).
- An example from your personal experience or from your reading on this subject.
- Read about the implementations and, with the help of the ROMA framework (see Figure 3 of ‘Setting learning analytics in context’), make notes on how the visions that underpinned these impacted on the implementation of learning analytics.
- You may find that the vision is not clearly defined. If this is the case, state it as clearly as you can, based on the information that you have.
- In the list above, the options from Ferguson et al. (2015) are already aligned with the ROMA framework. The options from Colvin et al. (2015) are not single examples but clusters of examples. Looking at these examples may help you to become aware of elements that are not clearly represented in the ROMA framework.
- I added point 19 (streamlining and improved data understanding/usability) to the end of the cycle as an additional consideration.
- In the discussion forum, or in OU Live, compare your findings with those of people who have selected different examples. Which vision aligns best with your view of how learning analytics should be used?
Q1. Make notes on how the visions that underpinned these impacted on the implementation of learning analytics
For Case Study 1A, the implementation was directly linked to the ROMA framework, as per Figure 3, although the article’s summary used slightly different language and steps. In the table below, the corresponding ROMA step (from my point of view) is given in the right-hand column.
Impact (related to *ROMA*)
Case Study 1A
|1. Using the volume of educational data more effectively.||Policy objectives were defined (ROMA 1)
|2. Develop a group of staff with expertise in the individual faculty contexts.||Policy objectives were defined (ROMA 1)
|3. Set up a system for collating, synthesising, and reporting on the available data.||Policy objectives were defined (ROMA 1)
|4. Produce reports at regular intervals.||Policy objectives were defined (ROMA 1)
|5. Build strong relationships with the faculties.||Policy objectives were defined (ROMA 1)
|6. Analyse and influence teaching and learning practice.||Map the context (ROMA 2)
|7. Senior management in each faculty (responsible for learning, teaching and/or curriculum development), curriculum developers, those responsible for data gathering and curation, and general senior management.||Stakeholders were identified (ROMA 3)
|8. Key focus on curriculum development and quality enhancement.||Learning analytics purposes were identified (ROMA 4)
|9. Integrate available data with completion rates, pass rates.||Learning analytics purposes were identified (ROMA 4)
|10. Conduct extensive consultation and feedback regarding implementation.||Strategy development (ROMA 5)
|11. Conduct early pilot work.||Strategy development (ROMA 5)
|12. Decide on an implementation plan, and dates for review.||Strategy development (ROMA 5)
|13. Provide/achieve content analysis.||Capacity analysis, human resources developed (ROMA 6)
|14. Develop a full understanding of the faculty teaching and learning context.||Capacity analysis, human resources developed (ROMA 6)
|15. Deployment of new technical tools (data management software, etc.)||Capacity analysis, human resources developed (ROMA 6)
|16. Develop an understanding and appreciation of what the data could show, as well as an awareness of how to access it without the mediation of a Data Wrangler.||Capacity analysis, human resources developed (ROMA 6)
|17. Build in feedback from stakeholders into the delivery of reports.||A monitoring and learning system was developed (ROMA 7)
|18. Gather feedback from key stakeholders from evaluation exercises.||A monitoring and learning system was developed (ROMA 7)
|19. Undertake reviews with the aim of streamlining.||Streamlining the process & improve understanding of data usability (non-ROMA). < Self-added|
Case Study 1B
|1. To use and apply information strategically (through specific indicators) to retain students and enable them to progress and achieve their study goals.||*Roma 1*|
|2. (Macro) Aggregate information about the student learning experience at an institutional level in order to inform strategic priorities that will improve student retention and progression.||*Roma 4*|
|3. (Micro) Make use of analytics to drive short, medium, and long term interventions.||*Roma 4*|
|4. Stakeholders make use of integrated analytics to inform interventions designed to improve outcomes.||*Roma 3*|
|5. Evaluate the evidence base for factors that drive student success (post initial intervention data collection)||*Roma 3*|
|6. Develop models that ensure key stakeholders can implement appropriate support interventions for both short- and long-term benefits.||*Roma 3*|
|7. To use an analysis of current student performance to identify priority areas for action, both in terms of change to the curriculum and learning design, and in terms of interventions with the students most at risk.||*Roma 4*|
|8. Develop a common methodology to evaluate the relative value of interventions through measuring the resulting student behaviours and improvements (to inform the future student experience)||*Roma 4*|
|9. Create near real-time data visualisations around key performance measures.||*Roma 6*|
|10. Triangulation of different data sources, to help in identifying patterns that influence success in a given context.||*Roma 6*|
|11. Create an ethics policy that details what data is being collected and its ethical uses.||*Roma 6*|
|12. Develop machine-learning-based predictive modelling systems. I.e. Will a student hand in his/her assignment based on online activity?||*Roma 3*|
|13. Improve feedback methods from students (shift from end of module assessment to in-module assessment) to improve reaction times to issues.||*Roma 4*|
|14. Measure the success/impact of learning designs through the systematic collection of data.||*Roma 5*|
|15. Create a CoP ‘evidence hub’ focused on the progression of first-year students to second year.||*Roma 6*|
|16. Development of ‘small data’ student tools, to allow students to monitor their own progress, visually, to make informed study choices.||*Roma 6*|
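Item 12 above envisages a machine-learning predictive model, e.g. ‘Will a student hand in his/her assignment based on online activity?’. The sketch below is a hypothetical illustration of that idea only: a tiny logistic-regression model trained by gradient descent on invented activity counts. A real system would use far richer VLE data and a proper ML library.

```python
# Hypothetical sketch of item 12: predict assignment submission from
# online-activity counts with a tiny hand-rolled logistic regression.
# All data and feature choices here are invented for illustration.
import math

# (logins_per_week, forum_posts) -> submitted assignment? (invented data)
data = [((9, 4), 1), ((7, 2), 1), ((8, 5), 1),
        ((1, 0), 0), ((2, 1), 0), ((0, 0), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit weights by plain stochastic gradient descent on log-loss.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y          # gradient of log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(logins, posts):
    """Estimated probability that the student submits the assignment."""
    return sigmoid(w[0] * logins + w[1] * posts + b)
```

In use, `predict(9, 4)` (an active student) comes out high and `predict(1, 0)` (an inactive one) low, so thresholding the probability gives the kind of at-risk warning signal discussed in the workshop scenario above.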
Q2: Which vision aligns best with your view of how learning analytics should be used?
Vision two, as there was a more defined focus on improving student progress, rather than the somewhat unfocused ‘Data Wrangling’ discussed in the first case study.
Timing: 1 hour
- Return to read Table 4 of the paper below. This identifies goals of learning analytics in general and also from the perspectives of learners and educators.
- Dyckhoff et al. (2013), Supporting action research with learning analytics.
- Goals and visions are not the same thing, and some of these goals are too mundane or too succinct to be inspiring. In a blog post, or in your learning journal, combine or develop some of these goals to construct a vision of what learning analytics could be at your institution or at one you know well. Aim for a vision statement that is no longer than two sentences.
- Once you have constructed a vision, note whether it seems to be a learner’s vision, an educator’s vision, a manager’s vision or a combination of these. How would it need to change, if at all, to inspire other stakeholder groups?
The aim of LA at my institution is to:
Provide both students and educators with a means to monitor their individual progress as it relates to a group, with a key focus on identifying actionable insights into ways to improve their work. These LA insights will aim to bring students and educators closer together, as mutual understandings develop relating to work progress and desired results.
The above statement appears to be more of a manager’s vision to a team. However, as it encompasses students and educators as the key stakeholders, it is felt to be fairly inclusive.