Tagged: W25A25

Week 25 – Activity 25: Evaluation frameworks

Timing: 3 hours

  • Read this report, which has a particular focus on examples of using the framework (Section 3.2):

    The fifth example in the framework is the SNAPP tool that you read about in Activity 14. Compare this with Example 4, which also focuses on a social networking tool. Make notes in your learning journal about how useful the framework (Figure 1) was for understanding and comparing the tools.

Answer:

The framework was particularly useful for noting the key similarities and differences. These are highlighted in the table below (differences in red, similarities in blue). Without mapping the characteristics of each analytics project against common criteria, one could simply conclude that both projects are similar due to their similar pedagogical aims (networking/socio-connectivism).

It was also useful in its conclusion that Example 4 could not be considered analytics because it offers only ‘diffuse actionable insights’ and thus, by implication, no predictive models.

Comparison: Example 4 – Research Librarianship Social Network Analysis vs Example 5 – SNAPP (Social Networks Adapting Pedagogical Practice)

Analysis subjects, objects and clients
  • Example 4: Analysis subjects: journals, articles and associated authors (individually identified). Analysis clients: researchers who study research librarianship, particularly new researchers. Analysis objects: not clearly defined.
  • Example 5: Analysis subjects: principally learners but also teachers (personally identifying online forum interactions). Analysis clients: teacher. Analysis objects: teacher (their learning activity design).

Data origin
  • Example 4: Data was obtained from the Thomson Reuters Social Sciences Citation Index, a subscription service offering comprehensive and quality-controlled data on scholarly works and citations.
  • Example 5: Private data from a learning management system (virtual learning environment). SNAPP uses raw data that is automatically generated in the course of using the online forum. Processing is at the level of a teaching cohort, so the scale of analysis subjects is likely to be small in conventional educational settings, although the interaction count may be modest in scale.

Orientation and objectives
  • Example 4: Orientation: descriptive. Objective type: there is no clear objective, but a general aim to “gain deeper insights into the discipline of research librarianship”.
  • Example 5: Orientation: a diagnostic orientation is implied by SNAPP’s creators, i.e. that increasing online interaction is desirable. Objective type: performance (how effective is the learning activity design).

Technical approach
  • Example 4: A descriptive approach to social network analysis is used.
  • Example 5: A descriptive approach to social network analysis is used, with a strong bias towards visual presentation.

Embedded theories and reality
  • Example 4: There are references to literature describing the development of social network analysis; broad theories of social communication and community development are clearly embedded (e.g. “The general assumption behind this genre of studies is, that the more authors are co-cited, the stronger will be the bond they have”).
  • Example 5: SNAPP’s developers are overt in their software being designed to support socio-constructivist practice.

Comment
  • Example 4: The article is typical of applications of social network analysis in being of descriptive character. The research questions posed and the conclusions drawn are essentially informative rather than being action-oriented. The absence of clearly identifiable analysis objects is consistent with the informative outlook. This example largely fails to match our definition of analytics due to the presence of only diffuse actionable insights, even though it is a competent piece of data processing and probably of interest to researchers.
  • Example 5: In contrast to the research librarianship example of social network analysis, the chosen use of SNAPP is a much stronger example of analytics. The technical aspects and visualisations are very similar, but the intention towards actionable insights is quite different.

The technical approach in both cases is descriptive and does not surface mechanism. A more evolved approach might attempt to indicate the level of significance of differences between parts of the sociograms, or of changes over time. A further elaboration, permitting hypotheses about cause and effect between aspects of the learning design and the observed patterns of interaction to be explored and tested, is certainly in the realm of research and may only be tractable at a much larger scale.
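
To make the idea of testing changes between sociograms more concrete, here is a minimal, hypothetical sketch; it is not SNAPP's actual implementation. It assumes forum interactions are available as simple (replier, original poster, week) records and applies a crude permutation test to the change in mean degree between two periods; the data, names and choice of test are all illustrative.

    import random
    import networkx as nx

    # Illustrative forum interactions: (replier, original_poster, week)
    interactions = [
        ("alice", "bob", 1), ("carol", "alice", 1), ("bob", "carol", 1),
        ("alice", "bob", 2), ("dave", "alice", 2), ("carol", "dave", 2),
        ("bob", "dave", 2), ("dave", "carol", 2),
    ]

    def degrees_for_weeks(weeks):
        """Build a directed reply network for the given weeks and return
        each participant's total (in + out) degree."""
        g = nx.DiGraph()
        for replier, poster, week in interactions:
            if week in weeks:
                g.add_edge(replier, poster)
        return dict(g.degree())

    before = degrees_for_weeks({1})  # e.g. before a change to the activity design
    after = degrees_for_weeks({2})   # e.g. after the change

    students = sorted(set(before) | set(after))
    observed = sum(after.get(s, 0) - before.get(s, 0) for s in students) / len(students)

    # Permutation test: randomly swap each student's before/after degrees and count
    # how often a mean change at least as large as the observed one arises by chance.
    trials, extreme = 10_000, 0
    for _ in range(trials):
        diff = 0.0
        for s in students:
            a, b = before.get(s, 0), after.get(s, 0)
            if random.random() < 0.5:
                a, b = b, a
            diff += b - a
        if abs(diff / len(students)) >= abs(observed):
            extreme += 1
    print(f"mean degree change: {observed:.2f}, permutation p ~ {extreme / trials:.3f}")

Even a rough check like this would go beyond the purely descriptive presentation both examples offer, although, as noted above, small cohort sizes limit what can be claimed.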

In practical use, it would be important to guard against SNAPP being used as the single lens on the effect of changing learning activity designs. What of interactions that are not captured?

SNAPP can also be used by students to self-regulate, but there is anecdotal evidence to suggest that the tool is too teacher-oriented in its presentation for students to understand easily.

  • Now read the second framework:

    The paper goes into detail about how this framework was developed. For the purposes of this activity, though, focus on the framework as set out in Figure 9 and in the section called ‘Outline of the framework’.

    The framework’s authors are working on developing it as a tool for evaluating learning analytics. That work is still in progress and the 20 indicators can currently be regarded as generic headings that can guide evaluation and comparison.

  • Use the indicators identified in Figure 9 as headings to help you make notes about the SNAPP tool that you read about in Bakharia et al. (2009) in Activity 14. As the description of SNAPP in that paper is relatively short, you are likely to find that you cannot complete all the sections of the framework. This is always likely to be the case when you evaluate a tool. The framework helps you to identify questions you need to ask in order to make useful comparisons.

Answer: 

Headings from Figure 9. Answers, as they relate to SNAPP, are in bold. Non-bold text refers to direct quotations or points from the Scheffel et al. (2014) reading. As noted above, I may not have enough information to complete each section as it relates to SNAPP.

  • Objectives:
    • Awareness: makes educators aware of at-risk students; identifies high- and low-performing students; indicates to what extent a learning community is developing in a course
    • Reflection: “Reflection is not an end in itself, but a mechanism for improving teaching and hence maximising learning.” (McAlpine & Weston, 2000) (p. 127); provides before-and-after snapshots of interventions.
    • Motivation:
    • Behavioural change: Focus on improving teacher practices, mostly through interventions; behavioural change on the part of the teacher (when to intervene, etc.). Can also change students’ behaviour through benchmarking individual performance against peers.
  • Learning Support:
    • Perceived usefulness: “Educators hesitate to take the calculations of algorithms about learning and education effects as valid while at the same time they hope to gain new insights from those analytics results.” (p. 127). Easily integrated into LMS platforms.
    • Recommendation: Does not provide distinct recommendations to teachers, but suggests through self-analysis what might need to occur. This could certainly be extended to include predictions and base models in future iterations.
    • Activity classification:
    • Detection of students at risk: Very much a key aim, through lack of interaction or not getting responses/engaging (a minimal sketch of this kind of check appears after this list).
  • Learning Measures and Output:
    • Comparability: Cross-student comparison of engagement.
    • Effectiveness: “Their [Dyckhoff et al., 2013] findings show that in many cases LA tools do not yet answer all of the questions that teachers have in regards to their educational setting.” (p. 127). There is a danger of excluding or not seeing other factors that might influence student success (or other forms of communication); in other words, over-emphasising the importance of forum communication.
    • Efficiency:
    • Helpfulness: Helpful for identifying at-risk students.
  • Data Aspects:
    • Transparency:
    • Data standards:
    • Data ownership: “the last few years have shown an immense rise in the availability of and open access to data sets for the technology-enhanced learning, LA and education data mining domains” (p. 128).
    • Privacy: “The authors suggest four principles that provide good practice when tackling the above-mentioned conflicts [legal, risk, ethics]: (1) clarity, (2) comfort and care, (3) choice and consent, and (4) consequence and complaint.” (p. 128). To improve privacy-related matters, one should consider “(1) transparency, (2) student control over data, (3) right of access/security, and (4) accountability and assessment.” (p. 128). Not stated for SNAPP, but given that participant names are visible (names are shown in Figure 1), there could be ethical concerns, e.g. what if students want to opt out?
  • Organisational Aspects:
    • Availability: Available as a widget integrated directly into the browser; cross-platform. Unsure about the full range of LMS software/tools supported.
    • Implementation: “…(1) manage the student pipeline, (2) eliminate impediments to retention and student success, (3) utilise dynamic, predictive analytics to respond to at-risk behaviour, (4) evolve learning relationship management systems, (5) create personalised learning environments/learning analytics, (6) engage in large-scale data mining, and (7) extend student success to include learning, workforce, and life success.” (p. 129). Unsure how SNAPP would fit into a broader picture, i.e. will the data be fed to a larger data centre for further analysis?
    • Training of educational stakeholders: Up to the institution.
    • Organisational change: “…such intelligent investments from the organisations have a strong and justifiable return on investment; the implementation of enhanced analytics is to be seen as critical for student success on the one hand and achieving institutional effectiveness on the other as without it, organisations cannot meet the current gold standard for institutional leadership.” (p. 129) / Arnold et al. (2014) argue that “in order to truly transform education, learning analytics must scale and become institutionalised at multiple levels throughout an educational system” (p. 129). Could change the focus of educators to enhancing forum support/guidance.
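
As a concrete illustration of the ‘detection of students at risk’ indicator noted above, the sketch below is hypothetical and not part of SNAPP itself. It assumes forum replies are available as simple (author, replied_to) pairs and flags enrolled students with no interaction at all, or whose posts attract no replies; the data and flagging rules are invented for illustration only.

    import networkx as nx

    # Illustrative data: enrolled students and "author replied to replied_to" pairs.
    enrolled = {"alice", "bob", "carol", "dave", "erin"}
    replies = [("alice", "bob"), ("carol", "alice"), ("bob", "carol"), ("erin", "alice")]

    g = nx.DiGraph()
    g.add_nodes_from(enrolled)     # include students who never posted
    g.add_edges_from(replies)

    for student in sorted(enrolled):
        replies_written = g.out_degree(student)   # replies the student wrote
        replies_received = g.in_degree(student)   # replies the student's posts attracted
        if replies_written == 0 and replies_received == 0:
            print(f"{student}: no forum interaction at all - possible at-risk flag")
        elif replies_received == 0:
            print(f"{student}: posting but receiving no responses - possible isolation flag")

Any such rule would, of course, only capture forum behaviour; as discussed earlier, it should not be treated as the single lens on student engagement.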

 

  • Compare the structured evaluation that you have produced with the evaluation of SNAPP in Example 5 of Cooper (2012).

Answer:

The two frameworks have a very different focus. I prefer the Cooper (2012) analysis, as it brings in pedagogy and underlying elements (i.e. orientation and objectives), which, in plain terms, help to answer ‘what is the point of the tool, and will it be useful for me?’ However, Cooper’s approach is also more susceptible to evaluator bias than one that simply reports facts.

 

  • In the discussion forum, post a comment stating which framework you found most useful, and how you would suggest amending it for future use.