I have co-authored a paper with a colleague, David Jones, which was published at the ASCILITE 2014 conference held in Dunedin, New Zealand. The paper was titled Breaking BAD to bridge the reality/rhetoric chasm. The reality/rhetoric chasm is best expressed through the following metaphor, in the words of Professor Mark Brown:
E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (Laxon, 2013, n.p.)
A central tenet of the paper is the following argument about this chasm:
Our argument is that the set of implicit assumptions that underpin the practice of institutional e-learning within universities (which we’ll summarise under the acronym SET) leads to a digital and material environment that contributes significantly to the reality/rhetoric chasm. While this mindset underpins how universities go about the task of institutional e-learning, they won’t be able to bridge the chasm.
Instead, we argue that another mindset needs to play a larger role in institutional practice; to what extent, we don’t yet know. We’ll summarise this mindset under the acronym “BAD”.
A comparison of SET and BAD is provided in the following table:
What work gets done?
SET: Strategy – following a global plan intended to achieve a pre-identified desired future state
BAD: Bricolage – local piecemeal action responding to emerging contingencies
How is ICT perceived?
SET: Established – ICT is a hard technology that cannot be changed. People and their practices must be modified to fit the fixed functionality of the technology.
BAD: Affordances – ICT is a soft technology that can be modified to meet the needs of its users, their context, and what they would like to achieve.
How do you see the world?
SET: Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy of distinct black boxes.
BAD: Distributed – the world is complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks.
The paper uses the establishment of the Moodle Activity Viewer (MAV) at my institution as an example of using BAD principles to improve e-learning. However, this is not the focus of this blog post. As a means of improving my own conceptions of BAD, SET and their interplay, I have begun reflecting on how I have unwittingly applied BAD principles to my other endeavours. A recent example relates to my use of task management software, which I detailed in a recent post and will summarise here.
I recently switched to a new task management system called OmniFocus. OmniFocus provides the ability to select what it calls perspectives, which show your tasks in different ways according to your workflows and context. One such perspective, new to the recently released OS X version, is called the Forecast perspective. For the coming days, this perspective shows which tasks are due to be started (those deferred to a later date when entered) and which tasks are due to be completed. This information is then augmented with appointments from your OS X calendar application. It’s a lovely way to see what you need to do alongside your existing time-based commitments, to help plan getting things done. But there was a problem. Any deferred task that was not completed on the date it was deferred to would not shift to the next day in the Forecast perspective. Instead, it simply disappeared from the perspective entirely until its due date. Through an online search to see whether I had mis-configured my database, or whether anyone else was as baffled as I was, I came across the following entry on the OmniFocus discussion boards:
I’m going to submit this as a feature as well, but I figure I’ll post it here to see whether it can get more traction. My issue is this:
If I have Deferred something to a start date in the future, odds are I probably think it’s pretty important that it starts on that day.
However, what happens if that day comes and goes and I didn’t start the item? In Forecast, the item disappears. That doesn’t make any sense to me. Forecast shouldn’t only be showing me the “past due” things I assigned dates to, but things I didn’t touch that I was supposed to.
Seems I was not alone in my frustration. The forum thread continued with discussion of various workarounds, none of which I found particularly suitable to my context. So I sought other possibilities for resolving the problem.
OmniFocus supports OS X’s AppleScript framework, which provides a high-level scripting language that can be used to customise behaviour and automate tasks, making things work in ways not originally conceived by the software’s creators. A handful of contributors have created some very useful AppleScripts for OmniFocus. One such contributor, Curt Clifton, has created a script that identifies projects with no next action to perform, suggesting that the project may have stalled.
The significance of this script is that I was able to adapt it to solve my problem of deferred tasks disappearing from the Forecast perspective when they are not completed on their defer date.
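The actual script is written in AppleScript against OmniFocus’s scripting dictionary. As a rough sketch of the adapted logic only – the task records below are a hypothetical stand-in, not OmniFocus’s real object model – the idea is to surface any task whose defer date has passed without it being completed:

```python
from datetime import date

# Hypothetical task records; the real script walks OmniFocus's
# AppleScript object model rather than a list of dicts.
tasks = [
    {"name": "Draft report",  "defer_date": date(2014, 11, 3),  "completed": False},
    {"name": "Review slides", "defer_date": date(2014, 11, 20), "completed": False},
    {"name": "Send agenda",   "defer_date": date(2014, 11, 1),  "completed": True},
]

def slipped_tasks(tasks, today):
    """Tasks whose defer date has passed but were never completed --
    exactly the items that vanish from the Forecast perspective."""
    return [t["name"] for t in tasks
            if not t["completed"] and t["defer_date"] < today]

print(slipped_tasks(tasks, date(2014, 11, 10)))  # -> ['Draft report']
```

Anything this check flags can then be brought back to the lecturer’s (or in this case my) attention, rather than silently disappearing until its due date.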
Returning to the principles of BAD, the Omni Group has made design decisions that allow customers to ‘break bad’ by customising the OmniFocus product to their own ends. Without the AppleScript integration in OmniFocus, there would be little hope other than waiting for the Omni Group to implement a feature to address the limitation.
The integration of the AppleScript framework into OmniFocus allows Bricolage to occur. It offers Affordances such that I and others are able to solve problems locally and contextually, according to our own specific needs and wants. The creation and use of this bricolage is also Distributed – the Omni Group does not have direct control or management over the extensions that can be applied. Things can be developed and/or shared in a distributed fashion according to the needs of individuals.
While the use of the AppleScript framework won’t solve everyone’s issues and challenges, OmniFocus, like many products that integrate an AppleScript dictionary, does shift away from the traditional SET mindset of software development.
This blog post introduces an emerging implementation of learning analytics for lecturers called the Moodle Activity Viewer (MAV), which offers a novel approach to the visualisation of learning analytics within the Moodle LMS. Its design was motivated by the frustration lecturers at my institution experienced with the standard analytics reporting functions available in Moodle 2.2. While there are many efforts underway to improve this functionality in the latest releases of Moodle, at least for my own institution these improvements are likely to be years away from adoption. One emphasis of these improvements is a greater use of graphs over the tabular lists common in earlier Moodle 2.x versions.
What is MAV and How Does it Help Lecturers?
MAV takes a fresh approach to representing student activity within Moodle, by using heat maps (or click heat maps) as shown in the screenshot below:
Heat map of Resources Usage by Students using MAV
In the example above, MAV represents the number of students who have accessed each resource and activity on a Moodle course site by colouring the links accordingly. In this way, MAV is focused on assisting with teacher reflection – identifying which elements of the course were used by the most students, and which weren’t. On being presented with the above snapshot, the lecturer responded: “Aaaah that’s interesting. I’m surprised that as many students as that used some of the links.” After further discussion, they shared the following comments:
I do feel that [the course] is guilty of that to some degree that we baffle them with BS and overwhelm them with far too many resources till they can’t separate the forest from the trees. I was certainly in two minds about even including most of those resource links at the beginning of the semester. I can certainly understand the results in the mid-term tests…they were compulsory
This is exactly the sort of teacher reflection that was intended by its design. Another excerpt from a Moodle page rendered using MAV from a much smaller postgraduate course is illustrated below.
Heat map of Assessment Resources Usage by Students
The lecturer of this course has been experimenting with MAV for a few months, and had the following to say about what changes might be made in future offerings:
… academics as students may already know and understand about the feedback resources I gave them, that is why they didn’t bother reviewing. Now I’m thinking of removing them. But in saying that, it is up to me to provide the scaffolding they require, so I’m thinking I should leave it there because it is good practice, even though I know they aren’t using it, but could potentially use it for their own students.
When asked about MAV’s ease of use, the lecturer said:
It was very easy to use MAV to get an insight into useful resources. I would have no idea how to get this info through normal Moodle tools and …
Within higher education, learning analytics is predominantly used to identify “at-risk” students with a view to preventing or limiting student attrition (Chatti, Dyckhoff, Schroeder, & Thüs, 2012; Lodge & Lewis, 2012). Identifying learners “at-risk” is only one, albeit important, complex issue to act upon. MAV in its present form is designed to assist lecturers with teacher reflection and course learning design.
Why Heat Maps, and How Does MAV Work?
Norman (1993) reminds us that “We humans are spatial animals, very dependent upon perceptual information. Representations that make use of spatial and perceptual relationships allow us to make efficient use of our perceptual systems, to think experientially.” Heat maps allow the lecturer to visualise student activity spatially, within the real-world Moodle site itself, rather than through abstract graphs or tabular totals. This is not to say that graphs and tables are not valuable. The heat maps are simply an alternative approach, and one that is more accessible to a broader cross-section of lecturers: it is easy to use and requires no training or guided instruction – it simply leverages our perceptual intuition. The screenshot (left) shows how MAV can be switched on and off in the same way that editing mode can be turned on and off – something even the complete Moodle novice quickly masters.
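As an illustration of the general idea only – this is not MAV’s actual palette or scaling, neither of which is detailed here – a click heat map can be as simple as mapping each link’s access count onto a white-to-red scale and applying the result as a background colour:

```python
def heat_colour(count, max_count):
    """Map an access count onto a white-to-red scale, returned as a
    CSS rgb() string. A sketch only: MAV's real palette and scaling
    are not specified in this post."""
    if max_count == 0:
        return "rgb(255, 255, 255)"
    ratio = count / max_count
    # Fade the green and blue channels as the count rises,
    # so heavily used links glow red and unused ones stay white.
    level = round(255 * (1 - ratio))
    return f"rgb(255, {level}, {level})"

print(heat_colour(0, 120))    # -> rgb(255, 255, 255)
print(heat_colour(120, 120))  # -> rgb(255, 0, 0)
```

The browser addon would then set each link’s background to the computed colour, which is what produces the “painted onto the course page” effect in the screenshots above.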
The tool presently has a modest list of configurable options, which is planned to expand over time. This expansion will be balanced against the value of keeping the tool simple and focused on the tasks lecturers wish to perform. At present, the options largely focus on teacher reflection, with lecturers able to change the following properties of the representation:
display count of clicks versus count of distinct students
select specific weeks of the term for activity (incomplete)
select specific groups within the class
select either a heat map visualisation or a font-size visualisation (think Wordle or tag clouds), the latter for those with colour-blindness
These options are changeable through the dialog (below) that is presented in the browser page, when the lecturer clicks on the Activity Viewer Settings option immediately beneath the on/off option in the settings menu (shown above).
MAV Settings Dialog within Moodle Page – Display Mode
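To make the shape of these options concrete, they could be captured in a small settings structure. This is purely a hypothetical sketch – the field names below are mine, not how the addon actually stores its settings:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MavSettings:
    """Hypothetical container for the options described above."""
    count_mode: str = "students"        # "students" (distinct) or "clicks"
    weeks: Optional[list] = None        # term weeks to include; None = all
    group_id: Optional[int] = None      # restrict to one class group
    visualisation: str = "heatmap"      # "heatmap" or "fontsize"

# e.g. a colour-blind lecturer counting raw clicks:
settings = MavSettings(count_mode="clicks", visualisation="fontsize")
print(settings.count_mode)  # -> clicks
```

Whatever the lecturer chooses in the dialog would travel with each request for activity data, so the server can aggregate accordingly.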
How is MAV Implemented?
In its present form, MAV has limited affordance for action. Like many other existing analytics tools, it focuses heavily on information and not enough on supporting action. For MAV, however, this is surmountable due to its somewhat unusual technical architecture. MAV is not implemented as a Moodle plugin on the Moodle server, but as a browser addon on the lecturer’s computer.
This browser-driven approach is not new. SNAPP, a tool for conducting social network analysis within Moodle, uses what are called browser bookmarklets and has gained considerable popularity. Another approach, more closely resembling MAV, was taken by Leony et al. (2012), who created a browser addon that “talks” to a related server component sitting alongside Moodle, from which it retrieves analytics data that is then “drawn” into the Moodle page as a graph in a custom block. To the viewer, the graphing block appears as a seamless part of the Moodle page, but in reality the information has been synthesised between the analytics server and Moodle. MAV too has a server component that “talks” to a copy of the Moodle database and extracts statistics for display by the browser addon. Where MAV diverges from the approach of Leony et al. (2012) is that it is not focused on the conventional approach of using a Moodle course block to display information. Instead, MAV treats the entire Moodle page as a canvas for conveying information, in a way that is contextual to the canvas itself.
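To sketch this division of labour – with the caveat that the table and column names below are simplified assumptions, not Moodle’s actual schema – the server side amounts to aggregating the activity log per link, with the “clicks versus distinct students” option deciding which count is returned to the addon:

```python
import sqlite3

# A tiny stand-in for Moodle's activity log. The real MAV server
# queries a copy of the Moodle database; "activity_log" and its
# columns are simplified assumptions for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE activity_log (userid INTEGER, url TEXT)")
db.executemany("INSERT INTO activity_log VALUES (?, ?)", [
    (1, "/mod/resource/view.php?id=10"),
    (2, "/mod/resource/view.php?id=10"),
    (1, "/mod/resource/view.php?id=10"),  # repeat click, same student
    (3, "/mod/forum/view.php?id=11"),
])

def usage_counts(db, distinct_students=True):
    """Per-link counts that the browser addon would fetch, then use
    to colour the matching links on the Moodle page."""
    col = "COUNT(DISTINCT userid)" if distinct_students else "COUNT(*)"
    rows = db.execute(f"SELECT url, {col} FROM activity_log GROUP BY url")
    return dict(rows.fetchall())

print(usage_counts(db))                          # distinct students per link
print(usage_counts(db, distinct_students=False)) # raw clicks per link
```

The addon only ever receives this small summary, which is part of why nothing needs to be installed on the Moodle server itself.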
The problem with traditional Moodle plugin development is that “the LMS is commonly managed at an institutional level and it must support several courses, [and so] installing a customised module becomes a complicated procedure both technically and administratively” (Leony et al., 2012). By using a browser addon and matching analytics server, “we simplify the task of providing visualizations to participants of the course, valuable for the execution [of] pilot studies or the evaluation of visualizations” (Leony et al., 2012).
This approach makes it easy to be agile and innovative without disturbing critical, high-availability environments such as Moodle. If the browser addon breaks in some way, it is easily disabled in the browser settings, restoring the default Moodle functionality. The approach is, of course, not without drawbacks: changes to Moodle are likely to break the browser addon, and there are sure to be variations between Moodle versions to consider. On the whole, it is believed that the advantages outweigh the disadvantages. This thinking will be tested over time.
MAV and Open Source
MAV is licensed under the GPL and will be made available in the coming weeks via GitHub. It is hoped that other educational institutions using Moodle may be interested in collaborating on the technology and the approach, to refine and improve the concept and implement new ideas. Research is planned around its design and use. An announcement will be made when the source code is available, along with details of how people can get involved.
MAV in the Future
The following outlines a sample of ideas for MAV in the future.
Allow the lecturer to visualise the activity of individual students
As a scenario, consider an assignment extension request from a distance student. In evaluating the request, one of the things lecturers often consider is the amount of work the student has done leading up to the due date, offset by any mitigating circumstances that may have prevented such work. Using MAV, the lecturer could select this student through the MAV settings, and see how often the student has accessed relevant aspects of the course, when, and perhaps in what sequence. This would assist the lecturer in making an informed decision about the validity of the student’s extension request claims.
Provide affordances for action based on the analysed data within MAV and Moodle
As previously mentioned, MAV presently offers little in the way of affordances for action on the visualised student activity. One useful function supportive of teacher reflection would be to assist lecturers in capturing their reflections on their course sites while they view the heat maps. In the examples given at the start of this post, the lecturers were analysing the student activity and making decisions about how they might change their course in the next offering. What if lecturers could click on a resource they wish to change and make notes within the Moodle site itself using MAV? The change and their thinking would be captured immediately as they reflect, without having to venture outside the Moodle site and thus break the natural flow of their reflective activity (Villachica, Stone & Endicott, 2006). A small icon could be attached to the resource or activity as a reminder that a change is to be made. Then, when they need to update the course for the next offering, sometimes a year later, the information is readily available and still in the context of their course site.
Integrate contextual information in Moodle pages in other ways besides heatmaps
As an example, wherever a student name or number is displayed on a Moodle page, provide a hover menu or visual that offers additional information about the student. This could include their contact details and other information from their profile page, but it could also be a range of information from other sources (see next point). This also marks a shift from the low-hanging fruit of clickstream data to information with more depth. As a contextual menu on each heat map link, options could be provided to list the students who haven’t – and, perhaps just as importantly, who have – accessed a given resource. These students could then be contacted via mail merge, either encouraging them to engage with the resource or activity, or praising them for doing so.
Using the browser addon architecture to integrate and aggregate other information services and data
The browser addon need not be limited to aggregating/synthesising information from only the analytics server component and Moodle. Opportunities exist to integrate MAV with other initiatives at my institution, such as the Student Support Indicators Project (SSI). This integration can work in both directions: use MAV to identify which students have not made use of critical resources or participated in activities, and then look up their student success factors through the SSI; similarly, on identifying a student within the SSI who is showing lower engagement with their course, redirect the lecturer to the Moodle course site with MAV switched on and showing only the elements of the course used by that student, giving further detail on their behaviour.
Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal of Technology Enhanced Learning, 4(5/6), 318. Retrieved from http://www.inderscience.com/link.php?id=51815
Leony, D., Pardo, A., Valentín, L. de la F., Quinones, I., & Kloos, C. D. (2012). Learning analytics in the LMS: Using browser extensions to embed visualizations into a Learning Management System. In R. Vatrapu, W. Halb, & S. Bull (Eds.), TaPTA. Saarbrücken: CEUR-WS.org. Retrieved May 25, 2013, from http://ceur-ws.org/Vol-894/paper6.pdf
Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine. Reading, MA: Addison-Wesley.
Villachica, S., Stone, D., & Endicott, J. (2006). Performance support systems. In J. Pershing (Ed.), Handbook of Human Performance Technology (3rd ed., pp. 539–566). San Francisco, CA: John Wiley & Sons.
Getting back to action research, the fountain of knowledge provides some key words around action research. Its major attributes appear to be reflection, iteration and problem solving. It also seems that it is commonly done in groups or teams. So it seems to be a process of:
1. identify a problem
2. implement a solution
3. reflect on the appropriateness/effectiveness of the solution
4. go to step 2
all in a team setting.
There is a parallel here with the definition of a reflective teacher, give or take the team aspect – in particular, teachers at levels 2 and 3 of the 3P model (Biggs, 2003). In fact, Bob Dick has drawn this comparison already. He contrasts action research and action learning (another popular eduspeak label, for another post) and suggests that: “Action learning was more often used in organisational settings. Action research [is] more common in community and educational settings. This distinction, too, is beginning to blur.” (Dick, 1997) So while action research is a general research methodology, it overlaps well with the education discipline.
Biggs, J., (2003), Teaching for Quality Learning at University, 2nd Edn, Open University Press, Berkshire.
Dick, B. (1997). Action learning and action research [Online].