Tuesday, 12 November 2013

Staff & Student Perspectives on Introducing VLE Minimum Standards

I've recently closed the first annual student TEL survey in the Faculty of Health & Life Sciences at the University of Liverpool. This is the final phase of some investigative work I've been doing to find out what's happening in relation to TEL across the University (and particularly in my Faculty). It has included an audit of TEL activity and a staff TEL survey to gauge attitudes towards, and experience of, a range of TEL areas, including VLE baseline/minimum standards, online submission and lecture capture.

I'll create a few posts to discuss the range of data obtained in this student survey, but in this one I wanted to share some specific data related to minimum standards in the VLE, the idea being that a baseline could introduce some consistency across VLE modules for students. It's also something that students have requested previously.

Firstly, 90% of respondents in the staff survey (n = 102, August 2013) were in favour of introducing a baseline to help introduce consistency across areas. Oh goody! Further questions in the staff survey teased out the barriers to engaging further with TEL initiatives; 'lack of time to innovate' was by far the most common response (61%).

In both the staff survey and the student survey (n = 840, October 2013), respondents were asked to identify specific elements that should be included in any such baseline, and were presented with the following list of items:
  • A dedicated VITAL area for each module 
  • 'Welcome' to the VITAL area 
  • Contact Details (Module Leader) 
  • Contact Details (Other teaching staff) 
  • Module Specification 
  • Module Timetable/Schedule 
  • Module learning outcomes 
  • Module Assessment Strategy/Requirements 
  • Recommended/Further Readings 
  • Lecture notes/handouts 
  • Past exam papers (where appropriate) 
  • Ability to submit coursework online 
  • Opportunity for draft assignment (formative) feedback 
  • Online discussion forums 

I recognise this list could be endless, but I felt it gave us a reasonable starting point, and respondents had the option to select 'other' and leave comments if needed.

The Results

I ensured both surveys contained exactly the same question so I could compare answers. The chart below shows student (blue) and staff (red) responses on what they think should be included in any such baseline.
[Chart: student (blue) vs staff (red) support for each proposed baseline item]

It's clear that students most value a few things:
  • Lecture Notes (95%);
  • Past Exam Papers (93%);
  • Further Reading (88%);
  • Timetables (86%);
  • Module Leader Contact Details (83%).

What jumps out at me in this chart, though, is how staff consistently undervalue specific criteria in comparison with students:
  • Is this a common occurrence in other studies? 
  • Do academic staff presume they know what's best for students and, well, turn out to be wrong? Or, given that the general patterns across both sets of respondents are similar, do staff indeed know what students want, but just value it a bit (significantly?) less?
  • The only area where staff felt more strongly than students was in relation to a 'welcome' to the VLE area. I'm surprised by this, but perhaps our students just want to get straight to the content they need; and as face-to-face students, they don't really need an online welcome(?).

The areas where staff opinion differs most from student opinion are:
  • Provision of Past Exam Papers (46% difference)
  • Online submission of coursework (41% difference)
  • Provision of a Module Specification (38% difference)
  • Opportunity for draft/formative feedback on work (34% difference)
I think these differences are pretty significant and, to some degree, I don't understand why. For example, module specifications already exist, so why would staff not be open to making them available to students? (I think there is a separate argument about how accessible these documents are for students.)
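
As an aside, these gaps are just the per-item differences between the two surveys' percentages. A minimal sketch of the calculation in Python: the 'Past exam papers' student figure (93%) and the gaps match those quoted above, but the other underlying percentages are illustrative placeholders rather than the actual survey data.

    # Per-item gap between student and staff support for each baseline item.
    # 'Past exam papers' uses the figures quoted in this post (93% student,
    # 46-point gap); the other underlying percentages are placeholders.
    student = {"Past exam papers": 93, "Online submission": 80, "Module specification": 75}
    staff   = {"Past exam papers": 47, "Online submission": 39, "Module specification": 37}

    gaps = {item: student[item] - staff[item] for item in student}
    for item, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{item}: {gap} percentage-point difference")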

However...

Commenting on a similar difference in attitudes between staff and students at an Australian university, Kregor et al. (2012) eloquently offer a potential reason:
We therefore attribute the difference to respective roles where time saving and flexibility gains for students may inversely require additional workload or skill demands for some staff. It is self-evident that the two groups have very different relationships with the technology as a function of role.
I wonder to what degree this might be evident here at Liverpool, a research-intensive university where, for many, research is a (or the) priority?

I'd like to hear other people's views on this so please do share in the comments.
Peter
@reedyreedles


References

Kregor, G., Breslin, M., & Fountain, W. (2012). Experience and beliefs of technology users at an Australian university: Keys to maximising e-learning potential. Australasian Journal of Educational Technology, 28(8), 1382–1404. Retrieved from http://www.ascilite.org.au/ajet/ajet28/kregor.html

Creative Commons License
The Reed Diaries by Peter Reed is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License

5 comments:

  1. Interesting re the undervaluing. I would argue that in most answers the general trend is similar, and that there is little significant difference in how they perceive the value of most activities. Perhaps students are focusing on the 'crutches' or artefacts that aid their learning, whereas the staff are more concerned with actual learning outcomes.

    I think that 3 of the areas you draw out as having a big difference illustrate this:
    Provision of Past Exam Papers (46% difference)
    Online submission of coursework (41% difference)
    Provision of a Module Specification (38% difference)
    They are just things. A lecturer may think that historical exam papers are less important than the learning (although I concede that using them in a formative, contextual way would be more useful). Module specification and online submission are technical issues that lecturers may not even worry about.

    The final one "Opportunity for draft/formative feedback on work (34% difference)" may relate to workload?


    1. Thanks Lawrie,
      I like your more positive outlook on this :-)

      Interesting point re the difference in outlook between staff and students. Although things like lecture capture may have few pedagogic advantages, students want them (more on this in a later post). Things don't always have to be about *'enhancing'* learning in particular contexts, but can be about *'supporting'* the learning process over time. The 3 things you highlight are central to key NSS themes and do just this.

  2. I'd agree with Lawrie about the possibility of this being about crutches. Most of those 'things' are really there to help students train to the test, not about education. Yes, it is good for students to learn to be goal-focused, but if the goal is the assessment, isn't something wrong?

    Also, I'd take the results of any survey like this with a pinch of salt. I have run similar ones, and students say they want commonality across departments, for instance, but they don't actually have access to other departments' areas, so it doesn't make a lot of sense. On top of that, they say they value lecture notes being online, but the evidence from observation and from logs says they still don't use them. There is a tendency to answer in line with what they think you want to hear!

    1. Thanks Pat.
      The consistency issue is between modules. So one module might be completely empty whilst another is overflowing. Granted, many of the things they are asking for don't really *enhance* learning, but it's still a part of their education. Students want to be able to get contact details for their tutors, or access timetables, etc. Granted, some things might be about training-for-the-test, but lots have nothing to do with it.
      I've tried to expand upon Mark Stubbs' idea of Hygiene factors in this post - http://thereeddiaries.blogspot.com/2013/11/tel-herzbergs-two-factor-theory.html, and I hope to expand upon it by running some focus groups soon.
      We know from server data that there is huge mobile access to the VLE - quite what they're accessing is unknown though.

      There is some difference between staff and student responses. I thought I'd included the details, but obviously not. Stats isn't my strong point, but here goes:

      • A Pearson product-moment correlation coefficient was computed to assess the relationship between staff responses and student responses. There was a strong, positive correlation between the two variables (r = 0.560), suggesting that staff recommendations for the inclusion of criteria in minimum standards are correlated with student suggestions.
      • Furthermore, a two-sample t-test was used to test whether the difference in responses was statistically significant. There was a highly significant difference between staff responses (M = 53.93, SD = 14.60) and student responses (M = 77.29, SD = 14.24); t(26) = 4.28, p = 0.0002. Both tests are sketched below.
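
      For anyone wanting to replicate this, here's a minimal sketch of both tests in Python with SciPy. The two lists stand in for the 14 per-item agreement percentages from each survey; the values below are illustrative placeholders, not the actual survey data.

          # Both tests described above, via SciPy. The percentages here are
          # illustrative placeholders, not the actual survey responses.
          from scipy import stats

          staff   = [50, 47, 39, 37, 43, 55, 70, 45, 62, 38, 52, 66, 58, 77]   # hypothetical %
          student = [72, 93, 80, 75, 86, 83, 95, 68, 88, 71, 75, 70, 60, 66]   # hypothetical %

          # Pearson product-moment correlation between the two response profiles
          r, r_p = stats.pearsonr(staff, student)
          print(f"r = {r:.3f} (p = {r_p:.4f})")

          # Independent two-sample t-test on the per-item percentages
          # (df = 14 + 14 - 2 = 26, matching the t(26) reported above)
          t, t_p = stats.ttest_ind(student, staff)   # equal variances assumed by default
          print(f"t = {t:.2f}, p = {t_p:.4f}")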

  3. Thanks for sharing this; these results are useful and significant. Perhaps not surprisingly, most of the items on the list are of the 'module management' (course management) type rather than about the learning process itself. They are essential, but only part of the story. The formative feedback item gets closest to measuring the learning. Discussion boards can be good, but it makes a difference what they are used for: are there any learning tasks that students are expected to do with them, or are they just used for asking logistical questions about assignments, for example?

    The tricky bit is moving from just measuring how *much* the tools are used (easy) to how *well* the tools are used (hard).

    We have been using Blackboard's Exemplary Course Programme rubric as a guide when designing our minimum required presence, and are currently working on an 'enhanced' level, which will include optional items such as the use of blogs or wikis for reflective practice, group work, and social media for peer learning.

    What we like about the ECP is that embedded in it you can find the keystones of recognised good educational practice: Bloom's taxonomy, active learning, constructive alignment, etc. It goes beyond the tools in the VLE.

    Is anyone else here using the ECP?

    Many thanks for opening this discussion.
