Are students digitally prepared for university?

During my BSc (Hons) I took a module on Information Technology and Quantitative Biology, which involved sitting in a huge computer lab for four hours every Wednesday morning using different general and biology-specific software. Later, in my MSc, I had a similar experience using Excel and R. Even before this, I had completed a course on the use of office applications and taught myself basic coding. I did my school work experience in the labs at a power station as an electrical engineer, and then at a network installation company, doing everything from building computer servers to creating a database for them.

There are a lot of expectations around the level of digital literacy a student has, but research suggests that the most digitally literate age group is 35–45 (I am publishing a review where this can be found; I will link to it once the paper is out). I sit in this age group, and I believe the reason is that my digital literacy was built with a lot of training. There was a big move to digital: we got the first family computer (a Tiny PC) when I was about 11 or 12 years old, and I later moved on to building my own (which I still do). Since then, there has been a tendency to assume that people are ‘digital natives’ who develop digital literacy simply through usage. But this simply is not true. The evidence suggests that exposure does not equate to development (Instagram and Snapchat are definitely not equivalent to using Excel), and the ‘digital native’ idea has been disproven.

What does this mean for students? For a start, academia should be able to show academic development: the award should reflect a combination of ability and motivation. With the move to a reliance on digital tools for teaching, learning and assessment, the need for digital literacy development has never been more apparent, yet never more overlooked or treated as a skill learnt independently. Those with a less developed level of digital literacy need more guidance and overt training. Without this support, guidance and learning, the development does not happen. This in turn leads to frustration when using digital tools, which leads to low motivation to develop, which then maintains a lower level of digital literacy.

The computing curriculum for foundation subjects in England (at least) is built around coding, debugging and the like. This would be fine; Beall's paper from 1983 predicted that this would be part of modern academia. But I am not convinced that this is true for academics, and I am not sure that every student will use these skills. Students arriving in higher education from this curriculum may lack the more functional IT skills needed for learning, assessment, and simply surviving in an ever-changing post-digital society. In fact, I would argue that the need for everyone to code is less and less pressing as WYSIWYG development tools become more common. Granted, you would need a coder to build those tools, but that is not going to be everyone.

There needs to be some work done to ensure that students' awards are not limited by a lack of digital skills, and that they are given opportunities to gain the award that represents their ability, knowledge and effort without being hampered by an inability to use what many would consider everyday tools. The assumption that these are everyday tools for all needs to be challenged, and a personalised approach to digital development needs to be assured for all students. This runs from completing simple tasks like creating an Excel formula (yes, this is a simple task, and yes, Copilot is actually very good at helping with it) to using add-ins in Word to support a third-party reference manager like Zotero.
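To make concrete what I mean by a ‘simple task’ (the cell range and pass mark here are hypothetical, purely for illustration), averaging a column of marks and flagging a pass is a one-line formula in Excel:

```
=AVERAGE(B2:B20)
=IF(AVERAGE(B2:B20)>=40, "Pass", "Fail")
```

Trivial once someone has shown it to you; genuinely opaque if your digital experience is mostly a phone.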

