We spoke with Professor Steven Furnell, Professor of Cyber Security at the University of Nottingham, about how the cybersecurity field has evolved since the early 1990s. He shared insights on the growing importance of cybersecurity education, both within universities and for the wider public, and why foundational knowledge is essential across all sectors — not just in technical roles. We also discussed the challenges small businesses face in improving their cyber resilience, and what employers often overlook when hiring for cybersecurity positions.
You’ve been working in cybersecurity since before it became a mainstream discipline. What does your early experience tell us about how the field has evolved, and what role has academia played in shaping that evolution? What role do you see it playing in the future?
Yes, one look at me will confirm that I’ve been around for a while, and things today are rather different from when I first set foot in the area. When I started doing my PhD (back in 1992), the books and papers were talking about ‘computer security’ or ‘information security’, and it wasn’t something that the average person in the street had much contact with. If you were a computer user back then (and not everyone was), you were likely to be using passwords somewhere, it made sense to back up your data, and viruses were on the rise so there was an increasing need to protect against them. But good practice wasn’t uniform. People often chose poor passwords, lots of data wasn’t backed up, and many consequently ended up losing it to virus infections. The same was true for organisations – security was often recognised as an issue, but they weren’t necessarily doing things about it. Fast forward to today, and technology usage is absolutely routine, being online is a basic expectation, and we do it across a whole range of devices.
What we have today is a much more widespread recognition, but combined with a much more widespread need. Cybersecurity is now an issue for everyone making use of technology and online services. But I’d say that cybersecurity practice doesn’t seem to have advanced at the same pace. Those bad password choices are still around, and backups and malware protection are often only better because the technology is now doing it automatically rather than relying on us to take care of it.
Academia has always had a role in promoting security topics and principles, but what varied – and what still does – is how much attention and emphasis is given relative to other subjects that students need to know about. As our reliance on the technology has grown, there’s been an increasing recognition of the need to protect it and to design in security in the first place. So what we now face is a more fundamental need for cybersecurity to be taught – both as a topic in its own right, and embedded within the teaching of wider computing topics such as architecture, software and networking. Added to that, there is a general need for students of other subjects to be cybersecurity literate, so that they can study without their use of technology leaving them exposed to cyber threats.
Cybersecurity education remains largely anchored in computer science. Is that too narrow a foundation? What do we lose by not taking a more interdisciplinary approach — drawing on fields like law, business, psychology, or engineering?
It’s true that cybersecurity is typically anchored to computer science, and that in itself isn’t a bad thing. What’s important is to recognise that this isn’t the end of the story, nor the only perspective from which we need to build and maintain cyber expertise. As an example, let’s take a look at a couple of topic areas outside computing and think about how they’re related to cybersecurity. The very reason cybersecurity is needed is often to protect the organisations making use of the associated systems and data. So, the business perspective is absolutely relevant and it’s important to recognise cybersecurity as a business enabler rather than an overhead. Meanwhile, the fact that a significant volume of cyber breaches will track back to people means that understanding their psychology can certainly be useful. This applies to understanding those that we’re trying to protect (and to prevent from doing things that cause breaches), and the psychology of those that may wish to attack us. This doesn’t mean that every cyber graduate from a Computer Science department needs to have learned the details of business and psychology perspectives, but they ought to have an appreciation that those perspectives exist and have a role to play.
I think we can also overlook the perspective of other disciplines needing to consider cybersecurity as part of how they are taught. For example, law students should arguably learn about topics such as cybercrime and data protection, and build at least some appreciation of how cybersecurity has a part to play in providing related protection. Similarly, engineering students will often need to be thinking about security in the design and implementation of the wider systems they are realising, particularly in the cyber-physical systems context. So, it doesn’t all come down to what gets included in computer science programmes, but also how other areas can recognise where their part comes in.
If you were designing the ideal cybersecurity degree programme from scratch today, what would you include that most current courses are missing? Should every university student — regardless of discipline — graduate with some baseline cybersecurity literacy? What would it take to embed that in higher education at scale?
I don’t think it necessarily comes down to one thing that current courses tend to miss, but I do think there is often a skew towards the technical topics that are most immediately associated with cybersecurity (especially with reference back to your earlier point about most courses being anchored in computer science). Of course, there’s nothing wrong with a technical focus – it’s essential for many roles – but I tend to feel that students (and later practitioners) will fare better if they understand cybersecurity in a wider context, such that they’re at least aware of how it relates to the business and the people within it.
On the issue of cybersecurity literacy, yes, I think this is very much something that would benefit all students regardless of what they’re studying as their main focus. This doesn’t mean they all have to take an assessed module or coursework about cybersecurity, but there’s a good argument to be made that – just like wider digital literacy – cyber literacy is something that all graduates ought to be able to leave with. If they don’t, then we’re basically passing on the problem to individual employers to deal with later, and we can see from ample survey evidence over the years that (a) they have staff-related security issues and (b) only a minority of organisations are addressing cyber awareness in a significant way. So the more that we can do before people even get to the workplace the better.
You’ve contributed to national initiatives like the NCSC-certified degrees — which aim to standardize and raise the bar for cybersecurity education — and the Cyber Security Body of Knowledge (CyBOK), which maps out what professionals in the field should know. What kind of real-world impact have frameworks like these had so far? And where do you think they still fall short or risk becoming box-ticking exercises?
I believe that the degree certification has provided a basis for students and employers to have confidence that they will receive a good quality education in their chosen part of the topic. What the certification seeks to assess and ensure is that cyber security education is covering appropriate topics, being credibly delivered and assessed by a suitably qualified and resourced academic team, and that the student outcomes meet the expected standard. The extent of effort that an applicant needs to go through, and the evidence that they have to provide, is significant. So too is the effort on the reviewer side. And so anyone who’s been through the process, as an applicant or as a reviewer, will be very clear that it’s anything but a box-ticking exercise. For my part, I was involved in supporting the design of the original versions of the certification schemes and have remained involved in reviewing applications as the programme has evolved into using CyBOK as its foundation.
CyBOK is a substantial and ongoing activity to help map the core knowledge in the domain and signpost people towards it. It provides a good way of understanding what the key knowledge areas are, and how they fit together. Like any advancing area, of course, the challenge is that the knowledge evolves – and sometimes quite rapidly. If I look back to materials that I used in the early days of my PhD in the area, there are some things that still remain pretty much intact – confidentiality, integrity and availability got mentioned then, and they still get mentioned now. Similarly, general areas such as authentication, access control, cryptography and the like were all part of the picture. However, the technology landscape itself, the accompanying threats, and specific safeguards have all changed dramatically. So, CyBOK faces the challenge of evolving and maintaining the resource, and that’s what the next phase of the work aims to do, transitioning from a funded project to a community interest company as the basis for taking things forward. The body of knowledge is seen as a living document set, and that’s exactly as it should be.
You’re leading a project focused on building ‘Cyber Security Communities of Support’ for SMEs. What’s the gap you’re trying to close — and why hasn’t it been addressed until now?
Stats from the Federation of Small Businesses tell us there are around 5.5 million SMEs in the UK, accounting for three-fifths of employment and around half of turnover in the UK private sector. Collectively they are therefore a significant asset. At the same time, however, results from the latest Cyber Security Breaches Survey tell us that half of small businesses have experienced a cyber breach or attack in the last 12 months, while at the same time there are many relevant security steps that smaller organisations are not taking. There can be several reasons for this, ranging from those that don’t know the guidance is there or haven’t looked for it, through to those that have found it and understood it, but still face challenges in putting it into practice. These challenges can include cost, lack of expertise, and lack of time.
Colleagues and I are currently involved in the final phase of a two-and-a-half year research project, looking into the cyber security support needs of SMEs and opportunities to improve the routes available to them. A key thing we recognised from the early work in the project is that while cybersecurity is a significant issue for SMEs and there is no shortage of information that attempts to address it, there is still a significant gap in terms of SMEs actually dealing with security and knowing how and where to get support. What our project – CyCOS – is doing in response is piloting a new community-based approach that seeks to bring SMEs and cyber experts together in a way that enables questions to be raised, advice to be sought and recommendations to be offered from within a peer community. The project isn’t seeking to create new guidance or to replace existing resources, but rather to provide a further channel through which support can be sought.
The Communities of Support aim to help SMEs to better understand cyber security, and to prepare for and protect against common attacks. In addition to getting advice from cyber professionals (contributing as volunteers to offer free, impartial advice), the intention is that the communities will also offer SMEs the chance to share their own experiences and learn from those of others. A series of pilot communities are scheduled to launch in Autumn this year, and we would welcome sign-ups from SMEs, cyber security professionals and others with an interest in taking part. The project team can be contacted via contact@cycos.org.
There’s a growing focus on building the cybersecurity talent pipeline — but is that the full picture? Beyond technical skills, what are employers still getting wrong about hiring and retaining cybersecurity professionals, and where can academia step in to help reshape expectations?
Growing the pipeline is certainly important, but it’s also important to be realistic about the role academia has in growing it. What it has the opportunity to do – when compared, for example, to more targeted training courses – is provide a more holistic education in the subject area. This isn’t just about learning technical skills, and indeed ought to be about fostering the various transferable skills and wider understanding that cyber professionals will need in order to succeed in the workplace. In many cases, a technically competent cyber professional will be of significantly less value if they don’t understand how cyber security supports the wider business, or they can’t communicate effectively with other people within the organisation.
A common gripe that tends to recur about graduates (in cyber security and in computing more generally) is that “They don’t know XYZ tool / language / technology”. However, there’s a limit to how far this is a fair comment to make, because academic education is not just about teaching today’s tools. While we certainly shouldn’t expect universities to be teaching techniques and technologies that are obsolete, we equally shouldn’t expect that students will all be emerging with hands-on expertise in whatever happens to be the latest technology du jour. Studying at university is in many ways about learning how to learn, and so a graduate from a Bachelors or Masters degree is demonstrating their capability to study and succeed at that level, rather than just the specifics of what they happened to learn. So, having recruited someone from such a background, you’d expect them to be able to pick up other things in a similar manner, and to have the surrounding skills of critical thinking, analysis, communication, and so on. Being frank about it, much of what you’d learn in a cybersecurity course would be expected to be outdated in a few years anyway, and it’s more the principles and foundations that would be expected to remain resilient. So, any graduate will only be able to rely on the specifics of their academic education for a limited period, and without some degree of continuing professional development they will soon be left behind. Enabling and facilitating such development is something that employers should also be factoring in.
As cybersecurity continues to evolve, what do you see as the most important role academia should play in shaping the field over the next decade?
In many ways, I feel the role for academia is the same as it has always been, such as offering the freedom to try things out without the pressures and constraints of commercial deadlines. Many of the cyber security ideas and approaches that are now in use have their roots in academia, and equally many of the things that need security and protection have also arisen from there. So we need academic work to continue looking at how to advance security, and we also need academic work that isn’t specifically about security to still be mindful of when and where it’s going to be needed.