
7 cybersecurity lessons from the 1950s


8 August 2024

Expert’s 7 1950s-inspired principles for cyber resilience

Moving beyond train-the-user and buy-this-tech

Today’s cybersecurity could be rescued by ideas from 75 years ago. Yes, really.

That’s the view of Danny Dresner, Professor of Cyber Security at the University of Manchester. Danny’s all about a diverse community-based approach to cybersecurity, and about helping people to cope with cyber risks.

And…he says that in order to move forward with our collective cyber resilience, we have to look to the past.

Based on his electrifying talk at IMPACT 2024, this blog explores Danny’s 7 1950s-inspired principles to reach a much-needed balance between people and technology.

Are you puzzled by how a vision from the past can illuminate a better future for cybersecurity? Well, good news: You’re in the right place. Let’s get into it.

Wait, who’s Danny Dresner?

We could fill a whole blog just about what Danny’s done with his life. So this is the nutshell version.

Before becoming the first Professor of Cyber Security at the University of Manchester, Danny spent 22 years with The National Computing Centre.

He’s everywhere—the IASME Consortium, Centre for Digital Trust and Society, Manchester’s DiSH – Digital Security Hub—you name it. Plus, he’s a Fellow and Founder member of the Chartered Institute of Information Security.

He takes a diverse community-based approach to cyber security, running exercises to help people cope with the risk of online harms.

Or to put it another way: When Danny speaks, listen up.


Why are we here? (And what has the 1950s got to do with it?)

Why is cybersecurity where it is today?

Partly, it’s to do with Norbert Wiener.

It’s hard to imagine a man called Norbert being ahead of his time. But Norbert Wiener certainly was.

Back in the 1950s, Wiener was a pioneer in the field of cybernetics—the study of communication and control in machines and living things.

His ideas are fascinating…but the 1950s weren’t ready for them. That could be one of the reasons some people dismissed him.

Another reason? We humans love the promise of a quick, simple silver bullet to solve complex problems. 

Today, those silver bullets look like “buy this tech” and “train the user” solutions. It’s tempting to oscillate between the two, but they won’t deliver what they promise. At which point many security teams find it all too easy to blame the workforce.

But anyway, as Danny explains, “People weren’t too enamored with Wiener’s work…so essentially what they did was to behead the core part of processing.”


“And now here we are today, trying to glue things back together, worrying about the effects of the decisions made by AI on our lives and how it might be used and abused by the other side for their attacks on our cybersecurity while we actually haven’t got our systems in place.”

This is why Danny wants Norbert Wiener’s ideas to make a comeback. And in his IMPACT 2024 talk, he built upon Wiener’s legacy by proposing 7 principles that bridge the gap between people and tech, and build a better future for cybersecurity.

Replacing blame with balance

Danny’s quick to call out the large burden placed on non-experts in organizations today: “I will even go as far as saying that when you sit down and do a security awareness session for someone you should apologize to them first and point out that you’re asking them to compensate for the fact that the systems that we are giving them to use have got so many flaws.” 

“We should invert the organization chart, because people on minimum wage are being asked to protect multi-billion-pound assets.” – Danny Dresner


“We ask them to do huge things under duress,” he continues. “Spot these incoming emails from professional, clever social engineers … We should invert the organization chart, because people on minimum wage are being asked to protect multi-billion-pound assets.”


“We’re asking people to have unfair extra tasks. We’re very quick to blame when things go wrong, but how many people have security in their objectives when they have their annual review? They’re given targets to meet, so of course they’re going to find the workarounds and the quick fixes.”

Time to change the chant

People! Process! Technology!

It’s a near-universal mantra in cybersecurity discourse. Hands up if you’ve used it. We certainly have in the past. Danny has.

But—while it’s a handy catchphrase to wheel out during board meetings—it doesn’t help us, says Danny.

Danny argues for a shift to “focusing on a balance of people and technology, and NOT using process as a piece of wet string” to try to compensate for our shortcomings in the “people” and “technology” aspects.

According to Danny, we’d all be better off if we ditched the “process” from this mantra, and instead focused on achieving a healthy balance between people and tech.

That’s what Danny’s principles are all about: that human–tech balance in cybersecurity.

Because, yes, it’s tempting to fall into the trap of “buy tech” or “train people”. And it’s easy to blame the tech or the people when things go wrong.

But there’s a better way. Or, rather, seven ways…


Here they are: Danny’s 7 lessons from the 1950s

1. Complement

Automate when that’s helpful; let me drive when it’s not.

Secure memory—it’s still an issue, even after all this time. “This is a hardware problem identified in the 1970s,” Danny points out, “and it’s only now that we’re starting to say, ‘Ah, wouldn’t it be a good idea if we built this into the hardware rather than remembering that actually it should be the developers that will be able to program around it?’ …We’re burdening people with more than we ought to.”

“We all go through fear, uncertainty, and doubt, and tell people about all these awful things that will happen if they click on the wrong thing. We don’t really talk about increasing the risk of success.”

We need tech to be secure by design. But on the flipside, we also need people for the important decisions and navigating unexpected scenarios.

Well-designed automation and sound human judgment complement each other. We need to remember this, Danny says.

2. Compensate

Preserve the safety of the system in the event of a cyber attack. 

We need systems that can take a hit and keep on going, says Danny.

Here, Danny brings up the many famous attacks on flat systems. “One of my favorite examples that I talk to my students about is the Sony hack from several years back, where one of the first pieces of data stolen was the network maps to help them around the flat area. If that’s not chutzpah, I don’t know what is!”

But what if Sony had made it harder for the criminals to find their way around? Just because someone has gained access, it doesn’t mean they can get at the good stuff. Break-ins are horrible, but a broken window is very different to a broken window and losing all of your valuables.

Breaches happen. But you can lessen the impact and keep operations ticking over by building resilience into systems.
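To make that “broken window vs. burgled house” idea concrete, here’s a toy sketch in Python of the kind of segmentation that keeps a breach survivable. It’s our own illustration, not Danny’s example or Sony’s actual setup, and the zone names and rules are entirely hypothetical: in a flat network any machine can reach any other, whereas here every flow has to be explicitly allowed.

```python
# Toy illustration of network segmentation: every allowed flow is explicit,
# so getting into one zone doesn't mean reaching everything else.
# Zone names and rules are hypothetical, for illustration only.

ALLOWED_FLOWS = {
    ("guest-wifi", "internet"),
    ("office", "internet"),
    ("office", "file-server"),
    # Deliberately no flow from "office" to "payment-systems":
    # a compromised laptop can't wander straight to the valuables.
    ("payment-systems", "payment-gateway"),
}

def is_allowed(source_zone: str, destination_zone: str) -> bool:
    """In a flat network every pair would be allowed; here each flow must be named."""
    return (source_zone, destination_zone) in ALLOWED_FLOWS

print(is_allowed("office", "file-server"))      # True
print(is_allowed("office", "payment-systems"))  # False: the breach stays contained
```

Real segmentation lives in firewalls, VLANs, and identity policies rather than a Python dictionary, but the principle is the same: access to one zone shouldn’t hand over the keys to everything.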

3. Confidence

Help me to trust that emerging vulnerabilities will be made known and corrected.

People, says Danny, “shouldn’t have to worry about things being up to date. That should be integrated, and those fixes for vulnerabilities need to get that balance.”

“Yes there are many vulnerabilities,” he continues. “We may test out systems, do the pen testing, do the red teaming, the purple teaming, the blue teaming… It’s an equation, it’s not just that there’s a vulnerability, it’s a case of can a vulnerability be exploited, and are people actually likely to exploit that?”

In his words, vulnerability + exploitability + threat = risk.

“The greatest lesson I took out of my PhD study was the work that you do (and this is great when it comes to collaboration between government and industry and academia) is to get the balance right, and that balance is the difference between the relevance and the rigor.”

The balance helps inform how much to worry, and what to prioritize.
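If it helps to see that equation in action, here’s a minimal sketch (a hypothetical illustration of ours, taking Danny’s formula literally) that sums three 0-to-10 ratings per finding and sorts the results: a rough guide to how much to worry, and what to prioritize.

```python
# A minimal, hypothetical sketch of "vulnerability + exploitability + threat = risk".
# Each factor is rated 0 to 10; the scales and example findings are illustrative only.

def risk_score(vulnerability: int, exploitability: int, threat: int) -> int:
    """Is there a flaw, can it realistically be exploited, and is anyone likely to try?"""
    return vulnerability + exploitability + threat

findings = {
    "unpatched public web server": risk_score(vulnerability=8, exploitability=9, threat=8),
    "legacy printer firmware": risk_score(vulnerability=7, exploitability=3, threat=2),
}

# Highest score first: a rough guide to how much to worry, and what to prioritize.
for name, score in sorted(findings.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {score}/30")
```

In practice teams usually weigh the factors rather than simply adding them, but even this crude version captures the point: a vulnerability on its own isn’t a risk until exploitability and a real threat enter the picture.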

4. Cooperate

Don’t entice me to “hack” the technology—or co-workers—to get the transformations that I need.

“The most important thing that any system has to do is it has to operate, and it has to deliver,” says Danny.

If people come up against clunky, frustrating systems, they will find workarounds. And much like a back-roads diversion, these workarounds aren’t always safe or efficient.

Danny says it best: “We shouldn’t have to hack the technology, we shouldn’t have to have the workarounds. We’re doing it with good intentions, but we all know which road is paved with them.”

This is why it’s so important to have open communication, where the people using the systems and tools can voice their needs and their stumbling blocks. Cooperation is the route to identifying secure ways to get work done—and smash organizational goals.

5. Correct

If I’m doing it wrong, use the collective wisdom programmed into the system to deliver the right outcomes.

We hate to speak ill of the departed. But we need to talk about Clippy. He was (almost) universally hated because he wasn’t all that helpful. He was actually rather irritating.

So, what did Microsoft do? They got rid of him—completely.

Danny thinks this is where we get it wrong: “All of that knowledge of why we didn’t like it was wiped out. It wasn’t just fed in and improved in an iterative fashion.”

We can’t correct what we completely discard. We have to learn from our mistakes instead, he adds. “We should have ways to verify and validate what we do.”

6. Cope

Let’s be resilient so that I can benefit from the system even when under cyber attack.

The best athletes know how to get knocked down and get back up and keep going.

That’s because they know there’s a good chance of knocks. And cybersecurity’s no different. “There will be failures, and things will go wrong,” Danny says.

The athlete’s resilience doesn’t just grow in a vacuum. They have support—a team of people looking out for them, and guiding them in how to think about and deal with problems. 

And just like athletes, the people in your organization need to have the right support…including emotional support. And…they need to be confident that they won’t be chastised for reporting incidents, says Danny.

“We need those blameless environments in which people can report,” he says. “But importantly, as well, we need that cyber CPR to help people get back to where they ought to be when bad things actually happen.”

7. Community

Make it easy to work with other people in other systems.

Danny’s final principle is all about the ecosystem—not just people but the whole landscape around us. 

“We need to understand where the boundaries are,” he says. “We need to understand our supply chains internally and externally and understand those interactions, and again get that balance between the people and the technology that can actually get things to work.”

Cybersecurity isn’t a lone-wolf endeavor. Danny reminds us of the importance of sharing knowledge, standardizing practices, and collaborating with each other.


Time for a (HRM) huddle

There you have it. 

Seven ways to create that magic balance between people and tech. Seven ways to step away from the “train the user” vs “buy this tech” dichotomy.

We’ve been looking back in time in this blog. But how does Danny think people will look back on this time in cybersecurity?

“It’s all too easy to classify to the nth degree and then silo ourselves from each other…but when looking back from the future at this period, cybersecurity actually had the opportunity to bring people together,” he says.

Nailed it. 

And we’ll add: We believe this isn’t about a total overhaul. It’s not about making cybersecurity unrecognizable. After all, we saw what happened when Wiener’s concepts got beheaded, and when Clippy got #canceled.

Marginal differences—that’s where it’s at. It’s in how they compound. And it’s in us all being prepared to work together.

And in paying attention to visionaries like Danny, of course. We’ve given you the highlights here, but there’s no substitute for the real thing. Watch Danny’s full IMPACT 2024 presentation for the full insights, and to experience his inimitable presentation style.


There’s plenty more where that came from

This conversation is only a small part of what made IMPACT 2024 so eye-opening.

Missed out on all the fun? Download all the unmissable insights and headlines in one report: IMPACT: The Findings

It’s your cheat sheet to the biggest takeaways in human cyber risk. The latest academic research on the human aspect of cybersecurity. No filler, all hits.

What’s inside:

    • Finding and addressing vulnerabilities in your security strategy
    • Why we get in our own way when trying to help people act safer
    • Why cybersecurity is so messy…and why that’s okay!
    • The huge divide in our community, and how we might be able to close it
    • And so, so much more.

Get your hands on it right now, right here:


Do one more thing right today. Subscribe to the Behave newsletter
