Where should we teach cyber security? Should it be something that people learn on their own time, or should it be part of formal education?
PayPal recently (when I first wrote this paper) released a whitepaper on combating cybercrime. In it, the authors assert that today's educational efforts are good but do not scale to the millions of computer users who need them, and that scaling them up would require significant investment by government and private industry. Significantly more funding is needed.
The advantages of formally incorporating cyber awareness into the education system are clear:
By starting early, students have more time to gain exposure to a wide range of topics, which helps them build the deep expertise needed to bring together knowledge from different sources. With a formal curriculum in place, educators could organize the relevant knowledge to make it easier to absorb and recall.
On the other hand, creating a cyber security curriculum in schools is a major undertaking. It requires collaboration between industry and government, and the knowledge is highly specialized. Most adults today understand basic arithmetic, reading, writing, and social studies, and nearly all teachers could cover those subjects if they had to. However, expertise in computer security is not widespread. How many people in the world are experts on botnets? Malware? Hacking? Worse yet, how many people in the security industry have a background in education, teaching, and organizing their knowledge? The people who are good at teaching don't know the subject, and the subject matter experts can't teach it.
This is not an insurmountable problem, but solving it would require a significant investment from both the private and public sectors.
Software companies are not off the hook. Not only do we have a responsibility to educate the public, but we have a responsibility to write software in a way that makes it easy for users to be secure. We can achieve this by using a mechanism called “Choice Architecture.”
Choice Architecture is the principle that the way options are presented influences the decisions people make. People's choices can be swayed by a number of factors, including ordering, peer pressure, and default selections.
For example, on a fast-food menu with many choices, most people will choose the first item. Public school systems have experimented with this: rather than placing unhealthy selections like French fries and hamburgers at the top of the menu, they put healthier selections like vegetables and yogurt there. The result? Students make healthier selections than when the unhealthy choices are listed first. The same items are on the menu, but the ordering influences the decision.
A more powerful influence is the default choice. Many employers today offer their workers a retirement savings plan, such as a 401(k) or 403(b), in which employees contribute and the employer frequently contributes as well – almost "free money" for any employee who joins. When employees are not enrolled by default and must sign up themselves, enrollment is low – less than 50%. But when the employer enrolls them by default and they must opt out in order not to participate, participation is very high – over 90%.
The "power of the default" is one of the most effective tools the security industry has. Whatever the default setting of a piece of software is, the vast majority of users will stick with it. It doesn't matter how much we tell users to switch to another setting; the default is what sticks. To take advantage of this, security vendors should make their software secure by default. In practical terms, this means software is set to update automatically and the user must opt out of downloading and installing the updates.
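To make the idea concrete, here is a minimal sketch of "secure by default" in code. The names (`UpdateSettings`, `should_install_update`) are invented for illustration and don't come from any real product; the point is simply that the safe behavior is the default value, and disabling it requires an explicit opt-out.

```python
from dataclasses import dataclass

@dataclass
class UpdateSettings:
    """Hypothetical update preferences for a piece of software."""
    # Secure by default: automatic updates are ON unless the user opts out.
    auto_download: bool = True
    auto_install: bool = True

def should_install_update(settings: UpdateSettings, user_opted_out: bool) -> bool:
    # The default path installs updates; the user must take an
    # explicit action (opting out) to disable it.
    if user_opted_out:
        return False
    return settings.auto_download and settings.auto_install

# Most users never touch the settings, so they stay on the secure path.
print(should_install_update(UpdateSettings(), user_opted_out=False))  # True
```

The design choice mirrors the 401(k) example: the outcome we want requires no action from the user, and only the opt-out requires effort.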
Modern software often does this – Microsoft Windows has Windows Update, and Adobe regularly updates Adobe Acrobat, downloading updates in the background and prompting the user to install them. However, other pieces of software, including some Internet browsers, do not update by default. The browser is particularly vulnerable because it is the hacker's weapon of choice for delivering malware. Browsers should be set up so that automatic updates are enabled upon installation and the user is prompted to install them when they are ready.
Although some browsers update by default or prompt the user to update, not every piece of software does. In my Firefox browser, I run several plugins – Adobe Flash, Java, Shockwave, QuickTime, Silverlight, and Media Player. Honestly, I use some of those plugins so rarely that I would never think to update them. However, a browser plugin called BrowserCheck from Qualys scans your browser and tells you if any of those pieces are out of date. If so, you can click a link that takes you to the latest version.
I had to install this Qualys plugin myself; it wasn't preconfigured in my browser. However, it should be. It's useful because it consolidates version checks for a whole set of disparate plugins so I don't need to keep track of them myself. Plugins like BrowserCheck should be standard in every browser, with some sort of notification to let users know when one of their plugins is out of date. Having a plugin checker installed by default ensures users are notified of security problems – and thereby helps reduce the risk from one of the biggest attack vectors today.
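The core of a checker like this can be sketched in a few lines. The version tables below are made up for illustration; a real tool would enumerate the plugins actually installed and fetch the latest-version list from a vendor feed rather than hard-coding it.

```python
# Assumed latest-version table (illustrative only - not real version data).
LATEST = {
    "Flash": (11, 2, 0),
    "Java": (1, 7, 45),
    "Silverlight": (5, 1, 0),
}

# Versions "found" on this machine (also made up for the example).
INSTALLED = {
    "Flash": (10, 3, 1),
    "Java": (1, 7, 45),
    "Silverlight": (5, 0, 2),
}

def outdated_plugins(installed, latest):
    """Return the names of plugins whose version lags the latest release.

    Versions are tuples, so Python's tuple comparison handles the
    major/minor/patch ordering for us.
    """
    return [name for name, version in installed.items()
            if name in latest and version < latest[name]]

for name in outdated_plugins(INSTALLED, LATEST):
    print(f"{name} is out of date - update recommended")
```

Run against the sample data above, this flags Flash and Silverlight while leaving the up-to-date Java plugin alone – exactly the consolidated report that makes a tool like BrowserCheck useful.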
In this series, I have looked at the problem of how to educate the public to become more aware of cyber security. I looked at why people don't retain the message (because our teaching methods are poor) and how we can improve those methods.
However, I have only covered a small fraction of better teaching techniques; the subject is too vast to cover in 6,000 words. What is encouraging is that because so much research has been done on formal learning, we know what works and what doesn't.
There is no shortcut to making people aware of the Internet threat landscape and giving them the skills they need to navigate it. But we do have a responsibility to tell users what to do, and we also have a responsibility to ensure that they are learning, retaining, and using what we tell them. We do that by looking at ourselves and seeing what we can do to help.
And then, maybe one day, the cyber security industry won’t have such a big problem.
If we knew how to teach it, I wouldn't be writing this article.