May 19, 2017
This Time, It’s Personal: WannaCry and the Psychology of the Cyber Accountability Gap
By Andre McGregor
While the speed with which the WannaCry ransomware spread is not new, it does raise new issues of accountability: personal accountability for how we protect ourselves from cyber attacks and, more importantly, for how we hold both governments and vendors accountable for developing safe technologies. To curtail the next WannaCry epidemic, we need the same level of combined effort it took to make cars safe, from seatbelts and airbags to check-engine lights and backup cameras, applied to cybersecurity, so that everyone understands their part in protecting themselves from future cyber attacks.
(Image: Karen Nadine / Pixabay)
Back in 2010, the Stuxnet attack against Iran’s nuclear program propelled many cybersecurity experts into the dangerous and unpredictable world of global cyber attacks. This week, with blame leaning toward North Korea, the WannaCry ransomware attack propelled the general public into that very same dangerous and unpredictable world. The attack was so significant that even local news stations were covering it, with captivating headlines about “Cyber Chaos.”
We’ve seen countless cyber attacks in the past, so what makes this one different? This ransomware attack was a bit too personal for many of us, mainly because we have taken for granted the extent to which technology has permeated our daily lives. Most of us operate as though “things just work.” We fail to recognize the countless systems used to create a dial tone, run an elevator, or even dispense medication in an emergency room. When a system is up and running, no one thinks twice about whether or not it has a current operating system, is up-to-date on all of its software patches and uses hard-to-crack passwords. Instead, we make the phone call, get in the elevator, or take the pill.
But, what happens when you pick up the phone to dial 911 and no one is there? Or the elevator stops between floors and the emergency call button doesn’t work? Or you overdose on medication because your records disappeared? At this point, it is too late to think about patches and passwords. What was known is now unknown and even downright scary.
Amid the WannaCry panic, victims were scrambling to patch their outdated software, companies were scrambling to see if they were under attack and the general public felt powerless, worrying they would be next.
Let’s not kid ourselves that the recent WannaCry attack is any different in its form and structure from the many $300 bitcoin ransomware scares that came before it, or those that are sure to follow in its wake. What it did do was point out that we have failed to take personal responsibility for our own cyber safety and security. This must change. Each of us must be more involved and more aware of the technologies we rely on every day.
So, what are we learning as the dust settles? For one, this attack could have been mitigated months, if not years, in advance. Yes, technology upgrades play a major part in the discussion on reducing widespread cyber attacks. But it is people, those who hold responsibility over these systems, who decide how these technologies are implemented and secured. A lack of personal responsibility leads IT professionals to develop insecure software, to leave bugs unreported and unpatched, and to keep running decades-old software against the better judgment of even the most junior system administrators.
We, as a society, need to take greater personal responsibility for our own safety and security in the cyber world, just as we do in the physical world. Would you drive a car for years without ever changing the oil? Probably not. So, why would you use technology with outdated software? Would you take $200 out of an ATM at night in a poorly lit neighborhood? Unlikely. So, why would you visit questionable websites or open questionable emails? No one will tell you not to drive the car without changing the oil, or to avoid ATMs at night. You just know it is a bad idea.
Before the next WannaCry: How we can each do our part
Beyond our own individual responsibilities, we have collective responsibility for ensuring the greater good. Just as we have agreed-upon best practices for driving on streets and highways, so too do we need agreed-upon best practices for our behavior in the digital world. Here are some examples:
Public Sector: Government leaders urgently need a crash course in cybersecurity 101. Then and only then, can they truly understand the capabilities (and, quite frankly, limitations) of government agencies during a cyber crisis. Do your cyber defenders have the skillsets and tools they need to get the job done? If you don’t understand the basic tenets of cybersecurity, then how can you answer this question?
Private Sector: Good cybersecurity starts at the top. CEOs must bridge the cyber accountability gap and adopt a culture of responsibility by committing to the removal of outdated technologies in favor of more secure options. Simply put: unmanaged systems are not patched, and unpatched systems lead to cyber attacks; therefore, unmanaged systems lead to cyber attacks. Companies must find these unmanaged systems before criminals do. Doing business with a third-party vendor? Have your legal team write security standards into the agreements before you sign. Moreover, any company without a dedicated IT security team, one that functions separately from those who handle day-to-day IT tasks, is sending a clear signal to its customers: cybersecurity is not a priority here.
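The "find unmanaged systems before criminals do" step is, at its core, an inventory reconciliation: compare what a network scan actually sees against what the patch-management tool knows about, and treat the difference as the blind spot. The sketch below illustrates this idea only; the host names and lists are hypothetical, not drawn from any real tool or environment.

```python
# Hypothetical sketch of unmanaged-asset discovery: diff a network asset
# inventory against the endpoints known to patch management. Any host seen
# on the network but absent from patch management is an unpatched blind spot.

def find_unmanaged(discovered_hosts, managed_hosts):
    """Return hosts present on the network but unknown to patch management."""
    return sorted(set(discovered_hosts) - set(managed_hosts))

# Made-up example data: what a scan observed vs. what IT actually manages.
discovered = ["web-01", "db-01", "hvac-ctrl", "xp-kiosk-3", "web-02"]
managed = ["web-01", "web-02", "db-01"]

# The HVAC controller and the old XP kiosk are the systems nobody is patching.
print(find_unmanaged(discovered, managed))
```

In practice the "discovered" side would come from continuous network scanning or an endpoint-visibility platform, but the accountability point is the same: you cannot patch what you do not know you own.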
General Public: Individuals must demand more from government leaders and CEOs. The infrastructure that brings us our electricity and our ATMs, and the spectrum bandwidth that carries our live video streams and lets our smartphones, baby monitors and Alexa devices function, are all the result of a public-private spirit of cooperation. But personal responsibility doesn’t stop there. For each of us, cybersecurity means more than just antivirus and a prayer. It means patching your home computers all the time, not just some of the time, and not clicking “remind me later,” so your computer can’t be easily hijacked during the next ransomware or botnet scare. It means bringing the same good cyber hygiene practices you exercise at work into your home, and teaching your kids to be good digital citizens. It means being a good consumer: hold companies accountable when they fall short of their cybersecurity mandates, reward those that keep your data safe, and avoid those that do not.
To curtail the next WannaCry cyber epidemic – whether it is from North Korea or some college kid in a dorm room – we need the same level of combined effort it took to make cars safe. Just as we have seatbelts and airbags, check-engine lights and backup cameras, we need similar protections applied to cybersecurity. Only in this way will each of us understand our own personal responsibility in keeping ourselves, our companies and our society safe from future cyber attacks.
About the Author: In his role as Director of Security at Tanium, Andre McGregor is focused on cybersecurity. He possesses deep knowledge of the criminal and counterintelligence cyber techniques used to attack U.S. computer networks and infrastructure. Prior to joining Tanium, Andre served as an FBI Cyber Special Agent in New York City and was later promoted to Supervisory Special Agent at FBI Headquarters. At the FBI, McGregor was the senior technical agent and the lead incident responder for several large-scale computer intrusions. Additionally, he served as both the FBI Cyber Liaison to the United Nations and the FBI Cyber Liaison to DHS US-CERT and ICS-CERT. Before the FBI, McGregor graduated from Brown University, began his career at Goldman Sachs, and later served as IT Director at Advogent (Cardinal Health), overseeing all IT operations nationwide. In his free time, Andre is also the FBI and technical consultant for Mr. Robot.