
Cyber Security Expertise: How to Empower People to Appropriately Pursue Cyber Issues

  • Writer: Ganna Pogrebna
  • Oct 18, 2020
  • 6 min read

Updated: Nov 27, 2022





It is impossible to fully protect your own account, as well as your business, against all cyber-attacks. Think about it: with a sufficient level of sophistication, phishing and social engineering may get through (even if you have the latest technology and even if you are very knowledgeable about cyber threats), and you or your colleagues may inadvertently give away an email address to a fake account that you or they thought belonged to a friend. A key issue is to understand the severity of these risks and to build solutions that are appropriate for the potential risk. Another factor that shapes the approach to risk is the appetite for risk taking. This appetite may vary from culture to culture and from individual to individual. Your personality, cultural background, training, and even perceptual sophistication can influence your personal risk appetite and, consequently, your propensity to become a victim of a cyberattack.


Safety versus Security


In safety (which concentrates on robustness), it is relatively easy to define and address risk factors. Naturally, safety assessments are conducted often, safety training is administered regularly, and organizational responsibilities for safety are usually well-defined and involve the whole workforce. But in security (which concentrates on resilience), and especially cyber security, training cannot be a one-off exercise, because the nature of cyber security risks is different. A one-off approach only works if nothing changes, or if risks can always be clearly defined and assessed, and neither is the case.


Security is a long-term game: it is not like safety, where you can get things right once, such as installing fire extinguishers, fire sensors, and sprinklers in buildings, and then conduct regular checks of equipment and procedures. You can get safety right and it stays right, because the risks do not change much in that particular context. Security is not that way, yet organizations are still trying to deliver cyber security training without explanation and appropriate methods. They have not understood the fluid nature of cyber threats.


Security is a long-term game also because the technology and the threats are changing and evolving all the time. In a recent study by Professor Karen Renaud, twelve groups trying to solve cyber security issues were assessed on how the project managers dealing with those issues were selected. It turned out that merely matching job requirements against perceived competency on a CV was not enough. The study also looked at autonomy and relatedness alongside competency: autonomy is the degree of personal ability to carry out specific tasks independently, while relatedness is the social ability to relate and communicate with others in order to influence people and get tasks done. The study found that autonomy and relatedness are very important in meeting job needs; in other words, competence alone is not sufficient to fulfil cyber security-related project management roles. This may be due to the nature of that type of role, but the study observed that intrinsic needs were better met when autonomy, relatedness, and competence were combined, suggesting that dealing with cyber security risks requires a rather complex and composite expertise.





On experts, expertise, and fraud


With the world wide web, social media, YouTube channels, and the availability of a wide variety of materials, “everyone is an expert on everything”. That is the perception... Yet it is clearly not the case, and visibility is not a measure of intrinsic competence. Identifying true expertise is important, and it becomes more and more difficult when purportedly authoritative research turns out to be fraudulent. Think of the case of Dr Andrew Wakefield and his research on the MMR vaccine, published in 1998. Wakefield asserted that the MMR vaccine caused autism. Yet this "finding" was later refuted: the General Medical Council (GMC) found serious professional misconduct, and the British Medical Journal (BMJ) declared the work fraudulent, presenting clear evidence of falsification of the reported data. Nevertheless, the case did a lot of damage: child vaccination rates in the UK plummeted, reaching their lowest levels in 2003-2004, and it took until 2012, more than a decade, for them to return to previous levels. It is fair to say that this remains a contentious case, in that the claim that the MMR vaccine causes autism is still out on the Internet, lingering over time and stoking conspiracy theories and misperceptions...


A casualty of these events is the erosion of trust in authority, and in roles associated with authority and expertise. More contentious is the exposure of the cost of care and the complexities of regulation and of meeting public health needs. The GMC, the BMJ, and the National Health Service all had to make extensive public communications to redress perceptions of the affair. This highlights a more fundamental issue: fact checking, and the ability to validate that a message is true and comes from a valid source. This is a key aspect of non-repudiation. In law, non-repudiation refers to the ability to ensure that a party to a contract or a communication cannot deny the authenticity of their signature on a document, or the sending of a message that they originated.
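

In computing, non-repudiation is typically achieved with digital signatures. As a minimal sketch (assuming Python and the third-party cryptography package, neither of which is mentioned in the original post), the example below shows how a signed message can later be verified, so that the signer cannot plausibly deny having produced it:

    # A minimal sketch of non-repudiation via digital signatures.
    # Assumes the third-party "cryptography" package (pip install cryptography).
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The signer generates a key pair; only they hold the private key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"I approve this transaction."  # illustrative message
    signature = private_key.sign(message)

    # Anyone with the public key can verify origin and integrity; a valid
    # signature is evidence that the key holder produced the message.
    try:
        public_key.verify(signature, message)
        print("Signature valid: origin cannot plausibly be denied.")
    except InvalidSignature:
        print("Signature invalid.")

    # Altering even one byte breaks verification:
    try:
        public_key.verify(signature, message + b"!")
    except InvalidSignature:
        print("Tampering detected: the altered message does not verify.")

In practice, the legal weight of such a signature depends on binding the public key to a real-world identity, which is why certificate authorities and careful key management matter as much as the mathematics itself.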





Empowering people: compliance versus participation


It is hard to use rules to control people. We can state that we want to secure systems and that we want to prevent hackers from stealing information; but how do we develop a culture that is incentivized to believe in these values? How do we change people's behaviour and the organizational perception?


Well, for a start, cyber security systems need to stop treating people as enemies. When a person dies in a hospital from unexpected causes, the clinicians carry out a morbidity and mortality conference to discuss how they could have saved this person's life. It is not about finger pointing; it is about all the people in the room learning from the experience. In the UK, this is set out in Regulation 20 (Duty of Candour) of the Health and Social Care Act 2008 (Regulated Activities) Regulations 2014, overseen by the Care Quality Commission, the independent regulator of health and social care in England. It sets out specific requirements that providers must follow when things go wrong with care and treatment. The issue of blame versus a process for learning is key. It shapes human behaviour: a person who wants to avoid blame may resist telling the truth when something goes wrong, and that itself potentially becomes a safety issue in the wider task of making a living environment safe.


In computing, we are often concerned with accountability: who does what, who pressed this button, and so on. Yet accountability is not the same as assuming malice. Some acts are unintentional mistakes, and we should not treat everyone as if they were deliberately trying to hurt organizational systems. These are controversial issues. We should talk not only about cases when people died in a hospital but also about cases when people were saved. Likewise, you need some flexibility to allow people to break rules if they feel there is an issue that needs to be resolved during a cyber attack.
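

As a hypothetical illustration of what accountability usually means in practice, here is a short sketch of structured audit logging in Python; the field names, events, and log path are assumptions for illustration only. The point is that such a trail should support reconstruction and learning, not automatic blame:

    # A hypothetical sketch of an audit trail: record who did what and when,
    # so incidents can be reconstructed for learning rather than for blame.
    # Field names, events, and the log path are illustrative assumptions.
    import json
    import time

    def audit(user: str, action: str, target: str) -> None:
        # One structured, timestamped record per action, appended to a log.
        record = {
            "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "user": user,
            "action": action,
            "target": target,
        }
        with open("audit.log", "a") as log:
            log.write(json.dumps(record) + "\n")

    audit("j.smith", "pressed_button", "emergency_shutdown")
    audit("a.jones", "changed_rule", "firewall_rule_42")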


Consider the famous case of the “Gimli Glider”. Air Canada Flight 143 was a scheduled domestic passenger flight between Montreal and Edmonton that ran out of fuel on July 23, 1983, at an altitude of 12,500 metres (41,000 ft), midway through the flight, due to a metric conversion error. The crew was able to glide the Boeing 767 safely to an emergency landing at a former Royal Canadian Air Force base in Gimli, Manitoba. An interesting aspect was that Captain Pearson was an experienced glider pilot, and he and First Officer Maurice Quintal, in adversity, managed to glide the plane to the landing site, having to guess the optimal glide speed of a 767 (they settled on 220 knots). Another feature was that, without power, the pilots used a gravity drop, letting gravity lower the landing gear and lock it into place. There was nothing in the manuals about how to deal with this event. The subsequent investigation revealed that a combination of company failures, human errors, and confusion over units of measure had led to the aircraft being refuelled with insufficient fuel for the planned flight. It also concluded that the Captain and the First Officer (who, of course, were partially responsible for the fuelling mistakes) managed to land in conditions in which all other pilots failed (this was tested in a simulator). Had the crew stuck to “by the book” rules, they would never have walked away from this incident alive, and many lives would have been lost.
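

To see how the conversion error played out, here is a rough worked calculation in Python. The figures used (22,300 kg of fuel required, 7,682 litres already in the tanks, and density factors of 1.77 lb/L versus roughly 0.80 kg/L) are those commonly reported in published accounts of the incident, and the arithmetic below is a sketch under those assumptions:

    # A back-of-the-envelope reconstruction of the Gimli Glider fuel error.
    # The crew multiplied litres by 1.77 (the density of jet fuel in POUNDS
    # per litre) instead of roughly 0.80 (the density in KILOGRAMS per litre).
    # Figures are as commonly reported for the incident, used as assumptions.
    litres_in_tanks = 7682        # fuel already on board (drip-stick check), L
    fuel_required_kg = 22300      # fuel needed for the flight, kg

    wrong_factor = 1.77           # lb per litre, mistakenly used as kg per litre
    right_factor = 0.80           # kg per litre (approximate)

    believed_kg = litres_in_tanks * wrong_factor   # ~13,600 "kg" (really pounds)
    actual_kg = litres_in_tanks * right_factor     # ~6,100 kg genuinely on board

    litres_added = (fuel_required_kg - believed_kg) / wrong_factor   # ~4,900 L
    total_kg = (litres_in_tanks + litres_added) * right_factor       # ~10,100 kg

    print(f"Loaded about {total_kg:,.0f} kg of the {fuel_required_kg:,} kg required")
    # Less than half the fuel needed: hence the flame-out at cruise altitude.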





Takeaways


The cyber security strategy of any business is a balancing act: one has to find the correct recipe, a compromise between technology, human skill, creativity, and compliance. It is not easy. Yet it is clear that people should not be punished for honest mistakes, and should not suffer when what they were trying to achieve is mis-communicated. This is very much a problem of organizational and societal culture. It means that we must deal with issues of culture, create appropriate processes, and learn not to blame people.


This post was originally written by Ganna Pogrebna for the CyberBits blog in 2020.



