This article appeared in Cybersecurity Law & Strategy, an ALM publication for privacy and security professionals, Chief Information Security Officers, Chief Information Officers, Chief Technology Officers, Corporate Counsel, Internet and Tech Practitioners, and In-House Counsel.

"We don't fully understand it," said Dr. Chris Dimitriades, ISACA board member. "With all the news of hacks out there, it just doesn't make sense!"

"Is it the people? Lack of resources? Timing?" I wondered.

Dimitriades and I, over coffee in Athens, were wrestling with a question plaguing cybersecurity professionals: Why do so many cybersecurity programs fail?

Despite the torrent of news about hacks, despite the clearly elevated level of awareness, despite the increasingly sophisticated tools and services, despite regulatory requirements, we both agreed that cybersecurity programs are not always successful.

We did not solve it, despite two hours of conversation and enough coffee to power a small generator. And yet it seems this should be a key question for everyone in business, government, technology, and cybersecurity: If we know the problem with cybersecurity, and have methods of addressing it, why are we still failing?

Without an answer to this question, all our work in cybersecurity amounts to building sand castles.


Getting to an Answer: Doing the Math

First, let's do the math. In ISACA's State of Cybersecurity 2018 we read:

  • 50% of organizations surveyed believe that there will be an increase in cyberattacks.
  • 80% of those surveyed believe that an attack is likely or very likely.

Those data point in a positive direction. Organizations are aware that the threat is real and imminent. It's the next piece of data that's so troubling:

  • 31% of those surveyed do not believe that their boards have adequately prioritized cybersecurity. Thankfully, that is down from 33% the year before. One third.

The respondents to this survey are ISACA members worldwide, across industries, with 67% employed in an enterprise with at least 1,500 employees. Traditionally we view enterprises with over 1,500 employees and with a board of directors as "top tier" in terms of sophistication, governance, and resources. When one third of such organizations report their boards are not taking cybersecurity seriously, then everyone downstream should be running for the hills.

This is not the first time that alarms have been raised about cybersecurity programs. There are several excellent articles addressing a multitude of technical failures, as well as issues with cybersecurity awareness, cybersecurity management, and the implementation of cybersecurity controls. Such problems remain, even as executive awareness of cybersecurity inches closer and closer to maturity.

Why? Where are the breakpoints?

I want to take a more granular approach: look at the key phases of cybersecurity program development, identify where the pitfalls lie, and consider what we can do about each one.


Phase I. Asset Classification and Asset Valuation

When we start any cybersecurity program, our first step is to get a thorough understanding of the assets we will be protecting. Typically, they fall into one of the following classes: data, hardware, software, systems, processes, and workflows. For each of these assets, we must collect a set of asset metadata that should include (at a minimum) the asset owner, the asset custodian, its location, its confidentiality classification, its impact classification, its maximum tolerable downtime, its recovery point and recovery time objectives, and a list of resources associated with the asset.
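
To make that bookkeeping concrete, here is a minimal sketch of what one asset record might look like. The AssetRecord class and its field names are illustrative assumptions, not a prescribed schema; adapt them to whatever inventory or GRC tooling you actually use.

    # A minimal, illustrative asset record. Field names are assumptions, not a standard.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AssetRecord:
        name: str                              # e.g., "General ledger database"
        asset_class: str                       # data, hardware, software, system, or process/workflow
        owner: str                             # the business owner accountable for the asset
        custodian: str                         # who operates and maintains it day to day
        location: str                          # data center, cloud region, office, etc.
        confidentiality: str                   # e.g., public / internal / confidential / restricted
        impact: str                            # e.g., low / moderate / high if compromised or lost
        max_tolerable_downtime_hours: float    # how long the business can survive without it
        recovery_point_objective_hours: float  # how much data loss is tolerable
        recovery_time_objective_hours: float   # how quickly it must be restored
        dependent_resources: List[str] = field(default_factory=list)

    ledger = AssetRecord(
        name="General ledger database",
        asset_class="data",
        owner="Controller",
        custodian="Database administration team",
        location="Primary data center",
        confidentiality="restricted",
        impact="high",
        max_tolerable_downtime_hours=24,
        recovery_point_objective_hours=4,
        recovery_time_objective_hours=12,
        dependent_resources=["ERP application", "Nightly backup job"],
    )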

This is a lot of work: hard, tedious work. Nevertheless, it's essential work.

Who is responsible for performing all this work? Two people: the asset owner, and the person responsible for developing the cybersecurity program — ideally your Chief Information Security Officer (CISO), or one of her senior cybersecurity analysts.

This is where the first serious gap appears.

Frequently, the asset owner, a.k.a. the "business unit owner," is misidentified. A typical mistake is to assume that the firm's highly competent financial EVP is the "owner" of the finance department. That is not always the case. Often, the "real" owner is the controller, along with other leads within finance (e.g., the treasurer or the AP/AR director). This is an honest mistake, but one with catastrophic consequences. Why?

Because the EVP of finance, as highly competent as she may be, may not be sufficiently "in the weeds" to give you a realistic impact classification, an actionable maximum tolerable downtime, or a realistic recovery point objective.

There are other errors that can be introduced during this process: Omissions, misclassifications, and misattributions. All these errors can be avoided, or at least minimized, when you are partnering with the right asset/business owner.

How Do We Get Asset Classification and Valuation Right?

Dig past the title. Ask to speak with the people who are directly responsible for business-critical functions, not their manager.

Read the culture. Ask around to determine the "go-to" person when you have a critical issue within each business unit. Don't just go by the organizational chart; do some detective work. Of course, you'll review the work with the EVP, and of course you must secure her blessing. But make sure you have analyzed the full and detailed picture, not the view from 50,000 feet.


Phase II. Threats & Vulnerabilities

The next phase in developing a cybersecurity program is taking a close look at threats and vulnerabilities. At the risk of oversimplifying, when we ask "Who's out to get me?" and "How likely is it?" we're essentially performing threat analysis. When we ask "How easy is it to get to me?" and "How can they get to me?" we're exploring our vulnerabilities.

This work feeds a risk analysis, generating a risk register and specific risk assessments.
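
A risk register can start as a simple scored list that pairs each asset with a threat, a vulnerability, a likelihood, and an impact. The sketch below shows one common qualitative approach, scoring risk as likelihood times impact on a 1-to-5 scale; the RiskItem class and the scale itself are illustrative assumptions, not the only way to model risk.

    # Illustrative sketch: a qualitative risk register, scored as likelihood x impact.
    from dataclasses import dataclass

    @dataclass
    class RiskItem:
        asset: str
        threat: str         # who or what could cause harm
        vulnerability: str  # how they could get to the asset
        likelihood: int     # 1 (rare) to 5 (almost certain)
        impact: int         # 1 (negligible) to 5 (severe)

        @property
        def score(self) -> int:
            return self.likelihood * self.impact

    register = [
        RiskItem("General ledger database", "Ransomware operators",
                 "Unpatched legacy database server", likelihood=4, impact=5),
        RiskItem("Customer portal", "Opportunistic scanners",
                 "Exposed administrative interface", likelihood=3, impact=3),
    ]

    # Review the highest-scoring risks first.
    for item in sorted(register, key=lambda r: r.score, reverse=True):
        print(f"{item.score:>2}  {item.asset}: {item.threat} via {item.vulnerability}")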

This work requires a partnership between business stakeholders and cybersecurity professionals. As with the previous phase, the work is frequently tedious and exhausting, but it is absolutely critical to a cybersecurity program's success.

It is at this stage that multiple errors can be introduced:

  1. The first is that business people can, and frequently do, underestimate the threats. Most common? "No one is out to get me! Our business is too small!" Or, "There is no threat — no one has attacked us yet, so …"
  2. Ignoring a vulnerability because of some "legitimate" reason. Here's one that may sound familiar: "Oh, that system was an experiment, and it is no longer in production. We have no need to patch it." If it's connected to your network, it's a vulnerability.
  3. Another common excuse? "We can't patch that server because our 'old database' system will not work." I feel your pain, but have you implemented any controls to work around this vulnerability?

How Do We Get Vulnerability Assessment Right?

The truth of the matter is that you can do a lot more about the second and third examples, the "non-production" system and the old database, than you can about a business owner proclaiming immunity and sticking his head in the sand.

For the second and third examples, the key is to get it all down on paper. Be thorough. Look under every rock. Audit as you go, uncovering and documenting every vulnerability. Engage with the business stakeholders and your technology partners, and do not rest until your vulnerability cataloging is exhaustive. Then engage a third-party vendor and have them perform vulnerability tests. Work within that reality and it will serve you very well. It takes time and data to convince people to act. The first step is to have the data. Then work with people over time to shift attitudes around what can and can't be done.
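
Getting it all down on paper works best when every finding is captured the same way. Below is one possible shape for a vulnerability catalog entry, including the compensating controls that answer the "we can't patch that server" objection. The field names and values are purely illustrative assumptions, not a mandated format.

    # Illustrative sketch of a vulnerability catalog entry with compensating controls.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VulnerabilityEntry:
        system: str
        description: str
        reason_unremediated: str                 # why it can't simply be patched or retired
        compensating_controls: List[str] = field(default_factory=list)
        owner: str = ""                          # who is accountable for the risk decision
        next_review: str = ""                    # when the exception gets revisited

    legacy_db_host = VulnerabilityEntry(
        system="Legacy database server",
        description="Operating system patches withheld; vendor database breaks on newer releases",
        reason_unremediated="Business-critical 'old database' has no supported upgrade path yet",
        compensating_controls=[
            "Isolated on its own network segment, reachable only from the application tier",
            "Host-based intrusion detection and enhanced logging",
            "Quarterly restore test of nightly backups",
        ],
        owner="Controller",
        next_review="Next quarterly risk review",
    )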

In the case where you have business owners that are in denial, your task becomes exponentially more difficult. This is not an exercise of "let me drown you in facts and figures."

Remember the adage: "A man convinced against his will is of the same opinion still."

With the head-in-the-sand crowd, your best chance, if not your only chance, is to attempt to engage on a personal level first. Establish trust and demonstrate that you put their interests first and foremost. This is not about rolling out some fancy cyber tech. This is about talking person-to-person about risk and risk management. This requires trust, empathy, and engagement.


Phase III. Controls and Incident Response

Once the work of the previous phases is complete and the knowledge gained is in hand, the business is ready to crystallize a Defense-in-Depth strategy by implementing layers of preventative, detective, corrective, and compensatory controls. Keep in mind that rolling out an Active Defense (the creation of honeypots and hacker traps that waste an attacker's time and can help with detection and identification) is an important consideration here, depending on resources and scope, and it is in perfect alignment with a Defense-in-Depth strategy.
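
One way to sanity-check a Defense-in-Depth design is to lay the controls out by type and by the layer each protects, then look for empty cells. The mapping below is a purely illustrative sample, not a recommended control set; the control names and layers are assumptions made for the example.

    # Illustrative sketch: controls grouped by type, with the layer each one protects.
    controls = {
        "preventative": [
            ("network", "Perimeter and internal segmentation firewalls"),
            ("identity", "Multi-factor authentication for remote access"),
        ],
        "detective": [
            ("network", "Intrusion detection sensors and centralized log monitoring"),
            ("deception", "Honeypot that slows an attacker and aids identification (Active Defense)"),
        ],
        "corrective": [
            ("endpoint", "Automated patch deployment"),
            ("data", "Tested backup and restore procedures"),
        ],
        "compensatory": [
            ("network", "Isolation of systems that cannot be patched"),
        ],
    }

    # Gaps in coverage show up quickly when you print the map.
    for control_type, entries in controls.items():
        for layer, description in entries:
            print(f"{control_type:<13} {layer:<10} {description}")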

At the same time, and part and parcel of our business continuity and disaster recovery plans, is the incident response (IR) plan. The IR plan will address identifying incidents, containing them, treating them, and recovering from them.
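
The IR plan itself can begin as a simple playbook keyed to those four activities. The phases, owners, and actions below are an illustrative skeleton to be filled in for your own organization, not a complete plan.

    # Illustrative sketch: a minimal incident response playbook skeleton.
    ir_playbook = {
        "identify": {
            "owner": "Security analyst on call",
            "actions": ["Triage the alert", "Declare an incident and open a ticket",
                        "Notify the incident commander"],
        },
        "contain": {
            "owner": "Incident commander",
            "actions": ["Isolate affected systems", "Preserve evidence",
                        "Brief legal and communications"],
        },
        "treat": {
            "owner": "IT and security engineers",
            "actions": ["Remove the attacker's access",
                        "Patch or reconfigure the exploited weakness"],
        },
        "recover": {
            "owner": "Business and IT owners",
            "actions": ["Restore from known-good backups", "Monitor for recurrence",
                        "Hold a lessons-learned review"],
        },
    }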

The possible errors in this stage are many, and they mostly involve governance, resources, and skill level.

Let's start with the most obvious error: having IT choose and implement controls. Many business owners may be blind to the inherent conflict of interest in such a set-up. Next up: picking the wrong controls, closely followed by configuring the right controls the wrong way. From there, we can go to a poorly designed incident response plan (a.k.a. the checklist approach), and on to little or no incident response training and simulation.

How Do We Get Controls And Incident Response Right?

There's good news here, because this phase is the most technical of all. Doesn't sound like good news? Well, it is!

The key is to understand the different roles and who wears which hat. To this end, bear in mind that IT creates value, but cybersecurity protects value. IT professionals are the custodians of value generation — technology. Cybersecurity professionals protect that value.

Yet so many business people lump these two classes of professionals together into one big bucket of "the techies."

Follow a simple rule: Value protection cannot report into value creation, or vice versa.

To get controls and incident response right, you will need skilled, experienced cybersecurity professionals, working in partnership with their IT counterparts. Cybersecurity and IT are two parallel tracks that your business is riding on. One simple misalignment here, and this train will derail.

When the governance and role picture is sorted out, the chance for success shoots up. Still, the work is complex: We need to make sure that control selection is appropriate and business-pragmatic, and that control configuration is done properly. There is no room for "tech ego." We need to make sure that we involve expertise from vendors, partners, and colleagues throughout this process. It is the cybersecurity professional's responsibility here to be informed, inclusive, and solicitous of expert advice.

With the incident response plan, the team must ensure not only that the correct technical skills and components are in place, but also that the business is fully on-board, trained, and in-sync with the plan.

When an incident happens, everyone needs to be singing from the same song sheet. Communications needs to be aligned with legal, which needs to be aligned with cyber, which also needs to be aligned with the business and the executive chain. Panic has no place in incident response, and the only way to remove panic is to train for it, repeatedly.


Phase IV. Living Cybersecure

It would be naïve for anyone to assume that if we avoided all the errors during program development, we would instantly achieve cybersecurity nirvana. The program, no matter how well conceived and executed, still has a huge vulnerability.

What is this single, persistent, point of failure?

People.

I would argue that even if a technological cybersecurity nirvana is achieved through the use of artificial intelligence, machine learning, quantum computers, and any other yet-to-be-imagined technological miracles, even then, we would still have the one, persistent, single point of failure:

People.

The persistent people problem takes many forms: boards that ignore cybersecurity; executives who pay it lip service; managers who ignore cybersecurity requests; employees who violate policies; "I know best" ego-driven cybersecurity professionals; and "IT knows better" techs.

How Do We Get Living Cybersecure Right?

The problem does seem overwhelming. How do you solve the "people" singularity? Some suggest keeping your head down, doing your work, and accepting "what will be, will be." Others argue that education is the answer. Others want more intensive training, more awareness, and compensation tied to cyber compliance. Some blame greed and a single-minded focus on fiscal performance. Others curse the speed of technological change that is leaving people in the dust.

There are still others, much more studied in human behavior and thinking, who suggest methods for convincing people of a position (e.g., inoculation theory). In this way of thinking, we need to create the necessary space and climate for discourse and understanding. I think highly of this work and urge everyone to study these methods and use them.

In my experience, I have found one thing that works most of the time: Strong engagement.

With engagement you build trust, establish communication channels, and develop durable bonds of respect with the person across the table from you. Even though he or she does not "speak" the same cyber language, may not understand the technical nuances, has no time to deal with yet one more thing, and may be afraid to look ignorant or needy — with the right trust framework in place, you can talk to each other.

This approach works, and it has had an incredible side effect: It teaches us more about how to be effective in delivering the right solution to the business than all the skills, training, and experience combined.

And, in my experience, it has kept many a cybersecurity program alive and well.

*****

Chris Moschovitis is the CEO of the Information Technology Management Group, a New York-based company focused on providing independent technology and cybersecurity managed services. He is both cybersecurity (CSX, CISM) and enterprise IT governance (CGEIT) certified. Chris is the co-author of "History of the Internet: 1843 to the Present," as well as a contributor to the "Encyclopedia of Computers and Computer History" and the "Encyclopedia of New Media." Chris' latest book, "Cybersecurity Program Development for Business: The Essential Planning Guide," was published by Wiley in 2018.