National author, speaker, consultant, and entrepreneur Evan Francen got into information security long before it was cool and buzzing in the media, and long before every so-called IT consultancy started chasing the money. In fact, he and I both dislike the money chasers. He and his growing consultancy, FRSecure, are for-profit, but they don’t do it for the money.
Like a patriot who delays college to join the army amid dire national conflict, Francen offers a fact-based call to arms to fix the broken cybersecurity industry in his 2019 book “Unsecurity.” I have known him and his company for a few years, and I have read his book along with many others on the subject; this content is worth sharing because too few people write or talk about how to actually make this industry better. Here are my three unbiased key points from his book:
1) We’re Not Speaking the Same Language:
Francen opens his book with a lengthy chapter on how poor communication between cybersecurity stakeholders exacerbates trouble and risk. You can’t see or measure what isn’t communicated well. The problem starts with five main stakeholder groups that don’t share the same vocabulary and have conflicting priorities:
a) IT: Speaks in data tables and code jargon.
b) Cyber: Speaks in risk metrics and security controls.
c) Business: Speaks in voice of the customer and profits.
d) Compliance: Speaks in evidence collection and legal and regulatory frameworks.
e) Vendor: Speaks in sales and marketing terms.
Ideally, all these stakeholders work together, but they are only as strong as the weakest link. To attain better communication and collaboration between them, all must agree on the general security framework best suited to the company and industry, maybe NIST CSF or maybe ISACA’s COBIT. However, once you pick the framework, you need to start training, communicating, and measuring against it and only it, adopting its definitions.
Changing frameworks in the middle of the process is like changing keys in the middle of a classical piece at a concert: don’t do it. That’s not to say that once communication and risk management get better, you can’t have some hybrid framework variation, like at a jazz concert. You can, but you need proof of the basic items first.
Later in the chapter, Francen describes the communication issue of too many translations: too many people passing the message along and giving it their own spin. Thus, what was merely a minor IT problem ticket turns into a full-blown data breach. Or people get tied up arguing over NIST, ISSA, ISACA, and OWASP jargon; nothing gets fixed, and people just get mad at each other without ever understanding one another. Knowing one or two buzzwords from an ISACA conference or paper while failing to understand how they apply to NIST or the like does not help. You should maintain a framework mapping sheet for this, as sketched below.
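To make the idea of a mapping sheet concrete, here is a minimal Python sketch of one kept as data, so each stakeholder group can translate a plain-English control into its own framework’s vocabulary. The specific NIST CSF and COBIT identifiers below are illustrative assumptions on my part, not taken from the book; verify them against the official crosswalk documents before relying on them.

```python
# A minimal sketch of a framework "mapping sheet" kept as data, so that every
# stakeholder group can translate the same control into its own vocabulary.
# The NIST CSF and COBIT identifiers below are illustrative assumptions;
# verify them against the official crosswalk documents before relying on them.

CONTROL_MAP = {
    "access-control": {
        "plain_english": "Only the right people can log in to the right systems.",
        "nist_csf": "PR.AC-1",   # assumed illustrative identifier
        "cobit": "DSS05.04",     # assumed illustrative identifier
        "owner": "IT",
    },
    "data-encryption": {
        "plain_english": "Non-public data is encrypted at rest and in transit.",
        "nist_csf": "PR.DS-1",   # assumed illustrative identifier
        "cobit": "DSS06.06",     # assumed illustrative identifier
        "owner": "Cyber",
    },
}

def translate(topic: str, framework: str) -> str:
    """Return what a plain-English control topic is called in a given framework."""
    entry = CONTROL_MAP.get(topic)
    if entry is None:
        raise KeyError(f"No mapping recorded for topic: {topic}")
    return entry.get(framework, entry["plain_english"])

if __name__ == "__main__":
    print(translate("access-control", "nist_csf"))  # PR.AC-1
    print(translate("data-encryption", "cobit"))    # DSS06.06
```

Whether you keep the map in a spreadsheet or a small script like this, the point is the same: arguments over jargon get settled by lookup rather than debate.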
The bigger solution is more training and vetting who is authorized to communicate on key projects. Good communication and project management are separate from cybersecurity, though they are critical dependencies. Organizations should pre-draft communication plans with roles and scope listed out, and then run tabletop exercises to solidify them (a small sketch follows below). Having an on-site Toastmasters group is also a good idea. I don’t care if you’re a cyber or IT genius; if you can’t communicate well, that’s a problem that needs to be fixed. I will take the person with better communication skills, because they can likely learn what they don’t know faster than the other can learn to communicate.
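As one way to picture a pre-drafted communication plan, here is a minimal Python sketch that records which roles may speak on which topics for a key project. The project name, roles, and topics are hypothetical placeholders for illustration, not anything prescribed in the book.

```python
# A minimal sketch of a pre-drafted communication plan kept as data, so it is
# clear who is authorized to communicate on which topics for a key project.
# The project name, roles, and topics are hypothetical placeholders.

COMM_PLAN = {
    "project": "example-risk-assessment",
    "authorized_speakers": {
        "status_updates": ["project_manager"],
        "security_findings": ["ciso", "lead_assessor"],
        "media_or_customer_questions": ["communications_lead"],
    },
}

def may_communicate(role: str, topic: str) -> bool:
    """Return True if the role is authorized to speak on the topic."""
    return role in COMM_PLAN["authorized_speakers"].get(topic, [])

if __name__ == "__main__":
    print(may_communicate("project_manager", "security_findings"))  # False
    print(may_communicate("ciso", "security_findings"))             # True
```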
2) Overengineered Foundations:
In chapter two, Francen addresses “Bad Foundations”. He gives many analogies, including building a house without a blueprint. However, I’m most interested in what he says on page 76:
- “Problem #4 Overengineered Foundation – too much control is as bad as too little control, and in some cases, it’s even worse than no control at all.”
What he is saying here is that an organization can get so busy with non-real-world spreadsheet assessments and redundant evidence gathering that its head is in the sand; it fails to connect the dots that other things are going awry, and it gets compromised. Keep in mind that IT and security staff are already overworked; they already have many conflicting dials and charts to read amid false alarms. Bogging them down in needless busywork must be weighed against other real-world security tasks, like patch management, change management, and moving IAM to two-factor authentication.
If you or your organization have trouble figuring this out, as Francen outlines, you need to simplify your risk management to a real-world foundational goal that even the company secretary can understand. It may be as simple as requiring long, complex passwords, keeping badge entry time logs for everyone, encrypting data that is not public, or other basics. Do these things and document that they have been done, one at a time, ingraining a culture of preventive rather than reactive security (a small sketch of one such check follows below).
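As a tiny, concrete example of picking one basic control and verifying it, here is a minimal Python sketch of a password length and complexity check. The thresholds are my assumptions for illustration, not values from the book; take yours from your own written policy.

```python
import re

# A minimal sketch of enforcing one documented foundational control at a time,
# here a long, complex password rule. The length and character-class
# requirements are assumptions for illustration, not values from the book.

MIN_LENGTH = 14  # assumed policy value

def password_meets_policy(password: str) -> bool:
    """Return True if the password satisfies the basic length/complexity rule."""
    if len(password) < MIN_LENGTH:
        return False
    required_classes = [
        r"[a-z]",          # lowercase letter
        r"[A-Z]",          # uppercase letter
        r"\d",             # digit
        r"[^A-Za-z0-9]",   # symbol or space
    ]
    return all(re.search(pattern, password) for pattern in required_classes)

if __name__ == "__main__":
    print(password_meets_policy("Correct horse battery staple 9!"))  # True
    print(password_meets_policy("Password1"))                        # False
```

The value is less in the check itself than in writing the rule down, enforcing it, and documenting that it was done before moving to the next basic.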
3) Cultivate Transparency and Incentives:
In chapter five, “The Blame Game,” Francen describes how IT and business stakeholders often fail to take responsibility for security failings. This is heavily influenced by undue bias, lack of diversity, and lack of fact-based intellectualism within the IT and business silos at many mid-sized and large organizations. I know this is a hard pill to swallow, but it’s so true. The vendors doing security assessments, tool implementations, and consulting, whose bills are approved by the IT and business leaders, should not be under pressure to deliver a favorable finding on an unrealistic timeframe. They should only be obligated to give timely, truthful, risk-prudent advice. Yet that same advice, if not couched with kid gloves, can get a vendor booted from the client, manufacturing a negative vendor event. Kinda reminds me of accounting fraud before Sarbanes-Oxley.
The reason is that risk assessors create evidence of security violations that the client does not agree with or like, and that evidence creates legal risk for the client, albeit well justified and of the client’s own doing. From Francen’s viewpoint, this comprehensive, honest assessment also gives the client a way to defend itself and limit liability by disclosing and remediating the vulnerabilities in a timely manner, under the advisement of a neutral third party. Moreover, the client gets instructions on how to avoid those vulnerabilities in the future, saving money and brand reputation.
Overall, transparency can save you. Customers, regulators, and risk assessors view you more positively because of it. That’s not to say some things won’t remain private; there are many, such as trade secrets, confidential data, and the like. My take on Francen’s discussion of the trade-offs between transparency and incentives in a chapter called “The Blame Game” is that it is no longer acceptable to delay or cover up a real security event, not that it ever was. Even weak arguments that deliberately miscategorize security events as smaller than they are will catch up with you and kick your butt or get you sued. Now is the time to be proactive. Build your incident response team ahead of time. It should include competent business risk consultants, cyber consultants, IT consultants, a communications lead, and a privacy attorney.
Lastly, if we as an industry are going to get better, we’re going to have to pick up books, computers, pens, and megaphones. And this book is a must-read! You can’t be passive and maintain your expert status; it expires the second you do nothing and get poisoned by your own bias and ego. Keep learning and sharing!