A civilian’s guide to military jargon for cybersecurity


Snafu. Fubar. Charlie Foxtrot. These are among more than a handful of colloquial phrases military forces coined to describe incredibly messed up states of affairs.

One might be tempted to ascribe such words to the current state of consumer cybersecurity. Is it even possible to avoid having your Social Security or credit card number stolen?

Acknowledging and describing a problem is just the first step to addressing it. And along with military jargon, there might be some hard-earned military lessons we could absorb in our efforts to tackle our cybersecurity issues. Here’s a sampling of each.


KISS. Keep It Simple, Stupid. As a former U.S. Army officer, I begrudgingly give credit to the Navy for formalizing this design principle in 1960: the simpler something is, the easier it is to build and operate, and the less likely it is to go wrong.

In technical design, the more complex a system or a system of systems, the more surface area there is for an attacker to compromise. Whether designing an application, a policy for people, or a process, keeping KISS in mind will benefit you.

Extending this principle to cybersecurity, a simple map of a company’s “battlefield”—its exposed systems—orients us to external threats. A threat is only a threat if it can get to you.

Encryption. The term “military grade” is often attached to encryption. Whether data is at rest or in transit, the allure of Fort Knox-level protection is enticing. The most common definition for this level of encryption is AES with a 256-bit key, the first publicly available cipher the National Security Agency approved for protecting classified information (NIST standardized AES in 2001, and the NSA approved it for top-secret data in 2003).

To brute-force even the weaker 128-bit variant, which most financial institutions use, an attacker armed with today’s fastest computers would need many times the age of the universe. Brute force is the simplest attack: it tries every possible key until it finds the right one.
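To get a feel for those numbers, here is a back-of-the-envelope sketch in Python. The attacker’s speed is an assumption chosen to be generous (one trillion keys per second); the point is how the key space dwarfs any realistic rate:

```python
# Back-of-the-envelope: how long does exhaustive key search take?
# ASSUMPTION: an attacker testing 1 trillion (1e12) keys per second.

KEYS_PER_SECOND = 10**12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
AGE_OF_UNIVERSE_YEARS = 13.8e9  # roughly 13.8 billion years

def years_to_search(key_bits: int) -> float:
    """Years needed to try every key in a key space of the given bit length."""
    keyspace = 2 ** key_bits          # number of possible keys
    return keyspace / KEYS_PER_SECOND / SECONDS_PER_YEAR

for bits in (128, 256):
    ratio = years_to_search(bits) / AGE_OF_UNIVERSE_YEARS
    print(f"{bits}-bit key space: ~{ratio:.2e} times the age of the universe")
```

Each additional bit doubles the search, so 256-bit isn’t “twice as strong” as 128-bit; it is 2^128 times larger.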

The thing about encryption is that the end state—the number of bits—is only one factor in its security. The others are:

  • Where are the keys stored?
  • How are keys exchanged between sender and receiver to share data?
  • What algorithm and implementation are used? These determine the entropy (randomness) of the encryption and the quality of the random seeds that start that randomness.
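The entropy point is easy to illustrate in code. In this sketch (my illustration, not anything from the article), Python’s `secrets` module draws key bytes from the operating system’s cryptographically secure generator, while the general-purpose `random` module is deterministically seeded, so anyone who guesses the seed can reproduce the “random” key:

```python
import random
import secrets

# Weak: Mersenne Twister seeded with a guessable value.
# The key is 256 bits long but has almost no effective entropy.
random.seed(1234)
weak_key = random.randbytes(32)

# Strong: OS-level CSPRNG (e.g., /dev/urandom), intended for key material.
strong_key = secrets.token_bytes(32)

# Same length -- only one is safe to use as an AES-256 key.
print(len(weak_key), len(strong_key))  # 32 32

# The weakness demonstrated: re-seeding reproduces the "random" key exactly.
random.seed(1234)
assert random.randbytes(32) == weak_key
```

The number of bits is identical in both cases; what differs is how hard those bits are to guess.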

This is where we start to see some strong military influence.

First, classified networks requiring encryption are physically isolated from one another, and they enforce limited personnel access. If you don’t have the clearance, you can’t just walk up to one.

Second, instead of encrypting through software, military forces use physical devices (hardware) and manually key the encryption. Because it’s very, very difficult to modify hardware, this is a much harder form of encryption to defeat.

Train as you fight. There are two elements to this military rule, whether we’re talking about cybersecurity or topics well beyond it: 1) You have to train, and 2) Your standard for training should be realistic and useful. Everyone seems to agree that training is good, but they don’t necessarily agree on what to train people to do.

At most organizations, cybersecurity training is the annual compliance check mark quickly forgotten. And it rarely reflects the fact that life isn’t a series of PowerPoint presentations and multiple-choice tests.

People, not computers, make up the largest attack surface: the sum of all possible avenues of attack against an organization. Increasing our awareness of threats, and improving our daily cybersecurity hygiene, can go a long way toward reducing risks to ourselves and our colleagues, whether fellow troops on the battlefield, co-workers, or family members.

Executives walking through a tabletop exercise will identify gaps in understanding, policy, and controls at a high level. Conducting a simulated adversarial attack, called a red team exercise (a term from military parlance, where the simulated adversary is the Opposing Force), can identify gaps in people, process, and technology controls.

People can do funny things under stress. Through repeated simulation, you can find out how you would react to a situation and train yourself to perform better, thus reducing your stress level. The closer the simulations are to the real thing, the better the result.

Lead by example. Change requires leadership. If an organization is going to improve its security or stay secure, then the general in charge needs to establish security as critical to the organization’s mission.

This means the right level of investment, including staffing. It also means setting a concerned and focused cultural tone. These things won’t completely protect your organization from attacks, but they will reduce its vulnerabilities, at least relative to its peers. It’s not about being faster than the bear, but faster than the other person running from the bear.

Malicious hackers are lazy in general and will go for what works and nothing more. And if you become the target, adopting some of these approaches will at least give you a fighting chance.
