SAN FRANCISCO — They may not realize it, but any company hit by the WannaCry ransomware attack over the past several months was impacted firsthand by a secretive U.S. government policy mechanism known as the VEP.

Short for the “Vulnerabilities Equities Process,” the VEP is the procedure through which the government decides whether to hang on to knowledge of computer security flaws for offensive uses (i.e., hacking), or disclose them to ensure they get patched. In the case of WannaCry, news reports and comments by Microsoft's chief legal officer indicated that the NSA knew about the vulnerability at the root of the worm, but only told Microsoft after losing control of it.

In the wake of the ensuing controversy, White House Cybersecurity Coordinator Rob Joyce last week for the first time unveiled a public version of the VEP Charter in an effort to shed some light on the government's decision-making process. The 14-page document describes in broad strokes the balancing act government hackers must go through after they discover new vulnerabilities. Here are a few things you ought to know about it:


1) The government will usually disclose the vulnerabilities it finds. Usually.

“The new charter makes an important policy decision that the presumption lies in favor of disclosing the vulnerabilities to the companies,” said Michelle Richardson, a deputy director at the Center for Democracy and Technology who has written about the VEP. “While several Obama officials had said as much in their personal capacities, it is crucial to have it be an official declaration from the whole of government.”

The relevant language from the charter reads: “In the course of carrying out USG missions, the USG may identify vulnerabilities that cyber actors could exploit. In the vast majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest.”

The caveat? The charter adds that there are “legitimate advantages and disadvantages to disclosing vulnerabilities, and the trade-offs between prompt disclosure and withholding knowledge of some vulnerabilities … can have significant consequences.” (Clearly.) When the government decides to keep a vulnerability under wraps, the charter says it will reassess that determination on an annual basis “until dissemination is accomplished,” or until the vulnerability becomes public or is “otherwise mitigated.”


2) There are a lot of cooks in the kitchen.

According to the charter, the body in charge of administering the VEP is known as the “Equities Review Board.” The board, which meets monthly, or more frequently as needed, comprises representatives from at least 10 different government agencies, including the Office of the Director of National Intelligence, the Department of State, the Department of Treasury, the Central Intelligence Agency, and the Department of Justice.

The National Security Agency “will support VEP governance by serving as the Executive Secretariat for the VEP, acting at all times under the authority, direction, and control of the Secretary of Defense,” the charter says. It adds that other agencies may become involved when “demonstrating responsibility for, or identifying equity in, a vulnerability under deliberation.”

The new charter requires the secretariat to submit an annual report to the various agency “points of contact” and the White House National Security Council, and create an executive summary written at an unclassified level. “As part of a commitment to transparency, annual reporting may be provided to the Congress,” it adds.


3) The government will not bother reviewing vulnerabilities that result from poor design.

There are types of vulnerabilities that will not go through the VEP process, according to the charter. Those include misconfiguration or poor configuration of a device that “sacrifices security in lieu of availability, ease of use or operational resiliency”; misuse of “available device features that enables non-standard operation”; and “engineering and configuration tools, techniques and scripts that increase/decrease functionality of the device for possible nefarious operations.” (Phone jailbreaking would ostensibly fall into this category.)

Lastly—and perhaps obviously—the government will not go through a VEP review upon discovering that a “device/system has no inherent security features by design.”


4) Companies should think about their patching policies.

Joyce, in his blog post, acknowledges that the risk of not disclosing a vulnerability that the government learns about is that it will be exploited by other actors “to harm legitimate, law-abiding users of cyberspace.” In weighing the various considerations, he says that one of the choices the government has is to disclose the security flaw to the vendor “with expectation that they will patch the vulnerability.”

Stewart Baker, a partner at Steptoe & Johnson LLP and a former Department of Homeland Security official, said the charter underscores that if the government tells a company about a security hole, the company had better make sure it has a plan to respond. “They're going to have to think, 'Am I somehow liable for failing to patch?'” Baker said.
