Trusting Encryption Supply Chains
A recent ArsTechnica article raised questions about Chinese-supplied encryption chips, specifically the possibility that the chips could include backdoors allowing information to be compromised or decrypted by parties other than those authorized. This is a serious concern, but in my opinion the article erred by taking far too narrow a perspective on the implications. The problem is broader than a particular Chinese company or companies. Rather, it is one of the myriad variants of the far broader supply chain problem. At bottom, it is a question of design verification.
Information leakage is a reasonable concern. On an individual level, compromise of Personally Identifiable Information (PII), health data, and financial data is a concern. Organizations that are custodians of personal data are concerned about that data, as well as about their own business and technical data. Threat actors can be individuals, criminals acting alone or in groups, and nation states.[1] Motivations span the spectrum: joyriding, revenge, vandalism, profit, and national security advantage.
Information exposure is not the sole hazard. Ransomware is, in a sense, the antithesis of exposure: it is essentially data hostage-taking, with the goal of denying the original holder access to their information rather than exfiltrating it.
Exposure and hostage-taking are far from the only possibilities. Deliberate destruction is another, either in the kinetic realm (e.g., Stuxnet[2]) or purely in cyberspace, by simply rendering the information permanently inaccessible. Admittedly, data stores can be reconstructed from backups, if the restoration mechanisms are not themselves compromised by the original incident. There is also a question of scale: recovering from a single destructive attack is one thing; dealing with large numbers of near-simultaneous episodes is far more difficult.
Data breaches are often discovered during data exfiltration: unauthorized, out-of-policy, or unaccountable data flows are identified by network traffic monitoring. However, the most dangerous hackers are those whose highest-priority goal is to avoid detection. Those hacking for notoriety or profit are almost invariably detected, be it via bragging, anomalous network traffic, or accounting records revealing unauthorized transactions. The most serious dangers occur when covert intruders succeed at their deliberate goal of avoiding detection.[3]
There is long-standing precedent: espionage and sabotage in the personal, commercial, and national security arenas. The 1972 Watergate incident was a failed covert attempt to install listening devices in the Democratic Party's headquarters. More topically, since the Second World War, submarines have been used for covert missions to gather information and to pick up or drop off personnel or cargo, under explicit orders to avoid detection and contact at all costs. A fundamental goal of such missions was to deny the adversary knowledge that the operation had ever happened.
The ArsTechnica article raised the question of information leakage, but did not mention what may be an equal or more serious hazard: denial of access, or effective destruction.
Consider a paradigmatic encrypted storage device consisting of a host, an encryptor/decryptor, and a storage medium.
When data is to be encrypted, the encryptor/decryptor is supplied with an n-bit key K, together with the data to be encrypted. Key K consists of bits k0 through kn-1. Whether k0 is the low-order or the high-order bit is irrelevant to this analysis; for simplicity, this discussion will use little-endian conventions.
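The device model above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions: the class and function names are invented, and a toy repeating-XOR keystream stands in for a real cipher.

```python
# A minimal model of the paradigmatic device: a host, an
# encryptor/decryptor, and a storage medium. The names and the toy
# XOR keystream are illustrative assumptions, not any real product.

def key_bits(key: int, n: int) -> list:
    """Little-endian convention: bit k0 is the low-order bit."""
    return [(key >> i) & 1 for i in range(n)]

class EncryptorDecryptor:
    def __init__(self, key: int, n: int):
        # Expand the n-bit key into a repeating byte keystream.
        self.keystream = bytes(
            (key >> (8 * (i % (n // 8)))) & 0xFF for i in range(256))

    def transform(self, data: bytes) -> bytes:
        # XOR is its own inverse, so one routine both encrypts and decrypts.
        return bytes(b ^ self.keystream[i % len(self.keystream)]
                     for i, b in enumerate(data))

class StorageMedium:
    def __init__(self):
        self.blocks = {}
    def write(self, addr, data):
        self.blocks[addr] = data
    def read(self, addr):
        return self.blocks[addr]

# Host side: encrypt on write, decrypt on read.
dev = EncryptorDecryptor(key=0x1F2E3D4C, n=32)
medium = StorageMedium()
medium.write(0, dev.transform(b"sensitive data"))
assert dev.transform(medium.read(0)) == b"sensitive data"
```

Note that because the transform is its own inverse, a single code path serves both directions; that symmetry is exactly what makes the roundtrip testing discussed next so weak a check.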
Most testing will verify that what was written to the storage medium can be successfully retrieved using the same key used when the data was stored. However, such testing has a significant blind spot: because the roundtrip does not use an independent path for either encryption or decryption, it verifies only that the encryption/decryption process is symmetric. The test does not verify the actual encryption of the data while at rest on the storage medium; that is taken on faith. Simply verifying that the stored ciphertext is gibberish, as opposed to unencrypted plaintext, similarly does not imply that the specified encipherment or key was used. The only valid way to verify that the at-rest data has been processed as advertised is to decrypt the stored material using a separately validated process.
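The blind spot can be made concrete. In the hedged sketch below (a toy repeating-XOR cipher stands in for the specified algorithm; all names and constants are invented), a hypothetical backdoored device silently ignores the supplied key, yet passes both the roundtrip test and the "is it gibberish?" test; only comparison against a separately validated implementation exposes it.

```python
FIXED_BACKDOOR_KEY = b"\x5A"  # the device secretly uses this, not the user's key

def backdoored_encrypt(data: bytes, user_key: bytes) -> bytes:
    # Ignores user_key entirely -- yet remains perfectly symmetric.
    return bytes(b ^ FIXED_BACKDOOR_KEY[0] for b in data)

backdoored_decrypt = backdoored_encrypt  # XOR is self-inverse

def reference_encrypt(data: bytes, key: bytes) -> bytes:
    # Independently validated implementation of the *specified* cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

user_key, plaintext = b"\x42", b"top secret"
stored = backdoored_encrypt(plaintext, user_key)

# Roundtrip test: passes, proving only that the process is symmetric.
assert backdoored_decrypt(stored, user_key) == plaintext

# "Is it gibberish?" test: also passes -- stored bytes differ from plaintext.
assert stored != plaintext

# Independent verification of the at-rest bytes: fails, exposing the backdoor.
assert stored != reference_encrypt(plaintext, user_key)
```

The first two checks are all that routine product testing exercises; only the third, using a separately validated path, inspects what is actually on the medium.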
More broadly, black box testing only probabilistically detects errors. An example of an error that went undetected is the 1994 Intel Pentium FDIV bug discovered by Lynchburg College professor Thomas R. Nicely.[4] It is reasonable to presume that Intel had thoroughly tested its then-flagship microprocessor prior to customer release. Only testing/verification that covers the design itself, rather than merely sampling its external behavior, can provide genuine assurance.
Consider some admittedly challenging possibilities:
A "forget the key" function is actively marketed as a feature for fast erasure of encrypted sensitive information.
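As an illustration of why such a feature works, the sketch below models cryptographic erasure: destroying the key renders the at-rest ciphertext inert without touching the (much larger) medium. The class and method names are hypothetical, and a toy XOR cipher stands in for a real one.

```python
import secrets

class SelfEncryptingDrive:
    """Toy model: 'forget the key' erases data by destroying the key,
    not by overwriting the storage medium itself."""
    def __init__(self):
        self.key = secrets.token_bytes(16)
        self.medium = b""

    def write(self, data: bytes):
        self.medium = bytes(b ^ self.key[i % 16] for i, b in enumerate(data))

    def read(self) -> bytes:
        if self.key is None:
            raise RuntimeError("key destroyed; ciphertext is unrecoverable")
        return bytes(b ^ self.key[i % 16] for i, b in enumerate(self.medium))

    def crypto_erase(self):
        self.key = None  # O(1) erasure, regardless of medium size

drive = SelfEncryptingDrive()
drive.write(b"patient records")
assert drive.read() == b"patient records"
drive.crypto_erase()
# The ciphertext still sits on the medium, but is now inert.
```

Of course, this guarantee holds only if the cipher and key handling are what the vendor claims, which is precisely the verification question at issue.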
Non-data-exfiltration attacks leave no tell-tale data stream to be detected. Activation signals for covert features can be minuscule and seemingly innocuous. This makes detection a challenge.
In All the President's Men,[5] journalist Bob Woodward described the signal used to request a meeting with his most sensitive source, Deep Throat: moving a potted plant on his apartment terrace.[6] Newspaper classified advertisements are another classic example.
In the national security space, one historic example was the Imperial Japanese Navy's attack on Pearl Harbor. The underlying orders were never sent via radio; the Kido Butai (Strike Force) was operating under strict radio silence. However, the somewhat elliptically-phrased operational execute order, NIITAKAYAMA NOBORE 1208 (in English, "Climb Mount Niitaka 1208"), sent using the Imperial Japanese Navy's JN-25B cipher on December 2, 1941 (Tokyo time), stands out.[7] The message is atypical in phrasing and content when compared to other messages sent in a military cipher system. It was intercepted when sent, albeit its import was not appreciated until after the attack.[8,9]
In contrast, the less well known, but for our purposes far more relevant, execute message is the parallel "Hostilities imminent" signal sent as part of the December 5 (Tokyo time) Radio Tokyo weather report using pre-arranged wording, referred to as the "Winds" message.[10] The "Winds" message is a good example of a covert execute message: it appears innocuous, but has a very precise meaning to witting receivers.
Covert signals are common. Execute orders are simple codewords or phrases; they initiate a pre-planned chain of events. Ideally, an execute code is sufficiently rare that accidental triggering is unlikely, but everyday enough that the phrase does not attract attention or interest. The mere presence of a certain bit pattern in a JPEG or GIF file could be the indicator. There is no need for a dedicated command-and-control network; a GIF served through an online advertising network is more than sufficient. The transmission would be virtually impossible to distinguish from the vast numbers of online advertisements served in an average day.
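A hedged sketch of how small such a trigger can be: a compromised component need only watch passing data for a pre-arranged byte pattern. The 8-byte trigger value below is invented for illustration; any rare-but-innocuous bit string would serve.

```python
# Hypothetical: a compromised component watching a data flow for a
# pre-arranged activation pattern. The trigger value is invented for
# illustration; in hardware, detection is a shift register plus comparator.
TRIGGER = bytes.fromhex("deadbeef00c0ffee")

def contains_trigger(stream: bytes) -> bool:
    return TRIGGER in stream

# An ordinary GIF header followed by pixel-like data...
benign_ad = b"GIF89a" + bytes(range(64))
# ...versus one with the trigger buried in otherwise plausible data.
armed_ad = b"GIF89a" + bytes(range(30)) + TRIGGER + bytes(range(30))

assert not contains_trigger(benign_ad)
assert contains_trigger(armed_ad)
```

To any network monitor, both files are just image bytes; nothing about the armed one stands out statistically among billions of served advertisements.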
In a chip design containing hundreds of thousands or millions of gates, the circuitry to implement a covert signal and altered behavior can be a seemingly minuscule few hundred gates and corresponding connections. A specific key, or a specific key in combination with a data pattern, would be sufficient as a trigger.
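The trigger logic itself reduces to a single comparison. In the hypothetical Python model below (the magic key and data prefix are invented values), a comparator over the key bits, roughly one XNOR gate per bit plus an AND reduction in hardware, gates a covert behavior that black box testing is vanishingly unlikely to exercise.

```python
# Hypothetical trigger: altered behavior only when a specific key is
# presented in combination with a data pattern. In hardware this is
# roughly one XNOR gate per key bit plus an AND tree -- a few hundred
# gates in a design of millions.
MAGIC_KEY = 0x0123456789ABCDEF
MAGIC_DATA_PREFIX = b"\xAA\x55"

def trigger_asserted(key: int, data: bytes) -> bool:
    return key == MAGIC_KEY and data.startswith(MAGIC_DATA_PREFIX)

def encrypt_block(key: int, data: bytes) -> bytes:
    if trigger_asserted(key, data):
        return data  # covert mode: e.g., pass plaintext through unmodified
    # Normal path: toy XOR against the (64-bit) key, for illustration only.
    return bytes(b ^ ((key >> (8 * (i % 8))) & 0xFF)
                 for i, b in enumerate(data))

# Ordinary operations never exercise the covert path...
assert encrypt_block(0x1111, b"\xAA\x55hello") != b"\xAA\x55hello"
# ...until the exact key/data combination arrives.
assert encrypt_block(MAGIC_KEY, b"\xAA\x55hello") == b"\xAA\x55hello"
```

No volume of randomized black box testing will plausibly stumble on one specific 64-bit key; only inspection of the design itself reveals the comparator.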
Black box testing is unlikely to identify a potential activation pathway triggered by a covert signal. Analytical examination, covering the design at the gate level together with the input and output data paths, is the only route to verification.
Chip integrity is a serious concern, both in terms of initial design as well as ongoing revision and manufacturing.
[1] Robert Gezelter (2012), "Mobile Code", in Computer Security Handbook, Fifth Edition, Section 17.1.1
[2] David Kushner (2013, February 26), "The Real Story of Stuxnet"
[3] Robert Gezelter (1997), "Security on the Internet", Chapter 23 in Computer Security Handbook, Third Edition Supplement
[4] Alexander Wolfe (November 9, 1994), "Intel fixes a Pentium FPU glitch"
[5] Carl Bernstein and Bob Woodward (1974), All the President's Men, pp 72
[6] John O'Connor (17 October 2006), "'I'm the Guy They Called Deep Throat'"
[7] Edwin Layton, Roger Pineau, John Costello (1985), And I Was There: Pearl Harbor and Midway, Breaking the Secrets, "Countdown to War" (Chapter 22), pp 242
[8] Edwin Layton, Roger Pineau, John Costello (1985), And I Was There: Pearl Harbor and Midway, Breaking the Secrets, [chapter name], Chapter xx, pp 264; also "Author's Notes, III: The East Winds Rain Enigma", pp 517 et seq
[9] Imperial Japanese Navy message of December 2, 1941, page 65: "This dispatch is Top Secret. This order is effective at 1730 on 2 December. Climb NIITAKAYAMA 1208, repeat 1208." ("Climb Mount Niitaka, December 8"), SRN-115376. (In late 1945, possibly with the retrospective in-hand knowledge that this message was stipulated in Flt OPORDER #1, its meaning was understood by OP-20-G to be "Attack on 8 December." The notes on the 1945 retrospective translation state: "NITAKYAMA is [sic] the highest mountain in the Japanese Empire. To climb NITAKYAMA is to accomplish one of the greatest feats. In other words, undertake the task (of carrying out assigned operations).")
[10] The "Winds" message was prearranged phrases in the Radio Tokyo weather report transmitted worldwide via shortwave radio. There is a dispute as to whether it was received by US Navy listening posts; the actual page for the intercept is missing from the log.[11] However, the signal was intercepted by the UK Far East Combined Bureau and forwarded to London.[12]
[11] Edwin Layton, Roger Pineau, John Costello (1985), And I Was There: Pearl Harbor and Midway, Breaking the Secrets, "Author's Notes, III: The East Winds Rain Enigma", pp 517 et seq
[12] Michael Smith (2000), The Emperor's Codes: Bletchley Park and the breaking of Japan's secret ciphers, pp 100
"Mobile Code", Chapter 17 in Computer Security Handbook, 5th Edition, John Wiley & Sons
"Protecting Web Sites", Chapter 20 in Computer Security Handbook, 5th Edition, John Wiley & Sons
"Security on the Internet", Chapter 23 in Computer Security Handbook, 3rd Edition Supplement, John Wiley & Sons
"Security on the Internet", Chapter 23 in Computer Security Handbook, 3rd Edition, John Wiley & Sons
"The US Navy, NATO, and NASA are using a shady Chinese company's encryption chips", ArsTechnica. Retrieved from https://arstechnica.com/information-technology/2023/06/the-us-navy-nato-and-nasa-are-using-a-shady-chinese-companys-encryption-chips/ on June 20, 2023
"The Stuxnet Attack On Iran's Nuclear Plant Was 'Far More Dangerous' Than Previously Thought", Business Insider. Retrieved from https://www.businessinsider.com/stuxnet-was-far-more-dangerous-than-previous-thought-2013-11 on July 2, 2023
"The Real Story of Stuxnet", IEEE Spectrum, July 2013. Retrieved from https://spectrum.ieee.org/the-real-story-of-stuxnet on July 2, 2023
"'I'm the Guy They Called Deep Throat'", Vanity Fair. Retrieved from https://www.vanityfair.com/news/politics/2005/07/deepthroat200507 on July 17, 2023
"Intel fixes a Pentium FPU glitch", Electronic Engineering Times. Retrieved from http://davefaq.com/Opinions/Stupid/Pentium.html#glitch on June 21, 2023
Long: http://www.rlgsc.com/blog/ruminations/trusting-encryption-supply-chains.html
Short: http://rlgsc.com/r/20230725.html