Over the last 17 years we have all heard of DoD 5220.22-M 3-pass. It has been touted as the standard for data sanitization. The problem I see is that technology has outgrown the standard. The exact verbiage in the matrix of the original document that has everyone stuck is “Overwrite all addressable locations with a character, its complement, then a random character and verify. THIS METHOD IS NOT APPROVED FOR SANITIZING MEDIA THAT CONTAINS TOP SECRET INFORMATION.” This was designed to verify that non-top-secret information on “Non-Removable Rigid Disk and Removable Rigid Disk” had been sanitized. By 2012 we have already seen multiple revisions of this document and many others, yet the 3-pass “rule” still seems to endure. I propose that security sanitization practices are not rules but guidelines to follow based on an organization's risk/threat analysis.
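For readers unfamiliar with what that matrix line actually prescribes, the pattern can be sketched in a few lines of code. This is an illustration only, not a sanitization tool: the function name and the use of an ordinary file are my own choices for demonstration, while real wiping software operates on raw block devices and must contend with drive caches and remapped sectors.

```python
import os
import secrets

def three_pass_overwrite(path, pattern=0x55):
    """Illustrative sketch of the 5220.22-M 3-pass scheme:
    overwrite with a character, its complement, then random
    data, verifying each pass by reading it back."""
    size = os.path.getsize(path)
    passes = [
        bytes([pattern]) * size,         # pass 1: a fixed character
        bytes([pattern ^ 0xFF]) * size,  # pass 2: its complement
        secrets.token_bytes(size),       # pass 3: random data
    ]
    for data in passes:
        with open(path, "r+b") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())         # push the pass to disk
        with open(path, "rb") as f:      # verify: read back and compare
            assert f.read() == data, "verification failed"
```

Note that each pass ends with a verification read, which is exactly the step that, in the manual era, a human performed.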
During the creation of the original 5220.22-M, it is speculated that the primary basis for data sanitization practices was the floppy disk and its data storage characteristics. Policies were written to take into consideration both the longevity of information on this medium and the physical process by which it could be verified as sanitized. The electronic data sanitization industry had not yet been created; some of the largest software data sanitization brands of today were not formed until a few years later. The process at the time was therefore manual, and by that virtue allowed for three instances of human verification during the sanitization process. This allowed both software and human error to be checked, and in my opinion this was the original basis of the standard. That being said, even the government took into consideration that, all else being equal, there are still certain instances where physical destruction was the better solution based on the risk assessment.
As time progressed and the data destruction industry developed, the standards started to respond to industry trends. An entire industry had been created to automate and control the process that was laid out in a few lines of a government document. Software had been created to bypass human interaction with each step of the process and arguably increase the success rate of 3-pass systems (as it was now more time- and cost-effective to complete), while at the same time removing the human aspect of verification. By the mid-2000s, manufacturing, testing, and analysis of media had come a long way.
In 2006, NIST SP 800-88 stated that “Basically the change in track density and the related changes in the storage medium have created a situation where the acts of clearing and purging the media have converged. That is, for ATA disk drives manufactured after 2001 (over 15 GB) clearing by overwriting the media once is adequate to protect the media from both keyboard and laboratory attack.” That same year, DoD 5220.22-M removed all verbiage on single versus multiple passes. The standards were now leaning toward each entity making its own decisions based on its own risk and threat assessment. Essentially the message was: “One pass is as good as multiple as long as it is verified complete. If you are in doubt, or have something of a sensitive nature, physically destroy it.”
Six years after those revisions, with more research and data on sanitization available, we still hear people ask if we do DoD 3-pass sanitization. The truth is that at this point it doesn't exist. The DoD has decided that secure information that must remain secure must be destroyed. NIST has restated in clear terms that a two-person rule (read: human verification) shall be implemented, but gives no guidelines as to the method of sanitization (it could be a single wipe with dual human verification, or physical destruction with the same).
In today's data-rich environment, companies and individuals should take into consideration their unique risk-versus-value propositions. The tools are available to address any level of security issue. As a company we provide many levels of service, from destruction only to sanitize-and-resell. Even as the service provider, we take into consideration the risk we accept when we put a program in place for our clients. Not only do we help interpret the guidelines, but we too have to make the same decisions our clients make every day: does this process provide enough value to balance the risk? It seems like an easy question, but an entire industry has been built over the last 17 years around what constitutes a low enough risk. What the industry is focusing on now is forming an educated opinion when putting together your program, with best practices and policies that can be implemented into your own data risk mitigation practices.