Securely Erase A Hard Drive

You can choose how many times you want to overwrite the disks in DBAN's interactive mode. As I have mentioned before, for any normal person's use case, overwriting more than once is unnecessary. Overwriting an 8 TB drive 37 times will take approximately 22 days. That is just wasted time and energy.
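
For a rough sense of where that 22-day figure comes from, here is a back-of-the-envelope sketch. The ~155 MB/s sustained write speed is my own assumption for a typical large spinning drive, so treat the result as an order-of-magnitude estimate, not a benchmark.

```python
# Back-of-the-envelope wipe-time estimate for the claim above.
# The sustained write speed is an assumption; real drives slow down
# toward the inner tracks, so actual times will vary.
drive_bytes = 8e12          # 8 TB, decimal, as drives are marketed
write_speed = 155e6         # assumed sustained sequential write, bytes/s
passes = 37

seconds_per_pass = drive_bytes / write_speed
total_days = passes * seconds_per_pass / 86_400

print(f"one pass : {seconds_per_pass / 3600:.1f} hours")
print(f"{passes} passes: {total_days:.1f} days")   # ~22 days, matching the post
```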

You do it for the Michael Bay experience! EXPLOSIONS!!!

And maybe some lens flare!

:grin:

The primary reason I use DBAN is that drives wiped with it are forensically sterile.
Believe me, it doesn't take much for a lawyer to get copied evidence thrown out if the drive hasn't been wiped thoroughly.

Small drives can be done in batches, but large drives are better done by themselves, and for this purpose investing in a fast motherboard and a good amount of RAM can be a blessing while wiping drives.

Well, for one thing, DBAN is free! And KillDisk Ultimate Enterprise is $99.00.

Wow, thanks everyone! I will be using DBAN.


I don't think DBAN itself has been updated for some time (2015, to be exact).

It seems it has been forked into nwipe, according to good old Wikipedia.

I read most of the comments, all of which seem to be written by erudite experts; my nerdiness has been upstaged. Sigh. But none of the comments evidence expertise in information security “tradecraft”. First off, “making sure” + “reasonably” = 7x overwrites with a variety of patterns reminiscent of 1st-gen LED strips and their aggressively unartistic color-cycling patterns, the lighting equivalent of headaches and nosebleeds, generations away from breathing 24-bit, individually addressable color. But never mind LEDs and which recursive functions are favored by the nerds who write forensic data “privacy assurance” software… am I the only one who noticed the H in HDD?!

I’m all for staring at a nostalgia-inducing ASCII-only screen for 3 days while performing due-diligence ass covering in case it ends up on WikiLeaks, but for the love of efficiency and the dividends that scientific literacy pays out not often enough: TAKE A FREAKIN’ MAGNET TO IT! IF YOU DON’T HAVE ONE HANDY, TAKE YOUR SPEAKERS APART! I don’t care how expensive the data recovery setup is, nobody without serious magic powers is pulling anything more than a serial number off the sticker. Your drive can be sold and immediately forgotten. Yes, you can use it again after degaussing it.

If you’re at a university, walk over to the chem building and go in the basement. Ask where the NMR is, and if you don’t have nipple or genital piercings, walk over to the metal egg on wheels, taking one HDD at a time in both hands, as though offering something to pagan gods (the 4-tesla magnet will try to rip it from your grip, but hold tightly and go in for a few teasing fly-bys). You just saved days’ worth of military-grade boredom, with your computer very busy and you unable to leave (security, remember?).

I vote for the big-ass magnet. Anybody concur? [A subwoofer’s back side, any decent neodymium magnets, or even an electromagnet (door lock?) will do the trick.] It’s easy to check afterward whether you need to do it again: just download a free recovery app, e.g. Recuva. I always passed this test and never needed to redo the procedure, not even in my first few “destroy everything” missions. Honestly, save the exercise in monastic patience for SSDs and NVMe drives; for HDDs, just use The Force…

I hope this brought some desperately needed panache to this boring chore. Whatever it is you do, make it look good!
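
On the “check it afterward” step: besides pointing a recovery tool like Recuva at the drive, a very crude sanity check is to sample raw sectors and see whether anything that still looks like structured data survived. The sketch below is my own, not part of DBAN, nwipe, or Recuva; the /dev/sdX path is a placeholder, it needs root, it only reads a tiny fraction of the disk, and a properly degaussed drive may not enumerate at all, in which case the question answers itself.

```python
# Crude post-wipe smoke test: sample random locations on a raw block device
# and summarise what the bytes look like. Placeholder device path, root
# required; a few hundred KiB of samples can only catch an obviously failed
# wipe, it is not proof of forensic sterility.
import os
import random
import string

DEVICE = "/dev/sdX"   # placeholder: the drive you just wiped
CHUNK = 4096          # bytes per sample
SAMPLES = 256         # random locations to probe

PRINTABLE = set(string.printable.encode())

def main() -> None:
    fd = os.open(DEVICE, os.O_RDONLY)
    try:
        size = os.lseek(fd, 0, os.SEEK_END)   # block devices report size via seek
        zero_chunks = texty_chunks = 0
        for _ in range(SAMPLES):
            offset = random.randrange(0, size - CHUNK)
            os.lseek(fd, offset, os.SEEK_SET)
            data = os.read(fd, CHUNK)
            if data.count(0) == len(data):
                zero_chunks += 1              # expected after a zero-fill pass
            elif sum(b in PRINTABLE for b in data) / len(data) > 0.9:
                texty_chunks += 1             # readable text should not survive a wipe
        print(f"{zero_chunks}/{SAMPLES} sampled chunks were all zeros")
        print(f"{texty_chunks}/{SAMPLES} sampled chunks looked like readable text")
        # After a random-pattern wipe or a degauss you expect roughly zero of
        # each; after a zero-fill you expect nearly every chunk to be all zeros.
        # Anything "texty" is a red flag that data survived.
    finally:
        os.close(fd)

if __name__ == "__main__":
    main()
```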

Does this magnetism business work on a 5-platter drive too? :slight_smile:

If you are already considering the sort of equipment that can read an overwritten drive, I am not positive that massive magnetisation like you describe would necessarily work. I do not know enough about any of these to be certain, but it is interesting to speculate:

  1. How much does the drive casing prevent erasure by acting like a faraday cage?
  2. Does a large uniform field introduce enough noise to erase data?
  3. Can a drive still work if the platter has been degaussed?

For #1 and #2 I wonder: if the field strength is substantially reduced by the case shielding the platters, could you end up with a drive that is non-functional but whose data is still discernible to certain forms of magnetic microscopy? If you have not written random data before doing this, and are stuck with only the uniform effects of attempted/amateur degaussing, the data might actually be more recoverable.

For #3, I thought I read somewhere that there is some form of more permanent magnetisation on HDD platters (the servo information) that is used to separate or identify the tracks; if you manage to erase this, I think the drive becomes unusable, since that track information can only be written onto the platter at the factory.


In short, I think a few random overwrite passes are still the superior method, even if you have access to a powerful magnet.

Theoretical concerns about uniform erase methods

To clarify my concerns about uniform overwrites in general: I wonder whether, in theory, after a uniform change (a physically large magnet, or writing just zeros) the entire platter is actually easier to recover data from, because the difference between nearby bits is still preserved; that is to say, the data pattern might be hard to read, but it is still the only pattern there.

Here is a thought experiment: a drive where 1 is read as > 0.5 and 0 as < 0.5, and where these five bits initially held 11010. Uniform represents a uniform erase method (a large magnetic force, or multiple write operations of zero over the same area) that modifies the charge/flux of many bits at once in mostly the same way, with a bit of randomness; Random represents what I imagine writing random data (01100) to the area might result in:

|         | Bit 0 | Bit 1 | Bit 2 | Bit 3 | Bit 4 | Avg.  |
|---------|-------|-------|-------|-------|-------|-------|
| Initial | 0.98  | 0.89  | 0.20  | 0.87  | 0.16  | 0.62  |
| Uniform | 0.068 | 0.053 | 0.012 | 0.057 | 0.010 | 0.040 |
| Random  | 0.311 | 0.867 | 0.672 | 0.286 | 0.044 | 0.436 |

Again, these numbers represent a thought experiment, not a genuine calculation.
In this imaginary scenario, the original data can be recovered from a uniform overwrite by comparing it to the average of the bits, as long as your instrument is sensitive enough to see the differences.
With the single random pass you can still cluster your values into four groups, assuming that the residual charge/flux from the previous write overcomes the noise in the latest write pass and hints at where data was changed rather than overwritten with the same value; here that does seem to work (see the sketch after the list):
Cluster A: 1 overwriting 1 — Bit 1 (closest to 1)
Cluster B: 1 overwriting 0 — Bit 2
Cluster C: 0 overwriting 1 — Bit 0 and 3 (since they are the closest neighbours)
Cluster D: 0 overwriting 0 — Bit 4 (closest to 0)
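
To make the thought experiment concrete, here is a small sketch of the same toy numbers. Everything in it is invented for illustration: the residual = 0.7·new + 0.3·old model is just a curve fitted by eye to the table above, not real platter physics, and an actual attacker would need exotic instrumentation rather than a few lines of arithmetic.

```python
# Toy model of the table above -- purely illustrative, not real HDD physics.
# Assumption: after one write, the measured flux is roughly
#   residual = 0.7 * new_bit + 0.3 * old_bit + noise
# where 0.3 is a made-up "remanence" fraction fitted by eye to the numbers.

initial     = [1, 1, 0, 1, 0]                          # 11010
uniform_row = [0.068, 0.053, 0.012, 0.057, 0.010]      # after a uniform erase
random_row  = [0.311, 0.867, 0.672, 0.286, 0.044]      # after writing 01100

# Uniform erase: everything was pushed toward 0 by the same amount, so simply
# comparing each bit to the row average recovers the original pattern.
avg = sum(uniform_row) / len(uniform_row)
recovered_uniform = [1 if v > avg else 0 for v in uniform_row]

# Single random pass: read the new data normally (threshold at 0.5), then
# subtract its contribution and see which way the leftover leans.
REMANENCE = 0.3
recovered_random = []
for v in random_row:
    new_bit = 1 if v > 0.5 else 0
    leftover = (v - (1 - REMANENCE) * new_bit) / REMANENCE
    recovered_random.append(1 if leftover > 0.5 else 0)

print("original         :", initial)
print("from uniform row :", recovered_uniform)   # -> [1, 1, 0, 1, 0]
print("from random row  :", recovered_random)    # -> [1, 1, 0, 1, 0]
# Both rows give the original data back in this toy model; the point of
# multiple random passes is to bury the leftover term under several layers
# of noise so this kind of inference stops working.
```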

I imagine this is probably why multiple random passes are still recommended, even if that older concern about writing outside the track area is no longer an issue.


Edit: these two lengthy posts have mostly been how I imagine these things work, so please take them with a grain of salt. Actually, on that note, I might as well ask @wendell: do this and my other post sound about right? Any glaring mistakes or common misconceptions I am accidentally reinforcing for myself or others here?
