Should you worry about compiling on a solid state drive?
by Dave Harms
Recently in the CW-Talk Skype chat Mark Goldberg wrote that he uses a RAM drive instead of the SSD for his OBJs. Why? Because compiling means disk writes, and SSDs really do wear out. As Geoff Gasior explains in The SSD Endurance Experiment: They're all dead:
This breed of non-volatile storage retains data by trapping electrons inside of nanoscale memory cells. A process called tunneling is used to move electrons in and out of the cells, but the back-and-forth traffic erodes the physical structure of the cell, leading to breaches that can render it useless.
Electrons also get stuck in the cell wall, where their associated negative charges complicate the process of reading and writing data. This accumulation of stray electrons eventually compromises the cell's ability to retain data reliably—and to access it quickly.
How many times can you write to flash memory? Depending on the technology, anywhere from 1,000 to 1,000,000 times, but from my limited research I'd say you can expect most SSDs to be somewhere around the 2,500-5,000 write cycle mark.
Does that mean you could be limited to as few as 2,500 compiles, assuming the compiler writes to the same location each time? At two compiles per hour the drive would be toast in less than a year, right? But the assumption that each write goes to the same location is a bad one.
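To see where that scary number comes from, here's the back-of-the-envelope version as a quick Python sketch; the 2,500-cycle endurance and two compiles per hour come from above, while the 40-hour, 48-week work year is my assumption:

write_cycles = 2500                  # pessimistic per-cell endurance (from above)
compiles_per_hour = 2
hours_per_year = 40 * 48             # assumed: 40-hour weeks, 48 work weeks per year

years = write_cycles / (compiles_per_hour * hours_per_year)
print(years)                         # ~0.65: toast in well under a year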
SSDs employ wear-leveling algorithms so that writes are distributed evenly across cells. Beyond that, SSDs are typically over-provisioned with memory; you're getting more than the stated capacity, and if cells wear out they are replaced with these spare cells.
What you need to think about is total write capacity. Gasior's experiment ran a half dozen SSDs in the 240GB to 256GB range. The first drive to fail shut itself down, by design, after 700TB of writes; the last drive standing gave out after 1.1 petabytes. That's 1,100,000 gigabytes of data written, or more than 4,000 times the drive's actual capacity.
The largest system I've ever worked on consists of over 200 apps and writes about 1.1 GB of files during a full debug-mode build. That's about 720 MB of OBJ files and 380 MB of DLLs and EXEs.
Assuming I'm using a 240GB SSD with 700TB of write capacity, I could compile that system from the ground up over 600,000 times. To put that into perspective, imagine that I'll have that drive for five years before I decide to replace it with something bigger and faster. On a five-day work week with no days off for good behavior, I can still compile that entire 1.1 GB project 488 times per day; if that's even achievable with currently available hardware then I'd like one of those machines.
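If you want to check that arithmetic, here's the same calculation as a Python sketch (261 is roughly the number of working days in a five-day-week year):

write_capacity_gb = 700 * 1000       # 700TB of write endurance, in GB
build_size_gb = 1.1                  # one full debug build of the 200-app system

full_builds = write_capacity_gb / build_size_gb
work_days = 261 * 5                  # five years of five-day weeks
print(full_builds)                   # ~636,363 full builds
print(full_builds / work_days)       # ~488 builds per day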
How long will it last?
Here's how I suggest you calculate how many years your drive will last.
First, get the total size of your application directory in MB. Call it TotalMB.
Next, figure out the average size of the DLL or EXE you're creating on a partial compile, in MB. Call that ExecutableMB.
(Drive capacity in MB * 1000)
__________________________________________________________________
((TotalMB * full compiles per hour) + (ExecutableMB * partial compiles per hour)) * 40 * 48

The 1000 is a deliberately conservative write-cycle multiplier, well below both the 2,500-5,000 cycle range above and the roughly 2,900 to 4,300 times capacity that Gasior's drives actually absorbed; 40 * 48 is hours per week times working weeks per year.
TotalMB overestimates the bytes written on a full compile; ExecutableMB underestimates the bytes written on a partial compile, but not by much, because when you make one change you're typically only recompiling a small portion of the OBJs.
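Here's the formula as a small Python function, a minimal sketch in which the parameter names mirror the variables above and the default multiplier of 1000 is the conservative write-cycle assumption just discussed:

def ssd_lifespan_years(drive_capacity_mb, total_mb, executable_mb,
                       full_per_hour, partial_per_hour,
                       write_cycles=1000,      # conservative endurance multiplier
                       hours_per_week=40, weeks_per_year=48):
    """Estimate how many years of compiling an SSD can absorb."""
    write_capacity_mb = drive_capacity_mb * write_cycles
    mb_per_hour = total_mb * full_per_hour + executable_mb * partial_per_hour
    return write_capacity_mb / (mb_per_hour * hours_per_week * weeks_per_year)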
For example, say you have a 10MB app that takes up a total of 30MB of space, including a 4MB DLL. You're a maniac about global data, so you do a full compile every hour; over an eight-hour day, that's one per hour. You also do an incremental compile once every five minutes, or 12 per hour. Rough estimates? Sure, but any error is well within an order of magnitude.
You're still cheaping out on the SSD, so you only have a 240GB model. And while you may be cheap, you're also sensible enough to work reasonable hours: 40 per week, with four weeks of holidays (hence the 48 weeks in the formula). If you're working 60 hours per week on a regular basis you're either a freak of nature or you were absent from class when the professor explained the law of diminishing returns. In any case, feel free to adjust these numbers as you see fit.
240,000 * 1000
____________________________
((30 * 1) + (4 * 12)) * 40 * 48
The answer: 1,602.56 years!
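Or, plugging the example numbers into the Python sketch above:

print(ssd_lifespan_years(240_000, 30, 4, 1, 12))   # ~1602.56 years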
I don't think compiling on an SSD is much to worry about.
But if you really are concerned about minimizing disk wear, consider that big apps result in more writes because the executables are bigger. Similarly, having more than one procedure per module (a practice I'd really like to see abolished) means rewriting code for procedures that haven't changed.
Is there still a good reason to use a RAM drive for OBJs? Sure - RAM is still quite a bit faster than flash memory for both reading and writing, so you will gain some performance. And since OBJs are expendable, it won't matter that they go away the next time you reboot. No harm, no foul. But you don't need to worry about wearing out your SSD.
Disclaimer: I've checked my figures with reasonable care, but I make no guarantees about the durability and/or reliability of any hardware you may purchase, or the applicability of this formula to that hardware. You're still on your own. If you do find an error in my calculation, please let me know. And if you have an SSD, you probably have, or can obtain, software to monitor the lifespan of the drive.
For your convenience, here's a spreadsheet with the above formula: SSD Write Life.xlsx