- Mon May 03, 2021 9:04 am
Every technical item (including solid-state memory) has a so-called "MTBF" (short for "Mean Time Between Failures") value, which basically means that something may (and eventually will) break after a certain amount of use (or time).
With memory, in this instance, this means that a single bit is "guaranteed" (between quotation marks, because it is always an estimated value) to work for some X number of read/write cycles (in this case 10,000 write cycles), after which it will eventually fail, and in the meantime it may give unpredictable results (sometimes it works, sometimes it doesn't).
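To get a feel for what 10,000 write cycles means in practice, here's a hedged back-of-the-envelope estimate (the write rate is made up for illustration, not taken from any datasheet):

```python
# Illustrative only: how long a flash cell rated for 10,000 write cycles
# survives if the SAME cell is rewritten once every minute (e.g. naive
# logging to a fixed location, with no wear levelling at all).

RATED_CYCLES = 10_000        # endurance figure assumed from the post
WRITES_PER_DAY = 24 * 60     # one write per minute, always the same cell

days_until_wear_out = RATED_CYCLES / WRITES_PER_DAY
print(f"{days_until_wear_out:.1f} days")  # roughly a week
```

So a naive "write to the same spot every minute" scheme can wear out a cell in about a week, which is why the write strategy matters as much as the rating itself.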
The MTBF of a component depends on a number of factors: the quality of the component itself is one, but the strategy used for writing to memory can also have an influence.
"Intelligent" memory hardware/software will spread the usage across all of the memory (a technique known as wear levelling), so that it's not always the same piece of memory being written, but all memory locations are used evenly.
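The "spread the writes" idea can be sketched in a few lines. This is a minimal round-robin toy, not how a real flash translation layer works (real ones track per-block wear, relocate live data, and keep spare blocks); all names here are made up for illustration:

```python
# Toy wear levelling: rotate writes through all blocks so no single block
# accumulates wear faster than the others.

class WearLevelledStore:
    def __init__(self, num_blocks):
        self.num_blocks = num_blocks
        self.next_block = 0                      # rotates through all blocks
        self.writes_per_block = [0] * num_blocks

    def write(self, data):
        block = self.next_block
        self.writes_per_block[block] += 1        # wear accumulates here
        self.next_block = (block + 1) % self.num_blocks
        return block                             # where the data ended up

store = WearLevelledStore(num_blocks=4)
for i in range(8):
    store.write(f"record {i}")

# Each block absorbed an equal share of the 8 writes:
print(store.writes_per_block)  # [2, 2, 2, 2]
```

With perfect spreading like this, each block only sees 1/Nth of the total writes, so the whole device lasts N times longer than the single-cell case.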
Of course, if you have more memory at your disposal (say, a 4GB SD card), it's easier to stretch the usable lifetime of the component than when you have little available (say, 4MB of SPIFFS).
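That capacity advantage is easy to quantify: with ideal wear levelling, the total number of sector writes the device can absorb scales with the number of sectors. The figures below (10,000 cycles, 4KB sectors) are illustrative assumptions, not datasheet values:

```python
# Illustrative: total sector writes before wear-out, assuming ideal wear
# levelling spreads writes evenly over every sector of the device.

CYCLES_PER_SECTOR = 10_000
SECTOR_SIZE = 4096  # bytes; a common flash sector size (assumption)

def total_writes(capacity_bytes):
    return (capacity_bytes // SECTOR_SIZE) * CYCLES_PER_SECTOR

sd_card = total_writes(4 * 1024**3)   # 4 GB SD card
spiffs  = total_writes(4 * 1024**2)   # 4 MB on-board flash

print(sd_card // spiffs)  # the SD card endures 1024x as many sector writes
```

Same endurance per cell, but a thousand times more cells to spread the wear over.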
Another plus is that an SD card is cheap and easy to replace when it starts to fail: replacing the entire ESP (or just its on-board flash memory) will be more of a hassle.
You cannot "fix" failed memory by reflashing the code: you would first have to replace the memory chip itself.
Assumption is the mother of all f*ckups. At least: that's what I'm assuming.