How much does data weigh in flash memory?

An interesting article in the New York Times has been making the rounds of the internet lately. It talks about the tiny theoretical increase in weight of a Kindle when its memory is full as opposed to when it’s empty. Since I’ve previously written about the weight of data on a magnetic hard drive, I couldn’t resist taking a look at the equivalent effect for flash memory.

To begin with, we need to know a little about how flash memory works, and to do that, we need to know how transistors work. A transistor is just a tiny electrical switch. It has two contacts, the source and the drain, separated by a layer of material with an excess (or shortage) of electrons. Normally this configuration blocks any current from flowing between the source and the drain. But when the right kind of voltage is applied to a third contact, the gate, sitting above that middle layer, it cancels out the excess (or makes up the shortage) of electrons, allowing current to pass through. (For the record, I know I’m not doing justice to semiconductor physics here.)

As described in a pretty good article on Explain That Stuff!, and several other sources I’ve looked at, a flash memory cell is basically a transistor with an electron “trap,” called the floating gate, attached. When the transistor turns on, some of the electrons in the current tunnel through a thin insulating layer into the floating gate, where they stay put even after the power is removed. In the simplest scheme, the presence of these trapped electrons represents a bit set to 1, while their absence represents a bit set to 0.

Now, one of the simplest results of quantum mechanics is that a particle confined to a finite region has a nonzero minimum (ground state) energy. Take an electron in a 3D infinite square well, for example; its minimum energy is

$$E = \frac{\hbar^2\pi^2}{2m}\biggl(\frac{1}{L_x^2} + \frac{1}{L_y^2} + \frac{1}{L_z^2}\biggr)$$

A flash memory cell acts kind of like an infinite square well. Typical cell sizes are on the order of \(\SI{100}{\nano\meter}\) or less, so the ground state energy is going to be something like

$$E = \frac{\hbar^2\pi^2}{2m}\biggl(3\times\frac{1}{(\SI{100}{\nano\meter})^2}\biggr) \approx \SI{1e-23}{J}$$
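If you want to check that arithmetic, here’s a quick back-of-the-envelope script. It’s just my own sanity check, using the same assumed 100 nm cubic cell as above:

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
L = 100e-9               # assumed cell size, m (same in x, y, and z)

# E = (hbar^2 pi^2 / 2m) * (1/Lx^2 + 1/Ly^2 + 1/Lz^2) with Lx = Ly = Lz = L
E_ground = (hbar**2 * math.pi**2 / (2 * m_e)) * (3 / L**2)
print(f"Ground-state energy: {E_ground:.1e} J")   # about 1.8e-23 J
```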

But that’s just for one electron. Each cell contains \(\num{1e3}-\num{1e5}\) electrons when fully charged. Of course, we can’t just multiply the energy for a single electron by the number of electrons, because the Pauli exclusion principle bumps most of the electrons up to higher energy quantum states. To get the real total energy of a large number of electrons in a box, you need to integrate over the Fermi sphere. I’ll omit the details of the calculation and just quote the result:

$$E = \frac{3N\hbar^2}{10m}\biggl(\frac{3\pi^2 N}{V}\biggr)^{\frac{2}{3}}$$
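(For the curious, the omitted steps are just the standard free-electron-gas calculation: the spin-degenerate plane-wave states fill the Fermi sphere up to a radius \(k_F\), and summing the kinetic energy over those states gives the total.)

$$N = 2\,\frac{V}{(2\pi)^3}\cdot\frac{4}{3}\pi k_F^3 = \frac{V k_F^3}{3\pi^2} \qquad E = 2\,\frac{V}{(2\pi)^3}\int_0^{k_F}\frac{\hbar^2 k^2}{2m}\,4\pi k^2\,dk = \frac{V\hbar^2 k_F^5}{10\pi^2 m}$$

Eliminating \(k_F\) between these two expressions gives the formula above.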

For about thirty thousand electrons, using my earlier estimate of a \(\SI{100}{\nano\meter}\) cubic cell, I get \(E = \SI{1e-16}{J}\). This applies to a single cell, and thus a single bit, and it’s within an order of magnitude of the \(\SI{1e-15}{J}\) per bit quoted by Prof. Kubiatowicz in the New York Times article. Multiplying by the number of charged cells in a full Kindle, which I’ll take to be half of its 4 GB capacity, I get about \(\SI{200}{\nano\joule}\), and dividing by \(c^2\) converts that to an equivalent mass of \(\approx\SI{2e-21}{\gram}\). It’s a little off from the estimate quoted in the article, but then again, the math there doesn’t quite check out; in any case it’s within a few orders of magnitude, which is good enough for me considering how rough this whole calculation is.
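If you want to reproduce the whole chain of numbers, here’s a short script with the assumptions spelled out: the 100 nm cubic cell, thirty thousand electrons per charged cell, and \(\num{2e9}\) charged cells (half of 4 GB, counting one cell per byte, which is what the figures above work out to):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s

N = 3e4                  # electrons per fully charged cell (assumed)
V = (100e-9)**3          # assumed 100 nm cubic cell, m^3

# Total energy of N electrons filling the Fermi sphere in a box of volume V
E_cell = (3 * N * hbar**2 / (10 * m_e)) * (3 * math.pi**2 * N / V)**(2 / 3)
print(f"Energy per charged cell: {E_cell:.1e} J")        # ~1e-16 J

cells = 2e9              # half of 4 GB, counted one cell per byte (assumed)
E_total = E_cell * cells
print(f"Total energy: {E_total * 1e9:.0f} nJ")           # ~200 nJ
print(f"Mass equivalent: {E_total / c**2 * 1e3:.1e} g")  # ~2e-21 g
```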