The relationship between information, energy, and entropy is fascinating and would make for a good interview topic. It would be great to hear either Sean Carroll or Benjamin Schumacher on this topic.

As to the suggestion of weighing a disk drive with and without information recorded on the media, forget about detecting a difference. The Boltzmann constant relates energy to information by a factor on the order of 10^{-23} joules per bit per kelvin. So the data (recorded information) filling a terabyte drive at room temperature has energy on the order of 10^{-8} joules. Divide that number by the speed of light (3*10^{8} meters/second) squared to convert joules to kilograms, and you see that it is a very tiny mass, so any measurement will get lost in the noise.

But the issue is more subtle than that, because the real information we need to consider is the microstate information -- the specification of the position and momentum of each particle making up the drive. Even the unwritten bit domains on the medium have structure. Writing or erasing a bit changes the macrostate information, but the change in microstate information that this entails is not so readily quantifiable. To complicate matters further, when we write or erase data on a disk, we aren't changing the bits from unwritten to one or zero (or back to unwritten); we are overwriting the bits, flipping the value of each between 0 and 1. And a bit that is set to 0 contains the same amount of information as a bit that is set to 1 (whether or not the information is meaningful is a separate question, one of semantics).
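For anyone who wants to check the arithmetic, here is a back-of-the-envelope sketch. It assumes room temperature of about 300 K, a terabyte of 8*10^{12} bits, and the minimum kT ln 2 energy per bit; a real drive's thermodynamics would differ.

```python
import math

# Physical constants
k_B = 1.380649e-23   # Boltzmann constant, joules per kelvin
c = 2.998e8          # speed of light, meters per second

# Assumptions: room temperature, 1 TB of data
T = 300.0            # kelvin
bits = 8e12          # 1 terabyte = 8 * 10^12 bits

# Minimum energy associated with the recorded bits: N * kT * ln 2
energy_joules = bits * k_B * T * math.log(2)

# Mass equivalent via E = m c^2
mass_kg = energy_joules / c**2

print(f"energy: {energy_joules:.2e} J")   # on the order of 10^-8 J
print(f"mass:   {mass_kg:.2e} kg")        # on the order of 10^-25 kg
```

The result is a mass around 10^{-25} kilograms, roughly the mass of a couple hundred protons -- hopeless to weigh against a drive of several hundred grams.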

Claude Shannon was a fascinating individual. Fortune's Formula, by William Poundstone, is a good introduction to Shannon and his colleagues Ed Thorp and John Kelly, and to an interesting application of information theory.