Okay, I'm starting to think this article doesn't really know what it's talking about...
From the article: "For most of modern computing history, however, analog technology has been written off as an impractical alternative to digital processors. This is because analog systems rely on continuous physical signals to process information — for example, a voltage or electric current. These are much more difficult to control precisely than the two stable states (1 and 0) that digital computers have to work with."
1 and 0 are in fact representations of voltages in digital computers. On a standard IBM PC, the supply rails are typically 3.3V, 5V and 12V (plus negative versions of some of those); a 0 is represented by a voltage near zero, and a 1 by a voltage near the relevant rail, with the receiving logic treating anything above a defined threshold as a 1 and anything below a lower threshold as a 0. When you look at the actual voltage waveforms, they aren't really digital but analogue, with a transient as the voltage swings from 0 to 1 and back. It's not a clean square step but a slope that rises or falls through the threshold before settling at the nominal level. So a digital computer is, at the physical level, basically the same as what they're describing as an analogue computer.
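To make that concrete, here's a minimal sketch of what a "digital" 0-to-1 transition actually looks like. The numbers are purely illustrative assumptions of mine (a 3.3V rail, rough CMOS-ish input thresholds, an invented RC time constant), not anything from the article: the wire carries a continuous exponential rise, and it only becomes a 1 or a 0 once you threshold it.

```python
import math

# Illustrative numbers, not from the article: a 3.3V rail,
# with input thresholds roughly V_IL = 0.8V and V_IH = 2.0V.
V_RAIL = 3.3   # nominal "1" level (volts)
V_IL   = 0.8   # at or below this, the receiver reads a 0
V_IH   = 2.0   # at or above this, the receiver reads a 1
TAU_NS = 2.0   # assumed RC time constant of the line (nanoseconds)

def voltage(t_ns: float) -> float:
    """Analogue reality: an exponential rise toward the rail, not a square step."""
    return V_RAIL * (1.0 - math.exp(-t_ns / TAU_NS))

def logic_level(v: float) -> str:
    """Digital abstraction: threshold the continuous voltage into 0/1."""
    if v <= V_IL:
        return "0"
    if v >= V_IH:
        return "1"
    return "undefined"  # the forbidden region mid-transition

for t in range(0, 11):
    v = voltage(t)
    print(f"t = {t:2d} ns  V = {v:4.2f} V  reads as {logic_level(v)}")
```

The point being: the "digital" part is the thresholding convention, not the physics. Mid-transition the signal sits in an undefined region, and the wire itself only ever carries a continuous voltage.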
I'm sure there is something different and novel about this study, but the article doesn't seem to have a clue what that is.
