There is a reason I liken computations to evanescent spirits. They are called into being with obscure incantations, spin filaments of data into complex webs of information, and then vanish with little trace of their ephemeral calculations. Of course, there are observable side effects, say when a computation activates a physical device or causes output to be sent to a printer, displayed on a screen, or stored in a file. But the computations themselves are difficult to observe in much the way your thoughts are difficult to observe.
I'm exaggerating somewhat; after all, unlike our brains, computers are designed by human beings. We can trace every signal and change that occurs in a computer. The problem is that a complete description of all those signals and changes does not help much in understanding what's going on during a complex computation; in particular, it doesn't help us figure out what went wrong when the results of a computation don't match our expectations.
An abstraction is a way of thinking about problems or complex phenomena that ignores some aspects in order to simplify thinking about others. The idea of a digital computer is itself an abstraction. The computers we work with are electrical devices whose circuits propagate continuously varying signals; they don't operate with discrete integers 0, 1, 2, …, or for that matter with binary digits 0 and 1.
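The digital abstraction described above can be sketched in a few lines of code. This is only an illustration, not anything from the text itself: the function name, voltage range, and threshold value are all invented for the example. The point is that a continuously varying signal is deliberately viewed as one of just two discrete values, ignoring the analog detail underneath.

```python
def to_bit(voltage, threshold=1.5):
    """Abstract a continuous voltage (here, an assumed 0.0-3.3 V range)
    into a binary digit by comparing it to a threshold."""
    return 1 if voltage >= threshold else 0

# A noisy analog waveform, sampled over time...
samples = [0.1, 0.3, 3.1, 2.9, 0.2, 3.3]

# ...is seen at the digital level as a clean sequence of bits.
bits = [to_bit(v) for v in samples]
# bits == [0, 0, 1, 1, 0, 1]
```

Real circuits are messier, of course (noise margins, forbidden middle regions, timing), but that is exactly what the abstraction lets us ignore.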