The root of this difference is that voltages in a circuit (so-called analog signals) are continuous, whereas the sequence of numbers with which we represent them in a computer is discrete, both in time and in value. That is, by sampling the analog signal at regular intervals we ignore any changes that may take place between sampling instants, and by representing each value with a number that has only a limited number of digits we introduce an error between the actual value and its representation.
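The two effects described above can be sketched in a few lines of code. The following is a minimal illustration, not taken from the text: a hypothetical helper `sample_and_quantize` evaluates a continuous-time signal only at regular sampling instants (discretization in time) and rounds each sample to the nearest of a limited set of levels (discretization in value), so the difference between the true value and the stored one is the quantization error.

```python
import math

def sample_and_quantize(signal, fs, duration, bits):
    """Sample a continuous-time signal at rate fs (Hz) for the given
    duration (s), quantizing each sample to `bits` bits.
    Signal values are assumed to lie in [-1, 1]."""
    n_levels = 2 ** bits
    step = 2.0 / n_levels                 # spacing between quantization levels
    samples = []
    for n in range(int(duration * fs)):
        t = n / fs                        # sampling instant: signal is only
        x = signal(t)                     # observed at these discrete times
        q = round(x / step) * step        # round to the nearest level
        samples.append(q)
    return samples

# Example: a 5 Hz sine sampled at 100 Hz with 4-bit resolution
sig = lambda t: math.sin(2 * math.pi * 5 * t)
quantized = sample_and_quantize(sig, fs=100, duration=0.1, bits=4)

# The error at each sampling instant is at most half a quantization step
errors = [abs(sig(n / 100) - q) for n, q in enumerate(quantized)]
```

With round-to-nearest quantization the error at each retained sample is bounded by half the level spacing; what happens to the signal *between* sampling instants is simply not recorded at all.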