Difference Between Analog and Digital Signals

Main Difference – Analog vs. Digital Signals

Analog and digital are two forms in which signals can be transmitted. The main difference between analog and digital signals is that an analog signal can take any value within a given range, whereas a digital signal can only take one of a discrete set of values.

What are Analog Signals

Analog signals are signals that can take any value on a continuous scale. There may be a range of values that the signal can take, but within this range, the signal can represent any value. An analog signal varies continuously and smoothly with time as the quantity it records changes its value.

As an example of an analog signal, consider a vinyl record. When music is played in a studio, microphones pick up variations in air pressure due to the sound and convert these changes in air pressure into a varying voltage in an electrical circuit. The voltage varies continuously as the sound varies. The electric circuit is connected to a needle, which moves according to the voltage. As the needle moves, it cuts grooves into a lacquer disc. Later, these grooves are transferred onto a vinyl disc. The variation in the grooves is continuous, and these variations correspond to the continuous variations of the original sound. When music is played back from a vinyl disc, a needle in the player moves along the grooves and converts its movements into a continuous electric signal. This signal can be conveyed to a speaker, which makes its membrane move back and forth according to the signal it receives.


A magnification of a vinyl record showing continuously-varying grooves, which are capable of producing a continuous signal.

Since analog signals vary continuously with time, they are said to have an infinite resolution. That is, an analog signal can transmit a change that occurs in an infinitesimally small time period. However, noise can still be introduced into an analog signal, and it degrades the quality of the signal over time.

What are Digital Signals

In a digital signal, the signal can only take one of a set of discrete values. The signal itself is also discontinuous, changing its value at intervals of time. Personal computers are good examples of devices that use digital signals. Since computers communicate using “bits” of 1s and 0s, and since only a finite number of bits can be processed in a given time, a computer cannot handle a continuous signal. Instead, the signal has to be “broken down” into a digital form. This involves first sampling the analog signal at different points in time. Then the signal is quantized: for each interval of time, the signal is assigned an approximate, discrete value that represents the original signal. The time intervals involved are usually so small that we cannot notice the difference (a song played or a video watched on a computer seems continuous!).
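To make these two steps concrete, here is a minimal Python sketch of the idea. The function name sample_and_quantize is hypothetical, the “analog” input is a simple 1 kHz sine tone, and sample values are assumed to lie between -1 and 1; a real analog-to-digital converter does this in hardware, but the steps are the same.

import math

def sample_and_quantize(signal, duration, sample_rate, bits):
    levels = 2 ** bits                   # number of discrete values available
    n_samples = int(duration * sample_rate)
    digital = []
    for i in range(n_samples):
        t = i / sample_rate              # sampling: read the signal at fixed time intervals
        x = signal(t)                    # the analog value, assumed to lie in [-1, 1]
        # quantization: map [-1, 1] onto the nearest of the discrete levels
        digital.append(round((x + 1) / 2 * (levels - 1)))
    return digital

# A 1 kHz sine tone stands in for the analog signal.
analog = lambda t: math.sin(2 * math.pi * 1000 * t)
print(sample_and_quantize(analog, duration=0.001, sample_rate=8000, bits=3))
# Eight samples, each one of the 8 (= 2 ** 3) possible values.

Raising the sample rate captures faster changes in the signal; raising the number of bits captures the value of each sample more precisely.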

The larger the discrete set of values that the digital signal can take, the closer the signal will be to its original, analog form. The term resolution indicates how many discrete values a signal can be broken down into. For example, a 1-bit conversion can only take two values: 0 or 1. With a 2-bit conversion, the signal can take 4 different values (00, 01, 10, 11). In general, a digital signal using n bits can take 2^n different values, so the number of values doubles with every bit added. The larger the number of bits used, the better the resolution.
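This doubling is easy to verify with a short Python loop that prints the number of levels for bit depths from 1 to 8:

for bits in range(1, 9):
    print(f"{bits}-bit conversion: {2 ** bits} levels")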


Converting the continuous analog signal (red) to a discrete digital signal (blue). On the left, the conversion has been done using 2 bits, thereby creating 4 different levels that the digital signal could take. On the right, 3 bits are used. Therefore, the signal can be represented by 8 different levels. This signal has a higher resolution and is “closer” to the original analog signal.

The image below shows a magnified view of the surface of a compact disc (CD). On a CD, data is recorded as a series of pits and bumps. Each pit or bump corresponds to a 0 or a 1, so the signal produced as a CD is read is a digital one. Compare these discrete variations on the CD with the continuous variations found in the grooves of a vinyl disc (above).


Pits and bumps on a CD surface (magnified using an atomic force microscope)

Over time, a digital signal may also acquire noise. However, since a digital signal only needs to distinguish between a few discrete levels, it is much easier to separate out the noise, using a process known as regeneration.
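The idea behind regeneration can be sketched in a few lines of Python. The function name regenerate is hypothetical, and the sketch assumes a binary signal whose levels are nominally 0 and 1, with noise too small to push any level across the decision threshold.

import random

def regenerate(noisy_samples, threshold=0.5):
    # Re-decide each sample against the threshold to recover clean bits.
    return [1 if sample > threshold else 0 for sample in noisy_samples]

clean = [1, 0, 1, 1, 0, 0, 1, 0]
# Noise shifts each level slightly, but never across the 0.5 threshold.
noisy = [bit + random.uniform(-0.4, 0.4) for bit in clean]
print(regenerate(noisy) == clean)   # True: the original bits are recovered exactly

An analog signal offers no such threshold: every value is a legitimate signal value, so the noise cannot be told apart from the signal itself.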

Difference Between Analog and Digital Signals

Nature of Signal

An analog signal can take any value in a given range.

A digital signal can only take one of a discrete set of values.

Resolution

Analog signals have an infinite resolution.

Digital signals have a finite resolution, which depends on the number of bits used to convey data.

Removing Noise

It is difficult to remove noise in analog signals. Noise may build up over time.

In digital signals, it is much easier to remove noise.

Image Courtesy

“A macro shot of record grooves, with variation clearly visible.” by Shane Gavin (Own work) [CC BY 2.0], via Wikimedia Commons

“2-bit resolution with four levels of quantization…” by Hyacinth (Own work) [CC BY-SA 3.0], via Wikimedia Commons (Modified)

“3-bit resolution with eight levels of quantization…” by Hyacinth (Own work) [CC BY-SA 3.0], via Wikimedia Commons (Modified)

“micrograph of a CD-ROM made with an atomic force microscope (deflection mode)” by freiermensch (Own work) [CC BY-SA 3.0], via Wikimedia Commons
