# If the highest frequency in a song is 10 kHz and it is encoded at 16 bits per sample, what is the minimum number of bytes needed to encode the 3-minute song?


The key to this question is to remember the Nyquist rate of a signal. This is the lowest sample rate that can be used for a signal without losing valid frequencies or introducing spurious ones (aliasing). It is equal to twice the highest frequency in the signal.

Therefore the sample rate needs to be 20 kHz. Since there are 16 bits per sample, the number of bits per second is 16 × 20 000 = 320 000 bits per second.

To calculate the number of bits in 3 minutes, multiply 320 000 by the number of seconds in 3 minutes, which gives:
320 000 × 3 × 60 = 57 600 000 bits

Remember to divide by 8 to convert to bytes, since there are 8 bits in a byte. This finally gives:
57 600 000 / 8 = 7 200 000 bytes = 7.2 megabytes (taking 1 megabyte = 10^6 bytes)
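The steps above can be sketched as a short calculation (the variable names here are my own, chosen for illustration):

```python
highest_freq_hz = 10_000   # highest frequency in the song (10 kHz)
bits_per_sample = 16
duration_s = 3 * 60        # 3 minutes expressed in seconds

# Nyquist rate: sample at least twice the highest frequency.
sample_rate_hz = 2 * highest_freq_hz            # 20 000 samples per second

bit_rate = sample_rate_hz * bits_per_sample     # 320 000 bits per second
total_bits = bit_rate * duration_s              # 57 600 000 bits
total_bytes = total_bits // 8                   # 7 200 000 bytes

print(total_bytes)  # 7200000
```

Note this is the minimum for a single (mono) channel with no compression; stereo audio would double the figure.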
