Take any electrical signal, and look at it on an oscilloscope. Or look at the stock price of your favourite company, or the league position of your favourite football team, and watch how it changes over time. You're stuck in the 'time domain', and jolly confusing it can be too. Fortunately, a guy named Jean Baptiste Joseph Fourier (French, 1768 - 1830) figured out another way of looking at these things - what could be called the 'Fourier domain'. While perhaps equally confusing, the Fourier domain is confusing in a different way, so the same signal can show new subtleties and patterns.
The most common way to run into this is in sound. Something makes a sound by causing compression waves in the air - alternate bands of slightly more and slightly less compressed areas. When these bands hit your ear drum, it vibrates, and the brain picks this up and interprets it as music1. So far, so good. The problem comes when there are two or more sources of noise in a room, producing waves of different frequencies. And to make matters worse, some sources of sound produce many differing frequencies all on their own.
When these waves overlap, they interfere with one another, and the more waves of differing frequencies are added together, the harder it is to work out what the frequencies and amplitudes of the original waves were. The Fourier domain is the key to getting the original notes back from the general mishmash.
So What Exactly is the Fourier Domain?
Well, in the time domain you look at the value of something as it changes over time - a series of snapshots, if you will. In the Fourier domain you look at the entire lifetime of the signal all at once - and analyse it in terms of the underlying frequencies that made it up. This means you can no longer see the value at any one time, nor the rate at which the signal is changing at any one time. Instead, for each possible frequency, you see the amplitude of the signal at that frequency.
There are a whole bunch of related Fourier words and phrases - which all mean similar things. A 'Fourier transform' is what you see when you view a signal in the Fourier domain. 'Fourier transformation' is the process of getting a Fourier transform. Doing things 'in the Fourier domain' means doing them using Fourier transforms, rather than using signals directly. 'Fourier analysis' means finding Fourier transforms using algebra and integrals and suchlike.
Sometimes, a Fourier transform consists of a (possibly infinite) number of discontinuous spikes at specific frequencies, rather than an amplitude which changes smoothly as frequency changes - this occurs whenever a signal continuously repeats exactly the same pattern - like a stuck record player. In this case the Fourier transform can be written as a sum of cosines and sines - and this (possibly infinite) sum is a 'Fourier series'.
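For those who like to see the machinery, one common way of writing such a Fourier series for a signal f(t) that repeats every T seconds is:

```latex
f(t) = \frac{a_0}{2}
     + \sum_{n=1}^{\infty} \left[ a_n \cos\!\left(\frac{2\pi n t}{T}\right)
                                + b_n \sin\!\left(\frac{2\pi n t}{T}\right) \right]
\quad\text{where}\quad
a_n = \frac{2}{T} \int_{0}^{T} f(t) \cos\!\left(\frac{2\pi n t}{T}\right) dt,
\qquad
b_n = \frac{2}{T} \int_{0}^{T} f(t) \sin\!\left(\frac{2\pi n t}{T}\right) dt
```

Each pair of coefficients gives the amplitude of one of those discontinuous spikes.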
'Discrete Fourier Transforms', or DFTs, are very much like the normal ones, but instead of generating amplitudes at all points, they generate them only at regular intervals. You might think that these are less useful than the normal ones, and you'd be right, but for one slight point: they are a heck of a lot easier to calculate. Aside from the loss of detail, and being easier to find, DFTs behave exactly like normal Fourier transforms.
How to Calculate Fourier Transforms
To calculate continuous Fourier transforms, you need to do a number of fairly tricky integrals, involving complex exponentials2 multiplied by the original signal. This is recommended only for masochists and those who don't mind seeing far too many letter 'e's.
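For the record, in one common convention, the integral in question looks like this (with the inverse transform for getting back again):

```latex
F(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt
\qquad\text{and}\qquad
f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} F(\omega)\, e^{i\omega t}\, d\omega
```

Note the promised surplus of letter 'e's.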
To calculate discrete Fourier transforms, you need to feed your signal into a computer, and sit back while it churns away. The way the computer works is that to find each point in the output it adds together a large number of terms, calculated by sampling the original signal and multiplying it by various magic3 numbers. This would take people a long time, but it is the sort of mindless repetitive task computers do well.
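A minimal sketch of that mindless repetitive task in Python - the 'magic' numbers being complex exponentials, one per combination of output bin and input sample:

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform: each output bin is a sum over
    every input sample multiplied by a complex exponential."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A cosine that completes exactly two cycles over eight samples...
signal = [math.cos(2 * math.pi * 2 * t / 8) for t in range(8)]
spectrum = dft(signal)
# ...puts all its energy in bin 2 (and its mirror image, bin 6).
```

Notice the two nested loops: for n samples, that is roughly n squared multiplications, which is precisely what the 'fast' version below the next paragraph avoids.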
The 'fast' Fourier transform is a clever algorithm that takes advantage of the fact that many of the terms are duplicates of one another, so they need only be calculated once. A diagram of the way these various stored results flow through the computer looks rather like a butterfly, so it has been named the 'butterfly algorithm'. Some people think it's very pretty, though the consensus is that they should get out more.
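A sketch of the divide-and-conquer trick, assuming (as the classic radix-2 version does) that the number of samples is a power of two:

```python
import cmath

def fft(samples):
    """Radix-2 fast Fourier transform (length must be a power of two):
    split into even- and odd-indexed halves, transform each half, then
    knit the results together with 'butterfly' sums and differences."""
    n = len(samples)
    if n == 1:
        return list(samples)
    evens = fft(samples[0::2])
    odds = fft(samples[1::2])
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odds[k]
        out[k] = evens[k] + twiddle            # top wing of the butterfly
        out[k + n // 2] = evens[k] - twiddle   # bottom wing
    return out
```

Each half-size transform is computed once and reused for two output bins, which is where the speed comes from.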
Curiously, the brain somehow performs a form of Fourier analysis remarkably effectively. 'Tuning in' on different voices, recognising the difference between high and low notes, and so on - all that stuff requires some sort of Fourier analysis. Of course, this being the inner workings of the brain, scientists aren't entirely certain how it does it. And they don't have a clue how it does it so fast!
What's the Point?
One thing we do know is that both the ear and the brain throw away some of the information in music, and this seems to be largely based on frequencies. By doing Fourier analysis, we can find out what frequencies make up a piece of music, and strip out those parts which will be inaudible. That means smaller music files - and more music on each CD or hard disk. It's one of the techniques used by MP3.
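A crude sketch of the idea - nothing like MP3's real psychoacoustic model, just throwing away any frequency bin that is nearly silent compared with the loudest one:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(coeffs):
    """Inverse transform: back from frequencies to samples."""
    n = len(coeffs)
    return [sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

# A loud low note mixed with a very quiet high one.
n = 64
signal = [math.sin(2 * math.pi * 3 * t / n)
          + 0.01 * math.sin(2 * math.pi * 20 * t / n)
          for t in range(n)]

# 'Compress' by discarding every frequency bin that is nearly silent.
spectrum = dft(signal)
loudest = max(abs(c) for c in spectrum)
kept = [c if abs(c) > 0.05 * loudest else 0 for c in spectrum]

# Only a couple of bins survive, yet the restored signal is almost
# indistinguishable from the loud note on its own.
restored = [z.real for z in idft(kept)]
```

Real codecs are far cleverer about deciding which bins the ear will miss, but the shape of the trick - transform, discard, transform back - is the same.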
One of the main reasons for doing it with computers and suchlike, rather than simply in our subconscious, is because once you are in the Fourier domain, a lot of complicated things become very easy. For example, if you want to 'differentiate a function,' then one way is to convert it to the Fourier domain, multiply it by a simple expression, and then convert it back again. 'Convolution'4 becomes just multiplication in the Fourier domain. Multiplication by exponentials just becomes a shift - and so on. So, if you need to do a number of these kinds of operations, it's simpler to just convert to the Fourier domain at the start of the whole thing and convert back only at the end.
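The convolution shortcut can be checked directly: circularly convolving two short sequences the slow way gives the same answer as multiplying their transforms point by point and transforming back:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(coeffs):
    """Inverse transform: back from frequencies to samples."""
    n = len(coeffs)
    return [sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]

a = [1, 2, 3, 0]
b = [4, 5, 0, 0]
n = len(a)

# The slow way: circular convolution, summing shifted products directly.
direct = [sum(a[i] * b[(t - i) % n] for i in range(n)) for t in range(n)]

# The Fourier way: transform both, multiply point by point, transform back.
via_fourier = [z.real for z in idft([p * q for p, q in zip(dft(a), dft(b))])]
```

For four samples the slow way is quicker, but for thousands of samples the Fourier route (using the fast transform) wins handsomely.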
Bring in the Dancing Mathematicians
Mathematicians and other theoreticians like Fourier series because they are, in three distinct senses, the best way of representing functions as frequencies and amplitudes. Firstly, when you add together the infinite sum of sines and cosines, the result is the original function (except in a few cases where this is impossible). Secondly, when you take only the first, say, ten terms, they are as accurate as you can get with only ten terms. Thirdly, if you choose some number of terms so that the function gives a perfect answer at as many points as possible, and then let the number of points tend to infinity, you get the Fourier series. Fairly dull stuff for a non-mathematician, but very useful if you know how to apply that knowledge.
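That second property can be watched in action with a square wave, whose Fourier series is the well-known sum of odd sine harmonics - taking more terms steadily shrinks the mean squared error of the fit:

```python
import math

def square_partial(x, terms):
    """Partial Fourier series of a square wave flipping between -1 and +1."""
    return (4 / math.pi) * sum(math.sin((2 * k + 1) * x) / (2 * k + 1)
                               for k in range(terms))

def square(x):
    """The square wave itself."""
    return 1.0 if math.sin(x) > 0 else -1.0

# Sample one full period at 400 points and measure the fit.
xs = [i * 2 * math.pi / 400 for i in range(400)]

def mean_squared_error(terms):
    return sum((square_partial(x, terms) - square(x)) ** 2 for x in xs) / len(xs)
```

Adding terms never makes the approximation worse, though the overshoot near the jumps (the 'Gibbs phenomenon') stubbornly refuses to vanish entirely.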
Just as an example, work by these theorists has given us the handy Nyquist rate - the rate (twice the highest frequency present) at which you need to sample the instantaneous value of some band-limited function, in order to reconstruct it perfectly from only the samples. In other words, if you have a piece of music which only contains frequencies a human can hear, then sampling the music at a specific, regular interval will give you all the information you need to play it back flawlessly. It is this theory which ensures that the music recorded on a CD should be 'perfect' - any distortion must have come from some other source, such as the speakers.
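Sample too slowly and you get 'aliasing', which a tiny numerical experiment makes plain - seven cycles per window, sampled only ten times, is indistinguishable from three cycles upside down:

```python
import math

n = 10  # samples per window: only frequencies below 5 cycles are safe

# Seven cycles per window, sampled just ten times...
fast = [math.sin(2 * math.pi * 7 * t / n) for t in range(n)]
# ...yields exactly the same samples as three cycles, flipped.
impostor = [-math.sin(2 * math.pi * 3 * t / n) for t in range(n)]

# Sample twice as often (now above the Nyquist rate for seven cycles)
# and the disguise fails.
fine_fast = [math.sin(2 * math.pi * 7 * t / 20) for t in range(20)]
fine_impostor = [-math.sin(2 * math.pi * 3 * t / 20) for t in range(20)]
```

This is why digital recordings filter out everything above half the sampling rate before sampling: otherwise inaudibly high frequencies would masquerade as audible ones.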