The "middlemost" of a (finite) set of numbers. For instance, the median of

1, 3, 4, 7, 8, 99, 12345
is 7 -- 3 numbers are smaller than it, and 3 are larger.

If there's an even number of numbers, there is no single middle element, so the definition breaks down. But the 2 numbers closest to the centre are obvious candidates: either take one of them, or take their average (the usual convention). For instance, if 8 above were missing, the median could be 4, 7 or 5.5.
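
For concreteness, here's a minimal sketch of this definition in Python (the function name is mine); it sorts the numbers and, for an even count, averages the two central values:

    def median(xs):
        """Median of a finite, non-empty list of numbers."""
        s = sorted(xs)
        n = len(s)
        mid = n // 2
        if n % 2 == 1:
            return s[mid]                    # odd count: the unique middle element
        return (s[mid - 1] + s[mid]) / 2     # even count: average the central pair

    print(median([1, 3, 4, 7, 8, 99, 12345]))   # 7
    print(median([1, 3, 4, 7, 99, 12345]))      # 5.5

Sorting costs O(n log n) time; a faster method is described below.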

The median is preferred to the mean when order matters more than magnitude. For instance, the mean wage tells you very little -- a few people earn astronomical sums, and nobody earns a negative salary, so the distribution is skewed and most people earn less than the mean wage. The median wage is a much more informative statistic. This is why the poverty line and the minimum wage (where applicable) are typically defined as a fraction of the median wage, not the mean wage.

On the other hand, my median writeup reputation is a paltry +3. Using the median here discards the impossibly high reputations of one's best nodes: since one cannot have many writeups with very negative reputations, the distribution is skewed upwards, and the mean writeup reputation comes out much bigger.

To find a median in a computer program, use a variant of quicksort (known as quickselect, or Hoare's selection algorithm) that keeps track of the index where it expects to find the median, and recurses only into the side of the partition that contains the median. This is easier done than said, and in practice is a nice exercise to program. It runs in expected time O(n) word operations, where n is the size of the array.
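
Here is a sketch of that idea in Python; this version copies the partitions rather than partitioning in place, which keeps it short, and k is the 0-based rank of the element sought:

    import random

    def quickselect(xs, k):
        """Return the k-th smallest (0-based) element of xs."""
        pivot = random.choice(xs)
        smaller = [x for x in xs if x < pivot]
        equal = [x for x in xs if x == pivot]
        if k < len(smaller):
            return quickselect(smaller, k)
        if k < len(smaller) + len(equal):
            return pivot
        larger = [x for x in xs if x > pivot]
        return quickselect(larger, k - len(smaller) - len(equal))

    xs = [1, 3, 4, 7, 8, 99, 12345]
    print(quickselect(xs, len(xs) // 2))   # 7, the median

Choosing the pivot at random is what gives the expected O(n) bound; an in-place version partitions the array exactly as quicksort does, then recurses into one side only.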

A more complex version (the median of medians algorithm of Blum, Floyd, Pratt, Rivest and Tarjan) guarantees worst case time O(n) word operations. It's "better" in the theoretical sense, but it's probably less work to run the quickselect-type algorithm; shuffle the array beforehand (using the Fisher-Yates shuffle) if you're worried about superlinear times popping up.
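
For reference, a sketch of the Fisher-Yates shuffle in Python (in practice random.shuffle does the same job):

    import random

    def fisher_yates(xs):
        """Shuffle xs in place; every permutation is equally likely."""
        for i in range(len(xs) - 1, 0, -1):
            j = random.randint(0, i)    # uniform over 0..i inclusive
            xs[i], xs[j] = xs[j], xs[i]

(The quickselect sketch above sidesteps the issue by choosing its pivot at random, which has the same effect.)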