Northwestern University professor Uri Wilensky likes to present his students with an interesting question:

Imagine a room with 50 people, each holding 50 dollars. What would happen if, every minute, each person had to give a dollar to another randomly chosen person in the room?

[Interactive simulation: how many dollars does each person have, ranked from richest to poorest, with a live round counter and standard deviation.]

For many, the intuitive answer is that the money will end up split roughly equally, since every minute each person has an equal chance of receiving a dollar. Some even expect a push toward stricter-than-average equality: when a person runs out of money, he or she stops paying (there is no reserve to pay from) but remains eligible to receive money.

However, the reality is different. In each round of exchange, a random subset of people happens to receive several dollars while another random subset receives none. Even after many rounds, the distribution settles into a large group keeping about the same money they had at the beginning, a small group that has gained a lot of money, and a small group that has lost a lot of money.
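The game is easy to try for yourself. Below is a minimal sketch of one way to run it, assuming payments within a round happen simultaneously and that broke players skip paying but can still receive; the function name `simulate` and the parameter defaults are ours, not from the original post.

```python
import random

def simulate(people=50, start=50, rounds=5000, seed=0):
    """Run the dollar-exchange game: each round, every person who
    still has money gives one dollar to another randomly chosen
    person. Broke players pay nothing but remain eligible to receive."""
    rng = random.Random(seed)
    wealth = [start] * people
    for _ in range(rounds):
        # Decide all payments first so the round is simultaneous.
        givers = [i for i in range(people) if wealth[i] > 0]
        receivers = [rng.choice([j for j in range(people) if j != i])
                     for i in givers]
        for i in givers:
            wealth[i] -= 1
        for j in receivers:
            wealth[j] += 1
    # Ranked from richest to poorest, as in the chart above.
    return sorted(wealth, reverse=True)

final = simulate()
print(final[:5], final[-5:])  # a few of the richest and poorest players
```

Note that the total amount of money is conserved (50 people x 50 dollars = 2,500 dollars in every round); only its distribution changes, and after a few thousand rounds the gap between the top and bottom of the ranking is large.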

Originally posted by Snip. Read the mathematical explanation of why this happens here.