Suppose you had $1, and I agreed to pay you 100% interest for it at the end of the year. Then in a year's time you would have $2.

Now suppose that instead of 100% interest once, I offered you 50% every six months (i.e. twice a year). Then in six months you would have $1.50, and by the end of the year the amount would be $1.50 plus 50% of $1.50, which is $2.25. Hey, you're a quarter better off!


"Hmm...", you think to yourself. What would happen if the percentage were spread thinner? Suppose you took 25% every three months. Then after three months you would have $1.25, then $1.25 plus 25% of $1.25, and so on, and at year's end you would have about $2.44. That's even better.
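The quarterly calculation above can be checked step by step (a quick Python sketch, nothing more):

```python
amount = 1.0
for quarter in range(4):  # four payments of 25% each
    amount *= 1.25        # add 25% interest to the running total
    print(f"after quarter {quarter + 1}: ${amount:.4f}")
# the running amounts are 1.25, 1.5625, 1.9531..., and finally 2.4414...
```

The final amount, $2.4414..., rounds to the $2.44 mentioned above.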

Now you're wondering whether finer and finer resolutions of payments would lead to indefinitely larger amounts. So you try to get a general form for the amount of money you would end up with. If you started off with $1, and you spread the 100% interest over n intervals, then each successive amount would be 1+(1/n) times the amount preceding it. [Check: For six-month payments, there are two intervals, so n=2. Then the amount would increase by a factor of 1+(1/2), which is 1.5, every six months. Yep, makes sense. What about three-month payments? n=4, so the factor is 1+(1/4)=1.25. Seems OK.]

OK, what do you do with this number? Well, your money is compounded by that factor n times. So, if you start off with $1, and you spread 100% interest over n payments, at the end of the year you would have (1+(1/n))^n dollars.
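That formula translates directly into a one-line function (the name is just for illustration):

```python
def year_end_amount(n):
    """Dollars at year's end: $1 at 100% interest, compounded over n equal payments."""
    return (1 + 1 / n) ** n

print(year_end_amount(1))  # once a year:      2.0
print(year_end_amount(2))  # every six months: 2.25
print(year_end_amount(4))  # every quarter:    2.44140625
```

These match the amounts worked out by hand above.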


Great, now you start plugging in larger and larger values for n. But darn it, it doesn't seem to be increasing that much when n gets big. In fact, it seems to be levelling off. What is that number...?
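You can watch the levelling off happen by trying a few increasingly large values of n:

```python
for n in (1, 10, 100, 1000, 1_000_000):
    # the year-end amount with interest spread over n payments
    print(f"n = {n:>9}: {(1 + 1 / n) ** n:.6f}")
# the values climb quickly at first, then barely budge past 2.718...
```

However finely the payments are sliced, the amount creeps up toward the same value and goes no further.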