E_n = -\frac{m}{2\hbar^2} \left( \frac{e^2}{4\pi\varepsilon_0} \right)^2 \frac{1}{n^2}, \qquad n = 1, 2, 3, \ldots

This equation describes the allowed energies of a single electron bound to a hydrogen nucleus (i.e., a proton). It was derived by Niels Bohr in 1913, using a mix of classical and then-nascent quantum physics, and it was one of the results that really kicked off the development of quantum mechanics.

Bohr found this equation by assuming that the angular momentum of an electron is quantized; in other words, it can only take specific values (integer multiples of ħ), and nothing in between. From this restriction, he could work out which 'orbits' an electron would be allowed to inhabit around the nucleus, and what the energies of those orbits should be.
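For the curious, here is a minimal sketch of that argument, in the standard textbook reconstruction (assuming a circular orbit of radius r and speed v, rather than Bohr's original wording):

    \frac{m v^2}{r} = \frac{e^2}{4\pi\varepsilon_0 r^2}
        \qquad\text{(Coulomb attraction supplies the centripetal force)}

    m v r = n\hbar
        \qquad\text{(quantization of angular momentum)}

    r_n = \frac{4\pi\varepsilon_0 \hbar^2}{m e^2}\, n^2
        \qquad\text{(solve the two equations above for } r\text{)}

    E_n = \frac{1}{2} m v^2 - \frac{e^2}{4\pi\varepsilon_0 r_n}
        = -\frac{m}{2\hbar^2}\left(\frac{e^2}{4\pi\varepsilon_0}\right)^2 \frac{1}{n^2}
        \qquad\text{(kinetic plus potential energy at } r_n\text{)}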

While the equation is completely correct according to modern quantum theory, Bohr's derivation has several faults (completely understandable: things like the Schrödinger equation did not arrive until 1926, more than a decade later). Electrons do not orbit nuclei in any meaningful sense; they have a probability distribution that keeps them localized near the nucleus whenever they lack sufficient energy to escape.

Bohr's formula also works only for atoms or ions with a single electron (hydrogen, singly ionized helium, and so on). With any more electrons, there was no way to patch the formula to give answers consistent with observation.

A modern derivation of this formula starts from the three-dimensional Schrödinger equation together with the potential energy for the electric attraction between electron and proton. From there it is a matter of solving a few differential equations, with plenty of substitutions, simplifications, and clever math tricks along the way.
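For reference, the starting point of that modern derivation is the time-independent Schrödinger equation with the Coulomb potential; separating it in spherical coordinates and demanding normalizable solutions is what produces the integer n and the energies above:

    \left[ -\frac{\hbar^2}{2m}\nabla^2 - \frac{e^2}{4\pi\varepsilon_0 r} \right] \psi(\mathbf{r}) = E\,\psi(\mathbf{r})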

Of course, the benefit of the modern approach is that it can be applied, at least in principle, to any atom.

Oh, the mess of constants in front of the 1/n² term turns out to be roughly 13.6 eV.
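As a quick sanity check, here is a short Python sketch (not part of any derivation, just plugging CODATA values into the formula) that evaluates that prefactor and the first few allowed energies:

    import math

    # Physical constants (SI units, CODATA values)
    m_e  = 9.1093837015e-31   # electron mass, kg
    e    = 1.602176634e-19    # elementary charge, C
    hbar = 1.054571817e-34    # reduced Planck constant, J*s
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m

    # Prefactor of the Bohr formula: (m / 2*hbar^2) * (e^2 / (4*pi*eps0))^2, in joules
    prefactor_J = (m_e / (2 * hbar**2)) * (e**2 / (4 * math.pi * eps0))**2
    prefactor_eV = prefactor_J / e   # convert joules to electron-volts

    print(f"Prefactor: {prefactor_eV:.2f} eV")

    # Allowed energies for the first few n
    for n in range(1, 4):
        print(f"E_{n} = {-prefactor_eV / n**2:.2f} eV")

Running it prints a prefactor of about 13.61 eV, giving E_1 ≈ -13.6 eV, E_2 ≈ -3.4 eV, and E_3 ≈ -1.5 eV.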