The biggest problem you'd have with this project is keeping your monkeys fed.

We can safely ignore the efficiency of an actual monkey versus a computer simulation of one, as this is simply a constant factor (The ratio of energy used by computer monkeys to that used by real monkeys does not depend on how many monkeys there are).

What we need to worry about are these three facts (the sketch after this list turns them into a number):
  • It takes energy1 to perform computation (The monkeys have to press keys)
  • The energy supply is limited (There is only so much mass to destroy in the solar system)
  • Each extra bit we're going to monkey doubles the workload. (Even if we copy our old results and put a 1 on the end of one copy and a 0 on the end of the other, the energy needed to do the copying doubles each time, and so does the energy spent looking through the results to see if we've got anything useful.)
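To put numbers on those three facts, here is a minimal back-of-the-envelope sketch in Python, following (as far as I can reconstruct it) the calculation in the book cited in footnote 2: an ideal computer running at the 3.2 K it assumes as the ambient temperature of the universe has to spend at least kT ≈ 4.4 × 10^-16 ergs per bit change, and the Sun's entire output, captured losslessly for 32 years, comes to about 32 × 1.21 × 10^41 ergs. The figures are the book's; the code just turns them into a counter width.

    import math

    # Figures from the back-of-the-envelope calculation in footnote 2's book
    # (my reconstruction; treat the constants as assumptions, not gospel).
    BOLTZMANN_ERG_PER_K = 1.38e-16      # Boltzmann's constant, in ergs per kelvin
    AMBIENT_TEMP_K = 3.2                # temperature assumed for an ideal computer
    SUN_OUTPUT_ERGS_PER_YEAR = 1.21e41  # the Sun's total annual energy output
    YEARS = 32                          # capture it, losslessly, for 32 years

    # Minimum energy to flip one bit of state: kT ergs per change.
    ergs_per_bit_change = BOLTZMANN_ERG_PER_K * AMBIENT_TEMP_K   # ~4.4e-16

    total_ergs = SUN_OUTPUT_ERGS_PER_YEAR * YEARS
    bit_changes = total_ergs / ergs_per_bit_change

    # Cycling an n-bit counter through all its values costs on the order of
    # 2**n bit changes, so the affordable width is roughly log2 of the total.
    print(f"about a {math.log2(bit_changes):.0f}-bit counter")   # -> about a 192-bit counter

That number is where the figure in the next paragraph comes from.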

What this means for us is that, given all the available energy in the solar system, there is a limit to the length of string we can monkey. And it turns out to be quite short. Somewhere around 192 bits, in fact2. There are many ways to represent English text as a binary number. The naïve way is simply to map the 26 possible characters onto sequences of bits, which takes 5-6 bits per character. Better ways include using a Huffman tree of the character distribution in English, using a separate Huffman tree for each preceding character, etc. etc. It is an open question (and possibly one as hard as the infinite monkeys problem itself) whether there is a lower bound on how few bits per character you can get away with.3
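To make the Huffman idea concrete, here is a minimal sketch that builds a single-letter Huffman code from a rough table of English letter frequencies (the percentages are illustrative round numbers, not counts from any particular corpus). It lands around 4.2 bits per letter: better than the naïve 5-6, but nowhere near the 1 bit per character we're about to grant ourselves; getting closer to that needs the per-preceding-character trees and cleverer models hinted at above.

    import heapq

    # Approximate single-letter frequencies for English text (in percent);
    # illustrative round numbers, not measurements from a specific corpus.
    FREQ = {
        'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
        's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
        'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
        'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
        'q': 0.10, 'z': 0.07,
    }

    def huffman_code(freq):
        """Build a prefix code (symbol -> bit string) by repeatedly merging
        the two lightest subtrees: the standard Huffman construction."""
        heap = [(w, i, {sym: ''}) for i, (sym, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            w1, _, left = heapq.heappop(heap)
            w2, _, right = heapq.heappop(heap)
            merged = {s: '0' + c for s, c in left.items()}
            merged.update({s: '1' + c for s, c in right.items()})
            heapq.heappush(heap, (w1 + w2, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    code = huffman_code(FREQ)
    total = sum(FREQ.values())
    avg = sum(FREQ[s] / total * len(code[s]) for s in FREQ)
    print(f"average code length: {avg:.2f} bits per letter")   # roughly 4.2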

Giving the benefit of the doubt, we'll assume it takes only 1 bit to represent a character. So (assuming you are generating English more intelligently than picking characters at random, which would only buy you about 40 characters) you're getting something in the region of 192 characters; the arithmetic is spelled out after the quotation below. There aren't a lot of things that fit in 192 characters:

Alas, poor Yorick! I knew him, Horatio:
a fellow of infinite jest, of most excellent fancy: he hath borne me on his back a thousand times; and now, how abhorred in my imagination it is! my gorge rises at it. Here hung those lips that I have
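For the record, the arithmetic behind "about 40 characters" and "192 characters" is just the bit budget divided by the cost per character:

    BUDGET_BITS = 192

    # Random typing pays the naive ~5 bits per character, so the budget only
    # buys a short fragment; the generous 1-bit-per-character assumption above
    # buys the whole budget in characters.
    print(BUDGET_BITS // 5)   # 38, i.e. "about 40 characters"
    print(BUDGET_BITS // 1)   # 192 characters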

You'd better hope that one of these things is an incredibly concise guide to building a computer (in the words of Bruce Schneier) from something other than matter and occupy[ing] something other than space.


1 - Technically, we're talking about entropy - or rather, the amount of energy left that can still be leveraged to do something useful, which shrinks as entropy rises. But entropy is also an information theory term, and using "entropy" to mean both in the same node is just confusing.
2 - APPLIED CRYPTOGRAPHY, 2nd edition (John Wiley & Sons, 1996) goes into more detail on the subject. It suggests that you could get all the way to 219 bits by building a Dyson sphere around a supernova (222 if you could find a way to catch all the neutrinos).
3 - These cleverer encodings don't give us anything for free, though. While they give English-like text a shorter representation, they make the representations of non-English text much longer.