This is a follow-up to a previous post, A simulation of angel investing.

Several readers commented on Hacker News that my first stab at a simulation was misleading because it showed negative average returns for low deal sizes, when in fact expected returns should be not only positive but constant regardless of deal size.

They are right.

I had been using *payoff* as the random variable, but *rate of return* as the measured variable. The formula for rate of return (x^(1/t) - 1, where x is the payoff multiple and t is the holding period in years) places the most weight on the zero-payoff case (where the return is -1.0), so the simulation results were skewed towards negative expectations, especially for low values of D (the number of deals).
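To see why the zero-payoff case dominates, here is a small illustration of that formula (the function name is mine; only the formula comes from the post):

```python
def rate_of_return(x, t):
    """Annualized rate of return for a payoff multiple x over t years: x**(1/t) - 1."""
    return x ** (1 / t) - 1

# A total loss pins the return at -1.0 (i.e. -100%) no matter the holding period,
# while even a 20x payoff over 5 years is only about +82% per year.
print(rate_of_return(0, 5))   # -1.0
print(rate_of_return(20, 5))  # ~0.82
```

With half the deals paying 0x, that hard floor at -1.0 drags the averaged rate of return down far more than the occasional 10x or 20x deal can lift it.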

In this new post, I present a simplified and more accurate simulation of angel investing.

As before, my goal is to shed some light on this question:

How many angel investments are needed to make the combined payoff look attractive from an investment standpoint?

I coded the following simulation in Python. View the revised source code here.

1. Create a pool of 10,000 different investors, each investing in D deals, with a fixed distribution of payoffs per deal. Randomly simulate each investor’s combined payoff, then compute the mean and standard deviation of all payoffs in the overall pool.

Note that this simulation just uses the straight payoff, not a calculated rate of return.

2. Assume all D deals are made at the same time and that the payoff occurs at the same time.

3. For each angel investment, assume the following distribution of payoffs. Note that this is the same distribution as in the previous post.

| Prob. | Payoff | Note                   |
|------:|-------:|------------------------|
| 50%   | 0x     | lose entire investment |
| 20%   | 1x     | get investment back    |
| 15%   | 3x     |                        |
| 13%   | 10x    |                        |
| 2%    | 20x    |                        |

(source: Gabriel Weinberg’s angel investing scenario spreadsheet)

4. New in the revised model: Calculate the median of all payoffs for each deal size. This gives an idea of what kind of results were achieved by the typical (not average, but typical) investor.

5. New in the revised model: Calculate the probability that none of the D deals were hits, which I define as investments with >= 10x payoff. Since big hits are what angel investors are after, this no-hit probability measures the odds of striking out across the board. It is calculated as (probability of < 10x payoff)^D = 0.85^D.
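The steps above can be sketched in Python. This is a minimal reconstruction, not the revised source code itself: the function names are mine, while the payoff table and the 10,000-investor pool size come from the post.

```python
import random
import statistics

# Payoff distribution from the post: (probability, payoff multiple)
DISTRIBUTION = [(0.50, 0), (0.20, 1), (0.15, 3), (0.13, 10), (0.02, 20)]

def simulate_payoff(num_deals, rng):
    """Average payoff multiple across num_deals simultaneous deals for one investor."""
    total = 0.0
    for _ in range(num_deals):
        r = rng.random()
        cumulative = 0.0
        for prob, payoff in DISTRIBUTION:
            cumulative += prob
            if r < cumulative:
                total += payoff
                break
    return total / num_deals

def simulate_pool(num_deals, num_investors=10_000, seed=1):
    """Mean, standard deviation, and median of payoffs across a pool of investors."""
    rng = random.Random(seed)
    payoffs = [simulate_payoff(num_deals, rng) for _ in range(num_investors)]
    return (statistics.mean(payoffs),
            statistics.stdev(payoffs),
            statistics.median(payoffs))

for d in (1, 5, 10, 20):
    mean, std, median = simulate_pool(d)
    print(f"D={d:2d}  mean={mean:.2f}  std={std:.2f}  median={median:.2f}")
```

Because the measured variable is the straight payoff, the mean should hover near the per-deal expected payoff (about 2.35x for this distribution) at every deal size, while the standard deviation shrinks and the median climbs toward the mean as D grows.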

Here are the results:

- The expected, a.k.a. mean, payoff is roughly constant for all deal sizes.

- The typical, a.k.a. median, payoff is low for deal sizes less than 5, but stabilizes quickly after 5 deals. After 20 deals, the median payoff is nearly identical to the mean.

- An angel investor would have to do about 13 deals before total loss becomes a more-than-2-standard-deviation event.

- After 10 deals there’s still a 20% chance of never getting a “hit” investment. To push that chance to below 5%, an angel investor would have to do about 20 deals.
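The last two findings can be checked in closed form, without running the simulation. A quick sketch (the distribution and thresholds come from the post; the variable names are mine):

```python
import math

# Payoff distribution from the post: (probability, payoff multiple)
DIST = [(0.50, 0), (0.20, 1), (0.15, 3), (0.13, 10), (0.02, 20)]

# --- No-hit probability: P(no >=10x payoff in D deals) = 0.85**D ---
p_no_hit = 1 - (0.13 + 0.02)                      # 0.85 per deal
print(f"P(no hit in 10 deals) = {p_no_hit**10:.3f}")   # ~0.197
deals_for_5pct = math.ceil(math.log(0.05) / math.log(p_no_hit))
print(f"Deals for <5% no-hit chance: {deals_for_5pct}")  # 19

# --- Total loss: how many deals until a zero payoff is a >2 std dev event? ---
mu = sum(p * x for p, x in DIST)                         # mean payoff per deal
sd = math.sqrt(sum(p * x**2 for p, x in DIST) - mu**2)   # std dev per deal
# The average payoff over D independent deals has std dev sd / sqrt(D),
# so a total loss (payoff 0) sits mu / (sd / sqrt(D)) std devs below the mean.
deals_for_2_sigma = next(d for d in range(1, 100)
                         if mu / (sd / math.sqrt(d)) > 2)
print(f"Deals until total loss is a >2 sigma event: {deals_for_2_sigma}")  # 13
```

The closed-form answers (19 deals for a sub-5% no-hit chance, 13 deals for the 2-sigma threshold) line up with the "about 20" and "about 13" figures above.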

My takeaway from this revised experiment:

Angel investors can expect favorable payoffs with only 10 deals, but it takes at least 20 investments to be reasonably safe from striking out entirely.

Some ideas for future enhancements to the model:

- Add true Google-like payoffs to the distribution. Very low probability with extremely high payoff.

- Measure percentiles other than the median: the 25th and 75th percentiles, for example.

- Don’t assume deals are all made at the same time.

- Don’t assume payoffs are uncorrelated.

If you have any suggestions, please leave them in the comments below.