Excel question: How to distribute a value using the normal distribution?

Here’s the challenge: I need to take a single dollar value and distribute it in a bell curve over a set of dates.

Variables:

  1. Dollar Value

  2. Start date

  3. End date

The dollar value would be distributed among the valid months (picture the dates running along the columns).

Perhaps it’s best asked by giving an example: say that I have $100,000 that I need to distribute between January and June (6 months). If it were uniformly distributed, that’s $16,667 per month. The formula would check whether the current column’s date falls between the start and end dates, and then divide the dollar value by the count of all months between those two dates. General formula in every cell along the row:

=IF(AND(CurrentMonth >= StartDate, CurrentMonth <= EndDate), DollarValue/COUNTIFS(DateRange, ">="&StartDate, DateRange, "<="&EndDate), "")

The above is a pretty simple formula for entering the average value across the valid date range (i.e. uniformly distributed).

The question is, are there any ideas for bell curves??

There’s a built-in cumulative normal distribution function: NORM.S.DIST(z, cumulative), where cumulative is TRUE for the cumulative distribution function and FALSE for the probability density.
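For example, =NORM.S.DIST(0,TRUE) returns 0.5 (half of the area lies below the mean), and =NORM.S.DIST(1,TRUE) returns about 0.8413 (the area below +1σ).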

If you want each month to correspond to 1σ, then you have months 1, 2, …, 11, 12, and each month’s percentage would be “=NORM.S.DIST(month-6,1)-NORM.S.DIST(month-7,1)”.
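As a sanity check, month 7 works out to =NORM.S.DIST(1,1)-NORM.S.DIST(0,1) ≈ 0.3413, so each of the two middle months receives about 34.13% of the total, with the shares tapering off symmetrically toward months 1 and 12.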

If you want each month to correspond to kσ, then the percentage would be “=NORM.S.DIST((month-6)*k,1)-NORM.S.DIST((month-7)*k,1)”.
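To see the effect of k: with k = 0.5 the 12 months span ±3σ instead of ±6σ, and month 1’s share becomes =NORM.S.DIST(-2.5,1)-NORM.S.DIST(-3,1) ≈ 0.49%, whereas with k = 1 month 1 gets effectively zero (about 0.00003%).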

Bear in mind that the normal distribution extends from −∞ to +∞, so you may end up with a total distribution that’s less than 100%. If so, I suggest that you sum all of the values, then divide each by the sum.
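A minimal sketch of that normalization, assuming the raw monthly percentages sit in B2:M2 (a hypothetical range) and the total sits in a cell named DollarValue:

=DollarValue*B2/SUM($B$2:$M$2)

Filled across the row, this guarantees the monthly amounts add up to exactly 100% of the dollar value.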

Thanks for the help on this. I tried something like this, but it simply does not spread the data out the way I want. The range can be anywhere from 3 to 18 months, and since 5 standard deviations from the mean of 0 already covers essentially 100% of the distribution, spreading it out based on 3 months does not work well.
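One way to make the spread adapt to a variable month count (a sketch of my own, not from the thread, assuming named cells MonthIndex, running 1 through the count, and MonthCount): fix the whole range at a constant span, say ±3σ, so each month covers an equal-width slice of the curve:

=NORM.S.DIST(6*MonthIndex/MonthCount-3,TRUE)-NORM.S.DIST(6*(MonthIndex-1)/MonthCount-3,TRUE)

Whether the range is 3 or 18 months, the slices jointly cover ±3σ (about 99.7% of the curve), so the normalization step above brings the totals to exactly 100%.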