Statistics

The distribution of cash withdrawals from the automatic teller machine at a
certain bank has a mean of $500 with a standard deviation of $70. To reduce the
incentives for robbery, the bank puts money into the machine every 12 hours and
keeps the amount deposited fairly close to the expected total withdrawals for a
12-hour period. If 100 withdrawals are expected in each 12-hour period and
each withdrawal is independent, how much should the bank put into the
machine so that the probability of running out of money is 0.05?
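One standard way to approach this: by the Central Limit Theorem, the sum of 100 independent withdrawals is approximately normal with mean 100 × $500 = $50,000 and standard deviation $70 × √100 = $700. The bank runs out of money when total withdrawals exceed the deposit, so the deposit should be the 95th percentile of that total. A minimal Python sketch of this calculation (assuming SciPy is available; variable names are illustrative):

    # Sketch: 95th percentile of total withdrawals via the normal approximation
    from scipy.stats import norm

    mean_withdrawal = 500   # mean of a single withdrawal ($)
    sd_withdrawal = 70      # standard deviation of a single withdrawal ($)
    n = 100                 # expected withdrawals per 12-hour period

    # Sum of n independent withdrawals (CLT approximation):
    total_mean = n * mean_withdrawal        # 100 * 500 = $50,000
    total_sd = sd_withdrawal * n ** 0.5     # 70 * 10 = $700

    # Deposit such that P(total withdrawals > deposit) = 0.05,
    # i.e. the 95th percentile of the total-withdrawal distribution.
    z = norm.ppf(0.95)                      # ~1.645
    deposit = total_mean + z * total_sd
    print(f"Deposit needed: ${deposit:,.2f}")   # ~$51,151.40

Using the usual table value z = 1.645 gives $50,000 + 1.645 × $700 ≈ $51,151.50, so the bank should stock roughly $51,152 per 12-hour period.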
