posted by Chad.
I must write a program in Java that calculates the amount a person would earn over a period of time if his or her salary is one penny the first day, two pennies the second day, and continues to double each day. The program should display a table showing the salary for each day, and then show the total pay at the end of the period. The output should be displayed as dollar amounts.
I don't know how to do computer programming, but the salary for the 20th day would be $5,242.88 (2^19 cents, since the pay doubles 19 times after day 1).
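For what it's worth, a minimal Java sketch of the assignment might look like the following. The class and method names are my own, and the 20-day period is an assumption (the post doesn't fix the length of the period); working in cents with a `long` avoids floating-point rounding in the running total.

```java
import java.text.NumberFormat;
import java.util.Locale;

public class PennyDoubling {

    // Pay on a given day, in cents: 1 cent on day 1, doubling
    // every day thereafter, so day n pays 2^(n-1) cents.
    static long dailyPayCents(int day) {
        return 1L << (day - 1);
    }

    public static void main(String[] args) {
        int days = 20; // assumed period length; the assignment leaves it open
        NumberFormat dollars = NumberFormat.getCurrencyInstance(Locale.US);

        long totalCents = 0;
        System.out.println("Day   Salary");
        for (int day = 1; day <= days; day++) {
            long cents = dailyPayCents(day);
            totalCents += cents;
            System.out.printf("%3d   %s%n", day, dollars.format(cents / 100.0));
        }
        System.out.println("Total: " + dollars.format(totalCents / 100.0));
    }
}
```

Running it prints one row per day in dollar format (day 20 comes out to $5,242.88, matching the figure above) followed by the grand total for the period.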