By how much does the mean absolute deviation of data set A exceed that of data set B? Round your answer to two decimal places. Data set A is 99, 10, 10, 10, 11, 12, 14, 18. Data set B is 7, 8, 8, 9, 10, 10, 11, 11, 12, and 15.

To calculate the mean absolute deviation (MAD) for each data set, we first need the mean of each data set.
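
In symbols, for a data set with n values x1, ..., xn, the MAD is the average distance of the values from their mean:

MAD = (|x1 - mean| + |x2 - mean| + ... + |xn - mean|) / n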

For data set A:
Mean = (99 + 10 + 10 + 10 + 11 + 12 + 14 + 18) / 8
Mean = 184 / 8
Mean = 23

For data set B:
Mean = (7 + 8 + 8 + 9 + 10 + 10 + 11 + 11 + 12 + 15) / 10
Mean = 101 / 10
Mean = 10.1

Next, we calculate the absolute deviation of each data point from its data set's mean:

For data set A:
|99 - 23| = 76
|10 - 23| = 13
|10 - 23| = 13
|10 - 23| = 13
|11 - 23| = 12
|12 - 23| = 11
|14 - 23| = 9
|18 - 23| = 5

Sum of absolute deviations = 76 + 13 + 13 + 13 + 12 + 11 + 9 + 5 = 152

Mean absolute deviation for data set A = 152 / 8 = 19

For data set B:
|7 - 10.1| = 3.1
|8 - 10.1| = 2.1
|8 - 10.1| = 2.1
|9 - 10.1| = 1.1
|10 - 10.1| = 0.1
|10 - 10.1| = 0.1
|11 - 10.1| = 0.9
|11 - 10.1| = 0.9
|12 - 10.1| = 1.9
|15 - 10.1| = 4.9

Sum of absolute deviations = 3.1 + 2.1 + 2.1 + 1.1 + 0.1 + 0.1 + 0.9 + 0.9 + 1.9 + 4.9 = 17.2

Mean absolute deviation for data set B = 17.2 / 10 = 1.72

The difference in mean absolute deviation between data set A and data set B is:
19 - 1.72 = 17.28

Therefore, the mean absolute deviation of data set A exceeds that of data set B by 17.28 (to two decimal places).
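
If you want to double-check the arithmetic, here is a minimal Python sketch (the names mad, a, and b are just illustrative) that recomputes both mean absolute deviations and their difference:

```python
from statistics import mean

def mad(values):
    """Mean absolute deviation: average distance of the values from their mean."""
    m = mean(values)
    return mean(abs(x - m) for x in values)

a = [99, 10, 10, 10, 11, 12, 14, 18]        # data set A
b = [7, 8, 8, 9, 10, 10, 11, 11, 12, 15]    # data set B

print(mad(a))                      # 19.0
print(mad(b))                      # approximately 1.72 (floating point)
print(round(mad(a) - mad(b), 2))   # 17.28
```

Running this prints 19.0, roughly 1.72, and 17.28, matching the hand calculation above.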