On one production line in a small custom chip factory, marginal costs decrease rapidly as more chips are made. The marginal cost of making x thousand chips in a day is thought to be C'(x) = 40/√x, measured in hundreds of dollars (per thousand chips). Given that it costs $40,000 to make one thousand chips, find the cost of making 16 thousand chips in a day. What is the total cost in hundreds of dollars?

c'(x) = 40/√x

so, c(x) = 80√x + k
since c(1) = 40,000 dollars = 400 hundreds,
80 + k = 400
k = 320
so, c(x) = 80√x + 320
Now just find c(16)

So when I plug in 16, it gives me 640, but is that meant to be $6,400? I know I can't just put $640, and I know it isn't $640,000. I'm just a bit confused by the units.

To find the cost of making 16,000 chips in a day, we need to integrate the marginal cost function to obtain the total cost function, keeping track of units throughout: x is in thousands of chips and C(x) is in hundreds of dollars.

First, let's write the marginal cost function in terms of x:

C'(x) = 40 / √x

Next, we integrate C'(x) to find the total cost function C(x):

C(x) = ∫ 40/√x dx

By the power rule, ∫ x^(-1/2) dx = 2√x, so:

C(x) = 80√x + k

where k is the constant of integration.
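If you want to double-check the antiderivative, here is a quick sketch using sympy (assuming you have it installed):

import sympy as sp

x = sp.symbols('x', positive=True)

# Integrate the marginal cost 40/sqrt(x); the power rule gives 80*sqrt(x)
antiderivative = sp.integrate(40 / sp.sqrt(x), x)
print(antiderivative)  # 80*sqrt(x)

(sympy drops the constant of integration, so remember to add k back.)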

To find k, we use the given information that it costs $40,000 to make one thousand chips (x = 1). Since C(x) is measured in hundreds of dollars, $40,000 corresponds to C(1) = 400. Substituting into the total cost function:

400 = 80√1 + k
400 = 80 + k

Solving for k, we find that k = 320.
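The same kind of check works for the constant (again a sketch with sympy; the variable names are just illustrative):

import sympy as sp

x, k = sp.symbols('x k')

# Total cost in hundreds of dollars: C(x) = 80*sqrt(x) + k
C = 80 * sp.sqrt(x) + k

# C(1) = 400 hundreds of dollars (i.e. $40,000); solve for k
k_value = sp.solve(sp.Eq(C.subs(x, 1), 400), k)[0]
print(k_value)  # 320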

Now, we can determine the total cost of making 16,000 chips (x = 16) by substituting into the total cost function:

C(16) = 80√16 + 320
C(16) = 80 * 4 + 320
C(16) = 320 + 320
C(16) = 640

Therefore, the cost of making 16,000 chips in a day is 640 hundreds of dollars, i.e. $64,000. That resolves the unit confusion: the 640 you get from plugging in x = 16 is already in hundreds of dollars, so it is neither $640 nor $640,000.
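As a final sanity check, you can get the same number without finding the antiderivative at all: numerically integrate the marginal cost from x = 1 to x = 16 and add it to the known C(1) = 400. A sketch using scipy (an extra dependency, so treat it as optional):

from scipy.integrate import quad

# Marginal cost in hundreds of dollars, x in thousands of chips
def marginal_cost(x):
    return 40 / x ** 0.5

# C(16) = C(1) + integral of C'(x) from 1 to 16
added_cost, _ = quad(marginal_cost, 1, 16)
print(400 + added_cost)  # ~640.0 hundreds of dollars, i.e. $64,000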