X and Y are jointly distributed discrete-valued random variables. The relation between their joint entropy H(X,Y) and their individual entropies H(X) and H(Y) is


H(X,Y)≤H(X)+H(Y), equality holds when X,Y are independent

H(X,Y)≤H(X)+H(Y), equality holds when X,Y are uncorrelated

H(X,Y)≥H(X)+H(Y), equality holds when X,Y are independent

H(X,Y)≥H(X)+H(Y), equality holds when X,Y are uncorrelated

The correct answer is:

H(X,Y)≤H(X)+H(Y), equality holds when X,Y are independent

Explanation:

Joint entropy measures the uncertainty, or average amount of information, in two random variables X and Y considered together. It is denoted H(X,Y) and defined as H(X,Y) = -Σ p(x,y) log2 p(x,y), where the sum runs over all pairs (x,y).

Individual (marginal) entropy, on the other hand, measures the uncertainty in a single random variable on its own. It is denoted H(X) and H(Y) for X and Y, respectively.
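As a minimal sketch (assuming NumPy and a small hypothetical joint probability table), the quantities above can be computed directly from a joint pmf p(x,y) and its marginals:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector/array p."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))

# Hypothetical joint pmf of (X, Y): rows index values of X, columns values of Y.
p_xy = np.array([[0.3, 0.2],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)                # marginal pmf of X
p_y = p_xy.sum(axis=0)                # marginal pmf of Y

print("H(X)   =", entropy(p_x))
print("H(Y)   =", entropy(p_y))
print("H(X,Y) =", entropy(p_xy))
```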

The relationship between the joint entropy and individual entropies is given by:

H(X,Y) ≤ H(X) + H(Y)

This means that the joint entropy H(X,Y) is never greater than the sum of the individual entropies H(X) and H(Y). The reason is the chain rule, H(X,Y) = H(X) + H(Y|X), together with the fact that conditioning cannot increase entropy, so H(Y|X) ≤ H(Y). Equality holds exactly when X and Y are independent.

Independence means that p(x,y) = p(x)p(y) for every pair (x,y), so observing one variable provides no information about the other. In that case H(Y|X) = H(Y), and the joint entropy is exactly equal to the sum of the individual entropies.
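The bound can also be checked numerically. The sketch below (assuming NumPy and hypothetical pmfs chosen for illustration) compares H(X,Y) with H(X)+H(Y) for a dependent pair, where the inequality is strict, and for the independent pair built from the product of the same marginals, where equality holds:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Dependent pair: knowing X changes the distribution of Y.
p_dep = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
p_x, p_y = p_dep.sum(axis=1), p_dep.sum(axis=0)
print("dependent:   H(X,Y) =", entropy(p_dep),
      " H(X)+H(Y) =", entropy(p_x) + entropy(p_y))

# Independent pair with the same marginals: p(x,y) = p(x) * p(y).
p_ind = np.outer(p_x, p_y)
print("independent: H(X,Y) =", entropy(p_ind),
      " H(X)+H(Y) =", entropy(p_x) + entropy(p_y))
```

For the dependent table the joint entropy comes out strictly below H(X)+H(Y); for the product table the two sides agree, matching the equality condition stated above.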

Therefore, the correct answer is:

H(X,Y)≤H(X)+H(Y), equality holds when X,Y are independent.