An analog input signal x(t) = 1.5 + 0.5 sin(100t) is sampled at a sampling frequency fs = 2 kHz. Assuming that sampling starts at t = 0, determine the sampling period Ts and compute the value of the sample x[20]. Show your work.

Sampling period (Ts) = 1/fs = 1/2000 = 0.0005 s

x[20] = 1.5 + 0.5 sin(100 * 20 * 0.0005)
      = 1.5 + 0.5 sin(1)
      = 1.5 + 0.5(0.8415)
      = 1.5 + 0.4207
      ≈ 1.9207
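The calculation above can be checked with a short script; the variable names here (fs, Ts, n) are just illustrative:

```python
import math

fs = 2000        # sampling frequency in Hz
Ts = 1 / fs      # sampling period: 0.0005 s
n = 20           # sample index

t = n * Ts                            # sample time: t = 0.01 s
x_n = 1.5 + 0.5 * math.sin(100 * t)   # x[n] = x(n * Ts)
print(f"Ts = {Ts} s, x[20] = {x_n:.4f}")  # Ts = 0.0005 s, x[20] = 1.9207
```

Note that the sine argument 100 * t is in radians, so sin(1) ≈ 0.8415, not sin(1°).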