Posted by Marissa.
A bothersome feature of many physical measurements is the presence of a background signal (commonly called "noise"). It is necessary, therefore, to subtract off this background level from the data to obtain a valid measurement. Suppose the measured background level is 4.7 mV. A signal of 20.1 mV is measured at a distance of 28 mm and 15.3 mV is measured at 33.5 mm. Correct the data for background and normalize the data to the maximum value. What is the normalized corrected value at 33.5 mm?
So my problem is that I'm not sure what I should be subtracting the background 4.7 value from. Without the background, I know that I would just need to divide 15.3 by 20.1.
Can someone please help me understand where I'm supposed to be subtracting background from? Any help would be appreciated.
My concern is about the properties of the background noise.
Is it always positive and constant? What are the possible variations and the characteristics?
In the absence of more information, I would be tempted to subtract the noise from the voltage measurements (if it is a hypothetical question). If it is a lab experiment or research, I would try to find out more about the characteristics of the noise before doing more calculations.
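Concretely, if the background really is a constant offset, the arithmetic would look like this (just a sketch under that assumption: subtract the background from every reading, then normalize by the corrected maximum):

```python
# Background-corrected normalization, assuming a constant background offset.
background = 4.7  # mV, the measured background level

readings = {28.0: 20.1, 33.5: 15.3}  # distance (mm) -> signal (mV)

# Subtract the background from each reading...
corrected = {d: v - background for d, v in readings.items()}

# ...then normalize to the corrected maximum.
max_corrected = max(corrected.values())
normalized = {d: v / max_corrected for d, v in corrected.items()}

print(round(normalized[33.5], 3))  # 10.6 / 15.4 ≈ 0.688
```

So the normalized corrected value at 33.5 mm would be (15.3 − 4.7) / (20.1 − 4.7) ≈ 0.688, rather than 15.3 / 20.1 ≈ 0.761 without the correction.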
It's an experiment.
In Part 2.2.4 of the experiment, some light that reflects off the apparatus or from neighboring stations strikes the photometer even when the direct beam is blocked. In addition, due to electronic drifts, the photometer does not generally read 0.0 mV even in a dark room.
Would you be able to get the characteristics of the noise from someone in the lab, or make some effort to 'calibrate' the instrument so that you know how much you are adjusting, and when?
Most importantly, you need to know the range of errors in your correction, and how sensitive your results are to it.