posted by kkk.
Why do you need to construct a calibration curve when doing spectrophotometric determinations? Is it not possible to obtain the concentration of the unknown solution by simply looking up the extinction coefficient of the species under study and then applying Beer's law? Thanks!
Many or most spectrometers split a beam of light in two and pass it through two different "arms" of the spectrometer, then take the ratio of the measured signals. One of the arms contains the sample material, in a holder.
There can be differences in the intensities of the two beams because of an imperfect beam splitter or different losses in the two beams, including absorption by windows of the empty cell. This can be evaluated by means of a calibration scan of the empty sample holder.
The ratio of the signals with the full and empty cell should be used to obtain the sample material's transmittance, and from it the absorbance.
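As a quick sketch of that ratio step, here is how the transmittance and absorbance would be computed from the two detector readings. The signal values below are purely illustrative, not real instrument data:

```python
import math

# Hypothetical detector readings (arbitrary units) -- illustrative values only.
signal_blank = 950.0   # beam through the empty (or solvent-filled) reference cell
signal_sample = 475.0  # beam through the cell containing the sample

# Transmittance is the ratio of the two signals;
# absorbance is the negative base-10 logarithm of the transmittance.
T = signal_sample / signal_blank
A = -math.log10(T)

print(f"Transmittance: {T:.3f}")  # 0.500
print(f"Absorbance:    {A:.3f}")  # 0.301
```

Taking the ratio this way cancels out beam-splitter imbalance and window losses that are common to both measurements, which is exactly why the empty-cell calibration scan matters.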
The absorption at a given wavelength may be due to more than one species. I assume that you are not trying to deduce the concentration of one species in a mixture of species with overlapping bands.
It has been a long time since I calibrated a spectrophotometer, and there must be newer and better kinds around these days.
This article may help.
Let me add something to the answers provided by drwls. Knowing the extinction coefficient and using Beer's Law may work IF the extinction coefficient doesn't change with concentration. Sometimes it does change, and calibration curves are not always linear: some curve up, some curve down, and some are straight lines.
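To make that point concrete, here is a minimal sketch of how a calibration curve is used in practice: fit a line through the absorbances of known standards, then invert the fit to read off the unknown's concentration. All concentrations and absorbances below are made-up illustrative values (the slight curvature mimics an extinction coefficient drifting with concentration):

```python
# Hypothetical standards: concentrations (mol/L) and measured absorbances.
conc = [0.0, 1e-4, 2e-4, 3e-4, 4e-4]
absorbance = [0.000, 0.120, 0.238, 0.352, 0.460]

# Ordinary least-squares fit of A = m*c + b over the standards.
n = len(conc)
mean_c = sum(conc) / n
mean_a = sum(absorbance) / n
m = sum((c - mean_c) * (a - mean_a) for c, a in zip(conc, absorbance)) / \
    sum((c - mean_c) ** 2 for c in conc)
b = mean_a - m * mean_c

# Invert the fitted line to read the unknown's concentration off the curve.
unknown_A = 0.295
unknown_conc = (unknown_A - b) / m

print(f"slope m = {m:.0f} L/mol (effective molar absorptivity x path length)")
print(f"unknown concentration = {unknown_conc:.2e} mol/L")
```

If the curve visibly bends up or down instead of staying straight, a single literature extinction coefficient would give a systematically wrong answer, which is exactly why the curve is constructed from standards measured on the same instrument.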