The H&D curve, the primary tool for evaluating radiographic film characteristics, is defined as the optical density as a function of exposure. Optical density is traditionally measured from the transmittance of a light beam through the film, and understanding this behavior is central to sensitometry. Many mathematical and computational models characterize the shape of the curve and its properties, but none of them accounts for the interference introduced by the measuring equipment. In a first attempt to fit theoretical models, an inconsistency with the experimental data was found at high exposure values: the densities predicted by the theoretical models were higher than the measured ones. This inconsistency would suggest a limitation of the optical density itself. However, when the silver concentration was measured as a function of exposure by another method, X-ray fluorescence, it was found to agree reasonably well with the models. This led to the conclusion that the limitation perceived in the optical density is caused mainly by the measurement process.
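As a minimal sketch of the behavior described above, the characteristic curve is often modeled as a sigmoid in log exposure that saturates at a maximum density (the shoulder) and a base-plus-fog level (the toe). The function and parameter values below are illustrative assumptions, not the specific model used in this work:

```python
import numpy as np

def hd_curve(log_e, d_min=0.2, d_max=3.5, gamma=2.0, log_e0=1.0):
    """Illustrative logistic-style model of an H&D (characteristic) curve:
    optical density as a function of log exposure, saturating at d_max
    (shoulder) and approaching d_min (base-plus-fog toe) at low exposure.
    All parameter values here are hypothetical."""
    return d_min + (d_max - d_min) / (1.0 + np.exp(-gamma * (log_e - log_e0)))

log_e = np.linspace(-2, 4, 200)
density = hd_curve(log_e)

# At very high exposures the modeled density saturates near d_max;
# this is the region where a transmittance-based densitometer may
# report values below the model's prediction.
print(round(float(density[-1]), 2))
```

A model of this form predicts ever-increasing density toward d_max at high exposure, which is where the discrepancy with transmittance-based density measurements appears.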