I'm pondering whether it would be possible to scan a negative in such a way as to extrapolate all of the light imaged onto it, i.e. some of the dimensional data laid down in the photo, using some kind of holographic three-dimensional extraction of light data from the negative, and then see whether the blur resident in the photo could be dimensionally imaged and subtracted to produce a new, refocused version of the image. Is the blur in a photo stored in the layers of the negative in a dimensional way, in the sense that it might be possible to re-simulate how the light fell on the negative and, from that, recreate and refocus the original light into a new image with the blur removed? Could an X-ray dimensional scan be combined with a flat image scan and somehow correlated to capture the light that fell on the negative in 3D, enabling refocusing? Perhaps by modelling the camera and its lenses?
This is probably impossible; I'm just wondering whether 3D information about the way the light fell onto the negative could somehow be extracted from a standard negative.
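For what it's worth, here is a minimal sketch of the standard two-dimensional version of this idea, not the 3D holographic extraction asked about above: lens blur is usually modelled as convolution of the sharp scene with the lens's point spread function (PSF), and "refocusing" then becomes deconvolution with an assumed model of the camera and lens. Everything here is an assumption for illustration: a scanned negative loaded as a grayscale NumPy array, a simple Gaussian PSF standing in for a real lens model, and a Wiener filter as the inversion step.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_psf(size: int, sigma: float) -> np.ndarray:
    """Symmetric Gaussian point spread function, normalised to sum to 1.
    A crude stand-in for a real lens model."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def wiener_deconvolve(blurred: np.ndarray, psf: np.ndarray, snr: float = 100.0) -> np.ndarray:
    """Wiener deconvolution in the frequency domain.

    blurred : scanned (blurred) image as a 2-D float array
    psf     : assumed point spread function of the lens
    snr     : assumed signal-to-noise ratio; regularises frequencies the blur destroyed
    """
    # Pad the PSF to the image size and centre it at the origin for circular convolution.
    psf_padded = np.zeros_like(blurred)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    psf_padded = np.roll(psf_padded,
                         (-(psf.shape[0] // 2), -(psf.shape[1] // 2)),
                         axis=(0, 1))

    H = np.fft.fft2(psf_padded)   # frequency response of the blur
    G = np.fft.fft2(blurred)      # spectrum of the blurred image
    # Wiener filter: inverts the blur where signal dominates, attenuates elsewhere.
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * G))

# Toy example: blur a synthetic "scene", then try to undo it with the same PSF model.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
blurred = gaussian_filter(scene, sigma=2.0)
restored = wiener_deconvolve(blurred, gaussian_psf(15, sigma=2.0))
```

The sketch only works to the extent that the blur really was a single, known PSF; it recovers no depth or light-direction information, which is exactly the gap the 3D question above is pointing at.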