Hello, dear Habr readers. It would seem rendering could hardly get any more realistic, and yet I have an idea. First things first.
Introduction
Path tracing is a method of rendering three-dimensional scenes based on optics.
In three-dimensional space, a large number of rays are emitted from a light source (ideally, as many as there are photons in reality), and the history of each ray is traced. When a ray meets an obstacle, several events are possible: absorption, reflection, refraction. The probability of each depends on the material of the obstacle and the color of the ray. The rays that reach the camera are drawn on the screen. You can read about it in more detail on Wikipedia.
I will only add that in optics the two directions of time are equivalent: a light path can be traversed either way. It follows that you can trace rays not from the source to the eye, but from the eye to the source. This is far more practical: path tracing is expensive as it is, and you cannot afford to spend time tracing rays that have no chance of ending up on the screen.
Idea
What if we traced not rays but photons? Isn't that the same thing? Let me clarify. In each direction we emit many monochromatic photons at once. Where there used to be a single ray, there are now 16, 32, 64 ... as many photons as you can afford, with wavelengths of 720, 700, ..., 450 nm, and each of them can have its own history. This matters only when the material deflects photons of different frequencies in different ways.
From school physics we know that this is exactly what causes the dispersion of light: dispersion occurs when the refractive index depends on the photon's frequency. It follows that the old approach cannot reproduce dispersion at all; it cannot "honestly" render, for example, a diamond.
In pseudocode it looks something like this:
for i = 0..screen_h-1
    for j = 0..screen_w-1
        double x = 2*j/screen_w - 1;
        double y = 2*i/screen_h - 1;
        table[i, j] = norm(center + x*left + y*up);

for n = 0..samples_num-1
    for i = 0..screen_h-1
        for j = 0..screen_w-1
            for freq = 0..color_quality-1
                power = 0;
                Ray r = ray_from_vecs(eye, table[i, j]);
                trace(&r, &power, freq);
                screen[i*screen_w + j] = screen[i*screen_w + j] + power*rgb_from_freq(freq);

for i = 0..screen_h-1
    for j = 0..screen_w-1
        screen[i*screen_w + j] /= samples_num;
Here, I think, everything is clear. freq is the frequency of the light wave, screen_w and screen_h are the width and height of the screen in pixels, and samples_num is the number of samples. The true color is the arithmetic mean of all samples as the number of samples tends to infinity. The photon direction vectors are best precomputed and stored in a table (table) so they are not recalculated every time. They are computed by a simple formula involving the vectors that point to the center of the screen and to its left and top edges. For example, if the aspect ratio is 4:3, you can choose
center = (0, 0, 0)
left = (4, 0, 0)
up = (0, 3, 0)
Of course, it all depends on the implementation.
Now about the trace function: the "magic" happens inside it. It takes the position and direction of a ray, as well as its frequency; power must be zero on entry. It returns a new position, direction, and power. The position and direction are of no interest to us; they exist only so that trace can be called recursively.
Color is an array whose size equals the number of frequency gradations.
Pseudocode again:
void trace() { Vec n;
Of course, this pseudocode does not claim to be complete. You could, for example, store separate probabilities for refraction and for specular reflection, or complicate the model even further. The main thing is that the probabilities and the refractive index depend on the frequency, and each photon has its own.
I looked through the images Google returns for path tracing and saw no light dispersion. A search for monochromatic path tracing showed nothing interesting either: on the entire first page these words are separated by others, and none of the articles contain my idea.
Of course, it has most likely already been implemented and described in books I have not yet gotten to.
Consequences
You do not need to be a genius to see that tracing monochromatic photons costs 16, 32, ... times as much as the standard algorithm (depending on the color quality). Path tracing in real time on an ordinary user's hardware is impossible today, and it is unclear when it will become possible; "monochromatic" tracing is even more expensive still.