
Quasi-Monte Carlo (QMC) and traditional Monte Carlo methods differ in how thoroughly their samples cover the sample space, because they generate samples in fundamentally different ways. Traditional Monte Carlo relies on independent random sampling, which can produce uneven coverage: by chance, some regions are oversampled while others are undersampled or missed entirely. This randomness raises the variance of the estimates, and since the error shrinks only at the rate O(1/√N), a large number of samples may be needed to reach a desired precision.
QMC methods, on the other hand, draw samples from deterministic low-discrepancy sequences (such as Halton or Sobol sequences), which are constructed to cover the sample space more evenly than random points. This more uniform coverage reduces the variance of the estimates and yields more accurate approximations of integrals over multidimensional domains; for sufficiently smooth integrands in s dimensions, the error decreases roughly as O((log N)^s / N) rather than O(1/√N). The even spread of low-discrepancy points is particularly valuable in computer graphics, where it helps render images more efficiently and with less visible noise.
In summary, QMC methods cover the sample space more evenly than traditional Monte Carlo because they replace random sampling with deterministic low-discrepancy sequences. This yields more accurate estimates with lower variance, and it is especially advantageous in high-dimensional problems and in applications where uniformity along particular dimensions is crucial.
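To make the contrast concrete, the sketch below (a minimal, self-contained Python illustration; the function names are my own, not a standard API) estimates the integral of f(x, y) = x·y over the unit square, whose exact value is 0.25, using both plain random sampling and a hand-rolled 2-D Halton sequence. With the same sample count, the low-discrepancy estimate is typically noticeably closer to the true value.

```python
import random

def radical_inverse(n, base):
    # Reverse the base-b digits of n around the radix point -> a value in [0, 1).
    inv, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, digit = divmod(n, base)
        inv += digit / denom
    return inv

def halton_2d(n_points):
    # 2-D Halton sequence: radical inverses in coprime bases 2 and 3,
    # a classic low-discrepancy construction.
    return [(radical_inverse(i, 2), radical_inverse(i, 3))
            for i in range(1, n_points + 1)]

def estimate(points, f):
    # Average of f over the point set approximates the integral over [0,1]^2.
    return sum(f(x, y) for x, y in points) / len(points)

f = lambda x, y: x * y          # exact integral over [0,1]^2 is 1/4
n = 4096
random.seed(0)
mc_pts = [(random.random(), random.random()) for _ in range(n)]
qmc_pts = halton_2d(n)
print("MC error: ", abs(estimate(mc_pts, f) - 0.25))
print("QMC error:", abs(estimate(qmc_pts, f) - 0.25))
```

In practice one would use a library implementation (e.g. a scrambled Sobol generator) rather than a hand-rolled Halton sequence, but the hand-rolled version makes the construction visible.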

In the quasi-Monte Carlo approach, discrepancy metrics quantify how uniformly a point set covers the sample space. Informally, the discrepancy measures the worst-case gap between the fraction of points falling in a subregion (such as an axis-aligned box) and that subregion's volume: for a perfectly uniform point set the two would always agree. The goal is a low-discrepancy point set, whose points are spread evenly throughout the space, because low-discrepancy points allow integrals over multidimensional spaces to be approximated more accurately and efficiently. Discrepancy metrics thus serve both to assess the quality of a point set and to ensure better performance in quasi-Monte Carlo simulations.
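As a small illustration of such a metric, the 1-D star discrepancy has a simple exact formula for a sorted point set, and comparing it for random points versus a van der Corput sequence shows how much more evenly the low-discrepancy points are spread. The sketch below is a minimal illustration with hypothetical function names, not a library API.

```python
import random

def van_der_corput(n, base=2):
    # Radical-inverse sequence in the given base: reversing the digits of n
    # around the radix point yields a low-discrepancy point in [0, 1).
    inv, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, digit = divmod(n, base)
        inv += digit / denom
    return inv

def star_discrepancy_1d(points):
    # Exact 1-D star discrepancy via the closed form for sorted points:
    # D*_N = max_i max(i/N - x_(i), x_(i) - (i-1)/N)
    xs = sorted(points)
    n = len(xs)
    return max(max(i / n - x, x - (i - 1) / n)
               for i, x in enumerate(xs, start=1))

random.seed(0)
n = 256
random_pts = [random.random() for _ in range(n)]
lowdisc_pts = [van_der_corput(i) for i in range(1, n + 1)]
print("random D*: ", star_discrepancy_1d(random_pts))   # typically around 0.05
print("van der Corput D*:", star_discrepancy_1d(lowdisc_pts))  # near 1/n
```

The van der Corput points achieve a discrepancy close to the best possible for n points, roughly an order of magnitude below what random sampling gives at the same n.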

Monte Carlo (MC) methods are a class of computational algorithms that rely on repeated random sampling to obtain numerical results. They are used to simulate and approximate complex real-world systems, especially when other mathematical methods are difficult or impossible to apply. MC methods are widely used in financial mathematics, numerical integration, and optimization, particularly for risk analysis and derivative pricing. They also appear in physics, engineering, climate and radiative-forcing studies, computational biology, and artificial intelligence, among other areas.
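A classic toy instance of the idea is estimating π by random sampling: draw points uniformly in the unit square and count the fraction that land inside the quarter disc of radius 1, which has area π/4. The sketch below is a minimal illustration (the function name is my own, not a standard API).

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    # Sample uniformly in [0,1]^2 and count hits inside x^2 + y^2 <= 1.
    # The hit fraction estimates pi/4, so multiplying by 4 estimates pi.
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples

print(monte_carlo_pi(100_000))
```

The estimate's standard error shrinks like 1/√N, so each extra decimal digit of accuracy costs roughly 100 times more samples, which is exactly the inefficiency that quasi-Monte Carlo point sets aim to reduce.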