Sampling¶
Reference
- Panel:
The integrator is the rendering algorithm used to compute the lighting. Cycles currently supports a path tracing integrator with direct light sampling. It works well for various lighting setups, but is not as suitable for caustics and some other complex lighting situations.
Rays are traced from the camera into the scene, bouncing around until they find a light source such as a light, an object emitting light, or the world background. To find lights and surfaces emitting light, both indirect light sampling (letting the ray follow the surface BSDF) and direct light sampling (picking a light source and tracing a ray towards it) are used.
- Viewport Samples
Number of samples for viewport rendering. Setting this value to zero enables indefinite sampling of the viewport.
- Render Samples
Number of paths to trace for each pixel in the final render. As more samples are taken, the solution becomes less noisy and more accurate.
- Time Limit
Renders the scene until the time limit or the sample count is reached. When the time limit is set to 0, the sample count is used to determine when the render stops.
Note
The time limit does not include pre-render processing time, only render time.
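As a rough illustration, these controls are also exposed to Python through the scene's Cycles settings. The following is a minimal sketch assuming a recent Blender version; property names may differ in older releases:

    import bpy

    # Sketch: drive the sample counts and time limit from Python.
    # Property names assume a recent Blender version.
    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    scene.cycles.preview_samples = 0   # Viewport Samples; 0 samples the viewport indefinitely
    scene.cycles.samples = 1024        # Render Samples per pixel
    scene.cycles.time_limit = 120.0    # Time Limit in seconds; 0 means only the sample count is used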
Adaptive Sampling¶
With adaptive sampling, Cycles automatically reduces the number of samples in areas that have little noise, for faster rendering and a more even noise distribution. For example, hair on a character may need many samples, but the background may need very few.
Adaptive sampling also makes it possible to render images with a target amount of noise. This is done by setting the Noise Threshold; typical values are in the range from 0.1 to 0.001. The render samples can then be set to a high value, and the renderer will automatically choose the appropriate number of samples.
- Noise Threshold
The error threshold to decide whether to continue sampling a pixel or not. Typical values are in the range from 0.1 to 0.001, with lower values meaning less noise. Setting it to exactly 0 lets Cycles guess an automatic value for it based on the total sample count.
- Min Samples
The minimum number of samples a pixel receives before adaptive sampling is applied. When set to 0 (default), it is automatically set to a value determined by the Noise Threshold.
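As a sketch of how these settings combine, the following snippet targets a noise level rather than a fixed sample count (property names assume a recent Blender version):

    import bpy

    # Sketch: render to a target noise level instead of a fixed sample count.
    cycles = bpy.context.scene.cycles
    cycles.use_adaptive_sampling = True
    cycles.samples = 4096                # upper bound; adaptive sampling may stop earlier
    cycles.adaptive_threshold = 0.01     # Noise Threshold; 0 picks a value automatically
    cycles.adaptive_min_samples = 0      # Min Samples; 0 derives it from the Noise Threshold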
Denoising¶
Denoising removes noise both while previewing scenes in Rendered mode in the 3D Viewport and in final renders.
- Render
Denoising for the final render can be enabled or disabled with the checkbox. The denoising data render passes, used for denoising the image after rendering with the Denoise node, also adapt to the selected denoiser.
- OpenImageDenoise:
Uses Intel’s Open Image Denoise, an AI denoiser. Typically provides the highest quality, and is the default.
- OptiX:
Uses NVIDIA’s OptiX AI denoiser. Supports GPU acceleration on some older NVIDIA GPUs where OpenImageDenoise does not.
Only available on NVIDIA GPUs when configured in the Cycles Render Device user preferences.
- Viewport
Denoising for the Rendered mode in the 3D Viewport can be enabled or disabled with the checkbox.
- Automatic:
Uses GPU accelerated denoising if supported, for best performance. Prefers OpenImageDenoise over OptiX.
- OpenImageDenoise:
Uses Intel’s Open Image Denoise, an AI denoiser. Typically provides the highest quality.
- OptiX:
Uses NVIDIA’s OptiX AI denoiser. Supports GPU acceleration on some older NVIDIA GPUs where OpenImageDenoise does not.
Only available on NVIDIA GPUs when configured in the Cycles Render Device user preferences.
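The render and viewport denoisers above can also be configured from Python. A minimal sketch, assuming a recent Blender version and the enum identifiers shown in the comments:

    import bpy

    # Sketch: enable denoising and pick a denoiser for final renders and the viewport.
    # Enum identifiers are assumptions for a recent Blender version.
    cycles = bpy.context.scene.cycles

    cycles.use_denoising = True                # final render
    cycles.denoiser = 'OPENIMAGEDENOISE'       # or 'OPTIX' on supported NVIDIA GPUs

    cycles.use_preview_denoising = True        # Rendered mode in the 3D Viewport
    cycles.preview_denoiser = 'AUTO'           # Automatic: prefers GPU-accelerated denoising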
- Passes
Controls which Render Pass the denoiser should use as input, which can have different effects on the denoised image. Generally, the more passes the denoiser has to work with, the better the result. It is recommended to use at least Albedo, as None can blur out details, especially at lower sample counts.
- None:
Denoises the image using color data.
- Albedo:
Denoises the image using color and albedo data.
- Albedo + Normal:
Denoises the image using color, albedo, and normal pass data.
- Prefilter OpenImageDenoise
Controls whether or not prefiltering is applied to Input Passes for use when denoising. Visible only when using OpenImageDenoise.
- None:
Does not apply any prefiltering to the input passes. This option retains the most detail and is the fastest, but assumes the input passes are noise free which may require a high sample count. If the input passes aren’t noise free, then noise will remain in the image after denoising.
- Fast:
Assumes the input passes are not noise free, yet does not apply prefiltering to the input passes. This option is faster than Accurate but produces a blurrier result.
- Accurate:
Prefilters the input passes before denoising to reduce noise. This option usually produces more detailed results than Fast with increased processing time.
- Quality OpenImageDenoise
Overall denoising quality. Visible only when using OpenImageDenoise.
- High:
Produces the highest quality output at the cost of time.
- Balanced:
Balanced between performance and quality.
- Fast:
Produces an output fast at the cost of quality (ideal for viewport rendering).
- Start Sample
Sample to start denoising in the 3D Viewport.
- Use GPU
Perform denoising on the GPU. This is significantly faster than on the CPU, but requires additional GPU memory. For large scenes that need that memory for rendering, this option can be disabled.
See GPU Rendering for details on supported GPUs.
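The remaining denoising options map to Cycles properties as well. The sketch below assumes a recent Blender version; some of these properties (notably the quality and GPU toggles) are newer additions and may not exist in older builds:

    import bpy

    # Sketch: denoiser inputs and quality trade-offs (assumed property/enum names).
    cycles = bpy.context.scene.cycles

    cycles.denoising_input_passes = 'RGB_ALBEDO_NORMAL'  # None / Albedo / Albedo + Normal
    cycles.denoising_prefilter = 'ACCURATE'              # 'NONE', 'FAST' or 'ACCURATE'
    cycles.denoising_quality = 'HIGH'                    # 'HIGH', 'BALANCED' or 'FAST' (newer builds)
    cycles.denoising_use_gpu = True                      # disable when GPU memory is tight

    cycles.preview_denoising_start_sample = 1            # Start Sample for the viewport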
Path Guiding¶
Path guiding helps reduce noise in scenes where regular path tracing has difficulty finding a path to a light, for example when a room is lit through a small door opening. Important light directions are learned over time, improving as more samples are taken. Guiding is supported for surfaces with diffuse BSDFs and volumes with isotropic and anisotropic scattering.
Note
Path guiding is only available when rendering on a CPU.
While path guiding helps render caustics in some scenes, it is not designed for complex caustics as they are harder to learn and guide.
- Training Samples
The maximum number of samples to use for training. A value of 0 keeps training until the end of the render. Usually 128 to 256 training samples are enough for accurate guiding. Higher values can lead to a minor increase in guiding quality, but with increased render times.
- Surface
Enable path guiding for the diffuse and glossy components of surfaces.
- Volume
Enable path guiding inside volumes.
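A minimal Python sketch of enabling path guiding, assuming a recent Blender version (guiding requires CPU rendering):

    import bpy

    # Sketch: enable path guiding; it is only applied when rendering on the CPU.
    cycles = bpy.context.scene.cycles
    cycles.device = 'CPU'

    cycles.use_guiding = True
    cycles.guiding_training_samples = 128   # 0 keeps training until the end of the render
    cycles.use_surface_guiding = True       # diffuse and glossy surface components
    cycles.use_volume_guiding = True        # volumes with isotropic/anisotropic scattering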
Lights¶
- Light Tree
Use a light tree to more effectively sample lights in the scene, taking into account distance and estimated intensity. This can significantly reduce noise, at the cost of a somewhat longer render time per sample.
Certain lighting properties are not accounted for in the light tree. These include custom falloff, ray visibility, and complex shader node setups, including textures. This can result in increased noise in scenes that make use of these features.
Note, this feature is currently disabled for AMD GPUs on macOS.
- Light Threshold
Probabilistically terminates light samples when the light contribution is below this threshold (more noise but faster rendering). Zero disables the test and never ignores lights. This is useful because in large scenes with many light sources, some lights contribute only a small amount to the final image while still increasing render times. Using this setting can decrease the time spent calculating rays that in the end have very little effect on the image.
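A short sketch of the corresponding Python properties, assuming a recent Blender version:

    import bpy

    # Sketch: light sampling controls.
    cycles = bpy.context.scene.cycles
    cycles.use_light_tree = True             # sample lights by distance and estimated intensity
    cycles.light_sampling_threshold = 0.01   # Light Threshold; 0 disables the test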
Advanced¶
- Pattern
The random sampling pattern used by the integrator.
- Automatic:
Uses Blue-Noise (see below), but for viewport rendering it optimizes the quality of the first sample to give a cleaner interactive preview.
- Classic:
Use pre-computed tables of Owen-scrambled Sobol for random sampling.
- Blue-Noise:
Use a blue-noise pattern, which optimizes the frequency distribution of noise, for random sampling. This results in an output that appears smoother despite not being less noisy overall.
- Seed
Seed value for the integrator, to get different noise patterns.
- Use Animated Seed (clock icon)
Changes the seed for each frame. It is a good idea to enable this when rendering animations because a varying noise pattern is less noticeable.
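As a sketch, the pattern and seed settings can also be set from Python; the sampling-pattern enum identifier used here is an assumption and differs between Blender versions:

    import bpy

    # Sketch: sampling pattern and seed (enum identifiers are assumed).
    cycles = bpy.context.scene.cycles
    cycles.sampling_pattern = 'AUTOMATIC'   # or a classic / blue-noise pattern, depending on the build
    cycles.seed = 0
    cycles.use_animated_seed = True         # vary the noise pattern per frame in animations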
- Sample Offset
The number of samples to skip when starting the render. This can be used to distribute a render across multiple computers and then combine the images with bpy.ops.cycles.merge_images.
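For example, a 512-sample render could be split across two machines by giving each half of the sample range. This is only a sketch of the idea; check the merge operator's actual arguments in your Blender build:

    import bpy

    # Sketch: distribute one 512-sample render across two machines.
    cycles = bpy.context.scene.cycles
    cycles.samples = 256

    # Machine A renders samples 0-255:
    cycles.sample_offset = 0
    # Machine B renders samples 256-511:
    # cycles.sample_offset = 256

    # The two saved images can afterwards be combined with bpy.ops.cycles.merge_images;
    # consult the operator's documentation for its arguments.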
- Scrambling Distance
These properties are not compatible with the Blue-Noise sampling pattern.
- Automatic
Uses a formula to adapt the scrambling distance strength based on the sample count.
- Viewport
Uses the Scrambling Distance value for viewport rendering. This can make rendering faster but may cause flickering.
- Multiplier
Lower values reduce randomization between pixels to improve GPU rendering performance, at the cost of possible rendering artifacts if set too low.
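A sketch of these options in Python, assuming a recent Blender version:

    import bpy

    # Sketch: scrambling distance tuning (not compatible with the Blue-Noise pattern).
    cycles = bpy.context.scene.cycles
    cycles.auto_scrambling_distance = True     # Automatic: adapt strength to the sample count
    cycles.preview_scrambling_distance = True  # also apply during viewport rendering
    cycles.scrambling_distance = 0.5           # Multiplier; too low a value may cause artifacts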
- Min Light Bounces
Minimum number of light bounces for each path, after which the integrator uses Russian Roulette to terminate paths that contribute less to the image. Setting this higher gives less noise, but may also increase render time considerably. For a low number of bounces, it is strongly recommended to set this equal to the maximum number of bounces.
- Min Transparent Bounces
Minimum number of transparent bounces (more specifically “passthroughs”). Setting this higher reduces noise in the first bounces, but can also be less efficient for more complex geometry like hair and volumes.
- Layer Samples
When render layers have a per-layer number of samples set, this option specifies how to use them.
- Use:
The render layer samples will override the set scene samples.
- Bounded:
Bound render layer samples by scene samples.
- Ignore:
Ignore render layer sample settings.
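The bounce minimums and per-layer sample handling are likewise available from Python. A sketch assuming a recent Blender version, a view layer named "ViewLayer", and the enum identifiers noted in the comments:

    import bpy

    # Sketch: Russian-roulette minimums and per-layer sample overrides.
    scene = bpy.context.scene
    scene.cycles.min_light_bounces = 0
    scene.cycles.min_transparent_bounces = 0

    # A view layer can carry its own sample count; Layer Samples decides whether
    # that count overrides, bounds, or is ignored relative to the scene samples.
    scene.cycles.use_layer_samples = 'BOUNDED'   # 'USE', 'BOUNDED' or 'IGNORE' (assumed identifiers)
    scene.view_layers["ViewLayer"].samples = 64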