
git.blender.org/blender.git
author    Clément Foucault <foucault.clem@gmail.com>  2020-06-23 14:59:55 +0300
committer Clément Foucault <foucault.clem@gmail.com>  2020-06-23 15:04:41 +0300
commit    439b40e601f8cdae9a12fc3f503e9e6acdd596d5 (patch)
tree      9a485ec18d1c9dd030ffdfe9309193adc96dc515 /source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl
parent    cc3e808ab47887c002faaa8a28318a2b4f47e02a (diff)
EEVEE: Motion Blur: Add accumulation motion blur for better precision
This revisits the render pipeline to support time slicing for better motion blur.

We support accumulation with or without the post-process motion blur. If using the post-process, we reuse the last step's next motion data to avoid another scene reevaluation.

This also adds support for hair motion blur, which is handled in a similar way as mesh motion blur.

The total number of samples is distributed evenly across all time steps to avoid sample weighting issues. For this reason, the sample count is (internally) rounded up to the next multiple of the step count.

Only FX Motion Blur: {F8632258}
FX Motion Blur + 4 time steps: {F8632260}
FX Motion Blur + 32 time steps: {F8632261}

Reviewed By: jbakker

Differential Revision: https://developer.blender.org/D8079
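As a rough illustration of the sampling scheme described above, here is a minimal C sketch of how a sample count can be rounded up and distributed evenly across time steps. This is not EEVEE's actual code: `divide_ceil_u`, the normalized step times, and the bookkeeping are illustrative stand-ins for what the commit message describes.

```c
#include <stdio.h>

/* Round a/b up so every time step gets the same integer sample count;
 * identical per-step counts are what avoid sample weighting issues. */
static unsigned int divide_ceil_u(unsigned int a, unsigned int b)
{
  return (a + b - 1) / b;
}

int main(void)
{
  unsigned int requested_samples = 50; /* e.g. user-set render samples */
  unsigned int step_count = 4;         /* motion blur time steps */

  unsigned int samples_per_step = divide_ceil_u(requested_samples, step_count);
  unsigned int effective_samples = samples_per_step * step_count;

  printf("rendering %u samples (%u per step over %u steps)\n",
         effective_samples, samples_per_step, step_count);

  for (unsigned int step = 0; step < step_count; step++) {
    /* Times normalized over the shutter interval (an assumption made for
     * this sketch). Consecutive steps share a boundary: the "next" time of
     * step i is the "prev" time of step i + 1, so its motion data can be
     * reused instead of re-evaluating the scene, as the message notes. */
    float prev_t = (float)step / (float)step_count;
    float next_t = (float)(step + 1) / (float)step_count;
    printf("step %u: prev=%.3f next=%.3f, %u samples\n",
           step, prev_t, next_t, samples_per_step);
  }
  return 0;
}
```

With the numbers above, 50 requested samples become 52 effective samples (13 per step over 4 steps), which is the internal round-up the commit message mentions.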
Diffstat (limited to 'source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl')
-rw-r--r--  source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl | 31 +++++++++++++++++++++++++++++++
 1 file changed, 31 insertions(+), 0 deletions(-)
diff --git a/source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl b/source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl
index f9011f5ae4e..95cd69ba310 100644
--- a/source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl
+++ b/source/blender/draw/engines/eevee/shaders/object_motion_vert.glsl
@@ -4,9 +4,14 @@ uniform mat4 prevModelMatrix;
 uniform mat4 nextModelMatrix;
 uniform bool useDeform;
 
+#ifdef HAIR
+uniform samplerBuffer prvBuffer; /* RGBA32F */
+uniform samplerBuffer nxtBuffer; /* RGBA32F */
+#else
 in vec3 pos;
 in vec3 prv; /* Previous frame position. */
 in vec3 nxt; /* Next frame position. */
+#endif
 
 out vec3 currWorldPos;
 out vec3 prevWorldPos;
@@ -14,10 +19,36 @@ out vec3 nextWorldPos;
 
 void main()
 {
+#ifdef HAIR
+  bool is_persp = (ProjectionMatrix[3][3] == 0.0);
+  float time, thick_time, thickness;
+  vec3 tan, binor;
+  vec3 wpos;
+
+  hair_get_pos_tan_binor_time(is_persp,
+                              ModelMatrixInverse,
+                              ViewMatrixInverse[3].xyz,
+                              ViewMatrixInverse[2].xyz,
+                              wpos,
+                              tan,
+                              binor,
+                              time,
+                              thickness,
+                              thick_time);
+
+  int id = hair_get_base_id();
+  vec3 pos = texelFetch(hairPointBuffer, id).point_position;
+  vec3 prv = texelFetch(prvBuffer, id).point_position;
+  vec3 nxt = texelFetch(nxtBuffer, id).point_position;
+#endif
   prevWorldPos = (prevModelMatrix * vec4(useDeform ? prv : pos, 1.0)).xyz;
   currWorldPos = (currModelMatrix * vec4(pos, 1.0)).xyz;
   nextWorldPos = (nextModelMatrix * vec4(useDeform ? nxt : pos, 1.0)).xyz;
   /* Use jittered projmatrix to be able to match exact sample depth (depth equal test).
    * Note that currModelMatrix needs to also be equal to ModelMatrix for the samples to match. */
+#ifndef HAIR
   gl_Position = ViewProjectionMatrix * vec4(currWorldPos, 1.0);
+#else
+  gl_Position = ViewProjectionMatrix * vec4(wpos, 1.0);
+#endif
 }
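The comment about the jittered projection matrix refers to the motion pass being drawn against the depth buffer laid down by the main pass with an equal depth test: any difference in the projected position would fail the test and drop the fragment, which is why the shader must reproduce the exact same matrices. A minimal sketch of that pattern in raw OpenGL follows; the GL state calls are real, but `draw_motion_pass` is a hypothetical placeholder, and EEVEE itself goes through the DRW manager rather than direct GL.

```c
#include <GL/gl.h>

/* Hypothetical placeholder: issue the motion-vector geometry pass here. */
static void draw_motion_pass(void)
{
}

void draw_motion_pass_with_equal_depth(void)
{
  /* Keep the depth buffer from the main pass and accept only fragments
   * whose depth matches it exactly. This is why the shader above must use
   * the same jittered projection and model matrices as the main pass. */
  glEnable(GL_DEPTH_TEST);
  glDepthFunc(GL_EQUAL);
  glDepthMask(GL_FALSE); /* depth is read-only in this pass */

  draw_motion_pass();

  /* Restore common defaults. */
  glDepthFunc(GL_LESS);
  glDepthMask(GL_TRUE);
}
```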