[VFX] Fix compilation error when using cubemap arrays #5582

Merged: 1 commit, Sep 9, 2021
8 changes: 8 additions & 0 deletions com.unity.visualeffectgraph/Shaders/VFXCommon.hlsl
@@ -38,6 +38,14 @@
#define UNITY_INV_HALF_PI 0.636619772367f
#endif

// SHADER_AVAILABLE_XXX defines are not yet passed to compute shaders,
// so we define this one manually for the compute stage.
// It won't compile on devices without cubemap array support, but that is acceptable for now.
// TODO: Remove this once SHADER_AVAILABLE_XXX defines are passed to compute shaders.
#ifdef SHADER_STAGE_COMPUTE
#define SHADER_AVAILABLE_CUBEARRAY 1
#endif

struct VFXSampler2D
{
Texture2D t;
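For context, the point of force-defining `SHADER_AVAILABLE_CUBEARRAY` in compute stages is that shared shader code can guard cubemap-array sampling behind that macro. A minimal sketch of such guarded code follows; the resource, sampler, and function names here are hypothetical illustrations, not part of the PR:

```hlsl
// Hypothetical consumer of the macro: only declare and sample a
// TextureCubeArray when the platform (or this compute-stage override)
// reports cubemap array support.
#if SHADER_AVAILABLE_CUBEARRAY
TextureCubeArray _EnvCubeArray;   // hypothetical resource name
SamplerState     sampler_EnvCubeArray;

float4 SampleEnvCubeArray(float3 dir, float slice, float mipLevel)
{
    // TextureCubeArray.SampleLevel takes a float4: direction xyz + array slice w.
    return _EnvCubeArray.SampleLevel(sampler_EnvCubeArray, float4(dir, slice), mipLevel);
}
#else
float4 SampleEnvCubeArray(float3 dir, float slice, float mipLevel)
{
    // Fallback when cubemap arrays are unavailable.
    return float4(0, 0, 0, 1);
}
#endif
```

Because the define is hard-coded to 1 for compute, the `#if` branch is always taken there, which is why the diff comment notes that compilation will fail on devices lacking cubemap array support.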