
Pipeline DSL Proposal


Background

After writing the shader code itself, actually providing the shaders with data and invoking them is still a difficult process. I call this process of taking information, transforming it, and feeding it to shaders a pipeline. The shader code itself is not addressed in this proposal.

WebGL rendering is effectively a pure function: it takes some input and writes to the screen. Since we don't read data back from the screen, this is a unidirectional data flow, so rendering is just a matter of defining the input required for a shader. The Regl library uses this abstraction, calling a render of a shader a command: https://github.com/regl-project/regl/blob/gh-pages/API.md#commands

However, because Regl doesn't parse your shader code, it doesn't know whether you supplied all the data the shader needs. The best it can do is have you specify what the shader needs when defining a command, and check at runtime that you passed in the values you said you would.
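
For reference, here is roughly what a minimal regl command looks like (adapted from the regl documentation linked above). Note that forgetting to pass color when calling drawTriangle is only caught when the command runs, not when it is defined:

const regl = require('regl')();

const drawTriangle = regl({
  vert: `
    attribute vec2 position;
    void main() {
      gl_Position = vec4(position, 0, 1);
    }`,
  frag: `
    precision mediump float;
    uniform vec4 color;
    void main() {
      gl_FragColor = color;
    }`,
  attributes: {
    position: [[0, -1], [-1, 0], [1, 1]]
  },
  uniforms: {
    // The value is looked up on the object passed in at call time
    color: regl.prop('color')
  },
  count: 3
});

// This is the runtime check described above: omitting `color` here only
// fails when the command executes.
drawTriangle({ color: [1, 0, 0, 1] });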

This proposal aims to move the logic of calling a Regl-like command into a DSL so that we can typecheck it.

Proposal 1: YAML

YAML file defining a pipeline:

# Demo Calder pipeline, loosely based off of the SAO demo
---


#############################################################
# Register inputs needed at compile time
#############################################################

width: !StaticInput
  type: Number
  
height: !StaticInput
  type: Number


#############################################################
# Set up shaders
#############################################################

geometryPass: !Shader
  extensions: [ WEBGL_draw_buffers ]
  
  # Add our shader code inline (this is our higher level glsl)
  vertex: |
    attribute vec3 vertexPosition;
    
    // Use our other high level abstractions in glsl like normal
    buffer position;
    buffer normal;
    buffer diffuse;
    // ...
    
    void main() {
      // ...
    }
    
  # Or use a function to load from a file
  fragment: !LoadFile "geometry-fragment.cgl"
  
aoPass: !PixelShader
  extensions: [ WEBGL_draw_buffers ]
  fragment: !LoadFile "ao-final.cgl"
  
finalPass: !PixelShader
  extensions: [ WEBGL_draw_buffers ]
  fragment: !LoadFile "final-fragment.cgl"
  

#############################################################
# Set up buffers
#############################################################

position: !Texture2D
  type: RGBA
  format: FLOAT # implicitly requires OES_texture_float extension
  width: !Ref width # Reference a value from elsewhere in this file
  height: !Ref height
  magFilter: NEAREST
  minFilter: NEAREST
  wrapS: CLAMP_TO_EDGE
  wrapT: CLAMP_TO_EDGE

# Alternatively, use default values for texture fields (and throw an error if something
# like width or height isn't specified)
normal: !Texture2D { type: RGBA, width: !Ref width, height: !Ref height, format: FLOAT }
diffuse: !Texture2D { type: RGBA, width: !Ref width, height: !Ref height }
ao: !Texture2D { type: RGBA, width: !Ref width, height: !Ref height, format: FLOAT }

# The default format for a type of DEPTH is different from that for a type of RGBA
# (DEPTH_COMPONENT versus UNSIGNED_BYTE, respectively). Implicitly requires WEBGL_depth_texture
depth: !Texture2D { type: DEPTH, width: !Ref width, height: !Ref height }


#############################################################
# Set up exported commands
#############################################################

draw: !Command
  
  # Specify the state that needs to be passed in
  using:
    tick: Number # Specify a type for scalars
    projection: Mat4 # Accepts an array of numbers
    meshes: !ArrayOf
      vertices: !ArrayOf Number
      colors: !ArrayOf Number
      normals: !ArrayOf Number
      elements: !ArrayOf Number
  
  do:
    # Run geometry pass
    - !ForEach
      in: !Ref meshes
      as: mesh
      do:
        - !RunShader
          shader: !Ref geometryPass
          with:
            vertexPosition: !Ref mesh.vertices
            vertexColor: !Ref mesh.colors
            vertexNormal: !Ref mesh.normals
            position: !Ref position
            normal: !Ref normal
            diffuse: !Ref diffuse
            # `depth` is required if a shader renders to buffers and not the screen;
            # you must explicitly pass `null` if you don't want to use one
            depth: !Ref depth
            
            projection: !Ref projection
            transform: !RotateMatrix
              matrix: !TranslateMatrix { matrix: !IdentityMatrix {}, by: [0, 0, -40] }
              axis: [0, 1, 0]
              amount: !Mult [!Ref tick, 0.1]
          elements: !Ref mesh.elements # Use `count` if not using drawElements
    
    # Run AO pass
    - !RunShader
      shader: !Ref aoPass
      # No need to pass elements to a pixel shader
      with:
        position: !Ref position
        normal: !Ref normal
        diffuse: !Ref diffuse
        ao: !Ref ao
        depth: !Ref depth
   
    # Run final pass
    - !RunShader
      shader: !Ref finalPass
      with:
        position: !Ref position
        normal: !Ref normal
        diffuse: !Ref diffuse
        ao: !Ref ao
        depth: !Ref depth

Usage in JS:

import sao from './sao'; // Calder generates a js module for you to import
import { mat4 } from 'gl-matrix'; // assuming gl-matrix for the matrix math below

const canvas = document.getElementById('stage');
const gl = canvas.getContext('webgl');

const saoInstance = sao.build({
  width: canvas.width,
  height: canvas.height
});

let tick = 0;

const projection = mat4.create();
// assume these constants are defined somewhere
mat4.perspective(projection, fieldOfView, aspect, zNear, zFar);

const meshes = [
  { /* teapot ... */ },
  { /* ground ... */ }
];

function eachFrame() {
  saoInstance.draw({
    tick,
    meshes,
    projection
  });
  
  requestAnimationFrame(eachFrame);
}

eachFrame();

Syntax

Given all the raw information needed for a render, translating it into the format needed by the shader should be a declarative transformation, and can be modelled as markup. For that reason, rather than implementing a full programming language, the DSL can simply be a markup language defining those transformations.

I chose to use YAML for the syntax for a few reasons:

  • It is basically a literal abstract syntax tree. If we have time later, we can write our own language that produces an AST similar to the YAML we would write, but this will save us valuable time to begin with.
  • It is standardized. We don't need to parse it ourselves, and we don't have to write highlighters for every editor.
  • It is flexible. It's less verbose to write than XML or JSON. For example, it supports multiline string literals!

Semantics

I make extensive use of tags (the !-prefixed terms). Tags essentially act as functions: the node that gets tagged is the input to the function.

I use the !Ref tag to reference a value that should be "in scope". This can reference top-level definitions, and also variables produced by functions that take in other functions (e.g. !ForEach.)
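
As a sketch of how this could be implemented without writing our own parser (assuming js-yaml 3.x; the node shapes and the pipelineSource variable are placeholders), each tag can be registered as a custom YAML type so that loading the document already produces tagged AST nodes:

const yaml = require('js-yaml');

// Each tag becomes a js-yaml Type whose construct function returns a plain
// AST node. The node shapes here are hypothetical.
const RefType = new yaml.Type('!Ref', {
  kind: 'scalar',
  construct: (name) => ({ node: 'ref', name })
});

const MultType = new yaml.Type('!Mult', {
  kind: 'sequence',
  construct: (args) => ({ node: 'mult', args })
});

const Texture2DType = new yaml.Type('!Texture2D', {
  kind: 'mapping',
  construct: (options) => ({ node: 'texture2d', ...options })
});

// Loading the pipeline source yields a data tree with the tagged nodes in
// place, ready for typechecking and code generation.
const PIPELINE_SCHEMA = yaml.Schema.create([RefType, MultType, Texture2DType]);
const ast = yaml.load(pipelineSource, { schema: PIPELINE_SCHEMA });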

Here are the different top-level components that I believe we need to be able to define:

  • Compile-time constants. We need to define the sizes of textures upfront, and we need to get this information from Javascript-land.
  • Shaders. This includes the vertex shader code, the fragment shader code, and the extensions required. The GLSL code is Calder-augmented GLSL code.
  • Textures. These can basically have the same format as Regl's texture helpers, but embedded in our DSL.
  • Commands. These are similar to what Regl does, but typechecked. We specify the state needed and its types. Then, in the DSL, we declaratively write the transformation from raw data to buffers. Because the shader is part of the DSL, we have all the information we need to ensure that the code provides everything the shader expects, assuming the input matches (see the sketch after this list).
    • The transformation code will require us to have a good number of helper functions for things like mapping, reducing, iterating, arithmetic, and linear algebra.
    • YAML syntax isn't as concise as a properly defined language and ends up looking Lispy because every operation has to be a function (i.e., no infix operators). Let me know if you have any suggestions for how to make this better without bumping up to a full-on language that needs to be parsed.
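
For example (a purely hypothetical sketch: shader.inputs would come from parsing the Calder-augmented GLSL, and withBlock is the mapping under a command's with key), the core of the check could be comparing two sets of names:

// Hypothetical shapes: `shader.inputs` is the list of attributes, uniforms,
// and buffers declared by the parsed shader; `withBlock` is the mapping the
// command passes under `with:`.
function checkShaderCall(shader, withBlock) {
  const provided = new Set(Object.keys(withBlock));
  const errors = [];

  for (const input of shader.inputs) {
    if (!provided.has(input.name)) {
      errors.push(`Shader input "${input.name}" (${input.type}) is never provided`);
    }
  }

  for (const name of provided) {
    if (!shader.inputs.some((input) => input.name === name)) {
      errors.push(`"${name}" is passed in but the shader declares no such input`);
    }
  }

  return errors; // empty means the call is well-formed
}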

Proposal 2: Have separate compile-time Javascript

The basic idea here is to use Javascript to generate the exact same structure that you would encode in the YAML. We still need to represent the transformation as data rather than as a program, but if we use a program to generate this data at compile time, then the end user can effectively create their own macros for the parts they find verbose, and can use any Javascript libraries they want to generate the schema Calder expects (see the helper sketched after the listing below). Calder then generates more Javascript code, which is what gets run in the browser.

import {
  Num, Obj, ArrayOf, Mat4, Shader, PixelShader, LoadFile, Texture2D, Command,
  ForEach, RunShader, Elements, Ref, Mult, RotateMatrix, TranslateMatrix,
  IdentityMatrix, RGBA, DEPTH, FLOAT, NEAREST, CLAMP_TO_EDGE
} from 'calder-gl';

module.exports = {
  staticInput: {
    width: Num(),
    height: Num(),
  },
  
  shaders: {
    geometryPass: Shader({
      extensions: ["WEBGL_draw_buffers"],
      vertex: LoadFile("geometry_vertex.cgl"), // or write in a `` string literal
      fragment: LoadFile("geometry_fragment.cgl")
    }),
    aoPass: PixelShader({
      extensions: ["WEBGL_draw_buffers"],
      fragment: LoadFile("ao_fragment.cgl")
    }),
    finalPass: PixelShader({
      extensions: ["WEBGL_draw_buffers"],
      fragment: LoadFile("final_fragment.cgl")
    })
  },
  
  resources: {
    position: Texture2D({
      type: RGBA,
      format: FLOAT,
      width: Ref("width"),
      height: Ref("height"),
      magFilter: NEAREST,
      minFilter: NEAREST,
      wrapS: CLAMP_TO_EDGE,
      wrapT: CLAMP_TO_EDGE
    }),
    normal: Texture2D({
      type: RGBA,
      width: Ref("width"),
      height: Ref("height"),
      format: FLOAT
    }),
    diffuse: Texture2D({
      type: RGBA,
      width: Ref("width"),
      height: Ref("height"),
      format: FLOAT
    }),
    ao: Texture2D({
      type: RGBA,
      width: Ref("width"),
      height: Ref("height"),
      format: FLOAT
    }),
    depth: Texture2D({
      type: DEPTH,
      width: Ref("width"),
      height: Ref("height")
    })
  },
  
  commands: {
    draw: Command({
      using: {
        tick: Num(),
        projection: Mat4(), // accepts an array of numbers, as in the YAML version
        meshes: ArrayOf(Obj({
          vertices: ArrayOf(Num()),
          colors: ArrayOf(Num()),
          normals: ArrayOf(Num()),
          elements: ArrayOf(Num())
        }))
      },
      do: [
        ForEach(Ref("meshes"), [
          RunShader(Ref("geometryPass"), {
            vertexPosition: Ref("mesh.vertices"),
            vertexColor: Ref("mesh.colors"),
            vertexNormal: Ref("mesh.normals"),
            position: Ref("position"),
            normal: Ref("normal"),
            diffuse: Ref("diffuse"),
            depth: Ref("depth"),
            projection: Ref("projection"),
            transform: RotateMatrix(TranslateMatrix(IdentityMatrix(), [0, 0, -40]), [0, 1, 0], Mult(Ref("tick"), 0.1))
          }, Elements(Ref("mesh.elements")))
        ]),
        RunShader(Ref("aoPass"), {
          position: Ref("position"),
          normal: Ref("normal"),
          diffuse: Ref("diffuse"),
          ao: Ref("ao"),
          depth: Ref("depth")
        }),
        RunShader(Ref("finalPass"), {
          position: Ref("position"),
          normal: Ref("normal"),
          diffuse: Ref("diffuse"),
          ao: Ref("ao"),
          depth: Ref("depth")
        })
      ]
    })
  }
};
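
To illustrate the macro point from above: because the schema is just data built by ordinary Javascript, a user could write a helper like the one below (purely illustrative, not part of calder-gl) to collapse the repeated Texture2D entries in resources:

// Illustrative user-side helper: builds several identically-configured
// float textures keyed by name.
function gBufferTextures(names) {
  const textures = {};
  for (const name of names) {
    textures[name] = Texture2D({
      type: RGBA,
      format: FLOAT,
      width: Ref("width"),
      height: Ref("height")
    });
  }
  return textures;
}

// resources: {
//   ...gBufferTextures(["position", "normal", "diffuse", "ao"]),
//   depth: Texture2D({ type: DEPTH, width: Ref("width"), height: Ref("height") })
// }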

Implementation

The capitalized functions (imported from calder-gl to avoid a cgl. prefix on all of them) are basically just constructors for Plain Old Data objects. So our code would work like this:

  1. Run the user's Javascript to get the POD structure
  2. Using something like the visitor pattern, walk the data tree, passing along context as we go (what's in scope that a Ref can see), making type assertions. If anything fails, stop here, and call this a compiler error.
  3. If the typechecking is OK, walk the tree again, but this time generate Javascript code from each node to build up a package for other browser code to use (steps 1 and 2 are sketched below).
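
A rough sketch of steps 1 and 2, assuming the constructors return plain tagged objects (all node shapes and names here are hypothetical, and only Ref scoping is checked, not full types):

// Step 1: constructors like these just return plain tagged objects; nothing
// is executed or allocated when the user's module runs.
const Ref = (name) => ({ node: 'ref', name });
const ForEach = (collection, body) => ({ node: 'forEach', collection, body });

// Step 2: walk the data tree, carrying the set of names a Ref is allowed to
// see, and record a compiler error for anything out of scope.
function checkRefs(node, scope, errors) {
  if (Array.isArray(node)) {
    node.forEach((child) => checkRefs(child, scope, errors));
  } else if (node && typeof node === 'object') {
    if (node.node === 'ref' && !scope.has(node.name.split('.')[0])) {
      errors.push(`Ref("${node.name}") is not in scope`);
    }
    if (node.node === 'forEach') {
      // The collection is resolved in the outer scope; the body additionally
      // sees the loop variable ("mesh" is hard-coded to keep the sketch short).
      checkRefs(node.collection, scope, errors);
      checkRefs(node.body, new Set([...scope, 'mesh']), errors);
    } else {
      Object.values(node).forEach((child) => checkRefs(child, scope, errors));
    }
  }
  return errors;
}

// e.g. checkRefs(drawCommand.do, new Set(['tick', 'projection', 'meshes',
//   'position', 'normal', 'diffuse', 'ao', 'depth', 'geometryPass', ...]), [])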