
Real-Time_High_Quality_Rendering

It's a notebook of Real-Time High Quality Rendering (GAMES 202) by Lingqi Yan 2021


Class website: https://sites.cs.ucsb.edu/~lingqi/teaching/games202.html

Overview

What is Real-Time High Quality Rendering about?

  • Real-Time

    • Speed: more than 30 FPS (frames per second), even more for Virtual / Augmented Reality (VR / AR): 90 FPS
    • Interactivity: Each frame generated on the fly
  • High Quality

    • Realism: advanced approaches to make rendering more realistic
    • Dependability: all-time correctness (exact or approximate), no tolerance to (uncontrollable) failures
  • Rendering

    • Pipeline: 3D scene (meshes, lights, etc.) → calculating light transport toward the eye → image
  • Highest level: 4 different parts of real-time rendering

    • Shadows (and env)
    • Global Illum. (Scene/image space, precomputed)
    • Physically-based Shading
    • Real-time ray tracing


Course Topics

  • Shadow and Environment Mapping
  • Interactive Global Illumination Techniques
  • Precomputed Radiance Transfer
  • Real-Time Ray Tracing
  • Participating Media Rendering, Image Space Effects, etc.
  • Non-Photorealistic Rendering
    • But will not be covered in depth / per game
  • Antialiasing and supersampling
  • Chatting about techs!
  • Chatting about games!

What is GAMES202 not about?

  • 3D modeling or game development using Unreal Engine
  • Off-line rendering: expensive (but more accurate) light transport techniques in movies / animations
  • Neural rendering
  • Using OpenGL
  • Scene / shader optimization
  • Reverse engineering of shaders
  • High performance computing, e.g. CUDA programming

How to study GAMES202?

  • Understand the difference between science and technology
    • Science != technology
    • Science == knowledge
    • Technology == engineering skills that turn science into product
  • Real-time rendering = fast & approximate off-line rendering + systematic engineering
  • Fact: in real-time rendering technologies, the industry is way ahead of the academia
  • Practice makes perfect

Motivation

  • Today, Computer Graphics is able to generate photorealistic images
    • Complex geometry, lighting, materials, shadows
    • Computer-generated movies/special effects (difficult or impossible to tell real from rendered...)


  • But accurate algorithms (esp. ray tracing) are very slow
    • So they are called offline rendering methods
    • Remember how long it takes to render 1 frame in Zootopia?


  • With proper approximations, we can generate plausible results that run much faster


Evolution of Real-Time Rendering

  • Interactive 3D graphics pipeline as in OpenGL

    • Earliest SGI machines (Clark 82) to today
    • Most of focus on more geometry, texture mapping
    • Some tweaks for realism (shadow mapping, accum. buffer)


  • 20 years ago

    • Interactive 3D geometry with simple texture mapping, fake shadows (OpenGL, DirectX)


  • 20 -> 10 years ago

    • A giant leap since the emergence of programmable shaders (2000)

    • Complex environment lighting, real materials (velvet, satin, paints), soft shadows


  • Today

    • Stunning graphics


    • Extended to Virtual Reality (VR) and even movies


Technological and Algorithmic Milestones

  • Programmable graphics hardware (shaders) (20 years ago)


  • Precomputation-based methods (15 years ago)

    • Complex visual effects are (partially) pre-computed

    • Minimum rendering cost at run time


    • Application: Relighting

      • Fix geometry
      • Fix viewpoint
      • Dynamically change lighting


  • Interactive Ray Tracing (8-10 years ago: CUDA + OptiX)

    • Hardware development allows ray tracing on GPUs at low sampling rates (~1 sample per pixel (SPP))

    • Followed by post processing to denoise


Recap of CG Basics

Basic GPU hardware pipeline


OpenGL

  • Is a set of APIs that call the GPU pipeline from CPU
    • Therefore, language does not matter!
    • Cross platform
    • Alternatives (DirectX, Vulkan, etc.)
  • Cons
    • Fragmented: lots of different versions
    • C style, not easy to use
    • Cannot debug (?)
  • Understanding
    • 1-to-1 mapping to our software rasterizer in GAMES101

How to use OpenGL? Important analogy: oil painting

A. Place objects/models
  • Model specification
    • User specifies an object's vertices, normals, texture coordinates and sends them to the GPU as a vertex buffer object (VBO)
      • Very similar to .obj files
  • Model transformation
    • Use OpenGL functions to obtain matrices
      • e.g., glTranslate, glMultMatrix, etc.
      • No need to write anything on your own
B. Set position of an easel
  • View transformation

  • Create / use a framebuffer

    • Set camera (the viewing transformation matrix) by simply calling, e.g., gluPerspective


C. Attach a canvas to the easel
  • Analogy of oil painting:
    • cf. step E: you can also paint multiple pictures using the same easel
  • One rendering pass in OpenGL
    • A framebuffer is specified to use
    • Specify one or more textures as output (shading, depth, etc.)
    • Render (fragment shader specifies the content on each texture)
D. Paint to the canvas
  • i.e., how to perform shading
  • This is when vertex / fragment shaders will be used
  • For each vertex in parallel
    • OpenGL calls user-specified vertex shader: Transform vertex (ModelView, Projection), other ops
  • For each primitive, OpenGL rasterizes
    • Generates a fragment for each pixel the primitive covers
  • For each fragment in parallel
    • OpenGL calls user-specified fragment shader: Shading and lighting calculations
    • OpenGL handles z-buffer depth test unless overwritten
  • This is the “Real” action that we care about the most: user-defined vertex, fragment shaders
    • Other operations are mostly encapsulated
    • Even in the form of GUIs
E. (Attach other canvases to the easel and continue painting)
F. Multiple passes! (Use previous paintings for reference)
Summary: in each pass
  • Specify objects, camera, MVP, etc.
  • Specify framebuffer and input/output textures
  • Specify vertex / fragment shaders
  • (When you have everything specified on the GPU) Render!
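The per-pass flow above maps 1-to-1 to the GAMES101-style software rasterizer. A toy Python sketch of that mapping (all names here are illustrative, not OpenGL API): vertex shader per vertex, rasterization into fragments, fragment shader per fragment, and a z-buffer depth test.

```python
# Toy sketch of one rendering pass (hypothetical names, not an OpenGL API):
# 1. vertex shader per vertex, 2. rasterize primitives into fragments,
# 3. depth test, 4. fragment shader for surviving fragments.

def render_pass(vertices, vertex_shader, fragment_shader, rasterize, width, height):
    # Depth buffer initialized to "infinitely far"
    zbuf = [[float("inf")] * width for _ in range(height)]
    framebuffer = [[None] * width for _ in range(height)]

    # For each vertex (in parallel on a GPU): ModelView / Projection, etc.
    transformed = [vertex_shader(v) for v in vertices]

    # Rasterization: one fragment per covered pixel, with interpolated depth
    for frag in rasterize(transformed, width, height):
        x, y, depth = frag["x"], frag["y"], frag["depth"]
        # z-buffer test (handled by OpenGL unless overridden)
        if depth < zbuf[y][x]:
            zbuf[y][x] = depth
            # Fragment shader: shading and lighting calculations
            framebuffer[y][x] = fragment_shader(frag)
    return framebuffer
```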

OpenGL Shading Language (GLSL)

Shading Languages

  • Vertex / Fragment shading described by small program
  • Written in language similar to C but with restrictions
  • Long history. Cook’s paper on Shade Trees, Renderman for offline rendering
    • In ancient times: assembly on GPUs!
    • Stanford Real-Time Shading Language, work at SGI
    • Still long ago: Cg from NVIDIA
    • HLSL in DirectX (vertex + pixel)
    • GLSL in OpenGL (vertex + fragment)

Shader Setup

  • Initializing (shader itself discussed later)
    • Create shader (Vertex and Fragment)
    • Compile shader
    • Attach shader to program
    • Link program
    • Use program
  • Shader source is just sequence of strings
  • Similar steps to compile a normal program


Vertex Shader


Fragment Shader

Debugging Shaders


The Rendering Equation

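For reference, the rendering equation (Kajiya 1986) that this section builds on:

$$L_o(p,\omega_o)=L_e(p,\omega_o)+\int_{\Omega^+} L_i(p,\omega_i)\,f_r(p,\omega_i,\omega_o)\,(n\cdot\omega_i)\,\mathrm{d}\omega_i$$

Outgoing radiance is emission plus incoming radiance weighted by the BRDF and the cosine term, integrated over the upper hemisphere.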

Environment Lighting


Real-Time Shadows

Recap: shadow mapping

  • A 2-Pass Algorithm
    • Pass 1 - The light pass generates the SM
    • Pass 2 - The camera pass uses the SM (recall last lecture)
  • An image-space algorithm
    • Pro: no knowledge of scene’s geometry is required
    • Con: causes self-occlusion and aliasing issues
  • Well known shadow rendering technique
    • Basic shadowing technique even for early off-line renderings, e.g., Toy Story
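The two-pass algorithm above can be sketched in a toy 1D setting (shadow-map texels indexed by a single coordinate; function names are illustrative): pass 1 records the nearest depth seen from the light, pass 2 declares a point lit iff its depth to the light does not exceed the recorded depth (up to a bias).

```python
# Toy 1D shadow mapping sketch (illustrative names).
# Pass 1 (light pass): record the nearest occluder depth per shadow-map texel.
# Pass 2 (camera pass): a shading point is lit iff nothing recorded is closer.

def build_shadow_map(occluders, num_texels):
    # occluders: list of (texel_index, depth_from_light)
    sm = [float("inf")] * num_texels
    for u, d in occluders:
        sm[u] = min(sm[u], d)
    return sm

def is_lit(shadow_map, u, depth_from_light, bias=1e-3):
    # Visible from the light iff no recorded occluder lies in front of us;
    # the small bias guards against self-occlusion (see below)
    return depth_from_light <= shadow_map[u] + bias
```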

Overview


Results


Visualization


Issues and solutions

Self occlusion


How to fix? (RTR does not trust in COMPLEXITY)

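The standard fix is a depth bias during the comparison, larger at grazing light angles, but not so large that shadows detach from their casters (peter-panning). A sketch of a slope-scaled bias; the constants are illustrative, not canonical:

```python
import math

# Slope-scaled depth bias sketch: grazing angles (small cos_theta between the
# surface normal and the light direction) need a larger bias to avoid shadow
# acne; the clamp avoids over-biasing (detached shadows). Constants are
# illustrative.

def slope_scaled_bias(cos_theta, base=0.005, max_bias=0.05):
    # cos_theta = dot(normal, light_dir), assumed in [0, 1]
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    tan_theta = sin_theta / max(cos_theta, 1e-4)
    return min(base * (1.0 + tan_theta), max_bias)
```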

Aliasing


The math behind shadow mapping

Inequalities in Calculus


Approximation in RTR

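The key approximation used throughout RTR turns an inequality into an equality: factor one term out of a product integral,

$$\int_\Omega f(x)\,g(x)\,\mathrm{d}x \;\approx\; \frac{\int_\Omega f(x)\,\mathrm{d}x}{\int_\Omega \mathrm{d}x}\cdot\int_\Omega g(x)\,\mathrm{d}x$$

which is accurate when the integration support $\Omega$ is small, or when $g$ is smooth over it.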

Percentage closer soft shadows (PCSS)


Percentage Closer Filtering (PCF)

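A toy 1D sketch of the two ideas (illustrative names and constant factors): PCF averages many binary depth-comparison results; PCSS first does a blocker search to estimate the average blocker depth, sizes the penumbra from similar triangles, then runs PCF at that radius.

```python
# PCF / PCSS sketch on a 1D shadow map (illustrative names and constants).

def pcf(shadow_map, u, depth, radius, bias=1e-3):
    # Average the binary visibility over a window of texels around u
    lo = max(0, u - radius)
    hi = min(len(shadow_map) - 1, u + radius)
    tests = [1.0 if depth <= shadow_map[i] + bias else 0.0
             for i in range(lo, hi + 1)]
    return sum(tests) / len(tests)

def pcss_filter_radius(shadow_map, u, depth, search_radius, w_light):
    # Step 1: blocker search -> average depth of texels closer than us
    lo = max(0, u - search_radius)
    hi = min(len(shadow_map) - 1, u + search_radius)
    blockers = [shadow_map[i] for i in range(lo, hi + 1) if shadow_map[i] < depth]
    if not blockers:
        return 0  # fully lit: no filtering needed
    d_blocker = sum(blockers) / len(blockers)
    # Step 2: penumbra width from similar triangles,
    # w_penumbra = w_light * (d_receiver - d_blocker) / d_blocker
    penumbra = w_light * (depth - d_blocker) / d_blocker
    return max(1, round(penumbra))
    # Step 3 would be pcf() with this radius
```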

Basic filtering techniques


PCF filters the results of many depth comparisons; it does not filter the shadow map itself, nor the rendered image of visibilities:

$$V(x)\ne \sum_{y\in \mathcal{N}(x)}w(x,y)V(y)$$


Variance soft shadow mapping (VSSM)

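VSSM's core trick in a sketch: store depth $z$ and $z^2$ in the shadow map so that a filtered lookup gives the local mean and variance, then use Chebyshev's one-tailed inequality, $P(z > t) \le \sigma^2 / (\sigma^2 + (t-\mu)^2)$ for $t > \mu$, as if it were an equality to estimate the fraction of non-blockers. The function name and the variance floor are illustrative.

```python
# VSSM visibility sketch: mean and mean of squares come from a filtered
# (z, z^2) shadow-map lookup; Chebyshev's inequality is used as an equality.

def vssm_visibility(mean, mean_sq, t):
    var = max(mean_sq - mean * mean, 1e-6)  # small floor for stability
    if t <= mean:
        return 1.0  # the bound only holds for t > mean; treat as lit
    return var / (var + (t - mean) ** 2)
```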

MIPMAP and Summed-Area Tables (SAT) Variance Shadow Maps

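MIPMAPs give fast but approximate range averages; a summed-area table (SAT) gives exact ones. A sketch: `sat[i][j]` holds the sum of all entries from `(0,0)` to `(i,j)`, so any rectangular mean over the shadow map (as VSSM needs, at arbitrary filter sizes) costs four lookups.

```python
# Summed-area table sketch: O(1) rectangular sums after an O(n) build.

def build_sat(grid):
    h, w = len(grid), len(grid[0])
    sat = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            sat[i][j] = (grid[i][j]
                         + (sat[i - 1][j] if i else 0.0)
                         + (sat[i][j - 1] if j else 0.0)
                         - (sat[i - 1][j - 1] if i and j else 0.0))
    return sat

def box_sum(sat, i0, j0, i1, j1):
    # Inclusive-rectangle sum via four corner lookups
    total = sat[i1][j1]
    if i0: total -= sat[i0 - 1][j1]
    if j0: total -= sat[i1][j0 - 1]
    if i0 and j0: total += sat[i0 - 1][j0 - 1]
    return total
```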

Moment shadow mapping


Real-Time Environment Mapping

Finishing up on shadows


Distance field soft shadows

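A sketch of the idea: sphere-trace a shadow ray from the shading point toward the light, and at each step track the smallest "safe angle", roughly `k * SDF(p) / t` where `t` is the distance marched and `k` hardens or softens the penumbra. Names, step scheme, and constants here are illustrative.

```python
# SDF soft shadow sketch via sphere tracing (illustrative names/constants).
# visibility ~ min over the ray of k * SDF(p) / t; smaller safe angle means
# a nearer near-miss with an occluder, hence darker penumbra.

def sdf_soft_shadow(sdf, origin, direction, t_max, k=8.0, eps=1e-3, t0=1e-2):
    visibility = 1.0
    t = t0
    while t < t_max:
        p = (origin[0] + t * direction[0], origin[1] + t * direction[1])
        d = sdf(p)
        if d < eps:
            return 0.0  # hit an occluder: fully shadowed
        visibility = min(visibility, k * d / t)  # track the tightest cone
        t += d  # sphere tracing: stepping by the SDF value is always safe
    return max(0.0, visibility)
```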

Shading from environment lighting


The split sum approximation

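The split sum approximation factors the specular shading integral into two precomputable pieces: a prefiltered environment-lighting lookup and a BRDF integral that depends only on roughness and the viewing angle:

$$\int_{\Omega^+} L_i(\omega_i)\,f_r(\omega_i,\omega_o)\cos\theta_i\,\mathrm{d}\omega_i \;\approx\; \frac{\int_{\Omega_{f_r}} L_i(\omega_i)\,\mathrm{d}\omega_i}{\int_{\Omega_{f_r}} \mathrm{d}\omega_i}\cdot\int_{\Omega^+} f_r(\omega_i,\omega_o)\cos\theta_i\,\mathrm{d}\omega_i$$

The first factor is averaged over the BRDF's support $\Omega_{f_r}$ (small for glossy lobes), which is why the approximation holds well in practice.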

Shadow from environment lighting


Background knowledge

Frequency and filtering

Basis functions

Real-time environment light (& global illumination)

Spherical Harmonics (SH)

Prefiltered env. lighting

Precomputed Radiance Transfer (PRT)
