There is an example that shows how to render to a high-res texture and display a downscaled version. It would be nice to have this as core functionality, so users don't have to bother with all the rendering details.
As a first step, I quickly threw together the code from the examples into a separate struct, which you can add to your model right now:
```rust
// see https://github.com/nannou-org/nannou/blob/91cd548f8d92cfb8ebcd7bcb2069575acba66088/examples/draw/draw_capture_hi_res.rs
struct HighResCapturer {
    // The texture that we will draw to.
    texture: wgpu::Texture,
    // A `Draw` instance for drawing to our texture.
    draw: nannou::Draw,
    // The type used to render the `Draw` vertices to our texture.
    renderer: nannou::draw::Renderer,
    // The type used to capture the texture.
    texture_capturer: wgpu::TextureCapturer,
    // The type used to resize our texture to the window texture.
    texture_reshaper: wgpu::TextureReshaper,
    upscale: u32,
}

impl HighResCapturer {
    pub fn new(window: &Window, format: wgpu::TextureFormat, upscale: u32) -> Self {
        let texture_size = [
            window.rect().w() as u32 * upscale,
            window.rect().h() as u32 * upscale,
        ];

        // Retrieve the wgpu device.
        let device = window.device();

        // Create our custom texture.
        let sample_count = window.msaa_samples();
        let texture = wgpu::TextureBuilder::new()
            .size(texture_size)
            // Our texture will be used as the RENDER_ATTACHMENT for our `Draw` render pass.
            // It will also be SAMPLED by the `TextureCapturer` and `TextureReshaper`.
            .usage(wgpu::TextureUsages::RENDER_ATTACHMENT | wgpu::TextureUsages::TEXTURE_BINDING)
            // Use nannou's default multisampling sample count.
            .sample_count(sample_count)
            .format(format)
            // Build it!
            .build(device);

        let draw = nannou::Draw::new();
        let descriptor = texture.descriptor();
        let renderer =
            nannou::draw::RendererBuilder::new().build_from_texture_descriptor(device, descriptor);

        // Create the texture capturer.
        let texture_capturer = wgpu::TextureCapturer::default();

        // Create the texture reshaper.
        let texture_view = texture.view().build();
        let texture_sample_type = texture.sample_type();
        let dst_format = Frame::TEXTURE_FORMAT;
        let texture_reshaper = wgpu::TextureReshaper::new(
            device,
            &texture_view,
            sample_count,
            texture_sample_type,
            sample_count,
            dst_format,
        );

        HighResCapturer {
            texture,
            draw,
            renderer,
            texture_capturer,
            texture_reshaper,
            upscale,
        }
    }

    pub fn draw(&mut self, window: &Window, mut view: impl FnMut(&Draw, Rect)) {
        // First, reset the `draw` state.
        let draw = &self.draw;
        draw.reset();

        // Create a `Rect` for our texture to help with drawing.
        let [w, h] = self.texture.size();
        let r = geom::Rect::from_w_h(w as f32, h as f32);
        view(&draw, r);

        // Render our drawing to the texture.
        let device = window.device();
        let ce_desc = wgpu::CommandEncoderDescriptor {
            label: Some("texture renderer"),
        };
        let mut encoder = device.create_command_encoder(&ce_desc);
        self.renderer
            .render_to_texture(device, &mut encoder, draw, &self.texture);
        window.queue().submit(Some(encoder.finish()));
    }

    pub fn try_save(&self, window: &Window, path: impl Into<PathBuf>) {
        let device = window.device();
        let ce_desc = wgpu::CommandEncoderDescriptor {
            label: Some("texture renderer"),
        };
        let mut encoder = device.create_command_encoder(&ce_desc);

        // Take a snapshot of the texture. The capturer will do the following:
        //
        // 1. Resolve the texture to a non-multisampled texture if necessary.
        // 2. Convert the format to non-linear 8-bit sRGBA ready for image storage.
        // 3. Copy the result to a buffer ready to be mapped for reading.
        let snapshot = self
            .texture_capturer
            .capture(device, &mut encoder, &self.texture);

        // Submit the commands for our drawing and texture capture to the GPU.
        window.queue().submit(Some(encoder.finish()));

        // Submit a function for writing our snapshot to a PNG.
        //
        // NOTE: It is essential that the commands for capturing the snapshot are `submit`ted
        // before we attempt to read the snapshot - otherwise we will read a blank texture!
        let path = path.into();
        snapshot
            .read(move |result| {
                let image = result.expect("failed to map texture memory").to_owned();
                image
                    .save(&path)
                    .expect("failed to save texture to png image");
                println!("Saved as {:?}", path);
            })
            .unwrap();
    }

    pub fn view_downscaled(&self, frame: Frame) {
        // TODO: keep aspect ratio by drawing into a rect??
        // Sample the texture and write it to the frame.
        let mut encoder = frame.command_encoder();
        self.texture_reshaper
            .encode_render_pass(frame.texture_view(), &mut *encoder);
    }
}
```
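For context, here is a rough sketch of how the struct above might be wired into a typical nannou app. The `model`/`update`/`view`/`key_pressed` structure follows nannou's usual app layout; the 4x upscale factor, the `S`-to-save keybinding, and the output filename are assumptions for illustration, not part of the proposal:

```rust
// Hypothetical usage sketch - assumes `HighResCapturer` from above is in scope.
use nannou::prelude::*;

struct Model {
    capturer: HighResCapturer,
}

fn main() {
    nannou::app(model).update(update).run();
}

fn model(app: &App) -> Model {
    let w_id = app
        .new_window()
        .view(view)
        .key_pressed(key_pressed)
        .build()
        .unwrap();
    let window = app.window(w_id).unwrap();
    // Render at 4x the window resolution (assumed factor).
    let capturer = HighResCapturer::new(&window, Frame::TEXTURE_FORMAT, 4);
    Model { capturer }
}

fn update(app: &App, model: &mut Model, _update: Update) {
    let window = app.main_window();
    // Draw into the high-res texture instead of directly to the frame.
    model.capturer.draw(&window, |draw, rect| {
        draw.background().color(BLACK);
        draw.ellipse()
            .w_h(rect.w() * 0.5, rect.h() * 0.5)
            .color(PLUM);
    });
}

fn key_pressed(app: &App, model: &mut Model, key: Key) {
    // Save a high-res PNG when `S` is pressed (assumed keybinding).
    if key == Key::S {
        model.capturer.try_save(&app.main_window(), "capture.png");
    }
}

fn view(_app: &App, model: &Model, frame: Frame) {
    // Display the downscaled texture in the window.
    model.capturer.view_downscaled(frame);
}
```

This keeps all the wgpu details inside the capturer, which is roughly the ergonomics the proposal is after: user code only touches `draw`, `try_save`, and `view_downscaled`.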
Of course, this needs a lot of polishing before it would be feasible to add to nannou.
I'm interested in contributing, but I'll need some guidance. Does anyone have an idea where this could be integrated? Maybe into the `Builder` and `App` structs?