Examples
- Image capture
- Raw image capture
- Timelapse mode
- Timeout mode
- Video recording
- Segmented video recording
- Quantization parameter
- Change encoding type
- Resizer component
- Splitter component
- Print pipeline
- Encode / Decode from FileStream - Image
- Static render overlay
- FFmpeg - RTMP streaming
- FFmpeg - Raw video convert
- FFmpeg - Images to video
If you want to change any of the default configuration settings, you can do so by modifying the static properties within the MMALCameraConfig class. The main class, MMALCamera, which interfaces to the rest of the functionality the library provides, is a singleton and is retrieved as follows: MMALCamera cam = MMALCamera.Instance.
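For instance, a minimal sketch of adjusting configuration before building a pipeline, using properties that appear elsewhere on this page (make any changes before the pipeline is configured):

// Adjust static configuration before constructing the pipeline.
MMALCameraConfig.InlineHeaders = true;               // used by segmented recording (see below)
MMALCameraConfig.StillEncoding = MMALEncoding.I420;  // pixel format for raw still capture

// The camera itself is a singleton.
MMALCamera cam = MMALCamera.Instance;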
MMALSharp is asynchronous in nature, preventing any blocking of the main thread in your application. If you are planning on using MMALSharp within a console application, it is important to provide a context which your application will use when returning from asynchronous method calls. In the examples below, we demonstrate usage with AsyncContext, included in the Nito.AsyncEx library by @StephenCleary. GUI applications provide their own context, so this should not be necessary there.
For FFmpeg functionality, you will need to install the latest version of FFmpeg from source - do not install from the Raspbian repositories as they don't have H.264 support. A guide to installing FFmpeg from source, including the H.264 codec, can be found here.
Note: the await Task.Delay(2000); is required to allow the camera sensor to "warm up". Due to the rolling shutter used in the Raspberry Pi camera modules, we need to wait a few seconds before valid image data can be used, otherwise your images will likely be under-exposed. The value of 2 seconds is a safe amount of time to wait, but is only required after enabling the camera component, either on first run or after a manual disable. Additionally, the call to ConfigureCameraSettings() is only required if you have made changes to the camera's configuration.
Support for these encoders was added in later firmware releases, so you will likely need to run sudo rpi-update for them to work. Please see this issue for reference.
Image capture

The below examples describe how to take a simple JPEG image, either by using the built-in helper method or manual mode. Here we are using an Image Encoder component which will encode the raw image data into JPEG format; you can change the encoding format to one of the following: JPEG, BMP, PNG, GIF. In addition, you can also change the pixel format you would like to encode with - in the below examples we are using YUV420.
Helper mode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        {
            await cam.TakePicture(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420);
        }
    });

    cam.Cleanup();
}
Manual mode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var nullSink = new MMALNullSinkComponent())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline.
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);

            cam.Camera.StillPort.ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(nullSink);

            // Camera warm up time
            await Task.Delay(2000);
            await cam.ProcessAsync(cam.Camera.StillPort);
        }
    });

    cam.Cleanup();
}
Raw image capture

In this example we are capturing raw, unencoded image data directly from the camera sensor. You can change the pixel format of the raw data via the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties.
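For example, to request raw frames in a different pixel format before taking the picture (I420 here, matching the formats used elsewhere on this page):

// Request raw still frames in YUV420 (I420) rather than the default.
MMALCameraConfig.StillEncoding = MMALEncoding.I420;
MMALCameraConfig.StillSubFormat = MMALEncoding.I420;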
Helper mode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
        {
            await cam.TakeRawPicture(imgCaptureHandler);
        }
    });

    cam.Cleanup();
}
Timelapse mode

The timelapse mode example below takes an image every 10 seconds for 4 hours. You can change the frequency and duration of the timelapse by altering the various properties of the Timelapse object.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        // This example will take an image every 10 seconds for 4 hours.
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        {
            var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
            var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };

            await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);
        }
    });

    cam.Cleanup();
}
Timeout mode

The timeout mode example shows how to capture images continuously for a set duration. This is done via a helper method in the MMALCamera class; we pass in a CancellationToken which signals when image capturing should stop.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        {
            var cts = new CancellationTokenSource(TimeSpan.FromHours(4));

            await cam.TakePictureTimeout(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, cts.Token);
        }
    });

    cam.Cleanup();
}
Video recording

The below examples show how to capture video using MMALSharp. For basic video recording, there is a built-in helper method which uses H.264 encoding. If you wish to use a different encoding type, or would like to customise additional parameters such as bitrate, you can also do this manually.
Helper mode
// Self-contained method for recording H.264 video for a specified amount of time. Records at 30fps, 25Mb/s at the highest quality.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
        {
            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.TakeVideo(vidCaptureHandler, cts.Token);
        }
    });

    cam.Cleanup();
}
Manual mode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
        using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
Segmented video recording

The segmented recording mode allows us to split a video recording into multiple files. You can specify the frequency at which the split occurs via the Split object.
Note: MMALCameraConfig.InlineHeaders must be set to true in order for this to work.
static void Main(string[] args)
{
    // Required for segmented recording mode.
    MMALCameraConfig.InlineHeaders = true;

    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
        using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler, null, new Split { Mode = TimelapseMode.Second, Value = 30 }))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(1));

            // Record video for 1 minute, using segmented video record to split into multiple files every 30 seconds.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
Quantization parameter

The quantization parameter allows us to set a variable bitrate when recording with H.264 encoding. To enable this behaviour, set the bitrate parameter to 0 and the quality parameter to a value between 1 and 10. Note: this only applies to H.264; MJPEG makes use of both the quality and bitrate values.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
        using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. We make use of the quantization parameter (quality) to set a variable bitrate. The value 10 is the highest setting.
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 10);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
Change encoding type

Due to the way MMALSharp handles the lifecycle of each component, changing the encoding type of a component requires leaving the scope of the encoder's current using block. Doing so frees the encoder's unmanaged resources and allows us to create a fresh instance with a different encoding type.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var nullSink = new MMALNullSinkComponent())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline.
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);

            cam.Camera.StillPort.ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(nullSink);

            // Camera warm up time
            await Task.Delay(2000);
            await cam.ProcessAsync(cam.Camera.StillPort);
        }

        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "bmp"))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var nullSink = new MMALNullSinkComponent())
        {
            // Create our component pipeline.
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.BMP, MMALEncoding.RGB32, 90);

            cam.Camera.StillPort.ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(nullSink);

            await cam.ProcessAsync(cam.Camera.StillPort);
        }
    });

    cam.Cleanup();
}
The same applies to video encoders too.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "avi"))
        using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }

        using (var vidCaptureHandler = new VideoStreamCaptureHandler("/home/pi/videos/", "mjpeg"))
        using (var vidEncoder = new MMALVideoEncoder(vidCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            // Create our component pipeline. Here we are using MJPEG encoding with a YUV420 pixel format. The video will be taken at 25Mb/s at the highest quality setting for MJPEG (90).
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.MJPEG, MMALEncoding.I420, 90, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
Resizer component

The MMALResizerComponent can be connected to your pipeline to change the width/height and encoding type/pixel format of frames captured by the camera component. The resizer component is itself an MMALDownstreamHandlerComponent, meaning you can process data to a file directly from it without needing to connect an encoder (see the sketch after the example below).
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        using (var resizer = new MMALResizerComponent(800, 600, null))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var nullSink = new MMALNullSinkComponent())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline.
            resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort);
            resizer.ConfigureOutputPort(0, MMALEncoding.I420, MMALEncoding.I420, 0);
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);

            cam.Camera.StillPort.ConnectTo(resizer);
            resizer.Outputs[0].ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(nullSink);

            // Camera warm up time
            await Task.Delay(2000);
            await cam.ProcessAsync(cam.Camera.StillPort);
        }
    });

    cam.Cleanup();
}
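To illustrate processing directly from the resizer, here is a minimal sketch with no encoder attached. It assumes the resizer's third constructor argument (passed as null in the example above) accepts the capture handler, so check this against the version of the API you are using:

using (var rawCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
using (var resizer = new MMALResizerComponent(640, 480, rawCaptureHandler))
using (var nullSink = new MMALNullSinkComponent())
{
    cam.ConfigureCameraSettings();

    // No encoder connected - the resizer writes its unencoded output straight to the capture handler.
    resizer.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, cam.Camera.StillPort);
    resizer.ConfigureOutputPort(0, MMALEncoding.I420, MMALEncoding.I420, 0);

    cam.Camera.StillPort.ConnectTo(resizer);
    cam.Camera.PreviewPort.ConnectTo(nullSink);

    // Camera warm up time
    await Task.Delay(2000);
    await cam.ProcessAsync(cam.Camera.StillPort);
}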
Splitter component

The MMALSplitterComponent connects exclusively to the video port of the camera component. From here, the splitter provides 4 output ports, allowing you to further extend your pipeline and produce up to 4 file outputs at any given time.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var handler = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
        using (var handler2 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
        using (var handler3 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
        using (var handler4 = new VideoStreamCaptureHandler("/home/pi/video/", "avi"))
        using (var splitter = new MMALSplitterComponent(null))
        using (var vidEncoder = new MMALVideoEncoder(handler, DateTime.Now.AddSeconds(10)))
        using (var vidEncoder2 = new MMALVideoEncoder(handler2, DateTime.Now.AddSeconds(15)))
        using (var vidEncoder3 = new MMALVideoEncoder(handler3, DateTime.Now.AddSeconds(10)))
        using (var vidEncoder4 = new MMALVideoEncoder(handler4, DateTime.Now.AddSeconds(10)))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline.
            splitter.ConfigureInputPort(MMALEncoding.I420, MMALEncoding.I420, cam.Camera.VideoPort);
            splitter.ConfigureOutputPort(0, MMALEncoding.OPAQUE, MMALEncoding.I420, 0);
            splitter.ConfigureOutputPort(1, MMALEncoding.OPAQUE, MMALEncoding.I420, 0);
            splitter.ConfigureOutputPort(2, MMALEncoding.OPAQUE, MMALEncoding.I420, 0);
            splitter.ConfigureOutputPort(3, MMALEncoding.OPAQUE, MMALEncoding.I420, 0);

            vidEncoder.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[0]);
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 10, 25000000);
            vidEncoder2.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[1]);
            vidEncoder2.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 20, 25000000);
            vidEncoder3.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[2]);
            vidEncoder3.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 30, 25000000);
            vidEncoder4.ConfigureInputPort(MMALEncoding.OPAQUE, MMALEncoding.I420, splitter.Outputs[3]);
            vidEncoder4.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 40, 25000000);

            cam.Camera.VideoPort.ConnectTo(splitter);
            splitter.Outputs[0].ConnectTo(vidEncoder);
            splitter.Outputs[1].ConnectTo(vidEncoder2);
            splitter.Outputs[2].ConnectTo(vidEncoder3);
            splitter.Outputs[3].ConnectTo(vidEncoder4);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);
            await cam.ProcessAsync(cam.Camera.VideoPort);
        }
    });

    cam.Cleanup();
}
Print pipeline

Version 0.3 brings the ability to print out the current component pipeline you have configured - this can be useful when using many components and encoders (such as the splitter). Calling the PrintPipeline() method on the MMALCamera instance will print your current pipeline to the console window.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var nullSink = new MMALNullSinkComponent())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline.
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);

            cam.Camera.StillPort.ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(nullSink);

            cam.PrintPipeline();

            // Camera warm up time
            await Task.Delay(2000);
            await cam.ProcessAsync(cam.Camera.StillPort);
        }
    });

    cam.Cleanup();
}
Encode / Decode from FileStream - Image

MMALSharp provides the ability to encode/decode images fed from FileStreams. It supports the GIF, BMP, JPEG and PNG file formats, and decoding outputs the following pixel formats:
- JPEG -> YUV420/422 (I420/422)
- GIF -> RGB565 (RGB16)
- BMP/PNG -> RGBA
Encode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var stream = File.OpenRead("/home/pi/raw_jpeg_decode.raw"))
        using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))
        using (var imgEncoder = new MMALImageFileEncoder(imgCaptureHandler))
        {
            // Create our component pipeline.
            imgEncoder.ConfigureInputPort(MMALEncoding.I420, null, 2592, 1944);
            imgEncoder.ConfigureOutputPort(MMALEncoding.BMP, MMALEncoding.I420, 90, zeroCopy: true);

            await imgEncoder.Convert();

            Console.WriteLine("Finished");
        }
    });

    cam.Cleanup();
}
Decode
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var stream = File.OpenRead("/home/pi/test.jpg"))
        using (var imgCaptureHandler = new TransformStreamCaptureHandler(stream, "/home/pi/images/", "raw"))
        using (var imgDecoder = new MMALImageFileDecoder(imgCaptureHandler))
        {
            // Create our component pipeline.
            imgDecoder.ConfigureInputPort(MMALEncoding.JPEG, null);
            imgDecoder.ConfigureOutputPort(MMALEncoding.I420, null, 90, zeroCopy: true);

            await imgDecoder.Convert();

            Console.WriteLine("Finished");
        }
    });

    cam.Cleanup();
}
Static render overlay

MMAL allows you to create additional video preview renderers which sit alongside the usual Null Sink or Video renderers shown in previous examples. These additional renderers allow you to overlay static content onto the display your Pi is connected to.
The overlay renderers only work with unencoded images, which must have one of the following pixel formats:
- YUV420 (I420)
- RGB888 (RGB24)
- RGBA
- BGR888 (BGR24)
- BGRA
An easy way to get an unencoded image for use with the overlay renderers is to use the raw image capture functionality described in the earlier example, setting the MMALCameraConfig.StillEncoding and MMALCameraConfig.StillSubFormat properties to one of the accepted pixel formats (see the snippet below). Once you have your test frame, follow the main example to overlay your image.
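A minimal sketch of grabbing such a test frame, reusing the raw capture helper shown earlier with I420 as the pixel format:

// Capture a raw I420 frame to use as the overlay source data.
MMALCameraConfig.StillEncoding = MMALEncoding.I420;
MMALCameraConfig.StillSubFormat = MMALEncoding.I420;

using (var rawHandler = new ImageStreamCaptureHandler("/home/pi/images/", "raw"))
{
    await cam.TakeRawPicture(rawHandler);
}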
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    PreviewConfiguration previewConfig = new PreviewConfiguration
    {
        FullScreen = false,
        PreviewWindow = new Rectangle(160, 0, 640, 480),
        Layer = 2,
        Opacity = 1
    };

    AsyncContext.Run(async () =>
    {
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        using (var imgEncoder = new MMALImageEncoder(imgCaptureHandler))
        using (var video = new MMALVideoRenderer(previewConfig))
        {
            cam.ConfigureCameraSettings();
            video.ConfigureRenderer();

            PreviewOverlayConfiguration overlayConfig = new PreviewOverlayConfiguration
            {
                FullScreen = true,
                PreviewWindow = new Rectangle(50, 0, 640, 480),
                Layer = 1,
                Resolution = new Resolution(640, 480),
                Encoding = MMALEncoding.I420,
                Opacity = 255
            };

            var overlay = cam.AddOverlay(video, overlayConfig, File.ReadAllBytes("/home/pi/test1.raw"));
            overlay.ConfigureRenderer();
            overlay.UpdateOverlay();

            // Create our component pipeline.
            imgEncoder.ConfigureOutputPort(0, MMALEncoding.JPEG, MMALEncoding.I420, 90);

            cam.Camera.StillPort.ConnectTo(imgEncoder);
            cam.Camera.PreviewPort.ConnectTo(video);

            cam.PrintPipeline();

            await cam.ProcessAsync(cam.Camera.StillPort);
        }
    });

    cam.Cleanup();
}
In this example, we are using an unencoded YUV420 image and configuring the renderer using the settings in overlayConfig.

FFmpeg - RTMP streaming
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        // An RTMP server needs to be listening on the address specified in the capture handler. I have used the Nginx RTMP module for testing.
        using (var ffCaptureHandler = FFmpegCaptureHandler.RTMPStreamer("mystream", "rtmp://192.168.1.91:6767/live"))
        using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            // Create our component pipeline. Here we are using the H.264 standard with a YUV420 pixel format. The video will be taken at 25Mb/s.
            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
Note: if you intend to use the YouTube live streaming service, you will need to create the below method to return your own FFmpegCaptureHandler, replacing the internal FFmpegCaptureHandler.RTMPStreamer seen in the example above with your custom method. The reason for this is that YouTube requires your RTMP stream to contain an audio input, otherwise it won't work. Internally, our RTMP streaming method does not include an audio stream, and at the current time we don't intend to change it for this specific purpose.
public static FFmpegCaptureHandler RTMPStreamerWithAudio(string streamName, string streamUrl)
    => new FFmpegCaptureHandler($"-re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv -metadata streamName={streamName} {streamUrl}");
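Usage then mirrors the RTMP streaming example above, swapping in the custom handler. The URL and stream key below are placeholders for your own YouTube ingest details:

// Placeholder ingest URL and stream key - substitute your own.
using (var ffCaptureHandler = RTMPStreamerWithAudio("mystream", "rtmp://a.rtmp.youtube.com/live2/<stream-key>"))
using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
using (var renderer = new MMALVideoRenderer())
{
    // Configure and process exactly as in the RTMP streaming example above.
}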
Please see here for an in-depth discussion of the issue.
FFmpeg - Raw video convert

This is a useful capture mode as it pushes the elementary H.264 stream into an AVI container, which can be opened by media players such as VLC.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        using (var ffCaptureHandler = FFmpegCaptureHandler.RawVideoToAvi("/home/pi/videos/", "testing1234"))
        using (var vidEncoder = new MMALVideoEncoder(ffCaptureHandler))
        using (var renderer = new MMALVideoRenderer())
        {
            cam.ConfigureCameraSettings();

            vidEncoder.ConfigureOutputPort(0, MMALEncoding.H264, MMALEncoding.I420, 0, 25000000);

            cam.Camera.VideoPort.ConnectTo(vidEncoder);
            cam.Camera.PreviewPort.ConnectTo(renderer);

            // Camera warm up time
            await Task.Delay(2000);

            var cts = new CancellationTokenSource(TimeSpan.FromMinutes(3));

            // Take video for 3 minutes.
            await cam.ProcessAsync(cam.Camera.VideoPort, cts.Token);
        }
    });

    cam.Cleanup();
}
FFmpeg - Images to video

This example pushes all images processed by an image capture handler into a playable video.
static void Main(string[] args)
{
    MMALCamera cam = MMALCamera.Instance;

    AsyncContext.Run(async () =>
    {
        // This example will take an image every 10 seconds for 4 hours.
        using (var imgCaptureHandler = new ImageStreamCaptureHandler("/home/pi/images/", "jpg"))
        {
            var cts = new CancellationTokenSource(TimeSpan.FromHours(4));
            var tl = new Timelapse { Mode = TimelapseMode.Second, CancellationToken = cts.Token, Value = 10 };

            await cam.TakePictureTimelapse(imgCaptureHandler, MMALEncoding.JPEG, MMALEncoding.I420, tl);

            // Process all images captured into a video at 2fps.
            imgCaptureHandler.ImagesToVideo("/home/pi/images/", 2);
        }
    });

    cam.Cleanup();
}