To send media in a video conference, you must be able to produce local audio and video. Local media is audio or video produced and sent by the current user. This article describes how to capture local media and send it to remote participants.
The local media abstraction in the client SDK covers the most common use case for producing audio and video data. It wraps a single audio track and a single video track. You can bypass the media abstraction and work directly with tracks, which are composed of a sequence of media processing elements. To support more advanced use cases, you can also bypass the track abstraction and work directly with media sources and pipes, which you can connect and disconnect at runtime. This article focuses on the media abstraction, since it addresses the most common use case.
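To make the source, pipe, and sink terminology concrete, the sketch below models a media pipeline in plain Java. It is illustrative only and is not the LiveSwitch API: frames flow from a source, through whatever pipes are currently connected, into a sink.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// A toy media pipeline (NOT the LiveSwitch API): a source pushes frames
// through zero or more pipes (processing elements) into a sink.
class ToyPipeline {
    private final List<UnaryOperator<int[]>> pipes = new ArrayList<>();
    private final List<int[]> sink = new ArrayList<>();

    // Connect a processing element (e.g. a color-space converter).
    // In a real pipeline this can happen at runtime.
    void connect(UnaryOperator<int[]> pipe) {
        pipes.add(pipe);
    }

    // Push one raw frame from the source through every connected pipe.
    void push(int[] frame) {
        int[] processed = frame;
        for (UnaryOperator<int[]> pipe : pipes) {
            processed = pipe.apply(processed);
        }
        sink.add(processed);
    }

    List<int[]> frames() {
        return sink;
    }
}
```

The point of the pipe abstraction is that elements can be added or removed without the source or sink knowing; the media abstraction described in this article hides this wiring for the common camera-and-microphone case.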
An app can have multiple local media implementations, but generally only one is used, because each implementation is associated with a specific set of inputs. The most common set of inputs is a camera and microphone. This section describes capturing the user's camera and microphone using an implementation of RtcLocalMedia named LocalMedia. The LocalMedia class is prefixed by an App namespace to make clear that the class is in the app layer and not part of the LiveSwitch SDK.
To extend RtcLocalMedia, you must implement several factory methods. These methods create the components that capture and process your media data. If you are not using the specific component returned by a method, you can return null. This guide indicates which components are required for which operations, so you can make this determination yourself. You never invoke these methods from your app code; LiveSwitch invokes them when necessary. Your only responsibility is to provide an implementation for each method.
To capture local media, first define how this capture is performed. Extend the FM.LiveSwitch.RtcLocalMedia<T> class. In this context, the generic type T represents the type of object used to display the video preview. For example, if you were using WPF, T might be System.Windows.Controls.Image.
#if WINFORMS
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.WinForms.PictureBoxControl>
#else // WPF
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<System.Windows.Controls.Image>
#endif
{
    public LocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
        Initialize();
    }
}
To capture local media, first define how this capture is performed. Extend the fm.liveswitch.RtcLocalMedia<android.view.View> class.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<android.view.View> {
    private Context context;

    public LocalMedia(Context context, boolean disableAudio, boolean disableVideo, fm.liveswitch.AecContext aecContext) {
        super(disableAudio, disableVideo, aecContext);
        this.context = context;
        initialize();
    }
}
To capture local media, first define how this capture is performed. Extend the FM.LiveSwitch.RtcLocalMedia class. In this context, the generic type T represents the type of object used to display the video preview.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
init(disableAudio:Bool, disableVideo:Bool) {
super.init(disableAudio, disableVideo, nil); // iOS/macOS have hardware AEC
initialize();
}
}
To capture local media, first define how this capture is performed. Extend the FM.LiveSwitch.RtcLocalMedia class.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
init(disableAudio:Bool, disableVideo:Bool) {
super.init(disableAudio, disableVideo, nil); // iOS/macOS have hardware AEC
initialize();
}
}
To capture local media, first define how this capture is performed. Extend the FM.LiveSwitch.RtcLocalMedia<TView> class. In this context, the generic type TView represents the type of object used to display the video preview. In this example, which focuses on camera capture, we use the concrete Android.Views.View type to display a local preview on Android and the concrete FM.LiveSwitch.Cocoa.OpenGLView type to display a local preview on iOS. The _Preview property here is necessary to display a preview of the camera. It is instantiated in the constructor so that we have access to it elsewhere.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
{
    private FM.LiveSwitch.Android.CameraPreview _Preview;

    public LocalMedia(Android.Content.Context context, bool enableSoftwareH264, bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(context, enableSoftwareH264, disableAudio, disableVideo, aecContext)
    {
        _Preview = new FM.LiveSwitch.Android.CameraPreview(context, FM.LiveSwitch.LayoutScale.Contain);
        Initialize();
    }
}
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
{
    private FM.LiveSwitch.Cocoa.AVCapturePreview _Preview;

    public LocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
        _Preview = new FM.LiveSwitch.Cocoa.AVCapturePreview();
        Initialize();
    }
}
#endif
To capture local media, first define how this capture is performed. Extend the FM.LiveSwitch.RtcLocalMedia<UnityEngine.RectTransform> class.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<UnityEngine.RectTransform>
{
    public LocalMedia(bool disableAudio, bool disableVideo, AecContext aecContext)
        : base(disableAudio, disableVideo, aecContext)
    {
        Initialize();
    }
}
Capture Local Audio
To enable audio for your App.LocalMedia class, you must implement two methods. The first method, CreateAudioSource, returns an audio source that is used for recording audio. In this example, the audio source is the user's microphone. LiveSwitch provides a library for your platform that captures audio from the user's microphone. This is included in the LiveSwitch distribution.
For .NET platforms, first attempt to use DMO (DirectX Media Objects) and fall back to NAudio if DMO is not supported. DMO has better sound quality and provides acoustic echo cancellation, so use it whenever it is available.
Important
In virtual environments, DMO can still produce echo. For improved results, turn off all enhancements to your virtual audio devices. For best results, don't run in a virtual machine.
You must include FM.LiveSwitch.NAudio.dll and FM.LiveSwitch.Dmo.dll.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    protected override FM.LiveSwitch.AudioSource CreateAudioSource(FM.LiveSwitch.AudioConfig config)
    {
        if (Dmo.VoiceCaptureSource.IsSupported())
        {
            return new Dmo.VoiceCaptureSource(true);
        }
        else
        {
            return new NAudio.Source(config);
        }
    }
}
You must include fm.liveswitch.android.jar.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    public fm.liveswitch.AudioSource createAudioSource(fm.liveswitch.AudioConfig config) {
        return new fm.liveswitch.android.AudioRecordSource(config);
    }
}
You must include FMLiveSwitchCocoa.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createAudioSource(config: FMLiveSwitchAudioConfig) -> FMLiveSwitchAudioSource {
        return FMLiveSwitchCocoaAudioUnitSource(config: config)
    }
}
You must include FMLiveSwitchCocoa.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createAudioSource(config: FMLiveSwitchAudioConfig) -> FMLiveSwitchAudioSource {
        return FMLiveSwitchCocoaAudioUnitSource(config: config)
    }
}
You must include FM.LiveSwitch.Android.dll for Android and FM.LiveSwitch.Cocoa.dll for iOS.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
{
    public override FM.LiveSwitch.AudioSource CreateAudioSource(FM.LiveSwitch.AudioConfig config)
    {
        return new FM.LiveSwitch.Android.AudioRecordSource(config);
    }
}
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
{
    public override FM.LiveSwitch.AudioSource CreateAudioSource(FM.LiveSwitch.AudioConfig config)
    {
        return new FM.LiveSwitch.Cocoa.AudioUnitSource(config);
    }
}
#endif
The FM.LiveSwitch.Unity library already provides an implementation of the user's audio source: AudioClipSource. The underlying audio source varies depending on the target platform of the Unity build.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    // ...
    protected override FM.LiveSwitch.AudioSource CreateAudioSource(AudioConfig config)
    {
        return new FM.LiveSwitch.Unity.AudioClipSource(config);
    }
    // ...
}
The second method to implement is CreateOpusEncoder. Technically, this isn't required, but if you don't implement it, your app falls back to the lower quality PCMA/PCMU audio codecs. The code sample below demonstrates how to enable the Opus encoder. Once you complete this, your app can capture the user's audio and send it to other participants. LiveSwitch provides platform-specific libraries under the FM.LiveSwitch.Opus namespace. Add the corresponding library to your project:
You must include FM.LiveSwitch.Opus.dll.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.AudioEncoder CreateOpusEncoder(FM.LiveSwitch.AudioConfig config)
    {
        return new FM.LiveSwitch.Opus.Encoder(config);
    }
}
You must include fm.liveswitch.opus.jar.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    public fm.liveswitch.AudioEncoder createOpusEncoder(fm.liveswitch.AudioConfig config) {
        return new fm.liveswitch.opus.Encoder(config);
    }
}
You must include FMLiveSwitchOpus.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createOpusEncoder(with config: FMLiveSwitchAudioConfig) -> FMLiveSwitchAudioEncoder {
        return FMLiveSwitchOpusEncoder(config: config)
    }
}
You must include FMLiveSwitchOpus.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createOpusEncoder(with config: FMLiveSwitchAudioConfig) -> FMLiveSwitchAudioEncoder {
        return FMLiveSwitchOpusEncoder(config: config)
    }
}
You must include FM.LiveSwitch.Opus.dll.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
#endif
{
    public override FM.LiveSwitch.AudioEncoder CreateOpusEncoder(FM.LiveSwitch.AudioConfig config)
    {
        return new FM.LiveSwitch.Opus.Encoder(config);
    }
}
Capture Local Video
Capturing video for the App.LocalMedia class is similar to capturing audio, but has a few more steps. In addition to creating a video source and a video encoder, you also need to provide factory methods that create a video preview and image manipulation tools. To start, provide an implementation for the CreateVideoSource method. The code sample below shows how to create your source. LiveSwitch provides video source implementations that capture data from the user's camera. To use them, include the following library for your platform:
You must include FM.LiveSwitch.AForge.dll.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.AForge.CameraSource(new FM.LiveSwitch.VideoConfig(640, 480, 30));
    }
}
You must include fm.liveswitch.android.jar.
Android requires an additional parameter: a native object that displays a preview of the user's camera. You must access this object and pass it into the video source that you create. For simplicity, it is created inline here; it is generally initialized earlier in the local media lifecycle, such as in the constructor. The Camera2 API is the current standard camera API, available in API level 21+ (Lollipop and beyond). Use the Camera2 API, because the older Camera class is deprecated.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    private fm.liveswitch.android.CameraPreview preview;

    public fm.liveswitch.VideoSource createVideoSource() {
        this.preview = new fm.liveswitch.android.CameraPreview(this.context);
        return new fm.liveswitch.android.Camera2Source(this.preview, new fm.liveswitch.VideoConfig(640, 480, 30));
    }

    public View getView() {
        return this.preview.getView();
    }
}
You must include FMLiveSwitchCocoa.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    var preview: FMLiveSwitchCocoaAVCapturePreview!

    override func createVideoSource() -> FMLiveSwitchVideoSource {
        self.preview = FMLiveSwitchCocoaAVCapturePreview()
        return FMLiveSwitchCocoaAVCaptureSource(preview: self.preview, config: FMLiveSwitchVideoConfig(width: 640, height: 480, frameRate: 30))
    }
}
You must include FMLiveSwitchCocoa.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createVideoSource() -> FMLiveSwitchVideoSource {
        return FMLiveSwitchCocoaAVCaptureSource(config: FMLiveSwitchVideoConfig(width: 640, height: 480, frameRate: 30))
    }
}
You must include FM.LiveSwitch.Android.dll for Android and FM.LiveSwitch.Cocoa.dll for iOS. A native object that displays a preview of the user's camera is required. You must access this object and pass it into the video source that you create. We initialized it in the local media constructor above.
#if __ANDROID__
// In the constructor above we instantiated the _Preview property.
// The _Preview is required to create your CameraSource.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.Android.CameraSource(_Preview, new FM.LiveSwitch.VideoConfig(640, 480, 30));
    }
}
#elif __IOS__
// In the constructor above we instantiated the _Preview property.
// The _Preview is required to create your AVCaptureSource.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
{
    public override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.Cocoa.AVCaptureSource(_Preview, new FM.LiveSwitch.VideoConfig(640, 480, 30));
    }
}
#endif
Unity handles the library for you, but make sure you are using the right plugins. The FM.LiveSwitch.Unity library provides three video sources: WebCamTextureSource to capture the user's camera, ScreenSource to capture the user's screen, and Texture2DSource to capture the Unity screen.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    // ...
    private FM.LiveSwitch.VideoConfig cameraConfig = new FM.LiveSwitch.VideoConfig(640, 480, 30);
    // ...
    // Implement ONE of the following, depending on what you want to capture.
    // To use the user's camera:
    protected override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.Unity.WebCamTextureSource(cameraConfig);
    }
    // To capture the screen:
    protected override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.Unity.ScreenSource();
    }
    // To capture the Unity screen:
    protected override FM.LiveSwitch.VideoSource CreateVideoSource()
    {
        return new FM.LiveSwitch.Unity.Texture2DSource();
    }
    // ...
}
Next, create a preview view control for the media session. This shows a preview of your camera as it records and sends data. To create this control, implement the CreateViewSink method. This method must return a view of the same type as the type parameter of your App.LocalMedia class.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.ViewSink<...> CreateViewSink()
    {
#if WINFORMS
        return new FM.LiveSwitch.WinForms.PictureBoxSink
#else // WPF
        return new FM.LiveSwitch.Wpf.ImageSink
#endif
        {
            ViewMirror = true
        };
    }
}
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    public fm.liveswitch.ViewSink<...> createViewSink() {
        return null; // we override `getView` instead
    }
}
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createViewSink() -> FMLiveSwitchViewSink? {
        return nil // we override `view` instead
    }
}
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createViewSink() -> FMLiveSwitchViewSink {
        return FMLiveSwitchCocoaImageViewSink()
    }
}
In the previous section, we created an object that uses the platform's native camera preview. Here we expose that object as the view that displays the preview data. We must also return null from the CreateViewSink method, as shown here, so that the view we created is used directly.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
{
    // In the constructor above we instantiated the _Preview property.
    public new View View
    {
        get { return _Preview.View; }
    }

    // We return null so that our _Preview is used instead of a view sink.
    public override FM.LiveSwitch.ViewSink<View> CreateViewSink()
    {
        return null;
    }
}
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
{
    // In the constructor above we instantiated the _Preview property.
    public UIKit.UIView GetView()
    {
        return _Preview;
    }

    // We return null so that our _Preview is used instead of a view sink.
    public override FM.LiveSwitch.ViewSink<FM.LiveSwitch.Cocoa.OpenGLView> CreateViewSink()
    {
        return null;
    }
}
#endif
The WebCamTextureSource provides access to an optimized preview of the video source, so we can safely return null from the CreateViewSink method and expose the source's view through a getter.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    // ...
    public override UnityEngine.RectTransform View
    {
        get { return ((FM.LiveSwitch.Unity.WebCamTextureSource)VideoSource)?.View; }
    }

    protected override FM.LiveSwitch.ViewSink<UnityEngine.RectTransform> CreateViewSink()
    {
        return null;
    }
    // ...
}
The next step is to specify which video codecs to use. Do this by providing implementations for the CreateVp8Encoder, CreateVp9Encoder, and CreateH264Encoder methods. These implementations are in the FM.LiveSwitch.Vpx and FM.LiveSwitch.OpenH264 libraries. To support a codec, return an appropriate encoder from that codec's factory method; to turn a codec off, return null from its factory method. The code below demonstrates how to enable these codecs.
Include FM.LiveSwitch.Vpx.dll and/or FM.LiveSwitch.OpenH264.dll.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.VideoEncoder CreateVp8Encoder()
    {
        return new FM.LiveSwitch.Vp8.Encoder();
    }

    public override FM.LiveSwitch.VideoEncoder CreateVp9Encoder()
    {
        return new FM.LiveSwitch.Vp9.Encoder();
    }

    public override FM.LiveSwitch.VideoEncoder CreateH264Encoder()
    {
        return new FM.LiveSwitch.OpenH264.Encoder();
    }
}
Include fm.liveswitch.vpx.jar and/or fm.liveswitch.openh264.jar.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    public fm.liveswitch.VideoEncoder createVp8Encoder() {
        return new fm.liveswitch.vp8.Encoder();
    }

    public fm.liveswitch.VideoEncoder createVp9Encoder() {
        return new fm.liveswitch.vp9.Encoder();
    }

    public fm.liveswitch.VideoEncoder createH264Encoder() {
        return new fm.liveswitch.openh264.Encoder();
    }
}
Include FMLiveSwitchVpx.a.
Note
For iOS platforms, there is no OpenH264 library. These platforms natively support the H.264 codec.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createVp8Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchVp8Encoder()
    }

    override func createVp9Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchVp9Encoder()
    }

    override func createH264Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchCocoaVideoToolboxH264Encoder()
    }
}
You must include FMLiveSwitchVpx.a.
Note
For Mac platforms, there is no OpenH264 library. These platforms natively support the H.264 codec.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createVp8Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchVp8Encoder()
    }

    override func createVp9Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchVp9Encoder()
    }

    override func createH264Encoder() -> FMLiveSwitchVideoEncoder {
        return FMLiveSwitchCocoaVideoToolboxH264Encoder()
    }
}
You must include FM.LiveSwitch.Vpx.dll.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
#endif
{
    public override FM.LiveSwitch.VideoEncoder CreateVp8Encoder()
    {
        return new FM.LiveSwitch.Vp8.Encoder();
    }

    public override FM.LiveSwitch.VideoEncoder CreateVp9Encoder()
    {
        return new FM.LiveSwitch.Vp9.Encoder();
    }
}
Unity does not support H.264 encoding. OpenH264 requires a runtime download from Cisco for licensing reasons, which Unity does not support.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    // ...
    protected override FM.LiveSwitch.VideoEncoder CreateVp8Encoder()
    {
        return new FM.LiveSwitch.Vp8.Encoder();
    }

    protected override FM.LiveSwitch.VideoEncoder CreateVp9Encoder()
    {
        return new FM.LiveSwitch.Vp9.Encoder();
    }

    protected override FM.LiveSwitch.VideoEncoder CreateH264Encoder()
    {
        // OpenH264 requires a runtime download from Cisco
        // for licensing reasons, which is not currently
        // supported on Unity.
        return null;
    }
    // ...
}
Finally, provide a minor image formatting utility by implementing the CreateImageConverter method. This creates a tool that converts between color spaces, which are different ways of representing colors. This is needed because webcams do not capture data in the i420 color space, which the LiveSwitch video encoders require. Return an instance of FM.LiveSwitch.Yuv.ImageConverter from this method. The code below demonstrates how to create this object. At this point, your App.LocalMedia class can capture and send both audio and video. The next step is to start capturing data. These classes are in the FM.LiveSwitch.Yuv library, which is a small wrapper around libyuv. LiveSwitch provides compiled versions of this library for your platform:
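To make the conversion requirement concrete: i420 is a YUV 4:2:0 layout with a full-resolution Y (luma) plane followed by U and V (chroma) planes at half resolution in each dimension, while webcams typically deliver packed formats such as RGB that must be converted first. The sketch below shows the i420 plane sizes; it is illustrative arithmetic, not part of the LiveSwitch API:

```java
// i420 plane layout for one frame: a full-size Y plane followed by
// U and V planes that are half the width and half the height of Y.
final class I420Layout {
    static int ySize(int width, int height) {
        return width * height;
    }

    static int chromaSize(int width, int height) {
        return (width / 2) * (height / 2); // size of U (and of V)
    }

    static int frameSize(int width, int height) {
        return ySize(width, height) + 2 * chromaSize(width, height);
    }

    public static void main(String[] args) {
        // For the 640x480 configuration used in this article:
        System.out.println("Y plane:  " + ySize(640, 480));      // 307200
        System.out.println("U or V:   " + chromaSize(640, 480)); // 76800
        System.out.println("frame:    " + frameSize(640, 480));  // 460800
    }
}
```

The image converter's job is exactly this repacking: taking whatever format the camera produced and emitting these three planes for the encoder.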
You must include FM.LiveSwitch.Yuv.dll.
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<...>
{
    public override FM.LiveSwitch.VideoPipe CreateImageConverter(FM.LiveSwitch.VideoFormat outputFormat)
    {
        return new FM.LiveSwitch.Yuv.ImageConverter(outputFormat);
    }
}
You must include fm.liveswitch.yuv.jar.
public class LocalMedia extends fm.liveswitch.RtcLocalMedia<...> {
    public fm.liveswitch.VideoPipe createImageConverter(fm.liveswitch.VideoFormat outputFormat) {
        return new fm.liveswitch.yuv.ImageConverter(outputFormat);
    }
}
You must include FMLiveSwitchYuv.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createImageConverter(outputFormat: FMLiveSwitchVideoFormat) -> FMLiveSwitchVideoPipe {
        return FMLiveSwitchYuvImageConverter(outputFormat: outputFormat)
    }
}
You must include FMLiveSwitchYuv.a.
public class LocalMedia : FMLiveSwitchRtcLocalMedia {
    override func createImageConverter(outputFormat: FMLiveSwitchVideoFormat) -> FMLiveSwitchVideoPipe {
        return FMLiveSwitchYuvImageConverter(outputFormat: outputFormat)
    }
}
You must include FM.LiveSwitch.Yuv.dll.
#if __ANDROID__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<View>
#elif __IOS__
public class LocalMedia : FM.LiveSwitch.RtcLocalMedia<FM.LiveSwitch.Cocoa.OpenGLView>
#endif
{
    public override FM.LiveSwitch.VideoPipe CreateImageConverter(FM.LiveSwitch.VideoFormat outputFormat)
    {
        return new FM.LiveSwitch.Yuv.ImageConverter(outputFormat);
    }
}