Receiving Remote Media

As mentioned in previous sections, media data that is received from remote users is known as remote media. In the previous section you learned about Starting Local Media, which allowed you to capture audio and video data using a user's camera and microphone. This section focuses instead on displaying the audio and video data that you receive from other users.

Defining the Remote Media Object

The remote media object specifies platform-specific implementations for several core IceLink features. Specifically, it enables:

  • audio and video decoding
  • creating views to display audio and video data
  • image processing utilities

The remote media object works the same way as the local media object. The IceLink SDK provides platform-specific implementations for the supported features, and you must again create your own remote media class that specifies which implementations to use. This step is not necessary for JavaScript, for the same reasons it is not necessary when starting local media. If you are using JavaScript, you can skip ahead to the Displaying a Remote Video Feed section below.

To create a remote media object, create a new class. This class must inherit from FM.IceLink.RtcRemoteMedia<T>. The generic type T is the type that will be used to display remote video feeds. For example, if you were using WPF, T would be System.Windows.Controls.Image. For Swift and Objective-C, you do not have to specify a generic type. This guide focuses on displaying remote video feeds, and so describes the implementation of a RemoteMedia class. Again, you do not have to follow this convention - name your class whatever you would like.

Your remote media implementation must define several factory methods, each associated with a specific feature. For example, the CreateOpusDecoder method allows your remote media class to decode Opus-encoded audio. If you want to disable a feature, return null from the corresponding method. The next sections of this guide provide implementations for each method and discuss which are required and which are optional.
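The null-to-disable behavior can be sketched in plain Java. Note that MediaFactoryBase below is a simplified stand-in for illustration only, not the real RtcRemoteMedia base class; the real SDK works analogously, skipping any feature whose factory method returns null.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for the factory-method pattern used by
// RtcRemoteMedia (NOT the real IceLink type): the base class calls
// each factory and skips any feature whose factory returns null.
abstract class MediaFactoryBase {
    abstract String createVp8Decoder();
    abstract String createH264Decoder();

    List<String> enabledDecoders() {
        List<String> enabled = new ArrayList<>();
        String vp8 = createVp8Decoder();
        if (vp8 != null) {
            enabled.add(vp8);   // VP8 enabled
        }
        String h264 = createH264Decoder();
        if (h264 != null) {
            enabled.add(h264);  // H.264 enabled
        }
        return enabled;
    }
}

class Vp8OnlyMedia extends MediaFactoryBase {
    String createVp8Decoder()  { return "vp8-decoder"; }
    String createH264Decoder() { return null; } // returning null disables H.264
}
```

With this subclass, only the VP8 decoder is reported as enabled; the H.264 factory's null return quietly removes that codec from negotiation.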

Initializing the Remote Media Object

To begin implementing your RemoteMedia class, you will need to define one or more constructors. Each constructor must call a parent constructor from the RtcRemoteMedia class and must invoke the inherited Initialize method. There are two parent constructors that you can choose from. The first has two parameters, disableAudio and disableVideo, which allow you to opt out of receiving audio or video data. The second constructor has an additional parameter that allows you to enable acoustic echo cancellation, which is beyond the scope of this guide. Refer to the Enabling Acoustic Echo Cancellation guide for more information on this feature.

The following code demonstrates the implementation of a RemoteMedia class with a single constructor.


// for winforms
public class RemoteMedia : FM.IceLink.RtcRemoteMedia<FM.IceLink.Winforms.PictureBoxControl>

// for wpf
public class RemoteMedia : FM.IceLink.RtcRemoteMedia<System.Windows.Controls.Image>
{
    public RemoteMedia(bool disableAudio, bool disableVideo)
        : base(disableAudio, disableVideo)
    {
        Initialize();
    }
}
// for android
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<android.view.View> {

// for java
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<fm.icelink.java.VideoComponent> {
    public RemoteMedia(boolean disableAudio, boolean disableVideo) {
        super(disableAudio, disableVideo);
        
        initialize();
    }
}
@interface RemoteMedia : FMIceLinkRtcRemoteMedia
@end

@implementation RemoteMedia
- (instancetype) initWithDisableAudio:(bool)disableAudio disableVideo:(bool)disableVideo {
    self = [super initWithDisableAudio:disableAudio disableVideo:disableVideo];
    
    [self initialize];
    
    return self;
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    init(disableAudio: Bool, disableVideo: Bool) {
        super.init(disableAudio: disableAudio, disableVideo: disableVideo)
        
        initialize()
    }
}

Now that you can instantiate your remote media class, the next step is to display a remote user's audio and video data.

Playing Audio

To play audio from a remote media feed, you must implement two methods. The first method, CreateAudioSink, returns an audio sink that will be used for audio playback. Normally, this audio sink targets the system speakers. IceLink provides libraries for each platform that allow playback through the user's speakers:

  • For C#: include FM.IceLink.NAudio.dll.
  • For Java for Android: include fm.icelink.android.jar.
  • For Other Java Programs: include fm.icelink.java.jar.
  • For Objective-C: include FMIceLinkCocoa.a.

Each of these libraries has a platform-specific FM.IceLink.AudioSink implementation that plays remote audio data through the user's speakers. The code samples below show how to do this:


public class RemoteMedia : FM.IceLink.RtcRemoteMedia<...>
{
    public override FM.IceLink.AudioSink CreateAudioSink(FM.IceLink.AudioConfig config)
    {
        return new FM.IceLink.NAudio.Sink(config);
    }
}
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<...> {
    @Override
    public fm.icelink.AudioSink createAudioSink(fm.icelink.AudioConfig config) {
        // for android
        return new fm.icelink.android.AudioTrackSink(config);
        
        // for java
        return new fm.icelink.java.SoundSink(config);        
    }
}
@implementation RemoteMedia
- (FMIceLinkAudioSink *) createAudioSinkWithConfig: (FMIceLinkAudioConfig *)config {
    return [FMIceLinkCocoaAudioUnitSink audioUnitSinkWithConfig:config];
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    override func createAudioSink(config:FMIceLinkAudioConfig) -> FMIceLinkAudioSink {
        return FMIceLinkCocoaAudioUnitSink(config)
    }
}

The second method to implement is CreateOpusDecoder. As you already know from implementing the LocalCameraMedia class, IceLink provides bindings for Opus libraries. If you haven't already, you must include them:

  • For C#: include FM.IceLink.Opus.dll.
  • For Java: include fm.icelink.opus.jar.
  • For Cocoa: include FMIceLinkOpus.a.

The implementation for CreateOpusDecoder will be similar to that of CreateOpusEncoder. Instead of supplying an Opus encoder implementation, however, you supply an Opus decoder implementation, as shown below.


public class RemoteMedia : FM.IceLink.RtcRemoteMedia<...>
{
    public override FM.IceLink.AudioDecoder CreateOpusDecoder(FM.IceLink.AudioConfig config)
    {
        return new FM.IceLink.Opus.Decoder(config);
    }
}
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<...> {
    @Override
    public fm.icelink.AudioDecoder createOpusDecoder(fm.icelink.AudioConfig config) {
        return new fm.icelink.opus.Decoder(config);
    }
}
@implementation RemoteMedia
- (FMIceLinkAudioDecoder *) createOpusDecoderWithConfig: (FMIceLinkAudioConfig *)config {
    return [FMIceLinkOpusDecoder decoderWithConfig:config];
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    override func createOpusDecoder(config:FMIceLinkAudioConfig) -> FMIceLinkAudioDecoder {
        return FMIceLinkOpusDecoder(config)
    }
}

You can now receive and decode audio sent from remote users. The next section shows you how to do the same with remote video.

Displaying Video

To play remote video with the RemoteMedia class, you must implement several methods, in the same way that you had to implement several methods to capture local video with your LocalCameraMedia class. The first method that you must implement is CreateViewSink. This method creates a view object that will display video data from a remote video feed. IceLink provides implementations for each platform:

  • For C# with WPF: include FM.IceLink.Wpf.dll.
  • For C# with Windows Forms: include FM.IceLink.WinForms.dll.
  • For Java for Android: include fm.icelink.android.jar.
  • For other Java Programs: include fm.icelink.java.jar.
  • For Objective-C: include FMIceLinkCocoa.a.

The code below shows how to implement this.


public class RemoteMedia : FM.IceLink.RtcRemoteMedia<...>
{
    // if using WPF
    public override FM.IceLink.ViewSink<System.Windows.Controls.Image> CreateViewSink()
    {
        return new FM.IceLink.Wpf.ImageSink
        {
            ViewScale = LayoutScale.Contain,
            ViewMirror = true
        };
    }

    // if using Windows Forms
    public override FM.IceLink.ViewSink<FM.IceLink.WinForms.PictureBoxControl> CreateViewSink()
    {
        return new FM.IceLink.WinForms.PictureBoxSink
        {
            ViewScale = LayoutScale.Contain,
            ViewMirror = true
        };
    }
}
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<...> {
    // for android
    @Override
    public fm.icelink.ViewSink<android.widget.FrameLayout> createViewSink() {
        // "context" is an android.content.Context, typically passed in to
        // your RemoteMedia constructor and stored in a field
        return new fm.icelink.android.OpenGLSink(context);
    }

    // for java
    @Override
    public fm.icelink.ViewSink<fm.icelink.java.VideoComponent> createViewSink() {
        return new fm.icelink.java.VideoComponentSink();
    }
}
@implementation RemoteMedia
// for iOS
- (FMIceLinkViewSink *)createViewSink {
    return [FMIceLinkCocoaOpenGLSink openGLSinkWithViewScale: FMIceLinkLayoutScaleContain];
}

// for macOS
- (FMIceLinkViewSink *)createViewSink {
    return [FMIceLinkCocoaImageViewSink imageViewSink];
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    // for iOS
    override func createViewSink() -> FMIceLinkViewSink {
        return FMIceLinkCocoaOpenGLSink(viewScale: FMIceLinkLayoutScaleContain)
    }

    // for macOS
    override func createViewSink() -> FMIceLinkViewSink {    
        return FMIceLinkCocoaImageViewSink()
    }
}

The next step is to implement the factory methods for the various video codec decoders. These are CreateVp8Decoder, CreateVp9Decoder and CreateH264Decoder. For each method, create and return an instance of the appropriate decoder. As with encoders, if you do not wish to support a codec, you can return null. The implementations can be found in the following libraries:

  • For C#: include FM.IceLink.Vpx.dll and/or FM.IceLink.OpenH264.dll.
  • For Java: include fm.icelink.vpx.jar and/or fm.icelink.openh264.jar.
  • For Objective-C: include FMIceLinkVpx.a.

Again, for Apple platforms, there is no H264 library to include because the codec is supported natively.


public class RemoteMedia : FM.IceLink.RtcRemoteMedia<...>
{
    public override FM.IceLink.VideoDecoder CreateVp8Decoder()
    {
        return new FM.IceLink.Vp8.Decoder();
    }

    public override FM.IceLink.VideoDecoder CreateVp9Decoder()
    {
        return new FM.IceLink.Vp9.Decoder();
    }

    public override FM.IceLink.VideoDecoder CreateH264Decoder()
    {
        return new FM.IceLink.H264.Decoder();
    }
}
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<...> {
    @Override
    public fm.icelink.VideoDecoder createVp8Decoder() {
        return new fm.icelink.vp8.Decoder();
    }

    @Override
    public fm.icelink.VideoDecoder createVp9Decoder() {
        return new fm.icelink.vp9.Decoder();
    }

    @Override
    public fm.icelink.VideoDecoder createH264Decoder() {
        return new fm.icelink.openh264.Decoder();
    }
}
@implementation RemoteMedia
- (FMIceLinkVideoDecoder *)createVp8Decoder {
    return [FMIceLinkVp8Decoder decoder];
}

- (FMIceLinkVideoDecoder *)createVp9Decoder {
    return [FMIceLinkVp9Decoder decoder];
}

- (FMIceLinkVideoDecoder *)createH264Decoder {
    return [FMIceLinkCocoaVideoToolboxH264Decoder videoToolboxH264Decoder];
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    override func createVp8Decoder() -> FMIceLinkVideoDecoder {
        return FMIceLinkVp8Decoder()
    }

    override func createVp9Decoder() -> FMIceLinkVideoDecoder {
        return FMIceLinkVp9Decoder()
    }

    override func createH264Decoder() -> FMIceLinkVideoDecoder {
        return FMIceLinkCocoaVideoToolboxH264Decoder()
    }
}

Finally, you will need to implement the CreateImageConverter method. The implementation is identical to the implementation for your LocalCameraMedia class. Simply return an instance of FM.IceLink.Yuv.ImageConverter. If you have not included the libyuv library yet, make sure you do so:

  • For C#: include FM.IceLink.Yuv.dll.
  • For Java: include fm.icelink.yuv.jar.
  • For Objective-C: include FMIceLinkYuv.a.

See the code below to check how this is implemented:


public class RemoteMedia : FM.IceLink.RtcRemoteMedia<...>
{
    public override FM.IceLink.VideoPipe CreateImageConverter(FM.IceLink.VideoFormat outputFormat)
    {
        return new FM.IceLink.Yuv.ImageConverter(outputFormat);
    }
}
public class RemoteMedia extends fm.icelink.RtcRemoteMedia<...> {
    @Override
    public fm.icelink.VideoPipe createImageConverter(fm.icelink.VideoFormat outputFormat) {
        return new fm.icelink.yuv.ImageConverter(outputFormat);
    }
}
@implementation RemoteMedia
- (FMIceLinkVideoPipe *)createImageConverterWithOutputFormat: (FMIceLinkVideoFormat *)outputFormat {
    return [FMIceLinkYuvImageConverter imageConverterWithOutputFormat:outputFormat];
}
@end
public class RemoteMedia : FMIceLinkRtcRemoteMedia {
    override func createImageConverter(outputFormat: FMIceLinkVideoFormat) -> FMIceLinkVideoPipe {
        return FMIceLinkYuvImageConverter(outputFormat: outputFormat)
    }
}

Your RemoteMedia class can now display both audio and video data, but the received data isn't actually displayed anywhere. The next step is to let the layout manager know about these remote media instances, so that it can display them.

Displaying a Remote Video Feed

To display a remote video feed, you need to add a corresponding remote view to your layout manager instance. Before you can do this, you need to generate a unique identifier for the RemoteMedia instance. This is because a session can contain multiple RemoteMedia instances, one for each remote peer, and each instance needs its own unique identifier. To assign one, you will normally either generate a GUID or use an identifier associated with the remote client. The former method is shown below.

If you are using JavaScript, use the fm.icelink.RemoteMedia class.


var remoteMedia = new RemoteMedia();

remoteMedia.Id = System.Guid.NewGuid().ToString();
RemoteMedia remoteMedia = new RemoteMedia();

remoteMedia.setId(java.util.UUID.randomUUID().toString());
RemoteMedia* remoteMedia = [RemoteMedia new];

[remoteMedia setId:[[NSUUID UUID] UUIDString]];
var remoteMedia = RemoteMedia()

remoteMedia.setId(UUID().uuidString)
var remoteMedia = new fm.icelink.RemoteMedia();

remoteMedia.setId(yourCustomUuidFunction());
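The second option mentioned above, reusing an identifier associated with the remote client, can be sketched as follows. This is an illustrative helper, not part of the IceLink API; "peerId" stands for whatever identifier your signalling layer actually provides.

```java
import java.nio.charset.StandardCharsets;
import java.util.UUID;

// Derives a deterministic media id from a signalling-layer peer id
// using a name-based (type 3) UUID. The same peer always maps to the
// same id, which simplifies view bookkeeping if a peer reconnects.
class PeerMediaId {
    static String fromPeerId(String peerId) {
        byte[] bytes = peerId.getBytes(StandardCharsets.UTF_8);
        return UUID.nameUUIDFromBytes(bytes).toString();
    }
}
```

You would then pass the derived id to setId in place of a freshly generated GUID. Either approach works, as long as each RemoteMedia instance in the session ends up with a distinct id.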

Now that your RemoteMedia instance has an id, you can add it to the layout manager. For each person that joins a session, you will create a new RemoteMedia instance, assign it an id, and then add it to the layout manager. To do so, invoke the AddRemoteView method of the layout manager, and specify the id and view of the RemoteMedia instance as parameters. These are accessible as shown below:


layoutManager.AddRemoteView(remoteMedia.Id, remoteMedia.View);
layoutManager.addRemoteView(remoteMedia.getId(), remoteMedia.getView());
[layoutManager addRemoteViewWithId:[remoteMedia id] view:[remoteMedia view]];
layoutManager.addRemoteView(id: remoteMedia.id(), view: remoteMedia.view())
layoutManager.addRemoteView(remoteMedia.getId(), remoteMedia.getView());

When a user leaves the session, you should remove their remote view from the layout manager. If you do not, the view will remain and it will appear as if the user's video has frozen. Remove the view by invoking the layout manager's RemoveRemoteView method, specifying the id of the RemoteMedia instance as the parameter.


layoutManager.RemoveRemoteView(remoteMedia.Id);
layoutManager.removeRemoteView(remoteMedia.getId());
[layoutManager removeRemoteViewWithId: [remoteMedia id]];
layoutManager.removeRemoteView(id: remoteMedia.id())
layoutManager.removeRemoteView(remoteMedia.getId());

Wrapping Up

You've now learned how to display remote video and play remote audio. If you haven't already, review the section on Starting Local Media, which describes how to capture data from a user's camera and microphone. If you have, go ahead to the Working with Streams section, which describes how to actually send and receive media data between users.