Bandwidth Constraints

Constraining the bandwidth of an outbound media stream is an important consideration in many streaming applications, especially where low-bandwidth conditions are likely to be encountered. The SDK already adjusts the bitrate dynamically at runtime in response to changing network conditions; setting a maximum outbound bitrate on top of this provides a guarantee of predictable behaviour.

The Bandwidth SDP Attribute

The Session Description Protocol (SDP) defines a "bandwidth" (b=) attribute that lets each endpoint indicate the maximum bandwidth (in kilobits per second, or kbps) it has available for each media stream:

...
m=audio ...
b=AS:64
...
m=video ...
b=AS:256
...


In the example above, the client generating this session description indicates that it has 64 kbps available for audio and 256 kbps available for video. The client receiving this session description should therefore do what it can to keep what it sends within those limits.
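Outside of any particular SDK, the attribute can also be added by editing raw SDP text directly. The sketch below is illustrative only (the helper name and SDP snippet are assumptions, not IceLink APIs); it inserts a b=AS line immediately after each m= line, which keeps the example simple but assumes the media section carries no c= line of its own.

```javascript
// Illustrative helper (not an IceLink API): insert a b=AS line after each
// m= line in a raw SDP string, using per-media-type limits in kbps.
function addBandwidthLines(sdp, limitsKbps) {
    return sdp.split('\r\n').map(function (line) {
        var match = line.match(/^m=(\w+)/);
        if (match && limitsKbps[match[1]]) {
            // Attach the bandwidth line to the media section it constrains.
            return line + '\r\nb=AS:' + limitsKbps[match[1]];
        }
        return line;
    }).join('\r\n');
}

var sdp = 'v=0\r\nm=audio 9 RTP/AVP 0\r\nm=video 9 RTP/AVP 96\r\n';
var constrained = addBandwidthLines(sdp, { audio: 64, video: 256 });
// constrained now carries b=AS:64 in the audio section and b=AS:256 in the video section.
```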

Setting Local Bandwidth Constraints

To set local bandwidth constraints:

  1. Get the local description's SDP message.
  2. Add a Bandwidth attribute to each media description in the SDP message.
C#
connection.OnLocalDescription += (c, localDescription) =>
{
    foreach (var mediaDescription in localDescription.SdpMessage.MediaDescriptions)
    {
        var mediaType = mediaDescription.Media.MediaType;
        var bandwidthKbps = 0;
        if (mediaType == FM.IceLink.Sdp.MediaType.Audio)
        {
            bandwidthKbps = 64;
        }
        else if (mediaType == FM.IceLink.Sdp.MediaType.Video)
        {
            bandwidthKbps = 256;
        }
        if (bandwidthKbps > 0)
        {
            // b=AS:{bandwidth}
            mediaDescription.AddBandwidth(new FM.IceLink.Sdp.Bandwidth(
               FM.IceLink.Sdp.BandwidthType.ApplicationSpecific, bandwidthKbps));
        }
    }
};
JavaScript
connection.addOnLocalDescription(function (c, localDescription) {
    var mediaDescriptions = localDescription.getSdpMessage().getMediaDescriptions();
    for (var i = 0; i < mediaDescriptions.length; i++) {
        var mediaDescription = mediaDescriptions[i];
        var mediaType = mediaDescription.getMedia().getMediaType();
        var bandwidthKbps = 0;
        if (mediaType == fm.icelink.sdp.MediaType.getAudio()) {
            bandwidthKbps = 64;
        } else if (mediaType == fm.icelink.sdp.MediaType.getVideo()) {
            bandwidthKbps = 256;
        }
        if (bandwidthKbps > 0) {
            // b=AS:{bandwidth}
            mediaDescription.addBandwidth(new fm.icelink.sdp.Bandwidth(
               fm.icelink.sdp.BandwidthType.getApplicationSpecific(), bandwidthKbps));
        }
    }
});
Objective-C
[connection addOnLocalDescriptionWithBlock:^(FMIceLinkConnection *c, FMIceLinkSessionDescription *localDescription) {
    for (FMIceLinkSdpMediaDescription *mediaDescription in localDescription.sdpMessage.mediaDescriptions) {
        NSString *mediaType = mediaDescription.media.mediaType;
        int bandwidthKbps = 0;
        if ([mediaType isEqualToString:FMIceLinkSdpMediaType.audio]) {
            bandwidthKbps = 64;
        } else if ([mediaType isEqualToString:FMIceLinkSdpMediaType.video]) {
            bandwidthKbps = 256;
        }
        if (bandwidthKbps > 0) {
            // b=AS:{bandwidth}
            [mediaDescription addBandwidth:[FMIceLinkSdpBandwidth 
                bandwidthWithBandwidthType:FMIceLinkSdpBandwidthType.applicationSpecific value:bandwidthKbps]];
        }
    }
}];
Swift
connection?.addOnLocalDescription({ (con: Any!, d: Any!) in
    let localDescription = d as! FMIceLinkSessionDescription
    for desc in (localDescription.sdpMessage().mediaDescriptions()) {
        let mediaDescription = desc as! FMIceLinkSdpMediaDescription
        let mediaType: String = mediaDescription.media().mediaType()
        var bandwidthKbps: Int64 = 0;
        if (mediaType == FMIceLinkSdpMediaType.audio()) {
            bandwidthKbps = 64
        }
        else if (mediaType == FMIceLinkSdpMediaType.video()) {
            bandwidthKbps = 256
        }
        if (bandwidthKbps > 0) {
            // b=AS:{bandwidth}
            mediaDescription.addBandwidth(FMIceLinkSdpBandwidth(bandwidthType: FMIceLinkSdpBandwidthType.applicationSpecific(), value: bandwidthKbps))
        }
    }
})
Java
connection.addOnLocalDescription((c, localDescription) -> {

    for (fm.icelink.sdp.MediaDescription mediaDescription : localDescription.getSdpMessage().getMediaDescriptions()) {
        String mediaType = mediaDescription.getMedia().getMediaType();
        int bandwidthKbps = 0;
        if (mediaType.equals(fm.icelink.sdp.MediaType.getAudio())) {
            bandwidthKbps = 64;
        } else if (mediaType.equals(fm.icelink.sdp.MediaType.getVideo())) {
            bandwidthKbps = 256;
        }
        if (bandwidthKbps > 0) {
            // b=AS:{bandwidth}
            mediaDescription.addBandwidth(new fm.icelink.sdp.Bandwidth(
               fm.icelink.sdp.BandwidthType.getApplicationSpecific(), bandwidthKbps));
        }
    }
});


Honoring Remote Bandwidth Constraints

In an upcoming release of IceLink, bandwidth constraints will be honored automatically. This is one part of a rather large feature set that includes quality-of-service (QoS) prioritization, improved bandwidth estimation, and temporal/spatial scalability options for various codecs.

Until then, to honor remote bandwidth constraints:

  1. Get the remote description's SDP message.
  2. Set the media encoder bitrate to a value below the bandwidth constraint (if applicable).
C#
connection.OnRemoteDescription += (c, remoteDescription) =>
{
    foreach (var mediaDescription in remoteDescription.SdpMessage.MediaDescriptions)
    {
        foreach (var bandwidth in mediaDescription.Bandwidths)
        {
            if (bandwidth.BandwidthType == FM.IceLink.Sdp.BandwidthType.ApplicationSpecific)
            {
                var mediaType = mediaDescription.Media.MediaType;
                if (mediaType == FM.IceLink.Sdp.MediaType.Audio)
                {
                    // Encoders live in the local media pipeline, so the
                    // constraint is applied to localMedia's tracks.
                    var opusEncoders = localMedia.AudioTrack.FindElements(el => el is FM.IceLink.Opus.Encoder, true);
                    foreach (var opusEncoder in opusEncoders)
                    {
                        ((FM.IceLink.Opus.Encoder)opusEncoder).Bitrate = (int)(bandwidth.Value * 0.9); // allow for 10% rtx
                    }
                }
                else if (mediaType == FM.IceLink.Sdp.MediaType.Video)
                {
                    var vpxEncoders = localMedia.VideoTrack.FindElements(el => el is FM.IceLink.Vpx.Encoder, true);
                    foreach (var vpxEncoder in vpxEncoders)
                    {
                        ((FM.IceLink.Vpx.Encoder)vpxEncoder).Bitrate = (int)(bandwidth.Value * 0.9); // allow for 10% rtx
                    }

                    var h264Encoders = localMedia.VideoTrack.FindElements(el => el is FM.IceLink.OpenH264.Encoder, true);
                    foreach (var h264Encoder in h264Encoders)
                    {
                        ((FM.IceLink.OpenH264.Encoder)h264Encoder).Bitrate = (int)(bandwidth.Value * 0.9); // allow for 10% rtx
                    }
                }
            }
        }
    }
};
Java
connection.addOnRemoteDescription((c, remoteDescription) -> {
    for (fm.icelink.sdp.MediaDescription mediaDescription : remoteDescription.getSdpMessage().getMediaDescriptions()) {
        for (fm.icelink.sdp.Bandwidth bandwidth : mediaDescription.getBandwidths()) {
            if (bandwidth.getBandwidthType().equals(fm.icelink.sdp.BandwidthType.getApplicationSpecific())) {
                String mediaType = mediaDescription.getMedia().getMediaType();
                if (mediaType.equals(fm.icelink.sdp.MediaType.getAudio())) {
                    // Encoders live in the local media pipeline, so the
                    // constraint is applied to localMedia's tracks.
                    List<IAudioElement> opusEncoders = localMedia.getAudioTrack().findElements(el -> el instanceof fm.icelink.opus.Encoder, true);
                    for (IAudioElement opusEncoder : opusEncoders) {
                        ((fm.icelink.opus.Encoder)opusEncoder).setBitrate((int)(bandwidth.getValue() * 0.9)); // allow for 10% rtx
                    }
                } else if (mediaType.equals(fm.icelink.sdp.MediaType.getVideo())) {
                    List<IVideoElement> vpxEncoders = localMedia.getVideoTrack().findElements(el -> el instanceof fm.icelink.vpx.Encoder, true);
                    for (IVideoElement vpxEncoder : vpxEncoders) {
                        ((fm.icelink.vpx.Encoder)vpxEncoder).setBitrate((int)(bandwidth.getValue() * 0.9)); // allow for 10% rtx
                    }

                    List<IVideoElement> h264Encoders = localMedia.getVideoTrack().findElements(el -> el instanceof fm.icelink.openh264.Encoder, true);
                    for (IVideoElement h264Encoder : h264Encoders) {
                        ((fm.icelink.openh264.Encoder)h264Encoder).setBitrate((int)(bandwidth.getValue() * 0.9)); // allow for 10% rtx
                    }
                }
            }
        }
    }
});
Objective-C
[connection addOnRemoteDescriptionWithBlock:^(FMIceLinkConnection *c, FMIceLinkSessionDescription *remoteDescription) {
    for (FMIceLinkSdpMediaDescription *mediaDescription in remoteDescription.sdpMessage.mediaDescriptions) {
        for (FMIceLinkSdpBandwidth *bandwidth in mediaDescription.bandwidths) {
            if ([bandwidth.bandwidthType isEqualToString:FMIceLinkSdpBandwidthType.applicationSpecific]) {
                NSString *mediaType = mediaDescription.media.mediaType;
                if ([mediaType isEqualToString:FMIceLinkSdpMediaType.audio]) {
                    // Encoders live in the local media pipeline, so the
                    // constraint is applied to localMedia's tracks.
                    NSArray *opusEncoders = [localMedia.audioTrack findElementsWithWhereBlock:^bool(NSObject<FMIceLinkIAudioElement> *el) {
                        return [el isKindOfClass:[FMIceLinkOpusEncoder class]];
                    } recurse:true];
                    for (FMIceLinkOpusEncoder *opusEncoder in opusEncoders) {
                        opusEncoder.bitrate = (int)(bandwidth.value * 0.9); // allow for 10% rtx
                    }
                } else if ([mediaType isEqualToString:FMIceLinkSdpMediaType.video]) {
                    NSArray *vpxEncoders = [localMedia.videoTrack findElementsWithWhereBlock:^bool(NSObject<FMIceLinkIVideoElement> *el) {
                        return [el isKindOfClass:[FMIceLinkVpxEncoder class]];
                    } recurse:true];
                    for (FMIceLinkVpxEncoder *vpxEncoder in vpxEncoders) {
                        vpxEncoder.bitrate = (int)(bandwidth.value * 0.9); // allow for 10% rtx
                    }

                    NSArray *h264Encoders = [localMedia.videoTrack findElementsWithWhereBlock:^bool(NSObject<FMIceLinkIVideoElement> *el) {
                        return [el isKindOfClass:[FMIceLinkCocoaVideoToolboxH264Encoder class]];
                    } recurse:true];
                    for (FMIceLinkCocoaVideoToolboxH264Encoder *h264Encoder in h264Encoders) {
                        h264Encoder.bitrate = (int)(bandwidth.value * 0.9); // allow for 10% rtx
                    }
                }
            }
        }
    }
}];
JavaScript
// nothing to do - bandwidth constraints are automatically honored in JavaScript
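For completeness, the advertised b=AS value can also be read back from raw remote SDP text without SDK helpers. The sketch below is illustrative only (the function name and SDP snippet are assumptions, not IceLink APIs); it applies the same 10% retransmission headroom used in the samples above.

```javascript
// Illustrative helper (not an IceLink API): read the b=AS value (kbps) for a
// media type from raw SDP and derive an encoder bitrate with ~10% rtx headroom.
function getEncoderBitrate(sdp, mediaType) {
    var lines = sdp.split('\r\n');
    var inSection = false;
    for (var i = 0; i < lines.length; i++) {
        if (lines[i].indexOf('m=') === 0) {
            // Track whether we are inside the media section we care about.
            inSection = lines[i].indexOf('m=' + mediaType) === 0;
        } else if (inSection && lines[i].indexOf('b=AS:') === 0) {
            var bandwidthKbps = parseInt(lines[i].substring(5), 10);
            return Math.floor(bandwidthKbps * 0.9); // allow for 10% rtx
        }
    }
    return null; // no constraint advertised
}

var remoteSdp = 'v=0\r\nm=audio 9 RTP/AVP 0\r\nb=AS:64\r\nm=video 9 RTP/AVP 96\r\nb=AS:256\r\n';
getEncoderBitrate(remoteSdp, 'video'); // → 230 (90% of 256, rounded down)
```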

Bandwidth Adaptation

IceLink 3 has built-in support for bandwidth adaptation. This feature responds to changing network conditions by adjusting the quality of an audio or video stream so that the connection remains smooth. During poor network conditions, the quality is decreased to avoid a disconnection; when network conditions are good, the quality is increased to provide a better experience. To enable bandwidth adaptation, set the BandwidthAdaptationPolicy property of the AudioStream and VideoStream instances that you create. An example of this is below:


C#
stream.BandwidthAdaptationPolicy = FM.IceLink.BandwidthAdaptationPolicy.Enabled;
Java
stream.setBandwidthAdaptationPolicy(fm.icelink.BandwidthAdaptationPolicy.Enabled);
Objective-C
[stream setBandwidthAdaptationPolicy: FMIceLinkBandwidthAdaptationPolicyEnabled];
Swift
stream.setBandwidthAdaptationPolicy(FMIceLinkBandwidthAdaptationPolicyEnabled)
JavaScript
// nothing to do - bandwidth adaptation is controlled by the browser

There are a few "rule of thumb" values that determine when bandwidth adaptation will kick in. In general:

  • for 1080p video, at least 2 Mbps is required
  • for 720p video, at least 1 Mbps is required
  • for 480p video, at least 512 kbps is required
  • for 240p video, at least 256 kbps is required
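As a rough sketch, these thresholds can be expressed as a simple lookup. The names below are illustrative, not IceLink APIs, and 1 Mbps is treated as 1000 kbps:

```javascript
// Illustrative rule-of-thumb table (values in kbps; not an IceLink API).
var minKbpsByResolution = { '1080p': 2000, '720p': 1000, '480p': 512, '240p': 256 };

function meetsAdaptationThreshold(resolution, availableKbps) {
    var min = minKbpsByResolution[resolution];
    return min !== undefined && availableKbps >= min;
}

meetsAdaptationThreshold('720p', 800); // → false: below 1 Mbps, so quality would be reduced
meetsAdaptationThreshold('480p', 800); // → true: 480p fits within 800 kbps
```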

If bandwidth adaptation is enabled and these speeds are not met, the quality of the media stream will be reduced to compensate. As mentioned above, this works both ways: if a drop in network speed is only temporary, the quality will increase again when more bandwidth becomes available.

Wrap Up

If we missed something, or if you have a suggestion on a way to improve this guide, please let us know by contacting support@frozenmountain.com or visiting support.frozenmountain.com.

Thank you, and happy coding!