Frequently Asked Questions
Cloud
What data is stored by LiveSwitch Cloud?
See Data Collection.
Does LiveSwitch Cloud support WebRTC Bundling?
Yes, it does. LiveSwitch supports WebRTC Bundling as per the Negotiating Media Multiplexing Using the Session Description Protocol draft. Implementation covers SDP negotiation, stream multiplexing, and demultiplexing.
For SFU and MCU connections, WebRTC Bundling is enabled by default: the Connection.BundlePolicy property is set to MaxBundle, which is the recommended value.
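If you need to set the policy explicitly, a minimal C# sketch might look like the following (the BundlePolicy enum syntax is an assumption; the property and value names come from above):
connection.BundlePolicy = BundlePolicy.MaxBundle; // recommended default for SFU/MCU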
If I'm on a restrictive network, how can I add IP addresses to an allow list?
LiveSwitch services' IP addresses change over time. To get all the addresses for a given Application, you can programmatically query the Applications/{ApplicationID}/IPAddresses endpoint and then update your firewall rules to add those IP addresses to the allow list.
The following cURL command shows how to query the endpoint:
curl -X GET "https://api.liveswitch.io/Applications/<your-applicationId>/IPAddresses?api-version=1.0" -H "accept: */*" -H "X-API-Key: <your-api-key>"
An example response looks like this:
{
  "sync": [ // These are IP addresses for the Gateway Servers
    "3.138.248.52",
    "3.20.255.205",
    "3.14.163.200",
    "3.140.141.165"
  ],
  "media": [ // These are IP addresses for the Media Servers
    "3.137.158.14",
    "3.137.158.14"
  ],
  "api": [ // These are the IP addresses for the API Servers
    "3.16.99.72",
    "3.14.95.25",
    "18.221.91.34"
  ]
}
What if network restrictions prevent the usage of WebRTC?
Certain firewalls conduct deep packet inspection that prevents WebRTC connectivity. In that case, consider using the Media-over-WebSockets protocol fallback. LiveSwitch Cloud allows you to connect to WebRTC-based conferences over a WebSocket tunnel. See Media-over-WebSockets.
Can I interact with the Billing and Dashboard LiveSwitch Cloud API endpoints directly?
No, the Billing and Dashboard endpoints are cross-origin restricted. However, you can interact with other LiveSwitch Cloud API endpoints directly.
Client
Why are your DLLs unsigned?
Adding a signature is easier than removing one. By shipping the libraries without a signature, you can decide within your app whether to sign them or not. This also lets you ship libraries with your organization's digital signature. To add a digital signature to any of the .dlls, just use SignTool, which ships with Visual Studio:
signtool.exe sign /f path\to\cert.p12 /p {your cert.p12 password} /t http://timestamp.verisign.com/scripts/timstamp.dll path\to\FM.dll
signtool.exe sign /f path\to\cert.p12 /p {your cert.p12 password} /t http://timestamp.verisign.com/scripts/timstamp.dll path\to\FM.LiveSwitch.dll...
To make things easier, you can add these commands as a pre-build step to your Visual Studio project so the copied libraries are always signed.
Why can't two apps start local media at the same time on Windows?
Windows doesn't allow two processes to access the camera at the same time - it uses exclusive locks. Unix (Linux/macOS) doesn't have this limit - it uses shared locks. If you have one webcam on Windows, the following are expected behaviors:
- Start up two .NET LiveSwitch example apps, and the second one fails to obtain local media.
- Run the ActiveX examples in IE with multiple browser windows, or multiple tabs in the same browser window, and the second fails to obtain local media.
- Run Chrome and then run Firefox in Windows, and Firefox rejects the request to open local media.
- Run Firefox first and then Chrome in Windows, and Chrome rejects the request to open local media.
- Run any combination of processes on Windows and the second one fails to obtain local media.
There are two noted exceptions. Chrome and Firefox each work around the Windows exclusive camera lock by running a centralized process that owns the exclusive lock and directing the camera contents to each tab using inter-process communication. This means that both Chrome and Firefox can grant multiple tabs access to the camera.
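LiveSwitch cannot work around the lock itself, but your app can detect the failure when a second process tries to open an in-use camera. A hedged C# sketch, assuming local media start returns a Future with a Fail callback in the SDK's promise style:
localMedia.Start().Fail(ex =>
{
    // On Windows, a second process opening an in-use camera typically fails here.
    Log.Error("Unable to start local media: " + ex.Message);
});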
How do I force a TURN relay or hide my private IP address?
The IceGatherPolicy property on Connection allows you to restrict candidates to relay-only:
IceGatherPolicy.Relay
connection.IceGatherPolicy = IceGatherPolicy.Relay;
The same property allows you to turn off "host" candidates so your private IP address is never shared:
IceGatherPolicy.NoHost
connection.IceGatherPolicy = IceGatherPolicy.NoHost;
How can I securely vend out TURN credentials?
A simple approach is to have your web/application server generate credentials on request from an authenticated and authorized user using the following algorithm:
Generate TURN Credentials
var turnUsername = appUserId + ":" + currentUnixTimestamp;
var turnPassword = Base64(HmacSha1(serverSecret, turnUsername));
The variables are as follows:
- appUserId is the user ID your app uses to identify a user.
- currentUnixTimestamp is, as its name suggests, the current Unix timestamp.
- serverSecret is some string you can share (secretly) between your web/application server and your TURN server.
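As a concrete illustration, here is a minimal C# sketch of that algorithm using System.Security.Cryptography (the helper name GenerateTurnCredentials is hypothetical):
using System;
using System.Security.Cryptography;
using System.Text;

static (string Username, string Password) GenerateTurnCredentials(string appUserId, string serverSecret)
{
    // turnUsername = appUserId + ":" + currentUnixTimestamp
    var unixNow = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
    var turnUsername = appUserId + ":" + unixNow;

    // turnPassword = Base64(HmacSha1(serverSecret, turnUsername))
    using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(serverSecret)))
    {
        var turnPassword = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(turnUsername)));
        return (turnUsername, turnPassword);
    }
}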
Your client code can then plug these values into a new IceServer:
Connection.AddIceServer
connection.AddIceServer(new IceServer("your.turn.server.address", turnUsername, turnPassword));
Your TURN server can validate them by using serverSecret
and the TURN username to re-generate the expected password:
TurnAuthResult.FromPassword
var server = new TurnServer((e) =>
{
return TurnAuthResult.FromPassword(Base64(HmacSha1(serverSecret, e.Username))); // optional: cache result
});
How do I enable bandwidth adaptation?
Bandwidth adaptation automatically downgrades resolution and frame rate based on the sender's target bitrate. LiveSwitch enables bandwidth adaptation by default, and this is considered the best practice. If, for some reason, you want to turn off bandwidth adaptation, set BandwidthAdaptationPolicy to Disabled on your VideoStream or AudioStream in your client code
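For example, a minimal C# sketch (assuming a BandwidthAdaptationPolicy enum with a Disabled value, as named above, and the usual two-media VideoStream constructor):
var videoStream = new VideoStream(localMedia, remoteMedia);
videoStream.BandwidthAdaptationPolicy = BandwidthAdaptationPolicy.Disabled; // enabled by default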
How do I capture a frame and convert it to a particular image format?
You can use the Snapshot API to grab a video frame dynamically and then convert it to a supported image format:
- Media (LocalMedia/RemoteMedia)
  - GrabVideoFrame(): Future<VideoBuffer> method.
- VideoTrack
  - GrabFrame(): Future<VideoBuffer> method. This delivers a reference to the next processed raw (YUV or RGB) video buffer.
- VideoBuffer
  - Convert(VideoFormat): VideoBuffer method. While it doesn't have the performance of libyuv, this method converts any VideoBuffer to any of the supported video formats, even in JavaScript. Extremely useful for support and for writing captured images to disk.
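Putting those together, a hedged C# sketch (the Future.Then callback pattern and the VideoFormat.Rgb constant are assumptions; verify both against your SDK version):
localMedia.GrabVideoFrame().Then(videoBuffer =>
{
    // Convert the raw frame to RGB so it can be handed to an image encoder.
    var rgbBuffer = videoBuffer.Convert(VideoFormat.Rgb);
    // From here, pass the buffer to your platform's image encoder to write it to disk.
});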
Can I monitor my connection statistics?
You can monitor your connection statistics in the following ways:
You can expose connection statistics via webhook. If you want to monitor connection statistics for both your Media Servers and your Clients, select the Stats checkbox under CONFIGURE EVENT TRIGGERS → Connection in the Webhook Configuration of a given Channel Configuration in your LiveSwitch Console.
The Connection Stats Event model encompasses all Connection Statistics that are gathered through the Stats API. Therefore, you can also get the connection statistics using the Stats API.
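In client code, a hedged sketch of pulling stats on demand (this assumes the connection exposes a Future-based GetStats() as part of the Stats API, and that the stats object has a ToJson() serializer; verify both against your SDK version):
connection.GetStats().Then(connectionStats =>
{
    // Hand the stats off to your own monitoring pipeline.
    Log.Info("Connection stats: " + connectionStats.ToJson()); // ToJson() is an assumption
});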
What is a ConnectionDeadStream error?
Typically, when debugging a failed connection by looking through your logs, the error logs look like the following:
Error encountered: ConnectionDeadStream Dead stream detected. Causing transport: IceTransport. Connection will shut down.
This log message comes from a dead stream error. It indicates that, due to high latency, the media connection cannot be maintained. It occurs when LiveSwitch determines that none of the connection's candidates are able to route/receive traffic and maintain the connection. For more information on ICE and ICE candidates, see What Are STUN, TURN, and ICE?.
A candidate pair loses connectivity if it has not received traffic in the last 3 seconds. Here, traffic can mean payload (RTP, RTCP, DTLS packets, etc.), ICE Keep-Alives generated by remote endpoints, or ICE Keep-Alive responses received from remote endpoints. If the candidate pair loses connectivity, then the ICE transport layer switches to another candidate pair that is still in the connected state. If there are no such candidate pairs available, then the ICE transport transitions to the disconnected state.
The ICE transport then starts monitoring its state for the DeadStreamTimeout amount of time, which is 5 seconds in LiveSwitch. Within this time frame, if no candidate pair regains connectivity, then the ICE transport transitions to the failed state, which ultimately causes the connection to also transition to the failed state, and the connection is lost.
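If you want to react to dead streams in client code, a hedged C# sketch (assuming the OnStateChange event and a ConnectionState.Failed value, as commonly used in the client SDKs):
connection.OnStateChange += (conn) =>
{
    if (conn.State == ConnectionState.Failed)
    {
        // A dead stream lands here after DeadStreamTimeout (5 seconds) expires.
        // Create and open a new connection to recover.
    }
};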
Can I handle a capture device being disconnected mid-conference?
Absolutely. The cross-platform LocalMedia class allows you to tie into an event that is fired when a MediaTrack stops processing media for any reason (that is, the capture device has become unavailable).
LocalMedia.OnVideoStopped
localMedia.OnVideoStopped += () => {
    Log.Info("Someone unplugged the camera!");
};
LocalMedia.OnAudioStopped
localMedia.OnAudioStopped += () => {
    Log.Info("Someone unplugged the mic!");
};
In the browser, you must tear down the connection and LocalMedia, and then start it all back up again to recover gracefully from this (somewhat unlikely) scenario.
Note
Safari does not raise the MediaTrack.OnStopped event, and because of this, onVideoStopped and onAudioStopped cannot be raised by the LiveSwitch API in Safari.
Chrome and Screen Sharing
In Chrome, when screen sharing, the Stop sharing button that Chrome presents triggers the OnVideoStopped event. You can hook into this event to handle the user pressing the Stop sharing button.
Our clients need to run behind a proxy. What kind of support is there for HTTP proxied client environments?
At this time, proxy support is limited to the .NET platform. This means that all .NET-based clients, including ActiveX for IE11, support proxying. For JavaScript-based clients, proxying is provided by the browser, so proxy support works provided the browser supports it.
Proxy configuration is done at the system level. LiveSwitch's TCP socket class simply queries the system to detect proxy support, and then uses the system's proxy configuration to negotiate a proxied HTTP(S) connection.
Note
It's best practice to let traffic run on default HTTP ports 80 and 443.
What type of proxying is supported?
HTTP(S) proxies are supported. This means that TCP and TCP-over-TLS traffic can be routed through a proxy, which in turn implies that proxied connections must be over relay.
Note
UDP proxying and SOCKS proxies are not supported.
TURN vs TURNS
- Most HTTP proxies do not forward TURN traffic since the TURN protocol doesn't look like HTTP.
- Most HTTP proxies do forward TURNS traffic since it is TLS-encrypted and looks the same as HTTPS.
- Having TURNS available is recommended for any deployment that anticipates requiring a proxy. For more information, see the Security documentation.
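For example, a hedged C# sketch reusing the IceServer pattern from the TURN credentials answer above (the TURNS URL scheme and port 443 are assumptions for your deployment):
connection.AddIceServer(new IceServer("turns:your.turn.server.address:443", turnUsername, turnPassword));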
Does LiveSwitch run on FIPS-compliant Windows?
Yes. The LiveSwitch .NET client SDK runs on Windows with the FIPS-compliant security option enabled. The LiveSwitch .NET client SDK is tested on FIPS-enabled Windows machines, and P2P/SFU/MCU connections work as expected (over both STUN and TURN). Audio Streams, Video Streams, and Data Streams are supported.
For more information on LiveSwitch FIPS compliance, contact Sales.
What issues should I be aware of when using WebSockets for network switching?
When using WebSockets, network switching (switching between WiFi and cellular data networks) and reconnection can be problematic and may take more time, sometimes up to a minute. For optimal performance, it's recommended to avoid environments with jitter greater than 50 ms and high round-trip times (RTT), as these conditions can affect WebSocket reliability.
How should I handle changing video resolution in the browser?
The recommended approach for changing video resolution in the browser is to apply new constraints to the existing video track, rather than creating a new video stream. This method is better supported by browsers and helps avoid errors. Here is an example of how to implement this:
async function upgradeResolutionViaConstraints() {
    try {
        // Request initial video stream with specific constraints
        const stream = await navigator.mediaDevices.getUserMedia({
            video: {
                width: { exact: 640 },
                height: { exact: 480 }
            }
        });
        const track = stream.getVideoTracks()[0];
        const settings = track.getSettings();
        console.log("Video has width and height: " + settings.width + "x" + settings.height);
        // Apply new constraints to change the resolution
        await track.applyConstraints({
            width: { exact: 1280 },
            height: { exact: 720 }
        });
        const settings2 = track.getSettings();
        console.log("Video now has width and height: " + settings2.width + "x" + settings2.height);
    } catch (error) {
        console.error('Error updating camera resolution:', error);
    }
}
Android
I hear "garbled" audio. What is the problem?
If you are using Android Studio, turn off "Instant Run." Enabling "Instant Run" results in severe audio and video performance problems.
Why are my logs flooded with messages about buffer underruns?
If you are using Android Studio, turn off "Instant Run" and turn off your debugger. Otherwise, they can have a significant effect on performance that results in logs being flooded with messages.
macOS
My app crashes with an unrecognized selector. How do I fix this?
- If you are getting a crash with an unrecognized selector in a project that does not bundle the LiveSwitch client SDK, add the -ObjC linker flag to your app project.
- When bundling the client SDK as a framework for a Swift app, add the -ObjC linker flag to both your app project and the framework project.
When building my project, I'm getting lots of "duplicate symbols" errors.
Remove the -all_load linker flag from your app project.
Why can't Safari connect to Chrome on Android?
There are two mandatory-to-implement video codecs for WebRTC-compatible web browsers - VP8 and H.264.
Currently, Safari on iOS/macOS supports H.264 everywhere and does not support VP8 anywhere, which runs contrary to Apple's claim that it supports WebRTC. Apple has even made efforts to make VP8 optional in WebKit's libwebrtc for the purpose of disabling it in Safari.
At the same time, Chrome on Android supports VP8 everywhere and supports H.264 on devices with hardware encoders as of version M57. This means some Android devices only support VP8 because a software encoder is not yet available.
Since Safari only supports H.264 and Chrome on some Android devices only supports VP8, there are some cases where the two web browsers cannot connect. Unfortunately, nothing can be done about this aside from sending feedback to Apple and Google requesting that they follow the WebRTC specification and support both VP8 and H.264 in a future release of their web browsers.
JavaScript
How do I choose the front facing camera in JavaScript?
When you create LocalMedia in JavaScript, you can pass in a WebRTC constraints object instead of a boolean for the 'audio' and 'video' parameters, e.g.:
var audio = true;
var video = {
    facingMode: 'user' // use the front camera
};
var screen = false;
var localMedia = new fm.liveswitch.LocalMedia(audio, video, screen);
Mozilla has a great reference on MDN covering the media track constraints you can use.
iOS
On iOS sounds that play while using LiveSwitch (like notification sounds) have their volume increased/decreased for no reason. What is going on?
LiveSwitch uses the VoiceProcessingIO audio module, which includes echo cancellation, microphone gain control, and other voice processing features. Unfortunately, there are some side effects to using this, which impact the volume of other sounds played while the conference is active, and even afterwards. These are known issues in iOS with no workaround other than to use the RemoteIO audio module instead (set FMLiveSwitchCocoaAudioUnitSource.useVoiceProcessingIO to false), but this disables echo cancellation and generally results in a less-than-desirable user experience.
More details about this iOS bug are scattered across the web.
Unity
When I use Bluetooth headphones, there are gaps in my audio. I can't hear parts of what I'm supposed to hear. How do I fix this?
Bluetooth headphones might have audio gaps if your network latency is high. To resolve this, increase the latency value by using the static reference AudioClipSink.Latency. For example:
AudioClipSink.LATENCY = 150;
When I pause the app and return back to it while in a session, audio and video aren’t synced on the receiver end. This issue also persists when the system settings of an app are modified while in a session. How do I fix this?
To prevent these sync issues, in the OnApplicationPause callback, stop the LocalMedia when the app is paused and restart it when the app is reopened. The OnApplicationPause callback is invoked whenever the app is paused or reopened.
// Stopping/restarting the local media to avoid the delay when the app is put in the background.
private void OnApplicationPause(bool pause)
{
    var localMedia = _LocalMedia;
    if (localMedia == null)
    {
        return;
    }
    if (pause)
    {
        localMedia.Stop();
    }
    else
    {
        localMedia.Start();
    }
}