Start an iOS Project
If you are starting an iOS project, create a new Xcode project. Add the LiveSwitch libraries to your project in Xcode - they are the files under iOS/Libraries in the ZIP file downloaded from the LiveSwitch Console. You should include, at minimum, the following:
libFMLiveSwitch.a
libFMLiveSwitchOpus.a
libFMLiveSwitchVpx.a
libFMLiveSwitchYuv.a
Note
Once you have added the libraries to the Xcode project folder, you still need to link them to the project. To do this, open the main project/solution file on the left and go to the 'Build Phases' tab. Under the 'Link Binary with Libraries' section, select the '+' button, then select 'Add Other' and choose 'Add Files'. Browse to where you stored the libraries in the project and add the new references.
As with other project types, you also need some way to capture audio and video data. LiveSwitch provides a module to handle this. Include:
libFMLiveSwitchCocoa.a
Your project also needs some Apple framework and library dependencies. Include the following:
libz.dylib
AudioToolbox.framework
AVFoundation.framework
CoreAudio.framework
CoreGraphics.framework
CoreMedia.framework
CoreVideo.framework
CFNetwork.framework
GLKit.framework
OpenGLES.framework
Security.framework
VideoToolbox.framework
After you have added these dependencies, the last thing to do is to add the -ObjC linker flag. If you do not do this, you will get load errors at run time. You can add it under the "Other Linker Flags" section of the build settings for your current build target.
Note that we previously recommended that you also add the -all_load linker flag. This is no longer the case. Do not add the -all_load flag, as it can result in duplicate symbol definitions.
Integrate With CallKit
This is a common requirement for iOS apps that use LiveSwitch. Apple has some documentation on getting started with CallKit that shows you how to receive push notifications, handle incoming calls, make outgoing calls, and so on. Once you have finished reading through that, you will probably be left wondering how to wire all of these CallKit features into the LiveSwitch API. Here, we provide some advice on how to integrate your LiveSwitch app with CallKit based on our own experience working with the CallKit API.
Be sure you set the proper audio session category so you can hear audio.
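As a point of reference, this is only a few lines on the shared session; the same configuration appears again in the answer-call handler later in this section, which is where a CallKit app should perform it.
Configure the audio session (sketch)
import AVFoundation
// A minimal sketch: the play-and-record category and voice chat mode ensure call audio is both captured and audible.
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
    try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeVoiceChat)
} catch {
    print("Failed to configure the shared AVAudioSession: \(error)")
}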
- CallKit integration is an application-level concern. The LiveSwitch API can be integrated successfully with CallKit, but this is done entirely in app code. That said, it is recommended that you implement your CallKit integration using the most recently released version of LiveSwitch to ensure you have any improvements to the CocoaOpenGLView, CocoaAudioUnitSource, and the other necessary Cocoa APIs.
- In a CallKit-integrated app you do not configure the AVAudioSession in LocalMedia or RemoteMedia (as our examples do). Instead, you configure the AVAudioSession when you answer the call. CallKit then takes the AVAudioSession over, and finally returns it to your app when the call is activated. We demonstrate this in a code snippet below.
- You should only start LocalMedia after CallKit activates the call and gives you the AVAudioSession.
- You should properly handle the case where the user has just installed the app and needs to grant permissions for the mic and camera. You do not want the app asking for camera and mic permissions while the user is trying to answer their first call with CallKit. The recommended way to achieve this is to create and start a "throw away" LocalMedia on your app's first screen the first time the user opens the app. This way you uncouple acquiring permissions from the receipt of an incoming call (see the sketch after this list).
- When a user answers a call with your app while the phone is not locked, the local view appears black until the call connects and LocalMedia is started. Recall that with CallKit we cannot start LocalMedia until after the connection is established, so this is expected behaviour. It is recommended that you show your users some UI telling them that a call is connecting.
- We have provided abstractions of a Call and a CallManager for your convenience. These classes are based heavily on Apple's CallKit integration examples and you are welcome to use them. CallManager.swift provides a collection of calls and functions to manage them, and Call.swift is a convenience class for maintaining the state of a call.
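Here is a hedged sketch of the permissions point above. The helper name is an assumption for illustration; a real app might instead create and start a throw-away LocalMedia, which triggers the same system prompts.
Request permissions on first launch (sketch)
import AVFoundation
// Hypothetical helper: request camera and microphone access up front so the
// system prompts do not appear while the user is answering their first CallKit call.
func requestMediaPermissionsOnFirstLaunch(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { videoGranted in
        AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
            DispatchQueue.main.async {
                completion(videoGranted && audioGranted)
            }
        }
    }
}
Calling this from the first view controller the user sees keeps the permission prompts well away from the CallKit answer flow. With that out of the way, here are the Call and CallManager classes mentioned above.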
Call.swift
//
// Call.swift
// Chat
//
// Copyright © 2017 Frozen Mountain Software. All rights reserved.
//
import Foundation
class Call {
let uuid: UUID
let outgoing: Bool
let handle: String
init(uuid: UUID, outgoing: Bool = false, handle: String) {
self.uuid = uuid
self.outgoing = outgoing
self.handle = handle
}
var connectingDate: Date? {
didSet {
stateDidChange?()
hasStartedConnectingDidChange?()
}
}
var connectDate: Date? {
didSet {
stateDidChange?()
hasConnectedDidChange?()
}
}
var endDate: Date? {
didSet {
stateDidChange?()
hasEndedDidChange?()
}
}
var stateDidChange: (() -> Void)?
var hasStartedConnectingDidChange: (() -> Void)?
var hasConnectedDidChange: (() -> Void)?
var hasEndedDidChange: (() -> Void)?
var hasConnected: Bool {
get {
return connectDate != nil
}
set {
connectDate = newValue ? Date() : nil
}
}
var hasEnded: Bool {
get {
return endDate != nil
}
set {
endDate = newValue ? Date() : nil
}
}
func answer() {
/*
Simulate the answer becoming connected immediately, since
the example app is not backed by a real network service
*/
hasConnected = true
}
func end() {
/*
Simulate the end taking effect immediately, since
the example app is not backed by a real network service
*/
hasEnded = true
}
}
CallManager.swift
//
// CallManager.swift
// Chat
//
// Copyright © 2017 Frozen Mountain Software. All rights reserved.
//
import Foundation
import CallKit
@available(iOS 10.0, *)
class CallManager {
static let CallsChangedNotification = Notification.Name("CallManagerCallsChangedNotification")
private let callController = CXCallController()
private(set) var calls = [Call]()
func callWithUUID(uuid: UUID) -> Call? {
guard let index = calls.index(where: { $0.uuid == uuid }) else {
return nil
}
return calls[index]
}
func addCall(_ call: Call) {
calls.append(call)
call.stateDidChange = { [weak self] in
self?.postCallsChangedNotification()
}
postCallsChangedNotification()
}
func removeCall(_ call: Call) {
guard let index = calls.index(where: { $0 === call }) else { return }
calls.remove(at: index)
postCallsChangedNotification()
}
func removeAllCalls() {
calls.removeAll()
postCallsChangedNotification()
}
func end(call: Call) {
let endCallAction = CXEndCallAction(call: call.uuid)
let transaction = CXTransaction()
transaction.addAction(endCallAction)
requestTransaction(transaction)
}
private func requestTransaction(_ transaction: CXTransaction) {
callController.request(transaction) { error in
if let error = error {
print("Error requesting transaction: \(error)")
} else {
print("Requested transaction successfully")
}
}
}
private func postCallsChangedNotification() {
NotificationCenter.default.post(name: type(of: self).CallsChangedNotification, object: self)
}
}
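The snippets that follow all live in a single delegate class that owns the CXProvider and the CallManager defined above. Its exact shape is up to you; what follows is a minimal, hedged sketch of the scaffolding these snippets assume. The class name ProviderDelegate, the localized name, the handle type, and the reportIncomingCall helper are illustrative assumptions, and self._app in the later snippets refers to your own object wrapping the LiveSwitch client and channel logic (not shown here).
ProviderDelegate skeleton (sketch)
import CallKit
@available(iOS 10.0, *)
class ProviderDelegate: NSObject, CXProviderDelegate {
    private let provider: CXProvider
    let callManager: CallManager
    // self._app, used in the snippets below, would be your LiveSwitch application wrapper; it is omitted here.
    init(callManager: CallManager) {
        self.callManager = callManager
        // Describe how CallKit should present calls from this app.
        let configuration = CXProviderConfiguration(localizedName: "Chat")
        configuration.supportsVideo = true
        configuration.maximumCallsPerCallGroup = 1
        configuration.supportedHandleTypes = [.generic]
        provider = CXProvider(configuration: configuration)
        super.init()
        provider.setDelegate(self, queue: nil)
    }
    // Hypothetical helper: report an incoming call to CallKit, typically from your VoIP push notification handler.
    func reportIncomingCall(uuid: UUID, handle: String, completion: ((Error?) -> Void)? = nil) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: handle)
        update.hasVideo = true
        provider.reportNewIncomingCall(with: uuid, update: update) { error in
            if error == nil {
                self.callManager.addCall(Call(uuid: uuid, handle: handle))
            }
            completion?(error)
        }
    }
    // The CXProviderDelegate methods below are filled in by the snippets that follow.
    func providerDidReset(_ provider: CXProvider) { }
}
Apple's CallKit sample uses a very similar shape; the important point is that the delegate owns both the CXProvider and your CallManager, so the action handlers below can reach both.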
Now, let's take a look at how to answer a CallKit call. We recommend using a provider class to manage all of the CallKit related events. This ProviderDelegate can be used in conjunction with the provided Call and CallManager classes to encapsulate all CallKit related code. There are a few things our CallKit ProviderDelegate needs to do. Obviously, it needs to answer calls and end calls. As discussed in point three above, it also needs to start LocalMedia after CallKit has activated the AVAudioSession. It should also handle the case where the CallKit CXProvider, your telephony provider, is externally reset. Let's handle the reset first because it is nice and simple:
providerDidReset
func providerDidReset(_ provider: CXProvider) {
// End all outstanding calls ...
for call in callManager.calls {
call.end()
}
// ... and remove them from the CallManager.
callManager.removeAllCalls()
}
Easy enough. Now let's look at something more complex - answering an incoming CallKit call. One of the key pieces in this snippet is the event handler for onConnected. This event handler is set here, where the CXAnswerCallAction is performed, but fired from the connection handling logic when your connection transitions to the FMLiveSwitchConnectionState.connected state.
Answer call action
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
action.fail()
return
}
// Configure your audio session.
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeVoiceChat)
}
catch {
FMLiveSwitchLog.debug(withMessage: "Failed to configure shared AVAudioSession.")
}
// Load up the ViewController that will handle your conference UI.
// It is recommended to create LocalMedia here, but do not start it.
...
// Start the async call to join.
_app?.joinAsync()
// Ensure this callback is set regardless of whether joinAsync succeeds or fails.
// If registration in joinAsync fails for any reason, we need reconnect logic
// to take over, and this handler must still be invoked.
self._app?.onConnected = FMLiveSwitchAction0(block: { () in
DispatchQueue.main.async {
FMLiveSwitchLog.debug(withMessage: "Fulfilling call action.")
if (!action.isComplete) {
action.fulfill()
}
}
})
call.answer()
}
When you have answered a call, CallKit takes control of the (shared) AVAudioSession that you configured and makes use of it. It then passes it back to your app, letting you know that the AVAudioSession has been activated for the given provider. It is at this point, and not before, that you should start your LocalMedia, as shown here. At this point you must also make sure that you tell iOS you want to be notified when audio interruptions end. This is necessary so that when an audio interruption ends, iOS lets your app know, and the CocoaAudioSource then resumes raising frames:
didActivate audioSession
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
// Ensure we will be notified of interruptions ending.
// This is necessary for the AudioSource to resume raising frames when the CXSetHeldCallAction
// ends for a CallKit Call that was on hold.
NotificationCenter.default.post(name: Notification.Name.AVAudioSessionInterruption, object: nil, userInfo:[AVAudioSessionInterruptionTypeKey : NSNumber.init(value: AVAudioSessionInterruptionType.ended.rawValue)]);
// You CANNOT start local media until you have an activated AVAudioSession.
// Remote media is created when the connection is established, but it is
// not started until it begins receiving data, which is after CallKit
// has activated the audio session, so that works out automatically.
self._app?.startLocalMedia()
}
When CallKit deactivates the audio session, also take care of any non-call related audio:
didDeactivate audioSession
public func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
FMLiveSwitchLog.debug(withMessage: "CALLKIT: Received \(#function) - didDeactivate - audioSession isInputAvailable=\(audioSession.isInputAvailable) ")
/*
Restart any non-call related audio now that the app's audio session has been
de-activated after having its priority restored to normal.
*/
}
Next, let's take care of a call that gets muted:
Set Muted Call Action
public func provider(_ provider: CXProvider, perform action: CXSetMutedCallAction) {
guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
action.fail()
return
}
self._app?.getLocalMedia().setAudioMuted(action.isMuted)
// self._app?.toggleStreamDisabled(streamType: FMLiveSwitchStreamType.audio)
action.fulfill()
}
Now, let's take care of a call that is put on hold:
Set Held Call Action
public func provider(_ provider: CXProvider, perform action: CXSetHeldCallAction) {
guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
action.fail()
return
}
FMLiveSwitchLog.info(withMessage: "CALLKIT: CXSetHeldCallAction uuid = \(action.callUUID), isOnHold = \(action.isOnHold)")
action.fulfill()
}
So, that takes care of answering a call, LiveSwitch connection management, and activating the call, but you'll also need to handle the user ending a call via CallKit. This involves tearing down your connections and LocalMedia, and generally cleaning up, and then of course letting CallKit know that you are done.
End call action
func provider(_ provider: CXProvider, perform action: CXEndCallAction) {
guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
action.fail()
return
}
// Shut down your connections ...
self._app?.leaveAsync().then(resolveFunctionBlock: { (o: Any?) -> FMLiveSwitchFuture! in
// Stop LocalMedia ...
return self._app?.stopLocalMedia().then(resolveActionBlock: { (o: Any?) in
// Cleanup and load your default ViewController.
DispatchQueue.main.async {
self._app?.cleanup()
// ... load default VC
}
},
rejectActionBlock: { (e: NSException?) in
FMLiveSwitchLog.error(withMessage: "Could not stop local media", ex: e)
})
},
rejectActionBlock: { (e: NSException?) in
FMLiveSwitchLog.error(withMessage: "Could not leave conference", ex: e)
})
.then(resolveActionBlock: { [unowned self] (o: Any?) in
DispatchQueue.main.async {
self._app?.cleanup()
}
})
call.end()
callManager.removeCall(call)
action.fulfill() // Tell CallKit you are done.
}
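The CXEndCallAction above covers the user hanging up from the CallKit UI. The reverse direction, where the remote party hangs up or your LiveSwitch connection closes on its own, also needs to be reported to CallKit so the system call UI is dismissed. Here is a hedged sketch, assuming an onRemoteHangup hook in your own connection-closed handling and the provider and callManager fields from the delegate skeleton above:
Report a remotely ended call (sketch)
// Hypothetical hook: call this from your connection-closed handling when the remote side hung up.
func onRemoteHangup(uuid: UUID) {
    guard let call = callManager.callWithUUID(uuid: uuid) else { return }
    // Tell CallKit the call ended so the in-call UI is dismissed; no CXEndCallAction is needed here.
    provider.reportCall(with: uuid, endedAt: Date(), reason: .remoteEnded)
    call.end()
    callManager.removeCall(call)
}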