How do I integrate with CallKit?

This is a common requirement for iOS apps that use IceLink. Apple has some documentation on getting started with CallKit that shows you how to receive push notifications, handle incoming calls, make outgoing calls, and so on. Once you have finished reading through that, you will probably be left wondering how to wire all of this CallKit functionality into the IceLink API. Here, we provide some advice on integrating your IceLink app with CallKit, based on our own experience working with the CallKit API.

Be sure you set the proper audio session category so you can hear audio!

  1. CallKit integration is an application-level concern. The IceLink API can be integrated successfully with CallKit, but this is done entirely in application code. That said, it is recommended that you implement your CallKit integration using the most recently released version of IceLink to ensure you have any improvements to the CocoaOpenGLView, CocoaAudioUnitSource, and the other necessary Cocoa APIs.

  2. In a CallKit-integrated app you do not configure the AVAudioSession in LocalMedia or RemoteMedia (as our examples do). Instead, you configure the AVAudioSession when you answer the call. CallKit then takes over the AVAudioSession and returns it to your app when the call is activated. We demonstrate this in a code snippet below.
  3. You should only start LocalMedia after CallKit activates the call and gives you the AVAudioSession.
  4. You should properly handle the case where the user has just installed the app and needs to grant permissions for the mic and camera. You do not want the app to be asking for camera and mic permissions while the user is trying to answer their first call with CallKit. The recommended way to achieve this is to create and start a "throw away" LocalMedia on your app's first screen the first time the user opens the app, as shown in the sketch after this list. This way you decouple acquiring permissions from the receipt of an incoming call.
  5. When a user answers a call with your app while the phone is not locked, the local view will appear black until the call connects and LocalMedia is started. Recall that with CallKit we cannot start LocalMedia until after the connection is established, so this is expected behaviour. It is recommended that you show your users something else in the meantime, such as UI telling them that a call is connecting.
  6. We have provided abstractions of a Call and a CallManager for your convenience. These classes are based heavily on Apple's CallKit integration examples and you are welcome to use them. CallManager.swift provides a collection of calls and functions to manage them, and Call.swift is a convenience class for maintaining the state of a call.

    Call.swift, CallManager.swift
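
As for point four above, here is a minimal sketch of the "throw away" LocalMedia approach. It assumes a LocalMedia wrapper class along the lines of the one in the IceLink example apps, whose start() and stop() methods return an FMIceLinkFuture; the constructor arguments, and where you call this from (your first view controller, for example), will depend on your own application.

Acquire media permissions
func acquireMediaPermissionsOnFirstLaunch() {
    // The throw-away LocalMedia exists only to trigger the mic/camera permission
    // prompts, decoupling them from the first incoming CallKit call.
    // LocalMedia is assumed to be your own wrapper, as in the IceLink examples.
    let throwAwayMedia = LocalMedia() // construct with whatever arguments your wrapper requires

    throwAwayMedia.start().then(resolveActionBlock: { (o: Any?) in
        // The permission prompts have been shown (if they were needed),
        // so this media is no longer required.
        _ = throwAwayMedia.stop()
    },
    rejectActionBlock: { (e: NSException?) in
        FMIceLinkLog.error(withMessage: "Could not start throw-away local media", ex: e)
    })
}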


Now, let's take a look at how to answer a CallKit call. We recommend using a provider class to manage all of the CallKit related events. This ProviderDelegate can be used in conjunction with the provided Call and CallManager classes to encapsulate all CallKit related code integrations. There are a few things our CallKit ProviderDelegate will need to do. Obviously, it needs to answer calls and end calls. As discussed in point three above it also needs to start LocalMedia after CallKit has activated the AVAudioSession. It should also handle the case where the CallKit CXProvider, your telephony provider, is externally reset. Let's handle the reset first because it is nice and simple:

providerDidReset
func providerDidReset(_ provider: CXProvider) {
    // End all outstanding calls ...
    for call in callManager.calls {
        call.end()
    }
    // ... and remove them from the CallManager.
    callManager.removeAllCalls()
}


Easy enough. Now let's look at something more complex - answering an incoming CallKit call. One of the key pieces in this snippet is the event handler for onConnected. This handler is set here, where the CXAnswerCallAction is performed, but is fired from the connection handling logic when your connection transitions to the FMIceLinkConnectionState.connected state.

Answer call action
func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {

    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }


    // Configure your audio session.
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord)
        try AVAudioSession.sharedInstance().setMode(AVAudioSessionModeVoiceChat)
    }
    catch {
        FMIceLinkLog.debug(withMessage: "Failed to configure shared AVAudioSession.")
    }

    // Load up the ViewController that will handle your conference UI.
    // It is recommended to create LocalMedia here, but do not start it.
    ...

    // Start the async call to join.
    _app?.joinAsync()

    // Ensure this callback is set regardless of whether joinAsync succeeds or fails.
    // If registration in joinAsync fails for any reason, the reconnect logic needs to
    // take over, and this handler will still be invoked.
    self._app?.onConnected = FMIceLinkAction0(block: { () in
        DispatchQueue.main.async {
            FMIceLinkLog.debug(withMessage: "Fulfilling call action.")
            if (!action.isComplete) {
                action.fulfill()
            }
        }
    })

    call.answer()
}


When you have answered a call, CallKit takes control of the (shared) AVAudioSession that you configured and makes use of it. It then passes the session back to your app, letting you know that the AVAudioSession has been activated for the given provider. It is at this point, and not before, that you should start your LocalMedia, as shown here. At this point you must also tell iOS that you want to be notified when audio interruptions end, so that when an interruption does end, iOS will let your application know and the CocoaAudioSource will then resume raising frames:

didActivate audioSession
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {

    // Ensure we will be notified of interruptions ending.
    // This is necessary for the AudioSource to resume raising frames when the CXSetHeldCallAction
    // ends for a CallKit call that was on hold.
    NotificationCenter.default.post(name: Notification.Name.AVAudioSessionInterruption,
                                    object: nil,
                                    userInfo: [AVAudioSessionInterruptionTypeKey: NSNumber(value: AVAudioSessionInterruptionType.ended.rawValue)])


    // You CANNOT start local media until you have an activated AVAudioSession.
    // Remote media is created when the connection is established, but it is
    // not started until it begins receiving data, which is after CallKit
    // has activated the audio session, so that works out automatically.
    self._app?.startLocalMedia()
}

Also take care of any non-call related audio:

didDeactivate audioSession
public func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {

    FMIceLinkLog.debug(withMessage: "CALLKIT: Received \(#function) - didDeactivate - audioSession isInputAvailable=\(audioSession.isInputAvailable)")

    /*
     Restart any non-call related audio now that the app's audio session has been
     de-activated after having its priority restored to normal.
     */
}

Next, let's take care of a call that gets muted:

Set Muted Call Action
public func provider(_ provider: CXProvider, perform action: CXSetMutedCallAction) {
    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }

    // Mute or unmute local audio to match the CallKit mute state.
    self._app?.getLocalMedia().setAudioMuted(action.isMuted)

    // Alternatively, you could disable the audio stream entirely:
    // self._app?.toggleStreamDisabled(streamType: FMIceLinkStreamType.audio)

    action.fulfill()
}

Now let's take care of a call being put on hold:

Set Held Call Action
public func provider(_ provider: CXProvider, perform action: CXSetHeldCallAction) {
    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }

    FMIceLinkLog.info(withMessage: "CALLKIT: CXSetHeldCallAction uuid = \(action.callUUID), isOnHold = \(action.isOnHold)")

    action.fulfill()
}
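
The snippet above only acknowledges the hold action. If you also want to stop sending audio while the call is held, one option (an application-level choice, not something CallKit requires) is to mute local audio based on action.isOnHold, mirroring the mute handler above. A sketch of that variation:

Set Held Call Action (muting local audio)
public func provider(_ provider: CXProvider, perform action: CXSetHeldCallAction) {
    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }

    FMIceLinkLog.info(withMessage: "CALLKIT: CXSetHeldCallAction uuid = \(action.callUUID), isOnHold = \(action.isOnHold)")

    // Stop sending audio while the call is held and resume when it is taken off hold.
    // As noted in the didActivate handler above, the interruption-ended notification is
    // what lets the audio source resume raising frames once the hold ends.
    self._app?.getLocalMedia().setAudioMuted(action.isOnHold)

    action.fulfill()
}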

So, that takes care of answering a call, IceLink connection management, and activating the call, but you'll also need to handle the user ending a call via CallKit. This involves tearing down your connection(s) and LocalMedia, and generally cleaning up, and then of course letting CallKit know that you are done.

End call action
func provider(_ provider: CXProvider, perform action: CXEndCallAction) {
    guard let call = callManager.callWithUUID(uuid: action.callUUID) else {
        action.fail()
        return
    }

    // Shut down your connections ...
    self._app?.leaveAsync().then(resolveFunctionBlock: { (o: Any?) -> FMIceLinkFuture! in
        // Stop LocalMedia ...
        return self._app?.stopLocalMedia().then(resolveActionBlock: { (o: Any?) in

            // Cleanup and load your default ViewController.
            DispatchQueue.main.async {
                self._app?.cleanup()
                // ... load default VC
            }
        },
        rejectActionBlock: { (e: NSException?) in
            FMIceLinkLog.error(withMessage: "Could not stop local media", ex: e)
        })
    },
    rejectActionBlock: { (e: NSException?) in
        FMIceLinkLog.error(withMessage: "Could not leave conference", ex: e)
    })
    .then(resolveActionBlock: { [unowned self] (o: Any?) in
        DispatchQueue.main.async {
            self._app?.cleanup()
        }
    })

    call.end()
    callManager.removeCall(call)

    action.fulfill() // Tell CallKit you are done.
}
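
One last note: the CXEndCallAction handler above runs when the call is ended through CallKit (for example, from the native in-call UI). When the user hangs up from your own in-app UI instead, you go the other way and ask CallKit to end the call by requesting a CXEndCallAction through a CXCallController; CallKit then calls back into the handler above. The provided CallManager.swift, being based on Apple's examples, most likely already exposes something along these lines, but here is a minimal sketch (the callController property and the requestEnd(call:) name are our own, and we assume the Call class exposes its uuid):

Request end call
// Ask CallKit to end a call from your own UI. CXCallController, CXEndCallAction,
// and CXTransaction are standard CallKit types; Call and its uuid property are
// assumed to come from the provided Call.swift.
let callController = CXCallController()

func requestEnd(call: Call) {
    let endCallAction = CXEndCallAction(call: call.uuid)
    let transaction = CXTransaction(action: endCallAction)

    callController.request(transaction) { error in
        if let error = error {
            FMIceLinkLog.error(withMessage: "Error requesting end-call transaction: \(error.localizedDescription)")
        }
    }
}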