Starting a New Project

If you are familiar with the IceLink SDK and are ready to start on your own application, it's time to create a new project. Many platforms have specific requirements or build options that you must specify. This guide tells you what you need to know to get up and running on the platform of your choice. Do not skip this section, as much of the information here is difficult to discover on your own.

Starting a .NET Project

Create a new solution in Visual Studio, and add the following DLLs as references:

  • FM.IceLink.dll
  • FM.IceLink.Opus.dll
  • FM.IceLink.Vpx.dll
  • FM.IceLink.Yuv.dll

These provide you with the Opus and VP8/VP9 codecs. You can also add FM.IceLink.OpenH264.dll if you want to support the H.264 codec. You will also need some way to capture the user's audio and video. The default way to do this is to use the AForge and NAudio libraries, which allow you to capture audio and video data from a user's microphone and camera. To use the default configuration, include the following DLLs:

  • AForge.dll
  • FM.IceLink.AForge.dll
  • FM.IceLink.NAudio.dll
  • NAudio.dll
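
If you prefer to edit the project file directly rather than using the Visual Studio reference dialog, the references can be added as a small .csproj fragment. This is a minimal sketch; the HintPath values are assumptions based on a lib folder in your project, so adjust them to wherever you placed the IceLink DLLs.

<ItemGroup>
  <!-- Core IceLink SDK and codec wrappers -->
  <Reference Include="FM.IceLink">
    <HintPath>lib\FM.IceLink.dll</HintPath>
  </Reference>
  <Reference Include="FM.IceLink.Opus">
    <HintPath>lib\FM.IceLink.Opus.dll</HintPath>
  </Reference>
  <Reference Include="FM.IceLink.Vpx">
    <HintPath>lib\FM.IceLink.Vpx.dll</HintPath>
  </Reference>
  <Reference Include="FM.IceLink.Yuv">
    <HintPath>lib\FM.IceLink.Yuv.dll</HintPath>
  </Reference>
  <!-- Add AForge, FM.IceLink.AForge, NAudio, and FM.IceLink.NAudio the same way -->
</ItemGroup>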

If you are using WebSync, you will also need FM.IceLink.WebSync4.dll and a copy of the WebSync binaries. Specifically, you will need:

  • FM.IceLink.WebSync4.dll
  • FM.WebSync.dll
  • FM.WebSync.Subscribers.dll

You will also need to include architecture-specific builds for several of these libraries. These can be found in the NET/DotNetXX/Libraries/lib/win_x86 and NET/DotNetXX/Libraries/lib/win_x64 folders of the standard IceLink distribution, where XX is the .NET version (e.g., DotNet20, DotNet35). The following rules apply:

  • If you are using FM.IceLink.AudioProcessing.dll, include libaudioprocessingfm.dll.
  • If you are using FM.IceLink.OpenH264.dll, include libopenh264fm.dll.
  • If you are using FM.IceLink.Opus.dll, include libopusfm.dll.
  • If you are using FM.IceLink.Vpx.dll, include libvpxfm.dll.
  • If you are using FM.IceLink.Yuv.dll, include libyuvfm.dll.


Native libs

To include these libraries in your project, put them in a lib folder and set each file's Copy to Output Directory property to Copy if newer.
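
In the project file, the same setting looks like this. A minimal sketch, assuming a lib folder at the project root and the x86 binaries; add the win_x64 copies the same way:

<ItemGroup>
  <!-- Copied to the output folder preserving the lib\win_x86 path; PreserveNewest is "Copy if newer" -->
  <Content Include="lib\win_x86\libopusfm.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="lib\win_x86\libvpxfm.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="lib\win_x86\libyuvfm.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>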

Starting a UWP Project (including Hololens)

Create a new Universal Windows solution in Visual Studio, and add the following DLLs as references:

  • FM.IceLink.dll
  • FM.IceLink.Uwp.dll
  • FM.IceLink.Uwp.Win2D.dll
  • FM.IceLink.Opus.dll
  • FM.IceLink.Vpx.dll
  • FM.IceLink.Yuv.dll

These provide you with the Opus and VP8/VP9 codecs. You will also need some way to capture the user's audio. Note that video capture is provided natively by UWP, so no special library is needed for it. The default way to support audio capture is to use the NAudio library, which allows you to capture audio data from a user's microphone. To use the default configuration, include FM.IceLink.NAudio.dll.

H.264 support for UWP

You can also add FM.IceLink.OpenH264.dll if you want to support the H.264 codec. However, the OpenH264 library cannot currently be downloaded at runtime due to UWP restrictions. To support H.264 for UWP, you would have to include the OpenH264 binary provided by Cisco in your release and pay the corresponding royalties to MPEG LA (http://www.mpegla.com/main/programs/AVC/Pages/Intro.aspx).

If you are capturing audio and video, you will need to add the microphone and webcam device capabilities to your project. You must also add the internetClient, internetClientServer, and privateNetworkClientServer capabilities to allow full network access. You can use the Manifest Designer, or add them directly to Package.appxmanifest:

Package.appxmanifest
<Capabilities>
	<Capability Name="internetClient" />
	<Capability Name="privateNetworkClientServer" />
	<Capability Name="internetClientServer" />
	<DeviceCapability Name="microphone" />
	<DeviceCapability Name="webcam" />
</Capabilities>


Necessary capabilities

  • To ensure that UWP clients can communicate over private networks, you must include the privateNetworkClientServer capability.
  • To ensure that UWP clients can act as servers for network communication (this enables the client to connect in the answering SDP role, not just the offering role), you must include the internetClientServer capability.

If you are using WebSync, you will also need FM.IceLink.WebSync4.dll and a copy of the WebSync binaries. Specifically, you will need:

  • FM.IceLink.WebSync4.dll
  • FM.WebSync.dll
  • FM.WebSync.Subscribers.dll

You will also need to include architecture-specific builds for several of these libraries. These can be found in the UWP/Libraries/lib/win_x86 and UWP/Libraries/lib/win_x64 folders of the standard IceLink distribution. The following rules apply:

  • If you are using FM.IceLink.AudioProcessing.dll, include libaudioprocessingfm.dll.
  • If you are using FM.IceLink.OpenH264.dll, include libopenh264fm.dll.
  • If you are using FM.IceLink.Opus.dll, include libopusfm.dll.
  • If you are using FM.IceLink.Vpx.dll, include libvpxfm.dll.
  • If you are using FM.IceLink.Yuv.dll, include libyuvfm.dll.

To have the correct architecture-specific libraries copied into your build artifacts, you can set up conditional build rules in your .csproj file. These libraries need to be in the root of your build artifacts. For example, the following rules ensure that the x64 libraries are copied to the root of the build artifacts for builds that target the x64 platform, and likewise for x86. Platforms and build configurations are edited in the Configuration Manager.

<ItemGroup Condition="'$(Platform)' == 'x64'">
  <Content Include="lib\win_x64\libaudioprocessingfm.dll">
    <Link>libaudioprocessingfm.dll</Link>
  </Content>
  <Content Include="lib\win_x64\libopusfm.dll">
    <Link>libopusfm.dll</Link>
  </Content>
  <Content Include="lib\win_x64\libvpxfm.dll">
    <Link>libvpxfm.dll</Link>
  </Content>
  <Content Include="lib\win_x64\libyuvfm.dll">
    <Link>libyuvfm.dll</Link>
  </Content>
</ItemGroup>
<ItemGroup Condition="'$(Platform)' == 'x86'">
  <Content Include="lib\win_x86\libaudioprocessingfm.dll">
    <Link>libaudioprocessingfm.dll</Link>
  </Content>
  <Content Include="lib\win_x86\libopusfm.dll">
    <Link>libopusfm.dll</Link>
  </Content>
  <Content Include="lib\win_x86\libvpxfm.dll">
    <Link>libvpxfm.dll</Link>
  </Content>
  <Content Include="lib\win_x86\libyuvfm.dll">
    <Link>libyuvfm.dll</Link>
  </Content>
</ItemGroup>

Starting an Android or Java Project

If you are starting an Android application, create a new Android Studio project. Otherwise, create an IntelliJ project. Add the following jars to your project:

  • fm.icelink.jar
  • fm.icelink.opus.jar
  • fm.icelink.vpx.jar
  • fm.icelink.yuv.jar

These provide you with the minimum set of Opus/VPX capabilities. You can also add fm.icelink.openh264.jar if you want to enable H.264 support. You will also need a library to capture audio and video data from the user. For an Android application, include the following:

  • fm.icelink.android.jar

For non-Android Java applications, include:

  • fm.icelink.java.jar
  • fm.icelink.java.sarxos.jar
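
For an Android Studio project, one way to pull these in is a fileTree dependency in your module's build.gradle. This is a minimal sketch, assuming you copy the JARs into the module's libs folder:

dependencies {
    // Picks up fm.icelink.jar, fm.icelink.opus.jar, etc. from <module>/libs
    implementation fileTree(dir: 'libs', include: ['*.jar'])
}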

If you are using WebSync, you will also need fm.icelink.websync4.jar and a copy of the WebSync binaries. Specifically, you will need:

  • fm.icelink.websync4.jar
  • fm.websync.jar
  • fm.websync.subscribers.jar

You will also need to include architecture-specific native libraries for several of these. These can be found in the Android/Libraries/jniLibs/arm64-v8a, Android/Libraries/jniLibs/armeabi-v7a and Android/Libraries/jniLibs/x86 folders. The following rules apply:

  • If you are using fm.icelink.audioprocessing.jar, include libaudioprocessingfmJNI.so.
  • If you are using fm.icelink.openh264.jar, include libopenh264fmJNI.so.
  • If you are using fm.icelink.opus.jar, include libopusfmJNI.so.
  • If you are using fm.icelink.vpx.jar, include libvpxfmJNI.so.
  • If you are using fm.icelink.yuv.jar, include libyuvfmJNI.so.

If this is a bit confusing and you want to be safe, you can drop the entire jniLibs folder into your project directory. This ensures that you always have the correct native libraries for any application.
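
By default, Android Studio picks up native libraries from src/main/jniLibs. If you place the folder somewhere else, you can point Gradle at it explicitly; this is a sketch assuming a jniLibs folder at the module root:

android {
    sourceSets {
        main {
            // Load the .so files from <module>/jniLibs instead of src/main/jniLibs
            jniLibs.srcDirs = ['jniLibs']
        }
    }
}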

Working with ProGuard

If you are using ProGuard, you will also have to specify some additional configuration settings so that it shrinks your code correctly. ProGuard is a code-shrinking tool available in Android SDK Tools version 25.0.10 and higher. In your ProGuard configuration file, add the following rules:

-keep public class fm.icelink.audioprocessing.** { *; }
-keep public class fm.icelink.openh264.** { *; }
-keep public class fm.icelink.opus.** { *; }
-keep public class fm.icelink.vpx.** { *; }
-keep public class fm.icelink.yuv.** { *; }

If you are using ProGuard and your app calls a method through the Java Native Interface (i.e., it uses any of the native libraries noted above), then you cannot minify your APK builds, as ProGuard incorrectly removes this code. Disable minification in your release build type:

buildTypes {
    release {
        minifyEnabled false
        proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        signingConfig signingConfigs.release
    }
}

Disable Instant Run

Developers should disable "Instant Run" in Android Studio.

When "Instant Run" is enabled in Android Studio it can result in severe audio and video performance problems on some development machines: https://stackoverflow.com/questions/35168753/instant-run-in-android-studio-2-0-how-to-turn-off

Permissions

You will need to add permissions to your manifest as follows:

<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
<uses-permission android:name="android.permission.BLUETOOTH" />
<!-- largeHeap is necessary to avoid OOM errors on older devices -->
<application android:label="..." android:theme="..." android:name="android.support.multidex.MultiDexApplication" android:largeHeap="true"></application>
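
Note that declaring permissions in the manifest is not sufficient on Android 6.0 (API 23) and higher: CAMERA and RECORD_AUDIO are dangerous permissions that must also be requested at runtime. A minimal sketch, using the support library that the manifest above already references:

import android.Manifest;
import android.content.pm.PackageManager;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;

// Call from your Activity before starting local media capture.
private void requestMediaPermissions() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED
        || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        // The result arrives in onRequestPermissionsResult with this request code.
        ActivityCompat.requestPermissions(this,
            new String[] { Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO },
            1);
    }
}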

Starting a Cocoa Project

If you are starting an iOS or macOS project, create a new Xcode project. Add the IceLink libraries to your project in Xcode. They are the files under either iOS/Libraries or macOS/Libraries, depending on which platform you are developing for. You should include, at minimum, the following:

  • libFMIceLink.a
  • libFMIceLinkOpus.a
  • libFMIceLinkVpx.a
  • libFMIceLinkYuv.a

As with other project types, you will also need some way to capture audio and video data. IceLink provides a module to handle this. Include:

  • libFMIceLinkCocoa.a

If you are using WebSync, you will also need libFMIceLinkWebSync4.a and a copy of the WebSync binaries. Specifically, you will need:

  • libFMIceLinkWebSync4.a
  • libFMWebSync.a
  • libFMWebSyncSubscribers.a

Your project also needs some Apple framework dependencies. These vary slightly depending on whether you are developing for iOS or macOS. Include the following frameworks:

  • libz.dylib
  • AudioToolbox.framework
  • AudioUnit.framework (macOS only)
  • AVFoundation.framework
  • CoreAudio.framework
  • CoreGraphics.framework
  • CoreMedia.framework
  • CoreVideo.framework
  • CFNetwork.framework (iOS only)
  • GLKit.framework (iOS only)
  • OpenGLES.framework (iOS only)
  • Security.framework
  • VideoToolbox.framework

After you have added these dependencies, the last thing to do is to add the -ObjC linker flag. If you do not, you will get load errors at run time. You can add this in the "Other Linker Flags" section of the build settings for your current build target.
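
If you manage build settings through an .xcconfig file instead of the Xcode UI, the equivalent setting is a one-liner. A sketch; the $(inherited) entry simply preserves any flags set elsewhere:

OTHER_LDFLAGS = $(inherited) -ObjC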

Note that we previously recommended that you also add the -all_load linker flag. This is no longer the case. Do not add the -all_load flag, as it can result in duplicate symbol definitions.

Starting a JavaScript Project

If you are starting a JavaScript project, create a new project in the IDE of your choice, whether that's WebStorm or Notepad. There are no optional dependencies to include with JavaScript, as all hardware and media codec functionality is provided by the browser. Using a <script> tag, include the core SDK file:

  • fm.icelink.js

If you are using WebSync, you will also need to include fm.icelink.websync4.js and a copy of the WebSync libraries. Specifically, you will need:

  • fm.icelink.websync4.js
  • fm.websync.js
  • fm.websync.subscribers.js
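
Wiring these up is a matter of ordinary script tags. This is a minimal sketch, assuming the files sit in a scripts folder next to your page and that each file loads before the files that depend on it:

<script src="scripts/fm.icelink.js"></script>
<!-- Only needed if you are using WebSync for signalling -->
<script src="scripts/fm.websync.js"></script>
<script src="scripts/fm.websync.subscribers.js"></script>
<script src="scripts/fm.icelink.websync4.js"></script>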

Starting a Xamarin iOS Project

The [Downloading IceLink](/icelink3/getting-started/Downloading-IceLink.html) section mentions that Xamarin iOS uses AOT (ahead-of-time) compilation, as opposed to other C# targets, which use JIT (just-in-time) compilation. Because of this, you must ensure that mtouch links the native libraries so that they are available at compile time. To do this, add the following MSBuild directive to your Xamarin iOS .csproj file:

<MtouchExtraArgs>-gcc_flags "-L${ProjectDir}/libs/native -lvpxfm-iOS -lopusfm-iOS -lyuvfm-iOS -force_load ${ProjectDir}/libs/native/libvpxfm-iOS.a -force_load ${ProjectDir}/libs/native/libopusfm-iOS.a -force_load ${ProjectDir}/libs/native/libyuvfm-iOS.a"</MtouchExtraArgs>

If you don't want to edit the .csproj file directly, you can instead specify this through the IDE. Open the project property window and look for the "Monotouch Arguments" property. Copy the contents of the MtouchExtraArgs element here.

If you do not specify this, Xamarin iOS won't compile for physical devices, and the application will throw run-time exceptions when run on the iOS simulator.

As with other platforms, you will need to include architecture-specific native libraries for Xamarin iOS. Include the libraries that can be found in the Xamarin/Libraries/iOS/lib/native folder. The following rules apply:

  • If you are using FM.IceLink.Opus.dll, include libopusfm-iOS.a and libopus-iOS.a.
  • If you are using FM.IceLink.Vpx.dll, include libvpxfm-iOS.a and libvpx-iOS.a.
  • If you are using FM.IceLink.Yuv.dll, include libyuvfm-iOS.a and libyuv-iOS.a.

If you want to be safe, you can include the entire lib folder in your project. This will ensure that you will always have the correct native libraries.

Mono Dynamic Registrar Flag

If you are using Mono 5.10.x or higher, there is a known issue with Xamarin iOS where the dynamic registrar is removed at build time. The native libraries will fail to load without it. The workaround is to add the "--optimize=-remove-dynamic-registrar" flag to MtouchExtraArgs in the .csproj file, which prevents Mono from removing the dynamic registrar.

After you add this, your .csproj file should contain the following:

<MtouchExtraArgs>--optimize=-remove-dynamic-registrar -gcc_flags "-L${ProjectDir}/libs/native -lvpxfm-iOS -lopusfm-iOS -lyuvfm-iOS -force_load ${ProjectDir}/libs/native/libvpxfm-iOS.a -force_load ${ProjectDir}/libs/native/libopusfm-iOS.a -force_load ${ProjectDir}/libs/native/libyuvfm-iOS.a"</MtouchExtraArgs>

Starting a Xamarin Android Project

For Xamarin Android, be aware that Xamarin Android runs two garbage collectors: one for Java and one for .NET. This makes garbage collection very slow relative to all other platforms, so try to avoid any unnecessary allocations.

To specify which native libraries will be built into your Xamarin Android project, add the following to your .csproj file:

<AndroidSupportedAbis>armeabi-v7a;x86;arm64-v8a</AndroidSupportedAbis>

This ensures that native libraries are included for all of the necessary architectures.

As with other platforms, you will need to include architecture-specific native libraries for Xamarin Android. Do not include the Android JNI libraries. Instead, include the shared object libraries that can be found in the Xamarin/Libraries/Android/lib/arm64-v8a, Xamarin/Libraries/Android/lib/armeabi-v7a and Xamarin/Libraries/Android/lib/x86 folders. The following rules apply:

  • If you are using FM.IceLink.AudioProcessing.dll, include libaudioprocessingfm.so.
  • If you are using FM.IceLink.OpenH264.dll, include libopenh264fm.so.
  • If you are using FM.IceLink.Opus.dll, include libopusfm.so.
  • If you are using FM.IceLink.Vpx.dll, include libvpxfm.so.
  • If you are using FM.IceLink.Yuv.dll, include libyuvfm.so.

If you want to be safe, you can include the entire lib folder in your project. This will ensure that you will always have the correct native libraries.
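
To bundle the .so files into the APK, they can be declared as AndroidNativeLibrary items in the .csproj. This is a minimal sketch, assuming the folders above are copied into a lib folder in your project; the target ABI is inferred from the directory name in the path:

<ItemGroup>
  <AndroidNativeLibrary Include="lib\armeabi-v7a\libopusfm.so" />
  <AndroidNativeLibrary Include="lib\armeabi-v7a\libvpxfm.so" />
  <AndroidNativeLibrary Include="lib\armeabi-v7a\libyuvfm.so" />
  <!-- Repeat for the arm64-v8a and x86 copies -->
</ItemGroup>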

For release builds, we strongly recommend using AOT (ahead-of-time) compilation, as it significantly improves the initial connection time. To enable AOT, add:

<AotAssemblies>True</AotAssemblies>
<EnableLLVM>True</EnableLLVM>

... to the PropertyGroup that targets Release (e.g. <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">).
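
Putting the pieces together, the release PropertyGroup ends up looking something like this (a sketch; your existing Release group will contain other properties as well):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <!-- ...existing release settings... -->
  <AndroidSupportedAbis>armeabi-v7a;x86;arm64-v8a</AndroidSupportedAbis>
  <AotAssemblies>True</AotAssemblies>
  <EnableLLVM>True</EnableLLVM>
</PropertyGroup>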

Permissions

You will need to add permissions to your manifest as follows:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.INTERNET" />
<!-- largeHeap is necessary to avoid OOM errors on older devices -->
<application android:theme="..." android:label="..." android:largeHeap="true"></application>

Starting a Raspberry Pi Project

Raspberry Pi users will need to modify two settings in the audio library (PulseAudio) shipped with the Pi. In the /etc/pulse/daemon.conf file, add the following settings:

default-sample-format = s16le
default-sample-rate = 48000

After modifying these settings, you must kill the daemon and restart it. You can do that with the following commands:

pulseaudio --kill
pulseaudio --start

You must also ensure that the PulseAudio server restarts after a reboot, or the SDK will not function properly.
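
How you do this depends on your OS image. On images that ship PulseAudio's systemd user units, one approach is to enable the per-user service; on older images, PulseAudio autospawns on demand as long as autospawn is not disabled in /etc/pulse/client.conf. A sketch of the systemd approach:

# Verify the daemon is running (exits 0 if so)
pulseaudio --check

# Enable the per-user service so PulseAudio starts at login
systemctl --user enable pulseaudio.service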

Wrapping Up

Now that you've got a project started, you need to let the SDK know that you have a license. The next section on Configuring Your License Key discusses the various types of licenses and how you can provide these to the SDK.