In this chapter, we move back to native Xamarin. We will integrate native audio functions for processing a sound file using the AVFoundation framework in iOS with the AVAudioSession, AVAudioSettings, and AVAudioRecorder objects. In Android, you will use the MediaPlayer object from the Android.Media library.
Expected knowledge: the AVAudioSession, AVAudioSettings, and AVAudioRecorder classes, or the Android MediaPlayer and MediaRecorder classes.

In this chapter, you will learn the following:
- The SoundHandler interface
- Implementing the SoundHandler using the AVAudioPlayer framework
- NSLayout
- The AudioPlayerPageViewModel
- Implementing the SoundHandler using the MediaPlayer framework

Now that we are back to Xamarin native, it's time to get your mind out of XAML and back into native iOS and Android. We aren't going to spend much time on user interface design; the focus will be on audio processing using the native frameworks.
Having already looked into cross-platform applications and code sharing, we are going to apply some of those principles to native development and set up an MVVM architecture. Let's begin by creating three different projects: an iOS project, an Android project, and a PCL:
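Before creating the projects, it may help to picture the shared contract the PCL will expose. The sketch below is a hypothetical version of such an interface; the namespace and member names are assumptions for illustration, not the book's actual code. Each platform project would supply its own implementation: AVAudioPlayer/AVAudioRecorder on iOS and MediaPlayer/MediaRecorder on Android.

```csharp
// Hypothetical shared audio contract living in the PCL.
// Platform projects (iOS, Android) register their own implementations,
// typically via an IoC container or dependency service.
namespace SoundRecorder.Shared
{
    public interface ISoundHandler
    {
        // True while a sound file is playing.
        bool IsPlaying { get; }

        // Prepare a sound file for playback.
        void Load(string filePath);

        // Playback controls.
        void Play();
        void Pause();
        void Stop();

        // Recording controls.
        void StartRecording(string filePath);
        void StopRecording();
    }
}
```

The view models in the PCL can then depend solely on ISoundHandler, keeping all platform-specific audio code out of the shared layer.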