
Getting Started with iOS Audio Playback

Updated on July 14, 2013
Kevin is a software developer with 20 years' experience designing and building business intelligence and system integration solutions.


The iOS SDK provides excellent APIs for audio playback through the AVFoundation framework. The following tutorial creates an iOS app as a brief introduction to handling different audio files; the wav, m4a, and mp3 codecs are represented in this introduction, but the API can handle many more.

What you will take away:

  • How to implement AVFoundation
  • How to use AVAudioPlayer
  • How to set up a UIPickerView
  • How to load files from the resource bundle
  • How to interact with AVAudioPlayer to adjust volume using a UISlider

This quick tutorial demonstrates audio playback by walking you through the steps to create a simple iPhone iOS app. The app features a UIPickerView of different audio files stored in the main resource bundle. When the user selects a file, a variable holds the filename; we will use that value to locate the actual file and begin playback. Once the file has been opened and playback has begun, you will be able to adjust the volume.

The tutorial will show you how to set up the UIPickerView and how to create and initialize the AVAudioPlayer from the AVFoundation framework. You will also learn how to implement a volume control.

Running AVFoundation AVPlayer App

Project Setup

A good place to start is to create a Single View Application project in Xcode. You can name your app anything that fits your fancy. Once the project is set up, select the top node in the Project Navigator, which opens the project summary page on the right-hand side of the Xcode IDE. Scroll toward the bottom until you locate "Linked Frameworks and Libraries". Click the "+" button and, in the search field that appears, start typing AVFoundation. Select it from the list and click the "Add" button. You can accept the other defaults on the page. You can also drag the framework into the "Frameworks" group in the Project Navigator.

Next, create a new group (File > New > Group), or right-click the top node and select New Group from the context menu. Name it "audio", then right-click it and select "Add Files to …" (followed by your project name). Browse to a folder of sound files on your computer; you can use files from any of the numerous royalty-free sound websites, or use the files from my project.

Now with the project setup complete you can start building the Storyboard that will allow a user to select a sound file from the main Resource bundle and initiate playback.


Building the Storyboard

The Storyboard is the main UI canvas, where you can create the kind of sophisticated user interfaces found in the multitude of apps in the iTunes App Store. The sound playback app has a very simple one: it contains a UIPickerView and a UISlider.

To begin, open the Storyboard, drag a UIPickerView from the palette on the right, and also drag a UISlider onto the canvas of the Interface Builder. To interact with the UISlider, open the Assistant Editor (the tuxedo icon in the toolbar; or from the Xcode menu, View > Assistant Editor > Show Assistant Editor; or press Option+Command+Return). Then drag a connection (ctrl+drag) from the UISlider to the open header file, naming the newly created IBAction something like "adjustVolume". The matching method stub is automatically added to the implementation file; you will add code to it later.

Next you will need to connect the UIPickerView. As with the UISlider, ctrl+drag a connection to the open header file, and name the IBOutlet soundPickList or something meaningful. To make the UIPickerView interactive you will need to implement the UIPickerViewDelegate and UIPickerViewDataSource protocols. This has two parts: first you create the connections in the Storyboard, and later you add the protocol declarations to the header file. Ctrl+drag a connection from the UIPickerView to the View Controller proxy (the yellow icon in the panel at the bottom of the ViewController on the Interface Builder canvas). When you release the mouse button, a popover appears allowing you to select the delegate and/or dataSource. Select one to make the connection, then repeat the operation for the other.

You can select the "Standard Editor" icon in the toolbar to go back to the normal view. This completes the setup of the Storyboard for the app. Next you will start adding code to the header and implementation files.

Setting the dataSource and delegate for the UIPickerView in the Storyboard



Coding the ViewController

The last part of the project is the ViewController. These classes act as the intermediary between the view and the model in an MVC-designed app. Let's start with the header (code listing 1).

First, add the protocol declarations, namely UIPickerViewDelegate, UIPickerViewDataSource, and AVAudioPlayerDelegate, which are placed between angle brackets next to the @interface statement.

Next, below the IBOutlet and IBAction that were added in the previous step, add an NSArray property for the sound files, which will be the data source for the UIPickerView; an NSString property to hold the selected filename when the user picks a file to play; and, of course, an AVAudioPlayer to play back the sounds from the audio files.

The last bit of code to be declared is a method called playAudioFile:, which takes the name of the selected file as a parameter and passes it to the AVAudioPlayer object above.


The remainder of this tutorial covers the implementation file, where all the pieces come together.

To implement a UIPickerView and its protocols, you will need to add a few required methods, which are listed below.

  • - (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView

  • - (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component

  • - (void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component

The first method defines the number of columns that will appear in the picker. In this AVAudioPlayback app the picker has only one column, so you can set the return value to 1. The second method defines the number of rows the picker needs to display, which is usually the length of the data source array: return [sounds count]. The last required method is called with the index of the selected row when the user turns the selector in the picker; the selected value is stored in the selectedSound NSString variable.

But to fully implement the methods and variables, you start by synthesizing your variables using the @synthesize keyword (refer to the source code below). Next you need to initialize and populate the sounds NSArray. You do this in viewDidLoad, the ViewController method that is called once the view has finished loading.

The playAudioFile: method requires more work, as you can see in the implementation code in code listing 2 below. First you need an NSRange variable to pinpoint the location of the "." in the filename, because locating the file uses NSBundle's pathForResource:ofType: method, which takes two parameters: the filename itself and the file extension. You get these by defining a filename NSString variable initialized with the substring up to (but not including) the ".", and a fileExtension NSString variable initialized with the substring that follows it, as you can see in code listing 2 below.
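As an aside, Foundation already provides path-handling helpers that do this split for you, and they handle the last "." in the name rather than the first. A minimal sketch, using one of the project's sample filenames:

```objc
#import <Foundation/Foundation.h>

// Sketch: NSString's path helpers split a filename without manual NSRange math.
NSString *selectedSound = @"slackbanger.m4a";
NSString *filename = [selectedSound stringByDeletingPathExtension]; // @"slackbanger"
NSString *fileExtension = [selectedSound pathExtension];            // @"m4a"
```

The manual rangeOfString: approach in the listing works for these files, but it finds the first "." in the string, so a name containing an extra dot would split in the wrong place; pathExtension avoids that.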

Now you can create the NSURL variable as previously mentioned, and use the url variable as the initialization parameter for the _audioPlayer object. Once this is done, all you need to do is call the prepareToPlay and play methods to start playback.

The final method to implement is adjustVolume:, which receives a sender parameter of the id type (a generic object). First cast sender to UISlider, then pass the slider's value to the _audioPlayer volume property to adjust the volume during playback.
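Since the AVAudioPlayer volume property ranges from 0.0 (silent) to 1.0 (full volume), it helps to give the slider the same range so its value can be passed through unchanged. A hedged sketch, assuming a volumeSlider IBOutlet wired to the UISlider (the original project only wires the IBAction, so this outlet is hypothetical):

```objc
// In viewDidLoad: match the slider's range to AVAudioPlayer's 0.0–1.0 volume scale.
// volumeSlider is a hypothetical IBOutlet, not part of the original project.
self.volumeSlider.minimumValue = 0.0f;
self.volumeSlider.maximumValue = 1.0f;
self.volumeSlider.value = 1.0f; // start at full volume
```

With the ranges aligned, the adjustVolume: method can assign volume.value directly without any scaling.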

That's it. Now you know how to handle audio files, and you can build on this simple example to create more sophisticated apps. The complete iOS app project is available for download.


//  ViewController.h
//  AudioPlayback
//  Created by Kevin Languedoc on 7/1/13.
//  Copyright (c) 2013 Landgo Interactive. All rights reserved.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController<UIPickerViewDataSource, UIPickerViewDelegate, AVAudioPlayerDelegate>

@property (strong, nonatomic) IBOutlet UIPickerView *soundPickList;
- (IBAction)adjustVolume:(id)sender;

@property(nonatomic, strong)NSArray *sounds;
@property(nonatomic, strong)NSString *selectedSound;
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

- (void)playAudioFile:(NSString *)soundFile;

@end




//  ViewController.m
//  AudioPlayback
//  Created by Kevin Languedoc on 7/1/13.
//  Copyright (c) 2013 Landgo Interactive. All rights reserved.

#import "ViewController.h"

@interface ViewController ()
@end

@implementation ViewController
@synthesize sounds, selectedSound;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // The sound files added to the "audio" group in the project.
    sounds = [[NSArray alloc] initWithObjects:@"slackbanger.m4a", @"succulus.m4a",
              @"old_coffee_machine.wav", @"swamp.wav",
              @"Lethian Dreams - Dawn.mp3", @"Snowden - Anti-Anti.mp3", nil];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (NSString *)pickerView:(UIPickerView *)pickerView titleForRow:(NSInteger)row forComponent:(NSInteger)component
{
    NSLog(@"%@", [sounds objectAtIndex:row]);
    return [sounds objectAtIndex:row];
}

- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
    return 1; // a single column
}

- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
    return [sounds count];
}

- (void)pickerView:(UIPickerView *)pickerView didSelectRow:(NSInteger)row inComponent:(NSInteger)component
{
    selectedSound = [sounds objectAtIndex:row];
    [self playAudioFile:selectedSound];
}

- (void)playAudioFile:(NSString *)soundFile
{
    // Split "name.ext" around the "." so the bundle lookup gets both parts.
    NSRange range = [soundFile rangeOfString:@"."];
    NSString *filename = [soundFile substringToIndex:NSMaxRange(range) - 1];
    NSString *fileExtension = [soundFile substringFromIndex:NSMaxRange(range)];

    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                      pathForResource:filename ofType:fileExtension]];
    NSError *error = nil;
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    if (error) {
        NSLog(@"Error in audioPlayer: %@",
              [error localizedDescription]);
    } else {
        _audioPlayer.delegate = self;
        [_audioPlayer prepareToPlay];
        [_audioPlayer play];
    }
}

- (IBAction)adjustVolume:(id)sender {
    UISlider *volume = (UISlider *)sender;
    if (_audioPlayer != nil) {
        _audioPlayer.volume = volume.value;
    }
}

@end



