How my app, Tomorrow, records and plays audio using AVAudioRecorder and AVAudioPlayer (Part II)

Last week, we covered how to get AVAudioRecorder set up correctly using a singleton sharedInstance design pattern. Now that we've set up the recorder, how are we going to get the recording saved to a backend, or into Core Data?

Here's what the animation looks like when a recording stops:

[GIF of the recording animation]

Press and hold to start recording; release to stop. We're not going to look at the animation code, but rather at what's happening in the background!

First, I have a Core Data stack similar to the one described in this post by Marcus Zarra.

I have also created a Recording class that has the following public properties:

Recording.h
@interface Recording : NSManagedObject

@property (nonatomic, retain) NSString * createdAt;
@property (nonatomic, retain) NSString * idNumber;
@property (nonatomic, retain) NSData * memo;
@property (nonatomic, retain) NSDate * showAt;
@property (nonatomic, retain) NSString * simpleDate;
@property (nonatomic, retain) NSString * timeCreated;
@property (nonatomic, retain) NSString * urlPath;

@end

Then I have another singleton handler that takes care of saving all the recordings. Let's call it RecordingController.
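
In case you're curious, a sharedInstance like this is usually just the standard dispatch_once pattern. Here's a minimal sketch (the app's actual implementation may differ slightly):

+ (instancetype)sharedInstance {
    static RecordingController *sharedInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[self alloc] init];
    });
    return sharedInstance;
}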

In it, we have a public method declared in RecordingController.h:

- (void)addRecordingWithURL:(NSString *)urlPath 
                andIDNumber:(NSString *)idNumber 
             andDateCreated:(NSString *)createdAt 
               andFetchDate:(NSDate *)showAt 
              andSimpleDate:(NSString *)simpleDate 
             andTimeCreated:(NSString *)timeCreated;

In the implementation file:

- (void)addRecordingWithURL:(NSString *)urlPath andIDNumber:(NSString *)idNumber andDateCreated:(NSString *)createdAt andFetchDate:(NSDate *)showAt andSimpleDate:(NSString *)simpleDate andTimeCreated:(NSString *)timeCreated {
  Recording *recording = [NSEntityDescription insertNewObjectForEntityForName:recordingEntity inManagedObjectContext:[Stack sharedInstance].managedObjectContext];

  recording.urlPath = urlPath;
  recording.idNumber = idNumber;
  recording.createdAt = createdAt;
  recording.showAt = showAt;
  recording.simpleDate = simpleDate;
  recording.timeCreated = timeCreated;

  [self save];
}
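
For reference, the save call can be as simple as persisting the context. Here's a minimal sketch, assuming the Stack singleton from the Core Data stack post exposes the managedObjectContext (the actual method may differ):

- (void)save {
    NSError *error = nil;
    if (![[Stack sharedInstance].managedObjectContext save:&error]) {
        NSLog(@"Error saving recordings: %@", error);
    }
}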

This prepares things so the AudioController can pass over all the correct data. Now back to AudioController.m:

- (AVAudioRecorder *)stopRecording {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self.recorder stop];

        [self data];

        [[RecordingController sharedInstance] addRecordingWithURL:[self nowString]
                                                      andIDNumber:[self randomIDNumber]
                                                   andDateCreated:[self createdAtDateString]
                                                     andFetchDate:[NSDate createdAtDate]
                                                    andSimpleDate:[self simpleDateString]
                                                   andTimeCreated:[self currentTime]];

        [[RecordingController sharedInstance] save];
    });

    return self.recorder;
}

Upon release of the long press on the circle, a lot is happening: there's an animation, a pop sound we want to play, and the recorder needs to stop. We want to make sure the saving happens on a background thread, so the animation can continue to do its magic.
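
For context, the gesture handling might look roughly like this sketch (the handler name and startRecording are hypothetical, just to show where stopRecording gets called):

- (void)handleLongPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateBegan) {
        [[AudioController sharedInstance] startRecording]; // hypothetical start method
    } else if (gesture.state == UIGestureRecognizerStateEnded) {
        [[AudioController sharedInstance] stopRecording];  // shown above
    }
}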

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ ... }) makes sure this work happens on a background thread, which lets the pop sound and the animation UI (UI work can never be done on a background thread) carry on without any lag or hiccups.
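
If you ever need to touch the UI once that background work finishes, the usual move is to hop back onto the main queue from inside the block. A quick sketch (updateRecordingLabel is a hypothetical method, just for illustration):

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // heavy lifting stays off the main thread
    [self.recorder stop];

    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit work must happen on the main thread
        [self updateRecordingLabel]; // hypothetical UI update
    });
});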

[self.recorder stop]; tells the AVAudioRecorder to stop recording.

[self data] is where we capture the NSData for the recording and hold it in memory; but we don't want to save it into Core Data, because that would be like saving it twice. The audio file is already stored in the directory we want it in, so we're just making note of it for now.

The other values we're saving matter mainly because they make it easy to set up a specific fetch request for these objects, and to show specific time stamps, dates, and locations in the UI when the recordings are played back. It's essentially metadata.
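
For example, a fetch request built on that metadata might look something like this sketch (assuming the entity is named Recording, i.e. whatever recordingEntity refers to above):

NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Recording"];
request.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"showAt" ascending:YES]];

NSError *error = nil;
NSArray *recordings = [[Stack sharedInstance].managedObjectContext executeFetchRequest:request
                                                                                  error:&error];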

Just in case you wanted to know what [self data] actually is:

- (NSData *)data {
    // Read the recording that AVAudioRecorder wrote to self.url.
    NSData *dataFile = [NSData dataWithContentsOfURL:self.url];

    // Build a path in the Documents directory using the same file name.
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                        NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:[self nowString]];

    [dataFile writeToFile:path atomically:YES];

    return dataFile;
}
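
Because the file lives at that path in the Documents directory, playback (the subject of the next part) can reconstruct the same path and hand it to AVAudioPlayer. A rough sketch, where savedFileName stands in for whatever [self nowString] produced when the file was written:

NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                    NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [documentsDirectory stringByAppendingPathComponent:savedFileName]; // savedFileName is hypothetical

NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                                               error:&error];
[player prepareToPlay];
[player play];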

In the next part, we'll talk about playing back the audio, which is something I struggled with for many hours, banging my head against my keyboard.