Open Source

My first ever contribution to open source, and it's to Apple's very own Swift XCTest, no less. The fact that my name is alongside these individuals:

Incredibly exciting! :) It was a very simple one. With the proposed updates for Swift 3.0 on the horizon, one change I was aware of was that the ++ operator was going away. I noticed that a for loop in one of the classes contained this very operator, which generates a deprecation warning in Swift 2.2 and will no longer work in Swift 3.0. Therefore, I changed it, as can be seen below:
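(A reconstructed sketch of the kind of change, not the actual XCTest diff: the standard Swift 2.2 migration is to replace `++` with `+= 1`, or better yet a range-based for-in loop.)

```swift
// Before (the C-style pattern that triggers a deprecation warning in Swift 2.2):
// for var i = 0; i < count; i++ { ... }

// After: a for-in over a range, with `+= 1` replacing `++` where needed.
let count = 5
var iterations = 0
for _ in 0..<count {
    iterations += 1  // `iterations++` would no longer compile in Swift 3.0
}
print(iterations)  // 5
```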


As someone who uses Swift on a daily basis, it's incredibly rewarding and exciting. I continue to fall in love with Swift more and more each day and look forward to its continuous improvement in the next coming years.


It’s been a while...

It's been months since I've updated this little thing, and I hope this doesn't become a habit. I was fortunate enough to get a job very shortly after my last post, which is what caused this extended absence.

So much has happened over the past few months. WWDC happened. While I didn't attend the conference itself, I was in the thick of it: experiencing WWDC afterparties and getting to know quite a few smart developers. Note to self: make business cards just for this event. It seems silly because we live in a digital world where we receive the latest news and texts on our very watches, but still, it's a must.

I was fortunate enough to attend AltConf. Very amazing. I particularly enjoyed the talks by:

Matt Ronge - Lessons in App PR: How to Launch

NatashaTheRobot - Swift Thinking

Abizer Nazir - What Haskell Teaches Me About Swift

I'm positive there are quite a few more that are amazingly impactful and informative; unfortunately, I didn't have the time to see them. (But the videos are up. Check them out!) The three above really allowed me to pause and think about multiple layers of the vast world of iOS.

Additionally, I've had the pleasure of working at Musey for the past few months. We're making apps for the interior design space by helping individuals find products for the world's interiors and creating solutions to help designers manage client relations more efficiently. It's a small team, but we move really quickly, and it's so much fun to tackle new challenges. Best of all, the apps are in Swift!

I remember when I began learning iOS development; Swift had just been announced and it was the age-old debate: should I learn Swift, or should I learn Objective-C? The mentality, it seemed, was that if I wanted a job, I should learn Objective-C; if I wanted to build apps, I should learn Swift. I'm still very glad I learned Objective-C first. I do struggle daily with writing code in a more Swifty way, but it's a good thing. I enjoyed the very uniform and verbose way of writing Objective-C because I think it taught me a disciplined way of writing code. It's crazy that just a year ago, I was looking at the Objective-C books and thinking: how the hell am I going to learn this crap!? As I continue to read through Swift books, I'm finding new ways to refactor my code and make it more Swifty. Huzzah!
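To illustrate what I mean by "Swifty" (a made-up example, not code from my apps): the same filter-and-transform logic can shrink from an imperative, Objective-C-flavored loop into a filter/map chain.

```swift
let scores = [3, 10, 7]

// Imperative, Objective-C-flavored version: mutate an accumulator in a loop.
var curved: [Int] = []
for score in scores {
    if score > 5 {
        curved.append(score + 2)
    }
}

// Swifty version: the same result, declared as a filter + map pipeline.
let curvedSwifty = scores.filter { $0 > 5 }.map { $0 + 2 }

print(curvedSwifty)  // [12, 9]
```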

Still, I'm just scratching the surface. I have many weaknesses in my game. I want to continue to master the things that I'm good at and round out my weaknesses. I absolutely love setting up UI, whether programmatically or via storyboard, and I love working on POSTs and GETs to and from the backend!

Bruce Lee said: "I don't fear the man that knows 10,000 kicks; I fear the man that practiced one kick 10,000 times."

Such power in that statement. Perfect my craft.

Level up your keyword ranking game for indie iOS developers

I’ve been using Sensor Tower for the past 2 weeks and it’s been absolutely amazing in helping me track my app keywords. In case you didn’t know: while Apple isn’t completely transparent about it, App Store Optimization (ASO) appears to be based on the 100 characters of keywords that you provide, along with what’s in your app’s title.

The title of my app? You ready? Tomorrow: Record Inspiring and Influential Messages Today with Your Voice, Get it Tomorrow to Inspire Your Future Self

It’s not spammy but I admit, it’s wordy. I believe it’s a tad over 100 characters. I also have an additional 10 words or so that I’ve been tracking.

Sensor Tower provides amazing graphs of how you have been progressing for particular keywords that you are currently ranking for.

Here is my word, desire.


As you can see, 2 weeks ago I was ranked 60+, and I’ve been steadily climbing the charts; I’m currently ranked 6 for the word, desire. Rather than having to go into the App Store and look up all my keywords individually, Sensor Tower gives me all of them at once! Amazing!

Sensor Tower gives me my rank for each of my keywords and tells me how hard it is to rank. Here’s another shot of what it looks like:


First Column: Keyword

Second Column: How often this keyword is actually searched (scale of 1-10)

Third Column: How hard it is to rank for this keyword (scale of 1-10)

Fourth Column: How many apps are currently ranked for this keyword

Fifth Column: Your current ranking for this keyword

It’s a matter of finding the best words that suit your app and finding synonyms that you can rank higher for.

For example, it’s quite difficult to rank for the word, messenger, so I’d do my best to find a word that is similar, and has a lower number for the third column. Once you become king or queen of that word, then we can jump onto the more difficult words to rank for.
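To make that idea concrete, here’s a toy Swift sketch with made-up numbers (Sensor Tower doesn’t expose anything like this; the ratio is just my own illustrative heuristic) for picking the synonym with the best traffic-to-difficulty trade-off:

```swift
struct Keyword {
    let word: String
    let traffic: Double     // column 2: how often it's searched (1-10)
    let difficulty: Double  // column 3: how hard it is to rank for (1-10)
}

// Favor words people actually search for but that few apps compete over.
func bestCandidate(from candidates: [Keyword]) -> Keyword? {
    return candidates.max { a, b in
        (a.traffic / a.difficulty) < (b.traffic / b.difficulty)
    }
}

let synonyms = [
    Keyword(word: "messenger", traffic: 7.9, difficulty: 8.5),
    Keyword(word: "courier",   traffic: 4.1, difficulty: 2.3),
    Keyword(word: "dispatch",  traffic: 5.0, difficulty: 4.8),
]

print(bestCandidate(from: synonyms)?.word ?? "none")  // "courier"
```

Once you own a low-difficulty word like that, you can move up to the harder ones.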

This is just the tip of the iceberg. It’s loaded with many more amazing features, like spying on your competitors and suggesting potential keywords that would work for your app, that I have yet to even look at. I can’t wait to dive in further. Bottom line: ASO is so important because iPhone owners use the search feature to find new apps. It is an absolute must that you at least rank for the name of your own app!

How my app, Tomorrow, records and plays audio using AVAudioRecorder and AVAudioPlayer (Part II)

Last week, we covered how to get AVAudioRecorder set up correctly using a singleton sharedInstance design pattern. So now that we’ve set up the recorder, how are we going to get it saved to a backend? Or Core Data?

Here’s what the animation looks like when recording stops:


Press and hold to start recording; release to end it. We’re not going to look at the animation code, but rather at what’s happening in the background!

First, I have a Core Data stack similar to the one described in this post written by Marcus Zarra.

I have also created a Recording class that has the following public properties:

@interface Recording : NSManagedObject

@property (nonatomic, retain) NSString * urlPath; // set below; missing from the original listing
@property (nonatomic, retain) NSString * createdAt;
@property (nonatomic, retain) NSString * idNumber;
@property (nonatomic, retain) NSData * memo;
@property (nonatomic, retain) NSDate * showAt;
@property (nonatomic, retain) NSString * simpleDate;
@property (nonatomic, retain) NSString * timeCreated;

@end

Then I have another singleton handler that takes care of saving all the recordings. Let’s call this RecordingController.

In RecordingController.h, we declare a public method:

- (void)addRecordingWithURL:(NSString *)urlPath 
                andIDNumber:(NSString *)idNumber 
             andDateCreated:(NSString *)createdAt 
               andFetchDate:(NSDate *)showAt 
              andSimpleDate:(NSString *)simpleDate 
             andTimeCreated:(NSString *)timeCreated;

In the implementation file:

- (void)addRecordingWithURL:(NSString *)urlPath andIDNumber:(NSString *)idNumber andDateCreated:(NSString *)createdAt andFetchDate:(NSDate *)showAt andSimpleDate:(NSString *)simpleDate andTimeCreated:(NSString *)timeCreated {
  Recording *recording = [NSEntityDescription insertNewObjectForEntityForName:recordingEntity inManagedObjectContext:[Stack sharedInstance].managedObjectContext];

  recording.urlPath = urlPath;
  recording.idNumber = idNumber;
  recording.createdAt = createdAt;
  recording.showAt = showAt;
  recording.simpleDate = simpleDate;
  recording.timeCreated = timeCreated;

  [self save];
}

We’re preparing the AudioController to make sure that all the correct data is passed over. Now back to AudioController.m:

- (AVAudioRecorder *)stopRecording {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self.recorder stop];

        [self data];

        [[RecordingController sharedInstance] addRecordingWithURL:[self nowString]
                                                      andIDNumber:[self randomIDNumber]
                                                   andDateCreated:[self createdAtDateString]
                                                     andFetchDate:[self showAtDate] // helper assumed to return the showAt NSDate
                                                    andSimpleDate:[self simpleDateString]
                                                   andTimeCreated:[self currentTime]];

        [[RecordingController sharedInstance] save];
    });

    return self.recorder;
}


Upon release of the long press on the circle, there is a lot happening: there is an animation, a pop sound we want to play, and the recording needs to stop. We want to make sure that the saving happens on a background thread so the animation can continue to do its magic.

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ...) makes sure this work is done on a background queue, which allows the pop sound and the animation UI (which must always run on the main thread, never in the background) to perform without any lag or hiccups.

[self.recorder stop]; tells the AVAudioRecorder to stop recording.

[self data] is where we capture the NSData file and store it in memory; we don’t want to save it to Core Data because that would be like saving it twice. It’s already being stored in the directory we wanted it in, so we’re just making note of it for now.

The other things we’re saving are important in that they make it easy to set up a specific fetch request for the objects, and they supply the specific time stamps, dates, and locations for the UI upon playback of the recordings. It’s kind of like metadata.

Just in case you wanted to know what [self data] actually is:

- (NSData *)data {
    NSData *dataFile = [NSData dataWithContentsOfURL:self.url];

    NSString *string = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                          NSUserDomainMask, YES) objectAtIndex:0]
     stringByAppendingPathComponent:[self nowString]];

    [dataFile writeToFile:string atomically:YES];

    return dataFile;
}

In the next part, we’ll talk about playing back audio, something I struggled with for many hours while banging my head on my keyboard.

How my app, Tomorrow, records and plays audio using AVAudioRecorder and AVAudioPlayer (Part I)

The name of my next app is going to be Tomorrow. Record inspiring messages today, get them tomorrow. The next day, they’re gone. Very excited. I’ve created a quick and dirty website; my brother, David, will be redesigning the splash page to make it much better.

Today shall be a day where we discuss AVAudioRecorder. There are quite a few unique situations with AVFoundation. While AVAudioRecorder and AVAudioPlayer are very easy to use, it took a really long time to make sure that I was doing everything correctly and that everything performed the way I wanted it to. On Stack Overflow, you’ll find code in its simplest form, sitting right in the view controller. I’ll show you a refactored version of AVAudioRecorder using a singleton design pattern.

First, let’s set up some private properties:

@interface AudioController ()

@property (nonatomic, strong) AVAudioRecorder *recorder;
@property (nonatomic, strong) AVAudioPlayer *player;

@end

We want to make sure these are in the .m file and not the .h file because other classes do not need to know what’s happening with these particular properties.

The next thing that needs to be done is initializing the AVAudioRecorder. This is the press-and-hold of the green button in the previous gif.

The way I accomplished this was by using a singleton handler:

+ (AudioController *)sharedInstance {
    static AudioController *sharedInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[AudioController alloc] init];
    });

    return sharedInstance;
}

The way that AVAudioRecorder is initialized: - initWithURL:settings:error:

The URL is the file system location the audio is recorded to. The settings dictionary holds the settings for the recording session. The error parameter returns, by reference, a description of the error if one occurs. It is best to preset NSError *error = nil and pass &error into the parameter to make sure you are able to detect an error if one exists.

So the first thing I did was add an NSURL property to the file:

@property (nonatomic, strong) NSURL *url;

The reason for this is that we’re going to use the same URL for both starting and stopping the recording. One way to initialize an NSURL is +fileURLWithPathComponents:, a class method that returns a newly created NSURL object as a file URL from the specified path components. This is what we need, because the path components are separated by forward slashes (/) in the returned URL.

So here are a few private methods I used to create an easy name to distinguish recordings, get a directory for the file to be a part of, and set up the recorder settings:

- (NSString *)nowString {
    NSDate *now = [NSDate date];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    // Lowercase "mm" is minutes; uppercase "MM" would insert the month again.
    [formatter setDateFormat:@"MMMdyyyy+HHmmss"];

    NSString *nowString = [formatter stringFromDate:now];

    NSString *destinationString = [NSString stringWithFormat:@"%@.aac", nowString];

    return destinationString;
}

- (NSArray *)documentsPath {
     NSArray *documentsPath = [NSArray arrayWithObjects:[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject], [self nowString], nil];

    return documentsPath;
}

- (NSDictionary *)getRecorderSettings {
    NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] init];
    [recordSettings setValue:[NSNumber numberWithInt:kAudioFormatMPEG4AAC] forKey:AVFormatIDKey];
    [recordSettings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
    [recordSettings setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSettings setValue:[NSNumber numberWithInt:AVAudioQualityHigh] forKey:AVEncoderAudioQualityKey];
    [recordSettings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSettings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    return recordSettings;
}

Let’s break it down:

I wanted my audio files to be named by “Month/Day/Year-Hour/Min/Sec.aac.” While I could’ve used a UUID, this gives me a much simpler, easier way to distinguish if there are any timing issues or delays in other parts of my code as it is a timestamp of when a recording has occurred.

The documents path was confusing for me initially, and I’m not quite certain I’ve fully grasped it yet. There are numerous places where one can save files on the iPhone: a temporary directory, the app’s own Documents directory, etc.

When you look in the documentation for NSSearchPathForDirectoriesInDomains, it says:

“Creates a list of directory search paths. Creates a list of path strings for the specified directories in the specified domains. The list is in the order in which you should search the directories. If expandTilde is YES, tildes are expanded as described in stringByExpandingTildeInPath.”

I wanted to put it in the Documents directory, so I used: NSDocumentDirectory, NSUserDomainMask

And the other object we’re putting into the array is the date filename string we created earlier.

Finally, the settings. We want to make sure we are using key-value coding, so we create a dictionary that can contain a bunch of values. The two settings that were crucial to making sure it worked correctly:

[recordSettings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[recordSettings setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

Then I create a public method that records the audio to a directory:

- (AVAudioRecorder *)recordAudioToDirectory {
      NSError *error = nil;
      self.url = [NSURL fileURLWithPathComponents:[self documentsPath]];
      self.recorder = [[AVAudioRecorder alloc] initWithURL:self.url settings:[self getRecorderSettings] error:&error];
      [self.recorder prepareToRecord];
      self.recorder.delegate = self;
      self.recorder.meteringEnabled = YES;

      [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
      [[AVAudioSession sharedInstance] setActive:YES error:&error];

      [self.recorder record];

      return self.recorder;
}

Hooray! All of that just to record the audio. Stopping involves a bunch more, so I’ll do a part two, since it uses another singleton handler for saving into Core Data! :)

Some silly animation fun

I’ve been having a blast making this iOS app. I’ve been playing around with a ton of animations, and I wanted to share one in particular that didn’t make the final version of the app but was a fun bit of code anyway.

Here is what it looks like (really hope this works):


So there are a few things going on here.

First, there is a long press on the green circle to record, and upon release, the recording session ends and displays a few buttons to tap. There are some cool pop noises that unfortunately can’t be heard in a gif. Y U NO SOUND, GIF!?

So the code is the following:

[UIView animateWithDuration:1 delay:0 usingSpringWithDamping:.15 initialSpringVelocity:.08 options:UIViewAnimationOptionCurveLinear animations:^{
      button.transform = CGAffineTransformIdentity;
} completion:^(BOOL finished) {
      self.containerView.alpha = 0;
      self.containerView.hidden = NO;
      button.backgroundColor = [UIColor customGreenColor];
      self.recordAgainButton.hidden = NO;
      self.recordAgainButton.alpha = 0;
      [self zeroState:ButtonStateZero];
      self.recordCornerButton.hidden = YES;
      self.playCornerButton.hidden = YES;
      [UIView animateWithDuration:0 delay:0 options:UIViewAnimationOptionTransitionCrossDissolve animations:^{
            self.containerView.alpha = 1;
            [self.containerView animateLayoutButtons];
            self.recordAgainButton.alpha = 1;
      } completion:^(BOOL finished) {
            [self showBottomButtons];
      }];
}];


So this is basically just the end of the animation you have just witnessed. It’s animation blocks inside of animation blocks. The thing I want to highlight is [self.containerView animateLayoutButtons];

Here is the public method I am calling within the UIView subclass of containerView:

- (void)animateLayoutButtons {
    [UIView animateWithDuration:.1 delay:0 options:UIViewAnimationOptionCurveEaseIn animations:^{
        self.focusButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
        self.presenceButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
    } completion:^(BOOL finished) {
        [UIView animateWithDuration:.1 delay:0 options:UIViewAnimationOptionCurveEaseIn animations:^{
            self.focusButton.transform = CGAffineTransformIdentity;
            self.presenceButton.transform = CGAffineTransformIdentity;
            self.courageButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
            self.funButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
        } completion:^(BOOL finished) {
            [UIView animateWithDuration:.1 delay:0 options:UIViewAnimationOptionCurveEaseIn animations:^{
                self.funButton.transform = CGAffineTransformIdentity;
                self.courageButton.transform = CGAffineTransformIdentity;
                self.ambitionButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
                self.imaginationButton.transform = CGAffineTransformScale(CGAffineTransformIdentity, 1.1, 1.1);
            } completion:^(BOOL finished) {
                [UIView animateWithDuration:.1 delay:0 options:UIViewAnimationOptionCurveEaseIn animations:^{
                    self.ambitionButton.transform = CGAffineTransformIdentity;
                    self.imaginationButton.transform = CGAffineTransformIdentity;
                } completion:nil];
            }];
        }];
    }];
}

These simple lines of code give it the fun popping effect as the 6 buttons get put into place. All I’m doing is changing the scale of each button to 1.1 and setting it back to its identity while another button is being scaled to 1.1.

I wish it were a part of my app, but I realized that it had no place in my app for now. Tune in for AVAudioRecorder and AVAudioPlayer next!

New App is Coming Soon!

My next app is based on the idea that we are our own best motivators. We can easily get inspired by watching something uplifting on the Internet or reading a heartfelt story that a friend sent over; however, that inspiration is quite fleeting. I wanted to create an app that allows you to use your own voice to motivate yourself.

Initially, it started out as an idea based on meditation, as I meditate avidly to find balance in my life. I was planning to make a meditation app that allowed you to use your own voice to create your own guided meditation, overlaying music that I provide.

As I continued to think about it and talk about it with my friend, Ben Adamson, I decided to pivot and take it down a new path based on using your own voice to inspire and motivate yourself.

The basic concept is this: Record something uplifting today, receive it tomorrow. After you receive the recording, it will be available for you to listen to for 24 hours, and then, poof! It’s gone!
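Just to illustrate that lifecycle (an illustrative sketch, not the app’s actual code; the helper name is made up), the availability window boils down to simple date math: surface the recording at the next midnight, then expire it 24 hours later.

```swift
import Foundation

// Illustrative sketch: a recording made "today" becomes available at the
// next midnight, and disappears exactly 24 hours after that.
func availabilityWindow(for recordedAt: Date,
                        calendar: Calendar = .current) -> (showAt: Date, expiresAt: Date) {
    // The recording surfaces at the start of the following day...
    let startOfDay = calendar.startOfDay(for: recordedAt)
    let showAt = calendar.date(byAdding: .day, value: 1, to: startOfDay) ?? recordedAt
    // ...and then, poof! Gone 24 hours later.
    let expiresAt = showAt.addingTimeInterval(24 * 60 * 60)
    return (showAt, expiresAt)
}

let window = availabilityWindow(for: Date())
print(window.expiresAt.timeIntervalSince(window.showAt))  // 86400.0
```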

It’s a fairly simple app, with a few complexities. One of the craziest things is that it has only 3 view controllers! How ridiculous is that!? My last app, Cardalot, had at least 40, maybe 50, but after a somewhat large first-ever project, I wanted to dial it back a bit, simplify, and brush up on some techniques that I haven’t really used much.

I’ll be using AVFoundation for audio, and there will be lots of animation for a nice, even flow. I haven’t had a lot of practice with animation, nor have I ever worked with AVFoundation, so this is incredibly new territory and I’m excited. I’ve done lots of reading and know what I plan to do.

This brings me to my next point: Core Audio seems much more powerful than AVFoundation. I think it could benefit the app greatly, but it is definitely a beast in itself. I plan to start out with AVAudioPlayer and AVAudioRecorder, and then we’ll see from there. I’ll talk about some of the challenges I ran into in my next post!


I was going to post about some of the new frameworks that I have been learning while building my app; however, today I have some great news: the first app that I have ever submitted to the App Store has finally been approved!

Cardalot is a flashcard app that I worked on with two other contributors. It makes studying and learning easier by making them more interactive and fun! It includes Tinder-style swipes for marking whether you got a card right or wrong, so you can memorize and study things more efficiently. Additionally, it includes progress graphs and reminders to make sure you take the time to look over your flash cards.

Our website will also be up soon and I’m more pumped than ever. I’d love any feedback. You can download Cardalot starting today!

Busy busy busy

I’ve been incredibly busy with working on a new project so I haven’t been able to post everything I wanted to. It seems that I start to write a post and I never really finish it due to conflicts. To update what I’ve been doing, I just finished working on a team project with two others. We made an app called Cardalot that helps students study more efficiently with flashcards in a more pleasant manner.

We realize that there are many other flash card apps in the App Store, but I’m quite proud of our first-ever app. It’s pretty neat in that it has swipe features, a feature to remind you to study, and ways to view your progress on memorizing your flash cards.

We spent just two weeks building this app, and man, let me tell you: we had many moments where we were pulling our hair out, unable to figure out what we were doing wrong. We’re using Core Data to save everything in the app. We even created a company around building apps together: Weekend Concept, LLC. I rather like the name, as it has a nice ring to it and seems to embody everything we went through in getting this app built.

My brother, David, is being kind enough to build our website from a quick design that I created. It’ll be up soon. The app is currently in review and hopefully it’ll be in the App Store soon.

I am now working on an app which currently doesn’t have a name just yet, but I’m very excited about it. The app provides you with a way to send a message to your future self to take charge of your thoughts and your emotions. This app is something that I wanted for myself and evolved from a previous idea that I had. I’ll be working on this as a fun side project and will use the skills that I have learned over the past few months to build it. I’ll expand more upon it and explain some of the new things that I have discovered and some of the struggles that I went through in building out the app.

Please allow me to introduce myself...

My name is Jason Noah Choi. These posts have been fewer than I had initially hoped; nonetheless, I write to talk about my journey. This is a post I had previously written but not posted until now.

A bit of background: former agent’s assistant (think Lloyd from Entourage), sales guy, and finally account management/biz dev for healthcare startups. I’ve always felt like I was born to do something big, at least be a part of it. Additionally, I’ve always thought to myself:

“it’d be great if someone created this…” or “I wish that this existed so that…”

At the incredibly young age of 25 (not!), I came to the realization that I can be the one to create this with some proper training. The only thing stopping me was my personal drive and motivation. So wait…you’re telling me only I can stop myself?

I believe my path is not so atypical. I attempted to learn on my own through online resources having no programming experience and boy, it’s quite difficult to use your brain when you’ve been on autopilot for so many years. I sought the help of others and was fortunate enough to land a gig and work on some projects for Dunpatrick, Inc while polishing up my skills.

So let's think about code. There are so many different languages. Do I want to learn HTML/CSS/JavaScript and go the front-end route? Should I learn Python or Rails and do some backend? I decided that I want to learn iOS. Objective-C. Swift.

This blog will continue to focus on things that I ponder as I continue to learn more and more about iOS development.