Developing 'AR Give It Up!'

Last Friday morning my colleagues greeted me with congratulations. Seeing me puzzled, they went on: "You do know your app has been featured, don't you?" and showed me the places it was highlighted on the App Store. Although I had spent more than a year developing this app, I had no idea this would be my special day. This is the story of how I got there.

Love at first sight

In June last year I had no idea what to do next. Work on my former app had finished, and I was looking to either join one of the ongoing projects or start something new. I knew I didn't want to work alone again, but all the projects were based on Unity, something I had worked with before and wanted to avoid at all costs. Then Apple announced ARKit at WWDC'17 - at the perfect time for me.

I quickly downloaded the AR example projects and discovered SceneKit. I hadn't known about SceneKit before, but in a single day I managed to put together the basic visuals of the new app, and it looked promising. I knew the app would be based on the original Give It Up!, our app that became an organic, months-long number one hit in China in 2015. Its visuals and game mechanics are simple, and I thought I could deliver an AR version by the time ARKit debuted in the store in September. (spoiler: I missed it by a year)

The idea was well received at Invictus. To be honest, being one of the co-founders, I can do almost whatever I want, but I feel a lot better if others like an idea too. I framed it as an experiment to discover both SceneKit and AR. I asked around if anyone would join, but no one wanted to leave their team. A founder's destiny, I guess.
Perhaps because the goal was so tempting and I thought I could get there fast, I neglected my fears about working alone.

I know what I don't know

Just two months later I saw it was impossible to hit the September deadline. I barely had the first prototype ready. Looking back, I realized I had been overly optimistic. For one, I hadn't accounted for the summer vacations. Also, SceneKit's documentation lacked examples, and it's a niche topic on Stack Overflow too, which hurt the tight schedule. But that was okay for an experiment, so I simply dropped the schedule, planning to release when it's done. This helped me become a lot more open to what the first internal playtests revealed.

Playing at 177 beats per minute, the Give It Up! series feels impossible at first. Then, as the player keeps retrying, muscle memory develops, and suddenly there's that rewarding feeling of progression. I still remember that great feeling from 2014, playing the prototype of the original Give It Up!. I realized AR poses additional difficulty by adding the cameraman's role to the player's. To compensate for that, the mechanics in AR Give It Up! are much simpler. No ceiling spikes, no relocating platforms, and all taps are accepted. Discovering AR's limitations and possibilities was fun.

Then unexpected things started to slow me down too. At some point I needed to run the app frequently to hunt down a bug. A lot of my mental energy went into staying calm, as it took anywhere between 10 and 30 seconds for ARKit to understand the environment and finally start my debugging session. Going into autumn, with less and less daylight available and the office lighting being moderate, I struggled even more with ARKit not picking up the environment at all. In general, debugging anything AR related was a pain, as I could not reproduce sessions - they depended on how I moved the phone.

AR revelation

In spite of all the difficulties, about four months into the project I felt I had learned something about AR. All the apps we had made before took a screen-based approach. Launch screen, menu screen, loading screen, play screen. Screens and scenes everywhere. But screens are alien in AR. Reality doesn't have screens. It has connected spaces instead, in which things rarely disappear or reappear. The most dramatic real-world space switch I can think of is an elevator's door sliding shut in front of you. But forget about spaces. AR is usually played (and always promoted) on a well-lit table, that is, in a single space, with no moving from space to space.

So how do you port something heavily screen-based into a single space? I thought a lot about acceptable ways of appearing and disappearing. I found that if things grow, slide or fall into your view, then slide out or explode, that's fine with me. Fading in and out could also work, although it has that magical spice. Once I knew how to bring things into or out of the player's view, I tried to avoid it as much as possible. Things in reality usually don't fall or grow into your view, after all.

So the main menu connects to the play session: the menu item is also the first platform of the session. If Blob survives, the last platform connects back into the menu the same way. Platforms grow out of the ground and then sink back. The pink souls and white splash marks of Blob stay around for a while and don't disappear when you leave the session or reenter the menu, to strengthen the connection between real and augmented objects. And as in real life, there is no HUD in the game; instead, extra information unfolds on objects as you close in on them.
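The grow-and-sink transitions described above can be sketched with SceneKit's `SCNAction` API. This is a hypothetical illustration of the technique, not the shipped code; the function names and durations are my own:

```swift
import SceneKit

// Grow a platform node out of the ground instead of popping it in:
// start flattened on the Y axis, then scale up to full size.
func growIn(_ node: SCNNode, duration: TimeInterval = 0.3) {
    node.scale = SCNVector3(1, 0.001, 1)
    node.runAction(.scale(to: 1.0, duration: duration))
}

// Sink a platform back below the ground, then remove it from the
// scene graph once it is out of sight.
func sinkAway(_ node: SCNNode, depth: CGFloat = 1.0, duration: TimeInterval = 0.3) {
    let sink = SCNAction.moveBy(x: 0, y: -depth, z: 0, duration: duration)
    node.runAction(.sequence([sink, .removeFromParentNode()]))
}
```

The point of the sequence is that the node stays in the scene graph for the whole animation and is only detached at the end, so nothing ever vanishes in a single frame.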

The submission game

The app changed less and less between playtesting runs and looked good for a release. I still had a lot more ideas left. However, a year had passed, and I felt I had lost the motivation and energy to implement the rest. I was angry at myself. Why had I started working alone again? I had the feeling the app was at the 80/20 point, with only four years to go. Yay! Did I mention how I hate and love Vilfredo Pareto? I had also realized a core issue with the idea, and I didn't know what percentage of players would be affected: tapping the screen moves the device and its camera, which can result in a really shaky experience.

I hacked my way out of the worrying situation: struck out all the remaining ideas, fixed the bugs and happily submitted the GM build!

And then.. weeks of silence.

I had been waiting for two weeks just to see the app get rejected with cryptic reasons about Performance issues, Hidden features, and being not sufficiently different from a mobile browsing experience. The message also explained that the app was considered sneaky, which was why Apple's response took so long, and that subsequent reviews would be delayed as well.

None of us had any idea what was wrong; the feedback looked like nonsense. I didn't get why the review team kept copy-pasting the most relevant parts of the App Store Review Guidelines but never commented. So we started guessing. Optimized performance. Waited weeks. It got metadata rejected, because, guess what, again: not being sufficiently different from a mobile browsing experience. Despite being an AR app. More guessing, new screenshots, more waiting, and more rejections.. but hey, finally they sent us a screenshot - actually a pitch-black one. Imagine us staring at that "screenshot", as the app could not show a black screen but displayed the augmented camera capture all the time.

You know what that was? Starting an AR session pops up a dialog asking for camera permission - only once in the app's lifetime. We had never tested the case where the user denies camera access. But if you are in that mood and disallow it, the app gets a black screen instead of the camera capture. As this app relies on AR to augment the menu into the world, it just kept displaying that blank screen. A few seconds later it even suggested turning on some lights, thinking it was in total darkness. We'll never know, but perhaps that was what triggered all those strange responses from the review team.
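A defensive check like the following would have caught the denied-permission case before the session ever started. This is a minimal Swift sketch of the standard AVFoundation authorization flow, not the app's actual code; `showCameraDeniedMessage` is a hypothetical helper:

```swift
import AVFoundation
import ARKit

// Hypothetical helper: present an alert asking the user to enable
// camera access in Settings instead of showing a black screen.
func showCameraDeniedMessage() {
    // UI presentation omitted in this sketch.
}

// Check camera authorization before running the AR session, so a
// denied permission leads to an explanation rather than a black feed.
func startARSessionIfAuthorized(on sceneView: ARSCNView) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        // Permission already granted: run the session as usual.
        sceneView.session.run(ARWorldTrackingConfiguration())
    case .notDetermined:
        // First launch: the system shows the permission dialog now.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async {
                if granted {
                    sceneView.session.run(ARWorldTrackingConfiguration())
                } else {
                    showCameraDeniedMessage()
                }
            }
        }
    default:
        // .denied or .restricted: explain, don't stay dark.
        showCameraDeniedMessage()
    }
}
```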

End game

I was very happy the review team had helped us nail that bug and finally accepted v1.0. I submitted the promotion request form, telling them my story, right away. I also set up an automatic app release for 8 weeks later, as Apple recommends waiting 6-8 weeks.

And then.. weeks of silence...

Two weeks later I got an email from a Developer Relations Manager asking for a TestFlight build. Woo-hoo, it's happening, I thought.

And then.. weeks of silence...

I never got a response to any email I sent her. TestFlight showed no activity.

More weeks of silence. Guess I give it up, I thought.

Two months after the successful submission, the app got automatically released. No featuring. OK, I thought, no surprise after all the earlier rejections, but still.. I set up cross-promotion from our other apps and considered the app a failure. (We never do user acquisition, just rely on featuring. Search Ads are unavailable from Hungary.) That happened a month ago.

And this is why the featuring took me by surprise. Getting featured one month after the worldwide release? How could that be? Then, double-checking my old emails, I realized I had told Apple a release date that was off by one month..

Being featured is like getting a hug from those who know apps. I needed this kind of motivation really badly. Thank you, unknown and silent Apple staff!