I was looking for a way to get the original width and height of a Texture2D in Unity (i.e. the size of the texture before it was potentially scaled up or down to power-of-two dimensions). By itself, Unity only exposes the width and height of images after they’ve been imported and rescaled.

The workaround I came up with is a little hackish, and it only works inside the Unity Editor (which means you could still use it to export the original texture dimensions to an XML file or some such). But I didn’t find a solution on the Unity forums, so I thought I’d post mine.

Continue reading »


(Note: This was written in 2012! Some of it is still applicable, but most of it should be taken with a grain of salt.)

Part three of my series on various things we learned about selling on the iOS App Store. (Part 1, part 2)

Don’t pay for AdMob clicks!

We’ve tried buying ads on AdMob for all three of our games, and, like just about every other account we’ve read on the web, our results were abysmal. It wasn’t for lack of trying to be smart, either:

Continue reading »


(Note: This was written in 2012! Some of it is still applicable, but most of it should be taken with a grain of salt.)

In part two of my series on lessons we learned about selling on the iOS App Store, I’ll focus on our experience with review sites. Again, this is not an exhaustive list of all the stuff you need to know, but rather a collection of things that eluded us when we published our first couple of apps. Part 1 is here, in case you missed it. Part 3 is here.

Continue reading »


(Note: This was written in 2012! Some of it is still applicable, but most of it should be taken with a grain of salt.)

We’ve been releasing apps on the iOS App Store for about a year now and have three apps in the store (our games “Drifts” and “Coign of Vantage”, and a sandbox game / interactive toy named “CellShades”). Although we tried our best to do our homework and read up on how to do business on the App Store before our first release, numerous things eluded us when we started out.

In this series of posts, I’ll try to summarize some of the things we learned over the year. This is by no means a comprehensive list – just a series of items that I would have liked to read about when we worked on our first app. Part 2 is here, and part 3 here!

Continue reading »


[Update: I’ve also written a three-part series about general App Store tips: part 1, part 2, part 3]

If you’ve released an iOS app, you’ve probably carefully read the stories of developers who were burned by getting the wrong release date in the App Store and subsequently missing out on the new releases list. As we painfully learned with CellShades, there are still pitfalls to look out for in 2012. I’ll get to these in a moment. First I’ll describe what used to be the problem:

When you submit an app for approval, you pick an availability date in iTunes Connect – the date on which you want the app to be released. If Apple approves your app before the availability date, the app is released on that date. If the approval process takes longer and the availability date has already passed, the app is released as soon as Apple approves it.

New apps that arrive in the App Store are listed in the “new releases” section, sorted by the day they were released and then alphabetically (pro tip: your app has a better chance of being seen if its name starts with an A or B!). Being listed there means that a lot of users get a chance to discover your app, which can be critical in building up the momentum a new app needs to get picked up by review sites or the App Store’s top lists.

Continue reading »


Last week, we released a new iPhone/iPad app named CellShades.

CellShades is an experimental little app, similar to “Cellular Automata”, in which you spill liquid that provides energy for cells to spawn and harvest. These cells follow a handful of very simple rules – for example, they always move toward the adjacent position they perceive as most energetic, and they perish and dissolve back into liquid if they can’t keep up their energy levels – and these rules produce a wide variety of behaviors. The examples that ship with the app range from crashing waves to flowering structures to larger microorganisms crawling across the screen, giving a taste of the huge spectrum of dynamics the “simulation” makes possible.
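To make the rules concrete, here’s a toy sketch in Python (not the app’s actual Objective-C/C code – the one-dimensional grid, names, and constants are all invented for illustration) of “move toward the most energetic neighbor, harvest, perish when energy runs out”:

```python
def step(energy, cells, body_energy=0.5, upkeep=0.1):
    """One tick of a toy 1-D version of the rules described above.

    energy[i] is the liquid energy at position i; cells is the set of
    occupied positions. All names and constants here are invented --
    this is not CellShades' actual model.
    """
    new_cells = set()
    for i in sorted(cells):
        # Each cell looks at itself and its neighbours and moves toward
        # the position it perceives as most energetic.
        neighbours = [j for j in (i - 1, i, i + 1) if 0 <= j < len(energy)]
        target = max(neighbours, key=lambda j: energy[j])
        if energy[target] >= upkeep:
            # Harvest: drain some energy from the liquid to survive.
            energy[target] -= upkeep
            new_cells.add(target)
        else:
            # Perish: the cell dissolves, returning energy to the liquid.
            energy[target] += body_energy
    return energy, new_cells

energy, cells = step([0.0, 1.0, 0.2, 0.0], {0, 2})
```

Running this for many ticks on a 2-D grid with a few more rules is enough to get surprisingly organic-looking behavior out of very little code.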

The app comes with 9 free preset behaviors. A one-dollar (€0.79, etc.) in-app purchase unlocks the ability to manipulate the simulation’s underlying parameters and save your own behaviors.

Technology and libraries

We wrote CellShades in Objective-C, with the main simulation parts written mostly in plain C. I started out writing the app in cocos2d, mainly because it provided a very quick way of setting everything up so that the app could draw and update its simulation and respond to user input. The main GUI of the app consists of UIKit elements residing on top of the cocos2d surface.

Continue reading »


[Update: Iain Peet pointed out that the definitions of “aliasing” and “anti-aliasing” in audio are more narrow than I thought when I wrote this post. Specifically, aliasing in audio refers to the artifacts you get when you shift or create frequencies beyond half the sample rate, and anti-aliasing refers to low-pass filtering audio signals in order to remove these artifacts. Both of these play no part in this post, and I have updated it to remove the terms.]

In this fifth installment of my series on dynamic audio programming in AS3, I want to take a quick look at the artifacts we introduce when we process audio signals without interpolation. We’ve already had a brief encounter with these, when we implemented a basic flanger effect back in part 3, which turned out to have a somewhat dirty, distorted sound to it. In this article, we’ll take a closer look at where this dirt came from, by looking at a naïve implementation of pitch shifting.


Pitch-shifting, and how not to do it

To be clear, for the purposes of this article, “pitch shifting” is what happens when you slow a recorded sound down (the pitch drops by an octave each time the speed is halved) or speed it up (the pitch rises by an octave each time the speed is doubled). Pitch shifting while preserving the original duration of a sound is an entirely different story, and a good deal more involved.

With that out of the way, let’s look at what a basic implementation of the effect might look and sound like:

Continue reading »


In part 3, we had a first look at creating audio effects in AS3 by processing microphone input with robot voice effects. One of the things we did was to create a so-called comb filter by adding a delayed version of the original signal to itself. We then created a flanging effect by oscillating the delay’s offset.
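In Python terms (the original was AS3; the delay and LFO constants here are invented), the two steps described above boil down to:

```python
import math

def comb(signal, delay):
    """Comb filter: add a copy of the signal delayed by `delay` samples."""
    return [x + (signal[i - delay] if i >= delay else 0.0)
            for i, x in enumerate(signal)]

def flange(signal, sample_rate, max_delay=220, lfo_hz=0.5):
    """Flanger: a comb filter whose delay oscillates slowly over time."""
    out = []
    for i, x in enumerate(signal):
        # A low-frequency oscillator sweeps the delay between 0 and max_delay.
        lfo = 0.5 * (1.0 + math.sin(2.0 * math.pi * lfo_hz * i / sample_rate))
        d = int(lfo * max_delay)  # truncated to whole samples -- no interpolation
        out.append(x + (signal[i - d] if i >= d else 0.0))
    return out
```

Note how the oscillating delay gets truncated to a whole number of samples – that’s the quick-and-dirty shortcut responsible for the distortion mentioned above.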


Audio sample: 2 drum samples, one dry, one processed with a flanger.

Given that flangers are one of the staples of audio effects (especially when applied to electric guitars), you may have been wondering why exactly this worked the way it did. In this article we’ll figure this out.

This gives us an opportunity to review a bit of audio theory, independent of AS3. Next time, we’ll redo our quick and dirty flanger implementation and add interpolation, but today’s article will be free of code – you’ll just need a little high school math.
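As a preview of the kind of math involved (my paraphrase of the standard comb-filter argument, not a quote from the article): send a sine wave of frequency $f$ through “signal plus a copy delayed by $\tau$” and apply the sum-to-product identity,

$$\sin(2\pi f t) + \sin\bigl(2\pi f (t - \tau)\bigr) = 2\cos(\pi f \tau)\,\sin\Bigl(2\pi f \bigl(t - \tfrac{\tau}{2}\bigr)\Bigr).$$

So the delay scales each frequency by $\lvert 2\cos(\pi f \tau)\rvert$: frequencies where $f\tau$ is an odd multiple of $\tfrac{1}{2}$ cancel out entirely (the notches that give the comb filter its name), and slowly oscillating $\tau$ sweeps those notches up and down the spectrum – the flanger.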


Continue reading »


In the first two tutorials of this series on dynamic audio in AS3, we’ve covered pretty much everything that Flash’s realtime sound API offers us. Let’s put all of it to use and benefit humankind by building a little app that will turn your voice into a horrible robot!

Pictured: Code sample 5. (image source: http://www.dieselsweeties.com)

Basically, what we’re going to do is take input from the microphone in a stream of sound samples, try to come up with something interesting to do with the samples, and then send the samples on to the sound card’s output.

Rather than building the app so that the user gets record and play buttons with which they can record a take and play back the processed version, we’ll build a real-time effect and process and play back the sound as it comes in. There are two reasons I want to go this route:

First, there’s less chrome involved (GUI, application state and such) that doesn’t add to the subject.

Second, real-time processing is actually harder and more interesting than processing pre-recorded audio, because it poses the question of how we can make (reasonably) sure that we’ve received enough input from the mic whenever we’re asked to fill a new output buffer. I’ll go over the intricacies of that in the next two sections – if you want to get right to the part where we do nasty things to your vowels, feel free to skip them and come back later.
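The gist of that buffering problem can be sketched like this (Python rather than AS3, and the buffer sizes are invented): queue up incoming mic samples, and only serve an output request once the queue holds a safety margin on top of one buffer’s worth of samples – otherwise, emit silence rather than underrun:

```python
from collections import deque

OUTPUT_BUFFER = 2048   # samples per output request (invented constant)
LATENCY_MARGIN = 4096  # extra samples kept queued as a safety margin

queue = deque()

def on_mic_data(samples):
    """Called whenever the microphone hands us new samples."""
    queue.extend(samples)

def process(sample):
    """Effect processing would go here; identity for now."""
    return sample

def on_output_request():
    """Called whenever the sound card asks for a new output buffer."""
    if len(queue) < OUTPUT_BUFFER + LATENCY_MARGIN:
        # Not enough input buffered yet: pad with silence instead of
        # underrunning and glitching.
        return [0.0] * OUTPUT_BUFFER
    return [process(queue.popleft()) for _ in range(OUTPUT_BUFFER)]
```

The margin trades latency for safety: the bigger it is, the less likely a late mic callback starves the output, but the longer the delay between speaking and hearing the robot.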

Continue reading »


In this second installment of my series of tutorials on dynamic sound in ActionScript, I’ll discuss the different parts of the sound API and show you how to extract single samples from a sound that’s in memory or coming from the microphone, as well as how to generate simple dynamic audio in real time.

As I wrote in part 1, the sound API introduced in Flash Player 10 is essentially just a set of methods and classes that let you access individual samples in a sound. Just as the introduction of the BitmapData class enabled you to manipulate and read pixels in bitmap images and from a webcam’s input, the sound API lets you process sound in memory or coming from microphone input, at the sample level.
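To make “sample level” concrete: a sound buffer is just raw bytes, four per 32-bit float sample. This Python sketch (illustrative only – the post itself works with AS3’s ByteArray rather than Python’s struct) unpacks, manipulates, and repacks individual samples:

```python
import struct

# Four mono samples packed as 32-bit floats -- comparable to the raw
# bytes you'd pull out of a sound buffer one readFloat() at a time.
raw = struct.pack('<4f', 0.0, 0.5, -0.5, 1.0)

# "Sample-level access" just means interpreting those bytes sample by
# sample, four bytes per float:
samples = [struct.unpack_from('<f', raw, i * 4)[0] for i in range(4)]

# Manipulate the individual samples (here: halve the volume) and pack
# them back into bytes.
quieter = struct.pack('<4f', *(s * 0.5 for s in samples))
```

Everything in this series – effects, synthesis, analysis – ultimately comes down to reading, transforming, and writing floats like this.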

There are two basic ingredients you need to understand in order to cook up dynamic audio – ByteArrays and SampleDataEvents:


Continue reading »