Creating perfect App Store Screenshots of your iOS App

Your App Store screenshots are probably the most important thing when it comes to convincing potential users to download or purchase your app. Unfortunately, many apps don't do screenshots right.

A quick overview of existing methods to generate screenshots:

Manually create screenshots on all devices and all languages

It goes without saying that this takes too much time, which also decreases the quality of the screenshots. Since the process is not automated, the screenshots will show slightly different content across devices and languages. Many companies end up creating screenshots in just one language and using them for all languages. While this might seem okay to us developers, many potential users out there cannot read the text on your app screenshots if they are not localised. Have you ever looked at a screenshot with content in a language you don't know? It won't convince you to download the app.

The biggest disadvantage of this method: if you notice a spelling mistake in the screenshots, release an update with a new design, or just want to show more up-to-date content, you'll have to create new screenshots for all languages and devices... manually.


Notice how the font size in the first 3 screenshots (iPhone 4, iPhone 5, iPhone 6) is exactly the same. Only on the last screenshot (iPhone 6 Plus) does the text appear larger, as it's a @3x display.

Create screenshots on one device type, put them into frames and resize them

This way, you create just 5 screenshots per language on a single device type and put them into frames. By putting the one screenshot into different frames, the tool you use can resize the resulting image to match the iTunes Connect requirements.

Below are some example applications that use this technique. I only had to upload one screenshot and got the result shown below. (left: iPhone 4, right: iPhone 6 Plus)

Do you see the difference in font sizes between the screenshots? The carrier label is easily readable on the iPhone 6 Plus and maybe the iPhone 6, but not on the other devices.
Another problem with this service is that it uses the wrong device types: the iPhone 6 should not look the same as the other devices.

A different example, which uses the correct device frame for each screen size. Do you see how small the font is on the iPhone 4? All 4 frames use the exact same screenshot. On smaller devices this results in very small fonts that are difficult to read for the end user. On larger devices the screenshot is scaled up, which causes blurry images and fonts.

Don't get me wrong, using a web service that does this kind of framing for you is a great and easy way to get beautiful screenshots for the App Store. It's also the best solution if you don't want to invest more time in automating better screenshots.

To sum up, the problems with existing techniques:

  • Wrongly scaled screenshots resulting in blurry font
  • Not using the correct device frames for the various screen sizes
  • Screenshot doesn't show the screen the user will actually see (the iPhone 6 Plus user interface should look different from the iPhone 4's)
  • No landscape support
  • No Mac App Support

Using correct screenshots for all device types and languages ("The Right Way")

Checklist for really great screenshots:

  • Screenshots localised in all languages your app supports
  • Different screenshots for different device types to have the correct font in your screenshots
  • Same content in all languages and device types (meaning the same screens are visible, with the same items)
  • No loading indicators should be visible, not even in the status bar
  • No scrolling indicators should be visible
  • A clean status bar: Full battery, full WiFi and of course 9:41
  • Localised titles above your screenshots
  • Device in screenshots actually matches the device of the user (except for the color)
  • A nice looking background behind the frames
  • Optionally a coloured title

Clean Status Bar

Notice the following things:

  • 9:41 AM (or just 9:41)
  • All 5 dots (formerly known as bars)
  • Full WiFi Signal
  • Full battery

To achieve such a nice looking status bar, I can really recommend SimulatorStatusMagic by Dave Verwer. It's very easy to set up and you get all the above-mentioned points for free.
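If you're on a newer Xcode (11 or later), the simulator also ships with a built-in status bar override via simctl. A sketch; the device name is just an example, use any booted simulator:

```shell
# Override the simulator status bar: 9:41, full WiFi, full signal,
# full battery ("iPhone 11" is a placeholder device name)
xcrun simctl status_bar "iPhone 11" override \
  --time "9:41" \
  --cellularMode active --cellularBars 4 \
  --dataNetwork wifi --wifiBars 3 \
  --batteryState charged --batteryLevel 100
```

To reset everything afterwards, run `xcrun simctl status_bar "iPhone 11" clear`.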

Nothing is perfect

I worked on screenshot automation for a really long time, but haven't found the ultimate solution (yet). Even with tools like snapshot and frameit there are open issues, like supporting Chinese and Japanese characters while still having nice line wraps and keeping the design clean and beautiful.

Okay, back to what you can actually do today :) Below are the results of nice screenshots, which were all generated completely automatically.

What's wrong with those screenshots? The time isn't 9:41.

On the iPhone 4, iPhone 5 and iPhone 6 the font size is exactly the same. The iPhone 6 Plus, again, has a @3x display which is why the text appears larger.

How does this look for landscape screenshots?

Since the above screenshot collection looks a bit messy I decided to automatically resize the screenshots in the following examples. Instead of leaving all screenshots 1:1 they now appear properly aligned next to each other.

Landscape screenshots of MindNode: the iPhone 6 Plus shows a split screen when the app is in landscape mode. The smaller screen sizes show only the list. Users can see what the app looks like on their device before even installing it.
Another interesting detail: take a look at the lock button on the different devices. On the 2 screenshots at the top, the lock button is on the top of the iPhone, while on the latest generation it is on the right side.
The screenshots don't have a status bar, since MindNode doesn't show it in landscape mode.

Special thanks to Harald Eckmüller for designing the MindNode screenshots.

How does this look for multiple screenshots?

How does this look when you support multiple languages?

Generating this many screenshots takes hours, even when it is completely automated. The nice thing: you can do something else on your Mac while the screenshots are generated, as long as you don't need the simulator. Instead of working, you can also take a nap or tweet about fastlane.

How does this magic work?

All MindNode screenshots shown above are created completely automatically using 2 steps:

Creating the Screenshots

Using snapshot you can take localised screenshots on all device types completely automatically. All you have to do is provide a JavaScript UI Automation file that tells snapshot how to navigate your app and where to take the screenshots. More information can be found on the project page. This project will soon be updated to use UI Tests instead of UI Automation, so you can write your screenshot code in Swift or Objective C.
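The device and language matrix lives in a Snapfile next to your project. A minimal sketch; the device and language lists here are purely illustrative:

```shell
# Write an example Snapfile (a Ruby DSL that snapshot reads
# automatically from the current directory)
cat > Snapfile <<'EOF'
devices(["iPhone 4s", "iPhone 5", "iPhone 6", "iPhone 6 Plus"])
languages(["en-US", "de-DE", "fr-FR"])
EOF

# Then run snapshot from the same directory:
# snapshot
```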

This step will create the raw screenshots for all devices in all languages. At this point you could already upload the screenshots to iTunes Connect, but this wouldn't be so much fun.

Adding the device frame, background and title

frameit was originally designed to just add device frames around the screenshots. With frameit 2.0 you can now add device frames, a custom background and a title to your screenshots.

  • Custom backgrounds
  • Use a keyword + title to make the screen look more colourful
  • Use your own fonts
  • Customise the text colours
  • Support for both portrait and landscape screenshots
  • Support for iPhone, iPad and Mac screenshots
  • Use .strings files to provide translated titles
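The translated titles come from ordinary .strings files stored alongside each language's screenshots; the keys are matched against the screenshot file names. A sketch with made-up file names and titles:

```shell
# Example title.strings for the en-US screenshots; keys must match
# (part of) the screenshot file names, values are the displayed titles
mkdir -p en-US
cat > en-US/title.strings <<'EOF'
"0-Landing" = "Capture your thoughts";
"1-Editor" = "Beautiful mind maps on every device";
EOF
```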

The same screenshot on the iPhone, iPad and Mac.

Take a closer look at the screenshots above: The iPad's time is 9:41 and the carrier name is MindNode. The other screenshots don't have a status bar, as MindNode doesn't show it on the iPhone in landscape mode.

A timelapse video of snapshot creating the MindNode screenshots.

The generated HTML Summary to quickly get an overview of your app in all languages on all device types.

How can I get started?

To make things easier for you, I prepared an open source setup showing you how to use frameit to generate these nice screenshots, available on GitHub.

All you have to do now is run frameit white to frame all the screenshots generated by snapshot.
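Run it from inside the folder containing the raw screenshots from snapshot; the argument selects the frame colour:

```shell
# Run inside the folder with the raw screenshots
frameit white    # white device frames ("frameit silver" is equivalent)
frameit          # without an argument: black device frames
```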

Putting things together

Calling snapshot and frameit after each other manually is far too much work, let's automate this.

Take a look at the fastlane configuration of MindNode: Fastfile
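A minimal Fastfile in that spirit might look like the sketch below. snapshot, frameit and deliver are the real fastlane actions; the lane name and options are illustrative:

```shell
# Sketch of a fastlane setup (the Fastfile is a Ruby DSL); the lane
# name matches the `fastlane ios screenshots` command
mkdir -p fastlane
cat > fastlane/Fastfile <<'EOF'
platform :ios do
  lane :screenshots do
    snapshot                # take localised raw screenshots
    frameit(white: true)    # add device frames, background and titles
    deliver                 # upload metadata and screenshots to iTunes Connect
  end
end
EOF
```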

To generate new screenshots, frame them and upload them to iTunes Connect you only have to run 

    $ fastlane ios screenshots

More Information

snapshot, frameit and deliver are part of fastlane.

Special thanks to MindNode for sponsoring frameit 2.0 and providing the screenshots for this blog post. 

Run Xcode 7 UI Tests from the command line

Get started with UI Tests to automate User Interface tests in iOS 9

Apple announced a new technology called UI Tests, which allows us to implement User Interface tests using Swift or Objective C code. Up until now the best way to achieve this was to use UI Automation. With UI Tests it's possible to properly debug issues using Xcode and use Swift or Objective C without having to deal with JavaScript code.

First, you'll have to create a new target for the UI Tests:

Under the Test section, select the Cocoa Touch UI Testing Bundle:

Now open the newly created Project_UI_Tests.swift file in your Project UI Tests folder. On the bottom you have an empty method called testExample. Focus the cursor there and click on the red record button on the bottom.

This will launch your app. You can now tap around and interact with your application. When you're finished, click the red button again. 

The generated code will look similar to this. I already added some example XCTAsserts between the generated lines. You can now run the tests in Xcode using CMD + U. This will run both your unit tests and your UI Tests.

You could already run the tests from the CLI without any further modification, but we want the UI Tests in a separate scheme. Click on your scheme and select New Scheme.

Select the newly created UI Test target and confirm.

If you plan on executing the tests on a CI-server, make sure the newly created scheme has the Shared option enabled. Click on your scheme and choose Manage Schemes to open the dialog.

Launch the tests from the CLI
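Assuming the shared scheme created above, a basic invocation might look like this (the project, scheme and simulator names are placeholders for your own):

```shell
# Run the UI Tests scheme on a simulator from the command line
xcodebuild \
  -project Example.xcodeproj \
  -scheme "Example UI Tests" \
  -destination 'platform=iOS Simulator,name=iPhone 6' \
  test
```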

It's important to also define the version of Xcode to use, in our case that's the latest beta:

export DEVELOPER_DIR="/Applications/Xcode-beta.app"

You can even make the beta version your new default by running

sudo xcode-select --switch "/Applications/Xcode-beta.app"

Example output when running UI Tests from the terminal:

Generating Screenshots

No extra work needed, you get screenshots for free. By appending the derivedDataPath option to your command, you tell Xcode where to store the test results including the generated screenshots.
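For example (again, project, scheme and paths are placeholders):

```shell
# Store test results and screenshots in a known location
xcodebuild \
  -project Example.xcodeproj \
  -scheme "Example UI Tests" \
  -destination 'platform=iOS Simulator,name=iPhone 6' \
  -derivedDataPath "./uitest_output" \
  test
```

The captured screenshots then end up inside ./uitest_output under the Logs/Test folder.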

Xcode will automatically generate screenshots for each test action. For more information about the generated data, take a look at this writeup by Michele.

Next Steps

To create screenshots in all languages and on all device types, you need a tool like snapshot. snapshot still uses UI Automation, which is now deprecated.

I'm currently working on a new version of snapshot to make use of the new UI Tests features. This enables snapshot to show even more detailed results and error messages if something goes wrong. I'm really excited about this change 👍

fastlane Example Setups

You want to see how Wikipedia, Product Hunt, MindNode and Artsy are using fastlane to automate their iOS app submission process? 

The above companies were so nice to open source their fastlane setups. I collected them and put them on GitHub.