Outwarians at AltConf

Outwarians Jeames, Mahmudul, Tina and Tom (L-R) at WWDC AltConf where Mahmudul presented a talk on ResearchKit, ‘Less data, more research’.

Jeames gives us his update on all the latest news from WWDC Day 2 below:

Cocoa Touch / UIKit


The new multitasking in iOS 9 on iPad allows two apps to run at the same time. The layout engine uses size classes to handle resizing your iPad app so it works while multitasking.

When your app is in Slide Over or Split View mode it uses a compact width size class (similar to an iPhone application) to lay out its views. For apps that already use size classes and work on iPhone, adopting multitasking should be incredibly straightforward.

There is also a new picture-in-picture mode for apps that play video. This means you can watch your video on screen (moving it around and resizing it) while using other apps. There are some new APIs for handling this.


A new UI component! UIStackView is a sort of “flow” layout similar to the one on Apple Watch. You add components to it and they are automatically stacked vertically or horizontally. The UIStackView handles all of the constraints for you, and you can adjust the spacing and alignment as you wish. It fully supports nesting (unlike UITableView!), which means you can create complex interfaces by composing stack views.
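The idea can be sketched as follows; this is a minimal, hypothetical example in current Swift/UIKit syntax, not code from the session:

```swift
import UIKit

// Three labels stacked vertically; the stack view generates the
// Auto Layout constraints itself.
let stack = UIStackView()
stack.axis = .vertical
stack.alignment = .fill
stack.spacing = 8

for title in ["First", "Second", "Third"] {
    let label = UILabel()
    label.text = title
    stack.addArrangedSubview(label)   // constraints are managed for us
}

// Stack views nest freely, so a horizontal row can live inside a vertical stack:
let row = UIStackView(arrangedSubviews: [UIButton(type: .system), UIButton(type: .system)])
row.axis = .horizontal
row.spacing = 8
stack.addArrangedSubview(row)
```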

Keyboard Assistant

iOS 9 adds a bar of assistant items above the keyboard, alongside the word suggestions. These are customisable by apps, creating a standard way to handle extra keyboard buttons in your app.


Storyboards

Linking! You can now move sections of your storyboard into separate files and link them via storyboard references. This should be great for complex storyboards that were previously large and unwieldy.

You can also now unwind segues via storyboards.

Touch Prediction

iOS now provides a built-in touch prediction API, which will provide you with the predicted future location of the user’s touch, allowing smooth and responsive interactions.

Notification Text Input

Allows text input from users via a custom push notification. You can use this for a quick reply to a message, or any short text input, directly on the notification without opening your app.



Objective-C

Apple has included some great changes to Objective-C to facilitate usage with Swift, but I think these changes are really great for the language even without the Swift compatibility. Video available here.

Nullability Specifiers

This was introduced in an earlier Xcode release, but I strongly recommend that everyone use it in their Objective-C applications. Being clear and explicit about where a null value makes sense in your API is an incredibly powerful tool, and helps the compiler pick up errors as you write them, instead of once the app crashes.

We have already started using these on CabFare and find them incredibly useful!

Typed Collections

Typed collections allow you to specify the types allowed in your collection types, giving a sort of static type safety to Objective-C, e.g.

NSArray<UIView *> *subviews;

We now know that this is a collection of UIViews, and if we try to put something else in there, the compiler will show us a warning at compile time.

KindOf Types

When using typed collections, __kindof types allow implicit downcasting of types where needed.

Code Snippet of KindOf Types

Here we are specifying that the elements of subviews will always be a UIView, or a subclass of UIView. This lets us call UIButton’s setTitle method without explicitly casting. The result is more explicit type information throughout your app, instead of relying on id, which could be anything!

Swift 2

The Swift language has been updated with some great new language features and tooling with version 2.0. Video available here.

Guard Statement

A powerful way of explicitly handling edge cases in your functions, which works really well with Swift’s if-let syntax, e.g.

Code snippet for Guard Statement

guard can be used with any condition, and optionals unwrapped with let stay in scope for the rest of your method.
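As a small sketch (written in current Swift syntax; Swift 2 used `where` instead of a comma in the condition, and the function name here is hypothetical):

```swift
// guard exits early when the condition fails; the unwrapped value
// stays in scope for the rest of the function.
func greet(name: String?) -> String {
    guard let name = name, !name.isEmpty else {
        return "Hello, stranger"
    }
    // `name` is a non-optional String from here on.
    return "Hello, \(name)"
}

print(greet(name: "Tina"))   // Hello, Tina
print(greet(name: nil))      // Hello, stranger
```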

Defer Statement

The defer statement will ensure that the code inside will be executed at the end of the current scope. eg.

Code snippet for Defer Statement

This will ensure that doAThing() always runs at the end of the current scope, no matter where it exits. This is great for making sure clean up operations happen after throwing errors or exiting early from a function.
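A minimal sketch of the behaviour (hypothetical function and log names, current Swift syntax):

```swift
// The deferred block runs when the scope exits, on every exit path.
var log: [String] = []

func doWork(fail: Bool) {
    log.append("start")
    defer { log.append("cleanup") }   // always runs last in this scope
    if fail {
        log.append("early exit")
        return
    }
    log.append("finished")
}

doWork(fail: true)
print(log)   // ["start", "early exit", "cleanup"]
```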

Error Handling

We all love to hate NSErrors and the pattern of passing references to them around our code. Swift 2 introduces a new paradigm for handling errors which is baked into the language, making writing code that can fail much cleaner and easier.

Here’s a small example of what the code looks like, see the session for a more in depth example of how it works.


The error handling employs an ErrorType protocol (which NSError conforms to), so you can create your own errors by implementing it.

Note that this is not like exceptions in Java. You can’t throw an error from anywhere: a function must be explicitly marked throws to be allowed to do so. This makes Swift error handling much easier to debug, and much more performant.
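A small sketch of the shape of the new syntax (the error and function names are hypothetical; written in current Swift, where the ErrorType protocol has since been renamed Error):

```swift
// Errors are plain types conforming to the error protocol.
enum VendingMachineError: Error {
    case outOfStock
    case insufficientFunds(required: Int)
}

// Functions that can fail are explicitly marked `throws`.
func buySnack(costing price: Int, with coins: Int) throws -> String {
    guard coins >= price else {
        throw VendingMachineError.insufficientFunds(required: price - coins)
    }
    return "Enjoy your snack"
}

// Callers handle failure with do / try / catch.
do {
    print(try buySnack(costing: 50, with: 20))
} catch VendingMachineError.insufficientFunds(let required) {
    print("Insert \(required) more cents")   // prints "Insert 30 more cents"
} catch {
    print("Something else went wrong: \(error)")
}
```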

API Availability Checking

Checking for system versions is now super easy in Swift.

#available can be used to switch out code based on the OS version of the user’s device.

As an added bonus, the compiler will now show a warning if you try to use features that are unavailable for your current deployment target.
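A minimal sketch of the syntax (the function name and branch bodies are hypothetical):

```swift
// The iOS 9 branch only runs on devices that actually have the new APIs;
// the `*` wildcard covers all other platforms.
func multitaskingSupportDescription() -> String {
    if #available(iOS 9.0, *) {
        return "Multitasking APIs available"
    } else {
        return "Falling back to pre-iOS 9 behaviour"
    }
}

print(multitaskingSupportDescription())
```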

Protocol Extensions

You can now extend protocols with functions! For example, you may want to add a method that can be used on any collection type, like array or dictionary. The example given in the session is adding a countIf method for conditional counting.

This functionality also allows many previously global functions, such as map and filter, to be chained with much clearer semantics, e.g.

Code Snippet for Protocol Extensions
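A sketch of both ideas (current Swift syntax, where the relevant protocol is Sequence; in Swift 2 it was SequenceType, and countIf is the hypothetical method named in the session):

```swift
// Extend every sequence with a conditional count.
extension Sequence {
    func countIf(_ predicate: (Element) -> Bool) -> Int {
        var count = 0
        for element in self where predicate(element) {
            count += 1
        }
        return count
    }
}

let evens = [1, 2, 3, 4, 5, 6].countIf { $0 % 2 == 0 }   // 3

// map and filter chain left-to-right as methods, instead of
// nesting global function calls:
let doubledOdds = [1, 2, 3, 4].filter { $0 % 2 == 1 }.map { $0 * 2 }   // [2, 6]
```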


Xcode support for Swift has been greatly improved as well, with a bunch of new warnings and editor features for developing in the language.



Tina and Jeames at WWDC

With five Outwarians on the ground at WWDC this year, we’re really excited to share updates from Day 1. Most notably, that two of our developers, Tina and Jeames, made it onto the WWDC website!

We also had a small camp out in the Richmond office, with Jet, Adam and Rick calling out the most exciting updates as:

  • Swift being made open source in late 2015
  • Multi-tasking on iPad using a split screen, making the iPad a better productivity tool
  • Native watchOS apps
  • Enhanced intelligence, e.g. smarter Siri & Spotlight, able to guess who’s calling you from an unknown number

The following update is from our Senior iOS Developer, Mahmudul Alam, who is attending WWDC.

Native Watch App

As expected, Apple announced support for native watch apps and introduced watchOS 2. The WatchKit extension moves to the watch, and watch apps can work without the phone being present. watchOS 2 comes with heaps of new features:
  • Watch connectivity framework
  • Watch extension can talk to web services directly
  • Animation support
  • Audio and video playback support on watch
  • API access to accelerometer and HealthKit sensor data
  • API access to Taptic engine.
  • In addition to glances and notifications, watchOS 2 introduces complications: glanceable custom information like upcoming flights or sports scores
  • API to access the digital crown
  • High priority push notifications to push immediate updates to watch apps.
Xcode 7 and iOS 9
On demand resource API:
  • Assets need not be in the app bundle, can be downloaded and used on demand
  • Customisable download order and priority
  • Resources will be hosted by the App Store
Storyboard refactoring:
  • Subsections of a storyboard can be extracted into another storyboard and replaced by a storyboard reference in the original one
App transport security:
  • Built into NSURLSession
  • Automatically enforces current state-of-the-art security standards (TLS 1.2, forward secrecy)
Test support:
  • API and network performance test support
  • Native UI test support
  • Native support for code coverage (unit tests and UI tests)
  • Recording UI tests to generate tests with nearly zero coding effort
  • Supports all these features on both Objective-C and Swift
  • Core location profiling
  • App transitions and network calls profiling
  • Address sanitiser – helps diagnose and fix memory issues
Crash analysis:
  • Get crash logs from both TestFlight and App Store builds
  • Open crash logs directly at the line of code causing the issue
… and a lot more to come! Watch this space for more updates from Day 2!


Accessibility is becoming increasingly important in the digital space. With the reach of digital products extending to millions of people all around the world, organisations have a responsibility to make their products accessible and easy to use. Outware Mobile’s vision is to create intuitive, effective and engaging mobile experiences that make a difference by helping people get things done, be more productive, learn, grow and be entertained. We believe that mobile technology is an enabler, and accessibility is critical to achieving this vision.

What is accessibility?

Accessibility for digital products means making them available and friendly for people with temporary, permanent or increasing disabilities. People with visual, auditory, physical, speech, cognitive, and neurological disabilities should have the opportunity to access and use digital products, and also contribute to them, the same as everyone else.

For those of us who work in the digital industry, this means that we have several factors to consider in design, development and testing to ensure that we create a product that everyone can use.

Increasing the accessibility of a digital product doesn’t just benefit those who have disabilities; it also helps people on slow internet connections, those who are ageing, and those on mobile devices. For example, drop-down menus are notoriously difficult to use on mobile, and are very difficult for those with disabilities due to their temporary nature (i.e. an exposed drop-down menu stays in a temporary state – it is not permanently on show).

The global standard for accessibility

The Web Accessibility Initiative (WAI), run by the World Wide Web Consortium (W3C), has created accessibility guidelines which are used globally for desktop and mobile websites. These guidelines outline the considerations that need to be made for all user interface (UI) elements on a screen, and how each could be made accessible.

In 2009, the Australian Government agreed to raise the level of accessibility of government websites, complying with Level A by the end of 2012. Some states committed to reaching Level AA by the end of 2014.

Unfortunately, there are no official W3C guidelines for implementing accessibility in native mobile apps. Native apps use UI controls and elements that are specific to the device and operating system (iOS/Android/Windows). The existing guidelines include non-mobile app elements, such as HTML, web technologies, and server side implementations, to name a few.

Our work on the RentRight app for Consumer Affairs Victoria addresses this gap.

Case study: RentRight app by Consumer Affairs Victoria


We were approached by Consumer Affairs Victoria to work on the second version of their RentRight app, which included a refresh of the user interface, and the addition of a number of new features. We were also asked to make the app compliant with the AA standard of accessibility.

Having discovered that there were no official W3C guidelines for implementing accessibility in native mobile apps, we adapted the Web Content Accessibility Guidelines (WCAG) for web and mobile web.

In order to derive comprehensible Mobile Application Accessibility Guidelines (MAAG) from the WCAG, we assessed every recommendation for its applicability to mobile. We analysed its meaning in the context of a mobile app and a mobile operating system, and how it could be met to produce the same or better results than the WCAG produces in the context of a web page.

We produced a new description to re-formulate the recommendation for the MAAG. Then we assessed if the operating system provided functionality to meet the recommendation or if further considerations needed to be taken. Those could be in the user experience design, in visual design, or during implementation. We then created comprehensive guidelines for each platform on how to meet the criteria of each recommendation which could be followed throughout the process and used as a checklist when reviewing our work.

We had a look at the native accessibility features that are available as part of the Android and iOS platforms. iOS and Android devices do support accessibility, including features such as increasing font sizes, inverting colours, using voice-over, and switch controls. However, there are still considerations we need to make when designing and building the app, to make it as friendly and accessible as we can.

Some of the WCAG recommendations simply do not apply to mobile apps. We did not just disregard them, though. Instead, we either incorporated the recommendation into the MAAG as non-applicable, explicitly explaining why it does not apply to any mobile application, or we identified an intention behind the recommendation that does apply in some form to mobile apps, and derived a recommendation on that basis rather than a mere translation from web to native mobile.

When following the platform implementation guidelines, many of the accessibility requirements are provided by the Android and iOS operating systems. Other requirements need to be considered right from the start of designing the application and are carried through from concepts to visuals to implementation, before being tested against the original MAAG.

Design: User experience, business analysis, visual design

During the design phase, we went over the adapted guidelines to ensure we addressed every point that applied to our app. Not everything applied, such as the accessibility requirements for videos, since we didn’t have videos in the app. Several important things we considered included:

  • Form fields: It’s important to keep labels on form fields persistent, and not hide them depending on state (i.e. whether data has been filled in or not). In many apps you’ll see the labels on form fields used as ‘hint text’ that gets hidden once you start typing in data. If you were in accessibility mode using voice-over, and wanted to hear the field labels and entries after completing the form and prior to submission, hidden labels wouldn’t be read out, providing less context for the data entry and increasing the chance of errors. (Image: RentRight Android app – General Property Issues template)
  • Error messaging: When validating the responses in a form, any errors that are produced should provide help and instruction to the user as to where the error has originated. Imagine filling out a long form only to hear that there is an error and you can’t proceed – but didn’t know where the error was! A way to solve this on a native app with accessibility in mind is to change the label of the field with the error – e.g. “Name” might change to “Name – error”, so when the label is read out, it is clear which field needs to be changed. Another way to do this is to visually signify the field, such as putting a line around the field. This can help those who don’t have voice-on and can still see visual cues.
  • Use of colours: We ensured that colours were not used as a distinguishing factor throughout the app, with all buttons containing words, or iconography used for actions in the action/navigation bar. Users did not have to distinguish, for example, between a green, red, or blue box on a screen in order to make a decision. Any links that were used to visit an external website were underlined and coloured in blue.
  • Voice-over copy: The voice-over copy was written to convey instructions on how to navigate and complete the screen, and to describe the elements on it. This is particularly important for someone with a vision impairment who needs to have the screen read out to them. As each label is read out, it is important that it is descriptive and not ambiguous. States can also be described: for example, a button might be read as ‘button – dimmed’ when it is not tappable, or ‘button – active’ when the disabling is lifted.

Development: Android

Provided you use the standard system-provided components when you create your user interface, the Android platform requires only a few additional steps to make your application accessible. Some of the additional measures we took to make RentRight accessible include:

  • Focus-based navigation: It was important to ensure that all user interface elements can be reached and interacted with via directional input methods (either hardware or software based). In some instances, we were required to change the focus order in our user interface to be more logical for users, as well as make additional components focusable to provide more context. This was achieved by applying the android:focusable, android:focusableInTouchMode and android:nextFocus XML layout attributes.
  • Describe user interface controls: We had to ensure that all user interface elements had descriptive text (which may have differed from what was displayed) to help provide the user context via voice-over. We had to pay special attention to visual components such as images and tabs. This was achieved by applying the android:contentDescription XML layout attribute.


Development: iOS

On iOS we took full advantage of native VoiceOver support and the UIAccessibility programming interface (available since iOS 3.0). The UIAccessibility protocol is implemented as part of the standard UI controls, which ensures that basic accessibility features are available wherever we used native UIKit elements. As part of development, we made sure that all standard UI elements in RentRight had the following attributes set (where applicable, as per the MAAG):

  • Label
  • Traits
  • Hint

However, to comply with the MAAG on RentRight we needed to customise the accessibility behaviour of a few standard elements, such as the segmented control. In addition, we implemented our own custom views that provide their own accessibility information for different states. For example, the area assessment screen in RentRight:

Area assessment screen in RentRight

There are two ways to implement custom attribute information:

  • Define custom accessibility information in Interface Builder (IB)
  • Define accessibility programmatically

In RentRight we used the programmatic approach.

Following is an example code snippet providing attribute information in a custom subclass implementation:

Code Snippet
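As an illustration of the programmatic approach, here is a hypothetical sketch in current Swift syntax; the class name, strings and traits are invented for the example, not taken from the RentRight source:

```swift
import UIKit

// A custom view that exposes its own accessibility information,
// updating the spoken label whenever its state changes.
class AreaAssessmentView: UIView {
    var ratingDescription = "Good condition" {
        didSet { accessibilityLabel = "Area assessment: \(ratingDescription)" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Area assessment: \(ratingDescription)"
        accessibilityTraits = .adjustable
        accessibilityHint = "Swipe up or down to change the rating"
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```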


Testing

Our main aim in testing was to ensure the following basic points for accessibility in the app:

  • Label: Every element in the application should have a meaningful label.
  • Hints/Language: Every element where a user needs to take an action should provide a meaningful and clear hint. Based on these hints, a user should be able to easily decide the flow in the application.
  • Flow: The user flow on the screen should be from top to bottom and left to right. The flow throughout the app should be intuitive.
  • Display: The user should be able to zoom in and increase the font size of text in the application. Error messages should be clearly visible.
  • Focus: The user should be able to focus on every element on the screen.

To help achieve these, we decided to close our eyes, put on earphones, and simply follow the hints and the spoken text to navigate through the application. The reasoning: if a label or hint was not clear to us, even after testing the app for a few months, it would not be clear to new users.

For checking the display, the user should be able to zoom into different parts of the screen and even increase the font size. In this case, the text should not be truncated, and the entire text should still be read out.


Our experience on RentRight helped us create a set of accessibility guidelines specific to native iOS and Android applications that can be re-used to assess and design other apps. Our goal is to help our clients understand the importance of accessibility and create mobile experiences that are intuitive and easy to use for all.

For more information on accessibility:

Download the app from the App Store or Google Play.


This article was written by:

  • Kelly Jennings, UX/BA
  • Frank Ziegler, Project Manager
  • Jonathan Causevski, Android Developer
  • Mahmudul Alam, iOS Developer
  • Meetu Singh, QA Analyst