A Modern Testing Perspective, Plus Cool New Features in iOS Gateway 5.0

By David Vernon | 2/21/18

This blog is only partially about our newest iOS Gateway 5.0 release, with device and simulator support for Touch ID and Face ID (which is super cool, but more on that later). It's also about how testing has changed, and changed a lot, in a short amount of time.

These days, testing is all about UX, continuous delivery, and ensuring applications run on every platform and every device. We're testing richer and more dynamic UIs (like maps and augmented reality) and complex systems, often without access to the underlying code.

To keep up with these changes, people are inventing new and different ways to interact with technology: voice interfaces (Alexa), facial recognition, and Touch ID, to name a few. Eggplant solutions are known for intelligent, image-based testing, but there are many ways to enhance that, such as object-based testing, API testing, and database testing. To provide testing solutions that work with our customers' full suite of products, and that can test their customers' true UX, we have to innovate and create the products, techniques, and methodologies to test all these new modes of interaction.

As a technology, voice recognition is a tough nut to crack, primarily because it puts computing into a much more natural setting: the assumption is that the machine is responsible for adapting itself to the human. And the smarter systems get, the more we expect them to behave like humans. This raises the bar in development, and even more so in testing: you need to better understand your users, as well as why and how they're using your product.

When we were working on our latest iOS Gateway 5.0 release, we had to get creative and tweak the APIs Apple gives us so we could automate them, because out of the box, automated testing of Face ID and Touch ID would require someone's actual face or finger. Face ID works on the latest iPhone when a user raises the device to their face so the sensor can see and recognize them. With some swizzling, we changed that to a process that can be scripted: a dialog box pops up with success and fail buttons, so a test can exercise both recognizing the tester's face and failing to. For Touch ID, we swizzled the APIs controlling the sensor on the home button and swapped in code that lets you click an accept or a fail button, so you can test and script both paths.
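To make the swizzling idea concrete, here's a minimal, hypothetical sketch in Swift. This is not iOS Gateway's actual implementation, and the names `simulateBiometricSuccess`, `stubbed_evaluatePolicy`, and `installBiometricStub` are mine; it just illustrates the general mechanism of exchanging method implementations through the Objective-C runtime so a scripted result stands in for the real sensor:

```swift
import LocalAuthentication
import ObjectiveC

// Rough sketch (not Eggplant's implementation): exchange LAContext's
// evaluatePolicy(_:localizedReason:reply:) with a stub whose outcome the
// test harness controls, so a script can exercise both the success and
// failure paths of Face ID / Touch ID without a real face or finger.
extension LAContext {

    /// Hypothetical switch a test script would flip before triggering auth.
    static var simulateBiometricSuccess = true

    @objc func stubbed_evaluatePolicy(_ policy: LAPolicy,
                                      localizedReason: String,
                                      reply: @escaping (Bool, Error?) -> Void) {
        // Skip the real sensor entirely and report the scripted result.
        if LAContext.simulateBiometricSuccess {
            reply(true, nil)
        } else {
            reply(false, LAError(.authenticationFailed))
        }
    }

    /// Swaps the real biometric check for the stub above at runtime.
    static func installBiometricStub() {
        guard
            let original = class_getInstanceMethod(
                LAContext.self,
                #selector(LAContext.evaluatePolicy(_:localizedReason:reply:))),
            let stub = class_getInstanceMethod(
                LAContext.self,
                #selector(LAContext.stubbed_evaluatePolicy(_:localizedReason:reply:)))
        else { return }
        method_exchangeImplementations(original, stub)
    }
}
```

A test would call `LAContext.installBiometricStub()` once at launch and then set `simulateBiometricSuccess` before each authentication attempt. In iOS Gateway itself, the replacement surfaces an on-screen dialog with success and fail buttons rather than a flag, but the underlying trick of swapping implementations at runtime is the same.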

What’s so cool about these new capabilities in iOS Gateway 5.0 is that you can test security features from the user’s perspective. Prior to this release, testing these security features was out of scope for automation. But now you can test what you ship and test the true UX, which, as I mentioned earlier, has become a top priority for businesses worldwide.

To learn more about iOS Gateway 5.0, check out the release notes.


Written by David Vernon

Formerly a contributor to Eggplant Functional, Dave is currently herding the cool cats on the Eggplant communications team. He has been developing for macOS since it was called NeXTSTEP, and enjoys hiding bugs just as much as finding them. Dave splits his time between Boulder and Estes Park, Colorado.
