Test Automation: a term everyone in testing probably hears on a regular basis, but what is it? If we listen to consultancies and tool vendors, it’s the holy grail. If we listen to some gurus in the testing space, it’s the present and future of testing. Others are more conservative. However, I commonly hear people say Test Automation when they actually mean Automated Testing; to be frank, the two have become synonyms. This confuses me. As a skilled tester I use automation and tools all the time, some I create, some I download, but automated tests they are not. I’ve built automation to help my team create data, I’ve built low-tech dashboards, and I’ve built tools to install my mobile app under test on as many devices as I could connect to my computer. I’ve written tiny little scripts that saved me hours on a weekly basis. All these tools provided a lot of value. I’ve also built numerous architectures to do automated checking, which, just like my tools, provided a lot of value. My focus on all these occasions was my testing mission. So, what is Test Automation? How can we succeed with it? What skills do we need to succeed with it? These are questions I’ve been pondering for the last four years, and I think I finally have some answers. Not just answers, though: I also have lots of actionable advice for you to take back to work. This advice will be woven between real examples like those listed above, exploring why I created them, the skills needed, and how they helped me with my testing mission.
Richard Bradshaw Software tester, Speaker and Trainer, FriendlyTesting
Richard is currently the BossBoss at the Ministry of Testing and co-creator of the Automation in Testing (AiT) namespace. He blogs at https://automationintesting.com and https://thefriendlytester.co.uk, tweets over at @FriendlyTester, and is the creator of the YouTube channel Whiteboard Testing.
It all started in 2010 when XING, the largest business network in German-speaking countries, decided to go mobile and staffed a team with 2 iOS and Android developers, 2 software testers, 1 product manager and a freelance mobile designer. Back then the mobile team developed against a non-public API and tried to catch up with features from the web platform that had been developed over the previous 7 years. In the initial 2 years everything worked more or less fine, but mobile traffic increased until it exceeded half of the overall traffic of XING.com across iOS, Android and Windows Phone. Alongside the increased traffic, customers requested more mobile features, but the feature development speed of the single mobile team was too slow. The development approach with only one mobile team did not scale compared to over 200 web developers in more than 15 teams. Therefore, the company decided to scale mobile development across the whole company and unleash it onto the web teams. As of early 2015, XING has 7 mobile teams with iOS and Android developers as well as software testers. These so-called domain teams are now responsible for feature development on web and mobile. However, scaling to multiple mobile development teams exceeding a total of 50 people brought new challenges that had to be solved. In this talk Daniel will show you how XING is scaling mobile development and testing efforts across 7 mobile teams with more than 20 mobile developers and 12 (mobile) software testers. He will explain how the bi-weekly releases are coordinated and organised and how real users play an important role in the release process. The second part of this talk will concentrate on the mobile test automation solutions in use within the XING mobile teams and how an internal device cloud was established to provide devices to all the mobile teams across different locations.
Daniel Knott Lead Mobile Tester, Xing
Hi, my name is Daniel Knott and I have been a passionate software tester since 2008. In my career I have worked for companies from different industries, such as IBM, Accenture, XING and AOE. In various agile projects I have gained strong knowledge in different areas of software testing, e.g. mobile, search and recommendation technologies, and web and desktop applications. Since 2011 I have been blogging about software testing, mobile testing and other interesting topics around software development. When I find the time, I also speak at testing conferences and write quality assurance articles. An overview of my publications can be found here. In 2014, I published my first book about mobile testing, “Hands-On Mobile App Testing”, which can be purchased at http://www.handsonmobileapptesting.com/. At the beginning of 2016, I released a new eBook, “Smartwatch App Testing”, published at https://leanpub.com/smartwatchapptesting.
"How my product knowledge aids my testing" -- In agile teams, we focus on delivering value to our customers continuously and incrementally. To do that, testers need to be able to decide on priorities based on risk and impact. What’s the best way to do that? I think it’s about effectively employing our product knowledge to support the team in all test-related activities. Most of us gather product knowledge by testing and re-testing or by asking questions in grooming and planning sessions. So testing activities are what make most of us product specialists. How do we discover more about a product’s darker sides? Should the architecture of the product guide our testing? What can we learn about the product from a pull request? Since these product insights are at the core of our decisions, can we be more effective at gathering and employing them? This talk is about applying a set of learning models (e.g. David Kolb’s model) to discover the feature and technical sides of a software product.
Anastasia Chicu Agile Tester, Xing
Anna is an Agile Tester at XING with over 4 years of digital experience. Since joining XING she has coordinated testing activities, supported her team and the product department, and organised regular trainings for the User Care team. Her past experience as a QA in an outsourcing company helped her develop effective communication skills within her team and with other departments. Her debate experience, on the other hand, has driven her to trigger constructive discussions that add value to the product.
Pedro Pereira CEO, byAR Augment Your Reality
With the conviction that Augmented Reality (AR) is part of our present and future, in 2015 I founded byAR Augment Your Reality, a company that designs and develops digital experiences based on AR technologies. I’m responsible for the creativity and management areas of the company. Over the last two years I have been building knowledge of AR technologies, trends and the market, and I have developed new interaction concepts and projects with the devices of our future, for example the Microsoft HoloLens. I started in 2001 as a researcher and head of the Virtual and Augmented Reality Group at the Computer Graphics Research Institute (CCG). At this institute, over 4 years, I took part in several EU projects developing augmented reality technologies. Then, for 10 years, I took part in the innovation and creativity process in several positions, namely national and international sales management, development and project management, and marketing management at Edigma, a company focused on developing interactive digital projects and multitouch technologies. I am a graduate in Computer Engineering and I hold an MBA with a specialisation in Marketing from the Catholic Business School of Porto / ESADE in Barcelona. More recently, I attended an executive education course in Brand Management at the Catholic Business School Lisbon.
In a modern software house, security is a top priority. It is a fast-paced working environment focused on continuous delivery and integration, and keeping up is an endless and demanding challenge for the security team. Issues arise and must be addressed efficiently and in an expedited manner. Besides keeping abreast of emerging technology, the team needs to develop strategies that actually work within the organisation. Can classic analysis tools be used on a strict time cycle? How can a security team handle the demands of a product team? How can we ensure that everyone is on the same page and understands what’s under the hood? In seeking answers to these questions, this talk assembles a set of tips and tricks, showing some of the work and spikes used in a real software house, in the hope of providing a potential roadmap for implementing secure and improved Software Development Life Cycles.
Renato Rodrigues Senior Application Security Analyst, Farfetch
Holder of an MSc in Informatics Engineering, with a high interest in security issues, Renato has been working in the AppSec world for a while. He is a speaker and trainer at conferences like BSides, OWASP and other security- and IT-related events, curator of an AppSec Ezine, and promoter of a security enthusiasts group (the 0xOPOSEC Meetup) – more than breaking through challenges, it is all about sharing the knowledge.
Software engineering is much more than coding. It is “an engineering discipline that is concerned with all aspects of software production”. We are responsible for the code we write. We should not release untested code, and we should not ask our peers to perform code reviews without delivering a solid test suite. It’s our job to reduce the number of bugs found after the development phase. However, we live surrounded by bad practices, stress, and the fear of missing a delivery deadline. We have all worked on projects with those characteristics, and if we don’t enforce a shared culture of testing, no matter what, the quality of the software we produce will suffer. With this talk, I want to share the problems and solutions that will enable us to create a shared testing culture across our teams and companies.
Pedro Tavares Software Engineer, Talkdesk
Pedro is a passionate Software Engineer currently working at Talkdesk. Distributed systems and software engineering best practices bring a big smile to his face. He likes to read books and scientific papers on Computer Science, and he is currently running the Porto chapter of Papers We Love.