Tuesday, April 24, 2018

I Feel The Need...The Need For Speed! Enabling Your Team Testing Efforts To Focus On Automation (Part 4)

Hey There! Welcome to part four of the four-part series dedicated to giving a glimpse into how the Amrock Tech (formerly known as Title Source) team converted its manual testing efforts to mostly automated testing! I know, it's been quite a journey through the last four posts, but you made it, and I thank you for sticking with me and putting up with my blabbering :) This post will focus on the value we've seen from the test automation journey which started with the Top Gun training, and is a supporting blog post for my talk at the Agile and Beyond 2018 conference. Hope you get some value from it!


Part 4: Results Of The Efforts Top Gun Kicked Off



TLDR;


1. As a result of focusing on changing a mindset, which was kicked off with the Top Gun training, we have created automation-focused ninjas who have written over twenty thousand tests that are executed on a nightly basis.

2. As a result of focusing on changing a mindset through the Top Gun training, we have created a support army (of devs, business analysts, etc.) for our automation-focused ninjas that helps traditional QAs with writing automated tests.

3. We are not done creating armies of testing ninjas. We understand that the quality struggle is real and that we have to evolve as the world around us does. To be effective Quality Champions, we need to be involved in all aspects of testing, from automated UI tests to unit tests to performance and security tests.

4. If you want to create armies of ninjas, you need to start with training one ninja. Change is hard, excuses are easy. Dedicate yourself to taking the first step, whatever that may mean for your team.

5. Chances are that you've made it through all the blog posts in this series, and as a reward, I'd like to share with you all the material for running the Top Gun training, from lesson plans and quiz examples to PowerPoints and evaluation examples. Please peruse it on my GitHub page, and feel free to use what you want to design a test automation training of your own :)

Was It Worth It?


As you may have noticed in this series of blog posts, our team has gone through a lot of trouble to focus on ensuring test automation is our primary method of testing. But what have we achieved? What outcomes have we seen? What does the road ahead look like?

Before we answer these questions, we need a bit of quality assurance theory to help us understand our results. So let's dip into the QA world for a moment. Let me introduce you to the concept of the test pyramid.

Courtesy of Martin Fowler's website. The basic premise: the more your automated tests are focused on the UI, the more expensive and slower they will be.


It is a representation of the percentage of time and investment a team spends on different types of tests and the return each layer of testing provides. I like it because it paints a good picture of roughly how much effort is going into automating tests and into which levels of test automation. We will not focus on the details but on the holistic picture. Four years ago, when we started talking about automation, we literally ran one script, which had to be babysat and "spacebarred" through (i.e., the space bar had to be hit at critical points in the test to keep it going). I know, because I watched in amazement as this script was described to me as our "automation". As of today, our team has over 20,000 tests...that run...EVERY NIGHT.

Reducing Daily Risk


We run 514 automated tests that exercise the user interface, 2,867 automated tests that exercise the logic layer through user flows (integration tests), and 16,153 unit tests, which test very granular pieces of logic, such as individual methods or custom properties on classes.


Our current test automation test pyramid makeup
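To make the base of that pyramid concrete, here is a minimal sketch of the kind of granular unit test we mean. Our real suites live in our application's language; this Python version, with a hypothetical helper function, just illustrates the shape:

```python
import unittest


def normalize_state(code: str) -> str:
    """Hypothetical helper: normalize a U.S. state code for a title document."""
    return code.strip().upper()


class NormalizeStateTests(unittest.TestCase):
    # A unit test exercises one granular piece of logic in isolation,
    # which is why thousands of them can run in minutes every night.
    def test_strips_whitespace_and_uppercases(self):
        self.assertEqual(normalize_state("  mi "), "MI")

    def test_already_normalized_input_is_unchanged(self):
        self.assertEqual(normalize_state("MI"), "MI")


if __name__ == "__main__":
    unittest.main()
```

Because tests like this touch no browser and no database, they sit at the cheap, fast base of the pyramid, which is why they make up the bulk of our 20,000.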


We communicate our results on a daily basis through an automated email, have set up dashboards for individual teams, and are even working on bots to do the work for us.
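As an illustration of that daily-results email (the helper name and the counts below are made up, and actual sending via smtplib is omitted), the summary might be assembled like this:

```python
from email.message import EmailMessage


def build_nightly_summary(results: dict) -> EmailMessage:
    """Assemble a nightly test-results email from a {category: count} dict.
    Sending it (e.g., via smtplib.SMTP(...).send_message(msg)) is left
    out of this sketch."""
    total = sum(results.values())
    failed = results.get("failed", 0)
    msg = EmailMessage()
    msg["Subject"] = f"Nightly automation: {total - failed}/{total} tests passed"
    # One line per category in the body, sorted for a stable layout
    msg.set_content("\n".join(f"{name}: {n}" for name, n in sorted(results.items())))
    return msg


# Illustrative counts only
msg = build_nightly_summary({"passed": 19987, "failed": 13})
print(msg["Subject"])  # Nightly automation: 19987/20000 tests passed
```

The point is not the plumbing: pushing the same numbers to everyone's inbox every morning is what keeps failures visible and acted on.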

Dashboard for test results for one of our teams


When we see consistent failures in our automated test results, our quality champions act immediately, as they understand their teams and leadership care about the results.

Because of the above activities, we find issues early in our cycle and usually not in production.

Test Automation For Everyone


By maintaining a central test code repository, we can tell who is checking in scripts and know that our push for cross-functionality is working. In the last year, we've had 36 individual team members checking into our automated test suites, 22 of whom were not traditional quality champions. Our team members do care about quality, and are practicing their skills by contributing to our efforts!


Contributors to our test automation source control area in the last year
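If you want to run the same check on your own repository, the raw data is just distinct commit authors over a time window; in git itself, something like `git shortlog -sn --since="1 year ago"` on the test repo. A small Python sketch of the counting step (the sample names are made up):

```python
from datetime import datetime, timedelta


def distinct_contributors(commits, days=365):
    """Return the set of distinct authors with a commit in the last `days` days.
    `commits` is an iterable of (author, datetime) pairs, e.g., parsed from
    `git log --format='%an|%aI'` output."""
    cutoff = datetime.now() - timedelta(days=days)
    return {author for author, when in commits if when >= cutoff}


# Made-up sample history: two recent authors, one long gone
now = datetime.now()
sample = [
    ("dev_ana", now - timedelta(days=10)),
    ("qa_lee", now - timedelta(days=40)),
    ("dev_ana", now - timedelta(days=90)),
    ("old_timer", now - timedelta(days=700)),
]
print(sorted(distinct_contributors(sample)))  # ['dev_ana', 'qa_lee']
```

Tracking this number over time is a cheap, honest metric for whether non-testers are actually participating, not just nodding along.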

The Road Ahead


Our journey in test automation began in 2014, and the last four years have brought us many lessons learned, struggles, and rewards...but we are not done yet. As software development changes around us, we have to as well.

While in 2014 the concept of automated tests was fresh and hot, now it is simply the ante to play. We are seeing DevOps push the envelope with respect to the speed of delivery, and testing has to keep up. We are seeing SecDevOps push the envelope with respect to security testing, and we have to involve ourselves in those efforts...and don't forget about performance and load tests :)

As you can see, our mission is not simply to learn something new; the real mission is to continuously learn something new, and be comfortable with being uncomfortable.

While we still have a long way to go toward the moving target of "all appropriate things automated," we have made what I consider monumental strides in the right direction, and I think it is because of the process and investment which I described in the last few posts.

Changing mindsets is no easy feat, and it took us about four years, but you now have the opportunity to take our lessons and learn from them!

To spread the quality butter, "simply" follow these steps:


1. Convince leadership the juice is worth the squeeze

2. Prepare and deliver a focused, purposeful, measurable training

3. Empower trainees by focusing them on performing tasks learned in training, and reward them for sharing the knowledge learned to empower other team members to learn as well.

4. Struggle, encourage, fail, change, repeat until goals are met...then set higher goals ;)


What Is Your First Step?



The above probably seems like a lot to do...because it is. But the first step for you or your team is to make a conscious decision to do it. Change is hard. Technology is hard. Excuses are easy.

Take the first step: acknowledge that you need to focus on quality by tackling the hard problem of instilling a culture of automation-first testing. What to do depends on how far along your team is, but the attitude change, which is arguably the most necessary piece of your transformation, starts with you.

As Jocko Willink of "Extreme Ownership" advises, do not let the moment pass. Start your team's automation journey, today.



I Feel The Need...The Need For Speed! Enabling Your Team Testing Efforts To Focus On Automation (Part 3)

Hey There! Welcome to part three of the four-part series dedicated to giving a glimpse into how the Amrock Tech (formerly known as Title Source) team converted its manual testing efforts to mostly automated testing! This post will focus on the lessons learned with respect to changing team member mindsets about automated testing and enlisting non-testers to help with the effort. This is a supporting blog post for my talk at the Agile and Beyond 2018 conference. Hope you get some value from it!


Part 3: Beyond Top Gun: Changing Our Mindset To Proliferate Automated Testing Throughout Our Teams


TLDR;

1. To keep your team of technically trained testers always evolving and highly skilled, do not introduce direct hires who have a lower level of technical skill...EVER.

2. Establish a Quality Community, and stick to a disciplined regimen of value-adding meetings and activities, open to all functions, not just testers.

3. As your team members practice their skills (learned in Top Gun), encourage them to share with other functions. Include those functions in quality tasks directly via sprint planning.


Top Gun Wrap Up


The last cohort of the Top Gun training graduated in spring 2016. In 2014, we set out on a mission to convert all of our existing QA team members to focus on automation and make automated testing their primary way of testing. After that last class graduated, ninety percent of our entire QA team was writing automation. We were seeing great growth in the number of tests, and a lot of energy being put into automation. We knew we were building a great base for technical skill, and a lot of expertise in automated testing.

But even with all of our efforts, we understood that this achievement was only a start. In order to scale automation and achieve a true agile quality focus, we needed the entire team (traditional testers or not) to buy in and participate, from developers to business analysts and project managers. We needed everyone involved in the quality mindset. Since training the last Top Gun cohort in 2016, our mission has been clear: convert our QA team members into Quality Champions, and involve the rest of the team in all aspects of quality. Below are the rules and processes we followed and implemented, which have moved us a great deal toward that goal.

1. Do Not Dilute Your Team


In order to keep a strong technical base amongst our quality champions, we have fought hard not to hire external candidates who did not meet the current standards for technical knowledge on our teams. As mentioned above, our mission since 2016 has been to spread technical knowledge about quality practices to the rest of our teams for scaling. This mission did not allow us to focus on training external hires the way we did during the Top Gun training. We have been tempted as we have grown, but we pushed back and, for the most part, have filled teams' needs with bright, driven interns turned full-time team members, who were tasked with proving their passion for the job during their internship. Our interns had to figure out how to learn our automation standards and show their skill by producing automated tests by the end of their internship.

Intern-Leader Red Wings Game

2. Establish A Strong Quality Community


After leaving Top Gun, our team members became bastions of quality on cross-functional teams. In order to provide each of them with a place to share lessons learned, ask function-specific questions, and ask for help when necessary, we kept a strong focus on the Quality Community. Although it is a constant struggle to bring people together, we have implemented numerous processes to ensure that the community is valuable to all attendees. Below are some of the lessons we have learned over the last few years for keeping our community strong.

a. Well Defined Organizational Duties


In any volunteer community, it is hard for one person to always do all the prep work. In order to share the burden, we established a rotating role of "Huddle Captain". We created a job description for what the Huddle Captain is responsible for (e.g., gathering topics, updating calendar invites, sending reminder emails, taking notes) and created a list of Huddle Captains 4-6 months in advance. We also established a general mailbox which the Huddle Captain uses to update the huddle agenda and rotation.

Example Huddle Agenda Email


b. Well Documented Schedule


We established a wiki page for our huddle and maintained the same huddle cadence. The quality huddle is very rarely canceled, and we post its schedule well ahead of time.

c. Valuable Discussions


In an attempt to keep our attendees interested, we regularly polled for desired topics and then presented on them. This is by far the most difficult thing to maintain: with a wide variety of attendees, it is hard to pick topics that everyone gets value from.

d. Open To All


To encourage all roles to come learn about quality, we extended the invitation to all roles. To cross-pollinate our ideas to other communities, we also hosted cross-functional sessions, focused on ideas we both cared about. For example, when interest came up about unit testing, we combined our huddle with the developer community.

Keeping a community with a voluntary attendance base is really hard. But with the above actions, we've maintained a consistently high participation rate and have reaped the rewards with respect to relationship building. We consistently notice quality champions and team members from other functions asking questions, contributing to technical projects, and maintaining a great rapport, because they have a place to share ideas and get to know each other.


Team Leader Isaac Gilman loves attending our meetings...it may be because he's a former QA, but we like to think it's also because our meetings are awesome :)

3. The Final Step To Becoming A Quality Champion: Spread Your Learnings To Others


After we finished the last cohort of Top Gun, we imagined a place where our QAs would be able to write enough automation to keep up with the development team and provide coverage for every feature written during our sprint cycle. We would no longer need to manually test most new features, and life would improve drastically.

That reality has been a bit slower to materialize than we anticipated. The more tests we wrote, the better we got at it. The better we got at it, the more need for automated tests we discovered. After our Top Gun graduates got a hold of the user interface driven tests, they started to learn how to write integration tests, and some started writing unit tests. We needed help. We came to the realization that we had to train other team members to write automated tests, and that we needed help in testing from our entire team.

To accomplish this, we went on a campaign to influence leadership and other functional communities. We talked about the importance of testing to business analysts and developers. We shared the learnings from Top Gun with those two groups, and given that we wrote our tests in the same development language as our application, the barrier to entry for developers and some BAs to help us with our efforts was very low.

Our QAs developed different ways to involve their entire teams. We received support from our leaders to task developers with helping us write automated tests, and we involved the entire team in team-wide testing efforts ("Team Testing Days"), where the entire team would engage in exploratory testing to cover any non-automated requirements. Both of those efforts surfaced interesting bugs and raised the entire team's ownership of the quality cycle. We ran workshops to show our business analyst friends how to write automated tests and cheered them on when they accomplished testing tasks assigned to them.

We spread awareness and communication about the value of understanding automated tests in whatever way we could. Our team even held a Lego-themed "Testathon", a voluntary hackathon-style unit testing competition which generated a great deal of awareness about unit tests and produced thousands of them.

Each member of the winning Testathon team received a trophy and a substantial prize :)

The effort to establish a mindset of Quality Championship is a continuous battle, which we continue to fight to this day. Convincing our QAs that in order to be proficient at what you do, you must evolve with the times is not an easy endeavor. As testing evolves, we lean on each other and our friends from different disciplines to upgrade our testing efforts. While Top Gun kicked off our push to become as technical as we can be, we continue learning daily and don't foresee an end anytime soon. As our industry evolves, we must do the same.




I Feel The Need...The Need For Speed! Enabling Your Team Testing Efforts To Focus On Automation (Part 2)

Hey There! Welcome to part two of the four-part series dedicated to giving a glimpse into how the Amrock Tech (formerly known as Title Source) team converted its manual testing efforts to mostly automated testing! This post will focus on the lessons learned with respect to the design and implementation of the Top Gun training and is a supporting blog post for my talk at the Agile and Beyond 2018 conference. Hope you get some value from it!




Part 2: Teaching Top Gun, Lessons Learned To Establish A Disciplined Instructional Design Process



TLDR;


1. One does not simply teach without preparing

2. Equal access to instruction through a proper communication channel is very important

3. Organization makes the training. For a successful training, a course outline which lays out clear expectations, learning objectives, and knowledge checks has to be distributed well before the start of the training.

4. Feedback is a gift. It should be delivered to the team member and his or her leader at the end of the training, and include actionable takeaways for the team member to work on.

After pitching Top Gun and getting buy-in from leadership and team members, the real work began. We needed to figure out how to deliver an effective training that provided value to our team and team members. For us as eager technologists, the organization and instructional design of the program was definitely the biggest learning. We knew the content, but over time we learned many lessons with respect to delivery and evaluation, which evolved the process to deliver a great amount of value to both the team and the team member.

1. Equal Access To Instruction Through Physical Organization and Standardization of Communication Methods


Part of the requirement for a team member to participate in Top Gun was that he or she would be 100% dedicated to the program. In order to achieve this dedication, early on in the process, the designers decided that all team members must sit in the same area, segregated from their regular team. Prior to the start of each cohort, a dedicated Top Gun space was found and reserved for the duration of the program. The key to the space was that it was away from the regular technology team from which the team members came. This purposeful decision had a great impact toward ensuring that the team members involved in Top Gun were not easily distracted.

All of our eager Top Gunners were a bit like Dory from Finding Nemo...easily distracted. We eliminated many distractions by moving our team members to a different area.


Another lesson learned during the last cohort of the program, which was not realized in the initial cohorts, was the value of a uniform communication method within the Top Gun team (trainees). Due to the team project-oriented nature of the program, team members needed to constantly be in close communication with each other. During the initial cohorts, all team members were physically present in the same room, but in the later cohorts, some team members were in the office on a part-time basis or worked completely remotely.

To ensure that all team members were able to communicate with each other, a requirement of communication over the same medium was established. All team members were equipped with headsets and web cameras, and even though 80% of the team members sat at desks within feet of each other, all were constantly in contact via a web meeting. This had the effect of including all team members equally in conversations and sharing sessions. Because of this requirement, team members located in San Diego or Florida had the same access to instructors and other team members as those physically located in the office.

To remind our on-site team members of our remote team members, we represented the remote team members' presence via "buddy bears"




To standardize communication, we used the "Zoom" teleconferencing solution for all training communication

2. Pre-Determined Lesson Plans, Learning Objectives, and Knowledge Checks


As the Top Gun program was designed, a large focus was put on instructional design. While the design improved drastically as the program evolved, instructional design was one of the most emphasized areas from the inception of the program. While the initial cohort was rolled out with merely a basic course outline, the later cohorts benefitted from a well-thought-out, adhered-to instructional design structure. As Top Gun continued to evolve, we determined that the following pieces of instructional design brought tremendous value to the training.

The notes from the first Top Gun design session


a. Course Outline With Strongly Defined Expectations


An example of a course outline used for Top Gun can be found here. The course outline pre-defines basic expectations of the candidates and gives the candidate's leader visibility into what the candidate needs to be successful in the program, even before starting: from basic technology needs to the length of the program and the type of material to be covered. It is essential to release this document to the candidate and the leaders with enough time for the candidate to prepare the prerequisites for the course. For most Top Gun cohorts, the course outline was released at least two weeks before the start of the program.


b. Learning Objectives


A learning objective is a statement which is meant to "articulate the knowledge and skills you want students to acquire by the end of the course" (source). Learning objectives were introduced in Top Gun in the third cohort as a result of a lesson learned with respect to the amount of knowledge acquired by the candidates.

At the beginning of Top Gun, the concept of learning objectives was not incorporated into the planning of the program, so there was no direct link between what the candidates learned and what the instructors intended for them to learn. Given that the initial design did not incorporate specific learning objectives, we also could not fairly judge how much knowledge the candidates were actually leaving with.

This was extremely unfair to the candidates: although we deemed them successful in the program, we were not actually sure whether the lessons they learned would be applied once they were released back to their teams. After incorporating learning objectives, we had a very clear understanding of what we wanted the candidate to learn and how to check the candidate's level of knowledge at the end of the program. All of our learning objectives were identified and communicated via the course outline. An example of a learning objective can be found in the course outline, here



c. Knowledge Checks


In general, when students in classroom settings are told they are going to take a test or quiz, they get nervous. There is a stigma surrounding evaluations which, during the design of Top Gun, we wanted to destroy. To do this, when we started talking about evaluations and quizzes, we clarified the reason for them. We wanted the administered quizzes to act as points of self-check, as opposed to prerequisites for passing or staying in the program. We also tied the quizzes directly to the learning objectives we designed, to ensure relevance. An example of a quiz administered to the students can be found here


d. Capstone Project


While quizzes were not tied to the success of the student, there was a portion of the program that was. Unlike the previous projects and assignments, the capstone project was an individual project, worked on during the last two weeks of the program. The requirements for the project were communicated to the students well in advance of those two weeks and outlined in the course outline. The project also acted as an outlet for creating automated tests for the teams the Top Gun candidates were delivering for. As a reminder, one of the value propositions of the program was that each cohort would increase the amount of automated test coverage for the team. Overall, approximately a thousand test cases were delivered by means of the capstone project across all Top Gun cohorts.


e. Feedback Interview


The final piece of feedback the candidates received was delivered through a feedback interview with the candidate and their leader. The purpose of this interview was to give the team member and the team member's leader insight into how the team member performed during the program, and to provide suggestions on how to keep growing the team member's skills after he or she returned to the regular team. During the interview, a Top Gun instructor discussed a pre-compiled "report card" for the team member, which documented the interview discussion topics. An example of the report card can be found here.



Overall, the process explained above evolved across the multiple cohorts of the Top Gun program. These lessons provided a better, more structured learning experience for each cohort of candidates. Every time the Top Gun training was delivered, our organizational and instructional design process got better.


I Feel The Need...The Need For Speed! Enabling Your Team Testing Efforts To Focus On Automation (Part 1)



Agile and Beyond 2018


Hey! Did you know that I am going to be speaking at Agile and Beyond in about a month? I'm so excited about it, that I'm writing a series of blog posts to leave a lasting impression :) If you want to know what the conference is all about, check it out here. If you're interested in info about my session, check it out here.

I started writing this blog with the intention of providing a peek into the journey which my team at Amrock, formerly known as Title Source, went through to turn our testers into lean, mean automation machines! It's a four-part series, and maybe should be a short novel ;) Hope you can get something out of this first part, which describes why I'm sharing and the beginning of the journey toward "test automation awesomeness" (Joe Colantonio, "TestTalks Podcast", every single episode).

Series TLDR;


1. The problem with manual testing is that it cannot keep up with fast-paced product development


2. The solution is to establish a mindset of automated testing


3. To establish a mindset of automated testing, teams need to invest in their people, through a focused systematic approach.



Why I Am Sharing



About four years ago, my team's journey toward automated testing began. With a lot of blood, sweat, and tears, we were able to refocus our testing efforts from mainly manual to mainly automated, and have seen our testing get faster and more reliable. From increased quality to removed bottlenecks to empowered team members, it's been a great journey, the lessons of which I feel should be shared with others.

The Problem


Manually repeating tests to ensure the quality of a growing product feature set is madness. Without a proper set of automated tests, development of new features will not succeed. In the current world of fast-paced delivery cycles, manual testing simply cannot keep up. Teams that perform exclusively manual testing will ship low-quality products and break the trust of their consumers (internal or external).


Our Story


In 2014, the Title Source Tech Team (now known as Amrock) was heading down the path of implosion (personal opinion ;)). Our team was growing, our technology was "hot", and we were cranking out features our business wanted. The problem was that our quality efforts could not keep up. We were focused on scaling our tests by adding people, and that strategy could not be sustained. In two years, our centralized QA team grew from 2 to 20 people. Of those 20 people, one was involved in automation. Although his efforts were valiant, he simply could not create enough automated tests. We did not have a concentrated effort to ensure all engineers could help with automation, and our test efforts were not scaling with our growth. No matter how many test cases we wrote, no one person could execute them all, as we were constantly adding tests. We were constantly in a state of not being able to test enough. We needed to change our mindset, improve our technical skills, and lean on each other.

QA Team Building, circa 2014

Some more of Our QA Team, 2014

Top Gun: A Focus On The Automated Testing Mindset


We realized we needed to establish an automated testing mindset, but we did not have the skills necessary to do so. In order to help our team members learn, we created a focused training during which team members with no automated testing or coding experience were given the opportunity to learn how to write automated tests. We taught team members from the ground up and produced automated tests aimed at pre-identified products that really needed the help.

Throughout the training, team members received instruction and practice in writing automated tests, and the teams for which we were writing our tests received many (hundreds of) automated tests. In the end, the training program was a really successful way to perpetuate the automation mindset and change the way we tested at Amrock Technology.

Top Gun Summary


  • 4-6 week focused training requiring no previous coding experience
  • Bootcamp style, 100% focus on training
  • Targeted teams
  • Capstone project delivered up to 400 automated tests for targeted teams.



Part 1: Launching Top Gun, The Future Value Proposition


TLDR;


1. Convince your team members automation is the future of their career, then mandate that it's going to be the future of the role on your team.

2. Convince leadership that automation is absolutely necessary to their product development life-cycle

3. Convince leadership that by investing in their people, they are investing in their team, and doing the right thing

4. Market every win: Program kick-off, graduation, etc.


Actions Taken


As a Team Leader, imagine someone asking your team to give up your only tester for 4-6 weeks on the promise of automation in the future. That proposition seems crazy to accept, right? Who would perform the testing for your team? How can you be sure there would be a payoff from the program? All these questions were ones we had to answer in order to establish trust with leadership, which was absolutely necessary to secure the team members who were to participate in the training.

Convince Team Members Automation Is The Future


The first step in creating demand for the program was to convince our testing group that automation was the future. At the time of the program's inception (2014), most of the testing community still focused on manual testing. Only "unicorn" companies had a heavy investment in automated tests. It was easy to believe that what we were doing (manual testing) was the "normal" way to perform our job. In order to change this, the leadership of the testing group had to establish a mindset oriented toward the future. At Amrock (and I'd imagine at most companies), we pride ourselves on being forward-thinking and future-proof. We used this mindset and pride to drive forward the idea that manual testing is the way of the past, and that automated tests were going to be a key factor in a successful career in quality.

We used our social and professional networks to perform some analysis of companies that we all looked up to. I personally talked to some friends at Microsoft, and we read some Google testing documentation, all of which pointed out that at these companies, there are no manual testers. We discovered that at Microsoft, everyone who tested was writing automation; there were no dedicated manual testing groups. After realizing this, we were able to convince our team members that it was in their best interest to know automation. Whether they wanted to stay at our company or not, automation was a skill that would drive their quality assurance career forward.

After we had these discussions with our team members, we (as leadership) mandated that all of our team members needed to learn how to automate. We then announced that each team member would have the opportunity to do so by participating in the Top Gun training program. Even before the mandate, the prestige of getting a crack at learning something Microsoft and Google did was enough to excite our team members. We ran five sessions of the training, and after the first one, we started receiving requests from team members at other companies to go through the training. We knew we had built something awesome.


Automated tests needed to be the primary form of testing! We had such trouble finishing our manual testing that we had to test through lunch!

Convince Leadership That Automation Is The Future 


In parallel to convincing team members that they needed to become automation savvy, we needed to convince their leaders that the investment was worth it. From a team-leader perspective, the gains from the training needed to be both short-term and long-term. At Amrock, team leaders lead cross-functional teams, which means the leaders themselves come from a multitude of technical disciplines: some are former developers, some former business analysts, and some former project managers.

Our task was to sell them on automation, and to sell them on incurring some short-term pain for a long-term gain. The main short-term selling point for letting their team members participate in a non-production-focused 4-6 week training was the fact that when that person came back, they would immediately be writing automated scripts for their team. As mentioned before, at the time of this proposition, team leaders were struggling to complete testing each sprint. Our teams were churning out a large amount of code, and testing was always the longest task in the development cycle. We knew it was not realistic to say that automation would solve this problem immediately, but we also knew that immediate wins could be achieved from automation.

To convince the team leaders to allow their team members to learn about automation, we explained a test automation strategy which, in the short term, would ease the pain of highly repetitive tasks and, in the long term, cover most regression test cases. We educated the team leaders on the fact that during Top Gun, their team members would be taught how to pick test scenarios to automate based on specific heuristics which provide the most value for the team. We also told the team leaders about our plan to teach team members which tests should not be automated. We assured team leaders that after successfully completing the training, team members would immediately be able to identify test scenarios which would be valuable to automate, eliminating repetitive, simple checks. In the long term, as team members added automation, the regression test suite would eventually become mostly automated, providing time to focus on testing new features in innovative ways.
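The heuristic angle can be made concrete with a toy example. The scoring below is purely illustrative (the field names, weights, and candidate tests are mine, not the actual Top Gun rubric): rank candidate scenarios by how often they run, how long the manual pass takes, and how stable the feature is.

```python
# Hypothetical sketch of heuristic scoring for automation candidates.
# All names and numbers are made up for illustration.

def automation_score(runs_per_sprint, minutes_manual, feature_stability):
    """Higher score = better automation candidate.

    feature_stability ranges from 0.0 (UI churns constantly) to 1.0
    (frozen); automating a volatile screen mostly buys maintenance pain.
    """
    return runs_per_sprint * minutes_manual * feature_stability

# (name, runs per sprint, manual minutes per run, stability)
candidates = [
    ("login smoke check",       10,  5, 0.9),
    ("quarterly report layout",  1, 30, 0.3),
    ("order regression suite",   5, 45, 0.8),
]

ranked = sorted(candidates, key=lambda c: automation_score(*c[1:]), reverse=True)
for name, *args in ranked:
    print(f"{name}: {automation_score(*args):.1f}")
```

Run against these made-up numbers, the repetitive, stable regression suite rises to the top while the rarely run, churn-prone layout check lands last, which is exactly the "what not to automate" conversation we had with team leaders.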


We created a battle cry to help with our effort. In retrospect, not all things should be automated, as we learned ;)


Convince Leadership They Are Investing In Their People


The final argument we used to convince our leaders to allow their team members to attend the training appealed to the leaders' aspirations to be good people and invest in the knowledge of their team members. From a leadership perspective, allowing team members to grow their technical skills benefits the team and the team members, and is simply the right thing to do. We appealed to this fact and even used some of our "ISMs" to help us out. We made it really easy for leaders to see all the potential impact that the training could have on a team member's career, and by also providing clear benefits to the team which would materialize as a result of the team member's participation, we made them an offer they could not refuse.


Marketing

As part of the design process for the program, we decided that we needed to market its existence. We wanted the program to be known by everyone on our team, be seen as an elite program, and have enough buzz around it that everyone wanted to participate. Before the launch of the program, we designed a program logo and printed massive FatHeads (vinyl decals) which we then pasted in plain sight. We held ceremonies to celebrate the kick-off of the program and the graduations. Everything was done in public, and successful team members were celebrated as "aces". Graduating team members received really cool trophies, and the top performers were rewarded with paid-for trips to regional developer conferences. We invited all team leaders of participating team members and spread the word through conventional (meetings, celebrations) and non-conventional ("pitch day") channels. We began with a program focused on a team of twenty testers at Amrock (a company in the Quicken Loans Family of Companies), and by the last cohort, we were receiving interest in the program from the entire Family of Companies. The marketing effort was a huge part of spreading the word about the program and establishing it as a prestigious brand.

To spread hype about the effort, we created a brand!



We celebrated graduations in cheesy ceremonies and costumes!


Monday, January 15, 2018

CodeMash 2018 Recap


This past week, I participated in a conference at the Kalahari Resort in Sandusky, Ohio. CodeMash is an annual developer conference, and judging by the speakers and participant volume, it seems to be one of the better attended and organized technical conferences in the midwest. My company sent me and five of my friends to the conference to learn about and bring back new ideas which could improve our team. We traveled together, ate together, went out for beers together, and learned a crap ton. This post will summarize what I thought were some of the most important themes, tools, and ideas that I encountered.

  

Cross Functionality

As a quality champion attending what I thought was a developer-focused conference, I expected to encounter mainly development talks and workshops. But it turns out the beauty of CodeMash is that, by its nature, it caters to all types of technologists. This year's tracks included Architecture, Data (big/small/otherwise), Design (UI/UX), DevOps, Enterprise/Large-Shop Development, Hardware/IoT, Mobile, Programming Principles, Project Leadership/Soft Skills, Security, Software Quality, and Web/Front-End. I found myself focusing on the Software Quality, Security, and DevOps tracks, with a few other session types sprinkled in. It was refreshing to see all aspects of the SDLC represented, pushing the idea of a true cross-functional technologist forward.

Notable Sessions

Webapp Pentesting for Developers and QA Persons

This session was conducted by Brian King and focused on tools which can help developers and QA persons get started with penetration testing. Brian did a really great job of differentiating functional testing from pentesting, then guided us through some common approaches to pentesting using free tools. He walked us through example tests and drilled home the idea that essential pentesting approaches can be carried out not only by specialists (like himself) but also by pentesting noobs (like me). I walked away with tools and approaches which I am excited to bring back to my team.
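For a flavor of what "pentesting for QA persons" can look like in practice, here's a small self-contained sketch (my own illustration, not one of Brian's examples): an automated check that flags responses missing common security headers. The stub server stands in for an app under test, and the header list is illustrative rather than exhaustive.

```python
# Illustrative pentest-lite check: flag HTTP responses missing common
# security headers. The stub server below stands in for an app under test.
import http.server
import threading
import urllib.request

REQUIRED_HEADERS = ["X-Content-Type-Options", "Content-Security-Policy"]

class StubApp(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Deliberately send only one protective header so the check fires.
        self.send_header("X-Frame-Options", "DENY")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the demo output quiet
        pass

def missing_security_headers(url):
    """Return the required headers absent from the response at url."""
    with urllib.request.urlopen(url) as resp:
        return [h for h in REQUIRED_HEADERS if resp.headers.get(h) is None]

server = http.server.HTTPServer(("127.0.0.1", 0), StubApp)
threading.Thread(target=server.serve_forever, daemon=True).start()
missing = missing_security_headers(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(missing)  # -> ['X-Content-Type-Options', 'Content-Security-Policy']
```

A check like this slots naturally into an existing automated suite, which is the spirit of the session: basic pentesting moves don't require a specialist.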


DevOps Zen: Injecting Automated Tests Into Infrastructure 




This session was conducted by Stephen Shary and focused on testing NGINX when it is implemented as a reverse proxy. I was honestly blown away by this session, not only because it was really well conducted, but because it introduced the idea of applying integration tests to the NGINX configuration. Stephen works for Kroger Technologies (yes, the grocery chain!) and ran into a problem with testing his infrastructure. An NGINX setup centers on one or more configuration files which specify the routing of traffic through the appliance to the web applications which sit upstream of it. Stephen's teams maintained the configuration files in source control but ran into mega issues when changes were made and checked in. This caused him and his team to look for, and eventually develop, an open-source integration testing framework called Snow-Globe.



The value of the framework is that, when deployed, it mimics upstream dependencies, effectively mocking your web apps while running tests against your NGINX configuration(s). The framework comes wrapped in a nice Docker container and can be integrated into a continuous integration flow quite easily. During the session, Stephen demonstrated how tests could catch erroneously checked-in configuration changes, such as a poorly configured redirect. Stephen's team is eager for other teams to adopt the framework and add to it, so he has extended an offer to help teams trying it out. Watch out, Stephen, our team is pretty eager to get some tests running!
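Snow-Globe's actual API is richer than anything I can reproduce here, so the sketch below only illustrates the underlying pattern: stand up a throwaway endpoint in place of the real system, issue a request the way a CI job would against the proxy container, and assert on the first-hop redirect. The stub plays the part of a proxy carrying a hypothetical checked-in config typo (/login pointed at /logon), so the check has something to catch.

```python
# Illustrative sketch of redirect-assertion style integration testing.
# The stub server stands in for an NGINX container with a config typo.
import http.server
import threading
import urllib.error
import urllib.request

class MisconfiguredProxy(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Pretend a config change pointed /login at /logon by mistake.
        self.send_response(301)
        self.send_header("Location", "https://example.com/logon")
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface the 301 instead of silently following it

def first_hop_location(url):
    """Return the Location header of the first redirect, if any."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(url)
    except urllib.error.HTTPError as err:  # unfollowed 3xx raises here
        return err.headers.get("Location")
    return None

server = http.server.HTTPServer(("127.0.0.1", 0), MisconfiguredProxy)
threading.Thread(target=server.serve_forever, daemon=True).start()
location = first_hop_location(f"http://127.0.0.1:{server.server_port}/login")
server.shutdown()
print(location)  # -> https://example.com/logon, which a CI assertion would flag
```

In a real Snow-Globe setup the request would hit the actual NGINX container with its checked-in config, and a test asserting `Location == https://example.com/login` would fail the build before the typo ever reached production.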

Favourite Workshop

Devour The Cloud With Locust Swarms: Hands-On Load Testing


                                      

This workshop was run by Steven Jackson and Nick Barendt. It involved building a cloud-based load testing lab and launching an application for testing (the application under test) on AWS. We started by launching the infrastructure for the load generator and the application under test, which was a great lesson in itself. We then moved on to writing simple scripts to run on the Locust load testing framework, followed by load tests of varying degrees of difficulty. Finally, we implemented fixes (the introduction of caching) to our application under test and saw the results of the fixes in the subsequent load tests.

This process was really awesome to walk through, as it covered the full spectrum of what an engineer interested in performance would have to do. I've been to many workshops focused on writing tests which do not give you an idea of the work necessary before any tests are written. Steven and Nick did an awesome job giving us the tools necessary to truly establish a load testing environment and run load tests of varying difficulty. It was challenging, but thanks to the crystal clear instructions in their GitHub repo, I did not have a problem completing the exercises.
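Locust drives real HTTP traffic against your application under test; this dependency-free sketch of mine only mimics the shape of the workshop's final exercise: hammer a handler from many concurrent workers, add caching, and compare the timings. The 5 ms sleep stands in for a slow database round trip.

```python
# Illustrative before/after-caching load comparison, not a Locust script.
import functools
import time
from concurrent.futures import ThreadPoolExecutor

def slow_lookup(item_id):
    time.sleep(0.005)  # stand-in for a slow database round trip
    return item_id * 2

@functools.lru_cache(maxsize=None)
def cached_lookup(item_id):
    return slow_lookup(item_id)

def load_test(handler, requests=200, workers=20):
    """Drive `requests` calls through `workers` threads; return elapsed seconds."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Simulate many users requesting a small set of hot items.
        list(pool.map(handler, [i % 5 for i in range(requests)]))
    return time.perf_counter() - start

uncached = load_test(slow_lookup)
cached = load_test(cached_lookup)
print(f"uncached: {uncached:.3f}s, cached: {cached:.3f}s")
```

The cached run finishes much faster because only the handful of distinct items ever pays the simulated round trip, which mirrors the drop we saw in the workshop's load graphs after caching was introduced.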




Favourite Talk

Sondheim, Seurat, and Software: finding art in code



Due to unanticipated unfavorable weather forecasts, this was the last talk I saw at CodeMash 2018. But what a talk it was. Any time you get the chance to listen to one of the software industry's gurus, you just go. I must admit, before this talk, I was a bit skeptical, but I knew that if I had the chance to see Jon Skeet talk about anything relating to software, I should.

I was not disappointed. Jon's talk was one that I think transcended the traditional boundaries of the technical and soft-skill talks I encountered at CodeMash. Jon spoke of software and compared it to his favorite musical, "Sunday In The Park With George", by Stephen Sondheim. Jon shared all types of lessons leading to the idea that developing systems is similar to writing a play. He supported this by examining design, composition, and light, and drawing parallels to craftsmanship in both disciplines.

Listening to Jon Skeet speak of the SDLC was less a lecture and more a sermon. His passion for the higher ideals of craftsmanship shone through above all else, and really inspired the rest of us to think along the same lines. I was blown away by his ability to translate his experiences and make them relatable to our individual struggles.



Conclusion

This was my first time at CodeMash, and I am excited to say I think I found a gem of a conference. Everyone who's ever been has told me it's pretty great, and now I can confirm it is so. I will be back, and in the future, I'll bring more of my family :)



Specific Resources Bookmarked At CodeMash

https://www.thoughtworks.com/radar/tools
https://github.com/Kroger-Technology/Snow-Globe
https://csfirst.withgoogle.com/en/home
http://cidrdb.org/cidr2015/Papers/CIDR15_Paper16.pdf
https://github.com/repsheet/repsheet-nginx/blob/master/spec/integration/integration_spec.rb
https://github.com/jemurai/nginx_workshop
https://github.com/stevenjackson/devour-the-cloud

Wednesday, November 1, 2017

Tech Dad Manifesto

On October 3rd, 2017 at 7:13 am, I became a Dad. I had been preparing for that moment for a long time, but I truly had no idea what it would bring. The minute my daughter Victoria arrived, my life changed from being the best me to being the best Dad. Immediately, I started thinking about new improvements to my personal and family process which would positively impact Victoria and my wife, Brandi. I blazed through books such as “Dude, You’re A Dad” and “Dad Jokes”, and believe it or not, even started investigating purchasing a minivan (GASP!). I was away from the office for about a month, and in that timeframe, got used to being #1Dad (I even have a plaque). I cooked, I cleaned, I shopped, and I did a lot of baby snuggling. The time I spent with my new family addition was awesome.

Unimpressed Baby Is Unimpressed


Then reality set in, and I realized that in order to continue being #1Dad, I would have to elevate my game at work, both in terms of process and delivery. Now, even though our tech team alone added 11 new little humans to our families in the last year (2017), I understand that not all of us are caring for little ones. But I am confident that all of us on the tech team struggle with elevating our game at work while having enough time to care for, or participate in, the things we love.
Whether it's kids, puppies, or video games, the struggle of maintaining a balance between consistent high performance and time spent on the things or people you care about outside of work is real! After spending a month at home with my daughter and wife, I realized I needed to tighten up my process and make every possible second count toward the things which would provide the biggest gains in personal efficiency. The statement of action below is what I challenged myself to follow:

The Tech Dad Manifesto

Be Purposeful In All Activities
I refuse to do things without an agreed-upon purpose. Activities that I cannot attribute to a predetermined goal do not lead to my team's or my own long-term success. Goals are established before tasks are initiated. Goals are prioritized based on some sort of needs analysis. I understand that, when finished, some goals may not turn out to be impactful. It is ok to only know what you know, and to try to establish the best course of action with the information at hand. Ignore the noise.


Be Disciplined
I will eliminate all unnecessary distractions. Whether it’s closing browser windows, turning off my phone, or aggregating team member questions to a pre-determined “question time”. My time is precious, and I have to ensure it is best used. I will politely exit conversations or meetings which do not have a purpose that relates to my goal or task. I am the master of my own time, and while I understand that I am not perfect, I will strive to improve my time discipline every single day. Every second counts.

Be Honest
I will give and receive feedback immediately and constructively. When I see an opportunity for improvement, I will voice my opinion, even if it is uncomfortable to do so. I will voice feedback with the intent of improvement and care. I will change my behaviour to ensure feedback is received as constructive, and not belittling or passive-aggressive. When I receive feedback, I will take action on it and ensure the feedback giver is treated with appreciation, even if the feedback is hard to hear. Do the right thing.



Be Comfortable With Being Uncomfortable
As a technologist, I understand the pace of our industry is very fast. I understand that in order to keep up, I have to always be learning and refactoring. I will not apologize for not being the master of all domains. I will be comfortable, with the fact that I need to keep learning, advancing, and that I may not remember all the things I have learned in the past. I will always strive to learn new things, document, implement and share my learnings. I will not be disheartened by the daunting task of implementing my big dreams. I will not become complacent in my thirst for knowledge and I will not give up. Obsessed with finding a better way.


Be A Team Player
I will believe in my team and contribute my ideas to their process. I will be engaged, and I will set clear expectations that my team members be engaged as well. I will not stand for disengagement and will seek to understand the reason behind its occurrence. I will lean on my team and expect them to lean on me in their time of need. I will foster and rely on relationships throughout the team as a way of solving hard problems. We are the “they”.



Balance

Above all else, I will strive to achieve a balance between what I deem most important and the activities that support it. I will not spend all my time on one while sacrificing the other, but I will not use the first as an excuse for not performing at the other. I will be purposeful, disciplined, honest, uncomfortable, and a team player. With this approach, I will be able to provide for my family while being the best geek I can be. That is how I will remain #1Dad and a top contributor on my tech team.

My 2017 Gilbie :)

WIFM (What's In It For Me)

I know, not everyone is #1Dad, or #1Mom…or plans to be, or even wants to be. But I am confident that every one of us wants more time to do the things we love, and most of us want to be awesome at what we do.

The above tenets were derived from the experience of the Crusaders team, and they have worked really well for us to date. I am a firm believer that they are highly reproducible for each and every one of us, and while not easy to perfect, they will improve your ability to strike a balance between doing the things you love and the thing that allows you to do those things. Ask yourself: what do you love to do? What is the reason you push yourself to be better every day?

For me, that reason was re-defined on October 3rd, 2017 at 7:13 am.


Mission in life: #1DAD

Wednesday, September 20, 2017

QL Tech Con 17 Resources

Hey Y'all!

It's QLTECHCON17 TIME!!!!!




In support of my talk at QLTechCon17, I'm posting a few resources that I thought you'd find useful:

Presentation Resources

1. My presentation
2. My twitter handle
3. The best technology team career site

Tool Examples: Orchestrators

4. TFS
5. Visual Studio Team Services
6. Jenkins
7. Git Test

Tool Examples: Execution Environments

8. Xamarin Test Cloud
9. Sauce Labs
10. Selenium Grid