Tuesday, December 29, 2015

Upgraded To Visual Studio 2015: What It Meant For Me.


Background

In a previous post I talked about the benefits of VS 2015, and said that I was ready to take the plunge. Welp, my hand was forced about three weeks ago. Our build server was upgraded to TFS 2015, which meant it was the perfect time to upgrade my IDE as well. I did so about two weeks ago.

As part of the upgrade testing, I wanted to ensure that I could effectively write a test in VS 2015, compile it, check it in, associate it with a test case, and be able to run the test case in the test lab environment.

The Problems

GUMP: I needed to update a crap ton of dependencies. At the end of the day, automated test cases rely on multiple assemblies to run. For example, test attributes and assertion classes all stem from the UnitTest and UITesting assemblies, so without them our tests don't run. More specifically, without the correct versions of those assemblies, our tests don't even get added to the test runner!
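
To make that concrete, here's a minimal sketch of a Coded UI test. The class, test name and URL are all made up; the point is just how even a trivial test leans on both assembly families:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UITesting;   // UITesting: [CodedUITest], BrowserWindow
using Microsoft.VisualStudio.TestTools.UnitTesting; // UnitTest: [TestMethod], Assert

[CodedUITest] // comes from the UITesting side of the house
public class SmokeTests
{
    [TestMethod] // comes from the UnitTest framework assembly
    public void HomePage_Loads()
    {
        // Placeholder URL: launch the app under test in a browser.
        BrowserWindow browser = BrowserWindow.Launch(new Uri("http://app.example.test"));

        // Assert also lives in the UnitTest framework assembly.
        Assert.IsTrue(browser.Exists, "Browser window should have launched.");
    }
}
```

If the versions of those two assembly families disagree, nothing here runs at all, which is exactly the mess described below.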

1. Dependencies Built Into CodedUIExtensionsAndHelpers

My initial problem, after upgrading to VS 2015 and attempting to run a test case, was that I could not run any tests on my local machine. More specifically, I could not even hit a breakpoint. The test suite was building fine, but as soon as I'd go to run it, no breakpoints were hit and the test suite would seem to do nothing at all.

How I Diagnosed This
At first I thought something was wrong with my test runner, so I started trying to run test cases via the ReSharper test runner. This did not help. I now saw that ReSharper treated my tests as inconclusive, but it still did not run them. This problem was fairly hard to diagnose, as I was not getting any explicit exceptions or obvious errors. After some consultation with one of my senior dev buddies, I was able to figure out the problem using the call stack window: I could see where the Coded UI test script started crashing. Turns out that the package we use for providing helper methods and framing for Coded UI elements (CodedUIExtensionsAndHelpers) had a dependency on test DLLs which were compiled with VS 2013. This put me in a bit of a pickle, as I did not own the source code to that package.

How I Fixed This
Because the developer who owns these packages is a former colleague of ours, I emailed him, texted him and called him up to see if he could push a version of the extensions package compiled with VS 2015. I waited a few hours (lol), then a day or so (perhaps a weekend?), and after no response got impatient. I figured I could pull down the source code (since it's on GitHub @ https://github.com/spelltwister/CodedUIFluentExtensions), recompile it, and ship the DLL along with our code. After a lot of promising my dev buddy that I would push the proper way once I'd proven that recompiling would work, I was able to show that after recompiling with VS 2015 and updating all the references in the CodedUIExtensionsAndHelpers package, I could build locally and debug. I then promptly forked the source of CodedUIExtensionsAndHelpers to my own GitHub (https://github.com/mkonkolowicz/CodedUIFluentExtensions), recompiled it, repackaged it and updated all the references so the DLL could be pulled from my GitHub. The end result was that the package no longer had dependencies on VS 2013 assemblies, and I could debug locally.





2. Dependency Versions In The Checked In Projects

After figuring out that I could build locally, I assumed I would be able to run the nightly runs with no problems. After all, everything else should just upgrade itself, right? Welp, I was wrong, and I spent a long time figuring that one out. I was able to build and run locally, but all my test runs in the test lab environment kept failing 100% of the time, with error messages specifying that a specific version of an assembly was still missing (see the screenshot below for the specific message).


How I Diagnosed This
The build process for QAAutomation takes all the code necessary for our test cases (which includes dependency DLLs such as UITesting.dll) and drops it into a shared folder, from which it is used by the test agent during the test run. I knew that if the error above was true, there should be something interesting to see in the drop folder. I navigated there and inspected the files. They were all there, but the file versions were not correct: they were all still pointing to the VS 2013 versions (V12.0.x).
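
If you ever need to do the same check, a quick sketch like this prints what's actually sitting in the drop (the folder path is a placeholder):

```csharp
using System;
using System.Diagnostics;
using System.IO;

// Lists the file version of every DLL in the drop folder. Anything stamped
// 12.0.x is still a VS 2013 assembly that snuck into the drop.
class DropFolderAudit
{
    static void Main()
    {
        const string dropFolder = @"\\server\share\TFSDrop\QAMain\latest"; // placeholder path

        foreach (var dll in Directory.GetFiles(dropFolder, "*.dll"))
        {
            var info = FileVersionInfo.GetVersionInfo(dll);
            Console.WriteLine($"{Path.GetFileName(dll)} -> {info.FileVersion}");
        }
    }
}
```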



How I Fixed This
I figured that if my projects all contained references to the correct versions of the referenced assemblies, I'd be all good. For 90% of the troubleshooting time I focused on the one big project (ATLAS) and spent a lot of time messing with references; even after removing, cleaning, re-adding, checking in, building and running our test suites, I still could not get the proper versions. I then realized that more than one project was being built by the build process that was dropping dependencies into our drop location. After checking that project out and updating its references, I was finally able to get the tests added to the test run.


3. Lack Of Support For VS2015 On Lab Environment

So then I really got excited...I was able to get all the proper versions of the files I needed into the drop location, which theoretically meant I could push them all and run my tests, right? Nope. Still having problems. This time it was one I was completely unfamiliar with...

"Error adding test case [100372] to test run: Unable to load the test container '\\mi\DFS\TSI-ArchivesTemp\TFSDrop\Area51\QAMain\QAMain_20151228.3\atlasguitests.dll' or one of its dependencies. Error details: System.TypeLoadException: GenericArguments[0], 'Microsoft.VisualStudio.TestTools.UITesting.WpfControls.WpfControl', on 'CodedUIExtensionsAndHelpers.PageModeling.PageModelBase`1[T]' violates the constraint of type parameter 'T'.
System.TypeLoadException: GenericArguments[0], 'Microsoft.VisualStudio.Test"

I was completely flabbergasted by this error and had no idea what to do about it. I referred to my dev buddy, who basically told me to google it. I mean, at this point I had googled the heck out of this whole process, so this advice did not seem like it would help...

How I Diagnosed This

But after googling, I did find something. Well, specifically, he did. We found this post: http://blogs.msdn.com/b/gopinath/archive/2015/02/27/test-agents-support-for-visual-studio-2015.aspx?CommentPosted=true#commentmessage. And that post got us back to a running state. But I was not happy about it. Basically, what the post says is that VS 2015 did not ship updated test agents, which means our test lab environment just won't work with V14.0 assemblies. The gump of the fix is that we have to trick the assemblies being shipped to the test lab environment into looking like V12.0 (VS 2013) assemblies; only then will the tests run. The sketch below shows roughly why the version mismatch surfaces as that constraint violation.
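
Here's the rough shape of what's going on, with stub types standing in for the real Coded UI classes (the real PageModelBase lives in CodedUIExtensionsAndHelpers; this is a simplified, hypothetical sketch):

```csharp
// Stub stand-ins for the real Microsoft.VisualStudio.TestTools.UITesting types.
public class UITestControl { }
public class WpfControl : UITestControl { }

// Simplified shape of the package's base class: T must derive from UITestControl.
public abstract class PageModelBase<T> where T : UITestControl { }

// Compiles (and loads) fine here, because WpfControl and UITestControl come
// from the same place. In the lab, the page models were bound to one version
// of UITestControl while the agent loaded WpfControl from another. The CLR
// treats same-named types from different assembly versions as unrelated, so
// the constraint check fails at load time with the TypeLoadException above.
public class SomePageModel : PageModelBase<WpfControl> { }
```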

How I Fixed This
I followed the instructions in the post above and pasted a bunch of assembly binding redirects into the app.config of each project I'm working on. I still have a few to fix (here's looking at you, Nexsys), but all in all, the hack to get our tests running works: last night the biggest chunk of our tests (ATLAS) ran again.
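
For reference, the redirects look roughly like this. Treat it as a sketch only: the exact assembly list, versions and redirect direction come from the linked post, and the public key token here is my assumption of the usual one for the Visual Studio test assemblies, so verify against your own references:

```xml
<!-- app.config sketch: make a V14.0 (VS 2015) test assembly reference resolve
     as V12.0 so the 2013-era test agents can load it. Add one
     <dependentAssembly> entry per test assembly named in the post. -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.VisualStudio.TestTools.UITesting"
                          publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
        <bindingRedirect oldVersion="14.0.0.0" newVersion="12.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```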

What I've Learned From the Experience

1. Microsoft makes mistakes, even the new Microsoft

It seems that MS forgot to update the test agents for VS 2015. I'm not sure why; perhaps it's because not much noise is made by developers about GUI testing? I don't know too many that love that domain...but still, Coded UI and the test lab are supported MS technologies, so it seems to me they just didn't get the love that the rest of the dev ops system received in this latest update. It's clear to me that this is the case, and MS needs to fix it. As we say: every client, every time, no exceptions, no excuses. Even if it is the fat ugly kid in the MS family of test offerings.


2. Perhaps it's good to explore other options

Being tied to one testing platform left our test suite non-functional for about a week. I understand that in the long term that is not such a huge deal, but it seems that our automation goes wherever MS goes, and I'm not sure I like that. Now, to be fair, I don't think Visual Studio will ever disappear, but Coded UI may not get as much love down the road. For the short term I have a few things on the radar, but for the long term, having the ability to add other tools to our arsenal is a must.

That's all for now, quality fanatics.

Over and Out.

Interesting Links Mentioned Above
1. http://blogs.msdn.com/b/gopinath/archive/2015/02/27/test-agents-support-for-visual-studio-2015.aspx?CommentPosted=true#commentmessage (The hack)



Visual Studio 2015: The Good Things For Quality


Background

On July 20, 2015, Microsoft announced the official RTM (Release To Manufacturing) of the newest version of VS (Visual Studio). After seeing a sneak peek at the Microsoft Build conference in April of this year, I was really pumped to check it out. I played around with it a bit before the RTM via some Azure VMs, but never actually did "work" work with it. After letting it bake a bit until Update 1 (November 2015), I decided it was time to take the leap. I was also encouraged by the fact that our team's build platform (TFS, Team Foundation Server) was going to be upgraded to the 2015 version.

From a QA perspective, we mainly utilize VS as our IDE (Integrated Development Environment), TFS as our build process, MTM (Microsoft Test Manager) as our Test Centre and Lab Management (an MTM offshoot) as our test lab management tool.

So as you can see, I was excited to get the hotness that was VS 2015, as the rest of the mentioned tools would follow.


Why Do It?

I really tried hard not to be swayed by the MS (Microsoft) propaganda I saw at Build 2015. There were sessions, t-shirts, stickers, etc. enticing all of us to upgrade as soon as possible. There were Twitter feeds showing off the new features; all seemed rosy. I tried really hard to aggregate the benefits. Below you'll find a list of official and unofficial (but useful) sources describing the improvements in VS 2015. From a dev perspective, most seem shiny, new and awesome in general.
 

The Official Improvements: 

-https://msdn.microsoft.com/en-us/library/bb386063.aspx
-https://www.visualstudio.com/en-us/products/compare-visual-studio-2015-products-vs.aspx
-https://www.youtube.com/watch?v=CrRQKIXGq4Y

The Skinny:

So if you looked at the above links, you'll think the same as I did: yea, there's a lot of new stuff! VS is truly becoming the poster child of "The New" Microsoft. Seemingly gone are the days of Steve Ballmer and the "you're with us because you have to be" mindset. Satya Nadella (MS CEO) has truly attempted to change the course of the ship, creating an open source movement which strives for extensibility and transparency, and VS is no exception.

Some examples of the new mindset in VS 2015 (as features)

1. Support for multiple platforms (Interesting, from a Dev Perspective)

You no longer have to develop only for Windows. You can use VS 2015 to write native apps for Android, iOS or Windows. VS 2015 also ships with frameworks which let you write mobile apps in C# (Xamarin) or HTML/JavaScript (Cordova) and ship them on iOS, Android or Windows.


2. Plugins (Interesting for all!)
To allow for better package management, the NuGet package management ecosystem has been extended and revamped. Additionally, as of VS 2015 Update 1, an online ecosystem for VS plugins has been created (https://marketplace.visualstudio.com/). The most important part of the marketplace is that you can create your own plugins for VS. Imagine that!



3. Roslyn (Interesting for all, pertinent to quality folks too!)
Another huge difference exemplifying the new MS mindset: open-sourcing the core .NET compiler platform (Roslyn). The fact that MS now gives anyone the ability to branch and use its .NET compiler on most platforms is kind of bonkers. For so many years, this was effectively the core competitive advantage in the .NET space. Now it's completely transparent as to what is going on. What does this mean to us, though? Do I really care that I can write my own compiler for .NET? As a geek, sure; realistically, as a Software Quality Engineer, not really. What I do care about, though, are some of the features which are part of the move to open source the compiler.


 
4. Roslyn Feature: Analyzers
Basically, you can now write your own set of rules to be enforced as part of your development process. The entire process is really well described at https://msdn.microsoft.com/en-us/magazine/dn879356.aspx, but in short, you can enforce a set of dev rules, similar to how tools such as FxCop do it. Why would we want to do this? Well, say we want to enforce a set of naming rules for specific items: we can. Want to ensure a property of type List is always named "x_List"? You can do that. Want to ensure a property of type model is called a "model"? You can do that. By creating an analyzer you effectively provide a set of rules to follow, which won't allow compilation if broken. Alternatively, breaking the rules can just produce warnings, but really, if you're going to set the rules, enforce them :) A minimal sketch of such an analyzer follows below.
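
Here's a hypothetical analyzer enforcing the "_List" suffix rule from above. Everything in it (the diagnostic ID, the rule, the names) is illustrative rather than taken from any real ruleset:

```csharp
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;
using Microsoft.CodeAnalysis.Diagnostics;

// Flags List-typed properties whose names don't end in "_List".
[DiagnosticAnalyzer(LanguageNames.CSharp)]
public class ListNamingAnalyzer : DiagnosticAnalyzer
{
    private static readonly DiagnosticDescriptor Rule = new DiagnosticDescriptor(
        id: "QA0001",
        title: "List property naming",
        messageFormat: "Property '{0}' is a List and should be named with a '_List' suffix",
        category: "Naming",
        defaultSeverity: DiagnosticSeverity.Error, // fail the build; enforce, don't warn
        isEnabledByDefault: true);

    public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics
        => ImmutableArray.Create(Rule);

    public override void Initialize(AnalysisContext context)
    {
        // Look at every property declaration in the compilation.
        context.RegisterSyntaxNodeAction(AnalyzeProperty, SyntaxKind.PropertyDeclaration);
    }

    private static void AnalyzeProperty(SyntaxNodeAnalysisContext context)
    {
        var property = (PropertyDeclarationSyntax)context.Node;
        var type = context.SemanticModel.GetTypeInfo(property.Type).Type;

        // Only inspect properties whose declared type is List<T>.
        if (type == null || type.Name != "List")
            return;

        if (!property.Identifier.Text.EndsWith("_List"))
        {
            context.ReportDiagnostic(
                Diagnostic.Create(Rule, property.Identifier.GetLocation(), property.Identifier.Text));
        }
    }
}
```

Drop that into an analyzer project, reference it from a solution, and any List-typed property missing the suffix fails the build with QA0001.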


5. Directly Impacting Features: Unit Test Generation (IntelliTest)
Another feature of VS 2015 which will directly impact the quality of what we're writing is the newly shipped IntelliTest feature. The description of this feature can be found at https://msdn.microsoft.com/en-us/library/dn823749.aspx, but basically, imagine a way to generate unit tests. Granted, the generated tests are not of high complexity, but they do give you a quick way of providing quality around your specific implementation. I have not played with this feature yet, so I cannot provide a personal perspective, but it's definitely on my list of things to do.

6. Directly Impacting Features: Performance Testing Upgrades
Again, new in Visual Studio 2015 is the direct integration of Application Insights. I believe this is not a new feature to the VS suite, but it has been included by default for the first time in VS 2015. The feature is fully described at https://www.visualstudio.com/en-us/get-started/test/get-performance-data-for-load-tests; basically, it allows you to identify bottlenecks within your application's code while load is being run against it. I have not used this feature yet, but look out for a future blog post about it, since I plan to do so in my series of ventures into performance testing.


7. Directly Impacting Features: Awesome Debugging (e.g. Lambdas in the Locals Window)
In Visual Studio 2015, Microsoft has drastically improved debugging. Thanks to the compiler improvements, one can now evaluate lambda statements while the code is running. This means that you can enter lambda expressions in the Watch window and have them evaluated.

This is a great improvement for us as automators dealing with hand-coding, since lambda expressions are used all the time in our code. Remember how hard it was to figure out if that pesky string-to-date function would work in your model implementation? You no longer have to set variables to see the result. Simply use the Watch window.
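
For example (the helper and data here are hypothetical), set a breakpoint and evaluate the lambdas right in the Watch window:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class WatchWindowDemo
{
    // Hypothetical string-to-date helper, like one you'd find in a page model.
    static DateTime? ToDate(string s)
    {
        DateTime parsed;
        return DateTime.TryParse(s, out parsed) ? (DateTime?)parsed : null;
    }

    static void Main()
    {
        var raw = new List<string> { "2015-12-29", "not-a-date", "2015-06-19" };

        // Break on the line below, then in the Watch window type, e.g.:
        //   raw.Select(s => ToDate(s)).ToList()
        //   raw.Count(s => ToDate(s) != null)
        // Pre-2015 the debugger refused lambdas outright; in VS 2015 both
        // expressions evaluate in place, no temp variables needed.
        Console.WriteLine(raw.Count);
    }
}
```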

The other thing I noticed, which was pretty awesome, is the improved ability to jump back and forth in running code while debugging. Apparently this, again, is due to the way the compiler handles debugging.

How Do You Do It?

Welp, obvy you need an MSDN subscription if you want VS 2015 for enterprise use. If you don't have one, you can always download the Community edition (https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx). With the Enterprise edition you get the full suite of all the goodness; with the Community edition you get a bit more of a basic environment. If you're interested in all the feature differences, check out the product comparison link in the interesting links section.

What Are The Implications?
So, Visual Studio 2015 is the hotness, right? It's got all the things for all the environments that matter...right? It's the new Microsoft, right? Everything works all the time, right? FALSE. There are some pitfalls, specifically when it comes to test lab management, which is something very near and dear to my heart. But since this blog post is getting really long-winded, I'm going to wrap it up and tell you about the smelly pieces I've run into in the next one.

Over and Out.



Interesting Links Mentioned Above
1. VS Features:
-https://msdn.microsoft.com/en-us/library/bb386063.aspx
-https://www.visualstudio.com/en-us/products/compare-visual-studio-2015-products-vs.aspx
-https://www.youtube.com/watch?v=CrRQKIXGq4Y
2. The Skinny
-https://marketplace.visualstudio.com/ (Plugin Marketplace)
-https://msdn.microsoft.com/en-us/magazine/dn879356.aspx (Analyzer)
-https://msdn.microsoft.com/en-us/library/dn823749.aspx (Intellitest)
-https://www.visualstudio.com/en-us/get-started/test/get-performance-data-for-load-tests (Application Insights)
3. VS 2015 Community
-https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx

Tuesday, December 15, 2015

My Foray Into Ensuring Applications Behave Well...Part 1: What Is Load Testing?!?!!


Don't Be A Donkey...Load Test Your Cart!


Hey there friends! A little while ago, I decided that in order to expand my quality perspective, I needed to get involved in some sort of performance testing. I always had a hunch that it was important, but I'd always focused on automated testing which ensured the application under test behaved correctly. Oddly enough, I never really focused on ensuring it ran well. Silly me. Anyway, a few months ago a few of us on my current QA team went on a performance kick, so I decided it was time to get involved in ensuring our apps behaved correctly and well! The timing couldn't have been better, as our team was in the later stages of releasing some externally facing applications, which really needed to behave correctly and well.

What Is Load Testing
After being able to dedicate some time to figuring out what performance testing really was, I started researching. I deduced that performance testing was essentially two things: figuring out how to measure how applications behave under certain conditions, and then exacerbating those conditions. Boy, was I excited. I mean, I think one of the best parts of testing for quality is staging scenarios which stress, bend, twist, prod and poke our apps. So to be able to learn how to do this to the max seemed like a lot of fun.

Load testing emulates high user load on an application by means of automation. A load test is basically made up of two components: a performance test, and repetition to achieve load.

Performance Test
The first portion of the load test (the performance test) essentially emulates a single user's actions through HTTP requests. HTTP requests are what browsers use to communicate with websites, and the performance test (once written or recorded) replays captured HTTP requests.



In order to write or record a performance test, one captures the traffic resulting from regular user actions. The traffic is then massaged so it can be scaled. Performance testing tools such as SOASTA's CloudTest or Microsoft's Visual Studio performance testing suite allow us to do this relatively easily. When writing a performance test, you will need to think about what it means to repeat the same actions, but possibly with a different user or with different security credentials.


Some of the things we may want to vary per test could be user login, password, or different links to click. The tools we work with give us the ability to parameterize our HTTP requests, so we can replay them with input parameters, such as a list of orders to accept. Our goal in creating a performance test should be a script which can be replayed many times with different parameters; a rough sketch of the idea follows.
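
This is not SOASTA's or Visual Studio's format, just the underlying idea in plain HttpClient: one recorded HTTP action, parameterized so each replay can use different credentials. The URL and form field names are placeholders:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

// One parameterized "step" of a performance test: post a login and time it.
class LoginStep
{
    static readonly HttpClient Client = new HttpClient();

    public static async Task<TimeSpan> RunAsync(string user, string password)
    {
        var started = DateTime.UtcNow;

        var response = await Client.PostAsync(
            "http://app.example.test/login",           // placeholder URL
            new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["username"] = user,                   // parameterized per iteration
                ["password"] = password,
            }));

        response.EnsureSuccessStatusCode();
        return DateTime.UtcNow - started;              // crude response-time measurement
    }
}
```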
                                                                     

How To Turn On The Heat (Load)
So after we establish our performance test and are able to run it in single iterations, we need to scale it. This is where the true power of our load testing tools comes into play. Most load testing tools give you the ability to utilize the power of multiple machines to generate a great deal of HTTP requests. Modern load testing tools even let us use cloud infrastructures such as MS Azure or Amazon Web Services to create extensive load, and even control which region of the world it comes from!

This really is magic: we can now set up a load test to not only simulate a user's HTTP traffic, but also simulate where it comes from! We can even set the user agent strings on the HTTP requests to see how our application behaves for different simulated browsers or devices.

The load test is effectively a scaling-out of one performance test, and the fact that we can have a script and framework work together to do this at scale is amazing.
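
In miniature, that scaling looks something like this sketch, which reuses the hypothetical LoginStep from above. Real tools fan the iterations out across agent machines or cloud regions; locally, concurrent tasks make the point:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

// Run N "virtual users" concurrently against the same parameterized step
// and summarize their response times.
class MiniLoadTest
{
    static void Main()
    {
        const int virtualUsers = 100; // arbitrary number for the sketch

        TimeSpan[] timings = Task.WhenAll(
            Enumerable.Range(0, virtualUsers)
                      .Select(i => LoginStep.RunAsync("user" + i, "secret")))
            .GetAwaiter().GetResult();

        Console.WriteLine(
            $"avg response: {timings.Average(t => t.TotalMilliseconds):F0} ms");
    }
}
```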





The Most Important Piece
We now have a good idea of what a load test is and what it consists of. But the real question is: what value do we get from this? What should we be watching for? A load testing tool will give us a great deal of metrics and graphs and charts and shiny things. But what should we be doing with all this info? From my experience, and based on advice from more experienced performance testers, we need to know what we expect to see, so we know when we see something different! There are a few ways of doing this, and one sure-fire way is to establish a baseline: run a test, record some metrics, and then judge future runs against those.

From my limited experience, I've noticed that for high-load thresholds I tend to look at request send rate vs. request response time, and for performance measurement I tend to look at page load time. Using these two metrics, I can get a good idea of whether my application is really tanking when stressed, and of how well it will behave for the user. There are definitely other interesting measures, but at a high level I believe these two are a great starting point.




Alright crew, in this introductory post we talked about what load testing is, what its components are and what is important in a load test...now I bet you want to get into specifics? Welp, stay tuned: next up, we'll talk about some specific load testing tools and their advantages and pitfalls.

Until next time.



Thursday, August 27, 2015

What QA Team Work Means And How To Accomplish It

Recently, I've come to the conclusion that I finally have a concrete example of what QA teamwork is all about. In the past, that phrase represented something fairly elusive and fluffy to me. Sure, I understood that everyone should pull together by communicating, by executing test cases in a pre-defined test plan, etc. etc. etc., but this week I've come to a new realization. Let me exemplify it for you.

As part of our team's new-age QA approach, we breathe, live and preach automation, and more importantly, being technically able to implement meaningful, maintainable and reproducible automated tests when we choose to do so. With this mantra in mind, our team takes on immense challenges, which are supported by training initiatives. We started with a small group of our team members and attempted to educate them to the point where they were self-sufficient with respect to technical analysis and issue resolution. For the first few months, they relied heavily on our senior developer, who happened to be our teacher and mentor. But after a while, and mainly due to the mentor leaving for greener pastures, something exciting and amazing started to happen. We continued to train more people, and we started to develop a community.

Now, I use the word community because I believe a community transcends a traditional team: in a community, everyone benefits; in a team, the goal is achieved. Our team members started to truly rely on their own abilities to figure out problems, and this is where the transcendence from team to community began. A team has a leader; in a community, everyone is equal. That is what I believe is now happening, and that is what I think is most beneficial to a quality organization. If I help with your problem, I gain more experience, and I gain your trust that when I have a problem, you will help me. The more people have these experiences, the more people believe in the power of the community, the more each individual's confidence grows, and Together Each Achieves More...think about that last point :)

As our team heads into a fairly challenging and busy fall, I am proud to say that I am no longer the leader of a team, but a part of a problem-solving community!

Charge on!

Friday, June 19, 2015

Unit Testing...Not A QA Activity? #BuckTheNorm

Sooo...these past few weeks I've been getting down and dirty with a type of testing that is new to me, but not new testing (see how I did that? ;)). The type of testing that, in my opinion, should be the start of the quality cycle during development. What, say you, am I speaking of? Unit testing, of course! Finally, after months of being too "tied up" and "busy", I had the opportunity to devote at least 6 hours per week to learning how to unit test stuffs.

This opportunity was created by the launch of a pilot class for our advanced "Top Gun" training program. Being part of the pilot, I thought I needed to prove that unit testing (and TDD in a greater context) could be taught to people who did not come from a traditional developer background...aka yours truly.

Is Unit Testing Hard

Welp, the short answer is yes. The long answer is: yes initially, but not in the long term. My first foray into unit testing focused on an area of our team's product which decides whether a thing can fall into a queue. So essentially, we are testing to ensure that the method which decides if something should be in a queue works.

The Hard Part

Unfortunately, the thing we are testing is not well suited for testing, due to a lack of abstractability through traditional means such as interfaces. So we could not new up our own real thing and pass it into our queue-checker method. What to do? No useful interfaces meant we were in a pickle.

Mocking & Shimming To The Rescue

As a small team focused on attempting to unit test something that is hard to abstract, we decided to create a shell of the real thing and fill it out as necessary through a mocking framework. I won't get into the nitty-gritty details, but essentially, mocking out a complex object allows you to create a scaffolding of the properties and methods that belong to the object, which you fill in only as deep as the code under test needs. For example, if you have an order, which requires a transaction, which requires a list of requirements...you'd need to provide your shell with all of those elements, but nothing else. The powerful thing is, by creating the shell and providing the information necessary, you understand what the expected outcome of the method call should be, and therefore are able to ensure it is or isn't working properly. A rough sketch of the approach follows.
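
Here's a sketch using Moq; the framework choice and all the type names are illustrative (the post only says "a mocking framework", and the real objects are far hairier), but the shape of the scaffolding is the point:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

// Hypothetical stand-ins for the real (hard-to-abstract) domain objects.
// Members are virtual so the mocking framework can override them.
public class Requirement { }
public class Transaction
{
    public virtual IList<Requirement> Requirements { get; set; }
}
public class Order
{
    public virtual Transaction Transaction { get; set; }
}

// A sketch of the method under test: decides if an order belongs in the queue.
public static class QueueChecker
{
    public static bool ShouldEnqueue(Order order)
        => order.Transaction != null && order.Transaction.Requirements.Count > 0;
}

[TestClass]
public class QueueCheckerTests
{
    [TestMethod]
    public void OrderWithRequirements_IsQueued()
    {
        // Build only the scaffolding the method actually touches: the order,
        // its transaction, and the transaction's requirement list.
        var transaction = new Mock<Transaction>();
        transaction.Setup(t => t.Requirements)
                   .Returns(new List<Requirement> { new Requirement() });

        var order = new Mock<Order>();
        order.Setup(o => o.Transaction).Returns(transaction.Object);

        Assert.IsTrue(QueueChecker.ShouldEnqueue(order.Object));
    }
}
```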

The Easy Part

So after figuring out which pieces needed to be filled out in our tests, we were able to abstract a few of the repeatable bits away, which made us faster and created an overall repeatable and understandable process. Which, to me, shattered the stigma of the hard unit test.

What This Means To Overall Quality

Think about it this way...as a quality fanatic, I want to be involved in all aspects of the application stack: from how the code is written, to how the pieces talk to each other, to how the GUI works. Unit testing gives me the technical ability to provide quality at the very lowest technical level, and at the earliest possible time with respect to when the application is written.

Additionally, writing unit tests forced me to understand, at a very granular level, how our application logic really works. This is a very powerful idea. I can now understand what methods drive our queues!

Summary
I believe that all quality champions should get involved in unit testing, as that is where the meat of your application logic lives. You will help your developers code faster (since they won't have to write all the unit tests themselves), you will help your team achieve higher quality (since tests will be written earlier and be more granular), and you will definitely help yourself by becoming more technically savvy.

But most importantly, you will keep striving to the ultimate Quality Champion state...

Friday, June 5, 2015

What Losing A Senior Developer Means To A Quality Professional

This week, my team experienced the loss of a senior developer mentor. The loss (not actual loss loss, just leaving the company) came as a shock to most, and I noticed a major deflation in our team's morale. Losing the person who wrote our framework, pretty much gave us our GUI testing technical direction, and solved many of our problems gutted everyone, even me as a leader. For a day or so I thought to myself: man, it's really going to be hard to figure out the next big thing in testing. "Viper" was the dude who really set us off on the path we are going down now.

With his mentoring, it was amazing to watch our team evolve from arguing about which tool to use for GUI automation to "bickering" about which interfaces to implement. What I'm getting at is that while the loss of Viper will be hard to swallow, I have already moved on, and I need to take the next steps so our team can do so too. I have already noticed that although Viper has departed, our team is moving on, and this is an opportunity to form new relationships with other awesome developers.

It is imperative for us as Senior Quality Professionals (because I really believe automation professionals are just that) to form deep relationships with more than just one developer.

The loss of one senior developer cannot be viewed as a disaster. It has to be viewed as an opportunity for growth and a push to form new relationships with other devs.

Today, while at a team standup, I pushed our team to push their devs to review automation code. We are forging ahead. We will get stronger and deal with the loss of one senior dev mentor/ally by acquiring others.

So while Viper will be missed, Iceman and the rest of Top Gun will continue to forge The Fortress.

Melodramatic Iceman Out.


My Build 2015 Gump

So, a few months ago I went to \\Build\ 2015. Honestly, it was the best conference I have been to so far in my professional career. It was filled with new products, fun, and innovation. I learned a lot about new technologies, and even though testing is close to my heart, I explored some new dev territory. The Build conference is not only for developers, you know...even though their focus was...developers, developers, developers. I would love to describe everything that was near and dear to my heart in text, but I had the privilege of giving a tech talk to my teammates, and so decided to post the presentation for everyone to view @ http://tinyurl.com/MyBuildGump. If you are interested in what I thought of \\Build\ 2015 and what I learned, check.it.out. That's all for today. Over and Out.

Friday, May 29, 2015

My Introductory Rant

Welcome! I'd like to introduce myself and give a bit of background as to why I'm starting this blog and what I hope to accomplish. My passion for learning new things and asking questions has led me to currently focus on innovation in the Software Quality Engineering realm. I spent a bit of time in manual testing, but then I decided it was boring and tried my hand at automation.

So that is what I am currently trying to master. More specifically, I am trying to master being a mentor to younger SQEs (Software Quality Engineers) while perfecting my own skills.

So far, I've learned that no matter how technical you think you are, no matter how many mad skills you think you've mastered (in the automation space), there are always things to learn. This is why I'm sharing my experiences: people sharing theirs with me has taught me so much that I'd like to give back.

So Why Should I Continue Reading?
Welp, in this entry I will talk about why, as an automation engineer, you need your developers to speak the same language as you, which I believe is an essential base-layer requirement for a successful automation team and a successful automation system to exist.

How To Persuade Your Devs That Automated Testing Is Interesting
Easy. SPEAK THE SAME LANGUAGE! What does this mean? Be interested in their domain! In my experience, developers take to testers if, and only if, testers are interested in the development. To me, this was a prerequisite which, regardless of how technically uncomfortable it was, I had to overcome.

Becoming Technically Apt, from Scratch
I thought to myself: if I am to understand the dev world, I need to immerse myself in it. So I did. I attended code reviews, wrote high-level scripts, and tried to get more experience while achieving my test objectives. Cue the automation. At one of my jobs, even though my devs wrote in C++ and I wrote in VBScript and JavaScript, I educated myself in the tenets of OOP. I took an interest in the nitty-gritty details of implementation and slowly became more and more comfortable with the fairly complex architecture of the app we were developing. These activities also had a positive side effect on my manual tests, since I was able to churn out more specific, high-value tests. As I started understanding concepts and activities, I also started seeing gaps in my automated scripts and began re-writing them. I noticed that as my scripts became more complicated, I required more troubleshooting help from my devs, and we became closer co-workers and started drinking beer and watching hockey together, which was awesome! Through the above activities, I established a relationship with the dev team which allowed me to pick their brains about automation implementation, and which gave me a deeply technical understanding of the application I was testing.

If The Opportunity To Use The Same Language Exists, Jump On It!
But, as with all good starting points, you accelerate quickly and eventually plateau. When I did, I decided I needed a change of scenery and eventually switched jobs. I remember the first week of my new job: it dawned on me that the automation which was promised to exist during my interview process did not, and I needed to take charge of building it. So, me being me, I initiated the process. This, I thought, was an awesome opportunity to do things "the right way". I did a bit of research and realized that I had a golden opportunity to establish processes focused on quality, via automation, in the same language the app was being developed in. I pushed this concept and, despite a bit of resistance to change, was able to convince my team to write our automation in C#, just like our devs were doing for the app we support. This decision turned out to be a great victory for two big reasons. First, we now spoke the same language as the devs, and our entire team could use any of our devs as a resource. Second, there was a great amount of room for improvement in our testing code and approaches. The second point became very evident as I kept evangelizing the automation process and devs gave their feedback on testing. We were able to build frameworks to support our efforts and scale the introduction of automation to non-technical team members, who before the framework were very scared of writing automation. The amount of change we have been able to foster has been incredible, and I attribute it to the fact that our devs support automation, since we are both talking C#.

So What Should I Take Away?
Welp, from my experience so far: whether you are a novice tester or an experienced tester, a conversation with a dev who wrote the app you are working on is invaluable. And what I think you should do is get on the same page as your dev team by learning more technical implementation detail, while automating in the same language the app is written in. It just so happens the last two go hand in hand.

Happy relationship building, and happy automating.

Maciek
Over and Out.