Tuesday, December 29, 2015

Upgraded To Visual Studio 2015: What It Meant For Me.


Background

In a previous post I talked about the benefits of VS 2015, and said I was ready to take the plunge. Welp, my hand was forced about three weeks ago. Our build server was upgraded to TFS 2015, which meant it was the perfect time to upgrade my IDE as well. I did so about two weeks ago.

As part of the upgrade testing, I wanted to ensure that I could effectively write a test in VS 2015, compile it, check it in, associate it with a test case, and be able to run the test case in the test lab environment.

The Problems

The gist: I needed to update a crap ton of dependencies. At the end of the day, automated test cases rely on multiple assemblies to run. For example, test attributes and assertion classes all stem from the UnitTest and UITesting assemblies, so without them, our tests don't run. More specifically, without the correct versions of those assemblies, our tests don't even get added to the test runner!

1. Dependencies Built Into CodedUIExtensionsAndHelpers

My initial problem after upgrading to VS 2015 and attempting to run a test case was that I could not run any tests on my local machine. More specifically, I could not even hit a debug breakpoint. The test suite was building fine, but as soon as I'd go to run it, no breakpoints were hit and the test suite seemed to do nothing at all.

How I Diagnosed This
How I Diagnosed This
At first I thought something was wrong with my test runner, so I started trying to run test cases via the ReSharper test runner. This did not help. I now saw that ReSharper treated my tests as inconclusive, but it still did not run them. This problem was fairly hard to diagnose, as I was not getting any explicit exceptions or obvious errors. After some consultation with one of my senior dev buddies, I was able to figure out the problem using the call stack window: I could see where the Coded UI test script started crashing. It turns out that the package we use for providing helper methods and framing for Coded UI elements (CodedUIExtensionsAndHelpers) had a dependency on test DLLs which were compiled using VS 2013. This put me in a bit of a pickle, as I did not really own the source code to that package.

How I Fixed This
How I Fixed This
Because the developer who owns these packages is a former colleague of ours, I emailed him, texted him and called him up to see if he could push a version of the extensions package compiled with VS 2015. I waited a few hours (lol), then a day or so (perhaps a weekend?), and after no response got impatient. I figured that I could pull down the source code (since it's on GitHub @ https://github.com/spelltwister/CodedUIFluentExtensions), recompile it, and push the DLL along with our code. After a lot of promising my dev buddy (who was helping me) that I would push it the proper way once recompiling proved to work, I was able to show that after recompiling with VS 2015 and updating all the references in the CodedUIExtensionsAndHelpers package, I could build locally and debug. I then promptly forked the source of CodedUIExtensionsAndHelpers to my own GitHub (https://github.com/mkonkolowicz/CodedUIFluentExtensions), recompiled it, repackaged it and updated all the references so I could pull the DLL from GitHub. The end result was that the package no longer had dependencies on VS 2013 assemblies, and I could debug locally.





2. Dependency Versions In The Checked In Projects

After figuring out that I could build locally, I assumed that I would be able to run the nightly runs with no problems. After all, everything else should just upgrade itself, right? Welp, I was wrong, and spent a long time figuring that one out. I was able to build and run locally, but all my test runs on the test lab environment kept failing 100% of the time, with error messages specifying that a specific version of an assembly was still missing (see the screenshot below for the specific message).


How I Diagnosed This
The build process for QAAutomation takes all the code necessary for our test cases (which includes dependency DLLs such as UITesting.dll) and drops it into a shared folder, from where it is used by the test agent during the test run. I knew that if the error above was true, there should be something interesting to see in the drop folder. I navigated there and inspected the files. They were all there, but the file versions were not correct: they were all still pointing to the VS 2013 versions (V12.0.x).



How I Fixed This
I figured that if my projects all contained references to the correct versions of the referenced assemblies, I'd be all good. For 90% of the troubleshooting time I focused on the one big project (ATLAS) and spent a lot of time messing with references; even after removing, cleaning, re-adding, checking in, building and running our test suites, I still could not get the proper versions. I then realized that more than one project was being built by the build process that was dropping dependencies into our drop location. After checking that project out and updating its references, I was finally able to get the tests added to the test run.


3. Lack Of Support For VS2015 On Lab Environment

So then I really got excited... I was able to get all the proper versions of the files I needed into the drop location, which theoretically meant I could push them all and run my tests, right? Nope. Still having problems. This time, it was one I was completely unfamiliar with...

"Error adding test case [100372] to test run: Unable to load the test container '\\mi\DFS\TSI-ArchivesTemp\TFSDrop\Area51\QAMain\QAMain_20151228.3\atlasguitests.dll' or one of its dependencies. Error details: System.TypeLoadException: GenericArguments[0], 'Microsoft.VisualStudio.TestTools.UITesting.WpfControls.WpfControl', on 'CodedUIExtensionsAndHelpers.PageModeling.PageModelBase`1[T]' violates the constraint of type parameter 'T'.
System.TypeLoadException: GenericArguments[0], 'Microsoft.VisualStudio.Test"

I was completely flabbergasted by this error and had no idea what to do about it. I referred to my dev buddy, who basically told me to google it. I mean, at this point I had googled the heck out of this whole process, so this advice did not seem like it would help...

How I Diagnosed This

But after googling, I did find something. Well, specifically, he did. We found this post: http://blogs.msdn.com/b/gopinath/archive/2015/02/27/test-agents-support-for-visual-studio-2015.aspx?CommentPosted=true#commentmessage. And that post got us back to a running state. But I was not happy about it. Basically, what the post says is that VS 2015 does not yet have supported test agents, which means our test lab environment just won't work with V14.0 assemblies. The gist of the fix is that we have to trick the assemblies being shipped to the test lab environment into looking like V12.0 (VS 2013) assemblies; only then will the tests run.

How I Fixed This
How I Fixed This
I followed the instructions in the post above and pasted a bunch of assembly binding redirects into the app.config of each project I'm working in. I still have a few to fix (here's looking at you, Nexsys), but all in all, the hack to get our tests running works: last night the biggest chunk of our tests (ATLAS) ran again.
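For reference, the redirects look roughly like this. This is a sketch based on the post above, not a copy of it; the exact assembly names, public key tokens and versions should be taken from the post and from your own project references:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Redirect the V14.0 (VS 2015) assembly to V12.0 (VS 2013)
           so the 2013-era test agent can resolve it. -->
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.VisualStudio.TestTools.UITesting"
                          publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
        <bindingRedirect oldVersion="14.0.0.0" newVersion="12.0.0.0" />
      </dependentAssembly>
      <!-- ...repeat one dependentAssembly entry per affected test assembly... -->
    </assemblyBinding>
  </runtime>
</configuration>
```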

What I've Learned From the Experience

1. Microsoft makes mistakes, even the new Microsoft

It seems that MS forgot to update the test agents for VS 2015. I'm not sure why; perhaps it's because not much noise is made by developers about GUI testing? I don't know too many who love that domain... but still, Coded UI and the test lab are supported MS technologies, so it seems to me that they just didn't get the love that the rest of the dev ops system received in this latest update. It's clear to me that this is the case, and MS needs to fix it. As we say: every client, every time; no exceptions, no excuses. Even if it is the ugly duckling of the MS family of test offerings.


2. Perhaps it's good to explore other options

Being tied to one testing platform left our test suite non-functional for about a week. I understand that in the long term that is not such a huge deal, but it means our automation goes wherever MS goes, and I'm not sure I like that. Now, to be fair, I don't think Visual Studio will ever disappear, but Coded UI may not get as much love down the road. For the short term I have a few things on the radar, but for the long term, having the ability to add other tools to our arsenal is a must.

That's all for now quality fanatics.

Over and Out.

Interesting Links Mentioned Above
1. http://blogs.msdn.com/b/gopinath/archive/2015/02/27/test-agents-support-for-visual-studio-2015.aspx?CommentPosted=true#commentmessage (The hack)

Upgrading To VS 2015 in An Enterprise Environment: What It Means For A GUI Automator


Visual Studio 2015: The Good Things For Quality


Background

On July 20, 2015, Microsoft announced the official RTM (Release To Manufacturing) of the newest version of VS (Visual Studio). After seeing a sneak peek at the Microsoft Build conference in April of this year, I was really pumped to check it out. I played around with it a bit before the RTM via some Azure VMs, but never actually did "work" work with it. After letting it bake until Update 1 (November 2015), I decided it was time to take the leap. I was also encouraged by the fact that our team's build platform (TFS, Team Foundation Server) was going to be upgraded to the 2015 version.

From a QA perspective, we mainly utilize VS as our IDE (Integrated Development Environment), TFS as our build process, MTM (Microsoft Test Manager) as our Test Centre and Lab Management (an MTM offshoot) as our test lab management tool.

So as you can see, I was excited to get the hotness that was VS 2015, as the rest of the mentioned tools would follow.


Why Do It?

I really tried hard not to be swayed by the MS (Microsoft) propaganda I saw at Build 2015. There were sessions, t-shirts, stickers, etc. that enticed all of us to upgrade as soon as possible. There were Twitter feeds showing off the new features; all seemed rosy. I tried really hard to aggregate the benefits. Below you'll find a list of official and unofficial (but useful) sources describing the improvements in VS 2015. From a dev perspective, most seem shiny, new and awesome in general.
 

The Official Improvements: 

-https://msdn.microsoft.com/en-us/library/bb386063.aspx
-https://www.visualstudio.com/en-us/products/compare-visual-studio-2015-products-vs.aspx
-https://www.youtube.com/watch?v=CrRQKIXGq4Y

The Skinny:

So if you looked at the above links, you'll think the same as I did: yeah, there's a lot of new stuff! VS is truly becoming the poster child of "The New" Microsoft. Seemingly gone are the days of Steve Ballmer and the "you're with us because you have to be" mindset. Satya Nadella (MS CEO) has truly attempted to change the course of the ship, creating an open source movement which strives for extensibility and transparency, and VS is no exception.

Some examples of the new mindset in VS 2015 (as features)

1. Support for multiple platforms (Interesting, from a Dev Perspective)

You no longer have to develop only for Windows. You can use VS 2015 to write native apps for Android, iOS or Windows. VS 2015 also ships with frameworks which allow you to write mobile apps in C# but deploy them on iOS, Android or Windows (enter Xamarin and Cordova).


2. Plugins (Interesting for all!)
To allow for better package management, the NuGet package management ecosystem has been extended and revamped. Additionally, as of VS 2015 Update 1, an online ecosystem for VS plugins has been created (https://marketplace.visualstudio.com/). The most important part of the marketplace is that you can create your own plugins for VS. Imagine that!



3. Roslyn (Interesting for all, pertinent to quality folks too!)
Another huge difference exemplifying the new MS mindset: open sourcing the core .NET compiler platform (Roslyn). The fact that MS now gives anyone the ability to fork and use its .NET compiler on most platforms is kind of bonkers. For so many years, this was effectively the core competitive advantage in the .NET space. Now it's completely transparent as to what is going on. What does this mean to us, though? Do I really care that I can write my own compiler for .NET? As a geek, sure; realistically, as a Software Quality Engineer, not really. What I do care about are some of the features which are part of the move to open source the compiler.


 
4. Roslyn Feature: Analyzers
Basically, you can now write your own set of rules to be followed as part of your development process. The entire process is really well described at https://msdn.microsoft.com/en-us/magazine/dn879356.aspx. In short, it lets you enforce a set of dev rules, similarly to how tools such as FxCop do it. Why would we want to do this? Well, say we want to enforce naming rules for specific items: want to ensure a property of type List is always named "x_List"? You can do that. Want to ensure a property of type model is called a "model"? You can do that. By creating an analyzer you effectively provide a set of rules to follow, which won't allow compilation if broken. Alternatively, breaking the rules can just produce warnings, but really, if you're going to set the rules, enforce them :)


5. Directly Impacting Features: Unit Test Generation (Intellitest)
Another feature of VS 2015 which will directly impact the quality of what we're writing is the newly shipped IntelliTest feature. The description of this feature can be found at https://msdn.microsoft.com/en-us/library/dn823749.aspx. But basically, imagine a way of generating unit tests. Granted, the generated tests are not of high complexity, but they do give you a quick way of adding quality checks to your specific implementation. I have not played with this feature yet, so I cannot provide a personal perspective, but it's definitely on my list of things to do.

6. Directly Impacting Features: Performance Testing Upgrades
Again new in Visual Studio 2015 is the direct integration of Application Insights. I believe this is not a brand-new feature to the VS suite, but it has been included by default for the first time in VS 2015. This feature is fully described at https://www.visualstudio.com/en-us/get-started/test/get-performance-data-for-load-tests, but it basically allows you to identify bottlenecks within your application's code when run against the source code. I have not used this feature specifically, but look out for a future blog post about it, since I plan to do so in my series of ventures into performance testing.


7. Directly Impacting Features: Awesome debugging (ex. Lambda's in Locals Window)
In Visual Studio 2015, Microsoft has drastically improved debugging. Thanks to improvements in the compiler, one can now debug lambda statements as the code is running. This means that you can insert lambda expressions in the watch window and have them evaluated.

This is a great improvement for us automators dealing with hand coding, since lambda expressions are used all the time in our code. Remember how hard it was to figure out whether that pesky string-to-date function would work in your model implementation? You no longer have to set variables to see the result. Simply use the watch window.

The other thing I noticed which was pretty awesome is the increased ability to jump back and forth in running code. Apparently this is again due to the way the compiler handles debugging.

How Do You Do It?

Welp, obvy you need an MSDN subscription if you want VS 2015 for enterprise use. If you don't have an MSDN subscription, you can always download the Community edition (https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx). With the Enterprise edition you get the full suite of all the goodness; with the Community edition you get a bit more of a basic environment. If you're interested in all the feature differences, check out the third link in the interesting links section.

What Are The Implications
So, Visual Studio 2015 is the hotness, right? It's got all the things for all the environments that matter... right? It's the new Microsoft, right? Everything works all the time, right? FALSE. There are some pitfalls, specifically when it comes to test lab management, which is something very near and dear to my heart. But since this blog post is getting really long winded, I'm going to wrap it up and tell you about the smelly pieces I have run into in the next one.

Over and Out.



Interesting Links Mentioned Above
1. VS Features:
-https://msdn.microsoft.com/en-us/library/bb386063.aspx
-https://www.visualstudio.com/en-us/products/compare-visual-studio-2015-products-vs.aspx
-https://www.youtube.com/watch?v=CrRQKIXGq4Y
2. The Skinny
-https://marketplace.visualstudio.com/ (Plugin Marketplace)
-https://msdn.microsoft.com/en-us/magazine/dn879356.aspx (Analyzer)
-https://msdn.microsoft.com/en-us/library/dn823749.aspx (Intellitest)
-https://www.visualstudio.com/en-us/get-started/test/get-performance-data-for-load-tests (Application Insights)
3. VS 2015 Community
-https://www.visualstudio.com/en-us/products/visual-studio-community-vs.aspx

Tuesday, December 15, 2015

My Foray Into Ensuring Applications Behave Well... Part 1: What Is Load Testing?!?!!


Don't Be A Donkey...Load Test Your Cart!

My Foray Into Ensuring Applications Behave Well... Part 1: What Is Load Testing?!?!!

Hey there friends! A little while ago, I decided that in order to expand my quality perspective, I needed to get involved in some sort of performance testing. I always had a hunch that it was important, but I had always focused on automated testing which ensured the application under test behaved correctly. Oddly enough, I never really focused on ensuring it ran well. Silly me. Anyway, a few months ago, a few of us on my current QA team went on a performance kick, so I decided it was time to get involved in ensuring our apps behaved correctly and well! The timing couldn't have been better, as our team was in the later stages of releasing some externally facing applications which really needed to behave correctly and well.

What Is Load Testing
After being able to dedicate some time to figuring out what performance testing really was, I started researching. I deduced that performance testing is essentially two things: figuring out how to measure how applications behave under certain conditions, and then escalating those conditions. Boy, was I excited. I mean, I think one of the best parts of testing for quality is staging scenarios which stress, bend, twist, prod and poke our apps. So to be able to learn how to do this to the max seemed like a lot of fun.

Load testing emulates high user load on an application by means of automation. A Load test is basically made up of two components: a performance test, and repetition to achieve load.

Performance Test
The first portion of the load test (the performance test) essentially emulates a single user's actions through HTTP requests. HTTP requests are what browsers use to communicate with websites, and the performance test (once written or recorded) replays the captured HTTP requests.



In order to write or record a performance test, one captures the traffic resulting from regular user actions. The traffic is then massaged so that it can be scaled. Performance testing tools such as SOASTA's CloudTest or Microsoft's Visual Studio performance testing suite allow us to do this relatively easily. When writing a performance test, you will need to think about what it means to repeat the same actions, but possibly with a different user or with different security credentials.


Some of the things we may want to vary per test could be the user login, the password, or the links to click. The tools we work with give us the ability to parametrize our HTTP requests, so we can replay them with input parameters, such as a list of orders to accept. Our goal for creating a performance test should be a script which can be replayed many times with different parameters.
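To make the parametrization idea concrete, here is a minimal sketch in Python rather than in any particular load testing tool. The request template, field names and test users are all made up for illustration:

```python
from urllib.parse import urlencode

def build_requests(template, parameter_sets):
    """Expand one recorded HTTP request template into many
    parametrized requests, one per set of input parameters."""
    requests = []
    for params in parameter_sets:
        requests.append({
            "method": template["method"],
            "url": template["url"].format(**params),
            "body": urlencode({k: v for k, v in params.items()
                               if k in template["form_fields"]}),
        })
    return requests

# One captured login request, replayed for several hypothetical users.
template = {
    "method": "POST",
    "url": "https://example.test/login?region={region}",
    "form_fields": ["user", "password"],
}
users = [
    {"user": "qa1", "password": "p1", "region": "us"},
    {"user": "qa2", "password": "p2", "region": "eu"},
]
for r in build_requests(template, users):
    print(r["method"], r["url"])
```

Real tools record the template for you; the point is simply that one script plus a table of parameters yields many distinct requests.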
                                                                     

How To Turn On The Heat (Load)
So after we establish our performance test and are able to run it in single iterations, we need to scale it. This is where the true power of our load testing tools comes into play. Most load testing tools give you the ability to utilize the power of multiple machines to create a great deal of HTTP requests. Modern load testing tools even give us the ability to use cloud infrastructures such as MS Azure or Amazon Web Services to create extensive load, and even to control which region of the world it comes from!

This really is magic, as we now are able to setup a load test to not only simulate a user's HTTP traffic, but also simulate where it comes from! We can even set the user agent strings on the HTTP requests to see how our application behaves for different simulated browsers, or devices.

The load test, then, is effectively a scaling-up of one performance test, and the fact that we can have a script and a framework work together to do this at scale is amazing.
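The scaling step can be sketched in a few lines of Python. This is a toy stand-in for what the real tools do across many machines: the "performance test" below just sleeps instead of replaying recorded HTTP traffic, and all names are illustrative:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def performance_test(user_id):
    """Stand-in for one scripted user journey; a real test
    would replay recorded HTTP requests here."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate a request/response round trip
    return time.perf_counter() - start

def load_test(virtual_users, iterations):
    """Run the single-user performance test many times in
    parallel and collect every response time."""
    timings = []
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        for _ in range(iterations):
            timings.extend(pool.map(performance_test, range(virtual_users)))
    return timings

timings = load_test(virtual_users=10, iterations=3)
print(f"{len(timings)} requests, avg {sum(timings)/len(timings):.3f}s")
```

Swap the sleep for real request replay and the thread pool for a fleet of agents or cloud machines, and you have the shape of a load test.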





The Most Important Piece
We now have a good idea of what a load test is and what it consists of. But the real question is: what value do we get from this? What should we be watching for? A load testing tool will give us a great deal of metrics and graphs and charts and shiny things, but what should we be doing with all this info? From my experience, and based on advice from more experienced performance testers, we need to know what we want to see. We need to know what to look for, so we know when we see something different! There are a few ways of doing this, and one sure-fire way is to establish a baseline. To do this, we basically run a test, record some metrics, and then judge future runs against those. From my limited experience, I've noticed that for a high-load threshold I tend to look at request send rate vs. request response time, and for performance measurement I tend to look at page load time. Using these two metrics, I can get a good idea of whether my application is really tanking when stressed. They also give me a good idea of how well the application will behave for the user. There are definitely other interesting measures, but at a high level, I believe these two are a great starting point.




Alright crew, in this introductory post we talked about what load testing is, what its components are, and what is important in a load test... now I bet you want to get into specifics? Welp, stay tuned: next up, we'll talk about some specific load testing tools and their advantages / pitfalls.

Until next time.