
Top QA Best Practices To Implement For Delivery Of Quality Software

In any software organization, team, or group, the ultimate goal is to produce and release quality software that benefits the market. Just as importantly, the software has to be user friendly, durable, and reliable, and the latter is where QA comes in. There's no use producing a software product that is great in theory but in practice is unreliable or hard to use; it has to be bug-free and perform consistently at scale and over time.

For many organizations, testing software can quickly become a long and expensive process, ultimately leading to delayed releases and lengthening release cycles. To avoid long release cycles and long hours of testing, and to reduce production defect leakage, here are some QA best practices that, from my experience, can help mitigate these factors:

Think QA and Not Just Testing

Each QA's role should be to think in terms of overall quality, which includes but is not limited to testing. This means QA should understand and think about potential risks and gaps earlier in the software lifecycle. In an agile methodology, this means attempting to identify gaps during story grooming sessions, analysis of acceptance criteria, and sprint planning.

A popular saying in QA is that a bug is cheapest to find during requirements analysis and most expensive to find in production, with Dev, QA, and UAT getting progressively more expensive in between. Identifying bugs earlier in the software lifecycle will ultimately lead to money saved.


Define Your Strategy Early

This goes without saying, but having a high-level test strategy and process defined aids the efficiency of the QA team as a whole. This test plan should include the methodology your group follows (agile or waterfall), key contacts and stakeholders, the test case repository and management system, the bug tracking mechanism, integrations with other applications, the functional and regression testing plan, performance testing needs and strategy, and of course your test automation strategy.

Having this strategy defined early in the project lifecycle anchors all QA engineers to a common, documented approach and increases overall efficiency.
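As a purely illustrative sketch, the key decisions can even be captured in a small, versioned artifact so every engineer works from the same answers. All names and values below are hypothetical placeholders:

```python
# A hypothetical test strategy skeleton, kept in version control so the
# whole team anchors to the same decisions. Every value is a placeholder.
TEST_STRATEGY = {
    "methodology": "agile",
    "stakeholders": ["qa_lead@example.com", "product@example.com"],
    "test_case_repository": "TestRail",
    "bug_tracking": "Jira",
    "integrations": ["payments-service", "notifications-service"],
    "functional_and_regression": "automate in-sprint; full suite nightly",
    "performance": "load tests against a dedicated perf environment",
    "automation": "API-first, GUI smoke suite on top",
}
```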


Implement Test Automation As Part of Regression

As features get developed and tested, the number of test cases will continue to increase. This can make regression testing a challenge if done manually, as the team is soon looking at hundreds or thousands of test cases to re-execute before any release. This is where automation gets its big return on investment.

Depending on the complexity of the application under test, define an automation strategy that is effective and, ultimately, efficient. Ideally, test automation should start while functional testing is still ongoing; in an agile setup, test cases are automated within the same sprint in which they are developed and tested. By the time you're ready for a release, you should have a significant portion of your tests automated and already executing, thereby cutting your regression overhead significantly. In addition, it's a good idea to have your already-automated tests run daily or weekly. I shared my ideas on automation strategies in a separate post here.
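To make that concrete, here is a hedged sketch of what in-sprint regression tagging can look like with pytest (the URL, endpoint, and marker name are hypothetical):

```python
# A minimal sketch: tests written during the sprint are tagged "regression"
# and re-executed on the nightly/weekly run with `pytest -m regression`.
# The base URL and /health endpoint are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://qa.example.com"

@pytest.mark.regression
def test_health_endpoint_is_up():
    response = requests.get(f"{BASE_URL}/health", timeout=5)
    assert response.status_code == 200
```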


Use Analytics to Capture Trends

Multiple test management systems now have analytics capabilities built into them. Enabling these features can help the team determine key test metrics such as bug distribution, how quickly bugs are resolved, test execution results over time, and in some instances execution time. Log management systems like Splunk and Graylog can also be used to capture the frequency of specific errors in an application over a given period of time.

This data helps the team highlight problem areas and components that break frequently, and develop specific tests to target them.
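As an illustration only, here is one way such exported data could be mined; the bugs.csv file and its columns are hypothetical assumptions:

```python
# A minimal sketch of mining an exported bug report for trends. Assumes a
# hypothetical bugs.csv with "component", "opened", and "closed" columns.
import csv
from collections import Counter
from datetime import date

counts = Counter()
resolution_days = []
with open("bugs.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[row["component"]] += 1
        if row["closed"]:
            opened = date.fromisoformat(row["opened"])
            closed = date.fromisoformat(row["closed"])
            resolution_days.append((closed - opened).days)

print("Bug distribution by component:", counts.most_common(3))
print("Average days to resolve:", sum(resolution_days) / len(resolution_days))
```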


Don’t Forget To Engage Performance Testing

In the early phases of test planning, it's a good idea to begin mapping out what performance testing will look like. While performance testing typically doesn't happen until the product is stable, early planning and an early understanding of the application by the performance testing team (you should have one) will make performance testing more efficient.

In more mature environments, the performance testing team can also utilize some of the API and UI test automation scripts for performance testing, thereby reducing scripting time and ensuring accuracy of scenarios.


Choose The Right Environment

This also goes without saying, but choosing the correct environment for each phase of testing is a key practice every organization should have.

That is to say: a QA environment for functional tests, a dedicated performance environment for performance testing, and subsequently Stage/UAT. Most importantly, avoid testing in the same environment as Dev, as the frequency of changes and the resulting instability will lead to inaccurate test results.
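One hedged way to enforce this in an automation framework is to pin each phase to its own base URL; the environment names and URLs below are hypothetical:

```python
# A minimal sketch mapping each test phase to its own environment, selected
# per run (e.g. TEST_ENV=qa pytest ...). All URLs are placeholders.
import os

ENVIRONMENTS = {
    "qa":   "https://qa.example.com",    # functional testing
    "perf": "https://perf.example.com",  # performance testing
    "uat":  "https://uat.example.com",   # stage/UAT sign-off
}

def base_url() -> str:
    return ENVIRONMENTS[os.environ.get("TEST_ENV", "qa")]
```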


Maintaining software quality while keeping up with the demand of frequent releases is a key indicator of how well a QA team is performing. A high-performing QA team should think outside the box about how to improve processes and maximize efficiency. Additionally, engaging other team members (Dev, DevOps, and Product) in these practices to varying degrees improves the overall quality of the software being delivered to customers. Following these best practices is a quick way to shorten development cycles for your team and your customers, and to keep your team members engaged and motivated.

API Testing: Techniques You Should Be Using For Testing Your APIs

The API layer of an application is one of its more important components, as it comprises most of the key business functionality of the application. It drives the business process as well as the integration with linked applications. With that, it presents an excellent avenue to test and catch errors and bugs earlier in your software development cycle, reducing the overall risk of more expensive bugs late in the game.

As discussed in a previous article, API testing (and automation) should cover a significant portion of your testing effort for any application, as defined in the Test Pyramid seen below:

However, the key question now becomes: how do we achieve maximum efficiency in our API testing and automation effort while maintaining scalability, reusability, and reliability? Here are some key techniques and types of tests you should adopt during your API testing phase.


API Validation Testing

This goes without saying; however, it is the most important type of test you should have for APIs. It includes verifying correct status codes (200 or 201 for positive requests, 4xx or 5xx for negative ones) for valid and invalid requests, valid response bodies, valid request/response headers, etc.

This covers authorization and permissions testing as well.
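A hedged sketch of what such a validation test can look like with requests and pytest; the endpoint, payload, and token below are all hypothetical:

```python
# Minimal validation-test sketch: status code, headers, and body checks.
# The /documents endpoint and the bearer token are placeholders.
import requests

BASE_URL = "https://api.example.com"

def test_create_document_returns_201():
    response = requests.post(
        f"{BASE_URL}/documents",
        json={"title": "Q3 report"},
        headers={"Authorization": "Bearer <token>"},
        timeout=10,
    )
    assert response.status_code == 201
    assert response.headers["Content-Type"].startswith("application/json")
    assert response.json()["title"] == "Q3 report"

def test_create_document_rejects_missing_auth():
    # Negative case: no Authorization header should be a 4xx, never a 2xx.
    response = requests.post(f"{BASE_URL}/documents", json={}, timeout=10)
    assert response.status_code in (401, 403)
```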


Requests Chaining for Business Flow

Here, we strive to simulate the business flow of the application as closely as possible using the APIs that have been built. This typically consists of a series of requests and responses linked together through chaining, i.e. using data from one response as part of the input for a subsequent request.

For example, let's say you have a business flow in which a user with a certain permission can upload a document and set it to Pre-Approval, but cannot approve it. Your scenario might consist of a Login request that returns a userID, which can be passed into a GetPermissions request that returns the allowable permissions for that user. You can then call your UploadDocument request (with the userID and permissions), a SetPreApproval request (with the documentUploadID), and subsequently attempt to approve the document via an ApproveDoc request (which should return a 401 error).

In the scenario above, you would have successfully simulated, via API testing and chaining, the business flow of logging in, uploading a document, and attempting to approve it. Multiple testing tools, including ReadyAPI and Postman, support chaining.
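A hedged Python sketch of that same chain using requests; every endpoint, field name, and credential below is a hypothetical placeholder:

```python
# Minimal request-chaining sketch: each step feeds data into the next.
import requests

BASE_URL = "https://api.example.com"
session = requests.Session()

# 1. Log in and capture the userID from the response.
user_id = session.post(
    f"{BASE_URL}/login", json={"user": "tester", "password": "***"}
).json()["userID"]

# 2. Fetch the permissions allowed for that user.
permissions = session.get(f"{BASE_URL}/users/{user_id}/permissions").json()

# 3. Upload a document as that user; capture the upload ID.
doc_id = session.post(
    f"{BASE_URL}/documents",
    json={"userID": user_id, "permissions": permissions},
).json()["documentUploadID"]

# 4. Set the document to Pre-Approval.
session.post(f"{BASE_URL}/documents/{doc_id}/pre-approval")

# 5. Attempt to approve: this user lacks that permission, so expect 401.
response = session.post(f"{BASE_URL}/documents/{doc_id}/approve")
assert response.status_code == 401
```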


Data Driven & Boundary Tests

As the name suggests, this covers testing your set of APIs with multiple data scenarios, permutations, and combinations. It is typically more important for APIs that take in a considerable variety of datasets.

With data-driven testing, you can supply test data either through a test data generation function or through a test data file produced outside of your testing tool. The idea is that any kind of data the system can expect is tested without necessarily having to go through the UI or other integrating applications.

Test coverage here typically includes boundary tests, invalid data types (special characters, null values, etc.), large int/string/long values, extreme future and past dates, and so on. These tests provide an excellent opportunity to 'tighten up' the APIs against worst-case scenarios.
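A hedged sketch of data-driven boundary coverage with pytest's parametrize; the endpoint and the expected status codes are assumptions:

```python
# Minimal data-driven/boundary sketch: one test, many payload permutations.
import pytest
import requests

BASE_URL = "https://api.example.com"

@pytest.mark.parametrize("payload, expected_status", [
    ({"title": "ok"}, 201),                       # happy path
    ({"title": ""}, 400),                         # empty string
    ({"title": None}, 400),                       # null value
    ({"title": "x" * 100_000}, 400),              # oversized string
    ({"title": "ok", "due": "9999-12-31"}, 400),  # extreme future date
])
def test_document_boundaries(payload, expected_status):
    response = requests.post(f"{BASE_URL}/documents", json=payload, timeout=10)
    assert response.status_code == expected_status
```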


Performance 

While this isn't necessarily a full-blown performance testing effort, API testing provides an early indication of potential performance issues. The general industry guideline is an API response time of about 200 ms, with 1 s as the maximum.

While performing your API tests, it is good practice to include a response time assertion, which enables you to catch sudden latency in the application or slow-performing APIs. This can be tested in isolation or under load.
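A hedged sketch of such an assertion using the elapsed timer that requests records for each response; the endpoint and the 1-second ceiling are assumptions based on the guideline above:

```python
# Minimal response-time assertion sketch using requests' elapsed attribute.
import requests

def test_document_list_responds_within_one_second():
    response = requests.get("https://api.example.com/documents", timeout=10)
    assert response.status_code == 200
    # Flags slow-performing APIs or sudden latency early.
    assert response.elapsed.total_seconds() < 1.0
```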

How AI, Visualization and Data Analytics Are Helping Us Tackle Covid-19

Every morning when I wake, after I go through my normal early-morning routine, I get on my laptop and go directly to the Johns Hopkins dashboard for tracking confirmed Covid-19 cases. I skip the news and social media and go straight to looking at the numbers. Forty years ago, or better yet 102 years ago (during the Spanish Flu pandemic), this would not have been remotely possible. It is a quick, high-level snapshot of what technology has already enabled us to see with respect to Covid-19.

While the idea of a dashboard showing cases all over the world seems pretty simple and straightforward, the underlying technologies that make it possible have come about as a result of decades of software engineering, artificial intelligence, and data processing. I'm not a medical professional or a public health expert, but I do understand data science, so I'll delve into the tools and technologies helping our experts and officials tackle this global Covid-19 pandemic.


Data Gathering and Normalization

Data collection has been around since the inception of writing. With today's technology, however, the speed of collecting, gathering, and normalizing data has grown astronomically. We're able to log a Covid-19 case anywhere in the world within six hours of confirmation and immediately store it in a database that can be accessed from anywhere. Additional data related to a particular confirmed case (e.g. age, gender, race, underlying illness) can also be collected and normalized accordingly. Data is the bedrock of everything else we'd want to do, and the ability to collect, gather, and normalize data from all over the world in a few hours is critical to how public officials manage the crisis.


Predictive Analysis & Trends

With accurate data gathered, we can begin to utilize various algorithms to perform predictive analysis. Simulating models to determine spread rate, the effects of social distancing, and mortality rate within various demographics is all possible with predictive analytics. This provides significant information to help public officials make better decisions for the health of the greater population. For example, using predictive analytics, one can simulate the effect of social distancing (or the lack of it) on the spread rate in a particular city, use multiple variables (e.g. population density, weather, median age) to determine where the measure will be more effective, and try to predict where the next hot spot could potentially be.
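For a flavor of what such a simulation involves, here is a deliberately toy, purely illustrative sketch of a classic SIR-style spread model; every parameter is made up and this is not any official model:

```python
# Toy SIR (susceptible-infected-recovered) spread simulation, illustrative
# only. beta = daily transmission rate, gamma = daily recovery rate.
def peak_infections(population=1_000_000, beta=0.3, gamma=0.1, days=180):
    s, i, r = population - 1, 1.0, 0.0
    peak = 0.0
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return round(peak)

# Halving beta stands in for social distancing: compare the two peaks.
print(peak_infections(beta=0.3), peak_infections(beta=0.15))
```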


Visualizations

I might not need to explain this much, as it has become ubiquitous in today's world. Thanks to the amount of data we have available and powerful visualization tools like Tableau, Power BI, and Google Charts, it's easy to visually 'see' your data. Geographical charts let us see, on a world heat map, which regions have been hit harder by the virus and which haven't. Multiple charts also give us the ability to view and analyze the rate of increase or decrease in locations around the world, drill down to view mortality rates for various demographics, and track other important information like testing and hospitalization rates.


Artificial Intelligence (AI)

AI has generally been a concept that has scared a lot of people in the past few years. However, in the race to develop a vaccine or find a cure for Covid-19, AI might just be the accelerator we need. From speaking to some in the medical industry, the type of vaccine that will probably be developed to tackle Covid-19 is a nucleic acid vaccine, which injects genetic material of the pathogen into human cells to stimulate an immune response. With AI, scientists are able to predict the 3D structure of a protein based on its genetic sequence and significantly reduce the rounds of permutations and combinations that would ordinarily occur in a normal vaccine development cycle. Hence we hear 12-18 months instead of 8-10 years. More here.

These are some of the ways that technology and big data have enabled us as a society to tackle this pandemic. They are among the reasons why we may not actually reach the horrific numbers of the Spanish flu of 1918: we have 100+ years of technological advantage.

What I Learned From My Digital Declutter For 40 Days

As a (somewhat) practicing Catholic, I've grown to cherish the Lenten season: a predefined timeframe every year when I can take some time to reflect, take stock spiritually, and perform a personal sacrifice for the prescribed 40 days.

Over the last few years, I've given up things like fried food, social media, sweets, and alcohol. However, earlier this year I read the book Digital Minimalism, and the idea of a digital declutter was planted in my head. Lent provided the perfect opportunity to go through with my digital declutter during a period I'd already grown accustomed to. Perfect timing.


What My Digital Declutter Consisted Of

My digital declutter consisted of getting off all social media apps and any other applications deemed "optional." Essentially, any application or website not absolutely required for me to function in 2020 had to go. For me, that meant leaving all social media platforms except LinkedIn: Facebook, Twitter, Snapchat, and Instagram. In addition, apps like WhatsApp, Facebook Messenger, Netflix, EPL, Bleacher Report, news apps, sports apps, and photo apps were all uninstalled. I kept my bank apps, my kids' daycare app, and NikeRun for my running exercises.

That meant I was going almost cold turkey, but not entirely. I've heard of instances where others simply switched from a smartphone to an older device, and some include TV watching in the declutter as well. It's all about what works for you, while being honest with yourself. Now let's get into what I learned from the experience (besides better battery life on my phone).


A Lot of Apps Are Trivial Time Wasters

Probably the biggest and most obvious lesson I learned from my declutter is that a lot of apps are absolute time wasters. Whether it's endlessly scrolling through a feed of trivial posts and images, engaging in arguments that add little value in the grand scheme of things, or just clicking through headline after headline, the theme is consistent: I could spend a few hours a day just roaming around different apps and social media sites and achieve little to nothing of note.


Overall Productivity & Focus Increased

With a lot of internet time-wasting cut out, my overall productivity increased. Without the constant sounds and vibrations of notifications, my focus was much sharper. I realized, in just about a week, that a large part of the reason I procrastinated on work was simply that something else was chewing up my time. Within the first few weeks I got done more than would ordinarily have taken me months.

I'm not necessarily advocating for 'cancelling' social media, because I do see some of its value; connecting and information sharing are nice things to have. However, it became apparent to me that these apps are designed to keep you engaged endlessly while returning very little value. Cutting them out gave me a lot more time for rewarding and productive activities, including (but not limited to) creating this site we're on right now.


Important People Actually Called

Another thing that struck me within a few weeks was twofold: my closer friends and family started calling more, and those who didn't really matter probably didn't even notice I was AWOL. It worked in reverse as well. Instead of assuming someone was doing fine because they just posted a selfie smiling while lounging by the pool, I actually took the time to call and talk to friends, find out how they were doing, and catch up on memories. Those calls turned out to be much more fulfilling than sending heart emojis on a picture or a six-sentence "check-in."

I joked with the friends who called me after they realized I was off the grid that they're now in my book of life. But jokes aside, the lesson here was that we swapped numerous trivial connections for fewer but more valuable, in-depth connections. In a weird way it gave me the insight that we're just not that important to a lot of the people we think are worthy of our time and energy.


Mental Health and Anxiety – No Notifications, No Worries.

What you don't know won't bother you. Without endless newsfeeds and hundreds of notifications, I was less anxious overall. I realized that I really didn't need to know about the person in Wisconsin who fell ill and died from flesh-eating bacteria, or the right-wing politician making noise in Chile. I can't do anything with that information; all it does is occupy my mental space and increase my anxiety while offering me no solution. A lot of what is in the news I found to be either unnecessary or just an overload of information. The same can be said of social media, especially in the era of Covid-19, when everyone apparently has a story to tell.

I resorted to reading one news item a day (generally from the BBC or Al Jazeera) and not bothering with anything else for the rest of the day. I'm not any less aware of current happenings, and in these times it's been a savior. I've felt better about things; even when it seems like everyone is freaking out, I think I've stayed relaxed about it all.


The Art Of Being Present

It was something of a bad habit of mine (and of a lot of us): being somewhere, but not really being there. I would be at dinner, hanging out at a friend's, or at the playground with my daughters, and about 50% of the time I was checking my phone. It could've been an argument on Twitter, a meme on Instagram, the scores, the family chat, or responding to an email. Whatever it was, it was denying me the ability to be present in my current activity.

Making the conscious decision to declutter gave me the ability to be without my phone for hours and to enjoy the present moment. I paid more attention to what my daughters were doing and saying, engaged more with the people around me, and enjoyed the beauty of nature without feeling the need to show the internet that I was enjoying nature. I can enjoy activities and people wholly, without worrying about how that fun activity looks to others.


Better Social Interactions

I added this point because, prior to the Covid-19 shutdown, it was one of the best things I gained from the experience. When I didn't have the option of mindless likes or dead-end banter online, I naturally resorted to interacting more with the people around me. I planned a game night every few weeks to get friends together, talked to my neighbors more, and spent time really getting to know some of my coworkers. Even conversations with family were much better.

Too bad Covid cut most of that short, but the lesson was learned. Post-Covid, we'll take these habits with us.


Conclusion

While I've made many "I'm leaving social media" pledges before, this declutter was quite different. Previously, I simply replaced one digital tool with another and didn't get true value from the experience. Even though my 40 days are up, I've drawn up an outline of which social media apps I'll return to and how frequently I'll use them: some once a week, others once a day for 30 minutes. Still others, I've decided I can do without entirely; there was absolutely nothing I missed about them when all was said and done.

That will allow me to continue some of the habits I built in these 40 days: calling close friends more, hosting more events where people actually interact (after Covid), taking more nature walks without my phone, being present with my kids, and of course being more productive with even less time.

Inner peace, being present, and less time spent arguing on Twitter about which politician is lying more? Yep. I can live with this.

Image credit: https://www.inc.com/

6 Test Automation Strategies You Want To Define Before Writing A Script

In today's software development world, test automation is not just a nice-to-have; it has become a must-have. Software architecture has grown increasingly complex, so the need not just to test more but to test quickly has become essential to any software organization.

It is important to note that while test automation is key in any software development environment, it does not (and should not) entirely replace manual testing, and the "eye test" (risk-based testing) is still as valuable as ever. Some might argue it's even more valuable now.

With that, I figured my first official blog post should touch on the best techniques and strategies an organization should utilize to get the best out of software test automation.


Dedicate A Team

While it's true that a good automation engineer should also be a good manual tester, when it comes to executing the automation strategy for an organization, it's key to have a dedicated core team of automation engineers. Test automation is a full-time task, and having engineers juggle manual and automation testing is a risk. This may not be possible everywhere, but as much as you can, keep at least a dedicated core of automation engineers focused on automation tasks.


Design A Framework That Is Scalable

How successful your organization's automation effort is depends heavily on how you start. Designing a framework that is scalable and suitable for the application (or suite of applications) is absolutely paramount. Decisions about test tools, version control, and reusability of artifacts should all factor into this design.

The last thing you want is to go down the wrong path for months or years and have to restart your effort because of an unsuitable decision about an automation tool, framework, or development strategy.
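As one hedged example of a design that scales, UI frameworks often centralize selectors and actions in reusable page objects, so a UI change is fixed in one place; the page class, selectors, and URL below are hypothetical and assume the selenium package:

```python
# Minimal page-object sketch: selector changes stay inside the class, so
# the many tests that reuse it don't need edits when the UI shifts.
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    def __init__(self, driver):
        self.driver = driver

    def open(self, url):
        self.driver.get(url)
        return self

    def sign_in(self, user, password):
        self.driver.find_element(By.ID, "username").send_keys(user)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()

driver = webdriver.Chrome()
LoginPage(driver).open("https://qa.example.com/login").sign_in("tester", "***")
```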


Automate Below GUI Level As Much As You Can

GUI automation, while very necessary, can be expensive both to execute (in time) and to maintain when UI elements change often. For web-based applications, develop a framework that utilizes API or web service automation as much as possible.

The key reasons here are execution time and speed to fail. The time it takes to execute an API call and determine whether the application has a server error is a fraction of the time it takes to launch a browser, navigate to the application URL, and then validate the same error.

Once again, this is an 'as much as you can' situation and does not completely eliminate the need for GUI automation, but a fair bet is having a large percentage of scenario-based testing done below the GUI level, be it via APIs/web services or via command-line executions for desktop applications.
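For illustration, a hedged sketch of the fail-fast idea: the same server-error check a browser flow would take far longer to reach, done in one API call (the endpoint is a placeholder):

```python
# Minimal below-the-GUI sketch: catch a backend failure in milliseconds,
# with no browser launch or page navigation involved.
import requests

def test_search_backend_has_no_server_error():
    response = requests.get(
        "https://qa.example.com/api/search",
        params={"q": "smoke"},
        timeout=5,
    )
    assert response.status_code < 500
```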


Make Sure You Have A Business Level GUI Smoke Test

You can call it a smoke test, sanity test, or build verification test (BVT). Whatever your organization calls it, be sure to have this test (or suite of tests) automated. It is a high-level end-to-end test that can validate at any time that your latest build is safe and that the integrated components are not broken. In a high-delivery development environment, having this suite is key to maintaining a solid environment with little to no manual effort.
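A hedged sketch of what a couple of BVT checks might look like, tagged so CI can run them after every build (e.g. pytest -m smoke); all endpoints and credentials are placeholders:

```python
# Minimal BVT sketch: a few critical-path checks tagged "smoke".
import pytest
import requests

BASE = "https://qa.example.com"

@pytest.mark.smoke
def test_app_is_reachable():
    assert requests.get(f"{BASE}/health", timeout=5).status_code == 200

@pytest.mark.smoke
def test_login_round_trip():
    response = requests.post(
        f"{BASE}/api/login",
        json={"user": "bvt", "password": "***"},
        timeout=5,
    )
    assert response.status_code == 200
    assert response.json().get("token"), "login should return a session token"
```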


Decide Your Execution Strategy

Congratulations! You've written all these automated tests; now you have to decide how, when, and where they'll be executed. Defining an execution strategy is critical to realizing the benefits of your test automation effort (see the sketch after the list below).

  • How – Determine whether your tests (and which tests) will be executed through a manual trigger, on a schedule, or through a CI/CD pipeline (Bamboo and Jenkins come to mind).
  • When – How often are these tests executed, and which tests are executed when? Your BVT will want to run daily or after every build, while other suites (including the full regression) can run nightly or on demand. Depending on your application, these decisions will have to be made early to keep your development strategy in sync.
  • Where – Which suites of tests will be executed against which environments? Some tests can run in Dev, others in QA, and another group in a UAT environment.
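As promised above, a hedged sketch of encoding those how/when/where decisions in one place so each trigger can pick its row; every suite name, trigger, and URL is a hypothetical placeholder:

```python
# Minimal execution-plan sketch: one table drives which suite runs, when,
# and against which environment.
EXECUTION_PLAN = {
    # suite        (trigger,             environment)
    "smoke":      ("after-every-build",  "dev"),
    "regression": ("nightly",            "qa"),
    "full":       ("on-demand",          "uat"),
}

def run_command(suite: str) -> str:
    trigger, env = EXECUTION_PLAN[suite]
    # The CI job for `trigger` would invoke something like this command.
    return f"TEST_ENV={env} pytest -m {suite}"

print(run_command("smoke"))  # -> TEST_ENV=dev pytest -m smoke
```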

Don’t Forget To Collect Metrics For Analytics

To determine how effective your automated tests are, what the ROI on your automation investment is, and how results trend over time (including environment stability), you will have to define a strategy for collecting metrics and measuring KPIs. Are you looking to show your pass/fail rate over time? Your median execution time? Defect leakage prevented? A deliberate approach to collecting metrics will enable your organization to measure KPIs and make decisions based on proper analytics rather than speculation.
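As an illustration only, even a tiny script over exported run results can surface the pass-rate trend; the results.csv file and its columns are hypothetical:

```python
# Minimal metrics sketch: pass/fail rate per run from an exported CSV
# with hypothetical "date", "passed", and "failed" columns.
import csv

with open("results.csv", newline="") as f:
    for row in csv.DictReader(f):
        passed, failed = int(row["passed"]), int(row["failed"])
        rate = passed / (passed + failed)
        print(f"{row['date']}: pass rate {rate:.1%}")
```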


Conclusion

These automation strategies can save your organization plenty of time, money, and headaches if defined correctly and early. Technology is evolving at a rapid pace, and keeping these strategies in focus will be key to ensuring that decisions are made with an end goal in mind. While the tools we use might (and will) change, the strategy should stay consistent.