
Where to compromise on the Software Testing Life Cycle to meet a deadline?

This article is written primarily for QAs, so that you don't get lost if you're asked to test a release in a short period of time, when the deadline cannot be pushed but you still need to do your job properly. It may also be useful for Product/Project Managers and Developers who want to see what the testing process is about and how we can help each other within the Software Development Life Cycle (SDLC).

A first-aid tutorial for an unexpected deadline, from a QA with 11+ years of experience: thoughts out loud on short-term process optimisation to meet the deadline while compromising on quality carefully. Emergency releases and releases with tight deadlines happen more often than we want them to. We will not go into the reasons why such situations arise; instead, we will look at how you, as a QA, can deal with them.

Example: You are informed about an upcoming emergency release with new features/functionality, or the current release is not meeting the dates set and you slowly realise that the deadline cannot be met. You start to wonder how to put all the test activities together and make it happen with minimal risk to quality. Since software testing is more than the test execution itself, let's recall the Software Testing Life Cycle (STLC) phases one by one and see what we can do.

Requirements Analysis

All software development starts with an idea, and to make that idea happen we usually have requirements. So the first thing you need is access to the requirements. To understand what exactly needs to be tested and to define the scope, you need to make a deep dive into the requirements and analyse what the changes are about.

  • First Impression. The thing about a first impression is that you only get it once. While you're reading and analysing the requirements, take notes of the questions and ideas that pop into your head. Later, when you're more familiar with the release changes, you'll see that some of those were very good questions.

  • Knowledge. Working with the product on a daily basis and testing different areas in detail makes you, as a QA, one of the key members of the team. Use that knowledge to see how the release changes impact the product's functionality. Try to see the full picture of the release and you'll be more likely to find gaps in the requirements.

  • Communication. Don't hesitate to talk to the project manager and developers about the changes and the areas of the application they will affect. Ask developers about their point of view and concerns, and you'll find out more details. Good documentation takes a lot of time to produce, so don't expect everything to be listed and described.

  • Questions and answers. Set up a call (better, a series of daily calls) with your team to go through the questions you have, clarify and complete the requirements, and acknowledge concerns and possible showstoppers.

This phase plays a crucial role, as you're creating the base for the testing itself. With a thoughtful approach, together with your team, you can prevent issues from happening and lower the number of regression issues.


Test planning

With an understanding of what the release is about and which parts of the software are affected, you can now plan what needs to be tested.

  • Scope of testing. Define the scope of testing that is needed without considering the deadline, and prepare an estimate based on how many people you have, the environment required for testing, and when the software will be available.

  • Documentation. If the process at your company allows it, don't create a full Test Plan as you would in better times; instead, create a list of activities with a short description of their purpose and the risks you take if these activities are skipped.

  • Adjust the scope. Based on how much time you have for testing, see how you can adjust the scope and list the risks this brings. Shortening the scope of testing doesn't mean excluding testing that is not needed; it means that for the areas left untested, the risk of something going wrong is low, or simply accepted by everyone involved in the release.

  • Tune the scope. Share your test plan and make it clear to all teams involved what will and won't be tested in the release due to the deadline. That's the moment when everyone starts sharing the responsibility. Based on the concerns of the people involved, you might need to tune the scope again.
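The trimming step above can be sketched as a simple risk-versus-time cut. The area names, risk scores, and hour estimates below are hypothetical; the point is that whatever falls out of the budget becomes the explicit list of accepted risks to share with the team:

```python
# Hypothetical sketch: trim the test scope to fit a time budget,
# keeping the highest-risk areas and reporting what gets cut.
# Area names, risk scores and estimates are illustrative only.

def trim_scope(areas, hours_available):
    """areas: [(name, risk 1-5, estimated hours), ...] -> (in_scope, out_of_scope)."""
    # Highest risk first; ties broken by the cheaper estimate.
    ranked = sorted(areas, key=lambda a: (-a[1], a[2]))
    in_scope, out_of_scope, used = [], [], 0
    for name, risk, hours in ranked:
        if used + hours <= hours_available:
            in_scope.append(name)
            used += hours
        else:
            out_of_scope.append(name)  # becomes the "accepted risks" list

    return in_scope, out_of_scope

areas = [
    ("payments", 5, 8),
    ("login", 4, 4),
    ("settings", 2, 3),
    ("email notifications", 1, 2),
]
kept, cut = trim_scope(areas, hours_available=12)
print("in scope:", kept)
print("deferred:", cut)
```

In a real release the "risk" number would come from the team discussion described above, not from a formula; the sketch only makes the trade-off visible and shareable.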

Test cases creation

Once you’re familiar with the release changes, start listing scenarios.

  • Keep scenarios written. Do not skip this phase even if you're in a big rush. After 1–2 days of testing, you won't remember exactly what was tested and what the results were.

  • Keep it short but descriptive. Good test case creation takes time, and in the short term you might need to postpone good practice and use something simpler. A handy, time-saving practice at this phase is a checklist. If you're already using a test case management tool such as TestRail or the Xray plugin, you can add cases there, but instead of fully formatted test cases, leave brief outlines of what should be tested.

  • Priorities. Always list critical scenarios first. When developers start providing you with a POC or a first build, most likely only the key functionality will be ready for testing, and you'll need to start with that. Less critical scenarios can be added in between testing runs.

  • Update on the way. Test case/scenario management does not stop until you test the final build. There will be situations when bugs are converted into features, or requirements change during development; all of that needs to be reflected in your test documentation, especially if you're not the only person in the QA team. This will save time for both the QA team and the developers: no one likes spending time on "Closed by design"/"Works as expected" statuses.
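A checklist-style scenario list with priorities and statuses can be as simple as a few dictionaries. The scenario titles below are hypothetical; the sketch shows critical items sorted to the front for execution and statuses updated as you go, so a teammate can see what was covered:

```python
# Hypothetical sketch: a lightweight checklist instead of fully formatted
# test cases. Scenario titles and priorities are illustrative only.

scenarios = [
    {"title": "User can reset password via email", "priority": "critical", "status": "todo"},
    {"title": "Avatar upload accepts PNG/JPEG", "priority": "normal", "status": "todo"},
    {"title": "Checkout completes with saved card", "priority": "critical", "status": "todo"},
]

def execution_order(items):
    """Critical scenarios first; written order preserved otherwise (stable sort)."""
    return sorted(items, key=lambda s: s["priority"] != "critical")

def mark(items, title, status):
    """Record a result so the next run (or a teammate) sees what was covered."""
    for s in items:
        if s["title"] == title:
            s["status"] = status  # e.g. "passed", "failed", "blocked"

for s in execution_order(scenarios):
    print(s["priority"], "-", s["title"], "-", s["status"])
```

A spreadsheet row or a TestRail entry with a one-line outline serves the same purpose; the structure matters more than the tool.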

Test execution

As soon as the first version is ready, start testing.

  • What to test? Always ask which functionality is ready for testing in every build/version, e.g. via Release Notes. There's no need to run tests for functionality that hasn't been developed yet.

  • Updates. Summarise the results of every testing run. You're the one who shows the status of the app based on the test results. Make sure your team, including the project manager and developers, is aware of what was tested and what was found.

  • Ad hoc or exploratory testing can be a great help. Use your creativity and knowledge of the app to take a step left or right of the test scenarios. Just 30–60 minutes of such testing can reveal issues and scenarios you couldn't have thought of during functional testing.

  • Regression. Start regression testing when development of the release changes is done and their testing is more or less complete.

  • Notes. Take notes of what you learn about the new functionality and other details you discover during testing, e.g. things that would be good to include in a User Guide or to pass to the Customer Support department.
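The per-run status update mentioned above can be generated from the raw results in a few lines. The scenario names and statuses here are hypothetical; the sketch just turns a list of results into the short summary you would post to the team:

```python
from collections import Counter

# Hypothetical sketch of a per-run summary: counts by status plus the list
# of failed scenarios, ready to paste into a status update. Data is illustrative.

def summarise_run(results):
    """results: [(scenario, status), ...] -> short textual summary."""
    counts = Counter(status for _, status in results)
    failed = [name for name, status in results if status == "failed"]
    lines = [f"{status}: {n}" for status, n in sorted(counts.items())]
    if failed:
        lines.append("failed scenarios: " + ", ".join(failed))
    return "\n".join(lines)

run = [
    ("login with 2FA", "passed"),
    ("checkout with saved card", "failed"),
    ("password reset email", "passed"),
    ("avatar upload", "blocked"),
]
print(summarise_run(run))
```

Whether this lives in a script, a test management tool, or a hand-written message, the point is the same: every run ends with a summary the whole team can read.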

Test activities closure

  • Make sure the agreed scope of testing is complete.
  • Once you've finished the test execution, provide a summary: coverage, results, and issues deferred to future releases.
  • Document the lessons learned during testing and any other helpful details/instructions that can be reused for the next release.
  • Convert test scenarios into proper test cases, adding steps and details, to keep good practice for future testing.

As you can see, the key measures are: open communication with all teams involved, to stay up to date on the release details and status; postponing some QA good practices to the test closure phase; and tuning the scope of testing based on the risks the team agrees to take.

And always remember that it's not just the QA team that is responsible for quality, but all teams involved in the software development. Even small decisions taken at different levels can impact the end result, and it's your responsibility to always do your best, as you play an important role in the Software Development Life Cycle (SDLC).


Lead QA Engineer