I Don’t Need Explicit Written Requirements to Test

I’ve been in Software Testing and Quality for almost 10 years now, and a disturbing trend I encounter every now and then is individuals on the test team complaining that they are unable to test because there are no written requirements available.  I’ve seen these same individuals refuse to begin any testing effort (not even willing to launch the system) until somebody produces written requirements for them, not realizing the valuable time they’ve just wasted, time that could have been used to test.  I believe a big part of this approach and expectation comes from an uninformed belief about what software testing (and their job) really is, but more on that later.

Even before I practiced and got better at approaching situations where I was asked to test without written requirements, I believed in getting the information I needed in order to test.  In the last few years I’ve begun referring to this as “information hunting”.  The information could be anything relevant to the situation at hand, from the manner in which to access the system under test, to who the end users of the system will be.

If I find myself in a situation where there are no written requirements, I am going to go “information hunting” to find out more about the project and the testing situation I am in.  If I find myself in a situation where there are written requirements, I’ll be sure to read and analyze them, and I will also go “information hunting”.  You see, there is SO MUCH more to software testing than just the requirements.  I’ve seen many systems in the past with tons of written requirements and numerous test cases mapped to each requirement, yet when I actually used and tested the system myself, the experience was less than positive and many problems were present, despite all those test cases that had been mapped to the requirements and “passed” by whoever was assigned to run them.

Now, going back to how some people on test teams view their role and job: some people see software testing as nothing more than validating the requirements, but that’s not software testing.  Software testing is about providing important information about the quality of the system under test.  One of the many things I’ve learned from Michael Bolton is that you don’t need explicit written requirements in order to launch a system and begin to question it, observe its behavior and learn about it, so that you’re able to provide important information about the quality of the system you are testing.  This is a huge part of Software Testing that I believe is missing on many software test teams: the understanding and appreciation of the skills required to be a valuable software tester, the use of the brain, and actual thinking while interacting with the system under test.
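
To make that concrete, here is a minimal sketch, in Python with the requests library, of what launching a system and beginning to question it can look like before any written requirements exist. The base URL, the paths, and the thresholds are all hypothetical assumptions for illustration; the assertions are implicit, requirement-free oracles, not anything a specification handed me.

```python
# A minimal sketch, assuming a hypothetical web-based system under test.
# The URL, paths, and thresholds below are illustrative assumptions.
import requests

BASE_URL = "http://localhost:8080"  # hypothetical system under test

def probe(path):
    """Visit a page and apply implicit, requirement-free oracles:
    the server should not crash (no 5xx), should answer promptly,
    and should return a non-empty body."""
    response = requests.get(f"{BASE_URL}{path}", timeout=5)
    assert response.status_code < 500, f"{path}: server error {response.status_code}"
    assert response.elapsed.total_seconds() < 2, f"{path}: suspiciously slow"
    assert response.text.strip(), f"{path}: empty response body"
    return response

# Launch, observe, and note questions worth hunting answers for.
for path in ["/", "/login", "/search?q=test"]:
    page = probe(path)
    print(path, page.status_code, len(page.text))
```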

When I go “information hunting” I want to learn more about the current test situation I find myself in (every test situation is different).  I carefully assemble and analyze all of the information I am able to gather about the testing needs and the system, so that I can create and share my test approach with my managers and project stakeholders, and carry out good testing.

I like to find out the reasoning justifying the cost behind the development effort.  What are the information needs of my stakeholders for this specific test effort at this specific time?  What is the purpose of the system and what goals is it setting out to accomplish?  Who will be the end users of the system?  How will the users use the system?  Why will they use it?  What will they use it for?  When will they use it?  Who are the developers with whom I will be collaborating, and can/will they answer my questions about technical changes and possible impacts to consider in my test approach?  Who can I work with to discuss the risks?  Who can I work with to discuss security and performance expectations?  Which other systems and components will be impacted?  There is more information I may seek out depending on the specific situation and circumstances surrounding the system under test (for example test environments, remote team members, third-party involvement).

Using the information I hunt down, I am able to think about the system and how I am going to approach my assignment of testing it.  The answers to my “information hunting” questions have a direct impact on what I test and how.  I am always thinking when I test, observing my interactions with the system and the behaviours it exhibits.

Within the past few years I have also studied and learned different testing skills, including the use of judgement, oracles, and heuristics, and how to incorporate them into my testing in order to help me find and report important quality-related information.
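
As one concrete illustration, heuristic oracles can even be expressed directly in automated checks. Here is a minimal sketch using Python and the Hypothesis library; the slugify function is a hypothetical stand-in for real code under test, and the properties (idempotence, no spaces in the output) are heuristic expectations that no written requirement needs to spell out.

```python
# A minimal sketch of heuristic oracles, using the Hypothesis library.
# `slugify` is a hypothetical stand-in for a real function under test.
from hypothesis import given, strategies as st

def slugify(title: str) -> str:
    # Stand-in implementation: lowercase the title and join words with "-".
    return "-".join(title.lower().split())

@given(st.text())
def test_slugify_is_idempotent(title):
    # Heuristic oracle: applying the function twice should change nothing.
    once = slugify(title)
    assert slugify(once) == once

@given(st.text())
def test_slugify_produces_no_spaces(title):
    # Heuristic oracle: a slug should be safe to embed in a URL path.
    assert " " not in slugify(title)
```

Neither check requires a requirements document; a violation of either would still be information worth reporting to stakeholders.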

I don’t need explicit written requirements to start testing, nor do I heavily rely on them in order to do my job well and to be a valuable contributor to the team.

7 thoughts on “I Don’t Need Explicit Written Requirements to Test”

  1. Agree that
    “… you don’t need explicit written requirements in order to launch a system and begin to question it, observe its behavior and learn about it …”
    and sometimes you even need to test against what makes sense rather than what is written.

    However, there’s a risk in testing where there isn’t some kind of persisted requirement or description: you can find and characterize what appears to be a bug, and sometime in the future someone might say that it’s not a bug, which means you either have to fight for the bug or give up and write off the time as wasted.

    If the requirements aren’t persisted, they’re more likely to change for some reason, and then the picture of overall quality you’re building of the product is built on jello 🙁

    • Hi Matt,

      In regards to your comment about the risk of testing where there isn’t a stated requirement or description, I believe it’s important for a tester to write the bug report well and do a good job of advocating for it, but I don’t agree about the fighting part. If the bug report is written well and the tester advocates for it effectively, there shouldn’t be a need to fight for it; the content of the bug report is important information that the stakeholder or business needs in order to make a decision. Good information found and reported to the right people isn’t a waste of time; we’re not quality gatekeepers, but rather quality information providers. Additionally, in my post I am advocating for the tester to go out and get a description of what needs to be tested, and describing what I myself do in order to get such a description.

      As for overall quality, there are many other factors that influence the overall quality of the product besides requirements and/or testing. The requirements can be well defined, and a good job of testing and discovering problems can be done, yet it’s entirely possible that the product is still built on jello. Software Testing doesn’t control or assure quality.

      • I was actually conflating “advocating for” and “fighting for” a bug.

        Agreed, one should seek out information to understand the SUT, but I’ve been there, and often it’s too much work, and too much bothering of people, to try to get a clear vision of what’s expected. In cases like that, it would be much easier for people to persist some records, and better for overall quality measurements too.

        Sometimes, it’s not just the details of the decisions but the reasoning behind the decisions that is important as well.

  2. I personally think this statement is bold and highly inaccurate, and it does not reflect the reality most people live with at work.

    I could see this happening in a very small software development company that does not have enough resources to implement proper QA processes, or that develops Mickey Mouse applications. Other than that, I sincerely hope nobody else will experience such a frustrating work environment.

    As a test manager with over 15 years of experience (software testing and quality assurance) at small, medium and large organizations, I profoundly and categorically object to your statement.

    I strongly suggest you specify the type of company where you plan to use this technique, and tell me which companies will do that, because I will stay away from them.

    I have done the “information hunting” and let me tell you, it is not a good exercise; most importantly, it is bound to produce tons of gaps at the end. People might think Agile is the way to go, but this, like “outsourcing to India”, failed. Large and medium companies are getting away from that approach, as it does not work for highly complex environments.

    When you work for a large organization, where a release might include changes to up to 12 different applications, I suggest you try going to the tech teams for the information…. Good luck to you.

    Approved and final requirements are a “condition sine qua non” that must be fulfilled prior to commencing the testing phase.

    Testing without requirements is like working in the wild wild west, where people do what they want, and chaos is the currency everyone gets paid with. We may as well play Russian roulette with a fully loaded gun; the odds are the same.

    Just my humble opinion

    Regards

    Carlos The Master

    • Carlos,

      While you are entitled to your opinion, I’d like you to elaborate on what you mean when you say “this statement is bold and highly inaccurate”. You seem to be implying that no testing effort can begin without explicit written requirements. You are also ignoring (or not mentioning) the impact and value that actual software testing skills can have, and the value a skilled tester can bring to the team. I’d also like you to elaborate on what Software Testing is to you and what value you believe it has within the SDLC …

      As for not reflecting the reality that most people live at work – I think you are highly mistaken as there are a lot of testers who are asked to begin testing with very little or no written requirements, and quite often incomplete requirements. Are you suggesting and encouraging testers in these situations to flat out refuse to start testing rather than assessing the specific situation to determine if/how they can provide value to the project?

      Now let me be clear: I am in no way downplaying the importance of written requirements. Rather, I am advocating that Software Testers learn collaboration skills, learn different ways to be valuable to their teams, and consider the specific situation they are in to determine how they can start adding value and contributing, rather than following the old (bad) habit of being unable to add value without written requirements handed to them.

      As a skilled, hands-on Software Tester I have tested many applications and have been on many projects: some with great written requirements, some with no written requirements, and a lot in between. No matter the situation, when I test, my approach, thinking, test design and test strategies are highly dependent on the context of the project and testing effort. I don’t test just the written requirements; I test in a manner where I am able to provide important information (written requirements or not) to my stakeholders. I’ve been applauded numerous times by numerous managers for my ability to go out and get any missing information needed to test, instead of wasting valuable time waiting around for written requirements to be handed to me, as they’ve seen many other testers do.

      As I said in my post, Software Testing is SO MUCH more than just validating written requirements. There are individuals on test teams who believe testing is based solely on written requirements and test cases, and who put so much emphasis on pass/fail results for test cases; these individuals are harming the craft of software testing (and setting us years back). Software Testing is actually a very skilled, information-providing activity requiring intelligent thinking.

      I have gone “Information Hunting” at many companies, including large companies, because it’s a manner in which I am able to add value to my team, and it’s part of what I believe a good Software Tester should be able to do. I take the information (about the actual requirements) that I find and use it in my test design, along with any written requirements that may come with it.

      I agree that testing without requirements is like working in the wild wild west, but NOWHERE in my post do I advocate that type of approach to testing. I do advocate for testers to work and collaborate with other members of the team and to be valuable assets to their teams. I also advocate for testers to go out and learn actual testing skills and use them (heuristics, for example) so that they become better, more skilled testers, able to add value in specific situations without absolutely requiring written requirements.

      I strongly believe that one of the things separating a good, skilled, thinking Software Tester from a tester who depends on explicit written requirements in order to test is the ability to adapt their test approach to different contexts and to add value by providing important information, with or without written requirements, given the context.

      Do you believe that the quality of, and information derived from, your tests will only be valuable when based on the explicit written requirements given to you?

      — qualitycaptain

      • There is no added value to what you are saying; in fact, you are polluting testers’ heads with wrong and conflicting information. You are wasting gas for no reason, and most people will tell you that.
        Yes, no testing effort can begin without completed requirements approved by the stakeholder. The fact that you are testing something that does not have proper requirements is wrong on so many levels.
        That is following the approach of garbage in, garbage out, and yes, I have seen this in very large organizations, mostly with very old legacy applications. Is your approach being practiced? Yes, you are right about that; but is it a good thing to do? No.
        The fact that you are testing something without requirements means that development went off on their own and decided for the client in many ways; also, wherever they could not figure something out, the application will not work or will have a gap. Why would you invest resources and time to test something that is not stable and that will certainly change?
        You mentioned that testing is more than verifying requirements and traceability; that is the only thing I can agree with you on. Nonetheless, given the way you positioned your reply, I see you talking about 2 different things… the quality assurance process and the quality control process. QA is validating that the requirements get approved and traced to specs, use cases, test cases, etc. Testing is measuring the quality of the software by ensuring it meets the needs of the final client.
        QA is an important piece of the puzzle; without process and audits (checks and balances), the testing, no matter how good the tester may be, will be incomplete, inaccurate and bound to result in disastrous code.
        I think that the quality of the testing derived from written requirements is extremely valuable, assuming that you are a good tester. It is a given that you will be doing black-box negative cases (wacky tests, extreme cases, etc.) and positive cases, reviewing the non-functional requirements, CPR, etc. This should be sufficient.
        Hunting for information is only applicable when you don’t know the application, which is the case most of the time, and you rely on other people to explain it to you. Hunting for information is not testing; I would see it as onboarding, in which case you are most likely using a current version. You are not testing, you are learning the application.
        Again, testing without requirements is guessing, and the only thing certain is a loss of time and $$$.

        Lastly, remember that testing is more than hunting for information and validating requirements; it is also about measuring risk. The risk of testing something incomplete, that will change 2 or 3 times before it is final, is very high. As a test lead you are bound to inform your clients about this risk. If your client is willing to live with it and blow the cash right and left… then testing without requirements is the way to go.

        My opinion

        • There is no added value to the type of robot-like testing you seem to be doing and advocating; it’s old, repetitive and non-evolving.

          If you believe and practice that no testing can begin without completed written requirements approved by the stakeholders, then we definitely see and practice Software Testing, and the value it provides, differently. Information Hunting is about going out and getting the information you need in order to test, and this includes the information needs of your stakeholders.

          Are you calling smart, intelligent, thinking Software Testing that makes use of actual skill (something robot-like testers don’t exercise or have much of) garbage in, garbage out? Do you know anything about heuristics? Critical thinking? Scientific testing? Software structure? Perhaps your lack of knowledge, or your fear of skilled Software Testers and their thinking, is what is provoking you to call testing that involves actual thinking (something test robots don’t do) garbage in, garbage out.

          I am not talking about 2 different things (you might be). My post is about Software Testing, not Quality Assurance or Quality Control. Software Testing is a skilled activity done to find important information about the quality of the software you are testing and to provide that information to stakeholders so that they are able to make good, informed decisions. You say “Testing is measuring the quality of the software by ensuring it meets the needs of the final client”; well, let me ask you, how are you measuring that quality? Do tell me, because I am very interested in your response.

          I don’t disagree, and did not disagree in my post, that the quality of the testing derived from written requirements is extremely valuable. I do think that a good tester is one who can test and provide quality information with or without written requirements, given the context.

          Actually refusing to engage in testing efforts in situations where testing can provide value even without written requirements is a huge loss of time and $$$.

          Hunting for information can be important regardless of whether written requirements are available. Are you saying that when you have written requirements, you yourself don’t encourage your testers to speak to developers, BAs and architects to find out more information to possibly use in test design? You don’t encourage your testers to speak to developers about possible impacts and technical details, to think and ask good questions, so they can consider these things when they test? “Hunting for information is only applicable when you don’t know the application” is one of the most irresponsible and ignorant statements I’ve read in ages!

          As for measuring risk while testing only written requirements: how are you measuring that risk? How are you accounting for information that may expose or prevent risks but has nothing to do with the written requirements? As for clients blowing cash right and left, you know what I see clients who blow cash like that doing? Paying “testers” to spend countless hours writing mind-numbing robot test cases and 90-page test artifacts that nobody will ever read. And you know what happens after that? There is no money left to do actual testing, you know, that activity that is done to provide quality information.
