Creating relationships between Requirements and Test Runs

  • 1.  Creating relationships between Requirements and Test Runs

    Posted 26 days ago
    Is there an easy way to create relationships between a Set of (requirement) Items and the Test Runs that have verified those requirements?

    I can easily enough create relationships between my Requirements and the Test Cases written to verify them (which is good and useful) BUT what I really want to do is relate my Requirements to the actual Test Runs that hold the results of the tests conducted.

    The way that I imagine this working is something like this:
    1. Relate all of the Requirements within a given Set of Requirements to their respective Test Cases
    2. Create a Test Plan containing a bunch of Test Runs derived from the Test Cases in (1)
    3. Perform a "bulk relating" step to relate the Requirements to the Test Runs in my Test Plan (see the sketch after this list). Maybe something like:
      1. Get a List View of my Set of Requirements (filtered so I'm looking ONLY at Requirements - no Folders or Texts)
      2. Select all the Requirements and then invoke a "relate to Test Plan" mechanism that allows me to point to my Test Plan
      3. If Requirement A is related to Test Case A, a relationship is created to any Test Run derived from Test Case A that is within the pointed-to Test Plan
    4. I can now create a Filter to see the sub-set of Requirements that have FAILED downstream Test Runs
    This seems like something that might already be possible but I just haven't figured out how to do it. Is there existing advice on how to do this?
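
    To make step 3 concrete, here is roughly what I imagine the "bulk relating" mechanism doing behind the scenes, sketched in Python against the REST API. The endpoint shapes, the field that links a Test Run back to its Test Case, and the instance URL are all my assumptions, so treat this as a sketch to check against the API documentation rather than working tooling:

    import requests

    BASE = "https://example.jamacloud.com/rest/v1"  # hypothetical instance URL
    AUTH = ("username", "password")                 # placeholder credentials

    def bulk_relate(requirement_ids, test_plan_id):
        # All Test Runs in the Test Plan (pagination omitted for brevity).
        runs = requests.get(f"{BASE}/testruns",
                            params={"testPlan": test_plan_id},
                            auth=AUTH).json()["data"]

        # Map each originating Test Case id to the Test Runs derived from it.
        # The exact field name holding the Test Case id is a guess.
        runs_by_case = {}
        for run in runs:
            case_id = run["fields"].get("testCase")
            runs_by_case.setdefault(case_id, []).append(run["id"])

        for req_id in requirement_ids:
            # Downstream relationships of the Requirement (to its Test Cases).
            rels = requests.get(f"{BASE}/items/{req_id}/downstreamrelationships",
                                auth=AUTH).json()["data"]
            for rel in rels:
                # Relate the Requirement to every Test Run in this Test Plan
                # derived from a Test Case it is already related to.
                for run_id in runs_by_case.get(rel["toItem"], []):
                    requests.post(f"{BASE}/relationships", auth=AUTH,
                                  json={"fromItem": req_id, "toItem": run_id})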

    Thanks for your help and advice,

    ------------------------------
    Ian Webb
    Enphase Energy, Inc.
    ------------------------------


  • 2.  RE: Creating relationships between Requirements and Test Runs

    Posted 25 days ago
    Ian:

    Hi there! I haven't been able to play around with this yet, but I wanted to chime in. The main outcome we want is to relate your requirements to the actual test runs so that you can see the results of the tests that validated your requirements, right? And I am assuming you want this data in the Test Cases/Runs to remain there for informational purposes? How frequently are you wanting to go through this process: once a quarter, once a year, several times a day?

    In order for this information to persist in the Test Run, you will need to set up your test management center so that each test run is only run once (or at least, once you are done with the test case and have related it to the Requirement, no one uses that same test case again). If you reuse a test run within a test case, it will overwrite whatever information was in the old test run with the information from the current run.

    You could do this with Reuse: reuse the Test Cases, making sure everyone in your org knows not to use the old ones. But I can see no real way to lock this down if someone has permissions to the Test Cases.

    This is why I asked about frequency: if this is a process you only go through once a quarter or once a year (and depending on how granular in scope your projects are organized), you could simply lock down the whole project's permissions once these requirements are where you want them, then duplicate the project for the next release. You could repeat this process each cycle, preserving the information in the old, archived projects.

    Alternatively, you could use our Community reports, such as Test Results Report (test cycle), Test Results Report (test case), and/or Test Plan Report. This is probably my favorite option (a rough sketch of step 2 follows the list):
    1. Use these reports to download a snapshot in time of the steps, expected results, notes, etc.
    2. Upload those reports as attachments to the Requirements they need to be related to.
    3. If you really need this to become an "item" with an official relationship to the requirement, you could upload the results as a Jama item and then relate it to the requirement that way.
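
    For step 2, here is a minimal Python sketch of what that upload could look like through the REST API. The attachment endpoints and where the new attachment's id appears in the response are from memory, so please verify them against the REST documentation for your instance:

    import requests

    BASE = "https://example.jamacloud.com/rest/v1"  # hypothetical instance URL
    AUTH = ("username", "password")                 # placeholder credentials

    def attach_report(project_id, requirement_id, report_path):
        # 1. Create an empty attachment in the project.
        created = requests.post(f"{BASE}/projects/{project_id}/attachments",
                                auth=AUTH,
                                json={"fields": {"name": "Test results snapshot",
                                                 "description": "Report export"}})
        attachment_id = created.json()["meta"]["id"]  # id location is a guess

        # 2. Upload the exported report file into the attachment.
        with open(report_path, "rb") as f:
            requests.put(f"{BASE}/attachments/{attachment_id}/file",
                         files={"file": f}, auth=AUTH)

        # 3. Link the attachment to the Requirement.
        requests.post(f"{BASE}/items/{requirement_id}/attachments",
                      auth=AUTH, json={"attachment": attachment_id})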

    Ian, it all depends on how you want to use the information and how often you want to go through this process. I hope I gave you some food for thought here.

    Thoughts?




    ------------------------------
    Chloe Elliott
    Jama Software
    Portland OR
    ------------------------------



  • 3.  RE: Creating relationships between Requirements and Test Runs

    Posted 24 days ago
    Hi Chloe,

    Thanks for your interest in this. Responses to your questions and comments follow, but first I'd like to check that I'm not suffering from a fundamental misunderstanding of how Test Management should be done in Jama ...

    Let's say that within a Jama Project (I'll call it "System"), I've got two Sets of Requirements, one for each of the two different products that make up "System" - one Set for "Widget" and the other Set for "Gadget."
    I've also got a Set of Test Cases, one of which is "Drop Test."
    Now, I create two Test Plans - one for Widget and one for Gadget. I assign "Drop Test" to both of my Test Plans because, of course, both Widget and Gadget need to be drop tested. That is, each of the two Sets of Requirements contains a Requirement along the lines of "Product (either Gadget or Widget) shall withstand Drop Test."
    So, the one Test Case "Drop Test" has now spawned two Test Runs - one in each of my Test Plans.

    This scenario illustrates how I think I should be using my Set of Test Cases - basically as a 'library' from which to select the tests I want to include in my (multiple) Test Plans. That is, I need only one Test Case called "Drop Test" even though I'm intending to actually perform two Drop Tests - one on Gadget and one on Widget.

    Now let's consider the coverage / traceability implications in this scenario.
    I've got two Requirements - "Widget shall withstand Drop Test" and "Gadget shall withstand Drop Test" - which are both Related to the single Test case "Drop Test." So far, so good; I can see that I have 'coverage' for my two Requirements.
    BUT, what I REALLY want to be able to see downstream of the Requirement "Widget shall withstand Drop Test" is not the Test Case "Drop Test," but rather the Test Run "Drop Test" that is in the Test Plan for Widget. (And I DON'T want to see the Test Run "Drop Test" that is in the Test Plan for Gadget.)

    So, finally, to my question: Is the 'library of Test Cases' situation I've described, where one Test Case is used to create many Test Runs, in line with the way that this module is designed to work? If not, then I've really misunderstood the design intentions!



    Now for some responses ...

    "The main outcome we want is to relate your requirements to the actual test runs so that you can see the results of the test which validated your requirements, right?"

    Correct! I can create a Relationship between a given Requirement and the Test Run that actually verifies it, but doing this manually to relate a whole Set of Requirements to all the Test Runs in the relevant Test Plan would be mightily laborious. That's why I'm asking about a quick way to do it.

    "And I am assuming that you want this data in the Test Case/Runs to remain there for informational purposes?"

    I wrote the long and rambling 'scenario question' at the top of this message after reading this question of yours. I would certainly want the data within a Test Run to remain static once I've finished performing the test. But you've lumped Test Cases & Test Runs together in your question, which suggests to me that my notion of a Test Case being just a 'data-free template' for creating Test Runs (which are where the actual test data is stored) might be incorrect.


    I've already written quite a bit, Chloe, so I'll stop here for the moment.

    Thanks,

    ------------------------------
    Ian Webb
    Enphase Energy, Inc.
    ------------------------------



  • 4.  RE: Creating relationships between Requirements and Test Runs

    Posted 23 days ago
    Ian:

    Whew! I think you have a really great grasp of the Test Center! The workflow is so dependent on what you want to do with it.

    I would say that you can use the Test Case as a "drag-drop template" in some cases. For instance: let's say you were running an experiment with several control groups, each trying to achieve the same outcome, but each with a different theory on how to get there. You put them all to the test at the same time using different Test Plans, all utilizing the same Test Case (no matter the method, these are the metrics they need to achieve). In this scenario, when all the test plans use the same test case and everyone is executing test runs, passing and failing them, the outcome is: the Test Case Status will reflect the latest status of whichever test run, in whichever test plan, was last run (pass or fail).

    If you use the Test Case as a template for all the different Test Plans, you will not be able to rely on the Status (Fail, Pass) of that Test Case to tell you whether Requirement A passed while Requirement B failed. The 'shared' Test Case will reflect the latest Status from among ALL the Test Runs derived from it, across all the Requirements that share it. (I think this is why I tend to conflate the two terms: the Test Run is a variable, changeable object derived from the Test Case.)

    In your use case, it is sounding like you really need that latest status within the Test Case to reflect the unique Requirements you are qualifying.

    Ok, so it sounds like you want to be able to:

    1. Filter for Requirements that have a particular Test Case Status. To set this up so that the Test Case Status is actually "unique" to the Requirement, you will need to create a unique Test Case for each Requirement. You can certainly use a Test Case template, but you will have to utilize "Reuse" instead of "drag and drop" in the Test Center. (Ian, it's possible you are already utilizing "Reuse" here.)

    2. You want the historical information of the Test Run for your Requirement. Once you set up your Test Cases to be "unique" to each Requirement as above, you can then relate the Requirement directly to the Test Run as well as to the Test Case. You just need to change your Relationship Rules to allow this relationship.

    Ian, thank you for all the information you have shared here; that took time and effort. I only want to help, but I also realize that my visibility is a bit "limited" on this Community platform. I think you've got this! Please let me know if you want me to reach out to your Account Manager to look into any alternate help for your organization. Let me know how I can help.

    Best,

    ------------------------------
    Chloe Elliott
    Jama Software
    Portland OR
    ------------------------------



  • 5.  RE: Creating relationships between Requirements and Test Runs

    Posted 2 days ago
    Hi again Chloe,

    (It's taken me a while to respond - sorry about that. Please don't think it means that I've lost interest in this - I'm still very interested. I note that your most recent message said that you might not be able to provide further advice on this, so yes, please do pass this on to someone else if you think another perspective might be useful.)

    Here are some responses and comments to your last message ...

    "... Test Case Status will reflect the latest status of whatever test run in whatever test plan was last ran (pass or fail)."

    Agreed! But I think that it is ONLY the 'Status' of the most recent Test Run that gets "written back" to the 'Test Case Status' of the relevant Test Case - not any of the other information from that most recent Test Run (the 'Results' that the tester has written into the 'Steps' table in the Test Run, for example). So the Test Case is never an actual record of a Test Run - not even the most recent one. True?

    "If you use the Test Case as a template for all the different Test Plans, you will not be able to rely on the Status (Fail, Pass) of that Test Case to inform you if Requirement A passed while Requirement B failed."

    Agreed! But I wouldn't expect it to, given that, as discussed above, the Test Case is never an actual record of the conduct of an actual test. (The fact that the most recent 'Status' does get "written back" to the Test Case, so that it becomes a partial record of the most recent Test Run, seems a bit odd to me, to be candid.)
    I've got Test Run "Drop Test" in Test Plan "Widget" and it's the data in that Test Run that I'm interested in when I want to verify Requirement A ("Widget shall withstand Drop Test").
    Similarly, it is Test Run "Drop Test" in Test Plan "Gadget" that I'm interested in when I want to verify Requirement B ("Gadget shall withstand Drop Test").

    "In your use case, it is sounding like you really need that latest status within the Test Case to reflect the unique Requirements you are qualifying."

    I don't think this is what I want. As discussed above, I really don't think that the Test Case is where I want to look if I want to see the results of a test that has been run. Even if I have just one Requirement, one Test Case, and one Test Run, it's the Test Run I want to look at to check that my Requirement has been verified.
    This brings me back to where this started - I want to be able to Relate my "Widget shall withstand Drop Test" Requirement to the actual Test Run that verified it - the Test Run "Drop Test" in Test Plan "Widget".

    There is a dialog I've noticed that gets me REALLY CLOSE to what I want to be able to do:
    In that dialog, I can see that the Test Run I'm working in right now has been derived from a particular Test Case, and I can see whatever Requirement(s) are upstream of that Test Case.
    What I really want to be able to do in this dialog is tick the check box that I wish I could see next to the 'Upstream Items' list and then click the imaginary button that says "Relate to this Test Run."

    There are some other subtleties that I'd like to get into, but I've already written quite a bit and I'm mindful of your warning that you might not have much more to say on this topic, so I'll finish here.


    ------------------------------
    Ian Webb
    Enphase Energy, Inc.
    ------------------------------



  • 6.  RE: Creating relationships between Requirements and Test Runs

    Posted 2 days ago
    Ian:

    You can directly relate your Requirement item type to the Test Run, but you have to force it. Go to Admin > Relationships and make a new relationship rule pairing your Requirement item type with the Test Run item type. This will allow the Test Run to be directly related to whatever Requirement item type you choose.
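
    Once that rule exists, the same direct relationship can also be created through the REST API. A minimal Python sketch (the instance URL and credentials are placeholders; if your rule uses a specific relationship type, its id would come from your own configuration):

    import requests

    BASE = "https://example.jamacloud.com/rest/v1"  # hypothetical instance URL
    AUTH = ("username", "password")                 # placeholder credentials

    # Relate Requirement 123 directly to Test Run 456. If your new rule uses a
    # specific relationship type, add its id as "relationshipType" in the body.
    requests.post(f"{BASE}/relationships", auth=AUTH,
                  json={"fromItem": 123, "toItem": 456})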

    What I was also attempting to communicate above is that even if you relate the Requirement directly to the Test Run, that information will get overwritten if you reuse the Test Run. The Test Run is derived from the Test Case; therefore, you cannot "drag and drop" the same Test Case for different Requirements if you want the Test Run information (which you related to a Requirement) to keep its historical record. The Test Run will erase the old information and reflect the current run.

    Best,

    ------------------------------
    Chloe Elliott
    Jama Software
    Portland OR
    ------------------------------