  • 1.  Managing variants with shared requirements / test runs

    Posted 04-11-2023 02:59

    Hi Jama experts,

    Hope you can offer some best practice for the following scenario, or guide me to existing threads with the same scenario.

    We currently have a system which consists of a set of modules. This makes up "Product A", with test runs on both module and system level.

    We plan to create a new variant of the product (same product family), "Product B", with added capabilities but re-using a lot of the existing documentation (requirements, test protocols, even test runs). Product B will also be subject to external submissions, so anything created in Jama should be distinguishable and easily exported (from a baseline).

    Here is the question:
    How should Jama be utilized to best distinguish between Product A and Product B?
    a. By tagging requirements that are unique to each variant and creating separate baselines for each variant
    b. By using the categories option to indicate which variant a given item relates to
    c. By using re-use and sync to separate the variants, thus adding distinct items to each "jama project"
    d. By distinguishing within test plans, e.g., separate test plans for each variant
    e. Something else?

    It is important to note that existing test runs from Product A could also be applicable to Product B, i.e., still valid without re-run.

    Below I have outlined how it would look with Product A and Product B within the same Jama project:

    Jacob Brodal
    Systems Engineer

  • 2.  RE: Managing variants with shared requirements / test runs

    Posted 04-20-2023 07:02

    Hello Jacob,
    I've just seen your very interesting question in the forum.
    In our company (medical device, infusion pumps) we have not yet reached the scenario you are describing; however, we will soon reach that point, so I am very interested in reaching a conclusion.
    I have a couple of questions on your approach before I could provide you with a potential answer.
    For example,
    1. You have the concept of "Verification Protocols". What is a "Verification Protocol" in your view? Is it a Test Plan with associated Test Cases? Or is it something else (another item type)?
    2. You mention that the reports are being baselined. How can you baseline a report? That is not technically feasible in Jama. Do you mean that the test runs have been put under review and you use the review baseline? Or something else?

    Warm regards

  • 3.  RE: Managing variants with shared requirements / test runs

    Posted 10-25-2023 07:41

    Good afternoon Jacob,

    Great topic, thank you for asking.

    Your use scenario can be addressed with the Reuse functionality.

    Your option C could be a good starting point for working out a solution.

    Let me share a diagram that outlines a potential configuration approach to your problem.

    Have a great day


    Szabolcs Agai

  • 4.  RE: Managing variants with shared requirements / test runs

    Posted 10-26-2023 01:55

    Hello Saby,
    Thank you for sharing this. I am struggling to see how this implementation can cover the requirement to have common test runs (if those runs are executed in Project A, they should be available in Project B, or vice versa).
    Best regards


  • 5.  RE: Managing variants with shared requirements / test runs

    Posted 10-26-2023 03:10

    Hello Dimitrios,

    Thank you for the feedback and for asking.

    There are many ways to do that; one is to copy Test Run contents over to the Test Case item and carry that forward to newer product development projects. New product projects can then decide whether they rely on previous test results or not, and what process they want to build around reusing test run results.

    Either an XLS roundtrip (a bit tedious) or the Jama Connect Interchange XLS function can take us there.
    Let me share a simple example of this.
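    To make the roundtrip idea concrete, here is a minimal sketch of the "pick the latest run per test case" step you would do between exporting test run rows to XLS and importing the values back onto the Test Case items. All field and column names here (`testCase`, `runId`, `latestRunStatus`, etc.) are illustrative assumptions, not actual Jama field names.

```python
# Hypothetical sketch: condense exported test run rows down to one
# "latest result" record per test case, shaped like the custom fields
# described in this thread. Column names are assumptions, not Jama's.
from datetime import date

def latest_run_per_test_case(test_runs):
    """Return, per test case ID, the custom-field values describing
    the most recently executed test run for that test case."""
    latest = {}
    for run in test_runs:
        tc = run["testCase"]
        current = latest.get(tc)
        # Keep only the run with the newest execution date.
        if current is None or run["executedDate"] > current["executedDate"]:
            latest[tc] = run
    # Shape the result as the hypothetical custom fields on the Test Case.
    return {
        tc: {
            "latestRunId": run["runId"],
            "latestRunStatus": run["status"],
            "latestRunPlan": run["testPlan"],
        }
        for tc, run in latest.items()
    }

# Rows as they might appear in an exported XLS sheet.
runs = [
    {"testCase": "TC-1", "runId": "TR-10", "status": "PASSED",
     "executedDate": date(2023, 9, 1), "testPlan": "TP-A"},
    {"testCase": "TC-1", "runId": "TR-15", "status": "FAILED",
     "executedDate": date(2023, 10, 1), "testPlan": "TP-A"},
    {"testCase": "TC-2", "runId": "TR-11", "status": "PASSED",
     "executedDate": date(2023, 9, 2), "testPlan": "TP-A"},
]
print(latest_run_per_test_case(runs)["TC-1"]["latestRunStatus"])  # FAILED
```

    The resulting dictionary is what you would write back into the Test Case custom fields (via the XLS import or Interchange), so the reused items carry their last known result into the next project.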

    Have a great day


    Szabolcs Agai

  • 6.  RE: Managing variants with shared requirements / test runs

    Posted 10-26-2023 23:42

    Hello Saby,

    Many thanks for sharing your approach. If I understand it correctly, you have added extra custom fields to the Test Case item type which hold the test run execution details (ID of the latest Test Run, status of the latest Test Run, Test Plan of the latest Test Run, etc.).

    So after you perform test runs on project A, you are

    1. updating those custom fields on the Test Case items in Project A
    2. syncing those Test Case items back to the Parent(Common) Project
    3. syncing those changes from the Parent(Common) Project to Project B

    Are the above assumptions correct? 

    Warm regards


  • 7.  RE: Managing variants with shared requirements / test runs

    Posted 10-27-2023 08:27
    Edited by Geoffrey Clemm 10-27-2023 08:28

    The approach I usually recommend for sharing test results between multiple variants of a product is to define the verification of a given product variant to be the results from a specified set of test plans. If the results of a particular test plan apply to more than one variant of a product, then that plan would be included in the test plan list of each of those product variants. If a particular set of test cases can be reused in multiple product variants, but each variant needs to rerun the tests (since the test results do not apply to all the variants), then you would create a separate test plan for each variant with those test cases, so that each variant would have its own (distinct) set of test results.
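    The "verification of a variant = results of a set of test plans" idea above can be sketched as a small data model. This is an illustrative sketch only (plain dictionaries, not the Jama API); the plan names and statuses are made up for the example.

```python
# Illustrative sketch of the recommendation above: each product variant's
# verification is defined by a list of test plans, and a shared plan
# simply appears in more than one variant's list (no re-run needed).

# Results per test plan: plan ID -> statuses of its test runs.
plan_results = {
    "TP-shared-modules": ["PASSED", "PASSED"],   # module-level, shared
    "TP-product-A-system": ["PASSED"],           # system-level, A only
    "TP-product-B-system": ["PASSED", "FAILED"], # system-level, B only
}

# Each variant lists the plans whose results verify it; the shared
# module-level plan is reused by both variants.
variant_plans = {
    "Product A": ["TP-shared-modules", "TP-product-A-system"],
    "Product B": ["TP-shared-modules", "TP-product-B-system"],
}

def variant_verified(variant):
    """A variant is verified when every run in every listed plan passed."""
    return all(
        status == "PASSED"
        for plan in variant_plans[variant]
        for status in plan_results[plan]
    )

print(variant_verified("Product A"))  # True
print(variant_verified("Product B"))  # False
```

    If a set of test cases had to be rerun per variant, you would add a distinct plan (say, a hypothetical "TP-product-B-rerun") to Product B's list rather than sharing the plan, so each variant keeps its own results.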

    Geoffrey Clemm
    Jama Software

  • 8.  RE: Managing variants with shared requirements / test runs

    Posted 10-29-2023 22:08
    Edited by Steve Sutton 10-29-2023 23:38

    Test Plan Setup – 1 vs Many For Multi-Release Product

    If I may, I would like to ask for some added information related to my project and its considerations, specifically around the setup and use of Test Plans, much like the original question.

    My project consists of a main software application under development that will go through iterative releases while capability is added to the product throughout its development cycle until it is complete and fully released.

    Each release will have added capability and may be released to the customer as a release milestone towards full acceptance.

    Additionally, some items have specific locations where they need to be tested, such as specific LABs. As an example, a single release may have X verification procedures that need to occur in LAB 1 and X verification procedures that need to be conducted in LAB 2 (no procedures require the use of more than one LAB).

    Currently I have built Verification procedures that will cover all requirements. However, each iterative release will only have a given number of requirements completed and able to be verified.


    Release 1 – X requirements satisfied and able to be verification tested for acceptance

    Release 2 – X additional requirements satisfied in this release and able to be verification tested for acceptance

    Release 3…..

    Final Release – Remaining requirements satisfied and able to be verification tested for acceptance

    My Points of concern for clarity 

    • What is the best Test Plan approach? A single Test Plan for the whole Final Release, or a "multi-Test-Plan" approach where each test plan covers only a single release's requirements and verifications?
    • Separately, if a multi-plan approach is best, are there recommendations on test plan allocation based on Release, LAB testing location, or both, i.e., Test Plan 1 - Release X - LAB 1, Test Plan 1 - Release X - LAB 2, etc.?
    • I will need to provide coverage and traceability to the customer for each iterative release given to them. Is one approach better than the other?
    • I am taking into consideration the export reports, and looking to simplify the output for requirement traceability and coverage of each release. I can manually combine multiple reports, but if there are easier ways to let Jama do that for me via proper setup of test plans and configured custom export templates, all the better.

    Steve Sutton
    L3Harris - ForceX

  • 9.  RE: Managing variants with shared requirements / test runs

    Posted 10-30-2023 03:35

    Hello Dimitrios,
    All of this really depends on what you would like to achieve, and on the rules and processes in your development process around reusing test case results.

    The project I know of took a simpler approach than the one you describe in steps 2 and 3. They were updating those custom fields from test results, then duplicated the project as a non-mergeable branch and moved the new release development forward in that duplicated project.

    So, even though project duplication does not duplicate test results, and reuse does not consider test results, there is still a way to deal with test result reusability.

    Have a great day


    Szabolcs Agai