Managing variants with shared requirements / test runs
Hi Jama experts,
I hope you can offer some best practice for the following scenario, or point me to existing threads covering the same scenario.
We currently have a system which consists of a set of modules. Together these make up "Product A", with test runs at both module and system level.
We plan to create a new variant of the product (same product family), "Product B", with added capabilities, but reusing a lot of the existing documentation (requirements, test protocols, even test runs). Product B will also be subject to external submissions, so anything created in Jama should be distinguishable and easily exportable (from a baseline).
Here is the question:
How should Jama be utilized to best distinguish between Product A and Product B?
a. By tagging requirements that are unique to each variant and creating separate baselines for each variant
b. By using the categories option to indicate which variant a given item relates to
c. By using re-use and sync to separate the variants, thus adding distinct items to each Jama project
d. By distinguishing within test plans, e.g., separate test plans for each variant
e. Something else?
It is important to note that existing test runs from Product A could also be applicable to Product B, i.e., still valid without re-run.
Below I have outlined how it would look with Product A and Product B within the same Jama project:
Jacob Brodal
Systems Engineer
3Shape
------------------------------
Comments
Hello Jacob,
I've just seen your very interesting question in the forum.
In our company (medical device, infusion pumps) we have not yet reached the scenario you are describing; however, we will soon reach that point, so I am very interested in reaching a conclusion.
I have a couple of questions on your approach before I could provide you with a potential answer.
For example,
1. You have the concept of "Verification Protocols". What is a "Verification Protocol" in your view? Is it a Test Plan with associated Test Cases? Or is it something else (another item type)?
2. You mention that the reports are being baselined. How can you baseline a report? That is not technically feasible in Jama. Do you mean that the test runs have been put under review and you use the review baseline? Or something else?
Warm regards
-------------------------------------------
Dimitrios
Szabolcs Agai (Jama Staff):
Good afternoon Jacob,
Great topic, thank you for asking.
Your use scenario can be addressed with the Reuse functionality.
Your option C could be a good starting point for working out a solution.
Let me share a diagram that outlines a potential configuration approach to your problem.
Have a great day
Saby
------------------------------
Szabolcs Agai
------------------------------
Hello Saby,
Thank you for sharing this. I am struggling to see how this implementation can cover the requirement to have common test runs (if those runs are executed on Project A they should be available on Project B, or vice versa).
Best regards,
Dimitrios
Hello Dimitrios,
Thank you for the feedback and for asking.
There are many ways to do that. One is to copy the Test Run contents over to the Test Case item and carry that forward to newer product development projects. New product projects can then decide whether or not they rely on previous test results, and what process they want to build around reusing test run results.
Either an XLS round trip (a bit tedious) or the Jama Connect Interchange XLS function can take us there.
Let me share a simple example on this.
Have a great day
Saby
------------------------------
Szabolcs Agai
------------------------------
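As a rough illustration of this copy-forward idea, the data movement can also be sketched against the Jama REST API. The instance URL, the custom-field name, and the simplified record shape below are assumptions for illustration, not details taken from this thread:

```python
import json
import urllib.request

# Hypothetical Jama instance URL and custom-field name -- adjust for your setup.
BASE_URL = "https://example.jamacloud.com/rest/v1"
STATUS_FIELD = "last_run_status"  # assumed custom field on the Test Case item type

def latest_status_by_case(runs):
    """Reduce a list of test-run records to the most recent status per test case.

    Each record is assumed (a simplified shape, not the exact REST payload) to
    carry 'testCase' (item id), 'modifiedDate' (ISO 8601 string, so lexical
    comparison orders correctly), and fields['testRunStatus'].
    """
    latest = {}
    for run in runs:
        case_id = run["testCase"]
        prev = latest.get(case_id)
        if prev is None or run["modifiedDate"] > prev["modifiedDate"]:
            latest[case_id] = run
    return {cid: r["fields"]["testRunStatus"] for cid, r in latest.items()}

def patch_case_status(case_id, status, token):
    """Write the status into the custom field via PATCH /items/{id} (sketch only)."""
    ops = [{"op": "replace", "path": f"/fields/{STATUS_FIELD}", "value": status}]
    req = urllib.request.Request(
        f"{BASE_URL}/items/{case_id}",
        data=json.dumps(ops).encode("utf-8"),
        method="PATCH",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The reduction step is the important part: once the latest status per test case lives in a plain field, it survives project duplication and reuse/sync, which is what makes the result portable to the next variant's project.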
Hello Saby,
Many thanks for sharing your approach. If I understand it correctly, you have added extra custom fields to an item of type Test Case which hold the test run execution details (ID of latest Test Run, Status of latest Test Run, Test Plan of latest Test Run, etc.).
So after you perform test runs on project A, you are
- updating those custom field Test Cases on Project A
- syncing those Test Case items back to the Parent(Common) Project
- syncing those changes from the Parent(Common) Project to Project B
Are the above assumptions correct?
Warm regards,
Dimitrios
The approach I usually recommend for sharing test results between multiple variants of a product is to define the verification of a given product variant to be the results from a specified set of test plans. If the results of a particular test plan apply to more than one variant of a product, then that plan would be included in the test plan list of each of those product variants. If a particular set of test cases can be reused in multiple product variants, but they each need to rerun the tests (since the test results do not apply to all the variants), then you would create a separate test plan for each variant with those test cases, so that each variant would have its own (distinct) set of test results.
------------------------------
Geoffrey Clemm
Jama Software
------------------------------
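The plan-set idea above can be sketched in a few lines: each variant's verification is defined as the union of results from its designated test plans, so a shared plan's runs count for every variant that lists it, while variant-specific plans hold the reruns. The plan ids and results below are hypothetical, for illustration only:

```python
# Hypothetical variant-to-plan mapping -- a shared plan appears in both lists.
VARIANT_PLANS = {
    "Product A": ["TP-COMMON", "TP-A-ONLY"],
    "Product B": ["TP-COMMON", "TP-B-ONLY"],
}

# Hypothetical executed results per plan (test case id -> status).
PLAN_RESULTS = {
    "TP-COMMON": {"TC-1": "PASSED", "TC-2": "PASSED"},  # counts for both variants
    "TP-A-ONLY": {"TC-3": "PASSED"},
    "TP-B-ONLY": {"TC-4": "FAILED"},
}

def verification_results(variant):
    """A variant's verification is the union of results from its designated plans."""
    merged = {}
    for plan in VARIANT_PLANS[variant]:
        merged.update(PLAN_RESULTS.get(plan, {}))
    return merged
```

Here Product B inherits the common plan's results without a rerun, while TC-3's result stays scoped to Product A because its plan is only on Product A's list.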
Test Plan Setup – 1 vs Many For Multi-Release Product
May I ask for some guidance related to my project, specifically around the setup and use of Test Plans, much like the original question.
My project consists of a main software application under development that will go through iterative releases as capability is added throughout its development cycle, until it is complete and fully released.
Each release will have added capability and may be delivered to the customer as a release milestone towards full acceptance.
Additionally, some items must be tested in specific locations, such as particular LABs. As an example, a single release may have X verification procedures that need to occur in LAB 1 and X verification procedures that need to be conducted in LAB 2 (no procedure requires more than one LAB).
Currently I have built Verification procedures that will cover all requirements. However, each iterative release will only have a given number of requirements completed and able to be verified.
Example:
Release 1 – X requirements satisfied and able to be verification tested for acceptance
Release 2 – X additional requirements satisfied in this release and able to be verification tested for acceptance
Release 3…..
Final Release – Remaining requirements satisfied and able to be verification tested for acceptance
My points of concern, for clarity:
- What is the best Test Plan approach? A single Test Plan for the whole Final Release, or a "Multi-Test Plan" approach where each test plan covers only a single release's requirements and verifications?
- Separately, if a multi-plan approach is best, any recommendations on test plan allocation by Release, LAB testing location, or both, i.e., Test Plan 1 - Release X - LAB 1, Test Plan 2 - Release X - LAB 2, etc.?
- I will need to provide coverage and traceability to the customer for each iterative release given to them. Is one approach better than the other?
- I am taking the export reports into consideration, and looking to simplify the output for requirement traceability and coverage of each release. I can manually combine multiple reports, but if there are easier ways to let Jama do that for me via proper setup of test plans and configured custom export templates, all the better.
------------------------------
Steve Sutton
L3Harris - ForceX
FL
------------------------------
Hello Dimitrios,
All of this really depends on what you would like to achieve, and on the rules and processes your development process defines around reusing test case results.
The project I know of took a simpler approach than you describe in steps 2 and 3. They updated those custom fields from the test results, then duplicated the project as a non-mergeable branch and moved new release development forward in that duplicated project.
So, even though project duplication does not duplicate test results, and reuse does not consider test results, there is still a way to deal with test result reusability.
Have a great day
Saby
------------------------------
Szabolcs Agai
------------------------------
-------------------------------------------
Original Message:
Sent: 10-26-2023 23:41
From: Dimitrios Pananakis
Subject: Managing variants with shared requirements / test runsHello Saby,
Many thanks for sharing your approach. If I understand it correctly you have added extra custom fields to an item of type Test Case which hold the test run execution details (ID of latest Test Run, Status of latest Test Run, Test Plan of latest Test Run, etc)
So after you perform test runs on project A, you are
- updating those custom field Test Cases on Project A
- syncing those Test Case items back to the Parent(Common) Project
- syncing those changes from the Parent(Common) Project to Project B
Are the above assumptions correct?
Warm regardsDimitrios
Original Message:
Sent: 10-26-2023 03:09
From: Szabolcs Agai
Subject: Managing variants with shared requirements / test runsHello Dimitrios,
Thank you for the feedback and for asking.
There are many ways to that, one is to copy over Test Run contents to the Test Case item and carry that forward to newer product development projects. New product projects can decide than it they rely on previous test results or not and what process they want to build around of reusing test run results.
Either an XLS roundtrip (bit tedious) or using Jama Connect Interchange XLS function can take us there.
Let me share a simple example on this.Have a great day
Saby
------------------------------
Szabolcs Agai
Original Message:
Sent: 10-26-2023 01:55
From: Dimitrios Pananakis
Subject: Managing variants with shared requirements / test runsHello Saby,
Thank you for sharing this. I am struggling to see how this implementation can cover the requirement to have common test runs (if those runs are executed on Project A they should be available on Project B, or vice versa.
Best regardsDimitrios
Original Message:
Sent: 10-25-2023 07:40
From: Szabolcs Agai
Subject: Managing variants with shared requirements / test runsGood afternoon Jacob,
Great topic, thank you for asking.
Your use scenario can be addressed with the Reuse functionality.
Your option C., is that could be good starting point to work out a solution.
Let me share a diagram that outlines a potential configuration approach to your problem.
Have a great day
Saby
------------------------------
Szabolcs Agai
-
For this situation, I would recommend having each release be a separate Cycle in a single Test Plan, and having a separate Group for the Test Cases added in a given Release. Then you can decide for a given Cycle/Release which Test Cases will run in that Cycle/Release (minimally, you will include the Group for that Release, but you may decide to rerun Test Cases from previous Releases).
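As a quick sketch of how that selection rule works (purely illustrative names; this just models the rule, it is not a Jama API):

```python
# Hypothetical model: one Test Plan, one Group per release holding the
# Test Cases introduced in that release.
test_plan = {
    "groups": {
        "Release 1": ["TC-1", "TC-2"],
        "Release 2": ["TC-3"],
    },
}

def cases_for_cycle(plan, release, rerun=()):
    """A Cycle for a release minimally runs that release's Group,
    optionally rerunning the Groups of selected earlier releases."""
    cases = list(plan["groups"][release])
    for earlier in rerun:
        cases += plan["groups"][earlier]
    return cases
```

So the Release 2 Cycle would minimally run `TC-3`, and could additionally pull in the Release 1 Group if those cases need re-verification.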
------------------------------
Geoffrey Clemm
Jama Software
------------------------------
-------------------------------------------
Original Message:
Sent: 10-29-2023 22:07
From: Steve Sutton
Subject: Managing variants with shared requirements / test runs
Test Plan Setup: One vs. Many for a Multi-Release Product
If I may, I'd like to ask for some added information related to my own project, specifically around the setup and employment of Test Plans, much like the original question.
My project consists of a main software application under development that will go through iterative releases while capability is added to the product throughout its development cycle until it is complete and fully released.
Each release will have added capability and may be released to the customer as a release milestone towards full acceptance.
Additionally, some items have specific locations where they need to be tested, such as specific LABs. As an example, a single release may have X verification procedures that need to occur in LAB 1 and X verification procedures that need to be conducted in LAB 2 (no procedure requires the use of more than one LAB).
Currently I have built Verification procedures that will cover all requirements. However, each iterative release will only have a given number of requirements completed and able to be verified.
Example:
Release 1 – X requirements satisfied and able to be verification tested for acceptance
Release 2 – X additional requirements satisfied in this release and able to be verification tested for acceptance
Release 3…..
Final Release – Remaining requirements satisfied and able to be verification tested for acceptance
My points of concern, for clarity:
- What is the best Test Plan approach? A single Test Plan for the whole Final Release, or a "multi-Test Plan" approach where each Test Plan covers only a single release's requirements and verifications?
- Separately, if a multi-plan approach is best, are there recommendations on Test Plan allocation based on release, LAB testing location, or both, i.e., Test Plan 1 - Release X - LAB 1, Test Plan 2 - Release X - LAB 2, etc.?
- I will need to provide coverage and traceability to the customer for each iterative release given to them. Is one approach better than the other?
- I am taking into consideration the export reports, and looking to simplify the output for requirement traceability and coverage of each release. I can manually combine multiple reports, but if there are easier ways to let Jama do that for me via proper setup of test plans and configured custom export templates, all the better.
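On the last point, one lightweight fallback while the template setup is being worked out is to merge the per-release exports outside Jama. A sketch, assuming each export is a CSV and using illustrative `requirementId` and `status` columns (not Jama's actual template format):

```python
import csv
import io

def combine_release_exports(exports):
    """Merge per-release traceability exports (release name -> CSV text)
    into one table with a Release column, so coverage across all
    iterative releases can be reported from a single sheet."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["release", "requirementId", "status"])
    writer.writeheader()
    for release, csv_text in exports.items():
        for row in csv.DictReader(io.StringIO(csv_text)):
            writer.writerow({
                "release": release,
                "requirementId": row["requirementId"],
                "status": row["status"],
            })
    return out.getvalue()
```

This is only a stopgap; a Test Plan structure that lets a single configured export template produce the combined view is clearly preferable.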
------------------------------
Steve Sutton
L3Harris - ForceX
FL
------------------------------
Original Message:
Sent: 10-27-2023 08:27
From: Geoffrey Clemm
Subject: Managing variants with shared requirements / test runs
The approach I usually recommend for sharing test results between multiple variants of a product is to define the verification of a given product variant to be the results from a specified set of test plans. If the results of a particular test plan apply to more than one variant of a product, then that plan would be included in the test plan list of each of those product variants. If a particular set of test cases can be reused in multiple product variants, but they each need to rerun the tests (since the test results do not apply to all the variants), then you would create a separate test plan for each variant with those test cases, so that each variant would have its own (distinct) set of test results.
------------------------------
Geoffrey Clemm
Jama Software
Original Message:
Sent: 10-26-2023 23:41
From: Dimitrios Pananakis
Subject: Managing variants with shared requirements / test runs
Hello Saby,
Many thanks for sharing your approach. If I understand it correctly, you have added extra custom fields to the Test Case item type which hold the test run execution details (ID of latest Test Run, Status of latest Test Run, Test Plan of latest Test Run, etc.).
So after you perform test runs on Project A, you are:
- updating those custom fields on the Test Cases in Project A
- syncing those Test Case items back to the Parent (Common) Project
- syncing those changes from the Parent (Common) Project to Project B
Are the above assumptions correct?
Warm regards,
Dimitrios
Original Message:
Sent: 10-26-2023 03:09
From: Szabolcs Agai
Subject: Managing variants with shared requirements / test runs
Hello Dimitrios,
Thank you for the feedback and for asking.
There are many ways to do that; one is to copy over Test Run contents to the Test Case item and carry that forward to newer product development projects. New product projects can then decide whether or not they rely on previous test results, and what process they want to build around reusing test run results.
Either an XLS roundtrip (a bit tedious) or the Jama Connect Interchange XLS function can take us there.
Let me share a simple example on this.
Have a great day,
Saby
------------------------------
Szabolcs Agai
------------------------------
-
>> then you would create a separate test plan for each variant with those test cases, so that each variant would have its own (distinct) set of test results.
Geoffrey, I think it's important to mention the caveat that in this case you cannot rely on the Test Case Status, since that is (unfortunately) automatically calculated from the most urgent run status, which leads to incorrect conclusions here.
(From Jama help) : If the test case is used in multiple active test plans, the test case status reflects the most urgent run status based on the following hierarchy, regardless of when it ran.
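A possible workaround is to derive a per-plan (i.e., per-variant) status yourself from the raw runs, instead of relying on the rolled-up Test Case Status. A minimal sketch, where the run records stand in for whatever your export or API query returns (`testPlan`, `status`, `executedDate` are assumed names, not Jama's actual schema):

```python
from datetime import date

def status_per_plan(runs):
    """For one test case, return the status of the most recently executed
    run in each test plan, rather than the single 'most urgent' status
    Jama rolls up across all plans."""
    latest = {}  # plan -> (execution date, status)
    for run in runs:
        plan = run["testPlan"]
        when = run["executedDate"]
        if plan not in latest or when > latest[plan][0]:
            latest[plan] = (when, run["status"])
    return {plan: status for plan, (when, status) in latest.items()}
```

Each variant's plan then gets its own answer, instead of one worst-case status across all plans.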
------------------------------
-
Using this approach, the Test Case Status of a Test Case will tell you whether this Test Case has passed in all of your variants. If you want to know whether the Test Case has passed for a particular variant, then you would look at the Test Run Result for the Test Plan for that variant.
------------------------------
Geoffrey Clemm
Jama Software
------------------------------
------------------------------
-
Geoffrey,
I was referring to the case where there are multiple test runs assigned from multiple test plans: the Test Case Status shows the most "urgent" one, hierarchically speaking. This User Guide article on Test Case Status explains this: https://help.jamasoftware.com/ah/en/test/test-cases/test-case-status.html
This is the behavior I have also experienced, and it has been confirmed by support.
To overcome this limitation, Jama should have a configurable option allowing the Test Case Status to be driven either by the status hierarchy (what Jama does now) or by the latest Test Run execution date. Would you agree?
------------------------------
-
The semantics of Test Case Status is that you re-run the Test Case (possibly in a new cycle) in the same Test Plan if you want to replace the result of the previous run of that Test Case, and you run the Test Case in different Test Plans if you want the minimum (worst) status of running that Test Case across those Test Plans. If you want different semantics, you can submit an Ideation; until that Ideation is implemented, you would simply not use the Test Case Status field in your Test Cases.
------------------------------
Geoffrey Clemm
Jama Software
------------------------------