Performance Testing and Planning: Community Deep Dive

By Chloe posted 04-15-2019 10:45

  

Welcome to our first Community Deep Dive Webinar, taking place April 17, 2019 at 7:00am PST. Today's topic is:
Performance Testing & Planning 
What is the Jama Quality Assurance testing process? Under what circumstances should you write your own tests? What are some tips for writing tests? How do you measure your results? How can Enterprise Elite and Essentials benefit you?


This is a YouTube live event from Jama Software headquarters. We will be monitoring this page and taking questions during the live event (April 17th at 7:00am PST).

To participate: make sure you are logged into the Community, have this page open, and use the "Comment" stream below to ask questions. Here are links to the Jama Connect Performance Whitepaper and the Elite Support Program Data Sheet to supplement this presentation. Any unanswered questions will be answered in the Community after the event.

Here is a direct (link to the Jama Software YouTube watch page), which gives you an alternative to the video embedded on this Community page.

Some Technical Notes:
1. If the video doesn't start automatically at the start time, press the play button.
2. Posting a comment may pause your instance of the embedded video; pressing the play button will let you rejoin the live feed. For this reason, please use the comments exclusively for asking questions, not for chat. Thank you.


Comments

04-17-2019 10:33

When we update our Jama environment, we focus on our workflows. We do not automate tests; instead, we let our users test our workflows within a defined time frame, using a copy of the current production data.
With every release we find roughly 5 to 10 issues that we report to Jama, and I would expect those to be found not in a performance test but in a functional test. So I'd be interested in how Jama runs its functional and regression tests. Is that covered here as well?

04-17-2019 10:12

When talking about performance, I feel that you currently focus on the technical, measurable performance of the software itself.
For users, the "felt" performance improves immediately when usability is improved: they can finish their task in, say, 50% of the time, regardless of how quickly REST calls go through.
How does Jama perform usability tests to improve this kind of performance (i.e., the time a user needs to complete a given task)?