Community Quality Assurance

Community Development Best Practices

General testing best practices

  • Know the application and understand its requirements. If you test without knowledge of the requirements, you cannot determine whether the program is functioning as designed or whether required functionality is missing.
  • Think like an end user and test the application as one. Combining technical and end-user thinking helps ensure that the application is user friendly and will pass acceptance testing.
  • Do not assume that every bug you log will be fixed. Be flexible according to the situation: management can defer bugs because they have low priority or low severity, or because there is no time to fix them.
  • Know what needs to be tested and what is code complete; see the JIRA section for details on using filters in JIRA.
  • Report all issues you find. Never assume that someone else will report the issue; you may be the only one seeing it. It's better to report a non-issue than no issue at all.
  • Test using different browsers such as Firefox 3.x.x, IE 7 or 8, Google Chrome, and Safari. We do not support IE6 and older versions of Firefox. If you are testing Safari on a Mac, specify this in the defects you log; do the same for Linux machines.
  • Use Jira to create and execute test cases. Jira tracks which test cases have been run and their status. When you run a test case, mark it Pass or Fail. See the Jira – Test Cases section of this page for details.
  • If you fail a test case due to a defect, put the defect number (e.g., LUC-20) in the test case's comment section.
  • Watch for usability and visual problems and report them as you find them.
  • As you test think about security and what someone might do to gain access to information.
  • Test reasonable boundaries: if a field accepts a number between 0 and 9, try -1 and 10. Also try 1 and 8 to make sure valid values work.
  • Always watch for performance issues. If a page is taking too long to load, log a defect that notes the load time.
  • Think about scalability when loading a screen. For example, the Local Unit Calendar project allows many calendars and many events. Load up the calendars and events to find the limits. The opposite is also worth testing: find out what happens when a calendar has no events at all.
  • Validate that invalid data which should produce errors is caught when saving. Enter invalid data into fields and try to save it.
  • Look for broken links and report any that you find. You can sleuth the web site for broken links using a tool called XENU, found here. Sorry about the alien; what can I say, it's free.
  • Try accessing URLs within the website that you shouldn't be able to access until you log in.
  • Check that cookies are not in clear text and that they are not holding readable confidential text or compromising information. Cookies should be encrypted.
  • Test the roles of users and validate that the rules governing each role are working. If you have any questions about rules or roles, log a question issue in JIRA.
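The boundary-testing advice above can be sketched as a small helper that enumerates the values worth trying for a numeric field. This is a hypothetical Python sketch (the 0–9 range comes from the example above; real fields will have their own limits):

```python
def boundary_values(low, high):
    """Values worth trying for a field that accepts low..high:
    just outside each boundary (should be rejected), the boundaries
    themselves, and interior values (should be accepted)."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# For a field accepting 0-9: -1 and 10 should produce errors,
# while 0, 1, 8, and 9 should be accepted.
values = boundary_values(0, 9)
```

Walking every field through a list like this is a quick way to make sure you covered both the failing and the passing sides of each boundary.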

Find more Software Testing information and guidelines at

JIRA issue guidelines

  • Bug vs. Enhancement - Before logging an issue, take the time to identify whether it is a defect in the software according to current feature requirements, or an enhancement request. An easy way to decide is to look at the product requirements: if the issue is not defined as a requirement but adds to the requirements, then it's an enhancement request and should be logged as such. If it breaks functionality, it's a defect.
  • Reproducible - All reported bugs should be reproducible in the latest build, with the steps detailed as clearly as possible. Screenshots are good when they help the developer see the issue; they aren't helpful when they don't show the problem.
  • Specifics - All reported bugs should contain as much information as possible relevant to the bug. Keep speculation and facts clearly separated. List the browsers and OS you are using.
  • Log only one bug per report.
  • Actively watch your defects - There will be times when a developer adds comments to a defect that require your input. Be active with your defects and provide the needed feedback. A developer may also mark a defect as "Can't reproduce"; when this happens, retest the defect using the latest build. If you can still duplicate it, reopen the defect with comments.
  • Know the iteration development is working on - An iteration is a two-week development interval which outlines the tasks to be worked on. We will have a QA Project Info wiki page for each community project listing the current iteration. You need the iteration when you log defects.
  • If you log it, you close it - Try not to close someone else's bug or enhancement; User Stories are different. The person who logged the defect or enhancement is the best one to close it. If a User Story has been adequately tested, the person who did most of the testing for that user story can close it.
  • Avoid duplicates – Before logging an issue, do a search to see if the issue has already been reported. If it has, you can add additional comments. If not, you can then create the issue.

Using JIRA

JIRA is our defect tracking tool found at

  • Use this tool to report defects, enhancements and questions. Jira is also used for Test Case management and tracking. See Test Cases below.
  • Use this tool to understand what features are code complete and ready to test.
  • Subscribe to the filters for the project you are working on. For example, for the Local Unit Calendar project: click "Manage Filters", click Search and search for the filter you want to use (click Search with no terms to see all filters), then click the "star" or select Subscribe to subscribe to the filter you want to add. For example, you can search for and subscribe to the following filters.
    1. Assigned to me – These need your attention
    2. Calendaring – Defects being worked on
    3. Calendaring – Fixed Defects to Validate
    4. Calendaring – Questions to Developers
    5. Calendaring – User Stories being coded
    6. Calendaring – User Stories Ready to Test
    7. Calendaring – User Stories Tested and Closed
  • When you find an issue use the following priority list to determine the priority of the defect:
    1. Blocker - This issue is keeping you from testing this feature and all testing will stop until this issue is resolved.
    2. Critical - This issue indicates that the feature is not working at all and causes the browser to crash or sensitive data to be exposed or data corruption. These must be fixed.
    3. Major - The feature is not working as designed in a significant way. These issues need to be reviewed before shipping.
    4. Minor - Insignificant and can be pushed off to the next release. Does not impact the functionality of the feature; usually there is an easy workaround.
    5. Trivial - Very small visual problems, such as a pixel being off, fields needing to be positioned slightly to the right, or a color needing to be sharper. Does not impact functionality at all.
  • When reporting an issue, please use the proper format to help the developers resolve the problem:
    1. Include a one line summary of the problem: "Repeating events are an hour off with each day it repeats"
    2. Include further details in the description that may help them understand the issue: "This seems to happen only when I change my time zone to Central Standard Time."
    3. Include steps to reproduce the problem:
      • Steps:
      1. In the OS, change the local time zone to Central Time.
      2. Log in to LUC as Pholder and click on Create an Event.
      3. Create an event that repeats every Friday at 2:00 pm and click Save.
    4. Include the results you are seeing: "Notice that when I move to the following Friday the repeated event starts at 3:00pm"
    5. Include the browser and version you are using to duplicate the problem. Specify if it is browser specific: "Happens only with IE7; Firefox 3.5 is fine".
  • When your defect has been marked Resolved Fixed, it is up to the individual who logged the defect to test the fix and close the defect. Only test and close the defects you have logged.
  • Watch the Resolved Fixed user story filter. These user stories are features that have been coded and are ready to test. When a new user story is completed, create a test case for it (if one has not already been created) and link it to the user story. Once a test case has been written for the user story, execute it based on the user story requirements.
  • Watch the "Fixed defects to validate" filter. This is a bucket of defects that have been fixed. Find the ones you logged, validate that the fix is working, test around the areas of the fix, and close the defect with comments stating that you have validated the fix. If the defect has not been resolved, re-open it, adding additional comments as necessary to help resolve the issue.

JIRA work flow

  • All issues logged are reviewed, prioritized and assigned to developers by project management. Some may be pushed to later iterations or later versions.
  • Issues with an Open status are being reviewed or worked on by development.
  • When an open issue is being worked on by a developer, the status will be set to In Progress when they select Start Progress. These are issues that are actively being worked on. Do not close these issues. If you have any other clarification notes, add those to the comments.
  • If a JIRA issue is opened and assigned to a developer don't close it. Let the developer mark it resolved. Add any additional comments that may help the developer get the issue resolved.
  • If a JIRA issue is marked Resolved Can't Duplicate, retest it; if you can still reproduce the problem, reopen the issue with any additional clarifying comments.
  • Issues can be closed with the following status:
    1. Fixed - Defect or enhancement was fixed. Validate the fix with the latest code and close the defect.
    2. Won't Fix - If you disagree with this status, feel free to reopen the issue and list reasons why.
    3. Duplicate - Another defect / enhancement was logged before this one. If you agree that they are duplicates, close this defect.
    4. Incomplete - Add more information to the defect to help the developers understand what is happening.
    5. Cannot Reproduce - Try to reproduce the issue again using the latest code. If you can still reproduce it, reopen the issue and clarify your steps. If you cannot reproduce close the issue with comments.

Web services testing

Web service testing will be done with soapUI. soapUI is open source and free to download and use.

All TestSuites are saved in XML and can be located in the project version control system.
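For context, each soapUI test step ultimately sends a SOAP envelope to the service. The sketch below builds such an envelope in Python as a quick sanity check outside soapUI; the service namespace, the `GetCalendar` operation, and the `unitId` parameter are hypothetical and would need to match the project's actual WSDL:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace; replace with the one in the WSDL.
SVC_NS = "http://example.org/calendar"

def build_envelope(operation, params):
    """Build a minimal SOAP 1.1 request envelope as a string."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{SVC_NS}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical operation and parameter, for illustration only.
request = build_envelope("GetCalendar", {"unitId": 42})
```

soapUI generates and validates these envelopes for you from the WSDL; a hand-rolled request like this is mainly useful for understanding what a failing TestSuite step actually sent.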

Automated testing

Automated testing will be done with Selenium-RC.

The SVN path will be added soon.
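A Selenium-RC test is ordinary code that drives the browser through the same steps a manual test case describes. The sketch below uses core Selenium-RC client calls (`open`, `type`, `click`, `is_text_present`); the locators, page flow, and confirmation text are hypothetical and would need to match the real application:

```python
def create_event_test(sel):
    """Drive a (hypothetical) Create an Event flow and report whether
    the confirmation text appeared. `sel` is a started Selenium-RC
    session object, or anything exposing the same four methods."""
    sel.open("/calendar")                       # navigate to the page
    sel.type("id=event-title", "Ward Picnic")   # fill in the title field
    sel.click("id=save")                        # submit the form
    return sel.is_text_present("Event saved")   # check the confirmation
```

Against a live Selenium-RC server you would create the session with something like `selenium.selenium("localhost", 4444, "*firefox", "http://your-test-host")` and call `start()` before passing it in; keeping the test as a function of the session object also makes it easy to exercise with a stub.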

Manual testing

Like defects, test cases are tracked using Jira. The following glossary will help identify proper terms:

  • Test Case: A test that exercises a specific requirement or bit of functionality.
  • Fix Version/s: The version of the code that this test case is planned to be executed against. It can reside in multiple versions. Example: I create a test case that will be executed on build versions 1, 3, 4, 5, etc. (In other test case management software, this would often be referred to as the test run.)
  • Affects Version/s: When a test case status is updated (Pass or Fail), the version of the software it was tested against is the Affects Version.

Add New Test Case

To add a new test case in Jira, use the following steps (make sure you are in the correct project):

  • Select Create New Issue, ensure the proper project is selected, change the issue type to Test Case, and click Next.
  • Fill out the test case with the appropriate information: Summary, Description, Expected Outcome, and Test Case Steps. Select Create to commit the test case. Priority is set on a per-project basis; work with the project lead.
  • The test case will be approved by the assigned Test Lead(s).

Test Case Approval

Test cases are approved by the project test lead, who will use the following principles when approving them:

  • Is the test case in the appropriate project?
  • Is the test case valid, and does it meet project requirements?
  • Does the test case give the functionality proper coverage?
  • Is the test case unique? Ensure that the test case does not already exist.

Unscheduled Test

A test case that has a status of Unscheduled Test is waiting to be assigned to an appropriate version. Once it has been assigned to a version, the test case can be executed.

Ready to Run

This is the queue of test cases that have been approved and are now ready to be executed. The tester should look for test cases that are ready to run and start executing those tests. If there is no current version (test run) scheduled, execute against the latest code version and make a note showing the affected version.

In Progress

  • Any test case that has a status of In Progress is being executed by a tester. Make sure to change the assignee to yourself if you are the one executing the test.
  • While running through the test cases, mark them with the appropriate Pass or Fail. If a test case is Invalid and no longer should be used, mark the test case as Invalid.
  • If a test case is "Retired" (or has a status of Closed), it is no longer executed. This is the same as a closed defect.
This page was last modified on 11 October 2012, at 12:34.

Note: Content found in this wiki may not always reflect official Church information. See Terms of Use.