

Applications Alternatives 

    Survey and Analytics Information Technology Specialists



Our CMMI Based Approach to Requirements Analysis, Testing, and Quality Control

Applications Alternatives' approach to requirements analysis, testing, and quality control is based on the Capability Maturity Model Integration (CMMI) method. This approach involves the following:

1. Definition of requirements down to the granular level. From interviews with stakeholders, client managers, and end users, we define detailed requirements for the application being developed. We structure these requirements in a traceability matrix in which each column represents a stage in the software development life cycle. The first column contains the highest-level requirements; each successive column refines them, with the rightmost column holding the granular requirements.
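As an illustrative sketch (with hypothetical requirement IDs and wording, not an actual client matrix), a traceability matrix of this shape can be represented as rows that refine a requirement column by column, from the highest level on the left to the granular level on the right:

```python
# Hypothetical traceability-matrix sketch. Each row traces one
# requirement from its highest level (leftmost column) down to the
# granular requirement (rightmost column).
COLUMNS = ["high_level", "detailed", "granular"]

matrix = [
    {"req_id": "R-1",
     "high_level": "The system shall summarize survey responses.",
     "detailed": "Summaries shall be broken out by respondent category.",
     "granular": "For each category, report the count and mean of responses."},
    {"req_id": "R-2",
     "high_level": "The system shall validate input data.",
     "detailed": "Records with missing key fields shall be rejected.",
     "granular": "Reject any record whose respondent ID is blank."},
]

def trace(req_id):
    """Return the full refinement chain for one requirement."""
    row = next(r for r in matrix if r["req_id"] == req_id)
    return [row[c] for c in COLUMNS]

for step in trace("R-2"):
    print(step)
```

Keeping every refinement keyed to one requirement ID is what later makes it possible to attach an acceptance criterion, and a test result, to each granular row.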

2. Creation of the test data. Software requirements contain algorithms, and the test data generated for testing must exercise every algorithm completely. It must also reflect the volume of data expected in production, with enough categorical levels to be equivalent to the production environment. We use the analytic capabilities of the SAS development environment to aid in creating test data. If the test data must reside in a DBMS, SAS provides tools for loading data into a number of commercial DBMS software packages.
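The text describes SAS tooling; as a language-neutral sketch (the field names, categorical levels, and volume below are assumptions for illustration, not the actual SAS programs), generating test data whose categorical levels mirror production might look like:

```python
import random

# Hypothetical sketch: generate test records at a production-like volume
# whose categorical levels (regions and status codes, assumed here)
# mirror the production environment.
REGIONS = ["NE", "SE", "MW", "W"]
STATUSES = ["complete", "partial", "refused"]

def make_test_data(n_records, seed=0):
    rng = random.Random(seed)  # fixed seed -> reproducible test runs
    return [
        {"id": i, "region": rng.choice(REGIONS), "status": rng.choice(STATUSES)}
        for i in range(n_records)
    ]

data = make_test_data(10_000)
# Verify every categorical level actually appears in the generated data,
# so each branch of the algorithms under test can be exercised.
assert {r["region"] for r in data} == set(REGIONS)
assert {r["status"] for r in data} == set(STATUSES)
```

The closing assertions are the point of the exercise: test data that is missing a categorical level cannot exercise the corresponding branch of the algorithm.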

3. Definition of the testing criteria. In many cases, software testing is done with an inductive approach in which the tester comes up with test cases "out of his/her head," and the tests are rarely tied directly to the requirements. Our approach, in contrast, is to test every granular requirement defined in the traceability matrix by adding a new column to the matrix (on the right) that defines the acceptance criteria for each requirement. In this way, every expected feature of the application is verified as working or not; nothing is left to the imprecise memory of the tester or developer. For algorithmic requirements, we use SAS to generate the expected outcomes from our test data and place them either in the acceptance criteria column or in a table that represents how an output data file, report, or screen should look when generated during the test. Testing is not complete until every requirement has been verified.
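As a minimal sketch of this idea (the algorithm, values, and requirement ID are hypothetical, not actual SAS-generated outcomes), each granular requirement carries a pre-computed expected outcome, and the test run passes only when every row has been verified:

```python
# Hypothetical sketch: the acceptance-criteria column pairs each granular
# requirement with the expected outcome pre-computed from the test data.
def weighted_total(records):
    """Example algorithm under test: sum of value * weight."""
    return sum(r["value"] * r["weight"] for r in records)

test_data = [{"value": 10, "weight": 2.0}, {"value": 5, "weight": 1.0}]

acceptance = [
    {"req_id": "R-3",
     "check": lambda: weighted_total(test_data),
     "expected": 25.0},
]

# Testing is not complete until every requirement has been verified.
results = {row["req_id"]: row["check"]() == row["expected"]
           for row in acceptance}
assert all(results.values()), f"unverified requirements: {results}"
```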

4. Quality Control.

a. Quality Control Plan. Our approach to quality control starts with the quality control plan. In it, we define the steps that we will take in performing quality control for the specific application being reviewed.

b. Requirements. We review requirements documents to determine whether the requirements have been defined completely and in enough detail that any ambiguities about what is expected from the application have been eliminated.

c. Development. We determine whether the application design is defined clearly enough, and in enough detail, for any developer to understand the application's functions. Has the design been reviewed by other members of the development team, with minutes of the reviews kept? Has a code review been conducted to determine that the application is structured well enough to make future software modifications feasible and that it meets industry standards for quality? Are application artifacts following the configuration management plan? Is there a configuration management plan?

d. Testing. Is a group independent of the development team conducting the testing? Is the test data created for stress testing the application equivalent to production data? Do the test acceptance criteria verify that all of the detailed requirements have been met? Have all tests been completed?

e. Training. We review the training materials and the training outline, and we determine whether the training covers all user-related aspects of the application.

f. Implementation. We review the implementation plan and determine whether it follows the configuration management plan. We assess the extent of end-user disruption that will occur when new versions of the application are implemented, as well as the system risk incurred in carrying out the plan.

Last Modified: 6/28/2018