Quality Assurance

Every software development, enhancement, or maintenance project undertaken at Gallops includes quality assurance activities. Even a simple one-person development job has QA activities embedded in it.

A project’s formal QA program includes the assurance processes that each team member goes through. It involves planning and establishing project-wide standards, rather than relying on personal standards and processes. The extent and formality of project QA activities are decisions that the client, project manager and the QA department make based on their assessment of the project and its risks.

Compliance with established requirements, standards, and procedures is evaluated through process monitoring, product evaluation, audits, and testing. The Quality Assurance department at Gallops ensures that the management and engineering efforts result in a product that meets all of its requirements.

Specific project characteristics and risks influence Quality Assurance needs and every quality assurance plan is tailored to its project. Characteristics that are considered include mission criticality of the project, schedule and budget, size and complexity of the product to be produced, and size and organizational complexity of the development staff.

Software testing verifies that the software meets its requirements and that it is complete and ready for delivery.

Objectives of QA Testing

• Assure the quality of client deliverables.

• Design, assemble, and execute a full testing lifecycle.

• Confirm the full functional capabilities of the final product.

• Confirm stability and performance (response time, etc.) of the final product.

• Confirm that deliverables meet client expectations/requirements.

• Report, document and verify code and design defects.

The first products monitored by QA are the project's standards and procedures. QA assures that clear and achievable standards exist and then evaluates compliance of the software product to the established standards. Functional and technical document evaluations follow. This ensures that the software product reflects the requirements and standards as identified in the management plan.

QA assures Verification and Validation (V&V) activities by monitoring and participating in product reviews, inspections, and walk-throughs. A review looks at the overall picture of the product being developed to see if it satisfies its requirements. In formal reviews, actual work done is compared with established standards. Reviews are part of the development process, and they should be conducted at the end of each phase of the life-cycle to identify problems and determine whether the interim product meets all applicable requirements. The results of the review should provide a ready/not-ready decision to begin the next phase. Although the decision to proceed is a management decision, QA is responsible for advising management and participating in the decision.

Quality Checking and Testing

There are many different types of tests that can be executed depending on the objectives of the test engineer. Some examples include the following:

  • Error testing

    This testing verifies the product’s responses to error conditions, ensuring that each response is meaningful and correct; responses may include the display of context-sensitive, error-specific help screens. Reviewing error-message lists can help identify each error that a specific screen or module can generate.

  • Stress testing

    This testing measures the system’s response to large amounts of data. This may include a single user manipulating large sets of data, or multiple machines running the application simultaneously.

  • White-box testing

    This testing places the focus on the internal structure of the software. Tests are designed using the code and the functional spec as inputs.

  • Black-box testing

    This type of testing views the software from the end-user’s perspective, without knowledge of the underlying code.

  • Distributed testing

    This may include verification of startup data, modifications to tables, verification that cascading deletes are correct, and that dependencies between database tables are correct.

  • Multi-user testing

    This type of testing includes concurrency tests which verify the interaction of two simultaneous users on two separate machines. For example, if one user locks a record, a second user may receive an error message when attempting to lock the same record. Stress testing verifies the response of the system when a large number of users are connected to the database. Other types of tests verify that a process on one machine is capable of initiating a process on another machine.
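As a minimal illustration of two of the test types above, the sketch below combines a black-box test (exercising only public behavior) with an error test (verifying that an error condition produces a meaningful response). The function and test names are hypothetical, invented for this example; they are not part of any Gallops product.

```python
import unittest

def safe_divide(numerator, denominator):
    """Divide two numbers, raising a descriptive error on bad input.
    (Hypothetical function under test, for illustration only.)"""
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return numerator / denominator

class SafeDivideTests(unittest.TestCase):
    # Black-box test: exercise the public behavior only, with no
    # knowledge of the implementation.
    def test_normal_division(self):
        self.assertEqual(safe_divide(10, 4), 2.5)

    # Error test: the response to an error condition should be
    # meaningful and specific, not a bare crash.
    def test_zero_denominator_gives_clear_error(self):
        with self.assertRaises(ValueError) as ctx:
            safe_divide(1, 0)
        self.assertIn("denominator", str(ctx.exception))

# Run the suite programmatically so the sketch is self-contained.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SafeDivideTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A white-box variant of the same suite would instead be designed from the code itself, for example adding cases that exercise each branch of the `if` statement.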

Levels of testing

We typically do our testing at the following four levels:

  • Object testing

    Examines each object, verifying specific properties of the object, including state, size, location, contents, and type. Tests might also verify that the object is functioning according to standard behavior.

  • Unit testing

    Examines the behavior of a group of objects, which together provide a specific feature for the end-user. Often, unit tests look at a dialog as a collection of objects, and the functionality provided by the dialog is verified by these tests. Unit tests verify the interaction between objects.

  • Integration testing

    Examines how groups of dialogs and screens function together. Typically, common user scenarios are verified through these tests.

  • System testing

    Examines how the software interacts with other software products within the software environment. Connectivity product testing, such as TCP/IP or other network protocol testing are examples of system tests.
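The distinction between the first two levels above can be sketched as follows. The `Field` and `LoginDialog` classes are hypothetical stand-ins invented for this example: an object-level test verifies the properties of a single object, while a unit test verifies a group of objects (here, a dialog and its fields) working together to provide one feature.

```python
class Field:
    """A single UI object with verifiable properties (hypothetical)."""
    def __init__(self, name, max_length):
        self.name = name
        self.max_length = max_length
        self.value = ""

    def set(self, text):
        # Reject input that exceeds the field's declared size.
        if len(text) > self.max_length:
            raise ValueError(f"{self.name} exceeds {self.max_length} characters")
        self.value = text

class LoginDialog:
    """A dialog treated as a collection of objects (hypothetical)."""
    def __init__(self):
        self.username = Field("username", 20)
        self.password = Field("password", 64)

    def submit(self):
        # The dialog submits only when both fields are populated.
        return bool(self.username.value) and bool(self.password.value)

# Object-level test: verify one object's properties in isolation.
field = Field("username", 20)
assert field.max_length == 20 and field.value == ""

# Unit test: verify the dialog's objects interacting as one feature.
dialog = LoginDialog()
dialog.username.set("alice")
dialog.password.set("s3cret")
assert dialog.submit() is True
```

An integration test at the next level up would then chain several such dialogs through a common user scenario, such as logging in and then opening a record.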

The outputs of the integration and test stage include an integrated set of software along with the online help file, an online help system, and the product release document.
© 2011 Gallops Infrastructure Ltd. (ISO 9001:2008)