Slides on Software Testing - Software Engineering | CMSC 435, Study notes of Software Engineering

Material Type: Notes; Professor: Zelkowitz; Class: Software Engineering; Subject: Computer Science; University: University of Maryland; Term: Spring 2009;


cmsc435 - 1  Software testing

cmsc435 - 2  Objectives
● To discuss the distinctions between validation testing and defect testing
● To describe the principles of system and component testing
● To describe strategies for generating system test cases
● To understand the essential characteristics of tools used for test automation

cmsc435 - 3  How does software fail?
● Wrong requirement: not what the customer wants
● Missing requirement
● Requirement impossible to implement
● Faulty design
● Faulty code
● Improperly implemented design

cmsc435 - 4  Testing goals
● Fault identification: what fault caused the failure?
● Fault correction: change the system
● Fault removal: take out the fault

cmsc435 - 9  Typical inspection preparation and meeting times

  Development artifact       Preparation time          Meeting time
  Requirements document      25 pages per hour         12 pages per hour
  Functional specification   45 pages per hour         15 pages per hour
  Logic specification        50 pages per hour         20 pages per hour
  Source code                150 lines of code/hour    75 lines of code/hour
  User documents             35 pages per hour         20 pages per hour

● Faults found during discovery activities:
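The inspection rates above imply straightforward effort estimates. A minimal Python sketch, using only the rates from the slide (the function and dictionary names are invented for illustration):

```python
# Inspection rates from the slide: (preparation units/hour, meeting units/hour).
# Units are pages, except for source code, which is lines of code.
RATES = {
    "requirements document": (25, 12),
    "functional specification": (45, 15),
    "logic specification": (50, 20),
    "source code": (150, 75),
    "user documents": (35, 20),
}

def inspection_hours(artifact: str, size: float) -> tuple[float, float]:
    """Return (preparation hours, meeting hours) for an artifact of a given size."""
    prep_rate, meet_rate = RATES[artifact]
    return size / prep_rate, size / meet_rate

# A 50-page requirements document: 2 hours of preparation, ~4.2 hours of meetings.
prep, meet = inspection_hours("requirements document", 50)
```

Note that meeting time dominates preparation time for every artifact type, which is why inspection meetings are usually the scheduling bottleneck.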
  Discovery activity    Faults found per thousand lines of code
  Requirements review   2.5
  Design review         5.0
  Code inspection       10.0
  Integration test      3.0
  Acceptance test       2.0

cmsc435 - 10  Proving code correct
● Formal proof techniques
● Symbolic execution
● Automated theorem-proving
● Will discuss these next lecture

cmsc435 - 11  Test thoroughness
● Statement testing
● Branch testing
● Path testing
● Definition-use testing
● All-uses testing
● All-predicate-uses/some-computational-uses testing
● All-computational-uses/some-predicate-uses testing

cmsc435 - 12  Comparing techniques

● Fault discovery percentages by fault origin:

  Discovery technique    Requirements   Design   Coding   Documentation
  Prototyping            40             35       35       15
  Requirements review    40             15       0        5
  Design review          15             55       0        15
  Code inspection        20             40       65       25
  Unit testing           1              5        20       0

● Effectiveness of fault discovery techniques (Jones 1991):

  Technique            Requirements faults   Design faults   Code faults   Documentation faults
  Reviews              Fair                  Excellent       Excellent     Good
  Prototypes           Good                  Fair            Fair          Not applicable
  Testing              Poor                  Poor            Good          Fair
  Correctness proofs   Poor                  Poor            Fair          Fair

cmsc435 - 13  Test planning
● Establish test objectives
● Design test cases
● Write test cases
● Test test cases
● Execute tests
● Evaluate test results

cmsc435 - 14  The testing process
● Component (or unit) testing
  – Testing of individual program components
  – Usually the responsibility of the component developer (except sometimes for critical systems)
  – Tests are derived from the developer's experience
● System testing
  – Testing of groups of components integrated to create a system or sub-system
  – The responsibility of an independent testing team
  – Tests are based on a system specification

cmsc435 - 19  (Non-)Testing
● Beta-testing: relied on too heavily by large vendors, like Microsoft.
  – Allow early adopters easy access to a new product on the condition that they report errors to the vendor. A good way to stress-test a new system.
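The difference between statement testing and branch testing in the thoroughness hierarchy above can be made concrete with a tiny invented function (not from the slides):

```python
def absolute(x):
    """Return |x| -- a deliberately tiny function for discussing coverage."""
    if x < 0:
        x = -x          # executed only when the branch is taken
    return x

# Statement testing: the single test absolute(-3) executes every statement.
# Branch testing: that test never exercises the false outcome of `x < 0`,
# so branch coverage additionally requires a call such as absolute(3).
assert absolute(-3) == 3
assert absolute(3) == 3
```

This is why branch testing is strictly stronger than statement testing: every test set achieving branch coverage achieves statement coverage, but not vice versa.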
cmsc435 - 20  Sample system
[Figure: module hierarchy with A at the top; B, C, and D called by A; E and F called by B; G called by D]

cmsc435 - 21  Bottom-up testing
● Test E; Test F; Test G
● Test B,E,F; Test C; Test D,G
● Test A,B,C,D,E,F,G

cmsc435 - 22  Top-down testing
● Test A
● Test A,B,C,D
● Test A,B,C,D,E,F,G

cmsc435 - 23  Sandwich testing
● Test A; Test E; Test F; Test G
● Test B,E,F; Test D,G
● Test A,B,C,D,E,F,G

cmsc435 - 24  Big-bang testing
● Test A; Test B; Test C; Test D; Test E; Test F; Test G
● Test A,B,C,D,E,F,G

cmsc435 - 29  Use cases
● We saw use cases before under software design processes.
● Use cases can be a basis for deriving the tests for a system. They help identify operations to be tested and help design the required test cases.
● From an associated sequence diagram, the inputs and outputs to be created for the tests can be identified.

cmsc435 - 30  Collect weather data sequence chart
[Figure: sequence diagram among :CommsController, :WeatherStation, and :WeatherData, with messages request(report), acknowledge(), report(), summarize(), reply(report), acknowledge(), send(report)]

cmsc435 - 31  Performance testing
● Part of release testing may involve testing the emergent properties of a system, such as performance and reliability.
● Performance tests usually involve planning a series of tests where the load is steadily increased until the system performance becomes unacceptable.

cmsc435 - 32  Performance tests
● Stress tests
● Volume tests
● Configuration tests
● Compatibility tests
● Regression tests
● Security tests
● Timing tests
● Environmental tests
● Quality tests
● Recovery tests
● Maintenance tests
● Documentation tests
● Human factors (usability) tests

cmsc435 - 33  Stress testing
● Exercises the system beyond its maximum design load. Stressing the system often causes defects to come to light.
● Stressing the system tests its failure behaviour. Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data.
● Stress testing is particularly relevant to distributed systems, which can exhibit severe degradation as a network becomes overloaded.
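The top-down strategy above tests upper-level components first, with stubs standing in for components not yet integrated. A minimal Python sketch, assuming the component names from the sample-system diagram (all bodies and return values are invented):

```python
# Top-down integration sketch for the sample system, where A calls B, C, and D.
# Stubs return canned answers so A can be tested before B, C, D exist.

def stub_b(data):
    return "b-result"   # canned answer standing in for the real component B

def stub_c(data):
    return "c-result"

def stub_d(data):
    return "d-result"

def component_a(data, b=stub_b, c=stub_c, d=stub_d):
    """Top-level component; collaborators are injected so stubs can replace them."""
    return [b(data), c(data), d(data)]

# First integration step: test A alone against the stubs.
assert component_a("input") == ["b-result", "c-result", "d-result"]
# Later steps replace each stub with the real component and re-run the same tests.
```

Bottom-up testing is the mirror image: the lowest-level components (E, F, G) are tested first with *drivers*, small programs that call them the way their parents eventually will.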
cmsc435 - 34  Interface testing
● Objectives are to detect faults due to interface errors or invalid assumptions about interfaces.
● Particularly important for object-oriented development, as objects are defined by their interfaces.

cmsc435 - 39  Equivalence partitioning
[Figure: the input domain divided into valid and invalid equivalence partitions, with test cases drawn from each partition]

cmsc435 - 40  Equivalence partitions
[Figure: input values partitioned into "less than 10000", "between 10000 and 99999", and "more than 99999", with boundary test values 9999, 10000, 50000, 99999, 100000; number of input values partitioned into "less than 4", "between 4 and 10", and "more than 10", with boundary test values 3, 4, 7, 10, 11]

cmsc435 - 41  Search routine specification

  procedure Search (Key: ELEM; T: SEQ of ELEM;
                    Found: in out BOOLEAN; L: in out ELEM_INDEX);

  Pre-condition
  -- the sequence has at least one element
  T'FIRST <= T'LAST

  Post-condition
  -- the element is found and is referenced by L
  (Found and T(L) = Key)
  or
  -- the element is not in the array
  (not Found and not (exists i, T'FIRST <= i <= T'LAST, T(i) = Key))

cmsc435 - 42  Search routine - input partitions
● Inputs which conform to the pre-conditions.
● Inputs where a pre-condition does not hold.
● Inputs where the key element is a member of the array.
● Inputs where the key element is not a member of the array.

cmsc435 - 43  Testing guidelines (sequences)
● Test software with sequences which have only a single value.
● Use sequences of different sizes in different tests.
● Derive tests so that the first, middle and last elements of the sequence are accessed.
● Test with sequences of zero length.

cmsc435 - 44  Structural testing
● Sometimes called white-box testing.
● Derivation of test cases according to program structure. Knowledge of the program is used to identify additional test cases.
● Objective is to exercise all program statements (not all path combinations).
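The input partitions for the Search routine translate directly into test cases, one per partition. A minimal Python analogue of the Ada-style specification (the implementation is an invented sketch that returns a (found, index) pair rather than using in-out parameters):

```python
def search(key, seq):
    """Return (found, index) for the first occurrence of key in seq.

    Pre-condition from the slide: seq has at least one element.
    """
    if not seq:
        raise ValueError("precondition violated: sequence must be non-empty")
    for i, elem in enumerate(seq):
        if elem == key:
            return True, i
    return False, None

# One test per input partition from the slide:
assert search(17, [17]) == (True, 0)          # key is a member (single-value sequence)
assert search(3, [17]) == (False, None)       # key is not a member
assert search(7, [2, 7, 9, 7]) == (True, 1)   # key is a member (first of several)
try:
    search(1, [])                             # a pre-condition does not hold
except ValueError:
    pass
```

The tests also follow the sequence guidelines above: a single-value sequence, sequences of different sizes, and the zero-length case.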
cmsc435 - 49  Independent paths
● 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 14
● 1, 2, 3, 4, 5, 14
● 1, 2, 3, 4, 5, 6, 7, 11, 12, 5, …
● 1, 2, 3, 4, 6, 7, 2, 11, 13, 5, …
● Test cases should be derived so that all of these paths are executed
● A dynamic program analyzer may be used to check that paths have been executed

cmsc435 - 50  Test automation
● Testing is an expensive process phase. Testing workbenches provide a range of tools to reduce the time required and total testing costs.
● Systems such as JUnit support the automatic execution of tests.
● Most testing workbenches are open systems because testing needs are organisation-specific.
● They are sometimes difficult to integrate with closed design and analysis workbenches.

cmsc435 - 51  Testing workbench adaptation
● Scripts may be developed for user interface simulators and patterns for test data generators.
● Test outputs may have to be prepared manually for comparison.
● Special-purpose file comparators may be developed.

cmsc435 - 52  Automated testing tools
● Code analysis
  – Static analysis: code analyzer, structure checker, data analyzer, sequence checker
  – Dynamic analysis: program monitor
● Test execution
  – Capture and replay
  – Stubs and drivers
  – Automated testing environments
● Test case generators

cmsc435 - 53  When to stop testing
● Coverage criteria
● Fault seeding: assume
    detected seeded faults / total seeded faults
      = detected non-seeded faults / total non-seeded faults
● Confidence in the software: if S faults are seeded and all are detected, and n of an assumed N actual (non-seeded) faults are detected,
    C = 1                  if n > N
    C = S / (S + N + 1)    if n ≤ N
  If only s of the S seeded faults have been detected,
    C = 1                                              if n > N
    C = [S choose (s−1)] / [(S+N+1) choose (N+s)]      if n ≤ N
  (where "a choose b" denotes the binomial coefficient)

cmsc435 - 54  System testing process
● Function testing: does the integrated system perform as promised by the requirements specification?
● Performance testing: are the non-functional requirements met?
● Acceptance testing: is the system what the customer expects?
● Installation testing: does the system run at the customer site(s)?
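The fault-seeding stopping criterion can be computed directly. A minimal sketch, assuming Mills' fault-seeding confidence model as reconstructed from the slide (the function name and parameter order are invented):

```python
from math import comb

def seeding_confidence(S, N, n, s=None):
    """Confidence C that the program contains no more than N actual faults.

    S seeded faults were inserted; n actual (non-seeded) faults and s seeded
    faults were detected.  s defaults to S, i.e. every seeded fault was found.
    """
    if s is None:
        s = S
    if n > N:
        return 1.0                  # more actual faults found than hypothesized
    if s == S:
        return S / (S + N + 1)      # all seeded faults detected
    return comb(S, s - 1) / comb(S + N + 1, N + s)

# Seeding 9 faults, finding all of them and no actual faults, gives 90%
# confidence that the program is fault-free (N = 0):
assert seeding_confidence(S=9, N=0, n=0) == 0.9
```

Note that the general binomial form reduces to S / (S + N + 1) when s = S, since (S+N+1 choose N+S) = S + N + 1, which is why the two cases above are consistent.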
cmsc435 - 59  Testing safety-critical systems
● Design diversity: use different kinds of designs and designers
● Software safety cases: make explicit the ways the software addresses possible problems
  – failure modes and effects analysis
  – hazard and operability studies
● Cleanroom: certifying software with respect to the specification

cmsc435 - 60  Key points
● Testing can show the presence of faults in a system; it cannot prove there are no remaining faults.
● Component developers are responsible for component testing; system testing is the responsibility of a separate team.
● Integration testing is testing increments of the system; release testing involves testing a system to be released to a customer.
● Use experience and guidelines to design test cases in defect testing.

cmsc435 - 61  Key points
● Interface testing is designed to discover defects in the interfaces of composite components.
● Equivalence partitioning is a way of discovering test cases - all cases in a partition should behave in the same way.
● Structural analysis relies on analysing a program and deriving tests from this analysis.
● Test automation reduces testing costs by supporting the test process with a range of software tools.