Requirements testing tools
Static analysis tools
Test design tools
Test data preparation tools
Test running tools (character-based, GUI)
Comparison tools
Test harnesses and drivers
Performance testing tools
Dynamic analysis tools
Debugging tools
Test management tools
Coverage measurement tools
Where tools fit
Requirements testing tools
automated support for verification and validation of requirements models
consistency checking
animation
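The consistency checking mentioned above can be sketched in a few lines. This is a toy illustration, not a real requirements tool; the schema (`id`, `text`, `refs`) is invented for the example:

```python
def check_consistency(requirements):
    """Flag duplicate IDs and references to requirements that do not exist."""
    issues = []
    seen = set()
    for req in requirements:
        if req["id"] in seen:
            issues.append(f"duplicate id: {req['id']}")
        seen.add(req["id"])
    for req in requirements:
        for ref in req.get("refs", []):
            if ref not in seen:
                issues.append(f"{req['id']} references unknown requirement {ref}")
    return issues

reqs = [
    {"id": "R1", "text": "User can log in", "refs": []},
    {"id": "R2", "text": "Login is audited", "refs": ["R1", "R9"]},
]
print(check_consistency(reqs))  # R9 is a dangling reference
```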
Static analysis tools
provide information about the quality of software
code is examined, not executed
objective measures
cyclomatic complexity
others: nesting levels, size
“Automated Inspection”
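Cyclomatic complexity is one of the objective measures such tools report: McCabe's V(G) = E − N + 2P over the control flow graph. A minimal sketch, using a hypothetical graph for a function with one if/else:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's metric: V(G) = E - N + 2P."""
    return len(edges) - len(nodes) + 2 * components

# control flow graph of a function with a single if/else (diamond shape)
nodes = ["entry", "cond", "then", "else", "exit"]
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "exit"), ("else", "exit")]

print(cyclomatic_complexity(edges, nodes))  # 5 - 5 + 2 = 2
```

One decision point gives complexity 2, i.e. two independent paths to test.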
Test design tools
generate test inputs
from a formal specification or CASE repository
from code (e.g. code not covered yet)
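One way such tools derive test inputs from a specification is boundary value analysis; a minimal sketch, assuming the spec gives a valid integer range [lo, hi]:

```python
def boundary_values(lo, hi):
    """Generate boundary-value test inputs around a valid range [lo, hi]."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# spec says ages 1..100 are valid
print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```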
Test data preparation tools
data manipulation
selected from existing databases or files
created according to some rules
edited from other sources
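Creating data "according to some rules" can look like the sketch below. The field names and rules are invented for illustration; a fixed seed keeps the generated data repeatable across test runs:

```python
import random

def make_customers(n, seed=42):
    """Create n synthetic customer rows following simple rules."""
    rng = random.Random(seed)   # fixed seed -> reproducible test data
    return [{"id": i,
             "age": rng.randint(18, 90),          # rule: adults only
             "country": rng.choice(["UK", "DE", "ES"])}
            for i in range(n)]

rows = make_customers(3)
print(rows)
```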
Test running tools 1
interface to the software being tested
run tests as though run by a human tester
test scripts in a programmable language
data, test inputs and expected results held in test repositories
most often used to automate regression testing
Test running tools 2
character-based
simulates user interaction from dumb terminals
capture keystrokes and screen responses
GUI (Graphical User Interface)
simulates user interaction for WIMP applications (Windows, Icons, Menus, Pointer)
capture mouse movement, button clicks, and keyboard inputs
capture screens, bitmaps, characters, object states
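The scripted replay style described on these two slides can be sketched without any real UI: inputs and expected results held in a "repository" are replayed against the software under test. The command strings and the stand-in `system_under_test` are hypothetical:

```python
def system_under_test(command):
    """Stand-in for the real application driven through its interface."""
    return {"login alice": "OK", "logout": "OK"}.get(command, "UNKNOWN")

repository = [                    # test repository: input + expected result
    ("login alice", "OK"),
    ("logout", "OK"),
    ("delete *", "UNKNOWN"),      # unrecognised command must be rejected
]

failures = [(inp, expected, system_under_test(inp))
            for inp, expected in repository
            if system_under_test(inp) != expected]
print("PASS" if not failures else failures)
```

Because the scripts and data live outside the tool, the same repository can be replayed after every change, which is why this style suits regression testing.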
Comparison tools
detect differences between actual test results and expected results
screens, characters, bitmaps
masking and filtering
test running tools normally include comparison capability
stand-alone comparison tools for files or databases
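Masking and filtering means ignoring fields that legitimately differ between runs (timestamps, session IDs) before comparing. A minimal sketch using a regex mask:

```python
import re

def masked_equal(actual, expected, masks=(r"\d{2}:\d{2}:\d{2}",)):
    """Compare two outputs after masking volatile fields (here, times)."""
    for pattern in masks:
        actual = re.sub(pattern, "<MASKED>", actual)
        expected = re.sub(pattern, "<MASKED>", expected)
    return actual == expected

# timestamps differ, but the result is the same after masking
print(masked_equal("Run at 09:15:02: 3 rows", "Run at 17:40:55: 3 rows"))
```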
Test harnesses and drivers
used to exercise software which does not have a user interface (yet)
used to run groups of automated tests or comparisons
often custom-built
simulators (where testing in real environment would be too costly or dangerous)
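A custom-built driver plus simulator can be sketched as below: the driver exercises a module with no user interface, and a stub stands in for a dependency that would be too costly to hit for real. The module and gateway here are hypothetical:

```python
class PaymentGatewayStub:
    """Simulator: calling the real payment gateway would be too costly."""
    def charge(self, amount):
        return "approved" if amount <= 100 else "declined"

def checkout(cart_total, gateway):
    """Module under test: no UI, so a driver must call it directly."""
    return gateway.charge(cart_total)

def run_tests():
    """Driver: runs a group of automated tests against the module."""
    gw = PaymentGatewayStub()
    cases = [(50, "approved"), (150, "declined")]
    return all(checkout(total, gw) == expected for total, expected in cases)

print(run_tests())
```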
Performance testing tools
load generation
drive application via user interface or test harness
simulates realistic load on the system & logs the number of transactions
transaction measurement
response times for selected transactions via user interface
reports based on logs, graphs of load versus response times
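Load generation and transaction measurement can be sketched with a thread pool of "virtual users", each logging its own response time. The sleep stands in for real server work; the numbers are illustrative only:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stand-in for driving one transaction via the UI or a harness."""
    start = time.perf_counter()
    time.sleep(0.01)              # simulated server processing time
    return time.perf_counter() - start

# 10 virtual users submit 50 transactions between them
with ThreadPoolExecutor(max_workers=10) as pool:
    times = list(pool.map(lambda _: transaction(), range(50)))

print(f"{len(times)} transactions, "
      f"mean {statistics.mean(times) * 1000:.1f} ms, "
      f"max {max(times) * 1000:.1f} ms")
```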
Dynamic analysis tools
provide run-time information on software (while tests are run)
allocation, use and de-allocation of resources, e.g. memory leaks
flag unassigned pointers or pointer arithmetic faults
network management tools
transaction rates measurements
bandwidth usage
potential network ‘bottlenecks’
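Python's standard `tracemalloc` module gives the flavour of dynamic analysis for memory: it records allocations while the code runs, which is how leak detectors spot memory that keeps growing. The "leak" below is deliberate:

```python
import tracemalloc

tracemalloc.start()               # begin run-time allocation tracking

leaky = []
n = 1000
for _ in range(n):
    leaky.append("x" * n)         # a new 1000-char string retained each pass

current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"current {current} bytes, peak {peak} bytes")  # roughly 1 MB retained
```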
Debugging tools
used by programmers when investigating, fixing and testing faults
used to reproduce faults and examine program execution in detail
single-stepping
breakpoints or watchpoints at any statement
examine contents of variables and other data
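Debuggers such as pdb are interactive, but the hook they build on can be sketched: `sys.settrace` lets tool code observe every executed line and examine local variables, which is what single-stepping and watchpoints amount to. The `buggy` function and recorded values are illustrative:

```python
import sys

changes = []                      # values of `total` observed on each line

def tracer(frame, event, arg):
    """A miniature watchpoint on the local variable `total`."""
    if event == "line" and frame.f_code.co_name == "buggy":
        changes.append(frame.f_locals.get("total"))
    return tracer

def buggy(values):
    total = 0
    for v in values:
        total += v
    return total

sys.settrace(tracer)
result = buggy([1, 2, 3])
sys.settrace(None)
print(result, changes)
```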
Test management tools
management of testware: test plans, specifications, results
project management of the test process, e.g. estimation, scheduling tests, logging results
incident management tools (may include workflow facilities to track allocation, correction and retesting)
traceability (of tests to requirements, designs)
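Traceability from tests to requirements is, at its core, a mapping plus a gap report. A minimal sketch with invented IDs:

```python
# which requirements each test claims to cover (hypothetical IDs)
tests = {
    "T1": ["R1"],
    "T2": ["R1", "R2"],
    "T3": ["R3"],
}
requirements = ["R1", "R2", "R3", "R4"]

covered = {r for refs in tests.values() for r in refs}
uncovered = [r for r in requirements if r not in covered]
print(uncovered)  # R4 has no test tracing to it
```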
Coverage measurement tools
objective measure of which parts of the software structure were executed by tests
code is instrumented in a static analysis pass
tests are run through the instrumented code
tool reports what has and has not been covered by those tests, both line by line and as summary statistics
different types of coverage: statement, branch, condition, LCSAJ, etc.
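A toy statement-coverage tool fits in a few lines: instrument the function with a trace hook, run the tests through it, then report which lines executed. The `classify` function under test is invented for the example:

```python
import sys

def measure_coverage(func, *args):
    """Record which line offsets of func execute (toy statement coverage)."""
    executed = set()
    code = func.__code__
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno - code.co_firstlineno)
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def classify(n):
    if n < 0:
        return "negative"      # offset 2: only runs for negative input
    return "non-negative"      # offset 3

lines = measure_coverage(classify, 5)
print(lines)  # the "negative" branch is not covered by this test
```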
Tool Selection and Implementation
The Tool Selection Process
Where to start with tools?
do not start
with a vendor visit & tool demonstration
with a list of tool features and functions
while your testing process is chaotic (good testing is more important than tools) (“CAST readiness”)
do start
by identifying your needs - which test activities have the worst problems, prioritise
consider constraints, e.g. hardware, OS, integration with other tools (is the claimed integration real or only cosmetic?)
Tool selection process
after automation requirements are agreed:
create shortlist of candidate tools
arrange demos
evaluate selected tool(s)
review and select tool
don’t underestimate “people issues”, e.g. politics, resistance to change, territories
The Tool Implementation Process
Pilot project and implementation
objectives of the pilot project
gain experience in the use of the tool
identify changes in test process
set internal standards and conventions
assess costs and achievable benefits
implementation
based on successful pilot
needs strong commitment from tool users & managers (overcome resistance, overheads for learning curve)
Tool implementation iceberg