
migration rules to ensure that the processed data is correct, complete and complies with a pre-defined context-specific standard.

Other testing tools exist for usability testing.


6.2 Effective Use of Tools: Potential Benefits and Risks (K2)

20 minutes

Terms

Data-driven testing, keyword-driven testing, scripting language

6.2.1 Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)

Simply purchasing or leasing a tool does not guarantee success with that tool. Each type of tool may require additional effort to achieve real and lasting benefits. There are potential benefits and opportunities with the use of tools in testing, but there are also risks.

Potential benefits of using tools include:

o Repetitive work is reduced (e.g., running regression tests, re-entering the same test data, and checking against coding standards)

o Greater consistency and repeatability (e.g., tests executed by a tool in the same order with the same frequency, and tests derived from requirements)

o Objective assessment (e.g., static measures, coverage)

o Ease of access to information about tests or testing (e.g., statistics and graphs about test progress, incident rates and performance)

Risks of using tools include:

o Unrealistic expectations for the tool (including functionality and ease of use)

o Underestimating the time, cost and effort for the initial introduction of a tool (including training and external expertise)

o Underestimating the time and effort needed to achieve significant and continuing benefits from the tool (including the need for changes in the testing process and continuous improvement of the way the tool is used)

o Underestimating the effort required to maintain the test assets generated by the tool

o Over-reliance on the tool (replacement for test design, or use of automated testing where manual testing would be better)

o Neglecting version control of test assets within the tool

o Neglecting relationships and interoperability issues between critical tools, such as requirements management tools, version control tools, incident management tools, defect tracking tools and tools from multiple vendors

o Risk of the tool vendor going out of business, retiring the tool, or selling the tool to a different vendor

o Poor response from the vendor for support, upgrades, and defect fixes

o Risk of suspension of an open-source / free tool project

o Unforeseen risks, such as the inability to support a new platform

6.2.2 Special Considerations for Some Types of Tools (K1)

Test Execution Tools

Test execution tools execute test objects using automated test scripts. This type of tool often requires significant effort in order to achieve significant benefits.

Capturing tests by recording the actions of a manual tester seems attractive, but this approach does not scale to large numbers of automated test scripts. A captured script is a linear representation with specific data and actions as part of each script. This type of script may be unstable when unexpected events occur.
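To make this concrete, here is an illustrative sketch (not from the syllabus) of what a captured script amounts to; the UIStub class and all element IDs are invented stand-ins rather than a real automation API:

```python
class UIStub:
    """Hypothetical stand-in for a GUI automation driver."""
    def click(self, element_id: str) -> None:
        print(f"click {element_id}")
    def type_text(self, element_id: str, text: str) -> None:
        print(f"type {text!r} into {element_id}")

ui = UIStub()

# A recorded script replays one fixed, linear interaction with hard-coded
# data. If a field is renamed, an unexpected dialog appears, or a second
# account must be tested, the script has to be re-recorded or hand-edited.
ui.type_text("username", "jsmith")
ui.type_text("password", "secret123")
ui.click("login_button")
ui.click("orders_tab")
ui.type_text("search_box", "order-4711")
ui.click("search_button")
```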


A data-driven testing approach separates out the test inputs (the data), usually into a spreadsheet, and uses a more generic test script that can read the input data and execute the same test script with different data. Testers who are not familiar with the scripting language can then create the test data for these predefined scripts.
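As an illustrative sketch only, a minimal data-driven script in Python; the data rows, column names and the login function under test are all invented for the example, and the expected column anticipates the stored expected results discussed below:

```python
import csv
import io

# Hypothetical spreadsheet contents; in practice this would be a CSV or
# spreadsheet file maintained by testers unfamiliar with the script language.
data = io.StringIO(
    "username,password,expected\n"
    "jsmith,secret123,pass\n"
    "jsmith,wrongpw,fail\n"
)

def login(username: str, password: str) -> bool:
    """Hypothetical function under test, standing in for the real application."""
    return username == "jsmith" and password == "secret123"

# One generic script body executes once per data row and compares the
# outcome against the stored expected result.
for row in csv.DictReader(data):
    actual = "pass" if login(row["username"], row["password"]) else "fail"
    verdict = "OK" if actual == row["expected"] else "MISMATCH"
    print(f"{row['username']}: expected {row['expected']}, got {actual} -> {verdict}")
```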

There are other techniques employed in data-driven testing where, instead of hard-coded data combinations placed in a spreadsheet, data is generated at run time using algorithms based on configurable parameters and supplied to the application. For example, a tool may use an algorithm that generates a random user ID, with a seed controlling the randomness so that the generated pattern is repeatable.
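A small sketch of such seeded generation, using only the Python standard library; the user ID format is an assumption made for the example:

```python
import random
import string

def random_user_id(rng: random.Random, length: int = 8) -> str:
    """Generate a pseudo-random alphanumeric user ID (illustrative format)."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(rng.choice(alphabet) for _ in range(length))

# Seeding the generator makes the "random" data reproducible: the same seed
# produces the same sequence of IDs on every run, so a failing test can be
# replayed with identical data.
rng = random.Random(42)
print([random_user_id(rng) for _ in range(3)])  # same three IDs every run
```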

In a keyword-driven testing approach, the spreadsheet contains keywords describing the actions to be taken (also called action words), and test data. Testers (even if they are not familiar with the scripting language) can then define tests using the keywords, which can be tailored to the application being tested.
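A minimal keyword-driven sketch; the keywords, handlers and step rows are invented for illustration, and in practice the rows would be read from the testers' spreadsheet:

```python
# Each test step pairs an action word with its data; a small interpreter
# dispatches every keyword to the script that implements it.

def do_login(user: str, password: str) -> None:
    print(f"logging in as {user}")

def do_search(term: str) -> None:
    print(f"searching for {term}")

def do_logout() -> None:
    print("logging out")

KEYWORDS = {"login": do_login, "search": do_search, "logout": do_logout}

test_steps = [            # hypothetical spreadsheet rows: (keyword, data)
    ("login", ["jsmith", "secret123"]),
    ("search", ["order-4711"]),
    ("logout", []),
]

for keyword, args in test_steps:
    KEYWORDS[keyword](*args)  # dispatch each action word to its handler
```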

Technical expertise in the scripting language is needed for all approaches (either by testers or by specialists in test automation).

Regardless of the scripting technique used, the expected results for each test need to be stored for later comparison.

Static Analysis Tools

Static analysis tools applied to source code can enforce coding standards, but if applied to existing code may generate a large quantity of messages. Warning messages do not stop the code from being translated into an executable program, but ideally should be addressed so that maintenance of the code is easier in the future. A gradual implementation of the analysis tool, with initial filters to exclude some messages, is an effective approach.
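The gradual approach can be pictured with the following illustrative sketch; the message categories and formats are invented, and real static analysis tools provide equivalent suppression through their own configuration files:

```python
# Phase 1 of a gradual rollout: report only bug-class findings and filter
# out style categories until the existing-code backlog becomes manageable.

raw_messages = [  # hypothetical analyzer output: (category, message)
    ("style/naming", "main.c:10: identifier 'x1' violates naming convention"),
    ("bug/null-deref", "main.c:42: possible null pointer dereference"),
    ("style/line-length", "util.c:7: line exceeds 80 characters"),
]

suppressed_categories = {"style/naming", "style/line-length"}

for category, message in raw_messages:
    if category not in suppressed_categories:
        print(message)
```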

Test Management Tools

Test management tools need to interface with other tools or spreadsheets in order to produce useful information in a format that fits the needs of the organization.
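As a sketch of such interfacing, a script that exports test results in a spreadsheet-friendly CSV format; the result records and file name are invented for illustration:

```python
import csv

results = [  # hypothetical records pulled from a test management tool
    {"test_id": "TC-001", "status": "passed", "duration_s": 1.4},
    {"test_id": "TC-002", "status": "failed", "duration_s": 0.9},
]

# Plain CSV is a lowest-common-denominator format that spreadsheets and
# most other tools can import for organization-specific reporting.
with open("test_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["test_id", "status", "duration_s"])
    writer.writeheader()
    writer.writerows(results)
```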


6.3 Introducing a Tool into an Organization (K1)

15 minutes

Terms

No specific terms.

Background

The main considerations in selecting a tool for an organization include:

o Assessment of organizational maturity, strengths and weaknesses and identification of opportunities for an improved test process supported by tools

o Evaluation against clear requirements and objective criteria

o A proof-of-concept, by using a test tool during the evaluation phase to establish whether it performs effectively with the software under test and within the current infrastructure, or to identify changes needed to that infrastructure to effectively use the tool

o Evaluation of the vendor (including training, support and commercial aspects) or service support suppliers in case of non-commercial tools

o Identification of internal requirements for coaching and mentoring in the use of the tool

o Evaluation of training needs considering the current test team's test automation skills

o Estimation of a cost-benefit ratio based on a concrete business case

Introducing the selected tool into an organization starts with a pilot project, which has the following objectives:

o Learn more detail about the tool

o Evaluate how the tool fits with existing processes and practices, and determine what would need to change

o Decide on standard ways of using, managing, storing and maintaining the tool and the test assets (e.g., deciding on naming conventions for files and tests, creating libraries and defining the modularity of test suites)

o Assess whether the benefits will be achieved at reasonable cost

Success factors for the deployment of the tool within an organization include:

o Rolling out the tool to the rest of the organization incrementally

o Adapting and improving processes to fit with the use of the tool

o Providing training and coaching/mentoring for new users

o Defining usage guidelines

o Implementing a way to gather usage information from the actual use

o Monitoring tool use and benefits

o Providing support for the test team for a given tool

o Gathering lessons learned from all teams

References

6.2.2 Buwalda, 2001, Fewster, 1999
6.3 Fewster, 1999


7. References

Standards

ISTQB Glossary of Terms used in Software Testing Version 2.1

[CMMI] Chrissis, M.B., Konrad, M. and Shrum, S. (2004) CMMI, Guidelines for Process Integration and Product Improvement, Addison Wesley: Reading, MA
See Section 2.1

[IEEE Std 829-1998] IEEE Std 829™ (1998) IEEE Standard for Software Test Documentation, See Sections 2.3, 2.4, 4.1, 5.2, 5.3, 5.5, 5.6

[IEEE 1028] IEEE Std 1028™ (2008) IEEE Standard for Software Reviews and Audits, See Section 3.2

[IEEE 12207] IEEE 12207/ISO/IEC 12207-2008, Software life cycle processes, See Section 2.1

[ISO 9126] ISO/IEC 9126-1:2001, Software Engineering – Software Product Quality, See Section 2.3

Books

[Beizer, 1990] Beizer, B. (1990) Software Testing Techniques (2nd edition), Van Nostrand Reinhold: Boston

See Sections 1.2, 1.3, 2.3, 4.2, 4.3, 4.4, 4.6

[Black, 2001] Black, R. (2001) Managing the Testing Process (3rd edition), John Wiley & Sons: New York

See Sections 1.1, 1.2, 1.4, 1.5, 2.3, 2.4, 5.1, 5.2, 5.3, 5.5, 5.6

[Buwalda, 2001] Buwalda, H. et al. (2001) Integrated Test Design and Automation, Addison Wesley: Reading, MA

See Section 6.2

[Copeland, 2004] Copeland, L. (2004) A Practitioner's Guide to Software Test Design, Artech House: Norwood, MA

See Sections 2.2, 2.3, 4.2, 4.3, 4.4, 4.6

[Craig, 2002] Craig, Rick D. and Jaskiel, Stefan P. (2002) Systematic Software Testing, Artech House: Norwood, MA

See Sections 1.4.5, 2.1.3, 2.4, 4.1, 5.2.5, 5.3, 5.4

[Fewster, 1999] Fewster, M. and Graham, D. (1999) Software Test Automation, Addison Wesley: Reading, MA

See Sections 6.2, 6.3

[Gilb, 1993] Gilb, Tom and Graham, Dorothy (1993) Software Inspection, Addison Wesley: Reading, MA

See Sections 3.2.2, 3.2.4

[Hetzel, 1988] Hetzel, W. (1988) Complete Guide to Software Testing, QED: Wellesley, MA
See Sections 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 4.1, 5.1, 5.3

[Kaner, 2002] Kaner, C., Bach, J. and Pettichord, B. (2002) Lessons Learned in Software Testing, John Wiley & Sons: New York

See Sections 1.1, 4.5, 5.2


[Myers, 1979] Myers, Glenford J. (1979) The Art of Software Testing, John Wiley & Sons: New York
See Sections 1.2, 1.3, 2.2, 4.3

[van Veenendaal, 2004] van Veenendaal, E. (ed.) (2004) The Testing Practitioner (Chapters 6, 8, 10), UTN Publishers: The Netherlands

See Sections 3.2, 3.3


8. Appendix A – Syllabus Background

History of this Document

This document was prepared between 2004 and 2011 by a Working Group comprised of members appointed by the International Software Testing Qualifications Board (ISTQB). It was initially reviewed by a selected review panel, and then by representatives drawn from the international software testing community. The rules used in the production of this document are shown in Appendix C.

This document is the syllabus for the International Foundation Certificate in Software Testing, the first level international qualification approved by the ISTQB (www.istqb.org).

Objectives of the Foundation Certificate Qualification

o To gain recognition for testing as an essential and professional software engineering specialization

o To provide a standard framework for the development of testers' careers

o To enable professionally qualified testers to be recognized by employers, customers and peers, and to raise the profile of testers

o To promote consistent and good testing practices within all software engineering disciplines

o To identify testing topics that are relevant and of value to industry

o To enable software suppliers to hire certified testers and thereby gain commercial advantage over their competitors by advertising their tester recruitment policy

o To provide an opportunity for testers and those with an interest in testing to acquire an internationally recognized qualification in the subject

Objectives of the International Qualification (adapted from ISTQB meeting at Sollentuna, November 2001)

o To be able to compare testing skills across different countries

o To enable testers to move across country borders more easily

o To enable multinational/international projects to have a common understanding of testing issues

o To increase the number of qualified testers worldwide

o To have more impact/value as an internationally-based initiative than from any country-specific approach

o To develop a common international body of understanding and knowledge about testing through the syllabus and terminology, and to increase the level of knowledge about testing for all participants

o To promote testing as a profession in more countries

o To enable testers to gain a recognized qualification in their native language

o To enable sharing of knowledge and resources across countries

o To provide international recognition of testers and this qualification due to participation from many countries

Entry Requirements for this Qualification

The entry criterion for taking the ISTQB Foundation Certificate in Software Testing examination is that candidates have an interest in software testing. However, it is strongly recommended that candidates also:

o Have at least a minimal background in either software development or software testing, such as six months experience as a system or user acceptance tester or as a software developer

 

 

 


o Take a course that has been accredited to ISTQB standards (by one of the ISTQB-recognized National Boards).

Background and History of the Foundation Certificate in Software Testing

The independent certification of software testers began in the UK with the British Computer Society's Information Systems Examination Board (ISEB), when a Software Testing Board was set up in 1998 (www.bcs.org.uk/iseb). In 2002, ASQF in Germany began to support a German tester qualification scheme (www.asqf.de). This syllabus is based on the ISEB and ASQF syllabi; it includes reorganized, updated and additional content, and the emphasis is directed at topics that will provide the most practical help to testers.

An existing Foundation Certificate in Software Testing (e.g., from ISEB, ASQF or an ISTQB-recognized National Board) awarded before this International Certificate was released will be deemed to be equivalent to the International Certificate. The Foundation Certificate does not expire and does not need to be renewed. The date it was awarded is shown on the Certificate.

Within each participating country, local aspects are controlled by a national ISTQB-recognized Software Testing Board. Duties of National Boards are specified by the ISTQB, but are implemented within each country. The duties of the country boards are expected to include accreditation of training providers and the setting of exams.


9. Appendix B – Learning Objectives/Cognitive Level of Knowledge

The following learning objectives are defined as applying to this syllabus. Each topic in the syllabus will be examined according to the learning objective for it.

Level 1: Remember (K1)

The candidate will recognize, remember and recall a term or concept.

Keywords: Remember, retrieve, recall, recognize, know

Example

Can recognize the definition of "failure" as:

o "Non-delivery of service to an end user or any other stakeholder" or

o "Actual deviation of the component or system from its expected delivery, service or result"

Level 2: Understand (K2)

The candidate can select the reasons or explanations for statements related to the topic, and can summarize, compare, classify, categorize and give examples for the testing concept.

Keywords: Summarize, generalize, abstract, classify, compare, map, contrast, exemplify, interpret, translate, represent, infer, conclude, categorize, construct models

Examples

Can explain the reason why tests should be designed as early as possible:

o To find defects when they are cheaper to remove

o To find the most important defects first

Can explain the similarities and differences between integration and system testing:

o Similarities: testing more than one component, and can test non-functional aspects

o Differences: integration testing concentrates on interfaces and interactions, and system testing concentrates on whole-system aspects, such as end-to-end processing

Level 3: Apply (K3)

The candidate can select the correct application of a concept or technique and apply it to a given context.

Keywords: Implement, execute, use, follow a procedure, apply a procedure

Example

o Can identify boundary values for valid and invalid partitions

o Can select test cases from a given state transition diagram in order to cover all transitions
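For instance, the boundary value objective could be exercised as in the following sketch, which assumes a hypothetical integer field accepting values from 1 to 100:

```python
def boundary_values(lower: int, upper: int) -> dict:
    """Derive boundary values for a closed integer range [lower, upper]."""
    return {
        "valid": [lower, upper],           # smallest/largest accepted values
        "invalid": [lower - 1, upper + 1]  # one step into each invalid partition
    }

# Hypothetical input field accepting 1..100:
print(boundary_values(1, 100))  # {'valid': [1, 100], 'invalid': [0, 101]}
```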

Level 4: Analyze (K4)

The candidate can separate information related to a procedure or technique into its constituent parts for better understanding, and can distinguish between facts and inferences. Typical application is to analyze a document, software or project situation and propose appropriate actions to solve a problem or task.

Keywords: Analyze, organize, find coherence, integrate, outline, parse, structure, attribute, deconstruct, differentiate, discriminate, distinguish, focus, select


Example

o Analyze product risks and propose preventive and corrective mitigation activities

o Describe which portions of an incident report are factual and which are inferred from results

Reference

(For the cognitive levels of learning objectives)

Anderson, L. W. and Krathwohl, D. R. (eds) (2001) A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Allyn & Bacon
