
Certified Tester
Foundation Level Syllabus
International Software Testing Qualifications Board

10. Appendix C – Rules Applied to the ISTQB Foundation Syllabus

The rules listed here were used in the development and review of this syllabus. (A “TAG” is shown after each rule as a shorthand abbreviation of the rule.)

10.1.1 General Rules

SG1. The syllabus should be understandable and absorbable by people with zero to six months (or more) experience in testing. (6-MONTH)

SG2. The syllabus should be practical rather than theoretical. (PRACTICAL)

SG3. The syllabus should be clear and unambiguous to its intended readers. (CLEAR)

SG4. The syllabus should be understandable to people from different countries, and easily translatable into different languages. (TRANSLATABLE)

SG5. The syllabus should use American English. (AMERICAN-ENGLISH)

10.1.2 Current Content

SC1. The syllabus should include recent testing concepts and should reflect current best practices in software testing where this is generally agreed. The syllabus is subject to review every three to five years. (RECENT)

SC2. The syllabus should minimize time-related issues, such as current market conditions, to enable it to have a shelf life of three to five years. (SHELF-LIFE)

10.1.3 Learning Objectives

LO1. Learning objectives should distinguish between items to be recognized/remembered (cognitive level K1), items the candidate should understand conceptually (K2), items the candidate should be able to practice/use (K3), and items the candidate should be able to use to analyze a document, software or project situation in context (K4). (KNOWLEDGE-LEVEL)

LO2. The description of the content should be consistent with the learning objectives. (LO-CONSISTENT)

LO3. To illustrate the learning objectives, sample exam questions for each major section should be issued along with the syllabus. (LO-EXAM)

10.1.4 Overall Structure

ST1. The structure of the syllabus should be clear and allow cross-referencing to and from other parts, from exam questions and from other relevant documents. (CROSS-REF)

ST2. Overlap between sections of the syllabus should be minimized. (OVERLAP)

ST3. Each section of the syllabus should have the same structure. (STRUCTURE-CONSISTENT)

ST4. The syllabus should contain version, date of issue and page number on every page. (VERSION)

ST5. The syllabus should include a guideline for the amount of time to be spent in each section (to reflect the relative importance of each topic). (TIME-SPENT)

10.1.5 References

SR1. Sources and references will be given for concepts in the syllabus to help training providers find out more information about the topic. (REFS)

SR2. Where there are not readily identified and clear sources, more detail should be provided in the syllabus. For example, definitions are in the Glossary, so only the terms are listed in the syllabus. (NON-REF DETAIL)


Sources of Information

Terms used in this syllabus are defined in the ISTQB Glossary of Terms Used in Software Testing. A version of the Glossary is available from ISTQB.

A list of recommended books on software testing is also issued in parallel with this syllabus. The main book list is part of the References section.


11. Appendix D – Notice to Training Providers

Each major subject heading in the syllabus is assigned an allocated time in minutes. The purpose of this is both to give guidance on the relative proportion of time to be allocated to each section of an accredited course, and to give an approximate minimum time for the teaching of each section. Training providers may spend more time than is indicated, and candidates may spend more time again in reading and research. A course curriculum does not have to follow the same order as the syllabus.

The syllabus contains references to established standards, which must be used in the preparation of training material. Each standard used must be the version quoted in the current version of this syllabus. Other publications, templates or standards not referenced in this syllabus may also be used and referenced, but will not be examined.

All K3 and K4 Learning Objectives require a practical exercise to be included in the training materials.


12. Appendix E – Release Notes

Release 2010

1. Changes to Learning Objectives (LO) include some clarification:

a. Wording changed for the following LOs (content and level of LO remain unchanged): LO-1.2.2, LO-1.3.1, LO-1.4.1, LO-1.5.1, LO-2.1.1, LO-2.1.3, LO-2.4.2, LO-4.1.3, LO-4.2.1, LO-4.2.2, LO-4.3.1, LO-4.3.2, LO-4.3.3, LO-4.4.1, LO-4.4.2, LO-4.4.3, LO-4.6.1, LO-5.1.2, LO-5.2.2, LO-5.3.2, LO-5.3.3, LO-5.5.2, LO-5.6.1, LO-6.1.1, LO-6.2.2, LO-6.3.2.

b. LO-1.1.5 has been reworded and upgraded to K2, because a comparison of defect-related terms can be expected.

c. LO-1.2.3 (K2) has been added. The content was already covered in the 2007 syllabus.

d. LO-3.1.3 (K2) now combines the content of LO-3.1.3 and LO-3.1.4.

e. LO-3.1.4 has been removed from the 2010 syllabus, as it is partially redundant with LO-3.1.3.

f. LO-3.2.1 has been reworded for consistency with the 2010 syllabus content.

g. LO-3.3.2 has been modified, and its level has been changed from K1 to K2, for consistency with LO-3.1.2.

h. LO-4.4.4 has been modified for clarity, and has been changed from a K3 to a K4. Reason: LO-4.4.4 had already been written in a K4 manner.

i. LO-6.1.2 (K1) was dropped from the 2010 syllabus and was replaced with LO-6.1.3 (K2). There is no LO-6.1.2 in the 2010 syllabus.

2. Consistent use of the term test approach according to its definition in the glossary. The term test strategy will not be required as a term to recall.

3. Chapter 1.4 now contains the concept of traceability between test basis and test cases.

4. Chapter 2.x now contains test objects and test basis.

5. Re-testing is now the main term in the glossary instead of confirmation testing.

6. The aspect of data quality and testing has been added at several locations in the syllabus: data quality and risk in Chapters 2.2, 5.5 and 6.1.8.

7. Chapter 5.2.3, Entry Criteria, has been added as a new subchapter. Reason: consistency with Exit Criteria (entry criteria added to LO-5.2.9).

8. Consistent use of the terms test strategy and test approach with their definitions in the glossary.

9. Chapter 6.1 shortened, because the tool descriptions were too large for a 45-minute lesson.

10. IEEE Std 829:2008 has been released. This version of the syllabus does not yet consider this new edition. Section 5.2 refers to the document Master Test Plan. The content of the Master Test Plan is covered by the concept that the document “Test Plan” covers different levels of planning: test plans for the test levels can be created, as well as a test plan on the project level covering multiple test levels. The latter is named Master Test Plan in this syllabus and in the ISTQB Glossary.

11. Code of Ethics has been moved from the CTAL to the CTFL.

Release 2011

Changes made with the “maintenance release” 2011:

1. General: Working Party replaced by Working Group.

2. Replaced post-conditions by postconditions in order to be consistent with the ISTQB Glossary 2.1.

3. First occurrence: ISTQB replaced by ISTQB®.

4. Introduction to this Syllabus: Descriptions of Cognitive Levels of Knowledge removed, because this was redundant with Appendix B.


5. Section 1.6: Because the intent was not to define a Learning Objective for the “Code of Ethics”, the cognitive level for the section has been removed.

6. Sections 2.2.1, 2.2.2, 2.2.3, 2.2.4 and 3.2.3: Fixed formatting issues in lists.

7. Section 2.2.2: The word failure was not correct in “…isolate failures to a specific component…”. It has therefore been replaced with “defect” in that sentence.

8. Section 2.3: Corrected formatting of the bullet list of test objectives related to test terms in the section Test Types (K2).

9. Section 2.3.4: Updated description of debugging to be consistent with Version 2.1 of the ISTQB Glossary.

10. Section 2.4: Removed the word “extensive” from “includes extensive regression testing”, because “extensive” depends on the change (size, risks, value, etc.), as written in the next sentence.

11. Section 3.2: The word “including” has been removed to clarify the sentence.

12. Section 3.2.1: Because the activities of a formal review had been incorrectly formatted, the review process had 12 main activities instead of six, as intended. It has been changed back to six, which makes this section compliant with the 2007 Syllabus and the ISTQB Advanced Level Syllabus 2007.

13. Section 4: Word “developed” replaced by “defined”, because test cases get defined and not developed.

14. Section 4.2: Text changed to clarify how black-box and white-box testing could be used in conjunction with experience-based techniques.

15. Section 4.3.5: Text changed from “…between actors, including users and the system…” to “…between actors (users or systems)…”.

16. Section 4.3.5: alternative path replaced by alternative scenario.

17. Section 4.4.2: In order to clarify the term branch testing in the text of Section 4.4, a sentence to clarify the focus of branch testing has been changed.

18. Sections 4.5 and 5.2.6: The term “experienced-based” testing has been replaced by the correct term “experience-based”.

19. Section 6.1: Heading “6.1.1 Understanding the Meaning and Purpose of Tool Support for Testing (K2)” replaced by “6.1.1 Tool Support for Testing (K2)”.

20. Section 7 / Books: The 3rd edition of [Black,2001] listed, replacing the 2nd edition.

21. Appendix D: Chapters requiring exercises have been replaced by the generic requirement that all Learning Objectives K3 and higher require exercises. This is a requirement specified in the ISTQB Accreditation Process (Version 1.26).

22. Appendix E: The changed learning objectives between Versions 2007 and 2010 are now correctly listed.


13. Index

action word .......... 63
alpha testing .......... 24, 27
architecture .......... 15, 21, 22, 25, 28, 29
archiving .......... 17, 30
automation .......... 29
benefits of independence .......... 47
benefits of using tool .......... 62
beta testing .......... 24, 27
black-box technique .......... 37, 39, 40
black-box test design technique .......... 39
black-box testing .......... 28
bottom-up .......... 25
boundary value analysis .......... 40
bug .......... 11
captured script .......... 62
checklists .......... 34, 35
choosing test technique .......... 44
code coverage .......... 28, 29, 37, 42, 58
commercial off the shelf (COTS) .......... 22
compiler .......... 36
complexity .......... 11, 36, 50, 59
component integration testing .......... 22, 25, 29, 59, 60
component testing .......... 22, 24, 25, 27, 29, 37, 41, 42
configuration management .......... 45, 48, 52
configuration management tool .......... 58
confirmation testing .......... 13, 15, 16, 21, 28, 29
contract acceptance testing .......... 27
control flow .......... 28, 36, 37, 42
coverage .......... 15, 24, 28, 29, 37, 38, 39, 40, 42, 50, 51, 58, 60, 62
coverage tool .......... 58
custom-developed software .......... 27
data flow .......... 36
data-driven approach .......... 63
data-driven testing .......... 62
debugging .......... 13, 24, 29, 58
debugging tool .......... 24, 58
decision coverage .......... 37, 42
decision table testing .......... 40, 41
decision testing .......... 42
defect .......... 10, 11, 13, 14, 16, 18, 21, 24, 26, 28, 29, 31, 32, 33, 34, 35, 36, 37, 39, 40, 41, 43, 44, 45, 47, 49, 50, 51, 53, 54, 55, 59, 60, 69
defect density .......... 50, 51
defect tracking tool .......... 59
development .......... 8, 11, 12, 13, 14, 18, 21, 22, 24, 29, 32, 33, 36, 38, 44, 47, 49, 50, 52, 53, 55, 59, 67
development model .......... 21, 22
drawbacks of independence .......... 47
driver .......... 24
dynamic analysis tool .......... 58, 60
dynamic testing .......... 13, 31, 32, 36
emergency change .......... 30
enhancement .......... 27, 30
entry criteria .......... 33
equivalence partitioning .......... 40
error .......... 10, 11, 18, 43, 50
error guessing .......... 18, 43, 50
exhaustive testing .......... 14
exit criteria .......... 13, 15, 16, 33, 35, 45, 48, 49, 50, 51
expected result .......... 16, 38, 48, 63
experience-based technique .......... 37, 39, 43
experience-based test design technique .......... 39
exploratory testing .......... 43, 50
factory acceptance testing .......... 27
failure .......... 10, 11, 13, 14, 18, 21, 24, 26, 32, 36, 43, 46, 50, 51, 53, 54, 69
failure rate .......... 50, 51
fault .......... 10, 11, 43
fault attack .......... 43
field testing .......... 24, 27
follow-up .......... 33, 34, 35
formal review .......... 31, 33
functional requirement .......... 24, 26
functional specification .......... 28
functional task .......... 25
functional test .......... 28
functional testing .......... 28
functionality .......... 24, 25, 28, 50, 53, 62
impact analysis .......... 21, 30, 38
incident .......... 15, 16, 17, 19, 24, 46, 48, 55, 58, 59, 62
incident logging .......... 55
incident management .......... 48, 55, 58
incident management tool .......... 58, 59
incident report .......... 46, 55
independence .......... 18, 47, 48
informal review .......... 31, 33, 34
inspection .......... 31, 33, 34, 35
inspection leader .......... 33
integration .......... 13, 22, 24, 25, 27, 29, 36, 40, 41, 42, 45, 48, 59, 60, 69
integration testing .......... 22, 24, 25, 29, 36, 40, 45, 59, 60, 69
interoperability testing .......... 28
introducing a tool into an organization .......... 57, 64
ISO 9126 .......... 11, 29, 30, 65
iterative-incremental development model .......... 22
keyword-driven approach .......... 63
keyword-driven testing .......... 62
kick-off .......... 33
learning objective .......... 8, 9, 10, 21, 31, 37, 45, 57, 69, 70, 71
load testing .......... 28, 58, 60


 

 

load testing tool .......... 58
maintainability testing .......... 28
maintenance testing .......... 21, 30
management tool .......... 48, 58, 59, 63
maturity .......... 17, 33, 38, 64
metric .......... 33, 35, 45
mistake .......... 10, 11, 16
modelling tool .......... 59
moderator .......... 33, 34, 35
monitoring tool .......... 48, 58
non-functional requirement .......... 21, 24, 26
non-functional testing .......... 11, 28
objectives for testing .......... 13
off-the-shelf .......... 22
operational acceptance testing .......... 27
operational test .......... 13, 23, 30
patch .......... 30
peer review .......... 33, 34, 35
performance testing .......... 28, 58
performance testing tool .......... 58, 60
pesticide paradox .......... 14
portability testing .......... 28
probe effect .......... 58
procedure .......... 16
product risk .......... 18, 45, 53, 54
project risk .......... 12, 45, 53
prototyping .......... 22
quality .......... 8, 10, 11, 13, 19, 28, 37, 38, 47, 48, 50, 53, 55, 59
rapid application development (RAD) .......... 22
Rational Unified Process (RUP) .......... 22
recorder .......... 34
regression testing .......... 15, 16, 21, 28, 29, 30
Regulation acceptance testing .......... 27
reliability .......... 11, 13, 28, 50, 53, 58
reliability testing .......... 28
requirement .......... 13, 22, 24, 32, 34
requirements management tool .......... 58
requirements specification .......... 26, 28
responsibilities .......... 24, 31, 33
re-testing .......... 29. See confirmation testing
review .......... 13, 19, 31, 32, 33, 34, 35, 36, 47, 48, 53, 55, 58, 67, 71
review tool .......... 58
reviewer .......... 33, 34
risk .......... 11, 12, 13, 14, 25, 26, 29, 30, 38, 44, 45, 49, 50, 51, 53, 54
risk-based approach .......... 54
risk-based testing .......... 50, 53, 54
risks .......... 11, 25, 49, 53
risks of using tool .......... 62
robustness testing .......... 24
roles .......... 8, 31, 33, 34, 35, 47, 48, 49
root cause .......... 10, 11
scribe .......... 33, 34
scripting language .......... 60, 62, 63
security .......... 27, 28, 36, 47, 50, 58
security testing .......... 28
security tool .......... 58, 60
simulators .......... 24
site acceptance testing .......... 27
software development .......... 8, 11, 21, 22
software development model .......... 22
special considerations for some types of tool .......... 62
specification-based technique .......... 29, 39, 40
specification-based testing .......... 37
stakeholders .......... 12, 13, 16, 18, 26, 39, 45, 54
state transition testing .......... 40, 41
statement coverage .......... 42
statement testing .......... 42
static analysis .......... 32, 36
static analysis tool .......... 31, 36, 58, 59, 63
static technique .......... 31, 32
static testing .......... 13, 32
stress testing .......... 28, 58, 60
stress testing tool .......... 58, 60
structural testing .......... 24, 28, 29, 42
structure-based technique .......... 39, 42
structure-based test design technique .......... 42
structure-based testing .......... 37, 42
stub .......... 24
success factors .......... 35
system integration testing .......... 22, 25
system testing .......... 13, 22, 24, 25, 26, 27, 49, 69
technical review .......... 31, 33, 34, 35
test analysis .......... 15, 38, 48, 49
test approach .......... 38, 48, 50, 51
test basis .......... 15
test case .......... 13, 14, 15, 16, 24, 28, 32, 37, 38, 39, 40, 41, 42, 45, 51, 55, 59, 69
test case specification .......... 37, 38, 55
test cases .......... 28
test closure .......... 10, 15, 16
test condition .......... 38
test conditions .......... 13, 15, 16, 28, 38, 39
test control .......... 15, 45, 51
test coverage .......... 15, 50
test data .......... 15, 16, 38, 48, 58, 60, 62, 63
test data preparation tool .......... 58, 60
test design .......... 13, 15, 22, 37, 38, 39, 43, 48, 58, 62
test design specification .......... 45
test design technique .......... 37, 38, 39
test design tool .......... 58, 59
Test Development Process .......... 38
test effort .......... 50
test environment .......... 15, 16, 17, 24, 26, 48, 51
test estimation .......... 50
test execution .......... 13, 15, 16, 32, 36, 38, 43, 45, 57, 58, 60
test execution schedule .......... 38
test execution tool .......... 16, 38, 57, 58, 60, 62
test harness .......... 16, 24, 52, 58, 60
test implementation .......... 16, 38, 49


test leader .......... 18, 45, 47, 55
test leader tasks .......... 47
test level .......... 21, 22, 24, 28, 29, 30, 37, 40, 42, 44, 45, 48, 49
test log .......... 15, 16, 43, 60
test management .......... 45, 58
test management tool .......... 58, 63
test manager .......... 8, 47, 53
test monitoring .......... 48, 51
test objective .......... 13, 22, 28, 43, 44, 48, 51
test oracle .......... 60
test organization .......... 47
test plan .......... 15, 16, 32, 45, 48, 49, 52, 53, 54
test planning .......... 15, 16, 45, 49, 52, 54
test planning activities .......... 49
test procedure .......... 15, 16, 37, 38, 45, 49
test procedure specification .......... 37, 38
test progress monitoring .......... 51
test report .......... 45, 51
test reporting .......... 45, 51
test script .......... 16, 32, 38
test strategy .......... 47
test suite .......... 29
test summary report .......... 15, 16, 45, 48, 51
test tool classification .......... 58
test type .......... 21, 28, 30, 48, 75
test-driven development .......... 24
tester .......... 10, 13, 18, 34, 41, 43, 45, 47, 48, 52, 62, 67
tester tasks .......... 48
test-first approach .......... 24
testing and quality .......... 11
testing principles .......... 10, 14
testware .......... 15, 16, 17, 48, 52
tool support .......... 24, 32, 42, 57, 62
tool support for management of testing and tests .......... 59
tool support for performance and monitoring .......... 60
tool support for static testing .......... 59
tool support for test execution and logging .......... 60
tool support for test specification .......... 59
tool support for testing .......... 57, 62
top-down .......... 25
traceability .......... 38, 48, 52
transaction processing sequences .......... 25
types of test tool .......... 57, 58
unit test framework .......... 24, 58, 60
unit test framework tool .......... 58, 60
upgrades .......... 30
usability .......... 11, 27, 28, 45, 47, 53
usability testing .......... 28, 45
use case test .......... 37, 40
use case testing .......... 37, 40, 41
use cases .......... 22, 26, 28, 41
user acceptance testing .......... 27
validation .......... 22
verification .......... 22
version control .......... 52
V-model .......... 22
walkthrough .......... 31, 33, 34
white-box test design technique .......... 39, 42
white-box testing .......... 28, 42
