MTS #111
Testing
| Question | Answer |
|---|---|
| 111.1 State the purpose of a testing program. | To test trainees' attainment of the learning objectives |
| .2 State the roles and responsibilities of the following for an effective testing program: A) Naval Education and Training Command (NETC) | Policy and guidance |
| .2 NETC N7 | Policy and guidance oversight |
| .2 Learning Center Commanding Officer | Serves as CCA |
| .2 Learning Center Director of Training | Ensures testing programs are conducted |
| .2 Learning Center Learning Standards Officer | Guidance for curriculum on developing testing |
| .2 Course Curriculum Model Manager (CCMM) | Approves test design |
| .2 Curriculum developer | Design and develop test plan |
| .2 Learning Site Commanding Officer/Officer-in-charge | Implements testing plan |
| .2 Learning Site Testing Officer | Test administration/leading |
| .2 Course Supervisor | Test item analysis |
| .2 Participating Activities | Provides feedback |
| .3 State the primary course source data for creating test items | NAVEDTRA 132 p.3-4 JDTA, OCCSTDS, CTTL/PPP Table, COI |
| .4 List usable course source data to be used when the primary course source data is not available or has not been created. | NAVEDTRA 132 p. 3-4. If JDTA data is not available, curriculum developers will bridge the absence of JDTA data using data elements from a combination of: Occupational Standards (OCCSTDs), CTTL, PPP Table, and a COI. |
| .5 Define the following tests: Formal and Informal | NAVEDTRA 132 p. 3-10. Formal: "used in final grade calculation." Informal: "not used in final grade." |
| .6 For the below items, define the three proficiency levels contained within each: a. skill b. knowledge | a. Skill: Level 1 - imitation; Level 2 - repetition; Level 3 - habit. b. Knowledge: Level 1 - knowledge/comprehension; Level 2 - application/analysis; Level 3 - synthesis/evaluation |
| .7 List the five categories for performance and knowledge tests. | 1. Pretest - assesses trainee's knowledge for prerequisite verification or acceleration; 2. Progress test - covers blocks of instruction; 3. Comprehensive test - within-course or final exam; 4. Oral test - assesses trainee knowledge, normally before a board of members; 5. Quiz - short test to assess knowledge comprehension |
| .8 Discuss the process of piloting a test. | SME → CCMM → LSO (review) → piloted → approved |
| .9 Describe the use of each test instrument as they relate to knowledge and performance tests: A. Job sheet | guides step-by-step performance of a task (similar to an "MRC") |
| .9B Problem sheet | practical problems requiring solving |
| .9C Assignment sheet | direct the study or homework |
| .9D Multiple-choice | most versatile test item format |
| .9E True or False | provides only two answer choices |
| .9F Matching | two lists of connected words, phrases, or pictures to be matched |
| .9G Completion | fill in the blank |
| .9H Labeling | recall facts and label pictures |
| .9I Essay | answer questions with written response |
| .9J Case study | when comprehensive understanding of material is required |
| .9K Validation of Test Instruments | after test instruments have been constructed, their content must be validated (before the test is administered) |
| .10 What are the two types of testing methods used in testing? | 1. Criterion-referenced test - determines whether the required skill or knowledge has been met; 2. Norm-referenced test - estimates an individual's skill or knowledge relative to a group (e.g., the Navy advancement exam) |
| .11 Discuss test failure policies and associated grading criteria within your learning environment. | If a test is failed, the trainee is retrained and retested. If the retest is passed, the highest score the trainee can receive is 80%. |
| .12 Discuss, during performance test design, how the skill learning objective criticality is determined. | |
| .13 Discuss, during knowledge test design, how the criticality of the knowledge learning objective to perform a task is determined. | |
| .14 Identify the ten sections of a testing plan. | NAVEDTRA 132 p. 11-2: course data, course roles and responsibilities, course waivers, test development, test administration, course test and test types, grading criteria, remediation, test and test item analysis, documentation (C3, T3, GRTD) |
| .15 State the purpose of test and test item analysis. | Test and test item analysis techniques are required to determine statistical validity; they are used to compute the difficulty index, the index of discrimination, and the effectiveness of alternatives. |
| .16 In a remediation program, discuss what the primary and secondary goals are. | primary - "motivate and assist" trainees in achieving the learning objectives; secondary - "remove barriers" to learning |
| .17 Discuss the three methods of remediation available to instructors: | targeted - one-on-one mentorship on the "specific area" the trainee is having issues with; scalable - one-on-one mentorship on each "major area" the trainee is having issues with; iterative - one-on-one mentorship using "total recall" of everything covered |
| .18 Define the following sections of a remediation program: A. retest B. setback C. drop from training and attrites D. counseling E. academic review boards (ARBs) | retest - when the trainee does not achieve the minimum grade |
| .18B setback | determined by the degree of difficulty the trainee had with the test |
| .18C drop from training and attrites | trainee is unsuited to complete the course and is discharged from the Navy |
| .18D counseling | preventive counseling used in "A"/"C" schools for personal and performance problems |
| .18E academic review boards (ARBs) | used when all other forms of counseling have failed |
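
The difficulty index and index of discrimination named in item .15 can be illustrated with a short sketch. This is not drawn from NAVEDTRA 132: the function names and the upper-/lower-group convention are illustrative assumptions, shown only to make the two statistics concrete.

```python
# Illustrative sketch of the item-analysis statistics in item .15.
# Assumption: responses are coded 1 = correct, 0 = incorrect.

def difficulty_index(responses):
    """Fraction of trainees who answered the item correctly."""
    return sum(responses) / len(responses)

def discrimination_index(upper_group, lower_group):
    """Difference between the difficulty indexes of the upper- and
    lower-scoring groups; larger values mean the item better
    separates strong trainees from weak ones."""
    return difficulty_index(upper_group) - difficulty_index(lower_group)

# Example: 6 of 8 upper-group trainees answered the item correctly,
# while only 2 of 8 lower-group trainees did.
upper = [1, 1, 1, 1, 1, 1, 0, 0]
lower = [1, 1, 0, 0, 0, 0, 0, 0]
print(difficulty_index(upper + lower))     # 0.5
print(discrimination_index(upper, lower))  # 0.5
```

An item answered correctly by nearly everyone (or no one) has a difficulty index near 1.0 (or 0.0) and tells the analyst little; a low or negative discrimination index flags an item worth reviewing.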