Optional Early Instrument Evaluation

Transcription

Elizabeth Vilky
Sr. Director of State and Member Relations
What is the Optional Early Instrument Evaluation?
• Early in the accreditation process, providers can
elect to submit to CAEP the generic assessments,
surveys, and scoring guides/rubrics that they expect
to use to demonstrate that they meet CAEP
standards.
Purpose
• Provide EPPs with formative feedback on how to
strengthen assessments
– No rating or accreditation decision is made on the assessments
– Feedback is given to EPPs to facilitate improvement of the
assessments used across the EPP
• Part of CAEP’s commitment to capacity building
Timing
• The review is scheduled three years* before the due date of
the self-study
• Submissions are due by October 1 for the fall cycle and April 1
for the spring cycle
• The timing allows EPPs to use the feedback to improve the
quality of assessments before the self-study submission and
the site visit
*CAEP early adopters can submit assessments for review.
However, EPPs will not be penalized for not having time
to make changes.
Reviewers
• Reviewers are being selected from the OVA system
• Reviewer training is completed in April
• EPPs should begin receiving feedback in May
• Early adopters can submit plans for changes to assessments
What is submitted?
• Provider-created assessments, surveys, and scoring
guides/rubrics used as evidence for CAEP standards.
• Data are not submitted with the assessments.
*Proprietary Assessments do not need to be submitted.
Proprietary Assessments
• Proprietary Assessments
 Assessments that are external to the EPP and for which
property rights are held by another agency
• State licensure exams
• edTPA or PPAT
• State-required assessments (e.g., clinical observation
instruments)
• Any required state- or national-level assessment
• Validity and reliability are established by an external source
 Proprietary assessments are not submitted for review
Proprietary Assessments (cont.)
• EPPs will provide context for the use of proprietary
assessments
 When during the program the assessment is used
 Whether the assessment is mandated or elective for the EPP
 The alignment of the proprietary assessment with the CAEP
standard
 If available, validity/trustworthiness and
reliability/consistency data for the instrument
EPP Created Assessments
• Often include, but are not limited to:
 Clinical observation instruments
 Work sample instruments
 Lesson and/or unit plan instruments
 Dispositional instruments
 Reflection instruments
 Surveys
• Candidate exit surveys
• Employer surveys
• Student surveys
• Alumni surveys
Steps for Submission
• Step 1
 Three years before the due date of self-study, the EPP
requests a shell for submission of assessments
• Step 2
 EPPs identify on a chart the proprietary assessments to
be submitted as evidence
 Complete a checklist of which proprietary assessments
provide evidence for which CAEP Standards
Steps for Submission
• Step 3
 EPPs attach/upload to shell:
• Instruments created by the provider (such as student teaching
observation protocols used during clinical experiences, survey
instruments, teacher work samples, portfolios, candidate exit
surveys, employer surveys, and other common measures of
candidate competency)
• Scoring guides/rubrics for these instruments
• A table that identifies which items on assessments or surveys
provide evidence for individual CAEP standards and, in those
states making the feedback program review option available,
indicates the alignment with state standards (a hypothetical
illustration follows)
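As an illustration only, the alignment table can be thought of as a
mapping from instrument items to standards. The sketch below (Python;
the instrument names, item labels, and component numbers are invented
for illustration, not CAEP-prescribed) shows one way such a map might
be kept and inverted:

# Minimal sketch of an item-to-standard alignment map.
# Instrument names, item labels, and components are hypothetical.
alignment = {
    "Clinical Observation Instrument": {
        "Item 3 (plans differentiated instruction)": ["CAEP 1.1"],
        "Item 7 (uses assessment data to adjust instruction)": ["CAEP 1.2"],
    },
    "Employer Survey": {
        "Q4 (graduate's impact on student learning)": ["CAEP 4.1"],
    },
}

# Invert the map to see which items support each standard.
by_standard = {}
for instrument, items in alignment.items():
    for item, standards in items.items():
        for std in standards:
            by_standard.setdefault(std, []).append(f"{instrument}: {item}")

for std, sources in sorted(by_standard.items()):
    print(std, "<-", "; ".join(sources))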
Steps for Submission
• Step 4
 In the space provided in the shell, EPPs answer questions on
the development of each assessment, describe how validity was
established for each assessment, and describe the process by
which reliability was established or a plan for establishing
reliability.
Review Process
 CAEP assigns a lead reviewer and two additional
reviewers
• Reviewers are specifically trained on criteria for quality
assessments
• Reviewers provide feedback on:
– Quality of scoring guides or rubrics based on the criteria
– Quality of surveys based on the criteria
– Alignment of assessments to CAEP Standards
– Quality of the evidence for CAEP Standards
– Quality of the answers to the validity and reliability
questions
Review Process
• Steps in the review process
 Each of the three reviewers completes an independent
review through AIMS
 After all reports are submitted, the lead reviewer hosts a
conference call with the team
 The conference call generates a final team report, which is
submitted through AIMS
 CAEP staff completes a tech edit of the final report
 The EPP receives feedback on all submitted assessments by
March 1 for the fall cycle and September 1 for the spring cycle
Guidelines for Evaluating Assessments with Scoring Guides (Evidence Guide, pp. 22-25)
• HOW THE ASSESSMENTS ARE USED
 Is the point in the curriculum at which the assessment is
administered clear (e.g., first year, last year)?
• At entry, exit, mid-point, etc.?
• Are the curricular points an identified part of a clear
developmental sequence?
• NOTE: This information would be part of the
documentation that the assessments are relevant.
Guidelines for Evaluating Assessments with Scoring Guides (Evidence Guide, pp. 22-25)
• HOW THE INSTRUMENTS ARE CONSTRUCTED
 Are assessments aligned with CAEP Standards? If so,
then:
• the same or consistent categories of content appear in the
assessment as in the Standards;
• the assessments are congruent with the complexity, cognitive
demands, and skill requirements described in the Standards; and
• the level of respondent effort required, or the difficulty or degree of
challenge of the assessments, is consistent with the Standards and
reasonable for candidates who are ready to teach or to take on
other professional educator responsibilities.
• NOTE: Information on these aspects of assessments can be used by the
provider to demonstrate construct or content validity and relevance.
Guidelines for Evaluating Assessments with Scoring Guides (Evidence Guide, pp. 22-25)
• HOW THE INSTRUMENTS ARE SCORED
 Is there a clear basis for judging the adequacy of
candidate work?
• A rubric or scoring guide is supplied.
• Multiple raters or scorers are used.
• There is evidence that the assignment measures what it
purports to measure (NOTE: this information would be part of
the evidence for construct validity or content validity and
relevance) and that results are consistent across raters and
over time (NOTE: this would be evidence of reliability).
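As one illustration of how an EPP might document consistency across
raters, the minimal sketch below (Python; the rubric scores are
invented) computes percent agreement and Cohen's kappa for two raters
scoring the same candidate work. CAEP does not mandate a particular
statistic; these are simply common choices.

from collections import Counter

def percent_agreement(r1, r2):
    """Share of candidate work samples scored identically by both raters."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_observed = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_chance = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical rubric levels (1-4) assigned by two raters to ten candidates.
rater1 = [3, 4, 2, 3, 3, 4, 1, 2, 3, 4]
rater2 = [3, 4, 2, 2, 3, 4, 1, 2, 3, 3]
print(f"Percent agreement: {percent_agreement(rater1, rater2):.2f}")  # 0.80
print(f"Cohen's kappa:     {cohens_kappa(rater1, rater2):.2f}")       # ~0.72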
Guidelines for Evaluating Assessments with Scoring Guides (Evidence Guide, pp. 22-25)
• HOW THE INSTRUMENTS ARE SCORED (continued)
 What do the performance levels represent?
• There are three, four or five distinct levels, and they are clearly
distinguishable from one another.
• For each level of performance, attributes are described that are
related to actual classroom performance; the descriptions are not
simply mechanical counts of particular behaviors.
• Levels represent a developmental sequence in which each
successive level is qualitatively different from the prior level.
• It is clear which level represents exit proficiency (ready to
practice).
• NOTE: Information in this category would help document that the
evidence is actionable—it is in forms directly related to the preparation
program and can be used for program improvement and for
feedback to the candidate.
Quality Surveys
 Surveys allow EPPs to:
• Gather information for program improvement
• Access a broad spectrum of individuals
– Candidate satisfaction
– Graduate satisfaction
– Employer satisfaction
– Clinical faculty perceptions of candidates’ preparedness for
teaching
 Characteristics of a quality survey:
• Carefully designed
• Allows for systematic collection of data
• Measures the property it claims to measure
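As one illustration of how survey quality can be checked
quantitatively, the sketch below (Python; the Likert responses are
invented) computes Cronbach's alpha, a common internal-consistency
(reliability) statistic for multi-item scales. It is offered only as
an illustration, not as a CAEP requirement.

def cronbach_alpha(item_columns):
    """Cronbach's alpha for a multi-item scale.

    item_columns: one list per survey item, each holding that item's
    scores across all respondents (same respondent order in every list).
    """
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    k = len(item_columns)
    sum_item_var = sum(variance(col) for col in item_columns)
    totals = [sum(scores) for scores in zip(*item_columns)]
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Hypothetical 1-5 Likert responses: three items, five respondents.
responses = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # ~0.86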
Guidelines for Evaluating Surveys (Evidence Guide, pp. 25-27)
• Are the purpose and intended use of the survey clear
and unambiguous?
• Is the point in the curriculum at which the survey is
administered clear (e.g., first year, last year)?
 At mid-point, exit, pre-service, in-service, etc.?
 Are surveys being used at different points so
comparisons can be made? (For example, are
candidates surveyed at the completion of the program
as well as one or two years after completion?)
• NOTE: This information would be part of the documentation that
surveys are relevant.
Guidelines for Evaluating Surveys (Evidence Guide, pp. 25-27)
• HOW THE SURVEYS ARE CONSTRUCTED
 Is it clear how the EPP developed the survey?
 Are the individual items or questions in the survey constructed in a
manner consistent with sound survey research practice?
• Questions should be simple and direct.
• Questions should have a single subject and not combine two or more
attributes (e.g., "Was your preparation rigorous and relevant?" asks
about two attributes at once).
• Questions should be stated positively.
• Leading questions should be avoided.
• Response choices should be mutually exclusive and exhaustive.
• NOTE: Information of this type would be part of the documentation that
surveys are valid in terms of construct or face validity and that they
are relevant.
Guidelines for Evaluating Surveys (Evidence Guide, pp. 25-27)
• HOW RESULTS ARE SCORED AND REPORTED
 What efforts were made to ensure an acceptable return rate
for surveys? Has a benchmark been established? (A worked
example of the return-rate arithmetic follows this list.)
 What conclusions can or cannot be determined by the data
based on return rate? Is there a comparison of respondent
characteristics with the full population or sample of intended
respondents?
 How are qualitative data being evaluated?
 How are results summarized and reported? Are the
conclusions unbiased?
 Is there consistency across the data and are there
comparisons with other data?
• NOTE: This information can be used by the EPP, in part, to document
reliability.
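A minimal sketch of the return-rate arithmetic referenced above
(Python; all counts and shares are invented). A large gap between
respondent and population characteristics would signal possible
nonresponse bias and limit the conclusions the data can support.

# Hypothetical counts for one employer-survey administration.
invited, returned = 120, 78
return_rate = returned / invited
print(f"Return rate: {return_rate:.0%}")  # 65%

# Compare one respondent characteristic with the full population of
# intended respondents (here, share of elementary-program completers).
share_elementary_population = 0.52
share_elementary_respondents = 0.61
gap = abs(share_elementary_respondents - share_elementary_population)
print(f"Representation gap: {gap:.0%}")  # 9%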
Guidelines for Evaluating Surveys (Evidence Guide, pp. 25-27)
• INFORMING SURVEY RESPONDENTS
 Is the intent of the survey clear to respondents and
reviewers?
• A cover letter or preamble explains what respondents are
being asked to do and why.
• The sequence of questions makes sense and is presented in
a logical order.
• Individual items or questions are grouped under
appropriate headings and subheadings.
 Are clear and consistent instructions provided to
respondents so they know how to answer each section?
• NOTE: This information could be part of the self-study
documentation that the survey is fair.
Rubric for Evaluation of EPP Instruments
Draft available at: http://caepnet.org/resources/
Evidence for Standards
• Most candidate data from assessments will be submitted as
evidence for Standard 1
• Validity and reliability evidence will be submitted as
evidence for Standard 5 (Quality Assurance)
• Feedback will be used by EPPs to improve or modify
assessments
• Feedback will be used by EPPs to improve or modify
validity and reliability processes
• A member of the assessment team will serve on the site
visit team
Contact Information
 Stevie Chepko, Senior Vice President, Accreditation
stevie.chepko@caepnet.org
To request shells, contact:
 Monica Crouch, Accreditation Associate,
monica.crouch@caepnet.org
QUESTIONS
CONNECT WITH CAEP | www.CAEPnet.org | Twitter: @CAEPupdates