Standard 5 and A.5
The provider maintains a quality assurance system comprised of valid data from multiple measures, including evidence of candidates’ and completers’ positive impact on P-12 student learning and development. The provider supports continuous improvement that is sustained and evidence-based, and that evaluates the effectiveness of its completers. The provider uses the results of inquiry and data collection to establish priorities, enhance program elements and capacity, and test innovations to improve completers’ impact on P-12 student learning and development.
SUMMARY
As demonstrated across Standards 1/A.1 - 4/A.4, the VCU EPP Quality Assurance System supports systematic data collection, reporting, and monitoring on all CAEP Standards. Leveraging several technology systems, the EPP collects, tracks, aggregates, and reports data on multiple measures in response to faculty/program needs, to monitor candidate progress and completer achievement, and to understand the EPP’s effectiveness and opportunities for continuous improvement. Through multiple committee structures that include staff, faculty, leadership, and P-12 partners, the EPP routinely investigates the quality and usefulness of existing measures and uses that information to make needed adjustments, ensuring its QAS relies on high-quality evidence. Further, these committees engage in an ongoing review of data, deepening their review and understanding over time, such that the QAS forms the basis for the continuous improvement function. Through partnerships, as well as its website presence, the EPP is transparent about its outcomes, sharing results with stakeholders and the public (https://soe.vcu.edu/faculty-staff/offices/assessment/caep-measures).
EXAMPLES OF EXCELLENCE IN PRACTICE
- A comprehensive quality assurance system to support a culture of continuous improvement
- The QAS defines a set of policies and procedures, as well as committees, stakeholders, offices, and personnel, established to ensure quality in admissions, courses, program design, and candidate and completer performance.
- The Quality Assurance System (QAS) supports the EPP in reaching its mission and goals through purposeful analysis and use of evidence derived from all five CAEP standards.
- Engagement of stakeholders in reviewing programs and data to inform continuous improvement
- VCU EPP re-established the Professional Education Coordinating Council and established the new Clinical Experience Advisory Board to engage individuals from P-12 schools and divisions in co-constructing mutually beneficial partnerships.
- The EPP engages in transparent data-sharing practices with faculty, staff, and partners to enhance understanding of our strengths and opportunities to improve processes, curriculum, measures, student learning, and ultimately our completer outcomes.
IMPROVEMENTS EMERGING FROM THE SELF-STUDY
- Through partnerships with P-12 schools, the EPP seeks to understand the context of partner schools/divisions and engage partners in co-constructing learning experiences that are mutually beneficial by enhancing VCU EPP candidates’ preparation for Title I schools. Beginning 2020-21, VCU EPP will implement an evaluation of these partnerships going forward to determine the effectiveness of our efforts to support mutually beneficial partnerships.
- The VCU SOE Strategic Plan incorporates goals and strategies aligned to CAEP standards (e.g., achievement of CAEP accreditation; recruitment of diverse candidates and faculty; systematic evaluation of completer outcomes; high-quality clinical experiences), demonstrating the ongoing and far-reaching efforts to develop a culture of evidence and to maintain the EPP’s commitment to educator preparation.
The VCU EPP, including the School of Education, the School of the Arts, and the College of Humanities and Sciences, firmly believes in meaningful assessment and evaluation for initial and advanced programs that is developmental, continuous, and systematically implemented to inform our understanding of candidate progress, completer achievement, and provider operational effectiveness. The Quality Assurance System (QAS) supports the EPP in reaching its mission and goals through purposeful analysis and use of evidence derived from all five CAEP Standards. The QAS defines a set of policies and procedures, as well as committees of key stakeholders and partners, offices, and personnel, established to ensure quality in admissions, courses, program design, and candidate and completer performance. Evidence for Standard 5/A.5 is presented holistically to address the standard.
The QAS for initial and advanced programs operates on a framework (85.3) including:
- personnel and technology to manage data collection, monitoring, and data use;
- assessments of applicant qualifications and candidate knowledge, skills and dispositions implemented at specific gates in a program;
- evaluations of field experiences, academic programs, EPP operations, and P-12 partnerships;
- systematic management of recruitment and admissions data to attract quality applicants from diverse groups and in high-need areas;
- procedures for monitoring of completer outcomes;
- policies, procedures, and practices to govern the collection, analysis, and dissemination of data, and to support the use of data for candidate evaluation and program and EPP level improvement, and;
- key stakeholders of various roles who engage in periodic review of data and review of the QAS itself to inform continuous improvement.
Standard 5/A.5 Primary data sources
The VCU EPP QAS Handbook (85.1) provides narrative and additional tables/resources to expand on each component of the QAS framework, including program assessment maps and descriptions of processes and procedures, timelines, and roles/responsibilities to ensure a healthy QAS. Three additional table documents provide high-level summaries. Specifically, the Assessment and Accountability Timeline details key dates and deliverables of the QAS by month of the academic year and identifies the responsible party for each. The Data Quality Map for VCU EPP Surveys and Assessments (85.2) outlines the measures used by initial and advanced programs (EPP-created and proprietary), aligned to CAEP Standards, and summarizes past and ongoing efforts to examine and establish reliability and validity evidence for EPP measures. The VCU EPP QAS Data Workflow offers a summary of the systematic review of data, measures, and procedures associated with the QAS to guide continuous improvement. Lastly, the Graduate Program and SOE Data Monitoring Dashboard provides transparent, ongoing access to key measures related to all CAEP Standards, as well as measures that support monitoring toward the EPP's mission and goals.
5.1 / A.5.1 QAS comprised of multiple measures
The VCU EPP QAS draws on a coherent set of multiple measures (85.2) that monitor candidate progress (See Standard 1 data), completer achievements (See Standard 4 data), and provider operational effectiveness (See Standard 2 data; See Standard 3 data; 85.1-Appendix B/E). These measures each align to CAEP Standards and support the EPP’s efforts to understand effectiveness and pursue continuous improvement. Procedures and timelines are in place to assure data collection and entry, scheduled analysis and dissemination of data summaries, planned discussions among faculty and stakeholders, and documentation of data-based changes to improve candidate performance, program quality, and EPP operations (85.1-Appendix B/E; 85.3).
QAS Technologies
Data collection and reporting for admissions, candidate assessment, completer performance, and EPP effectiveness are supported by six complementary technologies (described in 85.1). First, Banner supports access to university student data (e.g., contact information, demographics) and course data (e.g., enrollments). These data may be accessed through Banner directly on a by-candidate basis, or via SAS Enterprise Guide with support from the Office of Assessment. Insights and the VCU Reporting Center provide the EPP a dashboard to access, query, and report Banner data in aggregate form, at the program- and EPP-levels, and to disaggregate by candidate demographics (e.g., gender, race/ethnicity).
Next, the EPP uses Tk20 as its assessment-management system. It provides a web-based interface for completing course- and clinical-based assessments and evaluations. Tk20 facilitates management of QAS data for individual candidates, aggregate reports, and the alignment of programmatic features (e.g., courses, rubrics) to CAEP, InTASC, professional, and state standards, as relevant. Tk20 also integrates non-rubric assessment data (e.g., content GPA, GRE scores, Praxis scores, clinical placements) imported from various other university and third-party systems (e.g., ETS) and used by decision-makers and Student Services Center staff, who may also add data about admissions, placements, and endorsements. Tk20 provides a large number of core reports as well as a custom-query tool to create non-routine reports.
Clinical experience management, including tracking, selection, and matching of cooperating teachers and clinical faculty for the culminating clinical experiences, is maintained in rGrade. The university Degree Works system is a web-based degree audit tool that helps students and advisers monitor progress toward degree completion. While Degree Works is a self-service tool for students and advisers, it is also the official means of confirming that students have completed requirements for graduation. Degree Works is used in the EPP for candidate monitoring, per CAEP Standard 3/A.3.
Finally, program quality and EPP operational effectiveness surveys, including completer and employer surveys and evaluations of advising, clinical placements, and cooperating professionals are collected through web-based surveys using REDCap, the university licensed survey software, as well as the survey functions offered by the Tk20 assessment system.
Admissions and Candidate Monitoring Data from Standards 3/A.3
As discussed in Standard 3/A.3, both initial licensure and advanced candidates must meet admissions criteria and demonstrate performance at established levels across program requirements to complete the program and enter practice. Decisions about candidate performance are based on multiple assessments (e.g., GPA, test scores, EPP assessments, licensure requirements) organized around gates within the program and follow-up evaluations of graduates in practice (85.1). This sequencing allows faculty and staff to track individual candidate progress toward completion using defined points in time and consistent data sources across candidates and programs (initial and advanced). In addition to the individual candidate monitoring discussed in Standard 3/A.3, the cohort average GPA and average score on each assessment are summarized annually for candidates entering in the most recent academic year (52). The Academic Petition Committee reviews the academic qualifications of the incoming cohort of candidates against the minimum group average performance requirements articulated by CAEP.
To understand recruitment effectiveness, Banner data are queried each semester using Insights and the Reporting Center to generate reports to monitor counts on applications, admissions, and enrollments, as well as admission and yield rates. These data are summarized in the CAEP Recruitment Monitoring Data with disaggregations by program, gender, race, and under-represented minority status. Equivalent university benchmarks are included to understand effectiveness of recruitment efforts in increasing enrollment and the effectiveness of efforts on diversifying the incoming cohort profile (92; 51).
Course- and Clinical-based Assessments from Standards 1/A.1, 2/A.2
Assessment data (See Standard 1 data) on initial and advanced candidate performance are based on multiple assessments from EPP-created course and clinical assessments and proprietary sources and are systematically collected as candidates progress through program gates. All data from program and EPP level assessments are entered directly into Tk20 by faculty and cooperating professionals for School of Education and School of the Arts programs. The Office of Assessment monitors collection of data and reports the status of candidate performance data collection to department chairs. Chairs are responsible for ensuring that all data are collected and reported to the Office of Assessment or entered directly into Tk20. The Office of Assessment summarizes and aggregates common assessment and program-specific assessment data at both the program and EPP level annually and shares data via the Graduate Program and SOE Data Monitoring Dashboard (92). Further, statistical analysis using SPSS explores variations in performance by student characteristics (e.g., program, gender, and race/ethnicity) (85.3). Course- and clinical-based assessment data for Math Specialist and Adapted/Severe Disabilities candidates are collected and aggregated by the program chair and reported to the Office of Assessment, as these programs do not host Tk20 accounts for their students.
Survey Evaluations from Standards 4/A.4, 5/A.5
VCU EPP uses a number of surveys to collect feedback from candidates, cooperating professionals, and completers and their employers. Evaluation survey data for all programs are collected using the University approved survey system, REDCap or via forms in Tk20 assessment system. Procedures for administration and timing of evaluation surveys, as well as data aggregation information, are detailed in the QAS handbook (85.1).
For EPP clinical experience measures (Principal Evaluation of Candidates in Internship, Cooperating Teacher / University Supervisor peer evaluations, and Candidate Evaluation of Clinical Experience), appropriate stakeholders receive an electronic link to the survey evaluation form via email. The timelines for administration of survey evaluations are detailed in the Assessment and Accountability Timeline (85.1, Appendix A) and are scheduled to align with the end of clinical experiences. Surveys are generally open for a period of two weeks with a reminder to encourage participation. Row-level data from these clinical experience measures are reported each semester to the Student Services Center and faculty/staff with responsibility for selection of clinical supervisors for placements of candidates. Aggregate data at the program and EPP levels are summarized annually. Aggregate data are shared with appropriate stakeholders, including program and department chairs, EPP leadership, and the Clinical Experience Advisory Board (85.1; 85.3).
The Office of Assessment also collects and aggregates completer and employer satisfaction survey data once a year. The summary documents for standards 4.3/A.4.1 (74; 79.1; 83.1) and 4.4/A.4.2 (78.1; 84.1) include detailed descriptions of administration and data aggregation procedures. As of 2019-20, VCU EPP collects advanced completer/employer survey data via REDCap, and initial completer/employer data are collected through the Virginia Educational Assessment Collaborative (VEAC). As described in Standard 4, the Completer Survey cohort derives from the employment data the Virginia Department of Education shares with Virginia Commonwealth University. This partnership with the VDOE provides VCU EPP updated employment data on completers to support sample identification for completers 1-3 years post-program who are employed in education. For the Employer Survey, invitations for feedback are sent to those who employ a completer identified for the completer cohort. The survey is administered in spring, and employment data are recorded as of October 1 of the prior year; therefore, all completers on whom data are collected have been teaching at least one year.
Through its recent partnership with a state-wide collaborative of EPPs (the Virginia Education Assessment Collaborative), the EPP's initial employer and completer surveys are facilitated centrally, supporting centralized data collection, state-wide benchmark aggregation, and program-level reporting. EPP- and program-level data, as well as state-wide benchmarks, are reported for initial licensure programs to allow for comparisons. For advanced programs, VCU EPP administers the completer and employer surveys, collected and reported at the program level. Data reports for completer and employer surveys are shared with program faculty, the Assessment Committee (95, e.g., October 2019; March 2020), the Professional Education Coordinating Council (25, e.g., April 2019), and broadly via the SOE Public Data web page (96).
CAEP Annual Measures
Data for the CAEP Annual Measures derive from measures collected and monitored as part of the five CAEP Standards, in particular Standard 4/A.4. Data outcomes are aggregated and summarized annually and posted electronically to the EPP website (96; https://soe.vcu.edu/faculty-staff/offices/assessment/caep-measures/) for public review, and the link is shared with department chairs, program coordinators, and the Professional Education Coordinating Council (25). The Assessment Committee (95, e.g., October 2019; March 2020) reviews the data annually for trends and implications for improvement to both programs and the QAS that provides the data. Items reporting fewer than ten cases are suppressed to protect candidate privacy.
5.2/A.5.2 QAS relies on measures with evidence for reliability and validity to support actionable data and inferences
The QAS includes policies, procedures, and practices at the program and EPP level to ensure the quality of evidence for EPP measures (85.1). VCU EPP develops assessments for initial and advanced programs mindful of principles of good evidence as described by CAEP and dedicates resources to robust rubric analyses, as described in the 1/A.1 assessment guidance documents (86.1; 86.2; 86.3), to ensure that data, and the interpretations drawn from them, derive from measures with evidence of reliability and validity. The EPP uses a standard set of steps to review each assessment. Content validity is assured through alignment of course and rubric content to appropriate discipline standards, with confirmation of relevance and comprehensiveness by P-12 partners. Calculations of internal consistency, rater agreement, and patterns of missing data provide evidence to confirm the reliability of responses. Where program sizes support disaggregated evidence, t-tests or ANOVAs are applied to ensure fairness and lack of bias in scoring by program area, ethnicity, and gender. As the advanced programs are phasing in data collection for some A.1 assessments, rubric analyses are planned as sufficient sample sizes accumulate. These methods are summarized in the Data Quality Map (85.2) and are further explicated in the guidance documents (86.1; 86.2; 86.3) for each EPP-created initial and advanced measure.
To date, the EPP affirms that all three (100%) initial EPP-created common assessments meet the CAEP sufficient level. As described above, VCU EPP ensures construct and content validity of assessments through expert development, alignment to standards, and content validity ratings. Specifically, the CAEP Rubric Team, formed in 2016, collaborated to review measures in light of the new CAEP Standards (88). The group revised item language and developed new items as needed to ensure alignment of the Clinical Evaluation Continuum (the longitudinal and summative student teaching assessment) to the proficiencies detailed in the CAEP Standard 1 components and the ten InTASC Standards. The CAEP Evaluation Framework for EPP-Created Assessments guided item revisions and new item development, with a focus on evidence for test content by way of standards alignment and expert faculty participation, and evidence for response process through discussions centered on assessment fidelity in classrooms across grade levels, content areas, and school divisions. Further, P-12 partners evaluated the content validity of initial EPP common assessments for item relevance and clarity. From these ratings, the Office of Assessment computed scale and item content validity indices (I-CVI > .80 for all items; S-CVI = 1.00) (86.1; 86.2; 86.3).
Special Education programs do not require a Praxis subject test for content knowledge. As such, each VCU EPP Special Education licensure area administers and monitors a unique assessment of content knowledge based on the license. Content validity for content knowledge assessments in special education is supported by expert development (by program faculty) and alignment to professional standards to ensure content relevance, as presented in the assessment guidance documents (86.1; 86.2; 86.3). Special Education Adapted employs two course-based assessments for content knowledge that are approved by the Council for Exceptional Children. The Special Education (early childhood and general education) programs are collecting data on the content knowledge comprehensive examinations recently implemented to replace the prior summative portfolio review. The comprehensive examinations align to the Council for Exceptional Children Standards, demonstrating content validity. These exams will be evaluated against the CAEP Evaluation Framework for EPP-Created Assessments in subsequent semesters to ensure each meets CAEP's sufficiency level.
Similarly, at the advanced level, program faculty, P-12 partners, and the Director of Assessment partner to ensure content and construct validity for unique advanced program assessments of content knowledge and professional skills. As documented in the assessment guidance documents (86.1; 86.2; 86.3), rubrics are developed by program faculty to align with CAEP standards and professional standards for the licensure area. As detailed in the advanced program Phase-in Plan for A.1.1, scaffolded steps are in place to engage P-12 partners in review of all advanced program rubrics, for content validity against the CAEP Evaluation Framework for EPP-Created Rubrics. To date, International Literacy Association (8) reviewed Reading program assessments and awarded national SPA program recognition, and School Counseling received accreditation recognition from CACREP (8), which includes detailed review of program-level assessments for content validity. Internally, using the Lawshe method, P-12 partners with the Educational Leadership (A&S) program evaluated the content validity of program rubrics informing revisions to item language (86.1; 86.2; 86.3). In addition, as advanced program assessment data accumulate to larger sample sizes, the Office of Assessment will conduct analyses to ensure fairness and lack of bias by exploring performance by gender and ethnicity.
VCU EPP employs several strategies to support the reliability of candidate assessment data and investigates internal consistency (Cronbach's alpha) and inter-rater reliability (percentage agreement). At the initial level, raters (cooperating teachers, clinical faculty, university supervisors, program faculty) receive training on the Continuum (35; 36). Multiple raters are used to assess candidates' student teaching performance, and initial candidates are assessed twice on the Continuum during the clinical experience (formative and summative assessments). Internal consistency for each common assessment was calculated using Cronbach's alpha: Dispositions (alpha = 0.946); Lesson Plan (alpha = 0.937); Clinical Evaluation Continuum (alpha = 0.990 for Cooperating Teachers and 0.992 for University Supervisors). Percentage agreement for items on the 16-week placement final evaluation ranged from 63% to 98%, and from 64% to 100% for items on the second 8-week placement evaluation. As part of the roll-out of the revised Clinical Evaluation Continuum, cooperating professionals provided feedback on item content and application of the rubric in clinical experiences to assess candidates (87). Feedback guides training of evaluators to ensure consistent understanding of items and consistency in scoring.
The Office of Assessment staff analyzed data for each common assessment rubric by program, gender, and ethnicity to ensure fairness. Further, the EPP supports an appeals process that allows candidates the right to appeal course grades they consider to have been arbitrarily and capriciously assigned or assigned without regard for the criteria, requirements, and procedures of the course stated in the syllabus or guidelines for assignments.
At the advanced level, where sample sizes permit, SOE Office of Assessment staff calculated Cronbach's alpha for internal consistency, finding very good reliability of data (86.1; 86.2; 86.3). Further, analyses explored agreement between the on-site or school-based supervisor and university supervisor ratings on the School Counseling Clinical Continuum assessment. The Data Quality Map (85.2) and A.1.1 Phase-in Plan (18) detail the timeline for investigating reliability for other advanced rubrics moving forward.
Survey evaluation measures are developed with appropriate representation from licensure programs and in consultation with the Director of Assessment and appropriate committees/stakeholders. EPP faculty and leadership consider CAEP and InTASC Standards, state standards, and P-12 partner feedback when designing content and process for evaluations. The CAEP Evaluation Framework for EPP-created measures does not require validity and reliability studies for survey measures. Yet, VCU EPP does ensure alignment of survey items to appropriate standards, supporting survey content validity (e.g., completer and employer surveys align to InTASC and Virginia Performance Standards).
The EPP regularly examines the validity and utility of data from the quality assurance system to ensure key elements (e.g., syllabi, assessments, surveys) of the quality assurance system are aligned to relevant professional, state, and national standards (85.3). Evaluations are reviewed by various stakeholders for consistency with expectations in the CAEP Evaluation Framework for EPP-Created Assessments, with all initial measures rated at the sufficient level (85.2). As described earlier and in the guidance documents (86.1; 86.2; 86.3), the initial common assessments exemplify periodic review, update, and analysis of measures in the quality assurance system to ensure appropriate alignment and the validity and utility of candidate performance outcomes data. In addition, the examination of the validity and utility of data informs needed changes to ensure the quality assurance system is supported with the most appropriate data sources and methodologies. For example, the transition to the Virginia collaborative (VEAC) for shared initial 4.1 and 4.2 measures reflects efforts to ensure survey content alignment to relevant standards and to build benchmark state-wide data sets that help faculty make appropriate and valid inferences from the results, relative to other Virginia EPPs.
5.3/A.5.3 EPP regularly and systematically engages in data-driven continuous improvement
Faculty, staff, and administrators across offices and functions, together with completers, employers, and P-12 partners, contribute to the EPP quality assurance system through data inputs or the use of data to drive continuous improvement. The EPP QAS Workflow (85.3) illustrates the nature and scope of ongoing data review. The following list details key functions in the QAS and the stakeholders engaged in each:
- Recruitment goal setting and monitoring: recruitment specialists; program faculty
- Admissions monitoring: Office of Graduate Studies; Student Services Center; program faculty
- Student success/monitoring/retention: Associate Dean for Student Affairs and Inclusive Excellence; Student Services Center
- Student learning assessment: Office of Assessment; program faculty; cooperating teachers/university supervisors
- Licensure and accreditation oversight: Executive Director for Accreditation and Licensure
- Completer outcomes assessment: Office of Assessment; P-12 school division partners; program completers; VEAC
- Completer and employer satisfaction evaluation: Office of Assessment; program completers; employers of completers; VEAC
- Data aggregation and analysis: Assessment Committee; Student Services Center; Recruitment; Graduate Studies
- Systematic review of data to inform continuous improvement: Assessment Committee; Continuous Improvement Task Force; Professional Education Coordinating Council; Clinical Experience Advisory Board; program faculty; EPP leadership
The Office of Assessment provides oversight of the QAS, ensuring the policies and procedures are implemented to support robust data collection and use. Recruitment data and admissions quality data are shared and reviewed with program chairs, the Graduate Admissions Committee, and leadership on a semester or annual basis (54). Adjustments to recruitment and admissions advising are informed by these data. Assessment data are disseminated to program coordinators, department chairs, and EPP leadership annually (92). Data from Clinical Experience Evaluations are reported at the program and EPP levels each semester and annually. Data are shared with appropriate stakeholders, including program and department chairs, EPP leadership, and the Clinical Experience Advisory Board (23). Aggregate data reports for completer and employer survey measures are shared with program faculty, the Assessment Committee (95), the Professional Education Coordinating Council (25), and broadly via the SOE Public Data web page. CAEP data outcomes are aggregated and summarized annually and posted electronically to the EPP website for public review; the link is shared with department chairs, program coordinators, and the Professional Education Coordinating Council.
To support transparent data sharing, the Office of Assessment maintains electronic dashboards (92) for program coordinators, department chairs, and leadership to access EPP and program-level assessment and evaluation data. The dashboards include data from the multiple measures in the QAS that inform all five CAEP Standards. Data include program and EPP level data, multi-year trend data, when available, and disaggregations by candidate demographics (e.g., gender, race/ethnicity) as appropriate, to provide a benchmark to guide interpretations of strengths and areas for improvement.
At the program level, faculty use data on student performance on assessments to evaluate coverage of topics and determine if changes are appropriate. In August of each year, program coordinators are prompted to review data on the program-level dashboard, including student learning and evaluation data. Discussions of data and changes made at the program level are documented annually, in October, on the student learning outcomes assessment report (91.1, 91.2, 91.3, 91.4) maintained in the university's Taskstream system. Specifically, programs report on key student learning assessment findings and detail any data-driven changes. This report is reviewed for compliance and maintained centrally by the VCU Office of Academic Integrity and Assessment. Student learning assessment plans and annual data reporting feed into the continued university-level accreditation.
Various committees are in place to provide a systematic review of EPP-wide data and discussion of evidence across programs to identify opportunities for improvement. The Assessment Committee annually reviews EPP-level candidate learning data, data from the CAEP Annual Measures (93), and the procedures that guide the QAS as a whole. The Clinical Experience Advisory Board (22), including P-12 partners, is charged with reviewing clinical experience data to inform opportunities to improve the curricula, clinical experience, placement opportunities, and training for cooperating professionals. At a broader level, the Continuous Improvement Task Force (5.p), established in 2018, meets bi-monthly to review policies and procedures, as well as curriculum and assessment inputs and outcomes data, to identify opportunities to advance the EPP toward CAEP standards. Lastly, the Professional Education Coordinating Council meets twice annually to bring P-12 partners, faculty, and department and EPP leadership together to discuss division needs and updates, review curricular and clinical experiences, review candidate and completer data, and collaborate to share ideas and feedback that ensure ongoing mutually beneficial partnerships.
Based on interpretations and understandings of data reviewed in these dedicated committees or in program/department meetings, faculty, staff, or leadership may initiate curriculum changes to improve programs (e.g., 10). Recommendations for syllabus and program changes are initiated primarily at the program or department level. After departmental approval, the proposed changes are submitted to the School's Curriculum and Academic Resources Committee for review. Proposals that involve EPP changes are discussed by the Dean's Cabinet and with P-12 school partners through the PECC and CEAB committees, as appropriate to the nature of the proposal. Following EPP approval, proposed revisions are forwarded to university and state entities, where appropriate, for action. Detailed examples of changes based on data are referenced within the discussion of each standard throughout this self-study report.
Through review of data and quality assurance processes in these various committees, VCU learned that, although the EPP maintains committee structures at the program, EPP, and partner levels, there are opportunities to revisit the standing committees and the charge of each to ensure the EPP is efficient and strategic in its use of faculty and staff service time. Two strategies adopted in the current School of Education Strategic Plan relate to strategic review and use of data. Specifically, over the next two years of strategic plan implementation, VCU SOE will review the charges of both the Continuous Improvement Task Force and the Assessment Committee and determine whether reconfiguring membership, or combining the two committees, might reduce duplication of effort by faculty and staff while enhancing shared understanding of quality assurance processes, assessment procedures, and the data currently reviewed by one or both committees. Further, the EPP intends to build on current efforts for periodic data review by implementing dedicated, in-person "data days" in the 2020-21 academic year. These sessions will support review of key outcomes aligned to CAEP, state, and VCU/SOE strategic plans, fostering data literacy, collective inferences, and program- and EPP-level continuous improvement informed by the multiple measures available from the data-rich quality assurance system.
5.4 / A.5.4 CAEP Annual Reporting Measures
For the VCU EPP, the CAEP Annual Reporting Measures derive from key data points in the QAS and represent evidence that is actively integrated in the ongoing EPP QAS Data Workflow and aligned with evidence reported for Standards 1/A.1 and 4/A.4 regarding licensure outcomes and completer satisfaction and employment. The CAEP Annual Reporting Measures are publicly available on the School of Education Assessment Office webpage, with a prominent link on the VCU SOE home page, and are shared with EPP faculty and staff and P-12 partners. The Assessment Committee reviews both the measures themselves and the data annually to examine trends and opportunities for improvement (93; 95). Several key trends persist across the most recent three annual report cycles: 1) a high rate of employment of completers in Virginia public schools, particularly in the EPP's region; 2) a high rate of year-over-year retention in Virginia public schools for completers transitioning from year one to two and from year two to three in education; 3) the majority of completers employed in Virginia public schools work in Title I schools; 4) high ratings of satisfaction with the preparation program from both initial completers and employers, as well as for school counseling; 5) systemic gaps in employer and completer response rates for other advanced programs (i.e., reading and math specialist, administration and supervision); 6) continued demonstration of the ability of completers (initial and advanced) to meet licensing and state requirements; 7) consistently positive impact of initial completers on P-12 student learning; and 8) opportunities to expand data collection to verify initial completer effectiveness in the classroom.
To date, most data-driven improvements related to the CAEP Annual Reporting Measures involve refinements to QAS processes, procedures, and measures that enhance data quality and the utility of potential inferences from these data, including: 1) revised initial completer and employer surveys via the VEAC state group; 2) the addition of focus groups to supplement initial and advanced completer and employer satisfaction data; and 3) engagement with P-12 partners to expand and enhance data collection for components 4.1 and 4.2. However, as described in other standards, there are also examples of revisions to the curriculum and candidate opportunities that tie back to these data: 1) curricular updates (e.g., culturally relevant teaching/schooling) to align to the new VDOE Regulations Governing the Review and Approval of Education Programs in Virginia (see 1/A.1); 2) clinical experience revisions (e.g., clinical placements at multiple levels); 3) efforts to engage with P-12 partners to improve candidate readiness for Title I/high-need schools (see 2/A.2); and 4) candidate opportunities to enhance readiness (e.g., an employer panel for elementary candidates).
5.5 / A.5.5 Engagement of stakeholders
Faculty, staff, and administrators across offices and functions contribute to the EPP quality assurance system, along with critical stakeholders, including completers, employers, and P-12 partners. As presented in the Quality Assurance System Handbook (85.1), the QAS depends on many offices and roles to perform the varied functions of an effective QAS. Further, the VCU EPP QAS Data WorkFlow details the stakeholders engaged in review of the inputs and outputs of the QAS to enhance continuous improvement.
Internally, faculty and staff engage at every gate in the QAS, including roles in recruitment, admissions selection, ongoing student success, retention, and monitoring. Further, program faculty support student learning assessments, in conjunction with Cooperating Teachers and University Supervisors for clinical-based assessments. Staff monitor candidate progress and confirm that exit criteria are met prior to graduation and recommendation for licensure.
Staff and faculty continue to monitor completer success post-program completion via effectiveness and impact measures, as well as completer and employer satisfaction. Both completers and their employers engage in the QAS at this point by providing feedback via surveys.
Partnership engagement ensures high-quality clinical experiences characterized by co-selection, preparation, and support of clinical educators; co-construction of mutually beneficial P-12 school and community arrangements for clinical preparation; and shared responsibility for continuous improvement of candidate preparation. As evidence of this stakeholder engagement, the Partnership Engagement Reports (20; 41) bring together various aspects of the components in Standard 2/A.2 by highlighting processes and activities that occur with partner school divisions: meetings with VCU Student Service Center clinical staff and university faculty, cooperating mentor teacher meetings and orientations, teacher recruitment days, orientations and meetings of clinical educators, placement and removal processes, and evaluations of university supervisors and cooperating mentor teachers.
Various internal and external stakeholders, including faculty, staff, clinical faculty, and school division representatives, are invited to provide feedback on EPP clinical experiences to support triangulation of data from multiple perspectives on the experiences and partnerships. First, using an electronic survey form, candidates are invited at the end of each semester to provide feedback on the clinical experience and on the cooperating professional and university supervisor. Second, each semester, cooperating teacher and university supervisor assessments of candidates on the Clinical Evaluation Continuum, along with the Principal Survey of VCU Interns, provide evidence of the degree to which candidates were able to demonstrate expected knowledge, skills, and dispositions in the internship. For example, these data may include feedback to faculty and programs on the availability of technology to a student or the extent of opportunities to partner with families during a field experience. Third, each semester, cooperating teachers and university supervisors are invited to evaluate the counterparts with whom they partnered to supervise a clinical intern. Finally, P-12 clinical partners are invited periodically to provide feedback on aspects of the clinical experiences, assessments, and evaluations to inform potential program enhancements and identify candidate strengths and needs. Together, these data inform subsequent placement sites and training needs for cooperating professionals and university supervisors.
Two committees engage VCU EPP partners in the systematic review of programs, assessments, processes and procedures, and outcomes data for continuous improvement: the Clinical Experience Advisory Board (CEAB) and the Professional Education Coordinating Council (PECC). The purpose and objectives of the CEAB include the following: share knowledge relating to current clinical expectations; assist in identifying professional development needs of field instructors/coaches; maintain quality field education consistent with VCU's standards; provide input and support in developing Cooperating Teacher (CT)/Clinical Faculty (CF) and continuing education materials; co-construct policies and procedures related to clinical experiences; participate in the recruitment of new field placement sites; and examine trends in clinical experiences that address current practice methods and issues for service delivery. The PECC shares information about programs, trends, issues, and projects affecting teacher education across multiple campus departments and colleges (the EPP).
Feedback from stakeholders shapes and informs the ongoing continuous improvement efforts of the VCU EPP, as evidenced in Standards 1/A.1-4/A.4. For example, as discussed in Standard 4, VCU EPP engages P-12 partners through the PECC and CEAB committees in reviewing candidate and completer data, providing feedback on clinical performance and experiences, and contributing to the development of assignments (25, 23). Further, partners keep the EPP apprised of school division needs and updates. Through these partnerships, the EPP seeks to understand the context of partner schools and divisions and to engage partners in co-constructing learning experiences that are mutually beneficial by enhancing VCU EPP candidates' preparation for Title I schools. In addition, completers, as stakeholders, indicated on the completer survey that they felt less prepared to integrate technology into instruction relative to other skill items on the survey. Consequently, the elementary and secondary programs determined that additional course content was needed on this topic, and a new technology course was developed and approved by the Curriculum Committee as a formal requirement of those programs (10). Completer feedback also emphasizes the value of diversity among faculty and candidates/peers. As described in Standard 3, VCU EPP adjusted ongoing strategic recruitment efforts, aligned to the SOE strategic plan, to diversify the licensure applicant pool. The strategic plan also focuses on strategies to increase diversity among faculty and to ensure a culture that promotes retention of diverse candidates and faculty. Going forward, the EPP will implement an evaluation of partnerships to determine the effectiveness of its efforts to support mutually beneficial partnerships.
Finally, partners on the CEAB committee indicated that in some hard-to-staff schools, it is challenging to secure placements with cooperating teachers who have three years of experience in the school (an EPP criterion for cooperating teachers). Therefore, the EPP adjusted the criterion so that a teacher with two years' experience in the school may be identified as a "mentor teacher" to supervise practicum and early-program clinical placements (23, date: 6/4/2019).
Closing Statement
As demonstrated across Standards 1/A.1-4/A.4, the QAS supports systematic data collection, reporting, and monitoring on all CAEP Standards. Leveraging several technology systems, the EPP collects, tracks, aggregates, and reports data on multiple measures in response to faculty/program needs, to monitor candidate progress and the achievements of completers, and to understand EPP effectiveness and opportunities for continuous improvement. Through multiple committee structures that include staff, faculty, leadership, and P-12 partners, the EPP routinely investigates the quality and usefulness of existing measures and uses the information to make needed adjustments to ensure its QAS relies on high-quality evidence. Further, these committees engage in ongoing review of data, improving in-depth review and understanding over time, such that the QAS forms the basis for the continuous improvement function. Through its partnerships and web presence, the EPP is transparent about its outcomes, sharing results with stakeholders and the public.
Summary of Evidence and Supporting Documentation
- 85.1: VCU EPP Quality Assurance System Handbook
- 85.2: Data Quality Map for EPP-Created Surveys and Assessments
- 85.3: VCU EPP QAS Data WorkFlow
- 86.1: Guidance Document Initial Clinical Evaluation Continuum
- 86.2: Guidance Document Initial Dispositions Assessment
- 86.3: Guidance Document Initial Lesson Plan Assessment
- 87: Initial Licensure Reliability and Validity Analysis of Pilot Rubric Data
- 88: CAEP Rubric Writing Team Documentation
- 89: Guidance Documents for Initial Special Education programs
- 90.1: Educational Leadership Program Assessment Guidance Documents
- 90.2: Math Specialist Program Assessment Guidance Documents
- 90.3: Reading Program Assessment Guidance Documents
- 90.4: School Counseling Program Assessment Guidance Documents
- 91.1: Initial and Advanced Program Annual Summaries of Data Use Part I
- 91.2: Initial and Advanced Program Annual Summaries of Data Use Part II
- 91.3: Initial and Advanced Program Annual Summaries of Data Use Part III
- 91.4: Initial and Advanced Program Annual Summaries of Data Use Part IV
- 92: Sample Graduate Program and SOE Data Monitoring Dashboard
- 93: CAEP Annual Reports 2017 to 2020
- 94: VCU SOE Strategic Plan 2019-2022
- 95: Assessment Committee Meeting Minutes 2018-2020
- 96: VCU EPP CAEP Website