"Evaluation is the process of determining the adequacy of instruction and learning" (Seels & Richey, 1994, p. 54).
This domain encompasses a range of evaluative practices, including problem analysis, criterion-referenced measurement, formative evaluation, summative evaluation, and long-range planning. Instructional designers demonstrate their expertise in this area by gathering and interpreting data to inform instructional decisions, improve learning outcomes, and guide strategic planning. Through these practices, candidates show their ability to apply theory and research to assess both learner performance and the broader impact of instructional interventions across programs and systems.
Evaluation is a critical component of instructional design that informs decision-making at every stage of the process. My development in this area reflects a shift from viewing evaluation as a final step to embracing it as a continuous, integrated practice. I have gained experience in analyzing problems, designing assessments aligned with instructional goals, and using data to guide improvements in both instruction and support systems.
Through formative and summative strategies, I’ve learned to gather meaningful feedback from learners and stakeholders, assess the adequacy of instruction, and translate findings into actionable plans. This work has strengthened my ability to align evaluation with organizational needs and long-range goals, ensuring that the instructional experiences I design are both effective and responsive to change.
5.1 Problem Analysis
"Problem analysis involves determining the nature and parameters of the problem by using information-gathering and decision-making strategies" (Seels & Richey, 1994, p. 56).
Project Proposal For Evaluation of General Education Orientation Advising
This project involved developing an evaluation proposal to assess the effectiveness of General Education Orientation Advising for first-year college students. The goal was to determine whether current advising practices were meeting the needs of new students and supporting academic success. The proposal identified key challenges—such as inconsistent advising experiences, unclear learning outcomes, and varied student preparedness—and outlined a data-driven plan to evaluate these concerns through surveys, focus groups, and academic performance analysis.
Artifact
The artifact is a formal proposal that includes the problem statement, rationale for evaluation, data collection methods, and decision-making criteria. It served as the foundation for institutional discussions about improving advising practices through systematic assessment and instructional redesign.
Reflection
This project aligns with AECT Standard 5.1 – Problem Analysis, which is defined as “determining the nature and parameters of the problem by using information-gathering and decision-making strategies” (Seels & Richey, 1994). Through this work, I applied evaluation competencies to identify key gaps in advising and recommend actionable steps toward instructional improvement. I collected preliminary stakeholder input, framed evaluation questions, and designed a responsive plan grounded in both learner needs and institutional goals. This project reflects my growing ability to lead data-informed instructional change by using evaluation as a strategic tool to uncover challenges and guide improvements.
5.2 Criterion-Referenced Measurement
"Criterion-referenced measurement involves techniques for determining learner mastery of pre-specified content" (Seels & Richey, 1994, p. 56).
Academic Advising Preparation eLearning Module
This project involved the design and development of an eLearning module to teach first-year engineering students how to effectively prepare for academic advising sessions. The module included interactive instructional content and a built-in quiz that required students to demonstrate their understanding of the advising preparation process. The assessment was directly aligned to the instructional objective of planning for an advising session, with the goal of supporting students’ readiness and autonomy in the advising environment.
Artifacts
The artifact includes images of the integrated quiz used to assess learner understanding. Quiz items were designed to evaluate each student’s mastery of the specific advising preparation steps introduced in the module.
Reflection
This project exemplifies AECT Standard 5.2 – Criterion-Referenced Measurement, which Seels and Richey (1994) define as the use of techniques for assessing mastery of content. The quiz was constructed to measure student success relative to clearly stated instructional objectives rather than peer performance. This approach ensured clarity in expectations, consistency in assessment, and alignment with the broader instructional goals of the advising curriculum. The project also reflects my ability to implement performance-based assessments within technology-enhanced learning environments and demonstrates my growth in designing meaningful, outcome-aligned evaluation tools.
5.3 Formative and Summative Evaluation
"Formative evaluation involves gathering information on adequacy and using this information as a basis for further development. Summative evaluation involves gathering information on adequacy and using this information to make decisions about utilization" (Seels & Richey, 1994, p. 57).
Academic Advising in General Education at Virginia Tech: Evaluation Survey
This project involved the development and administration of a formative evaluation survey to gather student feedback on academic advising within the General Education program for engineering students at Virginia Tech. The goal was to assess advising effectiveness from the learner's perspective and use this information to guide improvements in instructional resources and advising communication. The survey was accompanied by tailored messaging materials, including flyers, email announcements, and instructional text, to increase participation and clarify the purpose of the evaluation.
Artifacts
The artifacts include the student survey instrument, accompanying outreach materials used to distribute and explain the survey, and documentation of the evaluation goals and strategy. Survey items focused on advising clarity, usefulness, preparedness, and satisfaction with the advising experience.
Reflection
This project exemplifies AECT Standard 5.3 – Formative and Summative Evaluation, which Seels and Richey (1994) define as the process of “gathering information on adequacy and using this information as a basis for further development” (p. 57). By designing and implementing a structured feedback loop, I was able to identify areas in which the advising process could be clarified, adjusted, or expanded. This work reflects my growth in using formative evaluation to guide instructional improvement in real time and in applying communication strategies to ensure thoughtful and ethical engagement with learners during the evaluation process.
Evaluation Report for Assessment of Academic Advising Program in General Engineering at Virginia Tech
This summative evaluation project analyzed the overall effectiveness of the academic advising program offered to first-year General Engineering students at Virginia Tech. Drawing on previously gathered formative data, the evaluation synthesized findings to assess whether the program was meeting its goals and to determine the impact of advising on student understanding and academic planning behaviors. The report was developed to inform decisions regarding program continuation, resource allocation, and areas for instructional improvement.
Artifact
The artifact includes the final written evaluation report, which presents findings from student surveys and stakeholder input, data analysis, and recommendations for enhancing the advising process. The report was intended for distribution among program stakeholders and institutional leadership.
Reflection
This project aligns with AECT Standard 5.3 – Formative and Summative Evaluation. Through this work, I demonstrated my ability to conduct a complete, post-implementation evaluation of an instructional support program. The report combined stakeholder analysis, measurable outcomes, and program recommendations, allowing decision-makers to assess the value and impact of the advising initiative. This artifact reflects my growth in managing large-scale evaluations, synthesizing data into actionable insights, and supporting evidence-based educational leadership.
5.4 Long-Range Planning
“Long-range planning that focuses on the organization as a whole is strategic planning... Long-range is usually defined as a future period of about three to five years or longer. During strategic planning, managers are trying to decide in the present what must be done to ensure organizational success in the future” (Certo et al., 1990, as cited in Seels & Richey, 1994, p. 57).
Overview of Evaluation Results and Suggestions and Implications for Future Planning in GE Advising
This document summarizes the results of an evaluation of academic advising in the General Engineering program at Virginia Tech and presents strategic recommendations for long-term improvement. The report goes beyond short-term fixes by outlining implications for ongoing program development, including recommendations for data systems, training structures, and scalable advising practices that support future cohorts of students. The focus was on institutional sustainability and innovation in advising, rather than isolated program adjustments.
Artifact
The artifact includes the evaluation summary, analysis of advising effectiveness, and a set of forward-looking recommendations to guide planning efforts for the next three to five years. Topics include advising workflow redesign, communication improvements, and integration of digital tools for student planning.
Reflection
This project aligns with AECT Standard 5.4 – Long-Range Planning. Through this report, I contributed to strategic decision-making by identifying future advising needs, proposing scalable solutions, and advocating for systemic improvements to student support infrastructure. This artifact reflects my growth in taking a systems-level approach, thinking beyond immediate instructional needs to promote lasting, equitable, and data-informed educational design at the institutional level.
VCUarts Administrative Planning Related to Advising Evaluation Results
This project involved translating the findings from an evaluation of undergraduate academic advising at VCUarts into actionable administrative planning. The document outlines institutional strategies for improving advising practices across departments, including changes to communication workflows, advising documentation standards, and professional development structures. The planning was informed by student feedback and designed to support long-term systemic improvement over multiple academic years.
Artifacts
The artifact includes a document shared with VCUarts leadership, which synthesizes evaluation results and outlines prioritized recommendations for implementation. Artifacts also include sample results from the advising survey used as part of the evaluation process. The recommendations were developed to align with strategic goals and to guide policy, staffing, and technology decisions within the advising ecosystem.
Reflection
This project aligns strongly with AECT Standard 5.4 – Long-Range Planning, which emphasizes the importance of strategic organizational alignment in instructional technology decisions. By using data to drive structured planning conversations with institutional leaders, I helped connect instructional evaluation to broader organizational outcomes. This artifact demonstrates my ability to act as a bridge between data and decision-making, supporting sustainable, systemic change that enhances the learner experience while addressing institutional goals. It reflects my development as a strategic planner who contributes meaningfully to long-term educational transformation.