
Dissemination of a Novel Undergraduate Research Evaluation Model

An NSF WIDER Project (DUE 13-47681 (SUNY Buffalo State) and 13-47727 (CUR)) | Active between 1/1/2015 - 12/31/2019

For general information about EvaluateUR: https://serc.carleton.edu/evaluateur/index.html

EvaluateUR supports independent summer and academic-year undergraduate research. A one-year subscription to EvaluateUR is $500. Visit the website to learn more about EvaluateUR, subscribe, or join the email list to receive updates.

For specific questions concerning the EvaluateUR method, please contact Dr. Jill Singer (singerjk@buffalostate.edu) directly.

____________________________________________________________________

The main goal of this NSF WIDER project, led by Dr. Jill Singer at SUNY Buffalo State (hereafter Buffalo State), was to scale up and disseminate an evaluation method, known as EvaluateUR, for measuring student learning and related outcomes for students conducting summer and academic-year mentored research. The project was a collaboration among Buffalo State, the Science Education Resource Center (SERC) at Carleton College, and the Council on Undergraduate Research (CUR).

EvaluateUR centers on having both faculty mentors and their student researchers assess student knowledge and skills three times over the course of the student’s research project (at the beginning, midpoint, and end of the research), followed each time by student-mentor conversations to compare and discuss the reasons for their respective assessments. One of the novel features of this approach to evaluation is that it is embedded into the research and mentoring processes, while at the same time generating reliable data that can be used by directors of undergraduate research programs to document their programs’ impacts. More details about the development of the method are provided in Singer and Weiler (2009), Singer and Zimmerman (2012), and Singer et al. (in prep).

Table 1. EvaluateUR Assessment Survey Outcome Categories and Components

Communication:

Uses and understands professional and discipline-specific language.

Expresses ideas orally in an organized, clear, and concise manner.

Writes clearly and concisely using correct grammar, spelling, syntax, and sentence structure.

Creativity:

Displays insight about the topic being investigated.

Shows ability to approach problems from different perspectives.

Uses information in ways that demonstrate intellectual resourcefulness.

Effectively connects multiple ideas/approaches.

Autonomy:

Demonstrates an ability to work independently and identify when guidance is needed.

Accepts constructive criticism and uses feedback effectively.

Uses time well to ensure work gets accomplished.

Sets and meets project deadlines.

Ability to Deal with Obstacles:

Is not discouraged by setbacks or unforeseen events and perseveres when challenges are encountered.

Shows flexibility and a willingness to take risks and try again.

Trouble-shoots problems and searches for ways to do things more effectively.

Intellectual Development:

Recognizes that problems are often more complicated than they first appear.

Approaches problems with an understanding that there can be more than one right explanation or even none at all.

Displays accurate insight into the limits of his/her own knowledge and an appreciation for what isn't known.

Critical Thinking and Problem Solving:

Challenges established thinking when appropriate.

Looks for the root causes of problems and develops or recognizes the most appropriate corrective actions.

Recognizes flaws, assumptions and missing elements in arguments.

Practice and Process of Inquiry:

Demonstrates ability to formulate questions and hypotheses within the discipline.

Demonstrates ability to properly identify and/or generate reliable data.

Shows understanding of how knowledge is generated, validated and communicated within the discipline.

Nature of Disciplinary Knowledge:

Shows understanding of the way practitioners think within the discipline (e.g., as an earth scientist, sociologist, artist . . .) and view the world around them.

Shows understanding of the criteria for determining what is valued as a contribution in the discipline.

Shows awareness of important contributions in the discipline and who was responsible for those contributions.

Reads and applies information obtained from professional journals and other sources.

Is aware of professional societies in the discipline.

Content Knowledge and Methods:

Displays knowledge of key facts and concepts.

Displays a grasp of relevant research methods and is clear about how these methods apply to the research project being undertaken.

Demonstrates an appropriate mastery of skills needed to conduct the project.

Ethical Conduct:

Recognizes that creating, modifying, misrepresenting, or misreporting data (including omitting or eliminating data or findings), or misattributing authorship, is unethical.

Behaves with a high level of collegiality and treats others with respect.

Career Goals:

Is clear about academic and/or professional/work plans.

Is aware of how research skills relate to academic and/or professional/work plans.


In EvaluateUR, students and mentors complete identical assessment surveys that include 11 outcome categories, each defined by several measurable student behaviors, for a total of 35 outcome components (Table 1). The outcome categories shown in Table 1 are also closely aligned with the wide range of essential workplace competencies identified by the Office of Career, Technical, and Adult Education at the U.S. Department of Education and by the National Association of Colleges and Employers (www.cte.ed.gov/employabilityskills and www.naceweb.org/career-readiness/competencies/career-readiness-defined).

The assessments are completed before the student’s research begins, in the middle of the research, and at the end of the research experience. This phased approach gives mentors multiple opportunities to review and assess student work and gives students time to reflect on their strengths and weaknesses. Each EvaluateUR component is scored on a five-point scale ranging from “always” to “never,” indicating the extent to which a student has displayed the outcome component being assessed. The instrument is first provided to each student-mentor pair at an orientation session that precedes the beginning of student research activities, so that both students and mentors can become familiar with the method.

Beginning with a “baseline assessment” before research begins, followed by two additional assessments at the midpoint and end of the research project, students score themselves on each outcome component, and their research mentors score their students using the same instrument. The baseline assessment is done together so that the student and mentor can discuss how each outcome component relates to the student’s research; at this meeting, they also have the option to add project-specific outcomes to the assessment. The mid-research and end-of-research assessments are done independently. Following each assessment, the student and mentor receive a link to a score report that shows how each of them rated every outcome component; components with a score difference of 2 or more are highlighted to call attention to them. The student and mentor then meet to compare their scores and explore the reasons for any differences. EvaluateUR stresses that the scores themselves are less important than these conversations, in which the student and mentor share their rationales for assigning particular scores and discuss any differences in their perceptions. A series of instructional videos and other resources to help adopters implement EvaluateUR are available on the website.
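
The score-comparison step described above is straightforward to illustrate. The sketch below is hypothetical Python, not part of the EvaluateUR software; the component names and scores are invented for the example. It flags any outcome component where the student and mentor scores differ by 2 or more points on the five-point scale, as the score report does:

```python
# Hypothetical illustration of the score-report comparison described above.
# Scores use the five-point scale (1 = "never" ... 5 = "always").
# Component names and values are invented, not actual EvaluateUR data.

def flag_score_gaps(student_scores, mentor_scores, threshold=2):
    """Return components whose student/mentor scores differ by >= threshold."""
    flagged = {}
    for component, s_score in student_scores.items():
        m_score = mentor_scores[component]
        if abs(s_score - m_score) >= threshold:
            flagged[component] = (s_score, m_score)
    return flagged

student = {"Expresses ideas orally": 5, "Sets and meets deadlines": 3}
mentor = {"Expresses ideas orally": 3, "Sets and meets deadlines": 4}

print(flag_score_gaps(student, mentor))
# {'Expresses ideas orally': (5, 3)}
```

A flagged component is simply a prompt for discussion: the student-mentor conversation about why the scores differ is the point of the exercise, not the gap itself.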

The EvaluateUR process is facilitated by undergraduate research program directors, who conduct orientations for students and mentors to explain the EvaluateUR goals and steps. A web-based administrator’s page (a dashboard) shows the status of each student-mentor pair and helps the administrator ensure that assessments are completed at the appropriate time in the research program. Automated reminder messages about completing each step are sent throughout the EvaluateUR process.

Particularly innovative aspects of the EvaluateUR approach include its applicability to all disciplinary areas; its support for students, faculty mentors, and undergraduate research directors; and its phased approach to assessing student knowledge and skill development throughout the course of UGR experiences. EvaluateUR also includes a web-based statistical package known as EZStats that automatically generates composite descriptive measures for students and mentors for each outcome component. The format of the EZStats output makes these measures readily usable for reports. An instructional video and user guide for EZStats can also be found on the EvaluateUR website.
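
To give a sense of the kind of per-component descriptive measures EZStats reports, the sketch below is a hypothetical Python illustration (not the EZStats implementation; the cohort scores are invented). It computes the sample size, mean, and standard deviation of student and mentor scores for one outcome component:

```python
# Hypothetical sketch of per-component descriptive measures of the kind
# EZStats reports; the scores are invented, and this is not EZStats code.
from statistics import mean, stdev

def describe(scores):
    """Sample size, mean, and sample standard deviation for 1-5 scores."""
    return {"n": len(scores),
            "mean": round(mean(scores), 2),
            "sd": round(stdev(scores), 2)}

# End-of-research scores for one outcome component across a small cohort
student_scores = [4, 5, 3, 4, 5]
mentor_scores = [3, 4, 4, 4, 5]

print("students:", describe(student_scores))
print("mentors: ", describe(mentor_scores))
```

Run per outcome component for both raters, summaries like these are exactly the composite measures a program director can drop into an annual report.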

By the end of the five-year WIDER grant, ~50 colleges and universities had implemented EvaluateUR, and hundreds of faculty mentors and UGR directors had received information about the EvaluateUR method through a variety of presentations and webinars at national UGR and STEM meetings. Surveys conducted in 2019 with students, faculty mentors, and undergraduate research directors using EvaluateUR found that ~90% of respondents judged that the EvaluateUR discussions helped students gain a better understanding of their academic and professional strengths and weaknesses. Students showed statistically significant positive gains on all 35 outcome components, and research mentors found it easier to identify the academic strengths and weaknesses of the students they mentored, enabling them to focus their mentoring efforts more productively.

An independent evaluation found that EvaluateUR tested an innovative method for evaluating undergraduate research in a way that could reliably measure specific knowledge and skill outcomes while also contributing directly to student learning. At the conclusion of the WIDER grant, EvaluateUR transitioned to a subscription-based service with general support provided by Buffalo State and technical support provided by SERC.


Impact