SUREbyts: Presenting Early-Year Undergraduate Students with Videos on Research Topics


Embedding research into the undergraduate curriculum has been shown to be a highly impactful pedagogical approach across all disciplinary areas (Walkington 2015). By engaging with structured research opportunities as part of their undergraduate studies, students are encouraged to creatively explore the topics being taught while also developing important disciplinary and transversal skills (Healey and Jenkins 2009). The opportunity for students to engage fully, or partially, with a research project and then present their findings at an undergraduate research conference or publish their findings in a journal has attracted substantial attention in recent decades, as evidenced by the proliferation of dissemination platforms for undergraduate research (Barker and Gibson 2022). These opportunities, however, tend to focus primarily on students at the latter end of their undergraduate studies. Despite this, there is increased attention in the literature on how undergraduate students at the earlier stages of their studies can become involved in, or exposed to, research projects (Shelby 2019; Wolkow et al. 2014). This article describes one project that shares this objective: the SUREbyts project.

The SUREbyts project allows first- and second-year undergraduate students to engage with research through a collection of video recordings in which experienced and early-stage researchers describe a problem, pose a question and possible solutions related to the problem, and then describe their research-informed view on the most appropriate solution. These videos, covering many of the prominent scientific disciplines, are freely available to all lecturers to use in class with their students under a Creative Commons BY-NC-ND 4.0 license. Suggested uses include integrating SUREbyts into a discussion regarding the topic of the video or using SUREbyts as part of a formative or summative assessment. Of the 294 students who responded to a survey about their engagement with SUREbyts, the majority reported that it had increased their interest in research in general, and their understanding of the work undertaken by researchers specifically. There are challenges, however, associated with this approach. Researchers often find it difficult to present their research in an accessible fashion, appropriate for early-stage undergraduate students. Creating an interesting and engaging video requires careful guidance and usually several design iterations. Additionally, lecturers require guidance on how to incorporate these videos meaningfully into their teaching, as misaligned use can result in a negative student learning experience.

The next sections describe the SUREbyts project in detail. The article concludes with a set of recommendations to institutions that are considering implementation of such a project using SUREbyts as a model. Institutions that do so will be well equipped to enhance the awareness of research among their first- and second-year students.


The Science Undergraduate Research Experience (SURE) Network (O’Leary et al. 2021) launched the SUREbyts project in 2021 with the objective of enhancing research awareness at the early stages of undergraduate programs in the sciences in Ireland. Through SUREbyts, experienced researchers and postgraduate research students were invited to record a brief video (a SUREbyt) centered on a question related to their research. The videos were then made available on the website of the SURE Network (SURE Network 2021), from where both students and lecturers could access them. Lecturers were encouraged to use SUREbyts videos in class to help their students learn about the research that was taking place within their discipline.

Video was chosen as the medium for this project for several reasons: it ensured that research taking place throughout the network could be showcased to all students, and it enabled the content to be reviewed and edited in advance of use to ensure that it met the requirements of the project. Of most relevance for this article, the SURE Network has ambitions for the SUREbyts collection of resources and the SUREbyts model to be adopted by institutions beyond Ireland. The collection currently comprises 34 SUREbyts videos that are freely available for use under a Creative Commons BY-NC-ND 4.0 license. To better understand the impact of the SUREbyts model, the project team surveyed lecturers and students who had used the SUREbyts resources. Thirteen lecturers and 294 students replied to the online surveys. The feedback obtained, both positive and negative, shapes the remaining sections and provides guidance to others who wish to contribute to, use, or replicate the SUREbyts model.


A SUREbyt is a 10- to 12-minute video designed to provoke a discussion among students when played in class. In the video, students are informed about the research career and work of a research student or professional researcher at their own or another institution. The students are then presented with a question related to that work and three possible solutions. This can be thought of as the type of question that might be offered to an audience with a request for a show of hands on the most suitable answer. A break in the video then shows a two-minute countdown clock, during which students are encouraged to discuss the possible solutions with their nearest classmates. The second part of the video then presents the researcher’s own view on the best solution. Often, the researcher will explain that they have a preferred solution but that other researchers do not share their view. It is important that students are exposed to this type of discourse so that they appreciate that research does not always result in one true answer, and that it is acceptable for researchers to hold diverse views based on their own findings.

In part 1 of the SUREbyt video, shown in Figure 1, the researcher introduces themself and their research and presents a question and possible solutions. In part 2, also shown in Figure 1, the researcher’s preferred solution is presented and justified. Both parts are fully developed by the researcher, based on strict but accessible guidelines available through the SUREbyts website. The researcher then submits their video to their institutional SUREbyts point of contact, as shown in Figure 2. The point of contact reviews the video and may request edits, or may liaise with the central coordinators of the SUREbyts project, who review videos for quality and adherence to the published guidelines. Once the review is complete, the researcher submits both parts, with a signed consent form, to the project team. The two parts are then edited into the final format shown in Figure 1 by the SUREbyts project team. At this stage, a themed introduction and outro are added to bookend the video, and a two-minute countdown clock is inserted between the two parts. Once finalized, the SUREbyt video is published and categorized by discipline on the SURE Network website, where it is made available at a unique URL. Many videos are multidisciplinary and appear in multiple categories, helping alert students to the importance of research that transcends subject boundaries. The creator of a published SUREbyt video can apply for recognition with a digital badge issued by the SURE Network.


A primary metric of success for the SUREbyts project was the recruitment of 34 researchers from around Ireland, in all the SURE Network’s partner institutions, to create the videos. Of these, 19 were lecturers who were actively involved in research, and 15 were postgraduate research students. The mix of creators at different stages of their career meant that the full SUREbyts collection was representative of the diversity of experience that features in the research landscape. It also provided early-stage researchers and postgraduate students with a means of disseminating their research and enhancing both its engagement and impact, a common requirement of grant-awarding bodies. Equally important was the diversity of disciplinary areas, as shown in Table 1. Thirty-four SUREbyts videos were published, with several in multiple categories.

The project resulted in a collection of cutting-edge research videos addressing accessible, engaging topics and featuring questions that were designed for a novice audience. The most popular of the videos was titled Feeding Martian Colonies. In this SUREbyt video, the creator, a postgraduate student, explained her research background and project, which related to hydroponics. Following a four-minute description of her research, the researcher posed the question, “How are we going to feed Martian colonies?” and offered three solutions: (a) mix Martian soil with “human fertilizer” (urine and feces); (b) send constant resupply missions from Earth; (c) soil-less growth under controlled environment. At this point the video moved to a two-minute break so that viewers could consider the possible solutions, discussing them as appropriate with classmates. In the final part of the video, the researcher explained why the third option was her preferred solution and related this to her own current research. This SUREbyt video attracted approximately one-quarter of all the hits for the whole collection. Other popular videos cover a range of disciplinary areas. Titles include Pond Water, Endocrine-Disrupting Chemicals, Walking, Microbial Growth Strategies, and Tsunami.


The SUREbyts coordinators evaluated the quality of each SUREbyts video against a set of technical requirements, a set of formatting requirements (including video length), and the requirement that the video be engaging for novice science students. Survey responses subsequently helped to refine this understanding of quality.

Survey feedback suggested considerable variation in quality across the videos and dissatisfaction among students when videos did not adhere fully to the guidelines. This is evidenced by one respondent’s comment about a video that was almost 20 minutes in length.

I definitely felt some were of higher quality (the ones I used) than others—so it would be great if they were continually updated to give more choice. Students seemed to enjoy them but the group who watched the microplastics one felt it was too long—I really enjoyed this one in particular and so do not agree but thought this feedback may be useful (one student told me she increased the speed so that it was more watchable!).

Students also commented on the need for “simpler language and avoiding terminology” and that “there should not be much written text on the screen.” Students were frustrated by poor-quality recordings and the need to “improve the mic quality [because] some background noises could be heard and the audio was difficult to comprehend because of this.” There is a balance to be struck between the requirements set out for video creators, which may serve as barriers to their participation, and the requirement for high-quality videos.

Suggestions from survey respondents on how to improve the videos included adding subtitles and including a quiz at the end of each video. The addition of subtitles is easily achieved through software automation and will be implemented for the next iteration of SUREbyts videos. The addition of quizzes was considered, but it was felt that this would alter the purpose of the SUREbyt video, which is intended to center on a single focal question in a classroom situation. Lecturers may instead build quizzes related to the content of the video into the instructional context in which it is used. It is important, however, that the overall burden on the creator of the video is kept to a minimum, as the success of SUREbyts depends on busy researchers and research students being willing to take the time to develop accessible, engaging videos centered on a carefully designed question.

What is clear is that students and lecturers have a very good sense of what constitutes good quality, and this is reflected in the popularity of certain videos. Popularity is driven, in the first instance, by the lecturer who decides on which video to use in their class, and how to use it.


Lecturers in first- and second-year modules in SURE Network partner institutions were encouraged to use the SUREbyts resources as part of the learning design for their classes. As with the video creators, lecturers could apply to the SURE Network for a digital badge once they had incorporated SUREbyts into their classes.

A dedicated online session was arranged for lecturers to explore different ways in which the resources could be used on their courses. Of these approaches, which are described on the SURE Network website, the one that was adopted by the majority of lecturers was “class opener.” With this method, a lecturer commences a class by playing the SUREbyt video from start to finish. When the middle part of the video plays, students are asked to discuss the possible solutions with each other, which they do again after the video completes. The lecturer then relates the subject of the SUREbyt to the topic under discussion in that week’s class. Other approaches such as “class bridge,” in which the playing of the video is divided between sessions, were also adopted by some lecturers. Others innovated and developed their own approach to using the videos, such as this lecturer:

I used the videos in a slightly different way than what was perhaps intended. First, I used the videos at the start of the semester as an ice breaker. This enabled the students to initiate conversations with each other, and it was very effective—the noise from the conversations was very loud!

Based on survey responses, lecturers’ perceptions of the value of the SUREbyts videos were generally positive, but not universally so. Nine of the 13 lecturers surveyed (69 percent) felt that their students’ awareness of research was enhanced through their engagement with SUREbyts. Ten of the lecturers (77 percent) said that they would use the videos again, with seven of that group (54 percent) “very much” likely to do so. These lecturers identified the videos they used as good triggers for discussion, with one lecturer commenting:

The videos were perfectly pitched for first-year students who really engaged and considered the questions posed. The videos were great examples of real-world applications of computing research that were clearly presented at the right level for students.

However, other lecturers felt that the introduction of subject matter relating to postgraduate research was not appropriate for the early stages of first-year undergraduates. One lecturer responded in the survey with the following view:

For the vast majority of first-years in semester 1, which is the only time I teach these groups, they are not ready to start thinking about postgraduate research.

Another lecturer felt that the material presented was more appropriate for more experienced students, commenting that they “felt that second-year students responded better.” The same lecturer struggled to find time in class to use the SUREbyts resources and decided to “provide them with a list of videos and links to use in their own time.” Because the videos were designed to be used in class, ideally with first- and second-year groups, this feedback surfaced both an inconsistency in the target level across different videos and a need to watch for uses that depart from the intended design.

An overriding objective of SUREbyts is to increase awareness of research as an activity, with a secondary objective of raising disciplinary knowledge among students. More than 60 percent of the 294 students surveyed agreed that SUREbyts enhanced their understanding of the work of researchers (73 percent), their interest in research in their area (62 percent), and their interest in carrying out research in the future (65 percent). A smaller proportion, although still a majority (54 percent), felt that they had an increased understanding of the topics they were studying in their program. Some student feedback was glowing:

All of this information that I have gathered from her astounding video has allowed me to ponder the world of horticulture. I never expected to be interested in such topics however, through SUREbyt videos I am sure I will discover many new academic discoveries.

Other students, however, shared the view of some lecturers that the videos are more appropriate for later-stage students:

As an introduction to new students who have no idea about computer science and are new to it, it is confusing, but for ones who have knowledge about the area it is an interesting and further opening to the subject matter of machine learning.

In general, feedback suggests that both lecturers and students recognized the value of the resources in starting in-class discussion, as this lecturer’s comment illustrates:

For the module that I am teaching students need to create a technology solution (high-level prototype design) to address one or more of the SDGs (sustainable development goals), so these examples served as a great point of discussion on how we can design technology to address real-world problems and consider the needs of end users. This is a great resource that I will certainly use in future!

This highlights the importance of the resources being used as part of a facilitated session or class, rather than as a stand-alone web-based resource. The videos are designed to commence, or contribute to, a discussion, for which the role of the lecturer is essential.


The SUREbyts project developed an innovative format for brief videos intended to be used to introduce early-stage undergraduate students to real research projects that are taking place in higher education institutions. The project produced detailed guidelines for video creators and users. The project had a mixed but generally positive response from lecturers and students, as detailed in earlier sections. Based on the experience of running the project, the authors of this article present recommendations in the sections below to other institutions that may wish to adopt some or all aspects of the SUREbyts project.

Video Development

Researchers and research students tend to be time-poor but eager for recognition for their research. Research students should be advised on how the creation of videos for instruction can fulfill the dissemination requirements of their research grants, and help raise their profile. Lecturers and researchers should be made aware of how teaching of undergraduates can have benefits for active researchers (Feller 2018), and of how research and teaching can support each other (Ashrafa 2010). SUREbyts digital badges were made available to the creators and users of videos, although very few badges were applied for in practice.


Digital learning provides a means through which otherwise abstract or unknown concepts can be “illustrated and become tangible” (Kerres and Otto 2022, 701). The illustration of the concept, the question, and the possible solutions are central to the quality of the SUREbyts video. It is important to ensure that the creators of the videos are focused from the start on identifying and presenting a clear, easily understood question that will engage their audience in a meaningful discussion. All other aspects of the SUREbyts video revolve around the question. In the pilot project described here, templates, detailed guidelines, and dedicated local support were provided to help achieve this objective.

Interpersonal Support

The SUREbyts project benefited hugely from the support of the established SURE Network (O’Leary et al. 2021). As a nationwide network with representatives in institutions throughout Ireland, SURE was able to provide local support, encouragement, and guidance to video creators. This support was invaluable for encouraging participation in the project and subsequent usage of the videos.


It is important that as many barriers to participation as possible are lowered or removed. Creators should not have to carry out extensive editing themselves; this should be provided as part of the final production process. Although guidelines are important and should be adhered to as much as possible, some flexibility should be afforded to the makers of the videos to be creative, within reason. Videos that stray too far from what is expected, such as excessively long videos, will not be as attractive to students and lecturers.


During the SUREbyts project, it became evident that videos that did not reach a certain threshold of production quality, accessibility of the question, appropriateness of the language used, and content of the presentation would be ignored by lecturers and students. A large disparity in usage between the popular and unpopular videos showed the value of continually revising the videos with feedback until the appropriate quality is reached. Based on this outcome, the SUREbyts group has revised the guidelines to highlight this to future collaborators and content creators.


Lecturers require guidance on how to use the videos effectively. SUREbyts videos should enable students to experience what Pedaste (2022) describes as the orientation phase of research engagement: a “process of stimulating curiosity about a topic and addressing a learning challenge through stating a problem” (151). For the SUREbyts project, a series of usage scenarios was presented to lecturers to encourage them to use the videos as part of a discussion with their classes. The videos are not intended to be used in the absence of an opportunity for peer discussion. Lecturers can be supported through dedicated training sessions, online resources, and, most valuable of all, case studies of effective use.

Conclusion
Based on feedback received, SUREbyts has proven effective at raising the profile of research among early-year undergraduate students in Ireland. The project team would welcome the adoption by others of the resources, format, or overall approach developed through the project. This article has provided guidance on how to do so. It is hoped that future users will learn from the successes of the SUREbyts project and avoid some of the challenging situations that emerged during the project.

Data Availability Statement

The research instruments used to collect data are available at [link]. The following statements regarding the storage and availability of data were agreed to with the Technological University Dublin Research Ethics and Integrity Committee:

  • Data will be stored securely, and analysis will take place within the project team, possibly with the support of a small number of administrators external to the team.
  • All data collected will be deleted upon completion of the research, no later than one year following the collection of the data.

Ethical Review Board Statement

The Research Ethics and Integrity Committee of Technological University Dublin approved this project (REC-20-183) on October 11, 2021. This approval was noted and approved by the corresponding committee at each institution at which data were collected.

Conflict of Interest Statement

No conflict of interest to declare.

Acknowledgments
The authors would like to acknowledge the support of Ireland’s National Forum for the Enhancement of Teaching and Learning in Higher Education whose network and discipline fund supported the development of SUREbyts. The authors recognize the work undertaken by the creators of the SUREbyts videos to develop a comprehensive, cross-disciplinary resource that has contributed to the teaching, learning, and assessment of undergraduate students across Ireland, and thank all video creators for this work. The authors also acknowledge and thank the lecturers and students who used the SUREbyts videos and gave up their time to contribute to the data collection for this evaluation study. Finally, the authors acknowledge the SURE Network for its support in promoting the SUREbyts project.

References
Ashrafa, Syed Salman. 2010. “Borrowing a Little from Research to Enhance Undergraduate Teaching.” Procedia Social and Behavioral Sciences 2: 5507–5511.

Barker, Emma, and Caroline Gibson. 2022. “Dissemination in Undergraduate Research: Challenges and Opportunities.” In The Cambridge Handbook of Undergraduate Research, ed. Harald A. Mieg et al., 172–182. Cambridge, UK: Cambridge University Press.

Feller, Marla B. 2018. “The Value of Undergraduate Teaching for Research Scientists.” Neuron 99: 1113–1115. doi: 10.1016/j.neuron.2018.09.005

Healey, Mick, and Alan Jenkins. 2009. Developing Undergraduate Research and Inquiry. York, UK: Higher Education Academy.

Kerres, Michael, and Daniel Otto. 2022. “Undergraduate Research in Digital Learning Environments.” In The Cambridge Handbook of Undergraduate Research, ed. Harald A. Mieg et al., 695–708. Cambridge, UK: Cambridge University Press.

O’Leary, Ciarán, Julie Dunne, Barry Ryan, Therese Montgomery, Anne Marie O’Brien, Cormac Quigley, Claire Lennon, et al. 2021. “Reflections on the Formation and Growth of the SURE Network: A National Disciplinary Network to Enhance Undergraduate Research in the Sciences.” Irish Journal of Academic Practice 9(1): article 7. doi: 10.21427/z3xx-dy28

Pedaste, Margus. 2022. “Inquiry Approach and Phases of Learning in Undergraduate Research.” In The Cambridge Handbook of Undergraduate Research, ed. Harald A. Mieg et al., 149–157. Cambridge, UK: Cambridge University Press.

Shelby, Shameka J. 2019. “A Course-Based Undergraduate Research Experience in Biochemistry That Is Suitable for Students with Various Levels of Preparedness.” Biochemistry and Molecular Biology Education 47: 220–227. doi: 10.1002/bmb.21227

SURE Network. 2021. SUREbyts (website). Accessed August 29, 2023.

Walkington, Helen. 2015. Students as Researchers: Supporting Undergraduate Research in the Disciplines in Higher Education. York, UK: Higher Education Academy. https://www.advancehe.

Wolkow, Thomas D., Lisa T. Durrenberger, Michael A. Maynard, Kylie K. Harrall, and Lisa M. Hines. 2014. “A Comprehensive Faculty, Staff, and Student Training Program Enhances Student Perceptions of a Course-Based Research Experience at a Two-Year Institution.” CBE Life Sciences Education 13: 724–737. doi: 10.1187/cbe.14-03-0056

Ciarán O’Leary

Technological University Dublin

Ciarán O’Leary is the head of learning development for the faculty of computing, digital, and data at Technological University Dublin. O’Leary has been a lecturer in computer science at Technological University Dublin since 2000. O’Leary’s research interests relate to the entanglement of digital technology with academic practice. O’Leary was the first chairperson of the Science Undergraduate Research Experience (SURE) Network from its establishment in 2016 to 2021, and was the project lead for the SUREbyts project.

Gordon Cooke is a lecturer in biological sciences and an active researcher. Cooke completed his PhD in 2004 at the Institute of Technology Tallaght before being appointed as a Newman Fellow at University College Dublin to undertake research into Barrett’s metaplasia. Cooke joined Technological University Dublin in 2016, where he established his own research group with interests in antimicrobial resistance and extracellular vesicles. Cooke also is actively involved in educational research about technology-enhanced learning, student retention, and student resilience.

Julie Dunne has a PhD in chemistry, an MA in higher education, and is a fellow of the Royal Society of Chemistry and a member of the Institute of Food Science and Technology, Ireland. After working in the pharmaceutical industry, Dunne joined Technological University Dublin in 2003 and is currently the head of the School of Food Science and Environmental Health. Dunne’s research interests include work-integrated learning, undergraduate research, education for sustainable development, green biocatalysis, and carbohydrate-based antimicrobials.

Barry Ryan is a biochemistry lecturer currently on secondment to lead the development of the educational model for Technological University Dublin. He promotes (co-) creation to empower and centralize all students across all levels within undergraduate curricula. Ryan is passionate about implementing research-informed teaching and supporting others to develop in this area. Ryan is concurrently a senior fellow of the Higher Education Academy, a National Forum Teaching and Learning research fellow, and a chartered science teacher.

Carla Surlis is an early-stage researcher and lecturer in molecular genetics, specializing in the area of small RNA interactions in human disease. Surlis is enthusiastic about using digital technologies to improve engagement in undergraduate teaching and learning.

Matt Smith is a senior lecturer in computing in the faculty of computing, digital, and data at Technological University Dublin. Smith’s research focuses on interactive multimedia and extended reality technologies, and their applications in computer-supported learning. He leads the Digital Realities, Interaction and Virtual Environments research group.

Emma Caraher is a lecturer in biopharmaceutical sciences at the School of Chemical and BioPharmaceutical Sciences at Technological University Dublin. Caraher completed her PhD in 1998 at University College Dublin. Following this Caraher worked as a postdoctoral researcher at Ottawa Hospital Research Institute and Health Canada. She joined Technological University Dublin in 2003 and in 2008 secured a Science Foundation Ireland–funded Stokes lectureship. Caraher is program chair of applied biology, bioanalysis, and bioanalytical science.

Claire Lennon lectures on organic chemistry and spectroscopic characterization at the undergraduate and postgraduate levels. Lennon places a strong focus on embedding research in her teaching and across the undergraduate curriculum. Lennon has research interests in stereoselective organic synthesis aiming to develop novel green and sustainable methods, supervising PhD students in these areas. Lennon has been a member of the SURE Network since its inception in 2017.

Evelyn Landers is a lecturer in inorganic chemistry and analytical science. Landers coordinates the first year of seven programs across the departments of science and land sciences and is program leader for the common entry science program. Landers is the recipient of the Teaching Hero Award from the National Forum for the Enhancement of Teaching and Learning in Higher Education and the Union of Students in Ireland as well as a Higher Education Innovation award.

Eileen O’Leary holds a PhD in organic chemistry, a master’s degree in teaching and learning and a certificate in coaching and leadership. O’Leary is a member of the SURE Network and its Digital Badge Committee. O’Leary is seconded to the teaching and learning unit at Munster Technological University. She is leading the program Enabling Academic Transitions through Professional Development, aimed at encouraging new staff to take a reflective and student-centered approach to practice by incorporating active learning.

Geraldine Dowling’s research interests are in the fields of forensic science, chemistry education (universal design for learning, community-based learning, and problem-based learning pedagogies), analytical science, metabolomics, and nutrition science. Dowling held posts in various Irish government ISO17025-accredited laboratories for 12 years prior to entering academia. Dowling has trained staff and students in the revenue, customs, and toxicology fields as a forensic practitioner. She also undertakes consultancy and supervises postgraduate students.

Margaret McCallig is a lecturer in occupational safety and health with over 10 years of industry experience in the construction, engineering, medical device, and food manufacturing industries. McCallig holds a BSc in health and safety systems and an MSc by research in occupational hygiene from the University of Galway. McCallig is currently pursuing a PhD in the area of physical stressors in neonatal intensive care units in Ireland.

Anne Marie O’Brien is a lecturer at the Technological University of the Shannon (TUS) and has been in academia since 2006. O’Brien has an MSc and PhD in toxicology and biochemistry and also holds a postgraduate diploma in learning and teaching. O’Brien chairs the European team-based learning (TBL) collaborative, the SURE Network TBL and Digital Badge Committee, and also is the chair of the TUS Digital Badge Committee.

Valerie McCarthy is a lecturer and program director for the BSc environmental bioscience program at Dundalk Institute of Technology (DkIT). McCarthy is director of the Centre for Freshwater and Environmental Studies at DkIT. McCarthy’s research interests include theoretical community and ecosystem ecology in freshwater systems, investigating the linkages between aquatic systems and their catchments. Her current projects focus on the use of high-frequency and remote-sensing technologies to monitor surface water.

Josephine Treacy is a lecturer at Technological University of the Shannon. Treacy’s qualifications include a graduate diploma in environmental chemistry, MSc in analytical chemistry, PhD in environmental analytical chemistry, diploma in field ecology from University College Cork (UCC), and MEd from Mary Immaculate College, Limerick. Treacy’s previous employment includes postdoctorate research at UCC and being an executive environmental technician with Cork County Council. Her research interests include analytical science, method development, education, and academic writing.

Venturing into Qualitative Research: A Practical Guide to Getting Started

We both started our scholarly journeys as biologists. As we trained, we both grew interested in researching undergraduate education and we transitioned to doing education research. We quickly came to realize that our training in experimental approaches and quantitative methods was woefully insufficient to study the diversity of ways students think, believe, value, feel, behave, and change in a variety of learning environments and educational systems.

For instance, there are established ways to quantify some educational variables, but not others. In addition, there may be phenomena at play that we haven’t thought of or that might be counterintuitive, which could lead us to quantify things that end up being irrelevant or meaningless. Herein lies the power of qualitative research. Qualitative research generates new knowledge by enabling rich, multifaceted descriptions of phenomena of interest, known as constructs (i.e., latent, unobservable variables), and producing possible explanations of how phenomena are occurring (i.e., mechanisms or relationships between constructs in different contexts and situations with different individuals and groups).

In this essay, we aim to offer an approachable explanation of qualitative research, including the types of questions that qualitative research is suited to address, the characteristics of robust qualitative research, and guidance on how to get started. We use examples from our own and others’ research to illustrate our explanations, and we cite references where readers can learn more. We expect Scholarship and Practice of Undergraduate Research (SPUR) readers from disciplines with a tradition of qualitative research might question why we would write this piece and what makes us qualified to do so. There are many scholars with much more qualitative research expertise than we have. Yet, we think we can offer a unique perspective to SPUR readers who are new to qualitative research or coming from disciplines where qualitative research is unfamiliar or undervalued. We have both designed, conducted, and published qualitative research in the context of undergraduate education and research experiences. We draw upon this experience in the recommendations we offer here.

Doing qualitative research involves acknowledging your “positionality,” or how your own background, lived experiences, and philosophical understandings of research influence how you approach and interpret the work (e.g., Hampton, Reeping, and Ozkan 2021; Holmes and Darwin 2020). Our positionalities have influenced our approach to this article and qualitative research generally. I (MAP) first learned about qualitative research from my undergraduate academic adviser. She invited me to help her implement and evaluate a capstone course in which groups of microbiology undergraduates engaged in a semester-long research project to address problems faced by community organizations (Watson, Willford, and Pfeifer 2018). At the time, I wasn’t aware of the long-standing history of qualitative research or its different forms and approaches. I just knew that reading quote data helped me understand human experiences in a way that survey numbers did not. Since my introduction to qualitative research, I’ve been fortunate to receive formal training. I consider my most valuable lessons about qualitative research to be through the practical experience of doing qualitative research and being mentored by qualitative researchers.

When I (ELD) first learned about qualitative research, I thought it meant words – perhaps collected through surveys, focus groups, interviews, or class recordings. I thought qualitative research would be easy – it was just words after all, and I had been using words almost my whole life. I assumed if I collected some words and summarized what I thought they meant (think word cloud), I would be doing qualitative research. As we will elaborate here, this is a limited view of what qualitative research is and what qualitative research can accomplish. When I began presenting qualitative research, I found it helpful to draw analogies to qualitative studies in natural science and medical disciplines. For instance, in the field of biology, the invention of technologies (e.g., lenses, microscopes) allowed for detailed observation and rich descriptions of cells (i.e., qualitative research) that led to the development of cell theory, the establishment of the field of cell biology, and quantitative research on cell structure, function, and dysfunction. In my own field of neuroscience, Henry Molaison, known as HM, was the focus of a qualitative case study because he lost the ability to form new long-term memories due to a surgical treatment for severe epilepsy. Rich (i.e., comprehensive and detailed) description of Molaison’s memory impairment was the basis for hippocampal function being proposed as the main mechanism through which memories are formed. These examples of “non-numbery” research that produce influential descriptions and testable mechanisms helped me recognize the potential value and impact of qualitative research.

Types of Qualitative Research Questions

Qualitative research is useful for addressing two main types of questions: descriptive and mechanistic. Descriptive questions ask what is happening, for whom, and in what circumstances. Mechanistic questions ask how a phenomenon of interest is happening. Here we explain each type of question and highlight some example studies conducted in the context of undergraduate research.

Descriptive Questions

Descriptive research seeks to elucidate details that enhance our overall understanding of a particular phenomenon—it answers questions about what a phenomenon is, including its defining features (i.e., dimensions) and what makes it distinct from other phenomena (Loeb et al. 2017). Descriptive research can also reveal who experiences the phenomenon, as well as when and where a phenomenon occurs (Loeb et al. 2017). Details like these serve as a starting point for future research, policy development, and enhanced practice. For instance, Hunter, Laursen, and Seymour (2007) carried out a qualitative study that identified and described the benefits of undergraduate research from the perspectives of both students and faculty. This work prompted calls for expansion of undergraduate research nationally and led to numerous quantitative studies (Gentile, Brenner, and Stephens 2017). Among these were quantitative studies from our group on the influences of research mentors on undergraduate researchers (Aikens et al. 2016, 2017; Joshi, Aikens, and Dolan 2019). Although these studies were framed to identify beneficial outcomes, we observed that undergraduates who had less favorable experiences with mentors were opting not to participate in our studies. Given this observation and the dearth of research on negative experiences in undergraduate research, we carried out a descriptive qualitative study of the dimensions (i.e., the what) of negative mentoring—that is, problematic or ineffective mentoring—in undergraduate life science research (Limeri et al. 2019). This study revealed that negative mentoring in undergraduate research included the absence of support from mentors and actively harmful mentor behaviors. These results served as the basis for practical guidance on how to curtail negative mentoring and its effects and for ongoing quantitative research. We use this study as the basis for the extended examples highlighted in Table 1.

Descriptive research is also suited to investigating the experiences of groups that are marginalized or minoritized in higher education. These studies offer insights into student experiences that may be otherwise overlooked or masked in larger quantitative studies (Vaccaro et al. 2015). For example, descriptive qualitative research shed light on how Black women in undergraduate and graduate STEM programs recognized and responded to structural racism, sexism, and race-gender bias. This research identified how high-achieving Black STEM students experienced racial battle fatigue and offered program-level suggestions for how to better support Black students (McGee and Bentley 2017). Descriptive qualitative research of deaf students involved in undergraduate research revealed that research mentors’ lack of awareness of Deaf culture, as well as communication barriers, hindered students’ research experiences (Majocha et al. 2018). This research led to recommendations for research programs, research mentors, and students themselves. Another descriptive qualitative study showed how Latine students’ science identity changed over time while they were involved in an undergraduate research program (Vasquez-Salgado et al. 2023). Specifically, Vasquez-Salgado and colleagues identified patterns in students’ science identity through three waves of data collection spanning 18 months. Students’ identities showed consistent or fast achievement of feeling like a scientist, gradual achievement of feeling like a scientist, achievement adjustment of feeling like a scientist at one point and less so later in the program, or never feeling like a scientist. Together, these and other studies have generated knowledge that raises questions for future research and informs our collective efforts to make undergraduate research more accessible and inclusive.

Mechanistic Questions

Mechanistic qualitative research aims to address questions of how or why a phenomenon occurs. In the context of undergraduate research, an investigator may seek to understand how or why a particular practice or program design affects students. Recently, we conducted a mechanistic qualitative study that aimed, in part, to understand how early career researchers (undergraduate, postbaccalaureate, and graduate students) conceptualized their science identity (Pfeifer et al. 2023). Previous research theorized that someone is more likely to identify as a scientist if they are interested in science, believe they are competent in and can perform science, and feel recognized by others for their scientific aptitude or accomplishments (Carlone and Johnson 2007; Hazari et al. 2010; Potvin and Hazari 2013). However, this theory is somewhat limited in that it does not fully explain how context affects science identity or how science identity evolves, especially as researchers advance in their scientific training (Hazari et al. 2020; Kim and Sinatra 2018). To address this, we integrated science identity theory with research on professional identity development to design our study (Pratt, Rockmann, and Kaufmann 2006). We analyzed data from two national samples, including open-ended survey responses from 548 undergraduates engaged in research training and interview data from 30 early career researchers in the natural sciences. We found that they conceptualized science identity as a continuum that encompassed being a science student, being a science researcher, and being a career researcher. How students saw their science identity depended on how they viewed the purpose of their daily research, the level of intellectual responsibility they had for their research, and the extent of their autonomy in their research. By asking this mechanistic question about science identity, we sought to add to and refine existing theory. We consider these findings to be hypotheses that can be tested quantitatively to better understand science identity dynamics in research training contexts.

Key Attributes of Qualitative Research

For any type of research to be meaningful, it must possess some degree of rigor—what qualitative researchers call trustworthiness (Morse et al. 2002; Yilmaz 2013). Qualitative research is more trustworthy if it is characterized by credibility, transferability, dependability, and confirmability (Creswell and Poth 2016; Lincoln and Guba 1985). For instance, like accuracy and precision in quantitative research, do qualitative findings reflect what is being studied and are the interpretations true to the data (credibility)? Similar to reproducibility in quantitative research, how can qualitative research findings be applied to similar contexts (transferability)? Like validity in quantitative research, to what degree are the framing, methods, and findings of qualitative research appropriate given the aims (dependability)? Similar to the idea of replicability in quantitative research, if the same analytic tools were applied to the same data set could similar findings be reached by someone outside the original research team (confirmability)? The exact dimensions of trustworthiness, how trustworthiness manifests in the research process, the best ways to achieve trustworthiness, and how to talk about trustworthiness in research products are the subject of ongoing and often-spirited debate (e.g., Gioia et al. 2022; Mays and Pope 2020; Morse et al. 2002; Ritchie et al. 2013; Tracy 2010; Welch 2018; Yadav 2022). Central to these dialogues is the fact that qualitative research is composed of different philosophical approaches that emerged and evolved from diverse social science fields (Creswell and Poth 2016; Ritchie et al. 2013). Identifying universally agreed-upon criteria and the means to achieve these criteria is complex.

In our own work, we have found Tracy’s (2010) eight criteria for excellent qualitative research particularly useful. These criteria have helped us design studies, make decisions during the course of research, and articulate in our papers how our research seeks to achieve trustworthiness (e.g., Pfeifer, Cordero, and Stanton 2023). The full list of criteria is: worthy topic, rich rigor, sincerity, credibility, resonance, significant contribution, ethical conduct, and meaningful coherence (Tracy 2010). These criteria borrow from and build on the concepts of credibility, transferability, dependability, and confirmability presented above, and they are described in a way that fits our approach to research. Here we highlight two criteria that may be particularly relevant if you are new to qualitative research.

Worthy Topics

As scholars familiar with undergraduate research and scholarly inquiry, SPUR readers are well-positioned to design studies that address research questions that are significant and timely in the context of undergraduate research. The first step in doing qualitative research (or any research) is to figure out what you want to study. You’ll want to select a topic that you find interesting, relevant, or otherwise compelling so you are motivated to spend time and effort investigating it. One way to find a topic is to notice what is happening in your environment and your work. What are you observing about undergraduate research? Something about students who participate (or not)? Something about colleagues who work with undergraduate researchers (or not)? Something about the design, implementation, or outcomes of the research experience? Something about the programmatic or institutional context? For a topic to be worthy of research, it should be interesting to you and to others. Consider sharing your observations with a few critical friends (i.e., trusted colleagues who will give you honest feedback) about whether they find your observations interesting or worth your time and energy to explore.

Ethical Conduct

Like other human research, qualitative studies must adhere to basic ethical principles of respect for persons, beneficence, and justice (National Commission for the Protection of Human Subjects 1978). Respect for persons means treating all people as autonomous and protecting individuals with diminished autonomy (e.g., students whom we teach and assess). Beneficence involves treating people in an ethical manner, including respecting their decisions, protecting them from harm, and securing their well-being. Justice refers to the balance between benefiting from research and bearing its burdens; in other words, people should be able to benefit from research and should not be expected to bear the burden of research if they cannot benefit. Although it is beyond the scope of this essay to provide guidance on how to adhere to these principles, it is important to recognize that qualitative methods like interviewing can be highly personal and sometimes powerful experiences for both participants and researchers. Investigators should carefully consider how their participants may be affected by data collection. For example, you may interview or survey participants about a personally difficult or painful experience. Do you then bear responsibility for helping them find support to navigate these difficulties? What if a participant reveals to you a serious mental health issue or physical safety concern? These situations occurred during our negative mentoring studies. We provided information to participants about where they could seek counseling or support for specific issues that can occur with mentors, such as harassment and discrimination.

Certainly not all qualitative data collection brings up these issues, but it can and does happen more frequently than you might expect. Your institutional review board (IRB), collaborators, and critical friends can be helpful resources when planning for and navigating tough scenarios like this. If working with an IRB is new to you, we recommend finding colleagues at your institution who have conducted IRB-reviewed research and asking them for guidance and examples. Some IRBs offer training for individuals new to developing human research protocols, and there are likely to be templates for everything from recruitment letters to consent forms to study information. We have found the process of developing IRB protocols helps refine research questions and study plans. Furthermore, IRB review is needed before you collect data that will be used for your study; IRBs rarely if ever allow for retrospective review and approval. In our experience, these studies are likely to be determined as exempt from IRB review because they involve minimal risk and use standard educational research procedures. However, the IRB is still responsible for making this determination and is a valuable partner for helping investigators navigate sensitive or complex situations that occur in human research.

Getting Started with Qualitative Research

Now that you have a sense of the purposes of qualitative research and what features help to ensure its quality, you are probably wondering how to do it. We want to emphasize that there are entire programs of study, whole courses, and lengthy texts that aim to teach qualitative research. We cannot come close to describing what can be learned from these more substantial resources. With this in mind, we share our own process of carrying out qualitative research as an example that others might find helpful to follow. We outline this “how to” as a series of steps, but qualitative research (like all research) is iterative and dynamic (University of California Museum of Paleontology 2022). Feel free to read the steps in a linear fashion, but expect to move through them in non-linear ways in practice. Extended discussion of each of these steps with examples from our research on negative mentoring is provided in Table 1 along with an abridged list of our go-to references.

Observe, Search, and Read

For a topic to be worthy of qualitative research (or any research), it should also have the potential to address a knowledge gap. After we identify a “worthy topic,” we try to find as much information about that topic as possible (Dolan 2013). We read, then we keep reading, and then we read some more. This may seem obvious, but we find that investing time reading literature can save us a lot of time designing, conducting, and writing up a study on a phenomenon that is already well known or understood by others and just not (yet) by us. To help us in our searching, we will sometimes reach out to colleagues in related fields to describe the phenomenon we are interested in studying and see if they have terms that they use to describe the phenomenon or theories they think are related. Theory informs our research questions, study designs, analytic approaches, and interpretation and reporting of findings, and enables alignment among all of these elements of research (e.g., Grant and Osanloo 2014; Luft et al. 2022; Spangler and Williams 2019). Theory also serves as a touchstone for connecting our findings to larger bodies of knowledge and communicating these connections in a way that promotes collective understanding of whatever we are investigating.

Formulate a Question

Once you have selected a topic and identified a knowledge gap, consider research questions that, if answered, would address the knowledge gap. Recall that qualitative research is suited to questions that require a descriptive (what) or mechanistic (how) answer.

Decide on a Study Design

Just like quantitative research, qualitative research has characteristic approaches, designs, and methodologies, each of which has affordances and constraints (Creswell and Poth 2016; Merriam 2014; Miles, Huberman, and Saldana 2014). Creswell and Poth provide a valuable resource for learning more about different types of qualitative research study designs, including which designs are suited to address which kinds of research questions. Given the labor intensiveness of qualitative data collection and analysis, it is critical to think carefully about how to recruit and select study participants. What this looks like and who might be appropriate study participants will depend on many factors, including the knowledge gap, research question, study design, and methods. Questions that can be helpful to ask are: Who do I need to study to answer my research question? What should the study participants have in common? In what ways should study participants vary to provide rich, complex, and varied insight into what I am studying? To whom do I want to generalize my findings, keeping in mind the qualitative nature of the work?

Based on the answers to these questions, you may opt for purposeful sampling in which you collect data only from participants who meet the characteristics you decide upon given the aims of your study. In this case, you will likely send a screening survey to potential participants to determine what their characteristics of interest are, which will help you decide if you will invite them for further data collection or not. A purposeful sample contrasts with a convenience sample where essentially any person who agrees to participate in the study will be selected for further data collection.
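To make the contrast concrete, the short sketch below shows, in Python, how a purposeful sample might be drawn from screening-survey responses. The respondent data, field names, and selection criteria here are all invented for illustration; they do not come from any actual study.

```python
# Hypothetical screening-survey responses (fields are invented for illustration).
respondents = [
    {"id": "P1", "year": 2, "has_research_experience": True},
    {"id": "P2", "year": 4, "has_research_experience": False},
    {"id": "P3", "year": 3, "has_research_experience": True},
]

# Purposeful sampling: invite only respondents who meet the study's criteria
# (here, third-year or later students with research experience).
selected = [r["id"] for r in respondents
            if r["year"] >= 3 and r["has_research_experience"]]
print(selected)  # ['P3']
```

A convenience sample, by contrast, would simply take every respondent who agreed to participate, with no filtering step.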

Collect and Analyze Data Systematically

Qualitative data can be collected in a variety of ways, including surveys, interviews, and focus groups, as well as audio and video recordings of learning experiences such as class sessions. To decide which method(s) to use for data collection, it is helpful to consider what you aim to learn from study participants. Surveys tend to be easier to distribute to a larger sample, but may elicit shorter or shallower responses, which are challenging to interpret because there is less information (i.e., words) and no opportunity to clarify with participants. Focus groups can be effective for quickly gathering input from a group of participants. However, social dynamics may result in one or a few people dominating the discussion, or “group think,” when people agree with one another rather than providing their own unique perspectives. Interviews with individuals can be a rich and varied data source because each participant has time and space to offer their own distinct perspective. Interviews also allow for follow-up questions that are difficult through survey methods. Yet, conducting interviews skillfully—avoiding leading questions and ensuring that the line of questioning yields the desired data—takes a lot of thought and practice. Kvale (1996) offers detailed guidance on how to design and carry out research interviews. Observing an expert interviewer and having them observe and give feedback as you interview can help improve your skills. Audio and video recordings of learning experiences like class sessions or group work can provide a plethora of information (e.g., verbal and nonverbal exchanges among students or between students and instructors) in a more natural setting than surveys or interviews. Yet deciding what information will serve as data to answer your research question, or how that large body of data will be systematically analyzed, can be cumbersome.

Regardless of the data collection method, you’ll need to decide how much data to collect. There is no one right sample size. A good rule of thumb is collecting data until you reach “saturation,” which is the notion that the same ideas are coming up repeatedly and that no new ideas are emerging during data collection. This means that your data collection and analysis are likely to overlap in time, with some data collection then some analysis and then more data collection.

Analytic methods in qualitative research vary widely in their interpretive complexity. As natural scientists, we favor sticking close to the data and analyzing it using a method called qualitative content analysis. Content analysis involves taking quotes or segments of text and capturing their meaning with short words or phrases called codes. The process of developing codes and systematically applying them to a dataset is called coding. Coding is highly iterative and time-consuming because it typically requires multiple, careful passes through the dataset to ensure all codes have been consistently applied to all data. In a recent study, we spent 10 to 15 person-hours to code a single interview, and about 400 person-hours to complete coding for a 30-participant study. The time involved in coding depends on what is being studied, the type of coding, and who is coding the data. Saldaña (2016) provides excellent guidance on the coding process, including various ways of making sense of codes by grouping them into themes. Content analysis is just one approach to qualitative data analysis. We encourage you to learn more about different forms of qualitative approaches and choose what works best for you, including your skill level, research goals, and data (e.g., Creswell and Poth 2016; Starks and Brown Trinidad 2007).
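As a greatly simplified illustration of moving from codes to themes, the sketch below applies short codes to interview excerpts and then groups the codes under broader themes. The excerpts, codes, and theme names are entirely hypothetical; they are not data or findings from our studies, and real coding involves far more iteration and judgment than any script can capture.

```python
from collections import defaultdict

# Hypothetical excerpts, each tagged with one or more short codes
# during a (greatly simplified) content analysis pass.
coded_excerpts = [
    ("My mentor never explained what the project was for.",
     ["absence of guidance"]),
    ("She belittled my questions in front of the lab.",
     ["harmful behavior"]),
    ("I was left alone with the instrument for weeks.",
     ["absence of guidance", "lack of training"]),
]

# Invented mapping from codes to broader themes.
theme_of = {
    "absence of guidance": "absence of support",
    "lack of training": "absence of support",
    "harmful behavior": "actively harmful behavior",
}

# Group the coded excerpts by theme.
themes = defaultdict(list)
for quote, codes in coded_excerpts:
    for code in codes:
        themes[theme_of[code]].append(quote)

for theme, quotes in themes.items():
    print(f"{theme}: {len(quotes)} coded excerpt(s)")
```

Even this toy example shows why coding is iterative: deciding that "lack of training" belongs under "absence of support" rather than its own theme is an interpretive choice that may send you back through the whole dataset.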

Interpret and Write Results

There are many ways to effectively write up results, often called findings, from qualitative research. Because qualitative research involves extensive interpretation, it can sometimes be easier to integrate the results and discussion of a qualitative paper. Integration allows the interpretation (discussion) to be directly supported by the evidence in the form of quotations (results). The conclusions of the paper should avoid repeating the results and instead comment on the implications and applications of the findings: why they matter and what to do as a result. Because qualitative data are quotations rather than numbers, qualitative papers tend to be longer than papers presenting quantitative studies. That said, qualitative papers should still aim to be succinct. For instance, depending on the approach and methods, quotations can be lightly edited to remove extra words or filler language (e.g., um, uh) that is a natural part of language but otherwise irrelevant to the findings. Presenting only the most pertinent part of a quotation not only facilitates succinctness but also helps readers attend to the specific evidence that supports the claims being made. Another strategy to shorten qualitative papers is to present some findings in supplemental materials.

Final Recommendations

In closing our article, we offer some advice that we wish we had known when we began conducting qualitative research. We hope that these recommendations will help you think through issues that are likely to emerge as you delve deeper into qualitative analysis, both as a producer and a consumer of qualitative research.

Consensus Coding in Qualitative Analysis

In qualitative analysis, we work to ensure that the analysis yields trustworthy findings by coding to consensus, meaning that the analytic team reaches 100 percent agreement on the application of each code to the data. Any disagreement between coders is discussed until a resolution is reached. In some cases, these discussions may result in a code description being redefined. Redefinition of a code requires that all data previously coded using the original code be reanalyzed to ensure fit with the revised definition. As you might imagine, coding to consensus can be time-consuming. Yet, in our experience, the time invested in coding to consensus is well spent because the analysis yields deeper insights about the data and phenomenon being investigated. We also see coding to consensus as a great way to take advantage of the diverse viewpoints that team members bring to our research. By coding to consensus, we consider multiple interpretations of the data throughout the analysis process. We are well-positioned to develop theory (as appropriate for our study design) as a team because we all have engaged in meaningful conversations about our findings throughout analysis.

Some qualitative research relies on a calculated measure of intercoder reliability (ICR) instead of coding to consensus. ICR values indicate how often a set of coders agree on the application of a code in the dataset. This quantification of coding is tempting because we love numbers, yet it can also be problematic (O’Connor and Joffe 2020). For instance, aiming for high ICR can create situations when coders are pressured to agree with each other rather than bringing their own unique perspective to the coding process (e.g., Belur et al. 2018; Morse 1997). Quantifying qualitative work also can imply a false precision in the analysis. In some research, ICR is calculated partway through the analysis to determine whether an “acceptable” level of agreement has been reached, at which point the remainder of the data are coded by just one researcher. This approach of using ICR as a cut-off runs counter to what many argue is the value of qualitative research: generating new theoretical understandings informed by multiple perspectives.
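To illustrate how an ICR measure differs from raw percent agreement, the sketch below computes Cohen's kappa, one commonly used ICR statistic, for two hypothetical coders applying a single code (1 = applied, 0 = not applied) to ten excerpts. The 0/1 labels are invented for illustration. Because kappa discounts the agreement expected by chance alone, it comes out noticeably lower than raw agreement on the same data.

```python
# Hypothetical code applications (1 = code applied, 0 = not) by two coders.
coder_a = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

n = len(coder_a)

# Raw percent agreement: fraction of excerpts where the coders match.
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Agreement expected by chance, from each coder's marginal rate of
# applying the code.
p_a1, p_b1 = sum(coder_a) / n, sum(coder_b) / n
expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)

# Cohen's kappa: agreement beyond chance, scaled by the maximum possible
# agreement beyond chance.
kappa = (observed - expected) / (1 - expected)
print(f"observed agreement = {observed:.2f}, kappa = {kappa:.2f}")
```

Here the coders agree on 80 percent of excerpts, yet kappa is only about 0.58, which is one reason a single "acceptable" agreement number can convey false precision about how aligned two coders' interpretations really are.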

Using Numbers in Qualitative Analysis

Although numbers certainly have a place in qualitative analysis (Sandelowski 2001), we encourage researchers to move beyond word clouds or frequency counts of codes and themes in their results for two reasons. First, a code or theme that appears infrequently in the data set can still be important to the phenomenon being studied. As an analogy, consider making qualitative observations of living cells under a typical light microscope. We would most often see relatively stationary cells, punctuated by the comparatively rare cell division, or mitosis. If we reported only the stationary observations in our findings, we would overlook mitosis, one of the most dynamic and fundamental processes that cells display. Second, given limited sample sizes, a unique and important code or theme may be reported by only one participant in the data set. In fact, rare observations can serve as “a-ha moments” that lead to a more comprehensive understanding of the phenomenon under investigation. These rare observations may also inspire new studies about topics that were not initially anticipated; this speaks to the value of qualitative research.

Closing Thoughts

We encourage readers to continue to learn about qualitative research, as there is much that could not be addressed in a single article. For instance, we did not introduce how philosophical stances, such as how someone views the nature of truth or what counts as evidence, influence the research process (Creswell and Poth 2016). For now, we will close with one final piece of advice. We both became better qualitative researchers by working with mentors and collaborators who have this expertise. We encourage you to find colleagues in your networks or at your institutions who may be interested in being a collaborator, mentor, or critical friend. The complexity of students and their experiences lends itself to qualitative approaches. We hope this article might serve as an impetus for you to learn more about qualitative research and even start your own investigations.

Data Availability Statement

The data included in this commentary have been published in an open-access journal under a Creative Commons license. Citations are included in the text.

Institutional Review Board Statement

Not applicable.

Conflict of Interest Statement

The authors have no conflicts of interest to report.


Acknowledgments

This material is based upon work supported by the National Science Foundation under award number OCE-2019589. This is the National Science Foundation’s Center for Chemical Currencies of a Microbial Planet (C-CoMP) publication #026. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. We thank Patricia Mabrouk for inviting us to contribute this commentary. We thank members of the Biology Education Research Group at the University of Georgia and Daniel Dries, Joseph Provost, and Verónica Segarra for their thoughtful feedback on manuscript drafts.


References

Agee, Jane. 2009. “Developing Qualitative Research Questions: A Reflective Process.” International Journal of Qualitative Studies in Education 22: 431–447. doi: 10.1080/09518390902736512

Aikens, Melissa L., Melissa M. Robertson, Sona Sadselia, Keiana Watkins, Mara Evans, Christopher R. Runyon, Lillian T. Eby, and Erin L. Dolan. 2017. “Race and Gender Differences in Undergraduate Research Mentoring Structures and Research Outcomes.” CBE—Life Sciences Education 16(2): ar34. doi: 10.1187/cbe.16-07-0211

Aikens, Melissa L., Sona Sadselia, Keiana Watkins, Mara Evans, Lillian T. Eby, and Erin L. Dolan. 2016. “A Social Capital Perspective on the Mentoring of Undergraduate Life Science Researchers: An Empirical Study of Undergraduate–Postgraduate–Faculty Triads.” CBE—Life Sciences Education 15(2): ar16. doi: 10.1187/cbe.15-10-0208

Anfara, Vincent A., Kathleen M. Brown, and Terri L. Mangione. 2002. “Qualitative Analysis on Stage: Making the Research Process More Public.” Educational Researcher 31(7): 28–38. doi: 10.3102/0013189X031007028

Belur, Jyoti, Lisa Tompson, Amy Thornton, and Miranda Simon. 2018. “Interrater Reliability in Systematic Review Methodology: Exploring Variation in Coder Decision-Making.” Sociological Methods & Research 50: 837–865. doi: 10.1177/0049124118799372

Carlone, Heidi B., and Angela Johnson. 2007. “Understanding the Science Experiences of Successful Women of Color: Science Identity as an Analytic Lens.” Journal of Research in Science Teaching 44: 1187–1218. doi: 10.1002/tea.20237

Castillo-Montoya, Milagros. 2016. “Preparing for Interview Research: The Interview Protocol Refinement Framework.” Qualitative Report 21: 811–831. doi: 10.46743/2160-3715/2016.2337

Charmaz, Kathy. 2006. Constructing Grounded Theory: A Practical Guide through Qualitative Analysis. London: Sage.

Creswell, John W., and Cheryl N. Poth. 2016. Qualitative Inquiry and Research Design: Choosing among Five Approaches. Sage.

Dolan, Erin L. 2013. “Biology Education Scholarship.” iBiology.

Gentile, Jim, Kerry Brenner, and Amy Stephens, eds. 2017. Undergraduate Research Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington, DC: National Academies Press.

Gioia, Denny, Kevin Corley, Kathleen Eisenhardt, Martha Feldman, Ann Langley, Jane Lê, Karen Golden-Biddle, et al. 2022. “A Curated Debate: On Using ‘Templates’ in Qualitative Research.” Journal of Management Inquiry 31: 231–252. doi: 10.1177/10564926221098955

Goldberg, Abbie E., and Katherine R. Allen. 2015. “Communicating Qualitative Research: Some Practical Guideposts for Scholars.” Journal of Marriage and Family 77(1): 3–22. doi: 10.1111/jomf.12153

Grant, Cynthia, and Azadeh Osanloo. 2014. “Understanding, Selecting, and Integrating a Theoretical Framework in Dissertation Research: Creating the Blueprint for Your ‘House.’” Administrative Issues Journal 4(2): 4.

Hampton, Cynthia, David Reeping, and Desen Sevi Ozkan. 2021. “Positionality Statements in Engineering Education Research: A Look at the Hand That Guides the Methodological Tools.” Studies in Engineering Education 1(2): 126–141. doi: 10.21061/see.13

Hazari, Zahra, Deepa Chari, Geoff Potvin, and Eric Brewe. 2020. “The Context Dependence of Physics Identity: Examining the Role of Performance/Competence, Recognition, Interest, and Sense of Belonging for Lower and Upper Female Physics Undergraduates.” Journal of Research in Science Teaching 57: 1583–1607. doi: 10.1002/tea.21644

Hazari, Zahra, Gerhard Sonnert, Philip M. Sadler, and Marie-Claire Shanahan. 2010. “Connecting High School Physics Experiences, Outcome Expectations, Physics Identity, and Physics Career Choice: A Gender Study.” Journal of Research in Science Teaching 47: 978–1003. doi: 10.1002/tea.20363

Holmes, Andrew, and Gary Darwin. 2020. “Researcher Positionality: A Consideration of Its Influence and Place in Qualitative Research; A New Researcher Guide.” Shanlax International Journal of Education 8(4): 1–10. doi: 10.34293/education.v8i4.3232

Hunter, Anne-Barrie, Sandra L. Laursen, and Elaine Seymour. 2007. “Becoming a Scientist: The Role of Undergraduate Research in Students’ Cognitive, Personal, and Professional Development.” Science Education 91: 36–74. doi: 10.1002/sce.20173

Joshi, Megha, Melissa L. Aikens, and Erin L. Dolan. 2019. “Direct Ties to a Faculty Mentor Related to Positive Outcomes for Undergraduate Researchers.” BioScience 69: 389–397. doi: 10.1093/biosci/biz039

Kim, Ann Y., and Gale M. Sinatra. 2018. “Science Identity Development: An Interactionist Approach.” International Journal of STEM Education 5: 51. doi: 10.1186/s40594-018-0149-9

Knott, Eleanor, Aliya Hamid Rao, Kate Summers, and Chana Teeger. 2022. “Interviews in the Social Sciences.” Nature Reviews Methods Primers 2: 73. doi: 10.1038/s43586-022-00150-6

Korstjens, Irene, and Albine Moser. 2017. “Series: Practical Guidance to Qualitative Research. Part 2: Context, Research Questions and Designs.” European Journal of General Practice 23: 274–279. doi: 10.1080/13814788.2017.1375090

Kvale, Steinar. 1996. InterViews: An Introduction to Qualitative Research Interviewing. Thousand Oaks, CA: Sage.

Kyngäs, Helvi, Kristina Mikkonen, and Maria Kääriäinen, eds. 2020. The Application of Content Analysis in Nursing Science Research. Cham: Springer International. doi: 10.1007/978-3-030-30199-6

Limeri, Lisa B., Muhammad Zaka Asif, Benjamin H. T. Bridges, David Esparza, Trevor T. Tuma, Daquan Sanders, Alexander J. Morrison, Pallavi Rao, Joseph A. Harsh, and Adam V. Maltese. 2019. “‘Where’s My Mentor?!’ Characterizing Negative Mentoring Experiences in Undergraduate Life Science Research.” CBE—Life Sciences Education 18(4): ar61. doi: 10.1187/cbe.19-02-0036

Lincoln, Yvonna S., and Egon G. Guba. 1985. Naturalistic Inquiry. Sage.

Loeb, Susanna, Susan Dynarski, Daniel McFarland, Pamela Morris, Sean Reardon, and Sarah Reber. 2017. “Descriptive Analysis in Education: A Guide for Researchers.” NCEE 2017-4023. National Center for Education Evaluation and Regional Assistance.

Luft, Julie A., Sophia Jeong, Robert Idsardi, and Grant Gardner. 2022. “Literature Reviews, Theoretical Frameworks, and Conceptual Frameworks: An Introduction for New Biology Education Researchers.” CBE—Life Sciences Education 21(3): rm33. doi: 10.1187/cbe.21-05-0134

Majocha, Megan, Zachary Davenport, Derek C. Braun, and Cara Gormally. 2018. “‘Everyone Was Nice . . . But I Was Still Left Out’: An Interview Study about Deaf Interns’ Research Experiences in STEM.” Journal of Microbiology & Biology Education 19(1): 19.1.10. doi: 10.1128/jmbe.v19i1.1381

Mays, Nicholas, and Catherine Pope. 2020. “Quality in Qualitative Research.” In Qualitative Research in Health Care, ed. Catherine Pope and Nicholas Mays, 211–233. doi: 10.1002/9781119410867.ch15

McGee, Ebony O., and Lydia Bentley. 2017. “The Troubled Success of Black Women in STEM.” Cognition and Instruction 35: 265–289. doi: 10.1080/07370008.2017.1355211

Merriam, Sharan B. 2014. Qualitative Research: A Guide to Design and Implementation. San Francisco: Wiley.

Miles, Matthew B., A. Michael Huberman, and Johnny Saldana. 2014. Qualitative Data Analysis: A Methods Sourcebook. 3rd ed. Thousand Oaks, CA: Sage.

Morse, Janice M. 1997. “‘Perfectly Healthy, but Dead’: The Myth of Inter-Rater Reliability.” Qualitative Health Research 7: 445–447. doi: 10.1177/104973239700700401

Morse, Janice M., Michael Barrett, Maria Mayan, Karin Olson, and Jude Spiers. 2002. “Verification Strategies for Establishing Reliability and Validity in Qualitative Research.” International Journal of Qualitative Methods 1(2): 13–22. doi: 10.1177/160940690200100202

National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. 1978. “The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research.” 3 vols. Bethesda, MD: National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research.

O’Connor, Cliodhna, and Helene Joffe. 2020. “Intercoder Reliability in Qualitative Research: Debates and Practical Guidelines.” International Journal of Qualitative Methods 19: 1609406919899220. doi: 10.1177/1609406919899220

Pfeifer, Mariel A., Julio J. Cordero, and Julie Dangremond Stanton. 2023. “What I Wish My Instructor Knew: How Active Learning Influences the Classroom Experiences and Self-Advocacy of STEM Majors with ADHD and Specific Learning Disabilities.” CBE—Life Sciences Education 22(1): ar2. doi: 10.1187/cbe.21-12-0329

Pfeifer, Mariel A., C. J. Zajic, Jared M. Isaacs, Olivia A. Erickson, and Erin L. Dolan. 2023. “Beyond Performance, Competence, and Recognition: Forging a Science Researcher Identity in the Context of Research Training.” bioRxiv 2023.03.22.533783. doi: 10.1101/2023.03.22.533783

Potvin, Geoff, and Zahra Hazari. 2013. “The Development and Measurement of Identity across the Physical Sciences.” 2013 PERC Proceedings. American Association of Physics Teachers.

Pratt, Michael G., Kevin W. Rockmann, and Jeffrey B. Kaufmann. 2006. “Constructing Professional Identity: The Role of Work and Identity Learning Cycles in the Customization of Identity among Medical Residents.” Academy of Management Journal 49: 235–262. doi: 10.5465/AMJ.2006.20786060

Ritchie, Jane, Jane Lewis, Carol McNaughton Nicholls, and Rachel Ormston. 2013. Qualitative Research Practice: A Guide for Social Science Students and Researchers. Sage.

Roulston, Kathryn, Kathleen deMarrais, and Jamie B. Lewis. 2003. “Learning to Interview in the Social Sciences.” Qualitative Inquiry 9: 643–668. doi: 10.1177/1077800403252736

Saldaña, Johnny. 2016. The Coding Manual for Qualitative Researchers. 3rd ed. Los Angeles: Sage.

Sandelowski, Margarete. 1995. “Qualitative Analysis: What It Is and How to Begin.” Research in Nursing & Health 18: 371–375. doi: 10.1002/nur.4770180411

Sandelowski, Margarete. 1998. “Writing a Good Read: Strategies for Re-Presenting Qualitative Data.” Research in Nursing & Health 21: 375–382. doi: 10.1002/(SICI)1098-240X(199808)21:4<375::AID-NUR9>3.0.CO;2-C

Sandelowski, Margarete. 2001. “Real Qualitative Researchers Do Not Count: The Use of Numbers in Qualitative Research.” Research in Nursing & Health 24: 230–240. doi: 10.1002/nur.1025

Spangler, Denise A., and Steven R. Williams. 2019. “The Role of Theoretical Frameworks in Mathematics Education Research.” In Designing, Conducting, and Publishing Quality Research in Mathematics Education, ed. Keith R. Leatham, 3–16. Research in Mathematics Education. Cham: Springer International. doi: 10.1007/978-3-030-23505-5_1

Starks, Helene, and Susan Brown Trinidad. 2007. “Choose Your Method: A Comparison of Phenomenology, Discourse Analysis, and Grounded Theory.” Qualitative Health Research 17: 1372–1380. doi: 10.1177/1049732307307031

Tracy, Sarah J. 2010. “Qualitative Quality: Eight ‘Big-Tent’ Criteria for Excellent Qualitative Research.” Qualitative Inquiry 16: 837–851. doi: 10.1177/1077800410383121

University of California Museum of Paleontology. 2022. “Understanding Science: Science Flowchart.” UC Museum of Paleontology Understanding Science.

Vaccaro, Annemarie, Ezekiel W. Kimball, Ryan S. Wells, and Benjamin J. Ostiguy. 2015. “Researching Students with Disabilities: The Importance of Critical Perspectives.” New Directions for Institutional Research 2014(163): 25–41. doi: 10.1002/ir.20084

Vasquez-Salgado, Yolanda, Tissyana C. Camacho, Isabel López, Gabriela Chavira, Carrie L. Saetermoe, and Crist Khachikian. 2023. “‘I Definitely Feel like a Scientist’: Exploring Science Identity Trajectories among Latinx Students in a Critical Race Theory–Informed Undergraduate Research Experience.” Infant and Child Development 32(3): e2371. doi: 10.1002/icd.2371

Watson, Rachel M., John D. Willford, and Mariel A. Pfeifer. 2018. “A Cultured Learning Environment: Implementing a Problem- and Service-Based Microbiology Capstone Course to Assess Process- and Skill-Based Learning Objectives.” Interdisciplinary Journal of Problem-Based Learning 12(1): article 8. doi: 10.7771/1541-5015.1694

Welch, Catherine. 2018. “Good Qualitative Research: Opening up the Debate.” In Collaborative Research Design: Working with Business for Meaningful Findings, 401–412. Singapore: Springer. doi: 10.1007/978-981-10-5008-4

Yadav, Drishti. 2022. “Criteria for Good Qualitative Research: A Comprehensive Review.” Asia-Pacific Education Researcher 31: 679–689. doi: 10.1007/s40299-021-00619-0

Yilmaz, Kaya. 2013. “Comparison of Quantitative and Qualitative Research Traditions: Epistemological, Theoretical, and Methodological Differences.” European Journal of Education 48: 311–325. doi: 10.1111/ejed.12014

Mariel A. Pfeifer

University of Georgia

Mariel A. Pfeifer is a postdoctoral researcher at the University of Georgia’s SPREE (Social Psychology of Research Experiences and Education) Lab. Her passion for biology education research was sparked by her experiences as an undergraduate teaching assistant, a pre-service science teacher, and a disability services coordinator. Soon Pfeifer will begin her new role as an assistant professor of biology at the University of Mississippi.

Erin L. Dolan is a professor of biochemistry and molecular biology and Georgia Athletic Association Professor of Innovative Science Education at the University of Georgia. As a graduate student, Dolan volunteered in K–12 schools, which inspired her pursuit of a biology education career. She teaches introductory biology, and her research group, the SPREE Lab, works to delineate features of undergraduate and graduate research that influence students’ career decisions.
