8 Using Phenomenography as a Theoretical Framework for Investigating Student Experience With Edtech

Brett McCollum

Introduction

Early on in my SoTL journey, I felt comfortable with quantitative methods that answered research questions in the form of ‘what works.’ I quickly realized that for all the value that quantitative studies provide, they cannot answer all questions. There are aspects of the human experience, including those within teaching and learning, that might be better understood through qualitative methods. For example, it is not possible to quantify what a learner is doing or thinking when they engage with a learning resource without some degree of interpretation on the part of the researcher. Either the researcher assigns categories a priori based on assumptions of how the learner will interact with the resource, or the researcher generates a coding system to describe their observations. In both cases, the categories or codes represent one or more qualitative variables, and the researcher is using their expert judgement to assign observational data within the parameters of these variables. While statistical methods can be applied to the categorical data (Agresti, 2013), the bias of the researcher is unavoidable. This does not devalue the approach; rather, it necessitates disclosure of bias.

Once I overcame my initial opposition to qualitative research, I began experimenting with different styles of research questions. In one study, I investigated ‘what is happening’ when learners attempt to translate between three-dimensional representations of molecules (McCollum et al., 2016). This involved collecting think-aloud interview data and analyzing it to identify the variation in problem-solving strategies that learners developed. Later, I undertook a study of ‘what is possible’ when learners are challenged to problem-solve live over video chat with a peer in another country (Skagen et al., 2018). It is this latter style of question, alternatively phrased as ‘visions of the possible’ among the taxonomy of SoTL questions (Hutchings, 2000), that continues to capture my imagination. Arguably, most SoTL projects involving novel educational technology initially involve this type of research question. We must first discover what is possible with educational technology before we can explore the processes involved or the impacts of these new opportunities.

In this chapter, I introduce the reader to phenomenography. Phenomenography can be viewed as a research methodology, or the methods and procedures used to collect and analyze data. However, a more encompassing perspective is that of a theoretical framework, wherein the theory of phenomenography includes specific assumptions and rulesets that determine the parameters of the study from the outset. As a theoretical framework, it shapes the research questions, influences the methodology of the study, and guides the researcher in their choice of conceptual framework for interpreting the data. In particular, phenomenography as a theoretical framework can be used to investigate variations in a learning experience across a population. This framework is particularly appropriate for ‘what is happening’ and ‘what is possible’ SoTL research questions and can be used for qualitative or mixed methods studies.

Selecting a Research Paradigm to Study Personal Experience

All learning experiences are personal. Many learning experiences are also social. In 2016, Dr. Layne Morsch of the University of Illinois – Springfield and I decided to try leveraging information and communication technologies for a novel learning experience in chemistry. Our plan was to pair our students, one Canadian and one American per group, and have them meet over video chat six times during a semester to collaborate on organic chemistry homework. We designed the experience to include personal and social components. Reflecting on our two contexts, we noted a shared feature: the ubiquity among our student populations of mobile personal computing devices (phones, tablets, laptops) equipped with microphones and cameras. Yet we did not know how the experience would be received by our learners. We were not sure what would work as intended, what would go more smoothly than we anticipated, and what would go horribly wrong.

As described in Chapter 2, phenomenography aligns with the paradigm category that involves empirical research that aims to build theory and where the researcher values objectivity and strives to minimize researcher influence. Working within this paradigm, my colleague and I sought to understand under what conditions our innovative approach to teaching organic chemistry was possible. We also had to actively discuss and document our positionality relative to the experience we were studying. We were experts in the field of chemistry. Our students were novices. We had significant experience using synchronous video communication technologies for professional work. While nearly all of the students in our classes had used video chat before, almost none had used it for any purpose other than socializing with friends or visiting with a relative. Communicating over video chat for academic or professional purposes, particularly with a stranger, was outside the realm of comfort for our students.

Tracking student performance on the exams, relative to other sections or semesters, could give us metrics on the impact of this learning intervention. We could collect observational data from the student pairs by having them submit video recordings of their meetings. However, we were most interested in understanding if it was possible to use information and communication technology to connect learners across an international border for synchronous communication. By possible, I mean was it feasible or realistic to expect a group of second year university students to contact a stranger from another country, organize times across different time zones to meet over video chat, complete individualized preparatory work, and then engage with their assigned international partner respectfully and professionally? Thus, we began asking questions such as the following:

  • What barriers would students tell us they had experienced during their international online collaborative learning?
  • How would students describe the experience?
  • What skills would students report developing as a result of using this communication technology for academic purposes?

These questions emphasize the learner’s experience from their own perspective. How significant we, as the researchers, felt a barrier was for a student, or even whether we considered something within the experience as a barrier rather than a benefit, was not the focus of our work. Our goal was to understand research participants’ experiences and minimize our subjectivity when interpreting participants’ descriptions of the experience. Thus, we needed a framework for our SoTL study that would provide us with a ‘ruleset’ to address our positionality, manage our bias, and honour the contributions of our participants by sharing their voices with minimal interpretative filtering.

Phenomenography as a Research Framework

Phenomenography is a qualitative research framework used to identify and categorize the variations in how people experience a phenomenon (Marton, 1981). Implied within this brief description are a few assumptions:

  1. The objective of the research is to inductively generate a hypothesis/model/theory, not test the veracity of a hypothesis (Glaser & Strauss, 1967).
  2. People can experience, conceptualize, understand, perceive, and apprehend phenomena in and of the world in distinctly different ways (Marton, 1994).
  3. The data collection method will generate qualitative data that can be analyzed to identify the variations in experience.
  4. The number of distinct ways that people can experience a phenomenon is finite.
  5. Sampling a large enough number of participants from a population can permit a researcher to observe the complete set of variations for that population.
  6. The conclusions of the research report how participants themselves describe their relationship to the phenomenon (Pang, 2003).
  7. The phenomenon under study is not being considered in the absence of the population under study (Bowden, 2000; Limberg, 2000); in this sense, the conjoined subject-object relationship is the experience being investigated (Yates et al., 2012).
  8. The researcher is examining participants’ descriptions of the experience, not the experience itself (Säljö, 1997).

Given that the ways an experience may be described can vary from population to population based on their prior experiences and cultural aspects of discourse, description of context is important in phenomenographic research. Furthermore, the context of the researcher, their positionality relative to the subject-object relationship, should be disclosed (Sandbergh, 1997). Appropriate experimental controls and checks on researcher interpretations should be structured within the study’s design.

Similar to research within a positivist paradigm, reproducibility, replicability, and reliability are important in phenomenographic studies. Reproducibility means that if another researcher is working with the same data set and methods, even with the impacts of researcher positionality, they should be able to reproduce the findings. Replicability means that if the study were repeated, the same results would be obtained. In phenomenography, this would necessitate repeating the study on the same population, given that how individuals—and by way of extension, a population—experience a phenomenon is influenced by context. However, how humans experience the world around them can also be time-dependent, and thus, replicability cannot necessarily be assumed or expected for phenomenographic studies. In the context of our study, how students experienced using video chat for learning changed as campus WiFi improved. Descriptions of the experience changed again during the COVID-19 pandemic. This does not imply that the results from the original data analysis were not valid, or that they no longer provide value. Rather, it is the responsibility of scholars and practitioners to reflect on the relationship between context and replicability when reading phenomenographic studies. The third ‘R,’ reliability, means that the findings are sufficiently robust to withstand scrutiny by scholars using alternative methods. This is not the same as suggesting that your results are ‘true,’ which is fundamentally an epistemological question. Triangulating phenomenographic results with analysis of other qualitative or quantitative data, such as student performance data, can increase your trust that your conclusions are reliable.

The result of a phenomenographic study is a well-defined set of categories that attempts to span the outcome space of possible variations in a subject-object relationship. The observed outcome space may not be identical to the true outcome space for the population, depending on the sampling method (Åkerlind et al., 2005); collecting data until saturation is achieved is one approach that can improve the reliability of the set of categories (Trigwell, 1994, 2000).

The set of categories should have as few elements as is feasible for describing the critical variation in experience, and the set should meet the following criteria (Bruce, 1997; Marton & Booth, 1997):

  • Each element within the set must describe a distinctly different aspect of the experience (it must be qualitatively different from the other categories).
  • Each element within the set should logically be related to each other element.
  • The complete set of elements describes the observed critical variation.

Finally, based on the logical relationships between the elements, the structure of the outcome space can take one of three forms: hierarchical, developmental, or dependent on participants’ past experiences (Laurillard, 1993).

Applying a phenomenographic framework to the study of online collaborative learning involved selecting a research question situated around the learner’s experience with the phenomenon, aligning that research question with an appropriate data collection method, and using inductive data analysis methods.

Methods

Interviews and Focus Groups

Within a phenomenographic framework, in order to form a model of your research participants’ experiences with respect to a phenomenon, you must first collect descriptions of the experience from your participants. Generally, this can take two forms: one-on-one interviews or focus groups (Gill et al., 2008). The choice of one method over the other depends on the objective of the data collection.

Interviews with individual participants can be structured, semi-structured, or unstructured (Yeo et al., 2023). For semi-structured interviews, key questions are selected by the researcher in advance with the intention of pursuing follow-up lines of questioning dependent on the responses of the participant. This permits the researcher to obtain information about the experience that participants deem relevant beyond the researcher’s original parameters of interest. While this method is more time-intensive than a research questionnaire or survey, the dynamic interaction between the researcher and the participant in an interview yields significantly more insight into the experience of the participant. The resulting increased depth of qualitative data is valuable for enhancing the reliability of the outcome space.

A focus group is an organized discussion with a group of participants (Krueger & Casey, 2009; Gibbs, 2012). A key distinction of focus groups relative to individual interviews is the interaction between participants. Thus, the researcher must be prepared to act as a facilitator of the conversation, creating opportunities and extending invitations for all participants to contribute to the discussion. Depending on the phenomenon, the individuals within a group, and the skill of the interviewer, focus groups can generate consensus or reveal diverging perspectives of an experience. For a phenomenographic study, which seeks to identify the variations in how people experience a phenomenon, the researcher will employ probing questions that seek to expose differences in opinion within the group. Furthermore, the researcher should express appreciation for, and affirm the validity of, these varied descriptions of the experience. This helps participants discuss their feelings and opinions openly and minimizes the potential impacts of groupthink. Yet it must be acknowledged that the researcher cannot predict how the group dynamics will play out. Proper documentation of the conversation by audio recording, and of the observed context of each focus group by way of field notes, will aid the researcher during data analysis (Krueger & Casey, 2001; Phillippi & Lauderdale, 2018).

As a research method, focus groups are not suitable for hypothesis-testing studies (Vaughn et al., 1996). This is not a problem for work framed by phenomenography due to its hypothesis-generating nature. Focus groups are effective for quickly gathering information on how participants are experiencing change. Members of the group may provide complementary information that fills in the gaps from the description of a single participant. On the other hand, focus groups may discourage certain population members from fully contributing or participating at all, and findings may not generate the same depth of insight as one-on-one interviews (Halcomb et al., 2007). Hence, the choice of sampling method should be reported, and attainment of data saturation is important for a phenomenographic study.

For our study of student experiences with international online collaborative learning (OCL), we chose to use focus groups. Our purpose in selecting focus groups over interviews was to have participants hear a range of OCL pair dynamics and discuss how each other’s experiences compared and contrasted to their own. To avoid a conflict of interest in our dual roles as instructors and researchers, and to minimize its potential impacts on group discussions, we chose to have senior undergraduate student researchers facilitate the focus groups. Having these near-peers managing all interactions with participants demonstrably increased the comfort of our students during the focus groups, yielding descriptions of the experience that may not have been reported if a faculty member had facilitated the groups.

At the beginning of each focus group, the facilitator described the obligation of all attendees to maintain the anonymity of participants, the purpose of the data collection, and the potential impact of the results to inform future pedagogical decisions. Throughout the focus groups, participants were asked how their experience compared to a previous speaker. When new issues would emerge during the discussion, the facilitator would ask additional questions, demonstrating interest, to obtain additional detail.
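To make this structure concrete, the following sketch shows how a facilitation guide containing these elements might be organized as a simple data structure so that it can be applied consistently across groups. The wording of the opening script and probes is hypothetical; only the general topics (barriers, descriptions of the experience, skill development) come from the guiding questions listed earlier.

```python
# A minimal sketch of a semi-structured focus group guide. The wording is
# hypothetical and is not the study's actual protocol; it only mirrors the
# structure described above (opening script, key questions, standing probes).

focus_group_guide = {
    "opening_script": [
        "Reminder: all attendees must maintain the anonymity of participants.",
        "Purpose: to understand your experience of online collaborative learning.",
        "Impact: results may inform future pedagogical decisions in this course.",
    ],
    "key_questions": [
        "What barriers did you experience during your international online collaboration?",
        "How would you describe the experience overall?",
        "What skills do you feel you developed by using this technology for academic work?",
    ],
    "standing_probes": [
        "How does that compare with what the previous speaker described?",
        "Can you give a specific example from one of your meetings?",
    ],
}

def print_guide(guide: dict[str, list[str]]) -> None:
    """Print the guide so a facilitator can review it before a session."""
    for section, items in guide.items():
        print(f"\n{section.replace('_', ' ').title()}:")
        for item in items:
            print(f"  - {item}")

print_guide(focus_group_guide)
```

Keeping the guide in a single shared structure of this kind is one way for multiple near-peer facilitators to run sessions consistently while still pursuing emergent topics.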

The initial foci for our SoTL study on learning experiences during OCL were barriers and benefits of the novel approach. This included barriers/benefits involving the educational technology (hardware, software, WiFi connectivity, etc.) and the social component of working with an international partner. During our focus groups, the topics of professionalism, professional skills, and professional identity emerged. Our facilitators skillfully probed for more information when participants raised these unanticipated topics. Participants described navigating technological barriers with support from their partner and situated their comments within language related to professional skills. They reported how an early semester lack of professionalism from their partner, or from themselves, harmed the trust between partners. Further questioning from the facilitator revealed that across the six assignments, student pairs were broadly successful in addressing differences in meeting preparation, which resulted in strengthening the relationship between partners. The course instructors, Dr. Morsch and myself, separately observed complementary evidence of this bonding between partners when we suggested to our classes that the partners would be reassigned after the third assignment to provide an opportunity to meet another international peer. In both classrooms, chaos erupted. One particularly vocal student jumped out of their seat and shouted, “You can’t take my Canadian away from me!”

In the focus groups, we observed that as participants grew comfortable with the facilitator, each other, and the structure of the focus group, they were more proactive in emphasizing the similarities and differences in their experiences and provided more supporting examples from their OCL pair interactions. After several focus groups, the facilitators began meeting regularly to compare field notes and assess if new descriptions of the phenomenon critically varied from previous focus groups. This provided an early estimation of data saturation and indicated when we were ready to begin data analysis.

Qualitative Coding

Focus group audio recordings were transcribed, and then we, as the researchers, read the transcripts several times to familiarize ourselves with the data. Next, a subset of transcripts was inductively coded line by line to generate an initial set of codes (Thomas, 2006). This coding process was carried out by members of the research team independently, and then the team met to discuss the emergent codes. The choices of names for codes, their meanings, and how they should be applied to the transcripts were refined through collegial debate. This process of coding and refining the codes was repeated for several iterations before additional transcripts were added to the coding process. Additional iterations of code refinement then followed until saturation was achieved, meaning that no new codes emerged and the researchers reached a consensus on code names, meanings, and how the codes should be applied to passages within the transcripts. Although we describe the codes as ‘emerging’ from the data, it is important for researchers to again acknowledge their positionality and how it influences their observation of the codes through active choices (Fine, 2002).
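For readers who track their coding computationally, the sketch below illustrates one way to monitor an emerging codebook across transcripts and flag when additional transcripts stop contributing new codes, which is one practical signal of saturation. The transcript excerpts and code names are hypothetical placeholders, not data from our study.

```python
# A minimal sketch (hypothetical data) of tracking inductively generated codes
# across transcripts to estimate saturation: saturation is suggested when
# successive transcripts contribute zero new codes to the codebook.

# Each transcript is a list of (passage, assigned_codes) pairs produced by coders.
coded_transcripts = {
    "focus_group_01": [
        ("We met over video chat on Sunday nights...", {"technology", "scheduling"}),
        ("My partner walked me through the mechanism...", {"peer_teaching"}),
    ],
    "focus_group_02": [
        ("The WiFi kept dropping during our meeting...", {"technology"}),
        ("I felt like I was acting more professionally...", {"professional_identity"}),
    ],
    # ... further transcripts are added iteratively as coding proceeds
}

codebook: set[str] = set()
for name, passages in coded_transcripts.items():
    codes_here = set().union(*(codes for _, codes in passages))
    new_codes = codes_here - codebook
    codebook |= codes_here
    print(f"{name}: {len(new_codes)} new code(s) -> {sorted(new_codes)}")

print(f"Codebook after all transcripts: {sorted(codebook)}")
```

In practice, the codebook itself was negotiated through team discussion; this kind of bookkeeping only makes the trajectory toward saturation easier to see and report.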

The remaining transcripts were then coded by two or more team members using the established coding system. Based on our initial reading of the transcripts to familiarize ourselves with the data, we were confident that no additional codes would be required, and thus, a deductive coding approach was appropriate at this stage.
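Where two or more coders apply an established coding system independently, some research teams also report an agreement statistic such as Cohen's kappa. Our study relied on discussion to consensus rather than a kappa value, so the sketch below is purely illustrative of how such a check could be computed; the passage labels are hypothetical.

```python
# Illustrative only: one way to quantify agreement between two coders who have
# labelled the same set of passages with an established codebook.
from collections import Counter

coder_a = ["technology", "technology", "scheduling", "peer_teaching", "technology"]
coder_b = ["technology", "scheduling", "scheduling", "peer_teaching", "technology"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected chance agreement, from each coder's marginal label frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```

A low value would prompt a return to collegial debate over code meanings before proceeding, which is consistent with the consensus-building process described above.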

Code Clustering and Thematic Analysis

With codes established, the research team began to cluster the codes into themes. This process, known as thematic analysis (Braun & Clarke, 2006; Saldaña, 2009), aims to reduce the number of elements (categories) within the phenomenographic outcome space to its minimal critical set. Thematically organizing the individual codes reduced more than a dozen codes to a small set of themes that spanned the entire outcome space. Thus, a benefit of this analysis approach is to simplify the narrative of your findings for dissemination. Having an outcome space of three to five themes permits your audience to more effectively understand the findings of your SoTL study. In our study of how students experience OCL with an international peer in organic chemistry, we obtained four themes: impact, barriers, resources, and collaborative learning approaches.
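Conceptually, this clustering step amounts to mapping each code onto a theme and then examining how the coded passages distribute across those themes. The sketch below illustrates the idea using our four themes; only the three ‘barriers’ codes come from the published work, and the remaining code names and passage counts are hypothetical.

```python
# A minimal sketch of clustering codes into themes. The four theme names come
# from the study; codes other than the three 'barriers' codes are hypothetical.
from collections import Counter

code_to_theme = {
    "content_and_pedagogy": "barriers",
    "social_interactions": "barriers",
    "technology": "barriers",
    "professional_skills": "impact",                       # hypothetical code name
    "instructor_support": "resources",                     # hypothetical code name
    "shared_problem_solving": "collaborative learning approaches",  # hypothetical
}

# Flat list of codes assigned across all transcripts (hypothetical).
coded_passages = ["technology", "technology", "professional_skills",
                  "social_interactions", "shared_problem_solving"]

theme_counts = Counter(code_to_theme[code] for code in coded_passages)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} coded passage(s)")
```

Tallies of this sort do not replace the interpretive work of defining themes, but they can help a team check that each theme is well represented in the data before writing up the outcome space.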

Communicating Our Results

The richness of our data made it impractical to publish all of the results in a single manuscript. Thus, we prepared separate manuscripts addressing the themes of impact (Skagen et al., 2018) and barriers (McCollum et al., 2019). For example, within the theme of barriers for OCL with an international partner, we identified three codes: (1) content and pedagogy, (2) social interactions, and (3) technology. In that manuscript, each code was presented as a sub-header. The observed variation in ways that our participants experienced the phenomenon relative to that code was described under that sub-header. Quotes from our focus groups were provided to the reader with appropriate framing of context and importance. Our findings were situated within existing literature to more fully explain our observations and solidify our conclusions with established theory.

Including Scholarly Details on Abysmal Failures

As a scholar of teaching and learning, I strive to provide sufficient detail in publications to permit other scholars and practitioners to replicate my teaching initiative within their own courses. Beyond research findings, this includes descriptions of what worked as intended. It also involves the exploration of issues that emerged, resistance among stakeholders, and what can best be described as abysmal failures.

For example, the SoTL study investigating barriers for OCL found several technological barriers in the Fall 2016 semester that necessitated changes in design before the project could be repeated in a future semester (McCollum et al., 2019). Campus WiFi was so unreliable that one participant stated, “we learned not to trust the WiFi,” and another said, “we would just have to mime the whole time” (p. 12). As a result, the technology became a distraction rather than affording new opportunities for learning. While our students were exceptionally patient with the situation, I did not consider the status quo to be sustainable. Fortunately, when I made inquiries, I learned that my university was in the midst of a campus-wide WiFi upgrade, which was temporarily decreasing the reliability of the network but would ultimately double WiFi capacity once the upgrades were complete. Based on the experience described by our participants, I would not recommend the use of OCL unless all learners had adequate access to a reliable WiFi network.

Another abysmal failure from the same study concerned the learning gains of one segment of our participants. Quantitative data on student performance revealed measurable gains on standard exam questions for the Canadian students, relative to previous semesters, but no comparable improvement for the American learners. Initially, we were unable to explain the presence of this variation across the two populations. The answer came from our phenomenographic study. In the focus groups, participants described how the American students were assigned the responsibility of recording the video chat meetings and submitting them to their instructor. Training had been provided to the American students by their university’s I.T. department on how to use the recording software, but technical issues still arose during student-pair meetings. In contrast, the Canadian students were not required to record the meetings since that task was already being completed by their partner. This differential in responsibilities did not markedly change the experience for all learners, but it did for some American students. As one learner stated, “I didn’t really feel like I learned that much because there was so much technology issues that I was freaking out about that the whole time” (p. 13). The cognitive load experienced by the American partners differed enough from that of their Canadian counterparts that the change in exam performance by the average American student was not statistically significant, despite our observation of improved learning for the Canadian students. Given that one of the objectives of our teaching innovation was to improve student learning, we considered this result a failure that required a redesign of the experience. The next time we deployed the OCL assignments, we removed the requirement for recording the video chat meetings. A description of the iterative redesign of the OCL experience in collaboration with students was ultimately reported as a case study of Students as Partners (McCollum et al., 2019); our work reflecting the cyclic process of scholarly teaching is shown in Figure 4.1 of Chapter 4.
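For readers interested in the quantitative side of such a comparison, one conventional approach is an independent-samples t-test on exam scores across semesters or sections. The chapter does not specify the exact test we used, and the scores below are invented placeholders for illustration only.

```python
# Illustrative only: a conventional check of whether mean exam performance
# changed between semesters. Scores are hypothetical placeholders, not data
# from the study described in this chapter.
from scipy import stats

previous_semester = [68, 72, 75, 61, 80, 70, 66, 74]   # hypothetical exam scores
ocl_semester = [70, 74, 73, 65, 79, 72, 69, 76]        # hypothetical exam scores

t_stat, p_value = stats.ttest_ind(ocl_semester, previous_semester, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A p-value above the chosen alpha (e.g., 0.05) would be consistent with
# 'no statistically significant change', as was observed for the American cohort.
```

Pairing such quantitative checks with the qualitative findings is an example of the triangulation described earlier in this chapter.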

Conclusion

Qualitative research frameworks, such as phenomenography, provide value both in terms of a theoretical underpinning (Ravitch & Riggan, 2016) for the work and as a standardized set of rules that the SoTL scholar agrees to abide by. Similar to using standard equipment in a chemistry lab or a validated survey tool in social sciences work, the use of a research framework allows your audience to quickly understand the strengths and limitations of your study. As with methodological or instrumental approaches in other forms of research, increased use of a particular framework by a researcher can enhance their expertise with that framework and improve their scholarship. Your choice of framework guides the way you frame your research question, how you investigate it, and what you can report at the end of your study. With its emphasis on the conjoined subject-object relationship, phenomenography is exceptionally useful for SoTL inquiry on how learners experience educational technology or how educators experience teaching with educational technology.

References

Agresti, A. (2013). Categorical data analysis (3rd ed.). John Wiley & Sons.

Bowden, J. A. (2000). The nature of phenomenographic research. In J. A. Bowden, & E. Walsh (Eds.), Phenomenography (pp. 1–18). RMIT University Press.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Bruce, C. (1997). The seven faces of information literacy. Auslib Press.

Fine, M. (2002). Disruptive voices: The possibilities for feminist research. University of Michigan Press.

Gibbs, A. (2012). Focus groups and group interviews. In J. Arthur, M. Waring, R. Coe, & L. V. Hedge (Eds.), Research methods and methodologies in education (pp. 186–192). Sage.

Gill, P., Stewart, K., Treasure, E., & Chadwick, B. (2008). Methods of data collection in qualitative research: Interviews and focus groups. British Dental Journal, 204, 291–295. https://doi.org/10.1038/bdj.2008.192

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine.

Halcomb, E. J., Gholizadeh, L., Phillips, J., & Davidson, P. M. (2007). Literature review: Consideration in undertaking focus group research with culturally and linguistically diverse groups. Journal of Clinical Nursing, 16(6), 1000–1011. https://doi.org/10.1111/j.1365-2702.2006.01760.x

Hutchings, P. (2000). Opening lines: Approaches to the scholarship of teaching and learning. Carnegie Foundation for the Advancement of Teaching.

Krueger, R. A., & Casey, M. A. (2001). Designing and conducting focus group interviews. Social Development Papers: Social Analysis Selected Tools and Techniques, 36, 4–23.

Krueger, R., & Casey, M. (2009). Focus groups: A practical guide for applied research (4th ed.). Sage.

Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology. Routledge.

Limberg, L. (2000). Phenomenography: A relational approach to research on information needs, seeking and use. The New Review of Information Behaviour Research, 1(December), 51–67.

Marton, F. (1981). Phenomenography — Describing conceptions of the world around us. Instructional Science, 10, 177–200. https://doi.org/10.1007/BF00132516

Marton, F. (1994). Phenomenography. In T. Husen, & T. N. Postlethwaite (Eds.), The international encyclopedia of education (2nd ed., Vol. 8, pp. 4424–4429). Pergamon.

Marton, F., & Booth, S. (1997). Learning and awareness. Lawrence Erlbaum Associates.

McCollum, B., Morsch, L., Pinder, C., Ripley, I., Skagen, D., & Wentzel, M. (2019). Multi-dimensional trust between partners for international online collaborative learning in the third space. International Journal for Students as Partners, 3(1), 50–59. https://doi.org/10.15173/ijsap.v3i1.3730

McCollum, B., Morsch, L., Shokoples, B., & Skagen, D. (2019). Overcoming barriers for implementing international online collaborative assignments in chemistry. The Canadian Journal for the Scholarship of Teaching and Learning, 10(1). https://doi.org/10.5206/cjsotl-rcacea.2019.1.8004

McCollum, B., Sepulveda, A., & Moreno, Y. (2016). Representational technologies and learner problem-solving strategies in chemistry. Teaching & Learning Inquiry, 4(2), 1–14. https://doi.org/10.20343/teachlearninqu.4.2.10

Pang, M. F. (2003). Two faces of variation: On continuity in the phenomenographic movement. Scandinavian Journal of Educational Research, 47(2), 145–156. https://doi.org/10.1080/00313830308612

Phillippi, J., & Lauderdale, J. (2018). A guide to field notes for qualitative research: Context and conversation. Qualitative Health Research, 28(3), 381–388. https://doi.org/10.1177/1049732317697102

Ravitch, S. M., & Riggan, J. M. (2016). Reason and rigor: How conceptual frameworks guide research. Sage.

Saldaña, J. (2009). The coding manual for qualitative researchers. Sage.

Säljö, R. (1997). Talk as data and practice—A critical look at phenomenographic inquiry and the appeal to experience. Higher Education Research & Development, 16(2), 173–190. https://doi.org/10.1080/0729436970160205

Sandbergh, J. (1997). Are phenomenographic results reliable? Higher Education Research & Development, 16(2), 203–212. https://doi.org/10.1080/0729436970160207

Skagen, D., McCollum, B., Morsch, L., & Shokoples, B. (2018). Developing communication confidence and professional identity in chemistry through international online collaborative learning. Chemistry Education Research and Practice, 19, 567–582. https://doi.org/10.1039/C7RP00220C

Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237–246. https://doi.org/10.1177/1098214005283748

Trigwell, K. (1994). The first stage of a phenomenographic study of phenomenography. In J. A. Bowden, & E. Walsh (Eds.), Phenomenographic research: Variations in method (pp. 56–72). RMIT University Press.

Trigwell, K. (2000). A phenomenographic interview on phenomenography. In J. A. Bowden, & E. Walsh (Eds.), Phenomenography (pp. 62–82). RMIT University Press.

Vaughn, S., Schumm, J., & Sinagub, J. (1996). Focus group interviews in education and psychology. Sage.

Yates, C., Partridge, H., & Bruce, C. (2012). Exploring information experiences through phenomenography. Library and Information Research, 36(112), 96–119. https://doi.org/10.29173/lirg496

Yeo, M., Miller-Young, J., & Manarin, K. (2023). SoTL research methodologies: A guide to conceptualizing and conducting the Scholarship of Teaching and Learning. Routledge. https://doi.org/10.4324/9781003447054