⚠️ Research in Progress: Doctoral Defence Forthcoming
This site is a living academic document. Content is being updated as the dissertation moves toward its final defence. Some sections remain in draft form. New to this site? Visit the Start Here page.
How I Designed This Research and Why the Method Matters
“The master’s tools will never dismantle the master’s house.”
Audre Lorde, 1984
I could have studied international students the traditional way. I could have designed a survey, distributed it to hundreds, run the numbers, and produced findings at the scale institutions prefer: generalizable, defensible, tidy. Instead, I designed a methodology that hands cameras to students and asks them to show me their university through their own eyes. This was a methodological choice. It was also an ethical one. This page describes the design of that methodology. The findings it generated will be shared following the doctoral defence.

The Paradigm: Critical-Transformative Research
This study is situated within the critical-transformative paradigm articulated by Mertens (2009), which holds that research should serve the interests of marginalized communities and contribute to social transformation. A research paradigm is the set of beliefs and values about what counts as knowledge, what counts as evidence, and what research is for, and it guides every decision a researcher makes. The critical-transformative paradigm differs from conventional scientific approaches in that it explicitly takes a political and ethical stance: research that ignores the power structures shaping the people it studies is anything but neutral. It is complicit. This paradigmatic commitment shapes every aspect of the research design: from the questions I ask to the methods I employ to the ways I will eventually present findings.
Within this paradigm, I adopt a constructivist-interpretivist epistemology with critical inflections. An epistemology is a theory of how we know what we know. A constructivist-interpretivist epistemology holds that knowledge is constructed through social interaction rather than discovered as fixed objective fact. Meaning emerges through interpretation, and interpretation is always shaped by the social positioning of both researcher and participant. The critical inflection demands attention to how structures of power, particularly those emerging from academic capitalism (Slaughter & Rhoades, 2004) and settler colonialism, shape the conditions within which meaning-making occurs.
Shawn Wilson’s (2008) concept of relational accountability provides the axiological foundation of this study: the ethical value system that orients every decision I make. Wilson, an Opaskwayak Cree scholar, argues that research creates relationships, and that those relationships carry obligations beyond publication. While I am a settler researcher using a settler methodology, I draw on relational accountability as an ethical compass: findings must be returned to the communities that generated them. This commitment shapes the knowledge mobilization plan and the eventual structure of this website.
Photovoice: Cameras in Students’ Hands
The participatory foundation of this study is Photovoice, developed by Caroline Wang and Mary Ann Burris (1997) for community health research. Photovoice is a participatory action research method in which community members use cameras to document their own lives and concerns, then come together to analyze those photographs collectively. The core principle is radical in its simplicity: the people who live an experience are the experts on that experience. Instead of the researcher deciding what matters and designing instruments to capture it, Photovoice positions cameras in participants’ hands and asks them to document their own realities.
Wang and Burris (1997) identified three goals for Photovoice: to enable people to record and reflect on their community’s strengths and concerns, to promote critical dialogue through discussion of photographs, and to reach policymakers. These goals aligned precisely with what I needed. I was designing a study about the gap between institutional rhetoric and lived experience, and I needed a method that would let the gap speak for itself.
After the photography period, participants gathered for a collaborative debrief session guided by the SHOWeD protocol, a structured questioning technique that moves from description through analysis to action:
Table 3
The SHOWeD Protocol
| Letter | Question |
|---|---|
| S | What do you See here? |
| H | What is really Happening here? |
| O | How does this relate to Our lives? |
| W | Why does this situation exist? |
| e | How could this image Educate others? |
| D | What can we Do about it? |
Note. Adapted from Wang and Burris (1997). The lowercase “e” in SHOWeD reflects the six-stage structure used in this study. The protocol structures collaborative meaning-making, moving participants from literal description through systemic analysis to actionable recommendations.
Scholarly Personal Narrative: Making the Researcher Visible
The reflexive strand of the methodology is Scholarly Personal Narrative (SPN), developed by Nash (2004) and extended by Nash and Bradley (2011). SPN is a qualitative writing method that integrates personal experience into scholarly inquiry as a source of analytical insight rather than as mere contextual backdrop. Its premise is that refusing the “view from nowhere” (the pretence that the researcher is a neutral, disembodied observer) produces stronger scholarship, because it makes the lenses through which we interpret data visible and contestable.
Throughout the dissertation, SPN sections document my own experience as a contract faculty member navigating institutional precarity alongside the students I study. These sections function as analytical lenses that contextualise participant-generated data rather than as evidence of truth claims. My narratives illuminate what my positioning allows me to see while making transparent what it might obscure.
Blended Witnessing: Two Eyes, One Depth
I developed blended witnessing to name the specific ethical and analytical alignment that existing methodological terms failed to capture. Photovoice centres participant voice yet can render the researcher invisible. Scholarly Personal Narrative centres researcher reflexivity yet can overshadow participant testimony. Blended witnessing holds both voices in productive tension through what I call protective divergence: the deliberate analytical separation of researcher and participant narratives to prevent appropriation while enabling resonance.
The metaphor I use is binocular vision. One eye, the participant’s, sees the lived experience of exclusion. The other eye, the researcher’s, sees the institutional mechanisms that produce that exclusion. Together, they generate depth perception that reveals what I call the architecture of exclusion as an integrated system. This draws on Donna Haraway’s (1988) concept of situated knowledges: the argument that only partial perspective promises objective vision, and that refusing the “god trick of seeing everything from nowhere” produces more accountable scholarship.
Table 4
Components of Blended Witnessing
| Component | Source | Function in This Study |
|---|---|---|
| Photovoice | Wang & Burris (1997) | Centres participant expertise; generates visual evidence; positions students as experts on their own experience |
| Scholarly Personal Narrative | Nash (2004); Nash & Bradley (2011) | Renders researcher positioning visible as analytical data; first-person engagement with interpretive tensions |
| Interpretative Phenomenological Analysis | Smith et al. (2009) | Systematic meaning-making analysis; each participant’s experience understood in its particularity before cross-case patterns are identified |
| Trauma-informed ethics | Mertens (2009); Ginwright (2018) | Protection during vulnerable disclosure; ongoing consent; participant wellbeing prioritized over data richness |
| Relational accountability | Wilson (2008) | Research as relationship carrying obligations beyond publication; findings returned to communities before academic audiences |
Note. Each component contributes distinct methodological strengths. The integration enables analysis that honours participant voices while making the researcher’s positionality visible and contestable.

The Kitchen Table
Before entering a single image into analysis software, I designed the study to begin with what I call kitchen table analysis. This approach draws on feminist traditions of kitchen table reflexivity (Kohl & McCutcheon, 2015) and what Mountz et al. (2015) term slow scholarship: a politics of resistance to the accelerated timelines of the neoliberal academy. The protocol involves printing participant photographs and physically arranging them on a surface, using coloured tags and yarn to trace emergent connections before digital coding begins.
Something in me resisted the idea of entering photographs into a database before sitting with them, moving them around, letting them speak to each other and to me. Jongeling et al. (2016) observe that the embodied act of sorting printed photographs prompts memories, bridges connections, and enables prioritization in ways that digital sorting may obscure. The kitchen table phase was designed to honour this embodied way of knowing.
Following the analogue phase, data were entered into NVivo for systematic verification. Convergence between the two analytical modes strengthened confidence in the emergent themes; divergence prompted a return to the data.
Designing for Absence
One of the methodological innovations I propose in this study is the concept of ghost data: participant withdrawal treated as structural evidence rather than methodological failure. Ghost data is my term for what happens when the most precarious participants, the very people a study of precarity most needs to hear from, are consumed by the conditions being studied before they can participate in research about those conditions. In any study of precarity, those participants may be the least available for sustained engagement. If a participant withdraws because the conditions the study examines have consumed their capacity, that withdrawal reveals something about those conditions that completed participation cannot.
Ghost data maps the outer limits of participation under conditions of extraction. It asks: what does the system take before testimony can occur? This concept builds on Mazzei’s (2007) work on inhabited silence and extends it into participatory visual methodologies where the absence of an image carries analytical weight.
I designed the study with the awareness that attrition in precarious populations is likely and should be analyzed rather than dismissed. The ethical protocols include clear and unconditional withdrawal procedures, and the analytical framework includes provisions for interpreting absence as data.
Ethical Design
The study received dual ethics approval: from the Thompson Rivers University Research Ethics Board (File H25-04204, Board of Primary Record) and the Royal Roads University Research Ethics Board (File H25-00572). Ethical design centres on four commitments:
Table 5
Ethical Design Principles
| Principle | How It Shapes This Study |
|---|---|
| Consent as ongoing process | Consent is treated as an evolving relationship rather than a one-time signature. Participants reviewed and renegotiated consent at multiple stages: initial recruitment, photography, debrief, and member checking. |
| Visual sovereignty | Participants retain ownership of all original images. Photographs are reproduced only with explicit permission, and participants may withdraw any image at any time. Faces of third parties are obscured. |
| Trauma-informed protocols | Data collection was designed with awareness that precarity produces vulnerability. Support resources (Here2Talk, Interior Crisis Line) were provided. Debriefs included emotional check-ins before analytical discussion. |
| Relational accountability | Findings are returned to participant communities before academic audiences. Participant review of interpretations is built into the timeline. The knowledge mobilization plan prioritizes community benefit. |
Note. These principles are designed to ensure that the participatory methodology avoids reproducing the extractive dynamics it seeks to analyze. The research process should model the relational integrity the findings call for.
Recruitment and Sampling Design
I want to be honest about how I recruited participants, because the recruitment process itself reveals something about the conditions this study was designed to examine.
Participants were recruited through purposive sampling, a strategy that selects individuals based on direct, lived experience of the phenomenon under investigation rather than statistical representativeness (Patton, 2002). In a study designed to centre the voices of international business students at Thompson Rivers University, purposive sampling was the only approach that made ethical and epistemological sense. I was seeking depth of experience, particularity of perspective, and willingness to engage in a multi-stage, time-intensive participatory process. These qualities cannot be randomly sampled. They have to be sought deliberately.
To be eligible for the Photovoice component, students needed to be currently enrolled as international students in a business programme at TRU and willing to participate in three stages of engagement: a photography workshop, an individual photo-elicitation interview, and a group dialogue session. I recruited through several channels: direct invitation in classes I taught, posters placed in common areas and the library, and word of mouth through the International Student Services office. I was deliberate about reaching students across different years of study and countries of origin, because I understood that the experience of belonging and precarity shifts depending on how long someone has been navigating the institution and what structural protections, or absences of protection, they carry with them.
Five students enrolled. Four completed the full research process. One withdrew before taking a single photograph. Her withdrawal was caused directly by the work demands that the study was designed to document. She did not leave because of the study; she left because the institution had already consumed the time and energy that participation would have required. I return to her absence throughout this dissertation. It is a presence shaped like an absence. It is data.
The sample size of four completed participants is small by conventional research standards, but it is entirely appropriate for the methodology I employed. Interpretative phenomenological analysis (IPA) is designed for small, carefully selected samples that allow for deep, idiographic analysis of individual experience before patterns across cases are considered (Smith et al., 2009). Photovoice similarly works with depth rather than breadth: Wang and Burris (1997) developed the method for groups of six to twelve, and scholars applying Photovoice to university contexts have consistently worked with four to ten participants. Four participants, each producing multiple photographs and participating in extended dialogue, generated a body of evidence that is rich, layered, and analytically substantial.
References
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Sage.
Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative phenomenological analysis: Theory, method and research. Sage.
Wang, C., & Burris, M. A. (1997). Photovoice: Concept, methodology, and use for participatory needs assessment. Health Education & Behavior, 24(3), 369–387. https://doi.org/10.1177/109019819702400309
The Photography Workshop and Data Collection Protocol
The photography workshop was the first time I sat in a room with participants and understood, in my body rather than just my mind, what I was asking of them.
The workshop ran for approximately ninety minutes. I opened with a brief orientation to the study’s purpose: to use photographs as a way of documenting and theorizing the gap between what the institution promises and what it delivers. I was careful about how I framed this. I wanted to leave interpretive space open before they had picked up their phones. I wanted them to photograph what they actually noticed, what stopped them, what made them feel something that resisted words. The orientation was structured around invitation rather than instruction.
Participants used their own smartphones. This was a deliberate methodological choice. Providing cameras would have introduced a layer of technological unfamiliarity and, more importantly, would have changed the relationship between the photographer and the act of photographing. Smartphones are already embedded in how these students navigate their world. Using them meant that the photography could happen in ordinary moments, between classes, on the way to work, in the dining hall at midnight, rather than only in formally designated “research time” that precarity might render unavailable.
Each participant was asked to produce a minimum of five photographs over a two-week period, documenting spaces, objects, moments, and experiences related to their daily life at TRU. There was no upper limit. Participants were encouraged to photograph whatever drew their attention, whether it confirmed or complicated their experience of the institution. They were also given a brief photography ethics briefing covering the importance of avoiding identifiable images of other people without consent, respecting spaces where photography is restricted, and exercising their own judgment about what felt safe to document. Visual sovereignty, the participant’s right to control what they chose to show, was established from the first session.
Over the two-week period, four participants produced a combined total of twenty-nine photographs. The images ranged from architectural spaces to small personal objects to institutional signage to food. What they shared was a quality of deliberateness: these were images that someone had stopped to make, had chosen to include, had decided were worth showing. That quality of attention is itself methodologically significant. A photograph is an argument (Rose, 2016), and treating it as such shaped every subsequent stage of analysis.
References
Rose, G. (2016). Visual methodologies: An introduction to researching with visual materials (4th ed.). Sage.
Wang, C., & Burris, M. A. (1997). Photovoice: Concept, methodology, and use for participatory needs assessment. Health Education & Behavior, 24(3), 369–387. https://doi.org/10.1177/109019819702400309
The Photo-Elicitation Interview Structure
Before the group debrief, each participant met with me individually for a photo-elicitation interview. These sessions lasted between sixty and ninety minutes. They were the methodological heart of the study.
Photo-elicitation interviews use participant-produced images as the starting point for conversation, a technique developed to shift the locus of interpretive authority away from the researcher and toward the person whose experience the image documents (Harper, 2002). In conventional interviews, the researcher controls the agenda through the questions. In photo-elicitation, the photograph controls the agenda. The participant explains what they saw, why they took it, what it means to them, what it sits alongside in their daily experience. My role was to follow that explanation carefully, to ask what opened rather than what closed, and to resist the impulse to interpret before I had listened.
I structured each interview in three phases. In the first phase, I invited participants to lay out all their photographs and simply describe them, image by image, without analysis. This warm-up served two purposes: it settled the participant into the conversation, and it gave me a first pass at the full archive before any interpretive work began. In the second phase, I asked participants to select three to five images they considered most significant and to speak about those in depth. I used open prompts throughout this phase: “Tell me more about this one.” “What were you thinking when you took it?” “What does this show that words would flatten?” In the third phase, I asked a set of questions drawn from the SHOWeD protocol, inviting participants to connect individual images to broader patterns in their institutional experience.
I was acutely aware of the power dynamics in these conversations. I was a faculty member interviewing students. I had held institutional authority over some of them in the classroom. I had also, by the time these interviews took place, been laid off by the same institution, a fact I disclosed to participants as part of my positionality and which, in several cases, visibly shifted the quality of what participants were willing to say. The asymmetry remained. It became something we could both see rather than something only I was managing.
All interviews were audio-recorded with participant consent and transcribed verbatim. Participants reviewed their own transcripts as part of the member-checking process described later in this section.
References
Harper, D. (2002). Talking about pictures: A case for photo elicitation. Visual Studies, 17(1), 13–26. https://doi.org/10.1080/14725860220137345
The Group Debrief Session
The group debrief session was the moment the study became something I had designed toward but could only partially have anticipated.
All four participants met together for approximately two hours. I facilitated. My research assistant was present to support the session logistics and to observe dynamics I might miss while I was in the middle of them. The session was audio-recorded with participant consent.
The session opened with each participant selecting one photograph to introduce to the group, briefly explaining what they had captured and why they had chosen it to represent their collection. This opening served as an equaliser: every voice entered the room at the same moment, with equal time and equal weight. What followed was less a linear discussion than a series of recognitions. One participant would describe an experience embedded in their image, and another would respond with resonance rather than analysis: “Yes. That. That is what I mean.”
I guided the group through the SHOWeD protocol across a selection of images, moving from description through systemic analysis toward what participants identified as actionable change. The protocol kept the conversation anchored in the photographs rather than drifting into generalization, which mattered because the photographs were evidence: specific, located, undeniable. When a participant said the institution felt unwelcoming, the image sitting in front of us either confirmed or complicated that claim. The photograph held the conversation accountable to the particular.
The session moved between languages. Participants drew on their full linguistic repertoires as meaning required, sometimes mid-sentence. I had designed for this. The debrief was facilitated in English but conducted in whatever language carried the thought most accurately. This was a healing-centred design choice (Ginwright, 2018): a refusal to treat the session as a performance of English proficiency and an insistence instead on the quality of thinking that the participants brought.
I left that session carrying more than I had arrived with. What I carried was responsibility. Relational accountability, in Wilson’s (2008) framing, is the recognition that research creates obligations. Sitting in that room, I understood those obligations in my bones rather than just in my bibliography.
References
Ginwright, S. (2018). The future of healing: Shifting from trauma informed care to healing centered engagement. Occasional Paper. https://medium.com/@ginwright/the-future-of-healing-shifting-from-trauma-informed-care-to-healing-centered-engagement-634f557ce69c
Wilson, S. (2008). Research is ceremony: Indigenous research methods. Fernwood Publishing.
The Analytical Process in Full
Analysis in this study moved in two directions simultaneously: inward, toward the particularity of each participant’s experience, and outward, toward the structural patterns that individual accounts made visible. The methodology I developed, blended witnessing, required holding both movements at once rather than completing one before beginning the other.
The kitchen table phase came first. I printed all twenty-nine participant photographs at full size and arranged them on a large surface. I worked with coloured tags and lengths of yarn, tracing connections between images before I had words for what the connections were. Jongeling et al. (2016) observe that the embodied act of physically sorting printed photographs activates a different kind of attention than screen-based sorting: it allows for spatial relationship, for placing images in conversation with each other across a surface rather than in sequence down a page. What emerged from the kitchen table was a set of provisional groupings I thought of as families rather than codes: clusters of images that shared something I could feel before I could name it.
Following the analogue phase, I entered all data (photographs, interview transcripts, the group debrief transcript, and my own SPN fieldnotes) into NVivo for systematic coding. I used an IPA framework as the analytical engine. Smith et al. (2009) describe IPA as proceeding through several stages: reading and rereading the data to achieve immersion, noting initial observations and connections, developing emergent themes from those observations, and then identifying patterns across cases once each individual account had been fully mapped in its particularity. I followed this sequence while adding a layer that IPA alone leaves unaddressed: the parallel coding of my own SPN material through what I call protective divergence.
Protective divergence means that my narrative and participants’ narratives were coded in separate streams throughout the analytical process. They were never merged. They were held alongside each other, allowed to resonate, and examined for the places where they rhymed and the places where they diverged. The divergences were as analytically important as the resonances. Where my experience as a contract faculty member aligned with what a participant described, that alignment illuminated the structural conditions we shared. Where my experience departed sharply from a participant’s, that departure illuminated the asymmetry that is central to the study’s theoretical argument. The analytical process was designed to make both visible.
Convergence between the kitchen table findings and the NVivo coding strengthened confidence in the emergent themes. Where the two analytical modes produced different groupings, I treated the divergence as a prompt to return to the data rather than an error to be resolved. Disagreement between analytic modes is information, and I designed the study to honour it.
References
Jongeling, T., Lewis, K., Howat, P., & Wood, L. (2016). Application of a Photovoice methodology: The value of a researcher reflective journal. International Journal of Qualitative Methods, 15(1), 1–12. https://doi.org/10.1177/1609406916636189
Smith, J. A., Flowers, P., & Larkin, M. (2009). Interpretative phenomenological analysis: Theory, method and research. Sage.
Member Checking
Member checking is the process through which a researcher returns interpretations to participants and asks whether those interpretations accurately represent the participants’ experiences (Lincoln & Guba, 1985). In many studies it functions as a procedural checkbox. In this study it functioned as an ethical obligation and an analytical resource.
I returned to each participant individually with a written summary of the themes I had developed from their photographs and interview transcript, along with a set of specific interpretive claims I was considering making. I asked each participant to review the summary and respond to two questions: Does this accurately represent your experience? Is there anything here you would like me to correct, remove, or develop further?
This was a genuine invitation, structured into the research design from the beginning rather than added at the end. Consent in this study was treated as an evolving relationship rather than a one-time signature, and member checking was one of the moments at which that relationship was renewed. Participants held visual sovereignty over their own images throughout the process, and they held interpretive sovereignty over the meanings I was drawing from those images.
The responses I received during member checking shaped the final analysis in substantive ways. Two participants asked me to develop an interpretation further, offering additional context that enriched the analysis considerably. One participant identified a place where my language had flattened a nuance they considered important, and I revised accordingly. That revision made the finding more accurate, which made it more useful. Member checking is sometimes framed as a validation technique, a way of checking whether the researcher has gotten it right. I experienced it as something more interesting than that: a continuation of the analytical conversation that the interviews had begun.
References
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage.
Ghost Data: Absence as Analytical Evidence
The fifth participant enrolled in this study with full intent. She attended the initial orientation session. She signed the consent form. She described, with precision and feeling, what she wanted to photograph. Then she disappeared from the research. Her work schedule had suddenly expanded to cover a gap in her hours, and the time that participation required was time she no longer had.
I could have recorded her withdrawal as attrition and moved on. The methodology required something more honest than that.
I developed the concept of ghost data to name what her empty folder actually represents: structural evidence of how the very conditions this study was designed to examine can consume a participant’s capacity to testify about those conditions before testimony occurs. Ghost data is the data that should exist but was consumed by the system before it could be produced. It is the outer limit of what participation under precarity can produce (Mazzei, 2007).
Treating her absence as data rather than as a methodological shortcoming required a deliberate design choice. It required building an analytical framework that could hold silence alongside speech and absence alongside presence. It required acknowledging that the most precarious participants in a study of precarity may be precisely the ones least available for sustained research engagement, and that this pattern is itself a finding about the system rather than a problem with the study.
Her empty folder sits in the NVivo project. It is labelled. It is present in the analysis. In every section of this dissertation where I write about what the system takes from people before they can speak, I am writing about her as much as I am writing about the four participants who completed the research. She was taken from the study before she could choose to stay or go. The institution made that choice for her.
References
Mazzei, L. A. (2007). Inhabited silence in qualitative research: Putting poststructural theory to work. Peter Lang.
Trustworthiness and Rigour
Qualitative research is rigorous when it earns that description through the deliberateness of its design, the transparency of its process, and the depth of its engagement with the complexity of human experience. The criteria Lincoln and Guba (1985) developed (credibility, transferability, dependability, and confirmability) provide the framework I used to evaluate the quality of this study throughout the research process rather than at the end of it.
Credibility, the qualitative parallel to internal validity, asks whether the findings accurately represent the experiences of the people who participated. I addressed credibility through prolonged engagement with participants across three stages of data collection, through member checking that returned interpretations to participants for review, through the deliberate maintenance of an SPN fieldwork journal that tracked my own interpretive moves in real time, and through the kitchen-table analytical process that foregrounded embodied attention before digital coding began. The dual coding structure (analogue, then digital) also strengthened credibility by requiring my interpretations to survive contact with two different analytical modes.
Transferability, the qualitative parallel to external validity, asks whether findings might resonate in other contexts. I make no claim to generalization. This study is situated in one institution, at one moment in a specific policy history, with four participants whose experiences are particular and irreducible. What I do claim is that the structural conditions this study documents (academic capitalism, epistemic injustice, malperformative inclusion, and asymmetrical precarity) are operational across Canadian postsecondary institutions and that the analytical concepts this study generates are designed to travel. Transferability is the responsibility of readers who encounter this work in their own contexts, and I have provided enough description of the research site, the policy moment, and the participant group to allow that judgment to be made.
Dependability, the qualitative parallel to reliability, asks whether the research process is coherent and traceable. The dissertation includes a full audit trail: ethics approvals, recruitment materials, consent forms, interview protocols, the SHOWeD facilitation guide, the NVivo codebook, and the member-checking records are all documented in the appendices. Any reader who wishes to trace how an interpretation was reached can do so.
Confirmability, the qualitative parallel to objectivity, asks whether the findings emerge from the data rather than from the researcher’s preferences. The answer here is honest: in a study that deliberately integrates researcher experience as analytical data, complete separation of finding from researcher is neither possible nor desirable. What I can confirm is that my own SPN material was coded separately from participant material throughout, that member checking provided a structural check on my interpretations, and that the theoretical framework was developed before data collection began rather than retrofitted to produce a desired result. Confirmability in a study like this one is achieved through transparency rather than through the fiction of neutrality.
References
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage.
Limitations of the Design
I want to be direct about the limitations of this study, because the limitations are part of the intellectual honesty the methodology demands.
The sample is small. Four participants at one institution in one policy moment cannot represent the full range of international student experience in Canadian higher education. I am aware of this, and I have accordingly designed for depth rather than breadth. The limitation is real, and it is also, within an IPA and Photovoice framework, appropriate. Small samples are a feature of these methodologies, a condition of their rigour rather than a compromise of it.
The single-site design means that findings are deeply contextualised within TRU’s specific institutional culture, revenue dependencies, and geographic location. Thompson Rivers University is a comprehensive teaching university in a mid-sized interior British Columbia city with a particular history of access-oriented programming and aggressive international student recruitment. What happens here is shaped by conditions that exist in this precise configuration only here. I have tried to develop the conceptual tools at a level of abstraction sufficient to travel beyond TRU, and I hold open the possibility that they will land differently in other settings.
My insider positioning is both a strength and a limitation. Twenty-five years of teaching at TRU gave me contextual knowledge that no outside researcher could have built during a single study, and it shaped my ability to hear what participants were describing with precision and recognition. It also means that my interpretations are inflected by my own history with this institution, including my layoff, my union involvement, and the accumulated effects of nineteen years of precarious employment. I have named these inflections throughout rather than pretending they are absent, but naming them leaves them present rather than dissolved. They remain in the analysis, visible and accountable.
Finally, the timing of data collection, during the period of acute policy uncertainty that followed the 2024 federal study permit cap, means that this study captures a particular historical moment. The experiences participants documented are real and the structural conditions they reveal are durable, but the specific texture of anxiety and contingency that saturates the photographs is shaped by a policy crisis that will have evolved by the time this dissertation is read. I have tried to distinguish between the structural conditions, which are chronic and systemic, and the policy moment, which is acute and historically specific. Readers should hold both layers in mind.