• Open access
  • Published: 27 May 2020

How to use and assess qualitative research methods

  • Loraine Busetto   ORCID: orcid.org/0000-0002-9228-7875 1 ,
  • Wolfgang Wick 1 , 2 &
  • Christoph Gumbinger 1  

Neurological Research and Practice, volume 2, Article number: 14 (2020)


This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived” , but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based medicine paradigm, as seen in "research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...)" [ 4 ]. This focus on quantitative research and specifically on randomised controlled trials (RCTs) is visible in the idea of a hierarchy of research evidence, which assumes that some research designs are objectively better than others and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 , 8 , 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as "a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ]. Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods, including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 , 10 , 11 , 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it: “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of method, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig. 1 , this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaptation and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Figure 1: Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the review by the researcher of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterised by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or for concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format as this impedes the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider- or researcher-centred bias often found in written surveys, which, by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped, but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and a lesser extent to which each individual may participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Moreover, attention must be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

Figure 2: Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data sources as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig. 3 ). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interviews). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is performed using qualitative data management software, the most common being NVivo, MAXQDA and ATLAS.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
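To make the idea of coding rendering raw data “sortable” concrete, the following minimal Python sketch is purely illustrative and not tied to any of the software packages named above; all sources, codes and text snippets are invented. It shows how coded segments from different data sources could be represented and how every segment carrying a given code can be pulled out for comparison:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CodedSegment:
    source: str        # e.g. "SOP", "ER observation protocol", "patient interview 03"
    text: str          # the raw sentence or paragraph from the protocol/transcript
    codes: List[str]   # one or more short descriptors assigned during coding

# Invented example segments -- in a real study these come from the coded
# protocols and transcripts managed in NVivo, MAXQDA, ATLAS.ti or similar.
segments = [
    CodedSegment("SOP stroke unit",
                 "A tele-neurology consultation is requested within 10 minutes of arrival.",
                 ["tele-neurology consultation", "time target"]),
    CodedSegment("ER observation 2",
                 "The physician waited several minutes for the video link to be established.",
                 ["tele-neurology consultation", "delay"]),
    CodedSegment("Patient interview 03",
                 "I did not understand who the doctor on the screen was.",
                 ["tele-neurology consultation", "patient perspective"]),
]

def segments_with_code(code: str, data: List[CodedSegment]) -> List[CodedSegment]:
    """Return all segments tagged with a given code, across all data sources."""
    return [s for s in data if code in s.codes]

for seg in segments_with_code("tele-neurology consultation", segments):
    print(f"[{seg.source}] {seg.text}")
```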

Figure 3: From data collection to data analysis

Attributions for icons: see Fig. 2 , also “Speech to text” by Trevor Dsouza, “Field Notes” by Mike O’Brien, US, “Voice Record” by ProSymbols, US, “Inspection” by Made, AU, and “Cloud” by Graphic Tigers; all from the Noun Project

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods [ …] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 , 25 , 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed method designs are the convergent parallel design , the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig.  4 .

Figure 4: Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study would be used to understand where and why these occurred, and how they could be improved. In the exploratory design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative findings on the topics about which dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been elevated to the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Checklists

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. Standards for Reporting Qualitative Research (SRQR)) to make sure all items that are relevant for this type of research are addressed [ 23 , 28 ]. Discussions of quantitative measures in addition to or instead of these qualitative measures can be a sign of lower quality of the research (paper). Providing and adhering to a checklist for qualitative research contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting is relevant for all types of research, some additional criteria must be taken into account for qualitative research. This includes what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and population to be researched this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample, “to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ], and to ensure “information-richness” [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [ 29 , 30 ].
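Saturation is a judgement made by the research team rather than a mechanical rule, but the stopping logic of the iterative approach can be pictured with a short sketch. Everything below – the batch size, the invented codes and the simplified “no new codes” criterion – is an assumption made purely for illustration:

```python
# Illustrative only: sample in batches, analyse each batch, and stop once a
# batch yields no relevant new information (simplified here to "no new codes").
batches_of_codes = [                       # codes derived from successive batches of interviews
    {"transport", "cost", "trust"},        # batch 1 (e.g. the first five interviews)
    {"transport", "language", "trust"},    # batch 2 adds "language"
    {"cost", "language"},                  # batch 3 adds nothing new
]

known_codes = set()
for i, batch_codes in enumerate(batches_of_codes, start=1):
    new_codes = batch_codes - known_codes
    known_codes |= batch_codes
    print(f"Batch {i}: {len(new_codes)} new code(s) {sorted(new_codes)}")
    if not new_codes:                      # no relevant new information emerged
        print("Saturation reached - data collection can stop.")
        break
```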

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “ purposive sampling” , in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Piloting

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this is pilot interviews, where different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or which is the best length of an interview with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups which can also be piloted.

Co-coding

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process when a common approach must be defined, including the establishment of a useful coding list (or tree), and when a common meaning of individual codes must be established [ 23 ]. An initial sub-set or all transcripts can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 , 32 , 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 , 38 , 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting “quiet, uncooperative or inarticulate individuals” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess the extent to which the coding of two co-coders overlaps. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. This means that these scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but it is not a requirement. Relatedly, it is not relevant for the quality or “objectivity” of qualitative research to separate those who recruited the study participants and collected and analysed the data. Experience even shows that it might be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher conducting the interviews will usually remember the interviewee and the specific interview situation during data analysis. This might be helpful in providing additional context information for interpretation of data, e.g. on whether something might have been meant as a joke [ 18 ].
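The paper does not prescribe any particular agreement statistic; where such a score is reported, Cohen’s kappa is one common choice. The sketch below is a minimal illustration only – the segment codes are invented, and real coding rarely reduces to a single code per segment:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e): p_o is the observed agreement,
    p_e the agreement expected by chance given each coder's code frequencies."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Invented example: the code each of two co-coders assigned to ten transcript segments
coder_a = ["delay", "delay", "staffing", "SOP", "delay",
           "staffing", "SOP", "delay", "SOP", "staffing"]
coder_b = ["delay", "staffing", "staffing", "SOP", "delay",
           "staffing", "delay", "delay", "SOP", "staffing"]

print(f"Cohen's kappa: {cohens_kappa(coder_a, coder_b):.2f}")  # about 0.70 here
```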

Not being quantitative research

Being qualitative research instead of quantitative research should not be used as an assessment criterion if it is used irrespectively of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged as inherently better than single-method research. In this case, the same criterion should be applied for quantitative studies without a qualitative component.

Conclusion

The main take-away points of this paper are summarised in Table 1 . We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Availability of data and materials

Not applicable.

Abbreviations

EVT: Endovascular treatment

RCT: Randomised Controlled Trial

SOP: Standard Operating Procedure

SRQR: Standards for Reporting Qualitative Research

Philipsen, H., & Vernooij-Dassen, M. (2007). Kwalitatief onderzoek: nuttig, onmisbaar en uitdagend. In L. PLBJ & H. TCo (Eds.), Kwalitatief onderzoek: Praktische methoden voor de medische praktijk . [Qualitative research: useful, indispensable and challenging. In: Qualitative research: Practical methods for medical practice (pp. 5–12). Houten: Bohn Stafleu van Loghum.


Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative approaches . London: Sage.

Kelly, J., Dwyer, J., Willis, E., & Pekarsky, B. (2014). Travelling to the city for hospital care: Access factors in country aboriginal patient journeys. Australian Journal of Rural Health, 22 (3), 109–113.


Nilsen, P., Ståhl, C., Roback, K., & Cairney, P. (2013). Never the twain shall meet? - a comparison of implementation science and policy implementation research. Implementation Science, 8 (1), 1–12.

Howick J, Chalmers I, Glasziou, P., Greenhalgh, T., Heneghan, C., Liberati, A., Moschetti, I., Phillips, B., & Thornton, H. (2011). The 2011 Oxford CEBM evidence levels of evidence (introductory document) . Oxford Center for Evidence Based Medicine. https://www.cebm.net/2011/06/2011-oxford-cebm-levels-evidence-introductory-document/ .

Eakin, J. M. (2016). Educating critical qualitative health researchers in the land of the randomized controlled trial. Qualitative Inquiry, 22 (2), 107–118.

May, A., & Mathijssen, J. (2015). Alternatieven voor RCT bij de evaluatie van effectiviteit van interventies!? Eindrapportage. In Alternatives for RCTs in the evaluation of effectiveness of interventions!? Final report .


Berwick, D. M. (2008). The science of improvement. Journal of the American Medical Association, 299 (10), 1182–1184.


Christ, T. W. (2014). Scientific-based research and randomized controlled trials, the “gold” standard? Alternative paradigms and mixed methodologies. Qualitative Inquiry, 20 (1), 72–80.

Lamont, T., Barber, N., Jd, P., Fulop, N., Garfield-Birkbeck, S., Lilford, R., Mear, L., Raine, R., & Fitzpatrick, R. (2016). New approaches to evaluating complex health and care systems. BMJ, 352:i154.

Drabble, S. J., & O’Cathain, A. (2015). Moving from Randomized Controlled Trials to Mixed Methods Intervention Evaluation. In S. Hesse-Biber & R. B. Johnson (Eds.), The Oxford Handbook of Multimethod and Mixed Methods Research Inquiry (pp. 406–425). London: Oxford University Press.

Chambers, D. A., Glasgow, R. E., & Stange, K. C. (2013). The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science : IS, 8 , 117.

Hak, T. (2007). Waarnemingsmethoden in kwalitatief onderzoek. In L. PLBJ & H. TCo (Eds.), Kwalitatief onderzoek: Praktische methoden voor de medische praktijk . [Observation methods in qualitative research] (pp. 13–25). Houten: Bohn Stafleu van Loghum.

Russell, C. K., & Gregory, D. M. (2003). Evaluation of qualitative research studies. Evidence Based Nursing, 6 (2), 36–40.

Fossey, E., Harvey, C., McDermott, F., & Davidson, L. (2002). Understanding and evaluating qualitative research. Australian and New Zealand Journal of Psychiatry, 36 , 717–732.

Yanow, D. (2000). Conducting interpretive policy analysis (Vol. 47). Thousand Oaks: Sage University Papers Series on Qualitative Research Methods.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22 , 63–75.

van der Geest, S. (2006). Participeren in ziekte en zorg: meer over kwalitatief onderzoek. Huisarts en Wetenschap, 49 (4), 283–287.

Hijmans, E., & Kuyper, M. (2007). Het halfopen interview als onderzoeksmethode. In L. PLBJ & H. TCo (Eds.), Kwalitatief onderzoek: Praktische methoden voor de medische praktijk . [The half-open interview as research method (pp. 43–51). Houten: Bohn Stafleu van Loghum.

Jansen, H. (2007). Systematiek en toepassing van de kwalitatieve survey. In L. PLBJ & H. TCo (Eds.), Kwalitatief onderzoek: Praktische methoden voor de medische praktijk . [Systematics and implementation of the qualitative survey (pp. 27–41). Houten: Bohn Stafleu van Loghum.

Pv, R., & Peremans, L. (2007). Exploreren met focusgroepgesprekken: de ‘stem’ van de groep onder de loep. In L. PLBJ & H. TCo (Eds.), Kwalitatief onderzoek: Praktische methoden voor de medische praktijk . [Exploring with focus group conversations: the “voice” of the group under the magnifying glass (pp. 53–64). Houten: Bohn Stafleu van Loghum.

Carter, N., Bryant-Lukosius, D., DiCenso, A., Blythe, J., & Neville, A. J. (2014). The use of triangulation in qualitative research. Oncology Nursing Forum, 41 (5), 545–547.

Boeije, H. (2012). Analyseren in kwalitatief onderzoek: Denken en doen [Analysis in qualitative research: Thinking and doing]. Den Haag: Boom Lemma uitgevers.

Hunter, A., & Brewer, J. (2015). Designing Multimethod Research. In S. Hesse-Biber & R. B. Johnson (Eds.), The Oxford Handbook of Multimethod and Mixed Methods Research Inquiry (pp. 185–205). London: Oxford University Press.

Archibald, M. M., Radil, A. I., Zhang, X., & Hanson, W. E. (2015). Current mixed methods practices in qualitative research: A content analysis of leading journals. International Journal of Qualitative Methods, 14 (2), 5–33.

Creswell, J. W., & Plano Clark, V. L. (2011). Choosing a Mixed Methods Design. In Designing and Conducting Mixed Methods Research . Thousand Oaks: SAGE Publications.

Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. BMJ, 320 (7226), 50–52.

O'Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for reporting qualitative research: A synthesis of recommendations. Academic Medicine : Journal of the Association of American Medical Colleges, 89 (9), 1245–1251.

Saunders, B., Sim, J., Kingstone, T., Baker, S., Waterfield, J., Bartlam, B., Burroughs, H., & Jinks, C. (2018). Saturation in qualitative research: Exploring its conceptualization and operationalization. Quality and Quantity, 52 (4), 1893–1907.

Moser, A., & Korstjens, I. (2018). Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis. European Journal of General Practice, 24 (1), 9–18.

Marlett, N., Shklarov, S., Marshall, D., Santana, M. J., & Wasylak, T. (2015). Building new roles and relationships in research: A model of patient engagement research. Quality of Life Research : an international journal of quality of life aspects of treatment, care and rehabilitation, 24 (5), 1057–1067.

Demian, M. N., Lam, N. N., Mac-Way, F., Sapir-Pichhadze, R., & Fernandez, N. (2017). Opportunities for engaging patients in kidney research. Canadian Journal of Kidney Health and Disease, 4 , 2054358117703070–2054358117703070.

Noyes, J., McLaughlin, L., Morgan, K., Roberts, A., Stephens, M., Bourne, J., Houlston, M., Houlston, J., Thomas, S., Rhys, R. G., et al. (2019). Designing a co-productive study to overcome known methodological challenges in organ donation research with bereaved family members. Health Expectations . 22(4):824–35.

Piil, K., Jarden, M., & Pii, K. H. (2019). Research agenda for life-threatening cancer. European Journal Cancer Care (Engl), 28 (1), e12935.

Hofmann, D., Ibrahim, F., Rose, D., Scott, D. L., Cope, A., Wykes, T., & Lempp, H. (2015). Expectations of new treatment in rheumatoid arthritis: Developing a patient-generated questionnaire. Health Expectations : an international journal of public participation in health care and health policy, 18 (5), 995–1008.

Jun, M., Manns, B., Laupacis, A., Manns, L., Rehal, B., Crowe, S., & Hemmelgarn, B. R. (2015). Assessing the extent to which current clinical research is consistent with patient priorities: A scoping review using a case study in patients on or nearing dialysis. Canadian Journal of Kidney Health and Disease, 2 , 35.

Elsie Baker, S., & Edwards, R. (2012). How many qualitative interviews is enough? In National Centre for Research Methods Review Paper . National Centre for Research Methods. http://eprints.ncrm.ac.uk/2273/4/how_many_interviews.pdf .

Sandelowski, M. (1995). Sample size in qualitative research. Research in Nursing & Health, 18 (2), 179–183.

Sim, J., Saunders, B., Waterfield, J., & Kingstone, T. (2018). Can sample size in qualitative research be determined a priori? International Journal of Social Research Methodology, 21 (5), 619–634.


Funding

No external funding.

Author information

Authors and Affiliations

Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120, Heidelberg, Germany

Loraine Busetto, Wolfgang Wick & Christoph Gumbinger

Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Wolfgang Wick


Contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final versions.

Corresponding author

Correspondence to Loraine Busetto .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Busetto, L., Wick, W. & Gumbinger, C. How to use and assess qualitative research methods. Neurol. Res. Pract. 2 , 14 (2020). https://doi.org/10.1186/s42466-020-00059-z


Received : 30 January 2020

Accepted : 22 April 2020

Published : 27 May 2020

DOI : https://doi.org/10.1186/s42466-020-00059-z


Keywords

  • Qualitative research
  • Mixed methods
  • Quality assessment




Open Access

Qualitative Research: Understanding Patients' Needs and Experiences

  • The PLoS Medicine Editors


Published: August 28, 2007

  • https://doi.org/10.1371/journal.pmed.0040258

Citation: The PLoS Medicine Editors (2007) Qualitative Research: Understanding Patients' Needs and Experiences. PLoS Med 4(8): e258. https://doi.org/10.1371/journal.pmed.0040258

Copyright: © 2007 The PLoS Medicine Editors. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Why do up to half of all patients with tuberculosis (TB) fail to adhere to drug treatment [ 1 ]? The answer to this question is a matter of life and death, since nonadherence contributes to disease relapse and mortality [ 2 ]. In last month's PLoS Medicine, Salla Munro and colleagues argue that qualitative studies—in which researchers listen to what patients, care givers, and health care providers have to say—can provide important insights into why nonadherence occurs [ 3 ]. Their paper is a “meta-ethnography” [ 4 ], a systematic review and synthesis of qualitative studies on adherence to TB medication. The review found a wide array of factors to explain nonadherence, such as the belief that if one's symptoms have disappeared there is no need to finish a course of treatment. We published this review because we thought it would play a role in improving the delivery of TB treatment and ultimately in reducing the enormous global burden of the disease.

PLoS Medicine has now published two such meta-ethnographies (the first looked at adherence to HIV medication [ 5 ]). We have also published a small number of individual qualitative studies. For example, in our special issue on social medicine ( http://collections.plos.org/plosmedicine/socialmedicine-2006.php ), we published a qualitative study of migrant workers in the US that found that farm working and housing conditions are organized according to ethnicity and citizenship and that this hierarchy determines health disparities [ 6 ]. We have been very selective in our editorial decisions about which qualitative studies to publish. In our decision-making process, we have been guided by two crucial questions.

The first question is whether a qualitative approach was the right way to answer the research question. Quantitative research strives to be objective: human beings, health, and illness are the objects of investigation. Such investigation has led to extraordinary biomedical advances—yet patients often fail to reap the benefits because health professionals may not understand how best to deliver them in the context of patients' multifaceted lives. The academic editor of Salla Munro and colleagues' study commented that thinking of TB drugs simply as a “biomedical intervention” without factoring in patients' needs and broader social contexts creates circumstances that increase the likelihood of poor adherence to treatment. Qualitative research is the best way to understand these needs and contexts.

Astrid Fletcher and colleagues, for example, used quantitative methods to objectively determine who (in terms of age, sex, and education level) did not use the eye-care services available in India [ 7 ]. But they adopted a qualitative approach to answer the question of why people did not use these services. David Leon and colleagues, during a quantitative study on hazardous alcohol drinking in Russia, learned that much alcohol was consumed in the form of what were described as “surrogates” [ 8 ]. Qualitative research helped to identify what these surrogates were—they included eau de Cologne and over-the-counter medications.

When researchers investigate the experiences of people receiving or failing to receive health care, identify themes in these subjective stories, and integrate these themes into the greater context of human life experience, the results are informative to care providers. The usefulness of these results lies precisely in their subjectivity: the subjects are telling us, or we are finding out through more subtle observation, what matters to them.

The results of qualitative research can also help to inform the very process of research itself. Qualitative approaches can help us to understand, for example, why some patients decline to participate in clinical trials [ 9 ], or how patients experience the trial process itself. They can even be used to refine or improve a clinical trial in “real time.” In a trial of a computerized decision support tool for patients with atrial fibrillation being considered for anticoagulation treatment, Madeleine Murtagh and colleagues used qualitative evidence in deciding to discontinue one arm of the trial (the intervention in that arm was causing confusion amongst the patients and was unlikely to produce valid data) [ 10 ]. When a quantitative study is assessing the effectiveness of a complex multifaceted intervention, qualitative methods can help to tease out why such an intervention works or fails [ 11 ]. Qualitative approaches can also help to identify which of many possible research questions should receive priority for investigation, often by asking the research participants themselves. For example, patients with asthma may value easy-to-use inhalers more highly than a new class of drug.

Once it is clear that qualitative methods constitute the right approach for a study submitted to PLoS Medicine, the second question is whether the study meets our criteria for rigor and relevance. For a study to be suitable, regardless of the methodology, it should address an important topic in clinical medicine or public health and it should have the potential to transform our understanding of the causes or treatment of disease. In assessing any study, quantitative or qualitative, we are always on the lookout for biases, poorly described methods, and limited generalizability or overinterpretation of the data. In specifically assessing qualitative studies, we additionally wish to be reassured that the researchers used some type of “quality control” in analyzing the data—for example, were the data independently analyzed by at least two researchers and did consistent themes emerge from the data each time?

One characteristic of PLoS Medicine is the very broad range of research that we have published to date. We feel that such a range is appropriate for a medical journal, since understanding the complex nature of illness and health care requires a variety of different research approaches. “What is involved is not a crossroads where we have to go left or right,” Martyn Hammersley has argued in a discussion of the false dichotomy between quantitative and qualitative research. “A better analogy is a complex maze where we are repeatedly faced with decisions, and where paths wind back on one another” [ 12 ].

  • 2. Volmink J, Garner P (2006) Directly observed therapy for treating tuberculosis. Cochrane Database Syst Rev 2, CD003343.
  • 4. Noblit GW, Hare RD (1988) Meta-ethnography: Synthesizing qualitative studies. Newbury Park (CA): Sage. 88 p.
  • 12. Hammersley M (1992) Deconstructing the qualitative–quantitative divide. In: Brannen J, editor. Mixing methods: Qualitative and quantitative research. Aldershot (United Kingdom): Avebury. pp. 39–55.



What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari. Revised on September 5, 2024.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research , which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

Example qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?


Approaches to qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography , action research , phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Qualitative research approaches:

  • Grounded theory: Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography: Researchers immerse themselves in groups or organizations to understand their cultures.
  • Action research: Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research: Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research: Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect , observer bias , recall bias , and social desirability bias . While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Qualitative research methods

Each of the research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys : distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, a study of a company’s culture might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
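
The steps above can be kept transparent even with very simple tooling. As a minimal, purely illustrative sketch (not taken from any of the articles compiled here), the Python snippet below shows one way to record codes assigned to interview excerpts and tally which candidate themes they feed into; the excerpts, code names, and theme groupings are all hypothetical.

```python
from collections import Counter

# Hypothetical coded excerpts: each entry pairs an interview fragment
# with the codes a researcher assigned to it (step 4 above).
coded_excerpts = [
    {"participant": "P01", "text": "The bus only runs twice a week.",
     "codes": ["transport_barrier", "rural_access"]},
    {"participant": "P02", "text": "I didn't know who to call about the referral.",
     "codes": ["information_gap"]},
    {"participant": "P03", "text": "Getting to the clinic means a full day off work.",
     "codes": ["transport_barrier", "opportunity_cost"]},
]

# Hypothetical theme definitions: themes group related codes (step 5 above).
themes = {
    "access_barriers": {"transport_barrier", "rural_access", "opportunity_cost"},
    "communication": {"information_gap"},
}

# Count how often each code appears across all excerpts.
code_counts = Counter(code for e in coded_excerpts for code in e["codes"])

# Count how many excerpts touch each theme (an excerpt counts once per theme).
theme_counts = Counter(
    theme
    for e in coded_excerpts
    for theme, codes in themes.items()
    if codes & set(e["codes"])
)

print("Code frequencies:", dict(code_counts))
print("Excerpts per theme:", dict(theme_counts))
```

Dedicated qualitative data analysis software provides the same bookkeeping with audit trails and retrieval tools, but the underlying structure (excerpts, codes, and themes) is the same.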

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative data analysis approaches

  • Content analysis: to describe and categorize common words, phrases, and ideas in qualitative data. Example: a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis: to identify and interpret patterns and themes in qualitative data. Example: a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis: to examine the content, structure, and design of texts. Example: a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis: to study communication and how language is used to achieve effects in specific contexts. Example: a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.
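
To make the content analysis example concrete, here is a small, purely illustrative Python sketch (not part of the cited articles) that counts how often invented "clinical" versus "lifestyle" keywords appear in invented app descriptions; real content analysis would of course use a richer, theory-driven coding frame.

```python
import re
from collections import Counter

# Hypothetical descriptions of therapeutic apps, as in the market research example above.
descriptions = [
    "A calming companion to help you manage stress and sleep better.",
    "Evidence-based CBT exercises to reduce anxiety and build resilience.",
    "Track your mood daily and share progress with your therapist.",
]

# Hypothetical keyword groups the researcher wants to count.
keyword_groups = {
    "clinical": {"cbt", "evidence-based", "therapist", "anxiety"},
    "lifestyle": {"calming", "stress", "sleep", "mood"},
}

def tokenize(text: str) -> set:
    """Lowercase a description and return its set of word tokens (keeping hyphenated words)."""
    return set(re.findall(r"[a-z]+(?:-[a-z]+)*", text.lower()))

counts = Counter()
for description in descriptions:
    tokens = tokenize(description)
    for group, keywords in keyword_groups.items():
        counts[group] += len(tokens & keywords)

print(dict(counts))  # -> {'clinical': 4, 'lifestyle': 4}
```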

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge; it is not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated . The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population .

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .

Cite this Scribbr article

Bhandari, P. (2024, September 05). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved September 18, 2024, from https://www.scribbr.com/methodology/qualitative-research/

Understanding qualitative research in health care

Volume 55, Issue 2

Qualitative studies are often used to research phenomena that are difficult to quantify numerically.1,2 These may include concepts, feelings, opinions, interpretations and meanings, or why people behave in a certain way. Although qualitative research is often described in opposition to quantitative research, the approaches are complementary, and many researchers use mixed methods in their projects, combining the strengths of both approaches.2 Many comprehensive texts exist on qualitative research methodology including those with a focus on healthcare related research.2-4 Here we give a brief introduction to the rationale, methods and quality assessment of qualitative research.

https://doi.org/10.1136/dtb.2017.2.0457

  • Open access
  • Published: 13 December 2018

Using qualitative Health Research methods to improve patient and public involvement and engagement in research

  • Danielle E. Rolfe 1 ,
  • Vivian R. Ramsden 2 ,
  • Davina Banner 3 &
  • Ian D. Graham 1  

Research Involvement and Engagement volume  4 , Article number:  49 ( 2018 ) Cite this article

45k Accesses

71 Citations

23 Altmetric

Metrics details

Plain English summary

Patient engagement (or patient and public involvement) in health research is becoming a requirement for many health research funders, yet many researchers have little or no experience in engaging patients as partners as opposed to research subjects. Additionally, many patients have no experience providing input on the research design or acting as a decision-making partner on a research team. Several potential risks exist when patient engagement is done poorly, despite best intentions. Some of these risks are that: (1) patients’ involvement is merely tokenism (patients are involved but their suggestions have little influence on how research is conducted); (2) engaged patients do not represent the diversity of people affected by the research; and, (3) research outcomes lack relevance to patients’ lives and experiences.

Qualitative health research (the collection and systematic analysis of non-quantitative data about peoples’ experiences of health or illness and the healthcare system) offers several approaches that can help to mitigate these risks. Several qualitative health research methods, when done well, can help research teams to: (1) accurately incorporate patients’ perspectives and experiences into the design and conduct of research; (2) engage diverse patient perspectives; and, (3) treat patients as equal and ongoing partners on the research team.

This commentary presents several established qualitative health research methods that are relevant to patient engagement in research. The hope is that this paper will inspire readers to seek more information about qualitative health research, and consider how its established methods may help improve the quality and ethical conduct of patient engagement for health research.

Research funders in several countries have posited a new vision for research that involves patients and the public as co-applicants for the funding, and as collaborative partners in decision-making at various stages and/or throughout the research process. Patient engagement (or patient and public involvement) in health research is presented as a more democratic approach that leads to research that is relevant to the lives of the people affected by its outcomes. What is missing from the recent proliferation of resources and publications detailing the practical aspects of patient engagement is a recognition of how existing research methods can inform patient engagement initiatives. Qualitative health research, for example, has established methods of collecting and analyzing non-quantitative data about individuals’ and communities’ lived experiences with health, illness and/or the healthcare system. Included in the paradigm of qualitative health research is participatory health research, which offers approaches to partnering with individuals and communities to design and conduct research that addresses their needs and priorities.

The purpose of this commentary is to explore how qualitative health research methods can inform and support meaningful engagement with patients as partners. Specifically, this paper addresses issues of: rigour (how can patient engagement in research be done well?); representation (are the right patients being engaged?); and, reflexivity (is engagement being done in ways that are meaningful, ethical and equitable?). Various qualitative research methods are presented to increase the rigour found within patient engagement. Approaches to engage more diverse patient perspectives are presented to improve representation beyond the common practice of engaging only one or two patients. Reflexivity, or the practice of identifying and articulating how research processes and outcomes are constructed by the respective personal and professional experiences of researchers and patients, is presented to support the development of authentic, sustainable, equitable and meaningful engagement of patients as partners in health research.

Conclusions

Researchers will need to engage patients as stakeholders in order to satisfy the overlapping mandate in health policy, care and research for engaging patients as partners in decision-making. This paper presents several suggestions to ground patient engagement approaches in established research designs and methods.

Patient engagement (or patient and public involvement) in research involves partnering with ‘patients’ (a term more often used in Canada and the US, that is inclusive of individuals, caregivers, and/or members of the public) to facilitate research related to health or healthcare services. Rather than research subjects or participants, patients are engaged as partners in the research process. This partnership is intended to be meaningful and ongoing, from the outset of planning a research project, and/or at various stages throughout the research process. Engagement can include the involvement of patients in defining a research question, identifying appropriate outcomes and methods, collecting and interpreting data, and developing and delivering a knowledge translation strategy [ 1 ].

The concept of engaging non-researchers throughout the research process is not new to participatory health researchers, or integrated knowledge translation researchers, as the latter involves ongoing collaboration with clinicians, health planners and policy makers throughout the research process in order to generate new knowledge [ 2 , 3 ]. Patients, however, are less frequently included as partners on health research teams, or as knowledge users in integrated knowledge translation research teams compared to clinicians, healthcare managers and policy-makers, as these individuals are perceived as having “the authority to invoke change in the practice or policy setting.” (p.2) [ 2 ] Recent requirements for patient engagement by health research funders [ 4 , 5 , 6 ], and mandates by most healthcare planners and organizations to engage patients in healthcare improvement initiatives, suggest that it would be prudent for integrated knowledge translation (and indeed all) health researchers to begin engaging patients as knowledge users in many, if not all, of their research projects.

Training and tools for patient engagement are being developed and implemented in Canada via the Canadian Institutes for Health Research (CIHR) Strategy for Patient Oriented Research (SPOR) initiative, in the US via the Patient Centered Outcomes Research Institute (PCORI), and very practical resources are already available from the UK’s more established INVOLVE Advisory Group [ 5 , 6 , 7 ]. What these ‘get started’ guides seldom provide, however, are rigorous methods and evidence-based approaches to engaging diverse patient perspectives and ensuring that patients’ experiences, values and advice are appropriately incorporated into the research process.

The purpose of this commentary is to stimulate readers’ further discussion and inquiry into qualitative health research methods as a means of fostering more meaningful engagement of patients as partners for research. Specifically, this paper will address issues of: rigour (how do we know that the interpretation of patients’ perspectives has been done well and is applicable to other patients?); representation (are multiple and diverse patient perspectives being sought?); and, reflexivity (is engagement being done ethically and equitably?). This commentary alone is insufficient to guide researchers and patient partners to use the methods presented as part of their patient engagement efforts. However, with increased understanding of these approaches and perhaps guidance from experienced qualitative health researchers, integrated knowledge translation and health researchers alike may be better prepared to engage patients in a meaningful way in research that has the potential to improve health and healthcare experiences and outcomes.

What can be learned from methods utilized in qualitative health research?

There is wide variation in researchers’ and healthcare providers’ openness to engaging patients [ 8 ]. Often, the patients that are engaged are a select group of individuals known to the research team, sometimes do not reflect the target population of the research, are involved at a consultative rather than a partnership level, and are more likely to be involved in the planning rather than the dissemination of research [ 9 , 10 , 11 ]. As a result, patient engagement can be seen as tokenistic and the antithesis of the intention of most patient engagement initiatives, which is to have patients’ diverse experiences and perspectives help to shape what and how research is done. The principles, values, and practices of qualitative health research (e.g., relativism, social equity, inductive reasoning) have rich epistemological traditions that align with the conceptual and practical spirit of patient engagement. It is beyond the scope of this commentary, however, to describe in detail the qualitative research paradigm, and readers are encouraged to gain greater knowledge of this topic via relevant courses and texts. Nevertheless, several qualitative research considerations and methods can be applied to the practice of patient engagement, and the following sections describe three of these: rigour, representation and reflexivity.

Rigour: Interpreting and incorporating patients’ experiences into the design and conduct of research

When patient engagement strategies go beyond the inclusion of a few patient partners on the research team, for example, by using focus groups, interviews, community forums, or other methods of seeking input from a broad range of patient perspectives, the diversity of patients’ experiences or perspectives may be a challenge to quickly draw conclusions from in order to make decisions about the study design. To make these decisions, members of the research team (which should include patient partners) may discuss what they heard about patients’ perspectives and suggestions, and then unsystematically incorporate these suggestions, or they may take a vote, try to achieve consensus, implement a Delphi technique [ 12 ], or use another approach designed specifically for patient engagement like the James Lind Alliance technique for priority setting [ 13 ]. Although the information gathered from patients is not data (and indeed would require ethical review to be used as such), a number of qualitative research practices designed to increase rigour can be employed to help ensure that the interpretation and incorporation of patients’ experiences and perspectives has been done systematically and could be reproduced [ 14 ]. These practices include member checking , dense description , and constant comparative analysis . To borrow key descriptors of rigour from qualitative research, these techniques improve “credibility” (i.e., accurate representations of patients’ experiences and preferences that are likely to be understood or recognized by other patients in similar situations – known in quantitative research as internal validity), and “transferability” (or the ability to apply what was found among a group of engaged patients to other patients in similar contexts – known in quantitative research as external validity) [ 15 ].

Member checking

Member checking in qualitative research involves “taking ideas back to the research participants for their confirmation” (p. 111) [ 16 ]. The objective of member checking is to ensure that a researcher’s interpretation of the data (whether a single interview with a participant, or after analyzing several interviews with participants) accurately reflects the participants’ intended meaning (in the case of a member check with a single participant about their interview), or their lived experience (in the case of sharing an overall finding about several individuals with one or more participants) [ 16 ]. For research involving patient engagement, member checking can be utilized to follow-up with patients who may have been engaged at one or only a few time points, or on an on-going basis with patient partners. A summary of what was understood and what decisions were made based on patients’ recommendations could be used to initiate this discussion and followed up with questions such as, “have I understood correctly what you intended to communicate to me?” or “do you see yourself or your experience(s) reflected in these findings or suggestions for the design of the study?”

Dense description

As with quantitative research, detailed information about qualitative research methods and study participants is needed to enable other researchers to understand the context and focus of the research and to establish how these findings relate more broadly. This helps researchers to not only potentially repeat the study, but to extend its findings to similar participants in similar contexts. Dense description provides details of the social, demographic and health profile of participants (e.g., gender, education, health conditions, etc.), as well as the setting and context of their experiences (i.e., where they live, what access to healthcare they have). In this way, dense description improves the transferability of study findings to similar individuals in similar situations [ 15 ]. To date, most studies involving patient engagement provide limited details about their engagement processes and who was engaged [ 17 ]. This omission may be done intentionally (e.g., to protect the privacy of engaged patients, particularly those with stigmatizing health conditions), or as a practical constraint such as publication word limits. Nonetheless, reporting of patient engagement using some aspects of dense description of participants (as appropriate), the ways that they were engaged, and recommendations that emanated from engaged patients can also contribute to greater transferability and understanding of how patient engagement influenced the design of a research study.

Constant comparative analysis

Constant comparative analysis is a method commonly used in grounded theory qualitative research [ 18 ]. Put simply, the understanding of a phenomenon or experience that a researcher acquires through engaging with participants is constantly redeveloped and refined based on subsequent participant interactions. This process of adapting to new information in order to make it more relevant is similar to processes used in rapid cycle evaluation during implementation research [ 19 ]. This method can be usefully adapted and applied to research involving ongoing collaboration and partnership with several engaged patient partners, and/or engagement strategies that seek the perspectives of many patients at various points in the research process. For example, if, in addition to having ongoing patient partners, a larger group of patients provides input and advice (e.g., a steering or advisory committee) at different stages in the research process, their input may result in multiple course corrections during the design and conduct of the research processes to incorporate their suggestions. These suggestions may result in refinement of earlier decisions made about study design or conduct, and as such, the research process becomes more iterative rather than linear. In this way, engaged patients and patient partners are able to provide their input and experience to improve each step of the research process from formulating an appropriate research question or objective, determining best approaches to conducting the research and sharing it with those most affected by the outcomes.

Representation: Gathering diverse perspectives to design relevant and appropriate research studies

The intention of engaging patients is to have their lived experience of health care or a health condition contribute to the optimization of a research project design [ 20 ]. Development of a meaningful and sustainable relationship with patient partners requires considerable time, a demonstrated commitment to partnership by both the patient partners and the researcher(s), resources to facilitate patient partners’ engagement, and often, an individual designated to support the development of this relationship [ 17 , 21 ]. This may lead some research teams to sustain this relationship with only one or two patients who are often previously known to the research team [ 17 ]. The limitation of this approach is that the experiences of these one or two individuals may not adequately reflect the diverse perspectives of patients that may be affected by the research or its outcomes. The notion of gaining ‘ the patient perspective’ from a single or only a few individuals has already been problematized [ 22 , 23 ]. To be sure, the engagement of a single patient is better than none at all, but the engagement of a broader and diverse population of patients should be considered to better inform the research design, and to help prevent further perpetuation of health disparities. Key issues to be considered include (1) how engagement can be made accessible to patients from diverse backgrounds, and (2) which engagement strategies (e.g., ranging from a community information forum to full partnership on the research team) are most appropriate to reach the target population [ 24 ].

Making engagement accessible

Expecting patient partner(s) to attend regular research team meetings held during working hours in a boardroom setting in a hospital, research institute or university limits the participation of many individuals. To support the participation and diversity of engaged patients, effort should be made to increase the accessibility and emotional safety of engagement initiatives [ 25 ]. A budget must be allocated for patient partners’ transportation, childcare or caregiving support, remuneration for time or time taken off work and, at the very least, covering expenses related to their engagement. Another consideration that is often made by qualitative health researchers is whether brief counselling support can be provided to patients should the sharing of their experiences result in emotional distress. There are some resources that can help with planning for costs [ 26 ], including an online cost calculator [ 27 ].

Engagement strategies

Patient partners can be coached to consider the needs and experiences of people unlike them, but there are other methods of engagement that can help to gain a more fulsome perspective of what is likely a diverse patient population that is the focus of the research study. In qualitative health research, this is known as purposeful or purposive sampling: finding people who can provide information-rich descriptions of the phenomenon under study [ 28 ]. Engagement may require different approaches (e.g., deliberative group processes, community forums, focus groups, and patient partners on the research team), at different times in the research process to reach different individuals or populations (e.g., marginalized patients, or patients or caregivers experiencing illnesses that inhibit their ability to maintain an ongoing relationship with the research team). Engagement strategies of different forms at different times may be required. For example, ongoing engagement may occur with patient partners who are members of the research team (e.g., co-applicants on a research grant), and intermittent engagement may be sought from other patients through other methods that may be more time-limited or accessible to a diverse population of patients (e.g., a one-time focus group, community forum, or ongoing online discussion) to address issues that may arise during various stages of the research or dissemination processes. The result of this approach is that patients are not only consulted or involved (one-time or low commitment methods), but are also members of the research team and have the ability to help make decisions about the research being undertaken.

Engagement can generate a wealth of information from very diverse perspectives. Each iteration of engagement may yield new information. Knowing when enough information has been gathered to make decisions with the research team (that includes patient partners) about how the research may be designed or conducted can be challenging. One approach from qualitative research that can be adapted for patient engagement initiatives is theoretical saturation [ 29 ], or “the point in analysis when…further data gathering and analysis add little new to the conceptualization, though variations can always be discovered.” (p. 263) [ 18 ]. That is, a one-time engagement strategy (e.g., a discussion with a single patient partner) may be insufficient to acquire the diverse perspectives of the individuals that will be affected by the research or its outcomes. Additional strategies (e.g., focus groups or interviews with several individuals) may be initiated until many patients identify similar issues or recommendations.
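
The idea of saturation can be made concrete with a small bookkeeping sketch. The Python snippet below is purely illustrative (it is not a procedure proposed by the authors): it tracks which issues are newly raised in each engagement round and reports the point after which further rounds added nothing new. The rounds, issues, and threshold are all hypothetical.

```python
# Hypothetical issues raised in successive engagement activities
# (e.g., a focus group, a community forum, a set of interviews).
rounds = [
    {"transport costs", "appointment times", "language barriers"},
    {"transport costs", "childcare", "appointment times"},
    {"language barriers", "childcare"},
    {"transport costs"},
]

def last_informative_round(rounds, quiet_rounds_needed=2):
    """Return the 1-based index of the last round that raised a new issue,
    once `quiet_rounds_needed` consecutive rounds have added nothing new;
    return None if that stopping point is never reached."""
    seen = set()
    quiet = 0
    for i, issues in enumerate(rounds, start=1):
        newly_raised = issues - seen
        seen |= issues
        quiet = 0 if newly_raised else quiet + 1
        if quiet >= quiet_rounds_needed:
            return i - quiet_rounds_needed
    return None

print(last_informative_round(rounds))  # -> 2: rounds 3 and 4 raised no new issues
```

In practice, judging saturation involves interpretation rather than simple counting, but keeping an explicit record of what new issues each round contributed supports the kind of transparent decision-making described above.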

Engagement approaches should also consider: how patients are initially engaged (e.g., through known or new networks, posted notices, telephone or in-person recruitment) and whether involvement has been offered widely enough to garner multiple perspectives; how patients’ experiences are shared (e.g., community forums, formal meetings, individual or group discussions) and whether facilitation enables broad participation; and finally, how patients’ participation and experiences are incorporated into the research planning and design, with patients having equal decision-making capacity to other research team members. Several publications and tools are available that can help guide researchers who are new to processes of engaging patients in research [ 24 , 30 , 31 , 32 , 33 , 34 ], but unfortunately few address how to evaluate the effectiveness of engagement [ 35 ].

Reflexivity: Ensuring meaningful and authentic engagement

In qualitative research, reflexivity is an ongoing process of “the researcher’s scrutiny of his or her research experience, decisions, and interpretations in ways that bring the researcher into the process and allow the reader to assess how and to what extent the researcher’s interests, positions, and assumptions influenced inquiry. A reflexive stance informs how the researcher conducts his or her research, relates to the research participants, and represents them in written reports,” (p.188–189) [ 16 ]. The concept of reflexivity can be applied to research involving patient engagement by continually and explicitly considering how decisions about the research study were made. All members of the research team must consider (and perhaps discuss): (1) how patient partners are invited to participate in research planning and decision-making; (2) how their input is received relative to other team members (i.e., do their suggestions garner the same respect as researchers’ or providers’?); and, (3) whether engaged patients or patient partners feel sufficiently safe, able and respected to share their experiences, preferences and recommendations with the research team.

Ideally, reflexivity becomes a practice within the research team and may be operationalized through regular check-ins with patients and researchers about their comfort in sharing their views, and whether they feel that their views have been considered and taken on board. Power dynamics should also be considered during patient engagement initiatives. For example, the team might reflect on how community forums, focus groups or interviews are to be facilitated, considering who is at the table and who is not, who speaks and who does not, and whose suggestions are implemented and whose are not. Reflexivity can be practiced through informal discussions, or using methods that may allow more candid responses by engaged patients (e.g., anonymous online survey or feedback forms). At the very least, if these practices were not conducted throughout the research process, the research team (including patient partners) should endeavor to reflect upon team dynamics and consider how these may have contributed to the research design or outcomes. For example, were physicians and researchers seen as experts, and did patients feel less welcome or able to share their personal experiences? Were patients only engaged by telephone rather than in-person, and did this influence their ability to easily engage in decision-making? Reflexive practices may be usefully supplemented by formal evaluation of the process of patient engagement from the perspective of patients and other research team members [ 36 , 37 ], and some tools are available to do this [ 35 ].

A note about language

One way to address the team dynamic between researchers, professional knowledge users (such as clinicians or health policy planners) and patients is to consider the language used to engage with patients in the planning of patient engagement strategies. That is, the term ‘patient engagement’ is a construction of an individual’s identity that exists only within the healthcare setting, and in the context of a patient-provider dynamic. This term does not consider how people make decisions about their health and healthcare within a broader context of their family, community, and culture [ 22 , 38 ]. This may be why research communities in some countries (e.g., the United Kingdom) use the term ‘patient and public involvement’. Additionally, research that involves communities defined by geography, shared experiences, cultural or ethnic identity, as is the case with participatory health research, may refer to ‘community engagement.’ Regardless of the term used, partnerships with patients, the public, or with communities need to be conceived instead as person-to-person interactions between researchers and individuals who are most affected by the research. Discussions with engaged patients should be conducted early on to determine how to best describe their role on the team or during engagement initiatives (e.g., as patient partners, community members, or people with lived experience).

Tokenism is the “difference between…the empty ritual of participation and having the real power needed to affect the outcome,” (p.2) [ 39 ]. Ongoing reflection on the power dynamic between researchers and engaged patients, a central tenet of critical qualitative health research [ 40 , 41 ], can increase the likelihood that engagement involves equitable processes and will result in meaningful engagement experiences by patients rather than tokenism [ 36 , 42 ]. Patient engagement initiatives should strive for “partnership” amongst all team members, and not just reflect a patient-clinician or researcher-subject dynamic [ 43 ]. To develop meaningful, authentic and sustainable relationships with engaged patients, methods used for participatory, action or community-based research (approaches that fall under the paradigm of qualitative inquiry) provide detailed experiential guidance [ 44 ]. For example, a realist review of community-based participatory research projects reported that gaining and maintaining trust with patient or community partners, although time-intensive, is foundational to equitable and sustainable partnerships that benefit communities and individuals [ 45 , 46 ]. Additionally, Chapter Nine of the Canadian Tri-Council Policy Statement on Research involving Humans, which has to date been applied to research involving First Nations, Inuit and Métis Peoples in Canada [ 47 ], provides useful information and direction that can be applied to working with patient partners on research [ 48 ].

Authentic patient engagement should include patients’ involvement at all stages of the research process [ 49 , 50 ], but this is often not the case [ 10 ]. Since patient partners are not research subjects or participants, their engagement does not (usually) require ethics approval, and they can be engaged as partners as early as during the submission of grant applications [ 49 ]. This early engagement helps to incorporate patients’ perspectives into the proposed research before the project is wedded to particular objectives, outcomes and methods, and can also serve to allocate needed resources to support patient engagement (including remuneration for patient partners’ time). Training in research for patient partners can also support their meaningful engagement by increasing their ability to fully engage in decision-making with other members of the research team [ 51 , 52 ]. Patient partners may also thrive in co-leading the dissemination of findings to healthcare providers, researchers, patients or communities most affected by the research [ 53 ].

Patient engagement has gained increasing popularity, but many research organizations are still at the early stages of developing approaches and methods, many of which are based on experience rather than evidence. As health researchers and members of the public will increasingly need to partner for research to satisfy the overlapping mandate of patient engagement in health policy, healthcare and research, the qualitative research methods highlighted in this commentary provide some suggestions to foster rigorous, meaningful and sustained engagement initiatives while addressing broader issues of power and representation. By incorporating evidence-based methods of gathering and learning from multiple and diverse patient perspectives, we will hopefully conduct better patient-engaged research, live out the democratic ideals of patient engagement, and ultimately contribute to research that is more relevant to the lives of patients, as well as to the improved delivery of healthcare services. In addition to the references provided in this paper, readers are encouraged to learn more about the meaningful engagement of patients in research from several key texts [ 54 , 55 , 56 ].

Abbreviations

CIHR: Canadian Institutes for Health Research

PCORI: Patient Centered Outcomes Research Institute

SPOR: Strategy for Patient Oriented Research

Canadian Institutes of Health Research. Strategy for Patient-Oriented Research. 2014. http://www.cihr-irsc.gc.ca/e/documents/spor_framework-en.pdf . Accessed 27 Sep 2017.

Kothari A, Wathen CN. Integrated knowledge translation: digging deeper, moving forward. J Epidemiol Community Health. 2017;71:619–23. https://doi.org/10.1136/jech-2016-208490 .

Gagliardi AR, Berta W, Kothari A, Boyko J, Urquhart R. Integrated knowledge translation (IKT) in health care: a scoping review. Implement Sci. 2016;11:38. https://doi.org/10.1186/s13012-016-0399-1 .

Canadian Institutes of Health Research. Strategy for Patient-Oriented Research - CIHR. http://www.cihr-irsc.gc.ca/e/41204.html . Accessed 20 Sep 2017.

Patient-Centered Outcomes Research Institute. Patient-Centered Outcomes Research Institute website. https://www.pcori.org/ . Accessed 20 Sep 2017.

National Institute for Health Research. INVOLVE | INVOLVE Supporting public involvement in NHS, public health and social care research. http://www.invo.org.uk/ . Accessed 27 Sep 2017.

Canadian Institutes of Health Research. SPOR SUPPORT Units - CIHR. http://www.cihr-irsc.gc.ca/e/45859.html . Accessed 27 Sep 2017.

Ocloo J, Matthews R. From tokenism to empowerment: progressing patient and public involvement in healthcare improvement. BMJ Qual Saf. 2016;25:626–32.

Hearld KR, Hearld LR, Hall AG. Engaging patients as partners in research: factors associated with awareness, interest, and engagement as research partners. SAGE open Med. 2017;5:2050312116686709. https://doi.org/10.1177/2050312116686709.

Domecq JP, Prutsky G, Elraiyah T, Wang Z, Nabhan M, Shippee N, et al. Patient engagement in research: a systematic review. BMC Health Serv Res. 2014;14:89. https://doi.org/10.1186/1472-6963-14-89 .

Concannon TW, Fuster M, Saunders T, Patel K, Wong JB, Leslie LK, et al. A systematic review of stakeholder engagement in comparative effectiveness and patient-centered outcomes research. J Gen Intern Med. 2014;29:1692–701.

Oostendorp LJM, Durand M-A, Lloyd A, Elwyn G. Measuring organisational readiness for patient engagement (MORE): an international online Delphi consensus study. BMC Health Serv Res. 2015;15:61. https://doi.org/10.1186/s12913-015-0717-3 .

Boddy K, Cowan K, Gibson A, Britten N. Does funded research reflect the priorities of people living with type 1 diabetes? A secondary analysis of research questions. BMJ Open. 2017;7:e016540. https://doi.org/10.1136/bmjopen-2017-016540 .

Mays N, Pope C. Rigour and qualitative research. BMJ Br Med J. 1995;311:109–12.

Krefting L. Rigor in qualitative research: the assessment of trustworthiness. Am J Occup Ther. 1991;45:214–22.

Charmaz K. Constructing grounded theory : a practical guide through qualitative analysis. London: Sage Publications; 2006.

Wilson P, Mathie E, Keenan J, McNeilly E, Goodman C, Howe A, et al. ReseArch with patient and public invOlvement: a RealisT evaluation – the RAPPORT study. Heal Serv Deliv Res. 2015;3:1–176. https://doi.org/10.3310/hsdr03380 .

Corbin JM, Strauss AL, Strauss AL. Basics of qualitative research : techniques and procedures for developing grounded theory. Thousand Oaks: Sage Publications; 2008.

Keith RE, Crosson JC, O’Malley AS, Cromp D, Taylor EF. Using the consolidated framework for implementation research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci. 2017;12:15. https://doi.org/10.1186/s13012-017-0550-7 .

Esmail L, Moore E. Evaluating patient and stakeholder engagement in research: moving from theory to practice. J Comp Eff Res. 2015;4:133–45. https://doi.org/10.2217/cer.14.79 .

Brett J, Staniszewska S, Mockford C, Herron-Marx S, Hughes J, Tysall C, et al. A systematic review of the impact of patient and public involvement on service users, researchers and communities. Patient. 2014;7:387–95.

Rowland P, McMillan S, McGillicuddy P, Richards J. What is “the patient perspective” in patient engagement programs? Implicit logics and parallels to feminist theories. Heal An Interdiscip J Soc Study Heal Illn Med. 2016. https://doi.org/10.1177/1363459316644494 .

Martin GP. “Ordinary people only”: knowledge, representativeness, and the publics of public participation in healthcare. Sociol Health Illn. 2008;30:35–54. https://doi.org/10.1111/j.1467-9566.2007.01027.x .

Snow B, Tweedie K, Pederson A, Shrestha H, Bachmann L. Patient Engagement: Heard and valued. 2013. http://www.cfhi-fcass.ca/sf-docs/default-source/patient-engagement/awesome_workbook-fraserhealth.pdf . Accessed 24 Jan 2018.

Montesanti SR, Abelson J, Lavis JN, Dunn JR. Enabling the participation of marginalized populations: case studies from a health service organization in Ontario, Canada. Health Promot Int. 2017;32(4):636–49. https://doi.org/10.1093/heapro/dav118.

The Change Foundation. Should Money Come into It? A Tool for Deciding Whether to Pay Patient- Engagement Participants. Toronto; 2015. http://www.changefoundation.ca/site/wp-content/uploads/2016/05/Should-money-come-into-it.pdf . Accessed 18 June 2018

INVOLVE. Involvement Cost Calculator | INVOLVE. http://www.invo.org.uk/resource-centre/payment-and-recognition-for-public-involvement/involvement-cost-calculator/ . Accessed 11 Oct 2017.

Patton MQ. Qualitative research and evaluation methods. 3rd ed. Thousand Oaks: Sage Publications; 2002.

Kerr C, Nixon A, Wild D. Assessing and demonstrating data saturation in qualitative inquiry supporting patient-reported outcomes research. Expert Rev Pharmacoecon Outcomes Res. 2010;10(3):269–81. https://doi.org/10.1586/erp.10.30 .

Canadian Institutes of Health Research. CIHR’S Citizen Engagement Handbook. http://www.cihr-irsc.gc.ca/e/documents/ce_handbook_e.pdf . Accessed 18 June 2018.

Elliott J, Heesterbeek S, Lukensmeyer CJ, Slocum N. Participatory and deliberative methods toolkit how to connect with citizens: a practitioner’s manual. King Baudouin Foundation; 2005. https://www.kbs-frb.be/en/Virtual-Library/2006/294864 . Accessed 28 Sep 2017.

Robbins M, Tufte J, Hsu C. Learning to “swim” with the experts: experiences of two patient co-investigators for a project funded by the Patient-Centered Outcomes Research Institute. Perm J. 2016;20:85–8.

Gauvin F-P, Abelson J, Lavis JN. Strengthening public and patient engagement in health technology assessment in Ontario. 2014. https://www.mcmasterforum.org/docs/default-source/product-documents/evidence-briefs/public-engagement-in-health-technology-assessement-in-ontario-eb.pdf?sfvrsn=2 . Accessed 18 June 2018.

Abelson J, Montesanti S, Li K, Gauvin F-P, Martin E. Effective strategies for interactive public engagement in the development of healthcare policies and Programs. 2010; https://www.cfhi-fcass.ca/sf-docs/default-source/commissionedresearch-reports/Abelson_EN_FINAL.pdf?sfvrsn=0. . Accessed 16 Nov 2018.

Abelson J. Public and patient engagement evaluation tool (version 1.0), vol. 0; 2015. https://www.fhs.mcmaster.ca/publicandpatientengagement/ppeet_request_form.html ! Accessed 18 June 2018

Johnson DS, Bush MT, Brandzel S, Wernli KJ. The patient voice in research—evolution of a role. Res Involv Engagem. 2016;2:6. https://doi.org/10.1186/s40900-016-0020-4 .

Crocker JC, Boylan AM, Bostock J, Locock L. Is it worth it? Patient and public views on the impact of their involvement in health research and its assessment: a UK-based qualitative interview study. Health Expect. 2017;20(3):519–28. https://doi.org/10.1111/hex.12479 .

Renedo A, Marston C. Healthcare professionals’ representations of “patient and public involvement” and creation of “public participant” identities: implications for the development of inclusive and bottom-up community participation initiatives. J Community Appl Soc Psychol. 2011;21:268–80. https://doi.org/10.1002/casp.1092 .

Hahn DL, Hoffmann AE, Felzien M, LeMaster JW, Xu J, Fagnan LJ. Tokenism in patient engagement. Fam Pract. 2017;34(3):290–5. https://doi.org/10.1093/fampra/cmw097.

Hart C, Poole JM, Facey ME, Parsons JA. Holding firm: power, push-Back, and opportunities in navigating the liminal space of critical qualitative Health Research. Qual Health Res. 2017;27:1765–74. https://doi.org/10.1177/1049732317715631 .

Eakin JM. Educating critical qualitative health researchers in the land of the randomized controlled trial. Qual Inq. 2016;22:107–18. https://doi.org/10.1177/1077800415617207 .

Safo S, Cunningham C, Beckman A, Haughton L, Starrels JL. “A place at the table:” a qualitative analysis of community board members’ experiences with academic HIV/AIDS research. BMC Med Res Methodol. 2016;16:80. https://doi.org/10.1186/s12874-016-0181-8 .

Thompson J, Bissell P, Cooper C, Armitage CJ, Barber R, Walker S, et al. Credibility and the “professionalized” lay expert: reflections on the dilemmas and opportunities of public involvement in health research. Heal Interdiscip J Soc Study Heal Illn Med. 2012;16:602–18. https://doi.org/10.1177/1363459312441008 .

Burke JG, Jones J, Yonas M, Guizzetti L, Virata MC, Costlow M, et al. PCOR, CER, and CBPR: alphabet soup or complementary fields of Health Research? Clin Transl Sci. 2013;6:493–6. https://doi.org/10.1111/cts.12064 .

Jagosh J, Bush PL, Salsberg J, Macaulay AC, Greenhalgh T, Wong G, et al. A realist evaluation of community-based participatory research: partnership synergy, trust building and related ripple effects. BMC Public Health. 2015;15:725. https://doi.org/10.1186/s12889-015-1949-1 .

Jagosh J, Macaulay AC, Pluye P, Salsberg J, Bush PL, Henderson J, et al. Uncovering the benefits of participatory research: implications of a realist review for health research and practice. Milbank Q. 2012;90:311–46. https://doi.org/10.1111/j.1468-0009.2012.00665.x .

Government of Canada IAP on RE. TCPS 2 - Chapter 9 Research Involving the First Nations, Inuit and Métis Peoples of Canada. http://www.pre.ethics.gc.ca/eng/policy-politique/initiatives/tcps2-eptc2/chapter9-chapitre9/ . Accessed 16 Oct 2017.

Ramsden VR, Crowe J, Rabbitskin N, Rolfe D, Macaulay A. Authentic engagement, co-creation and action research. In: Goodyear-smith F, Mash B, editors. How to do primary care research. Boca Raton: CRC press, medicine, Taylor & Francis Group; 2019.

Ramsden VR, Salsberg J, Herbert CP, Westfall JM, LeMaster J, Macaulay AC. Patient- and community-oriented research: how is authentic engagement identified in grant applications? Can Fam Physician. 2017;63:74–6.

Woolf SH, Zimmerman E, Haley A, Krist AH. Authentic engagement of patients and communities can transform research, practice, and policy. Health Aff. 2016;35:590–4.

Parkes JH, Pyer M, Wray P, Taylor J. Partners in projects: preparing for public involvement in health and social care research. Health Policy. 2014;117:399–408. https://doi.org/10.1016/j.healthpol.2014.04.014 .

Norman N, Bennett C, Cowart S, Felzien M, Flores M, Flores R, et al. Boot camp translation: a method for building a community of solution. J Am Board Fam Med. 2013;26:254–63. https://doi.org/10.3122/jabfm.2013.03.120253 .

Ramsden VR, Rabbitskin N, Westfall JM, Felzien M, Braden J, Sand J. Is knowledge translation without patient or community engagement flawed. Fam Pract. 2016:1–3.

Denzin NK, Lincoln YS. SAGE Handbook of qualitative research. 3rd ed. Thousand Oaks: Sage Publications; 2005.

Pearce V, Baraitser P, Smith G, Greenhalgh T. Experience-based co-design. In: User involvement in health care. Oxford: Wiley-Blackwell; 2010. p. 28–51. https://doi.org/10.1002/9781444325164.ch3 .

Hulatt I, Lowes L. Involving service users in health and social care research: Routledge; 2005. https://doi.org/10.4324/9780203865651.

Acknowledgements

This paper was drafted in response to a call for concept papers related to integrated knowledge translation issued by the Integrated Knowledge Translation Research Network (CIHR FDN #143237).

This paper was commissioned by the Integrated Knowledge Translation Network (IKTRN). The IKTRN brings together knowledge users and researchers to advance the science and practice of integrated knowledge translation and train the next generation of integrated knowledge translation researchers. Honorariums were provided for completed papers. The IKTRN is funded by a Canadian Institutes of Health Research Foundation Grant (FDN #143247).

Availability of data and materials

Not applicable.

Author information

Authors and Affiliations

School of Epidemiology and Public Health, University of Ottawa, 307D- 600 Peter Morand Crescent, Ottawa, ON, K1G 5Z3, Canada

Danielle E. Rolfe & Ian D. Graham

Department of Academic Family Medicine, University of Saskatchewan, West Winds Primary Health Centre, 3311 Fairlight Drive, Saskatoon, SK, S7M 3Y5, Canada

Vivian R. Ramsden

School of Nursing, University of Northern British Columbia, 3333 University Way, Prince George, BC, V2K4C6, Canada

Davina Banner

Contributions

DR conceived and drafted this paper. All authors were involved in critiquing and revising the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Danielle E. Rolfe .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.

Cite this article

Rolfe, D.E., Ramsden, V.R., Banner, D. et al. Using qualitative Health Research methods to improve patient and public involvement and engagement in research. Res Involv Engagem 4 , 49 (2018). https://doi.org/10.1186/s40900-018-0129-8

Received : 18 June 2018

Accepted : 09 November 2018

Published : 13 December 2018

DOI : https://doi.org/10.1186/s40900-018-0129-8

  • Qualitative methods
  • Qualitative health research
  • Patient-oriented research
  • Integrated knowledge translation
  • Patient engagement
  • Patient partners
  • Patient and public involvement

Qualitative Research: Qualitative interviews in medical research

  • Nicky Britten, lecturer in medical sociology
  • Department of General Practice, United Medical and Dental Schools of Guy's and St Thomas's Hospitals, London SE11 6SP

Much qualitative research is interview based, and this paper provides an outline of qualitative interview techniques and their application in medical settings. It explains the rationale for these techniques and shows how they can be used to research kinds of questions that are different from those dealt with by quantitative methods. Different types of qualitative interviews are described, and the way in which they differ from clinical consultations is emphasised. Practical guidance for conducting such interviews is given.

Types of qualitative interview

Practising clinicians routinely interview patients during their clinical work, and they may wonder whether simply talking to people constitutes a legitimate form of research. In sociology and related disciplines, however, interviewing is a well established research technique. There are three main types: structured, semistructured, and in depth interviews (box 1).

Box 1: Types of interviews

  • Structured: usually with a structured questionnaire
  • Semistructured: open ended questions
  • Depth: one or two issues covered in great detail; questions are based on what the interviewee says

Structured interviews consist of administering structured questionnaires, and interviewers are trained to ask questions (mostly fixed choice) in a standardised manner. For example, interviewees might be asked: “Is your health: excellent, good, fair, or poor?” Though qualitative interviews are often described as being unstructured in order to contrast them with this type of formalised quantitative interview, the term “unstructured” is misleading as no interview is completely devoid of structure: if it were, there would be no guarantee that the data gathered would be appropriate to the research question.

Semistructured interviews are conducted on the basis of a loose structure consisting of open ended questions that define the area to be explored, at least initially, and from which the interviewer or interviewee may diverge in order to pursue an idea in more detail. Continuing with the same example, interviewees might initially be asked a series of …

Qualitative studies. Their role in medical research

Affiliation: Department of Family Medicine, University of Ottawa. PMID: 9839063; PMCID: PMC2277956

Objective: To define qualitative research in terms of its philosophical roots, the questions it addresses, its methods and analyses, and the type of results it can offer.

Data sources: MEDLINE and CINAHL (Cumulative Index to Nursing and Allied Health Literature) databases were searched for the years January 1985 to April 1998. The search strategy consisted of "textword" terms that searched in the "title" field of both databases. Qualitative research and evaluation textbooks in health and the social sciences were also used.

Quality of evidence: The information on qualitative research is based on the most recent and valid evidence from the health and social science fields.

Main message: Qualitative research seeks to understand and interpret personal experience to explain social phenomena, including those related to health. It can address questions that quantitative research cannot, such as why people do not adhere to a treatment regimen or why a certain health care intervention is successful. It uses many methods of data collection, including participant observation, case studies, and interviews, and numerous approaches to data analysis that range from the quasistatistical to the intuitive and inductive.

Conclusions: Qualitative research, a form of research completely different from quantitative research, can provide important insights into health-related phenomena and can enrich further research inquiries.

Adam P. Sawatsky , John T. Ratelle , Thomas J. Beckman; Qualitative Research Methods in Medical Education. Anesthesiology 2019; 131:14–22 doi: https://doi.org/10.1097/ALN.0000000000002728


Qualitative research was originally developed within the social sciences. Medical education is a field that comprises multiple disciplines, including the social sciences, and utilizes qualitative research to gain a broader understanding of key phenomena within the field. Many clinician educators are unfamiliar with qualitative research. This article provides a primer for clinician educators who want to appraise or conduct qualitative research in medical education. This article discusses a definition and the philosophical underpinnings for qualitative research. Using the Standards for Reporting Qualitative Research as a guide, this article provides a step-wise approach for conducting and evaluating qualitative research in medical education. This review will enable the reader to understand when to utilize qualitative research in medical education and how to interpret reports using qualitative approaches.


Qualitative research provides approaches to explore and characterize the education of future anesthesiologists. For example, the practice of anesthesiology is increasingly team-based; core members of the anesthesia care team include physicians, trainees, nurse anesthetists, anesthesiologist assistants, and other healthcare team members. 1   Understanding how to work within and how to teach learners about anesthesia care teams requires the ability to conceptualize the complexity of individual psychology and social interactions that occur within teams. Qualitative research is well suited to investigate complex issues like team-based care. For example, one qualitative study observed the interactions between members of the anesthesia care team during simulated stressful situations and conducted interviews of team members; they described limited understanding of each team member’s role and perceptions about appropriate roles and responsibilities, which provided insight for interprofessional team training. 2   Another qualitative study explored the hierarchy within the anesthesia care team, highlighting residents’ reluctance to challenge the established hierarchy and outlining the strategies they use to cope with fear and intimidation. 3   Key issues in medical education and anesthesiology, particularly when exploring human experience and social interactions, may be best studied using qualitative research methodologies and methods.

Medical education is a complex field, and medical education research and practice fittingly draws from many disciplines ( e.g. , medicine, psychology, sociology, education) and synthesizes multiple perspectives to explain how people learn and how medicine should be taught. 4 , 5   The concept of a field was well described by Cristancho and Varpio 5   in their tips for early career medical educators: “A discipline is usually guided by shared paradigms, assumptions, rules and methods to present their knowledge claims— i.e. , people from the same discipline speak the same language. A field brings people from multiple disciplines together.” Qualitative research draws from the perspectives of multiple disciplines and has provided methodologies to explore the complex research questions inherent to medical education.

Box 1. What to Look for in Research Using This Method. When appraising qualitative research in medical education, do the authors:

  • Clearly state the study purpose and research question?
  • Describe the conceptual framework that informs the study and guides the analysis?
  • Identify their qualitative methodology and research paradigm?
  • Demonstrate adequate reflexivity, conveying to the reader their values, assumptions, and ways of thinking, and being explicit about the effects these have on the research process?
  • Choose data collection methods that are congruent with the research purpose and qualitative methodology?
  • Select an appropriate sampling strategy, choosing participants whose perspectives or experiences are relevant to the study question?
  • Define their method for determining saturation, that is, how they decided to stop data collection?
  • Outline their process for data processing, including the management and coding of study data?
  • Conduct data analysis consistent with their chosen methodology?
  • Consider techniques to enhance the trustworthiness of their study findings?
  • Synthesize and interpret their data with sufficient detail and supporting quotations to explain the phenomenon of study?

Current medical training is heavily influenced by the practice of evidence-based medicine. 6   Trainees are taught the “hierarchy of evidence” for evaluating studies of clinical interventions. 7   This hierarchy prioritizes knowledge gained through systematic reviews and meta-analyses, randomized controlled trials, and observational studies, but it does not include qualitative research methodologies. This means that because of their medical training and exposure to quantitative medical literature, clinician educators may be more familiar with quantitative research and feel more comfortable engaging in studies utilizing quantitative methodologies. However, many clinician educators are not familiar with the language and application of qualitative research and feel less comfortable engaging in studies using qualitative methodologies.

Because medical education is a diverse and complex field, qualitative research is a common approach in medical education research. Clinician educators who wish to understand the medical education literature need to be familiar with qualitative research. Clinician educators involved in research may also find themselves asking questions best answered by qualitative methodologies. Our goal is to provide a broad, practical overview of qualitative research in medical education. Our objectives are to:

1) Define qualitative research.
2) Compare and contrast qualitative and quantitative research.
3) Provide a framework for conducting and appraising qualitative research in medical education.

Qualitative research in medical education has a distinct vocabulary with terminology not commonly used in other biomedical research fields. Therefore, we have provided a glossary and definitions of the common terms that are used throughout this article ( table 1 ).

Table 1. Glossary of Common Terms Used in Qualitative Research

Of the many attempts to provide a comprehensive definition of qualitative research, our favorite definition comes from Denzin and Lincoln:

“Qualitative research is a situated activity that locates the observer in the world. Qualitative research consists of a set of interpretive, material practices that make the world visible. These practices…turn the world into a series of representations, including field notes, interviews, conversations, photographs, recordings, and memos to the self. At this level, qualitative research involves an interpretive, naturalistic approach to the world. This means that qualitative researchers study things in their natural settings, attempting to make sense of or interpret phenomena in terms of the meanings people bring to them.” 12  

This definition reveals the following points: first, qualitative research is a “situated activity,” meaning that the research and observations are made in the real world, in this case a real life clinical or educational situation. Second, qualitative research “turns the world into a series of representations” by representing the observations, in this case of a clinical or educational situation, with qualitative data, usually taking the form of words, pictures, documents, and other symbols. Last, qualitative researchers seek to “make sense” of the meanings that research participants bring to different phenomena to allow for a greater understanding of those phenomena. Through qualitative research, observers comprehend participants’ beliefs and values and the way these beliefs and values are shaped by the context in which they are studied.

Because most clinician educators are familiar with quantitative methods, we will start by comparing qualitative and quantitative methods to gain a better understanding of qualitative research ( table 2 ). To illustrate the difference between qualitative and quantitative research in medical education, we pose the question: “What makes noon conference lectures effective for resident learning?” A qualitative approach might explore the learner perspective on learning in noon conference lectures during residency and conduct an exploratory thematic analysis to better understand what the learner thinks is effective. 13  A qualitative approach is useful to answer this question, especially if the phenomenon of interest is incompletely understood. If we wanted to compare types or attributes of conferences to assess the most effective methods of teaching in a noon conference setting, then a quantitative approach might be more appropriate, though a qualitative approach could be helpful as well. We could use qualitative data to inform the design of a survey 14  or even inform the design of a randomized controlled trial to compare two types of learning during noon conference. 15  Therefore, when discussing qualitative and quantitative research, the issue is not which research approach is stronger, because it is understood that each approach yields different types of knowledge when answering the research question.

Table 2. Comparisons of Quantitative and Qualitative Research in Medical Education

Similarities

The first step of any research project, qualitative or quantitative, is to determine and refine the study question; this includes conducting a thorough literature review, crafting a problem statement, establishing a conceptual framework for the study, and declaring a statement of intent. 16   A common pitfall in medical education research is to start by identifying the desired methods ( e.g. , “I want to do a focus group study with medical students.”) without having a clearly refined research question, which is like putting the cart before the horse. In other words, the research question should guide the methodology and methods for both qualitative and quantitative research.

Acknowledging the conceptual framework for a study is equally important for both qualitative and quantitative research. In a systematic review of medical education research, only 55% of studies provided a conceptual framework, limiting the interpretation and meaning of the results. 17   Conceptual frameworks are often theories that represent a way of thinking about the phenomenon being studied. Conceptual frameworks guide the interpretation of data and situate the study within the larger body of literature on a specific topic. 9   Because qualitative research was developed within the social sciences, many qualitative research studies in medical education are framed by theories from social sciences. Theories from social science disciplines have the ability to “open up new ways of seeing the world and, in turn, new questions to ask, new assumptions to unearth, and new possibilities for change.” 18   Qualitative research in medical education has benefitted from these new perspectives to help understand fundamental and complex problems within medical education such as culture, power, identity, and meaning.

Differences

The fundamental difference between qualitative and quantitative methodologies centers on epistemology ( i.e. , differing views on truth and knowledge). Cleland 19   describes the differences between qualitative and quantitative philosophies of scientific inquiry: “quantitative and qualitative approaches make different assumptions about the world, about how science should be conducted and about what constitutes legitimate problems, solutions and criteria of ‘proof.’”

Quantitative research comes from objectivism , an epistemology asserting that there is an absolute truth that can be discovered; this way of thinking about knowledge leads researchers to conduct experimental study designs aimed to test hypotheses about cause and effect. 10   Qualitative research, on the other hand, comes from constructivism , an epistemology asserting that reality is constructed by our social, historical, and individual contexts, and leads researchers to utilize more naturalistic or exploratory study designs to provide explanations about phenomenon in the context that they are being studied. 10   This leads researchers to ask fundamentally different questions about a given phenomenon; quantitative research often asks questions of “What?” and “Why?” to understand causation, whereas qualitative research often asks the questions “Why?” and “How?” to understand explanations. Cook et al. 20   provide a framework for classifying the purpose of medical education research to reflect the steps in the scientific method—description (“What was done?”), justification (“Did it work?”), and clarification (“Why or how did it work?”). Qualitative research nicely fits into the categories of “description” and “clarification” by describing observations in natural settings and developing models or theories to help explain “how” and “why” educational methods work. 20  

Another difference between quantitative and qualitative research is the role of the researcher in the research process. Experimental studies have explicitly stated methods for creating an “unbiased” study in which the researcher is detached ( i.e. , “blinded”) from the analysis process so that their biases do not shape the outcome of the research. 21   The term “bias” comes from the positivist paradigm underpinning quantitative research. Assessing and addressing “bias” in qualitative research is incongruous. 22   Qualitative research, based largely on a constructivist paradigm, acknowledges the role of the researcher as a “coconstructer” of knowledge and utilizes the concept of “reflexivity.” Because researchers act as coconstructors of knowledge, they must be explicit about the perspectives they bring to the research process. A reflexive researcher is one who challenges their own values, assumptions, and way of thinking and who is explicit about the effects these ways of thinking have on the research process. 23   For example, when we conducted a study on self-directed learning in residency training, we were overt regarding our roles in the residency program as core faculty, our belief in the importance of self-directed learning, and our assumptions that residents actually engaged in self-directed learning. 24 , 25   We also needed to challenge these assumptions and open ourselves to alternative questions, methods of data collection, and interpretations of the data, to ultimately ensure that we created a research team with varied perspectives. Therefore, qualitative researchers do not strive for “unbiased” research but to understand their own roles in the coconstruction of knowledge. When assessing reflexivity, it is important for the authors to define their roles, explain how those roles may affect the collection and analysis of data, and how the researchers accounted for that effect and, if needed, challenged any assumptions during the research process. Because of the role of the researcher in qualitative research, it is vital to have a member of the research team with qualitative research experience.

A Word on Mixed Methods

In mixed methods research, the researcher collects and analyzes both qualitative and quantitative data rigorously and integrates both forms of data in the results of the study. 26   Medical education research often involves complex questions that may be best addressed through both quantitative and qualitative approaches. Combining methods can complement the strengths and limitations of each method and provide data from multiple sources to create a more detailed understanding of the phenomenon of interest. Examples of uses of mixed methods that would be applicable to medical education research include: collecting qualitative and quantitative data for more complete program evaluation, collecting qualitative data to inform the research design or instrument development of a quantitative study, or collecting qualitative data to explain the meaning behind the results of a quantitative study. 26   The keys to conducting mixed methods studies are to clearly articulate your research questions, explain your rationale for use of each approach, build an appropriate research team, and carefully follow guidelines for methodologic rigor for each approach. 27  

Toward Asking More “Why” Questions

We presented similarities and differences between qualitative and quantitative research to introduce the clinician educator to qualitative research, not to suggest the relative value of one of these research methods over the other. Whether conducting qualitative or quantitative research in medical education, researchers should move toward asking more “why” questions to gain a deeper understanding of the key phenomena and theories in medical education and to move the field forward. 28  By understanding the theories and assumptions behind qualitative and quantitative research, clinicians can decide how to use these approaches to answer important questions in medical education.

There are substantial differences between qualitative and quantitative research with respect to the assessment of rigor; here we provide a framework for reading, understanding, and assessing the quality of qualitative research. O’Brien et al. 29   created a useful 21-item guide for reporting qualitative research in medical education, based upon a systematic review of reporting standards for qualitative research—the Standards for Reporting Qualitative Research. It should be noted, however, that just performing and reporting each step in these standards do not ensure research quality.

Using the Standards for Reporting Qualitative Research as a backdrop, we will highlight basic steps for clinician educators wanting to engage with qualitative research. If you use this framework to conduct qualitative research in medical education, then you should address these steps; if you are evaluating qualitative research in medical education, then you can assess whether the study investigators addressed these steps. Table 3 underscores each step and provides examples from our research in resident self-directed learning. 25  

Table 3. Components of Qualitative Research: Examples from a Single Research Study

Refine the study question. As with any research project, investigators should clearly define the topic of research, describe what is already known about the phenomenon that is being studied, identify gaps in the literature, and clearly state how the study will fill that gap. Considering theoretical underpinnings of qualitative research in medical education often means searching for sources outside of the biomedical literature and utilizing theories from education, sociology, psychology, or other disciplines. This is also a critical time to engage people from other disciplines to identify theories or sources of information that can help define the problem and theoretical frameworks for data collection and analysis. When evaluating the introduction of a qualitative study, the researchers should demonstrate a clear understanding of the phenomenon being studied, the previous research on the phenomenon, and conceptual frameworks that contextualize the study. Last, the problem statement and purpose of the study should be clearly stated.

Identify the qualitative methodology and research paradigm. The qualitative methodology should be chosen based on the stated purpose of the research. The qualitative methodology represents the overarching philosophy guiding the collection and analysis of data and is distinct from the research methods ( i.e. , how the data will be collected). There are a number of qualitative methodologies; we have included a list of some of the most common methodologies in table 4 . Choosing a qualitative methodology involves examining the existing literature, involving colleagues with qualitative research expertise, and considering the goals of each approach. 32   For example, explaining the processes, relationships, and theoretical understanding of a phenomenon would point the researcher to grounded theory as an appropriate approach to conducting research. Alternatively, describing the lived experiences of participants may point the researcher to a phenomenological approach. Ultimately, qualitative research should explicitly state the qualitative methodology along with the supporting rationale. Qualitative research is challenging, and you should consult or collaborate with a qualitative research expert as you shape your research question and choose an appropriate methodology. 32  

Choose data collection methods. The choice of data collection methods is driven by the research question, methodology, and practical considerations. Sources of data for qualitative studies would include open-ended survey questions, interviews, focus groups, observations, and documents. Among the most important aspects of choosing the data collection method is alignment with the chosen methodology and study purpose. 33   For interviews and focus groups, there are specific methods for designing the instruments. 34 , 35   Remarkably, these instruments can change throughout the course of the study, because data analysis often informs future data collection in an iterative fashion.

Select a sampling strategy. After identifying the types of data to be collected, the next step is deciding how to sample the data sources to obtain a representative sample. Most qualitative methodologies utilize purposive sampling, which is choosing participants whose perspectives or experiences are relevant to the study question. 11   Although random sampling and convenience sampling may be simpler and less costly for the researcher than purposeful sampling, these approaches often do not provide sufficient information to answer the study question. 36   For example, in grounded theory, theoretical sampling means that the choice of subsequent participants is purposeful to aid in the building and refinement of developing theory. The criteria for selecting participants should be stated clearly. One key difference between qualitative and quantitative research is sample size: in qualitative research, sample size is usually determined during the data collection process, whereas in quantitative research, the sample size is determined a priori . Saturation is verified when the analysis of newly collected data no longer provides additional insights into the data analysis process. 10  
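For readers who track their codebooks in a spreadsheet or script, the logic of watching for saturation during iterative data collection can be sketched in a few lines of Python. This sketch is added here for illustration only and is not part of the cited studies; the interviews, codes, and stopping heuristic below are hypothetical, and any such count is an aid to, not a substitute for, the research team's judgement.

# Illustrative sketch only: checking whether newly collected interviews still
# yield codes not seen before, as one pragmatic signal to discuss saturation.
# All interview data below are hypothetical.
interview_codes = [
    {"transport barriers", "cost", "fear of hospital"},    # interview 1
    {"cost", "family obligations", "transport barriers"},  # interview 2
    {"fear of hospital", "cost"},                           # interview 3
    {"transport barriers", "family obligations"},           # interview 4
]

codes_seen = set()
for i, codes in enumerate(interview_codes, start=1):
    new_codes = codes - codes_seen
    codes_seen |= codes
    print(f"Interview {i}: {len(new_codes)} new code(s) {sorted(new_codes)}")
# If several consecutive interviews add no new codes, the team may judge,
# alongside qualitative criteria, that saturation has been reached.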

Plan and outline a strategy for data processing. Data processing refers to how the researcher organizes, manages, and dissects the study data. Although data processing serves data analysis, it is not the analysis itself. Data processing includes practical aspects of data management, like transcribing interviews, collecting field notes, and organizing data for analysis. The next step is coding the data, which begins with organizing the raw data into chunks to allow for the identification of themes and patterns. A code is a “word or short phrase that symbolically assigns a summative, salient, essence-capturing, and/or evocative attribute for a portion of language-based or visual data.” 8  There is an artificial breakdown between data processing and analysis, because these steps may be conducted simultaneously; many consider coding as different from—yet a necessary step to facilitating—the analysis of data. 8  Qualitative software can support this process by making it easier to organize, access, search, and code your data. However, it is noteworthy that these programs do not do the work for you; they are merely tools for supporting data processing and analysis.
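As a minimal, hypothetical sketch of the kind of structure such software maintains, namely a mapping from codes to the transcript segments they label, the following Python fragment is illustrative only; the function assign_code, the code names, and the quotations are invented for this example and are not drawn from the studies cited above.

# Minimal, hypothetical sketch of organizing coded qualitative data:
# each code maps to the transcript segments assigned to it.
from collections import defaultdict

codebook = defaultdict(list)

def assign_code(code, transcript_id, segment):
    """Attach a transcript segment to a code in the codebook."""
    codebook[code].append({"transcript": transcript_id, "segment": segment})

assign_code("role ambiguity", "interview_03",
            "I was never sure whether that task was mine or the nurse's.")
assign_code("hierarchy", "interview_07",
            "You don't question the attending, even if you disagree.")

# Retrieve every segment supporting a given code, e.g. when drafting themes.
for entry in codebook["role ambiguity"]:
    print(entry["transcript"], "-", entry["segment"])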

Conduct the data analysis. When analyzing the data, there are several factors to consider. First, the process of data analysis begins with the initial data collection, which often informs future data collection. Researchers should be intentional when reading, reviewing, and analyzing data as it is collected, so that they can shape and enrich subsequent data collection ( e.g. , modify the interview questions). Second, data analysis is often conducted by a research team that should have the appropriate expertise and perspectives to bring to the analysis process. Therefore, when evaluating a qualitative study, you should consider the team’s composition and their reflexivity with respect to their potential biases and influences on their study subjects. Third, the overall goal is to move from the raw data to abstractions of the data that answer the research question. For example, in grounded theory, the research moves from the raw data, to the identification of themes, to categorization of themes, to identifying relationships between themes, and ultimately to the development of theoretical explanations of the phenomenon. 30  Consequently, the primary researcher or research team should be intimately involved with the data analysis, interrogating the data, writing analytic memos, and ultimately making meaning out of the data. There are differing opinions about the use of “counting” of codes or themes in qualitative research. In general, counting of themes is used during the analysis process to recognize patterns and themes; often these are not reported as numbers and percentages as in quantitative research, but may be represented by words like few , some , or many . 37
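To make this counting convention concrete, here is a short, purely illustrative Python snippet; the function frequency_label, the thresholds, and the theme counts are assumptions made for the example and do not come from the article or any published standard.

# Illustrative only: mapping the number of participants endorsing a theme
# to the verbal frequency labels often used in qualitative write-ups.
# Thresholds and counts are hypothetical assumptions, not a standard.
def frequency_label(n_endorsing, n_total):
    proportion = n_endorsing / n_total
    if proportion < 0.25:
        return "few"
    if proportion < 0.75:
        return "some"
    return "many"

theme_counts = {"fear of intimidation": 14, "coping through humour": 6, "formal reporting": 2}
total_participants = 18

for theme, count in theme_counts.items():
    print(f"{frequency_label(count, total_participants)} participants described {theme}")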

Recognize techniques to enhance the trustworthiness of your study findings. Ensuring consistency between the data and the results of data analysis, and ensuring that the data and results accurately represent the perspectives and contexts related to the data source, are crucial to the trustworthiness of study findings. Methods for enhancing trustworthiness include triangulation , which is comparing findings from different methods or perspectives, and member-checking , which is presenting research findings to study participants to provide opportunities to ensure that the analysis is representative. 10

Synthesize and interpret your data. Synthesis of qualitative research is determined by the depth of the analysis and involves moving beyond description of the data to explaining the findings and situating the results within the larger body of literature on the phenomenon of interest. The reporting of data synthesis should match the research methodology. For instance, if the study is using grounded theory, does the study advance the theoretical understanding of the phenomenon being studied? It is also important to acknowledge that clarity and organization are paramount. 10   Qualitative data are rich and extensive; therefore, researchers must organize and tell a compelling story from the data. 38   This process includes the selection of representative data ( e.g. , quotations from interviews) to substantiate claims made by the research team.

Table 4. Common Methodologies Used in Qualitative Research

For more information on qualitative research in medical education:

Qualitative Research and Evaluation Methods: Integrating Theory and Practice, by Michael Q. Patton (SAGE Publications, Inc., 2014)

Qualitative Inquiry and Research Design: Choosing Among Five Approaches, by John W. Creswell (SAGE Publications, Inc. 2017)

Researching Medical Education, by Jennifer Cleland and Steven J. Durning (Wiley-Blackwell, 2015)

Qualitative Research in Medical Education, by Patricia McNally, in Oxford Textbook of Medical Education, edited by Kieren Walsh (Oxford University Press, 2013)

The Journal of Graduate Medical Education “Qualitative Rip Out Series” (Available at: http://www.jgme.org/page/ripouts )

The Standards for Reporting Qualitative Research (O'Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245-51.)

The Wilson Centre Qualitative Atelier (For more information: http://thewilsoncentre.ca/atelier/ )

Qualitative research is commonly used in medical education but may be unfamiliar to many clinician educators. In this article, we provided a definition of qualitative research, explored the similarities and differences between qualitative and quantitative research, and outlined a framework for conducting or appraising qualitative research in medical education. Even with advanced training, it can be difficult for clinician educators to understand and conduct qualitative research. Leaders in medical education research have proposed the following advice to clinician educators wanting to engage in qualitative medical education research: (1) clinician educators should find collaborators with knowledge of theories from other disciplines ( e.g. , sociology, cognitive psychology) and experience in qualitative research to utilize their complementary knowledge and experience to conduct research—in this way, clinician educators can identify important research questions; collaborators can inform research methodology and theoretical perspectives; and (2) clinician educators should engage with a diverse range of disciplines to generate new questions and perspectives on research. 4

Support was provided solely from institutional and/or departmental sources.

The authors declare no competing interests.


  • Technical advance
  • Open access
  • Published: 26 August 2020

A specific method for qualitative medical research: the IPSE (Inductive Process to analyze the Structure of lived Experience) approach

  • Jordan Sibeoni   ORCID: orcid.org/0000-0001-9613-5513 1 , 2 ,
  • Laurence Verneuil 2 ,
  • Emilie Manolios 2 , 3 &
  • Anne Révah-Levy 1 , 2  

BMC Medical Research Methodology volume  20 , Article number:  216 ( 2020 ) Cite this article

12k Accesses

30 Citations

1 Altmetric

Metrics details

This paper reports the construction and use of a specific method for qualitative medical research: The Inductive Process to Analyze the Structure of lived Experience (IPSE), an inductive and phenomenological approach designed to gain the closest access possible to the patients’ experience and to produce concrete recommendations for improving care. This paper describes this innovative method.

IPSE has five steps: 1) set up a research group, 2) ensure the originality of the research, 3) organize recruitment and sampling intended to optimize exemplarity, 4) collect data that enable entry into the subjects’ experience, and 5) analyze the data. This final stage is composed of one individual descriptive phase, followed by two group phases: i) structure the experience, and ii) translate the findings into concrete proposals that make a difference in care.

This innovative method has provided original findings that have opened up new avenues of research and have important practical implications, including (1) the development of patient-reported outcomes, (2) clinical recommendations concerning assessment and treatment, (3) innovative ways to improve communication between patients and doctors, and (4) new insights for medical pedagogy.

Conclusions

IPSE is a qualitative method specifically developed for clinical medical research to reach concrete proposals, easily combined with quantitative research within a mixed-method study design and then directly integrated within evidence-based medicine.


The need for a new qualitative method conceptualized by physicians

As the role of patients in their own medical management radically evolves, more collaborative practices that consider the patient’s perspective in this process are progressively replacing the older approach of paternalistic medicine [ 1 ]. Patients’ preferences, choices, and needs have been placed at the core of treatment. This idea relies on a paradigm shift that places the patient’s lived experience at the center of the care process. That is, patients are now considered to be the experts on their own lived experience, and their voices must be heard to enable the achievement of a more person-centered medicine. This paradigm shift is illustrated by the development of concepts such as patient experts [ 2 ], patient partners [ 3 ], and peer-support workers in psychiatry care [ 4 ], but also by the evolution of the principles of evidence-based medicine (EBM). The concept of EBM emerged in the 1980s with the aim of rationalizing medical practices and hierarchizing the medical literature. This concept first relied only on (i) external clinical data (that is, results from randomized controlled trials and meta-analyses), with (ii) medical expertise subsequently included, and most recently, (iii) patients’ preferences added [ 5 ]. EBM now considers that the best-informed medical decision is the one at the intersection of a Venn diagram composed of these three circles. This paradigm shift is also illustrated by the development of new concepts intended to capture patients’ preferences better: patient-reported outcomes (PROs) and patient-reported outcome measures (PROMs). According to the FDA-NIH Biomarker Working Group glossary definition, a PRO is “a measurement based on a report that comes directly from the patient, about the status of a patient’s health condition without amendments or interpretations of the patient’s response by a clinician or anyone else” [ 6 ]. Today, clinical trials use PROs/PROMs increasingly often [ 7 ]. They are essential outcome measures, demanded by health authorities and regulatory agencies, and useful for physicians, patients, and health policy-makers. In this new era of person-centered medicine, many authors have concluded that an initial phase of qualitative research is needed early in the construction of all PRO tools to explore patients’ experiences [ 8 , 9 , 10 ]. The PRO Good Research Practices Task Force of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) [ 10 , 11 ] has also published suggestions for conducting qualitative studies intended to support the content validity of PRO instruments.

Qualitative health research (QHR) is a relatively recent field, covering a broad area — from the description of illness experience to the sociocultural organization of health care — and using a myriad of qualitative methods coming from other theoretical fields, mostly social sciences: sociology (i.e., symbolic interactionism with grounded theory [ 12 ], ethnography [ 13 ]), psychology (i.e. phenomenological psychology [ 14 , 15 , 16 , 17 ]), case study, narrative, and linguistics [ 18 , 19 ]. Other qualitative methods have been developed specifically for QHR in applied disciplines, mainly by and for nursing sciences [ 20 , 21 ], and in primary care [ 22 ], focusing on specific issues (e.g. experience of specific illness, of caring, of help seeking). Qualitative research is now booming in biomedical clinical studies [ 23 ] in many medical specialties, aimed at obtaining an in-depth understanding of phenomena, directly from the perspective of the people experiencing them. These studies have either applied a qualitative method from another field, or relied exclusively on a thematic analysis approach intended only to structure how the data are analyzed and the results presented [ 24 ]. According to Morse, one goal of QHR is to “bridge the gap between scientific evidence and clinical practice” [ 20 ]. This is already the case at a “review” level with the work of the “Cochrane Qualitative & Implementation Methods Group” for the dissemination and incorporation of qualitative results in systematic reviews, that is, qualitative evidence synthesis [ 25 ]. But, so far, there has been no medical qualitative research method specifically tailored to produce rigorous data from the lived experiences of both patients and physicians to directly inform EBM. For instance, there are no specific qualitative methods to explore the perceived efficacy of a treatment to determine efficacy criteria relevant for patients themselves. Medical research should expect qualitative studies to produce knowledge with the potential to improve patients’ care and lives, and not simply conceptual knowledge, that is, knowledge for its own sake, as qualitative methods from the social sciences produce [ 20 ]. Within nursing research, Thorne has developed “interpretive description”, an inductive qualitative method with roots in phenomenology, ethnography, and grounded theory but which endeavors not to theorize the results but rather to offer practical outcomes for nurses’ daily practices [ 21 ]. Although applicable to other areas of health, including clinical medical practice [ 26 , 27 ], interpretive description does not focus on the lived experience of the stakeholders but rather on contextualizing illnesses in multiple domains (e.g. experiential, spiritual, political, cultural, etc.).

Our group, which has worked for more than a decade on the analysis, dissemination, and use of qualitative methods in medicine, has developed expertise in their use for exploring complex questions around the experience of diseases and their treatment [ 28 , 29 , 30 ]. We consider that physicians have specific concerns and that their medical training and professional experience enable them to contribute to the field of qualitative medical research differently than nurses and other healthcare professionals do. Thus, as both medical doctors and experienced qualitative researchers, we have become convinced of the need for a new qualitative method designed by physicians for addressing specific issues in clinical medical research.

We advocate that this new method should meet several criteria:

  • To thoroughly capture the lived experience of patients and other stakeholders: how they live their disease and its treatment and how they recount it [ 31 ], not as an end but as a means, to connect experiential knowledge with physicians’ medical knowledge and to develop concrete proposals for improving the health care pathway, treatment, and clinical research.
  • To be completely structured to allow a group of physicians, after receiving appropriate training, to conduct rigorous, systematic qualitative medical studies, that is, with all stages of the design clearly described and operationalized.
  • To use vocabulary and concepts that make qualitative medical research accessible and meaningful for physicians and for administrative and policy-making bodies.
  • To directly involve patients within the research process.
  • To integrate qualitative medical research within EBM.

Since none of the existing qualitative methods meet all these criteria, we decided to progressively develop our own: Inductive Process to analyze the Structure of lived Experience (IPSE).

IPSE: theoretical backgrounds

Some qualitative methods have been suggested in medical research, such as qualitative theory-development studies or qualitative elicitation research within a cognitive model [ 32 ]. These are based on directive, task-oriented interviews that we think excessively restrict the direction of the conversations for both participants and interviewers. We want instead to keep the research open to what the participants’ narratives of the experience can add, to allow them to share what they have lived. We strongly believe that truly taking their experience in dealing with an illness or its treatment into account requires letting them recount it freely, as they want to and see fit. The exploration of the lived experience is for us the core of what qualitative methodology can contribute to medical research. To do so, the method must fit into the constructivist paradigm [ 33 , 34 ] and be informed by a phenomenological approach [ 14 ], but without overly theorizing the underlying epistemological and philosophical knowledge, which would impede its practicability.

Lived experience can be defined as personal knowledge of the world gained through direct participation and involvement in the event or phenomenon. Lived experience refers to human activities that are immediate, situated and daily, which are lived without thinking about or paying attention to them (pre-reflexive experience) [ 35 ].

Constructivism comes from the work of the philosopher Immanuel Kant [ 36 ]. It considers that knowledge emerges from a human process of construction. As a research paradigm, constructivism conceives knowledge as a shared construction built on the encounter between researchers and research participants [ 33 , 34 ]. Many qualitative methods, including ethnographic, narrative, and phenomenological, fit into this paradigm, or have been adapted to it, such as Charmaz’s adaptation of Grounded Theory from a constructivist perspective [ 37 ]. IPSE fits epistemologically into a constructivist paradigm as we postulate that the production of knowledge relies on three elements: (i) subjectivity as a space for constructing human reality, (ii) intersubjectivity as a strategy for accessing valid knowledge of human reality, and (iii) understanding that human reality takes place in daily life. These elements underlie the strengths of this method, which is characterized by flexibility in the progressive construction of the object under study, constantly adjusted to the characteristics and complexity of human phenomena, and always takes the subjectivity of the researchers and the participants into account while combining several techniques of data collection and analysis.

Phenomenology literally means the study of what appears. At the very beginning of the twentieth century, phenomenology became the name of a philosophical current founded by Husserl [ 38 ]. He aimed to study how objects appear to the subject’s consciousness and to describe the essence of a phenomenon not by describing the object as it exists but by describing the experience of the subject. Phenomenology can be descriptive or interpretive, that is, associated with hermeneutics (science of interpretation) [ 39 ]. Within qualitative research, phenomenological approaches seek to capture the lived experience of a subject about a phenomenon, to understand how this phenomenon appears in the individual’s conscious experience. In the field of phenomenology, experience is conceived as uniquely perspectival, embodied, and situated. Phenomenological approaches are particularly relevant for conducting research on experiences, thoughts, imagination, intentions, desires or volition. There are many phenomenological qualitative approaches, coming mainly from the field of psychology, either descriptive (Giorgi’s descriptive phenomenological approach [ 14 ], its adaptation by Colaizzi [ 40 ], and Moustakas’s heuristic method [ 16 ]) or interpretative/hermeneutic (interpretative phenomenological analysis (IPA) and van Manen’s approach) [ 15 , 17 ]. Many of them use a theoretical vocabulary and philosophical concepts that are not easily accessible for physicians. Our early work used well-known qualitative phenomenological approaches: one descriptive, that is, Colaizzi’s method [ 41 ], and one hermeneutic: IPA [ 42 , 43 , 44 , 45 , 46 , 47 , 48 ].

Recently, philosophers working in the field of phenomenology have criticized some of these methods. Zahavi wrote about the approaches of IPA and van Manen that “qualitative health researchers interested in phenomenology should look elsewhere for theoretical inspiration and methodological guidance” [ 49 ]. As for Moustakas’s method, Appelbaum noted that, although it uses key phenomenological terms, this method is not phenomenological but rather grounded in a humanistic therapeutic perspective [ 50 ]. Moreover, hermeneutic approaches assume that human beings are always already engaged in interpretative meaning-making activities. They do not seek to capture the patients’ lived experience, but only the meaning they give to it. These aspects do not appear appropriate for application in medical research. In line with Thorne [ 21 ], we consider that the interpretative underpinning in qualitative medical research must be more pragmatic and focus on eliciting concrete proposals for improving treatment. A descriptive approach, that is, “develop[ing] a textural description, what the participants experienced, and a structural description, how they experienced it in terms of conditions, situations or context” [ 51 ], appeared to us more appropriate to integrate into EBM and PRO. However, the descriptive phenomenological approaches [ 14 , 40 ] are mainly methods for analyzing qualitative data (i.e., interview transcripts) and not global research methods (methods structuring a systematic research process from A to Z). In particular, while they underline the need to collect data of first-person accounts of types of experience, they do not provide detailed instructions for the data collection process or study design. Furthermore, descriptive approaches consider access to the lived experience as the goal of the approach without any other more practical and concrete objectives or perspectives. As mentioned above, we consider that within qualitative medical research, lived experience should be considered a means rather than an end. All stages of IPSE are informed by a phenomenological descriptive approach, not only the analytical procedure, as each stage contributes in its own way to capture and describe the lived experience of the participants. At the same time, the objectives of IPSE differ from those of other phenomenological approaches used in qualitative health research: it seeks to improve the quality of care by producing concrete measures (about treatment and care pathways) and to propose new avenues of research.

The two cornerstones of IPSE

The choice of the name IPSE (Inductive Process to Analyze the Structure of lived Experience) underlines the method’s two cornerstones: the inductive process and the analysis of the structure of lived experience.

IPSE relies on an inductive process: the procedure is exploratory, and no research hypotheses are formulated before starting; rather, they emerge from the material, through methods designed to penetrate as far as possible into the participants’ lived experience. Because the data are collected and analyzed simultaneously, the analysis can affect the collection of the data, directly from the material, that is, the narrative of the participants’ lived experience [ 31 ]. The most exemplary inductive approach in qualitative research is Grounded Theory [ 12 ], in which the researchers must suspend their relations with previous theories and limit their review of the literature, so that they can be fully attentive to the unexpected and the novel and can allow local theories to emerge directly from the context and the material [ 52 ]. The IPSE inductive approach does not, however, imply disregarding either practical or theoretical medical knowledge when formulating research questions and objectives. The starting point of an IPSE study is always an unanswered question about the experience of individuals involved in medical care, a question left unanswered by experienced physicians specializing in the topic. These specialized physicians are part of the research group and contribute to all the stages, including the definition of the areas to be explored in the data collection procedure. However, in line with the grounded theory approach [ 12 ], the physicians are not to share their knowledge with the qualitative researchers conducting interviews and analyzing data.

A qualitative researcher is his or her own instrument [ 53 ] and his or her preexisting knowledge and preconceptions (i.e. assumptions, values, interests, theories, beliefs, emotions, etc.) influence how data are collected, explored, analyzed, interpreted, and presented [ 54 ]. Usually, researchers using a qualitative phenomenological approach claim they perform époché, that is, that they “bracket” or set aside their preconceived knowledge and preconceptions of the phenomenon being researched [ 54 ]. Phenomenological philosophers, however, argue that this Husserlian term has been misused and misinterpreted by qualitative researchers [ 55 ]. Using terms such as époché and reduction would, we think, confuse physicians and impede the accessibility of the IPSE method. We consider instead, along with other researchers such as Moustakas [ 16 ], that what matters is not to bracket these preconceptions but to identify and acknowledge them, and to make them explicit, through the act and work of reflexivity [ 56 ], which we will describe fully later. Only in this way can researchers avoid blind spots and cognitive biases, that is, the systematic errors in thinking that occur when people are processing and interpreting information, especially confirmation bias, selection bias, and the curse of knowledge [ 57 , 58 ], which can hinder both access to the participants’ experiential knowledge and the discovery of new useful knowledge.

The analysis of the structure of lived experience is the main goal of our method. The descriptive phase of the analysis is inspired by Colaizzi’s method [ 40 ], but our approach is systematic, and the structure of the experience is not an end but a means, since it is translated into concrete proposals for improving the health care pathway, treatment, and clinical research.

The IPSE method has five stages that structure the entire research process (Fig.  1 ).

figure 1

IPSE, a new method for qualitative research applied to clinical medical research, in 5 stages

Stage 1: setting up a research group

The research group always includes two physicians specializing in the topic, generally three researchers with expertise in qualitative methods, and, when possible, one or two patients who have experienced the phenomenon under study.

In total, an IPSE research group ideally contains five to seven members. This number is necessary to ensure that the study is rigorous and trustworthy.

The two physicians specializing in the topic are necessary because they both perform the systematic review (stage 2) and ground the entire research process in practical and theoretical medical knowledge without impeding the inductive process.

The three qualitative researchers collect the data (stage 4) and analyze it (stage 5, individual descriptive phase). We consider that three are necessary to avoid confining the data collection and analysis to the sole perspective of one researcher, or to a confrontation between only two opposing points of view.

As for the two patients: in both participatory [ 59 ] and heuristic approaches [ 16 ], participants are not viewed as study subjects but as co-researchers who are an integral part of the research process. IPSE strongly supports a more participatory approach in which service users should be directly involved in the qualitative medical research process [ 60 ], but with a different, innovative strategy: to integrate directly into the research group, when possible and appropriate, one or two patients who have experienced the phenomenon or had the disease under study. This strategy follows the same principles as those underlying the use of peer-support workers in psychiatric departments to allow a more person-centered and recovery-focused approach [ 4 ]. Participating patients require short training in qualitative research in general and the IPSE approach in particular, which we provide over 3 days before the research starts. Integrating patients within the research group is not mandatory, as it can sometimes be quite difficult to find patients willing to be trained and to participate. However, the research group must meet at least twice with “subjects of the experience”, that is, patients other than the participants who are in the same situation (contacted via patient associations, for example): (i) at the very beginning of the research, to develop a research question that really matters to the patients; and (ii) at the end, to obtain feedback on and validation of the results from the patients themselves.

This group oversees the entire research project, making all decisions collegially. We aim for heterogeneity in the group’s members, in terms of culture, knowledge, sex, age, occupation, and background. This diversity helps enrich the research at every stage, so that the results are more robust and relevant and not limited to a single perspective.

Stage 2: ensuring the originality of the study

We follow the common principles of good practice in research, one of which is that a study must always begin by examining the existing qualitative and quantitative literature on the subject. Inductive approaches generally assume that, to prevent interference from existing data, researchers beginning a qualitative study must not review the literature. On the other hand, there is little reason to replicate a qualitative study, since the importance of this type of research is measured by the novelty of the information it provides. It is thus important to avoid reinventing the wheel with each study, which would result in a literature overloaded with similar but slightly different or differently labeled concepts, what Morse describes as “theoretical congestion” [ 20 ]. Moreover, it is as important to ground the research in a rationale informed by the medical literature, for example, by specifying or redefining the study objectives, as it is to remain attentive to the unexpected and novel [ 52 ] in order to produce original findings. This epistemological issue is known as Meno’s paradox, enunciated by Socrates in Plato’s Meno: “We cannot look either for what we know, nor for what we do not know; what we know because, as we know it, we do not need to look for it, what we do not know because we do not even know what to look for” [ 61 ].

To resolve this conundrum, we have developed an original group procedure: the two physicians who are experts in the topic conduct a systematic review of the qualitative and quantitative literature to confirm the study’s relevance and originality. To remain inductive and open to novelty, as mentioned above, the other group members have access to this review only after the data analysis has been completed. The tragedy of modern knowledge is, as Morin stated, that “the exponential increase in knowledge and references … stands in the way of reflecting on knowledge” [ 62 ]. It is therefore important that the physicians share only the minimum knowledge necessary to inform the study, without impeding it through the curse of knowledge [ 57 ].

Stage 3: recruitment and sampling, aiming for exemplarity

After defining the research question, the group selects the study site or sites best able to optimize the feasibility of recruitment, depending on the study topic. It also defines the inclusion and exclusion criteria. In our method, sampling aims to attain exemplarity, that is, to select participants who, according to the research group (especially the physicians and the patients), have experienced quintessential, typical, or archetypal examples of the situation being studied. It thus uses purposive sampling, that is, it selects the subjects likely to provide the most information about the phenomenon studied [ 63 ]. Unlike other recruitment strategies in qualitative research (e.g., homogeneous or convenience sampling), we look for a variety of exemplary situations by including participants who might enrich and add something new to what was previously found. The patients included might thus differ by sex, age, social and family status, degree of involvement, disease history, comorbidities, duration of treatment, and outcomes. This enables a broader understanding of the phenomenon under study. Because the analysis takes place simultaneously with the data collection, the latter continues for as long as the analysis of the material continues to provide new information useful for exploring the topic. Sample size in qualitative research is not defined in advance. It is determined by data saturation, usually defined as the point at which the analysis of new material no longer yields new findings [ 64 ]. Saturation is, as Morse stated, “the key to excellent qualitative work” [ 65 ]. It is indeed an essential criterion of validity in qualitative research, especially for qualitative studies intended to lead to PRO development [ 66 ], as it ensures in-depth study of the phenomenon and suggests that further interviews are unlikely to produce new findings. This point has been heavily criticized, however, for it appears impossible to affirm saturation with certainty; that is, even if data saturation is a helpful idea for qualitative researchers, there are no pragmatic, consensual guidelines for determining when the point of data saturation has been reached [ 67 ].

For this reason, in line with grounded theory approaches, we prefer the concept of “theoretical sufficiency” [ 68 ]: data collection and analysis are complete when the researchers consider that the axes of experience obtained provide a sufficient explanatory framework for the data collected. Our sample size is always at least 20 subjects, a choice made to optimize the visibility of our work: it facilitates publication in specialized journals accustomed to the large samples of quantitative studies and randomized clinical trials and unfamiliar with qualitative research, in which sample size is not a criterion of methodological rigor.
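
Neither saturation nor theoretical sufficiency is decided by software; in IPSE the judgment is made collegially by the research group. Purely as an illustrative sketch of the bookkeeping that can support and document this judgment, the following snippet counts how many previously unseen categories each new interview contributes; the interview labels and category names are hypothetical.

```python
# Illustrative sketch only: bookkeeping a research group might keep to document
# theoretical sufficiency. Interview labels and category names are hypothetical;
# in IPSE the decision itself is made collegially, not by code.

def new_categories_per_interview(coding_log):
    """For each interview, in order, count the categories not seen before."""
    seen, counts = set(), []
    for interview, categories in coding_log:
        fresh = set(categories) - seen
        counts.append((interview, len(fresh)))
        seen |= set(categories)
    return counts

coding_log = [
    ("P01", {"loss of valued activities", "visibility to others"}),
    ("P02", {"loss of valued activities", "compensation strategies"}),
    ("P03", {"visibility to others", "reminder of the disease"}),
    ("P04", {"compensation strategies", "reminder of the disease"}),
]

for interview, n_new in new_categories_per_interview(coding_log):
    print(f"{interview}: {n_new} new categories")
```

When the count stays at zero over several consecutive interviews, the group can discuss whether the axes of experience already provide a sufficient explanatory framework.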

Stage 4: data collection, access to experience

The quality of the data collection determines the quality of the results. The goal is to reach the narrative of the experience [ 69 ]. The tool used to obtain this narrative, whether individual interviews or focus groups, always depends on the context. Researchers in charge of the data collection procedure should always consider the risk that face-to-face questioning might limit the subjects’ ability to narrate, that is, to talk about, a subject so deeply personal, especially for vulnerable persons [ 70 ], and should adapt the data collection process to ensure they reach this narrative. Patients can find it difficult, even intrusive, to talk about their lived experience of a disease [ 71 ]. Most of the time, the IPSE data collection procedure relies on visual narrative support for the participants, aimed at enhancing the narrative by reducing inhibitions. This support may be a photograph, a clinical vignette, or a short video clip directly related to the experience under study. It also facilitates the relationship and communication between the researcher and the subjects [ 70 ].

Photo-elicitation is the visual narrative support method we use most often [ 72 ]. This tool invites participants to react to a picture, ideally one they took themselves or, sometimes, one chosen with caution by the research group to introduce the object of the study without influencing the participants. The positive effects of photo-elicitation on the research process have been widely described in the qualitative literature: it improves the quality of the data collected [ 73 ]; it promotes active cognitive involvement and better participation in the research [ 74 ]; and, when the participants take the picture themselves, it empowers them by putting them in a more active position and thereby giving them the opportunity to influence the research process more strongly [ 75 ].

In interviews, the qualitative researchers systematically start by asking the participants to comment on and react to the experience-related visual support. Participants usually begin speaking about it and spontaneously continue, especially about their own experience.

The researchers then move on to open-ended questions [ 76 ], structured by the areas to explore, which are developed in turn from: (i) a reading of two pilot interviews, which do not form any part of the data analyzed for the study; (ii) the qualitative researchers’ own thoughts, each with different insights according to his or her own explicated preconceptions on the subject; and (iii) the knowledge and representations of the physician experts in the topic. The group collectively chooses these areas, but they may be modified throughout the research process by each interview conducted. The interviewers use an interactive conversational style [ 77 ]. In an IPSE study, participants are considered the experts on their own experience. Qualitative researchers must conduct interviews that offer them the opportunity to recount it. In practice, they use prompts based on the “life-world” [ 78 ], a phenomenological concept with five dimensions (i.e., lived body, lived time, lived space, otherness, and selfhood) through which the everyday actions and thoughts of the participants can be explored. All interviews are recorded and then transcribed verbatim, including the nonverbal aspects (e.g., pauses, hesitations, and laughter). The data are anonymized. The transcripts obtained are the object of the analysis.
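
The paper prescribes no particular tooling for transcription or anonymization. Purely as a hypothetical sketch of what an anonymized, verbatim transcript segment with nonverbal annotations might look like in practice, consider the following; the speaker labels, names, codes, and markers are invented.

```python
# Hypothetical sketch: a verbatim transcript segment with nonverbal cues kept in
# brackets, plus a simple name-to-code substitution for anonymization.
# All names, codes, and content are invented for illustration.
import re
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str  # "interviewer" or "participant"
    text: str     # verbatim text, nonverbal cues kept in square brackets

def anonymize(text: str, name_map: dict) -> str:
    """Replace each real name with its study code (whole words only)."""
    for name, code in name_map.items():
        text = re.sub(rf"\b{re.escape(name)}\b", code, text)
    return text

name_map = {"Marie": "P07", "Dr Dupont": "PHY02"}
segment = Segment(
    speaker="participant",
    text="Since the diagnosis [pause] Marie helps me... Dr Dupont explained it [laughs].",
)
segment.text = anonymize(segment.text, name_map)
print(segment.text)  # Since the diagnosis [pause] P07 helps me... PHY02 explained it [laughs].
```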

Stage 5: data analysis, from the structure of the experience to its translation into concrete proposals

Our analytic process is rigorous, detailed, systematic, and shareable. It relies on an inductive, phenomenological method and is intended to lead to concrete suggestions for improving aspects of treatment and of the health care pathway. It has two stages: one of independent work by individual researchers, and one of collective pooling of the data by the group (Fig.  2 ).

figure 2

Stage 5 of IPSE: from the demonstration of the structure of the experience to its translation into concrete proposals. a. The procedure of each researcher, initially individual, corresponds to the descriptive analysis phase, including: i) listening to and reading the interview; ii) cutting up the text into descriptive units and then regrouping them into categories. This operation is performed for each of the 20 interviews, which are analyzed transversally. b. The structuring phase involves a group procedure (at least 3 researchers) with regular pooling of the data and analysis, during which theoretical sufficiency is assessed. During this phase, the axes of experience are produced and the group determines the central axes of experience, which result in the proposal of a structure of the lived experience. Finally, the practical phase leads from triangulation with the literature to concrete proposals (guidelines, PROs)

Individual procedure: the descriptive phase

At least three of the qualitative researchers independently and simultaneously conduct a systematic descriptive analysis aimed at conveying each participant’s experience. This procedure is not original; it has already been proposed in other phenomenological approaches, including IPA [ 15 ] and Colaizzi’s method [ 40 ]. In fact, our descriptive analytic phase is inspired by Colaizzi’s analytical procedures and leads to the drafting of a structure of the experience, related to what Colaizzi named the “fundamental structure of the phenomenon” [ 79 ].

For each interview, this involves:

Listening and reading: we suggest that the qualitative researchers analyzing the data listen to the recorded interview twice: once without taking notes, and a second time while taking notes throughout. They can refer to these notes during the analysis and share them (or not) during the group process. Next, they read the interview transcript three times, taking notes at each reading. This process immerses the researchers in each participant’s expressive style and enables an overview of the narrative. These numbers of listenings and readings are of course suggestions; the choice remains the decision of each researcher. For instance, some researchers prefer to transcribe their own data and do not need to listen and read as much as a researcher discovering the data for the first time. In our experience, however, for data not transcribed by the researcher, listening twice and reading three times is a good compromise that allows researchers to become thoroughly immersed in the material without wasting too much time.

Exploring the experience word by word: the researchers explore the interview meticulously and cut up the entire text into segments of one or several words, called descriptive units (Table  1 ). These descriptive units are not pre-established and remain as close as possible to the participants’ words.

Regrouping the descriptive units into categories: the units are categorized, that is, they are regrouped according to their proximity of meaning and experience, wherever they may appear in the interview. This reorganization reveals the framework of the participants’ experience (Fig. 4).

These stages are carried out with the help of QSR NVivo 12 software, which is used to create and assemble the descriptive units and to provide graphic support for their reorganization. This descriptive analysis is performed separately for each interview. Progressively, each researcher then independently analyzes all of the interviews explored thus far cross-sectionally, regrouping similar categories and excluding none of them.
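
The coding itself is done in NVivo, and IPSE involves no programming. Purely to illustrate the kind of data structure this phase produces, the sketch below maps hypothetical categories to the verbatim descriptive units regrouped under them; it does not use or represent NVivo's own API.

```python
# Purely illustrative: the structure produced by the descriptive phase, mapping
# each category to the verbatim descriptive units regrouped under it. The
# content is hypothetical; in practice this work is done in NVivo, not in code.
from collections import defaultdict

# (interview, verbatim descriptive unit, category assigned by the researcher)
coded_units = [
    ("P01", "I can't play the piano any more", "loss of valued activities"),
    ("P01", "people look at my hands first", "visibility to others"),
    ("P02", "I open jars with a special tool", "compensation strategies"),
    ("P03", "my hands remind me of it every day", "reminder of the disease"),
]

categories = defaultdict(list)
for interview, unit, category in coded_units:
    categories[category].append((interview, unit))

for category, units in categories.items():
    print(category)
    for interview, unit in units:
        print(f"  [{interview}] {unit}")
```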

Group process: the structuring phase and then the practical phase

During this group process phase, the three researchers who segmented and coded the text meet and work with the other group members, that is, the physician experts in the topic, the other qualitative researchers, and the patients, all of whom have familiarized themselves with the data (by listening to and reading all the interviews as many times as necessary) without performing the descriptive analysis. The group’s heterogeneity has a heuristic function essential for the construction of the results: it enables all group members to co-construct the important points of the experience by linking a set of implicit perspectives made up of each member’s culture, theory, knowledge, sex, function, and background. The three researchers meet with the rest of the group after the analysis of five interviews, then 10, then 15, then 20, and so on, to share the categories that have been uncovered and assess their theoretical sufficiency. In practice, mostly for organizational reasons, group meetings can occur after the analysis of more or fewer than five interviews; however, we recommend this rhythm of meetings on the basis of our experience: we found that sharing the content of the descriptive analysis of more than five interviews can result in a superficial sharing that impairs the quality and originality of the results, while more frequent meetings appeared to be time-consuming without providing a richer data analysis.

The structuring phase

In practice, during these two-hour meetings, the group must:

Regroup the categories into axes of experience: the reorganization of the categories must uncover the framework of the participants’ experience, which we call its axes. These axes must be constructed such that each can be linked to its underlying categories. The names of the axes should make the results easy to read and should highlight the original and relevant points. In practice, the names, the number, and the content of the axes may well change several times before the structure is finalized (Fig.  3 ).

Determine the structure of lived experience characterized by the central axes: this is a delicate, iterative operation that involves an important dimension of choice. Exhaustive, unranked results may dilute the original points and the new information, thus impeding any translation of the results into direct implications. The final structure of lived experience does not reflect the many intermediate stages required to reach it, stages during which some axes of experience are regrouped and some even abandoned. It is very important during this phase that the physicians who analyzed the literature consider and discuss the originality and relevance of each axis or, on the contrary, its redundancy or triviality in light of the literature.

figure 3

Intermediate stages of the structuring phase (from axes of experience to the structure of lived experience). The objective of the structuring phase is to produce a proposed structure of the lived experience. The three researchers meet with the rest of the research group. This collective group procedure can be defined as a co-construction by the researchers of the important points of the experience. a. It involves sharing all the categories and organizing the axes of experience obtained during the descriptive phase, sometimes renaming the axes, sometimes moving categories from one axis of experience to another. b. This is an intermediate stage of organizing or naming the axes. The structuring phase is an iterative process with an important dimension of choice. Some axes of experience are regrouped or discarded in order to determine the central axes of experience, ending with c., the proposed structure of the lived experience

figure 4

Regrouping of codes into categories with NVivo 12

At the end of these two stages, the whole group writes up a proposed structure of the experience within the context in which it was explored, composed of its central axes of experience; this will be the study’s results section.
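
Again, the structuring phase is a collegial discussion rather than a computational step. As a hypothetical sketch of the hierarchy it produces, categories can be pictured as grouped into named axes of experience, with the central axes retained in the final structure flagged; the axis and category names below are invented.

```python
# Hypothetical sketch of the output of the structuring phase: categories
# regrouped into named axes of experience, with the central axes flagged.
# Axis and category names are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class Axis:
    name: str
    categories: list = field(default_factory=list)
    central: bool = False  # retained in the final structure of lived experience

axes = [
    Axis("The loss of what mattered before the disease",
         ["loss of valued activities", "compensation strategies"], central=True),
    Axis("Hands as the visible face of the disease",
         ["visibility to others", "reminder of the disease"], central=True),
    Axis("Relations with the care team",
         ["trust in the physician"], central=False),  # regrouped or set aside
]

structure = [axis for axis in axes if axis.central]
for axis in structure:
    print(f"{axis.name}: {', '.join(axis.categories)}")
```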

Practical phase

The objective of this phase is the signature of the IPSE method and is concordant with Thorne’s interpretive description [ 21 ]: the translation of the findings into proposals about the health care pathway, their clinical implications, and perspectives for further research. Accordingly, the results may propose, for example, interview supports or practical recommendations that physicians can use directly with these patients.

A process of triangulation with the data in the literature completes the analytic process. This is not a process original to IPSE, of course, but rather a good practice that should be followed in all qualitative health research [ 20 , 22 ]. The physician experts in the topic, who are responsible for the literature review, now share with the group their in-depth analysis of the literature from several databases (PubMed, PsycINFO, CINAHL, and Web of Science (SSCI)) to identify the original aspects of the results and their differences from and similarities with the literature.

The completed study is then reported in a scientific article that meets the COREQ criteria [ 80 ]. COREQ is a useful checklist of items that should always be included in reports of qualitative research. Although some items are directly related to criteria for analytic rigor, such as reflexivity or triangulation, the COREQ checklist provides an outline for reporting important aspects of qualitative research and not guidelines for performing it.

Rigor and methodological quality of IPSE

We identified and operationalized seven methodological points to ensure the quality and rigor of IPSE studies. Some are already recognized criteria of rigor for qualitative research (triangulation, attention to negative cases, transferability, reflexivity); others are innovative and specific to IPSE:

Patient involvement and feedback from the “subjects of the experience”

We have already mentioned that some patients should be part of the research group, or at least be invited to help develop the research question and to give feedback about the results. For the latter point, many authors, including Colaizzi, recommend that participants validate the results. There are practical difficulties in obtaining feedback from the entire group of participants. Our method offers a methodological innovation by progressively replacing feedback from the study participants with presentations of the study to, and conversations with, other subjects of the experience. This action ensures the credibility of the results and supports their transferability. The results obtained by the IPSE method have a singular status: accessible and expected, but also uncovered and surprising. The expected effects are both agreement and surprise, through the uncovering of evidence “hidden” until then. The subject of the experience should be able to say: “it’s exactly that, but I had never formulated it like that”.

Triangulation

This concept refers to the use of multiple methods or data sources as a rigorous procedure to ensure a global understanding of the phenomenon under study [ 81 ]. There are four types of triangulation: “method triangulation, investigator triangulation, theory triangulation, and data source triangulation” [ 82 ]. In an IPSE study, at least three researchers are involved in data collection and in the individual analytical procedures; several data collection techniques can be used in the same study (for instance, individual interviews and focus groups); and triangulation with the literature is systematically carried out.

Attention to negative cases

Particular attention must be paid to cases in which new elements differ radically from the emerging structure of the experience. Most of the time, these negative, sometimes contradictory, cases are integrated into the results. If a case differs completely from the proposed structure of the experience, we consider that theoretical sufficiency has not been reached and conduct new interviews and analyses.

The question of the choice of the central axes of experience

Our objective is not knowledge for its own sake, but knowledge for improvement in patients’ care and in their lives. The choices, always guided by this objective, are determined by the relevance and not the recurrence of the axes of experience.

Researchers’ subjectivity and reflexivity

The issue of reflexivity must be addressed. It can be defined as the researchers’ reflection on their role in the study and on its effects on their findings at every step of the research process [ 56 ]. A recurrent hazard in qualitative research is that the results become the reflection or confirmation of the researchers’ preconceptions and beliefs [ 83 ]. The process of reflexivity enables researchers to avoid the pitfalls of applying their own preconceptions and assumptions to the material. They must take care to clarify their position, as much in their encounter with the material to be analyzed as in the research group meetings. To do this, the researchers involved in the study must answer two questions:

(i) What are my preconceptions and beliefs about the phenomenon under study and the research question? To address this question, they must each list all of their preconceptions and beliefs, first to themselves and then by sharing them with the group.

(ii) What are my expectations regarding this study? The researchers must question themselves and each other about their personal motives for being part of the research and about what they expect to find or achieve.

This reflexive position is worked on constantly in the group, during open discussions between the researchers. More than in the field notes, it is in the conversations, exchanges, and discussions between the researchers that reflexivity accomplishes its work.

Transferability

Qualitative research studies are performed in specific contexts. What matters for their results, however, is that they are transposable [ 84 ]; in our IPSE approach, this means that the structure of the experience is transferable, that is, that it resonates with what other patients experience beyond the context of the study. The assessment of the transferability of results ultimately lies with readers, who must decide whether the setting of the study is sufficiently similar for its results to be transferable to their own context [ 85 ]. We ensure the transferability of the structure of the experience by obtaining feedback from patient associations or other representative groups. Also, as shown by the creation of a PRO tool in scleroderma validated by a quantitative psychometric study (currently being written up), validation in a large sample of other subjects demonstrates the transferability of our results. A tool developed and structured by our method, focusing on the lived experience of patients, appears more transferable and closer to reality than tools based on the theories and inferences of the professionals who create them [ 86 ].

The language of the analysis

It seems important to specify that our position is to anchor our research work in the participants’ language as well as their words. The research is conducted entirely in French, the language of the participants, and the researchers develop and write the results in this same language. Finally, at the last stage, the article is sent to a bilingual (English-French) professional scientific translator, and the authors and translator consult frequently to ensure that the words and meaning stay as close as possible to those of the participants.

The use of the IPSE approach has provided original findings with practical implications, such as (1) the development of PROs focusing on areas not yet covered by existing scales, (2) clinical recommendations concerning assessment and treatment, (3) innovative ways to improve communication between patients and doctors, and (4) new insights for medical pedagogy.

PRO development

The lived experience of hand involvement in patients with systemic sclerosis (SSc).

Table 2 presents an exemplary IPSE study exploring hand involvement among patients with SSc. The structure of lived experience described in our results revealed that the distress of patients dealing with functional impairment of their hands is linked especially to the loss of what had been important parts of their lives before the disease (leisure activities and hobbies, work, a musical instrument, a family activity). In other words, the intensity of the functional impact was related to “what I can no longer do” rather than to “what I cannot do.” The existing scales either focus on the very targeted assessment of individual components of hand involvement in SSc (the Raynaud’s Condition Score [ 87 ], the Hand Mobility in Scleroderma Scale [ 88 ], and the Delta Finger to Palm [ 89 ]) or are generic functional scales validated for this disease, evaluating the functional impact of this involvement and its daily repercussions at home, at work, and on QoL (the Cochin Hand Function Scale [ 90 ], the Arthritis Hand Function Test [ 91 ], and some specific items of QoL scales for scleroderma, such as the Scleroderma Health Assessment Questionnaire [ 92 ]). These scales evaluate the ability to perform some actions of daily life, such as cleaning or getting dressed, but do not include the dimension of a function the patients once had, which had been important and is now lost. Here, access to the lived experience and its rigorous analysis make it possible to show that it is this loss of function that is painful. Similarly, our results show that patients develop strategies to compensate for this functional involvement, which then no longer presents a problem in their daily life, although functional scales continue to detect it as a functional problem.

The question of the visibility of their hand impairment, to themselves and to others, was also crucial in the lived experience of the participants. The esthetic repercussions on QoL in SSc have already been explored with dermatology QoL scales, such as the Dermatology Life Quality Index questionnaire [ 93 ] or the Satisfaction with Appearance Scale (SWAP) [ 94 ], and the esthetic impact of hand involvement can be assessed by the 6-item Brief SWAP, validated for SSc [ 95 ]. The IPSE approach enabled the emergence of two original results about the visibility of hand involvement: (i) the permanent exposure of the disease to the eyes of others, both in social interactions and in more personal relationships, underlines its effect on relationships; (ii) this visibility serves as a permanent reminder of the disease to the patients, inducing constant concerns about their survival and their existence. Beyond the functional impact of hand involvement in SSc and its esthetic repercussions, relational and existential aspects directly associated with patients’ emotional distress also appeared to be an important part of patients’ experience. The various qualitative studies exploring the sources of emotional distress among people living with SSc [ 96 , 97 , 98 , 99 , 100 , 101 , 102 , 103 ] have never explored the lived experience of hand involvement in this disease. These emotional, relational, and existential aspects have never been described specifically for hand involvement in either SSc or other autoimmune diseases affecting the hands, and no scale used to assess hand involvement contains items assessing these aspects.

The IPSE approach has thus made it possible to enrich the data available on the lived experience of SSc patients with hand involvement. The current scales, obtained from questionnaires constructed without exposure to patients’ lived experience, inform clinicians especially about the functional dimension of their patients’ disease but do not allow them to provide comprehensive management that covers what matters from the patients’ perspective. Taking this perspective into account is especially important because this is a chronic disease that cannot be cured and has no specific treatment. Our results thus allowed us to construct an appropriate, relevant, and hitherto missing PRO tool, the HAnDE scale, intended to detect the different dimensions of hand involvement in SSc: functional (including loss of function), emotional, relational, existential, and esthetic. Based on our results and using the vocabulary of the patients we interviewed, we generated 18 items associated with their lived experience. A final version of 16 items was subsequently obtained during a validation study with 105 patients, which showed the relevance of the scale for assessing the global experience of hand involvement in patients with SSc. This new PRO scale will be considered as an outcome measure in future trials. This illustrates the direct integration of IPSE studies into EBM.
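
The psychometric procedures of the HAnDE validation study are not detailed here. Purely as an illustration of one common step in trimming a draft PRO item set, the sketch below computes Cronbach's alpha and corrected item-total correlations on simulated responses; the data, the 18-item layout, and the 0.30 retention threshold are all hypothetical and are not taken from the HAnDE study.

```python
# Illustrative only: Cronbach's alpha and corrected item-total correlations on
# simulated data, a common step when trimming a draft PRO item set. The data
# and the 0.30 threshold are hypothetical, not those of the HAnDE validation.
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_items = 105, 18
latent = rng.normal(size=(n_patients, 1))                    # simulated common trait
scores = latent + rng.normal(scale=1.0, size=(n_patients, n_items))
scores[:, -1] = rng.normal(size=n_patients)                  # one unrelated item, to show flagging

def cronbach_alpha(x: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

print(f"alpha for the 18-item draft: {cronbach_alpha(scores):.2f}")

# Corrected item-total correlation: each item against the sum of the other items.
for i in range(n_items):
    rest = np.delete(scores, i, axis=1).sum(axis=1)
    r = np.corrcoef(scores[:, i], rest)[0, 1]
    if r < 0.30:  # hypothetical retention threshold
        print(f"item {i + 1}: corrected item-total r = {r:.2f} (candidate for removal)")
```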

PRO development in the field of child and adolescent psychiatry

Pediatric PRO assessment is a very recent field of research, and empirical evidence about quantitative instruments within this age-specific population is still scarce [ 104 ]. We are currently constructing two adolescent psychiatry PRO scales based on the results of two qualitative studies: (i) one about the treatment of adolescents with anxiety-based school refusal [ 105 ], and (ii) another assessing the therapeutic alliance in the treatment of adolescents with anorexia nervosa [ 106 ].

The first study, which explored how 20 adolescents with anxiety-based school refusal and 21 of their parents experienced psychiatric treatment, revealed some divergences between the two groups in their perception of efficacy and enabled us to construct a PRO relevant for both adolescents and their parents. This tool integrates original aspects found in our study: assessment of both external (return to school) and internal (self-transformation) goals of care, the duration of care (an effective treatment as rapidly as possible vs. the need for a treatment period sufficiently long to allow adolescents to change and develop), and unexpected care-linked relationships.

The second study explored the experience of the therapeutic alliance among 15 adolescents with anorexia nervosa (AN), 18 of their parents, and their 8 psychiatrists. Crossing these three perspectives made it possible to identify aspects that are missing from the scales currently used to measure therapeutic alliance in the treatment of AN in adolescents, such as the AWAI or the HAQ-CP [ 107 , 108 ]: parents’ negative representations of a “psychiatry” that focuses on somatic aspects of treatment, and the omnipresence of the issue of relationships. Items on relationships in the scales currently used concern only the relationship with care providers and focus on its perceived quality. In our results, relationships are involved in all three components of the definition of therapeutic alliance: the quality of the association, the objective of treatment, and the means to achieve this objective. Similarly, the current scales do not mention the role of the adolescent–parent relationship, which appeared to play a key role in the construction of the therapeutic alliance in our results. We are therefore developing a PRO scale on this topic that includes specific items on: (1) the quality of the relationship with staff as a means of getting better, (2) the impact on the alliance of parental involvement in treatment, (3) the impact of treatment on the parents’ point of view about the adolescent and on the relationship between adolescents and their parents, and finally (4) agreement about the objectives for improving family relationships generally and adolescent–parent relationships in particular.

Clinical recommendations

Clinical recommendations drawn from the results of IPSE studies are intended directly for the physicians themselves.

Assessment and diagnosis

Clinical implications from IPSE studies can be innovative, concrete guidelines, supported by the structure of the lived experience of patients and other stakeholders, to improve the quality of clinical assessment and of the diagnostic process.

For instance, based on the results of the study of hand involvement in SSc and using the same rationale presented above, we provided clinicians with clinical implications regarding the assessment of hand involvement among patients with SSc: (i) clinicians must assess this involvement globally and not by segmenting the evaluation with several scales that target mainly functional involvement; (ii) they should routinely evaluate the hand functions that patients used in ways important to them and have now lost, as well as the impact of the visibility of the disease due to this hand involvement; and (iii) they should also explore the esthetic, emotional, relational, and existential issues that result.

In another study, we explored the experience of the diagnostic pathway among 20 patients with acromegaly, a rare disease with a substantial diagnostic delay. Our results revealed direct associations between diagnostic delay and the doctor–patient encounter [ 28 ]. The literature has already emphasized the key role of any doctor, regardless of specialty, in identifying the signs, symptoms, and comorbidities of acromegaly by becoming involved and “seeing the unseen” [ 109 ]. To identify the disease as early as possible, however, our results suggest that physicians must allow themselves to question the patient proactively and to consider clinical processes beyond their own specialty; in other words, seeing the unseen is not enough if physicians do not say the unsaid.

Therapeutic implications

The IPSE approach is particularly well suited to exploring complex therapeutic processes and the perceived efficacy of treatment. The approach enables the description of therapeutic levers and efficacy criteria directly relevant to patients and other stakeholders and could contribute to achieving a more person-centered medicine.

Improving patients’ lives

We conducted two studies to explore the lived experience of cancer treatment; one crossed the perspectives of patients ( N  = 30), their families ( N  = 30), and their oncologists ( N  = 10) [ 30 ], while the other focused on what affects the quality of daily life of patients with cancer ( N  = 30) during active treatment [ 110 ]. Our results led to some clinical recommendations for achieving patient-centered cancer treatment, that is, that physicians integrate the dimension of care into the curative treatments performed so that patients live as well as possible and not simply as long as possible [ 30 ]. To achieve this task, we found an original therapeutic lever that acts like a relational tool for physicians: the support object, defined as an object, a relationship, or an activity in which patients are particularly invested in their daily lives, which makes them feel good and makes the cancer and its treatment bearable. When patients are able to choose and be involved with a support object, the physician must support them and converse with them on this topic to help them maintain this investment throughout the health care pathway and to establish a trusting relationship, thereby, according to our results, improving their quality of daily life without using up very much of the physician’s time [ 110 ].

Improving families’ lives

Therapeutic implications drawn from IPSE studies can also concern families and relatives. For example, we conducted a study to explore how 20 older siblings described and perceived the care received by their brother or sister in child psychiatric centers specializing in the management of children with Autism Spectrum Disorder [ 111 ]. The literature has already recognized the need for both global family-centered treatment approaches [ 112 ] and specific programs intended for the siblings of children with serious diseases [ 113 ] to help them cope with their brother’s or sister’s condition, but our study revealed that when older siblings play and claim a role in helping and caring for the child with ASD, they benefit from their empowerment and involvement in this treatment, and physicians benefit from their perspective on it.

Recommendations addressing the care pathway

For the study of anxiety-based school refusal mentioned above [ 105 ], we were able to provide recommendations about the outcomes and the duration of care: treatment must last long enough, in a place dedicated to care, to allow adolescents to become involved in their care and to reflect on the personal changes they need, but also to offer them the possibility of multiple human encounters, some of which, expected or unexpected, will turn out to be decisive in their development. Treatment should strive to combine and coordinate two outcomes of equal importance: a rapid return to school for the parents, and a sufficiently long time in care to enable self-transformation for the adolescents.

Patient-physician communication

Using the structure of lived experience.

First of all, the systematic undervaluation of symptoms by physicians reveals some distortion in physician-patient communication [ 114 , 115 , 116 ]. Integrating patients’ points of view into their management, through the intermediary of the structures of experience obtained with IPSE, would improve this communication. In an ideal context of shared medical decision-making, the involvement of patients in their management requires that they receive complete and appropriate information on which they can base their choices. Complete information requires that they have been heard and that their narrative of their disease has been considered. In our study of hand involvement in SSc, we systematically provided feedback to the clinicians so that they could apply the new results uncovered by our exploration.

Improving communication, reducing confusion

The IPSE approach, especially when crossing perspectives, can also provide innovative ways to improve communication between patients and doctors.

We conducted a study among 20 patients and 10 physicians aimed at exploring the experience of neuroendocrine tumors (NETs), gastrointestinal tumors characterized by their rarity, the difficulty of their diagnosis, their often relatively favorable prognosis, and their complex and long management [ 29 ]. The primary, and original, result of this study was the prominent experience of confusion found among patients. We provided a statement that all physicians can use to support patients diagnosed with NETs and reduce their confusion, especially the semantic confusion, because it explicitly uses the term cancer. This communication tool meets patients’ needs (i.e., regarding the silent symptomatology, the name of the disease, its evolution, treatment, and monitoring), including the need to improve patient-physician communication. It has been used in specialized medical consultations and also in training sessions for medical trainees in oncology and gastroenterology.

Medical pedagogy

Here again, the structure of experience can always serve directly as training support, providing medical students with relevant information about how patients experience both their disease and their care. The IPSE approach can also reveal specific needs or gaps in physicians’ training and provide new insights.

Revealing training needs

In our study crossing the perspectives of patients with cancer, their families, and their oncologists, we found that physicians had difficulty dealing with patients’ negative emotions during consultations and that this could be a barrier to their access to the factors that improve patients’ capacity to live as well as possible. We suggested that physicians dealing regularly with patients with cancer should receive specific medical education that directly addresses the issues of recognizing, eliciting, coping with, and using patients’ feelings as a therapeutic tool.

New insights for medical pedagogy

In our study of the diagnosis of acromegaly, patients reported having faced deficiencies in the medical world’s awareness of the disease. Indeed, acromegaly is a rare disease that doctors see very rarely and are therefore unlikely to think about. Our results led us to suggest the intervention of patient experts [ 2 ] in medical schools, so that students can hear about their experience of the diagnostic errors that led to diagnostic delay and about the disease’s early and current clinical signs. Future doctors who have received such training will be more aware of the need for a high level of suspicion and active questioning to reach a diagnosis and should thus be more likely to consider the signs observed or reported as potential indicators of acromegaly.

All the studies presented here, and their practical implications, focus on the day-to-day clinical practice of physicians: relationships and communication in care, duration of care, therapeutic alliance, care issues, and outcomes. These aspects, related to the patient’s subjectivity and the patient-physician relationship, are very often forgotten or even excluded from medical research [ 20 ]. We consider, along with other scholars in qualitative health research [ 20 , 21 ], that many advances in medicine and patient care are impossible until qualitative methods are fully integrated into the research arsenal. Qualitative research should have, but does not yet have, a major role to play in clinical medical research; the use of these methods remains a minority, even marginalized, option. Many medical researchers still apply a hierarchy between research methods, based on a paradigm confusion, according to the sole presence of quantitative research criteria such as sample size, objectivity, and reproducibility; they inaccurately conclude that qualitative research is inferior and reduce it to a secondary role, always conceived of in the context of mixed-method research [ 117 ]. In other words, qualitative medical research is a victim of the burden of proof [ 118 ] and of the tyranny of the average [ 119 ].

For qualitative research to be able to fully contribute to medical research, it requires better recognition and appreciation from the entire medical community. This is what the IPSE approach is trying to achieve by staking out a medical position within a rigorous and systematic qualitative method.

The IPSE approach is to be integrated within EBM through mixed-method study designs resulting from a pragmatic and mutually enriching partnership between qualitative and quantitative methods [120].

Setting up research groups involving both physicians and patients is an innovative and original idea. It multiplies the perspectives and enriches the data and results. Moreover, patient involvement helps to direct research towards person-centered medicine and, finally, it allows the research process to maintain an inductive approach that provides new results while remaining anchored in relevant medical issues.

Almost everyone agrees that it is important to understand what patients and other caregivers are going through. But to what extent? The first objective of an IPSE study should always be to achieve concrete improvements in the lives of patients (or of other stakeholders) by leading to practical changes, such as the development of PROs or the construction of health recommendations or policies, while respecting the fundamental principles of qualitative research.

However, the IPSE approach has some pitfalls. First, it is a very demanding and ambitious research method. An IPSE study is as constraining in terms of workload and time as any other clinical medical research study. It also requires abandoning the idea that qualitative research requires fewer human, financial, and technical resources. Second, in exploring an experience in depth in an interview, the researcher can expose the subject’s distress, especially when the question concerns the experience of a disease. This is an important ethical point: the researcher must systematically report this distress to the patient’s physician.

IPSE is an innovative method and an important contribution to current methodological developments aimed at improving the quality and rigor of qualitative research in the medical field, for it anchors the research in the lived experience of those involved in medical care (patients, families, professionals) and it proposes concrete suggestions based on the results, including the development of PROs. No structured qualitative method has previously recommended the direct incorporation of patient experience into PRO construction before the psychometric characteristics of the scale are assessed. IPSE is a qualitative method specific to clinical research in medicine, designed to enable the implementation of pragmatic improvements. Our approach, which gives the experience of all the stakeholders in medical care its necessary role in the research process, is part of the movement for collaborative person-focused medicine and can be integrated easily into mixed-methods study designs [ 1 ].

Availability of data and materials

The datasets analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

COREQ: Consolidated criteria for reporting qualitative research

EBM: Evidence-based medicine

HAnDE: Hand AutoimmuNe Disease lived Experience

IPA: Interpretative phenomenological analysis

IPSE: Inductive Process to Analyze the Structure of Lived Experience

PPI: Patient and public involvement

PRO: Patient Reported Outcome

QoL: Quality of life

SHAQ: Scleroderma Health Assessment Questionnaire

SSc: Systemic sclerosis

Truog RD. Patients and doctors — the evolution of a relationship. N Engl J Med. 2012;366:581–5.

Elberse JE, Caron-Flinterman JF, Broerse JEW. Patient-expert partnerships in research: how to stimulate inclusion of patient perspectives. Health Expect. 2011;14:225–39.

Santana MJ, Manalili K, Jolley RJ, Zelinsky S, Quan H, Lu M. How to practice person-centred care: a conceptual framework. Health Expect. 2018;21:429–40.

Sticher L, Bonsack C. Peer support workers: a novel profession in psychiatry. Rev Med Suisse. 2017;13:1614–6.

Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. EBM: what it is and what it is not. BMJ. 1996;312:71–2.

FDA-NIH Biomarker Working Group. BEST (Biomarkers, EndpointS, and other Tools) Resource. Bethesda: Food and Drug Administration (US); National Institutes of Health (US); 2016. Available at: https://www.ncbi.nlm.nih.gov/books/NBK338448/def-item/measurement/ . Site consulted May 13, 2019.

Germain N, Aballéa S, Smela-Lipinska B, Pisarczyk K, Keskes M, Toumi M. Patient-reported outcomes in randomized controlled trials for overactive bladder: a systematic literature review. Value Health. 2018;21:S114.

Lasch KE, Marquis P, Vigneux M, Abetz L, Arnould B, Bayliss M, Crawford B, Rosa K. PRO development: rigorous qualitative research as the crucial foundation. Qual Life Res. 2010;19(8):1087–96.

Food and Drug Administration. Guidance for industry: patient-reported outcome measures: use in medical product development to support labeling claims. Rockville; 2009.

Patrick DL, Burke LB, Gwaltney CJ, Leidy NK, Martin ML, Molsen E, Ring L. Content validity—establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation: ISPOR PRO good research practices task force report: part 1—eliciting concepts for a new PRO instrument. Value Health. 2011;14(8):967–77.

Patrick DL, Burke LB, Gwaltney CJ, Leidy NK, Martin ML, Molsen E, Ring L. Content validity—establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation: ISPOR PRO good research practices task force report: part 2—assessing respondent understanding. Value Health. 2011;14(8):978–88.

Glaser BG, Strauss AL. The discovery of grounded theory. Chicago: Aldine; 1967.

Savage J. Ethnography and health care. BMJ. 2000;321:1400–2.

Giorgi A. The descriptive phenomenological psychological method. J Phenomenol Psychol. 2012;43:3–12.

Smith JA, Flowers P, Larkin M. Interpretative phenomenological analysis: theory, method and research. Los Angeles: SAGE; 2009.

Moustakas C. Phenomenological research methods. Thousand Oaks: Sage publications; 1994.

Van Manen M. Phenomenology of practice: meaning-giving methods in phenomenological research and writing. New York: Routledge; 2016.

Pedersen AHM, Rasmussen K, Grytnes R, Nielsen KJ. Collaboration and patient safety at an emergency department - a qualitative case study. J Health Organ Manag. 2018;32:25–38.

Aguinaldo JP. Qualitative analysis in gay men’s health research: comparing thematic, critical discourse, and conversation analysis. J Homosex. 2012;59:765–87.

Morse JM. Qualitative health research: creating a new discipline. Walnut Creek: Left Coast Press; 2012.

Thorne S. Interpretive description: qualitative research for applied practice. New York and London: Routledge; 2016.

Pope C, Ziebland S, Mays N. Qualitative research in health care. Analysing qualitative data. BMJ. 2000;320:114–6.

Shuval K, Harker K, Roudsari B, Groce NE, Mills B, Siddiqi Z, et al. Is Qualitative Research Second Class Science? A Quantitative Longitudinal Examination of Qualitative Research in Medical Journals. PLoS One. 2011;6(2):e16937.

Braun V, Clarke V, Terry G. Thematic analysis. In: Rohleder P, Lyons AC, editors. Qualitative research in clinical and health psychology. Houndmills: Palgrave Macmillan; 2014.

Booth A. Harnessing energies, resolving tensions: acknowledging a dual heritage for qualitative evidence synthesis. Qual Health Res. 2019;29(1):18–31.

Mejdahl CT, Schougaard LMV, Hjollund NH, Riiskjær E, Thorne S, Lomborg K. PRO-based follow-up as a means of self-management support - an interpretive description of the patient perspective. J Patient Rep Outcomes. 2017;2:38.

Ploeg J, Matthew-Maich N, Fraser K, Dufour S, McAiney C, Kaasalainen S, et al. Managing multiple chronic conditions in the community: a Canadian qualitative study of the experiences of older adults, family caregivers and healthcare providers. BMC Geriatr. 2017;17:40.

Sibeoni J, Manolios E, Verneuil L, Chanson P, Revah-Levy A. Patients’ perspectives on acromegaly diagnostic delay: a qualitative study. Eur J Endocrinol. 2019;180:339–52.

Sibeoni J, Khannoussi W, Manolios E, Rebours V, Revah-Levy A, Ruszniewski P. Perspectives of patients and physicians about neuroendocrine tumors. A qualitative study. Oncotarget. 2018;9(18):14138–47.

Orri M, Sibeoni J, Bousquet G, Labey M, Gueguen J, Laporte C, et al. Crossing the perspectives of patients, families, and physicians on cancer treatment: a qualitative study. Oncotarget. 2017;8(13):22113–22.

Pope C, Mays N. Qualitative Research: Reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research. BMJ. 1995;311(6996):42–5.

Vrablik M, Catapano AL, Wiklund O, Qian Y, Rane P, Grove A, Martin ML. Understanding the patient perception of statin experience: a qualitative study. Adv Ther. 2019. https://doi.org/10.1007/s12325-019-01073-7 .

Denzin NK, Lincoln YS. Introduction: the discipline and practice of qualitative research. In: Denzin NK, Lincoln YS, editors. The Sage handbook of qualitative research. Thousand Oaks: Sage; 2011.

Pernecky T. Introduction: situating metaphysics and epistemology in qualitative research. In: Pernecky T. Epistemology and metaphysics for qualitative research. London: Sage; 2016. p. 3–32.

Mapp T. Understanding phenomenology: the lived experience. Br J Midwifery. 2008;16(5):308–11.

Kant I. The critique of pure reason (1781). New York: Start; 2012.

Charmaz K. Teaching theory construction with initial grounded theory tools: a reflection on lessons and learning. Qual Health Res. 2015;25:1610–22.

Husserl E. The idea of phenomenology. The Hague: Martinus Nijhoff; 1970.

Pernecky T. German idealism, phenomenology, and hermeneutics. In: Pernecky T. Epistemology and metaphysics for qualitative research. London: Sage; 2016. p. 84–109.

Colaizzi P. Psychological research as the phenomenologist’s view it. In: Vale R, King M, editors. Existential–phenomenological alternatives for psychology. New York: Oxford University Press; 1978. p. 48–71.

Revah-Levy A, Birmaher B, Gasquet I, Falissard B. The adolescent depression rating scale (ADRS): a validation study. BMC Psychiatry. 2007;7:2.

Ramalho J de AM, Lachal J, Bucher-Maluschke JSNF, Moro M-R, Revah-Levy A. A qualitative study of the role of food in family relationships: an insight into the families of Brazilian obese adolescents using photo elicitation. Appetite. 2016;96:539–45.

Harf A, Skandrani S, Radjack R, Sibeoni J, Moro MR, Revah-Levy A. First parent-child meetings in international adoptions: a qualitative study. PLoS One. 2013;8(9):e75300.

Harf A, Skandrani S, Sibeoni J, Pontvert C, Revah-Levy A, Moro MR. Cultural identity and internationally adopted children: qualitative approach to parental representations. PLoS One. 2015;10(3):e0119635.

Orri M, Paduanello M, Lachal J, Falissard B, Sibeoni J, Revah-Levy A. Qualitative approach to attempted suicide by adolescents and young adults: the (neglected) role of revenge. PLoS One. 2014;9(5):e96716.

Spodenkiewicz M, Speranza M, Taïeb O, Pham-Scottez A, Corcos M, Révah-Levy A. Living from day to day - qualitative study on borderline personality disorder in adolescence. J Can Acad Child Adolesc Psychiatry. 2013;22(4):282–9.

Lachal J, Speranza M, Taïeb O, Falissard B, Lefèvre H. QUALIGRAMH, et al. qualitative research using photo-elicitation to explore the role of food in family relationships among obese adolescents. Appetite. 2012;58(3):1099–105.

Perier A, Revah-Levy A, Bruel C, Cousin N, Angeli S, Brochon S, et al. Phenomenologic analysis of healthcare worker perceptions of intensive care unit diaries. Crit Care. 2013;17(1):R13.

Zahavi D. Getting it quite wrong: Van Manen and Smith on phenomenology. Qual Health Res. 2019;29:900–7.

Applebaum M. Moustakas’ Phenomenology: Husserlian? https://phenomenologyblog.com/?p=896 2013 Site consulted June 15 2020.

Creswell JW, Hanson WE, Clark Plano VL, Morales A. Qualitative research designs: selection and implementation. Couns Psychol. 2007;35(2):236–64.

Strauss A, Corbin J. Basics of qualitative research techniques and procedures for developing grounded theory. Thousand Oaks: Sage; 1998.

Starks H, Brown TS. Choose your method: a comparison of phenomenology, discourse analysis, and grounded theory. Qual Health Res. 2007;17:1372–80.

Tufford L, Newman P. Bracketing in qualitative research. Qual Soc Work. 2012;11:80–96.

Zahavi D. Applied phenomenology: why it is safe to ignore the epoché. Cont Philos Rev. 2019. https://doi.org/10.1007/s11007-019-09463-y .

Finlay L. Reflexivity: a practical guide for researchers in health and social sciences. Oxford: Blackwell Science Ltd; 2003.

Birch SAJ, Bloom P. The curse of knowledge in reasoning about false beliefs. Psychol Sci. 2007;18:382–6.

Hallihan GM, Shu LH. Considering confirmation Bias in design and design research. J Integr Des Process Sci. 2013;17:19–35.

Bradbury-Jones C, Isham L, Taylor J. The complexities and contradictions in participatory research with vulnerable children and young people: a qualitative systematic review. Soc Sci Med. 2018;215:80–91.

Cowley A, Kerr M, Darby J, Logan P. Reflections on qualitative data analysis training for PPI partners and its implementation into practice. Res Involv Engagem. 2019;5(1):22.

Plato [Jowett B]. Meno. Dover edition. New York: Dover Publicationc, Inc; 2019.

Morin E. La méthode 3. La connaissance de la connaissance. Essais, Seuil; 1986.

Patton MQ. Qualitative Research & Evaluation Methods. 3rd ed; 2001.

Hennink MM, Kaiser BN, Weber MB. What influences saturation? Estimating sample sizes in focus group research. Qual Health Res. 2019;29:1483–96.

Morse JM. The significance of saturation. Qual Health Res. 1995;5:147–9.

Kerr C, Nixon A, Wild D. Assessing and demonstrating data saturation in qualitative inquiry supporting patient-reported outcomes research. Expert Rev Pharmacoecon Outcomes Res. 2010;10:269–81.

Ness L, Fusch P. Are We There Yet? - Data Saturation in Qualitative Research. Qual Rep. 2015;20(9):1:1408–16.

Dey I. Grounding grounded theory: guidelines for qualitative inquiry. San Diego: Academic Press; 1999.

Morse JM. Methods most frequently used in Qualitative Health research. In: Morse JM, editor. Qualitative health research: creating a new discipline. Walnut Creek: Left Coast Press; 2012. p. 84–9.

Sibeoni J, Costa-Drolon E, Poulmarc’h L, Colin S, Valentin M, Pradère J, et al. Photo-elicitation with adolescents in qualitative research: an example of its use in exploring family interactions in adolescent psychiatry. Child Adolesc Psychiatry Ment Health. 2017;11:49.

Holloway I, Wheeler S. Ethical issues in qualitative nursing research. Nurs Ethics. 1995;2:223–32.

Harper D. Talking about pictures: a case for photo elicitation. Vis Stud. 17(1):13–26.

Pain H. A literature review to evaluate the choice and use of visual methods. Int J Qual Methods. 2012;11(4):303–19.

Guillemin M, Drew S. Questions of process in participant-generated visual methodologies. Vis Stud. 2010;25(2):175–88.

Oliffe JL, Bottorff JL. Further than the eye can see? Photo elicitation and research with men. Qual Health Res. 2007;17(6):850–8.

Britten N. Qualitative research: qualitative interviews in medical research. BMJ. 1995;311:251–3.

Mason J. Qualitative Interviewing: Asking, Listening and Interpreting. In: May T, editor. Qualitative Research in Action: SAGE Publications Ltd; 2002. p. 226–41. https://doi.org/10.4135/9781849209656.n10 .

Husserl E. The crisis of European sciences and transcendental phenomenology: an introduction to phenomenological philosophy. Evanston: Northwestern University Press; 1970.

Park K-O, Park S-H, Yu M. Physicians’ experience of communication with nurses related to patient safety: a phenomenological study using the Colaizzi method. Asian Nurs Res. 2018;12(3):166–74.

Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

Creswell JW, Miller DL. Determining validity in qualitative inquiry. Theory Pract. 2000;39(3):124–31.

Carter N, Bryant-Lukosius D, DiCenso A, Blythe J, Neville AJ. The use of triangulation in qualitative research. Oncol Nurs Forum. 2014;41:545–7.

Freeman D. Margaret Mead and Samoa : the making and unmaking of an anthropological myth: Australian National University Press; 1983.

Brown C, Lloyd K. Qualitative methods in psychiatric research. Adv Psychiatr Treat. 2001;7(5):350–6.

Kuper A, Reeves S, Levinson W. An introduction to reading and appraising qualitative research. BMJ. 2008;337:a288. https://www.bmj.com/content/337/bmj.a288 .

Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review. PLoS One. 2011;6(6):e20476.

Pope J. Measures of systemic sclerosis (scleroderma): Health Assessment Questionnaire (HAQ) and Scleroderma HAQ (SHAQ), Physician- and Patient-Rated Global Assessments, Symptom Burden Index (SBI), University of California, Los Angeles, Scleroderma Clinical Trials Consortium Gastrointestinal Scale (UCLA SCTC GIT) 2.0, Baseline Dyspnea Index (BDI) and Transition Dyspnea Index (TDI) (Mahler’s Index), Cambridge Pulmonary Hypertension Outcome Review (CAMPHOR), and Raynaud’s Condition Score (RCS). Arthritis Care Res. 2011;63(S11):S98–S111.

Sandqvist G, Eklund M. Hand mobility in scleroderma (HAMIS) test: the reliability of a novel hand function test. Arthritis Rheum. 2000;13(6):369–74.

CAS   Google Scholar  

Torok KS, Baker NA, Lucas M, Domsic RT, Boudreau R, Medsger TA. Reliability and validity of the delta finger-to-palm (FTP), a new measure of finger range of motion in systemic sclerosis. Clin Exp Rheumatol. 2010;28(2 Suppl 58):S28–36.

Rannou F, Poiraudeau S, Berezné A, et al. Assessing disability and quality of life in systemic sclerosis: construct validities of the Cochin hand function scale, health assessment questionnaire (HAQ), systemic sclerosis HAQ, and medical outcomes study 36-item short form health survey. Arthritis Rheum. 2007;57(1):94–102.

Poole JL, Gallegos M, O’Linc S. Reliability and validity of the arthritis hand function test in adults with systemic sclerosis (scleroderma). Arthritis Rheum. 2000;13(2):69–73.

Steen VD, Medsger TA. The value of the health assessment questionnaire and special patient-generated scales to demonstrate change in systemic sclerosis patients over time. Arthritis Rheum. 1997;40(11):1984–91.

Chularojanamontri L, Kulthanan K, Sethabutra P, Manapajon A. Dermatology life quality index in Thai patients with systemic sclerosis: A cross-sectional study. Indian J Dermatol Venereol Leprol. 2011;77(6):683.

Heinberg LJ, Kudel I, White B, et al. Assessing body image in patients with systemic sclerosis (scleroderma): validation of the adapted satisfaction with appearance scale. Body Image. 2007;4(1):79–86. https://doi.org/10.1016/j.bodyim.2006.11.002 .

Article   PubMed   PubMed Central   Google Scholar  

Jewett LR, Hudson M, Haythornthwaite JA, et al. Development and validation of the brief-satisfaction with appearance scale for systemic sclerosis. Arthritis Care Res. 2010;62(12):1779–86.

Joachim G, Acorn S. Life with a rare chronic disease: the scleroderma experience. J Adv Nurs. 2003;42(6):598–606.

Milette K, Thombs BD, Dewez S, Körner A, Peláez S. Scleroderma patient perspectives on social support from close social relationships. Disabil Rehabil. 2019:1–11.

Cinar FI, Unver V, Yilmaz S, et al. Living with scleroderma: patients’ perspectives, a phenomenological study. Rheumatol Int. 2012;32(11):3573–9.

Suarez-Almazor ME, Kallen MA, Roundtree AK, Mayes M. Disease and symptom burden in systemic sclerosis: a patient perspective. J Rheumatol. 2007;34(8):1718–26.

Newton EG, Thombs BD, Groleau D. The experience of emotional distress among women with scleroderma. Qual Health Res. 2012;22(9):1195–206.

Gumuchian ST, Peláez S, Delisle VC, et al. Exploring Sources of Emotional Distress among People Living with Scleroderma: A Focus Group Study. PLoS One. 2016;11(3):e0152419 Assassi S, ed.

Mouthon L, Alami S, Boisard A-S, Chaigne B, Hachulla E, Poiraudeau S. Patients’ views and needs about systemic sclerosis and its management: a qualitative interview study. BMC Musculoskelet Disord. 2017;18(1):230.

Stamm TA, Mattsson M, Mihai C, et al. Concepts of functioning and health important to people with systemic sclerosis: a qualitative study in four European countries. Ann Rheum Dis. 2011;70(6):1074–9.

Matza LS, Patrick DL, Riley AW, Alexander JJ, Rajmil L, Pleil AM, et al. Pediatric patient-reported outcome instruments for research to support medical product labeling: report of the ISPOR PRO good research practices for the assessment of children and adolescents task force. Value Health. 2013;16:461–79.

Sibeoni J, Orri M, Podlipski M-A, Labey M, Campredon S, Gerardin P, et al. The experience of psychiatric Care of Adolescents with anxiety-based school refusal and of their parents: a qualitative study. J Can Acad Child Adolesc Psychiatry. 2018;27:39–49.

Sibeoni J, Verneuil L, Poulmarc’h L, Orri M, Jean E, Podlipski M-A, et al. Obstacles and facilitators of therapeutic alliance among adolescents with anorexia nervosa, their parents and their psychiatrists: a qualitative study. Clin Child Psychol Psychiatry. 2020;25:16–32.

DiGiuseppe R, Linscott J, Jilton R. Developing the thrapeutic alliance in child-adolescent psychotherapy. Appl Prev Psychol. 1996;15:85–100.

Kermarrec S, Kabuth B, Bursztejn C, Guillemin F. French Adaptation and Validation of the Helping Alliance Questionnaires for Child, Parents, and Therapist. Can J Psychiatr. 2006;51:913–22 Seeing the unseen: diagnosing acromegaly in a dental setup.

Sibeoni J, Picard C, Orri M, Labey M, Bousquet G, Verneuil L, et al. Patients’ quality of life during active cancer treatment: a qualitative study. BMC Cancer. 2018;18:951.

Sibeoni J, Chambon L, Pommepuy N, Rappaport C, Revah-Levy A. Psychiatric care of children with autism spectrum disorder - what do their siblings think about it? A qualitative study. Autism. 2019;23:326–37.

Ooi KL, Ong YS, Jacob SA, Khan TM. A meta-synthesis on parenting a child with autism. Neuropsychiatr Dis Treat. 2016;12:745–62.

Roberts RM, Ejova A, Giallo R, Strohm K, Lillie ME. Support group programme for siblings of children with special needs: predictors of improved emotional and behavioural functioning. Disabil Rehabil. 2016;38:2063–72.

Macquart-Moulin G, Viens P, Bouscary ML, Genre D, Resbeut M, Gravis G, Moatti JP. Discordance between physicians' estimations and breast cancer patients' self-assessment of side-effects of chemotherapy: an issue for quality of care. Br J Cancer. 1997;76(12):1640.

Basch E, Iasonos A, McDonough T, Barz A, Culkin A, et al. Patient versus clinician symptom reporting using the National Cancer Institute common terminology criteria for adverse events: results of a questionnaire-based study. Lancet Oncol. 2006;7(11):903–9.

Di Maio M, Gallo C, Leighl NB, Piccirillo MC, Daniele G, Nuzzo F, Ceribelli A. Symptomatic toxicities experienced during anticancer treatment: agreement between patient and physician reporting in three randomized trials. J Clin Oncol. 2015;33(8):910–5.

Kohn L, Christiaens W. Les méthodes de recherches qualitatives dans la recherche en soins de santé : apports et croyances. Reflets et perspectives de la vie economique, vol. Tome LIII; 2014. p. 67–82.

Turow S, Gelsanliter D. The burden of proof, vol. 2. Farrar: Straus, and Giroux; 1990.

Rose, Todd. The end of average: how to succeed in a world that values sameness. Penguin UK, 2016.

McVilly KR, Stancliffe RJ, Parmenter TR, Burton-Smith RM. Remaining open to quantitative, qualitative, and mixed-method designs: an unscientific compromise, or good research practice? Int Rev Res Ment Retard. 2008;35:151–203.

Download references

Acknowledgements

We would like to thank all the patients for their participation in this study.

We also want to thank la Société Française de Dermatologie, and Jo Ann Cahn for the English translation.

Funding

No funding.

Author information

Authors and Affiliations

Service Universitaire de Psychiatrie de l’Adolescent, Argenteuil Hospital Centre, 69 Rue du Lieutenant Colonel Prud’hon, 95107 ARGENTEUIL Cedex, France

Jordan Sibeoni & Anne Révah-Levy

ECSTRRA Team, UMR-1153, Inserm, Université de Paris, F-75010, Paris, France

Jordan Sibeoni, Laurence Verneuil, Emilie Manolios & Anne Révah-Levy

Service de Psychologie et Psychiatrie de Liaison et d’Urgences, Hôpital Européen Georges Pompidou AP-HP, Hôpitaux Universitaires Paris Ouest, Paris, France

Emilie Manolios


Contributions

JS, LV, EM and ARL conceived and designed the study, collected the data and analyzed the data; JS, LV, EM and ARL wrote the paper. All authors had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Jordan Sibeoni.

Ethics declarations

Ethics approval and consent to participate

The Paris-Descartes University review board (CERES) approved the research protocol (IRB number: 20140600001072) of the study presented in Table 2 . All participants provided written informed consent.

Consent for publication

All the participants gave their written consent.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Sibeoni, J., Verneuil, L., Manolios, E. et al. A specific method for qualitative medical research: the IPSE (Inductive Process to analyze the Structure of lived Experience) approach. BMC Med Res Methodol 20, 216 (2020). https://doi.org/10.1186/s12874-020-01099-4

Download citation

Received: 10 February 2020

Accepted: 11 August 2020

Published: 26 August 2020

DOI: https://doi.org/10.1186/s12874-020-01099-4


Keywords

  • Qualitative research
  • Patient-reported outcomes
  • Research methodology

BMC Medical Research Methodology

ISSN: 1471-2288



MGMT 4090: Huntsman Capstone

Qualitative Research Design

General guides

Here are a few general guides to conducting qualitative research:

  • SAGE Research Methods Research Methods is a tool created to help researchers, faculty and students with their research projects. Users can explore methods concepts to help them design research projects, understand particular methods or identify a new method, conduct their research, and write up their findings. Since SAGE Research Methods focuses on methodology rather than disciplines, it can be used across the social sciences, health sciences, and other areas of research.
  • Qualitative inquiry & research design : choosing among five approaches / John W. Creswell Available in Van Pelt Library
  • Handbook of research design & social measurement / Delbert C. Miller, Neil J. Salkind. Available in Van Pelt Library
  • Case study research [electronic resource] / edited by Matthew David. Full text available online through the library
  • Sage Benchmarks in Social Research Methods This links you to the library's Sage Benchmarks in Social Research Methods collection, which has e-resources on many different aspects of social sciences research.

Here is a list of books available in the library or online that offer guidance on designing interview questions:

  • Qualitative Interview Design: A Practical Guide for Novice Investigators
  • Interviewing and representation in qualitative research / John Schostak. Available in Van Pelt Library
  • Handbook of interview research : context & method/ editors, Jaber F. Gubrium, James A. Holstein. Available in Van Pelt Library
  • The research interview / Bill Gillham. Available in Van Pelt Library
  • Interviewing [I] [electronic resource] / edited by Nigel Fielding. Full text available online
  • How to conduct in-person interviews for surveys / Sabine Mertens Oishi.

Here are some resources on survey construction and design:

  • Envisioning the survey interview of the future / edited by Frederick G. Conrad, Michael F. Schober. Available in Van Pelt Library
  • Standardization and tacit knowledge : interaction and practice in the survey interview / editors, Douglas W. Maynard ... [et al.]. Available in Van Pelt Library
  • Interaction and the standardized survey interview : the living questionnaire / Hanneke Houtkoop-Steenstra. Available in Van Pelt Library
  • Questionnaires [electronic resource] / edited by Martin Bulmer. Full text available online

Focus Groups

Here are a few focus-group-specific books if you will be running focus groups:

  • Focus group discussions [electronic resource] / Monique M. Hennink Electronic resource. This book is also available in print in Van Pelt. The call number is H61.28 .H45 2014.
  • The Focus Group Kit Electronic resource. This is a multi-volume set that covers different aspects of utilizing focus groups for research. This link takes you to the Franklin results, where links are available to all volumes in the set.

  • Last Updated: Sep 18, 2024 4:00 PM
  • URL: https://guides.library.upenn.edu/mgmt4090

Patient Safety Network

Human errors in emergency medical services: a qualitative analysis of contributing factors.

Poranen A, Kouvonen A, Nordquist H. Human errors in emergency medical services: a qualitative analysis of contributing factors. Scand J Trauma Resusc Emerg Med. 2024;32(1):78. doi:10.1186/s13049-024-01253-7.

Advancing patient safety in prehospital care is receiving increasing attention. This qualitative study analyzed factors contributing to human errors among paramedics and emergency medical field supervisors in Finland. Researchers identified three main categories of contributing factors: (1) the changing work environment, such as external disruptions or challenging working conditions; (2) the organization of the work, such as inadequate care guidelines; and (3) individual factors, such as cognitive processing and individual needs.



  • Open access
  • Published: 18 September 2024

“Luck of the draw really”: a qualitative exploration of Australian trainee doctors’ experiences of mandatory research

  • Caitlin Brandenburg 1 , 2 ,
  • Joanne Hilder 1 , 2 ,
  • Christy Noble 3 ,
  • Rhea Liang 1 , 2 ,
  • Kirsty Forrest 2 ,
  • Hitesh Joshi 4 ,
  • Gerben Keijzers 1 , 2 , 5 ,
  • Sharon Mickan 2 ,
  • David Pearson 1 ,
  • Ian A. Scott 6 , 7 ,
  • Emma Veysey 8 &
  • Paulina Stehlik 9 , 10  

BMC Medical Education volume 24, Article number: 1021 (2024)


Background

Many medical trainees, prior to achieving specialist status, are required to complete a mandatory research project, the usefulness of which has been debated. The aim of this study was to gain an in-depth understanding of trainees’ experiences of, and satisfaction with, conducting such research projects in Australia.

Methods

A qualitative descriptive approach was used. Semi-structured interviews with trainees were undertaken between May 2021 and June 2022. Australian medical trainees who had completed a research project as part of specialty training within the past five years were invited to participate. The purposive sample was drawn from participants in a survey on the same topic who had indicated interest in participating in an interview. Interviews explored trainees’ overall experience of and satisfaction with conducting research projects, as well as their perceptions of research training, support, barriers, enablers, and perceived benefits. Interviews were transcribed verbatim and thematically analysed.

Results

Sixteen medical doctors from seven medical colleges were interviewed. Trainee experience and satisfaction were highly variable between participants and were shaped by four factors: 1) trainees entered their specialty training with their own perspectives on the value and purpose of the research project, informed by their previous experiences with research and the perceived importance of research in their planned career path; 2) in conducting the project, enablers, including protected time, supervisor support and institutional structures, were vital to shaping their experience; 3) trainees’ access to these enablers was variable, mediated by a combination of luck and the trainees’ own drive and research skill; and 4) project outcomes, in terms of research merit, learning, career benefits and impacts on patient care.

Conclusions

Trainee experiences of doing research were mixed, with positive experiences often attributed to chance rather than an intentionally structured learning experience. We believe alternatives to mandatory trainee research projects must be explored, including recognising other forms of research learning activities, and directing scarce resources to supporting the few trainees who plan to pursue clinician researcher careers.


Background

Engaging clinicians in research is a cornerstone of quality healthcare and evidence-based practice. Professional colleges in many countries have fostered such engagement through the integration of research-related competencies into their training standards, usually by requiring completion of a research project during residency [ 1 , 2 , 3 , 4 ]. However, the value of requiring college trainees to lead projects to gain specialist clinical qualifications has long been debated [ 5 , 6 , 7 , 8 ]. Key medical academics have suggested that in health service contexts, research should only be led by those few who plan to engage in a clinician researcher career, while the majority of clinicians should learn to participate in research and effectively utilize research findings [ 9 ]. In Australia, specialty training is led by specialty-specific colleges, most of which mandate completion of a scholarly project, which for most colleges is a research project but for some can also be a quality improvement initiative [ 4 ].

Previous research has identified problems with the current system of mandatory research. Our previous investigation showed that Australian college research project requirements, similar to those of specialty training programs internationally [ 10 ], frequently specify that the trainee must lead or carry out most of the project, but often do not stipulate requirements regarding the quality of supervision or the quality of the research [ 4 ]. Previous international research has also reported significant barriers to trainees undertaking research, predominantly lack of protected time in light of other training demands, but also lack of appropriate mentorship, funding, structural supports, and interest from trainees and faculty [ 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 ]. The intended goals of mandated research projects are important: to advance research, support evidence-based practice, encourage critical thinking and stimulate a culture of lifelong learning [ 2 , 20 ]. However, many clinicians do not develop sufficient research skills during specialty training [ 21 ]. Hence, the current system is at risk of failing to meet training goals and provide a satisfying learning experience for trainees.

While the primary goal of mandated research projects is to educate future clinicians in research methods, there is also an obligation to ensure the research produced by trainees is valid, potentially useful [ 22 ], and reflective of the considerable time invested in completing a research project. Substantial input is required not just from the trainee, but often from their supervisor, co-investigators, ethics committee members, governance administrators, health data custodians, statisticians, patients and other types of participants. Some commentators have suggested that the pressure to produce research outputs during training, and the considerable barriers to conducting this research, promote wasteful research practices [ 5 , 6 , 10 , 23 , 24 ]. These may involve abandoning projects midway, producing low-quality research, failing to make the results of research suitably available, or even engaging in research misconduct [ 14 , 25 , 26 , 27 , 28 , 29 ]. Additionally, some studies, mostly in surgical fields, have found that time spent in research training can interfere with the acquisition of clinical skills [ 30 ], or the volume of clinical activity trainees can undertake [ 31 ]. In an increasingly constrained healthcare environment, it is vital that trainee time spent away from clinical duties is spent responsibly in terms of both meeting curriculum goals and producing quality research.

Ensuring that trainees have a constructive and satisfying research experience is critical. Medical competency frameworks place importance on the scholar role for both gaining specialty licensure and ongoing professional development [ 2 , 20 ]. However, negative early experiences of research can deter clinicians from future research engagement entirely [ 14 , 32 , 33 ]. Previous research on trainees’ personal experiences of conducting research projects has almost exclusively comprised quantitative surveys, with minimal study outside North America [ 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 ]. Thus, a wider understanding of trainee experiences is important in gauging the effectiveness and relevance of mandated projects in medical training and identifying areas for improvement.

Accordingly, we aimed to explore the research expectations and training set by specialist medical colleges in Australia and New Zealand as part of a mixed methods body of work entitled Enhancing the Research Development of Medical Specialty Trainees (ENHANCE). A survey component, reported elsewhere [ 29 ], explored how trainees undertook mandatory projects, their views regarding the curriculum, and the quality of their research outputs. The present study aims to gain an in-depth understanding of trainees’ experiences of conducting research projects as part of medical specialty training.

Methods

Study design

This study used a qualitative descriptive design, applying a theory-informed inductive approach [ 34 ]. We conducted semi-structured interviews with current and recently graduated Australian medical trainees. An interpretivist paradigm was applied, which considers trainee experiences to be subjective and influenced by context [ 35 ]. The Standards for Reporting Qualitative Research (SRQR) guidelines were used to guide reporting [ 36 ].

Recruitment and sampling

The study sample was primarily drawn from participants in the ENHANCE survey study [ 29 ] who consented to be contacted for an interview and met eligibility criteria of having completed a research project as part of their specialty training, either as a current trainee or a specialist who had completed their training within the past 5 years. Those who completed their project as part of a higher degree were excluded, as their experience was expected to differ significantly from most trainees.

We used purposive and snowball sampling to select participants from a variety of medical colleges. Survey recruitment utilized the existing communication channels (website, newsletter and direct email) of 11 Australian and New Zealand medical colleges. Of the 372 participants who completed the ENHANCE survey, 33 expressed interest in participating in an interview. This approach was supplemented by snowball sampling, in which study information was shared through avenues such as college meetings, social media and email chains. Potential participants who expressed interest either in the survey or by emailing investigators after learning about the study elsewhere were emailed on up to three occasions to arrange an interview time. Recruitment ceased when all available participants had been interviewed.

Data collection and analysis

Two authors (PS, CB) developed a draft interview guide (Supplementary file 1) informed by relevant literature and conceptual frameworks of research learning by trainees and research waste [ 4 , 32 , 37 , 38 , 39 , 40 ]. This draft was reviewed by the research team and underwent several rounds of feedback. Questions included in the final interview guide explored the overall experience of conducting research projects, research training, support, experiences of barriers and enablers, and perceived benefits for their future careers and patient care.

Interviews were conducted via videoconference by a single, non-doctor author (JH) with prior experience in qualitative interviewing but no prior involvement in college research projects. The average length of each interview was 34 min. All interviews were recorded on Zoom videoconferencing software [ 41 ], transcribed verbatim using Otter.ai software [ 42 ], and checked, corrected, and anonymized by a single author (JH). The data were then analysed thematically [ 43 ] by five members of the research team (JH, CB, PS, RL, CN). Inductive thematic analysis followed six steps: (1) familiarisation with the data, (2) generating initial codes, (3) generating themes, (4) reviewing potential themes, (5) defining and naming themes, and (6) producing the report [ 43 ]. Each researcher initially coded three transcripts and discussed key themes, after which a single author (JH) coded all transcripts using NVivo software [ 44 ]. A summary of the preliminary findings was presented to all team members and final themes were discussed and refined.
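To make the coding and theme-generation steps more concrete, the sketch below shows, in Python, one way the bookkeeping behind steps 2–4 (applying codes to transcript excerpts and grouping codes into candidate themes) could be organised. This is an illustration only, not the authors’ procedure: the study used NVivo, and every identifier in the snippet (coded_excerpts, candidate_themes, build_theme_summary) and the code labels themselves are hypothetical; the quoted excerpts are taken from the results reported below.

```python
from collections import defaultdict

# Illustrative sketch only: the study itself used NVivo. All identifiers here
# (coded_excerpts, candidate_themes, build_theme_summary) are hypothetical.

# Each entry: (participant ID, code applied by an analyst, transcript excerpt).
coded_excerpts = [
    ("P5",  "tick-box exercise",   "jump through hoops"),
    ("P14", "tick-box exercise",   "a tick box"),
    ("P7",  "reliance on luck",    "luck of the draw really"),
    ("P1",  "reliance on luck",    "a lot of it really just comes down to dumb luck"),
    ("P15", "governance barriers", "ethics takes a year"),
]

# Provisional grouping of codes into candidate themes, refined iteratively by the team.
candidate_themes = {
    "Initial perceptions of the project": ["tick-box exercise"],
    "Access to enablers": ["reliance on luck", "governance barriers"],
}

def build_theme_summary(excerpts, themes):
    """Group coded excerpts under candidate themes and count distinct participants."""
    code_to_theme = {code: theme for theme, codes in themes.items() for code in codes}
    summary = defaultdict(lambda: {"participants": set(), "excerpts": []})
    for participant, code, excerpt in excerpts:
        theme = code_to_theme.get(code, "Uncategorised")
        summary[theme]["participants"].add(participant)
        summary[theme]["excerpts"].append((participant, excerpt))
    return summary

if __name__ == "__main__":
    for theme, detail in build_theme_summary(coded_excerpts, candidate_themes).items():
        print(f"{theme}: {len(detail['participants'])} participant(s)")
        for participant, excerpt in detail["excerpts"]:
            print(f'  {participant}: "{excerpt}"')
```

In the study itself, the interpretive work of reviewing, defining and naming themes (steps 4–6) remained a team-based judgement rather than anything automated; a tool such as NVivo, or a sketch like the one above, only keeps track of which excerpts sit under which codes and themes.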

Reflexivity

As a team of 11 authors, including four allied health professionals and seven doctors representing seven different medical specialties, we approached this research with a range of experiences and expertise. The team included experts in qualitative research, medical education, clinical research, and research waste. All authors had experience in research, and an appreciation of the importance of research competencies in medical training. Ongoing discussions throughout the research process ensured a balanced methodological and analytical approach.

Results

Participants

Of the 39 doctors who expressed interest in participating in an interview, 20 did not respond to follow-up invitation emails and 3 were ineligible. Although reasons for nonparticipation were not collected, the study took place between May 2021 and June 2022, so the impact of the COVID-19 pandemic on workload and personal demands may have been a factor. Sixteen medical doctors participated in interviews; fourteen were recruited directly from ENHANCE survey participants and two were recruited via snowballing.

Most participants were male (12 of 16), and half were working as specialists, while the other half were current trainees. They represented seven different medical colleges, with two participants belonging to two different colleges (Table  1 ). All participants completed their training in Australia (although New Zealand doctors were also eligible to participate, none did). Four participants had gone on to start a higher degree in research (PhD or Master’s) after completing their research project, and one participant had a PhD prior to becoming a medical doctor but was still required to conduct a research project. All participants reported some research exposure before commencing their project, ranging from completing research-related coursework in medical school with no practical experience, to working as an academic and supervising multiple PhD students.

Identified themes

Five main themes revealed how trainees’ experiences of research projects were formed, as displayed in a thematic map in Fig. 1. First, trainees entered their specialty training with their own perspectives on the value and purpose of the research project, often informed by previous experiences with research and the perceived importance of research in their planned career path. Second, as trainees conducted the project, access to protected time, supervisor support and institutional research infrastructure were considered important enablers of project success. Third, trainees’ access to these enablers was mediated by a combination of luck and the trainees’ own drive and research skill. Fourth, perceptions were influenced by project outcomes in terms of research achievement, learning, career benefits and impacts on patient care. Lastly, each of these themes contributed to trainees’ overall perception of and satisfaction with their research project experience, which was highly variable. These themes are discussed in further detail below, with illustrative participant quotes.

Fig. 1 Overview of themes

Trainees’ initial perceptions of the research project

Trainees entered their specialty training with diverse understandings of the purpose and value of the research project, shaped by a combination of past experiences of research and their desired career path. Many participants saw the project as simply a mandatory requirement that trainees needed to fulfill, likening it to a “tick box” (P14) or “jump through hoops” (P5) activity, rather than an opportunity to develop their research capabilities. This opinion was more common among those who did not think research was likely to be a major part of their future careers, believing the project to be “a waste of time” (P5). This perception led some trainees to seek the easiest route for completing their project. Some participants were concerned that this approach fostered low-quality research. A few participants who already had significant research experience (e.g. a PhD) prior to specialty training expressed frustration that their prior experience was often not recognised by the College because it did not fit the rigid College research project requirements; “Frankly, I think it was kind of naïve and silly of them to request that I continue with a project like they need to tick the box for research in their fellows. I’ve got probably 20 years’ research experience.” (P12) However, other participants who were interested in research or had plans for research to be part of their career placed more value on the project, with some deliberately choosing more complex projects to build their research experience and skills.

Factors influencing project completion

Three factors were deemed important to the successful completion of the research project, and to the resulting trainee experience and satisfaction: protected time, supervisory support, and institutional structures.

The first factor was access to protected time to complete the project, although few participants reported receiving it. For most, a substantial amount of time outside of work was required to complete the project, a burden further compounded by competing training demands, personal commitments, and participants underestimating at the outset the time and effort the project would require.

The second factor was having an accessible, research-skilled supervisor. Most participants felt well-supported and encouraged to complete their project by their supervisor, but where this was absent, trainees struggled, especially those with no prior research experience to fall back on. Some participants found their supervisor lacked adequate research experience to effectively guide them; “It’s the blind leading the blind” (P5), while others reported a general lack of interest and involvement; “He [supervisor] mostly left me to my own devices” (P11). Even those reporting good supervision experiences noted their supervisors’ clinical demands left them little time for research supervision, leading trainees to rely more on their own knowledge and informal resources (e.g. YouTube). Participants felt that helping identify a project topic was an important supervisor role, and valued it when supervisors were able to guide them towards topics that were interesting, clinically relevant, and of a manageable scope; “Because [supervisor] herself…has lots of experience in research it wasn’t too difficult to come up with a project” (P2).

The third factor was institutional structures, pertaining to both the trainees’ college and the health service in which they worked. Trainees perceived a misalignment between the structures and processes inherent in clinical training and those enabling completion of research projects. For example, many training programs required trainees to regularly rotate between hospitals or health services, posing difficulties for research projects that needed to be performed within one service; “There is a requirement for rotational training….[trainees] move on, and then it becomes very difficult because it’s beyond just sort of access to supervisor. It’s also things like, research governance, particularly, and access to data” (P6). Obtaining necessary but time-consuming ethics and governance approvals to access local clinical data during a time-limited placement within one health service was challenging; “ethics takes a year” (P15). Some trainees instead intentionally chose “portable” (P6) projects, like literature reviews, while others benefitted from joining clinical teams with extensive research programs and pre-established approvals and data access processes.

Institutional research culture and supports were also important for project completion. Participants able to be part of a team with a strong research culture (which they felt was the exception rather than the rule for trainees) reported benefits such as access to research-experienced supervisors, learning alongside other trainees, and access to research support staff, notably librarians and statisticians; “ I was very lucky that the organization that I was working for…is quite a well-established clinical and research centre…. [I] did have access to a statistician, which was really helpful” (P14). The absence of such a supportive culture impeded project completion; “If you're unlucky enough to be in a small department, which doesn't care about research, then that's almost like game over for your aspirations and that affects your ongoing practice” (P4).

Structured support from colleges, in the form of paid protected time and high-quality supervision, was only identified by general practitioner trainees, who emphasised the importance of this support to their research experience. Other colleges, and the health services where trainees were placed, provided some resources, such as workshops and online content, although some trainees were unaware of their existence during their project, and others felt they did not fully translate into the acquisition of practical skills.

Access to enablers is through luck or trainee experience and drive

Participants felt their access to the supports mentioned above was mediated by a combination of “ luck of the draw ” (P7), and also their own research experience and personal drive to pursue research activity. Many participants felt access to experienced supervisors, biostatisticians, data, desk space, and time was usually driven by luck; “A lot of it really just comes down to dumb luck. You have to be at the right place at the right time” (P1). While many participants reported mostly positive experiences, they were aware of “unlucky” colleagues with more challenging experiences; “ I was fairly lucky that things worked out really nicely with my supervisor. You know, I've had other colleagues who have… been unable to complete projects and had supervisors that are unavailable, and all sorts of other things” (P6).

However, participants also recognised that individual attributes such as having prior research experience, strong interest in research, and motivation contributed to positive experiences and successful project completion; “I'm not sure that if someone else going through that process would feel the same way. Because I was highly motivated myself…So, I suppose it all depends on the person” (P4). Some participants, usually with more research experience, were able to carefully plan their project to make it achievable and sustainable (e.g. use a method they were already familiar with); “I was fairly careful in selecting something that, you know, that I knew I could accomplish…in the timeframe that I needed to finish it” (P6).

Lack of access to facilitating factors not only predisposed to a poor research experience but could also engender poor research practices, such as using a personal device to store patient data (P9), learning data analysis from YouTube (P2), using a “ torrented version of a data analysis tool ” (P16) and choosing infeasible project topics; “ [a supervisor] could easily have steered me off this [idea]” (P7).

Participant 14 summed up the balance between luck and individual motivation and planning; “I often look back [and] wonder, if I wasn't at this exact place, with this supervisor, I possibly wouldn't have been where I am now. But I guess I also actively applied for and was looking for a role in an organization that had the research as well as a clinical focus. For the reason that I did have an interest in research.”

Project outcomes

All participants reported a sense of accomplishment in publishing or presenting their research, both in terms of fulfilling college requirements and furthering their own personal development. Many also viewed the experience as useful for general exposure to the research process and acquiring some new research-related skills and knowledge (e.g., submitting an ethics application, conducting a certain type of analysis, engaging in academic writing), but often felt these were somewhat limited or project-specific; “I learned a few new skills, you know, nothing outstanding” (P6); “I’ve learned some things from a skills point of view. But as I said, because I wouldn’t do the project the same way again, I don’t feel like actually, I’ve got skills I can apply to a new project” (P9). However, many participants felt they gained a better appreciation of the importance of finding relevant literature for clinical care and an improved ability to critically appraise research papers; “I think that's it's good to help understand the levels of evidence available, and the quality that's out there to help you judge the evidence that you read better, and then make better decisions for your patients based on that” (P11).

Participants reported that the immediate benefits of the research project to their clinical careers were limited, consisting mainly of providing a point of differentiation in interviews for consultant appointments and networking opportunities. The subset of participants who were pursuing, or planned to pursue, research careers felt the project helped build their track record and skills. Participants expressed mixed feelings regarding whether doing the project gave them enough skills and knowledge to supervise the research projects of future trainees; “[As a] consultant, I potentially will have to supervise other registrars in their research. And I would say that I don't really feel prepared for that” (P8).

Most participants were uncertain about the impact of their research on health services and patient care. Some perceived limited impact due to the small scale of their projects or its confinement to their own clinical care; “ I think the first one [article] is interesting and is part of sort of a body of literature… so from that point of view, it's not useless… My second paper, absolutely not. I don't think it's really added anything to the [clinical speciality] landscape whatsoever ” (P9). However, other participants were confident their research had been impactful as it had been highly cited or, in one case, had contributed to national clinical practice guidelines.

Experience and satisfaction

All of the above themes fed into the trainees’ subjective project experience and satisfaction with the process, which ranged from highly positive, satisfying experiences; “ My experience was very, very positive ” (P6), to dissatisfaction and difficulty; “ My view of this research project is that it's a waste of time” (P5). This, in turn, impacted participants’ interest in pursuing research into the future; “Not that I was ever planning to be a researcher, but I am definitely less likely to do further research now” (P9). Some participants viewed their project outcomes and experiences more positively in hindsight than when conducting the research; “With the benefit of hindsight, I'm much more satisfied now that it's actually published…I think I probably would have had a very different answer if you'd asked me this question maybe at the end of completing it, because I think I just had enough of it and just wanted to be over and done with it” (P10).

Many of the themes mentioned above fed into trainee experience such as: pre-existing research experience; “It was a very positive experience, though, I appreciate that, that's not the average trainee project journey. I mean, I had some advantages…I had a bit of pre-knowledge that enabled me to select something that I knew with a fair amount of certainty I would be able to sort out” (P6); research interest; “I quite enjoyed the experience, I guess I've always had an interest in research. I'm actually doing a PhD now” (P14); and access to enabling factors through luck or personal drive; “I would say very satisfied…because of how the local support, facilitated it and reduced the frictions that you will normally have” (P3).

Discussion

Our findings revealed variation in levels of satisfaction with, and perceived outcomes of, mandatory research projects, with trainees feeling luck played a significant role in accessing crucial enablers such as protected time, quality supervision, and accommodating institutional structures.

Trainees’ reliance on serendipity for obtaining such support is problematic, as its absence is likely to lead to an unsuccessful and negative research experience, and less meaningful and impactful research outputs. Negative experiences with mandatory research projects are not uncommon, with previous research showing that trainee satisfaction with such projects is highly variable [ 11 , 14 , 16 , 45 ]. Quantitative research reports results similar to ours: trainee satisfaction is adversely impacted by poor quality of supervision, inappropriately scoped projects, lack of organisational support, and frustration with prohibitive “hoop jumping” type training requirements [ 13 , 17 , 46 ]. Trainees also see mandatory projects as a source of stress and a negative impact on work-life balance [ 47 ]. Our research found that some trainees who had access to supports did have highly positive experiences, while others felt, in hindsight, the experience was valuable despite challenges during the project, especially if they then pursued a research career or produced meaningful research outputs [ 48 ]. However, this variability in experiences must be addressed, as negative experiences may discourage future engagement in research, which is contrary to the intended curriculum goals [ 14 , 33 ]. Colleges do not leave the development of clinical competencies to chance, but instead support development through an intentionally structured and appropriately resourced training curriculum. The same should apply to the development of research competencies if these are also a required component of the curriculum.

Our recommendations are twofold. Firstly, we recommend a well-supported pathway for trainees to pursue research projects voluntarily, requiring enabling supports to be embedded in both the colleges and the health services trainees are placed within. It is arguably unfair to expect research-naïve trainees to succeed in environments where such support is unavailable. Even the more research-experienced trainees in our study reported that they anticipated a lack of supports and intentionally limited their project scope, thereby curtailing their potential research contributions and missing an opportunity for further skill development. The literature affirms that many training programs lack key elements shown to support trainee research activity, such as an organized research curriculum, appropriate supervision and protected time [ 32 , 46 , 49 , 50 , 51 , 52 ]. A trainee’s experience should not rely on where they are placed, especially when current systems may disadvantage those placed in under-resourced regional/rural locations or smaller health services [ 16 , 53 ].

Supports should also be directed to the subset of trainees who choose to complete a project, rather than attempting to provide this significant level of support to every trainee, for two reasons. Firstly, it is likely impractical, both in terms of limited funding available for such activities, and because of a paucity of skilled research supervisors [ 5 , 29 ]. The current system compounds this issue, as trainees who feel their training has not furnished them with the necessary research skills are nonetheless compelled to supervise the mandatory projects of future trainees, potentially passing on suboptimal research practices. As a minimum, any research-naïve trainee should have access to both content and methods expertise, which may necessitate multiple supervisors. Secondly, studies have shown that supports may be better directed to health professionals with higher intrinsic motivation to do research, as they are more likely to be successful than those whose only motivation is to satisfy external training requirements [ 11 , 54 , 55 ]. Some colleges, such as the RACGP [ 56 ], have recognized the need for institutional supports for those few wishing to pursue a clinician researcher career and provide a specialised pathway, including protected time. Few doctors currently identify as clinician researchers [ 57 ], and their numbers are declining [ 58 ]. For most trainees, the value of mandatory trainee-led projects for clinically-orientated careers, and for adding meaningfully to the evidence base, is questionable. We posit that the limited resources available for research should be funnelled into high quality supports for research-interested trainees, and increasing the number of clinician researchers.

Our second recommendation is that, for the remainder of trainees, colleges value and incentivise forms of research engagement other than leading a project. Although research interpretation and application must remain an expected competency for all trainees, we have found a mismatch between the intended goals of research projects and the realized outcomes. If the key goal of specialty training research requirements is to produce clinicians informed in research methodology, more emphasis should be placed on appraising and implementing research, and on participating in, rather than leading, research. Options other than mandatory research projects may be more effective and far less resource intensive in imparting the necessary skills. Key theories of research culture and behaviour change hold that strategies that make such change easy, normative and rewarding should come first, with mandated compliance being used as a last resort [ 59 ]. Studies support this, finding that even mandating trainee research activity has mixed impacts on trainee outputs [ 11 , 18 , 49 ], and that trainees would prefer activities other than mandated research projects [ 14 , 16 , 18 , 45 , 47 ]. However, many trainees also emphasise the importance of understanding the principles of research and evidence for training [ 16 , 18 , 29 , 45 , 47 ], and recognise that trainee-led research outputs are vital to career progression in many specialties [ 14 , 16 ]. For all these reasons, research skills must remain an expected trainee competency, and each college must consider the barriers and benefits of various research training options specific to their context.

The literature suggests some alternate options for trainees to gain research competencies. The Australasian College for Emergency Medicine replaced their mandatory research project with a choice of a coursework or project pathway in 2009, due to perceptions of research waste and limited value to training [ 17 ]. Subsequent research comparing the two pathways found trainees rated the coursework pathway as more useful in achieving all learning objectives [ 17 ]. More recently, the Royal Australasian College of Surgeons has placed less emphasis in its general surgery training criteria on leading a research project, and has given more recognition to other types of research activities and prior learning [ 60 ], although formal evaluation is yet to occur. Participation in large student and trainee-led research collaboratives could also be recognised activities in college curricula, as they not only produce high quality and impactful research, but are also effective in developing research competencies and promoting long term research engagement of clinicians [ 61 , 62 , 63 , 64 ]. Such avenues for trainees to participate in, but not lead, research projects should be supported, in order to provide trainees without a preconceived research interest an opportunity to develop this interest. To support a sustainable cultural shift, changes also need to be considered upstream and downstream of residency, including processes that disincentivise the research “arms race” by refocusing evaluation of research activity at key career transitions on quality and competency, rather than quantity [ 65 ].

Underlying potential changes to research requirements should be a clear and agreed upon understanding of the research competencies expected of trainees, and, accordingly, of specialist doctors. A set of core research-related competencies relevant to all graduating trainees needs to be defined [ 66 ], and colleges should be provided with support to develop and implement new curricula. Competencies for those completing a research project should also be clarified, acknowledging that the goal is to impart basic research skills, and that it is not possible or necessary to impart all the skills required to consistently produce high quality research after a single small-scale project [ 67 ].

This study has two key strengths. First was the use of qualitative methodology, which allowed for in-depth exploration of trainees’ experiences and perceptions. Second was the inclusion of doctors from multiple medical colleges, which added to the diversity of elicited perspectives and experiences. The study also had limitations. Our participants may have provided a more supportive narrative of mandatory research compared to the general trainee population; indeed, many stated they felt their experiences were more positive than those of their colleagues. The participants who expressed interest in participating in interviews were more likely than other survey participants to say they were supportive of mandatory research projects (67% vs 39%) and to express satisfaction with their overall research experience (55% vs 34%) [ 29 ]. The study only included trainees from Australia, limiting the transferability of the findings to other countries and contexts. The reasons for non-participation of New Zealand trainees are unclear, but may relate to the fact that they made up a smaller percentage of the survey participants (13%). Finally, the important views of those who support research projects, including supervisors, leadership and research support staff, were not investigated.

This study found that specialty trainees have variable experiences and levels of satisfaction in undertaking mandatory research projects. While some trainees were able to draw upon previous research or personal drive to engender positive experiences, the success of many projects relied on chance. Trainees also varied in their perceptions of project outcomes, in terms of their own learning and contributions to the evidence base. Given that considerable resources are needed to provide trainees with a good research experience with a higher likelihood of worthwhile outcomes, we recommend that resources for research are directed only to those trainees interested in conducting research projects during residency. For other trainees, alternative pathways to mandatory projects should be considered and resourced within mainstream curricula to allow the acquisition of necessary research related skills and competencies. These strategies in combination could constitute more effective and sustainable methods for achieving the desired twin goals of producing research-informed clinicians and supporting the growth of clinician researchers.

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available due to the identifiability of the interview transcripts. Portions are available from the corresponding author on reasonable request. A published protocol for the study is available at the Open Science Framework: https://doi.org/10.17605/OSF.IO/BNGZK

Abbreviations

ENHANCE: Enhancing the Research Development of Medical Specialty Trainees

SRQR: Standards for Reporting Qualitative Research

Frank JR, Snell L, Sherbino J. CanMEDS 2015 physician competency framework. Royal College of Physicians & Surgeons of Canada; 2015.


Specialist Education Accreditation Committee. Standards for assessment and accreditation of specialist medical programs and professional development programs by the Australian Medical Council 2015. Australian Medical Council Limited; 2015.

ACGME. Common Program Requirements (Residency). Accreditation Council for Graduate Medical Education; 2022.

Stehlik P, Noble C, Brandenburg C, Fawzy P, Narouz I, Henry D, et al. How do trainee doctors learn about research? Content analysis of Australian specialist colleges’ intended research curricula. BMJ Open. 2020;10(3): e034962.


Mykkanen K, Tran V. The ACEM trainee research requirement is no longer relevant: Yes. Emerg Med Australas. 2017;29(6):724–5.

Altman DG. The scandal of poor medical research. BMJ. 1994;308(6924):283–4.

Ryan J, Bonanno R, Dunn G, Fahrenwald R, Kirsch S. Required research a disservice? Fam Med. 1997;29(9):610–1.

McKinnon G. “But I just want to be a good clinician”: Research in Canadian surgical training programs. Ann R Coll Physicians Surg Can. 2002;35(4):203–6.

Del Mar C. Publishing research in Australian family physician. Aust Fam Physician. 2001;30(11):1094–5.

Farrell I, Duff S. Research requirements for CCT across the surgical specialties: why the difference? The Bulletin of the Royal College of Surgeons of England. 2020;102(S1):39–42.

Abramson EL, Naifeh MM, Stevenson MD, Mauer E, Hammad HT, Gerber LM, et al. Scholarly activity training during residency: Are we hitting the mark? A national assessment of pediatric residents. Acad Pediatr. 2018;18(5):542–9.

Carter S, Liew S, Brown G, Moaveni AK. Barriers to completion of research projects among orthopaedic trainees. J Surg Educ. 2018;75(6):1630–4.

Cheung G, Gale C, Menkes DB. What affects completion of the scholarly project? A survey of RANZCP trainees. Australas Psychiatry. 2018;26(5):545–50.

Clancy AA, Posner G. Attitudes toward research during residency: A survey of Canadian residents in obstetrics and gynecology. J Surg Educ. 2015;72(5):836–43.

Levine RB, Hebert RS, Wright SM. Resident research and scholarly activity in internal medicine residency training programs. J Gen Intern Med. 2005;20:155–9.

Mansi A, Karam WN, Chaaban MR. Attitudes of residents and program directors towards research in otolaryngology residency. Ann Otol Rhinol Laryngol. 2019;128(1):28–35.

Mitra B, Jones P, Fatovich D, Thom O, Trainee Research Committee ACEM. Trainee perspectives on usefulness of the Trainee Research Requirement. Emerg Med Australas. 2014;26(4):392–7.

Silcox LC, Ashbury TL, VanDenKerkhof EG, Milne B. Residents’ and program directors’ attitudes toward research during anesthesiology training: a Canadian perspective. Anesth Analg. 2006;102(3):859–64.

Tien-Estrada J, Vieira A, Percy V, Millar K, Tam H, Russell K, et al. Determinants of scholarly project completion in a paediatric resident program in Canada. Paediatr Child Health. 2019;24(2):e98–103.

Frank JR, editor. The CanMEDS 2005 physician competency framework. Better standards. Better physicians. Better care. Ottawa (CAN): The Royal College of Physicians and Surgeons of Canada; 2005.

McKeon S, Alexander E, Brodaty H, Ferris B, Frazer I, Little M. Strategic review of health and medical research in Australia–better health through research. Canberra (AU): Commonwealth of Australia; 2013. p. 204.

Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.

Stehlik P, Henry DA, Glasziou PP. Specialist college training: a potential source of research wastage. In: The BMJ Opinion. 2020. Available from: https://blogs.bmj.com/bmj/2020/07/14/specialist-college-training-a-potential-source-of-research-wastage/ . Accessed 25 Feb 2024 .

Withers C, Noble C, Brandenburg C, Glasziou PP, Stehlik P. Selection criteria for Australian and New Zealand medical specialist training programs: Another under-recognised driver of research waste. Med J Aust. 2021;215(7):336.

Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.

Alamri Y, Alsahli K, Butler J, Cawood T. Do medical interns publish findings of compulsory audit or research projects? Five-year experience from a single centre in New Zealand. J Adv Med Educ Prof. 2020;8(2):100.

Burkhart RJ, Hecht CJ, Karimi AH, Acuña AJ, Kamath AF. What are the trends in research publication misrepresentation among orthopaedic residency and fellowship applicants from 1996 to 2019? A systematic review. Clin Orthop Relat Res. 2023;481(7):1292–303.

Rodriguez-Unda NA, Webster ND, Verheyden CN. Trends in academic “ghost publications” in plastic surgery residency applications: A 3-year study. Plast Reconstr Surg Glob Open. 2020;8(1):e2617.

Stehlik P, Withers C, Bourke R, Barnett AG, Brandenburg C, Noble C et al. A cross sectional survey of Australian and New Zealand specialist trainees’ research experiences and outputs. medRxiv [Preprint] 2024. https://doi.org/10.1101/2024.03.11.24303739v1 .

Gawad N, Allen M, Fowler A. Decay of competence with extended research absences during residency training: a scoping review. Cureus. 2019;11(10):e5971.

Schott NJ, Emerick TD, Metro DG, Sakai T. The cost of resident scholarly activity and its effect on resident clinical experience. Anesth Analg. 2013;117(5):1211–6.

Noble C, Billett SR, Phang DT, Sharma S, Hashem F, Rogers GD. Supporting resident research learning in the workplace: a rapid realist review. Acad Med. 2018;93(11):1732–40.

Reck SJ, Stratman EJ, Vogel C, Mukesh BN. Assessment of residents’ loss of interest in academic careers and identification of correctable factors. Arch Dermatol. 2006;142(7):855–8.

Bradshaw C, Atkinson S, Doody O. Employing a qualitative description approach in health care research. Glob Qual Nurs Res. 2017;4:2333393617742282.

Crotty M. The foundations of social research: Meaning and perspective in the research process. St Leonards, NSW (AU): Allen & Unwin; 1998.

O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51.

Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JP, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4.

Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.

Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

Salman RA-S, Beller E, Kagan J, Hemminki E, Phillips RS, Savulescu J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet. 2014;383(9912):176–85.

Zoom Video Communications Inc. Zoom, version 5. 2023.

Otter.ai Inc. Otter, version 3. 2022.

Byrne D. A worked example of Braun and Clarke’s approach to reflexive thematic analysis. Qual Quant. 2022;56:1391–412.

QSR International Pty Ltd. NVivo qualitative data analysis software, version 11. 2015.

Abramson EL, Weiss P, Mauer EL, Naifeh M, Stevenson M, Rama J, et al. Scholarly activity training during fellowship: a national assessment. Acad Pediatr. 2020;20(7):e1.

Phang DT, Rogers GD, Hashem F, Sharma S, Noble C. Factors influencing junior doctor workplace engagement in research: an Australian study. Focus Health Profess Educ. 2020;21(1):13–28.

Dahn HM, Best L, Bowes D. Attitudes towards research during residency training: a survey of Canadian radiation oncology residents and program directors. J Cancer Educ. 2020;35:1111–8.

Mittwede PN, Morales-Restrepo A, Fourman MS, Fu FH, Lee JY, Ahn J, et al. Track residency programs in orthopaedic surgery: a survey of program directors and recent graduates. J Bone Joint Surg Am. 2019;101(15):1420–7.

Brandenburg C, Stehlik P, Wenke R, Jones K, Hattingh L, Dungey K, et al. How can healthcare organisations increase doctors’ research engagement? A scoping review. J Health Organ Manag. 2024. https://doi.org/10.1108/JHOM-09-2023-0270 .

Stevenson MD, Smigielski EM, Naifeh MM, Abramson EL, Todd C, Li S-TT. Increasing scholarly activity productivity during residency: a systematic review. Acad Med. 2017;92(2):250–66.

Laupland KB, Edwards F, Dhanani J. Determinants of research productivity during postgraduate medical education: a structured review. BMC Med Educ. 2021;21(1):1–9.

Li QK, Wollny K, Twilt M, Walsh CM, Bright K, Dimitropoulos G, et al. Curricula, teaching methods, and success metrics of clinician–scientist training programs: a scoping review. Acad Med. 2022;97(9):1403–12.

Hill H, Harvey C. Completing the RANZCP scholarly project in a rural/regional setting: a practical example. Australas Psychiatry. 2021;29(2):234–6.

McHenry MS, Abramson EL, McKenna MP, Li S-TT. Research in pediatric residency: National experience of pediatric chief residents. Acad Pediatr. 2017;17(2):144–8.

D’Arrietta LM, Vangaveti VN, Crowe MJ, Malau-Aduli BS. Rethinking health professionals’ motivation to do research: a systematic review. J Multidiscip Healthc. 2022;15:185–216.

Royal Australian College of General Practitioners. Research during training. 2024. Available from: https://www.racgp.org.au/education/research-grants-and-programs/research-grants-and-programs/research-webinar-series/research-during-training . Accessed 25 Feb 2024.

Australian Health Practitioner Regulation Agency. National health workforce dataset: medical practitioners dashboard. 2022. https://hwd.health.gov.au/mdcl-dashboards/index.html . Accessed 12 Jan 2024.

Traill CL, Januszewski AS, Larkins R, Keech AC, Jenkins AJ. Time to research Australian physician-researchers. Intern Med J. 2016;46(5):550–8.

Cuevas Shaw L, Errington TM, Mellor DT. Toward open science: contributing to research culture change. Sci Ed. 2022;45:14–7.

Royal Australasian College of Surgeons. Training regulations: General surgery education and training program. 2024. https://www.generalsurgeons.com.au/media/files/Education%20and%20Training/GSET/REG%202024-01%20Jan%202024.pdf . Accessed 12 Jan 2024.

Chapman SJ, Glasbey JC, Khatri C, Kelly M, Nepogodiev D, Bhangu A, et al. Promoting research and audit at medical school: Evaluating the educational impact of participation in a student-led national collaborative study. BMC Med Educ. 2015;15(1):1–11.

Javanmard-Emamghissi H, The COVID:HAREM Collaborative Group. Forming a novel trainee-led research collaborative during times of crisis: lessons learned from the COVID: HAREM collaborative. BJS Open. 2021;5(Supp 1):zrab033.002.

Jones CS, Dada M, Dewhurst M, Dewi F, Pathak S, Main BG, et al. Facilitating engagement in surgical research through a virtual systematic review network: the RoboSurg Collaborative. BJS Open. 2021;5(Supp 1):zrab033.001.

Jamjoom AA, Phan PN, Hutchinson PJ, Kolias AG. Surgical trainee research collaboratives in the UK: an observational study of research activity and publication productivity. BMJ Open. 2016;6(2): e010374.

Elliott B, Carmody JB. Publish or perish: The research arms race in residency selection. J Grad Med Educ. 2023;15(5):524–7.

Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and delphi survey. JAMA Netw Open. 2018;1(2):e180281.

Ruco A, Morassaei S, Di Prospero L. Development of research core competencies for academic practice among health professionals: a mixed-methods approach. Qual Manag Health Care. 2024. https://doi.org/10.1097/QMH.0000000000000443 .


Acknowledgements

The authors would like to acknowledge the funding support of the Gold Coast Health Collaborative Research Grant Scheme and the valuable contributions to initial concept and funding acquisition of the overarching ENHANCE project team (including Caitlyn Withers, Rachel Bourke, Adrian Barnett, Alexandra Bannach-Brown, Paul Glasziou, Mark Morgan, Thomas Campbell and David Henry). The authors would also like to extend their sincere gratitude to the participants who generously gave their time and shared their valuable insights and experiences for this study.

Funding

This study was funded by the Gold Coast Health Collaborative Research Grant Scheme 2020 (RGS2020-037).

Author information

Authors and Affiliations

Gold Coast Hospital and Health Service, Queensland Health, Gold Coast, QLD, Australia

Caitlin Brandenburg, Joanne Hilder, Rhea Liang, Gerben Keijzers & David Pearson

Faculty of Health Sciences and Medicine, Bond University, Gold Coast, QLD, Australia

Caitlin Brandenburg, Joanne Hilder, Rhea Liang, Kirsty Forrest, Gerben Keijzers & Sharon Mickan

Medical School, University of Queensland, Brisbane, QLD, Australia

Christy Noble

Metro North Mental Health, Queensland Health, Brisbane, QLD, Australia

Hitesh Joshi

School of Medicine and Dentistry, Griffith University, Gold Coast, QLD, Australia

Gerben Keijzers

Metro South Digital Health and Informatics, Princess Alexandra Hospital, Brisbane, QLD, Australia

Ian A. Scott

Centre for Health Services Research, University of Queensland, Brisbane, QLD, Australia

Dermatology Department, St Vincent’s Hospital, Melbourne, VIC, Australia

Emma Veysey

Institute for Evidence-Based Healthcare, Bond University, Gold Coast, QLD, Australia

Paulina Stehlik

School of Pharmacy and Medical Sciences, Griffith University, Gold Coast, QLD, Australia


Contributions

CB is the study guarantor and led the project, while PS led project administration. PS, CB and CN conceptualised the study, and all authors, as well as the broader ENHANCE authorship group (see protocol), contributed to the development of the methodology and funding acquisition. Data collection and analysis was undertaken by JH, overseen by CB. All authors contributed to the analysis and/or interpretation of data. CB and JH developed the original draft, earlier versions of the draft were reviewed by PS, CN and RL, and later drafts were reviewed by remaining authors. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Caitlin Brandenburg .

Ethics declarations

Ethics approval and consent to participate

The Bond University Human Research Ethics Committee (PS00149) provided approval to conduct the study. All methods were carried out in accordance with relevant guidelines and regulations. All participants provided written informed consent.

Consent for publication

Not applicable.

Competing interests

The authors declare no financial competing interests. Some authors have roles relating to medical training which may influence their opinions, listed here. CB & PS are currently on the Queensland Training for Research Active Clinicians (QTRAC) Working Party. RL is Clinical Sub Dean GCHHS for Bond University; RACS educator and course director, Care of the Critically Ill Surgical Patient (CCrISP) and Operate With Respect (OWR); and Co-Chair, Supervisors and Educators Reference Group (SERG), A Better Culture project. KF is Dean of Medicine, Bond University; Executive committee member and Treasurer Medical Deans of Australia and New Zealand (MDANZ); Chair of Education and Evaluation Committee (EDEC) of the Australia and New Zealand College of Anaesthetists (ANZCA); and Chair of Professional Practice research network (PPRN). HJ is a member (casual) of the RANZCP Committee for Examinations. GK is a trainee research requirement adjudicator for the Australasian College of Emergency Medicine. IS gives lectures on research methods to trainees of the Royal Australasian College of Physicians and has been an examiner for the college. EV is chair of the Australasian College of Dermatologists Academic Research Committee (ARC). JH, CN, SM & DP report no specific roles of interest.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1. Interview guide.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .


About this article

Cite this article

Brandenburg, C., Hilder, J., Noble, C. et al. “Luck of the draw really”: a qualitative exploration of Australian trainee doctors’ experiences of mandatory research. BMC Med Educ 24 , 1021 (2024). https://doi.org/10.1186/s12909-024-05954-6


Received : 12 April 2024

Accepted : 23 August 2024

Published : 18 September 2024

DOI : https://doi.org/10.1186/s12909-024-05954-6


Keywords

  • Medical training
  • Research requirements
  • Scholarly activity


Singapore Med J. 2018; 59(12)

Qualitative research essentials for medical education

Sayra M Cristancho

1 Department of Surgery and Faculty of Education, Schulich School of Medicine and Dentistry, Western University, Canada

2 Centre for Education Research and Innovation, Schulich School of Medicine and Dentistry, Western University, Canada

Mark Goldszmidt

3 Department of Medicine, Schulich School of Medicine and Dentistry, Western University, Canada

Lorelei Lingard

Christopher Watling

4 Postgraduate Medical Education, Schulich School of Medicine and Dentistry, Western University, Canada

This paper offers a selective overview of the increasingly popular paradigm of qualitative research. We consider the nature of qualitative research questions, describe common methodologies, discuss data collection and analysis methods, highlight recent innovations and outline principles of rigour. Examples are provided from our own and other authors’ published qualitative medical education research. Our aim is to provide both an introduction to some qualitative essentials for readers who are new to this research paradigm and a resource for more experienced readers, such as those who are currently engaged in a qualitative research project and would like a better sense of where their work sits within the broader paradigm.

INTRODUCTION

Are you a medical education researcher engaged in qualitative research and wondering if you are on the right track? Are you contemplating a qualitative research project and not sure how to get started? Are you reading qualitative manuscripts and making guesses about their quality? This paper offers a selective overview of the increasingly popular domain of qualitative research. We consider the nature of qualitative research questions, describe common methodologies, discuss data collection and analysis methods, highlight recent innovations, and outline principles of rigour. The aim of this paper is to educate newcomers through introductory explanations while stimulating more experienced researchers through attention to current innovations and emerging debates.

WHAT IS QUALITATIVE RESEARCH?

Qualitative research is naturalistic; the natural setting – not the laboratory – is the source of data. Researchers go where the action is; to collect data, they may talk with individuals or groups, observe their behaviour and their setting, or examine their artefacts.( 1 ) As defined by leading qualitative researchers Denzin and Lincoln, qualitative research studies social and human phenomena in their natural settings, attempting to make sense of or interpret these phenomena in terms of the meanings participants bring to them.( 2 )

Because qualitative research situates itself firmly in the world it studies, it cannot aim for generalisability. Its aim is to understand, rather than erase, the influence of context, culture and perspective. Good qualitative research produces descriptions, theory or conceptual understanding that may be usefully transferred to other contexts, but users of qualitative research must always carefully consider how the principles unearthed might unfold in their own distinct settings.

WHAT QUESTIONS ARE APPROPRIATE FOR QUALITATIVE RESEARCH?

Meaningful education research begins with compelling questions. Research methods translate curiosity into action, facilitating exploration of those questions. Methods must be chosen wisely; some questions lend themselves to certain methodological approaches and not to others.

In recent years, qualitative research methods have become increasingly prominent in medical education. The reason is simple: some of the most pressing questions in the field require qualitative approaches for meaningful answers to be found.

Qualitative research examines how things unfold in real world settings. While quantitative research approaches that dominate the basic and clinical sciences focus on testing hypotheses, qualitative research explores processes, phenomena and settings ( Box 1 ). For example, the question “Does the introduction of a mandatory rural clerkship increase the rate of graduates choosing to practise in rural areas? ” demands a quantitative approach. The question embeds a hypothesis – that a mandatory rural clerkship will increase the rate of graduates choosing to practise in rural areas – and so the research method must test whether or not that hypothesis is true. But the question “ How do graduating doctors make choices about their practice location? ” demands a qualitative approach. The question does not embed a hypothesis; rather, it explores a process of decision-making.

Box 1. Qualitative research questions.

Many issues in medical education could be examined from either a quantitative or qualitative approach; one approach is not inherently superior. The questions that drive the research as well as the products that derive from it are, however, fundamentally different. Consider two approaches to studying the issue of online learning. A quantitative researcher might ask, “ What is the effect of an online learning module on medical students’ end-of-semester OSCE [objective structured clinical examination] scores? ”, while a qualitative researcher might ask “ How do medical students make choices about using online learning resources? ” Although the underlying issue is the same – the phenomenon of online learning in medical school – the studies launched by these questions and the products of those studies will look very different.

WHAT ARE QUALITATIVE METHODOLOGIES AND WHY ARE THEY IMPORTANT?

Executing rigorous qualitative research requires an understanding of methodology – the principles and procedures that define how the research is approached. Far from being monolithic, the world of qualitative research encompasses a range of methodologies, each with distinctive approaches to inquiry and characteristic products. Methodologies are informed by the researcher’s epistemology – that is, their theory of knowledge. Epistemology shapes how researchers approach the researcher’s role, the participant-researcher relationship, forms of data, analytical procedures, measures of research quality, and representation of results in analysis and writing.( 3 )

In medical education, published qualitative work includes methodologies such as grounded theory, phenomenology, ethnography, case study, discourse analysis, participatory action research and narrative inquiry, although the list is growing as the field embraces researchers with diverse disciplinary backgrounds. This paper neither seeks to exhaustively catalogue all qualitative methodologies nor comprehensively describe any of them. Rather, we present a subset, with the aim of familiarising readers with its fundamental goals. In this article, we briefly introduce four common methodologies used in medical education research ( Box 2 ). Using one topic, professionalism, we illustrate how each methodology might be applied and how its particular features would yield different insights into that topic.

Box 2. Common qualitative methodologies in medical education.

Grounded theory

Arguably the most frequently used methodology in medical education research today, grounded theory seeks to understand social processes. Core features of grounded theory include iteration, in which data collection and analysis take place concurrently with each informing the other, and a reliance on theoretical sampling to explore patterns as they emerge.( 4 ) While many different schools of grounded theory exist, they share the aim of generating theory that is grounded in empirical data.( 5 ) Theory, in this type of research, can be thought of as a conceptual understanding of the process under study, ideally affording a useful explanatory power. For example, if one were interested in the development of professionalism among senior medical students during clerkship, one might design a grounded theory study around the following question: “ What aspects of clerkship support or challenge professional behaviour among senior medical students? ” The resulting product would be a conceptual rendering of how senior medical students navigate thorny professionalism issues, which might in turn be useful to curriculum planners.

Phenomenology

This methodology begins with a phenomenon of interest and seeks to understand the subjective lived experience of that phenomenon.( 6 ) Core features of phenomenology include a focus on the individual experience (typically pursued through in-depth interviewing and/or examinations of personal narratives), inductive analysis and a particular attention to reflexivity.( 7 ) Phenomenological researchers typically enumerate their own ideas and preconceptions about the phenomenon under study and consider how these perceptions might influence their interpretation of data.( 8 ) A phenomenological study around professionalism in senior medical students, for example, might involve interviewing several students who have experienced a professionalism lapse about that experience. The resulting product might be an enhanced understanding of the emotional, social and professional implications of this phenomenon from the student’s perspective, which might in turn inform wellness or resilience strategies.

Ethnography

Ethnography aims to understand people in their contexts, exploring the influence of culture, social organisation and shared values on how people behave – their routines and rituals. Core features of ethnography include reliance on direct observation as a data source, and the use of sustained immersive engagement in the setting of interest in order to understand social dynamics from within.( 9 , 10 ) An ethnographic approach to studying how professional attitudes develop in senior medical students might gather data through observations of ward rounds, team meetings and clinical teaching sessions over a period of time. The resulting product – called an ethnography – would describe how professional values are socialised in junior learners in clinical settings, which could assist educators in understanding how the clinical experiences they programme for their learners are influencing their professional development.

Case study

Case study research seeks an in-depth understanding of an individual case (or series of cases) that is illustrative of a problem of interest. Like clinical case studies, the goal is not generalisation but a thorough exploration of one case, in hopes that the fruits of that exploration may prove useful to others facing similar problems. Core features of case study research include: thoughtful bounding or defining of the scope of the case at the outset; collection of data from multiple sources, ranging from interviews with key players to written material in policy documents and websites; and careful attention to both the phenomenon of interest and its particular context.( 11 ) A specific professionalism challenge involving medical students could provide fodder for a productive case study. For example, if a medical school had to discipline several students for inappropriately sharing personal patient information on social media, a case study might be undertaken. The ‘case’ would be the incident of social media misuse at a single medical school, and the data gathered might include interviews with students and school officials, examination of relevant policy documents, examination of news media coverage of the event, and so on. The product of this research might trigger similar institutions to carefully consider how they might approach – or prevent – a similar problem.

As these four examples illustrate, methodology is the backbone of qualitative research. Methodology shapes the way the research question is asked, defines the characteristics of an appropriate sample, and governs the way the data collection and analysis procedures are organised. The researcher’s role is also distinctive in each methodology; for instance, in constructivist grounded theory, the researcher actively constructs the theory,( 12 ) while in phenomenology, the researcher attempts to manage his or her ‘pre-understandings’ through either bracketing them off or being reflexive about them.( 13 ) Interested readers may wish to consult the reference list for recently published examples of research using grounded theory,( 14 ) phenomenology,( 15 ) ethnography( 16 ) and case study approaches( 17 ) in order to appreciate how researchers deploy these methodologies to tackle compelling questions in contemporary medical education.

WHAT ARE SOME COMMON METHODS OF QUALITATIVE DATA COLLECTION?

The most common methods of qualitative data collection are interview – talking to participants about their experiences relevant to the research question, and observation – watching participants while they are having those experiences. Depending on the research questions explored, a research design might combine interviews and observations.

Interview-based methods

Interviews are typically used for situations where a guided conversation with relevant participants can help provide insight into their lived experiences and how they view and interpret the world around them. Interviews are also particularly useful for exploring past events that cannot be replicated or phenomena where direct observation is impossible or unfeasible.

Participants may be interviewed individually or in groups. Focus group interviews are used when the researcher’s topic of interest is best explored through a guided, interactive discussion among the participants themselves. Therefore, when focus groups are used, the sample is conceptualised at the level of the group – three focus groups of five people constitute a sample of three interactive discussions, not 15 individual participants. Because they centre on the group discussion and dynamic, focus groups are less well-suited for topics that are sensitive, highly personal or perceived to be culturally inappropriate to discuss publicly.( 18 )

Unlike quantitative interviews, where a set of structured, closed-ended (e.g. yes/no) questions are asked in the same order with the same wording every time, qualitative interviews typically involve a semi-structured design where a list of open-ended questions serves to guide, but not constrain, the interview. Therefore, at the interviewer’s discretion, the questions and their sequence may vary from interview to interview. This judgement is made based on both the interviewer’s understanding of the phenomenon under exploration and the emerging dynamic between the interviewer and participant.

The primary goal of a qualitative interview is to get the participants to think carefully about their experience and relate it to the interviewer with rich detail. Getting good data from interviewing relies on using creative strategies to avoid the common trap of getting politically correct answers – often called ‘cover stories’– or answers that are superficial rather than deep and reflective.( 19 ) A common design error occurs when researchers are overly explicit in their questioning, such as asking “ What are the top five criteria you use to assess student professionalism? ” A better approach involves questions that ask participants to describe what they do in practice, with follow-up probes that extend beyond the specific experience described. For example, starting with “ Tell me about a recent experience where you assessed a student’s professionalism ” allows the participant to relay an experience, to which the interviewer can respond with probes such as “ What was tricky about that? ” or “ How typical is that experience? ”

Another common strategy for prompting participants to engage in rich reflection on their experience and perceptions is to use vignettes as discussion prompts. Vignettes are often artificial scenarios presented to participants to read or watch on video, about which they are then asked probing questions.( 20 ) However, vignettes can also be used to recreate an authentic situation for the participant to engage with.( 21 ) For instance, in one interview study, we presented participants with a vignette in the form of the research assistant reading aloud a standard patient admission presentation that the interviewees would typically hear from their students on morning ward rounds. We then asked the participants to interact with the interviewer as though he or she was a student who had presented this case on morning rounds. Recreating this interaction in the context of the interview served as a stepping stone to questions such as “ Why did you ask the student ‘x’? ” and “ How would your approach have differed with a different student presenter, e.g. a stronger or weaker one? ”

Direct observation

Observation-based research can involve a wide spectrum of activities, ranging from brief observations of specific tasks (e.g. handover, preoperative team briefings) to prolonged field observations such as those seen in ethnography. When used effectively, direct observation can provide the researcher with powerful insight into the routines of a group.

Getting good data from observational research relies on several key components. First, it is essential to define the scope of the project upfront: limited budgets, the massive amount of detail to be attended to, and the limited capacity of any individual or group of observers to attend to it all make this necessary. Good observational research therefore relies on collaboration between knowledgeable insiders and those with both methodological and theoretical expertise. Sampling demands particular attention; an initial purposive sampling approach is often followed by more targeted, theoretical sampling that is guided by the developing analysis. Observational research also typically involves a mix of data sources, including observational field notes, field interviews and document analysis. Audio and video may be helpful when the studied phenomenon is particularly complex, when nuances of interaction may be missed without the ability to review data, or when precision of verbal and nonverbal interactions is necessary to answer the research question.( 22 )

Field notes are often the dominant data source used for subsequent analysis in observational research. As such, they must be created with great diligence. Usually researchers will jot down brief notes during an observation and afterwards elaborate in as much detail as they can recall. Field notes have an important reflective component. In addition to the factual descriptions, researchers include comments about their feelings, reactions, hunches, speculations and working theories or interpretations. The content of field notes, therefore, usually includes: descriptions of the setting, people and activities; direct quotations or paraphrasing of what people said; and the observer’s reflections.( 23 ) Field notes are time-consuming when done well – even a single hour of observation can lead to several hours of reflective documentation.
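
For researchers who manage their observational data electronically, the components of a field note described above can be kept together as a simple structured record. The following Python sketch is purely illustrative and is not drawn from this article or any particular study; the field names and example values are hypothetical.

```python
# Illustrative only: one possible structure for a single field-note entry,
# reflecting the typical components described above (setting, factual
# description, quotations and the observer's reflections).
# All names and example values are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class FieldNote:
    setting: str                   # description of the setting and activity observed
    observed_at: datetime          # when the observation took place
    description: str               # factual account of people and actions
    quotations: List[str] = field(default_factory=list)  # direct or paraphrased speech
    reflections: str = ""          # feelings, hunches, working interpretations

note = FieldNote(
    setting="Morning ward round, general medicine team",
    observed_at=datetime(2024, 3, 5, 8, 30),
    description="Senior resident leads the round; two students present new admissions.",
    quotations=["'Let's come back to that after we see the patient.'"],
    reflections="Presentations seem tailored to the attending's known preferences.",
)
print(note.setting)
```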

An important aspect to consider when designing observation-based research is the ‘observer effect’, also known as the Hawthorne effect, more recently reframed as ‘participant reactivity’ by health professions education researchers Paradis and Sutkin.( 24 ) The Hawthorne effect is conventionally defined as “ when observed participants act differently from how they would act if the observer were not present ”.( 25 ) Researchers have implemented a number of strategies to mitigate this effect, including prolonged embedding of the observer, efforts to ‘fit in’ through dress or comportment, and careful recording of explicit instances of the effect.( 24 ) However, Paradis and Sutkin found that instances of the Hawthorne effect, as conventionally defined, have never been described in qualitative research manuscripts in the health professions education field, perhaps because, as they speculate, healthcare workers and trainees are accustomed to being observed. Based on this, they argued that researchers should worry less about mitigating the Hawthorne effect and instead invest in interpersonal relationships at their study site to mitigate the effects of altered behaviour and draw on theory to make sense of participants’ altered behaviour.( 23 ) Combining interviewing and observation is also common in qualitative research ( Box 3 ).

Box 3. Combining interviews and observations.

WHAT ARE THE COMMON METHODS OF QUALITATIVE DATA ANALYSIS?

Qualitative data almost invariably takes the form of text; an interview is turned into a transcript and an observation is rendered into a field note. Analysing these qualitative texts is about uncovering meaning, developing understanding and discovering insights relevant to the research question. Analysis is not separated from data collection in qualitative research, and begins with the first interview, the first observation or the first reading of a document. In fact, the iterative nature of data collection and analysis is a hallmark of qualitative research, because it allows the researcher’s emerging insights about the study phenomena to inform subsequent rounds of data collection ( Box 4 ).

Box 4. The iterative process of analysis.

Data that has been analysed while being collected is both parsimonious and illuminating. However, this process can extend indefinitely. There will always be another person to interview or another observation to record. Deciding when to stop depends on both practical and theoretical concerns. Practical concerns include deadlines and funding. More importantly, the decision should be guided by the theoretical concern of sufficiency.( 26 ) Sufficiency occurs when new data does not produce new insights into the phenomenon, in other words, when you keep hearing and seeing the same things you have heard and seen before.

Qualitative data analysis is primarily inductive and comparative. The overall process of data analysis begins by identifying segments in the data that are responsive to the research question. The next step is to compare one segment with the next, looking for recurring patterns in the data set. During this step, the focus is on sorting the raw data into categories that progressively build a coherent description or explanation of the phenomenon under study. This process of identifying pieces of data and grouping them into categories is called coding.( 14 ) Once a tentative scheme of categories is derived, it is applied to new data to see whether those categories continue to exist or not, or whether new categories arise – this step determines whether sufficiency has been reached. The final step in the analysis is to think about how categories interrelate. At this point, the analysis moves to interpreting the meaning of these categories and their interrelations.( 12 )
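
For illustration only, the basic inductive and comparative logic described above – sorting coded excerpts into categories and checking whether new data still yields new categories – can be sketched in a few lines of Python. This toy example uses hypothetical codes and excerpts; it is not a substitute for qualitative data analysis software, nor for the interpretive judgement of the researcher.

```python
# A minimal, hypothetical sketch of the basic inductive/comparative step:
# coded excerpts are grouped into categories, and a later batch of data is
# checked for whether it introduces any category not seen before (one crude
# signal, among others, that sufficiency may be approaching).
from collections import defaultdict

def group_by_code(excerpts):
    """Sort (text, code) pairs into categories keyed by code."""
    categories = defaultdict(list)
    for text, code in excerpts:
        categories[code].append(text)
    return categories

def adds_new_categories(existing, new_batch):
    """Return True if the new batch introduces a code not yet in 'existing'."""
    return any(code not in existing for _, code in new_batch)

# First round of (hypothetical) coded interview excerpts
round_1 = [
    ("I never had protected time for the project", "lack of protected time"),
    ("My supervisor met with me every week", "supervision quality"),
]
categories = group_by_code(round_1)

# A later round that only repeats existing codes suggests no new categories
round_2 = [("My supervisor rarely replied to emails", "supervision quality")]
print(adds_new_categories(categories, round_2))  # prints False
```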

The process for data analysis laid out in this section is a basic inductive and comparative analysis strategy that is suitable for analysing data for most interpretive qualitative research methodologies, including the four featured in this paper – phenomenology, grounded theory, ethnography and case study – as well as others such as narrative analysis and action research. While each methodology attends to specific procedures, they all share the use of this basic inductive/comparative strategy. Overall, analysis should be guided by methodology, but different analytical procedures can be creatively combined across methodologies, as long as this combining is explicit and intentional.( 27 )

WHAT ARE SOME CURRENT INNOVATIONS IN QUALITATIVE RESEARCH?

Understanding the complex factors that influence clinical practice and medical education is not an easy research task. Many important issues may be difficult for the insider to articulate during interviews and impossible for the outsider to ‘see’ during observation. Innovations to address these challenges include guided walks,( 28 ) photovoice( 29 ) and point-of-view filming.( 30 ) Our own research has drawn intensively on the innovation termed ‘rich pictures’ to explore the features and implications of complexity in medical education.( 31 ) In one study, we asked medical students to draw pictures of clinical cases that they found complex: an exciting case and a frustrating one.( 32 ) Participants were given 30–60 minutes on their own to reflect on the situation and draw their pictures. This was followed by an in-depth interview using the pictures as triggers to explore the phenomenon under study – in this case, students’ experiences of and responses to complexity during their training.

Such innovations hold great promise for qualitative research in medical education. For instance, rich pictures can reveal emotional and organisational dimensions of complex clinical experiences, which are less likely to be emphasised in participants’ traditional interview responses.( 33 ) Methodological innovations, however, bring new challenges: they can be time-intensive for participants and researchers; they require new analytical procedures to be developed; and they necessitate efforts to educate audiences about the rigour and credibility of unfamiliar approaches.

WHAT ARE THE PRINCIPLES OF RIGOUR IN QUALITATIVE RESEARCH?

Like quantitative research, qualitative research has principles of rigour that are used to judge the quality of the work.( 34 ) Here, we discuss principles that appear in most criteria for rigour in the field: reflexivity, adequacy, authenticity, trustworthiness and resonance ( Box 5 ).

Box 5. Principles of rigour in qualitative research.

The main data collection tool in qualitative research is the researcher. We talk to participants, observe their practices and interpret their documents. Consequently, a critical feature of rigour in qualitative data collection is researcher reflexivity: the ability to consider our own orientations towards the studied phenomenon, acknowledge our assumptions and articulate regularly our impressions of the data.( 35 ) Only this way can we assure others that our subjectivity has been thoughtfully considered and afford them the ability to judge its influence on the work for themselves. Qualitative research does not seek to remove this subjectivity; it treats research perspective as unavoidable and enriching, not as a form of bias to purge.

Every qualitative dataset is an approximation of a complex phenomenon – no study can capture all dimensions and nuances of situated social experiences, such as medical students’ negotiations of professional dilemmas in the clinical workplace. Therefore, two other important criteria of rigour relate to the adequacy and authenticity of the sampled experiences. Did the research focus on the appropriate participants and/or situations? Was the size and scope of the sample adequate to represent the scope of the phenomenon?( 36 ) Was the data collected an authentic reflection of the phenomenon in question? Qualitative researchers should thoughtfully combine different perspectives, methods and data sources (a process called ‘triangulation’) to intensify the richness of their representation.( 37 ) We should endeavour to draw on data in our written reports such that we provide what sociologist Geertz has termed a sufficiently ‘thick’ description( 38 ) for readers to judge the authenticity of our portrayal of the studied phenomenon.

Qualitative analysis embraces subjectivity: what the researcher ‘sees’ in the data is a product both of what participants told or showed us and of what we were oriented to make of those stories and situations. To some degree, a rhetorician will always see rhetoric and a systems engineer will always see systems. Qualitative analysis should therefore also be systematic and held to the principle of trustworthiness, which dictates that we should clearly describe: (a) what was done by whom during the inductive, comparative analytical process; (b) how the perspectives of multiple coders were negotiated; (c) how and when theoretical lenses were brought to bear in the iterative process of data collection and analysis; and (d) how discrepant instances in the data – those that fell outside the dominant thematic patterns – were handled.

Finally, the ultimate measure of quality in qualitative research is the resonance of the final product to those who live the social experience under study.( 4 ) As qualitative researchers presenting our work at conferences, we know we have met this bar if our audiences laugh, nod or scowl at the right moments, and if their response at the end is “ You nailed it. That’s my world. But you’ve given me a new way to look at it ”. The situatedness of qualitative research means that its transferability to other contexts is always a matter of the listener/reader’s judgement, based on their consideration of the similarities and differences between the research context and their own. Thus, there is a necessity for qualitative research to sufficiently describe its context, so that consumers of the work have the necessary information to gauge transferability. Ultimately, though, transferability remains an open question, requiring further inquiry to explore the explanatory power of one study’s insights in a new setting.

What else is there to know?

This overview of qualitative research in medical education is not exhaustive. We have been purposefully selective, discussing in depth some common methodologies and methods, and leaving aside others. We have also passed over important issues such as qualitative research ethics, sampling and writing. There is much, much more for readers to know! Our selectivity notwithstanding, we hope that this paper will provide an accessible introduction to some qualitative essentials for readers who are new to this research domain, and that it may serve as a useful resource for more experienced readers, particularly those who are doing a qualitative research project and would like a better sense of where their work sits within the broader field of qualitative approaches.
