Qualitative Research: Characteristics, Design, Methods & Examples

Lauren McCall

MSc Health Psychology Graduate

MSc, Health Psychology, University of Nottingham

Lauren obtained an MSc in Health Psychology from The University of Nottingham with a distinction classification.


Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD, is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.

Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Qualitative research is a type of research methodology that focuses on gathering and analyzing non-numerical data to gain a deeper understanding of human behavior, experiences, and perspectives.

It aims to explore the “why” and “how” of a phenomenon rather than the “what,” “where,” and “when” typically addressed by quantitative research.

Unlike quantitative research, which focuses on gathering and analyzing numerical data for statistical analysis, qualitative research involves researchers interpreting data to identify themes, patterns, and meanings.

Qualitative research can be used to:

  • Gain a deep, contextual understanding of individuals’ subjective social reality
  • Answer questions about experience and meaning from the participant’s perspective
  • Generate hypotheses and theory by exploring what is important to participants before further, often quantitative, research begins

Examples of qualitative research questions include: 

  • How does stress influence young adults’ behavior?
  • What factors influence students’ school attendance rates in developed countries?
  • How do adults interpret binge drinking in the UK?
  • What are the psychological impacts of cervical cancer screening in women?
  • How can mental health lessons be integrated into the school curriculum? 

Characteristics 

Naturalistic setting

Individuals are studied in their natural setting to gain a deeper understanding of how people experience the world. This enables the researcher to understand a phenomenon close to how participants experience it. 

Naturalistic settings provide valuable contextual information to help researchers better understand and interpret the data they collect.

The environment, social interactions, and cultural factors can all influence behavior and experiences, and these elements are more easily observed in real-world settings.

Reality is socially constructed

Qualitative research aims to understand how participants make meaning of their experiences – individually or in social contexts. It assumes there is no objective reality and that the social world is interpreted (Yilmaz, 2013). 

The primacy of subject matter 

The primary aim of qualitative research is to understand the perspectives, experiences, and beliefs of individuals who have experienced the phenomenon selected for research rather than the average experiences of groups of people (Minichiello, 1990).

An in-depth understanding is attained since qualitative techniques allow participants to freely disclose their experiences, thoughts, and feelings without constraint (Tenny et al., 2022). 

Variables are complex, interwoven, and difficult to measure

Factors such as experiences, behaviors, and attitudes are complex and interwoven, so they cannot be reduced to isolated variables, making them difficult to measure quantitatively.

However, a qualitative approach enables participants to describe what, why, or how they were thinking and feeling during the phenomenon being studied (Yilmaz, 2013).

Emic (insider’s point of view)

The phenomenon being studied is centered on the participants’ point of view (Minichiello, 1990).

Emic is used to describe how participants interact, communicate, and behave in the research setting (Scarduzio, 2017).

Interpretive analysis

In qualitative research, interpretive analysis is crucial in making sense of the collected data.

This process involves examining the raw data, such as interview transcripts, field notes, or documents, and identifying the underlying themes, patterns, and meanings that emerge from the participants’ experiences and perspectives.

Collecting Qualitative Data

There are four main research design methods used to collect qualitative data: observations, interviews, focus groups, and ethnography.

Observations

This method involves watching and recording phenomena as they occur in nature. Observation can be divided into two types: participant and non-participant observation.

In participant observation, the researcher actively participates in the situation/events being observed.

In non-participant observation, the researcher is not an active part of the observation and tries not to influence the behaviors they are observing (Busetto et al., 2020). 

Observations can be covert (participants are unaware that a researcher is observing them) or overt (participants are aware of the researcher’s presence and know they are being observed).

However, awareness of an observer’s presence may influence participants’ behavior. 

Interviews

Interviews give researchers a window into the world of a participant by seeking their account of an event, situation, or phenomenon. They are usually conducted on a one-to-one basis and can be distinguished according to the level at which they are structured (Punch, 2013).

Structured interviews involve predetermined questions and sequences to ensure replicability and comparability. However, they are unable to explore emerging issues.

Informal interviews consist of spontaneous, casual conversations, which can capture a phenomenon closer to how participants naturally experience and describe it. However, information is gathered using quick notes made by the researcher and is therefore subject to recall bias.

Semi-structured interviews have a flexible structure, phrasing, and placement so emerging issues can be explored (Denny & Weckesser, 2022).

The use of probing questions and clarification can lead to a detailed understanding, but semi-structured interviews can be time-consuming and subject to interviewer bias. 

Focus groups 

Similar to interviews, focus groups elicit a rich and detailed account of an experience. However, focus groups are more dynamic since participants with shared characteristics construct this account together (Denny & Weckesser, 2022).

A shared narrative is built between participants to capture a group experience shaped by a shared context. 

The researcher takes on the role of a moderator, who will establish ground rules and guide the discussion by following a topic guide to focus the group discussions.

Typically, focus groups have 4-10 participants as a discussion can be difficult to facilitate with more than this, and this number allows everyone the time to speak.

Ethnography

Ethnography is a methodology used to study a group of people’s behaviors and social interactions in their environment (Reeves et al., 2008).

Data are collected using methods such as observations, field notes, or structured/unstructured interviews.

The aim of ethnography is to provide detailed, holistic insights into people’s behavior and perspectives within their natural setting. In order to achieve this, researchers immerse themselves in a community or organization. 

Due to the flexibility and real-world focus of ethnography, researchers are able to gather an in-depth, nuanced understanding of people’s experiences, knowledge and perspectives that are influenced by culture and society.

In order to develop a representative picture of a particular culture or context, researchers must conduct extensive fieldwork.

This can be time-consuming, as researchers may need to immerse themselves in a community or culture for anywhere from a few days to a few years.

Qualitative Data Analysis Methods

Different methods can be used for analyzing qualitative data. The researcher chooses based on the objectives of their study. 

The researcher plays a key role in the interpretation of data, making decisions about the coding, theming, decontextualizing, and recontextualizing of data (Starks & Trinidad, 2007). 

Grounded theory

Grounded theory is a qualitative method specifically designed to inductively generate theory from data. It was developed by Glaser and Strauss in 1967 (Glaser & Strauss, 2017).

This methodology aims to develop theories (rather than test hypotheses) that explain a social process, action, or interaction (Petty et al., 2012). To inform the developing theory, data collection and analysis run simultaneously. 

There are three key types of coding used in grounded theory: initial (open), intermediate (axial), and advanced (selective) coding. 

Throughout the analysis, memos should be created to document methodological and theoretical ideas about the data. Data should be collected and analyzed until data saturation is reached and a theory is developed. 

Content analysis

Content analysis was first used in the early twentieth century to analyze textual materials such as newspapers and political speeches.

Content analysis is a research method used to identify and analyze the presence and patterns of themes, concepts, or words in data (Vaismoradi et al., 2013). 

This research method can be used to analyze data in different formats, which can be written, oral, or visual. 

The goal of content analysis is to develop themes that capture the underlying meanings of data (Schreier, 2012). 

Qualitative content analysis can be used to validate existing theories, support the development of new models and theories, and provide in-depth descriptions of particular settings or experiences.

The following six steps provide a guideline for how to conduct qualitative content analysis; a brief illustrative code sketch follows the list.
  • Define a Research Question : To start content analysis, a clear research question should be developed.
  • Identify and Collect Data : Establish the inclusion criteria for your data. Find the relevant sources to analyze.
  • Define the Unit or Theme of Analysis : Categorize the content into themes. Themes can be a word, phrase, or sentence.
  • Develop Rules for Coding your Data : Define a set of coding rules to ensure that all data are coded consistently.
  • Code the Data : Follow the coding rules to categorize data into themes.
  • Analyze the Results and Draw Conclusions : Examine the data to identify patterns and draw conclusions in relation to your research question.
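As a rough sketch of steps 4–6, the Python example below applies a small set of hypothetical coding rules (simple keyword matching, an assumption made for illustration rather than part of the method described above) to a few invented text units and tallies how often each theme appears. In practice, coding relies on researcher judgment rather than keyword matching; the sketch only shows one way coded data and theme frequencies might be organized.

```python
from collections import Counter

# Hypothetical coding rules (step 4): theme -> keywords taken to signal it.
coding_rules = {
    "stress": ["stressed", "pressure", "overwhelmed"],
    "social support": ["friends", "family", "support"],
}

# Invented units of analysis, e.g. sentences from interview transcripts (steps 2-3).
units = [
    "I felt overwhelmed by exam pressure last term.",
    "My friends and family were a huge support.",
    "The pressure eased once I talked to friends.",
]

# Step 5: code each unit according to the coding rules.
coded_units = []
for unit in units:
    themes = {theme for theme, words in coding_rules.items()
              if any(word in unit.lower() for word in words)}
    coded_units.append((unit, themes))

# Step 6: examine the pattern of theme frequencies across the dataset.
theme_counts = Counter(theme for _, themes in coded_units for theme in themes)
print(theme_counts)
```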

Discourse analysis

Discourse analysis is a research method used to study written and spoken language in relation to its social context (Wood & Kroger, 2000).

In discourse analysis, the researcher interprets the details of the language material and the context in which it is situated.

Discourse analysis aims to understand the functions of language (how language is used in real life) and how meaning is conveyed by language in different contexts. Researchers use discourse analysis to investigate social groups and how language is used to achieve specific communication goals.

Different methods of discourse analysis can be used depending on the aims and objectives of a study. However, the following steps provide a guideline on how to conduct discourse analysis.
  • Define the Research Question : Develop a relevant research question to frame the analysis.
  • Gather Data and Establish the Context : Collect research materials (e.g., interview transcripts, documents). Gather factual details and review the literature to construct a theory about the social and historical context of your study.
  • Analyze the Content : Closely examine various components of the text, such as the vocabulary, sentences, paragraphs, and structure of the text. Identify patterns relevant to the research question to create codes, then group these into themes.
  • Review the Results : Reflect on the findings to examine the function of the language, and the meaning and context of the discourse. 

Thematic analysis

Thematic analysis is a method used to identify, interpret, and report patterns in data, such as commonalities or contrasts. 

Although the origins of thematic analysis can be traced back to the early twentieth century, the approach was most clearly articulated and popularized by Braun and Clarke (2006).

Thematic analysis aims to develop themes (patterns of meaning) across a dataset to address a research question. 

In thematic analysis, qualitative data is gathered using techniques such as interviews, focus groups, and questionnaires. Audio recordings are transcribed. The dataset is then explored and interpreted by a researcher to identify patterns. 

This occurs through the rigorous process of data familiarization, coding, theme development, and revision. These identified patterns provide a summary of the dataset and can be used to address a research question.

Themes are developed by exploring the implicit and explicit meanings within the data. Two different approaches are used to generate themes: inductive and deductive. 

An inductive approach allows themes to emerge from the data. In contrast, a deductive approach uses existing theories or knowledge to apply preconceived ideas to the data.

Phases of Thematic Analysis

Braun and Clarke (2006) provide a guide to the six phases of thematic analysis. These phases can be applied flexibly to fit research questions and data; a brief code sketch illustrating the coding and theme-development phases follows the list.

  • Phase 1. Gather and transcribe data : Gather raw data, for example interviews or focus groups, and transcribe audio recordings fully.
  • Phase 2. Familiarization with data : Read and reread all your data from beginning to end; note down initial ideas.
  • Phase 3. Create initial codes : Start identifying preliminary codes which highlight important features of the data and may be relevant to the research question.
  • Phase 4. Create new codes which encapsulate potential themes : Review initial codes and explore any similarities, differences, or contradictions to uncover underlying themes; create a map to visualize identified themes.
  • Phase 5. Take a break, then return to the data : Take a break and then return later to review themes.
  • Phase 6. Evaluate themes for good fit : Last opportunity for analysis; check themes are supported and saturated with data.
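To make phases 3 and 4 concrete, here is a minimal Python sketch, using invented codes and extracts, showing how coded extracts might be collated under candidate themes for later review. This is only bookkeeping; developing codes and themes is an interpretive task carried out by the researcher, and the code and theme names here are hypothetical.

```python
from collections import defaultdict

# Phase 3: data extracts tagged with preliminary codes (all hypothetical).
coded_extracts = [
    ("I never have time to cook properly", "time pressure"),
    ("Healthy food costs too much", "cost"),
    ("I'd eat better if meals were cheaper", "cost"),
    ("Work leaves me too tired to plan meals", "time pressure"),
]

# Phase 4: candidate themes that group related codes (hypothetical).
theme_map = {
    "time pressure": "Everyday constraints on healthy eating",
    "cost": "Everyday constraints on healthy eating",
}

# Collate extracts under each candidate theme, ready for review in phases 5-6.
themes = defaultdict(list)
for extract, code in coded_extracts:
    themes[theme_map[code]].append((code, extract))

for theme, extracts in themes.items():
    print(theme)
    for code, extract in extracts:
        print(f"  [{code}] {extract}")
```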

Template analysis

Template analysis refers to a specific method of thematic analysis which uses hierarchical coding (Brooks et al., 2014).

Template analysis is used to analyze textual data, for example, interview transcripts or open-ended responses on a written questionnaire.

To conduct template analysis, a coding template must be developed (usually from a subset of the data) and subsequently revised and refined. This template represents the themes identified by researchers as important in the dataset. 

Codes are ordered hierarchically within the template, with the highest-level codes demonstrating overarching themes in the data and lower-level codes representing constituent themes with a narrower focus.

A guideline for the main procedural steps for conducting template analysis is outlined below, followed by a brief illustrative sketch of a hierarchical coding template.
  • Familiarization with the Data : Read (and reread) the dataset in full. Engage, reflect, and take notes on data that may be relevant to the research question.
  • Preliminary Coding : Identify initial codes, guided by any a priori codes defined before the analysis as likely to be relevant to the research question.
  • Organize Themes : Organize themes into meaningful clusters. Consider the relationships between the themes both within and between clusters.
  • Produce an Initial Template : Develop an initial template. This may be based on a subset of the data.
  • Apply and Develop the Template : Apply the initial template to further data and make any necessary modifications. Refinements of the template may include adding themes, removing themes, or changing the scope/title of themes. 
  • Finalize Template : Finalize the template, then apply it to the entire dataset. 
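Because the template orders codes hierarchically, it can be sketched as a simple tree. The Python example below uses entirely hypothetical themes and codes to show the shape of an initial template, with top-level entries for overarching themes and nested entries for narrower constituent themes; a real template would be revised repeatedly as it is applied to the data.

```python
# Hypothetical initial coding template: top-level keys are overarching themes,
# nested entries are narrower constituent themes and lowest-level codes.
initial_template = {
    "Barriers to attending screening": {
        "Practical barriers": ["appointment times", "travel"],
        "Emotional barriers": ["embarrassment", "fear of results"],
    },
    "Sources of reassurance": {
        "Clinical staff": ["clear explanations"],
        "Family and friends": ["shared experiences"],
    },
}

def print_template(node, depth=0):
    """Print the hierarchical template, indenting to show each code's level."""
    if isinstance(node, dict):
        for code, children in node.items():
            print("  " * depth + code)
            print_template(children, depth + 1)
    else:  # a list of lowest-level codes
        for code in node:
            print("  " * depth + code)

print_template(initial_template)
```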

Frame analysis

Frame analysis is a comparative form of thematic analysis which systematically analyzes data using a matrix output.

Ritchie and Spencer (1994) developed this set of techniques to analyze qualitative data in applied policy research. Frame analysis aims to generate theory from data.

Frame analysis encourages researchers to organize and manage their data using summarization.

This results in a flexible and unique matrix output, in which individual participants (or cases) are represented by rows and themes are represented by columns. 

Each intersecting cell is used to summarize findings relating to the corresponding participant and theme.
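The matrix can be pictured as a small table with one row per participant and one column per theme, where each cell holds a brief summary of that participant’s account for that theme. A minimal sketch using pandas (a library choice made purely for illustration, with invented participants, themes, and summaries) is shown below.

```python
import pandas as pd

# Hypothetical framework matrix: rows are participants (cases), columns are
# themes, and each cell summarizes that participant's account for that theme.
matrix = pd.DataFrame(
    {
        "Experience of diagnosis": [
            "Shock; little time to ask questions",
            "Expected the result; felt prepared",
        ],
        "Support received": [
            "Relied mainly on partner",
            "Found nurse-led helpline most useful",
        ],
    },
    index=["Participant 1", "Participant 2"],
)

print(matrix)
```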

Frame analysis has five distinct phases which are interrelated, forming a methodical and rigorous framework.
  • Familiarization with the Data : Familiarize yourself with all the transcripts. Immerse yourself in the details of each transcript and start to note recurring themes.
  • Develop a Theoretical Framework : Identify recurrent/important themes and add them to a chart. Provide a framework/structure for the analysis.
  • Indexing : Apply the framework systematically to the entire study data.
  • Summarize Data in Analytical Framework : Reduce the data into brief summaries of participants’ accounts.
  • Mapping and Interpretation : Compare themes and subthemes and check against the original transcripts. Group the data into categories and provide an explanation for them.

Preventing Bias in Qualitative Research

To evaluate qualitative studies, the CASP (Critical Appraisal Skills Programme) checklist for qualitative studies can be used to ensure all aspects of a study have been considered (CASP, 2018).

The quality of research can be enhanced and assessed using criteria such as checklists, reflexivity, co-coding, and member-checking. 

Co-coding 

Relying on only one researcher to interpret rich and complex data may risk key insights and alternative viewpoints being missed. Therefore, coding is often performed by multiple researchers.

A common strategy must be defined at the beginning of the coding process (Busetto et al., 2020). This includes establishing a useful coding list and finding a common definition of individual codes.

Transcripts are initially coded independently by researchers and then compared and consolidated to minimize error or bias and to bring confirmation of findings. 
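One simple way to compare independently coded transcripts is to measure how often the coders assigned the same code to the same segments, for example with percent agreement or Cohen’s kappa. The Python sketch below uses invented codes and is a toy illustration only; in practice, disagreements are discussed and the coding consolidated rather than merely quantified.

```python
from collections import Counter

# Codes assigned independently by two researchers to the same ten segments (hypothetical).
coder_a = ["stress", "coping", "stress", "support", "coping",
           "stress", "support", "stress", "coping", "support"]
coder_b = ["stress", "coping", "coping", "support", "coping",
           "stress", "support", "stress", "stress", "support"]

n = len(coder_a)
matches = sum(a == b for a, b in zip(coder_a, coder_b))
p_o = matches / n  # observed (percent) agreement

# Cohen's kappa corrects agreement for chance: kappa = (p_o - p_e) / (1 - p_e).
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(coder_a) | set(coder_b))
kappa = (p_o - p_e) / (1 - p_e)

print(f"Percent agreement: {p_o:.0%}, Cohen's kappa: {kappa:.2f}")
```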

Member checking

Member checking (or respondent validation) involves checking back with participants to see if the research resonates with their experiences (Russell & Gregory, 2003).

Data can be returned to participants after data collection or when results are first available. For example, participants may be provided with their interview transcript and asked to verify whether this is a complete and accurate representation of their views.

Participants may then clarify or elaborate on their responses to ensure they align with their views (Shenton, 2004).

This feedback becomes part of data collection and ensures accurate descriptions/interpretations of phenomena (Mays & Pope, 2000).

Reflexivity in qualitative research

Reflexivity typically involves examining your own judgments, practices, and belief systems during data collection and analysis. It aims to identify any personal beliefs which may affect the research. 

Reflexivity is essential in qualitative research to ensure methodological transparency and complete reporting. This enables readers to understand how the interaction between the researcher and participant shapes the data.

Depending on the research question and population being researched, factors that need to be considered include the researcher’s experience, age, gender, and ethnicity, and how contact with participants was established and maintained.

These details are important because, in qualitative research, the researcher is a dynamic part of the research process and actively influences the outcome of the research (Boeije, 2014). 

Reflexivity Example

Who you are and your characteristics influence how you collect and analyze data. Here is an example of a reflexivity statement for research on smoking:

“I am a 30-year-old white female from a middle-class background. I live in the southwest of England and have been educated to master’s level. I have been involved in two research projects on oral health. I have never smoked, but I have witnessed how smoking can cause ill health from my volunteering in a smoking cessation clinic. My research aspirations are to help to develop interventions to help smokers quit.”

Establishing Trustworthiness in Qualitative Research

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability.

1. Credibility in Qualitative Research

Credibility refers to how accurately the results represent the reality and viewpoints of the participants.

To establish credibility in research, participants’ views and the researcher’s representation of their views need to align (Tobin & Begley, 2004).

To increase the credibility of findings, researchers may use data source triangulation, investigator triangulation, peer debriefing, or member checking (Lincoln & Guba, 1985). 

2. Transferability in Qualitative Research

Transferability refers to how generalizable the findings are: whether the findings may be applied to another context, setting, or group (Tobin & Begley, 2004).

Transferability can be enhanced by giving thorough and in-depth descriptions of the research setting, sample, and methods (Nowell et al., 2017). 

3. Dependability in Qualitative Research

Dependability is the extent to which the study could be replicated under similar conditions and the findings would be consistent.

Researchers can establish dependability using methods such as audit trails so readers can see the research process is logical and traceable (Koch, 1994).

4. Confirmability in Qualitative Research

Confirmability is concerned with establishing that there is a clear link between the researcher’s interpretations/findings and the data.

Researchers can achieve confirmability by demonstrating how conclusions and interpretations were arrived at (Nowell et al., 2017).

This enables readers to understand the reasoning behind the decisions made. 

Audit Trails in Qualitative Research

An audit trail provides evidence of the decisions made by the researcher regarding theory, research design, and data collection, as well as the steps they have chosen to manage, analyze, and report data. 

The researcher must provide a clear rationale to demonstrate how conclusions were reached in their study.

A clear description of the research path must be provided to enable readers to trace through the researcher’s logic (Halpern, 1983).

Researchers should maintain records of the raw data, field notes, transcripts, and a reflective journal in order to provide a clear audit trail. 

Strengths

Discovery of unexpected data

Open-ended questions in qualitative research mean the researcher can probe an interview topic and enable the participant to elaborate on responses in an unrestricted manner.

This allows unexpected data to emerge, which can lead to further research into that topic. 

The exploratory nature of qualitative research helps generate hypotheses that can be tested quantitatively (Busetto et al., 2020).

Flexibility

Data collection and analysis can be modified and adapted to take the research in a different direction if new ideas or patterns emerge in the data.

This enables researchers to investigate new opportunities while firmly maintaining their research goals. 

Naturalistic settings

The behaviors of participants are recorded in real-world settings. Studies that use real-world settings have high ecological validity since participants behave more authentically. 

Limitations

Time-consuming

Qualitative research results in large amounts of data which often need to be transcribed and analyzed manually.

Even when software is used, transcription can be inaccurate, and using software for analysis can result in many codes which need to be condensed into themes. 

Subjectivity 

The researcher has an integral role in collecting and interpreting qualitative data. Therefore, the conclusions reached are from their perspective and experience.

Consequently, interpretations of data from another researcher may vary greatly. 

Limited generalizability

The aim of qualitative research is to provide a detailed, contextualized understanding of an aspect of the human experience from a relatively small sample size.

Despite rigorous analysis procedures, conclusions drawn cannot be generalized to the wider population since data may be biased or unrepresentative.

Therefore, results are only applicable to a small group of the population. 

While individual qualitative studies are often limited in their generalizability due to factors such as sample size and context, metasynthesis enables researchers to synthesize findings from multiple studies, potentially leading to more generalizable conclusions.

By integrating findings from studies conducted in diverse settings and with different populations, metasynthesis can provide broader insights into the phenomenon of interest.

Extraneous variables

Qualitative research is often conducted in real-world settings. This may cause results to be unreliable since extraneous variables may affect the data, for example:

  • Situational variables : different environmental conditions may influence participants’ behavior in a study. The random variation in factors (such as noise or lighting) may be difficult to control in real-world settings.
  • Participant characteristics : this includes any characteristics that may influence how a participant answers or behaves in a study, such as a participant’s mood, gender, age, ethnicity, sexual identity, IQ, etc.
  • Experimenter effect : this refers to how a researcher’s unintentional influence can change the outcome of a study. It occurs when (i) the researcher’s interactions with participants unintentionally change their behavior, or (ii) errors are made in observation, interpretation, or analysis.

Frequently Asked Questions

What sample size should qualitative research be?

The sample size for qualitative studies has been recommended to include a minimum of 12 participants to reach data saturation (Braun & Clarke, 2013).

Are surveys qualitative or quantitative?

Surveys can be used to gather information from a sample qualitatively or quantitatively. Qualitative surveys use open-ended questions to gather detailed information from a large sample using free text responses.

The use of open-ended questions allows for unrestricted responses where participants use their own words, enabling the collection of more in-depth information than closed-ended questions.

In contrast, quantitative surveys consist of closed-ended questions with multiple-choice answer options. Quantitative surveys are ideal to gather a statistical representation of a population.

What are the ethical considerations of qualitative research?

Before conducting a study, you must think about any risks that could occur and take steps to prevent them.

  • Participant Protection : Researchers must protect participants from physical and mental harm. This means you must not embarrass, frighten, offend, or harm participants.
  • Transparency : Researchers are obligated to clearly communicate how they will collect, store, analyze, use, and share the data.
  • Confidentiality : You need to consider how to maintain the confidentiality and anonymity of participants’ data.

What is triangulation in qualitative research?

Triangulation refers to the use of several approaches in a study to comprehensively understand phenomena. This method helps to increase the validity and credibility of research findings. 

Types of triangulation include method triangulation (using multiple methods to gather data), investigator triangulation (using multiple researchers to collect or analyze data), theory triangulation (comparing several theoretical perspectives to explain a phenomenon), and data source triangulation (using data from various times, locations, and people; Carter et al., 2014).

Why is qualitative research important?

Qualitative research allows researchers to describe and explain the social world. The exploratory nature of qualitative research helps to generate hypotheses that can then be tested quantitatively.

In qualitative research, participants are able to express their thoughts, experiences, and feelings without constraint.

Additionally, researchers are able to follow up on participants’ answers in real-time, generating valuable discussion around a topic. This enables researchers to gain a nuanced understanding of phenomena which is difficult to attain using quantitative methods.

What is coding data in qualitative research?

Coding data is a qualitative data analysis strategy in which a section of text is assigned a label that describes its content.

These labels may be words or phrases which represent important (and recurring) patterns in the data.

This process enables researchers to identify related content across the dataset. Codes can then be used to group similar types of data to generate themes.

What is the difference between qualitative and quantitative research?

Qualitative research involves the collection and analysis of non-numerical data in order to understand experiences and meanings from the participant’s perspective.

This can provide rich, in-depth insights on complicated phenomena. Qualitative data may be collected using interviews, focus groups, or observations.

In contrast, quantitative research involves the collection and analysis of numerical data to measure the frequency, magnitude, or relationships of variables. This can provide objective and reliable evidence that can be generalized to the wider population.

Quantitative data may be collected using closed-ended questionnaires or experiments.

What is trustworthiness in qualitative research?

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability. 

Credibility refers to how accurately the results represent the reality and viewpoints of the participants. Transferability refers to whether the findings may be applied to another context, setting, or group.

Dependability is the extent to which the findings are consistent and reliable. Confirmability refers to the objectivity of findings (not influenced by the bias or assumptions of researchers).

What is data saturation in qualitative research?

Data saturation is a methodological principle used to guide the sample size of a qualitative research study.

Data saturation is proposed as a necessary methodological component in qualitative research (Saunders et al., 2018) as it is a vital criterion for discontinuing data collection and/or analysis. 

The intention of data saturation is to reach the point of “no new data, no new themes, no new coding, and ability to replicate the study” (Guest et al., 2006). At this point, enough data has been gathered to draw conclusions.
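As a toy illustration of saturation as a stopping rule, the Python sketch below (with invented codes) tracks how many new codes each successive interview contributes and flags the point at which several interviews in a row add nothing new. In practice, judging saturation is an interpretive decision, not a simple count; the threshold used here is purely illustrative.

```python
# Hypothetical codes identified in each successive interview.
codes_per_interview = [
    {"cost", "time", "stigma"},
    {"time", "family support"},
    {"stigma", "cost"},
    {"family support", "time"},
    {"cost", "time"},
]

STOP_AFTER = 2  # illustrative rule: stop once this many interviews in a row add no new codes
seen, run_without_new = set(), 0

for i, codes in enumerate(codes_per_interview, start=1):
    new_codes = codes - seen
    seen |= codes
    run_without_new = 0 if new_codes else run_without_new + 1
    print(f"Interview {i}: {len(new_codes)} new code(s)")
    if run_without_new >= STOP_AFTER:
        print(f"No new codes for {STOP_AFTER} consecutive interviews - possible saturation.")
        break
```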

Why is sampling in qualitative research important?

In quantitative research, large sample sizes are used to provide statistically significant quantitative estimates.

This is because quantitative research aims to provide generalizable conclusions that represent populations.

However, the aim of sampling in qualitative research is to gather data that will help the researcher understand the depth, complexity, variation, or context of a phenomenon. The small sample sizes in qualitative studies support the depth of case-oriented analysis.

What is narrative analysis?

Narrative analysis is a qualitative research method used to understand how individuals create stories from their personal experiences.

There is an emphasis on understanding the context in which a narrative is constructed, recognizing the influence of historical, cultural, and social factors on storytelling.

Researchers can use different methods together to explore a research question.

Some narrative researchers focus on the content of what is said, using thematic narrative analysis, while others focus on the structure, such as holistic-form or categorical-form structural narrative analysis. Others focus on how the narrative is produced and performed.

Boeije, H. (2014). Analysis in qualitative research. Sage.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. Sage.

Brooks, J., McCluskey, S., Turley, E., & King, N. (2014). The utility of template analysis in qualitative psychology research. Qualitative Research in Psychology , 12 (2), 202–222. https://doi.org/10.1080/14780887.2014.955224

Busetto, L., Wick, W., & Gumbinger, C. (2020). How to use and assess qualitative research methods. Neurological Research and Practice, 2(1), 14. https://doi.org/10.1186/s42466-020-00059-z

Carter, N., Bryant-Lukosius, D., DiCenso, A., Blythe, J., & Neville, A. J. (2014). The use of triangulation in qualitative research. Oncology nursing forum , 41 (5), 545–547. https://doi.org/10.1188/14.ONF.545-547

Critical Appraisal Skills Programme. (2018). CASP checklist: 10 questions to help you make sense of qualitative research. https://casp-uk.net/images/checklist/documents/CASP-Qualitative-Studies-Checklist/CASP-Qualitative-Checklist-2018_fillable_form.pdf (Accessed March 15, 2023).

Denny, E., & Weckesser, A. (2022). How to do qualitative research?: Qualitative research methods. BJOG : an international journal of obstetrics and gynaecology , 129 (7), 1166-1167. https://doi.org/10.1111/1471-0528.17150 

Glaser, B. G., & Strauss, A. L. (2017). The discovery of grounded theory. The Discovery of Grounded Theory , 1–18. https://doi.org/10.4324/9780203793206-1

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18 (1), 59-82. doi:10.1177/1525822X05279903

Halpern, E. S. (1983). Auditing naturalistic inquiries: The development and application of a model (Unpublished doctoral dissertation). Indiana University, Bloomington.

Hammarberg, K., Kirkman, M., & de Lacey, S. (2016). Qualitative research methods: When to use them and how to judge them. Human Reproduction , 31 (3), 498–501. https://doi.org/10.1093/humrep/dev334

Koch, T. (1994). Establishing rigour in qualitative research: The decision trail. Journal of Advanced Nursing, 19, 976–986. doi:10.1111/ j.1365-2648.1994.tb01177.x

Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. BMJ, 320(7226), 50–52.

Minichiello, V. (1990). In-Depth Interviewing: Researching People. Longman Cheshire.

Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic Analysis: Striving to Meet the Trustworthiness Criteria. International Journal of Qualitative Methods, 16 (1). https://doi.org/10.1177/1609406917733847

Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? part 2: Introducing qualitative research methodologies and methods. Manual Therapy , 17 (5), 378–384. https://doi.org/10.1016/j.math.2012.03.004

Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative approaches. London: Sage

Reeves, S., Kuper, A., & Hodges, B. D. (2008). Qualitative research methodologies: Ethnography. BMJ , 337 (aug07 3). https://doi.org/10.1136/bmj.a1020

Ritchie, J., & Spencer, L. (1994). Qualitative data analysis for applied policy research. In A. Bryman & R. G. Burgess (Eds.), Analyzing qualitative data (pp. 173–194). Routledge.

Russell, C. K., & Gregory, D. M. (2003). Evaluation of qualitative research studies. Evidence Based Nursing, 6(2), 36–40.

Saunders, B., Sim, J., Kingstone, T., Baker, S., Waterfield, J., Bartlam, B., Burroughs, H., & Jinks, C. (2018). Saturation in qualitative research: exploring its conceptualization and operationalization. Quality & quantity , 52 (4), 1893–1907. https://doi.org/10.1007/s11135-017-0574-8

Scarduzio, J. A. (2017). Emic approach to qualitative research. The International Encyclopedia of Communication Research Methods, 1–2 . https://doi.org/10.1002/9781118901731.iecrm0082

Schreier, M. (2012). Qualitative content analysis in practice. Sage.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22 , 63–75.

Starks, H., & Trinidad, S. B. (2007). Choose your method: a comparison of phenomenology, discourse analysis, and grounded theory. Qualitative health research , 17 (10), 1372–1380. https://doi.org/10.1177/1049732307307031

Tenny, S., Brannan, J. M., & Brannan, G. D. (2022). Qualitative Study. In StatPearls. StatPearls Publishing.

Tobin, G. A., & Begley, C. M. (2004). Methodological rigour within a qualitative framework. Journal of Advanced Nursing, 48, 388–396. doi:10.1111/j.1365-2648.2004.03207.x

Vaismoradi, M., Turunen, H., & Bondas, T. (2013). Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & health sciences , 15 (3), 398-405. https://doi.org/10.1111/nhs.12048

Wood L. A., Kroger R. O. (2000). Doing discourse analysis: Methods for studying action in talk and text. Sage.

Yilmaz, K. (2013). Comparison of Quantitative and Qualitative Research Traditions: epistemological, theoretical, and methodological differences. European journal of education , 48 (2), 311-325. https://doi.org/10.1111/ejed.12014



What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari. Revised on September 5, 2024.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

Qualitative research question examples include:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?


Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography, action research, phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Qualitative research approaches include:

  • Grounded theory: Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography: Researchers immerse themselves in groups or organizations to understand their cultures.
  • Action research: Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research: Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research: Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect, observer bias, recall bias, and social desirability bias. While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Each of the research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews: personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys: distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, in a qualitative study of one company’s culture, you might:

  • Take field notes with observations and reflect on your own experiences of the company culture.
  • Distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • Conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps (a brief illustrative code sketch follows the list):

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
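For the coding steps, tagging each participant’s responses with codes “in a spreadsheet” can be pictured as adding a codes column to each row. The Python sketch below writes coded open-ended survey responses to a CSV file; the codebook, keyword matching, responses, and file name are all hypothetical, and in practice codes are applied by reading each response, by hand or with qualitative analysis software.

```python
import csv

# Hypothetical codebook and open-ended survey responses.
codebook = {"flexibility": ["remote", "hours"], "workload": ["deadlines", "overtime"]}
responses = [
    ("P01", "Remote working and flexible hours keep me here."),
    ("P02", "Constant deadlines and unpaid overtime wear me down."),
]

# Assign codes to each response and save a spreadsheet-style file of coded data.
with open("coded_responses.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant", "response", "codes"])
    for pid, text in responses:
        codes = [code for code, words in codebook.items()
                 if any(word in text.lower() for word in words)]
        writer.writerow([pid, text, "; ".join(codes)])
```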

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Common approaches to qualitative data analysis include:

  • Content analysis: used to describe and categorize common words, phrases, and ideas in qualitative data. For example, a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis: used to identify and interpret patterns and themes in qualitative data. For example, a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis: used to examine the content, structure, and design of texts. For example, a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis: used to study communication and how language is used to achieve effects in specific contexts. For example, a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated. The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population.

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.


Frequently asked questions about qualitative research

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses. Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research:

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis, but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis, thematic analysis, and discourse analysis.


Qualitative Research – Methods, Analysis Types and Guide


Qualitative Research

Qualitative research is a type of research methodology that focuses on exploring and understanding people’s beliefs, attitudes, behaviors, and experiences through the collection and analysis of non-numerical data. It seeks to answer research questions through the examination of subjective data, such as interviews, focus groups, observations, and textual analysis.

Qualitative research aims to uncover the meaning and significance of social phenomena, and it typically involves a more flexible and iterative approach to data collection and analysis compared to quantitative research. Qualitative research is often used in fields such as sociology, anthropology, psychology, and education.

Qualitative Research Methods


Qualitative Research Methods are as follows:

One-to-One Interview

This method involves conducting an interview with a single participant to gain a detailed understanding of their experiences, attitudes, and beliefs. One-to-one interviews can be conducted in-person, over the phone, or through video conferencing. The interviewer typically uses open-ended questions to encourage the participant to share their thoughts and feelings. One-to-one interviews are useful for gaining detailed insights into individual experiences.

Focus Groups

This method involves bringing together a group of people to discuss a specific topic in a structured setting. The focus group is led by a moderator who guides the discussion and encourages participants to share their thoughts and opinions. Focus groups are useful for generating ideas and insights, exploring social norms and attitudes, and understanding group dynamics.

Ethnographic Studies

This method involves immersing oneself in a culture or community to gain a deep understanding of its norms, beliefs, and practices. Ethnographic studies typically involve long-term fieldwork and observation, as well as interviews and document analysis. Ethnographic studies are useful for understanding the cultural context of social phenomena and for gaining a holistic understanding of complex social processes.

Text Analysis

This method involves analyzing written or spoken language to identify patterns and themes. Text analysis can be quantitative or qualitative. Qualitative text analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Text analysis is useful for understanding media messages, public discourse, and cultural trends.

Case Study

This method involves an in-depth examination of a single person, group, or event to gain an understanding of complex phenomena. Case studies typically involve a combination of data collection methods, such as interviews, observations, and document analysis, to provide a comprehensive understanding of the case. Case studies are useful for exploring unique or rare cases, and for generating hypotheses for further research.

Process of Observation

This method involves systematically observing and recording behaviors and interactions in natural settings. The observer may take notes, use audio or video recordings, or use other methods to document what they see. Process of observation is useful for understanding social interactions, cultural practices, and the context in which behaviors occur.

Record Keeping

This method involves keeping detailed records of observations, interviews, and other data collected during the research process. Record keeping is essential for ensuring the accuracy and reliability of the data, and for providing a basis for analysis and interpretation.

Surveys

This method involves collecting data from a large sample of participants through a structured questionnaire. Surveys can be conducted in person, over the phone, through mail, or online. Surveys are useful for collecting data on attitudes, beliefs, and behaviors, and for identifying patterns and trends in a population.

Qualitative Data Analysis

Qualitative data analysis is a process of turning unstructured data into meaningful insights. It involves extracting and organizing information from sources like interviews, focus groups, and surveys. The goal is to understand people’s attitudes, behaviors, and motivations.

Qualitative Research Analysis Methods

Qualitative Research analysis methods involve a systematic approach to interpreting and making sense of the data collected in qualitative research. Here are some common qualitative data analysis methods:

Thematic Analysis

This method involves identifying patterns or themes in the data that are relevant to the research question. The researcher reviews the data, identifies keywords or phrases, and groups them into categories or themes. Thematic analysis is useful for identifying patterns across multiple data sources and for generating new insights into the research topic.

Content Analysis

This method involves analyzing the content of written or spoken language to identify key themes or concepts. Content analysis can be quantitative or qualitative. Qualitative content analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Content analysis is useful for identifying patterns in media messages, public discourse, and cultural trends.

Discourse Analysis

This method involves analyzing language to understand how it constructs meaning and shapes social interactions. Discourse analysis can involve a variety of methods, such as conversation analysis, critical discourse analysis, and narrative analysis. Discourse analysis is useful for understanding how language shapes social interactions, cultural norms, and power relationships.

Grounded Theory Analysis

This method involves developing a theory or explanation based on the data collected. Grounded theory analysis starts with the data and uses an iterative process of coding and analysis to identify patterns and themes in the data. The theory or explanation that emerges is grounded in the data, rather than preconceived hypotheses. Grounded theory analysis is useful for understanding complex social phenomena and for generating new theoretical insights.

Narrative Analysis

This method involves analyzing the stories or narratives that participants share to gain insights into their experiences, attitudes, and beliefs. Narrative analysis can involve a variety of methods, such as structural analysis, thematic analysis, and discourse analysis. Narrative analysis is useful for understanding how individuals construct their identities, make sense of their experiences, and communicate their values and beliefs.

Phenomenological Analysis

This method involves analyzing how individuals make sense of their experiences and the meanings they attach to them. Phenomenological analysis typically involves in-depth interviews with participants to explore their experiences in detail. Phenomenological analysis is useful for understanding subjective experiences and for developing a rich understanding of human consciousness.

Comparative Analysis

This method involves comparing and contrasting data across different cases or groups to identify similarities and differences. Comparative analysis can be used to identify patterns or themes that are common across multiple cases, as well as to identify unique or distinctive features of individual cases. Comparative analysis is useful for understanding how social phenomena vary across different contexts and groups.

Applications of Qualitative Research

Qualitative research has many applications across different fields and industries. Here are some examples of how qualitative research is used:

  • Market Research: Qualitative research is often used in market research to understand consumer attitudes, behaviors, and preferences. Researchers conduct focus groups and one-on-one interviews with consumers to gather insights into their experiences and perceptions of products and services.
  • Health Care: Qualitative research is used in health care to explore patient experiences and perspectives on health and illness. Researchers conduct in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: Qualitative research is used in education to understand student experiences and to develop effective teaching strategies. Researchers conduct classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: Qualitative research is used in social work to explore social problems and to develop interventions to address them. Researchers conduct in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: Qualitative research is used in anthropology to understand different cultures and societies. Researchers conduct ethnographic studies and observe and interview members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: Qualitative research is used in psychology to understand human behavior and mental processes. Researchers conduct in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: Qualitative research is used in public policy to explore public attitudes and to inform policy decisions. Researchers conduct focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

How to Conduct Qualitative Research

Here are some general steps for conducting qualitative research:

  • Identify your research question: Qualitative research starts with a research question or set of questions that you want to explore. This question should be focused and specific, but also broad enough to allow for exploration and discovery.
  • Select your research design: There are different types of qualitative research designs, including ethnography, case study, grounded theory, and phenomenology. You should select a design that aligns with your research question and that will allow you to gather the data you need to answer your research question.
  • Recruit participants: Once you have your research question and design, you need to recruit participants. The number of participants you need will depend on your research design and the scope of your research. You can recruit participants through advertisements, social media, or through personal networks.
  • Collect data: There are different methods for collecting qualitative data, including interviews, focus groups, observation, and document analysis. You should select the method or methods that align with your research design and that will allow you to gather the data you need to answer your research question.
  • Analyze data: Once you have collected your data, you need to analyze it. This involves reviewing your data, identifying patterns and themes, and developing codes to organize your data. You can use different software programs to help you analyze your data, or you can do it manually (a short coding sketch follows this list).
  • Interpret data: Once you have analyzed your data, you need to interpret it. This involves making sense of the patterns and themes you have identified, and developing insights and conclusions that answer your research question. You should be guided by your research question and use your data to support your conclusions.
  • Communicate results: Once you have interpreted your data, you need to communicate your results. This can be done through academic papers, presentations, or reports. You should be clear and concise in your communication, and use examples and quotes from your data to support your findings.
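
For the data-management side of these steps, a short script can help keep transcripts, candidate codes, and supporting excerpts organized for later manual review. The sketch below is one possible approach using only the Python standard library; the folder name, codebook, and keyword lists are hypothetical placeholders, and keyword matching only flags passages for the researcher to read and code properly.

```python
import csv
from pathlib import Path

# Hypothetical folder of interview transcripts, one plain-text file per participant.
transcript_dir = Path("transcripts")

# A simple codebook: code name -> keywords that prompt a closer look.
# Matches are only candidates; the researcher still reads and codes each passage.
codebook = {
    "barriers_to_attendance": ["transport", "cost", "distance"],
    "family_influence": ["parents", "mother", "father", "family"],
}

rows = []
for path in sorted(transcript_dir.glob("*.txt")):
    text = path.read_text(encoding="utf-8")
    for paragraph in text.split("\n\n"):
        lowered = paragraph.lower()
        for code, keywords in codebook.items():
            if any(word in lowered for word in keywords):
                rows.append(
                    {"participant": path.stem, "code": code, "excerpt": paragraph.strip()}
                )

# Write candidate coded excerpts to a spreadsheet-friendly file for review.
with open("candidate_codes.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["participant", "code", "excerpt"])
    writer.writeheader()
    writer.writerows(rows)

print(f"Wrote {len(rows)} candidate excerpts for manual review.")
```

Dedicated qualitative analysis software supports the same workflow at a larger scale; the point of the sketch is only that code assignments should stay transparent and easy to revisit.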

Examples of Qualitative Research

Here are some real-world examples of qualitative research:

  • Customer Feedback: A company may conduct qualitative research to understand the feedback and experiences of its customers. This may involve conducting focus groups or one-on-one interviews with customers to gather insights into their attitudes, behaviors, and preferences.
  • Healthcare: A healthcare provider may conduct qualitative research to explore patient experiences and perspectives on health and illness. This may involve conducting in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: An educational institution may conduct qualitative research to understand student experiences and to develop effective teaching strategies. This may involve conducting classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: A social worker may conduct qualitative research to explore social problems and to develop interventions to address them. This may involve conducting in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: An anthropologist may conduct qualitative research to understand different cultures and societies. This may involve conducting ethnographic studies and observing and interviewing members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: A psychologist may conduct qualitative research to understand human behavior and mental processes. This may involve conducting in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: A government agency or non-profit organization may conduct qualitative research to explore public attitudes and to inform policy decisions. This may involve conducting focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

Purpose of Qualitative Research

The purpose of qualitative research is to explore and understand the subjective experiences, behaviors, and perspectives of individuals or groups in a particular context. Unlike quantitative research, which focuses on numerical data and statistical analysis, qualitative research aims to provide in-depth, descriptive information that can help researchers develop insights and theories about complex social phenomena.

Qualitative research can serve multiple purposes, including:

  • Exploring new or emerging phenomena: Qualitative research can be useful for exploring new or emerging phenomena, such as new technologies or social trends. This type of research can help researchers develop a deeper understanding of these phenomena and identify potential areas for further study.
  • Understanding complex social phenomena: Qualitative research can be useful for exploring complex social phenomena, such as cultural beliefs, social norms, or political processes. This type of research can help researchers develop a more nuanced understanding of these phenomena and identify factors that may influence them.
  • Generating new theories or hypotheses: Qualitative research can be useful for generating new theories or hypotheses about social phenomena. By gathering rich, detailed data about individuals’ experiences and perspectives, researchers can develop insights that may challenge existing theories or lead to new lines of inquiry.
  • Providing context for quantitative data: Qualitative research can be useful for providing context for quantitative data. By gathering qualitative data alongside quantitative data, researchers can develop a more complete understanding of complex social phenomena and identify potential explanations for quantitative findings.

When to use Qualitative Research

Here are some situations where qualitative research may be appropriate:

  • Exploring a new area: If little is known about a particular topic, qualitative research can help to identify key issues, generate hypotheses, and develop new theories.
  • Understanding complex phenomena: Qualitative research can be used to investigate complex social, cultural, or organizational phenomena that are difficult to measure quantitatively.
  • Investigating subjective experiences: Qualitative research is particularly useful for investigating the subjective experiences of individuals or groups, such as their attitudes, beliefs, values, or emotions.
  • Conducting formative research: Qualitative research can be used in the early stages of a research project to develop research questions, identify potential research participants, and refine research methods.
  • Evaluating interventions or programs: Qualitative research can be used to evaluate the effectiveness of interventions or programs by collecting data on participants’ experiences, attitudes, and behaviors.

Characteristics of Qualitative Research

Qualitative research is characterized by several key features, including:

  • Focus on subjective experience: Qualitative research is concerned with understanding the subjective experiences, beliefs, and perspectives of individuals or groups in a particular context. Researchers aim to explore the meanings that people attach to their experiences and to understand the social and cultural factors that shape these meanings.
  • Use of open-ended questions: Qualitative research relies on open-ended questions that allow participants to provide detailed, in-depth responses. Researchers seek to elicit rich, descriptive data that can provide insights into participants’ experiences and perspectives.
  • Sampling based on purpose and diversity: Qualitative research often involves purposive sampling, in which participants are selected based on specific criteria related to the research question. Researchers may also seek to include participants with diverse experiences and perspectives to capture a range of viewpoints.
  • Data collection through multiple methods: Qualitative research typically involves the use of multiple data collection methods, such as in-depth interviews, focus groups, and observation. This allows researchers to gather rich, detailed data from multiple sources, which can provide a more complete picture of participants’ experiences and perspectives.
  • Inductive data analysis: Qualitative research relies on inductive data analysis, in which researchers develop theories and insights based on the data rather than testing pre-existing hypotheses. Researchers use coding and thematic analysis to identify patterns and themes in the data and to develop theories and explanations based on these patterns.
  • Emphasis on researcher reflexivity: Qualitative research recognizes the importance of the researcher’s role in shaping the research process and outcomes. Researchers are encouraged to reflect on their own biases and assumptions and to be transparent about their role in the research process.

Advantages of Qualitative Research

Qualitative research offers several advantages over other research methods, including:

  • Depth and detail: Qualitative research allows researchers to gather rich, detailed data that provides a deeper understanding of complex social phenomena. Through in-depth interviews, focus groups, and observation, researchers can gather detailed information about participants’ experiences and perspectives that may be missed by other research methods.
  • Flexibility: Qualitative research is a flexible approach that allows researchers to adapt their methods to the research question and context. Researchers can adjust their research methods in real-time to gather more information or explore unexpected findings.
  • Contextual understanding: Qualitative research is well-suited to exploring the social and cultural context in which individuals or groups are situated. Researchers can gather information about cultural norms, social structures, and historical events that may influence participants’ experiences and perspectives.
  • Participant perspective: Qualitative research prioritizes the perspective of participants, allowing researchers to explore subjective experiences and understand the meanings that participants attach to their experiences.
  • Theory development: Qualitative research can contribute to the development of new theories and insights about complex social phenomena. By gathering rich, detailed data and using inductive data analysis, researchers can develop new theories and explanations that may challenge existing understandings.
  • Validity: Qualitative research can offer high validity by using multiple data collection methods, purposive and diverse sampling, and researcher reflexivity. This can help ensure that findings are credible and trustworthy.

Limitations of Qualitative Research

Qualitative research also has some limitations, including:

  • Subjectivity: Qualitative research relies on the subjective interpretation of researchers, which can introduce bias into the research process. The researcher’s perspective, beliefs, and experiences can influence the way data is collected, analyzed, and interpreted.
  • Limited generalizability: Qualitative research typically involves small, purposive samples that may not be representative of larger populations. This limits the generalizability of findings to other contexts or populations.
  • Time-consuming: Qualitative research can be a time-consuming process, requiring significant resources for data collection, analysis, and interpretation.
  • Resource-intensive: Qualitative research may require more resources than other research methods, including specialized training for researchers, specialized software for data analysis, and transcription services.
  • Limited reliability: Qualitative research may be less reliable than quantitative research, as it relies on the subjective interpretation of researchers. This can make it difficult to replicate findings or compare results across different studies.
  • Ethics and confidentiality: Qualitative research involves collecting sensitive information from participants, which raises ethical concerns about confidentiality and informed consent. Researchers must take care to protect the privacy and confidentiality of participants and obtain informed consent.


Qualitative Methods

The word qualitative implies an emphasis on the qualities of entities and on processes and meanings that are not experimentally examined or measured [if measured at all] in terms of quantity, amount, intensity, or frequency. Qualitative researchers stress the socially constructed nature of reality, the intimate relationship between the researcher and what is studied, and the situational constraints that shape inquiry. Such researchers emphasize the value-laden nature of inquiry. They seek answers to questions that stress how social experience is created and given meaning. In contrast, quantitative studies emphasize the measurement and analysis of causal relationships between variables, not processes. Qualitative forms of inquiry are considered by many social and behavioral scientists to be as much a perspective on how to approach investigating a research problem as it is a method.

Denzin, Norman K. and Yvonna S. Lincoln. “Introduction: The Discipline and Practice of Qualitative Research.” In The Sage Handbook of Qualitative Research. Norman K. Denzin and Yvonna S. Lincoln, eds. 3rd edition. (Thousand Oaks, CA: Sage, 2005), p. 10.

Characteristics of Qualitative Research

Below are the three key elements that define a qualitative research study and the applied forms each take in the investigation of a research problem.

  • Naturalistic -- refers to studying real-world situations as they unfold naturally; non-manipulative and non-controlling; the researcher is open to whatever emerges [i.e., there is a lack of predetermined constraints on findings].
  • Emergent -- acceptance of adapting inquiry as understanding deepens and/or situations change; the researcher avoids rigid designs that eliminate responding to opportunities to pursue new paths of discovery as they emerge.
  • Purposeful -- cases for study [e.g., people, organizations, communities, cultures, events, critical incidences] are selected because they are “information rich” and illuminative. That is, they offer useful manifestations of the phenomenon of interest; sampling is aimed at insight about the phenomenon, not empirical generalization derived from a sample and applied to a population.

The Collection of Data

  • Data -- observations yield a detailed, "thick description" [in-depth understanding]; interviews capture direct quotations about people’s personal perspectives and lived experiences; often derived from carefully conducted case studies and review of material culture.
  • Personal experience and engagement -- researcher has direct contact with and gets close to the people, situation, and phenomenon under investigation; the researcher’s personal experiences and insights are an important part of the inquiry and critical to understanding the phenomenon.
  • Empathic neutrality -- an empathic stance in working with study respondents seeks vicarious understanding without judgment [neutrality] by showing openness, sensitivity, respect, awareness, and responsiveness; in observation, it means being fully present [mindfulness].
  • Dynamic systems -- there is attention to process; assumes change is ongoing, whether the focus is on an individual, an organization, a community, or an entire culture, therefore, the researcher is mindful of and attentive to system and situational dynamics.

The Analysis

  • Unique case orientation -- assumes that each case is special and unique; the first level of analysis is being true to, respecting, and capturing the details of the individual cases being studied; cross-case analysis follows from and depends upon the quality of individual case studies.
  • Inductive analysis -- immersion in the details and specifics of the data to discover important patterns, themes, and inter-relationships; begins by exploring, then confirming findings, guided by analytical principles rather than rules.
  • Holistic perspective -- the whole phenomenon under study is understood as a complex system that is more than the sum of its parts; the focus is on complex interdependencies and system dynamics that cannot be reduced in any meaningful way to linear, cause and effect relationships and/or a few discrete variables.
  • Context sensitive -- places findings in a social, historical, and temporal context; researcher is careful about [even dubious of] the possibility or meaningfulness of generalizations across time and space; emphasizes careful comparative case study analysis and extrapolating patterns for possible transferability and adaptation in new settings.
  • Voice, perspective, and reflexivity -- the qualitative methodologist owns and is reflective about her or his own voice and perspective; a credible voice conveys authenticity and trustworthiness; complete objectivity being impossible and pure subjectivity undermining credibility, the researcher's focus reflects a balance between understanding and depicting the world authentically in all its complexity and of being self-analytical, politically aware, and reflexive in consciousness.

Berg, Bruce Lawrence. Qualitative Research Methods for the Social Sciences. 8th edition. Boston, MA: Allyn and Bacon, 2012; Denzin, Norman K. and Yvonna S. Lincoln. Handbook of Qualitative Research. 2nd edition. Thousand Oaks, CA: Sage, 2000; Marshall, Catherine and Gretchen B. Rossman. Designing Qualitative Research. 2nd edition. Thousand Oaks, CA: Sage Publications, 1995; Merriam, Sharan B. Qualitative Research: A Guide to Design and Implementation. San Francisco, CA: Jossey-Bass, 2009.

Basic Research Design for Qualitative Studies

Unlike positivist or experimental research that utilizes a linear and one-directional sequence of design steps, there is considerable variation in how a qualitative research study is organized. In general, qualitative researchers attempt to describe and interpret human behavior based primarily on the words of selected individuals [a.k.a., “informants” or “respondents”] and/or through the interpretation of their material culture or occupied space. There is a reflexive process underpinning every stage of a qualitative study to ensure that researcher biases, presuppositions, and interpretations are clearly evident, thus ensuring that the reader is better able to interpret the overall validity of the research. According to Maxwell (2009), there are five, not necessarily ordered or sequential, components in qualitative research designs. How they are presented depends upon the research philosophy and theoretical framework of the study, the methods chosen, and the general assumptions underpinning the study.

  • Goals -- Describe the central research problem being addressed, but avoid describing any anticipated outcomes. Questions to ask yourself are: Why is your study worth doing? What issues do you want to clarify, and what practices and policies do you want it to influence? Why do you want to conduct this study, and why should the reader care about the results?
  • Conceptual Framework -- Questions to ask yourself are: What do you think is going on with the issues, settings, or people you plan to study? What theories, beliefs, and prior research findings will guide or inform your research, and what literature, preliminary studies, and personal experiences will you draw upon for understanding the people or issues you are studying? In your review of the literature, note not only the results of other studies but also the methods used. If appropriate, describe why earlier studies using quantitative methods were inadequate in addressing the research problem.
  • Research Questions -- Usually there is a research problem that frames your qualitative study and that influences your decision about what methods to use, but qualitative designs generally lack an accompanying hypothesis or set of assumptions because the findings are emergent and unpredictable. In this context, more specific research questions are generally the result of an interactive design process rather than the starting point for that process. Questions to ask yourself are: What do you specifically want to learn or understand by conducting this study? What do you not know about the things you are studying that you want to learn? What questions will your research attempt to answer, and how are these questions related to one another?
  • Methods -- Structured approaches to applying a method or methods to your study help to ensure that there is comparability of data across sources and researchers and, thus, they can be useful in answering questions that deal with differences between phenomena and the explanation for these differences [variance questions]. An unstructured approach allows the researcher to focus on the particular phenomena studied. This facilitates an understanding of the processes that led to specific outcomes, trading generalizability and comparability for internal validity and contextual and evaluative understanding. Questions to ask yourself are: What will you actually do in conducting this study? What approaches and techniques will you use to collect and analyze your data, and how do these constitute an integrated strategy?
  • Validity -- In contrast to quantitative studies, where the goal is to design, in advance, “controls” such as formal comparisons, sampling strategies, or statistical manipulations to address anticipated and unanticipated threats to validity, qualitative researchers must attempt to rule out most threats to validity after the research has begun by relying on evidence collected during the research process itself in order to effectively argue that any alternative explanations for a phenomenon are implausible. Questions to ask yourself are: How might your results and conclusions be wrong? What are the plausible alternative interpretations and validity threats to these, and how will you deal with them? How can the data that you have, or that you could potentially collect, support or challenge your ideas about what’s going on? Why should we believe your results?
  • Conclusion -- Although Maxwell does not mention a conclusion as one of the components of a qualitative research design, you should formally conclude your study. Briefly reiterate the goals of your study and the ways in which your research addressed them. Discuss the benefits of your study and how stakeholders can use your results. Also, note the limitations of your study and, if appropriate, place them in the context of areas in need of further research.

Chenail, Ronald J. Introduction to Qualitative Research Design. Nova Southeastern University; Heath, A. W. The Proposal in Qualitative Research. The Qualitative Report 3 (March 1997); Marshall, Catherine and Gretchen B. Rossman. Designing Qualitative Research. 3rd edition. Thousand Oaks, CA: Sage, 1999; Maxwell, Joseph A. "Designing a Qualitative Study." In The SAGE Handbook of Applied Social Research Methods. Leonard Bickman and Debra J. Rog, eds. 2nd edition. (Thousand Oaks, CA: Sage, 2009), pp. 214-253; Qualitative Research Methods. Writing@CSU. Colorado State University; Yin, Robert K. Qualitative Research from Start to Finish. 2nd edition. New York: Guilford, 2015.

Strengths of Using Qualitative Methods

The advantage of using qualitative methods is that they generate rich, detailed data that leave the participants' perspectives intact and provide multiple contexts for understanding the phenomenon under study. In this way, qualitative research can be used to vividly demonstrate phenomena or to conduct cross-case comparisons and analysis of individuals or groups.

Among the specific strengths of using qualitative methods to study social science research problems is the ability to:

  • Obtain a more realistic view of the lived world that cannot be understood or experienced in numerical data and statistical analysis;
  • Provide the researcher with the perspective of the participants of the study through immersion in a culture or situation and as a result of direct interaction with them;
  • Allow the researcher to describe existing phenomena and current situations;
  • Develop flexible ways to perform data collection, subsequent analysis, and interpretation of collected information;
  • Yield results that can be helpful in pioneering new ways of understanding;
  • Respond to changes that occur while conducting the study [e.g., extended fieldwork or observation] and offer the flexibility to shift the focus of the research as a result;
  • Provide a holistic view of the phenomena under investigation;
  • Respond to local situations, conditions, and needs of participants;
  • Interact with the research subjects in their own language and on their own terms; and,
  • Create a descriptive capability based on primary and unstructured data.

Anderson, Claire. “Presenting and Evaluating Qualitative Research.” American Journal of Pharmaceutical Education 74 (2010): 1-7; Denzin, Norman K. and Yvonna S. Lincoln. Handbook of Qualitative Research. 2nd edition. Thousand Oaks, CA: Sage, 2000; Merriam, Sharan B. Qualitative Research: A Guide to Design and Implementation. San Francisco, CA: Jossey-Bass, 2009.

Limitations of Using Qualitative Methods

Most of the limitations of qualitative research techniques mirror their inherent strengths. For example, small sample sizes help you investigate research problems in a comprehensive and in-depth manner, but they also undermine opportunities to draw useful generalizations from, or to make broad policy recommendations based upon, the findings. Similarly, because qualitative researchers are the primary instrument of investigation, they are often embedded in the cultures and experiences of others; that embeddedness increases the opportunity for bias generated from conscious or unconscious assumptions about the study setting to enter into how data is gathered, interpreted, and reported.

Some specific limitations associated with using qualitative methods to study research problems in the social sciences include the following:

  • Drifting away from the original objectives of the study in response to the changing nature of the context under which the research is conducted;
  • Arriving at different conclusions based on the same information depending on the personal characteristics of the researcher;
  • Replication of a study is very difficult;
  • Research using human subjects increases the chance of ethical dilemmas that undermine the overall validity of the study;
  • An inability to investigate causality between different research phenomena;
  • Difficulty in explaining differences in the quality and quantity of information obtained from different respondents, which can lead to inconsistent conclusions;
  • Data gathering and analysis is often time consuming and/or expensive;
  • Requires a high level of experience from the researcher to obtain the targeted information from the respondent;
  • May lack consistency and reliability because the researcher can employ different probing techniques and the respondent can choose to tell some particular stories and ignore others; and,
  • Generation of a significant amount of data that cannot be randomized into manageable parts for analysis.

Research Tip

Human Subject Research and Institutional Review Board Approval

Almost every socio-behavioral study requires you to submit your proposed research plan to an Institutional Review Board. The role of the Board is to evaluate your research proposal and determine whether it will be conducted ethically and in accordance with the regulations, institutional policies, and Code of Ethics set forth by the university. The purpose of the review is to protect the rights and welfare of individuals participating in your study. The review is intended to ensure equitable selection of respondents, that you have met the requirements for obtaining informed consent, that there is clear assessment and minimization of risks to participants and to the university [read: no lawsuits!], and that privacy and confidentiality are maintained throughout the research process and beyond. Go to the USC IRB website for detailed information and templates of forms you need to submit before you can proceed. If you are unsure whether your study is subject to IRB review, consult with your professor or academic advisor.

Chenail, Ronald J. Introduction to Qualitative Research Design. Nova Southeastern University; Labaree, Robert V. "Working Successfully with Your Institutional Review Board: Practical Advice for Academic Librarians." College and Research Libraries News 71 (April 2010): 190-193.

Another Research Tip

Finding Examples of How to Apply Different Types of Research Methods

SAGE Publications is a major publisher of studies about how to design and conduct research in the social and behavioral sciences. Their SAGE Research Methods Online and Cases database includes content from books, articles, encyclopedias, handbooks, and videos covering social science research design and methods, including the complete Little Green Book Series of Quantitative Applications in the Social Sciences and the Little Blue Book Series of Qualitative Research techniques. The database also includes case studies outlining the research methods used in real research projects. This is an excellent source for finding definitions of key terms and descriptions of research design and practice, techniques of data gathering, analysis, and reporting, and information about theories of research [e.g., grounded theory]. The database covers both qualitative and quantitative research methods as well as mixed methods approaches to conducting research.




Definitions and Characteristics of Qualitative Research

Qualitative research aims to uncover the meaning and understanding of phenomena that cannot be broken down into measurable elements. It is based on naturalistic, interpretative and humanistic notions. 5 This research method seeks to discover, explore, identify or describe subjective human experiences using non-statistical methods and develops themes from the study participants’ stories. 5 Figure 4.1 depicts the major features/characteristics of qualitative research. It utilises exploratory open-ended questions and observations to search for patterns of meaning in collected data (e.g. observation, verbal/written narrative data, photographs, etc.) and uses inductive thinking (from specific observations to more general rules) to interpret meaning. 6 Participants’ voices are evident through quotations and description of the work. 6 The context/setting of the study and the researcher’s reflexivity (i.e. “reflection on and awareness of their bias”, the effect of the researcher’s experience on the data and interpretations) are very important and are described as part of data collection. 6 Analysis of collected data is complex, often involves inductive data analysis (exploration, contrasts, specific to general) and requires multiple rounds of coding and the development of themes from participant stories. 6

[Figure 4.1: Flow chart of the characteristics of qualitative research]

Reflexivity: avoiding bias and the role of the qualitative researcher

Qualitative researchers generally begin their work with the recognition that their position (or worldview) has a significant impact on the overall research process. 7 Researcher worldview shapes the way the research is conducted, i.e., how the questions are formulated, methods are chosen, data are collected and analysed, and results are reported. Therefore, it is essential for qualitative researchers to acknowledge, articulate, reflect on and clarify their own underlying biases and assumptions before embarking on any research project. 7 Reflexivity helps to ensure that the researcher’s own experiences, values, and beliefs do not unintentionally bias the data collection, analysis, and interpretation. 7 It is regarded as a gold standard for establishing trustworthiness and is one of the ways qualitative researchers ensure rigour and quality in their work. 8 The following questions in Table 4.1 may help you begin the reflective process. 9

Table 4.1: Questions to aid the reflection process

  • What piques my interest in this subject? -- Consider what motivates your excitement, energy, and interest in investigating this topic.
  • What exactly do I believe the solution is? -- Honestly reflecting on what you anticipate finding helps you detect your own biases. These assumptions can then be grouped and set aside so that the participants’ opinions can be heard.
  • What exactly am I getting out of this? -- In many circumstances, the “pressure to publish” reduces research to nothing more than a job necessity. What effect does this have on your interest in the subject and its results? To what extent are you willing to go to find information?
  • What do my colleagues think of this project, and of me? -- You will not work in a vacuum as a researcher; you will be part of a social and interpersonal world. These outside factors will impact your perceptions of yourself and your job.

Recognising this impact and its possible implications on human behaviour will allow for more self-reflection during the study process.

Philosophical underpinnings to qualitative research

Qualitative research uses an inductive approach and stems from interpretivism or constructivism and assumes that realities are multiple, socially constructed, and holistic. 10 According to this philosophical viewpoint, humans build reality through their interactions with the world around them. 10 As a result, qualitative research aims to comprehend how individuals make sense of their experiences and build meaning in their lives. 10 Because reality is complex/nuanced and context-bound, participants constantly construct it depending on their understanding. Thus, the interactions between the researcher and the participants are considered necessary to offer a rich description of the concept and provide an in-depth understanding of the phenomenon under investigation. 11

An Introduction to Research Methods for Undergraduate Health Profession Students Copyright © 2023 by Faith Alele and Bunmi Malau-Aduli is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.


What Is Qualitative Research? | Methods & Examples

Published on 4 April 2022 by Pritha Bhandari. Revised on 30 January 2023.

Qualitative research involves collecting and analysing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research, which involves collecting and analysing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, and history.

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organisation?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?


Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography, action research, phenomenological research, and narrative research. They share some similarities, but emphasise different aims and perspectives.

Qualitative research approaches:

  • Grounded theory -- Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography -- Researchers immerse themselves in groups or organisations to understand their cultures.
  • Action research -- Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research -- Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research -- Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.


Each of these research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys: distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.

For example, if you were studying the culture of a large company, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves ‘instruments’ in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analysing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organise your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorise your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet (a short sketch of this follows the list). As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
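
Steps 3 to 5 are often carried out in a spreadsheet, and the same bookkeeping can be scripted. Below is a minimal sketch using pandas, with invented survey responses and researcher-assigned codes, showing how tagged codes can be tallied to see which ones recur across participants:

```python
import pandas as pd

# Hypothetical open-ended survey responses, one row per participant.
data = pd.DataFrame(
    {
        "participant": ["P01", "P02", "P03", "P04"],
        "response": [
            "The commute eats up my whole evening.",
            "Flexible hours are the main reason I stay.",
            "I'd leave tomorrow if the commute got worse.",
            "My team is supportive, and the hours are flexible.",
        ],
        # Codes assigned by the researcher while reading each response.
        "codes": [
            ["commute burden"],
            ["flexibility"],
            ["commute burden", "intention to leave"],
            ["team support", "flexibility"],
        ],
    }
)

# One row per (participant, code) pair, then count how many participants mention each code.
code_counts = (
    data.explode("codes")
    .groupby("codes")["participant"]
    .nunique()
    .sort_values(ascending=False)
)
print(code_counts)
```

Codes that recur across many participants (here, "commute burden" and "flexibility") are candidates to be linked into overarching themes in step 5.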

There are several specific approaches to analysing qualitative data. Although these methods share similar processes, they emphasise different concepts.

Qualitative data analysis approaches:

  • Content analysis -- Used to describe and categorise common words, phrases, and ideas in qualitative data. Example: a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis -- Used to identify and interpret patterns and themes in qualitative data. Example: a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis -- Used to examine the content, structure, and design of texts. Example: a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis -- Used to study communication and how language is used to achieve effects in specific contexts. Example: a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analysing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analysing and interpreting data, qualitative research cannot be replicated. The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalisability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalisable conclusions because the data may be biased and unrepresentative of the wider population.

  • Labour-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to test a hypothesis by systematically collecting and analysing data, while qualitative methods allow you to explore ideas and experiences in depth.

There are five common approaches to qualitative research:

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organisation to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organisations.

There are various approaches to qualitative data analysis, but they all share five steps in common:

  • Prepare and organise your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis, thematic analysis, and discourse analysis.

Bhandari, P. (2023, January 30). What Is Qualitative Research? | Methods & Examples. Scribbr. Retrieved 9 September 2024, from https://www.scribbr.co.uk/research-methods/introduction-to-qualitative-research/


The Ultimate Guide to Qualitative Research - Part 1: The Basics


What is qualitative research?

Qualitative research is an essential approach in various academic disciplines and professional fields, as it seeks to understand and interpret the meanings, experiences, and social realities of people in their natural settings. This type of research employs an array of qualitative methods to gather and analyze non-numerical data, such as words, images, and behaviors, and aims to generate in-depth and contextualized insights into the phenomena under study.


Qualitative research is designed to address research questions that focus on understanding the "why" and "how" of human behavior, experiences, and interactions, rather than just the "what" or "how many" that quantitative methods typically seek to answer. The main purpose of qualitative research is to gain a rich and nuanced understanding of people's perspectives, emotions, beliefs, and motivations in relation to specific issues, situations, or phenomena.

Characteristics of qualitative research

Several key characteristics distinguish qualitative research from other types of research, such as quantitative research:

Naturalistic settings: Qualitative researchers collect data in the real-world settings where the phenomena of interest occur, rather than in controlled laboratory environments. This allows researchers to observe and understand the participants' behavior, experiences, and social interactions in their natural context.

Inductive approach: Unlike quantitative research, which often follows a deductive approach, qualitative research begins with the collection of data and then seeks to develop theories, concepts, or themes that emerge from the data. This inductive approach enables researchers to stay open to new insights and unexpected findings.

Holistic perspective: Qualitative research aims to provide a comprehensive understanding of the phenomena under study by considering multiple dimensions, such as the social, cultural, historical, and psychological aspects that shape people's experiences and behavior.

Subjectivity and interpretation: Epistemology plays a crucial role in qualitative research. Researchers are encouraged to reflect on their biases, assumptions, and values, and to consider how these may influence their data collection, analysis, and interpretation.

Flexibility: Qualitative research methods are often flexible and adaptable, allowing researchers to refine their research questions, sampling strategies, or data collection techniques as new insights and perspectives emerge during the research process.

Key principles of qualitative research

Qualitative research is guided by several fundamental principles that shape its approach, methods, and analysis:

Empathy and reflexivity: Qualitative researchers strive to empathize with the participants and to understand their perspectives, experiences, and emotions from their viewpoint. This requires researchers to be attentive, open-minded, and sensitive to the participants' verbal and non-verbal cues. At the same time, qualitative researchers critically reflect on their participants’ perspectives, experiences, and emotions to develop their findings and conclusions, instead of taking these at face value. In addition, it is important for the researcher to reflect on how their own role and viewpoint may be shaping the research.

Trustworthiness: Establishing trustworthiness in qualitative research involves demonstrating credibility, transferability, dependability, and confirmability. Researchers can enhance trustworthiness by using various strategies, such as triangulation, member checking, peer debriefing, and reflexivity.

Iterative analysis: Qualitative data analysis is an ongoing and iterative process, in which researchers continually review, compare, and revise their interpretations as they collect and analyze more data. This iterative process allows researchers to refine their understanding of the phenomena and to develop more robust and nuanced theories, concepts, or themes.

Rich description: Providing detailed, vivid, and context-sensitive descriptions of the data is essential in qualitative research. Rich descriptions help convey the complexity and nuances of the phenomena under study, and enable readers to assess the relevance and transferability of the findings to other settings or populations.


What are the common types of qualitative research?

Qualitative research is an umbrella term for various methodologies that focus on understanding and interpreting human experiences, behaviors, and social phenomena within their context. These approaches seek to gather in-depth, rich data through the analysis of language, actions, and expressions. Five common types of qualitative research are narrative research, phenomenology, grounded theory, ethnography, and case study.

Narrative research: This approach focuses on the stories and experiences of individuals, aiming to understand their lives and personal perspectives. Researchers can collect data through interviews, letters, diaries, or autobiographies, and analyze these narratives to identify recurring themes, patterns, and meanings. Narrative research can be valuable for exploring individual identities, cultural beliefs, and historical events.

Phenomenology: Phenomenology seeks to understand the essence of a particular phenomenon by analyzing the experiences and perceptions of individuals who have gone through that phenomenon. Researchers can explore participants' thoughts, feelings, and experiences through in-depth interviews, observations, or written materials. The goal is to describe the commonalities and variations in these experiences, ultimately revealing the underlying structures and meaning of the phenomenon under study.

Grounded theory: This inductive research method aims to generate new theories by systematically collecting and analyzing data. Researchers begin with an open-ended research question and gather data through observations, interviews, and document analysis. They then use a process of coding and constant comparison to identify patterns, categories, and relationships in the data. This iterative process continues until a comprehensive theory emerges that is grounded in the collected data and explains the topic of interest.

Ethnography: Ethnographic research involves the in-depth study of a specific cultural or social group, focusing on understanding its members' behaviors, beliefs, and interactions. Researchers immerse themselves in the group's environment, often for extended periods, to observe and participate in daily activities. They can collect data through field notes, interviews, and document analysis, aiming to provide a holistic and nuanced understanding of the group's cultural practices and social dynamics.

Case study: A case study is an in-depth examination of a specific instance, event, organization, or individual within its real-life context. Researchers use multiple sources of data, such as interviews, observations, documents, and artifacts, to build a rich, detailed understanding of the case. Case study research can be used to explore complex phenomena, generate new hypotheses, or evaluate the effectiveness of interventions or policies.

What are the purposes of qualitative research?

Qualitative research presents outcomes that emerge from the process of collecting and analyzing qualitative data. These outcomes often involve generating new theories, developing or challenging existing theories, and proposing practical implications based on actionable insights. The products of qualitative research contribute to a deeper understanding of human experiences, social phenomena, and cultural contexts. Qualitative research can also be a powerful complement to quantitative research.

Generating new theory: One of the primary goals of qualitative research is to develop new theories or conceptual frameworks that help explain previously unexplored or poorly understood phenomena. By conducting in-depth investigations and analyzing rich data, researchers can identify patterns, relationships, and underlying structures that form the basis of novel theoretical insights.

Developing or challenging existing theory: Qualitative research can also contribute to the refinement or expansion of existing theories by providing new perspectives, revealing previously unnoticed complexities, or highlighting areas where current theories may be insufficient or inaccurate. By examining the nuances and context-specific details of a phenomenon, researchers can generate evidence that supports, contradicts, or modifies existing theoretical frameworks.

Proposing practical implications: Qualitative research often yields actionable insights that can inform policy, practice, and intervention strategies. By delving into the lived experiences of individuals and communities, researchers can identify factors that contribute to or hinder the effectiveness of certain approaches, uncovering opportunities for improvement or innovation. The insights gained from qualitative research can be used to design targeted interventions, develop context-sensitive policies, or inform professional practice in various fields.

Enhancing understanding and empathy: Qualitative research promotes a deeper understanding of human experiences, emotions, and perspectives, fostering empathy and cultural sensitivity. By engaging with diverse voices and experiences, researchers can develop a more nuanced appreciation of the complexities of human behavior and social dynamics, ultimately contributing to more compassionate and inclusive societies.

Informing mixed-methods research: The products of qualitative research can also be used in conjunction with quantitative research as part of a mixed-methods approach. Qualitative findings can help generate hypotheses for further testing, inform the development of survey instruments, or provide context and explanation for quantitative results. Combining the strengths of both approaches can lead to a more robust and comprehensive understanding of complex research questions.

What are some examples of qualitative research?

Qualitative research can be conducted across various scientific fields, exploring diverse topics and phenomena. Here are six brief descriptions of qualitative studies that can provide researchers with ideas for their own projects:

Exploring the lived experiences of refugees: A phenomenological study could be conducted to investigate the lived experiences and coping strategies of refugees in a specific host country. By conducting in-depth interviews with refugees and analyzing their narratives, researchers can gain insights into the challenges they face, their resilience, and the factors that contribute to successful integration into their new communities.

Understanding the dynamics of online communities: An ethnographic study could be designed to explore the culture and social dynamics of a particular online community or social media platform. By immersing themselves in the virtual environment, researchers can observe patterns of interaction, communication styles, and shared values among community members, providing a nuanced understanding of the factors that influence online behavior and group dynamics.

Examining the impact of gentrification on local communities: A case study could be conducted to explore the impact of gentrification on a specific neighborhood or community. Researchers can collect data through interviews with residents, local business owners, and policymakers, as well as analyzing relevant documents and media coverage. The study can shed light on the effects of gentrification on housing affordability, social cohesion, and cultural identity, informing policy and urban planning decisions.

Studying the career trajectories of women in STEM fields: A narrative research project can be designed to investigate the career experiences and pathways of women in science, technology, engineering, and mathematics (STEM) fields. By collecting and analyzing the stories of women at various career stages, researchers can identify factors that contribute to their success, as well as barriers and challenges they face in male-dominated fields.

Evaluating the effectiveness of a mental health intervention: A qualitative study can be conducted to evaluate the effectiveness of a specific mental health intervention, such as a mindfulness-based program for reducing stress and anxiety. Researchers can gather data through interviews and focus groups with program participants, exploring their experiences, perceived benefits, and suggestions for improvement. The findings can provide valuable insights for refining the intervention and informing future mental health initiatives.

Investigating the role of social media in political activism: A qualitative study using document analysis and visual methods could explore the role of social media in shaping political activism and public opinion during a specific social movement or election campaign. By analyzing user-generated content, such as tweets, posts, images, and videos, researchers can examine patterns of communication, mobilization, and discourse, shedding light on the ways in which social media influences political engagement and democratic processes.

What are common qualitative research methods?

Qualitative research methods are techniques used to collect, analyze, and interpret data in qualitative studies. These methods prioritize the exploration of meaning, context, and individual experiences. Common qualitative research methods include interviews, focus groups, observations, document analysis, and visual methods.

Interviews: Interviews involve one-on-one conversations between the researcher and the participant. They can be structured, semi-structured, or unstructured, depending on the level of guidance provided by the researcher. Interviews allow for in-depth exploration of participants' experiences, thoughts, and feelings, providing rich and detailed data for analysis.

Focus groups: Focus groups are group discussions facilitated by a researcher, usually consisting of 6-12 participants. They enable researchers to explore participants' collective perspectives, opinions, and experiences in a social setting. Focus groups can generate insights into group dynamics, cultural norms, and shared understandings, as participants interact and respond to each other's viewpoints.

Observations: Observational research involves the systematic collection of data through watching and recording people, events, or behaviors in their natural settings. Researchers can take on different roles, such as participant-observer or non-participant observer, depending on their level of involvement. Observations provide valuable information about context, social interactions, and non-verbal communication, which can help researchers understand the nuances of a particular phenomenon.

Document analysis: Document analysis is the examination of written or visual materials, such as letters, diaries, reports, newspaper articles, photographs, or videos. This method can provide insights into historical or cultural contexts, individual perspectives, and organizational processes. Researchers may use content analysis, discourse analysis, or other analytic techniques to interpret the meaning and significance of these documents. (A minimal illustration of keyword-based content analysis appears after these method descriptions.)

Visual methods: Visual methods involve the use of visual materials, such as photographs, drawings, or videos, to explore and represent participants' experiences and perspectives. Techniques like photo elicitation, where participants are asked to take or select photographs related to the research topic and discuss their meaning, can encourage reflection and stimulate discussion. Visual methods can be particularly useful in capturing non-verbal information, promoting cross-cultural understanding, and engaging with hard-to-reach populations.
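
To make the idea of content analysis more concrete, here is a minimal, hypothetical sketch in Python of how a researcher might tally a few predefined codes across a set of document excerpts. The documents, codes, and keywords are invented for illustration; real content analysis, whether manual or software-assisted, involves far more careful code development, context checking, and interpretation than simple keyword matching.

```python
# Minimal illustration of keyword-based content analysis (hypothetical data).
# Real projects develop codebooks iteratively and interpret matches in context.
from collections import Counter

# Invented document excerpts (e.g., diary entries or news snippets).
documents = [
    "I felt supported by my neighbours during the flood.",
    "The council's response was slow and communication was poor.",
    "Neighbours organised food deliveries; communication happened on social media.",
]

# A toy codebook mapping codes to indicative keywords.
codebook = {
    "community_support": ["supported", "neighbours", "organised"],
    "institutional_response": ["council", "response", "communication"],
}

code_counts = Counter()
for doc in documents:
    text = doc.lower()
    for code, keywords in codebook.items():
        if any(keyword in text for keyword in keywords):
            code_counts[code] += 1  # count documents in which the code appears

for code, count in code_counts.items():
    print(f"{code}: present in {count} of {len(documents)} documents")
```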

Importance of qualitative research and qualitative data analysis

Qualitative research and qualitative data analysis play a vital role in advancing knowledge, informing policies, and improving practices in various fields, such as education, healthcare, business, and social work. The unique insights and in-depth understanding generated through qualitative research can accomplish a number of goals.

Inform decision-making

Qualitative research helps decision-makers better understand the needs, preferences, and concerns of different stakeholders, such as customers, employees, or community members. This can lead to more effective and tailored policies, programs, or interventions that address real-world challenges.

Enhance innovation

By exploring people's experiences, motivations, and aspirations, qualitative research can uncover new ideas, opportunities, and trends that can drive innovation in products, services, or processes.

Foster empathy and cultural competence

Qualitative research can increase our empathy and understanding of diverse populations, cultures, and contexts. This can enhance our ability to communicate, collaborate, and work effectively with people from different backgrounds.

Complement quantitative research

Qualitative research can complement quantitative research by providing rich contextual information and in-depth insights into the underlying mechanisms, processes, or factors that may explain the patterns or relationships observed in quantitative data.

Facilitate social change

Qualitative research can give voice to marginalized or underrepresented groups, highlight social injustices or inequalities, and inspire actions and reforms that promote social change and well-being.

Challenges of conducting qualitative research

While qualitative research offers valuable insights and understanding of human experiences, it also presents some challenges that researchers must navigate. Acknowledging and addressing these challenges can help ensure the rigor, credibility, and relevance of qualitative research. In this section, we will discuss some common challenges that researchers may encounter when conducting qualitative research and offer suggestions on how to overcome them.

Subjectivity and bias

One of the primary challenges in qualitative research is managing subjectivity and potential biases that may arise from the researcher's personal beliefs, values, and experiences. Since qualitative research relies on the researcher's interpretation of the data, there is a risk that the researcher's subjectivity may influence the findings.

Researchers can minimize the impact of subjectivity and bias by maintaining reflexivity, or ongoing self-awareness and critical reflection on their role, assumptions, and influences in the research process. This may involve keeping a reflexive journal, engaging in peer debriefing, and discussing potential biases with research participants during member checking.

Data collection and quality

Collecting high-quality data in qualitative research can be challenging, particularly when dealing with sensitive topics, hard-to-reach populations, or complex social phenomena. Ensuring the trustworthiness of qualitative data collection is essential to producing credible and meaningful findings.

Researchers can enhance data quality by employing various strategies, such as purposive or theoretical sampling; triangulation of data sources, methods, or researchers; and establishing rapport and trust with research participants.

Data analysis and interpretation

The analysis and interpretation of qualitative data can be a complex, time-consuming, and sometimes overwhelming process. Researchers must make sense of large amounts of diverse and unstructured data, while also ensuring the rigor, transparency, and consistency of their analysis.

Researchers can facilitate data analysis and interpretation by adopting systematic and well-established approaches, such as thematic analysis, grounded theory, or content analysis. Utilizing qualitative data analysis software, like ATLAS.ti, can also help manage and analyze data more efficiently and rigorously.
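
As a rough illustration of what managing and analyzing data systematically can mean in practice, the hedged Python sketch below shows one simple way to organize coded interview excerpts so that codes can be grouped into candidate themes and retrieved together with their supporting quotes. It is not a model of how ATLAS.ti or any particular package works internally; the participants, codes, quotes, and themes are hypothetical.

```python
# A minimal, hypothetical sketch of organizing coded excerpts for thematic analysis.
# It only illustrates linking quotes to codes and grouping codes into broader themes.
from collections import defaultdict

# Coded excerpts: (participant ID, quote, assigned code) -- invented examples.
coded_excerpts = [
    ("P01", "I only kept going because my supervisor checked in every week.", "supervisor_support"),
    ("P02", "The breathing exercises gave me something concrete to do when I panicked.", "coping_strategies"),
    ("P03", "Knowing others in the group felt the same made it less isolating.", "peer_connection"),
    ("P01", "Writing things down each night helped me notice small improvements.", "coping_strategies"),
]

# Candidate themes grouping related codes (refined iteratively in a real analysis).
themes = {
    "Sources of support": ["supervisor_support", "peer_connection"],
    "Active coping": ["coping_strategies"],
}

# Index quotes by code so each theme can be reviewed with its supporting evidence.
quotes_by_code = defaultdict(list)
for participant, quote, code in coded_excerpts:
    quotes_by_code[code].append((participant, quote))

for theme, codes in themes.items():
    print(f"\nTheme: {theme}")
    for code in codes:
        for participant, quote in quotes_by_code[code]:
            print(f"  [{code}] {participant}: {quote}")
```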

Ethical considerations

Qualitative research often involves exploring sensitive issues or working with vulnerable populations, which raises various ethical considerations, such as privacy, confidentiality, informed consent, and potential harm to participants.

Researchers should be familiar with the ethical guidelines and requirements of their discipline, institution, or funding agency, and should obtain ethical approval from relevant review boards or committees before conducting the research. Researchers should also maintain open communication with participants, respect their autonomy and dignity, and protect their well-being throughout the research process.

Generalizability and transferability

Qualitative research typically focuses on in-depth exploration of specific cases or contexts, which may limit the generalizability or transferability of the findings to other settings or populations. However, the goal of qualitative research is not to produce statistically generalizable results but rather to provide a rich, contextualized, and nuanced understanding of the phenomena under study.

Researchers can enhance the transferability of their findings by providing rich descriptions of the research context, participants, and methods, and by discussing the potential applicability or relevance of the findings to other settings or populations. Readers can then assess the transferability of the findings based on the similarity of their own context to the one described in the research.

By addressing these challenges and adopting rigorous and transparent research practices, qualitative researchers can contribute valuable and meaningful insights that advance knowledge, inform policies, and improve practices in various fields and contexts.

How does qualitative research complement quantitative research?

Qualitative and quantitative research approaches are often seen as distinct and even opposing paradigms. However, these two approaches can be complementary, providing a more comprehensive understanding of complex social phenomena when combined. In this section, we will discuss how qualitative research can complement quantitative research and enhance the overall depth, breadth, and rigor of research findings.

Exploring and understanding context

Quantitative research excels at identifying patterns, trends, and relationships among variables using numerical data, while qualitative research provides rich and nuanced insights into the context, meaning, and underlying processes that shape these patterns or relationships. By integrating qualitative research with quantitative research, researchers can explore not only the "what" or "how many" but also the "why" and "how" of the phenomena under study.

For example, a quantitative study in health services research might reveal a correlation between social media usage and mental health outcomes, while a qualitative study could help explain the reasons behind this correlation by exploring users' experiences, motivations, and perceptions of social media. Qualitative and quantitative data in this case complement each other to contribute to a more robust theory and more informed policy implications.

Generating and refining hypotheses

Qualitative research can inform the development and refinement of hypotheses for quantitative research by identifying new concepts, variables, or relationships that emerge from the data. This can lead to more focused, relevant, and innovative quantitative research questions and hypotheses. For instance, a qualitative study on employee motivation might uncover the importance of meaningful work and supportive relationships with supervisors as key factors influencing motivation. These findings could then be incorporated into a quantitative study to test the relationships between these factors and employee motivation.

Validating and triangulating findings

Combining qualitative and quantitative research methods can enhance the credibility and trustworthiness of research findings through validation and triangulation. Validation involves comparing the findings from different methods to assess their consistency and convergence, while triangulation involves using multiple methods, data sources, or researchers to gain a more comprehensive understanding of the phenomena under study.

For example, a researcher might use both quantitative surveys and qualitative interviews in a mixed methods research design to assess the effectiveness of a health intervention. If both methods yield similar findings, this can increase confidence in the results. If the findings differ, the researcher can further investigate the reasons for these discrepancies and refine their understanding of the intervention's effectiveness.

Enhancing communication and dissemination

Qualitative research can enhance the communication and dissemination of quantitative research findings by providing vivid narratives, case studies, or examples that bring the data to life and make it more accessible and engaging for diverse audiences, such as policymakers, practitioners, or the public.

For example, a quantitative study on the impact of a community-based program might report the percentage of participants who experienced improvements in various outcomes. By adding qualitative data, such as quotes or stories from participants, the researcher can illustrate the human impact of the program and make the findings more compelling and relatable.

In conclusion, qualitative research can complement and enrich quantitative research in various ways, leading to a more comprehensive, contextualized, and rigorous understanding of complex social phenomena. By integrating qualitative and quantitative research methods, researchers can harness the strengths of both approaches to produce more robust, relevant, and impactful findings that inform theory, policy, and practice.

How are qualitative research findings reported?

Qualitative research findings are typically reported in various formats, depending on the audience, purpose, and context of the research. Common ways to report qualitative research include dissertations, journal articles, market research reports, and needs assessment reports. Each format has its own structure and emphasis, tailored to meet the expectations and requirements of its target audience.

Dissertations and theses: Doctoral, master's, or bachelor's students often conduct qualitative research as part of their dissertation or thesis projects. In this format, researchers provide a comprehensive account of their research questions, methodology, data collection, data analysis, and findings. Dissertations are expected to make a significant contribution to the existing body of knowledge and demonstrate the researcher's mastery of the subject matter.

Journal articles: Researchers frequently disseminate their qualitative research findings through articles published in academic journals. These articles typically include introduction, literature review, methodology, results, and discussion sections, and they often undergo peer review before publication. Journal articles focus on communicating the study's purpose, methods, and findings in a concise and coherent manner, providing enough detail for other researchers to evaluate the rigor and validity of the research, cite the article, and build on it in their own studies.

Market research reports: Market research often employs qualitative methods to gather insights into consumer behavior, preferences, and attitudes. Market research reports present the findings of these studies to clients, typically businesses or organizations interested in understanding their target audience or market trends. These reports focus on providing actionable insights and recommendations based on the qualitative data, helping clients make informed decisions and develop effective marketing strategies.

Needs assessment reports: Needs assessment is a process used to identify gaps or areas of improvement in a specific context, such as healthcare, education, or social services. Qualitative research methods can be used to collect data on the needs, challenges, and experiences of the target population. Needs assessment reports present the findings of this research, highlighting the identified needs and providing recommendations for addressing them. These reports are used by organizations and policymakers to inform the development and implementation of targeted interventions and policies.

Other formats: In addition to the aforementioned formats, qualitative research findings can also be reported in conference presentations, white papers, policy briefs, blog posts, or multimedia presentations. The choice of format depends on the target audience and the intended purpose of the research, as well as the researcher's preferences and resources. Regardless of the format, it is important for researchers to present their findings in a clear, accurate, and engaging manner, ensuring that their work is accessible and relevant to their audience.

Qualitative Research: Definition

Qualitative research is the naturalistic study of social meanings and processes, using interviews, observations, and the analysis of texts and images. In contrast to quantitative researchers, whose statistical methods enable broad generalizations about populations (for example, comparisons of the percentages of U.S. demographic groups who vote in particular ways), qualitative researchers use in-depth studies of the social world to analyze how and why groups think and act in particular ways (for instance, case studies of the experiences that shape political views).

Conceptualization in qualitative research

Chapter outline

  • 15.1 Alternative paradigms: Interpretivism, critical paradigm, and pragmatism
  • 15.2 Multiparadigmatic research: An example
  • 15.3 Idiographic causal relationships
  • 15.4 Qualitative research questions

Now let’s change things up! In the previous chapters, we explored steps to create and carry out a quantitative research study. Quantitative studies are great when we want to summarize or test relationships between ideas using numbers and the power of statistics. However, qualitative research offers us a different and equally important tool. Sometimes the aim of research projects is to explore meaning and lived experience. Instead of trying to arrive at generalizable conclusions for all people, some research projects establish a deep, authentic description of a specific time, place, and group of people.

Qualitative research relies on the power of human expression through words, pictures, movies, performance and other artifacts that represent these things. All of these tell stories about the human experience and we want to learn from them and have them be represented in our research. Generally speaking, qualitative research is about the gathering up of these stories, breaking them into pieces so we can examine the ideas that make them up, and putting them back together in a way that allows us to tell a common or shared story that responds to our research question. To do that, we need to discuss the assumptions underlying social science.

Figure 7.1. A penguin on an ice floe; the layers of the floe are labeled, from top to bottom: method, methodology, theory, and philosophical foundations.

17.1 Alternative paradigms: Interpretivism, critical, and pragmatism

Learning objectives

Students will be able to…

  • Distinguish between the assumptions of positivism, interpretivism, critical, and pragmatist research paradigms.
  • Use the concept of a paradigm to describe how scientific thought changes over time.

In Chapter 10, we reviewed the assumptions that underlie post-positivism (abbreviated hereafter as positivism for brevity). Quantitative methods are most often the choice for positivist research questions because they conform to these assumptions. Qualitative methods can conform to these assumptions; however, they are limited in their generalizability.

Kivunja & Kuyini (2017) [1] describe the essential features of positivism as:

  • A belief that theory is universal and law-like generalizations can be made across contexts
  • The assumption that context is not important
  • The belief that truth or knowledge is ‘out there to be discovered’ by research
  • The belief that cause and effect are distinguishable and analytically separable
  • The belief that results of inquiry can be quantified
  • The belief that theory can be used to predict and to control outcomes
  • The belief that research should follow the scientific method of investigation
  • Rests on formulation and testing of hypotheses
  • Employs empirical or analytical approaches
  • Pursues an objective search for facts
  • Believes in ability to observe knowledge
  • The researcher’s ultimate aim is to establish a comprehensive universal theory, to account for human and social behavior
  • Application of the scientific method

Because positivism is the dominant social science research paradigm, it can be easy to ignore, or be confused by, research that does not use its assumptions. In Chapter 10, when discussing the assumptions underlying positivist social science, we covered the table reprinted below.

As you consider your research project, keep these philosophical assumptions in mind. They are useful shortcuts to understanding the deeper ideas and assumptions behind the construction of knowledge. The purpose of exploring these philosophical assumptions isn’t to find out which is true and which is false. Instead, the goal is to identify the assumptions that fit with how you think about your research question. Choosing a paradigm helps you make those assumptions explicit.

Table 7.1 Philosophical assumptions in social science research

  • Ontology: assumptions about what is real
  • Epistemology: assumptions about how we come to know what is real
  • Assumptions about the researcher
  • Assumptions about human action
  • Assumptions about the social world
  • Assumptions about the purpose of research

Before we explore alternative paradigms, it’s important for us to review what paradigms are.

How do scientific ideas change over time?

Much like your ideas develop over time as you learn more, so does the body of scientific knowledge. Kuhn’s (1962) [2] The Structure of Scientific Revolutions is one of the most influential works on the philosophy of science, and is credited with introducing the idea of competing paradigms (or “disciplinary matrices”) in research. Kuhn investigated the way that scientific practices evolve over time, arguing that we don’t have a simple progression from “less knowledge” to “more knowledge” because the way that we approach inquiry is changing over time. This can happen gradually, but the process results in moments of change where our understanding of a phenomenon changes more radically (such as in the transition from Newtonian to Einsteinian physics, or from Lamarckian to Darwinian theories of evolution). For a social work practice example, Fleuridas & Krafcik (2019) [3] trace the development of the “four forces” of psychotherapy, from psychodynamics to behaviorism to humanism, as well as the competition among emerging perspectives to establish themselves as the fourth force to guide psychotherapeutic practice. But how did the problems in one paradigm inspire new paradigms? Kuhn presents us with a way of understanding the history of scientific development across all topics and disciplines.

As you can see in this video from Matthew J. Brown (CC-BY), there are four stages in the cycle of science in Kuhn’s approach. Firstly, a pre-paradigmatic state where competing approaches share no consensus. Secondly, the “normal” state where there is wide acceptance of a particular set of methods and assumptions. Thirdly, a state of crisis where anomalies that cannot be solved within the existing paradigm emerge and competing theories to address them follow. Fourthly, a revolutionary phase where some new paradigmatic approach becomes dominant and supplants the old. Shneider (2009) [4] suggests that the Kuhnian phases are characterized by different kinds of scientific activity.

Newer approaches often build upon rather than replace older ones, but they also overlap and can exist within a state of competition. Scientists working within a particular paradigm often share methods, assumptions and values. In addition to supporting specific methods, research paradigms also influence things like the ambition and nature of research, the researcher-participant relationship and how the role of the researcher is understood.

Paradigm vs. theory

The terms ‘paradigm’ and ‘theory’ are often used interchangeably in social science. There is not a consensus among social scientists as to whether these are identical or distinct concepts. With that said, in this text, we will make a clear distinction between the two ideas because thinking about each concept separately is more useful for our purposes.

We define a paradigm as a set of common philosophical (ontological, epistemological, and axiological) assumptions that inform research. The four paradigms we describe in this section refer to patterns in how groups of researchers resolve philosophical questions. Some assumptions naturally make sense together, and paradigms grow out of researchers with shared assumptions about what is important and how to study it. Paradigms are like “analytic lenses” and provide a framework on top of which we can build theoretical and empirical knowledge (Kuhn, 1962). [5] Consider this video of an interview with world-famous physicist Richard Feynman in which he explains why “when you explain a ‘why,’ you have to be in some framework that you allow something to be true. Otherwise, you are perpetually asking why.” In order to answer a basic physics question like “what is happening when two magnets attract?” or a social work research question like “what is the impact of this therapeutic intervention on depression,” you must understand the assumptions you are making about social science and the social world. Paradigmatic assumptions about objective and subjective truth support methodological choices like whether to conduct interviews or send out surveys, for example.

While paradigms are broad philosophical assumptions, theory is more specific, and refers to a set of concepts and relationships scientists use to explain the social world. Theories are more concrete, while paradigms are more abstract. Look back to Figure 7.1 at the beginning of this chapter. Theory helps you identify the concepts and relationships that align with your paradigmatic understanding of the problem. Moreover, theory informs how you will measure the concepts in your research question and the design of your project.

For both theories and paradigms, Kuhn’s observation of scientific paradigms, crises, and revolutions is instructive for understanding the history of science. Researchers inherit institutions, norms, and ideas that are marked by the battlegrounds of theoretical and paradigmatic debates that stretch back hundreds of years. We have necessarily simplified this history into four paradigms: positivism, interpretivism, critical, and pragmatism. Our framework and explanation are inspired by those of Guba and Lincoln (1990) [6] and Burrell and Morgan (1979), [7] while also incorporating pragmatism as a way of resolving paradigmatic questions. Most social work research and theory can be classified as belonging to one of these four paradigms, though this classification system represents only one of many useful approaches to analyzing social science research paradigms.

Building on our discussion in section 7.1 on objective vs. subjective epistemologies and ontologies, we will start with the difference between positivism and interpretivism. Afterward, we will link our discussion of axiology in section 7.2 with the critical paradigm. Finally, we will situate pragmatism as a way to resolve paradigmatic questions strategically. The difference between positivism and interpretivism is a good place to start, since the critical paradigm and pragmatism build on their philosophical insights.

It’s important to think of paradigms less as distinct categories and more as a spectrum along which projects might fall. For example, some projects may be somewhat positivist, somewhat interpretivist, and a little critical. No project fits perfectly into one paradigm. Additionally, no paradigm is more correct than the others. Each paradigm uses assumptions that are logically consistent and that, taken together, offer a useful approach to understanding the social world using science. The purpose of this section is to acquaint you with what research projects in each paradigm look like and how they are grounded in philosophical assumptions about social science.

You should read this section to situate yourself in terms of what paradigm feels most “at home” to both you as a person and to your project. You may find, as I have, that your research projects are more conventional and less radical than what feels most like home to you, personally. In a research project, however, students should start with their working question rather than their heart. Use the paradigm that fits with your question the best, rather than which paradigm you think fits you the best.

Interpretivism: Researcher as “empathizer”

Positivism is focused on generalizable truth. Interpretivism, by contrast, develops from the idea that we want to understand the truths of individuals, how they interpret and experience the world, their thought processes, and the social structures that emerge from sharing those interpretations through language and behavior. The process of interpretation (or social construction) is guided by the empathy of the researcher to understand the meaning behind what other people say.

Historically, interpretivism grew out of a specific critique of positivism: that knowledge in the human and social sciences cannot conform to the model of natural science because there are features of human experience that cannot objectively be “known”. The tools we use to understand objects that have no self-awareness may not be well-attuned to subjective experiences like emotions, understandings, values, feelings, socio-cultural factors, historical influences, and other meaningful aspects of social life. Instead of finding a single generalizable “truth,” the interpretivist researcher aims to generate understanding and often adopts a relativist position.

While positivists seek “the truth,” the social constructionist framework argues that “truth” varies. Truth differs based on who you ask, and people change what they believe is true based on social interactions. These subjective truths also exist within social and historical contexts, and our understanding of truth varies across communities and time periods. This is because we, according to this paradigm, create reality ourselves through our social interactions and our interpretations of those interactions. Key to the interpretivist perspective is the idea that social context and interaction frame our realities.

Researchers operating within this framework take keen interest in how people come to socially agree, or disagree, about what is real and true. Consider how people, depending on their social and geographical context, ascribe different meanings to certain hand gestures. When a person raises their middle finger, those of us in Western cultures will probably think that this person isn’t very happy (not to mention the person at whom the middle finger is being directed!). In other societies around the world, a thumbs-up gesture, rather than a middle finger, signifies discontent (Wong, 2007). [8] The fact that these hand gestures have different meanings across cultures aptly demonstrates that those meanings are socially and collectively constructed. What, then, is the “truth” of the middle finger or thumbs up? As we’ve seen in this section, the truth depends on the intention of the person making the gesture, the interpretation of the person receiving it, and the social context in which the action occurred.

Qualitative methods are preferred as ways to investigate these phenomena. Data collected might be unstructured (or “messy”) and correspondingly a range of techniques for approaching data collection have been developed. Interpretivism acknowledges that it is impossible to remove cultural and individual influence from research, often instead making a virtue of the positionality of the researcher and the socio-cultural context of a study.

One common objection positivists level against interpretivists is that interpretivism tends to emphasize the subjective over the objective. If the starting point for an investigation is that we can’t fully and objectively know the world, how can we do research into this without everything being a matter of opinion? For the positivist, this risk of confirmation bias, as well as of invalid and unreliable measures, makes interpretivist research unscientific. Clearly, we disagree with this assessment, and you should, too. Positivism and interpretivism have different ontologies and epistemologies with contrasting notions of rigor and validity (for more information on assumptions about measurement, see Chapter 11 for quantitative validity and reliability and Chapter 20 for qualitative rigor). Nevertheless, both paradigms apply the values and concepts of the scientific method through systematic investigation of the social world, even if their assumptions lead them to do so in different ways. Interpretivist research often embraces a relativist epistemology, bringing together different perspectives in search of a trustworthy and authentic understanding or narrative.

Kivunja & Kuyini (2017) [9] describe the essential features of interpretivism as:

  • The belief that truths are multiple and socially constructed
  • The acceptance that there is inevitable interaction between the researcher and his or her research participants
  • The acceptance that context is vital for knowledge and knowing
  • The belief that knowledge can be value laden and the researcher’s values need to be made explicit
  • The need to understand specific cases and contexts rather than deriving universal laws that apply to everyone, everywhere.
  • The belief that causes and effects are mutually interdependent, and that causality may be circular or contradictory
  • The belief that contextual factors need to be taken into consideration in any systematic pursuit of understanding

One important clarification: it’s important to think of the interpretivist perspective as not just about individual interpretations but the social life of interpretations. While individuals may construct their own realities, groups—from a small one such as a married couple to large ones such as nations—often agree on notions of what is true and what “is” and what “is not.” In other words, the meanings that we construct have power beyond the individuals who create them. Therefore, the ways that people and communities act based on such meanings is of as much interest to interpretivists as how they were created in the first place. Theories like social constructionism, phenomenology, and symbolic interactionism are often used in concert with interpretivism.

Is interpretivism right for your project?

An interpretivist orientation to research is appropriate when your working question asks about subjective truths. The cause-and-effect relationships that interpretivist studies produce are specific to the time and place in which the study happened, rather than a generalizable objective truth. More pragmatically, if you picture yourself having a conversation with participants like an interview or focus group, then interpretivism is likely going to be a major influence for your study.

Positivists critique the interpretivist paradigm as non-scientific. They view the interpretivist focus on subjectivity and values as sources of bias. Positivists and interpretivists differ on the degree to which social phenomena are like natural phenomena. Positivists believe that the assumptions of the social sciences and natural sciences are the same, while interpretivists strongly believe that social sciences differ from the natural sciences because their subjects are social creatures.

Similarly, the critical paradigm finds fault with the interpretivist focus on the status quo rather than social change. Although interpretivists often proceed from a feminist or other standpoint theory, the focus is less on liberation than on understanding the present from multiple perspectives. Other critical theorists may object to the consensus orientation of interpretivist research. By searching for commonalities between people’s stories, they may erase the uniqueness of each individual’s story. For example, while interpretivists may arrive at a consensus definition of what the experience of “coming out” is like for people who identify as lesbian, gay, bisexual, transgender, or queer, it cannot represent the diversity of each person’s unique “coming out” experience and what it meant to them. See, for instance, Rosario and colleagues’ (2009) [10] critique of the literature on lesbians “coming out,” which argues that previous studies did not address how appearing, behaving, or identifying as butch or femme impacted the experience of “coming out” for lesbians.

  • From your literature search, identify an empirical article that uses qualitative methods to answer a research question similar to your working question or about your research topic.
  • Review the assumptions of the interpretivist research paradigm.
  • Discuss in a few sentences how the author’s conclusions are based on some of these paradigmatic assumptions. How might a researcher operating from a different paradigm (like positivism or the critical paradigm) critique the conclusions of this study?

Critical paradigm: Researcher as “activist”

As we’ve discussed a bit in the preceding sections, the critical paradigm focuses on power, inequality, and social change. Although some rather diverse perspectives are included here, the critical paradigm, in general, includes ideas developed by early social theorists, such as Max Horkheimer (Calhoun et al., 2007), [11] and later works developed by feminist scholars, such as Nancy Fraser (1989). [12] Unlike the positivist paradigm, the critical paradigm assumes that social science can never be truly objective or value-free. Furthermore, this paradigm operates from the perspective that scientific investigation should be conducted with the express goal of social change. Researchers in the critical paradigm foreground axiology, positionality, and values. In contrast with the detached, “objective” observations associated with the positivist researcher, critical approaches make explicit the intention for research to act as a transformative or emancipatory force within and beyond the study.

Researchers in the critical paradigm might start with the knowledge that systems are biased against certain groups, such as women or ethnic minorities, building upon previous theory and empirical data. Moreover, their research projects are designed not only to collect data, but to impact the participants as well as the systems being studied. The critical paradigm applies its study of power and inequality to change those power imbalances as part of the research process itself. If this sounds familiar to you, you may remember hearing similar ideas when discussing social conflict theory in your human behavior in the social environment (HBSE) class. [13] Because of this focus on social change, the critical paradigm is a natural home for social work research. However, we fall far short of adopting this approach widely in our profession’s research efforts.

Is the critical paradigm right for your project?

Every social work research project impacts social justice in some way. What distinguishes critical research is how it integrates an analysis of power into the research process itself. Critical research is appropriate for projects that are activist in orientation. For example, critical research projects should have working questions that explicitly seek to raise the consciousness of an oppressed group or collaborate equitably with community members and clients to address issues of concern. Because of their transformative potential, critical research projects can be incredibly rewarding to complete. However, partnerships take a long time to develop and social change can evolve slowly on an issue, making critical research a more challenging fit for student projects, which must be completed under a tight deadline with few resources.

Positivists critique the critical paradigm on multiple fronts. First and foremost, the focus on oppression and values as part of the research process is seen as likely to bias the research process, most problematically towards confirmation bias. If you start out with the assumption that oppression exists and must be dealt with, then you are likely to find it, regardless of whether it is truly there or not. Similarly, positivists may fault critical researchers for focusing on how the world should be, rather than how it truly is. In this, they may focus too much on theoretical and abstract inquiry and less on traditional experimentation and empirical inquiry. Finally, the goal of social transformation is seen as inherently unscientific, as science is not a political practice.

Interpretivists often find common cause with critical researchers. Feminist studies, for example, may explore the perspectives of women while centering gender-based oppression as part of the research process. In interpretivist research, the focus is less on radical change as part of the research process and more on small, incremental changes based on the results and conclusions drawn from the research project. Additionally, some critical researchers’ focus on individuality of experience is in stark contrast to the consensus-orientation of interpretivists. Interpretivists seek to understand people’s true selves. Some critical theorists argue that people have multiple selves or no self at all.

  • From your literature search, identify an article relevant to your working question or broad research topic that uses a critical perspective. You should look for articles where the authors are clear that they are applying a critical approach to research, such as feminism, anti-racism, Marxism and critical theory, decolonization, anti-oppressive practice, or other social justice-focused theoretical perspectives. To target your search further, include keywords for research methods commonly used in the critical paradigm, like participatory action research and community-based participatory research. If you have trouble identifying an article for this exercise, consult your professor for some help. These articles may be more challenging to find, but reviewing one is necessary to get a feel for what research in this paradigm is like.
  • Review the assumptions of the critical research paradigm.
  • Discuss in a few sentences how the author’s conclusions are based on some of these paradigmatic assumptions. How might a researcher operating from different assumptions (like values-neutrality or researcher as neutral and unbiased) critique the conclusions of this study?

Pragmatism: Researcher as “strategist”

“Essentially, all models are wrong but some are useful.” (Box, 1976) [14]

Pragmatism is a research paradigm that suspends questions of philosophical ‘truth’ and focuses more on how different philosophies, theories, and methods can be used strategically to provide a multidimensional view of a topic. Researchers employing pragmatism will mix elements of positivist, interpretivist, and critical research depending on the purpose of a particular project and the practical constraints faced by the researcher and their research context. We favor this approach for student projects because it avoids getting bogged down in choosing the “right” paradigm and instead focuses on the assumptions that help you answer your question, given the limitations of your research context. Student research projects are completed quickly and moving in the direction of pragmatism can be a route to successfully completing a project. Your project is a representation of what you think is feasible, ethical, and important enough for you to study.

The crucial consideration for the pragmatist is whether the outcomes of research have any real-world application, rather than whether they are “true.” The methods, theories, and philosophies chosen by pragmatic researchers are guided by their working question. There are no distinctively pragmatic research methods, since this approach is about making judicious use of whichever methods fit best with the problem under investigation. Pragmatic approaches may be less likely to prioritize ontological, epistemological, or axiological consistency when combining different research methods. Instead, the emphasis is on solving a pressing problem and adapting to the limitations and opportunities in the researchers’ context.

Adopt a multi-paradigmatic perspective

Believe it or not, there is a long literature of acrimonious conflict between scientists from positivist, interpretivist, and critical camps (see Heineman-Pieper et al., 2002 [15] for a longer discussion). Pragmatism is an old idea, but it is appealing precisely because it attempts to resolve the problem of multiple incompatible philosophical assumptions in social science. To a pragmatist, there is no “correct” paradigm. All paradigms rely on assumptions about the social world that are the subject of philosophical debate. Each paradigm is an incomplete understanding of the world, and it requires a scientific community using all of them to gain a comprehensive view of the social world. This multi-paradigmatic perspective is a unique gift of social work research, as our emphasis on empathy and social change makes us more critical of positivism, the dominant paradigm in social science.

We offered the metaphors of expert, empathizer, activist, and strategist for each paradigm. It’s important not to take these labels too seriously. For example, some may view that scientists should be experts or that activists are biased and unscientific. Nevertheless, we hope that these metaphors give you a sense of what it feels like to conduct research within each paradigm.

One of the unique aspects of paradigmatic thinking is that where you think you are most at home may actually be the opposite of where your research project is. For example, in my graduate and doctoral education, I thought I was a critical researcher. In fact, I thought I was a radical researcher focused on social change and transformation. Yet, oftentimes when I sit down to conceptualize and start a research project, I find myself squarely in the positivist paradigm, thinking through neat cause-and-effect relationships that can be mathematically measured. There is nothing wrong with that! Your task for your research project is to find the paradigm that best matches your research question. Think through what you really want to study and how you think about the topic, then use the assumptions of that paradigm to guide your inquiry.

Another important lesson is that no research project fits perfectly in one paradigm or another. Instead, there is a spectrum along which studies are, to varying degrees, interpretivist, positivist, and critical. For example, all social work research is a bit activist in that our research projects are designed to inform action for change on behalf of clients and systems. However, some projects will focus on the conclusions and implications of projects informing social change (i.e., positivist and interpretivist projects) while others will partner with community members and design research projects collaboratively in a way that leads to social change (i.e. critical projects). In section 7.5, we will describe a pragmatic approach to research design guided by your paradigmatic and theoretical framework.

Key Takeaways

  • Social work research falls, to some degree, in each of the four paradigms: positivism, interpretivism, critical, and pragmatism.
  • Adopting a pragmatic, multi-paradigmatic approach to research makes sense for student researchers, as it directs students to use the philosophical assumptions and methodological approaches that best match their research question and research context.
  • Research in all paradigms is necessary to come to a comprehensive understanding of a topic, and social workers must be able to understand and apply knowledge from each research paradigm.
  • Describe which paradigm best fits your perspective on the world and which best fits with your project.
  • Identify any similarities and differences in your personal assumptions and the assumption your research project relies upon. For example, are you a more critical and radical thinker but have chosen a more “expert” role for yourself in your research project?

Multiparadigmatic research: An example

Learners will be able to…

  • Apply the assumptions of each paradigm to your project
  • Summarize what aspects of your project stem from positivist, interpretivist, or critical assumptions

In the previous sections, we reviewed the major paradigms and theories in social work research. In this section, we will provide an example of how to apply theory and paradigm in research. This process is depicted in Figure 7.2 below, with some quick summary questions for each stage. Some questions in the figure have example answers, like designs (e.g., experimental, survey) and data analysis approaches (e.g., discourse analysis). These examples are arbitrary. There are a lot of options that are not listed, so don’t feel like you have to memorize them or use them in your study.

This diagram (taken from an archived Open University (UK) course entitled E89 - Educational Inquiry) shows one way to visualize the research design process. While research is far from linear, in general, this is how research projects progress sequentially. Researchers begin with a working question, and through engaging with the literature, develop and refine those questions into research questions (a process we will finalize in Chapter 9). But in order to get to the part where you gather your sample, measure your participants, and analyze your data, you need to start with a paradigm. Based on your work in section 7.3, you should have a sense of which paradigm or paradigms are best suited to answering your question. The approach taken will often reflect the nature of the research question; the kind of data it is possible to collect; and work previously done in the area under consideration. When evaluating paradigm and theory, it is important to look at what other authors have done previously and the framework used by studies that are similar to the one you are thinking of conducting.

Once you situate your project in a research paradigm, it becomes possible to start making concrete choices about methods. Depending on the project, this will involve choices about things like:

  • What is my final research question?
  • What are the key variables and concepts under investigation, and how will I measure them?
  • How do I find a representative sample of people who experience the topic I’m studying?
  • What design is most appropriate for my research question?
  • How will I collect and analyze data?
  • How do I determine whether my results describe real patterns in the world or are the result of bias or error?

The data collection phase can begin once these decisions are made. It can be very tempting to start collecting data as soon as possible, as this gives a sense of progress. However, it is usually worth getting things exactly right before collecting data, because an error in your approach discovered further down the line can be much harder to correct or work around.

Designing a study using paradigm and theory: An example

Paradigm and theory have the potential to turn some people off since there is a lot of abstract terminology and thinking about real-world social work practice contexts. In this section, I’ll use an example from my own research, and I hope it will illustrate a few things. First, it will show that paradigms are really just philosophical statements about things you already understand and think about normally. It will also show that no project neatly sits in one paradigm and that a social work researcher should use whichever paradigm or combination of paradigms suits their question the best. Finally, I hope it is one example of how to be a pragmatist and strategically use the strengths of different theories and paradigms to answer a research question. We will pick up the discussion of mixed methods in the next chapter.

Thinking as an expert: Positivism

In my undergraduate research methods class, I used an open textbook much like this one and wanted to study whether it improved student learning. You can read a copy of the article we wrote based on our study. We’ll learn more about the specifics of experiments and evaluation research in Chapter 13, but you know enough to understand what evaluating an intervention might look like. My first thought was to conduct an experiment, which placed me firmly within the positivist or “expert” paradigm.

Experiments focus on isolating the relationship between cause and effect. For my study, this meant studying an open textbook (the cause, or intervention) and final grades (the effect, or outcome). Notice that my position as “expert” lets me assume many things in this process. First, it assumes that I can distill the many dimensions of student learning into one number—the final grade. Second, as the “expert,” I’ve determined what the intervention is: indeed, I created the book I was studying, and applied a theory from experts in the field that explains how and why it should impact student learning.
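To make that cause-and-effect framing concrete, here is a minimal, hypothetical sketch in Python of the comparison at the heart of such an evaluation: average final grades under the open textbook versus a commercial one. The grade values, group sizes, and variable names are invented for illustration; they are not the study’s actual data or analysis.

```python
# A toy sketch of comparing an outcome (final grades) across two conditions
# (open textbook vs. commercial textbook). All numbers are made up.

open_textbook = [88, 92, 79, 85, 90, 84]   # final grades, open-textbook sections (hypothetical)
commercial = [82, 87, 78, 80, 85, 81]      # final grades, commercial-textbook sections (hypothetical)

def mean(xs):
    return sum(xs) / len(xs)

difference = mean(open_textbook) - mean(commercial)
print(f"mean difference in final grade: {difference:.1f} points")

# A positivist analysis would go on to ask whether a difference this size could
# plausibly arise by chance (e.g., with a t-test), not just whether it is nonzero.
```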

Theory is part of applying all paradigms, but I’ll discuss its impact within positivism first. Theories grounded in positivism help explain why one thing causes another. More specifically, these theories isolate a causal relationship between two (or more) concepts while holding constant the effects of other variables that might confound the relationship between the key variables. That is why experimental design is so common in positivist research. The researcher isolates the environment from anything that might impact or bias the cause and effect relationship they want to investigate.

But in order for one thing to lead to change in something else, there must be some logical, rational reason why it would do so. In open education, there are a few hypotheses (though no full-fledged theories) on why students might perform better using open textbooks. The most common is the access hypothesis , which states that students who cannot afford expensive textbooks or wouldn’t buy them anyway can access open textbooks because they are free, which will improve their grades. It’s important to note that I held this theory prior to starting the experiment, as in positivist research you spell out your hypotheses in advance and design an experiment to support or refute that hypothesis.

Notice that the hypothesis here applies not only to the people in my experiment, but to any student in higher education. Positivism seeks generalizable truth, or what is true for everyone. The results of my study should provide evidence that anyone who uses an open textbook would achieve similar outcomes. Of course, there were a number of limitations, as it was difficult to tightly control the study. I could not randomly assign students or prevent them from sharing resources with one another, for example. So, while this study had many positivist elements, it was far from a perfect positivist study, because I was forced to adapt to the pragmatic limitations of my research context (e.g., I could not randomly assign students to classes), which made it difficult to establish an objective, generalizable truth.

Thinking like an empathizer: Interpretivism

One of the things that did not sit right with me about the study was the reliance on final grades to signify everything that was going on with students. I added another quantitative measure that measured research knowledge, but this was still too simplistic. I wanted to understand how students used the book and what they thought about it. I could create survey questions that ask about these things, but to get at the subjective truths here, I thought it best to use focus groups in which students would talk to one another with a researcher moderating the discussion and guiding it using predetermined questions. You will learn more about focus groups in Chapter 18 .

Researchers spoke with small groups of students during the last class of the semester. They prompted people to talk about aspects of the textbook they liked and didn’t like, compare it to textbooks from other classes, describe how they used it, and so forth. It was this focus on  understanding and subjective experience that brought us into the interpretivist paradigm. Alongside other researchers, I created the focus group questions but encouraged researchers who moderated the focus groups to allow the conversation to flow organically.

We originally started out with the assumption, for which there is support in the literature, that students would be angry with the high-cost textbook that we used prior to the free one, and this cost shock might play a role in students’ negative attitudes about research. But unlike the hypotheses in positivism, these are merely a place to start and are open to revision throughout the research process. This is because the researchers are not the experts, the participants are! Just like your clients are the experts on their lives, so were the students in my study. Our job as researchers was to create a group in which they would reveal their informed thoughts about the issue, coming to consensus around a few key themes.


When we initially analyzed the focus groups, we uncovered themes that seemed to fit the data. But the overall picture was murky. How were themes related to each other? And how could we distill these themes and relationships into something meaningful? We went back to the data again. We could do this because there isn’t one truth, as in positivism, but multiple truths and multiple ways of interpreting the data. When we looked again, we focused on some of the effects of having a textbook customized to the course. It was that customization process that helped make the language more approachable, engaging, and relevant to social work practice.

Ultimately, our data revealed differences in how students perceived a free textbook versus a free textbook that is customized to the class. When we went to interpret this finding, the remix hypothesis of open textbooks was helpful in understanding that relationship. It states that the more faculty incorporate editing and creating into the course, the better student learning will be. Our study helped flesh out that theory by discussing the customization process and how students made sense of a customized resource.

In this way, theoretical analysis operates differently in interpretivist research. While positivist research tests existing theories, interpretivist research creates theories based on the stories of research participants. However, it is difficult to say if this theory was totally emergent in the dataset or if my prior knowledge of the remix hypothesis influenced my thinking about the data. Interpretivist researchers are encouraged to put a box around their prior experiences and beliefs, acknowledging them, but trying to approach the data with fresh eyes. Interpretivists know that this is never perfectly possible, though, as we are always influenced by our previous experiences when interpreting data and conducting scientific research projects.

Thinking like an activist: Critical

Adding focus groups helped ease my concern about reducing student learning to just final grades, since they provided a richer set of conversations to analyze. However, my role as researcher and “expert” was still an important part of the analysis. As someone who has been out of school for a while, and indeed has taught this course for years, I have lost touch with what it is like to be a student taking research methods for the first time. How could I accurately interpret or understand what students were saying? Perhaps I would overlook things that reflected poorly on my teaching or my book. I brought other faculty researchers on board to help me analyze the data, but this still didn’t feel like enough.

By luck, an undergraduate student approached me about wanting to work together on a research project. I asked her if she would like to collaborate on evaluating the textbook with me. Over the next year, she assisted me with conceptualizing the project, creating research questions, as well as conducting and analyzing the focus groups. Not only would she provide an “insider” perspective on coding the data, steeped in her lived experience as a student, but she would serve as a check on my power through the process.

Including people from the group you are measuring as part of your research team is a common component of critical research. Ultimately, critical theorists would find my study to be inadequate in many ways. I still developed the research question, created the intervention, and wrote up the results for publication, which privileges my voice and role as “expert.” Instead, critical theorists would emphasize the role of students (community members) in identifying research questions, choosing the best intervention to use, and so forth. But collaborating with students as part of a research team did address some of the power imbalances in the research process.

Critical research projects also aim to have an impact on the people and systems involved in research. No students or researchers had profound personal realizations as a result of my study, nor did it lessen the impact of oppressive structures in society. I can claim some small victory that my department switched to using my textbook after the study was complete (changing a system), though this was likely the result of factors other than the study (my advocacy for open textbooks).

Social work research is almost always designed to create change for people or systems. To that end, every social work project is at least somewhat critical. However, the additional steps of conducting research with people rather than on people reveal a depth to the critical paradigm. By bringing students on board the research team, my study had student perspectives represented in conceptualization, data collection, and analysis. That said, there was much to critique about this study from a critical perspective. I retained a lot of the power in the research process, and students did not have the ability to determine the research question or purpose of the project. For example, students might well have said that textbook costs and the quality of their research methods textbook were less important than student debt, racism, or other issues experienced by students in my class. Instead of a ground-up research process based in community engagement, my research included some important participation by students on a project created and led by faculty.

Conceptualization is an iterative process

I hope this conversation was useful in applying paradigms to a research project. While my example discusses education research, the same would apply for social work research about social welfare programs, clinical interventions, or other topics. Paradigm and theory are covered at the beginning of the conceptualization of your project because these assumptions will structure the rest of your project. Each of the research steps that occur after this chapter (e.g., forming a question, choosing a design) rely upon philosophical and theoretical assumptions. As you continue conceptualizing your project over the next few weeks, you may find yourself shifting between paradigms. That is normal, as conceptualization is not a linear process. As you move through the next steps of conceptualizing and designing a project, you’ll find philosophies and theories that best match how you want to study your topic.

Viewing theoretical and empirical arguments through this lens is one of the true gifts of the social work approach to research. The multi-paradigmatic perspective is a hallmark of social work research and one that helps us contribute something unique on research teams and in practice.

  • Multi-paradigmatic research is a distinguishing hallmark of social work research. Understanding the limitations and strengths of each paradigm will help you justify your research approach and strategically choose elements from one or more paradigms to answer your question.
  • Paradigmatic assumptions help you understand the “blind spots” in your research project and how to adjust and address these areas. Keep in mind, it is not necessary to address all of your blind spots, as all projects have limitations.
  • First, sketch out which paradigm applies best to your project. Second, building on your answer to the exercise in section 7.3, identify how the theory you chose and the paradigm in which you find yourself are consistent or in conflict with one another. For example, if you are using systems theory in a positivist framework, you might talk about how they both rely on a deterministic approach to human behavior with a focus on the status quo and social order.
  • Define and provide an example of an idiographic causal explanation
  • Differentiate between idiographic and nomothetic causal relationships
  • Link idiographic and nomothetic causal relationships with the process of theory building and theory testing
  • Describe how idiographic and nomothetic causal explanations can be complementary

As we transition away from positivism, it is important to highlight the assumptions it makes about the scientific process–the hypothetico-deductive method, sometimes referred to as the research circle.

The hypothetico-deductive method

The primary way that researchers in the positivist paradigm use theories is sometimes called the hypothetico-deductive method (although this term is much more likely to be used by philosophers of science than by scientists themselves). Researchers choose an existing theory. Then, they make a prediction about some new phenomenon that should be observed if the theory is correct. Again, this prediction is called a hypothesis. The researchers then conduct an empirical study to test the hypothesis. Finally, they reevaluate the theory in light of the new results and revise it if necessary.

This process is usually conceptualized as a cycle because the researchers can then derive a new hypothesis from the revised theory, conduct a new empirical study to test the hypothesis, and so on. As Figure 8.8 shows, this approach meshes nicely with the process of conducting a research project, creating a more detailed model of “theoretically motivated” or “theory-driven” research.

[Figure 8.8: The hypothetico-deductive method within the research process]

Keep in mind the hypothetico-deductive method is only one way of using social theory to inform social science research. It starts with describing one or more existing theories, deriving a hypothesis from one of those theories, testing your hypothesis in a new study, and finally reevaluating the theory based on the results of your data analysis. This format works well when there is an existing theory that addresses the research question—especially if the resulting hypothesis is surprising or conflicts with a hypothesis derived from a different theory.
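As a rough, purely illustrative sketch of this cycle (not a real analysis pipeline), the Python snippet below reduces a “theory” to a single predicted effect size and simulates each round of the research circle: derive a prediction, run a noisy “study,” and revise the theory in light of the result. All names and numbers are invented.

```python
import random

# Toy simulation of the hypothetico-deductive cycle described above.
# "Theory" is collapsed into one predicted effect size; each "study" is a noisy
# measurement of a hypothetical true effect.

def research_circle(predicted_effect, true_effect=0.3, rounds=3, seed=42):
    random.seed(seed)
    for i in range(1, rounds + 1):
        hypothesis = predicted_effect > 0                     # derive a testable prediction from the theory
        observed = true_effect + random.gauss(0, 0.1)         # conduct an "empirical study"
        supported = (observed > 0) == hypothesis              # does the evidence support the prediction?
        predicted_effect = (predicted_effect + observed) / 2  # revise the theory toward the evidence
        print(f"round {i}: supported={supported}, revised effect estimate={predicted_effect:.2f}")
    return predicted_effect

research_circle(predicted_effect=0.8)
```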

But what if your research question is more interpretive? What if it is less about theory-testing and more about theory-building? This is what our next chapter covers: the process of inductively deriving theory from people’s stories and experiences. This process looks different than the one depicted in Figure 8.8. It still starts with your research question and answering that question by conducting a research study. But instead of testing a hypothesis you created based on a theory, you will create a theory of your own that explains the data you collected. This format works well for qualitative research questions and for research questions that existing theories do not address.

Inductive reasoning is most commonly found in studies using qualitative methods, such as focus groups and interviews. Because inductive reasoning involves the creation of a new theory, researchers need very nuanced data on how the key concepts in their working question operate in the real world. Qualitative data is often drawn from lengthy interactions and observations with the individuals and phenomena under examination. For this reason, inductive reasoning is most often associated with qualitative methods, though it is used in both quantitative and qualitative research.


Whose truth does science establish?

Social work is concerned with the “isms” of oppression (ableism, ageism, cissexism, classism, heterosexism, racism, sexism, etc.), and so our approach to science must reconcile its history as both a tool of oppression and its exclusion of oppressed groups. Science grew out of the Enlightenment, a philosophical movement which applied reason and empirical analysis to understanding the world. While the Enlightenment brought forth tremendous achievements, the critiques of Marxian, feminist, and other critical theorists complicated the Enlightenment understanding of science. For this section, I will focus on feminist critiques of science, building upon an entry in the Stanford Encyclopedia of Philosophy (Crasnow, 2020). [16]

In its original formulation, science was an individualistic endeavor. As we learned in Chapter 1 , a basic statement of the scientific method is that a researcher studies existing theories on a topic, formulates a hypothesis about what might be true, and either confirms or disconfirms their hypothesis through experiment and rigorous observation. Over time, our theories become more accurate in their predictions and more comprehensive in their conclusions. Scientists put aside their preconceptions, look at the data, and build their theories based on objective rationality.

Yet, this cannot be perfectly true. Scientists are human, after all. As a profession historically dominated by white men, scientists have dismissed women and other minorities as being psychologically unfit for the scientific profession. While attitudes have improved, science, technology, engineering, mathematics (STEM) and related fields remain dominated by white men (Grogan, 2019). [17] Biases can persist in social work theory and research when social scientists do not have similar experiences to the populations they study.

Gender bias can influence the research questions scientists choose to answer. Feminist critiques of medical science drew attention to women’s health issues, spurring research and changing standards of care. The focus on domestic violence in the empirical literature can also be seen as a result of feminist critique. Thus, critical theory helps us critique what is on the agenda for science. If science is to answer important questions, it must speak to the concerns of all people. Through the democratization in access to scientific knowledge and the means to produce it, science becomes a sister process of social development and social justice.

The goal of a diverse and participatory scientific community lies in contrast to much of what we understand to be “proper” scientific knowledge. Many of the older, classic social science theories were developed based on research that observed males or university students in the United States or other Western nations. How these observations were made, what questions were asked, and how the data were interpreted were shaped by the same oppressive forces that existed in broader society, a process that continues into the present. In psychology, the concept of hysteria or hysterical women was believed to be caused by a wandering womb (Tasca et al., 2012). [18] Even today, there are gender biases in diagnoses of histrionic personality disorder and racial biases in diagnoses of psychotic disorders (Klonsky et al., 2002) [19] because the theories underlying them were created in a sexist and racist culture. In these ways, science can reinforce the truth of the white Western male perspective.

Finally, it is important to note that social science research is often conducted on populations rather than with populations. Historically, this has often meant Western men traveling to other countries and seeking to understand other cultures through a Western lens. Lacking cultural humility and failing to engage stakeholders, ethnocentric research of this sort has led to the view of non-Western cultures as inferior. Moreover, the use of these populations as research subjects rather than co-equal participants in the research process privileges the researcher’s knowledge over that of other groups or cultures. Researchers working with indigenous cultures, in particular, had a destructive habit of conducting research for a short time and then leaving, without regard for the impact their study had on the population. These critiques of Western science aim to decolonize social science and dismantle the racist ideas that oppress indigenous and non-Western peoples through research (Smith, 2013). [20]

The central concept in feminist, anti-racist, and decolonization critiques (among other critical frames) is epistemic injustice. Epistemic injustice happens when someone is treated unfairly in their capacity to know something or describe their experience of the world. As described by Fricker (2011), [21] the injustice emerges from the dismissal of knowledge from oppressed groups, discrimination against oppressed groups in scientific communities, and the resulting gap between what scientists can make sense of from their experience and the experiences of people with less power who have lived experience of the topic. We recommend this video from Edinburgh Law School which applies epistemic injustice to studying public health emergencies, disabilities, and refugee services .

[Figure: an independent variable (IV) on the left with an arrow pointing to a dependent variable (DV) on the right]

Positivism relies on nomothetic causality, or the idea that “one event, behavior, or belief will result in the occurrence of another, subsequent event, behavior, or belief.” Earlier, we described one kind of causality: a simple cause-and-effect relationship supported by existing theory and research on the topic, also known as a nomothetic causal relationship. But what if there is not a lot of literature on your topic? What if your question is more exploratory than explanatory? Then, you need a different kind of causal explanation, one that accounts for the complexity of human interactions.

How can we build causal relationships if we are just describing or exploring a topic? Recall the definitions of exploratory research, descriptive research, and explanatory research from Chapter 2. Wouldn’t we need to do explanatory research to build any kind of causal explanation? Explanatory research attempts to establish nomothetic causal relationships: an independent variable is demonstrated to cause change in a dependent variable. Exploratory and descriptive qualitative research contains some causal relationships, but they are actually descriptions of the causal relationships established by the study participants.

What do idiographic causal explanations look like?

An idiographic causal relationship tries to identify the many, interrelated causes that account for the phenomenon the researcher is investigating. So, if idiographic causal explanations do not look like Figure 8.5, 8.6, or 8.7, what do they look like? Instead of saying “x causes y,” your participants will describe their experiences with “x,” which they will tell you was caused and influenced by a variety of other factors, as interpreted through their unique perspective, time, and environment. As we stated before, idiographic causal explanations are messy. Your job as a social science researcher is to accurately describe the patterns in what your participants tell you.

Let’s think about this using an example. If I asked you why you decided to become a social worker, what might you say? For me, I would say that I wanted to be a mental health clinician since I was in high school. I was interested in how people thought, and I was privileged enough to have psychology courses at my local high school. I thought I wanted to be a psychologist, but at my second internship in my undergraduate program, my supervisors advised me to become a social worker because the license provided greater authority for independent practice and flexibility for career change. Once I found out social workers were like psychologists who also raised trouble about social justice, I was hooked.

That’s not a simple explanation at all! But it’s definitely a causal explanation. It is my individual, subjective truth of a complex process. If we were to ask multiple social workers the same question, we might find out that many social workers begin their careers based on factors like personal experience with a disability or social injustice, positive experiences with social workers, or a desire to help others. No one factor is the “most important factor,” like with nomothetic causal relationships. Instead, a complex web of factors, contingent on context, emerge when you interpret what people tell you about their lives.

Understanding “why?”

In creating an idiographic explanation, you are still asking “why?” But the answer is going to be more complex. Those complexities are described in Table 8.1 as well as this short video comparing nomothetic and idiographic relationships .

Table 8.1: Comparing nomothetic and idiographic causal relationships (nomothetic listed first, idiographic second)
  • Paradigm: Positivist vs. Interpretivist
  • Purpose of research: Prediction & generalization vs. Understanding & particularity
  • Reasoning: Deductive vs. Inductive
  • Type of research: Explanatory vs. Exploratory or descriptive
  • Research methods: Quantitative vs. Qualitative
  • Causality: Simple (cause and effect) vs. Complex (context-dependent, sometimes circular or contradictory)
  • Role of theory: Theory testing vs. Theory building

Remember our question from the last section, “Are you trying to generalize or nah?” If you answered nah (or no, like a normal person), you are trying to establish an idiographic causal explanation. The purpose of that explanation isn’t to predict the future or generalize to larger populations, but to describe the here-and-now as it is experienced by individuals within small groups and communities. Idiographic explanations focus less on what is generally experienced by all people and more on the particularities of what specific individuals in a unique time and place experience.

Researchers seeking idiographic causal relationships are not trying to generalize or predict, so they have no need to reduce phenomena to mathematics. In fact, only examining things that can be counted can rob a causal relationship of its meaning and context. Instead, the goal of idiographic causal relationships is understanding, rather than prediction. Idiographic causal relationships are formed by interpreting people’s stories and experiences. Usually, these are expressed through words. Not all qualitative studies use word data, as some can use interpretations of visual or performance art. However, the vast majority of qualitative studies do use word data, like the transcripts from interviews and focus groups or documents like journal entries or meeting notes. Your participants are the experts on their lives—much like in social work practice—and as in practice, people’s experiences are embedded in their cultural, historical, and environmental context.

Idiographic causal explanations are powerful because they can describe the complicated and interconnected nature of human life. Nomothetic causal explanations, by comparison, are simplistic. Think about if someone asked you why you wanted to be a social worker. Your story might include a couple of vignettes from your education and early employment. It might include personal experience with the social welfare system or family traditions. Maybe you decided on a whim to enroll in a social work course during your graduate program. The impact of each of these events on your career is unique to you.

Idiographic causal explanations are concerned with individual stories, their idiosyncrasies, and the patterns that emerge when you collect and analyze multiple people’s stories. This is the inductive reasoning we discussed at the beginning of this chapter. Often, idiographic causal explanations begin by collecting a lot of qualitative data, whether through interviews, focus groups, or looking at available documents or cultural artifacts. Next, the researcher looks for patterns in the data and arrives at a tentative theory for how the key ideas in people’s stories are causally related.

Unlike nomothetic causal relationships, there are no formal criteria (e.g., covariation) for establishing causality in idiographic causal relationships. In fact, some criteria like temporality and nonspuriousness may be violated. For example, if an adolescent client says, “It’s hard for me to tell whether my depression began before my drinking, but both got worse when I was expelled from my first high school,” they are recognizing that it may not be so simple that one thing causes another. Sometimes, there is a reciprocal relationship where one variable (depression) impacts another (alcohol abuse), which then feeds back into the first variable (depression) and into other variables as well (school). Other criteria, such as covariation and plausibility, still make sense, as the relationships you highlight as part of your idiographic causal explanation should still be plausible and their elements should vary together.

Theory building and theory testing

As we learned in the previous section, nomothetic causal explanations are created by researchers applying deductive reasoning to their topic and creating hypotheses using social science theories. Much of what we think of as social science is based on this hypothetico-deductive method, but this leaves out the other half of the equation. Where do theories come from? Are they all just revisions of one another? How do any new ideas enter social science?

Through inductive reasoning and idiographic causal explanations!

Let’s consider a social work example. If you plan to study domestic and sexual violence, you will likely encounter the Power and Control Wheel, also known as the Duluth Model (Figure 8.9). The wheel is a model designed to depict the process of domestic violence. The wheel was developed based on qualitative focus groups conducted by sexual and domestic violence advocates in Duluth, MN. This video explains more about the Duluth Model of domestic abuse.

[Figure 8.9: The Power and Control Wheel (Duluth Model)]

The Power and Control Wheel is an example of what an idiographic causal relationship looks like. By contrast, look back at the previous section’s Figure 8.5, 8.6, and 8.7 on nomothetic causal relationships between independent and dependent variables. See how much more complex idiographic causal explanations are?! They are complex, but not difficult to understand. At the center of domestic abuse is power and control, and while not every abuser would say that is what they were doing, that is the understanding of the survivors who informed this theoretical model. Their power and control is maintained through a variety of abusive tactics from social isolation to use of privilege to avoid consequences.

What about the role of hypotheses in idiographic causal explanations? In nomothetic causal explanations, researchers create hypotheses using existing theory and then test them for accuracy. Hypotheses in idiographic causality are much more tentative and are probably best considered as “hunches” about what the researcher thinks might be true. Importantly, they might indicate the researcher’s prior knowledge and biases before the project begins, but the goal of idiographic research is to let your participants guide you rather than existing social work knowledge. Continuing with our Duluth Model example, advocates likely had some tentative hypotheses about what was important in a relationship with domestic violence. After all, they had worked with this population for years prior to the creation of the model. However, it was the stories of the participants in these focus groups that led to the Power and Control Wheel explanation for domestic abuse.

As qualitative inquiry unfolds, hypotheses and hunches are likely to emerge and shift as researchers learn from what their participants share. Because the participants are the experts in idiographic causal relationships, a researcher should be open to emerging topics and shift their research questions and hypotheses accordingly. This is in contrast to hypotheses in quantitative research, which remain constant throughout the study and are shown to be true or false.

Over time, as more qualitative studies are done and patterns emerge across different studies and locations, more sophisticated theories emerge that explain phenomena across multiple contexts. Once a theory is developed from qualitative studies, a quantitative researcher can seek to test that theory. For example, a quantitative researcher may hypothesize that men who hold traditional gender roles are more likely to engage in domestic violence. That would make sense based on the Power and Control Wheel model, as the category of “using male privilege” speaks to this relationship. In this way, qualitatively-derived theory can inspire a hypothesis for a quantitative research project, as we will explore in the next section.

Complementary approaches

If idiographic and nomothetic still seem like obscure philosophy terms, let’s consider another example. Imagine you are working for a community-based non-profit agency serving people with disabilities. You are putting together a report to lobby the state government for additional funding for community support programs. As part of that lobbying, you are likely to rely on both nomothetic and idiographic causal relationships.

If you looked at nomothetic causal relationships, you might learn how previous studies have shown that, in general, community-based programs like yours are linked with better health and employment outcomes for people with disabilities. Nomothetic causal explanations seek to establish that community-based programs are better for everyone with disabilities, including people in your community.

If you looked at idiographic causal explanations, you would use stories and experiences of people in community-based programs. These individual stories are full of detail about the lived experience of being in a community-based program. You might use one story from a client in your lobbying campaign, so policymakers can understand the lived experience of what it’s like to be a person with a disability in this program. For example, a client who said “I feel at home when I’m at this agency because they treat me like a family member,” or “this is the agency that helped me get my first paycheck,” can communicate richer, more complex causal relationships.

Neither kind of causal explanation is better than the other. A decision to seek idiographic causal explanations means that you will attempt to explain or describe your phenomenon exhaustively, attending to cultural context and subjective interpretations. A decision to seek nomothetic causal explanations, on the other hand, means that you will try to explain what is true for everyone and predict what will be true in the future. In short, idiographic explanations have greater depth, and nomothetic explanations have greater breadth.

Most importantly, social workers understand the value of both approaches to understanding the social world. A social worker helping a client with substance abuse issues seeks idiographic explanations when they ask about that client’s life story, investigate their unique physical environment, or probe their family relationships. At the same time, a social worker also uses nomothetic explanations to guide their interventions. Nomothetic explanations may help guide them to minimize risk factors and maximize protective factors or to use an evidence-based therapy, relying on knowledge about what in general helps people with substance abuse issues.

So, which approach speaks to you? Are you interested in learning about (a) a few people’s experiences in a great deal of depth, or (b) a lot of people’s experiences more superficially, while also hoping your findings can be generalized to a greater number of people? The answer to this question will drive your research question and project. These approaches provide different types of information and both types are valuable.

  • Idiographic causal explanations focus on subjectivity, context, and meaning.
  • Idiographic causal explanations are best suited to exploratory research questions and qualitative methods.
  • Idiographic causal explanations are used to create new theories in social science.
  • Explore the literature on the theory you identified in section 8.1.
  • Read about the origins of your theory. Who developed it and from what data?
  • See if you can find a figure like Figure 8.9 in an article or book chapter that depicts the key concepts in your theory and how those concepts are related to one another causally. Write out a short statement on the causal relationships contained in the figure.
  • List the key terms associated with qualitative research questions
  • Distinguish between qualitative and quantitative research questions

Qualitative research questions differ from quantitative research questions. Because qualitative research questions seek to explore or describe phenomena, not provide a neat nomothetic explanation, they are often more general and openly worded. They may include only one concept, though many include more than one. Instead of asking how one variable causes changes in another, we are instead trying to understand the experiences, understandings, and meanings that people have about the concepts in our research question. These keywords often make an appearance in qualitative research questions.

Let’s work through an example from our last section. In Table 9.1, a student asked, “What is the relationship between sexual orientation or gender identity and homelessness for late adolescents in foster care?” In this question, it is pretty clear that the student believes that adolescents in foster care who identify as LGBTQ+ may be at greater risk for homelessness. This is a nomothetic causal relationship—LGBTQ+ status causes changes in homelessness.

However, what if the student were less interested in predicting homelessness based on LGBTQ+ status and more interested in understanding the stories of foster care youth who identify as LGBTQ+ and may be at risk for homelessness? In that case, the researcher would be building an idiographic causal explanation. The youths whom the researcher interviews may share stories of how their foster families, caseworkers, and others treated them. They may share stories about how they thought of their own sexuality or gender identity and how it changed over time. They may have different ideas about what it means to transition out of foster care.


Because qualitative questions usually center on idiographic causal relationships, they look different than quantitative questions. Table 9.3 below takes the final research questions from Table 9.1 and adapts them for qualitative research. The guidelines for research questions previously described in this chapter still apply, but there are some new elements to qualitative research questions that are not present in quantitative questions.

  • Qualitative research questions often ask about lived experience, personal experience, understanding, meaning, and stories.
  • Qualitative research questions may be more general and less specific.
  • Qualitative research questions may also contain only one variable, rather than asking about relationships between multiple variables.
Table 9.3: Quantitative vs. qualitative research questions
  • Quantitative: How does witnessing domestic violence impact a child’s romantic relationships in adulthood? / Qualitative: How do people who witness domestic violence understand its effects on their current relationships?
  • Quantitative: What is the relationship between sexual orientation or gender identity and homelessness for late adolescents in foster care? / Qualitative: What is the experience of identifying as LGBTQ+ in the foster care system?
  • Quantitative: How does income inequality affect ambivalence in high-density urban areas? / Qualitative: What does racial ambivalence mean to residents of an urban neighborhood with high income inequality?
  • Quantitative: How does race impact rates of mental health diagnosis for children in foster care? / Qualitative: How do African-Americans experience seeking help for mental health concerns?

Qualitative research questions have one final feature that distinguishes them from quantitative research questions: they can change over the course of a study. Qualitative research is a reflexive process, one in which the researcher adapts their approach based on what participants say and do. The researcher must constantly evaluate whether their question is important and relevant to the participants. As the researcher gains information from participants, it is normal for the focus of the inquiry to shift.

For example, a qualitative researcher may want to study how a new truancy rule impacts youth at risk of expulsion. However, after interviewing some of the youth in their community, a researcher might find that the rule is actually irrelevant to their behavior and thoughts. Instead, their participants will direct the discussion to their frustration with the school administrators or the lack of job opportunities in the area. This is a natural part of qualitative research, and it is normal for research questions and hypotheses to evolve based on information gleaned from participants.

However, this reflexivity and openness is unacceptable in quantitative research, for good reasons. Researchers using quantitative methods are testing a hypothesis, and if they could revise that hypothesis to match what they found, they could never be wrong! Indeed, an important component of open science and reproducibility is the preregistration of a researcher’s hypotheses and data analysis plan in a central repository where they can be verified and replicated by reviewers and other researchers. This interactive graphic from 538 shows how an unscrupulous researcher could come up with a hypothesis and theoretical explanation after collecting data by hunting for a combination of factors that results in a statistically significant relationship. This is an excellent example of how the positivist assumptions behind quantitative research and the interpretivist assumptions behind qualitative research result in different approaches to social science.
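A small simulation can show why such after-the-fact hunting is a problem. In the hypothetical sketch below, an outcome and 200 candidate “predictors” are all pure noise, yet the strongest correlation among them still looks respectable; inventing a hypothesis to explain that result after seeing the data is exactly what preregistration guards against. The sample size, number of candidates, and variable names are illustrative only, not taken from the 538 graphic or any real study.

```python
import random

# Generate a pure-noise outcome and many pure-noise candidate predictors,
# then report the strongest correlation found among them.

random.seed(1)
n, candidates = 50, 200
outcome = [random.gauss(0, 1) for _ in range(n)]

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

best = max(
    abs(corr([random.gauss(0, 1) for _ in range(n)], outcome))
    for _ in range(candidates)
)
print(f"strongest correlation among {candidates} pure-noise predictors: r = {best:.2f}")
# With these settings the maximum is often in the 0.3-0.4 range, large enough to
# dress up as a "finding" if the hypothesis were written after the fact.
```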

  • Qualitative research questions often contain words or phrases like “lived experience,” “personal experience,” “understanding,” “meaning,” and “stories.”
  • Qualitative research questions can change and evolve over the course of the study.
  • Using the guidance in this chapter, write a qualitative research question. You may want to use some of the keywords mentioned above.
  • Kivuna, C., & Kuyini, A. B. (2017). Understanding and applying research paradigms in educational contexts. International Journal of Higher Education, 6(5), 26-41. https://eric.ed.gov/?id=EJ1154775
  • Kuhn, T. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
  • Fleuridas, C., & Krafcik, D. (2019). Beyond four forces: The evolution of psychotherapy. Sage Open, 9(1), 2158244018824492.
  • Shneider, A. M. (2009). Four stages of a scientific discipline; four types of scientist. Trends in Biochemical Sciences, 34(5), 217-233. https://doi.org/10.1016/j.tibs.2009.02.00
  • Burrell, G., & Morgan, G. (1979). Sociological paradigms and organizational analysis. Routledge.
  • Guba, E. (Ed.). (1990). The paradigm dialog. SAGE.
  • Burrell, G., & Morgan, G. (1979). Sociological paradigms and organizational analysis. Here is a summary of Burrell & Morgan from Babson College; our classification collapses radical humanism and radical structuralism into the critical paradigm, following Guba and Lincoln's three-paradigm framework. We feel this approach is more parsimonious and easier for students to understand on an introductory level.
  • For more about how the meanings of hand gestures vary by region, you might read the following blog entry: Wong, W. (2007). The top 10 hand gestures you'd better get right. Retrieved from: http://www.languagetrainers.co.uk/blog/2007/09/24/top-10-hand-gestures
  • Rosario, M., Schrimshaw, E. W., Hunter, J., & Levy-Warren, A. (2009). The coming-out process of young lesbian and bisexual women: Are there butch/femme differences in sexual identity development? Archives of Sexual Behavior, 38(1), 34-49.
  • Calhoun, C., Gerteis, J., Moody, J., Pfaff, S., & Virk, I. (Eds.). (2007). Classical sociological theory (2nd ed.). Malden, MA: Blackwell.
  • Fraser, N. (1989). Unruly practices: Power, discourse, and gender in contemporary social theory. Minneapolis, MN: University of Minnesota Press.
  • Here are links to two HBSE open textbooks, if you are unfamiliar with social work theories and would like more background: https://uark.pressbooks.pub/hbse1/ and https://uark.pressbooks.pub/humanbehaviorandthesocialenvironment2/
  • Box, G. E. P. (1976). Science and statistics. Journal of the American Statistical Association, 71(356), 791.
  • Heineman-Pieper, J., Tyson, K., & Pieper, M. H. (2002). Doing good science without sacrificing good values: Why the heuristic paradigm is the best choice for social work. Families in Society, 83(1), 15-28.
  • Crasnow, S. (2020). Feminist perspectives on science. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2020 ed.). Retrieved from: https://plato.stanford.edu/entries/feminist-science/
  • Grogan, K. E. (2019). How the entire scientific community can confront gender bias in the workplace. Nature Ecology & Evolution, 3, 3-6. doi:10.1038/s41559-018-0747-4
  • Tasca, C., Rapetti, M., Carta, M. G., & Fadda, B. (2012). Women and hysteria in the history of mental health. Clinical Practice and Epidemiology in Mental Health, 8, 110-119.
  • Klonsky, E. D., Jane, J. S., Turkheimer, E., & Oltmanns, T. F. (2002). Gender role and personality disorders. Journal of Personality Disorders, 16(5), 464-476.
  • Smith, L. T. (2013). Decolonizing methodologies: Research and indigenous peoples. Zed Books Ltd.
  • Fricker, M. (2011). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.


Interpretivism: a paradigm based on the idea that social context and interaction frame our realities.

Critical paradigm: a paradigm in social science research focused on power, inequality, and social change.

Pragmatism: a research paradigm that suspends questions of philosophical ‘truth’ and focuses more on how different philosophies, theories, and methods can be used strategically to resolve a problem or question within the researcher's unique context.

Hypothetico-deductive method (the research circle): a cyclical process of theory development, starting with an observed phenomenon, then developing or using a theory to make a specific prediction of what should happen if that theory is correct, testing that prediction, refining the theory in light of the findings, and using that refined theory to develop new hypotheses, and so on.

Epistemic injustice: when someone is treated unfairly in their capacity to know something or describe their experience of the world.

Exploratory research: conducted during the early stages of a project, usually when a researcher wants to test the feasibility of conducting a more extensive study or if the topic has not been studied in the past.

Descriptive research: research that describes or defines a particular phenomenon.

Explanatory research: explains why particular phenomena work in the way that they do; answers “why” questions.

Idiographic causal explanation: attempts to explain or describe your phenomenon exhaustively, based on the subjective understandings of your participants.

"Assuming that the null hypothesis is true and the study is repeated an infinite number times by drawing random samples from the same populations(s), less than 5% of these results will be more extreme than the current result" (Cassidy et al., 2019, p. 233).

Scientific Inquiry in Social Work (2nd Edition) Copyright © 2020 by Matthew DeCarlo, Cory Cummings, and Kate Agnelli is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.


Introduction to qualitative research methods - Part I

Affiliation.

  • 1 Department of Global Health and Social Medicine, King's College London, London, United Kingdom.
  • PMID: 36909216
  • PMCID: PMC10003579
  • DOI: 10.4103/picr.picr_253_22

Qualitative research methods are widely used in the social sciences and the humanities, but they can also complement quantitative approaches used in clinical research. In this article, we discuss the key features and contributions of qualitative research methods.

Keywords: Qualitative research; social sciences; sociology.

Copyright: © 2023 Perspectives in Clinical Research.


Conflict of interest statement

There are no conflicts of interest.

[Article figures: Examples of qualitative research techniques; Developing a research methodology; Key features of qualitative research methods]




Library Support for Qualitative Research


QDA Software


  • Campus Access
  • Free download available for Harvard Faculty of Arts and Sciences (FAS) affiliates
  • Desktop access at Lamont Library Media Lab, 3rd floor
  • Desktop access at Harvard Kennedy School Library (with HKS ID)
  • Remote desktop access for Harvard affiliates from  IQSS Computer Labs . Email them at  [email protected] and ask for a new lab account and remote desktop access to NVivo.
  • Virtual Desktop Infrastructure (VDI) access available to Harvard T.H. Chan School of Public Health affiliates.

Qualitative data analysis methods should flow from, or align with, the methodological paradigm chosen for your study, whether that paradigm is interpretivist, critical, positivist, or participative in nature (or a combination of these). Some established methods include Content Analysis, Critical Analysis, Discourse Analysis, Gestalt Analysis, Grounded Theory Analysis, Interpretive Analysis, Narrative Analysis, Normative Analysis, Phenomenological Analysis, Rhetorical Analysis, and Semiotic Analysis, among others. The following resources should help you navigate your methodological options and put into practice methods for coding, themeing, interpreting, and presenting your data.
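As a deliberately oversimplified illustration of what “coding” and “themeing” mean in practice, the sketch below tags a few invented interview excerpts with codes from a made-up codebook and tallies them. Real qualitative coding (in NVivo or any other QDA package) is an interpretive, iterative judgment process rather than keyword matching; this snippet only shows the bookkeeping idea, and every excerpt, code, and keyword in it is hypothetical.

```python
from collections import Counter

# Invented excerpts and codebook, for illustration only.
excerpts = [
    "I felt supported by my caseworker once I finally told her what was going on.",
    "The free textbook meant I didn't have to choose between rent and class materials.",
    "Nobody asked what I wanted; decisions were made for me.",
]

codebook = {            # code -> cues an analyst might associate with it (a judgment call)
    "support": ["supported", "helped"],
    "cost": ["free", "rent", "afford"],
    "power": ["decisions were made", "nobody asked"],
}

tally = Counter()
for excerpt in excerpts:
    for code, cues in codebook.items():
        if any(cue in excerpt.lower() for cue in cues):
            tally[code] += 1     # count each code at most once per excerpt

print(tally)  # e.g., Counter({'support': 1, 'cost': 1, 'power': 1})
```

In an actual analysis, the analyst would read each excerpt, assign codes by interpretation rather than string matching, and then group related codes into broader themes; software mainly helps with the record-keeping shown here.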

  • Sage Research Methods (SRM)   Users can browse content by topic, discipline, or format type (reference works, book chapters, definitions, etc.). SRM offers several research tools as well: a methods map, user-created reading lists, a project planner, and advice on choosing statistical tests.
  • Abductive Coding: Theory Building and Qualitative (Re)Analysis by Vila-Henninger, et al.  The authors recommend an abductive approach to guide qualitative researchers who are oriented towards theory-building. They outline a set of tactics for abductive analysis, including the generation of an abductive codebook, abductive data reduction through code equations, and in-depth abductive qualitative analysis.  
  • Analyzing and Interpreting Qualitative Research: After the Interview by Charles F. Vanover, Paul A. Mihas, and Johnny Saldana (Editors)   Providing insight into the wide range of approaches available to the qualitative researcher and covering all steps in the research process, the authors utilize a consistent chapter structure that provides novice and seasoned researchers with pragmatic, "how-to" strategies. Each chapter author introduces the method, uses one of their own research projects as a case study of the method described, shows how the specific analytic method can be used in other types of studies, and concludes with three questions/activities to prompt class discussion or personal study.   
  • "Analyzing Qualitative Data." Theory Into Practice 39, no. 3 (2000): 146-54 by Margaret D. LeCompte   This article walks readers though rules for unbiased data analysis and provides guidance for getting organized, finding items, creating stable sets of items, creating patterns, assembling structures, and conducting data validity checks.  
  • "Coding is Not a Dirty Word" in Chapter 1 (pp. 1–30) of Enhancing Qualitative and Mixed Methods Research with Technology by Shalin Hai-Jew (Editor)   Current discourses in qualitative research, especially those situated in postmodernism, represent coding and the technology that assists with coding as reductive, lacking complexity, and detached from theory. In this chapter, the author presents a counter-narrative to this dominant discourse in qualitative research. The author argues that coding is not necessarily devoid of theory, nor does the use of software for data management and analysis automatically render scholarship theoretically lightweight or barren. A lack of deep analytical insight is a consequence not of software but of epistemology. Using examples informed by interpretive and critical approaches, the author demonstrates how NVivo can provide an effective tool for data management and analysis. The author also highlights ideas for critical and deconstructive approaches in qualitative inquiry while using NVivo. By troubling the positivist discourse of coding, the author seeks to create dialogic spaces that integrate theory with technology-driven data management and analysis, while maintaining the depth and rigor of qualitative research.   
  • The Coding Manual for Qualitative Researchers by Johnny Saldana   An in-depth guide to the multiple approaches available for coding qualitative data. Clear, practical and authoritative, the book profiles 32 coding methods that can be applied to a range of research genres from grounded theory to phenomenology to narrative inquiry. For each approach, Saldaña discusses the methods, origins, a description of the method, practical applications, and a clearly illustrated example with analytic follow-up. Essential reading across the social sciences.  
  • Flexible Coding of In-depth Interviews: A Twenty-first-century Approach by Nicole M. Deterding and Mary C. Waters The authors suggest steps in data organization and analysis to better utilize qualitative data analysis technologies and support rigorous, transparent, and flexible analysis of in-depth interview data.  
  • From the Editors: What Grounded Theory is Not by Roy Suddaby Walks readers through common misconceptions that hinder grounded theory studies, reinforcing the two key concepts of the grounded theory approach: (1) constant comparison of data gathered throughout the data collection process and (2) the determination of which kinds of data to sample in succession based on emergent themes (i.e., "theoretical sampling").  
  • “Good enough” methods for life-story analysis, by Wendy Luttrell. In Quinn N. (Ed.), Finding culture in talk (pp. 243–268). Demonstrates for researchers of culture and consciousness who use narrative how to concretely document reflexive processes in terms of where, how and why particular decisions are made at particular stages of the research process.   
  • The Ethnographic Interview by James P. Spradley   “Spradley wrote this book for the professional and student who have never done ethnographic fieldwork (p. 231) and for the professional ethnographer who is interested in adapting the author’s procedures (p. iv) ... Steps 6 and 8 explain lucidly how to construct a domain and a taxonomic analysis” (excerpted from book review by James D. Sexton, 1980). See also: Presentation slides on coding and themeing your data, derived from Saldana, Spradley, and LeCompte.
  • Qualitative Data Analysis by Matthew B. Miles; A. Michael Huberman   A practical sourcebook for researchers who make use of qualitative data, presenting the current state of the craft in the design, testing, and use of qualitative analysis methods. Strong emphasis is placed on data displays (matrices and networks) that go beyond ordinary narrative text. Each method of data display and analysis is described and illustrated.
  • "A Survey of Qualitative Data Analytic Methods" in Chapter 4 (pp. 89–138) of Fundamentals of Qualitative Research by Johnny Saldana   Provides an in-depth introduction to coding as a heuristic, particularly focusing on process coding, in vivo coding, descriptive coding, values coding, dramaturgical coding, and versus coding. Includes advice on writing analytic memos, developing categories, and themeing data.   
  • "Thematic Networks: An Analytic Tool for Qualitative Research." Qualitative Research : QR, 1(3), 385–405 by Jennifer Attride-Stirling Details a technique for conducting thematic analysis of qualitative material, presenting a step-by-step guide of the analytic process, with the aid of an empirical example. The analytic method presented employs established, well-known techniques; the article proposes that thematic analyses can be usefully aided by and presented as thematic networks.  
  • Using Thematic Analysis in Psychology by Virginia Braun and Victoria Clarke   Walks readers through the process of reflexive thematic analysis, step by step. The method may be adapted in fields outside of psychology as relevant. Pair this with One Size Fits All? What Counts as Quality Practice in Reflexive Thematic Analysis? by Virginia Braun and Victoria Clarke
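To make the bookkeeping behind coding and themeing concrete, the following is a minimal, hypothetical Python sketch of how coded interview excerpts might be grouped into candidate themes. It is not tied to any of the methods or tools above, and the participant IDs, excerpts, code labels, and theme groupings are invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (participant ID, excerpt, assigned code).
# In practice these would come from your own transcripts and codebook.
coded_excerpts = [
    ("P01", "I never hear back after the referral goes in.", "waiting for care"),
    ("P02", "The nurse actually listened to what I wanted.", "feeling heard"),
    ("P03", "Every appointment is with someone new.", "lack of continuity"),
    ("P04", "Months go by before anyone follows up.", "waiting for care"),
]

# A draft grouping of related codes under candidate themes.
candidate_themes = {
    "Access and delay": ["waiting for care", "lack of continuity"],
    "Therapeutic relationship": ["feeling heard"],
}

# Collect the excerpts that support each candidate theme, so every theme
# can be checked back against the data it is meant to summarise.
theme_support = defaultdict(list)
for theme, theme_codes in candidate_themes.items():
    for participant, excerpt, code in coded_excerpts:
        if code in theme_codes:
            theme_support[theme].append((participant, code, excerpt))

for theme, extracts in theme_support.items():
    print(f"{theme} ({len(extracts)} supporting excerpts)")
    for participant, code, excerpt in extracts:
        print(f"  [{participant}] ({code}) {excerpt}")
```

Dedicated QDA packages handle this kind of code-to-theme bookkeeping at scale; the point of the sketch is simply that themes should remain traceable to the coded extracts that support them.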

Data visualization can be employed formatively, to aid your data analysis, or summatively, to present your findings. Many qualitative data analysis (QDA) software platforms, such as NVivo, feature search functionality and data visualization options to aid data analysis during the formative stages of your project.
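For a quick formative view outside of a dedicated QDA platform, a simple code-frequency chart can serve a similar purpose. The sketch below is purely illustrative: it assumes you have already tallied how often each code was applied (the codes and counts here are made up), and it uses matplotlib only as one convenient option.

```python
import matplotlib.pyplot as plt

# Hypothetical tallies of how often each code was applied across transcripts.
code_counts = {
    "waiting for care": 14,
    "feeling heard": 9,
    "lack of continuity": 7,
    "carer involvement": 5,
}

codes = list(code_counts)
counts = [code_counts[c] for c in codes]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(codes, counts)  # horizontal bars keep the code labels readable
ax.set_xlabel("Number of coded excerpts")
ax.set_title("Formative view of code frequencies")
fig.tight_layout()
fig.savefig("code_frequencies.png", dpi=150)  # or plt.show() in an interactive session
```

Treat counts like these only as a prompt to go back into the data; in qualitative analysis the frequency of a code is not a measure of its importance.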

For expert assistance creating data visualizations to present your research, Harvard Library offers Visualization Support. Get help and training with data visualization design and tools—such as Tableau—for the Harvard community. Workshops and one-on-one consultations are also available.

The quality of your data analysis depends on how you situate what you learn within a wider body of knowledge. Consider the following advice:

A good literature review has many obvious virtues. It enables the investigator to define problems and assess data. It provides the concepts on which percepts depend. But the literature review has a special importance for the qualitative researcher. This consists of its ability to sharpen his or her capacity for surprise (Lazarsfeld, 1972b). The investigator who is well versed in the literature now has a set of expectations the data can defy. Counterexpectational data are conspicuous, readable, and highly provocative data. They signal the existence of unfulfilled theoretical assumptions, and these are, as Kuhn (1962) has noted, the very origins of intellectual innovation. A thorough review of the literature is, to this extent, a way to manufacture distance. It is a way to let the data of one's research project take issue with the theory of one's field.

- McCracken, G. (1988), The Long Interview, Sage: Newbury Park, CA, p. 31

Once you have coalesced around a theory, realize that a theory should reveal rather than color your discoveries. Allow your data to guide you to what's most suitable. Grounded theory researchers may develop their own theory where current theories fail to provide insight. This guide on Theoretical Models from Alfaisal University Library provides a helpful overview on using theory.

If you'd like to supplement what you learned about relevant theories through your coursework and literature review, try these sources:

  • Annual Reviews   Review articles sum up the latest research in many fields, including social sciences, biomedicine, life sciences, and physical sciences. These are timely collections of critical reviews written by leading scientists.  
  • HOLLIS - search for resources on theories in your field   Modify this example search by entering the name of your field in place of "your discipline," then hit search.  
  • Oxford Bibliographies   Written and reviewed by academic experts, every article in this database is an authoritative guide to the current scholarship in a variety of fields, containing original commentary and annotations.  
  • ProQuest Dissertations & Theses (PQDT)   Indexes dissertations and masters' theses from most North American graduate schools as well as some European universities. Provides full text for most indexed dissertations from 1990-present.  
  • Very Short Introductions   Launched by Oxford University Press in 1995, Very Short Introductions offer concise introductions to a diverse range of subjects from Climate to Consciousness, Game Theory to Ancient Warfare, Privacy to Islamic History, Economics to Literary Theory.


Open access | Published: 11 September 2024

What does ‘safe care’ mean in the context of community-based mental health services? A qualitative exploration of the perspectives of service users, carers, and healthcare providers in England

Phoebe Averill, Bryher Bowness, Claire Henderson & Nick Sevdalis

BMC Health Services Research, volume 24, Article number: 1053 (2024)


Background

Having traditionally received limited attention in empirical research and safety improvement agendas, issues of patient safety in mental healthcare increasingly feature in healthcare quality improvement discourses. Dominant approaches to safety stem from narrow clinical risk management perspectives, yet existing evidence points to the limitations of this characterisation. Although mental healthcare is increasingly delivered in community-based settings, narratives of safety pertain primarily to hospital-based mental healthcare. Therefore, through exploratory qualitative interviews and one focus group, we aimed to examine how service users, carers, and healthcare providers conceptualise ‘patient safety’ in community-based mental healthcare.

Methods

Semi-structured interviews and a single focus group were conducted with users of UK community-based mental healthcare provision for adults (n = 13), their carers (n = 12), and providers (n = 18), who were diverse in characteristics and experiences. Study data were analysed in accordance with a reflexive approach to thematic analysis.

Results

Four key themes were developed, reflecting contrasting conceptualisations of safety in this care context, where participant thinking evolved throughout discussions. ‘Systemic inertia: threats to safety’ characterises the entrenched, systemic challenges which rendered participants powerless to advocate for or deliver safe care. ‘Managing the risks service users present’ equates ‘safe care’ to the mitigation of risks service users may pose to themselves or others when unwell, or risks from those around them. ‘More than responding to risks: everyone plays a role in creating safety’ recognises providers’ agency in causing or proactively preventing patient harm. Finally, ‘The goals of ‘safety’: our destination is not yet in sight’ positions safety as a work in progress, calling for ambitious safety agendas, giving primacy to goals which meaningfully improve service users’ lives.

Conclusions

Our findings have implications for the understanding and improvement of patient safety in community-based mental healthcare settings, where improvement objectives should extend beyond the management of risks and prevention of deterioration to address patient- and carer-centred concerns, including practices that make people feel unsafe.


Improving healthcare safety and reducing avoidable harm are global imperatives [ 1 ]. Until recently, safety optimisation efforts have centred primarily on physical healthcare settings, including acute care hospitals [ 2 , 3 , 4 ]. Nevertheless, emerging research has begun to illuminate the nature and causation of safety problems within mental healthcare [ 5 , 6 , 7 , 8 , 9 , 10 , 11 ]. These works advance insights into inpatient safety, or that of various mental health settings more broadly. Safety challenges specific to community-based mental health services, where most mental healthcare is delivered, warrant focused scientific enquiry.

Although the field lacks coherent definitions and models of patient safety in mental healthcare, recognised safety concerns include generic incidents, such as medication errors [ 12 ], and specialty-specific issues, such as self-harm or the use of restrictive interventions [ 13 , 14 , 15 ]. Care in community-based mental health services may present distinct safety considerations. Lengthy waiting times, infrequent care encounters, and insufficient involvement of family carers in risk assessment constitute particular concerns [ 5 , 7 , 16 , 17 ].

Historically, ‘patient safety’ has been appraised through a narrow clinical risk management lens [ 18 ]. This is pertinent in mental healthcare, where the management of risks to self and others is a key aspect of clinical practice [ 19 , 20 ]. That iatrogenic risks appear less widely considered perhaps marks a distinction between mental healthcare and other health specialties [ 20 , 21 ]. Nevertheless, existing research demonstrates that service users can play a valuable role in safety improvement [ 22 ], and may conceptualise safety differently to healthcare providers [ 23 , 24 , 25 ]. Alongside providers, it is vital that service users and carers are involved both in defining ‘safety’ and in shaping meaningful improvement agendas [ 26 ]. Accordingly, the present study aimed to examine service users’, carers’, and healthcare providers’ subjective experiences and conceptualisations of ‘patient safety’ in community-based mental healthcare. Specifically, we sought to understand what ‘safe’ and, conversely, ‘unsafe care’ meant to these participant groups.

An exploratory design was adopted, involving semi-structured qualitative interviews and a single focus group with service users, carers, and healthcare providers. Participants took part in their choice of an individual interview or a focus group with others from the same respondent group. Compared to individual interviews, focus groups may yield additional layers of insight as a product of interaction between participants. Nonetheless, it is plausible that deeper understanding of individual participants’ subjective experiences may be achieved through individual interviews [ 27 ]. We expected that participants may describe distressing personal or professional experiences on the topic of patient safety problems in community-based mental health services. Therefore, presenting participants with the choice of an individual interview or a group discussion constituted an important ethical decision, allowing participants to take part in a format in which they felt most comfortable and able to speak with candour. Study methods were approved by a Welsh National Health Service (NHS) research ethics committee (IRAS ID: 279409) and prospectively registered (identifier: NCT04866693). Research reporting corresponds to the Standards for Reporting Qualitative Research [ 28 ].

Patient and public involvement

Through group workshops and individual consultations, seven service users and carers with lived experience of mental health services provided guidance on participant selection considerations, interview questioning, topic guide piloting, and refinement of participant-facing materials. A healthcare provider topic guide was also piloted and modified according to feedback from two clinicians. Based upon contributor feedback, we focused on both specialist and non-specialist mental healthcare provision, as general practitioner (GP) support represented the only care available for many people with mental health problems.

Participants and setting

Eligible participants had current or past-year experience of contact with, or employment within, community-based mental health services for working-age adults within England, including primary care (e.g. GP support with psychotropic medication management) or specialist mental healthcare provision (e.g. community mental health team; CMHT input). Participants were required to be 18 years old or over and able to speak sufficient English to provide informed consent and to take part in an interview, owing to a lack of resource for translation costs. Informed by comparable existing literature, where people from minority ethnic groups, male participants, and those with experience of severe mental illness (e.g. schizophrenia, bipolar disorder) have typically been underrepresented [ 5 , 6 , 7 , 29 ], targeted efforts were made to ensure that a diversity of voices was heard. Accordingly, we selected participants purposively according to gender, ethnicity, age, mental health diagnosis (if applicable), and service(s) experienced. All individuals who were screened for eligibility were informed of the possibility that they may not be recruited to the study, where we had already heard from multiple participants sharing similar characteristics. We notified those we were unable to recruit within two months of screening and ascertained their interest in being contacted should future involvement opportunities arise.

The decision to stop sampling was guided by ‘information power’ principles, an indicator of internal validity based on the notion that the greater the ‘information power’ a sample provides, the fewer participants are needed for adequate analysis [ 30 ]. Adequacy of our sample was appraised according to multiple interrelated facets of ‘information power’, such as the breadth of our study aims, sample diversity, and data quality. The achieved sample was diverse in terms of participant characteristics and perspectives.

Data collection

Recruitment and data collection were conducted by one researcher (PA) over a nine-month period (May 2021—February 2022). Advertisement flyers were distributed within a local English NHS mental health trust. An open recruitment call saw flyers shared online, through newsletters, and via pre-established service user and carer engagement groups. Participants contacted the research team directly to express interest in participating in their choice of an individual interview or focus group discussion.

Consenting participants primarily took part virtually via videoconferencing software, although two interviewees opted to participate in person. Except for one focus group comprising seven healthcare providers, all other participants expressed a preference to take part in individual interviews (n = 36), lasting between 26 and 89 (M = 47) minutes. Separate topic guides for service users, carers, and providers were developed for this study and used flexibly to guide discussions (Supplementary Materials 1, 2, 3). Participants were first asked contextual questions, and then to define patient safety in their own language, reflecting approaches used in prior research [ 7 , 23 ]. Service users and carers were offered a £15.00 voucher for their contributions, and all participants were thanked and provided with information about available sources of support, should they wish to seek it after taking part. Audio-recordings from interviews and the single focus group were transcribed verbatim and anonymised.

Methodological perspectives and reflexivity

Underpinned by critical realism, the ontological notion that reality exists, but that it is independent of human observation and cannot be accessed directly [ 31 , 32 ], participants’ narratives were examined as representations of reality shaped by subjective experiences. To generate knowledge on the topic under investigation, it follows that participants’ subjective representations of phenomena must be interpreted [ 33 ]. Within reflexive thematic analysis, the analytical approach adopted, researcher subjectivity is acknowledged as the primary tool for generating findings [ 34 ]. Analysis was undertaken primarily by one researcher (PA, White British female), with multiple coding on several transcripts conducted by a second researcher (BB, White British female). We recognise that our own experiences, along with the social, economic, and political context contemporaneous to this study will have shaped our analysis. The lead author (PA) is currently an outsider to all groups at the centre of this research, though has prior experience of working in unqualified positions within various inpatient mental health services, including settings which at times felt unsafe for service users and staff alike. The second author (BB), a qualified mental health nurse, has lived experience of using services and of working within community-based mental healthcare settings. These experiences undoubtedly shaped our interpretations of meaning within the present study. Likewise, we were conscious of additional barriers to delivering safe care that providers faced during the COVID-19 pandemic.

To examine patterns of meaning across the qualitative dataset, reflexive thematic analysis was conducted by the first author (PA) using methods outlined by Braun and Clarke [ 34 , 35 ]. Braun and Clarke’s approach to reflexive thematic analysis has been widely used within the psychological sciences and across a range of research disciplines; it emphasises the active and iterative role of the researcher in generating meaning through analysis [ 34 , 35 ]. The analysis was inductive and data-driven, comprising six phases (Table  1 ; phases 3–5 were performed recursively rather than linearly). First, audio-recordings, deidentified transcripts, and researcher notes were iteratively reviewed to gain familiarity with the dataset. Using NVivo software, transcripts were then systematically coded at semantic and latent levels, capturing both explicit and conceptual meanings (Supplementary Material 4 ). Multiple coding was conducted on three transcripts by a second researcher (BB) to explore alternative data interpretations, which were then discussed between both coders (PA, BB). Codes were further refined throughout the analysis and initial themes were generated through grouping of codes representing similar concepts or shared meanings. Theme fidelity was appraised against coded data extracts, resulting in reorganisation. Analytical progress was discussed regularly amongst all authors, and four themes were retained, defined, and labelled.

Participant characteristics

Of 97 expressions of interest received, 43 participants were recruited, comprising 13 service users, 12 carers, and 18 healthcare providers. Participant characteristics are presented in Table 2. A further fifty-four people who expressed interest in taking part were not recruited, for the following reasons: failure to provide information required for eligibility screening (n = 27); not meeting study eligibility criteria, for example where experience related to mental healthcare provision for children and adolescents (n = 13); being unable to schedule an interview (n = 2). In line with our purposive recruitment strategy, the remaining 12 people were not recruited because they shared characteristics with multiple other participants who had already taken part (e.g. female gender, White ethnicity).

Overview of findings

Four interpretative themes and several subthemes were developed, representing contrasting conceptualisations of patient safety in community-based mental health services (Table 3). Our thematic structure elucidates the ways in which notions of safety problems, their origins, and approaches to resolution were positioned within participant narratives. Each theme depicts a contrasting discourse as to whether safety risks are (1) driven by systemic challenges, (2) driven by service users, (3) driven by responsible care teams, or (4) surmountable with an ambitious safety agenda centred on outcomes of importance to service users and carers. These ideas are not mutually exclusive: participants typically espoused views aligning with several themes as their conceptualisations of ‘safe care’ evolved throughout discussions.

Theme 1: systemic inertia: threats to safety

The first theme sets out the position that complex, systemic issues make community-based mental healthcare less safe. Accordingly, ‘unsafe care’ is presented as the product of these ingrained challenges. Healthcare providers appeared to situate systems-level challenges as beyond their own locus of control, at times apportioning responsibility to other care teams. However, whilst recognising entrenched safety threats within ‘the system’, service users and carers did not view individual providers and care teams as entirely blameless or external to their causation. All participant groups nevertheless faced the same experiential outcome: a shared sense of powerlessness.

Timely access was central to conceptualisations of safe care:

It’s not about panic buttons and CCTV. How about making mental healthcare available to people when they need it? (Carer, CMHT, individual interview)

Negotiating service access, from GP consultations through to crisis care, was likened to ‘moving mountains’ (Carer, Crisis team, individual interview). Access appeared contingent upon the advocacy of carers, who acquired expertise in the language of service navigation:

You have to say the words that get people moving: ‘risk’, ‘harm’, ‘complaint’, ‘crisis’…words you have to know to get something done. (Carer, Crisis team, individual interview)

Delayed, mismanaged, and inappropriately rejected referrals to specialist community-based mental health services constituted a particular obstacle to safe care, seemingly contributing to deterioration, hopelessness, and self-harm. For apparent mood disorders, GPs were obliged to evidence trialling several unsuccessful prior treatments for referral acceptance. For one individual, accurate diagnosis (bipolar disorder) and appropriate treatment were delayed, owing to late entry into specialist services:

It took years of fiddling with medications [in primary care] before I got to see a psychiatrist and in that time this life that I’d been trying to hold onto imploded. (Service user, CMHT, individual interview)

Once the case for referral was established, lengthy waits for assessment and initial outpatient consultations introduced further risks. These waiting periods were a collective concern. However, providers were thought not to appreciate the gravity of such delays:

Time is the enemy here…I don’t think they understand how each setback and every single delay, it just makes it more likely that my partner’s going to die. (Carer, CMHT, individual interview)

Access challenges were not resolved upon entering specialist mental health services. Waits spanning months to several years for autism assessments, or specialist psychological therapies, further obstructed safety. Moral injury—distress or internal conflict triggered by situations which contravene a provider’s personal or professional principles—was apparent where service users needed urgent input, yet providers perceived limited capacity to alleviate delays:

It feels like you are condemning someone to deteriorate because you’re putting them on such a long waiting list and contributing to their hopelessness…I think it does raise people’s risk of self-harm, of deterioration. Puts people in harm’s way really. (Clinical psychologist, Crisis team, individual interview)

Moreover, participant narratives suggested that care intensity was inadequate for maintaining safety and preventing harm. Staffing time limits, alongside closures of voluntary sector resources and day centres, meant that providers struggled to offer suitable support. Consequently, patient safety and public safety incidents were foreseeable:

I just don’t feel like we’re meeting his needs…a bit more input and attention might have avoided these situations blowing up. (Nurse, CMHT, focus group)

Service users and carers often communicated notions of ‘patient safety’ through the lens of feeling safe or unsafe within interactions with providers, or the community-based mental healthcare system as a whole. System-driven shortfalls in care provision seemed to threaten service users’ psychological sense of safety. This was particularly pertinent in conversations about crisis care:

I worry it will become so out of control because I didn’t get help soon enough…they won’t do anything and won’t notice, and I’ll just be gone in my mind (Service user, CMHT, individual interview)

Imbalances between demands for care and service offer were especially apparent in emergency situations. Crisis care, including out-of-hours provision, was characterised as inaccessible or insufficient to avert immediate safety risks. Challenges in reaching the crisis team via telephone, or refusals to conduct home visits, meant service users and carers felt unsafe. Even where daily care was provided, participants regarded crisis team care as an ineffective alternative to hospitalisation:

If they’re only there for one hour of one day…they can’t physically prevent you from killing yourself. (Carer, Crisis team, individual interview)

‘Falling through the gaps’ could be regarded as a further system-driven safety threat. ‘Gaps’ ranged from concrete and tangible, including a lack of services to meet service users’ needs, through to the relational, reflecting communication divides between care teams. ‘Gaps’ also manifested conceptually, characterising subjective experiences of powerlessness in the face of service exclusion.

Communication breakdowns were intimately linked to patient safety, emerging in verbal handovers; through documentation errors; or stemming from the design of formal information-sharing channels (e.g. lack of shared electronic patient records). For those under multiple services, communication challenges operated at organisational interfaces between primary care, specialist mental health services, and social care, contributing to care delays, omitted medications, and mistrust of services. To improve safety, there were calls to strengthen interprofessional relationships between services:

It is dialogue between primary and secondary [specialist] mental health services that should improve because we hardly have interactions with them apart from referrals. We don’t get to know them. (GP, Primary care, individual interview)

‘Falling through the gaps’ also arose from service organisation and patient flow. Gaps in commissioned care pathways resulted in service exclusion, including for individuals whose problems appeared too complex for primary care input, yet not severe enough to warrant specialist care. Poor cooperation between services was a product of workload pressures, manifesting in excessive gatekeeping, with some patients receiving no care at all:

Tying care together is much more challenging, because everyone is pressurised, every service tends to ringfence what they’re prepared to provide…The number of service users who are unable to access any care at all, it’s really difficult. (Nurse, CMHT, individual interview)

Impacts of service divisions were evident. It seemed that providers avoided delivering interventions requiring between-service multidisciplinary collaboration, including physical healthcare provision for people with severe mental illness. There was a tendency for siloed working, meaning that service users’ care was not cohesive:

90% of the people in my clinic will be on dreadful polypharmacy…the burden of side effects is horrendous, the interactions are dreadful, and the problem is, nobody will take responsibility and technically it’s not my job either. (Pharmacist, Specialist medication clinic, individual interview)

Care transitions were viewed as equally fraught; premature or poorly coordinated discharges jeopardised service user progress. Nevertheless, responding to pressure from service managers, providers recognised their own role in discharging patients before they were ready:

Loved ones start calling and say, ‘What are you people doing?…I don’t want to discharge, I am wanting to continue. (Nurse, CMHT, individual interview)

Service users’ sociodemographic and clinical characteristics appeared interrelated with their service navigation experiences. Care quality corresponded to social class; those regarded as articulate or well-presented reportedly received better standards of care. Care journeys for Black service users were characterised more so by police involvement and restrictive practices, relative to White patients:

There are also the different pathways to care, how our ethnic communities perceive our systems, the fear of the psychiatric system…If you’re a Black family, you’re far more likely to have longer duration of untreated psychosis for your loved one…then you’re far more likely to call the police. (Clinical psychologist, CMHT, focus group)

Ethnicity-based preconceptions also appeared to shape the care offered to Asian patients:

Asian women, I don’t think we’re perceived as a danger to society…And of course, to get treatment, you have to be a danger to yourself or someone else. (Carer, Crisis team, individual interview)

Similarly, personality disorder diagnoses underpinned stigmatisation and service exclusion, with recurrent risks unaddressed:

They’ve said they’ll never take me back with the same presentation, which is self-harm. I’m not going to morph into someone with psychosis, so whatever risk I present, they’re not going to respond. (Service user, Primary care, individual interview)

Additional system-drivers of harm included chronic workforce underinvestment. Staffing shortages seemingly arose from long-term disinvestment from community teams, causing unmanageable workloads, which damaged recruitment and retention efforts. Whilst inpatient services are limited by bed availability, community-based providers experienced uncapped caseloads, exceeding safe limits:

These wonderful people trying to do their best with limited resources…when I have been let down, it’s not them…it’s the institution letting me down. (Service user, CMHT, individual interview)

Workforce underinvestment explained inadequacies in staff training and skills mix. Knowledge about psychotropic medications amongst care teams was thought inadequate. Risk assessment and management expertise were also considered lacking. Experience and seniority levels within community teams were understood to have declined over time, with inexperienced staff increasingly delivering care in high-risk situations.

Participant accounts suggested that policies and procedures resulted in further safety vulnerabilities. Administrative tasks aimed at monitoring safety, including incident reporting, equipment audits, and extensive paperwork, were thought to detract from time to deliver safe care:

Our performance is measured upon our completion of forms, rather than the risks that we’ve managed over the month… (Occupational therapist, CMHT, focus group)

Moreover, workarounds to staff safety protocols were necessary, including disregarding lone working protocols to ensure that service users were actually seen. Compromising one’s own safety appeared unavoidable in the face of unmanageable caseloads.

Another perceived mismatch between healthcare policy and achieving safety related to the UK Mental Health Act. Beyond challenges in securing police attendance at assessments and hospital bed availability, there were tensions between upholding human rights whilst preventing patient harm. Providers must wait for those declining treatment to become unwell enough to warrant detention and hospitalisation. Within this period, harmful outcomes included patient arrests, or involvement in an assault, possibly preventable with earlier hospitalisation. Legislation reforms worried carers, who feared increased thresholds for involuntary treatment:

Being sectioned, it’s not the most fun experience, but it does mean you get treated. But they’re bringing in the new White Paper. It’s going to be even harder to get her sectioned, so she might be facing a lifetime of illness. (Carer, Crisis team, individual interview)

Participant accounts suggest that efforts to deliver ‘safe care’ within community-based mental health services were rendered untenable owing to seemingly insurmountable systems-level issues. Local and distal challenges including insufficient resources, staffing shortfalls, and poorly joined-up services, constrained service users’ prospects of accessing timely, appropriate, safe care. Patient safety appeared to be compromised actively, through necessitating violations of unworkable policies, but also in more subtle ways, through communication disconnects; inequities in service access for specific patient groups; and erosion of workforce morale and resilience.

Theme 2: managing the risks service users present

Aside from systems-generated safety risks, ‘patient safety’ in community-based mental healthcare was equated to the management of risks originating from service users. Indeed, a second theme, ‘Managing the risks service users present’, characterises a seemingly dominant perception that ‘safe care’ provision in this context is that which mitigates the risks of harm service users may pose to themselves or others when unwell. Risk of harm to self was a recurrent topic, including self-neglect, self-harm, and suicide. Although infrequent, risks of harm to others through acts of violence were also raised. It appears that staff had limited tools for community prevention of violence, yet these incidents were contextualised in terms of care that is ‘unsafe’:

Unsafe mental healthcare could come from different means. Either the client is not engaging, or the risk associated with clients…did anyone call that the client is shouting along the road or tried to attack somebody? (Nurse, CMHT, individual interview)

Risks to service users from those around them were also described, including physical, sexual, and financial abuse. Family members who discouraged medication adherence presented further threats to safety. Oversight of such risks was considered a key function of safe mental healthcare:

In the thick of it in the community, you are in this incredibly vulnerable spot and that’s why we have things like care coordinators, so that they can keep an eye on you. (Service user, CMHT, individual interview)

Whilst ‘patient safety’ was framed predominantly as the inverse of ‘risk’, staff sometimes drew subtle distinctions between the two. Whether suicide constituted a safety incident was debated, in scenarios where providers deemed it beyond service prevention. Nuances were evident in sense-making about patient safety in this care context:

When you mention ‘patient safety’ within mental healthcare, my mind thinks about what is their risk of harm to themselves or others? Whereas ‘safe mental healthcare’ is where we are looking at providing appropriate or effective mental healthcare for their needs. It’s got two connotations in how it can be measured. (GP, Primary care, individual interview)

Likewise, ‘holding’ risk dominated conversations about community safety. Carers’ accounts suggested that they were insufficiently supported to maintain patient safety at home, holding risks such as self-neglect, or active suicidal intent. Although often excluded from formal safety planning, it appears that carers were positioned as part of the service user’s care team, relied upon to fill gaps within the care system and to navigate risk:

No one gave me any training, ’If she refuses her meds, if she doesn’t want to eat, this is the best way to approach it’…I didn’t have a clue. (Carer, CMHT, individual interview)

Beyond carers, providers were similarly apprehensive about assuming responsibility for risks. Community staff perceived greater individual accountability for risk relative to inpatient care teams. The burdens of clinical decision-making in high-risk situations were evident:

You hold those risks of patients going into crisis, harming themselves, harming other people…you hold that risk almost on your own…it’s only your clinical judgement keeping that person safe. (Nurse, CMHT, focus group)

Throughout participant accounts, it appeared that service users were positioned as inherently ‘unsafe’. It can be contended that risks were deemed to arise from service users, rather than the processes of healthcare itself. Key tensions can be discerned within this discourse: the extent to which risks could feasibly be averted within the structures of community care was unclear. Nevertheless, participant narratives suggest that the prevention of such harms by services should be considered a fundamental tenet of ‘safe care’.

Theme 3: more than responding to risks: everyone plays a role in creating safety

Beyond safety problems stemming from the system itself, or risks presented by users within it, a third theme recognises providers’ agency in contributing to safety. Within this theme, the creation of ‘safety’ can be regarded as a dynamic process, where it is acknowledged that ‘everyone’ (individual providers, care teams, and wider services) has the potential to cause or mitigate harm, but also to proactively influence the delivery of safe care.

It’s easy to say, ‘I didn’t cause someone harm’, but there needs to be more reflection on ‘Well what more could you have done to prevent that? (Carer, Crisis team, individual interview)

That contact with services could inadvertently cause harm appeared an accepted truth. Overt harm precursors included errors (e.g. incorrect medication) and protocol breaches, such as a failure to follow up patients newly prescribed antidepressants:

A regular number of people are taking their own lives, who have been recently prescribed an antidepressant and they’ve not had follow-up monitoring as per NICE guidance…those deaths are completely avoidable. (Pharmacist, Specialist medication clinic, individual interview)

Iatrogenic consequences of psychotropic medication, including weight gain-induced complications, sleep disturbances, and elevated blood pressure, were prominent concerns. In several cases, medications caused new physical health conditions (e.g. quetiapine-induced diabetes). There was a sense that prescriber opinions were privileged over service users’ lived experience:

Each time I see the psychiatrist, he will prescribe more drugs, he fails to listen to you first…about your side effects, about internal experiences that are horrible…he brings your medication, injects it into you, then he goes. (Service user, Community outreach and rehabilitation service, individual interview)

Failures to respond to patient deterioration, including reactive responses to accumulating safety risks, were described recurrently. Service users and carers were seemingly powerless to ensure their concerns were addressed:

He [psychiatrist] didn’t seem to appreciate the severity of my dad’s symptoms…it is so challenging to advocate for yourself because there’s such a power imbalance (Carer, Crisis team, individual interview)

Subtle manifestations of harm appeared widespread, including providers’ failures to meet commitments. This included appointment non-attendance or cancellation, or simply ceasing contact with service users without explanation. The latter occurred during long-term staff sickness absence, where there were no safeguards to ensure caseload reassignment. Equally, experiences of psychological harm were common. The potential for each care encounter to have therapeutic value was recognised by all participant groups and thought vital to maintaining safety:

What you want is staff who have good, positive relationships with patients. That is what keeps people safe. (Social worker, CMHT, individual interview)

Negative patient-provider interactions led to patients losing trust in services, non-disclosure of care-relevant information, or service avoidance. Service withdrawal following a harmful encounter occurred for varying reasons, including fears of involuntary treatment:

I don’t want her to be sectioned again, I don’t want them to decide she’s better off in the unit because she prefers to be at home, it’s safer for her… (Carer, CMHT, individual interview)

Also within this theme was the position that safer care can be achieved through proactive approaches to care delivery, emphasising harm prevention through planning, team working, and collaboration with service users and carers:

That time for safety, you just don’t find it. If you think about the time that you spend doing incident reports or, reading complaints…so yeah, I would have those safety mechanisms early on. (Psychiatrist, CMHT, individual interview)

Aspirations to mitigate harm were evident. One preventative strategy involved allocating more time resource at the outset (e.g. investing time into clinical formulation, learning from prior treatment history), rather than in response to something going wrong. Other examples of frontloading care included proactive medication monitoring, averting safety problems before they arose. In the case of lithium treatment, toxicity was prevented through reviewing longitudinal lithium level trends, instead of individual results in isolation. Service users were also educated on toxicity signs, encouraging timely support-seeking.

Communication systems appeared vital to planning for safety. Electronic patient record systems were paramount to effective information sharing, despite challenges in navigating large volumes of information recorded in different locations. The importance of documenting safety information so that it is meaningful to all providers was explored:

When I’m documenting risks, that person might turn up to A&E [Accident and Emergency department], might be an inpatient, or might go to the home treatment team…I need to do it in a way that conveys easily that information, so you get a narrative of the patient, rather than a shopping list. (Psychiatrist, CMHT, individual interview)

Related to communication were the merits of team working for strengthening safety. ‘Zoning meetings’, where service users are classified according to risk and care needs, helped to prioritise interventions at team-level, supporting staff in caring for service users presenting with complex needs. Likewise, safety huddles were used proactively within primary care practices.

Service users’ and carers’ narratives suggested a similar endorsement of a proactive safety outlook. Harmful outcomes of deterioration, such as involuntary hospitalisation, breakdown of family relationships, or loss of dignity, were deemed avoidable with timely service intervention. Indeed, safety planning was thought most effective when designed whilst an individual was relatively well, rather than in crisis. It appeared that active involvement of service users and carers in crisis mitigation restored confidence and indicated mutual respect:

I have to give credit to the new psychiatrist my brother has been seeing…he decided to take the risk and he asked me, ‘Do you think it’s better just to be sent straight away to the hospital, or maybe just give him a chance?’…I said, ‘Okay, I’ll be home with him, let’s not lose what we actually achieved over these months. (Carer, CMHT, individual interview)

The above outlined theme, representing a third conceptual component of ‘safe care’, presents clear points of distinction from the preceding themes. Indeed, beyond apparent latent risks to safety which providers inherit from the systems they are required to operate within, or risks which surround service users, the agentic role of providers in ensuring that care is safe is clearly acknowledged. These findings highlight the propensity of providers, teams, and services to impact patient safety, both positively and negatively.

Theme 4: the goals of ‘safety’: our destination is not yet in sight

Marking a fourth element of participant characterisations of ‘patient safety’ in community-based mental health services, a final theme sets out a position that the pursuit of ‘safe care’ is a work in progress. Indeed, according to participant narratives, this theme depicts the breadth of change required to reach a point of safety: ‘our destination is not yet in sight’. Beyond managing risks and preventing deterioration, this theme presents the ideological stance that ‘safe’ mental healthcare provision is that which strives towards goals which meaningfully improve service users’ lives.

I don’t for one minute think that those people are causing him direct harm. But by indirect behaviour they are causing him harm…that harm is he’s not progressing, he’s not developing. (Carer, Dual diagnosis supported accommodation services, individual interview)

It appeared that safety was compromised by a preoccupation with managing incident outcomes, rather than addressing their causation. Basic wound care or liver function assessment sometimes constituted the only follow-up interventions provided after self-harm or suicide attempts, at the expense of compassionately exploring underlying distress. Services were seemingly oriented towards managing risks, lacking focus on promoting service user resilience and long-term care goals. Instead, respondent accounts indicate that patient stabilisation was regarded as the treatment endpoint:

No one said ‘What do you want to achieve? How can we help you? (Carer, Crisis team, individual interview)

Beyond calls to re-examine the objectives of ‘safety’ in community mental healthcare, safety improvement priorities were widely explored. Enhancing care access could be regarded as a unanimous goal. Faster, streamlined routes into specialist services were deemed vital for promoting recovery. Accordingly, ineffective triage processes, crisis care, and out-of-hours care, were regarded as vital system-wide foci for safety improvement.

Further safety improvement priorities concerned care transitions and joint working between services. A higher intensity of support was considered essential for safer transitions across the inpatient-to-community interface, and from specialist community services into primary care. Proactive service follow-up was considered a vehicle for safer transitions, allowing timely monitoring for relapse signs:

Even if it’s like in a month or two…If you have that follow-up…you will be able to pick things up before it gets bad. (Service user, Specialist psychological therapy service, individual interview)

Where multiple teams were involved in a person’s care, participant accounts suggest that strengthening provider collaboration was a safety priority. Relationships between staff in primary care and specialist services were strained by perceived reluctance on the part of the other service to accept referrals or discharges. The need for improved cooperation was voiced:

We really do need to bring primary and secondary [specialist] care services closer together, to be working more with each other rather than this ‘us and them’ scenario. (Specialist pharmacist, Primary care, individual interview)

Improved joint working was also raised in relation to people at risk of service exclusion. These individuals often had multiple complex needs (e.g. severe mental illness and co-morbid substance use problems). There were calls for practical, cooperative, patient-centred approaches to planning care:

People might fall in theory into two or three services or none…It might be a negotiation, ‘Okay, this time I take this person, next time you take the case. (Psychiatrist, CMHT, individual interview)

Service user and carer support represented a further safety improvement priority. It was proposed that service users could better maintain their own safety with individualised, co-produced, up-to-date safety plans, incorporating advance planning for relapses. Carers similarly needed guidance for keeping service users safe (e.g. in lethal means restriction for suicidal individuals), but discerned a lack of provider willingness to offer such support:

Someone should come up with a plan to make sure carers have proper training…They need certain skills and they’re not given skills, and no-one wants to listen to them or encourage them. (Carer, CMHT, individual interview)

Medication safety represented a final safety priority. It was apparent that education on antidepressant use and prescribing was a key concern for respondents. Prescribers identified a need for medication counselling on the risks of prematurely discontinuing antidepressants when no therapeutic benefit is perceived. Similarly, carers considered antidepressants to be prescribed too readily, without educating people about potential negative side-effects (e.g. suicidal thoughts) upon initial treatment:

They hand out antidepressants, whenever they feel like it. Without any recourse to ‘Is this person going to be susceptible to problems with these drugs? (Carer, Assessment and treatment team, individual interview)

Medication non-adherence could be regarded as an important early indicator of risks to safety. Given the narrow therapeutic range of many psychotropic medications, where dosage and timeliness are essential, the absence of mechanisms to detect when medications were not ordered or collected concerned participants. Where GP practices had in-house pharmacies, alerting systems were proposed to mitigate risks of delayed or omitted medications.

Moreover, medication interaction burden was another area warranting closer attention. Improved prescriber awareness of the potential for harm was considered an important prerequisite to reducing risk. Accordingly, a participant discussed developing a bespoke form within the electronic health record system, prompting clinicians to reflect on risks introduced by their prescribing, inviting consideration of alternative treatment approaches:

Once you’ve listed your drugs, then you have to list the potential interactions, and a suggestion of how you might want to manage that (Pharmacist, Specialist medication clinic, individual interview)

Although service users, carers, and providers alike indicated support for patient-centred care planning and delivery, there appeared to be differences between participant groups in views on how this is achieved in practice. It can be contended that patient safety is compromised owing to an apparent lack of ambition on the part of providers to strive towards meaningful, recovery-oriented goals, which support service users to approach their own vision of ‘getting better’. Beyond the treatment of clinical symptoms alone, recovery approaches to mental illness are concerned with “the processes by which people experiencing mental illness can develop a purposeful and meaningful life” [ 36 ]. Recovery can be considered a deeply individual journey, characterised by connectedness; hope and optimism about the future; a positive sense of identity; meaning in life; and empowerment [ 37 ]. Indeed, service users’ and carers’ accounts implied that ‘safe care’ could not be realised where the objectives of community-based mental healthcare were limited to achieving stabilisation and preventing deterioration alone. Tangible avenues for improving patient safety within this care context are evident within participant narratives.

This study articulates the multifaceted ways in which service users, carers, and providers conceptualise ‘safe care’ and, conversely, ‘unsafe care’ in community-based mental health services in England. Upon incorporating wider stakeholder perspectives, our findings suggest that the remit of patient safety and objectives for its improvement expand beyond mere efforts to manage risks. Indeed, aligned with mental health services research conducted primarily within inpatient contexts [ 9 , 38 ], or mixed settings [ 7 ], service users often framed safety subjectively in terms of feeling safe. Although the management of risks to self, to others, and from others, when unwell is recognised as a core component of patient safety (Theme 2), we posit that this dominant perspective alone is incomplete. Threats to ‘safe care’ were apparent first and foremost at mental healthcare systems-level (Theme 1). Whether care is ‘safe’ is also contingent upon the actions, or inaction, of providers (Theme 3). Moreover, we hold that ambitious safety agendas, giving primacy to goals which meaningfully improve service users’ lives, are required for the achievement of ‘safe care’ (Theme 4).

Consistent with findings from non-psychiatric hospital-based settings [ 23 ], participants were unaccustomed to considering the meaning and connotations of ‘patient safety’ in the present service context. Surprisingly, this was true of all participant groups, including providers, perhaps reflecting the dominance of inpatient settings in current discourses and initiatives aimed at improving safety in mental healthcare [ 39 ]. Well-evidenced iatrogenic harms within these settings (e.g. physical restraint, seclusion) [ 10 ], are possibly particularly salient to service users, carers, and staff alike. Nevertheless, from a point of uncertainty, participants refined their conceptualisations of safety throughout the interviews.

Theme one findings illustrated that systemic conditions result in care which departs from safe practice. Mirroring conclusions of the annual NHS community mental health survey [ 40 ], service access was a prominent concern, where care delays risked potentially severe consequences for deteriorating patients. Apparent inequities in care access, corresponding to sociodemographic and clinical characteristics, contributed to a sense of ‘falling through the gaps’. That participants deliberated on disparities in care for service users from Black backgrounds was perhaps unsurprising, given evidenced inequities in experiences of mental health services. Indeed, delayed access to specialist services and higher rates of compulsory treatment, are salient examples of adverse experiences faced by Black communities [ 41 , 42 , 43 , 44 ]. In contrast, intersections between ethnicity and access for Asian women, as voiced by participants, are less well-documented amongst the literature and warrant exploration in further research. Issues in recruiting and retaining staff further undermined patient safety, culminating in a less experienced and insufficiently supervised workforce, as described in existing research [ 45 ]. Finally, key healthcare policies, including the UK Mental Health Act, presented challenges for providers in the balance between upholding human rights and preventing harm to patients.

A second position characterised ‘safe care’ as the management of risks originating from service users. Prior research indicates that risk and safety are treated as equivalent concepts by mental health service staff [ 19 , 46 ]. Risks of harm from others in the community (e.g. physical, sexual, financial abuse) indicate setting-specific safety challenges within mental healthcare. Moreover, although carers felt left to hold risks, they were seldom involved in proactive safety planning, which is essential for improving safety in community-based care [ 16 , 47 ].

Provider agency in harm causation or mitigation was recognised in a third theme. Overt unsafe practices (e.g. protocol breaches) were less widespread than other less measurable harms (e.g. psychological harm). Finally, a fourth theme set out visions for the future of patient safety within community-based mental healthcare, contending that supporting service users to live fulfilling lives must be considered within the remit of ‘safe’ care, consistent with research capturing nursing perspectives [ 17 ]. This finding provides backing to calls for patient-directed safety targets [ 47 ], and resonates with recovery principles [ 37 ]. Indeed, in the context of the ongoing transformation of community-based mental healthcare in England in efforts to deliver integrated place-based care [ 48 ], patient safety considerations within voluntary sector and social care services will become increasingly important, warranting attention in future research.

Strengths and limitations

We detail novel insights into how ‘safety’ is conceptualised by those who use or deliver care within community-based mental health services, illuminating plausible safety improvement foci. Key strengths lie in our study sample, diverse in participant characteristics including ethnicity, diagnosis, and service experience, where sampling was guided by information power principles [ 30 ]. Moreover, we hold that the reflexive approach to data analysis adopted, positioned in relation to our own experiences, constitutes an additional strength of this work. However, although our findings are not limited to a single region or NHS trust, almost two-thirds of participants were London-based. Accordingly, we may not have adequately captured geographical disparities in safety experiences, including the concerns of rural communities. Overall, we noted a preference amongst participants to take part in an individual interview. As such, whilst our findings present a rich picture of individuals’ subjective experiences, it is possible that further insights from interaction between participants might have been gained had we been able to conduct focus groups with service users and carers. Although we believe that service user and carer involvement in shaping the research methods and focus represents a strength of this study, we recognise that a more extensive approach to patient and public involvement throughout the research cycle may have improved this work [ 49 ]. In future research examining the safety of mental healthcare, training of lived experience co-researchers in qualitative interviewing, analysis, and interpretation may introduce valuable new perspectives on this topic. Such methods were beyond the limits of the resources available for the present study, which formed part of a PhD project. Taking the findings of this study as a starting point, we believe that equitable co-production partnerships with service users and carers will be particularly important in any follow-on research aiming to develop interventions to improve patient safety in this context [ 50 ].

Implications

This study supplements recently emerging literature exploring the specifics of patient safety in community-based mental health services [ 16 , 17 ], expanding upon existing research by integrating service user, carer, and provider perspectives. Our findings offer a reimagining of ‘safe care’ in this service context, transcending narrow safety definitions and risk management agendas, which have dominated patient safety discourse [ 25 , 26 ]. Safety improvement priorities, as identified by participants, warrant examination in future research, to move the field towards effective safety interventions.

There was striking overlap between the safety shortfalls participants recounted and recommendations from a suicide reduction toolkit for primary care and specialist mental health services providers [ 51 ]. For example, timely post-discharge follow-up; 24-hour crisis care access; and carer involvement, constituted key recommendations. Beyond associations with lower suicide rates [ 52 , 53 ], full implementation of these recommendations may plausibly help to drive wider improvements in service safety.

In illustrating safety problems alongside examples of system resilience where safe care is successfully delivered, our findings correspond to both Safety-I and Safety-II paradigms [ 54 ]. Participants identified opportunities for optimising practice to improve service resilience, yet service pressures and efforts to manage the consequences of things going wrong (Safety-I), impacted provider capacity to engage in proactive planning for safety (Safety-II). Taken together, this underlines the importance of continued efforts to improve staffing and skills development, which will afford healthcare providers the time and expertise to deliver safer care.

The present research provides nuanced insights into the ways in which service users, carers, and healthcare providers conceptualise ‘patient safety’ within community-based mental healthcare, where extant research has primarily explored safety within inpatient mental healthcare settings only. Our findings indicate that participants conceived of ‘safe care’ in multiple contrasting, yet not mutually exclusive, ways. Dominant risk management narratives, which focus on those risks emanating from service users themselves when unwell, characterised one aspect of patient safety only. Indeed, systemic conditions were thought to culminate in mental healthcare provision which diverges from safe, effective clinical practice. Beyond these entrenched, systems-level challenges, participants nevertheless recognised the propensity of healthcare providers to cause or circumvent harm. Finally, this research highlights a vital need to pursue goals which are meaningful to service users in the shaping of safety improvement agendas within mental healthcare, indicating a range of potential foci for future interventional research.

Data availability

The dataset generated and analysed within the current study is not available for sharing and is not publicly available. Consent for data sharing was not sought from research participants and was not agreed as part of the research ethics committee approval obtained for this research. Queries about these data may be directed to the corresponding author.

Abbreviations

A&E: Accident and Emergency department

CMHT: Community Mental Health Team

GP: General Practitioner

NHS: National Health Service

World Health Organization. Global Patient Safety Action Plan 2021–2030. 2021. https://www.who.int/publications/i/item/9789240032705 . Accessed 12 Dec 2022.

Brennan TA, Leape LL, Laird NM, Hebert L, Localio AR, Lawthers AG, et al. Incidence of adverse events and negligence in hospitalized patients. N Engl J Med. 1991. https://doi.org/10.1056/NEJM199102073240604 .

Howell AM, Burns EM, Bouras G, Donaldson LJ, Athanasiou T, Darzi A. Can patient safety incident reports be used to compare hospital safety? Results from a quantitative analysis of the English National Reporting and Learning System data. PLoS ONE. 2015. https://doi.org/10.1371/journal.pone.0144107 .

Leape LL, Brennan TA, Laird N, Lawthers AG, Localio AR, Barnes BA, et al. The nature of adverse events in hospitalized patients. N Engl J Med. 1991. https://doi.org/10.1056/NEJM199102073240605 .

Albutt A, Berzins K, Louch G, Baker J. Health professionals’ perspectives of safety issues in mental health services: a qualitative study. Int J Ment Health Nurs. 2021. https://doi.org/10.1111/inm.12838 .

Berzins K, Baker J, Brown M, Lawton R. A cross-sectional survey of mental health service users’, carers’ and professionals’ priorities for patient safety in the United Kingdom. Health Expect. 2018. https://doi.org/10.1111/hex.12805 .

Berzins K, Baker J, Louch G, Albutt A. A qualitative exploration of mental health service user and carer perspectives on safety issues in UK mental health services. Health Expect. 2020. https://doi.org/10.1111/hex.13025 .

Cutler NA, Sim J, Halcomb E, Stephens M, Moxham L. Understanding how personhood impacts consumers’ feelings of safety in acute mental health units: a qualitative study. Int J Ment Health Nurs. 2021. https://doi.org/10.1111/inm.12809 .

Stenhouse RC. Safe enough in here? Patients’ expectations and experiences of feeling safe in an acute psychiatric inpatient ward. J Clin Nurs. 2013. https://doi.org/10.1111/jocn.12111 .

Thibaut B, Dewa LH, Ramtale SC, D’Lima D, Adam S, Ashrafian H, et al. Patient safety in inpatient mental health settings: a systematic review. BMJ Open. 2019. https://doi.org/10.1136/bmjopen-2019-030230 .

D’Lima D, Crawford MJ, Darzi A, Archer S. Patient safety and quality of care in mental health: a world of its own? BJPsych Bull. 2017. https://doi.org/10.1192/pb.bp.116.055327 .

Ayre MJ, Lewis PJ, Keers RN. Understanding the medication safety challenges for patients with mental illness in primary care: a scoping review. BMC Psychiatry. 2023. https://doi.org/10.1186/s12888-023-04850-5 .

Cusack P, Cusack FP, McAndrew S, McKeown M, Duxbury J. An integrative review exploring the physical and psychological harm inherent in using restraint in mental health inpatient settings. Int J Ment Health Nurs. 2018. https://doi.org/10.1111/inm.12432 .

Innes J, Curtis D. Medication patient safety incidents linked to rapid tranquillisation: one year’s data from the National Reporting and Learning System. J Psychiatric Intensive Care. 2015. https://doi.org/10.1017/s1742646413000277 .

James K, Stewart D, Wright S, Bowers L. Self harm in adult inpatient psychiatric care: a national study of incident reports in the UK. Int J Nurs Stud. 2012. https://doi.org/10.1016/J.IJNURSTU.2012.04.010 .

Ayhan F, Üstün B. The opinions and practices of health professionals in community mental health centers on risk assessment. J Psychiatric Nurs. 2021. https://doi.org/10.14744/phd.2021.08769 .

Sundin R, Nilsson A, Waage-Andrée R, Björn C. Nurses’ perceptions of Patient Safety in Community Mental Health settings: a qualitative study. Open J Nurs. 2015. https://doi.org/10.4236/ojn.2015.54042 .

Lachman P, Brennan J, Fitzsimons J, Jayadev A, Runnacles J. Resilience theory, complexity science, and Safety-II. In: Lachman P, Brennan J, Fitzsimons J, Jayadev A, Runnacles J, Lachman P, et al. editors. Oxford Professional Practice: Handbook of Patient Safety. Oxford University Press; 2022. pp. 101–10.

Coffey M, Cohen R, Faulkner A, Hannigan B, Simpson A, Barlow S. Ordinary risks and accepted fictions: how contrasting and competing priorities work in risk assessment and mental health care planning. Health Expect. 2017. https://doi.org/10.1111/hex.12474 .

Higgins A, Doyle L, Downes C, Morrissey J, Costello P, Brennan M, et al. There is more to risk and safety planning than dramatic risks: Mental health nurses’ risk assessment and safety-management practice. Int J Ment Health Nurs. 2016. https://doi.org/10.1111/inm.12180 .

Averill P, Vincent C, Reen G, Henderson C, Sevdalis N. Conceptual and practical challenges associated with understanding patient safety within community-based mental health services. Health Expect. 2023. https://doi.org/10.1111/hex.13660 .

The Health Foundation. Involving patients in improving safety. London: The Health Foundation. 2013. https://www.health.org.uk/publications/involving-patients-in-improving-safety . Accessed 4 Mar 2020.

Barrow E, Lear RA, Morbi A, Long S, Darzi A, Mayer E, et al. How do hospital inpatients conceptualise patient safety? A qualitative interview study using constructivist grounded theory. BMJ Qual Saf. 2023. https://doi.org/10.1136/bmjqs-2022-014695 .

Cutler NA. What does safety in acute mental health units mean for consumers? University of Wollongong; 2021. https://ro.uow.edu.au/theses1/1162/ . Accessed 21 Nov 2022.

Delaney KR, Johnson ME. Inpatient Psychiatric nursing: why Safety must be the Key Deliverable. Arch Psychiatr Nurs. 2008. https://doi.org/10.1016/j.apnu.2008.09.003 .

O’Hara JK, Lawton RJ. At a crossroads? Key challenges and future opportunities for patient involvement in patient safety. BMJ Qual Saf. 2016. https://doi.org/10.1136/bmjqs-2016-005476 .

DiCicco-Bloom B, Crabtree BF. The qualitative research interview. Med Educ. 2006. https://doi.org/10.1111/j.1365-2929.2006.02418.x .

O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014. https://doi.org/10.1097/ACM.0000000000000388 .

Berzins K, Louch G, Brown M, O’Hara JK, Baker J. Service user and carer involvement in mental health care safety: raising concerns and improving the safety of services. BMC Health Serv Res. 2018. https://doi.org/10.1186/s12913-018-3455-5 .

Malterud K, Siersma VD, Guassora AD. Sample size in qualitative interview studies: guided by Information Power. Qual Health Res. 2016. https://doi.org/10.1177/1049732315617444 .

Bhaskar RA. Scientific Realism and Human Emancipation. London: Verso; 1986.

Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. Qualitative Research Practice. Second edition. London: Sage; 2003.

Willig C. Introducing Qualitative Research in Psychology. Third edition. Berkshire, England: McGraw-Hill Education; 2013.

Braun V, Clarke V. Thematic analysis: a practical guide. London: Sage; 2021.

Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Res Psychol. 2006;3:77–101.

Slade M. Mental illness and well-being: the central importance of positive psychology and recovery approaches. BMC Health Serv Res. 2010. https://doi.org/10.1186/1472-6963-10-26 .

Leamy M, Bird V, Le Boutillier C, Williams J, Slade M. Conceptual framework for personal recovery in mental health: systematic review and narrative synthesis. Br J Psychiatry. 2011. https://doi.org/10.1192/bjp.bp.110.083733 .

Cutler NA, Halcomb E, Sim J, Stephens M, Moxham L. How does the environment influence consumers’ perceptions of safety in acute mental health units? A qualitative study. J Clin Nurs. 2021. https://doi.org/10.1111/jocn.15614 .

NHS England. The Mental Health Patient Safety Improvement Programme. 2021. https://www.england.nhs.uk/patient-safety/patient-safety-improvement-programmes/#MHSIP . Accessed 19 Oct 2022.

Care Quality Commission. Community mental health survey. 2022. https://www.cqc.org.uk/publications/surveys/community-mental-health-survey . Accessed 3 Sep 2023.

Halvorsrud K, Nazroo J, Otis M, Brown Hajdukova E, Bhui K. Ethnic inequalities and pathways to care in psychosis in England: a systematic review and meta-analysis. BMC Med. 2018. https://doi.org/10.1186/s12916-018-1201-9 .

Morgan C, Fearon P, Lappin J, Heslin M, Donoghue K, Lomas B, et al. Ethnicity and long-term course and outcome of psychotic disorders in a UK sample: the ÆsOP-10 study. Br J Psychiatry. 2017. https://doi.org/10.1186/s12916-018-1201-9 .

NHS Digital. Detentions under the Mental Health Act. 2022. https://www.ethnicity-facts-figures.service.gov.uk/health/mental-health/detentions-under-the-mental-health-act/latest . Accessed 13 Dec 2022.

NHS Digital. Community Treatment Orders. Mental Health Act Statistics, Annual Figures. 2022. https://digital.nhs.uk/data-and-information/publications/statistical/mental-health-act-statistics-annual-figures/2021-22-annual-figures/community-treatment-orders . Accessed 13 Dec 2022.

Baker J, Canvin K, Berzins K. The relationship between workforce characteristics and perception of quality of care in mental health: a qualitative study. Int J Nurs Stud. 2019. https://doi.org/10.1016/j.ijnurstu.2019.103412 .

Slemon A, Jenkins E, Bungay V. Safety in psychiatric inpatient care: the impact of risk management culture on mental health nursing practice. Nurs Inq. 2017. https://doi.org/10.1111/nin.12199 .

Vincent C, Amalberti R. Safer Healthcare Strategies for the Real World. London: Springer Open; 2016.

NHS England. Community mental health services. NHS England. 2021. https://www.england.nhs.uk/mental-health/adults/cmhs/ . Accessed 15 Jun 2024.

NIHR Applied Research Collaboration West. New guide to working effectively with public contributors – by public contributors themselves. NIHR ARC West. 2023. https://arc-w.nihr.ac.uk/news/new-guide-to-working-effectively-with-public-contributors-by-public-contributors-themselves . Accessed 15 Jun 2024.

NHS England. Working in partnership with people and communities: Statutory guidance. NHS England. 2023. https://www.england.nhs.uk/long-read/working-in-partnership-with-people-and-communities-statutory-guidance/#annex-a-implementation . Accessed 15 Jun 2024.

The National Confidential Inquiry into Suicide and Safety in Mental Health. Safer services: a toolkit for specialist mental health services and primary care. Manchester: University of Manchester; 2022.

Kapur N, Ibrahim S, While D, Baird A, Rodway C, Hunt IM, et al. Mental health service changes, organisational factors, and patient suicide in England in 1997–2012: a before-and-after study. Lancet Psychiatry. 2016. https://doi.org/10.1016/S2215-0366(16)00063-8 .

While D, Bickley H, Roscoe A, Windfuhr K, Rahman S, Shaw J, et al. Implementation of mental health service recommendations in England and Wales and suicide rates, 1997–2006: a cross-sectional and before-and-after observational study. Lancet. 2012. https://doi.org/10.1016/S0140-6736(11)61712-1 .

Hollnagel E, Wears RL, Braithwaite J. From Safety-I to Safety-II: a White Paper. University of Florida, USA, and Macquarie University, Australia; 2015.

Acknowledgements

We would like to thank the service users and carers who provided valuable guidance in a patient and public involvement capacity, which helped to shape this research. Likewise, we acknowledge the support of the Maudsley Biomedical Research Centre’s Feasibility and Acceptability Support Team for Researchers (FAST-R). Finally, we would like to express our sincere thanks to the participants in this study, who so generously gave up their time in helping us to understand their experiences.

This project is supported by the Health Foundation’s grant to the University of Cambridge for The Healthcare Improvement Studies Institute (THIS Institute), grant number PHD-2018‐01‐026. The views expressed in this publication are those of the authors, and not necessarily those of the Health Foundation or THIS Institute.

Author information

Claire Henderson and Nick Sevdalis are joint senior authors of this study.

Authors and Affiliations

Centre for Implementation Science, Health Service and Population Research Department, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, 16 De Crespigny Park, London, SE5 8AF, UK

Phoebe Averill, Bryher Bowness & Claire Henderson

Centre for Behavioural and Implementation Science Interventions, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore

Nick Sevdalis

Contributions

PA: Conceptualisation; Formal analysis, Investigation; Methodology; Project administration; Writing – Original Draft Preparation; Writing – Review & Editing. BB: Formal analysis; Writing – Review & Editing. CH: Conceptualisation; Funding acquisition; Methodology; Supervision; Writing – Review & Editing. NS: Conceptualisation; Funding acquisition; Methodology; Supervision; Writing – Review & Editing.

Corresponding author

Correspondence to Phoebe Averill .

Ethics declarations

Ethics approval and consent to participate.

This research was approved by a Welsh National Health Service (NHS) research ethics committee (IRAS ID: 279409). Participants were aware of the aims of the research and the voluntary nature of taking part. Informed consent was documented prior to participation.

Consent for publication

Not applicable.

Competing interests

NS is the director of London Safety and Training Solutions Ltd, which offers training in patient safety, implementation solutions and human factors to healthcare organisations and the pharmaceutical industry. The other authors report no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Supplementary Material 4

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .

About this article

Cite this article.

Averill, P., Bowness, B., Henderson, C. et al. What does ‘safe care’ mean in the context of community-based mental health services? A qualitative exploration of the perspectives of service users, carers, and healthcare providers in England. BMC Health Serv Res 24 , 1053 (2024). https://doi.org/10.1186/s12913-024-11473-3

Received : 09 January 2024

Accepted : 21 August 2024

Published : 11 September 2024

DOI : https://doi.org/10.1186/s12913-024-11473-3


Keywords:

  • Patient safety
  • Mental health services
  • Health services research
  • Qualitative study


Foundations of automatic feature extraction at LHC–point clouds and graphs

  • Open access
  • Published: 11 September 2024


  • Akanksha Bhardwaj 1 ,
  • Partha Konar 2 &
  • Vishal Ngairangbam   ORCID: orcid.org/0000-0002-7143-715X 3  

Deep learning algorithms will play a key role in the upcoming runs of the Large Hadron Collider (LHC), helping bolster various fronts ranging from fast and accurate detector simulations to physics analysis probing possible deviations from the Standard Model. The game-changing feature of these new algorithms is the ability to extract relevant information from high-dimensional input spaces, often regarded as “replacing the expert” in designing physics-intuitive variables. While this may seem true at first glance, it is far from reality. Existing research shows that physics-inspired feature extractors have many advantages beyond improving the qualitative understanding of the extracted features. In this review, we systematically explore automatic feature extraction from a phenomenological viewpoint and the motivation for physics-inspired architectures. We also discuss how prior knowledge from physics results in the naturalness of the point cloud representation and discuss graph-based applications to LHC phenomenology.


1 Introduction

Modern machine learning (ML) techniques are quickly becoming ubiquitous in the natural sciences due to their powerful data processing capabilities and their ability to find excellent interpolations from high-dimensional data. The situation is not too different in various branches of physics (see Ref. [ 1 ] for a recent review), where they are being employed to enhance traditional ways of studying physical systems, ranging from finding the ground state of many-body quantum systems to predicting cosmological parameters. The situation is similar in collider phenomenology. Particularly relevant for the huge volume of data that the Large Hadron Collider (LHC) will generate in the high-luminosity runs [ 2 , 3 , 4 , 5 ], the community is currently striving to work out the intricacies of these efficient data handling algorithms, ranging from their application to triggers [ 6 , 7 , 8 , 9 ], the stage which decides which events to store and which to discard; the study of various detector modules [ 10 , 11 , 12 , 13 , 14 , 15 , 16 , 17 , 18 ]; phenomenological applications [ 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 , 27 , 28 ], which connect the recorded data to our current understanding of the physical universe; and foundation models [ 29 , 30 , 31 , 32 , 33 , 34 ] capable of generalising across several applications.

The defining characteristic of current state-of-the-art ML algorithms is their ability to extract features from a high-dimensional input representation which has undergone little to no processing (in terms of dimensionality reduction to a few variables through domain knowledge). These so-called deep-learning algorithms often need not rely on hand-engineered variables and can (with appropriate architecture design) outperform shallow methods relying on human-designed variables. The downside of such algorithms is the loss of understanding, in the traditional sense, of what the network is exploiting to outperform the human-engineered variables. Even though the performance gain can, in principle, be attributed to a higher degree of optimisation via numerical methods coupled with the neural network’s inherent expressive power, the very high-dimensional parameter space prohibits an analytic understanding of the optimisation process.

Figure 1: A schematic diagram on the plane of classifier performance and understanding for deep learning algorithms and their comparison to shallow machine learning methods.

Taking the example of signal vs background classification as a proxy for performance, we can cast the interplay of performance and physics understanding in a two-dimensional plane as shown in Fig.  1 . In the bottom left, with virtually no separation and understanding, reside the raw detector data Footnote 1 used as inputs to reconstruction algorithms. Reconstruction, which includes defining infrared and collinear safe jets, and other objects like leptons and photons based on isolation criteria applied to the measurements of the various components of the detector, gives us a set of reconstructed objects in the bottom right. On the reconstructed data, one generally applies loose selection cuts to enrich the signal and then applies shallow machine learning after constructing observables based on physics insights into the specific signature in question. In deep-learning based analyses, one goes back to the raw data after pre-selection and uses the high-dimensional data as inputs to a suitable model. These models generally reside higher on the performance axis compared to shallow methods but fare poorly on the understanding axis. With physics-inspired deep-learning methods, there is a small increase on the understanding axis with still comparable performance to out-of-the-box methods. However, this is still a long way from the traditional understanding of physics-based observable design (evident from the discontinuous scale on the understanding axis). It would be considered a breakthrough to reach a level of understanding equal to traditional approaches without losing out on the performance gain. Although what qualifies as equal will of course be decided by the community at large, a pragmatic criterion could be the understanding of their systematic uncertainties [ 22 , 35 , 36 , 37 , 38 , 39 ], especially those with theoretical origins [ 40 , 41 ].

We systematically review the nature of automatic feature extraction for phenomenology at the Large Hadron Collider, taking a more expository approach to what is already known in the mainstream machine learning literature and its relations/reinterpretations specific to LHC phenomenology. One major theme of current research in this context is the suitability of the point cloud representation in analysing high-energy collision events. In the point cloud representation, events are described as an unordered set of particles, characterised by their measured properties like four-momentum, charge, etc., whose number is not constant on an event-by-event basis. This is due to the non-conservation of particle multiplicity in relativistic quantum mechanics and our innate interest in how the measured particles relate to the interactions at subnuclear length scales. As the details of these interactions reside in the correlations between the measured particles, relational structures (like a graph) between the set’s elements are often employed to expose these correlations. While we touch upon graph-based representations and overview some applications, our coverage is essentially incomplete, as our aim is to elucidate the interplay between the expressive power and generalisation capabilities of deep neural networks in the context of high-energy physics. For a more detailed literature review, we refer interested readers to Refs. [ 42 , 43 , 44 , 45 , 46 , 47 , 48 , 49 , 50 , 51 ] and the living review [ 52 ].

The rest of the review is organised as follows. In Sect.  2 , we present a qualitative discussion of the expressivity of fully-connected feed-forward neural networks and its interplay with generalisation performance, arguing the need to curtail the theoretical expressivity to have good generalisation capabilities for specific purposes. We discuss these prior assumptions in the context of LHC phenomenology in Sect.  3 , which motivates the set-based point cloud representation. We then present an introduction to graph-based architectures acting on the point cloud representation in Sect.  4 . In Sect.  5 , we overview some applications relevant to LHC phenomenology. We summarise in Sect.  6 .

2 Universal approximation: theory and practice

Artificial Neural Networks (ANNs) are non-parametric models which learn an underlying target function based on observed data. By non-parametric, we mean that the free parameters do not have any inherent (physical) meaning and the analysis relies on approximating the target function as accurately as possible. The models are structured to capture a large set of possible functions. ANNs owe their origin to mathematical models of the biological neuron [ 53 , 54 ]. As we will be discussing point cloud architectures which utilise dense feed-forward networks as building blocks, we will use the term Multilayer Perceptron (MLP) to refer to such networks to avoid confusion with the composite architectures built out of such units.

This section presents a brief account of the mathematical structure of MLPs and their expressive power, captured in terms of Universal Approximation Theorems (UATs). These UATs, generally phrased as existence theorems, deal with the ability to approximate any continuous function on a compact domain up to arbitrary precision. Such generality at times impedes practical optimisation, especially when the input dimensionality is very large. The expressivity is therefore deliberately reduced, based on the input representation and the training objective, by biasing the network architecture towards some smaller set of functions. These preconditions are colloquially referred to as “ inductive biases ”, closely related to the paradigm of inductive learning. In such scenarios, we train a model on a specific training dataset to be able to generalise to samples not present during training, assuming that the new samples follow the same relationship between the input and the target variable. The expected error on the unseen samples is called the generalisation error of the model, and it is estimated with the help of the validation dataset. Highly expressive models generally achieve very good training error with poor generalisation and frequently overfit the training data. When the architecture is suitable for the particular task at hand, it results in a better generalisation error and hence can reach lower levels of validation loss without overfitting to the training dataset. We end the section with a discussion of the need to curtail the expressivity via such biases.

2.1 Structure of multilayer perceptrons

A statistical model takes an input vector \(\textbf{x}\in \mathbb {R}^n\) and learns a target function Footnote 2 \(\tilde{y}=f(\Theta ,\textbf{x})\) by tuning the set of parameters \(\Theta\) according to some known set of pairs \(\{(\textbf{x},y)_i\}\) . Different models have different forms of functional dependence on the parameters and may involve hidden representations, as in the case of neural networks. For MLPs, this is a sequence of affine maps from the input space \(\mathcal {X}\ni \textbf{x}\) to a sequence of spaces \(\mathcal {Z}_\alpha\) , interleaved with element-wise non-linear activations. Without any additional restrictions, these spaces are simply \(\mathbb {R}^{n_\alpha }\) , with \(n_{\alpha }\) the dimensionality of the \(\alpha\) -th representation. Denoting the input vector as \(\textbf{x}=\textbf{z}_0\) and the output as \(\tilde{y}=\textbf{z}_{n+1}\) , with n being the number of hidden layers, the mathematical form can be written as

\(\textbf{z}_\alpha = A\left( \textbf{w}_\alpha \textbf{z}_{\alpha -1} + \textbf{b}_\alpha \right) , \qquad \alpha = 1, \dots , n+1 ,\)

where \(\textbf{w}_\alpha\) is a \(n_{\alpha }\times n_{\alpha -1}\) weight matrix and \(\textbf{b}_\alpha\) is a \(n_{\alpha }\) -dimensional weight vector. A is an activation function applied to each element of the argument vector. As a composition of linear functions is itself a linear function, activation functions need to be non-linear to be able to capture non-linearities in the data. On the other hand, not every non-linear activation leads to the universal approximation property, although the requirements are not very restrictive, as we shall see in the following discussions.

The architecture described above is sequential, i.e., each layer feeds into the next, so that the final predicted output \(\tilde{y}(\textbf{x})\) is a composition of several functions. In cases where the quality of the approximation can be quantified with a differentiable metric between \(\tilde{y}\) and the true value y , one generally uses a gradient descent algorithm to optimise the parameters \(\textbf{w}_\alpha\) and \(\textbf{b}_\alpha\) , which are initialised to some random value before the training starts. Footnote 3 Due to the compositional nature, the application of the chain rule gives a handy way to update the parameters when utilising gradient descent, by what is known as the back-propagation algorithm [ 58 ]. In the following, we will denote the parameters of an MLP by \(\Theta\) , and the space to which they belong by \(\mathcal {W}\ni \Theta\) , without explicitly referring to the sequential structure of the functional mappings. The optimisation algorithm finds a point \(\Theta _0\in \mathcal {W}\) which best approximates the target function by reducing a metric between the approximated function \(\tilde{y}(\textbf{x})\) and y for all \((\textbf{x},y)\) in the training dataset.
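
As a concrete, purely illustrative sketch of the layer map and its sequential composition, the snippet below builds a small MLP in NumPy and evaluates a forward pass; the helper names (`init_mlp`, `widths`) are our own and not taken from any library discussed in this review.

```python
import numpy as np

def relu(z):
    # Element-wise non-linear activation A(.)
    return np.maximum(z, 0.0)

def init_mlp(widths, rng):
    # widths = [n_0, n_1, ..., n_L]: input, hidden, and output dimensionalities
    params = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        w = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_out, n_in))  # weight matrix w_alpha
        b = np.zeros(n_out)                                            # bias vector b_alpha
        params.append((w, b))
    return params

def forward(params, x):
    # Sequential composition z_alpha = A(w_alpha z_{alpha-1} + b_alpha)
    z = x
    for w, b in params[:-1]:
        z = relu(w @ z + b)
    w, b = params[-1]
    return w @ z + b  # final affine map, left linear for a regression-style output

rng = np.random.default_rng(0)
params = init_mlp([4, 32, 32, 1], rng)  # e.g. a single four-momentum as input
print(forward(params, np.array([100.0, 10.0, -20.0, 97.0])))
```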

2.2 Universal approximation theorems: a bird’s eye view

The range of functions that neural networks can approximate is encapsulated within the so-called universal approximation theorems (UATs) (see [ 59 , 60 , 61 , 62 ] for instance). These theorems investigate the class of functions that MLPs with a certain activation can approximate with a finite but arbitrary number of hidden nodes and/or layers. Most of them prove that the set of functions represented by a certain class of neural networks is dense in the set of bounded continuous functions on the n -dimensional hypercube.

The notion of dense sets in a larger superset is a generalisation of the properties of the rational numbers \(\mathbb {Q}\) and real numbers \(\mathbb {R}\) , with \(\mathbb {Q}\) being dense in \(\mathbb {R}\) . This means that for any \(r\in \mathbb {R}\) , we can find \(q\in \mathbb {Q}\) such that \(|r-q|<\epsilon\) for any arbitrarily small \(\epsilon >0\) . To generalise this notion to more intricate scenarios, one defines a metric, which measures a “ distance ” between two elements of the set. A dense set \(\mathcal {D}\) in a superset \(\mathcal {S}\) is one where, for every element \(s\in \mathcal {S}\) , we can find an element \(a\in \mathcal {D}\) which is arbitrarily close with respect to the metric, \(d(s,a)<\epsilon\) . For UATs, this is generally the metric induced by the supremum norm: \(\sup _{\textbf{x}\in D} |\hat{y}(\textbf{x})|\) , which is the least upper bound of the absolute value of the function \(\hat{y}\) in the domain D . For a parametrised function \(\tilde{y}(\textbf{x})=f(\Theta ,\textbf{x})\) and a target function \(\hat{y}\) , we have

\(d(\tilde{y},\hat{y})=\sup _{\textbf{x}\in D} |\tilde{y}(\textbf{x})-\hat{y}(\textbf{x})| .\)

Loosely speaking, it measures the largest difference between the two functions in the domain D . Therefore, \(\epsilon\) quantifies the highest absolute difference between the approximated function \(\tilde{y}\) and the target function \(\hat{y}\) .

The superset of interest for universal approximation is \(C(I_n)\) , the set of bounded continuous functions Footnote 4 from the closed unit hypercube \(I_n=[0,1]^n\) to \(\mathbb {R}\) . One of the first universal approximation theorems [ 60 ] proves that the set of functions represented by an MLP with one hidden layer and an arbitrary but finite number of nodes with sigmoid activation is dense in \(C(I_n)\) with respect to \(d(\tilde{y},\hat{y})\) . This means that for any given target function \(\hat{y}\in C(I_n)\) , there exists a function \(\tilde{y}(\textbf{x})=f(\Theta _0,\textbf{x})\) parametrised by an MLP with some finite number of nodes in the hidden layer, and a point \(\Theta _0\) in the parameter space, which is an arbitrarily close approximation of \(\hat{y}\) . As an existence theorem, it does not say how to find the particular point \(\Theta _0\) , nor does it restrict it to any subset of \(\mathbb {R}^N\) , N being the dimension of \(\mathcal {W}\) , which is the number of tunable parameters in the MLP. This dimension is controlled by the number of nodes in the hidden layer, and the theorems do not provide a particular strategy to find N for a target level of accuracy \(\epsilon\) . However, given the generality of the function space which the theorem covers, the formal existence of an arbitrarily close approximation to any bounded continuous function underpinned the modern interest in, and revival of, Artificial Neural Networks.
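
The flavour of the theorem (not its proof) can be seen numerically: fit a one-hidden-layer network with sigmoid activation to a smooth target on the unit interval and estimate the sup-norm distance on a dense grid. The sketch below assumes scikit-learn is available and is only meant as an illustration; the width and the target function are arbitrary choices of ours.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# A bounded continuous target function on [0, 1]
target = lambda x: np.sin(6 * np.pi * x) * np.exp(-x)

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, size=(2000, 1))
y_train = target(x_train).ravel()

# One hidden layer with sigmoid ("logistic") activation, as in the classic UAT setting
mlp = MLPRegressor(hidden_layer_sizes=(64,), activation="logistic",
                   solver="lbfgs", max_iter=5000, random_state=0)
mlp.fit(x_train, y_train)

# Estimate the sup-norm distance d(y_tilde, y_hat) on a dense grid
x_grid = np.linspace(0.0, 1.0, 10001).reshape(-1, 1)
sup_err = np.max(np.abs(mlp.predict(x_grid) - target(x_grid).ravel()))
print(f"estimated sup-norm error: {sup_err:.3f}")
```

Increasing the number of hidden nodes typically lowers the achievable error, but the theorem itself gives no recipe for choosing the width.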

Practically, an MLP with a single hidden layer is not a very efficient function approximator and falls within the broader class of shallow machine learning algorithms [ 63 ]. This is because, for highly fluctuating functions, one may need an exponentially large number of nodes compared to deeper architectures. It was known intuitively that depth increases the effectiveness of function approximation [ 63 ], which follows from a close analogy to the computational complexity of width vs depth [ 64 , 65 ] in circuits implementing boolean operations, as well as the known ability of biological brains to learn simpler concepts first and more abstract concepts later in life. Deeper networks are able to approximate topologically more complex functions [ 66 , 67 ] much more efficiently than shallow networks. Footnote 5 Intuitively, this can be understood [ 68 ] from the ability of the component functions in the overall compositional chain to induce topologically discontinuous changes in each mapping.

As the target function’s closed-form expression is not known, the approximation can only be as good as the quality and the amount of available data. In HEP, where synthetic data can be simulated based on well-understood theoretical formulations, the amount and quality of data is seldom a problem. Yet the very generality of the functions that dense architectures can approximate becomes their handicap. Efficient optimisation via gradient descent algorithms only finds the best point \(\Theta _0\) accessible from the initial point \(\Theta _i\) . However, domain knowledge from particle physics motivates additional restrictions on the functions beyond continuity. Such restrictions can be built into the architecture, which biases the functions to (strongly) always satisfy a particular property, or to (weakly) partially have some desired properties. We present a qualitative discussion of why such restrictions are needed in the next subsection.

2.3 Restrictions beyond continuity: no such thing as a free lunch

As hinted in the previous description, the existence of an arbitrarily close approximation via a dense neural network to any given continuous function does not necessarily translate into finding said approximation. Due to the non-constructive nature of the theorems, one often has to resort to heuristics and brute-force searches, in the form of hyperparameter Footnote 6 optimisation coupled with some form of gradient descent optimisation, to find a workable architecture and a particular set of parameters in the weight space. Although this is manageable for moderately large input dimensionality, it becomes intractable very quickly with an increasing number of dimensions. This is true even for deep fully-connected networks with more than two hidden layers. To counter this issue, additional informative priors, which restrict the function beyond continuity, are generally employed when one goes to very high-dimensional input spaces. In this subsection, we discuss the motivation and the need to assume these priors.

It is often enticing to think that, given the universal nature of dense neural networks, an equally universal algorithm could exist that would guarantee an efficient optimisation for all distributions. However, this is known to be untrue [ 69 , 70 , 71 , 72 , 73 ]: without any assumption about the underlying distribution, there is no algorithm which performs best in comparison to every other algorithm over the full range of possible functions. A particularly relevant exposition on the choice of learning algorithms and the inherent balance in their performance across all possible learning tasks can be found in Ref. [ 71 ], Footnote 7 where a conservation law for off-training generalisation error was proved over the full range of target functions for a fixed input distribution and a finite, fixed training size. Simply put, an algorithm performing well for one set of target functions will lose its performance and be worse than a random guess on a different set of target functions, so that the overall sum balances over the full space of target functions. Therefore, if one knows some properties of the target function beforehand, it is possible (perhaps mandatory) to devise algorithms or models which favour that particular form. This motivates building restrictions into the possible functions that a non-parametric model (neural networks in our case) can approximate.

The restriction of the possible functions beyond continuity can be achieved primarily in three ways: the data representation, the architecture of the network, and the loss function, including the choice of regulariser and the training procedure. The first two are related, although for a given data representation there can be competing choices of architecture based on their complexity. The last strategy can be regarded as independent of the first two, depending more on the nature of finding the optimal weights for a given data representation and architecture, like convex vs non-convex optimisation, or saddle-point finding vs extremum finding of the loss function. Strictly speaking, depth is also one such prior, which forces the learning of hierarchical abstractions of the input data relevant to the particular function that needs to be learnt at the output layer. However, with increasing input dimensionality, this alone is not enough to guarantee an efficient search for the target function, and one has to include domain-specific constraints in the architecture. For instance, the structure of convolutional neural networks, with local connectivity, parameter sharing, etc., is motivated by the fact that images tend to have local features which are relevant for (say) classification and which can be present anywhere in the image.
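
To give a sense of how strongly such priors constrain the function class, the toy comparison below (assuming PyTorch is available; the image size and channel counts are arbitrary) counts the trainable parameters of a convolutional layer against those of a fully-connected layer acting on the same flattened image.

```python
import torch.nn as nn

img_channels, height, width = 1, 64, 64

# Convolutional layer: local connectivity and parameter sharing across positions
conv = nn.Conv2d(in_channels=img_channels, out_channels=16, kernel_size=3, padding=1)

# Fully-connected layer producing a hidden representation of comparable size
dense = nn.Linear(img_channels * height * width, 16 * height * width)

n_params = lambda module: sum(p.numel() for p in module.parameters())
print(f"conv parameters : {n_params(conv):,}")   # 16*1*3*3 + 16 = 160
print(f"dense parameters: {n_params(dense):,}")  # 4096*65536 + 65536, roughly 2.7e8
```

The convolutional prior reduces the parameter count by some six orders of magnitude here, at the price of only searching over translation-compatible local feature maps.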

When the architecture is structured according to domain-specific priors, there is usually an additional fully-connected module which takes the extracted features and processes them for the final goal of the training. Therefore, the actual function that is approximated (by the downstream network) may not follow the implemented priors. The initial domain-specific modules are colloquially known as feature extractors, meaning they replace the feature-engineering stage which builds domain-specific features normally utilised as inputs to shallow machine learning methods. Although deep learning is a subset of machine learning algorithms, its distinction from shallow methods resides in its ability to find better functions than those relying on traditional hand-engineered features. There is a long history of the usage of shallow machine learning methods [ 74 , 75 ] in collider experiments and phenomenology (see Ref. [ 76 ] for a review). These methods utilise physics-motivated variable definitions constructed by processing the low-level high-dimensional data that one records at the detectors. Although dense feed-forward neural networks with more than one hidden layer, i.e., with no additional domain-specific design, qualify as deep-learning models, the introduction of domain-specific design is what sets apart the modern algorithms, which do not need extensively processed data. They can utilise the very high-dimensional low-level data and mostly outperform (depending on the suitability of the priors) shallow machine learning methods.

For LHC phenomenology, all three methods of restricting the target function are utilised, and they have been generally successful. However, due to the inherent nature of scientific research, where favourable outcomes get published, there is a selection bias which can sometimes lead one to the false belief that out-of-the-box deep learning algorithms always perform better than shallow methods acting on physics-intuitive observables devised by experts. For instance, see Refs. [ 77 , 78 ] for a more pragmatic standpoint on the comparison of Convolutional Neural Networks to substructure variables. Nevertheless, the wealth of prior physics knowledge also encourages one to utilise it as much as possible, while not being so restrictive as to cause a performance degradation. Although the performance can only be estimated a posteriori , it has been found that physics-inspired networks generally improve performance, interpretability, and training convergence. We will discuss these priors and their relation to the point cloud representation in the next section.

3 Broad priors at the energy frontier

The nature of collisions at high energies is described via the Standard Model (SM) of particle physics, a Quantum Field Theory of the fundamental particles based on the gauge group \(SU(3)_C\otimes SU(2)_L \otimes U(1)_Y\) . Consequently, investigating particle collisions at the LHC has an inextricable connection to the underlying formalism, especially for phenomenological searches for new physics and particles. The nature of high-energy collisions results in the suitability of the point cloud representation for analysing events at the LHC. However, restricting the possible set of functions that neural networks can approximate based on priors is not unique to physics, and most architectures in modern deep-learning usage inherently assume some form of symmetries or biases which help in the effective approximation for the particular domain. The underlying theme that unites the different deep learning models is the assumption of some form of geometry of the input data. With deep connections to functional analysis, differential geometry, and invariant theory, the study of how geometric priors within the architecture design restrict the function beyond continuity and result in efficient learning for various domains is collectively referred to as geometric deep learning [ 79 , 80 ]. Consequently, other than building physics-inspired biases into the architecture, existing architectures with their own domain-specific priors have proved to be advantageous for the physics programme at the LHC. In this section, we will discuss domain-specific priors resulting in the suitability of using the point cloud representation, as well as the compatibility of existing priors in the context of LHC phenomenology.

3.1 Physical symmetries

One of the most important priors resulting from physics knowledge is the existence of symmetries. When training a neural network for a particular purpose, without any additional restriction, the network does not know of already established symmetries like Lorentz invariance. These symmetries can be built into the architecture using group equivariant neural networks, where the hidden representations follow the group multiplication property and the maps between them commute with the group actions on the underlying spaces.

Given a group \(\mathcal {G}\) and a set \(\textbf{X}\) , the left-action of \(\mathcal {G}\) on \(\textbf{X}\) is a map \(A:\mathcal {G} \times \textbf{X} \rightarrow \textbf{X}\) with the following properties:

\(A(e,\textbf{x})=\textbf{x}\) for the identity element \(e\in \mathcal {G}\) and any \(\textbf{x}\in \textbf{X}\) , and

\(A(g_1,A(g_2,\textbf{x}))=A(g_1g_2,\textbf{x})\) for all \(g_1,g_2\in \mathcal {G}\) .

A map \(f:\textbf{X}\rightarrow \textbf{Y}\) from a set \(\textbf{X}\) to a set \(\textbf{Y}\) , both of which admit group actions, say \(\textbf{A}_\textbf{X}\) and \(\textbf{A}_\textbf{Y}\) respectively, for the group \(\mathcal {G}\) , is said to be \(\mathcal {G}\) -equivariant if f commutes with the group actions, i.e.,

\(f(\textbf{A}_\textbf{X}(g,\textbf{x}))=\textbf{A}_\textbf{Y}(g,f(\textbf{x}))\)

for every \(g\in \mathcal {G}\) and \(\textbf{x}\in \textbf{X}\) . The map f is \(\mathcal {G}\) -invariant if the action \(\textbf{A}_\textbf{Y}\) is trivial, i.e., \(\textbf{A}_\textbf{Y}(g,\textbf{y})=\textbf{y}\) for all \(g\in \mathcal {G}\) and any \(\textbf{y}\in \textbf{Y}\) .
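
These definitions can be checked numerically for a toy example. The sketch below, which is our own illustration rather than an architecture from the literature, uses planar rotations as the group: a radially rescaled map commutes with the rotation action (equivariance), while the Euclidean norm is unchanged by it (invariance).

```python
import numpy as np

def rotation(theta):
    # A matrix representation rho(g) of a planar rotation
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def f_equivariant(x):
    # Radial rescaling: f(R x) = R f(x) for any rotation R
    return x * np.tanh(np.linalg.norm(x))

def f_invariant(x):
    # The Euclidean norm: f(R x) = f(x)
    return np.linalg.norm(x)

rng = np.random.default_rng(1)
x = rng.normal(size=2)
R = rotation(rng.uniform(0.0, 2.0 * np.pi))

print(np.allclose(f_equivariant(R @ x), R @ f_equivariant(x)))  # True: equivariance
print(np.isclose(f_invariant(R @ x), f_invariant(x)))           # True: invariance
```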

Looking at the form of the group action, it is evident that it has a very close connection to group representations. Given a representation of the group \(\mathcal {G}\) in terms of invertible matrices of dimension \(n\times n\) , \(\rho : \mathcal {G}\rightarrow GL(n,\mathbb {R})\) , which preserves the group multiplication in terms of matrix multiplication (a group homomorphism), we can define an action on \(\mathbb {R}^n\) as

\(\textbf{A}(g,\textbf{x})=\rho (g)\,\textbf{x} ,\)

where \(\rho (g)\) are \(n\times n\) invertible matrices. Consequently, the Lorentz transformation

\(p_\mu \rightarrow \Lambda _{\mu }{}^{\nu }\, p_\nu\)

is a linear action of the Lorentz group on the set of physical four-vectors \(p_\mu\) . Similar considerations hold for the multitude of groups that one encounters in high-energy physics, although the underlying field may at times be complex.

Evidently, the use of function approximators which already take into account the group structure reduces the search space of possible target functions immensely. This may Footnote 8 result in an increase in sample efficiency (the number of training samples required to reach a particular level of performance), faster convergence (the number of epochs required to reach that particular accuracy), a reduced number of trainable parameters, or an increase in overall performance with respect to non-equivariant architectures. This gain depends on the suitability of the target function and the particular group equivariance that is implemented in the architecture that learns the target function. As a simplified example, take the binary classification problem of two resonant particles with different (but overlapping, due to their widths) masses, each decaying into two particles. The mass of the resonance is obviously a very important quantity in the discrimination, and a Lorentz-invariant feature extraction from the sum of the decay products’ four-vectors would expose this feature directly, while, say, Euclidean group invariance in eight dimensions would struggle to find the distinguishing feature. From a purely data-driven perspective, the four-vectors of the two particles are just some feature vector living in \(\mathbb {R}^8\) , and it is our physics intuition which provides the additional algebraic structure of the Lorentz group.
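
A minimal numerical sketch of this example: the invariant mass built from the summed four-momenta of the decay products is unchanged by a boost, so a Lorentz-invariant feature extractor would expose the discriminating resonance mass directly. The four-momenta and the boost below are arbitrary illustrative values, not taken from any analysis.

```python
import numpy as np

def minkowski_norm_sq(p):
    # p = (E, px, py, pz), metric signature (+, -, -, -)
    return p[0] ** 2 - np.sum(p[1:] ** 2)

def boost_z(beta):
    # A matrix representation of a Lorentz boost along the z axis
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    L = np.eye(4)
    L[0, 0] = L[3, 3] = gamma
    L[0, 3] = L[3, 0] = gamma * beta
    return L

# Two illustrative (massless) decay products, (E, px, py, pz) in GeV
p1 = np.array([60.0, 40.0, 30.0, np.sqrt(60.0**2 - 40.0**2 - 30.0**2)])
p2 = np.array([55.0, -35.0, -20.0, np.sqrt(55.0**2 - 35.0**2 - 20.0**2)])

m_lab = np.sqrt(minkowski_norm_sq(p1 + p2))
L = boost_z(0.6)
m_boosted = np.sqrt(minkowski_norm_sq(L @ p1 + L @ p2))
print(m_lab, m_boosted)  # identical up to floating-point error
```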

Equivariant neural networks have been instrumental in several domains, ranging from image processing [ 81 , 82 ] to the prediction of protein structures [ 83 ]. With the rich structure of symmetries in particle physics, it is no wonder that the community is exploring novel ways to utilise equivariant architectures [ 84 , 85 , 86 , 87 ] for various applications in theory and phenomenology.

3.2 Permutation invariance and variable cardinality

Another important aspect of studying collisions at the LHC is the inherent absence of order in the data. In event analysis, the reconstructed objects like jets, leptons, photons, etc., have four-vectors and additional information like flavour and charge, which together form an unordered collection. Similarly, in jet substructure analysis, the ordering of the constituents is irrelevant, while the discriminating information is found in the inter-particle correlations. Due to the non-conservation of particle multiplicity in relativistic quantum field theory, where highly energetic particles inevitably produce additional particles which share the momentum, the number of elements in the set is not constant in either case. Therefore, one needs to consider an input representation which acts on unordered sets of variable cardinality, i.e., to preserve the property of sets, a function needs to be permutation invariant with respect to the order of the inputs and be well-defined for a varying number of inputs.

Restricting events to an ordered and fixed-size representation may result in truncation and combinatorial ambiguities [ 88 ]. While we may be interested in a specific number of coloured objects at the parton level, due to their highly radiative nature it is often advantageous not to impose a hard restriction on the number of reconstructed jets, with additional jets arising from extra radiation. In other words, truncating an event signature to a fixed-size representation introduces ambiguities in how one should select this fixed number of objects, with the kinematics inextricably linked to those objects which are not utilised in the representation. Combinatorial ambiguities arise when a signature has more than one way of associating the reconstructed objects to an earlier parton. For instance, in the fully leptonic decay of a pair of top quarks, other than the inability to reconstruct the momenta of the two neutrinos, we also have to decide the allocation of the two bottom jets (as we cannot determine their charge) to the two reconstructed leptons. The situation becomes more complicated if we have more than two identified bottom jets in a candidate event.

Truncation ambiguities are much more pronounced in jet-substructure studies, where analogues of multiplicity are well-defined with more stringent theoretical considerations [ 89 , 90 , 91 ]. This difficulty is due to the resolution of QCD radiation patterns at much smaller angular scales within the jet, in regimes with considerable collinear enhancement. Combinatorial ambiguities can also arise for signals decaying into more than two particles, like the top quark, when one is interested in assigning a parton-level flavour to some of the constituent subjets.

3.3 Infrared and collinear safety: resilience to deformations

The presence of massless particles demands an additional consideration for perturbative calculations in QFT, namely, infrared and collinear (IRC) safety. This is one of the cornerstones of fixed-order calculability of any observable for theories containing massless gauge bosons, where there is an intricate cancellation of infrared divergences arising from virtual loops against the interference contribution of soft and collinear emissions. This cancellation is guaranteed to happen for IRC-safe observables, i.e., when an observable defined for \(n+1\) particles is equal (possibly in a limiting sense) to the same observable defined for n particles when any of the \(n+1\) particles becomes soft or any two of them become collinear. Therefore, IRC safety puts additional relations between the functions defined on sets whose cardinality differs by one.
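The following sketch (with hypothetical particles given as \((p_T,\eta ,\phi )\) triplets; not from the text) checks this definition numerically for a trivially safe and a trivially unsafe observable:

def pt_sum(particles):
    # IRC safe: unchanged when an emission becomes soft or exactly collinear
    return sum(pt for pt, eta, phi in particles)

def multiplicity(particles):
    # IRC unsafe: jumps by one for an arbitrarily soft emission
    return len(particles)

jet = [(100.0, 0.1, 0.2), (50.0, -0.2, 0.3)]
soft = jet + [(1e-6, 0.5, -0.4)]                                     # add a soft particle
collinear = [(60.0, 0.1, 0.2), (40.0, 0.1, 0.2), (50.0, -0.2, 0.3)]  # split the first particle collinearly

for obs in (pt_sum, multiplicity):
    print(obs.__name__, obs(jet), obs(soft), obs(collinear))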

In domains like image processing or 3D point-cloud processing, neural networks have some form of pooling or readout operation in a local region which propagates the summary statistic of the region forward. This results in the propagated information being approximately resilient to local deformations in the data, which mainly arise from noise. Due to the enhancement of radiation in the almost collinear or soft regimes, there is an inherent deformation of the data living in the rapidity-azimuth plane for the point-cloud or the image representation. Near-collinear emissions result in local distortions in a jet image, while soft emissions are inherently less prominent since the harder hits dominate the sum in the convolution operation. Therefore, Convolutional Neural Networks already take into account resilience to such soft and collinear emissions [ 92 ]. On the other hand, point-cloud architectures which take the three-dimensional features \((p_T,\Delta \eta ,\Delta \phi )\) of each particle as inputs to an MLP do not distinguish between the transverse momentum and the angular variables, and as such the extracted features are IRC unsafe. Nevertheless, the very general nature of the point-cloud representation allows defining IRC-safe feature extractors acting on sets of particles, as epitomised by Energy Flow Networks [ 93 , 94 , 95 , 96 , 97 ] along with their graph and hypergraph generalisations [ 98 , 99 , 100 , 101 , 102 ].
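A minimal sketch of the Energy Flow Network construction is given below (the per-particle map \(\Phi\) is a fixed stand-in for the learned MLP, and the inputs are hypothetical): features of the form \(\sum _i z_i\,\Phi (\eta _i,\phi _i)\) are unchanged under soft and collinear emissions by construction.

import numpy as np

def efn_features(particles, Phi):
    # IRC-safe observable: sum_i z_i * Phi(eta_i, phi_i), with z_i the pT fraction
    pts = np.array([pt for pt, _, _ in particles])
    z = pts / pts.sum()
    return sum(zi * Phi(eta, phi) for zi, (_, eta, phi) in zip(z, particles))

# Stand-in for the learned per-particle map acting only on angular coordinates
Phi = lambda eta, phi: np.array([np.sin(eta), np.cos(phi), eta * phi])

jet = [(100.0, 0.1, 0.2), (50.0, -0.2, 0.3)]
collinear = [(60.0, 0.1, 0.2), (40.0, 0.1, 0.2), (50.0, -0.2, 0.3)]
print(efn_features(jet, Phi), efn_features(collinear, Phi))  # identical outputs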

An important aspect of the local deformations induced by QCD radiation is their inevitability even under the assumption of perfect measurements, since they result from the underlying physical process rather than being induced by measurement errors. Therefore, IRC safety is a crucial element of any phenomenological analysis at hadron colliders. While the IRC-safe anti- \(k_t\)  [ 103 ] jet definition has been the main workhorse of physics analyses at the LHC, jet definitions utilised at the Tevatron were still IRC unsafe [ 104 , 105 ]. One should note that theoretical predictions even between IRC-safe jet definitions can differ systematically by 10% or more, as shown in Ref. [ 106 ], and without IRC safety we are at the mercy of poorly understood non-perturbative effects. As deep learning approaches to event analysis will presumably expand in the future, it is important to consider their theoretical aspects within perturbative QCD. In this regard, an algorithm’s IRC safety is an important requirement for its understanding, especially since IRC-safe deep learning algorithms do not restrict the expressivity to a very high degree for jet tagging. Footnote 9 However, IRC safety alone does not equate to the perturbative physics being under control, or to the absence of non-trivial non-perturbative effects, and additional considerations like the algorithm’s all-order effects [ 96 , 107 , 108 , 109 ] need to be properly addressed, preferably with little loss of generalisation capability. On the other hand, there are also classes of observables [ 110 , 111 ] that do not require fixed-order IRC safety but can be studied with more elaborate theoretical underpinnings. Notably, additional sources of deformation also arise from experimental measurements, as well as from the presence of pileup interactions.

3.4 Priors from image processing

Convolutional Neural Networks (CNNs) [ 112 , 113 ] remain the quintessential example of deep learning algorithms with very good generalisation abilities after curtailing the expressivity based on domain knowledge. On top of being approximately resilient to local deformations, as explained in the previous subsection, they also assume local connectivity and parameter sharing, which, when combined with successive composition of layers, results in scale separation. These biases are carried over in some sense into graph-based architectures, depending on specific details of the architecture, which we will touch upon in the next section. Here, we discuss the natural ways in which these three assumptions already account for the structure of QCD radiation patterns arising at the LHC.

Fig. 2 A diagrammatic representation of the effective region (shown as a solid green box) in the input image for a second convolution on the first feature map. The result of the convolution on the green region of the image with the filters (shown as hollow red squares) produces a part of the feature map shown as a solid red box. Similarly, a second convolution on this region of the feature map with different filters of the same size produces a region of a new feature map. Therefore, while the first filters look into the local region corresponding to their actual size, the second filters learn features in the image corresponding to a much larger area (determined by the relative size of the filters) in the original image

Local connectivity means that, for a given input pixel or node, the information processed by the network concentrates in its local neighbourhood. Compared to the image size, the smaller filters in a CNN act on local regions determined by their relative sizes. The collinear splitting structure in parton showers which results in the formation of jets is an inherently local phenomenon in the rapidity-azimuth plane. Therefore, the local nature of the feature extraction looks into the local substructure within jets, or into jets within events. Parameter sharing refers to reusing the same set of weights for different local regions in the input. In QCD, the universality of parton showers results in similar patterns of radiation over the full detector Footnote 10 in different positions, which motivates reusing the same set of weights to pick up such features. Separation of scales occurs via successive composition of functions acting on local regions in the input representation. Being a more refined version of depth to learn hierarchical abstractions of the data, the locally acting functions create a separation of scales, where the filters in the initial layers look at the immediate locality and deeper ones sequentially capture larger scales. This is diagrammatically explained in Fig.  2 . Consequently, hard prongs within jets or jets within events are present at different scales in the rapidity-azimuth plane, and a separation of scales helps look into these structures in a hierarchical manner. An example for a three-prong top jet is shown in Fig.  3 .
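The growth of the effective receptive field with depth, sketched in Fig.  2 , can be made quantitative with the standard receptive-field arithmetic (a small sketch with arbitrary kernel sizes and strides):

def receptive_field(kernel_sizes, strides):
    # Effective receptive field (in input pixels) of stacked convolutional layers
    rf, jump = 1, 1
    for k, s in zip(kernel_sizes, strides):
        rf += (k - 1) * jump
        jump *= s
    return rf

# Two stacked 3x3 convolutions already cover a 5x5 patch of the original jet image;
# each additional layer (and any stride) grows the scale probed by the deeper filters.
print(receptive_field([3, 3], [1, 1]))        # -> 5
print(receptive_field([3, 3, 3], [1, 2, 1]))  # -> 9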

Fig. 3 The hierarchical separation of scales encountered in a top jet image, with the little boxes denoting calorimeter hits. The red circles denote three subjets, which represent the first local features in the jet. On the left, the blue ovals denote the scale at which the two-prong structure becomes visible, while on the right the green dotted oval is the scale at which all three subjets become relevant

4 Graph based architectures

In the previous section, we discussed some pertinent characteristics of high-energy collisions at LHC that motivate a set-based representation of events. In this section, we elaborate on the set-based representation and architectures that preserve the set property, concentrating on the graph representation that endows relational structures between pairs of constituents of the set.

4.1 Functions on sets

The set-based representation, where the elements of the set represent some features of objects in an underlying metric space, is known as a point cloud. It originates from the set-based representation of discrete points in 3D coordinate space for use in various computer graphics and imaging applications, while being general enough to allow the elements to reside in more abstract spaces like those found in high-energy physics.

As we are interested in sets of finite but arbitrary size consisting of features living in an abstract space, say \(\mathcal {X}\) , we are generally interested in learning functions from the powerset Footnote 11 of \(\mathcal {X}\) to the output domain \(\mathcal {Y}\) . Such functions are required to be permutation invariant with respect to the input ordering to preserve the property of sets as unordered collections of objects. The universality of such functions was studied in the Deep Sets [ 114 ] framework, which defined set-based feature extractors that possess the property of permutation invariance or equivariance. For the invariant case, a Deep Sets model f is of the form

\(f(\{\textbf{h}_1,\textbf{h}_2,\dots ,\textbf{h}_M\})=\rho \left( \sum _{i=1}^{M}\phi (\textbf{h}_i)\right) \;,\)

where \(\textbf{h}_i\) are the set elements, M the cardinality of the set, and \(\rho\) and \(\phi\) are MLPs. We can see that the function \(\phi\) learns a per-object map to a latent space, which undergoes a permutation-invariant summation over the set constituents. Going into the two-part segregation of deep learning algorithms, the MLP \(\phi\) acts as a feature extractor while \(\rho\) gives the downstream output. The extracted features are combined in a permutation-invariant manner under the summation, which reflects the prior of set-based functions. Note that the summation can be replaced by any permutation-invariant operation, like taking the component-wise maximum or minimum over the latent representation.
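A minimal sketch of such an invariant Deep Sets model in PyTorch could look as follows (the dimensions and the zero-padding-plus-mask treatment of variable cardinality are illustrative assumptions):

import torch
import torch.nn as nn

class DeepSets(nn.Module):
    """f(S) = rho( sum_i phi(h_i) ): permutation invariant by construction."""

    def __init__(self, in_dim=4, latent_dim=32, out_dim=1):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(in_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, latent_dim), nn.ReLU())
        self.rho = nn.Sequential(nn.Linear(latent_dim, latent_dim), nn.ReLU(),
                                 nn.Linear(latent_dim, out_dim))

    def forward(self, h, mask):
        # h: (batch, max_particles, in_dim); mask flags the real (non-padded) particles
        per_particle = self.phi(h)                               # per-object map to the latent space
        pooled = (per_particle * mask.unsqueeze(-1)).sum(dim=1)  # permutation-invariant readout
        return self.rho(pooled)

model = DeepSets()
h = torch.randn(2, 10, 4)                 # two events, at most 10 particles, 4 features each
mask = (torch.rand(2, 10) > 0.3).float()  # variable cardinality via zero padding and a mask
print(model(h, mask).shape)               # -> torch.Size([2, 1])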

Although the Deep Sets algorithm is a powerful feature extractor, the feature extraction stage looks only at the individual elements of the set. The downstream MLP, on the other hand, has access to the aggregated global information of the entire set. Such an architecture does not exploit the relational information between the set’s constituents. To circumvent this issue, one endows the set with additional structure, like graphs, which expose pairwise relationships, or hypergraphs, which generalise graphs to account for any n -element relationships within the set.

4.2 Describing data with graphs

As we have described above, graphs equip the input set with relational information between pairs of elements. These elements are called nodes or vertices in graph terminology. The edge set \(\mathcal {E}\) is a subset of the Cartesian product \(\mathcal {S}\times \mathcal {S}\) . The presence or absence of an edge, say ( i ,  j ), in \(\mathcal {E}\) is determined by the usefulness of the relation between the two elements i and j . Consequently, the way a graph is constructed determines the information highlighted amongst the nodes. In LHC physics, these choices may include forming edges between geometrically close components in a detector module, a Feynman-diagram-based construction, or closeness in the rapidity-azimuth plane. If \(\mathcal {E}=\mathcal {S}\times \mathcal {S}\) , the corresponding graph is called a complete graph, in which the relation between every pair of elements is exposed. This is generally avoided as it leads to a quadratic scaling with the input set cardinality.

A graph can be represented as \(\mathcal {G}=(\mathcal {S},\mathcal {E})\) , where \(\mathcal {S}\) represents the set of nodes and \(\mathcal {E}\) represents the set of edges. Each element in the set is some feature vector \(\textbf{h}_i\in \mathbb {R}^d\) . However, the set \(\mathcal {S}\) may at times refer to the index set consisting of \(i\in \{1,2,\dots ,|\mathcal {S}|\}\) in the following discussions, which will be clear from context. The edge set \(\mathcal {E}\) , for computational purposes, can be suitably represented as an \(|\mathcal {S}|\times |\mathcal {S}|\) matrix \(\textbf{A}\) , with components

\(\textbf{A}_{ij}=1\) if \((i,j)\in \mathcal {E}\) , and \(\textbf{A}_{ij}=0\) otherwise,

which is called the adjacency matrix of the graph. Note that \(\textbf{A}\) assumes a particular ordering of the nodes, just as when we input the graph quantities to a computer in the form of an array. Although there can be complexity studies of graphs where the nodes are considered indistinguishable, the nodes in our case are distinguished by their feature vectors \(\textbf{h}_i\) . Moreover, the edges in our case will be directed, as the message functions (to be explained later) will generally be asymmetric under interchange of the two constituent nodes of an edge.
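As an illustrative sketch (with hypothetical \((\eta ,\phi )\) coordinates and ignoring the periodicity of \(\phi\) ), a graph can be built by connecting each constituent to its nearest neighbours in the rapidity-azimuth plane and stored either as a sparse edge list or as the dense adjacency matrix defined above:

import numpy as np

def knn_edges(eta_phi, k=1):
    # Directed edges (i, j) from each node i to its k nearest neighbours j
    dist = np.linalg.norm(eta_phi[:, None, :] - eta_phi[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    return [(i, int(j)) for i in range(len(eta_phi)) for j in np.argsort(dist[i])[:k]]

def adjacency(n, edges):
    # Dense representation: A[i, j] = 1 iff the directed edge (i, j) exists
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = 1
    return A

constituents = np.array([[0.10, 0.20], [0.15, 0.25], [1.00, -1.20], [0.90, -1.00]])
edges = knn_edges(constituents, k=1)
print(edges)
print(adjacency(len(constituents), edges))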

The addition of edges as relational pairs requires the conservation of the overall structure of the edge set for any permutation of the nodes. This is a stricter requirement than permutation invariance of the nodes, and is formally covered within the notion of graph isomorphisms . Any two graphs \(\mathcal {G}=(\mathcal {S},\mathcal {E})\) and \(\mathcal {G}'=(\mathcal {S}',\mathcal {E}')\) are isomorphic to each other if we can find an invertible function \(N:\mathcal {S}\rightarrow \mathcal {S}'\) between the node sets which induces an invertible map \(E:\mathcal {E}\rightarrow \mathcal {E}'\) between the edge sets. As a permutation of the elements of \(\mathcal {S}\) can be viewed as a bijection of the set to itself, graph isomorphisms additionally require the conservation of the edge structure for any permutation of the nodes. Although the notion of graph isomorphism is often equated with being oblivious to relabelling of the nodes, our case is not as straightforward, as the node features are immutable intrinsic labels of the nodes. On top of these node features, we assign an index to the array containing each node’s features and construct the edges as a two-dimensional array of size \(2\times |\mathcal {E}|\) in the sparse representation, or as the adjacency matrix of size \(|\mathcal {S}|\times |\mathcal {S}|\) . Graph isomorphisms then practically relate to how the node features are arranged in this representation, with the graph property intact as long as we work with quantities which are invariant under different orderings of the node features.

Fig. 4 Examples of isomorphic and non-isomorphic directed graphs with nodes distinguished by their colours. The graphs in the bounded blue box are isomorphic to one another, while the three on the right are not isomorphic to any other graph. The two graphs within the red boxes become isomorphic to the blue ones when one forgoes the difference in the colours and the direction of the edges

A representation of different isomorphic and non-isomorphic graphs with three coloured nodes is shown in Fig.  4 . The colour in our case represents the node features, which are intrinsic to that particular node. Therefore, one has to take this intrinsic character into account when defining graph isomorphisms. All graphs within the blue bounded box are isomorphic to each other, as one can easily define invertible maps between the node labels which preserve the underlying directed connections between the coloured nodes. We have intentionally ordered the nodes in a straight line and numbered them with increasing natural numbers (except for the one on the top-left) to make the analogy to a matrix-based representation, where the numbers denote the index, evident. On the right, the three graphs are not isomorphic to each other, or to any of the ones in the blue box, when considering the colours and edge directions. As there can be no bijections between node sets or edge sets of different sizes, the graph in the yellow box at the bottom can never be isomorphic to any other graph in the figure, even when relaxing the additional constraints of colourings and directions. However, the graph on the top right is isomorphic to the ones in the blue box if one forgoes the colourings but keeps the edge directions. This is not the case for the one in the middle, since its green node has two outgoing edges, which is not the case in the other graphs. Finally, all graphs except for the one in the yellow box are isomorphic to each other if one forgoes both the colourings and the edge directions.

We now explain some terminology which will be used in the following discussion of graph neural networks. The neighbourhood of a node i , written as \(\mathcal {N}(i)\) , is the set of all nodes connected to i . For directed graphs, one usually regards either the incoming or the outgoing edges (but not both) to define the neighbourhood. For instance, in the isomorphic graphs within the blue box in Fig.  4 , the neighbourhood of the red node consists of the blue node if one considers incoming edges, while for outgoing connections it is the green node alone. The l -hop neighbourhood of a node i consists of all nodes which can be reached from the node i by following at most l edges. Like before, the direction should be consistently defined for directed graphs, and \(l=1\) recovers the usual neighbourhood definition. In Fig.  4 , considering outgoing edges, the blue node’s 2-hop neighbourhood consists of the green node in addition to the red one. For our discussion of GNNs, we will consider the neighbourhood \(\mathcal {N}(i)\) to be the set of nodes with incoming connections to i . Accordingly, we will denote any edge attribute connecting two nodes with indices i and j with the subscript \(i\leftarrow j\) to make the direction self-evident.

4.3 Graph neural networks

In order to extract features from a graph for graph-level purposes, we need an architecture which extracts features that are invariant with respect to graph isomorphisms, Footnote 12 i.e., the extracted features are identical for any two isomorphic graphs. For node- or edge-level applications, where one uses their features rather than the global graph representation, the output of the network should depend only on the intrinsic features and not on the index of the features, i.e., for any two isomorphic graphs we can obtain an invertible map between the node sets which preserves the updated features of the graph components.

One of the first proposals of Graph Neural Networks [ 115 , 116 ] (GNNs) for general graphs compatible with back-propagation optimisation devised the concept of states of nodes and edges, adapting their usage in recursive neural networks [ 117 , 118 ], which act on directed acyclic Footnote 13 graphs. The essential quality which conserved the graph properties consisted of recursively updating the states in successive timesteps with a shared parametric model like an MLP. More recent generalisations include the Message Passing Neural Network (MPNN) [ 119 ] approach and the more general Graph Networks [ 120 ]. Our discussion will concentrate on the more popular MPNNs.

An MPNN consists of several message-passing operations that take in a graph \(\mathcal {G}=(\mathcal {S},\mathcal {E})\) with node features \(\textbf{h}_i\in \mathcal {S}\) , and optionally edge features \(\textbf{e}_{i\leftarrow j}\) for all \((i,j)\in \mathcal {E}\) , and update the node features via the following steps:

Message passing: A learnable function (generally an MLP) \(\phi _e\) , called the message function, takes the node features \(\textbf{h}_i\) and \(\textbf{h}_j\) for each edge \((i, j)\in \mathcal {E}\) , and optionally the edge feature \(\textbf{e}_{i\leftarrow j}\) , and evaluates the message as

\(\textbf{m}_{i\leftarrow j}=\phi _e(\textbf{h}_i,\textbf{h}_j,\textbf{e}_{i\leftarrow j})\;.\)

Message Aggregation: For each node i , we aggregate the messages \(\textbf{m}_{i\leftarrow j}\) for all nodes j in the neighbourhood \(\mathcal {N}(i)\) with a permutation-invariant function \(\Box _l\) ,

\(\textbf{m}_{i}=\Box _l\left( \{\textbf{m}_{i\leftarrow j}\,|\,j\in \mathcal {N}(i)\}\right) \;.\)

Node update: The node features are updated to \(\textbf{h}'_i\) as a function \(\phi _h\) (which can also be an MLP) of the aggregated message \(\textbf{m}_i\) and the input node feature \(\textbf{h}_i\) as

\(\textbf{h}'_i=\phi _h(\textbf{h}_i,\textbf{m}_i)\;.\)

The number of message-passing operations is fixed for a particular model, and one uses the messages \(\textbf{m}_{i\leftarrow j}\) or the node features \(\textbf{h}_i\) as inputs to an MLP for edge- and node-specific tasks, respectively. For graph-level purposes, we construct the graph representation \(\textbf{G}\) by a readout operation on the node features with an analogous global readout function \(\Box _g\) as \(\textbf{G}=\Box _g(\{ \textbf{h}_i | i\in \mathcal {S} \})\) .
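A single message-passing step of this form can be sketched as follows (a minimal PyTorch illustration with summed aggregation; the dimensions and the edge-index convention are assumptions of the sketch):

import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """One MPNN step: message, permutation-invariant (summed) aggregation, node update."""

    def __init__(self, node_dim=8, hidden_dim=32):
        super().__init__()
        self.phi_e = nn.Sequential(nn.Linear(2 * node_dim, hidden_dim), nn.ReLU(),
                                   nn.Linear(hidden_dim, hidden_dim))
        self.phi_h = nn.Sequential(nn.Linear(node_dim + hidden_dim, hidden_dim), nn.ReLU(),
                                   nn.Linear(hidden_dim, node_dim))

    def forward(self, h, edge_index):
        # edge_index: (2, |E|) tensor of (receiver i, sender j) pairs for directed edges i <- j
        recv, send = edge_index
        messages = self.phi_e(torch.cat([h[recv], h[send]], dim=-1))  # m_{i<-j}
        agg = torch.zeros(h.size(0), messages.size(-1))
        agg.index_add_(0, recv, messages)                             # sum over the neighbourhood N(i)
        return self.phi_h(torch.cat([h, agg], dim=-1))                # h'_i

h = torch.randn(5, 8)                                    # five nodes with 8 features each
edge_index = torch.tensor([[0, 0, 1, 2], [1, 2, 0, 3]])  # receivers on top, senders below
print(MessagePassingLayer()(h, edge_index).shape)        # -> torch.Size([5, 8])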

It is now evident that an MPNN’s local connectivity is determined by the graph’s structure, with the first message-passing operation looking at the immediate neighbourhood and successive ones looking at larger and larger l -hop neighbourhoods. The parameter sharing (the message function or the node-update MLP is the same for all edges and nodes) and the message-aggregation stage bring forward the biases discussed for CNNs to GNNs. However, while the pooling operations downsample the image representation, thereby preventing redundant information from propagating deeper into the architecture, the readout operation does not reduce the cardinality of the node set, and there is a much faster build-up of redundant information amongst the node representations as one goes deeper. This hugely restricts the practical viability of deeper MPNNs for small graphs, as the node features become increasingly similar with more message-passing operations.

The GN formalism additionally consists of edge updates and global feature updates; however, most common GNNs utilised in LHC phenomenology can be explained as a specific form of the MPNN approach. For instance, the message function in edge-convolutions [ 121 ] is of the form

\(\textbf{m}_{i\leftarrow j}=\phi _e\left( \textbf{h}_i\oplus (\textbf{h}_j-\textbf{h}_i)\right) \;,\)

with the node updated via \(\textbf{h}'_i=\textbf{m}_i\) after a choice of permutation-invariant aggregation. The operation \(\oplus\) denotes concatenation of the two vectors.

Another important operation is the attention mechanism, which currently gives state-of-the-art performance on the public top-tagging dataset. Although attention mechanisms originated [ 122 ] as a way to circumvent the loss of information in Recurrent Neural Networks for natural language processing (NLP), the Transformer model [ 123 ], which solved the bottleneck for efficient training of sequential language models by utilising attention alone while forgoing the recurrence, also achieved state-of-the-art performance for various NLP tasks and opened the proverbial “Pandora’s box” of Large Language Models [ 124 , 125 , 126 , 127 ]. While these exciting developments are outside the scope of the current review, the attention mechanism, being a set operation, is applicable to point clouds [ 128 , 129 ]. The message function for any general attention mechanism can be written as

\(\textbf{m}_{i\leftarrow j}=\textbf{w}(\textbf{h}_i,\textbf{h}_j)\odot \alpha (\textbf{h}_j)\;,\)

for some learnable function \(\alpha\) and attention mechanism \(\textbf{w}\) , whose output dimension should either be a scalar or a vector with the same dimensions as that of \(\alpha (\textbf{h}_j)\) for component-wise multiplication. The learnable function \(\textbf{w}(\textbf{h}_i,\textbf{h}_j)\) denotes the attention of the node j with respect to the source node i , and can be cast in different ways according to the particular attention mechanism. For the interpretation of \(\textbf{w}(\textbf{h}_i,\textbf{h}_j)\) as weights in the summed aggregation \(\textbf{h}'_i=\sum _{j\in \mathcal {N}(i)} \textbf{m}_{i\leftarrow j}\) , the requirement \(\sum _{j\in \mathcal {N}(i)} \textbf{w}(\textbf{h}_i,\textbf{h}_j)=1\) is achieved by a softmax normalisation over the j index. Due to this learnable importance of the different nodes, attention-based networks often employ a complete graph structure, where the job is left to the attention mechanism to learn the respective inter-node importance.
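A single-head sketch of such an attention-weighted aggregation over a complete graph is given below (hypothetical dimensions; not a specific published architecture):

import torch
import torch.nn as nn
import torch.nn.functional as F

d, n = 16, 6
alpha = nn.Linear(d, d)    # value map alpha(h_j)
att = nn.Linear(2 * d, 1)  # scalar attention score for the pair (h_i, h_j)

h = torch.randn(n, d)
pairs = torch.cat([h.unsqueeze(1).expand(n, n, d),   # h_i broadcast along j
                   h.unsqueeze(0).expand(n, n, d)],  # h_j broadcast along i
                  dim=-1)
w = F.softmax(att(pairs).squeeze(-1), dim=1)  # softmax normalisation over the neighbour index j
h_new = w @ alpha(h)                          # h'_i = sum_j w(h_i, h_j) alpha(h_j)
print(h_new.shape)                            # -> torch.Size([6, 16])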

5 Overview of applications in high energy physics

Graph Neural Networks in their different guises have been explored in a wide range of applications for the physics program at the Large Hadron Collider. There can be several ways to categorise these applications, based on the learning algorithm, the prediction task, and the physics application. The categories of learning methods broadly fall under unsupervised, semi-supervised and supervised methods, while the prediction tasks can be segregated into node-, edge- and graph-level tasks. In this section, we give a brief overview of the applications of graph-based architectures to some physics use cases at the LHC.

Jet Classification: The prototypical example of deep learning applications is the classification of large-radius QCD jets [ 130 ] from boosted hadronic decays of heavy particles, and various point-cloud architectures have been studied for jet classification. Carrying forward the point-cloud representation, ParticleNet [ 131 ], which utilized dynamic graph convolutions, presented an early application of this representation and showed high performance in top/QCD and quark/gluon tagging. Interaction-network-based methods [ 132 , 133 ] were also explored for tagging 2-prong and 3-prong boosted jets. Attention-based GNNs [ 134 ] were explored for quark-gluon tagging applications. The point transformer [ 129 ] architecture has also been explored [ 135 ] for discriminating quark/gluon and top/QCD jets. Chebyshev graph convolutions [ 136 ], which enable the model to capture local dependencies intrinsic to jet formation, have also been studied for jet tagging [ 137 ]. Haar-pooling message-passing networks [ 138 , 139 ] have also been shown to improve tagging performance compared to the usual readout operations. To date, the Particle Transformer [ 140 ], based on a modified multi-head attention mechanism [ 123 , 141 ], shows state-of-the-art performance on the public top-tagging dataset [ 142 ].

As already explained in Sect.  3.1 , physical symmetry has been used in graph-based architectures. The Lorentz Group Network (LGN) [ 86 ] utilises the isomorphism of the proper orthochronous Lorentz group SO \((1, 3)^+\) to the projective special complex linear group \(\text {PSL}(2, \mathbb {C})\) to build tensorial representations decomposed into irreducible representations in terms of the Clebsch-Gordan coefficients, while LorentzNet [ 143 , 144 ] implemented a simplified scalar and vector representation in the hidden layers to extract the features. Later, PELICAN [ 145 , 146 ] combined permutation- and Lorentz-equivariant aggregations. On the public top-tagging dataset, these architectures matched or outperformed ParticleNet with fewer trainable parameters. Meanwhile, LorentzNet showed a very high sample efficiency, reaching an AUC of 0.9839 with only \(5\%\) of the training data. LorentzNet was also augmented with a capsule network [ 147 ], showing improved quark-gluon tagging performance. The gain in performance comes because the feature extraction specifically utilises the physical symmetry of the dataset.

When extracting features from a jet’s constituents with deep-learning models, the automatically extracted features are not necessarily IRC safe; their IRC safety depends not only on the input features but also on how they are structured. Constructing infrared- and collinear-safe feature extractors on point clouds has also received considerable effort. Energy Flow Networks [ 93 ] (EFNs) adapted the Deep Sets [ 114 ] framework to account for IRC safety by utilising per-particle maps of the angular coordinates and performing a summed readout over the particles after linearly weighting with energy. Extensions of this approach include a permutation-equivariant EFN [ 94 ], building hierarchical EFNs by utilising a basis of Legendre polynomials [ 95 ], Lipschitz-EFNs [ 96 ], where the extracted features are made to follow the Lipschitz continuity condition motivated by the geometry of particle collisions [ 148 ], and utilising higher moments [ 97 ]. The per-particle nature of EFN feature extraction has been generalised to graphs in the framework of Energy-weighted Message Passing Networks [ 98 ] for supervised classification. Further extensions include hypergraph-based feature extraction [ 100 ], and equivariance in the rapidity-azimuth plane [ 101 , 102 ]. An architecture based on dynamic edge convolutions was also studied for tagging jets originating from dark showers [ 149 ].

Another theoretically motivated way to represent a jet as a graph is through the Lund tree declustering [ 150 , 151 ], which maps emissions according to the jet clustering history. LundNet [ 152 ] employed Lund graphs as inputs for various binary jet classification tasks. Due to the theoretically transparent nature of the input construction, it allows more detailed semi-analytic studies [ 153 ] of its discriminative power than is possible with the other graph-based approaches.

Discovering New Physics via Event Classification:

Graph Neural Networks (GNNs) can categorize particle collision events into different classes, distinguishing between signal and background events. Leveraging the inherent graph structure of particle interactions, GNNs are adept at capturing nuanced characteristics that conventional approaches may overlook, thereby improving the accuracy of event classification. GNNs are advantageous for the event classification task on LHC data as they overcome the shortcomings of CNNs, which are limited to the Euclidean domain. An event at a collider can be naturally represented as a graph, where the final-state particles are the nodes and their interactions are encoded by the edges.

In phenomenological studies, both fully connected graphs and physics-inspired, topology-based graphs have been explored for event classification. Physics-inspired topology-based graphs are an efficient approach to capturing complex data relationships while minimizing NN parameters. In a multiclass set-up for semileptonic \(t\bar{t}\) final states, a GNN was used to discriminate between thirteen independent Wilson coefficients [ 154 ]. The decay-topology-inspired graph is shown in Fig.  5 . To construct the graph, final-state reconstructed particles are first added as nodes, and additional nodes are added to the topology if the parent of the decay products can be reconstructed. On the other hand, fully connected graphs are also advantageous in the case of an unknown underlying physics topology or if fewer final-state particles are involved in an event.

Graphs can have varying numbers of nodes and edges for each event, unlike the fixed-size inputs of typical CNNs. Therefore, GNNs are adaptable to different event topologies and can generalize well across various types of particle collisions. In an early study [ 155 ], collision events were represented as event graphs, and a message-passing neural network (MPNN) was used to search for stop pair production at the LHC. GNN-based particle-flow (PF) reconstruction in a high-pileup environment has been developed for a general-purpose multilayered particle detector [ 156 ]. In Ref. [ 157 ], a GNN with a fully connected graph is used to suppress the large backgrounds arising from other SM processes, which improves the sensitivity of the di-Higgs analysis in the vector-boson-fusion production channel. Similar architectures are used to improve the four-top measurement in the SM [ 158 ] and to constrain BSM parameters in the context of the 2HDM. The fully connected graph and architecture used in the four-top analysis are illustrated in Fig.  6 . Self-attention to extract features from jet substructure, combined with a cross-attention mechanism to combine these features for event-level classification, has also been studied [ 159 ] for BSM searches at the LHC, showing improved performance compared to a simple concatenation of features. Accurate measurement of the trilinear and quartic Higgs self-couplings is essential to understanding the shape of the Higgs potential and the electroweak phase transition. While traditional searches at the LHC using Higgs pair production provide limited constraints, recent studies using Graph Neural Networks (GNNs) have shown promise in improving these bounds [ 160 ]. Topological graph algorithms were also developed to reconstruct intermediate particles and address combinatorial challenges in fully hadronic \(t\bar{t}\) decays [ 161 ]. Recently, hypergraph representations [ 162 ] have been employed to demonstrate the reconstruction of parent particles for fully hadronic decays of \(t\bar{t}\) . Events represented by weighted nodes, with edges depicting the distance between events in kinematic space, were utilised [ 163 ] to separate stop-quark pairs from top pairs. In Refs. [ 164 , 165 ], the \(t\bar{t}X\) process is scrutinized with Transformer- and graph-based event classification. Using the point-cloud representation for Higgs boson decays to tau leptons yields significant improvements compared to classical analysis techniques [ 88 ].

Fig. 5 Example of a decay-topology-inspired graph (figure taken from Ref. [ 154 ])

Fig. 6 Example of a fully connected event graph (figure taken from Ref. [ 158 ])

Anomaly Detection:

Even though there are many theoretically motivated extensions of the Standard Model, the search for new physics can be regarded as an inherently model-independent endeavour, as nature need not fit into the constraints of our imagination. This motivates developing algorithms that do not rely on a specific BSM model but instead try to find deviations from the known background SM processes. General techniques appropriate for such model-independent searches exist in the ML literature under the name of anomaly detection, wherein an algorithm focuses on learning the features of a well-understood class (the background in our case) and identifying anomalous data samples that do not share the same features.

A common way to achieve this unsupervised learning is via autoencoders. An autoencoder is an ANN used in unsupervised learning that aims to encode input data into a reduced-dimensional representation (called the latent representation) and then decode it back to its original form. The model learns by comparing the reconstructed output with the original input and tuning its parameters to minimize the reconstruction error. The potential of autoencoder architectures has been demonstrated in modelling realistic and diverse aspects of high-energy physics events. For instance, convolutional autoencoders were studied for detecting anomalous jet images in Refs.  [ 166 , 167 ].
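A minimal sketch of the idea (with hypothetical dimensions and random inputs standing in for background events) is shown below; after training on background only, the per-event reconstruction error serves as the anomaly score:

import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, in_dim=16, latent_dim=3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 8), nn.ReLU(), nn.Linear(8, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 8), nn.ReLU(), nn.Linear(8, in_dim))

    def forward(self, x):
        return self.decoder(self.encoder(x))  # reconstruct from the latent representation

model = AutoEncoder()
x = torch.randn(4, 16)                              # a batch of (background-like) event features
anomaly_score = ((model(x) - x) ** 2).mean(dim=-1)  # per-event reconstruction error
print(anomaly_score.shape)                          # -> torch.Size([4])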

GNNs rely on understanding the graph’s topology and node connections to extract meaningful representations. Capturing this information accurately during the encoding and decoding phases to reconstruct the original graph structure is non-trivial. The Particle Graph Autoencoder [ 22 , 168 ] used an edge-convolution encoder block to map the node features to a two-dimensional latent node representation, which is fed to a symmetrical decoder block to reconstruct the node features. For anomaly detection of events or jets, we are interested in the global features of the graph. Although a graph readout extracts this information in classification, the operation itself is destructive, and one cannot recover the graph structure in an autoencoder-based approach, where the decoder needs the graph structure intact for its reconstruction. To address these challenges in graph autoencoders, Ref. [ 169 ] devised edge-reconstruction networks, which were used to reconstruct weighted edge features, thereby learning the graph structure without undergoing a destructive graph readout operation. The Deep Sets autoencoder [ 170 ], on the other hand, relied solely on the encoded global latent space without using any decoder. Graph autoencoders with physics-motivated inductive biases like Lorentz group equivariance [ 171 ] and IRC safety [ 99 ] have also been explored for anomaly detection.

6 Summary

In this review, we have laid down the rationale for designing automatic feature extractors for high-dimensional data in the context of high-energy physics. The main theme of such strategies is the balance between expressivity and generalisation power. While expressivity deals with the theoretical power of non-parametric models to capture a large set of functions accurately, generalisation power trades some of this expressivity for the practical gain of finding a good function for specific purposes based on prior knowledge of the use case.

The wealth of fundamental understanding of particle physics motivates their use in designing automatic feature extractors for various applications at the Large Hadron Collider. One recurrent theme from prior physics intuition is the suitability of the set-based point cloud and graph representation of collision events. Among different deep learning frameworks, graph neural networks possess a unique ability to learn from relational data. They are also adaptable to versatile data structures, capturing the essence of graphs characterised by node features and their relationships (features related to connected edges) along with graph features, if any. At the heart of a GNN lies a clever mechanism called message passing. Nodes exchange information with their neighbours, iteratively aggregating and updating their representations. Such operations allow a node to understand its self-attributes in light of the context provided by its connected neighbours, capturing complex relationships within the graph.

It is no overstatement to say that mainstream scientific inquiry is in a data-driven era, propelled by the development of powerful and data-hungry deep-learning algorithms. However, from the perspective of fundamental fields like particle physics, the main objective is to update our understanding of the universe based on empirical evidence. Focusing on performance alone would be detrimental in the long term, and the common consensus in the community is to understand these algorithms in sufficient detail. Removing this opacity is not just of academic interest but rather a practical requirement arising from the infamous mathematical complexity of such algorithms. The (potential) discovery of new physics at the Large Hadron Collider is an extreme statistical situation, and one needs to understand the algorithms’ behaviour in such conditions concretely.

Data availability statement

No data are associated with the manuscript.

Note that what we consider raw data here is still highly processed when one considers the intricacies of experimental measurements like track finding, pileup subtraction etc.

Without loss of generality, we will consider a scalar function.

For more details of how the process of initialisation affects the network performance, see for instance Refs. [ 55 , 56 , 57 ].

Although there are larger function spaces, like the space of Lebesgue-integrable functions \(L_1(I_n)\) , and the possibility of UATs in them, we will restrict ourselves to \(C(I_n)\) , as finding effective training strategies for discontinuous functions is extremely difficult for the large dimensionality of modern neural networks.

In Ref. [ 66 ], it is found that the sum of Betti numbers (a measure of topological complexity) of the subset of the domain of the function with positive outputs for a single output network scales at most as a polynomial in the number of nodes for several activation functions for a shallow network while it can scale exponentially for deeper networks.

The different choices given a base architecture like the activation function, optimisation method, learning rate, the number of hidden layers, the number of nodes in these layers etc., are known as hyperparameters.

Also, see Ref. [ 63 ] for a discussion on the quest for artificial intelligence and its interplay with priors and these no-free-lunch theorems.

As the dimensionality of the input space is still very large, it is hard to ascertain the improvements a priori.

If the information of interest for a particular training objective is in the high-energy patterns in an event, one would intuitively expect an IRC-safe algorithm to be relatively comparable to unsafe ones, as IRC-safe observables concentrate on the hard energy flow of the event.

Even though pileup breaks this generality, it remains true after pileup removal, after which phenomenological analyses are conducted on events following such patterns.

If the cardinality of \(\mathcal {X}\) is infinite, we are interested in the elements of the powerset which have bounded cardinality.

Technically, this means that the graph representation is defined for an equivalence class of isomorphic graphs rather than for each individual graph in the class.

An acyclic graph is one which contains no closed loops, i.e., no path through distinct nodes that returns to its starting node.

G. Carleo, I. Cirac, K. Cranmer, L. Daudet, M. Schuld, N. Tishby, L. Vogt-Maranto, L. Zdeborová, Machine learning and the physical sciences. Rev. Mod. Phys. 91 , 045002 (2019). https://doi.org/10.1103/RevModPhys.91.045002

J. Elmsheuser, Evolution of the ATLAS analysis model for Run-3 and prospects for HL-LHC. EPJ Web Conf. 245 , 06014 (2020). https://doi.org/10.1051/epjconf/202024506014

T. Aarrestad, HL-LHC Computing Review: Common Tools and Community Software. In: Canal, P., (eds.) Snowmass 2021 (2020). https://doi.org/10.5281/zenodo.4009114

A. Held, E. Kauffman, O. Shadura, A. Wightman, Physics analysis for the HL-LHC: concepts and pipelines in practice with the Analysis Grand Challenge. In: 26th International Conference on Computing in High Energy & Nuclear Physics (2024)

J. Motta, Overview of the HL-LHC Upgrade for the CMS Level-1 Trigger. PoS EPS-HEP2023, 534 (2024) https://doi.org/10.22323/1.449.0534

T.Q. Nguyen, D. Weitekamp, D. Anderson, R. Castello, O. Cerri, M. Pierini, M. Spiropulu, J.-R. Vlimant, Topology classification with deep learning to improve real-time event selection at the LHC. Comput. Softw. Big Sci. 3 (1), 12 (2019). https://doi.org/10.1007/s41781-019-0028-1 . arXiv:1807.00083 [hep-ex]

J. Duarte, Fast inference of deep neural networks in FPGAs for particle physics. JINST 13 (07), 07027 (2018). https://doi.org/10.1088/1748-0221/13/07/P07027 . arXiv:1804.06913 [physics.ins-det]

A. Butter, S. Diefenbacher, G. Kasieczka, B. Nachman, T. Plehn, D. Shih, R. Winterhalder, Ephemeral Learning - Augmenting Triggers with Online-Trained Normalizing Flows. SciPost Phys. 13 (4), 087 (2022). https://doi.org/10.21468/SciPostPhys.13.4.087 . arXiv:2202.09375 [hep-ph]

G. Bortolato, M. Cepeda, J. Heikkilä, B. Huber, E. Leutgeb, D. Rabady, H. Sakulin, Design and implementation of neural network based conditions for the CMS Level-1 Global Trigger upgrade for the HL-LHC. JINST 19 (03), 03019 (2024). https://doi.org/10.1088/1748-0221/19/03/C03019

M. Paganini, L. Oliveira, B. Nachman, Accelerating Science with Generative Adversarial Networks: An Application to 3D Particle Showers in Multilayer Calorimeters. Phys. Rev. Lett. 120 (4), 042003 (2018). https://doi.org/10.1103/PhysRevLett.120.042003 . arXiv:1705.02355 [hep-ex]

S. Farrell, Novel deep learning methods for track reconstruction. In: 4th International Workshop Connecting The Dots 2018 (2018)

S.R. Qasim, J. Kieseler, Y. Iiyama, M. Pierini, Learning representations of irregular particle-detector geometry with distance-weighted graph networks. Eur. Phys. J. C 79 (7), 608 (2019). https://doi.org/10.1140/epjc/s10052-019-7113-9 . arXiv:1902.07987 [physics.data-an]

Fast simulation of the ATLAS calorimeter system with Generative Adversarial Networks. Technical Report ATL-SOFT-PUB-2020-006, CERN, Geneva (Nov 2020). http://cds.cern.ch/record/2746032

E. Buhmann, S. Diefenbacher, E. Eren, F. Gaede, G. Kasieczka, A. Korol, K. Krüger, Getting High: High Fidelity Simulation of High Granularity Calorimeters with High Speed. Comput. Softw. Big Sci. 5 (1), 13 (2021). https://doi.org/10.1007/s41781-021-00056-0 . arXiv:2005.05334 [physics.ins-det]

C. Biscarat, S. Caillou, C. Rougier, J. Stark, J. Zahreddine, Towards a realistic track reconstruction algorithm based on graph neural networks for the HL-LHC. EPJ Web Conf. 251 , 03047 (2021). https://doi.org/10.1051/epjconf/202125103047 . arXiv:2103.00916 [physics.ins-det]

M. Faucci Giannelli, G. Kasieczka, B. Nachman, D. Salamani, D. Shih, A. Zaborowska, Fast Calorimeter Simulation Challenge 2022 (2022). https://calochallenge.github.io/homepage/

A. Adelmann, New directions for surrogate models and differentiable programming for High Energy Physics detector simulation. In: Snowmass 2021 (2022)

R. Liu, P. Calafiura, S. Farrell, X. Ju, D.T. Murnane, T.M. Pham, Hierarchical Graph Neural Networks for Particle Track Reconstruction. In: 21th International Workshop on Advanced Computing and Analysis Techniques in Physics Research: AI Meets Reality (2023)

P. Baldi, P. Sadowski, D. Whiteson, Searching for Exotic Particles in High-Energy Physics with Deep Learning. Nature Commun. 5 , 4308 (2014). https://doi.org/10.1038/ncomms5308 . arXiv:1402.4735 [hep-ph]

L.M. Dery, B. Nachman, F. Rubbo, A. Schwartzman, Weakly Supervised Classification in High Energy Physics. JHEP 05 , 145 (2017). https://doi.org/10.1007/JHEP05(2017)145 . arXiv:1702.00414 [hep-ph]

E.M. Metodiev, B. Nachman, J. Thaler, Classification without labels: Learning from mixed samples in high energy physics. JHEP 10 , 174 (2017). https://doi.org/10.1007/JHEP10(2017)174 . arXiv:1708.02949 [hep-ph]

G. Kasieczka, The LHC Olympics 2020 a community challenge for anomaly detection in high energy physics. Rept. Prog. Phys. 84 (12), 124201 (2021). https://doi.org/10.1088/1361-6633/ac36b9 . arXiv:2101.08320 [hep-ph]

G. Aad, Dijet resonance search with weak supervision using \(\sqrt{s}=13\) TeV \(pp\) collisions in the ATLAS detector. Phys. Rev. Lett. 125 (13), 131801 (2020). https://doi.org/10.1103/PhysRevLett.125.131801 . arXiv:2005.02983 [hep-ex]

T. Aarrestad, The Dark Machines Anomaly Score Challenge: Benchmark Data and Model Independent Event Classification for the Large Hadron Collider. SciPost Phys. 12 (1), 043 (2022). https://doi.org/10.21468/SciPostPhys.12.1.043 . arXiv:2105.14027 [hep-ph]

A. Hallin, G. Kasieczka, T. Quadfasel, D. Shih, M. Sommerhalder, Resonant anomaly detection without background sculpting. Phys. Rev. D 107 (11), 114012 (2023). https://doi.org/10.1103/PhysRevD.107.114012 . arXiv:2210.14924 [hep-ph]

G. Aad, Search for New Phenomena in Two-Body Invariant Mass Distributions Using Unsupervised Machine Learning for Anomaly Detection at s=13 TeV with the ATLAS Detector. Phys. Rev. Lett. 132 (8), 081801 (2024). https://doi.org/10.1103/PhysRevLett.132.081801 . arXiv:2307.01612 [hep-ex]

V.S. Ngairangbam, M. Spannowsky, Interpretable deep learning models for the inference and classification of LHC data (2023) arXiv:2312.12330 [hep-ph]

Model-agnostic search for dijet resonances with anomalous jet substructure in proton-proton collisions at \(\sqrt{s}\) = 13 TeV (2024)

T. Finke, M. Krämer, A. Mück, J. Tönshoff, Learning the language of QCD jets with transformers. JHEP 06 , 184 (2023). https://doi.org/10.1007/JHEP06(2023)184 . arXiv:2303.07364 [hep-ph]

A. Butter, N. Huetsch, S. Palacios Schweitzer, T. Plehn, P. Sorrenson, J. Spinner, Jet Diffusion versus JetGPT – Modern Networks for the LHC (2023) arXiv:2305.10475 [hep-ph]

M. Vigl, N. Hartman, L. Heinrich, Finetuning Foundation Models for Joint Analysis Optimization (2024) arXiv:2401.13536 [hep-ex]

L. Heinrich, T. Golling, M. Kagan, S. Klein, M. Leigh, M. Osadchy, J.A. Raine, Masked Particle Modeling on Sets: Towards Self-Supervised High Energy Physics Foundation Models (2024) arXiv:2401.13537 [hep-ph]

J. Birk et al., OmniJet-α: the first cross-task foundation model for particle physics. Mach. Learn.: Sci. Technol. 5 , 035031 (2024). https://doi.org/10.1088/2632-2153/ad66ad

P. Harris, M. Kagan, J. Krupa, B. Maier, N. Woodward, Re-Simulation-based Self-Supervised Learning for Pre-Training Foundation Models (2024) arXiv:2403.07066 [hep-ph]

G. Louppe, M. Kagan, K. Cranmer, Learning to pivot with adversarial networks. In: Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., Garnett, R. (eds.) Advances in Neural Information Processing Systems, vol. 30. Curran Associates, Inc. (2017). https://proceedings.neurips.cc/paper_files/paper/2017/file/48ab2f9b45957ab574cf005eb8a76760-Paper.pdf

J.Y. Araz, M. Spannowsky, Combine and Conquer: Event Reconstruction with Bayesian Ensemble Neural Networks. JHEP 04 , 296 (2021). https://doi.org/10.1007/JHEP04(2021)296 . arXiv:2102.01078 [hep-ph]

A. Ghosh, B. Nachman, D. Whiteson, Uncertainty-aware machine learning for high energy physics. Phys. Rev. D 104 (5), 056026 (2021). https://doi.org/10.1103/PhysRevD.104.056026 . arXiv:2105.08742 [physics.data-an]

T.Y. Chen, B. Dey, A. Ghosh, M. Kagan, B. Nord, N. Ramachandra, Interpretable Uncertainty Quantification in AI for HEP. In: Snowmass 2021 (2022). https://doi.org/10.2172/1886020

A. Golutvin, A. Iniukhin, A. Mauri, P. Owen, N. Serra, A. Ustyuzhanin, The DL Advocate: playing the devil’s advocate with hidden systematic uncertainties. Eur. Phys. J. C 83 (9), 779 (2023). https://doi.org/10.1140/epjc/s10052-023-11925-w . arXiv:2303.15956 [hep-ex]

A. Ghosh, B. Nachman, T. Plehn, L. Shire, T.M.P. Tait, D. Whiteson, Statistical patterns of theory uncertainties. SciPost Phys. Core 6 , 045 (2023). https://doi.org/10.21468/SciPostPhysCore.6.2.045 . arXiv:2210.15167 [hep-ph]

A. Ghosh, B. Nachman, A cautionary tale of decorrelating theory uncertainties. Eur. Phys. J. C 82 (1), 46 (2022). https://doi.org/10.1140/epjc/s10052-022-10012-w . arXiv:2109.08159 [hep-ph]

A.J. Larkoski, I. Moult, B. Nachman, Jet Substructure at the Large Hadron Collider: A Review of Recent Advances in Theory and Machine Learning. Phys. Rept. 841 , 1–63 (2020). https://doi.org/10.1016/j.physrep.2019.11.001 . arXiv:1709.04464 [hep-ph]

A. Butter, T. Plehn, Chapter 7. Generative Networks for LHC Events, pp. 191–240. https://doi.org/10.1142/9789811234033_0007 . https://worldscientific.com/doi/abs/10.1142/9789811234033_0007

J. Duarte, J.-R. Vlimant, Graph Neural Networks for Particle Tracking and Reconstruction (2020) https://doi.org/10.1142/9789811234033_0012 arXiv:2012.01249 [hep-ph]

J. Shlomi, P. Battaglia, J.-R. Vlimant, Graph Neural Networks in Particle Physics (2020) https://doi.org/10.1088/2632-2153/abbf9a arXiv:2007.13681 [hep-ex]

S. Thais, P. Calafiura, G. Chachamis, G. DeZoort, J. Duarte, S. Ganguly, M. Kagan, D. Murnane, M.S. Neubauer, K. Terao, Graph Neural Networks in Particle Physics: Implementations, Innovations, and Challenges. In: Snowmass 2021 (2022)

V. Belis, P. Odagiu, T.K. Aarrestad, Machine learning for anomaly detection in particle physics. Rev. Phys. 12 , 100091 (2024). https://doi.org/10.1016/j.revip.2024.100091 . arXiv:2312.14190 [physics.data-an]

R.L. Workman et al., Review of Particle Physics. PTEP 2022 , 083C01 (2022). https://doi.org/10.1093/ptep/ptac097

G. DeZoort, P.W. Battaglia, C. Biscarat, J.-R. Vlimant, Graph neural networks at the large hadron collider. Nature Reviews Physics 5 (5), 281–303 (2023). https://doi.org/10.1038/s42254-023-00569-0

B. Hashemi, N. Hartmann, S. Sharifzadeh, J. Kahn, T. Kuhr, Author Correction: Ultra-high-granularity detector simulation with intra-event aware generative adversarial network and self-supervised relational reasoning [ https://doi.org/10.1038/s41467-024-49104-4 ]. Nature Commun. 15 (1), 5825 (2024). https://doi.org/10.1038/s41467-024-49971-x . arXiv:2303.08046 [physics.ins-det]

B. Hashemi, C. Krause, Deep generative models for detector signature simulation: A taxonomic review. Rev. Phys. 12 , 100092 (2024). https://doi.org/10.1016/j.revip.2024.100092 . arXiv:2312.09597 [physics.ins-det]

M. Feickert, B. Nachman, A Living Review of Machine Learning for Particle Physics (2021) arXiv:2102.02770 [hep-ph]

W.S. McCulloch, W. Pitts, A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5 (4), 115–133 (1943). https://doi.org/10.1007/BF02478259

F. Rosenblatt, The perceptron: A probabilistic model for information storage and organization in the brain. Psychol. Rev. 65 (6), 386–408 (1958). https://doi.org/10.1037/h0042519

X. Glorot, Y. Bengio, Understanding the difficulty of training deep feedforward neural networks. In: Teh, Y.W., Titterington, M. (eds.) Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. Proceedings of Machine Learning Research, vol. 9, pp. 249–256. PMLR, Chia Laguna Resort, Sardinia, Italy (2010). https://proceedings.mlr.press/v9/glorot10a.html

K. He, X. Zhang, S. Ren, J. Sun, Delving deep into rectifiers: Surpassing human-level performance on imagenet classification. In: 2015 IEEE International Conference on Computer Vision (ICCV), pp. 1026–1034 (2015). https://doi.org/10.1109/ICCV.2015.123

M.V. Narkhede, P.P. Bartakke, M.S. Sutaone, A review on weight initialization strategies for neural networks. Artif. Intell. Rev. 55 (1), 291–322 (2022). https://doi.org/10.1007/s10462-021-10033-z

P.J. Werbos, Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Harvard University (1975). https://books.google.co.in/books?id=r3uljgEACAAJ

K. Hornik, M. Stinchcombe, H. White, Multilayer feedforward networks are universal approximators. Neural Netw. 2 (5), 359–366 (1989). https://doi.org/10.1016/0893-6080(89)90020-8

G. Cybenko, Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems 2 (4), 303–314 (1989). https://doi.org/10.1007/BF02551274

M. Leshno, V.Y. Lin, A. Pinkus, S. Schocken, Multilayer feedforward networks with a nonpolynomial activation function can approximate any function. Neural Netw. 6 (6), 861–867 (1993). https://doi.org/10.1016/S0893-6080(05)80131-5

F. Voigtlaender, The universal approximation theorem for complex-valued neural networks. Appl. Comput. Harmon. Anal. 64 , 33–61 (2023). https://doi.org/10.1016/j.acha.2022.12.002

Y. Bengio, Lecun, Y.., in Scaling learning algorithms towards AI . ed. by L. Bottou, O. Chapelle, D. DeCoste, J. Weston (Neural Information Processing series. MIT Press, London, England, 2007)

J. Håstad, Computational Limitations of Small-depth Circuits (MIT Press, Cambridge, MA, USA, 1987)

Google Scholar  

E. Allender, Circuit complexity before the dawn of the new millennium, in Foundations of Software Technology and Theoretical Computer Science . ed. by V. Chandru, V. Vinay (Springer, Berlin, Heidelberg, 1996), pp.1–18

M. Bianchini, F. Scarselli, On the complexity of neural network classifiers: A comparison between shallow and deep architectures. IEEE Transactions on Neural Networks and Learning Systems 25 (8), 1553–1565 (2014). https://doi.org/10.1109/TNNLS.2013.2293637

R. Eldan, O. Shamir, The power of depth for feedforward neural networks. In: Feldman, V., Rakhlin, A., Shamir, O. (eds.) 29th Annual Conference on Learning Theory. Proceedings of Machine Learning Research, vol. 49, pp. 907–940. PMLR, Columbia University, New York, New York, USA (2016). https://proceedings.mlr.press/v49/eldan16.html

G. Naitzat, A. Zhitnikov, L.-H. Lim, Topology of deep neural networks. J. Mach. Learn. Res. 21 (184), 1–40 (2020)

MathSciNet   Google Scholar  

D.H. Wolpert, On the connection between in-sample testing and generalization error. Complex Syst. 6 (1992)

D.H. Wolpert, The lack of a priori distinctions between learning algorithms. Neural Comput. 8 (7), 1341–1390 (1996). https://doi.org/10.1162/neco.1996.8.7.1341

C. Schaffer, A conservation law for generalization performance, in Machine Learning Proceedings 1994 . ed. by W.W. Cohen, H. Hirsh (Morgan Kaufmann, San Francisco (CA), 1994), pp.259–265. https://doi.org/10.1016/B978-1-55860-335-6.50039-8

Chapter   Google Scholar  

D.H. Wolpert, W.G. Macready, No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1 (1), 67–82 (1997). https://doi.org/10.1109/4235.585893

D.H. Wolpert, In: Pardalos, P.M., Rasskazova, V., Vrahatis, M.N. (eds.) What Is Important About the No Free Lunch Theorems?, pp. 373–388. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-66515-9_13

B. Denby, Neural networks and cellular automata in experimental high energy physics. Comput. Phys. Commun. 49 (3), 429–448 (1988). https://doi.org/10.1016/0010-4655(88)90004-5

Article   ADS   MathSciNet   Google Scholar  

L. Lönnblad, C. Peterson, T. Rögnvaldsson, Finding gluon jets with a neural trigger. Phys. Rev. Lett. 65 , 1321–1324 (1990). https://doi.org/10.1103/PhysRevLett.65.1321

A. Radovic, M. Williams, D. Rousseau, M. Kagan, D. Bonacorsi, A. Himmel, A. Aurisano, K. Terao, T. Wongjirad, Machine learning at the energy and intensity frontiers of particle physics. Nature 560 (7716), 41–48 (2018). https://doi.org/10.1038/s41586-018-0361-2

L. Moore, K. Nordström, S. Varma, M. Fairbairn, Reports of My Demise Are Greatly Exaggerated: \(N\) -subjettiness Taggers Take On Jet Images. SciPost Phys. 7 (3), 036 (2019). https://doi.org/10.21468/SciPostPhys.7.3.036 . arXiv:1807.04769 [hep-ph]

T. Faucett, J. Thaler, D. Whiteson, Mapping Machine-Learned Physics into a Human-Readable Space. Phys. Rev. D 103 (3), 036020 (2021). https://doi.org/10.1103/PhysRevD.103.036020 . arXiv:2010.11998 [hep-ph]

M.M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst, Geometric deep learning: Going beyond euclidean data. IEEE Signal Process. Mag. 34 (4), 18–42 (2017). https://doi.org/10.1109/MSP.2017.2693418

M.M. Bronstein, J. Bruna, T. Cohen, P. Veličković, Geometric Deep Learning: Grids, Groups (Geodesics, and Gauges, Graphs, 2021)

T. Cohen, M. Welling, Group equivariant convolutional networks. In: Balcan, M.F., Weinberger, K.Q. (eds.) Proceedings of The 33rd International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 48, pp. 2990–2999. PMLR, New York, New York, USA (2016). https://proceedings.mlr.press/v48/cohenc16.html

T. Cohen, M. Weiler, B. Kicanaoglu, M. Welling, Gauge equivariant convolutional networks and the icosahedral CNN. In: Chaudhuri, K., Salakhutdinov, R. (eds.) Proceedings of the 36th International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 97, pp. 1321–1330. PMLR, ??? (2019). https://proceedings.mlr.press/v97/cohen19d.html

J. Jumper, R. Evans, A. Pritzel, T. Green, M. Figurnov, O. Ronneberger, K. Tunyasuvunakool, R. Bates, A. Žídek, A. Potapenko, A. Bridgland, C. Meyer, S.A.A. Kohl, A.J. Ballard, A. Cowie, B. Romera-Paredes, S. Nikolov, R. Jain, J. Adler, T. Back, S. Petersen, D. Reiman, E. Clancy, M. Zielinski, M. Steinegger, M. Pacholska, T. Berghammer, S. Bodenstein, D. Silver, O. Vinyals, A.W. Senior, K. Kavukcuoglu, P. Kohli, D. Hassabis, Highly accurate protein structure prediction with alphafold. Nature 596 (7873), 583–589 (2021). https://doi.org/10.1038/s41586-021-03819-2

G. Kanwar, M.S. Albergo, D. Boyda, K. Cranmer, D.C. Hackett, S. Racanière, D.J. Rezende, P.E. Shanahan, Equivariant flow-based sampling for lattice gauge theory. Phys. Rev. Lett. 125 (12), 121601 (2020). https://doi.org/10.1103/PhysRevLett.125.121601 . arXiv:2003.06413 [hep-lat]

M. Favoni, A. Ipp, D.I. Müller, D. Schuh, Lattice Gauge Equivariant Convolutional Neural Networks. Phys. Rev. Lett. 128 (3), 032003 (2022). https://doi.org/10.1103/PhysRevLett.128.032003 . arXiv:2012.12901 [hep-lat]

A. Bogatskiy, B. Anderson, J.T. Offermann, M. Roussi, D.W. Miller, R. Kondor, Lorentz Group Equivariant Neural Network for Particle Physics (2020) arXiv:2006.04780 [hep-ph]

E. Buhmann, G. Kasieczka, J. Thaler, EPiC-GAN: Equivariant point cloud generation for particle jets. SciPost Phys. 15 (4), 130 (2023). https://doi.org/10.21468/SciPostPhys.15.4.130 . arXiv:2301.08128 [hep-ph]

P. Onyisi, D. Shen, J. Thaler, Comparing point cloud strategies for collider event classification. Phys. Rev. D 108 (1), 012001 (2023). https://doi.org/10.1103/PhysRevD.108.012001 . arXiv:2212.10659 [hep-ph]

C. Frye, A.J. Larkoski, J. Thaler, K. Zhou, Casimir Meets Poisson: Improved Quark/Gluon Discrimination with Counting Observables. JHEP 09 , 083 (2017). https://doi.org/10.1007/JHEP09(2017)083 . arXiv:1704.06266 [hep-ph]

R. Medves, A. Soto-Ontoso, G. Soyez, Lund and Cambridge multiplicities for precision physics. JHEP 10 , 156 (2022). https://doi.org/10.1007/JHEP10(2022)156 . arXiv:2205.02861 [hep-ph]

R. Medves, A. Soto-Ontoso, G. Soyez, Lund multiplicity in QCD jets. JHEP 04 , 104 (2023). https://doi.org/10.1007/JHEP04(2023)104 . arXiv:2212.05076 [hep-ph]

S. Choi, S.J. Lee, M. Perelstein, Infrared Safety of a Neural-Net Top Tagging Algorithm. JHEP 02 , 132 (2019). https://doi.org/10.1007/JHEP02(2019)132 . arXiv:1806.01263 [hep-ph]

P.T. Komiske, E.M. Metodiev, J. Thaler, Energy Flow Networks: Deep Sets for Particle Jets. JHEP 01 , 121 (2019). https://doi.org/10.1007/JHEP01(2019)121 . arXiv:1810.05165 [hep-ph]

M.J. Dolan, A. Ore, Equivariant Energy Flow Networks for Jet Tagging. Phys. Rev. D 103 (7), 074022 (2021). https://doi.org/10.1103/PhysRevD.103.074022 . arXiv:2012.00964 [hep-ph]

W. Shen, D. Wang, J.M. Yang, Hierarchical high-point Energy Flow Network for jet tagging. JHEP 09 , 135 (2023). https://doi.org/10.1007/JHEP09(2023)135 . arXiv:2308.08300 [hep-ph]

S. Bright-Thonney, B. Nachman, J. Thaler, Safe but Incalculable: Energy-weighting is not all you need (2023) arXiv:2311.07652 [hep-ph]

R. Gambhir, A. Osathapan, J. Thaler, Moments of Clarity: Streamlining Latent Spaces in Machine Learning using Moment Pooling (2024) arXiv:2403.08854 [hep-ph]

P. Konar, V.S. Ngairangbam, M. Spannowsky, Energy-weighted message passing: an infra-red and collinear safe graph neural network algorithm. JHEP 02 , 060 (2022). https://doi.org/10.1007/JHEP02(2022)060 . arXiv:2109.14636 [hep-ph]

O. Atkinson, A. Bhardwaj, C. Englert, P. Konar, V.S. Ngairangbam, M. Spannowsky, IRC-Safe Graph Autoencoder for Unsupervised Anomaly Detection. Front. Artif. Intell. 5 , 943135 (2022). https://doi.org/10.3389/frai.2022.943135 . arXiv:2204.12231 [hep-ph]

P. Konar, V.S. Ngairangbam, M. Spannowsky, Hypergraphs in LHC phenomenology – the next frontier of IRC-safe feature extraction. JHEP 01 , 113 (2024). https://doi.org/10.1007/JHEP01(2024)113 . arXiv:2309.17351 [hep-ph]

S. Chatterjee, S.S. Cruz, R. Schöfbeck, D. Schwarz, A rotation-equivariant graph neural network for learning hadronic SMEFT effects (2024) arXiv:2401.10323 [hep-ph]

A. Bhardwaj, C. Englert, W. Naskar, V.S. Ngairangbam, M. Spannowsky, Equivariant, Safe and Sensitive – Graph Networks for New Physics (2024) arXiv:2402.12449 [hep-ph]

M. Cacciari, G.P. Salam, G. Soyez, The anti- \(k_t\) jet clustering algorithm. JHEP 04 , 063 (2008). https://doi.org/10.1088/1126-6708/2008/04/063 . arXiv:0802.1189 [hep-ph]

B. Andrieu, Jet finding algorithms at Tevatron. Acta Phys. Polon. B 36 , 409–415 (2005)

ADS   Google Scholar  

G.P. Salam, Towards Jetography. Eur. Phys. J. C 67 , 637–686 (2010). https://doi.org/10.1140/epjc/s10052-010-1314-6 . arXiv:0906.1833 [hep-ph]

R.K. Ellis, K. Melnikov, G. Zanderighi, W+3 jet production at the Tevatron. Phys. Rev. D 80 , 094002 (2009). https://doi.org/10.1103/PhysRevD.80.094002 . arXiv:0906.1445 [hep-ph]

A. Banfi, G.P. Salam, G. Zanderighi, Principles of general final-state resummation and automated implementation. JHEP 03 , 073 (2005). https://doi.org/10.1088/1126-6708/2005/03/073 . arXiv:hep-ph/0407286

M. Dasgupta, A. Fregoso, S. Marzani, G.P. Salam, Towards an understanding of jet substructure. JHEP 09 , 029 (2013). https://doi.org/10.1007/JHEP09(2013)029 . arXiv:1307.0007 [hep-ph]

G. Kasieczka, S. Marzani, G. Soyez, G. Stagnitto, Towards Machine Learning Analytics for Jet Substructure. JHEP 09 , 195 (2020). https://doi.org/10.1007/JHEP09(2020)195 . arXiv:2007.04319 [hep-ph]

H.-M. Chang, M. Procura, J. Thaler, W.J. Waalewijn, Calculating Track-Based Observables for the LHC. Phys. Rev. Lett. 111 , 102002 (2013). https://doi.org/10.1103/PhysRevLett.111.102002 . arXiv:1303.6637 [hep-ph]

A.J. Larkoski, S. Marzani, J. Thaler, Sudakov Safety in Perturbative QCD. Phys. Rev. D 91 (11), 111501 (2015). https://doi.org/10.1103/PhysRevD.91.111501 . arXiv:1502.01719 [hep-ph]

K. Fukushima, S. Miyake, Neocognitron: A new algorithm for pattern recognition tolerant of deformations and shifts in position. Pattern Recogn. 15 (6), 455–469 (1982). https://doi.org/10.1016/0031-3203(82)90024-3

Y. LeCun, B. Boser, J.S. Denker, D. Henderson, R.E. Howard, W. Hubbard, L.D. Jackel, Backpropagation applied to handwritten zip code recognition. Neural Comput. 1 (4), 541–551 (1989). https://doi.org/10.1162/neco.1989.1.4.541

M. Zaheer, S. Kottur, S. Ravanbakhsh, B. Poczos, R.R. Salakhutdinov, A.J. Smola, Deep sets. In: Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., Garnett, R. (eds.) Advances in Neural Information Processing Systems, vol. 30. Curran Associates, Inc., ??? (2017). https://proceedings.neurips.cc/paper_files/paper/2017/file/f22e4747da1aa27e363d86d40ff442fe-Paper.pdf

M. Gori, G. Monfardini, F. Scarselli, A new model for learning in graph domains. In: Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005., vol. 2, pp. 729–7342 (2005). https://doi.org/10.1109/IJCNN.2005.1555942

F. Scarselli, M. Gori, A.C. Tsoi, M. Hagenbuchner, G. Monfardini, The graph neural network model. IEEE Trans. Neural Networks 20 (1), 61–80 (2009). https://doi.org/10.1109/TNN.2008.2005605

A. Sperduti, A. Starita, Supervised neural networks for the classification of structures. IEEE Trans. Neural Networks 8 (3), 714–735 (1997). https://doi.org/10.1109/72.572108

P. Frasconi, M. Gori, A. Sperduti, A general framework for adaptive processing of data structures. IEEE Trans. Neural Networks 9 (5), 768–786 (1998). https://doi.org/10.1109/72.712151

J. Gilmer, S.S. Schoenholz, P.F. Riley, O. Vinyals, G.E. Dahl, Neural message passing for quantum chemistry. In: Precup, D., Teh, Y.W. (eds.) Proceedings of the 34th International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 70, pp. 1263–1272. PMLR, ??? (2017). https://proceedings.mlr.press/v70/gilmer17a.html

Battaglia, P.W., Hamrick, J.B., Bapst, V., Sanchez-Gonzalez, A., Zambaldi, V.F., Malinowski, M., Tacchetti, A., Raposo, D., Santoro, A., Faulkner, R., GülçSehre, Song, H.F., Ballard, A.J., Gilmer, J., Dahl, G.E., Vaswani, A., Allen, K.R., Nash, C., Langston, V., Dyer, C., Heess, N.M.O., Wierstra, D., Kohli, P., Botvinick, M.M., Vinyals, O., Li, Y., Pascanu, R.: Relational inductive biases, deep learning, and graph networks. ArXiv abs/1806.01261 (2018)

Y. Wang, Y. Sun, Z. Liu, S.E. Sarma, M.M. Bronstein, J.M. Solomon, Dynamic graph cnn for learning on point clouds. ACM Trans. Graph. 38 (5) (2019) https://doi.org/10.1145/3326362

D. Bahdanau, K. Cho, Y. Bengio, Neural machine translation by jointly learning to align and translate. In: Bengio, Y., LeCun, Y. (eds.) 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings (2015). http://arxiv.org/abs/1409.0473

A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L.u. Kaiser, I. Polosukhin, Attention is all you need. In: Guyon, I., Luxburg, U.V., Bengio, S., Wallach, H., Fergus, R., Vishwanathan, S., Garnett, R. (eds.) Advances in Neural Information Processing Systems, vol. 30. Curran Associates, Inc., ??? (2017). https://proceedings.neurips.cc/paper_files/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf

J. Devlin, M.-W. Chang, K. Lee, K. Toutanova, BERT: Pre-training of deep bidirectional transformers for language understanding. In: Burstein, J., Doran, C., Solorio, T. (eds.) Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423

T. Brown, B. Mann, N. Ryder, M. Subbiah, J.D. Kaplan, P. Dhariwal, A. Neelakantan, P. Shyam, G. Sastry, A. Askell, S. Agarwal, A. Herbert-Voss, G. Krueger, T. Henighan, R. Child, A. Ramesh, D. Ziegler, J. Wu, C. Winter, C. Hesse, M. Chen, E. Sigler, M. Litwin, S. Gray, B. Chess, J. Clark, C. Berner, S. McCandlish, A. Radford, I. Sutskever, D. Amodei, Language models are few-shot learners. In: Larochelle, H., Ranzato, M., Hadsell, R., Balcan, M.F., Lin, H. (eds.) Advances in Neural Information Processing Systems, vol. 33, pp. 1877–1901. Curran Associates, Inc., ??? (2020). https://proceedings.neurips.cc/paper_files/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf

L. Ouyang, J. Wu, X. Jiang, D. Almeida, C. Wainwright, P. Mishkin, C. Zhang, S. Agarwal, K. Slama, A. Ray, J. Schulman, J. Hilton, F. Kelton, L. Miller, M. Simens, A. Askell, P. Welinder, P.F. Christiano, J. Leike, R. Lowe, Training language models to follow instructions with human feedback. In: Koyejo, S., Mohamed, S., Agarwal, A., Belgrave, D., Cho, K., Oh, A. (eds.) Advances in Neural Information Processing Systems, vol. 35, pp. 27730–27744. Curran Associates, Inc., ??? (2022). https://proceedings.neurips.cc/paper_files/paper/2022/file/b1efde53be364a73914f58805a001731-Paper-Conference.pdf

A. Radford, K. Narasimhan, T. Salimans, I. Sutskever, et al. Improving language understanding by generative pre-training (2018)

P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, Y. Bengio, Graph attention networks. In: International Conference on Learning Representations (2018). https://openreview.net/forum?id=rJXMpikCZ

H. Zhao, L. Jiang, J. Jia, P. Torr, V. Koltun, Point transformer. In: 2021 IEEE/CVF International Conference on Computer Vision (ICCV), pp. 16239–16248 (2021). https://doi.org/10.1109/ICCV48922.2021.01595

L. Oliveira, M. Kagan, L. Mackey, B. Nachman, A. Schwartzman, Jet-images — deep learning edition. JHEP 07, 069 (2016) https://doi.org/10.1007/JHEP07(2016)069 arXiv:1511.05190 [hep-ph]

H. Qu, L. Gouskos, ParticleNet: Jet Tagging via Particle Clouds. Phys. Rev. D 101 (5), 056019 (2020). https://doi.org/10.1103/PhysRevD.101.056019 . arXiv:1902.08570 [hep-ph]

E.A. Moreno, O. Cerri, J.M. Duarte, H.B. Newman, T.Q. Nguyen, A. Periwal, M. Pierini, A. Serikova, M. Spiropulu, J.-R. Vlimant, JEDI-net: a jet identification algorithm based on interaction networks. Eur. Phys. J. C 80 (1), 58 (2020). https://doi.org/10.1140/epjc/s10052-020-7608-4 . arXiv:1908.05318 [hep-ex]

E.A. Moreno, T.Q. Nguyen, J.-R. Vlimant, O. Cerri, H.B. Newman, A. Periwal, M. Spiropulu, J.M. Duarte, M. Pierini, Interaction networks for the identification of boosted \(H \rightarrow b\overline{b}\) decays. Phys. Rev. D 102 (1), 012010 (2020). https://doi.org/10.1103/PhysRevD.102.012010 . arXiv:1909.12285 [hep-ex]

V. Mikuni, F. Canelli, ABCNet: An attention-based method for particle tagging. Eur. Phys. J. Plus 135 (6), 463 (2020). https://doi.org/10.1140/epjp/s13360-020-00497-3 . arXiv:2001.05311 [physics.data-an]

M. He, D. Wang, Quark/gluon discrimination and top tagging with dual attention transformer. Eur. Phys. J. C 83 (12), 1116 (2023). https://doi.org/10.1140/epjc/s10052-023-12293-1 . arXiv:2307.04723 [hep-ph]

M. He, Z. Wei, J.-R. Wen, Convolutional neural networks on graphs with chebyshev approximation, revisited. In: Koyejo, S., Mohamed, S., Agarwal, A., Belgrave, D., Cho, K., Oh, A. (eds.) Advances in Neural Information Processing Systems, vol. 35, pp. 7264–7276. Curran Associates, Inc., ??? (2022). https://proceedings.neurips.cc/paper_files/paper/2022/file/2f9b3ee2bcea04b327c09d7e3145bd1e-Paper-Conference.pdf

Y. Semlani, M. Relan, K. Ramesh, PCN: A Deep Learning Approach to Jet Tagging Utilizing Novel Graph Construction Methods and Chebyshev Graph Convolutions (2023) arXiv:2309.08630 [hep-ph]

Y.G. Wang, M. Li, Z. Ma, G. Montufar, X. Zhuang, Y. Fan, Haar graph pooling. In: III, H.D., Singh, A. (eds.) Proceedings of the 37th International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 119, pp. 9952–9962. PMLR, ??? (2020). https://proceedings.mlr.press/v119/wang20m.html

F. Ma, F. Liu, W. Li, Jet tagging algorithm of graph network with Haar pooling message passing. Phys. Rev. D 108 (7), 072007 (2023). https://doi.org/10.1103/PhysRevD.108.072007 . arXiv:2210.13869 [hep-ex]

H. Qu, C. Li, S. Qian, Particle transformer for jet tagging. In: Chaudhuri, K., Jegelka, S., Song, L., Szepesvari, C., Niu, G., Sabato, S. (eds.) Proceedings of the 39th International Conference on Machine Learning. Proceedings of Machine Learning Research, vol. 162, pp. 18281–18292. PMLR, ??? (2022). https://proceedings.mlr.press/v162/qu22b.html

S. Shleifer, M. Ott, NormFormer: Improved Transformer Pretraining with Extra Normalization (2022). https://openreview.net/forum?id=GMYWzWztDx5

V. Mikuni, F. Canelli, Point cloud transformers applied to collider physics. Mach. Learn. Sci. Tech. 2 (3), 035027 (2021). https://doi.org/10.1088/2632-2153/ac07f6 . arXiv:2102.05073 [physics.data-an]

S. Gong, Q. Meng, J. Zhang, H. Qu, C. Li, S. Qian, W. Du, Z.-M. Ma, T.-Y. Liu, An efficient Lorentz equivariant graph neural network for jet tagging. JHEP 07 , 030 (2022). https://doi.org/10.1007/JHEP07(2022)030 . arXiv:2201.08187 [hep-ph]

C. Li, H. Qu, S. Qian, Q. Meng, S. Gong, J. Zhang, T.-Y. Liu, Q. Li, Does Lorentz-symmetric design boost network performance in jet physics? Phys. Rev. D 109 (5), 056003 (2024). https://doi.org/10.1103/PhysRevD.109.056003 . arXiv:2208.07814 [hep-ph]

A. Bogatskiy, T. Hoffman, D.W. Miller, J.T. Offermann, PELICAN: Permutation Equivariant and Lorentz Invariant or Covariant Aggregator Network for Particle Physics (2022) arXiv:2211.00454 [hep-ph]

A. Bogatskiy, T. Hoffman, D.W. Miller, J.T. Offermann, X. Liu, Explainable equivariant neural networks for particle physics: PELICAN. JHEP 03 , 113 (2024). https://doi.org/10.1007/JHEP03(2024)113 . arXiv:2307.16506 [hep-ph]

R. Sahu, CapsLorentzNet: Integrating Physics Inspired Features with Graph Convolution (2024) arXiv:2403.11826 [hep-ph]

P.T. Komiske, E.M. Metodiev, J. Thaler, The Hidden Geometry of Particle Collisions. JHEP 07 , 006 (2020). https://doi.org/10.1007/JHEP07(2020)006 . arXiv:2004.04159 [hep-ph]

E. Bernreuther, T. Finke, F. Kahlhoefer, M. Krämer, A. Mück, Casting a graph net to catch dark showers. SciPost Phys. 10 (2), 046 (2021). https://doi.org/10.21468/SciPostPhys.10.2.046 . arXiv:2006.08639 [hep-ph]

B. Andersson, G. Gustafson, L. Lönnblad, U. Pettersson, Coherence effects in deep inelastic scattering. Zeitschrift für Physik C Particles and Fields 43 (4), 625–632 (1989). https://doi.org/10.1007/BF01550942

F.A. Dreyer, G.P. Salam, G. Soyez, The Lund Jet Plane. JHEP 12 , 064 (2018). https://doi.org/10.1007/JHEP12(2018)064 . arXiv:1807.04758 [hep-ph]

F.A. Dreyer, H. Qu, Jet tagging in the Lund plane with graph networks. JHEP 03 , 052 (2021). https://doi.org/10.1007/JHEP03(2021)052 . arXiv:2012.08526 [hep-ph]

F.A. Dreyer, G. Soyez, A. Takacs, Quarks and gluons in the Lund plane. JHEP 08 , 177 (2022). https://doi.org/10.1007/JHEP08(2022)177 . arXiv:2112.09140 [hep-ph]

O. Atkinson, A. Bhardwaj, S. Brown, C. Englert, D.J. Miller, P. Stylianou, Improved constraints on effective top quark interactions using edge convolution networks. JHEP 04 , 137 (2022). https://doi.org/10.1007/JHEP04(2022)137 . arXiv:2111.01838 [hep-ph]

M. Abdughani, J. Ren, L. Wu, J.M. Yang, Probing stop pair production at the LHC with graph neural networks. JHEP 08 , 055 (2019). https://doi.org/10.1007/JHEP08(2019)055 . arXiv:1807.09088 [hep-ph]

J. Pata, J. Duarte, J.-R. Vlimant, M. Pierini, M. Spiropulu, MLPF: Efficient machine-learned particle-flow reconstruction using graph neural networks. Eur. Phys. J. C 81 (5), 381 (2021). https://doi.org/10.1140/epjc/s10052-021-09158-w . arXiv:2101.08578 [physics.data-an]

Atkinson, O. Anisha, A. Bhardwaj, C. Englert, P. Stylianou, Quartic Gauge-Higgs couplings: constraints and future directions. JHEP 10, 172 (2022) https://doi.org/10.1007/JHEP10(2022)172 arXiv:2208.09334 [hep-ph]

Atkinson O. Anisha A. Bhardwaj, C. Englert, W. Naskar, P. Stylianou, BSM reach of four-top production at the LHC. Phys. Rev. D 108(3), 035001 (2023) https://doi.org/10.1103/PhysRevD.108.035001 arXiv:2302.08281 [hep-ph]

A. Hammad, S. Moretti, M. Nojiri, Multi-scale cross-attention transformer encoder for event classification. JHEP 03 , 144 (2024). https://doi.org/10.1007/JHEP03(2024)144 . arXiv:2401.00452 [hep-ph]

P. Stylianou, G. Weiglein, Constraints on the trilinear and quartic Higgs couplings from triple Higgs production at the LHC and beyond (2023) arXiv:2312.04646 [hep-ph]

L. Ehrke, J.A. Raine, K. Zoch, M. Guth, T. Golling, Topological reconstruction of particle physics processes using graph neural networks. Phys. Rev. D 107 (11), 116019 (2023). https://doi.org/10.1103/PhysRevD.107.116019 . arXiv:2303.13937 [hep-ph]

C. Birch-Sykes, B. Le, Y. Peters, E. Simpson, Z. Zhang, Reconstruction of Short-Lived Particles using Graph-Hypergraph Representation Learning (2024) arXiv:2402.10149 [hep-ph]

A. Mullin, S. Nicholls, H. Pacey, M. Parker, M. White, S. Williams, Does SUSY have friends? A new approach for LHC event analysis. JHEP 02 , 160 (2021). https://doi.org/10.1007/JHEP02(2021)160 . arXiv:1912.10625 [hep-ph]

E. Pfeffer, M. Waßmer, Y.-Y. Cung, R. Wolf, U. Husemann, A case study of sending graph neural networks back to the test bench for applications in high-energy particle physics (2024) arXiv:2402.17386 [hep-ph]

L. Builtjes, S. Caron, P. Moskvitina, C. Nellist, R.R. Austri, R. Verheyen, Z. Zhang, Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments (2022) arXiv:2211.05143 [hep-ph]

T. Heimel, G. Kasieczka, T. Plehn, J.M. Thompson, QCD or What? SciPost Phys. 6 (3), 030 (2019). https://doi.org/10.21468/SciPostPhys.6.3.030 . arXiv:1808.08979 [hep-ph]

M. Farina, Y. Nakai, D. Shih, Searching for New Physics with Deep Autoencoders. Phys. Rev. D 101 (7), 075021 (2020). https://doi.org/10.1103/PhysRevD.101.075021 . arXiv:1808.08992 [hep-ph]

S. Tsan, R. Kansal, A. Aportela, D. Diaz, J. Duarte, S. Krishna, F. Mokhtar, J.-R. Vlimant, M. Pierini, Particle Graph Autoencoders and Differentiable, Learned Energy Mover’s Distance. In: 35th Conference on Neural Information Processing Systems (2021)

O. Atkinson, A. Bhardwaj, C. Englert, V.S. Ngairangbam, M. Spannowsky, Anomaly detection with convolutional Graph Neural Networks. JHEP 08 , 080 (2021). https://doi.org/10.1007/JHEP08(2021)080 . arXiv:2105.07988 [hep-ph]

B. Ostdiek, Deep Set Auto Encoders for Anomaly Detection in Particle Physics. SciPost Phys. 12 (1), 045 (2022). https://doi.org/10.21468/SciPostPhys.12.1.045 . arXiv:2109.01695 [hep-ph]

Z. Hao, R. Kansal, J. Duarte, N. Chernyavskaya, Lorentz group equivariant autoencoders. Eur. Phys. J. C 83 (6), 485 (2023). https://doi.org/10.1140/epjc/s10052-023-11633-5 . arXiv:2212.07347 [hep-ex]

Download references

Acknowledgements

A.B. is supported by the U.S. Department of Energy under grant number DE-SC 0016013. V. S. N. is supported by the STFC under grant ST/P001246/1. The computational works are performed using the Param Vikram-1000 High-Performance Computing Cluster and the TDP project resources of the Physical Research Laboratory (PRL). Authors would like to thank for the warm hospitality and support received at the school of Deep Machine Learning for particle and astroparticle physics (ML4HEP) at ICTS, Bengaluru (2023) and IOP, Bhubaneswar (2024).

Author information

Authors and affiliations.

Department of Physics, Oklahoma State University, Stillwater, OH, 74078, USA

Akanksha Bhardwaj

Theoretical Physics Division, Physical Research Laboratory, Shree Pannalal Patel Marg, Ahmedabad, 380009, Gujarat, India

Partha Konar

Institute for Particle Physics Phenomenology, Department of Physics, Durham University, South Road, Durham, DH1 3LE, UK

Vishal Ngairangbam

You can also search for this author in PubMed   Google Scholar

Corresponding author

Correspondence to Vishal Ngairangbam .

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Bhardwaj, A., Konar, P. & Ngairangbam, V. Foundations of automatic feature extraction at LHC–point clouds and graphs. Eur. Phys. J. Spec. Top. (2024). https://doi.org/10.1140/epjs/s11734-024-01306-z

Download citation

Received : 26 April 2024

Accepted : 23 August 2024

Published : 11 September 2024

DOI : https://doi.org/10.1140/epjs/s11734-024-01306-z

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Advertisement

  • Find a journal
  • Publish with us
  • Track your research

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List
  • J Prev Med Public Health
  • v.56(1); 2023 Jan

Qualitative Research in Healthcare: Necessity and Characteristics

Authors include Eun Young Choi and Seung Gyeong Jang.

Author affiliations: Department of Preventive Medicine, Ulsan University Hospital, University of Ulsan College of Medicine, Ulsan, Korea; Ulsan Metropolitan City Public Health Policy's Institute, Ulsan, Korea; Department of Nursing, Chung-Ang University, Seoul, Korea; College of Nursing, Sungshin Women's University, Seoul, Korea; Department of Preventive Medicine, University of Ulsan College of Medicine, Seoul, Korea

Quantitative and qualitative research explore various social phenomena using different methods. However, there has been a tendency to treat quantitative studies that use complicated statistical techniques as more scientific and superior, and relatively few qualitative studies have been conducted in the medical and healthcare fields. This review aimed to provide a proper understanding of qualitative research and examined the characteristics of quantitative and qualitative research to help researchers select the appropriate qualitative research methodology. Qualitative research is applicable in the following cases: (1) when an exploratory approach is required on a topic that is not well known, (2) when something cannot be explained fully with quantitative research, (3) when it is necessary to present a new, specific view on a research topic that is difficult to explain with existing views, (4) when it is inappropriate to present the rationale or theoretical proposition for designing hypotheses, as in quantitative research, and (5) when conducting research that requires detailed descriptive writing with literary expressions. Qualitative research is conducted in the following order: (1) selection of a research topic and question, (2) selection of a theoretical framework and methods, (3) literature analysis, (4) selection of the research participants and data collection methods, (5) data analysis and description of findings, and (6) research validation. This review can contribute to the more active use of qualitative research in healthcare, and its findings are expected to instill a proper understanding of qualitative research in researchers who review qualitative research reports and papers.


INTRODUCTION

The definition of research varies among studies and scholars, and it is difficult to devise a single definition. The Oxford English Dictionary defines research as “a careful study of a subject, especially in order to discover new facts or information about it” [ 1 ], while Webster’s Dictionary defines research as “studious inquiry or examination - especially: investigation or experimentation aimed at the discovery and interpretation of facts, revision of accepted theories or laws in the light of new facts, or practical application of such new or revised theories or laws” [ 2 ]. Moreover, research is broadly defined as the process of solving unsolved problems to broaden human knowledge [ 3 ]. A more thorough understanding of research can be gained by examining its types and reasons for conducting it.

The reasons for conducting research may include practical goals, such as degree attainment, job promotion, and financial profit. Research may be based on one’s own academic curiosity or aspiration or guided by professors or other supervisors. Academic research aims can be further divided into the following: (1) accurately describing an object or phenomenon, (2) identifying general laws and establishing well-designed theories for understanding and explaining a certain phenomenon, (3) predicting future events based on laws and theories, and (4) manipulating causes and conditions to induce or prevent a phenomenon [ 3 ].

The appropriate type of research must be selected based on the purpose and topic. Basic research has the primary purpose of expanding the existing knowledge base through new discoveries, while applied research aims to solve a real problem. Descriptive research attempts to factually present comparisons and interpretations of findings based on analyses of the characteristics, progression, or relationships of a certain phenomenon without manipulating the variables or controlling the conditions. Experimental or analytical research attempts to identify causal relationships between variables through experiments by arbitrarily manipulating the variables or controlling the conditions [ 3 ]. In addition, research can be quantitative or qualitative, depending on the data collection and analytical methods. Quantitative research relies on statistical analyses of quantitative data obtained primarily through investigation and experiment, while qualitative research uses specific methodologies to analyze qualitative data obtained through participant observations and in-depth interviews. However, as these types of research are not polar opposites and the criteria for classifying research types are unclear, there is some degree of methodological overlap.

More important than differentiating types of research is identifying the type of research best suited to understanding specific questions and improving the problems people encounter in life. An appropriate research type or methodology is essential if findings are to be applied reliably. However, quantitative research, based on the philosophical ideas of empiricism and positivism, has been the mainstay in the field of healthcare, with academic advancement achieved through the application of various statistical techniques to quantitative data [ 4 ]. In particular, there has been a tendency to treat complicated statistical techniques as more scientific and superior, and relatively few qualitative studies have been conducted not only in clinical medicine but also in primary care and social medicine, fields that are more strongly influenced by the social sciences [ 5 , 6 ].

Quantitative and qualitative research use different ways of exploring various social phenomena. Both research methodologies can be applied individually or in combination based on the research topic, with mixed quantitative and qualitative research methodologies becoming more widespread in recent years [ 7 ]. Applying these 2 methods through a virtuous cycle of integration from a complementary perspective can provide a more accurate understanding of human phenomena and solutions to real-world problems.

This review aimed to provide a proper understanding of qualitative research to assist researchers in selecting the appropriate research methodology. Specifically, this review examined the characteristics of quantitative and qualitative research, the applicability of qualitative research, and the data sources collected and analyzed in qualitative research.

COMPARISON OF QUALITATIVE AND QUANTITATIVE RESEARCH

A clearer understanding of qualitative research can be obtained by comparing qualitative and quantitative research, with which people are generally familiar [ 8 , 9 ]. Quantitative research focuses on testing the validity of hypotheses established by the researcher to identify the causal relationships of a specific phenomenon and discovering laws to predict that phenomenon ( Table 1 ). Therefore, it emphasizes controlling the influence of variables that may interfere with the process of identifying causality and laws. In contrast, qualitative research aims to discover and explore new hypotheses or theories based on a deep understanding of the meaning of a specific phenomenon. As such, qualitative research attempts to accept various environmental factors naturally. In quantitative research, importance is placed on the researcher acting as an outsider to take an objective view by keeping a certain distance from the research subject. In contrast, qualitative research encourages looking inside the research subjects to understand them deeply, while also emphasizing the need for researchers to take an intersubjective view that is formed and shared based on a mutual understanding with the research subjects.

Table 1. Comparison of methodological characteristics between quantitative research and qualitative research

  • Research purpose. Quantitative research tests the validity of hypotheses established by the researcher to identify the causal relationships and laws of a phenomenon and to predict it; qualitative research discovers and explores new hypotheses or theories based on a deep understanding of the meaning of the phenomenon.
  • Perspective on variables. Quantitative research views factors other than the variables of interest as factors to be controlled and minimizes the influence of confounding factors; qualitative research views such factors as natural and accepts assessments in a natural environment.
  • Research view. Quantitative research takes an objective, outsider view; qualitative research takes an intersubjective, insider view.
  • Data used. Quantitative research uses quantifiable, measurable data; qualitative research uses narrative data that can be expressed in words, images, and so on.
  • Data collection method. Quantitative research relies primarily on questionnaire surveys or tests; qualitative research relies primarily on participant observation, in-depth interviews, and focus group discussions.
  • Nature of data and depth of analysis. Quantitative research focuses on superficial aspects of the phenomenon, using reliable data obtained through repeated measurement; qualitative research aims to identify the specific contents, dynamics, and processes inherent within the phenomenon and situation, using deep and rich data.
  • Strengths and weaknesses. Quantitative research offers high reliability and generalizability but has difficulty with in-depth analysis of dynamic phenomena that cannot be expressed by numbers alone and with interpreting results expressed only as numbers; qualitative research offers high validity but has weak generalizability, and the interjection of the researcher's subjectivity is inevitable.

The data used in quantitative research can be expressed as numerical values, and data accumulated through questionnaire surveys and tests are often used in analyses. In contrast, qualitative research analyzes narrative data, expressed in words and images, collected through participant observation, in-depth interviews, and focus group discussions. Quantitative research data are measured repeatedly to enhance their reliability, while the analyses of such data focus on superficial aspects of the phenomenon of interest. Qualitative research instead focuses on obtaining deep and rich data and aims to identify the specific contents, dynamics, and processes inherent within the phenomenon and situation.

There are clear distinctions in the advantages, disadvantages, and goals of quantitative and qualitative research. On one hand, quantitative research has the advantages of reliability and generalizability of the findings, and advances in data collection and analysis methods have increased both. However, quantitative research has difficulty with the in-depth analysis of dynamic phenomena that cannot be expressed by numbers alone and with interpreting results presented only in terms of numbers. On the other hand, qualitative research has the advantage of validity, which refers to how accurately or appropriately a phenomenon was measured. However, it also has the disadvantage of weak generalizability, that is, the extent to which an observed phenomenon applies to other cases.

APPLICATIONS OF QUALITATIVE RESEARCH AND ITS USEFULNESS IN THE HEALTHCARE FIELD

Qualitative research cannot be the solution to all problems, and no single methodology should be applied to every situation. Therefore, researchers need a good understanding of when qualitative research is applicable. Generally, qualitative research is applicable in the following cases: (1) when an exploratory approach is required on a topic that is not well known, (2) when something cannot be explained fully with quantitative research, (3) when it is necessary to present a new, specific view on a research topic that is difficult to explain with existing views, (4) when it is inappropriate to present the rationale or theoretical proposition for designing hypotheses, as in quantitative research, and (5) when conducting research that requires detailed descriptive writing with literary expressions [ 7 ]. In particular, qualitative research is useful for opening new fields of research, such as important topics that have not previously been examined or whose significance has not been recognized, and it is advantageous for examining known topics from a fresh perspective.

In the healthcare field, qualitative research is conducted on various topics that suit its characteristics and strengths. Quantitative research, which focuses on hypothesis validation (for example, the superiority of specific treatments or the effectiveness of specific policies) and the generalization of findings, has been the primary research methodology in healthcare. Qualitative research has mostly been applied to topics such as subjective disease experiences and attitudes regarding health-related quality of life [ 10 - 12 ], experiences and perceptions regarding the use of healthcare services [ 13 - 15 ], and assessments of the quality of care [ 16 , 17 ]. Moreover, qualitative research has focused on vulnerable populations, such as elderly people, children, people with disabilities [ 18 - 20 ], minorities, and the socially underprivileged with specific experiences [ 21 , 22 ].

For instance, patient safety is considered a pillar of quality of care, which is an aspect of healthcare with increasing international interest. The ultimate goal of patient safety research should be the improvement of patient safety, for which it is necessary to identify the root causes of potential errors and adverse events. In such cases, qualitative rather than quantitative research is often required. It is also important to identify whether there are any barriers when applying measures for enhancing patient safety to clinical practice. To identify such barriers, qualitative research is necessary to observe healthcare workers directly applying the solutions step-by-step during each process, determine whether there are difficulties in applying the solutions to relevant stakeholders, and ask how to improve the process if there are difficulties.

Patient safety is a very broad topic, and patient safety issues could be categorized into preventing, recognizing, and responding to patient safety issues based on related metrics [ 23 ]. Responding to issues that pertain to the handling of patient safety incidents that have already occurred has received relatively less interest than other categories of research on this topic, particularly in Korea. Until 2017, almost no research was conducted on the experiences of and difficulties faced by patients and healthcare workers who have been involved in patient safety incidents. This topic can be investigated using qualitative research.

A study in Korea investigated the physical and mental suffering experienced during the process of accepting disability and medical litigation by a patient who became disabled due to medical malpractice [ 21 ]. Another qualitative case study was conducted with participants who lost a family member due to a medical accident and identified psychological suffering due to the incident, as well as secondary psychological suffering during the medical litigation process, which increased the expandability of qualitative research findings [ 24 ]. A quantitative study based on these findings confirmed that people who experienced patient safety incidents had negative responses after the incidents and a high likelihood of sleep or eating disorders, depending on their responses [ 25 ].

A study that applied the grounded theory to examine the second victim phenomenon, referring to healthcare workers who have experienced patient safety incidents, and presented the response stages experienced by second victims demonstrated the strength of qualitative research [ 26 ]. Subsequently, other studies used questionnaire surveys on physicians and nurses to quantify the physical, mental, and work-related difficulties experienced by second victims [ 27 , 28 ]. As such, qualitative research alone can produce significant findings; however, combining quantitative and qualitative research produces a synergistic effect. In the healthcare field, which remains unfamiliar with qualitative research, combining these 2 methodologies could both enhance the validity of research findings and facilitate open discussions with other researchers [ 29 ].

In addition, qualitative research has been used for diverse sub-topics, including the experiences of patients and guardians with respect to various diseases (such as cancer, myocardial infarction, chronic obstructive pulmonary disease, depression, falls, and dementia), awareness of treatment for diabetes and hypertension, the experiences of physicians and nurses when they come in contact with medical staff, awareness of community health environments, experiences of medical service utilization by the general public in medically vulnerable areas, the general public’s awareness of vaccination policies, the health issues of people with special types of employment (such as delivery and call center workers), and the unmet healthcare needs of persons with vision or hearing impairment.

GENERAL WORKFLOW OF QUALITATIVE RESEARCH

Rather than focusing on deriving objective information, qualitative research aims to discern the quality of a specific phenomenon, obtaining answers to “why” and “how” questions. Qualitative research aims to collect data multi-dimensionally and provide in-depth explanations of the phenomenon being researched. Ultimately, the purpose of qualitative research is to help researchers gain an understanding of the research topic and reveal the implications of the research findings. Therefore, qualitative research is generally conducted in the following order: (1) selection of a research topic and question, (2) selection of a theoretical framework and methods, (3) literature analysis, (4) selection of the research participants (or participation target) and data collection methods, (5) data analysis and description of findings, and (6) research validation ( Figure 1 ) [ 30 ]. However, unlike quantitative research, in which hypothesis setting and testing take place unidirectionally, a major characteristic of qualitative research is that the process is reversible and research methods can be modified. In other words, the research topic and question could change during the literature analysis process, and theoretical and analytical methods could change during the data collection process.

Figure 1. General workflow of qualitative research.
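Purely as an illustration, and not something prescribed in the original article, the reversible nature of this workflow can be sketched in a few lines of Python. The stage names follow the list above; the rule that sends the researcher back to an earlier stage is an invented example.

```python
# Illustrative sketch of the reversible workflow described above (not a prescribed procedure).
STAGES = [
    "Select research topic and question",
    "Select theoretical framework and methods",
    "Analyze the literature",
    "Select participants and data collection methods",
    "Analyze data and describe findings",
    "Validate the research",
]

def run_workflow(revisions):
    """Walk the stages in order, but allow jumps back to earlier stages.
    `revisions` maps a stage index to the earlier index it loops back to (once)."""
    i, visited = 0, []
    while i < len(STAGES):
        visited.append(STAGES[i])
        if i in revisions:          # e.g. data collection prompts a revised research question
            i = revisions.pop(i)    # go back and rework the earlier stage
        else:
            i += 1
    return visited

# Example: findings during data collection (stage index 3) send the researcher
# back to refine the research topic and question (stage index 0).
print(run_workflow({3: 0}))
```

Running the sketch with `{3: 0}` revisits topic selection after data collection, mirroring the point that the process is cyclic rather than strictly one-way.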

Selection of a Research Topic and Question

As with any research, the first step in qualitative research is the selection of a research topic and question. Qualitative researchers can select a research topic based on their interests in daily life as a researcher, their interest in issues within the healthcare field, and ideas from the literature, such as academic journals. The research question represents a more specific aspect of the research topic. Before starting to conduct research on a topic, the researcher should clarify what is being researched and what kind of research would be desirable. When selecting a research topic and question, the researcher should ask: is the research feasible, are the topic and question worth researching, and is this a question the researcher genuinely wants to investigate?

Selection of Theoretical Framework and Methods

A theoretical framework refers to the thoughts or attitudes that a researcher has about the phenomenon being researched. Selecting the theoretical framework first can help qualitative researchers not only in selecting the research purpose and problem, but also in carrying out various processes, including exploration of the precedent literature and research, selection of the data type to be collected, data analysis, and description of findings. In qualitative research, theoretical frameworks are based on philosophical ideas, which affect the selection of specific qualitative research methods. Representative qualitative research methods include:

  • Grounded theory, which is suitable for developing a theory that can explain the processes involved in the phenomenon being researched.
  • Ethnographic study, which is suitable for research topics that attempt to identify and interpret the culture of a specific group.
  • Phenomenology, which is suitable for research topics that attempt to identify the nature of the research participants' experiences or of the phenomenon being researched.
  • Case studies, which aim to gain an in-depth understanding of a case that has unique characteristics and can be differentiated from other cases.
  • Action research, which aims to find solutions to problems faced by the research participants, with the researchers taking the same position as the participants.
  • Narrative research, which is suitable for research topics that attempt to interpret the entire life or the individual experiences contained within the stories of research participants.

Other methodologies include photovoice research, consensual qualitative research, and auto-ethnographic research.

Literature Analysis

Literature analysis results can be helpful in specifically selecting the research problem, theoretical framework, and research methods. The literature analysis process compels qualitative researchers to contemplate the new knowledge that their research will add to the academic field. A comprehensive literature analysis is encouraged both in qualitative and quantitative research, and if the prior literature related to the subject to be studied is insufficient, it is sometimes evaluated as having low research potential or research value. Some have claimed that a formal literature review should not be performed before the collection of field data, as it could create bias, thereby interfering with the investigation. However, as the qualitative research process is cyclic rather than unidirectional, the majority believes that a literature review can be performed at any time. Moreover, an ethical review prior to starting the research is a requirement; therefore, the research protocol must be prepared and submitted for review and approval prior to conducting the research. To prepare research protocols, the existing literature must be analyzed at least to a certain degree. Nonetheless, qualitative researchers must keep in mind that their emotions, bias, and expectations may interject themselves during the literature review process and should strive to minimize any bias to ensure the validity of the research.

Selection of the Research Participants and Data Collection Methods

The subjects of qualitative research are not necessarily humans. It is more important to find the research subject(s) from which the most in-depth answers to the research problem can be obtained. However, the subjects in most qualitative studies are humans, as most research questions focus on humans. Therefore, it is important to obtain research participants with sufficient knowledge, experience, and attitudes to provide the most appropriate answers to the research question. Quantitative research, which views generalizability as a key research goal, emphasizes the selection of research participants (i.e., the research sample that can represent the study’s population of interest), whereas qualitative research emphasizes finding research participants who can best describe and demonstrate the phenomenon of interest.

In qualitative research, the participant selection method is referred to as purposeful sampling (or purposive sampling), which can be divided into various types. Sampling methods have various advantages, disadvantages, and characteristics. For instance, unique sampling (extreme case sampling) has the advantage of being able to obtain interesting research findings by researching phenomena that have previously received little or no interest, and the disadvantage of deriving research findings that are interesting to only some readers if the research is conducted on an overly unique situation. Maximum variation sampling, also referred to as theoretical sampling, is commonly used in qualitative research based on the grounded theory. Selecting the appropriate participant sampling method that suits the purpose of research is crucial ( Table 2 ).

Table 2. Sampling methods for selecting research participants in qualitative research

  • Typical sampling: selecting the most typical environment and people for the research topic.
  • Unique sampling (extreme case sampling): selecting unique and uncommon situations or subjects that satisfy the research purpose.
  • Maximum variation sampling: selecting subjects that show the maximum variation within a target population.
  • Convenience sampling: selecting subjects who can be sampled most conveniently, considering practical limitations such as funding, time, and location.
  • Snowball sampling: selecting key research participants who satisfy the criteria established by the researcher and using their recommendations to recruit additional participants.
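To make two of these strategies concrete, the sketch below simulates maximum variation sampling and snowball sampling on a small, entirely hypothetical participant pool; the attribute names, pool size, seed participants, and greedy selection rule are assumptions made for illustration rather than procedures taken from the review.

```python
# Illustrative sketch only: the participant pool, attributes, and selection rules are hypothetical.
import random

random.seed(0)  # reproducible toy example

# Hypothetical pool: each candidate is described by attributes relevant to the (imagined) study.
pool = [
    {
        "id": i,
        "age_group": random.choice(["20s", "40s", "60s+"]),
        "role": random.choice(["patient", "caregiver", "nurse"]),
        "referrals": random.sample(range(30), k=2),  # other candidates this person might recommend
    }
    for i in range(30)
]

def maximum_variation_sample(pool, attrs, n):
    """Greedily pick candidates whose attribute combinations differ from those
    already chosen: one simple reading of maximum variation sampling."""
    chosen, seen = [], set()
    for cand in pool:
        combo = tuple(cand[a] for a in attrs)
        if combo not in seen:
            chosen.append(cand)
            seen.add(combo)
        if len(chosen) == n:
            break
    return chosen

def snowball_sample(pool, seed_ids, n):
    """Start from key participants who meet the researcher's criteria and follow
    their referrals until n participants are recruited."""
    by_id = {c["id"]: c for c in pool}
    recruited, queue = [], list(seed_ids)
    while queue and len(recruited) < n:
        person = by_id.get(queue.pop(0))
        if person is not None and person not in recruited:
            recruited.append(person)
            queue.extend(person["referrals"])
    return recruited

print([p["id"] for p in maximum_variation_sample(pool, ["age_group", "role"], 6)])
print([p["id"] for p in snowball_sample(pool, seed_ids=[0, 5], n=6)])
```

The point of the sketch is the contrast with random sampling: both functions deliberately seek out particular participants, either maximally different ones or ones reachable through recommendations, rather than a statistically representative sample.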

Once the researcher has decided how to select study participants, the data collection methods must be determined. As with participant sampling, many data collection methods are available, each with its own advantages and disadvantages, so the method must be selected based on the research question and circumstances. Unlike quantitative research, which usually uses a single data source and data collection method, qualitative research encourages the use of multiple data sources and data collection methods [ 30 ], since relying on a single source and method could allow researcher bias to skew the data. In qualitative research, the following data types are commonly used: (1) interview data obtained through one-on-one in-depth interviews and focus group discussions, (2) observational data gathered at various levels of observation, (3) documented data collected from personal or public documents, and (4) image data, such as photographs and videos.
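As a small, hypothetical illustration of keeping these four data types side by side for a single participant, the raw material of a study might be organized as below; the class name, fields, and example entries are invented for the sketch and are not drawn from the review.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ParticipantRecord:
    """Illustrative container for one participant's multi-source qualitative data."""
    participant_id: str
    interviews: List[str] = field(default_factory=list)     # transcripts of in-depth interviews or focus groups
    observations: List[str] = field(default_factory=list)   # field notes from observation sessions
    documents: List[str] = field(default_factory=list)      # personal or public documents
    images: List[str] = field(default_factory=list)         # paths to photographs or videos

record = ParticipantRecord("P01")
record.interviews.append("2023-01-12 semi-structured interview, transcript v1")
record.observations.append("Ward observation, morning shift, field notes")
record.documents.append("Anonymized incident report provided by the participant")
# Keeping every source type together for each participant makes it easier to
# compare findings across sources during analysis, which is the usual argument
# for collecting more than one data type.
```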

Interview data are the most commonly used data source in qualitative research [ 31 ]. In qualitative research, an interview refers to communication conducted with the clear purpose of acquiring certain information, unlike the conversations that typically take place in daily life. The level of data acquired through interviews varies significantly depending on the researcher’s personal qualifications and abilities, as well as his or her level of interest in and knowledge of the research topic. Therefore, interviewers must be trained to go beyond simply identifying the clearly expressed experiences of research participants and to explore their inner experiences and emotions [ 32 ]. Interview data can be classified by the level of structuralization of the data collection method, the sample size, and the interview method; the characteristics of each type of interview are given in Table 3.

Detailed types of interview methods according to the characteristics of in-depth interviews and focus group discussion

Degree of structure

  • Structured interview: data are collected by asking closed questions in the order set out in highly specific interview guidelines. Useful for making sure that no detail that should be checked with each participant is omitted, but leaves little room for different interpretations of the participant's responses or for original thoughts to be expressed.
  • Semi-structured interview: falls between a structured and an unstructured interview; interview guidelines are developed in advance, but the questions are not fixed and may vary. This is the most widely used data collection method in qualitative research, because the interview can be conducted flexibly according to the characteristics and responses of the participants, although researcher bias may influence the interview process.
  • Unstructured interview: conducted like an ordinary conversation, with minimal prior framing of the research topic and minimal adherence to interview guidelines, so that the intention of acquiring specific information does not constrain the participant. It can capture rich, realistic accounts of participants' meanings and experiences, but the quality of the information obtained and the length of the interview depend heavily on the interviewer's competency, such as conversational skill and reasoning ability.

Sample size

  • One-on-one in-depth interview: a single participant discusses the research topic with one or two researchers in each session (except where a guardian must accompany the participant, such as elderly or frail patients and children). This method is recommended for topics that are difficult to discuss in front of others and is suitable for obtaining in-depth opinions and experiences from individual participants; however, the range of information obtained depends on the interviewer's conversational skill and experience, and collecting sufficient data requires considerable effort.
  • Focus group discussion: at least 2 (generally 4–8) participants discuss the research topic in each session, led by the researcher. This method is effective for participants who are more willing to open up in a group than alone, such as children and adolescents, and group interaction can elicit richer experiences and opinions. The depth of the interview may be limited, however, and some participants may feel left out or withhold their opinions if 1 or 2 participants dominate the discussion.

Interview format

  • Face-to-face: the interviewer meets the participant in person to conduct the interview. Rapport is relatively easy to build, and the interviewer can identify and respond to non-verbal messages during the interview; however, it cannot be used with participants who are difficult to meet in person.
  • Non-face-to-face: the interview is conducted by telephone, videoconferencing, or email. This format suits politically or ethically sensitive topics and intimate personal issues; email interviews in particular give the participant time to think before responding. It is harder to generate interaction between participant and interviewer, honest accounts are difficult to obtain through email, and responses may be misinterpreted.
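To make the difference between a structured and a semi-structured interview concrete, the hypothetical Python sketch below represents a semi-structured guide as a list of core questions with optional probes; the topic, questions, and probes are invented for illustration:

```python
# Hypothetical sketch: a semi-structured interview guide as a simple data structure.
# Core questions and probes are invented for illustration.
interview_guide = [
    {
        "core_question": "Can you walk me through what a typical day looked like after your diagnosis?",
        "probes": [  # asked only if the participant does not raise these points
            "How did that affect your work or family life?",
            "What helped you cope on the harder days?",
        ],
    },
    {
        "core_question": "How did you experience communication with the care team?",
        "probes": ["Can you recall a specific conversation that stands out?"],
    },
]

# In a structured interview, only the core questions would be asked, in order;
# in a semi-structured interview, the interviewer follows up with probes as needed.
for item in interview_guide:
    print("ASK:", item["core_question"])
    for probe in item["probes"]:
        print("  probe (optional):", probe)
```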

Observation, a key data collection method in anthropology, refers to the researcher systematically examining how participants appear and behave in natural situations in order to reach a deep understanding [ 33 ]. Observations can be categorized as participant or non-participant, insider or outsider, disguised or undisguised, short- or long-term, and structured or unstructured. These categories cannot be separated by a clear line, however; each dimension varies along a continuum. The qualitative researcher must therefore select the appropriate data collection approach based on the circumstances and characteristics of the research topic.
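As a loose illustration of the point that these observation categories lie on continua rather than being sharply separated, the hypothetical sketch below records where a single (invented) observation session sits along each dimension:

```python
# Hypothetical sketch: locating one observation session along the spectra above.
# The session, values, and field names are invented for illustration.
observation_session = {
    "session_id": "OBS-07",
    "participation": 0.3,    # 0 = non-participant, 1 = fully participant
    "insider": False,        # outsider perspective
    "disguised": False,      # participants know they are being observed
    "duration_weeks": 2,     # relatively short-term
    "structure": 0.5,        # 0 = unstructured, 1 = fully structured checklist
    "fieldnotes": "Nurses' handover at shift change; informal conversation dominates.",
}
print(observation_session)
```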

Various types of document data can be used in qualitative research. Personal documents include diaries, letters, and autobiographies; public documents include legal documents, public announcements, and civil records; online documents include emails and blog or bulletin board postings; and other material, such as graffiti, can also be used. All of these can serve as data sources. In addition, image data acquired by the participant or the researcher, such as photographs and videos, are useful sources: they are relatively objective, easily accessible, and inexpensive to obtain, yet carry a significant amount of qualitative meaning. Some of this material may have been collected for research purposes, but much of it was not originally produced for research. The researcher must therefore not distort the original information contained in the data source and must verify its accuracy and authenticity in advance [ 30 ].
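Since document and image data are often not produced for research, keeping a simple provenance record can support the verification of accuracy and authenticity mentioned above. The Python sketch below is a hypothetical illustration; the field names and example entries are assumptions, not drawn from the text:

```python
# Hypothetical sketch: a provenance record for document and image data sources.
# Field names and example entries are invented for illustration.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SourceRecord:
    source_id: str
    kind: str                     # e.g., "diary", "public announcement", "photograph"
    produced_for_research: bool   # was the material originally created for this study?
    origin: str                   # who created it and how it was obtained
    authenticity_checked: bool = False
    notes: list[str] = field(default_factory=list)

records = [
    SourceRecord("D01", "diary", False, "kept by participant P02, shared with consent"),
    SourceRecord("IMG03", "photograph", True, "taken by researcher during ward observation"),
]

for r in records:
    if not r.authenticity_checked:
        r.notes.append(f"verify authenticity before analysis ({date.today().isoformat()})")
    print(r.source_id, r.kind, r.notes)
```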

This review examined the characteristics of qualitative research to help researchers select an appropriate qualitative methodology and identify situations in the healthcare field that are suited to qualitative research. It also discussed the selection of the research topic and problem, the choice of theoretical framework and methods, literature analysis, and the selection of research participants and data collection methods. A forthcoming paper will address other aspects of qualitative methodology in more detail, such as data analysis, the description of findings, and research validation. This review can contribute to the more active use of qualitative research in the healthcare field, and its findings are expected to give researchers who review and evaluate qualitative research reports and papers a sound understanding of the approach.

Ethics Statement

Since this study used secondary data sources, approval from the institutional review board was not sought and participant consent was not required.

Acknowledgments

Conflict of Interest

The authors have no conflicts of interest associated with the material presented in this paper.

Author Contributions

Conceptualization: Pyo J, Lee W, Choi EY, Jang SG, Ock M. Data curation: Pyo J, Ock M. Formal analysis: Pyo J, Ock M. Funding acquisition: None. Validation: Lee W, Choi EY, Jang SG. Writing - original draft: Pyo J, Ock M. Writing - review & editing: Pyo J, Lee W, Choi EY, Jang SG, Ock M.
