How do you analyze the data you’ve collected?
Margaret Fesenmaier studies ambient awareness in asynchronous communication such as social media posts. She is particularly interested in how communicating through social media affects the relationships between migrants and their friends and family members in the country of origin. In a seminar that was part of the QUAL speaker series, Fesenmaier, a Ph.D. student in Communication, shared her data analysis process from a recent trip to China, where she conducted 18 interviews with university students in Beijing.
Fesenmaier described her data analysis experience as a three-stage iterative process:
- Managing data – including keeping it secure on a USB drive while she was traveling in China;
- Reducing data – coding and pulling out pieces (quotations) of the data; and
- Developing concepts – the “thinking” part of the analysis, where she used a whiteboard and a matrix of her coded data organized by research question.
Developing a codebook
In preparation for her trip, Fesenmaier developed a deductive code list – a codebook grounded in theory. But she allowed herself to be surprised and to discover new things in her semi-structured interviews: food turned out to be an important topic, and only one of her 18 interviewees did not bring up the importance of food pictures in their social media streams. She therefore also incorporated emerging (inductive) codes in her analysis of the interviews.
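To make the distinction concrete, here is a minimal, hypothetical sketch of how a small codebook mixing theory-driven (deductive) and emergent (inductive) codes might be kept track of. The codes and definitions below are invented for illustration; Fesenmaier managed hers in ATLAS.ti rather than in code.

```python
# Hypothetical illustration of a small codebook combining deductive codes
# (defined in advance from theory) and inductive codes (added as they
# emerge from the interviews). Not Fesenmaier's actual codes.
codebook = {
    "ambient_awareness": {
        "type": "deductive",
        "definition": "Passively keeping up with others through their social media posts.",
    },
    "relationship_maintenance": {
        "type": "deductive",
        "definition": "How posting or reading posts sustains ties with family and friends at home.",
    },
    "food_pictures": {
        "type": "inductive",  # emerged during the interviews
        "definition": "References to sharing or seeing photos of food in social media streams.",
    },
}

# Show which codes came from theory and which emerged from the data.
for code, info in codebook.items():
    print(f"{code:>25}  [{info['type']}]  {info['definition']}")
```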
Theme matrix
Guided by Johnny Saldaña’s qualitative data analysis recommendations, Fesenmaier conducted a first and second round of coding and then used the network and families (groups) functions of ATLAS.ti to work out which codes belonged to which of her research questions. In Fesenmaier’s experience, this data reduction stage did not actually shrink the data very much, but it worked well for her. She used the matrix of quotations, organized by research question, to revisit the data in a more systematic way and to write memos in ATLAS.ti about each of the research questions. She also wrote memos about surprising themes that emerged inductively from the data.
“It really is an iterative process,” Fesenmaier said. “At least for me – from words to models and back to words.”
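As a rough illustration of the kind of structure such a theme matrix produces, one could picture it as coded quotations grouped under each research question. The questions, codes, and quotations below are invented; Fesenmaier built hers with ATLAS.ti’s network and group functions rather than in code.

```python
# Hypothetical sketch of a theme matrix: coded interview quotations
# grouped by research question. The questions, codes, and quotations
# are invented for illustration only.
theme_matrix = {
    "RQ1: How do migrants stay aware of everyday life back home?": [
        {"code": "ambient_awareness",
         "quotation": "I don't message my parents every day, but I see what they post."},
    ],
    "RQ2: What content matters most in these social media streams?": [
        {"code": "food_pictures",
         "quotation": "When my mother posts dinner photos, it feels like I'm at the table."},
    ],
}

# Revisit the data one research question at a time, as in the memo-writing step.
for question, rows in theme_matrix.items():
    print(question)
    for row in rows:
        print(f"  [{row['code']}] {row['quotation']}")
    print()
```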
Lastly, Fesenmaier shared with the interdisciplinary audience of graduate students what she wishes she had done differently in her data collection and analysis:
- In-process writing – doing more field notes and journaling; keeping observation notes or a diary would have enriched her interview data;
- Cloud data – she wishes she had had a CSDE account to use while in the field, to ensure her data was backed up automatically and for peace of mind with IRB and other considerations;
- Learning the network function in ATLAS.ti earlier – using this tool sooner as a digital whiteboard would have saved her time when mapping out the codes and themes that emerged from the data and grouping them by research question;
- Embracing mistakes – while in the field, Fesenmaier worried about the logistics of keeping her interview data safe instead of paying closer attention to the interviewees’ lives around her and collecting notes on the context.
She encouraged those starting out with qualitative data analysis (QDA) software to be brave about experimenting with the different functions: “Press all the buttons and see what happens,” she said. “Just make sure you have all of your raw data saved elsewhere.”