Penn GSE's Manuel González Canché (right) and a student work through an equation.
The whiteboard in Manuel González Canché’s Penn GSE office is covered in an array of dense equations. His professional biography notes that he employs “econometric, quasi-experimental, spatial statistics, and visualization methods for big and geocoded data” in his research.
He is definitely a quantitative researcher. But González Canché is also a collaborator, and through his collaborations, he has found that network analysis can help qualitative researchers see and present their data more clearly, and provide more transparency in their results.
On April 4, González Canché will deliver an extended professional development course introducing these techniques as part of the American Educational Research Association's annual conference. We spoke with González Canché about what attendees will learn and why network analysis might be the future of qualitative research.
Q: There’s a perception, maybe a misperception, that qualitative and quantitative research have to exist in separate worlds. Is there really more these disciplines have to share?
A: I’m trying to convey the idea that qualitatively generated data contains mathematical structure, and that this structure can be retrieved using network principles and theory. That sounds fancy and complicated, but it’s truly very straightforward once you learn how to do it.
Network analysis doesn’t replace rigorous qualitative research. But it is a tool that will allow expert qualitative researchers to examine issues more deeply and present their findings more transparently.
Q: How does this sort of network analysis work?
A: We start with a data set, which in this case consists of transcripts from interviews or focus groups. Using their subject matter expertise, the researchers then qualitatively code the transcripts to identify when subjects touch on a topic of interest.
We can then analyze the data set as a whole to create a map, which is called a sociogram. When analyzing interview data, this map is static; when analyzing focus groups, however, we can also see how a conversation evolved, when knowledge was generated, by whom, and the context of that knowledge generation process.
It also helps us to see how the views of actors in a study are aligned. In one project, I’m collaborating with a colleague who is studying the views of people at higher education institutions. Once we mapped the research, we could see that the people who have an academic orientation, students and faculty, share more similar concerns with each other than students share with administrators.
That finding might seem like common sense, but many other assumptions that seem like common sense are actually wrong.
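The interview doesn't show the underlying computation, but the sociogram-and-alignment idea can be sketched in a few lines. The course itself uses R; purely for illustration, here is a hedged Python sketch using networkx, in which all speakers, codes, and coded segments are invented:

```python
import networkx as nx
from networkx.algorithms import bipartite

# Hypothetical coded transcript: (speaker, code) pairs produced by a
# researcher's qualitative coding of interview or focus-group data.
coded_segments = [
    ("student_A", "cost"), ("faculty_B", "cost"),
    ("student_A", "advising"), ("admin_C", "budget"),
    ("faculty_B", "advising"), ("admin_C", "cost"),
]

# Build a bipartite sociogram: speakers on one side, codes on the other.
G = nx.Graph()
for speaker, code in coded_segments:
    G.add_node(speaker, kind="speaker")
    G.add_node(code, kind="code")
    G.add_edge(speaker, code)

# Project onto speakers: two speakers are linked when they raised the
# same code, and the edge weight counts how many codes they share —
# a rough picture of how aligned their views are.
speakers = [n for n, d in G.nodes(data=True) if d["kind"] == "speaker"]
alignment = bipartite.weighted_projected_graph(G, speakers)
print(sorted(alignment.edges(data="weight")))
```

In this made-up data, the student and faculty member share two codes while each shares only one with the administrator, mirroring the kind of alignment pattern described above.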
Q: Can you give another example of how this technique can help a researcher?
A: This is an important tool for analyzing focus groups. When you have one individual who is highly talkative, or very smart, or very passionate, they can have a huge influence on the whole group, which isn’t the goal. For example, I helped analyze a focus group where one participant brought up and defined micro-cultures. Once he did, everyone started talking about micro-cultures throughout the entire conversation. Consequently, this topic recurs throughout the analysis, but it is the result of a single individual’s contribution.
We can visualize how the focus group evolves over time either by interaction among individuals or by their interactions through the codes they provided. That is, we can see when a given code was important and when it decayed. In short, we can examine the conversation by codes, by individuals and codes, and by individuals.
Those examples also help illustrate when moderators end up driving the conversation by giving too many verbal cues and incentives. When you have multiple focus groups, you can use this technique across the groups. This all helps improve transparency.
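To make the idea of a code rising and then decaying concrete, here is a minimal, hypothetical sketch (in Python rather than the R used in the course) that counts how often each code appears in a sliding window of speaking turns. The turn sequence and code names are invented:

```python
from collections import Counter

# Hypothetical sequence of qualitative codes, one per speaking turn,
# in the order they occurred during a focus group.
turns = ["cost", "cost", "micro-cultures", "micro-cultures",
         "micro-cultures", "advising", "micro-cultures", "advising",
         "advising", "cost"]

def code_timeline(turns, window=5):
    """Count how often each code appears in each sliding window of
    turns, showing when a code rises and when it decays."""
    return [Counter(turns[start:start + window])
            for start in range(len(turns) - window + 1)]

timeline = code_timeline(turns)
for i, counts in enumerate(timeline):
    print(i, dict(counts))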
Q: Can network analysis help with the research process as well?
A: This is a tool. After the tool is applied, researchers will have some options. Am I satisfied with what I’m seeing? Do I have to go recheck my codes or my interviews? Do I have to go re-interview the same people or other people about the topic? Do I need to go back to the field and gather more or other sources of data?
Q: Do you see this type of work as important to the future of qualitative research?
A: I think so. The first time I offered a course on this was the spring of 2013. By the next year, one of the major qualitative software programs was starting to offer a rudimentary set of tools for network analysis, so I could see I wasn’t crazy.
Learning network analysis is a great opportunity for the next generation of researchers, for junior faculty, postdocs, and students who are getting ready to enter the job market. Having these tools will open doors for them to collaborate with more people.
The software we are using, R, is free. And in an eight-hour session, I can give someone without coding experience a solid foundation to do this work on their own.
Q: Ten years from now, do you think we will regularly see network analysis in qualitative research?
A: That’s the goal. Reviewers for the papers I have collaborated on have been excited about us using the tool. I would love to reach a point where qualitative research may seem incomplete if you don’t show the structure and the steps you followed to reach the conclusion.