I took a course this semester in statistics for the social sciences. Last year I took an introductory research methods course.
In theory (ha) these are supposed to be the boring courses, the mud through which you slog on the path to enlightenment/a degree/more compelling topics. People seem to hate and fear statistics (sadistics) in equal measure.
This didn’t really ring true for me. I’m not going to say that I’m racing out for a degree in advanced stats, but it was interesting in the way that what’s under the hood of a car is interesting, or the way that plumbing is interesting, or the way that The West Wing is interesting. Seeing the moving parts is good for you. These things are some of the moving parts of knowledge creation. And knowledge creation (plus the critical consideration of it) is so deeply important in everything we do. Everything.
So this is an essay about research methods, and the above is why I think you shouldn’t stop reading now. But I won’t blame you if you do. Here’s a page of formal fantasy cat portraiture.
A fun problem to me is the fact that a lot of research in the social sciences is fundamentally extractive.
It’s extractive in the way that reading a book is extractive: you read the book and pull out knowledge and don’t usually leave additional knowledge behind (unless you make notes in the margins of a library book, in which case you are probably the devil).
But it’s often also extractive in a less benign way: the foundational structures of western knowledge creation in the social sciences require that researchers ask other people for tiny, discrete pieces of deeply personal life experience. If the participants are lucky, they get a conversation out of it and the promise of some kind of moral, epistemological superiority for having contributed to human understanding. If they’re unlucky, they fill out a survey and the pouring-out-of-life is agglomerated and processed into a thesis no one is ever going to read.
The participant trades a piece of themselves, and in return that piece gets validated, turned from knowledge into Knowledge. The cruel twist is that their own life experience requires processes of authority to be assigned value, at which point they no longer own it.
The crushing weight of this realization drags heavily in the context of private-sector person-oriented research. Here it’s even trickier to navigate, a devil’s bargain of profit-making weighed against small shifts towards human centricity. It feels like sand castle building near big waves.
These thoughts intermingle with those sparked by a panel I attended last week. The subject of the panel was public and community engagement through design. It started out a little slow, but eventually got into a discussion on the role of designers in community work, and how in an ideal world you should be moving away from the role of guru and towards a role as an educator, shifting elements of your expertise to the community itself so that their reliance on you diminishes. Stop gatekeeping, basically. It makes me think a little bit about what an applied social science research practice might look like, if we’re to make it sustainable and equitable and ethical all at once.
At this point, I should clarify a few things.
First, I’m not against knowledge making, and I get that it has to happen right now through the academy and with peer-reviewed secondary sources and that this is a bigger system to dismantle or divert. I think we should be asking big questions and we should be seeking to answer them with oceans of voices. An English teacher I had once described Ngũgĩ wa Thiong’o’s A Grain of Wheat as possessing a ‘polyphony of voices’. This phrase stays with me so, so much. It’s what I want for how we develop knowledge and for how we solve problems. Build bigger tables.
Second, I’m aware that there’s already a lot of good work being done to make research more equitable and ethical. It’s been a long and compassionately-considered journey for disciplines that have their origins in racist theory and phrenology as much as the writing of Durkheim and Tocqueville and Marx. I don’t mean to devalue this work. Rather, I want to build on it.
So with these things in mind let’s think about research practice and how it can be made into something where both sides benefit more thoroughly.
One of the really core pieces of our toolkit at work is the use of co-creative, cross-disciplinary workshops to help us make stuff that actual people would want to use. Practically speaking, here’s what that looks like: we gather a bunch of folks in a room from a variety of different but related backgrounds (for example, municipal bureaucrats, civil engineers, cartographers, and parents of young children), and then ask them to discuss a variety of preset topics, and in the end, make something together. Often the things they make (a new plan for family-friendly streets, to continue the hypothetical) are sketchy as hell and have serious flaws, but they’re an incredible start. Better sketchy and mostly right than beautiful and totally wrong.
The benefit of this participatory approach for us as the facilitators is pretty plain. We get insights and tangible outputs, in a research/implementation one-two punch.
The benefits to the participants are significant as well, however. First, they’re active participants in the creation of whatever it is we’re creating; this isn’t just a focus group feeding data to designers. They’re making actual stuff. On top of that, there are opportunities for community building and conversation and occasionally networking in the professional sense. When facilitated well, it’s a model that leaves participants feeling like they’re more than just inputs to an opaque process, and also provides them with opportunities to deepen their roots in their community of expertise.
It’s imperfect, of course. The participants don’t drive which questions get asked and answered. We facilitate, pruning conversational and exploratory tendrils that go in directions that aren’t useful to us, regardless of their utility to the participants. This is necessary for reasons that make sense in a consulting/client services context: clients have questions they want answered and they’re paying good money to get them. These questions are driven by business needs and legal constraints. There’s no cloak of moral duty here, and the more it evaporates in favour of corporate interest, the more likely it is that participants are getting compensated in some way. This is at least partly an okay thing.
I had a good conversation last weekend with someone I met at a party. They were working at a community centre in the city on a project related to the sexual health of young people and how to design programming in that realm for this particular centre and its catchment area. This particular project was anchored in participatory action research. The core takeaway of this for me was that the researcher invites the informants to act as co-investigators, helping to shape the questions that are asked as well as the research methods used. As a result, the ownership of the process and the data and the outcomes is collective and rooted in the community being studied. It breaks the binary of studied and studier. This is important.
These conversations on the research side and the talk about design and public/community engagement converged this week, and it’s this experience that I’m most excited about exploring.
I was a participant in a really curious session. Kate had set it up as a brainstorming session for community-anchored educational resources around cycling, specifically aimed at new cyclists. She invited two of her friends who are thoroughly experienced cyclists to talk with me, a terribly inexperienced one, about cycling in Toronto. I don’t think Kate intended it as anything more than a brainstorming session, but in practice it was much more interesting than that, to me.
How it ended up working was that I had a list of questions that I asked the experts. They answered them, and talked widely about their experience as cyclists, providing colour and context above and beyond the basic utilitarian concerns of how one cycles in the city. Kate took notes throughout. It was a compelling model for research in a number of ways.
First, I drove the questions being asked, with some additions from Kate. It could just as easily have been set up such that all participants have the opportunity to ask and answer. Second, it established community bonds in a really rough and grassroots and tangible way, connecting experienced people willing to share with inexperienced people wanting to learn. Facilitating this transfer along the knowledge gradient was really cool for me, but it was useful for the experienced folks, too; a lot of these questions about something they’re passionate about had maybe never been posed to them before. Beyond that, Kate was provided with a lot of really good information: the questions a new cyclist actually has, the under-the-surface critical knowledge that experts possess, and some indication of the kinds of shared languages and contexts that enable the transfer of that knowledge.
Everyone benefitted, and everyone owned the process in their own way. I’d like to see more of this kind of collective knowledge-making in my own work. I recognize this weakens the reproducibility of the research in the long term, and makes it less authoritative. I think this is okay, within a practice of applied social science, where the goal is to produce knowledge in service of a community. Build bigger tables together, and then maybe leave them behind. You only need so many tables.
Some additional thoughts:
One thing that’s maybe useful but also tricky is any kind of participatory research that involves making or play.
These methods (where you make something out of construction paper or clay, or roleplay, or whatever) are neat because they let you get at different kinds of insights that conversational or written responses don’t really surface. They’re also good at imposing a kind of veil between your research intentions and your participants; to be clear, I don’t mean that in a manipulative sense, but rather that it lets your participants forget for a moment that they’re being studied.
My concern with these kinds of methods is that they can be really patronizing, especially depending on who your participants are. I was helping run one of these sessions with an older adult who had some mild cognitive impairment/Alzheimer’s disease, and oh boy, you would not believe how much they thought it was bullshit. We likely should have been more cognizant of how the activities might be perceived as childish, especially for someone who is watching their cognitive function infuriatingly decline. Infantilization is a real risk in cases like that.
Even with the co-creation sessions we do at Bridgeable, where we’re working with often Very Serious People, there’s a risk that you alienate them if you don’t frame things properly when you’re using a method that seems less than decorous.
When you’re using a traditional research method, it carries with it a certain kind of official weight, where grown adults think they can engage with it as something fundamentally serious. Of course, this goes hand in hand with the factors that make it alienating and terrifying to some folks.
I don’t really have an answer for this dilemma other than: think very carefully about how to not be patronizing and how to present the value of a participatory method in real terms (as well as its limitations/fundamental characteristics: “This is going to seem a little silly, but here’s why we do it…”).
An unrelated but fun thought: in most of my writing here I’ve been using ‘participant’ rather than the more traditional ‘informant’ or ‘subject’. This isn’t an intentional shift. Just a thing I noticed. Language is fun.