
The Puzzle of Mass Mobilization: Conducting Protest Research in Ukraine, 2004–2014

22 May 2014

This article is part of our Enough! feature on Europe’s exploding social movements.


by Olga A. Onuch 

We are always puzzled, if not taken aback, when ‘ordinary’ citizens – those who are not typically politically engaged, may or may not vote, and are rarely (if ever) participants in the protest game – join activists and opposition party groups en masse, tipping the scales and helping create moments of mass mobilization. The event is usually quite unpredictable, yet within hours a sea of people takes over city streets, leaving even the most skeptical observer momentarily impressed if not touched. Their brazen political engagement is formidably brave in the face of potential repression, and yet it remains poorly understood by social scientists. Knowing how to study and analyze these phenomena remains difficult.


Back to the future: How we studied it then

In November 2004, ‘ordinary’ Ukrainians suddenly went from being perceived as a disengaged, post-communist electorate to a participatory society, when about a quarter of the local population became engaged in some form of protest participation, a phenomenon known as the ‘Orange Revolution’. Astonished by this development, pundits and academics scrambled to understand the mass mobilization as it unfolded. Few analysts conducted systematic interviews or surveys with the broader population of protest participants. Most relied on elite interviews with activists and on large national surveys, which asked little about the mobilization process and the motivations of the participants. Those scholars who wanted to understand why ‘ordinary’ citizens joined in the massive protests, including myself, have had to rely on data collected months or even years after the events took place. Conducting surveys and focus groups with protest participants is never easy, and because recollections of past experiences and preferences make for problematic data at best, questions about the mobilization process – about what motivates ‘ordinary’ people to join in – have had to remain, at least in part, unanswered.

The puzzle of mass mobilization in the former Soviet Union (still a rare phenomenon) has taunted us throughout the past decade. Most observers agreed that the ‘Orange Revolution’ was a one-time deal: it was postulated that, due to disappointment in post-protest politics, Ukrainians were unlikely to protest again. In that same decade we have observed a global wave of mobilization, in places as different as Brazil, Bulgaria, Egypt, Greece, Thailand, and Turkey. The style, trajectories, and outcomes of these events are all quite disparate, but one factor brings these different events together – the en masse participation of ordinary citizens, who quickly outnumber activists and form a cross-cleavage coalition in the streets.

©2013, Tamara Martsenyuk, Olga Onuch & Ukrainian Protest Project


EuroMaidan: Revolution 2.0

It happened again in November 2013, on the eve of the ninth anniversary of the ‘Orange Revolution’: the Yanukovych government’s announcement that Ukraine would not sign a free trade agreement with the EU sparked protests across central and western Ukraine, which were dubbed EuroMaidan (European Square). The protests lacked coordination and seemed to be shrinking over the following week. Yet, when the riot police (Berkut) employed excessively violent repression against young students and journalists camping out in the Maidan square, the protests exploded. By December 2nd, more than 1 million Ukrainians were actively participating in protest events across Ukraine. The largest protests took place in Kyiv and Lviv, but they also spread to Russophone cities like Kharkiv and Odessa. Both the size and the geographic diffusion of the protests were greater than in 2004. Also unlike in 2004, militia repression by the state and a turn to violent repertoires by the protesters consistently escalated the crisis. It was clear not only that Ukrainians had, once again surprisingly, joined a mass mobilization, but also that this was a significant juncture for the political and public spheres in the country – a potentially democratizing moment of civic engagement and defense of democracy, or a destabilizing and polarizing event that would entrench a corrupt regime.

As a scholar of protest engagement and political behavior, I knew we desperately needed data on the participants as soon as possible, so as not to miss our chance yet again. On November 26th, along with colleagues from the National University of Kyiv Mohyla Academy and with the help of 20 graduate student canvassers, I began conducting an on-site survey of protest participants in Kyiv. Employing the technology and methodology available to us, we were able to survey just over 1,400 protest participants between November 26th and January 10th. We employed a random sampling method, whereby our canvassers approached only every 6th and every 12th protester. In order to triangulate and give substance to our survey findings, we combined the survey data collection with quick, recorded interviews with participants and digital photographs of posters and slogans employed in the square. While impressive in many ways (not least because of the shoestring budget available to us), this data collection was far from an easy endeavor and not without several problems. We faced numerous practical and methodological difficulties on the Maidan, and the account below describes problems encountered, solutions found, and lessons learned for future data collection projects.
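As an illustration only (the published account does not include the team’s exact field protocol, or how the two intervals were combined across canvassers), a skip-interval selection rule of this kind can be sketched in a few lines of Python. The skip value of 6 is taken from the text; the stream of passers-by and the naming are invented:

```python
# Minimal sketch of a fixed skip-interval ("every k-th passer-by") selection
# rule like the one described above. The skip values come from the article;
# the stream and names are hypothetical stand-ins for people passing a canvasser.

import itertools

def select_respondents(crowd_stream, skip):
    """Yield every `skip`-th person from an (indefinite) stream of passers-by."""
    for i, person in enumerate(crowd_stream, start=1):
        if i % skip == 0:
            yield person

# Toy usage: a numbered stream standing in for protesters walking past.
crowd = (f"protester_{n}" for n in itertools.count(1))
first_five = list(itertools.islice(select_respondents(crowd, skip=6), 5))
print(first_five)  # protester_6, protester_12, protester_18, ...
```

The point of such a rule is that the canvasser, not the respondent, determines who is approached, which reduces self-selection of the most enthusiastic protesters into the sample.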


Survey design in a shifting setting

The first problem we encountered was designing the questionnaire. Most surveys have several months for question development (including placing and wording), but we had no such luxury. Equally swept up by the events as they unfolded, we were forced to design a survey in just a few days and had to guess at what we should ask. Pressed for time, we replicated the structure and logic of past surveys, but needed to adjust for the specificities of the EuroMaidan crisis. We added several items not standard in protest surveys. For instance, we decided to include detailed questions about social media, enabling us to test what media analysts already believed to be true: that the #EuroMaidan was a Twitter and Facebook revolution. Other added questions were guided by theories of social mobilization, such as the role of rights framing and social network ties in motivating and mobilizing ‘ordinary’ citizens. Because we were uncertain whether our multiple-choice options were useful, we made sure to leave an open-ended ‘other’ option for each question, enabling respondents to add items. Much of the first draft of the questionnaire, while an educated guess informed by prior research, was a shot in the dark. We could only pilot the survey with colleagues at the National University, and therefore we were never certain of how non-specialists would read and understand the survey questions. Time was of the essence.
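To make the design concrete, a survey item of the kind described – fixed response options plus an open-ended ‘other’ field – might be represented as in the following minimal sketch. The question wording and option labels here are invented placeholders, not the wording of the actual instrument:

```python
# Rough sketch of a survey item with fixed options plus an open-ended "other"
# field, as described above. Wording and options are invented placeholders.

from dataclasses import dataclass, field

@dataclass
class SurveyItem:
    prompt: str
    options: list[str]
    allow_other: bool = True
    responses: list[str] = field(default_factory=list)

    def record(self, answer: str) -> None:
        # Accept a listed option; file anything else under "other".
        if answer in self.options:
            self.responses.append(answer)
        elif self.allow_other:
            self.responses.append(f"other: {answer}")

item = SurveyItem(
    prompt="Why did you join the protest?",  # placeholder wording
    options=["EU agreement", "Against repression", "Support the opposition"],
)
item.record("Against repression")
item.record("Solidarity with my students")  # captured via the open "other" field
print(item.responses)
```

The open field matters precisely because, as noted above, the closed options were drafted in days: answers that cluster under ‘other’ signal which options the designers failed to anticipate.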

The initial speedy design was only part of the problem. We had no idea how long the protests would last and were thus faced with a methodological problem: the content of our questionnaire had to shift along with the changing context of the protests. Some questions needed new response options, and in at least two instances entirely new questions (about the repression) had to be added. Luckily we only had to make two such adjustments, both between November 26th and December 8th. To give a concrete example, we could see from our digital photographs and rapid interviews that the participants and their claims were changing as the protests went on. Specifically, after November 30th the EU focus dissipated, and the protesters’ claims and slogans seemed centered on human rights, repression, and the illegitimacy of the government. We thus had to adjust the questionnaire where it asked about reasons for joining in and where it asked participants to identify the main slogans of the protests. This type of adjustment is of course methodologically problematic, but had we not made it, we would have missed the shifting aspects of the protests.

©2013, Tamara Martsenyuk, Olga Onuch & Ukrainian Protest Project


Incomparability of collected data with the non-protest population

Surveying the protesters is vital to understanding who they are and why they came out. At the same time, without a larger, more inclusive survey, we do not know if and how this sub-group differs from the general population. Optimally, had funding allowed, we would have done both: an intensive, focused survey of protest participants and a complementary general population survey. While our survey produced relevant findings – such as that the average protester was middle-aged and that social network ties, not social media, were instrumental in bringing people out to the protests – we cannot be sure what made the protesters different from non-protesters or, if they are not observably different, what made their participation more likely. With such a study alone, we are unable to identify the variables that account for this difference in behavior.
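Had both samples existed, one standard approach would be to pool them and fit a logistic regression of protest participation on individual covariates. The sketch below is purely hypothetical – the variable names and toy data are invented, and it assumes the third-party pandas and statsmodels libraries – but it shows the kind of comparison the missing general-population survey would have enabled:

```python
# Hypothetical sketch: pooling a protester sample with a general-population
# sample and modeling participation. All data here are randomly generated;
# the article's team did not have such a pooled sample.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    # 1 = surveyed on the Maidan, 0 = drawn from a general-population survey
    "participated": rng.integers(0, 2, n),
    "age": rng.normal(38, 12, n),          # article: the average protester was middle-aged
    "network_tie": rng.integers(0, 2, n),  # knew someone already protesting (invented covariate)
})

# Logistic regression: which covariates predict being a protester?
model = smf.logit("participated ~ age + network_tie", data=df).fit(disp=False)
print(model.summary())
```

Without the comparison group, the coefficients in such a model cannot be estimated at all, which is precisely the limitation described above.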


The ethics and dangers of conducting research in protest zones

We began surveying the protesters on November 26th. Knowing what we know now about how violent the protests turned, we most likely would have avoided the survey. When we began, there was no violence or repression against the protesters. The protests were following a typical Ukrainian-style repertoire – concerts and tent cities in the central square – a repertoire used during previous protest events in 1986, 1991, 2000/01, and 2004. Moreover, we completed our polling before the protests turned extremely violent and excessive force was used against protesters after January 19th. We thus avoided a great many ethical concerns about the safety of our canvassers and of those answering the questionnaire.

Protest research is not easy and requires a strict set of rules to ensure data quality and the safety of all involved. It is far too easy to be swept up by the emotions of a sea of protesters numbering up to a million or more. While thrilling, this would only complicate data collection. For this reason, our canvassers received detailed instructions on how to approach individuals (what to say and how to say it), how to conduct the survey, what to wear, and when to conduct surveys. We stressed that they could not stay on the Maidan later than 8:30 or 9:00 p.m., and they were asked to avoid any violent zones. They were also told to suspend their interviewing and go home if violence erupted or there was a police presence. We kept in touch with our canvassers via SMS and asked them to report online at the end of each day. They were briefed and debriefed by academics from Oxford and Kyiv Mohyla Academy, and several spot checks were conducted by colleagues in Kyiv.


Canvassers are also local citizens

Our devoted and well-trained canvassers were, of course, also Ukrainian citizens (mostly university students), themselves swept up by the events. We worked to avoid any conflicts of interest and kept the rules of participation quite strict. Not all of our canvassers continued to work with us throughout the duration of the survey, for professional and personal reasons. But the ethical concerns that came with the instability of the context, the potential dangers to both our canvassers and our research subjects, and the fact that our canvassers could also be research subjects or opponents of the protests presented a different kind of challenge. We were thus led to ask why surveying voters should be different from surveying protesters – was it because we believed that emotions were that much more powerful in the latter context? This is a consistent problem in research on protest participation. We need to catch people in the act to be certain they did participate, but protest zones are emotionally intense, crowded, loud and, as we saw in January and February in Ukraine, potentially violent environments. Moreover, canvassers who can be quickly deployed, and who have the necessary language and survey skills, will likely be locally based citizens with their own views and opinions on the protest events. This should not mean that we abstain from conducting such social science research, but rather that we must employ rules and best practices to avoid problems. While we are certain that our team was well prepared, looked after, and safe at all times, this was mostly because we had volunteers on the ground who could help monitor the situation. Better funding could have aided us in this respect.


Financial constraints

As noted above, we had to rely on shoestring funding – a great deal of work (beyond that of the canvassers) was completed for free. The majority of the funding came from a small grant and a personal research allowance from the British Academy and the Newton Fellowship. Better funding would have allowed us to hire a larger team rather than rely on volunteers; to conduct the survey for a longer period and in more municipalities across the country; to provide our canvassers with technologies that would have ensured better data and personal safety (like tablets and smartphones); and even to provide our canvassers with proper clothing (the survey took place in the dead of winter), such as coats, hats, gloves, and boots. While we were very lucky to be able to recruit our locally based colleagues’ best and brightest students as canvassers – who were eager to learn, capable, and adhered to all of our requests and rules – more funding would have benefitted both the data collection process and our team.

©2013, Tamara Martsenyuk, Olga Onuch & Ukrainian Protest Project


Conclusions

Although we are eager to know what is happening in protest zones, and journalists and academics have been happy to reference our survey and the statistics it produced, conducting on-site surveys with protest participants is not easy. Yet we see more and more protest events taking place across the globe. Civic engagement in protests now seems to be a key ingredient in established democracies (like Greece and France), in democratizing contexts (like Ukraine and Brazil), and in transition states (like Egypt). Of course, conducting protest research, especially survey-based research, in these divergent contexts presents different problems. In many ways, practice and lessons learned make perfect. Most of all, protest participation will continue to be a policy-relevant issue with, as we have seen in the Ukrainian case, potentially serious geopolitical implications. Those of us who are fascinated by the puzzle of mass mobilization must therefore also demand that more funding be granted to study this phenomenon in a serious and professional manner.

 

Olga Onuch is a Newton Fellow at Nuffield College, University of Oxford, and a Fellow at the Harvard Ukrainian Research Institute, Harvard University. She specializes in the comparative study of protest politics and political behavior in democratizing states in Latin America and Eastern Europe. An expert on protests and activism in Ukraine, she is the principal investigator of the Ukrainian Protest Project. She analyzes the mechanisms of mass mobilization in her book, Mapping Mass Mobilizations: Understanding Revolutionary Moments in Argentina and Ukraine (Palgrave Macmillan, 2014).

This article is part of our Enough! feature on Europe’s exploding social movements.

 
