
Working in Someone Else’s Country

In the United States, May is graduation season, and with that season comes an influx of both young and seasoned professionals entering or re-entering the workforce through full-time positions and short-term consultancies. May also marks the beginning of summer, when hundreds of students head “into the field” to complete research or internships in international settings. For those in international development, I have just one recommendation before you make the transition: read this book.

Photo courtesy of Amazon

How to Work in Someone Else’s Country by Ruth Stark (reviewed here by Jennifer Lentfer at How Matters) provides practical advice to mitigate some of international development’s greatest failures, as perpetuated by poorly prepared (and poorly behaved!) international workers. Think you know all there is to know about “getting development right” and working in partnership with local communities? Think again! Even the seasoned aid worker will pull out some hidden gems to act on, all while nodding in agreement at some of the cringeworthy anecdotes of consultants gone wrong. The book is especially important for evaluators and M&E specialists, since most of our work tends to be short-term and, let’s face it, particularly susceptible to negative perceptions at the community or local level.

Some key takeaways from my favorite chapters:

  • Relationship is everything…and everyone is related (the author struck gold with this first chapter title!). Time constraints can make relationship-building take a back seat, but early investment here doesn’t just pay off in the long run: it’s the right thing to do. Everyone you encounter matters, especially when you are a guest. Besides, you never know who is related to whom. And don’t gossip about local colleagues to other local colleagues. It’s not just bad form; loose lips sink ships!
  • Figure out your job and who you’re working for. The official job description or terms of reference (ToR) is only one piece of the puzzle (and sometimes the most puzzling part!). Take time to figure out the antecedents of your assignment (the who, what, when, where, and why) and what they mean for your work. Be especially alert to the political history that will guide what to do and what not to do. Because you’ll have many stakeholders with competing demands, it’s key to identify the most important client and prioritize their priorities. But most importantly, never forget the client who is not at the table to begin with.
  • What to do if you get there and nobody wants you. It’s not just about taking up scarce time, space, and resources. The truth of the matter is that your presence might be a perceived, or very real, threat on the ground. Understand, and even embrace, that reality. Meeting resistance with resistance (or, worse, imposition!) never ends well. Prepare up front by finding out how your job came into being, but also be prepared to just shut up and listen!
  • How to make them glad that you are there. Let who you are, not your credentials, define you. This can be especially hard for recent graduates. My favorite piece of advice from the whole book: Don’t give the answer until you know the question. This advice makes a great mantra for an international development professional. I would also add that your answer, when given, is never THE answer. The quickest way to lose support is by pushing your own agenda, rather than understanding and supporting someone else’s.
  • Working with your local counterparts. The most important relationship of all. Sustainability, as a buzzword, has lost all meaning, but at its core it is about ensuring that work can continue over the long term, and that means building up and investing in the careers and professional development of local counterparts. All too often, international consultants are too busy “building up” their own careers to recognize this tragic flaw. Ironically, failing here can lead to your own demise or, at the very least, a steep decline in your reputation as someone others want to work with. Local counterparts should always, always, always participate in planning and decision-making, accompany (and direct) visits with key leaders and officials, take the lead in presentation design and delivery, receive recognition in reports and publications, and so on. There is never such a thing as giving too much credit, unless it’s to yourself!
  • Working with governments. Stop criticizing and start collaborating, respect official channels and processes, and don’t argue with senior government officials. Give respect where respect is due. As the author reminds us, “never forget that you are a guest of the host country and work there only at the government’s pleasure.” I’d like to personally recommend this chapter to Madonna (she’s not exactly friends with the government in Malawi—and they’ve got good reason to be irritated).
  • Making a difference. Never stop asking yourself if your presence is making a difference for better or for worse. It’s not just about the project goals and metrics. These are meant to serve people. And people respond best to other people—caring and adaptive humans with soft skills, not unresponsive robots armed with pre-programmed tools and commands.

And, since the illustrative anecdotes about “bad behavior in the field” were one of my favorite parts of the book, I’m asking readers to contribute their own examples of “international consultants/employees gone wrong” in the comments section. There’s nothing better than learning (or unlearning) by example!

The threat of convergent thinking in M&E for international development

Like many in evaluation, I consider myself a lifelong learner; I thrive on learning, relearning, and unlearning. Knowledge isn’t static, and most of what we “know” can withstand healthy debate. I’m intrigued by the diversity in knowledge paradigms among evaluation practitioners. In fact, I’d love to see a study that analyzes how evaluators come to align themselves with particular paradigms and approaches (any takers?). Despite this diversity, I’m quite startled by the pervasiveness of convergent thinking in the field, a phenomenon that affects M&E for international development particularly strongly.

Thanks to Twitter, I recently discovered Enrique Mendizabal’s article on how labels, frameworks, and tools might be stopping us from thinking (several months old by now, but worth the time; do trust me on this one). One of his points is that tools and frameworks emphasize process to the point that space for thinking is eliminated. The proliferation of such tools creates an illusion of knowledge and expertise (punctuated by jargon), with the result that few people question either the process or the product.

In many ways, M&E has become about compliance. In my opinion, this is both a result and a cause of convergent thinking. Efforts to prove impact must be “rigorous” and “based on evidence,” which usually implies the use of research-based tools. I’ll be the first to say that such tools can be, and in many cases are, highly effective. But development practitioners and policymakers talk out of both sides of their mouths: they tout innovation while actually encouraging and rewarding convergent thinking. The accepted M&E tools and frameworks are largely created in the Western world using Western paradigms. M&E “capacity building” is often code for “M&E compliance”: training a critical mass of specialists to auto-pilot processes and principles that are supposed to encourage learning, but often teach little more than how to say and do the “right thing” at the “right time” to prove results. The presentation of rigorous tools discourages skeptics in our audiences, who all too often feel gently (or not so gently) pressured to accept and implement such tools without critical review and healthy skepticism. Real innovation requires divergent thinking. Do we need rigor and evidence and research? A resounding Y-E-S. Do we need professionals with considerable expertise and experience to help guide M&E efforts? Without a doubt. But it is the M&E specialist’s job to integrate good practice with new ways of thinking.

I recently finished the book Creative People Must Be Stopped by David A. Owens, a professor at Vanderbilt University. Owens argues that constraints on innovation occur at many levels: individual, group, organization, industry, society, and technology. The convergent thinking that limits our ability to truly innovate (to solve problems and measure impact in new and better ways) comes into play at each of these levels. In my own practice, I’m trying to take more responsibility at the individual level. This not-so-easy task includes addressing the three core components of creativity identified by Owens: perception, intellection (or thinking), and expression. I find myself, and the M&E field writ large, to be most susceptible to intellection constraints.

The first step is to eliminate stereotypes and patterns that prevent potentially relevant data from entering the problem-solving process. M&E specialists become accustomed to defining the same problems in the same ways. But what if a definition is wrong? I’m not talking about small errors in problem definitions that can be corrected through collaborative inquiry. I’m talking about widely accepted “evidence-based” definitions of problems (and their associated implementation and measurement practices) that have become almost akin to common knowledge in the field. We must challenge the definitions as well as the solutions to development problems. Unfortunately, our common problem definitions lead to common indicators and common data collection methods, which in turn lead to common solutions across projects, programs, countries, cultures, and contexts. In this sense, M&E has the potential to do more harm than good.

Monitoring and evaluation plans have become a staple of international projects, and the weight given to M&E plans in project proposals is increasing. It’s important that we, as a professional community of practice, serve as our own biggest skeptics, continue thinking critically, and avoid falling prey to “evaluation for compliance” pressures. With that said, I’d love to hear your thoughts. How can this be done? What are some examples of M&E compliance gone wrong? How have you succeeded in using M&E processes as a springboard for learning and innovation?