Posts Tagged ‘evaluation’

CONSULTANCY VACANCY – Final evaluation of the EU Human Rights Defenders mechanism

July 25, 2019


ProtectDefenders.eu announced on 23 July 2019 that it is looking for a consultancy team to conduct an external evaluation at the end of the first phase of the Project. This evaluation should focus on documenting the impact that the EU Human Rights Defenders mechanism has had on the situation of human rights defenders during 37 months of implementation, and on whether the Consortium has delivered in accordance with the Project proposal and the main objective of the Project. The evaluation should focus in particular on the direct support to human rights defenders under the Project's components and should compare this with other programmes and the broader EU support to HRDs. It should also provide recommendations to improve the relevance and effectiveness of the EU Human Rights Defenders mechanism in its follow-up phase of implementation. See also: https://humanrightsdefenders.blog/2019/05/28/the-eu-human-rights-defenders-mechanism-a-short-overview/

The evaluation should concentrate as far as possible on the EU Human Rights Defenders mechanism in its entirety. The intention is not to evaluate the performance of the individual ProtectDefenders.eu Partners, although comparisons of practices can be used, where relevant, to draw lessons learned and to improve overall performance.

The terms of reference of this assignment are available here.

Applicants are requested to send their submissions to recruit@protectdefenders.eu, with the subject “Evaluation consultant”, by 16 August 2019.

Evaluations should also make it possible to find unexpected impacts of human rights work

January 15, 2016

Evaluations of human rights work should not be just a results assessment but rather, in line with Emma Naughton and Kevin Kelpin, a learning process to discover unexpected impacts. Muriel Asseraf, in an article in Open Democracy on 22 October 2015, “Finding the unexpected impacts of human rights work”, argues just that in the context of Conectas in Brazil:

Only by understanding if advocacy strategies have been effective and why (or why not) can we understand whether it would make sense to replicate them. Last year, for the first time, Conectas and partner organizations from the Criminal Justice Network launched a large media campaign against the practice of invasive strip-searches for family members who visit their relatives in prison. The impact of the campaign was two-fold: in the state of Sao Paulo, where the campaign was launched, a law was passed to ban the practice, which in and of itself was a great victory. In addition, while not endorsing “human rights” directly, a new audience started to empathize with the situation of these women—grandmothers, mothers or daughters—who have to go through this humiliating treatment in order to visit their relatives in prison. By appealing to people’s understanding of the barbaric situation that prisoners’ relatives have to go through, as opposed to prisoners themselves, the campaign gathered unprecedented support. This impact was unexpected, and learning to identify it has helped us think about other human rights campaigns that could rally an even larger audience to our causes.

In fact, the unexpected lessons that we learn from our evaluation processes happen often. They are frequently surprising and always relevant, and they have informed our strategies and planning processes in ways both profound and constructive.

For example, another evaluation process helped us understand that Conectas’ use of international mechanisms was largely reinforced by how the international press covered the case. Resolutions and recommendations do impact official interlocutors, but if recommendations are somehow featured in international dailies, the reaction of government officials can be much more rapid….

Over time, we have raised our team’s awareness of the need to evaluate their work. Conectas now carries out rigorous planning processes: based on our five-year strategic plan, and our three-year tactical plan, our programs and areas develop annual operational plans that are reviewed twice a year during formal evaluations. The teams themselves conduct these evaluations because, as Naughton and Kelpin have also noted, they are the best suited to understand the subtleties and complexities of a particular situation, and to identify changes or unplanned impacts that others might not see.

During these evaluations, we try to consider not only the quality of the implementation of any given action—although that is also a critical part of the process—but more importantly the feedback of important stakeholders. Participants in our bi-annual Colloquium are asked to answer an opinion survey at the end of the event, as well as six months later in order to measure the impact of the event on their lives and work. Readers and contributors to the Sur Journal are also regularly questioned about how relevant and useful they find the articles for their work.

These survey results have at times been surprising, such as the finding that despite our many efforts to disseminate the print edition of the Sur Journal, the online version has a much larger following. As a result, we decided to transform it into a primarily online journal. The Colloquium surveys have also revealed important elements about the program and format of the meeting….

…. by remaining open to unexpected results, we hope to always evolve and adapt to what is around us. And we can only hope that each organization will do the same, in order to build a more complete view of the field and create more effective interventions.

Full article at: Finding the unexpected impacts of human rights work | openDemocracy