Sunday, January 18, 2015

Key takeaway points from Wilton Park Conference

Wiston House at Wilton Park, West Sussex, England
I attended the Wilton Park Conference on Treaty Body Strengthening held this week. About 60 people from 28 countries participated, drawn from governments, treaty bodies, the UN Human Rights Treaty Division, academia and NGOs. The discussion was lively. The facilities were stately. And the environment was picturesque. What a beautiful setting within which to have this discussion. Thanks also to the governments of Norway and Switzerland for their financial support of the conference.

The purpose of the conference was to exchange views and challenge our thinking on the human rights treaty body system. What should be the next steps? How can we improve the protection of rights holders and the compliance of governments with their human rights treaty responsibilities?  Where are the present gaps in the system?

All views expressed were for non-attribution, so that a free flow of ideas could result.

Here are some of my key takeaway points from the conference:

  • GA resolution. Many of the measures recommended in last year's UN General Assembly resolution 68/268 have already been implemented or are well on their way, including expansion of the treaty body meeting time from 72 weeks to 95 weeks in 2015. This expansion has required a major reshuffling of the treaty body calendar.  It now means that at least two treaty bodies are in session at virtually every point during the year.  
  • Not enough. To assume that the measures adopted in last year's GA resolution will put the treaty body system on a stable, sustainable footing for the foreseeable future would be to seriously underestimate the challenges the system faces. We need to start thinking now about how the future will affect the capacity of the system and find ways to meet those pressures. 
  • Paradox. Yet radical transformation of the treaty body system takes time and probably needs a crisis to trigger its consideration. The paradox is that we need a crisis to make such a radical shift, but we will not have the time to do so once the crisis is upon us.  How do we solve this problem?
  • Implementation. The implementation topic is an especially complex one. How do we get more attention on implementation of treaty body recommendations without drawing too much time away from the treaty bodies' other core responsibilities? Some participants also believe the treaty bodies should not be the primary actor monitoring compliance; if not, who then, and how? I think most of us agree that the follow-up procedures adopted by some of the treaty bodies, whereby certain recommendations are flagged for special follow-up within one year, have been a good development. But measuring their success and spreading the best practices of the procedure to each of the treaty bodies is a challenge. Also, has this development in some ways drawn attention away from the broader implementation topic -- tracking to make sure that each state party has responded to and implemented each of the prior recommendations of the treaty body, not just the ones specially flagged in the follow-up process? Follow-up to individual case decisions is also critical, but this information tends to get lost in the overflow of information in the system. How many judgements of the treaty bodies, where violations by a state have been found and a remedy required, have actually been implemented by the state concerned? How do we track this information and put the appropriate pressure on states to comply with these responsibilities?
  • Awareness. Promoting better public awareness of the treaty body system is another difficult challenge. More and better social media, webcasting and media outreach were all discussed. I tend to think that webcasting, especially, will be the entry point to most new awareness of the system. We should be prepared for this opportunity and make sure, as webcasting becomes more available, that information links provide the necessary basic information about the treaty body system. But the public awareness topic needs some serious brainstorming and planning in order to make the treaty body system better known to the public at large. 
  • Migration. Migration is a special problem. The Committee on Migrant Workers is in danger of becoming irrelevant in the system. The Convention was adopted 25 years ago, in 1990. Yet this treaty has the smallest number of ratifications, at 47, with no new ratifications received in 2014 (only the Convention on Enforced Disappearances has fewer, at 44, but that treaty has just gotten started and is only 3 behind). The Migrant Workers Convention has poor reporting compliance: only two state reports were received during all of 2014. It has an individual complaint mechanism that has still not come into force for lack of ratifications (it needs 10; it has only 2). When we think of the treaty body system failing we usually think of work overload; in fact the more near-term failure might be the part of the system that is being ignored and underworked. If one part of the system fails, can it have a knock-on effect on the next weakest link, and then the next, and so on? If state parties who do not wish to comply with their reporting and implementation obligations see that one treaty body has failed from lack of use, doesn't it become easier to choose to disengage from other treaty bodies? I believe this makes the Migrant Workers Convention especially important to focus on, to make sure it becomes a healthy part of the treaty body system. Strengthening a system must include strengthening its weakest parts, not just addressing its capacity issues. There is a critical need for more ratifications and better compliance with state reporting requirements. Other ways to make the CMW more relevant in the system as a whole should also be explored, such as joint general comments, more NGO involvement, and more publicity efforts. 
  • Individual complaints. Individual complaints are another topic that tends to get overlooked when we look at the existing system. The GA resolution had very little to say about this part of the system. Some problems with the handling of such complaints also surfaced in the discussions at this conference and should be examined more fully. The number of complaints submitted is relatively small compared to other aspects of the system: about 100 complaints per year, with the majority going to the Human Rights Committee. Yet we now have 8 of the 10 treaty bodies with a complaint mechanism in effect. If there is a sudden uptick in the number of individual complaints, how will the treaty body system handle them? 
  • Going back to old ideas too. Brushing off old ideas and rethinking them makes sense at this juncture too. These include prior proposals from the Dublin process, earlier High Commissioner proposals, and other ideas that surfaced in the most recent round of treaty body strengthening discussions but did not make their way into the GA resolution. 
  • TB Chairs. The leadership role of the treaty body chairs was noted by several participants. They have emerged in the last couple of years as a positive force in the system. They were a very important influence on the final results in the GA resolution. The question becomes how to institutionalise that leadership in a positive way that adds value to the system.
  • Harmonisation. How far should efforts to harmonise practices between the treaty bodies go?  Do the treaty bodies themselves believe in harmonisation? These questions were raised a number of times in various discussions at the conference, without any clear answers emerging. It seems it is a difficult issue to grapple with in the abstract, but also in the specifics. A balance of diversity and alignment would seem to be called for, but how to orchestrate the balance? And how to recognise when we have reached the right balance? 

Conclusion

These are just a few takeaways that I had after attending the conference. Thanks to the sponsors who invited me and thanks to all who participated for the rich discussion. It was a very thought provoking and enriching experience. Now, onward to action!



Wednesday, January 7, 2015

Improving implementation

How can the treaty bodies improve implementation of their decisions and recommendations?

There are several ways to look at this issue.

  • One is to compare the current reporting volume with the numbers that would be achieved at full performance. States should after all be reporting on how they are implementing the relevant human rights standards.  
  • A second approach is to look at how often a state report actually responds to and explains how the government has implemented each of the Committee's prior recommendations. 
  • A third approach is to evaluate the responses to the special follow-up mechanisms that several of the Committees have put in place.

This article takes a look at each of these approaches and offers some recommendations to the treaty bodies on procedures they might adopt to improve implementation in the future.

1. Comparing actual to theoretical numbers of state party reports as an indirect measure of timeliness of reporting


Currently states are submitting approximately 100 reports per year. This represents approximately 36% of the number of reports that would be received if all states were submitting their reports on time.


Treaty body   States parties   Periodicity (yrs)   Expected reports/year
CCPR          168              4.5                 37
CERD          177              4                   44
CESCR         162              5                   32
CAT           165              4                   39
CRC           194              5.5                 35
CEDAW         188              4                   47
CMW           47               5                   9
CRPD          151              5                   30
CED           44               6                   7
Total         1296                                 280

The periodicity figures above (e.g., 4.5 years between reports for CCPR) represent actual practice in cases where the treaty-prescribed interval has not been enforced for various reasons, or where the treaty specifies no interval at all (CCPR has no specified interval; CERD's treaty-specified 2-year interval is too short in practice; CRC has moved to a 6-year interval because of its workload). So approximately 100 reports are received each year, against the 280 that would arrive if every state were filing on time. Present performance therefore represents approximately 36% of a full reporting compliance rate. 
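The arithmetic is easy to reproduce. Here is a minimal sketch (Python) using the figures from the table above; because the table rounds each per-committee figure (and uses 39 rather than 165/4 for CAT), the computed total comes out slightly above 280, but the bottom line is the same, roughly 35-36%.

```python
# Expected reports per year = (number of states parties) / (reporting
# interval in years), summed over the nine reporting treaty bodies.
# Intervals reflect practice, not necessarily the treaty text.
treaties = {
    "CCPR":  (168, 4.5),
    "CERD":  (177, 4),
    "CESCR": (162, 5),
    "CAT":   (165, 4),
    "CRC":   (194, 5.5),
    "CEDAW": (188, 4),
    "CMW":   (47, 5),
    "CRPD":  (151, 5),
    "CED":   (44, 6),
}

expected = sum(parties / interval for parties, interval in treaties.values())
actual = 100  # approximate number of reports actually received per year

print(f"Expected if all states reported on time: {expected:.0f} per year")
print(f"Actual as a share of expected: {actual / expected:.0%}")
```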

This figure points in the same direction as the specific data for the reports reviewed by the treaty bodies in 2014: my review of those 130 reports indicated an on-time compliance rate of 22%. 

2. Determining how many reports responded to and implemented each of the Concluding Observations from the prior reporting cycle


Each new periodic report should reply to all recommendations made by the treaty body arising from the prior report. Of the 130 reports reviewed by the treaty bodies in 2014, 96 were periodic reports (the remainder were initial reports, where no prior concluding recommendations existed). Of these 96 reports, by my count, states replied to all of the prior recommendations in approximately 54 instances, representing 56% of the total. 


Treaty body   Periodic reports   Replied to all   Percent
CESCR         18                 12               67%
CCPR          14                 9                64%
CRC           17                 8                47%
CERD          14                 7                50%
CEDAW         20                 13               65%
CAT           8                  5                63%
CRPD          0                  0                --
CED           0                  0                --
CMW           5                  0                0%
Total         96                 54               56%

No reports are listed for CRPD or CED in the above table because neither treaty body has any periodic reports to review yet, only initial reports. The SPT is not in this table because its procedure differs substantially from the others. For CAT, CCPR and CMW it is also not clear how best to evaluate reports submitted under the simplified procedure, the so-called LOIPR procedure: if the state party fails to reply to all prior concluding recommendations, but the list of issues prepared by the Committee does not ask it to do so, is it excused from further response? 

It should also be noted that the above statistics are my subjective evaluation of each government's responsiveness. In some cases it was difficult to determine whether the government had responded to all of the prior concluding recommendations or only some of them, since there was no separately delineated list, table or labelling that identified each recommendation and the government's response to it. In some such cases I assumed, notwithstanding the ambiguity, that the government had responded to all of the recommendations; in others the responses seemed so brief that I did not count them as a compliant response to all recommendations. 

The above figures also reflect only whether a response of some type could be found in the report; I have not attempted to measure the quality or completeness of each particular response. So this is a very distant measure of actual implementation. But it is some indication of how far we are from a full implementation picture. 
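For transparency, the percent column in the table above can be recomputed from the raw counts; here is a minimal sketch (Python), where the counts are my own subjective coding as described:

```python
# (periodic reports reviewed in 2014, reports replying to all prior
# concluding observations) -- my own coding, per the text above.
counts = {
    "CESCR": (18, 12), "CCPR": (14, 9), "CRC": (17, 8), "CERD": (14, 7),
    "CEDAW": (20, 13), "CAT": (8, 5), "CMW": (5, 0),
}

for body, (reports, responded) in counts.items():
    print(f"{body}: {responded}/{reports} = {responded / reports:.0%}")

total_r = sum(r for r, _ in counts.values())
total_s = sum(s for _, s in counts.values())
print(f"Total: {total_s}/{total_r} = {total_s / total_r:.0%}")  # 54/96 = 56%
```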

3. Follow up mechanism


Six treaty bodies now have a follow-up mechanism of some type, which requires states to respond within one year on 3 or 4 identified recommendations. A special rapporteur is then appointed to review these responses and report to the Committee. Four Committees have now been using this practice for several years: CCPR, CAT, CEDAW and CERD. The information available on each Committee's website varies in completeness, so it is somewhat difficult to track performance. 

I studied the CERD process this year. The Committee posts the responses from each state party and sends letters commenting on the completeness and relevance of the responses. No NGO submissions are posted or summarized. An annual report comments briefly on the responses received, but provides no evaluative comments on their quality or completeness. I also noted Committee letters thanking state parties for a "timely" response even though the responses were up to 9 months late.

For purposes of this analysis I looked at all of the responses posted to date for sessions 66 (March 2005) to 83 (August 2013), since responses are now overdue for all of these sessions. 
  • total number of state reports in this time period -- 162
  • total number of states who responded in some form -- 85 (52%)
  • total number of these states who responded on time -- 47 (29%)
As indicated earlier, it is not possible from this data to analyse how many states actually implemented the recommendations presented, but anecdotally it would seem to be a very low number, perhaps between 5 and 15%. The method of posting information and reporting on results also makes it difficult to evaluate implementation performance. But from the above data it is at least apparent that 77 of the 162 governments failed to respond at all to the Committee's request for follow-up information on the implementation of its recommendations. 
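The same percentages, spelled out (a trivial sketch in Python):

```python
# CERD follow-up responses, sessions 66 (March 2005) to 83 (August 2013).
total_reports = 162  # state reports with follow-up requests in the period
responded     = 85   # responded in some form
on_time       = 47   # responded within the one-year deadline

print(f"Responded in some form: {responded / total_reports:.0%}")  # ~52%
print(f"Responded on time:      {on_time / total_reports:.0%}")    # ~29%
print(f"Never responded:        {total_reports - responded}")      # 77
```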

RECOMMENDATIONS

  1. Standing agenda item. Each Committee should have a standing agenda item at each session on the subject of implementation. The follow-up reports should be part of this item, along with general discussion, statistical summaries, and Committee decisions. NGOs and NHRIs should be invited to participate.
  2. Statistical summaries. Statistical summaries on implementation should be prepared and kept up to date. A discussion of these summaries should be included in the standing agenda item on implementation.
  3. Table or index to responses in periodic reports. State parties should be encouraged to include a table or index in each periodic report, mapping each prior concluding recommendation to the location in the report where the government has provided a response on implementation. This change should be added to the latest reporting guidelines of the Committee. It should also be made an explicit request in every LOIPR. 
  4. Case decisions. Implementation of Views (individual case decisions) should also be more prominently reported on by each Committee. Each case in which a state party has been found to have committed a violation should be tracked and reported on. Government responses and petitioner responses should be posted or summarized (subject to petitioner consent). The relevant time period should also be clarified -- most decisions say that the government must implement the findings in 90 to 180 days, but it is not clear how this period is measured, especially in cases where there has been a substantial delay between the official date of the decision and the date it first becomes public. 
  5. Guidelines. A best-practice set of guidelines for implementation of Committee recommendations and judgements should be produced by each Committee.
  6. Training. More training opportunities, or an "implementation school", should be offered to state parties.
  7. Implementation teams. State parties should be encouraged to appoint their implementation teams before their report is heard by the Committee; members of the implementation team should be offered training before they start and should be encouraged to attend the Committee hearing from which the recommendations they will be implementing originate.
  8. Official government websites. Governments should be expected to maintain official websites where their treaty responsibilities are disclosed, including reporting cycles and treaty body appearances. Each recommendation should be tracked, with implementation plans and activities reported.  A model list of website content should be provided (in this regard please see the suggestions I have made for these government websites in a prior article on this blog). 
  9. Civil society. NGOs and NHRIs should be given greater access and visibility in the follow-up processes. NGO submissions should be posted and reviewed. If a state party is late in submitting a response, the Committee should consider going forward with an analysis that includes the NGO report.
  10. Analytical summaries. More evaluative comments should be included by the Committee. It should not be enough that the state party has responded; the quality of the response should also be satisfactory. Has the state party implemented the recommendation? Is additional information needed? An analytical summary of these comments should be available at each session, updated to the latest comments. The Human Rights Committee has initiated a practice of assigning a "grade" to responses; something like this should be adopted by the other Committees, and it would be useful if the Human Rights Committee were to issue a statistical summary of these graded responses. 
  11. Database management. The follow-up databases on each Committee's website should be regularly updated and include any NGO submissions received. Analytical materials from each session should be posted to the database within a prescribed time period, so that those looking for this information know when to check back. Alternatively, there should be a prominent notice on the database itself indicating when the latest update was made, along with an opportunity for interested persons to sign up to be notified of updates.
  12. Press releases. Press releases should be issued from time to time, informing the news media of the follow-up process, providing some details, statistical or otherwise, and providing a link to the relevant database information.
  13. Timeliness of state party responses. Please do not refer to responses filed up to 9 months late as "timely" in transmittal letters to the state party. This softens the deadlines to the point of losing their credibility. It is hard to measure the timeliness of submissions when responses 6 to 9 months late are called timely. 
  14. Consider different modalities. The treaty bodies should consider a variety of different modalities for reviewing implementation. Country visits and implementation hearings should be considered. One very interesting example is the country visit made by Mr. Morten Kjaerum of CERD at the invitation of the government of Ireland, in the implementation of the Committee's concluding recommendations in 2006. 
  15. Drafting. Treaty bodies should attempt to improve the readability of concluding observations. See my article earlier this week on this subject. 

Monday, January 5, 2015

How easy are treaty body recommendations to understand?

Concluding observations are a key output of the human rights treaty body system. Each state report is reviewed and discussed with the government concerned and then a report is issued of the treaty body's conclusions and recommendations.  In 2014, there were 130 such reports, approximately 1400 pages and 6700 total recommendations.  How readable are they? How easy are they to understand?

Of course readability is not the only relevant measure of treaty body effectiveness, but it is an area where improvements are perhaps possible. One way of assessing readability is to use one of the well-known formulas for measuring the complexity of written text. A readability score, for example, can be obtained for any English-language text at Readability-Score.com, which uses several formulas including the Flesch-Kincaid reading ease method. The website counts the words, syllables, and sentence lengths in any text that you copy-paste into the text box on its homepage. A useful description of this and other readability assessment methods is given in this Wikipedia article.

A "good" reading score is a matter of some debate but would at least be anything above 60, with a grade level score of approximately 12.  A score between 90 and 100 is said to be easily understood by an average 11 year old child. A score between 60 and 70 is easily understood by a 13-to-15 year old child. This assumes of course that the child is in school and is fluent in English. Scores of 30 and below are best understood by university graduates.
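As a side note, the formulas themselves are simple to compute. Below is a minimal sketch (Python) of the standard Flesch reading-ease and Flesch-Kincaid grade-level calculations, with a deliberately crude syllable counter; it will not match the website's scores exactly, since real tools use dictionaries and better heuristics for counting syllables.

```python
import re

def count_syllables(word):
    # Crude heuristic: one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # average words per sentence
    spw = syllables / len(words)   # average syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level

ease, grade = flesch_scores(
    "The Committee urges the State party to adopt the law without delay.")
print(f"Reading ease: {ease:.1f}, grade level: {grade:.1f}")
```

Longer samples, like the 1-2 page extracts used below, give much more stable scores than a single sentence.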

I took a 1-2 page writing sample from a recent concluding observations report for each of the ten human rights treaty bodies (I used a country visit report in the case of the SPT). This produced the following scores:


Treaty body   Reading ease   Avg. grade level
SPT           41.6           13.4
CMW           31.8           14.3
CCPR          30.3           14.8
CED           27.8           16.7
CEDAW         25.5           16.3
CESCR         24.4           16.5
CRPD          18.4           16.6
CERD          15.6           17.9
CAT           14.5           18.6
CRC           14.4           19.1


For example, if you take the first set of scores in this table, the recommendations of the Subcommittee on Prevention of Torture (SPT) scored 41.6 in reading ease, meaning the recommendations could apparently be easily understood by someone with at least 13.4 years of schooling (a person who had graduated high school and had 1.4 years of college). 

By contrast, the Universal Declaration of Human Rights (UDHR) has a readability score of 50.2 and an average grade level of 10.7.

From this scoring summary it would seem that there is room for improvement. None of the treaty body samples reached the "good" range of 60 or above. None reached the 50 level that the UDHR achieved. None is easily readable at the level of a high school graduate (12 years of schooling). Seven of the ten received scores below 30, meaning they are pitched at the comprehension level of university graduates or above. In fact the Committee on the Rights of the Child (CRC) produces the most complex recommendations of these 10 samples, requiring on average at least 19 years of schooling to be easily understood; clearly they are not currently drafted in a way that can be easily understood by a child, even though children are, or should be, part of the intended audience. 

Modifying a drafting style to create more readable text is not an easy task, especially given the time constraints and subject matter faced by the human rights treaty bodies. Indeed, the relatively higher readability score of the SPT may be partly explained by the different modality of its report preparation: the SPT drafts its reports after country visits and between sessions, rather than during the session in which the report is being discussed with the government concerned. 

However, whatever the difficulty of finding the time to improve the writing style of concluding recommendations, it would seem a worthwhile effort to consider as the treaty bodies focus on methods of improvement.   

Saturday, January 3, 2015

Summary of 2014

2014 was potentially an important year for the human rights treaty bodies. In April the UN General Assembly adopted resolution 68/268, providing a substantial increase in resources and setting in place several reforms that will hopefully strengthen the system going forward.

From an overall system perspective, here is a summary of some of the key outputs:


Concluding observations & recommendations
  • 131 sets of recommendations, approx. 1400 pages, approx. 6700 total recommendations
  • Average = 11 pages & 50 recommendations per state party report

State reports received during the year
  • 94 initial & periodic reports + 18 common core reports
  • Note that these numbers resulted in a modest reduction of the system backlog, since 131 reports were reviewed and only 94 new reports were received in the same period

State reports reviewed
  • 131 reports, approx. 8553 pages
  • approx. 63% exceeded the recommended page limits
  • approx. 78% were not submitted by the required due date
  • Average of 66 pages per report, which is 65% longer than the 40 pages recommended for most reports (initial reports are the main exception, with a 60 page limit, but they comprise less than 5% of the total number of reports; CRC reports also had a 120 page limit during this period)

Cases/decisions issued
  • 127, an approx. 20% increase over prior years
  • still missing the decisions from CAT's November session (approx. 10-15 more expected)

New ratifications
  • 58, a 2.7% increase from 2013, comprising ratifications of treaties (39) and acceptances of individual complaint mechanisms (19)
  • This represents more ratifications than 2013 (46 / 2.2%) and about the same as 2012 (59 / 2.9%)

New instruments entering into force
  • CRC OPIC, the individual complaints mechanism of the Convention on the Rights of the Child, was the only new instrument to enter into force in 2014

Treaty body experts/countries
  • 172 experts from 87 countries

New general comments
  • 5 new general comments (including one joint comment):
  1. CCPR #35, liberty & security of person
  2. CEDAW #32, women asylum seekers
  3. Joint CRC #18 / CEDAW #31, harmful practices
  4. CRPD #1, equal recognition
  5. CRPD #2, accessibility

Treaty body weeks
  • 77 weeks, including 13 weeks of pre-sessional working group time

Harmonization of mechanisms
  • Simplified reporting procedure (LOIPR): 5 treaty bodies now use it (CCPR, CAT, CEDAW, CMW & CRPD)
  • Reprisals focal point: 2 have appointed one (CAT & CCPR)
  • Follow-up on concluding observations: 6 have a mechanism (CCPR, CERD, CEDAW, CAT, CRPD and CED), but practice varies and is very basic
  • Follow-up on Views: 5 have a mechanism (CCPR, CAT, CERD, CRPD & CEDAW; CESCR also now posts a table of pending cases on its website), but practice varies and is not very transparent
  • Addis Ababa conflict of interest guidelines: adopted by 9, apparently all except CCPR