Key informant interviews focused on specific areas depending on the perceived knowledge, experience and expertise of each informant; however, opinions and information gained from interviews reiterated similar themes, including the mechanisms and public nature of ECA report delivery. One recurring issue was the quality of the reference slides used, together with the recommendation that new slides be prepared by the regional slide bank. Both are difficult issues. Results are provided publicly to emphasize transparency and the importance of using competency as a basis for validation and expertise within programmes. Regarding slide quality, the slides are made by experienced technicians, subject to quality control and well validated, but some variation in quality is inevitable when producing large slide sets. However, although competency includes the ability to interpret imperfectly prepared slides, consistency and quality across the ECA slide set are far greater than those encountered in the field and should therefore constitute a fair assessment.
Many participants had concerns regarding the assessment’s emphasis on application of the new parasite quantification method recommended in the WHO QA manual. The method followed by the ECA programme caused some confusion among participants used to alternative methods; several participating countries continue to use the “plus method” and are unfamiliar with more accurate methods. Very few record their results as parasites/μL during routine work. The emphasis on quantitation methods during the course probably accounts for the improvement by most participants in the post-ECA assessment. Parasite quantitation clearly needs to be a continued focus of additional training, as quantitation is one of the main advantages of light microscopy, and dissemination of correct counting methods to other microscopists in countries should be emphasized [10, 15]. It is unusual for a major disease programme such as malaria to have had such variation in parasite quantification methods in the past.
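For readers unfamiliar with quantitative counting, the arithmetic behind reporting results as parasites/μL can be sketched as follows. This is an illustrative example only: the function name and parameters are hypothetical, and it assumes the widely used convention of counting parasites against a fixed number of white blood cells (WBCs) on a thick film and assuming approximately 8,000 WBC/μL when the patient's actual WBC count is unknown.

```python
def parasites_per_ul(parasites_counted, wbc_counted, assumed_wbc_per_ul=8000):
    """Estimate parasite density (parasites/uL) from a thick-film count.

    Parasites are tallied against a fixed number of WBCs (commonly 200
    or 500); density is scaled up using an assumed WBC concentration of
    ~8000 WBC/uL of blood when the true count is unavailable.
    """
    if wbc_counted <= 0:
        raise ValueError("WBC count must be positive")
    return parasites_counted * assumed_wbc_per_ul / wbc_counted

# Illustration: 60 parasites counted against 200 WBCs
# -> 60 * 8000 / 200 = 2400 parasites/uL
density = parasites_per_ul(60, 200)
```

Unlike the semi-quantitative “plus method” (+, ++, +++), this yields an absolute density that can be compared across patients and over time, which is why dissemination of the counting convention matters.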
Participant feedback also revealed particular difficulties faced by some microscopists with identification of some parasite species, particularly P. ovale and P. malariae, and this was sometimes considered to unfairly impact microscopists from areas where these two species are rare. However, as the programme aims to test the inherent expertise of national reference-level microscopists, it can be safely argued that microscopists at this level should have the skills to diagnose less commonly seen malaria species and unusual cases.
Implications of the ECA for participants and national programmes
The ECA programme in the Asia Pacific has demonstrated that marked improvement in the competency of experienced microscopists can be obtained between pre- and post-ECA exercises after only three to four days of consolidated revision and review of techniques. The 27% improvement in species identification illustrates the rapid impact of correcting basic errors. The programme currently runs on a budget of approximately 12,000–17,000 USD per course per country (including the costs of an international facilitator), excluding maintenance and replenishment of the Regional Malaria Slide Bank at RITM, demonstrating that large inroads in capacity can be made without substantial resources. ECA costs are predominantly absorbed within country budgets, but designated central funding is required to maintain the bank and coordination.
The function of the accreditation programme is dependent on the adequacy of the regional slide bank described earlier, and its success is dependent on the accuracy of the specified results of each slide. This was achieved through the development and use of standard operating procedures (SOPs) for slide preparation and an emphasis on accuracy of validation, which involved six pre-qualified microscopists and the use of PCR as the final arbiter of species. This is necessary, as it is essential that participants are not penalized for reporting species that are present at very low density and missed by validators through chance. Slide production, validation and filing required considerable personnel time, dedicated staff, and a budget for ongoing bank maintenance. The bank collection was undertaken by Ministry of Health staff, and the respective programmes involved retained 50% of slides for national use, which enabled national programmes to dedicate personnel time to the project. However, the funding for the bank, while modest, has been difficult to secure as it does not fit well with the bilateral funding model followed by most donors in this field. An enlightened and flexible approach by USAID/Asia staff enabled this project to continue. Dedicating funds for such ‘common good’ projects would be a low-cost, high-impact improvement to some funding budgets.
Introducing an accreditation of competency can be highly threatening to members of any profession who are unfamiliar with it. It opens the unwanted possibility of revealing skills-based weaknesses before colleagues and supervisors, with potential career and financial implications. Accreditation can also highlight, by implication of non-participation, a lack of technical competence among senior officials and managers who do not actively participate in the ECA. Data collected from ECA participant feedback forms, distributed surveys, and key informant interviews suggest that the majority of participants view the ECA programme positively, despite the possible risk to reputation arising from the policy of making competency results available to the programme and colleagues. In addition to improving existing abilities to identify species and quantify parasites, ECA activities are an opportunity to update wider knowledge and microscopy skills. As participants showed considerable improvement between pre- and post-assessment, the accreditation exercise should help to build significant levels of self-confidence in skills and knowledge. In turn, it is hoped this will lead to increased respect for, and trust in, their competency by supervisors, by colleagues whose slides they are crosschecking, and among clinicians who need to have trust in diagnostic results.
The value of ECAs will be greatly enhanced if the results are disseminated back to the national malaria programmes and result in an early impact on the programmes. To maximize the effectiveness and sustainability of ECA activities, it is imperative that continued refresher training and crosschecking by the competent participants be integrated into national malaria programmes. However, key informants frequently noted that an adequate mechanism was lacking to share recommendations from the ECA facilitator through the WHO to the countries concerned and to then follow them up. Currently, the course facilitator leaves a hard-copy draft of the ECA report with the host laboratory and/or programme manager as well as the WHO country representative; a final copy of the report is sent to the WHO Regional Office as well as the coordinator, ACTMalaria. Although the course facilitator is able to briefly discuss findings and recommendations before leaving a country, it is apparent from the results that some programmes are unable to make effective use of the findings. Consequently, pro-active measures are being taken by WHO to enhance the effective communication of course recommendations. WHO country representatives and malaria personnel now facilitate the dissemination of findings and recommendations to national programmes, in collaboration with ACTMalaria. A recent WHO-organized regional malaria programme managers meeting focused on quality assurance for malaria diagnosis, informed by this needs assessment, to initiate revision of country malaria QA system work plans.
WHO gives clear recommendations for a hierarchical structure for national malaria microscopy QA programmes, based on regular competency assessments and retraining within the national system, supported by a national slide bank. However, banks of well-prepared, stained examples of malaria parasites are uncommon. Parasite specimen collection and the validation of slides, including PCR, require significant resources and skills often lacking within national programmes. National slide banks must have credibility regarding the validity and accuracy of their slides, which sometimes means external validators are required. While the WHO manual recommends a simplified crosschecking system, this still requires significant logistical organization and availability of technical expertise. While the ECA seeks to put essential elements of this in place, it appears most countries have not fully utilized it, either through lack of resources or through giving it insufficient priority.
The increase in skills within and between courses, and the potential of the programme to catalyze skills development, is demonstrated well in the Philippines, where seven ECAs have been held and a national structure is in place to utilize reference microscopists and accredit all microscopy technicians within the national system. In three ECA activities conducted in the Philippines, microscopists averaged 94%, 91%, and 91% accuracy respectively in the pre-ECA assessments on parasite species identification. Despite having little room to improve, these microscopists further improved their ability to identify different species of malaria parasites (averaging 95%, 96% and 96% accuracy in post-ECA assessments). High performance on pre-ECA assessments is a good indicator that microscopists regularly participate in ongoing training and maintain high levels of competence. All of the Philippine microscopists assessed during the three recent ECAs achieved Level 1/Expert or Level 2 accreditation, and these reference microscopists are active in slide validation/crosschecking and supervisory visits. Additionally, regular ongoing training for microscopists is in place within the national programme with emphasis on weak areas such as parasite quantitation.
Further limitations of current ECA programme
The contents of the Regional Malaria Slide Bank, and consequently the slide sets used in the assessments, varied somewhat over time as lost slides were replaced and the bank expanded, but the evaluation set remained within the parameters in Figure 2. There will inevitably be variation in ease of parasite recognition between slides from different patients, and in the accuracy of quantitation of individual slides, which is based on averages of 12 expert readings of slides from the same set. However, the ability of reference microscopy cadres from some countries to achieve a high frequency of Level 1 results after retraining, as seen uniformly in the Philippine ECAs, indicates that, despite some variability in slide quality, a microscopist with sufficiently high expertise can expect to have that competency reflected in a Level 1 rating.
Recruitment of suitable facilitators was a chronic difficulty encountered in this programme. A facilitator must be a highly competent microscopist with WHO Level 1 accreditation, a positive attitude and proven instructional and facilitation skills, as well as the time to dedicate to the task. Collaboration between Regions is needed to provide such a pool of facilitators for the ECA exercises, making the programme more sustainable. Synergies may also be possible with microscopy QA programmes for other diseases, such as tuberculosis.