This is an archive article published on February 28, 2007

The devil is in the leakage

It is not enough that the outlays and number of social sector programmes are increased. We need ongoing evaluation of such schemes to find the gaps and plug the leakages rapidly.

Caught between the conflicting pressures of the Left and the Right, the budget has become a mere exercise in fund allocation and the tweaking of tax rates. The allocation of funds is also on standard lines, except in one area: social sector schemes. The total outlay on social services has been increased to Rs 80,315 crore, up from Rs 59,143 crore (2006-07 RE). This accounts for about a quarter of total expenditure and includes programmes such as the Sarva Shiksha Abhiyan, the Midday Meal scheme and the Integrated Child Development Services scheme.

While others debate the lack of direction and stagnation of the reform process due to political pressures, there is one area that the government can strengthen on the back of unanimity across the political spectrum — plugging leakages in social sector schemes.

The Economic Survey 2006-07 lists three priorities for the government: managing and sustaining high growth; bolstering fiscal prudence and high investment; and improving the effectiveness of government intervention in the social sector. Throughout the document, emphasis is placed on the need to plug leakages in service delivery and to focus on outcomes. Rajiv Gandhi's famous statement that only 15 paise out of every 100 reach the poor has since been supplemented by many studies reporting eventual percolation to the intended beneficiaries of anywhere between 5 and 50 per cent. Then there are the fascinating cases, such as commodities stolen from the PDS finding their way back into FCI godowns. With the government committed to ever-increasing social sector expenditure, scientific evaluations need to be built in as an integral part of policy formulation and administrative oversight.

Evaluations are conducted in various ways, but a proper evaluation must source its information from the user of the government service, the consumer. Whether it is the PDS, primary education, primary health or the employment guarantee programmes, the only good way to measure success is to ask the beneficiaries whether, and how, they have benefited.

There are hundreds of monitoring and evaluation studies of past and currently operating development programmes. The bulk of these studies rely primarily on data and information collected from within the programme itself. For instance, they ask the teacher how many students he has taught and report the answer. But if parents were asked whether their children are being taught, the answer could very well be different. The 2006 ASER-Pratham survey of rural schools found that 47 per cent of children in Std V could not read a Std II text.

Another study that asked the consumers was conducted by the Planning Commission itself and dealt with the targeted PDS. Initial work commenced in early 2001; the study, which covered 3,600 households, was finally completed in early 2005. The report finds that of every rupee the government spends on the TPDS, only 27 paise reach the poor. Strangely, such evaluations are conducted rarely and irregularly, are of inadequate quality and, of course, take a long time to complete.

When programmes are spread over a large area, a representative survey needs to be spread over an equally large area. And if we want greater detail about relative progress in different locations, the sample sizes also need to be larger. Such evaluations typically require samples of thousands of households spread across the country. Many mistakenly believe that this is why it takes months to get results. The real reason for the delay, however, is typically poor planning and logistics, the inability to use new technologies, and unfocused design of the evaluation exercise.
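To see why such evaluations run into thousands of households, a rough back-of-the-envelope sketch helps. The snippet below is purely illustrative and not from the article: it uses the standard sample-size formula for estimating a proportion, and assumes, only for the sake of example, that separate estimates are wanted for 30 states.

```python
import math

def sample_size(margin_of_error, p=0.5, z=1.96):
    # Households needed to estimate a proportion (e.g. the share of
    # beneficiaries actually receiving a service) within +/- margin_of_error
    # at 95 per cent confidence, assuming simple random sampling.
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

# One national estimate, accurate to +/- 3 percentage points:
national = sample_size(0.03)                 # about 1,100 households

# Separate state-level estimates at the same precision for 30 states:
total_for_states = 30 * sample_size(0.03)    # about 32,000 households

print(national, total_for_states)
```

The point is simply that the sample grows roughly in proportion to the number of locations for which disaggregated results are wanted, which is why national evaluations need thousands of households; that, by itself, is no reason for results to take years.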

This means that by the time we become aware of the problem, the programme is not only many years old, its allocations have also been determined for the year after. What should have been a year-long experiment will continue to be implemented for at least three years before any changes are made.

As long as such programmes were a small part of overall expenditure, the wastage was not a big deal; but with social sector programmes becoming astronomical expenditure items, the evaluation regime has to move into a completely new domain. The new motto of inclusive growth for the 11th Plan will be realised only if the implementation of such programmes is in line with their stated objectives. It is not enough that the outlays and the number of programmes be increased; the situation on the ground has to change. And that means we need ongoing user evaluation of such schemes, to find the gaps and plug the leakages rapidly.

The recent Economic Survey reports that a CAG performance audit of the Sarva Shiksha Abhiyan revealed that even after four years of implementation, and utilisation of almost 86 per cent of the funds available with implementing agencies, the revised target of enrolling all children in school was not achieved. The report finds that as many as 1.36 crore children in the age group of 6-14 years remained out of school, and it lists delays in the release of funds. We need not have waited till 2006 for this; all of it was apparent as far back as 2002. Except that there was no comprehensive evaluation that could have identified, right from its inception, the various points at which the SSA was failing.

Even if we assume that as much as 50 per cent of the social sector outlay reaches its intended beneficiaries (the actual figure is likely to be 25 per cent), we would have wasted about Rs 30,000 crore last year and will waste another Rs 40,000 crore next year. An evaluation exercise can help pinpoint not only the amount of wastage, but also the points in the supply chain where failures and leakages are occurring, the geographic locations where they are most prominent, and so on.
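For the record, these round figures follow directly from the outlay numbers cited at the start of the article; here is a minimal arithmetic check, with the 50 per cent leakage rate being the article's assumption rather than a measured figure.

```python
# Rough check of the waste figures, using the social services outlays
# cited earlier and the assumed 50 per cent leakage rate.
outlay_last_year = 59_143   # Rs crore, 2006-07 revised estimate
outlay_next_year = 80_315   # Rs crore, budgeted outlay

leakage_rate = 0.5
print(leakage_rate * outlay_last_year)   # ~29,600, i.e. about Rs 30,000 crore
print(leakage_rate * outlay_next_year)   # ~40,200, i.e. about Rs 40,000 crore
```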

Unfortunately, despite the emphasis on the need to improve effectiveness, the budget this year mentions only one scheme with regard to monitoring and evaluation, that of the Public Distribution System, on which Rs 25,696 crore is being spent as food subsidy. But perhaps there is more in the background than is being stated in the budget documents. Perhaps various ministries are putting in place evaluation mechanisms with the following characteristics: universal coverage of all programmes; reliance on information from the intended beneficiaries; regular conduct; and the ability to identify gaps and leakages within a few weeks. One can dream.

The writer heads Indicus Analytics
