The National Statistics System: Our Challenge

Presented by Statistics South Africa

03-04-2002

Lord Charles Hotel: Somerset West

Cape Town

Pali Lehohla

Statistics South Africa

The National Statistics System: Our Challenge

Abstract

That the implementation of a national statistics system to meet development information needs in South Africa is long overdue is beyond debate. What could be at the centre of debate are the options for, and the pace of, deliberately developing such a system. Worldwide, it is observed that first-world countries possess, and continue to invest in, well-developed information and statistics systems. This statement does not suggest causality but observes a profound pattern. The paper attempts to outline what is required for South Africa to meet its development information needs within a globalising economy and an informatising society, and asserts that this can be achieved by developing a national statistics system. Drawing from international best practice and experience, the paper identifies four general patterns of statistical development adopted by various countries. It observes that the development of statistical infrastructure is a long and arduous process, and notes that the sustained use of statistics and the credibility of the system depend on the quality and timeliness of the products delivered by the system. Focusing on South Africa, the paper offers a critical analysis of how the statistics system in South Africa stalled over time, points out the major challenges, and populates a "to do" agenda for South Africa to be part of the information age.

1. Background

Governments require data and information for planning, decision-making and the monitoring of social and economic development and change. Different sets of requisite information, e.g. quantitative and qualitative, are used for this purpose, and different data types require different methods of collection. For instance, through a snapshot household survey, quantitative data can be assembled on the living conditions of citizens; alternatively, through the continuous compilation of administrative records, an assessment of living conditions or access to facilities can also be made. The second type of data is qualitative (tending to explain an underlying phenomenon), and this requires a different method of compilation and interpretation; it involves feelings and perceptions about issues. The third consists of studying existing records and documents such as records of proceedings, project documents and minutes. Information architecture and electronic data management systems enable technocrats to support politicians in decision-making by ploughing through this maze of information. Project KITE in South Africa aims at achieving this goal.

South Africa, in the post-apartheid era, recognises the need for a planning cycle and a framework for managing strategic policy priorities. This is captured and demonstrated in the Reconstruction and Development Programme (RDP), the Medium Term Expenditure Framework (MTEF), the creation of clusters, five in all, and the Medium Term Budget Policy Statement (MTBPS). It is admitted that had a more rigorous approach to planning been adopted, the last six years would have witnessed considerably more progress. Subsequently the President instructed the Management Committee (MANCO) of the Forum of South African Directors-General (FOSAD) to develop a planning cycle underpinned by a strategic framework. This would strengthen the co-ordination and alignment of plans, strategies, budgets, monitoring and reporting; in short, it would enable better management of service delivery. In its Lekgotla of 2000, Cabinet instructed departments to create an information system and to present progress at the next Cabinet Lekgotla scheduled for January 2001. Statistics South Africa was mandated to co-ordinate this effort.

The South African government, like many others, engages through a range of departments and instruments in detailed data-gathering and information-collection processes of one form or another, in an attempt to inform its development policies and programmes.

2. Organisation of the paper

This paper is organised into three parts. The first part focuses on country experiences, the second relates to the South African situation and the third part puts across a programme of action for the establishment of a national statistics system.

3. What is a national statistics system?

“A national statistics system is a system that has a coherent body of data. It consists of users, producers and suppliers of data and information. It aims to ensure continuous co-ordination and co-operation among producers and users of official statistics in order to advance standardisation, quality, consistency, comparability and use of evidence as the basis for policy choices and decision making, and avoid unnecessary and costly duplication.”[1]

A National Statistics System “is a coordinating framework within which the required information for development in the form of indicators are generated. Outputs of the National Statistics System would be indicators and databases within the context of a Management System of Statistical Information. … A National Statistics System is a partnership between those responsible for policy formulation and those responsible for policy implementation so that the latter know precisely what the former wish to achieve, and thereby facilitate production of relevant information to reinforce the planning cycle.”[2]

Organisation of statistics systems

Statistics systems are either centralised or decentralised. In a centralised system, there is a single authority with a legal mandate for collecting statistics and this institution collaborates with others by formal arrangements on the collation of statistics. The Handbook of Statistical Organisation defines such organisation as follows: "A system…of one department within the government to organise and operate a scheme of co-ordinated social and economic statistics pertaining to the whole country. This department collects, compiles and publishes statistical information …and, in addition collaborates with other departments of government in the compilation of administrative and specialised statistics."[3]

Examples of countries with a centralised system are Australia, Canada and South Africa. Those known to have a decentralised system are Japan, the United States and the United Kingdom (UK), although the latter has recently been moving towards a more centralised system.

4. What are official statistics?

Official statistics are defined by the United Nations Statistical Commission as “an indispensable element in the information system of a democratic society, serving the Government, the economy and the public with data about the economic, demographic, social and environmental situation. To this end, official statistics that meet the test of practical utility are to be compiled and made available on an impartial basis by official statistical agencies to honour citizens’ entitlement to public information.”[4]

The White Paper on official statistics in the UK defines them as those statistics that "are collected by government to inform debate, decision making and research both within government and by the wider community. They provide an objective perspective of the changes taking place in national life and allow comparisons between periods of time and geographical areas."[5]

“Open access to official statistics provides the citizen with more than a picture of society.  It offers a window on the work and performance of government itself, showing the scale of government activity in every area of public policy and allowing the impact of public policies and actions to be assessed. Reliable social and economic statistics are fundamental to ...open government (and) it is the responsibility of government to provide them and to maintain public confidence in them.”[6]

An important aspect of official statistics is the trust that participants accord the system. To retain trust, the agency should decide, on the basis of professional considerations, scientific principles and professional ethics, on the methods and procedures to be followed for the collection, processing, storage and presentation of statistical data.

5. Statistical development: an international perspective

In this part of the paper we explore statistical developments in a number of countries, in particular in the Americas, Oceania, Europe and Africa. Finally we focus on South Africa.

It must be noted that the evolution of official statistics in each country is mainly a product of its history. A recurring feature across the country studies is the dominance of one of four patterns in the development of their statistics. Firstly, we observe a pattern in a group of countries where statistical development has generally evolved over time without any direct intervention by the political leadership. In these countries the statistics systems succeeded or declined over time. Success stories of statistics systems where this pattern was predominant are Australia, Sweden and Canada. A decline in the statistics system has been observed largely in countries on the African continent. The UK has also had a tumultuous era in the development of its statistics system, precipitated during the Thatcher years. A second pattern is that of countries where statistical development had advanced to impressive levels but collapsed dramatically towards the end of the 1980s. In this category are the Eastern Bloc countries and China, which through their centralised planning approach developed a sophisticated battery of statistics systems for centralised economies but faced an unprecedented onslaught with the demise of communism and the emergence of the market economy. The third pattern of statistical development features in most third-world countries. It came about with the advocacy for writing off debt in respect of the highly indebted poor countries. With the movement towards debt relief, statistical organisations are beginning to adopt strategic plans as the means of managing their affairs. In these countries we observe the political leadership taking the lead in addressing issues of information and statistics, and in this regard PARIS21 continues to play a positive catalytic role. Zambia, Mozambique, Uganda and Malawi are some of the countries that have adopted this approach. The fourth pattern of statistical development consists of a situation where countries, without addressing matters of debt relief, discover that it cannot be business as usual: democracy, informatisation and globalisation demand that the political leadership adopt strategic plans and follow them through. As a consequence the leadership puts statistics at the top of the agenda. South Africa, Korea and some of the Latin American countries fall into this category.

6.1 Chile

The 1990s witnessed relentless efforts towards reform in government systems, and since 1995 the Government of Chile has been committed to a process of state modernisation. The management of the National Institute for Statistics (INE), like that of the Australian Bureau of Statistics as we will see later, responded to the historic importance of this initiative.

An important initiative was recognising that for INE to succeed in its modernising programme, it was essential to ensure the full commitment and involvement of staff. As a result, the staff of the institute was brought in to form joint task teams carrying out an organizational diagnosis. These teams were supported by the specialist work of a team of professionals assigned this purpose in 1998. The diagnostic exercise focused on the assessment of critical areas of management, namely: Human Resources Quality, Management Styles and Strategic Planning.

In addressing these areas, INE ascertained that, if it was to achieve quality statistics, the essential skills for driving statistical work had to be acquired, mastered and assimilated in the Institute. This was achieved by instilling a vocation for statistical work at INE.

A key management area for analysis was Human Resources Quality. The main effort here went into ascertaining the degree to which three essential skills, regarded as the critical ones, were being addressed. The first concerned the quality of human resources, ensuring that the requisite skills had been assimilated and mastered in the Institute. The second was responsibility and commitment towards society, which in the main emphasises the importance of statistical work as an aspect that needs to be nurtured. The third was vocation for public service, which emphasises user orientation in the production of statistics. As a result of addressing this area of work, a common vision between management and staff was forged.

Management Style was the second critical area analysed within INE. As regards this aspect, the diagnosis pointed to a practice that focused on procedures and the oversight of tasks and activities rather than on processes and empowerment. A new management style was implemented which emphasised participation. The third critical management area analysed was Strategic Planning, and the diagnosis indicated a complete paucity of the modern tools required for planning. The new design brought about a flatter structure which applied "values of quality, transparency, teamwork and respect for individuals in each and every project undertaken. It is a style that values creativity, responsibility, innovation and criticism in a context where users and their requirements are accorded the highest priority."[7]

The changes in structure were followed through with changes in budgeting procedures and the adoption of new technological platforms, such as the incorporation of the intranet in the work process and the extension of the facility to the regional offices. The benefits have been notable improvements in the quality of decisions being made; by automating processes, the quality of data from the field has also improved dramatically.

6.2 Mozambique

In Mozambique, the National Statistics Institute (INE) was created by Presidential Decree 9/96 of August 28th. INE is the central executive body of the National Statistics System (SEN) and is charged with the production and publication of the country's official statistics. The INE is an autonomous institution that reports to the Council of Ministers. The SEN's constituent bodies include the Senior Statistics Board (CSE), the General Population and Housing Census Co-ordination Board, the National Statistics Institute (INE) and the Bank of Mozambique.

The creation of the SEN and the INE came as a sequel to the economic, social and political transformations that started, in particular, with the introduction of the Economic and Social Rehabilitation Plan. The structure, functions and results of official statistics activity came as a direct response to the new era of multi-party democracy, peace and market economy that emerged in Mozambique.

Within this short space of time INE conducted the 2nd General Population and Housing Census in 1997, the National Household Survey in 1996-97 and the Demographic and Health Survey in 1997, began production of the Consumer Price Index, and re-launched general and specialised statistical publications.

It is observed that INE increased its human resources, recruiting technical personnel with university degrees and promoting several short- and long-term training courses in order to meet its new challenges. The status of INE was administratively promoted to a deputy minister position, similar to the situation in Korea.

A recurring theme in country experiences is the training of staff to meet emerging challenges. INE carries out a prospective appraisal of the human resource requirements for the National Statistics System, placing particular emphasis on recruiting and retaining better-qualified and specialised staff. It promotes the progressive creation of working conditions – organisational, functional and operational – that will help recruit and select personnel, and promotes co-operation with other divisions in the development and implementation of training programmes in priority areas.

As regards infrastructure, INE promotes the progressive establishment of conditions that will modernise the NSS and this is done through a strategic information systems development plan.

Furthermore, INE aims to increase its geographic decentralisation (deconcentration) by adequately equipping the Provincial Delegations. INE intends to set up information-use committees within the divisions of the appropriate central services and to carry out studies and hold seminars concerning user requirements.

6.3 Zambia

To date the Zambian statistical system is thirty-eight years old. It came into being by an Act of Parliament, Cap 425 of the Laws of Zambia, in 1964. Under this Act the Director of the Central Statistics Office (CSO) is mandated to conduct all censuses and surveys and to organise a co-ordinated scheme of social and economic statistics relating to Zambia. CSO, like many other such institutes, is a corporate and legal body. It is also an autonomous body except for matters concerning human resources. Under the current set-up, CSO falls under the Ministry of Finance and Economic Development (MOFED), within the Budget and Economic Affairs Division.

Statistical development in Zambia followed a centralised trajectory which was consolidated in 1970, when the unified statistical information system came into being. This process involved the secondment of CSO staff to various line ministries. A similar approach was followed in Malawi until budgetary constraints undermined it, leading to its abandonment. Currently Malawi is reviving its system, as captured in the 2002-2006 Strategic Plan for statistics in Malawi.

The current status of the statistical system in Zambia consists of the CSO, the statistical units in the line Ministries and parastatals, the University Research Institutes, the Bank of Zambia and the Statistical units in the private sector organizations and NGOs.

In their recent user-producer meeting the delegates concluded as follows:

• "A Standing committee of CSO, BOZ and MOFED should be in place to work on macro-economic data required for developmental policies.

• Data analysis should be undertaken by the users according to their needs but in cases of low capacity there should be collaboration with the producers to carry out necessary analysis. CSO should give leadership in this task.

• There is need for relevant Institutions to meet regularly to discuss their data needs as well as means of collection and analysis. Committees could be established made up of all stakeholders and these committees should be led by the CSO. This should lead to the establishment of better and sustainable dialogue between users and producers.

• There should be coordination between the producers of data and donors in funding the establishment of databases into which all sectors could tap for policy formulation and monitoring.

• There is need to integrate both the methods and data in the measurement of Poverty. In this regard, Quantitative and Qualitative approaches should be examined for Integration.

• Data should be made more accessible by "going beyond talking to like-minds" through the presentation of complex data in a simple manner to the ordinary Zambian.

• Need to re-establish a coordinated national statistical system.

• Need to re-establish the System of Administrative Statistics (SAS) by re-building statistical teams in the ministries with close links with the CSO.

• Establish an Inventory Library at CSO as well as an electronic database containing abstracts of what is available (and their data sets).

• Necessary to develop capacity building programmes in the areas of data analysis, dissemination, coordination, and data management for producers and users of data and top policy makers.

• There is need to revise the current statistical Act of 1964 so as to provide an appropriate legal framework.

• It is recommended that a strategic plan to implement a statistical framework be developed.

• CSO and MOFED should initiate the establishment of a National Statistical Coordinating Committee.

• It is recommended that Poverty desks or focal point officers be established in all Institutions.

• There is need for evidence-based policies that are technically sound, growth oriented, poverty reduction focused and politically achievable and implementable.

• A national forum on poverty should be held.

• There is the need to strictly monitor the expenditure on Poverty Reduction Programmes of the debt relief fund. To ensure transparency of the monitoring system, the Private Sector and Civil Society should be encouraged to participate in the monitoring.

• Carry out a comprehensive statistical needs assessment exercise."[8]

6.4 Malawi

The National Statistics Office of Malawi is directly responsible for all statistics produced by government ("official statistics"). Despite good co-operation and collaboration amongst those involved in the production process, this is largely ad hoc, and mechanisms for preventing overlaps, gaps or incompatibilities in the provision of statistics need to be formalised and implemented. Although the 1967 Statistics Act determines the legal responsibilities of the Commissioner of Census and Statistics, its provisions are outdated and largely not enforced.

Part of the reform process for statistics in Malawi involves legislative reform, defining the roles and responsibilities of the National Statistics Committee, compiling an inventory of the statistics that are required, developing and publishing a framework for national statistics, co-ordinating standards and classifications, promoting the use of statistics and, furthermore, promoting training and publicising the training programme.

It is further believed that the NSO needs to become a semi-autonomous agency, governed by a revised National Statistical Act and accountable to a Management Board. This suggests an organisational pattern closer to that of Uganda, where the statistics office is managed by a board of directors.

6.5 Australia

Australia is one of the top three countries noted for their good statistics systems; at one point the outgoing Commonwealth Statistician went on to reform the statistics system of the UK. The Australian experience was initially dominated by its colonial history, beginning in 1787, when the UK, as colonial master, demanded regular reports on the number of people and animals. The colonies obliged, often by conducting "musters" to count the people and the animals, and thus began the centralised collection of data. However, the Governors of the colonies gradually began to use the data to help with the management of the colonies, and this became particularly important from 1855 when self-government was granted. Fifty years later, in 1905, the Census and Statistics Act created a Federal Bureau charged predominantly with conducting population and agriculture censuses and with coordinating issues with the State Bureaux. There followed a rapid expansion of the system into areas of macro-economic statistics after the Great Depression.

It took until 1970 for the Australian Government to establish an independent central statistical authority, known as the Australian Bureau of Statistics (ABS). This was to be a user-oriented institution that would serve the governments and the community as a whole. Key features of this institution would be policy neutrality and an appropriate balance between different fields of economic and social statistics.

Government reforms aimed at improving public sector management spurred another major development in the ABS: it developed and implemented strategic planning and an overall corporate management approach. Simultaneously it adopted matrix management, with statistical and services program managers in the Canberra Office being responsible for planning and strategic guidance for all activities in their programs across all offices, while day-to-day operational responsibilities lay with the senior managers in each State office. With these changes, for the first time the integrated statistical service operated as a whole rather than as the sum of its parts. It is said that the ABS's success rests in part on this style of management.

To retain a competitive edge, the ABS acknowledged that its success could not be judged by its outputs alone, but rather by how well those products are used by government and the community at large. The ABS has therefore had to market its services extensively.

It should be pointed out that a considerable amount of statistical work in Australia is handled by Federal and State Government departments, although in general the ABS features prominently across most of this activity spectrum. For instance, the Australian Institute of Health and Welfare has a significant statistical role: the Australian Statistician sits on its board of management, and under law the ABS must agree to its undertaking the collection of data.

Another critical aspect that has made the ABS a successful institution is the long-term investment it has made in recruiting, training and developing staff. Although there are many facets to this, only three are worth mentioning here. First, the ABS employs promising young undergraduates and pays them while they complete their degrees before commencing work as full-time employees; this has been a successful internship programme for long-term tenure. Second, the ABS spends about 8% of its budget per annum on staff development, and this has paid dividends in terms of having a stream of knowledgeable statisticians and managers available for deployment. Third, by adopting a policy of staff rotation, the ABS ensures that all senior staff acquire a broad range of work experience, across various subject-matter and service areas and in regional and central offices.

What is important to note is that the development of an effective statistics system is not a simple or straightforward process. In the Australian case, it has taken over 200 years to evolve. South Africa does not have 200 years.

6.7 The United Kingdom

As part of the developed world, the UK experience is an interesting one. All power was centralised in Whitehall, but the statistical system itself was decentralised, with each department having a separate and independent statistical unit. This legacy started in 1832. The arrangement was to last for approximately a century, until the depression that preceded the war and the post-World War II reconstruction necessitated the compilation of national accounts, as macro-economic statistics became important for governing the country. This additional role actually enhanced the co-ordination function.

It took yet another thirty years, until 1970, to realise that there were significant efficiency and respondent-management gains to be exploited by bringing the collection of data from businesses and households under the same governing body, albeit in two separate units. The Business Statistics Office (BSO) and the Office for Population Censuses and Surveys (OPCS) came into being. But the implementation was such that these were not located at the centre: while the Central Statistical Office was given the overall responsibility, the decentralised statistical structure remained.

As was the case in Australia, in 1970 there was a rapid expansion of the statistical system to match the higher profile of statistics in government decision-making. However, in the Thatcher years the gains were pushed back, with cuts in staff, and the only statistics to be collected were those that met the needs of government alone. It was not long before quality issues came to the fore, both within and outside government. By the close of the decade, twenty years on from 1970, the office was expanded to include responsibility for the collection of business statistics (including managerial responsibility for the BSO), for the compilation of trade and financial statistics, and for the retail prices index and the family expenditure survey. The CSO was launched as an Executive Agency and was empowered to run extra collections and to increase the sample sizes of many others. The result was a dramatic and significant improvement in the reliability of the UK's macro-economic statistics.

Five years later, in 1995, the CSO was to merge with the Office for Population Censuses and Surveys (OPCS) to form a new "Executive" Agency that would be known as the Office for National Statistics (ONS). The merger brought together, in a logical way, functions whose integration was long overdue. As part of the logic, the Employment Department was abolished and responsibility for labour market statistics was transferred to the CSO. The ONS has now assumed responsibility for the economic and household-based statistics produced in the UK statistical system. Its overall employment stands at 2,500 staff out of about 5,000 in the Government Statistical Service. With this approach the following was achieved:

• “greater integration between social and economic statistics;

• improved access to official statistics, especially through the database of key statistics;

• combining the advantages of a decentralised system with a strong and independent co-ordinating agency; and

• providing benefits for all users, both government and non-government.”[9]

As the century drew to a close, further advocacy work continued in the statistics system of the UK, drawing even more on political leadership. The Labour Government elected to power in 1997 had on its agenda the establishment of an independent statistical service. As a step towards this, in early 1998 it issued a Green Paper, "Statistics: A Matter of Trust", and sought the views of the public on the matter. The outcome of this process was published in a White Paper, "Building trust in statistics". It is said that the White Paper did not, however, go far enough to revolutionise the UK statistics system; the opportunity for a quantum leap was lost and parts of the influential decentralised components survived intact. However, a new post of National Statistician, responsible for national statistics, was established, with responsibilities as Director of the ONS and head of the Government Statistical Service. Furthermore, a Statistical Commission was established to advise the Government on issues of statistical integrity and quality assurance, as well as to comment on the programme of work for national statistics. The Commission reports to Parliament annually, and Parliament will have some direct involvement in overseeing national statistics. The trajectory in the UK has been much more difficult than that of Australia, and in both cases development took place over an extensive period of time.

6.8 Canada

Canada is one of the top four agencies renowned for best practice in the statistical world, with an extensive international development programme. Under the Statistics Act, Statistics Canada is required to "collect, compile, analyse, abstract and publish statistical information relating to the commercial, industrial, financial, social, economic and general activities and conditions of the people of Canada." Statistics Canada prides itself on two main objectives. The first is to provide statistical information and analysis about Canada's economic and social structure with which public policies and programs can be developed and evaluated, thereby improving public and private decision-making for the benefit of all Canadians.

The second is to promote sound statistical standards and practices by using common concepts and classifications to provide better-quality data. Through working with the provinces and territories, Statistics Canada aims to achieve greater efficiency in data collection and less duplication. Furthermore, it aims to reduce the burden on respondents through greater use of data-sharing agreements (sources used include annual tax records, monthly employee payroll records and customs records) and to improve statistical methods and systems through joint research studies and projects. In addition to running about 350 active surveys per year, Statistics Canada brings statistics to life through a variety of innovative programmes in schools, amongst many other methods.

Furthermore, Statistics Canada operates within the context of portals or clusters rather than through government departments. The challenge for the agency is how to structure the design of these portals so that they interface seamlessly with the data holdings of Statistics Canada.

On the issue of methods and standards, the Chief Statistician of Canada had this to say:

“Recently the news media have provided increasing coverage of Statistics Canada's low income cut-offs and their relationship to the measurement of poverty. At the heart of the debate is the use of the low income cut-offs as poverty lines, even though Statistics Canada has clearly stated, since their publication began over 25 years ago, that they are not. The high profile recently given to this issue has presented Statistics Canada with a welcome opportunity to restate its position on these issues. Many individuals and organizations both in Canada and abroad understandably want to know how many people and families live in "poverty", and how these levels change. Reflecting this need, different groups have at different times developed various measures which purported to divide the population into those who were poor and those who were not. In spite of these efforts, there is still no internationally-accepted definition of poverty - unlike measures such as employment, unemployment, gross domestic product, consumer prices, international trade and so on. This is not surprising, perhaps, given the absence of an international consensus on what poverty is and how it should be measured. Such consensus preceded the development of all other international standards. The lack of an internationally-accepted definition has also reflected indecision as to whether an international standard definition should allow comparisons of well-being across countries compared to some international norm, or whether poverty lines should be established according to the norms within each country.

The proposed poverty lines have included, among others, relative measures (you are poor if your means are small compared to others in your population) and absolute measures (you are poor if you lack the means to buy a specified basket of goods and services designated as essential).

Both approaches involve judgmental and, hence, ultimately arbitrary choices. In the case of the relative approach, the fundamental decision is what fraction of the overall average or median income constitutes poverty. Is it one-half, one-third, or some other proportion? In the case of the absolute approach, the number of individual judgements required to arrive at a poverty line is far larger. Before anyone can calculate the minimum income needed to purchase the "necessities" of life, they must decide what constitutes a "necessity" in food, clothing, shelter and a multitude of other purchases, from transportation to reading material.

The underlying difficulty is due to the fact that poverty is intrinsically a question of social consensus, at a given point in time and in the context of a given country. Someone acceptably well off in terms of the standards in a developing country might well be considered desperately poor in Canada. And even within the same country, the outlook changes over time. A standard of living considered as acceptable in the previous century might well be viewed with abhorrence today.

It is through the political process that democratic societies achieve social consensus in domains that are intrinsically judgmental. The exercise of such value judgements is certainly not the proper role of Canada's national statistical agency which prides itself on its objectivity, and whose credibility depends on the exercise of that objectivity.

In Canada, the Federal/Provincial/Territorial Working Group on Social Development Research and Information was established to create a method of defining and measuring poverty. This group, created by Human Resources Development Canada and social services ministers in the various jurisdictions, has proposed a preliminary market basket measure of poverty - a basket of market-priced goods and services. The poverty line would be based on the income needed to purchase the items in the basket. Once governments establish a definition, Statistics Canada will endeavour to estimate the number of people who are poor according to that definition. Certainly that is a task in line with its mandate and its objective approach. In the meantime, Statistics Canada does not and cannot measure the level of "poverty" in Canada.

For many years, Statistics Canada has published a set of measures called the low income cut-offs. We regularly and consistently emphasize that these are quite different from measures of poverty. They reflect a well-defined methodology which identifies those who are substantially worse off than the average. Of course, being significantly worse off than the average does not necessarily mean that one is poor.

Nevertheless, in the absence of an accepted definition of poverty, these statistics have been used by many analysts to study the characteristics of the relatively worst off families in Canada. These measures have enabled us to report important trends, such as the changing composition of this group over time. For example, 20 to 30 years ago the elderly were by far the largest group within the "low income" category, while more recently lone-parent families headed by women have grown in significance.

Many people both inside and outside government have found these and other insights to be useful. As a result, when Statistics Canada carried out a wide-ranging public consultation a decade ago, we were almost unanimously urged to continue to publish our low income analyses. Furthermore, in the absence of a generally accepted alternative methodology, the majority of those consulted urged us to continue to use our present definitions. In the absence of politically-sanctioned social consensus on who should be regarded as "poor", some people and groups have been using the Statistics Canada low-income lines as a de facto definition of poverty. As long as that represents their own considered opinion of how poverty should be defined in Canada, we have no quarrel with them: all of us are free to have our own views. But they certainly do not represent Statistics Canada's views about how poverty should be defined.”[10] Italics and bold are the author’s.
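The distinction the Chief Statistician draws between relative and absolute measures can be made concrete with a small numerical sketch. The sketch below is purely illustrative: the incomes, the fraction of the median and the basket cost are assumptions chosen for the example, not Statistics Canada parameters.

    # Illustrative sketch only: the income figures, the fraction of the median
    # and the basket cost are assumed values, not Statistics Canada parameters.
    from statistics import median

    incomes = [4_000, 9_500, 12_000, 18_000, 25_000, 32_000, 48_000, 75_000]

    # Relative approach: poor if income falls below a chosen fraction of the median.
    fraction = 0.5                      # judgmental choice: half the median
    relative_line = fraction * median(incomes)

    # Absolute approach: poor if income cannot buy a designated basket of essentials.
    basket_cost = 11_000                # judgmental choice: a priced basket of "necessities"

    relative_poor = sum(income < relative_line for income in incomes)
    absolute_poor = sum(income < basket_cost for income in incomes)

    print(f"Relative line ({fraction:.0%} of median): {relative_line:,.0f}; poor = {relative_poor}")
    print(f"Absolute line (basket cost): {basket_cost:,.0f}; poor = {absolute_poor}")

Either line is only as meaningful as the judgments behind it, which is precisely the point made above: deciding who counts as poor is a matter of social consensus rather than of statistical technique.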

6.9 Korea

The National Statistics Office of Korea was established by Act in 1948, when the Korean government was first established. The office, then called the Bureau of Statistics, consisted of one officer and four divisions: General Affairs, Planning, Population Census and Vital Statistics. On 13 December 1948, the first Population Census was declared by the 39th presidential decree; this was the first administrative order the Korean government ever made in relation to statistical policy. After the Korean War, the Constitution was reviewed in line with the introduction of the market economic system. This implied far-reaching changes: the Bureau of Statistics was transferred to the Ministry of Home Affairs in 1955 and, after further re-organisation, was reconstituted into three divisions of Planning, Population Census and Vital Statistics; General Affairs had disappeared. Six years later, in 1961, the Economic Planning Board (EPB) was launched and the Bureau of Statistics was relocated from the Ministry of Home Affairs to the EPB. An additional branch, the Division of Survey and Analysis, was added in 1962. By the end of 1963 the Bureau of Statistics had been renamed the National Bureau of Statistics (NBOS). At the same time, the NBOS – now an external bureau of the EPB – was restructured into the four divisions of Statistical Planning, Statistical Standards, Population Statistics and Economic Statistics. As a consequence of this restructuring and its location in the EPB, the NBOS expanded several times. During this period, the NBOS played a vital role in providing the basic data required for formulating and evaluating economic development policies.

Upon recognising the increasing importance of producing fundamental statistics and coordinating national statistical services, the National Bureau of Statistics was again increased in size and administratively promoted to the status of a vice-minister, as is the case in Mozambique. It was consequently renamed the National Statistical Office (NSO) in December 1990. The NSO now included 3 bureaus, 14 divisions, 11 statistical offices and 15 local branch offices.

To address training needs, the Statistical Training Center was set up in September 1991 for the purpose of producing professional statistical experts. In February 1995, two divisions were established, namely Statistical Information and International Statistics, aimed at improving the quality of statistical information services and fulfilling ever-increasing demands for international statistical services.

In September 1996, the position of the Statistical Examiner was created, and the Division of Population Statistics was expanded into 2 Divisions of Population Census and Vital Statistics. In addition, the Division of Statistical Research was also created.

Currently the NSO has 4 bureaux, 19 divisions, 12 statistical offices and 35 local branch offices and 1 training centre.

Work on social indicators in Korea was initiated thirty years ago, in 1972. As a result of this process Korea had, by 1978, three hundred and fifty indicators across eight areas of social concern. By 1987, in response to changing economic conditions, Korea had extended its indicator profile to 468, covering nine areas of social concern. As a result of continuous improvements Korea now maintains 553 indicators covering 13 areas of social concern, and is at the forefront of measuring information and communications technology (ICT) as part of the indicator package.

6.10 South Africa

“The world-wide movement to transform the way government goes about its business has greatly influenced the Government of South Africa. The government has sought to place at the forefront of governance accountability to the citizenry of the country, effective and efficient management; delivery of services and “business” principles of business planning and goal attainment as the way it runs South Africa.”[11]

6.10.1 Background

To understand the trajectory of the development of statistical systems in South Africa, this background is appropriate. The development of the statistical system in South Africa has often been rooted in the underlying political system. McLennan confirms this observation by suggesting that "the evolution of official statistics in each country is mainly a product of its history."[12] Although South Africa has had one of the longest formal histories of statistical development on the African continent, its path has been largely a sorry one. In general, the system has been undernourished, characterised by extreme and rapidly growing fragmentation. This was particularly so in the period from 1970 to 1994, consistent with the implementation of grand apartheid. Prior to this period, the system reflected the racial tensions inherent in the formative years of apartheid. The post-liberation period has been marked by attempts to consolidate the system, but a lack of vision and strategy undermined attempts at operational effectiveness. This meeting, co-hosted by The Presidency, Statistics South Africa and PARIS21, allows for strategic engagement on statistical development. It is the first deliberate and strategic attempt at creating an NSS to inform a democratic dispensation.

6.10.2 Fragmentation of the system under apartheid

From 1970 to 1994 the system had been fragmented and was dominated by six distinct forces. Each of these had its sphere of geographic and/or thematic influence.

• The CSS predominantly focused on Whites as a population group and on the economy of the geographic component referred to as the erstwhile White South Africa. This left the economy of the Black population in the then South Africa relatively unknown, and it remains largely unexplored in South Africa as currently constituted.

• The HSRC, as a source of official statistical information, focused on the Black population, both in the homelands and in South Africa, but largely studied demographics as a theme.

• The third force was the DBSA, which looked at the financing of Black development and, as an information source, acted predominantly in the homelands. There was some conflict and professional jealousy between the erstwhile CSS and the DBSA, particularly over the report on the nine provinces issued in 1994/95.

• The fourth area was dominated by academia, largely active in demographics and population projections.

• The fifth force consisted of market researchers, with the Bureau of Market Research the predominant player, specialising in particular in income and expenditure surveys. They were also very active in the homelands.

• The sixth force consisted of the statistics offices of the homelands themselves, with varying levels of effectiveness. Their focus was largely to inform the sharing of the Customs Revenue Pool and the running of censuses.

The official body, the then CSS, focused on a minority of less than five million, as opposed to the forty-three million to whom Stats SA now has to deliver. Given this history of fragmentation, there is a need for a level-headed and deliberate process aimed at systems consolidation at the product, service, solution and institutional levels.

6.10.3 The RDP

In an effort to deal with the challenges outlined above, in 1994 Cabinet agreed that every programme and project in government should have a business plan. Such a business plan would stipulate what the government or department sought to attain, as well as the means of doing so. Included in the package were cost structures and costs, as well as the identification of beneficiaries. More importantly, the proposals had to define in concrete terms how the project or programme connected to the government's overall goals as set out in the Reconstruction and Development Programme (RDP) White Paper, the Constitution, and other legislative arrangements and priorities.

Quite clearly, complying with the Cabinet decision implied having a battery of information at different layers of government and geography and at different levels of programme, project and activity. The test for implementation was how government departments would respond to the first hundred days, which amongst others identified the following goals: access to medical care for all pregnant women, universal education for children of school-going age, a feeding scheme at primary schools, electrification of areas without electricity, and the provision of water and sanitation. Furthermore, the building of houses, with the aim of reaching a million within five years, was just getting under way.

The programme was ambitious and necessary. It was launched by the State President with the aim of having a high-profile impact, while noting that addressing the inequities of the past would take time. The programme marked the ushering in of a new government that also focused on more profound long-range plans that would come on line in subsequent years. The officials were quite eager to plan and implement but were confronted with several challenges.

6.10.4 Information and Data challenges

The first challenge for the South African government official was the identification of "scientifically" valid measures and the development of some socio-political and business consensus on their utility as acceptable measures of the phenomena in question. The second problem was the lack of information and data that could enable planning at sub-national levels, i.e. at the local level. Local economic plans lacked basic data, and this made planning very difficult to undertake beyond mere statements of intent. Being faced not only with an unsympathetic bureaucracy, but with one which had no vision of the information needs of society, made the work of the new political and bureaucratic incumbents unenviable. Fellegi, commenting on systems of measurement and indicators, notes that "It is through the political process that democratic societies achieve social consensus in indicator domains that are intrinsically judgmental."[13] Italics are the author's.

6.10.5 Limitations of attempts for meeting data and indicator needs

"Central to assessing the business plan and evaluating the extent to which the government in general, and departments and sub-units of departments in particular, attain their goals – simply whether they do what they are supposed to do – a system of accountability was proposed and agreed upon, which entails: the identification of measurable goals; the identification of measurement or indicators of success or failure in attainment of the goals; and some way of measuring the extent, prevalence, dimensions, growth and consequence of a particular problem such as poverty or ill health, unemployment, crime rate, etc."[14]

6.10.5.1 Training for technology instead

Early attempts to address not only indicators and their systems but, more importantly, rudimentary data needs focused on information technology instead of dealing with the substantive content issues that underpin systems of indicators. These issues would be, amongst others, the choice of indicators, what they are, how they should be compiled, what quality and methodological principles should be observed, the periodicity of their generation, institutional arrangements for their compilation, the infrastructure required, funding considerations, and the building of consensus on them. It therefore came as no surprise that this gallant effort by the officials in the RDP office never generated the indicator framework it was supposed to. A training programme whose initial cost estimates ran to about R9 million was undertaken by ESKOM for officials of government, and CASE tools were applied by King information consultants, who taught participants one-to-many and many-to-one relationships, how to develop relational databases, the importance of Joint Application Development (JAD), and object orientation. This training, while important in its own right for a different audience, indeed missed the point.
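For readers unfamiliar with the database terminology the training dwelt on, the sketch below illustrates a one-to-many (and, read in the other direction, many-to-one) relationship in a relational database. It is a minimal, hypothetical example: the table and column names are invented for illustration and bear no relation to the actual training material.

    # Minimal sketch of a one-to-many relationship, using Python's built-in sqlite3.
    # Table and column names are illustrative only.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # One province has many projects: the "many" side carries a foreign key
    # pointing back at the "one" side.
    cur.execute("CREATE TABLE province (id INTEGER PRIMARY KEY, name TEXT)")
    cur.execute("""CREATE TABLE project (
                       id INTEGER PRIMARY KEY,
                       province_id INTEGER REFERENCES province(id),
                       name TEXT)""")

    cur.execute("INSERT INTO province VALUES (1, 'North West')")
    cur.executemany("INSERT INTO project VALUES (?, ?, ?)",
                    [(1, 1, 'Water supply'), (2, 1, 'School feeding')])

    # Reading from the "many" side back to the "one" side is the many-to-one view.
    cur.execute("""SELECT province.name, project.name
                   FROM project JOIN province ON project.province_id = province.id""")
    print(cur.fetchall())

The point of the paragraph stands regardless of the mechanics: such training addresses how data are stored and related, not which indicators a government needs or how they should be compiled.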

6.10.5.2 Good idea bad timing

With the benefit of hindsight, one can argue that not only was this approach unwise, its timing was also wrong. The approach was unwise in that it did not start with what the data needs of government for development were; instead it focused on the electronic management of things that did not conceptually exist or had not been adequately discussed and understood. The implementation of the training was also fraught with timing difficulties. Staff who attended what largely appeared to be technology training were trained without any guarantee that they would be the ones occupying the planning and monitoring offices of the new government. At the time the training was implemented, restructuring had just begun in earnest. Uncertainty was running rampant, both amongst the old guard, who were seeing their way out and negotiating packages and golden handshakes, and amongst the new bureaucrats, who came in sure of a place but not certain about the area they would occupy. The statistics organisation, supposedly the main cradle of development information, was in bad shape and resistant to change, and did not even participate in the training.

6.10.5.3 Nomenclature: Talking past one another

The third level of problems was nomenclature and the use of terminology, which on the face of it may appear trivial; when challenged by major development issues of brick and mortar, medicines and food, terminology and language become the last thing to argue about. Mokaba notes that "most of the confusion in government language between what is a project, program or simply a task emanates from the multiplicity of levels at which each of the speakers may be operating from. This was clearly indicated by the confusion around the Katorus Presidential Lead Project or program. To the province and the local unit the Katorus operation was a R600 million program consisting of various elements of capacity building, infrastructure development, local government, and safety and security. But to the office of the president and to the minister without portfolio and surely for the president, Katorus was just a project, one of the projects that comprise the total urban redevelopment program."[15]

"Perhaps even more confusion about the empirical referent of the word program in the GNU is the use of the term in the budgetary process to refer to the health services of South Africa as a sectoral program and the Reconstruction and Development Program or the National Works Program."[16]

6.10.5.4 Subtle turf battles and procrastination

The fourth-level problem consisted of subtle battles over turf and posturing by the information producers, thereby precipitating fragmentation. While the new government noted that there was a serious paucity of information, there was no visible effort to prioritise statistics and so improve the situation; instead, private-sector initiatives and individual departmental efforts helplessly attempted to address these data and information requirements and needs. The Central Statistical Service, like the Human Sciences Research Council, for historical reasons found itself a passive observer in the scramble to provide information. Amongst the more active in the information arena was the CSIR, which supplied the data for what came to be popularly coined spatial development initiatives (SDIs), identifying corridors with development potential. Another major source of information was the SALDRU study, which provided socio-economic data, but at a much more macro level, while data requirements for planning, in the instance of the RDP, were often needed at a more local level. The DBSA at the time had produced an information document on the nine provinces and was playing an active role as an information supplier. The Central Statistical Service came rather late to the party and was a less welcome member because it was indeed intransigent and/or lacked a profound understanding and appreciation of the central role it had to play in a changing society. In direct competition with the DBSA, it produced provincial profiles. The stage was thoroughly muddied by bits of inadequate information and posturing in the information arena.

6.10.5.5 Resistance and lack of institutional and legal framework for implementing

The fifth problem was resistance from the old guard and those who sympathised with them. The sixth was the lack of an institutional and legal framework within which plans could be implemented. The Tender Board was often mentioned as the biggest stumbling block hampering delivery. Geographically, the settlements were still divided and, without seamless local government structures, implementation was to remain the biggest challenge, particularly for the priority areas identified by the RDP and the first 100 days.

6.10.5.6 Lack of statistical leadership and vision

The seventh and perhaps biggest problem that faced the statistics system in South Africa before democracy, and in the first six years of democratic existence, was the lack of a vision for a national statistics system. The focus in the six years of democracy was on operational effectiveness, to the exclusion of an overall strategic framework that answers the question "why official statistics?"

6.10.5.7 Lack of co-ordination of national priorities

The eighth problem that made it difficult to have an effective statistics system was the absence of the digestive capacity to interpret government policies and priorities and to appropriate them in a systematic way, where the sequencing of processes determines how priorities interlock in a self-supporting way. There is a sense that, with the demise of central planning in the Eastern Bloc, South Africa tended to skirt around the issue of co-ordination and probably got to the point of almost loathing it. Statistics, in general, respond to policy. The value chain of information starts with defining a purpose in life for a country, then locating and describing ways of implementation, and finally instituting measures of verification. It is in the terrain of verification that official statistics become important. Statistics are a rudder that guides policy formulation, monitors implementation, and measures as well as evaluates performance. They therefore serve the policy maker and those who implement policy as their clients. One key pitfall from the experience of the RDP Office was to subscribe to leanness as a virtue without assessing the capacity required for organising issues of state and governance. The lean-and-mean philosophy, while virtuous, needs to be contextualised. The current efforts by the cluster system to strengthen the centre of government have just begun to address issues of co-ordination and effectiveness at the level of strategy, and possibly to redress the limitations of the RDP office.

The arena of statistics, as it existed and continues to exist, consists of a system that produces what the office thought was best for government and the stakeholders. However, the office has recognised that it continued to inform at a tangent, lacked a system of indicators and missed its audience. South Africa has not been alone in this dilemma of wandering in the statistical wilderness, as the series of case studies illustrates, although it was also battling with problems of a unique nature that were very country-specific.

6.10.7 The watershed meeting on the statistics system (the North West axis)

In October 1994, the provincial government of the North West hosted a watershed meeting, the culmination of a series of meetings that the North West had held bilaterally with all nine provinces, and jointly with the then equivalent of FOSAD, on the future of national statistics in South Africa. The meeting was attended by all RDP offices, provincial and local government representatives, and national ministries.

Although the CSS was invited, it declined to participate, believing that its transformation process was on course and that the planning workshop was inadvisable. A key resolution of the meeting, communicated to both the CSS and the RDP office, was that the organisation needed serious surgery, including advertising the top position of head of the organisation. Census '96 was discussed extensively and implementation plans were examined. The conclusion was that the census was in jeopardy, and a recommendation was made for a mission to assess readiness. The mission was conducted and resolved that the census be postponed by six months and be headed by a different individual from the then incumbent.

The meeting also discussed the geography of the country and the need to re-organise the geo-political and economic space of South Africa. This could be viewed as the precursor to the Demarcation Board. On national accounts, the meeting resolved that the CSS had to be strengthened in order to take charge of both sides of the national accounts. This was in order to eliminate the moral hazard of a policy department, in this case the South African Reserve Bank (SARB), handling national accounts statistics, i.e., playing the role of both referee and player. On the CPI, the meeting resolved that there was a need to collect information for the compilation of a CPI for rural areas, so as to have a targeted handle on how rural communities are affected by price changes. It was further indicated that the CPI, as it was then collated, could not be representative and required revamping: it covered only 14 urban areas.

More fundamentally, the meeting resolved on the creation and funding of a national statistics system and on strengthening the training of cadets who would participate in the system. Lastly, the workshop appealed for a more representative Statistics Council. This workshop was amongst the first few to attempt a strategic focus on the provision of statistics in South Africa.
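As an aside, and not as part of the meeting's resolutions, the representativeness concern can be illustrated with the generic Laspeyres-type formula on which consumer price indices are conventionally based (the notation below is illustrative rather than Stats SA's own):

\[
I_t \;=\; 100 \times \frac{\sum_{i} p_{i,t}\, q_{i,0}}{\sum_{i} p_{i,0}\, q_{i,0}}
\]

where \(p_{i,t}\) is the price of item \(i\) observed in period \(t\) and \(q_{i,0}\) is the base-period quantity of that item in the basket. If the prices \(p_{i,t}\) are collected in only 14 urban areas, the resulting index can, at best, reflect urban price movements regardless of how sound the arithmetic is; hence the resolution calling for rural price collection.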

The North West administrative and political leadership, jointly with the RDP office, provided the space for quite successfully challenging what was becoming a myopic, self-serving statistical hegemony based in Pretoria. This led to visible changes at the Central Statistical Service from July 1995 and to subsequent improvements in timeliness and quality, the latter albeit gradual and problematic.

This workshop offers South Africa a golden opportunity to act strategically in terms of managing statistics. If there is anything to learn from Australia, Canada and, more recently, Mozambique and Uganda, it is ways of providing a strategic framework for the management of statistics and methods of operationalising activities within this framework. From the ABS we can learn that a "strategic management process delivers ever changing strategies which move the ABS in the directions set by the objectives in the plan. Some strategies in the plan will not be fully realised; on the other hand, new strategies will emerge from across the organisation and be implemented. The success of this process depends on how well these directions are understood, accepted and pursued by all staff."[17]

6.10.8 The 1996 Census

Hopes were pinned on the 1996 Census results; however, these brewed their own controversy, particularly the preliminary results, which suggested that we were not as many as we had usually thought. The release of a final count that was two million people higher did not help to build the image of the organisation.

The census has largely helped to ameliorate the paucity of data for planning. However, the worst government-wide mistake committed was the selling of the data. While the price per dataset was not prohibitive, the notion of selling turned departments and users off, as they saw the practice as unorthodox and as running counter to co-operative governance. Besides, it was argued, the census had already been paid for by the fiscus. The facts facing Statistics South Africa were, however, that it had to balance its budget, and the sale of data to recover R7 million was a real consideration. Government was relentlessly pursuing the reduction of the deficit, and any over-expenditure was dealt with ruthless discipline. This era brought about discipline and reduced the deficit to levels that are unprecedented across government. The abrupt discontinuation of the October Household Survey programme is bound to create discontinuities in the ability of government to measure progress. Ad hoc implementation of surveys militates against any notion of statistical dependability in the future. The leadership of the country has begun to apply pressure on the statistical system and to require it to deliver.

6.10.9 Political leadership and statistical challenges

At the January 2002 Cabinet Lekgotla, after an extended discussion on communication, attention focused on what it is that is to be communicated, i.e., the content of communication. It was concluded that such content would be of the nature of facts and figures, i.e., of a statistical nature. The President, in the same Lekgotla, alluded to the desire to paint a picture of seven years of democracy. Such a picture depends on the indicators to be provided by the statistical system.

In the last two State of the Nation Addresses that the President made, I have noted that in 2001 only two of the five sets of statistics used came from Statistics South Africa, and in the State of the Nation Address of 2002 none of the figures used came from Statistics South Africa. The question this state of affairs raises is how relevant the organisation is, how accessible the statistics it produces are, and what level of public trust those statistics are accorded.

In the same Cabinet Lekgotla, as already indicated elsewhere in this paper, the President gave a clear instruction to have the national statistics system work and deliver. Furthermore, in an encounter of the Directors-General with the President, one of the four agenda items tabled was on statistics. There is no doubt that at the highest level of political authority, statistics are not only viewed as important but are urgently required for managing public affairs. If you cannot measure it, you cannot manage it.

The Minister of Finance has posed questions on the veracity of statistics which, as he sees it, fail to add up. He has called upon the organisation to embark on an agenda for improvement.

The conclusion one draws from this series of interventions is that there is political will and leadership to get the statistics of this country on a sound footing. What is needed is for the bureaucracy to locate its position within the already established value chain and align its strategy. This workshop guides such positioning.

6.10.10 Agenda for statistical improvement

In 1999, Parliament passed the Statistics Act (Act 6 of 1999), which empowered the Statistician-General to co-ordinate official statistics. The Act also raised the level of authority of the organisation. In November 2000, a Statistician-General was appointed. The new leadership has focused on improving leadership and changing management practices. The drive has been towards a flatter structure, promoting a learning organisation and allowing lateral information flows. Managing and encouraging the exchange of experiences and personnel on the continent has witnessed a commendable measure of success.

Following this appointment, several sessions have been held with the Statistics Council on the agenda for improving statistics and on clarifying the roles and responsibilities of the Council in relation to Statistics South Africa in general and the National Statistics System in particular.

By January 2001, a document on the creation of a national statistics system was tabled before FOSAD for consideration, and FOSAD adopted it. By September 2001 working teams for the NSS had been established, and a strategy document on the NSS was prepared and adopted by the FOSAD meeting of November 2001. By February 2002 a fully functional NSS unit had been established in Statistics South Africa, and within a few months of it becoming operational, cross-departmental work teams had been set up. In conjunction with the Presidency, Statistics South Africa will lead the implementation of the NSS, and this indicator framework workshop provides an opportunity for users, producers and suppliers to exchange views on priorities. A series of bilateral meetings with departments has been rolled out since February 2002 and will continue. These meetings have included training staff in government departments on the use of the data holdings that Stats SA possesses.

A quality and methodology team has been set up at Statistics South Africa to monitor, advise and ensure that quality is not only implemented but is seen to be implemented. A systematic process of going through each series and identifying areas for improvement has begun. For instance, preparations are afoot to revisit Census '96 and confront it with Census 2001, even before Census 2001 yields results.

A rigorous statistical training programme has begun. Up to 32 members of staff are studying statistical techniques, 22 of them full-time. The organisation has set aside about 3% of its total budget for statistical training and commits to each staff member receiving a minimum of six days' training a year. Furthermore, the organisation has set aside R1 million for the appointment of interns for periods not exceeding three months. With regard to people with disabilities, we have moved deliberately to increase our intake, particularly in data processing, and jointly with the Department of Labour have conducted very successful training that has brought people with disabilities on board with competencies that meet requirements.

We have started a far-reaching programme for promoting statistical literacy as a means of achieving economic literacy. To this end, in conjunction with the Department of Education and SAMDI, we are developing materials from the census, especially the census at schools, for training pupils in the handling of data.

The organisation leads the pack in adopting state-of-the-art technology, especially in data processing. The challenge is that of integrating disparate systems into one seamless data holding with interfaces to the user community.

On the agenda at the turn of the century are the challenges of measuring causes of death, the extent and nature of crime, the incidence and prevalence of HIV/AIDS, and patterns and levels of enrolment, especially within the context of a declining number of pupils sitting matric exams. Employment statistics are being revamped through the use of a better sampling scheme.
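To make the HIV/AIDS measurement distinction concrete, the two quantities can be stated in their generic epidemiological form (this is standard convention rather than a Stats SA definition): prevalence describes the stock of existing cases at a point in time, while incidence describes the flow of new cases over a period:

\[
\text{Prevalence at time } t = \frac{\text{existing cases at time } t}{\text{population at time } t},
\qquad
\text{Incidence rate} = \frac{\text{new cases during the period}}{\text{person-time at risk}}
\]

Measuring prevalence typically calls for a cross-sectional survey or sentinel surveillance, whereas incidence requires following a population over time, which is one reason the two are listed as separate measurement challenges.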

Notable improvements and collaboration are being made on the GIS as a corporate asset for South Africa Incorporated.

6.10.11 Conclusion

In South Africa, the first five years of statistical freedom can be noted for what could best be termed the "data spaghetti era", as well as a profound lack of a common vision on the nature and direction of the statistics system. There was a relentless focus on operational effectiveness, with less attention to strategy and to the direction in which the organisation and the system were to move. A less favourable resource base for building statistical infrastructure hampered the rapid development of the organisation. Some series, like the October Household Survey, had to be abandoned because of funding problems and have been replaced by another important series, the Labour Force Survey, which does not, however, cover the full extent and modules included in the OHS. The political leadership is fully aware of the need to improve the statistics system and has indeed elevated the status of the organisation to that of a fully fledged department. However, as can be seen in the case of successful statistics systems, these systems take a while to develop, they consume more resources as the number of collections increases, and they succeed where training is lifelong. Without strategic management a statistics organisation cannot survive. For South Africa, it appears that, as in Mozambique and Uganda, the political leadership has resolved to actively support the coming into being of the institution.

7.0 The to do’s

From the country studies we can learn that there are several “to do’s” for the improvement of the statistics system in South Africa. Below is the list that I trust could guide resolutions that can be implemented:

• Auditing the system and its current products;

• Producing a strategic plan with key deliverables, identified beneficiaries, costs of not having a national statistics system and associated resource inputs;

• Adopting a planning and implementation framework;

• Ensuring that statistics are of good quality;

• Implementing peer review for production;

• Recommitting the work teams to the improvement agenda;

• Delivering an indicator template and indicators;

• Delivering a picture for the seven years of democracy;

• Building the infrastructure for production of statistics, such as field operations;

• Training staff and allowing for fulltime study from time to time;

• Agreeing on series to be collated;

• Establishing funding stability for statistical series;

• Speeding up the expansionary route for statistics and funding it at least to the requisite levels;

• Producing time frames against programmes;

• Implementing the fourteen-point action plan.

-----------------------

[1] Statistics South Africa 2000

[2] Statistics South Africa 2001

[3] Handbook of Statistical Organisation 1980

[4] United Nations Statistics Commission 1994

[5] United Kingdom White Paper on Open Government (July 1993) Cm 2290

[6] United Kingdom White Paper on Open Government (July 1993) Cm 2290

[7] Challenges and trends in the modernisation of national statistics systems 2001

[8] Data requirements for the PRSP in Zambia, CSO August 2001

[9] The evolution of official statistics: implications for management and training, Bill McLennan

[10] Statistical methods discussion papers and new surveys on poverty and low income, Statistics Canada September 1997

[11] Operational evaluation unit: The development and use of development indicators, Benny Mokaba, 1995

[12] The evolution of official statistics: implications for management and training, Bill McLennan

[13] Statistical methods discussion papers and new surveys on poverty and low income, Statistics Canada September 1997

[14] Operational evaluation unit: The development and use of development indicators, Benny Mokaba, 1995

[15] Operational evaluation unit: The development and use of development indicators, Benny Mokaba, 1995

[16] Republic of South Africa: Budget Review 15 March 1995

[17] Corporate plan for the ABS
