The Politics and Governance of Basic Education: A Tale of Two South African Provinces

Brian Levy, Robert Cameron, Ursula Hoadley, and Vinothan Naidoo

Print publication date: 2018

Print ISBN-13: 9780198824053

Published to Oxford Scholarship Online: November 2018

DOI: 10.1093/oso/9780198824053.001.0001


Provincial Governance of Education—The Western Cape Experience

Chapter:
4 Provincial Governance of Education—The Western Cape Experience
Source:
The Politics and Governance of Basic Education
Author(s):

Robert Cameron

Brian Levy

Publisher:
Oxford University Press
DOI:10.1093/oso/9780198824053.003.0004

Abstract and Keywords

The focus of this chapter is the management and governance of education at provincial level—specifically on efforts to introduce performance management into education by the Western Cape Education Department (WCED), and their impact. Post-1994 the WCED inherited a bureaucracy that was well placed to manage the province’s large public education system. Subsequently, irrespective of which political party has been in power, the WCED consistently has sought to implement performance management. This chapter explores to what extent determined, top-down efforts, led by the public sector, can improve dismal educational performance. It concludes that the WCED is a relatively well-run public bureaucracy. However, efforts to strengthen the operation of the WCED’s bureaucracy have not translated into systematic improvements in schools in poorer areas. One possible implication is that efforts to strengthen hierarchy might usefully be complemented with additional effort to support more horizontal, peer-to-peer governance at the school level.

Keywords:   South African basic education, Western Cape Education Department, Western Cape Provincial Government, political settlements, education bureaucracy, new public management, performance management, whole school evaluation, public administration, principal–agent theory

4.1 Introduction

In South Africa’s public education system, the national level is assigned responsibility for policymaking, for resourcing the system, and for setting the overall regulatory framework. Responsibility for implementation is delegated to the country’s nine provinces. This chapter and the next explore how two provinces—the Western Cape and Eastern Cape—have exercised their implementation responsibilities.

In the years following the democratic elections of 1994, the new South African government enunciated the intention of adopting ‘best practice’ approaches to policy implementation, at both national and provincial levels. This included a high-profile effort to incorporate into government the principles and practices of results-based ‘new public management’ (NPM)—both across the public sector as a whole, and within the education sector. The dilemma, though, is that the political and institutional conditions for NPM to be effective are stringent.

As Chapter 3 has shown, efforts at the national level to introduce NPM into South Africa’s education sector fell foul of the underlying institutional and political realities. The African National Congress (ANC) governed as an ‘alliance’; policies were negotiated among multiple competing factions, with a strong voice for organized labour. In consequence, ambitious-seeming national-level NPM measures ended up being watered down almost to the point of becoming toothless. The majority of provinces mirrored the national level, in the sense that provincial-level institutional and political constraints undercut the potential for effectively introducing NPM.

The Western Cape province emerges as a potential exception. Post-1994, it has seen repeated alternation among competing political parties. At the outset of democracy, it inherited a bureaucracy that was well placed to manage the province’s large public education system relatively effectively. Subsequently, irrespective of which political party has been in power, the Western Cape Education Department (WCED) has consistently sought to implement performance management. The Western Cape thus offers a good opportunity for exploring to what extent determined, top-down efforts, led by the public sector, can turn around a legacy of dismal educational performance.

Our exploration takes the form of an analytically informed historical narrative, following the framework laid out in Chapter 1, and the empirical methodology laid out in Bates, Greif, Levi, Rosenthal, and Weingast (1998). We bring to the research the perspective of scholars in the fields of governance, institutions, politics, and public management. (Neither of us is an education sector specialist.) Our historical narrative is based on interviews with a wide range of current and former senior officials and other stakeholders, and an in-depth review of primary and secondary materials.

Our findings are paradoxical. On the one hand, we find that the WCED is (and long has been) a relatively well-run bureaucracy, not only within the South African context, but also (in our experience as specialists in comparative public management, and with reference to comparative indicators of government effectiveness globally1) likely so when compared with educational bureaucracies in other middle-income countries; further, we find (and Chapter 6 confirms, using sophisticated econometric techniques) that over the past decade the WCED has been intensifying its commitment to performance management. On the other hand, we find that, notwithstanding these sustained efforts, educational outcomes, especially among lower socio-economic segments of the population, remain at levels similar to those of countries and regions with per capita incomes (and public resource availability) that are orders of magnitude below those of the Western Cape.

This chapter explores this paradox as follows. Sections 4.2 and 4.3 take a long view of the drivers of performance of the WCED. With this history as backdrop, Section 4.4 provides an initial comparative assessment (previewing Chapter 7) of education sector performance in the Western Cape relative to other provinces within South Africa and to some other African countries, and over time. Section 4.5 extends the review and assessment of performance into the period since 2009, when the Democratic Alliance (DA) put in place a new generation of performance management tools. Section 4.6 reflects more broadly on the paradoxical results, on their possible causes, and, based on experience in other countries, on some potential entry points for accelerating progress towards better educational outcomes.

4.2 The ‘Long Route of Accountability’ in the Western Cape

Table 4.1 uses the governance typology introduced in Chapter 1 to highlight the contrast between the institutional arrangements for decision-making in education at the national level (as delineated in Chapter 3) and in the Western Cape. (The numbers should be interpreted as indicative of comparative patterns, not precise quantitative estimates.) As per Chapter 3, at the national level the governance of decision-making vis-à-vis education sector policy and regulation is disproportionately negotiated, with a significant personalized dimension. By contrast, for reasons that this section and the next will detail, governance in the Western Cape is largely based on hierarchical and impersonal decision-making.

Table 4.1. Governance of Education—Contrasting Political Settlements


The World Bank’s 2004 World Development Report (World Bank, 2004) provides a useful broad framework for thinking about hierarchical decision-making. As illustrated in Figure 4.1, it distinguishes between two sets of hierarchical accountability relationships, which together add up to a ‘long route’ of public service provision—a ‘voice’ relationship, through which citizens hold political leaders accountable for delivering results, and a ‘compact’ relationship, through which top-level policymakers can hold lower-level bureaucrats accountable. On both scores, the Western Cape’s legacy is a (relatively) propitious one. This section focuses on the ‘voice’ link; the next section on the ‘compact’.


Figure 4.1. The long route of accountability

Source: World Bank (2004).

In the ‘long route’, the mechanism through which citizens exercise voice is political competition—and political competition has played out differently in the Western Cape than in South Africa’s other provinces. A key distinction here is between ‘programmatic’ and ‘patronage’ political competition. In programmatic settings, political parties compete around alternative visions of what government should do, with all leading parties equally committed to trying to deliver on their promises, should they be elected. In patronage settings, competition is based on the differential abilities of parties to build alliances by offering special, personalized favours to clientelistic networks.

In 1994, South Africa ended centuries of political and economic racial discrimination, and established an electoral democracy. This democracy was organized around a quasi-federal system consisting of a national government, and nine provinces, which were granted some authority (often with shared responsibilities involving both central and provincial government). One of these provinces was the Western Cape, which was previously part of a larger Cape Province; prior to 1994, the Cape Province also included portions of what are now the Eastern Cape and Northern Cape provinces.

Across most of South Africa, electoral politics since 1994 has been dominated by the ANC, which has enjoyed large electoral majorities—and which governs through a combination of programmatic commitments and personalized promises (the balance between which varies from province to province). By contrast, the Western Cape has been characterized by robust inter-party competition, centred around alternative programmatic agendas. Indeed, as Table 4.2 details, in twenty years seven different parties or coalitions have controlled the province.

Table 4.2. Political Control of the Western Cape Provincial Government: 1994–2014

1994–98: New NP/African National Congress Government of Provincial Unity
1998–99: New NP
1999–2000: New NP/Democratic Party Coalition
2000–01: Democratic Alliance
2001–05: African National Congress/New NP Coalition
2005–09: African National Congress
2009–: Democratic Alliance

To many observers’ surprise, the National Party (NP, historically the dominant party of white Afrikaners) won control of the Western Cape province in the first democratic elections in 1994. The NP subsequently (unsuccessfully) tried to rebrand itself as the ‘New National Party’, and then combined with the Democratic Party (DP) to form the DA. In recent years, the Western Cape vote increasingly has shifted to the DA, which in 2009 became the province’s majority party, with 51.5 per cent of the vote—and was re-elected in 2014 with a larger majority (59.44 per cent).

Underlying the Western Cape’s distinctive form of political competition are its patterns of class composition and ethnic distribution. The role of class composition is explored in Chapter 7; the focus here is on the ethnic distribution. As Table 4.3 shows, as of 1996, over three-quarters of South Africa’s population was black/African. However, this group comprised only 21 per cent of Western Cape residents. The Western Cape’s majority comprised people of mixed race (‘coloureds’, in the South African lexicon), most of whom spoke Afrikaans as their home language. Since 1994, country-wide, the overwhelming majority of the black/African vote consistently has gone to the ANC. The coloured vote, by contrast, has been far more contested—not only by competing appeals to ethnic allegiance, but also by programmatic promises to deliver better government.

Table 4.3. Population Distribution, by Ethnic Background: 1996 Census

                           Western Cape                 National
                           Numbers        %             Numbers         %
Black/African                826,691    20.9            31,127,631    76.7
Mixed race (‘coloured’)    2,146,109    54.2             3,600,446     8.9
Indian/Asian                  40,376     1.0             1,045,596     2.6
White                        821,551    20.8             4,434,697    10.9
Unspecified/other            122,148     3.1               375,204     0.9
Total                      3,956,875     100            40,583,574     100

Source: Republic of South Africa (RSA, Department of Statistics, 1996).

These differences in ethnic composition and political allegiance have, indirectly, had a further consequence for governance (specifically in education) in the Western Cape—a more constrained South African Democratic Teachers Union (SADTU). As Chapter 3 has shown, SADTU had a major influence in shaping the content of performance management systems in basic education at the national level. But SADTU has been less influential in the Western Cape.

In part, this is a consequence of SADTU’s close alignment with the ANC, which, as we have seen, has been relatively weaker in the Western Cape. In part, it is a consequence of the different trajectories of the anti-apartheid struggle in the Western Cape and elsewhere. SADTU was in important part a focal point of resistance to apartheid’s ‘Bantu education’. Given the Western Cape’s different demographics, the logic of resistance to apartheid took a different turn in the province than elsewhere. This resulted in different patterns of teacher organisation. Even at its peak in 2004, SADTU members never accounted for more than 67 per cent of the Western Cape’s teachers. By 2014, SADTU had 54.5 per cent membership and the Amalgamated Teachers Union (ATU),2 45.5 per cent. NAPTOSA, the more conservative union, which focuses primarily on professional issues, is the biggest component of ATU. This can be compared with provinces such as Mpumalanga/North West, where SADTU’s membership is more than 70 per cent of unionised teachers (Education Labour Relations Council [ELRC], 2005, 2010, 2013).

Labour relations thus played out differently in the Western Cape than elsewhere in the country. For one thing, the relative weakness of SADTU meant that it did not have the de facto veto over management initiatives which it seemingly enjoyed in many other provinces. For another, the WCED has long had in place a sophisticated Labour Relations Unit, with fifty-four staff, which has tried to manage the unions rather than embarking upon direct confrontation; for example, it has a well-developed process for dealing with labour relations disputes, in particular with teachers who are aggrieved at not having been promoted. Some of the senior WCED management are also SADTU members; broadly, formal WCED–SADTU interactions generally proceed along professional lines, with all sides bringing the concerns of committed educators to the table.

4.3 The Western Cape’s Education Bureaucracy: From ‘Good Enough’ Weberianism to Performance Management

This section explores the second link in Figure 4.1’s long route of accountability—the ‘compact’. It explores how bureaucratic hierarchy has operated in recent decades within the Western Cape, specifically within the WCED.

Historically, South Africa had a centralised form of governance, but the intergovernmental relations system changed substantially as a result of the 1996 constitution, which stipulated the creation of a quasi-federal system consisting of national, provincial and local spheres of government. Education has been designated a ‘concurrent’ function of both national and provincial government. Service conditions for educators and education policy are set nationally. However, the employers of teachers are the respective provincial heads of education departments (the Superintendents-General). The WCED in turn has deconcentrated education to eight districts, which themselves are divided into forty-nine circuits.

Provinces have extremely limited own revenue; in 2008–09, own revenue amounted to only 3.7 per cent of provinces’ total revenue. The provinces receive most of their revenue from national government via the equitable share and conditional grants: in 2008–09, the provinces received 80.1 per cent of their revenue via the equitable share, and 16.2 per cent from conditional grants. Provinces have the discretion to spend their equitable share on their functions as they see fit. This means that national government cannot intervene in the allocation of the respective provincial budgets, although provinces do have to conform with national norms and standards, which for education are set by the national Department of Education (Jansen and Taylor, 2003: 6–7).

As at November 2014, the WCED employed 933 public servants at its Cape Town head office; 1,274 public servants, along with 680 office-based educators, at eight district and circuit offices; and 29,900 teachers at 1,533 government schools. Circuit staff are mainly office-based educators, although they do have a few public servants (administrative support staff) in their teams (WCED Data Base, 2014).3

Our exploration of how this system has been governed is organized around four sets of themes (and related sub-periods): the bureaucratic inheritance as of 1994; some national-level efforts to restructure the education sector in the initial years of democracy, and their effects within the Western Cape; efforts between 1999 and 2009 to racially ‘transform’ the bureaucracy; and the introduction at the provincial level of national initiatives to foster performance management. (Section 4.5 continues the story beyond 2009, when the DA became the majority party in the Western Cape.) As will become evident, throughout the past two decades the WCED’s platform has been relatively strong.

4.3.1 A Platform of Relative Continuity

As of 1994, the structure, organisation, resource availability and quality of South Africa’s educational system were overwhelmingly the consequence of a centuries-long legacy of inequality, poverty and apartheid. Democratisation was accompanied by public policies that ended the apartheid organisational structures and, as Chapter 2 details, radically reshaped the flow of public resources in a more pro-poor direction.4 But the shadow of the past continues to loom large. This continuity is evident (notwithstanding the more progressive fiscal allocations) in the continuing advantageous access5 to resources enjoyed by public schools that serve elite populations (a topic that is outside the scope of the present chapter). Continuity can also take more subtle forms, for example in the likely persistence over time of divergent organisational cultures within schools and in their proximate bureaucratic sub-systems. Consequently, it is with the organisational legacy at the end of apartheid that our exploration of the evolving operation of the WCED begins.

With respect to performance, the state of schools in the Western Cape in the early 1990s, just prior to democratisation, was as follows.

The historically ‘white’ (so-called Model C) schools constituted a well-resourced and well-performing school system. They were partially funded by the state and had increased autonomy. They were regulated by the Education Department of the Cape Provincial Administration (CPA). There was little evidence of patronage in the appointment or promotion of teachers.

Historically ‘coloured’ schools were under the control of the House of Representatives (HoR), the political structure created by the apartheid authorities for the ‘self-government’ of South Africa’s mixed-race population. This arrangement was strongly resisted by communities and some teachers (Chisholm et al., 1999; Soudien, 2002; Kallaway, 2002). The system did, however, enable the schools to extract resources from the HoR, which gave them a better education than African schools received. As discussed in detail in Appendix A4.1, while there was some evidence of the use of the school system as a source of patronage during the apartheid era, there was no evidence of systematic capture of the system by a predatory elite. The majority of schools had a conservative organisational culture (Fiske and Ladd, 2004: 75–6). In some politically activist schools, there was a strong emphasis on professionalism, which was used as a bulwark against the excesses of apartheid.6 The HoR did, however, attempt to control the appointment of senior positions, most notably principals.7

Black schools were poorly resourced. The Department of Education and Training (DET, the former Department of Bantu Education), which controlled black education, was characterized by authoritarian control, poorly trained teachers, personalized patronage (Chisholm et al., 1999), and a lack of performance culture. There was also strong resistance to apartheid education in black schools (Kallaway, 2002; Soudien, 2002). Given the province’s demographics, the DET school system was disproportionately small in the Western Cape setting.

The new provincial government of the Western Cape, created in the immediate aftermath of the 1994 elections, inherited the education departments that were located within the Western Cape—those of the CPA, the HoR and the DET. Portions of the old CPA were hived off and became part of the Eastern Cape and Northern Cape provinces. The Western Cape, unlike many other provinces, did not have any Bantustans8 to incorporate. This contributed to a more seamless amalgamation than in most other provinces (where amalgamations with Bantustans had turned out to be time-consuming, disruptive and costly).

Not only was the Western Cape one of two provinces (of a total of nine) which were controlled by the opposition after the 1994 elections,9 but it was the only province where there was no change of political power. Fiske and Ladd (2004: 75–6) pointed out that during the 1994–95 period, when the power and responsibilities of the provinces were still being established at the national level, the erstwhile CPA bureaucracy that had provided education for white students was still able to exert significant power by providing much of the administrative expertise for the new department. The political forces that had gained control over the education of the coloured students in the 1980s through the HoR continued to be influential and to exert a largely conservative force. In fact, a number of ex-HoR politicians had joined the NP and four of the ministers10 in Hernus Kriel’s 1994 cabinet had come from HoR ranks.

However, education officials who had previously been employed in the DET were left in a quandary, not knowing whether they were to report to the DET head office in Pretoria, which was being shut down, or to the new WCED. Despite this uncertainty, most of the abovementioned factors contributed to the Western Cape department of education being up and functioning quite quickly in comparison with the departments in other provinces.11

Table 4.4 provides striking evidence of continuity in government. The bureaucracy was largely insulated from the rapid turnover of political heads at provincial cabinet level (i.e. the provincial ministers of education). As the table shows, over the past two decades the WCED has effectively been led by three superintendents general—Brian O’Connell, Ron Swartz, and Penny Vinjevold. This degree of stability in bureaucratic leadership is a major asset in underpinning performance.

Table 4.4. Superintendents General: WCED 1994–2014

F. Knoetze (acting, 1994–95)

Brian O’Connell (1995–2001)

Johan Fourie (acting, 2001)

Ron Swartz (2002–09)

Brian Schreuder (acting, 2009)

Penny Vinjevold (2009–16)

But continuity also has its costs; old organisational cultures can remain entrenched. Indeed, this is what appears to have happened in the WCED. Interviews with ex-HoR officials and one former minister for education suggested that coloured ex-HoR officials (and not old CPA officials) dominated the new education department. Unlike the CPA, which ran schools on a provincial basis only, the HoR ran education nationally, and had the most staff. As one interviewee said, ‘The HoR in effect incorporated the old CPA and DET’.

As noted earlier, and detailed in Appendix A4.1, the HoR’s Department of Education brought with it a conservative and rule-bound culture into the WCED. Patronage was present, but it was on the margins of what one might call ‘good enough Weberianism’.12 Interviews suggest that in the immediate aftermath of the 1994 democratic elections, this conservative culture became the dominant strain in the new WCED—‘Good enough Weberianism’ became the order of the day.

4.3.2 Absorbing Policy Shocks from the National Level

In the first half-dozen years of democracy, the education sector was characterized by far-reaching structural changes that aimed to decisively leave behind the apartheid legacy. As a companion paper (Hoadley et al., 2016) details, these included: a South African Schools Act, which decentralised very substantial authority to school-level governing bodies; a transformation of the curriculum; a radical shift in how teachers were trained; and a restructuring of the budgetary and personnel policies in an effort to eliminate racial inequities in resources.

From the perspective of the WCED, the most difficult policy change of the first Western Cape legislature (1994–99) was undoubtedly the rationalisation of teachers. In historically white and coloured areas, the pupil–teacher ratio had been almost the same, and substantially more favourable than in black schools. The new rules on teacher recruitment made provision for schools to use their own sources of revenue; this created an opportunity for schools in relatively privileged areas (the so-called ‘former Model C’ white schools) to levy relatively high school fees on parents, and thereby cushion the impact of the cuts in government-funded teaching posts by privately funding positions, namely school governing body (SGB) posts. The erstwhile coloured schools did not have wealthy parents on whom they could levy high school fees; as a result, they were the group most adversely affected by the teacher rationalisation process in the Western Cape. On average, formerly coloured high schools lost more than eleven teacher positions per school between 1996 and 1999. Conversely, former African high schools gained, on average, a teacher per school (Fiske and Ladd, 2004: 108–22; Chisholm et al., 1999: 397–8).

According to Fiske and Ladd (2004: 108–22), about 2,900 teachers opted for voluntary severance packages (VSPs) and almost 2,000 left teaching in 1998 alone. Chisholm et al. (1999: 397–8) pointed out that 25 per cent of principals themselves took the packages; furthermore, the teachers who took severance packages and left the school system had higher average qualifications than those who remained. The average teacher in coloured secondary schools in 1996 had nearly four-and-a-half years of tertiary education, but by 1997 the typical teacher had one-third of a year less training. What was particularly problematic was the impact of the loss of mathematics and science teachers, many of whom were quick to accept the VSP because they had marketable skills which could be utilised in business and other sectors of the economy.

4.3.3 Transforming Incrementally

As with all departments across South Africa’s public sector, following the 1994 elections the WCED began to transform its racial composition to mirror South Africa’s democratic realities. Brian O’Connell, then Vice-Rector of Peninsula Technikon, was brought in as the first superintendent general in the democratic Western Cape to serve as a unifying force, and was in charge of the department from 1995 to 2001. It was felt by the ruling NP that choosing someone from outside the three administrations would be less divisive than selecting a leader from within one of the three pre-existing departments.

Some affirmative action began relatively early. The 2000 restructuring of district-level education management and development centres (EMDCs) by O’Connell led to the ‘population of districts with more representative appointments’. When the ANC won full control of the Western Cape in 2005, it accelerated this process of affirmative action in the department of education. In some parts of the administration (e.g. the Office of the Premier), organisational restructuring had, according to some interviewees, led to a rapid acceleration of the ANC’s ‘cadre deployment’13 strategies, along the lines it had pursued in other parts of the country. So, when ex-SADTU national vice president and then superintendent general of WCED, Ron Swartz, introduced an organisational restructuring of the head office in 2007, this was seen in many quarters as another cadre deployment exercise.

However, interviewees for this research suggested that there was a strong organisational need for this restructuring. The WCED’s Education and Planning branch was widely viewed as too big and unwieldy; it had curriculum, examinations, specialised education, research, ICT and infrastructure under its control. Ron Swartz split this branch into two. Interviewees suggested that there was a sound organisational logic to this restructuring, which was modelled on earlier reforms in Gauteng province.

The 2007 reorganisation thus involved the creation of a new head office organogram with a number of new positions. About 60 new staff were appointed, with hardly any of the existing staff losing their jobs. In this way, the ANC provincial government responded to pressure from ANC provincial party structures to transform the department. But, according to one long-standing senior interviewee, it also recognised the need for a dedicated professional approach to the management of teachers, and so proceeded with restructuring in a way that did not lead to an exodus of existing expertise.

The placement of staff under the Swartz reorganisation was largely completed by the time the ANC lost power to the DA in 2009. The DA tweaked the organisation structure in 2011, but did not change it significantly (two to three posts were made redundant). A few staff who were perceived by the DA as incompetent (almost all had been appointed under Swartz) were eased out of the department around 2011.

4.3.4 Introduction of Performance Management

Throughout the two decades of democratic government, the WCED has endeavoured to put in place results-oriented approaches to performance management. In the first fifteen years, these efforts took their lead from the systems-building efforts promoted by the national-level department of basic education.

Since the late 1990s, in an effort to link performance and career-pathing, the national-level department of basic education has come up with an ongoing stream of performance management initiatives—from the development appraisal system (DAS), individual performance management, and whole school evaluations (WSE), to the ‘integrated quality management system’ (IQMS), which encompasses all of these performance management systems (ELRC, 2003). As Chapter 3 details, these initiatives have often been intensive in bureaucratic process, but light on results-based follow-through. For all the limitations of these initiatives, the Western Cape bureaucracy has consistently taken performance management seriously—both by putting in place systems to implement the national initiatives, and (as we explore later) by taking a series of home-grown initiatives.

First to be introduced (in the late 1990s) were individual performance evaluations. Most interviewees were scathing about the performance management system for individual staff. A former provincial minister for education stated that: ‘IQMS is not a proper form of evaluation. It does not add real value. It costs a fortune to administer and is time-consuming.’ Other comments ranged from ‘a useless form of evaluation’ to ‘a bit of a joke’. Interviews also picked up gaming of performance management.14 One strategically located interviewee indicated that in some schools there is a disjuncture between the performance of the school and the individual performance of its teachers: in some cases, teachers in underperforming schools get high individual performance evaluations.

The whole school evaluation (WSE) was promulgated nationally in 2001 (RSA, 2001), and implemented in the Western Cape in 2006, replacing the old provincial inspectorate system. It involves three steps: pre-evaluation documents prepared by the schools; an external evaluation; and post-evaluation, whereby schools and districts analyse the WSE report and incorporate the recommendations into school improvement plans (SIPs). In the Western Cape, the WSE is carried out by teams consisting of permanently appointed officials and part-time supervisors appointed by the WCED for this purpose. There is a multi-functional team for high schools, which consists of a team leader and three team members. Each school is evaluated according to the nine focus areas specified in the WSE policy. Lesson observation takes place in languages, mathematics, natural/life sciences and an elective (high schools), or in the foundation phase (primary schools). The length of the visit is three or five days, depending on the size of the school.

There are a number of critiques of WSE. Firstly, from the union perspective, the evaluations are described as ‘nice little reports where little is done. WSE is equally useless (in comparison with performance management)’. Another complaint, from a couple of interviewees, was that, due to SADTU resistance, the external supervisors cannot evaluate teachers in classrooms. However, according to the WCED official in charge of WSE, there are classroom visits, but under circumscribed conditions, e.g., the school needs to know in advance. Additional critiques were that WSE is not robust enough, and that it takes too long to implement. The objective is to evaluate high schools once every three years, and primary schools once every five years. There are 1,524 schools in the Western Cape. Since WSE evaluation began, about 757 schools (50 per cent) have been evaluated (WCED Data Base, 2014).

But for all of the criticisms, there was also a sense that WSEs add value. SIPs prepared by schools are monitored online by the districts and serve as an early warning system; many WCED staff we interviewed reported this to be a relatively effective monitoring mechanism. The WCED follows up with a sample of schools which had been evaluated between 2006 and 2010, and checks on what has been done. Ongoing monitoring and support of schools identified as poorly performing is done by the districts. Interviewees reported that WSE evaluations have led to principals being held to account and, on occasion, disciplined. They have also contributed to principals being subtly eased out for non-performance. Interviewees argued that good plans backed by competent public administration can add value, even if the tools themselves have substantial built-in limitations.

4.4 Performance of Basic Education in the Western Cape in Comparative Perspective

To what extent did the WCED’s bureaucratic assets and commitment to performance management contribute to superior school performance? This section benchmarks Western Cape performance using three sets of measures: a comparative measure of management performance; measures which contrast the Western Cape’s educational performance with that of South Africa’s other provinces; and a comparative measure of Western Cape performance relative to selected other African countries. (Section 4.5 looks in depth at recent changes over time within the Western Cape.)

4.4.1 Comparative Managerial Performance

Management performance assessment tests (MPATs) have been undertaken across multiple departments and provinces by the department of performance monitoring and evaluation located in the national presidency. The MPAT rates performance according to four levels:

  • Level 1—non-compliance with legal/regulatory requirements.

  • Level 2—partial compliance with legal/regulatory requirements.

  • Level 3—full compliance with legal/regulatory requirements.

  • Level 4—full compliance, and doing this smartly.

The MPATs include a comparative assessment of the relative quality of the education bureaucracies across South Africa’s nine provinces. Appendix A4.2 describes the MPAT and the detailed results for education departments. Table 4.5 provides a summary overview of the relative performance of the WCED. Given the provenance of the MPAT—and the expectation that the national presidency would not be biased towards showing the only province not governed by the ruling ANC in an undeservedly good light—the results are striking. As Table 4.5 signals, the WCED emerges as far and away the best managed of the provincial education departments: as of 2012–13, it was fully in compliance (or better) with 79 per cent of the key performance indicators which were assessed. The next best provincial education department (Gauteng) was in compliance (or better) with 65 per cent.

Table 4.5. MPAT Assessments of Overall Performance for South Africa’s Education Departments


Source: The Presidency, Department of Performance Monitoring and Evaluation (DPME, 2013).
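To make the summary statistic concrete, the sketch below shows, using invented data, how a ‘fully compliant or better’ share of the kind quoted above can be computed from per-indicator MPAT levels. The scores are illustrative only; they are not drawn from the actual DPME assessments.

```python
# Illustrative only: hypothetical per-KPI MPAT levels for one department, not actual DPME data.
# Each key performance indicator is rated on the 1-4 scale described above; a headline figure
# such as "79 per cent fully compliant or better" is the share of assessed indicators at level 3 or 4.

def share_fully_compliant(kpi_levels):
    """Return the fraction of assessed KPIs rated at level 3 (full compliance) or 4."""
    assessed = [level for level in kpi_levels if level is not None]  # ignore unassessed KPIs
    return sum(1 for level in assessed if level >= 3) / len(assessed)

# A made-up example with twenty assessed indicators.
example_scores = [4, 3, 3, 2, 4, 3, 3, 3, 1, 4, 3, 2, 3, 4, 3, 3, 2, 4, 3, 3]
print(f"{share_fully_compliant(example_scores):.0%} of KPIs fully compliant or better")
```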

4.4.2 Comparative Educational Outcomes, 2007/08

Necessarily, an assessment of the quality of an education system must benchmark the educational outcomes achieved by that system—over time, and relative to other systems. We do so in this subsection and, again, later in the chapter.

Comparative benchmarking is challenging. Educational outcomes depend in important part on the socio-economic profile of a system’s learners. Consequently, if one is to rigorously benchmark the quality of an education system’s management using outcome-based indicators, it is necessary to control for demographic variations in the student population. In a South African context, the Western Cape’s relatively favourable socio-economic profile is likely to produce relatively strong educational outcomes even if (contra Table 4.5) education in the Western Cape were no better managed than elsewhere in the country. A further complication is that, perhaps even more than elsewhere in South Africa, the Western Cape is extraordinarily dualistic—so average outcomes disguise huge within-system variation, making it difficult to make judgements about quality at different points along the socio-economic spectrum. Yet another challenge has to do with the measurement of outcomes. Tests of learning outcomes are often unreliable, with very large standard errors of estimate, even for the same test. Further, changes in test design can undermine year-on-year comparability, even when the intention was only to make modest tweaks. Finally, in order to show positive outcomes, education systems have come up with many ways to ‘game’ tests—from ‘teaching to the test’ to constraining who actually gets to write the test.

Chapter 6 addresses the above issues with a comprehensive, multivariate econometric analysis of education outcomes in the Western Cape relative to other locales, including systematic disaggregation of performance across the socio-economic spectrum. This chapter uses a variety of standardized outcome measures to provide an initial set of descriptive statistical benchmarks of the Western Cape’s performance relative to other locales, as of 2007/08. (2007/08 is the time immediately preceding the DA’s 2009 provincial electoral victory; the data here thus serve as a useful baseline for the trends under DA governance, presented in Section 4.5.)
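As a rough illustration of what ‘controlling for demographic variations’ involves (not the actual specification or data used in Chapter 6), the sketch below regresses synthetic test scores on a province indicator together with a socio-economic index, so that the estimated province gap is conditional on learner background. All variable names and values are invented.

```python
# A minimal sketch, under assumed data, of benchmarking with a socio-economic (SES) control:
# regress scores on a province indicator plus an SES measure, so the province coefficient
# reflects the score gap at comparable SES levels. This is not the Chapter 6 model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
ses = rng.normal(0, 1, n)                      # synthetic socio-economic index
western_cape = rng.integers(0, 2, n)           # 1 = Western Cape learner, 0 = comparator
score = 500 + 40 * ses + 15 * western_cape + rng.normal(0, 50, n)  # synthetic test scores
df = pd.DataFrame({"score": score, "ses": ses, "western_cape": western_cape})

# Without the SES term, the province gap would be conflated with differences in learner
# background; including `ses` isolates the conditional gap.
model = smf.ols("score ~ western_cape + ses", data=df).fit()
print(model.params[["western_cape", "ses"]])
```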

We begin with the results of standardised tests administered in 2007 to a large sample of sixth graders in fifteen countries by the independent Southern and East African Consortium for Monitoring Educational Quality (SACMEQ). (The South African SACMEQ sample comprised 9,083 students drawn from 392 schools; sample size per province ranged from 900 to 1,500 observations.) Table 4.6 reports the SACMEQ scores for the Western Cape relative to South Africa’s other provinces. The Western Cape emerges as the best-performing of South Africa’s nine provinces, with Gauteng a close second—and the remaining seven lagging significantly behind. The relative ranking of provinces is broadly similar whether one takes the median score, the score for learners at the seventy-fifth percentile of the socio-economic distribution, or the score for learners at the lower, twenty-fifth percentile of the socio-economic distribution. As Chapter 6 shows, for the most part these relative rankings persist even once a wide variety of other, exogenous influences are controlled for.15

Table 4.6. SACMEQ Benchmark I: Western Cape Achievement in Grade 6 Mathematics, Relative to Other South African Provinces, 2007

                           50th percentile (median)    25th percentile    75th percentile
Western Cape                         560                     496                636
Gauteng                              548                     483                610
Northwest/Northern Cape              483                     439                548
Free State                           483                     439                535
Kwazulu-Natal                        469                     424                535
Mpumalanga                           469                     425                522
Eastern Cape                         454                     408                509
Limpopo                              439                     408                483

1 Achievement in grade 6 mathematics by province, 2007.

Source: SACMEQ data files (2007), RSA, DBE (2010a).

Table 4.7 reports ‘matric’ results (the end-of-high-school National Senior Certificate examination) by province for 2008. As of 2008, the Western Cape’s performance vis-à-vis matric results was mixed. Using the pass rate as a benchmark, the Western Cape emerges as the top-performing province. However, when the benchmark used is not the percentage of exam-takers who pass, but rather the percentage of eighteen-year-olds who both take and pass the exam, the Western Cape ranks only fourth among nine provinces.16 As of 2008, Gauteng, Kwazulu Natal and Limpopo were able to take a higher proportion of the age cohort successfully through a full twelve years of education than the Western Cape. If, however, the benchmark of success is made more robust—preparation of pupils for university entrance—the Western Cape’s strengths re-emerge. As Table 4.7 shows, the Western Cape and Gauteng were by a large margin the two most successful provinces in preparing their pupils for university. Considering together the pass rate and performance vis-à-vis university entrance, one can reasonably conclude that, relative to other provinces, the WCED served (relatively) well those who persisted within the system.

Table 4.7. National Senior Certificate Results, Full-Time Students (2008)

                    As % of exam-takers        As % of 18 year olds       Total 18 year olds (‘000s)
                    % pass    % Bachelors      % pass    % Bachelors
Western Cape         78.4        33.0           35.8        14.6                  99.1
Gauteng              76.4        30.5           40.0        15.0                 187.4
North West           68.0        19.4           34.0         9.1                  68.8
Free State           71.8        21.0           34.5        11.8                  59.2
Mpumalanga           51.8        13.1           33.4         7.1                  88.3
Northern Cape        72.7        20.1           32.3        10.0                  23.2
Limpopo              54.3        12.6           38.1         7.9                 133.5
Kwazulu Natal        57.6        18.2           38.5        12.9                 222.9
Eastern Cape         50.6        14.4           19.6         5.4                 161
National             62.6        20.1           34.6        10.2               1,044

Note: A ‘pass’ requires a grade of at least 40 per cent in three subjects; and of 30 per cent in an additional three subjects. A ‘bachelors pass’ (university eligibility) requires a grade of 50 per cent or better in at least four subjects, and a passing grade for the remaining subjects.

Sources: RSA, DBE (2010b, 2014).

As a further benchmark, Table 4.8 uses SACMEQ data to compare education performance in the Western Cape with that of Mauritius, Kenya, Tanzania and Botswana. As the data show, the Western Cape’s median sixth grader scored below the equivalent learner in Mauritius, and similarly to learners in Kenya and the Tanzanian mainland. At the twenty-fifth percentile (i.e. the lower socio-economic tier), the Western Cape scored marginally below all the comparator countries other than Botswana. Chapter 6 confirms that these results are statistically robust once other exogenous influences on performance are controlled for.

Table 4.8. SACMEQ Benchmark II: Western Cape Mathematics Scores Relative to Other African Countries, 2007

                           50th percentile (median)    25th percentile    75th percentile
Western Cape                         560                     496                636
South Africa (overall)               483                     424                548
Mauritius                            610                     522                718
Kenya                                548                     509                610
Tanzania                             555                     500                604
Botswana                             521                     468                573

Source: SACMEQ data files (2007).

It is plausible that the Western Cape’s relatively low scores reflect the province’s many centuries of traumatic history (including servitude, racial oppression and social dislocation, on farms and elsewhere), a legacy that is not adequately captured in the socio-economic control measures used in Chapter 6. But set against this is the reality that per learner expenditure in the Western Cape is five times that of Kenya (to cite one comparator country). As of 2008, relative to some other sub-Saharan African countries, the bureaucratic strengths of the WCED had not translated into superior performance.

4.5 Pragmatic Managerialism—the DA-Governed WCED

This section brings our review of WCED performance and its bureaucratic underpinnings forward to the present—focusing on how the WCED has been managed in the five years since the DA took power. When the DA took the reins of provincial power in the Western Cape in 2009, it did not lack ambition:

For us, success means becoming the best-run regional government in the world, so that we can realise our vision of an open opportunity society for all in the Western Cape.

(Provincial Government of the Western Cape, 2010)

Basic education is a core function of the provincial government—and also a function where better performance is central to the ‘vision of an open opportunity society for all’. To what extent has the DA administration made gains vis-à-vis its far-reaching ambitions? Insofar as it has made gains, what have been the key reforms?

4.5.1 Recent Trends in Performance

As was evident from Section 4.4, when the DA came to power in the Western Cape in 2009, both bureaucratic quality and performance in basic education were already generally better than elsewhere in South Africa. Subsequent to 2009, as Tables 4.9 and 4.10 will show, there is evidence of continuing gains in performance.17 But for all of the incremental gains, they remain within the range of what was achieved by other South African provinces—and the results remain below what has been achieved in Kenya and Mauritius, despite the fact that Kenya, for one, had far fewer resources at its disposal than the Western Cape.

Table 4.9. National Senior Certificate Results, Full-Time Students (2015)

                    Total 18 year olds (‘000s)     As % of 18 year olds
                                                   % pass     % Bachelors
Western Cape                 106.3                  43.8         21.4
Gauteng                      209.4                  44.0         18.7
North West                    68.2                  40.5         13.1
Free State                    56.2                  46.3         16.5
Mpumalanga                    89.3                  49.2         16.4
Northern Cape                 23.9                  34.7         10.5
Limpopo                      125.7                  54.9         16.8
Kwazulu Natal                214.5                  47.8         16.4
Eastern Cape                 135.7                  36.7         11.3
National                   1,029                    45.2         16.3

Note: A ‘pass’ requires a grade of at least 40 per cent in three subjects; and of 30 per cent in an additional three subjects. A ‘bachelors pass’ (university eligibility) requires a grade of 50 per cent or better in at least four subjects, and a passing grade for the remaining subjects.

Sources: RSA, DBE (2010b, 2014).

Table 4.9 updates for 2015 the 2008 ‘matric’ results that were presented in Table 4.7. A comparison of the two tables reveals that:

  • The Western Cape indeed made substantial gains over the seven-year period, increasing the number of graduates by over 11,000 (a 31 per cent increase over the number who passed matric in 2008), and the number reaching a university entrance standard by over 8,000 (a 55 per cent increase over the 2008 number).

  • These gains were achieved even as the Western Cape saw some (modest) increase in the total number of eighteen-year-olds. (Gauteng was the only other province where the number of eighteen-year-olds increased.)

  • Relative to other provinces, the Western Cape’s gains were greater than those achieved by Gauteng over the same period, but about the same as for the other provinces.

As of 2015, the Western Cape system continued to outperform the others in the proportion of pupils who achieved a university entrance standard, but remained below the national average in the proportion of the age cohort which successfully completed high school.
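The headline figures in the bullet points above can be reconstructed from the rounded values in Tables 4.7 and 4.9, as the short calculation below illustrates. Because the tables report percentages to one decimal place and cohort sizes in thousands, the results differ slightly from the exact counts behind the text (the bachelors-pass increase computes to roughly 57 per cent here, against the 55 per cent cited), but the magnitudes line up.

```python
# Reconstructing the 2008-to-2015 Western Cape comparison from the rounded table figures.
# Small rounding differences against the exact counts cited in the text are expected.

cohort_2008, cohort_2015 = 99_100, 106_300          # 18-year-olds in the Western Cape

passes_2008 = 0.358 * cohort_2008                   # 35.8% of the cohort passed in 2008
passes_2015 = 0.438 * cohort_2015                   # 43.8% of the cohort passed in 2015
print(f"Extra matric passes: {passes_2015 - passes_2008:,.0f} "
      f"({(passes_2015 / passes_2008 - 1):.0%} increase)")         # ~11,000 more, ~31%

bachelors_2008 = 0.146 * cohort_2008                # university-entrance passes, 2008
bachelors_2015 = 0.214 * cohort_2015                # university-entrance passes, 2015
print(f"Extra bachelors passes: {bachelors_2015 - bachelors_2008:,.0f} "
      f"({(bachelors_2015 / bachelors_2008 - 1):.0%} increase)")    # ~8,300 more, ~57%
```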

Table 4.10 reports recent trends in the performance of grade 3, 6, and 9 students in ‘systemic tests’, introduced by the WCED in 2002.18 As noted earlier, year-on-year comparisons of test results generally are fraught with difficulties. In the case of the Western Cape’s systemic tests, comparable test results are available only since 2010. Table 4.10 points to some recent improvement in the trend, but the results also underscore a stark reality. As of 2013, fewer than 30 per cent of grade 6s—and fewer than 15 per cent of grade 9s—met a minimum passing standard for numeracy. Given these overall pass rates, the results at the lower end of the socio-economic spectrum almost surely continue to be startlingly low.19

Table 4.10. Grade 3, 6 and 9 systemic tests—numeracy pass rates (pass set at 50%)

            Grade 3                    Grade 6                    Grade 9
Year    Pass rate    Tested        Pass rate    Tested        Pass rate    Tested
2006      31%        82,879            –           –              –           –
2007        –           –            14%        71,874             –           –
2008      35%        74,119            –           –              –           –
2009        –           –            17.4%      83,921             –           –
2010      48.3%      78,495          24.4%      81,402          10.4%      83,605
2011      47.6%      79,109          23.4%      78,288          10.9%      81,936
2012      51.5%      83,030          26.4%      79,301          13.9%      89,674
2013      55%        97,375          28.3%      78,723          14.3%      85,320
2014      54%        85,623          30.4%      72,214          14.9%      71,345

Source: WCED (2013, 2014, 2015).

4.5.2 Fine-Tuning Performance Monitoring

In this subsection and the next we turn to the management initiatives that underpinned WCED performance over the 2009–15 period. When we began this study, we expected to find a post-2009 ‘doubling down’ by the DA administration on NPM-style, performance-driven management practices. What we actually found was something more complex—an intriguing combination of heightened attention to performance monitoring and a shift to a more pragmatic managerialism, responding to challenges as they arose with ad hoc, and sometimes discretionary, solutions. Throughout, the WCED has largely remained committed to a top-down, hierarchical approach to governing the sector, with only a very nascent exploration of more facilitative, horizontal approaches to education sector governance.

In general, the NPM doctrine combines two seemingly disparate, but potentially complementary, departures from classic bureaucratic Weberianism—an intensified focus on the monitoring of performance, and greater flexibility (plus accountability for results) for front-line service provision units. This subsection focuses on the first of these two areas (performance monitoring)—one where the WCED has progressively strengthened its tools, with the gains continuing into the DA administration.

As of 2015, the centrepiece of the performance monitoring effort is the directorate of business, strategy and stakeholder management, which is located in the office of the head of the WCED, the superintendent general. Established in 2007 (i.e. predating DA rule), its formal functions initially comprised ‘providing a secretarial and administrative support service to the office of Head of Education’. Since then its powers have expanded. The directorate originally faced resistance from existing directorates, which thought it was an attempt to bypass them. It took three years of effort, and sustained support from the highest levels of the WCED, to put its performance-monitoring systems in place.

The directorate benefits from a sophisticated online tracking system, which includes the following:

  • An ‘individual learner tracking system’—which tracks the progress and performance of individual learners throughout their time within the WCED.20

  • Online SIPs for each of the 1,500 schools in the system. The SIPs incorporate, in an integrated, streamlined fashion that is accessible to each school:

    • aggregated school-level summaries of the result of the individual learner tracking exercises;

    • the results of whole school evaluations (which, as noted earlier, have been completed for about half the WCED’s schools, with 120 additional schools evaluated each year);

    • the school-level results of systemic tests;

    • academic performance plans, completed for each school;

    • a rolling, three-year planning cycle, incorporated into each SIP, progress in the implementation of which can be monitored systematically.

  • School-level budget and staffing planning and execution tools—capable of monitoring, for each school across the system, whether and how budgets are being spent, and including tools for ordering supplies online (notably textbooks, whose availability has bedevilled schools throughout South Africa) and for tracking whether orders have been placed in a timely manner.21

  • School improvement monitoring—undertaken quarterly, with a specific focus on underperforming schools.

  • District improvement plans, which track trends in performance at higher levels of system aggregation than the schools themselves.

The superintendent general is thus supported by a strategically located planning and monitoring unit which appears to be the hub of performance in the WCED. This ensures that there is an ‘early warning’ system, whereby problems of school performance are brought directly to the attention of the head of department.
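To illustrate the kind of aggregation such a system makes possible, the sketch below combines a few school-level indicators of the sort described above into a simple early-warning flag. The record fields, thresholds, and school identifiers are hypothetical; they are not the WCED’s actual data model or decision rules.

```python
# A simplified, hypothetical sketch of how tracked school-level data (systemic-test pass
# rates, budget execution, WSE status) could feed an early-warning flag for the head of
# department. Field names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class SchoolRecord:
    school_id: str                # hypothetical school identifier
    grade6_numeracy_pass: float   # latest systemic-test pass rate, 0-1
    budget_spent_share: float     # share of allocated budget spent to date, 0-1
    wse_completed: bool           # whether a whole school evaluation has been done

def needs_attention(school: SchoolRecord) -> bool:
    """Flag schools for quarterly improvement monitoring (illustrative thresholds)."""
    return (school.grade6_numeracy_pass < 0.30
            or school.budget_spent_share < 0.50
            or not school.wse_completed)

schools = [
    SchoolRecord("0101", grade6_numeracy_pass=0.22, budget_spent_share=0.80, wse_completed=True),
    SchoolRecord("0102", grade6_numeracy_pass=0.55, budget_spent_share=0.95, wse_completed=True),
]
flagged = [s.school_id for s in schools if needs_attention(s)]
print(f"Schools flagged for follow-up: {flagged}")   # ['0101']
```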

According to WCED interviewees, the online tracking system has been highly effective. It has led to a reduction in the time taken to fill teacher and principal posts. It can monitor how schools have spent their budgets—and, indeed, whether they have spent their budgets. In recent years, the tool has also been used to track teacher absenteeism: leave forms are used to calculate the total number of absent days as a percentage of the total number of days staff could have been present. Teacher absenteeism has fallen from 19 days annually down to six, and then four, days per annum. The WCED had anticipated a 4 per cent absenteeism figure, but in practice it has averaged out at a consistent 3 per cent from 2011 to 2013.
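The absenteeism figures above follow from a simple ratio, sketched below: total absent days recorded on leave forms divided by the total days staff could have been present. The roughly 200-day school year and the rounded 30,000-teacher headcount used here are illustrative assumptions, not WCED parameters.

```python
# The absenteeism measure described above: absent days as a share of possible staff-days.
# The ~200-day school year is an assumption for illustration, not a WCED figure.

def absenteeism_rate(absent_days_total: int, staff_count: int, working_days: int = 200) -> float:
    """Absent days as a share of total possible staff-days."""
    return absent_days_total / (staff_count * working_days)

# With roughly 30,000 teachers, an average of six absent days each works out to about 3%,
# in line with the figure the WCED reports for 2011-13.
teachers = 30_000
print(f"{absenteeism_rate(6 * teachers, teachers):.1%}")   # 3.0%
```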

In sum, top-down planning and monitoring systems have helped ensure that the vast majority of schools in the WCED are relatively well-managed, at least from a logistical perspective. Teacher posts are filled relatively rapidly, and teachers show up to work; school infrastructure is adequately maintained; supplies, including textbooks, are available; the system adapts reasonably effectively to changes in the numbers of learners within schools, and to the ongoing increase (as a result of migration) in the number of learners in the system as a whole.

4.5.3 A Turn to Pragmatic Managerialism

For all that our research found the WCED to be a well-managed and relatively well-resourced hierarchy with robust tools of top-down performance management, this has not been sufficient, as the test scores detailed earlier signal, to achieve major gains in educational outcomes. What might explain the gap? At the outset of our study we expected that part of the answer lay in the rigid ways in which the top-down systems were implemented. But, intriguingly and unexpectedly, we found that subsequent to 2009 the WCED leadership appears to have become increasingly pragmatic in its application of performance management.

In 2002, seemingly consistent with the two-fold logic of performance management—stronger performance monitoring plus greater facility-level autonomy—the WCED moved towards a formally more decentralised structure, via the re-organisation of much of the department into eight district-level offices. In 2007, it deepened this seeming decentralisation by creating sixty-eight ‘circuit’ management units within the districts, each responsible for approximately twenty to thirty schools. In practice, as numerous interviewees confirmed, between 2002 and 2009 the decentralisation was largely on paper. Interviewees repeatedly used the same phrase to describe the formalistic (IQMS, WSE etc.) way in which the WCED operated during that period—‘management by circular’.

But the new DA team adopted a different (though still largely hierarchical) approach. In 2009, the DA appointed Penny Vinjevold as Superintendent General.22 (She continued in that role until mid-2016.) From the start of her tenure, Vinjevold identified the districts as the nodes of service delivery that would drive performance, with district directors to be given more autonomy to run their areas of jurisdiction. Districts (and the circuit management teams within each district) are the front line of promoting performance, the ‘eyes and ears’ of the WCED. Their functions include ensuring that all teaching posts are filled; that teachers are teaching; that governing bodies are working properly; that schools receive adequate support; that relevant training is provided; and that performance information is used to inform efforts to improve school performance (although a continuing constraint is that many front-line staff lack the statistical skills to use this information effectively).

Along with improving the district-level structures, Vinjevold identified (in an interview for this study) the following as her top four priorities post-2009:

(p.109)

  • As her ‘biggest priority’ (articulated from 2010 onwards), assuring that all 1,500 schools had a ‘good principal’. (Given natural attrition and the age profile of the principal cadre, the opportunity exists to replace almost the entire cadre over an eight-year period. Indeed, between 2009 and 2013, 509 out of 1,542 principals were replaced.)

  • Assuring that every child had a textbook for every subject—something where the Western Cape, though better than many other provinces, had fallen short in the transition to a new curriculum.

  • Managing the budget to ensure that teacher salaries did not exceed 75 per cent of total available budgetary resources, thereby assuring budgetary flexibility for the system as a whole to function.

  • Explicitly challenging employees throughout the WCED (including the many administrative positions) with the question: ‘how does your job help learning improve?’

Vinjevold’s identification of principal quality as her biggest priority is consistent with a central finding of a large body of empirical research that the quality of school-level leadership is an important proximate explanation of school performance (Bush, 2007; Bush, Kiggundu, and Moorosi, 2011; Prew, 2007; Wills, 2016). Consistent with that finding, our companion school-level study (in Chapter 8) also found that the performance over time of its four sample schools was strongly associated with the quality of school-level leadership. We thus use the changing approach of the WCED in recent years to the selection of school principals to illustrate how its post-2009 turn to ‘pragmatic managerialism’ has played out in practice.

As the school-level study explored in depth, the formal responsibility for selecting principals rests largely with SGBs, with the WCED hierarchy (primarily via the district offices) playing a bureaucratic support role. Where SGBs are committed to the achievement of strong educational outcomes, these arrangements can work well. But where SGBs are prone to manipulation, as was evident in three of the four Western Cape schools examined in our companion case study (Hoadley et al., 2016), the result can be a ‘low-level equilibrium of mediocrity’.

Since 2009, the WCED has used a variety of managerial tools in an effort to influence principal selection in ways that could shake loose these low-level equilibria. These have included:

  • A de facto policy that when vacancies for principal arose in poorly performing schools, the winning candidate should not be a deputy principal from the same school. (The school-level study showed vividly how the prospect of in-house promotion could undermine the competitiveness of the principal selection process.)

  • (p.110) The use of early retirement options and other inducements (e.g. lateral transfers) to encourage principals in poorly performing schools to vacate their positions.

  • The introduction of written psychometric competency assessments for candidates for principal posts, with the costs of testing borne by the WCED. Given the rules governing labour relations, these assessments could not formally be required; but since the tests (and their financing) have been made available, all SGBs have made use of them.

  • A review of the selection process in poorly performing schools—and interventions (including from the highest levels of the WCED) where questions arose as to the likely performance of the selected candidate.

The newly empowered districts are central to these efforts to improve the quality of the principal selection process. Circuit managers sit as observers on the selection panels for principals and deputy principals. District directors are expected to form their own views of candidates for principal posts, forward them up the hierarchy—and then be accountable for the quality of principal appointments in their districts. It is too soon to assess systematically to what extent these policies have resulted in a strengthening of school-level leadership. But, in our view, using managerial tools along the lines described above to improve principal selection has the potential to yield significant gains in educational outcomes.

More broadly, looking beyond principal selection, since 2009 the WCED has systematically sought to alter the profile of its bureaucracy. A 2009 staffing scan revealed that fewer than 30 per cent of circuit team managers (circuit teams comprise the direct interface between schools and the WCED bureaucracy) had previously served as school principals; post-2009, in filling line positions in the bureaucracy, preference was given to employees with prior experience at school level, especially as principals. Further, district and circuit offices began to be given greater flexibility in how they went about their business.

But the predominant focus remained hierarchical. Teacher training remained strongly supply-driven. No systematic mentoring arrangements were in place for newly appointed principals and other new senior staff working to turn around hitherto dysfunctional schools (beyond the hierarchical quasi-inspection functions of circuit offices). Until 2009 elected SGBs23 were viewed more as an obstacle than as a potential asset for school-level governance. Further, the increased focus on direct contribution to learning resulted in a de (p.111) facto reduction of opportunities for engagement on the part of many non-governmental organisations who had been working with schools.

4.6 Is More Performance Management the Answer?

At the outset of this chapter, we noted the paradox of basic education in the Western Cape. On the one hand, as the detailed analytic narrative presented in this chapter underscores, the WCED is (and long has been) a relatively well-run public bureaucracy—not only within the South African context, but also likely so when compared with educational bureaucracies in other middle-income countries; further, the WCED’s performance-orientation appears to have increased over time. On the other hand, both the data presented in this chapter and the careful econometric analysis in Chapter 6 show that educational outcomes in the Western Cape are mediocre—with performance no better than, say, Kenya, notwithstanding much greater availability of resources. Sustained, determined efforts to strengthen the operation of the Western Cape’s education bureaucracy have not translated into the large, hoped-for gains. Why?

There are multiple possible explanations for the disappointing outcomes. These include:

  • The hugely difficult socio-economic setting faced by many children who enter the WCED system (broken families; gang-ridden communities; drug addiction and endemic foetal alcohol syndrome; recently established informal settlements, as waves of new migrants come into the Western Cape).

  • Continuing fallout from the disruptive educational policy shocks from the national level during South Africa’s first decade of democracy—the large-scale rationalisation of teachers; the introduction of, and subsequent retreat from, a poorly thought-through ‘outcomes-based education’.

  • Weaknesses in teacher skills, not (yet) offset by sustained and effective efforts to strengthen in-house teacher training.

  • The fact that only since 2009 has ‘management by circular’ been superseded by effective performance management, so gains which might become visible in the future are not yet evident.

Though we certainly do not rule out any of the above, based on our observations of the Western Cape, our comparative experience of public sector governance, and experience of education systems elsewhere (plus the comparative econometric results reported in Chapter 6 below), we believe it is helpful to highlight a further possible explanation—namely, that efforts to improve educational outcomes have been too narrowly pre-occupied with hierarchical (p.112) approaches, and might usefully be complemented (more than has been the case) with additional effort to support more horizontal, peer-to-peer governance.

As this chapter has shown, over the past two decades the WCED has focused largely on improving its hierarchical management systems. In this, it has been successful. Getting textbooks delivered; ensuring that teaching posts are filled with teachers who meet a minimum set of criteria; tracking how schools use resources (including trends in performance); getting funding to the right places at the right times; pro-actively trying to fill leadership positions with the right people for the job—in contrast to many other departments of education in South Africa and elsewhere, the WCED does all of these things relatively well. These are important strengths.

But is hierarchy sufficient? As Chapter 1 detailed, the literature on public management distinguishes between relatively homogenous ‘production’ activities and more heterogeneous ‘craft’ activities. While hierarchy can effectively govern the former, numerous scholars (e.g. Israel, 1987; Wilson, 1989; Lipsky, 2010) have argued that ‘craft’ activities require more flexible and localised governance arrangements. The global literature on education also includes much evidence and advocacy (and some controversy) as to the potential for horizontal governance to add value. And the case studies of Western and Eastern Cape schools in Chapters 8 and 9 of this volume underscore the centrality of school-level governance dynamics in explaining school performance—for ill as well as good.

In concluding, we feel it important to underscore that we are not advocating radical, rapid change in the WCED’s management of education. For all the magnitude of the continuing challenges, having a system in place that can deliver on the ‘basics’ is a valuable asset. Rather, what we propose is complementing the current approaches with greater support for those more horizontal initiatives for which formal institutional arrangements already are in place but which (except in schools in higher-income areas) have so far received at best limited backing to play their formally designated roles effectively—combined with opportunities for learning about which initiatives work (and which do not), and for adapting to the emerging lessons at the level of school-related communities and networks.

In the short run, the gains from these more bottom-up initiatives might seem localised, and thus modest. But, given that the requisite SGB enabling environment is already in place, the risks also are low; the benefits may or may not turn out to be large. (When network effects take hold, their cumulative consequences can be profound.) Based on our broader experience (outside the education sector) of the drivers of success among public organisations, and also the findings reported elsewhere in this volume, we believe that the case is strong for the WCED to deepen its exploration, via learning-by-doing, of what (p.113) might be achieved through finally working to bring to life, across the socio-economic spectrum, the arrangements for more horizontal governance, which, for almost two decades, have been part of the formal landscape of school-level governance.

Patronage in the House of Representatives

In the early 1980s, the government established separate ethnic administrations (known as the tricameral system) within the public service for whites, ‘coloureds’, and ‘Asians’, called the House of Assembly (HoA), House of Representatives (HoR) and House of Delegates (HoD), respectively.

The HoR operated on the margins of a Weberian framework. Formally it operated within a merit-based system, but one which incorporated strong elements of patronage. The Labour Party (LP) had become the dominant party in the HoR elections, which had been characterized by voter boycotts and low polls. This meant the HoR had a crisis of legitimacy from day one, which the LP set out to rectify by increasing its support.

The HoR was accused of using the Department of Education and Culture for the purposes of ‘jobs for pals’, appointing LP supporters to principals’ positions ahead of better-qualified candidates. Franklin Sonn, then president of the Cape Teachers Professional Association (CTPA), accused the LP of interfering in professional matters. He went on to say that since the introduction of the tricameral system, the LP had clearly been seeking patronage by making party-political selections for promotion posts (The South, 3–9/12/1987). There were accusations that the minister in charge of the Department of Education and Culture, Carter Ebrahim, failed to produce professional reasons for turning down suitably qualified candidates, despite recommendations from school and selection committees (The Argus, 18/6/1987; 7/12/1990).

This was corroborated by interviews with researchers and activists of the 1980s, although these interviewees did suggest that the HoR was more focused on controlling the appointments of promotion posts, most notably principals; they had less interest in influencing entry-level appointments. One interviewee argued that: ‘collaborators were appointed to be principals by the government’. In fact, many teachers opposed to the system refused to accept promotion.

In summary, the HoR appears to have adopted a relatively mild form of patronage for the purposes of building a political machine, with little evidence of explicit corruption. Patronage operated at the margins of what can be termed ‘good enough Weberianism’.

Organisation Culture in Coloured Schools

A number of studies of the organisational culture of the white public service have suggested that the South African public service was steeped in traditional public administration, albeit with an apartheid bent. This home-grown version of (p.114) traditional public administration contributed to a bureaucratic, hierarchical, and unresponsive public service, aimed at controlling rather than developing the citizens of the country (Schwella, 2000; McLennan and Fitzgerald, 1992).

The HoR and its predecessor, the Department of Coloured Affairs, adopted this rule-bound compliance culture of the white public service. The HoR was part of the public service; some senior managers in the Department of Education and Culture (of both white and mixed race ethnic background) had transferred from the mainstream public service. There was also a common language (Afrikaans) and culture among white and coloured staff.

Chisholm et al. (1999) argued that control over teachers’ work in black and coloured schools was bureaucratic, hierarchical and authoritarian. The strict control of school boards over teachers’ work created a bureaucratic system which was monitored through the use of school inspectors. This included all aspects of school governance, administration and the curriculum of coloured schools. Crain Soudien (2002: 217) argued that ‘from oral history testimony of educators at the time, it was inspectors who played a central role in subduing teachers and holding them to account’. This was not to suggest that there were watertight mechanisms of surveillance—in fact there is much evidence of ‘alternative education’ being offered within and around the official confines of the curriculum (Wieder, 2001: 48).

There was, however, a more complex relationship than simply state control. Chisholm (1991: 15–25) details the marriage of academic excellence and political awareness in Teachers’ League of South Africa (TLSA) schools through 1976. Schools that were considered TLSA strongholds (e.g. Harold Cressy, Livingstone, Trafalgar, and South Peninsula) were known for high standards and political teachings. An interview with the seasoned educationalist Crain Soudien confirmed that the coloured Department of Education, and subsequently the HoR, were not merely tools of state control; they included committed educational professionals. Further, inspectors were not just there to check on teachers: they were sometimes ambivalent towards activist anti-apartheid teachers. Also, while school committees (consisting of parents) were appointed by the state as instruments of control, they sometimes consisted of articulate activists who were able to push back against state control.

Chisholm et al. (1999: 114) point out that older, more conservative teacher organisations, which had participated in racially divided departments of education, described themselves as ‘professionals’. Soudien argues that in the Western Cape, strong professionalism was a driving force of both the system-orientated CTPA and the more radical TLSA. This professionalism in turn served as a safeguard against the worst excesses of patronage.

The Management Performance Assessment Tool (MPAT) is a national assessment for public servants (although not for teachers). It is conducted by the Department of Performance Monitoring and Evaluation (DPME), which is located in the Presidency. The DPME released a report discussing the combined results of the 103 national and (p.115) provincial departments that submitted self-assessments to the DPME (The Presidency, 2012). Here we focus specifically on the results of the MPAT assessments of education departments.

The 2013 assessment was based on thirty-one management standards in seventeen management areas. The standards were developed collaboratively with the Department of Public Service and Administration (DPSA), National Treasury (NT), the Office of the Public Service Commission (PSC), the Office of the Auditor General, and the Offices of the Premier (The Presidency, 2013).

The assessment process is shown in Figure A4.1.

Figure A4.1. MPAT assessment process

Source: The Presidency, DPME (2013).

As per the text, a four-level scale was used to assess each department across each of four dimensions. The results are reported in Tables A4.1–A4.5.

Table A4.1. MPAT Assessment of Education: Strategic Management

MPAT 2012/13 final scores % of education department scores for strategic management

Level 1

Level 2

Level 3

Level 4

EC EDU

67

33

FS EDU

33

33

33

GP EDU

100

KZN EDU

67

33

LP EDU

67

33

MP EDU

67

33

NC EDU

67

33

ND BE

67

33

NW BE&T

67

33

WC EDU

100

ALL EDU

3

12

45

40

ALL RSA

33

44

13

11

Source: The Presidency, DPME (2013).

Table A4.2. MPAT Assessment of Education: Governance and Accountability

MPAT 2012/13 final scores % of education department scores for governance and accountability

Level 1

Level 2

Level 3

Level 4

EC EDU

67

11

11

11

FS EDU

33

67

GP EDU

11

22

33

33

KZN EDU

22

67

11

LP EDU

11

22

22

44

MP EDU

22

33

44

NC EDU

56

22

33

ND BE

44

22

33

NW BE&T

44

11

22

22

WC EDU

22

22

56

ALL EDU

30

25

25

20

ALL RSA

38

23

24

15

Source: The Presidency, DPME (2013).

Table A4.3. MPAT Assessment of WCED: Human Resources Management

MPAT 2012/13 final scores % of education department scores for human resource management

Level 1

Level 2

Level 3

Level 4

EC EDU

70

10

20

FS EDU

20

30

40

10

GP EDU

10

40

30

20

KZN EDU

30

40

30

LP EDU

10

50

40

MP EDU

20

50

30

NC EDU

30

50

20

ND BE

20

40

30

10

NW BE&T

20

70

10

WC EDU

40

40

20

ALL EDU

25

41

29

5

ALL RSA

32

41

21

6

Source: The Presidency, DPME (2013).

Table A4.4. MPAT Assessment of WCED: Financial Management

MPAT 2012/13 final scores % of education department scores for financial management

Level 1

Level 2

Level 3

Level 4

EC EDU

57

43

FS EDU

14

29

57

GP EDU

29

43

29

KZN EDU

14

57

29

LP EDU

14

57

29

MP EDU

29

43

29

NC EDU

71

29

ND BE

14

14

71

NW BE&T

14

71

14

WC EDU

57

43

ALL EDU

18

38

35

9

ALL RSA

19

27

45

9

Source: The Presidency, DPME (2013).

Table A4.5. MPAT Assessment of WCED: Overall Score of the Education Department

MPAT 2012/13 final scores % of all KPA scores per education department

Level 1

Level 2

Level 3

Level 4

EC EDU

59

17

17

7

FS EDU

10

31

52

7

GP EDU

14

21

31

34

KZN EDU

17

38

34

10

LP EDU

10

45

31

14

MP EDU

14

34

34

17

NC EDU

28

41

28

3

ND BE

24

24

45

7

NW BE&T

24

45

21

10

WC EDU

21

34

45

ALL EDU

24

31

31

15

Source: The Presidency, DPME (2013).

It can be seen that the WCED scored 100 per cent at Level 4 for strategic management; 56 per cent at Level 4, 22 per cent at Level 3, and 22 per cent at Level 2 for governance and accountability; 20 per cent at Level 4, 40 per cent at Level 3, and 40 per cent at Level 2 for human resources management; and 43 per cent at Level 4 and 57 per cent at Level 3 for financial management. If these scores are averaged out, the Western Cape received 45 per cent at Level 4, 34 per cent at Level 3, and 21 per cent at Level 2.
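A back-of-the-envelope reconstruction shows how these four sets of scores could combine into the overall profile reported in Table A4.5. The weights below are our own inference from the granularity of the published percentages (which suggests roughly 3, 9, 10, and 7 standards for strategic management, governance and accountability, human resources, and financial management respectively), not figures reported by the DPME:

\[
\text{Level 4: } \frac{1.00(3) + 0.56(9) + 0.20(10) + 0.43(7)}{29} \approx 45\%, \qquad
\text{Level 3: } \frac{0.22(9) + 0.40(10) + 0.57(7)}{29} \approx 34\%,
\]
\[
\text{Level 2: } \frac{0.22(9) + 0.40(10)}{29} \approx 21\%.
\]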

The DPME concluded that the overall performance of education departments varied greatly, mainly due to varying performance on governance and accountability, as well as financial management. It stated that, within the education sector, the best-performing departments were the Western Cape (whose performance was underpinned by generally good provincial support and co-ordination), along with Gauteng and the Free State (The Presidency, 2013).

(p.116) Concerns have been raised about the MPAT approach. The first critique is that the methodology is subjective, in that it relies on self-assessment. This is acknowledged by the DPME itself (The Presidency, 2012, 2013), which states that the findings were limited by the availability of evidence to substantiate self-assessment scores from all departments.

The WCED countered this by arguing that it did use external moderation in a systematic way. An examination of the Western Cape raw data (WCED, 2014) suggests, at least at face value, that the external moderation criteria are quite thorough and linked to performance in many ways. For example, in strategic management, where the WCED received 100 per cent, there were a number of robust criteria (p.117) to which the department had to conform to achieve this high rating. External moderators had to verify, inter alia: that the annual performance plans (APPs) are logically and explicitly linked to delivery agreements and/or programmes of action, as well as to the departmental strategic objectives contained in the strategic plan; that the relevance, reliability, and verifiability of the information contained in the situational analysis of the APP conform to the framework for managing programme performance information; and that the APP contains evidence of reconsideration of the situational analysis in the strategic plan, irrespective of whether this resulted in confirming the continued validity of the situational analysis or in amending the APP.

(p.118) Furthermore, the external moderators had to check: whether targets in the APP are listed over the budget year and Medium Term Expenditure Framework (MTEF) period for each budget programme identified; whether annual targets are broken down into quarterly targets; whether strategic objectives and annual and quarterly targets in the APP are expressed and quantified in terms of the ‘SMART’ principle; whether there is a logical and explicit link between the strategic objectives and targets in the APP and the departmental strategic objectives, as contained in the strategic plan, delivery agreements and/or programmes of action; and whether there is a logical and explicit link between the strategic objectives and targets and the budget programmes contained in the APP.

Finally, moderators had to check whether the minutes of management meetings reflect the use of quarterly performance assessments to inform improvements, and whether the indicators in the annual report and the APP are the same and reflect actual annual performance.

The second criticism of the MPAT is that it focuses on compliance rather than performance. The DPME (The Presidency, 2013) states in its presentation that the review of compliance does create an awareness of performance. This may be the case, but an awareness of performance does not necessarily translate into performance improvement.

References

Bibliography references:

Bates, R. H., Greif, A., Levi, M., Rosenthal, J.-L., and Weingast, B. R. (1998) Analytic Narratives. Princeton, NJ: Princeton University Press.

Bruns, B., Filmer, D., and Patrinos, H. (2011) Making Schools Work: New Evidence on Accountability Reforms. Washington DC: World Bank.

Bush, T. (2007) ‘Educational leadership and management: Theory, policy and practice’, South African Journal of Education 27(3), 391–406.

(p.119) Bush, T., Kiggundu, E., and Moorosi, P. (2011) ‘Preparing new principals in South Africa: The ACE: School Leadership Programme’, South African Journal of Education 31, 31–43.

Cameron, R. and Naidoo, V. (2016) ‘When a “ruling alliance” and public sector governance meet: Managing for performance in South African basic education’, ESID Working Paper, No. 62. Manchester: The University of Manchester.

Chisholm, L. (1991) ‘Education, politics and organisation: The education traditions and legacies of the Non-European Movement’, Transformation 15, 1–25.

Chisholm, L., Soudien, C., Vally, S., and Gilmour, D. (1999) ‘Teachers and Structural Adjustment in South Africa’, Educational Policy 13, 386–401.

De Jager, N. (2009) ‘No “New” ANC?’, Politikon 36(2): 275–88.

Education Labour Relations (ELRC) (2003) Collective Agreement Number 8 of 2003. Integrated Quality Management System, Pretoria.

ELRC (2005) Collective Agreement No. 1 of 2005. Pretoria.

ELRC (2010) Collective Agreement No. 1 of 2010. Pretoria.

ELRC (2013) Collective Agreement No. 1 of 2013. Pretoria.

Fiske, E. and Ladd, H. F. (2004) Education Reform in Post-Apartheid South Africa. Washington DC: Brookings Institution Press.

Grindle, M. S. (2004) ‘“Good enough” governance: Poverty reduction and reform in developing countries’, Governance 17(4), 525–48.

Hoadley, U., Levy, B., Shumane, U., and Wilburn, S. (2016) ‘The political economy determinants of school performance in four Western Cape Schools’, ESID Working Paper No. 84. Manchester: The University of Manchester.

Israel, A. (1987) Institutional Development: Incentives to Performance. Washington DC: Published for the World Bank by Oxford University Press.

Jansen, J. and Taylor, N. (2003) Educational Change in South Africa 1994–2003: Case Studies in Large-scale Education Reform. World Bank, Education Reform and Management Publication Series, II(I).

Kallaway, P. (ed.) (2002) The History of Education under Apartheid, 1948–1994: The Doors of Learning and Culture Shall be Opened. Cape Town: Pearson South Africa.

Levy, B., Hirsch, A., and Woolard, I. (2015) ‘Governance and inequality: Benchmarking and interpreting South Africa’s evolving political settlement’, ESID Working Paper No. 51. Manchester: The University of Manchester.

Lipsky, M. (2010) Street-level Bureaucracy: Dilemmas of the Individual in Public Service. New York: Russell Sage Foundation.

McLennan, A. and Fitzgerald, P. (eds) (1992) The Mount Grace Papers: The New Public Administration Initiative and the Mount Grace Consultation. Johannesburg: Public and Development Management Programme, University of the Witwatersrand.

Prew, M. (2007) ‘Successful principals: Why some principals succeed and others struggle when faced with innovation and transformation’, South African Journal of Education 27(3), 447–62.

Provincial Government of the Western Cape (2010) Delivering the Open Opportunity for All, the Western Cape’s draft Strategic Plan. Cape Town.

Reddy, V., Prinsloo, C., Arends, F., and Visser, M. (2012) Highlights from TIMSS 2011: The South African Perspective. Pretoria: Human Sciences Research Council, www.hsrc.ac.za.

(p.120) Republic of South Africa (RSA) (1996) Population Census 1996. Pretoria: Department of Statistics.

RSA (2001) Government Gazette 433(22512), Policy on Whole School Evaluation. Pretoria: Department of Education.

RSA (2010a) The SACMEQ III Project in South Africa: A Study of the Conditions of Schooling and the Quality of Education. Pretoria.

RSA (2010b) Report on the NSC Examination, 2010. Pretoria: Department of Basic Education (DBE), Educational Measurement, Assessment and Public Examinations.

RSA (2014) National Senior Certificate Examination: Technical Report. Pretoria: Department of Basic Education.

RSA (2016) Report on Progress in the Schooling Sector against Key Learner Performance and Attainment Indicators. Pretoria: Department of Basic Education.

Schwella, E. (2000) ‘Paradigms: Context and competencies’. In F. Theron and E. Schwella (eds), The State of Public and Development Management in South Africa. The Mount Grace II Papers. Stellenbosch: University of Stellenbosch, 33–43.

Soudien, C. (2002) ‘Teachers’ responses to the introduction of apartheid education’. In P. Kallaway (ed.), The History of Education under Apartheid, 1948–1994: The Doors of Learning and Culture Shall Be Opened. Cape Town: Pearson South Africa.

Southern and East Africa Consortium for Monitoring Educational Quality (SACMEQ) (2007) http://www.sacmeq.org/.

Spaull, N. (2013) ‘South Africa’s education crisis: The quality of education in South Africa, 1994–2011’. Johannesburg: Centre for Development and Enterprise (CDE).

The Argus, 18/6/1987, ‘Teachers claim Political bias’.

The Argus, 7/12/1990, ‘Abuse of Power, Sharp Attack on Hendrickse over teachers’.

The Presidency, Department of Performance Monitoring and Evaluation (2012) Management Performance Assessment Tool (MPAT) Report on results of assessment process for 2011/2012. Pretoria.

The Presidency, Department of Performance Monitoring and Evaluation (2013) MPAT. Management Performance Assessment 2012/13 Final scores Education Sector Feedback. Pretoria.

The South, 3–9/12/1987 ‘Jobs for Pals’.

WCED (Western Cape Education Department) (2013) Annual Report, 2012–13. Cape Town.

WCED (2014a) Education Data Base. Cape Town.

WCED (2014b) Annual Report, 2013–14. Cape Town.

WCED (2015) Annual Report, 2014–15. Cape Town.

Wieder, A. (2001) ‘They can’t take our souls: Teachers’ League of South Africa reflections of apartheid’, Race, Ethnicity and Education 4(2), 145–66.

Wills, G. (2016) ‘Principal leadership changes in South Africa: Investigating their consequences for school performance’, Stellenbosch Economic Working Papers: 01/16.

Wilson, J. Q. (1989) Bureaucracy: What Government Agencies Do and Why They Do It. New York: Basic Books.

World Bank (2004) World Development Report 2004: Making Services Work for Poor People. Washington DC: World Bank, https://openknowledge.worldbank.org/handle/10986/5986 License: CC BY 3.0 IGO.

Notes:

(1) For a comparative assessment of governance and inequality in South Africa and four other middle-income countries (Brazil, Mexico, Turkey, and Thailand), which draws on governance indicators, see Levy, Hirsch, and Woolard (2015).

(2) ATU consists of a number of independent unions, who combine for the purposes of collective bargaining only.

(3) The data are only for teachers employed directly by the WCED. School governing bodies (SGBs) also have the right to employ teachers directly, but the WCED does not keep data on teachers employed by SGBs, given that the employing authority is individual schools.

(4) The share of public education expenditure for primary and secondary schools going to schools serving the poorest 20 per cent rose from 19 per cent in 1993, to 22 per cent in 2000 to 26 per cent in 2005; between 1993 and 2005, the corresponding share going to the richest 20 per cent fell from 28 per cent to 13 per cent. Chapter 2 provides additional details.

(5) For example, via the provision of supplementary resources by affluent parents (including for the recruitment of additional teachers); the more favourable inherited physical infrastructure; and the persistence of better-trained and more experienced teachers in elite schools.

(6) Interview with Crain Soudien, former Professor of Education, University of Cape Town, 27 March 2014.

(7) ‘Jobs for Pals’, The South, 3–9/12/1987; ‘Teachers claim Political bias’, The Argus, 18/6/1987; ‘Abuse of Power, Sharp Attack on Hendrickse over teachers’, The Argus, 7/12/1990.

(8) Territories set aside for black inhabitants of South Africa as part of the policy of apartheid, and governed by so-called independent authorities.

(9) KwaZulu-Natal was then controlled by the Inkatha Freedom Party (IFP), the other opposition-controlled province.

(10) In terms of the Western Cape Constitution, members of the provincial executive are called ministers, rather than Members of the Executive Council (MECs) as in the other provinces. The Western Cape is the only province which uses this nomenclature.

(11) Interview with senior WCED official, 12 June 2014.

(12) By ‘good enough Weberianism’, we mean public administration structures that are organized along classically bureaucratic lines, have some significant shortfalls, but are sufficiently strong to support largely programmatic policies (as opposed to patronage). The term builds on Grindle’s concept of ‘good enough governance’.

(13) In 1997, the ANC introduced its Cadre Policy and Deployment Strategy, which advocated political appointments to senior positions in the public service. It emphasised recruitment from within the party, and potential deployees were made to understand and accept the basic policies and programmes of the ANC. The strategy made no reference to the need for administrative competence (de Jager, 2009).

(14) The IQMS is made up of four lesson-observation performance standards and eight outside-the-classroom performance standards (see Cameron and Naidoo, 2016). Teachers are evaluated over all twelve performance standards. An example of gaming reported by interviewees is a pattern whereby school staff whose functions are primarily administrative (e.g. the principal) give three to four lessons just before the IQMS evaluation process, which enables their scores to be averaged over twelve rather than eight performance standards.

(15) The principal control variables are: the home background of test-takers; their socio-economic status; the scores of teachers on the SACMEQ tests; teacher age, qualification and experience; and classroom factors (teaching time, homework, assessment, and textbook availability).

(16) RSA (2016: Table 22). Western Cape dropout rates are especially high after Grade 9. Why is the Western Cape’s matric pass rate relatively poor when measured against the full age cohort? One possibility, noted in the text, is that provinces ‘game’ results by adopting a policy of holding back candidates. Another possible explanation is that the Western Cape (especially metropolitan Cape Town) is notorious for its gang culture and other, related social dysfunctions. One of the consequences of these dysfunctions could be to fuel a counter-norm at adolescence, one which legitimizes leaving school prematurely.

(17) A further relevant set of evidence comprises the results for 2002 and 2011 of the Trends in International Mathematics and Science Study (TIMSS) tests, analysed for South Africa by Reddy, Prinsloo, Arends, and Visser (2012). Among South Africa’s provinces, the Western Cape scored highest in both years. However, between 2002 and 2011 its scores declined modestly (from 414 to 404). All other provinces saw an increase, with the overall South African score rising from 285 to 348.

(18) At the national level, a country-wide Annual National Assessment (ANA) was introduced in 2011 to assess literacy and numeracy for grades 3, 6, and 9. However, as of the time of writing this chapter, the ANAs did not yet appear adequate to serve as a basis for comparison across provinces (Spaull, 2013). Published ANA results have raised eyebrows in the academic community, due to the differences between self-reported school performance and independently moderated school performance. For example, for the Eastern Cape, in 2013 the percentage of grade 3 students with a score of 50 per cent or more was self-reported for numeracy as 54.9 per cent, but adjusted downwards after external verification to 42.2 per cent; for literacy, the self-reported score was 50.2 per cent, and the adjusted score 27.0 per cent.

(19) In 2004, only 0.1 per cent (to underscore: one-tenth of 1 per cent!!) of grade 6 learners in schools that were formerly under control of DET met the passing level (50 per cent) proficiency standard for numeracy.

(20) The directorate uses data derived from the central education management information system (CEMIS), which is managed by the directorate of knowledge management.

(21) This is also run by the WCED directorate for knowledge management/centre for e-innovation (Department of the Premier) on behalf of the WCED directorate: resources.

(22) Vinjevold had a long career in education prior to becoming the WCED’s superintendent general in 2009. She worked as an educator for many years; returned to complete an MA in Education at the University of the Witwatersrand in 1994; worked as an education researcher from 1994 to 2001; was appointed as a chief director in the WCED by the then ANC government in 2001; and from 2005 to 2009 served as deputy director general in the national department of education.

(23) SGB elections are held on a three-year cycle. In 2014, funding was provided to two non-governmental organisations to help train new SGBs on their roles, immediately following the 2015 round of SGB elections.