<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1d1 20130915//EN" "JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">PYTHAGORAS</journal-id>
<journal-title-group>
<journal-title>Pythagoras Journal of the Association for Mathematics Education of South Africa</journal-title>
</journal-title-group>
<issn pub-type="ppub">1012-2346</issn>
<issn pub-type="epub">2223-7895</issn>
<publisher>
<publisher-name>AOSIS OpenJournals</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">PYTH-35-240</article-id>
<article-id pub-id-type="doi">10.4102/pythagoras.v35i2.240</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Original Research</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Mathematics, curriculum and assessment: The role of taxonomies in the quest for coherence</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Long</surname>
<given-names>Caroline</given-names>
</name>
<xref ref-type="aff" rid="AF0001">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Dunne</surname>
<given-names>Tim</given-names>
</name>
<xref ref-type="aff" rid="AF0002">2</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>de Kock</surname>
<given-names>Hendrik</given-names>
</name>
<xref ref-type="aff" rid="AF0003">3</xref>
</contrib>
</contrib-group>
<aff id="AF0001">
<label>1</label>Centre for Evaluation and Assessment, Faculty of Education, University of Pretoria, South Africa</aff>
<aff id="AF0002">
<label>2</label>Department of Statistical Sciences, University of Cape Town, South Africa</aff>
<aff id="AF0003">
<label>3</label>Independent consultant, South Africa</aff>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>
<bold>Correspondence to:</bold> Caroline Long, <bold>Email</bold>: <email xlink:href="caroline.long@up.ac.za">caroline.long@up.ac.za</email>, <bold>Postal address</bold>: PO Box 2368, Houghton 2041, South Africa</corresp>
<fn>
<p>
<bold>How to cite this article</bold>: Long, C., Dunne, T., &#x0026; De Kock, H. (2014). Mathematics, curriculum and assessment: The role of taxonomies in the quest for coherence. <italic>Pythagoras, 35</italic>(2), Art. #240, 14 pages. <ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.4102/pythagoras.v35i2.240">http://dx.doi.org/10.4102/pythagoras.v35i2.240</ext-link>
</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>12</day>
<month>12</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<volume>35</volume>
<issue>2</issue>
<fpage>1</fpage>
<lpage>9</lpage>
<history>
<date date-type="received">
<day>06</day>
<month>08</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>10</day>
<month>11</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2014. The Authors</copyright-statement>
<copyright-year>2014</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/2.0/">
<license-p>AOSIS OpenJournals. This work is licensed under the Creative Commons Attribution License.</license-p>
</license>
</permissions>
<abstract>
<p>A challenge encountered when monitoring mathematics teaching and learning at high school is that taxonomies such as Bloom&#x0027;s, and variations of this work, are not entirely adequate for providing meaningful feedback to teachers beyond very general cognitive categories that are difficult to interpret. Challenges of this nature are also encountered in the setting of examinations, where the requirement is to cover a range of skills and cognitive domains. Contestation as to the cognitive level of an item is inevitable, as classification requires analysing the relationship between the problem and the learners&#x2019; background experience. The challenge in the project described in this article was to find descriptive terms that would be meaningful to teachers. The first attempt at providing explicit feedback was to apply the assessment frameworks currently used in the South African curriculum documents, which include a content component and a cognitive component, namely knowledge, routine procedures, complex procedures and problem solving. The second attempt investigated various taxonomies, including those used in international assessments and in mathematics education research, for constructs that teachers of mathematics might find meaningful. The final outcome of this investigation was to apply the dimensions required to understand a mathematical concept proposed by Usiskin (2012): the <italic>skills-algorithm, property-proof, use-application</italic> and <italic>representation-metaphor</italic> dimensions. A feature of these dimensions is that they are not hierarchical; rather, within each dimension a mathematical task may demand mere recall but may also demand the highest level of creativity. For our purpose, we developed a two-way matrix using Usiskin&#x0027;s dimensions on one axis and a variation of Bloom&#x0027;s revised taxonomy on the other. Our findings are that this two-way matrix provides an alternative to current taxonomies, is more directly applicable to mathematics and provides the necessary coherence required when reporting test results to classroom teachers. In conclusion we discuss the limitations associated with taxonomies for mathematics.</p>
</abstract>
</article-meta>
</front>
<body>
<sec id="S0001" sec-type="intro">
<title>Introduction</title>
<p>In the current global educational climate, some degree of regulation is deemed necessary both in curriculum prescription and in systemic assessment (Kuiper, Nieveen &#x0026; Berkvens, <xref ref-type="bibr" rid="CIT0016">2013</xref>). If teachers are to be judged by the outcomes of systemic assessments, then at least the components making up the curriculum and the assessment tasks should be made explicit, so that classroom activities may be aligned and teachers may make reasoned judgments concerning their classroom focus.</p>
<p>In the first part of this article, we propose a model for assessment that integrates both external and classroom-based educational functions. In order for this model to function optimally, there is a need for coherence in the description of educational objectives, classroom activities and assessment; we therefore need a common language across all three educational processes.</p>
<p>In the second part of the article, we provide an overview of the main cognitive categories in Bloom&#x0027;s taxonomies, both the original and revised versions, the various frameworks from the Trends in International Mathematics and Science Study (TIMSS), as well as the recent South African curricula.</p>
<p>Bloom&#x0027;s Taxonomy of Educational Objectives was initially conceptualised to assist curriculum planners to specify objectives, to enable the planning of educational experiences and to prepare evaluative devices (Bloom, Engelhart, Furst, Hill &#x0026; Krathwohl, <xref ref-type="bibr" rid="CIT0008">1956</xref>, p. 2). Because the educational objectives are phrased as general <italic>cognitive processes</italic>, including activities such as remembering and recalling knowledge, thinking and problem solving, it is necessary to rephrase the particular statements in terms of the subject under consideration (Andrich, <xref ref-type="bibr" rid="CIT0003">2002</xref>). In fact, the taxonomy may be &#x2018;validated by demonstrating its consistency with the theoretical views&#x2019; that emerge in &#x2018;the field it attempts to order&#x2019; (Bloom et al., <xref ref-type="bibr" rid="CIT0008">1956</xref>, p. 17). The process of thinking about educational objectives, defining the objectives in terms of the mathematical tasks and relating these tasks to the teaching activities and assessment tasks is an important exercise for the policymakers, curriculum designers, test designers and teachers.</p>
<p>An overview of the other taxonomies in use in TIMSS and in the various South African curriculum documents provides the background to the planning, communication and feedback processes for the Grade 9&#x2013;11 monitoring and evaluation project with which the authors are currently engaged. The broad question arising from the project&#x0027;s needs is: Can the three essential elements, namely an externally designed monitoring component, a classroom-based formative assessment component and a professional development component, be logically and coherently aligned for the purpose of informing teaching and learning?</p>
<p>The subquestion is: How may we best design assessment frameworks (the design tool specifying the purposes, structure and content of an assessment instrument) in such a way that there is coherence from the mathematical knowledge to be taught and learned, through the design of a set of assessment instruments, to providing diagnostic and practical feedback to teachers about learner performance and needs?</p>
<p>The central concern of this article is the congruence of curriculum (what subject knowledge is to be learned), pedagogy (how particular concepts and skills are to be learned) and assessment (how these two elements of the educational experience are to be assessed).</p>
<sec id="S20002">
<title>Teaching and learning</title>
<p>Good, and especially excellent, teachers cannot all teach to the same recipe. Of course, the same mathematical canon underpins their teaching and their students&#x2019; learning, and the ultimate goal is attainment of the abstract and powerful predicative knowledge of mathematics. But the route to this end goal along a developmental path is through the operationalisation of mathematics in terms that can be grasped by young and aspirant mathematicians (see also Vergnaud, <xref ref-type="bibr" rid="CIT0036">1994</xref>). Because learners are able to draw from appropriate contexts the mathematical understanding that underpins formal mathematics, the creative teacher draws on contexts pertinent to the learners and appropriate for generating mathematical understanding.</p>
<p>When teachers plan assessments for their classes, and even for the clusters of classes in a school, the assessment is generally geared to what the learners have been taught. The contents of the test will not be unexpected. The language will be familiar. But in the case of external assessment for qualification purposes, or national systemic studies in which school performance or teacher performance is monitored, or large-scale assessments in which many different countries are involved, the attainment of coherence of language across countries, schools and individual teachers is more difficult to achieve. Countervailing these limitations of an external systemic type of testing is the view that the outcomes of systemic-type assessment should not be the only, nor the primary, source of information for a school evaluation (Andrich, <xref ref-type="bibr" rid="CIT0004">2009</xref>).</p>
<p>The lynchpin of coherence here is attention to the validity of the test components and consistency across the collection sites, thereby generating reliable test data, together with attention to the overall validity of the assessment programme, including the purpose for which the assessment outcome is to be used (Messick, <xref ref-type="bibr" rid="CIT0021">1989</xref>). Adherence to these requirements is not easy to attain: it demands clear communication about the curriculum contents and about the expected responses from learners. For example, the teacher may expect that individual concepts, with associated procedures, are acquired; in contrast, the examiner may require the learner to apply problem-solving skills to a mathematical task requiring multiple concepts. Bloom et al. (<xref ref-type="bibr" rid="CIT0008">1956</xref>) refer to the need to understand the educational context of the learner in order to align educational assessment correctly and to categorise the cognitive levels correctly.</p>
<p>Some criticism of external systemic-type testing is noted here. Schoenfeld (<xref ref-type="bibr" rid="CIT0030">2007</xref>) warns that the type of assessment items generally given in tests of larger scale may often work against the type of problem solving process, extended and thorough in nature, advocated by Polya (<xref ref-type="bibr" rid="CIT0027">1957</xref>). Others point to socio-economic factors that impact on the school culture, and therefore on learning and teaching, that warrant deeper consideration (Nichols &#x0026; Berliner, <xref ref-type="bibr" rid="CIT0025">2005</xref>, <xref ref-type="bibr" rid="CIT0026">2008</xref>; Usiskin, <xref ref-type="bibr" rid="CIT0032">2012</xref>; Wolk, <xref ref-type="bibr" rid="CIT0038">2012</xref>). Questions about teacher autonomy and professionalism, and about who has the professional authority to monitor professional teachers, are of paramount importance. These critiques are noted here but are not the concern of this article.</p>
<p>Webb (<xref ref-type="bibr" rid="CIT0037">1992</xref>), in response to dissatisfaction with what he perceives as inadequate testing processes, proposes that mathematics education requires a specific assessment programme. He argues that the then-current assessment models had been based on outdated psychological models designed for purposes no longer relevant. Aligned with this view, we explore later in the article a taxonomy proposed by Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>) that has been operationalised in the University of Chicago School Mathematics Project textbooks (UCSMP; see <ext-link ext-link-type="uri" xlink:href="http://ucsmp.uchicago.edu/">http://ucsmp.uchicago.edu/</ext-link>). See, for example, the Algebra textbook, Teacher&#x0027;s Edition (McConnell et al., <xref ref-type="bibr" rid="CIT0020">2002</xref>).</p>
</sec>
</sec>
<sec id="S0003">
<title>Proposed model</title>
<p>In answer to the critique of current assessment practices and problems experienced in practice, Bennett and Gitomer (<xref ref-type="bibr" rid="CIT0007">2009</xref>) propose a model that articulates three components: systemic assessment (monitoring), formative assessment (classroom-based diagnostics and classroom teaching) and professional development (see <xref ref-type="fig" rid="F0001">Figure 1</xref>; see also Bennett, <xref ref-type="bibr" rid="CIT0005">2010</xref>, <xref ref-type="bibr" rid="CIT0006">2011</xref>).</p>
<fig id="F0001">
<label>FIGURE 1</label>
<caption>
<p>Model for systemic assessment.</p>
<p><italic>Source</italic>: Adapted from Bennett, R.E., &#x0026; Gitomer, D.H. (2009). Transforming K-12 assessment: Integrating accountability testing, formative assessment and professional development. In C. Wyatt-Smith, &#x0026; J.J. Cumming (Eds.), <italic>Educational assessment in the 21st century</italic> (pp. 43&#x2013;62). Dordrecht: Springer. <ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1007/978-1-4020-9964-9_3">http://dx.doi.org/10.1007/978-1-4020-9964-9_3</ext-link>
</p>
</caption>
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="PYTH-35-240-g001.tif"/>
</fig>
<p>For the external <italic>monitoring component</italic> we propose, along with Bennett and Gitomer, that any mode of assessment should be aligned with cognitive models currently acknowledged as supporting learning. The implication is that when a test is designed for monitoring purposes, both the critical subject knowledge and the associated requirements from a cognitive development perspective are to be considered. Here we note that modern scientific techniques for the generation and analysis of test data may be used to provide information about the individual student and to ensure reflection on the test instruments themselves and their constituent items. Suitably supported, these methods also permit the tracking of individual needs and performance in the classroom and provide evidence of the extent of change, progress and redress of performance for the specific child. These techniques, critical to the model, are explored elsewhere: see Dunne, Long, Craig and Venter (<xref ref-type="bibr" rid="CIT0011">2012</xref>) and Long, Dunne and Mokoena (<xref ref-type="bibr" rid="CIT0019">2014</xref>) for discussion of techniques for the analysis and reporting of assessment results.</p>
<p>The classroom-based <italic>formative assessment component</italic> of the model requires that teachers be provided with information obtained through the <italic>monitoring component</italic>. This information should reflect both apparent learner proficiency and item performance characteristics. The feedback needs to be sufficiently specific to enable the teachers to reflect on how best to meet the emerging needs of the learners as detected within the assessment. We acknowledge that there will be circumstances in which this reflection will have to be accompanied by improvement of teacher mathematical skills, which is the intent of the <italic>professional development component</italic>.</p>
<p>The professional development component of this system of interventions should be informed by a deeper insight into the nature of the knowledge domains. In essence, the professional development component is required to build with teachers a model of mathematical development against which they may gauge the progress of their learners. The intended curriculum constitutes an essential but incomplete part of this professional development function. The component also involves identifying, with the education role players, and re-examining the various factors necessary for acquiring mathematical proficiency. Teachers and decision-makers together explore the reasons for these critical factors being absent from the school classroom and devise strategies to address that absence.</p>
<p>In order to promote congruence at the three sites, an explicit model of conceptual development from the perspective of mathematics and of cognitive development from the perspective of learning is required (see Vergnaud, <xref ref-type="bibr" rid="CIT0035">1988</xref>). These two components have both a hierarchical trajectory and horizontal breadth encompassing both related mathematics concepts and the required cognitive engagement. In order to make explicit at any one time the breadth and depth of knowledge and the responses required of individuals, an explicit description of the particular knowledge field is required.</p>
</sec>
<sec id="S0004">
<title>The purpose of Bloom&#x0027;s taxonomy and associated challenges</title>
<p>When Bloom gathered a group of assessment specialists together in the mid-20th century, his purpose was to provide the assessment community with a common language about learning goals that would facilitate communication across subject matter, persons and grade levels. In an attempt to ensure development of &#x2018;higher mental processes&#x2019;, Bloom (1994, p. 2) proposed a common framework for the setting of examinations and for the assessment of these examinations (cited in Andrich, <xref ref-type="bibr" rid="CIT0003">2002</xref>, p. 40). As noted previously, this framework was initially conceptualised as an assessment tool that could aid in the classification of items for item banking purposes.</p>
<p>The educational objectives explicated in the taxonomy could then be translated into behaviours that would provide evidence that the objective had been achieved (Andrich, <xref ref-type="bibr" rid="CIT0003">2002</xref>, p. 41). The aim of the common framework was to help curriculum designers &#x2018;specify objectives so that it becomes easier to plan learning experiences and prepare evaluation devices&#x2019; (Bloom et al., <xref ref-type="bibr" rid="CIT0008">1956</xref>, p. 2).</p>
<p>This common language and vocabulary was to serve as a basis for determining the specific meaning of broad educational goals that informed both the local and the international community. It was also a means for &#x2018;determining the congruence of educational objectives, activities and assessments&#x2019; (Krathwohl, <xref ref-type="bibr" rid="CIT0015">2002</xref>). The establishment of a broad base of descriptions that could describe a range of educational experience was to guard against the limitations of any curricula that had been narrowly conceptualised. For the assessment community a bank of items covering a range of question types was to provide a solution to the increasing demand for the construction of assessment items.</p>
<p>Bloom&#x0027;s original taxonomy embraced <italic>cognitive</italic>, <italic>affective</italic> and <italic>psychomotor</italic> skills. The cognitive processes included six major components: <italic>knowledge, comprehension, application, analysis, synthesis</italic> and <italic>evaluation</italic>. The affective aspect included five major components: <italic>receiving, responding, valuing, organising</italic> and <italic>characterising</italic>. The third domain comprised <italic>psychomotor skills</italic> (Bloom et al., <xref ref-type="bibr" rid="CIT0008">1956</xref>). This conceptualisation of educational objectives, embracing a broader view of knowledge and the inferred cognitive responses, was groundbreaking at the time, and its effect on education has been an exponential growth in taxonomy use.</p>
<p>It is of interest here that though knowledge is specified as a component, defining this component proves not altogether straightforward. Whilst there is an element of memory involved, in that recalling facts, terms, basic concepts and answers forms part of this component, it also embraces <italic>knowledge of specifics</italic> (terminology and specific facts), <italic>knowledge of ways and means of dealing with specifics</italic> and <italic>knowledge of the universals and abstractions in a field</italic> (principles and generalisations, theories and structures). The idea behind the taxonomy is that it not only specifies breadth but also unfolds a depth of engagement within a particular topic. In this respect the elements of the taxonomy have been regarded as hierarchical, moving from simple to complex and from concrete to abstract, so creating a cumulative hierarchy of knowledge and skills (Bloom et al., <xref ref-type="bibr" rid="CIT0008">1956</xref>; Krathwohl, <xref ref-type="bibr" rid="CIT0015">2002</xref>).</p>
<p>Although this idea of hierarchy has been acknowledged as groundbreaking, there has been critique from a number of sources. One of these critiques is that the elements do not necessarily form a hierarchy (Usiskin, <xref ref-type="bibr" rid="CIT0032">2012</xref>, and others). Another view is that whilst the first three elements of the taxonomy are somewhat hierarchical, the last three, in contrast, can be conceptualised as distinct but parallel (Anderson &#x0026; Krathwohl, <xref ref-type="bibr" rid="CIT0001">2001</xref>).<xref ref-type="fn" rid="FN0001">1</xref>
</p>
<p>Another critique is that the act of cognition is so highly interrelated and connected across its features that any attempt to classify and confine the thinking process is bound to fail. Here we note that Bloom et al. (<xref ref-type="bibr" rid="CIT0008">1956</xref>) were acutely conscious of the danger of fragmentation arising from the use of particular focuses and advocated a degree of classification that did least violence to the construct under investigation. This critique is partly addressed by the revised Bloom&#x0027;s taxonomy, which arranges the existing elements into two dimensions, placing <italic>types of knowledge</italic> on the vertical dimension and the <italic>cognitive process dimensions</italic> on the horizontal dimension (see <xref ref-type="table" rid="T0001">Table 1</xref>) (Krathwohl, <xref ref-type="bibr" rid="CIT0015">2002</xref>). Four knowledge types and six cognitive processes permit a two-way 4 &#x00D7; 6 array of classifications.</p>
<p>A further observation, arising from use of the taxonomy rather than from its original conceptualisation, is that the observable behaviours and the different levels of thinking and performance manifest differently in each subject area. The abstract nature of the taxonomy requires that, for each subject area, the six levels be recontextualised by curriculum developers, examiners and classroom teachers who know the subject discipline (Andrich, <xref ref-type="bibr" rid="CIT0003">2002</xref>).</p>
<p>The reconceptualisation of the taxonomy into two dimensions makes the adaptation for the different subject knowledge domains somewhat easier. The inclusion of metacognitive strategies as a separate category in the revised Bloom&#x0027;s taxonomy is regarded by some as a major advance in that without metacognition, it is argued, learning cannot be claimed (Anderson et al., <xref ref-type="bibr" rid="CIT0002">2001</xref>).</p>
<sec id="S20005">
<title>The difficulty of classifying items in terms of Bloom&#x0027;s taxonomies</title>
<p>It is at this point that we reflect on two sets of three items, designed as resources for the formative assessment component of the project. The first set focuses on Algebra (see Worksheet 2 in <xref ref-type="app" rid="APP0001">Appendix 1</xref>).</p>
<p>Our attempts to classify the items, which were originally created to cover a range of cognitive processes, proved difficult. The individual items have been given a temporary home populating the cells. But how does one distinguish &#x2018;remember conceptual knowledge&#x2019;, &#x2018;understand procedural knowledge&#x2019; and &#x2018;apply factual knowledge&#x2019; (represented in <xref ref-type="table" rid="T0001">Table 1</xref>)?
</p>
<table-wrap id="T0001">
<label>TABLE 1</label>
<caption>
<p>Revised Bloom&#x0027;s taxonomy manifesting two dimensions used to classify assessment items in the mathematics monitoring and evaluation project.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Knowledge dimension</th>
<th align="center" colspan="6">Cognitive process dimension</th>
</tr>
<tr>
<th align="left"/>
<th align="left">Remember</th>
<th align="left">Understand</th>
<th align="left">Apply</th>
<th align="left">Analyse</th>
<th align="left">Evaluate</th>
<th align="left">Create</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Factual knowledge</td>
<td align="left">4.1 Area of a square</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
</tr>
<tr>
<td align="left">Conceptual knowledge</td>
<td align="left">2.1 Geometric sequence</td>
<td align="left">-</td>
<td align="left">2.2 Factorise an algebraic expression</td>
<td align="left">4.3 Distance direction (Pythagoras)</td>
<td align="left">-</td>
<td align="left">-</td>
</tr>
<tr>
<td align="left">Procedural knowledge</td>
<td align="left">-</td>
<td align="left">4.2 Volume of a cylinder</td>
<td align="left">-</td>
<td align="left">2.3 Analyse representations of a linear function</td>
<td align="left">-</td>
<td align="left">-</td>
</tr>
<tr>
<td align="left">Metacognitive knowledge</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>We note here that the selection of a cell or cells within which to locate an item depends not only on an understanding of the mathematics involved but also on the approach that the learner may take to solving the problem. An issue arises for a class within which a particular problem has been discussed: re-use of the problem will be classified as &#x2018;remember and apply&#x2019;, whereas if the particular topic has not been dealt with in class the learner may be required to analyse the problem and apply conceptual understanding. This difficulty confirms the statement by Bloom et al. (<xref ref-type="bibr" rid="CIT0008">1956</xref>) that &#x2018;it is necessary to know or assume the nature of the examinees&#x2019; prior educational experience&#x2019; (p. 20) in order to classify test questions.</p>
</sec>
</sec>
<sec id="S0006">
<title>Similar conceptual and taxonomic efforts in mathematics</title>
<p>Similar work in mathematics education, in parallel or in conjunction with the work of Bloom, Krathwohl and colleagues, has been conducted in an attempt to achieve congruence from the curriculum, through the pedagogical domain, and into assessment.</p>
<p>Skemp (<xref ref-type="bibr" rid="CIT0031">1976</xref>) distinguishes two types of mathematical knowledge, <italic>relational understanding</italic> and <italic>instrumental understanding</italic>, constructs that are theoretically distinct though practically linked. He describes relational understanding as the ability to deduce specific rules and procedures from more general mathematical relations. Instrumental understanding describes the ability to apply a rule to the solution of a problem without understanding how it works. This contrast, however, refers to the learner&#x0027;s understanding: it may be an objective for teaching, but it cannot easily be distinguished in an assessment item.</p>
<p>The somewhat different terms <italic>conceptual knowledge</italic> and <italic>procedural knowledge</italic> are identified by Hiebert and Lefevre (<xref ref-type="bibr" rid="CIT0013">1986</xref>), following Scheffler (<xref ref-type="bibr" rid="CIT0028">1965</xref>). The distinction is made between conceptual knowledge, in which relations are established between concepts, and procedural knowledge elements, which are sequential in character. Conceptual knowledge is attained by &#x2018;the construction of relationships between pieces of information&#x2019; or by the &#x2018;creation of relationships between existing knowledge and new information that is just entering the system&#x2019; (Hiebert &#x0026; Lefevre, <xref ref-type="bibr" rid="CIT0013">1986</xref>, p. 4). Hiebert and Lefevre make a secondary distinction between <italic>primary level</italic> relationships and <italic>reflective level</italic> constructs. The primary level refers to elements of knowledge that are at the same level of abstraction, whilst the reflective level refers to a higher level of abstraction that occurs when two pieces of knowledge, initially conceived as separate, are abstracted to become a principle or concept that is generalisable to other situations. These levels of abstraction align with the purpose of mathematics education expressed by Vergnaud (<xref ref-type="bibr" rid="CIT0035">1988</xref>), which is to transform current operational thinking into more advanced concepts that are generalisable across varied situations.</p>
<p>Procedural knowledge, in Hiebert and Lefevre&#x0027;s (<xref ref-type="bibr" rid="CIT0013">1986</xref>) definition, is described as knowing the formal language, or the &#x2018;symbol representation system&#x2019;, knowing algorithms and rules for completing tasks and procedures and knowing strategies for solving problems. In practice, the two perhaps conceptually distinct knowledge types are intricately linked and cannot be distinguished (Long, <xref ref-type="bibr" rid="CIT0017">2005</xref>; Usiskin, <xref ref-type="bibr" rid="CIT0032">2012</xref>; Vergnaud, <xref ref-type="bibr" rid="CIT0035">1988</xref>).</p>
<p>Subsequently, Kilpatrick, Swafford, and Findell (<xref ref-type="bibr" rid="CIT0014">2001</xref>) included conceptual understanding and procedural fluency, similar in essence to the terms used by Hiebert and Lefevre (<xref ref-type="bibr" rid="CIT0013">1986</xref>), as two of five strands necessary for mathematical proficiency. The other three strands are adaptive reasoning, strategic competence and a productive disposition (Kilpatrick et al., <xref ref-type="bibr" rid="CIT0014">2001</xref>, p. 141).</p>
<p>In essence, the Kilpatrick strands focus on <italic>features of learner activity</italic> in the mathematics classroom to which a teacher may properly attend. Whilst these strands are useful for the purpose of planning learner activity, they do not function as a taxonomy or typology for purposes of categorising curriculum knowledge or for guiding the design of a test instrument, nor as an instrument to judge teacher competence.</p>
<p>In an attempt to make the design of curriculum, the stating of objectives, the educational activities and the assessment thereof coherent and iteratively cyclical, Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>) and colleagues at the University of Chicago School Mathematics Project (UCSMP) have conceptualised an elaborated view of what it means to understand mathematics, comprising five dimensions: skills-algorithm understanding, property-proof understanding, use-application understanding, representation-metaphor understanding and history-culture understanding. This elaborated view of understanding mathematics is conceived from the learner&#x0027;s perspective and as such should be useful in terms of teaching and learning. This taxonomy of understanding is discussed in a later section in connection with the project to which it was applied.</p>
<p>Given these alternative distinctions, we reflect on the process of categorising and describing items for the purpose of communication and for providing feedback to the teachers in our project. We turn to the items on Worksheet 2 and Worksheet 4 (in <xref ref-type="app" rid="APP0001">Appendix 1</xref>). Are these items easily classifiable as conceptual knowledge or understanding, or as procedural knowledge or procedural fluency? A partial answer from Bloom et al. (<xref ref-type="bibr" rid="CIT0008">1956</xref>) is that the classification depends on knowledge of, or an assumption about, the learners&#x2019; prior knowledge. Nevertheless, large-scale studies and national systemic programmes require guiding assessment frameworks.</p>
</sec>
<sec id="S0007"
sec-type="Trends in International Mathematics and Science Study frameworks: A taxonomy">
<title>Trends in International Mathematics and Science Study frameworks: A taxonomy</title>
<p>The TIMSS frameworks have been used to inform curricula and to provide a structure against which tests may be constructed and results reported. Though we have no direct evidence, it appears that the Third International Mathematics and Science Study (as it was known in 1995), TIMSS-Repeat (1999) and the Trends in International Mathematics and Science Study 2003, 2007 and 2011 have all engaged with Bloom&#x0027;s taxonomy, both the original and revised versions, and with the various categorisations made in the mathematics education literature. In order for international large-scale studies to make a claim for both reliability and validity, it is essential that they make explicit the frameworks informing the design of the assessment instrument, including both the content domains and the cognitive domains. The early TIMSS studies, in 1995 and 1999, used the term <italic>performance expectations</italic> to provide the second dimension. These expectations were as follows: <italic>Knowing, Using routine procedures</italic> and <italic>Problem solving</italic> at Grade 4; <italic>Representing situations mathematically, Using more complex procedures, Generalising</italic> and <italic>Justifying</italic> at Grade 8 (Schmidt, McKnight, Valverde, Houang &#x0026; Wiley, <xref ref-type="bibr" rid="CIT0029">1996</xref>; see also <xref ref-type="table" rid="T0002">Table 2</xref>).
</p>
<table-wrap id="T0002">
<label>TABLE 2</label>
<caption>
<p>Conceptual and cognitive domains: Bloom&#x0027;s (original and revised), TIMSS 1995/1999, 2003, 2007/2011, RNCS and CAPS</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Type</th>
<th align="left">Bloom&#x0027;s original 1956</th>
<th align="left">Bloom&#x0027;s revised 2002</th>
<th align="left">TIMSS 1995/99</th>
<th align="left">TIMSS 2003</th>
<th align="left">TIMSS 2007/2011</th>
<th align="left">RNCS 2002 &#x0026; CAPS 2011</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">
<bold>Knowledge</bold>
</td>
<td align="left">Knowledge of - specifics (facts and terminology)<break></break>- ways and means of dealing with specifics (conventions, trends and sequences, methodology)<break></break>- universals and abstractions (principles and generalisations, theories and structures)</td>
<td align="left">The knowledge dimension (<bold>factual</bold> knowledge, <bold>conceptual</bold> knowledge, <bold>procedural</bold> knowledge, <bold>metacognitive</bold> knowledge)</td>
<td align="left">Content domain (mathematical concepts)</td>
<td align="left">Content domain (mathematical concepts)</td>
<td align="left">Content domain (mathematical concepts)</td>
<td align="left">Content domain (mathematical concepts)</td>
</tr>
<tr>
<td align="left">
<bold>Cognitive domain</bold>
</td>
<td align="left">Comprehension, Application, Analysis, Synthesis, Evaluation (Much more detail provided in Krathwohl, 2002)</td>
<td align="left">Remember, Understand, Apply, Analyse, Evaluate, Create (Much more detail provided in Krathwohl, 2002)</td>
<td align="left">
<bold>Knowing, Using routine procedures, Problem solving</bold>. (Grade 4)<break></break>
<bold>Representing situations mathematically Using more complex procedures, Generalising, Justifying</bold> (Grade 8)</td>
<td align="left">
<bold>Knowing facts and procedures</bold> (recall, recognise and identify, compute, use tools)<break></break>
<bold>Using concepts</bold> (know, classify, represent, formulate, distinguish)<break></break>
<bold>Solving routine problems</bold> (select, model, interpret, apply, verify)<break></break>
<bold>Reasoning</bold> (logical, systematic thinking, including both inductive and deductive thinking)
</td>
<td align="left">
<bold>Knowing</bold> (recall, recognise, compute, retrieve, measure, classify and order)<break></break>
<bold>Applying</bold> (select, represent, model, implement, solve routine problems)<break></break>
<bold>Reasoning</bold> (analyse, generalise, synthesise and integrate, justify, solve non-routine problems)<break></break>
</td>
<td align="left">
<bold>Knowledge</bold> (estimation and appropriate rounding of numbers, straight recall, use of correct formula, use of mathematical facts, appropriate use of mathematical vocabulary)<break></break>
<bold>Routine procedures</bold> (perform well-known procedures, simple applications and calculations, derivation from given information, use of correct formula)<break></break>
<bold>Complex procedures</bold> (complex calculations or higher order reasoning; investigate elementary axioms to generalise them into proofs for straight line geometry, congruence and similarity, no obvious route to the solution, connections between different representations, conceptual understanding)<break></break>
<bold>Problem solving</bold> (unseen, non-routine problems, higher order understanding and processes, break the problem down into its constituent parts)</td>
</tr>
<tr>
<td align="left">
<bold>Other</bold>
</td>
<td align="left">Affective (Receiving, responding, valuing, organising, characterising)<break></break>Psychomotor skills</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>TIMSS, Trends in International Mathematics and Science Study; RNCS, Revised National Curriculum Statement; CAPS, Curriculum and Assessment Policy Statement.</p>
</fn>
</table-wrap-foot>
</table-wrap>
<p>The categories used in TIMSS 2003 for the cognitive domain were: <italic>Knowing facts and procedures</italic> (recall, recognise or identify, compute, use tools), <italic>Using concepts</italic> (know, classify, represent, formulate, distinguish), <italic>Solving routine problems</italic> (select, model, interpret, apply, verify) and <italic>Reasoning</italic> (logical, systematic thinking, including both inductive and deductive thinking) (Mullis et al., <xref ref-type="bibr" rid="CIT0022">2003</xref>).</p>
<p>In 2007 and 2011, the categories changed somewhat to <italic>Knowing</italic> (recall, recognise, compute, retrieve, measure, classify or order), <italic>Applying</italic> (select, represent, model, implement, solve routine problems) and <italic>Reasoning</italic> (analyse, generalise, synthesise and integrate, justify, solve non-routine problems) (Mullis et al., <xref ref-type="bibr" rid="CIT0023">2005</xref>, <xref ref-type="bibr" rid="CIT0024">2009</xref>) (<xref ref-type="table" rid="T0002">Table 2</xref>). Without going into detail, one may observe broad similarities between the TIMSS frameworks and Bloom&#x0027;s taxonomies, both original and revised.</p>
<p>We note here that our items from Worksheet 2 and Worksheet 4 (in <xref ref-type="app" rid="APP0001">Appendix 1</xref>) may be allocated to TIMSS <italic>content domains</italic> fairly easily as the topics in the framework are elaborated to a fine level of detail. The difficulty still remains with assigning a cognitive domain to the items or, to phrase the challenge differently, to assign the expected response of the learner.</p>
<p>Both the TIMSS frameworks and Bloom&#x0027;s taxonomies (original and revised) have influenced curriculum planning in many participating countries, including South Africa, over recent decades.</p>
</sec>
<sec id="S0008">
<title>RNCS and CAPS taxonomies in international perspective</title>
<p>In this section we comment, in relation to Bloom&#x0027;s taxonomy and TIMSS, on the South African curricula, the Revised National Curriculum Statement (RNCS) introduced in 2002 (Department of Education, <xref ref-type="bibr" rid="CIT0010">2002</xref>), though only fully implemented some 5 years later, and the Curriculum and Assessment Policy Statement (CAPS) (Department of Basic Education, <xref ref-type="bibr" rid="CIT0009">2011</xref>), introduced in 2011 and implemented from 2012 to 2014.</p>
<p>The categorisations &#x2013; knowledge, routine procedures, complex procedures and problem solving &#x2013; in CAPS (DBE, <xref ref-type="bibr" rid="CIT0009">2011</xref>) are similar to the TIMSS 1995 and 1999 categories. The earlier RNCS curriculum used the same content categories, but had more elaborated cognitive dimensions, more akin to Bloom&#x0027;s categories and the TIMSS 2007 categories. <xref ref-type="table" rid="T0003">Table 3</xref> provides a summary, with the RNCS and CAPS categories roughly aligned.
</p>
<table-wrap id="T0003">
<label>TABLE 3</label>
<caption>
<p>Curriculum framework for test design purposes, with CAPS percentages for Grade 9.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Content Area</th>
<th align="center">Knowledge</th>
<th align="center">Routine procedure</th>
<th align="center">Complex procedure</th>
<th align="center">Problem solving</th>
<th align="center">Total (%)</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Number, operations and relations</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">15</td>
</tr>
<tr>
<td align="left">Patterns, functions and algebra</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">30</td>
</tr>
<tr>
<td align="left">Space and shape (Geometry)</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">35</td>
</tr>
<tr>
<td align="left">Measurement</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">10</td>
</tr>
<tr>
<td align="left">Data handling</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">-</td>
<td align="center">10</td>
</tr>
<tr>
<td align="left">
<bold>Total (%)</bold>
</td>
<td align="center">
<bold>25</bold>
</td>
<td align="center">
<bold>45</bold>
</td>
<td align="center">
<bold>20</bold>
</td>
<td align="center">
<bold>10</bold>
</td>
<td align="center">
<bold>100</bold>
</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn>
<p>CAPS, Curriculum and Assessment Policy Statement.</p>
</fn>
</table-wrap-foot>
</table-wrap>
<p>Applying the matrix of content domain categories (mathematical concepts) and the cognitive domain categories (the responses expected from individuals) allows test designers to cover the broad range of knowledge requirements expected by the curriculum. Of course such a matrix of content and learner activity level inevitably sets up artificial distinctions between subject topics and between the responses expected.<xref ref-type="fn" rid="FN0002">2</xref> There are likely to be many occasions when the test designer will be in a quandary as to which category to assign a particular item. <xref ref-type="table" rid="T0003">Table 3</xref> provides an example of a standard framework providing content and cognitive domains. The cells would then be populated according to the curriculum requirements, for example as laid out in CAPS.</p>
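<p>The matrix logic described above can be sketched in a few lines of code (our illustrative addition, not part of the curriculum documents): given a test length, it distributes items across the cells of a content-by-cognitive-level blueprint. The content weights follow Table 3 and the cognitive weights follow the totals row of Table 3; the function name and the 40-item test length are assumptions.</p>

```python
# Illustrative sketch: populating a content-by-cognitive-level test
# blueprint from curriculum weightings. Content weights follow Table 3;
# cognitive weights follow the totals row of Table 3 (25/45/20/10).

CONTENT_WEIGHTS = {  # percentage of the test per content area
    "Number, operations and relations": 15,
    "Patterns, functions and algebra": 30,
    "Space and shape (Geometry)": 35,
    "Measurement": 10,
    "Data handling": 10,
}

COGNITIVE_WEIGHTS = {  # percentage of the test per cognitive level
    "Knowledge": 25,
    "Routine procedures": 45,
    "Complex procedures": 20,
    "Problem solving": 10,
}

def blueprint(total_items):
    """Suggest a (possibly fractional) item count for each matrix cell."""
    return {
        (area, level): total_items * (cw / 100) * (gw / 100)
        for area, cw in CONTENT_WEIGHTS.items()
        for level, gw in COGNITIVE_WEIGHTS.items()
    }

cells = blueprint(40)  # a hypothetical 40-item test
# Fractional counts mark cells where the designer must exercise exactly
# the kind of judgement call described in the text.
```

<p>For instance, the Measurement/Problem solving cell receives 40 &#x00D7; 0.10 &#x00D7; 0.10 = 0.4 of an item, immediately exposing a cell where the designer is in a quandary.</p>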
<p>The requirement is for the designer to populate the tabular framework with suitable types and numbers of items for each cell in the matrix. Our Worksheet 2 and Worksheet 4 items may all be allocated to the category <italic>Application</italic>, which includes routine procedures and complex procedures (<xref ref-type="table" rid="T0003">Table 3</xref>), although some may argue that the items all belong in the <italic>Knowledge</italic> category.</p>
<p>The authors acknowledge that the depth of description in these taxonomies and frameworks has not been provided in this article. They are listed rather to show how there are similarities and differences across these frameworks which then point to the complexity of constructing such taxonomies.</p>
<p>It is necessary for the taxonomy and the various frameworks to be transformed into subject-specific descriptions (Andrich, <xref ref-type="bibr" rid="CIT0003">2002</xref>; Van Wyke &#x0026; Andrich, <xref ref-type="bibr" rid="CIT0034">2006</xref>). The TIMSS descriptions have achieved this requirement (see Mullis et al., <xref ref-type="bibr" rid="CIT0022">2003</xref>, <xref ref-type="bibr" rid="CIT0023">2005</xref>, <xref ref-type="bibr" rid="CIT0024">2009</xref>). An interesting divergence to be explored is that whilst Bloom&#x0027;s original and revised taxonomies claim a hierarchy of cognitive processes, the TIMSS framework claims only a minimal hierarchy of cognitive domains, with a range of difficulty within each cognitive domain (Mullis et al., <xref ref-type="bibr" rid="CIT0022">2003</xref>, p. 32).</p>
</sec>
<sec id="S0009">
<title>Challenges in 21st century assessment</title>
<p>The challenge to test developers in the 21st century is to achieve some congruence between tests used for monitoring or summative purposes, for the active classroom and classroom-based assessment. We also propose that in addition to the alignment required for these modes of assessment, there is also critical engagement in a professional development cycle. The congruence of educational objectives, teaching and learning activities and assessment envisaged by Bloom (see Krathwohl, <xref ref-type="bibr" rid="CIT0015">2002</xref>) is difficult to achieve. However, given the importance of aligning assessment practices with classroom practices, it is necessary to have a framework that is explicit and is in some respects common to both settings.</p>
<p>The current monitoring and evaluation project under consideration in this article has an external monitoring component; there is also in the design a feedback component provided to teachers in the interest of improving teaching and learning. The model for this project, based on the work of Bennett and Gitomer (<xref ref-type="bibr" rid="CIT0007">2009</xref>) and Bennett (<xref ref-type="bibr" rid="CIT0005">2010</xref>, <xref ref-type="bibr" rid="CIT0006">2011</xref>), has been explained earlier in the article. The content of the assessment programme requires reviewing and making decisions about substantive mathematics knowledge.<xref ref-type="fn" rid="FN0003">3</xref> In addition, the fact that there is feedback to teachers means that there should be a common conceptual language and some congruence of expectations across all three sites: the curriculum, classroom teaching and external assessment.</p>
<p>In this project the problem emerged of making explicit the content of the curriculum framework for learners in Grade 9. (The project also encompassed Grades 8, 10 and 11; the focus in this article is Grade 9.) The research team believed that making the framework explicit would serve three purposes: firstly, it would provide some direction to the constructors of the test items and provide an overview of the test; secondly, the explicit descriptions could provide feedback for teachers; thirdly, in the interests of democratic participation, the design of the test would be transparent (see <xref ref-type="app" rid="APP0002">Appendix 2</xref> and <xref ref-type="app" rid="APP0003">Appendix 3</xref>).</p>
<p>The broad question, as stated earlier, is: Can the three essential elements, a monitoring component, a formative assessment component and a professional development component be logically and coherently aligned for the purpose of informing teaching and learning?</p>
<p>The subquestion is: How may we best design assessment frameworks (the design tool specifying the purposes, structure and content of an assessment instrument) in such a way that there is full coherence from the mathematical knowledge to be taught and learned, through the set of assessment instruments to providing diagnostic and practical feedback to teachers about learner performance and needs?</p>
<p>In the early phases of the project, we explored alternatives to the current practice which draws on Bloom&#x0027;s taxonomy and variations as described in the national curriculum documents. In particular, we examine the function of taxonomies in guiding a monitoring and developmental process.</p>
</sec>
<sec id="S0010">
<title>Selecting a taxonomy</title>
<p>There is the obvious difficulty of assigning mathematics items to one particular mathematics category, but when assigning an item to a cognitive category, the problem is one of presuming how the learner will respond to the item. Whether an item is categorised as knowledge, routine procedures or complex procedures and problem solving (as in the CAPS, <xref ref-type="table" rid="T0002">Table 2</xref>) depends on the level of knowledge acquired by the learner; this level relates directly to what has been taught (Bloom et al., <xref ref-type="bibr" rid="CIT0008">1956</xref>; Usiskin, <xref ref-type="bibr" rid="CIT0032">2012</xref>). A possible solution to this predicament of interpretation is to limit the categories to mathematics components rather than attempting to second-guess how the generic learner will respond. This approach, focusing on the mathematical content of the question, may circumvent the difficulty test designers have in manipulating the mathematics to fit a cognitive category. A second criterion for the selection of a taxonomy, or criteria for categorising test items, is for the categories to align with teaching and learning. The question to consider here is whether or not feedback from a particular category will provide information to the teacher about needs and interventions.</p>
<p>The first approach we took in this project was to describe in detail what we expected of the learner responding to the item. This approach is exemplified in the patterns, functions and algebra component of a Grade 9 test illustrated in the table in <xref ref-type="app" rid="APP0002">Appendix 2</xref>. The cognitive requirements form the horizontal headings across the table.</p>
<p>The purpose of making this content aspect explicit was to inform teachers of the contents of the test so that they could make a reasoned judgement about the performance of their classes in relation to the test during this external monitoring programme. In other words, if a teacher knows that Item X covered probability, and also knows that she made a judgement call to leave probability out of the Grade 9 work plan with a view to an intense focus in Grade 10, she would understand her students&#x2019; lack of performance in this section.</p>
<p>A second approach categorised the items in terms of the dimensions of understanding identified by Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>; see also <xref ref-type="app" rid="APP0003">Appendix 3</xref>). Using three criteria for a useful taxonomy, namely staying true to the mathematics, guiding a balanced assessment and providing useful feedback to teachers, we explored the potential of the five dimensions of understanding proposed by Usiskin, which are also operationalised in the UCSMP high school textbooks. He proposes that five dimensions are necessary for a full understanding of concepts:<list list-type="bullet">
<list-item>
<p>The <italic>skills-algorithm</italic> dimension of understanding deals with the procedures and algorithms required to achieve answers. This dimension includes the understanding of procedures and algorithms, which Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>) and others assert is much deeper than what has been called procedural understanding or procedural fluency (see Kilpatrick et al., <xref ref-type="bibr" rid="CIT0014">2001</xref>; Long, <xref ref-type="bibr" rid="CIT0017">2005</xref>). Understanding and being able to carry out a skill invariably involves, at base, an understanding of the associated concept and draws on a range of subsidiary skills. This dimension of understanding mathematics concepts is the one mostly addressed in school classrooms and found in systemic-type tests.</p>
</list-item>
<list-item>
<p>The <italic>property-proof</italic> understanding of concepts deals with the principles underlying, for example, the number system and operations. It may be argued that a procedure is only really understood when one can identify the mathematical properties that underlie the procedures. Knowledge of the properties and being able to &#x2018;prove&#x2019; that the procedure works enables one to more confidently generalise the procedure to other problems. Here we may contrast conceptual understanding with procedural understanding, although as argued previously this distinction has to be qualified.</p>
</list-item>
<list-item>
<p>The <italic>use-application</italic> understanding of mathematics deals with the applications of mathematics in real situations. A person may understand how to perform some procedure and may know why the method works, but they cannot fully understand unless they know when, why and how to use the skill and procedure in applications. Applications are not necessarily higher order thinking, but rather a different type of thinking, according to Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>).</p>
</list-item>
<list-item>
<p>
Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>) avers that the three types of understanding previously described do not give a complete picture: to fully understand a concept a person must be able to represent the concept in different ways. The <italic>representation-metaphor</italic> understanding refers to the pictures, graphs or objects that illustrate concepts and that can be used interchangeably with symbolic representation. Such analogies may need to be in one or more of verbal, figural, graphical or tabular modes and may need to invoke more than a linear ordering or more than a single static dimension. They may also require a location in time.</p>
</list-item>
<list-item>
<p>The fifth is the <italic>history-culture</italic> dimension. Whilst this theme is an important dimension of understanding, it cannot easily be tested where responses require only short answers. It reflects a sense of the interrelatedness of mathematical content and its embedding in the social fabric of experience. Some key consequences of this dimension of understanding include an appreciation of the utility and creativity associated with mathematical thinking and problem solving at the level of the learner and an insight into the proximity of mathematics. We suggest it has motivational consequences.</p>
</list-item>
</list>
</p>
<p>In applying this revised taxonomy we faced two dilemmas. The first was that we had to conform in some degree to the status quo. The CAPS document, the legal framework guiding teachers in their everyday teaching and assessment, requires strict adherence. In that document the four levels applied are <italic>Knowledge, Routine procedures, Complex procedures</italic> and <italic>Problem solving</italic> (DBE, <xref ref-type="bibr" rid="CIT0009">2011</xref>). We generally use three categories, <italic>Knowledge, Applications</italic> and <italic>Problem solving</italic>.
</p>
<p>The second dilemma was where to include problem solving. Usiskin&#x0027;s (<xref ref-type="bibr" rid="CIT0032">2012</xref>) focus is on understanding a concept. Does problem solving form part of the dimensions of understanding, so, for example, could we place problem solving into the category <italic>use-application</italic>, or should it have a category of its own?</p>
<p>The process &#x2018;problem solving&#x2019; has many different interpretations. In some sectors problem solving means a &#x2018;word sum&#x2019;; to others the term means encountering a problem never seen before by the learner cohort. This salient but inherently unverifiable definition is difficult to apply in practice because a teacher or test designer may never know whether a learner has previously seen a particular problem type. The good teacher, enthusiastic parent or grandparent and the Internet could all play a part in rendering a really good problem routine, in that it becomes something the child has seen and perhaps solved before. Problem solving according to Polya (<xref ref-type="bibr" rid="CIT0027">1957</xref>) has distinct phases. The problem solver, when confronted with a problem they have not seen previously, needs first to understand the problem, then think about the strategy to use, then &#x2018;generate a relevant and appropriate easier related problem&#x2019;, then &#x2018;solve the related problem&#x2019; and finally &#x2018;figure out how to exploit the solution or method to solve the original problem&#x2019; (Schoenfeld, <xref ref-type="bibr" rid="CIT0030">2007</xref>, p. 66).</p>
<p>Taking this process seriously means that problem solving is not possible in a standard testing situation. We have to acknowledge here that our tests are omitting a very significant part of mathematics. In fact, Schoenfeld (<xref ref-type="bibr" rid="CIT0030">2007</xref>) asserts that the types of questions and answers common in many mathematics classrooms work against the generation of good problem solvers in those classrooms. In the case of problem solving we have compromised and included the notion of problem solving as a separate category, although knowing that the items allocated to that category are only shadows of what Polya would describe as a real problem.</p>
<p>So, in <xref ref-type="table" rid="T0004">Table 4</xref>, we have assigned the items from Worksheet 2 and Worksheet 4 (see <xref ref-type="app" rid="APP0001">Appendix 1</xref>) each to one dimension, knowing that allocation to a different dimension may be argued for and that a single item may well span two categories, true to the nature of mathematics applications. Each of these four dimensions of understanding, <italic>skills-algorithms, property-proof, use-application</italic> and <italic>representation-metaphor</italic>, has aspects that can be memorised; each also has potential for the highest level of creative thinking, for example the invention of a new algorithm (Usiskin, <xref ref-type="bibr" rid="CIT0032">2012</xref>). Each of the dimensions is relatively independent of the others, and each has proponents who teach mathematics largely from that single perspective. Usiskin (<xref ref-type="bibr" rid="CIT0032">2012</xref>) claims, however, that the understanding of mathematics is multidimensional, with each of these dimensions contributing some elements to the notion of understanding.
</p>
<table-wrap id="T0004">
<label>TABLE 4</label>
<caption>
<p>Dimensions of understanding, levels of processing, and possible weightings.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Strategies</th>
<th align="left">Skills-algorithm</th>
<th align="left">Properties and principles</th>
<th align="left">Use-application</th>
<th align="left">Representation</th>
<th align="center">Total</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Routine strategies (one process)</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">4.3 Distance, direction application</td>
<td align="left">-</td>
<td align="center">40%</td>
</tr>
<tr>
<td align="left">Complex strategies (two or more steps)</td>
<td align="left">2.2 Factorise an algebraic expression<break></break>4.2 Volume of a cylinder</td>
<td align="left">2.1 Geometric sequence<break></break>4.1 Area of a square</td>
<td align="left">-</td>
<td align="left">2.3 Representations of a linear function</td>
<td align="center">30%</td>
</tr>
<tr>
<td align="left">Problem solving (Polya&#x0027;s process)</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="center">20%</td>
</tr>
<tr>
<td align="left">Highly creative engagement with a unique outcome</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="center">10%</td>
</tr>
<tr>
<td align="left">
<bold>Total</bold>
</td>
<td align="left">
<bold>30%</bold>
</td>
<td align="left">
<bold>20%</bold>
</td>
<td align="left">
<bold>30%</bold>
</td>
<td align="left">
<bold>20%</bold>
</td>
<td align="center">
<bold>100%</bold>
</td>
</tr>
</tbody>
</table>
</table-wrap>
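<p>A weighting scheme such as that in Table 4 must be internally consistent: the row margins (levels of processing) and the column margins (dimensions of understanding) must each sum to 100%. A minimal sketch of such a check follows (our own illustrative addition; the figures are those shown in Table 4 and the function is a hypothetical helper):</p>

```python
# Illustrative check that each set of Table 4 weightings sums to 100%.

strategy_weights = {  # row margins: levels of processing (Table 4)
    "Routine strategies": 40,
    "Complex strategies": 30,
    "Problem solving": 20,
    "Highly creative engagement": 10,
}

dimension_weights = {  # column margins: dimensions of understanding (Table 4)
    "Skills-algorithm": 30,
    "Property-proof": 20,
    "Use-application": 30,
    "Representation-metaphor": 20,
}

def margins_consistent(*weightings):
    """True when every supplied weighting sums to 100 (per cent)."""
    return all(sum(w.values()) == 100 for w in weightings)

ok = margins_consistent(strategy_weights, dimension_weights)
```

<p>Such a check becomes useful precisely when the weightings are adjusted for a particular class or content domain, as discussed below: any adjustment to one cell must be compensated elsewhere in the same margin.</p>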
<p>We argue, firstly, that this taxonomy of dimensions provides a necessarily mathematics-specific taxonomy and, secondly, that these dimensions support good teaching practice, so that feedback to teachers in terms of these dimensions may be helpful.</p>
<p>An interesting observation is that by including an additional somewhat hierarchical dimension the taxonomy becomes three-dimensional: the mathematical knowledge as listed in the curriculum, the dimensions of understanding and the levels of complexity involved.</p>
<p>Note that the explicit weightings in <xref ref-type="table" rid="T0004">Table 4</xref> are aligned to the South African curriculum documents and would differ depending on the content domain and on the constitution of the class and their aspirations. Having a class of aspiring engineers may warrant more emphasis on problem solving and the creative application of mathematics, whilst also not neglecting routine algorithms that are an important component of the engineer&#x0027;s tool box.</p>
</sec>
<sec id="S0011" sec-type="conclusion">
<title>Conclusion</title>
<p>At the heart of the matter for the curriculum designer, the teacher and the assessment specialist is an advanced understanding of mathematics that takes into account the interconnections between the current school mathematics topics, the connections to earlier concepts and the progression in subsequent years to more advanced topics (Usiskin, Peressini, Marchisotto &#x0026; Stanley, <xref ref-type="bibr" rid="CIT0033">2003</xref>). Also required is the exploration of alternate definitions, the linking between concepts, knowledge of a wide range of applications and alternate ways of approaching problems (Usiskin et al., <xref ref-type="bibr" rid="CIT0033">2003</xref>). This background knowledge informs the designer in any systemic testing programme. In a model such as the one envisaged by Bennett and Gitomer (<xref ref-type="bibr" rid="CIT0007">2009</xref>), attention can be paid to the critical areas of mathematics, and these areas can be aligned with classroom practice.</p>
<p>The fact that we are constrained by existing test programmes, which serve some purpose in the current system, implies that we need to find a way of progressively adapting the existing requirements. We must simultaneously bear in mind both that administrators and teachers are change-weary and that any changes need to be thoroughly debated, with explicit consensus reached about the role and forms of systemic tests.</p>
<p>We note here that we have developed formative assessment resources (Worksheet 2 and Worksheet 4 in <xref ref-type="app" rid="APP0001">Appendix 1</xref>) that are linked to the monitoring component, and are designed in sets with each of the items covering different dimensions and ranging in difficulty. The purpose of these products is for teacher use in the classroom so that the teacher does not have to rely only on external monitoring for feedback about teaching and learning, but will also have useful resources at his or her disposal. Monitoring and accountability purposes can accommodate time lags that intervention strategies cannot afford.</p>
<p>As has already been observed, professional development that does not relate to the classroom experience may not be useful. In addition, systemic assessment that gives no thought to its diagnostic relevance in the classroom must be questioned. The dilemma here, as with the levels advocated by Bloom, is how to operationalise these levels or components of understanding in such a way that they manifest evidence that the objectives of the curriculum have been met or that learner proficiency is being developed and exhibited. Bloom&#x0027;s levels or TIMSS cognitive domains convey very little in themselves unless they can be interpreted for a specific mathematical context (as they have been in the TIMSS frameworks). For these systems of categories to be useful, they have to be further elaborated by the subject specialist. The Usiskin taxonomy serves this purpose.</p>
<p>Devices such as <xref ref-type="table" rid="T0004">Table 4</xref> guide the designer of an assessment programme towards appropriate balance and coverage of the curriculum and attempt to cover different types of cognitive engagement for the context of a specific grade. However, further mathematical insight is required to populate such a multidimensional framework with appropriate items. These insights include the apparent difficulty level of items within each cell in the table. A norm-referenced instrument can emerge, suitable for diagnostic and intervention purposes, which will require some form of marking memo. Such an instrument can be valuable in every classroom, but perhaps at different times and stages to suit the progress of the learners in each context. This variability suggests the importance of collaborative projects that construct comparable assessment instruments using a common design framework across the targeted curriculum and then share access to the resulting variety of classroom-focused instruments.</p>
<p>Any additional criterion-referencing, as may be desired for adjudicating individual learner attainment in a classroom summative assessment or in a systemic testing programme, will require some external specification of explicit outcome criteria for various levels of performance quality. These criteria require judgments about the extent to which each constituent item is an indicator of the required performance levels. These judgments should also be explicitly recorded and may influence memo mark allocations. The related matter of conditions for the legitimacy of adding marks to establish a single overall performance total is a separate, non-trivial issue, but is not discussed further in this article.</p>
<p>The challenge presented to the mathematics education community by Vergnaud (<xref ref-type="bibr" rid="CIT0036">1994</xref>) is that the analysis of concepts and processes must be from a mathematical perspective. He asserts that no linguistic or logical system or natural language description, or levels of abstraction, such as Bloom&#x0027;s taxonomy, can provide the &#x2018;concepts sufficient to conceptualise the [mathematical] world and help us meet the situations and problems that we experience&#x2019; (Vergnaud, <xref ref-type="bibr" rid="CIT0036">1994</xref>, p. 42).</p>
<p>It is the precision of symbolic representation and well-defined concepts in mathematics that conveys both the essential aspects of the mathematical situation and the schemes used by the learner of mathematics. This somewhat radical stance challenges educational researchers and practitioners to keep the essential mathematics in mind, whilst remaining pragmatic in the current policy environment.</p>
<p>A related challenge is to maintain the distinction between a learning environment, which requires extensive investigation and engagement with meaningful contexts, and an external assessment programme, which inevitably focuses on the outcomes of a process. Short-circuiting the learning process with obsessive testing may be counterproductive. Here we are reminded of two modes of evaluation: that of the connoisseur, a genuine appreciation of the art of teaching, and that of the critic, an inspector who moves in with a checklist (Eisner, <xref ref-type="bibr" rid="CIT0012">1998</xref>). Bloom et al.&#x0027;s (<xref ref-type="bibr" rid="CIT0008">1956</xref>) aim in formulating the taxonomy of educational objectives was to extend the repertoire of teaching through engagement with the taxonomy. We envisage that the ideas expressed in this article will provide the impetus for further discussion.</p>
</sec>
</body>
<back>
<ack>
<title>Acknowledgements</title>
<p>The ideas in this article have been generated whilst working on a project conducted by the Centre for Evaluation and Assessment at the University of Pretoria. The project was funded by the Michael and Susan Dell Foundation. The theoretical developments have been the work of the authors.</p>
<sec id="S0012">
<title>Competing interests</title>
<p>The authors declare that they have no financial or personal relationship(s) that may have inappropriately influenced them in writing this article.</p>
</sec>
<sec id="S0013">
<title>Authors&#x2019; contribution</title>
<p>C.L. (University of Pretoria) conceptualised the article, with major contributions from T.D. (University of Cape Town) and H.d.K. (independent consultant). Most of the writing has been the responsibility of the first author, with critical review and insights into the curriculum provided by the co-authors. All three authors have been involved in the monitoring and evaluation project, and have contributed to the conceptualisation of the products, that is, the frameworks and tables.</p>
</sec>
</ack>
<ref-list id="references"><title>References</title>
<ref id="CIT0001"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Anderson</surname>, <given-names>L.</given-names></string-name>, &#x0026; <string-name><surname>Krathwohl</surname>, <given-names>D.A.</given-names></string-name></person-group> (<year>2001</year>). <source><italic>Taxonomy for learning, teaching and assessment. A revision of Bloom&#x0027;s taxonomy of educational objectives</italic></source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Longman</publisher-name>.</mixed-citation></ref>
<ref id="CIT0002"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Anderson</surname>, <given-names>L.W.</given-names></string-name>, <string-name><surname>Krathwohl</surname>, <given-names>D.R.</given-names></string-name>, <string-name><surname>Airasian</surname>, <given-names>P.W.</given-names></string-name>, <string-name><surname>Cruikshank</surname>, <given-names>K.A.</given-names></string-name>, <string-name><surname>Mayer</surname>, <given-names>R.E.</given-names></string-name>, <string-name><surname>Pintrich</surname>, <given-names>P.R.</given-names></string-name>, <etal>et al.</etal></person-group> (<year>2001</year>). <source><italic>A taxonomy for learning, teaching and assessing: A revision of Bloom&#x0027;s educational objectives</italic></source> (<edition>abridged edition</edition>). <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Longman</publisher-name>.</mixed-citation></ref>
<ref id="CIT0003"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Andrich</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2002</year>). <article-title>A framework relating outcomes based education and the taxonomy of educational objectives</article-title>. <source><italic>Studies in Educational Evaluation</italic></source>, <volume>28</volume>, <fpage>35</fpage>&#x2013;<lpage>59</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1016/S0191-491X(02)00011-1">http://dx.doi.org/10.1016/S0191-491X(02)00011-1</ext-link></comment></mixed-citation></ref>
<ref id="CIT0004"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Andrich</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2009</year>). <source><italic>Review of the curriculum framework for curriculum, assessment and reporting purposes in Western Australian schools, with particular reference to years Kindergarten to Year 10</italic></source>. <publisher-loc>Perth</publisher-loc>: <publisher-name>University of Western Australia</publisher-name>.</mixed-citation></ref>
<ref id="CIT0005"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Bennett</surname>, <given-names>R.</given-names></string-name></person-group> (<year>2010</year>). <article-title>Cognitively based assessment of, for, and as learning (CBAL): A preliminary theory of action for summative and formative assessment</article-title>. <source><italic>Measurement</italic></source>, <volume>8</volume>, <fpage>70</fpage>&#x2013;<lpage>91</lpage>.</mixed-citation></ref>
<ref id="CIT0006"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Bennett</surname>, <given-names>R.E.</given-names></string-name></person-group> (<year>2011</year>). <article-title>Formative assessment: A critical review</article-title>. <source><italic>Assessment in Education: Principles Policy &#x0026; Practice</italic></source>, <volume>18</volume>(<issue>1</issue>), <fpage>5</fpage>&#x2013;<lpage>25</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1080/0969594X.2010.51367">http://dx.doi.org/10.1080/0969594X.2010.51367</ext-link></comment></mixed-citation></ref>
<ref id="CIT0007"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Bennett</surname>, <given-names>R.E.</given-names></string-name></person-group>, &#x0026; <person-group person-group-type="author"><string-name><surname>Gitomer</surname>, <given-names>G.H.</given-names></string-name></person-group> (<year>2009</year>). <chapter-title>Transforming K-12 assessment: Integrating accountability testing, formative assessment and professional development</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>C.</given-names> <surname>Wyatt-Smith</surname></string-name>, &#x0026; <string-name><given-names>J.J.</given-names> <surname>Cumming</surname></string-name></person-group> (Eds.), <source><italic>Educational assessment in the 21st century</italic></source> (pp. <fpage>43</fpage>&#x2013;<lpage>62</lpage>). <publisher-loc>Dordrecht</publisher-loc>: <publisher-name>Springer</publisher-name>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1007/978-1-4020-9964-9_3">http://dx.doi.org/10.1007/978-1-4020-9964-9_3</ext-link></comment></mixed-citation></ref>
<ref id="CIT0008"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Bloom</surname>, <given-names>B.S.</given-names></string-name>, <string-name><surname>Engelhart</surname>, <given-names>M.D.</given-names></string-name>, <string-name><surname>Furst</surname>, <given-names>E.J.</given-names></string-name>, <string-name><surname>Hill</surname>, <given-names>W.H.</given-names></string-name>, &#x0026; <string-name><surname>Krathwohl</surname>, <given-names>D.R.</given-names></string-name></person-group> (<year>1956</year>). <source><italic>Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain</italic></source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>David McKay Company</publisher-name>.</mixed-citation></ref>
<ref id="CIT0009"><mixed-citation publication-type="conference"><person-group person-group-type="author"><collab>Department of Basic Education</collab></person-group>. (<year>2011</year>). <source><italic>Curriculum and assessment policy statement Grades R-12. Mathematics</italic></source>. <conf-loc>Pretoria</conf-loc>: <publisher-name>DOE</publisher-name>.</mixed-citation></ref>
<ref id="CIT0010"><mixed-citation publication-type="conference"><person-group person-group-type="author"><collab>Department of Education</collab></person-group>. (<year>2002</year>). <source><italic>Revised national curriculum statement Grades R-9 (Schools) Mathematics</italic></source>. <conf-loc>Pretoria</conf-loc>: <publisher-name>DOE</publisher-name>.</mixed-citation></ref>
<ref id="CIT0011"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Dunne</surname>, <given-names>T.</given-names></string-name></person-group>, <person-group person-group-type="author"><string-name><surname>Long</surname>, <given-names>C.</given-names></string-name></person-group>, <person-group person-group-type="author"><string-name><surname>Craig</surname>, <given-names>T.</given-names></string-name></person-group>, &#x0026; <person-group person-group-type="author"><string-name><surname>Venter</surname>, <given-names>E.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Meeting the requirements of both classroom-based and systemic assessment of mathematics proficiency: The potential of Rasch measurement theory</article-title>. <source><italic>Pythagoras</italic></source>, <volume>33</volume>(<issue>3</issue>), <comment>Art. #19</comment>, <fpage>16</fpage> pages. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.4102/pythagoras.v33i3.19">http://dx.doi.org/10.4102/pythagoras.v33i3.19</ext-link></comment></mixed-citation></ref>
<ref id="CIT0012"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Eisner</surname>, <given-names>E.W.</given-names></string-name></person-group> (<year>1998</year>). <source><italic>The enlightened eye: Qualitative inquiry and the enhancement of educational practice</italic></source>. <publisher-loc>Upper Saddle River, NJ</publisher-loc>: <publisher-name>Merrill</publisher-name>.</mixed-citation></ref>
<ref id="CIT0013"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Hiebert</surname>, <given-names>J.</given-names></string-name></person-group>, &#x0026; <person-group person-group-type="author"><string-name><surname>Lefevre</surname>, <given-names>P.</given-names></string-name></person-group> (<year>1986</year>). <chapter-title>Conceptual and procedural knowledge in mathematics: An introductory analysis</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>J.</given-names> <surname>Hiebert</surname></string-name></person-group> (Ed.), <source><italic>Conceptual and procedural knowledge: The case of mathematics</italic></source> (pp. <fpage>1</fpage>&#x2013;<lpage>27</lpage>). <publisher-loc>Hillsdale, NJ</publisher-loc>: <publisher-name>Erlbaum</publisher-name>.</mixed-citation></ref>
<ref id="CIT0014"><mixed-citation publication-type="book"><person-group person-group-type="editor"><string-name><surname>Kilpatrick</surname>, <given-names>J.</given-names></string-name></person-group>, <person-group person-group-type="editor"><string-name><surname>Swafford</surname>, <given-names>J.</given-names></string-name></person-group>, &#x0026; <person-group person-group-type="editor"><string-name><surname>Findell</surname>, <given-names>B.</given-names></string-name></person-group> (Eds.). (<year>2001</year>). <source><italic>Adding it up: Helping children learn mathematics</italic></source>. <publisher-loc>Washington, DC</publisher-loc>: <publisher-name>National Academy Press</publisher-name>.</mixed-citation></ref>
<ref id="CIT0015"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Krathwohl</surname>, <given-names>D.R.</given-names></string-name></person-group> (<year>2002</year>). <article-title>A revision of Bloom&#x0027;s taxonomy: An overview</article-title>. <source><italic>Theory into Practice</italic></source>, <volume>41</volume>(<issue>4</issue>), <fpage>212</fpage>&#x2013;<lpage>218</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1207/s15430421tip4104_2">http://dx.doi.org/10.1207/s15430421tip4104_2</ext-link></comment></mixed-citation></ref>
<ref id="CIT0016"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Kuiper</surname>, <given-names>W.</given-names></string-name></person-group>, <person-group person-group-type="author"><string-name><surname>Nieveen</surname>, <given-names>N.</given-names></string-name></person-group>, &#x0026; <person-group person-group-type="author"><string-name><surname>Berkvens</surname>, <given-names>J.</given-names></string-name></person-group> (<year>2013</year>). <chapter-title>Curriculum regulation and freedom in the Netherlands - A puzzling paradox</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>W.</given-names> <surname>Kuiper</surname></string-name>, &#x0026; <string-name><given-names>J.</given-names> <surname>Berkvens</surname></string-name></person-group> (Eds.). <source><italic>Balancing curriculum freedom and regulation across Europe</italic></source>. <comment>CIDREE Yearbook 2013</comment> (pp. <fpage>139</fpage>&#x2013;<lpage>162</lpage>). <publisher-loc>Enschede</publisher-loc>: <publisher-name>SLO &#x0026; Jan Berkvens</publisher-name>.</mixed-citation></ref>
<ref id="CIT0017"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Long</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2005</year>). <article-title>Maths concepts in teaching: Procedural and conceptual knowledge</article-title>. <source><italic>Pythagoras</italic></source>, <volume>62</volume>, <fpage>59</fpage>&#x2013;<lpage>65</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.4102/pythagoras.v0i62.115">http://dx.doi.org/10.4102/pythagoras.v0i62.115</ext-link></comment></mixed-citation></ref>
<ref id="CIT0018"><mixed-citation publication-type="conference"><person-group person-group-type="author"><string-name><surname>Long</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2011</year>). <source><italic>Mathematical, cognitive and didactic elements of the multiplicative conceptual field investigated within a Rasch assessment and measurement framework</italic></source>. <comment>Unpublished doctoral dissertation. Faculty of Humanities, University of Cape Town, Cape Town, South Africa. Available from <ext-link ext-link-type="uri" xlink:href="http://hdl.handle.net/11180/1521">http://hdl.handle.net/11180/1521</ext-link></comment></mixed-citation></ref>
<ref id="CIT0019"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Long</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Dunne</surname>, <given-names>T.</given-names></string-name>, &#x0026; <string-name><surname>Mokoena</surname>, <given-names>G.</given-names></string-name></person-group> (<year>2014</year>). <article-title>A model of assessment: Integrating external monitoring with classroom practice</article-title>. <source><italic>Perspectives in Education</italic></source>, <volume>32</volume>(<issue>1</issue>), <fpage>158</fpage>&#x2013;<lpage>178</lpage>.</mixed-citation></ref>
<ref id="CIT0020"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>McConnell</surname>, <given-names>J.W.</given-names></string-name>, <string-name><surname>Brown</surname>, <given-names>S.</given-names></string-name>, <string-name><surname>Usiskin</surname>, <given-names>Z.</given-names></string-name>, <string-name><surname>Senk</surname>, <given-names>S.L.</given-names></string-name>, <string-name><surname>Widerski</surname>, <given-names>T.</given-names></string-name>, <string-name><surname>Anderson</surname>, <given-names>S.</given-names></string-name>, <etal>et al.</etal></person-group> (<year>2002</year>). <source><italic>Algebra, teacher&#x0027;s edition</italic></source>. <publisher-loc>Glenview, IL</publisher-loc>: <publisher-name>Prentice Hall</publisher-name>.</mixed-citation></ref>
<ref id="CIT0021"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Messick</surname>, <given-names>S.</given-names></string-name></person-group> (<year>1989</year>). <article-title>Meaning and values in test validation: The science and ethics of assessment</article-title>. <source><italic>Educational Researcher</italic></source>, <volume>18</volume>(<issue>2</issue>), <fpage>5</fpage>&#x2013;<lpage>11</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.3102/0013189X018002005">http://dx.doi.org/10.3102/0013189X018002005</ext-link></comment></mixed-citation></ref>
<ref id="CIT0022"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Mullis</surname>, <given-names>I.V.S.</given-names></string-name>, <string-name><surname>Martin</surname>, <given-names>M.O.</given-names></string-name>, <string-name><surname>Smith</surname>, <given-names>T.A.</given-names></string-name>, <string-name><surname>Garden</surname>, <given-names>R.A.</given-names></string-name>, <string-name><surname>Gregory</surname>, <given-names>K.D.</given-names></string-name>, <string-name><surname>Gonzalez</surname>, <given-names>E.J.</given-names></string-name>, <etal>et al.</etal></person-group> (<year>2003</year>). <source><italic>TIMSS assessment frameworks and specifications 2003</italic></source>. <publisher-loc>Chestnut Hill, MA</publisher-loc>: <publisher-name>Boston College</publisher-name>.</mixed-citation></ref>
<ref id="CIT0023"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Mullis</surname>, <given-names>I.V.S.</given-names></string-name>, <string-name><surname>Martin</surname>, <given-names>M.O.</given-names></string-name>, <string-name><surname>Ruddock</surname>, <given-names>G.J.</given-names></string-name>, <string-name><surname>O&#x0027;Sullivan</surname>, <given-names>C.Y.</given-names></string-name>, <string-name><surname>Arora</surname>, <given-names>A.</given-names></string-name>, &#x0026; <string-name><surname>Erberber</surname>, <given-names>E.</given-names></string-name></person-group> (<year>2005</year>). <source><italic>TIMSS 2007 assessment frameworks</italic></source>. <publisher-loc>Chestnut Hill, MA</publisher-loc>: <publisher-name>TIMSS &#x0026; PIRLS International Study Center, Boston College</publisher-name>.</mixed-citation></ref>
<ref id="CIT0024"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Mullis</surname>, <given-names>I.V.S.</given-names></string-name>, <string-name><surname>Martin</surname>, <given-names>M.O.</given-names></string-name>, <string-name><surname>Ruddock</surname>, <given-names>G.J.</given-names></string-name>, <string-name><surname>O&#x0027;Sullivan</surname>, <given-names>C.Y.</given-names></string-name>, &#x0026; <string-name><surname>Preuschoff</surname>, <given-names>C.</given-names></string-name></person-group> (<year>2009</year>). <source><italic>TIMSS 2011 assessment frameworks: TIMSS &#x0026; PIRLS</italic></source>. <publisher-loc>Chestnut Hill, MA</publisher-loc>: <publisher-name>International Study Center, Lynch School of Education, Boston College</publisher-name>.</mixed-citation></ref>
<ref id="CIT0025"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Nichols</surname>, <given-names>S.</given-names></string-name>, &#x0026; <string-name><surname>Berliner</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2005</year>). <source><italic>The inevitable corruption of indicators and educators through high stakes testing</italic></source>. <publisher-loc>Tempe, AZ</publisher-loc>: <publisher-name>Education Policy Studies Laboratory, Arizona State University</publisher-name>.</mixed-citation></ref>
<ref id="CIT0026"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Nichols</surname>, <given-names>S.</given-names></string-name>, &#x0026; <string-name><surname>Berliner</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2008</year>). <article-title>Why has high stakes testing slipped so easily into contemporary American life?</article-title>. <source><italic>Phi Delta Kappan</italic></source>, <volume>89</volume>(<issue>9</issue>), <fpage>672</fpage>&#x2013;<lpage>676</lpage>. <comment><ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1177/003172170808900913">http://dx.doi.org/10.1177/003172170808900913</ext-link></comment></mixed-citation></ref>
<ref id="CIT0027"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Polya</surname>, <given-names>G.</given-names></string-name></person-group> (<year>1957</year>). <source><italic>How to solve it: A new aspect of mathematical method</italic></source>. <publisher-loc>Princeton, NJ</publisher-loc>: <publisher-name>Princeton University Press</publisher-name>.</mixed-citation></ref>
<ref id="CIT0028"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Scheffler</surname>, <given-names>I.</given-names></string-name></person-group> (<year>1965</year>). <source><italic>The conditions of knowledge</italic></source>. <publisher-loc>Glenview, IL</publisher-loc>: <publisher-name>Scott, Foresman &#x0026; Company</publisher-name>.</mixed-citation></ref>
<ref id="CIT0029"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Schmidt</surname>, <given-names>W.</given-names></string-name>, <string-name><surname>McKnight</surname>, <given-names>C.</given-names></string-name>, <string-name><surname>Valverde</surname>, <given-names>G.</given-names></string-name>, <string-name><surname>Houang</surname>, <given-names>R.</given-names></string-name>, &#x0026; <string-name><surname>Wiley</surname>, <given-names>D.</given-names></string-name></person-group> (<year>1996</year>). <source><italic>Many visions, many aims</italic></source>. <publisher-loc>Dordrecht</publisher-loc>: <publisher-name>Kluwer Academic Publishers</publisher-name>.</mixed-citation></ref>
<ref id="CIT0030"><mixed-citation publication-type="conference"><person-group person-group-type="author"><string-name><surname>Schoenfeld</surname>, <given-names>A.H.</given-names></string-name></person-group> (<year>2007</year>). <article-title>What is mathematical proficiency and how can it be assessed?</article-title> In <person-group person-group-type="editor"><string-name><given-names>A.H.</given-names> <surname>Schoenfeld</surname></string-name></person-group> (Ed.), <source><italic>Assessing mathematical proficiency</italic></source> (pp. <fpage>9</fpage>&#x2013;<lpage>73</lpage>). <comment>Mathematical Sciences Research Institute Publications</comment>, Vol. <volume>53</volume>. <conf-loc>New York, NY</conf-loc>: <publisher-name>Cambridge University Press</publisher-name>. <comment>Available from <ext-link ext-link-type="uri" xlink:href="http://library.msri.org/books/Book53/contents.html">http://library.msri.org/books/Book53/contents.html</ext-link></comment></mixed-citation></ref>
<ref id="CIT0031"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Skemp</surname>, <given-names>R.</given-names></string-name></person-group> (<year>1976</year>). <article-title>Relational understanding and instrumental understanding</article-title>. <source><italic>Mathematics Teaching</italic></source>, <volume>77</volume>, <fpage>20</fpage>&#x2013;<lpage>26</lpage>.</mixed-citation></ref>
<ref id="CIT0032"><mixed-citation publication-type="conference"><person-group person-group-type="author"><string-name><surname>Usiskin</surname>, <given-names>Z.</given-names></string-name></person-group> (<year>2012</year>, <month>July</month>). <source><italic>What does it mean to understand school mathematics?</italic></source> <conf-name>Paper presented at the 12th International Congress on Mathematical Education</conf-name>, <publisher-name>COEX</publisher-name>, <conf-loc>Seoul, Korea</conf-loc>.</mixed-citation></ref>
<ref id="CIT0033"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Usiskin</surname>, <given-names>Z.</given-names></string-name>, <string-name><surname>Peressini</surname>, <given-names>A.</given-names></string-name>, <string-name><surname>Marchisotto</surname>, <given-names>E.</given-names></string-name>, &#x0026; <string-name><surname>Stanley</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2003</year>). <source><italic>Mathematics for high school teachers: An advanced perspective</italic></source>. <publisher-loc>Upper Saddle River, NJ</publisher-loc>: <publisher-name>Prentice Hall</publisher-name>.</mixed-citation></ref>
<ref id="CIT0034"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Van Wyke</surname>, <given-names>J.</given-names></string-name>, &#x0026; <string-name><surname>Andrich</surname>, <given-names>D.</given-names></string-name></person-group> (<year>2006</year>). <source><italic>A typology of polytomously scored items disclosed by the Rasch model: Implications for constructing a continuum of achievement</italic></source>. <publisher-loc>Perth</publisher-loc>: <publisher-name>Murdoch University</publisher-name>.</mixed-citation></ref>
<ref id="CIT0035"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Vergnaud</surname>, <given-names>G.</given-names></string-name></person-group> (<year>1988</year>). <chapter-title>Multiplicative structures</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>J.</given-names> <surname>Hiebert</surname></string-name>, &#x0026; <string-name><given-names>M.</given-names> <surname>Behr</surname></string-name></person-group> (Eds.), <source><italic>Number concepts and operations in the middle grades</italic></source> (pp. <fpage>141</fpage>&#x2013;<lpage>161</lpage>). <publisher-loc>Hillsdale, NJ</publisher-loc>: <publisher-name>National Council of Teachers of Mathematics</publisher-name>.</mixed-citation></ref>
<ref id="CIT0036"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Vergnaud</surname>, <given-names>G.</given-names></string-name></person-group> (<year>1994</year>). <chapter-title>Multiplicative conceptual field: What and why?</chapter-title> In <person-group person-group-type="editor"><string-name><given-names>G.</given-names> <surname>Harel</surname></string-name>, &#x0026; <string-name><given-names>J.</given-names> <surname>Confrey</surname></string-name></person-group> (Eds.), <source><italic>The development of multiplicative reasoning in the learning of mathematics</italic></source> (pp. <fpage>41</fpage>&#x2013;<lpage>59</lpage>). <publisher-loc>Albany, NY</publisher-loc>: <publisher-name>State University of New York</publisher-name>.</mixed-citation></ref>
<ref id="CIT0037"><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><surname>Webb</surname>, <given-names>N.L.</given-names></string-name></person-group> (<year>1992</year>). <chapter-title>Assessment of students&#x2019; knowledge of mathematics: Steps toward a theory</chapter-title>. In <person-group person-group-type="editor"><string-name><given-names>D.A.</given-names> <surname>Grouws</surname></string-name></person-group> (Ed.), <source><italic>Handbook of research on mathematics teaching and learning</italic></source> (pp. <fpage>661</fpage>&#x2013;<lpage>683</lpage>). <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Macmillan Publishing Company</publisher-name>.</mixed-citation></ref>
<ref id="CIT0038"><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Wolk</surname>, <given-names>R.A.</given-names></string-name></person-group> (<year>2012</year>). <article-title>Common Core vs. Common Sense</article-title>. <source><italic>Education Week</italic></source>, <volume>32</volume>(<issue>13</issue>), <fpage>35</fpage>&#x2013;<lpage>40</lpage>.</mixed-citation></ref></ref-list>
<app-group>
<app id="APP0001">
<label>Appendix 1</label>
<sec id="S0014">
<label>Worksheet 2: Grade 9: Patterns, functions and algebra</label>
<p>Determine the general term for this pattern:</p>
	<fig id="F0002">
	<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="PYTH-35-240-g002.tif"/>
	</fig>
<p>Factorise fully: 20<italic>x</italic><sup>2</sup> &#x2013; 45<italic>y</italic><sup>4</sup>
</p>
<p>A, B and C below are representations of linear functions. Which one of the three does not represent the same linear function as the other two?</p>
<p>A:
<table-wrap>
<table frame="hsides" rules="groups">
<tbody>
<tr>
<td align="center"><italic>x</italic></td>
<td align="center">-2</td>
<td align="center">-1</td>
<td align="center">0</td>
<td align="center">1</td>
<td align="center">2</td>
<td align="center">3</td>
</tr>
<tr>
<td align="center"><italic>y</italic></td>
<td align="center">-2</td>
<td align="center">0</td>
<td align="center">2</td>
<td align="center">4</td>
<td align="center">6</td>
<td align="center">10</td>
</tr>
</tbody>
</table>
</table-wrap>
</p>
<p>B: <italic>y</italic> = 2<italic>x</italic> + 2</p>
<p>C:</p>
<fig id="F0003">
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="PYTH-35-240-g003.tif"/>
</fig>
</sec>
<sec id="S0015">
<label>Worksheet 4: Grade 9: Measurement</label>
<p>The area of a square is 4 m<sup>2</sup>. Calculate the area of the resulting shape if one side of the original square is doubled.</p>
<p>Calculate the volume of the cylinder. Use 22/7 as an approximation for &#x3C0;.</p>
<fig id="F0004">
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="PYTH-35-240-g004.tif"/>
</fig>
<p>An airplane flies 300 km due north. However, the pilot ignored the constant side wind, which took him off course. The flight path is shown in the figure below. How far is he from his original destination?</p>
<fig id="F0005">
<graphic xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="PYTH-35-240-g005.tif"/>
</fig>
</sec>
</app>
<app id="APP0002">
<label>Appendix 2</label>
<table-wrap id="T0005">
<label>TABLE 1&#x2212;A2</label>
<caption><p>Adapted cognitive domain categories (Grade 9).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Algebra and functions</th>
<th align="left">Knowledge of algebraic language and the ability to simplify algebraic expressions, solve simple equations and work with one representation of a relationship or rule.</th>
<th align="left">Comprehension and application of factors and the laws of exponents. Making connections between different representations of a relationship.</th>
<th align="left">Solving problems that require selecting a representation for a situation, solving equations and interpreting graphs.</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">
<bold>Algebraic skills and processes:</bold>
<break></break>Solving equations by inspection, trial-and-improvement or algebraic processes (using additive and multiplicative inverses; factorisation)<break></break>Using the distributive law and algebraic skills to simplify algebraic expressions, to find the product of two binomials, to factorise algebraic expressions (common factors and difference of squares)<break></break>Using the laws of exponents to simplify expressions</td>
<td align="left">
<bold>1</bold> (simplifying algebraic expression)<break></break>
<bold>2</bold> (multiplying algebraic factors)<break></break>
<bold>5a</bold> (solving equations)<break></break>
<bold>5b</bold> (solving algebraic fraction equations)<break></break>
<bold>5c</bold> (solving exponential equations)<break></break>
<bold>9</bold> (properties of a rectangle, applying algebra)</td>
<td align="left"/>
<td align="left">
<bold>8</bold> (applying algebraic principles to solving a problem)<break></break>
<bold>9</bold> (applying algebraic principles to solving a problem)</td>
</tr>
<tr>
<td align="left">
<bold>Sequences, graphing and functions:</bold>
<break></break>Investigating numeric and geometric patterns and relationships by adding terms and explaining and representing the rules that generate them<break></break>Drawing graphs on the Cartesian plane for given equations (in two variables) or determining equations or formulae from given graphs or tables<break></break>Representing and using relationships between variables to determine input or output values and to interpret the equivalence of different descriptions of the same relationship or rule presented in different ways in order to select the most useful representation for a given situation</td>
<td align="left">
<bold>3a</bold> (linear sequence)<break></break>
<bold>3b</bold> (equation representing a sequence)<break></break>
<bold>3c</bold> (applying a function)</td>
<td align="left">
<bold>6</bold> (explain the relationship between function and table)<break></break>
<bold>7</bold> (interpreting distance-time graphs)<break></break>
<bold>11a</bold> (finding coordinates from equations)<break></break>
<bold>11b</bold> (plotting coordinates)</td>
<td align="left">
<bold>4a</bold> (equations representing a situation)<break></break>
<bold>4b</bold> (solve by inspection and using equations)<break></break>
<bold>10a</bold> (identifying a pattern)<break></break>
<bold>10b</bold> (constructing a formula)<break></break>
<bold>10c</bold> (applying the formula)</td>
</tr>
</tbody>
</table>
</table-wrap>
</app>
<app id="APP0003">
<label>Appendix 3</label>
<table-wrap id="T0006">
<label>TABLE 1&#x2212;A3</label>
<caption>
<p>Items allocated to Usiskin&#x0027;s dimensions (Grade 9).</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Algebra and functions</th>
<th align="center" colspan="4">Knowledge (mathematical concepts)</th>
<th align="center" colspan="2">Applications and problem solving</th>
</tr>
<tr>
<th align="left"/>
<th align="left">Skills and algorithms</th>
<th align="left">Properties and principles</th>
<th align="left">Representation</th>
<th align="left">Proofs and justification</th>
<th align="left">Use and applications</th>
<th align="left">Problem solving</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">
<bold>Algebraic skills and processes</bold>
<break></break>Solving equations by inspection, trial-and-improvement or algebraic processes (using additive and multiplicative inverses; factorisation)<break></break>Using the distributive law and algebraic skills to simplify algebraic expressions, to find the product of two binomials, to factorise algebraic expressions (common factors and difference of squares)<break></break>Using the laws of exponents to simplify expressions</td>
<td align="left">
<bold>1</bold> (simplifying algebraic expression)<break></break>
<bold>2</bold> (multiplying algebraic factors)<break></break>
<bold>5a</bold> (solving equations)<break></break>
<bold>5b</bold> (solving algebraic fraction equations)</td>
<td align="left">
<bold>5c</bold> (solving exponential equations)<break></break>
<bold>9</bold> (properties of a rectangle, applying algebra)</td>
<td align="left">-</td>
<td align="left">-</td>
<td align="left">
<bold>8</bold> (applying algebraic principles to solving a problem)<break></break>
<bold>9</bold> (applying algebraic principles to solving a problem)</td>
<td align="left">-</td>
</tr>
<tr>
<td align="left">
<bold>Sequences, graphing and functions</bold>
<break></break>Investigating numeric and geometric patterns and relationships by adding terms and explaining and representing the rules that generate them<break></break>Drawing graphs on the Cartesian plane for given equations (in two variables) or determining equations or formulae from given graphs or tables<break></break>Representing and using relationships between variables to determine input or output values and to interpret the equivalence of different descriptions of the same relationship or rule presented in different ways in order to select the most useful representation for a given situation</td>
<td align="left">
<bold>3a</bold> (linear sequence)<break></break>
<bold>3b</bold> (equation representing a sequence)<break></break>
<bold>3c</bold> (applying a function)</td>
<td align="left">
<bold>6</bold> (explain the relationship between function and table)</td>
<td align="left">
<bold>7</bold> (interpreting distance-time graphs)<break></break>
<bold>11a</bold> (finding coordinates from equations)<break></break>
<bold>11b</bold> (plotting coordinates)</td>
<td align="left">-</td>
<td align="left">
<bold>4a</bold> (equations representing a situation)<break></break>
<bold>4b</bold> (solve by inspection and using equations)</td>
<td align="left">
<bold>10a</bold> (identifying a pattern)<break></break>
<bold>10b</bold> (constructing a formula)<break></break>
<bold>10c</bold> (applying the formula)</td>
</tr>
</tbody>
</table>
</table-wrap>
</app>
</app-group>
<fn-group>
<fn id="FN0001">
<label>1</label>
<p>Interesting connection here with the Van Hiele levels.</p>
</fn>
<fn id="FN0002">
<label>2</label>
<p>See Long (2011, p. 234) for a detailed discussion.</p>
</fn>
<fn id="FN0003">
<label>3</label>
<p>In the early stages of the project these tasks were performed by the research team. In later stages of the project this task became a joint function of both researchers and teachers.</p>
</fn>
</fn-group>
</back></article>