Detecting mathematics learning online

Corneli, J., & Ponti, M. (2012). Detecting mathematics learning online. Proceedings of the 8th International Conference on Networked Learning. Retrieved from




@inproceedings{corneli-ponti-2012,
author = {Corneli, Joseph and Ponti, M.},
booktitle = {Proceedings of the 8th International Conference on Networked Learning},
date-added = {2012-04-07 02:33:01 +0000},
date-modified = {2012-11-27 19:05:33 +0000},
date-read = {2012-04-12 09:48:07 -0400},
keywords = {1p2pu},
oa-url = {},
read = {1},
title = {Detecting mathematics learning online},
url = {},
year = {2012},
}


Key ideas

“The Learning Model”: How would we detect learning in this space?

  • A naïve idea: model learning as vocabulary acquisition
    • perhaps with “certification” through the evaluation process
  • More complete: model learning as a change in patterns of behavior
  • Sophisticated: model learning in terms of the use of new heuristic strategies

Behavior change

Here we would look for behavior like:

  • Working at the cutting edge (introducing new material?)
  • Progressive problem solving (working at increasing depth?)
  • Collaborative effort (asking or answering questions?)
  • User-identified high points (bookmarked threads and articles?)
  • Processing level (e.g. browsing vs attempting to solve problems?)
  • Quality of self-explanation (knowing how activities relate to goals?)
  • Exploration (trying different resources?)

Links here

Highlights (78%)

Abstract The purpose of the paper is to design a rubric for assessment of informal learning in undergraduate level mathematics. Our proposed assessment strategy would support and parallel the mathematics learning environment we are developing at PlanetMath. PlanetMath is currently an open Web 2.0 system that consists of peer-produced encyclopaedia articles and forum discussions: we will add facilities for contributing textbook-style problems and solutions, and detailed activity tracking to model and support learning. p. 1

We witness an increasing number of open online groups populated by self-organized and self-managed individuals who actively generate content in the attempt to foster learning communities open to all who wish to participate. These groups attempt to deploy the educational affordances (Kirschner, 2002) of Web 2.0 tools and technologies for information sharing, communication, content creation and collaboration. These affordances can enable participatory processes that can support multiple modes of learning in new open spaces, and blur the boundaries between production and consumption (Brown & Adler, 2008; Alevizou, Conole, Culver & Galley, 2009). Calls for learning approaches (especially those linked to socio-cultural theory) that are able to better exploit the educational affordances of Web 2.0 have been proposed (e.g., McLoughlin & Lee, 2007; Brown & Adler, 2008; Alevizou et al., 2010). However, claims about the educational and affective dimensions of online learning, e.g. that “these affordances stimulate the development of a participatory culture in which there is genuine engagement and communication, and in which members feel socially connected with one another” (McLoughlin & Lee, 2007, p. 667) invite further grounding in empirical research. Similarly, there seems to be a dearth of empirical studies addressing the learning-specific impact of Web 2.0. p. 1

We describe an analytical framework for detecting when and how learning is taking place among participants in the open, peer-produced mathematics website PlanetMath. For this to work, we require learning-specific metrics: for example, “participation” is not in and of itself a particularly salient measure of learning. We follow the popular view that “mathematics is a language”. In this view, learning mathematics can be modelled as a process of learning a specialized vocabulary, along with the technical grammar of proof. p. 1

We agree with David Smith (2002, p. 5) when he writes, “learning mathematics is, first of all, learning, and only secondarily about mathematics.” We will draw on microgenetic analysis, a collection of methods and techniques that have been used to study learning “as it happens” in a wide range of fields (Lavelli et al., 2005). Although we have not yet been able to validate this framework in practice, it has informed the design of the platform we are building, as we will describe below. p. 1

PlanetMath was initially developed by Aaron Krowne during the course of his Master’s studies at Virginia Tech. PlanetMath is a virtual community and peer-produced repository of mathematical knowledge (Krowne, 2003). Its central feature is a peer-produced mathematics encyclopaedia, which has been written by a group of around 300 volunteers and now contains nearly 9000 entries defining over 15000 terms. Similarly to Wikipedia, its contents are available under the terms of the Creative Commons By/Share-Alike License, but as a subject-specific encyclopaedia, it tends to have more specialised content (e.g. detailed proofs) (Corneli, 2011). p. 2

The transition from reference resource to learning environment p. 2

PlanetMath’s aim is to make mathematical knowledge more accessible, therefore it is a natural step to adapt PlanetMath’s strategy of peer production to improve the site’s support for learners. The approach underway takes the existing knowledge base as its core resource and extends it with a range of new materials and interactions focusing on problem solving – “the heart of mathematics” (Halmos, 1980). In short, our aim is to build a knowledge-rich online learning environment for mathematics. p. 2

A learning environment is made of human practices and material objects (Warger & Dobbin, 2009). It is a part of the human ecology wherein learners work alone or with others, using resources to pursue learning goals. It hosts the interactions of different types of learners, who interact, not only with each other, but also with artefacts, technologies and content (Nardi and O'Day, 1999). This “hosted interaction” feature is what distinguishes a learning environment from a learning object repository or digital library. p. 2

We follow Conole (2008) in thinking of “learning design” as a way of creating and representing practice. p. 2

Learning in this environment primarily entails solving problems, though numerous other activities will have learning relevance, e.g. asking a question, giving or receiving advice, or improving an expository text. From the user’s perspective, the environment will offer some significant advantages over a standard textbook or problem archive. For instance, nothing is more daunting than being faced with a problem to solve and not knowing what the terms in that problem mean. On PlanetMath, we will be able to provide automatically-generated links to the definitions of technical terms in problem statements. Frequently, such definitions are not enough, and the user will be able to make annotations asking for hints about how to use the definitions. These requests will help improve the quality and relevance of encyclopaedia articles. Over time, we expect a large archive of hints and worked examples to accumulate, overcoming the cold-start problem for learners. In particular, as generations of learners interact with the site, sharing and reflecting on strategies for meeting their learning goals, a collection of efficient learning pathways should emerge. Our hope is this archive and guide will help learners connect to mathematics in a meaningful way: rather than having textbook problems serve as a daunting obstacle to application or research, they should, in this context, serve as stepping stones to relevant and meaningful engagement in mathematical practice. p. 3
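The automatic term-linking described above can be sketched briefly. This is a minimal illustration, not PlanetMath's actual implementation; the term-to-URL mapping and paths are invented for the example.

```python
import re

# Hypothetical mapping from encyclopaedia titles to their article URLs.
DEFINITIONS = {
    "group": "/encyclopedia/Group.html",
    "normal subgroup": "/encyclopedia/NormalSubgroup.html",
}

def link_terms(problem: str) -> str:
    """Wrap each known technical term in a link to its definition.
    Longer terms are matched first, so 'normal subgroup' wins over 'group'."""
    for term in sorted(DEFINITIONS, key=len, reverse=True):
        pattern = r"\b" + re.escape(term) + r"\b"
        problem = re.sub(pattern,
                         f'<a href="{DEFINITIONS[term]}">{term}</a>',
                         problem, count=1)
    return problem

print(link_terms("Show that the kernel is a normal subgroup of the group."))
```

Matching longer terms first is the key design choice: otherwise the substring “group” inside “normal subgroup” would be linked to the wrong article.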

Theoretical perspective: PlanetMath as epistemic object and space for knowledge building p. 3

In this section, we describe two theoretical concepts we are exploring to underpin our analytic framework. Firstly, we suggest that PlanetMath can be conceptualized as an epistemic object (Miettinen & Virkkunen, 2005). Miettinen and Virkkunen explain that epistemic objects or objects of inquiry are constructed and are not fixed, but are projected towards something that does not exist yet. We suggest that PlanetMath can become an epistemic object in this sense – an “instruction manual”, whose process of development and refinement can contribute to transforming mathematics education. This is a process that develops by critique, questioning, and even dissent at the content level, as well as an ongoing and parallel process of tinkering on the infrastructural level. Conceptualizing PlanetMath as an epistemic object connects us to the socio-cultural historical approach, which examines the foundations of learning not in “abstract rules”, but in material artefacts (Miettinen & Virkkunen, 2005). In this view, PlanetMath can be seen as a shared artefact objectifying forms of expression of cognitive processes and patterns that exist outside of the minds of individual participants. The process of learning mathematics can be embodied in this artefact’s constituent tools and signs. Drawing on Vygotsky’s (1979) concept of mediated action, a participant can internalize these, by interacting with other users, with content, and with tools. PlanetMath becomes a new means of mediating the activity of learning mathematics. One can imagine it providing many of the same features as textbooks (problems and exposition), as well as many features that textbooks cannot provide (like peer-to-peer conversations and mentoring). p. 3

Secondly, we suggest that the concept of knowledge building is relevant to examining the foundations of mathematics learning in such a concrete setting. Knowledge building refers to the production of objects (Bereiter & Scardamalia, 1996), which, in PlanetMath, include, for example, production of articles and solutions to problems. Similar to work in research and scientific communities, learners build knowledge by asking cutting-edge questions that help the community advance its collective understanding, by engaging in problem solving and by seeking to understand problems at deeper levels (Scardamalia & Bereiter, 2003). Again, this cocreated knowledge, which is relevant to the goal of learning mathematics, is embedded in shared artefacts (e.g., structures, algorithms and proofs, explanations and justifications) (Bereiter & Scardamalia, 1996). p. 3

Developing an analytical framework to detect learning p. 3

Our suggested framework consists of three layers. The first layer will keep track of what happens in the learning environment at the “micro-developmental” level, in the following way: p. 3

  • Examine user activities, such as solving a problem, asking a question, and giving a hint.
  • Examine the vocabulary (specialized terms) that people use in texts they submit to the site.
  • Assume that when this vocabulary is used in a solution that has been marked correct, then the vocabulary has also been used correctly. p. 4
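The Layer 1 rule in the list above can be sketched as follows. The data shapes (a list of solution texts with accepted/rejected flags, and a glossary set) are assumptions for illustration only.

```python
def correctly_used_terms(solutions, glossary):
    """Return the glossary terms used in solutions marked correct.
    `solutions` is a list of (text, accepted) pairs; per the Layer 1
    assumption, a term in an accepted solution counts as correctly used."""
    used = set()
    for text, accepted in solutions:
        if not accepted:
            continue  # vocabulary in rejected solutions gets no credit
        words = {w.strip(".,;").lower() for w in text.split()}
        used |= words & glossary
    return used

glossary = {"derivative", "limit", "continuity"}
solutions = [("Take the derivative and evaluate the limit.", True),
             ("Continuity follows trivially.", False)]
print(sorted(correctly_used_terms(solutions, glossary)))  # ['derivative', 'limit']
```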

The second layer is inspired by the analytical toolkit (ATK) used by Chan, Lee and van Aalst (2001) to examine knowledge-building inquiry and discourse among high school students using the Knowledge Forum. In addition to providing measures of activities in the Knowledge Forum, including notes written, notes read and keywords used, the toolkit assesses aspects of the development of collective knowledge, using an adaptation of principles designed by Scardamalia (2000). The toolkit aims to detect: p. 4

  • Working at the cutting edge: participants work to advance individual and collective knowledge, and not to reiterate what is already known.
  • Progressive problem solving: participants pursue problems at progressively deeper levels, which means that when a participant in the community understands a problem, s/he uses this learning to deepen the understanding of the problem and solve it.
  • Collaborative effort: participants need to share understanding and to advance collective knowledge.
  • Identifying high points in the discourse: participants note contributions that help them understand a problem better and advance collective knowledge. p. 4

Additional situational or relational features that can describe mathematical problem solving behaviour (Hosein, 2009) which fit at this layer include: p. 4

  • Processing level: deep or surface learning, e.g. as evidenced by attempting to work through problems versus merely browsing solutions.
  • Quality of self-explanation: the learner is able to explain why they are doing what they are doing.
  • Exploration: the learner makes use of available tools and content. p. 4

We will aim to build software that can identify in an automatic way these aspects of learning and knowledge building. For example, site-wide analysis of vocabulary usage would enable us to identify someone who uses a completely new term to be “working at the cutting edge”. We can also use the network of interlinked documents to understand how a given activity relates to another (“progressive problem solving”). We will extend our model of simplified “language acquisition” (Layer 1) and different measures of engagement (Layer 2), with a third layer, which deals with the substance of problem solving itself. Before describing this third layer, we make some remarks on how we plan to analyse the data we will be gathering. p. 4
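The site-wide vocabulary analysis mentioned above might look something like this sketch: a user who is the first on the whole site to use a term gets credited with “working at the cutting edge”. The event format (timestamp, user, term) is an assumption made for illustration.

```python
def first_users(events):
    """Map each term to the user who introduced it site-wide.
    `events` is an iterable of (timestamp, user, term) tuples;
    scanning in time order, the earliest user of each term wins."""
    intro = {}
    for _ts, user, term in sorted(events):
        intro.setdefault(term, user)  # only the first use is recorded
    return intro

events = [(3, "ada", "sheaf"), (1, "ada", "functor"),
          (2, "bob", "functor"), (4, "bob", "topos")]
print(first_users(events))  # {'functor': 'ada', 'sheaf': 'ada', 'topos': 'bob'}
```

Counting how often a user appears in the resulting map gives one crude “cutting edge” score per participant.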

Microgenetic analysis p. 4

Open online learning environments provide researchers with access to all of the same data that participants use to communicate with one another (Stahl, 2006). Researchers have access to a natural data set similar to that provided by “talk-aloud protocols”, which can provide detailed evidence of learning and development. This kind of data – and the informal, ad hoc interactions that generate it – matches well with some of the ideas and techniques of microgenetic analysis. The microgenetic approach studies learning as a process, rather than the outcome of a process (Lavelli, Pantoja, Hsu, Messinger & Fogel, 2002). The approach examines moment-by-moment changes observed in a short period of time, often for a high number of separate observations, but not necessarily subject to the same staged treatment patterns found in longitudinal studies. Observations tend to be analyzed intensively, both qualitatively and quantitatively. Microgenetic approaches have been used to take into account the social process of development, in which individuals learn concurrently in a distributed fashion (Fischer & Granott, 1995). This method seems particularly well suited to our informal peer-based learning context, where a pre-test/post-test method for assessing learning quality would generally be inappropriate or infeasible. Statistical analysis of this sort of data presents some unique challenges (Cheshire et al., 2007). A higher-level challenge appears when we try to take what we’ve learned and apply it to shape practice. We give an indication of how we plan to address this challenge in the following section. p. 4

Applied heuristics: a Minskian approach p. 4

The purpose of this section is to establish a third layer of analysis, which we would position above the use of specialised mathematical vocabulary and below social dynamics like “working at the cutting edge”. Namely, here we look for engagement with the methods of mathematical problem solving. This layer includes the traditional “grammar of proof” (e.g. proof by induction or reductio ad absurdum). However, importantly, it also includes heuristics, and this is what we focus on in the current section. The treatment here is not exhaustive; indeed, a key feature of this layer is that it can expand to include any new heuristics we discover. p. 5

In one of his memos for the One Laptop per Child (OLPC) project, Marvin Minsky (2008-09) wrote: “Children […] learn words for various objects and processes – such as addition, multiplication, fraction, quotient, divisor, rectangle, parallelogram, and cylinder, equation, variable, function, and graph. But they learn only a few such terms per year – which means that in the realm of mathematics, our children are mentally starved, by having to live in a ‘linguistic desert.’ It is hard to think about something until one learns enough terms to express the important ideas in that area.” His concern, however, is not merely with increasing the rate of vocabulary acquisition, but with learning ways to think and problem solve. He quotes Allen Newell (1955): “The essential point of efficient learning is that, after you have solved a problem, it is not enough just to remember the answer: you need to remember the strategies that you used to discover that answer.” p. 5

Minsky then proposes several example heuristics (ways of thinking about problems) that might be taught, or detected when they are employed (Table 1). He also suggests some meta-level heuristics:

  • Select appropriate representations: Building an understanding of the problem or goal at levels ranging from deciding its domain, finding suitable parameters or making an intuitive sketch.
  • Find appropriate analogies: Some of the most useful results in mathematics combine ideas from disparate sub-fields (like geometry and algebra); further, some say that mathematics is the science of patterns, so knowing how to look for useful patterns is a vital skill.
  • Deploy negative expertise: Knowing what has failed to work in the past can be useful.
  • Construct more realistic self-models: As one gains experience, one can understand better how one thinks. p. 5

Learning strategies p. 5

In some sense, we could say that all “informal” mathematical speech represents use of heuristic reasoning. As with the four criteria from Chan, Lee and van Aalst (2001) mentioned earlier, we would like to be able to automatically detect patterns like “reasoning by analogy” or “deploying negative expertise”. Many of these could be identified from textual features drawn from the informal parts of mathematical speech (“by analogy” or even “it easily follows”), or else via explicit discourse markers, following the example of the “Dangerous Bend” sign employed by Bourbaki (Kranz, 2011) – one use for the icons supplied in Table 1. In analysing a given text, we can also draw on some of the literature about technical language, such as, for example, Trimble’s (1985) discussion of technical rhetoric. p. 5
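Detecting heuristics from textual cues, as suggested above, could be sketched as simple marker matching. The marker phrases and heuristic labels below are invented examples, not a list drawn from the paper.

```python
import re

# Hypothetical marker phrases for a few heuristic categories.
MARKERS = {
    "analogy": [r"\bby analogy\b", r"\bsimilarly to\b"],
    "negative expertise": [r"\bthis (approach|method) fails\b",
                           r"\bdoes not work\b"],
    "simplification": [r"\bsimpler (case|version)\b"],
}

def detect_heuristics(text):
    """Return the set of heuristic labels whose markers occur in text."""
    found = set()
    low = text.lower()
    for label, patterns in MARKERS.items():
        if any(re.search(p, low) for p in patterns):
            found.add(label)
    return found

print(sorted(detect_heuristics(
    "By analogy with the simpler case n = 2, induction does not work here.")))
```

A real system would need far richer markers (and the explicit discourse icons mentioned above), but even crude phrase matching can surface candidate passages for manual review.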

Table 1: Problem-solving heuristics suggested by Minsky, together with mnemonic diagrams pp. 5-6

  • If a problem seems familiar, try reasoning by analogy. If you solved a similar one in the past, and can adapt to the differences, you may be able to re-use that solution. If the problem still seems too hard, divide it into several parts. Every difference you recognize may suggest a separate subproblem to solve.
  • If it seems unfamiliar, change how you’re describing it. Find a different description that highlights more relevant information.
  • If you get too many ideas, then focus on a more specific example – but if you don’t get enough ideas, make the description more general.
  • If a problem is too complex, make a simpler version of it. Solving a simpler instance may suggest how to solve the original problem.
  • Asking what makes a problem seem hard may suggest another approach – or a better way to spend your time.
  • When your ideas seem inadequate, remember someone more expert at this, and imagine what that person would do.
  • Whenever you find yourself totally stuck, stop whatever you’re doing now and let the rest of your mind find alternatives.
  • The best way to solve a problem is to already know how to solve it – if you can manage to retrieve that knowledge.
  • If none of these methods work, you can ask another person for help.

Very neat graphics mnemonics.

Although we are not in a position to ensure the availability of peer support, we suspect that, on average, learners will be able to use PlanetMath to avail themselves of “observation, coaching, and successive approximation” (Collins et al., 1989). If these efforts are successful, significant changes in the way mathematics education works may follow. In particular, as we mentioned above, we hope that PlanetMath’s “super-textbook” will help learners connect to mathematics in a way that is relevant and meaningful to them. Indeed, the idea that we could collect all textbook answers in one place does bring to mind a very interesting question, about why mathematics education in schools and universities has been focused on solving “textbook problems” that have already been solved many thousands of times before. Can improved use of technology give us something better? While we believe the answer here should be a fervent “yes”, details will be determined in practice, not a priori. Still, we may at this time consider similar questions about other domains of knowledge. Can we develop similar “application layers”, similar to our “problem solving layer”, that re-use encyclopaedia content from Wikipedia or other knowledge resources in peer produced learning environments or interactive learning tools? pp. 6-7

Turning this question around, what might general-purpose “social layers” for Open Educational Resources glean from a subject-specific project like PlanetMath? It appears that the biggest strength of our proposed approach to detecting learning is its three-layer framework: we can detect specialised vocabulary use and socio-technical features like “working at the cutting edge”, and manage a catch-all category of heuristics which can be used to detect and talk about different ways of thinking. While there is a somewhat domain-specific mathematical flavour to this approach, we feel it poses an interesting model for educators and educational scientists working in other fields to consider adapting to their purposes. Further work will need to be done to establish whether this analytical framework can effectively detect patterns of learning. p. 7

Interesting point about domain-specific environments/research vs "catch-all". Something I need to think about for my research. p. 7

References p. 7

Alevizou, P., Conole, G. Culver, J., & Galley, R. (2009). Ritual performances and collective intelligence: theoretical frameworks for analysing emerging activity patterns in Cloudworks. In L. Dirckinck-Holmfeld, p. 7

Chan, C. K. K., Lee. E., & van Aalst, J. (2001). Assessing and fostering knowledge building inquiry and discourse. Paper presented at the 9th Biennial Meeting of the European Association for Learning and Instruction (EARLI). August 28-September 1st, 2001. Fribourg, Switzerland. p. 7

Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing and mathematics. In L. B. Resnick (Ed.), Knowing, learning and instruction: Essays in honor of Robert Glaser (pp. 453-494). Hillsdale, NJ: Lawrence Erlbaum Associates. p. 7

Conole, G. (2008). The role of mediating artefacts in learning design. In Lockyer, L., Bennett, S., Agostinho, S., & Harper, B. (Eds.), Handbook of Research on Learning Design and Learning Objects: Issues, Applications and Technologies (pp. 187-207). Hershey, PA: Idea Group Inc (IGI). p. 7

Corneli, J. (2011). The PlanetMath Encyclopaedia. Invited talk at the ITP 2011 Workshop on Mathematical Wikis (MathWikis-2011), Nijmegen, Netherlands. Retrieved August 22, 2011, from ol-767/paper-03.pdf p. 7

Cheshire, A., Muldoon K., Francis B., Lewis C. N., & Ball L. J. (2007). Modelling change: New opportunities in the analysis of microgenetic data, Infant and Child Development, 16, 119-134. p. 7

Fischer, K. W., & Granott N. (1995). Beyond one-dimensional change: Parallel, concurrent, socially distributed processes in learning and development, Human Development, 38, 302-314. p. 7

Hosein, A. (2009). Students’ approaches to mathematical tasks using software as a black-box, glass-box or open-box. Unpublished Ph.D. Thesis, The Open University, UK. p. 7

Kirschner, P.A. (2002). Can we support CSCL? Educational, social and technological affordances for learning. In P. A. Kirschner (Ed.), Three Worlds of CSCL: Can we support CSCL? (pp. 7-47). Heerlen, The Netherlands: Open University of the Netherlands. p. 7

Krowne, A. (2003). Building a digital library the commons-based peer production way. D-Lib Magazine, 9. Retrieved September 18, 2011, from p. 7

Lavelli M., Pantoja A.P.F., Hsu H., Messinger D., & Fogel A. (2005). Using microgenetic designs to study change processes. In D. Teti (Ed.), Handbook of Research Methods in Developmental Science (pp. 40-65). Oxford, UK: Blackwell Publishers. p. 7

McLoughlin, C., & Lee, M. J. W. (2007). Social software and participatory learning: Pedagogical choices with technology affordances in the web 2.0 era. Proceedings ASCILITE, Singapore. Retrieved August 23, 2011, from p. 8

Miettinen, R., & Virkkunen, J. (2005). Epistemic objects, artefacts and organizational change. Organization, 12(3), 437-456. p. 8

Minsky, M. (2008-09). Memos for the One Laptop Per Child project, published serially online in five instalments. Retrieved August 23, 2011, from p. 8

Nardi, B., & O'Day, V. (1999). Information ecologies: Using technology with heart. Cambridge, MA: MIT Press. p. 8

Warger, T., & Dobbin, G. (2009). Learning environments: Where space, technology, and culture converge. ELI White Papers, EDUCAUSE Learning Initiative (10/29/2009). Retrieved August 12, 2011, from p. 8



(from talk slides)

(from talk slides and Marvin Minsky, 2008-09, Memos for the One Laptop Per Child project)