Notes from Learning Analytics Conference 2011: Day 2

March 3, 2011

During the second day of the Learning Analytics Conference, I continued taking notes in Etherpad, just as I had done during the pre-conference and day 1. After lunch, however, I felt quite burnt out, having taken quite detailed notes for two and a half days already. In addition, I had some very interesting conversations during lunch, and my head kept spinning around those ideas rather than focusing on the current speakers, so the notes below are by no means complete. Luckily Doug Clow took notes from the afternoon sessions.

All in all, it was a really great conference - beautiful venue, lots of opportunities to interact, and a ton of new ideas. I hope to write some longer, more reflective pieces about themes I saw at the conference, and how they relate to my own research, once I get my todo list under control.

Erik & Hannah Duval - keynote

Attention - what do people pay attention to when they learn? Someone is, or is not, interacting with what is going on. Can we capture what the person does, and can we use what we capture to get better at doing what we do?

Human-readable attention stream: Yammer - an intranet Twitter that gives you the "pulse" of the team.

Human, explicit, nice - but doesn't scale; overwhelming, like LAK11.

Using attention to filter & suggest, provide awareness & support social links. Wakoopa - an analytics plugin that tracks everything you do; awareness of what you have been doing.

Software recommendations - other people with similar behaviour are using different applications, etc.
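
As a rough illustration (my own sketch; Wakoopa's actual algorithm isn't described here, and all the data is hypothetical), this kind of recommendation can be as simple as scoring the apps of behaviourally similar users:

```python
# Hypothetical usage data: user -> set of applications they have been running.
usage = {
    "me":  {"firefox", "emacs", "skype"},
    "ann": {"firefox", "emacs", "zotero"},
    "bob": {"photoshop", "itunes"},
}

def recommend(user, usage):
    """Suggest apps used by behaviourally similar users (Jaccard overlap)."""
    mine = usage[user]
    scores = {}
    for other, apps in usage.items():
        if other == user:
            continue
        similarity = len(mine & apps) / len(mine | apps)  # Jaccard similarity
        for app in apps - mine:
            scores[app] = scores.get(app, 0.0) + similarity
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("me", usage))  # ['zotero', 'photoshop', 'itunes'] - zotero ranks first
```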

Find out when friends start using new software. What if the people in this room could keep each other informed about the tools we use, in a very lightweight way?

Can we do something like that, for learning?

Physical exercise - notion of capturing data automatically, and using them to help you get better at what you want to do - very big community, RunKeeper for example. Run with your device.

"I want to run a marathon in September" - out comes a training program, and tracks it - hey you're not on track. How would this look like for learning, especially language learning etc.

RescueTime - you can set yourself goals. How much time to spend on email etc. It will tell you if you go over.

Google tracks searches.

Contextualized attention metadata. Responsive Open Learning Environment (ROLE), an EU-funded project.

Pull a number of components together, like widgets, and build your own learning environment. We try to keep track of everything going on in these environments.

We can build tools that visualize what's going on (Visualizing PLE Usage, Erik Duval et al)

http://www.role-showcase.eu/role-tool/cam-zeitgeist

Awareness for learners & teachers

Hans Poldoja, a PhD student of Erik Duval, built EduFeedr. People post things on their blogs, and the software figures out whether they are doing what they should be doing.

Through these tools, we are starting to collect datasets about how people interact with learning.

Figure out what recurrent patterns are, and what they mean.

TELEurope.eu - teleurope.eu/pg/groups/9405/data...

Would love to wire my students, and measure what goes on in the brain (but small ethical problems).

The quantified self

Dangers?

Scary if the university or the organization owns the learning metadata - 1984...

AttentionTrust.org

Total Recall - a book about the e-memory concept.

Jeff Jarvis - the benefits of leading your life in an open way, tracking a lot of stuff, and making it available. Upcoming book: "Public Parts".

Motivation and self-efficacy among students - are they doing something for their own sake, rather than for their professor? Strategy with his own students: very open learning, where people outside the class see what you do - this can be very motivating to students.

My students will self-report what they think I want to know, so they get a better grade - which doesn't necessarily have anything to do with what they are actually doing. How do we really track this?

Comment from audience: This seems very related to the "game layer", social competition etc.

Katja Niemann - Usage Contexts for Object Similarity: Exploratory Investigations

The self-regulated learner needs support to decide which learning object fits his needs best in the current context.

Recommend suitable learning objects according to:
- learning goal
- competence level
- preferred learning style

Problems with finding those objects:
- expert metadata: expensive
- automatically generated metadata: good results for texts, but not for other media types
- social metadata: sparse, ambiguous, faulty

Contextualized Attention Metadata (CAM). Analogy with linguistics - linguistic basic units: word / sentence; CAM basic units: action on an object / session.

Use methods from linguistics on these sessions from CAM.

Paradigmatic relations: two words that often appear in the same context might be semantically similar, e.g. "beer" and "wine".

So: do objects with similar usage have similar contexts?

Each object holds a usage context profile (UCP) comprising all its usage contexts. Each context C consists of pre- and post-contexts.

UCP similarity - compare pre-context and post-context
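
A minimal sketch of how such a pre-/post-context comparison could work, under the assumption (mine, not from the talk) that CAM sessions can be reduced to ordered lists of object IDs; the function names and data are hypothetical:

```python
from collections import Counter
from math import sqrt

def usage_context_profile(sessions, obj, window=2):
    """Build a UCP for obj: counts of objects appearing shortly before
    (pre-context) and after (post-context) it across all sessions."""
    pre, post = Counter(), Counter()
    for session in sessions:  # a session = ordered list of object IDs
        for i, o in enumerate(session):
            if o == obj:
                pre.update(session[max(0, i - window):i])
                post.update(session[i + 1:i + 1 + window])
    return pre, post

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def ucp_similarity(sessions, obj1, obj2):
    """Average the similarity of the pre-contexts and of the post-contexts."""
    pre1, post1 = usage_context_profile(sessions, obj1)
    pre2, post2 = usage_context_profile(sessions, obj2)
    return (cosine(pre1, pre2) + cosine(post1, post2)) / 2

# Two objects used in identical contexts come out maximally similar.
sessions = [["intro", "slides-a", "quiz"], ["intro", "slides-b", "quiz"]]
print(ucp_similarity(sessions, "slides-a", "slides-b"))  # 1.0
```

This mirrors the paradigmatic-relations idea above: two learning objects that tend to be preceded and followed by the same objects score as similar, just as "beer" and "wine" do in text.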

http://portal.mace-project.eu MACE - a testbed connecting learning objects in the field of architecture.

Using learning analytics to assess students' behaviour in open-ended programming tasks - Paulo Blikstein

If we don't come up with ways to give teachers incentives to assign projects that encourage 21st-century problem solvers, they won't do it.

Anna De Liddo - Discourse-Centric Learning Analytics

Discourse as an indicator of learning - a key indicator of meaningful learning is the quality of contributions to discourse.

Sociocultural perspective on learning: discourse as a tool to think collectively, through which people can compare their thinking, explore ideas, and shape agreement.

chronologically vs logically rendered dialogue environments (most online environments represent discourse as a timeline)

You have to read the entire thread to find the key items that have been discussed - not scalable.

Online Deliberation: Emerging Tools Workshop (http://www.olnet.org/odet2010). Essence: E-Science, Sensemaking & Climate Change.

Demo of Cohere

Users have to explicitly choose the kind of contribution they are making. They can annotate and include webpages. Making connections - search the database, and pick the post you want to connect to. You have to associate a semantic type with the connection - what kind of link is this?

Ways of filtering posts, visualize in different ways.

Online discussion - ask students to classify what contributions they are making, and how these connect - unrelated to where the post appears.

Analytics per learner - Cohere personal notebook, all the notes, annotated websites, connections made, people connected etc. Different tables: post types (how learner contributed to discourse). What kinds of rhetorical moves are they making when they connect through posts?

Discourse network structure = concept network + social network

Concept network - nodes are posts, edges are semantics of connections. Normal network analysis: identify hub topics or hub posts. Who authored these posts? (In our case studies, the hub posts were questions).

Social network: tells you if there are sub-groups of learners that are not talking to each other.

Outdegree = measure of a user's activity - you created a lot of connections pointing to others.
Indegree = indirect measure of the relevance of a user's posts - how many connections have been made to posts authored by you.
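
A small sketch of how these measures could be computed from Cohere-style connection data (hypothetical post IDs, authors, and link types; networkx stands in here for whatever Cohere actually uses):

```python
from collections import Counter
import networkx as nx  # third-party: pip install networkx

# Hypothetical data: (source post, target post, semantic link type),
# plus who authored each post.
connections = [
    ("p1", "p2", "supports"),
    ("p1", "p3", "challenges"),
    ("p4", "p2", "builds on"),
]
author = {"p1": "alice", "p2": "bob", "p3": "carol", "p4": "alice"}

# Concept network: nodes are posts, edges carry the semantics of the connection.
G = nx.DiGraph()
for src, dst, rel in connections:
    G.add_edge(src, dst, semantic=rel)

# Aggregate per author: outdegree ~ activity, indegree ~ relevance of their posts.
activity, relevance = Counter(), Counter()
for post in G.nodes:
    activity[author[post]] += G.out_degree(post)
    relevance[author[post]] += G.in_degree(post)

print(activity)   # alice created the most connections pointing to others
print(relevance)  # bob's post attracted the most incoming connections
```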

We are interested in the rhetorical role that a user's contribution plays in a document or conversation, and in the nature of its connections to other contributions via semantic relationships.

Future:

Dan Suthers: We did this in the 1990s; the problem we ran into was the reliability of learner self-categorization. Often, everyone would just choose the category at the top of the list.


The learner needs to see the value of using these tools.

This builds on Dan Suthers' work, Scardamalia and Bereiter's work, etc. It's a challenge for learners.

This is how you make your thinking visible - if this is being assessed, that might be an incentive.


Learning Analytics and Exploratory Dialogue - Simon Buckingham Shum

Hours of material - how can LA help spot critical, knowledge-building discourse?

How many points in the webinar triggered learning/knowledge building?

Text chat is very challenging, because it comes in fragments.

Data source: OU online conference.

3 kinds of talk - disputational, cumulative, and exploratory.

Comes from Mercer (2004), "Sociocultural discourse analysis" (Journal of Applied Linguistics), studying children in classrooms. http://politicaltheology.com/index.php/JAL/article/viewArticle/1443


Indicators of exploratory talk?

Identified 94(?) indicators. Some of the obvious ones are quite misleading.
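
As a concrete (and deliberately naive) illustration of indicator matching - my own sketch, not the project's actual classifier - one could flag chat fragments containing candidate exploratory-talk cues:

```python
# Illustrative surface cues for exploratory talk (not the study's real indicators).
CUES = ["because", "what if", "i agree but", "have you thought", "my evidence"]

def flag_exploratory(messages):
    """Return chat fragments containing at least one candidate cue phrase."""
    return [m for m in messages if any(cue in m.lower() for cue in CUES)]

chat = [
    "lol",
    "I agree but what about informal learning?",
    "because the sample is tiny, I wouldn't trust that number",
]
print(flag_exploratory(chat))  # flags the last two fragments
```

A cue like "because" will also fire on plenty of non-exploratory chat, which is presumably why the obvious indicators turn out to be misleading.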


Future research needed to...

Stian Håklev, March 3, 2011, Toronto, Canada