This week is all about tools and techniques in the LAK12 class. Development is rapid right now, with many tools being built and used. Some are open source; others are complete commercial suites.
Different kinds of analysis call for different tools. Discourse analysis has been done manually by researchers for many years and has a strong base of methods and techniques; it can draw on qualitative analysis tools or visualizations (Many Eyes).
Social network analysis dates back to the 1960s. Similar techniques are now being applied with tools such as SNAPP and Gephi.
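As a toy illustration of the kind of measure these tools compute, here is degree centrality over a made-up forum reply network (the participants and replies are invented; SNAPP derives the real network from LMS discussion threads):

```python
# Made-up forum "who replied to whom" pairs (replier, original poster).
replies = [
    ("ana", "ben"), ("carl", "ben"), ("dana", "ben"),
    ("ben", "ana"), ("carl", "ana"), ("ana", "carl"),
]

# Build an undirected neighbor map: who each participant exchanged replies with.
neighbors = {}
for a, b in replies:
    neighbors.setdefault(a, set()).add(b)
    neighbors.setdefault(b, set()).add(a)

# Degree centrality: distinct contacts, normalized by the other n-1 participants.
n = len(neighbors)
centrality = {p: len(nb) / (n - 1) for p, nb in neighbors.items()}
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```

In this toy network, ben is the hub everyone replied to, and dana sits on the periphery -- exactly the pattern SNAPP's diagrams make visible at a glance.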
Social media analytics tools deal with sentiment around products and brands; examples include Omniture, SAS, and Radian6.
Search engines can detect a flu outbreak before the experts do (Google Flu Trends).
Mining global data patterns to predict changes in human wellbeing, especially in vulnerable populations, is the aim of other new projects (Global Pulse).
David Dornan has given permission to use his Word doc on learning analytics tools and techniques. I will post a link to it in my Google Docs later.
I'm looking forward to hearing Shane Dawson on social network analysis using SNAPP today.
Learning Analytics
Monday, March 19, 2012
MOOC LAK12 ending
Well, the MOOC Learning Analytics and Knowledge 2012 course has just ended. I would love to have an ongoing weekly list of readings and interactive sessions with interesting participants and presenters. What a treat it has been.
The course did a good job (thank you, George Siemens!) of introducing learning analytics, differentiating it from big data and academic analytics, and giving up-to-the-minute information on different types of LA deployments in education.
Analytics in education has a diverse background, now involving computer scientists, psychologists, and statisticians working closely together on various projects. These stages of implementation were all discussed:
*Extracting and analyzing data from learning management systems
*Building an analytics matrix that incorporates data from multiple sources (social media, LMS, student information systems, etc.)
*Profile or model development of individual learners (across the analytics matrix)
*Predictive analytics: identifying at-risk learners
*Automated intervention and adaptive analytics: the learner model should be updated rapidly to reflect near real-time learner success and activity, so that decisions are not made on outdated models
*Development of "intelligent curriculum" where learning content is semantically defined
*Personalization and adaptation of learning based on intelligent curriculum, where content, activities, and social connections can be presented to each learner based on her profile or existing knowledge
*Advanced assessment: comparing learner profile with architecture of knowledge in a domain for grading or assessment (see the image below taken from the article previously discussed, Penetrating the Fog).
Assessment through analytics in the article is presented this way:
This is an exciting concept: visualizing the data -- about what is learned and how, and about the learner -- and using that data to enrich the teaching and learning process. LA seems to be used mostly for assessing and helping at-risk students in traditional high school or college course settings. Although this may be useful, I see the power of LA in giving the interested learner data about self, about others, and about the learning s/he wishes to accomplish. Learning how to learn, strengthening awareness of one's own learning dimensions, and receiving recommendations for ways to interact with knowledge and experts in one's field can all be fine-tuned to learner preferences and learning goals.

This is especially interesting for the field sometimes referred to as "lifelong learning." Learning is what we do all our lives, yet school often does not address how to learn. That is unfortunate, because it is the most useful skill of all. Once one knows how to learn, one can learn what is needed for new jobs, new opportunities, new hobbies or interests, exactly when one needs it (just-in-time learning). We know this is the way of the world now and in the future, and we need to prepare our learners, not spoon-feed them in traditional classroom settings.
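As a sketch of the predictive-analytics stage listed above -- the signals, weights, and cutoff here are invented for illustration, not taken from any real system:

```python
# Toy at-risk predictor: combine a few LMS signals into one risk score.
# The chosen signals, weights, and threshold are illustrative only.
def risk_score(logins_per_week, submitted, due, forum_posts):
    submission_rate = submitted / due if due else 1.0
    score = 0.0
    if logins_per_week < 2:
        score += 0.4   # rarely logs in
    if submission_rate < 0.5:
        score += 0.4   # behind on assignments
    if forum_posts == 0:
        score += 0.2   # socially invisible
    return score

def at_risk(score, threshold=0.5):
    return score >= threshold

quiet = risk_score(logins_per_week=1, submitted=1, due=4, forum_posts=0)
active = risk_score(logins_per_week=5, submitted=4, due=4, forum_posts=3)
print(quiet, at_risk(quiet))    # 1.0 True
print(active, at_risk(active))  # 0.0 False
```

Even this crude sketch shows the concern raised later in the post: it flags behaviors (logins, submissions, posts) without understanding what they mean for actual learning.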
Tuesday, March 13, 2012
Simon's presentation to NLC LA: Dream, Nightmare, Fairydust?
Thank you Simon for the cautions on the future of learning analytics: will it be dream, nightmare, or fairydust?
I find the metaphor of the learner dashboard as a mirror for reflecting on one's own capacities to be a key component of intelligent learning analytics. When we look at the EDUCAUSE chart from George Siemens's paper "Penetrating the Fog" on academic analytics versus learning analytics, the key difference seems to be the stakeholders. Academic analytics stakeholders are administrators, funders, marketing, governments, and education authorities. Learning analytics stakeholders are learners and faculty. I would argue that further differentiation is necessary to define learning analytics of benefit to learners alone.
Predictive models help identify patterns in learners, curricula, and study success. Purdue's Signals gives students and faculty real-time traffic lights. Faculty may be able to identify at-risk students sooner, or pinpoint students who may be ready to drop out, and intervene. Analytics for faculty may also help with curriculum redesign, rethinking course sequences, and other scenarios that eventually benefit the student.
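Signals' actual model is proprietary; the bands below are made up, but they sketch the traffic-light idea of translating a risk estimate into an at-a-glance signal:

```python
def traffic_light(risk):
    """Map a 0-1 risk estimate to a Signals-style light (bands invented)."""
    if risk < 0.33:
        return "green"   # on track
    if risk < 0.66:
        return "yellow"  # worth a check-in
    return "red"         # intervene now

print([traffic_light(r) for r in (0.1, 0.5, 0.9)])  # ['green', 'yellow', 'red']
```

The design choice worth noticing is the deliberate loss of detail: a continuous prediction is collapsed into three colors so that a student or instructor can act on it without reading a statistics report.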
What does the above have to do with "...their appetite to know and their capacity to learn"? (Livingstone, 1941) We want our learners, today and in the future, to know how to learn what they want or need to learn when it is needed, to be creative and innovative, to collaborate, communicate, problem-solve, and be information literate. The trouble with learning management systems and other computing systems is that they connect analytics to student behaviors without understanding the meaning of those behaviors, and to student achievement, not necessarily to student learning, capacity to learn, or curiosity. Many of these systems aim to identify the at-risk, provide support for students who need it, and increase student success rates. These goals are all aimed at achievement, which we as educators know is not necessarily learning. The behaviorist philosophy of the past century defined learning as an observable change in behavior; it is now accepted to view learning as changes in our mental associations that come through experience, and as such it is not always observable. Learning how to learn whatever one needs to learn, and the capacity to conduct inquiry or solve problems, are not necessarily supported by the analytic systems described above.
Learning analytics systems that empower learners to learn about themselves, reflect on their own learning, pursue authentic inquiries, and build learning dispositions and capacities will most likely be the most effective "mirror dashboard" of your talk.
The ELLI is a self-report questionnaire that taps unobservable but personally relevant learning dimensions. Its visual spider diagram gives feedback on the present and a record of changes over time. Awareness of learning dispositions, strategies for moving forward, and the spider diagram itself have been shown to have a significant effect on learners' learning. ELLI within a social network, supplemented with mentoring, may be the vision we need to deal with our complex world.
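ELLI's actual scoring isn't detailed in this post; as a rough sketch, each spider-diagram radius could be a dimension's mean Likert response scaled to 0-1 (the dimension names follow the published ELLI framework, but the item responses are invented):

```python
# Invented Likert responses (1-5), grouped by ELLI learning dimension.
responses = {
    "critical curiosity": [4, 5, 3],
    "resilience": [2, 3, 2],
    "meaning making": [5, 4, 4],
}

# One spider-diagram radius per dimension: mean response scaled to 0-1.
radii = {dim: (sum(vals) / len(vals)) / 5 for dim, vals in responses.items()}
print({d: round(r, 2) for d, r in radii.items()})
```

The point of the spider shape is comparison across dimensions and over time: a learner retaking the questionnaire sees the resilience radius grow (or shrink) relative to their own earlier profile, not relative to a class average.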
Monday, March 12, 2012
LA: Dream, Nightmare, or Fairydust? S. Buckingham Shum Hot Seat
Simon Buckingham Shum on Learning Analytics: Dream, Nightmare, or Fairydust?
Networked Learning Conference 2012 Hot Seat
a weeklong discussion (March 12-18)
Simon will be presenting LA: "what are the implications for learning?"
from his abstract:
"...Then there are those of a more cautious nature. So what if we have shedloads of data? Now we can drown faster. Learning, enquiry, argumentation, sensemaking, scholarship, insight — these skills are of an entirely different order, the highest forms of meaning-making, uniquely human. And what have analytics to say about the less tangible 21st Century skills that we need to nurture if the next generation is to manage the unprecedented complexity and uncertainty that they will inherit from us? Surely data analytics have nothing to say about intrinsic disposition to learn, emotional resilience in the face of adversity, the ability to moderate a discussion, resolve conflict, or ask critical questions? Finally, who is in control of analytics: are they tools to study learners, or tools to place in their hands, to create reflective, more agile individuals and collectives?
Analytics may in time come to be used to judge you — as a learner, an educator, or your institution. The challenge for us is to debate what it means for this new breed of performance indicators to have pedagogical and ethical integrity. What can and should we do, and what are the limits? Do they advance what we consider to be important in learning, teaching, and what it means to be a higher education institution in the 21st Century?
Are you thinking Dream, Nightmare, or Fairydust?"
The use of technology to try to find what we hope are significant patterns around learners. A dream technology for tracking; a dumbed-down, reductionist nightmare; or tech vendors spotting a new market and making overblown claims (fairydust).
A new analytics platform
"Some have tried to argue that this technology doesn't work out cost effectively when compared to conventional tests...but this misses a huge point. More often than not, we test after the event and discover the problem--but this is too late."
We are, of course, talking about aquarium analytics -- revolutionary! It will continuously track problems before they affect the fish -- an assured outcome of predictive software.
If we change a few key words, does this sound like your learning ecosystem? Maybe we are talking about real students.
Simon's neighbor Mark asked for help installing the fish software -- arrows, etc. -- an exciting sense of control. But you still need to know what 'good' looks like.
SoLAR's definition of LA (2nd International Conference, 2012):
"Learning analytics is concerned with the collection, analysis and reporting of data about learning in a range of contexts, including informal learning, academic institutions, and the workplace. It informs and provides input for action to support and enhance learning experiences, and the success of learners."
Again, discussion of the distinction between learning and academic analytics. George Siemens and Phil Long's chart from Educause article shows different stakeholders--who are interested in different outcomes and looking for answers to different questions. Maybe in the future, these will merge. LA is newer and more concerned with the micro-interaction of learners. Success levels and process data about how learners are doing may be important to the academic analytics stakeholders.
Simon urges reflection on three aspects of learning analytics. Power: who is in control, and who gets to see the data? We have to ask. Principles: ethical principles around the mining of data from different sources. Pedagogy: we need to ask questions about what kind of learning we need, not just be thrilled to use the technology.
DREAM, NIGHTMARE, FAIRYDUST
Companies are coming in from the business intelligence sector, and they don't seem to know anything about learning or how language affects learning (for example, IBM Watson). We are sharing personal data: the Quantified Self conference is people sharing health data about their lives (iPhone Location Data Visualization). Education is in the sights of these businesses, but how will universities and others respond to this new potential? What will the learner's response be? Will it be forced or voluntary, and how much of their own data will learners see?
Within the enterprise world, social analytics are becoming a commodity service: who has the best reputation in their environment?
OU -- FlashMeeting: sufficient context is needed when we read off various analytics (FlashMeeting foreign-language mentoring).
OU -- Predictive modeling: predictive modeling helps us identify patterns of student success that vary between student groups, areas of curriculum, and study methods.
Benefits:
provides a more robust comparison of module pass rates
supports the institution in identifying aspects of good performance that can be shared, and aspects where improvement could be realized (OU student statistics and surveys team, Institute of Educational Technology)
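The OU team's methodology isn't given here, but one hedged sketch of a "more robust comparison of module pass rates" is to attach a confidence interval to each rate instead of comparing raw percentages (Wilson score interval; the module counts are invented):

```python
import math

def wilson_interval(passes, enrolled, z=1.96):
    """95% Wilson score confidence interval for a pass rate."""
    p = passes / enrolled
    denom = 1 + z**2 / enrolled
    center = (p + z**2 / (2 * enrolled)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / enrolled + z**2 / (4 * enrolled**2))
    return center - half, center + half

# Invented module data: (passes, enrolled). Both show a 90% raw pass rate,
# but the small module's estimate is far less certain.
for name, (p, n) in {"M101": (180, 200), "M202": (9, 10)}.items():
    lo, hi = wilson_interval(p, n)
    print(f"{name}: {p/n:.0%} pass rate, 95% CI ({lo:.0%}, {hi:.0%})")
```

Two modules with identical raw pass rates can carry very different evidence; the interval width makes that visible before anyone declares one module "better."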
All of the above is still within traditional pedagogy -- what about the learning revolution and education reform?
"We are preparing students for jobs that do not exist yet, that will use technologies that have not been invented yet, in order to solve problems that are not even problems yet." "Shift Happens" http://shifthappens.wikispaces.com
What kinds of capacities will be needed to deal with this complex world that is here now and is coming?
"The test of successful education is not the amount of knowledge that pupils take away from school, but their appetite to know and their capacity to learn." Sir Richard Livingstone, 1941
How does LA figure into these less tangible pieces? There are tectonic shifts in the learning landscape: technology moving fast (cloud, real time); the free and open movement (free is increasingly expected; you might pay later); social learning (innovation now depends on it, and knowledge is barely codified before it is out of date); changing values (autonomy, diversity, self-expression, and participation are becoming more and more important, and learners need to know how what they are learning is relevant to their own lives); and the post-industrial shift (new institutional roles in post-industrial education).
All of these are indeed tectonic shifts. They are PROFOUND changes in power, relationships, economics, and our very infrastructure, and they must figure into our conception of the future of learning. We must consider them in reshaping learning and education, and in the future of learning analytics. How can LA work with C21 skills, learning to learn, and authentic inquiry?

My past educational practice has been with C21 skills, learning to learn, inquiry-based programs, and project-based learning, and I have seen students labeled "low" and "unmotivated" rise to the occasion: they become learners, even experts who give talks for the rest of the class, assist other students in their areas of expertise, develop a deep curiosity, and are excited to learn, thirsty to find out more, and willing to do more than is usually "expected" of students at their levels. These students voluntarily give up breaks, lunches, and recesses when completely immersed in "work" they care about. Isn't this the scene we want in our schools and homes? Aren't these the kinds of students we want to encourage?

I'm so glad to hear that others see learning analytics as more than counting the number of websites visited and posts written within an LMS. Good students have learned the game of school: they know what their teachers are looking for and they give it to them, and if they are really good, they learn in spite of the chains of traditional schooling. Other students don't know the game, don't know the rules, and couldn't care less -- but they can be motivated IF they are given authentic tasks, control over their own learning, and so on.
Now we need to define -- and I need to define specifically -- how learning analytics can be used to assess learning to learn and students' engagement in authentic inquiry. How can LA contribute to social capital, critical questioning, learning argumentation, citizenship, habits of mind, resilience, collaboration, creativity, metacognition, identity, readiness, sensemaking, engagement, motivation, and emotional intelligence? These process-oriented aspects of learning are what I deem THE IMPORTANT factors in learning.
Simon introduced things that he thinks will be important:
ELLI: Effective Lifelong Learning Inventory
Friday, March 2, 2012
Student success vs. Success as a student
First, let's look at being a successful student (success as a student).
If you do web searches on being a successful student, you will find numerous sites that give advice on the topic. Many say that students must focus, prioritize, sleep, and participate. (http://www.scholarshipexperts.com/college-life/good-student.jsp) Some go into more detail, explaining that successful students attend all of their classes, get to class on time, are respectful of other students, don't chat with their friends during class, come to class prepared, take advantage of extra-credit opportunities, and make sure their assignments are in on time and are neat and professional looking. (http://www.santarosa.edu/lifesciences2/success.htm) Some sites emphasize skills that are necessary to be a good student: self-discipline, study skills, time management, keeping to deadlines, learning what the instructor wants, and basically complying with the instructor. (http://courseware.ee.calpoly.edu/~jbreiten/htbas.html)
Now, all of the above skills are great--TO BE A GOOD STUDENT. But, do we want our students to be good students at the expense of LEARNING, UNDERSTANDING, PROBLEM SOLVING, BEING CREATIVE, AND THINKING? These are different priorities.
Student success is not the same thing as success as a student. One student put it this way:
"I have no clue about what I want to do with my life: I have no interests because I saw every subject of study as work, and I excelled at every subject just for the purpose of excelling, not learning. And quite frankly, now I'm scared." -- "Here I Stand", Erica Goldson, June 25, 2010
http://americaviaerica.blogspot.com/2010/07/coxsackie-athens-valedicatorian-speech.html
I have been a lifelong educator, teaching at all levels from primary through graduate school. In all cases, I've talked with students about school as a game. This game has rules, and if you don't play by the rules (get assignments in on time, pass exams, complete the assignments that the teacher deems necessary), you do not succeed. If you do these things, you can get good grades on your report cards and good teacher recommendations. However, you don't necessarily learn or understand. In fact, I feel it is entirely possible to go through school with good grades and yet not really THINK about what you are learning, not question, not engage in dialogue with others, including your peers, about the topics you are studying, or really think critically in any way about the subject of study. Do teachers think of the persons in their classes as students or learners? Isn't there a different model for learners to learn? What would learners do differently, beyond learning time management, neatness, and compliance with teachers' wishes?
Now that we've looked at the difference between students and learners, we need to consider what learning analytics really is. Note the first definition:
"Learning analytics in the academic domain is focused on the learner, gathering data from course management and student information systems in order to manage student success, including early warning processes where a need for interventions may be warranted."
--Analytics in Higher Education: Establishing a Common Language, Angela van Barneveld, Kimberly E. Arnold, and John P. Campbell, ELI Paper 1:2012, January 2012
This definition from an ELI paper in 2012 seems reductive, stifling, and more interested in using analytics to check on students and student skills such as getting assignments in on time and passing exams. Are these not important in a class? Of course they are. BUT, are they the thing we want to measure and assist? Or do we want to look at students' contributions to discussion and dialogue, their ability to ask good questions, and their desire to collaborate rather than compete (which is what traditional school structures encourage)?
David Warlick has contributed a useful chart that defines Student and Learner Qualities. http://davidwarlick.com/2cents/?p=2762
Warlick contrasts the student's relationship to educators--that of an employee who obediently follows directions--with learners, who are citizens of the learning society. He describes students' motivation as obligatory, whereas learners feel responsible for the value of their work. The "why" is interesting: students are compelled; learners are curious. Isn't curiosity a more valuable character trait to assess and encourage than compliance?
Warlick on assessment:
"For the student, assessment is king, in very much the same way that quality control is such a critical part of the manufacturing processes. But assessment, for learners, is much less obvious, and at the same time, it is much more integral to the learning. Assessment for classrooms of learners is the enormous amounts of qualitative data that is collected by the teacher (and other students) on a minute-by-minute bases."
This is another important difference. We look to see what learners actually do with what they have learned, not that they can simply recapitulate it all on a test by bubbling in the correct A, B, C, or D.
So, what do we want learning analytics to measure and encourage? What we measure should be what is important; the very act of measuring places priority on that quality and has the effect of incentivizing, encouraging, and celebrating success in that quality.
Let's not let learning analytics be confined to helping students learn to be students. Let's work on refining analytics design so that analytics becomes part of the teaching-learning process that helps students become learners: lifelong learners who know how to learn whatever they need to know when they need it (just-in-time learning); learners who think critically, are curious, know how to find out about the things they want to know, become passionate about them, and engage in intellectual discourse with peers, teachers, and experts in the field they are learning about.
Criticisms of Learning Analytics
Gardner Campbell, Virginia Tech www.gardenercampbell.net
Criticisms of Educational Experience in Learning Analytics
Moving seamlessly between areas is not trivial--the question of domain crossing, what Steven Johnson calls the Adjacent Possible.
Danger:
•increased by reductive learning analytics
•we lock ourselves into a model that will take us backwards
"Perhaps the final claimant for the title of ultimate theory of the universe is M-theory.... M-theory is not simple. You can't print it on a T-shirt... M-theory is not a single theory. It is a collection of theories. Hawking describes them as a "family of theories". Each member of the family is a good description of observations in some range of physical situations, but none is a good description of observations in all physical situations. None can account for 'everything'.... No single flat map is a good representation of the Earth's surface. Just so, no single theory is a good representation of all observations."
Kitty Ferguson, Stephen Hawking: An Unfettered Mind
Analogy to learning analytics? Why is this relevant today? This is a conclusion we've reached, provisionally, about the cosmos--probably the second most complex thing in our universe. The most complex thing in the universe is our brains. Why would we think that any approach to learning would be any less complicated? How do we get at measuring this? Carefully, with a degree of humility. We will inevitably reach for some sort of reduction to be able to understand, to wrap our heads around it. Minds: trillions of interconnections among neurons.
Our Shared Reality, 2012
"We are living in the middle of the largest increase in expressive capability in the history of the human race."
Clay Shirky, Here Comes Everybody
Siri now. This suggests the scale of what we are trying to map. Any attempt at quantifying or qualifying has to answer to this scale. We can't organize ourselves just around what we can measure.
"Learning analytics in the academic domain is focused on the learner, gathering data from course management and student information systems in order to manage student success, including early warning processes where a need for interventions may be warranted."
--Analytics in Higher Education: Establishing a Common Language, Angela van Barneveld, Kimberly E. Arnold, and John P. Campbell, ELI Paper 1:2012, January 2012
ACKKK!! This is much closer to Cartesian coordinates than it is to M-theory and expressive capability.
TED talk: Sebastian Seung: I am my connectome
Course management systems have nothing to do with the expressive nature of the self--nothing to do with identity or with the complexity of being human. This should be at the heart of what we talk about when we talk about learning or understanding. This is about schooling, which is very different from learning. Much schooling is simply transactional. We build industrial models of education, assembly lines that cannot be stopped. We measure the cosmos from Flatland. The structures we are now quantifying via LA are a set of phenomena that are just about getting from one end to the other. Monkey see, monkey do: writing down what the teacher writes on the blackboard.
Gardner gives us four strong cautions with regard to LA:
•"Student Success"
•Complexity
•Points of "Intervention"
•The "Third Wave"
Student success is not the same thing as success as a student. Recall the student quote:
"I have no clue about what I want to do with my life: I have no interests because I saw every subject of study as work, and I excelled at every subject just for the purpose of excelling, not learning. And quite frankly, now I'm scared." --"Here I Stand," Erica Goldson, June 25, 2010
http://americaviaerica.blogspot.com/2010/07/coxsackie-athens-valedicatorian-speech.html
I have been a lifelong educator. Yet this quote is no surprise to me. I have always taught my students that school is a game with rules, that you have to play by the rules, and that you then need to learn on top of that. I told my son as he was going through school that he needed to excel and do well to get to each successive step, BUT that he MUST learn IN SPITE of the system. People can easily go through the system and get all A's, excellents, and great recommendations, but not learn, or learn to learn, or learn to think, or learn to be creative at life, liberty, the pursuit of happiness, and problem solving.
If we apply the reductive model of analytics, we will have our top students, top engineers, etc.--a set of students who finish school not even sure what they want to do with their lives.
Standards and rubrics don't seem to get to the deeper pieces of learning. Standards are set to map cognition onto a Cartesian coordinate plane. The factory approach puts even the well-meaning people to sleep. Cognition exists in the design of the analytic systems as well.
reddit.com check out the FAQs
I want to learn. Can we look not just at student compliance but at student contributions--not just what they write in answer to a prompt, but how they link to the world outside? Forum: IWTL (I Want To Learn). This is a self-organizing system--not like a learning management system. Prompt: "You learn something new every day; what did you learn today? Submit interesting and specific facts that you just found out (not broad information you looked up; TodayILearned is not /r/wikipedia)."
What a great forum! What an enlightened place for students, learners, users, whatever we want to call them. I became an educator because I was unhappy with the system that existed. I tried to make changes constantly so that students, learners, children, all could LEARN, UNDERSTAND, THINK. I agree completely with Gardner's problems with learning management systems. I'm glad that this conversation is finally coming up within the LAK class.
John Naughton, From Gutenberg to Zuckerberg: What You Really Need to Know about the Internet (2012) --
Complexity is the new reality.
•non-linear, unpredictable--butterfly effect
•feedback matters--a lot--there must be loops that feed the complexity
•systems demonstrate self-organization (like reddit.com above)
•EMERGENCE--synergies, new phenomena--then we have to learn about the new phenomena and engage in double-loop learning. How do we deal with paradoxes?
What is double-loop learning? Single-loop success is defined merely as a positive outcome "relative to pre-established targets." Double-loop learning asks learners to also "reflect on the appropriateness, in the light of unfolding events, of the assumptions (the mental model) used to set up those actions and targets" (Naughton). Most analytics mean little more than positive outcomes. A double-loop design would allow a true analytics system that could itself learn--adapt to changes, reflect, and make appropriate change (just as math is not equal to calculating).
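The contrast can be sketched in a few lines. This is entirely my own toy illustration, not anything from Naughton: the outcome function, learning rate, and tolerance are invented for demonstration. Single-loop learning only adjusts the action toward a fixed target; double-loop learning also revises the target itself when the gap suggests the underlying assumptions are wrong.

```python
def single_loop(outcome, action, target, rate=0.5):
    """Single-loop: adjust the action toward a fixed target."""
    return action + rate * (target - outcome(action))

def double_loop(outcome, action, target, rate=0.5, tolerance=5.0):
    """Double-loop: when the gap is large, question the target itself
    (the mental model) before adjusting the action."""
    gap = target - outcome(action)
    if abs(gap) > tolerance:
        # Assumptions look wrong: revise the target, not just the action.
        target = outcome(action) + gap / 2
    return action + rate * (target - outcome(action)), target
```

With a trivial outcome function (the action simply is the outcome), an action of 0 and a target of 10, the single loop moves the action halfway toward the fixed target, while the double loop first halves the target it was assuming, then adjusts.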
Student success is not the same thing as success as a student.
Complexity--set up a target, and then figure out whether the target has remained the same.
Intervention--typically LA intervenes when it predicts a risk of failure. But learning doesn't proceed from confusion > lightbulb > clarity, or from thesis and antithesis resulting in synthesis. It is instead a set of nodes and connections between them, like a concept map--a truer picture of learning.
An LA system of suitable complexity would intervene when the student begins to understand something, and validate it: what you are thinking is interesting, you are on the edge of finding out, you are on the right path here. Building the system around interventions at the point of about-to-fail is not as interesting as intervening at the point of about-to-understand.
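As a thought experiment, if a learner's state were represented as a concept map, an "about to understand" trigger might look for concepts that are almost, but not fully, connected. The function below is a hypothetical sketch: the name, the expected-degree idea, and the 0.75 threshold are all invented for illustration, not part of any real LA system.

```python
def almost_integrated(edges, expected_degree, threshold=0.75):
    """Flag concepts that are nearly connected into the learner's map:
    the 'about to understand' moment, rather than the 'about to fail' one.
    edges: (concept, concept) connections the learner has already made.
    expected_degree: how connected each concept is in the full map."""
    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    flagged = []
    for concept, expected in expected_degree.items():
        ratio = degree.get(concept, 0) / expected
        if threshold <= ratio < 1.0:  # high but incomplete connectivity
            flagged.append(concept)
    return flagged
```

A learner who has linked "force" to three of its four expected neighbors would be flagged as on the edge of finding out, while a fully connected concept needs no intervention at all.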
connectome
Waves (per ELI director Malcolm Brown): the first wave was the integration of the LMS into learning; the third wave is the integration of social media.
Have we really integrated social media--reddit, Twitter, blogs? If we had engaged wide-open social media, we would see how complex learning really is. Let's question the third wave, because we're routing around complexity. What is the goal?
A quasi-demo of this kind of system: www.wdyl.com (What Do You Love?)--the icon for a search is a heart. Put in "learning."
It is framed around invitations that begin in delight and end in wisdom. The Google page is like a project-based, inquiry-based prompt.
more coming soon
Understand the complexity of learning. There is a healthy humility about what we don't know--it measures curiosity but also arouses it.
Free-range student writing--auto-ethnography ("so that sparked my curiosity," a link to a website)--the student is giving information about what the student is doing within the learning environment.
How do you measure wisdom? Chris Dede wrote an article about wisdom in learning; he pointed to A Handbook of Wisdom (Cambridge University Press, 2005). Several strands of research are now devoted to wisdom: identifying cross-cultural consensus, traditional learning models and the things they don't include, and awareness that within each culture's reporting of self-awareness there is a strong strand of wisdom. Wisdom is a core human value. It has to do with learning, but not just learning, and with doing, but not just doing--why not ask questions about the process? A small-pieces-loosely-joined web presents an interesting possibility for connections that would increase the chances we could find wisdom in our collective conversation.
look up other references:
"Paradox of the Active User", John Carroll & Mary Beth Rosson, 1987
(Is Effective learning possible?)
John Naughton, From Gutenberg to Zuckerberg: What you Really Need to Know about the Internet (2012)
World Wide Mind, Michael Chorost
Wednesday, February 29, 2012
Multi-level analysis of distributed learning, uptake
Dan Suthers, A unified framework for multi-level analysis of distributed learning LAK11: http://blip.tv/solaresearch/8_dan_suthers-5692149
Dan works mostly in computer-supported collaborative learning. This work came out of analyzing interactions in small groups--face to face, online, across multiple media--and trying to put it all together. He also supported an online community of teachers (TappedIn). Representation of the data and analysis is what enabled the scaling up.
Many theories address how learning happens in social settings:
1. The individual is stimulated by the social setting--from the social as stimulus to the social entity as learning agent
2. Castells's networked individualism
3. Jeremy Roschelle's maintaining a joint conception of the problem
4. How much coupling is needed between learners
5. Diffusion of innovations
6. Knowledge building
What they all have in common: contact between people must occur for learning to happen.
Interaction--not necessarily conversation; perhaps "uptake" is the better concept.
UPTAKE: the act of an actor taking something someone has done before as relevant for their own ongoing activity (downloading files, etc.)
evidenced by what we can see directly-people's actions in their environment
Analytic Challenges
interaction between the individual, small group and collective agency
requires multiple level analysis
Practical challenge: studying learners working in a whiteboard and a threaded discussion--interaction is spread or distributed over multiple media, and we must put it back together.
traces of activity are fragmented-how to create the whole in the analysis?
Ties are not at all like log data--we want to unpack what the ties mean.
want to put together this trace of interaction
System: collects things together into a single annotated artifact and analyzes it.
Using a database or log files (http), we want to figure out what is happening.
1. Understand what entities we want to see and the relationships between them (e.g., a threaded discussion has some features to illustrate).
2. The granularity at which things are recorded may differ from the granularity of what you want to look at.
A message may have three log entries for it, plus the threading relationship.
Person 2 has written msg 2 and msg 3 and then posts something--now we have a more abstract version, a transcript rather than log files. Wikis, etc.--unify the record of different media.
3. What about interaction? Adjacency carries an assumption of interaction.
4. Construct contingency graphs--identify empirical relationships between events that collectively evidence uptake.
They are called contingencies after Garfinkel's "contingently achieved accomplishments"--how actors draw on the evolving context.
These are not proof of interaction, but when people do things they draw on the resources of their environments.
5. did by hand, now figuring out how to automate:
contingencies: media dependency
6. contingencies: Lexical or Semantic Overlap: also contingencies between the read events and writing of messages --events that are near each other may be related, events by the same actor,
for example, reuse of noun phrases ( contingency graph showing contextual action mode) graph of entity-relations of discussion 1 and all actors.
7. end up with: Contingency Graph as Contextualized Action Model
•analytically relevant manifest relationships between the actor's actions and other events that have been recorded
•Next:raise the analytic level of description to latent relationships and higher order structures.
8. be selective in what contingencies you put in--could become pretty complex
all above are data structures that are manipulated computationally.
•interpret collections or subgraphs of contingencies as corroborating evidence for uptake
•supports sequential analysis of interaction
Uptake Graph--an Interaction Model
possible automated way to find the uptake in discussions-- way to find the potential for learning- a structure you can look for in the data structures
micro analysis of transactions- manually, now trying to automate-especially highly interactive discussion
Next Layer--abstracting away from the sequentiality of the events-
affiliations of people through media-- Accociograms
directed affiliation network of actors and artifacts
mediation model: how actors' associations are mediated Latour
this largely factors out time--looks at mediated interactions--finds round trip between actors-- wouldn't see this in threaded discussion structure-round trips are important-dialogue is how learning happens in groups
Relationships
patterns of mediated associations reveal relationships
dialogue pattern-round trip
consumer pattern P3 reads everything P2 produces
Multi-media associations
characterize pairwise relationships in terms of distribution across media
compare roles of various media in supporting associations (suthers and Chus, networked learning, 2010)
cluster analysis
compare roles of media bridging between groups
transitive closure,
Ties--
straightforward to collapse into sociogram by transitive closure or similar computations
mediated associations
SNA methods can be applied to the sociograms
this framework allows potential automation of representation of data to do analyses on-- interpretation of analyses
multi-level analysis
Prior research
contingency graphs are used for:
microanalysis of process through which learners achieved an insight
semi-automated analyses of graph manipulations to find pivotal moments
currently applying this to TappedIn, longest running network of educators
latours idea-following the actors
Advantages of this framework:
as a data representation
integration of distributed data: uncloak distributed interaction
common format for reuse of algorithms
as an analytic framework
multi-level multi-theoretical analysis possible
multiple ontologies allow for mapping between interaction, mediated affiliation and tie levels of analysis
Dan works mostly in computer-supported collaborative learning. This work came out of analyzing interactions in small groups--face to face, online, across multiple media--and asking how to put those traces back together. He also supported an online community of educators (TappedIn). Representations of the data and analysis enabled the scaling up.
Many theories about how learning happens in social settings:
1. the individual is stimulated by the social setting--social as stimulus to the social entity as learning agent
2. Castells' networked individualism
3. Jeremy Roschelle's maintaining a joint conception of the problem
4. how much coupling is needed between learners
diffusion of innovations
knowledge building
they all have in common--contact between people must occur for learning to happen
interaction--not necessarily conversation--maybe Uptake is the better concept
UPTAKE--the act of an actor taking something someone else has done before as relevant for their own ongoing activity (downloading files, etc.)
evidenced by what we can see directly-people's actions in their environment
Analytic Challenges
interaction between the individual, small group and collective agency
requires multiple level analysis
practical challenge--studying learners working in a whiteboard and a threaded discussion--interaction is spread or distributed over multiple media--put it back together
traces of activity are fragmented-how to create the whole in the analysis?
ties--not at all like log data-want to unpack what the ties mean
want to put together this trace of interaction
system: one that collects traces together into a single annotated artifact and analyzes it
using database or log files (HTTP), we want to figure out what is happening
1. understand what entities we want to see and the relationship between them
i.e. a threaded discussion has some features to illustrate this
2. the granularity at which things are recorded may be different than the granularity of what you want to look at
a message may have 3 log entries for it, plus the threading relationship
person 2 has written msg 2 and written msg 3 and then posts something--now a more abstract version as a transcript rather than log files--wikis etc.--unify the record of different media
3. what about interaction? adjacency carries an assumption
4. construct contingency graphs-- identify empirical relationships between events that collectively evidence uptake
called contingencies after Garfinkel's "contingently achieved accomplishments" how actors draw on the evolving context.
not truly proof of interactions, when people do things they draw on the resources of their environments
5. did by hand, now figuring out how to automate:
contingencies: media dependency
6. contingencies: Lexical or Semantic Overlap: also contingencies between the read events and the writing of messages--events that are near each other may be related, events by the same actor
for example, reuse of noun phrases (contingency graph showing contextualized action model)--graph of entity relations of discussion 1 and all actors
7. end up with: Contingency Graph as Contextualized Action Model
•analytically relevant manifest relationships between the actor's actions and other events that have been recorded
•Next:raise the analytic level of description to latent relationships and higher order structures.
8. be selective in what contingencies you put in--could become pretty complex
all above are data structures that are manipulated computationally.
•interpret collections or subgraphs of contingencies as corroborating evidence for uptake
•supports sequential analysis of interaction
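The contingency-graph construction described in the numbered points above can be sketched in plain Python. This is a minimal illustration, not Suthers' actual system: the sample events, the word-token stand-in for noun-phrase reuse, and the `min_shared` threshold are all invented assumptions.

```python
from itertools import combinations

# Invented sample events: (id, actor, kind, text, time, reply_to).
EVENTS = [
    {"id": "m1", "actor": "P1", "kind": "post", "text": "graph model of uptake", "t": 1, "reply_to": None},
    {"id": "r1", "actor": "P2", "kind": "read", "text": "", "t": 2, "reply_to": "m1"},
    {"id": "m2", "actor": "P2", "kind": "post", "text": "extend the graph model", "t": 3, "reply_to": "m1"},
]

def lexical_overlap(a, b, min_shared=2):
    """Crude stand-in for noun-phrase reuse: count shared word tokens."""
    return len(set(a["text"].split()) & set(b["text"].split())) >= min_shared

def contingency_graph(events):
    """Return directed edges (earlier_event, later_event, contingency_type)."""
    edges = []
    for a, b in combinations(sorted(events, key=lambda e: e["t"]), 2):
        if b["reply_to"] == a["id"]:
            edges.append((a["id"], b["id"], "media dependency"))  # threading structure
        if a["actor"] == b["actor"]:
            edges.append((a["id"], b["id"], "same actor"))        # same actor acts again
        if a["text"] and b["text"] and lexical_overlap(a, b):
            edges.append((a["id"], b["id"], "lexical overlap"))   # reuse of wording
    return edges

print(contingency_graph(EVENTS))
```

As the talk stresses, these edges are evidence for uptake, not proof of interaction: collections of corroborating contingencies would then be interpreted at a higher analytic level.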
Uptake Graph--an Interaction Model
possible automated way to find the uptake in discussions-- way to find the potential for learning- a structure you can look for in the data structures
micro analysis of transactions- manually, now trying to automate-especially highly interactive discussion
Next Layer--abstracting away from the sequentiality of the events-
affiliations of people through media--associograms
directed affiliation network of actors and artifacts
mediation model: how actors' associations are mediated Latour
this largely factors out time--looks at mediated interactions--finds round trip between actors-- wouldn't see this in threaded discussion structure-round trips are important-dialogue is how learning happens in groups
Relationships
patterns of mediated associations reveal relationships
dialogue pattern: round trip
consumer pattern: P3 reads everything P2 produces
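The dialogue and consumer patterns above can be sketched by classifying who-reads-whom relations. The read events and pair labels here are illustrative assumptions; a round trip (each actor takes up the other's contributions) counts as dialogue, one-way reading as consumption.

```python
from collections import defaultdict

# Invented read events: (reader, author whose message was read).
READS = [
    ("P2", "P1"), ("P1", "P2"),   # P1 and P2 read each other: a round trip
    ("P3", "P2"), ("P3", "P2"),   # P3 only consumes what P2 produces
]

def classify_pairs(reads):
    """Label each actor pair 'dialogue' (round trip) or 'consumer' (one-way)."""
    directed = defaultdict(int)
    for reader, author in reads:
        directed[(reader, author)] += 1
    labels = {}
    for a, b in {tuple(sorted(pair)) for pair in directed}:  # each pair once
        both_ways = (a, b) in directed and (b, a) in directed
        labels[(a, b)] = "dialogue" if both_ways else "consumer"
    return labels

print(classify_pairs(READS))
```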
Multi-media associations
characterize pairwise relationships in terms of distribution across media
compare roles of various media in supporting associations (Suthers and Chu, Networked Learning, 2010)
cluster analysis
compare roles of media in bridging between groups
transitive closure
Ties--
straightforward to collapse into sociogram by transitive closure or similar computations
mediated associations
SNA methods can be applied to the sociograms
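The collapsing step just described--folding the actor-artifact affiliation network into an actor-actor sociogram so SNA methods can be applied--can be sketched as below. The affiliation edges are invented for illustration, and tie weight is simply the number of shared artifacts, one possible choice among many.

```python
from collections import defaultdict
from itertools import combinations

# Invented affiliation edges: which actors touched which artifacts.
AFFILIATIONS = [
    ("P1", "msg1"), ("P2", "msg1"),    # P1 and P2 both touched msg1
    ("P2", "wiki1"), ("P3", "wiki1"),  # P2 and P3 both touched wiki1
]

def fold_to_sociogram(affiliations):
    """Collapse the actor-artifact network into weighted actor-actor ties."""
    by_artifact = defaultdict(set)
    for actor, artifact in affiliations:
        by_artifact[artifact].add(actor)
    ties = defaultdict(int)
    for actors in by_artifact.values():
        for a, b in combinations(sorted(actors), 2):
            ties[(a, b)] += 1   # one shared artifact = one unit of tie weight
    return dict(ties)

print(fold_to_sociogram(AFFILIATIONS))
```

The resulting weighted sociogram is the kind of structure standard SNA measures (degree, betweenness, clustering) can then operate on.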
this framework allows potential automation: representing the data to run analyses on, and interpreting those analyses
multi-level analysis
Prior research
contingency graphs are used for:
microanalysis of process through which learners achieved an insight
semi-automated analyses of graph manipulations to find pivotal moments
currently applying this to TappedIn, the longest-running network of educators
Latour's idea--following the actors
Advantages of this framework:
as a data representation
integration of distributed data: uncloak distributed interaction
common format for reuse of algorithms
as an analytic framework
multi-level multi-theoretical analysis possible
multiple ontologies allow for mapping between interaction, mediated affiliation and tie levels of analysis