Instructor: Daniel McFarland
Department/School: Graduate School of Education
Course: Organizational Analysis
Teaching and Learning Approach: MOOC
Goals: Daniel McFarland wanted to provide high-quality course content, engage students, offer free and discounted readings, enable peer evaluation of term papers, and study the course in order to improve it.
Approach: The course argues that organizational theories are a powerful way to comprehend and manage organizational contexts. Over the course of 10 weeks, McFarland presented 10 organizational theories, or ways of interpreting firm behavior. Every week, the class discussed cases (non-profits, firms, universities, policy arenas, etc.) and applied the theories to them.
The course has much in common with humanities and social science courses, and therefore tries to develop conceptual understanding (lectures), dialogues (forums), and applications (case studies & writing).
McFarland and the instructional team:
Posted forum questions each week that helped focus conversation
Read forum posts and then delivered customized weekly reflections and responses to the most up-voted threads in screen-side chats (not required for a grade!)
Developed in-video quizzes to check student comprehension and engagement as they watched lectures
Developed a final exam that assessed recall of the course material and a capacity to compare and contrast different organizational theories
Online Strategies: The MOOC on Organizational Analysis implemented a variety of new features from which the instructional team learned a great deal:
They implemented a new mode of peer assessment for full term papers of the kind used in social science and humanities classes. Over 500 people wrote papers, but they learned that only 400 of those actually graded, so the paper allocations need to change so that they draw only on the pool of active graders; without that, students receive one fewer grade than planned. They also learned from the peer assessment effort that most public users do not have the time or the English-language skills to write long papers. Hence, they want to create a new (general) track in which students write and peer-evaluate paragraph-length writing assignments (mirroring the forum posts).
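The allocation problem described above can be sketched briefly. The course's actual assignment mechanism is not documented here, but one simple scheme that guarantees every paper receives the planned number of grades is a round-robin over the pool of confirmed graders (the function name and interface below are illustrative assumptions):

```python
import random

def allocate_papers(submitters, k=3):
    """Assign each submitter k peer papers to grade, drawing graders
    only from the pool of people who actually submitted a paper.
    A circular (round-robin) scheme over a shuffled order guarantees
    every paper receives exactly k grades and nobody grades their own."""
    order = list(submitters)
    random.shuffle(order)  # randomize who grades whom
    n = len(order)
    assignments = {}
    for i, grader in enumerate(order):
        # grader i reviews the k papers "ahead" of them in the circle
        assignments[grader] = [order[(i + j) % n] for j in range(1, k + 1)]
    return assignments
```

Allocating only among confirmed graders avoids the shortfall the team observed, where papers assigned to non-grading participants ended up with one fewer evaluation than planned.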
They also successfully implemented an A/B testing platform in which different video lectures and emails were presented to randomized groups. They learned that intermittent picture-in-picture increases the sense of social presence and reduces cognitive load, thereby leading to higher course completion. In addition, they learned that individualized email reminders have the greatest impact on student commitment, and they intend to use those repeatedly.
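The randomization step in such an A/B platform can be illustrated with a minimal sketch. This is not the team's actual implementation; it simply shows one standard, deterministic way to split users into experimental arms (e.g., picture-in-picture vs. plain lecture video) so that each user always sees the same condition:

```python
import hashlib

def assign_condition(user_id, experiment, conditions):
    """Deterministically assign a user to one experimental arm by
    hashing the user id together with the experiment name. The same
    user always lands in the same arm, keeping exposure consistent
    across all of that experiment's lectures and emails."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return conditions[int(digest, 16) % len(conditions)]
```

Hashing rather than storing a random draw means no extra state is needed, and a new experiment name reshuffles users independently of earlier experiments.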
They worked with SIPX to reduce the cost of textbooks. SIPX negotiated with publishers for free or discounted readings and provided them as a digital course packet. Many students downloaded SIPX material, but mostly the free versions. They learned that many students will use SIPX, but that the majority will find work-arounds by crowd-sourcing summaries and locating free versions of texts through other means. They want to repeat this collaboration in order to reduce textbook costs through legal means.
Rubric and Training
Students first learned a grading rubric, applied it to pre-graded sample papers, and received feedback on their accuracy.
Students then submitted two papers and had to grade their own paper and three peer papers using the rubric.
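The combination of a self-grade and three peer grades per paper raises the question of how to aggregate them. The course's exact formula is not stated here; the sketch below shows one common, robust approach (an assumption, not the documented method): take the median of the peer scores, falling back to the self-assessment when no peer grades arrive.

```python
from statistics import median

def paper_grade(self_score, peer_scores):
    """Combine rubric scores for one paper. The median of the peer
    scores is robust to a single overly harsh or generous grader;
    if no peers graded (the shortfall the team observed), fall back
    to the student's self-assessment rather than leaving no grade."""
    if not peer_scores:
        return self_score
    return median(peer_scores)
```

A median (rather than a mean) limits the influence of any one miscalibrated grader, which matters when graders have only brief rubric training.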
Last, they implemented highly engaging forums and learned ways to improve them. Every week they held long screen-side chats responding to user comments and threads.
Course Enhancements: In the second iteration of the course, the instructional team made several modifications:
To sustain student commitment, they loosened deadlines (three-week windows) and improved the forum by setting up distinct weekly forum topics with a couple of faculty-provided threads pinned at the top. In addition, they interrelated the forum posts with the general track's paragraph-writing and peer-grading assignments. They expected this integration to garner higher levels of student commitment and retention.
They also explored options for an individualized report card (to supplement the Statement of Accomplishment) that would list the tasks a student performed, their level of accomplishment, and their reputation score among peers.
The instructional team experimented with supporting peer-to-peer learning by using the Stanford Talkabout tool to organize small-group discussion sections via Google+ Hangout video conferences. The Talkabout system integrates an agenda of suggested topics and exercises to help guide the conversation and complement the work MOOC participants are doing in class.
Lessons Learned: The instructional team learned other things about the MOOC that inform efforts at improvement:
Most of the drop-off in their course occurred after week 1. With the assistance of a social psychologist, they plan to require an initial three-sentence writing assignment (a small cost) for all participants, to build greater commitment and retention for the remainder of the course.
Most of their students are international students and working professionals; the team therefore wants to meet their interests and create more international and for-profit case material.
Many students work and have weeks where they cannot engage in the course.
Many students were not happy with what the Coursera Statement of Accomplishment lists. They wanted a report that lists the tasks they completed and at what proficiency level.
Plans for Next Iteration of Course: The instructional team would like to turn the course into a professional-development platform that can be taken repeatedly over time as a means of developing users' skills and reputations (e.g., people can come back, take higher tracks, complete more videos, and earn more reputation points as contributors). They want it to be an alumni network and community organized around a topic of interest. Users will accrue accomplishment scores as the MOOC expands the material it offers. It may even require a different forum organization, like Stack Overflow's, to include all the information provided in past sessions.
They may add in a social network feature and see how it mediates the learning process.
Daniel A. McFarland is an Associate Professor of Education, Sociology, and Organizational Behavior at Stanford University, and is the director of Stanford’s certificate program in Computational Social Science. He holds a Ph.D. in sociology from the University of Chicago and has published widely on organizational behavior in sociology’s top journals. Dan has taught courses in organizational behavior and social network analysis at Stanford for over a decade and received a 2006 award for student advising in the Graduate School of Education.
TAs: Charles Gomez, Emily Schneider, Dan Newark
Faculty Experimentation (Video)
Online course attracts 40,000 participants - and questions from GSE students, Stanford Graduate School of Education School News, February 5, 2013
This project was funded in part by a Faculty Seed Grant from the Office of the Vice Provost for Teaching & Learning (VPTL).