Staged deadlines:
Crowdsourcing can be a powerful means of constructing collective wisdom. However, as economist Andrew Lo puts it, it can also be a vector for mob madness. The goal in this assignment is to try our hand at wielding the wisdom edge of this particular double-edged sword, and to learn how challenging it can be to control it.
The output of this assignment will be a question bank that the staff will sample from to construct the in-class midterm for CS 278. The staff will publish the question bank publicly so you can study from it; each question will be annotated with its class-voted score from this assignment. The staff will then construct the in-class midterm by drawing a subset of questions from the bank, sampled proportionally to their class-voted scores. We're all writing our own midterm here, so make sure it's good!
This is an unusual assignment in that it will unfold over several stages, so there will be multiple intermediate deadlines. This structure is due to the multi-stage nature of many crowdsourcing workflows.
Deadline: 11:59pm on Monday, April 29th
This first stage generates the content that we will be remixing and voting on. Your goal is to generate three midterm exam questions, one each for three different lectures from the class so far.
First, use a random number generator to pick three of the eight lectures that we've had in class and that the midterm will cover:
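One way to do this random pick, sketched in Python (the lecture numbering 1 through 8 here is an assumption; use whatever numbering your syllabus shows):

```python
import random

# Hypothetical numbering of the eight midterm-eligible lectures.
LECTURES = list(range(1, 9))

def pick_lectures(seed=None):
    """Sample three distinct lectures uniformly at random.

    Passing a seed makes the pick reproducible; omit it for a fresh draw.
    """
    rng = random.Random(seed)
    return sorted(rng.sample(LECTURES, 3))
```

`random.sample` draws without replacement, so you are guaranteed three different lectures.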
Write one excellent midterm exam question per sampled lecture. The exam will be closed notes. Aim to avoid simple regurgitation questions: you want the question to test whether people deeply understand and can apply a concept from the lecture. However, the question should be answerable in a sentence or in a short paragraph: no essay questions, no multiple choice, no true/false. In any question you write, make sure that it focuses on one or two main concepts from the lecture — don't go too broad, or it won't be answerable in a short paragraph.
Aim for your question to separate students who would earn an A on the midterm (we will call them A students here in this spec) from students who would earn a B on the midterm (B students): a B student should get it wrong, but an A student should get it right. Ideas for questions might include:
We will not be including direct references to Mike Krieger's Awesome Guest Lecture on the midterm, but suppose it were included and were the lecture that our random number generator picked. These are some examples of possible sample questions derived from that lecture:
Submit your three questions, one per lecture, on our online system by 11:59pm on Monday, April 29th. Stage 2 will go live on Tuesday morning. If you take late day(s) on Stage 1, keep in mind that being late may mean that your questions do not get sampled for remixing in Stage 2, which will make your final reflection substantially more challenging to complete.
Deadline: 11:59pm on Friday, May 3rd
Crowdsourcing would be easy if everything always went exactly as you intended. But other people aren't in your head, so things can go in unexpected directions. Now, you're going to be entrusting your peers with your questions from the first stage, and hoping that the results come out the way you wanted. And you, likewise, will be remixing and voting on other students' questions, trying to improve them in order to make the best midterm possible.
Log on to our system, and you'll be given three random questions written by other students. For each question, remix it to generate three alternative rewrites:
You now have three remixes of the same question: an Easy question, a Medium question, and a Hard question.
Submit all three question remixes for each original question (3 remixes per original question × 3 original questions = 9 remixes total) to our online system. After you submit them, you will be given a series of paired comparison votes similar to the meme comparisons in Assignment 1. Each comparison will ask for your opinion on two possible exam questions. Comparisons will always be sampled from the same group (e.g., compare two Easy questions in one round, compare two Hard questions in another round). You will not be able to vote on any questions that you authored in Stage 2, or any questions that were written by remixing your questions from Stage 1.
The class votes will determine which questions go on our midterm, using the TrueSkill algorithm as in Assignment 1. You must vote at least 50 times. However, unlike before, we will not be sampling each person's opinions equally: we will run the system with all the data, which means your opinion will be weighted proportionally to the number of votes you enter. As with most crowdsourcing, if you care more, vote more. (Crowdsourcing, like democracy, is hard!)
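To build intuition for how paired votes turn into a ranking, here is a simplified, stdlib-only sketch. The class actually uses TrueSkill, which additionally tracks uncertainty about each question's score; this stand-in uses an Elo-style logistic update, which captures the core idea that the winner of each comparison gains rating, the loser drops, and upsets move scores more:

```python
# Elo-style illustration of rating updates from pairwise votes.
# This is NOT the real TrueSkill update; it is a simplified stand-in.
K = 32  # hypothetical update step size

def expected(score_a, score_b):
    """Probability that A beats B under a logistic model."""
    return 1.0 / (1.0 + 10 ** ((score_b - score_a) / 400.0))

def record_vote(scores, winner, loser):
    """Apply one paired-comparison vote to the score table in place."""
    e = expected(scores[winner], scores[loser])
    scores[winner] += K * (1 - e)
    scores[loser] -= K * (1 - e)

# Two hypothetical questions start with equal ratings.
scores = {"Q1": 1000.0, "Q2": 1000.0}
record_vote(scores, "Q1", "Q2")  # a voter preferred Q1 over Q2
```

After many votes across the class, sorting the score table yields the ranking that the sampling stage works from.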
Your remixed questions and votes are due Friday, May 3rd at 11:59pm. We will freeze the question rankings at that point. If you are taking late day(s) on Stage 2, keep in mind that this means that your questions will not be included in the midterm and your votes may not be included in the frozen set. The leaderboard (top 25th percentile) for each category of questions (easy, medium, & hard) will be released late Friday night (technically very early Saturday morning) shortly after the 11:59 pm deadline.
The course staff will be adding in a few questions of our own. We anticipate that about one quarter of the midterm will be staff-generated.
The course staff will sample from the frozen set of ranked questions, as reported in the link above, to create the in-class midterm. The midterm will be roughly 1/4 Easy questions, 1/4 Medium questions, and 1/4 Hard questions, plus the 1/4 staff-written questions described above. We will sample only from the top 25th percentile of questions in each category, weighting each question by the z-score of its TrueSkill score within its category. For example, a question with exactly the average TrueSkill score for its category has a z-score of 0; a question one standard deviation above the mean has a z-score of 1; and so on. Since we are sampling only from the top 25th percentile of questions in each category and TrueSkill scores tend to be normally distributed, all z-scores in the sampling pool will be positive. We may manually remove questions from the sampled set in order to avoid multiple questions testing the same concept in the midterm.
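The sampling rule above can be sketched as follows. The question names and TrueSkill means here are invented for illustration, and the exact percentile and tie-breaking details of the staff's real procedure may differ:

```python
import random
import statistics

# Hypothetical TrueSkill means for eight questions in one category.
scores = {"Q1": 31.0, "Q2": 28.5, "Q3": 25.0, "Q4": 22.0,
          "Q5": 27.0, "Q6": 30.2, "Q7": 24.1, "Q8": 29.3}

# Standardize: z-score of each question's TrueSkill score.
mu = statistics.mean(scores.values())
sigma = statistics.pstdev(scores.values())
z = {q: (s - mu) / sigma for q, s in scores.items()}

# Keep only the top 25th percentile of questions (here, the top 2 of 8).
cutoff = sorted(scores.values(), reverse=True)[len(scores) // 4 - 1]
pool = [q for q, s in scores.items() if s >= cutoff]

# Weighted sample proportional to z-score (all positive in the top quartile).
rng = random.Random(0)
picked = rng.choices(pool, weights=[z[q] for q in pool], k=1)
```

Because the pool sits above the mean, every weight is positive, so `random.choices` favors the highest-rated questions without excluding any question in the top quartile.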
Now, go study for the midterm. You have access to the question bank! We'll return to this assignment afterwards, with final reflections.
Deadline: 11:59pm on Wednesday, May 15th (after the midterm)
What happened? Visit this link to see: your original questions, the remixed versions of each of your questions produced by your classmates, the final score on each of those remixed questions, and the list of comparison votes received on each of those remixed questions.
Please submit a PDF of no more than 750 words containing (1) a description of what you were aiming for with your original questions; (2) a reflection on what happened when your classmates remixed your original questions, including whether they made them better or worse, and why that might have happened; and (3) an analysis of why you think the voting on your questions turned out the way that it did, including whether you agree with the final scores assigned to those questions.
You will be graded primarily on two factors: (1) your completion of the process; and (2) your analysis of the crowdsourcing pipeline and what happened to your questions.
Extra credit challenge: if a question you can take credit for is sampled onto the midterm (either you wrote the original question in Stage 1, or you created the remixed version in Stage 2), you will automatically receive full credit on that question. This means that if your question (or a remix based on yours) is voted highly in the TrueSkill rankings, you have a stronger chance of getting those free points on the midterm.