Endeavor

The Challenge: 

How might we design learner-centered features aimed at improving student outcomes and engagement in phonics, personalized for various types of struggling readers?

 

Our Sponsor:

Houghton Mifflin Harcourt is a leading publisher of educational content and technology.  We are helping them research and design new features for System 44, a reading intervention program for the most challenged readers in Grades 3–12+ that serves over 600,000 students in the U.S.

 

Our Solution:

We invite you to experience our solution.   Try Endeavor!  

Endeavor serves two purposes:

  • engages students with a choose-your-own-adventure style narrative
  • provides teachers with finer-grained data on students’ performance on challenges embedded within the narrative

Through our user research and literature reviews, we found:

  • some students were just going through the motions: they were highly extrinsically motivated and not at all intrinsically motivated
  • there was untapped potential in providing teachers with data on students’ actions in System 44

 

 

DESIGN PROCESS

  1. UX RESEARCH

Go where the users are

As our team’s Research Lead, I guided us through the exploratory research stage.  We observed 7 class sessions and interviewed 16 students and 5 teachers.  Then we used affinity diagrams to draw out themes.  Tip: when affinity diagramming, it is a good idea to make it a norm that people don’t talk, so that any one person’s influence is limited.

[Image: affinity diagram of teacher interview data]

 

2. DATA SYNTHESIS

Find a problem worth solving

We found that a sizeable percentage of students found System 44 incredibly boring.  One student we observed thought he had already done one part twice (we think he was so zoned out he couldn’t tell one part from another).  Another student was on part 3 of 25, halfway through the year.

What these students had in common was an external motivation source: something other than their own interest and goals was motivating them.  Shifting their motivation from extrinsic to intrinsic seemed like a worthy goal.  We hypothesized that solving this would take care of other issues, such as their lack of progress, which was dire given how far behind they were in reading ability.


 

3. INTEGRATE LEARNING SCIENCE

Allow students to make their own choices in System 44 to shift their motivation.

Since we had already landed on shifting students’ motivation from extrinsic to intrinsic as a goal, we conducted a quick literature review and found a well-regarded theory that pointed us toward possible implementations.  This theory, Self-Determination Theory, was created by Edward Deci and Richard Ryan.  Here’s a link to one of their papers, which has 28,401 citations.

Intrinsic motivation, they argue, rests on three pillars.  Simply put, shifting from extrinsic to intrinsic motivation requires supporting people’s needs for competence, autonomy, and relatedness.

We brainstormed potential solutions with all three pillars in mind, but in the end, one stuck out as both feasible and sorely lacking in System 44:

Autonomy, the need to feel as though we are making decisions that we want to make.

 

4. IDEATION

Collaborate with our client to decide on an idea

After generating 12 ideas, we met with our clients, who flew to Pittsburgh.

Together, we created a 2×2 with Feasibility (our ability to deliver the solution) and Impact (on students) as the axes.

In the end, we settled on a solution that was both high-impact and feasible: an interactive narrative.

Our mission statement became: “We will develop an interactive storytelling prototype that motivates students to practice skills they struggle with the most, enabling them to become competent and confident in reading long passages.”

 

5. ITERATIVE DESIGN PROCESS

Use prototyping and user testing as a way to minimize risk

We created at least six prototypes of an interactive story over the following weeks, tested each with students and sought out feedback from teachers and our client partners.

With the first prototype, we needed to see if students actually liked having the autonomy of choosing their way through a narrative.

Here’s a screen from our first prototype.


 

Involve teachers: injecting game-based assessments into our prototype

Along with an interactive narrative, we created embedded game-based assessments to give teachers valuable data to personalize instruction with.

Here’s one of our initial game-based assessments, which looks a lot like those in System 44.


 

Our biggest challenge: ‘chocolate covered broccoli’

We got feedback from user tests that the game-based assessments (like the one above) felt “separate from the rest of the narrative.”  This was an instance of the danger of chocolate-covered broccoli, a common dilemma with educational games: taking something that’s dry and educational and trying to make it fun by adding gamification.  Here’s a link to what Google has to say on chocolate-covered broccoli.

So what’s the problem with chocolate-covered broccoli anyway?  Well, if students come to expect games with their phonics, they won’t truly be motivated to read once the gamified rewards are gone.

 

Remake our prototype to include well-integrated assessments

  1. Plan the story’s plot to include “scenes” in which we could add a game-based assessment as the logical climax of each scene.  This guided us in realizing how many assessments we needed to make.
  2. Brainstorm different ways that the students’ correct or incorrect answers would affect the story.  For example, in our story, the students hide their gold from a pirate.  If they fail to unlock a combination lock with the appropriate word, they lose gold.  This should further engage them with the narrative, create opportunities for fine-grained data for teachers, and hopefully associate feelings of pleasure with reading.
  3. Create the assessments to provide useful data to teachers.  We hypothesized that teachers would benefit from distinguishing students’ problems as comprehension-based (not knowing what a word means after reading it) or decoding-based (not being able to read, or decode, the word in the first place).  Moreover, knowing which phonemes students mistakenly switch would allow teachers to provide very personalized instruction (see the sketch after this list).
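To make the third point concrete, here is a minimal sketch of the kind of event record our assessments could emit for teachers.  The field names and the aggregation helper are illustrative assumptions, not the actual prototype’s schema:

```js
// Hypothetical record for one embedded-assessment attempt.
// All names are illustrative; the real prototype's data model may differ.
const assessmentEvent = {
  studentId: "s-1042",
  scene: "pirate-lock",              // the narrative scene hosting the assessment
  targetWord: "ship",
  errorType: "decoding",             // "decoding" | "comprehension" | null (correct)
  phonemeConfusion: { expected: "sh", produced: "ch" }, // only for decoding errors
  goldDelta: -5,                     // effect on the story (gold lost to the pirate)
  timestamp: Date.now(),
};

// A teacher-facing view could then aggregate phoneme confusions per student:
function phonemeConfusionCounts(events) {
  const counts = {};
  for (const e of events) {
    if (e.errorType !== "decoding" || !e.phonemeConfusion) continue;
    const key = `${e.phonemeConfusion.expected}->${e.phonemeConfusion.produced}`;
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts; // e.g. { "sh->ch": 3 } suggests targeted /sh/ vs. /ch/ practice
}
```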


 

What the students thought

After altering our prototype to have game-based assessments that were truly integrated into the narrative’s storyline, we tested it with our target audience.  The result: all of them felt that the narrative functioned as a seamless whole!

 

6.  FINAL PROTOTYPE

Present the key features

An inviting home screen.  Students can pick a story matched to their interests.


 

‘Read it to me’ button and ‘gold count’ in the upper-right.  Narrative text and choices (in orange) in the middle.  A visual timeline on the bottom.


 

Game-based assessments.  Listening to a word and attempting to comprehend its meaning.  The robot is the students’ companion throughout the story.


 

Fine-grained data hooks.  This is what students see if they fail the previous problem.  Through multiple follow-up assessments, we can determine exactly where students are struggling.


 

Graphics revealing students’ performance.  Students see the gold and items they collected and their performance on each of the four assessments, and have the opportunity to practice the words they got wrong.


 

7. EPILOGUE

The next System 44

With a new version of System 44 coming in 2019, we hope that our research and prototypes help Houghton Mifflin Harcourt in accomplishing the very reason they hired us: to make System 44 more engaging, effective, multi-sensory, and supportive of its most vulnerable users.

For me personally, this project showed me how learning science can be applied practically, and helped me improve my UX research & design, project management, cross-functional collaboration, and client relation skills.

Should I continue working on this project, I would investigate scaffolds that help especially unmotivated learners access the text and enjoy the story.  To me, improved confidence, ability, and motivation to read are achievable only with extensive (and hopefully enjoyable) practice that combines reading authentic texts with practicing targeted skills.

 

Time: 7 months

Team: Chenxin Wang, Julia Ridley, Lucy Yang, Zach Mineroff

Methods: exploratory research (semi-structured interviews and observations), data synthesis (affinity diagramming), storyboarding, prototyping 

Tools: HTML, CSS, JavaScript, Twine

 


LURN

The Challenge:

How might we design a game to help early career teachers learn classroom management skills?

 

Our Solution: 

We created a collaborative board game called LURN.  These are the game directions.  During the game, players take turns drawing two types of cards: A) ‘situation cards’ that detail a classroom dilemma that needs to be addressed, and B) ‘student trait cards’ that players use to ‘imagine a student’ who needs to be taken into consideration when choosing what to do (i.e. which classroom management strategy to use).

After drawing the cards, the players imagine the student, discuss the possible strategies, and pick the ones that they feel are best for the situation and the student.  Then they flip over the situation card and see whether their answers earned them -2, -1, +1, or +2 ‘rapport points’ (represented by the clear squares below).  This finishes one round.

The game is lost when the rapport points all disappear, and won when they fill up the entire board.
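As a minimal sketch of this bookkeeping (the board capacity and starting stock below are assumptions for illustration; the real numbers live in the game directions):

```js
// Rapport-point bookkeeping for LURN. BOARD_CAPACITY and the starting
// stock are illustrative assumptions, not the published rules.
const BOARD_CAPACITY = 20;
let rapportPoints = 10;

// Apply one round's result, as printed on the back of the situation card.
function applyRound(delta /* -2 | -1 | +1 | +2 */) {
  rapportPoints = Math.max(0, Math.min(BOARD_CAPACITY, rapportPoints + delta));
  if (rapportPoints === 0) return "lost";             // all rapport points gone
  if (rapportPoints === BOARD_CAPACITY) return "won"; // the board filled up
  return "continue";
}
```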

 

DESIGN PROCESS

  1. INSPIRATION

While living in Boston, MA, I attended playtest nights at MIT’s Teaching Systems Lab.  Research scientists and game designers were creating ‘playful professional development experiences’ (playful PD experiences) and testing them with teachers.  Find out more here.  I remember having my mind blown by a playful PD experience on creating rubrics.  Ever since then, it has been in my mind to create a playful PD experience of my own.  So when I had the chance to pitch an idea in the Educational Game Design class at CMU, I took it.  Luckily, two amazing designers, Ulu and Laura, saw teacher professional development as an opportunity for design as well.

After we had gotten together, we decided to tackle ‘classroom management’ with our game.  Ulu and I have both been teachers, and we know how critical it is to deal with classroom misbehavior in a firm, fair, and thoughtful way, and how many young teachers struggle to do this.  Laura got on board, and we were on our way.

 

2. USER RESEARCH

We designed a survey and sent it out to our teacher-friends and a group of master’s students at my alma mater.  We got 17 responses back.  Tip: when designing learning objectives, try to find out how experts and novices differ in their strategies.

Novices

“I would spend too much time harping on one student for limited participation and remove myself from others that sought my attention for constructive purposes.”

One question asked participants, “How would you have handled a situation at the beginning of your career?”  We analyzed the responses and found that early career teachers prioritized eliminating issues by yelling, calling the principal, and sending students to the main office for minor incidents.

Experts

“I think students seek routine so as to feel comfortable in knowing what is coming next and how to approach it. This holds true with teaching lessons as well as managing behavior.”

In contrast to their novice selves, experts tended to treat incidents holistically.  They cited frameworks like “restorative circle” and community/team approaches.  They established routines early in the year, which prevented some problems.  They nurtured relationships with students and made reasonable adjustments to classroom expectations for certain students.  They did not follow all of the school-level rules, only enforcing the ones they believed in.

 

3. LEARNING OBJECTIVES

We chose the following learner-centered learning objectives:

  1. Teachers will be able to scale their reactions to student behaviors appropriately in a given context, taking into account the visibility, directness, and degree of sternness in their response (e.g. volume, language, body language).
  2. Given a student incident, teachers will be able to identify and apply effective strategies for re-establishing broken routines/expectations, including considering the frequency with which a pre-standing issue is addressed.

 

4. CONSIDERING MULTIPLE IDEAS

We brainstormed the mechanics of four different game ideas, including the one we picked in the end.  The other three were:

  • Teacher’s Mind: Players represent different perspectives in one teacher’s mind and work together to hopefully make it through a school year’s worth of classroom management issues.
  • Collaborating & Enforcing Rules: Players are teachers and use rules/norm cards to resolve classroom management issues, while also convincing the other players that their solution is best to win the most points.
  • Student + Incident: Players are teachers and must develop strategies and arguments around classroom management issues to convince the “judge” and win the most points.

 

5. ITERATIVE DESIGN PROCESS

Playtest 1

We recruited three participants at a playtest night at Carnegie Mellon to test out our situation and student cards.  In this version, the players had to negotiate with each other and “agree” on one shared answer. 

See below for an example of the student cards (left) and situation card (right):

 

The key questions we wanted to answer were:

  • How do players feel about the scoring system and answers?
  • What effect did the student cards have on players’ ability to imagine a student?
  • How long will a single round take?   How much discussion is needed to decide on a strategy?

Through the playtest, we found that:

  • Players had trouble understanding why answers were scored the way they were.  (Note that an explanation for the correct choice – A, B, C, or D – was never given to them.)
  • Players imagined the character by connecting the student cards in unique ways.  For example, they connected the cards bully, stable home-life, and focused in class differently: one player thought this person was “extreme on the inside,” while another said the character probably tried to show that she was smart on the outside.
  • The players took a total of 5 to 7 minutes per round.  Of that, 3 to 5 minutes was dedicated to discussion about strategies.

 

Playtest 2

Since the players in playtest 1 didn’t understand why the correct answers were correct, we added rationales to the cards.  We actually changed the “answer side” of the situation cards completely.  Now, they looked like this:

 

The key question was whether players could interpret these symbols.  It turned out to be difficult to distinguish some of them, like the squares in the -2 box for row B.

 

Playtest 3

For this playtest, we introduced the board and recruited four pre-service teachers from the University of Pittsburgh.  The goal in the game became to fill up the board with chips.  Players subtracted or added chips according to what the back of the situation cards said (-2, -1, +1, or +2).  We also created additional student cards that had a picture of a student.  There were two kinds of student pictures: blob students (see picture below) and human-like students.  We included the former to keep players from invoking stereotypes while playing the game; we didn’t want them to connect a person of color with a misbehavior in some way.

blob people – meant to reduce stereotyping

 

The key questions we wanted to answer were:

  • Do the players realize the purpose for the blob student pictures?
  • What kind of lessons do the players take away from the game?  
  • Do players prefer to come to consensus and submit one answer, or answer individually?  

Through the playtest, we found that:

  • Players concluded, independent of us, that the blob student pictures’ purpose was to avoid stereotypes.
  • Players told us that they took away a diverse range of lessons.  One player said the game reinforced the notion that students come into the classroom with ‘baggage,’ and that it is important to take a step back when someone does something wrong and try to find out why they did it.
  • Players said that they preferred to submit individual answers because it is more realistic.
  • The lack of context confused them.  The players thought that some of the answers would have been different depending on whether the setting of the game was urban or suburban.

Playtest 4

We returned to the University of Pittsburgh and playtested with a pre-service teacher and an administrator.  We decided to test some new features in this last round of the semester.  For example, we gave the players 5 seconds to pick an answer, because that seemed more realistic.  We also interviewed a veteran teacher before the playtest and changed the explanations based on her advice.

 

In this playtest we wanted to know whether the teachers would like:

  • the 5-second rule
  • the changed explanations

We found that:

  • they preferred to discuss, as in previous versions, which made the 5-second rule meaningless
  • they liked the new explanations

 

6. APPLYING LEARNING PRINCIPLES

What learning principles were evident in the final version of the game?

  • Collaboration with individual accountability – research by famed psychologist Dr. Robert Slavin found that collaboration works best when group members know how well, or how poorly, other group members contributed.

In the game, players pick the answers they believe are correct.  After finding out the correct answer, each player either adds to or takes away from the group’s point total.  This is an example of collaboration with individual accountability.

  • Feedback – in order to get better at something, it is helpful to have feedback after making an attempt.  Moreover, specific feedback is usually better than general feedback.

The explanations for different answer choices, on the back of the situation cards, serve as feedback.

  • Anchored learning & just-in-time learning – anchored learning means giving the learner realistic problems, and just-in-time learning means that the problem is given before the explanation.

Unlike the theoretical instruction that graduate students sometimes get, all of the problems in the game are realistic.  In terms of just-in-time learning, it is our hope that pre-service teachers play the game before entering the classroom as student teachers.  That way, they will already have some prior knowledge of how to handle classroom management issues, and can be more successful.  At the very least, playing the game should lead to…

  • Discussion – through discussion, people see different points of view, and have to defend their own.  This is useful for developing conceptual knowledge.

Discussion is the central game mechanic.  Many of the other game mechanics, such as imagining a student, are meant to encourage discussion.  Players discuss their answers before seeing how they are scored.

 

7. NEXT STEPS

We are planning on submitting our game to the Serious Play Conference’s competition this year (2019).

 

Time: 2 months

Team: Ulu Mills, Laura Rodriguez

Methods: prototyping, play testing, visual design, laser cutting

Tools: Adobe InDesign and Illustrator

Collaborative Video

The Challenge:

Create a quick digital prototype that allows people to learn cooperatively with a YouTube video.

 

Our Solution:

We developed a prototype that leverages the context of videos to help intermediate English language learners learn together.  We invite you to try it out yourself – try the collaborative video!

This is how it works.  A user is watching the video; at a random time, the video stops and the user is prompted to write a question for another learner to answer (the question goes into a database).  Then, at a different time, the video stops again, a different question is pulled from the database, and the user is prompted to answer it.
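For the curious, here is a minimal sketch of that pause-and-prompt loop using the YouTube IFrame Player API.  The saveQuestion function is a hypothetical stand-in for our database write; everything else uses the public API:

```js
// Assumes the IFrame API script is loaded:
// <script src="https://www.youtube.com/iframe_api"></script>
let player;
let promptTime;        // when to interrupt, chosen once playback starts
let prompted = false;

function onYouTubeIframeAPIReady() {
  player = new YT.Player("player", {
    videoId: "VIDEO_ID", // placeholder
    events: { onStateChange: onPlayerStateChange },
  });
}

function onPlayerStateChange(event) {
  if (event.data === YT.PlayerState.PLAYING && promptTime === undefined) {
    // pick a random moment in the first 70% of the video (see Timing of prompts)
    promptTime = Math.random() * 0.7 * player.getDuration();
    setInterval(maybePrompt, 500);
  }
}

function maybePrompt() {
  if (!prompted && player.getCurrentTime() >= promptTime) {
    prompted = true;
    player.pauseVideo();
    const q = window.prompt("Write a question for another learner:");
    if (q) saveQuestion(q);
    player.playVideo();
  }
}

// Hypothetical stand-in for posting the question to our database.
function saveQuestion(q) {
  console.log("would POST to question database:", q);
}
```

The answering flow works the same way, except the stop pulls a stored question from the database instead of asking for a new one.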

 

DESIGN CONSIDERATIONS

Audience

We ended up picking the domain of English as a Second Language because people who are learning English can benefit from having a video as a visual reference with a context that they can follow. The other type of content we considered was International Culture because of the potential of having people from a target culture answering questions posed by people from outside the target culture.

English ability level

We ended up targeting intermediate-level English learners because we felt that they would benefit from crowd-sourcing questions.  Initially, we considered targeting beginner-level English learners and building a prototype with text boxes they could drag to form questions and answers.  However, we felt that they would get a similar experience even if we removed the collaborative aspect from the system.

Elaboration

After the user poses a question and an answer, our system prompts them to “write another sentence” that adds to their answer.  Initially, we considered different wording for the prompt, like “can you elaborate?” or “can you choose a more specific verb to use in your answer?”  We decided against these options because the former was unclear (our target users might not be familiar with the word ‘elaborate’), and the latter makes it seem like the original answer isn’t good enough.

Timing of prompts

Currently, the system picks a random time between 0% and 70% into the video, or waits until the very end, and asks the user to pose a question about either what is happening or what already happened in the video.  Initially, we thought about prompting the user at any time during the video.  We decided against this because we thought that interrupting the user’s video-watching experience near the end, potentially at the climax of the story, would be unenjoyable.
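A sketch of that timing rule, assuming a coin flip decides between the two cases (the actual weighting is a detail of the prototype):

```js
// Pick when to interrupt: either a random point in the first 70% of the
// video, or the very end -- never near the story's climax.
function pickPromptTime(durationSeconds) {
  const askAtEnd = Math.random() < 0.5; // assumption: 50/50 split between cases
  return askAtEnd ? durationSeconds : Math.random() * 0.7 * durationSeconds;
}
```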

 

USER TESTING RESULTS

Strengths

Since our video content was entertaining, and the activity’s directions were clear, our “students” were interested in writing questions about Mr. Bean.  They had very little trouble formulating a question based on the content of the video, and were able to generate good, concise questions as they worked through the exercise.  Prompting students to elaborate on their initial answers with an additional sentence was also very successful.  Our students noted that they wanted to elaborate on their answers, but initially weren’t sure how long their answers should be.  This prompt gave them a good opportunity to do so.

Areas for improvement

The experience of answering learner questions was a bit mixed.  Our students became engaged by the story of the video, yet our prompt read “can you ask a question about what is happening, or what happened?”  This odd combination meant that questions targeting an earlier section of the video could be confusing or jarring.  Students also noted that they wanted to edit the question part of their ‘submission’ before finally submitting the question and the answer.  Accepting answers as final, with no way to edit them, was a pain point.

Question for the future

One dilemma that a student posed was whether learners should be able to pick when they ask a question.  By asking him to think aloud, we noticed that he enjoyed certain parts of the video more than others and wanted to ask a question in those moments.  However, when we debriefed, he said that having this agency would probably have caused him to stop thinking about questions that he could ask.  Also, if people had the agency to ask whenever they wanted, would they forget to ask questions, or not ask questions at all?  These are issues we could potentially resolve after user testing with our target audience.

 

REFLECTION

This was one of my favorite projects because it opened my eyes — I had never seen a tool that could be added onto something that already existed (YouTube videos).  To me, it turned YouTube videos into teaching and learning resources.  I also really enjoyed working with my partner Luca on a quick project.  Due to time constraints, we had to brainstorm and prototype possibilities through sketching very rapidly.  Finally, I believe that this program, with some more work, could be a legitimately useful tool.  As a former ESL teacher, I know that I have colleagues who I could recommend it to.

 

Time: 2 weeks

Team: Luca Damasco

Methods: prototyping, user testing

Tools: HTML, CSS, JavaScript, YouTube API

Morning Booster

The Challenge:

Create a two-week instructional unit on a topic of your choice, drawing on learning science.  Use a backwards design approach (learning goals → assessments → instructional activities).  Write a plan for a controlled experiment that could be implemented in a school.

 

My Solution:

I wrote an instructional unit to teach 5th graders concepts, procedures and dispositions relevant to design thinking.  I created a plan for a controlled experiment that would test the effect of student choice on self-efficacy (belief in one’s ability) and actual performance.  Here’s my presentation poster, and an abbreviated report.

 

DESIGN PROCESS

  1.  INSPIRATION

What influenced the work?

I appreciate design thinking as a set of tools for problem-solving that’s centered on the current reality and a target audience.  The first time I was really exposed to it was as an interviewee.  One of my heroes, Dan Coleman, was asking me about my experience as a teacher, to help the company TeachersConnect understand their target users.  (As an aside, I worked with TeachersConnect for a short time.)

To me, design thinking means a bias towards action, and one of its advantages is that it tends to lead to products and services that people are more likely to use.

One more influencer

Our class read The ABCs of How We Learn, and it completely changed my perspective on teaching.  Many of the instructional activities were based on the principles in this book, and backed by high quality research.


 

2.  FOCUS OF THE UNIT

Giving students a design challenge

The unit begins by giving students a lens through which they can look at the world around them.  The intention is for them to be able to look critically at objects and answer the questions “who is this for?” and “what is it for?”

Then, students tackle a design challenge: how can you improve the morning of a classmate?  In doing so, they get to know one of their peers, create a journey map, brainstorm ideas, and validate them with their partner.   In the end, they invent something a peer can use between the time they wake up and the time they leave the house.

 

3. LEARNING OBJECTIVES 

Creating  goals for students that speak to their whole person

Based on my knowledge of design thinking, and interviews with project mentor Shelley Moertel, I developed goals for what the unit should teach students to know, what they should be able to do, and how their “character” should change.

 

 

4. ASSESSMENTS

Knowing what the students learned

For each of the learning goals, I developed multiple assessments for a teacher to use.  Creating valid and reliable assessments is one of the hardest jobs for a curriculum designer, because it requires integrating knowledge of the learning environment, the creativity not to turn everything into a test, and the exacting attention to detail needed to create rubrics that are user-friendly and reasonably objective.  Tip: start to plan assessments before writing instruction, so that you stay focused on goals, but also iterate and add assessments while writing instruction.

The assessments in Morning Booster give teachers multiple viewpoints into each student’s learning.  That is, the goals are assessed in more than one way.

Types of assessments included:

  • Informal observations (and what the teacher should look and listen for)
  • Journal entries (and what the teacher should look for)
  • Quizzes (and answer keys)
  • Interview plan deliverable (and a rubric for the teacher)
  • Journey map deliverable (and a rubric)
  • Storyboard deliverable (and a rubric)
  • Revisions to work to see whether students accepted constructive criticism (and a checklist for the teacher)

 

5. INSTRUCTIONAL ACTIVITIES

Applying learning science in the classroom

I used learning science principles in the following ways:

  • I proposed norms for the teacher and students – Norms, first studied by Erving Goffman, are a shared code of conduct.  They’re useful because particular norms suit particular disciplines, and because they encourage desirable behaviors.

The norms in this unit were: 1) share your ideas, even the crazy ones; 2) see constructive criticism as someone trying to help you.  Designing benefits from thinking outside the box, and this unit involves getting a lot of feedback from peers.

 

  • I created activities using ‘contrasting cases’ – Learning from contrasting cases may have come from the study of expertise.  A characteristic of expertise is being able to perceive small differences.  For example, an expert in wine can perceive differences between wines on a detailed level.  When beginners are given contrasting cases, they are able to see these small differences too.

The activity below helps students see that each kind of ball has a more specific purpose than simply being something to play sports with.  Reflecting on this fact leads to a greater understanding of designing objects for a purpose.

[Image: contrasting cases activity]

 

  • Let students experience problems before hearing solutions or explanations –  Dan Schwartz at Stanford says, “the problem [with lectures] is that students often do not… construct knowledge from what they read or hear.  Their only recourse is to memorize the words (or tune out) rather than understand the implications of those words… When students appreciate the details of a problem, they learn the expository information more precisely,” (114, 118, Schwartz et al)

Students experience problems before an explanation from the teacher.  For example, they are tasked with a short design challenge (design a new backpack for a peer) before the big challenge (design something that improves your partner’s morning).  They tackle it without the teacher telling them how to do it.

 

  • Teach students metacognitive skills – metacognition, or thinking about thinking, lets people understand a task, evaluate what they know and don’t know, and then create a plan to learn what they don’t know.

Teaching students metacognitive skills happens in two major ways.  First, teachers ‘think aloud’ to model thinking about thinking.  Second, students frequently compare finished work with expectations of that work, which helps them reflect, and plan more intentionally in the future.

 

  • Included ‘worked examples’ in the instruction – “Worked examples are models of expert solutions” (293, Schwartz et al).

In this unit, the teacher shows students a step-by-step procedure, and explains each step, for conducting interviews.  First, the teacher calls up a friend and interviews them about their morning.  Then, the teacher re-frames the conversation in a worked example, by giving students a transcript of the conversation, and explaining her interview techniques for each question and follow-up question.

 

  • Used analogies to help students make sense of new concepts – Analogies, as we all know, help us learn new principles, but they’re also important to teach because analogical reasoning is a good learning strategy on its own.

When possible, I drew analogies between new concepts and concepts the students were likely to know.  For example: curiosity and exploratory interviews, roller coasters and journey maps, and comics and storyboards (below).


 

 

Time: 8 weeks

Role: curriculum design

Methods: backwards design, interviews with subject matter experts, evidence-based learning principles

Tools: PowerPoint, Sketch

 

QVFolio

The Challenges:

How might we empower students to visualize their learning artifacts and process in portfolios?

How might we help teachers support their students throughout the portfolio creation process?

 

Our Partners:

We worked with two middle-school teachers from a nearby school district.  They taught technology/engineering and English, which challenged us to create a solution that could meet both of their needs and desires.


 

Our Solution:

Based on our research, we created:

a) a concept video for the QVFolio app, which students could use to capture, document, curate and present their projects, along with scaffolds in the app that could support the learning process.

b) pocket-sized ‘prompt cards’ in the spirit of these cards from IDEO, which teachers can use when discussing strategies for supporting students, or reference when they see their students struggling.  Here is a link to our cards.  Please let me know if you can’t access them.

 

 

DESIGN PROCESS

  1. UX RESEARCH

Interview experts

After interviewing four people with exemplary portfolios, we gained an understanding of the end-to-end process of making a portfolio.  From this, we created a model for portfolio creation.  


 

Talk to our target audience: middle-school teachers and students

We interviewed our teacher-partners to learn about what they wanted. 

These are the highlights: 

  • The technology teacher wanted to make his students’ learning visible through portfolios.  He also wanted his students to use their documentation to share their work with each other and give each other constructive feedback.
  • The English teacher aspired for her students to see their own growth through their portfolios.  She had seen past students’ surprise when they saw how far they had come over the course of the year.

 

We sent out a survey to our teacher-partners’ students, and gathered 188 responses.

Here are some highlights:

  • 25% of students reported that they currently share their finished class projects with their families, while 34% wanted to do so.
  • 71% of students gave a 4 or 5 (out of 5) in response to “How proud are you of your finished projects?”, but only 38% gave a 4 or 5 (out of 5) in response to “How interested are you in sharing your finished projects with others?”  (Perhaps they felt nervous about sharing.)
  • 65% of students gave a 4 or 5 (out of 5) in response to “How helpful is it to look at and comment on your classmates’ work, for your own work?” (below)

 

2. DATA SYNTHESIS

We used affinity diagramming to draw out themes, which we validated with our teacher-partners, and used to create a common persona (below), and an experience map that summed up our user research.

 

Here are some highlights from the experience map:

  • Opportunities: Both teachers emphasize student choice over the topics they work on in their classrooms.  Our solution will have to be flexible.
  • Challenges: Both teachers use numerous technologies in their classes, and are wary of adding to them.  Our solution will need to be very user-friendly.

 

3. IDEATION

Drawing on our research, we saw an opportunity to create something that would put students in the driver’s seat of the portfolio creation process.  After writing HMW statements and brainstorming ideas, we saw that a combination of several fit our design principles the best.  Below is one storyboard, read from top to bottom.  The words in the upper right corner – capture, organize, curate, share and reflect – map to the  stages of the portfolio creation process.

 

 

4. ITERATIVE DESIGN PROCESS

Prototype 1

Our first prototype was designed to help us better understand how students would capture, document and curate a project.   We also wanted to see what challenged them and what helped, so we brainstormed ‘prompts’ like “Why don’t you try…” and brought graphic organizers like Flow Charts with us.

We asked 6 students in the technology/engineering class to “tell a story of an experience they had in their technology classroom.”  We gave them a Polaroid camera, Legos, colored pencils, and paper.  Here are some examples of what they came up with:


 

Prototype 2

Since we knew that our final design would be digital, we wanted to find the pain points of an online system.  For the second user study, we had 6 English students tell a story from their ELA classroom using Google Slides, their work, and a camera phone.

 

Findings from User Study 1 and 2

Our collected findings and implications for the final design are below.


 

5. FINAL DESIGN

Putting it all together

Our final design focused on three stages of the portfolio making process: prepare, capture and organize.  Our solution has two main parts:

  • QVFolio app: ‘Prepare’ refers to what teachers and students do before starting to work, like gathering materials.  ‘Capture’ refers to collecting evidence through the QVFolio app, which connects to a phone’s camera and asks students to reflect on their experiences right away.  ‘Organize’ refers to putting together all of the documentation for one project, which can also be done with the app.  From our user studies, we found that graphic organizers and prompts can be helpful, so we conceptualized that students can access graphic organizers, and teachers can push out announcements, examples and hints.



  • pocket-sized ‘prompt cards’, color-coded for each of the stages.  Teachers can use them when discussing strategies for supporting students, or reference them when they see their students struggling.  Here is a link to our cards.  Please let me know if you can’t access them.

 

Time: 12 weeks

Team: Anne Xie, Courtney Francis, Tianmi Fang

Role: user research and design

Methods: exploratory research (semi-structured interviews, observations), competitive analysis, literature review, data synthesis (affinity diagramming, experience mapping), personas, storyboarding, prototyping 

Learn Mandarin

The Challenge:

Design and implement a one-hour e-learning module on a topic of your choice, incorporating e-learning principles.  Conduct an A/B test to see whether an innovative online teaching practice leads to statistically significant learning gains.

 

Our Solution:

We designed an e-learning module to teach Mandarin vocabulary and questions.  Then we conducted an A/B test, which showed that our innovative teaching practice had not led to statistically significant learning gains.  Here’s our report.

 

 

DESIGN PROCESS

  1. LEARNING OBJECTIVES

Set the scope to our time constraints

With a total of eight weeks, and no native Mandarin speakers on our team, we knew that setting an appropriate scope for our project would be key.  We quickly contacted a professor of Chinese at CMU to get some background on how she taught non-native speakers.  Then we decided as a team to focus on food vocabulary and the grammar used to ask yes/no questions, because it seemed reasonable that our classmates from different cultures could talk about food in our common space, and asking questions is a good way to learn new vocabulary.

We chose the following four student-centered learning objectives, which specify the given conditions, and the intended student behavior.

  1. Given audio of a Mandarin phrase, students will write the English equivalent.
  2. Given an English phrase, students will recall the Mandarin (Pinyin) translation.
  3. Students will apply the ‘ma’ rule to make yes/no questions.
  4. Students will recall the pitches for each of the four tones in Mandarin.

 

2. ASSESSMENTS

Align assessment with learning objectives

We created at least two assessment questions per learning objective. This meant we could assess each learning goal at least once in the pre-test before the instruction and once in the post-test at the end of the instruction.

 

3. USER RESEARCH & DESIGN

Find out how experts and novices approach our questions

We conducted think aloud tests with participants and constructed a cognitive task analysis (CTA) to determine the skills and sub-skills that we needed to teach and assess in our module.

One of our participants was a student with beginner-level Mandarin skills.   She was the perfect person to do a think aloud because she was enough of an expert to complete tasks on her own, and enough of a novice to notice the thought processes she used to answer the questions.  Knowing her thought processes enabled us to augment our instruction with specific instruction that matched what she was doing in her head.  There are obvious limitations to this, but in our short amount of time, it was the best we could do.

 


 

4. IMPLEMENTATION

Development environment

We developed our e-learning module using CMU’s course authoring platform, OLI, which stands for Open Learning Initiative.  We chose this platform because of its ability to provide specific feedback for different answer choices, and because of its easy integration with another CMU tool, DataShop, which we could use to analyze our participants’ results.

Applying Learning science

We applied Clark and Mayer’s e-Learning principles in the following ways.

  • Multimedia principle – Using the right kind of multimedia enables learners to find the information they need with the least amount of extraneous mental effort, thus reducing their cognitive load.

We provided tables of information.  Although this can seem obvious, applying the principle reinforced best practices for creating online learning.

  • Personalization principle – Interestingly, when the tone in online learning modules is casual, suggestive, and inviting, learners actually learn better!  The reason, according to Clark and Mayer, is that this activates a part of the brain that makes people more open to learning.

We explicitly invited learners to the module through a Welcome page.  We also used concise, encouraging and conversational language throughout.

  • Knowledge-Learning-Instruction taxonomy – not all knowledge is of the same ‘type.’  Targeting the learning of certain knowledge with particular teaching techniques is the best way to approach instructional decisions.  The KLI framework was developed in large part by Ken Koedinger, who just so happened to be teaching our class!  Here’s the seminal paper.

In our module, we were teaching two main types of knowledge: multiple facts (i.e. vocabulary) and a skill (i.e. one grammatical rule).  Facts are well taught with spaced recall, which I’ll talk about below.  Skills are well taught by having students “induce” the rule by seeing multiple, differing examples, which show the “boundaries” of the rule.  For example: seeing the grammatical rule in different yes/no questions, but not in an open-ended question.

  • Spaced practice – This technique is used to improve memory for specific facts.  By distributing practice over time, learners remember these facts better.  Companies such as Duolingo and Membean use spaced practice.

To implement spaced practice in our module, we made sure to include questions about prior topics throughout the module.
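OLI authoring is declarative, so we hand-placed these review questions; still, the interleaving idea can be expressed programmatically.  A sketch, with an assumed “one review question after every few new ones” policy (not OLI behavior):

```js
// Interleave review questions from earlier topics among new ones.
// The every-3 policy is an illustrative assumption.
function interleaveReview(newQuestions, reviewPool, every = 3) {
  const sequence = [];
  newQuestions.forEach((q, i) => {
    sequence.push(q);
    if ((i + 1) % every === 0 && reviewPool.length > 0) {
      sequence.push(reviewPool.shift()); // revisit a prior topic
    }
  });
  return sequence;
}
```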


 

5. RESULTS

Evidence of learning

We observed that learners performed 52% better from pre-test to post-test.  We also checked whether the pre-test or post-test was significantly more difficult than the other; our results showed that neither was.


 

‘Doing Learning Science’ with an A/B Test

We performed an experiment to explore whether providing a specific listening strategy (detailed below) would improve learners’ learning gains.  As it happened, a t-test on the data showed no evidence of improved learning.

This led us to additional questions, like: could the strategy be so widely known that everyone, including our control group, used it?  Although we could have been disappointed by the result, my take-away was one of Professor Koedinger’s closely held beliefs: we can’t just apply learning science, we also need to do learning science.
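For reference, a test of this kind compares the mean learning gains of the two groups.  Assuming an unpaired comparison (Welch’s form; the exact analysis is in our report), the statistic looks like:

$$t = \frac{\bar{g}_{\text{strategy}} - \bar{g}_{\text{control}}}{\sqrt{\dfrac{s_{\text{strategy}}^{2}}{n_{\text{strategy}}} + \dfrac{s_{\text{control}}^{2}}{n_{\text{control}}}}}$$

where each $\bar{g}$ is a group’s mean gain (post-test minus pre-test score), $s^{2}$ its sample variance, and $n$ its size; a small $t$ (large $p$) means no evidence that the strategy group learned more.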

Listening Strategy: Play a recording several times and pick out sub-phrases you know to build up understanding of the whole sentence.


 

6. CONTENT

To ask a yes/no question aloud, first think of the analogous statement, and then add the word ‘ma’ to the end of it.

Example:

Nǐ xǐ huān jī ròu means You like chicken.

Nǐ xǐ huān jī ròu ma means Do you like chicken?

Can you answer this question?

[Image: a practice question from the module]

 

Time: 8 weeks

Team: Elizabeth Onstwedder, Teja Talluri

Role: Instructional designer

Methods: cooperating with subject matter experts, UX research (think alouds), cognitive task analysis, QA testing, A/B testing

Tools: Open Learning Initiative (CMU), DataShop, iMovie

How Teachers Use Twitter

The Question:

As a consultant for TeachersConnect, I was asked to find out the answer to the question, “What do teachers do on Twitter?”

 

My Answer:

By “attending” the edchat #bfc530, a 5:30am meetup for educators on Twitter, I found teachers supporting their peers with information, encouragement, inspiration and, when needed, an “I’ve been there too.”

Additionally, they were asking for and receiving help in identifying technology, resources, professional development and job opportunities.

 

The Result:

The fact that edchats exist, despite their non-user-friendly nature, was additional evidence for TeachersConnect that their idea of a social network for teachers has immense value.  TeachersConnect took the research as user validation and used the detailed findings to dig even deeper into teachers’ needs.  I like to think that I contributed to the company’s user research base and, consequently, their design principles, which were forged after a lot more work, especially by Dan Coleman and Dave Meyers.  The design principles can be seen in this blog post called “Find Me a Soulmate.”

 

The Deliverable: