EFFICACY STUDY

Ensuring greater learning gains

Practice’s patented pedagogy has a measurable, proven impact on learners’ lives.
How do we know this? We tested it.

Independent researchers conducted a small efficacy test to determine if teachers who participated in a series of Practice exercises experienced increased competency acquisition.

What did we find?

Teachers who participated in the Practice professional development course demonstrated instructional growth that exceeded the effectiveness benchmark used in a comparable study funded by the Gates Foundation.

HERE’S HOW IT WORKED

  • We designed and launched an online professional development course for math teachers.

  • We collected teachers’ pre- and post-course videos and worked with independent researchers.

  • Researchers used the Mathematics Quality of Instruction (MQI) rubric to analyze the pre- and post-course videos.

AFTER USING PRACTICE, TEACHERS SHOWED SIGNIFICANT INSTRUCTIONAL GROWTH

LINKING BETWEEN REPRESENTATIONS

The extent to which symbolic and verbal representations were linked

EXPLANATIONS

Clear explanations of a problem, its procedural steps, and the reasons behind each step

Chart: pre-test and post-test scores for each dimension

The Study

Recognizing a need to provide an effective, scalable, and low-cost solution to train and support high quality teachers, the National Science Foundation awarded Practice a Small Business Innovation Research grant to beta test its platform with a meaningful group of novice and master teachers, collect survey responses, and iterate the Practice platform in response to the users’ experience. In addition to beta testing the platform, gathering feedback, and iterating the platform, Practice designed and completed an efficacy test for a small sample of users to determine if educators who used Practice’s online learning platform experienced increased teaching competency.

The research determined that 1) Practice can be used to deliver meaningful professional development results, 2) a sample of practicing teachers from a range of district, charter, and parochial schools found participation in Practice’s platform to be a valuable learning experience, and 3) a significant number of master teachers from a range of organizations found the platform to be a valuable teaching tool and plan to use it in the future. The research also found preliminary indications of verifiable efficacy. While the approach is extendable to every aspect of teacher education, Practice’s Phase I research focused on mathematics education.

The sections that follow describe the need for an effective, scalable, and low-cost solution to train and support high-quality teachers; Practice’s proposed solution; a summary of user feedback from Practice’s study; initial efficacy results; and Practice’s future research plans.

The Need

The single most important in-school factor affecting student learning is a quality teacher (Chetty, Friedman & Rockoff 2011). Measured by increased student achievement, effective teacher education and professional development programs include frequent practice, observation of self and others, interaction between teachers about their craft, and mentors (Ingersoll 2012, Mielke 2012, Pianta 2011, Leana 2011, Boyd et al. 2008, Scales 2008, Ericsson et al. 2007, Levine 2006, Ingersoll & Kralik 2004, Rodgers 2002, York-Barr 2001). Interactive learning is considered a critical activity by those who study teacher learning (Bakkenes, Vermunt, and Wubbels 2010; Meirink, Meijer, Verloop, and Bergen 2009; Van Eekelen, Boshuizen, and Vermunt 2005). Despite research touting the great benefits of collaboration between novice and expert teachers (Little 2007; Shulman and Shulman 2004), studies show that collaboration in schools is rare and difficult to support (Gräsel, Fussangel, and Pröbstel 2006; 2AgePro Consortium 2009).

In particular, U.S. public education has long struggled with how to deliver effective, scalable, cost-efficient professional development for its teachers. In a recent survey conducted by the Organisation for Economic Cooperation and Development (OECD), half of the teachers in grades 7 to 9 reported they rarely or never co-teach or observe their peers teaching (OECD 2014). Along similar lines, less than half receive feedback from their peers (OECD 2014). Programs that include frequent practice, observation of self and others, a collaborative environment, and mentors are difficult to emulate. Such programs require not only an enormous amount of money but also enormous amounts of time and effort.

One of the highest-cost components of a university-based teacher education program or alternative certification program is the clinical preparation or practice teaching experience (NCATE 2011). By way of example, Baylor University works closely with the Waco Independent School District. The partnership provides an intensive clinical experience for teachers. During the 2009-2010 school year, the total cost per teacher was approximately $6,000 (Watts & Levine 2010).

As a result of the steep costs associated with highly effective clinical programs like the Baylor-Waco partnership, most university-based teacher education programs do not include clinical practice. Since the 2008 recession, higher education has increased tuition and cut spending, often in ways that diminish access and quality and ultimately jeopardize outcomes (Center on Budget and Policy Priorities 2014). Rather than pay steep clinical costs, most education programs focus on academic preparation and coursework that is loosely linked to school-based experiences (NCATE 2010). This alternative fails to adequately prepare teachers for the classroom (NCATE 2010).

Once in the field, professional development programs provide ongoing training and support for teachers. Professional development costs can include the following: 1) teachers’ salaries; 2) substitute wages; 3) trainers’ or coaches’ salaries; 4) salaries for administrators’ time administering the program; 5) materials for the program; 6) rental costs for facilities used; 7) travel for off-site activities; 8) tuition for university-based professional development; and 9) fees for conferences (Odden et al. 2002). School district spending on professional development ranges from 1.3% to more than 8% of a district’s total expenditures, with an overall average of 2.8% (Mills et al. 2002). On a per-teacher basis, one study estimated that school districts spend between $3,000 and $9,000 per teacher each year on in-service training (Pianta 2011).

Recent economic challenges have likely led to reductions in that level of spending. According to the Center on Budget and Policy Priorities, the past six years have been the worst on record for state budgets. Collectively, state budgets dropped to their lowest level in 2010 with a $191 billion deficit (Center for Public Education 2013). Consequently, school systems that depend on state revenue for the majority of their funding have faced continual cuts, not the least of which is to training and development. In a survey conducted by the American Association of School Administrators, 69.4% of districts reported projected cuts to professional development spending to balance their budgets for the 2013-2014 school year (Center for Public Education 2013).

Yet short-term savings from cuts to professional development programs may yield longer-term costs. Between 40 and 50 percent of all U.S. teachers will leave the classroom within the first five years of teaching (including nine and a half percent who leave before the end of their first year) (Ingersoll 2012). Turnover in teaching is about four percent higher than in other professions (National Commission on Teaching and America’s Future 2004). High turnover rates impose high costs and undermine efforts to guarantee quality teaching for every child. The top reasons for leaving the profession include: 1) lack of professional support; 2) poor school leadership; 3) low pay; and 4) personal reasons. It is well documented that supporting beginning teachers leads to lower attrition rates. Just two small initiatives (working with a mentor and having regular, supportive communication with an administrator) make it substantially more likely that a new teacher will stay in the classroom (Ingersoll 2012). Effective professional development programs are therefore a key component of a school district’s retention strategy.

As higher education continues to face pressure to reduce costs, high turnover rates in schools endure, and district and charter schools continue to face large budget deficits, the need for an economical, scalable, and effective solution has only become more acute. The Practice platform offers the promise of such a solution.


Resources

2AgePro Consortium, (Eds.). (2009). Methods and practises utilized to support teachers’ professional development: Current state description. (Deliverable 2.1.). Oulu, Finland: University of Oulu, Learning and Research Services. http://rua.ua.es/dspace/bitstream/10045/29131/1/EREJ_02_02_03.pdf

Bakkenes, I., Vermunt, J. D. & Wubbels, T. (2010). Teacher learning in the context of educational innovation: Learning activities and learning outcomes of experienced teachers. Learning and Instruction, 20(6), 533-548. http://www.sciencedirect.com/science/article/pii/S0959475209000929

Boyd, D., Grossman, P., Lankford, H., Loeb, S., and Wyckoff, J. (2008). Teacher Preparation and Student Achievement. National Center for the Analysis of Longitudinal Data in Education Research, Urban Institute. Retrieved May 23, 2011, from http://www.urban.org/UploadedPDF/1001255_teacher_preparation.pdf

Center for Public Education, National School Boards Association. (2013). Allison Gulamhussein, Teaching the Teachers: Effective Professional Development in an Era of High Stakes Accountability. http://www.centerforpubliceducation.org/Main-Menu/Staffingstudents/Teaching-the-Teachers-Effective-Professional-Development-in-an-Era-of-High-Stakes-Accountability/Teaching-the-Teachers-Full-Report.pdf

Center on Budget Policy and Priorities. (2014). National Center for Education Statistics, Projections of Education Statistics to 2022, Table 6.

Chetty, R., Friedman, J. N., & Rockoff, J. E. (2011). The Long-Term Impacts of Teachers: Teacher Value-Added and Student Outcomes in Adulthood (NBER Working Paper 17699). Cambridge, MA: National Bureau of Economic Research.

EdSurge. How Teachers Are Learning: Professional Development Remix. Rep. June 2014. https://d3e7x39d4i7wbe.cloudfront.net/uploads/report/pdf_free/6/PD-Remix-EdSurge-Report-2014.pdf

Ericsson, K. Anders, Michael Prietula, and Edward Cokely. (2007). The Making of an Expert. Harvard Business Review.

Gräsel, C., Fussangel, K. & Pröbstel, C. (2006). Lehrkräfte zur Kooperation anregen: Eine Aufgabe für Sisyphos? [Encouraging teachers to cooperate: A task for Sisyphus?] Zeitschrift für Pädagogik, 52(2), 205-219.

Ingersoll, R. and Kralik, J. (2004). The Impact of Mentoring on Teacher Retention: What the Research Says. Retrieved May 29, 2013, from http://www.ecs.org/clearinghouse/50/36/5036.pdf

Ingersoll, R. and Merrill, L. (2012). Seven Trends: The Transformation of the Teaching Force. Retrieved May 29, 2013, from http://www.gse.upenn.edu/pdf/rmi/rmi_seven_trends.pdf

Ingersoll, B., & Schreibman, L. (2006). Teaching reciprocal imitation skills to young children with autism using a naturalistic behavioral approach: Effects on language, pretend play, and joint attention. Journal of Autism and Developmental Disorders, 36, 487-505.

Leana, C. (2011) The Missing Link in School Reform. Stanford Social Innovation Review. Retrieved May 29, 2013, from http://www.ssireview.org/articles/entry/the_missing_link_in_school_reform

Levine, A. (2006). Educating School Teachers. The Education Schools Project. Retrieved April 19, 2013, from http://www.edschools.org/pdf/Educating_Teachers_Report.pdf

Little, J. W. (2007). Chapter 9.Teachers’ accounts of classroom experience as a resource for professional learning and instructional decision making. Yearbook of the National Society for the Study of Education, 106(1), 217- 240.

Lotter, Christine et al. (2009). The Influence of Repeated Teaching and Reflection on Pre-service Teachers’ Views of Inquiry and Nature of Science. Journal of Science Teacher Education.

Meirink, J. A., Meijer, P. C., Verloop, N. & Bergen, T. C. M. (2009). How do teachers learn in the workplace? An examination of teacher learning activities. European Journal of Teacher Education, 32(3), 209-224.

Mielke, Paul G. Investigating a Systematic Process to Develop Teacher Expertise: A Comparative Case Study. (2012). Dissertation Cardinal Stritch University.

Mills, K.H., Odden, A., Fermanich, M., Archibald, S., and Gallagher, A. (2002). An Analysis of Professional Development Spending in Four Districts Using a New Cost Framework. Consortium for Policy Research in Education: University of Wisconsin-Madison. Retrieved April 19, 2013, from http://cpre.wceruw.org/papers/4DistrictPD_SF.pdf

Mullis, I.V.S., Martin, M.O., Foy, P., and Arora, A. (2011). TIMSS 2011 International Results in Mathematics. Retrieved April 19, 2013, from the International Study Center, Boston College.

NCTAF. (2004). Carroll, Tom, and Kathleen Fulton. “The True Cost of Teacher Turnover.” The National Commission on Teaching and America’s Future. http://nctaf.org/wp-content/uploads/2012/01/Tom-Carroll-Kathleen-Fulton-True-Cost-of-Teacher-Turnover-graphic-Threshhold-Spring-2004.pdf

NCATE. (2010). Zimpher, Nancy (Ed.). Transforming Teacher Education Through Clinical Practice: A National Strategy to Prepare Effective Teachers. Report of the Blue Ribbon Panel on Clinical Preparation and Partnerships for Improved Learning. The National Council for Accreditation of Teacher Education. Retrieved April 19, 2013, from http://www.ncate.org/LinkClick.aspx?fileticket=zzeiB1OoqPk%3D&tabid=715

NCES (2010). “Fast Facts.” National Center for Education Statistics. Web. http://nces.ed.gov/fastfacts/display.asp?id=372

NCES (2012). “Fast Facts.” National Center for Education Statistics. Web. http://nces.ed.gov/fastfacts/display.asp?id=372

OECD (2014). Organization for Economic Cooperation and Development. “Survey: Teachers Worldwide Seek More Opportunities for Collaboration.” Teaching and Learning International Survey (2013): n. pag. Education Week. 25 June 2014. Web. http://blogs.edweek.org/edweek/inside-school-research/2014/06/teachers_worldwide_seek_more_c.html

Odden, A., Archibald S., Fermanich, M., and Gallagher, H.A. (2002). A Cost Framework for Professional Development. Journal of Education Finance.

Osipova, Anna et al. (2011). Refocusing the Lens: Enhancing Elementary Special Education Reading Instruction through Video Self-Reflection. Learning Disabilities Research & Practice;

Pianta, R. C. (2011). Teaching Children Well: New Evidence-Based Approaches to Teacher Professional Development and Training. Center for American Progress. Retrieved April 19, 2013, from http://www.americanprogress.org/issues/2011/11/pdf/piana_report.pdf

Rodgers, Carol. (2002). Defining Reflection: Another Look at John Dewey and Reflective Thinking. Teachers College Record.

Scales, Peter. (2008). The Reflective Teacher. Teaching in the Lifelong Learning Sector. Maidenhead: McGraw-Hill/Open UP.

Shulman, L. S. & Shulman, J. H. (2004). How and what teachers learn: A shifting perspective. Journal of Curriculum Studies, 36(2), 257-271.

UNESCO (2013). “A Teacher for Every Child: Projecting Global Teacher Needs from 2015 to 2030.” United Nations Educational, Scientific and Cultural Organization (2013): n. pag. UNESCO Institute for Statistics. Web. http://www.uis.unesco.org/Education/Documents/fs27-2013-teachers-projections.pdf

Van Eekelen, I. M., Boshuizen, H. P. A. & Vermunt, J. D. (2005). Self- regulation in higher education teacher learning. Higher Education, 50(3), 447-471.

Watts, E., and Levine, M. (2010). Partnerships, Practices, and Policies to Support Clinically Based Teacher Preparation: Selected Examples. NCATE. Retrieved April 19, 2013, from http://www.ncate.org/LinkClick.aspx?fileticket=rMrsfjZ2vZY%3D&tabid=715

York-Barr, Jennifer. (2001). Reflective Practice and Continuous Learning. Reflective Practice to Improve Schools: An Action Guide for Educators. Thousand Oaks, CA: Corwin.

The Solution

Practice’s video-based learning platform provides instructors with tools to build exercises that imitate key components of a successful in-person training experience but are delivered asynchronously and online. These components include:

• an ability for teachers to role-play or practice skills;
• peer engagement where teachers observe each other and give and get feedback;
• an opportunity for teachers to observe best practices;
• an opportunity for teachers to receive expert feedback; and
• an ability for teachers to assess their learning.

In 2014, EdSurge, a leading source for educators, entrepreneurs, and investors involved in education technology, published a report outlining how technology supports teacher professional development. The report first assessed professional development needs by conducting over 400 teacher surveys and over 50 teacher interviews. Next, the report examined the professional development tools currently on the market. After comparing the needs and the tools, EdSurge proposed a framework for evaluating tools by examining how they support a model professional development learning cycle. The framework identified four stages of this cycle: Engage, Learn, Support, and Measure.

Practice exercises include each of the four stages identified in the professional development learning cycle, and leading learning theory and research support the stages included in each exercise. The following outlines how a Practice exercise works, how each exercise mirrors the professional development learning cycle outlined by EdSurge, and how the design of each exercise is supported by leading learning theory. Each Practice exercise includes the following stages: 1) Prompt & Response; 2) Peer Assessment; 3) Model & Reflection; and 4) Analysis. To set the stage for rich peer interactions and provide teachers an opportunity to practice specific skills, a Practice exercise first asks teachers to watch a scenario-based learning video during the Prompt & Response stage (e.g., a student asks how to solve a specific math problem, or a parent asks for an explanation of his daughter’s slipping grades). After watching the Prompt, which often includes modeling, teachers upload a video response showcasing or practicing their skills.

Once a teacher uploads his or her response, he or she unlocks the second stage – Peer Assessment. Here, teachers observe each other’s responses and provide feedback through a rubric and written comments. The professional development learning cycle identifies Engage as its first stage because it is often from peer conversations that teachers identify new practices they want to implement or solutions to problems they want to fix. Combined, the Prompt & Response and Peer Assessment stages engage teachers around a single skill challenge and foster conversations about new and old practices. Furthermore, these two stages provide an opportunity for teachers to practice specific skills and receive immediate and specific feedback. According to K. Anders Ericsson, Swedish psychologist and Professor of Psychology at Florida State University, expertise is achieved when people practice deliberately. Part of practicing deliberately involves teachers practicing clearly identified skills and receiving immediate and specific feedback (Ericsson 2007; Mielke 2012). Practice provides mechanisms for both of these actions.

Once teachers have consulted with peers, they then seek information that exists outside of their collegial circle. The professional development learning cycle identifies this as the Learn stage. Practice exercises support the Learn stage by asking teachers to juxtapose their practice video with a best-practice video shown during Practice's Model & Reflection stage. Albert Bandura, Professor of Social Science in Psychology at Stanford University, is well known for a variety of theories and experiments, including his Bobo doll experiment, in which he found that children who observed aggressive treatment of the dolls were more likely to imitate such behavior themselves. Among many other findings, Bandura developed the social learning and social cognitive theories. Bandura argued that people learn by observing others’ actions and the rewards and punishments of such actions. To be most effective, Bandura argued, modeling should: 1) Capture the attention of the learner through engaging and relevant content (attention); 2) Contain steps that are easily remembered and replicable by the learner (retention and reproduction); and 3) Demonstrate why the learner would want to replicate the actions (motivation) (Bandura 1977).

The Model Response video captures teachers’ attention by presenting a relevant skill – one the teachers just practiced during the Prompt & Response stage. Further, Model Response videos are encouraged to contain steps that are easily remembered and replicable by the teacher, as well as to demonstrate why the teacher should replicate certain skills. Through this process, teachers reflect on their own practice and learn new practices. Armed with support from peers and new information, teachers begin to implement new practices in the classroom. Here, teachers often need coaching and mentoring. The professional development learning cycle identifies this as Support.

During the Analysis stage of a Practice exercise, select teachers receive direct support from a master teacher or a coach. All participating teachers can learn from these experts by reviewing this feedback. Finally, Practice exercises allow teachers to measure their growth by reviewing rubric scores on their own practice video as well as observational comments in a discussion thread associated with their practice video. The professional development learning cycle refers to this learning stage as Measure – a way for teachers to assess their learning through rubrics and evaluation forms.

 


Resources

Bandura, Albert. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review.

Ericsson, K. A., Krampe, R., and Tesch-Römer, C. (1993). The Role of Deliberate Practice in the Acquisition of Expert Performance. Psychological Review.

Grusec, Joan. (1992). Social Learning Theory and Developmental Psychology: The Legacies of Robert Sears and Albert Bandura. Developmental Psychology.

Mielke, Paul G. Investigating a Systematic Process to Develop Teacher Expertise: A Comparative Case Study. (2012). Dissertation Cardinal Stritch University.

Rodgers, Carol. (2002). Defining Reflection: Another Look at John Dewey and Reflective Thinking. Teachers College Record.

Scales, Peter. (2008). The Reflective Teacher. Teaching in the Lifelong Learning Sector. Maidenhead: McGraw-Hill/Open UP.

York-Barr, Jennifer. (2001). Reflective Practice and Continuous Learning. Reflective Practice to Improve Schools: An Action Guide for Educators. Thousand Oaks, CA: Corwin.

User Feedback

At the start of 2014, the National Science Foundation awarded Practice a six-month Phase I Small Business Innovation Research grant to conduct the following research:

• use the Practice learning platform in order to build a series of learning exercises for teacher professional development focused on the Common Core State Standards in mathematics;
• have a meaningful group of teachers use the platform as learners;
• have a meaningful group of master teachers use the platform as instructors;
• gather input from these beta users on the platform’s design, functionality, and value;
• adapt the platform based upon this user feedback; and
• demonstrate the platform to several graduate schools of education and school districts in an effort to obtain their input on design, functionality, value, and likelihood of adoption.

In addition to the above objectives, Practice was able to design and complete an efficacy test for a small sample of users to determine if educators who used Practice’s online learning platform experienced increased mathematics competency acquisition (see next tab for details on our efficacy test).

Objective I: Use the Practice learning platform to build a series of learning exercises for teacher professional development focused on the Common Core State Standards in mathematics.

Beginning in January 2014, Practice worked with a professor of mathematics at Drexel University’s School of Education to design a series of learning exercises for middle school math teachers focused on the Common Core State Standards in mathematics (CCSS-M). Practice constructed six exercises that served as the core of a three-month professional development curriculum for middle school math teachers. We then used this prototype course to test the platform.

Objective II: Have a meaningful group of teachers use the platform as learners.

Twenty teachers registered for the six-module, three-month professional development course. Of the twenty registrants, fourteen were classroom teachers and six were instructional coaches. On average, each teacher had taught for ten years, and all taught middle school students. Of the teachers and instructional coaches who registered for the course, twelve worked for a charter school, one worked at a parochial school, one worked at a private school, and six worked in large urban public school districts. Feedback we received from these teachers is described under Objective IV. Iterations we made in response to this feedback are summarized under Objective V.

We were disappointed by the low number of participants in the course. Due to the timing of the grant, we did not begin recruitment until the middle of the school year. As a result, it was difficult to find a school district that could change its scheduled professional development program in order to accommodate our schedule. While the group of twenty teachers provided us excellent feedback on the platform’s design, functionality, and perceived value, we need to ensure greater participation for future research.

Objective III: Have a meaningful group of master teachers use the platform as instructors.

We worked closely with one master teacher to design and facilitate our six-module mathematics course. This master teacher regularly consulted with Practice and advised on user experience and interface iterations. Over the course of our Phase I grant, we demonstrated the platform to fifty school districts, charter management organizations (CMOs), teacher-training organizations, publishers, and independent charter schools. Following these demonstrations, a number of master teachers asked to use the platform during our Phase I research period. These master teachers provided Practice valuable insight into how we could improve the platform from a master teacher’s perspective. Feedback we received from these master teachers is described under Objective IV. Iterations we made in response to this feedback are summarized under Objective V.

Objective IV: Gather input from these beta users on the platform’s design, functionality, and value.

The following section outlines the feedback we received from teachers and instructional coaches as well as from master teachers. With respect to teachers and instructional coaches, we gathered feedback from the twenty middle school math teachers and instructional coaches who participated in our six-module, CCSS-M-aligned professional development course. We gathered input on the platform’s design, functionality, and value through surveys and individual face-to-face meetings. With respect to feedback from the master teachers, we collected data from surveys, interviews, and email correspondence.

Survey Results

Participants in our six-module math course completed two online surveys. The initial survey was distributed after the first module of the six-module course. The second survey was distributed at the conclusion of the course. The following outlines the survey results:

Initial Survey

Of the 20 participants, eleven people completed the initial survey.

With respect to rating Practice’s overall value, 55% of the participants rated Practice a three (3) while 45% rated Practice a four (4) on a one (1) – five (5) scale where 1 is not valuable and 5 is extremely valuable.

Chart: overall value ratings (1–5 scale)

In referring to Practice’s value, one participant noted:

Listening to the responses helps me think about the language I use and how my consistent/inconsistent use of correct language can impact students’ learning.

With respect to ease of use, 50% quickly figured out the functionality and had no difficulty navigating the platform, 40% experienced a slightly slower learning curve, and 10% did not quite understand how to use the platform.

In a final survey question, we asked participants how we could improve Practice. In written responses, the participants focused on the need to improve Peer Assessment. One participant suggested that we change the order of the learning stages so learners first practice a skill and then immediately compare their video with that of an expert. In this sequence, the participant believed, a teacher would be better positioned to provide meaningful feedback to peers after having seen an expert’s response. Another participant requested that we eliminate the top peer-reviewed list. This particular participant objected to competition in a tool that was intended for learning and development.

Second Survey

Of the 20 participants in our CCSS-M course, seven people completed the final survey.

With respect to overall value, 86% of the participants said the course was extremely valuable, while 14% found it somewhat valuable.

Pie chart: overall value ratings

When asked to expound, one participant noted:

Practice exercises were compelling. They forced me to generate a response in order to access each new learning stage. Because of that, I was unable to experience the course as a passive learner. Though it may be somewhat cynical to say, much of my professional development and educational graduate work (both for teaching and administration) has the built-in pitfall of giving me the option to sit passively. Practice did not provide that option, which made it valuable. I also think the (slight) competitive (or at least curious) sense one gets from doing a peer review privately is engaging and made the course valuable.

With respect to ease of use, 43% found the platform extremely easy-to-use, 43% found it easy-to-use, and 14% found it somewhat easy-to-use.

Chart: ease-of-use ratings

One participant noted:

I have never really done an online course like this before and I found the platform to be very easy to use. I appreciated the emails with the reminders of opening and closing dates.

In an effort to gauge whether participants would like to use the platform again, the survey asked if they would recommend that others use Practice. In response, 42% of the participants noted that they would strongly recommend that others use Practice, 29% noted that they would recommend the platform to others, and 29% said they might recommend Practice. One instructional coach noted, “The model is great. I want to use it with my teachers.”

In the final survey question, we asked participants how we could improve Practice. In written responses, the participants all focused on how we can improve feedback loops. One participant asked to have experts give every single participant direct written feedback. Another participant asked that we improve peer feedback by modeling how to give constructive, meaningful feedback before we ask the participants to engage in peer review.

Interviews

In addition to survey feedback, we interviewed a number of course participants. The following outlines input we received from these interviews:

The Director of Professional Development (PD) at a leading Charter Management Organization (CMO) participated in the first module of our course to evaluate whether or not her CMO should adopt the platform for future professional development. The Director of PD noted that Practice’s weakest learning stage was Peer Review. To improve this stage, she suggested that we allow instructors to customize the free-form feedback prompt in order to direct teachers’ feedback. While the Director of PD wanted us to improve peer review, she appreciated the fact that the tool easily allowed teachers to capture practice videos. Traditionally, this CMO had avoided online professional development because technical hurdles clouded the experience. The Director of PD noted that our seamless in-browser webcam provided a solution to the typical technical obstacles (e.g., compressing files) teachers faced when using online video tools for professional development.

A number of instructional coaches from the CMO also took our CCSS-M course. Two coaches provided in-person feedback. Both coaches agreed that the most powerful learning stage ApprenNet offered was the Challenge stage. The coaches noted that traditional professional development is “dangerous” because it does not provide a forum for new teachers to practice skills addressed during the professional development. As a result, the first time teachers practice new skills is in front of their students. The coaches agreed that Practice provides a place for teachers to practice their skills multiple times before they apply the skills in real time with real consequences. To strengthen this learning stage, the coaches suggested that we add a stage that requires teachers to review their own videos. At the moment, a teacher can review his or her submission but is not required to complete any formal review process.

In contrast to the Prompt & Response stage, the coaches noted that Peer Assessment needed the most improvement. While the act of observing others was powerful, both coaches felt this stage could be improved by helping peers give each other more constructive, honest feedback. The coaches noted that peer feedback varied considerably in value. To help make peer feedback more consistent and meaningful, both instructional coaches suggested a number of changes. First, they suggested that we send email notifications to highlight when an individual receives feedback. Both coaches believed that these notifications would increase social capital around individual videos by bringing teachers back to the platform and engaging them in conversation about a particular skill. Further, the coaches suggested that we add badging to reward and recognize people who gave meaningful feedback. Highlighting models of good feedback would encourage emulation from others.

Furthermore, watching the top three peer responses and reading expert feedback was not as powerful as either coach expected. The coaches noted that this stage had diminishing returns: the first video they watched and the feedback they read were helpful, but after the second and third videos they lost interest. Rather than have all participants watch three videos with expert feedback, the coaches suggested that we encourage instructors to provide a short recap of the exercise, highlighting trends they saw and tips teachers could use to improve future practice. In addition, the coaches suggested giving instructors the ability to highlight one or two videos in the recap and use them as a teaching tool, as opposed to commenting on the top peer-assessed videos. This would give the instructor more control over what to highlight and make it more relevant to the overall analysis of the exercise.

The Vice President of Math Education at another leading CMO took the CCSS-M professional development course to evaluate whether or not the tool would be useful to implement across the CMO’s ten schools. Overall, the VP of Math Education said that he “liked the course very, very much.” In particular, he liked 1) the structure, because it provided accountability for accessing each learning stage; 2) the time limit on response videos, because it forced participants to hone their thoughts; 3) the ability to review other answers, because it helped him reflect on his own answer; and 4) email notifications that kept him on track with respect to when a learning stage opened and closed.

The VP of Math Education also suggested a number of ways we could improve peer assessment. First, he suggested that we email participants when they receive feedback to bring them back to the platform and engage in conversation about the feedback. Second, he suggested implementing a badge system to reward people who give constructive feedback. Finally, he suggested that Practice offer a course to all users on how to provide meaningful feedback.

Objective V: Adapt the platform based upon this user feedback.

The following chart identifies user feedback we received throughout Phase I and describes how we adapted the platform in response to this feedback:


Chart: user feedback received during Phase I and corresponding platform iterations

The one theme that stood out from user feedback is the need to improve the feedback and assessment aspects of the platform. Practice is currently working on improving these two aspects of the platform.

Objective VI: Demonstrate the platform to several graduate schools of education and school districts in an effort to obtain their input on design, functionality, value, and likelihood of adoption.

During the course of our Phase I grant, we demonstrated the platform to ten (10) graduate schools of education, seven (7) school districts, ten (10) charter management organizations (CMOs), seventeen (17) teacher-training organizations, three (3) publishers, and three (3) independent charter schools. The following includes an overview of the feedback we received on the platform’s value, design, functionality, and likelihood of adoption.

Value

Overall, people saw great potential and value in Practice. A number of organizations noted that the platform offered a great tool to train coaches. The Director of Teacher Induction at a large urban school district and the Director of Professional Development at a leading CMO noted that Practice solved the problem of coaches and teachers not having enough hours in a day to observe teachers. On a similar note, the Director of a university’s Innovation and Learning Center noted that Practice had a unique solution for meaningfully scaling observations.

The Director of STEM Education at one of the country’s leading suburban school districts noted that Practice provided a tool to leverage internal expertise by giving teachers functionality to easily capture and disseminate best-practice videos of veteran teachers within the district. Along the same lines, a mathematics teacher at a virtual school immediately recognized how Practice could improve interaction between distance teachers. She noted that there is “significant value in folks hearing and seeing perspectives and strategies from others.” Moreover, a former biology teacher and current Ph.D. candidate in STEM Education at a top graduate school of education commented that one of the greatest values of the platform is its ability to crowdsource best-practice videos through Peer Review. The Ph.D. candidate noted that teachers rarely get to see best-practice videos; Practice not only provides this option but also offers the added benefit of letting the teachers curate the best practices.

Administrators at a large urban public school district and two large CMO providers felt the greatest value of Practice was the ability for teachers to form Professional Learning Communities (PLCs) across a population of geographically dispersed teachers. Rarely do teachers across a district and/or a CMO have the ability to get together frequently and share ideas. According to administrators from the large public school district, Practice seamlessly facilitates PLCs. Furthermore, the CMO providers noted that Practice is a great tool for connecting teachers during the period between live professional development sessions.

Finally, a Chief Operating Officer at another CMO and a K-12 Education Consultant for public school districts and graduate schools of education, both noted that providing peers a way to review each other remotely is an extremely valuable part of the platform. In particular, the consultant noted that she has witnessed how much teachers enjoy receiving peer feedback in a live classroom setting at graduate schools of education. She imagines that the ability to scale this online will only provide additional opportunities for teachers to engage in peer observation and review.

While reactions to our demonstrations yielded a great deal of positive feedback, we did discover a number of potential implementation obstacles. In particular, a number of universities and districts noted that they did not have the personnel and expertise to both launch and then ensure participation in professional development delivered through Practice. One education consultant specifically noted that she has seen dozens of districts shy away from adopting new technology simply because they did not have the manpower to support implementation and did not want to add to the existing array of technical applications they had to manage for professional development and assessment. Finally, one school leader noted that he, like many school leaders, is a “luddite” and would prefer to conduct professional development in person. Nevertheless, this school leader noted that he realizes Practice and other applications like it are the wave of the future.

Design

With respect to design, most master teachers found that the technology was “quite compelling” and had an “elegance to it.” In particular, one school leader “like[d] the steps and how [Practice] strung them together.” Despite overall positive feedback about the platform’s design, there were a few things that master teachers did not appreciate. A Director of PD from a large urban school district and a university Director of a Center for Innovation and Learning did not like that peer feedback created a leaderboard. Both found that the leaderboard sent a message of competition as opposed to collaboration.

Functionality

Master teachers provided numerous suggestions for how we can improve the platform’s functionality. In particular, the master teachers suggested the ability to give private feedback, timestamping, customization of the written feedback prompt, and improvements to profiles. In the chart under Objective V we note the feedback received, the reasons for the feedback, and the steps we took to address feedback on functionality. In Part IV we address how we plan to use this feedback to drive future product development.

Likelihood of Adoption

As noted earlier, the greatest hurdle we face with respect to adoption is identifying organizations that have the capacity to invest the time and effort needed to create a curriculum using a new pedagogy, implement a new technology, and ensure teacher participation. In short, the hurdle for adoption is a question of bandwidth. A number of organizations expressed a desire to pilot the platform but simply could not implement our platform in the timeframe of our Phase I project due to a lack of internal capacity. With that said, most continue to express an interest in using Practice. A number of organizations, including a leading publisher, a number of university-based programs, CMOs, and large urban and suburban school districts, are in the midst of setting up pilots for the 2014-2015 school year. A number of organizations have started pilots and one large urban school district is already a paying client, having secured a grant to fund their use of the platform.

Efficacy Results

In addition to gathering feedback from the participants in our three-month mathematics course, we conducted a preliminary efficacy test with Dr. Claire Robertson-Kraft, a postdoctoral researcher at the University of Pennsylvania, to determine whether teachers’ participation in the modules impacted the quality of their mathematics instruction. Our intent was to develop a research design for a larger-scale efficacy test and to see if the results of a preliminary test merited further research.

Research Design

A modified version of the 4-point Mathematics Quality of Instruction (MQI) – a well-known rubric to measure teachers’ mathematics instruction – was used to capture information on teachers’ improvement in their mathematics instruction. The MQI was developed at the Harvard Graduate School of Education with support from the National Science Foundation and the U.S. Department of Education. In our study, trained researchers used four sections of the MQI measure to code teachers’ video submissions, which were 3-5 minutes in length, at the beginning and end of the online course. The identified MQI sections were relevant to the content of the online course and modified slightly to reflect the fact that the videos were solely teacher modeling (rather than actual lessons taught to students).

A paired samples t-test (or dependent t-test) was used to capture the difference in mathematical quality of instruction before and after teachers’ participation in the online course. This type of test is used when the same group of units has been tested twice. By comparing the same teacher before and after the intervention, the participants effectively serve as their own control group.
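To make the analysis concrete, here is a minimal sketch in Python of how a paired samples t-test compares the same teachers’ pre- and post-course scores. The scores shown are hypothetical placeholders, not data from the study; the sketch simply illustrates the kind of calculation described above.

    # Minimal sketch of a paired samples (dependent) t-test.
    # The scores below are hypothetical placeholders, not data from the study.
    from scipy import stats

    # One MQI score per teacher, before and after the course (same teacher order).
    pre_scores = [1.8, 2.0, 2.3, 1.5, 2.1]
    post_scores = [2.6, 2.4, 3.0, 2.2, 2.9]

    # ttest_rel pairs each teacher's post score with his or her own pre score,
    # so each participant effectively serves as his or her own control.
    result = stats.ttest_rel(post_scores, pre_scores)
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")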

The strength of this methodological approach is that it measures the impact of the online course on valued outcomes (e.g., teachers’ actual skill) rather than merely attitudinal measures. However, without a control group, this analysis cannot isolate the impact of the program itself from what would have been expected growth in mathematics instructional quality over the same period of time. Additionally, it is not possible to separate out the impact of the Practice platform from the content and quality of the online course. In future research, we intend to conduct additional experiments to address these deficiencies.

For various reasons, data from only five of the 20 teachers who participated in the course were usable for the test. Nevertheless, the results from these data revealed meaningful impacts from their use of the platform.

Two of the four practices measured by the MQI revealed statistically significant and large effect sizes, indicating that the intervention had a considerable impact on these outcomes. The Measures of Effective Teaching (MET) study, funded by the Gates Foundation and launched in 2009 with over 3,000 teacher volunteers, used an earlier version of the MQI rubric to measure effective mathematics instruction. While we used the newer MQI rubric, the comparisons are still noteworthy. The MET study, conducted by Harvard’s Graduate School of Education, viewed an intervention as effective if it produced an increase of half a standard deviation in a specific practice on the MQI. The effect sizes in our test were substantially larger than this threshold.
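As a rough, hedged illustration of the comparison above, the sketch below computes a paired effect size (mean pre-to-post gain divided by the standard deviation of the gains, one common convention) and checks it against the half-standard-deviation benchmark. The numbers are placeholders, not the study’s results.

    # Hypothetical illustration of an effect size for paired pre/post scores.
    # Uses Cohen's d for paired samples (mean gain / SD of the gains), one
    # common convention; the numbers are placeholders, not the study's data.
    import statistics

    pre_scores = [1.8, 2.0, 2.3, 1.5, 2.1]
    post_scores = [2.6, 2.4, 3.0, 2.2, 2.9]

    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    effect_size = statistics.mean(gains) / statistics.stdev(gains)

    # The MET study treated an increase of half a standard deviation as effective.
    print(f"effect size = {effect_size:.2f}; exceeds 0.5 benchmark: {effect_size > 0.5}")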

Implications

While the results from this preliminary efficacy test were promising, further testing is required before we can make definitive claims of efficacy. With respect to program design, we need to determine our desired outcomes for a larger-scale efficacy test and what intermediary variables could be used to capture progress. We must also ensure a significantly larger sample of participants. Finally, future research should assign teachers to at least two groups (control and treatment with Practice). With a control and a treatment group, we can isolate the impact of Practice from the impact of the content.

Future Research

To continue to measure effectiveness, we are planning to design two randomized controlled trials. To measure Practice’s impact, we will create two professional development courses, recruit teacher participants, and then divide volunteer teachers into three groups. The first group will take the entire course delivered on Practice (the Treatment). The second group will not take the course (the Control). The third group will be given access to the course content but will not participate in the course through the Practice platform (Control+Content).

By comparing observational surveys and other data points, Practice researchers will measure whether there are any differences among the Treatment, Control, and Control+Content groups. Practice will also administer two attitudinal surveys over the course of the study to gather teachers’ own reflections on the platform.
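As a small sketch of how the planned assignment could work, the snippet below randomly splits a recruited pool of teachers into the Treatment, Control, and Control+Content groups. The group names follow the description above, while the pool size and random seed are illustrative assumptions.

    # Illustrative sketch of random assignment to the three planned groups.
    # The pool size and the seed are assumptions made only for this example.
    import random

    teachers = [f"teacher_{i:03d}" for i in range(1, 61)]  # hypothetical pool of 60
    groups = ["Treatment", "Control", "Control+Content"]

    random.seed(42)          # fixed seed so the assignment is reproducible
    random.shuffle(teachers)

    # Deal the shuffled teachers round-robin so the three groups are equal in size.
    assignment = {group: teachers[i::3] for i, group in enumerate(groups)}

    for group, members in assignment.items():
        print(group, len(members))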

At the conclusion of our research, Practice will collect all the data and publish a report.

SEE PRACTICE’S IMPACT FOR YOURSELF

REQUEST A DEMO