MMSA develops and applies research and best practices in science, technology, engineering, and math (STEM) education. Many of the projects at MMSA seek answers to unique and important questions about STEM learning across many different environments. WeatherBlur Next Generation is an exciting project funded by a three-year cyberlearning grant from the National Science Foundation. When the WeatherBlur research team was tasked with measuring participants' knowledge of the science process, they developed an assessment tool that can be administered in a timely and consistent way. Inspired by a science process flow chart developed at UC Berkeley, the research team created a hands-on version to use while interviewing WeatherBlur students. The instrument consists of two circles, one representing the exploration stage of the science process and the other the investigation stage, plus a series of small squares labelled with possible smaller "research" steps that may be part of each larger stage.
In one-on-one interviews with students, scientists, and teachers, the research team reads a vignette about ticks and then asks participants to discuss which smaller steps they think are most important. The interviews, conducted in the fall and spring, are analyzed and coded by the research team. By giving a standard instrument to all participants, we are able to gauge progress and change in students' understanding of the scientific principles they are studying.
Another key project at MMSA is the ACRES project (Afterschool Coaching for Rural Educators in STEM), a major investment by the Noyce Foundation and the National Science Foundation to provide high-quality professional development opportunities for out-of-school providers, especially in rural settings. The ACRES project is developing a script improvement tool to help afterschool STEM providers create better experiences for kids. The tool aims to provide a consistent method for assessing learning outcomes and to produce a set of standards by which various skills can be rated consistently. Participants will be able to get useful feedback from remote team members, along with advice on improvements they can make. The instrument is currently in development and will be piloted in the coming months.
Creating innovative tools that can be administered consistently, under a variety of conditions, to many different participants, and by various team members is a difficult task in itself. The research team at MMSA also strives to make these tools relevant and engaging for participants. Recent feedback from an ACRES participant in a 21st CCLC program points to the success of that goal: "from everything I have had so far [professional development experiences], I know this is the one that has had the greatest impact on me. It has really been great for me as a teacher to elevate my ability to get my students to learn for themselves."