What do these numbers actually mean? Rethinking Student Grades and Scoring.
A grading system based on total points does not accurately reflect students' understanding of science content. Students who demonstrate that they understand half of the content should not earn a failing score, nor should students earn arbitrary points for completing tasks unrelated to science content. Student scores should reflect what a student understands, not how well the student can play the game we call school. Teachers are encouraged to use standards-based grading, but not everyone knows how or where to start, or whether the change is worth the effort. Participants will be led through my journey in becoming a teacher who uses standards-based grading. I will share the struggles of changing my mindset about grades and the way I grade, as well as the benefits: a better understanding of what students actually know, grades that more accurately reflect that knowledge, and fewer failing students, among other things. Basic strategies for assessing level of understanding will also be presented, and time will be given for questions and answers.
Takeaways: Participants will be given strategies for shifting their view of scoring from the total number of points students earn to the students' level of understanding.
Meredith Diehl (Northview High School: Sylvania, OH)
Presenter Materials for this Session:
What do these numbers actually mean.pptx
Biology Assessment Standards.docx
Approaches to Assessment and Grading that Support Student Sensemaking
McCormick Place - Skyline W375a
As educators shift their teaching practice to align with A Framework for K-12 Science Education and the NGSS, they face various challenges and barriers. One pressing challenge is how to align their new approach to teaching and learning with existing assessment and grading systems. In this session, we will provide examples of 3D assessments and associated scoring guidance. Participants will review student work for these sample assessments and identify evidence of understanding. They will collaborate with others in the session to determine how they would assign grades based on set criteria. The second part of the session will highlight different approaches to grading based on local grading expectations (e.g., standards-based grading, daily grade requirements, or 100-point systems). Participants will leave the session with approaches to assessment and grading that support student sensemaking and honor the diverse resources students bring to the classroom.
Takeaways: Participants will leave the session with approaches to assessment and grading that support student sensemaking and honor the diverse resources students bring to the classroom.
Sarah Delaney (OpenSciEd: San Francisco, CA)
CONSTRUCT: a Crowd-sourced Online Tool for Developing Middle-school Physical Science Assessments using Disciplinary Core Ideas
Do the test questions you use adequately reflect your students' true understanding of science? We'll share guidelines for writing effective questions that don't leave any of your students out and that help you determine whether your students are making sense of the phenomena they are investigating: do their ideas match the science ideas of the NGSS?
Using a research-based “citizen science” approach, teachers can volunteer their favorite items and help improve our existing MOSART questions. Crucial item characteristics will be measured and reported, such as difficulty, effectiveness, gender, and racial/ethnic bias. Write new questions or revise ones you already have to address how well students make sense of elements outlined in the NGSS DCIs.
The following is a question that is too difficult for middle school students:
Matter is made of tiny bits called atoms. What is between the helium atoms in a balloon?
a) Tiny particles that bind atoms together.
b) A chemical substance that attaches helium atoms together.
c) Nothing; the helium atoms touch each other on all sides.
d) Nothing, just empty space.
e) Air.
How would you revise this item? We’ll have “practice” opportunities to look at assessment questions that are difficult or biased and discuss possible revisions with other educators.
Takeaways: Write assessment questions that address the item characteristics of difficulty, effectiveness, and gender and racial/ethnic bias.
Cynthia Crockett (Center for Astrophysics | Harvard & Smithsonian: Cambridge, MA), Philip Sadler (Center for Astrophysics | Harvard & Smithsonian: Cambridge, MA)
"Say That Again???..." Know Your Students' - and Your Own - Misconceptions in Science
"Kids say the darndest things," don't they? Or do they? Either way, what they say is really what they believe, whether it's correct or incorrect. Do you know what ideas your students bring to the classroom and use to shape their ideas about science? How do we accurately assess their ideas against the disciplinary core ideas of the NGSS?
We wonder where those ideas come from and why students hold them. Our students make sense of science from many places and venues and then use that as a foundation for their learning. However, it may not always be a solid foundation. We can help students develop their science knowledge using phenomena, observation, and robust assessment, as well as through a deeper understanding of the misconceptions they hold. Learn the extent of what your students are thinking, and why they think it, using research-based assessments that deliberately include students' own ideas. Explore students' ideas and misconceptions (as well as your own!) in the physical sciences at various grade levels, and know some of what students bring with them before they walk in the door!
Takeaways: Educators will learn research-based misconceptions that students hold across grade bands in the physical sciences in order to incorporate those into assessment.
Cynthia Crockett (Center for Astrophysics | Harvard & Smithsonian: Cambridge, MA)
What's a Cluster? Understanding the Illinois Science Assessment
The Illinois Science Assessment is written by Illinois science teachers for Illinois science students. Learn more about the format of this test and how you can model test clusters in your classroom.
Takeaways: Illinois Science Teachers will gain insight into how to better prepare students for the ISA by learning how to create clusters for use in their classroom.
Carol Baker (Lyons Elementary School District 103: Lyons, IL), Harvey Henson (Southern Illinois University Carbondale: Carbondale, IL), Angela Box (Southern Illinois University Carbondale: Carbondale, IL)