Tuesday, July 1, 2008

2015 goals

Our program strengths at the New Canaan High School library include providing 24/7 services to students and enough options for differentiation that every student has access to research help suited to their learning style, whenever they need it and wherever they are. Our online services were developed out of sheer need - necessity was definitely the mother of this invention. Our online instructional portal is accessible not only to the entire New Canaan community, but also to anyone on the Web. We find that transparency makes us user-friendly: click and learn, and that learning is available to anyone who clicks. This is especially helpful to collaborating instructional partners, whether it be the classroom teacher, the special education teacher, the instructional assistant, the ESOL teacher, a district K-8 librarian, or parents.


The key is the “Click and learn” part. Many students - and especially those who need it the most - don’t do the clicking and learning. This is our weakness. While we’ve given students numerous access points to instruction, we struggle to document what they are actually learning from the library program, whether face-to-face or online. This is our ongoing challenge.


Last year, we focused on measuring growth in research learning among juniors. We used a plethora of data points - too many, in fact. It became unwieldy to aggregate all the collected data into meaningful information, so we ended up using the research paper grade to measure growth, and the results did not clearly establish a correlation between library instruction and student improvement. This year, we want to use an assessment that will definitively measure the impact of library instruction on student learning. While I am tempted to reuse the measure we developed last year, I think we need to build on it, simplify it, or replace it altogether (with EasyBib or TRAILS). Beyond that, the big overarching questions include:
  • Which library services are most valuable?
  • Which method of instructional delivery is most valuable? Most impactful?


Earlier iterations of our Junior Research Paper pre- and post-assessments taught us that juniors’ areas of weakness in research skills include:
  • Scrutinizing language to determine author purpose (close reading)
  • Embedding references in their writing
  • Moving from more superficial, encyclopedic, or reference research to more in-depth, detailed research.
These are the skills we will introduce in the 9th grade social studies program. We hope to collect feedback on our instruction through exit surveys for the freshman and sophomore social studies research projects. I have concerns about converting this into a SMART goal because I find open-ended questions the most informative, but also the hardest to measure. I am attaching a sample of the kind of data we can collect through this mechanism.


In 2013, we developed a research ICT Scope & Sequence for the high school, aligning it with the 21st century learning expectations we developed during the NEAS&C self-reflection process. With that skill-development outline in mind, I will teach more intensive, period-long classes in ninth grade social studies to help prepare students earlier for the junior research paper, giving younger students stronger skills and more time to practice them before their high school graduation is at stake. Last year, we “renovated” the Collapse project, and I think we can improve it again this year with a few modifications. Our tenth grade experiences in social studies are very robust, so just some fine-tuning will bring those experiences in line with the 11th grade expectations. My point is that all our K-12 research experiences lead toward this research paper, which is why I have no hesitation in coming back to it year after year for my goal. Until we come up with another one, this is the capstone project in our district. All other research experiences should teach students skills that will help them succeed on this one benchmark.


To analyze our impact on Junior Research Paper instruction, I will select a sample group that is representative of the entire junior class, starting with a cohort of 45 students. It is important to start with a larger group, because it is the students who struggle the most with whom I will spend the most time later on. I will learn more about my instruction from the underperformers than the high performers - although not necessarily the lowest performers - so even though I want a representative group, it has to be large enough to include enough strugglers for me to learn what I need to.
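To make the sampling idea concrete, here is a minimal sketch of one way such a cohort could be drawn - a proportional (stratified) random sample by rough performance band. Everything in it (the band labels, the counts, and the roster itself) is hypothetical and is not necessarily the procedure I will follow; it simply illustrates how a 45-student group can stay representative while still including enough strugglers.

```python
import random

# Hypothetical roster: each junior tagged with a rough performance band
# from prior coursework. Names, band labels, and counts are invented
# for illustration only.
roster = {
    "High": [f"High-{i}" for i in range(90)],
    "Middle": [f"Middle-{i}" for i in range(150)],
    "Struggling": [f"Struggling-{i}" for i in range(60)],
}

COHORT_SIZE = 45
total = sum(len(students) for students in roster.values())

random.seed(2015)  # reproducible draw
cohort = []
for band, students in roster.items():
    # Draw proportionally from each band so the cohort mirrors the class,
    # which also guarantees a meaningful number of struggling students.
    k = round(COHORT_SIZE * len(students) / total)
    cohort.extend(random.sample(students, k))

counts = {band: sum(name.startswith(band) for name in cohort) for band in roster}
print(len(cohort), "students selected:", counts)
```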


We use the NCHS school-wide junior research rubric to measure student performance on this graduation standard. Since this is the measure used to calibrate grading across the two disciplines that assign and grade the research paper, it is the measure I will use as well.


We hope to administer an exit survey asking students to rate the impact of library services on their research learning for both research papers. Services to be evaluated include:
  • THE ANNEX@ online instructions
  • Videos from our YouTube channel
  • Face-to-face classroom instruction from a librarian
  • Texting the library service
  • Personal one-on-one help from a librarian
  • The library collection


I would love to run focus groups for research paper students, but it is virtually impossible to schedule such a thing without pulling students from class, which is unfair. If everyone did that, students would miss too much class. Instead, I will probably do individual interviews with a handful of students, and excerpt quotes to share here.   
TEPL alignment:


Metacognition - RTDC 9: Regularly models metacognition and consistently provides opportunities for student reflection and self-assessment as part of the learning process.
The research paper requires multiple checkpoints where students can pause, reflect on their progress in relation to the research continuum, reevaluate their original ideas and questions, and contemplate their next steps. This year, we hope to work with classroom teachers on providing feedback to students on those reflections. This can be a tough sell. Some teachers are fully on board. Others are far more territorial and feel it is their job alone, and that accepting outside assistance means they are not adequately performing that job. I try to present this as a healthy collaboration, but as I said, it can be a tough sell. Sometimes it is simply a time management issue. Teachers feel that collaboration takes more time than working alone, which can be true, at least in the beginning. By the time we agree on an instructional platform, communication methods, and format, and calibrate the evaluation, the teacher may feel as though grading hard-copy papers alone is faster and easier. It’s a fair concern. But I do believe that a higher teacher-to-student ratio is always better for the students. I wish more colleagues saw it that way.


Domain 3: Responsive Teaching in the Differentiated Classroom - The educator implements effective instruction that engages students in rigorous and relevant learning, resulting in student growth and achievement - RTDC 2 - Instructional Practices: Uses a variety of evidence-based practices matched to the learner and task, enabling students to construct and apply new learning.
As I mentioned at the start, this is our strength. See our email signature below. We offer students more access to help than any other school library I know of. Period. It is one of the reasons that other school librarians look to our program as an exemplar. It isn’t that we are better, only more accessible. Having said that, channeling access into personalized instruction is tricky. We want to do a better job of this. This will be an important focus for us this year.
email signature.png


Professional Growth Plan (PGP)


Rationale
If I monitor student learning better, then I will better understand what to teach students, and how to teach it most effectively.
Student Impact
As I monitor students better, they will have more opportunities for reflection, and to obtain feedback at various checkpoints, which will help them be more purposeful in planning the next steps in their research.
Professional Learning
I learn a great deal from developing presentations for professional learning. Whether I am using my own work as content or interviewing other educators about their practice, I am forced to pause, reflect, transcribe, and document what I know and what I still need to learn throughout the year. I am a better practitioner because of this ongoing work.
Resources
I learn better from conversation than from print. Thus my professional learning network is my greatest resource, and it is a big one - from my New Canaan High School colleagues, with whom I meet on a daily basis, to faraway New Zealand librarians I met through Twitter, I have access to unlimited learning resources. I can pause when it gets too noisy and I just need to process, then pick up the learning thread when I am ready to take in more.
Collaborators
My NCHS colleagues - specifically, for this year's goals, the junior English and Social Studies teachers - and my extended PLN.
Impact on Professional Practice
What I learn will change what I teach, and how I teach it in the future.
Impact on student learning
Hopefully, they will come back after graduation and tell us that they really feel as though they have a leg up on writing research papers in college!
Learning community
Continue working with my advisory/mentee group to ensure that each one feels connected to an adult in our learning community. Our group is a diverse one with wide-ranging issues. This continues to provide deep professional satisfaction, though it is not without its challenges.
Whole school student learning
As an educator who teaches all 1,300 students and co-teaches with any one of our 120 teachers, I try to employ social media to improve school climate - to reward students publicly for their positive behavior. I've been known to exclaim in class, "That's Tweetable!" The synergy between our social media platforms makes it easy to integrate those mini celebrations into our instructional portal, THE ANNEX@, where they are displayed alongside our instruction.
Mid-Year Reflection


Part 1: Prompts to spark reflection on professional practice and effectiveness. Consider one or more of these reflection prompts to frame your mid-year self-reflection. This is not intended to be an exhaustive list; it is simply to provide a springboard for your reflection and build a reflective stance on practice over time.
  • With respect to your professional learning focus, what has been most satisfying to date this year? What has contributed to that?
  • With respect to your professional learning focus, what has been most challenging to date this year? What obstacles have you encountered and how have you attempted to address those obstacles?
  • What questions about your practice have emerged for you for continued study and growth through your work this year?
My favorite professional learning experiences come from partnering with my colleagues - both in and beyond New Canaan Public Schools - to spark our learners’ curiosity and their desire to question everything. As mentioned in my Start of the Year Reflection, I wanted to roll out full-period lessons that I could co-teach with my 9th grade colleagues. We built three such units into the first semester curriculum. What made it spectacular was the seamlessness with which we were able to integrate these into the students’ coursework, so that it seemed logical to students for a librarian to be in the class for that lesson - there was no perception of “add-on instruction”, which is a sort of occupational hazard in my line of work.


The lessons:
  • The Confucius Article
    • Scrutinizing language to determine author purpose (close reading)
  • The Ancient Greece DBQ
    • Embedding references in their writing
  • The Dalit Lesson
    • Moving from more superficial, encyclopedic, or reference research to more in-depth, detailed research.
These lessons were initiated by a classroom teacher, Amy R., who teaches both juniors and freshmen. She knows what lies ahead for these students, and she is acutely aware of the juniors’ weaknesses. When she felt her students were ready for a challenge, she approached me, asking me to teach the 9th graders a specific new skill. I asked her what content students were learning, and we developed a lesson around that. According to the classroom teacher, these lessons were highly effective, as evidenced by the students’ bibliographies/works cited in comparison to those from the class of 2017, whom she taught the previous year.


The greatest professional learning challenge was time. In addition to teaching, I still have a library to administer. We are transitioning our collection in several important ways (genre-fying, and downsizing the print collection). The loss of a full-time instructional assistant in 2013 has profoundly impeded our ability to stay on track with our goals. Moreover, without the continuity of a full-time staff member at the circulation desk, students are not getting the reader’s advisory services they once had. We used to be able to cultivate a genuine love of reading in our students through casual circulation desk conversations. Our instructional assistant knew our students’ interests and was able to match books with kids in ways that our volunteers simply cannot.


Part 2: Please complete reflections and cite related evidence year-to-date for each category. Consider the rubric descriptors as they align to your performance but assigning an overall rating for each category is optional at this time. Changes you are considering to your Professional Growth Plan can be indicated in the Student Growth and/or Observation categories. Submit this mid-year reflection to your evaluator prior to the mid-year conference.




Reflecting on student learning/performance data year to date, describe:
  • Areas of Strength with evidence noted
  • Next steps with rationale
Figure A - Student Score Sheet
We pre-assessed our juniors’ research skills in November using a simplified version of the assessment we used in 2013-2014 (see attached). We shared the aggregated results with the classroom teachers and students (see Figure A). As was the case last year, students did not perform particularly well on the pre-assessment (see Figure B). There is a silver lining to poor outcomes on the pre-assessment, though: students are presented with evidence that they may have something to learn about research, which makes them a little more receptive to future instruction. The class-by-class results are attached.
Figure B - Class of 2016 Pre-Assessment Results


As you may be able to see from Figure B, only six percent of students chose the correct answer on the author purpose question. This is a close reading exercise that prompts students to carefully examine language and visual clues in the document to discern the author’s intention. Other areas of concern included choosing a sound search strategy in a database, embedding references in their writing, and distinguishing objective from subjective journalism (i.e., knowing the difference between op-ed and journalistic reporting). Under half the class was able to recognize bias in an article. So at least two of the concerns I identified in my start of the year reflection were confirmed:
  • Scrutinizing language to determine author purpose (close reading) - Questions #14 (6% correct), #11 (31%), and #17 (49%)
  • Embedding references in their writing - Question #9 (21% correct)


On the other hand, my original speculation that students don’t fully understand how to move from more superficial, encyclopedic, or reference research to more in-depth, detailed research was proved wrong (71% got it right) - IF my question was an effective measure for that skill, and I am thinking it wasn’t.


I was able to adapt my instruction based on the pre-test outcomes. We developed online mini-lessons for each pre-assessment question, then offered the classroom teacher three options:
  • An in-class, librarian-facilitated lesson
  • A video-recorded version of the librarian-facilitated lesson shown during class
  • A “flipped” video assigned as homework, with a follow-up visit from the librarian to facilitate the ensuing discussion
Because teachers were provided with individual class reports, they were able to pick and choose the lessons that best aligned with their class’s needs and avoid redundant instruction.


The greatest challenge was the English teachers’ start time on the research paper. Most started right around Thanksgiving and finished just before exams. Some social studies teachers, who were instructed to start the research paper early in the second semester, did not have access to all of their students’ first semester scores as they introduced their social studies research paper. Having the first semester scores makes it easier to tailor instruction to individual students’ needs during the second semester. It helps both the classroom teacher and the librarian target specific students who may need additional support in order to pass, particularly those who did not meet goal the first time around.


Once those scores come in, we will be able to personalize instruction for social studies students as they commence work on their second research paper. It will also be helpful to look for a correlation between the pre-assessment and student scores. I am concerned that pre-assessment success is not a predictor of research paper success, which would mean I need to change how I use it in the future. I still really like the questions, but maybe I shouldn’t use it as an assessment, but rather as a class activity to prompt a discussion about the research process.


Reflect on school performance data and your contributions year to date:
In 2013, the school developed seven school-wide 21st century learning expectation rubrics as part of the NEAS&C reflection process (problem solving, communication, reflection, healthy living, collaboration, respect, contribution). At the start of the year, we did not have a mechanism to document student mastery of these goals. I combined all of the high school’s 21st century learning rubrics into one document (see attached) so that teachers could mix and match criteria based on their units of instruction. This minimizes rubric overload. For example, a teacher could craft a compact rubric covering collaboration, communication, and only the parts of problem solving that apply to research. These were forwarded to Rick R. so he could incorporate them into PowerSchool as learning standards. Now teachers can check off a standard on the student’s record when it is met.


Referencing the NCPS Effective Teaching Framework and feedback from observations year to date, describe:
  • Areas of strength with evidence noted (as appropriate at the indicator level)
  • Next steps with rationale (as appropriate, at the indicator level)


While I was not observed by an administrator, I co-teach every lesson with at least one classroom teacher, and I consistently ask for feedback. Fortunately, my colleagues are comfortable sharing their thoughts. Most often, I am told to slow down. Sometimes, I am told to make things clearer. Occasionally, I am going too fast for the teacher and not fast enough for the students. Also, teachers occasionally forget that I see their students in other classes, and that I might have covered a concept or skill in a different discipline. Nevertheless, I take my colleagues’ feedback very seriously. In the future, I hope to collect feedback directly from students with exit surveys. They are, after all, the most important stakeholders, and unlike administrators, they have the opportunity to observe all my lessons.
Reflect on learning community growth and your contributions year to date (including reflections / professional evidence related to Domains 5 and 6):


I facilitated professional learning workshops on every professional development day we’ve had this year. I co-taught sessions on iPad apps, Moodle, and Google Classroom to rotating audiences of up to 25 teachers at a time. Teachers also recognize that I am “techie” enough to answer their questions and get them going with new software and applications for classroom use. While there are other certified faculty in the building who share this responsibility, we have all learned that more is, in fact, more where tech integration is concerned. The more tech integrators there are, the more teachers will experiment - and that is great for kids!

Year-End Reflection



Reflecting on student learning / performance data year to date, describe: (Teacher Reflection)
At the mid-year, I didn’t have all the first semester scores, so it was hard to determine how useful the pre-assessment was in identifying students who might struggle with the research paper. In the meantime, I spent some time looking for patterns among leveled courses (regular, honors, and AP U.S. History). I averaged each class’s score, compared that to the overall junior class score, computed how many students in each section were enrolled in AP U.S. History, and then labeled the honors classes (see Figure C). Brown’s and Hamill’s classes really perplexed me. Their performance on the pre-assessment further validated my concerns that student scores are not effective predictors of students’ ability to successfully complete their research papers.

Figure C
Once I received the students’ first semester research paper scores, I was able to look for a correlation between the pre-assessment and those scores. Only 56% of students’ scores showed any correlation at all. The lowest pre-assessment scores were not a reliable predictor for identifying students who would need extra support: all three of the lowest scorers passed the first semester research standard. High scores were more reliable in determining who might not need additional support from a librarian (7 of the 9 highest scorers passed the research standard). The most incongruous results came from students who scored a 5 on the research paper in the first semester: of the 14 students who earned the top score, only 3 performed well on the pre-assessment. So next year, we will find another way to use that assessment. Perhaps we should use the sophomore English portfolio scores instead. We could use the first draft of the first semester research paper’s works cited, but I am concerned that that may be too late to intercept learners for whom discouragement and anxiety might have already set in. This is going to be a critical challenge as we move forward.
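Because I keep coming back to this question, here is a minimal sketch of the kind of comparison behind statements like the one above, assuming a hypothetical set of per-student pairs (pre-assessment percentage, first-semester rubric score). The numbers, the column meanings, and the "same half" agreement check are illustrative assumptions, not the actual data or the exact method we used.

```python
from statistics import correlation  # Pearson's r; requires Python 3.10+

# Hypothetical per-student pairs: (pre-assessment % correct, first-semester
# rubric score on the 1-5 research standard). Invented for illustration;
# not our actual cohort data.
students = [
    (35, 4), (80, 5), (55, 3), (20, 4), (65, 2),
    (90, 5), (40, 3), (30, 5), (75, 4), (50, 2),
]

pre = [p for p, _ in students]
paper = [s for _, s in students]

# Overall linear relationship between the two measures.
r = correlation(pre, paper)

# A rougher per-student "agreement" check: did each student land in the
# same half (above/below the median cutoff) on both measures?
def top_half(values):
    cutoff = sorted(values)[len(values) // 2]
    return [v >= cutoff for v in values]

agree = sum(a == b for a, b in zip(top_half(pre), top_half(paper)))
print(f"Pearson r = {r:.2f}")
print(f"{agree}/{len(students)} students landed in the same half on both measures")
```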

Nevertheless, once we had the first semester scores, we were able to use them to identify students who might need additional support. Mindful of FOI regulations, most teachers simply handed me a hard-copy list of student names. In one class alone, 74% of students did not meet goal on the first semester research paper. The teachers scheduled co-taught lessons with me, in which I taught as many as five individual lessons per class, each with a built-in assessment. It occurs to me that these might be a better predictor of student performance, but they come too late in the research process, and aggregating five assessments’ worth of outcomes for 300 students is too unwieldy. That’s where I ran into trouble last year. It was simply too overwhelming. The five lessons were:
  1. Objective v. Subjective Journalism - The Bernie Madoff Lesson
  2. Resource Evaluation - Bibliography Dos and Don’ts
  3. Identifying Parts of a Citation - Periodicals v. Web
  4. Embedded References - The Carlisle Bargeron Mystery
  5. Unpacking a Research Study - College Students’ Stress

As in the first semester, teachers had options for how to administer these lessons:
  • An in-class, librarian-facilitated lesson
  • A video-recorded version of the librarian-facilitated lesson shown during class
  • A “flipped” video assigned as homework, with a follow-up visit from the librarian to facilitate the ensuing discussion

The most effective collaboration took place in the American Studies class with Bob S. and Evan R., because they managed the assignment through Moodle with built-in checkpoints where students uploaded their work as they progressed. They made me a co-teacher in their Moodle course and assigned me five students to monitor. Having access to the entire class helped me calibrate my feedback to students so that it was aligned with the classroom teachers’ rigor and style.

At the end of the year, we were able to look for student growth. Within my cohort group, 48% of students scored the same both semesters, 27% scored higher, and 25% scored lower (see Figure E).

Figure E
As in prior years, this did little to confirm or deny whether the library program had any impact on those score changes. We administered an exit survey to (hopefully) establish a correlation between score changes and students’ ratings of library services’ helpfulness. Unfortunately, we only found correlation in 35% of our respondents’ scores (see Figure F).

Figure F

So after all this examination, it looks like all we have to measure our impact on student learning is student testimony - which was resoundingly positive. On average, 47% of respondents said they found THE ANNEX@ helpful, 20% found our YouTube channel helpful, 70% found librarian-facilitated lessons helpful, 51% found texting the library helpful, 60% found one-on-one help from a librarian helpful, and 86% found our collection helpful. We broke these numbers down by research paper in Figure G below.
Figure G
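For anyone curious how a breakdown like Figure G gets tallied, here is a minimal sketch assuming a hypothetical list of survey responses, one entry per student per paper per service, with a yes/no helpfulness answer. The responses and the exact data structure are invented for illustration; they are not our actual survey export.

```python
from collections import defaultdict

# Hypothetical exit-survey responses: (research paper, service, found_helpful).
# The service names mirror our survey list above; the responses themselves
# are invented for illustration.
responses = [
    ("English", "THE ANNEX@", True),
    ("English", "YouTube channel", False),
    ("English", "One-on-one help from a librarian", True),
    ("Social Studies", "THE ANNEX@", True),
    ("Social Studies", "The library collection", True),
    ("Social Studies", "Texting the library", False),
]

# Tally helpful / total per (paper, service) pair.
tally = defaultdict(lambda: [0, 0])  # key -> [helpful_count, total_count]
for paper, service, helpful in responses:
    tally[(paper, service)][1] += 1
    if helpful:
        tally[(paper, service)][0] += 1

for (paper, service), (helpful, total) in sorted(tally.items()):
    pct = helpful / total
    print(f"{paper:15} {service:35} {pct:.0%} found helpful ({helpful}/{total})")
```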
Next year, I hope to employ the Moodle model we used in the American Studies class across the entire grade. This will require a great deal of support from the department chairs, and the classroom teachers may find it meddlesome, but I see it as a great solution for finally establishing the correlation I seek.

Reflect on school performance data and educator contributions year to date: (Educator) *
Keeping our move to SBAC in mind, I developed materials to help teachers better appreciate the role research plays in Common Core skill development (please see attached). To better explain the research process to students, we developed the Research Continuum. Moreover, some of our lessons on close reading - particularly the second semester Bernie Madoff lesson, which we administer using SubText on iPads when they are available - mirror the testing format students experience when taking the test.

Referencing the NCPS Effective Teaching Framework and feedback from observations year to date, describe:
As you can see, though I was not formally observed, I’ve given very careful consideration to the impact of my lessons and their delivery. This entire reflection is an investigation into the effectiveness of my instruction. While I have much to learn, I have some useful tools to help me improve my instruction in the future.

Domain 2:
2.1 Student Needs (Strength)
  • Plans include opportunities for students to make choices based  on their learning needs, learning styles, and/or interests.
Students have multiple access points to instruction. We offer face-to-face lessons, online written instructions, video lessons, and texting services for students who still have questions; in most cases, we answer text messages within minutes. Students deeply appreciate having so many ways to get help. While few students use all of the available library services, most genuinely appreciate those they choose to use.

2.2 Coherent Design (Strength)
  • Plans include opportunities for students to make connections between skills or concepts being taught.
  • Plans include opportunities for students to generate questions that further their understanding of skills or concepts being taught.
  • Plans include opportunities for students to identify/pose and solve problems related to real-world issues
Research continuum 9th grade Short.png
Figure H

Figure H above illustrates how we teach students that research is a recursive process - that they must continually question everything, including their own methods and theses. This graphic is an essential instructional tool we employ with students in grades 9-11. Teachers are beginning to post it in their classrooms.

2.4 Appropriate Interventions (Growth)
  • Plans demonstrate educator’s collaboration with colleagues and specialists in development of interventions.
This is where I hope to better align instruction with the student data. I have looked at the data ten different ways, but I am not seeing the story I seek emerge. I was given useful suggestions about how to get the “story” by examining a cohort group over their 9-12 experience. I am excited to get started on this right away.

2.5 Literacy Skills (Growth)
  • Plans include opportunities for students to construct meaning through reading, writing, listening, speaking, viewing, and presenting to enhance content area learning.
  • Plans include opportunities for students to problem solve, interpret and use data and numerical representations to enhance content area learning.
I really have to get better about exit surveys. That’s how I will move into the exemplary category.

21st Century Skills (Strength)
  • Plans may include opportunities for students to select appropriate technology tools based on unit goals.
  • Plans incorporate opportunities for students to select and apply 21st century learning standards.
  • Plans include opportunities for students to reflect and self-assess on their use of 21st century learning.

This is at the core of what we do. Our Collapse Project was a great example of how we encouraged students to take charge of their learning. Their feedback was extremely positive, and they were able to explain their understanding of new learning on their own terms. This project generated a great deal of student engagement, and we are looking for additional curricular areas in which to replicate the strategy, which was so clearly successful.

Domain 3 - Responsive Teaching

3.5 Responsive teaching (Strength)
  • Educator and student roles are differentiated based on students’ academic and behavioral needs, allowing students to progress towards independence at different rates.
  • Students may select appropriate digital tools and resources for researching and collaborating.

This was the case in classes where teachers identified students who might need additional support - they “assigned me” students to monitor. We created additional checkpoints for those learners. I sought them out in class when they missed appointments or deadlines. My role was always that of a “helper” rather than a “judge,” and this really helped build the trust we needed for them to share their work and concerns. In all but one case, I was able to get them to complete their work, even when it looked as though they might not.

Reflect on learning community growth and educator contributions year to date (including reflections / professional evidence related to Domains 5 and 6): (Educator)

Everything I said at mid-year was sustained throughout the school year. I facilitate workshops, I integrate technology in the classroom, I co-teach with my instructional partners, I help teachers plan lessons and manage discipline issues, I work with departments and PLCs to develop curricular units and scaffold their instruction, and I reach out to learners who need additional support. I am coordinating a 2015 summer curriculum project during which (surprise, surprise!) I will collaborate with four other 11th grade teachers to develop instructional materials for the junior research project.