Tuesday, March 8, 2011

STaR Chart Report: Week 2 Assignment

Posted in partial fulfillment of requirements of EDLD 5352, Lamar University

Progress Report on the Long-Range Plan


     The Progress Report on the Long-Range Plan for Technology, available online for anyone with good close-up vision and a few hours to spare, can be found at http://www.tea.state.tx.us/index2.aspx?id=5082&menu_id=2147483665. It offers a comprehensive view of the plan for bringing today’s schools into the 21st century and acknowledges that students possess greater technology skills than their local school infrastructure can support. Because improvements in technology have to be systemic and consistent in order to be effective, the report describes both the plan itself and its status from September 2008 to August 2010.
     Written in two major sections, the progress report begins with descriptions of the four key areas of the Long-Range Plan: Teaching and Learning; Educator Preparation and Development; Leadership, Administration, and Instructional Support; and Infrastructure for Technology, which are also the four components of the Texas STaR Chart. The second part sums up how the twenty regional service centers are meeting the recommendations of the Long-Range Plan.
     The Teaching and Learning section begins with Texas STaR Chart data, and the progress report details the strides made in this area. Technology grants have provided improvements for classroom teachers through the integration of interactive lessons. Initiatives such as the Technology Immersion Project give students increased opportunities to integrate technology into their learning.
     The section on Educator Preparation and Development also begins with graphed STaR Chart data. NCLB legislation provided funding to improve teacher training and access to technology, and the report describes the standards for educator preparation. Online training opportunities address the needs of teachers unable to commit to more fixed, traditional models of professional development; programs such as the Intel® Teach Program and iTunes University are examples of such offerings.
     Leadership, Administration, and Instructional Support and Infrastructure for Technology are the last sections of the report, and each draws on STaR Chart data. Specific information regarding broadband access and internet safety issues is a highlight of these sections.
     The Progress Report demonstrates how crucial resources for administrators, teachers, librarians, and students are being provided for Texas’s 21st-century classrooms.

Sunday, March 6, 2011

Technology Assessments: Pros and Cons

    
     At the start of each new school year, we get the email from the ITS reminding us to complete the STaR Chart. Periodically we get the odd survey or Survey Monkey link. It seems that we are often being asked to rate ourselves and our proficiency with technology. Some surveys are quite long and tedious. This raises the question, however: to what end do we gather all this data?
     One must assess one's proficiency in order to determine needs and define goals. If the data is gathered simply to produce a rating, though, it is unlikely to earn buy-in from those being rated. Teachers are more motivated to answer accurately and thoroughly when there is even the possibility of follow-through and support after the survey. When professional development or new technology acquisitions show that the surveys were analyzed and given credence, not only does morale improve, but there is a greater sense of ownership. The likelihood that these stakeholders will invest themselves in the resulting changes increases dramatically. “They heard us and did something about it!”
     On the other hand, it is entirely possible to have too much data. Assessing proficiency without also addressing goals can lead to declining morale and teacher apathy toward technology. If teachers are repeatedly asked about their skills, and even about their own personal technology goals, but the infrastructure and budget remain static or limited, they are less likely to participate effectively. On a campus where such assessments lead only to a rating, with no prospect of improvement, it is data for data's sake. The information piles up without reason and buy-in is lost.

Wednesday, February 23, 2011

Week One Ruminations

"When information comes from a network, it is not always obvious where it came from, who wrote it, or why. This expands what it is to be a successful and responsible reader today. It means that part of reading is asking questions about what you are reading." (David Warlick, Literacy in the New Information Landscape)

My 5th graders have spent the past week creating projects for a Creative Showcase video conference with another school. I was previewing some of their projects and was startled to find a photo of Ruby Bridges "as she looks today"; the photo is of a woman who is clearly in her eighties, which I am fairly certain Ruby Bridges is not. The students had already left, so I tried to retrace their digital steps. It didn't take long. A cursory search of AP Images for "Ruby Bridges" brings up many photos, one of which is Ruby Butler, an octogenarian who witnessed a lynching from a bridge in the 1950s.

The ten-year-olds who are savvy enough to create a movie in Moviemaker using photos imported from AP still need their "old school" digital immigrant teacher to be the guide on the side and remind them that whatever pops up on the screen isn't necessarily fact, and even if it is, it may not be relevant to the task.

We have been talking all year about the dangers of Wikipedia and its often false information, but I need to be more vigilant in monitoring the students' use of even the "safe," district-endorsed databases. Information, while readily available, must be questioned not only for its accuracy but for its value to the objective. Because I teach reading comprehension, it is a lesson we include regularly: What is the author's purpose?

As for myself, I have begun to question information that comes from a network more thoroughly. Before committing to a course of action or a route to a destination (of both the physical and academic sorts), I have to verify the validity of the information. Yahoo Maps are often exactly that: maps for the yahoos who trust them and have time to spare rerouting. Even Google Maps can have outdated satellite images and provide very convincing but false data. News sources can be biased just like their yellow-journalism print counterparts of old. Questioning truly is part of being a responsible reader.

Sunday, August 8, 2010

Entering the Action Research Arena

Goal: Determine the effectiveness of classroom performance system (CPS) technology when it is consistently utilized in instruction, by comparing standardized test scores from classrooms where the technology was and was not utilized.

 

| ACTION STEPS | PERSON(S) RESPONSIBLE | TIMELINE | RESOURCES | EVALUATION |
| --- | --- | --- | --- | --- |
| Survey teachers interested in utilizing the system | C. Murat | 8/17/2010-8/30/2010 | Attendance records for introductory training held 7/28/2010 | Survey Monkey sent via email |
| Monitor usage of CPS™ system | C. Murat, D. Fish | 9/1/2010-5/25/2011 | Sign-out sheets/logs | Survey Monkey sent via email; voting via Outlook |
| Compare Iowa Test of Basic Skills (ITBS™) scores | C. Murat | 10/2010 | ITBS scores from 2009 and 2010 | Analysis of students' individual scores from 2009 to 2010; class comparisons |
| Compare Texas Assessment of Knowledge and Skills (TAKS) scores | C. Murat | 5/26/2011-6/15/2011 | Test data from 2009 and 2010 | Analysis of students' individual scores from 2009 to 2010; class comparisons |
| Compile and publish results | C. Murat, D. Fish, M. Brewster | 6/30/2011 | School web site, PTA newsletter, blog | Conclusion on effectiveness of CPS™ technology |
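
Since three of the five action steps above hinge on score comparisons, here is a minimal sketch of how the class-level comparison might be run. It assumes a hypothetical file named scores.csv with columns used_cps (yes/no) and score; the file name and column names are illustrative placeholders, not part of the actual plan.

```python
import csv
from statistics import mean

# Split scores into the two groups named in the goal: classrooms where
# the CPS system was used consistently vs. those where it was not.
# "scores.csv" and its column names are hypothetical placeholders.
cps_scores, other_scores = [], []
with open("scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        score = float(row["score"])
        if row["used_cps"].strip().lower() == "yes":
            cps_scores.append(score)
        else:
            other_scores.append(score)

# Compare group means; a gap here is a prompt for closer analysis,
# not a conclusion, since other variables are in play.
print(f"CPS classrooms:     n={len(cps_scores)}, mean={mean(cps_scores):.1f}")
print(f"Non-CPS classrooms: n={len(other_scores)}, mean={mean(other_scores):.1f}")
print(f"Difference in means: {mean(cps_scores) - mean(other_scores):+.1f}")
```

A formal significance test could be layered on once the groups are defined, but even this rough comparison would show whether the classrooms using the system are trending in the right direction.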

Wednesday, August 4, 2010

Getting It Together

We're getting there! The AR project has evolved slightly. The main focus will be determining whether sustained use of a new technology has a measurable impact on student learning. I will focus on Math and Reading scores on the ITBS and TAKS.

I will be sending out a brief questionnaire soon.

But first, a detour: I have to lead a staff development session on our first day back!
It's on how to incorporate the ELPS into your content area. I have the materials and know pretty much what I am going to say. But yikes. The first day back? I need to find a way to incorporate it into my internship portfolio. Certainly going to put it in my vita.

Saturday, July 31, 2010

Full Speed Ahead

1. Foundation: I will track usage of a technology resource in the classroom setting and determine its effectiveness by comparing standardized test scores in two ways:
A) How do individual students' scores change from the previous year when the CPS™ system is used?
B) Are test scores higher in classrooms in which the system is consistently utilized than in those in which it is not?
2. Analyzing Data: I will compare Iowa Test of Basic Skills scores from 2009 and 2010, and I will analyze TAKS scores from 2009 and 2010. (A sketch of this student-by-student pairing appears after this list.)
3. Developing Deeper Understanding: It will be necessary to define what counts as “consistent usage”. I will also need to consider variables such as student health and attendance when comparing individual scores from the previous year to current results.
4. Engaging in Self-Reflection: Are there other factors which may be affecting the results of the study? Should faculty attendance be considered? Are the tests similar or have there been changes in format? Has the scoring system changed from 2010 to 2011?
5. Exploring Patterns: What patterns emerge regarding who is or is not utilizing the technology? Are teachers with 1-5 years' experience more or less likely to utilize it?
6. Determining Direction: Will the analysis focus on Reading, Math, Science, or a combination? What are the advantages/disadvantages of limiting the direction?
7. Taking Action: If there is noticeable improvement where the technology has been consistently utilized, is the data compelling enough to promote wider school-wide use? Would improvement lead to more monies being dedicated to the procurement of additional devices and training?
8. Sustaining Improvement: The technology has existed for more than five years. What factors are needed to promote sustained usage of the tools, and will newly hired staff members have access to training?
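
Item 2 above calls for comparing each student's own scores year over year. Below is a minimal sketch of that pairing logic; the student IDs and score values are made up for illustration, and real values would come from the 2009 and 2010 ITBS/TAKS score reports.

```python
from statistics import mean

# Hypothetical paired scores keyed by student ID; real values would
# come from the 2009 and 2010 ITBS/TAKS score reports.
scores_2009 = {"s001": 210, "s002": 195, "s003": 202}
scores_2010 = {"s001": 221, "s002": 193, "s003": 214}

# Pair each student with his or her own prior-year score, skipping
# students without both scores (movers, absences on test day, etc.),
# which is the attendance variable flagged in item 3.
changes = [scores_2010[sid] - scores_2009[sid]
           for sid in scores_2009 if sid in scores_2010]

print(f"Students with both years of data: {len(changes)}")
print(f"Average change per student: {mean(changes):+.1f}")
```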