Just having a subscription to the TIM Tools isn’t enough. You have to use them. And to use them well and get good data, you need a solid plan. We often see folks rush into implementation without a plan and then end up confused about how to interpret the data, or, worse, drawing interpretations that aren’t supported by the way the data were collected.

In our master classes, the core activities have included developing and refining implementation plans for the various tools. We decided to pull back the curtain and share these planning tools with our wider community.

So what are the key factors in a strong implementation? Across the tools there are some common components that need to be addressed: purpose, standard guidelines for use at the school and wider (district, state, country) levels, data collection specifics, and planned analyses. While each of these factors may look a little different from tool to tool, starting from a common approach is helpful. Let’s look at each of these common factors first, and then we can dive into tool specifics.


Key Elements to Define for Each Tool

Purpose
Clearly and thoroughly state your purpose for using the tool. This is the foundation that will inform the rest of the decisions in your plan. Guiding questions include: What do you hope to achieve by implementing this tool? What questions would you like to be able to answer?

Data Collection Design
Based on the stated purpose and the questions you hope to answer, you can then create a plan for how you will collect data to ensure that your design will allow you to answer your questions. This involves carefully thinking through who will participate, when, and how often. Depending on the tool, this will also include planning a design for how the program (such as mentoring) will be conducted.

Administration Guidelines
Based on the purpose and data collection design, you can create standardized procedures for how the tool is administered. At the school level, the guidelines will cover how each teacher, observer, coach, or mentor uses the tool. At a wider level, they will also cover how each school and/or district implements the tool and the related program as a whole.

Planned Analyses
Thinking through this element serves as a final check that your data collection design and administration guidelines are well defined and will yield data you can analyze to answer the questions you set out in your purpose. Working through this stage may highlight pieces that need to be better stated or planned in one of the earlier stages, so you can revise them before you begin implementing the tool.

Now we can get into the specific elements as applied to each tool, as well as tool-specific components that need to be laid out. For now we will look at the two most widely used tools, the Technology Uses and Perceptions Survey (TUPS) and the TIM Observation Tool (TIM-O).

Implementing the TIM Observation Tool (with Examples)

Establish a Purpose for Observation

Ex. 1: To provide specific, meaningful, useful feedback to teachers to help them understand and expand their use of technology in the classroom.

Ex. 2: To gather data about classroom uses of technology in our district to inform technology integration decisions.

Determine Participants

Ex. 1: Teachers at Sunny Elementary school who are matched with mentors.

Ex. 2: A stratified random sample of teachers at schools across the district.
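
Ex. 2 calls for a stratified random sample. Here is a minimal sketch of what drawing one might look like, assuming you have a roster of teachers keyed by school; the roster data, function name, and per-school sample size are hypothetical and are not part of the TIM Tools.

```python
import random
from collections import defaultdict

# Hypothetical roster of (teacher, school) pairs; in practice this
# would come from district staff records.
roster = [
    ("T. Alvarez", "Sunny Elementary"),
    ("B. Chen", "Sunny Elementary"),
    ("R. Okafor", "Lakeside Middle"),
    ("M. Singh", "Lakeside Middle"),
    ("J. Park", "Northside High"),
    ("L. Romero", "Northside High"),
]

def stratified_sample(roster, per_school, seed=1):
    """Draw the same number of teachers from each school (stratum)."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    by_school = defaultdict(list)
    for teacher, school in roster:
        by_school[school].append(teacher)
    return {
        school: rng.sample(teachers, min(per_school, len(teachers)))
        for school, teachers in by_school.items()
    }

print(stratified_sample(roster, per_school=1))
```

Sampling within each school rather than from the district as a whole ensures every school is represented, which matters when the results will inform district-wide decisions.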

Choose Observers

Ex. 1: Five technology mentor teachers.

Ex. 2: Ten district-based technology specialists and 35 school-based tech specialists.

Determine Observation Schedule

Ex. 1: Scheduled based on mentor and mentee availability.

Ex. 2: According to the sampling plan; not arranged with the teacher in advance.

Collect and Analyze Data

Ex. 1: Mentor teachers collect and assess individual results to provide formative feedback to mentee teachers.

Ex. 2: Levels of technology integration for the district are estimated based on a “high water mark” calculation for the observations collected. The summaries are used to inform professional development planning, to inform purchasing decisions, and to set long-term technology integration goals.
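
One way to read the “high water mark” calculation in Ex. 2 is: for each teacher, keep the highest TIM level observed across all of that teacher’s observations, then summarize those maxima across the district. The sketch below shows that reading, assuming observations are recorded as numeric levels 1–5 (Entry through Transformation); the record format is hypothetical, not the actual TIM-O export, and a full analysis would track each of the five TIM characteristics separately.

```python
from collections import Counter

# Hypothetical records of (teacher_id, observed TIM level), where
# 1=Entry, 2=Adoption, 3=Adaptation, 4=Infusion, 5=Transformation.
observations = [
    ("t01", 2), ("t01", 3),
    ("t02", 1), ("t02", 2), ("t02", 2),
    ("t03", 4),
]

# High water mark: the highest level each teacher reached in any observation.
high_marks = {}
for teacher, level in observations:
    high_marks[teacher] = max(level, high_marks.get(teacher, 0))

# District summary: how many teachers topped out at each level.
for level, count in sorted(Counter(high_marks.values()).items()):
    print(f"Level {level}: {count} teacher(s)")
```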

Implementing the TUPS (with Examples)

Establish a Purpose for Survey

Ex. 1: To provide specific, meaningful, useful feedback to teachers to help them understand and expand their use of technology in the classroom.

Ex. 2: To gather data about classroom uses of technology in our district to inform technology integration decisions.

Determine Participants

Ex. 1: All teachers in the district.

Ex. 2: Teachers participating in a grant project.

Ex. 3: Middle school math and science teachers.

Determine Survey Schedule

Ex. 1: Beginning and end of each school year.

Ex. 2: End of each school year.

Ex. 3: Beginning and end of grant project period.

Ex. 4: During start of year faculty meeting.

Communication

Ex. 1: Email to teachers, including purpose and what to expect upon completion.

Ex. 2: Face-to-face announcement.

Ex. 3: Online discussion with teachers.

Ex. 4: Teachers included in planning team.

Analysis

Ex. 1: Data analysis team to include curriculum and technology staff, as well as teachers.

Ex. 2: Examine data summaries in order to answer specific questions based on stated purpose.

Ex. 3: Data analysis guided by: “In what ways is the use of technology during instruction student-centered?”

Ex. 4: Data analysis guided by: “In what ways has technology use changed since the last administration of the TUPS?”
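
The question in Ex. 4 implies a pre/post comparison. Here is a minimal sketch of that kind of analysis, assuming TUPS responses have already been summarized as numeric item means; the item names and values are made up for illustration.

```python
# Hypothetical mean item scores from two TUPS administrations
# (e.g., beginning and end of the school year).
fall   = {"confidence": 3.1, "student_use": 2.4, "integration": 2.8}
spring = {"confidence": 3.6, "student_use": 3.0, "integration": 2.9}

# Change in each item since the last administration.
for item in fall:
    delta = spring[item] - fall[item]
    print(f"{item:12s} fall={fall[item]:.1f} spring={spring[item]:.1f} change={delta:+.1f}")
```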

Bringing It All Together: Setting Up Your Successful Implementation

How Will the Tools Inform Each Other?
Finally, an important piece of a successful implementation plan is specifying how the tools will work together. Gathering multiple sources of data allows for stronger, richer interpretations, so planning ahead for how the tools can complement one another will give you more robust results. Each of the TIM Tools provides a unique lens into the instructional setting and thus yields a distinct result. Because the tools were designed to work in concert, building on that in the design of your implementation plan, and even in the design of your programs, will further strengthen the potential for a successful and impactful implementation.

Christine Harmes is a consultant on research, measurement, and evaluation, and an ICF-certified coach. Her research interests focus on improving teacher use of technology, computer-based testing, and usability. At the Florida Center for Instructional Technology at the University of South Florida, Dr. Harmes focuses on research and tool development related to technology integration.

Want To Know More?

Schedule a personal walk-through of the TIM Tools for yourself or your staff. See how this flexible set of technology integration tools will meet your school or district’s specific needs.