Friday, July 8, 2016

Teacher Evaluation

Introduction

In this blog, I'm comparing teacher evaluation systems from Tucson Unified School District (TUSD), located in my hometown, and the Rhode Island Department of Education (RIDE). I looked at "value-added" models (VAMs), but they reminded me of an attempt to apply actuarial science to teachers and schools. They also reminded me of the systems health insurance companies use to monitor physicians and hospitals, probably because I worked for a health insurance company in the past. Healthcare is full of acronyms too, and VAM sounds like a new healthcare acronym: BPCI, CAD, HFMA, PTCA and TBI, to name a few. And there are issues with data measurement: "If VAMs are not meaningfully associated with either the content or quality of instruction, what are they measuring?"

I decided to take a look at some less data-driven models. Both TUSD's and RIDE's models are easy to understand and implement.

TUSD's Teacher Effectiveness Evaluation Model

"The model is made up of four components including the Danielson Framework, Academic Growth, the Student Survey, and the Teacher Reflection. Each component factors into a teacher's final score, albeit with different weighting. The Danielson Framework comprises the majority of the score determination by making up 56% of the total score. The Academic Growth makes up 33% of the total score. The Student Survey makes up 10% of the total score and the Teacher Reflection is 1% of the total score."




The four domains of the Danielson Framework are planning and preparation, the classroom environment, instruction, and professional responsibilities.

This year, Academic Growth will be calculated using scores from pre-post assessment tests, which contain multiple-choice questions plus written answers.

Student Surveys are based on the Tripod Study from Harvard University and will "measure seven classroom climate constructs including: Care, Challenge, Control, Clarify, Captivate, Confer, and Consolidate. Each survey has a different number of total questions."

Finally, "the Teacher Self Reflection is completed by the teacher and is scored either 1 or zero depending on whether it was completed or not."
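Putting the four components together, here is a minimal sketch of how the 56/33/10/1 weighting could be combined into a final score. This is my own illustration, not TUSD's actual calculation: I'm assuming each component has already been scored on a common 0-100 scale, and the component names and example numbers are placeholders.

```python
# Hypothetical illustration of the 56/33/10/1 weighting described above.
# Assumes each component has already been normalized to a 0-100 scale.
WEIGHTS = {
    "danielson_framework": 0.56,
    "academic_growth": 0.33,
    "student_survey": 0.10,
    "teacher_reflection": 0.01,
}

def final_score(components):
    """Weighted sum of component scores, keyed the same way as WEIGHTS."""
    return sum(WEIGHTS[name] * score for name, score in components.items())

# Placeholder component scores for one teacher:
example = {
    "danielson_framework": 80,
    "academic_growth": 70,
    "student_survey": 90,
    "teacher_reflection": 100,  # self-reflection completed
}
print(round(final_score(example), 2))  # 77.9
```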

RIDE's Teacher Evaluation and Support System

"The Rhode Island Model relies on multiple measures to paint a fair, accurate, and comprehensive picture of teacher performance. All teachers will be evaluated on four measures:

1. Professional Practice: Classroom Environment - This measure represents Domain 2 of the Teacher Professional Practice Rubric, which includes four components.
2. Professional Practice: Instruction - This measure represents Domain 3 of the Teacher Professional Practice Rubric, which includes four components.
3. Professional Responsibilities - The Professional Responsibilities Rubric includes four domains: School Responsibilities and Communication, Professional Growth, and Planning. The ratings of these four domains combine to create one measure of Professional Responsibilities. 
4. Student Learning - This measure assesses the teacher's impact on student learning through the use of Student Learning Objectives (SLOs) and/or Student Outcome Objectives (SOOs), and the Rhode Island Growth Model (RIGM), when applicable. 
Evidence from each of the four criteria will be combined to produce a Final Effectiveness Rating of Highly Effective, Effective, Developing or Ineffective"


 


RIDE teachers additionally receive three evaluation conferences and three classroom observations a year, and the model's Professional Practice Rubric has the Danielson Framework embedded within it. The model also includes a Professional Growth Goal, which may be adjusted mid-year, and a Performance Improvement Plan for teachers who need support.


Summary

I found elements in both models to my liking. Both models feature the Danielson Framework, which describes a teacher's responsibilities toward his or her students and operates as a set of professional standards. Other professions have similar standards, and their members are judged according to them. I'd prefer to be judged according to the Danielson Framework.




TUSD's pre-post student assessments are good because they show teachers the effect their methods have had on student learning; I'd like to learn the effects of my teaching methods directly from my students' results. Although only worth one point, the teacher self-reflection is important because it requires teachers to think carefully about their work. Perhaps it could be completed after reviewing the pre-post assessment results.

RIDE includes professional growth and school responsibilities and communication in its model. I believe that a professional growth goal is critical because it allows teachers to work toward continuous improvement, and there should be a rating for school responsibilities and communication.

To conclude, my top picks for evaluation are the Danielson Framework, pre-post assessment tests, teacher self-reflection, a professional growth goal, and school responsibilities and communication. I'd prefer to be judged by some combination of these measures.





Resources

Walker, T. (2014, May 30). New Study Strikes Latest Blow Against 'Value-Added' Teacher Evaluation. Retrieved from: http://neatoday.org/2014/05/30/new-study-strikes-latest-blow-against-value-added-teacher-evaluation-2/.

Teacher Effectiveness Evaluation Model 2015-16. Retrieved from: http://www.tusd1.org/contents/govboard/packet07-14-15/7-14-15-BAI20-TeacherEvaluationScaling2015.pdf.

Rhode Island Model Evaluation and Support System Teacher Edition IV. Retrieved from: http://www.ride.ri.gov/Portals/0/Uploads/Documents/Teachers-and-Administrators-Excellent-Educators/Educator-Evaluation/Guidebooks-Form.

Danielson 2014-2015 Rubric Adapted to New York Department of Education Framework for Teaching Components. Retrieved from: http://www.cfn107.org/uploads/6/1/9/2/6192492/danielson_2014-2015_rubric.pdf



Saturday, June 25, 2016

Pre-Assessment for Differentiation

This blog consists of four parts:

1. Review of Teacher Work Samples
2. Creating a pre-assessment using Kahoot
3. Creating flowcharts on innovative teaching strategies using Lucidchart
4. Summary of Assessments

Teacher Work Samples

The first part of this activity is about reviewing Teacher Work Samples. I looked at the following examples from the Renaissance Teacher Work Sample Consortium:

1. Teacher Work Sample: Spring 2008, High School, Business, Topic: Marketing and Entertainment
2. Teacher Work Sample: Spring 2007, Grades 10-12, Business, Topic: Financial Services

In Teacher Work Sample #1, although "the class performs at a high academic level" and has some prior marketing knowledge, the teacher assumes that students have little knowledge of marketing plans. The main objective of the course is to teach students how to write a real-life marketing plan for a local theater, and the teacher's assumption turns out to be correct. The pre-assessment consisted of written questions covering the four learning goals. Students scored 54%, 48%, 60%, and 50% on goals 1-4, for a total pre-assessment score of 53% (the average of the four goal scores). This teacher work sample is representative of the types of students I'll be teaching: high achievers seeking knowledge in a specialized area outside of the general education curriculum.

In Teacher Work Sample #2, the subject matter is less advanced, but students still fell short of the 80% acceptance rate on two of the three learning goals. This is because the topics were new to these students; it doesn't imply a lack of academic readiness. Scores were 76%, 71% and 91%, with an overall score of 81%. These results tell the teacher to focus on goals one and two. This teacher wrote a pre-assessment consisting of ten true/false, five matching, nine multiple-choice and two open-answer writing questions. Again, the subject matter is outside of the general education curriculum. This is the type of pre-assessment I plan to use with my students, because I want to know their thoughts about marketing.

Pre-Assessment

The second part of this activity requires that a pre-assessment be written using an app. My pre-assessment is based on State of Arizona Standard 2.0 - Demonstrate Marketing Concepts. I created a basic quiz in Kahoot drawn from the objectives listed within the standard. I will not be using it in my class because mobile devices are banned in the classroom, and I also found Kahoot restrictive because it limits the number of characters writers can use when creating test questions. In my class, I plan to give a pre-assessment that includes questions requiring written answers, because I want to know what students think about marketing.

https://play.kahoot.it/#/k/750793e3-b7cf-44b2-86a7-8da097db0758

Flowchart and Analysis

The third part of this activity is about creating a Lucidchart for three groups of students: high-academic achievers, middle-academic achievers and low-academic achievers.

Teacher Work Sample #1 describes the type of students taking elective marketing classes. These students are high-academic achievers; most will have some knowledge of marketing but limited knowledge of marketing plans. All students were at the same level of academic readiness, and the work sample indicated a homogeneous group of students. Passing a math standard is a prerequisite for taking advanced placement marketing classes. These are the types of students I expect to see in my classes as well.

The rubric in this activity focuses on three different levels of academic readiness, assuming a heterogeneous classroom environment. Based on the Teacher Work Sample and my experience in the classroom, this will not be the case. For my unit, I plan to differentiate for high-academic achievers with little knowledge of the subject matter. To fulfill the rubric, I'll develop my differentiation strategy around the criteria listed below:

1. High-academic achievement = The Most Knowledge
2. High-academic achievement = Average knowledge
3. High-academic achievement = Lowest level of knowledge

Differentiation theory holds that teachers can effectively differentiate instruction according to content, process and product. Three areas that can be differentiated are quantity of work, same activity - different task, and level of difficulty. For example, students with little knowledge can answer fewer test questions or complete less complicated tasks than peers who have higher levels of knowledge.
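To make those three areas concrete, here is a small, purely hypothetical sketch of how knowledge level might map onto quantity of work, task and difficulty. The numbers and task descriptions are my own placeholders, not anything taken from the work samples or the rubric.

```python
# Hypothetical mapping of knowledge level to the three differentiation areas:
# quantity of work, task (within the same activity), and level of difficulty.
DIFFERENTIATION = {
    "most knowledge":    {"questions": 20, "task": "extend the activity and peer-teach it", "difficulty": "advanced"},
    "average knowledge": {"questions": 15, "task": "complete the full activity",            "difficulty": "standard"},
    "lowest knowledge":  {"questions": 10, "task": "guided version of the activity",        "difficulty": "introductory"},
}

for level, plan in DIFFERENTIATION.items():
    print(f"{level}: {plan['questions']} questions, {plan['task']}, {plan['difficulty']} difficulty")
```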

Because I'm teaching high-academic achievers, I'm flipping that requirement: the only difference among these students is their level of knowledge, not their academic readiness. As they have all passed the math standards, they are academically ready to take advanced placement business classes. Most will want to get up to speed with their classmates, although there may be a few exceptions, such as a gifted high achiever who is bored with classroom activities; for that student I'll have to provide stimulating activities based on personal interest and learning profile, because academic readiness is not the issue.

My Lucidchart shows a differentiation strategy for Lesson Plan One using the same task with different activities. I'd use this approach for all of the lessons were I teaching high-academic achieving students: the first lesson gets them up to speed, and the following lessons focus on their level of interest in the subject and on providing stimulating and challenging activities.






Summary of Assessments

Assessments used to track students for this activity are:

1. High-level = Document plus peer teaching to mid and low-level students.
2. Mid-level = Document, post-assessment quiz results and peer teaching to low-level.
3. Low-level = Document and post-assessment quiz results.

My summative assessments for this class are a midterm exam, a final exam and a group marketing plan project. Students will work in groups and select from a list of local businesses. For example, if a particular group of students has an interest in dogs, they'll select a small business specializing in dog training and daycare; if another group has an interest in sports, they'll focus on a sports-related firm.

I'll use a mix of formative assessments and a variety of activities to keep these high-academic achievers engaged in the subject matter. I'll also have to identify the gifted students in my class and determine how to accommodate them, and there may be students who lack organizational skills as well. Formative assessments and activities will be developed around Marketing Standard 2.0 with personal interests and learning profiles in mind.

Resources


Lesson Planning Tips for Different Student Levels. (2016). Retrieved from: http://teaching.monster.com/benefits/articles/8976-lesson-planning-tips-for-different-student-levels.

Tomlinson, C. and Allan, S. (2000). Leadership for Differentiating Schools and Classrooms. Alexandria, VA: ASCD.

Martin, D. (2016). Lower Level Learners: Teaching Their Way. Retrieved from: http://teachinghistory.org/teaching-materials/ask-a-master-teacher/24111.



Sunday, June 19, 2016

High Stakes Assessments


The Current State of Arizona's High-Stakes Standardized Testing

"Tucson's largest school district (TUSD) canceled standardized state testing scheduled for Monday, Tuesday and Wednesday and instead will hold regular classes, after enactment Friday of a law eliminating the exam requirement . . . I congratulate the Legislature and Gov. Ducey for removing this vestige of high stakes testing," said Diane Douglas, Arizona's superintendent of public instruction, in the news release . . . The only state test that students will be required to pass to graduate is the civics exam Ducey signed into law last month. These are the only bills he has signed so far."

Arizona's Instrument to Measure Standards (the AIMS test) thus came to an end, with Ricky Hernandez of the Pima County School Superintendent's Office stating, "We think it would make sense that at a time when our state assessments and our standards are currently in flux that our high school students would not be required to pass a statewide standardized test."

AIMS was never popular in Arizona. When it was first implemented, it had a high failure rate, so it was made easier to ensure the receipt of federal money under the NCLB Act while appeasing politicians and parents. AIMS was viewed as either too easy or too hard, and it created a no-win situation in Arizona. It was a multiple-choice exam given to all 10th grade students which apparently failed to measure the higher-level skills needed for college and career, nor did it measure teacher performance.

On February 23, 2015, Arizona replaced the AIMS test with the AzMERIT test, to be administered to 3rd-12th grade students. The test consists of three parts: Reading, Math and Writing. I opened up the 10th grade Reading practice test and answered a few questions. It was a combination of multiple choice and paragraph selection. I thought it was quite good at promoting critical-thinking skills, as it featured questions like, "Which is a central idea of the passage?" followed by "Select the detail from the passage that supports the central idea." Unfortunately, upon its inauguration, "most of Arizona's students failed the state's revamped standardized test — unwelcome news that many believe is necessary to pave the way toward higher student achievement in the long run." Below is a graph showing the results of the inaugural Language Arts test taken by 10th grade students.




However, school officials predicted lower test scores during the first several years of testing because the State had previously anticipated switching to Common Core but never made the change. Despite this, some charter schools were among the highest performers on the test. Still, there will be an adjustment period for everyone.

In the meantime, TUSD has implemented a new pilot program which will begin in Fall 2016. The goal is to reduce the amount of district-mandated standardized testing: TUSD plans to create one benchmark test to replace the three currently required annual tests. The benchmark test will initially be administered at five or six yet-to-be-determined campuses. At the same time, students will still have to take the AzMERIT test, on which teachers apparently spend approximately a week of instructional time. Reducing the time teachers spend preparing for standardized tests is the point of establishing a single district-mandated benchmark test. And, "teachers and principals constantly express concerns over the volume of standardized testing that students, especially younger students, endure," says TUSD board member Mark Stegeman.

From my perspective, there is a lot of pressure on Math and English teachers to prepare students for these tests. As I'll be teaching elective courses, I won't be participating in preparing students for AzMERIT or district-mandated tests. There is so much controversy around these tests, but how do you really create a test that measures college and career readiness? Colleges and the workplace are so diverse themselves; what do they want? What exactly are they asking for? It seems like Arizona is changing its tests constantly, and that has an effect on teachers, their pay and their bonuses. At one local school district, Catalina Foothills, 97% of teachers didn't meet performance requirements because of the new AzMERIT test.

In the next part of this blog, I'll feature some questions and answers from a friend who is a special education teacher in the New York public school system.

High Stakes Testing and Special Education in the New York Public School System

“The strategy currently employed by New York City Department of Education and the State Education Department is not working for students, teachers or schools,” said Pedro Noguera, Professor of Education and executive director of the Metropolitan Center for Urban Education at New York University. “High-stakes standardized tests are being used to rank and measure students and teachers, and to punish schools, rather than as tools to diagnose learning needs and inform instruction . . . statewide, teachers whose jobs are threatened by test scores will resist working with high-need, ELL and special-education students, Biklen said. Student teachers will be considered liabilities, and aspiring teachers as well as accomplished veterans will decline posts in struggling, hard-to-serve urban schools, where low test scores could doom a school to closure."

 

The State of New York has loads of problems in regard to teacher certification, standardized testing and its former involvement with Pearson scoring: "Merryl Tisch is retiring as Chancellor of the New York State Board of Regents after trying to force Common Core, high-stakes testing, and teacher evaluations based on student test scores down the throats of the people of New York. Her efforts produced a backlash from parents and teachers including a massive opt-out campaign that helped her decide to quit. Last April, about twenty percent of the eligible students in grades third- through eighth refused to take mandated reading and math tests."

 

With these issues in mind, I asked a friend, Will Ruch, a special education teacher in the New York public school system, the following questions:

 

1. Name of the school you're teaching at:
    NYC DOE
    Specialty:  K-1 Students with Autism
    Grade Level:  K-1
    Subjects:  All


2. How much time is spent in testing? 

In all grades there are end of unit performance tasks roughly once a month.  These are formative assessments teachers use to guide instruction.

In the fall in grades 3-5, we do a Fall Baseline in ELA and Math.
In the spring (Feb), we do a Spring Benchmark in ELA and Math approximately 2 hours per test.

Additionally, in late spring of this year we did Computer Based Field Tests, which are tests that do not count for student or teacher grades but are used to develop next year's tests.

For official testing purposes (teacher, student, school and principal "grades"), we do F&P Running Records 4 times a year (testing reading levels; one-on-one with each student, approx 40 minutes per test).

For state testing in the spring, there are six testing days with approx 5 hours of testing per 3 days (so 10 hours total); on the testing days the school basically shuts down, as teachers are pulled from all classes to proctor tests, as many special ed students have accommodations that entitle them to separate testing groups, extended time, etc.

To prepare for these tests, we have after school and Saturday academies for test prep, as test prep does not take place w/in the school day (so no, we do not teach to test, we teach to standards and/or performance tasks).  The after school academies are about 4 hours a week, Saturday is about 4 hours, so for a student that signs up for the whole package, that is 8 extra hours of instruction a week; all of this has to be paid for, as teachers are paid overtime to teach these academies.

3. Are teachers teaching to the test?   No, we teach to the Common Core standards and instruction is scored during observations using the Danielson Framework.

4. Are rewards or bonuses given to teachers whose students score high?  No.  Teachers earn extra money by working overtime in academies to test prep; in addition, as scores affect teacher, school, and principal ratings, a lack of performance could have dire consequences.

5. Are students required to pass the test to move to the next grade or graduate?  NYC uses multiple data to determine whether a student is promoted; obviously, if a student passes all the state tests with flying colors, that is the only data we would use; however, in the case of a lower-scoring student, student work would be considered, especially if parents are protesting a decision to hold back a student.

7. Are test scores used for teacher evaluations? Yes, in that each teacher is assigned a growth score determined by measures of student learning.  The measures of student learning are test results, numerically adjusted to account for poverty, special ed status, student attendance, etc.  Then, teachers and schools are compared with similar peers to rank them.  For example, schools in middle class neighborhoods are compared with other schools in similar neighborhoods, etc.

8. What types of tests are administered to your students?

End of unit assessments developed by teachers and /or a part of our Common Core aligned curriculums:
ELA:  F&P Running Records, and ReadyGen end of unit tests, and TC Writing Performance Tasks
Math:  GoMath End of Unit Assessments
NY State ELA and Math Spring and Fall Benchmarks, plus annual tests in Spring
NY State Field Tests in Late Spring (since these are beta tests, we usually do one subject, one grade.  This year it was grade 5 math).

Our state tests are developed by Questar and were formerly developed by Pearson.

I am not even mentioning all of the diagnostic assessments that are used by school psych, our autism program (for program admissions) and/or the data collection systems and assessments used by our Pre-K teachers.  Yikes!

I believe that testing is required to assess both student and teacher performance, but should be one of many factors used to determine student promotion and/or teacher ratings.

Observation ratings, combined with formative and summative assessment results, are a good way to judge most teachers. There are exceptions when the student population is skewed, for example students with severe disabilities or students who are gifted and talented.

Conclusion

To summarize, high-stakes testing in the U.S. is fraught with problems and mired in red tape. But other countries don't seem to be much better off, except for Finland and a few others. Finland has such a small population, though, that its education system would be much easier to manage than that of the U.S. or other more populous countries.


Resources

Echevarri, F. (2016). TUSD Cancels AIMS Tests in Light of New State Law. Retrieved from: https://news.azpm.org/s/28604-tusd-cancels-high-school-testing-week-in-wake-of-new-state-law/.

Kossan, P. (2009, March 15). Educators seek answers beyond AIMS. Retrieved from: http://archive.azcentral.com/arizonarepublic/news/articles/2009/03/15/20090315aims0315.html.

AzMerit Test Reading. Retrieved from: https://sat30.cloud1.tds.airast.org/student/V340/Pages/TestShell.aspx.

Cano, R. (2015, December 1). AzMerit Scores: Most students failed inaugural test. Retrieved from: http://www.azcentral.com/story/news/local/arizona/education/2015/11/30/azmerit-scores-most-students-failed-inaugural-test/76561.

Huicochea, A. (2016, April 15). TUSD Pilot Program seeks to reduce mandated testing. Retrieved from: http://tucson.com/news/local/education/tusd-pilot-program-seeks-to-reduce-mandated-testing/article_39e26cd3-b22d-554a-b53d-278578fa30b2.html.

NYCLU. (2012, June 27). High Stakes Tests Harm Students and Teachers, Undermine Equity in New York's Schools. Retrieved from: http://www.nyclu.org/news/high-stakes-tests-harm-students-and-teachers-undermine-equity-new-yorks-schools.

Singer, A. (2016, March 3). Overturn the Tisch Miseducation Legacy in New York. Retrieved from: http://www.huffingtonpost.com/alan-singer/overturn-the-tisch-misedu_b_9457350.html.



Thursday, May 19, 2016

Planning Assessments


Introduction  

 

I'm still using State of Arizona Standard 2.0 - Demonstrate Marketing Concepts, but have now moved on to Objective 2.4 - Explain a Marketing Plan. There are eight objectives within this standard, so I've still got four left before I complete it. This objective is the start of the students' summative final project, a group business plan due by the end of the semester.

After reading the study material and considering my objective, I decided to use the following three formative assessments:

1. Explain What Matters: "Explain the most critical part of a given topic to a self-selected audience."

2. Yes/No Chart: "List what you do and don't understand about a given topic - what you do on the left, what you don't on the right, but your overly-vague responses don't count. Specificity matters."

3. Dos and Don'ts: "List 3 Dos and 3 Don'ts when using, applying, relating to the contents." 

In choosing these assessments, I'm attempting to allow students to "build" knowledge. I will use the Yes/No Chart assessment to introduce students to the contents of a marketing plan and the Dos and Don'ts assessment to deepen that knowledge. However, none of this matters unless students understand the PDCA method described in Formative Assessment #1, which must be applied continuously to a marketing plan.


Formative Assessment #1 

 

Explain What Matters

Before digging into the contents of a marketing plan, I want my students to understand the first rule of planning because it is the driver of a marketing plan. The PDCA method is identified as the first rule of planning and it consists of the following steps: "plan a process improvement, do the improvement, check results, act to hold gains or reenter planning."
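To picture why the cycle has to be applied continuously, here is a minimal sketch in code. The step descriptions and the number of iterations are my own placeholders; the only thing taken from the source is the Plan-Do-Check-Act ordering of the steps.

```python
# Minimal PDCA sketch: placeholder steps standing in for real marketing-plan work.
def plan():   return "draft or revise the marketing plan"
def do():     return "carry out the planned improvement"
def check():  return "compare results against the plan's goals"
def act(ok):  return "hold the gains" if ok else "re-enter planning"

def pdca_cycle(rounds=3):
    """Run a few illustrative Plan-Do-Check-Act iterations."""
    for i in range(1, rounds + 1):
        print(f"Cycle {i}: {plan()} -> {do()} -> {check()} -> {act(ok=(i == rounds))}")

pdca_cycle()
```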

In one sentence, I will tell my students that the first rule of planning a marketing plan is Plan-Do-Check-Act (PDCA), then send them to look up the details on the Internet. At the end of class, they'll have to turn in a card with their name on it and a paragraph on the PDCA method and how it is used in the development and maintenance of marketing plans. Although I considered asking for a verbal response from each student, that would be too time-consuming; asking each to write a paragraph saves time.



Why did I choose this assessment?  

I want my students to get familiar with acronyms and independent work because they'll be creating a business plan as part of their summative assessment, and that will require a lot of independent work. Although the business plan will be a group project, each student in the group will be responsible for a portion of it, and each portion will be identified by the student's name.


Formative Assessment #2


Yes/No Chart

At this point, students are ready to start learning about the contents of a marketing plan. Because this content is critical, I've applied two formative assessments to it. Students will individually create a Yes/No Chart on all of the items listed below:




This chart lists the contents of a marketing plan. Students will have to list what they do and don't know about each chart item and be specific. After completing the chart, students will have to research anything they don't understand, in class or at home, and turn in the results by the end of the week. In addition to the business plan, students will also be tested on the contents of a marketing plan.

Why did I choose this assessment?

In order to successfully create a marketing plan and pass the tests, students will need to understand every item on the chart, because they'll be creating every one of those items. My current plan is to give students a week to work on the chart.


Formative Assessment #3


Dos and Don'ts 

After I return the graded Yes/No charts, each student will take his or her chart and list three Dos and three Don'ts when "using, applying, and relating to each of the items listed above." They'll have a week to complete this exercise, which will be submitted for grading at the end of the week.

Why did I choose this assessment? 

This assessment is the logical next step in promoting understanding of the contents of a marketing plan. The Yes/No Chart provided the basis for understanding, while listing Dos and Don'ts promotes deeper learning of the items. Once I determine that students have demonstrated deeper knowledge of the chart, they'll begin working on group marketing plans. There should be some computer lab time each week to work on the plans, and students may also work on them at home.


Conclusion


I specifically chose each of these assessments because they fit best with Objective 2.4 - Explain a Marketing Plan. I deliberately looked for formative assessments that allow the learner to build knowledge. The first assessment (Explain What Matters) introduces students to the planning behind a marketing plan; before even looking at marketing plans, students need to learn how planning works. The second assessment promotes basic understanding and knowledge, while the third deepens learning by requiring students to think critically about the subject matter.






References


Anselmo, D. (2010). Marketing DeMystified. New York. NY: McGraw-Hill.

Heick, T. (2013, March 13). 10 Assessments You Can Perform in 90 Seconds. Retrieved from: http://www.teachthought.com/pedagogy/assessment/10-assessments-you-can-perform-in-90-seconds/.





Saturday, May 14, 2016

Understanding and Applying Standards




In this module, I learned how to unpack a standard and the theory and mechanics behind backwards mapping, writing lesson plans and writing objectives. I also learned that the Arizona Department of Education provides a comprehensive curriculum guide which includes standards, objectives and lesson plans. For example, when I reviewed the Accounting standards, I found objectives, content, implementation (lesson plans), terminology and whether or not each objective was a testing item. The only flexibility I saw in the curriculum guide was in creating activities and learning experiences. Thus, I will not be writing my own objectives; I will follow the curriculum guide, and I will be allowed to create activities that supplement the existing lesson plans.

Since I will be teaching Business Education subjects, I wanted to learn about their respective standards and become familiar with what I need to know in the Fall. I also wanted to stay with my own subjects rather than veer off into a subject I won't be teaching, since that has no value to me. Finally, the teacher I do classroom observation with told me to study the curriculum guide, so I'm following her instructions.

What this week's work taught me is how to find the objectives and how to see the backwards mapping and construction of lesson plans. Prior to studying backwards mapping and lesson planning, I wouldn't have understood the curriculum guide, how it was created or what I was supposed to do with it. For example, I did not know that the column heading "Implementation" contained the lesson plans I was to use for teaching. I also didn't know that the State's lesson plans correlate with Fink's and Bloom's taxonomies, which turn out to be the origin of the State's curriculum guide. What I discovered is that the State followed the taxonomies to the letter: it used the same words, designed lesson plans according to the taxonomies and suggested activities and learning experiences that correspond directly to them. Before this week's activities, interpreting the curriculum guide would have been difficult.

After completing this week's activities, I realized that in order to use the curriculum guide, I needed to understand it, and this week's activities accomplished that goal. Now that I understand the curriculum guide, I can use it, and I'll be prepared to teach these courses in August.

Thursday, May 12, 2016

Standards and Backwards Mapping



In this blog, I'm going to discuss developing a unit plan for State of Arizona Business Management and Administrative Standard 2.0: Demonstrate Marketing Concepts. The standard was taken from the Arizona Department of Education Career and Technical Education standards and is part of the Business Management and Administrative list of standards.

I chose this standard because I'll be teaching it and need to refresh on it. It is part of the entrepreneurship class and will mostly be taught to junior- and senior-level high school students. Although sophomores are eligible, I was informed that the class is mostly taken by juniors and seniors participating in the FBLA (Future Business Leaders of America) and DECA (formerly Distributive Education Clubs of America) clubs. Both clubs are active at the college level, with FBLA additionally operating in the post-college, professional environment.

The standard asks students to "Demonstrate Marketing Concepts" by way of eight measurement criteria which are:

2.1 Explain marketing terminology
2.2 Analyze internal and external markets
2.3 Explain the difference between product and service-based marketing
2.4 Explain a marketing plan
2.5 Predict how changes in sales volume, unit costs and unit sales and pricing affect net income (Note to teacher: accounting - financial ratios)
2.6 Describe how businesses compete for market share in identified markets
2.7 Explain the impact marketing research has on the success of a business
2.8 Use desktop publishing to design and print a flier to market a product or service

Looking ahead to Activity 3, these criteria can also be viewed as objectives, since each contains an identifiable goal, with the final goal of "demonstrating marketing concepts." Under Fink's and Bloom's taxonomies we have:

1. Explain - Foundational Knowledge - Understanding and Remembering
2. Analyze - Application - Critical Thinking
3. Demonstrate - Application - Performance Skill, Human Dimension and Caring
4. Predict - Application - Practical Thinking
5. Describe - Foundational Knowledge - Understanding and Remembering
6. Explain -  Foundational Knowledge - Understanding and Remembering
7. Use - Application - Performance Skill

Do the Standard and its measurement criteria fulfill SMART learning objectives? They will once conditions, time limits and relevant activities are attached to them; in their present state, they are missing a few SMART components.

Knowing that the district will be switching to Apple Macs in Fall 2016, I anticipate that this and other Business Management standards will be updated, because they currently list PowerPoint and Word as tools to use, with lesson plans listed under them. However, I couldn't find a curriculum guide for the Business Management standards; there were references to lesson plans, but I wasn't able to access them. The Accounting Curriculum Guide, by contrast, was complete in that it listed all standards plus measurement criteria, content, implementation, terminology, and assignments and projects for the teacher to use. Thus, although Accounting Standard 3.0 is very large, it is also very complete and includes instructions for lesson plans: the user simply reads the text under the heading "Implementation" and uses it to create a lesson plan.

As I stated above, I wasn't able to access lesson plans for Marketing Standard 2.0, only references to them. That being said, I created my own set of proficiencies, assessments and activities for the standard.

Proficiencies

Students will have to demonstrate four proficiencies: create a business plan, complete an infographic project, and pass both the midterm and final exams.

Assessments

My three critical assessments are the midterm exam, the final exam and the successful creation of a business plan. Because business plans can be complex, I'll divide students into groups, and each group will ultimately present its completed business plan to the rest of the class. To increase learning, some groups will work on business plans for service-based industries while others focus on product-based industries.


Activities

I anticipate that many activities and projects will be attached to this standard, including a business plan, a SWOT analysis project, a marketing mix project, a marketing research project, a pricing game and an infographic project in place of the flier project listed above. As I stated above, I anticipate that the Business Management and Administrative Standards will be updated as new technology is introduced. Although teachers must follow curriculum guidelines, they are allowed to introduce topics and technology that may be helpful. For example, although the focus is on business marketing, I may suggest that students study how marketing is used in political campaigns, by non-profits to encourage donations, and by government, as well as social-impact and psychological marketing strategies. One student did a project on the influence of marketing on gangs and how the desire for high-priced sneakers encouraged criminal behavior (stealing), while other students looked at successful strategies used by non-profits to boost membership and donations. I'll give my students a choice of topics to encourage learning.





References

The University of New Mexico School of Medicine. (2005). Effective Use of Performance Objectives For Learning and Assessment. Retrieved from: http://ccoe.rbhs.rutgers.edu/forms/EffectiveUseofLearningObjectives.pdf.