Saturday, April 26, 2014

Empty Room = Educational Opportunity: How Do You Furnish An Empty Classroom Correctly?

I have an empty room. It used to be a computer lab. This year we decided to move the lab into another room (actually, we consolidated two labs into one). The reason is simple: I don't believe computer labs are as needed as they used to be. Almost all students have their own computers, be they phones, tablets, or laptops. The issue now is connectivity: do they have access to cloud-based systems? If they do, then they can make a lab wherever they are, using their own devices. Many schools have begun to use iPads (which we have done as well) or other portable tablet devices, and have consequently allowed those students who do not have portable computers of their own to use these. Does this mean we should get rid of all our computer labs? No, it does not: younger students will most likely not have their own devices, or are not allowed to bring them to school. Also, some courses use specialized software that students will not normally have on their machines, or that would not easily run on tablet devices.

So we have one lab. What about the empty room? This empty room is earmarked to become a multi-purpose room.
Empty Room
The concept is this: we want the room (roughly 45' by 22') to be usable by all classes and all grades, for anything they want it to be. A tall task. But if we keep 21st Century Learning in mind and contemplate what we want students to be able to do in the classroom, then I feel this room can be furnished and set up to accommodate it.

Thinking from the younger grades up, here is a list of what we want the room to be able to accommodate:

  1. Open space for students to move and find their own area to work in
  2. Inviting, not overstimulating (for students with ADD), but not so bland that it fails to inspire
  3. Modular: no set style to the entire room - students can move objects around
  4. Areas where you can lecture to an entire classroom
  5. Areas where you can do mini-lectures with small groups of students - where many of these can happen at the same time (for collaboration)
  6. Areas where students could work independently
  7. Multimedia area (screen - projector - computer) for presentations
  8. Areas to draw or brainstorm on (whiteboards)
  9. Access to the Internet (WIFI)
So what will this look like, and what will we put in it? The process is still in the brainstorming stage, but we have a few ideas. Here is a quick list of some of the things we have thought about:
  1. All desks and chairs should be movable
  2. Chairs should be adjustable in height to accommodate bigger or smaller users
  3. Tables should be trapezoidal in shape so they can be made into small or larger groups of desks if need be
  4. Whiteboards should be movable: this allows us to make walls and work areas wherever we are
  5. A large mural on one wall
  6. Three-tiered stairs against the mural wall (22' wide) so all students can sit if you need to lecture to the whole group
  7. An area with a sofa and a desk, computer, etc., for multimedia presentations or for teleconferencing options
  8. Single seating areas along the window
  9. WIFI node in the room for clear internet access (along with a few outlets for laptop power and wired network access)
  10. Some foam blocks or bean bags for younger students to sit on and move around as well.
As we progress through this, I will post updates on the project. 

Tuesday, April 22, 2014

Fraser Institute: A Look at the Standings from the Numerically Changeable Top

Saying the name of this organization to teachers or administrators of schools in BC will prompt responses: some will be positive, and some will be words like those I talked about in my last post. The school I work at happens to be at the top of the pile of high schools in our city this past year (1st for high schools [1 of 12] and 6th for elementary schools [6 of 30]) and, yes, we are happy (we'll take it). However, what strikes me about our standing is how easily it could have been different. It also calls into question how accurate the ranking is.

Our school is small: the n used by the Fraser Institute was 19 students for the high school. Our mark was 7.4 (out of 10). Again, this doesn't speak well for the schools in our area when we, the highest-ranked high school this year, only received a mark of 74%. For Grade 4 (elementary), the n was even smaller (13), and we received a 6.9 rating (6th out of 30 elementary schools in our area, which is good).

Interestingly, of the top 10 ranked high schools, only 3 had over 100 graduating students. Does this suggest that smaller schools do better? Let's go one step further: do better at what, exactly?

My initial thought on the issue of working in a small school and the Fraser Institute ranking system is this: numerically, when you have only 16 students as we did, having a few students who did not, for whatever reason, do well on their exams will bring the school's score down, perhaps by quite a large amount. In fact, the Fraser Institute's own "How To Read The Ratings" help file has this to say: "Indicator results for small schools tend to be more variable than do those for larger schools and caution should be used in interpreting the results for smaller schools." And yet, overall, it is the smaller private schools in BC that seem to consistently do better in this form of rating than the larger public schools.

The Fraser Institute uses many measuring points to come up with a total for a school: "The Overall rating out of 10 takes into account the indicators a) average exam mark, b) percentage of exams failed, c) school vs exam mark difference, d) gender gaps, e) graduation rate and f) delayed advancement rate." According to Sridhar Mutyala, in his post entitled The Real Problem With The Fraser High School Rankings – Part 1, the weights of the indicators are as follows:
  • average exam mark—20%
  • percentage of exams failed—20%
  • school vs exam mark—10%
  • English gender gap—5%
  • Math gender gap—5% 
  • courses taken per student—20% (this indicator seems to have been removed this year, so the rest of the weighting changed a bit for 2013)
  • diploma completion rate—10%
  • and delayed advancement rate—10%. 
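To see how these weights combine, here is a minimal sketch of a weighted overall rating, using the percentages quoted above from Mr. Mutyala's post. The indicator scores in the example are made up for illustration; the Fraser Institute's exact method of normalizing each indicator onto a 0–10 scale is not detailed here.

```python
# Weights as quoted from Mr. Mutyala's post (pre-2013, before the
# courses-per-student indicator was reportedly removed).
WEIGHTS = {
    "average_exam_mark": 0.20,
    "pct_exams_failed": 0.20,
    "school_vs_exam_gap": 0.10,
    "english_gender_gap": 0.05,
    "math_gender_gap": 0.05,
    "courses_per_student": 0.20,  # reportedly dropped for 2013
    "diploma_completion": 0.10,
    "delayed_advancement": 0.10,
}

def overall_rating(indicator_scores):
    """Weighted sum of indicator scores, each assumed already on a 0-10 scale."""
    return sum(WEIGHTS[name] * score for name, score in indicator_scores.items())

# Hypothetical school: strong exam marks, weaker completion rate.
example = {
    "average_exam_mark": 8.0,
    "pct_exams_failed": 7.5,
    "school_vs_exam_gap": 6.0,
    "english_gender_gap": 9.0,
    "math_gender_gap": 9.0,
    "courses_per_student": 7.0,
    "diploma_completion": 5.0,
    "delayed_advancement": 6.0,
}
print(round(overall_rating(example), 2))  # 7.1
```

Note how a single weak indicator (diploma completion at 5.0) pulls the hypothetical school's overall rating down even though its exam marks are strong.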

I would encourage anyone to look at Mr. Mutyala's post, as he does a great job of examining the statistical issues with the ranking and the indicators used by the Fraser Institute. However, to speed things up a bit, here is a synopsis: the issue that Mr. Mutyala has with some of these indicators, and I tend to agree with him, is that many of them have very little to do with how a school does in educating its students (and here I will not even go into whether these indicators have anything to do with how a student is "educated"). So why were these indicators chosen? According to the Fraser Institute, the indicators cover the following areas: three indicators deal with effective teaching, some show consistency in marking (and gender gaps), and others show practical, well-informed counselling.

However, I go back to the notion that small changes in one or more indicators can make large differences in the overall mark. Using our school as an example: when one student out of 16 fails an exam, the school's overall mark will drop dramatically. As well, if a Grade 12 student drops out of school (and there can be various reasons for this that are in no way controllable by the school itself), it can also heavily affect a school's overall rating.
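The arithmetic behind this small-n sensitivity is simple to show. The sketch below (my own illustration, not the Fraser Institute's calculation) compares how far the percentage-of-exams-failed indicator moves when one additional student fails, at a school our size versus a large school.

```python
# Small-n sensitivity: how much one additional failing student moves the
# "percentage of exams failed" indicator, given n exam writers.
def failure_rate_shift(n_students):
    """Percentage-point change in the failure rate caused by one more failure."""
    return 100.0 / n_students

print(failure_rate_shift(16))   # 6.25 percentage points at a school our size
print(failure_rate_shift(200))  # 0.5 percentage points at a large school
```

One failing exam at a 16-student school moves that indicator more than twelve times as far as it would at a 200-student school, which is exactly the variability the Fraser Institute's own help file warns about.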

Notice, however, that small schools seem to do better on the Fraser Institute report. In fact, if all of a small school's grads graduate and do well on their exams, it will rank high on the report, even higher than a larger school in the same scenario. According to Mr. Mutyala, the mix of variables and "ad hoc weighting" of the indicators by the Fraser Institute (how did they come up with those percentages?) actually favors small schools over larger ones. This trend is easily seen when you compare the Fraser Institute's rankings to raw test scores.

The question that this raises is this: are exam scores an acceptable way of measuring a school's ranking? I suggest "no", and arguments could be made here about the pros and cons of standardized testing as sole indicators of education - this topic is much too large to broach here.

With all of this in mind, I conclude the following: the Fraser Institute's ranking should not be the sole factor on which families base their choice of school for their child. Instead, parents need to look at many factors, including school demographics, ethics, and beliefs (what is important, what is not), and begin there: how do you want your child to be educated (what is an "educated" student in your eyes)? Granted, parents should keep an eye on academics; yes, this is important. At the same time, the schools themselves (teachers and administrators) should always look at their students' achievement scores and respond accordingly, but many factors will need to be addressed on a per-student basis to help each student reach their academic potential.

As well, let me add this comment: perhaps students themselves should become more involved in their own education, and schools should instead be only the place where they go to work through their educational goals, not a place that is responsible for how they did.