kitchen table math, the sequel: fluency
Showing posts with label fluency. Show all posts

Monday, September 3, 2012

get the MESAG, part 2

Back from the Open, so summer is officially over, as opposed to emotionally over, which it was the instant Chris set foot inside his dorm.

Hate the empty nest! hate! hate! hate!

Anyway, getting back to my interrupted post on precision teaching, when you learn something to fluency, you "get the MESAG":

Maintenance

"You never forget how to ride a bicycle." When you learn content or skills to fluency, you remember them.

Endurance

This one surprised me. Fluency in content and skills means you can perform them for as long as you need to. You have stamina.

Stability

Another surprise, rife with implications for our ADHD epidemic: fluent knowledge and skills are impervious to distraction. A noisy classroom has no effect on skills a child knows so well he can do them in his sleep. If he can do long division in his sleep, he can do long division inside Penn Station.

Application

Transferring semi-old knowledge to new contexts is hard. My favorite story re: transfer of knowledge is the little autistic boy whose parents and teachers spent months painstakingly teaching him to butter his bread. Finally he learned! Everyone was happy until, a few weeks later, they discovered that the little boy had no idea how to spread peanut butter on bread. Spreading butter on bread and spreading peanut butter on bread were two different things, and they had to start all over again.

Fluency allows you to apply your bread-buttering skills to peanut-buttering bread.

Autistic children, by the way, are rarely taught anything to fluency. "Discrete trial" teaching puts a ceiling on the number of repetitions a child can do in a minute. 80% correct does not equal fluency.
Generativity

Fluent knowledge and skill "repertoires" readily recombine to produce new skills that don't have to be directly taught.

Is fluency the magic that makes inflexible knowledge flexible?

Sunday, September 2, 2012

Is Handwriting Causally Related to Learning to Write?

The contribution of handwriting to learning to write was examined in an experimental training study involving beginning writers with and without an identified disability. First-grade children experiencing handwriting and writing difficulties participated in 27 fifteen-min sessions designed to improve the accuracy and fluency of their handwriting. In comparison to their peers in a contact control condition receiving instruction in phonological awareness, students in the handwriting condition made greater gains in handwriting as well as compositional fluency immediately following instruction and 6 months later. The effects of instruction were similar for students with and without an identified disability. These findings indicate that handwriting is causally related to writing and that explicit and supplemental handwriting instruction is an important element in preventing writing difficulties in the primary grades.

Is handwriting causally related to learning to write? Treatment of handwriting problems in beginning writers.
Graham, Steve; Harris, Karen R.; Fink, Barbara
Journal of Educational Psychology, Vol 92(4), Dec 2000, 620-633

Saturday, September 1, 2012

get the MESAG

Another dispatch from my two weeks at Morningside's Summer School Institute:

"Get the MESAG" is Morningside's acronym for the results of fluency: get the MESAG.

Maintenance
Endurance
Stability
Application
Generativity

More tomorrow ---

Tuesday, August 21, 2012

Common Core re-defines fluency

More from the Education Week article on Common Core assessments:
He pointed to one illustrative example in PARCC’s materials that tries to gauge students’ fluency in division and multiplication. It offers five equations, such as 54÷9=24÷6, and asks 3rd graders to specify whether each is true or false.

“I like that it does multiple assessments in one item,” he said. “It asks kids to work each of those problems easily and be comfortable with it, which is what fluency is.”
Published in Print: August 22, 2012
Consortia Provide Preview of Common Assessments
By Catherine Gewertz
Offhand, I don't see how a set of five equations like these can test fluency. Fluency isn't simply a matter of accuracy, ease, and comfort. Fluency includes speed, and to assess fluency you need fluency aims, or standardized rates of performance.

How fast should a third grade student be able to answer these 5 questions? That's what you would need to know to use these equations to assess fluency, and one set of 5 simple equations probably isn't enough to measure speed.
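To make the distinction concrete, here's a minimal Python sketch of what rate-based fluency scoring looks like, using made-up items in the style of the PARCC example. The item values and the timing figures below are my own illustrations, not PARCC's specification.

```python
# Hypothetical true/false items modeled on "54÷9 = 24÷6": each pairs two
# quotients, and the student says whether the equation is true or false.
from fractions import Fraction

items = [
    ((54, 9), (24, 6)),   # 6 vs 4  -> false
    ((36, 4), (72, 8)),   # 9 vs 9  -> true
    ((48, 6), (56, 7)),   # 8 vs 8  -> true
    ((63, 7), (45, 9)),   # 9 vs 5  -> false
    ((30, 5), (42, 7)),   # 6 vs 6  -> true
]

def item_is_true(lhs, rhs):
    """Evaluate whether the equation 'a/b = c/d' is actually true."""
    (a, b), (c, d) = lhs, rhs
    return Fraction(a, b) == Fraction(c, d)

def fluency_rate(num_correct, seconds_taken):
    """Correct responses per minute -- the unit precision teaching uses."""
    return num_correct * 60.0 / seconds_taken

answer_key = [item_is_true(l, r) for l, r in items]
print(answer_key)  # [False, True, True, False, True]

# Accuracy alone is not fluency: five correct answers in 300 seconds is
# 1 correct/min; five correct answers in 15 seconds is 20 correct/min.
print(fluency_rate(5, 300))  # 1.0
print(fluency_rate(5, 15))   # 20.0
```

The point of the sketch: without the denominator (time), the five items measure only accuracy, which is why a fluency aim has to be stated as a rate.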

Beyond that, I'm skeptical this is a fluency test at all. Seems to me it's more an application-of-knowledge test than a test of fluency per se.

Here's the list of fluency aims Rick Kubina culled from the precision teaching literature.

Maybe I should send a copy to PARCC.

Monday, August 6, 2012

Terry

I heard a lot about Terry while I was at Morningside Academy's Summer School Institute.
Eric [Haughton, one of the creators of precision teaching] helped his wife Elizabeth plan for a kindergarten student named Terry Harris. Terry had cerebral palsy, and walked with crutches. Elizabeth was teaching him to write his name. It had taken from September to Christmas vacation to teach Terry how to write his first name. Elizabeth wondered if there wasn’t a better way to teach him to write his last name. Even though there were only four new letters to teach, it still seemed like a daunting task. Eric asked her if Terry could write 250 to 200 vertical strokes in a minute. Elizabeth mentioned that Terry was quadriplegic — Eric replied, “I didn’t ask what he looks like, Elizabeth — can he do 250 to 200 vertical strokes per minute or not?” Elizabeth admitted that she didn’t think so. “Can he do 140 to 120 zero’s in a minute?” Again Elizabeth said he probably could not. “Those are the elements that make up the compounds for every letter or number we write. If they are not fluent, then learning to write numbers and letters will fail.”

Returning to school, Elizabeth and Terry spent the next three weeks working on strokes and zeros. Terry went from about 50 vertical strokes to over 175, and from 25 zeros to over 90. “But Terry and I were getting tired of this drill, and we were ready to try going back to writing his name.” So they did; how long did it take for Terry to learn to write Harris?

Terry learned it in five minutes.

LESSONS LEARNED: Eric Haughton and the importance of fluency
Wicked Local Hingham | January 22, 2012
Autistic children often spend years learning the same things over and over and over again in school.

What would happen if all of these students were moved from "discrete trial"/80% mastery criteria to precision teaching/fluency training?

Wednesday, April 13, 2011

progress report, part 2

C. and I took another math section of the SAT today, and it looks like I am now definitively finishing math sections. In fact, I'm starting to finish with a minute or two to spare. C. is finishing, too, although he's still not able to do all the problems. He skipped 3 today and probably got everything else right. I skipped none and probably missed 2. (We can't check our answers 'til we enter them online.)

I'm still making dumb mistakes. Typically, I'll miss one of the very first questions on the test, which C. never does but enjoys watching me do. We're starting to have a ritual: I miss question 2 or 3, and C. says, "You always miss the easy one and get the hard ones right."

Today I misread the number 6 on a graph as the number 5. Arrrgh.

The good news: I seem to have stopped making bubbling errors. Thank God. Losing points to bubbling errors when you can't even finish the damn test is uniquely demoralizing.

These days I have enough time to check each page for bubbling errors, but I don't find any when I check. More and more, I think speed is the answer -- whatever speed means, exactly, which is more than simply finishing early.

I'm probably talking about fluency.

As I become more fluent in SAT math, I'm becoming more fluent in bubbling, too.

Next challenge: no more dumb mistakes.

Tuesday, July 7, 2009

Physics Education Continued

In a prior post, I discussed the problem of college students having poor physical intuition both before and after taking university physics. In another post I will discuss David Hestenes' proposed solution to this problem, but first I wanted to provide some examples of what kinds of errors these students are making.

Hestenes et al. developed a test called the Force Concept Inventory, but they've embargoed online versions of it (it's available if you can prove you are a physics teacher or professor). The following questions are similar to questions on the FCI, but I've respected their embargo and adapted them from similar questions. Their own questions are drawn from other papers as well; the literature is filled with examples of how physics students don't understand basic mechanics.

Here are two test questions, the first adapted from Students' preconceptions in introductory mechanics, J. Clement, Am. J. Phys. 50(1), Jan. 1982, and the second adapted from Rule-governed approaches to physics--Newton's third Law, D. P. Maloney, Phys. Educ., Vol. 19, 1984. Note that my adaptations haven't been tested on thousands of students, so they may not be as crystal clear as I hope...

1. A ball is tossed from point A straight up into the air and caught at point E. It reaches its maximum height at point C, and points B and D are at the same height above the ground. IGNORE AIR RESISTANCE.
Try to imagine that "up" on the page is the z direction, and that the horizontal direction is x. No motion is occurring in x.





a. Draw with one or more arrows showing the direction of each force acting on the ball when it is at point B.
b. Is the speed of the ball at point B greater, lesser, or the same as at point A?
c. Is the speed of the ball at point D greater, lesser, or the same as at point B?


2. Consider the following diagrams of two blocks on a frictionless surface and answer the following questions. Ignore air resistance.
a. Assuming both blocks are at rest:


How does the force that A exerts on B compare to the force B exerts on A, if A and B are equal in mass?
How does the force that A exerts on B compare to the force B exerts on A, if A and B have different masses?

b. Assuming both blocks are moving to the right with velocity v:


How does the force that A exerts on B compare to the force B exerts on A, if A and B are equal in mass?
How does the force that A exerts on B compare to the force B exerts on A, if A and B have different masses?

c. Assuming both blocks are moving to the left with constant acceleration a:


How does the force that A exerts on B compare to the force B exerts on A, if A and B are equal in mass?
How does the force that A exerts on B compare to the force B exerts on A, if A and B have different masses?
---
While the actual test employed some randomization and various other elements (set values for the masses, e.g.) the results for this last question were that less than 10% of experienced students (those who had taken college physics) got the right answer using the right reasoning, and 0% of the novice students (those who had not yet taken college physics) got the right answer.

UPDATE: See, I told you I hadn't vetted the questions. Updates are above in BOLD. College Physics above means college students taking a standard first term mechanics course. In the test they did with question c, the students were junior or senior year chemistry students who were required to take a 1 year physics course as a prereq for their major. The author was at Creighton University, so presumably these students were at Creighton University as well. The author points out that at least half a dozen of these students who got these wrong had also taken the MCAT, and possibly had studied physics AGAIN as well.

SECOND UPDATE:

how about some answers?

Problem 1: a. There's one force on the ball: gravity, pointed down. b. The speed of the ball at B is less than the speed at A. In fact, the speed drops continuously until the ball reaches C. c. The speed of the ball at B is the same as at D. In fact, the speed of the ball at any height above the initial point A is the same whether the ball is going up or coming down; the speed depends only on height above the launch point.
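For anyone who wants to check part c numerically, here's a quick sketch using assumed values (a 20 m/s launch speed; g = 9.8 m/s²). With no air resistance, the kinematic relation v² = v₀² − 2gh says speed depends only on height, not on the direction of travel.

```python
# Speed of the tossed ball as a function of height above the launch point,
# from energy conservation / kinematics: v(h) = sqrt(v0^2 - 2*g*h).
# The launch speed and the height of B and D are assumed example values.
import math

g = 9.8      # m/s^2, magnitude of gravitational acceleration
v0 = 20.0    # m/s, launch speed at point A (assumed)

def speed_at_height(h):
    """Speed at height h above the launch point, via v^2 = v0^2 - 2 g h."""
    return math.sqrt(v0**2 - 2 * g * h)

h_B = 5.0                                # say B and D sit 5 m above A
v_B_going_up = speed_at_height(h_B)      # speed at B on the way up
v_D_coming_down = speed_at_height(h_B)   # same height -> same speed at D

print(v_B_going_up == v_D_coming_down)   # True
print(v_B_going_up < v0)                 # True: slower at B than at A
```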

Problem 2: the answer to every part is the same: the force exerted by A on B is equal in magnitude and opposite in direction to the force exerted by B on A.
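A small sketch of the standard pushed-block setup behind Problem 2, with made-up force and mass values, shows why the contact pair is equal and opposite no matter what the masses are: Newton's second law applied to the whole pair and then to B alone determines the contact force, and the third law fixes the reaction.

```python
# Two blocks (A behind B) on a frictionless surface, with an applied
# force F pushing on A. All numeric values below are assumed examples.

def contact_pair(F, m_A, m_B):
    """Return (force of A on B, force of B on A) for a push F applied to A."""
    a = F / (m_A + m_B)        # common acceleration of the pair (2nd law)
    f_A_on_B = m_B * a         # the only horizontal force acting on B
    f_B_on_A = -f_A_on_B       # third-law reaction on A
    return f_A_on_B, f_B_on_A

# Equal masses:
print(contact_pair(12.0, 2.0, 2.0))   # (6.0, -6.0)
# Unequal masses -- the magnitudes are still equal:
print(contact_pair(12.0, 1.0, 5.0))   # (10.0, -10.0)
```

Note the Dominance Principle misconception quoted later in this archive predicts the heavier block "wins"; the sketch shows the pair is symmetric regardless.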


Physics Education and Failures in Conceptual Understanding
Fixing Physics Education: Modeling Instruction
Physics Education Continued
More Modeling Instruction: Techniques

Fixing Physics Education: Modeling Instruction

In prior posts, I referred to work done by David Hestenes and his colleagues at Arizona State University addressing the dismal results of traditional university level physics instruction. Hestenes and others developed the Force Concept Inventory, FCI, to demonstrate that even after a year of traditional instruction, college students had failed to create proper models in their own minds for how mechanics actually works. Specifically,
"Before physics instruction, students hold naive beliefs about mechanics which are incompatible with Newtonian concepts in most respects.
• Such beliefs are a major determinant of student performance in introductory physics.
• Traditional (lecture-demonstration) physics instruction induces only a small change in the beliefs. This result is largely independent of the instructor’s knowledge, experience and teaching style. "


Hestenes has gone forward from there and developed a new curriculum design for high school and college physics instruction, which he has pushed out to physics teachers on his own and, more recently, with NSF backing. He calls this new type of instruction modeling instruction.

What is meant by modeling instruction? The idea is that you learn physics by constructing appropriate models of the interactions in your system, and then applying inference to your model to solve whatever problem you have. The emphasis is on getting students to recognize the model at hand by making them familiar with constructing models in the first place. What's a model? A model is a representation of your system and its properties; it tells you everything you need to know: the system, the boundaries of the system, the state variables inside it, their initial conditions, the transition function for those state variables, and whatever interactions you need to account for. This sounds vague, but the point of a model is that once you write it all down, you can say explicitly whether or not you have everything you need to infer what happens.
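One way to make this definition concrete in code (my own illustrative encoding, not Hestenes' or Morningside's): a model bundles state variables, initial conditions, and a transition function, and "inference" is just applying the transition function to the state.

```python
# A toy encoding of Hestenes' definition of a model: state variables with
# initial conditions, plus a transition function that advances the state.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Model:
    name: str
    state: dict                                  # state variables + initial conditions
    transition: Callable[[dict, float], dict]    # advances the state by dt

def constant_acceleration_step(state, dt):
    """Transition for the constant-acceleration model:
    x' = x + v*dt + a*dt^2/2,  v' = v + a*dt."""
    x, v, a = state["x"], state["v"], state["a"]
    return {"x": x + v * dt + 0.5 * a * dt * dt, "v": v + a * dt, "a": a}

ball = Model(
    name="constant acceleration",
    state={"x": 0.0, "v": 20.0, "a": -9.8},   # a ball tossed straight up
    transition=constant_acceleration_step,
)

# Inference = repeatedly applying the transition function.
s = ball.state
for _ in range(10):                       # ten steps of 0.1 s = 1 s total
    s = ball.transition(s, 0.1)
print(round(s["x"], 2), round(s["v"], 2))  # 15.1 10.2
```

The constant-acceleration model here is one of the five kinematical models the ASU materials name below; swapping in a different transition function gives a different model.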

In modeling instruction, the idea is that "the modeling method approaches the problem of restructuring students’ intuitions by engaging them in explicit construction and manipulation of externally structured representations. In the case of mechanics (6), we have found it advisable to engage students in explicit comparisons of the three major misconceptions in Box 1 with their Newtonian alternatives. When these three are adequately treated, many other misconceptions about mechanics fall away with them. "

To facilitate the teaching of mechanics by modeling instruction, the modeling group at ASU created course materials and curricula for a high school level mechanics class. Their modeling method for mechanics explicitly teaches 10 models, five of them models of motion (kinematical models): constant velocity, constant acceleration, the simple harmonic oscillator, uniform circular motion, and a collision model; and five models of force (causal models): the free particle, the constant force, the central force, the linear binding force, and the impulsive force. The idea is that by explicitly organizing the ideas of motion and force in this way, the student will see the common physics in each. This is as opposed to organizing ideas around "problems".

Here's an example. "Oh, that's a projectile problem," "oh, that's a block-sliding problem," "oh, that's an orbit problem" is a typical way a student might think about the physics problem in front of them, but it doesn't help elucidate the relevant features of the problem at hand; whereas recognizing "oh, that's a constant-velocity problem," "oh, that's a free-particle problem," "oh, that's a central-force problem" leads you immediately to know, or be able to infer, the geometric structure, the interaction structure, the force structure, the changes over time, etc.

Enough talk! Let's jump in. Here are the course notes for the unit on the free-particle model. The instructional goals are to use the free-particle model to develop intuition for Newton's First Law (commonly stated as "an object in motion tends to stay in motion; an object at rest tends to stay at rest") and Newton's Third Law, and to be able to correctly represent forces as vectors.


1. Newton’s 1st law (Galileo’s thought experiment)


Develop notion that a force is required to change velocity, not to produce motion
Constant velocity does not require an explanation.

2. Force concept

View force as an interaction between an agent and an object
Choose system to include objects, not agents
Express Newton’s 3rd law in terms of paired forces (agent-object notation)


3. Force diagrams

Correctly represent forces as vectors originating on object (point particle)
Use the superposition principle to show that the net force is the vector sum of the forces



4. Statics

ΣF = 0 produces the same effect as no force acting on the object
Decomposition of vectors into components
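Items 3 and 4 above can be sketched in a few lines: represent each force on the object as a vector, take the net force as the vector sum (superposition), and in statics check that the sum is zero. The force values below are made-up examples, not from the ASU materials.

```python
# Superposition: the net force is the vector sum of the individual forces
# acting on the object (represented here as a point particle in 2-D).

def net_force(forces):
    """Vector sum of 2-D forces given as (Fx, Fy) tuples."""
    return (sum(f[0] for f in forces), sum(f[1] for f in forces))

# A block at rest on a table: weight down, normal force up (statics).
forces_static = [(0.0, -49.0), (0.0, 49.0)]
print(net_force(forces_static))        # (0.0, 0.0): same effect as no force

# The same block pushed sideways with 10 N while 4 N resists the motion:
forces_moving = [(0.0, -49.0), (0.0, 49.0), (10.0, 0.0), (-4.0, 0.0)]
print(net_force(forces_moving))        # (6.0, 0.0): net force changes velocity
```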

Continuing, the teacher's notes state "It is essential that you get students to see that the constant velocity condition does not require an explanation; that changes in velocity require an interaction between an agent and an object. We quantify this interaction by the concept of force. After the dry ice and normal force demos, one can use worksheet 1 as an opportunity to deploy the force concept in a qualitative way. It is important to carefully treat how to go about drawing force diagrams in which one represents the object as a point particle. Drawing the dotted lines around the object helps students distinguish between the object and the agent(s). "
And

"Newton’s Third Law ...Researchers have identified and categorized many such misconceptions, but two of them are particularly important, because they are persistent common sense alternatives to Newton’s Laws. Ignoring variations and nuances, these misconceptions can be formulated as intuitive principles.
I. The Impetus Principle: Force is an inherent or acquired property of objects that make them move.
II. The Dominance Principle: In an interaction between two objects, the larger or more active object exerts the greater force."


The notes then describe detailed demos and labs, with pre and post discussion points as well:
"It is an indirect goal of this activity to provide students an opportunity for arguing that a free particle, i.e. one subject to zero net force, will have a constant velocity. Also, students should conclude that any apparent change in velocity of an object indicates that a non-zero net force is acting upon it, provided that the observer is in an inertial frame of reference. ,,,
Make the point that when no force acts on the block in the horizontal direction, the block maintains constant velocity.
* Point out that an impulse applied perpendicular to the original trajectory does not result in the block making a right angle turn.
* Be sure to ask why they think the block continues to move once it leaves the hand. Some are likely to answer " due to the force of the hand."


The notes continue in this fashion, with specific notes on which demos and labs to use, how to guide them, what the appropriate leading questions are, when to ask for students' input, and when to build to consensus.

This might seem like a fairly normal course, being taught with a normal lab. But the structure is different. More on that in the next post.



Thursday, July 2, 2009

Physics Education and Failures in Conceptual Understanding

For decades, physics professors in universities and colleges in the US have known there is something wrong with physics studies in their schools. Their own physics majors have a poor understanding of basic concepts in mechanics, electricity and magnetism, and quantum mechanics.

Journals like The American Journal of Physics (devoted to teaching and pedagogy at the university level) and Physics Teacher (the same for high school and lower) bring up these issues, with a variety of proposed changes and solutions both at the individual classroom level and at the higher theory-of-ed level. D. Hestenes at ASU and his colleagues have done work in this area, both in questioning the failures of pedagogy and developing some solutions. First, the problems.

D. Hestenes wrote in "What Do Graduate Oral Exams Tell Us?" (Am. J. Phys. 63:1069 (1995)) of finding a quote from physicist W. F. G. Swann in "The Teaching of Physics" (Am. J. Phys. 19, 182-187 (1950)):
"Much can be said about oral examinations for doctor’s degrees, and in my judgment not much can be said that is good. I have sat in innumerable examinations for Ph.D. at very many different universities, sometimes as a member of the permanent faculty and sometimes as a visitor. In almost every case the knowledge exhibited was such that if it represented the true state of mind of the student, he never should have passed. However, after the examination is concluded there is usually a discussion to the effect that: "Well, So-and-so got tied up pretty badly, but I happen to know that he is a very good man," etc., etc., and so finally he is passed."


Hestenes goes on to quote Swann as saying [A student] "passes his tests frequently [including graduate comprehensive exams], alas, with very little comprehension of what he has been doing."

Hestenes diagnoses the problem as this:

It seems not to have occurred to the faculty that dismal oral exams may be symptoms of a severe deficiency in the entire physics curriculum. I submit that there is good reason to believe that they are symptomatic of a general failure to develop student skills in qualitative modeling and analysis.


These general failures mean that even students who have the grades to appear to have excellent mastery of the material do not understand basic elements of the material they have "learned".

It also suggests that college students who fail to understand the material may end up there because their confusion prevents them from attaining the mastery the "good" students have.

Of course, the errors didn't just start in college. Generally speaking, proper physical intuition is lacking in students who took high school physics, even in those who did well. Hestenes writes in "Force Concept Inventory", (Physics Teacher, Vol. 30, March 1992, 141-158)
"it has been established that1 (1) commonsense beliefs about motion and force are incompatible with Newtonian concepts in most respects, (2) conventional physics instruction produces little change in these beliefs, and (3) this result is independent of the instructor and the mode of instruction. The implications could not be more serious. Since the students have evidently not learned the most basic Newtonian concepts, they must have failed to comprehend most of the material in the course. "


Hestenes et al. wrote the Force Concept Inventory, a multiple-choice test whose aim is "to probe student beliefs on this matter and how these beliefs compare with the many dimensions of the Newtonian concept." It poses questions that force a choice between the correct Newtonian answer for an explanation of a given system and other commonsense explanations that are actually misconceptions. After the test, interviews are done to determine students' reasoning.

Here's an example of a misconception that the FCI aims to tease out of a student:
[The misconception of "impetus":]
The term "impetus" dates back to pre-Galilean times before the concept was discredited scientifically. Of course, students never use the word "impetus"; they might use any of a number of terms, but "force" is perhaps the most common. Impetus is conceived to be an inanimate "motive power" or "intrinsic force" that keeps things moving. This, of course, contradicts Newton’s First Law, which is why Impetus in Table II is assigned the same number as the First Law in Table I. Evidence that a student believes in some kind of impetus is therefore evidence that the First Law is not understood.


The FCI has been given to thousands of college and high school students. The above paper details the results on the FCI, given as a pre and post test to both high school and undergraduate physics courses, with tremendous detail on similarities and differences across classrooms in the country. More, it provides strong evidence that traditional college physics pedagogy isn't doing anything to teach physics to the students who take it:

"The pretest/post test Inventory scores of 52/63 for [The Regular Physics Mechanics course at Arizona State University] are nearly identical to the 51/64 scores obtained with the Diagnostic for the same course...we have post test averages of 60 and 63 for two other professors teaching the same course. Thus, we have the incredible result of nearly identical post test scores for seven different professors (with more than a thousand students). It is hard to imagine stronger statistical evidence for the original conclusion that Diagnostic posttest scores for conventional instruction are independent of the instructor. One might infer from this that the modest 11% gain for Arizona State Reg. in Table III is achieved by the students on their own. "


Which brings us back to the state of physics majors going to graduate school:

One of us (Hestenes) interviewed 16 first-year graduate students beginning graduate mechanics at Arizona State University. The interviews were in depth on the questions they had missed on the Inventory (more than half an hour for most students). Half the students were American and half were foreign nationals (mostly Chinese). Only two of the students (both Chinese) exhibited a perfect understanding of all physical concepts on the Inventory, though one of them missed several questions because of a severe English deficiency. These two also turned out to be far and away the best students in the mechanics class, with near perfect scores on every test and problem assignment. Every one of the other students exhibited a deficient understanding of buoyancy, as mentioned earlier. The most severe misconceptions were found in three Americans who clearly did not understand Newton’s Third Law (detected by missing question 13) and exhibited reading deficiencies to boot. Two of these still retained the Impetus concept, while the other had misconceptions about friction. Not surprisingly, the student with the most severe misconceptions failed graduate mechanics miserably, while the other two managed to squeak through the first year of graduate school on probation.


Is it just that the Chinese students who manage to get into US physics grad schools are such creme de la creme that they are perfect? Or is Chinese instruction vastly superior?

(And for those who wonder about American instruction in other subjects, read this and weep:)

One disturbing observation from the interviews was that five of the eight Americans, as well as five of the others, exhibited moderate to severe difficulty understanding English text. In most cases the difficulty could be traced to overlooking the critical role of "little words" such as prepositions in determining meaning. As a consequence, we discarded two interesting problems from our original version of the Inventory because they were misread more often than not.


And yet, those who make it through physics graduate school to professordom mostly correct these errors, at least in mechanics. (Though not necessarily in other areas: in quantum mechanics, new professors are notorious for teaching elements of the material incorrectly, and in special relativity, David Mermin, a professor at Cornell, believes many professors teach the entire subject wrong; he discusses this in a paper called something like "How to Teach Special Relativity.") Hestenes suggests this correction is due to the realities of post-quals grad school: the day-in, day-out teaching and researching refine one's intuition over and over again.

I think this implies something else as well. Error correction in intuition can only occur and stick if the mastery of the manipulation of the equations is so strong that you (correctly) believe what they tell you. If you can be forced to do the math on the board, and forced to read and think about what it says, then you can learn the truth counter to what your intuition tells you, but only if you are utterly sure you did the math on the board correctly.

If instead, you doubt yourself, doubt your manipulation of equations, doubt your application of the laws as you understand them, then you will get confused, doubt your answer, default to your intuition, and scrap learning the correct way to think.

That means you need a tremendous amount of mastery. How in the world to achieve that?

Hestenes' answer --changing how physics is taught in high school and in college--will be explored in a week or so.



Mastery and Conceptual Understanding in Physics

For decades, physics professors in universities and colleges in the US have known there is something wrong with physics studies in their schools. Their own physics majors have a poor understanding of basic concepts in mechanics, electricity and magnetism, and quantum mechanics.

Journals like The American Journal of Physics (devoted to teaching and pedagogy at the university level) and Physics Teacher (the same for high school and lower) bring up these issues, with a variety of proposed changes and solutions both at the individual classroom level and at the higher theory-of-ed level. D. Hestenes at ASU and his colleagues have done work in this area, both in questioning the failures of pedagogy and developing some solutions. First, the problems.

D. Hestenes wrote in "What Do Graduate Oral Exams Tell Us?" (Am J. Phys. 63:1069 (1995)) of finding a quote from physicist W. F. G. Swann, in "The Teaching of Physics", (Am. J. Phys. 19, 182-187 (1950)):
"Much can be said about oral examinations for doctor’s degrees, and in my judgment not much can be said that is good. I have sat in innumerable examinations for Ph.D. at very many different universities, sometimes as a member of the permanent faculty and sometimes as a visitor. In almost every case the knowledge exhibited was such that if it represented the true state of mind of the student, he never should have passed. However, after the examination is concluded there is usually a discussion to the effect that: "Well, So-and-so got tied up pretty badly, but I happen to know that he is a very good man," etc., etc., and so finally he is passed."


Hestenes goes on to quote Swann as saying [A student] "passes his tests frequently [including graduate comprehensive exams], alas, with very little comprehension of what he has been doing."

Hestenes diagnoses the problem as this:

It seems not to have occurred to the faculty that dismal oral exams may be symptoms of a severe deficiency in the entire physics curriculum. I submit that there is good reason to believe that they are symptomatic of a general failure to develop student skills in qualitative modeling and analysis.


Of course, the errors didn't just start in college. Generally speaking, proper physical intuition is lacking in students who took high school physics, even in those who did well. Hestenes writes in "Force Concept Inventory" (Physics Teacher, Vol. 30, March 1992, 141-158):
"it has been established that (1) commonsense beliefs about motion and force are incompatible with Newtonian concepts in most respects, (2) conventional physics instruction produces little change in these beliefs, and (3) this result is independent of the instructor and the mode of instruction. The implications could not be more serious. Since the students have evidently not learned the most basic Newtonian concepts, they must have failed to comprehend most of the material in the course."


Hestenes et al. wrote the Force Concept Inventory, a multiple-choice test whose aim is "to probe student beliefs on this matter and how these beliefs compare with the many dimensions of the Newtonian concept." It poses questions that force a choice between the correct Newtonian explanation of a given system and other commonsense explanations that are actually misconceptions. After the test, interviews are done to determine students' reasoning.

Here's an example of a misconception that the FCI aims to tease out of a student:
[The misconception of "impetus":]
The term "impetus" dates back to pre-Galilean times before the concept was discredited scientifically. Of course, students never use the word "impetus"; they might use any of a number of terms, but "force" is perhaps the most common. Impetus is conceived to be an inanimate "motive power" or "intrinsic force" that keeps things moving. This, of course, contradicts Newton’s First Law, which is why Impetus in Table II is assigned the same number as the First Law in Table I. Evidence that a student believes in some kind of impetus is therefore evidence that the First Law is not understood.


The FCI has been given to thousands of college and high school students. The above paper details the results on the FCI, given as a pre- and post-test in both high school and undergraduate physics courses, with tremendous detail on similarities and differences across classrooms in the country. Moreover, it provides strong evidence that traditional college physics pedagogy isn't doing anything to teach physics to the students who take it:

"The pretest/posttest Inventory scores of 52/63 for [The Regular Physics Mechanics course at Arizona State University] are nearly identical to the 51/64 scores obtained with the Diagnostic for the same course...we have posttest averages of 60 and 63 for two other professors teaching the same course. Thus, we have the incredible result of nearly identical posttest scores for seven different professors (with more than a thousand students). It is hard to imagine stronger statistical evidence for the original conclusion that Diagnostic posttest scores for conventional instruction are independent of the instructor. One might infer from this that the modest 11% gain for Arizona State Reg. in Table III is achieved by the students on their own."


Which brings us back to the state of physics majors going to graduate school:

One of us (Hestenes) interviewed 16 first-year graduate students beginning graduate mechanics at Arizona State University. The interviews were in depth on the questions they had missed on the Inventory (more than half an hour for most students). Half the students were American and half were foreign nationals (mostly Chinese). Only two of the students (both Chinese) exhibited a perfect understanding of all physical concepts on the Inventory, though one of them missed several questions because of a severe English deficiency. These two also turned out to be far and away the best students in the mechanics class, with near perfect scores on every test and problem assignment. Every one of the other students exhibited a deficient understanding of buoyancy, as mentioned earlier. The most severe misconceptions were found in three Americans who clearly did not understand Newton’s Third Law (detected by missing question 13) and exhibited reading deficiencies to boot. Two of these still retained the Impetus concept, while the other had misconceptions about friction. Not surprisingly, the student with the most severe misconceptions failed graduate mechanics miserably, while the other two managed to squeak through the first year of graduate school on probation.


Is it just that the Chinese students who manage to get into US physics grad schools are such creme de la creme that they are perfect? Or is Chinese instruction vastly superior?

(And for those who wonder about American instruction in other subjects, read this and weep:)

One disturbing observation from the interviews was that five of the eight Americans, as well as five of the others, exhibited moderate to severe difficulty understanding English text. In most cases the difficulty could be traced to overlooking the critical role of "little words" such as prepositions in determining meaning. As a consequence, we discarded two interesting problems from our original version of the Inventory because they were misread more often than not.


And yet, those who make it through physics graduate school to professordom mostly correct these errors, at least in mechanics. (Though not always in other subjects: new professors are notorious for teaching elements of quantum mechanics incorrectly, and David Mermin, a professor at Cornell, believes many professors teach all of special relativity wrong; he discusses this in a paper called something like "How to Teach Special Relativity.") Hestenes suggests the correction is due to the realities of post-quals grad school: the day-in, day-out teaching and researching refine one's intuition over and over again.

I think this implies something else he doesn't say. Error correction in intuition can only occur and stick if the mastery of the manipulation of the equations is so strong that you believe what they tell you. If you can be forced to do the math on the board, and forced to read and think about what it says, then you can learn the truth counter to what your intuition tells you, but only if you are utterly sure you did the math on the board correctly.

If instead, you doubt yourself, doubt your manipulation of equations, doubt your application of the laws as you understand them, then you will get confused, doubt your answer, default to your intuition, and scrap learning the correct way to think.

That means you need a tremendous amount of mastery. How in the world to achieve that?

Hestenes' answer --changing how physics is taught in high school and in college--will be explored in a week or so.


Physics Education and Failures in Conceptual Understanding
Fixing Physics Education: Modeling Instruction
Physics Education Continued
More Modeling Instruction: Techniques

Tuesday, April 8, 2008

A Barrier to Academic Achievement: Difficulty with Handwriting, and a Solution

According to a recent study, somewhere between 10% and 30% of children have difficulty learning to produce rapid, legible handwritten work (1). Handwriting difficulty is often linked with other problems such as attention deficit disorder, and the poor handwriting of children with handwriting problems seems particularly related to a deficiency in visual-motor integration (2).

Children who do not acquire fluent, legible handwriting in the early years often experience far-reaching negative effects on both academic success and self-esteem.(1)

“Handwriting is one of the basic building blocks of good writing and plays a critical role in learning,” Graham, Currey Ingram Professor of Special Education at Vanderbilt University’s Peabody College, said. “Young children who have difficulty mastering this skill often avoid writing and their writing development may be arrested. They also may have trouble taking notes and following along in class, which will further impede their development.”


There are three possible sources of a child's handwriting difficulties: a problem with the child, a problem with the teacher, or a problem with the curricula (and related materials).

In "How do primary grade teachers teach handwriting? A national survey" (3), the authors found that:


Nine out of every ten teachers indicated that they taught handwriting, averaging 70 minutes of instruction per week. Only 12% of teachers, however, indicated that the education courses taken in college adequately prepared them to teach handwriting. Despite this lack of formal preparation, the majority of teachers used a variety of recommended instructional practices for teaching handwriting. The application of such practices, though, was applied unevenly, raising concerns about the quality of handwriting instruction for all children.


In a less-formal presentation of the national study data, Steve Graham is interviewed:


Graham suggests that a return to consistent handwriting instruction, with an understanding of the challenges different children face, would not only result in more legible papers but also support overall learning across subjects.

“Teachers need to continue to teach their students how to properly form and join letters. We found that this sort of instruction takes place for 10 minutes or less a day in most schools, down from two hours a week in the 1950s,” Graham said. “At home, there are many things that parents can do to help their young children improve their penmanship. Activities such as identifying and tracing letters, forming letters from memory, copying words and playing timed games to see how quickly they can accurately produce written letters and words all go toward building this skill.”


There are two common handwriting approaches, or curricula, used in U.S. schools: the traditional, based on the Palmer method, and the "italicized," which is more flowing. The most popular of the former is Zaner-Bloser; the most popular of the latter is D'Nealian (developed by Donald Thurber).

There is very little research on the relative effectiveness and efficiency of each approach.

There is, however, a third way: Handwriting Without Tears.

I spent Friday and Saturday at a Handwriting Without Tears seminar. Handwriting Without Tears (HWT) was created in the 1970s by an occupational therapist as a remedial program and, over the decades, has grown into (1) a pre-K through 5th grade classroom curriculum and (2) a remedial program.

It works as a classroom curriculum because the letters are taught in logical order, the letter formation skills are taught to mastery, and the curriculum uses multisensory methods. The teacher watches for, and immediately corrects, errors in letter formation, and the curriculum includes frequent "Review and Mastery" opportunities.

It works as a remedial program because HWT's authors have structured remediation in small, precise steps.

I highly recommend Handwriting Without Tears.


1. Feder KP, Majnemer A. Dev Med Child Neurol. 2007 Apr;49(4):312-7.
2. Volman MJ, van Schendel BM, Jongmans MJ. Am J Occup Ther. 2006 Jul-Aug;60(4):451-60.
3. Graham S, Harris KR, Mason L, Fink-Chorzempa B, Moran S, Saddler B Reading and Writing 2008 21(1-2):49-69.

Elsewhere:

Handwriting Key to Learning, Newsweek, November 12, 2007

LD Podcast: Dr. Steve Graham on writing development.

Interview with Steve Graham

Previous KTM Posts referencing Handwriting
Somewhere in a Well to Do District

Learning in a Castle of Fear


Speed Test

Thursday, August 2, 2007

advice re: fractions, decimals, percent

from the Comments thread to: What is 10%?


from Steve H:

Re: the "What is 10% off?" issue.

I've just been going over this with my son. I keep asking him 10% of WHAT NUMBER, exactly? I want him to always know what that amount is. I don't call it the whole because it might be confusing for problems that ask for 10% off of 50% off. The classic problem is the store that tells you that you can get an additional 10% off of the 30% off discount price. The question is WHAT number, exactly, does the 10% refer to?

I also got done telling him that when you see something like 30% in a word problem, you never use the number 30 in the calculations. You have to use either .3 or 3/10. It seemed like a minor point, but I could see it sink in.

In the past, we've talked about 15% tips and how to calculate them, but he needs to see percent problems in all forms and see fractions and decimals as two different forms of the same thing.

Another good problem is to talk about the store owner who buys at the wholesale price and marks up by 100% to get the retail price. The store owner could then have a sale and mark down the goods by 30%. The question still has to be WHAT NUMBER, exactly, does the percent refer to?

Another issue I ran into was that he was thrown a little by things like 125% - percents greater than 100%.
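Steve H's successive-discount examples are easy to check in a few lines of Python. The point: each percent applies to the current price, not the original. (All prices here are made up for illustration.)

```python
# Successive discounts: each percent refers to the CURRENT price,
# not the original. All numbers here are made up for illustration.
retail = 100.00

after_30 = retail * (1 - 0.30)    # 30% off the retail price
after_10 = after_30 * (1 - 0.10)  # 10% off the already-discounted price

print(after_30)  # 70.0
print(after_10)  # 63.0 -- not the 60.0 that "40% off retail" would give

# The markup/markdown example works the same way:
wholesale = 50.00
marked_up = wholesale * 2         # 100% markup: the retail price
on_sale = marked_up * (1 - 0.30)  # the 30% off refers to the RETAIL price
print(on_sale)   # 70.0
```

Asking "10% of WHAT NUMBER, exactly?" is just asking which variable the percent multiplies.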


from Joanne Cobasko:

I used many worksheets converting fractions to decimals (by doing the division, since a fraction is just a division problem) and then converting decimals to percents.

The repetition that a fraction is a decimal and a decimal is a fraction, and that a percent under 100 is represented by a two-digit decimal, seems to be working well. I did this when I noticed that Saxon was not providing the simple algorithms to solve the fraction-of-a-whole problems (and of course multiplication of decimals and fractions hadn't been introduced, so I taught that too). I have been teaching ahead of Saxon's approach with the actual algorithms. My son hates drawing the pictures, but I have him draw to prove he understands. I taught him to write the equation first, solve the problem, then draw the picture.

When Saxon began introducing problems looking for a fraction of a group I taught that the word "of" meant that you had to multiply the fraction (or decimal) by the whole number.

1/2 of 30 = 1/2 * 30/1 = 30/2 = 15
or
50% of 30 = .5 * 30 = 15.0

Start with the simple fractions 1/2, 1/4, 1/3, then work up to the others.

Memorizing that
1/2 =.5 = 50% or
1/4 = .25 = 25% or
1/3 =.333 = 33.3%

We are now approaching doing percents in our head such as
1/8 which is half of 1/4 so
1/8 =12.5% (1/2 of 25%) and that
2/5 = 40% because 1/5=20% so 2 of the 1/5's would be twice as much, so
2*20%=40%

I am hoping this familiarity with decimals, fractions and percents, combined with memorization and mental math skills (which Saxon introduced in the HS version of 5/4 and higher), will help my son to solve more advanced problems such as the ones you are now presenting to Chris.

[from Catherine: we are doing LOTS of these worksheets - and we need to do mental math, but C. isn't quite "up to that" yet...]
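The fraction → decimal → percent drill Joanne describes can be sketched in a few lines of Python (the function name is mine, for illustration only):

```python
# A fraction is just a division problem: do the division, then shift
# two decimal places for percent. (Function name is mine.)
def to_percent(numerator, denominator):
    decimal = numerator / denominator
    return decimal * 100

print(to_percent(1, 2))  # 50.0
print(to_percent(1, 4))  # 25.0
print(to_percent(1, 8))  # 12.5 -- half of 25%, as in the mental-math drill
print(to_percent(2, 5))  # 40.0 -- twice 1/5's 20%

# "of" means multiply: 50% of 30
print(0.5 * 30)          # 15.0
```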


more from Steve H:

I see a clear difference in my son between before mastery and after mastery. Before mastery, he may be able to explain and do a problem eventually, but he doesn't fully grasp the subtleties and variations of what he is doing. After mastery, the process and understanding is automatic.

We've been working on combining plus and minus signs when you add, subtract, multiply and divide. Unfortunately, he is always trying to find a simple pattern that solves the problem. The fault with patterns is that they are based on nothing. There are lots of patterns that can be found and many of them are not helpful at all. I always try to explain things using the basic identities.

One thing we ran into the other day was where does the minus sign belong in a fraction. I told him that you can put the negative sign anywhere you want.

I told him to identify terms and always think of a term or number with a sign in front of it. If you don't see a sign, it's a '+'. I also told him that a minus sign is really a factor of -1.

if you have

3 - 1/2

Then the second term is

- 1/2

or it could be

(-1)(1/2)

or

(-1)/2

or

1/(-2)

You can put the minus sign in front of the number, like

-.5 or -(1/2)

or you can put it in the numerator or denominator. Since the fraction is just a number, you can think of the minus sign in front of everything, but you can also put it into the numerator or the denominator if you want.

He didn't like that idea.

I gave him this fraction.

(-2)/3

I then asked him what

(-1)/(-1)

equals. He hesitated and then asked, "One"?

I said OK, now multiply

(-2)/3 by (-1)/(-1)

to see what you get.

He knows how to multiply numbers with different signs, but he had to think about this. You could see the wheels turning.

I told him that whenever I look at a minus sign, I can see all of the different places I can put it or all of the different ways I can use it.

These things can't sink in without a lot of practice. Mastery provides understanding. It can't be rote. Understanding is not possible without mastery. Finally, mastery and understanding have little to do with pattern recognition.
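Steve H's point about the wandering minus sign can be verified with Python's fractions module, which treats all three placements as the same number (a sketch, not anything from the comment itself):

```python
from fractions import Fraction

# The minus sign can sit in front, in the numerator, or in the denominator:
a = -Fraction(1, 2)   # -(1/2)
b = Fraction(-1, 2)   # (-1)/2
c = Fraction(1, -2)   # 1/(-2)
assert a == b == c    # all three name the same number

# Multiplying by (-1)/(-1) changes the form but not the value:
x = Fraction(-2, 3)
assert Fraction(-1, -1) == 1
assert x * Fraction(-1, -1) == x
assert x == Fraction(2, -3)   # the sign moved into the denominator
```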


from instructivist:

[We are now approaching doing percents in our head such as
1/8 which is half of 1/4 so
1/8 =12.5% (1/2 of 25%) and that
2/5 = 40% because 1/5=20% so 2 of the 1/5's would be twice as much, so
2*20%=40%]

This is a great way to learn mental math. I have been doing this instinctively.

Calculating tips of 15% or 20% (service must really be good) mentally should also be child's play. Ten percent of anything is easy. Add half of that and you get 15%. It's baffling that some kids struggle with this.

[1/3 =.333 = 33.3%]

There is a fancy, six-figure word that goes with repeating decimals (the bar over the repeating number or numbers): vinculum. Converting these repeating decimals to fractions is a nice algebra exercise. The number of digits covered by the vinculum tells you whether you need to multiply by 10, 100, or a higher power of ten.
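The vinculum trick can be worked for 0.714285714285...: six digits repeat, so multiply by 10^6, subtract the original, and solve. A quick Python check of the algebra (a sketch, not from the comment):

```python
from fractions import Fraction

# x = 0.714285714285...  Six digits repeat, so:
#   (10**6)x - x = 714285.714285... - 0.714285... = 714285
#   x = 714285 / (10**6 - 1) = 714285/999999
x = Fraction(714285, 10**6 - 1)
print(x)  # 5/7
```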

AND:

"Understanding is not possible without mastery."

That's a powerful statement. It should blow out of the water the constructivists who purport to seek "understanding" but disparage mastery with obnoxious phrases like "drill and kill."

AND:

It occurred to me that a calculator is of limited use when trying to figure out if certain fractions are repeating decimals when converted. The calculators I am familiar with do automatic rounding.

I tried 5/7 on my TI-30X IIS and got 0.71. No indication that a repeating decimal is involved. My TI-83 Plus gives me more digits but also rounds without showing the group of repeating numbers.

I see this as another reason why long division is important. How would calculator-dependent students see that the sequence 714285 repeats, I ask NCTM?

[Catherine again: I've informed C. that he will be doing long division worksheets shortly, and he will be doing them to fluency]
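The repetition instructivist describes drops right out of long division: once a remainder repeats, the digits must cycle. A rough sketch in Python (function name and digit cap are mine):

```python
def long_division_digits(numerator, denominator, max_digits=20):
    """Decimal digits of numerator/denominator, stopping when a remainder
    repeats -- the sure sign that the digits are about to cycle."""
    digits, seen = [], set()
    remainder = numerator % denominator
    while remainder and remainder not in seen and len(digits) < max_digits:
        seen.add(remainder)
        remainder *= 10
        digits.append(remainder // denominator)  # next decimal digit
        remainder %= denominator                 # carry the remainder down
    return digits

print(long_division_digits(5, 7))  # [7, 1, 4, 2, 8, 5] -- then it repeats
print(long_division_digits(1, 4))  # [2, 5] -- terminates, no repetition
```

A student doing this by hand sees exactly the same thing: the remainder 5 comes back, so the sequence 714285 must repeat.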


from le radical galoisien:

There is a rough method of deriving a fraction from any arbitrary decimal. For example, 2/7 is 0.285714286 (etc.). Dividing 1 by that decimal gives roughly 3.5, which is 7/2, or the inverted form of 2/7 ...
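The inversion trick is a special case of finding the simplest fraction near a truncated decimal, which Python's fractions module can do directly (a sketch for illustration):

```python
from fractions import Fraction

# A calculator's truncated 2/7:
approx = 0.285714286

# The inversion trick: 1 divided by the decimal is roughly 3.5, i.e. 7/2
print(1 / approx)   # approximately 3.5

# The systematic version: find the simplest nearby fraction
print(Fraction(approx).limit_denominator(100))   # 2/7
```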

hyperspecificity in autism
hyperspecificity in autism and animals
hyperspecificty in the rest of my life
hyperspecificity redux: Robert Slavin on transfer of knowledge

Inflexible Knowledge: The First Step to Expertise
Devlin on Lave
rightwingprof on what college students don't know
percent troubles
what is 10 percent?
birthday and a vacation

Saturday, July 28, 2007

how to talk about spirals

I am just now getting to the Singapore/Saxon thread (and I STILL have not read the 888 lesson - which is why I haven't weighed in - SORRY! I'm frantic about my book, stuck on a question about stereotypies & environment & the brain, etc.)

We're going away for the night, so I may not get to it now, either, but I've begun reading the Comments thread.

Like instructivist, I appreciate this parsing of the difference between the Singapore spiral and the US spiral:

Singapore's Framework is an Additive Spiral that Builds Topic Content Grade-by-Grade; NCTM's Framework Is a Repetitive Spiral Approach that Covers Similar Topics Across Grade Bands

Singapore
Spiral approach by content strand (additive)
Specific for each grade (K-6)
Clear, specific topics
Mathematically logical sequence

NCTM
Spiral approach by content strand (repetitive)
Within broad grade bands (K-2, 3-5, 6-8 )
Imprecise topics
Absence of boundaries and inclusive

The fluency idea has raised a question in my mind.

What's good about spiraling-with-mastery is the fact that after studying the same material 3 years in a row people remember it for the rest of their lives:

Studies show that if material is studied for one semester or one year, it will be retained adequately for perhaps a year after the last practice (Semb, Ellis, & Araujo, 1993), but most of it will be forgotten by the end of three or four years in the absence of further practice. If material is studied for three or four years, however, the learning may be retained for as long as 50 years after the last practice (Bahrick, 1984; Bahrick & Hall, 1991). There is some forgetting over the first five years, but after that, forgetting stops and the remainder will not be forgotten even if it is not practiced again. Researchers have examined a large number of variables that potentially could account for why research subjects forgot or failed to forget material, and they concluded that the key variable in very long-term memory was practice.*(see below *) Exactly what knowledge will be retained over the long-term has not been examined in detail, but it is reasonable to suppose that it is the material that overlaps multiple courses of study: Students who study American history for four years will retain the facts and themes that came up again and again in their history courses.

Practice Makes Perfect -- But Only If You Practice Beyond the Point of Perfection

by Daniel Willingham
American Educator


(When I mentioned this finding to my sister-in-law, who is a federal prosecutor, she instantly said, "That must be why law school is 3 years.")

I'm wondering whether learning to the point of fluency, as opposed to mastery-defined-as-percent-correct, would alter this finding.

The article is about overlearning, which the fluency people suspect is the same thing as fluency. However, I don't know whether the subjects in the "50-year retention" studies had overlearned the same material in each of the three years they studied it.

Thursday, July 26, 2007

Tom Loveless on automaticity

Question from Alison Corner, Principal, Pawtucketville Memorial, Lowell MA:

There is a lot of emphasis in K-4 on teaching math as a problem-solving exercise. How would you recommend incorporating teaching computation skills for automaticity?

Tom Loveless:

Some lessons have to be devoted to computation alone, including why procedures work. Automaticity involves both accuracy and speed so I would stress both in memorizing basic facts in addition, subtraction, multiplication, and division and in the use of algorithms.

The State of Math Standards - transcript


I don't know how I managed to remember automaticity (which I have in theory been trying to achieve with C.), but forget speed.

Actually, I do know how: my years of ABA, when Jimmy and Andrew worked to a 90% criterion for mastery, dominated my memory of what mastery meant.

One year in KUMON ("speed and accuracy") wasn't enough to overwrite 4 years of ABA, or however many it was....

how to determine fluency for an individual child

An adult-to-child proportional formula can also be helpful in setting performance aims. Measures are first taken of the student’s tool skills rate and the rates at which a competent adult performs both the tool skill and the target math skill.* The tool skill for answering math facts is writing random numbers without solving any problems. This fast-as-you-can number writing rate provides a ceiling for the fastest rate at which answers to math facts can be produced. The proportion obtained by dividing the adult’s performance rate on the target math skill by his or her tool skill rate is then multiplied by the student’s tool skill rate. The resulting figure is the fluency aim for the student. For example, if the adult solves math facts at a rate of 60 correct answers per minute and can write 120 random numbers per minute, his or her target skill to tool skill proportion is .5 (60 divided by 120). Based on the adult-child proportional formula, the fluency aim for a student whose tool skill rate is 80 would be set at 40 correct answers per minute (80 times 0.5). Providing direct and repeated practice on the relevant tool skills may be an effective way of improving the overall fluency of some children. Alternative modes of response, such as answering orally, should also be considered for children who exhibit very slow writing or poor fine motor control.

source:
Do Your Students Really Know Their Math Facts: Using Daily Time Trials to Build Fluency (pdf file)
by April D. Miller and William L. Heward
p 134
Intervention in School and Clinic, Volume 28, Number 2, November 1992
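The adult-to-child proportional formula quoted above is a one-liner. Here it is as a Python sketch using the article's own worked numbers (the function name is mine):

```python
def fluency_aim(adult_target_rate, adult_tool_rate, student_tool_rate):
    """Adult-to-child proportional formula: scale the student's tool-skill
    rate by the adult's target-skill-to-tool-skill proportion."""
    proportion = adult_target_rate / adult_tool_rate
    return student_tool_rate * proportion

# The article's worked example: the adult solves 60 facts/minute and writes
# 120 random digits/minute; the student writes 80 digits/minute.
print(fluency_aim(60, 120, 80))  # 40.0 correct answers per minute
```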


Effects of sequential 1-minute trials with and without inter-trial feedback and self-correction on general and special education students' fluency with math facts
(abstract)



* tool skills; component skills; composite skills - see speed test

Saturday, July 21, 2007

speed test

The precision teaching folks distinguish amongst tool skills, component skills, and composite skills.


After discovering that C. can solve only 50 simple addition problems in 60 seconds when he should be able to do between 70 and 90, I decided to find out how fast he can write digits.

I had him write the digits 0 - 9 over and over again as fast as he could for 60 seconds.

He wrote 110, 10 more than the performance standard the PT folks seem to use.

So we're going to be doing Saxon Fast Facts sheets, Books 7/6 and 8/7 (the Tests and Worksheets Booklets), until he's up to speed.

Literally.

I'm also starting both of us on Cursive Writing Skills from Megawords. His handwriting is horrific, as is his printing, and it's not getting better over time.


back to basics
speed test

Tuesday, July 17, 2007

back to basics

hmmm...

After reading this passage in Carl Binder's article Doesn't Everybody Need Fluency? (pdf file) I decided to give C. a speed test today.

[I]n regular classrooms we learned that students need to be able to write answers to between 70 and 90 simple addition problems per minute in order to be able to successfully and smoothly master arithmetic story problems. However, some students seemed to level off at around 20 or 30 problems per minute, and no amount of reward or encouragement seemed to help. Some of our colleagues (Starlin, 1971; Haughton, 1972) decided to check how many digits those students could read and write per minute—critical components of writing answers to problems. As you might guess, they were very slow, which held down their composite performance. With practice of the components on their own to the point of rapid accurate performance (for example, reading and writing digits at 100 per minute or more), students were able to progress smoothly toward competence on solving the written math problems.

70 problems per minute = 70 problems per 60 seconds

C. did two timed tests. The first was 50 problems; the second was 25.

70 problems/60 seconds ≈ 50 problems/43 seconds

To meet this performance standard, he needed to write answers to 50 simple addition problems in 43 seconds. Right?
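Converting a problems-per-minute standard into a time limit for a fixed-length worksheet is simple proportion; a quick Python sketch (the function name is mine):

```python
def seconds_allowed(problems_on_sheet, rate_per_minute):
    """Time limit in seconds for a fixed-length worksheet, given a
    problems-per-minute performance standard."""
    return problems_on_sheet * 60 / rate_per_minute

print(round(seconds_allowed(50, 70)))  # 43 -- the 50-problem test
print(round(seconds_allowed(25, 70)))  # 21 -- the 25-problem test
```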

First test: 50 problems in 190 seconds.

yikes

After he was done, I took the same test & clocked in at 50 problems in 35 seconds without breaking a sweat. (Try it yourself. You'll see.)

I had C. try the test again. He cut his time in half, but he was still at 50 problems/66 seconds. He says he's tired & has a headache...but then again I'm tired, too (albeit sans headache), not to mention old.

I'll have him take a third speed test tomorrow. If he's not down to 50 problems/43 seconds, I'll test his speed writing digits. Then, when I find out there's no way in h - e - double hockey sticks he can write 100 digits in 60 seconds, we will commence printing practice.

So.... we're back to handwriting. Yet another inconsequential non-21st century skill never, ever taught to mastery in our public schools!

bonus narrative: C. used cursive writing to label his Saxon percent ovals, leading me to the discovery that he's writing his f's wrong.

"I think you're writing your f's wrong."

"No I'm not! That's how you're supposed to write 'f'!"

I decided not to argue about it.

Let off the hook, C. stared at my cursive version of the letter "b" for a couple of seconds, then said, "I forgot how to write b's."

I decided not to argue about that, either.



simple addition worksheets online:

create & print whole number addition worksheets from aplusmath (best source - you can specify 50 problems)
free whole number addition worksheets from S&S Software
create & print addition worksheets
free math worksheets from tlsbooks
free math worksheets from math-drills.com

back to basics
speed test

Monday, July 16, 2007

first crack at editing exercise

re: eureka

wow

C. didn't have a clue how to proceed.

He only managed to cut one word:

Stunt people were around long before films. Even Shakespeare probably used them in fight scenes.

I thought "even" needed to stay put, but I see now that the transition is implied.

This is going to be hard for me, too.

Of course I'm wondering whether I'm starting at the top. Is there a simpler way into this kind of exercise?

I will mull.

Meanwhile, on the math front, C. did the Saxon Fast Facts Multiplication sheet - 64 1-digit multiplication problems - in 2 minutes 37 seconds, getting 100% correct. That seemed good (Saxon says the sheet should be done in less than 5 minutes) until I re-read this post:

[I]n regular classrooms we learned that students need to be able to write answers to between 70 and 90 simple addition problems per minute in order to be able to successfully and smoothly master arithmetic story problems.

More confusion. Each "simple" multiplication problem has a 2-digit answer, whereas most simple addition problems have 1-digit answers. I think.
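One way to cut through the confusion is to convert the timed result back into a per-minute rate and compare it with the 70-90 standard; a quick Python sketch (function name mine):

```python
def per_minute_rate(problems, minutes, seconds):
    """Convert a timed worksheet result into problems per minute."""
    total_seconds = minutes * 60 + seconds
    return problems * 60 / total_seconds

# 64 multiplication problems in 2 minutes 37 seconds:
print(round(per_minute_rate(64, 2, 37), 1))  # 24.5 problems per minute
```

Roughly 24.5 problems per minute is well below the 70-90 standard quoted above, though, as noted, multiplication answers take longer to write than addition answers.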

Tomorrow I'm going to have him do 100 simple addition problems and see what's what.

If we have to practice writing digits, we will practice writing digits.

His handwriting is terrible. I came across some of his work from a couple of years ago and discovered there's been no improvement at all.

My efforts to afterschool handwriting, which I abandoned lo these many years ago, were an abject failure.


expert advice on teaching writing from Joanne Jacobs
eureka
more from Joanne Jacobs
doctor pion on writing a precis and critical reading
home writing program in place, for now

math facts at The Key School

Grade Four Timed Tests
Fluency with Basic Math Facts
September 11, 2006

The goal of timed tests is computational fluency: by this we mean quick and accurate knowledge of math facts. As has been said before to parents in earlier grade levels (and is worthy of repeating), automatic recall of basic math facts is desired because it frees up students’ minds for complex problem solving. To this end, fourth grade students will prepare for weekly timed tests. Below are the details.

What: Each timed test consists of 50 problems to be completed in four minutes or less, although many students set personal goals of two minutes. Timed tests begin with subtraction facts (0-20) and, in the winter, move to multiplication facts (0-9).

When: Timed tests are given every Wednesday for the entire year. Students keep corrected timed tests in their math binders along with a chart of their progress.

How: We review study tips with students and provide work sheets for practice. Enclosed are strategies for subtraction and multiplication to help your child polish his or her math facts at home. Students can make flash cards of difficult math facts. Additionally, the Lower School’s math library has a variety of math aids available for home practice; materials may be borrowed for two weeks at a time.


The Key School is supposed to be one of the best private schools in the country. Based on what my friend who has two kids there tells me, it is.

need for speed - Vlorbik, Steve, Doug Sundseth

I'm assuming this is from Vlorbik:

please don't tell me that
you're taking "celeration" seriously.

to my eyes, it makes the already
formidable disincentives to teach
look almost pleasant.

if these guys actually
get any influence, i'll be
back on the loading dock ...


All I can say to that is: don't listen to me.

Having got my "start" as a parent in ABA,* naturally I think it would be an excellent idea to introduce celeration charting into K-12.

But I have no idea - none - what the fall-out of such a scheme might be.

Here at ktm we tend to assume parents can't get too far lobbying for change in their districts. But there's always the other possibility: you spend 10 years of your life pushing, bugging, noodging, and otherwise squeaky-wheeling your way to actual change, and it turns out to be the wrong change.

Although I stumbled across the phrase "unintended consequences" relatively late in life, the instant I read it I thought: Bingo. Probably anyone who's managed to have not just one but two autistic kids has assimilated the saying about the best laid plans of mice and men often going awry.



here's Steve H:

"Since 1990 the Standard Celeration Society has comprised a collegial organization for all persons who use Standard Celeration Charts to monitor and change human behavior frequencies."

I'm all in favor of mastery and speed in the basics of math, but I'm not too keen on schools trying to monitor units of behavior frequency change. I would rather have them monitor units of correctness (grading) on weekly quizzes and tests.


These two comments led me to post my question to our "math brains":

How fast are you at the fundamentals?

I'm not quite sure why I ask, apart from the obvious, which is that I need some kind of rough performance standard for C & for me.

The field of reading appears to have well-developed fluency standards.

Math seems to be different. I've come across performance standards for the basic math facts, but I'm thinking about fluency on some of the "composite skills," too. I imagine the world of precision teaching has not yet produced research-based standards for, say, how quickly an expert can factor a trinomial.

Or: should there even be such a thing?

I don't know which higher-level component skills math experts typically learn to automaticity.



Here's Doug's answer to "how fast are you?":

Quite fast; I normally see the answer to simple questions without consideration.

Possibly related: In the summer between 4th and 5th grade, the summer school I attended did mad-minute multiplication. We were doing 100 single-digit problems in one minute with 95%+ accuracy.

More generally, I've learned other things by extended rote practice and I still know most of those things with no real thought. (German irregular verbs, typing, judo throws.)

Yes, the practice is tedious for both student and teacher. But the student only has to do it once and the teacher is being paid very well. (Note: the work of a teacher is easier and the pay is better than the work and pay on a loading dock -- I've worked on a loading dock.)

As to the loading dock issue, I do have a serious comment: I've led speed drills with kids, and they're fun. Or at least the kids and I found them fun in our afterschool setting. A speed drill takes 5 minutes max (in my own experience 1 minute would have been better, though Saxon's sheets are 5 minutes long); the kids get a charge out of them; and everyone can see himself beating his last time. I was amazed at how quickly the kids picked up speed from one drill to the next, including one high-end SPED kid. That boy zoomed.

A timed drill wakes kids up and, as John Saxon wrote, "sets an up-tempo atmosphere to start the lesson." Or at least it did for me.

I have no idea what would happen if you introduced celeration charts in all subjects, or with all fundamentals ... but I can predict with some confidence that you aren't going to lose future math teachers over a requirement that they do one-minute speed drills at the start of a class.

I think.
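For what it's worth, the mechanics of a drill are simple enough to sketch. Here's a minimal, hypothetical version (all names mine, not from any precision-teaching materials) that generates a sheet of single-digit facts and scores a run as corrects per minute, which is the count a celeration chart would track:

```python
import random

def make_drill(n_problems=100, op="*", max_operand=9, seed=None):
    """Generate a sheet of single-digit math-fact problems.
    Returns a list of (problem_string, correct_answer) pairs."""
    rng = random.Random(seed)
    problems = []
    for _ in range(n_problems):
        a = rng.randint(0, max_operand)
        b = rng.randint(0, max_operand)
        answer = a * b if op == "*" else a + b
        problems.append((f"{a} {op} {b}", answer))
    return problems

def score(problems, responses, seconds=60):
    """Return corrects per minute -- the frequency measure
    plotted on a standard celeration chart."""
    correct = sum(1 for (_, ans), r in zip(problems, responses) if r == ans)
    return correct * 60 / seconds

drill = make_drill(n_problems=5, op="+", seed=1)
perfect_run = [ans for _, ans in drill]        # a flawless one-minute run
print(score(drill, perfect_run, seconds=60))   # 5 correct in 60 s -> 5.0
```

The point of scoring per minute rather than per sheet is that speed and accuracy get measured together: a child who finishes half the sheet perfectly and a child who finishes the whole sheet at 50% accuracy land on the same count, and both have room to accelerate.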

update: Here's a Minnesota school that has posted fluency data for its students.



Do Your Students Really Know Their Math Facts? (pdf file)
performance standards (pdf file)

* I've linked to the new book by the Koegels who, while trained by Ivar Lovaas, are a different kettle of fish. The Koegels are our autism gurus.