College Exit Exams? No Way, Hombre!
David Brooks of The New York Times recently wrote an article titled "Testing the Teachers," in which he describes a learning assessment test he believes should be required of all college graduates, sort of like an exit exam. He compares this to No Child Left Behind, noting that colleges are mostly open to the idea whereas K-12 teachers were not. While I cannot speak for all colleges, I know that where I attended, No Child Left Behind was not as celebrated as Brooks' statement might lead one to believe.
First of all, do you even know what the final semester of college is like for a graduating senior? The end of any semester is nerve-wracking to begin with because of term papers and final exams. Add one more standardized test on top of that and students could have a[nother] nervous breakdown or go completely loco. It's even worse when roommates are involved, not to mention having to clean up a dorm room for move-out day even though you know they're still going to squeeze more damage-bill money out of you for no good reason. In short, the end of the semester is not a good time for this. That's why entrance exams take place well before the final exams at a student's current school. Also, the current system, at least where I went to school, does include a formal assessment of each class unless the professor has tenure. Informally, there are teacher-rating websites available for all levels of education. Check them out, I dare you.
Secondly, I call bull on the statistics Brooks quoted. According to Richard Arum and Josipa Roksa in their study "Academically Adrift," students' skills increase by an average of about seven percent over the first two years of college, and only marginally in the years after that. While I have not read the study for myself, I can offer several reasons why that could be. I'm not sure if they realized this, but the first year or two is when most students have to take required courses offered by the university. These are mainly uninspiring, tedious, and/or difficult depending on an individual student's strengths. Heck, half the people in my honors classes were taking remedial math! Not only that, but some students also spend the first two years figuring out which major to pursue; sometimes they change majors along the way or choose to pursue two at once. These people cannot afford to slack off. Speaking of majors, the last two years, where these marginal increases in skills take place, usually involve taking elective credits students may not want but need (on paper) to graduate, so they're even less motivated to begin with. Also, if they're finishing up classes in their major, they may have already forgotten material from those required courses that turned out to have little bearing on their chosen field at all.
This brings me to the most important point I have to offer. Brooks also mentions that the amount of study time per week today is about half of what it was fifty years ago. Firstly, what was he thinking, comparing one generation to another? Secondly, if people don't want to study and do poorly because of it, in some cases that's their own fault. While some things haven't changed much since the sixties (political and social unrest, leaps in technology, etc.), studying is not just about hitting the books anymore. Attention spans also aren't what they used to be, given that today's college students grew up around technology in their homes and communities. The way we learn is constantly evolving, and instead of measuring it with pathetic standardized tests mandated by the government, these so-called learning assessors should probably think about observing people and asking them questions directly. To conclude, I have this to say: the college I attended offered a top-notch education, but its administrative practices could drive a person to drink (yes, someone who works for the university actually said this to me).