
Opinions on college education


The extremes in this debate are absurd.  College is not useless; it has many benefits for students.  On the other hand, it would be surprising if the most efficient way to provide those benefits were exactly the social institution of college in America as it exists today.

The signaling function of college works pretty damn well.  Yale/Princeton/MIT students are, on average, brighter and more conscientious than students at less elite schools, who are, in turn, brighter and more conscientious than people who never went to college.  (It's not just signaling, I think; it's also secondary socialization and developing an identity from your peers.  Princetonians learn to behave like Princetonians.)  You don't want to casually destroy a functioning social institution without thinking about the consequences.  

Any scientist will tell you that one of the best ways to improve the quality of your work is to surround yourself with smart people.  Any employer will tell you that finding smart, conscientious employees is a leading challenge of running a business.  If you want to claim that you have a filter for smart people that works better than college, my question is, "Why ain't you rich?"  If such a thing existed, it would singlehandedly replace the recruiting industry.  I'm interested in experiments in new ways to evaluate people's competence, but so far I think that field is in its infancy.  Employers won't really read your GitHub any more than they'll read all the publications on your CV.

Online education can democratize only a small part of what people get out of college.  I can see Stanford-style lectures being very useful for informal self-teaching, potentially a major boon to homeschooled children or people in developing countries without access to world-class universities.  But how much more valuable is a video lecture course than a textbook at the public library?  Both are merely formats for displaying information.  (I'd learn more from a textbook than a video lecture, myself.)  There have always been autodidacts whose "college" was a public library, but most people have never taken full advantage of libraries to replace education, and I don't think most people would take full advantage of online lectures.  One of the problems with tech people is the assumption that "innovation" means "consumer internet" -- it's a temptation for all of us to think that way, but it's inaccurate.

The real problem with college education, of course, is that it's very bad at educating.  The teachers are professors and graduate students -- for them, teaching is a speed bump, a hassle they're required to get through on the way to doing research.  (Those who actually like teaching get a status hit.) So teaching is done in the lowest-bandwidth form possible.  Lectures, a midterm, and a final.  Two data points of feedback. People don't learn that way, unless they're independent enough learners that they didn't need the teacher in the first place.  

One-on-one tutoring is vastly more effective.  So some people imagine apprenticeship replacing college.  Want to be a biologist?  Be Craig Venter's apprentice.  But this is problematic in itself...you'd need many years of courses to be knowledgeable enough to be any use to Craig Venter as an apprentice.  

If you separated signaling, consumption, and learning, and optimized a "school" around each of them, I could imagine three separate stages of young life.

Learning: something like a "cram school."  Intensive tutoring, extremely frequent tests.  Sophisticated data analysis provides a very fine-grained profile of the student's strengths and weaknesses.  This would be a focused, boot-camp-like time of life; all studying, all the time.  The only thing you do is acquire competence in a subject and come out demonstrating exactly how good you are at it.  I think it would naturally be a single subject only, and would work best for quantitative subjects: two or three years spent getting up to B.A.-level competence in math or chemical engineering or whatever.  It's meant to get you up to speed.

Consumption: young people, if they have the chance, do need time to relax, to learn who they are, to explore identities, to enjoy intellectual stimulation and the proverbial late-night dorm-room discussion.  It's part of growing up.  But because this is an experience valued for what it means to an individual personally, there's no reason it has to be the same for everyone.  I actually think the Grand Tour or the Great American Road Trip serves this function.  A stint at a low-effort job, with lots of time to shoot the breeze and try to write your novel, also works.  Some people really want the dreaming spires of academia, and classes you take just for the fun of it.  If our society bit the bullet and admitted that periods of idleness, if you can afford them, are good for self-discovery and lifelong happiness, then we wouldn't have to shoehorn everybody into the same model.

Signaling: this is, more or less, an apprenticeship.  Or graduate school, or an internship. Find your role model, and work under him or her.  If this requires specialized knowledge, you've already gone to a cram school by now.  If you wanted to take some time off and find yourself, you've done that.  You might be 30 when you're ready for an apprenticeship; you might be 16.  All you have to do is demonstrate to your mentor that you're worth his or her time, and you can do useful work while you learn the business.

I think the modern American college education has elements of all three types of "schools," but shortchanges the learning part. We'd probably learn more if we had a separate institution that was just for learning, and just for learning the specific things you need to know.


Errors vs. Bugs and the End of Stupidity


"A pianist has to believe in telekinesis.  You have to believe you have the power to move your fingers with your mind."

I learned that from Phil Cohn, my piano teacher's piano teacher.  Once in a while, when I was in high school, my teacher would arrange for me to have a master class with him.  He was a diminutive man who looked exactly like Dr. Strangelove, and had a gentle way of guiding your hands and body while you played.  He was very interested in the physicality of piano; he liked to tell stories about students of his who could play any piece upside down, or cross-handed, or one-fingered like Chico Marx.  He had a lot of theories about the process by which we can learn to control our muscle movements.

I wasn't an exceptional pianist, and when I'd play my nocturne for him, there would be a few clinkers.  I apologized -- I was embarrassed to be wasting his time.  But he never seemed to judge me for my mistakes.  Instead, he'd try to fix them with me: repeating a three-note phrase, differently each time, trying to get me to unlearn a hand position or habitual movement pattern that was systematically sending my fingers to wrong notes.

I had never thought about wrong notes that way.  I had thought that wrong notes came from being "bad at piano" or "not practicing hard enough," and if you practiced harder the clinkers would go away.  But that's a myth.

In fact, wrong notes always have a cause.  An immediate physical cause.  Just before you play a wrong note, your fingers are in a position that makes that wrong note inevitable.  Fixing wrong notes isn't about "practicing harder" but about trying to unkink those systematically error-causing fingerings and hand motions.  That's where the "telekinesis" comes in: pretending you can move your fingers with your mind is a kind of mindfulness meditation that can make it easier to unlearn the calcified patterns of movement that cause mistakes.

Remembering that experience, I realized that we tend to think about mistakes the wrong way, not just in the context of music performance but also in the context of academic performance.

A common mental model for performance is what I'll call the "error model."  In the error model, a person's performance of a musical piece (or performance on a test) is a perfect performance plus some random error.  You can literally think of each note, or each answer, as x + c*epsilon_i, where x is the correct note/answer, and epsilon_i is a random variable, iid Gaussian or something.  Better performers have a lower error rate c.  Improvement is a matter of lowering your error rate.  This, or something like it, is the model that underlies school grades and test scores. Your grade is based on the percent you get correct.  Your performance is defined by a single continuous parameter, your accuracy.
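To make the error model concrete, here's a minimal sketch in code (the function name and the numbers are mine, purely for illustration): each response is the correct answer plus scaled Gaussian noise, and a performer is described entirely by that scale.

```python
import random

def error_model_response(correct_value, c):
    """One answer under the error model: x + c * epsilon_i.

    x is the correct answer, epsilon_i is i.i.d. Gaussian noise, and c is
    the performer's error rate; smaller c means a better performer.
    (Illustrative sketch only, not anything from a real grading system.)
    """
    return correct_value + c * random.gauss(0.0, 1.0)

# Under this model, "improvement" just means shrinking c.  Performance is a
# single continuous accuracy parameter, like a percent score on a test.
novice = [error_model_response(10.0, c=2.0) for _ in range(5)]
expert = [error_model_response(10.0, c=0.1) for _ in range(5)]
```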

But we could also consider the "bug model" of errors.  A person taking a test or playing a piece of music is executing a program, a deterministic procedure.  If your program has a bug, then you'll get a whole class of problems wrong, consistently.  Bugs, unlike error rates, can't be quantified along a single axis as less or more severe.  A bug gets everything that it affects wrong.  And fixing bugs doesn't improve your performance in a continuous fashion; you can fix a "little" bug and immediately go from getting everything wrong to everything right.  You can't really describe the accuracy of a buggy program by the percent of questions it gets right; if you ask it to do something different, it could suddenly go from 99% right to 0% right.  You can only define its behavior by isolating what the bug does.
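For contrast, here's an equally rough sketch of the bug model (the example is mine, not taken from any real classroom): a student's subtraction "program" that never borrows.  It gets every problem that requires borrowing wrong, deterministically, and every other problem right, so its percent correct depends entirely on which problems you happen to ask.

```python
def buggy_subtract(a, b):
    """A student's subtraction 'program' with one bug: it never borrows.

    Each column is computed as |digit_a - digit_b|, so any problem that
    requires borrowing comes out wrong, every single time.  (Hypothetical
    illustration of the bug model.)
    """
    result, place = 0, 1
    while a > 0 or b > 0:
        digit = abs(a % 10 - b % 10)   # bug: should borrow when a % 10 < b % 10
        result += digit * place
        a, b, place = a // 10, b // 10, place * 10
    return result

for a, b in [(58, 23), (47, 15), (52, 38), (41, 19)]:
    print(f"{a} - {b} = {buggy_subtract(a, b)} (correct: {a - b})")
# The first two come out right and the last two come out wrong: the mistakes
# cluster exactly where the bug applies, not at random.  Fix the one bug and
# the whole class of errors disappears at once.
```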

Often, I think mistakes are more like bugs than errors.  My clinkers weren't random; they were in specific places, because I had sub-optimal fingerings in those places.  A kid who gets arithmetic questions wrong usually isn't getting them wrong at random; there's something missing in their understanding, like not getting the difference between multiplication and addition.  Working generically "harder" doesn't fix bugs (though fixing bugs does require work). 

Once you start to think of mistakes as deterministic rather than random, as caused by "bugs" (incorrect understanding or incorrect procedures) rather than random inaccuracy, a curious thing happens.

You stop thinking of people as "stupid."

Tags like "stupid," "bad at ____", "sloppy," and so on, are ways of saying "You're performing badly and I don't know why."  Once you move it to "you're performing badly because you have the wrong fingerings," or "you're performing badly because you don't understand what a limit is," it's no longer a vague personal failing but a causal necessity.  Anyone who never understood limits will flunk calculus.  It's not you, it's the bug.

This also applies to "lazy."  Lazy just means "you're not meeting your obligations and I don't know why."  If it turns out that you've been missing appointments because you don't keep a calendar, then you're not intrinsically "lazy"; you were just executing the wrong procedure.  And suddenly you stop wanting to call the person "lazy" when it makes more sense to say they need organizational tools.

"Lazy" and "stupid" and "bad at ____" are terms about the map, not the territory.  Once you understand what causes mistakes, those terms are far less informative than actually describing what's happening. 

These days, learning disabilities are diagnosed far more often than they used to be.  And sometimes I hear the complaint about rich parents: "Suddenly if your kid's getting B's, you have to believe it's a learning disability.  Nobody can accept that their kid is just plain mediocre.  Are there no stupid people left?"  And maybe there's something to the notion that the kid who used to be just "stupid" or "not a great student" is now often labeled "learning disabled."  But I want to complicate that a little bit.

Thing is, I've worked with learning disabled kids.  There were kids who had trouble reading, kids who had trouble with math, kids with poor fine motor skills, ADD and autistic kids, you name it.  And these were mostly pretty mild disabilities.  These were the kids who, in decades past, might just have been C students, but whose anxious modern-day parents were sending them to special programs for the learning disabled. 

But what we did with them was nothing especially mysterious or medical.  We just focused, carefully and non-judgmentally, on improving their areas of weakness.  The dyslexics got reading practice.  The math-disabled got worksheets and blocks to count.  Hyperactive kids were taught to ask themselves "How's my motor running today?" and be mindful of their own energy levels and behavior.  The only difference between us and a "regular" school was that when someone was struggling, we tried to figure out why she was struggling and fix the underlying problem, instead of slapping her with a bad report card and leaving it at that.

And I have to wonder: is that "special education" or is it just education?

Maybe nobody's actually stupid.  Maybe the distinction between "He's got a learning disability" and "He's just lousy at math" is a false one.  Maybe everybody should think of themselves as having learning disabilities, in the sense that our areas of weakness need to be acknowledged, investigated, paid special attention, and debugged.

This is part of why I think tools like Knewton, while they can be more effective than typical classroom instruction, aren't the whole story.  The data they gather (at least so far) is statistical: how many questions did you get right, in which subjects, with what learning curve over time?  That's important.  It allows them to do things that classroom teachers can't always do, like estimate when it's optimal to review old material to minimize forgetting.  But it's still designed on the error model. It's not approaching the most important job of teachers, which is to figure out why you're getting things wrong -- what conceptual misunderstanding, or what bad study habit, is behind your problems.  (Sometimes that can be a very hard and interesting problem.  For example: one teacher over many years figured out that the grammar of Black English was causing her students to make conceptual errors in math.)
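For what it's worth, here's a minimal sketch of the kind of "when to review" estimate I mean, under the standard assumption of an exponential forgetting curve (this is a generic textbook model, not Knewton's actual algorithm, and all the names are mine):

```python
import math

def predicted_recall(days_since_review, stability):
    """Exponential forgetting curve: estimated recall probability after a gap.

    `stability` is roughly how many days it takes recall to decay to 1/e.
    Generic illustration; not any company's actual model.
    """
    return math.exp(-days_since_review / stability)

def days_until_review(stability, threshold=0.9):
    """Schedule the next review for when predicted recall hits `threshold`."""
    return -stability * math.log(threshold)

# Right/wrong counts over time are enough to fit `stability` per student and
# per topic, which is genuinely useful.  But notice that nothing in this model
# says *why* an answer was wrong; it's pure error model, no bug-hunting.
print(round(days_until_review(stability=10.0), 1), "days until review")
```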

As a matter of self-improvement, I think it can make sense not to think in terms of "getting better" ("better at piano", "better at math," "better at organizing my time").  How are you going to get better until you figure out what's wrong with what you're already doing?  It's really more an exploratory process -- where is the bug, and what can be done to dislodge it?  Dislodging bugs doesn't look like competition, and sometimes it doesn't even look like work.  Mr. Cohn was gentle and playful -- he wasn't trying to get me to "work harder," but to relax enough to change the mistaken patterns I'd drilled into myself. 



Why finance jobs are popular


Ezra Klein just made an argument I've been making since sophomore year of college.

Finance and consulting jobs are popular among elite college students because they give a predictable path to professional life for liberal arts students who have no idea what to do with their educations.  Top banks and consulting firms come to colleges and offer a structured, competitive application process, not too different from the process of applying to college.  They offer on-the-job training.  They present themselves as "a low-risk, high-return opportunity that they can try for a few years and, whether they like it or hate it, use to acquire real skills to build careers ... a practical graduate school that pays students handsomely to attend."  It's these reasons, more than just the high salaries, that attract college students to finance -- they also flock to Teach For America, which doesn't pay well but has the same kind of structured application process and promise of on-the-job training.

Ezra Klein sees this as a lesson for schools -- if you don't want all your students going into finance (and college administrations often don't) you need to do a better job of preparing students for employment.  Wall Street is taking advantage of a crop of bright, conscientious kids with absolutely no plans for the future.

I would draw a few other lessons, in other directions.

First, for students.  It's easy for young people, especially if we're bright and sheltered, to fall into the trap of only doing things that are pre-prepared for us. In high school, we took honors classes and did math team or debate, and we applied to the best schools -- but most of us didn't get a part-time job at a real newspaper or start a lawn-mowing business.  In other words, we did things designed for high-schoolers and didn't step out of that box.  In college, we did things designed for college students, and didn't think to socialize with "adults" or do "adult" things.  (The exceptions -- college kids who ran for town council, or got on the executive committee of a nonprofit -- are memorable).  So when it's time to get a job, it's easy to just unthinkingly apply for the job that seems "designed" for Princeton kids, the one whose recruiters show up to campus.  If you practice having more imagination and initiative early on, you can counteract that bias -- if finance is the right choice for you after all, that's great, but you shouldn't be living your life on default settings.

Second, for employers.  Take a leaf out of Teach For America's book -- you can attract young people not with higher salaries, but with highly visible on-campus recruitment, and the promise of prestige through a competitive application process.  If you can find a way to train liberal arts students with no work experience, and convince them that you're giving them a valuable resume item, you can get grueling labor out of some very smart, energetic people.

Third, for my personal obsession, job search.  Ivy League students don't find jobs on Monster.com because that's a leap into the unknown.  They often have no idea what they want to do for a living; they just want to be in the professional class, and the obvious way to do that is to apply to the handful of finance and consulting firms that recruit on campus.  But if there were a job search tool that had more of an exploratory function -- that gave you, not just a list of local jobs, but match results that took into account your interests, skills, and personality -- and (crucially) if there were adequate filtering for prestige to counteract the fear of taking "some random" job -- then my demographic might actually use it.  The concentration of college grads in finance seems like a symptom of the lack of exploratory tools.  It's choosing the familiar over the unknown.
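As a toy sketch of what I mean (every field name and weight here is made up, just to show the shape of the thing): a match score that weights interest and skill fit, with a prestige floor so that acting on the match doesn't feel like taking "some random" job.

```python
from dataclasses import dataclass

@dataclass
class Job:
    title: str
    interests_fit: float   # 0 to 1: overlap with the seeker's stated interests
    skills_fit: float      # 0 to 1: overlap with the seeker's skills
    prestige: float        # 0 to 1: some reputation signal for the employer

def match_score(job, min_prestige=0.5):
    """Toy exploratory match score: fit-weighted, with a prestige floor.

    Purely hypothetical; a sketch of the kind of tool described above,
    not a description of any real product.
    """
    if job.prestige < min_prestige:
        return 0.0   # the "filtering for prestige" that makes exploring feel safe
    return 0.6 * job.interests_fit + 0.4 * job.skills_fit

jobs = [Job("consulting analyst", 0.3, 0.8, 0.9), Job("science writer", 0.9, 0.6, 0.6)]
print(max(jobs, key=match_score).title)   # can surface the less obvious option
```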

And recommendations technology is largely about making the "unknown" accessible.  A restaurant you've never been to, or a book you've never read, is a risk -- but Yelp reviews or Amazon recommendations make it a lot less risky.  A stranger is a mystery -- unless he has a Facebook profile, a blog, and an OkCupid account, in which case he's a lot less mysterious.  When we're uninformed, we stick to the tried and true; given more information, we can afford to be more adventurous.  I don't think we've yet created the tools that will let job seekers be more adventurous.
