teaching

Ronald Bailey writes in the January Reason about his experiences with personal genomics ("I’ll Show You My Genome. Will You Show Me Yours?"). He's a booster, and much of the article is a review of basic objections (privacy concerns, weakness of gene-phenotype associations, imprecision) and some replies to them. He has several passages worth quoting, including this one:

Some time before the end of this decade, kids are going to be running gene scans and maybe even whole genome sequencing experiments in their ninth-grade biology classes, just the way some of us did blood typing experiments back in the mid-20th century. Then they are going to share that information with their friends on whatever social media follow Facebook and Twitter, and they’ll do it without parental consent. Nerdy high school sweethearts might swap DNA profiles and run them through computer programs designed to predict what their potential children might look like. In the process, of course, they will also be sharing information about their parents’ genes.

We're just starting a new decade, I had to remind myself. Gene chips probably will be cheap enough then to run in high school labs. Is there anyone who's thinking about the need to teach high school kids about factor analysis? Bayesian inference? Because I find a twisted appeal in the idea that postdocs now are doing what high school science projects will be about in ten years.

I've thought for a long time that most of the basic analysis of genomes is undergraduate-level work. Most of the effort is learning how to use software, which is not mathematically demanding but does take time.

Writing the software is a different issue. But as we apply the same techniques to more and more organisms, there will be no new software to write for most analyses. Plug in your data, assuming that you've been sensible enough to define an appropriate sampling strategy, and the software will give you an answer.

Consider a time when genotyping can be done for $2 a chip in bulk. Each year, a new chip design is distributed to high schools across a state. One year, it may be dandelions. The kids sample yards across the state, collect plant phenotype data, and submit data to a common pool. Dispersal patterns, flowering time, other phenotypes are all possible targets of study. A structured population enables them to stratify their sample, exploit linkage due to historical events, and study traits linked to biological invasion.
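To make the "plug in your data and get an answer" idea concrete, here is a minimal sketch in Python of the kind of per-marker association scan such a pooled dataset could feed. Everything in it is hypothetical -- the array shapes, the flowering-time phenotype, the region labels -- and a real analysis would use purpose-built software and a much more careful correction for population structure.

```python
# Minimal, hypothetical sketch of a "turn the crank" association scan.
# For each marker, regress a phenotype (say, days to flowering) on genotype
# dosage (0/1/2), after removing per-region means as a crude control for
# population structure. All names and shapes are made up for illustration.
import numpy as np
from scipy import stats

def association_scan(genotypes, phenotype, region):
    """genotypes: (n_plants, n_snps) array of 0/1/2 dosages
       phenotype: (n_plants,) array of trait values
       region:    (n_plants,) integer labels for sampling strata"""
    resid = phenotype.astype(float).copy()
    for r in np.unique(region):
        mask = region == r
        resid[mask] -= resid[mask].mean()   # remove the regional mean

    results = []
    for j in range(genotypes.shape[1]):
        slope, intercept, rval, pval, stderr = stats.linregress(genotypes[:, j], resid)
        results.append((j, slope, pval))
    return results

# Toy data: 500 plants, 200 markers, with marker 7 actually affecting the trait
rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(500, 200))
region = rng.integers(0, 10, size=500)
pheno = 30 + 2.0 * geno[:, 7] + 1.5 * region + rng.normal(0, 3, size=500)

top_hits = sorted(association_scan(geno, pheno, region), key=lambda t: t[2])[:5]
print(top_hits)   # marker 7 should appear near the top
```

The point of the sketch is that the per-marker statistics amount to a loop and a regression; the genuinely hard part, as noted above, is designing a sampling scheme that students can actually carry out.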

For the price of one R01 grant, kids across a whole state might develop a new model organism, learn the principles of genomics and produce the data equivalent of dozens of research papers.

(via Razib)

UPDATE (2010-12-15): A reader writes quizzically:

I can't figure out what you are saying here. That it's all so simple that high school kids will understand it without any training in statistics? That all possible analyses of genomic data have already been devised, and all that's left is to turn the crank? Maybe I'm just dense, but I think you need to describe the twisted appeal you're experiencing, not just report it.

What good will the data be from gillions of dandelion gene chips, if the kids don't have the time to measure umpteen different dandelion phenotypes to correlate with the gene data? Whose judgment will decide which traits to consider, and will high school teachers have that judgment? Etc.

Are you saying the software already exists to correlate (or fail to do so) the mountains of new human gene chip data with all of the subjects' medical and life history data? Or are you saying that this is exactly the problem?

I'm honestly not sure if you are in frank trans-humanist pro-technocracy mode, or if you are ironically alluding to its liabilities.

Never assume a blog post has a well-formed point.

I think the potential study I describe is one with enormously more power than anything being done today on plant dispersal, and with power at least equal to the best work on gene-phenotype associations in model organisms (setting aside developmental biology).

Kids in school aren't statisticians, but thousands of them do have brute force on their side. I don't see any obvious reason why software can't be written to spit out these answers. Naturally that software will have to make lots of assumptions, which means that somebody is going to have to design a sensible sampling scheme that can be carried out by students, allowing for their lack of training. It's an educational challenge, but I'd say it's doable.

This means, of course, in 10 years the statistics that support this kind of study won't be interesting to the kinds of people who write such software. The science progresses. I hope that in 10 years the real scientists will be doing something a little better than what the software will be able to spit out.

At the same time, I think we have to acknowledge that most of what today's genomics postdocs are doing is exactly the kind of analysis that I'm describing for high school kids in 2020, except with much smaller, poorly-designed samples. What makes this Ph.D.-level work is that our current software is not very good at it -- in large part because the current software is mostly written by postdocs with little training in systems design.

Nothing makes quite as good a time-waster as the comment threads to posts where professors complain about students. Especially students using computers.

Janet Stemwedel: "Things observed while sitting in on colleagues’ classes."

My favorite comment, from Mike Hoye:

Here’s an exercise that may be revealing: Have somebody, maybe a student or an automated system, whatever, make a transcript of everything you said during a two or three hour lecture, verbatim. Then read it, front to back. It won’t take you three hours. It will take you fifteen or twenty minutes. That should tell you what the real information density of your lectures is like to people who used to have the option of reading other books, making doodles or just struggling to stay awake, but who now have the option of wifi.

Professor Bainbridge: "How to handle classroom chaos."

My favorite comment, from Fr. Philip:

One summer I noticed that a few students were facebooking during class. These students became my favorites for calling on to look up classical texts on-line and reading them aloud.

My attitude: I'm there for the student who wants to learn, who is the first in his family at university, and to whom the tuition money is a hardship. If any of the other students interfere with his learning, they'd better watch out for me.

An article about the bookless libraries of the future, which may already be springing up at a campus near you:

Last month, the University of Texas at San Antonio announced it had built the world’s first bookless library. Its Applied Engineering and Technology Library offers access to 425,000 e-books and 18,000 e-journal subscriptions, and librarians say they’ve yet to hear a complaint from the 350-plus students and faculty who pass through its doors daily. “We’ve gotten no negative feedback,” says Krisellen Maloney, library dean at the University of Texas. “We looked at circulation rates, we looked at electronic resources, we looked at requests, and we decided that having the services was more important than the physical books.” She adds bluntly: “When we prioritized the needs, the books weren’t the priority.”

That can't be good for the academic book market.


Natalie Angier, who knows something about how to introduce science to the masses, blows off some steam about STEM in this week's Science Times:

A new report from the President’s Council of Advisors on Science and Technology offers many worthy ideas for improving science education, like creating a “master corps” of the nation’s finest science teachers who would in turn train others; but the STEM word keeps thudding up its pages like so many gristle nubs in a turkey burger. It’s greasy-peasy: collapse down education, and you’ve got a buzz phrase to rival phys ed.

Much more snark -- in fact, a lot more snark in this NY Times piece than your typical blog post. And I should know -- I did the same thing last year!

On the one hand, "STEM" is so completely uninspiring that it's even obvious to the actor best known from "Wings," boat anchor of "Must See TV." On the other hand, "STEAM" not only blows, it also sucks. And I don't mean that in a steampunk kind of way.

Eva Amsen describes her trip down the BRCA2 cycle path, near the Sanger Institute in the UK. She also points to Jennifer Rohn's description of the path last year.

The path was opened in 2007, and includes the 10,000th mile of cycle path in the UK. The nearby Sanger Institute (famous for its genome sequencing) suggested decorating the path with a gene, and BRCA2 was chosen partly because it, too, has approximately 10,000 base pairs, and partly because it's a gene that people can relate to. Genetic screens for breast cancer risk are something that people will have heard of, and this is the gene that those tests look at.

Amsen has pictures of the path which give a good impression of the scale. What an interesting way to personalize the information in a genome; I wonder if the positions of mutational variants are marked?

A California pilot study is going to give students iPads with e-textbooks for algebra.

Students with iPads will have instant access to more than 400 videos from teaching experts walking them through the concepts and assignments, rather than having to rely on the teacher's explanation in class. There is also a homework coach and animated instructions on how to complete assignments.

Sipe said the videos allow teachers to focus on individual instruction rather than walking the entire class through the same examples again and again. The iPad also allows students to take audio or text notes and do assignments right on the device itself, giving the teacher the ability to track their progress in real time.

This has the chance to fundamentally change the way teachers work. But there will have to be a major adjustment in student behavior. You've got a good 20-minute introduction to a lesson, recorded by somebody who has been picked as a good presenter. Then, in principle, class time with the live teacher works more like a laboratory. But you've got to get the students to actually watch the video and do the prep work first!

Of course, since iPad textbooks can be apps, you can integrate quizzes right into the material, and other more or less invasive measures...

Yuehong Zhang reports in brief in Nature[1] the extent of plagiarism in scientific papers submitted to one journal in China:

Since October 2008, we have detected unoriginal material in a staggering 31% of papers submitted to the Journal of Zhejiang University–Science (692 of 2,233 submissions). The publication, designated as a key academic journal by the National Natural Science Foundation of China, was the first in China to sign up for CrossRef's plagiarism-screening service CrossCheck (Nature 466, 167; 2010).

We are therefore campaigning for authors, researchers and editors to be on the alert for plagiarism and to work against cultural misunderstandings. In ancient China, for example, students were typically encouraged to copy the words of their masters.

I saw a really provocative presentation last fall about the extent of plagiarism in papers from several countries. This Chinese journal is not an extreme case -- in some countries, members of National Academies or government ministers are serial plagiarism offenders. We're not talking about reprinting parts of research papers from the same lab; we're talking about wholesale lifting of papers and results from other scholars, often in the U.S., Canada or Europe. The software that enables checking for plagiarism today has revealed a shocking extent of copying in the scientific literature.

The most common explanation is "cultural misunderstanding". This is no doubt true to some extent, but some of the worst offenders are scientists who are anything but naive. There's some shady business out there.


References

1. Zhang Y. 2010. Chinese journal finds 31% of submissions plagiarized. Nature 467:153.


If you're an instructor curious about how to introduce blogs in your courses, you may want to read this post by Daniel Lende at the new Neuroanthropology. He describes his experiences getting students to broaden senior theses, community-based research and even exams.

I should write up the way I handle student blogs for my large course, Biology of Mind, which is starting again this semester. To my knowledge, it's one of the largest blog-based student projects. But I'll have to get a moment to spare first....

Krystal D'Costa (Anthropology in Practice) links to a mini-documentary about the role of social media in the education of "Gen-Y": "Decade 2: Encouraging Educators to Rethink Social Media Strategies in the Classroom."

First, that these subjects are operating in a world that didn't exist five years ago. Some hold job titles like Social Media Strategist, and others are entrepreneurs who can shape their job as they want and need using social tools. These are individuals who have learned early the power of technology and shared communication, and they've harnessed it. Second, they're aware that they have needed to find their way in the dark. Several individuals in the documentary discuss how poorly prepared they feel their education has left them. This is an interesting statement when one considers reports that this is not a tech savvy generation. And it prompts one to question whether the educational system can support the changing face of connectedness and business overall.

Are teenagers and college students learning about social media in the same way they learn about the birds and the bees -- mainly from their peers? Only a handful are really learning to control the media in their lives. People who end up in jobs like "Social Media Strategist" are the result of some kind of uncontrolled selection experiment.

Which maybe is as it should be. Uncontrolled selection experiments are pretty much how most successful people get started, I guess.

The Guardian has a helpful entry in its series on careers: "What to do with a degree in anthropology."

Most don't pursue graduate work:

Of the anthropology graduates who left university in 2008, 51% were in employment after six months in a diverse range of careers such as advertising and sales (8%), business and finance (6%) and public or private sector management (12%). However, a large number were working in catering (15%) or in clerical roles (20%) – no doubt a reflection of the current scarcity of graduate-level jobs.

I think there is no degree that articulates so well with a broad range of other fields -- there are ways to combine anthropology with everything from history to engineering. That's one reason why graduates have such a broad range of careers: they're led by their interests in other fields as well.


From the Chronicle of Higher Education, an article by Jeffrey Young: "College 2.0: Teachers Without Technology Strike Back."

I think that the article confuses matters by lumping together people with many different aims. Which I guess is sort of the point of all "technology in college" conversations. Different applications require different pedagogical approaches. There's no sense pointing out "Luddites" unless you can show the way that a particular technology would increase their effectiveness. A college's investment in teachers is a whole lot more expensive than the investment in clickers, projectors, online courseware, and the rest.

Nevertheless it's entertaining to see cherry-picked examples of professors proudly rejecting technology:

His professor made students write short papers and then gave extensive feedback, which forced them to hone their arguments and express themselves more clearly. And he made them write out the papers in longhand, in blue books, during class. "There's something about the immediacy or exigency of it," Mr. Leeds said. "When I took those written exams, I found that I made connections that I didn't know I knew—it shook up my brain cells like a supernova."

So today Mr. Leeds requires his students to write short, in-class papers. In blue books. By hand. Just like his favorite professor did.

From the comments:

No wonder some people would rather go to jail than to college.

Many have the same attitude about Powerpoint, I know.

It's that time of year again, when newspapers start reminding us that cheating and plagiarism happen.

"Lines on plagiarism for students blur in the digital age"

“If you are not so worried about presenting yourself as absolutely unique, then it’s O.K. if you say other people’s words, it’s O.K. if you say things you don’t believe, it’s O.K. if you write papers you couldn’t care less about because they accomplish the task, which is turning something in and getting a grade,” [anthropologist Susan D.] Blum said, voicing student attitudes. “And it’s O.K. if you put words out there without getting any credit.”

"To stop cheats, colleges learn their trickery"

For educators uncomfortable in the role of anti-cheating enforcer, an online tutorial in plagiarism may prove an elegantly simple technological fix.

That was the finding of a study published by the National Bureau of Economic Research in January. Students at an unnamed selective college who completed a Web tutorial were shown to plagiarize two-thirds less than students who did not. (The study also found that plagiarism was concentrated among students with lower SAT scores.)

I absolutely hate those "Web tutorials". They're a waste of time, and they've come to be used for everything from human subjects guidelines to conflict of interest rules to employment handbook reviews. I don't question that they may work, but I perceive them as hostile: a "CYA" tool for administrators who don't want personal contact with their employees.

I don't have a silver bullet for plagiarism, but it helps to explicitly introduce the topic, and review acceptable citation and quotation practices before the first writing assignment. It helps even more to assign online work with hyperlinks -- there's little reason nowadays to require a whole class to cut a tree for their work, when they're already accustomed to interacting online. And make assignments that aren't meaningful outside the context of the class -- specific reviews of particular readings, or critical analysis of specific hypotheses, not general topics.

Professors encourage plagiarism when they don't hold students accountable for their opinions. When a student isn't personally invested, she isn't going to think seriously, and she may find it easier just to slide by.


Mailbag: Remembering the books

Regarding "Bubbling through college":

"... and can remember which pages are where."

This alludes to something big that goes largely unnoticed, it seems, and about which I have trouble deciding.

In my own 1960s-80s educated brain, the *location* of knowledge is deeply tied to my access to and retention of it. Clearly with online learning and e-books, this means of structuring knowledge goes away, and nothing in particular replaces it. I've lost this strong tendency myself -- I no longer remember a fact, for example, even by *where on the page* the text was located.

But does this matter? Is this "ergonomics" of knowledge essential to all human brains, or is it only a trivial habit developed by a few arbitrary generations in the course of history? Does its loss mean knowledge will be less structured in the future, or merely structured in equally useful but different ways?

I still find myself doing this with PDFs, and I can remember well details of grade-school textbooks this way. But I have no knowledge of how common this may be. It seems to me that we may be exploiting an ancestral "geographic" ability, and it harks back to the "method of loci" which has been a trick for remembering things as far back as Roman times. But how natural is it?

There may be many other kinds of tricks that exploit innate brain abilities that wouldn't ordinarily be recruited for narrative information.

Bill Gates says college will be obsolete in 5 years:

“Five years from now on the web for free you’ll be able to find the best lectures in the world,” Gates said at the Techonomy conference in Lake Tahoe, CA today. “It will be better than any single university,” he continued.

He believes that no matter how you came about your knowledge, you should get credit for it. Whether it’s an MIT degree or if you got everything you know from lectures on the web, there needs to be a way to highlight that.

Glenn Reynolds says higher education is a bubble that is set to burst:

So my advice to students faced with choosing colleges (and graduate schools, and law schools) this coming year is simple: Don’t go to colleges or schools that will require you to borrow a lot of money to attend. There’s a good chance you’ll find yourself deep in debt to no purpose. And maybe you should rethink college entirely.

What to do? Colleges can't keep giving the same product and expect it to work much longer; their monopoly on granting credentials has already started to break for many business and professional fields. Bill Gates is the largest shareholder of one of the country's largest high-tech employers. If they can find a way to get talent faster, saving their recruits tens of thousands of dollars in the process, that gives them a huge advantage.

College needs to be better. The "best lectures in the world" aren't good enough. Happily, there are some other things that we also do well.

I think Gates may be on to something here:

One particular problem with the education system according to Gates is text books. Even in grade schools, they can be 300 pages for a book about math. “They’re giant, intimidating books,” he said. “I look at them and think: what on Earth is in there?“

The best instructional books I've learned from, and which I go back to, are consistently short. Short enough that a person can learn everything in them, and can remember which pages are where.

"Just-so stories" driving me crazy

NPR has been doing a special series of reports during their "Morning Edition" program called "The Human Edge", all about various aspects of human evolution. I think it's just wonderful that they're doing this, and the stories are available on the NPR website, which is also great.

I've been out of town and so haven't been following closely. So I'm just noticing that some of these stories actually drive me up the wall. Every one of them is presented as what Stephen Jay Gould called a "just-so story".

I'll take one of the latest articles as an example: "Food For Thought: Meat-Based Diet Made Us Smarter". The story begins with a brief summary of the "expensive tissue hypothesis", with quotes from one of its main exponents, Leslie Aiello. This is a hypothesis that paleoanthropologists take seriously, and it has some empirical support in the comparative biology of primates. But here's how the story poses the hypothesis:

"You can't have a large brain and big guts at the same time," explains Leslie Aiello, an anthropologist and director of the Wenner-Gren Foundation in New York City, which funds research on evolution. Digestion, she says, was the energy-hog of our primate ancestor's body. The brain was the poor stepsister who got the leftovers.

...

Meat is packed with lots of calories and fat. Our brain — which uses about 20 times as much energy as the equivalent amount of muscle — piped up and said, "Please, sir, I want some more."

As we got more, our guts shrank because we didn't need a giant vegetable processor any more. Our bodies could spend more energy on other things like building a bigger brain. Sorry, vegetarians, but eating meat apparently made our ancestors smarter — smart enough to make better tools, which in turn led to other changes, says Aiello.

That's a "just-so story." How did meat make us smarter? Is it a magical meat property? If I fed enough meat to the local deer, would they get smarter? The expensive tissue hypothesis proposes an energetic trade-off, but doesn't provide any mechanism by which the evolution of smarter brains (or diet shift) would occur. A trade-off is simply "you can't have your cake and eat it too." It needn't say anything at all about how you bake a cake, or what happens if you can't eat it.

I'm not anti-expensive tissue; I just want to recognize the limits of these explanatory hypotheses. Energy cannot explain everything about human cognitive evolution. It's an important constraint, but it cannot be the only one. Without some countervailing force, energy expenditure would always favor smaller brains. So we deserve some account of mechanism, not just energy budget.
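To spell out what the trade-off does and doesn't claim, here is the budget constraint written as a rough sketch -- my own notation, not the formal version of the hypothesis:

```latex
% Rough sketch of the energetic trade-off, in my own notation -- not a formal model.
% With the total and "other" budgets held roughly fixed, any increase in brain
% expenditure must be offset by a decrease in gut expenditure:
\begin{aligned}
E_{\mathrm{total}} &\approx E_{\mathrm{brain}} + E_{\mathrm{gut}} + E_{\mathrm{other}}\\
\Delta E_{\mathrm{brain}} &\approx -\,\Delta E_{\mathrm{gut}}
  \qquad (E_{\mathrm{total}},\ E_{\mathrm{other}} \text{ fixed})
\end{aligned}
```

The constraint only says that a bigger brain has to be paid for by a smaller gut (or a bigger total budget); it says nothing about why selection would favor the bigger brain in the first place, which is exactly the missing mechanism.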

The story about endurance running attempts to tackle the issue of mechanism: "For Humans, Slow and Steady Running Won the Race". This story relies on interviews with Dan Lieberman, who favors the idea that Homo erectus adopted a form of long-distance running.

"Most animals are designed for speed, for power, not for endurance," Lieberman explains, as we make a turn onto the bridge. "And we are a special species in having been selected for endurance, not speed."

So we grew longer legs and lighter feet; the joints in the legs and pelvis got bigger to absorb a lot of impact; and we grew a bigger butt muscle.

Lieberman says these and other changes allowed us to run down and exhaust prey, like antelopes. He notes that "persistence hunters" in Africa have been known to do that. And the payoff would've been big for early humans: lots of high-calorie meat to feed a bigger brain.

Again, this is presented as a just-so story. It's a plausible narrative, but the article doesn't situate it in an evolutionary context. How exactly would you test this hypothesis? You could look at prey species profiles for early Homo (favoring low-endurance species); you could consider other cognitive and physiological requirements of persistence hunting (tracking ability, knowledge of water sources); you could look for evidence of gateway strategies (use of slow-acting poisons that require long-distance tracking). You could also try to refute alternative explanations for the anatomical features in question, such as their usefulness for long-distance walking, walking on irregular substrates, or simple allometry with body size or lifespan.

These are serious hypotheses with literature and evidence supporting them. I just wish that they could be reported in a way that made it sound like paleoanthropologists are skeptical scientists!

If you need something to heat up your July, you can check out the NY Times forum, "What if College Tenure Dies?".

It makes an interesting pool: Imagine five academics given the assignment to make their best argument for or against tenure, in 500 words or less. How many words will they waste, on average? I'd say the winning number is 350.

via TaxProf.


Graphic biology teacher survey results

Several people (e.g., P. Z. Myers, Jerry Coyne) have passed along a poster representation of some statistics on evolution, creationism, and other stuff in secondary biology education.

These statistics are from the National Survey of High School Biology Teachers, taken in 2007 and reported in a 2008 paper by Michael Berkman and colleagues [1]. I wrote about the survey results at greater length when the paper first appeared.

[Chart: biology teachers creationism survey statistics]

What I want to know is where are these high school biology classes that include more than 20 hours of human evolution? That's four weeks! Two percent of the survey is 18 teachers. Good for them, and I hope they're using the blog!

The 17% who say they don't cover human evolution at all... I think that it wouldn't be too hard to make a real dent in this statistic. It does not take extra time to instill basic knowledge about human evolution, if you're already discussing basic genetics. All of the good examples of Mendelian inheritance are good examples precisely because they illustrate recent human evolution. Any discussion of human variation really is a discussion of human evolution. You just need to include the missing evolutionary frame, the one that makes sense of these things.

Still it's true that many biology classes don't touch on issues related to humans at all. Even these are missing an obvious opportunity -- other organisms are relevant to our biology precisely because of our shared evolutionary history.

The part of the survey that I found dismaying was the low number of hours devoted to evolutionary biology in general. As I put it then:

We're entering an age in which health decisions will be made based on genetic information -- when everyone may know their own gene sequences if they want to. New diseases are emerging, new crops are being developed, and new organisms are being transplanted from one continent to another. Decisions about the economic development of entire regions -- perhaps entire nations -- are now subject to the evaluation of biodiversity, including threatened and endangered species.

The people making these decisions ten to twenty years from now will have an average of 13.7 hours of education about evolution.

Looking at the distribution of numbers, it's clear that the average of 13.7 is buoyed by a tail of high-instruction classes. The median and mode are between 5 and 10 hours. This has to change, if we're going to have a populace capable of using genetic information.
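As a made-up illustration of that mean-versus-median point (these numbers are not from the survey), a handful of classes with many hours of instruction can pull the average well above what the typical class gets:

```python
# Hypothetical hours of evolution instruction for 20 classes -- NOT the survey data.
# Most classes sit in the 5-10 hour range; a few long-tail classes drag the mean up.
import statistics

hours = [4, 5, 5, 6, 6, 7, 7, 8, 8, 8, 9, 10, 10, 12, 15, 20, 25, 30, 35, 40]
print(statistics.mean(hours))    # 13.5 -- buoyed by the tail
print(statistics.median(hours))  # 8.5 -- closer to the typical class
```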


References

1. Berkman MB, Pacheco JS, Plutzer E. 2008. Evolution and creationism in America's classrooms: A national portrait. PLoS Biology 6(5):e124.

Marie-Claire Shanahan has written an essay at A Blog Around the Clock discussing the Berkeley genetic test:

I chatted informally with some friends about the issue. One expressed her divided feelings about it saying (roughly quoted) "It seems like they [university admin] have addressed the ethical concerns well by being clear about the use of the swabs and the confidentiality but something still just doesn't feel right. There's still a part of me that shivers just a little bit."

What is the shiver factor?

Her thoughts provide another perspective, and I hope more will come.

I think that the test does two things. It requires that students give a different kind of trust to the university -- for information that's not covered by the usual federal protections of student records, and that requires a new "consent statement". To enforce this new trust, the test imposes a pressure from peers and faculty upon students.

I don't see how that trust has been earned. Especially by the University of California system -- remember MoCell?

Berkeley DNA tests revisited

I wrote about the UC Berkeley genetic testing of incoming freshmen earlier this spring. The summer is halfway over and the saliva kits have been sent. Now Scientific American has a long and balanced article on the contrasting approaches to genetic testing at Berkeley and an upper-level seminar at Stanford: "Exposing the Student Body: Stanford Joins U.C. Berkeley in Controversial Genetic Testing of Students".

This is an article worth reading by anyone interested in personalized genomics or bioethics. I wouldn't have expected that university classes would be such an early battleground for genetic information, privacy rights, and junk science. But nothing about either program is unprecedented. I wrote in 2005 about genetic testing associated with a course at Penn State. As I noted in 2005, I have a lot of concerns about applying these genetic tests to students. They can have an educational effect, but not always a beneficial one.

The UC-Berkeley program actually provides vastly less information than the ancestry testing that has been applied to students in courses in the past. That's my main objection -- it's an awful lot of trouble for essentially no scientific value. I mean, they might as well just do blood types!

There's a lot in the article about the thinking of the main decision makers. I'll share these two paragraphs:

In fact, after Salari originally proposed the class last fall, a Stanford task force of about 30 basic scientists, clinical scientists, genetic professors, genetics counselors, bioethicists, legal counselors and students spent several months working through the various ethical issues and establishing safeguards to protect students. In contrast, the organizers of Berkeley's project incurred criticism because they spent hardly any time considering the potential reaction to their new orientation program.

Kimberly Tallbear, a professor of science, technology and environmental policy at Berkeley, explains that neither [Dean] Mark Schlissel nor any of the project's other organizers consulted with Berkeley's bioethics community. "Schlissel said several times they were surprised about the controversy," Tallbear says. "I said to him, 'Well doesn't that tell you that you needed input from us? Because we could have told you about the controversy and debate.'"

The article also discusses the "research study" aspect -- participants will be asked to sign an informed consent form and data will be kept. It may seem like the three genotypes provided to the students would not be very interesting as research topics. But it's not too hard to imagine psychology grad students in three years becoming very interested in research projects involving a high-risk population for binge drinking and known ALDH2 genotypes. Berkeley freshmen may be enrolling now in the first phase of a long-term research study on alcohol and sexual assault.

The Science Insider listens to actor Tim Daly, advocating for science education, who thinks the officially sanctioned ed-school terminology is bad marketing.

"The acronym STEM (science, technology, engineering, and mathematics) blows," says Daly, who participated in a lunch-time rally today for the upcoming National Lab Day on 12 May as co-chair of the Creative Coalition, a nonprofit organization that lobbies for the arts community. "Everybody thinks you're talking about stem cells. It should be STEAM. It's not only a better acronym, but it will enhance what they are doing."

On the one hand, "STEM" is so completely uninspiring that it's even obvious to the actor best known from "Wings," boat anchor of "Must See TV." On the other hand, "STEAM" not only blows, it also sucks. And I don't mean that in a steampunk kind of way.

I mean, why not just call it "SMEAT"? Because, people, you're arguing about an acronym. Is there anything more quintessentially nerdy than trying to find the majick acronym that will float to the top of the grant application pile? They may as well name it "nerdropology."


Michael E. Smith has some suggestions after going to the SAA meetings:

How to give a bad presentation at a professional conference

I have always been amazed at the low quality of many presentations at these meetings, starting at the first one I attended as an undergraduate. It seems that many archaeologists must WANT to give bad presentations. If that is the case, then I can be helpful and give you some tips on giving bad presentations.

Good (inverse) advice follows, mostly centered around reading a prepared text and working poorly with slides.

New Smithsonian human origins hall

Thanks to all those readers who sent me links to the new human origins hall at the National Museum of Natural History, in Washington D.C. The NY Times' Edward Rothstein reviews the new exhibit:

During the brief 200,000-year life of Homo sapiens, at least three other human species also existed. And while this might seem to diminish any remnants of pride left to the human animal in the wake of Darwin’s theory, the exhibition actually does the opposite. It puts the human at the center, tracing how through these varied species, central characteristics developed, and we became the sole survivors. The show humanizes evolution. It is, in part, a story of human triumph.

I pointed to a feature about the John Gurche reconstructions last month. You can see many of these along with some 3-d models of fossil casts at the exhibition's website. The online component of the exhibit, titled "What does it mean to be human," has been given a lot of effort. It includes essays about several areas of paleoanthropological research, some interactive features (including the 3-d casts), and a forum for teachers. As you might imagine from the quote above, I don't agree with everything in the exhibit, but they've done a very nice job creating a storyline (focused on human adaptability to climate and environment) and illustrating it.

I'm having a bit of a laugh about the "Human Family Tree", though. It's an interactive feature so I can't paste a copy. They've taken care to make sure that every fossil is on a side-branch, not on the "main trunk" of human evolution. But what tickled me is that some of the "branches" appear to ramify from lots of different places in the tree -- "Paranthropus," for example, is paraphyletic. Also I love how some of the species weren't specifically given facial reconstructions (some don't have crania), so they have a generic "caveman mugshot" on the tree. It's a reminder of the contentious scientific politics that lie hidden behind certain hypotheses, no matter how accessible-looking they are!


A very interesting essay by Edward Rothstein in the NY Times special museum section: "The thrill of science, tamed by agendas".

Rothstein features a comparison of the human-centered renovation of the Griffith Observatory, and the new Rose Center for Earth and Space in New York, which goes with more of a pale blue dot theme.

Of course, the insignificance of human existence is one of the fearsome lessons of modern science. But when we are young, we learn differently. We begin by learning to value our own understanding and only gradually come to recognize its limits. We begin by making sense of the world before we see how much lies beyond sense. The process doesn’t work well in the other direction: we can be left mystified by the world and lose respect for the human.

Something like this has started to happen in some museums. This decentering of the human can become a devaluing of the human; the museum may even begin to see human frailties as a great flaw in the cosmic order that must be repaired. So this new variety of science museum must not just display or explain. It must be relevant, useful, practical, critical — something that helps with fund-raising as well.

From there, he covers the "self-loathing" that seems to have crept into natural history museums concerning humans and nature. Some of his comments are reasonable, some hyperbolic, but all thought-provoking.


Washington Post: "Wide Web of diversions gets laptops evicted from lecture halls":

Professors have banned laptops from their classrooms at George Washington University, American University, the College of William and Mary and the University of Virginia, among many others. Last month, a physics professor at the University of Oklahoma poured liquid nitrogen onto a laptop and then shattered it on the floor, a warning to the digitally distracted. A student -- of course -- managed to capture the staged theatrics on video and drew a million hits on YouTube.

Whiners.

All this does is privilege students with smaller devices in a tablet form factor. I'd like to see the opposite story, about how computers are empowering new kinds of classroom experiences.

Although there was that time that the student with the computer tried to "correct" me by citing Wikipedia in class. That led to a few minutes of entertainment for everyone!


Jeffrey Zeldman: advice for business that works just as well in academia: "Show up early"

How can a client blame you for a cab driver’s mistake? How can a conference organizer hold you accountable for an airline’s cancelled flight?

They can do it because lateness is part of the order of things, and grownup professionals plan for it, just as they plan for budget shortfalls and extra rounds of revision.

No, my link is not sending a passive-aggressive message to anyone...
