Wednesday, December 30, 2009

The Race to Define the Kilogram

In August, scappuccino reported on a BBC radio story about the changing weight of the kilogram. That is, THE kilogram: an apple-sized mass of platinum-iridium protected behind glass that defines how much a kilogram weighs. There are replica masses all over the world, but this is the queen bee. If she loses weight, everyone feels it. And she has been. Over the past few decades her weight has fluctuated. It's also possible that the comparison weights have been gaining mass, but no one can be sure because the change is happening on such a small scale. It's only by a few micrograms - nothing that would affect general consumers like you and me - but enough to affect precision measurements done by scientists. Needless to say, there's a bit of a panic to come up with a solution so that the weight of the world, so to speak, won't continue to rest on Le Grand K's shoulders.

There are currently two key efforts to successfully redefine the kilogram in a way that would not rely on a hunk of metal sitting in Paris, accumulating contaminants or losing mass each time it is cleaned. While each has been in the works for many years, the race seems to be heating up, and things may come to a head in the next five years.

So how would YOU define a standard of weight?

The simplest, perhaps most obvious, method would be to find a mass and keep it as the standard. But we see now that that method has its potential flaws.

So how about counting up something? Let's say 500 black beans equals one kilogram. Again, this might work for general consumption, but it still has its flaws. We'd have to assume that black beans all weigh the same, when in reality they fluctuate just a little. So what other standard mass could we use?

Scientists believe that atoms should have standard mass. One atom of silicon should have the same mass as an identical atom of silicon. The problem there is that atoms are difficult to count. There are roughly 21 septillion atoms in a kilogram of silicon! (That's 21 trillion trillion.) Atoms are small, and there are a lot of them. But some scientists believe this is the very best way to define the kilogram, and they are working on what's called the Avogadro project to determine the number of atoms in one kilogram of silicon crystal.
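
The arithmetic behind that count is just mass divided by molar mass, times Avogadro's number. A minimal sketch, using rounded reference values for natural silicon:

```python
# Rough count of atoms in one kilogram of silicon.
# The molar mass and Avogadro's number below are standard reference
# values, rounded here for illustration.
MOLAR_MASS_SI = 28.09   # grams per mole, natural silicon
AVOGADRO = 6.022e23     # atoms per mole

def atoms_in_kilogram(molar_mass_g_per_mol):
    """Return the approximate number of atoms in 1 kg of an element."""
    moles = 1000.0 / molar_mass_g_per_mol   # 1 kg = 1000 g
    return moles * AVOGADRO

n = atoms_in_kilogram(MOLAR_MASS_SI)
print(f"{n:.2e} atoms")   # on the order of 2.1e25, i.e. ~21 trillion trillion
```

The hard part, of course, isn't the division; it's pinning down the molar mass and the sphere's atom spacing precisely enough that the count is trustworthy to parts in a hundred million.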

Before they can count the number of atoms in a kilogram of silicon, scientists must arrange those atoms into an ideal body that they can study. After more than a decade of work, the results so far are a family of grapefruit-sized spheres made of 99.99 percent isotopically pure silicon-28. The raw materials alone cost about $700,000 per sphere. They are reportedly the most perfectly spherical objects ever made by man, having been measured at millions of locations to make their surfaces perfectly smooth. Leading the Avogadro project, scientists at the National Physical Laboratory in England and the Physikalisch-Technische Bundesanstalt (PTB) in Braunschweig, the German national metrology institute, are now trying to count the atoms in these spheres, but even that can't happen until the spheres are absolutely perfect. Their website proclaims, "As to counting, there is no end in sight!"

If those scientists can measure the atoms in a sphere of silicon crystal to an acceptable level of uncertainty, it would also redefine Avogadro's constant, which is linked to another international base unit of measurement, the mole. A mole is the amount of substance that contains as many particles as there are atoms in 12 grams of monoisotopic carbon-12. It's also used as a comparison value for calculating the exact amount of any material (chemistry students remember it all too well). It's a massive number (there are a lot of atoms in even a gram of carbon) that is currently known with a relative uncertainty of 5x10^-8. The Avogadro project will have to beat that uncertainty to be adopted. Scientists working on the Avogadro project announced in a feature article for the Comité International des Poids et Mesures (CIPM, the International Committee for Weights and Measures) that they hope to reach a redefinition of the kilogram by 2011.

Competing with the Avogadro project is an effort known as the watt balance, which would define the kilogram in terms of current and magnetic field. If you send a current through a wire in the presence of a magnetic field, the wire will feel a force. If you can measure a current and magnetic field which generate a force equal to the weight of a kilogram, you can define it by the values of the current and magnetic field.
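
That weighing-mode balance condition - magnetic force B x I x L equal to weight m x g - can be sketched in a few lines. All of the numbers below are made up for illustration; a real watt balance involves far subtler measurements of the coil's field and geometry:

```python
# Toy illustration of a watt balance's weighing mode: the magnetic force
# on a current-carrying coil (B * I * L) balances a weight (m * g).
# All parameter values here are illustrative assumptions, not figures
# from any real instrument.
G = 9.81   # local gravitational acceleration, m/s^2

def balanced_mass(b_field_tesla, current_amps, wire_length_m):
    """Mass (kg) whose weight equals the magnetic force B*I*L."""
    force = b_field_tesla * current_amps * wire_length_m   # newtons
    return force / G

# Example: a 0.5 T field, a 10 mA current, and ~1962 m of coil wire
# happen to balance exactly one kilogram.
m = balanced_mass(0.5, 0.010, 1962.0)
print(f"{m:.3f} kg")
```

In practice the experiment also runs a "moving mode" to calibrate B x L, which is what ties the whole measurement back to electrical standards and, ultimately, the Planck constant.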

There is a little bit of mathematical manipulation with this measurement that would end up providing a more precise value of the Planck constant, "h". This quantum mechanical constant relates energy to the frequency of a photon, and it pops up in a lot of mathematical equations, so the impact could ripple through many areas of physics.

Right now, the race is on. Scientists on both sides have reason to believe their method will work best, that the other may not be precise enough or easy enough to replicate, while others are waiting to see if either side will deliver what they promise. Neither has achieved the needed precision to compete with the current system, but both claim they are well on their way. In another year or so things may have come to a head, or be delayed indefinitely once again.

Here are a few other great reports on this story from BBC News and the Backreaction blog.



Tuesday, December 29, 2009

Volcanic Quakes Help Forecast Eruptions

Monitoring the earthquakes caused by magma movement inside an active volcano could help improve the accuracy of eruption forecasts.

According to Emily Brodsky, the magma inside a volcano appears to be the real driving force of pre-eruption quakes, indicating how soon an eruption will follow.

"It is really that simple," said Brodsky, an associate professor of Earth and Planetary Sciences at the University of California, Santa Cruz. "It all comes back to physics and the fluid dynamics of magma in the volcano."

Brodsky worked with Luigi Passarelli, a visiting graduate student from the University of Bologna in Italy, who observed 54 eruptions around the world.

"Passarelli collected the data, and it revealed obvious patterns in eruption activity," said Brodsky. "This kind of work isn’t glamorous, but it is very important to find these patterns, and they have led to something that is really very simple."

What Brodsky and Passarelli discovered is that they can use a volcano's "run-up" time -- the time between the beginning of pre-eruption earthquakes and the actual volcanic eruption itself -- to improve the accuracy of future volcanic eruption alerts.

A volcano with frequent eruptions has a short run-up time. These volcanoes have little earthquake activity because the magma inside is runny, containing very little silica. The magma flows easily, filling the internal chamber before quickly exiting the mouth of the volcano and allowing only a small amount of time to issue an eruption warning.

Alternatively, a dormant volcano has a very long run-up period. Dormant volcanoes contain magma that is full of silica, making it thick and slow-moving. It takes much more time for this kind of magma to fill a volcano and reach its mouth. The thick magma pushes its way up the volcanic neck and presses on the surrounding rock, causing earthquakes that provide more advance warning of an upcoming eruption.

Brodsky and Passarelli hope that this information will be applied to volcanoes around the world.

"Volcanoes like Mount Rainier are already monitored by the USGS Cascades Volcano Observatory, but there are many other unmonitored volcanoes around the world," said Brodsky. "By applying what we've found, we can improve eruption forecasts and provide better alerts."

"Volcanoes are complex beasts and this research is telling us that they are all controlled by magma -- it's really a very simple process," she said. Brodsky presented her work with Passarelli at the December meeting of the American Geophysical Union in San Francisco, Calif.


By Emilie Lorditch
Inside Science News Service


Monday, December 28, 2009

Tongue in Cheek

What a classic holiday image - a child in distress. Part of growing up is learning lessons about the world around you, and this early physics lesson is one that most kids don't forget. It's the principle of thermal conductivity.

The sensation of having your tongue stick to a metal pole sends chills up my spine. Not for the cold or the taste, but the removal. Unless you've got a cup of hot water, it's rather painful to pull your tongue off. It's that or the risk of remaining tethered to a pole. In 8-year-old lore, it's metal poles and mailboxes that you have to watch out for. Frost on wood or rubber doesn't present such a high risk. So why doesn't your tongue freeze to your glove when you eat a bit of snow off of it?

Quite simply, when you lick cold metal, the water in your tongue freezes to ice and binds you to your temporary prison. Here's why: Heat is sort of a socialist. It wants everyone to be even. It's always trying to get the heat in one place to move over to colder areas and won't rest until they're the same temperature. So eventually the pie, hot when it comes out of the oven, cools off, and the air around it is a bit warmer. If two surfaces touch, the heat wants to flow from the hotter to the cooler object. What prevents this from making our whole world the same uniform temperature is thermal conductivity (and other stuff if you want to talk global, but we're just talking about tongues and frost here...). The heat from your hands wants to flow out to the cold air - but is stopped by your gloves. Those gloves are good insulators - they don't allow the heat to pass very quickly. Metal, on the other hand, transfers heat very well.

So when you stick your tongue to that metal pole, assuming it's cold enough that the pole is well below freezing, the heat from your tongue is sapped by the metal before your body can replenish it. The water in your tongue freezes, and there you are, stuck like a fool. Warm water will unfreeze you, as will saliva if you can get enough of it running before it, too, turns to ice. You can also manage to get a moist hand stuck to a metal pole - but you'll notice it would be almost impossible to get it stuck to a piece of wood or rubber. Your body would heat you up much faster than those materials could suck the heat away. Ice has a thermal conductivity higher than wood but lower than metal - which is why your tongue might stick to a Popsicle, even if it's a warm day.
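
To put rough numbers on this, the steady heat flow through a layer of material goes as q = k x (temperature difference) / thickness, where k is the thermal conductivity. A quick sketch using typical textbook values for k (assumptions here - the exact number varies with the particular metal, ice, or wood):

```python
# Rough comparison of heat flux q = k * dT / d through different materials.
# The conductivities below are typical textbook values, in W/(m*K) -
# illustrative assumptions that vary with the specific material.
CONDUCTIVITY = {
    "steel": 50.0,
    "ice": 2.2,
    "wood": 0.15,
}

def heat_flux(k, delta_t_kelvin, thickness_m):
    """Heat flow per unit area (W/m^2) through a slab, q = k * dT / d."""
    return k * delta_t_kelvin / thickness_m

# A warm tongue (~35 C) against a -10 C surface, through a 1 cm layer:
for material, k in CONDUCTIVITY.items():
    q = heat_flux(k, 45.0, 0.01)
    print(f"{material:>5}: {q:8.0f} W/m^2")
```

The ordering is what matters: metal pulls heat away hundreds of times faster than wood, with ice in between - exactly the ranking of tongue-sticking risk described above.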

To test this in a less painful way, try leaving a piece of metal and a piece of wood lying near each other for 24 hours. Put them outside, inside, under your mattress, just a few places with distinctly different temperatures. Once they've sat for a day, long enough, we assume, to become the same temperature as their surroundings, touch them. Which one is colder?

Normally, your immediate sensation will tell you that the metal feels colder. If you have a means of taking the temperature of the wood and the metal before you touch them, do so. Even though the metal feels colder, you'll find that it is usually the same temperature as the wood. But the metal sucks up your hand's heat very fast, making your hand feel cool.

Oh, childhood lessons.

Wednesday, December 23, 2009

I SAID, IT'S COLD OUT THERE!

Baby, it's cold outside. Unless you're living in Los Angeles, where people are so rich that they bought the sun and get to use it as much as they want. For the rest of us, it's cold. And you know what that means? Fewer sound shadows.

I blogged last winter about temperature inversions, which occur in areas where there is not a lot of circulation during the winter. I promise this will get back to sound shadows, but here's the gist of temperature inversions:

As sunlight passes through the layers of our atmosphere and reaches our bright shining faces, it doesn't actually do a lot to heat up the air. Air is a pretty bad conductor of heat, so it takes a while to warm up. Instead, the sun heats up the ground. The ground acts like a hot plate and gradually heats the air right above it. That warm air rises (Sincerely, physics.) and colder air drops down close to the hot plate. Then that warms up, continues the circulation, and pretty soon we've got a pretty toasty little bubble.

BUT when the ground can't heat up very quickly, due to factors like lower levels of sunlight or very cold nights, the little hot plate doesn't warm up and the air near the ground stays cold. The cold air sits near the ground and stays there.

Now normally hot air rises, but the higher up you go, the thinner the air gets. As air rises into lower pressure it expands and cools, which is why mountain air is usually colder than valley air, even if the sun is shining on both. So it's actually normal for the air to get colder as you go higher in altitude. But when the ground stays cold and no longer heats up the air down low, then you get an inversion: where the air gets warmer as you go up. That means cold days for folks living in the valleys.

The air is like this for pretty much the whole winter in the Arctic, because the sun shines so little there. If you happen to live in a valley or enclosed region, an inversion can cause major problems. Pollutants pumped into the air are not warm enough to rise up and get out, and you don't have the regular circulation. The warm air puts a cap on the region, keeping pollutants trapped. This further clouds the air and keeps even more sunlight from reaching the ground. So it's a self-perpetuating cycle. You need a storm system or major change in the weather to knock things back to normal. (The image to the right shows two shots from the same position overlooking the Salt Lake Valley. Guess which one was during an inversion?)

So for the most part, inversions suck. But as it turns out, you might be able to hear things better when there is an inversion.

If you've ever gone camping on a lake you might have noticed that in the daytime you can see a group of campers across the lake, but you can't hear them. However, at night you seem to hear them very well. People often attribute this to the absence of daytime noise, or the sense of quiet at night, but it actually has to do with the temperature of the air.

Sound waves, like light waves, can be refracted, or bent by the medium they pass through. Light passing through a glass of water, for example, can be bent so that the bottom half of a pencil sitting in the glass appears to be in a different location than the top half.

Sound waves and light waves are fundamentally different, but many of the governing principles are the same. The speed of sound through a medium is determined by the temperature of that medium (light in a vacuum, by contrast, always travels at the same speed). The warmer the medium, the faster the sound waves can travel. Like taffy - much easier to manipulate when warm.

In the daytime, the warmer areas are near the ground and cooler areas are above it. Portions of the sound wave near the ground move faster, while portions higher up move slower. This causes the wave to bend upward. As it does so, it creates a "shadow region" near the ground where the sound wave does not pass. You can see in the image below (courtesy of Dr. Dan Russell of Kettering University) how the acoustic shadow might make it hard to hear someone still within your line of sight.
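
The temperature dependence driving all this is easy to quantify with the common linear approximation for the speed of sound in dry air, v = 331.3 + 0.606 x T, with T in degrees Celsius:

```python
# Approximate speed of sound in dry air as a function of temperature,
# using the common linear approximation v = 331.3 + 0.606 * T (T in Celsius).
def sound_speed(temp_celsius):
    """Speed of sound in dry air, in m/s (linear approximation)."""
    return 331.3 + 0.606 * temp_celsius

# Daytime: warm air near the ground, cooler air aloft.
print(f"near the ground (25 C): {sound_speed(25.0):.1f} m/s")
print(f"higher up (10 C):       {sound_speed(10.0):.1f} m/s")

# The wavefront moves faster where the air is warmer, so it bends toward
# the cooler, slower air: upward by day, downward at night or during an
# inversion.
```

A difference of a few meters per second over a few dozen meters of altitude is all it takes to steer sound over your head - or, during an inversion, right back down to your campsite.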


At night or during an inversion when the temperature is cooler near the ground and rises as you go up, the acoustic shadow is reversed.


Winds can also cause acoustic shadow areas, and there are a few books about how this might have determined the outcomes of Civil War battles (like the number of troops you send might depend on how bad the battle sounds from afar).

Neat as this is, it seems a small compensation that you can hear people's conversations better when there is an inversion. It's pretty much a lot of this: "This weather sucks." "Yeah." "We should probably invest in some cleaner sources of energy before we all get black lung from inhaling all this pollution." "Yeah."


-Agent Utah


Tuesday, December 22, 2009

Body Heat Power Source

Imagine portable electronics that run on a free, reliable energy source. No chargers to worry about, no dropped calls because you played too much Tetris on your Droid, and an endless playlist on your iPod that's truly endless, at least until they pry it from your (literally) cold dead hand. Well, you may not have to imagine for long.

Vladimir Leonov and Ruud Vullers of the Interuniversity Microelectronics Center have developed power supplies that can run off of your body heat. All you have to do is strap on the blingtastic headband you see here, and you're ready to go.

OK . . . the technology is not quite ready for prime time, but it's much more promising than lots of other proposals for systems known as "energy harvesters" that gather power that would otherwise go to waste. Consider, for example, the wasted energy in the jiggling of your own waist (assuming you have a little extra there like I do.) As you go about your day and your spare tire bounces around, all that motion represents energy that could be put to good use. It's truly wasted waist power.

Lots of people have proposed capturing the energy in body motion through complex mechanical contraptions or piezoelectric material. Some systems consist of energy generators in the soles of your shoes. Another one would be incorporated into soldiers' backpacks. It creates electricity as the soldier moves up and down with each step.

It's not a new idea. There have been self-winding watches around for centuries that rely on the random motions of your arm to keep running. The problem with most of them is that the energy has to come from somewhere. It's not a big deal if all you want to do is check the time, but as soon as you try to run something as power hungry as a cell phone, you get into trouble.

Consider the military backpack power supply. Every watt of power it produces is coming from the soldier, which means there's less power available to do everything else a soldier needs to do. Besides wearing a grunt down faster, it will mean he or she will have to eat more to keep going. In effect, you're replacing conventional batteries and power sources with a really inefficient generator (the soldier) that runs on k-rations.

Leonov and Vullers have a much better idea. You produce lots of waste heat, even when you're sitting still. In fact, we have pretty sophisticated mechanisms in our bodies to make sure that extra heat escapes, and your insides stay at a very constant temperature. But that means your body is throwing away energy that could be put to good use. As long as there's a difference between the temperature of your skin and the surrounding environment, devices known as thermopiles can convert the temperature difference to electricity. Using the electricity to run something like this chic electrocardiogram (ECG) shirt can actually help cool you down while keeping tabs on your vitals.
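
As a ballpark, a thermopile's open-circuit voltage is V = N x S x dT for N junction pairs with Seebeck coefficient S, and a matched load draws V^2/(4R). The parameter values below are illustrative assumptions on my part, not figures from Leonov and Vullers' actual devices:

```python
# Ballpark power from a wearable thermopile. Open-circuit voltage is
# V = N * S * dT for N junction pairs with Seebeck coefficient S;
# the power into a matched load is V^2 / (4 * R).
# All parameter values below are illustrative assumptions, not measured
# figures from any published device.
def thermopile_power(n_junctions, seebeck_v_per_k, delta_t_k, internal_ohms):
    """Matched-load electrical power (watts) from a thermopile."""
    voltage = n_junctions * seebeck_v_per_k * delta_t_k   # open-circuit volts
    return voltage ** 2 / (4.0 * internal_ohms)

# Say 1000 junction pairs at 200 microvolts/K, a 3 K skin-to-air
# difference, and 1 kilohm of internal resistance:
p = thermopile_power(1000, 200e-6, 3.0, 1000.0)
print(f"{p * 1e6:.0f} microwatts")
```

Tens of microwatts is the right neighborhood for why this works for an ECG shirt but not (yet) for a phone - which is exactly the design tension the next paragraph describes.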

The temperature difference between your skin and the air is too small to generate much electricity, so Leonov and Vullers are working to develop ultra low power devices to go with their thermal energy harvesters. It's a bit of a paradox though, because if you make the power requirements low enough, then why not just carry around a tiny battery?

Well, I'll tell you why. It's because you get to choose between a plain old, boring, run-of-the-mill battery . . . or something like this far-out blood monitoring bracelet! I'm not digging the headband energy harvester, but I would totally wear the pulse oximeter. It would go great with the ECG shirt. (Are you getting all this, ThinkGeek?)

You can read more about the research by checking out the paper that recently appeared in the Journal of Renewable and Sustainable Energy.


Sunday, December 20, 2009

Spiral Control

Researchers at the University of Tokyo have found that they can change sea snails from lefties to righties by nudging early embryonic cells with a glass rod. The shells of Lymnaea stagnalis curve either to the left or the right, a trait determined by genetics and which begins to show in very early development. It appears that by simply nudging the snails in the right way, we can change this.

Photo: The snail shells can curve to the left or right. Glass rods prod the embryos. Large photo: Kuroda lab. Inset: B. Endo.

Whether a snail is a lefty or a righty is often described as handedness, or more accurately, chirality. The mirror image of a chiral object is not the same as the original. So, while handedness sounds like an odd word, it is used because one of the most common examples of chirality is our hands. The mirror image of your right hand is actually your left. You can use your right hand to shake someone else's right hand, but not the mirror image of a right hand (a left hand). The trip through the mirror changes things.

So with the snail shells, the results are up for debate. Have the researchers really done anything incredible? Maybe not. But they did choose to use baby animals in their work, and that almost guarantees a news story (even when it's baby snails). But ultimately, the researchers haven't quite determined anything concrete about biological chirality. They did go out exploring and found something interesting; hopefully it will lead to some more definite conclusions. What people really want to find is the gene that causes handedness, and that remains a mystery.

But the notion of messing with chirality is interesting. In the particle realm the definition of chirality is a bit more complicated, but a particle's spin relative to its direction of motion can be used to define its chirality. Spin aligned with or against the motion - right-handed or left-handed. For most particles, physicists seem to have chirality mapped. Did you know there are right-handed and left-handed photons? They are interchangeable, so even though they have different chirality, they can perform the same functions and appear in equal amounts. Electrons are also ambidextrous - you can find right- and left-handed electrons. It's the neutrinos that cause the trouble.

Ah, neutrinos. Those odd little particles that took us so long to discover and yet rain down on us all the time. They pass right through matter, through the layers of the Earth, through our bodies, and through massive particle detectors with almost no interaction. Thankfully for particle physicists, neutrinos do, very rarely, collide with regular matter, and with large enough, sensitive enough machines that we put underground to block out false alarms, we can detect them. In the 1950s, scientists studying beta decay found that neutrinos are left-handed. This sent quite a shock wave through the physics community because it was the first time that such one-sided chirality had been observed in the fundamental particles. Did this mean there was a law governing chirality? What made neutrinos behave this way? It was later discovered that the neutrino's antiparticle, the antineutrino, is right-handed.

The antineutrino is still a bit of an enigma of its own. Even though these particles have been observed, scientists aren't sure how they fit into the standard model of the universe. The positron is the antiparticle of the electron, with the same mass but opposite charge. The antiproton has the same mass as a proton but different charge. The antineutron has no charge, just like the neutron, but has other opposite properties: it's made up of antiquarks, while a regular neutron is made up of regular quarks. The neutrino isn't made up of quarks and has no charge. So there lurks the possibility that the neutrino and the antineutrino are the same particle, and that the neutrino is both right-handed and left-handed, but that each chirality arises in different circumstances. There are experiments going on right now to determine whether or not the neutrino is its own antiparticle. There is also the possibility that there are right-handed neutrinos and left-handed antineutrinos lurking somewhere in the universe. Confusing, but important to point out. The neutrino remains not fully understood, and chirality is one big piece of the puzzle.

So coming back to the snails - is there a connection between particle chirality and biological chirality? Could these experiments with snails help us understand particle chirality? Well, it's hard to say. Mostly I think the consensus is no. Particle chirality is a whole different beast. You've got quantum mechanics making things complicated...it's a bit of a mess. But what is interesting is that right now biologists are trying to find a gene that causes chirality while neutrino physicists want to know if the neutrino and antineutrino are the same particle. It might be possible - might - that a discovery in one field could aid understanding in another.

I find chirality rather spooky, actually. In the same way that I find all fundamental rules of the physical world a little eerie, but more than a little exciting. Down at the very smallest level, where we begin to see the fundamental workings of the universe exposed, I always feel as though we are approaching the veil. We are tearing away the layers and getting closer to...what? Is the universe an onion with nothing beneath the layers? Of course, that's the wrong metaphor. Even if we don't figure out what's behind the veil, watching those rules play out from the subatomic level upwards, just watching our universe act itself out, is much more interesting than an onion.



Thursday, December 17, 2009

Light Up the Holiday Science Round Up

I don't know about you, but I love this time of year - the snow, the cold, the decorations, the songs. Plus I don't have to do a lot of shopping (being poor has its benefits!) so I think I get to enjoy it a bit more than some.

It's also a time of year to nerd-out, as our other bloggers have shown. In addition to books and t-shirts for that special geek in your life, try sending out Hubble Holiday cards featuring images from the newly revamped Hubble Space Telescope. And what better subject for a Christmas card? I mean, have you ever seen a more tremendous lighting display than this?

That's one of the newest images to come from Hubble since it was equipped with the Wide Field Camera 3 in May of this year. By August, the telescope was sending back the deepest images ever taken in the near-infrared range. No one has ever seen this far out into the Universe at these wavelengths before. What is stunning about this image is, first, how many galaxies and stars appear (though most are galaxies, as individual stars at this distance would be very difficult to see) and, second, how small a portion of the sky it covers - only 1/15 the area of the full moon! So multiply this image by a few million and you'd have a full view of the near-infrared sky.

Another great thing about this time of year is of course the cooking. Fresh turkey baked right has long been considered an art form in my family, but maybe it's really a science. Former director and founding member of SLAC National Accelerator Laboratory W.K.H. "Pief" Panofsky, quite an amazing guy in many respects, found the old rule that one should cook a turkey "30 minutes per pound" insufficient (although really I think every cookbook has its own specifics on that). Anyway, Pief developed his own equation for turkey cooking:
t = W^(2/3)/1.5
where t is the time to cook the turkey in hours and W is its weight in pounds. The constant of 1.5 was apparently determined empirically.
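
The formula is simple enough to script. As a sketch (assuming, as the usual telling of the story goes, that W is in pounds and t in hours):

```python
# Panofsky's turkey formula: cooking time t = W^(2/3) / 1.5.
# W is the turkey's weight in pounds and t is in hours - the units
# assumed in the usual telling of the story.
def cook_time_hours(weight_lbs):
    """Cooking time in hours for a turkey of the given weight in pounds."""
    return weight_lbs ** (2.0 / 3.0) / 1.5

for w in (8, 12, 16, 20):
    print(f"{w:2d} lb turkey: {cook_time_hours(w):.1f} hours")
```

Note that the two-thirds power grows more slowly than the linear "30 minutes per pound" rule, so for big birds Panofsky's equation calls for considerably less time in the oven.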


Swinging back to Christmas lighting - if you are in the Baltimore area this season, be sure to stop by the 700 block of 34th St., between Keswick Rd. and Chestnut St. in Hampden. It is the holiday capital of cheesy Christmas lighting decorations. I grew up in the Western US and for some reason, plastic light-up yard ornaments never caught on the way they did out East. But whatever your taste for your own yard, this street will make you want to go home and dress the house up Griswold style. There are also some artists who make Christmas decor out of such items as bike wheels and hubcaps.

But even Clark Griswold can't stand up to the guy who not only timed his Christmas light display to rock and roll songs, but also hooked it up to a Wii to make the holiday rocking interactive.

If you do get that kind of seasonal motivation, you could try making this the year that you switch to LED Christmas lights. Apparently they've been flying off the shelves. LEDs promise around 50,000 hours of burn time, which would last an average household more than 15 years, and they use up to 90 percent less energy than traditional lights. Plus, for the benefit of Christmas trees, LEDs don't heat up like traditional bulbs, reducing the risk of fire. Although, as shown in some controlled and fascinating studies by the National Institute of Standards and Technology, most Christmas tree fires can be avoided simply by keeping your tree watered. Here's a video of what it took to get a non-watered tree to start burning. Though it's a bit of a downer to think of such holiday accidents, watching how quickly these things go up in flames is fascinating, and will hopefully serve as a reminder to keep yours watered.


That said, while LEDs' low heat output may be desirable for reducing tree fire risk, it has also caused problems for traffic lights, because the LEDs don't melt snow. Thus, during storms they become obscured, and this can cause accidents. If you see an obscured traffic light, treat it as a stop sign.


For those who prefer a different kind of holiday light and want them to be more energy efficient, here's a wind-powered Menorah. Students of Yeshiva University realized that the wind tunnels dividing buildings on campus could be used to power some sort of device. 'But what?' they wondered. Nice choice.

From the NYTimes blog article:

Mr. Stauber said he saw symbolism in the project. “In the miracle of the menorah, they got back to the temple and there was only enough oil for one night, but they made it last eight days,” he said. “I see an analogy with the world’s fight for sustainable energy, to take that and make it last as long as we’re going to need it.”
Oh, and the world's tiniest snowman:

For that you can thank the UK's National Physical Laboratory. The snowman is 10 micrometers across, or 1/5 the width of a human hair.

Happy Holidays!


Year's Best Gift Could Be A Job From Santa


Outsourcing the delivery of Christmas presents to mere mortals would provide a jolt to the staggering global economy.

In this year's myriad discussions of stimulus and jobs programs, no one has yet publicly raised the idea of asking Santa Claus to take Christmas Eve off. Outsourcing his job by asking mere mortals to deliver presents to the world's children could provide the jolt required to right a staggering economy.

Claus could not be reached to comment on this story. Independent researchers have yet to develop a reasonable understanding of the techniques that allow him to travel the globe, delivering packages to hundreds of millions of residences over the course of a single evening.

NORAD -- the military organization responsible for the aerospace and maritime defense of the United States and Canada -- tracks his sleigh on radar, and speculators have attributed his swiftness to everything from special reindeer feed to relativistic physics.

Because little hard data is available to explain Claus' techniques, distributing the goods would have to depend on familiar vehicles such as delivery trucks. The mathematical techniques of operations research, a discipline that tackles complicated problems like scheduling commercial aircraft and improving productivity, would have to be the secret weapon. The job is large, researchers said, but not insurmountable.

According to the U.S. Census Bureau, the number of children in the world age 14 and younger is 1.8 billion. Assuming two kids per household, 900 million residences would require deliveries. Presents can be delivered only when kids are asleep, leaving a window of no more than 10 hours, and probably fewer -- to allow for those who would sneak out of bed to check under the tree. Accounting for time zones, Claus has fewer than 36 hours to deliver all these presents. It's a flabbergasting, improbable 7,000 deliveries per second.
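Those numbers are easy to verify with a quick back-of-the-envelope script, using only the article's own figures (1.8 billion children, two per household, and the generous 36-hour window):

```python
# Back-of-the-envelope check of the delivery rate quoted above,
# using only the figures from the article.
children = 1_800_000_000           # kids age 14 and under, per Census Bureau
households = children // 2         # two kids per household -> 900 million stops
window_hours = 36                  # sleep windows staggered across time zones
per_second = households / (window_hours * 3600)
print(f"{per_second:.0f} deliveries per second")  # about 7,000
```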

Fact sheets from major delivery agencies suggest that replacing Claus' efforts would be a huge job. UPS Inc. reports serving 7.9 million customers daily; FedEx lists an average daily volume of 7.5 million shipments; and even the Postal Service, which has the advantage of stops clustered in a single country, never matches the volume of Claus' delivery effort, even on the busiest mailing day of the year.

"The key is having the right amount of everything, at the right time," said Warren Powell, an operations researcher at Princeton University in N.J. He suggested that the shipping methods would greatly impact the speed and cost -- air delivery is quick and costly, ships are slow and cheaper.

"If you can start making toys in the spring and summer, you can start moving them late summer and fall using fairly low-cost options," said Powell.

Claus' sleigh may be able to carry all the presents at once, without reloading, which makes one factory at the North Pole sensible. For his replacements, it would be much more efficient to spread toy production across many locations and to move the toys ahead of time to local warehouses, where delivery drivers load up for their Christmas Eve trips.

"For us the place to start is: how many people do I need to do the job?" said Jack Levis, the Director of Process Management at UPS. He's responsible for the systems that assign deliveries to drivers, suggest the routes they take and other analytical projects.

Levis' job often involves making the plans to hire personnel and divide up work. "It's how much work do you have to do? Where is it? How many people do I need in each area?"

On average, drivers deliver 200 packages per day, said Levis. If drivers, motivated by Christmas cheer, increase their productivity and each visit 200 households, delivering presents for each child, 4.5 million drivers could visit the 900 million households around the world in a single night.
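The workforce estimate checks out the same way; a one-liner using the article's figures:

```python
# How many drivers cover 900 million households at 200 stops apiece?
households = 900_000_000
stops_per_driver = 200             # an ambitious single-night route
drivers = households // stops_per_driver
print(f"{drivers:,} drivers")      # 4,500,000 drivers
```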

Because toy production has not yet been decentralized and is believed to occur at the North Pole, the effort would be even larger. In addition to delivery drivers, the effort would also require feeder drivers (to bring the presents from the North Pole to regional distribution centers), sorters (to sort the packages at distribution centers and assign them to each delivery driver), and logistical planners.

Many of these workers would start working long before Christmas Eve to prepare for the big day. Additionally, some cultures share holiday gifts on multiple days, or on days other than Dec. 25, allowing drivers to work multiple days. Launching this jobs program could really put a dent in the unemployment problem.

Santa, when's the last time you took a vacation?

-Chris Gorski
Inside Science News Service

Read the rest of the post . . .

Tuesday, December 15, 2009

'Tis the season for Geek Gifts

Hanukkah ends this Friday, which means you have to come up with (counting tonight) four more nights' worth of terrific presents to bestow upon your favorite nerd. In case you're running out of ideas in a hurry, here are a few on my geeky wish list.

Nerdy Shirts from Threadless.com

Threadless is my all-time favorite source for t-shirts. (Besides the local Goodwill, that is.) Artists (many of them amateurs) submit designs, then potential buyers vote on the ones they'd like to see made into T-shirts. Threadless prints a limited number of each design, safeguarding your independent style from cramping. Combining the geek uniform with social networking, Threadless is the ultimate venue for nerdy shirts.

For the cable-wielding, wire-soldering AV nerd, or the person whose back, you suspect, looks like this anyways:

Audio-Visual Nerd shirt from Threadless.com

For the person who asks you to fix their blender because you're majoring in physics:

Dear Scientists: This was supposed to be the future. Where is my nuclear-powered levitating house? (From Threadless.com)

For the astronomy enthusiast. Don't forget—the first solar eclipse of 2010 is on January 15, and will be visible from central Africa, the Indian Ocean, and eastern Asia!

What really happens during a solar eclipse, from Threadless.com

Books, books, books

Brother and sister astronomy team William and Caroline Herschel, polishing a huge telescope mirror. The quirky Herschels are two of the main characters in Richard Holmes's The Age of Wonder
If you love dry-humored footnotes, Romantic poetry, ribald tales and horse dung, you'll love The Age of Wonder. Historian Richard Holmes effortlessly interleaves journal entries, letters, and original research to tell the story of science after the age of Newton. It starts with a fascinating account of the young botanist and anthropologist Joseph Banks's expedition to unspoiled Tahiti, and by chapter three you're reading about fearless aeronauts and the battle to conquer the skies with hydrogen and hot air. The Age of Wonder is definitely what you'd call a tome, but the writing is so lively and the stories so engaging that you'll have a hard time putting it down. Don't believe me? Read an excerpt here.

Science made Stupid by Tom Weller. In case you missed this classic of the 1980s the first time around, you need to buy this book. Or at least visit this online reproduction. It will teach you things you never knew about biology, paleontology, physics, and chemistry. Most importantly of all, you will laugh a lot, and not really know why.

The Standard Model, according to Tom Weller.

Relativity explained.

Just for fun

The amazing Violet Ray cures all that ails you!

The one, the only, the Violet Ray. Not only does it deliver a very mild shock, all you need now to complete your career as a traveling quack is a gypsy wagon, some snake oil, and a bearded lady to collect the nickel admittance fee to your very own depression-era medicine show.

Read the rest of the post . . .

Monday, December 14, 2009

Dark matters



The latest rumors about forthcoming hot results from the Cryogenic Dark Matter Search at the Soudan Mine in Minnesota only confirm what I've always suspected: Physicists! They're just like us! They love rumors!

Into this wild Abyss
The womb of Nature, and perhaps her grave—
Of neither sea, nor shore, nor air, nor fire,
But all these in their pregnant causes mixed
Confusedly, and which thus must ever fight,
Unless the Almighty Maker them ordain
His dark materials to create more worlds,—
Into this wild Abyss the wary Fiend
Stood on the brink of Hell and looked a while,
Pondering his voyage; for no narrow firth
He had to cross.
—John Milton, Paradise Lost
In many other ways, physicists are not just like us. To get to work, CDMS physicists descend 2,341 feet via a rickety metal cage elevator to the 27th level of an abandoned iron ore mine. Once there, they monitor frozen towers of hockey puck-sized germanium crystals for the incredibly gentle rattle of a passing dark matter particle. The dress code is bunny suits, hairnets and slippers, and the cement walls of the "office" are spattered with the dehydrated remains of dead bats.

But when rumors circulated last week that CDMS had a paper coming out in the December 18th issue of the journal Nature, the physics blogosphere went mad in a way that reminded me endearingly of teenage girls pouncing on the latest US Weekly photo of Robert Pattinson holding hands with someone.

Nature is an extremely selective journal and only publishes physics research if it's likely to impact the field in a big way. Thus, wrote Ted Bunn, a physicist/blogger at the University of Richmond, the paper must be discussing big results. And for CDMS, that could only mean one thing: detecting dark matter, that mysterious stuff that physicists reason must make up about 85 percent of the matter in our universe in order for the observed dynamics of stars and galaxies to make any sense within accepted theories of gravity. It's invisible to our telescopes, be they infrared or gamma-ray, since it neither emits nor interacts with electromagnetic radiation. But sensitive detectors should be able to feel the barely tangible rumble as it passes, provided every other source of noise is screened out. Bunn wrote:

"Nature usually only publishes high-profile results. If CDMS had a non-detection to report (even if it set a new and interesting upper limit), Nature would be less likely to accept it. Nature articles are embargoed until publication, meaning that the collaboration can’t release the results or talk about them until December 18. Members of the collaboration have canceled seminars before that date and scheduled talks at a number of universities to take place on that date."
Adam Falkowski, a post-doc at Rutgers, wrote in his blog Resonaances that CDMS's last analysis, published in 2008 and covering data from 2006-2007, turned up no dark matter particles, or WIMPs (weakly interacting massive particles). "By now CDMS must have acquired four times more data. The new data set was supposed to be unblinded some time last autumn, and the new improved limits should have been published by now. They were not," Falkowski wrote. He then pointed to the Nature rumor and the fact that seminars had been canceled, likely due to the embargo. New Scientist wrote a blog post on the buzz, and ScienceNOW's Adrian Cho pointed out that even if the rumor was true it would take much more than a few signals to make a case that dark matter had been detected.

In fact, in 1998 a group of physicists working deep inside Gran Sasso mountain in Italy claimed they'd seen the flash of passing WIMPs in 220-odd pounds of sodium iodide crystals. Although they couldn't be sure that the flashes they detected were due to WIMPs and not other particulate culprits, they saw the number of flashes rise and fall over the course of the year, consistent with how the Earth moves through the vast cloud of dark matter permeating our galaxy. Wired UK reported in August:
The Sun moves round the Milky Way at about 485,000mph, dragging its orbiting clutch of planets along with it. The Earth has its own motion circling the Sun each year as well. The upshot is that the Earth's total motion through space varies over the year, being about 30 per cent faster in June than December.

They even saw the signal in an upgraded version of the experiment, now 550lbs of sodium iodide. But neither CDMS nor its arch rival, XENON100 (as their name suggests, they look for signals in vats of ultra-pure liquid xenon), saw so much as a whisper. What's more, writes mathematician Marcus du Sautoy in the Times, the team has not made it easy to reproduce their potentially groundbreaking results:

Science depends on being able to reproduce experiments, and currently no one has been able to repeat the Italian team’s claim. Not through want of trying but because, according to the journal Nature, the only company that makes pure enough sodium iodide crystals, Saint Gobain in Paris, has signed an intellectual property agreement with the Italian team and is therefore unable to supply the crystals to anyone else.
A close-up of one of CDMS's crystal germanium detectors. Credit: Fermilab.
And the latest is that CDMS is unlikely to release any game-changing data this week. The physical sciences editor of Nature wrote a rumor-quashing email to Falkowski, which he promptly posted on Resonaances:

I was alerted to your blog of yesterday (you certainly don't make contacting you easy). Your "fact" #1, that Nature is about to publish a CDMS paper on dark matter, is completely false. This would be instantly obvious to the most casual observer because the purported date of publication is a Friday, and Nature is published on Thursdays. Your "fact" therefore contains as much truth as the average Fox News story, and I would be grateful if you would correct it immediately.

Symmetry Breaking published this statement from CDMS that says yes, analysis is coming, but not necessarily the kind that would make it into Nature:

The CDMS collaboration has completed the analysis of the final CDMS-II runs, which more than doubled the total data from all previous runs combined. The collaboration is working hard to complete the first scientific publication about these new results and plans to submit the manuscript to arXiv.org before the two primary CDMS talks scheduled for Thursday, Dec. 17, at Fermilab and at SLAC.

A wave of deflated sighs rippled throughout the blogosphere. But in the aforementioned WIRED UK article, folks from both CDMS and XENON100 expressed a strong belief that 2010 will be the year of the WIMP:

This year, both projects have scaled up their experiments to make them more sensitive to dark matter, which both hope to nail within 12 months. "Next year will be an exciting one," says Elena Aprile, team leader for XENON100. "Hopefully we'll have a first result by early 2010 - there's a lot of expectation as to what will happen."

So one of these days, hopefully soon, a rumor that dark matter has been found will actually prove true. Or we might have to reconsider everything we think we know about gravity.

Read the rest of the post . . .

Friday, December 11, 2009

Steampunk!

It's like a birthday gift that arrived a few centuries too late. The English mathematician Charles Babbage, who was born the day after Christmas in 1791, dreamed of calculating logarithms using a vast machine. Or daydreamed, at least; he wrote in his Passages from the Life of a Philosopher:

...I was sitting in the rooms of the Analytical Society, at Cambridge, my head leaning forward on the table in a kind of dreamy mood, with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep, called out, "Well, Babbage, what are you dreaming about?" to which I replied "I am thinking that all these tables" (pointing to the logarithms) "might be calculated by machinery."
Babbage won government support to work on a "difference" engine that could handle calculations that were valuable to navigators; nautical tables at the time were riddled with errors and could lead a ship into disaster. But after one seventh of the machine was built, and 17,000 pounds from government coffers and 6,000 pounds of Babbage's own fortune were poured into the project, the dream of the "difference engine" died. It was not resurrected for over a century, until the 1980s, when Doron Swade, then a curator at the London Science Museum, decided he would try to build one, this time based on a later design, the Difference Engine No. 2. It took 17 years to put together the difference engine, built entirely from materials that would have been available in Babbage's time. Forget the sleek, minimalist aesthetics of the MacBook. This is what a beautiful computer looks like:



NPR recently ran a wonderful story on a second version of the difference engine replica currently at the Computer History Museum in Mountain View, California (where Google lives). This video from the Museum shows the difference engine in action; it's operated entirely by hand crank, and mesmerizing to watch.



Instead of bothering with multiplication or division, the machine reduces any problem to a problem of "differences", or adding and subtracting; hence its name. The Computer History Museum writes: "Its 8,000 parts are equally split between the calculating section and the output apparatus. It weighs five tons and measures seven feet high, eleven feet long and is eighteen inches deep at its narrowest."
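The method of differences is simple enough to mimic in a few lines of code. Here's a toy Python illustration (my own sketch, nothing to do with the museum's replica): once a polynomial's initial value and finite differences are set up, every further table entry falls out of pure addition.

```python
# Babbage's method of differences: evaluate a polynomial table using
# only addition. Toy illustration, tabulating f(x) = x^2 + x + 41.

def difference_table(poly, degree, start=0):
    """Seed column: f(start) followed by its successive finite differences."""
    values = [poly(start + i) for i in range(degree + 1)]
    column = []
    while values:
        column.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return column

def tabulate(initial_diffs, steps):
    """Crank the engine: each step is pure addition, no multiplication."""
    diffs = list(initial_diffs)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

seed = difference_table(lambda x: x * x + x + 41, degree=2)
print(seed)               # [41, 2, 2]
print(tabulate(seed, 5))  # [41, 43, 47, 53, 61]
```

For a degree-2 polynomial the second differences are constant, so the hardware only ever needs to add three numbers per step, which is exactly why the design scaled to a hand-cranked machine.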

Left: One of Babbage's original designs. Top: The dream realized. (Science Museum.)

But building Babbage's ultimate invention, the Analytical Engine, is a whole other story. The Analytical Engine took the ideas of the Difference Engine to the next level. Reading from punch cards and printing out the answers, the Analytical Engine would have launched the Victorian information age. (I'm seeing the workings of a science fiction/alternate history novel.) The Computer History Museum writes:

The Analytical Engine features many essential principles found in the modern digital computer and its conception marks the transition from mechanized arithmetic to fully-fledged general purpose computation. Had the Engine been built, it would have dwarfed even the vast Difference Engine and cranking it by hand would have been beyond the strongest operator.

Although the Analytical Engine was never built, Ada Lovelace, the daughter of Lord Byron, wrote a program for it, making Lovelace the first programmer. Her code was never run, but in the 1970s a programming language, Ada, was named in her honor. She also realized that the Engine wasn't a mere calculator, but a computer in the modern sense:

Perhaps more importantly, the article contained statements by Ada that from a modern perspective are visionary. She speculated that the Engine 'might act upon other things besides number... the Engine might compose elaborate and scientific pieces of music of any degree of complexity or extent'. The idea of a machine that could manipulate symbols in accordance with rules and that number could represent entities other than quantity mark the fundamental transition from calculation to computation.

Although the Analytical Engine will probably never be built in the flesh, or the metal, as it were, Babbage fans have created an emulator for modern computers. You can find the source code, and a lot of information about the never-built Victorian computer, here.

Read the rest of the post . . .

Thursday, December 10, 2009

Reading by numbers


This news story from the BBC website almost sounds like a faux-academic fantasy written by Jorge Luis Borges: physicists at Umea University in Sweden, using statistical analysis on the works of three classic authors, conclude that every author has a unique linguistic fingerprint. The BBC writes:

The relationship between the number of words an author uses only once and the length of a work forms an identifier for them, they argue.

This happens to be the same team whose insights into traffic jams we highlighted on the blog earlier this year. The team seems fond of using methods from physics to make observations on systems you wouldn't normally think of as being in a physicist's realm, including fads and internet dating. This time they took the complete opuses of Thomas Hardy, Herman Melville, and D.H. Lawrence to statistical task. The paper is free to view here.

The BBC writes that the graph of the number of unique words versus the number of total words gives a curve that is unique to each author. The paper itself seems to do something slightly different: it graphs the number of different words, or the book's vocabulary, against the total number of words for texts of different sizes, defining unique as "different" rather than "only appearing once." Obviously, the more different words there are, the more words there will be that appear only once, so it's a minor quibble. The curves for Hardy (H), Melville (M) and Lawrence (L) look like this (just the first graph):



Here M is the total number of words, and N is the number of different words. Looks like Melville's vocabulary was wider than Hardy's. I wonder if that makes Melville a better read?
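You can reproduce the shape of such a curve for any text you have lying around. A minimal sketch (my own, not the Umea group's code) that tracks how the vocabulary N grows with the running word count M:

```python
# Track N (distinct words seen) as a function of M (total words read),
# the same quantity plotted against M in the paper's first figure.

def vocabulary_curve(words):
    """Return a list of (M, N) pairs as the text is read word by word."""
    seen = set()
    curve = []
    for m, word in enumerate(words, start=1):
        seen.add(word.lower().strip(".,;:!?\"'"))
        curve.append((m, len(seen)))
    return curve

sample = "call me ishmael some years ago never mind how long precisely".split()
print(vocabulary_curve(sample)[-1])  # (11, 11): every word so far is new
```

Run over a whole novel, the curve bends as words start repeating; how quickly it bends is the author-specific signature the paper describes.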

To analyze shorter texts, the researchers would excerpt works or look at short stories; to get longer texts, they would look at a few books together, finally using the entire oeuvre. For this reason, Hardy's relative paucity of words seems natural; all his novels were set in a fictional shire in the south of England he called Wessex, a pre-Industrial world of milkmaids and shepherds, wagons and plows, Christmas feasts and country fairs. His books are a delight because they are a world unto themselves; I imagine that repeating words throughout one novel or across several helped Hardy establish Wessex as a real place in the reader's imagination.

This visual curve just might be the embodiment of that ethereal, ineffable stuff that makes one author different from another: the Hardy-ness of Hardy, the Melville-ness of Melville. Disappointingly, the paper makes no mention of the most common words, least common words, or words that only appear once for each author. Thank goodness for Amazon's "Statistically Improbable Phrases," which they compute for books with the "Look Inside" feature: a computer algorithm's attempt to answer, in a couple of words, why one book is different from another. To generate a book's SIPs, Amazon compares its phrases to phrases in all the other books with this feature. The result? Literary criticism by numbers, in a way. Amazon explains: "For works of fiction, SIPs tend to be distinctive word combinations that often hint at important plot elements." The SIPs in Far from the Madding Crowd are "spring waggon" and "new shepherd," which do evoke the novel's pastoral setting. It doesn't always make sense, though; the SIP for The Return of the Native is "editorial emendation," which comes not from the novel's text but from this particular edition's hefty appendix.

I think Amazon means for SIPS to help potential buyers see at a glance whether they're likely to enjoy the book; SIPS like "new shepherd" and "spring wagon" certainly would send some people running in the opposite direction. But I wonder whether it's somehow more accurate than simply reading the first couple of pages of a book and seeing whether the plot hooks you, and whether, from this tentative sip, you like the flavor of the author's language.

Besides creating the fingerprint-curves for each author, the Umea group also found that an author's word-frequency distribution, or the probability of finding a word that appears K times in a text, was the same for a short story, novel, or several novels put together. If an excerpt from a novel looks the same as the novel, through this analysis, then the novel, and, by extension, every work the author has produced, can be seen as a mere excerpt of "an imaginary complete infinite corpus written by the same author," what the authors call the "metabook." Computer simulations revealed that Thomas Hardy's looks something like this:


But really, the concept should be intriguing to literary critics. The authors write:

The writing of a text can be described by a process where the author pulls a piece of text out of a large mother book (the meta book) and puts it down on paper. This meta book is an imaginary infinite book that gives a representation of the word-frequency characteristics of everything that a certain author could ever think of writing. This has nothing to do with semantics and the actual meaning of what is written, but rather concerns the extent of the vocabulary, the level and type of education, and the personal preferences of an author. The fact that people have such different backgrounds, together with the seemingly different behavior of the function N(M) for the different authors, opens up the speculation that everyone has a personal and unique meta book, in which case it can be seen as an author's fingerprint.
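The word-frequency distribution behind the metabook idea, the fraction of a text's distinct words that occur exactly K times, takes only a few lines to compute. A minimal sketch (mine, not the authors'):

```python
# P(K): the fraction of a text's distinct words that occur exactly K times.
from collections import Counter

def frequency_distribution(words):
    counts = Counter(w.lower() for w in words)  # word -> number of occurrences
    k_hist = Counter(counts.values())           # K -> number of words with count K
    total = sum(k_hist.values())                # number of distinct words
    return {k: n / total for k, n in sorted(k_hist.items())}

text = "the cat sat on the mat and the cat ran".split()
# Five of the seven distinct words occur once, "cat" twice, "the" three times.
print(frequency_distribution(text))
```

The paper's claim is that this distribution comes out the same whether you feed in a short story, a novel, or an author's entire oeuvre, which is what licenses treating each work as an excerpt of the metabook.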

They told the BBC that this concept could be used for literary sleuthing: "As their collection of fingerprints grows, Mr Bernhardsson said, they will try to identify the authors of anonymous works." Some people think the plays we attribute to Shakespeare weren't all written by the same man—what about scanning Shakespeare's plays to see if they all bear the same "fingerprint?"

Read the rest of the post . . .

Wednesday, December 09, 2009

Stephen Hawking in the oddest places

Hawking in Legos
When Stephen Hawking wrote the modern science classic A Brief History of Time, he transformed from an ordinary theoretical physicist into an all-purpose pop culture icon. He pops up in the oddest places, from the YouTube video that autotuned his voice and Carl Sagan's, to geek rapper MC Hawking, to…the debate about healthcare? I hadn't heard about that last appearance until I read the editorial in this month's Scientific American, written by another theoretical physicist, Lawrence Krauss (the "Physics of Star Trek" guy). Krauss opens his editorial with a mention of Hawking:

When I saw the statement repeated online that theoretical physicist Stephen Hawking of the University of Cambridge would be dead by now if he lived in the U.K. and had to depend on the National Health Service (he, of course, is alive and working in the U.K., where he always has), I reflected on something I had written a dozen years ago, in one of my first published commentaries:

"The increasingly blatant nature of the nonsense uttered with impunity in public discourse is chilling. Our democratic society is imperiled as much by this as any other single threat, regardless of whether the origins of the nonsense are religious fanaticism, simple ignorance or personal gain."



The Royal Society recently unveiled this painting of Hawking by Tai-Shan Schierenberg.
I didn't hear about this when it happened, but apparently an editorial condemning President Obama's plans for universal healthcare used Professor Hawking as an argument for privatized healthcare. The editorial, published in Investor's Business Daily, was later changed—you'll see why—but not before other websites reprinted the quote for all posterity:

People such as scientist Stephen Hawking wouldn't have a chance in the U.K., where the National Health Service would say the life of this brilliant man, because of his physical handicaps, is essentially worthless.
If you're scratching your head and thinking, "Wait. . ." don't worry, Stephen Hawking didn't transmogrify himself into an American while you weren't looking or something.

The editors later printed this correction:
Editor's Note: This version corrects the original editorial which implied that physicist Stephen Hawking, a professor at the University of Cambridge, did not live in the UK.

While a political commentator at the Washington Post had a good laugh at the hapless editorial's expense, the best response came from the Guardian's politics Diary.
We say his life is far from worthless, as they do at Addenbrooke's hospital, Cambridge, where Professor Hawking, who has motor neurone disease, was treated for chest problems in April. As indeed does he. "I wouldn't be here today if it were not for the NHS," he told us. "I have received a large amount of high-quality treatment without which I would not have survived." Something here is worthless. And it's not him.

Read the rest of the post . . .

Tuesday, December 08, 2009

When chemistry dunces bake


With the holidays approaching, I've had seasonal culinary delights insistently on the brain. I can tell because in my spare time I find myself googling "gingerbread" and pawing through mouth-watering photos of shortbread Santa Clauses, currant-studded scones, rum cakes, pumpkin breads, and chocolate soufflés.

But given my long history of pulling chewy breads, rock-hard cookies, and deflated cakes out of the oven, I know the best I can do is hope friends and relatives have been slaving away in their respective kitchens to produce enough sugary Christmas goodies to satisfy my cravings. I always assumed that these baking fiascos were due to an innate defect, located somewhere near my lack of rhythm and inability to enjoy science fiction movies, a clipped allele on the cooking chromosome, perhaps.

I know what you're thinking. Baking is easy! You just follow the directions. It's scientific, it's like chemistry—I've heard it all, believe me. But no matter how diligently I follow baking recipes, nothing ever seems to turn out, well, edible.

Well, if you've ever taken a chemistry class, you know that, no matter how hard you try to follow the directions to the letter, you don't always get the right result. And I was always that one poor kid who could never get the mixture to precipitate. My fortunes were similar in the kitchen; my mother would let me lend a hand with her famous Tres Leches cake, but she forbade me from so much as looking at the egg whites she whipped into the cake's heavenly meringue crust.

When you're carrying out an experiment, you have to think about details that may not be mentioned in your handy lab manual, and it certainly helps to understand the role that each chemical plays and why each step happens when and how it does. Having this knowledge by heart meant my chemistry teacher could execute each experiment swiftly and perfectly.

Apparently, the same sense applies to becoming a good baker (my chemistry teacher baked a mean rye bread, let me tell you). Shirley Corriher, a former research biochemist at Vanderbilt University, got her start in the kitchen burning scrambled eggs beyond all recognition. Later, when she ruined recipes while taking a cooking class, she impressed her teacher by being able to explain scientifically what had gone wrong. Red cabbage gone purple? Add vinegar to restore the acidity. Asparagus gone an unappetizing olive green? Overcooking broke the veggie's cell walls. Soon her teacher and chefs and bakers all over the southeastern US were calling her with their questions; eventually, they convinced her to teach food science classes, and she's now published two cooking chemistry "textbooks," Cookwise and Bakewise.



Her scientific knowhow even helped her take on a truly odd cooking challenge presented by late-night TV show host Jimmy Kimmel, which she hilariously retells here:

They flew me to LA, and told me they wanted me to make a batter and fry some things. I said "Okay, I can do that." But when I got to the studio, the producer told me what I was going to fry: a whole Poptart, chocolate bunnies, a whole low-fat sub, a bunch of grapes, slices of pizza, Ping-Pong balls, a football, and finally a wrist watch. And Snoop Doggy-Dogg was going to be the co-host.

Thank goodness I had my book, CookWise, with me, and let me tell you, that's a great book. I rushed right to the batter section, and it said thick batters adhere better; cold batters adhere better; batters with eggs are stronger. So I made the thickest, coldest batter ever seen. The producer said things had to brown instantly, and with my technical knowledge I knew that if I added corn syrup to the batter things would brown in a flash.

They had an audience of 800 people, and when they started filming Snoop and Jimmy and I fried up things fast and furious. That batter stuck like a dream, even to those slick Ping-Pong balls and to the football. And it was delicious—Jimmy even ate the batter off the Ping-Pong ball. I can't tell you how relieved I was. We were funny and having fun, and it was a grand time. A little technical knowledge got me through that.

Her knowledge also makes her the Oracle of Delphi for cookie companies seeking to get the perfect texture. Butter and sugar both affect how gluten, a protein, forms from the flour, which is why things can fall apart if you mix everything together at once. Mix the flour and water together first, she says, to get your gluten. "You can get just about any texture you want in a cookie—it can be tough enough to stand on, or so tender it just barely holds together. It's all in how you control the formation of the gluten," she says. Ah, the miracle of science.

Back to me, the hapless baker. Am I doomed to produce sub-par cookies and sweets for all eternity, or is there a scientific explanation for the catastrophes?

In her chapter on soufflés in Bakewise, Corriher presents a number of reasons why the limp meringues my mother sometimes faced were not at all due to my "looking at them." Maybe I hadn't washed the bowl or beaters enough, and a tiny morsel of grease prevented the proteins in the white from coagulating. A whole egg stays liquid when beaten, thanks to the fats in the yolk.

Or maybe it was that my mom doesn't own a copper bowl, which Corriher suggests using. Yes, even the bowl you mix things in can make a difference. According to about.com's chemistry page:

When you whisk egg whites in a copper bowl, some copper ions migrate from the bowl into the egg whites. The copper ions form a yellow complex with one of the proteins in eggs, conalbumin. The conalbumin-copper complex is more stable than the conalbumin alone, so egg whites whipped in a copper bowl are less likely to denature (unfold).

When you whisk air into egg whites, you denature (unfold) the proteins. But you don't want to take this process too far, or you get a clumpy mess that (thanks, entropy) can never be whipped back into a lovely foam. Copper, apparently, gives you a bit of a safety net.

What about my deflated cakes? I remember baking a very sad birthday cake that cooked but didn't rise. Incredibly, a detail as tiny as which baking powder I used could have been the culprit. While baking soda reacts immediately, baking powder usually makes bubbles twice—once when mixed and once when heated. But this all depends on the acids in the baking powder. If a baking powder happens to release all its bubbles in the first stage, when mixed, you'll lose out on most of your leavening if you don't act fast. Considering how slowly I go about things, this could very well be the explanation.



If it's mathematical beauty you're looking for in your cooking, why not try a topological treat? No baking required—by slicing a bagel in just the right way, you can produce two linked halves. Here's how. Maybe I'll save the chemistry in the kitchen for when I'm feeling more ambitious, and just invest in a high-quality New York bagel and shmear.

Read the rest of the post . . .

Monday, December 07, 2009

Eyes and ears on Copenhagen


A giant globe inside Copenhagen's Bella Conference Center—perhaps to remind delegates to the United Nations Climate Change Conference of the weight of their grave responsibility.

Here's the one piece of science news you can't avoid hearing: over the next two weeks in Copenhagen, Denmark, United Nations delegates from 192 countries will mastermind a global plan to stop climate change in its tracks. Or, if you're a pessimist, international delegates will squabble and point fingers, eventually failing to instigate any sort of meaningful action. Or, if you're a climate change contrarian (denier might be the less-polite word), they're using pseudoscience to monger fear as part of the global liberal conspiracy to murder capitalism and usher in Big Brother and totalitarianism.

Besides being hailed as a turning point in climate change policy, Copenhagen is historically important just in terms of how it's being covered. The BBC sent 35 correspondents to the conference; newspapers, television channels, interest groups, NGOs, charities, churches, carbon traders, and fossil fuel companies sent the other 4,965 people who are writing about, blogging, and tweeting the meeting. You don't have to wait for the morning paper; you can read blogs updated every thirty minutes or so and watch videos of speeches as they're given. With so much information, raw and processed, flooding out of Copenhagen's Bella Conference Center, what on earth should you read?

If you'd rather absorb the raw footage and maybe write a few blog posts of your own, check out the conference's official website for live and on demand event video. The next level up might be the Guardian's live blog from Copenhagen, updated every quarter hour or so with quotes from the conference (and the online buzz around it), photos, and videos. Expect a general sense of the activity without much interpretation. For info that's a bit more pre-digested, try BBC News environment correspondent Richard Black's blog.

Flummoxed by the big issues at stake? Try the Times' bullet-point briefing, which helpfully boils the conference's central goal down to a few dire numbers:
The Copenhagen summit aims to limit average global temperature rises to no more than 2C by stabilising atmospheric concentrations of carbon dioxide at or below 450 parts per million. With no action, they will rise by 6C by 2100.

...hopefully in time to save poor Bjorn from melting:


Outside the conference center, an ice bear's copper skeleton reveals itself as the sculpture melts.

If you're thinking, "Wait a minute, back up! What's the context here?" try the New York Times' Copenhagen 101 video and climate change science and policy timeline. It might surprise you to learn that early-19th-century physicist and mathematician Joseph Fourier was the first to theorize the greenhouse effect, or that a conference on "Global Causes of Climate Change" brought scientists together as early as 1965. It might not surprise you that most of the Americans caught off guard in Times Square by reporter Tom Zeller had no idea the conference was going on in the first place.

With so many bloggers and news organizations covering the conference, chances are you'll find a kindred spirit no matter what your climate change stripe. Laugh at all the silly, science-worshiping tree-huggers with National Review Online's Planet Gore, or season your lack of faith in the human race with a pinch of Grist's wry humor. Understandably cautious about weighing in on scientific ideas you don't fully understand? Here's one biochemist's common-sense guide (no math required). Finally, a real, live physicist weighs in on ClimateGate (cringe), why snarky emails aren't so outlandish even in the sober realm of professional science, and how the fiasco shows it's high time scientists learned to communicate with the public.

Read the rest of the post . . .

Friday, December 04, 2009

Dance your physics

If you want to learn about science, you can pick from a wide variety of media. If you're a student, you read a textbook; if you're a scientist trying to keep up with the latest research, you read journal articles and attend conferences. If you're an interested layman, you pick up popular science books and magazines, browse the Web, and watch NOVA. And if none of that does it for you, well, there's always interpretive dance.



The above video is the work of Theatre Adhoc, a Dutch performance art group. Although it's hard to tell (especially if you don't speak Dutch), the movements portray the work of 17th-century Dutch physicist Christiaan Huygens, who discovered the phases and changes in the shape of the rings of Saturn, patented the first pendulum clock, and developed an early wave theory of light. The performance was in honor of the new Huygens building at Radboud University Nijmegen in the Netherlands.


Here's another interpretive dance project that needs no translation—"Dance Your PhD" was initiated last year, and features PhD students, postdocs, and professors who perform their scientific papers as dance. Does this look like "A spectroscopic study of the Blazhko effect in the pulsating star RR Lyrae" to you?



This year, "Dance Your PhD" went pro: professors teamed up with professional choreographers and dancers to portray four scientific papers through modern dance. Viewers were then asked to match each dance to an abstract. Science magazine, which funded the project, found that, "overall, people guessed far more accurately than random. The 341 people who took part made an average of 1.86 out of 4 possible correct matches between the dances and research articles. That's far more accurate than the expected 1 out of 4 correct if everyone guessed randomly. So the rigorously tested answer is yes, dance really can encode science." You can try the experiment yourself by watching all the dances here and trying to guess which one corresponds to "Mechanism of Force Generation of a Viral DNA Packaging Motor." The dances themselves have definitely evolved from last year's charmingly amateur gyrations.
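Why is the expected score under random guessing exactly 1 out of 4? By linearity of expectation, each dance has a 1-in-4 chance of being matched to its own abstract, and four such chances sum to 1—regardless of how many items there are. Here's a quick simulation sketch (the function name and parameters are my own, not from Science's methodology) that bears this out:

```python
import random

def expected_correct_matches(n_items=4, trials=100_000, seed=0):
    """Estimate the average number of correct matches when pairing
    n_items dances with n_items abstracts uniformly at random."""
    rng = random.Random(seed)
    items = list(range(n_items))
    total = 0
    for _ in range(trials):
        guess = items[:]
        rng.shuffle(guess)  # one random guess: a permutation of the abstracts
        # count fixed points, i.e. dances matched to their true abstract
        total += sum(1 for dance, abstract in enumerate(guess) if dance == abstract)
    return total / trials

print(expected_correct_matches())  # hovers around 1.0
```

So the survey's average of 1.86 correct matches is nearly double what chance alone would produce, which is why Science could call the result significant.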

This is Science: Jenn Liang Chaboud from Red Velvet Swing on Vimeo.


The latest project of the artists at Theatre Adhoc is a bit more mainstream: it's a documentary called Higgs: Into the Heart of Imagination. Theatre Adhoc artists Hannie van den Bergh and Jan van den Berg interviewed Higgs and a few of the many scientists whose careers have been inspired by the most well-known theorized particle of them all. From the preview, it looks like the directors have brought their avant-garde sensibilities to the film; the trailer includes beautiful shots of work being done inside the LHC tunnel, set to a minimalist, slightly ominous drumbeat. They wrote about their experiences filming at CERN on their website.



If you're tired of hearing what everyone else thinks of Higgs, why not hear from the man himself? For the last few months Higgs has been on tour, giving a talk titled, "My Life as a Boson." When he gave the presentation at Stockholm University this October, Higgs elicited laughter from his audience by opening with, "In a way the lecture is coming home to Stockholm because the title...was inspired by the title of a Swedish film of some years ago, My Life as a Dog."

The lecture answers that burning question "what is spontaneous symmetry breaking?" and leads you through the physics and experiences that led Higgs to write the 1964 paper that saddled him with such enduring name recognition. Along the way you'll get a few adorable jokes, such as his reminiscence of his days as "steward" of a physics summer school ("The job of steward was essentially to buy the wine and look after it and distribute it to all the people at dinner, and that I did with limited success") and a clear sense that Higgs is extremely modest about his contributions to particle physics. Clearly all the media attention hasn't gone to his head. If you can get past his quavering baritone (the man's 80 years old, so give him a break) and the hand-drawn transparencies, I can think of no better way to get acquainted with perhaps the most popular physicist alive today (sorry, Brian Greene).

Read the rest of the post . . .