
September 28, 2007

Safety Day 2007

Yesterday (9/27/2007) was Safety Day at Honeywell's Morristown site. I reported on last year's Safety Day in a previous article that contained an interesting Einstein anecdote. Safety Day has been an annual event at Morristown for about twenty years, and it's marked by guest lectures on safety and some free time to clean your laboratory. One of the day's speakers was Robert Siciliano [1], who talked about personal safety and internet security. Honeywell has an excellent safety record, and there is a deep safety culture at the Morristown site. This safety culture was established many years ago, since the Morristown Headquarters is the legacy location of the Allied Chemical corporate laboratories.

Because of the human psyche, there's a tiered structure to our safety focus as individuals.

1. Keeping your family safe is a person's prime focus. This is an instinctive trait, and our desire to protect our children is one reason why we've survived as a species.

2. Keeping yourself safe is a person's next important focus. This is also an instinctive trait. Its primary purpose is to keep us around until child-bearing age.

3. Keeping others safe is another human focus. Society expects us to help others when they are in distress whenever possible. This natural tendency has been protected in our litigious society by Good Samaritan laws that protect helpful individuals from legal problems if they render assistance and things go awry. In some cases, the Good Samaritan laws stipulate further that it's a requirement to give aid, although this may be as simple as dialing 9-1-1, or another emergency number. Altruism likely has its roots in human cooperation in our hunter-gatherer activities, and this too helped us survive as a species.

In a more abstract expression of keeping others safe, we keep our co-workers safe by anticipating things that could bring them into harm's way. We clean debris off the floor so our co-workers won't fall, and we train them about workplace hazards. We take care also to develop and manufacture safe products. In Honeywell's case, we even sell products, such as Traffic Collision Avoidance Systems and Ground Proximity Warning Systems, that assist in safety. Furthermore, we take necessary environmental precautions, such as safe disposal of hazardous materials, to keep the world safe for future generations.

1. Robert Siciliano's Web Site.
2. Occupational Safety and Health Administration.
3. Morristown Health, Safety and Environmental Web Site.

September 27, 2007


A convenient property of mixtures is that adding components will usually reduce the melting point. Although chemical effects are important, for mixtures of similar components this happens because the presence of more than one type of atom gives the solid mixture a higher entropy than the pure metals, which leads to a lower free energy for the solid at a given temperature. An example of this melting point reduction is the mixing of lead and tin to form conventional lead-tin solder. Lead has a melting point of 327.46 °C, and tin has a melting point of 231.93 °C, but the solder composition Pb37Sn63 melts at 183 °C. This property is not limited to metals; it occurs for other materials, as well. For example, silica is used as a sintering aid for ceramic materials, since combinations of silica with other oxides lead to low melting point compositions that bind the ceramic particles together. Mixtures of the "antifreeze" agent, ethylene glycol (C2H4(OH)2, a.k.a. monoethylene glycol and ethane-1,2-diol), and water have a lower melting point (or "freezing point," if you prefer) than either water (0 °C) or ethylene glycol (-12.9 °C) in its pure form. At the magic mixture of about 40% ethylene glycol and 60% water, the freezing point is reduced to -60 °C.
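As an illustration only, the freezing-point depression can be sketched with a piecewise-linear interpolation through the three values quoted above (pure water at 0 °C, the roughly 40/60 glycol/water mixture at -60 °C, pure ethylene glycol at -12.9 °C). The real freezing-point curve is not linear; this is just a sketch of the V-shaped depression around the minimum-freezing composition.

```python
import numpy as np

# Three points taken from the values quoted in the text above.
glycol_pct = np.array([0.0, 40.0, 100.0])  # weight percent ethylene glycol
freeze_c = np.array([0.0, -60.0, -12.9])   # freezing point, deg C

def freezing_point(pct_glycol):
    """Linearly interpolated freezing point for a glycol/water mixture."""
    return float(np.interp(pct_glycol, glycol_pct, freeze_c))

print(freezing_point(20))  # halfway to the minimum composition: -30.0
```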

Methanol was used as an antifreeze before the widespread adoption of ethylene glycol. Although mixtures of methanol and water do have a low freezing point, they also have a low heat capacity and a low boiling point. Furthermore, the methanol in methanol-water mixtures would be lost to evaporation, so the market was ready for the improved properties of ethylene glycol. Ethylene glycol was important to aviation history, since its mixtures with water also have a higher boiling point than pure water, which allowed higher engine coolant temperatures and smaller, lighter radiators. Antifreeze has been important to Honeywell for another reason - Prestone antifreeze has been a product of Honeywell's Consumer Products Group since 1927, and Honeywell is celebrating the eightieth anniversary of Prestone Antifreeze this year.

Ethylene glycol is toxic, but its close cousin, propylene glycol (C3H8O2, a.k.a. propane-1,2-diol), is much less toxic. It's even used as a food additive, although drinking any sort of antifreeze should not be considered safe. As mentioned earlier, a proper ratio of water and glycol is required to achieve the minimum freezing temperature, but water is one substance you would like to exclude from any system because of its corrosive effect. A particular mixture of ethylene glycol and propylene glycol results in a coolant with a boiling point of 188 °C and no water content.

Today, antifreeze is not just ethylene glycol. There are various additives that inhibit corrosion. The color of the antifreeze identifies the type of anti-corrosion agent used. Red or pink signifies organic acids, and blue or green signifies inorganic agents, such as silicates, borates, and phosphates. It's typically a good idea not to mix the colors, although some organic acids are claimed to be compatible with inorganic agents and have a green or yellow color.

1. Prestone Antifreeze Web Site.

September 26, 2007

Lucky Shot

In a previous article, I discussed a speech given by Steven Weinberg, in which he argued against manned spaceflight. Weinberg, who shared the 1979 Nobel Prize in Physics for work on the unification of the elementary forces of electromagnetism and the weak nuclear force, thinks that robotic space missions have a much higher scientific payback per dollar spent. Weinberg spoke at the Space Telescope Science Institute, the scientific center for the Hubble Space Telescope and the planned James Webb Space Telescope, so it's interesting to note that some recent research has found that you don't need a space telescope to get Hubble-quality images.

The rationale for a telescope in space is to eliminate imaging problems caused by atmospheric disturbances. We aren't just talking about clouds. Stars will still "twinkle" on a clear night because of variation in atmospheric refraction from turbulence, and this blurs images during the long exposures required to capture dim objects. One technique to counter atmospheric disturbance is adaptive optics, in which the movement of a nearby reference star is tracked. The telescope mirror is slightly deformed, or auxiliary optics in the light path are adjusted, to maintain focus and de-blur the image. A variation of this technique, used when a bright reference star is not in the field of view, is to generate a laser guide star - a laser beam excites sodium atoms at very high altitude to create an artificial star.

Astronomers at the University of Cambridge have now used a different technique, called "lucky imaging," to enhance the resolution of ground-based telescopes [1, 2]. Using a high-speed camera developed by E2V Technologies that can resolve single photons, researchers at the Cambridge Institute of Astronomy collaborated with astronomers from the Palomar Observatory on imaging experiments with the 200-inch Hale Telescope. The Hale Telescope normally produces images that are about a factor of ten less detailed than Hubble's. The idea behind "lucky imaging" is simple. Images are taken twenty times a second, a computer analyzes this large set to find the few sharp frames among them, and the sharp frames are combined to produce a high resolution image. As any Six Sigma practitioner will know, there will always be a chance alignment of turbulent cells that provides the needed resolution, if you are willing to wait that long. The computer software is not that complicated. It's claimed that the Palomar Lucky Camera produces images twice as sharp as those of Hubble at about 1/50,000th the cost.
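The select-and-stack step can be sketched in a few lines. This is a minimal sketch, not the Cambridge pipeline: using the peak pixel value as the sharpness score is a simplification of my own, and real pipelines use better metrics and re-register each frame before stacking.

```python
import numpy as np

def sharpness(frame):
    """A frame with a bright, concentrated speckle scores higher."""
    return frame.max()

def lucky_stack(frames, keep_fraction=0.01):
    """Average only the sharpest keep_fraction of the short-exposure frames."""
    ranked = sorted(frames, key=sharpness, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return np.mean(ranked[:n_keep], axis=0)
```

With the 1% keep fraction mentioned in the press coverage, a twenty-frame-per-second camera contributes roughly one frame to the stack every five seconds.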

Of course, when scientific information leaves the laboratory and enters the realm of the press release, distortions inevitably occur. Although the lucky images are impressive [3], Hubble still can do better. The Hubble astronomers were so angered by the claims that the Hubble had been surpassed that they immediately issued a rebuttal [4]. Although everyone admits that the lucky imaging technique yields an angular resolution of 50 milliarcseconds, and Hubble's resolution is about 100 milliarcseconds, one of the key differences is that 99% of the lucky images are discarded, and every Hubble image is a "keeper." Furthermore, the lucky camera will work only for bright images over large patches of sky, and not for the dim "deep sky" images for which Hubble is famous. Still, advances in technology like the lucky imaging technique will keep ground-based astronomers in the running in the Space Age.

1. Tom Kirk, "'Lucky Camera' takes sharpest ever images of stars" (University of Cambridge Press Release, September 3, 2007).
2. Scott Kardel, "Caltech Astronomers Obtain Sharpest-Ever Pictures of the Heavens" (California Institute of Technology Press Release, September 3, 2007).
3. Lucky Imaging photos appear here and here.
4. Geoff Brumfiel, "Is this the clearest picture of space ever taken? Claims of the 'sharpest' photos of space are a little fuzzy" (Nature News, September 7, 2007).

September 25, 2007

Einstein's Prediction

Physics rarely makes front-page news. After recounting all the daily disasters, there's not much space remaining for scientific discoveries, especially those of the mathematical, abstract variety. However, some important discoveries do make it into the first few pages of the New York Times. The New York Times has just opened some of its archives to free access after a two-year effort to make money from a subscription service. As any savvy internet user knows, information wants to be free; you just need to find a way to make "free" pay for itself, somewhat like commercial television. The New York Times had an online subscription service that attracted almost a quarter of a million subscribers at about fifty dollars per year. However, this figure pales in comparison to the estimated thirteen million unique visitors referred to its web site each month from search sites such as Google [1]. Industry observers uniformly believe that revenue from online subscription services will always be less than advertising revenue from open web sites. Although the entire Times archive will be open to print subscribers, everyone now has free access to articles from 1987 to the present, and from 1851 to 1922 [2]. This brings us back to our topic of physics in the news.

In 1915-16, Einstein published his Theory of General Relativity, which predicted that light would be deflected by massive bodies [3-4]. This prediction was presaged by his earlier work in 1911, but none of this made the news, since it was an unproven mathematical notion that few understood. As British physicist Arthur Eddington reportedly replied when a reporter asked whether it was true that only three people understood Relativity, "Oh, who's the third?" It was Eddington who finally put Einstein in the news and gave him celebrity status, with an observation that "proved" General Relativity.

Eddington decided that observations of stars near the eclipsed sun would show that light bends around a massive body, so he and his colleagues traveled to Príncipe, an island off the coast of West Africa near Equatorial Guinea, to photograph the solar eclipse of May 29, 1919. Eddington's team's observations [5] confirmed Einstein's theory, and General Relativity made the front page of The Times (London), and page six of the New York Times [6]. It's interesting that later analysis of Eddington's photographs showed that his positive result was about the same magnitude as the experimental uncertainty, so he should have used his Six Sigma toolkit. Further eclipse observations using modern technology have vindicated the theory, and there are other observations, as well, so we can be fairly certain that General Relativity is quite close to the truth.
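The size of the effect Eddington was chasing is easy to compute. General Relativity predicts that a light ray grazing the Sun's limb is deflected by 4GM/(c^2 R), twice the Newtonian value; the constants below are standard physical values, not figures from the post.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m
C = 2.998e8        # speed of light, m/s

deflection_rad = 4 * G * M_SUN / (C**2 * R_SUN)
deflection_arcsec = math.degrees(deflection_rad) * 3600

print(f"{deflection_arcsec:.2f} arcseconds")  # about 1.75 arcseconds
```

A deflection of well under two arcseconds, measured from photographic plates exposed during a few minutes of totality, explains why the result sat so close to the experimental uncertainty.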

1. Kelly Fiveash, "New York Times and Murdoch gunning for more clicks" (The Register, September 19, 2007).
2. Sumner Lemon, "The New York Times to stop charging for online content" (Computerworld, September 18, 2007).
3. Albert Einstein, "Die Feldgleichungen der Gravitation," Sitzungsberichte der Preussischen Akademie der Wissenschaften zu Berlin (1915), pp. 844-847.
4. Albert Einstein, "Die Grundlage der allgemeinen Relativitätstheorie," Annalen der Physik, vol. 49 (1916), pp. 769-822.
5. F. W. Dyson, A. S. Eddington and C. R. Davidson, "A Determination of the Deflection of Light by the Sun's Gravitational Field, from Observations Made at the Total Eclipse of May 29, 1919," Phil. Trans. R. Soc. London A, vol. 220 (1920), pp. 291-333.
6. New York Times story about Eddington's confirmation of General Relativity (November 9, 1919); PDF file: "ECLIPSE SHOWED GRAVITY VARIATION; Diversion of Light Rays Accepted as Affecting Newton's Principles. HAILED AS EPOCHMAKING. British Scientist Calls the Discovery One of the Greatest of Human Achievements."
7. Reproduction of Eddington's photograph of the 1919 solar eclipse.

September 24, 2007

Computer Poetry

When I was in high school, I read an interesting article in the Bell Labs Record, a magazine that published popular accounts of research at Bell Labs. The article was about using a computer to compose poetry automatically. I always remembered that article, and when I built my first computer in the late 1970s, one of the first things I did was write a poetry generator. Computers were primitive in those days, so the program was written in assembler. It used templates and dictionary databases of specific topic areas, in which words were categorized by part of speech (noun, adjective, verb, etc.) and by number of syllables. For the larger dictionaries, the program produced some acceptable poetry, but the rest of the output could be quite humorous. In 1983, two computer scientists wrote a more advanced program of this sort, Racter, and published a book entitled "The Policeman's Beard Is Half Constructed," based on its output [1]. As you can tell from the title, their program also produced some humorous output, and at the same time it produced things such as this:

"More than iron, more than lead, more than gold I need electricity.
I need it more than I need lamb or pork or lettuce or cucumber.
I need it for my dreams."
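A template-and-dictionary generator of the kind described above takes only a few lines in a modern language. The word lists and templates here are hypothetical examples of my own, not reconstructions of the original assembler program or of Racter.

```python
import random
import re

# Toy dictionary: words categorized by part of speech.
DICTIONARY = {
    "noun": ["electricity", "iron", "dream", "cucumber", "machine"],
    "adjective": ["silent", "golden", "restless", "electric"],
    "verb": ["needs", "devours", "remembers", "sings of"],
}

# Templates with part-of-speech placeholders.
TEMPLATES = [
    "The {adjective} {noun} {verb} the {noun}.",
    "More than {noun}, the {noun} {verb} {noun}.",
    "A {adjective} {noun} {verb} its {adjective} {noun}.",
]

def compose_line(rng):
    """Fill each placeholder in a random template with a random word."""
    template = rng.choice(TEMPLATES)
    return re.sub(r"\{(\w+)\}",
                  lambda m: rng.choice(DICTIONARY[m.group(1)]),
                  template)

def compose_poem(lines=3, seed=None):
    """Compose a short 'poem'; a seed makes the output repeatable."""
    rng = random.Random(seed)
    return "\n".join(compose_line(rng) for _ in range(lines))

print(compose_poem(seed=1))
```

As with the original, a bigger dictionary and more templates improve the odds of an acceptable line, while the rest of the output stays reliably humorous.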

Autonomous composition of poetry by computers has been dismissed by some as just another version of the Infinite Monkey Theorem, but there's another side to computer poetry. This is the use of the multimedia capability of computers to enhance the poetic experience. The state of the art of this field is summarized in the Ph.D. thesis [2, 3] of Maria Engberg. Historically, reading and recitation by a speaker were the only ways to access poetry. This is a long history, extending farther back than the Iliad, recited about 2,500 years ago and eventually transcribed. In the past few decades, audio recordings and the occasional television show became new media for poetic expression. Soon, instead of just a televised image of a speaker, film images and then computer graphic art were introduced to capture the viewer's interest. Now, all such things are possible on home computers, and this opens the way for more interesting interaction with the listener, such as tactile and kinetic effects. A love poem in which you feel someone stroking your hand? Perhaps. There are also "immersion" environments in which a person is surrounded by computer displays. All this may evolve into a Star Trek Holodeck device.

I can't write about poetry without a mention of Vogon poetry. The Vogons are an alien race in Douglas Adams' The Hitchhiker's Guide to the Galaxy [4]. Their poetry is so bad that they use it as a means of torture.

1. The Policeman's Beard Is Half Constructed (Amazon.com).
2. Johanna Blomqvist, "Computer poetry pushes the genre envelope" (Uppsala University Press Release, September 12, 2007).
3. Maria Engberg, "Born Digital: Writing Poetry in the Age of New Media" (Ph.D. Thesis, Uppsala University, 2007).
4. The Hitchhiker's Guide to the Galaxy on the Internet Movie Database.
5. Digital Poetry Group (Google Groups).

September 21, 2007

Composite Controversy

A few scientific debates have captured the public interest. The most recent of these is global warming, and there is fierce public and political debate about whether or not the world is becoming warmer because of human activities. A long-standing debate has centered on Darwinian evolution, and this debate has persisted since the time Darwin introduced the theory of evolution by natural selection nearly a hundred and fifty years ago. There have been a few cases in which materials science has become a public issue. One of these is Richard Feynman's observations on the reason for the loss of the space shuttle Challenger. Now, materials science is in the news again in a debate about whether composites are a safe material for construction of aircraft; specifically, the Boeing 787 Dreamliner.

Dan Rather, anchorman for CBS News for twenty-four years, and presently a featured reporter on the cable television channel HDNet, has sparked controversy with his September 18, 2007, report, "Plastic Planes," which investigates some concerns about composite materials in the Boeing 787 [1]. Among these are claims that composites do not absorb impact energy as well as traditional aluminum materials, that burning composites create highly toxic smoke, and that damage to composite materials is difficult to detect and repair [2]. Fifty percent of the Boeing 787 is constructed from composites [3].

Public reception of this report can be summarized in the title of an article on Wired News, a popular technology web site - "Dan Rather Makes Questionable Case Against Science Behind Boeing Dreamliner." [4] The first problem is the title of Rather's report. Of course, advanced composites are not simply "plastics," and they've been used for years in military aircraft, as well as sports cars and other vehicles. They do have different problems from aluminum, such as sensitivity to moisture, to chemicals like jet fuel and possibly de-icing fluids, and to UV radiation, but you can be certain that they are tested to maintain their properties throughout the anticipated lifetime of the aircraft. The 787 Dreamliner must be certified for air-worthiness by the FAA, and I would be surprised if Boeing employees, who are often passengers on their own aircraft, would not maintain the highest standards of safety.

Rather's report was generally balanced, since it included interviews with aviation experts, including pilots, who believe there is no fundamental problem with composite aircraft. Tom Hahn, editor of the Journal of Composite Materials and a Distinguished Professor of Mechanical and Aerospace Engineering at UCLA, is quoted in Wired News as follows [4]:

"I have full trust in Boeing's decision to use composites in 787. It is a timely decision based on our understanding of composites stemming from many years of analytical and experimental research."

Boeing has booked orders for nearly seven hundred 787 Dreamliners from forty-seven customers, and the first delivery, to All Nippon Airways, is anticipated for May, 2008. [5]

1. Dan Rather, "Plastic Planes" (HDNet, September 18, 2007).
2. Lester Haines, "787 unsafe, claims former Boeing engineer" (The Register, September 19, 2007).
3. Boeing 787 from the Ground Up, Aero, Quarter 4, 2006 (Boeing).
4. Aaron Rowe, "Dan Rather Makes Questionable Case Against Science Behind Boeing Dreamliner" (Wired News, September 19, 2007).
5. Dan Rather Expected to Report Boeing 787 Unsafe (Fox News, September 18, 2007).
6. Transcript: Dan Rather Reports, Episode Number: 231, Episode Title: Plastic Planes (MS Word Document).

September 20, 2007

Where No Man Should Go Again

In the introduction to the original Star Trek television series, Captain Kirk announces the starship's mission to "boldly go where no man has gone before." However, Steven Weinberg, who shared the 1979 Nobel Prize in Physics for work on the unification of the elementary forces of electromagnetism and the weak nuclear force, thinks we should abandon manned spaceflight. In a speech delivered at this week's Science Writers Workshop at the Space Telescope Science Institute, the scientific center for the Hubble Space Telescope, Weinberg argued that manned spaceflight has led to no scientific advances, and it's a waste of money. Says Weinberg,

"The International Space Station is an orbital turkey... No important science has come out of it... and I would go beyond that and say that the whole manned spaceflight program, which is so enormously expensive, has produced nothing of scientific value." [1]

It's quite apparent that manned spaceflight is dangerous. The dramatic flight of Apollo 13 in 1970 came just a few years after the loss of Apollo 1 and its three crewmen in a launch pad fire. The seven crewmembers of the Space Shuttle Challenger were killed shortly after launch in 1986. The Space Shuttle Columbia and its seven crewmembers were lost upon re-entry in 2003. These seventeen lives were lost even after considerable expenditure to make spaceflight safe for humans.

While saying that "Human beings don't serve any useful function in space," Weinberg praised the various robotic missions, such as the Mars Pathfinder and the Mars Exploration Rovers, which generate a huge amount of scientific information for just a small fraction of the budget of the manned spaceflight programs. Weinberg thinks that the public fascination with manned spaceflight is over. It's been eclipsed by the adventures of the Martian rovers. "I think our political leaders underestimate the intelligence of the public in thinking they won't be fascinated by real scientific discoveries."

Weinberg's subtext is the need to put more money into unmanned missions, such as the Joint Dark Energy Mission of NASA's Beyond Einstein initiative. This mission is designed to study dark energy, which is believed to drive the accelerating expansion of the universe. Presently, whenever there's a shortfall in NASA's budget, it's the unmanned projects that are cut to prop up the manned programs.

1. Ker Than, "Nobel Laureate Disses NASA's Manned Spaceflight" (Yahoo News, September 18, 2007).

September 19, 2007


Titanium, the twenty-second chemical element, is relatively abundant in the Earth's crust, which is about a half percent titanium by weight [1]. This is a tenth as plentiful as iron, an inexpensive metal used extensively in everything from refrigerators and automobiles to magnets. Titanium exists mostly in mineral deposits of rutile (titanium dioxide, TiO2) and ilmenite (iron titanium oxide, FeTiO3). It is extracted by the Kroll Process, in which the oxide is reacted with coke and chlorine at 1000 °C in a fluidized-bed reactor to form TiCl4, which is then reduced by molten magnesium to form liquid MgCl2 and solid titanium. The process is energy intensive and time consuming, so the present market value of titanium is more than $10,000 per metric ton [2].

A new process for the production of titanium was discovered at the University of Cambridge in 1996, and perfected in 1997. This process, called the FFC Cambridge Process after the inventors, Tom Farthing, Derek Fray and George Chen, is based on the small solubility of calcium in molten calcium chloride. A few percent calcium will dissolve in calcium chloride at high temperatures (3.9 mole percent at 900 °C), and the dissolved calcium will reduce titanium oxide in an electrolytic cell. The FFC Cambridge Process may provide titanium at a lower cost.

Of course, titanium is a major material component in aircraft, since it is lightweight and strong. It's becoming such a popular material that demand is increasing at a faster rate than supply, which is another factor in titanium's high price. The US used about 75,000 metric tons of titanium in 2006, a third of which was imported. Boeing, which incorporated nearly fifteen percent titanium into the design of its new 787 jetliner [3], has formed a joint venture with a Russian company, VSMPO-Avisma Corporation, to build a titanium plant in Russia. Titanium Metals Corporation (Timet) of Dallas, Texas, bills itself as the world's largest producer of titanium. Timet produces about 12,500 metric tons of titanium annually, but Allegheny Technologies of Pittsburgh, Pennsylvania, intends to ramp up its annual production from its current 6,000 metric tons to over 20,000 metric tons.

The US military, which is a prime user of titanium, has funded research programs for titanium production processes through DARPA at a cost of $20 million. The goal is to follow the path taken by aluminum, a previously expensive metal that's become a commodity item used in such mundane objects as soft drink containers. Since 2003, DARPA has funded research programs at Timet, International Titanium Powder LLC, and a joint venture of DuPont and MER Corporation. These research efforts have had mixed results. The $12.3 million research program at Timet resulted in a method that was not commercialized.

1. Abundances of the Elements (Wikipedia).
2. Paul Glader, "Tweaking Titanium's Recipe," Wall Street Journal (September 10, 2007), p. A10.
3. Boeing 787 from the Ground Up, Aero, Quarter 4, 2006 (Boeing).

September 18, 2007

Signals of Opportunity

The Energy Crisis of 1979 spawned many energy-saving technologies, most of which used off-the-shelf components and did not require any fundamental research. At Honeywell's Morristown, New Jersey, location, motion-sensing switches were installed in many areas to turn off lighting when no people were around. Later, in 1992, when office computers had become ubiquitous, the US Environmental Protection Agency launched the Energy Star program to place computers and other office equipment in sleep mode when unused for an extended period, typically overnight. It is estimated that $12 billion in energy costs are saved annually through this program.

Motion sensing is cost effective in a workplace where tens of people occupy an area, but not in a home environment where one or more sensors would be required in every room, and there are just a few occupants. Gregory Abowd, an Associate Professor of computer science at the Georgia Institute of Technology, has a novel idea for occupancy sensing, which he will present this week at the 9th International Conference on Ubiquitous Computing. Abowd is director of the Georgia Tech Ubiquitous Computing Research Group and co-director of its Aware Home Research Initiative.

The idea is simple, but it would not have been possible without today's inexpensive but powerful computers. A device, plugged into a standard electrical outlet, monitors noise on the electrical lines. When an electrical device, such as a television or microwave oven, is on, it emits a characteristic noise signal. Abowd and his team conducted tests with nineteen representative appliances in six different houses. After appropriate "training" in the host environment, which takes about four hours, their present system was able to achieve about 90% accuracy. For this research, only one appliance was active at any time, whereas a real environment would need to distinguish any one appliance in the presence of any others. If accuracy can be increased, this could be a useful system, not only for energy savings, but also as a monitor of the normal activity of elderly people living alone. Of course, occupants also need to be "trained" to turn unused appliances off. It seems to me that what's needed is an "Energy Star" type program that embeds an inexpensive power-line communications device into appliances. Since nearly all appliances now have a computing element, such a device would add less than a dollar to appliance cost. The HomePlug Powerline Alliance is a more elaborate platform for such a capability.
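The train-then-match scheme described above can be sketched as follows. This is a hypothetical illustration of my own; the actual features and classifier used by Abowd's group are not specified here, and this version simply uses coarse FFT band energies with a nearest-neighbor match.

```python
import numpy as np

def noise_signature(samples, n_bands=32):
    """Reduce a burst of sampled line noise to a coarse spectral signature."""
    spectrum = np.abs(np.fft.rfft(samples))
    bands = np.array_split(spectrum, n_bands)
    sig = np.array([band.mean() for band in bands])
    return sig / (np.linalg.norm(sig) + 1e-12)  # normalize away amplitude

def train(labeled_bursts):
    """labeled_bursts: iterable of (appliance_name, samples) pairs."""
    return [(name, noise_signature(s)) for name, s in labeled_bursts]

def classify(samples, model):
    """Return the name of the trained signature nearest the new burst."""
    sig = noise_signature(samples)
    return min(model, key=lambda entry: np.linalg.norm(entry[1] - sig))[0]
```

The four-hour "training" session in the paper corresponds to collecting the labeled bursts; the harder, unsolved part is classifying one appliance's signature superimposed on several others.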

1. Kurt Kleiner, "'Smart homes' could track your electrical noise" (NewScientist.com News, September 10, 2007).

September 17, 2007

Portable Power Challenge

The US military, in an effort to accelerate application of certain technologies, secured congressional permission to stage contests for which cash prizes are awarded. First, there was the 2004 DARPA Grand Challenge, in which contestants fielded autonomous vehicles to navigate a 150-mile course in the Mojave Desert. The contest was so difficult that there were no winners; the vehicle that traveled farthest managed only a little more than seven miles. Another contest, the DARPA Urban Challenge, will take place on November 3, 2007, at an abandoned air force base. The purpose of this new contest is to have a vehicle navigate a 60-mile urban-style course in less than six hours while obeying all traffic regulations and avoiding obstacles and other vehicles. The winner of the Urban Challenge will be awarded $2 million, and there will be second and third place awards of $1 million and $500,000. Honeywell Aerospace Advanced Technology is participating in the DARPA Urban Challenge [1].

The US Defense Department has just announced a new challenge contest, this one involving portable power for troops [2]. Specific technical requirements will be released at a September 21, 2007, meeting, but the overall requirement is for a wearable 20-watt average power source operable over four days and weighing less than four kilograms (8.8 pounds). This is the anticipated power required to operate night-vision goggles, radios, satellite navigation units, and other electronic gizmos. There will be a million dollar first place prize. Further information is available on the contest web site.

Based on the summary requirements, how hard will it be to meet the challenge? If we ignore the necessary electrical regulation and power conversion circuitry, and just focus on the required energy storage, what technologies will give us the needed energy in the allowed weight? Twenty watts over four days is 6,912,000 watt-seconds, or 6,912 kilojoules (kJ). Listed below are the energy storage capabilities of various physical or chemical systems [3] and the mass required to store 6,912 kJ.

Mechanical Energy
Flywheel - 1 kJ/gram (6.91 kg)
Torsion Spring - 0.0003 kJ/gram (23,040 kg)

Electrochemical Energy
Lead Acid Battery - 0.1 kJ/gram (69.12 kg)
NiMH Battery - 0.22 kJ/gram (31.42 kg)
Lithium Battery - 2.5 kJ/gram (2.76 kg)

Supercapacitor - 0.01 kJ/gram (691.2 kg)

Chemical (Reaction)
Thermite - 4 kJ/gram (1.73 kg)

Chemical (Combustion)
Gasoline - 46.9 kJ/gram (0.15 kg)
Ethanol - 30 kJ/gram (0.23 kg)
Hydrogen - 143 kJ/gram (0.05 kg)
Lithium - 43.1 kJ/gram (0.16 kg)
Magnesium - 24.7 kJ/gram (0.28 kg)
Sodium - 9.1 kJ/gram (0.76 kg)

Some of these systems have fundamental problems. Tweaking a flywheel to get more energy might work, but energy-storage flywheels need to operate in a vacuum, and containment of a shattered flywheel is a major problem. Lithium batteries seem to be a viable solution, but they would likely not win the prize, since they are too conventional. The chemical reactions all have the problem of converting the heat energy to electrical energy at high efficiency, and some of the combustion reactions produce toxic or irritating products, but the weight of the fuel is quite small. If a suitable generator and exhaust scrubber could be managed in a few pounds of weight, this would be a viable option. Of course, a fuel cell would be ideal, but fuel cell technology is not at a point where it can be used in a wearable device.

1. Track A Participants announced for 2007 DARPA Urban Grand Challenge (gizmag.com).
2. Jim Wolf, "Pentagon revs up drive for wearable power" (Reuters, via Yahoo News, September 13, 2007).
3. Energy Density of Energy Sources (Wikipedia).

September 14, 2007

Stephen Hawking - Science Fiction Author

When the preeminent physicist Stephen Hawking wrote A Brief History of Time (Bantam Press, 1988), his popular account of cosmology, relativity, and superstring theory as it was known at the time, his editor warned him that every equation he included would halve the book's sales. He couldn't resist including the iconic equation, E = mc², but that equation is likely an exception to the rule. The book sold millions of copies, although it's thought that most people bought it as a "coffee-table book" and never read it. Stephen Wolfram's A New Kind of Science falls into the same category. I have a copy of this twelve-hundred-page book, now freely available online [1], but I read the many reviews and rebuttals of the book instead. Together, these run to nearly as many pages, and I think they were more informative.

Now, Hawking has co-authored a book without a single equation, and it may do more to educate the public about science than A Brief History of Time. This time, the intended audience is young children, and it's expected that they will actually read the book. The book, "George's Secret Key to the Universe," is co-authored with his daughter, Lucy, and Christophe Galfard, a Frenchman who works in the same areas of physics as Hawking [2]. The book is of the science fiction genre, but, as expected, it's still a "hard science" book. As Lucy Hawking, a journalist and writer, explained to the press, her father's repeated comment during the writing was, "That's too much science fiction, we do science fact." [3] The book is intended to be the first of a trilogy, and it explains the solar system and other celestial bodies, including black holes. Young readers' interest is maintained, since the characters in the book are youngsters themselves. The fictional element in the book is a supercomputer that allows the youngsters to travel through space on an asteroid.

The idea of humans traveling via asteroid is not new to this book. It appeared in "The Wailing Asteroid," a 1960 novel by Murray Leinster (William Fitzgerald Jenkins). That book was made into a movie, The Terrornauts (1967, Montgomery Tully, Director), which was so unlike the novel that it was publicly scorned by the author himself.

1. Stephen Wolfram, "A New Kind of Science," Wolfram Media, Inc., May 14, 2002. ISBN 1-57955-008-8
2. "Science should be 'as exciting as science fiction' says Hawking" (Agence France-Presse News Article, via Google).
3. "Hawking pens kids' cosmology book" (Agence France-Presse, September 4, 2007).
4. Stephen Hawking, Lucy Hawking, Christophe Galfard, and Garry Parsons (Illustrator), "George's Secret Key to the Universe," Simon & Schuster Children's Publishing, Hardcover, 304 pages, ISBN-10: 1416954627, ISBN-13: 978-1416954620 (Available October 23, 2007, Amazon.com)

September 13, 2007

The Mind in the Thermostat

Philosophiae Naturalis Principia Mathematica, known simply as "Principia," is the work, published in 1687, in which Isaac Newton introduced the concept of universal gravitation. Newton's critics complained that, although he described gravity with his mathematics, he didn't explain what caused gravity. In response to these criticisms, Newton added an appendix to the second edition of the Principia in which he wrote

"I have not as yet been able to discover the reason for these properties of gravity from phenomena, and I do not feign hypotheses. For whatever is not deduced from the phenomena must be called a hypothesis; and hypotheses, whether metaphysical or physical, or based on occult qualities, or mechanical, have no place in experimental philosophy. In this philosophy particular propositions are inferred from the phenomena, and afterwards rendered general by induction." [1, 2]

The phrase, "I do not feign hypotheses," hypotheses non fingo in Latin, has set the tenor of physics for the three centuries since that time. The science in physics is the generalization of phenomena. The explanation of what causes these phenomena is not required. However, a strict observance of the rules of this game can lead to some unusual ideas, including an unusual idea about thermostats.

Honeywell is famous for its iconic, round thermostat. This thermostat, introduced in 1906, has just celebrated a centennial anniversary. Even so, it's surprising that thermostats are mentioned in a recently published philosophy book, "Brain, Mind, and Language" [3], reviewed in an August issue of Science [4]. One of the book's authors, Daniel Dennett, is Director of the Center for Cognitive Studies at Tufts University. Dennett has always been interested in the idea of machine consciousness; that is, the question of whether the concept of "self" is unique to humans, or whether computers will eventually become sentient. This is an important question, since some scientists believe that computers will have sufficient "brain power" within the next few decades to become as fully conscious as humans. On the topic of machine consciousness, Dennett reasons from the reciprocal viewpoint that humans are really just machines, and nothing more, so it won't surprise him when computers are found to be conscious. Unfortunately, Dennett embellishes such thinking with a sprinkling of atheism, but atheism is not a requirement for believing that machines may one day think.

Where do thermostats fit into this grand philosophical scheme? Dennett argues that the human qualities of belief and intent are embodied in even simple machines such as thermostats: a thermostat believes it knows the temperature, and it intends to keep it at a particular value. Barry Dainton, a professor of philosophy at the University of Liverpool and author of the aforementioned book review, doesn't agree, but this attribution of human qualities to something as simple as a thermostat neatly summarizes Dennett's philosophy. Now I have some confirmation that my "Honeywell Round" is waiting for a chance to take over my house. As Kurt Cobain once said, "Just because you're paranoid, doesn't mean they aren't after you."

1. I. Bernard Cohen and Anne Whitman, translators, "Philosophiae Naturalis Principia Mathematica, General Scholium (Isaac Newton)," Third edition, University of California Press (1999, ISBN 0-520-08817-4), p. 943. Latin facsimile of the entire edition is available here.
2. Rationem vero harum gravitatis proprietatum ex phaenomenis nondum potui deducere, & hypotheses non fingo. Quicquid enim ex phaenomenis non deducitur, hypothesis vocanda est; & hypotheses seu metaphysicae, seu physicae, seu qualitatum occultarum, seu mechanicae, in philosophia experimentali locum non habent. In hac philosophia propositiones deducuntur ex phaenomenis, & redduntur generales per inductionem.
3. Maxwell Bennett, Daniel Dennett, Peter Hacker, and John Searle, "Brain, Mind, and Language," Columbia University Press (New York, 2007), 227 pp., ISBN 9780231140447.
4. Barry Dainton, "Neuroscience: Wittgenstein and the Brain," Science, vol. 317. no. 5840 (17 August 2007), p. 901.

September 12, 2007

Physics in Virtual Worlds

As another indicator of my deep experience (or advanced age), I operated an electronic analog computer during a laboratory assignment while I was an undergraduate. An electronic analog computer is composed of electrical circuits that can do simple mathematics, usually as a function of time, and the results are displayed on an analog pen plotter. This was so long ago that the circuitry was based on vacuum tubes. With vacuum tube circuits, addition and subtraction are quite simple. Surprisingly, integration and differentiation are easier than multiplication and division, so the laboratory exercise was chosen around these limitations. The computation was a ballistic trajectory, so we shot cannonballs on the Earth, the Moon, and Jupiter. We were exploring the physics of virtual worlds many decades before Second Life.
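For fun, that old laboratory exercise is easy to reproduce digitally. The sketch below (launch speed and angle are arbitrary illustrative values, not from the original assignment) steps a drag-free cannonball trajectory forward in time under the three surface gravities, much as the analog integrators did continuously in voltage:

```python
# Digital re-creation of the analog-computer ballistics exercise:
# a cannonball fired on worlds with different surface gravities.
import math

GRAVITY = {"Earth": 9.81, "Moon": 1.62, "Jupiter": 24.79}  # m/s^2

def ballistic_range(v0, angle_deg, g, dt=0.001):
    """Step-by-step (Euler) integration of a drag-free trajectory;
    returns the downrange distance at which the shot lands."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        x += vx * dt
        y += vy * dt
        vy -= g * dt
        if y <= 0.0 and vy < 0.0:  # back at ground level, descending
            return x

for world, g in GRAVITY.items():
    print(f"{world:8s} range = {ballistic_range(100.0, 45.0, g):8.1f} m")
```

At a 45-degree launch angle, the numerical result closely matches the textbook range formula v²/g, and the same shot travels roughly six times farther on the Moon than on Earth.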

Physics is important to realistic computer gaming. Most importantly, ray-tracing and other optical computations are important to the realistic simulation of the virtual world. Mechanics is important to movement, so objects will fall at the proper rate. Other kinetic effects, such as the consequences of explosion, need to be modeled. Most computer games incorporate a "physics engine," which is a set of computer routines that simulate Newtonian mechanics.

Nature Physics features a one-page science fiction story on the back page of each issue. The August 2007 issue has a story [1] by Craig DeLancey, a professor of philosophy at a small college in upstate New York. The story takes the form of a letter from the chair of a university physics department to her dean. The letter bemoans the decreasing enrollment of students in physics courses, and it suggests what might be done to make the curriculum more attractive. Fortunately, she doesn't propose diluting physics to make it easier on the students. Instead, she proposes teaching courses that students would find more interesting; courses on the physics of virtual worlds, for example:

• Virtual Ballistics

• Space Travel in Non-Relativistic Worlds

• Fluid Dynamics of Instantaneous Weather

• Symmetry and Asymmetry of Fake Space

• Basic Thermodynamics of Fictitious Explosions

I guess I was ahead of the curve on the Virtual Ballistics course. The Fluid Dynamics of Instantaneous Weather is an obvious put-down of the film, "The Day After Tomorrow" [2]. DeLancey also has some fun with the transformation of computer science into information management, and with some of the marginally academic media courses being taught at universities today. Some subject areas seem better suited to trade schools, and they shouldn't be hosted by universities.

1. Craig DeLancey, "Physics 2.0," Nature Physics, vol. 3, no. 8 (August 2007), p. 580.
2. "The Day After Tomorrow" (2004, Roland Emmerich, Director) on the Internet Movie Database.

September 11, 2007

Dry Bones, Dry Bones, Them Dry Bones

Hydroxyapatite, more properly called hydroxylapatite, is a mineral form of apatite with the chemical formula Ca5(PO4)3(OH). A more descriptive formula is Ca10(PO4)6(OH)2, which expresses the fact that there are two chemical units per crystallographic unit cell. This material would be a mere curiosity but for the fact that about 70% of your bone is composed of the stuff, and your tooth enamel is composed of carbonated, calcium-deficient hydroxyapatite. Not surprisingly, engineered hydroxyapatite is used as a prosthetic implant material. Mother Nature determined that hydroxyapatite was a great material to use in structural composites, and these composite materials have a combination of strength and fracture toughness that is still being investigated in the laboratory.

Markus Buehler, a professor in the Laboratory for Atomistic and Molecular Mechanics in the Department of Civil and Environmental Engineering at MIT, investigated the fracture toughness of bone with a model that takes into account material interactions at the molecular, as well as the nanometer, scale. Bone toughens in a hierarchical manner, with the toughening mechanism at each scale level, from the nanoscale upward, optimized. Buehler's research, presented in a recent article in Nanotechnology [1, 2], shows that the nanostructure of bone, at the level of mineralized collagen fibrils, is important to its strength and its ability to handle large deformations. The mineralized fibers, a composite of inorganic and organic material, have a greater fracture toughness than the collagen fibers alone.

The composite structure of bone consists of alternating collagen molecules and similarly sized hydroxyapatite crystals. Because only weak bonds form between the hydroxyapatite crystals and the collagen molecules within the fibers, and between the fibers themselves, bone can yield locally at the nanoscale, which relieves stress overall. As a consequence, bones are less likely to break under large applied stress. The addition of the nanoscale hydroxyapatite crystals to the collagen fibers increases the Young's modulus from about 4.6 GPa to 6.2 GPa, and the tensile strain at yield likewise increases from about 5% to 6.7%.
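As a rough illustration of what those numbers imply, and assuming linear elastic behavior up to yield (a simplification; bone is not linearly elastic), the elastic energy stored per unit volume is u = E·ε²/2, and mineralization more than doubles it:

```python
# Back-of-the-envelope estimate: elastic energy per unit volume
# stored at yield, u = E * eps^2 / 2, using the moduli and yield
# strains quoted above (linear elasticity assumed throughout).

def strain_energy_density(youngs_modulus_gpa, yield_strain):
    """Elastic energy per unit volume, returned in MJ/m^3."""
    return 0.5 * youngs_modulus_gpa * 1e9 * yield_strain**2 / 1e6

collagen = strain_energy_density(4.6, 0.050)      # collagen fibrils alone
mineralized = strain_energy_density(6.2, 0.067)   # with hydroxyapatite

print(f"Collagen alone: {collagen:5.2f} MJ/m^3")
print(f"Mineralized:    {mineralized:5.2f} MJ/m^3")
print(f"Improvement:    {mineralized / collagen:4.2f}x")
```

The mineralized fibrils store roughly 2.4 times the elastic energy of the bare collagen before yielding, which is consistent with the toughening role Buehler ascribes to the crystals.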

"Dry Bones" (a.k.a. "Dem Bones") is an American spiritual based on Ezekiel, chapter 37. It was included in a memorable scene in the BBC miniseries, The Singing Detective [3, 4].

1. Denise Brehm, "MIT probes secret of bone's strength" (MIT Press Release, August 24, 2007).
2. Markus J Buehler, "Molecular nanomechanics of nascent bone: fibrillar toughening by mineralization," Nanotechnology, vol. 18, no. 29 (July 25, 2007). The article is available as a PDF file here.
3. "The Singing Detective" on the Internet Movie Database.
4. "Dry Bones" scene from "The Singing Detective" (YouTube; not accessible from inside Honeywell).

September 10, 2007

Nano-Magnetic Sponges

Magnets have always amazed me because of their unusual and overtly demonstrated action-at-a-distance effect. Magnets are so ubiquitous, appearing innocuously on almost every refrigerator door to hold our children's precious artwork, that nobody thinks about how amazing their properties are. In our technological age, their use has expanded from their original application as compass needles for navigation to essential components of devices such as motors. Importantly, many magnetic materials maintain their magnetism down to the nanoscale. Now, nanoparticle magnets have been used to create easily removable cleansers [1, 2].

Piero Baglioni, an accomplished professor of physical chemistry at the University of Florence, and his colleagues have published a paper in a recent issue of Langmuir describing a polymer gel with dispersed iron nanoparticles that can be used as a sponge for applying cleaning chemicals to artwork [3]. This application is important to Italian heritage, since the country is a repository of much Renaissance artwork by artists such as Leonardo da Vinci and Michelangelo. Conserving these works of art requires cleaning them without damaging them. Suitable cleaning chemicals for oil paintings and other artwork, such as sculptures, are known; the problem is removing the chemical residue after cleaning.

Baglioni and his team found that a polyethylene glycol and acrylamide gel could be impregnated with iron nanoparticles. As polymer chemists know, polyacrylamide is highly water-absorbent, so it can serve as a vehicle for various chemical solutions, as well as oil-in-water microemulsions. The gel is relatively firm, so it's easy to handle, cut into sponges of the desired size and shape, and load with suitable cleaning agents. No rubbing is required: as solution evaporates from the top surface, an osmotic pressure gradient appears across the sponge volume, and this pulls the cleaning chemicals back into the sponge after a period of time. A magnet is then used to lift the gel off the artwork. It goes without saying that other applications can be found for this type of process.

1. Daniel Cressey, "Magnets harnessed to clean artwork" (Nature News, September 3, 2007).
2. Nanomagnetic sponges to clean precious works of art (ACS Press Release, Aug. 29, 2007).
3. Massimo Bonini, Sebastian Lenz, Rodorico Giorgi, and Piero Baglioni, "Nanomagnetic Sponges for the Cleaning of Works of Art," Langmuir, vol. 23, no. 17 (2007), pp. 8681-8685. Available as a PDF file here.