Larry Page and Sergey Brin's mission "to organise the world's information and make it universally accessible and useful" has resulted in an entire reference library at our fingertips. Google Book Search is a much-needed democratisation of knowledge.
We are now a nation of online shoppers, able to browse the offerings in our local bookshops and then buy them so much cheaper at Amazon (or Play.com). Amazon tells us about books we haven't even heard of and allows us to Search Inside. And the Kindle e-reader means any book any time!
Aside from intruding into our lives, Google has made us lazy. Research once meant books and a trip to the library, but these days we just let our fingers do the walking. Google Books gives far too much power to a company which has not always adhered to its unofficial slogan, "don't be evil". Now, despite court challenges, it is seeking to establish an effective monopoly over digital access and distribution, and to strip writers of contractual rights.
The e-tailer has established a stranglehold on publishers that allows it to dictate terms and change them without consultation. Its price cuts are detrimental to the health of independent booksellers, which have been reduced to a mere research channel. With BookSurge and the Kindle e-reader, it is attempting to establish monopolies in the two growth areas of publishing and bookselling: print-on-demand and e-books.
Friday, December 18, 2009
Thursday, December 17, 2009
The planet, named GJ 1214b, is 2.7 times as large as Earth and orbits a star much smaller and less luminous than our sun. That's significant, Charbonneau said, because for many years, astronomers assumed that planets only would be found orbiting stars that are similar in size to the sun.
Because of that assumption, researchers didn't spend much time looking for planets circling small stars, he said. The discovery of this "watery world" helps debunk the notion that Earth-like planets could form only in conditions similar to those in our solar system.
"Nature is just far more inventive in making planets than we were imagining," he said.
In a way, the newly discovered planet was sitting right in front of astronomers' faces, just waiting for them to look. Instead of using high-powered telescopes attached to satellites, they spotted the planet using an amateur-sized, 16-inch telescope on the ground.
Wednesday, December 16, 2009
Oral Roberts was that most unusual of all kinds of Methodists--a Pentecostal one. He had a big heart, a large evangelistic career, and a big vision, even a vision of creating a huge medical center to go with the university he was founding. But of all the things we know about Oral Roberts he will undoubtedly be best remembered for all the faith healing services, and all the persons who benefited from his ministry.
My whole life I watched the old sinner bilk his sheep for millions to finance his own deification and the construction of his empire. Roberts once saw a 900-foot-tall Jesus who assured him that his City of Faith hospital would be built, if only Roberts would squeeze his acolytes for more bucks. The hospital was built, and went bankrupt. It’s now an office complex.
Friday, December 11, 2009
Researchers mapping a massive array of genomes across Asia say they have found evidence that humans covered the continent in a single migratory wave, and share a common ancestry.
The findings were released by the Human Genome Organisation (HUGO) Pan-Asian SNP Consortium, which looks at single-nucleotide polymorphisms (SNPs), or variations at individual bases that make up the genetic code. The results challenge the view that Asia was populated by at least two waves of migration.
Thursday, December 10, 2009
Tuesday, December 08, 2009
Monday, December 07, 2009
Wednesday, December 02, 2009
But you can see how it could all go bad — how a culture so intensely clerical, so politically high-handed, and so embarrassed (beyond the requirements of Christian doctrine) by human sexuality could magnify the horror of priestly pedophilia, and expand the pool of victims, by producing bishops inclined to strong-arm the problem out of public sight instead of dealing with it as Christian leaders should. (In The Faithful Departed, his account of the scandal, Philip Lawler claims that while less than five percent of priests were involved in actual abuse, over two-thirds of bishops were involved in covering it up.) I suspect it isn’t a coincidence that the worst of the priest-abuse scandals have been concentrated in Ireland and America — and indeed, in Boston, the most Irish of American cities — rather than, say, in Italy or Poland or Latin America or Asia. There will always be priests who become predators; the question is how the Church as an institution deals with it. It hasn’t been handled all that well anywhere, I’m afraid. But the particular qualities of Irish Catholicism — qualities which were once a source of immense vitality — seem to have led to a particularly horrifying outcome.

Hat tip.
Tuesday, December 01, 2009
I just finished William Trevor's latest novel (his first since the disappointing The Story of Lucy Gault) and am happy to report he's back in the driver's seat at age 82 or so. From Thomas Mallon's New York Times review:
There is a good deal of kindness in Trevor’s Rathmoye, and in the Rathmoyes he has created before it. Dillahan is good to Ellie, as the nuns had been. Mrs. Carley, once a maid at Shelhanagh, is kind to Florian; and the customers of aging Mr. Buckley, one of the salesmen at Mrs. Connulty’s, look out for him, correcting the errors he now makes when writing up orders, protecting him so “that he might safely reach the retirement he secretly craved.”
But dread and terror are also always present in this repressive world. Trevor is fantastically effective at foreboding; he can make a reader squirm just by withholding the next bit of some long-past anterior action he’s been recounting. When he wishes, as in his 1994 novel, “Felicia’s Journey,” he can depict the most gruesome violence, but always in the same even tones with which the hens get fed. This new novel, except for the accidents that took Mrs. Connulty’s husband and Dillahan’s first wife, is a delicate sort of drama — there is no corpse in the basement, no bomb lies hidden in any drawer — but even so, a reader will have his heart in his mouth for the last 50 pages. And when that heart settles back down, it will be broken and satisfied.
Evolution is not a means of creation; it isn't very clear that the notion of a 'means of creation' is coherent. Evolution is a theory of generation; to be more precise, it is an account of how populations change when the generation and destruction of individuals in the population is linked to variation in the individuals, through things like selection and drift. This is something entirely different from creation; living things are not created through evolution, or by evolution, or any such thing. Evolution is part of the overall account of how living things are generated, of how they change from contrary to contrary, from being that to being other than that; it therefore is part of our account for why something is this rather than that.
Monday, November 30, 2009
Now, which is more likely:
- That two extremely remote chances came up independently for the same species [rare-squared]; or
- That the one might have something to do with the other?
Tuesday, November 24, 2009
Monday, November 23, 2009
Let us by all means celebrate the man and his achievements. But let us not make him into a demigod, either (nor any scientific hero – if Pasteur discarded 90% of his data, and he did, that doesn’t belittle his contributions to science, and if Mendel made his data fit his model, and he probably didn’t but might have, that doesn’t change one whit the facts of genetics as we now understand them). Darwin did not invent the ball point pen, antibiotics, the iPhone (all hail the Prophet Jobs!), or BLTs, either. What he did is what he did, and more power to him.
What we need to know is that Darwin founded not a theory, or even a set of doctrines; instead he is the focal point of a series of traditions that converged in his ideas and writings, and which have derived from him. He did not invent biogeography; de Candolle is a good candidate for that. He did not invent natural selection, although he was perhaps the first to think of it as an agent for evolutionary change (excepting Patrick Matthew, who buried his light under a naval architectural bushel). He did not invent genetics (although the term gene comes from his notion of a pangene, and he probably set many people thinking about heredity in a serious manner). He did not give us a mathematical theory; that was William Castle, JBS Haldane, RA Fisher and Sewall Wright, among others, to this day.
Friday, November 20, 2009
Thursday, November 19, 2009
Monday, November 16, 2009
Thursday, November 12, 2009
Two tiny changes in the sequence of one gene could have helped install the mechanisms of speech and language in humans.
In 2001, a gene called FOXP2 was found to underlie a rare inherited speech and language disorder. It encodes a transcription factor called FOXP2, a protein 'dimmer-switch' that binds to DNA and helps to determine to what extent other genes are expressed as proteins.
Experiments have now revealed that the human version of FOXP2, which has two different amino acids compared with the version carried by chimps, has differing effects on genes in the brains of the two species. These differences could affect how the brain develops, and so explain why only humans are capable of language.
But one fraud outstripped them all, eclipsing the others with its sheer audacity. Between 2000 and 2002, Jan Hendrik Schön, a researcher at Bell Laboratories, published more than 20 articles on electrical properties of unusual materials. He shot to the very top of the booming field of “molecular electronics”—a wonder field in which researchers aim to shrink computer chips down to single-molecule components. At Schön’s peak, he was submitting 4 or 5 articles per month, most of them going to top journals like Science and Nature. He hit his record in autumn 2001, turning out 7 articles that November alone. The output was staggering. It’s rare for a scientist—even a string theorist, beholden neither to instruments nor to data—to submit 7 articles in an entire year, let alone one month. And Schön’s papers were no run-of-the-mill exercises. In them, he announced one unbelievable discovery after another: He had created organic plastics that became superconductors or lasers; he had fashioned nanoscale transistors; and more. The editors of Science hailed one of his many contributions as a “breakthrough of the year” in 2001. The CEO of Lucent Technologies (parent company of Bell Labs) likewise touted Schön’s work when courting investors. Everything Schön touched seemed to turn to research gold.
Wednesday, November 11, 2009
Monday, November 09, 2009
The newest issue of Science Magazine includes a lovely demonstration of multilevel selection by Omar Tonsi Eldakar, my former graduate student, who is currently at the University of Arizona's Center for Insect Science.
Readers who have been following my "Truth and Reconciliation for Group Selection" series will be well prepared to appreciate the import of the Science article. Group selection requires variation among groups. Variation among groups is eroded by dispersal. Therefore, group selection can only take place in groups that are highly isolated from each other. That is part of the reasoning that led to the conclusion that group selection can only take place under highly restrictive conditions.
But wait. This argument assumes that dispersal is random. What if dispersal is conditional? What if individuals stay in groups when they are sufficiently cooperative but leave when they become overrun by selfish individuals? In this case, dispersal might increase variation among groups, improving the conditions for group selection. John Pepper and Athena Aktipis (featured in T&R XII) are two theorists who have studied this "walk away" process in agent-based simulation models.
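Pepper and Aktipis's actual models are richer than anything that fits in a blog post, but the logic of the argument can be sketched in a toy agent-based simulation. Everything below — the function name, the parameter values, the dispersal rules — is my own invention for illustration, not a description of their models:

```python
import random
import statistics

def simulate(dispersal="conditional", n_groups=20, group_size=20,
             steps=50, threshold=0.5, move_prob=0.1, seed=1):
    """Toy 'walk away' model: groups hold cooperators (1) and
    defectors (0).  Under "conditional" dispersal, cooperators leave
    any group whose cooperator fraction has dropped below `threshold`;
    under "random" dispersal, every individual moves with probability
    `move_prob` regardless of group composition.  Dispersers join
    randomly chosen groups.  Returns the among-group variance in
    cooperator frequency after `steps` rounds."""
    rng = random.Random(seed)
    groups = [[rng.randint(0, 1) for _ in range(group_size)]
              for _ in range(n_groups)]
    for _ in range(steps):
        pool, new_groups = [], []
        for g in groups:
            coop_frac = sum(g) / len(g) if g else 0.0
            stay = []
            for ind in g:
                if dispersal == "conditional":
                    # cooperators walk away from defector-heavy groups
                    leaves = ind == 1 and coop_frac < threshold
                else:
                    # random dispersal, blind to group composition
                    leaves = rng.random() < move_prob
                (pool if leaves else stay).append(ind)
            new_groups.append(stay)
        for ind in pool:  # dispersers resettle in random groups
            rng.choice(new_groups).append(ind)
        groups = new_groups
    freqs = [sum(g) / len(g) for g in groups if g]
    return statistics.pvariance(freqs)

print(simulate("conditional"), simulate("random"))
```

In runs of this toy model, the conditional ("walk away") variant tends to sort the population into defector-heavy and cooperator-heavy groups, keeping among-group variance high, while blind random dispersal homogenizes the groups and erodes it — exactly the intuition in the paragraph above.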
Saturday, November 07, 2009
Tuesday, November 03, 2009
Friday, October 30, 2009
During the penal period of the 1500s to the 1700s in England, Catholics had no legal rights. They could not hold office and were subject to fines, jail and heavy taxes. It was a capital offense to say Mass, and hundreds of priests were martyred.
Occasionally, English Catholics resisted, sometimes foolishly. One of the most foolish acts of resistance was a plot to blow up the Protestant King James I and his Parliament with gunpowder. This was supposed to trigger a Catholic uprising against their oppressors. The ill-conceived Gunpowder Plot was foiled on Nov. 5, 1605, when the man guarding the gunpowder, a reckless convert named Guy Fawkes, was captured and arrested. He was hanged; the plot fizzled.
Nov. 5, Guy Fawkes’ Day, became a great celebration in England, and so it remains. During the penal periods, bands of revelers would put on masks and visit local Catholics in the dead of night, demanding beer and cakes for their celebration: trick or treat!
Guy Fawkes’ Day arrived in the American colonies with the first English settlers. But, by the time of the American Revolution, old King James and Guy Fawkes had pretty much been forgotten. Trick or treat, though, was too much fun to give up, so eventually it moved to Oct. 31, the day of the Irish-French masquerade. And in America, trick or treat wasn’t limited to Catholics.
The mixture of various immigrant traditions we know as Halloween had become a fixture in the United States by the early 1800s. To this day, it remains unknown in Europe, even in the countries from which some of the customs originated.
Trick or Treat.
Thursday, October 29, 2009
Tuesday, October 27, 2009
In 1964, Burnham, the author of the nightmare vision that so provoked Orwell, was helping William F. Buckley edit the National Review. (Reagan would later award Burnham the Medal of Freedom.) At the time, Burnham's latest book, Suicide of the West, had administered another powerful dose of pessimism: in it he argued that modern liberalism had lost the fervor of classical liberalism. The modern variant treated peace and security as equal to or greater than the commitment to preserving freedom. Since the focus on peace denigrated the use of power against a ruthless foe, Burnham predicted that the West was slowly committing suicide.
History dealt Burnham's argument a strange hand. He would be pleased to see that a belief in defending the West was a factor in the American and European revival. But the positive, dynamic ideal offered in Western European countries and Japan was so magnetic precisely because those countries seemed to be discarding their traditional reliance on force and hard power.
At supreme moments of crisis in 1989 and 1990, critical choices were indeed made in favor of peace, in favor of nonviolent change. But those choices were made by men groomed from adolescence to be model Communist leaders. The suicide was in the East, not the West. And the suicide was not an act of self-destruction. Theirs was an act of creation.
Monday, October 26, 2009
Friday, October 23, 2009
Wednesday, October 21, 2009
Tuesday, October 20, 2009
Target Joins Price War as Sears Offers a Twist, and BN Prepares Nook

Target joined the pre-order bestseller price war, though in more limited fashion. They're matching Walmart.com's $8.99 offer with free shipping included, but on just six November pre-order titles. Boulder Bookstore buyer Arsen Kashkashian has suggested via Twitter that fellow indies cancel their publisher pre-orders on these deep-discounted forthcoming titles and take advantage of their competitors' loss leaders. Bookstores will save money, he reasons, while helping Amazon and Walmart.com lose more.
This morning Sears offered its own twist on the discounts. Buy an "eligible book" at a tempting discount from Sears.com or from its competitors at Target.com, Walmart.com and Amazon.com, e-mail the receipt to Sears, and Sears will give you a $9 credit at Sears.com for any merchandise, "so it's like getting the books for free." Imran Jooma, SVP for online at Sears Holdings, says, "We believe this program will benefit the thousands of customers who buy books every day by putting more money into their pockets." Called Keep America Reading, the program is being promoted on the Sears.com home page. Cleverly, while Sears is highlighting the same ten forthcoming November releases as its competitors, on its own site it lists all the titles at $17.98 (tempting consumers to help the other sites lose money on the book sales and then come over and buy something else from Sears.com with the extra credit).
And Barnes & Noble's afternoon media event suffered another pre-announcement leak--this time from the bookseller itself. An ad in next Sunday's NYT Book Review touts their new ereader, called the Nook, and priced to match Kindle at $259.
Friday, October 16, 2009
From today's Publisher's Lunch email:
After closing 35 to 40 B. Dalton stores annually for years now, Barnes & Noble is preparing to shutter the last remaining group of 50 Dalton outlets. Spokesperson Carolyn Brown notes, "These are small-format, low-volume stores in malls and their leases are expiring." All but two B. Dalton stores will be closed within the next few months, with branches in Washington, DC and Roosevelt Field, Long Island remaining open until their leases expire. Brown says some "booksellers will be offered a chance to move to Barnes & Noble stores in the cases where there are Barnes & Noble stores near the B. Dalton's which are closing" and "others will be given generous severance packages." She declined to indicate how many positions will be eliminated as a result of the store closings.
The B. Dalton's are all "small, low-volume" mall stores, but Brown underscores that "we are still very committed to the mall business; about 75 percent of our new Barnes & Noble stores are in malls."
Wednesday, October 14, 2009
Tuesday, October 13, 2009
Thursday, October 08, 2009
Hub Blog gets a kick out of theories about why American cooking has historically sucked (industrialization etc.). But has anyone stopped to think it might have to do with the nation being founded by the descendants of Europe’s worst cooks, i.e. the English? That the English were followed to America by the world’s second and third worst cooks, i.e. the Irish and Germans? C’mon. Durgin Park didn’t spring from nowhere. …
Wednesday, October 07, 2009
Tuesday, October 06, 2009
Monday, October 05, 2009
Friday, October 02, 2009
Sometime tomorrow Richard Dawkins will be presenting the Richard Dawkins Award to Bill Maher at the Atheist Alliance International convention in Los Angeles.
Why is this a problem? It's a problem because Bill Maher is a kook. He believes in all kinds of strange things about alternative medicine, cancer, and immunizations.
Orac has the documentation at Respectful Insolence: Some "inconvenient questions" for Bill Maher and Richard Dawkins tomorrow. He also has a list of questions for Bill Maher and Richard Dawkins.
PZ Myers will be at the convention. His attempt to defend Maher and Dawkins isn't working, in my opinion. Orac takes him on and exposes the hypocrisy of the whole sorry episode. Maybe there will be fireworks at the convention tomorrow? I sure hope so. Giving the Richard Dawkins Award to Bill Maher is a travesty.
The new fossils of Ardipithecus ramidus — known as 'Ardi' — offer the first substantial view of the biology of a species close to the time of the last common ancestor, estimated to be at least 6 million years ago. Like modern humans, Ardi could walk upright and didn't use her arms for walking, as chimps do. Still, she retains a primitive big toe that could grasp a tree like an ape.
"This spectacular specimen shows why fossils really matter," says Andrew Hill, head of anthropology at Yale University in New Haven, Connecticut.
Previously, the oldest near-complete skeleton of a human ancestor was the 3.2-million-year-old Australopithecus afarensis skeleton known as Lucy, also from Ethiopia. Because Lucy had many traits in common with modern humans, she didn't provide much of a picture of the earlier lineage between apes and humans, says Alan Walker, a biological anthropologist at Pennsylvania State University in University Park. The new A. ramidus does.
Wednesday, September 30, 2009
Nothing makes me more cynical about the whole book publishing industry than this. I mean, at least the young-adult vampire serials on the bestsellers lists involve semi-original characters. The author actually does have to do some real work. I get that. But this plundering of literary classics is the worst yet.
In line with these thoughts, and with the regret that comes from having to acknowledge yet set aside two things, namely the existence of human frailty and the contribution gifted individuals such as Roman Polanski make to society, I conclude that it is right that the United States authorities are seeking to extradite him to serve his sentence for rape. Neither fame nor wealth, neither time nor distance, should render anyone immune to laws protecting against serious crimes against other human beings.
Tuesday, September 29, 2009
As ludicrous as Shore's post is, I have to agree with Fecke that my favorite Polanski apologist is the Washington Post's Anne Applebaum, who finds it "bizarre" that anyone is still pursuing this case. And who also, by the by, failed to disclose the tiny, inconsequential detail that her husband, Polish foreign minister Radoslaw Sikorski, is actively pressuring U.S. authorities to drop the case.
Monday, September 28, 2009
Friday, September 25, 2009
Apropos the Coyne/Manzi debate, Ed Feser weighs in on teleology with some spirited and quite understandable exasperation with the way so many on both sides fail to understand Aquinas:
Let me make some general remarks about what the A-T [Aristotelian - Thomist] tradition does mean, then, before coming back to One Brow’s comment. If you are going to understand Aristotle and Aquinas, the first thing you need to do is put out of your mind everything that you’ve come to associate with words like “purpose,” “final cause,” “teleology,” and the like under the influence of what you’ve read about the Darwinism vs. Intelligent Design debate, Paley’s design argument, etc. None of that is relevant. If you think that what Aristotelians or Thomists mean when they say that teleology pervades the natural world is that certain natural objects exhibit “irreducible specified complexity,” or that some inorganic objects are analogous to machines and/or to biological organs, or that they are best explained as the means by which an “Intelligent Designer” is seeking to achieve certain goals, etc., then you are way off base. I realize that that’s the debate most people – including writers of pop apologetics books – think that arguments like the Fifth Way are about. They’re not. Think outside the box. “What hath Thomas Aquinas to do with William Paley?” Nothing. Forget Paley. (boldface mine)
Having had the undeserved good fortune of knowing him during his 21-year sojourn in Washington, I can testify to something lesser known: his extraordinary equanimity. His temperament was marked by a total lack of rancor. Angst, bitterness and anguish were alien to him. That, of course, made him unusual among the fraternity of conservatives because we believe that the world is going to hell in a handbasket. That makes us cranky. But not Irving. Never Irving. He retained a steadiness, serenity and grace that expressed themselves in a courtliness couched in a calm, quiet humor.
Thursday, September 24, 2009
Since the late 19th century, evolutionary biologists have debated whether evolution can go in reverse. If not, then evolution may depend on more than just natural selection. Multiple evolutionary paths could be possible through small chance events. It hasn't been easy to examine reversibility. Previous studies have focused on complex traits such as whale flippers, and scientists often lack sufficient information about ancestral traits or how present-day traits evolved.
So evolutionary biologist Joseph Thornton of the University of Oregon, Eugene, and his colleagues picked a more tractable subject: a single protein. His group has been studying the more than 450-million-year evolution of the glucocorticoid receptor (GR), a protein that binds to the stress hormone cortisol to control animals' response to it. Like all proteins, GR is made up of amino acids. By collecting the amino acid sequences of GR and related proteins from living animals, Thornton and his team previously constructed the GR evolutionary tree and resurrected sequences of GR's ancestors.
This history reveals that GR has switched its hormone preference. Around the time cartilaginous fish such as sharks split off from bony fish, roughly 440 million years ago, the ancestral protein that the scientists call GR1 responded to both cortisol and the hormone aldosterone. But 40 million years later, when four-legged creatures started to appear, the descendent GR2 had become cortisol-specific.
During these 40 million years, 37 amino acids changed. Only two were necessary to alter the function: One put a kink in the protein's shape, making it unresponsive to both hormones, and another allowed the restructured molecule to interact with only cortisol. Thornton's team next wondered if they could make GR2 recognize both cortisol and aldosterone by reverting these amino acids, which they call group X, back to their GR1 state. The researchers report today in Nature that this swap not only failed to restore GR's original dual function but also killed the protein's ability to recognize any hormone.
Wednesday, September 23, 2009
T. Ryan Gregory's paper, online.
As is true with many other issues, a lack of understanding of natural selection does not necessarily correlate with a lack of confidence about one's level of comprehension. This could be due in part to the perception, unfortunately reinforced by many biologists, that natural selection is so logically compelling that its implications become self-evident once the basic principles have been conveyed. Thus, many professional biologists may agree that “[evolution] shows how everything from frogs to fleas got here via a few easily grasped biological processes” (Coyne 2006; emphasis added). The unfortunate reality, as noted nearly 20 years ago by Bishop and Anderson (1990), is that “the concepts of evolution by natural selection are far more difficult for students to grasp than most biologists imagine.” Despite common assumptions to the contrary by both students and instructors, it is evident that misconceptions about natural selection are the rule, whereas a working understanding is the rare exception.

Because you can't study this stuff enough...
Tuesday, September 22, 2009
Monday, September 21, 2009
Friday, September 18, 2009
When Darwin claimed that all species have evolved from ancestral species so that each species is adapted to a specific manner of life, he was closer to Aristotle than to those nominalists who would deny the natural reality of species.

Essential reading for anyone interested in the history of evolution as an idea.
So, I am now pleased to report that the scholarly writing on the "species problem" seems to be moving towards this position as I argued it in 1998. Increasingly, historians of science and philosophers of biology are questioning the "essentialism story" told by Mayr and Hull that presents Darwin's "populational" thinking as a revolutionary rejection of the "essentialist" thinking that ruled over biology for two thousand years. Instead, scholars are rediscovering a tradition of Aristotelian biological empiricism that broke away from Platonic essentialism and prepared the way for Darwin.
Some of this new scholarship was surveyed a few years ago in an article criticizing the "essentialism story"--Mary Winsor, "Non-essentialist Methods in Pre-Darwinian Taxonomy," Biology and Philosophy, 18 (2003), pp. 387-400. Now we have two new books that elaborate the issues in this scholarly debate. Newly published is John S. Wilkins, Species: A History of the Idea (University of California Press, 2009). Soon to be published is Richard Richards, The Species Problem: A Philosophical Analysis (forthcoming from Cambridge University Press). I have read Wilkins' book, and I have read a few chapters from Richards' manuscript. Wilkins is a philosophy professor at the University of Sydney. Richards is a philosophy professor at the University of Alabama.
Wilkins provides an encyclopedic history of the idea of species from Plato to the present. Running through his history is his argument for the falsity of the essentialism story as told by Mayr and Hull. His argument rests on three claims (x-xi, 231-34).
Francis Beckwith is leaving the blog, just a few months after Zippy. I don't blame either of them, and while I'm certain there are lots of reasons for both gentlemen to move on, I can't help feeling there's something a bit off kilter about the whole WWWtW project. Perhaps because it strikes me as being primarily motivated by fear. I'm sure there's more to it, but I do see how it can become exhausting.
Tuesday, September 15, 2009
Monday, September 14, 2009
Thursday, September 03, 2009
Boston is awash in tourism “trails,” such as the Freedom Trail, the Women’s Heritage Trail, and so on. Just recently, Emerson College journalism professor Manny Paraschos created the Boston Journalism Trail, celebrating his contention that “Boston is the birthplace of American journalism.”
It may well be. Our first newspaper, Publick Occurrences Both Forreign and Domestick, started publishing in 1690. Paraschos’s trail escorts us past the original site of America’s oldest continuously published English-language Jewish newspaper, the Jewish Advocate, and of course along downtown’s Newspaper Row, once home to 13 newspapers, depending on who is counting.
Yet there is a reason that journalism history cannot be left to the professorate. Where, for instance, is Dave Farrell’s table at Anthony’s Pier 4, where the veteran Herald and Globe columnist held court? Or George Higgins’s spot at Locke-Ober, where the novelist (“The Friends of Eddie Coyle”) and newspaperman always welcomed guests willing to pay a tab?
I would add a few more stops to the Journalism Trail, beginning perhaps . . .
I still expect my dad to call me every day, telling me what's in the WSJ or the New York Times....
When The View from Castle Rock appeared in 2006, Alice Munro was widely quoted as saying she thought the book would be her last, suggesting that at the age of seventy-five she might not “have the energy to do this anymore”. Yet I am not sure anyone actually believed she might simply retire. Few writers have been more attuned to the hazards of age, and as long ago as her Paris Review interview of 1994 she had spoken about her fear of losing the necessary stamina and desire, equating the end of writing with death of the mind itself. So her words made her readers afraid, afraid less for the work than for her personally.
That solicitude was curious. For if the open secret of Munro’s career has been the degree to which she has drawn on the details of her own life, the singularity of her work lies in the fact it has never seemed to be about her. Her parents once ran a fox farm, as do Del Jordan’s in Lives of Girls and Women (1971), and like so many of her protagonists she too moved in a first marriage from Ontario to British Columbia, and then left that union behind her. Those connections, however, have never really been the point of her work. Such paradoxically impersonal details are simply her material, and her many volumes of stories owe little to the teasing dance of art and actuality – the sense of a counterlife – that shapes so much autobiographical fiction.
What does all this tell us about early life? It tells us that the evidence for life before 3 billion years ago is being challenged in the scientific literature. You can no longer assume that life existed that early in the history of Earth. It may have, but it would be irresponsible to put such a claim in the textbooks without a note of caution.
What else does this story tell us? It tells us something about how science is communicated to the general public. The claims of early life were widely reported in the media. Every new discovery of trace fossils and trace molecules was breathlessly reported in countless newspapers and magazines. Nobody hears about the follow-up studies that cast doubt on those claims. Nobody hears about the scientists who were heroes in the past but seem less-than-heroic today.
That's a shame because that's how science really works. That's why science is so much fun.
Wednesday, September 02, 2009
Scientists at Stanford University in Palo Alto, California, have unsettling news from what they say is the first-ever study of chronic multitaskers.

My brain is definitely going to be scrambled.
A team headed by psychologist Eyal Ophir compared 19 "heavy media multitaskers" (HMMs), identified by questionnaires on media use, with 22 "light media multitaskers" (LMMs). They tested how well the subjects could filter relevant information from the environment, filter relevant information in their memories, and quickly switch cognitive tasks. One filtering test, for example, required viewers to note changes in red rectangles while ignoring blue rectangles in the same pictures.
HMMs did worse than LMMs across the board. Surprisingly, says co-author Clifford Nass, "they're bad at every cognitive control task necessary for multitasking." Nass, a sociologist, says the study has "disturbing" implications in an age when more and more people are simultaneously working on computers, listening to music, surfing the Web, and texting or talking on a phone. Also troubling, he notes, is that "people who chronically multitask believe they're good at it." The findings are reported this week in the Proceedings of the National Academy of Sciences. The team hopes to investigate whether multitasking really scrambles brains or whether people with poor filtering and attentional abilities are more attracted to it to begin with. Psychologist Anthony Wagner suspects that media multitasking offers instant rewards that reinforce "exploratory" behavior at the expense of the ability to concentrate on a particular task.
Lunar losses: India formally abandoned its first lunar orbiter on 30 August, after scientists at the Indian Space Research Organisation (ISRO) abruptly lost radio contact with the probe. Chandrayaan-1, launched last year to map the Moon, ended its mission 14 months early, but the ISRO said it had met most of its scientific objectives. Meanwhile, NASA's Lunar Crater Observation and Sensing Satellite (LCROSS) accidentally burned up most of its spare fuel on 22 August. Mission managers say it remains on track to smash into the Moon on 9 October, in the hope of kicking up evidence of ice.
Hand axes from southern Spain have been dated to nearly a million years old, suggesting that advanced Stone Age tools were present in Europe far earlier than was previously believed.
Acheulian axes, which date to at least 1.5 million years ago, have been found in Africa, and similar tools at least 700,000 years old have been found in Israel and China. But in Europe, sophisticated tool-making was thought to stretch back only around 500,000 years.
Cave sediment levels that included the two axes also held what some archaeologists believe may be small tools made using the so-called Levallois technique of shaping stone, known to have existed in Europe only about 300,000 years ago.
... in 2006, geneticists showed for the first time that they could identify truly novel genes. In fruit flies, they came across five young genes that were derived from "noncoding" DNA between existing genes and not from preexisting genes. As a result, other researchers started looking for novel genes in other species.
Meanwhile, while looking for gene duplications in humans, geneticists Aoife McLysaght and David Knowles of Trinity College Dublin kept coming across genes that seemed to have no counterparts in other primates, suggesting that new genes arose in us as well. To determine which of these genes with no counterparts were de novo genes, McLysaght and Knowles first used a computer to compare the human, chimp, and other genomes. They eliminated all but three of the 644 candidates because their sequence in the database was not complete--or they had equivalents in other species.
Next, they searched the chimp genome for signs of each gene's birth. "We strove hard to identify the noncoding DNA that gave rise to the gene," McLysaght says. Only by finding that DNA could they be sure that the gene wasn't already present in the chimp genome but was somehow unrecognizable to gene-finding programs. At three locations where the chimp and human genomes were almost identical, telltale mutations indicated that it was impossible to get a viable protein from the chimp DNA sequence. In contrast, the human version of each sequence had mutations that made it a working gene, the researchers report online tomorrow in Genome Research.
The researchers were able to verify that the genes worked by checking messenger RNA databases and protein surveys done by other scientists. They are now using antibodies to find out where in the cells these proteins are active and are trying to disable the genes in cells to tease out their functions.
What these genes actually code for will be fascinating to find out. You also gotta love a researcher with a classic Irish name like Aoife McLysaght (talk about right out of The Tain...)
Tuesday, September 01, 2009
Creation, then, does not make any difference to things. If you like, it makes all the difference, but you cannot expect to find a 'created look' about things. The effect of creation is just that things are there, being themselves, instead of nothing. Creation is, of course, an unintelligible notion. I mean it is unintelligible in the sense that God is unintelligible. It is a mystery. Not that the notion is self-contradictory, but it involves extrapolating from what we can understand to what we are only trying to understand. To be created is to exist instead of nothing; but the notion of 'nothing' is itself a mystery unintelligible to us.
Unless we grasp the truth that creation means leaving the world to be itself, to run itself by its own scientific laws so that things behave in accordance with their own natures and not at the arbitrary behest of some god, we shall never begin to understand that the Lord we worship is not a god but the unknown reason why there is anything instead of nothing. --Herbert McCabe, God Still Matters.
Wednesday, August 26, 2009
Despite the success of general relativity, one of the most important problems in modern physics is finding a theory of quantum gravity that reconciles the continuous nature of gravitational fields with the inherent 'graininess' of quantum mechanics. Recently, Petr Hořava at Lawrence Berkeley Lab proposed such a model for quantum gravity that has received widespread interest, in no small part because it is one of the few models that could be experimentally tested. In Hořava's model, Lorentz symmetry, which says that physics is the same regardless of the reference frame, is violated at small distance scales, but re-emerges over longer distance scales.
It was almost as if he had decided to atone, first for the death of Mary-Jo Kopechne and second for his hubris in trying to emulate the ambition of his older brothers.

One thing is going to become painfully obvious, I think, over the next year or so, and that is how pale a shadow John Kerry has been in the Senate all this time, and how little likely he is to come even close to the accomplishments of Ted any time in the future.
Tastes differ but many of his admirers were secretly relieved when the Senator stopped trying to deliver epoch-making speeches like the famous but rather ham-like "The Dream Will Never Die" effort that constituted his last hurrah at the Democratic Convention in New York in 1980.
His chaotic interview with Roger Mudd that same year, in which he could not produce a single coherent reason for seeking the White House, was also helpful in getting him to adopt a more realistic view of himself, and to become a more useful public servant.
You may notice that I have managed to get this far without once using the word "lion". This is on purpose. Senator Edward Moore Kennedy was not particularly leonine, even though he did have a bit of a mane until the very end. He was more like a horse, and it is for his slow and steady work and his willingness to work in harness with others that he will be best remembered.
Monday, August 24, 2009
As for intelligent design? There's really nothing in this paper about it. I'm [sure] Dembski will run around bragging about how he got an ID paper published in a peer-reviewed journal. But arguing that this is an ID paper is really dishonest, because all ID-related content was stripped out. In other places, Dembski has used these quantification-based arguments to claim that evolution can't possibl[y] work. But this paper contains none of that. It's just a fairly drab paper about how to quantify the amount of information in a search algorithm.
Wednesday, August 19, 2009
Tuesday, August 18, 2009
Education Media & Publishing Group, the educational publisher formed by Barry O'Callaghan's leveraged buy-outs of Boston-based Houghton Mifflin and Harcourt, has agreed a refinancing which will lower its debt load and interest bills but heavily dilute equity holders, reports the FT. The refinancing had averted any risk of a Chapter 11 filing, the newspaper adds.
The newspaper also reports that HMH has decided against a renewed attempt to sell its consumer book arm, hoping to use the small division to bolster its core educational publishing business.
The finance agreement will cut EMPG's long-term debt, now standing at about $7.6bn, by more than $1bn and reduce annual interest costs by $100m, in exchange for a 45% dilution of current shareholders.
I'm sure the shareholders are thrilled.
Saturday, August 15, 2009
Me and the guys from Blue Mass Group, Massachusetts' leading liberal blog, are putting our petty political differences aside to try and raise money for the Jared C. Monti Scholarship Fund. Jared was KIA in Afghanistan in 2006 and recently became only the 6th person since 2001 to be awarded the . Here's the link to contribute if you'd like to donate to a worthy cause in the name of a fellow American who gave his life in service of his country.
We are trying to rally bloggers from all sides to the cause. Jared fought and died for all Americans, and supporting his scholarship fund seems like the least we can do in return.
Thursday, August 13, 2009
Before 1594, the kaleidoscope of acting companies was becoming impossible for the City authorities to control. Then deals were done, and for six years, from about 1594 to 1600, a monopoly – or duopoly – was granted to two companies only, the Admiral’s and the Chamberlain’s. The Chamberlain’s (the King’s Men) had Henry Carey, Lord Hunsdon as patron, and Shakespeare as writer. The patrons of the Admiral’s Men were Charles Howard and later Prince Henry, then Lord Palsgrave, Earl Palatine. Only in the late 1590s was the duopoly encroached on by the companies of three earls – Worcester, Oxford and Derby – and by the Paul’s Boys and Blackfriars Boys. There were five competitors by 1602; but even then the duopoly companies continued to dominate.
Tuesday, August 11, 2009
Today on the Morning Media Menu, journalist, author, and screenwriter Richard Farrell gave advice for aspiring screenwriters and memoir writers.

I'll have to buddy it up with this guy...
Farrell recently published his heroin memoir "What's Left of Us," and helped write The Fighter--an upcoming film starring Christian Bale and Mark Wahlberg. Among all screenwriting guides, Farrell recommended the writing handbook by Christopher Vogler, "The Writer's Journey: Mythic Structure for Writers."
In order to get one of our pigments, we need to synthesise a piece of DNA. A very large piece of DNA, one that would normally cost around £3000 to get synthesised, effectively blowing our synthesis budget for this project. I've spent most of the last week agonising about how much of the actual gene we wanted to synthesise; I didn't want to cut too much out, in case it stopped working.

It's the little things in life....
I got an email from my supervisor last night: DNA2.0 have agreed to synthesise it for us.
Monday, August 10, 2009
This is well-timed since it appears just when I've returned from a meeting on this very topic. [Go here if you can't see the article on the Science website.]
One of the things we learned at the meeting is that the Woese tree of life is almost certainly an over-simplification at best and wrong at worst. It is no longer possible to claim that eukaryotes have a simple vertical descent relationship with any archaebacterium (or any bacterium, for that matter).
Instead, the early history of life is characterized by a web or a net involving multiple gene exchanges between all primitive species. After some time, the major divisions of life emerged from this "soup" and became separate lineages with a semi-independent history. This view dates back ten years or so and it's illustrated by a figure that Ford Doolittle published in the February 2000 issue of Scientific American. I've used this figure several times. Here it is again so you can see how it relates to Carl's article.
In the case of eukaryotes, the history is complicated by an endosymbiotic event in which a proteobacterium was engulfed and evolved into mitochondria. That explains many of the eukaryotic genes with a clear bacterial origin. Those genes can be reliably traced to a particular lineage of proteobacteria. What this shows is that by the time of the endosymbiosis most of the main lineages of prokaryotes had emerged from the soup and become fairly well-defined.
This doesn't explain the origins of the host cell. That cell presumably had some of the features of modern eukaryotes. Where did it come from? Was it part of an ancient lineage that formed during the gene exchange period of evolution suggesting that some eukaryotic features are ancient? Was it formed by a fusion between a primitive bacterial cell and a primitive archaebacterium? (Or, did archaebacterial arise from a fusion of a primitive eukaryotic cell and a primitive bacterium?)
I don't read the Huffington Post, but Barrett Brown's takedown of William Dembski is one of the best:
* In conjunction with his friends at the pro-ID Discovery Institute, Dembski decided to commission a Flash animation ridiculing Judge John Jones, the Bush-appointed churchgoer who, despite being a Bush-appointed churchgoer, ruled in the 2005 Dover Trial (known more formally as Kitzmiller v. Dover Area School District and even more formally as something longer and more formal) that intelligent design could not be taught in public school science classes. The animation consisted of Judge Jones represented as a puppet with his strings being held by various proponents of evolution; aside from being depicted as unusually flatulent, poor Judge Jones was also shown to be reading aloud from his court opinion in a high-pitched voice (Dembski's, it turned out, but sped up to make it sound sillier). The point of all of this, as The Discovery Institute explained, was that Jones had supposedly cribbed some 90 percent of his decision from findings presented by the ACLU, and that this was a very unusual and terrible thing for Jones to have done. On the contrary, judges commonly incorporate the findings of the winning party into their final opinion, either in whole or in part, and Jones' own written opinion actually incorporated far less than 90 percent of the findings in question. For his part, Dembski agreed to reduce the number of fart noises in the animation if Jones would agree to contribute his own voice. Jones does not appear to have accepted the offer.
* One of Dembski's hand-picked blog co-moderators, Dave Springer, once received an e-mail to the effect that the ACLU was about to sue the Marine Corps in order to stop Marines from praying; outraged, Springer posted it on his blog in order that his readers could join him in being affronted. After all, the e-mail had told him to. "Please send this to people you know so everyone will know how stupid the ACLU is Getting [sic] in trying to remove GOD from everything and every place in America," the bright-red text exhorted, above pictures of praying Marines. "Right on!" Dembski added in the comments. It was then pointed out by other readers that the e-mail was a three-year-old hoax; the ACLU spokesperson named therein did not actually exist, and neither did the ACLU's complaint. Springer was unfazed by the revelation. "To everyone who's pointed out that the ACLU story is a fabrication according to snopes.com -- that's hardly the point," he explained. "The pictures of Marines praying are real." Dembski himself had no further comment.
* Dembski has spent much time and energy pointing out that Charles Darwin made several racist statements back in the 19th century, even going so far as to call for a boycott of the British ten-pound note due to Darwin's picture being displayed thereupon. Incidentally, Dembski has spent most of the past decade working at universities within the fold of the Southern Baptist Convention, which was founded in the 19th century for the sole purpose of defending slavery.
Friday, August 07, 2009
With Limelight reporting earnings last night, it's now clear that the major players in the CDN space, the vendors that control the vast majority of the market share for video delivery, are all experiencing no growth. Akamai's M&E business was down and Limelight, Internap and Level 3 all reported no revenue growth for their CDN business. And with Q3 typically being a weak quarter for the CDNs and some of them setting guidance that shows no growth over Q2, we may have yet to see the bottom.
While Limelight was very optimistic that it will see growth in the second half of this year and that the CDN market as a whole will pick up, I'm not so sure that, industry-wide, that's going to happen in the next two quarters. While pricing still took a decline last quarter, I see the bigger impact being that traffic growth with current customers is nowhere near the level it once was, and many smaller content owners continue to go under. While Akamai and Limelight both talked about the future of HD and higher-quality video, more devices on the market, Blu-ray streaming, etc., none of that will take place any time soon on a large enough scale to impact their revenue in the near term.
As I wrote in my very first internet blog post on the ideas of Rodney Stark, when every single member of society is supposedly a Christian, to talk in terms of a Christian role in the advent of science is meaningless; one must instead examine the proponents of natural philosophy according to the various schools of philosophy that they adhered to. Here we don’t have a unified Christian thought propelling advances in science but various groups, Thomists, Ockhamists, Realists, Nominalists, Averroists and a whole artist’s palette of all shades to all sides and in between, as well as individualist loose cannons, some of whom, despite being outside all cliques and groups, exercised a lot of influence. The very fact that there were so many shades of opinion and open conflicts produced an atmosphere of intense discussion that almost certainly played a significant role in the furtherance of scientific inquiry.
My own goal is not really changing people’s minds; it’s understanding the world, getting things right, and having productive conversations. My real concern in the engagement/mockery debate is that people who should be academic/scholarly/intellectual are letting themselves be seduced by the cheap thrills of making fun of people. Sure, there is a place for well-placed barbs and lampooning of fatuousness — but there are also people who are good at that. I’d rather leave the majority of that work to George Carlin and Ricky Gervais and Penn & Teller, and have the people with Ph.D.’s concentrate on honest debate with the very best that the other side has to offer. I want to be disagreeing with Ken Miller or Garry Wills and St. Augustine, not with Paul Nelson and Ann Coulter and Hugh Ross.
Thursday, August 06, 2009
The first TVs fitted with Yahoo-Intel’s widget engine have begun to ship in Europe amid speculation about their potentially disruptive impact on the TV landscape.
On the one hand, web-enabled TVs retailing at over £1,000 (US $1,600) are unlikely to attract a mass market beyond early adopters, given that over the past 18 months the industry has made a pretty successful attempt to encourage people to upgrade to flat-screen HD sets as digital switchover gathers pace.
Yet the no-fuss plug and play internet access that widgets provide, albeit in limited ‘walled garden’ form, will give broadcasters and platform owners pause for thought.
“It’s not a slam-dunk competitor but a development that chips away at the edges of the pay-TV business,” says Nigel Walley, managing director of media strategists Decipher. “Pulling up a widget on the Samsung TV pushes the broadcaster’s EPG to one side, potentially delivering on-demand content outside its control. Widgets will raise the appetite among consumers to use the main screen for more activity, putting pressure on STB manufacturers to raise their game.”
Samsung TV was first to launch in April with a six-month exclusive deal to market Yahoo TV Widgets software in its Internet@TV branded displays. Yahoo's UK channels include widgets for news and sports reports, Flickr and, as of mid-July, YouTube.
Even today, as unfashionable as it sounds, and given Washington's attack on horsepower, Americans are still in love with automobiles. They still like going to showrooms, checking out the new models, inhaling the great new-car smell, and yes, kicking the tires and making a buy. Cars may no longer be the heart of our economy -- that's all techie, information gadgets now. But folks still love the car thing.
Now, I wouldn't want the government to pass out free money for everything. But in this particular case, the cash-for-clunkers rebate program is working. It's working so well that it's running way ahead of the computers that are administering it at the Transportation Department and Citibank.
But as book sales fall and publishing houses look for ways to cut costs, many literary agents are growing increasingly worried that publishers looking to trim their lists will start holding authors to deadlines and using lateness as an occasion to renegotiate advances and, in some cases, terminate contracts altogether.
“Publishers are looking at their books and saying, ‘O.K., this book is two years late. Do we want it anymore?’” said Eric Simonoff, an agent at WME Entertainment. “If the answer is no, they’re saying, ‘We don’t want it anymore—we’re calling our loan.’”
Wednesday, August 05, 2009
We miss something important when we just look at the genome as a string of nucleotides with scattered bits that will get translated into proteins — we miss the fact that the genome is a dynamically modified and expressed sequence, with patterns of activity in the living cell that are not readily discerned in a simple series of As, Ts, Gs, and Cs. What we can't see very well are gene regulatory networks (GRNs), the interlinked sets of genes that are regulated in a coordinated fashion in cells and tissues.
What this means is that if you look within a specific cell type at a specific gene, its state, whether off or on, will be correlated in a coherent way with a set of other genes. Look in a developing muscle cell, for instance, and you'll typically find a gene called MyoD is switched on, and also other genes, like Myf5 and myogenin. Look further, and you'll find others like C-jun and cyclin-dependent kinase 4, that also have their activity modulated in predictable ways. And when we start poking around experimentally, we discover that the relationships are often directly causal, with certain gene products binding to and modifying the expression of other genes.
There was never a good reason for assuming that organically grown foods would be more healthy than conventionally grown food, and now we have scientific evidence to back that up. From now on, whenever you hear someone say that "organic" foods are more healthy, you can inform them that what they are saying is contrary to the scientific evidence.
Tuesday, August 04, 2009
Interesting new study on how the explosion of a star influences its leftover pulsar.
"In 2007, computer simulations suggested that the stars don't explode in perfectly smooth spheres (J. M. Blondin and A. Mezzacappa Nature 445, 58–60; 2007). This latest visualization, created by Hongfeng Yu, a computer scientist at Sandia National Laboratories in Livermore, California, shows the entropy of the gases in the dying star's core, revealing the immense swirling currents that originated as tiny perturbations (gases with the highest entropy are yellow, followed by green and then purple). The currents "spin up the proto-neutron star, just like pulling a string on an old spinning top", says Bronson Messer, an astrophysicist at Oak Ridge National Laboratory in Tennessee, who contributed to the research. The work incorporates a new visualization technique, developed at Argonne National Laboratory outside Chicago, Illinois, which runs and visualizes the simulation directly on a Blue Gene/P supercomputer."
Thursday, July 30, 2009
Wednesday, July 29, 2009
Like a stopped clock that's right twice a day, Nietzsche hit the nail on the head in Beyond Good and Evil when he wrote "It is perhaps just dawning on five or six minds that physics, too, is only an interpretation and exegesis of the world...and not a world-explanation." The word "physics" here may stand for any of the empirical sciences, including psychiatry.

Come back, Scott! Before it's too late....
Tuesday, July 28, 2009
Imagine Finnegans Wake written in an impossible language. I imagine many people who've tried to read it have already concluded that it was written in an impossible language. But in fact what makes the book so nerve-wracking is that it was written in syntactically recognizable English, but with a slew of invented names, places and verbs occupying the place of more familiar subjects, predicates and adjectives, and no punctuation anywhere in sight.
No, what I mean is: imagine a language that, for example, mandates placing a particular word in a fixed position in the sentence, no matter when it is used. Or a language in which a statement of fact can be converted into a question by reversing the order of the words. (What kind of logic would follow from such a language?)
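The second thought experiment can be made concrete with a toy function for that made-up grammar, in which reversing a statement's word order yields a question. This is purely hypothetical; no natural language actually works this way:

```python
# Toy "impossible grammar": a statement becomes a question when its
# word order is reversed. Entirely invented for illustration.
def to_question(statement):
    words = statement.rstrip(".").split()
    return " ".join(reversed(words)) + "?"

print(to_question("the sky is blue"))
```

A language whose interrogatives were pure mirror images of its declaratives would presumably push its speakers toward a very different logic of question and answer, which is the point of the exercise.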
Marc Hauser, professor in the Departments of Psychology, Human Evolutionary Biology and Organismic and Evolutionary Biology at Harvard, has a fascinating article (registration required) in the July 9th issue of Nature on the constraints placed on human cultural evolution. Or rather, the possibility of getting around what have been observed to be the constraints on cultural evolution.
The article is detailed and I don't want to just cut and paste chunks of it here. But I do want to look briefly at Hauser's discussion of the sharp differences between animals and humans when it comes to various reflections of intelligence, and how it has been assumed that these differences between animals and our own species amount to a matter of degree, and not kind. Brain science has a lot to say about this.
The simple use of tools, for example, was long considered a significant distinction between us and other animals, until research on primates and other species showed that chimpanzees make use of natural items, like branches, as tools, as do some species of birds. But using a tool for one thing, and using it for several different things...is something else.
Although anthropologists disagree about the timing of the human cultural revolution... many researchers point to fundamental changes starting some 800,000 years ago in the Early Palaeolithic, with a crescendo of change at around 45,000–40,000 years ago in the Late Palaeolithic. This period is associated with the generation of symbols (mathematical, artistic and ritualistic), controlled fire for use in cooking and other forms of environmental transformation, and tools with multiple components and functions (for example, tools used for expressing both aggression and music). Given that this interval of several thousand years is barely noticeable on an evolutionary timescale, and that such cultural expressions emerged rapidly, the parallel with the Cambrian is striking: that is, something similar to a genetic revolution must have occurred during this period, providing humans with an unprecedented set of capacities for generating novel cultural expressions in language, morality, music and technology.

The human brain, Hauser goes on, changed from a system with a high degree of modularity with few interfaces to one with 'numerous promiscuous and combinatorially creative interfaces.' These interfaces are what bestowed on humans a set of abilities to generate novel cultural expressions in language, morality, music and technology.
At this point in the article, Hauser really brings the distinction into a fine relief (to borrow an art term) by pointing out the limits of other animals' intelligence.
Although many vertebrates have evolved brains with reciprocal connections or loops between different cortical areas (for example, basal ganglia to the cortex and back), these loops are restricted to particular functions....At the most general level, it is clear that the motor systems of all animals must involve recursive operations to allow organisms to take a discrete set of motor options and generate a vast range of functionally meaningful motor acts or sequences in novel environments. For example, whether an organism flies or runs, its legs must repeatedly lift and fall or its wings must repeatedly beat. However, because an organism's habitat and climate is constantly changing, the iterative or recursive rule of cycling through leg lifts or beating the wings must be flexible so that the animal's response can vary in response to environmental change.

I realize this may not be news exactly, but I like how Hauser has drawn such careful attention to these distinctions, which I think do tend to be overlooked in general science writing on evolution. It isn't just a matter of degree.
That said, the recursive properties of the motor system seem to be locked into motor function in all animals but humans. For example, in striking contrast to the recursive operations in human language, with its unrestricted use of different content or classes of words, the looping circuitry that is necessary for song acquisition in songbirds only supports singing and, in some cases, mimicry of other biological and non-biological sounds. This circuitry is not, however, used when they acquire the calls that constitute their repertoire more generally, including the sounds used in social interactions, food discovery and alarm calls.
Another example of generative computation comes from the domain of artefacts, in particular the creation and diversity of human tools. Unlike many of our simplest tools, such as the pencil, animal tools consist of a single material, never include more than one functional component, are typically dispensed after their first use and are never used for functions other than the original one. The first two features reveal that, unlike human tools, the representation of animal tools is not combinatorial. A pencil can combine four materials (graphite, wood, metal and rubber) to create four functions (graphite for writing, wood for holding the graphite, metal for attaching the rubber to the wood, and rubber for erasing). Moreover, each material can be used for a variety of other functions: for example, rubber can be a component of chewing gum. As experiments reveal, if a young child is asked what she can do with a pencil other than write, she will immediately offer such functions as holding up her hair, puncturing a plastic cover and poking a friend...Only humans think of artefacts as being designed for a particular function but, as a result of promiscuous interfaces, entertain many other possible functions. [emphasis mine]
His article goes on to discuss what kinds of research might more clearly map out, for lack of a better term, the blind spots in human cultural evolution, what kinds of cultural expressions have not become evident either because they are impossible for us to evolve, given the constraints on our evolution, or because they would be so complicated as to not survive and take root.