Wednesday, September 30, 2009
Nothing makes me more cynical about the whole book publishing industry than this. I mean, at least the young-adult vampire serials on the bestseller lists involve semi-original characters. The author actually does have to do some real work. I get that. But this plundering of literary classics is the worst yet.
In line with these thoughts, and with the regret that comes from having to acknowledge and yet set aside two things, namely the existence of human frailty and the contribution gifted individuals such as Roman Polanski make to society, I conclude that it is right that the United States authorities are seeking to extradite him to serve his sentence for rape. Neither fame nor wealth, neither time nor distance, should render anyone immune to laws protecting against serious crimes against other human beings.
Tuesday, September 29, 2009
As ludicrous as Shore's post is, I have to agree with Fecke that my favorite Polanski apologist is the Washington Post's Anne Applebaum, who finds it "bizarre" that anyone is still pursuing this case. And who also, by the by, failed to disclose the tiny, inconsequential detail that her husband, Polish foreign minister Radoslaw Sikorski, is actively pressuring U.S. authorities to drop the case.
Monday, September 28, 2009
Friday, September 25, 2009
Apropos the Coyne/Manzi debate, Ed Feser weighs in on teleology with some spirited and quite understandable exasperation with the way so many on both sides fail to understand Aquinas:
Let me make some general remarks about what the A-T [Aristotelian - Thomist] tradition does mean, then, before coming back to One Brow’s comment. If you are going to understand Aristotle and Aquinas, the first thing you need to do is put out of your mind everything that you’ve come to associate with words like “purpose,” “final cause,” “teleology,” and the like under the influence of what you’ve read about the Darwinism vs. Intelligent Design debate, Paley’s design argument, etc. None of that is relevant. If you think that what Aristotelians or Thomists mean when they say that teleology pervades the natural world is that certain natural objects exhibit “irreducible specified complexity,” or that some inorganic objects are analogous to machines and/or to biological organs, or that they are best explained as the means by which an “Intelligent Designer” is seeking to achieve certain goals, etc., then you are way off base. I realize that that’s the debate most people – including writers of pop apologetics books – think that arguments like the Fifth Way are about. They’re not. Think outside the box. “What hath Thomas Aquinas to do with William Paley?” Nothing. Forget Paley. (boldface mine)
Having had the undeserved good fortune of knowing him during his 21-year sojourn in Washington, I can testify to something lesser known: his extraordinary equanimity. His temperament was marked by a total lack of rancor. Angst, bitterness and anguish were alien to him. That, of course, made him unusual among the fraternity of conservatives because we believe that the world is going to hell in a handbasket. That makes us cranky. But not Irving. Never Irving. He retained steadiness, serenity and grace that expressed themselves in a courtliness couched in a calm quiet humor.
Thursday, September 24, 2009
Since the late 19th century, evolutionary biologists have debated whether evolution can go in reverse. If not, then evolution may depend on more than just natural selection. Multiple evolutionary paths could be possible through small chance events. It hasn't been easy to examine reversibility. Previous studies have focused on complex traits such as whale flippers, and scientists often lack sufficient information about ancestral traits or how present-day traits evolved.
So evolutionary biologist Joseph Thornton of the University of Oregon, Eugene, and his colleagues picked a more tractable subject: a single protein. His group has been studying the more than 450-million-year evolution of the glucocorticoid receptor (GR), a protein that binds to the stress hormone cortisol to control animals' response to it. Like all proteins, GR is made up of amino acids. By collecting the amino acid sequences of GR and related proteins from living animals, Thornton and his team previously constructed the GR evolutionary tree and resurrected sequences of GR's ancestors.
This history reveals that GR has switched its hormone preference. Around the time cartilaginous fish such as sharks split off from bony fish, roughly 440 million years ago, the ancestral protein that the scientists call GR1 responded to both cortisol and the hormone aldosterone. But 40 million years later, when four-legged creatures started to appear, the descendant GR2 had become cortisol-specific.
During these 40 million years, 37 amino acids changed. Only two were necessary to alter the function: One put a kink in the protein's shape, making it unresponsive to both hormones, and another allowed the restructured molecule to interact with only cortisol. Thornton's team next wondered if they could make GR2 recognize both cortisol and aldosterone by reverting these amino acids, which they call group X, back to their GR1 state. The researchers report today in Nature that this swap not only failed to restore GR's original dual function but also killed the protein's ability to recognize any hormone.
Wednesday, September 23, 2009
T. Ryan Gregory's paper, online.
As is true with many other issues, a lack of understanding of natural selection does not necessarily correlate with a lack of confidence about one's level of comprehension. This could be due in part to the perception, unfortunately reinforced by many biologists, that natural selection is so logically compelling that its implications become self-evident once the basic principles have been conveyed. Thus, many professional biologists may agree that “[evolution] shows how everything from frogs to fleas got here via a few easily grasped biological processes” (Coyne 2006; emphasis added). The unfortunate reality, as noted nearly 20 years ago by Bishop and Anderson (1990), is that “the concepts of evolution by natural selection are far more difficult for students to grasp than most biologists imagine.” Despite common assumptions to the contrary by both students and instructors, it is evident that misconceptions about natural selection are the rule, whereas a working understanding is the rare exception.

Because you can't study this stuff enough...
Tuesday, September 22, 2009
Friday, September 18, 2009
When Darwin claimed that all species have evolved from ancestral species so that each species is adapted to a specific manner of life, he was closer to Aristotle than to those nominalists who would deny the natural reality of species.

Essential reading for anyone interested in the history of evolution as an idea.
So, I am now pleased to report that the scholarly writing on the "species problem" seems to be moving towards this position as I argued it in 1998. Increasingly, historians of science and philosophers of biology are questioning the "essentialism story" told by Mayr and Hull that presents Darwin's "populational" thinking as a revolutionary rejection of the "essentialist" thinking that ruled over biology for two thousand years. Instead, scholars are rediscovering a tradition of Aristotelian biological empiricism that broke away from Platonic essentialism and prepared the way for Darwin.
Some of this new scholarship was surveyed a few years ago in an article criticizing the "essentialism story"--Mary Winsor, "Non-essentialist Methods in Pre-Darwinian Taxonomy," Biology and Philosophy, 18 (2003), pp. 387-400. Now we have two new books that elaborate the issues in this scholarly debate. Newly published is John S. Wilkins, Species: A History of the Idea (University of California Press, 2009). Soon to be published is Richard Richards, The Species Problem: A Philosophical Analysis (forthcoming from Cambridge University Press). I have read Wilkins' book, and I have read a few chapters from Richards' manuscript. Wilkins is a philosophy professor at the University of Sydney. Richards is a philosophy professor at the University of Alabama.
Wilkins provides an encyclopedic history of the idea of species from Plato to the present. Running through his history is his argument for the falsity of the essentialism story as told by Mayr and Hull. His argument rests on three claims (x-xi, 231-34).
Francis Beckwith is leaving the blog, just a few months after Zippy. I don't blame either of them, and while I'm certain there are lots of reasons for both gentlemen to move on, I can't help feeling there's something a bit off kilter about the whole WWWtW project. Perhaps because it strikes me as being primarily motivated by fear. I'm sure there's more to it, but I do see how it can become exhausting.
Tuesday, September 15, 2009
Monday, September 14, 2009
Thursday, September 03, 2009
Boston is awash in tourism “trails,” such as the Freedom Trail, the Women’s Heritage Trail, and so on. Just recently, Emerson College journalism professor Manny Paraschos created the Boston Journalism Trail, celebrating his contention that “Boston is the birthplace of American journalism.”
It may well be. Our first newspaper, Publick Occurrences Both Forreign and Domestick, started publishing in 1690. Paraschos’s trail escorts us past the original site of America’s oldest continuously published English language Jewish newspaper, the Jewish Advocate, and of course along downtown’s Newspaper Row, once home to 13 newspapers, depending on who is counting.
Yet there is a reason that journalism history cannot be left to the professorate. Where, for instance, is Dave Farrell’s table at Anthony’s Pier 4, where the veteran Herald and Globe columnist held court? Or George Higgins’s spot at Locke-Ober, where the novelist (“The Friends of Eddie Coyle”) and newspaperman always welcomed guests willing to pay a tab?
I would add a few more stops to the Journalism Trail, beginning perhaps . . .
I still expect my dad to call me every day, telling me what's in the WSJ or the New York Times....
When The View from Castle Rock appeared in 2006, Alice Munro was widely quoted as saying she thought the book would be her last, suggesting that at the age of seventy-five she might not “have the energy to do this anymore”. Yet I am not sure anyone actually believed she might simply retire. Few writers have been more attuned to the hazards of age, and as long ago as her Paris Review interview of 1994 she had spoken about her fear of losing the necessary stamina and desire, equating the end of writing with death of the mind itself. So her words made her readers afraid, afraid less for the work than for her personally.
That solicitude was curious. For if the open secret of Munro’s career has been the degree to which she has drawn on the details of her own life, the singularity of her work lies in the fact it has never seemed to be about her. Her parents once ran a fox farm, as do Del Jordan’s in Lives of Girls and Women (1971), and like so many of her protagonists she too moved in a first marriage from Ontario to British Columbia, and then left that union behind her. Those connections, however, have never really been the point of her work. Such paradoxically impersonal details are simply her material, and her many volumes of stories owe little to the teasing dance of art and actuality – the sense of a counterlife – that shapes so much autobiographical fiction.
What does all this tell us about early life? It tells us that the evidence for life before 3 billion years ago is being challenged in the scientific literature. You can no longer assume that life existed that early in the history of Earth. It may have, but it would be irresponsible to put such a claim in the textbooks without a note of caution.
What else does this story tell us? It tells us something about how science is communicated to the general public. The claims of early life were widely reported in the media. Every new discovery of trace fossils and trace molecules was breathlessly reported in countless newspapers and magazines. Nobody hears about the follow-up studies that cast doubt on those claims. Nobody hears about the scientists who were heroes in the past but seem less-than-heroic today.
That's a shame because that's how science really works. That's why science is so much fun.
Wednesday, September 02, 2009
Scientists at Stanford University in Palo Alto, California, have unsettling news from what they say is the first-ever study of chronic multitaskers.

My brain is definitely going to be scrambled.
A team headed by psychologist Eyal Ophir compared 19 "heavy media multitaskers" (HMMs), identified by questionnaires on media use, with 22 "light media multitaskers" (LMMs). They tested how well the subjects could filter relevant information from the environment, filter relevant information in their memories, and quickly switch cognitive tasks. One filtering test, for example, required viewers to note changes in red rectangles while ignoring blue rectangles in the same pictures.
HMMs did worse than LMMs across the board. Surprisingly, says co-author Clifford Nass, "they're bad at every cognitive control task necessary for multitasking." Nass, a sociologist, says the study has "disturbing" implications in an age when more and more people are simultaneously working on computers, listening to music, surfing the Web, and texting or talking on a phone. Also troubling, he notes, is that "people who chronically multitask believe they're good at it." The findings are reported this week in the Proceedings of the National Academy of Sciences.

The team hopes to investigate whether multitasking really scrambles brains or whether people with poor filtering and attentional abilities are more attracted to it to begin with. Psychologist Anthony Wagner suspects that media multitasking offers instant rewards that reinforce "exploratory" behavior at the expense of the ability to concentrate on a particular task.
Lunar losses: India formally abandoned its first lunar orbiter on 30 August, after scientists at the Indian Space Research Organisation (ISRO) abruptly lost radio contact with the probe. Chandrayaan-1, launched last year to map the Moon, ended its mission 14 months early, but the ISRO said it had met most of its scientific objectives. Meanwhile, NASA's Lunar Crater Observation and Sensing Satellite (LCROSS) accidentally burned up most of its spare fuel on 22 August. Mission managers say it remains on track to smash into the Moon on 9 October, in the hope of kicking up evidence of ice.
Hand axes from southern Spain have been dated to nearly a million years old, suggesting that advanced Stone Age tools were present in Europe far earlier than was previously believed.
Acheulian axes, which date to at least 1.5 million years ago, have been found in Africa, and similar tools at least 700,000 years old have been found in Israel and China. But in Europe, sophisticated tool-making was thought to stretch back only around 500,000 years.
Cave sediment levels that included the two axes also held what some archaeologists believe may be small tools made using the so-called Levallois technique of shaping stone, previously thought to have appeared in Europe only about 300,000 years ago.
... in 2006, geneticists showed for the first time that they could identify truly novel genes. In fruit flies, they came across five young genes that were derived from "noncoding" DNA between existing genes and not from preexisting genes. As a result, other researchers started looking for novel genes in other species.
Meanwhile, while looking for gene duplications in humans, geneticists Aoife McLysaght and David Knowles of Trinity College Dublin kept coming across genes that seemed to have no counterparts in other primates, suggesting that new genes arose in us as well. To determine which of these genes were de novo genes, McLysaght and Knowles first used a computer to compare the human, chimp, and other genomes. They eliminated all but three of the 644 candidates, discarding those whose database sequences were incomplete or that had equivalents in other species.
Next, they searched the chimp genome for signs of each gene's birth. "We strove hard to identify the noncoding DNA that gave rise to the gene," McLysaght says. Only by finding that DNA could they be sure that the gene wasn't already present in the chimp genome but was somehow unrecognizable to gene-finding programs. At three locations where the chimp and human genomes were almost identical, telltale mutations indicated that it was impossible to get a viable protein from the chimp DNA sequence. In contrast, the human version of each sequence had mutations that made it a working gene, the researchers report online tomorrow in Genome Research.
The researchers were able to verify that the genes worked by checking messenger RNA databases and protein surveys done by other scientists. They are now using antibodies to find out where in the cells these proteins are active and are trying to disable the genes in cells to tease out their functions.
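The filtering pipeline described above can be sketched in miniature. This is only an illustrative sketch: the field names, example records, and thresholds are all hypothetical stand-ins for the real genome-alignment analysis.

```python
# Hypothetical sketch of the de novo gene filtering logic described above.
# Real work runs over genome alignments; these toy records just show the shape
# of the two filters: (1) complete sequence with no equivalents in other
# species, (2) the aligned chimp sequence carries protein-disabling mutations.

def is_de_novo_candidate(gene):
    """Keep a candidate only if its sequence data are complete and it
    has no equivalent in any other species surveyed."""
    return gene["sequence_complete"] and not gene["orthologs"]

def chimp_copy_is_disabled(gene):
    """A de novo origin requires the aligned chimp sequence to carry
    mutations that prevent it from yielding a viable protein."""
    return gene["chimp_disabling_mutations"] > 0

candidates = [
    {"name": "geneA", "sequence_complete": True,  "orthologs": [],
     "chimp_disabling_mutations": 2},
    {"name": "geneB", "sequence_complete": False, "orthologs": [],
     "chimp_disabling_mutations": 1},   # dropped: incomplete sequence
    {"name": "geneC", "sequence_complete": True,  "orthologs": ["mouse"],
     "chimp_disabling_mutations": 0},   # dropped: has an equivalent elsewhere
]

de_novo = [g["name"] for g in candidates
           if is_de_novo_candidate(g) and chimp_copy_is_disabled(g)]
print(de_novo)  # only geneA passes both filters
```

The point of the two-stage check mirrors the article: ruling out counterparts elsewhere is not enough; you must also show the ancestral (chimp) sequence could not already have been a working gene.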
What these genes actually code for will be fascinating to find out. You also gotta love a researcher with a classic Irish name like Aoife McLysaght (talk about right out of The Tain...)
Tuesday, September 01, 2009
Creation, then, does not make any difference to things. If you like, it makes all the difference, but you cannot expect to find a 'created look' about things. The effect of creation is just that things are there, being themselves, instead of nothing. Creation is, of course, an unintelligible notion. I mean it is unintelligible in the sense that God is unintelligible. It is a mystery. Not that the notion is self-contradictory, but it involves extrapolating from what we can understand to what we are only trying to understand. To be created is to exist instead of nothing; but the notion of 'nothing' is itself a mystery unintelligible to us.
Unless we grasp the truth that creation means leaving the world to be itself, to run itself by its own scientific laws so that things behave in accordance with their own natures and not at the arbitrary behest of some god, we shall never begin to understand that the Lord we worship is not a god but the unknown reason why there is anything instead of nothing. --Herbert McCabe, God Still Matters.