# Monthly Archives: May 2009

## Tiller: 2 Degrees

George Tiller, a physician who provided late-term abortions to women with high health risks, was fatally shot in his church. Obviously, the shooter has no acquaintance with the concept of sanctuary.

The famous Kevin Bacon site demonstrated that there are fewer than six degrees of separation between most people. That is, if you know someone who knows someone, and so on, in six or fewer such links you can connect any two people at the ends of such a chain. I didn’t know it before, but I turn out to be just two degrees of separation from George Tiller; I have a friend who knew Tiller and counted him as a friend.

## State Board of Education Seat is Not a Pulpit

Responding to the Texas state Senate’s failure to confirm religious antievolutionist Don McLeroy as chair of the State Board of Education, Sen. Steve Ogden went on the record:

Ogden decried much of the criticism of McLeroy as a “slur.”

“It is not fair to say that if you don’t believe Darwin’s theory of evolution or accept the argument that global warming is occurring, that you should not be on the State Board of Education,” he said.

Mr. Ogden, any concerned citizen could be on a state board of education. Any citizen holding unsubstantiated opinions about empirical research can hold a seat on such a board. However, when a person holding an unsubstantiated, narrow sectarian viewpoint uses their seat on such a board to push those views as legitimate science, they are not only failing to do the job they are supposed to be doing; they are committing malfeasance and have demonstrated by such bad behavior that they do not belong there.

You want kids to learn weird stuff in science classes? Pony up the appropriate ante: show that the arguments you favor have demonstrated their merit under scientific scrutiny and have convinced the scientific community of their worth. Until then, the best approach is to make sure that kids learn what science has to offer in their classes. If it is wrong, the researchers of tomorrow will have a better shot at showing it is wrong if they actually understand it in the first place. Trying to replace instruction with confusion, as McLeroy has consistently advocated, does no one any good.

## A Nice LaTeX Cheat Sheet

I ran across this cheat sheet while looking for a way to set line spacing to single-spaced within paragraphs and double-spaced between paragraphs in the front matter. While it doesn’t have the answer to that, it does look like a very handy reference for the more commonly encountered situations in $\LaTeX$.
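For what it’s worth, one plausible approach to that spacing problem, which I have not verified against any thesis-office rules, is to combine the standard setspace package with the `\parskip` length; something along these lines:

```latex
% In the preamble: single spacing within paragraphs,
% with roughly a blank line's worth of space between them.
\usepackage{setspace}
\singlespacing
\setlength{\parskip}{\baselineskip}
\setlength{\parindent}{0pt}% optional: skip the indent when paragraphs are space-separated
```

Whether that satisfies a particular front-matter requirement would take some experimenting.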

If you are wondering what $\LaTeX$ is, it is a document-processing and typography system. It is to a word processor what a process camera is to a point-and-shoot consumer camera. It’s big, has a steep learning curve, but delivers results far beyond what can be done with consumer-grade word-processing applications, or at least makes it possible to do those tasks with far less hair-pulling.

Technically, $\LaTeX$ is a set of macros originally by Leslie Lamport built on the $\TeX$ typography system of Donald Knuth. Documents in $\LaTeX$ are actually programs, so the process of building a document in $\LaTeX$ is much like software development. While there are commercial versions of $\LaTeX$ systems, pretty much everyone I know uses free, open source distributions like MiKTeX or TeX Live. There are a number of frontends that help users construct and typeset documents using $\LaTeX$: TeXnicCenter for Windows, TeXShop for Mac, and Kile for the KDE GUI on Linux and FreeBSD.

Why use $\LaTeX$ and not either word processing programs like WordPerfect and Word or desktop publishing packages like Ventura and Quark? First, $\LaTeX$ has excellent mathematics typesetting capabilities. It is the sole format accepted by many journals that often deal with typesetting equations. If you are writing for such journals, there is no alternative. If you want to publish math-heavy text and not spend oodles of your time trying to figure out what went wrong in an “equation editor” for a consumer word processing program, you want $\LaTeX$. Second, it incorporates a huge amount of typography experience. If you are concerned about making documents that are not just formatted well, but make it easy on the eyes of the reader, $\LaTeX$ provides that for you. It is flexible enough that if you think you know better, you can override just about anything, though most of the time that’s not really going to help your readers. Third, $\LaTeX$ automates just about everything that makes writing large documents a hard task. Let me explain that by example.
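To give a taste of what the mathematics support looks like, here is a small, complete document that typesets a displayed, numbered equation:

```latex
\documentclass{article}
\begin{document}
The quadratic formula gives the roots of $ax^2 + bx + c = 0$:
\begin{equation}
  x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\end{equation}
\end{document}
```

No “equation editor” dialog boxes; the markup is plain text, so it diffs, scripts, and version-controls like any other source file.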

When Diane and I were writing our dissertations, we had the task of putting together several chapters of material where the final document had to conform to a long set of rules used by the Thesis Office at our university, both to assure consistency across dissertations and to allow microfilming archives to be able to use the result. In particular, there are rules about the placement of figures and tables relative to where they are first referenced in the text. In word processors, you place your text and you place your figure or table, and there is no effective control over where the word processor finally decides to put the figure or table. I had several chapters of material, and I tried WordPerfect and Microsoft Word, without success. I tried Microsoft Publisher and Corel Ventura, but also ran into difficulties there. It was around that time that I started looking at $\LaTeX$ as an alternative. I found that dissertations in the electrical engineering department were often done in $\LaTeX$ and that there was a thesis class (a sort of configuration or environment document setting a style) for $\LaTeX$ that the EE department made available. This was like an existence proof; people had actually managed to get the thesis office to accept their manuscripts when done with the thesis class. I asked Jeff Shallit for recommendations on books, and he pointed me to the $\LaTeX$ book by Leslie Lamport and the $\LaTeX$ Companion book by Goossens, Mittelbach, and Samarin. What I found back in 2002 was that getting acquainted with $\LaTeX$ definitely took some effort, but it almost immediately was paying off. My figures and tables weren’t going hither and yon willy-nilly; they were pretty much where they needed to be, or could be tweaked to do so.

Then, some of the other benefits started becoming apparent. While word processors have some mechanisms for generating front matter (table of contents, lists of figures and tables), $\LaTeX$ could do this in a very systematic way that basically took the entire load off my back. The other bane of the dissertation writer is references. The Thesis Office wanted all references cited in a consistent style, formatted in a consistent style, and listed in order; every citation in the text had to appear in the references, and no reference could appear that was not cited in the text. That last one puts a huge load on someone who is organizing a large set of references for themselves. Let’s say that your committee decides that you should remove some text including a citation that only appears in that text. You have to remember not only to remove the text, but to revamp your bibliography so that the now-uncited reference no longer appears. $\LaTeX$ has a helper program, BibTeX, specifically for handling bibliographic data. Using BibTeX and the natbib style, I was able to address all the concerns of the Thesis Office while keeping things pretty simple for me. BibTeX allows you to set up one or more bibliographic source files containing all the references that you might want to use in your document. Within the document, citing a reference occurs using a “\cite” command. There are variants to allow for various in-text citation formats. The cite command is given a parameter that links to a particular reference in one of your BibTeX files. $\LaTeX$ sets up a file used by BibTeX to pull in just the references that are actually used, and BibTeX applies the desired style to produce the typesetting for the references section. The result is that the references section went from something needing a lot of continuing effort to maintain to needing almost no effort to maintain. That sort of assistance is invaluable when what one wants to be doing is writing content and not worrying incessantly about keeping all the effects of changes one makes to the layout in mind.
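The workflow described above looks roughly like this in practice (the entry key “lamport94” and the file name “refs.bib” are just examples, not anything the Thesis Office dictated):

```latex
% refs.bib would contain entries like:
%   @book{lamport94,
%     author    = {Leslie Lamport},
%     title     = {LaTeX: A Document Preparation System},
%     publisher = {Addison-Wesley},
%     year      = {1994}
%   }
\documentclass{article}
\usepackage{natbib}          % flexible author-year and numeric citation styles
\begin{document}
Lamport's manual \citep{lamport94} covers the basics.
\bibliographystyle{plainnat} % controls the formatting of the references section
\bibliography{refs}          % pulls in only the cited entries from refs.bib
\end{document}
```

The build cycle is to run latex, then bibtex, then latex twice more so the citations resolve; delete the `\citep` and the corresponding entry silently drops out of the references section on the next build.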

Something I didn’t use in writing my dissertation that $\LaTeX$ provides is indexing. If you want to produce a large manuscript with an index, this is something that you can do pretty easily in $\LaTeX$. Basically, as you go along in the text, you place an index tag next to the text that you want the index entry to refer to. $\LaTeX$ will track the entries and the corresponding page numbers for you. If you re-organize your text, say by swapping chapters 2 and 3 around, you don’t have to re-do all those page number references in an index, $\LaTeX$ will handle it for you.
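Index tagging with the standard makeidx package looks like this:

```latex
\documentclass{book}
\usepackage{makeidx}
\makeindex  % tells LaTeX to write index entries out to an .idx file
\begin{document}
BibTeX\index{BibTeX} handles bibliographies, while
makeindex\index{makeindex} builds the index itself.
\printindex % the sorted, paginated index appears here
\end{document}
```

One runs latex, then the makeindex program on the resulting .idx file, then latex again; the page numbers in the index are recomputed on every build, which is why reorganizing chapters costs you nothing.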

$\LaTeX$ provides several basic document classes for you, and you can find extensions online. The basic ones include “letter”, “article”, “book”, and “slides”. That last allows you to generate presentations in $\LaTeX$. Then there are all sorts of styles that one can add on. For example, if you want to write screenplays using the standard formatting rules, the screenplay style can help you. (If, though, you are really intent on screenplay writing, you probably want to look at Celtx. [Addendum: Looking a bit more at the Celtx website, I found this: “TypeSet provides precise automatic formatting of your script to industry and international standards. The Celtx server uses the very powerful LaTex typesetting tool to deliver perfectly formatted scripts.”])
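A minimal presentation with the standard slides class is just a change of class and environment:

```latex
\documentclass{slides}
\begin{document}
\begin{slide}
  \begin{center}\Large Why \LaTeX?\end{center}
  \begin{itemize}
    \item Mathematics typesetting
    \item Automated cross-references and front matter
  \end{itemize}
\end{slide}
\end{document}
```

Each slide environment becomes one page of output, set in large type suitable for projection.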

There’s a system called LyX that places itself in between full $\LaTeX$ and the usual way one uses a word processor. $\LaTeX$ is used by LyX as a back-end, and you get a display of text that is a bit closer to the usual WYSIWYG experience, but cast by LyX as “what you see is what you mean”. Unfortunately, LyX documents are not simply standard $\LaTeX$, which to me is a limitation of the system.

Since writing my dissertation, I have relied upon $\LaTeX$ for all my serious writing work, save where a collaborator has insisted upon something else. I use $\LaTeX$ for writing letters, and it is the basis for the six or so pending article manuscripts I have. My curriculum vitae/resume is handled in $\LaTeX$, and I have that set up such that I can generate documents of different lengths and detail, plus tune the focus of my research statement, all by changing a couple of configuration settings. This means that I have one source text whether I want a CV or a resume, and whether I’m sending the result to someone interested in my biology background, someone interested in my computer science background, or someone who appreciates my interdisciplinary approach. That also means that I can keep things up to date with changes to just one file, and not a dozen different ones to handle the most common sets of configuration changes that I use.
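The single-source trick can be sketched with plain TeX conditionals; the switch names below are made up for illustration, not the ones in my actual CV source:

```latex
\documentclass{article}
% Hypothetical switches; flip -true to -false to change what gets generated.
\newif\iflongcv   \longcvtrue   % full CV vs. short resume
\newif\ifbiofocus \biofocustrue % biology vs. computer science emphasis
\begin{document}
\iflongcv
  \section*{Publications} % the full list appears only in the long CV
  % \input{publications}
\fi
\ifbiofocus
  My research applies evolutionary computation to biological questions.
\else
  My research applies biological insights to computing problems.
\fi
\end{document}
```

With a handful of such switches, every combination of length and emphasis comes out of one file, so an update to a publication list or job entry happens exactly once.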

If you don’t do equation type-setting, don’t need figures and tables to go where you want, don’t need front matter or bibliographies, and don’t need an index, you’ll probably be perfectly happy using the usual word processor. If you do need any of those things, then you owe it to yourself to check out $\LaTeX$.

## On the Use of Torture
A commenter on Ed Brayton’s “Dispatches from the Culture Wars” asked what I can only assume was meant to be a poser of a question about the use of torture. I responded there, but figured I should also note it here.

It is clear, from all the investigations done to date, that information was gained which saved many lives. So even knowing that, would you all prefer it had not been done?

Yes, so long as “it” is referring to torture.

If we were conducting a moral war, a war on terror, we lost when we failed to conduct ourselves morally.

Taking a moral stance against torture may involve loss of innocent life. Taking moral stances in the past has definitely cost us in terms of lives lost. That hasn’t prevented us from taking such stances. Have we gotten so cowardly recently?

The argument from beneficial results of torture as an intelligence-gathering method has an unstated assumption: that we have adopted a pragmatic or utilitarian stance, where it is merely the cost/benefit ratio of torture use that controls whether we rationally should use torture ourselves. It should be noted that by this standard, the users of torture who stand opposed to us can also justify their use of torture against our citizens. And all that is needed to make that equation tip in favor of torture is that there be no value attached to the life of the “other”, whether theirs by us, or ours by them.

Even if one accepts the implied pragmatic or utilitarian view that can countenance torture, the relation between cost and benefit isn’t as simple as some would have it. Taking an immoral stance, as our leaders have done for us, doesn’t mean that the intelligence gathered that way did preserve the life of some of our citizens. Is there any documented instance where only torture was productive, and no other intelligence methods contributed to our knowledge and response? In any such instance, did the use of torture preclude other methods of intelligence gathering that might have been productive? It is only if torture is cost-free that one may ignore these sorts of considerations. Again, that’s the case when no value is attached to the lives of the others. (I did not say “innocent” in the second instance above, because we can no longer claim innocence for ourselves, thanks to the actions ordered on our behalf.)

If we want to claim the moral high ground for ourselves, we have to give up on pragmatics or utilitarianism on torture. The most effective policy change I can think of that would make that clear to the world would be treating everyone with the same care, rights, and privileges that we offer our own citizens. Given what a rough deal some of our own citizens get, that’s a low enough ante to show some minimum level of commitment.

## Benjamin Franklin and the Anti-Vaccination Argument

I ran across this in the Autobiography of Benjamin Franklin:

In 1736 I lost one of my sons, a fine boy of four years old, by the small-pox, taken in the common way. I long regretted bitterly, and still regret that I had not given it to him by inoculation. This I mention for the sake of parents who omit that operation, on the supposition that they should never forgive themselves if a child died under it; my example showing that the regret may be the same either way, and that, therefore, the safer should be chosen.

The anti-vaccination impulse seems to be ancient, and anciently rebutted.

## Lying With Dogs

Conservative commentator Denyse O’Leary has posted an interview with Adnan Oktar at the Uncommon Descent weblog. Oktar is probably better known by his pen name, “Harun Yahya”. He writes prolifically, and has several antievolution books. O’Leary appreciates Oktar’s antievolution. One wonders, though, whether she is as appreciative of some of Oktar’s other contrarian stances, like, say, his Holocaust denial activism. The phenomenon is called “second denial”: those who engage in evolution denial often also take up some other obviously wrong idea.

This isn’t exactly low-profile. The TalkOrigins Archive has had an article up documenting the Holocaust denialism of Harun Yahya as far back as 2003. At the time, it was tough to find the content of Oktar’s “The Holocaust Hoax”, but thanks to “sparc” at AtBC, I now have a link to the full book online. Oktar’s thesis is that Zionists collaborated with Hitler to set up a system to encourage emigration to Palestine; that once hostilities in World War II began, Hitler continued with the policies of segregation previously agreed to by collecting Jews in labor camps; and that the whole mass mortality thing was due to “tiffus plague” and the general end-of-war famine. Oktar specifically denies that any of the concentration camps used gas chambers for mass killings.

How about it, Denyse? What do you think of Oktar’s “second denial”?

## New Server

The email server I use was having some hardware issues. Marc picked up a new box and disk, and Jeff, who has somewhat more spare time at the moment than I do, suggested we go with Ubuntu Server 9.04 for the new install.

So we switched from a FreeBSD 6.3 box to Ubuntu Server today; the new mail system on it is Postfix/MySQL/Courier. We spent a bit over four hours copying files and preparing the user accounts to use the new system before bringing the Ubuntu server online in place of the old FreeBSD one.

The rest of my day has been spent in fixing up other issues, like switching over the couple of WWW domains that were served from there and setting up email list software.

I’ve been using Majordomo for email lists since the 1990s. Unfortunately, that’s about the time of the last update for that software, too. So I am getting acquainted with Mailman instead.

Hopefully, most of those issues will be sorted before the end of the weekend.

On a somewhat more personal note, the way that I’ve done email since the 1990s has been disrupted. I’ve used the .forward file in my user account to pipe incoming email into a Perl script I wrote. It uses a whitelist file and a pattern file to sort incoming mail and append it to a file named for the day and with an extension according to the recognized class of email. Most of my email reading has been done using emacs from the command line of an ssh session. Now I’m dealing with using SquirrelMail as a primary interface, at least until I can work out what to do about the setup. I’m looking into Fetchmail, which I’m hopeful may allow me to do much the same thing as I did before, where my script put back into my incoming mailbox only those items matching my whitelist criteria.