Monthly Archive: September 2011
Computation | Wesley R. Elsberry on 22 Sep 2011
Some time back, I mentioned getting data off CD-ROM and putting it on hard disk with a second hard disk for back-up. As time passes, this gets more critical. I think archivists start getting antsy about CD-ROM after a decade or so, and I have media that go back to 1996.
And I have run into CD-ROM data disks with various reading errors.
So I thought that I would mention a freeware tool for Windows that addresses getting what can be gotten from a CD-ROM with problems. This is Roadkil’s Unstoppable Copier (RUC). Fortunately, you can stop it in bad circumstances by killing the process in Task Manager. I’ve done this after setting it to work on a CD-ROM with an obvious, visible blemish. In its default setup, RUC will attempt multiple reads of bad sectors in order to recover as much of a file as possible. This leads to it taking a long, lllllooooonnnnngggg, time to get through a patch of damage. Longer than I was willing to wait, anyway. So in the “Settings” tab, I set it to “Auto Skip Damaged Files”. This copies off all the undamaged files from the CD-ROM, and it does so fairly expeditiously. For some CDs, I may decide to let it trundle for a few days to analyze things, but first I want to get as much of the good stuff secured as I can. This tool looks to be a help in that regard.
The lengthy recovery process is probably most useful for large text files, where recovering a majority of a file is preferable to losing all of it due to a possibly small section that is damaged. For binary files, this may not be universally useful. The data files I have are raw integer data, so as long as the reconstituted file preserves the same length, I can recognize the bad patches and leave them out of analysis. That may not hold true for ZIP files and other compressed archives, JPG images, and the like.
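For raw integer data like mine, spotting the bad patches in a recovered file can be automated. Here is a minimal sketch, assuming the recovery tool fills unreadable regions with zero bytes, that the file length is preserved, and that damage aligns to the standard 2048-byte CD-ROM data sector; the function name and sector size are my illustrative choices, not anything specific to Roadkil's Unstoppable Copier:

```python
SECTOR_SIZE = 2048  # standard CD-ROM data sector size (assumption)

def find_zeroed_sectors(path, sector_size=SECTOR_SIZE):
    """Return indices of sectors that are entirely zero bytes.

    Assumes the recovery tool wrote zeros for unreadable sectors
    and preserved the original file length.
    """
    bad = []
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(sector_size)
            if not chunk:
                break
            # A sector consisting solely of zero bytes is flagged as
            # a suspected bad patch to exclude from later analysis.
            if chunk.count(0) == len(chunk):
                bad.append(index)
            index += 1
    return bad
```

A run of genuinely all-zero data would be flagged too, so the output is a list of suspects to inspect, not a verdict.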
We’ve known for a long time that Casey Luskin has some very odd ideas about what constitutes a technical publication. Casey’s been good enough to document another deficiency of his in this respect for all to see, but no one is allowed to comment. (I wonder what happened to the Discovery Institute’s grand experiment in interactive commentary, anyway?)
Casey thinks I’m a hypocrite for criticizing Granville Sewell on the topic of self-plagiarism. As evidence, he notes that an essay co-authored by Jeff Shallit and I was published on the web and later in the journal Synthese.
The Case of Wesley Elsberry’s Self-Plagiarism
In 2003, Wesley Elsberry and Jeffrey Shallit co-published a paper, “Information Theory, Evolutionary Computation, and Dembski’s ‘Complex Specified Information,’” on the website TalkReason.org. (I wrote a response to the substance of their 2003 article here.)[*]
In 2011, Elsberry and Shallit co-published a paper in the journal Synthese titled “Information theory, evolutionary computation, and Dembski’s ‘complex specified information.’”
If you’ll notice, the titles of those two papers are identical. That’s not all that’s identical in the papers. A comparison performed by a colleague using the plagiarism-detection software SafeAssign shows that these two papers are ~94% matching.
(Note: The analysis used text files I had prepared using the original PDFs of the papers. For processing, I had to strip out some numbers and mathematical equations which did not translate well into the text files. Also, my colleague’s name has been redacted.)
Isn’t it just a bit hypocritical that Elsberry harps upon Sewell’s supposed mortal sin of “self-plagiarism” when Elsberry himself has taken previously published work and then republished it in academic journals?
Yeah, I’ll stipulate that the essay is mostly the same. But…
Casey, Casey, Casey… Republishing essentially the same thing multiple times in the technical literature is a bad thing. Taking something that’s been released on the web but not yet published in the technical literature and publishing it there is perfectly fine, with a caveat: the authors should make sure that the editors are aware of the prior release. This was done for the essay that was published in Synthese. (The editors also knew of a similar essay published in 2004’s “Why Intelligent Design Fails”, which Casey hasn’t mentioned yet.) This situation is not what “self-plagiarism” applies to. Nor is converting material from a dissertation into technical articles considered self-plagiarism, which is another process that I’m still working on. For another case in point, some time ago Reed Cartwright blogged a criticism of a paper. Another researcher saw that and invited Reed to contribute to a response letter in the technical literature. Does Reed’s previous web publication of the line of criticism used in the letter establish “self-plagiarism”? That’s a clear “No”. Scientists treat the technical literature as a separate source of knowledge from popular sources like blogs and portal sites. Repetition of material in lay outlets is essentially of no concern to the scientific endeavor. When it occurs in the technical literature, it is perceived as a pernicious problem.
But Granville Sewell doesn’t have a situation analogous to mine, where I converted a lay release into a publication in the technical literature. The Discovery Institute itself counts his shtick about the 2nd Law of Thermodynamics twice already in its list of “peer-reviewed” work on ID. I have no doubt that had AML actually followed through on publication of the essay, the DI would have happily counted it three times over in their list instead of just twice. The DI and its spokes-weasels can’t simultaneously claim that each re-publication counts separately and that self-plagiarism that repeats the same arguments in the technical literature is not happening. Of course, Casey knows how weak his position is, else he wouldn’t have added the following to his screed:
So I personally don’t care if Wesley Elsberry plagiarizes himself, and it doesn’t matter to me one bit if he resubmits material he’s already published to any publication he likes.
My point is simply this: it is hypocritical for Elsberry to attack Sewell for “self-plagiarism,” when Elsberry does the same thing. What Sewell (and Elsberry) have done isn’t a crime. Elsberry’s complaint is both baseless, and hypocritical.
Given that IDC advocates are so unproductive, Casey has to defend the line that if they can manage to sneak the same stuff around to multiple venues within the technical literature, there’s nothing wrong with that. Well, there is something wrong with that. Maybe it isn’t high on the list of academic sins, but gaming the technical literature certainly goes some way toward demonstrating intellectual dishonesty.
[*] Casey, you did not write a response to the substance of our essay. That would have required reading comprehension on your part. What you wrote was an orgy of strawman gouging and delusional codswallop.
Law and Politics | Wesley R. Elsberry on 14 Sep 2011
In his novel “Witches Abroad”, Terry Pratchett sets up an image of “the optimist’s fire”, which he then defines as “two logs and hope”. I bring this up because I’m seeing people who desperately want to be seen as hard-nosed and fiscally responsible proposing the optimist’s economic booster: tax cuts and hope.
Cutting taxes on industry (and regulations, too!), it is said, will finally bring about job creation and get the economic fire burning once again. I am afraid that I’m not seeing the causal link here. Our corporate citizens have plenty of money on hand, with various estimates putting those cash reserves in the trillions. No contemplated tax cut approaches that scale, which leads to the vexing question of why jobs aren’t being created now. If businesses are worried about consumer spending, tax cuts for businesses don’t address that at all.
Well, having said that, I would propose a tax cut for businesses. But I am not proposing an across-the-board tax cut. No, I think that there should be a tax incentive for businesses that create new jobs. If you have a business and are bringing more people into the workforce than you did the year before, your business should get a break on its taxes. If you have a business that is standing pat on jobs or shedding them or outsourcing them, then, sorry, but no tax break for you. Probably this would have to have some metric that ties together both number of jobs and total compensation per job. The right incentive structure will reward companies more for better-paying jobs, and less for low-paying job creation. But that’s a refinement to be considered after getting more people onboard with the big idea.
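To make the proposed incentive concrete, here is a minimal sketch of one way a metric could tie together net new jobs and compensation per job. Everything here is a hypothetical illustration of the big idea, not a worked-out policy: the function name, the parameters, and the 10% credit rate are all my assumptions.

```python
def job_creation_credit(jobs_prev, jobs_now, total_payroll_now, rate=0.10):
    """Hypothetical tax-credit metric: reward only net new jobs,
    weighted by average compensation per job.

    The 10% rate and the parameter structure are illustrative
    assumptions, not figures from any actual proposal.
    """
    new_jobs = jobs_now - jobs_prev
    if new_jobs <= 0:
        # Standing pat, shedding jobs, or outsourcing: no tax break.
        return 0.0
    # Weighting by average compensation rewards better-paying jobs
    # more than low-paying job creation.
    avg_compensation = total_payroll_now / jobs_now
    return rate * new_jobs * avg_compensation
```

Under this sketch, a firm growing from 100 to 110 employees on a $5.5M payroll would earn a credit proportional to ten new jobs at $50,000 average compensation, while a shrinking firm gets nothing.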
This puts the right outlook on using the tax code to influence business: it makes clear what behavior is needed to take advantage of the tax incentive, and makes clear that until that behavior is actualized, there is no corporate handout in the offing. It rewards the corporate citizens who are making things better here in the USA, and withholds benefits from those who seek to cut labor corners or send jobs overseas. It means that we aren’t cutting our own economic throats to no better purpose than padding executive bonuses, which seems to be all that we’ve gotten out of corporate tax-cutting in recent memory. It would mean that we are applying a lighter where it is needed rather than hoping that a fire will start on its own somewhere, somehow. And, best of all, even if it doesn’t work and most corporations refuse to create jobs anyway, we haven’t burdened the rest of the taxpaying public unnecessarily.
Back in 1990, Clayton Williams, Jr. was in the news a lot as he ran for governor in Texas. His campaign famously imploded over insensitive good-ol’-boy comments made to a weekend gathering of media. He was rich, but there didn’t seem to be much else to recommend him. I always thought it odd to go by the “Clayton Williams, Jr. Alumni Center” at the TAMU campus.
But it appears I need to seriously revise my assessment of Williams. The Austin American-Statesman reports that Williams had some extraordinarily good advice for Texas Governor Rick Perry (which Perry obviously and promptly ignored):
Williams, a wealthy Midland oil man, wrote to Perry as the State Board of Education was starting the debate over new science curriculum standards. He warned Perry to stop any effort by the board to include creationism or intelligent design in those standards.
“If Texas enters into a debate on the teaching of fundamental religious beliefs in public schools, it will tarnish our strong academic reputation, set our ability to attract top science and engineering talent to Texas back decades and severely impact our reputation as a national and global leader in energy, space, medicine and other high tech fields,” Williams wrote.
He continued: “Governor, this is a very important issue for Texas. I urge you to quell this issue quietly, firmly and permanently.”