101 Fabulous Freebies

My feature story for PC World’s May issue is online, and it’s a doozy. This story has capsule reviews of 101 completely free programs and web-based services. I worked my tail off on this one, and I’m proud of the result: there’s a lot of great stuff in there. Many props to my editors at PC World, Liane Cassavoy and Laura Blackwell. Check it out!

There’s never been a better time to be a cheapskate. Free utilities? We’ve got ’em. Want a full-fledged image editor? A few gigabytes of mail storage? How about an entire office software suite? We can top that, easy. Take the whole earth and solar system. Free!

If you thought that the golden age of free stuff ended when the dot-com bubble burst, guess again. The past few years have seen an explosion of giveaways–both Web-based services and free software–that make the anemic home-page building apps and first-generation Web mail services of the late 1990s pale in comparison.

PCWorld.com – 101 Fabulous Freebies

UPDATE 3/30: Benjamin Pimentel at SFGate (the Chronicle’s site) blogged this story.


Wikipedia study: Cooked?

Andrew Orlowski dishes more dirt on the Nature comparison of Wikipedia vs. Britannica: Nature mag cooked Wikipedia study | The Register

“Nature sent only misleading fragments of some Britannica articles to the reviewers, sent extracts of the children’s version and Britannica’s “book of the year” to others, and in one case, simply stitched together bits from different articles and inserted its own material, passing it off as a single Britannica entry.”

Choice turn of phrase: former Britannica editor Robert McHenry calls Wikipedia the “faith-based encyclopedia.”


Wikipedia’s reliability?

This morning, the WSJ ran a story on the Encyclopedia Britannica’s efforts to defend itself against assertions that it’s no more accurate than Wikipedia: WSJ.com – In a War of Words, Famed Encyclopedia Defends Its Turf

Here’s the Nature study on Wikipedia vs Encyclopedia Britannica that kicked the whole thing off back in December.

A month ago I had discussed the topic with my dad, a cognitive psychologist who has spent a lot of time studying the scientist Michael Faraday. He was pretty unimpressed with Wikipedia, and did a comparison of its entry on Michael Faraday with Britannica’s. Here are Dad’s comments:

Yeah, I’d known about the “Nature” study, but it’s really pretty poor, IMHO.

First of all, they used only one expert per pair of articles. Wouldn’t surprise me if most of them could come up with pretty good guesses about which was which (this would probably work in Wikipedia’s favor, but it’s still a bias). Even so, one per pair is too few.

Second, that’s a pretty puny sample size of articles. At a maximum of 39 cents per letter (× 50 = $19.50), assuming they used snail mail, plus some xeroxing, they could have done better for very little effort!

I redid my own short test (one I’d done a while ago) by looking up the articles on Michael Faraday in Wikipedia and in Britannica Online. The levels are very different, of course; the EB one is much longer and has some very abstract stuff in it. Even so, the comparison is startling.

Wikipedia’s is riddled with errors, apparently most of them based on a recent “fictionalized” biography of Faraday that simply invented stuff about his motivations and those of other people, events that “must” have happened, conversations that “probably” occurred, and so on. Of course the qualifiers all disappeared in the article, and I now see, for example, Faraday credited with having invented the Bunsen burner as his major achievement. What nonsense! Bunsen’s device was indeed based on Faraday’s research on flames, but he no more invented the burner than he invented the “Faraday Flashlights” now being sold all over eBay.

There are other mistakes even more serious, but I won’t bore you. The mistaken stuff about electricity, magnetism, and chemistry is frightening, never mind the hash made of the history. The links to other sources are just as bad — poorly chosen crap, for the most part.

The EB article was written by Pearce Williams, a major guy in the field. It was written some 15 years ago, so it errs on some small scholarly points, since resolved differently, but it has a masterful account of the science, and of Faraday’s life. The only big “error” is the attribution to Faraday of a strong influence by Boscovich. Again, not to bore you, but this was a hobby horse ridden by Williams for years, one that most scholars (me included) now think was overdone or flat wrong. But ’twas a heated controversy for a while, so one can perhaps forgive in retrospect. A major error, if that’s what it is, only to nitpicky historians. If I’d read this “blind”, I’d have guessed Williams was the author, I’m sure.

The EB article has surprisingly full references (but to heavy scholarly stuff, print media only) and links to good serious articles in the EB covering the physics and chemistry. Here again, a different level altogether — you wouldn’t send the average high school student to the EB for this. But if sent to Wikipedia, they’d of course be able to fully understand all the horseshit given there.

On balance, EB, hands down! You get what you pay for in this case!

I asked Dad why he didn’t just do the public-minded thing and spend some time correcting Wikipedia’s egregious errors. From his response:

The EB article on Faraday was much longer than the one in Wikipedia, a factor of three, maybe. So that means there was 3 x the opportunity for an error in the EB article. If the sample of 50 entries in the Nature study was similar (i.e., longer EB articles), then the number of errors would really be skewed in Wikipedia’s favor. The way around this problem would be to compute the # errors per 1000 words, or some such. But then they would have had to be thinking like a scientist!

… Been mulling your point about correcting the Faraday article in W. Problem is, I’d have to write a whole new article! It’s not just a matter of changing factoids. Having just spent a month writing a bio of a psychologist for the “Dictionary of Scientific Biography” (for which Scribner’s is paying me! Hey, I’m a free lancer!), I know how much work would be involved to do it right.

I’ll keep mulling, but there is also the problem of whether those of us who do have some expertise should enhance an endeavor that is perpetuating misinformation, partial information, whatever. Thus, I looked at the Wikipedia article on Jonathan Edwards. Seems mostly o.k. factually (one minor error), but says nothing at all about his philosophical contributions, nor even the fact that he made any. His “great work on the Will” is mentioned, but no clue what it’s about, other than the book’s title. You do get links to lots of theological figures he was associated with (but not his most important predecessors), but the article is really about Edwards as a pastor. As with Faraday, the EB article is about 3 x longer, covers the topic very well, including the philosophical issues, as well as the theological. It’s a signed article by a real person (I don’t recognize the name, but he knows his stuff).
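Dad’s normalization point is worth making concrete. A raw error count penalizes the longer article; dividing by length removes that bias. Here’s a quick back-of-the-envelope sketch in Python (the error counts and word counts below are hypothetical, chosen only to mirror the roughly 3× length difference he describes):

```python
# Back-of-the-envelope: raw error counts vs. errors per 1,000 words.
# All numbers are hypothetical, for illustration only.

def errors_per_1000_words(errors, words):
    """Normalize an error count by article length."""
    return errors / words * 1000

# Suppose both articles contain 4 errors, but the Britannica article
# is three times longer, as with the Faraday entries.
wikipedia = errors_per_1000_words(errors=4, words=1000)   # 4.0
britannica = errors_per_1000_words(errors=4, words=3000)  # ~1.33

print(wikipedia, britannica)
```

On raw counts the two articles look identical; per 1,000 words, the shorter one is three times as error-prone, which is exactly the skew Dad suggests the Nature study may have overlooked.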


Selfish genes and soft heads.

Ten years ago I wrapped up my studies for an MA in religious studies, and walked out of the academic world. I had spent the previous two years of graduate study (plus much of my undergraduate career) trying to make sense of what people believe, how they organize the world around themselves, and how we can make sense of what others believe. My ultimate motivation, I suppose, was to understand other human beings better.

I can’t say I succeeded very well at that, but I did learn a lot about the ritualistic and textual manifestations of many religions. I learned even more about the methodologies and rituals of the field of religious studies, a domain that is itself as riddled with superstition and confusion as the religions it purports to study. In the end I argued myself out of the field, convinced that what we call “religion” in the U.S. academy is inevitably a reflection of Christian conceptions of the sacred vs. the secular. Try to go beyond those preconceptions, and the field evaporates.

Beyond that were the often ridiculous feats of mental gymnastics one had to attempt in order to make sense of the field. I remember one seminar in particular, where a more advanced graduate student was presenting some of the results of his research into the ritualistic bowing found in Chinese monasteries. Among the attendees of the seminar was a professor of anthropology I studied briefly with. After the student made his exposition of the intricate significations embodied by the bows, the anthropology prof started asking if there might be parallels with animal gestures of submission. The discussion quickly focused in on that point, with the religious studies / critical theory contingent attacking the anthropologist for drawing untoward animalian parallels, I suppose. Eventually, exasperated, she said that there must be some kind of relevance — if you believe in evolution, anyway.

She might as well have mentioned the name of Satan. Apparently “evolution” was a code word for something terribly wrong, not to be mentioned within the sphere of humanistic studies. I suspect this stemmed from a reaction against social Darwinism, but I’m not sure of the exact motivation. In any case, it was clear she had shocked the other participants, and her comment virtually shut down the discussion: they were unwilling to let any mention of human evolution enter into a conversation about humans being submissive to other humans.

It was then, I think, that I finally realized I was surrounded by people who were so committed to their theoretical frameworks that they were unable to appreciate, unable even to see, the most basic fact of human life. They were science-blind.

How far things have come in the last ten years. Now the evolutionists are aggressively on the march, fighting creationists in the schools and making strong arguments against religion of all kinds in books such as Sam Harris’s The End of Faith. This dialogue, with biologist Richard Dawkins and philosopher Daniel Dennett, is an excellent example of such new boldness among scientists and the science-minded. Or, as Dawkins has elsewhere suggested we call ourselves, “brights.”

Edge 178: The Selfish Gene: 30 Years On

Speakers: Daniel C Dennett (Tufts), Sir John Krebs, FRS (Zoology, Oxford), Matt Ridley, Ian McEwan, Richard Dawkins, FRS (Oxford)


Creative Commons and photography rights.

Scot Hacker posts a picture he took at SXSW of a dude with a Creative Commons logo shaved into the back of his head. Scot asked the guy for permission to take the picture, and the guy said sure, just so long as the photo was published under a CC license.

It was polite of Scot to comply with the guy’s request (and when I was taking photos for Wired News at ETech I always asked permission of the subjects, again out of politeness), but this is one case where Creative Commons is actually more restrictive than the general law. In fact, you can photograph anyone you want without asking permission, provided they’re in a public place. You can also photograph any building you want, provided you are standing in a public place. Then you own the copyright to your photo. Here’s a handy one-page guide explaining photographers’ rights (PDF).

“Basically, anyone can be photographed without their consent except when they have secluded themselves in places where they have a reasonable expectation of privacy such as dressing rooms, restrooms, medical facilities, and inside their homes.”

This was upheld in a recent case, incidentally. An art photographer was taking random photos of people on the street in NYC. One man was upset to find his photo on display in a gallery, for sale, and sued the artist, claiming the photographer had no right to profit from his likeness. The judge dismissed the suit last month.
