Magic and technology.

Sci-fi writer David Brin looks at J.R.R. Tolkien’s Lord of the Rings through the lens of Romanticism vs. the Enlightenment, nostalgia vs. optimism. And he finds Tolkien’s vision, not surprisingly, to be a thoroughly retrograde one. Why is this world, full of strict class and race divisions, kings and serfs, wizards and eternal orders, so appealing? He attributes it to a kind of Romantic backlash, a nostalgia for the order of the past that’s been disrupted by this world’s true rebels: scientists, democrats, technologists.

Brin writes: “Millions of people who live in a time of genuine miracles — in which the great-grandchildren of illiterate peasants may routinely fly through the sky, roam the Internet, view far-off worlds and elect their own leaders — slip into delighted wonder at the notion of a wizard hitchhiking a ride from an eagle. Many even find themselves yearning for a society of towering lords and loyal, kowtowing vassals.” Curious, no?

Later, he invokes a thought experiment: “Ask yourself: ‘How would Sauron have described the situation?’ … Now ponder something that comes through even the party-line demonization of a crushed enemy — this clear-cut and undeniable fact: Sauron’s army was the one that included every species and race on Middle Earth, including all the despised colors of humanity, and all the lower classes.”

And he concludes: “You are heirs of the world’s first true civilization, arising out of the first true revolution. Take some pride in it. Let’s keep enjoying kings and wizards. But also remember to keep them where they belong. Where they can do little harm. Where they entertain us. In fantasies.”

Meanwhile, usability wizard Jakob Nielsen points out that in the future, we really will be surrounded by Harry Potter-like magic. “Much of the Harry Potter books’ charm comes from the quirky magic objects that surround Harry and his friends. Rather than being solid and static, these objects embody initiative and activity. This is precisely the shift we’ll experience as computational power moves beyond the desktop into everyday objects. ” He concludes with an exhortation to Web wizards: Don’t harm the Muggles — the ordinary Web users for whom advanced technology really is magical. In other words, keep it simple, keep it usable, and meet their expectations.

Magic and technology.

Library lookup.

I’m a jaded tech journalist. It’s not often that a new Internet service can actually make me excited. But last week I discovered one that had me grinning all afternoon.

Jon Udell’s LibraryLookup is an unassuming-looking page: It lists about 900 libraries, all of which use an online catalog system from Innovative Interfaces. Each library in this list is presented as a link. When you drag one of those links to your links toolbar or favorites list, you create a “bookmarklet” (a small piece of JavaScript code, stored as a bookmark). Next time you’re at Amazon.com, looking at a book, all you have to do is click on this bookmarklet to find out if that book is available at your local public library.

Cool, huh? LibraryLookup, more than anything I’ve seen online yet, makes me long to learn how to write JavaScript and to figure out regexp syntax more fully. This is just so much more powerful and interesting than mouse-over graphics.

Udell works at InfoWorld, where I used to be a columnist, and he has a fine technical weblog. The story of how LibraryLookup came to be is described there, and it illustrates how weblogs, web services, and clever scripters can combine to create — almost spontaneously — brand-new web applications.

What’s interesting to me about this whole story is that Jon doesn’t limit “web services” to applications that make themselves accessible via SOAP or other explicit Web services protocols. In Jon’s world, any kind of programmable interface to Web information qualifies as a Web service.

So when he finds a page on Innovative’s site that lets you look up any of their customers, he creates a script to extract their entire customer database. Likewise, Innovative-powered libraries weren’t exactly publishing SOAP interfaces or XML APIs for the world to address. But their systems let you query the catalogs using nothing more than a URL. So Jon writes a script that extracts a book’s ISBN from its Amazon.com URL, plugs that ISBN into the query URL for a particular library, and opens it up in your browser. Voilà: LibraryLookup.
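The whole trick can be sketched in a few lines of JavaScript. This is an illustration of the technique, not Udell’s actual code: the catalog host and query path below are placeholders, and the pattern simply grabs the first ten-digit ISBN-like run it finds in the page’s URL.

```javascript
// Illustrative sketch of the LibraryLookup idea (not Udell's actual code).
// Amazon book URLs of that era carried the 10-digit ISBN as a path
// segment, e.g. http://www.amazon.com/exec/obidos/ASIN/0618002235/

// Pull the first thing that looks like a 10-digit ISBN out of a URL
// (the last character of an ISBN-10 may be an "X" check digit).
function extractIsbn(url) {
  var match = url.match(/(\d{9}[\dX])/i);
  return match ? match[1] : null;
}

// Build a catalog query URL. Innovative catalogs could be searched with a
// plain GET URL; this host and path are placeholders, not a real library's.
function libraryQueryUrl(isbn) {
  return "http://catalog.example.org/search/i?SEARCH=" + isbn;
}

// As a bookmarklet, the same logic collapses into a single javascript: URL
// saved as a bookmark, acting on whatever page you're currently viewing:
//
// javascript:var m=location.href.match(/(\d{9}[\dX])/i);if(m)location.href="http://catalog.example.org/search/i?SEARCH="+m[1];
```

Saved as a javascript: bookmark, clicking it while viewing an Amazon book page would send you straight to that (hypothetical) catalog’s results for the book’s ISBN.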

Innovative quickly took their customer lookup page down, mere days after Jon’s app debuted. Perhaps they were a little freaked out when they saw what they’d unwittingly enabled. They may have good reasons for wanting to protect their customer list — but on the other hand, Jon’s application adds tremendous value to the Innovative system, and potentially can drive a lot of traffic (and customers) to the libraries using Innovative catalogs.

The evolution of LibraryLookup has been a joy to watch. It’s a nifty, useful tool. And most importantly, it shows how clever coders, working with publicly available information services and simple scripts, can create ad-hoc Web services applications seemingly out of thin air.

Notes:

Jon also has a version of LibraryLookup for libraries that use the Voyager online catalog.

Here are Udell’s latest updates on the project, including a version that works with Powells.com URLs, and one that looks up books on O’Reilly’s electronic-book network, Safari.

Library lookup.

Less is Moore.


As I note in the piece itself, this week’s column is the last one I’ll write for Business 2.0. I’ve spent more than two years as that magazine’s “Defogger” columnist, penning over 80 articles in that time. Now it’s time for me to devote more attention to other projects. This newsletter’s sporadic dispatches will continue, of course. And I’ll say more about what I’m working on early next year, right here.

Less is Moore.

Pulling up stakes.

I’m relocating this weblog: As of today, I’m promoting it to the home page of my web site, http://dylan.tweney.com. Archives, RSS feeds, and everything will be moving. Please update your links — and if you have questions or comments, please let me know.

Why am I making the move? It’s time to do some technology consolidation here on tweney.com. For the past year or so, I’ve been using PHP to generate the home page. I had built an articles database using PHP and MySQL to keep track of the articles I’ve published in various magazines and web sites. To generate the home page and associated RSS feeds, I used a hacked-together PHP script that pulled the most recent articles from this database and put them together with the right formatting, text, and so forth. The whole system worked decently enough. In fact, my homegrown database still works better, for my purposes, than anything I’ve found available on the Net.

However, times change, and so do syndication formats. RSS 1.0 came out, and then RSS 2.0. My feeds are still built upon RSS 0.91. I suppose I could go to the trouble of figuring out the new syndication formats, but I really don’t want to. It’s much easier to use Movable Type, which has been powering my weblog for quite some time, and which automatically generates both RSS 1.0 and 2.0. If I’m going to use MT for feeds, why not use it for the home page, too?

Also, I update my weblog more frequently than the articles database, and I believe a site’s home page should always display the freshest content. So here I go: putting my best face forward.

Pulling up stakes.

Less is Moore.

I was watching “A Beautiful Mind” recently and was struck by how much the mathematician John Nash’s schizophrenia, as portrayed in the movie, was like my online life: Ethereal voices constantly impinging on my attention, demanding responses, distracting me from the work (and people) at hand. Only in my case it’s email messages, not hallucinations.

I don’t use instant-messaging software as a rule, and one reason I don’t is that it only exacerbates this “interruptive” condition of online life. Email itself is distracting enough that I’ve had to take serious measures to control its impact (filters, schedules, spam guards, and more). Yet I can’t rule it out entirely. As with many people, email is the way I work. It’s the primary way I communicate with the people I work with and those I’m interviewing. It’s one of the major channels through which I learn about new ideas and technologies. I can’t just turn it off any more than I can quit my job and go spend the winter meditating in a mountain retreat.

Then there was Nash’s office: During the height of his illness, he had plastered the walls with hundreds of pages torn from various magazines, highlighting random words and letters here and there, drawing lines from one thing to another. The whole space was a vast, tangled map of mental connections, embodied in paper and ink and string. Holy crap, I thought: That’s a weblog in physical form!

Times sure change. In the 1950s, making such obsessive connections between scraps of publicly available media was a sure sign of insanity. Now it’s practically a requirement. The experts agree: Maintaining a weblog is a good way to promote yourself and your business. [1]

Naturally, these experts are all famous webloggers. As for how the rest of the hoi polloi are justifying the hours they spend on their digital diaries when only a handful of people will ever read them — well, the jury is still out on that one.

(Actually, Chris Gulker has analyzed the most popular weblogs and has found that being famous already will help your blog be popular, but having a blog won’t necessarily help you become famous. [2] So much for the self-promotional value of weblogging.)

Indeed, as technology becomes more pervasive, there’s an unfortunate downside. Unless you learn to master the technology around you, it can easily take over your daily life — without you being particularly aware that this is happening. Email is distracting. The infinite interconnectedness of the Web means there’s always something more to learn — you’re never quite finished with anything because there’s always one more link to investigate, one more fact to incorporate.

And, as InfoWorld columnist Ephraim Schwartz wrote last month, mobile technology isn’t exactly helping, either. Rather, those Web-enabled mobile phones and Blackberries let us keep working far beyond the hours and places where work used to be confined. [3]

Think back to what your world was like before you got Web access and Internet email. For me, it’s hard even to imagine what things were like — that’s how much I’ve come to rely on these technologies, how much they’ve transformed our world. By and large, this is a good thing. But I think there’s a growing gap between personal technologies and our ability to manage them effectively.

My column for Business 2.0 this week discusses the obsolescence of Moore’s Law, and points out that processing power isn’t really a driving concern in the technology world nowadays (outside of the boardrooms of Intel and AMD). [4]

Instead, the big problems facing IT departments — and, increasingly, ordinary individuals — have to do with information management. How do you store, maintain, organize, and make accessible large amounts of data?

I think we’re fast approaching the point where our tools’ ability to bring us information exceeds our ability to manage and make use of that information. At this point, what we need is less information, not more: We need filters, categories, classifications. Editors. Friends. Good work habits and business processes. All of these things we need in order to take control of the information that surrounds us — and to take control of our online lives.

In other words, we need to control technology before it controls us.

[ LINKS ]

[1] Doc Searls on the evolution and benefits of his weblog

[2] Chris Gulker analyzes the most popular weblogs

[3] Ephraim Schwartz on wireless work

[4] Dylan Tweney on the meaning and end of Moore’s Law

Link: Less is Moore.

Link broken? Try the Wayback Machine.

Less is Moore.

Does Moore’s Law still hold true?

You don’t have to be a software programmer to be familiar with the principle. Since the early 1970s, Moore’s Law — named after Gordon Moore, one of the founders of Intel — has been universally touted within the computing industry. The law has many variants, but the gist of it is this: Computing power will increase exponentially, doubling every 18 to 24 months, for the foreseeable future.

Too bad it isn’t true. According to Ilkka Tuomi, a visiting scholar at the European Commission’s Joint Research Centre in Seville, Spain, not only is Moore’s Law losing significance, but it never fit the data very well in the first place. In an academic paper published last month, Tuomi dissects the many variants of Moore’s Law and shows that, in fact, none of them match up well with actual advances in chip technology. (See Tuomi’s paper for more.) For example, processor power has increased dramatically since 1965, when Moore first proposed his law, but at a slower rate than expected, doubling about every three years instead of every two. That’s equivalent to a ninefold increase in processing power per decade, compared with a 32-fold increase per decade with a two-year doubling period — a big difference.
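The two growth rates diverge fast, and the arithmetic is easy to check. A quick sketch (the “ninefold” figure presumably counts only the three complete doublings that fit in a decade; the exact exponent gives a bit over tenfold):

```javascript
// Capacity multiplier over a decade for a given doubling period:
// 2 raised to the power of (10 years / doubling period in years).
function decadeMultiplier(doublingPeriodYears) {
  return Math.pow(2, 10 / doublingPeriodYears);
}

console.log(decadeMultiplier(2)); // 32   (five doublings per decade)
console.log(decadeMultiplier(3)); // ~10.1 (just over three doublings)
```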

What’s more, it’s hard to translate processor power into increased computing power, because there are so many other factors involved in computer performance. As anyone who has been forced to buy a faster, more powerful computer in order to run the latest version of Windows knows, today’s operating systems are memory and processing hogs. You probably aren’t much more productive on a top-of-the-line 2-gigahertz Pentium 4 desktop running Windows XP today than you were with a 300-megahertz Pentium II running Windows 95 five years ago. The sad fact is that the hardware upgrades of the past decade have been driven more by Microsoft operating system demands than by consumers’ demands for more power. As the old saying goes, Andy Grove giveth, and Bill Gates taketh away.

If Tuomi’s right (and I find his argument persuasive), why should we care? First, Moore’s Law gives the false impression that progress in the semiconductor industry is unlimited and unconstrained by the laws of supply and demand. Unfortunately, that just ain’t so. In reality, the cost of chip factories increases exponentially with each new generation of processors (a trend known as Moore’s Second Law). For example, Intel is spending $2 billion on its latest chip fabrication site in Kildare, Ireland. That’s a very big bet that continued demand for more processing power will eventually sell enough chips to pay for the plant. Take away the demand and you’ve got an economic crisis in the semiconductor industry.

More important, Tuomi’s analysis shows that processor power alone is only part of the business of technology — and an increasingly small one at that. Look at any company’s IT infrastructure today and you’ll see that processor power is not a significant issue. There’s more than enough power available (unless you’re one of the workers unlucky enough to be saddled with a four-year-old desktop trying to run Lotus Notes R5 or Windows XP). The biggest corporate technology problems now have to do with storing, managing, organizing, retrieving, and guarding increasingly huge amounts of data.

That’s why the hottest areas for enterprise IT are in segments like storage, knowledge management, customer relationship management, business intelligence, and data mining. These systems are all about handling large amounts of information — and making it useful. Significantly, such systems often require that you spend more time reworking business processes and training employees than you devote to installing the technology itself.

“Sometimes we perhaps invest disproportionately in technology, believing that technology, as such, solves our problems,” Tuomi says. “We often underestimate efforts and investments needed for organizational change and new work practices, for example.”

The challenge now is not finding new and more powerful technologies to serve our needs — it’s organizing our companies and our work lives so that we can use those technologies more effectively. We can no longer trust in the magic wand of Moore’s Law to solve our computing problems for us. Instead, we must learn how to use the tools we already have.

This will be my last Defogger column for Business 2.0. I’ve written more than 80 of these columns since July 2000, and I hope that during that time I’ve helped you to understand and make smarter decisions about technology and its strategic uses in business. Now it’s time for me to move on. If you want to find out what I’m working on in the coming months, please sign up for my personal newsletter at http://dylan.tweney.com. So long, and thanks for all the e-mail!

Link: Does Moore’s Law still hold true?

Link broken? Try the Wayback Machine.

Does Moore’s Law still hold true?

Two by Sterling.

Two early ’90s talks by Bruce Sterling:

Free as Air, Free As Water, Free As Knowledge: ‘What’s information really about? It seems to me there’s something direly wrong with the “Information Economy.” It’s not about data, it’s about attention. In a few years you may be able to carry the Library of Congress around in your hip pocket. So? You’re never gonna read the Library of Congress. You’ll die long before you access one tenth of one percent of it. What’s important — increasingly important — is the process by which you figure out what to look at. This is the beginning of the real and true economics of information. Not who owns the books, who prints the books, who has the holdings. The crux here is access, not holdings. And not even access itself, but the signposts that tell you what to access — what to pay attention to. In the Information Economy everything is plentiful — except attention.’ (via BoingBoing)

The Wonderful Power of Storytelling: ‘We’re not into science fiction because it’s *good literature,* we’re into it because it’s *weird*. Follow your weird, ladies and gentlemen. Forget trying to pass for normal. Follow your geekdom. Embrace your nerditude. In the immortal words of Lafcadio Hearn, a geek of incredible obscurity whose work is still in print after a hundred years, “woo the muse of the odd.”’

“Don’t become a well-rounded person. Well-rounded people are smooth and dull. Become a thoroughly spiky person. Grow spikes from every angle. Stick in their throats like a pufferfish. If you want to woo the muse of the odd, don’t read Shakespeare. Read Webster’s revenge plays. Don’t read Homer and Aristotle. Read Herodotus where he’s off talking about Egyptian women having public sex with goats. If you want to read about myth don’t read Joseph Campbell, read about convulsive religion, read about voodoo and the Millerites and the Münster Anabaptists. There are hundreds of years of extremities, there are vast legacies of mutants. There have always been geeks. There will always be geeks. Become the apotheosis of geek. Learn who your spiritual ancestors were. You didn’t come here from nowhere. There are reasons why you’re here. Learn those reasons. Learn about the stuff that was buried because it was too experimental or embarrassing or inexplicable or uncomfortable or dangerous.”

Two by Sterling.

What happened to the New Journalism?

Where are the really good stories in magazines today? The ones that make you say to your friends: “Did you see that story about X?” Michael Shapiro examines the attenuated legacy of the New Journalism: “I had learned the essential lesson of those who wished to make magazines a career: write to form, or go to law school. So, I wrote to form. … Technique slid slowly, maddeningly, and seemingly inevitably into The Form: anecdote; set-up graph; scene, digression, scene, quote from Harvard sociologist.” (via bookslut)

What happened to the New Journalism?

Reality vs. Moore’s Law.

Game, set, and match.

[1] The Inquirer: Moore’s Law meets market saturation. “Today, accountants to video heads have enough horsepower at their fingertips to keep themselves sated. Why shell out $1500 in next year’s technology for a measly 3-5% performance bump on the most bleeding-edge applications?”

[2] Brewster Kahle on storage requirements at the Internet Archive: “It costs $40,000 a month just to buy new storage. Next year it will cost half that for the same amount of storage but by then there will be twice as much to record, or more.”

[3] Ilkka Tuomi: “Contrary to popular claims, it appears that the common versions of Moore’s Law have not been valid during the last decades. As semiconductors are becoming important in economy and society, Moore’s Law is now becoming an increasingly misleading predictor of future developments.”

Reality vs. Moore’s Law.

Retro Dickens.

Dickens’ novel Great Expectations was published serially starting in December 1860 — a new episode appearing each week, just like The West Wing. Now, in December 2002, you can read Dickens’ Great Expectations as Victorians did — except you’ll be viewing PDF facsimiles that look like the original newsprint broadsides. A new edition comes out every Wednesday. It’s retro tech! Read it or subscribe on Stanford’s Dickens site.

Retro Dickens.