Trends & Quick Takes
Myths and Misses
The April 12, 2010 Fortune has an intriguing feature, “25 Green Myths Debunked.” The claim is that these pages “clear up 25 common misconceptions about the food we eat, the products we buy, the way we travel, and the energy we use.”
Maybe so, maybe not. Some are clear myths—although, in some cases, they’re ones you’d expect most well-informed people (presumably Fortune’s readership) to know as myths by now, such as “1. Bottled water is safer than tap water” and “14. It doesn’t pay to turn down your thermostat when you’re not home.” Some are interesting, e.g., “8. It’s better to buy an artificial Christmas tree than cut down an evergreen every year” and “23. Car air conditioning wastes energy.” (Remember, these are all supposed to be myths.)
But a few too many are not myths and the “realities” are not refutations. For example: “4. Cars are one of the biggest emitters of greenhouse gas.” The counter is “Yes, but those hamburgers you like to gobble down are actually much worse.” Once you say “Yes, but,” the game’s over: If it’s true, it’s not a myth. Ditto “11. I’ll save energy if I keep my appliances turned off” (sure, there’s parasitic usage, but this is still a true statement). Others fall somewhere in between, boiling down to “it depends” or “all else being equal, which it frequently isn’t.” I think you could have a solid dozen cases where these would be learning experiences for most people—and, frankly, I think stretching that dozen out to 25 weakens the dozen best.
Farhad Manjoo wrote “The Poor Man’s Mac” on April 2, 2009 at Slate (this is probably a good place to note that Microsoft doesn’t own Slate anymore—the Washington Post Company purchased it back in 2004). Manjoo takes issue with the ad campaign Microsoft ran briefly in 2009—the one where people were allotted a certain amount of money with which to buy a computer that would satisfy their needs.
Manjoo says it’s a terrible marketing strategy because it makes Windows something you settle for, and that once we’re out of the slump, people will happily pay for Apple’s superiority. (He also says Apple dominated the notebook market in 2008, which says something about Manjoo’s objectivity or awareness. Some 146 million notebooks and netbooks were sold in 2008. Apple’s own annual report for 2008 shows six million notebooks sold. Admittedly, Apple’s operating year isn’t a calendar year, but it beggars belief to suggest that Apple went from being the 7th largest notebook manufacturer to not only being first but having a majority of all sales within six months. That would require that Apple sold more than 65 million notebooks in the second half of 2008. That did not happen, I can say without much fear of being wrong.)
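That back-of-the-envelope claim can be checked with a few lines of arithmetic. This is a rough sanity check in Python using only the figures cited above; it deliberately ignores the fiscal-year/calendar-year mismatch the paragraph mentions:

```python
# Sanity check: could Apple have held a majority of 2008 notebook sales?
# Figures are the ones cited above: roughly 146 million notebooks and
# netbooks sold industry-wide in 2008, and roughly 6 million Apple
# notebooks in Apple's fiscal 2008.
total_2008 = 146_000_000
apple_fy2008 = 6_000_000

majority = total_2008 // 2            # 73 million units needed for a majority
shortfall = majority - apple_fy2008   # 67 million: what Apple would still need

print(f"Majority of 2008 sales: {majority / 1e6:.0f} million units")
print(f"Gap beyond Apple's reported total: {shortfall / 1e6:.0f} million units")
```

That gap is consistent with the “more than 65 million notebooks” figure above, allowing for the fiscal-year offset.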
Then Manjoo tells us what “People want” from computers—and it’s clear he believes these are things Apple offers and Windows notebooks don’t. (“Look awesome.” “Environmentally responsible.” “Easy to fix.”)
The ads themselves? Mac fans say PC buyers will regret buying a “cheapo” $700 Windows notebook because it’s “terribly slow,” “weighs a ton,” etc., etc. Well, maybe—the machine chosen by one of the ad participants is a heavyweight and uses an AMD chip.
But…I’m writing this using as my “desktop” a Gateway notebook. I purchased it in early 2008, a year before these ads were running. I paid $600 or $700—let’s say $700. It weighs about six pounds (but I’m not really using it as a portable—if I was, I’d add another $300 and buy a two-pound netbook to use on the go). It has—and remember, it’s already 2.5 years old—a 1.6GHz Intel Core 2 Duo CPU, 3GB RAM, 250GB hard disk, a 15” screen, a DVD burner, and draft-N wireless. It’s more than fast enough for anything I want to do with it. Oh, and it’s snazzy-looking, with a dark red case. I’m pretty sure I would have paid at least 50% more for a Mac notebook with similar specs for the things I care about.
If you find Apple notebooks and desktops to have good value for you, more power to you. You’re probably right. But when you tell me I’m just cheaping out and must be regretting “settling” for something less—or when a supposed journalist does the same—well, sorry, but you’re not only wrong, you’re offensively wrong. Whether the year is 2009 or 2010.
50 Years of Stupid Grammar Advice
That’s the title of a Chronicle of Higher Education piece by Geoffrey K. Pullum dated April 17, 2009 (chronicle.com/article/50-Years-of-Stupid-Grammar/25497). The day before (April 16) was the 50th anniversary of Strunk & White’s Elements of Style—and Pullum “won’t be celebrating.”
The Elements of Style does not deserve the enormous esteem in which it is held by American college graduates. Its advice ranges from limp platitudes to inconsistent nonsense. Its enormous influence has not improved American students’ grasp of English grammar; it has significantly degraded it.
I hadn’t realized “Strunk & White” is really E.B. White’s expansion and revision of Strunk’s self-published earlier work (required for Strunk’s English class at Cornell—that’s the way to self-publish something!), done after Strunk’s death. Pullum calls Strunk and White “grammatical incompetents.”
Pullum doesn’t necessarily object to the style advice, calling it “mostly harmless” if frequently vapid. (Vapid? “Be clear” and “Do not explain too much”—hmm. As Pullum says, “Omit needless words” is useless because writers who know which words are useless don’t need to be told.) He objects to the advice on grammar itself:
It is atrocious. Since today it provides just about all of the grammar instruction most Americans ever get, that is something of a tragedy. Following the platitudinous style recommendations of Elements would make your writing better if you knew how to follow them, but that is not true of the grammar stipulations.
He objects to the general advice to use the active voice (as a section heading: the actual discussion is more moderate) and claims that three of four pairs of examples showing how to avoid passive voice are misdiagnoses: For example, “There were a great number of dead leaves lying on the ground” isn’t passive voice.
There’s a lot more. He notes “the book’s contempt for its own grammatical dictates”—and thinks that it’s not so much willful as ignorant.
There is of course nothing wrong with writing passives and negatives and adjectives and adverbs. I’m not nitpicking the authors’ writing style. White, in particular, often wrote beautifully, and his old professor would have been proud of him. What’s wrong is that the grammatical advice proffered in Elements is so misplaced and inaccurate that counterexamples often show up in the authors’ own prose on the very same page.
Then there are the bogeymen Strunk & White helped maintain—for example, the advice to avoid split infinitives (which have “always been grammatical”) and the idea that you shouldn’t start a sentence with “however” used as a connective adverb. (Poor old Mark Twain: He used that construction much more often than the Preferred Alternative—but then, Clemens could barely write at all. Right?) There’s even one I take some pains to obey: the use of “which” and “that.” According to Pullum, “There was never a period in the history of English when ‘which’ at the beginning of a restrictive relative clause was an error.” (In fact, Strunk used it that way—apparently White added the new rule.)
Interesting stuff. I read Strunk & White back in the day—of course I did. I suspect I can’t get away from some of the “rules” that may not make any sense at all, even as I’ve obviously abandoned some of the style advice. The 2,500-word article is a fun read and may be worthwhile if you take Elements of Style at face value.
I’ll probably do a Zeitgeist or Perspective on Twitter one of these days—but today isn’t that day. Instead, this is a little note on Geoff Manaugh’s “How the Other Half Writes: In Defense of Twitter,” posted April 22, 2009 at BLDGBLOG. It’s a 1,300-word post accompanied by 115 comments. I think he’s both right and wrong—right to “defend” Twitter against some of the silly things said about it in 2009, wrong in his evaluation of what Twitter is and even wronger in his final sentence: “Get over it.” Always a terrible way to end a thinkpiece, as it comes down to “I’m right, you’re wrong, end of discussion.”
Of course Maureen Dowd was silly to attack Twitter; of course Manaugh’s friend who said “Twitter is the death of humanism” was being absurd (and apparently a bit drunk). Of course you need to distinguish between Twitter itself and what (some) people write on Twitter. But…
Twitter is a note-taking technology, end of story. You take short-form notes with it, limited to 140 characters.
Buzz. Thanks for playing. Twitter is a social network service. I suppose you could have a private Twitter feed and never allow anybody to follow you, in which case Twitter would be a note-taking technology. Heck, you could have a private blog and not allow any subscriptions, in which case the blog would be a diary. But saying that blogging is a bunch of diaries is nonsense—as is saying that Twitter is (for most people or in its methodology) a note-taking technology. Manaugh is apparently an architect; how would he respond to my suggestion that most modernist skyscrapers are glass-and-steel sculptures, end of story?
Manaugh makes a comparison with ballpoint pens, if they’d been introduced into a world where all writing was done using typewriters:
People use it to write down grocery lists and street addresses and recipes and love notes. What is this awful new technology? the literary users of typewriters say. Ball-point pens are the death of humanism.
Nevermind, of course, that you can use ball-point pens to write whatever you want: a novel, a screenplay, epic poems, religious prophecy, architectural theory, ransom notes. You can draw astronomical diagrams, sketch impossible machines for your Tuesday night art class, or even work on new patent applications for a hydrogen-powered automobile—it doesn’t matter. You can draw penises on your coworker’s paycheck stub.
It’s a note-taking technology.
Well, no, it isn’t. A ballpoint pen is a writing technology that yields semi-permanent results (unlike a pencil). Twitter is not, primarily or fundamentally, a note-taking technology. Its whole design is centered on social networking, on sharing of those notes among your small (or large) circle of friends.
Kafka would have had a Twitter feed! And so would have Hemingway, and so would have Virgil, and so would have Sappho. It’s a tool for writing. Heraclitus would have had a f***ing Twitter feed.
Bull. I’m pretty sure Hemingway and Kafka would not have sent their initial ideas out 140 characters at a time. If they had Twitter accounts, they wouldn’t use them as “tools for writing”—they’d use them for social networking purposes. I could be wrong, of course, since I have exactly the same personal knowledge of Virgil, Kafka, Sappho and Heraclitus as Manaugh does, which is to say “none whatsoever.”
What Manaugh is really doing in this article is attacking elitism—he thinks Dowd and others are upset because “the other half” is writing.
Those other people—those everyday people who weren’t supposed to have thoughts, who aren’t known for reading David Foster Wallace or Dostoevsky or James Joyce, those overlooked people from whom we buy groceries, who fix our cars, clean our houses, and vote differently than we do—weren’t supposed to become writers.
That may be an objection to blogs. It could be an objection to the read/write web in general. It’s a stupid objection, but it’s not the same as some people’s nervousness about Twitter’s early emphasis on the most mundane and its 140-character limit. Somehow, though, to Manaugh it’s all about class. Here’s the simply wrong penultimate paragraph, which comes just before the horrendous three-word close (“Get over it.”) that I object to in general:
Twitter is just another option for people to use when they want to take notes—and it’s no more exciting than that, either, to be frank. It’s a ball-point pen.
Nope. Wrong. Oddly enough, one commenter starts out by high-fiving Manaugh (“Spot on as ever Geoff”) just before offering a paragraph that says something about what Twitter really is:
I remember when I first tried Twitter it seemed rather pointless. After a while and having increased the number of people I was following it finally made sense until the point where there were enough people to require filtering and it has now become indispensable to me.
In other words, “once I had a worthwhile social network on Twitter, it became indispensable.”
I don’t use Twitter currently because it doesn’t work well (directly) for me (at this point), although I surely partake in the Twittersphere through FriendFeed. I do know enough about Twitter to know that it’s not the death of humanism, it’s not evil in any way—and it’s not “a ball-point pen.”
Wanna buy a netbook cheap? That’s the lure of AT&T and Verizon specials at various places such as Radio Shack: You get a name brand netbook (Gateway, HP, Dell, Lenovo, Acer) for a lot less than you’d expect—maybe free, maybe $100, maybe $200. There’s just one catch: You also have to sign up for a two-year 3G data-only plan—at either $40 for around 200-250MB per month or $60 for an “unlimited” plan, that is, 5GB a month. The $40 one is hairy—if you use too much data, you’ll pay another $0.10 per megabyte. (Actually, $10 for 100 megabytes from AT&T. Don’t buy the $60 plan and ever go over 5GB: They’ll sock you for $0.50 per megabyte.) So how much does that netbook really cost? $1626 for a Dell Mini 10 from AT&T, $1675 for an HP Mini 110 from Verizon—basically, $68 to $70 a month. Sure, you’re saving money on the netbook—but if you don’t really need 3G (if you can use it with wifi), it’s an expensive way to save a few bucks.
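The arithmetic behind those totals is simple enough to sketch. The device prices in this example are illustrative stand-ins, not the exact subsidized prices in the deals cited above, and taxes and overage fees are ignored:

```python
# Total two-year cost of a "cheap" subsidized netbook: upfront price plus
# 24 months of the mandatory 3G data plan.
MONTHS = 24

def contract_total(netbook_price, monthly_fee):
    """Full cost over the two-year contract, in dollars."""
    return netbook_price + MONTHS * monthly_fee

# A "$200" netbook on the $60/month "unlimited" (5GB) plan:
print(contract_total(200, 60))   # 1640
# Even a free netbook on the $40/month plan isn't cheap:
print(contract_total(0, 40))     # 960
```

The quoted $1626 and $1675 totals fall in the same range; the differences presumably reflect the particular device prices and fees in each carrier’s deal.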
Interesting story in the February 2010 PC World: How the “good guys” managed to take down a botnet, Mega-D, that controlled a quarter million PCs. If your spam level went down slightly in November 2009, you can probably thank FireEye.
A Wired story by Jordan Ellenberg (March 2010) talks about a “revolutionary algorithm” that “can make something out of nothing”—going by the name compressed sensing. It’s an interesting technique, one that could allow for (for example) much faster MRI scans. As I read the story, the technique makes perfectly good sense—it’s basically applying Occam’s Razor to filling in missing pieces (that is, finding the least complex way to reconstruct what’s missing). “It turns out that of all the bazillion possible reconstructions, the simplest, or sparsest, image is almost always the right one or very close to it.” That seems not only reasonable but natural. Interesting article, probably also available online.
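As a toy illustration of that “sparsest reconstruction” idea: with fewer measurements than unknowns, you look for the solution with the fewest nonzero entries. This is not the article’s actual algorithm (real compressed sensing uses L1 minimization rather than brute-force search), and the measurement matrix and signal here are invented for the sketch:

```python
# Compressed-sensing toy: recover a sparse signal from fewer measurements
# than unknowns by searching for the solution with the fewest nonzeros.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 4                        # 8 unknowns, only 4 measurements

A = rng.standard_normal((m, n))    # random measurement matrix
x_true = np.zeros(n)
x_true[3] = 2.0                    # the sparse "signal": one nonzero entry
y = A @ x_true                     # what we actually observe

def sparsest_solution(A, y, max_size):
    """Try supports of increasing size; return the first exact fit."""
    for size in range(1, max_size + 1):
        for support in combinations(range(A.shape[1]), size):
            cols = A[:, list(support)]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            if np.allclose(cols @ coef, y):
                x = np.zeros(A.shape[1])
                x[list(support)] = coef
                return x
    return None

x_hat = sparsest_solution(A, y, max_size=3)
print(np.allclose(x_hat, x_true))  # the sparsest fit is the true signal
```

The brute-force search is exponential in the support size, which is exactly why the practical breakthrough was showing that L1 minimization usually finds the same sparsest answer efficiently.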
I’d like to like Clive Thompson’s “I’d Rather Be Texting” column in the March 2010 Wired, where he says the texting-while-driving problem needs to be reversed: We need to keep texting and stop driving. That’s fine—but the U.S. is a relatively sparsely populated, spread-out country, and that’s not going to change any time soon. He says U.S. cities and suburbs have “completely neglected their public transit.” That’s hogwash, but it’s true that public transit is generally less than satisfactory—and there’s no plausible way you’ll get the funds to make it otherwise. Meantime, do you really need to be texting every waking moment?
Sometimes the current delay in dealing with Trends & Quick Takes items is revealing—as in an April 21, 2009 item by John Battelle at Searchblog, “News: Google Lets You Put Yourself Into Results For…Yourself.” He’s touting the addition of Google Profile results to Google, noting that you can build your own Google Profile, and he seems to think this is a wonderful thing—while admitting that this is mostly a way for Google to get more people to build profiles. I’m guessing more than a few people who had Google Profiles shut them down after Google did its cute Buzz introduction; I know I did. And somehow, the idea that Google’s manipulation of self-manipulated profiles puts a “human, community-driven face” on Google is…well…odd. As far as I can tell, Google abandoned the wonderful new service—at least I don’t see profiles in name searches…although, doing a vanity search, I do get an ad from Google itself urging me to create a Google Profile.
Since I still haven’t read Chris Anderson’s Free (and am not chomping at the bit to read the output of any Wired guru), consider this a pointer to John Dupuis and a year-old post at Confessions of a Science Librarian: “Guru cage match: Gladwell vs. Anderson,” posted June 29, 2009. He discussed Malcolm Gladwell’s New Yorker review of Free: The Future of a Radical Price. He cites one specific weakness Gladwell identifies in the book. Here’s Dupuis’ comment, which I particularly love:
The weakness, of course, is more due to Anderson’s overweaning hypiness and guruhood than anything else. He wants to make his ideas on business models based on free digital content some sort of Grand Unification Theory of markets, digital and otherwise rather than honing in on cases where it actually makes sense. He has to shoe horn everything into his model.
Note that Dupuis isn’t dismissing Anderson entirely. I’m just citing a nicely worded key point.
I found this wiki bemusing, although I’m not quite sure I know why: “Theories Used in IS Research.” [www.fsc.yorku.ca/york/istheory/wiki/index.php/Main_Page] It is exactly what the name implies: “This site provides researchers with summarized information on theories widely used in information systems (IS) research. Click on a linked theory name below to find details about the theory, some examples of IS papers using the theory, and links to related sites.” There are a lot of these theories—close to a hundred. I’ll admit that I never thought of Darwin’s evolutionary theory as an information sciences theory, but I’m no information scientist. (Even if I was, I’d be challenged by text such as the first sentence for “Hermeneutics”: “Hermeneutic theory is a member of the social subjectivist paradigm where meaning is inter-subjectively created, in contrast to the empirical universe of assumed scientific realism.” Does that mean “We just make shit up”? The rest of the description leaves me even less certain of what’s being described and how it could fit into a “science.”) I do like the presence of a “Top 5 Theories” list!
To finish off this random set on a humorous note (albeit one that doesn’t appear intended as humorous), here’s Jeremy Reimer’s August 3, 2009 Ars Technica piece, originally titled “Microsoft Word, RIP: 1983-2009”—now retitled “The prospects of Microsoft Word in the wiki-based world.” What’s that you say? “Wiki-based world” strikes you as a ludicrous term? Well… see, the author’s someone who’s used Word for 20 years and now realizes “that I don’t need Word any more. At all. Ever.” (Those last three words appear as a separate paragraph.) Why? First Reimer talks about features (he seems to be for them but complains about the difficulty of converting complicated documents into XML), then about Word being designed to prepare documents for printing (which, of course, nobody does any more). He seems to assume that all anybody uses Word for is to write office memos—and somehow concludes that MediaWiki is the answer. And at his wholly representative firm, everybody started using it right away, loves it, and it “transformed our office’s documentation landscape.” Since it’s trivial to convert Word documents to wikimarkup (rriigghhtt…), “that’s basically the end of Word at work.” So that’s it: “Word…is the new typewriter.” As the second commenter says, “It’s official. Word is dead because Jeremy Reinter stopped using it.” After all, nobody needs any of the Word formatting, structuring and other features MediaWiki doesn’t support… (Another commenter has clearly had Fun MediaWiki Editing Experiences, noting what frequently happens after half an hour or an hour of intense editing—whoopsie, it’s all gone!) What we have here is one interesting (if unusual) case of one small office adopting a wiki as a document handling standard…that’s then generalized beyond all rational thought into a universal solution.
Comments should be sent to firstname.lastname@example.org. Cites & Insights: Crawford at Large is copyright © 2010 by Walt Crawford: Some rights reserved.
All original material in this work is licensed under the Creative Commons Attribution-NonCommercial License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/1.0 or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.