This [image by J. Scott Campbell] originally appeared in Wizard magazine #136 (2002) as the opening spread to an article titled "80's EXTREME!"
Here is the first complicating fact about the Jobs visit. In the legend of Xerox PARC, Jobs stole the personal computer from Xerox. But the striking thing about Jobs's instructions to Hovey is that he didn't want to reproduce what he saw at PARC. "You know, there were disputes around the number of buttons--three buttons, two buttons, one-button mouse," Hovey went on. "The mouse at Xerox had three buttons. But we came around to the fact that learning to mouse is a feat in and of itself, and to make it as simple as possible, with just one button, was pretty important." [...]
If you lined up Engelbart's mouse, Xerox's mouse, and Apple's mouse, you would not see the serial reproduction of an object. You would see the evolution of a concept.
The same is true of the graphical user interface that so captured Jobs's imagination. Xerox PARC's innovation had been to replace the traditional computer command line with onscreen icons. But when you clicked on an icon you got a pop-up menu: this was the intermediary between the user's intention and the computer's response. Jobs's software team took the graphical interface a giant step further. It emphasized "direct manipulation." If you wanted to make a window bigger, you just pulled on its corner and made it bigger; if you wanted to move a window across the screen, you just grabbed it and moved it. The Apple designers also invented the menu bar, the pull-down menu, and the trash can--all features that radically simplified the original Xerox PARC idea.
The difference between direct and indirect manipulation--between three buttons and one button, three hundred dollars and fifteen dollars, and a roller ball supported by ball bearings and a free-rolling ball--is not trivial. It is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that's appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.
This is a fascinating article about how reinsurance companies model risk, how those models are revised to account for previously unimagined events (e.g., 9/11, climate change), and the relationships among the "original insured," insurers, and reinsurers.
Consumers don't tend to know what reinsurance is because it never touches them directly. Reinsurers, massively capitalized and often named after the places where they were founded, make their living thinking about the things that almost never happen and are devastating when they do. But even reinsurers can be surprised. And the insurers who make up their market put them on the hook for everything, for all the risks that stretch the limits of imagination. This is what the industry casually refers to as the "God clause": Reinsurers are ultimately responsible for every new thing that God can come up with. As losses grew this decade, year by year, reinsurers have been working to figure out what they can do to make the God clause smaller, to reduce their exposure. They have billions of dollars at stake. They are very good at thinking about the world to come.
This is the coolest video you'll watch this month. Basically, this guy installed a video camera in the cockpit of a radio-controlled F-16--the camera replaced the head of the fighter-pilot doll!--and then piped the video to video goggles he wore on the ground. The camera on the plane is synced with the movements of his goggles--it's as if he's in the cockpit when piloting the plane from the ground.
He also modified the dashboard to include an altimeter so he could see, within the goggles, the height of the plane. Oh, did I mention that the plane also has servo-controlled "rockets"?
Unreal. This whole damn project is unreal.
I'm not going to lie, I skimmed over this article looking for just one thing, and found it:
As far as the existing e-ink-based Kindles, all I've heard is that they'll continue to co-exist with this new tablet…
Regarding the tablet, it's exactly what anyone who follows this sort of thing would have guessed. *yawn* It likely will sell well, but I've zero interest in the device because 1) I already own an iPad 2 (and am foaming at the mouth for the retina-display iPad 3); 2) apps, apps, apps; and 3) I already own an e-ink Kindle.
The big news of course is that the tablet will be an all-Amazon joint--it will use a forked Android as its base, but going forward the OS will be developed independently (at least as far as UI/UX goes), and it will use Amazon's app store, not Android Market. I actually think this is a great move, as it will allow Amazon to control the entire experience (a la Apple) and really set themselves apart from the Android-based competition.
That said, I think those, including MG, hyping up the integrated nature of Amazon services on the device are deluded. I can't see this making a lick of difference to consumers, who already can interact easily with most (if not all) of these services via a browser or native (iOS/Android) apps. *shrug*
A great read. Nice work, Ryan.
As some of you are well aware, I cut my geek teeth on Linux (starting in high school). If memory serves, the first distro I really dug into was Slackware (because it was the one some of my IRC friends liked), and then SUSE, Red Hat, and finally Debian, which I stuck with until I finally made the switch to Mac OS X in 2003.
I'll be honest, as annoying as some things were in Linux (and in the early days damn near everything was annoying, until you "solved" it), a big part of me misses the challenge. And don't even get me started on window managers, especially the nascent stages of Enlightenment, which turned my world upside down (in an awesome way).
dis·a·buse [dis-uh-byooz] verb (used with object) -- to persuade (someone) that an idea or belief is mistaken: he quickly disabused me of my belief in an afterlife.
Does that sound like you? Do you have an idea for an opinionated site? If so, then today's your lucky day, because I'm selling the world's second-greatest domain name, disabuse.net. I nabbed it earlier this year when I was trying to settle on a non-eponymous domain name for this site, and likely would have used it had I not successfully haggled for hypertext.net.
I've been gushing recently on Twitter about Spotify, and may never stop pushing the service if and when their entire catalog is available at the higher bitrate (i.e., 320 Kbps), a stated goal of theirs.
A few weeks ago I decided that going forward I would use the service exclusively, and, consequently, stop buying albums1. To this end I searched for and starred in Spotify every album/artist currently in my iTunes library (and then deleted the corresponding audio files from my machine; I have them in offline storage), thinking that later I would move those starred albums to one or more Spotify playlists2.
This took forever. Imagine then my horror when yesterday I noticed that my "starred" items included only the very last album I starred. That's right, all of my time and effort--spanning multiple days--vanished in a second, and I have absolutely no idea why (nothing changed on my end, and in fact, I never even closed/opened the app during this time). To add insult to injury, I couldn't even go back and do all of the work again, because I had deleted the albums from iTunes along the way.
At this point I kind of freaked out. A carnitas burrito from Chipotle brought me back to my happy place, mostly.
Fortunately, I scrobble a good chunk of my listening to Last.fm, and so I'm now in the process of working through that data and using it to rebuild my Spotify library. This time, though, I'm adding these previously bought albums to a Spotify playlist (and mirroring that playlist to yet another Spotify playlist), instead of starring them. Also, I'm going to look into the possibility of exporting/backing up these playlists, because I'm worried that they too can just disappear without warning; my guess is that, from a backend perspective, the starred list is nothing more than a pre-named playlist.
Dear Spotify, a lil' help?
When people like me decide they're no longer going to buy albums, you better believe that model is dead dead dead. There's no going back. iTunes needs to offer a subscription service, and fast. ↩
The idea here was that I didn't want to "forget" about music I already owned and loved, and so this was a way of "transferring" those records to Spotify. Maybe it's possible, but I couldn't figure out a way to have Spotify match my local music to its library, which would have saved me a ton of time. I have a ridiculous amount of music (about eight times as much in storage as was in my iTunes library), and so I hope Spotify comes out with a way to automate this transition. ↩
A large array of options may discourage consumers because it forces an increase in the effort that goes into making a decision. So consumers decide not to decide, and don't buy the product. Or if they do, the effort that the decision requires detracts from the enjoyment derived from the results. Also, a large array of options may diminish the attractiveness of what people actually choose, the reason being that thinking about the attractions of some of the unchosen options detracts from the pleasure derived from the chosen one.
Jonathan Pararajasingham has pulled together a montage of 50 renowned academics, nearly all of them scientists, discussing their thoughts on the existence of God. The list includes 16 Nobel prize winners and a bundle of recognizable names, including Richard Feynman, Steven Pinker, Oliver Sacks, Bertrand Russell, Stephen Hawking, and Leonard Susskind.
You may have noticed that I made no mention here of this technology when the whole world was going on about it a couple of months ago. It's a mind-blowing advance to be sure, but I wonder whether it has real, artistic application for prosumer and professional photographers.
Devin Coldewey does a great job summarizing my skepticism with regard to both the real-world practicability of the technology and its potential impact on the art of photography.
[The inventor] describes focusing as "a chore," and believes removing it simplifies the process. In a way, it does -- the way hot dogs simplify meat. Without focus, it's just the record of a bunch of photons. And saying it's a revolution in photography is like saying dioramas are a revolution in sculpture.
Steve, I could drone on forever about how you've affected me over these many years, but instead I'll simply say that you've made my life better. Thank you.
While Windows' pervasiveness did limit computer diversity in the 1990s, it by no means stamped it out. Here are 15 amazing and unusual machines that dared to swim against the tide of conformity--albeit with limited success that leaves most of these systems extremely hard to find today. Let's pay tribute, at long last, to these rare computers of the 1990s.
My alarm clock goes off. Presumably on my iPhone 4, because it's very important to me that I own the latest technology. I hit snooze. I can't believe I have to get up by 9 a.m. to make it to my place of work before 10 a.m. where I am paid to be creative and knowledgeable about "the internet," just in general. […]
On a conference call, someone we're talking to says a buzz-word like "synergy." We put it on mute and make fun of them. [...]
When I return to work, I will sign up for a social networking site that is new... It's probably a site made by a guy who knows a guy that I know. I'll be jealous that he was smart enough to make this. I will presumably use said new social network about 14 times and then I will never use it again. But I'll be able to let people know that, yeah, I've used that. I found it hard to get into.
The whole thing is gold. Go read it.
A bold experiment in distributed education, "Introduction to Artificial Intelligence" will be offered free and online to students worldwide during the fall of 2011. The course will include feedback on progress and a statement of accomplishment. Taught by Sebastian Thrun and Peter Norvig, the curriculum draws from that used in Stanford's introductory Artificial Intelligence course. The instructors will offer similar materials, assignments, and exams.
This sounds great, not least because it's being co-taught by Peter Norvig (now at Google), who wrote the book we used in the first AI course I took in undergrad. (His book is kind of the book.) I'm interested to learn a lot of this stuff again, and to see how he presents his material. (I'm assuming that the main difference between this course and the other free web-based classes MIT et al. have offered for years is that with this one you'll be able to participate more directly, and will receive (automated) feedback on assignments, etc.)
In another twist of awesome, Reddit is handling study group duties.
I've just enrolled. Will you?
A must-read for anyone doing web development.
In his series, "Day to Night," Wilkes photographs a scene "for a minimum of ten hours, from the same perspective, capturing a fluid visual narrative of day into night within a single frame."
Coffee Joulies do work, but their effect isn't very strong, and it's nowhere near their claims that the drink "will be ready to drink three times sooner and will remain hot twice as long." In fact, the effect is barely noticeable.
Well, that's terribly disappointing. Like many, I was excited about the potential for these things to radically alter the way I consume coffee. Fortunately, I was skeptical enough to not spend (at least) $40 backing the project; I'm hoping they're making refunds an easy process for their 4,818 backers.