Get Simpsonized!

July 29, 2007

Here at the Buffet, we don’t normally push advertising, but this is just too cool not to pass along: a chance to see what you would look like had you been born in Springfield.

I’ve been Simpsonized!

You’ll need a color photo of yourself:
+ facing forward
+ in .jpg, .jpeg or .png format
+ with a minimum resolution of 640×480 pixels

The site requires an email address only if you wish to save your Simpsonized self for posterity on their website; you can download a full-sized image and an avatar without registering.

Even if you’re not a huge fan of The Simpsons, it’s worth trying just to appreciate the face-recognition software that picks out your features and builds your personalized image. It’s a fun way to kill 20 minutes, I promise.


Finding a better way

June 22, 2007

When I left my old job, I turned in my laptop. I’d carried it for the last several years, bringing it back and forth between home and the office every day. Because I could be confident that I’d always have it with me, it was the machine I used for most of what I did, including writing. All of my important files were backed up on my home computer every night, but I still used the work computer whenever I needed to deal with them.

With the work computer returned to the company and no longer available to me, I sat down recently to work on a document from my home PC. This machine has a slightly different version of Microsoft Word than the work computer did, so I was a little nervous about how things would go. It turns out I had good reason to be worried.

Word pretty much destroyed all of the formatting in my document. It mangled the indentation, the block quotes, and the punctuation, and it altered all of the section and subsection headings so that the entire table of contents was useless. Then, to add insult to injury, the program repeated its “improvements” each time I tried to correct them. I hate it when software is so confident it’s smarter than I am that it refuses to let me make decisions, and even worse, when it actively thwarts me in the decisions it pretends to offer.

With my 220-page document ruined, it was suddenly very worth my time to find a better option. Being the obstinate sort of person I am, I went whole hog and began experimenting with not just a different word processor, but a whole new paradigm for writing this particular text.

Read the rest of this entry »


Good Reads: “Everyware”

June 15, 2007

In the first of a series of recaps of the excellent, excellent reading material I covered in my final year at SI, I present Adam Greenfield’s Everyware: The Dawning Age of Ubiquitous Computing (New Riders, 2006).

Everyware is a multipart manifesto — brash at times with its self-assured tone, modest by intervals with its delicate forecasting of the trends it follows. The first portion of the work is given over to grappling with the intellectually thorny question of what exactly ubicomp (ubiquitous computing) is. While detailing the contours of this question, it also attempts to persuade the reader that regardless of your particular definition of ubicomp, it is happening. There is, insists Greenfield, something unique, powerful, and compelling at the intersection of micro-scale computing hardware, globally available high-bandwidth connectivity, and multimodal interfaces that is changing, and will continue to change, the world we live in. At least in the affluent “west,” that is. The remainder of the book’s not-enormous page count (268 pp.) is dedicated to exploring the many facets of this technological revolution.

This is the heart of the matter, and Greenfield bravely plunges into the discussion: “What does ubicomp/everyware mean for us? For those left behind, technologically speaking? What are the ethics?” In other words, Everyware is not just about what this technology is and what it makes possible, but also the critical (from this author’s perspective as an information scientist) question of what it means for society, what it affords, where it is trending, and what we can and should be thinking about as we respond to it. Given the initial sections of the book, I thought I would absolutely hate the remainder, but I am happy to report that I did not. Greenfield’s discussion is articulate, forceful, and wise.

Read the rest of this entry »


Cheesy horror movies are more fun than real life

June 14, 2007

In my daily perusal of the news today, I ran across the headline “FBI battles zombie hordes.” It sounded like something out of a bad movie. I was crushed to discover that it was not.

Instead, the article is about the FBI’s efforts to counter the growing problem of hijacked personal computers, called “bots” or “zombies.” Subverted computers are used to launch attacks on websites and to send out spam. Because investigating computer crime is part of its mandate, it falls to the FBI to tackle the problem.

While I don’t mean to suggest that hijacked PCs are not a very real problem, I was hugely disappointed. I’d have been much more entertained to read about black government helicopters swooping down to attack ranks of stiff-limbed, shambling undead intent on devouring anyone they could get their hands on.

Alas, the foibles of real-world technology have trumped B-movie hijinks once again. I suppose this means I can sleep soundly, not worrying that anyone will attempt to devour my brain. On the other hand, it does make reading the news a lot less interesting than it might have been.

Instead, I can leave you only with this music video. The song is original and catchy, and it puts the whole zombie issue in a gentler, more entertaining light.

-posted by Mark.


Photosynth

June 13, 2007

I’m still processing this presentation by Blaise Aguera y Arcas, founder of Seadragon, Inc. and co-creator of Photosynth, but I wanted to put this out there for other geeks to ponder and hopefully respond to in greater depth.

http://www.youtube.com/v/EqkDV0Ogvxc

Basically, Photosynth is software that takes digital photos and synthesizes them into zoomable, navigable spaces. It’s approximately the coolest thing I’ve ever seen, and I only wish I had half the computer knowledge necessary to understand how it all works.
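
For fellow geeks who want a taste of the machinery, the foundational step in this kind of system is finding the feature points that two overlapping photos share, so the software can stitch them into a common space. Here is a minimal sketch of that matching step in Python using OpenCV’s ORB detector; this is my own illustration of the general idea, not Photosynth’s actual code, and the real pipeline is far more elaborate.

    import cv2

    def match_photos(path_a: str, path_b: str, keep: int = 50):
        """Return the strongest feature matches between two photos."""
        img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
        img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
        orb = cv2.ORB_create()  # detects and describes corner-like keypoints
        _, desc_a = orb.detectAndCompute(img_a, None)
        _, desc_b = orb.detectAndCompute(img_b, None)
        # Brute-force Hamming matching suits ORB's binary descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)
        return matches[:keep]  # the best correspondences between the two views

With enough shared correspondences across many photos, the geometry of the original scene (and of the cameras that shot it) can be recovered, which is what makes the zoomable, navigable spaces possible.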

-posted by Ann


Stalking the perfect PC endgame

May 7, 2007

Here’s the thing about endgames: if the midgame was any good at all, the endgame has to be radically different.

That’s because a good midgame basically consists of a series of new obstacles, skills that must be mastered, or other permanent developments (my black bishop has been taken! I can now Force-push!), and eventually those developments pile up. By the time you’ve gotten to the end of a robust PC game, you’re likely to be:

  • buried in accumulated minutiae: say, micromanaging planets in a 4X strategy game like Master of Orion;
  • stuck in a monotonous slugfest: cf. Black & White, Age of Empires, or the countless real-time strategy games that disintegrate into resource management; or
  • boringly overpowered: ol’ Diablo has the endgame mechanics of backgammon: just keep rolling the dice until it’s over.

Instead, a good endgame turns a corner somehow, cutting across all the skills you’ve gathered or perhaps requiring an entirely new sort of skill. And for my money, a good endgame is the key to making a good game great.

Below the fold: a few endgame techniques to look for and my pick for the best endgame ever.

Read the rest of this entry »


An Inhumane Interface

April 26, 2007

Our computers are inhumane. When they don’t work right, they can set us back days or weeks, cause frustration and anger, and even lose or destroy irreplaceable information. Even when they work properly, though, they don’t work well.

People just don’t naturally think in terms of programs (or even, for the Apple folks, documents). One task can easily span multiple documents and programs. Unfortunately, computers don’t work that way. Instead, they try to force you to think the way they work, which just causes the kind of grief we’re all familiar with.

If I’m writing an article and need to do some simple math (say, figuring out how many pages my 1500-word article will run), I have to open another program, wait for it to load, and get the answer before I can return to my original document. That kind of thing is a concentration-killer, and it’s probably one reason so many people are now multi-taskers: we have to be, because that’s how computers allow us to perform tasks.

So what’s the alternative? Well, wouldn’t it be better if you could just type out a mathematical equation, select it, and tell the computer to find the answer? That way, you don’t have to switch contexts (concentration killer!), switch keyboard commands (why doesn’t ctrl-A work now?), or worry about which application has focus (oops, copied my document instead of the answer).
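
To make the idea concrete, here is a minimal sketch of what such a “compute the selection” command might do under the hood, written in Python. The function and the word-count example are my own hypothetical illustration, not taken from any real editor.

    import ast
    import operator

    # The arithmetic operators this tiny evaluator is willing to handle.
    _OPS = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
        ast.Pow: operator.pow,
    }

    def evaluate_selection(selected_text: str) -> float:
        """Safely evaluate a selected arithmetic expression, e.g. '1500 / 250'."""
        def walk(node):
            if isinstance(node, ast.Expression):
                return walk(node.body)
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return node.value
            if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.left), walk(node.right))
            if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
                return -walk(node.operand)
            raise ValueError("selection is not a plain arithmetic expression")
        return walk(ast.parse(selected_text, mode="eval"))

    # A 1500-word article at roughly 250 words per page:
    print(evaluate_selection("1500 / 250"))  # -> 6.0

An editor with this bound to a keystroke could replace the selected equation with its answer in place: no second program, no context switch.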

Read the rest of this entry »


Choice is a choice, but is it a good one?

April 26, 2007

I ran across an article on Slashdot yesterday about the allure of the Microsoft approach to selecting software. For those of you not in the field of software development, the approach is simple. Microsoft is of the opinion that you should buy your operating system, your server software, your database solution, your development tools, your content management system, and even the syntax of your programming language, along with all of the plug-ins, extensions, and updates to them, from Microsoft. It is obvious what the advantage is to Microsoft, but how does taking this approach help me, as a developer?

The author of the blog post is James Turner, an open-source developer working in a sub-specialty of the field very similar to my own. The argument he makes is simple: while the open-source community has a lot of interesting solutions to offer, that abundance is often itself the problem. The sheer number of choices makes it difficult to make the decisions necessary to get the job done.

So what’s good about a monoculture, and why does Microsoft win so often when people make a decision about platforms? Largely because what the open source community sees as a strength, people trying to get a job done in the real world see as a weakness. We celebrate the diversity of choices available to solve a problem and call it freedom. IT managers and CIOs look at it and call it chaos, confusion and uncertainty.

In reading this opinion, I was reminded of a conversation I had over dinner this weekend. One of the people I was eating with brought up a book they’d read in which the author talked about how people find happiness. Parts of that conversation seemed applicable to the concept of software monocultures. The point that struck me as most applicable was the one about salad dressing.

Read the rest of this entry »


Poor Dead Microsoft!

April 10, 2007

Well, by now you all should have seen Paul Graham’s essay “Microsoft is Dead.” You have probably also read his follow-on piece, “Microsoft is Dead: The Cliffs Notes.” If not, you should certainly go read them (www.paulgraham.com). I tend to agree with the gist of his essay and thought that, this being the Geek Buffet, we should have some discussion about it (especially since nearly every other tech-oriented blog has already jumped on it – “link bait,” some have called it).

I was interested in his essay because I had had a similar epiphany at the American Geophysical Union meeting last December. There were 15,000 very geeky geo-scientists and students there, and a huge area was set up with tables and chairs for attendees to use their laptops on the convention center’s Wi-Fi network. It seemed to me that I was the only person there using an IBM laptop running Windows. I was shocked by this and spent some extra time observing the phenomenon just to be sure I wasn’t overreacting. When I looked closely at people who seemed to be using a non-Apple computer, they were mostly running some variant of Linux. This experience has led me to believe that Microsoft’s monopoly on operating systems has been broken. Microsoft still has a huge advantage in the number of machines running its operating system versus all of the others combined, but people now believe there is a viable alternative, and they are acting on that belief.

Another experience relevant to this issue is the effort I have personally had to put into keeping the Windows machines I am associated with running. The big desktop machine in our house was completely cut off from the internet by an upgrade to IE 7. I have had to reinstall everything on my office laptop due to a virus infestation that got past my virus protection software and firewall. Even though everything was backed up, the productivity lost to reinstalling the operating system and all of your other software and files is enormous. Now, the latest MS security update has caused a glitch in my Windows 2000 system that makes startup take 30 to 40 minutes. I see this situation as unacceptable, and I expect that many other users see it the same way.

Because of these personal observations, I have bought a Mac laptop and am in the process of moving to that platform for all my personal work; I hope to move all my professional work over soon. There was some learning curve to get over, but it wasn’t as bad as I thought it would be. Personally, I am hoping to be able to consider MS dead soon.

I am interested to hear if anybody else has personal experience that leads them to think MS is losing its importance in the computer world.


Building a better Commissar detector

March 13, 2007

The advent of powerful image-manipulation software, such as Adobe Systems’ ubiquitous Photoshop, makes it easier than ever for photographers and journalists to tailor their pictures to look just the way they want them. Everything from subtle alterations to color saturation and shadow highlighting to the removal of dust specks and red-eye has become simple enough for anybody with the right software, a little bit of training, and a few minutes of free time to accomplish.

Not everything in life is sunshine and roses, though. Just as it has become easier to produce powerful, evocative images, it has also become increasingly easy to radically alter the content of an image. In short, it is no longer the case that the camera doesn’t lie. Even worse, as the tools improve, it becomes increasingly difficult to determine when an image is lying to you.

Of course, doctored photographs are nothing new. David King, in his book The Commissar Vanishes, produced the seminal work on photo manipulation in the Soviet Union. Stalin was famous for having pictures doctored, particularly to airbrush out people who had been photographed with him and subsequently fallen out of favor. More than one once-favored subordinate vanished from the official record at about the same time they were rounded up and killed for crimes against the state.

As time has gone on, the venerable airbrush has been superseded by fancy math and software almost magical in its capabilities. One side effect is that almost any photographer can now alter their own photos believably, rather than needing a team of experts. The issue has been thrust back into the news by the recent discovery of a badly doctored photograph of smoke rising over the site of an Israeli air strike in Lebanon, a photograph that had been widely distributed by the Reuters news service.

In the resulting media feeding frenzy, Reuters announced that it had updated its rules for editing the photos it carries on the news wire. In conjunction with these changes, Wired is reporting an announcement from Adobe that it is working on a set of tools for an upcoming version of Photoshop that can detect manipulation performed with the very techniques its software pioneered. These new plug-ins might give news organizations an important way to catch doctored images before they go out on the wire, but they also raise a number of thorny issues.
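
For the curious, one simple and widely known detection technique is error-level analysis: resave a JPEG at a fixed quality and look at where the recompression error differs, since regions pasted in or retouched after the original save tend to stand out. Here is a minimal sketch in Python using the Pillow imaging library; this illustrates the general idea only and is not the tooling Adobe announced.

    import io

    from PIL import Image, ImageChops

    def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
        """Recompress a JPEG and return the amplified difference image.

        Areas edited after the original save often recompress differently,
        so they show up brighter in the returned image."""
        original = Image.open(path).convert("RGB")
        buffer = io.BytesIO()
        original.save(buffer, "JPEG", quality=quality)
        recompressed = Image.open(buffer)
        diff = ImageChops.difference(original, recompressed)
        # The raw differences are faint; rescale them to the full 0-255 range.
        max_diff = max(hi for _, hi in diff.getextrema()) or 1
        return diff.point(lambda v: min(255, v * 255 // max_diff))

    # Usage: error_level_analysis("wire_photo.jpg").save("wire_photo_ela.png")

Techniques like this are heuristics rather than proof, which is part of why such plug-ins raise the thorny issues mentioned above.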

Read the rest of this entry »