Review: Jake2

This is somewhat of a departure for me. I’ve been doing some work for my client that involves WebStart, a system that lets you deploy Java applications from a webserver without installing any local tools (aside from WebStart itself, natch). “A perfect avenue for games!” I thought.
So, digging around, I started looking for games I could run via WebStart (which, by the way, runs perfectly under Linux). I found Jake2.
Jake2 is a pure-Java implementation of the Quake2 engine from Id Software. If you have any interest in first-person shooters, or haven’t been under a rock for the last 7 years, you’ve heard of Quake. I had pulled Jake2 a while ago as a standalone app, and it was ‘okay’, but a bit bumpy. Since I still had my Q2 maps locally, I was able to just click on the Jake2 WebStart button on Bytonic’s webpage, say “Here are my maps”, and I was off and running.
The game plays perfectly, with a high frame rate on my laptop, and seems to handle many of the issues of running “a full screen app in a window” quite well (mouse motion does NOT drag the pointer off the window, where you’d suddenly stop moving and become easy prey for the nasties lurking about).
I also was able to download a couple of third-party game maps, install them into the baseq2 directory, and run around a bit. Only one caused Jake2 to crash out, and that could easily have been because of a bad map. The Id-supplied maps were fine.
Next on the list will be to try using the networking code, and playing multiplayer. Moohahah.

Computin in da Field

Things are settling down a bit around Chez Geek, and I’m starting to get back into the rhythm of working on my laptop full time. Life things being as they are, it makes sense to keep an eye out for places in the area where I can telecommute. This is more of a challenge than you might think. Not many businesses are comfortable with you just plopping down in their space and sitting for 4-6 hours straight.
Despite that, though, there are a few places that encourage this, or even invite it. Since I was hanging around Waltham, MA today, I decided to see what I could find in the area.
First step, since it was lunchtime, was the local Wendy’s. Surprisingly, I’ve had very good luck finding open WiFi access points around Wendy’s restaurants; I guess they just sort of invite the open-no-WEP-key types to their locale. Today I found 7 (!) access points in range, and easily connected to one to keep in touch through lunch and a bit after. Alas, Wendy’s dining room chairs really aren’t that comfortable, so I decided to move on.
I had stopped by the Charles River Public Internet Center once before, but it was unfortunately closed at that time. This time the door was open, but the public area was being worked on, so I couldn’t sit down. These folks seem awfully nice, and I think I’ll explore doing more formal “Come here and work for a day” arrangements with them. The woman at the front desk was nice enough to point out a local coffee shop that had open WiFi, and was within walking distance, so off I went.
I ended up at “Cafe on the Common” (oddly, no website), right in the center of Waltham on Moody Street. There was one other laptop-a-holic when I got there, so I found a comfy table, got a big bowl of coffee, and settled in to work. The net connection was a little bit twitchy, but I was able to get 2-3 hours of work in, whilst listening to lovely music.
I’m going to keep exploring other good spots around the area. Finding something a little further west would be handy, since I’ll be commuting to Sudbury Valley School to pick up Z when the school year starts; it’d be nice to have someplace within striking distance of that.

Does this make ANY sense?

I’m not a computer scientist. I didn’t go to school to get letters by my name or to learn how languages are built and designed, and I’ve never written a compiler in my life. I’m just a fairly high-end programmer who writes in high-level languages and builds large applications.
I’m certainly a big user of PHP; less so recently, but since Claimit is written in it, I’ve been getting back up to speed with it.
But occasionally I run across design decisions that just don’t seem to make sense. Case in point: the asort() function. It sorts an array, in this case preserving the key associations (as opposed to sort(), which doesn’t). One would assume you’d use it like this:

$sortedArray = asort($oldarray);

Not so! This call works and does not throw any errors, but the resulting value in $sortedArray is ‘true’ (or ‘1’). The asort() function returns a BOOLEAN value (successful completion of a sort??). What actually happens is that asort() reorders the existing array in place. So in order to have a new, sorted version of the array, or to sort the result of a function call, you use this tortured syntax:

asort($sortedArray = $oldarray);

In my case, I’m actually sorting the returned array from a function call, so we have:

asort($sortedArray = getFiles());

I understand this is a Perl-ism. That doesn’t mean it has to be perpetuated on the masses. Ick.
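
For what it’s worth, the less tortured way to end up with a sorted copy is to copy the array first and then let asort() do its in-place thing on the copy. A minimal sketch, reusing the getFiles() call from above:

$sortedArray = getFiles();   // plain assignment gives you your own copy of the array
asort($sortedArray);         // sorts the copy in place by value, preserving keys; returns a boolean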

Claimit and Domains

Over the last few days, other people have started to use Claimit for managing giveaways. There’s potential for this thing to pick up steam, and I need to do some more work on the code, but in the meantime I thought it might be nice to actually register a domain for it.
Twiddling whois, I see that ‘claimit.com’ is registered, but the associated ‘www’ site doesn’t have the standard ‘Buy this domain!’ crud all over it. So I fire off mail to the domain tech contact asking if they’d like to sell the domain.
I get a quick response (nice), saying “I have to check with my partners, but note it won’t be cheap.” Folks are STILL trying to cash in on this noise, huh? I sent back a very terse letter saying I don’t endorse profiteering from domain squatting, and went back to whois.
Well, lookee. ‘claimit.net’ is not registered. I’ll take that instead.
The next question was which registrar to use. Unfortunately, my recent experience with Register.com has been poor to disastrous (2 weeks to resolve a broken address pointer in their database), and their pricing scheme ain’t so hot either. So I decided to try Yahoo! Domains for this one. The process was truly slick, costing me $9.95 to reserve the domain, complete with A and MX records hosted on their servers. I set up a forwarder so that hits to www.claimit.net will be redirected to claimit.stonekeep.com, and sat down to wait and see how long it would take to get into the master nameservers, and for DNS to propagate.
Answer: 23 minutes.
I’ve never brought up a new domain that fast before. Amazing. Stick that in your smoke and pipe it, mister domain squatter.
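
If you want to poke at propagation from a script rather than re-running lookups by hand, PHP 5’s dns_get_record() is handy for a quick sanity check. A rough sketch, assuming the A and MX records described above:

$a  = dns_get_record('www.claimit.net', DNS_A);   // the web forwarder's address record
$mx = dns_get_record('claimit.net', DNS_MX);      // the Yahoo!-hosted mail exchangers
print_r($a);   // an empty result means the records haven't reached your resolver yet
print_r($mx);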

A Rant: Linux Man Pages vs Info Pages

This one just bit me on the ass tonight.
Why is it that the GNU/Linux heads think that a single documentation format, one that has been in place since time immemorial, should be replaced by one that is inferior, annoying, and requires a whole new set of skills, JUST TO LOOK UP SOME TEXT?
I bring you the idiocy that is ‘info’. It is the grossest example of ‘emacsification’ that is all too commonplace in the Linux world.
Here’s a real life example of this ridiculousness.
Today I needed to write a ‘sed’ script to do some line processing. No problem, I just need to look at the man page to see how to do a few things. But lo! The man page for ‘sed’ on my machine (Debian Sarge Linux) doesn’t have any useful information. In fact, it has almost nothing at all. It does, however, helpfully say, at about line 30:

This is just a brief synopsis of sed commands to serve as a reminder to those who already know sed; other documentation (such as the texinfo document) must be consulted for fuller descriptions.

Very handy, eh? Fine. I just so happen to know that later in the document it mentions ‘info sed’. Okay, so I type that.
The information I’m looking for is simply what the command-line arguments are, and how to set pattern matches on substitution. The command info sed puts me… into an interactive menu system. Normally, ‘man’ pages are searchable by typing ‘/string’. So I type that. I am now on a page called ‘Less frequently used commands’. Very soon I am hopelessly lost, navigating screens and nodes and commands that have nothing to do with what I’m trying to do, which is simply to get the command syntax.
Who POSSIBLY thought that translating the simple, single-page format of man pages, which every Unix admin on the planet knows how to use, into an interactive mini-Emacs session that requires a fair amount of Emacs knowledge just to navigate makes ANY sense whatsoever?
Unfortunately, the people who make these sorts of idiotic decisions are the same ones that boot up Emacs and never leave it, and assume the rest of the world must know it as well. I have news for ya, fellas: I don’t. And assuming that everyone knows how to use Emacs, and that it’s therefore okay to REQUIRE an admin to use an Emacs interface to view documentation, is short-sighted and ridiculous.
Stick with the known, standard, common formats. Emacs is not universal. Use the man page system that has been around since the dawn of time. There’s absolutely no reason to change it.

A step forward.

Since we moved into the new house, I haven’t had much opportunity to try and get things in my workspace / room / office organized. When there was energy to move things around, I worked on the public spaces. Hey, it’s only my room, right?
Today I spent some time cleaning up my workspace, prepping to do more work on it (new lights, chair arrangements, etc.). It’s much tidier now, and I can look at it and go “Right. New light there, new monitor there… and…”
But while it’s clean, here’s what it looks like. Since I work full-time at home, I spend anywhere from 8 hours to ‘all my waking time, save trips to eat and use the loo’ in that spot.

Debian Sarge Firefox Security update – UNSTABLE!

This is a fair warning to folks running Debian Sarge. Recently there was a security update of Sarge (aka ‘Debian Stable’) to bring Firefox to a package identified as “mozilla-firefox 1.0.4-2sarge2”.
This update is UNSTABLE. Loaded extensions (I believe) are causing constant segfaults in the application, and even when running with an empty profile (firefox -p, new profile), odd behaviour shows up in certain MIME type handlers. Also, hitting ^H or viewing the history window causes an immediate segfault and crash.
I would recommend people running Debian Sarge NOT upgrade their install of Firefox until this is resolved.
Update: these bugs are already reported in Bugzilla, but have not been addressed. The links are:
Bug #324516 (^W crash)
Bug #324617 (History window crash)

Little bunny foo foo…

Many many moons ago my friend Cathy turned me on to Roman Dirge’s ‘Lenore’ comics. In one particular strip, Lenore is singing the bunny foo foo song.
Well, some bright light went and made a shockwave video of it. Note, it makes little sense unless you have the sound on, and in fact, it loses something in the translation, but it’s still gruesome and cute.
Apparently this is just one of an entire series of vids. I think I’ll wait until I’m truly catatonic before watching more. The killer bear picture is very cute though.