Moving on

I have decided to house this blog on a new server running WordPress.  I have imported all of the entries from this blog (minus this one, obviously).  This blog is officially retired, although I'll leave it here in case anyone is linking to it (such as my personal blog).

LiveJournal simply didn't seem like the best place for these brain dumps.  I think I'll be inspired to post more often if I put them elsewhere.

So, here's the elsewhere:

A Random String of Bits.

Nintendo and the Homebrew Arms Race

When I purchase a piece of hardware, it is mine to do with as I wish.  This is a long-held understanding.  If I buy a piece of clothing, I can have it altered.  If I buy a car, I can change the tires.  If I buy a television, I can kill myself trying to screw with its insides.

It might void the warranty, it might put my life at risk or potentially damage the thing I've purchased, but it is my right as a consumer.

Nintendo takes a different view on the issue.  Owners of the Wii have long been able to employ a simple buffer overflow exploit in Twilight Princess to run custom code.  This exploit, called the Twilight Hack, allows a user to install, among other things, an application called the Homebrew Channel, which looks like any other Wii channel and lets you run other custom code without using the Twilight Hack again.  It's the gaming console equivalent of installing a new stereo in your car.
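
As an aside, the mechanics of this kind of exploit are mundane.  Below is a minimal illustrative sketch in Python - the buffer size and address are invented for the example, not the actual Twilight Hack values - of how an over-long string in a save file can spill past a fixed-size buffer and replace a saved return address with a pointer to the attacker's own loader:

    # Illustrative only: how an oversized save-file field can overflow a
    # fixed-size buffer.  The size and address below are made up for the
    # example; they are not the real Twilight Hack values.
    import struct

    NAME_BUFFER_SIZE = 32          # hypothetical space the game reserves for the name
    LOADER_ADDRESS = 0x80001800    # hypothetical address of the injected loader code

    def build_overflow_name() -> bytes:
        padding = b"A" * NAME_BUFFER_SIZE   # fill the legitimate buffer
        padding += b"B" * 8                 # spill over whatever sits before the
                                            # saved return address (made-up layout)
        # Big-endian pointer that ends up where the return address used to be:
        return padding + struct.pack(">I", LOADER_ADDRESS)

    print(len(build_overflow_name()), "bytes crafted for a", NAME_BUFFER_SIZE, "byte buffer")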

Since the hack was made public, Nintendo has been trying to thwart it.  They have, to date, released three firmware updates that included code targeted at stopping the Twilight Hack.  The most recent update succeeded in stopping it completely - it appears to detect the hacked save files and delete them, both on boot and whenever you insert an SD card.

So, all of this is standard fare.  Whenever a console launches, homebrewers will make it run custom code.  The console manufacturer will release an update to prevent this.  The homebrewers will work around it.  This process will continue in an escalating cycle.

However, Nintendo has delivered a low blow here.  Along with the System Menu 3.4 update, they changed their terms of service.

We may without notifying you, download updates, patches, upgrades and similar software to your Wii Console and may disable unauthorized or illegal software placed on your Wii Console...

Now, that's pretty cold - deleting our custom software?  Come on Nintendo, all I want to do is play videos on my Wii!  Also, the first time a fully automated background firmware update breaks something, the angry calls are going to pour like rain.  Power outage in the middle of a night-time firmware update?  Too bad!  But it gets worse...

If we detect unauthorized software, services, or devices, your access to the Wii Network Service may be disabled and/or the Wii Console or games may be unplayable.

Okay, at this point I feel it is crucial to point out a couple of things.  First, these quotes come from two documents, the Wii Network Service Privacy Policy and the Wii Network Service EULA.  Both of these documents are required, not to use the Wii in general, but to use the WiiConnect24 services (the Shop Channel, the Nintendo Channel, and Nintendo's other online content channels).  So, to use their network, you agree that they may disable your system completely.  This means two things:

1. You can, perfectly legally, run hacked code on a Wii that does not use WiiConnect24.

2. You grant Nintendo the right to break the law (destruction of private property) if you choose to use the WiiConnect24 service.

Now, according to a lawyer I know, a contract cannot override criminal law, even if it is signed with full knowledge rather than merely clicked through (the enforceability of click-through EULAs is still up for debate in the US).  So this clause is, by necessity, unenforceable.

So why is it there?  Nintendo has a juggernaut legal team, famed for its ruthlessness.  They can bankrupt any individual consumer with the legal proceedings necessary to challenge them, and it is unlikely that this will raise enough stink to get a class-action suit started.

I used to have some respect for Nintendo.

Linux on the Desktop - a partial solution

Lately, I've read a number of "Windows user tried Linux for a week and hated it, and this is why" articles. Then, while I was holding back the urge to scream during a Windows XP install, it hit me: we're holding Linux to a double standard here.

In the last year, whenever someone talks about "whether Linux is ready for the desktop", the complaints that always crop up revolve around the fact that a user can't throw in a Linux install CD, click Next a few times, and have a fully functional desktop environment in half an hour. Several things plague these proverbial users: the lack of MP3 support is probably the most problematic right now, along with the lack of 3D graphics support. The complaints further, er... complain, that the user has to know what she is doing to enable or install all of these components.

What most people overlook, though, is that installing Windows is no cakewalk, either. Windows ships with almost no real video or audio hardware support - everything must be downloaded from third-party websites, and more importantly, the user has to *know* which vendor website to go to, and how to navigate the vendor's site (with some vendors, that can be a real pain!).

So now, let's be fair. I'm taking a Windows XP install, out of the box, and comparing it side-by-side with an Ubuntu Linux install. Okay, here goes.

Ubuntu Linux

No MP3 support

As a user, I have to install several non-free packages, which means changing my available repositories and running a few commands (or using the graphical tool). If I preferred the less legally questionable route, I would purchase Fluendo's codecs (€28 for their entire set of plugins, with perpetual updates, as of this writing - still about a quarter of the price of Windows' most basic edition) and follow their instructions to install them.

Of course, I also have to *know* about these options. A quick Google search ("MP3s in Ubuntu") turns up a forum post that gives me the answer, in step-by-step format.

No 3D graphics acceleration

This is even easier. All we need to do is install the nvidia-glx or xorg-driver-fglrx package, depending on the card. Both are in the restricted repository, which we already enabled. If we hadn't, the Google search "3D graphics in Ubuntu" gives us the correct answer immediately.

No flash player

Another quick Google search turns up the answer, as always with step-by-step instructions.

And, that's it. Everything else I need to do to be productive is already provided by Ubuntu: web browser, office suite, multimedia software. Note: I never had to restart Ubuntu during this whole process.

Windows XP

No audio

First, I have to figure out the name of my audio chip, which Windows doesn't tell me. All Windows will say is "Unknown Multimedia device". By booting Linux and running lspci, I discover it's a C-Media chip, and go to their website. I have to give them the exact chip model number, and they give me a driver to download. I have to restart Windows.
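
As an aside, here is roughly what that identification step looks like from the Linux side. This is just a quick sketch (it assumes a Linux system with the standard lspci tool from pciutils installed), not a polished utility; it simply lists the audio and display controllers that a fresh Windows XP install labels "Unknown":

    # Quick sketch: list the PCI audio and display controllers via lspci.
    # Assumes a Linux system with lspci (pciutils) available on the PATH.
    import subprocess

    def interesting_devices():
        output = subprocess.run(["lspci"], capture_output=True, text=True,
                                check=True).stdout
        keywords = ("audio", "multimedia", "vga", "3d controller")
        return [line for line in output.splitlines()
                if any(keyword in line.lower() for keyword in keywords)]

    for device in interesting_devices():
        print(device)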

No 3D graphics acceleration

Again, the video controller is just called an "Unknown display adapter". Foreknowledge tells me I have an NVIDIA GeForce 6600 GT. I go to NVIDIA's website (much easier to navigate than C-Media's), and get the driver. I have to restart Windows.

No flash player

Well, this one installs automatically. Doesn't even need a restart! One out of three isn't bad, I suppose.

The Conclusion

What's the point of this exercise? Am I trying to say Windows is teh sux0r? No, that's not my message today. I could catalog the myriad problems with Windows that make Linux a better option (spyware, viruses, openness and all the benefits thereof, etc.), but that's not the point.

The point is this: when it comes to installation, Linux and Windows are roughly equivalent in complexity. Linux has its installation issues; so does Windows. They tend to break roughly even, in my experience, although Linux has a much more readily available support structure in the form of community forums. But both OSes require a lot of user knowledge in order to get up and running. They assume you already know how to do things. What they really assume, underneath, is that a technical person is doing the install.


The Solution

Most Windows users never install their OS; someone else installs it for them, whether that's an OEM at the factory, the local computer shop, or the in-law programmer who gets drafted for technical work (ahem...). Linux users have seldom known this luxury; instead, whenever someone talks about Linux, they assume that the end user is doing the install.

The solution is to treat Linux installation the way we treat Windows installation. Someone who Knows What They Are Doing (tm) sets up the OS and delivers it to the end user. One practical advantage for the Linux community is that all the time spent on fancy installers could be channeled elsewhere (not to say we don't like our hardware auto-detection and the like, but a curses-based menu is just fine, thanks). Make Linux installation work the way OS installation has always worked: technical users install their own OS, and everyone else leaves it to the techs.

At least don't hold us to a double standard.

Decentralizing Second Life

So, I've been thinking about Second Life, and it occurred to me that it's being done entirely the wrong way. Don't get me wrong; I enjoy SL, and have no qualms about the experience itself. It's the underlying scheme it's built on that bothers me: one company controlling all the servers, one company responsible for keeping everything running smoothly. It seems to me that all technologies built on that model eventually fail on the Internet, while distributed technologies (the Web, email, Usenet) thrive.

To that end, I've been thinking about how Second Life could be successfully decentralized, without adversely affecting the experience that everyone has come to know and love. I've identified the key elements of the user experience that would be difficult to decentralize, and possible ways to handle them. First, though, the basics: how could decentralization even work?

First, LL releases the code for the Second Life server. Now, anyone who wants to can host one or more Second Life sims on a server of their own. A central repository would keep track of the existing sims, in a fashion vaguely similar to DNS (see The Grid, below). This would allow Second Life to grow without bound, with sims run by a multitude of companies and even home users.

So, how do we keep that Second Life experience without the centralized monolith of Linden Labs?

Economy
First and most importantly, the Second Life economy must be preserved. The economy has become the most crucial element of the experience: the ability to use real money, diluted down to a virtual quantum, to purchase other users' custom-created content. This breaks down into two sub-problems:

a) Managing the money. The most likely way to do this would be to set up a "bank", wherein a single host (or several different hosts) manages all of the banking transactions. I'm thinking of a system basically like PayPal, where you buy L$ ("Linden Dollars", Second Life's currency) from the bank, or sell L$ back to the bank for real currency. Each SL server would use this central bank system to check a user's account balance and make withdrawals/deposits, with proper confirmation on the part of the user, naturally. A public/private key system to ensure the user actually sent the confirmation could prevent abuse here (a rough sketch of that confirmation step appears after this list), so no worries on that score. The SL bank could even be controlled by Linden Labs, as this would be a lot easier to handle than the entire grid, and would still give them the opportunity to have a strong stake in their creation.

b) Protecting Intellectual Property. This is a tricky problem, and the single hardest element of decentralizing SL. Since a huge portion of the money in SL is traded for users' creations, there must be a way to prevent them from being stolen. Under a decentralized scheme, when a user rezzes an object on a sim, all the data for that object (textures, sounds, scripts) would necessarily be available to the owner of that sim. The most obvious solution I can find is to keep the object data elsewhere, and have a rezzed object be a pointer to that data. The advantage is that compiled scripts, raw texture data, and sound files stay on a secure server independent of their rezzed location.

But where is this mystical server? I see two options: either the data lives on another sim, perhaps the user's "home sim" (see User Accounts, below), or the data lives on a central "asset server" (essentially the way SL works right now). Under the former approach, the client would have to make tons of connections to different servers to get all the data. Under the latter, the asset server would have to be extremely load-tolerant and robust, and all the data would be stored by the same group of people, whose ethical integrity the SL user base would have to trust implicitly. Since both of these are flaws in the *existing* Second Life system, however, they're acceptable for the hypothetical exercise we're attempting here. Also, under either system the sim owner's own creations could be stored on-sim for lower lag.

One other solution would be to create some DRM scheme that encrypts this data until it reaches the client. Of course, in all of these cases the client could be modified to steal the data. However, here we again reach the fact that these flaws are already inherent in SL, and there's no easy way around them.
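
To make the banking idea from (a) a little more concrete, here is a minimal sketch of the public/private key confirmation step. Everything about it - the message format, the field names, the choice of Ed25519, the use of Python's "cryptography" package - is invented for illustration, not anything Linden Labs has specified. The only point is that a sim can forward a withdrawal request to the bank along with a signature that only the user's client could have produced:

    # Minimal sketch of signed withdrawal confirmations for a decentralized
    # SL bank.  Key scheme and message format are invented for illustration.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # --- on the user's client -------------------------------------------------
    user_key = Ed25519PrivateKey.generate()   # generated once, kept by the client
    user_pub = user_key.public_key()          # registered with the bank up front

    request = b"withdraw:500:L$:to=sim.example.org:txn=42"
    signature = user_key.sign(request)        # the user's explicit confirmation

    # --- at the bank ------------------------------------------------------------
    def bank_accepts(public_key, request, signature):
        """Honor a withdrawal only if the signature matches the registered key."""
        try:
            public_key.verify(signature, request)
            return True
        except InvalidSignature:
            return False

    print(bank_accepts(user_pub, request, signature))   # True: the user confirmed this
    tampered = b"withdraw:99999:L$:to=sim.example.org:txn=42"
    print(bank_accepts(user_pub, tampered, signature))  # False: the amount was altered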

The Grid
The ability to bring up a map and scroll around, or teleport instantly to another part of the world, is an exciting part of SL, and another crucial piece of the SL experience. Fortunately, the Internet already has a great system we can build on - DNS and hyperlinking. We simply define two kinds of links: "landmarks" and "neighbors". Each sim can have four neighbors, and neighbors must mutually agree to be neighbors (for a link between sims A and B to work, A would have to set B as a neighbor and vice versa). The neighboring agreements would be stored in a central server system, modelled on DNS. A few recursive calls to this system and each sim can cache a portion of the overall grid map. Want a private island? Simply don't neighbor your sim with any others. This creates user-level "peering agreements" that could produce more logical terrain (snowy areas linked together, etc.), even if the landscape does shift from time to time.

The other kind of link would work just like landmarks in the current SL system. Pretty self-explanatory, except this system would make "click to teleport" objects a necessity, finally.

If a user searches for a sim on the map, the client can grab that sim's cache of neighbors, and display more of the grid. The client could be configured to keep any amount of that information cached locally, for a more immersive experience.
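
Just to show the shape of the idea, here is a toy sketch of that neighbor registry and the client-side map caching. Every name and data structure is hypothetical; the point is only that mutual agreements stored in one DNS-like registry are enough for a client to walk outward from any sim and cache a patch of the grid:

    # Toy sketch of a DNS-like neighbor registry for a decentralized grid.
    # All names and structures here are hypothetical.
    NORTH, EAST, SOUTH, WEST = "N", "E", "S", "W"
    OPPOSITE = {NORTH: SOUTH, SOUTH: NORTH, EAST: WEST, WEST: EAST}

    class GridRegistry:
        """Central registry of mutual neighboring agreements."""

        def __init__(self):
            self._links = {}   # sim name -> {direction: neighboring sim name}

        def agree(self, sim_a, direction, sim_b):
            # Neighboring is mutual, so one agreement records the link both ways.
            self._links.setdefault(sim_a, {})[direction] = sim_b
            self._links.setdefault(sim_b, {})[OPPOSITE[direction]] = sim_a

        def neighbors(self, sim):
            return dict(self._links.get(sim, {}))

    def cache_map(registry, start, hops):
        """Client side: repeatedly pull neighbor lists to cache a patch of the map."""
        seen = {start}
        frontier = [start]
        for _ in range(hops):
            next_frontier = []
            for sim in frontier:
                for neighbor in registry.neighbors(sim).values():
                    if neighbor not in seen:
                        seen.add(neighbor)
                        next_frontier.append(neighbor)
            frontier = next_frontier
        return seen

    registry = GridRegistry()
    registry.agree("snowy.example.org", EAST, "tundra.example.org")
    registry.agree("tundra.example.org", EAST, "glacier.example.org")
    print(cache_map(registry, "snowy.example.org", hops=2))
    # A private island is just a sim that never calls agree().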

User Accounts
There are two ways to handle user accounts: a centralized account server, or a sim-based account system. Under a centralized server, all accounts would be handled by, say, LL. This simplifies the system greatly, and aids in managing the asset server. With "home sims", you'd have a system similar to Jabber, where user accounts are essentially user@home_sim. I believe the centralized system would work best, given that the asset server seems to be the most logical way to handle user-created content.

Instant Messages
Well, LL is currently planning to re-implement the IM system in Jabber, so we're pretty much covered there :P


So, in summary, we have a system that uses a centralized server for accounts and user-created assets, plus a DNS-like neighboring system to create the world map, while the sims themselves are controlled by individuals and hosted by companies, just like web servers are now.

Technophobia

I have recently realized why there are so many computer-illiterate people running around. It's not that people are simply stupid - that's a grossly judgemental answer that many of my fellow geeks unfortunately arrive at. It can't be that, because computer illiteracy reaches into technical fields: I know several computer science professors who simply can't use any technology newer than five years old.

So, what causes this, if not simply "they're dumb"? Fear. Technology is mysterious; most people, when confronted with something unfamiliar, are uncomfortable. It feels like some delicate piece of magic; if they touch it too hard, it might shatter.

The consequence of this fear is that, once gripped by it, people start assuming they *can't* learn anything about computers; it's all too arcane. So, when presented with technical terms or ideas, they stumble over them. If the technophobe stopped to think about the idea they are grappling with, they'd probably figure it out pretty quickly. But their mind won't let them do that - computers are "too complicated" for anyone like them to figure out.

An example: USB flash drives. Even most technophobes know what floppy disks are, but when you tell them a flash drive is similar, except that it connects to that rectangular plug on the side of their computer, they give a blank stare. They can't comprehend it because it's new.

A better example: If presented with two products that very clearly do the same thing, but are made by different companies, the technophobe will invariably ask "what's the difference between these two?" If you showed them a Dirt Devil and a Hoover, they would have no such problem, but computers are *mysterious*, afforded a special class of untouchability.

So, to all you technophobes out there: Stop being afraid of the computer. I promise it won't bite. Engage your mind and really *listen* when computer jargon floats by. Make intuitive leaps; even if they're wrong, they'll eventually point you in the right direction.

Programming: The theory

One of my biggest problems with the IT community, among both amateur programmers and prospective employers, is the following question: "So, what programming languages do you know?" This implies that learning a language is an extremely difficult task, and that collecting languages like trophies is somehow a worthy pursuit.

A programming language is a tool. A skilled craftsman isn't good at her trade because she knows how to use a given set of tools; anyone can learn that. Rather, true skill comes from knowing how to *apply* the tools. The fundamental concepts behind programming are the skills on which we should be focusing.

This applies to academia as well. The language you use to teach students, especially the first language they encounter, *is* important. I'm not about to advocate "teaching languages" like Pascal, though. I think it's important to choose a real-world language, with all the pitfalls and caveats of a real-world language, as a student's first language. At the same time, it should be a language with the features available to demonstrate all the fundamental concepts in programming. A language that doesn't support recursion would be a Bad Choice, for example.
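
Recursion is a good example of what I mean by a fundamental concept: once you grasp that a function can call itself on a smaller piece of the problem, the syntax in any particular language is a detail. A throwaway illustration (in Python, purely as an example):

    # Recursion as a concept: the same idea carries straight over into C,
    # Lisp, Java, or whatever language a course happens to use.
    def total_size(node):
        """Sum the sizes in a tree of nested dicts ('size' plus a list of 'children')."""
        return node["size"] + sum(total_size(child) for child in node["children"])

    tree = {"size": 4, "children": [
        {"size": 2, "children": []},
        {"size": 1, "children": [{"size": 3, "children": []}]},
    ]}
    print(total_size(tree))   # 10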

So, when someone (a peer or a hopeful programmer-to-be) asks me "what languages do you know?", I won't respond "Well, I know C, C++, Java, perl, php, xhtml/xml/css (if you count those), lisp, prolog, LotusScript, Javascript, LSL..." etc. Instead, I'll say "I've used a number of languages, but the key thing is that I know how to learn any language." When an employer asks, I suppose I'll have to say "Well, I know @languages...". Then, though, I might add "...but I consider the fundamental concepts behind programming languages to be more important, because mastering those means I can learn to get around in any language given a week or two of study."

In summary: Learning a programming language is trivial, once you know the fundamental concepts of programming.