I (still) don’t get SecondLife

I’ve been trying. Really trying. I just can’t find a way to “get” what all of the SecondLife hype is about. I mean, yeah, it’s cool. It’s fun. It’s a really interesting and diverse metaverse. It’s a blast to create and buy stuff, and customize an avatar, and fly around islands. I get that part of it.

But, for education, it doesn’t add much over the tools we already have available. I could see it if you were working on a collaboratively designed architecture project. Or perhaps some theatre or alternate reality exploration of literature.

But what I (still) see as the primary use of SecondLife in education is to compel people to sit in rows of chairs to watch a screen at the front of the “classroom”. Some classrooms have innovated – no roof. But, underneath it all, it’s still didacticism with chalk-and-talk replaced with stream-and-chat. I checked out a hybrid event today – a “traditional” web-enabled streaming video conference, simulcast into SecondLife.

SecondLife simulcast

In this screenshot, the back window shows the “conventional” webcast of the presentation. Streaming video and audio, at full resolution. The front window shows the SecondLife simulcast, with attendees sitting nicely in rows, watching a lower fidelity, smaller and distorted version of the same webcast.

Sure, in the SecondLife version, there was opportunity for interaction between participants. But that could have easily been added through a chat or IM group. And much of the interaction involved “how do I sit down?” and “how do I give my avatar breasts?” rather than discussing the presentation.

Much of that is a result of n00bs learning the environment, and that is to be expected. But I’ve been in several SecondLife sessions over the last week, and none featured what I’d call compelling or innovative pedagogy.

I still don’t get SecondLife, as it is typically used for education. I’ll keep trying, though.

The National Geographic Ritual

The latest National Geographic came in the mail today. I find it a little ironic that a magazine that’s had such a strong bent toward showcasing the effects of global warming is printed on dead trees and trucked around the world to be delivered into our mailboxes, but whatever…

When I get a fresh new NG, I have a ritual I follow.

  1. act all giddy and excited, like a kid with a new present
  2. carefully peel the brown wrapper off, so as not to rip the precious cargo inside. mention a little louder than is necessary that it’s a National Geographic, so any observers don’t get any ideas about what kind of magazine I’m subscribing to that requires a brown wrapper…
  3. inhale. deeply. pause. aaaaaaaaaahhhh… the ink smell, mixed with the off-gassing paper. so, that’s why they kill trees and ship this stuff around the planet…
  4. peruse the cover. always an awesome photograph. try to figure out where the photo was taken. if feeling really geeky, try to figure out how they got the shot. if feeling really cocky, try to figure out if I could have gotten that shot. wonder what it would be like to work on a NG shoot…
  5. scan the topics listed on the cover. the ones obscuring the photograph.
  6. take 10-30 seconds to scan the table of contents. get an idea of what’s inside.
  7. flip past the Cialis/Levitra/Ensomnublis/Viagra/Erectomax ads that fill the first section of the magazine with multiple full-page spreads. gee, I wonder what the prime demographic for this magazine is…
  8. examine every single page, looking only at the photographs. repeat step 4 for each photograph. this will take an hour or two. wonder what the hell they were thinking when selecting at least 3 photos that should have been marked as “Reject” in Aperture. (the motion-blurred flying birds with blurry ice field in the background is the prime candidate this time around – they were trying to be artistic. it would have worked, had the pan managed to get the bird in sharp focus, but it didn’t…) The polar bear shaking off water is one of the best catches of this issue. wow. Knowing that the bear charged the photographer seconds after the shot was taken just makes it so much better. Some of the wide-angle shots of meltwater reservoirs on top of the ice are pretty amazing, too.
  9. if any articles look really interesting, go back and read them.
  10. wonder why NG isn’t just a photo magazine. by FAR the best part of the magazine. the articles are great, too, but they take up paper that would be better allocated to more photos…
  11. come back to the issue several times over the next month, slowly working through all articles, letters, sidebars. revisiting every photograph. wondering how freaking cool it would be to work on a NG shoot.
  12. put the magazine away for “safe keeping”, never to open it again once the next one comes in.

As much as I love NG, I really think I’d prefer an online-only subscription. With access to high-resolution photographs and galleries, I’d be more than satisfied. And it would save countless trees, prevent tonnes of greenhouse gas emissions, conserve fossil fuels, etc…

WordPress Performance Tuning

My blog often has fits of sucktacular performance. After digging around and bugging DreamHost support for some ideas, I’ve made some progress.

I had been running wp-cache to enable file-based caching, thinking that would help optimize performance of the site (fewer database calls should equal better performance). Except that DreamHost apparently uses NFS-mounted storage for accounts, so filesystem access is a bit laggy, and the file-based caching was actually (apparently) slowing the site down (as suggested by 4+1 ways to speed up wordpress). I disabled wp-cache and set define('WP_CACHE', false); in wp-config.php.
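
For the record, the whole change boils down to one line in wp-config.php. The surrounding constants below are just placeholder context, not my real settings:

```php
<?php
// wp-config.php (excerpt) – placeholder values, not my actual config
define('DB_NAME', 'example_wp');
define('DB_USER', 'example_user');
define('DB_PASSWORD', 'example_password');

// With the wp-cache plugin deactivated, tell WordPress not to use the
// file-based cache (which would sit on DreamHost's laggy NFS-mounted storage):
define('WP_CACHE', false);
```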

I also noticed that there were 2 requests for linked files that were returning 404 errors, which in turn triggered my fancy schmancy 404 page and added significant lag to the page load. Turns out the OpenID Delegation plugin had bad references to openid.js and openid.css, so I fixed that, and page loads are at least cosmetically snappier now.

One other modification I made was to (temporarily?) disable the “Similar Posts” plugin and sidebar display. I really like the functionality it provides, but it was adding too much processing time to generating individual post pages. It works by running MySQL full text queries against the blog posts table, which gets a bit slow with lots of posts and MyISAM tables (table-level locking and lots of extra queries mean slower site responses). I’ll look into optimizing that a bit and re-enabling it in the future.

Also, I had the Mandigo theme set to automatically rotate through a set of banner images, meaning WordPress was having to crunch through the blogbanner/wide directory itself in order to pick a banner image. Instead, I just set up my .htaccess file to intercept the URL for the default Mandigo banner image and Redirect it to rotator.php (something like the rule below), so it should be a bit better now.
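
The rule itself is a one-liner in .htaccess. The banner path here is a guess at what the theme requests by default, so adjust it to the real URL:

```apache
# .htaccess (excerpt) – the banner path is illustrative; match it to the theme's actual default image
# Send requests for the stock Mandigo banner image to the rotator script instead
Redirect 302 /wp-content/themes/mandigo/images/banner.jpg /rotator.php
```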

It’s still not running as fast as I’d like, but it’s just a blog. I could always trim out more plugins and pick a simpler theme, but I’m pretty happy with the functionality I have at the moment. At least it’s performing better than Twitter…

Update: I turned off the General Stats widget (but left the plugin active) – there’s no need to count every word of every comment and post when displaying every page on the blog. That info is available on the Archives page, where it belongs. That removed several very heavy queries from the typical page generation load. I also updated Mandigo, which mentioned some optimizations in the changelog. The blog feels snappy enough now for me to stop worrying for a while…

Are we approaching social currency?

With all of this Web 2.0 activity still building, I’ve been thinking quite a bit about Cory Doctorow’s concept of whuffie, or social currency (from his great novel Down and Out in the Magic Kingdom). The idea is that you can “pay” others with a mythical currency based on reputation, not gold. Those with high reputations (creators of cool stuff, humanitarians, or, shudder, American Idle) gather whuffie, which they can use to purchase stuff. Those with low reputations (eBay scammers, spammers, or ideally American Idle) have less whuffie, and have to struggle to gain reputation in order to move back up the whuffie ladder.

Just as web 2.0 has the power to democratize government, it has the potential to shift the base of our economy from one of scarcity (gold bullion is hard to come by) to one of plenty (reputation is not a zero sum game). Google PageRank is the closest thing I’ve seen to whuffie – it is calculated by an algorithm that takes into account links and their context, which is an approximation of reputation. It’s missing a sign, though, in that it only adds reputation and has no concept of “rocks” vs. “sucks.” A “bad” link doesn’t detract from the reputation of a site (and by extension its owner).

So, I have to wonder. How feasible is Bitchun Society now (without the scifi rejuvenation and brain implant stuff)? Between PageRank, “Friends” on social software websites, etc… are we starting to approach an effective social currency?

Enterprise-Class WordPress

I’d been thinking that WordPress might be tricky to scale, but between WP-Cache and the newly announced HyperDB, I think WP might well have some legs in it.

WP-Cache stores pages as static files, and dramatically reduces the load on the database. This makes sites more responsive, and at least theoretically able to survive a Slashdotting or Digging.

Matt just announced the other side of the equation: enterprise-level database connectivity. They’re releasing the (previously custom) database class that was developed for WordPress.com. It obviously works, as WordPress.com hosts something like 47 quajillion blogs, with pretty decent performance.

Matt’s notes on the HyperDB release describe features including:

  • Replication
  • Failover
  • Redundant (public/private) networks
  • Local and remote datacenters
  • Partitioning
  • Different tables on different DBs
  • Advanced stats for profiling
  • More…?

So, it supports spreading databases across a bunch of servers, making it easier to set up server clusters to scale WordPress (and WPMU) up to any level you want. Might be handy for, oh, I don’t know… an institutional blogging platform?
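
I haven’t deployed HyperDB myself, but from the release notes the configuration lives in a db-config.php file where each database server gets registered. The sketch below is my reading of that setup – the add_database() keys, hostnames and credentials are assumptions, so verify against the actual release:

```php
<?php
// db-config.php (sketch) – hostnames/credentials are hypothetical, and the exact
// add_database() keys are my reading of the HyperDB notes; verify against the release
$wpdb->add_database(array(
    'host'     => 'db-master.example.com',   // master: accepts writes and reads
    'user'     => 'wp_user',
    'password' => 'example_password',
    'name'     => 'wordpress',
    'write'    => 1,
    'read'     => 1,
));

$wpdb->add_database(array(
    'host'     => 'db-replica.example.com',  // read-only replica to spread query load
    'user'     => 'wp_user',
    'password' => 'example_password',
    'name'     => 'wordpress',
    'write'    => 0,
    'read'     => 1,
));
```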

WordPress just got a bunch more interesting, from a CMS perspective…

Blogs and the Twitter Effect

While chatting with Scott at ETUG, he commented that he was frustrated with Twitter – both because of its constant flakiness, and because of the negative effect it’s having on many people’s blog posting activity. I’m definitely posting less frequently since getting bitten by the Twitter bug.

At first, I didn’t see the problem, but then he explained it. If people are pumping their content and energy into Twitter, something that is by nature largely ephemeral and transient (both in server uptime and lifespan of content), then the blogosphere is effectively losing out. Yes, there are benefits – the conversations and serendipitous connections that happen via the always-on and always-shifting nature of Twitter streams are compelling because they are some of the most highly social public interactions on the internets. And that has helped me feel more closely connected with the 40-odd people in the strange, distributed, cosmopolitan set of folks I consider friends.

During ETUG, we tried to shift to Jaiku. The UI of Jaiku sucks, compared with Twitter. It’s too busy. It’s got ads. But it stays up and never eats content. And it’s got almost nobody on the network. The people are on Twitter. I gave up on tilting at that windmill in less than a day. It’s not worth fighting with cranky software, but it’s also not worth abandoning the community in the relentless pursuit of uptime…

Where Jaiku was like a cold, lonely walk, Twitter’s like a family gathering (with all that entails).

Photos: a cold, lonely walk on campus vs. a family gathering

Hope for Peak Oil. Soon.

Things are getting out of hand, when Peak Oil – the end of cheap petroleum – is the only way I can see out of this mess. It would help reduce carbon emissions, and it would help reduce our environmental exposure to plastics and plastic byproducts like Bisphenol-A.

My friend Niran gave a rundown of how pervasive environmental plastics are, and the dangerous side effects of our constant exposure to them. Grey Goo, but as a result of Better Living Through Chemistry™

Very scary stuff. Between the upcoming drop in global carrying capacity, impending spike in fuel prices, and environmental contamination through petroleum products and byproducts, we’re in for an interesting ride over the next 50 years…

I recently got rid of an old plastic table that was sloughing a white powder. I had no idea that was a product of photodegradation of the polymers, and that the white dust wasn’t just annoying but potentially toxic (if not to me, then to the critters that form the base of the planet’s food chain/web). Of course, by “got rid of”, I mean “carted to the landfill, where it continues to photodegrade, but I don’t have to look at it.”

My house is full of (and made of) plastics. My fridge is stocked with it. My water is stored in it. My vitamins are encased in it. There is no part of my home, work, or neighbourhood that is free of plastics (and by extension, petroleum). Very scary to imagine the changes that will be necessary to reduce that, or to adjust to a new way of doing things without the long polymers so cheaply available…

OpenID Server

OpenID Logo

OpenID appears to be gaining some momentum. It feels like the right approach to identity management – let individuals control their identity in a trusted way, rather than relying on federation through central brokers. Sun Microsystems just rolled out OpenID support for all of their employees. Stephen‘s been talking about this kind of decentralized identity management for years (and most recently just yesterday).

But, it’s been a bit strange in that it hasn’t been very easy to run your own OpenID server. I mean, you could go through myopenid.com to get a free hosted OpenID, but that’s just a federated, centrally hosted identity. No different than a Yahoo! or Google account. The power of OpenID is that you can/should run your own OpenID server, so you control it. It’s not a decentralized, individual identity management system if we still hand control over it to central services. We need to be running our own OpenID servers. Which means it needs to be easy to set up. Ideally one-click easy. It’s not quite there yet, but it’s getting closer.

I’d tried to install an OpenID server yesterday, and failed because DreamHost doesn’t support the big math libraries needed for encryption, and the server I was trying didn’t fall back to “dumb” mode. But, I just installed phpMyID on my DreamHost account, and it worked flawlessly. It took maybe 10 minutes, including RTFMing. Now, I have my own OpenID server, which I control, living at openid.darcynorman.net
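
The setup really is that minimal: drop the phpMyID files on the server and edit its config file with a username and a pre-computed password hash. The excerpt below uses placeholder values, and the key names are from my memory of the README, so double-check against the real file:

```php
<?php
// MyID.config.php (excerpt) – placeholder values; key names from memory of the phpMyID docs
$profile = array(
    'auth_username' => 'example_user',
    // phpMyID expects an MD5 digest of "username:realm:password" (the realm defaults to "phpMyID"),
    // e.g. the output of: echo -n 'example_user:phpMyID:example_password' | md5sum
    'auth_password' => '0123456789abcdef0123456789abcdef',
);
```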

Now, what does that get me? Initially, not much. All I’ve been able to do is authenticate on Zooomr.com using my own OpenID server for credentials. That’s pretty cool as a “hello, world!” test. And when OpenID support gets rolled into more services, I’m ready.

DreamHost, if you’re listening, this would be a great opportunity for a One-Click Install package. Rolling out OpenID server support for all of the 46 bajillion DreamHost customers would go a long way toward kickstarting OpenID adoption. I’d say Google should roll it out for GMail account holders, but again that kind of defeats the point of a decentralized identity management system, if we all use a central broker anyway…

Update: Even cleaner, now. I’ve just added the openid.server and openid.delegate elements to the head of my blog, meaning I can just provide the URL “http://darcynorman.net” as my identity in any OpenID-enabled software.
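
Concretely, delegation is just two link elements in the page head: one pointing consumers at the OpenID server endpoint, one naming the identity being delegated. The href values below are illustrative (the phpMyID endpoint path in particular is an assumption), not necessarily my exact setup:

```php
<!-- in the theme's header.php, inside <head> – href values are examples, not my exact endpoints -->
<link rel="openid.server" href="http://openid.darcynorman.net/MyID.config.php" />
<link rel="openid.delegate" href="http://openid.darcynorman.net/" />
```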

Update 2: Yikes! I just went to enable HTTPS and certificate support on the openid.darcynorman.net domain, and it’d cost almost $250CDN per year to do that ($48US per year for static IP, $189US per year for the certificate via GeoTrust). There’s a minor flaw in the whole OpenID system – if the distributed servers aren’t trustworthy and secure, the system kind of falls over. An unsecured OpenID server is a bit of a magnet for packet sniffing usernames and passwords…

Update, 33 1/3: I got nervous about not having a secure OpenID server, so reverted back to using MyOpenID.com. Yes, it’s a centrally hosted distributed identity provider, but it’s secure, and by using my own URL as a delegate I retain control (so if MyOpenID.com turns evil, I’m able to very easily switch to another provider, or run my own).

I also added the handy OpenID WordPress Delegate Plugin to this blog, so it will automatically add my OpenID information without my having to remember to tweak the theme’s header.php file every time I update the theme…