We just launched the new website for the Teaching & Learning Centre at The University of Calgary. It’s been a long time in the making, with heavy use of themes, custom CCK content types, events, signups, views, and a bunch of other Drupal modules and tricks. King worked his usual magic in putting together the CSS for our theme, which uses the same HTML templates as the official www.ucalgary.ca site.
The new site should make it much easier for us to keep content up to date. We’re also planning some potentially cool community features for down the road a bit, once the dust starts to settle after The Big Website Launch.
Also, it’s currently running on our aging PowerMac Quicksilver dual 1GHz G4 server, so it’s a bit slower than it should be. We’ll be moving it to a shiny new-ish Xserve ASAP.
TLC Website in Drupal
I’ve been trying to log into the SecondLife presentations (starting with the big NMC Impact of Digital Media Symposium, and then wanting to follow up on some of the blog posts about it).
But, SecondLife keeps throwing my Quad G5 into a kernel panic whenever I launch it. Or, rather, I can launch it, but if I log in, and try to enter SL, it chokes on the caching process, and locks up my system something fierce. Then it drops into the Grey Screen of Death. Oops. Reboot the machine, wait 30 minutes for it to check the drives and files, and try again. Same thing. Doh. I might have to install SL on a spare machine in the lab to try it out…
I just grabbed the latest CVS build of Drupal 5, to poke around and see what’s new. I’m really surprised and impressed at how far it’s come. Here are the big changes I’ve seen in about half an hour of playing with a new test site:
- New theme. Garland feels much more modern than Bluemarine. Nicely done.
- Revamped admin interface. It’s been reorganized by task and by module, making it easier to find where to modify a given setting. Much of the time spent administering a Drupal 4.7 site is wasted poking around the admin screens to find where a bit needs to be twiddled. “Is it in admin > content > content types? Or rather admin > settings > content types ? etc…” The new “Content Management,” “Site Building,” “Site Configuration,” “Logs,” and “Help” admin sections should make it much simpler to run a Drupal site.
- Tweaked page cache settings – not sure if the rumoured filesystem cache system is included here (is that the “aggressive – experts only” cache?)
- Includes a lite version of Content Construction Kit, making it easy to create simple content types (title + description/body). Why is that cool? Because you can assign different taxonomies and access controls to the various simple content types, so different users are able to create different types of content, classified differently. I’ll have to see what the potential issues may be with running the full CCK module for managing complex content types. Our new, soon-to-be-released TLC website makes extensive, obsessive use of several complex custom content types, so I’ll be spending some time checking this out.
- Separate “main” and admin themes. Instead of trying to shoehorn admin functions into a production custom theme, you can pimp out your site’s theme while retaining a fully functional admin theme. Yay.
- Lots of little improvements, making the admin interface more task-oriented. Things like the Clean URL setting have been moved into more appropriate spots, rather than dumping them all into One Giant Config Screen.
- Support for multiple image libraries. It was there in previous versions, but required adding a secret file to support Magick. I’ll have to install ImageMagick on my PowerBook again to try this out.
- Site installer. You still have to manually create the database, but Drupal will now install the tables and a default set of data automatically. I’ll have to play with the profiles feature to see how that might tie into the Provisionator.
I’m looking forward to the release of Drupal 5. I’ve got a LOT of compatibility testing to do before going live with it, though. I’ve got several sites running older versions of some modules, because the module versioning system is confusing enough that things break a little when they’re updated to “current” versions. That’s one area I’d really like to see get some loving: a module gets tagged as “4.7” and then changed in ways that break things, depending on exactly which version of that 4.7 module you’re running (or have run). Very confusing and frustrating.
After some really spectacular fall days, winter’s here. Right on schedule, just before Halloween.
Here’s the webcam view from ShawTV, looking north from near the Stampede Grounds. That’s downtown Calgary. Hiding beneath the snow and ice piled up on the camera.
I’ve been trying to move domain registration and DNS hosting for darcynorman.net from GoDaddy to Dreamhost for a couple of months. It’s been a long and frustrating process, involving faxing my driver’s license to Arizona to somehow prove I am who I say I am.
I just logged into my Dreamhost account to check on the status (still hasn’t finalized – they sure did set it up in a hurry, but it takes a looooong time to switch off of GoDaddy). On a lark, I tried adding registration for darcynorman.com. But Dreamhost’s registration utility complained that the domain was already taken.
Mwaaaah? Another D’Arcy Norman out there? Lemme check that out. A quick whois darcynorman.com turned up this:
Domain Name: DARCYNORMAN.COM
Registrar: GO DADDY SOFTWARE, INC.
Whois Server: whois.godaddy.com
Referral URL: http://registrar.godaddy.com
Name Server: CNS1.CANADIANWEBHOSTING.COM
Name Server: CNS2.CANADIANWEBHOSTING.COM
Updated Date: 16-mar-2006
Creation Date: 16-mar-2006
Expiration Date: 16-mar-2007
Oh, wait. No. It’s a domain squatter. Sitting on my name, presumably hoping for a portion of the mad cash this blog generates. Mad cash, I tell you. Some lame squatter leech decided to register my name in the hopes I’d pay a ransom to get it back. At least the squatter is using a Canadian service provider to park the DNS for the domain. I guess that’s better than having it offshored to Moscow or something.
The combination of cheap domain registrations and “secure/private” registrations, where you can hide behind a proxy, makes this practice possible. When I register domains, I need to go through CIRA verification, accept agreements about usage, etc… But these roaches can register other people’s names and park them for ransom. Rules (like locks) are for the honest people.
Screw you, squatter. I just went and registered darcynorman.ca – the only other variant of the domain I’d care about. Go ahead and squat on the rest, you rat bastage.
Alan wrote up a post on "linktribution" (the concept of providing attribution for a link to a web page, flickr image, etc…) and in the comments, Scott replied that (perhaps a more broad concept such as) Creative Commons would be a better Big Picture meme to propagate.
Which got me thinking about my experience with CC. I'm a firm believer in it. All of my stuff is licensed using a simple CC-Attribution license. Anyone is free to use any of my blog posts or any of my Flickr photos however they wish, as long as they provide attribution to say that I created it. My photos don't even have a non-commercial clause, and as a result they've been included in a board game, books, travel guides, and (soon) the cover of a magazine. Sure, I'm not getting paid for any of that, but it's not like I'm losing out by contributing to the pool. Karma's a good thing, and if I want to use items in the CC pool, it's only fair that I contribute what I can.
I've tried mentioning Creative Commons in some workshops, and it seems like many (most?) people have a vague awareness of some strange subversive counter-culture movement called "Creative Commons" – but it doesn't seem to apply to them, and certainly not to their own creative works.
At which point I'm often left stumped, scratching my head and wondering what else I can do to show how CC applies to everyone. I model it, walking the walk every day. I show samples of works that couldn't have been created without CC. But then clients ask me how to ensure their content is locked down so nobody can even see it without their approval, never mind reusing and remixing.
People get confused about the difference between CC and Public Domain. They're quite different. Under CC, you retain "ownership" of the thing, and people are free to use it only as long as they abide by the clauses you select for the CC license. Under PD, everybody owns it, so nobody controls it. A subtle but important difference.
Also, people worry that if they release a work under CC, they won't be able to later sell it. Release it under a CC-NonCommercial license, and you're covered. You're free to later release your work however you like (commercially, or under a different license, as you deem fit).
Perhaps this is a side effect (intentional or otherwise) of the huge blitzes by the MPAA, RIAA, Disney, etc… in protecting copyright at all costs by suing 3-year-olds and grandmas. People are (rightly) scared of accidentally violating copyright and incurring the wrath of a well funded team of legal beagles.
I dunno. I strongly believe that CC is one of those things that has the power to change the nature of the game. It's not about gathering the most intellectual property, and staunchly protecting it through threats of litigation in the microscopic chance that you might make a buck off it. It's about freely sharing, contributing to the greater good, and all that jazz.
I tried a couple of times this morning to join a webcast offered by Apple, on the topic of using Lectopia to capture and distribute lectures (and other material). Something that would be useful, say, at a large-ish university that spends a fair amount of time and effort on online and blended learning.
Instead of being able to attend the webcast, I got occasional snippets of audio, and about once every minute or so a partial screen refresh of the video feed.
Thanks for the partial connectivity. Maybe we should just get everyone AOL dialups or something.
Like just about everyone else with an active TCP/IP stack, I grabbed a copy of Firefox 2.0 today. It feels much cleaner and faster than before, and the spellchecker is definitely welcome (making it feel more like a Mac OS X browser, where all other browsers have had spellchecking for ages…)
As part of the upgrade, it grabbed new versions of my extensions, including Performancing. Poking around in the PFF 1.3 settings, I notice it’s got its own concept of plugins. Including one that takes any local images used in a blog post and hucks them into Flickr on demand. mwaaaAAH? I’ve just got to try that sucker. How about a screenshot of Firefox 2.0 with PFF 1.3? Here goes… command+3, then drag the resulting image into place and resize by dragging the corner widgets…
Update: It didn’t handle resizing as expected (or at all – no way to select one of the Flickr-generated sizes, only original), and using a 1280×854 screenshot inline in a blog entry would be evil. But it works. Cool. Dragging the image from Flickr directly into PFF solves the problem nicely.
And PFF seems to have lost all of the strange text rendering bugs I was seeing under Firefox 1.x.
Update 2: PFF still doesn’t seem to successfully set categories/tags on my Drupal site. Don’t know if that’s a problem with PFF or Drupal (or both). Not fatal, but inconvenient.
Update 3: How does the image FTP upload work?
Update 4: OK. I got categories working! Woohoo! The trick is to set Drupal to use the MetaWeblog API, and tell PFF to connect to a Drupal site. Seems to work like a charm.
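For the curious, here’s roughly what a post with categories looks like on the wire when a client like PFF talks to Drupal over the MetaWeblog API. This is just a sketch using Python’s standard xmlrpc library – the endpoint path (/xmlrpc.php), blog ID, and credentials are assumptions on my part, not anything PFF or Drupal documents here:

```python
# Sketch of a metaWeblog.newPost call, the XML-RPC method Drupal's
# Blog API exposes. Categories ride along in the content struct,
# which is the part that wasn't reaching my Drupal site before.
import xmlrpc.client

def build_newpost_request(blogid, user, password, title, body, categories):
    """Serialize a metaWeblog.newPost call to XML-RPC request XML."""
    struct = {
        "title": title,
        "description": body,
        "categories": categories,
    }
    # Positional params per the MetaWeblog spec; final True = publish now.
    return xmlrpc.client.dumps(
        (blogid, user, password, struct, True),
        methodname="metaWeblog.newPost",
    )

request_xml = build_newpost_request(
    "1", "dnorman", "secret", "Hello", "Testing categories",
    ["drupal", "blogging"],
)
# To actually send it, you'd point a ServerProxy at the site, e.g.:
#   xmlrpc.client.ServerProxy("http://example.org/xmlrpc.php")
```
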
I’ve been jonesing for a zoom lens since I picked up my Canon XT back in June. The kit lens is not too bad (aside from some chromatic aberration), but a longer lens would be great. I had been eyeing the Canon 55-200 EF lens, at around $300 at the local Black’s, but after reading some reviews, I want to stay far far away from that lens and its questionable build quality.
After a bit of poking around on The Camera Store’s site, I think I’ve found a much better alternative. A Sigma 70-300mm f4-5.6 APO Macro. There’s no image stabilization, so I’ll have to use a tripod when in the 200-300mm range, but at $320CDN, it’s a much better deal. And reviews suggest a much sturdier build quality.
Poking around some sample images shot with that lens, I’m pretty happy with it. Sure, it might not stack up against a $1500 lens with image stabilization and the works, but it’s cheap enough to be able to pick it up without many regrets.
Unfortunately, I think the lens is too slow for much indoor work. I don’t think it’s a suitable candidate for photographing workshops here at the TLC. But as an outdoor lens, it looks pretty darned good.
Update: On recommendation from Raffaella, I think I’ll hold out for the Canon 28-135mm f3.5-5.6 IS USM. It’s a little more spendy (just under $600 CDN), and not quite as long, but the Sigma may be too long, and its build quality won’t be quite as good as this one’s. And the image stabilization would help when shooting at the 135mm end of the lens. Now to go return some more empty bottles, and look under the cushions on my couch…
I’ve been using Aperture for a couple of weeks, in somewhat light usage (some days not at all, others, like today, with it open for most of the day). I’ve got a few gigs of images in my Aperture library, without importing my iPhoto images (I decided it’s not worth having an out-of-sync snapshot of my iPhoto library in Aperture). Here are some quick thoughts based on my time in Aperture 1.5:
- I love the various views (list/browse, browser/viewer/fullscreen, metadata hud, adjustments hud, secondary screen modes…) – makes it soooo easy to find what I’m looking for, and get the job done.
- Smart albums – great way to organize images. I can add stuff to projects/albums, and have a Smart Album populate itself based on combinations of keywords, star ratings, and other metadata fields (show me all images with 3 or more stars, having the keyword “Issue #1”, shot with an aperture greater than f4 in the last 2 weeks).
- It runs like a pig on my PowerMac G5 Quad. The chips and RAM are up to the task, but the stock Nvidia GeForce 6600 video apparently doesn’t have the horsepower needed by Aperture (although it kicks Q3A and UT2003 pretty nicely – Dashboard hangs a bit, too, so perhaps it’s a Quartz Extreme thing). Images can take a few seconds to display. The loupe might take up to 5 seconds to show up, and is sluggish to drag. HUDs fade in very slowly, often over a couple of seconds. I resorted to minimizing the previews, selecting the smallest size and lowest quality, and that seems to have really sped things up. But it sure doesn’t feel like it’s running on what was Apple’s flagship machine until the recent release of the Mac Pro.
- Keyboard shortcuts aren’t properly localized. They’re hardcoded to the QWERTY keys. Or, rather, some are hard-coded, some are localized. I use the Dvorak layout, so have to use the menus and toolbar buttons for many things (like changing the Window Layout), but keyboard shortcuts work fine for other things (like fullscreen, and loupe). Frustrating.
- No straightforward way to crop/export really wide aspect ratio images. I spent most of today cropping a whole bunch of images into 790×287 and 790×134 sizes, and had to leave Aperture to do it. The crop HUD won’t let me enter values over 40, and seems to act up on large-ish numbers in general. So I resorted to exporting versions of each image and taking those into Photoshop for cropping (where the crop tool lets me enter whatever dimensions I want) and exporting as JPEG for use on the web. Frustrating. I’d love to be able to apply non-destructive crops in Aperture, then export at the appropriate dimensions. It would make tweaking a crop much easier than eyeballing it in Photoshop.
- No “Burn Project to DVD” button. I wanted to send a copy of an organized Aperture library to a client for backup, because their photographer sent them some really scratched and unorganized CDs full of great photos, and a nice DVD with the Aperture library would be better for their archives. I had to export the project to my desktop, then burn that to a DVD. A handy “Backup Project to DVD” feature would be great. Maybe I missed it.
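The Smart Album idea above is really just a saved query over image metadata. A minimal sketch of the concept in Python – the field names (stars, keywords, aperture, shot_at) are my own illustration, not Aperture’s internal schema:

```python
# Toy model of a Smart Album: a saved query re-evaluated against the
# library, using the example criteria from the post above.
from datetime import datetime, timedelta

library = [
    {"name": "IMG_001", "stars": 4, "keywords": {"Issue #1"},
     "aperture": 5.6, "shot_at": datetime.now() - timedelta(days=3)},
    {"name": "IMG_002", "stars": 2, "keywords": {"Issue #1"},
     "aperture": 8.0, "shot_at": datetime.now() - timedelta(days=3)},
    {"name": "IMG_003", "stars": 5, "keywords": {"vacation"},
     "aperture": 5.6, "shot_at": datetime.now() - timedelta(days=40)},
]

def smart_album(images):
    """3+ stars, keyword "Issue #1", f-number above 4, shot in the
    last two weeks."""
    cutoff = datetime.now() - timedelta(weeks=2)
    return [img for img in images
            if img["stars"] >= 3
            and "Issue #1" in img["keywords"]
            and img["aperture"] > 4.0
            and img["shot_at"] >= cutoff]

print([img["name"] for img in smart_album(library)])  # ['IMG_001']
```

The nice part of the real feature is exactly what the sketch shows: the album membership updates itself whenever the metadata changes, with no manual filing.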
Even with a few gripes, I’m pretty impressed by Aperture 1.5. It’s not perfect, but is a really sweet app that (mostly) lets me plough through a whole bunch of images quickly to get things done right and fast.
Things I’d like to see added:
- Live reference to iPhoto library. So I don’t have to stop using iPhoto in order to use Aperture. Instead of importing a snapshot of iPhoto’s library, how about a dynamic reference, something like a symlink from a Project in my Aperture library to my iPhoto library. There are a few reasons to keep using iPhoto, so it’d be cool to play better with it rather than just trying to push it out of the playground.
- Make it run faster on the Nvidia GeForce 6600. No idea if that’s possible, or how to do it, but the card’s not supposed to be a total stinker, and a whole lot of people have Quad G5s with it as the stock card. It’s a shame that Aperture doesn’t run fast enough on it.
- Wide aspect ratio cropping/exporting workflow. For things like blog/website banners, newsletter headers, etc… The crop tool’s acting up at really wide aspect ratios.
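The wide-aspect crop itself is simple arithmetic, which is what makes the 40-pixel limit in the crop HUD so frustrating. Here’s a sketch of the calculation I’d want Aperture to do – the function name and the centered-crop choice are mine:

```python
def center_crop_rect(src_w, src_h, target_w, target_h):
    """Largest centered rectangle with the target aspect ratio that
    fits inside a src_w x src_h image. Returns (x, y, w, h)."""
    target_ratio = target_w / target_h
    if src_w / src_h > target_ratio:
        # Source is wider than the target shape: keep full height,
        # trim the sides.
        h = src_h
        w = round(h * target_ratio)
    else:
        # Source is taller: keep full width, trim top and bottom.
        w = src_w
        h = round(w / target_ratio)
    x = (src_w - w) // 2
    y = (src_h - h) // 2
    return (x, y, w, h)

# e.g. a 790x287 banner crop from a 3456x2304 Canon XT frame:
print(center_crop_rect(3456, 2304, 790, 287))  # (0, 524, 3456, 1256)
```

Crop non-destructively at that rectangle, then scale the export down to 790×287, and tweaking the crop later costs nothing.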