Photography Trends in my iPhoto Library

I was just messing around with smart albums in iPhoto, and found that I can create albums based on camera model. So, I created a set of smart albums showing all photos taken with each of the 3 digital cameras I’ve owned. I then created additional smart albums to show just photos taken with a particular camera that have been rated 1 star or more (which I add to any photo that’s worth showing anyone else). The results were a bit surprising (and completely unscientific).

Camera Model           | Months using camera | Photos taken | Photos/month | Starred photos | % Starred
Olympus C200           | 36                  | 3715         | 103          | 301            | 8.1 %
Fujifilm e510          | 17                  | 2909         | 171          | 497            | 17.1 %
Canon Digital Rebel XT | 5                   | 1115         | 223          | 386            | 34.6 %

What does that suggest? Well, much of the story isn’t in these numbers. According to my Canon Digital Rebel XT’s internal shot counter, I’ve taken 4932 photos with it. A few hundred of those were added to Aperture on my work desktop, so roughly 3000-3500 photos have been deleted in camera. That means I’m taking a LOT more photos with the XT (986/month!), and performing a LOT more selection before dumping photos onto a computer (roughly 600-700/month deleted in camera). I’ve also been doing a lot of experimentation, where I fill the card with a hundred shots at a time, and nuke them all.
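The per-month and percentage columns are just simple division; a quick sketch (using the raw counts from the table and the XT’s shot counter quoted above) reproduces them:

```python
# Recompute the derived stats from the raw counts.
cameras = [
    # (model, months owned, photos kept, starred photos)
    ("Olympus C200", 36, 3715, 301),
    ("Fujifilm e510", 17, 2909, 497),
    ("Canon Digital Rebel XT", 5, 1115, 386),
]

for model, months, photos, starred in cameras:
    per_month = photos / months
    pct_starred = 100.0 * starred / photos
    print(f"{model}: {per_month:.0f} photos/month, {pct_starred:.1f}% starred")

# The XT's actual shooting rate, from the in-camera counter:
shots_taken = 4932   # reported by the camera body
print(f"XT shutter rate: {shots_taken / 5:.0f} shots/month")
```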

I’m guessing there are a few things at play here.

  1. If you take more pictures, you get more pictures you’re happy with. I’m a firm believer that the best photo is the one you take, meaning if you don’t pull the trigger, you can’t get a good shot. And if you don’t pull the trigger enough, it’s harder to get good shots.
  2. As you get more control over the camera’s settings, and get comfortable with that control, you take better pictures. The Olympus had essentially no manual controls. The Fujifilm had plenty, but the interface sucked (all through menus, etc…). The XT has awesome manual control, great priority modes, etc… so I play more. And get some really cool shots (and some stinkers, which get deleted).
  3. It’s unclear if the increasing ratio of “good” photos is related to the camera, or just more experience over time. Would I have wound up with similar results by just sticking with the Olympus and using it more?
  4. The Olympus was purchased for the birth of Evan, so it got a LOT of specific use. Hundreds of baby photos. Birthdays, holidays, etc… Even with that emotional loading, I keep more than twice as many photos with the XT than I did with the Olympus. Hmmmm…
  5. I’m not sure if I’m being more thoughtful in taking shots with the XT (hence the higher star ratio), or if it’s the in-camera deletion causing that. Stuff that sucks gets nuked before touching the computer…

On Solving Spam

Spam is the scourge of the internets. It clogs Internet Tubes all over the globe, overloading the trucks that take internets around the world.

And it is directly caused by Google’s PageRank and Adsense systems. They (as well as others, but primarily Google – take a look at any spam farm, and you’ll see prominent Adsense ad blocks) created this mess by enabling individuals to cash in on hijacking innocent websites that have enabled anonymous commenting.

A spammer can sit in his basement, run some scripts to find juicy targets, send out some probes, then unleash hell in the hopes that they will improve the PageRank of their (or their client’s) websites, in an attempt to increase Adsense revenue on those sites.

So, here’s the easy solution. If a website is shown to be associated with spammish activities, the Adsense account is suspended. And their PageRank is reset to 0. Take away the financial incentive, and the rules of the game change.

It’s time for Google to step up and show some corporate responsibility. The whole rel="nofollow" solution is a non-starter, since it only works if we all agree to break the nature of the web in the first place by devaluing all links contributed to a website. It’s not worth throwing the baby out with the bathwater.

Now, how to define “spammish activities” – and, who gets to determine if a spam producer is guilty of that? There could be juries. There could be committees. Heck, it could become a social software tagging exercise, where the intelligence of the hive is harnessed to determine if something is spam or not. Have an appeals process, to prevent abuse. Have a responsible governance system to ensure effectiveness.

It seems to me that it would be in Google’s best interest to protect the value of PageRank and Adsense. By allowing spam farms to co-opt both systems, they devalue both. By ensuring spammers are removed from the system, we’re left with a more realistic representation of the online advertising ecosystem, with (hopefully) better representation of the actual contributors and participants.

But, this has to stop. Now. It’s only getting worse, and is threatening to smother any semblance of openness left on the web (1.0, 2.0 or beyond).

Justin Trudeau Speaking at the U of C

I just checked in and was greeted by a wonderful surprise. Justin Trudeau was on campus on Friday November 24, and the full audio of his talk was posted as a podcast. I’ve grabbed the file and listened to the first couple of minutes, and this should be a great talk.

For anyone who doesn’t recognize the name Justin Trudeau, he is the son of former Prime Minister Pierre Elliott Trudeau, and is making quite a name for himself as both a public speaker and leader of youth activism.

I’m really interested to hear his thoughts on Quebec as a “nation” as well as his take on what we can do to address environmental issues. (my own take on the Quebec “nation” issue is that it only acts as a divisive instrument – instead of what we need, which is something that is unifying)

This is the kind of thing I’m hoping we can put into an iTunes@UCalgary service, once we get that off the ground. For now, it’s hosted on our weblogs service.

Some progress against the evil spammers

After switching from BadBehavior+Spam.module back to Akismet, I assumed I’d be in for a bit of an onslaught of spam. I was braced for impact. I can’t believe the sheer volume of attempted spam comments that are constantly being flung against this blog, 24/7. It has peaked at several attempts per second, which adds a real load to the server as it struggles to thwart the forces of evil.

Shortly after switching to Akismet, and enabling the experimental spam detection, I was seeing this:

Now, that might not look like much, but it suggests that Akismet was having to reject attempts several times per minute. Fast forward 24 hours, and I see this:

Again, not looking like much, but the interval between Akismet interventions is getting longer. Either the spammers are slowly starting to give up, or this is just a natural lull. I mean, there can be several minutes now without an attempted spamment posting. Entire minutes!

Now, the downside of Akismet is that I can’t use it on any of my campus projects. The cost of licensing Akismet for the number of sites we have would be prohibitive, given our budget asymptotically approaching zero dollars (CDN).

Again with the spam blocking.

OK. Even I am getting sick of the incessant "spam blocking update" posts, but I figure if it helps even one other person put the brakes on the attempts of the evil spamroaches, it's worth it.

So, here's the latest. I got frustrated with the number of spamments that snuck through the combo of Bad Behavior and Spam.module, so I disabled both. I've reverted to using only Akismet.module, with the experimental spambot detection/prevention enabled.

And, so far, it's doing a better job at blocking the roaches. I've got no idea if it's also blocking legitimate hu-mans, though.

One nice thing about Akismet.module vs. spam.module – with Akismet's experimental spambot prevention, it's closer to acting like Spam Karma 2, where if you smell like a roach, you don't even get close enough to pop the lid off your can of spray paint.
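For context on what Akismet.module is doing behind the scenes: the Akismet service itself boils down to one HTTP POST per comment, and the service answers with the literal string "true" (spam) or "false" (ham). Here’s a rough sketch of building such a request — the API key, site URL, and comment values are placeholders, and this is not the Drupal module’s actual code:

```python
from urllib.parse import urlencode

def build_comment_check(api_key, blog_url, comment):
    """Build the URL and form body for Akismet's comment-check endpoint."""
    url = f"https://{api_key}.rest.akismet.com/1.1/comment-check"
    body = urlencode({
        "blog": blog_url,
        "user_ip": comment["ip"],
        "user_agent": comment["user_agent"],
        "comment_type": "comment",
        "comment_author": comment.get("author", ""),
        "comment_content": comment["content"],
    })
    return url, body

# Placeholder values, just to show the shape of a request.
url, body = build_comment_check("YOUR_API_KEY", "https://example.org", {
    "ip": "192.0.2.1",
    "user_agent": "Mozilla/5.0",
    "author": "totally-not-a-bot",
    "content": "Buy cheap watches!!!",
})
# An actual check would POST `body` to `url` (e.g. with urllib.request)
# and treat a "true" response as spam.
print(url)
```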

I'll have to look into updating Akismet.module for Drupal 5. There's really no sense in actually moving to D5 without spam blocking. That'd be kind of silly.

As an aside, I was looking through some of the logs, and found an interesting user agent, which led me to the product website for one of the evil spam roach comment bot factory applications. They have disclaimers on the site saying they don't condone using their product without the permission of the blog owners. What? Permission? What a frakking load of ass-covering crap that is. Yeah. You're going to give someone permission to aim a program titled "Blog Post Uzi" – because, you know, Uzis are all warm and fuzzy, and the kind of thing that friends give permission to other friends to point at each other. Yeah. Permission to spray the output of a concealable assault gun. Whatever. Karma's going to catch up to you in spades, my friends at Promo Arsenal (dot com).

Bad Behavior 2.0.7 for Drupal

I had been running an out of date version of Bad Behavior on my blog because the Drupal module requires BB 1.2.4 – but I think the evil spambots were getting around that old version.

So, I just took a stab at updating my copy of bad-behavior.module to work with the latest and greatest Bad Behavior 2.0.7. I’m not sure if I’ve missed anything, but it seems as though it’s suddenly become successful at blocking the annoying Apkakkallli spambot that has been attempting to vandalize my blog for the last few days. The server seems much more responsive, at least. Maybe it’s successfully banning evildoers?

Or, it could be just banning everyone but me? Can anyone see this? Did I bork the site and/or Bad Behavior? Stupid spammers are such a frakking waste…

Moving to Drupal 5 Beta?

I’m seriously considering moving my blog to Drupal 5 Beta 1. It seems stable enough, and the performance (especially with Aggressive Caching) blows Drupal 4.7.4 out of the water. That’s becoming a pretty serious factor for me, as the insane onslaught of comment spam just keeps bringing my server to its knees (I’m looking at YOU, Apkakkallli!)

There are only a handful of modules that I use that don’t have Drupal 5 versions (computed field, img_assist, recent blocks, TinyMCE…). Of those, I could probably limp along without them until D5 versions are available.

Stephen’s already running a public site on D5, and it seems to be going OK. I just did a test upgrade on my desktop box, and it basically went OK. Well, nothing blew up so fatally that I’d lose sleep over it. And Garland sure is a nice theme…

Batman visits!

I was just checking in, and on my Sitemeter stats page, saw an interesting recent visitor:

Visit from the Bat Cave

I didn't realize Batman had moved the Cave so far from Gotham City. But, in another score for Drupal, the Dark Knight is researching how to move from WordPress to Drupal.