Efficient Distance Collaboration

Josh and I (and sometimes King) have been using a pair-programming-at-a-distance method of collaboration (not quite spooky action at a distance, but sometimes it feels pretty close…)

Our magic sweet-spot combo of tools includes two computers at each end of the pipe: one running VNC (a server at one end, a client at the other), and the other acting as “support”, with iChatAV as our communications bridge and various browsers and utilities handy on the non-shared screen. We leave an iChat video connection open, so we can talk without having to stop editing source code to switch into chat mode. This rocks nicely.

We had been using plain audio before, but Josh just got his iSight camera, so the open video line adds another dimension (and even without the video, the audio is much better – no headphones needed to kill the feedback loop). The video could also come in handy, since he has a large whiteboard behind his chair: he can sketch things on it and I can see what he’s drawing.

I never thought I’d be such a fan of distance collaboration (although it has worked extremely well before) – it is always better to meet in person. But this combo makes it about as close to a face-to-face meeting as possible. Still not quite there, but pretty darned close.

One of the cooler things about the setup is that, except for the iSight cameras (which are entirely optional, but sure work nicely), the entire setup uses free software. Not necessarily Free Software, but no expense. This blows the doors off of Breeze Live, Centra, and any number of other “collaboration” suites.

We’re seriously thinking of writing a paper, and/or presenting something at the next NMC conference on this setup.

Pachyderm Extreme Programming Redux

Josh, King and I are continuing work on the Pachyderm Authoring Application. We just got a distance collaboration setup going that works really well. We use VNC to share a computer (my TiBook), iChatAV for an open audio channel, and Breeze Live for a shared whiteboard. This is working almost as well as when Josh flew up to Calgary for a week. I think we’ll be able to get much more done this way. And it’s more fun, too…

Pachyderm Distance Collaboration Setup

Pictured here are my trusty TiBook (with the shared VNC session for Xcode and WOBuilder), the monitor connected to my Power Mac (with iChatAV and the Breeze Live whiteboard), an iMac used for supporting surfing/research, the iSight camera used for my audio/video feed, my 15-year-old Sony speakers for blasting Josh’s voice down the halls, and my iPod (just sitting there, begging to play some background music). Also, lots of really cute photos of Evan…

Pachyderm Foundation Development

I left the Pachyderm development session (early). King and Josh kept going, and we’re going to be unbelievably close to a working Pachyderm Presentation Authoring application. The work is shifting to the user interface, so changes will start to become visible.

We took some photos today, to document the ad-hoc Extreme Programming setup we adopted, and some of the results.

First, we have the “before” picture. The original Pachyderm 1.0 database schema we inherited:

Pachyderm 1.0 Schema

Then, we have the ad-hoc Extreme Programming sessions:

Extreme Programming - 5 screens, 6 eyeballs

and, the final database schema for Pachyderm 2.0:

Pachyderm 2.0 database schema

The main foundation work that’s left is entering screen and component data, integrating with APOLLO resource management, and generating the XML files read by the Pachyderm Flash templates. I’m guessing King and Josh will have tackled a good chunk of this already. Next, we’ll be able to focus on the authoring UI almost exclusively.
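
(For a sense of what that last piece involves: each authored screen eventually becomes an XML file that a Flash template loads. Here’s a rough, hypothetical sketch in Java – the element names and file names are made up for illustration, not the actual Pachyderm template format.)

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;

// Hypothetical sketch: serialize one authored screen into the XML a Flash
// template would load. "screen" and "component" are invented element names,
// not the real Pachyderm schema, and no escaping is done; it's just a sketch.
public class ScreenXMLWriter {

    public static void writeScreen(Writer out, String template,
                                   String[][] components) throws IOException {
        out.write("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        out.write("<screen template=\"" + template + "\">\n");
        for (String[] c : components) {
            // c[0] = component type, c[1] = its content (text, media URL, ...)
            out.write("  <component type=\"" + c[0] + "\">" + c[1] + "</component>\n");
        }
        out.write("</screen>\n");
    }

    public static void main(String[] args) throws IOException {
        try (Writer out = new FileWriter("screen-001.xml")) {
            writeScreen(out, "simple-image-screen", new String[][] {
                { "title", "Welcome" },
                { "image", "media/elephant.jpg" }
            });
        }
    }
}
```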

“Extreme Programming” Pachyderm

Joshua Archer has been in the Learning Commons this week (up from the CSU Center for Distributed Learning at Sonoma State University). We’re working on the code that will drive Pachyderm 2.0, and it’s been a pretty intense week so far.

I’m feeling a bit out of my league, with King and Josh running with this stuff, and me panting about half a lap behind, struggling to keep up. Just like junior high school gym class all over again 😉

Anyway, we’ve taken over the old computer lab in the Learning Commons and set up a pretty effective working area. It’s sort of like Extreme Programming, with one computer shared among the three of us and supporting computers for each. My laptop was hooked up to a projector, and King plugged another monitor into the projector’s VGA out, giving us three displays of the same screen (one about 10 feet wide for easy viewing). He also plugged a second keyboard/mouse combo into a spare USB port on the Powerbook so he could take over as needed, and had his own Powerbook handy for searching docs; Josh had his Thinkpad as well.

I’d always assumed I’d hate Extreme Programming. I thought it would suck, and that the programmers would be tripping over each other. Not the case (maybe just with this group?) – kind of a cool flow situation. Very slick, and amazingly efficient. 6 eyeballs on all of the code at all times.

Tuesday evening, we were still hacking away, but decided we should watch the final game of the World Cup of Hockey (Canada vs. Finland). So, we planned to string a coax cable into the lab to watch it on the projector. Coax cable wasn’t long enough. Doh. So, King hooked up our own QuickTime broadcast of the game from CBC over the LAN to Josh’s Thinkpad, which was then projected to the 10′ shared screen (and Julian tapped the feed as well, from his cube down the hall). Worked great (there’s a photo of this in my Flickr space), until Lawrie’s daughter decided to change the channel to the Muppets with 10 minutes left in the game. Doh. 😉

SubEthaEdit would make some of this multi-screen collaboration possible at a distance (with Josh back at the CDL, and King and myself here in Calgary), but there isn’t the same chemistry involved without everyone in the same room.

The code that’s coming out of these sessions is freaking amazing. King’s mastery of stuff that shouldn’t be possible, and Josh’s conceptual model of the whole system, are combining to make this some pretty sweet foundation code. The biggest change came last night, as we were winding up a 14-hour session after 9pm: we ended up trimming over half of the tables from the database. What was initially modeled on poster-sized paper produced on a large-format plotter now fits comfortably (and more legibly) on two sheets of letter-sized paper.

The other big thing to come out of this was a document model to represent Pachyderm presentations. We’ll be writing the authoring tools, and they won’t even have to know about databases – they’ll be creating documents and adding/manipulating the content of those documents. Very, very cool stuff.
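
(If you’re curious what that looks like in practice, here’s a rough, hypothetical sketch in Java of the kind of API the authoring tools would see. The class and method names are made up for illustration – the real thing is being built with WebObjects – but the idea is the same: build and edit a document, and let something else worry about persistence.)

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical document-model sketch: authoring code builds and edits a
// presentation document; it never issues SQL or touches tables directly.
class PresentationDocument {
    private final String title;
    private final List<Screen> screens = new ArrayList<>();

    PresentationDocument(String title) { this.title = title; }

    Screen addScreen(String templateName) {
        Screen screen = new Screen(templateName);
        screens.add(screen);
        return screen;
    }

    // In the real system this would hand the document to the persistence
    // layer; here it's just a placeholder.
    void save() { /* persistence hidden from the authoring tools */ }

    public static void main(String[] args) {
        PresentationDocument doc = new PresentationDocument("Elephants of the World");
        Screen intro = doc.addScreen("simple-image-screen");
        intro.addComponent("title", "Welcome");
        intro.addComponent("image", "media/elephant.jpg");
        doc.save();
    }
}

class Screen {
    private final String templateName;
    private final List<String[]> components = new ArrayList<>();

    Screen(String templateName) { this.templateName = templateName; }

    void addComponent(String type, String content) {
        components.add(new String[] { type, content });
    }
}
```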

Anyway, tomorrow’s the last day of this marathon hackathon. I sure hope I can keep up… 😉

DirectorWeb is 10 years old!!

The DirectorWeb website is 10 years old! Holy cow. Time flies. I used to spend a LOT of time on that site, using its Direct-L listserv archive/search utility, back when I was doing ~100% Director work. Alan did an awesome job on DirectorWeb, so much so that I considered it essential to Director development. I still remember many of the regulars: Warren “the Howdy Man” Oleshko (WTHMO), Zav, Zac, John Dowdell, Warren The Audi Man (from Integration.qc.ca, IIRC), and many more… Direct-L searches for “howdy” show that WTHMO must still be active 😉 I appear to have been dropped from the online archives – or, perhaps more accurately, the archives don’t go back far enough to include me 😉

It’s some kind of synchronicity thing: I first “met” Alan Levine almost 10 years ago via DirectorWeb and Direct-L, and now I deal with him quite a bit on Pachyderm and other Learning Object things. What a long, strange trip it’s been… 😉

Thanks for the update on DirectorWeb, Alan! (although, like yourself, I haven’t seriously touched Director for several years now…)

Tim O’Reilly and The Software Paradigm Shift

I listened to the IT Conversations interview with Tim O’Reilly on the way in this morning. Very interesting interview. I love the vision Tim has for his company – it’s more about capturing knowledge (whatever that means) than about growing their market. It’s kind of cool to see how taking the right attitude can actually lead to a strong market position (how many people buy O’Reilly books vs. the others? I’d bet a LOT).

The big paradigm shift is also pretty cool… The PC and its OS become less important as applications really do move into a more decentralized space (he uses Google as an example of a pervasive network app). What he didn’t mention is that the decreased focus on the PC and its OS means people will be more able to select their tools based on need, rather than on a single OS vendor’s 95% market-share domination…

It also struck me that the simple fact of my listening to a freely distributed interview – published to the web as an .mp3 file and played on my iPod while riding the bus into work – might be a bit of a paradigm shift as well. On-demand content, MeTV, that kind of thing…

Why would I post about this? It was triggered by the fact that I saved the GMail Invitation entry into software/google/gmail…

21 Rules of Thumb – How Microsoft Develops Its Software

I’m sure this is going to be linked all over the place, but it’s a very interesting read. It’s a reposting by David Gristwood of an original article by Jim McCarthy (a manager on the MS Visual C++ team).

Particularly interesting and useful is the way it delves into the topic of “slippage”, treating it as a necessary and good part of the process. Slippage becomes a transition from the unknown to the less unknown – as you learn more, the timeline and estimates become more refined and realistic, which often means slipping.

Perhaps the most important point is the first one: Don’t know what you don’t know. If you don’t know something, state that, and come up with a plan to fill the gap. Incorrectly assuming knowledge, or worse – faking it – will lead to disaster.

UPDATE: Just came across a link to The Ten Commandments of Egoless Programming – more good stuff. Even comes with a handy downloadable (if butt ugly) stone tablet for printing.

I think I’ve been risking breaking Commandment #9: Don’t be “the guy in the room” – going to have to work harder to prevent that. 🙁

Collaboration at a Distance

Michelle made a good point in an email. I’d overlooked the value of collaboration at a distance, because I really take it for granted now. I’ve been working with folks over the ‘net for years, but much more intensely over the past year.

The Learning Object Syndication with RSS presentation(s) (here and here) wouldn’t have been possible without iChatAV, wikis, and weblogs.

And the Pachyderm install would have cost a few orders of magnitude more without these tools (well, we really only used iChatAV/Trillian). The cost of travel between Calgary and California would have been waaaay too high, since it would have meant a few trips to get it running.

Federated Identity Management

Looking into techniques to allow us to decentralize user management in cross-institutional (and non-institutional) software, such as APOLLO.

Here are some links I’ve come across on the topic:

Many of these articles look like corporate shovelware (“Read about how smart we are – give us money”), but maybe there’s some good stuff in there, too.

This is stuff waaaay outside my normal realm of things, so I’ll be doing some reading/thinking about this stuff, and how it might affect CAREO/APOLLO.

The goal is to be able to do something like this scenario:

Bill is a professor at the University of Calgary. He securely logs into an APOLLO search application using his U of C login, and APOLLO is aware of the groups and roles that Bill has as part of his U of C identity.

Mary is a grad student at the University of British Columbia. She logs into an APOLLO collaborative application using her UBC login, and is able to access resources defined by her groups and roles described by her UBC identity.

Bill and Mary are working together on a project, and Bill creates an ad-hoc group in APOLLO for them to share resources privately while collaborating on their development. Once ready for publication, these resources are made available to individuals at both the U of C and UBC.
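
(Stripping away whatever federation plumbing we end up choosing – Shibboleth, SAML, or something else – the part an APOLLO application actually cares about boils down to something like the following sketch. This is purely illustrative Java; the interface and names are mine, not any real identity API.)

```java
import java.util.Set;

// Illustrative sketch only: what an APOLLO application might see after a
// user has authenticated at their home institution. The actual federation
// protocol is deliberately out of frame.
interface FederatedIdentity {
    String userId();            // opaque identifier, scoped to the institution
    String homeInstitution();   // e.g. "ucalgary.ca" or "ubc.ca" (hypothetical values)
    Set<String> groups();       // groups/roles asserted by the home institution
}

class SharedResource {
    private final Set<String> allowedGroups;

    SharedResource(Set<String> allowedGroups) {
        this.allowedGroups = allowedGroups;
    }

    // An ad-hoc cross-institutional group (like Bill and Mary's project group)
    // is just another group name that APOLLO itself manages; the access check
    // doesn't care which institution the user came from.
    boolean canAccess(FederatedIdentity who) {
        for (String group : who.groups()) {
            if (allowedGroups.contains(group)) {
                return true;
            }
        }
        return false;
    }
}
```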

Hammers, Nails, and Web Pages

Back when I was building the “theme engine” for CAREO, everything began to look like themable stuff. I kept saying “Hey, that’s not hard – we can just implement that as a theme in CAREO!”

Everything from a version of the Learning Commons website to SciQ to a bunch of other projects was mocked up (and some of it even implemented) as themed components in CAREO.

Worked pretty well, and was an awesome test of the flexibility of the theme engine.

The main drawback was that all edits had to go through a rather cryptic process: defining content in (perfectly valid) XML, crunching the content through an app that ran some XSLT wizardry, and storing the results in a database.
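
(The crunching step itself was nothing exotic: standard XSLT, run through the stock Java transformation API, roughly like this. File names here are invented for illustration; the real pipeline then stuffed the output into the database.)

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Rough sketch of the "XSLT wizardry" step: transform hand-authored XML
// content with a theme stylesheet. File names are illustrative only.
public class ContentCruncher {
    public static void main(String[] args) throws TransformerException {
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("careo-theme.xsl"));
        transformer.transform(new StreamSource("page-content.xml"),
                              new StreamResult("page-fragment.html"));
    }
}
```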

That kinda killed any productive workflow. All edits had a single point of waiting, and it was hard to use regular tools to manage content (couldn’t use Dreamweaver much, as it barfed on the XML or, worse yet, added invalid stuff). Contribute wouldn’t work with the custom workflow… Etc…

We’re going through iterative cycles on a few projects, and lo and behold, SciQ is becoming largely static pages (except the stuff that deals directly with learning objects – that is still generated by CAREO) – likewise for the LC website.

The moral of the story is, when you’ve built a pretty fancy hammer, and everything is looking like pretty fancy nails, take a good step back and think about what’s really going on. Odds are, you’ve got more than just nails to deal with.
