Tips for Very Large Projects

Update: I’ve wikified this list to make it more useful.

Some notes I’ve gathered while working on some Very Large Projects™ over the last couple of years. Some of the projects have done some of these things, and some have done few/none of these. Live and learn… I’m just documenting them here so I can refer to something quasi-concrete on the next big-ish project that comes along.

None of these tips are written in stone. They are also not necessary for a project to get done, but these should make things much easier on everyone. YMMV. IANAL. YHBH. Wait. Not the last one…

I’ll update the list as I think more about it.

  • Maintain a central repository of files for the project

    • Avoids 16,326 revisions of Word .doc files being emailed around. Which is the most recent? Are all changes in the latest-modified version? Do I need to keep the previous 16,325 revisions on my drive Just In Case?
    • Everyone will be on the same page – no confusion with some people talking about stuff in revision 16,320 while others are in the latest revision.
    • Word .doc files are not suited to sharing information with a group – use a wiki or CMS instead. Everyone might have Word installed (and may even use it on the project), but .doc is NOT an interchange format.
    • If something like Subversion is too geeky, try a shared FTP directory (see the sketch after this list). Just do something to prevent confusion and missing files/data.
    • For a simple text/data repository, Wiki Is Your Friend™
  • Have concrete targets/objectives/goals/milestones

    • Make sure everyone (organizations and individuals) knows exactly what the plan is.
    • Make sure everyone knows about the timeline – what are the “real” deadlines? What are dependencies within the project(s) to meet the deadlines? Avoid “fake” or “internal” deadlines – make sure everyone knows the real dates.
    • What are the risks in the project? Where should resources be focussed to prevent failure?
    • What is the plan to communicate about the project? Will you be showing it at conferences? If so, pick them well ahead of time and make sure everyone knows about them. Make sure everyone is on the same page so there aren’t any last minute “oh, by the way, we’re presenting it at X conference tomorrow…” types of surprises.
  • Communicate

    • Have regular meetings – they don’t have to be face-to-face, they can even be phone calls or IM sessions – but talk regularly to avoid confusion.
    • Don’t have meetings just to have meetings. That’s a bigger waste of time than not having any meetings at all. Meetings should be short, productive, and useful. As soon as they stop being any of these things, they cut into efficiency on the project. Meet when you need to, and only when you need to.
    • Share notes from (online and offline) meetings so that there isn’t any confusion about what was discussed. Wiki pages are handy for this.
    • Make sure everyone on the project has an instant messaging account. It’s so much easier on everyone if they can just ping each other with quick questions instead of drafting emails or interrupting each other with phone calls. As an added bonus, iChat and Skype accounts handle both text messaging and audio conferences (and in the case of iChat, video as well) – very handy for free ad-hoc meetings.
  • Document

    • The documentation for the project should include links to the central file repository(ies), timelines/milestones/objectives, meeting notes, etc…
    • Documentation should be written both for people new to the project (people come and go – prepare for it), and for existing people who may not have all of the information active in their heads – projects take time, so there should be a backup outboard brain shared by project members.
  • Iterate

    • Avoid monolithic “here it is – it’s done” releases. Build early, build often, seek feedback/review/critique throughout the entire project lifespan.
    • Make sure the plan/timeline/milestones are still valid as you move along. Stuff happens. Things change. Make sure you’re not painting yourself into a corner with outdated plans. If plans change, make sure everyone is on the same page, and document the change (why did it change? what is the new plan? how does the change affect the rest of the project? the timeline? deliverables? etc…)
    • Embrace change/risk. If you’re not changing the plan at all, you’re likely not being responsive. (don’t change just for something to do, but don’t freak out about it) Also, risk isn’t a bad thing. If there was no risk, the project wouldn’t be necessary. Risk is where the magic happens, so work with it.
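
Since the central-repository bullet keeps coming down to “just put the files somewhere shared”, here’s a rough sketch of what the shared-FTP-directory fallback could look like if someone wanted to script it. The host, credentials, and folder names are made-up placeholders, and a real setup would want something less fragile than a password sitting in a script – this is just to show how little is actually involved.

```python
# Minimal sketch: push a local folder of project files to a shared FTP
# directory so there's one current copy instead of 16,326 emailed revisions.
# The host, credentials, and paths are hypothetical placeholders.
from ftplib import FTP
from pathlib import Path

LOCAL_DIR = Path("project-docs")          # local folder of shared files
REMOTE_DIR = "/shared/project-docs"       # shared directory on the FTP server

def push_files():
    ftp = FTP("ftp.example.org")          # placeholder host
    ftp.login("projectuser", "secret")    # placeholder credentials
    ftp.cwd(REMOTE_DIR)
    for path in sorted(LOCAL_DIR.iterdir()):
        if path.is_file():
            with path.open("rb") as fh:
                # STOR overwrites the remote copy, so everyone sees the
                # same current version of each file.
                ftp.storbinary(f"STOR {path.name}", fh)
    ftp.quit()

if __name__ == "__main__":
    push_files()
```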

Drupal to support online communities of practice

One of my tasks for the next few weeks is to investigate Drupal, and specifically its ability to support the online interactions of a community of practice. We have a few projects that will involve some form of online interaction by students and professionals who are spread throughout southern Alberta, and it looks like Drupal may provide most, if not all, of the functionality to support these communities. I’ll be specifically looking at Drupal in the context of:

  • personal and professional reflection (blogging – either privately or to selected audiences)
  • information gathering
  • asynchronous communication – forums, threaded discussions, etc…
  • communication with peers and/or instructors
  • combinations of the above to form an “ePortfolio” type of view on the whole shebang, suitable for sharing with potential employers, or others
  • likely other stuff

I’m actually pretty excited to be working on this stuff – it feels like I’ve been working so long on media-production-supporting utilities that I’ve been out of the loop on the whole community side of things. We’ll be looking at integrating Drupal with Pachyderm (likely via a download/upload process, rather than a direct tie-in), and the institutional WebDAV storage/sharing system. Should be fun.

LazyWeb Request: Drupal + WebDAV integration?

I’ve got a project that will require the use of Drupal (or something like it, but it’s looking like Drupal at the moment – I’ve got a mockup running for the project, and a stock Drupal installation with a handful of plugins is solving 90+% of their defined needs). One of the things the users will have to do is upload files (images, .doc, .zip, whatever) into the system for reflection/commenting/review. They would also like to use these uploaded files outside of the system (for instance, on their own web pages, in an “ePortfolio”, whatever), and we’d like to provide a solution that wouldn’t force them to upload the files separately into two locations (their WebDAV volume for “regular” use, and the Drupal system for review).

A quick Google search turned up a WebDAV project as part of the Google Summer of Code, but it’s intended to expose the Drupal database via WebDAV (for backup, or alternate interfaces…)

Any ideas on how a file uploaded into Drupal could be placed into a user’s institutional WebDAV space, rather than in Drupal’s /files/ directory?
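
To frame the question a bit: the step I’m imagining is dead simple in principle – after Drupal accepts the upload, something PUTs the same file into the user’s WebDAV space. Here’s a rough sketch of that step on its own (the WebDAV URL and credentials are placeholders, and the hook point inside Drupal where this would get called is exactly the part I don’t have an answer for):

```python
# Sketch of the "push one uploaded file into WebDAV" step only.
# The WebDAV URL and credentials below are hypothetical; how this gets
# wired into Drupal's upload handling is the open question.
import base64
from pathlib import Path
from urllib import request

def put_to_webdav(local_path, dav_url, user, password):
    data = Path(local_path).read_bytes()
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = request.Request(dav_url, data=data, method="PUT")
    req.add_header("Authorization", f"Basic {auth}")
    req.add_header("Content-Type", "application/octet-stream")
    with request.urlopen(req) as resp:
        return resp.status   # 201 Created or 204 No Content on success

# e.g. put_to_webdav("files/essay.doc",
#                    "https://webdav.example.edu/users/someuser/essay.doc",
#                    "someuser", "secret")
```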

ps. this was posted using Flock, and I think it took me longer to enter it via the WYSIWYG interface than it would have taken to just enter the HTML. Perhaps once I get used to the editor… Oh, and no place to enter categories? wtf?

Update: Apparently, editing an existing post using Flock’s editor makes the post disappear from my blog. Have to go in and manually re-publish the post after editing in Flock…

Wiki to document organizational procedures

Julian and I were just IMing about how to set up a new Subversion repository – and we both commented about how this process should be documented. King set one up last week, and we said the same thing then.

So, I bit the bullet, and wrote up a first draft of the documentation for the process of creating a new Subversion repository for a Learning Commons project.
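
The real documentation lives on the wiki, but for flavour, here’s a compressed sketch of the kind of steps it walks through – the repository root and URL below are placeholders for our actual server setup, not the real paths:

```python
# Compressed sketch of the repo-creation steps the wiki doc describes.
# REPO_ROOT and BASE_URL are placeholders, not our real setup.
import subprocess

REPO_ROOT = "/svn"           # hypothetical parent directory for repositories
BASE_URL = "file:///svn"     # or an http(s):// URL once the server is exposed

def create_repo(name):
    repo_path = f"{REPO_ROOT}/{name}"
    subprocess.run(["svnadmin", "create", repo_path], check=True)
    # Standard trunk/branches/tags layout so every project looks the same.
    subprocess.run(
        ["svn", "mkdir", "-m", "initial layout",
         f"{BASE_URL}/{name}/trunk",
         f"{BASE_URL}/{name}/branches",
         f"{BASE_URL}/{name}/tags"],
        check=True,
    )

# create_repo("new-learning-commons-project")
```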

I left a page to list all documentation for the Learning Commons, in case the idea takes off and others start doing it…

I’ve used wikis to store “spontaneous documentation” before – but I’m hoping to keep the wiki as part of the formal documentation process. We’re about to head into another Learning Commons website rethink, so it might make sense.

so… fried…

Last night, it totally hit me just how fried I am. I’ve been fighting off burnout for a while now, but I think I may have finally succumbed to it. I realized this when, at almost midnight, my family was upstairs fast asleep, and I continued to work on my PowerBook to try to catch up on the backlog of bugs to fix – not feeling like I was making any progress anyway. I’m having to keep working this late so I don’t have to essentially abandon my family, which I refuse to do.

My work is backlogging because of some insanely tight deadlines, on 2 projects that are chronically understaffed. The One Project has been in actual development for over a year now, and while we’re close to being done, the Final Stretch is looming, with all that entails. The Other Project relies so completely on The One Project that I have had to walk a carefully balanced line between the two. Can’t finish The Other without The One working well. Can’t work on The One without billing time to The Other, since we used up our budget for The One long, long ago… On top of that, I inherited The Other Project after budget cuts led to layoffs here in the spring, and there is no way I would have set the project up the way it is – but it’s waaaaay too late to change anything. The only thing left to do is grit my teeth and push through it.

So now, The Other Project is coming due. Like, tomorrow due. And I’m still putting in revisions to content, and tweaking code in The One Project to support what is needed. And working with external contractors on some key supporting files that are basically out of my hands, but I can’t deliver without them working perfectly. And still receiving revisions to content and structure for The Other Project. And bugs/todos piling up for The One Project. Repeat ad nauseam.

Basically, to finish The One Project properly, the three of us programmers need to be able to direct 100% of our energy toward it for about a month. And, to finish The Other Project properly, I need the time and energy of about 5 people in order to finish massaging what can only be described as a freaking huge mass of content and resources.

My caffeine intake is waaaay up. My sleep is waaaay down. My cranky rating is off the chart. I feel (rightly or wrongly – doesn’t matter at this point) like I’m placed as a single point of failure for The Other Project, and have had to neglect The One Project more than I’d hoped and promised. It had gotten to the point where I seriously considered leaving, for the first time since I started here in 2001.

Anyway, there endeth the rant. Hopefully there is light at the end of the tunnel (there is an end of the tunnel, isn’t there?). In case anyone from either Project stumbles across this – this is why I’ve been so pissy/grumpy/silent lately. I’m trying to keep my head down and pull out all the stops to get this stuff done, but there’s not a sane way to do that.

On the plus side, it’s Evan’s third birthday this weekend. I’ll be forced to take most of the weekend off for that, and we’re heading to West Edmonton Mall on Sunday and Monday (staying at the hotel in the mall). Should be at least a welcome break from the unceasing pressures…

Update: No, I’m not planning on quitting the Learning Commons – it’s just one of the things that go through a person’s head when faced with seemingly endless pressures. I’m staying here – we’ve got lots of ideas that will be fun to help implement, so I have no reason to go elsewhere.

JSwiff and Flash File Generation

We had a great hacking session today, with Josh piped in over iChat and VNC from California, and King and I hunkered around his collection of Cinema Displays. We managed to replace our crufty jGenerator-powered Flash file wrapper class with one based on JSwiff, in under a day.

JSwiff takes care of the nastiness of dealing with the .swf file format, and provides an extremely helpful XML intermediary – you can convert any .swf file to this XML format, modify the XML, then render it back as .swf. Very handy for what we need to do.

Basically, all we do is a fancy search-and-replace for some custom tags (for things like the image – encoded in Base64 – and the tombstone fields for display on screen) in this intermediate XML file, then pass it into JSwiff and ask it to transform that XML into a .swf that we can use in our finished presentation. It’s fast, and so far very reliable. As an added bonus, it appears to handle accented characters and such, which were totally borked in jGenerator. Mavericks will look better now, once I regenerate all transformed assets.
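
For the curious, the substitution step really is that dumb. Here’s the idea sketched in Python (our real code is Java inside the WebObjects app, the placeholder tag names below are made up, and the JSwiff XML-to-SWF rendering call isn’t shown):

```python
# Sketch of the substitution step only: fill placeholder tags in the
# intermediate XML, then hand the result off to be rendered back to .swf.
# Tag names are invented for illustration; JSwiff's own API isn't shown.
import base64
from pathlib import Path

def fill_template(template_xml, image_path, fields):
    xml = template_xml
    # The image travels inside the XML as Base64...
    image_b64 = base64.b64encode(Path(image_path).read_bytes()).decode("ascii")
    xml = xml.replace("<!--PACHYDERM_IMAGE-->", image_b64)
    # ...and the tombstone fields (title, artist, date, etc.) are plain text swaps.
    for tag, value in fields.items():
        xml = xml.replace(f"<!--PACHYDERM_{tag}-->", value)
    return xml

# filled = fill_template(Path("template.swf.xml").read_text(encoding="utf-8"),
#                        "asset.jpg",
#                        {"TITLE": "Untitled", "ARTIST": "Unknown", "DATE": "1905"})
# ...then the filled XML goes through JSwiff to become the final .swf.
```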

And JSwiff doesn’t look like it will be affected by the scary deadlocks that made jGenerator basically useless for us. Yay, JSwiff! 🙂

Even better, if this works out (it’s still being tested), then Pachyderm 2 is fully usable again, and on track for the October release!

On Walled Gardens of Content

I originally posted this entry on May 18, on the Apple Digital Campus Exchange (ADCE) “Tools to Enhance Teaching and Learning” weblog. I’d post a link, but everyone (including myself) would have to log in to the ADCE system to read it. So I’m reposting it here in the hopes that it might make some difference. I’m not holding my breath. I was almost convinced that a walled garden might have value, but on further consideration I have to agree wholeheartedly with Alan – and won’t be posting to the ADCE weblogs unless/until the walled garden is opened up to everyone.

One thing that the read-write model of the internet is pretty much diametrically opposed to is the concept of content silos, or walled gardens of content.

There has to be a pretty compelling reason to lock content behind logins and registration. Restricting publishing is another matter, but restricting access to content that is not confidential is just plain wrong.

People won’t create accounts just to read content. Walled Gardens will wither and die – quickly atrophying into irrelevance.

I can see having a requirement for a login to post in the discussion boards, or to comment on a weblog (although even that is questionable). But the concept of having to log in just to find the URL to a weblog is pretty shortsighted.

Hopefully that’s just an oversight that will be quickly righted (especially considering the fact that Google has already found the ADCE blogs).

Update: I’ve put a quick-and-dirty PlanetADCE site up, which aggregates all posts from all ADCE blogs into one easy-to-read page. Enjoy!
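
“Quick-and-dirty” is accurate – a planet is really just feed aggregation. Conceptually it’s something like this sketch (the feed URLs are placeholders, and the real site does the caching and templating properly instead of printing to a page):

```python
# Toy version of what a "planet" style aggregator does: pull each member
# blog's feed and flatten the posts into one list. Feed URLs are placeholders.
import feedparser  # third-party: pip install feedparser

FEEDS = [
    "http://example.edu/adce/tools/feed",        # hypothetical ADCE blog feeds
    "http://example.edu/adce/podcasting/feed",
]

def aggregate(feed_urls):
    posts = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        source = parsed.feed.get("title", url)
        for entry in parsed.entries:
            posts.append((source, entry.get("title", "(untitled)"), entry.get("link", "")))
    return posts

for source, title, link in aggregate(FEEDS):
    print(f"{source}: {title} -> {link}")
```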

PlanetADCE remains the only way to read the ADCE weblog posts. At least until the walled garden stormtroopers decide to seal our backdoor entrance…

The only way this kind of walled garden would fly is if it were the first, the only, or the largest (by an order of magnitude or so). ADCE isn’t any of those, but it does offer some cool things. For me the biggest draw of ADCE is the fact that it’s getting Carl Berger blogging. If that was the only product of the project, it would be well worth it.

Automator for Deploying WebObjects Application

For the Pachyderm project, we wanted a way to automatically update, build and deploy a WebObjects application and its supporting framework. The initial reaction was to just use a shell script, with xcodebuild running on the server to build the appropriate projects.

That didn’t work for us, because our server is still running 10.3 (with the appropriate older Xcode dev kit), while we’ve moved on to 10.4 and Xcode 2.1 for development – so our server can’t understand the .xcodeproj files and barfs appropriately. Doh.

So, King and I whipped up an Automator-based workflow that runs on one of our dev boxes. It first runs a shell script to update the source code from Subversion, then builds the framework and application. It then runs a shell script to record the Subversion revision number in a file in the compiled application (so we can display the revision number for bug reports etc… and not pollute our source code with revision numbers). Then, it connects to the server over AFP, moves the old build products out of the way, and copies the new ones into place. About an hour later, WOMonitor comes through and cycles the app.
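
The revision-stamping step is the only mildly clever part, and even that is tiny. Here’s the idea sketched in Python – ours is really a couple of lines of shell in a “Run Shell Script” action, and the paths and file name below are placeholders:

```python
# Sketch of the revision-stamping step: ask Subversion for the working copy's
# revision and drop it into a file inside the built application, so the app
# can show it in bug reports without revision numbers living in the source.
# Paths and the file name are hypothetical placeholders.
import subprocess
from pathlib import Path

WORKING_COPY = Path("/Users/build/Pachyderm")             # hypothetical checkout
BUILD_PRODUCT = Path("/Users/build/build/Pachyderm.woa")  # hypothetical build output

def stamp_revision():
    revision = subprocess.run(
        ["svnversion", str(WORKING_COPY)],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    (BUILD_PRODUCT / "Contents" / "Resources" / "revision.txt").write_text(revision + "\n")
    return revision
```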

Basically, there is one “master” workflow that runs several nested workflows. This “master” is saved as an application, and I’ve set iCal to launch it every morning at 1:00am.

Yes. Source code management and enterprise application deployment with Automator and iCal 🙂

It seems to work well, but we initially had some permissions problems. It will also make it trivial for us to update the app at will.

Mavericks Authoring with Pachyderm

The Mavericks project is scheduled to wrap up in the next week (officially, but there will be some straggling updates over the next few weeks, I’d guess). We’ve been using the latest beta of Pachyderm to author the online version of the Glenbow Museum’s Mavericks exhibit. It’s a huuuuuge project. I just did a screen count, and while it’s lower than I estimated, it’s still a behemoth. 1254 screens are currently authored, with another 150+ coming online tonight (once Shawn finishes up).

That’s ~1400 screens, authored via the web interface, and compiled into a bunch of handy dandy Flash websites for use as part of the Museum’s online exhibits. By the way, that’s the single largest test of Pachyderm 2.0, by several orders of magnitude. And, while authoring in an alpha/beta software package is a bit quirky at times, it actually worked surprisingly well. Boy, do I have some ideas for improvements though 🙂

That’s also a lot of picky little details that need to be cleaned up in the next few days. Stray images (over 1600 images were provided for the project), stray metadata, stray content, fun with MS Word “smart formatting” messing everything up (I’m going through a few of the published sections, and finding a LOT of those insidious MS Word formatting characters… Going to have to write some SQL to strip/convert them globally…). Template tweaks. Test. Review. Fix. Repeat 🙂
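
If I end up scripting the cleanup instead of hand-writing the SQL, the character mapping is the easy part – something like this, where the table and column names are made up, and it assumes the content is stored as the actual Unicode characters (the Windows-1252 byte variants would need their own entries):

```python
# Sketch: generate SQL REPLACE statements to convert Word's "smart" characters
# back to plain equivalents. Table and column names are hypothetical, and this
# prints the statements rather than touching the database.
REPLACEMENTS = {
    "\u2018": "'",    # left single quote
    "\u2019": "'",    # right single quote
    "\u201c": '"',    # left double quote
    "\u201d": '"',    # right double quote
    "\u2013": "-",    # en dash
    "\u2014": "-",    # em dash
    "\u2026": "...",  # ellipsis
}

def sql_quote(text):
    return text.replace("'", "''")

def cleanup_statements(table, column):
    return [
        f"UPDATE {table} SET {column} = "
        f"REPLACE({column}, '{sql_quote(bad)}', '{sql_quote(good)}');"
        for bad, good in REPLACEMENTS.items()
    ]

for statement in cleanup_statements("presentation_content", "body"):
    print(statement)
```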

While it’s been an amazing test of Pachyderm, and a very cool project – I’ve learned a lot about Alberta history – I can’t wait until this thing is wrapped up. Keeping 1400 screens and their supporting bits in my head has been hurting for a while now…

Once it goes live, I’ll post links and stuff to share with the rest of the class.

Apple Digital Campus Exchange

I’ve been asked to participate in a new effort by Apple, called the “Apple Digital Campus Exchange” – it’s basically a community of folks much smarter and more interesting than myself (ranging from Cole Camplese to Alan Levine to Larry Johnson to Carl Berger) who just get to talk about stuff like The Future, iPods in the classroom, etc…

I thought this was going public next week, but I see it was actually scheduled for launch on May 12. Since it is now somewhat after May 12, I think I’m safe linking to it…

I’m a contributing blogger on the Tools to Enhance Teaching and Learning in a Digital World weblog. It needs a longer title, or at least an acronym.

I am pretty darned excited by this ADCE effort – it could be a chance to get the stuff we’ve all been dabbling with over the last little while finally pushed out to the Rest of the Class™ (because the folks that still haven’t heard of or don’t use blogs etc… just might sit up and pay attention when the Big Glowing Apple Icon starts talking about it. All we really need is for it to be shown in a Stevenote…)

I took the opportunity to write my First Post to the Tools blog, where I babble for a bit about the internet not being read-only, yadda yadda…

On the technology side, they’re basically mixing existing tools, including WordPress and vBulletin (and some others, I’m sure) to provide a pretty compelling software suite for managing the community. (the admin side is secured behind an Apple Connect login, or I’d share the URL to that – it’s pretty cool, though…)

Oh, and someone in Apple Corporate decided that we all needed bios on the website. I absolutely hate writing a bio. Booooring. So I decided instead to have fun with mine. They want a bio? I’ll give ’em a bio! 😉

Update: The Campus Exchange community site is now live, with registration open to anyone. Come play!
