Brightspace née Desire2Learn

I’ve been trying to get my head around the reasoning for the corporate rebranding to Brightspace[1][2], and I’m coming up short. I like the name, but it feels like everything they’ve described here at Fusion could have been done under the previous banner of Desire2Learn. I’m more concerned about signs that the company is shifting to a more corporate Big Technology Company stance.

When we adopted D2L, they felt like a teaching-and-learning company, one that really got teaching and learning. They were in the trenches. They used the language. They weren’t a BigTechCo, but they were on a trajectory aspiring toward becoming one.

Fusion 2013 was held at almost exactly the time our D2L environment was initially deployed, as we started configuring for our migration. We were able to send a few people to the conference last year, and we all came away saying that it definitely felt more like a teaching-and-learning conference than a vendor conference. Which was awesome.

We’ve been working hard with our account manager and technical account team, and have made huge strides in the last year. We’ve developed a really great working relationship with the company, and I think we’re all benefiting from it. The company is full of really great people who care, and who work hard to make sure everyone succeeds. That’s fantastic.

But it feels like things are shifting. The company now talks about “enablement” – which is good, but that’s corporate-speak, not teaching-and-learning speak. That’s data.

Fusion 2014 definitely feels more like a vendor conference. I don’t know if we’re just more sensitive to it this year, but every attendee I’ve talked to about it has noticed the same thing. This year is different. That’s data.

As part of the rebranding, Desire2Learn/Brightspace just rebooted their community site. It was previously run within an instance of the D2L Learning Environment (a great example of “eating your own dog food”); now it’s a shiny new Igloo-powered intranet site. They also removed access to the product suggestion platform, which was full of carefully crafted suggestions for improving the products, provided by their most hardcore users.

The rebranded community site looks great, but the years’ worth of user-provided discussions and suggestions didn’t make the journey to the new home. So, because it’s empty, the community now feels like a corporate marketing and communication platform rather than an actual community. I’m hopeful that there is a plan to bring the content from the actual community forward. But the content wasn’t there at launch, which suggests the relaunch was about the branding of the site rather than about the community. That’s data.

And there are other signs. The relaunched community site is listed under “Community” on the new Brightspace website, broken into “Communities of Practice”:

[screenshot: the “Communities of Practice” listing on the new Brightspace website]

The problem is, those aren’t “communities of practice” – they’re corporate-speak categories for managing customer engagement. Communities of Practice are something else entirely. I don’t even know what an “Enablement” community is. That’s data.

It feels like the company is trying to do everything, simultaneously. They’re building an LMS / Learning Environment / Integrated Learning Platform, a big-data Analytics platform, a media streaming platform, mobile applications, and growing in all directions at once. It feels like the corporate vision is “DO EVERYTHING” rather than something more focused. I’m hoping that’s just a communication issue, rather than anything deeper. Which is also data.

They’re working hard to be seen as a Real Company. They’re using Real Company language. They’re walking and talking like a Real Company. Data.

The thing is – they’ve been working on the rebranding for a while now, and launched it at the conference. The attendees here are likely the primary target of the rebranding, and everyone I talk to (attendees and staff) is confused by it. It feels like a marketing push, and a BigTechCo RealCo milestone. It feels like the company is moving through an uncanny valley – it doesn’t feel like the previous teaching-and-learning company, and it’s not quite hitting full stride as a BigTechRealCo yet.

I really hope that Brightspace steps back from the brink and returns to thinking like a teaching-and-learning company.

  1. this isn’t about the name – personally, I like the new name, and wish they’d used it all along. But the company had built an identity around the previous name for 15 years, and it looks like they decided to throw that all away []
  2. and there’s the unfortunate acronym. 30 seconds after the announcement, our team had already planned to reserve []

UGuelph and D2L sitting in a tree

News of a new collaboration between UGuelph and D2L, on a major pedagogy research initiative:

The pedagogy research project strives to help schools track and report on learning outcomes across programs over time. Researchers will use D2L’s predictive analytics capabilities to document and discover the effectiveness of assessment tools on specific subjects while working with educators to develop a curriculum that results in greater student success.

via University of Guelph to Leverage Desire2Learn’s Integrated Learning Platform for $6 Million Pedagogy Research Initiative | Desire2Learn Press Release.

2 quick thoughts[1] on this:

  1. awesome! D2L really does play well with others, and invests in improving teaching and learning rather than just polishing shiny baubles.
  2. surely there is more to this than just predictive analytics. I’d love to see a pedagogical collaboration that was about in-the-trenches teaching (and learning) online, and not just massaging the data gathered about online activities. D2L has been trying to foster an online community of teachers (and others) in their D2L Community site[2][3]. It would be really cool to push that community up a few notches and open the doors so anyone can follow along (or join in).

Desire2Learn really feels like they care about teaching and learning – the Fusion conference last year was different from any other vendor conference I’ve been to, and felt decidedly like a good teaching-and-learning conference rather than a buy-our-shiny-products vendor conference.

  1. my own thoughts, not the official position of the university or anything []
  2. which is actually running in the D2L LMS itself []
  3. but it requires a login to see the stuff that goes on inside it []

on creating courses to set up a semester in Desire2Learn

We’re in the middle of our Fall 2013 “Pilot” semester – almost 5,000 students are using D2L[1] this semester, with extremely positive feedback from students and instructors. We’re now in the process of setting up for the Winter 2014 semester – when 4 faculties will be moving to use Desire2Learn for 100% of their online and blended courses (with many courses from other faculties thrown in for good measure). Likely 10,000–12,000 students will be using it next semester. That’s a lot of students. And a lot of courses. We still don’t have automated course creation integrated with PeopleSoft, and are working feverishly on that (the thought of managing course enrolments for 12,000 students using CSV uploads makes me break into a cold sweat).

The basic process for setting up courses for a semester looks something like this:

  1. Create a data feed that triggers course creation: Course Code, Course Title, Department, and other key metadata about the courses. This can either be done through the connection with PeopleSoft (which isn’t working for us yet), or via the Bulk Course Create (BCC) tool. Feed BCC the CSV of course info (SFTP it to the D2L server), wait until the scheduled processing job crunches it, and boom. Courses are created. But they’re empty. And nobody can see them.

  2. Enrol users in the courses. This can either be done via a scheduled data feed from PeopleSoft (again, not yet), or via another CSV file that associates a user with a course and applies a role. This is done using a second tool, built into the Users admin interface. This Bulk User Management (BUM) tool[2] takes the CSV and crunches it on demand – no scheduled processing job to wait for. The CSV can also be cumulative, so you don’t have to scoop out previous entries. Separate files are needed to handle CREATE, ENROLL, UNENROLL, etc…, because they all have different columns, in different orders.

  3. The courses are empty. They need to be populated with any default content that a faculty uses. We have set up a “template”[3] that needs to be copied into each course in a faculty. This uses another separate utility, Copy Course Bulk (CCB), with another CSV format. This utility is different, because it lives inside a special course in our D2L instance. You go to the course, open the “Manage Files” interface, and upload the CSV file (named input.csv) into an “Inbox” folder. Every night, at about 12:30am, a process crunches that file (if it exists), copies the content as specified in it, logs the result, and moves the file to the “Outbox” folder. But, this only copies course content, grade items, assignments, grading schemes, etc…

  4. To have the courses in each faculty use the proper homepage as designed by the key people in each faculty, yet another utility is needed – with yet another CSV format. We haven’t seen the Automated Course Branding Tool (ACBT) yet[4], but I assume it lives as a special course offering within our D2L instance, as the CCB tool does. This tool will set the homepage (the layout of the course – which widgets are visible, where they are, etc…) as well as the NavBar (the navigation menu for the course).
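Assembling those CSV files by hand gets old fast, so we script the glue. Here’s a rough Python sketch of the kind of thing I mean – the registrar extract shape, column layouts, and file names here are made up for illustration, not the actual D2L file specs (check the tool documentation for those):

```python
import csv

# Hypothetical registrar extract: one row per enrolment.
registrar_rows = [
    {"course_code": "MATH251-F2013", "course_title": "Calculus II", "dept": "MATH",
     "username": "jdoe", "role": "Student"},
    {"course_code": "MATH251-F2013", "course_title": "Calculus II", "dept": "MATH",
     "username": "prof1", "role": "Instructor"},
]

def write_bcc_csv(rows, path):
    """One row per unique course, for the Bulk Course Create tool."""
    seen = {}
    for r in rows:
        seen[r["course_code"]] = (r["course_code"], r["course_title"], r["dept"])
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(sorted(seen.values()))

def write_bum_enroll_csv(rows, path):
    """One row per (user, course, role), for the Bulk User Management ENROLL action."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        for r in rows:
            w.writerow([r["username"], r["course_code"], r["role"]])

write_bcc_csv(registrar_rows, "bcc_courses.csv")
write_bum_enroll_csv(registrar_rows, "bum_enroll.csv")
```

The nice part of generating both files from one extract is that the course list and the enrolments can’t drift out of sync – the BCC file is just the de-duplicated courses from the same rows that feed the BUM file.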

There. That’s all it takes. Of the 4 steps, 2 will eventually be automatable through our connection with PeopleSoft, when that comes online. The other 2 will require semi-manual intervention, to create the list of courses for each faculty, tying course codes to “template” courses[5].

Of these tools, the Copy Course Bulk (CCB) and Automated Course Branding Tool (ACBT) require additional licenses, and need to be separately deployed in your D2L environment. This takes time. We weren’t even aware these tools were separate, or that they needed to be licensed and deployed, until we went to use the functionality (having assumed it would exist in the core product). Plan ahead. These tools do the job, but take some time to set up (and push through campus purchasing processes…).

  1. I should start an acronym-based drinking game, except my liver wouldn’t survive it []
  2. BUM. giggle. No, just upload it to the BUM. D2L takes it in the BUM. oy. productive project meetings discussing this tool… []
  3. not the D2L concept of “Template” which is strictly an administrative thing – all Math 251 courses are set up with the D2L Template of “Math 251”, making it easier to find all instances of a certain course. The “Faculty Template” I’m talking about here is just a Course Offering that is used by key people in each faculty to set up how they want all of their courses to look – they add stuff to the Content area. Items to News. Preload any content etc… that will then be copied into each course in their faculty. []
  4. stuck in the fun of University Purchasing etc… []
  5. that aren’t actually D2L templates []

all I want from a D2L user activity system dashboard

We’re now in the third week of the Fall 2013 Desire2Learn pilot, and I find myself using the Users > Statistics page to monitor the status of the environment. It’s an extremely coarse way to see if people are having problems (if there’s a problem, I’d expect the active user count to drop to near 0).

D2L User Statistics

It’s not exactly ideal, though. What I’d love is something closer to what WordPress gives for recent activity, but for active users in the environment. Something kind of like:

D2L User Activity Report Mockup

Bonus points for some content activity reports as well (# of courses active, # of discussion posts per hour, # of quizzes submitted per hour, video notes published/viewed, etc…)

Extra-special bonus points for something that could be set to run on an iPad display to monitor the environment throughout the day…
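The binning behind that kind of per-hour activity report is trivial once you have timestamped login events to work with. A quick sketch (the event data and its shape are hypothetical – D2L doesn’t hand you a feed like this out of the box):

```python
from datetime import datetime

# Hypothetical (user, timestamp) login events – not an actual D2L export format.
events = [
    ("jdoe", "2013-09-18 09:12"),
    ("asmith", "2013-09-18 09:47"),
    ("jdoe", "2013-09-18 10:05"),
]

def active_users_per_hour(events):
    """Count distinct active users in each hour bucket."""
    buckets = {}
    for user, ts in events:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%Y-%m-%d %H:00")
        buckets.setdefault(hour, set()).add(user)
    return {hour: len(users) for hour, users in sorted(buckets.items())}

print(active_users_per_hour(events))
# {'2013-09-18 09:00': 2, '2013-09-18 10:00': 1}
```

Swap the user column for course codes, discussion posts, or quiz submissions and you get the bonus-points reports too. The hard part isn’t the math – it’s getting at the events.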

small scale pilot

We’re about to launch a “small scale pilot” in Desire2Learn, for the semester that starts Monday. The goal was to keep it small and manageable, because we don’t have integration with PeopleSoft for managing enrolment data yet.

Small scale pilot

28 registrar-provided courses, with many sections. 5,181 enrolments for 4,069 participating students (and growing). Small scale… The largest course is just shy of 1,000 students.

Desire2Learn Fusion 2013 notes

Since we’re adopting Desire2Learn, the UofC sent a few folks to the annual Desire2Learn Fusion conference – the timing was extremely fortuitous, with the conference starting about a month after we signed the contract. I’d never been to a D2L conference before, so wasn’t sure really what to expect. Looking at the conference schedule ahead of time, it looked pretty interesting – and would have many sessions that promised to cover some of the extremely-rapid-deployment adoption cycle we’re faced with.

The short version of the conference is that I’m quite confident that we’ll have the technology in place by the time our users need it[1]. But, I’m a little freaked out about our ability to train instructors before they need to start using D2L. We’ve got some plans to mitigate this, but it’s the Achilles’ heel of our implementation plan. If we can’t get our instructors trained… DOOM! etc…

The sessions at the conference were really good. I’d expected a vendor conference to focus on the shiny bits of the vendor’s tools, and on the magical wondrousness of clicking checkboxes and deploying software to make unicorns dance with pandas, etc… But it really wasn’t about the vendor’s product – for many of the sessions (even the rapid-deployment sessions that I went to), you could easily remove Desire2Learn’s products from the presentation and it would still have been an interesting and useful session. One of our team members who went to a session on online discussions commented something along the lines of “I was thinking I’d see how to set up the D2L discussion boards in a course – but they didn’t even talk about D2L! It was all pedagogy and learning theory…” The conference as a whole had extremely high production value. Surprisingly high. Whoever was responsible for the great custom-printed-conference-schedule lanyard badge deserves a raise or two.

This was also one of the most exhausting conferences I’ve been to. It’s the first one in a long time where I had to really be present, attend all sessions, and pay attention. We’re under just a little pressure to get this deployment done right, and there’s no time to screw up and backtrack. So, pay attention.

The sessions I chose largely share a theme. We’re looking at migrating from Bb8 to D2L starting now, and wrapping up with 100% adoption by May 2014. So, my session selection and note-taking focus was largely driven by the panic of facing 31,000 FTE students, a few thousand instructors, 14 deans, a Provost, a Vice Provost, and a CIO (when we get a new one) and being able to say “hey. we’re on it. we can do this.”[2]

I’m not going to blab about the social events (which were awesome), or about how nice (but ungodly hot) Boston was (it was very nice). I’ve posted photos[3].

Here are abridged highlights from my session notes:

Administration pre-conference

Kerry O’Brien

Learned a bunch about the Org Unit model in D2L – and something clicked when thinking about “Course Templates”. I’d been thinking Templates were course design templates. No. They’re course offering templates, used for grouping course offerings based on an organizational hierarchy. So, if you offer a course like Math 251, there’s a Course Template called “Math 251”, and all offerings (instances of a course with students actually taking the course) are just Course Offerings that use the “Math 251” template. So, a course like “Math 251 – Fall 2013” is a Course Offering of “Math 251” (and it also belongs to the “Fall 2013” semester org unit, and likely has Sections enabled so that Math 251 L01–L10 are contained within the single Course Offering). Sounds complicated, but once it clicked, I realized it should help to keep things nicely organized.

Also, the distinction between Sections and Groups was useful – Sections are defined by the Registrar – they’re official subdivisions of a Course Offering, and will be pulled from our PeopleSoft SIS integration – while Groups are informal ad hoc subdivisions that are created by the instructor(s) to put students into smaller bunches to work on stuff together[4].
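To keep the jargon straight in my own head, the hierarchy sketches out roughly like this (my own mental model in code, not D2L’s actual schema – all names here are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class CourseTemplate:      # e.g. "Math 251" – groups all offerings of a course
    code: str

@dataclass
class Section:             # registrar-defined, e.g. "L01" – fed from the SIS
    name: str

@dataclass
class Group:               # informal, instructor-created subdivision
    name: str

@dataclass
class CourseOffering:      # an actual instance that students enrol in
    title: str             # e.g. "Math 251 – Fall 2013"
    template: CourseTemplate
    semester: str          # also an org unit, e.g. "Fall 2013"
    sections: list = field(default_factory=list)
    groups: list = field(default_factory=list)

math251 = CourseTemplate("Math 251")
offering = CourseOffering("Math 251 – Fall 2013", math251, "Fall 2013",
                          sections=[Section(f"L{n:02d}") for n in range(1, 11)])
```

The key bit: the Template holds nothing student-facing – it’s purely the grouping node – while Sections hang off the Offering (from the Registrar) and Groups hang off the Offering (from the instructor).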


Al Essa

I’ve seen Al online, but this was my first time seeing him in person. He’s a really interesting presenter, and made me overcome some of my resistance to analytics and enterprise data analysis. I’m still kind of pissed off at what the Gates Foundation is doing to education, but there’s something interesting going on with analytics, if we’re careful about how we interpret the data, and what we do with it.

One thing I’m really interested in exploring with the Analytics/Insights package is the modelling of curriculum and learning outcomes within a course, department, faculty, and across campus. This has some really interesting implications for the development and refinement of courses to make sure students are given the full range of learning experiences that will help them succeed in their program(s).

Navigation & Themes

Lindsay – Calgary Catholic School District

As an aside, one of the big reasons we went with D2L was because the CCSD and CBE (and SAIT, and Bow Valley College, and a bunch of other institutions in Calgary) use it, so we’ll have much more opportunity for collaboration and sharing of resources.

One thing we’ll have to figure out is what the default course layout and permissions should be – what tools/content should be presented by default, and what will instructors be able to change? I’d been thinking we should just hand the keys over to the instructors, and let them change everything if they want. But, we may need to think more about how to set up the default layout, and which bits need to be adjusted by instructors. This comes down to the maturity level of the users, and of the organization, and the level of training we’ve been able to provide by the time they get there… I’ll also be looking to keep things as simple as possible, while providing access to deeper tools for advanced users. Interesting tension there…

Quick & Dirty Implementation

Michel Singh, La Cité collégiale

Michel (and team) migrated about 5,000 FTE students from Blackboard CE to D2L. They signed a contract in May 2012, and launched on September 1, 2012. The technology portion of the implementation and migration worked great. But training of instructors was the big risk and weak spot (sounds familiar…).

They did a “cold turkey” migration – everyone jumped into the pool together.

Some lessons from Michel:

  1. Have a plan and choose your battles
  2. Manage expectations (admin and faculty)
  3. Share responsibilities among your group
  4. Have empathy for users
  5. Structure your support team
  6. Leverage video and tutorials
  7. Celebrate small successes
  8. Maintain good communication with D2L

They used Course Templates in an interesting way – although course content isn’t automatically copied from Templates to Offerings, they used Templates as a central place for the curriculum review team to build the course and add resources, which could then be manually copied down into the Offering for use with students. Nice separation, while allowing collaboration on course design.

Victoria University D2L Implementation

Lisa Germany

They took on a wider task – refreshing the campus eLearning environment – which included curriculum reform and restructuring the university in addition to migrating from WebCT. Several eLearning tools were added or replaced, including D2L, Echo 360, Bb Collaborate, and TurnItIn, along with auditing of processes and data retention across systems.

Lessons learned:

  1. Understand how teaching & learning activities interact with university systems and determine high-level and detailed business requirements before going to tender (systems map, in context with connections between tools/platforms/data/people).
  2. Involve strategic-thinking people in selection process (not a simple tech decision – there are wider implications)
  3. Make sure requirements are written by people who have done something like this before…
  4. Involve the legal team from the start, so they aren’t the bottleneck at the end. cough
  5. Have a good statement of work outlining vendor and university roles/expectations.
  6. Don’t do it over the summer break! cough
  7. Have a business owner who is able to make initial decisions.

Program priorities (similar to what ours are):

  1. Time (WebCT expires – in our case, timeline driven by Bb license expiring in May 2014)
  2. Quality (don’t step backward, don’t screw it up)
  3. Scope (keep it manageable for all stakeholders)

They took a 3 semester timeline as well:

  1. Pilot
  2. Soft launch
  3. Cutover

DN: So, I’m feeling surprisingly good about the timeline we have, from the perspective of the technology. The biggest tech hurdle we have will be PeopleSoft integration, and that’s doable. It’s the training and user adoption that will kill this…

One Learning Environment

Greg Sabatine – UGuelph

In a way, as a new D2L campus, this was kind of a “don’t do it this way” kind of session. Guelph was the first D2L client, a decade ago, and has kind of accreted campus-wide adoption over the years. So, initial decisions didn’t scale or adapt to support additional use-cases…

They run 11 different instances of self-hosted D2L, on different upgrade schedules (based on the needs of the faculties using each instance – Medicine is on a different academic schedule than Agriculture, so the servers can’t share an upgrade timeline, etc…). I wonder what our D2L-hosted infrastructure will do to us along these lines – minor upgrades are apparently seamless now, with no downtime, but I wonder what Major Upgrades will do[5].

Looking at the Org Structure they use, we’ll likely have to mimic it, so that each Faculty can have some separation and control. So, our org structure would likely look like:

University > Faculty > Department > Course Offering

They did some freaky custom coding stuff to handle student logins, to point students to the appropriate Org Home Page depending on their program enrolment. Man, I hope we don’t have to do anything as funky as that…

Community of Collaboration in D2L

Barry Robinson – University System of Georgia

They use D2L with 30 out of 31 institutions in the system[6]. Over 10,000 instructors. 120 LMS Administrators. 315,000 students across the system. Dang. That’s kinda big. They use a few groups to manage D2L and to make decisions:

  1. Strategic – Strategic Advisory Board – 15 reps from across the system, making the major decisions, meeting quarterly
  2. Strategic – Steering Committee – monthly meetings to guide implementation…
  3. Operational – LMS Admin Retreats – bi-annual sessions where the 120 LMS Admins get together to talk about stuff
  4. Tactical – LMS Admins Weekly Conference – every Thursday at 10am, an online meeting for the admins to discuss issues etc…

Community resources:

  • A D2L course for the community to use to discuss/share things.
  • Listserv
  • Emergency communication system
  • Helpdesk tickets
  • Vendor – Operational Level Guidelines
  • Vendor – SLAs

They migrated 322,347 courses from Bb Vista 8, and had over 99% success rate on course migrations…

D2L for Collaboration & Quality Assurance

Mindy Lee – Centennial College

They review curriculum every 3-5 years. They previously used wikis to collect info, but then had export-to-Word hell. They needed to transition to continuous improvement, rather than static snapshots. Now, they use a D2L course shell to share info, and then build a Word document to report on it after the fact.

D2L Curriculum Review course shells:

  • 1 for the program – a common place for all courses in a program. Meta.
  • 1 for each course under review – used to document evidence for the report (content simply uploaded as files in the course). Also serves as a community resource for instructors teaching the course – shared content, rubric, etc…

DN: This last part is actually pretty cool – the curriculum review course site is used as an active resource by instructors who later teach the course. It’s not a separate static artifact. It’s part of the course design/offering/review/refinement process. Tie this into the Course Outcomes features, and there’s a pretty killer curriculum mapping system…

Maximize LOR Functionalities

Lamar State College

DN: the D2L LOR feature is actually one of the big things we’re looking to roll out. Which is kind of funny/ironic, given my history with building learning object repositories…

They use the LOR to selectively publish content to repositories within the organization – you can set audiences for content.

Would it make sense to use a LOR for just-in-time support resources? (similar to the videos available from D2L)?

From Contract to Go-Live in 90 Days


DN: 90 days. Holy crap. Our timeline looks downright relaxed in comparison. This should be interesting…

They had permanent committees guiding decisions and implementation, and a few ad hoc committees:

  • LMS Guidance (CIO etc…)
  • LMS Futures (Faculty members…)
  • LMS Implementation (Tech team / IT)

On migration: they encouraged faculty to “re-engineer” courses rather than to simply copy them over from the old LMS. “If a course is really simple, it’s better to just recreate it in D2L. If it’s complex, or has a LOT of content, migrate it.”

Phased launch – don’t enable every feature at first – it’s overwhelming, and places an added burden on the training and support teams. Best to stage features in phases – the core Learning Environment first, then other features like ePortfolio, LOR, Analytics, etc… once you’re up and running.

Train the trainers first (and key überusers).

Work with early adopters – they will make or break adoption in a faculty.

They ran several organized, small configuration drop-in sessions, each focused on a specific tool or task. Don’t try to do everything in one session…

8 Months vs. 8 Weeks: Rapid LMS Implementation

Thomas Tobin – Northeastern Illinois University

First, any session that starts with a soundtrack and an Egyptian-Pharaoh-as-LMS-Metaphor thought exercise has GOT to be good.

Much of this should be common sense, but it’s super-useful to see how it lays out in the context of an LMS implementation…

On building the list of key stakeholders – ask each stakeholder “who else should we include? why?” Don’t miss anyone – they get cranky about that.

Identify the skeptics, and recruit them.

The implementation PM must have the authority to make decisions on the fly, or the project will stall.

Develop a charter agreement – define the scope, roles, goals, etc… so people know what’s going on.

The detailed Gantt chart is not for distribution. It has too much detail, changes regularly, and will freak people out.

Plan tasks in parallel vs. serial – break things out, delegate, and let them do their jobs. Success relies on multiple people working together, not single-worker-bottlenecks.

Use a gated plan for milestones and releases – and celebrate (small) successes. (but what do you do about failures?)

Designate 1 person as “schedule checker” – doesn’t have to be the project manager (actually, may be useful to be someone else…)

Assess existing and new risks regularly.

Do a “white glove” review – review and test all settings and features. So, the ePortfolio tool is supposed to be enabled. Is it? Were permissions set so people could actually use it? Were they trained? Does it work? etc…

Unified Communications at the Univ. System of Georgia

David Disney

Interesting talk – he pointed out the importance of making sure end users have current information on the status of tools/networks/services, so they’re not left guessing. I’d point out that if people have to monitor a status website to see how things are doing, that may be a symptom of larger problems…

They have a cool website for monitoring key services across the University System of Georgia.

  1. we’re doing a small-scale pilot this Fall semester, without full integration to PeopleSoft. Winter 2014 will be a “soft launch” with anyone wanting to use it (and some faculties deciding to switch over 100% for January), and with our Blackboard 8 server sunsetted in May 2014 (we’re taking a page from Google’s playbook) – so the Spring 2014 semester will be the first with 100% use of Desire2Learn. []
  2. because I sure wasn’t feeling confident about being able to say that before the Fusion conference… []
  3. well, some of the photos… []
  4. and the recent D2L acquisition of Wiggio should make the use of Groups REALLY interesting… []
  5. the Desire2Learn community site was just down for 2 days for a major upgrade to the latest version of D2L – will this kind of thing be necessary going forward from 10.2? []
  6. they mentioned this several times – with the ominous but unspoken question of “what’s up with the 31st institution?” []

on the new LMS

I’ve been working with people on campus for a long time to try to figure out what we need to do about our campus LMS. My oldest file for the endeavour was created on July 19, 2011. Seriously. Almost 2 years ago. We did a couple of rounds of campus engagement[1], ran an RFP, and wrote several reports. Provincial politics, budget crises and legal processes intervened, and here we are. The decision was formalized in the RFP system this afternoon, and it’s official: the University of Calgary has selected Desire2Learn as its next learning management system.

This is good for a few reasons:

  1. we can finally move past “so… do we have a new LMS yet?” to “yes. now what are you going to do with it?”
  2. we can finally stop focusing our support efforts on “but it doesn’t work on (insert name of current browser and operating system)” and “but file uploads don’t work” etc… Yes. It works. Moving on…
  3. D2L is used by almost all post-secondary institutions in Calgary – the only non-D2L post-sec is MRU. Almost all of the city’s K12 runs on D2L (the public and Catholic boards run it, as do most private schools). So, lots of opportunity for collaboration and sharing of resources for support/training/development.

We’re just working on the migration plan now – I’d drafted a version assuming a decision would have been made back in January. Yeah. The timeline isn’t entirely valid anymore. So… Now that it’s final, we need to put together an adjusted implementation plan and timeline. The optimistic plan is to start with a small-scale pilot for the Summer 2013 semester (which starts next month!), and start large-scale migration of courses in Fall 2013 and Winter 2014. By Spring 2014, all courses will be run in D2L[2]. From conversations I’ve had with Deans and instructors from many faculties, the problem is going to be turning people away from the new system in order to get on our feet before running…

Those who know me may be surprised that I’m excited about the LMS. Yes, I really am. We need to provide quality tools that can be used by everyone, not just those who are geeky enough to duct-tape together their own DIY non-LMS environments. The LMS is important in a social justice context – we need to provide equal functionality for all instructors and students, in all faculties. The LMS lets us do that. If people feel the need to develop their own custom tools, they can totally do so. But for most of the use-cases I’ve seen for custom tools[3], they won’t need to.

This is important because with a current LMS in place, we can stop focusing on the tool. We can stop talking about shortcomings in the tool, and focus on teaching and learning. Awesome. Let’s do that.

  1. focus groups, vendor demos, workshops, sandboxes, surveys, etc… []
  2. of course, this may prove to be unrealistically optimistic, so may need to be adjusted. again. []
  3. they were often implemented to fill perceived gaps in the previous LMS, rather than solving unique teaching-and-learning needs []