Category Archives: general

small scale pilot

We’re about to launch a “small scale pilot” in Desire2Learn, for the semester that starts Monday. The goal was to keep it small and manageable, because we don’t have integration with PeopleSoft for managing enrolment data yet.


28 registrar-provided courses, with many sections. 5,181 enrolments for 4,069 participating students (and growing). Small scale… The largest course is just shy of 1,000 students.

how to repair all tables in all databases on a mysql server

This comes in handy, and I have to google it1 every time I need it2. So, here’s a copy for reference later…

mysqlcheck --repair --use-frm --all-databases

Run it as root, with MySQL running. It’ll repair every table in every database. Give it time to chew for a while. It spews out the status of every table as it works. Here’s what it found with my Fever˚ database tables (which now work just fine):

dnorman_fever.fever__config
warning  : Number of rows changed from 0 to 1
status   : OK
dnorman_fever.fever_favicons
warning  : Number of rows changed from 0 to 408
status   : OK
dnorman_fever.fever_feeds
warning  : Number of rows changed from 0 to 240
status   : OK
dnorman_fever.fever_feeds_groups
warning  : Number of rows changed from 0 to 305
status   : OK
dnorman_fever.fever_groups
warning  : Number of rows changed from 0 to 17
status   : OK
dnorman_fever.fever_items
warning  : Number of rows changed from 0 to 13660
status   : OK
dnorman_fever.fever_links
warning  : Number of rows changed from 0 to 46208
status   : OK

Better.

Looks like it doesn’t like InnoDB tables, throwing this:

note     : The storage engine for the table doesn't support repair

So, if you’re using MyISAM tables, this should do the trick. Not sure how to fix the InnoDB tables, or if they even need fixing…
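For what it’s worth, REPAIR TABLE is a MyISAM-level operation that InnoDB simply doesn’t implement, so that note is expected rather than a sign of damage. If an InnoDB table ever does report problems from a check, the usual route is to rebuild it rather than repair it. A rough sketch – the database and table names here are made up, not mine:

# InnoDB supports CHECK, just not REPAIR
mysqlcheck --check some_database some_innodb_table

# a "null" ALTER forces MySQL to rebuild the InnoDB table in place
mysql -e "ALTER TABLE some_database.some_innodb_table ENGINE=InnoDB;"

Worst case, a mysqldump and re-import of the table does the same job, more slowly.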


Footnotes:

  1. usually coming up with the top-voted answer for a question posted to stackoverflow.com []
  2. actually, I use DuckDuckGo, so I get the tip inline in the search results… []

on the napsterization of education

Another post on how education is undergoing (or will very soon be forced to undergo) a napster-like disruption/transformation/eruption.

But napster didn’t disrupt music. It disrupted the previous business model for distributing recorded music content. Musicians still exist. People still write/play/perform/record/buy/download music. The workflow has changed. The people who control the pipelines have changed.

Digital technologies are disrupting the current business model(s) for distributing educational content. And that’s a great thing. $500 worth of required textbooks for a single course is just plain messed up. Academic journals charging researchers hundreds or thousands of dollars to gain access to research funded by public institutions, also messed up.

But these technologies aren’t disrupting education itself. That’s up to the teachers, students, and administrators. They’re the ones who need to figure out what to make of the transformative technologies, and of having free (or nearly free) access to content and to each other.

Falling back to calling it “napsterization” – as if that is a magic recipe for disruption – is just a lazy narrative. Yes. Things are changing. But it’s not going to be a relatively instant bit-shift from Old Education to New Education through the power of modern chemistry.

Education is not just about granting access to content. If it was, then we’d close every school and just let people go to the library. Education is about the activities we do with each other in our various roles, to build/connect/try/experiment/explore/create/etc… – these are things that build on content, but content itself is not sufficient for education. Music is different, because unless you’re a musician yourself, music is about consuming content (whether pre-recorded or listening to live performances).

Desire2Learn Fusion 2013 notes

Since we’re adopting Desire2Learn, the UofC sent a few folks to the annual Desire2Learn Fusion conference – the timing was extremely fortuitous, with the conference starting about a month after we signed the contract. I’d never been to a D2L conference before, so wasn’t sure really what to expect. Looking at the conference schedule ahead of time, it looked pretty interesting – and would have many sessions that promised to cover some of the extremely-rapid-deployment adoption cycle we’re faced with.

The short version of the conference is that I’m quite confident that we’ll have the technology in place by the time our users need it1. But, I’m a little freaked out about our ability to train instructors before they need to start using D2L. We’ve got some plans to mitigate this, but this is the Achilles Heel of our implementation plan. If we can’t get our instructors trained… DOOM! etc…

The sessions at the conference were really good. I’d expected a vendor conference to focus on the shiny bits of the vendor’s tools, and on the magical wondrousness of clicking checkboxes and deploying software to make unicorns dance with pandas, etc… But it really wasn’t about the vendor’s product – for many of the sessions (even the rapid-deployment sessions that I went to), you could easily remove Desire2Learn’s products from the presentation and it would still have been an interesting and useful session. One of our team members who went to a session on online discussions commented something along the lines of “I was thinking I’d see how to set up the D2L discussion boards in a course – but they didn’t even talk about D2L! It was all pedagogy and learning theory…” The conference as a whole had extremely high production value. Surprisingly high. Whoever was responsible for the great custom-printed-conference-schedule lanyard badge deserves a raise or two.

This was also one of the most exhausting conferences I’ve been to. It’s the first one in a long time where I had to really be present, attend all sessions, and pay attention. We’re under just a little pressure to get this deployment done right, and there’s no time to screw up and backtrack. So, pay attention.

The sessions I chose largely share a theme. We’re looking at migrating from Bb8 to D2L starting now, and wrapping up with 100% adoption by May 2014. So, my session selection and note-taking focus was largely driven by the panic of facing 31,000 FTE students, a few thousand instructors, 14 deans, a Provost, a Vice Provost, and a CIO (when we get a new one) and being able to say “hey. we’re on it. we can do this.”2

I’m not going to blab about the social events (which were awesome), or about how nice (but ungodly hot) Boston was (it was very nice). I’ve posted photos3.

Here’s abridged highlights from my session notes:

Administration pre-conference

Kerry O’Brien

Learned a bunch about the Org Unit model in D2L – and something clicked when thinking about “Course Templates.” I’d been thinking Templates were course design templates. No. They’re course offering templates, used for grouping course offerings within an organizational hierarchy. So, if you offer a course like Math 251, there’s a Course Template called “Math 251” and all offerings (instances of the course with students actually taking it) are just Course Offerings that use the “Math 251” template. A course like “Math 251 – Fall 2013” is a Course Offering of “Math 251” (it also belongs to the “Fall 2013” semester org unit, and likely has Sections enabled so that Math 251 L01–L10 are contained within the single Course Offering). Sounds complicated, but once it clicked, I realized it should help keep things nicely organized.

Also, the distinction between Sections and Groups was useful – Sections are defined by the Registrar – they’re official subdivisions of a Course Offering, and will be pulled from our PeopleSoft SIS integration – while Groups are informal ad hoc subdivisions that are created by the instructor(s) to put students into smaller bunches to work on stuff together4.

Analytics/Insights

Al Essa

I’ve seen Al online, but this was my first time seeing him in person. He’s a really interesting presenter, and made me overcome some of my resistance to analytics and enterprise data analysis. I’m still kind of pissed off at what the Gates Foundation is doing to education, but there’s something interesting going on with analytics, if we’re careful about how we interpret the data, and what we do with it.

One thing I’m really interested in exploring with the Analytics/Insights package is the modelling of curriculum and learning outcomes within a course, department, faculty, and across campus. This has some really interesting implications for the development and refinement of courses to make sure students are given the full range of learning experiences that will help them succeed in their program(s).

Navigation & Themes

Lindsay – Calgary Catholic School District

As an aside, one of the big reasons we went with D2L was because the CCSD and CBE (and SAIT, and Bow Valley College, and a bunch of other institutions in Calgary) use it, so we’ll have much more opportunity for collaboration and sharing of resources.

One thing we’ll have to figure out is what the default course layout and permissions should be – what tools/content should be presented by default, and what will instructors be able to change? I’d been thinking we should just hand the keys over to the instructors, and let them change everything if they want. But we may need to think more about how to set up the default layout, and which bits need to be adjusted by instructors. This comes down to the maturity level of the users and of the organization, and to the level of training we’ve been able to provide by the time they get there… I’ll also be looking to keep things as simple as possible, while providing access to deeper tools for advanced users. Interesting tension there…

Quick & Dirty Implementation

Michel Singh, La Cite Collegial

Michel (and team) migrated about 5,000 FTE students from Blackboard CE to D2L. Signed a contract in May 2012, and launched on September 1 2012. The technology portion of the implementation and migration worked great. But training of instructors was the big risk and weak spot (sounds familiar…).

They did a “cold turkey” migration – everyone jumped into the pool together.

Some lessons from Michel:

  1. Have a plan and choose your battles
  2. Manage expectations (admin and faculty)
  3. Share responsibilities among your group
  4. Have empathy for users
  5. Structure your support team
  6. Leverage video and tutorials
  7. Celebrate small successes
  8. Maintain good communication with D2L

They used Course Templates in an interesting way – although course content isn’t automatically copied from Templates to Offerings, they used Templates as a central place for the curriculum review team to build the course and add resources, which could then be manually copied down into the Offering for use with students. Nice separation, while allowing collaboration on course design.

Victoria University D2L Implementation

Lisa Germany

They took on a wider task – refreshing the campus eLearning environment – which included migrating from WebCT to Blackboard in addition to curriculum reform and restructuring the university. Several eLearning tools were added or replaced (D2L, Echo 360, Bb Collaborate, TurnItIn), along with an audit of processes and data retention across systems.

Lessons learned:

  1. Understand how teaching & learning activities interact with university systems and determine high-level and detailed business requirements before going to tender (systems map, in context with connections between tools/platforms/data/people).
  2. Involve strategic-thinking people in selection process (not a simple tech decision – there are wider implications)
  3. Make sure requirements are written by people who have done something like this before…
  4. Involve the legal team from the start, so they aren’t the bottleneck at the end. cough
  5. Have a good statement of work outlining vendor and university roles/expectations.
  6. Don’t do it over the summer break! cough
  7. Have a business owner who is able to make initial decisions.

Program priorities (similar to what ours are):

  1. Time (WebCT expires – in our case, timeline driven by Bb license expiring in May 2014)
  2. Quality (don’t step backward, don’t screw it up)
  3. Scope (keep it manageable for all stakeholders)

They took a 3 semester timeline as well:

  1. Pilot
  2. Soft launch
  3. Cutover

DN: So, I’m feeling surprisingly good about the timeline we have, from the perspective of the technology. The biggest tech hurdle we have will be PeopleSoft integration, and that’s doable. It’s the training and user adoption that will kill this…

One Learning Environment

Greg Sabatine – UGuelph

In a way, as a new D2L campus, this was kind of a “don’t do it this way” kind of session. Guelph was the first D2L client, a decade ago, and has kind of accreted campus-wide adoption over the years. So, initial decisions didn’t scale or adapt to support additional use-cases…

They run 11 different instances of self-hosted D2L, on different upgrade schedules (based on the needs of the faculties using each instance – Medicine is on a different academic schedule than Agriculture, so the servers can’t share the same upgrade timeline, etc…). I wonder what our D2L-hosted infrastructure will do to us along these lines – minor upgrades are apparently seamless now, with no downtime, but I wonder what Major Upgrades will do5.

Looking at the Org Structure they use, we’ll likely have to mimic it, so that each Faculty can have some separation and control. So, our org structure would likely look like:

University > Faculty > Department > Course Offering

They did some freaky custom coding stuff to handle student logins, to point students to the appropriate Org Home Page depending on their program enrolment. Man, I hope we don’t have to do anything as funky as that…

Community of Collaboration in D2L

Barry Robinson – University System of Georgia

They use D2L with 30 out of 31 institutions in the system6. Over 10,000 instructors. 120 LMS Administrators. 315,000 students across the system. Dang. That’s kinda big. They use a few groups to manage D2L and to make decisions:

  1. Strategic – Strategic Advisory Board – 15 reps from across the system, making the major decisions, meeting quarterly
  2. Strategic – Steering Committee – monthly meetings to guide implementation…
  3. Operational – LMS Admin Retreats – bi-annual sessions where the 120 LMS Admins get together to talk about stuff
  4. Tactical – LMS Admins Weekly Conference – every Thursday at 10am, an online meeting for the admins to discuss issues etc…

Community resources:

  • A D2L course for the community to use to discuss/share things.
  • Listserv
  • Emergency communication system
  • Helpdesk tickets
  • Vendor – Operational Level Guidelines
  • Vendor – SLAs

They migrated 322,347 courses from Bb Vista 8, and had over 99% success rate on course migrations…

D2L for Collaboration & Quality Assurance

Mindy Lee – Centennial College

They review curriculum every 3-5 years. They previously used wikis to collect info, but then had export-to-Word hell. They needed to transition to continuous improvement rather than static snapshots. Now they use a D2L course shell to share info, and then build a Word document to report on it after the fact.

D2L Curriculum Review course shells:

  • 1 for the program – a common place for all courses in a program. Meta.
  • 1 for each course under review – used to document evidence for the report (content simply uploaded as files in the course). Serves as a community resource for instructors teaching the course – shared content, rubric, etc…

DN: This last part is actually pretty cool – the curriculum review course site is used as an active resource by instructors who later teach the course. It’s not a separate static artifact. It’s part of the course design/offering/review/refinement process. Tie this into the Course Outcomes features, and there’s a pretty killer curriculum mapping system…

Maximize LOR Functionalities

Lamar State College

DN: the D2L LOR feature is actually one of the big things we’re looking to roll out. Which is kind of funny/ironic, given my history with building learning object repositories…

They use the LOR to selectively publish content to repositories within the organization – you can set audiences for content.

Would it make sense to use a LOR for just-in-time support resources (similar to the videos available from D2L)?

From Contract to Go-Live in 90 Days

MSU

DN: 90 days. Holy crap. Our timeline looks downright relaxed in comparison. This should be interesting…

They had permanent committees guiding decisions and implementation, and a few ad hoc committees:

  • LMS Guidance (CIO etc…)
  • LMS Futures (Faculty members…)
  • LMS Implementation (Tech team / IT)

On migration: they encouraged faculty to “re-engineer” courses rather than to simply copy them over from the old LMS. “If a course is really simple, it’s better to just recreate it in D2L. If it’s complex, or has a LOT of content, migrate it.”

Phased launch – don’t enable every feature at first – it’s overwhelming, and places an added burden on the training and support teams. Best to stage features in phases – the core Learning Environment first, then other features like Eportfolio, LOR, Analytics, etc… once you’re up and running.

Train the trainers first (and key überusers).

Work with early adopters – they will make or break adoption in a faculty.

They ran several organized, small configuration drop-in sessions, each focused on a specific tool or task. Don’t try to do everything in one session…

8 Months vs. 8 Weeks: Rapid LMS Implementation

Thomas Tobin – Northeastern Illinois University

First, any session that starts with a soundtrack and an Egyptian-Pharaoh-as-LMS-Metaphor thought exercise has GOT to be good.

Much of this should be common sense, but it’s super-useful to see how it lays out in the context of an LMS implementation…

On building the list of key stakeholders – ask each stakeholder “who else should we include? why?” – don’t miss anyone. they get cranky about that.

Identify the skeptics, and recruit them.

The implementation PM must have the authority to make decisions on the fly, or the project will stall.

Develop a charter agreement – define the scope, roles, goals, etc… so people know what’s going on.

The detailed Gantt chart is not for distribution. It has too much detail, changes regularly, and will freak people out.

Plan tasks in parallel vs. serial – break things out, delegate, and let them do their jobs. Success relies on multiple people working together, not single-worker-bottlenecks.

Use a gated plan for milestones and releases – and celebrate (small) successes. (but what do you do about failures?)

Designate 1 person as “schedule checker” – doesn’t have to be the project manager (actually, may be useful to be someone else…)

Assess existing and new risks regularly.

Do a “white glove” review – review and test all settings and features. So, the ePortfolio tool is supposed to be enabled. Is it? Were permissions set so people could actually use it? Were they trained? Does it work? etc…

Unified Communications at the Univ. System of Georgia

David Disney

Interesting talk – he pointed out the importance of making sure end users have current information on the status of tools/networks/services, so they’re not left guessing. I pointed out that if people have to monitor a status website to see how things are doing, that may be a symptom of larger problems…

They have a cool website for monitoring key services across the University System of Georgia.


Footnotes:

  1. we’re doing a small-scale pilot this Fall semester, without full integration to PeopleSoft. Winter 2014 will be a “soft launch” with anyone wanting to use it (and some faculties deciding to switch over 100% for January), and with our Blackboard 8 server sunsetted – we’re taking a page from Google’s playbook – in May 2014, so the Spring 2014 semester will be the first with 100% use of Desire2Learn. []
  2. because I sure wasn’t feeling confident about being able to say that before the Fusion conference… []
  3. well, some of the photos… []
  4. and the recent D2L acquisition of Wiggio should make the use of Groups REALLY interesting… []
  5. the Desire2Learn community site was just down for 2 days for a major upgrade to the latest version of D2L – will this kind of thing be necessary going forward from 10.2? []
  6. they mentioned this several times – with the ominous but unspoken question of “what’s up with the 31st institution?” []

the death of Google Reader has been greatly exaggerated

Using Marco Arment‘s handy dandy RSS feed-subscribers apache access log processing script, here’s the current breakdown of accesses by known RSS reader applications to my blog since 8am today:

[chart: breakdown of accesses to the blog, by RSS reader application]

The big spike on the far left? Google Reader. Still accounting for almost 77% of RSS-related accesses to my blog. Except no humans can see whatever it’s still indexing via GReader…

Marco found the same thing on his much-more-widely-read blog.

Google Reader appears to be a zombie process, obediently and tirelessly indexing RSS feeds, oblivious to the fact that nobody will be able to view the product of its work…
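The breakdown itself comes from Marco’s script, but the basic idea is easy enough to approximate by hand with a rough pass over an Apache combined-format access log – counting feed requests by user-agent string. This is just a sketch of the idea (the log path and feed URL are assumptions, not part of his script):

# count hits on the feed URL, grouped by user-agent string
grep ' /feed/ ' /var/log/apache2/access.log \
  | awk -F'"' '{print $6}' \
  | sort | uniq -c | sort -rn | head -20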

Update: Here’s a version with GReader removed:

[chart: the same breakdown, with GReader removed]

Newsblur (at almost 15% of RSS traffic to my blog), followed by Digg1, Yahoo! Pipes2, etc… The RSS landscape has changed rather dramatically in the last couple of weeks. The long tail of long tail readers…


Footnotes:

  1. which used to be dead, but was resurrected by the death of GReader… []
  2. I thought that was dead already… []

syncing Desktop across multiple computers

I treat my Desktop as “stuff I’m working on right now” and file things away into project folders after I’m done actually working on them. I also use 3 different computers, and a couple of iOS devices. How do I sync this active-work area across all of them? I use Dropbox, but this would work with any other file sync tool1.

It’s an easy trick, based on one I found on Lifehacker2. It’s also not necessary – it’s trivial to just leave the active-work files in the Dropbox directory, but then you have to go digging every. single. time…3

First, on one computer, create a Desktop folder in your Dropbox directory, at ~/Dropbox/Desktop (then wait for it to sync to Dropbox).

Then, in Terminal, run this on all computers you want to sync the desktop:

sudo mv ~/Desktop ~/Desktop.bak
sudo ln -s ~/Dropbox/Desktop/ ~/Desktop

This moves the “real” Desktop directory out of the way (it’s just renamed, so Finder won’t use it – all of the files are still safe) and creates a symbolic link at ~/Desktop – the place Finder looks – pointing to the ~/Dropbox/Desktop directory.

Then, log out (don’t save state – Finder has to relaunch fresh) and log back in again. Move any critical/active files from ~/Desktop.bak/ back onto the Desktop. Sync. Magic.

Not sure how this will work over time – I’m sure the Finder-specific invisible files will act up a bit, storing strange positions etc… for icons on the shared Desktop… Might also get extra funky if mixing platforms – I have no idea what would happen if Windows variants were thrown into the mix. Should be possible to share a Desktop across Mac/Win/etc… though.
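And if the shared Desktop does get too funky over time, backing out should just be the two commands above in reverse – a minimal sketch, assuming ~/Desktop.bak is still sitting there untouched:

# remove the symlink (the actual files stay put in ~/Dropbox/Desktop)
sudo rm ~/Desktop

# put the original Desktop directory back where Finder expects it
sudo mv ~/Desktop.bak ~/Desktop

Then log out and back in again, same as before, so Finder picks up the change.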


Footnotes:

  1. I recently gave up on OwnCloud for a bit because it was just too funky to trust 100% with my active working files – it gets confused by browser-based wifi authentication and self-destructs when it encounters that. So, I’m on Dropbox (again) for now. []
  2. but I had to modify it ever so slightly. The Lifehacker recipe wanted to sync the entire Dropbox directory to the Desktop. That’s just stupid. []
  3. and, yeah, you can just add that folder to the sidebar in Finder, but, again, clicking every time… []

sharing LMS requirements

Jen suggested many (many) months ago that I post the requirements we used in the LMS RFP on github so people could use them, fork them, etc…. A starting point, so every institution doesn’t have to reinvent the wheel each time. Considering the amount of time and effort she put into helping craft them, how can I possibly refuse? I can’t, that’s how.

So.

Generic-ish LMS RFP requirements, à la github.

Things I learned about using this set of requirements:

  1. WAY too many line items. When ranking responses, we got serious regression toward the mean – the numbers balanced out and the differences were de-emphasized. Pick the ones that mean the most to you. Ignore (or demote) the rest.
  2. Some of the items were way too specific, others, not specific enough. Yeah. Balance.
  3. The vendors provided great responses to all of the items. I know they must have been frustrated by the sheer number of line items, but they pushed through and provided what we needed. And now, they have responses to copy and paste as needed, so it should get easier for everyone…

I haven’t received explicit approval from The Management to share these, but they were included in a public document, so it shouldn’t be a problem… Openness and sharing and unicorns and butterflies etc…

Update: I chatted with our CIO about this, and he thought it was a great idea. Cool. Done.

on the new LMS

I’ve been working with people on campus for a long time to try to figure out what we need to do about our campus LMS. My oldest file for the endeavour was created on July 19, 2011. Seriously. Almost 2 years ago. We did a couple rounds of campus engagement1, ran an RFP, and wrote several reports. Provincial politics, budget crises and legal processes intervened, and here we are. The decision was formalized in the RFP system this afternoon, and it’s official: the University of Calgary has selected Desire2Learn as its next learning management system.

This is good for a few reasons:

  1. we can finally move past “so… do we have a new LMS yet?” to “yes. now what are you going to do with it?”
  2. we can finally stop focusing our support efforts on “but it doesn’t work on (insert name of current browser and operating system)” and “but file uploads don’t work” etc… Yes. It works. Moving on…
  3. D2L is used by almost all post-secondary institutions in Calgary – the only non-D2L post sec is MRU. Almost all of the city’s K12 runs on D2L (public and catholic boards run it, and most private schools). So, lots of opportunity for collaboration and sharing of resources for support/training/development.

We’re just working on the migration plan now – I’d drafted a version assuming a decision would have been made back in January. Yeah. The timeline isn’t entirely valid anymore. So… Now that it’s final, we need to put together an adjusted implementation plan and timeline. The optimistic plan is to start with a small-scale pilot for Summer 2013 semester (which starts next month!), and start large-scale migration of courses in Fall 2013 and Winter 2014. By Spring 2014, all courses will be run in D2L2. From conversations I’ve had with Deans and instructors from many faculties, the problem is going to be turning people away from the new system in order to get on our feet before running…

Those who know me may be surprised that I’m excited about the LMS. Yes, I really am. We need to provide quality tools that can be used by everyone, not just those who are geeky enough to duct tape together their own DIY non-LMS environments. The LMS is important in a social justice context – we need to provide equal functionality for all instructors and students, in all faculties. The LMS lets us do that. If people feel the need to develop their own custom tools, they can totally do so. But for most of the use-cases I’ve seen for custom tools3, they won’t need to.

This is important because with a current LMS in place, we can stop focusing on the tool. We can stop talking about shortcomings in the tool, and focus on teaching and learning. Awesome. Let’s do that.


Footnotes:

  1. focus groups, vendor demos, workshops, sandboxes, surveys, etc… []
  2. of course, this may prove to be unrealistically optimistic, so may need to be adjusted. again. []
  3. they were often implemented to fill perceived gaps in the previous LMS, rather than solving unique teaching-and-learning needs []