Posts tagged as: nptech

I’ve been working with nonprofit organizations on technology issues (strategy, implementation) for about 15 years now. I remember the heady days, when most nonprofits didn’t even have networks, and some of them still didn’t have internet access. In those days, most nonprofit techies were progressive, and we were sure that what we were doing was going to change the world for the better.

Now, 15 years later, I’m pretty sure I’m not changing the world. You’re still more likely to find a progressive nonprofit techie than a conservative one, but there are plenty of conservative ones now. Conservative causes of all sorts have discovered the power of the kinds of technologies I’ve been helping nonprofits with, and they are thoroughly au courant. Plenty of conservative organizations use Drupal, Salesforce, online fundraising, Facebook, and Twitter – using those technologies to push for ends that I have no interest in seeing become reality. You can bet that the 2012 and 2016 presidential elections will not repeat 2008’s massive differential in the use of technology and social networks.

I remember also, from those heady days, the idea that we could help nonprofits be more effective by encouraging them to be more proactive about replacing their hardware. We found out, not so much later, that the massive production (and disposal) of computer hardware fuels deadly conflicts and causes serious environmental damage.

And then there is the fundamental question – what is all this technology really for, anyway? I was reminded of this while listening to Marketplace on the radio a while ago. It’s worth remembering that one of the two motive forces behind all of this technology change is that businesses (and nonprofits, too) can squeeze more work out of fewer people. That would be fine if we had a great safety net, where people who were unemployed could be supported, and perhaps get free education so they could create art, music, or new and interesting things, but that’s not how the system works, is it? The second motive force is simply to empty your wallet so you can get shiny.

I still think I’m doing good. I still think that working with nonprofits to help them grapple with communications and data is good work, helps people, and is right livelihood. But I’m pretty sure I’m not changing the world by doing it.

I’m reminded, of course, of the famous Audre Lorde quote: “The master’s tools will never dismantle the master’s house.”

There may be other ways I’m helping to change the world, but you’ll have to read my other blog for that.


Betting the Farm

April 16, 2010

Countless nonprofits flocked to Ning to create social networks. Since I’m not a social media guru, I’ve generally kept my opinions about this to myself. But now that Ning isn’t free anymore, I’m going to carp some.

Over the course of lo these past few years, I have blogged or tweeted about this very phenomenon what feels like countless times. Nonprofits find services for free. They start depending on them. The free services disappear, for business reasons. The nonprofit community gets up in arms. Lather, rinse, repeat.

There is nothing wrong with software or services that don’t cost anything. Nothing at all. But if you are going to bet the farm, make sure you know what the risks are. Using free services is fine, but know why they are free. Are they free because the company behind them is an ad-revenue machine and über-profitable (Google)? Is it free because it’s open source (Drupal, Elgg, WordPress)? Is it free because it comes from a profitable company with a clear and well-defined donation program (Salesforce.com)? Or is it free because it comes from a startup in search of a business model (Ning)?

There is an effort afloat (and a petition) to get Ning to make nonprofit and educational accounts free. I’m not holding my breath. They eliminated 40% of their staff. They are feeling pinched, and they need to slow their burn rate. I don’t know how charitable that will make them feel. And even if they do relent, there is no guarantee that Ning will even survive.

Anyway, if you’re looking for a great social network management system that won’t get pulled out from under you, try Elgg. It’s open source, and out of the box it does just about everything Ning does, without the deep setup required to make Drupal behave like Ning. It has an active developer community, and it is growing.

Or, if you do go looking for another free service, make sure you understand the risks, and be prepared for possible disaster if it’s a startup in search of a business model.


Off to NTC!

April 5, 2010

Tomorrow morning, I’ll be leaving on a jet plane to Atlanta, Georgia, for the 2010 Nonprofit Technology Conference. This will be my 7th NTC since 2001 (or, more accurately, my 5th; two of those were Circuit Rider Roundups).

I’m looking forward to it. I’m speaking in two sessions: “Collaborative Problem Solving for Consultants,” organized by Robert Weiner, and “Earth to Cloud,” part of the fabulous Tech Track organized by Peter Campbell. I’m also looking forward to the Unconference on Open Data organized by NetSquared, and to seeing lots of old colleagues. I’ll probably be using FourSquare to check in to places (I’m still experimenting with that one).


Drupal and Salesforce

December 31, 2009

It’s taken me a while to write this blog post, mostly because I have been working hard at various things (like building a business and building new websites). This is the last installment in my CRM/CMS integration series, which started almost a year ago (wow!). And I’m skipping Joomla/Salesforce integration because there isn’t any publicly available documentation or code about the integration that PICnet did between Joomla and Salesforce, called J!Salesforce. [update: see Ryan’s comment below.]

So what is the state of Drupal/Salesforce Integration? It’s not as mature as the Plone/Salesforce integration, for sure, but it is coming along nicely. There are several contributed modules:

  • salesforce – the main module, with API, node, and user integration possibilities. It provides the basic Salesforce API connection (via SOAP), and includes field mapping and basic import/export.
  • sf_webform – makes integration with webforms in Drupal fairly easy. Web-to-lead is quite nice and flexible with this module.
  • uc_salesforce – provides integration with Ubercart orders.
  • parser-salesforce – integration with FeedAPI, pulling data from Salesforce into Drupal nodes via FeedAPI (I hope to start maintaining this module).
  • sf_import – imports Salesforce objects into Drupal nodes (will be folded into the main salesforce module).

All of these modules are in alpha or beta, although I know for a fact that some of them (or versions of them) are working on production sites. There are a fair number of bugs that need to be fixed before there is a stable release, and a bunch of outstanding issues that need a lot of work (caching, for instance). There are also two other related modules that don’t use the main Salesforce API module – one for Ubercart, and one for web-to-lead (called salesforcewebform). The latter has a stable release, but it only integrates webforms with leads, not other objects.
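Since web-to-lead comes up a couple of times above, it may help to see what it amounts to under the hood: Salesforce accepts new leads as an ordinary HTTP POST of form fields to a public endpoint, and the Drupal modules essentially map webform submissions onto that (or onto the SOAP API). Here is a minimal sketch of the mechanism in Python; the organization ID and field values are placeholders, and this illustrates the idea rather than what any of the modules above literally do.

```python
# Minimal illustration of Salesforce web-to-lead: an ordinary HTTP POST of
# form fields to Salesforce's public endpoint. The org ID below is a
# placeholder -- a real one comes from the Web-to-Lead setup in Salesforce.
from urllib.parse import urlencode
from urllib.request import urlopen

WEB_TO_LEAD_URL = "https://www.salesforce.com/servlet/servlet.WebToLead?encoding=UTF-8"

def submit_lead(first_name, last_name, email, company, org_id="00D000000000XXX"):
    fields = {
        "oid": org_id,            # your Salesforce organization ID
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
        "company": company,
        "lead_source": "Web",     # optional standard lead field
    }
    data = urlencode(fields).encode("utf-8")
    with urlopen(WEB_TO_LEAD_URL, data=data) as response:
        # Salesforce typically answers 200 even when a field is rejected,
        # so real integrations need more careful error handling than this.
        return response.status

if __name__ == "__main__":
    print(submit_lead("Ada", "Lovelace", "ada@example.org", "Example Nonprofit"))
```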

Right now, the salesforce module allows for integration of contact, lead, and campaign objects only, so that’s another big area that could use some work.

There is a good screencast done by one of the folks (Jeff Miccolis from Development Seed) who has worked a lot on this project.

I’d say that in a year, we’ll have a good solid module release, providing lots of features for integration between Drupal and Salesforce.com.


Got Research?

December 7, 2009

One of the great things about the nonprofit technology field is the collection of nonprofit organizations that provide what are often called “intermediary” services to other nonprofits: information and resources that help nonprofit organizations do the work they do in the world by helping them make good technology decisions.

I’ve been involved in one way or another with a number of these intermediary organizations. One of them, Idealware, is an organization whose goal is to provide the sector with unbiased, analytically developed reviews and information about software that nonprofits use in their everyday work. This is incredibly important stuff, and it’s darned hard work – I know, I’ve been involved in doing a bit of research for Idealware.

If we don’t have this sort of research in our sector, nonprofits won’t have that kind of analytical approach to software available to them – and it is much needed. As you might imagine, funding this sort of work doesn’t come easy; Idealware needs our help to be able to continue to provide great research.


Same crap, different day

November 9, 2009

I’m warning you – this is snarky.

I was only vaguely following the brouhaha over Causes leaving Myspace. Only vaguely, because I don’t really keep close track of the goings-on in the social networking space: it’s not my passion. I use social networks a lot, both for work and for personal use. I know they are becoming an increasingly important tool for nonprofits in communicating with their constituents, so I do keep them in my peripheral vision, for sure.

Anyway, in reading the varied reactions to this news, I had to just sigh, and then get annoyed. Sigh, because of what feels to me like the wasted energy the nonprofit sector has spent over many years using, hawking, and supporting proprietary tools and companies. Annoyed, because it seems the nptech community hasn’t figured this out, even after being hit over the head with it over, and over, and over again.

Make no mistake about it – Causes is a for-profit company, and they are making what is, I’d bet, a decision based entirely on economics. If you’ve read any of the gloomy news from Silicon Valley, this is just the beginning. Social ventures will not be immune to the blowing winds of economic distress.

If we keep building our nonprofit toolsets on proprietary software and for-profit web services, even if they are free (for now), we are going to be bitten by this over and over again. The only way out of this cycle is to channel this energy and these resources into open software (including “open” source apps for proprietary web services), open standards, and open networks – things no one can take away.

I love to write blog entries about successful open source efforts – like CiviCRM, or the amazing stuff people are doing in the mobile space. Writing blog entries about for-profit web vendors that make economic decisions that hurt nonprofits because we depend on them too much is just not fun.


Open Mobile Camp report

October 25, 2009

Yesterday, I spent the day in Manhattan, at the UNICEF building, with a bunch of folks passionate about the technology in mobile phones and the ways to use that technology for good. I’ve been a cell phone user for a very long time (I’ve had one since 1998), but I haven’t been involved in implementing a mobile system for an organization, so I had a lot to learn.

The place to find reports on what happened is the wiki. Also, check out the Twitter stream for the #omc09 hashtag.

I was especially interested in the issue of mobile data collection. (I was so interested, I facilitated a session.) And, even more specifically, I’m interested in how to leverage CiviCRM and mobile devices for a range of interesting applications. There are a number of ways to get data from mobile phones into a CRM – and all have advantages and disadvantages, depending on a lot of things.

  • Globally, what you can basically depend on is SMS. Smartphones haven’t made it into most of the developing world, nor have 3G networks. So how do you get SMS data into a database system like CiviCRM? You need an SMS gateway, plus a system such as RapidSMS to gather the data.
  • Use J2ME to write applications for mobile phones, and send the data via SMS to a central database.
  • A tool such as EpiCollect, which is an Android app.
  • A slimmed-down, simplified webform to be used on mobile browsers.

One thing that would facilitate this would be a more robust API system in CiviCRM – access to the data via REST or JSON, which would allow CiviCRM to talk with some of the tools out there like Mesh4X.
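To make the SMS route above a bit more concrete, here is what the glue between an SMS gateway and a CRM often looks like: a small webhook that receives each inbound message and forwards the parsed data over HTTP. This is a hypothetical sketch in Python (using Flask and requests); the gateway field names and the CRM endpoint are assumptions for illustration, not CiviCRM’s actual API, which, as noted, still needs a more robust REST/JSON interface.

```python
# Sketch of the SMS-to-CRM path: an SMS gateway delivers each inbound message
# to this webhook, which parses a simple "KEYWORD rest of message" format and
# hands it off to a CRM over HTTP. CRM_ENDPOINT and the payload shape are
# hypothetical stand-ins for whatever REST/JSON API the CRM exposes; the
# "from"/"text" form fields also vary from gateway to gateway.
import requests
from flask import Flask, request

app = Flask(__name__)
CRM_ENDPOINT = "https://example.org/crm/api/contact"  # hypothetical endpoint

@app.route("/incoming-sms", methods=["POST"])
def incoming_sms():
    sender = request.form.get("from", "")        # sender's phone number
    body = request.form.get("text", "").strip()  # message body, e.g. "JOIN Jane Doe"
    keyword, _, rest = body.partition(" ")
    payload = {"phone": sender, "keyword": keyword.upper(), "data": rest}
    resp = requests.post(CRM_ENDPOINT, json=payload, timeout=10)
    return ("OK", 200) if resp.ok else ("CRM error", 502)

if __name__ == "__main__":
    app.run(port=8080)
```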

I learned a ton. Thanks to MobileActive.org and the Open Mobile Consortium for a fabulous event.


Security Camera - Photo by Sirius Rust

Beth threw down the gauntlet, and I had to pick it up. I’m sort of surprised I hadn’t written about this before. I think a lot about both security and privacy, not so much for myself, but for organizations I work with whose work is fairly sensitive.

First off, some definitions – these two terms get mixed up quite often, and it is important to understand what they really mean in a technical context.

Security, in this context, is the concept that your personal computing resources and data are safe both from prying eyes and from hijacking by crackers and spammers who would use those resources and data for their nefarious ends. For the computing resources and personal data inside that box you call your laptop, or for your home or office network as a whole, security is a matter of using specific tools that keep unprivileged outsiders out. Wifi passwords, firewalls, password-protected fileshares, virus protection software, etc. are the tools of the trade here. The security of your private data that is “in the cloud” is largely at the mercy of the software developers who hold your data. Luckily, most of them take security quite seriously. (That said, your data “in the cloud” can be compromised by lack of security on your network or laptop – someone installs a keylogger, for instance, and grabs all of your passwords.)

Privacy, in this context, means that you can control, in a granular way, what information about you is exposed to whom. Privacy is, as Beth says, primarily a matter of human behavior, but there are very interesting intersections with technology and security. In some instances, services have default privacy settings that are a lot less private than someone might like, and it takes some know-how to figure out how to correct those settings. Privacy is also a set of decisions that get made, sometimes in haste or without much consideration. Once you make that drunken decision to post a picture of yourself (or a co-worker) dancing in your underwear on a table at a party, the cat is out of the bag, and it may never be put back.

Security and privacy in the context of online communities, as Beth points out, are different beasts. The software that drives online communities (such as Drupal, phpBB, and others) has options that allow for varied levels of security. You might need a password to see anything, or you might only need a password to make comments. You might not be able to just register for an account – you might need to go through an admin. These days, most community software has roles you can assign people to, with specific privileges granted per role.
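As a toy illustration of that role model (with invented role and privilege names, not any particular package’s API): each role carries a set of privileges, and a member’s access is simply the union of whatever their roles grant.

```python
# Toy role-based access model: each role maps to a set of privileges, and a
# user may hold several roles. Role and privilege names are invented here.
ROLE_PRIVILEGES = {
    "anonymous": {"view_public"},
    "member":    {"view_public", "view_members_only", "post_comment"},
    "moderator": {"view_public", "view_members_only", "post_comment", "delete_comment"},
    "admin":     {"view_public", "view_members_only", "post_comment", "delete_comment",
                  "approve_account"},
}

def can(user_roles, privilege):
    """True if any of the user's roles grants the privilege."""
    return any(privilege in ROLE_PRIVILEGES.get(role, set()) for role in user_roles)

print(can(["member"], "delete_comment"))               # False
print(can(["member", "moderator"], "delete_comment"))  # True
```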

But privacy is made up of policy (the policy of the organization running the community) as well as the behavior of the members – their collective agreement that “what happens in Vegas, stays in Vegas.”


Data Ecosystems

September 30, 2009

Not so long ago, nonprofit organizations had software tools that dealt with specific parts of their organizational process. They had fundraising tools, client management tools, volunteer management tools, HR tools, accounting tools, etc. And the data in these varied tools was siloed – there was no way for one tool to talk to another without:

  1. painstaking manual entry
  2. painstaking export/import processes
  3. tools written by the same vendor designed to talk to each other (which meant that they were generally exceedingly expensive)

Although many nonprofit organizations still find themselves in this situation, there are increasing numbers of tools available to help them out of it. And as more and more organizational processes become web-based (whether “in the cloud” or self-hosted), and as more and more nonprofit-focused software includes open APIs (with some unfortunate exceptions), nonprofit data is looking less and less siloed, and more and more like an ecosystem – many different software parts talking to one another.

NTEN is trying to get a bit of a handle on this with the Data Ecosystem Survey.

I’m very much looking forward to the results – looking to see where this new set of tools that can talk freely to each other is working … and where it isn’t, where there is still work to be done. Please take time to fill it out!


Beth Kanter tweeted about an article by Gale Berkowitz relating to evaluation, which I found really fascinating – it is a must-read. In this article, Gale points to an interesting shift within her organization (the Packard Foundation):

“Over the past four years we have been shifting from evaluation for proof or accountability (“Did the program work?”) to evaluation for program improvement (“What did we learn that can help us make the program better?”).”

In some ways, it’s a subtle shift – but as she says, the latter leads to “real time” evaluation – something that happens as one moves through projects, not just at the end.

Nonprofit organizations often have their feet held to the fire to evaluate their programs and projects, because funders and contributors often demand proof that those programs work. And there has been an overall movement in the sector in the direction of increased evaluation and learning. But in the community I’m a part of, the group of for-profit (“for-little-profit,” as is often said; we’re small and lean) companies that serve the technical needs of nonprofits, evaluation is generally not part of the process of the work we do. It should be.

I’ve talked about this before. A lot. In a variety of different contexts. To me, evaluation, both internal (“How could we have done this process better?” “How could we have worked together as a team better?”) and external, with the client (“How did we do?” “What could we have done better?” “How could we have communicated better?”), is a critical part of the work we do.

It’s a tough balance. We’re geeks, often buried deep in the command line, SQL, and code. We’re often extremely busy, juggling lots of projects and demands at once. The bottom line is, of course, always one measure of how well we are doing, but I don’t think that’s enough. As our sector as a whole moves further and further along the path of a commitment to evaluation and learning, I think it behooves us to follow.

So, you ask, what are good strategies to start with? I can give you what we try to do. Some of it is well worked out, and some is nascent. All of these we aim to do, but it’s easy to miss the target. Evaluation is a learning process, like anything else, and the most important thing is an intention and commitment to being a learning organization. The rest will eventually follow over time.

  1. Spend time at the beginning of each project outlining evaluation steps and process for the project.
  2. Spend time at the end of every project asking internally “what worked, and what didn’t work?”
  3. Ask clients at the end of the project a set of questions about the process and the result.
  4. If it’s an ongoing engagement, ask periodically (we aim for every 6 months or so) for an evaluation meeting or call with the client.
  5. Write a report at the end of each project with lessons learned.
  6. When a proposal isn’t accepted, ask a few questions, both internally and externally, and write up a short report with lessons learned.
  7. Ask internally how earlier lessons learned are being applied to current projects.
  8. Always be open to learning how to make things better.
