The invention of the first modern steam engine, at the beginning of the 18th century, heralded the transformation from an agrarian to an industrial economy. Steam engines could be powered by either wood or coal, but coal quickly became the preferred fuel because it is denser and therefore more efficient. Coal enabled the massive growth of industrialization, making it an essential ingredient of that era's economy. Fossil fuels were a catalyst for industrial growth, and in the coming 100 years, data will be of the same significance.
In fact, it is of the same significance already, although the nonprofit industry has yet to seize the opportunity already realized by so many private, for-profit companies. One of the key differentiators between the data available to nonprofits and that available to for-profit companies is that for-profit industries have access to technology that captures data alongside the evolving behaviour of their customers. In other words, they generate their own datasets.
Recently, big data companies have come under fire for their use of such data; one of the main objections raised by critics is the lack of transparency in how data is processed. This transparency should be high on the priority list of things you look for in order to understand how your data is being used, distributed, and shared. Procurement departments should not necessarily seek to understand how machine learning actually works, because it can be extremely complex; there is a difference between technical questions and ethical questions. Ethical decisions are made by people, not technology.
For example, collecting data that facilitates the delivery of impact to a donor is a way big data can enhance stewardship, and it demonstrates an ethical use of big data, in contrast to companies that may use such data to manipulate consumer behaviour. Personalizing communications is proven to enhance the customer experience, and the same can be said for personalizing stewardship for donors. Stewardship is proven to increase giving, but the technology to personalize stewardship at scale did not exist 10 years ago.
Nonprofit organizations have certainly placed increasing importance on data and analytics in the last few years, but the question then becomes: what kind of data is being considered? Most nonprofits have traditionally recorded data for compliance, not necessarily with the intention of putting it to use with technologies like artificial intelligence. More importantly, the data available to these organizations mostly consists of demographic, wealth, and donation data.
There are a few problems with this.
1) Demographic, wealth, and donation data is judgement-based data predefined by society, whereas behavioural data is defined by the independent choices and actions of individuals. That makes behavioural, engagement-based data a far better indicator of which causes donors care about or are interested in.
2) Until recently, nonprofits did not have access to the tools that for-profit companies have to capture, centralize, and generate the data they need for AI, including behavioural data and metadata, or data about data (for example, the time between donations).
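To make the metadata idea concrete, here is a minimal sketch in Python of deriving "data about data" from a donation history. The dates are hypothetical; the point is that the intervals between gifts are not stored anywhere in the raw records, yet they can be computed from them and are exactly the kind of behavioural signal a model could learn from.

```python
from datetime import date

# Hypothetical donation history for a single donor: a list of gift dates.
donations = [date(2023, 1, 15), date(2023, 4, 2), date(2023, 6, 30), date(2024, 1, 5)]

# Metadata ("data about data"): the gaps between consecutive donations, in days.
intervals = [
    (later - earlier).days
    for earlier, later in zip(donations, donations[1:])
]

print(intervals)  # [77, 89, 189]
```

A shrinking or growing gap between gifts is a behavioural pattern that demographic and wealth data alone cannot reveal.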
So what happens when nonprofits need to keep up with the rest of the world? Should they not have equal opportunity to utilize the cutting-edge technologies already being adopted in the for-profit sector? Artificial intelligence has enormous potential to make an incredibly positive impact on the nonprofit sector, and thus on the world.
Unfortunately, nonprofit organizations have taken the advice that in order to implement AI, you have to structure your data. The problem is that all of the AI best-practice manuals are written by businesses, and the businesses truly qualified to implement some type of artificial intelligence at scale... already have that data at scale, or they are generating it live.
So herein lies the third problem:
3) Existing AI systems for nonprofits presume you have the data that you need.
What happens when you don’t have the data?
For the nonprofit industry to follow the conventional advice on AI implementation, it would need to have the dataset to begin with. But because nonprofits do not have the data that AI requires, the first step for any nonprofit that wants to implement AI is not structuring the data it already has for the systems that already exist, but developing a data generation infrastructure.
Rethinking your nonprofit organization’s data infrastructure is necessary to achieve better results; otherwise you will find yourself with ever less information from which to make smarter decisions. Over several years, that lack of relevant information will compound and make it exponentially harder to improve.
Alternatively, if you focus on laying the foundation necessary for constant improvement, the results have better odds of improving each time, because you have the information required to make those changes. A centralized, intelligent data-capture infrastructure will future-proof your organization and lay the groundwork for what any nonprofit will need to thrive in a data-dependent world.
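As an illustration only, a data generation infrastructure of the kind described above could start as simply as logging every donor interaction as a timestamped event. The function and field names below are hypothetical, not a reference to any particular product; the sketch just shows how behavioural data accumulates when capture is centralized.

```python
import json
from datetime import datetime, timezone

def capture_event(log: list, donor_id: str, event_type: str, **details) -> dict:
    """Append one behavioural event (e.g. an email open, a page view,
    a donation) to a centralized event log. Field names are illustrative."""
    event = {
        "donor_id": donor_id,
        "event_type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "details": details,
    }
    log.append(event)
    return event

event_log = []
capture_event(event_log, "donor-001", "email_open", campaign="spring-appeal")
capture_event(event_log, "donor-001", "donation", amount=50.00, currency="CAD")

# Events accumulate into the engagement dataset that AI could later learn from.
print(json.dumps(event_log, indent=2))
```

Recording interactions this way, rather than only final donation totals, is what turns compliance records into the behavioural dataset the article argues nonprofits currently lack.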
Bio: Kristopher has over 18 years of marketing experience in Canada and the USA and 8 years of experience in fundraising for Canadian charities. With an emphasis on multi-channel direct marketing, Kristopher has managed over $7 million in annual donations, integrating direct mail, digital (including predictive modelling), face-to-face, and telemarketing strategies to drive growth and lifelong donor journeys.
“The concept of digital fundraising today must include predictive modelling/machine learning. Including machine learning in the mix ensures that you’re driving down your cost of funds raised while ensuring that no donor feels overlooked because you’re providing meaningful, personalized stewardship touch points at the right time in their donor journey.”
-Kristopher Gallub, Fundmetric Fundraising Liaison