Ask the Ethicist: How Can I Ensure Donor Data Privacy When Using AI?
By The Apra Ethics & Compliance Committee | May 23, 2024
Dear Readers,
As we continue to explore the intersection of artificial intelligence (AI) and fundraising, it's crucial to address the ethical implications of deploying AI solutions in our work. In previous columns, we've discussed the potential benefits and pitfalls of AI in fundraising, from optimizing donor segmentation to generating content with tools like ChatGPT, and our community has expressed both excitement and apprehension about integrating AI into its practices. While AI holds promise for enhancing efficiency and effectiveness, it also raises significant concerns, particularly regarding donor privacy.
However, amidst this dialogue, one fundamental principle remains clear: donor privacy concerns are ethics concerns.
When leveraging AI technologies, especially in handling donor data, it's essential to prioritize ethical considerations and safeguard donor privacy. As prospect development professionals, we must follow best practices in data management, ensuring that donor data is never shared with external AI tools except through a trusted vendor under a contracted partnership. Additionally, anonymizing donor data before it reaches any AI tool helps mitigate privacy risks and uphold ethical standards.
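To make this concrete, here is a minimal sketch in Python of one way donor records might be pseudonymized before any data leaves a secure system. The field names, records and salt value are hypothetical, and a real implementation would follow your organization's own data governance policies.

```python
import hashlib

# Hypothetical donor records; the field names are illustrative only.
donors = [
    {"name": "Jane Example", "email": "jane@example.org", "total_giving": 25000},
    {"name": "John Sample", "email": "john@sample.org", "total_giving": 1200},
]

SALT = "rotate-this-secret"  # kept internal, never shared with the tool

def pseudonymize(record):
    """Replace directly identifying fields with a salted hash so the
    research team can match results back internally, while the shared
    record carries no readable identity."""
    token = hashlib.sha256((SALT + record["name"]).encode()).hexdigest()[:12]
    safe = {"donor_token": token, "total_giving": record["total_giving"]}
    return token, safe

lookup = {}        # internal map from token back to the full record
safe_records = []  # what may be sent to an external, contracted tool
for donor in donors:
    token, safe = pseudonymize(donor)
    lookup[token] = donor
    safe_records.append(safe)

print(safe_records)  # identifying fields (name, email) never leave
```

Because the token is derived with a secret salt held internally, the team can reconnect results to the right donor after the fact, while the records that leave the secure boundary remain unreadable to anyone outside it.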
The most practical and acceptable way to ensure donor privacy is to utilize AI tools in a controlled environment. This can be accomplished through in-house development or by contracting with a trusted vendor. This structure mirrors how organizations have handled sensitive information with other vendors for decades, such as wealth screenings, prospect scoring models and email address appends.
For example, organizations preparing for a fundraising event expect the prospect research team to compile dozens of individual donor biographies. This is no small task, yet it is a perfect use case for generative AI. We are at a remarkable moment: it is now entirely possible for the prospect development team to generate dozens of comprehensive biographies in a matter of seconds using an AI tool.
Imagine the prospect development team licenses a large language model (LLM), such as GPT-4, for in-house use. This is the same model family that powers ChatGPT, but in this context it operates within a closed, protected system. The prospect researcher feeds the list of donor names into the AI tool and specifies the information needed, such as brief biographies encompassing giving history, personal interests and summaries of past contact reports. The AI engine draws on all of the available internal data to securely generate personalized biographies at whatever level of detail was specified. It's essential for the researcher to carefully review the output, ensuring that the information provided is both reasonable and accurate. Any necessary edits or adjustments can be made to the biographies before they are used for donor engagement purposes.
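To illustrate the workflow, here is a rough sketch in Python, assuming the model is exposed through an OpenAI-compatible client at a privately hosted endpoint. The base URL, API key, model name and donor fields are all placeholders, not a specific product's configuration.

```python
from openai import OpenAI

# Assumption: the model runs inside the organization's own infrastructure
# at a private endpoint; the URL, key and model name are placeholders.
client = OpenAI(
    base_url="https://llm.internal.example.org/v1",
    api_key="INTERNAL-KEY",
)

def draft_biography(donor):
    """Ask the in-house model for a short draft biography.
    The output is a starting point that a researcher must review."""
    prompt = (
        "Write a brief donor biography covering giving history, "
        "personal interests and summaries of past contact reports.\n\n"
        f"Name: {donor['name']}\n"
        f"Giving history: {donor['giving_history']}\n"
        f"Interests: {donor['interests']}\n"
        f"Contact notes: {donor['contact_notes']}"
    )
    response = client.chat.completions.create(
        model="internal-gpt-4",  # placeholder for the privately deployed model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Hypothetical record pulled from the CRM inside the secure environment.
donor = {
    "name": "Jane Example",
    "giving_history": "$25,000 lifetime; annual fund donor since 2015",
    "interests": "scholarships, community health",
    "contact_notes": "Met with gift officer in March; interested in endowments",
}

draft = draft_biography(donor)
print(draft)  # reviewed and edited by a researcher before any donor use
```

Note that the donor data never travels beyond the organization's own systems in this setup; the same review-and-edit step described above still applies to every draft the model produces.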
In this example, AI operates within a controlled, secure environment, protecting donor data throughout the process.
While AI presents exciting opportunities for innovation in fundraising, it also poses ethical challenges that cannot be overlooked. By centering donor privacy and taking a proactive approach to AI integration, organizations can deploy AI solutions securely and ethically, maintaining trust and integrity while saving countless hours of manual work.
Stay tuned for future discussions on AI and best practices in fundraising ethics.
Yours in Data Privacy,
The Ethicist
The Apra Ethics & Compliance Committee
The Apra Ethics and Compliance Committee monitors current ethics and privacy trends and issues, while offering timely guidance to the Apra and broader philanthropic communities. The committee is responsible for writing articles, presentations and webinars, as well as creating and updating practical toolkits and guides related to ethics in fundraising. Learn more about the committee online.