Professional Development
Practical AI for Fundraising and Beyond: A Plug In to Technology Preview
By Rachel Du | April 18, 2024
This May, the Apra community will gather virtually for Plug In to Technology, a two-day interactive event featuring diverse sessions aimed at harnessing the power of technology in the nonprofit sector.
Gayle Roberts, chief development officer at Larkin Street Youth Services, will host “Revolutionizing Nonprofit Impact: Practical AI for Fundraising and Beyond,” a presentation and discussion about the full potential of artificial intelligence (AI).
Below, Roberts shares her advice for navigating the change this emerging technology is driving, advice she will expand on during her session on Tuesday, May 21.
What motivated Larkin Street Youth Services to explore the use of AI in its operations?
Upon joining Larkin Street as chief development officer five years ago, I faced the challenge of raising funds without a major gifts officer, despite having a substantial pipeline.
At the time, the agency relied primarily on grants and sponsorships, as well as special events and direct marketing for private revenue. The recent adoption of Salesforce as a customer relationship management (CRM) platform was a step forward, yet we struggled to leverage its full potential due to limited in-house expertise. This changed with our adoption of Raise by Gravyty three years ago; it revolutionized our approach to managing our major donor pipeline by automating and personalizing outreach efforts.
This innovation, alongside project management and email management tools, significantly enhanced our operational efficiency and productivity. Being a fundraising team that embraced technology prepared us to be an early adopter of generative AI. Using technology to amplify our fundraising efforts helped us nearly double private revenue over the past half-decade, raising nearly $13 million last fiscal year alone.
Can you provide insights into how nonprofits can begin to leverage AI, particularly those with limited resources and technical expertise?
My short answer is: Just start.
My longer answer is that AI tools offer all of us the opportunity to achieve more with fewer resources, enhance speed and efficiency, and ultimately drive greater outcomes for the causes we're passionate about. Many of these tools, including well-known generative AI platforms such as OpenAI's ChatGPT, Anthropic's Claude and Google's Gemini, are accessible for free or at a minimal cost. Each platform has its own strengths and limitations, but I advise against getting too wrapped up in the ongoing debates over which one reigns supreme.
The greater challenge lies in navigating and leading through the coming cultural changes within organizations and communities. This includes addressing individual objections, educating staff, establishing sound policies and procedures, and demonstrating the intrinsic value of these technologies.
To be clear: Change is coming more rapidly every day, and these AI solutions are becoming more powerful. It's incumbent upon you as an individual, and upon those who manage teams and lead organizations, to ensure staff and communities are prepared and not left behind on the wrong side of the digital divide.
In what ways do you ensure the ethical use of AI within your organization, particularly concerning data privacy issues?
Selecting the appropriate software solution and refraining from inputting confidential data can mitigate most privacy concerns. It’s essential, however, to back this up with clear policy guidelines and consistent training for users.
A more nuanced challenge lies in addressing AI systems’ inherent biases and accuracy issues. Given that we’re still in the nascent stages of widespread, consumer-grade AI adoption, it’s crucial to acknowledge that these technologies, trained on varied data sets and instructions, are prone to biases and inaccuracies. At Larkin Street, I often encourage my team to leverage generative AI as a collaborative tool for brainstorming, writing and editing — starting with prewritten information we trust to be accurate. Legal issues surrounding AI use are still evolving, emphasizing the need for vigilance.
Looking ahead, what do you see as the future of AI in the nonprofit sector, and how do you plan to continue learning and adapting as new AI technologies emerge?
Advances in AI are accelerating, with the most significant strides being made in the private sector. The nonprofit sector, often underfunded, decentralized and more reliant on human labor, is lagging behind in adopting AI technologies. This situation presents both a challenge and an opportunity: it may exacerbate the very social inequalities many of us aim to mitigate, yet it also allows for a more deliberate and thoughtful integration of AI into our operations. My message to everyone reading this: AI adoption is not a question of if but of when. Now is the time to start preparing your organizations, staff and communities for the future.
Leaders and advocates must navigate these uncertain times proactively. The fundamental steps are to develop a clear AI usage policy, prioritize data security, foster AI literacy and promote ethical standards. These efforts should extend beyond organizational boundaries to include advocacy for sensible AI regulation and safeguards.
When one of my team members came to me last year to express their concerns about their future job security, I reminded them that when we reached our fiscal budget goal in February last year, the board didn't tell us to take the next four months off. We kept raising money. The same will be true at most nonprofits; there is always more work to be done. The nature of the work may evolve more rapidly in some places than in others, but individuals and organizations that remain focused on our missions, staff and communities will continue to thrive in this changing environment.
Make plans to join Roberts and other top-notch presenters at Plug In to Technology, taking place virtually May 21-22. Register to secure your spot today.
Rachel Du
Director of Research, Prospect Management and Analytics, Bon Secours Mercy Health
Rachel Du is a member of the Apra Content Development Committee.