We held another Impact Data Chat focused on how data science and artificial intelligence (AI) can be used in social impact, evaluation, storytelling, and sustainability tracking. The ethical use of AI and data security and privacy issues were central to the discussion. It was a lively and insightful exchange of ideas among diverse participants from the foundation, corporate, and monitoring and evaluation sectors. Below is a list of AI tools that is constantly being updated. Please feel free to share other tools we should add.
General AI tools/technology
- Futurepedia.io is a directory of AI tools and a repository of AI news and trends. It is a good source of information on the latest developments in AI, and users can draw inspiration from the site on how to apply AI to their work.
- ChatGPT is an AI chatbot that can answer questions, provide explanations, and engage in conversations across a wide range of issues.
- Bard is Google’s experimental conversational AI chatbot, with functions similar to ChatGPT.
- DeepL is an AI-powered real-time language translator. The technology can translate across 29 languages (and counting).
- Otter.ai is an AI-powered transcription tool.
- Upshot.ai is a customer experience and gamification platform that uses AI aimed at increasing user engagement, retention, and monetization.
Innovations in Grantmaking and Impact Tracking Software
- Blackbaud automated translation in their grants management software
- b.World uses a range of AI tools in its software to expedite reporting processes, including drafting content for a project, logic model, or story using OpenAI, and visualizing data and logframe trends.
- Candid leverages AI in its news feed and also focuses on reporting on misinformation.
- Digital Science has a range of AI summary tools, including Dimensions for scientific publications, which also tracks the trustworthiness of the research.
- Google Crisis Response is a platform that provides critical and timely information and resources during times of disasters, natural calamities, and other crisis or emergency situations. The platform can provide interactive crisis maps, notifications/alerts and latest news about the emergency situation, missing person finder, and information on where to donate or provide relief assistance.
- ImpactMapper is developing equitable AI tools to automate aspects of social impact tracking, specifically coding text data such as grantee reports and evaluations. A user can upload their custom impact metrics, theory of change, or SDGs, and the tool suggests pieces of text that align with those outcomes, allowing for quicker analysis and visualization of trends.
- Impala is an AI-driven tool to explore and map relationships and networks in the nonprofit and philanthropic sectors.
- OSDG.AI is an open-source AI tool that helps users assign SDG labels to the content of uploaded reports or other text-based inputs to the platform.
- Salesforce embeds AI in its CRM and reporting features.
- SBC Impact Designer is an open-source digital platform that integrates AI. The platform supports social and behavior change (SBC) interventions by providing guidance on the design, planning, management, and evaluation of an organization’s SBC program. It is developed by Amplio, and the platform will be beta-tested in the second half of 2023.
- Signpost Initiative is the IRC’s humanitarian application that uses AI to provide accurate and timely information for people in crisis.
- Smart Simple uses AI to support drafting grant applications and translating applications and reports.
- Uli uses machine learning to hide abusive content, including gender-based, racial, and ethnic slurs, on Twitter.
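Several of the tools above (such as ImpactMapper and OSDG.AI) map text to outcomes like the SDGs. The toy sketch below illustrates the basic idea with keyword overlap; the labels, keyword sets, and function names are illustrative assumptions, and the real tools use trained models rather than keyword lists.

```python
# Toy outcome-tagging sketch: match a report sentence to SDG-style labels
# by keyword overlap. Purely illustrative -- tools like OSDG.AI use
# trained models, not hand-written keyword lists.

SDG_KEYWORDS = {
    "SDG 4 (Quality Education)": {"school", "education", "teacher", "literacy"},
    "SDG 5 (Gender Equality)": {"girls", "women", "gender", "equality"},
    "SDG 13 (Climate Action)": {"climate", "emissions", "carbon", "resilience"},
}

def tag_sentence(sentence):
    """Return the sorted list of labels whose keywords appear in the sentence."""
    words = set(sentence.lower().replace(".", "").split())
    return sorted(label for label, keys in SDG_KEYWORDS.items() if words & keys)

report = "The program improved literacy for girls in rural schools."
print(tag_sentence(report))
# → ['SDG 4 (Quality Education)', 'SDG 5 (Gender Equality)']
```

In practice, this matching step is where most of the nuance lies: synonyms, local languages, and context all defeat simple keyword lists, which is why the platforms above train models on coded text instead.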
Other Innovations and Use Cases for Storytelling, Impact Tracking, and Development Work
- Using machine learning with Python for predictive analysis, though this relies predominantly on statistical methods rather than AI.
- A participatory phone-based technology used for a humanitarian project. It served as a listening tool to quickly identify the priority needs of girls and boys during emergency responses.
- A transcription technology used as a monitoring tool to track hate speech or racial slurs in conversations on local radio stations; the data can be used in designing and implementing advocacy campaigns.
- A platform to conduct due diligence on grant applications and gather impact information. Combined with other technology (such as ChatGPT), the data gathered was also used for awareness-raising campaigns or marketing initiatives.
- Implementing an AI model that will help assess grant proposals in a non-biased way by taking into consideration local voices, situations, and nuances.
- A multi-language transcription tool, because current transcription technology can mostly accommodate only English.
- A platform that could suggest indicators aligned with certain outcomes, or key messages aligned with an organization’s communication strategy and target audience.
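The first use case above, predictive analysis with Python, can be sketched in a few lines. The data and variable names here are hypothetical; as noted above, this is ordinary statistics (a least-squares fit) rather than generative AI.

```python
# Minimal predictive-analysis sketch on hypothetical data: fit a simple
# least-squares line to past grant sizes vs. beneficiaries reached,
# then predict the outcome for a new grant amount.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

def predict(model, x):
    a, b = model
    return a * x + b

# Hypothetical historical data: grant size (USD thousands) -> beneficiaries reached
grants = [10, 20, 30, 40, 50]
beneficiaries = [120, 210, 310, 390, 500]

model = fit_line(grants, beneficiaries)
print(round(predict(model, 25)))  # → 259 predicted beneficiaries for a $25k grant
```

Real projects would typically use a library such as scikit-learn with many more features and validation steps, but the core "fit on past data, predict for new cases" loop is the same.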
Reflections on the practical use of AI and ethical considerations
- AI tools can help users be more efficient in many tasks, which is why they have exploded in popularity. However, we have not spoken enough about the ethical and security issues in our sector. Some participants in the data chat used AI chatbots for marketing purposes, interview transcription, and summarizing documents; some even shared that they created indicators with ChatGPT. We discussed the risks of users taking AI-generated responses at face value and simply copy-pasting the information they have gathered or creating reports from it. AI tools are known to hallucinate and share false information, to rely on biased training datasets, and to miss contexts that are central to social impact and sustainability work. Users should exercise extreme caution if they choose to do this. We recommend they do additional research, reflect on their organization’s strategic priorities, and verify and meaningfully adapt any AI-generated responses when creating indicators. At ImpactMapper, we do not recommend using ChatGPT or other open AI tools for indicator or grantee analysis purposes; see the following point on privacy and security.
- Many of the popular AI tools, like ChatGPT and Bard, continually learn from the data or information users enter into them (generative AI). The problem right now is that many people are unknowingly sharing private and confidential data. The ImpactMapper Founder has been speaking at many conferences on this issue lately. She has heard examples of evaluators entering interview and focus group data for summarization, foundation staff entering grantee reports for summarization and analysis, and so on. This material contains sensitive and confidential information that should not be shared with an AI platform like ChatGPT or Bard. This is a security breach, and information entered could show up in another user’s results later on, given how these free and open platforms work. As such, it is important to continually discuss responsible AI and security and privacy issues in your organization and to develop policies for your staff and contractors. If you are sending your data to ChatGPT or an OpenAI platform, you need to ensure that all personal and confidential information is stripped out first.
- In using AI tools, it is important to check the inclusiveness of the underlying database. It is essential to ask how and what data was sourced, and whose voices were or were not included in training the algorithm that the tool uses. It is then important to understand how the data will be shared, who has access, whether it could be shared back out to the public and in what form, and whether there are opt-out features so your data is not used for training purposes. For an article underscoring the potential of inclusive and equitable AI for the sector, read Leveraging data science and AI to promote social justice, sustainability and equity.
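The advice above about stripping out personal and confidential information before sending data to an open AI platform can be partly automated as a pre-processing step. This is a minimal sketch under simplifying assumptions: the regex patterns catch only obvious identifiers (emails, phone numbers), and real redaction also needs named-entity recognition and human review.

```python
import re

# Minimal PII-stripping sketch: mask obvious identifiers before text is
# sent to an external AI platform. Illustrative only -- names, addresses,
# and other sensitive details need named-entity recognition and review.

PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "[PHONE]": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text):
    """Replace each matched identifier with its placeholder label."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

note = "Contact grantee lead at maria@example.org or +1 555 867 5309."
print(redact(note))
# → Contact grantee lead at [EMAIL] or [PHONE].
```

Even with such a filter in place, the safer organizational policy discussed above still applies: decide explicitly which categories of documents may ever be sent to an external platform, and keep grantee and interview data out of that set.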
By: Alexandra Pittman, PhD, Founder, ImpactMapper