Is your charity ready for the age of automation?
The age of automation has arrived. It encompasses an array of technologies, including robots, chatbots, artificial intelligence, machine learning, conversational interfaces, smart devices, drones and self-driving cars, which are increasingly becoming the interface between organisations and humans. These technologies have been in development for decades, but they are now part of our daily lives and will continue to have a significant impact on our work.
Charities are beginning to explore how bots, robots, machine learning, artificial intelligence and other technologies can support mission-driven work and civil society, as well as fundraising and marketing strategies.
What are the benefits?
Bots and other technologies can add huge value by completing basic tasks for us and freeing up our time for other pursuits. Bots can help charities do more with fewer staff and less time. I don’t know about you, but I have never heard charity professionals complaining about having too much idle time!
In fact, some of these new technologies can perform tasks that surpass human capabilities. For example, artificial intelligence can run facial recognition analysis on hundreds of thousands of photos in minutes. (That technology is already here and is used on Facebook to tag photos of people.) Recently, a robot designed to flip burgers for a fast food restaurant had to be reprogrammed because humans could not keep up with its pace.
What are the downsides?
There is also a dark side, popularised by evil-robot science fiction books, films and television shows. It goes something like this: an AI-powered robot becomes smarter than its human creators and takes over the world. While that remains a remote possibility, experts are concerned that robots and bots programmed with AI will be able to learn in ways that we cannot completely understand, or take actions resulting from that learning that we cannot control.
There are some real challenges that need to be considered, including ethical issues and the loss of jobs to robots. Experts predict that, within the next five years, robots and bots could eliminate repetitive jobs, or the repetitive parts of jobs, across many industry sectors.
Let’s take a look at a few examples:
- Major Gifts Officer: A new platform called First Draft, developed by Gravyty, uses AI to identify prospects in an organisation’s database and draft emails to them. These “unassigned” prospects immediately become “assigned” because staff members with minimal training can substitute for gift officers and manage the cultivation process. It is a huge time saver, but how does that help sustain a talent development pipeline of experienced gift officers who understand donor cultivation at all levels?
- Human Resources: The chatbot Spot mimics a confidential conversation with HR about sexual harassment, leading to the generation of a report. It is hoped this bot can remove some of the stigma associated with making a formal complaint – and lead to a safer, more open environment for all employees.
- Legal Counsel: The world’s first robot lawyer is a chatbot that can help you fight a parking ticket – and is just the beginning. Robot lawyers probably won’t dispute the finer points of copyright law or write elegant legal briefs just yet. Experts suggest that chatbots could be helpful in certain types of law such as bankruptcy, divorce disputes, and other areas that typically require navigating lengthy and confusing statutes that have been interpreted in thousands of previous decisions.
Given that charities are so under-resourced, replacing interns with bots might provide a big boost in productivity for some. But it requires careful thought. As automation begins to change workplaces, it doesn’t mean that boards should slash staffing costs in favour of automation. Instead, it is important to consider how to channel human knowledge.
The question to ask is: “How do we make the most of all of that knowledge, experience and expertise, even if the mechanical aspects of their work have gone away?”