Ensuring ethics and creativity in AI-driven PR

By Mbali Khumalo, account manager, Tribeca Public Relations

Public relations has always been, and continues to be, a competitive, fast-evolving industry. As with all fast-paced professions, staying ahead of the curve requires being a quick – if not first – adopter of the tricks of the trade that will give you the edge over the rest. Enter artificial intelligence, a powerful tool that is reimagining ways of working – to the excitement of some and the trepidation of others.

The question of AI for public relations is not one of its usefulness. Its impact is similar across businesses, academia and even our personal lives. It can instantly analyse and summarise vast amounts of data, generate ideas for everything from business names to event themes – the list goes on. For public relations consultants, the question of AI is far more nuanced: can we use it ethically and, if so, how?

Confidentiality is key

A crucial ethical consideration that is easily forgotten in the haste to use generative AI like ChatGPT is confidentiality. When you give an AI tool a prompt, you aren't only enabling it to generate a response for you. You may also be feeding its training data, helping it respond to similar questions asked by its millions of users around the world.

If you can't share the details of the groundbreaking concept you've developed for a client with your friends over drinks, it stands to reason that you shouldn't share it with the more than 200 million average monthly users of everyone's favourite chatbot.

AI prompts are an interesting part of our relationship with these tools. Many people simply copy and paste their information, tack a question on at the end and wait for the technology to do its thing. However, as professionals who have access to intellectual property and market-sensitive information long before it reaches the masses, PR consultants should think twice about the kinds of prompts they provide to generative AI. Clients need to be able to trust that their sensitive information is appropriately guarded from prying eyes and algorithms.

Use big data to your benefit

The great thing about generative AI tools is that they have already been well fed by swathes of publicly available information, from social media to news articles and published research. A major part of being a successful PR consultant is listening – on both an interpersonal and a wider scale. From a macro perspective, AI tools are great at cutting through the static, providing users with clear and concise answers to what they're looking for.

Say you're running a campaign to get more young people to vote and first need to understand what is driving youth voter apathy. ChatGPT can synthesise information from hundreds of thousands of sources on the topic, which can help inform ideas and approaches. Inform being the operative word here.

These kinds of tools are great for providing a starting point or even helping to overcome idea blocks. Integrating them into ways of working like this can help consultants be more efficient. Things get ethically murkier when AI does the bulk of the thinking and doing, rather than acting as a strategic complement to your work.

You still have the creative edge

Much of the AI software we currently have access to is yet to replicate the style, soul and feel of creative expression. It isn’t difficult to identify the clunky and mechanical writing of generative AI. While it can technically get the job done of writing a social media post or article, its output is hardly memorable or moving.

Creativity is a fundamental pillar of public relations which, at its core, is about connecting people to businesses, ideas and each other. At the risk of sounding like a Luddite, those of us in creative industries should keep a watchful eye on technology that replicates aspects of our role, especially in an unregulated environment.

There is room to embrace the advancements and growing accessibility of AI, without pushing our ethics to the side or trading creativity for convenience.
