Using generative Artificial Intelligence tools in Public Relations: Ethical concerns and the impact on the profession in the Romanian context.
- Author
- Cusnir, Camelia and Neagu, Anamaria Nicola
- Subjects
- GENERATIVE artificial intelligence; ARTIFICIAL intelligence; PUBLIC relations personnel; METHODS engineering; PUBLIC relations
- Abstract
The controversy surrounding ChatGPT has reopened the debate about the impact of new technologies in many fields of activity, including communication and PR. This study mapped Romanian PR practitioners' use of generative AI and their perceptions of it, with a special focus on the ethical concerns involved and the implications for the profession itself. We took a mixed quantitative-qualitative approach, using both a survey and semi-structured interviews. Our goal was to determine the impact of generative artificial intelligence (AI) on the Romanian PR industry and to understand the reasons for, and challenges of, integrating generative AI into PR practice. The survey findings revealed substantial adoption of AI (73.5%) within the Romanian PR community, with an overwhelming 91.6% of adopters using ChatGPT. The satisfaction level was remarkably high, with 92% expressing satisfaction with the efficacy of generative AI applications. Reported benefits included time savings, work simplification, and the reduction of repetitive tasks. Surprisingly, not only did 67.3% of respondents not perceive AI as an immediate threat to PR jobs, but 80.5% believed AI represents an opportunity for the industry. Indeed, almost all our interviewees reported relief and satisfaction when using generative AI tools to complete their tasks. However, some concerns were expressed regarding the quality of generative AI content and, in particular, the need for a human editor to always check such content before it is used. Moreover, PR professionals' main ethical concerns relate to transparency towards their clients when AI tools are used to produce different types of content. [ABSTRACT FROM AUTHOR]
- Published
- 2024