Artificial Intelligence and the Future of Impact Assessment
As in other fields, Artificial Intelligence (AI) is expected to affect Impact Assessment (IA) systems worldwide. This study explores the opinions and concerns of international IA experts regarding the benefits and threats that AI, particularly ChatGPT, may pose to IA. Twenty-five semi-structured interviews were conducted with IA experts, including consultants, regulators, and academics. The experts believed that AI will help reduce the time and effort required for preliminary data collection and will certainly improve report quality in terms of formatting and sentence structure. Moreover, AI techniques may help in impact prediction, and regulators may also use them for IA evaluation and monitoring. However, the experts also feared that the quality of data in reports and public involvement in the IA process may be compromised, that the chances of plagiarism and bias may increase, and that the quality of IA graduates produced by academic institutions may deteriorate. The majority of experts were concerned that AI will pose more threats to IA than the benefits it may offer. As suggested by the interviewees, policy makers, keeping these concerns in mind, need to formulate laws, rules, and guidelines on the use of AI in IA in their respective jurisdictions. Importantly, the International Association for Impact Assessment is also preparing international best practice principles for the use of AI in IA, which will serve as guidelines for different countries. Since the study is based on the opinions of international experts, the results should be of interest to policy makers around the globe. Future studies may evaluate the use of other chatbots, such as Google's Bard or Microsoft's Bing, in IA, as well as the AI techniques used for impact prediction. Additionally, a follow-up study a couple of years from now could evaluate the AI-related laws and guidelines drafted by IA systems across the globe and their implementation.