AI use may damage professional reputation, study says

New research from Duke University indicates that generative AI usage could damage professional reputations due to negative judgments from colleagues.

While generative AI tools can be used to boost productivity or to automate repetitive tasks, a study published in the Proceedings of the National Academy of Sciences (PNAS) shows that employees who use AI tools such as ChatGPT, Claude and Gemini at work could face negative judgements about their competence and motivation from managers and colleagues.

The Duke team conducted four experiments with more than 4,400 participants to examine both anticipated and actual evaluations of AI tool users, revealing a consistent pattern of bias against those who receive help from AI. The results were consistent across demographics, indicating that the social stigma against AI use isn't limited to specific groups.

Researchers Jessica Reif, Richard Larrick and Jack Soll, of Duke University's Fuqua School of Business, commented: "Our findings reveal a dilemma for people considering adopting AI tools: Although AI can enhance productivity, its use carries social costs."

The first experiment asked participants to imagine using either an AI tool or a dashboard creation tool at work. Those in the AI group expected to be judged as lazier, less competent, less diligent and more replaceable than those using conventional working methods. They also reported a lower likelihood of disclosing their use of AI to colleagues and managers.

Further experiments showed that participants consistently rated those receiving AI help as lazier, less competent and less independent than those receiving similar help from non-AI sources or no help at all.

The researchers also found that this bias can affect real business decisions. In a hiring simulation, managers who didn't use AI themselves were less likely to hire candidates who regularly used AI tools, while managers who frequently used AI showed the opposite preference, favouring the AI-using candidates.
