Half of all employees are using Shadow AI (i.e., AI tools not issued by their company), according to a new study* of the AI habits of 6,000 knowledge workers.
The research goes on to show that personal AI tools are so valuable that nearly half of workers (46%) would refuse to give them up, even if their organization banned them completely. This is a powerful signal that organizations need more robust and comprehensive AI strategies if they are to avoid inviting significant risk into their business.
Steve Ponting, Director at Software AG, commented: “If 2023 was a year of experimentation, 2024 will be defined as the year that GenAI took hold. While 75% of knowledge workers use AI today, that figure will rise to 90% in the near future because it helps to save time (83%), makes employees’ jobs easier (81%) and improves productivity (71%). As usage increases, so does the risk of cyber attacks, data leakage or regulatory non-compliance. Consequently, business leaders need to have a plan in place for this before it’s too late.”
The survey also found that AI not only has a day-to-day impact on individuals, but that nearly half (47%) of workers believe these tools will help them get promoted faster. This suggests a future where AI tools are wholly ingrained in many roles because they are critical to job success.
The AI utility gap
Most knowledge workers said they use their own AI tools because they prefer their independence (53%). An additional 33% said it’s because their IT team does not currently offer the tools they need. This suggests that if businesses want their employees to use officially issued tools, a different process is needed for determining which ones are actually made available.
Risk management
Most employees aren’t blind to the risks of their AI choices, and large proportions recognize cybersecurity (72%), data governance (70%) and inaccurate information as potential pitfalls. However, businesses should be concerned that few employees take adequate precautions, such as running security scans (27%) or checking data usage policies (29%).
J-M Erlendson, Global Evangelist at Software AG, added: “There is some comfort that regular users of AI are better prepared to mitigate risks compared to occasional users. This fact alone should encourage organizations to implement more rigorous training programs, because many still don’t have anything robust in place. We need this now, because the future – where 90% of workers use AI – is just around the corner and will bring more of the occasional users, which is a problem. This group is far less adept at taking risk management precautions compared to their more experienced counterparts – but they’re just as likely to take the risks.”
He continued: “Shadow AI is not going anywhere, but it is supercharging the operational chaos already engulfing many organizations. A transparent framework for their processes, coupled with an understanding of the tools employees want – and the training they need – are good building blocks for better incorporating Shadow AI. It’s clear that AI is not going away, and, collectively, we need to address it in the right way now.”
*Research conducted by Software AG