Privacy practitioners are increasingly tasked with creating and enforcing trustworthy programs to safeguard individuals’ data and ensure their organizations remain compliant within an evolving patchwork of global regulations. A volatile technology landscape, influenced heavily by artificial intelligence, only heightens this challenge.
ISACA’s State of Privacy 2026 report gathers insights from more than 1,800 global privacy professionals, exploring trends in privacy staffing, operations, breaches, privacy awareness training, privacy by design and use of AI tools by privacy professionals. Here are five key takeaways from the report:
- Privacy teams are shrinking. Survey data shows a median privacy staff size of five among respondents, down from eight a year ago, with technical privacy roles particularly understaffed. Given the rising stress that most practitioners report experiencing in recent years – as well as increasing challenges related to AI and the regulatory environment – this thinning of privacy teams is particularly concerning.
- The regulatory environment is posing significant challenges. Understanding the laws and regulations to which the organization is subject is among the top three skills gaps respondents identified. Additionally, the complex international legal and regulatory landscape is considered the second most common privacy program obstacle. However, regulations can also provide privacy teams with a shared understanding of what to focus on, as reflected by the EU's General Data Protection Regulation (GDPR) being the most common law/regulation used by respondents to help manage privacy in their organization.
- AI usage in the privacy function is not yet the norm. Only 13% of respondents report they currently use AI in their privacy function, while 38% plan to use AI within the next 12 months. Both figures represent slight increases from a year ago. “These findings suggest that many enterprises understand that AI tools are not without risk and that insufficient resources or prioritization of privacy can be problematic,” according to the report. “These correlations also underscore that AI tools, although helpful, are not a panacea for challenges with prioritization or resource shortages.”
- Technical privacy expertise is difficult to come by. Technical expertise is the No. 1 privacy skill gap (54%), just ahead of experience with different types of technologies/applications (52%). Nearly half of respondents (47%) indicate their technical privacy teams are understaffed. ISACA’s Certified Data Privacy Solutions Engineer (CDPSE) credential is designed to validate the technical skills and knowledge required to assess, build and implement comprehensive data privacy measures. Training that allows interested nonprivacy staff to move into privacy roles was the top strategy recommended to combat skills gaps in the profession.
- Mixed signals on privacy prioritization. While there is understanding of privacy’s importance at the board level (56% say their board of directors adequately prioritizes privacy), that recognition is not necessarily translating into the resources privacy functions need to excel. In addition to the above-mentioned reduction in privacy team sizes, less than one-quarter of respondents (22%) said their privacy budgets will increase in the next year, and half anticipate a decrease in their privacy budget in the next 12 months. “Budget cuts can have serious consequences for the ability of privacy teams to ensure privacy, which is demonstrated by a correlation between forecasts for budget cuts and confidence in ensuring privacy,” according to the report. “Sixty-one percent of respondents who were not so confident or not at all confident in their organization’s ability to ensure the privacy of sensitive data believed that their privacy budget will decrease in the next 12 months.”
For additional insights from the report, access the full global research report and related resources at www.isaca.org/state-of-privacy.