The Digital Trust Imperative: Building Trust Through Transparency—Addressing Digital Exhaust

Author: K. Brian Kelley, CISA, CDPSE, CSPO, MCSE, SECURITY+
Date Published: 1 May 2025
Read Time: 7 minutes

I recently received notification of a data breach involving some of my healthcare data. That was unpleasant, but not exactly surprising. What did surprise me was the name of the company contacting me: I had never heard of it. A healthcare organization I did engage with had evidently provided my data to the organization that suffered the breach, and there was no transparency about transferring my data to a third party. Though what the provider did is fully legal under current laws and regulations, I certainly did not provide direct consent to having my data shared with that company. As a result, I have lost trust in the healthcare provider who gave my data to this third party. My business is tied to my trust level, and it has been impacted accordingly.

I know I’m not alone in my concern about how my data is used. A recent survey revealed that 81% of respondents were somewhat or very concerned with how companies use the data they collect.1 News stories of such misuse have proven those concerns valid. Rather than misusing data, organizations are better served by making clear what data is collected and how it will be used, and by ensuring that their choices in using that data bring value back to the customer. Being clear about data collection, usage, and any potential benefit to the consumer can ultimately build trust through transparency.

Data Collection and Digital Exhaust

Unfortunately, the ways in which organizations collect data are not always clear to end users. In my earlier healthcare example, my data was likely shared with a third party by an organization I had a legitimate transaction with. Many times, though, data is being continuously collected during users’ everyday interactions and transactions—oftentimes without their full knowledge or consent.

The United States Federal Bureau of Investigation (FBI) created a guide to address exactly this issue.2 While the Digital Exhaust Opt-Out Guide is specifically targeted toward law enforcement personnel and their families, it contains important information for anyone interested in understanding how to minimize the amount of their personal information that is being collected and shared.

The Digital Exhaust Opt-Out Guide defines digital exhaust as data generated from one’s online activities, including sensitive information that can be exploited by malicious actors to track and target an individual.3 A related term, “data exhaust,” is often used to mean the same thing. Based on the definition, we typically think of tracking cookies, but there is a tremendous amount of telemetry available to track people and understand their digital habits, much of it operating behind the scenes of various applications and websites.

Take, for example, OpenTelemetry, which is an “open source observability framework for instrumenting, generating, collecting, and exporting telemetry data such as traces, metrics, and logs.”4 All of the leaders in the application performance monitoring (APM) space conform to OpenTelemetry.5 From a practical perspective, if a user visits a website that uses OpenTelemetry-compatible observability and monitoring, it is possible to observe their entire experience and history for the given session.

OpenTelemetry is a powerful, excellent tool, which is why it is so widely adopted. From a user perspective, with the telemetry collected, it is possible to reconstruct a session and identify which pages were visited, how long pages took to load, what type of web browser and operating system were used, how long the user spent on each page before moving to the next, and any errors or issues encountered. It can, of course, also provide metrics at each layer that is instrumented, such as the API layer and even calls into the database layer. The user portion, the one we are most concerned about with respect to digital trust, relays extremely valuable information and is a major boon if an organization is looking to deliver an excellent customer experience. However, just like any other tool or data, it could be used improperly by an organization or bad actor.
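To make the session-reconstruction idea concrete, here is a minimal, pure-Python sketch of the kind of per-request trace records an observability backend might store and how a single user's journey could be stitched back together from them. This is illustrative only, not the actual OpenTelemetry API; the `Span` record and `reconstruct_session` helper are hypothetical simplifications.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for the trace data an
# OpenTelemetry-compatible backend might retain per page request.
@dataclass
class Span:
    session_id: str          # ties every request back to one visitor session
    page: str                # which page was visited
    duration_ms: int         # how long the page took
    attributes: dict = field(default_factory=dict)  # browser, OS, errors, etc.

def reconstruct_session(spans, session_id):
    """Group spans by session ID to replay one user's entire journey."""
    visited = [s for s in spans if s.session_id == session_id]
    return {
        "pages": [s.page for s in visited],
        "total_ms": sum(s.duration_ms for s in visited),
        "errors": [s.page for s in visited if s.attributes.get("error")],
    }

# Sample telemetry: two visitors, one of whom hit an error at checkout.
spans = [
    Span("abc123", "/home", 120, {"browser": "Firefox", "os": "Windows"}),
    Span("abc123", "/pricing", 340),
    Span("abc123", "/checkout", 95, {"error": "card_declined"}),
    Span("zzz999", "/home", 80),
]

print(reconstruct_session(spans, "abc123"))
# {'pages': ['/home', '/pricing', '/checkout'], 'total_ms': 555, 'errors': ['/checkout']}
```

The point of the sketch is the double-edged nature described above: the same grouping that lets a support team diagnose a failed checkout also lets anyone holding the data replay a visitor's behavior in detail.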

It is important to note that digital exhaust is not just the output of a user’s journey through a system but includes any data that is generated online. For instance, data that users input into a platform or that is stored electronically after having been captured somewhere else is digital exhaust. As an example, the Digital Exhaust Opt-Out Guide includes “People Search” sites.6 These sites allow users to search for a particular person’s name and get back known addresses, phone numbers, and other associated people. Typically, this information is gathered from other sources and not from the user directly. Therefore, digital exhaust does not just refer to data organizations capture online from customers and other relationships—it also includes data that organizations acquire or sell.

Digital Exhaust and Internal Stakeholders

Digital exhaust isn’t limited to external relationships. The same type of telemetry that can track how customers flow through a system and the choices they make is increasingly being applied internally within organizations. Deciding how much information to capture about internal users, and how to use that information, requires a balancing act. While many organizations use login banners on enterprise resources indicating the organization’s right to monitor all activity, it is not unusual for staff to click “OK” and then promptly forget about the warning. Only in organizations where regular follow-up occurs based on the data being captured do users become wary of exactly how much is being observed. Many of us accept such warnings about monitoring and the possibility of such data being collected in the workplace, and we’re actually quite pleased when such monitoring captures information on despicable crimes such as child pornography and human trafficking. But what do we think about a workplace where every minute is scrutinized by management, and even a few moments spent reading an article unrelated to one’s job might result in disciplinary action? Most of us would find such a work environment hostile and seek alternate career opportunities as quickly as possible.


The instrumentation of many enterprise-owned applications and systems generates a wealth of digital exhaust internal to the organization. Indeed, a recent article from Forbes stated that “… an organization’s digital exhaust is one of your most valuable assets—and its value is only going to grow.”7 On the positive side, the collected data can be used to understand how organizational processes work in practice and how employees communicate with each other, and to provide other insights into how the organization operates. Because this telemetry is invisible to the user, it avoids the Hawthorne effect, in which people change their behavior because they know they are being observed.8 As with OpenTelemetry, these applications and systems are excellent, effective tools, but they can be used improperly. Therefore, it is imperative for organizations to develop appropriate controls and safeguards to ensure abuse doesn’t happen.

Transparency Builds Trust

If an organization wants to build digital trust, both externally and internally, then it needs to be as transparent as possible about how it uses the digital exhaust it collects and stores. Transparency requires organizations to ask hard questions about the data collected, stored, and shared. After all, the goal behind transparency is to create trust. If an organization appears to be overreaching with the data collected, then it will lose trust. On the other hand, organizations that can clearly delineate what data is being collected and how it is being used, so long as such collection and usage is acceptable to the majority of their respective audiences, will tend to gain trust.

Endnotes

1 McClain, C.; Faverio, M.; et al.; “How Americans View Data Privacy,” Pew Research Center, 18 October 2023
2 Federal Bureau of Investigation (FBI) Kansas City Division, Digital Exhaust Opt-Out Guide 5.0, 2024
3 FBI Kansas City Division, Digital Exhaust Opt-Out Guide 5.0
4 OpenTelemetry, “Documentation”
5 OpenTelemetry, “Vendor List”
6 FBI Kansas City Division, Digital Exhaust Opt-Out Guide 5.0
7 English, L.; “Digital Exhaust: The Most Valuable Asset Your Organization Owns, But Isn’t Using,” Forbes, 1 February 2021
8 Perera, A.; “Hawthorne Effect: Definition, How It Works, And How to Avoid It,” SimplyPsychology, 13 February 2024

K. BRIAN KELLEY | CISA, CDPSE, CSPO, MCSE, SECURITY+

Is an author and columnist focusing primarily on Microsoft SQL Server and Windows security. He currently serves as a data architect and an independent infrastructure/security architect concentrating on Active Directory, SQL Server, and Windows Server. He has served in a myriad of other positions, including senior database administrator, data warehouse architect, web developer, incident response team lead, and project manager. Kelley has spoken at 24 Hours of PASS, IT/Dev Connections, SQLConnections, the TechnoSecurity and Forensics Investigation Conference, the IT GRC Forum, SyntaxCon, and at various SQL Saturdays, Code Camps, and user groups.