With the recent release of Android 9.0, Google enabled a big change in how users can protect their privacy: DNS over TLS. The concept isn't brand new, but it hasn't exactly exploded in usage either. That is going to start changing as Google rolls out the new version of its operating system, which not only ships with the feature but enables it by default.
How does this actually protect user privacy? As the introduction to RFC 7858 puts it, DNS over TLS "eliminates opportunities for eavesdropping and on-path tampering with DNS queries". Under the new standard, a client's DNS queries are encrypted with TLS and tunneled directly to the DNS server. The intent is to make monitoring of user activity more difficult, since DNS requests are no longer sent across the network in plain text.
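To make the mechanics concrete, here is a minimal sketch of a DNS over TLS lookup in Python, pointed at CloudFlare's public resolver on 1.1.1.1 port 853 (the port RFC 7858 specifies). It is a bare-bones illustration of the wire format, not production resolver code: a single A-record query for example.com, no retries, no EDNS, and it assumes the whole response arrives in one read.

```python
# Minimal DNS over TLS (RFC 7858) lookup: an ordinary DNS query,
# wrapped in TLS, sent over TCP port 853. Sketch only.
import socket
import ssl
import struct

def build_query(hostname, qtype=1):
    """Build a DNS query packet for an A record (qtype=1)."""
    header = struct.pack("!HHHHHH",
                         0x1234,       # transaction ID
                         0x0100,       # flags: standard query, recursion desired
                         1, 0, 0, 0)   # 1 question, no other records
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack("!HH", qtype, 1)  # QTYPE, QCLASS=IN
    return header + question

query = build_query("example.com")

ctx = ssl.create_default_context()  # validates the resolver's certificate
with socket.create_connection(("1.1.1.1", 853)) as raw:
    with ctx.wrap_socket(raw, server_hostname="cloudflare-dns.com") as tls:
        # RFC 7858 reuses DNS-over-TCP framing: a 2-byte length prefix.
        tls.sendall(struct.pack("!H", len(query)) + query)
        length = struct.unpack("!H", tls.recv(2))[0]
        response = tls.recv(length)  # sketch: assumes one recv suffices
        print(f"got {len(response)}-byte DNS response, encrypted in transit")
```

An on-path observer of this exchange sees only a TLS session to port 853; the queried hostname never crosses the wire in the clear.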
It seems that DNS security has become a point of focus recently. Over the past year or so, two big new players have joined the public DNS space: Quad9 (backed by IBM) and CloudFlare. Both advertise a focus on the security and privacy of the individual user, offering features like blocking of malicious domains, DNS over TLS support, and limited historical logging of user requests. CloudFlare has already posted steps for enabling its DNS over TLS service on the new version of Android. For the end user, this is great: free services that offer better security and increased privacy. For businesses, however, it can complicate how they protect their users.
Many organizations use web filtering, implemented through DNS filtering and/or web proxies, as an important step toward securing client traffic. These controls might exist for compliance reasons, for malware and threat detection, or simply because managers want to monitor productivity. However, a few big shifts in user security and privacy are starting to make that more difficult: the increased focus on DNS privacy and the recent release of TLS 1.3.
Years ago I worked for a local government organization where security was a high priority for IT operations. Web access was extremely limited, whitelisted only to known business-related sites. If you wanted general internet access, you had to log into one of two segmented computers dedicated to that purpose. Even so, our IT security manager had concerns about the potential for threats and data exfiltration via HTTPS-enabled websites. One of my projects was to tackle this problem by implementing SSL decryption on our web filtering platform. At the time, deploying a web proxy as a man-in-the-middle wasn't very complicated: generate a CA cert for the proxy, distribute that cert to clients, then enable SSL decryption. Sure, there were bumps and hurdles, but the overall process achieved the result we were looking for.
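For context, the first of those steps, minting the CA certificate the proxy uses to sign forged server certs on the fly, looks roughly like this. This is a sketch using the pyca/cryptography package rather than whatever vendor tooling your platform ships with, and the names ("Example Corp Proxy CA", the output file paths) are placeholders.

```python
# Sketch: generate a self-signed CA certificate for a TLS-intercepting
# proxy, using the pyca/cryptography library. Placeholder names/paths.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Corp Proxy CA")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: issuer == subject
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)
    .sign(key, hashes.SHA256())
)

with open("proxy-ca.pem", "wb") as f:          # distribute this to clients
    f.write(cert.public_bytes(serialization.Encoding.PEM))
with open("proxy-ca-key.pem", "wb") as f:      # guard this carefully
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
```

Once proxy-ca.pem sits in every client's trust store, the proxy can impersonate any site it decrypts, which is exactly why that private key needs to be guarded so carefully.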
The problem today is that newer standards and features (like TLS 1.3, HSTS, and certificate pinning) add privacy protections that make user traffic much harder to reliably decrypt. That's good, right? Well, maybe, but only for the user. For an organization trying to secure its systems and private information, these measures make life a lot more difficult. Even as a network admin, it has become harder to troubleshoot web applications via packet captures: because TLS connections now use ephemeral keys, you can't decrypt a captured session even if you hold the server-side private keys.
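One troubleshooting workaround, on endpoints you control, is to have the client itself export the per-session secrets. As a sketch of the idea: Python 3.8+ can write a key log in the same SSLKEYLOGFILE format browsers use, which Wireshark can then apply to decrypt a capture of that session (example.com and the log path here are stand-ins).

```python
# Because TLS 1.3 session keys are ephemeral, decrypting a capture needs
# per-session secrets from an endpoint, not the server's private key.
# Python 3.8+ (with OpenSSL 1.1.1) can export them for Wireshark.
import ssl
import urllib.request

ctx = ssl.create_default_context()
ctx.keylog_filename = "/tmp/tls-keys.log"  # SSLKEYLOGFILE-format output

with urllib.request.urlopen("https://example.com/", context=ctx) as resp:
    resp.read()

# In Wireshark: Preferences > Protocols > TLS >
# "(Pre)-Master-Secret log filename" -> /tmp/tls-keys.log
```

That helps a developer debug their own application, but it obviously doesn't scale to passively inspecting everyone else's traffic, which is the point.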
Now the addition of DNS over TLS gives us more of the same problem. Previously, even if you couldn't look inside an encrypted web session, you could at least infer the destination by reading the DNS requests. Instead, users can now encrypt their DNS queries as well as their web sessions, and make the life of a security admin a nightmare.
From the perspective of a corporate security team, the future is going to be more challenging (and probably for a lot more reasons than just this). At some point, though, we will have to strike a better balance between user privacy and adequate security coverage. For some, that might mean installing feature-heavy agents on every client device to inspect everything before it leaves the device. For others, it might mean relying on statistical, analytical, and behavioral data to detect patterns and anomalies (something like what ETA, Encrypted Traffic Analytics, is trying to accomplish) rather than tearing apart individual user sessions; a toy sketch of that idea follows.
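To illustrate the metadata-driven approach (this is not how any particular ETA product works, and the flow records, baseline, and 3-sigma threshold are all illustrative assumptions): even without touching the encrypted payload, flow sizes alone can surface an exfiltration-scale outlier.

```python
# Toy anomaly detection over flow metadata: flag a flow whose byte count
# is a statistical outlier against that client's baseline. No payload
# inspection. Records, baseline, and threshold are illustrative.
from statistics import mean, stdev

# Per-client history of bytes sent per flow, e.g. from NetFlow/IPFIX export.
history = {
    "10.0.0.5": [4200, 3900, 4500, 4100, 4300],
}

def is_anomalous(client_ip, nbytes, k=3.0):
    """True if nbytes deviates more than k sigma from the client's baseline."""
    sizes = history.get(client_ip, [])
    if len(sizes) < 3:
        return False  # not enough history to form a baseline
    mu, sigma = mean(sizes), stdev(sizes)
    return sigma > 0 and abs(nbytes - mu) > k * sigma

print(is_anomalous("10.0.0.5", 4_400))       # False: within normal range
print(is_anomalous("10.0.0.5", 98_000_000))  # True: exfiltration-sized spike
```

Real systems layer in far richer features (timing, packet-length sequences, TLS handshake fingerprints), but the principle is the same: learn what normal looks like and alert on deviations, rather than decrypting sessions.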
Whatever the solution ends up being, it should be an interesting area to watch develop. In the meantime, let's agree not to downgrade user security or privacy just to shoehorn our security tools in. We're all users of some service or another, and we all have some expectation of privacy. Is it worth reducing security and increasing risk just to see whether your employees are using their web access responsibly? Probably not. Instead, it's time to look for new ways to solve this problem and to stay secure while still ensuring user privacy.