Problems with running to SSL in fear of the NSA.

Recently, a whole host of companies have been rapidly implementing SSL across their entire websites in response to the NSA scandal. I, for one, don’t buy into the paranoia to the extent that the media and everyone else do. As an American citizen, my expectation is that my government is doing what it can to protect me, and as a technologist I am constantly advising organizations globally on what they need to do to protect themselves. In that process, it is very common for technologies to be deployed that peer into user network traffic. The main goal of this inspection is to protect users, not to spy and snoop on their activities. I realize that organizations are a bit different from government agencies, but honestly, folks, how often have you seen court cases involving NSA data? They are few and far between. Intelligence is about gathering information. Information is used as context in decision making; we all do this, seeking out information for every decision we make.

Now, I am not defending the NSA’s trampling of the U.S. Constitution; I agree that our government should be tightly controlled and held to the constitutional standards set forth by our forefathers. I only want to shed some light on what “we” already do as organizations globally. We as organizations go way beyond tracking “metadata” about the users on our networks, and this is largely in order to protect ourselves from the hackers and nation states that wish to get at our information or steal our intellectual property.

Now we come to the use of SSL. Although I do believe that sensitive information, and the traffic of anyone concerned about government monitoring, should be encrypted in transport over the internet, one thing organizations need to consider is the impact on the user experience and on their own infrastructure. Leveraging SSL for absolutely all content can carry a dramatic performance penalty. SSL encryption is now much easier to implement thanks to hardware performance enhancements, but implementing it everywhere can still have huge impacts and must be weighed by everyone involved. I urge the community at large, and the IETF, to push for mixed-mode web content encryption and new browser standards that would allow encryption to be specified only for sensitive items, such as the transport of cookies, form submissions, and specific called-out page elements, without the need to transport absolutely everything over an encrypted channel. I realize that HTML does provide for this, but many browsers prompt users with warnings, making it difficult for web content providers to selectively encrypt the content that “must” be secured while leaving other content unencrypted. A concerted effort could eliminate the need for browser warnings while also improving the security of “sensitive” content.
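As a concrete illustration of what HTML already permits, a plain-HTTP page can point a form at an HTTPS endpoint so that only the sensitive submission travels encrypted. The domain and path below are placeholders, not a real site; the catch, as noted above, is that browsers treat this kind of mixing with suspicion and warn the user.

```html
<!-- Page served over plain http://example.com/login (placeholder domain). -->
<!-- Only the form submission itself crosses the wire encrypted; the rest
     of the page stays cacheable, unencrypted content. Browsers will
     typically warn about this mix, which is the friction described above. -->
<form action="https://example.com/login" method="post">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="submit" value="Log in">
</form>
```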

One major disadvantage here is that, for organizations that wish to dramatically reduce network load by leveraging caching proxies, SSL must be terminated at the proxy for those caches to be effective. This actually diminishes security quite extensively and could introduce potential liability (I am not a lawyer, so this isn’t legal advice). The reason I bring up this topic is that I run a network proxy cache myself, and I really don’t want to pierce my SSL sessions en masse just to properly cache my network resources.
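To make that trade-off concrete, here is a minimal sketch, assuming nginx, of the pattern in question: a caching reverse proxy that terminates SSL itself so it can see, and therefore cache, the plaintext responses. The hostname, certificate paths, and cache sizes are placeholders, and real deployments will vary.

```nginx
# Hypothetical caching reverse proxy (placeholder names throughout).
# To cache anything, the proxy must terminate SSL here -- it decrypts
# client traffic, which is exactly the "piercing" of sessions described
# above. End-to-end SSL would leave the cache only opaque ciphertext.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=static:10m max_size=1g;

server {
    listen 443 ssl;
    server_name example.com;                       # placeholder hostname

    ssl_certificate     /etc/ssl/example.com.crt;  # placeholder paths
    ssl_certificate_key /etc/ssl/example.com.key;

    location / {
        proxy_pass http://backend;                 # plaintext to origin
        proxy_cache static;
        proxy_cache_valid 200 10m;                 # cache 200s for 10 minutes
    }
}
```

The cache keys on decrypted request/response pairs; without termination at the proxy, the hit rate on encrypted traffic is effectively zero.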

My two cents. What are yours?