
Fast, cheap and out of control
2008-08-05

Posted by lukethelibrarian in Technology.

[This blog post — actually, this entire weblog — contains the thoughts and opinions of Luke Rosenberger, and does not represent official policy, nor does it intend to represent the opinions of my employer or co-workers.]

On a recent Friday morning, my university’s Information Security Office implemented a feature of its Intrusion Prevention System (IPS) that is designed to completely block peer-to-peer (P2P) traffic.  In and of itself, this was not a surprise.  They had notified us of their intent in advance, and moreover, our university has had a very far-reaching anti-P2P policy for over five years. The policy only allows users to install P2P apps that (a) are authorized by their Dean, Director or Chair, and (b) have been determined by Information Security to represent a minimal risk.

This policy is accompanied by a list of unauthorized/prohibited P2P applications, which includes not only the usual suspects (Azureus, eDonkey, Gnutella, Kazaa, Morpheus and their clones) but also some pretty mainstream stuff (µTorrent, BitTorrent, Skype) and others that really defy the conventional definition of P2P (PicoPhone, Socks2HTTP).  It has also been accompanied by a very consistent message from the Information Security Office: every mention of peer-to-peer technology is presented in the context of copyright or intellectual property infringement, P2P as a vector for malware and viruses, and the dangers of “opening up” one’s computer to the whole Internet (otherwise known as “misconfiguration”).  The message has been very effective; I have been surprised to find, at an academic institution highly focused on scientific research, practically no awareness or recognition of potentially beneficial applications of peer-to-peer technology.

So I think no-one was particularly surprised to receive the advance notification that InfoSec would be using this relatively new tool — the IPS — to block peer-to-peer traffic entirely.  The surprise came, however, when the change was actually implemented, and suddenly in the library the patrons and staff found ourselves unable to reach any page within the domains facebook.com, myspace.com, and youtube.com.  In fact, we even discovered that pages on other servers that included embedded advertisements or other content served from those domains — such as a blog post with an embedded YouTube video or even a webpage with a Facebook advertisement — would not render because they would timeout waiting for a response from the blocked server.
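That failure mode is worth spelling out: when an IPS silently drops packets to a blocked host, rather than actively refusing the connection, the browser just keeps waiting until its own timeout expires — which is why pages that merely embedded content from those domains hung instead of rendering quickly with a broken element. A minimal sketch of the difference, using a plain TCP connect in Python (the unroutable 10.255.255.1 address is my stand-in for a silently filtered host, not one of the actual blocked servers):

```python
import socket

def probe(host, port=80, timeout=3):
    """Attempt a TCP connection and report how the failure manifests."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect((host, port))
        return "connected"
    except socket.timeout:
        # Packets dropped silently: the client waits the full timeout.
        # This is what stalls a page embedding content from a blocked domain.
        return "timed out"
    except ConnectionRefusedError:
        # An active refusal fails immediately; the page would render
        # promptly, just with the embedded element missing.
        return "refused"
    except OSError:
        # e.g. "network unreachable" -- also fails fast.
        return "unreachable"
    finally:
        s.close()

print(probe("127.0.0.1", 9))   # an unused local port: typically "refused"
print(probe("10.255.255.1"))   # a silently filtered address: hangs, then fails
```

The design point: a filtering device that sent a TCP reset instead of dropping packets would have made the breakage obvious and fast, rather than manifesting as mysterious page-load hangs.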

None of these sites were specifically mentioned in the advance memo, and none of us had associated them in any way with P2P applications — in fact, the memo specifically indicated the change would “only apply to P2P applications and not to Instant Messaging or Streaming Media traffic.” So how did Facebook, MySpace, and YouTube manage to get killed in a shutdown of P2P apps?  What kind of change control process failed to prevent a mistake that InfoSec had to publicly back-pedal from after less than two business days?

Even now that InfoSec has reinstated access to those three domains at our university, I’m left asking myself those questions.  I know the InfoSec guys at my school.  They are sharp guys.  Not strong in the collaboration department, but that’s not entirely their fault either.  I’m left asking those questions because I spent a long time talking with one of them while this block was in place, and he really had a hard time understanding why this was such a problem.  I’m left asking those questions because I suspect it could easily happen again.

In asking those questions, two clues have led me to a rather disturbing discovery.  The first clue was in my conversation with my friend, the InfoSec officer, who at one point made an appeal to a higher authority — he said that the University was simply trying to implement a P2P prohibition which was not just its own, but was required by an executive order issued by no less than the Governor himself, Rick Perry.  This was new to me and I made a note to follow up on it.  The second clue was a sentence from the memo issued by the Chief InfoSec Officer upon rescinding the blockade of those three sites:

Even though these websites are not commonly thought of as P2P applications, these sites were unintentionally included in the policy because of the video or photo sharing capability and potential risk they present.

He went on to explain that, as a Health Science Institution, the use of upload capabilities is particularly dangerous because of the possibility of uploading patients’ Protected Health Information in violation of HIPAA (the Health Insurance Portability and Accountability Act of 1996), and he encouraged us to educate users about that danger.

Ultimately, I found the Executive Order issued by Rick Perry: it is RP58 of April 5, 2006 (yep, that’s RP as in his initials; the previous governor’s executive orders are numbered GWB, and before that AWR…), titled “Relating to peer-to-peer file-sharing software.”  The thing that struck me about this order is the definition of “file-sharing software” as specified in item (5) under the “Therefore” clause:

5. For purposes of this executive order, “peer-to-peer file-sharing software” means computer software, other than computer and network operating systems, that has as its primary function the capability of allowing the computer on which the software is used to designate files available for transmission to another computer using the software, to transmit files directly to another computer using the software, and to request transmission of files from another computer using the software.

Okay, so let’s look at that.  To be defined as P2P software under RP58, the software must:

  • not be an operating system,
  • allow the host to designate files available for transmission to another computer using the software,
  • transmit files directly to another computer using the software, and
  • request transmission of files from another computer using the software.

That’s pretty broad, I thought.  Any web browser meets the 1st and 4th bullets right out of the box.  And then I re-read the CISO’s memo — the three blocked sites were included because of their photo- and video-sharing capability.  At which point I realized that in fact, Facebook, MySpace, and YouTube (not to mention Flickr, Gmail, and even Outlook Web Access) had upload capabilities that met the 2nd and 3rd bullet points of the RP58 definition — they allowed the host to designate files available for transmission (by allowing selection of a local file to upload), and they transmitted the files using the software (via HTTP).
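Read literally, an everyday browser file upload really does satisfy the “designate” and “transmit” prongs: it is nothing more than an HTTP POST with a multipart/form-data body. The sketch below builds such a body by hand in Python just to show how mundane the mechanics are (the field name, file name, and upload URL are made up for illustration):

```python
# Build the multipart/form-data body a browser would POST when a user
# "designates" a local file for upload. All names here are illustrative.
boundary = "----rp58-demo-boundary"
filename = "vacation.jpg"                    # the file the user designated
file_bytes = b"\xff\xd8\xff\xe0 fake jpeg bytes"

body = (
    f"--{boundary}\r\n"
    f'Content-Disposition: form-data; name="photo"; filename="{filename}"\r\n'
    "Content-Type: image/jpeg\r\n"
    "\r\n"
).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()

headers = {
    "Content-Type": f"multipart/form-data; boundary={boundary}",
    "Content-Length": str(len(body)),
}

# An actual upload would then be an ordinary HTTP POST, e.g. with
# urllib.request.Request("https://example.com/upload", data=body,
#                        headers=headers, method="POST")
print(headers["Content-Type"])
```

Nothing in that exchange distinguishes a browser from a “P2P file-sharing” client under the RP58 wording: a local file is designated, and its bytes are transmitted to another computer using the software.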

So bottom line: the InfoSec guys were actually right — RP58 apparently gives Texas state agencies the right to not only clobber Facebook, MySpace, and YouTube, but *any* web traffic that involves end-user-uploaded content.  For that matter, Firefox, Internet Explorer, Opera and any other web browser fall within the RP58 definition of “P2P filesharing software,” as does any FTP client or Instant Messaging application that allows file upload.

How in the world did we allow ourselves to get here?

I realize that as individual professionals, as institutions, and as a state, we’re trying to respond to threats quickly and cost-effectively.  My worry is: what happens when we outsource these critical decisions, when we circumvent all internal review and instead rely on experts or vendors without regard to our own user constituencies?  Is it really cheaper and faster to avoid learning from our own mistakes and instead spend our time and money cleaning up after strangers?