How NOT to run a leak site

Don’t get me wrong, I am totally in favour of more and more leak sites. So I started to examine Tradeleaks a little more closely, and from a technical standpoint it strikes me as lacking basic security.

If you enlarge the picture above you will soon see that there is no HTTPS connectivity. For an average user, this poses the threat of someone listening in on the traffic: you end up sending the leak in plaintext to the server. Even if you use a proxy, which is recommended, half of your route is still vulnerable. End-to-end encryption is the only real solution for a serious leak site.
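To make the point concrete, here is a minimal sketch of what I mean by end-to-end encryption: the source encrypts the document to the site’s published public key before it ever leaves the machine, so even a plaintext hop only exposes ciphertext. Everything here is illustrative — the key handling, the endpoint and the use of PyNaCl are my assumptions, not anything Tradeleaks actually offers.

```python
# Sketch: client-side ("end-to-end") encryption of a leak before upload.
# Assumes the site has published a Curve25519 public key out of band; the
# endpoint and key handling are illustrative, not anything Tradeleaks offers.
# Requires PyNaCl (pip install pynacl) and requests.
from nacl.public import PrivateKey, PublicKey, SealedBox
import requests

site_private = PrivateKey.generate()   # held only by the leak site
site_public = site_private.public_key  # distributed to sources in advance

def encrypt_leak(document: bytes, recipient: PublicKey) -> bytes:
    """Encrypt so that only the site, and nobody on the wire, can read it."""
    return SealedBox(recipient).encrypt(document)

ciphertext = encrypt_leak(b"the leaked document", site_public)

# Even over plain HTTP, an eavesdropper now only sees ciphertext.
requests.post("http://example.org/submit", data=ciphertext)
```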

Moreover, the site states:

TradeLeaks stores the IP addresses of sources who post on the TradeLeaks website. As such, we encourage all sources to ensure they use anonymous proxies before posting to our website. More information about anonymous proxies can be found here: Anonymous proxy (Wikipedia)

Why store log files? For an ordinary website, sure, you want statistics and maybe you want to sell the information to a third party. But a leak site? No way. Logs should either never be written at all or be sent straight to /dev/null. Any authority getting their hands on those machines can easily read the logs.
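As a sketch — assuming, purely hypothetically, a Python backend; I have no idea what Tradeleaks actually runs — those two options look like this:

```python
# Sketch: two acceptable logging policies for a leak site, assuming a
# hypothetical Python backend. Neither leaves source IP addresses on disk.
import logging

access_log = logging.getLogger("access")

# Option 1: never produce access log records at all.
access_log.disabled = True

# Option 2: produce them, but send them straight to /dev/null.
access_log.handlers = [logging.FileHandler("/dev/null")]

access_log.error("203.0.113.7 POST /submit")  # goes nowhere either way
```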

Third: the ”Silicon Valley beacon”. Image below:

Tradeleaks tells Facebook, Google, Reinvigorate and reCAPTCHA that you have submitted a leak. The front page also tells Twitter. That’s like half of Silicon Valley… They should never have this information. Just look at the Twitter subpoenas (and probably lots of others)!
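You can check this kind of thing yourself. Here is a rough sketch, using only the Python standard library, that fetches a page and lists every external host its script tags point at — each of those hosts learns, via the Referer header, which page you were on. The URL and the hosts in the comment are just examples.

```python
# Sketch: list the third-party hosts a page will contact through its
# <script> tags. Each of those hosts learns, via the Referer header,
# which page the visitor was on. Standard library only; URL is an example.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class ScriptSources(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src") or ""
            host = urlparse(src).netloc
            if host:
                self.hosts.add(host)

page = urlopen("http://example.org/").read().decode("utf-8", "replace")
parser = ScriptSources()
parser.feed(page)
print(sorted(parser.hosts))  # e.g. connect.facebook.net, www.google.com, ...
```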

The Technical Contact for Tradeleaks.com is:

Technical Contact:
Tradeleaks Pty Ltd
IT Department (info@tradeleaks.com.au)
+61.386772272
Fax:
PO Box 439
Albert Park, Australia 3206
AU

Hosting:

Rackspace Hosting
5000 Walzem Road
San Antonio
TX
US

So, once again: I really like the concept of more leak sites, and I am very happy to see more and more of them. But security always comes first, and these scripts, the lack of end-to-end encryption and the log files may put people in danger. If I were a Facebook or Google employee leaking information, I certainly wouldn’t want scripts to call home to my boss.

13 reactions to “How NOT to run a leak site”

  1. Also worth mentioning: since the site is hosted in the US, the NSA is most probably listening in at the border. Just by looking at the size of a posted request it is quite easy for them to identify an actual transfer of a document, EVEN if HTTPS is used. Then they have the sending IP address and an approximate size of the submitted document… and that is probably a bad thing.

    1. Jörgen: Indeed. With advanced traffic analysis you can see which IP addresses actually submit files. There are possible solutions in the darknets, such as I2P, where ”junk” traffic is sent between the nodes to make this kind of analysis more difficult. With an HTTPS site you can also do this manually, by having networks of nodes submit already-leaked documents to the site (a rough sketch of the padding idea follows after the comments). It takes some fairly advanced programming, though.

  2. Btw, I’m a bit surprised that Firefox by default seems to send a Referer header with each request for an external JavaScript file. Is this an indication that sites like googleapis.com are not just a case of Google benevolently, or for PR reasons, offering a useful tool to developers, but rather a system specifically set up to track which web pages users visit?

    I guess one should be able to protect oneself against it by setting network.http.sendRefererHeader to 0 or 1, or by using a JavaScript blocker (unless the site one visits explicitly passes information about itself to the external site, of course).

    1. Tor: Well, I guess the purpose of these JavaScripts is to monitor the web, not to give us good ”tools”.

      I recommend the NoScript plugin for Firefox for better performance and security on all sites. But on a leak site, these scripts should not be there in the first place.

  3. Christopher: I meant exactly the opposite. You should not keep logs, but you should not claim that you don’t log either. The point of good anonymization, though, is surely to reach the point where it no longer matters whether you log or not!

    1. kristoffer: Ah, now I follow what you are saying. Yes, the anonymization should clearly be good enough that the logs are meaningless. But that puts heavy demands on the users, who then have to learn about proxies, VPNs, exit nodes and so on.

      But with a .i2p or a .onion address, all access logs become essentially worthless.

  4. @Christopher
    Yes, surely the purpose is to monitor what websites people visit, but I wonder if the regular web developer who just wants to load, for example, jQuery realizes that he is violating the privacy of his visitors in the process. To me this seems like quite a big issue that should be discussed more.

  5. Tor: Definitely. Third-party scripts around the web are one of the biggest security issues, and since Web 2.0 they have exploded (even though they were quite common in 90s-style guestbooks). But it is not about JavaScript in itself; rather, it is about who receives the information from a script.
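As mentioned in the first comment thread above, size-based traffic analysis can be made harder by padding. A rough sketch of the idea, with arbitrary bucket sizes of my own choosing:

```python
# Sketch: pad every submission up to a fixed-size bucket, so that an observer
# who only sees traffic sizes cannot tell a short memo from a large archive.
# The bucket sizes are arbitrary choices of mine.
import os

BUCKETS = [64 * 1024, 512 * 1024, 4 * 1024 * 1024]  # 64 KiB, 512 KiB, 4 MiB

def pad_to_bucket(payload: bytes) -> bytes:
    """Append random padding so the upload size only reveals the bucket."""
    for size in BUCKETS:
        if len(payload) <= size:
            return payload + os.urandom(size - len(payload))
    raise ValueError("payload larger than the largest bucket")

padded = pad_to_bucket(b"a short leaked memo")
assert len(padded) == BUCKETS[0]
```

A real scheme would of course also have to record the original length so the site can strip the padding again, and it would be combined with the ”junk traffic” idea: nodes regularly submitting already-public documents so that real submissions disappear in the noise.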
