I've been thinking about the false sense of security that TLS
brings, in the context of deliberately choosing to publish content
on, say, a website rather than an unencrypted gopher site, out of
the misplaced belief that all content must be delivered in
encrypted form. It's true that HTTPS with a modern version of TLS
and a trusted CA will protect you against the garden-variety
criminal out to steal banking credentials or credit card numbers,
but it won't protect you against a determined adversary like a
nation-state. In the limit case they can simply demand a CA's
private key and use it to sign certificates for man-in-the-middle
attacks on any TLS traffic. If you think this can't be done
secretly, you haven't been paying attention this past decade.
Still, it is probably more common that a CA gets hacked and its
private keys compromised; [the EFF has a good article on these
topics][0].

Gopher-over-TLS, then, quite apart from its lack of decent client
support, inherits the same trust issues if it uses the web's
centralized-CA model.

I think a better security model is this: use a VPN (including
SSH/SOCKS proxies or plain remote SSH terminal sessions) or Tor to
access gopher content you don't want your ISP, your government, or
a garden-variety cracker to snoop on.

The simplest way to do this for gopher is to use the Tor Browser
with [a gopher/web proxy, like Floodgap's][1] (in my tests the Tor
Browser doesn't work with the OverbiteWX add-on, so you have to go
through a web proxy manually). This has the advantage of being
cross-platform and very easy for non-technical users. You can also
use command-line browsers under a Tor/SOCKS wrapper like torsocks.
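
Going through the web proxy manually just means handing it the
gopher URL. A tiny helper can build the proxied URL for you; this
is a sketch, and the `gw?<gopher-url>` query form is my assumption
based on the links the proxy itself emits, so check it against the
live proxy (the example URL is illustrative too):

```shell
#!/bin/sh
# Sketch: turn a gopher URL into a Floodgap web-proxy URL that the
# Tor Browser can open. The "gw?<gopher-url>" query form is an
# assumption; verify against the proxy's own links.
gw_url() {
    printf 'https://gopher.floodgap.com/gopher/gw?%s\n' "$1"
}

gw_url "gopher://gopher.floodgap.com/1/world"
```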
Sometimes the gopher server operator also runs the server in
parallel as a Tor hidden service, but this suffers from a lack of
working client support, because .onion DNS queries don't get
SOCKSified (lynx, I'm looking at you), even under torsocks. It is
also extra work to set up the hidden service so that the site's
internal gopher links all use the .onion domain (not impossible,
just time-consuming for large sites that may already use full
selectors, and it also depends on which gopher server is in use).
Browsing hidden sites over gopher is frustrating: unless the site
owner takes the time to mark non-onion links, it is easy to think
you are browsing a hidden site when you are not. Again, this is
more work for the site admin.
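
The hidden-service side itself is only a couple of lines of torrc;
the real work, as above, is the link rewriting. A minimal sketch,
assuming the gopher daemon already listens on localhost port 70
(the directory path is illustrative):

```
# torrc fragment: map the hidden service's port 70 onto a gopher
# daemon already listening on localhost:70 (path is illustrative).
HiddenServiceDir /var/lib/tor/gopher_onion/
HiddenServicePort 70 127.0.0.1:70
```

Tor writes the generated .onion hostname into `hostname` inside
the HiddenServiceDir after a restart.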
More technical users will be comfortable with a VPN, or just a
text client like vf-1 or lynx over a remote SSH shell session. I
have also had success using the OverbiteNX add-on in standard
Firefox, with the browser configured to use a SOCKS proxy port
(set up via ssh -D or privoxy/tor) and to send DNS requests
through the proxy as well. This last configuration is nice in that
it gives you native gopher support in modern Firefox, at the
expense of a slightly harder setup for OverbiteNX (but it only has
to be done once).
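
Concretely, the ssh -D route looks like the following sketch. The
hostname and port are illustrative assumptions; the Firefox
preference name is the real one, but double-check it in your
Firefox version:

```shell
#!/bin/sh
# Sketch: a local SOCKS5 proxy tunnelled through a remote shell
# account, for Firefox (and OverbiteNX) to use. Host and port
# below are illustrative.
SOCKS_PORT=1080
REMOTE=you@shell.example.org

# 1) Open the tunnel yourself (-N forwards without running a
#    remote command):
#        ssh -D "$SOCKS_PORT" -N "$REMOTE"
# 2) In Firefox's connection settings, set a manual SOCKS v5 proxy
#    of 127.0.0.1:1080 and tick "Proxy DNS when using SOCKS v5"
#    (about:config: network.proxy.socks_remote_dns = true), so DNS
#    lookups go through the tunnel too.
echo "socks5://127.0.0.1:${SOCKS_PORT}"
```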
In all of these cases the final hop to the gopher server is
unencrypted, but this shouldn't matter: the traffic travels in the
clear only between the gopher server and the VPN endpoint, the
remote shell host, or the last hop in the Tor circuit, and nothing
ties it directly back to the client.

[0]: https://www.eff.org/deeplinks/2011/10/how-secure-https-today
[1]: https://gopher.floodgap.com/gopher/gw