Gopher

Gopher is a very simple protocol, unlike HTTP and HTML: basically, it only knows menus and files – text files, image files, and other binary files (HTML files are an extension). It doesn’t do content negotiation, it doesn’t have cookies, it doesn’t execute JavaScript, and no Gopher site expects clients to run JavaScript.

This site is available via Gopher! Visit gopher://alexschroeder.ch using a gopher client. You’ll notice some interesting changes.

2018-12-01 Thinking about the real RSS 3.0

I’m still sad about RSS 2.0 and Atom. Why didn’t we go the other way? Aaron Swartz had the right ideas and called it RSS 3.0! He was so ahead of his time.

Follow those links and read it. The introduction in particular is still gold! 😂

Quote:

  1. Remove XML. XML is just too complicated and is against the spirit of RSS, which is Really Simple Syndication. [...] Instead, we’ll go back to RFC822-style fields. [...]
  2. Remove namespaces. Namespaces are just a waste of time. [...]
  3. HTML forbidden. No one needs HTML. Email has been just fine for years before Microsoft introduce their stupid rich HTML extensions. HTML is for those loser newbies. Any intelligent Internet user deals in plain text.

Clearly, it’s a joke. But it burns because it is so attractive.

And to be sure, it is a joke! Aaron Swartz was a member of the RSS-DEV Working Group which had developed RSS 1.0. He didn’t actually espouse the values expressed in the list above. But they still speak to me in an irrational way, like Gopher speaks to me.

Wouldn’t this make sense, in a retro kind of way? When some people start seeing a point in Gopher in an age of HTML 5 and rich multi-media hypertext, then perhaps we can also go “back” to a syndication format that never was. I don’t actually prefer Gopher to the Web. I prefer the spirit of Gopher. I like being able to write a client and a server in a few lines of code. I like the absence of JavaScript and cookies. I like the lack of surveillance. That’s because Gopher is simple. I want things to be simple. And I want feed parsing to be simple. Sure, there are libraries to help parse XML (like there are for HTTP, HTML, caching, headers, content negotiation, JavaScript, and so on). But nothing beats a few lines of code.

sub ParseData {
  my $data = shift;
  my %result;
  # match "key: value" pairs; a value continues across lines indented with a tab
  while ($data =~ /(\S+?): (.*?)(?=\n[^ \t]|\Z)/gs) {
    my ($key, $value) = ($1, $2);
    $value =~ s/\n\t/\n/g; # strip the tab indentation from continuation lines
    $result{$key} = $value;
  }
  return wantarray ? %result : \%result; # hash in list context, reference otherwise
}

Oddmuse uses this format to save data to disk. 🙃
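
Just to make this concrete, here is a minimal, made-up example of such a record (the keys and values are invented for illustration, not an actual Oddmuse page); continuation lines are indented with a tab:

my %page = ParseData(<<"EOT");
title: 2018-12-01 Thinking about the real RSS 3.0
summary: Still sad about RSS 2.0 and Atom.
\tWhy didn't we go the other way?
EOT
# \t turns into a real tab because the heredoc is double-quoted
print $page{summary}, "\n"; # prints both lines, with the tab indentation removed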

Anyway, if you want to go for a deep dive, there is a lot more history, with plenty of examples, in the long, multi-page article The Evolution of RSS. The history section on Wikipedia is much shorter.

2018-12-01 Gopher space is growing

I love the tiny, expanding Gopher network. People hosting small public access shell servers, sometimes with only 128 MB of RAM, sometimes more; access via a restricted shell; sometimes with git, sometimes with a weird publishing system; all of them with Gopher hosting. 😍

It feels like the old days!

The old days that never were, to be sure. I do remember the old days. Applying for an email account at the department’s IT contact. Going into the cellar where the computer rooms were, and logging into an AIX. Rows and rows of computers, one or two other humans, practically no help, no search engines, everything was hard. We’re building a better past, striving for a brighter future!

And nobody – nobody! – was hosting anything, anywhere, outside of universities and some companies. People hosting stuff? By the people for the people? Not that I remember! Public access shell systems were rare and learning about them wasn’t easy. By the time I got an account I had already gotten used to free services like Geocities web hosting and Flickr image hosting and Google mail hosting…

As a reader, you need a Gopher Client.

As a writer, you need to ask the operators of one of the sites listed above for access. This will require you to use the shell and public key cryptography. Like a hacker! 😎 😂 But no worries: I wrote a short introduction here, for Cosmic Voyage. The instructions for generating your private and public keys and for using the terminal, ssh, or PuTTY are the same, however.

@dbucklin wrote a different introduction for Gopher authors: How to Gopher.

Comments on 2018-12-01 Gopher space is growing

Interesting, and timely. I installed pygopherd on an Ubuntu 18.04 VM yesterday to play around with it. Still learning the ropes but I’m intrigued with the uncluttered simplicity of Gopher.

– BedRockDocs 2018-12-11 13:26 UTC


Also, the archaic gopher menu files. 🙈

I have become so used to URLs that I’m confused when I realize there’s no way to tell gopher clients whether a particular link uses TLS or not. Which is why I was very confused when I started thinking about encrypted gopher.

– Alex Schroeder 2018-12-11 13:36 UTC


There is a pretty robust discussion of TLS in Gopher starting here: https://lists.debian.org/gopher-project/2018/02/msg00025.html – I don’t pretend to understand much of it, though. :)

– BedRockDocs 2018-12-14 15:23 UTC


Sure. I was part of that discussion. :)

– Alex Schroeder 2018-12-14 17:39 UTC

2018-11-30 Moku Pona and Gopher Feeds

I’m adding Gopher feed support to Moku Pona. In this context, a Gopher feed is a Gopher resource that returns an RSS 2.0 or Atom feed with gopher URLs as its links.

The URL could be in the text content of the link element, as in RSS 2.0, or in the href attribute, as in Atom.
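
For example (these are abbreviated, made-up items, just to show where the gopher URL goes):

<!-- RSS 2.0: the URL is the text content of the link element -->
<item>
  <link>gopher://alexschroeder.ch:70/02018-12-01_Example</link>
</item>

<!-- Atom: the URL is in the href attribute of the link element -->
<entry>
  <link href="gopher://alexschroeder.ch:70/02018-12-01_Example"/>
</entry>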

So with all of that out of the way, here are two Gopher feed URLs to look at:

  1. gopher://alexschroeder.ch:70/0do/rss
  2. gopher://gopher.leveck.us/0/phlog.atom

This should work:

alex@melanobombus:~$ moku-pona add gopher://alexschroeder.ch:70/0do/rss "Alex RSS"
alex@melanobombus:~$ moku-pona add gopher://gopher.leveck.us/0/phlog.atom "Jynx Atom"
alex@melanobombus:~$ moku-pona update
Fetching Alex RSS...updated
Fetching Jynx Atom...updated
alex@melanobombus:~$ vf1 ~/.moku-pona/updates.txt
Welcome to VF-1!
Enjoy your flight through Gopherspace...
[1] 2018-11-30 Jynx Atom/
[2] 2018-11-30 Alex RSS/
VF-1> 2
[1] Ship «Hoffnung»
[2] Comments on I don't like Bennies
[3] Influental Games
[4] Comments on Cosmic Voyage
[5] Cosmic Voyage
[6] Comments on What is Sandbox?
[7] Comments on Laptop Fan
[8] Gopher Module
[9] What is Sandbox?
[10] List of Open Books
[11] Fennel and Bussard
[12] Old Dwarf Drawings
[13] Petition
[14] Past and Current Reading
[15] 2019-02 Book Club

Now, it’s important to note that this is different from how Moku Pona worked before. Previously, it simply took the lines from sites.txt and rearranged them in updates.txt. Thus, if you subscribed to alexschroeder.ch 😎 then the list of updates would also link to the same site. But this doesn’t work for feeds. If Moku Pona watches a feed, and the feed updates, you don’t want to be linked to the feed itself. You want to see a Gopher map of the links in that feed, right? So this is what Moku Pona does: it translates the feed into a Gopher Map and saves that into its cache. When you look for the update, it links you to the local cache.
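
The translation itself is not hard. Here’s a minimal sketch of the idea – this is not Moku Pona’s actual code, and it just assumes gopher URLs of the form gopher://host[:port]/<type><selector>:

sub gopher_map_line {
  my ($title, $url) = @_;
  # split a gopher URL into host, optional port, item type, and selector
  if ($url =~ m!^gopher://([^/:]+)(?::([0-9]+))?/(.)(.*)$!) {
    my ($host, $port, $type, $selector) = ($1, $2 || 70, $3, $4);
    return "$type$title\t$selector\t$host\t$port";
  }
  return;
}

# prints: 1Gopher Module<TAB>2018-11-27_Gopher_Module/menu<TAB>alexschroeder.ch<TAB>70
print gopher_map_line("Gopher Module",
  "gopher://alexschroeder.ch/12018-11-27_Gopher_Module/menu"), "\n";

Every item in the feed turns into one such line of the cached Gopher map.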

This is what it looks like:

alex@melanobombus:~$ ls .moku-pona/
alexschroeder.ch-70-do-rss.txt       sites.txt
gopher.leveck.us-70--phlog.atom.txt  updates.txt
alex@melanobombus:~$ cat .moku-pona/sites.txt 
0Alex RSS	do/rss	alexschroeder.ch	70
0Jynx Atom	/phlog.atom	gopher.leveck.us	70
alex@melanobombus:~$ cat .moku-pona/updates.txt 
12018-11-30 Jynx Atom	/home/alex/.moku-pona/gopher.leveck.us-70--phlog.atom.txt	
12018-11-30 Alex RSS	/home/alex/.moku-pona/alexschroeder.ch-70-do-rss.txt		

This works well, locally. But if you’re sharing the updates.txt file, other people won’t be able to follow the links, because they point to your local cache, on your local disk (in this case: /home/alex/.moku-pona). 😢

Comments on 2018-11-30 Moku Pona and Gopher Feeds

The page used to say that I was using the Gopher Module 1.0, but Tomasino convinced me that it wasn’t necessary. If you get the feed via the Gopher protocol, provide a feed containing gopher links. If you get the feed via the web, provide web links. And never the twain shall meet, I guess. :)

– Alex Schroeder 2018-11-30 21:07 UTC

2018-11-28 Cosmic Voyage

I’ve signed up for an account on Cosmic Voyage. It’s basically a way to write a little science fiction together with other people. The website reflects the Gopher mindset: it’s text-based. You can visit it via Gopher, too.

If you don’t have a dedicated Gopher client, I suggest using the text browser lynx. As for dedicated gopher clients, I like VF-1 by @solderpunk; it requires Python. For iOS devices, I suggest the Gopher Client by @crc. For Android devices, @ckeen suggests Pocket Gopher.

Anyway, back to Cosmic Voyage. It is run by @tomasino. In order to join, you don’t need much:

  1. a username (used internally on the system only)
  2. a first ship name (used in the story)
  3. a link to or the contents of a public ssh key, for access

You get ssh access to a directory per ship where you can dump your little stories. Refer to the others, or don’t. Post a lot, or just a little.

The ssh access means I can just use emacs to write and rsync to back things up.
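
A backup might look like this, for example (the local directory is just an example):

rsync -a alex@cosmic.voyage:ships/ ~/backup/cosmic-voyage/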

My first transmission from the ship Hoffnung, with only the faintest idea of a plot:

-+-+-+- Regular Report -+-+-+- C51.204 -+-+-+- Green       -+-+-+-
Dr. med. Herbert Wullschlegel reporting on the first scheduled
inspection. The passenger status in cryo sleep is nominal. We have had
no failures. My companion for this round is Dr. astr.-phys. Eng.
Philomena Auerbach. We've spent the first two days reviewing the logs.
This is the third day and we have had a good time running the long
corridors of the ship. Philomena did go back to sector C in order to
inspect a minor hydro leak. Nothing unexpected. As for myself, I'm
taking advantage of the cryo break to eat some solid food. It helps
with the teeth reconstruction. I'm so happy that we managed to get the
newer cryo berths with the slow shaking to strengthen bones and maintain
muscle tissue. With so many years spent in cryo, even the very slow
metabolism of space sleep changes the body. Sometimes I wonder what
ships built after us would offer. Hoffnung did not get the latest
Shrinivasan-Ramapattnam drive. We could not afford them. And with that
I'm going to close this cover letter for the full technical report. In
two days we're going back to sleep for another fifty years. Peace.
-+-+-+ End of Report -+-+-+ Signed HEWUSC -+-+-+ 

(Continued in 2018-11-30 Ship «Hoffnung».)

To me, and apparently to other people as well, this looks like an exciting new way to experience Gopher, a return to the spirit of public access shell servers. Much more limited than a virtual machine, or even a web host, but also something you can do right now, no matter your background.

@jynx seems to share my enthusiasm. See his Gopher post: So much cool stuff going on. @solderpunk is happy, too: So much cool stuff.

Even if you’re not too technically minded, you can join.

If you’re on a Mac, you should have everything you need; the terminal commands are summarized right after the list.

  1. use Command+Space to search for Terminal
  2. generate a key using ssh-keygen (just accept all the defaults)
  3. your secret key is the file ~/.ssh/id_rsa
  4. your public key is the file ~/.ssh/id_rsa.pub
  5. send the id_rsa.pub file to Tomasino, together with your user name and your ship name (see Cosmic Voyage for his email)
  6. when Tomasino has set things up for you, connect using ssh and the user name you received, e.g. ssh alex@cosmic.voyage
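
In a terminal, steps 2 to 6 boil down to this (the user name alex is just an example):

ssh-keygen                # accept all the defaults
cat ~/.ssh/id_rsa.pub     # this is the public key you send to Tomasino
ssh alex@cosmic.voyage    # once Tomasino has set things up for you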

If you’re on Windows, it’s a bit trickier.

  1. download the latest PuTTY (probably putty-64bit-0.70-installer.msi)
  2. use puttygen to generate a key (I don’t think you need to specify a passphrase)
  3. save private and public keys
  4. send the public key to Tomasino, together with your user name and your ship name (see Cosmic Voyage for his email)
  5. when Tomasino has set things up for you, start pageant (this will put an icon of a computer wearing a hat into the System tray)
  6. right click the icon and choose View Keys
  7. click the Add Keys button
  8. select the private key you created up above and open it
  9. start putty and connect to your username at cosmic.voyage, e.g. alex@cosmic.voyage

You can read more about key generation in chapter 8 of the PuTTY documentation, and more about pageant in chapter 9.

If you’re on GNU/Linux, you should have everything you need. The instructions are the same as for the Mac, except that how you open a terminal depends on how things are set up on your machine.

  1. open a terminal
  2. generate a key using ssh-keygen (just accept all the defaults)
  3. your secret key is the file ~/.ssh/id_rsa
  4. your public key is the file ~/.ssh/id_rsa.pub
  5. send the id_rsa.pub file to Tomasino, together with your user name and your ship name (see Cosmic Voyage for his email)
  6. when Tomasino has set things up for you, connect using ssh and the user name you received, e.g. ssh alex@cosmic.voyage

I hope to hear from you, soon!

Here’s how you’d write a story. In this example, my username is alex, my ship’s name is Mercury, I have already written my first story, and I’m going to write my second one using the editor nano.

alex@cosmic:~$ cd ships
alex@cosmic:~/ships$ ls
Mercury
alex@cosmic:~/ships$ cd Mercury
alex@cosmic:~/ships/Mercury$ ls
001.txt
alex@cosmic:~/ships/Mercury$ nano 002.txt
alex@cosmic:~/ships/Mercury$ log

Other editors you can use are editor or vi. You can worry about them later. If you’re just beginning, nano will do.

The log command updates the top menus and lets other people know that your story is ready. Use man log to learn more about it. (man is the program that displays pages from the manual.)

Later, once you’ve realized that you don’t like nano or vi or any of the other editors, you can always edit the file locally and transfer it using scp, which uses ssh in the background. And if you’re using PuTTY instead of ssh, you transfer your file using pscp instead of scp.
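
Something like this, for example (003.txt and the ship directory are just examples matching the session above):

scp 003.txt alex@cosmic.voyage:ships/Mercury/     # with OpenSSH
pscp 003.txt alex@cosmic.voyage:ships/Mercury/    # with PuTTY, on Windows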

Comments on 2018-11-28 Cosmic Voyage

Thank you so much for posting this! I’ve just joined this evening, and I’m really enjoying writing Voortrekker, and looking forward to reading everyone else’s stories as well - saving that for tomorrow, since I’ll likely want something that’ll make me smile then.

I’ve lately been looking for something that’d help me recover the practice of writing fiction, and I believe I have now found it. And it’s really nice, too, to be on a host again that feels like it’s alive. So thanks again for talking about it - I’m really glad you did!

– Alexis 2018-11-29 04:23 UTC


Excellent! :)

– Alex Schroeder 2018-11-29 07:27 UTC

2018-11-27 Gopher Module

@jynx mentioned a Gopher Atom feed again, and in fact he has had such a feed for his site for quite some time. Now I’m thinking of a gopher module for RSS 2.0, and I have started writing up a specification.

I’m adding it to this wiki right now!

If you want to do the same, take a look at the code that generates your feed. Your feed is either an RSS feed, in which case it starts with an rss element, or an Atom feed, in which case it starts with a feed element.

Basically, you want to add the following to that root element:

xmlns:gopher="https://communitywiki.org/wiki/Gopher_Module_1.0"

Here’s an example from my site, using RSS 2.0. This is how my feed begins:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/css" href="https://alexschroeder.ch/css/alex-2018.css" ?>
<rss version="2.0"
    xmlns:gopher="https://communitywiki.org/wiki/Gopher_Module_1.0"
    xmlns:wiki="http://purl.org/rss/1.0/modules/wiki/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:cc="http://web.resource.org/cc/"
    xmlns:atom="http://www.w3.org/2005/Atom">

The feed contains a channel element, which remains unchanged, and a bunch of item elements. These get a new link element with the “gopher” prefix I just defined.

Here’s an example from my feed:

<item>
<title>Gopher Module</title>
<link>https://alexschroeder.ch/wiki/2018-11-27_Gopher_Module</link>
<gopher:link>gopher://alexschroeder.ch/12018-11-27_Gopher_Module/menu</gopher:link>
<guid>https://alexschroeder.ch/wiki/2018-11-27_Gopher_Module</guid>
<description>I'm thinking of a gopher module for RSS 2.0 and I started writing up a specification...</description>
<pubDate>Tue, 27 Nov 2018 19:30:08 GMT</pubDate>
<comments>https://alexschroeder.ch/wiki/Comments_on_2018-11-27_Gopher_Module</comments>
<dc:contributor>Alex Schroeder</dc:contributor>
<wiki:status>updated</wiki:status>
<wiki:importance>major</wiki:importance>
<wiki:version>2</wiki:version>
<wiki:history>https://alexschroeder.ch/wiki?action=history;id=2018-11-27_Gopher_Module</wiki:history>
<wiki:diff>https://alexschroeder.ch/wiki?action=browse;diff=1;id=2018-11-27_Gopher_Module</wiki:diff>
<category>Gopher</category>
</item>

Now, if you have an Atom feed, it’s very similar. This time I’m going to use @jynx’s feed as an example.

First, he had to add the namespace to the root element. This is how his feed starts:

<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:gopher="https://communitywiki.org/wiki/Gopher_Module_1.0">

And Atom feeds also have a list of items, but this time we’re looking for the element called entry. See how he added the gopher link:

<entry>
<title># No days off for you </title>
<id>tag:leveck.us,2017-08-10:/phlog/20170810.post</id>
<author><name>Mr. Leveck</name><email>leveck@leveck.us</email></author>
<link rel="alternate" type="text/plain" href="gopher://leveck.us/0/Phlog/20170810.post" />
<gopher:link href="gopher://leveck.us/0/Phlog/20170810.post" />
<updated>2017-08-10T00:00:00Z</updated>
<content type="text">
<![CDATA[<pre># No days off for you 

Working a day off today... 74 hours last week, 54 this week, then four days off. I am damn ready to not wake up at 0335.
</pre>]]>
</content>
</entry>

Obviously in his case there’s little incentive to actually use the gopher module as his site doesn’t have any web links! But still, this is how it would work.

Comments on 2018-11-27 Gopher Module

In a later discussion, @tomasino convinced me that this was unnecessary: why mix web links and gopher links? If you’re a web person, you want the web links and the HTML content; if you’re a gopher person, you want the gopher links and the text content.

– Alex Schroeder 2018-11-30 21:32 UTC

2018-11-12 Zaibatsu

I just love the old school Internet enthusiasm shown by @solderpunk in his phlog post: “I am utterly entranced by this idea of a self-organising, ever-changing fractal network of social unix servers, with networks forming and disolving between servers and between networks themselves as users see fit.” Me too!

2018-07-16 Blocking IP Addresses

OK, I’ve fiddled with my setup and I think it should work, but these guys still get on my nerves because I don’t understand why they need to download my entire site, ten thousand selectors and counting. And so I learned about blocking IP addresses using iptables and ipset.

I got all the info from this blog post: Block IP addresses in Linux with iptables.

Here’s the gist of it:

# Install
apt-get install ipset

# create blacklist once
ipset create blacklist hash:ip hashsize 4096
# set up iptables rules
iptables -I INPUT -m set --match-set blacklist src -j DROP
iptables -I FORWARD -m set --match-set blacklist src -j DROP
# add a specific IP address
ipset add blacklist 192.168.1.100
# confirm the blacklist contains the IP address
ipset list blacklist
# show firewall setup
iptables -L
# unblock IP address
ipset del blacklist 192.168.1.100

And for IPv6, same same but different.

ipset create blacklist6 hash:net hashsize 4096 family inet6
ip6tables -I INPUT -m set --match-set blacklist6 src -j DROP
ip6tables -I FORWARD -m set --match-set blacklist6 src -j DROP
ipset add blacklist6 ...
ipset list blacklist6
ip6tables -L

To save and restore iptables rules, use the package iptables-persistent. We don’t need this, for now.

This seems to work.
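
For the record, if we ever do want to persist the rules, I think something like this would do it on Debian or Ubuntu – note that it only saves the iptables rules referring to the blacklist, not the contents of the ipset itself:

apt-get install iptables-persistent
netfilter-persistent save    # writes /etc/iptables/rules.v4 and rules.v6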

Comments on 2018-07-16 Blocking IP Addresses

Now that I am also using fail2ban, here’s more:

# iptables --list f2b-alex-apache
-N f2b-alex-apache
-A f2b-alex-apache -s XXX -j REJECT --reject-with icmp-port-unreachable
-A f2b-alex-apache -j RETURN

To remove XXX, repeat the command but use -D instead of -A:

# iptables -D f2b-alex-apache -s XXX -j REJECT --reject-with icmp-port-unreachable

Verify that it is gone:

# iptables --list-rules f2b-alex-apache
-N f2b-alex-apache
-A f2b-alex-apache -j RETURN

– Alex Schroeder 2018-10-07 19:07 UTC

2018-07-15 Russian Gopher Idiots at Work

My Gopher site seems sluggish. I wonder what’s up...

top - 21:28:34 up 99 days,  5:38,  1 user,  load average: 38.37, 32.55, 17.14

Well, load average approaching 40 gives me an idea...

$ bin/time-grouping-gopher < farm/gopher-server.log
         Hour Connections   [%]  Selectors   [%]
2018-07-15 06          25    1%         25   1%
2018-07-15 07          39    1%         39   1%
2018-07-15 08          33    1%         33   1%
2018-07-15 09          36    1%         36   1%
2018-07-15 10          32    1%         32   1%
2018-07-15 11          36    1%         36   1%
2018-07-15 12          40    1%         40   1%
2018-07-15 13          36    1%         36   1%
2018-07-15 14          38    1%         37   1%
2018-07-15 15          36    1%         36   1%
2018-07-15 16          37    1%         37   1%
2018-07-15 17          39    1%         39   1%
2018-07-15 18          38    1%         38   1%
2018-07-15 19          37    1%         37   1%
2018-07-15 20          32    1%         32   1%
2018-07-15 21        2324   81%       2288  81%

Yes indeed! I guess it’s time to stop the server.

And who did it?

$ bin/ip-numbers-gopher < farm/gopher-server.log | head -n 2
                  IP Connections   [%]
        90.154.53.13        2448   82%

Same guys as the other day!

$ whois 90.154.53.13|grep "org-name\|address"|head -n5
org-name:       "Central Telegraph" Public Joint-stock Company
address:        7, Tverskaya street
address:        125375,
address:        Moscow
address:        RUSSIAN FEDERATION

I think I need to install the honeypot ckeen was talking about.

Comments on 2018-07-15 Russian Gopher Idiots at Work

Time to take another look at the tarpit idea.

– Alex Schroeder 2018-07-16 06:38 UTC


I just added default Surge Protection.

– Alex Schroeder 2018-07-16 07:41 UTC


Apparently it doesn’t help. I come back to the server and my web and gopher services are down. Load is up to 128. I fiddled with the code some more. These people wrote a gopher bot that requests pages faster than anybody could read them, forcing me to spend time writing code I’m not interested in, and I’m growing angrier every time I’m forced to look at this.

Right now their bot is behaving, though: twenty requests in around 50s.

– Alex Schroeder 2018-07-16 18:56 UTC

2018-07-13 Killing Gopher From Russia

My gopher server crashed...

Remember Killing Gopher Servers From Russia, part 1, from April 2018? Well, it’s July and they’re at it again.

$ bin/time-grouping-gopher < farm/gopher-server.log.1
         Hour Connections   [%]  Selectors   [%]
2018-07-12 06          22    1%         22   1%
2018-07-12 07          38    1%         38   1%
2018-07-12 08          32    1%         32   1%
2018-07-12 09          35    1%         35   1%
2018-07-12 10          34    1%         34   1%
2018-07-12 11          37    1%         37   1%
2018-07-12 12          38    1%         38   1%
2018-07-12 13          35    1%         35   1%
2018-07-12 14          32    1%         32   1%
2018-07-12 15          36    1%         36   1%
2018-07-12 16          35    1%         35   1%
2018-07-12 17          38    1%         38   1%
2018-07-12 18          39    1%         39   1%
2018-07-12 19          41    1%         41   1%
2018-07-12 20        3619   88%       3607  88%

OK, so who did this?

$ bin/ip-numbers-gopher < farm/gopher-server.log.1 | head -n 2
                  IP Connections   [%]
        90.154.53.13        3610   88%

And who is this?

$ whois 90.154.53.13|grep "org-name\|address"|head -n5
org-name:       "Central Telegraph" Public Joint-stock Company
address:        7, Tverskaya street
address:        125375,
address:        Moscow
address:        RUSSIAN FEDERATION

And who was it last time? It was 79.165.173.172, also “Central Telegraph”, Russia.

Idiots!

To get a feeling for those 3610 requests:

$ grep 90.154.53.13 < farm/gopher-server.log.1 | head
2018/07/12-20:01:50 CONNECT TCP Peer: "[90.154.53.13]:59651" Local: "[178.209.50.237]:70"
2018/07/12-20:01:51 CONNECT TCP Peer: "[90.154.53.13]:59776" Local: "[178.209.50.237]:70"
2018/07/12-20:01:51 CONNECT TCP Peer: "[90.154.53.13]:59808" Local: "[178.209.50.237]:70"
2018/07/12-20:01:51 CONNECT TCP Peer: "[90.154.53.13]:59825" Local: "[178.209.50.237]:70"
2018/07/12-20:01:51 CONNECT TCP Peer: "[90.154.53.13]:59900" Local: "[178.209.50.237]:70"
2018/07/12-20:01:51 CONNECT TCP Peer: "[90.154.53.13]:60000" Local: "[178.209.50.237]:70"
2018/07/12-20:01:52 CONNECT TCP Peer: "[90.154.53.13]:60020" Local: "[178.209.50.237]:70"
2018/07/12-20:01:52 CONNECT TCP Peer: "[90.154.53.13]:60083" Local: "[178.209.50.237]:70"
2018/07/12-20:01:52 CONNECT TCP Peer: "[90.154.53.13]:60085" Local: "[178.209.50.237]:70"
2018/07/12-20:01:52 CONNECT TCP Peer: "[90.154.53.13]:60123" Local: "[178.209.50.237]:70"
$ grep 90.154.53.13 < farm/gopher-server.log.1 | tail
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62389" Local: "[178.209.50.237]:70"
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62397" Local: "[178.209.50.237]:70"
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62400" Local: "[178.209.50.237]:70"
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62415" Local: "[178.209.50.237]:70"
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62418" Local: "[178.209.50.237]:70"
2018/07/12-20:17:01 CONNECT TCP Peer: "[90.154.53.13]:62427" Local: "[178.209.50.237]:70"
2018/07/12-20:17:02 CONNECT TCP Peer: "[90.154.53.13]:62436" Local: "[178.209.50.237]:70"
2018/07/12-20:17:02 CONNECT TCP Peer: "[90.154.53.13]:62442" Local: "[178.209.50.237]:70"
2018/07/12-20:17:02 CONNECT TCP Peer: "[90.154.53.13]:62449" Local: "[178.209.50.237]:70"
2018/07/12-20:17:02 CONNECT TCP Peer: "[90.154.53.13]:62471" Local: "[178.209.50.237]:70"

It took them 16 minutes to take out the server – close to four requests per second, sustained...
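
For what it’s worth, you don’t strictly need bin/time-grouping-gopher and bin/ip-numbers-gopher for a quick look; given the log format above, standard tools give you roughly the same numbers, minus the percentages:

cut -c 1-13 farm/gopher-server.log.1 | sort | uniq -c    # connections per hour
sed -n 's/.*Peer: "\[\([0-9.]*\)\].*/\1/p' farm/gopher-server.log.1 \
  | sort | uniq -c | sort -rn | head                     # connections per IP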

2018-04-11 Killing Gopher Servers From Russia

My Gopher server crashed and burned today. When my monitor finally killed it, it took so long to shut down that the address was still in use when the replacement got started and so it didn’t get back up. What was this all about?

alex@sibirocobombus:~$ bin/time-grouping-gopher < farm/gopher-server.log.1
         Hour Connections   [%]  Selectors   [%]
2018-04-10 06          60    1%         60   1%
2018-04-10 07          84    2%         84   2%
2018-04-10 08          77    2%         76   2%
2018-04-10 09          55    1%         54   1%
2018-04-10 10          40    1%         39   1%
2018-04-10 11          39    1%         39   1%
2018-04-10 12          81    2%         81   2%
2018-04-10 13          62    1%         62   1%
2018-04-10 14          36    1%         36   1%
2018-04-10 15          40    1%         40   1%
2018-04-10 16          72    1%         72   1%
2018-04-10 17          45    1%         45   1%
2018-04-10 18         151    3%        151   3%
2018-04-10 19        4202   83%       4182  83%

OK, so somehow somebody felt it was OK to write a bot that made 4202 connections in 3600s – that’s more than one request per second, sustained for a whole hour. Please don’t be this person.

What do we know about this person?

alex@sibirocobombus:~$ bin/ip-numbers-gopher < farm/gopher-server.log.1 | head -n 2
                  IP Connections   [%]
      79.165.173.172        4162   83%

What does WHOIS tell us?

inetnum:        79.165.160.0 - 79.165.175.255
netname:        Neo-CNT
descr:          BRAS E-320-32 DHCP-pool
descr:          Russian Central Telegraph, Moscow
country:        RU

Thanks, person.

Comments on 2018-04-11 Killing Gopher Servers From Russia

C-Keen says this same IP made him implement the “tarpit”: gopher://vernunftzentrum.de:70/0/ckeen/phlog/2018-04-09-Dealing-with-rogue-crawlers.md

– Alex Schroeder 2018-04-11 23:26 UTC
