Live over Tor

One of the things I’ve been meaning to do for a while is gain a better understanding of the onion protocol and how hidden services are hosted. I got about 50% through the Tor whitepaper before deciding to pivot to some practical applications, so this blog is now mirrored at http://xzjjcvowtdunfx4z6dkeund7sjvt3k7nphgcfdusy64smyqpmdusmpad.onion/. It’s v3, which can be quickly inferred from the 56-character address.

The steps

The static files for this blog reside on a 3rd gen Raspberry Pi and are served by nginx, so we’ll set up Tor there as well.

Installing Tor requires either building from source or adding some 3rd-party APT repositories. I used the repos because building would take a while. There are some gotchas, however, since Raspbian differs from Debian in a few ways; the Tor Project’s apt documentation covers them in more detail.

In short, it’s sufficient to add the following to your /etc/apt/sources.list. The [arch=amd64] is important, otherwise your apt update will fail. Also pay attention to your Raspbian version: is it jessie, stretch or buster? Double-check with cat /etc/os-release.

deb [arch=amd64] https://deb.torproject.org/torproject.org buster main
deb-src [arch=amd64] https://deb.torproject.org/torproject.org buster main

Add the package signing GPG keys.

curl https://deb.torproject.org/torproject.org/A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89.asc | gpg --import
gpg --export A3C4F0F979CAA22CDBA8F512EE8CBC9E886DDD89 | apt-key add -

Hope this works.

apt update && apt upgrade
apt install tor

Next, ensure the hidden service dir /var/lib/tor/hidden_service exists and has the correct permissions.

sudo chown -R debian-tor /var/lib/tor/hidden_service
sudo chmod 700 /var/lib/tor/hidden_service

Edit /etc/tor/torrc to reflect the setup:

HiddenServiceDir /var/lib/tor/hidden_service
HiddenServicePort 80 127.0.0.1:80

Finally, add a vhost to nginx. Your web root may vary, so double-check that.

server {
    root /var/www/html;
    client_max_body_size 32M;
    charset utf-8;
    index index.html;
}

Now do a service tor restart && service nginx reload, and you should find a hostname file (plus the hs_ed25519 key pair) in /var/lib/tor/hidden_service. The hostname is your onion URL, so go ahead and test it. If something went wrong, which is highly likely, tail /var/log/syslog, /var/log/nginx/access.log and /var/log/nginx/error.log for hints. Apparently Tor is supposed to generate logs in /var/log/tor but mine is empty.

Ah, one more thing

Setting up a middle/guard relay is pretty trivial, and a great way to help out the community. Add a few lines to /etc/tor/torrc and ensure port 9001 is reachable from outside via a port forward on your router. Then restart, and we’re good to go. Keep tailing syslog, however, since it will indicate if things go wrong.

ORPort 9001
ExitRelay 0
SocksPort 0
ControlSocket 0
Nickname CiaSurveillanceVan

After about an hour we can verify the relay is operational via the Tor Metrics relay search. Fantastic.

By hosting a hidden service, however, I’ve now created a new set of problems for the poor Pi, and by extension my home network connection, in the form of eventual ruthless DDoSing. There are posts out there discussing the problems and solutions, and they’re a good starting point for hardening the home network.

In theory though, v3 addresses should be impossible to crawl unless publicly advertised. Still, for my peace of mind it’s best to take precautions. For now the first step is getting some better network monitoring in place, and from my understanding Munin fits the bill. Updates to follow, as they’re implemented.

Bonus pro-tip

Now would be a good time to run a backup of the Pi’s SD card, to save our hours of toil. I have an additional USB drive attached and mounted, so I’ll drop it there. Note that your SD card might have a different device name, so check with lsblk.

dd bs=4M if=/dev/mmcblk0 of=/media/pi-space/sd-card-backup-2021-02.img
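Since raw card images eat disk fast and empty blocks are mostly zeros, the same backup can be compressed on the fly. A hedged sketch, wrapped in a function so the device and destination stay overridable (the defaults below are this Pi’s, not universal):

```shell
#!/bin/sh
# Back up an SD card image through gzip to save space on the USB drive.
# Check your actual device name with lsblk before trusting the default.
backup_sd() {
    src=${1:-/dev/mmcblk0}
    dest=${2:-/media/pi-space/sd-card-backup-2021-02.img.gz}
    # Empty blocks compress to almost nothing, so the .gz ends up far
    # smaller than the raw card. Restore later with:
    #   gunzip -c "$dest" | dd of="$src" bs=4M
    dd bs=4M if="$src" | gzip -c > "$dest"
}
```

Call it as root, e.g. `backup_sd /dev/mmcblk0 /media/pi-space/backup.img.gz`.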


The evergreen home networking blog post

I’ve reached the point in my life where I find home networking fun. It’s either that or a model train set. The catalyst was finally setting up an OpenWrt router, which gives me options. Putting the Vodafone KabelBox modem in full bridge mode is mandatory for proper NAT resolution and for accessing local services from the outside. The hosting hub is a Raspberry Pi 3 running a number of additional services like nginx.

OpenWrt & KabelBox

Luckily Vodafone are nice enough to provide the option to toggle bridge mode on their router/modem devices, though it was deeply buried in the settings.

However, after enabling it my internet connection died. I noticed the router was being assigned an IP on the WAN interface from the gateway (modem), but I couldn’t ping through to any DNS servers. After investing several hours reading the OpenWrt docs and searching online, I noticed a tip about restarting the cable modem in order to assign the router a new IP for the WAN interface (one that isn’t from the old 192.168.0.x pool, but instead something fresh from the ISP, in a completely different subnet). This interface remains running as a DHCP client, which is the default; in fact, no configuration beyond the wifi setup had to be done on the router. Once again the rule of thumb about computers holds: if in doubt, restart.

Cloudflare & DNS

On the Cloudflare side, I had to create a couple of records and then use my Global API key (an API token didn’t work for some reason) in order to run a cron task which updates the A record whenever the router receives a new public IP from the ISP. We want this router accessible from the internet. Hackers welcome, I guess. The CF DNS records consist of an A record initially set to anything, but subsequently updated by the cron task running on OpenWrt: an A record mapping the dynamic subdomain to the router IP, and a CNAME pointing rpavlov at it. Pinging either one will resolve to CF’s servers, so my IP is nicely hidden and protected from DDoS. The steps are outlined more clearly in Cloudflare’s documentation. Finally, add a Page Rule to redirect all traffic to https.
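The cron task boils down to one PUT against Cloudflare’s v4 API. A sketch under stated assumptions: the zone id, record id, email and Global API key below are placeholders (supply your own via the environment), and `ifconfig.me` stands in for whatever what-is-my-IP service you prefer.

```shell
#!/bin/sh
# Hypothetical dynamic-DNS updater for the OpenWrt crontab.
# CF_ZONE_ID / CF_RECORD_ID / CF_EMAIL / CF_API_KEY must be set by you;
# none of the values here are the author's real ones.
CF_API="https://api.cloudflare.com/client/v4"

cf_record_url() {
    # Endpoint for a single DNS record: zone id first, then record id
    echo "$CF_API/zones/$1/dns_records/$2"
}

update_record() {
    # Public IP as the ISP sees it
    ip=$(curl -s https://ifconfig.me)
    # Overwrite the existing A record with the fresh address
    curl -s -X PUT "$(cf_record_url "$CF_ZONE_ID" "$CF_RECORD_ID")" \
        -H "X-Auth-Email: $CF_EMAIL" \
        -H "X-Auth-Key: $CF_API_KEY" \
        -H "Content-Type: application/json" \
        --data "{\"type\":\"A\",\"name\":\"dynamic\",\"content\":\"$ip\",\"ttl\":120,\"proxied\":true}"
}

# Only fire when credentials are actually configured
[ -n "${CF_API_KEY:-}" ] && update_record || true
```

On OpenWrt this would live in /etc/crontabs/root, e.g. `*/15 * * * * /root/update-dns.sh`.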

Raspberry Pi

  • Disable ssh password. While we’re at it, only allow ssh into the router from the LAN interface.
  • We need to reserve a static ip for the pi in the LAN. Easily done through the open-wrt Luci web interface.
  • We need to expose ports 80/443 in the router firewall on the WAN interface and route them to the Pi’s local IP on the LAN interface. Additionally we need to remap the port that the router’s web interface server (uHttpd) listens on to something else, in order to free up those ports.
  • Issue an SSL cert with Certbot, authenticating through a challenge file in the webroot.
  • Drop the static blog files in /var/www/html.
  • Add my hardened nginx config.
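The port-forwarding step above can be sketched as redirect sections in OpenWrt’s /etc/config/firewall. This is illustrative only; the Pi’s LAN address is assumed to be 192.168.1.10, so substitute the static lease you actually reserved:

```
# Hypothetical OpenWrt firewall redirects: WAN 80/443 -> the Pi on the LAN
config redirect
	option name 'http-to-pi'
	option src 'wan'
	option src_dport '80'
	option dest 'lan'
	option dest_ip '192.168.1.10'
	option dest_port '80'
	option proto 'tcp'
	option target 'DNAT'

config redirect
	option name 'https-to-pi'
	option src 'wan'
	option src_dport '443'
	option dest 'lan'
	option dest_ip '192.168.1.10'
	option dest_port '443'
	option proto 'tcp'
	option target 'DNAT'
```

The same rules can be clicked together in LuCI under Network → Firewall → Port Forwards; reload with /etc/init.d/firewall restart.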

Dropbox replacement

Dropbox can’t really be trusted, so I trade reliability and security for dubiously decreased paranoia. Instead I use Syncthing which runs on each client device (laptops, PC) and doesn’t require hosting. It works out of the box flawlessly.


Pi-hole

As easy as running curl -sSL https://install.pi-hole.net | bash on the Pi, and following the steps. They are also kind enough to explain why curl-to-bash is offered, and give you options if you want to audit the scripts yourself. Afterwards, set the DNS servers of each device on the home network to point to the Pi’s IP. Unfortunately the Vodafone router does not provide the option to set DNS servers, so this has to be done per device.

Next up

  • Funkwhale for streaming my own music files outside of Spotify.
  • A Tor relay.
  • This blog as a hidden service, which is pretty straightforward actually: Download Tor, modify the .../tor-browser_en-US/Browser/TorBrowser/Data/Tor/torrc to point to nginx, then access at the relevant vhost+port.


Much respect to the legion of clever elves who made all this software magic possible. Throughout the course of writing this post I also realized I’ll probably be touching computers until the day I die, which seems like a mixed blessing.

Update 2020-11-30

The dream is over. My internet speed plummeted when using the router, probably because it’s running on 10 year old hardware. In any case, it turns out I can still just forward ports 80/443 from outside to the static ip of the Pi, as well as run the cronjob to update the A record with the public IP of the Vodafone Router. One other thing I added was uptime checks from Google Cloud. Godbless.

Github pages bug

Barely a day goes by now where I don’t discover a bug in some service, tool or program. Sometimes it’s even the default configuration that’s wrong. Anyway, this one is a blog post because it’s about Github pages where this blog lives.

I noticed the site was 404ing after I made the repository itself private. I reverted that, but was still seeing a 404, and I hadn’t deployed any code changes either. I ruled out an issue with the custom domain name because the default github.io address was also 404ing.

Of course, others out there have had this same problem on and off since 2012 (according to the top answer on Stack Overflow). The solution is to delete and re-push your remote gh-pages branch. Pretty lame solution. Possibly a bug on Github’s end, according to the comments.

git checkout gh-pages
git push origin --delete gh-pages
git push origin gh-pages

It appears the option to have a private repo with public GitHub Pages is graciously on offer as a paid feature by GH.

Pandemic tunes

It seems this pandemic is shaking things up a bit, and enforcing some lifestyle changes. Not much to do but wait it out, listen to some music, and learn how to mix.


OWASP Automated threat handbook

Some quick thoughts after reading the OWASP Automated Threat Handbook.

These kinds of attacks are carried out by seemingly legitimate but actually malicious users of your application. The crux of it is: spend time thinking about how your application can be probed, scanned, scraped, flooded or, most commonly, have its otherwise normal functionality subverted.

The next step is to build in countermeasures. This is where a prioritized checklist would lend itself well to laying out which things to build or plan for. There is a huge variety of actionable steps to take: obfuscating URLs such that your site can’t be spidered, adding page and session-specific tokens, purposely load-testing/flooding your app, performing user-agent fingerprinting to weed out some automated requests, writing test cases for abuse scenarios, monitoring for anomalous requests and dozens of other things. The priority depends on your infrastructure inventory and your risk tolerance, which should hopefully have been clarified after your risk assessment (which you did, right?).

Automated threats kind of fall into two attacker use cases: application recon and unfair resource usage. In the context of e-commerce (where these threats are most common), this can mean sniping/hoarding concert tickets or scraping product prices, enumerating valid users or validating stolen credit cards. So, this means that when creating legitimate features take the time to think of how they can be subverted or have unintended uses. From a technical point of view, ensure you’ve performed security assessments and vulnerability scans, have monitoring and instrumentation in place and a whole bunch of other things like real-time detection and alerting from sources such as logs, DNS and computing resources. If you have health-checks you should have security-checks too, and spikes in CPU usage should be as concerning as spikes in unusual requests.
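As one concrete example on the nginx side (which already fronts this blog), request flooding can be blunted with the stock limit_req module. The zone name and the numbers below are illustrative, not a tuned recommendation:

```
# Illustrative flood control: a budget of 10 requests/second per client IP,
# with a burst allowance of 20 before nginx starts answering 503
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;
    location / {
        limit_req zone=perip burst=20 nodelay;
    }
}
```

The limit_req_zone directive goes in the http block; pair it with log monitoring so you notice when the limiter actually fires.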

© 2021 Roumen Pavlov. All rights reserved.