Build 100 Websites!

Howdy! The name's Andrew Quinn. I'm building 100 websites!

Why?

¯\_(ツ)_/¯ It seemed like fun!

I've been a computer guy for a long time, but even as a little kid I always saw building websites as this immensely daunting task that only a genius could pull off. Then I got a little older and got a computer of my own, and as the years went by, it gradually began to seem like a less daunting task.

I pretty much went through every phase of Unix geekdom you can imagine before, at the ripe age of 28, I finally admitted to myself "I just think websites are cool and I want to build some." Let's see, we have:

And here we are. Currently I work in the ops department of a very cool fintech firm, and I'm actually quite happy with that role -- but here's my LinkedIn and my GitHub profile anyway ;)

I think I might be the world's slowest-maturing web developer, since I got my first job in tech in 2009 as a Linux sysadmin intern at Akamai and just kind of never stopped messing around with Linux until it accidentally became my full-time career. But for some reason, I was always terrified of making the leap into real dev work - what if I'm not smart enough? What if I can't build the product fast enough? What if it's just a bubble? (Yes, I was actually asking myself this about software in 2020. Trauma is one motherfucker of a common-sense-scrambler.)

Website #1: https://build-100-websites.fun

That's right, this right here is website #1! Its primary purpose is simply to serve as a technique catalogue: how I went about doing everything, what big technologies I used, etc. It is unapologetically self-indulgent that way.

Major technologies employed

Website #1's technology stack

Plain ol' HTML

Ha, it's 2023 and I wrote all the HTML for this little page by hand!

But it's good to get back to basics every now and then. I mean really basic - I've had a lot of fun nights in my life reading old websites that look not at all dissimilar to this one, straight-up Web 1.0. My favorites are when you find a professor's website that looks like it might as well be the inspiration for danluu.com, and then you realize it's also the page of one of the most cited IEEE Fellows of all time. (Rest in peace, Dr. Taflove. I'll never forget what you did for me, and I wish I hadn't been as mentally ill as I was when I was your student. Maybe I could have made something more of myself.)

nginx

I actually have superkuh to thank for this choice. After my first-ever Hacker News submission hit the front page, they left a comment:

You can just write HTML too. It is much simpler, easier to maintain, and more secure (using a real webserver). Make .html files, open them in a text editor, type in the HTML. Here, I'll make an example like the minimum viable hugo.

        $ sudo apt-get install nginx
      
        <html>
          <head>
            <title>
              Lorem ipsum dolor sit amet
            </title>
          </head>
          <body>
            <h1>
              Lorem ipsum dolor sit amet
            </h1>
          </body>
        </html>
      

And now save it as index.html in the www directory that installing nginx creates. Check it out at http://127.0.0.1/ . Go to your router and forward port 80 to the IP address of the computer running nginx to make it public.

Hey, it looked simple enough. So I tried it out. And it worked! And it was actually really fun and satisfying!

Working in ops, I'd known of nginx for a long time, but it always sounded like the kind of thing that was just a little too low-level for me to bother with. I think I wasn't giving it enough credit - it's quite an improvement on Apache in terms of usability. Realizing I could literally just install it, stick an index.html into the /var/www/html/ it created, and then see it in Firefox has changed my tune.
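
If you ever want to sanity-check that whole loop from a terminal, it boils down to something like this (a sketch, assuming the stock Debian/Ubuntu package layout):

    # Install nginx; on Debian/Ubuntu it serves /var/www/html by default.
    $ sudo apt-get install nginx

    # Drop in a page. tee lets us write into the root-owned directory under sudo.
    $ echo '<h1>Hello from nginx</h1>' | sudo tee /var/www/html/index.html

    # And confirm it's actually being served.
    $ curl http://127.0.0.1/
    <h1>Hello from nginx</h1>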

There was only one small problem: nginx doesn't declare a UTF-8 charset by default. So my ¯\_(ツ)_/¯ rendered as a much more menacing Â¯\_(ãƒ„)_/Â¯ until I figured out how to turn that on:

  1. First find and open nginx.conf.
    1. Any time I have to edit a random config file on a Linux box these days, I just mindlessly hammer out sudo vi $(fdfind . '/' | fzf) and fuzzy-search until I find the boy; it's a lot easier than trying to use my half-remembered knowledge of the FHS to track it down.
    2. In this case, it was living at /etc/nginx/nginx.conf.
  2. Then find the http block and add charset utf-8; to it (or script the whole change - see the sketch after this list).
  3. Finally, run systemctl reload nginx.service and take a look! No more weird characters.
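
If you'd rather not open an editor at all, the edit is small enough to script. A minimal sketch, assuming the stock config where the http block starts at the left margin, and GNU sed:

    # Append "charset utf-8;" right after the http block opens.
    # (The indentation of the new line is purely cosmetic.)
    $ sudo sed -i '/^http {/a charset utf-8;' /etc/nginx/nginx.conf

    # Sanity-check the config, then reload.
    $ sudo nginx -t
    $ sudo systemctl reload nginx.service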

entr

There was of course one annoyance even with this bare-bones setup: I didn't want to have to actually edit this page at /var/www/html/, since that directory is owned by root. Half the reason I love sites like Netlify is that I can just set them up to rebuild any time I hit git push. And nginx doesn't come with live reload the way Hugo does.

So I installed an auto tab reloader extension for Firefox and set it to refresh every 5 seconds - a poor man's live reload, but easy enough. Then I installed entr, a magical little utility that runs a shell command whenever a given file changes. (Seriously, if you've never used this thing before, keep it in your back pocket. It is a game-changer when working on web stuff like this, up there with watch in a tmux window.)

I went to my build-100-websites repo, punched in

    echo index.html | sudo entr cp index.html /var/www/html
    

and we were off to the races!
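
If the site ever outgrows a single file, the same trick stretches a little further. A sketch using entr's -s flag, which runs its argument through a shell:

    # Re-copy every HTML file in the repo whenever any one of them changes.
    $ ls *.html | sudo entr -s 'cp *.html /var/www/html/'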

GitHub Pages (hosting)

Now superkuh was kind enough to give me port forwarding instructions, but I didn't want to pay for a whole VM just to put one HTML page online. At the same time, I had used Netlify and Heroku for everything of this sort my whole life, and I decided I wanted to try out GH Pages to see what all the fuss is about. I had the literal HTML sitting there, in GitHub - it was even called index.html, for crying out loud. How hard could it be?

Not hard at all, it turned out! Getting it to deploy on https://hiandrewquinn.github.io/build-100-websites/ was as close to a zero-config hosting experience as I've ever seen: Pick your branch, make sure the HTML page is named "index", and you're good.

Getting it to deploy on https://build-100-websites.fun was a little more involved, by necessity, since you have to set up your CNAME and ALIAS records and everything with a DNS provider. First, GH Pages commits a new CNAME file to the root of your Git repo containing the name of your website, as per the instructions.
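
The file itself could hardly be simpler - it's just the bare domain on a single line:

    $ cat CNAME
    build-100-websites.fun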

I also finally got an opportunity to use dig! I've wanted to use dig ever since I read this Julia Evans article about it.

    ➜  ~ dig www.build-100-websites.fun

    ; <<>> DiG 9.18.1-1ubuntu1.3-Ubuntu <<>> www.build-100-websites.fun
    ;; global options: +cmd
    ;; Got answer:
    ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 62786
    ;; flags: qr rd ra; QUERY: 1, ANSWER: 5, AUTHORITY: 0, ADDITIONAL: 1

    ;; OPT PSEUDOSECTION:
    ; EDNS: version: 0, flags:; udp: 65494
    ;; QUESTION SECTION:
    ;www.build-100-websites.fun.	IN	A

    ;; ANSWER SECTION:
    www.build-100-websites.fun. 227	IN	CNAME	hiandrewquinn.github.io.
    hiandrewquinn.github.io. 227	IN	A	185.199.110.153
    hiandrewquinn.github.io. 227	IN	A	185.199.109.153
    hiandrewquinn.github.io. 227	IN	A	185.199.108.153
    hiandrewquinn.github.io. 227	IN	A	185.199.111.153

    ;; Query time: 0 msec
    ;; SERVER: 127.0.0.53#53(127.0.0.53) (UDP)
    ;; WHEN: Sat Mar 04 16:15:38 EET 2023
    ;; MSG SIZE  rcvd: 156
    

Finally I deleted the default A records my DNS provider slaps in there and added GitHub Pages' own A and AAAA ones, to let me go to the apex domain, build-100-websites.fun, sans www.
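
For reference, here's roughly how the finished setup resolves. The four A records are the same GitHub Pages addresses that show up in the dig output above; the AAAA addresses are the ones GitHub documents at the time of writing, so double-check them against the current docs:

    $ dig +short build-100-websites.fun A
    185.199.108.153
    185.199.109.153
    185.199.110.153
    185.199.111.153

    $ dig +short build-100-websites.fun AAAA
    2606:50c0:8000::153
    2606:50c0:8001::153
    2606:50c0:8002::153
    2606:50c0:8003::153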

Website #2: https://azure-functions-datasette.azurewebsites.net/

I know, I know, you were expecting me to go for Hugo for #2. But no! At work, I happened upon some CSVs generated by some legacy code that I figured would be really helpful to have access to whenever I was on the corporate network, instead of rummaging around in my email to find them ('legacy' here means, in part, that these CSVs get created and mailed out to 5 or 6 people every day).

I've used Simon Willison's Datasette many times in the past and found it a simply remarkable little piece of software. I love SQLite, I love locally-hosted web apps, I love performance - what's not to love, really? But alas, the published guides on deploying Datasette all focus on public platforms: Google Cloud, Vercel, Fly.
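
If you haven't seen the workflow before, the local version is about three commands: sqlite-utils (another Willison tool) loads the CSV into a SQLite table, and Datasette serves it. The filenames here are made up for illustration:

    # Install both tools.
    $ pip install sqlite-utils datasette

    # Load a CSV into a table called daily_report inside reports.db.
    $ sqlite-utils insert reports.db daily_report legacy_export.csv --csv

    # Serve it, on http://127.0.0.1:8001/ by default.
    $ datasette reports.db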

I happen to work primarily in enterprise, and with Microsoft Azure in particular. And I think there's a really nice niche for creating tiny baked-data websites for company eyes only - so I decided to get some practice at home deploying Datasette to Azure on my own time. Just to make sure I knew what I was doing for the big time. ;)

Getting Azure set up

I've never used Azure in a personal capacity before, despite being a cloud administrator. So I treated this quite a bit differently from how I usually provision resources on the job, namely: I aggressively cost-optimized.

Following Simon Willison's instructions, I created my resources in the order Resource Group, Storage Account, Function App. I actually really like the simplicity of this stack - it's similar to working solely with Lambdas and S3 in Amazon Web Services. Easy to keep track of.
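
In az CLI terms, that order looks something like the following - every name, the region, and the runtime here are placeholders rather than what I actually used:

    # 1. A resource group to hold everything.
    $ az group create --name datasette-rg --location westeurope

    # 2. A storage account for the function's files and the baked-in data.
    $ az storage account create --name datasettestore123 \
        --resource-group datasette-rg --sku Standard_LRS

    # 3. The Function App itself, on the pay-per-use Consumption plan.
    $ az functionapp create --name my-datasette-fn \
        --resource-group datasette-rg \
        --storage-account datasettestore123 \
        --consumption-plan-location westeurope \
        --runtime python --functions-version 4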

Then came the bugs! Unsurprisingly, Mr. Willison does not spend as much time in Azure land as I do, and his codebase was slightly out of date. The instructions weren't clear-to-the-point-of-brainless, either, so I added a few bits of my own. Here are the results!

The cost of getting set up

After an evening or two of messing around with it, I finally got my Azure Function to work right! I was serving an honest-to-goodness serverless web app, and it wasn't even hard! For a while I loved living on the shoulders of giants... Until I saw the bill.

One cent. One whole cent a day. That's how much my little experiment was costing me, and it was entirely from storage costs. I couldn't believe it! I had an actual bill that wasn't from a DNS provider, for the first time in my life!

I knew from my work that storage is one of those bugbears you absolutely want to keep an eye on when running any kind of software business, because if you're doing things right (aka collecting a lot of data), it will only grow over time - and you'd better make sure you can afford that. I found I had on the order of 50 megabytes in blob containers for my serverless SQLite database searcher, alongside about 3 MiB of tables and 6 KiB of file shares. At current prices of $0.15 per GB, this would explain the approximately one-cent-a-day storage costs I saw. In other words, I could reliably bet on an average SQLite database, infrequently accessed and under, say, 10 MB, costing about 50 cents a month.
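
If you want to see where your own pennies are going, the az CLI can list blob sizes directly; the account and container names here are placeholders:

    # List each blob with its size in bytes, formatted as a table.
    $ az storage blob list --account-name mydatasettestore \
        --container-name mycontainer \
        --query '[].{name:name, bytes:properties.contentLength}' \
        --output table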

I figured this was pretty acceptable, actually, because I had another idea brewing...