About this website

Facebook has been getting less and less cool with each passing year. It used to be like Myspace (which also used to be cool), where your friends would actually post stuff, and it was interesting. Now, Facebook is mostly posts I don't care about: political videos, memes, spoiled sports scores, and posts from companies I'm not interested in. There is very little posting by actual friends of mine. I decided it would be more fun to post on my own website. I also thought having my own website would be a fun learning experience, which it has been.

Domain Name

First, I needed a domain name. I decided to use Google Domains. They are no-nonsense, and it was easy to quickly get a domain name.

Choosing a place to host my site

This was a tough decision. Previously, I had a website through GoDaddy. It was extremely limited and frustrating, so I decided I wanted something else. I've heard a lot about the different cloud computing options: Amazon's AWS (Amazon Web Services), Microsoft's Azure, Google's Cloud Platform, and Digital Ocean. These are cloud computing services, so you can use them for all kinds of stuff, including hosting a website. I went with Amazon Lightsail for a while, but then the costs went up, and AWS doesn't have a way to cap prices, so in theory you could be billed a lot in a busy month. So I recently migrated everything to a Digital Ocean droplet. Basically, after you sign up, you get your own machine to do what you want with. There are some pre-loaded machines to choose from; some come with Windows, WordPress, MySQL, or whatever other popular software you want installed. I went with a bare-bones linux machine, as this is a learning experience and I want to set it up myself.

Accessing the machine

My linux machine happens to be somewhere in San Francisco, so I have to access it using a terminal. Amazon and Digital Ocean both have nice web terminals you can use, but I decided to use the built-in terminal on my own computer to connect to the linux machine in San Francisco. You use SSH keys to do this, to keep communication secure.
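
If you've never used SSH before, the workflow is short. Here's a rough sketch of what it looks like (the username and IP address below are placeholders, not my real ones):

    # generate a key pair on your own computer (a one-time step)
    ssh-keygen -t ed25519

    # Digital Ocean lets you paste the public key in when you create the droplet;
    # after that, connecting is one command (placeholder user and IP)
    ssh root@203.0.113.10

Once you're in, everything else in this post happens at that prompt.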

Linux

When you get your machine, it doesn't have a webserver. It doesn't have much at all, just a few default linux programs. To navigate this mystery linux machine, I've been using The Linux Command Line: A Complete Introduction. It is an excellent, well-paced tutorial on using linux and the command line. After just a few chapters, you'll be able to navigate the file system, read and write files, and have a general idea of what you can do. It has been a lot of fun learning linux and some of the more advanced commands.
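
For a sense of what "navigating" means, these are the kinds of commands the first few chapters cover:

    pwd                          # where am I?
    ls -l                        # what's in this directory?
    less somefile.txt            # page through a file
    cp somefile.txt backup.txt   # copy a file
    man cp                       # read the manual for a command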

Web server

There are a few web server options out there. At work we use Apache Tomcat and IIS, but I want this website to be a learning experience. Amazon had instructions for Nginx, but I went with classic Apache. It was simple to set up: yum, the package manager, does most of the setup for you automatically. Once this was set up, I changed some DNS settings so that my www.johnbyers.net domain would point to my brand new AWS linux machine's IP address in Oregon (after the move, I simply pointed the DNS at the Digital Ocean droplet's IP instead). Amazon's instructions made it easy.
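
The install itself is only a couple of commands. Roughly, on Amazon Linux, where the Apache package is named httpd (older images use service/chkconfig instead of systemctl):

    # install Apache
    sudo yum install -y httpd

    # start it now, and have it start again after a reboot
    sudo systemctl start httpd
    sudo systemctl enable httpd

After that, dropping an index.html into /var/www/html is enough to see a page.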

Making the website

I was torn between making a WordPress website or just a normal website. WordPress would be a good skill to have, and there are many good-looking templates available, but WordPress also has a steady stream of security flaws announced every month. This website is supposed to be for fun, not work, and I don't want to deal with SQL injection, insecure plugins, bots leaving comments, etc. So I decided to go with a simple, classic website. I still get bots and hackers scanning my site (I can tell from the logs), and it is nice knowing that there isn't much for them to attack.
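
A lot of what shows up in those logs is bots looking for WordPress files that aren't there, so a quick grep of the access log makes them easy to spot (the path below is the Amazon Linux / CentOS default; it varies by distro):

    # look for bots probing for WordPress logins on a site that has no WordPress
    grep -E "wp-login|xmlrpc" /var/log/httpd/access_log | tail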

My site started out as simple as possible, just plain HTML pages. This quickly became a hassle, as I found myself copying the exact same code for the header onto every single page, and when I made a change, I had to change it on each page. I decided to make the header dynamic, loaded through JavaScript. This worked, but I changed my mind and decided to do it in PHP instead. PHP does it all server-side, so when a page is requested, the server assembles the whole thing and you get the finished page right away. So if someone is using the Lynx browser, or doesn't want to use JavaScript, they'll still get the full page. Plus it is an opportunity to learn PHP.
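
The PHP version of this is about as small as it gets. A sketch of the idea (the file names are just examples, not my actual layout): the shared header lives in one file, and every page pulls it in.

    <!-- header.php: the one shared copy of the header -->
    <nav>
      <a href="/">Home</a>
      <a href="/about.php">About</a>
    </nav>

    <!-- about.php: an ordinary page that includes it -->
    <?php include 'header.php'; ?>
    <h1>About this website</h1>
    <p>Page content goes here.</p>

Change header.php once, and every page picks it up on the next request.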

GitLab

It isn't always easy using the command line to edit pages. I prefer using my personal computer to write and edit my web pages; that way, I can use a mouse and any editor I want. However, then I have to get the files onto the machine in San Francisco somehow. First I used linux's secure copy (scp) command to copy the files to the remote machine. This was OK, but not very convenient. Then I decided to start using Git to version my website and push my Git repo to GitLab. I chose GitLab because it has a nice web interface and free private repositories. Once my website is pushed to GitLab, I can pull the repo onto any other computer...including my San Francisco web server machine. This has been simple enough so far. Plus, my site content is now backed up in a few places.
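
For the curious, the two workflows look roughly like this (the username, IP, and paths are placeholders):

    # the old way: copy files straight to the server with scp
    scp -r ~/website/* user@203.0.113.10:/var/www/html/

    # the current way: commit and push from my computer...
    git add .
    git commit -m "Update the about page"
    git push origin master

    # ...then log in to the server and pull
    ssh user@203.0.113.10
    cd /var/www/html && git pull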

SSL

The modern web is moving to using SSL everywhere. Chrome and Firefox show warnings when a site uses plain HTTP, and even though my site doesn't need any security, I thought it would be fun to get a security certificate for my site. Obviously I didn't want to pay for one, but thankfully, Let's Encrypt is a new certificate authority that offers free certificates. I used the Electronic Frontier Foundation's command line tool, Certbot, to get a Let's Encrypt certificate for my site. This wasn't as easy as I thought it would be, but hopefully the process gets easier in the future. And even though I don't need a certificate, at least now Comcast or another shady ISP can't inject ads into my pages.
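
For reference, the happy path with Certbot's Apache plugin is roughly this (it edits the Apache config and installs the certificate for you; Let's Encrypt certificates last 90 days, so it's worth checking that renewal works too):

    # get and install a certificate for the domain
    sudo certbot --apache -d johnbyers.net

    # do a test run of the renewal process
    sudo certbot renew --dry-run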

Another note on this: my domain is johnbyers.net, and I used Certbot to get a certificate for this domain. However, it isn't a wildcard certificate, so subdomains like www.johnbyers.net would show up as untrusted, and browsers would show big warnings. To get around this, I followed the examples in Certbot's documentation and ran certbot --expand -d johnbyers.net,www.johnbyers.net to get a certificate that covers the www subdomain as well.

Testing

Before I update my website, I test it on my personal computer. PHP comes with a built-in web server for testing, so I run the command php -S localhost:8000, which starts a web server that can process PHP. Then I can open a browser, pretend that localhost:8000 is the same as www.johnbyers.net, and test everything. It works great.
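
The only trick is running it from the folder that holds the site, or telling it where the site lives with the -t (document root) flag. The path below is just an example:

    # serve the site from the folder that holds it
    cd ~/website
    php -S localhost:8000

    # or point the built-in server at the folder explicitly
    php -S localhost:8000 -t ~/website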

Email

I wanted email to work with my new domain name. Google's G Suite and Microsoft's Office 365 were both options I considered, as you can use their services with your own domain name, but both were $60/year for the cheapest plan. I decided to go with Zoho, who offer a similar service for free (there are paid options as well). They had a great tutorial on how to set up mail for my domain, validate that I owned the domain, prevent spam, etc. So far it has been good.
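
Most of the setup is just adding DNS records at the registrar: a TXT record to prove you own the domain, MX records so mail for the domain gets routed to Zoho's servers, and an SPF/DKIM record or two to help with spam. The MX records looked something like this (check Zoho's tutorial for the current host names and priorities):

    @   MX   10   mx.zoho.com.
    @   MX   20   mx2.zoho.com.
    @   MX   50   mx3.zoho.com.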

Closing Thoughts

I realize this site looks straight out of the '90s, but that is OK. It could easily be spruced up with some CSS, which I may do when I have more time. This site is how, in my opinion, the web is supposed to be. Nowadays, we are used to crazy and fantastic websites that do everything: video, massive amounts of JavaScript, Facebook and Google tracking you, and ads everywhere. I am guilty of this too; my apps at work are mostly single-page apps that do too much. But the web was meant for documents, not crazy web apps, and documents are what my site serves. I want to keep it simple.