Securing the site in the wild

by Misha Rumbesht

One can never feel too safe online nowadays – the openness of the world wide web remains a double-edged sword. Man-in-the-middle attacks, XSS injections, clickjacking – you name it – have been plaguing the internet since the dawn of the web.

Google took up the banner of internet security, and started aggressively bringing it to the masses, almost three years ago. Giving a small boost in the rankings to sites which have chosen HTTPS over HTTP sounds a bit unfair, until one considers the negative influence of sites served over plain HTTP on the general health of the web. What is worse, if even a single link on your page – be it to a web page, a script, or an image – is insecure, Google may choose to index the page as HTTP. Further, if the user navigates to a page served over HTTPS which loads resources over HTTP, those resources will be blocked, and the browser will issue a mixed-content warning in the console.
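The browser's rule can be sketched in a few lines – a hypothetical helper (not a real library) that flags the http:// subresources an HTTPS page would warn about:

```python
import re

def find_mixed_content(html: str) -> list[str]:
    """Return the http:// URLs referenced by src/href attributes --
    the subresources a browser would flag on a page served over HTTPS."""
    return re.findall(r'(?:src|href)="(http://[^"]+)"', html)

page = '''
<script src="https://example.com/js/app.js"></script>
<script src="http://cdn.example.com/js/analytics.js"></script>
<img src="/img/logo.png" alt="logo">
'''
# Only the second script is insecure; the https:// and
# root-relative references are fine
print(find_mixed_content(page))
```

Note that the root-relative URL (`/img/logo.png`) inherits the page's scheme, which is the simplest way to avoid the problem on one's own assets.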

So, it’s obvious – an SSL certificate is a must nowadays. Setting one up used to be a bit tricky, though. Not any more – Let’s Encrypt, “a free, automated, and open Certificate Authority”, has been steadily gaining popularity since its launch last year, and currently serves more than 35 million active certificates. Many hosts have implemented a one-click setup tool, as well as an automatic renewal process, for Let’s Encrypt certificates, taking the pain of certificate installation off users’ shoulders. This, in fact, is the issuer we use for our own website – you can check how well our certificates are set up on this SSL Labs page. There are, of course, plenty of other trusted SSL certificate issuers – Let’s Encrypt simply secures the connection, and one might sometimes want a “personalised” certificate instead.
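Where no one-click tool exists, the setup is still only a couple of commands. A sketch assuming the certbot client and an Nginx server, with example.com as a placeholder domain:

```shell
# Obtain a certificate and let certbot adjust the Nginx config itself
sudo certbot --nginx -d example.com -d www.example.com

# Let's Encrypt certificates are valid for 90 days –
# check that automatic renewal will go through when the time comes
sudo certbot renew --dry-run
```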

Security is not the only benefit which comes with HTTPS. HTTP/2, the “new” kid on the block (an HTTP protocol upgrade, really), is only available over HTTPS in practice, since browsers support it over TLS alone. It brings a lot of benefits to the site – it’s binary (not textual, like HTTP/1.1), potentially quicker thanks to multiplexing (parallel downloads of resources over a single connection), and it introduces header compression, making pages lighter and faster to load. All the major servers support it (Nginx is our preference here), and all the major browsers understand it, downloading the data in one continuous stream instead of the old waterfall.
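Enabling it is mostly a matter of server configuration. A minimal sketch of an Nginx server block – the domain and certificate paths are placeholders:

```nginx
server {
    # "http2" on the listen directive is all Nginx (1.9.5+) needs
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
}

# Redirect plain HTTP to the secure origin
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}
```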

HTTPS over HTTP is not the only consideration – to properly secure the site, one also needs to configure the headers which are served together with the website’s pages. Over time, more and more of them have been introduced to battle man-in-the-middle attacks, clickjacking, XSS injections, insecure referrers, and much more. Scott Helme does a great job of assessing all the header settings – here are ours.
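As an illustration, a typical set of these headers in Nginx might look like this – a sketch, not a drop-in policy, and the CSP value in particular has to be tailored to the site:

```nginx
# Force HTTPS for a year, including subdomains (HSTS)
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
# Forbid framing by other origins (clickjacking)
add_header X-Frame-Options "SAMEORIGIN" always;
# Stop browsers from MIME-sniffing responses
add_header X-Content-Type-Options "nosniff" always;
# Legacy XSS filter for older browsers
add_header X-XSS-Protection "1; mode=block" always;
# Do not leak full URLs to other origins
add_header Referrer-Policy "strict-origin-when-cross-origin" always;
# Only allow resources from our own origin (placeholder policy)
add_header Content-Security-Policy "default-src 'self'" always;
```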

And then, there is the choice of technology running the site. Even though static sites are regaining popularity thanks to the growing number of static site generators (Jekyll, Hugo, Hexo, and many more), bigger projects do require a more sophisticated CMS. With the increased complexity come more vectors of attack. To mitigate that, all the websites we run on a bigger CMS (open source is our choice here – WordPress and Joomla being the go-to ones) are protected by security plugins, and constantly monitored.

So here we are. Security is a no-nonsense business in web development, and should be taken very seriously.

Image: Christina Gottardi