Privacy.

Today there have been attention-grabbing headlines in a number of news outlets. One of these was “WhatsApp and iMessage could be banned under new surveillance plans”, from the Independent. The article outlined the possibility that technologies and applications such as WhatsApp would be banned because they allow users to send messages which are encrypted end-to-end. This falls in line with the new legislation that was rushed through during 2014, and the continuing loss of privacy that we have online.

One quote the article put heavy emphasis on, and which has in turn been picked up by several other news outlets, was as follows:

In our country, do we want to allow a means of communication between people which[…]we cannot read?

My initial urge was to get angry at how patently wrong the connection of encryption and privacy to terrorism and violence was. But then I decided to listen to the full comment from Cameron, rather than the paraphrased version. The full quote is as follows:

In our country, do we want to allow a means of communication between people which, even in extremis with a signed warrant from the home secretary personally, we cannot read?

It’s not much better, but it’s also not as bad as the original quote sounds. The issue is that, while I can’t say I want terrorists to be able to plot attacks on innocent people, I don’t believe this is the way to stop them. The supposed link between not having access to the content of every single communication made anywhere in the UK and terrorism “winning” is the key issue. It’s simply a complete fallacy, and allowing the PM to say it unopposed would mean accepting it as truth and letting the erosion of our online privacy accelerate.

Heavy Handed

Taking away everyone’s ability to access a completely private form of communication is a heavy-handed tactic which, as I’ve said before regarding government views and ideas on online freedom and privacy, won’t actually work. It is not possible to stop anyone from encrypting the communications that they send. It may be possible to stop a company from profiting by offering this type of service, thereby taking it away from the common user, but it is not possible to stop people from encrypting things themselves.

The types of people that really want communication which is encrypted end-to-end will be able to access it regardless of the law. Included in that user base are those who want to discuss illegal activities. It’s not difficult to find out how to set up a method of encryption such as PGP, and the active online community will no doubt offer a great deal of help to anyone who’s stuck.
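
Just to illustrate how low the bar is, here’s a minimal sketch (my own example, nothing from the article) of encrypting a message with nothing more than PHP’s bundled OpenSSL extension; a handful of lines, and the result is unreadable to anyone without the key:

<?php
// Minimal sketch, purely for illustration (not the PGP setup mentioned above):
// symmetric encryption via PHP's OpenSSL extension. The message is made up,
// and in reality the key would be agreed privately between the two parties.
$key = openssl_random_pseudo_bytes(32);                        // 256-bit secret key
$iv  = openssl_random_pseudo_bytes(openssl_cipher_iv_length('aes-256-cbc'));
$message = 'Meet at the usual place at eight.';

$ciphertext = openssl_encrypt($message, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
$plaintext  = openssl_decrypt($ciphertext, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);

echo base64_encode($ciphertext) . "\n"; // gibberish to anyone without the key
echo $plaintext . "\n";                 // the original message again

That isn’t a full end-to-end messaging system, of course, but the building blocks are freely available in every mainstream language, and no ban on companies selling encrypted services changes that.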

The Glowing Record of Piracy Laws

Further, piracy laws are always a hot topic and probably a good example to learn from. They’re failing so spectacularly that the “most pirated shows of the year” list is now reported and celebrated. This year Game of Thrones hit the top of the list for a third year in a row after being illegally downloaded at least 8.1 million times. Guess who lost out and weren’t able to enjoy their favourite TV show with everyone else – paying customers in both the UK and the US. Now guess who were able to enjoy it ad-free, only minutes after it finished its first airing in the US – those pirating the episode from around the world.

In the same way, a law banning completely encrypted, backdoor-free communication would simply make the majority of online users more vulnerable to having their personal communications leaked to the public. If 2013 and 2014 made anything clear, it’s that we don’t need to increase the likelihood of that happening.

Back to Work

To wrap up my rambling (and procrastination), I will simply conclude that, while I know giving up our privacy isn’t the right way to help the authorities deal with terrorism, I’m not entirely sure what is. I’d imagine that whatever the best solution turns out to be, it will involve far more general knowledge of technology and computer security within the UK government. The hackers and cyber criminals of the world are using social engineering, vulnerabilities in code and brute-force attacks to get what they want, and it’s working. Maybe trying something that works as well as the criminals’ methods would be a good place to start.

Lessons learnt today: Double quotes, redirection and rubbish routers

I haven’t posted here in a long while, and no doubt that will continue, unfortunately. However, tonight I’ve learnt (or in some cases re-learnt) a few, albeit simple, lessons, and it felt sensible to note them down so I can remember them in the future.

Double Quotes vs. Single Quotes in PHP

This was a massive rookie error and a sign that I haven’t worked in PHP much over the past year.

While tightening my database security, I ended up patching up some dodgy database-related PHP code in a particularly old project. I spent nearly half an hour trying to work out why my password-protected database user was being denied access to the database.

After a bit of debugging, I noticed the password was being cut off in the middle, and after further debugging and tracing the string back, I found that my randomly generated, double-quoted password string happened to have a ‘$’ character in it.

PHP (among other languages) tries to resolve variables within double-quoted strings, meaning “abc123$efg456” resolves to “abc123” if the variable $efg456 doesn’t exist in your script. The solution was simply to exchange the double quotes for single quotes.
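
As a quick illustration (with a made-up value rather than my real password):

<?php
$password = "abc123$efg456";  // double quotes: PHP looks for a variable called $efg456
echo $password;               // prints "abc123" (plus an undefined variable notice)

$password = 'abc123$efg456';  // single quotes: no interpolation at all
echo $password;               // prints "abc123$efg456"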

Lesson: If you’re working in a language which treats double and single quoted strings differently, check you’re using the right ones!

.htaccess redirection

.htaccess always ends up leeching away my time. This time I was trying to set up some redirects to treat a sub-directory as the root directory, but only if the file or directory didn’t exist in the root directory and did exist in the sub-directory.

This is simple enough if you know what the .htaccess variables mean, but in using examples and making assumptions I tripped myself up. So here’s the bit I learnt:

%{REQUEST_FILENAME} – This isn’t just the filename that was requested, but the full filesystem path to the file or script matching the request.
%{REQUEST_URI} – This is the path from the requested URL, relative to the host, starting with a slash (and excluding any query string).
%{DOCUMENT_ROOT} – This is usually the filesystem path to the root directory of your site (though I’m quite sure this is not always the case).

So given a request for index.html on a site whose document root is “/a/file/path/to/a/website”:

%{REQUEST_FILENAME} = /a/file/path/to/a/website/index.html
%{REQUEST_URI} = /index.html
%{DOCUMENT_ROOT} = /a/file/path/to/a/website

Simple when you know, but confusing otherwise! In any case, here’s the resulting rule I cobbled together:

# Only rewrite when the request doesn't match a real file or directory in the root,
# but a matching file does exist under /other (assumes RewriteEngine On is set earlier)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{DOCUMENT_ROOT}/other%{REQUEST_URI} -f
RewriteRule ^(.*)$ /other/$1 [L,QSA]

That won’t suffice if you need directories to work as expected, and it will only apply to files, but it’s as much as I need for now.

Lesson: Don’t assume things, especially when dealing with something as powerful as .htaccess files. The more you know and use it, the less of a pain it will be.

NAT Loopback and remote IPs not working locally

Having acquired a new domain name today, I decided to put it to work as a domain for my home server (with help from no-ip). Having set it all up, I came across a peculiar scenario: I could access the machine remotely via the domain (the outward-facing IP), and locally via the local IP address, but I couldn’t access the machine locally via the public IP or the domain name.

In a few minutes I realised that this was not so peculiar at all. The Network Address Translation (NAT) rules decide where inbound requests should go when they hit the router; I have my router set up to forward certain connections through to my server. However, these rules don’t apply to requests which pass through the router on the way out. I’d only be guessing, but I’d imagine this is because responses to requests across the Internet would otherwise have these rules applied to them as well, completely breaking the network.

Many routers solve this with NAT loopback, a feature (or further rule) which correctly resolves requests made to the router’s public IP from inside the network. It is commonly turned off due to security concerns, and in some routers it isn’t available at all.

Unfortunately, my Huawei HG533 router falls into the latter group, with no obvious plans of an upgrade which would fix this.
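
A common workaround in the meantime (my own note, not something specific to this router) is to point the domain at the server’s local address in the hosts file of each machine on the home network, for example (address and domain made up):

# /etc/hosts on Linux or Mac, C:\Windows\System32\drivers\etc\hosts on Windows
192.168.1.50    myhomeserver.example.com

Machines inside the network then resolve the name straight to the local IP, while everything outside still resolves it to the public IP as normal.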

Lesson: If you want to use one address to access a machine locally and remotely, ensure NAT Loopback is set up.

All simple stuff, but it’s been interesting learning about it all. Hopefully I can continue documenting small things like this over the next year; my final year of university should be a steep learning curve!

A “filtered” Internet

The recent news has featured two main stories, both of which have already been discussed to death. There were the futile efforts at keeping the royal baby story going while Kate had the baby, recovered and walked outside with him in her arms, and there was the huge trumpeting of Internet safety from David Cameron. The latter, which the Prime Minister set out as a way of stopping child abuse and guarding children from viewing adult content in one fell swoop, irritated me quite a lot. There were a number of reasons for my irritation…

TL;DR

  • Blacklisting search terms doesn’t stop any target audience from finding the adult or illegal material.
  • Parental controls at the ISP level are good, but the way they’re implemented should be regulated and the way they’re used should be determined by the account holder.
  • Parents don’t want their children to see this damaging content, so tell them how to guard against it.
  • Mainstream media is just as damaging as hard-to-find, extreme material online.
  • Aiming the blame at tech companies is lazy, and doesn’t attack the root cause.

Paedophiles don’t Google “child abuse”

Mr Cameron also called for some “horrific” internet search terms to be “blacklisted”, meaning they would automatically bring up no results on websites such as Google or Bing.

– BBC Technology News

This call for search companies to provide a blacklist is actually popular with a lot of people, on the basis that “if you can’t Google it, it won’t be there”. The problems with this method are pretty easy to see, while its effect would be pretty minimal:

  1. Those who actually want to find the horrific parts of the Internet certainly won’t be using Google in the first place, and even if they were, they wouldn’t be using any of the terms on that blacklist.
  2. Kids aren’t just going to accidentally stumble upon the keywords and innocently type them in; it’s not something that grabs their curiosity.
  3. Google and the other search engines already have filtering and safeguards against sites with adult themes.
  4. Because neither of the target audiences will be affected by the change, the only loss is to researchers or those with genuinely innocent searches which somehow fall into the blacklist.
  5. Finally, blacklists are easy to get around: by misspelling words, or using acronyms or substitute words, sites would simply target keywords which couldn’t be blacklisted.

The “accidentally stumbling” scenario which is often painted is also well guarded against nowadays, with further, deliberate steps being required to access content which is deemed inappropriate for some audiences. That is already provided for: many companies have whole departments devoted to ridding their services of this type of content, or at least flagging it as such to the user.

ISPs are not your babysitter

He told the BBC he expected a “row” with service providers who, he said in his speech, were “not doing enough to take responsibility” despite having a “moral duty” to do so.

– BBC Technology News

The blame was also passed on to ISPs because they weren’t doing enough, with Mr Cameron insisting that they filter the Internet service which they provide. This is where many who care about freedom and worry about censorship begin to get panicky because:

  1. What organisation is maintaining the filter, deciding what is adult content and what is not? Is it down to the ISPs? Is it government-enforced? My bet is that it won’t be a free, open body that decides these things, unlike the rest of the Internet.
  2. There is a worry that these filters will be leaned on too much, rather than parents learning about the service they have and how to control it so that it is child-friendly in their opinion. Why is it the ISP’s job? You won’t find Freeview users being asked whether they’d like adult content on their TVs; instead, there are tools set up by each end-user which filter the service at their end.
  3. For many, myself included, there is a sense that, despite not viewing any of the adult content which would be blocked, the restriction on the service is unwelcome and quickly feels like a policed, monitored service is being provided, instead of the fully fledged, free Internet that we currently use.

On speaking to my family about this point, it was quickly mentioned that “It’s just like any other law, you wouldn’t complain about not being allowed to drink and drive”. This argument is immediately flawed, because I still have the ability to have a few too many beers, walk out to my car and drive it; there isn’t a breathalyser test which I must take before driving, or a guard who won’t allow me to get behind the wheel when I’m over the limit – I still have the ability to commit the crime.

To make matters worse, after the announcement from the Prime Minister, many ISPs came out and described the parental tools already supplied with their service, alluding to the fact that David Cameron’s jubilance was simply taking credit for work that had already been done.

For the record, I’m actually all for parental controls at the service level: they provide service-wide coverage for the filtering of websites, which means most children will be unable to get past the filters (as most mobile networks impose filters, as do all schools and colleges), especially if the filtering includes proxy servers and websites. However, it certainly doesn’t stop anyone who is over 18 from removing the filters and accessing illegal content online. The idea is also to apply the filters automatically, going over the parents’ heads and assuming their computer illiteracy, which leads me on to…

Work with parents, not for them

Most parents are probably quite concerned at the thought of their child being exposed to adult content online. Many will actually ensure that a child’s Internet use is supervised at an early age, and where the filtering which parents should be applying to their Internet connection is missing, it is rarely absent through choice. Technology companies and the government (as they’re so interested) should be working with parents, providing support, training and user-friendly tools, rather than just applying a blanket filter to all their account holders and patronising them. Clearly this is happening to some extent already, but the level of involvement could certainly do with raising.

As much as many people protest, the parents do have a huge responsibility when they allow their children to use the Internet, just as they do in not allowing them to watch films with an age rating which is too old for them or watch TV channels which are specifically intended for adults. They should be provided with the necessary information to fulfil this responsibility, but otherwise it should be up to them.

Realise that mainstream media isn’t squeaky clean

The other point that has annoyed many people is the fact that much of the mainstream media now has strong sexual, violent or addiction-promoting themes, which have been entirely overlooked in this review.

Examples are too numerous to count, but obvious ones can be found in most tabloid newspapers, gossip magazines and adverts (printed, televised or digital). These slightly softer images are just as damaging as the more extreme images on the Internet, because they appear in publications and media channels which carry everyday information alongside them, normalising them and sending similarly bad messages to children. If the Internet is going to be filtered, the filters already in place in other forms of media should be reviewed as well.

The tech companies don’t produce the illegal content

The final, and most concerning, thing is that the government seem to be aiming their efforts at the wrong end of the whole issue.

  1. The ISPs provide a service, plain and simple: they serve up whatever is available on the Internet – that’s what they’re meant to do.
  2. The search engines are tools which find relevant information on the Internet in relation to a selection of keywords entered by a user – that’s what they’re meant to do.

They do not produce illegal material, nor do they promote it. Asking them to do something about that material is already too far along the line, because by then the illegal act has occurred, been filmed or in some way documented, uploaded publicly, and made accessible to anyone that knows where and how to get the content.

It’s like those weed killer adverts: the competitor product always just kills the weed at the surface, not at the root – the material can easily be hidden from view, but that doesn’t address the root cause.

What really needs to be done is further investigation, and more of a focus on finding those that cause harm to others for personal gain. Unfortunately, Mr Cameron’s budget cuts are actually doing the complete opposite – that’s what makes me quite so irritated.

 

The whole issue is huge, and calls into question the privacy of Internet users and the freedom of the Internet itself, as well as the degree to which the government are misguided when the word “technology” is mentioned. What David Cameron has suggested will be detrimental to normal Internet users, bastardise the Internet and completely fail to achieve most of the aims which he has set out.</rant>