Securing Forms and URL Data Submissions


Please see my article Vulnerabilities in Forms and URL Parameters for more information on the vulnerabilities themselves.

There are several ways to make the data transmitted from forms and URL parameters more secure. You can adopt some of these practices, or all of them for extra security. Below I've listed a few important guidelines and other tips that can be used to secure forms.


Golden Rule #1
Never rely on JavaScript for error checking or to validate submitted content. This is a serious mistake to make. JavaScript is easily manipulated because it is client-side script: the code runs on the client's machine, where the server-side script has no control over it. A user can edit this code and change the way it processes data. This leads on to the next rule!

Golden Rule #2
Always check submitted data via server-side scripting. Always! It does not matter who you are, where you came from or the colour of the horse you rode in on. Always check data submitted by a human or a remote machine on the server. There is no question about it. Humans can insert all kinds of random data that could break your system or cause undesired effects and processes to occur. It is also wise to check the content of hidden form fields, to make sure they are valid and correct; hidden fields can be manipulated with ease.
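To illustrate the rule, here is a minimal sketch of server-side validation in Python. The field names and rules are hypothetical examples; the point is that every field, including hidden ones, is re-checked on the server regardless of what any client-side script did.

```python
import re

def validate_submission(fields):
    """Return a list of errors for a dict of submitted form fields."""
    errors = []
    # Never trust a hidden field: re-check it like any other input.
    if fields.get("command") not in ("edit", "view"):
        errors.append("invalid command")
    # Quantity must be a positive integer, not merely "something numeric".
    if not re.fullmatch(r"[1-9][0-9]*", fields.get("quantity", "")):
        errors.append("invalid quantity")
    # An email field should at least look like an email address.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", fields.get("email", "")):
        errors.append("invalid email")
    return errors
```

Because this runs on the server, a user who disables or edits the page's JavaScript gains nothing: tampered data is still rejected.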

Checking submitted data will save you a lot of trouble. In addition, you should be very aware of SQL injection, something to research if you are not familiar with it. If you use the same routine to save certain data to the database, another valuable practice is to have that code check the data before each save. Then, wherever the data comes from, the same saving script will always validate it and raise an error if a problem is found. With this method bad data should never be saved.
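The two ideas above can be combined in one save routine, sketched here in Python with sqlite3 standing in for any database driver. The table and column names are hypothetical; the validation runs on every save, and the parameterised query keeps the data out of the SQL itself.

```python
import sqlite3

def save_comment(conn, user_id, body):
    """Validate, then save a comment; the single entry point for writes."""
    if not isinstance(user_id, int) or user_id <= 0:
        raise ValueError("invalid user id")
    if not body or len(body) > 1000:
        raise ValueError("invalid comment body")
    # Placeholders (?) keep data separate from the SQL, so input like
    # "'; DROP TABLE comments; --" is stored as plain text, not executed.
    conn.execute("INSERT INTO comments (user_id, body) VALUES (?, ?)",
                 (user_id, body))
    conn.commit()
```

Because every caller goes through `save_comment`, it is impossible for unchecked data to reach the table, no matter which form or process submitted it.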

There are various types of checks that can be made on data: does the submitted value make sense, is it a valid number, and is the user actually permitted to interact with a certain element or process (e.g. when deleting an image, does the user have permission)? The general rule is to check as much data as possible.
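The permission check mentioned above can be sketched like this; the in-memory image store and the ids are hypothetical stand-ins for a real database lookup.

```python
# Hypothetical image store: maps image id to its owner's user id.
IMAGES = {8432: {"owner_id": 462}}

def can_delete_image(user_id, image_id):
    """True only if the image exists and belongs to this user."""
    image = IMAGES.get(image_id)
    return image is not None and image["owner_id"] == user_id

def delete_image(user_id, image_id):
    # Refuse unknown images and images the user does not own.
    if not can_delete_image(user_id, image_id):
        raise PermissionError("not allowed to delete this image")
    del IMAGES[image_id]
```

Even if an attacker submits someone else's image id, the ownership check on the server blocks the deletion.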

Below are a couple of rules that should be followed at all times where relevant. These will negate the effects of bad data being submitted. Not all data can be checked; for example, if someone guesses a correct command parameter then the system will see it as valid. E.g.

<input type="hidden" name="command" value="edit" />

if a hacker changed this line to

<input type="hidden" name="command" value="delete" />

and delete is a valid command, then the system will accept this data as correct.
Generally you will only need to validate data that is submitted to a class or process.
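One defence against a guessed-but-valid command is to check it not only against the commands that exist, but against the commands this particular user is allowed to run. A minimal sketch, with hypothetical role names:

```python
# Hypothetical per-role command whitelist.
ALLOWED_COMMANDS = {
    "viewer": {"view"},
    "editor": {"view", "edit"},
    "admin":  {"view", "edit", "delete"},
}

def authorise_command(role, command):
    """True only if this role is explicitly granted the command."""
    # "delete" is a real command, but only admins may run it, so an
    # editor who swaps value="edit" for value="delete" is still refused.
    return command in ALLOWED_COMMANDS.get(role, set())
```

This turns "is the command valid?" into "is the command valid for this user?", which is the question that actually matters.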

Ok, so here are the tips and tricks for securing data that is transmitted via a website with forms or URL parameters.

  • Display the least amount of sensitive data possible. The less information you provide to the client/user the better. Remember that spiders are crawling the web day and night, so they too can access data used in forms and URL parameters.
  • Make the data difficult to understand. For example, name="command" could become name="cc". This just makes it a little harder to decipher; "command" is pretty self-explanatory. You could use anything, such as 'space-monkey', if need be.
  • Check that the submitted data makes sense. If a user from Australia is trying to purchase a product not available in their country, they may have hacked the form and inserted the id of a product they are not allowed to purchase. By double-checking everything, the system will be able to see that the product is not available in Australia and will throw an error.
  • Use encryption on all hidden fields and URL parameters. See more below.
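The "does it make sense" check from the list above can be sketched as a simple availability lookup; the product ids and country codes here are hypothetical.

```python
# Hypothetical map of product id to the countries where it may be sold.
AVAILABILITY = {
    101: {"US", "UK"},        # product 101 is not sold in Australia
    102: {"US", "UK", "AU"},
}

def check_order(product_id, country):
    """True only if this product can be purchased from this country."""
    # A tampered form can submit any product id; the server re-checks it.
    return country in AVAILABILITY.get(product_id, set())
```

Even if the form was edited to submit product 101, the server-side check rejects the order for an Australian customer.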

Remember that some data is not important, and it may not matter if the form is hacked. However, every possible scenario should be taken into account.

As an administrator/developer of large-scale websites I have seen the sheer massive number of attempted hacks on a daily basis, from URL injection to automated login scripts. Many of these attempts I discovered thanks to custom-built scripts designed to detect and report such hacks. Once I was working on a site where the server was being switched. For a period of a few hours the root directory file structure was visible to the general public. I reported this to the server admin, who shrugged the mistake off. What happened, however, was that spiders had crawled the site and recorded the file structure. Once that happened, crawlers were attempting to directly access files they should not be able to. I had already installed security for this, so I was able to detect it and serve them a 404 page. The point I'm trying to make is that it does not take much to open a site up to potential hacks. There are robots out there that will track and access all areas of a site at any hour. Some are honest and harmless; others will try various hacks to gain access.

It is always a good idea to pass IDs where possible. Instead of including user data in a form you should pass the ID; the server-side script can then access the user's data from that ID. However, keep in mind that throwing IDs around on the internet may not be desirable. If a valid user ID can be obtained from form data (e.g. <input type="hidden" name="user_id" value="462" />) then it could be used in malicious attempts to hack your site (e.g. www.yoursite.com/login.php?command=start_login&user_id=462). Generally, the less information you give the public about the inner workings of your site the better.
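A safer variant of this idea is to not send the user ID to the client at all, and instead resolve it from a server-side session. A minimal sketch, where SESSIONS maps a session token (delivered via a cookie) to a user id; all names are hypothetical.

```python
# Hypothetical server-side session store: token -> user id.
SESSIONS = {"s3cr3t-token": 462}

def current_user_id(session_token):
    """Resolve the acting user from the session, not from form data."""
    user_id = SESSIONS.get(session_token)
    if user_id is None:
        raise PermissionError("not logged in")
    return user_id
```

With this approach the form no longer needs a user_id hidden field at all: the server already knows who is making the request, and there is nothing for an attacker to tamper with.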

Encrypting Data

This is by far the best way to secure your system. Hidden fields and URL parameters do not need to be human-readable, so by encrypting them you eliminate the problems of people seeing the data, editing it, deciphering how your site works, stealing the data, and so on. It keeps the system secure and eliminates many issues.

However!

It's not as much fun as it sounds. Throwing around encrypted data can cause a series of headaches. It needs to be decrypted to be used by the server-side script, then re-encrypted to be used in the HTML or URL. All of this takes more processing power (depending on how you code your software) and can make life much more confusing. Debugging is much more difficult when you are looking at data such as 'euyr45Wf42Foro038mnfhrs4m44-3' when it decrypts to 'command'.

Another issue is that your URLs will look terrible. Instead of www.mysite.com/post.php?id=34 you may get www.mysite.com/post.php?38ieruhsd9h8sdv=ihnwney9239. This is also poor for SEO.

For this reason encryption should only be used on sensitive data. A wise practice is to encrypt all hidden fields; there is no harm in this and nothing to lose. If a URL is to contain data that is deemed sensitive to hacks, convert the URL into a form and encrypt the fields. So www.mysite.com/images.php?page=edit&id=8432 would be turned into

<form action="images.php">
<input type="hidden" name="urn834jj435po" value="97364ksuuelr" />
<input type="hidden" name="pmxjsowe9239" value="ge3kmd8-2iig" />
<input type="submit" value="edit image" class="text-link-button"/>
</form>

You can encrypt both the hidden field names and values. This creates a rock-solid site but will increase debugging time and overheads.
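The article describes full encryption of field values; a closely related and lighter-weight technique, sketched here with only the Python standard library, is HMAC signing. It does not hide the value (true encryption would also do that), but it makes any tampering detectable with a secret key that never leaves the server. The key and field contents below are hypothetical.

```python
import base64
import hashlib
import hmac

SECRET = b"server-side-secret-key"  # hypothetical key, kept off the client

def seal(value: str) -> str:
    """Pack a value with an HMAC tag so any client-side edit is detectable."""
    data = value.encode()
    sig = hmac.new(SECRET, data, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(data + sig).decode()

def unseal(token: str) -> str:
    """Verify the tag and recover the original value, or raise ValueError."""
    raw = base64.urlsafe_b64decode(token.encode())
    data, sig = raw[:-32], raw[-32:]
    expected = hmac.new(SECRET, data, hashlib.sha256).digest()
    # compare_digest avoids timing side channels when checking the tag.
    if not hmac.compare_digest(sig, expected):
        raise ValueError("hidden field was tampered with")
    return data.decode()
```

A hidden field would carry `seal("command=edit")`; on submission the server calls `unseal`, and any edited value fails verification before it ever reaches your processing code.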

Best Practice

Really it all depends on the data included in the code and how editing or taking that data can affect the security of the site. Remember that spiders crawl the web day and night. Many of them do not follow crawler conventions and will still crawl your site even if you have followed the correct procedures to block them (i.e. robots.txt).

Some sites will need to be more secure than others, and the data being passed around may not be important. There are many factors to weigh when making this decision. Nobody wants their site hacked.

I tend to encrypt most of my data with a custom-built encryption system. All hidden fields are encrypted, as there is no harm in doing so. URLs depend on the information and what I'm aiming to achieve with the URL (e.g. SEO, saving it as a link, etc.). I never take it for granted that no one will try to steal the information on a site or hack it. There is nothing worse than building a site for a client and then having it hacked. It looks really bad for the developer, creates downtime for the site, and the developer will generally have to drop everything to find the breach. These things do happen, and it is important for clients to understand that nothing is foolproof. The more secure the better.

Conclusion

A judgment call needs to be made as to what protection to use and where. Most measures have an upside and a downside, so these need to be analysed during development. At the end of the day, the harder a site is to hack, the fewer people will succeed and the fewer will even attempt it. It's like putting locks on a bicycle: put two locks on and it will take a thief twice as long, with twice the chance of getting caught. Put three locks on and it equates to three times, and so on.
