Max Length For Get Variable - php

I was wondering if there is a max length for a $_GET variable. I plan on using AJAX with a GET request, and part of it is an access key encoded with one of my encoding methods. This method has returned roughly 1530 characters, and I was wondering if that is too long for a GET variable, as long as it's all URL encoded?
Thanks in advance,
Spencer

The browser greatly affects the max length of your $_GET param:
MAXIMUM LENGTH FOR BROWSERS REFERENCE
Some versions of PHP also limit the length of GET params:
PHP.NET REFERENCE

Max URL length is around 2000 characters.
In IE it's around 2083 - http://support.microsoft.com/kb/208427

Similar to this? https://stackoverflow.com/a/7725515/2827152
Please note that PHP setups with the suhosin patch installed will have a default limit of 512 characters for GET parameters. Although it's bad practice, most browsers (including IE) support URLs up to around 2000 characters, while Apache has a default limit of 8000.
To add support for longer parameters with suhosin, add suhosin.get.max_value_length = <limit> in php.ini.
Source: http://www.php.net/manual/en/reserved.variables.get.php#101469
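A minimal php.ini sketch of that change (512 is the suhosin default; the 2000 used here is only an example value, pick whatever covers your longest expected parameter):

; raise the per-value limit for GET parameters (suhosin default is 512)
suhosin.get.max_value_length = 2000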

Related

Laravel 4 Encryption: how many characters to expect

I've just had an interesting little problem.
Using Laravel 4, I encrypt some entries before adding them to a db, including email address.
The db was setup with the default varchar length of 255.
I've just had an entry that encrypted to 309 characters, breaking the encrypted value by cutting off the last 50-odd characters in the db.
I've (temporarily) fixed this by simply increasing the varchar length to 500, which should - in theory - cover me from this, but I want to be sure.
I'm not sure how the encryption works, but is there a way to tell what maximum character length to expect from the encrypt output for the sake of setting my database?
Should I change my field type from varchar to something else to ensure this doesn't happen again?
Conclusion
First, be warned that there have been quite a few changes between 4.0.0 and 4.2.16 (which seems to be the latest version).
The scheme starts with a staggering overhead of 188 characters for 4.2 and about 244 for 4.0 (given that I did not forget any newlines and such). So to be safe you will probably need on the order of 200 characters for 4.2 and 256 characters for 4.0, plus 1.8 times the plaintext size, if the characters in the plaintext are encoded as single bytes.
Analysis
I just looked into the source code of Laravel 4.0 and Laravel 4.2 with regard to this function. Let's get into the size first:
the data is serialized, so the encryption size depends on the size of the type of the value (which is probably a string);
the serialized data is PKCS#7 padded using Rijndael-256 or AES, which means adding 1 to 32 bytes or 1 to 16 bytes, depending on whether 4.0 or 4.2 is used;
this data is encrypted with the key and an IV;
both the ciphertext and IV are separately converted to base64;
an HMAC using SHA-256 over the base64-encoded ciphertext is calculated, returning a lowercase hex string of 64 characters;
finally, the result is base64_encode(json_encode(compact('iv', 'value', 'mac'))) (where value is the base64 ciphertext and mac is the HMAC value, of course).
A string in PHP is serialized as s:<i>:"<s>"; where <i> is the size of the string, and <s> is the string (I'm presuming PHP platform encoding here with regards to the size). Note that I'm not 100% sure that Laravel doesn't use any wrapping around the string value, maybe somebody could clear that up for me.
Calculation
All in all, everything depends quite a lot on character encoding, and it would be rather dangerous for me to make a precise estimate. Let's assume a 1:1 relation between byte and character for now (e.g. US-ASCII):
serialization adds up to 9 characters for strings up to 999 characters
padding adds up to 16 or 32 bytes, which we assume are characters too
encryption keeps data the same size
base64 in PHP creates ceil(len / 3) * 4 characters - but let's simplify that to (len * 4) / 3 + 4; the base64-encoded IV is 44 characters
the full HMAC is 64 characters
the JSON encoding adds 3*5 characters for quotes and colons, plus 4 characters for the braces and commas around them, totaling 19 characters (I'm presuming json_encode does not end with white space here); the outer base64 again adds the same relative overhead
OK, so I'm getting a bit tired here, but you can see that base64 encoding is applied twice, expanding the plaintext considerably. In the end it's a scheme that adds quite a lot of overhead; they could just have used base64(IV|ciphertext|mac) to seriously cut down on it.
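To make the calculation above concrete, here is a rough PHP sketch that simply adds up the sizes of the steps listed. It is an estimate under the single-byte-per-character assumption, not Laravel's actual code; the defaults of 16 correspond to AES in 4.2, and as described above you would use 32 for Rijndael-256 in 4.0:

// Rough size estimator for the envelope described above; not Laravel's own code.
function estimateEncryptedLength($plainLen, $blockSize = 16, $ivLen = 16)
{
    $serialized = $plainLen + strlen('s::"";') + strlen((string) $plainLen); // s:<i>:"<s>";
    $padded     = $serialized + ($blockSize - $serialized % $blockSize);     // PKCS#7 always pads
    $cipherB64  = (int) ceil($padded / 3) * 4;                               // base64 of the ciphertext
    $ivB64      = (int) ceil($ivLen / 3) * 4;                                // base64 of the IV
    $mac        = 64;                                                        // SHA-256 HMAC as lowercase hex
    $json       = strlen('{"iv":"","value":"","mac":""}') + $ivB64 + $cipherB64 + $mac;
    return (int) ceil($json / 3) * 4;                                        // outer base64 of the JSON
}

echo estimateEncryptedLength(50); // roughly 276 with the default assumptions

Note that json_encode escapes the / characters that can occur in base64 output, so the real value can end up a few characters longer than this estimate.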
Notes
if you're not on 4.2 now, I would seriously consider upgrading to the latest version because 4.2 fixes quite a lot of security issues
the sample code uses a string as key, and it is unclear if it is easy to use bytes instead;
the documentation does warn against key sizes other than the Rijndael defaults, but forgets to mention string encoding issues;
padding is always performed, even if CTR mode is used, which kind of defeats the purpose;
Laravel pads using PKCS#7 padding, but as the serialization always seems to end with ;, that was not really necessary;
it's a nice thing to see authenticated encryption being used for database encryption (the IV wasn't used, fixed in 4.2).
MaartenBodewes' answer does a very good job of explaining how long the actual string will probably be. However, you can never know it for sure, so here are two options to deal with the situation.
1. Make your field text
Change the field from a limited varchar to a "self-expanding" text column. This is probably the simpler option, and especially if you expect rather long input I'd definitely recommend it.
2. Just make your varchar longer
As you did already, make your varchar longer depending on what input length you expect/allow. I'd multiply by a factor of 5.
But don't stop there! Add a check in your code to make sure the data doesn't get truncated:
$encrypted = Crypt::encrypt($input);
if (strlen($encrypted) > 500) {
    // do something about it
}
What can you do about it?
You could either write an error to the log and include the encrypted data (so you can manually re-insert it after you've extended the length of your DB field):
Log::error('An encrypted value was too long for the DB field xy. Length: '.strlen($encrypted).' Data: '.$encrypted);
Obviously that means you have to check the logs frequently (or send them to you by mail) and also that the user could encounter errors while using the application because of the incorrect data in your DB.
The other way would be to throw an exception (and display an error to the user) and of course also write it to the log so you can fix it...
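A short sketch of that second variant, reusing the check from above (the 500-character limit and the messages are placeholders, not fixed values):

$encrypted = Crypt::encrypt($input);
if (strlen($encrypted) > 500) {
    // keep the data in the log so it can be re-inserted after the column is widened
    Log::error('Encrypted value too long for DB field xy. Length: '.strlen($encrypted).' Data: '.$encrypted);
    throw new Exception('The value could not be stored, please contact support.');
}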
Anyways
Whether you choose option 1 or 2 you should always restrict the accepted length of your input fields. Server side and client side.

Generate the same MD5 using javascript and PHP

I am trying to build an application that needs to compare the MD5 hash of any file.
Due to specific issues, before the upload, the MD5 must be generated client side, and after the upload the application needs to check it server side.
My first approach was to use, at the client side, the JavaScript File API and the FileReader.ReadAs functions. Then I use the MD5 algorithm found here: http://pajhome.org.uk/crypt/md5/
Server side, I would use PHP's fopen command and the md5 function.
This approach works fine when using simple text files. But when a binary file is used (like some JPG or PDF), the MD5 generated on the client side is different from the server's. Using the md5sum command-line tool, I figured out that the server MD5 is correct and the problem occurs on the client side.
I've tried other MD5 APIs I found, with the same results. I suspect that the FileReader.ReadAs functions are loading the file content slightly differently (I have tried all ReadAs function variants: text, binary and so on), but I can't figure out what the difference is.
I'm missing something but don't know what, maybe I need to decode the content somehow before generating the MD5.
Any tips?
Edit 1:
I followed the idea given by optima1: I took each character and printed its character code in both JavaScript and PHP. I could see only one difference, at the end, in all the cases (using vimdiff).
PHP: 54 51 10 37 37 69 79 70 0
Javascript: 54 51 10 37 37 69 79 70
Maybe this extra zero in PHP is some kind of "string end". In both cases the binary strings have the same length. Adding a String.fromCharCode(0) to the end of the JS content does not solve the problem. I will keep investigating.
If I can't find a solution, I will try to build a giant string by concatenating those char codes and using it to build the MD5. It is a crap solution, but it will serve for now, and I will just need to add a zero to the end of the JS string...
Edit 2:
Thank God! This implementation works like a charm: http://www.myersdaily.org/joseph/javascript/md5.js
If you need to generate a MD5 hash from binary files, go for it.
Thanks in advance!
http://membres-liglab.imag.fr/donsez/cours/exemplescourstechnoweb/js_securehash/
JavaScript MD5 and PHP MD5 produce the same hash, but you need to use some helper functions; you can get those functions from the URL above.
I would suggest doing a quick sanity check: have your client-side code report the first and last bytes of the binary data. Repeat in your PHP code. Compare the first and last bytes from both methods to ensure that they are in fact reading the same data (which should result in the same MD5 hash).
Then I would suggest posting code here so that we can review.
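For the server side of that sanity check, a small PHP sketch along these lines should do (the upload field name is made up for illustration):

// report the first byte, last byte and MD5 of the uploaded file as PHP sees it
$data = file_get_contents($_FILES['upload']['tmp_name']);
echo 'first byte: ' . ord($data[0]) . "\n";
echo 'last byte: ' . ord($data[strlen($data) - 1]) . "\n";
echo 'md5: ' . md5($data) . "\n";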

What is the maximum URI length in codeigniter?

I'm wondering what the maximum URI length is in CodeIgniter, and if URI segments being used as arguments to a controller function count towards the browser's GET length limit? I think most browsers cap their GET parameter length at about 2000 characters?
Currently if my total URI length (inc. https://domain/folder/controller/function/argument) exceeds around 1560 characters I get a forbidden message.
'Forbidden You don't have permission to access /folder/controller/function/argument on this server'
If I trim the characters back to under around 1550~1560, it works fine again. I realise 1500+ is a lot anyway, which is why I was wondering if the URI counts towards the GET limit.
Has anyone experienced this problem? Is there a solution aside from POSTing all data?
BTW: I'm using the URI protocol AUTO in the config
As far as I remember the whole URI is limited to a more or less specific length. Something is already mentioned here: What is the maximum length of a URL in different browsers?
However, it seems a little curious that you require such long URIs. If you append a query string of around 1000 characters, that's already 1 kB of data. In my opinion, a query string is not the right place to transport data around.
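If the long value really is data rather than an address, one way out is to move it from the URI segment into the request body. A hedged CodeIgniter sketch (the controller and field names are invented here):

class Access extends CI_Controller {
    public function verify()
    {
        // read the long value from the POST body instead of a URI segment,
        // so neither the browser's URL limit nor the server's URI limit applies
        $key = $this->input->post('access_key');
        // ... validate $key ...
    }
}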

How much data can be sent via $_GET

How much data can be sent via $_GET in PHP5? Is there a maximum number of variables, string length etc? Thanks in advance.
Although the specification of the HTTP protocol does not specify any maximum length, practical limits are imposed by web browser and server software.
There is no limit defined in the RFC, but browsers limit the URL length (including get variables). IE for instance limits the URL length to 2083 characters, Opera about 4,050, Netscape 6 about 2,000 characters.
A general rule of thumb is that you shouldn't use more than 256 characters.
There is not only the PHP limitation, but you should also consider 'in between' proxies and the client software.
The HTTP standard doesn't pose a limitation, though.
(I got this from here, where the advice is not to exceed 255-character URLs!)
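If you are the one building the URL, a simple guard like the following keeps you on the safe side (the URL and the 2000 figure are placeholders; 2000 is just the rough browser limit quoted above):

$url = 'http://example.com/search?' . http_build_query($params);
if (strlen($url) > 2000) {
    // too long for a reliable GET; switch to POST or trim the parameters
}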

maximum URI length for file_get_contents()

Is there a maximum length for the URI in the file_get_contents() function in PHP?
I suppose there is a maximum length, but you'll be hard pressed to find it. If you do hit the maximum, you're doing something wrong. :)
I haven't been able to find a number for PHP specifically, but MS IIS, Apache and the Perl HTTP::Daemon seem to have limits between 4,000 and 16,384 bytes, PHP will probably be somewhere around there as well.
What you need to consider is not really how much your side can handle, but also how much the other server you're querying can handle (which is presumably what you're doing). As such, any URL longer than ~1000 characters is usually already way too long and never really encountered in the real world.
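If you do need to pass a lot of data to the other server, a common workaround is to send it in a POST body via a stream context instead of packing it into the URL. A sketch (the endpoint URL and parameter name are placeholders):

$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query(array('payload' => $longValue)),
    ),
));
$response = file_get_contents('http://example.com/endpoint', false, $context);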
As others have stated, it is most likely limited by the HTTP protocol.
You can view this answer for more info on that : What is the maximum length of an url?
In HTTP there's no length limit for the URI, and the manual doesn't note any limit for file_get_contents(), so I think you needn't worry about this.
BTW, the length of the URI is limited by some browsers and web servers. For example, in IE the length should be less than 2083 characters, and in Firefox it's 65,536. When I tried to test this against my Apache on Ubuntu, only URLs of no more than 8182 characters worked, because of Apache's limit.
