We are designing a system for conducting a survey in which the user is asked about 72 multiple-choice questions.
When the user submits the form, the answers are posted to a PHP page which saves them in a MySQL table.
It works perfectly when we test with a small number of users.
But I have observed that when a large number of users submit at the same time, not all of the data reaches the server: for some users only part of their answers (around 65 of them) arrive. I do get data from all of my users, but some answer sets are incomplete.
I am using the MyISAM MySQL engine.
What could be the problem, and how can I solve it? Is it a PHP configuration issue or a MySQL one (a large number of INSERT statements)?
What is the best way to handle a large amount of data from a form submission in PHP?
Thanks in advance.
There is a limit on the POST request size in PHP; you can adjust post_max_size in your php.ini. As for the database, I don't know how you are saving the answers, but there are character/storage limitations on the database side as well.
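For reference, a minimal sketch of how you could detect this on the PHP side, assuming the dropped data is due to post_max_size (when that limit is exceeded, PHP leaves $_POST empty even though the browser sent a body):
<?php
// Rough sketch: detect a POST that was dropped for exceeding post_max_size.
// When the limit is exceeded, $_POST (and $_FILES) arrive empty.
$limit = ini_get('post_max_size');   // e.g. "8M"
$sent  = isset($_SERVER['CONTENT_LENGTH']) ? (int) $_SERVER['CONTENT_LENGTH'] : 0;
if ($_SERVER['REQUEST_METHOD'] === 'POST' && $sent > 0 && empty($_POST)) {
    error_log("POST body of $sent bytes was dropped; post_max_size is $limit");
}
?>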
Whenever I'm dealing with large POST data, like sending numerous field values through forms, using Ajax does wonders! Try using jQuery $.post(), which is shorthand for $.ajax(). It's quite easy to use, even if you're not that familiar with jQuery :)
You need to increase max_input_vars in your php.ini, or you can add the following line to your .htaccess file.
php_value max_input_vars 3000
You should use an Ajax call to post the data.
Go through the link below; it might help you:
https://www.w3schools.com/jquery/ajax_ajax.asp
I have a database that is currently pretty big (and keeps getting bigger over time).
I have a webpage on which I present the data in a form so it can be edited.
I send it with the POST method.
This worked pretty well until the data became too large. Now it tells me that it 'exceeds the limit of 1000'.
I read that I could change post_max_size in php.ini, but I can't do that on my web server, so it's not really an option for me. Is there anything else I can do? The problem is that ALL of the displayed data is in the POST, not only the changed values. Is there something that would do the trick?
I don't know the type and setup of your web-server, but post_max_size can be set in an .htaccess file:
php_value post_max_size 10000
Alternatively, you could send only the values that have changed. If you use Ajax to post your form, you could set (and check for), for example, a data attribute like modified on the fields.
By the way, I am assuming that you are talking about individual posts and are not sending your whole database back and forth. If you offer editing of a collection of items, you should use something like pagination to limit the number to a fixed maximum.
My question is similar to this one, but I am interested in posting the data to PHP. I couldn't find anything on Google about this and was wondering if anyone knew the answer.
In essence, how many characters can you put in an input field, submit the form and then successfully receive all of the characters in a PHP script?
Thanks in advance.
It depends mainly on the server configuration. As an example:
php_value post_max_size 50M
I've come across a problem which I can't seem to fix. I have a form (method="post" enctype="multipart/form-data") in which the user can choose some options. They have the option to 'check all'. If they 'check all', they are checking about 2000+ boxes. To check whether my form actually gets posted, I have the following (not so complex) code:
<?php
if (isset($_POST['bijwerken'])) {
    echo "YIPPEE!!";
}
?>
Now, if I check all the boxes, I don't get any feedback. If I only select about 20 boxes, I do get feedback. What am I missing? The checkboxes are also generated by a script, with an echo:
echo " <input type=\"checkbox\" name=\"productsoorten[]\" value='" . $rowproductsoorten1['productsoort1'] . "'> " . $rowproductsoorten1['productsoort1'] . "<br />";
Would love to hear some good ideas!
Yes, there's actually a max_input_vars setting. The default value is 1000, and your POST inputs won't all arrive if the number of input fields is more than that.
Edit your php.ini file (usually at /etc/php5/apache2/php.ini if you're on a Unix system) and increase the limit:
max_input_vars = 5000
If you can't modify the php.ini file, you can add this to .htaccess:
php_value max_input_vars 5000
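If you cannot change either file, here is a minimal sketch of how you could at least detect the truncation (PHP stops populating $_POST once the limit is reached):
<?php
// Rough sketch: warn when the number of received input variables reaches the
// max_input_vars ceiling, which usually means the rest of the form was dropped.
$limit    = (int) ini_get('max_input_vars');
$received = count($_POST, COUNT_RECURSIVE);
if ($received >= $limit) {
    error_log("Received $received input vars with max_input_vars = $limit; the submission was probably truncated.");
}
?>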
I'm sure most people would choose simplicity over better design, and I'm not going to explain every detail of how I'd handle this. If you don't understand what I'm talking about, or don't want to make the extra effort to write this pseudo-code out, then this solution probably isn't for you.
I'd compress this data before you POST. First, take the checkboxes out of your form tag so they don't get posted (we'll be giving it our own data).
Second, when the user submits the form (now with just a button), run some JS which traverses your DOM and gathers all of your checkbox data. It will need to build an array of longs, plus a separate long whose active bit is shifted along and added only when the checkbox is selected (start at 1 and double the value for each checkbox until it reaches > 2147483647). I would stop at 31 bits per group (even though JS numbers are stored in 64 bits, the shift operators don't work above 32 bits; also, JS doesn't have unsigned variables, so unless you want to deal with flipping the minus sign while all of this is going on, that's out too).
Third, post that to the server in a hidden text field, and on the server end you get to reverse all of this.
Fourth, decode this on the server (a rough sketch of this step follows these steps).
Fifth: do all of this again in reverse if you need the checkboxes to start out correctly set.
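A minimal sketch of the server-side decoding, assuming the JS packs the checkbox states into 31-bit groups and posts them as a comma-separated list in a hypothetical hidden field named packed_checks (the field name and format are illustrative choices, not fixed by this answer):
<?php
// Rough sketch: unpack 31-bit groups of checkbox states posted in a hidden
// field "packed_checks" as comma-separated integers (bit 0 = first box in the group).
$groups  = isset($_POST['packed_checks']) ? explode(',', $_POST['packed_checks']) : array();
$checked = array();                          // overall indexes of the ticked boxes
foreach ($groups as $g => $packed) {
    $packed = (int) $packed;
    for ($bit = 0; $bit < 31; $bit++) {
        if ($packed & (1 << $bit)) {
            $checked[] = $g * 31 + $bit;     // position of this checkbox in the full list
        }
    }
}
// $checked now lists the positions of all selected checkboxes.
?>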
Advantages of This:
- MUCH less data travels between client and server
- No need to modify php.ini or even .htaccess
- This is able to grow and shrink dynamically (no need to reconfigure anything if you add another 2000 checks)
- Some browsers have limits on the number of bytes/fields you can post, so simply increasing the number of fields won't always help.
- IE has limits on the length of a URL, so forget about GET with the other solution. Maybe you can do it with this approach.
Disadvantages:
- Much more difficult to implement (nearly everything needs to be done twice, once on the client and once on the server)
- Users may not have JS enabled
- JS needs a lot of help when it comes to bit shifting
- Uses more CPU time
--
If you want to take this up a notch then you'll want to work out a fix for the JS bit shift operator problem (this will nearly halve your data length): Converting javascript Integer to Byte array and back
This improved version will also require a 64bit PHP installation on the server (or a BigInt class which is WAY out of scope for this question).
I am passing the following data through the URL:
<?php
$url = "generate_pdf.php/?feed=" . urlencode(serialize($result));
echo '<div id="left-sidebar">';
echo '<div id="pdf"><a href="' . $url . '">Download PDF</a></div>';
echo '</div>';
?>
Here $result contains the RSS feed data in the form of an array. I am using urlencode(serialize($result)) to pass that data through the URL. It works perfectly on my local machine, but on the server it shows the following error:
Request-URI Too Large
The requested URL's length exceeds the capacity limit for this server.
Please tell me your views on how to deal with this problem.
I made this mistake (it was more a case of not knowing than of making a mistake!) once. I had built an Ajax engine for web apps that used only the GET method. Then I had a form with a lot of data, and it did not work.
After some research I found out this: look here
So basically most browsers do not cause any problems, because they support approximately 100,000 characters. But most web servers, like Apache, only support about 4,000 characters in a URL.
And no, you cannot just configure Apache to accept more. It is possible, but you would have to edit the source code to do so.
Solution: use the POST method; it is made for transferring large data between web servers and clients (which are most likely browsers).
In your case, I think you want to create a PDF from some user input, and for some reason that input is larger than 4,000 characters.
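A minimal sketch of switching to POST here, assuming the same generate_pdf.php endpoint and feed parameter as in the question: render a small form instead of building a long URL.
<?php
// Rough sketch: send the serialized feed data as a POST field instead of a
// query string, so the URL stays short.
$feed = htmlspecialchars(serialize($result), ENT_QUOTES);
echo '<div id="left-sidebar">';
echo '  <form id="pdf" action="generate_pdf.php" method="post">';
echo '    <input type="hidden" name="feed" value="' . $feed . '">';
echo '    <button type="submit">Download PDF</button>';
echo '  </form>';
echo '</div>';
// generate_pdf.php would then read it with:
// $result = unserialize($_POST['feed']);
?>
Note that calling unserialize() on user-supplied data has security implications, so validate it or consider passing JSON instead.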
Store data somewhere on the server (e.g. a database)
Assign a unique ID to such data
Pass that ID in the URL
Retrieve the data from storage using the ID as the key (a rough sketch follows below)
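A minimal sketch of that approach, using the session as the server-side storage (the feed_id parameter and the $_SESSION['feeds'] key are hypothetical names for illustration):
<?php
// Rough sketch: store the feed data server-side and pass only a short ID in the URL.
session_start();
$id = uniqid('feed_', true);                 // unique-ish key for this data set
$_SESSION['feeds'][$id] = $result;           // keep the array on the server
echo '<div id="pdf"><a href="generate_pdf.php?feed_id=' . urlencode($id) . '">Download PDF</a></div>';
// generate_pdf.php would then do:
// session_start();
// $result = isset($_SESSION['feeds'][$_GET['feed_id']]) ? $_SESSION['feeds'][$_GET['feed_id']] : null;
?>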
I am using a combination of Ajax, PHP and SQL. I have a local copy of this program and a live server run by a company. There is a button that posts a comment. On the local copy you can post a comment of any size, but on the live server I have narrowed it down to about 512 bytes: once the comment gets larger than that, no error is generated but the comment isn't added. Are there any configuration files concerning MySQL databases, PHP or JavaScript that could limit the amount of data that can be parsed?
OK, there was a GET max value parameter in php.ini (under the Suhosin settings, or something like that, on the live server) that was set to 512. I changed it, so now the system can handle 10 KB of text for comments.
Is the action of the form for posting comments GET or POST?
If it's POST: in php.ini there's a configuration setting called post_max_size (documentation); please take a look at that on your local and production servers and compare the values.
If it's GET: some browsers limit the query string to around 2 KB, so maybe you are exceeding that... you should use POST instead.
If it were me, I would probably use the onsubmit attribute on the form and run a quick JavaScript validation on the input (a function called from onsubmit="return func(this)" will only submit the form if func(this) returns true). Simply have it check the length of the field's value and show a little alert window if there are more than 500 characters. That way you don't parse or transmit anything you don't have to.
You'll probably want a server-side fallback so that someone with JavaScript disabled can't bypass those limits, but that should work for the majority of your users.
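A minimal sketch of that server-side fallback, assuming the comment arrives in a POST field named comment and the limit is 500 characters (both are illustrative):
<?php
// Rough sketch: reject over-long comments on the server so the client-side
// check can't simply be bypassed.
$comment = isset($_POST['comment']) ? trim($_POST['comment']) : '';
if (strlen($comment) > 500) {
    http_response_code(400);
    exit('Comment is too long (500 characters maximum).');
}
// ...otherwise continue and insert the comment into the database.
?>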