Request-URI Too Large showing on server - php

I am passing the following data through the URL:
<?php
$url = "generate_pdf.php/?feed=" . urlencode(serialize($result));
echo '<div id="left-sidebar">';
echo '<div id="pdf"><a href="' . $url . '">Download PDF</a></div>';
echo '</div>';
?>
Here $result contains the RSS feed data in the form of an array. I am using urlencode(serialize($result)) to pass that data through the URL. It works perfectly on my local machine, but on the server it shows the following error:
Request-URI Too Large
The requested URL's length exceeds the capacity limit for this server.
Please share your views on how to deal with this problem.

I made this mistake once (it was more a case of not knowing than of making a mistake!). I had built an AJAX engine for web apps that used only the GET method. Once I had a form with a lot of data, and it did not work.
After some research I found this out (look here): most browsers don't cause any problems because they support roughly 100,000 characters, but most web servers, like Apache, only support about 4,000 characters in a URL.
No, you cannot simply configure Apache to accept more. It is possible, but you would have to edit the source code to do so.
Solution: use the POST method; it is made for transferring large amounts of data between web servers and clients (which are most likely browsers).
In your case, I think you want to create a PDF from some user input, and for some reason that input is larger than 4,000 characters.
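A minimal sketch of the POST route for this case (a hidden field named feed is an assumption, not something from the original code): instead of putting the serialized feed in a link, submit it in a POST body and read it from $_POST in generate_pdf.php.
<?php
// Replace the link with a small form so the data travels in the POST body.
echo '<form action="generate_pdf.php" method="post">';
echo '<input type="hidden" name="feed" value="' . htmlspecialchars(serialize($result)) . '">';
echo '<button type="submit" id="pdf">Download PDF</button>';
echo '</form>';

// generate_pdf.php then reads the data from the body instead of the URL.
// Note: calling unserialize() on user-supplied input is risky; the ID-based
// approach described below avoids it.
$result = unserialize($_POST['feed']);
?>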

Store data somewhere on the server (e.g. a database)
Assign a unique ID to such data
Pass that ID in the URL
Retrieve data from storage using the ID as key
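A minimal sketch of that approach in PHP, assuming PHP 7+ (for random_bytes()), an existing PDO connection in $pdo, and a pdf_feeds table with id and feed columns (all of these names are illustrative):
<?php
// Store the serialized feed server-side and hand out only a short ID.
$id = bin2hex(random_bytes(16)); // hard-to-guess key for this feed
$stmt = $pdo->prepare('INSERT INTO pdf_feeds (id, feed) VALUES (?, ?)');
$stmt->execute([$id, serialize($result)]);

// The link now carries only the ID, so it stays far below any URL length limit.
echo '<div id="pdf"><a href="generate_pdf.php?feed_id=' . urlencode($id) . '">Download PDF</a></div>';

// generate_pdf.php: retrieve the data again, using the ID as the key.
$stmt = $pdo->prepare('SELECT feed FROM pdf_feeds WHERE id = ?');
$stmt->execute([$_GET['feed_id']]);
$result = unserialize($stmt->fetchColumn());
?>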

Related

Submitting large amount of form data to PHP using POST

We are designing a system for conducting a survey that asks each user about 72 questions (multiple-choice questions).
When the user submits, the answers are posted to a PHP page which saves them in a MySQL table.
It works perfectly well when we test with a small number of users.
But I observed that when a large number of users submit at once, not all of the data reaches the server; for some users only part of their answers (around 65 answers) arrives. I get data from all my users, but some answer sets aren't complete.
I am using the MySQL engine MyISAM.
What could the problem be, and how can I solve it? Is it a problem with some PHP configuration, or with MySQL (a large number of INSERT statements)?
What is the best way to handle larger amounts of data from a form submission in PHP?
Thanks in Advance
There is a limit on the POST request size in PHP; you can adjust post_max_size in your php.ini. As for the database, I don't know how you are saving the answers, but there are character/storage limitations on the database side as well.
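One detail worth knowing: when a request body exceeds post_max_size, PHP does not show an error on the page; it simply leaves $_POST empty, so oversized submissions can fail silently. A small sketch for detecting that on the server (nothing here is specific to the survey code in the question):
<?php
// If the body was larger than post_max_size, $_POST arrives empty even though
// the request carried a non-zero Content-Length; log it instead of losing data silently.
if ($_SERVER['REQUEST_METHOD'] === 'POST'
        && empty($_POST)
        && isset($_SERVER['CONTENT_LENGTH'])
        && (int) $_SERVER['CONTENT_LENGTH'] > 0) {
    error_log('POST of ' . $_SERVER['CONTENT_LENGTH'] . ' bytes exceeded post_max_size='
        . ini_get('post_max_size'));
}
?>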
Whenever I'm dealing with large POST data like sending numerous field values through forms, using ajax does wonders! Try using jQuery $.post(), which is the shorthand for $.ajax(). It's quite easy to use, even if you're not that familiar with jQuery :)
You need to increase max_input_vars in php.ini, or you can set the following in your .htaccess file:
php_value max_input_vars 3000
You should use an AJAX function to post the data.
Go through the link below; it might help you:
https://www.w3schools.com/jquery/ajax_ajax.asp

file_get_contents returns different results when called from different servers

I'm running a simple piece of php code, like so:
echo file_get_contents( 'http://example.com/service?params' );
When I run the code on my local machine (at work) or from my shared hosting account, or if I simply open the URL in my browser, I get the following result:
{"instances":[{"timestamp":"2014-02-28 18:03:39.0","ids":[{"id":"525125875"}],"cId":179,"cInstanceId":9264183220}]}
However, when I run the exact same code on either of two different web servers at my workplace, I get the following slightly different result:
{"instances":[{"timestamp":"2014-02-28 18:03:39.0","ids":[{"id":"632572147"}],"cId":179,"cInstanceId":4302001980}]}
Notice how a couple of the numbers are different, and that's all. Unfortunately, these different numbers are the wrong numbers. The result should be identical to the first one.
The server I'm making the call to is external to my workplace.
I've tried altering the file_get_contents call to include headers and masquerade as a browser, but nothing seems to give a different result (well, other than an error due to an accidentally malformed request). I can't use cURL because it's not installed on the servers where this code needs to be deployed.
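For reference, this is the kind of header tweaking being described; with file_get_contents it goes through a stream context (the header values here are only illustrative):
<?php
// Attach browser-like headers to the request via a stream context.
$context = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        'header' => "User-Agent: Mozilla/5.0\r\n" .
                    "Accept: application/json\r\n",
    ),
));
echo file_get_contents('http://example.com/service?params', false, $context);
?>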
Any clue what could be causing the differing results? Perhaps something in the request headers? Although I'm not sure why something in the headers would cause the service to return different data.
thanks.
(edit)
The service URL I'm testing with is:
http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/CygnetLastNInstancesServlet?lastN=1&cygnetId=179&endTimestamp=2014-02-28+21%3A35%3A48
The response it gives is a bit different than what I posted above; I simplified and shortened the responses in my SO post to make it easier to read--but the essential information given, and the differences, are still the same.
I give the service a timestamp, the number of images I want to fetch which were created prior to that timestamp, and a 'cygnetId', which defines what sort of data I want the images to show (solar wind velocity, radiation belt intensity, etc).
The service then echoes back some of the information I gave it, as well as URL segments for the images I requested.
With the returned data, I can build the URL for an image.
Here's the URL for an image built from a "correct" response:
http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/StreamByDataIdServlet?allDataId=525125875
Here's the URL for an image built from a "wrong" response:
http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/StreamByDataIdServlet?allDataId=632572147
If you click the links above, you'll see that the "wrong" URL does not open an image--the response is blank.

PHP url - Is this a viable hack?

Following on from this question, I realised you can only use $_POST when using a form... d'oh.
Using jQuery or cURL when there's no form still wouldn't address the problem that I need to post a long string in the URL.
Problem
I need to send data to my database from a desktop app, so I figured the best way is to use the following URL format and append the data to the end, so it becomes:
www.mysite.com/myscript.php?testdata=somedata,moredata,123,xyz,etc,etc,thisgetslong
With my previous script, I was using $_GET to read the [testdata=] string, and my web host told me $_GET can only read 512 characters, so that was the problem.
Hack
Using the script below, I'm now able to write thousands of characters; my question is, is this viable or is there a better way?
<?php
include("connect.php"); // Connect to the database

// Hack - read the URL directly and search the string for the data I need
$actual_link = "http://$_SERVER[HTTP_HOST]$_SERVER[REQUEST_URI]";
$findme = '=';
$pos = strpos($actual_link, $findme) + 1;    // find start of data to write
$data = substr($actual_link, $pos);          // grab data from URL
$data = mysql_real_escape_string($data);     // escape it before building the query
$result = mysql_query("INSERT INTO test (testdata) VALUES ('$data')");

// Check result
if ($result) { echo $data; }
else { echo "Error " . mysql_error(); }      // was $mysqli->error, which mixes APIs
mysql_close();
?>
Edit:
Replaced image with PHP code.
I've learned how not to ask a question: don't use the word "hack", as it ruffles people's feathers, and don't use an image for code.
I just don't get how to pass a long string to a formless PHP page, and whilst I appreciate people's responses, the answers about cURL don't make sense to me. From this page, it's not clear to me how you'd pass a string from a .NET app, for example. I clearly need to do lots of research, and I apologise for my asinine question(s).
The URL has a practical fixed limit of ~2000 chars, so you should not be passing thousands of chars into the URL. The query portion of the URL is only meant to be used for a relatively short set of parameters.
Instead, you can build up a request body to send via cURL/jQuery/etc for POSTing. This is how a browser will submit form data, and you should probably do the same.
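A minimal sketch of that with PHP's cURL extension (the endpoint and field name are taken from the question's example URL; $longString stands in for the real data). Whatever HTTP client the desktop app uses can build the equivalent request.
<?php
// Send the long value in the POST body instead of the query string.
$ch = curl_init('http://www.mysite.com/myscript.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('testdata' => $longString));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

// On the server, myscript.php then reads the value from $_POST instead of parsing the URL.
$data = $_POST['testdata'];
?>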
In your scenario, there are two important elements that you need to examine.
First, what is the client that is performing the http operation? I can't tell from your text if the client is going to be a browser, or an application. The client is whatever you have in your solution that is going to be invoking a GET or POST operation.
This is important. When you read about query string length limitations online, it's usually within the context of someone using a browser with a long URL. There is no standard across browsers for maximum URL length. But if you think about it in practical fashion, you'd never want to share an immensely large URL by posting it somewhere or sending it in an e-mail; having to do the cut-and-paste into a client browser would frustrate someone pretty quickly. On the other hand, if the client is an application, then it's just two machines exchanging data and there's really no human factor involved.
The second point to examine is the web server. The web server implementation may pose limitations on URL length, or maybe not. Again, there is no standard.
In any event, if you use a GET operation, your constraint will be the minimum of what both your client AND server allow (i.e. if both have no limit, you have no limit; if either has a limit of 200 bytes, your limit is 200 bytes; if one has a 200 byte limit and the other has a 400 byte limit, your limit is 200 bytes)
Taking a look back, you mentioned "desktop app" but have failed to tell us what language you're developing in, and what operating system. It matters -- that's your CLIENT.
Best of luck.

Load the r_object of a GET request into a variable

I have searched extensively for this answer. Maybe it's just too simple to have been posted. Please forgive me if I am missing something really obvious. I am trying to use a piece of data (a number value) from a GET request (I think it is the r_object value) that I access via a URL with an API key.
I am using an external service (called "teleduino") to read data (moisture/voltage readings) from a microprocessor unit (an Arduino Uno with an Ethernet shield). I use my personal API key from teleduino, and the teleduino service answers GET requests (in a JSON format, I think). I can load the results into an iframe on my web page using JavaScript, but I need the data as a real variable, not just something viewed in an iframe.
For example, if I load "http://us01.proxy.teleduino.org/api/1.0/328.php?k={my-key-here}&r=getAnalogInput&pin=16" into an iframe I get something like this (when the device is online) :
{"status":200,"message":"OK","response":{"result":1,"time":0.22702789306641,"values":[877]}}
The number in the square brackets is the value I need to extract as a simple variable so I can display it on the page (instead of using iframes) and also use it in mathematical calculations and other functions.
That value I need (in the square brackets after "values") is, I think, the "r_object" of the GET request, because when I look at the URL, r=getAnalogInput&pin=16 is the crucial data I need. (It is the voltage reading on pin 16 of that microprocessor, which reflects the soil moisture of that plant.)
I have searched extensively but cannot find out how to load that value from the get request into a variable so I can use it in the javascript on my page.
I assume it is some simple php like
moisture = $_GET[URL,"values"] ....etc etc
I am assuming it is the php r_object. And I assume I need to use php or ajax as the data is coming from an outside server (the teleduino service), so pure javascript will not work.
I am happy to write some php onto the php file of my webpage, but I do not know what that php should be.
ANY ideas would be greatly appreciated - thank you so much!!!
$_GET is for getting parameters that were passed as parameters to your PHP script, not results from a remote API. I think this is what you want:
$results = file_get_contents($url);            // fetch the raw JSON response
$data = json_decode($results, true);           // decode it into an associative array
$moisture = $data['response']['values'][0];    // 877 in the example response above
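Since the goal is to use the number in the page's JavaScript too, the decoded value can then be handed over to the page, for example (a sketch; the variable name is illustrative):
<?php
// Emit the fetched value for the page's JavaScript to use.
echo '<script>var moisture = ' . json_encode($moisture) . ';</script>';
?>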

Text size limit on website

I am using a combination of AJAX, PHP and SQL. I have a local copy of this program and a live server run by a company. There is a button that posts a comment; on the local copy you can post a comment of any size, but on the live server I have narrowed it down to about 512 bytes. Once the comment gets larger than that, no error is generated but the comment isn't added. Are there any configuration settings concerning MySQL, PHP or JavaScript that could limit the amount of data that can be parsed?
OK, there was a GET max value parameter in php.ini (under the Suhosin settings, or something like that, on the live server) that was set to 512. I changed it, so now the system can handle 10 KB of text for comments.
Is the action of the form for posting comments GET or POST?
If it's POST: in php.ini there is a setting called post_max_size (documentation); please take a look at it on both your local and production server and compare the values.
If it's GET: some browsers limit the query string to around 2 KB, so maybe you exceed this... you should use POST instead.
If it were me, I would probably use the onsubmit attribute on the form and run a quick JavaScript validation on the input (a function called from onsubmit="return func(this)" will only let the form submit if func(this) returns true). Simply have it check the string length of the field's value and show a little alert window if there are more than 500 characters. That way you don't parse or transmit anything you don't have to.
You'll probably want a server-side fallback so that someone with JavaScript disabled can't bypass those limits, but that should work for the majority of your users.
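A sketch of such a server-side fallback in PHP (the comment field name and the 500-character limit are assumptions mirroring the example above):
<?php
// Reject over-long comments even if the client-side check was bypassed.
$comment = isset($_POST['comment']) ? $_POST['comment'] : '';
if (strlen($comment) > 500) {
    die('Comment is too long (500 characters maximum).');
}
?>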
