I'm not a PHP expert at all, so please forgive me if I make incorrect or stupid statements! I recently had my web server upgraded from PHP 5 to PHP 7.4.11. In addition to running Joomla, that server also hosts a PHP script that receives a file via POST and moves it to a specified folder. The request is sent from a .NET client that, crucially, does not include the Content-Length header in the POST; the content type is 'multipart/form-data; boundary=---------------------8d8cf4b4721703d\n'.
The file is reported as correctly received by this PHP code:
if (move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile))
On PHP 5 this worked just fine; on PHP 7 it does not. When testing with Postman, we determined that if the Content-Length header is present, $_FILES['file'] contains the file, but if Content-Length is absent, it does not. The call typically comes as a POST from a .NET application that, as noted, does not send the Content-Length header.
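For reference, a minimal diagnostic sketch along these lines (assuming the form field is named 'file', as in the code above) shows what the server actually receives:
<?php
// Diagnostic sketch: log what PHP sees for the upload request.
// 'file' is assumed to be the form field name used by the client.
$info = [
    'content_length' => $_SERVER['CONTENT_LENGTH'] ?? '(not sent)',
    'content_type'   => $_SERVER['CONTENT_TYPE'] ?? '(not sent)',
    'files_keys'     => implode(',', array_keys($_FILES)),
    'upload_error'   => $_FILES['file']['error'] ?? '(no file entry)',
];
error_log('upload debug: ' . json_encode($info));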
How can I get PHP 7 to replicate the behaviour of PHP 5? Note that I cannot change the sender: it is a .NET rich client application already installed on machines I have no access to.
Thanks for your help.
Cheers,
Neil
The URL in question: http://www.roblox.com/asset/?id=149996624
When accessed in a browser, it correctly downloads a file (which is an XML document). I wanted to fetch the file in PHP and simply display its contents on a page.
$contents = file_get_contents("http://www.roblox.com/asset/?id=149996624");
The above is what I've tried (as far as I know, the page does not require any special headers). I get an HTTP 500 error. However, in Python, the following code works and I receive the file.
r = requests.get("http://www.roblox.com/asset/?id=147781188")
I'm confused about what the distinction is between how these two requests are sent. I am almost 100% sure it is not a header problem. I've also tried the cURL library in PHP to no avail. Nothing I've tried in PHP succeeds with the URL (for any valid id parameter), yet Python manages it without any trouble.
Any insight as to why this issue may be happening would be great.
EDIT: I have already tried copying Python's headers into my PHP request.
EDIT 2: It also appears that two requests happen when navigating to the link.
Is this on a Linux/Mac host by chance? If so, you could use ngrep to see the differences in the requests themselves on the wire. Something like the following should work:
ngrep -t '^(GET) ' 'src host 127.0.0.1 and tcp and dst port 80'
EDIT - The problem is that the server is responding with a 302 and your PHP request is not following it automatically. Cheers!
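If you go back to cURL in PHP, a sketch like this should follow the redirect (untested against that exact URL):
<?php
// Sketch: fetch the asset with cURL and follow the 302 redirect automatically.
$ch = curl_init("http://www.roblox.com/asset/?id=149996624");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the 302 to the real file
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);         // safety cap on redirect chains
$contents = curl_exec($ch);
if ($contents === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
echo $contents;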
Here is a simple HTML file upload form.
<form enctype="multipart/form-data" action="upload.php" method="POST">
Send this file: <input name="userfile" type="file" />
<input type="submit" value="Send File" />
</form>
And the php file is pretty simple too.
<?php
die();
As you can see, the PHP script does nothing on the server side. But when we upload a big file, the process still takes a long time.
I know my PHP code is executed only after the POST request has finished: PHP must populate the $_POST and $_FILES arrays before the first line of code runs.
So my question is: can PHP (with Apache or Nginx) check the HTTP headers before the POST request has finished?
For example, via some PHP extension or Apache module.
I was told that Python or Node.js can solve this problem; I just want to know whether PHP can or not.
Thanks.
================ UPDATE 1 ================
My goal is to block unexpected file-upload requests. For example, we generate a unique token as part of the POST target URL (like http://some.com/upload.php?token=268dfd235ca2d64abd4cee42d61bde48&t=1366552126). On the server side, my code looks like this:
<?php
define('MY_SALT', 'mysalt');
if (!isset($_GET['t']) || !isset($_GET['token']) // token check
    || abs(time() - $_GET['t']) > 3600
    || md5(MY_SALT . $_GET['t']) != $_GET['token']) {
die('token incorrect or timeout');
}
//process the file uploaded
/* ... */
The code looks like it makes sense :-P but it cannot save bandwidth as I expected. The reason is that the PHP code runs too late; we cannot check the token before the file upload has finished. If someone uploads a file without the correct token in the URL, my server's network and CPU are still wasted.
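For reference, here is a sketch of how the sender side could build a URL that passes the check above (same salt, same md5 scheme):
<?php
// Sketch: build an upload URL whose token matches the server-side check.
define('MY_SALT', 'mysalt');
$t     = time();
$token = md5(MY_SALT . $t);
$url   = "http://some.com/upload.php?token={$token}&t={$t}";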
Any suggestion is welcome. Thanks a lot.
The answer is always yes because this is Open Source. But first, some background: (I'm only going to talk about nginx, but Apache is almost the same.)
The upload request isn't sent to your PHP backend right away -- nginx buffers the upload body so your PHP app isn't tying up 100MB of RAM waiting for some guy to upload via a 300 baud modem. The downside is that your app doesn't even find out about the upload until it's done or mostly done uploading (depending on client_body_buffer_size).
But you can write a module that hooks into the different "phases" internal to nginx. One of the hooks is called when the headers are done. You can write modules in Lua, but it's still fairly complex. There may be a module that will send the "pre-upload" hook out to your script via HTTP, but that's not great for performance.
It's very likely you won't even need a module. The nginx.conf file can do what you need (e.g. route the request to different scripts based on headers, or return different error codes based on headers). See this page for examples of header checking (especially "WordPress w/ W3 Total Cache using Disk (Enhanced)"): http://kbeezie.com/nginx-configuration-examples/
Read the docs, because some common header-checking needs already have directives of their own (e.g. client_max_body_size will reject a request if the Content-Length header is too big).
There is no solution at the HTTP level, but it is possible at the TCP level. See the answer I accepted for another question:
Break HTTP file uploading from server side by PHP or Apache
I have installed the wonderful wkhtmltopdf on our production Debian server to create PDFs from URLs. We stream (I hope that's the right term) the resulting PDF to the client; we have no interest in storing it server side.
We only generate PDFs from local URLs (that is, on the same server). However, the URL is still based on the domain and not the local IP, since there are multiple sites on the server.
Our problem is that for some local pages, all we get back is an entirely blank page (not even a PDF). The response code is 200, but the content-type is text/html. For those pages where a PDF is successfully streamed to the client, the content-type is application/pdf.
I thought that maybe something was going wrong in the generation of the PDF, so I ran exactly the same command that PHP executes, and a PDF was generated successfully.
I am using the library found on this page to make PHP connect with wkhtmltopdf.
$pdf = new WkHtmlToPdf(array(
'margin-left'=>0,
'margin-right'=>0,
'margin-top'=>0,
'margin-bottom'=>0,
'print-media-type',
'disable-smart-shrinking'
));
$url = "http://myserver.se/$url";
$pdf->addPage($url);
$pdf->send();
Why do I get blank pages back for some URLs?
It turned out the problem was in the library I was using. I can't tell exactly what the cause was, but proc_close in the send method of the WkHtmlToPdf class was returning 2 instead of the expected 0 when a PDF was successfully created. This led the library to believe that no PDF had been created, so it simply returned false and nothing was output to the client. I solved it by checking whether the file exists instead, using PHP's file_exists function.
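Roughly, the workaround looks like this (a sketch; saveAs() is an assumption here, your wrapper's method for writing to a file may be named differently):
<?php
// Sketch of the workaround: render to a temporary file and trust file_exists()
// rather than the library's return value. saveAs() is assumed; adjust to your wrapper.
$tmpFile = sys_get_temp_dir() . '/page_' . uniqid() . '.pdf';
$pdf->saveAs($tmpFile);

if (file_exists($tmpFile) && filesize($tmpFile) > 0) {
    header('Content-Type: application/pdf');
    header('Content-Disposition: inline; filename="page.pdf"');
    readfile($tmpFile);
    unlink($tmpFile);
} else {
    // wkhtmltopdf produced nothing; report an error instead of a blank page
    http_response_code(500);
    echo 'PDF generation failed';
}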
$output = file_get_contents("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
var_dump($output);
HTTP 505 Status means the webserver does not support the HTTP version used by the client (in this case, your PHP program).
What version of PHP are you running, and what HTTP/Web package(s) are you using in your PHP program?
[edit...]
Some servers deliberately block some browsers -- your code may "look like" a browser that the server is configured to ignore. I would particularly check the user agent string that your code is passing along to the server.
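For instance, you can send a browser-like User-Agent with file_get_contents via a stream context (a sketch; the UA string below is just an example):
<?php
// Sketch: send a browser-like User-Agent, since some servers reject unknown clients.
$context = stream_context_create([
    'http' => [
        'method' => 'GET',
        'header' => "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)\r\n",
    ],
]);
$output = file_get_contents(
    "http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp",
    false,
    $context
);
var_dump($output);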
Check in your PHP installation (php.ini file) whether allow_url_fopen is enabled.
If not, any calls to file_get_contents will fail.
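You can also check it from code (a quick sketch):
<?php
// Quick check: a truthy value means remote URLs are allowed in file_get_contents().
var_dump(ini_get('allow_url_fopen'));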
It works fine for me.
That site could be blocking the server that you're using to access it.
When you open the URL in your browser, your own ISP is used to get the information and display it in your browser. But when you run it from PHP, the ISP of your web host is used to get the information, which is then passed back to you.
Maybe you can do this to check what kind of headers it's returning for you:
$headers=get_headers("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
print_r($headers);
I tried using cURL to POST to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried $_SERVER[DOCUMENT_ROOT].'/dir/to/file.php' with the POST data. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (it's an API).
It's simply POST data and an XML response. I could write an HTML form tag and input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
it is post data. it's for an API that is encrypted so i'm not sure exactly how it works
Without further details, nobody can tell you what you should do.
But if it's indeed a POST-receiving script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$data = curl($url)->post(1)->postdata(array("billing"=>1234345))
->returntransfer(1)->exec();
// (you would use the cumbersome curl_setopt() calls instead)
So you get back an XML or JSON or whatever response.
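Spelled out with the actual curl_setopt() calls, that would look roughly like this (same placeholder URL and field as above):
<?php
// Sketch: the same POST written out with plain curl_setopt() calls.
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array("billing" => 1234345));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);

if ($data === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
// $data now holds the XML/JSON response body.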
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Using CURL should only be done if you absolutely NEED the http layer to get involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean you need to have the 'target' script be able to handle a file upload plus whatever other data you need to send, and then that script would end up having to do file operations ANYWAYS, so in effect you've gone on a round-the-world flight just so you can move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had two files in the same folder on a Linux box, in a folder that is not served by my webserver, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, since file:// won't work. So:
npm install http-server -g
Then run the HTTP server in the folder where the file is:
$ http-server
See: Using node.js as a simple web server.
Then test the curl request from the command line against localhost, like:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.
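In PHP the same request would look something like this (a sketch, assuming the same port 8081 as in the command above):
<?php
// Sketch: the same request as the command-line curl above, but from PHP.
$ch = curl_init("http://127.0.0.1:8081/file.html");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
echo $response;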