PHP curl_exec doesn't react to server response

I'm trying to upload files to another server via curl. The script uploads a file to my own server; my server processes the upload and responds, and my script continues as it should. However, if the uploaded file is large (around 500 MB, though the threshold seems to vary), the script keeps running even after my server has responded (verified by logging the output of the server-side script). The client is a Windows 7 x64 machine running XAMPP (I've tried different versions). If I upload the same file to the same server-side script via an HTML form, everything works fine and I get my response. What could be the problem?
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    "MAX_FILE_SIZE" => "2147483648",
    "action"        => "upload",
    "userfile"      => "@" . $filepath)); // legacy "@" upload syntax (pre-PHP 5.5)
if ($uploadresponse = curl_exec($ch)) {
    echo "Upload done!";
} else {
    echo "Curl error no. " . curl_errno($ch) . " (" . curl_error($ch) . ")";
}

curl has a timeout that limits how long the request may run, regardless of success. You can extend it with this option (I suspect you are hitting the timeout while uploading a large file):
curl_setopt($ch, CURLOPT_TIMEOUT, 5 * 60); // 5 minutes

You can also remove the time limit entirely by setting CURLOPT_TIMEOUT to 0:
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
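Note that curl's timeout is not the only limit in play: PHP's own max_execution_time can also kill a long-running upload. A minimal sketch of lifting both limits while still bounding the connect phase (the URL is a placeholder, not from the question):

```php
<?php
// Sketch: lift both curl's overall timeout and PHP's script time limit
// for a long upload; the URL below is a placeholder for your endpoint.
set_time_limit(0);                             // no PHP execution time limit
$ch = curl_init('https://example.com/upload.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);          // no overall curl timeout
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // but still bound the connect phase
```

With both limits lifted, a stalled transfer can hang forever, so keeping a finite CURLOPT_CONNECTTIMEOUT is usually a sensible compromise.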

Related

PHP curl download remote stream / chunked data

I have audio files on a remote server that are streamed/chunked to the user. This all works great in the client's browser.
But when I try to download and save the files locally from another server using curl, it only seems able to download small files (less than 10 MB) successfully; anything larger and it seems to only download the header.
I assume this is because of the chunking, so my question is: how do I make curl download the larger (chunked) files?
With wget on the CLI on Linux, this is as simple as:
wget -cO - https://example.com/track?id=460 > mytrack.mp3
This is the function I have written using curl in PHP, but as I say, it only downloads headers on large files:
private function downloadAudio($url, $fn)
{
    $ch = curl_init($url);
    $path = TEMP_DIR . $fn;
    $fp = fopen($path, 'wb');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_AUTOREFERER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return file_exists($path);
}
In my case it was failing because I had forgotten to increase the default PHP memory_limit on the origin server.
It turned out after posting this question that it was actually successfully downloading any file below roughly the 100 MB mark, not 10 MB as I had stated in the question. As soon as I realised this I checked the memory_limit and, lo and behold, it was set to the default 128M.
I hadn't noticed any problems client-side because the response was chunked, but when the server tried to grab an entire 300 MB file in under a second, the memory limit must have been hit.
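If raising memory_limit isn't an option, the origin side can avoid the problem entirely by never buffering the whole file. A rough sketch (streamFile is a made-up helper, not from the question) that echoes a file in fixed-size chunks so memory use stays flat regardless of file size:

```php
<?php
// Sketch: serve a file in 8 KB chunks instead of file_get_contents(),
// so memory use stays flat no matter how large the file is.
function streamFile($path, $chunkSize = 8192)
{
    $in = fopen($path, 'rb');
    if ($in === false) {
        return false;
    }
    while (!feof($in)) {
        echo fread($in, $chunkSize);
        flush();                      // push each chunk out to the client
    }
    fclose($in);
    return true;
}
```

A handler serving the audio would call streamFile() after emitting the Content-Type header, instead of reading the entire file into a string first.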

How to upload file from local to server using curl in php?

I want to upload a file from local to a server using curl in PHP, without using an HTML form.
My PHP version is 5.2.6.
I have tried many different ways and done a lot of research on uploading a file to a server with curl in PHP, but none of the solutions I found solve my problem.
Below is my code:
// open file descriptor
$fp = fopen ("user.com/user/fromLocal.txt", 'w+') or die('Unable to write a file');
// file to download
$ch = curl_init('C:/wamp/www/user/fromLocal.txt');
// enable SSL if needed
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
// output to file descriptor
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
// set large timeout to allow curl to run for a longer time
curl_setopt($ch, CURLOPT_TIMEOUT, 1000);
curl_setopt($ch, CURLOPT_USERAGENT, 'any');
// Enable debug output
curl_setopt($ch, CURLOPT_VERBOSE, true);
echo $ch;
echo $fp;
curl_exec($ch);
curl_close($ch);
fclose($fp);
Expected output:
The file can upload to server and view.
Actual output:
Unable to write the file
I think you're missing some important information.
fopen ("user.com/user/fromLocal.txt", 'w+')
does not do what you think: you cannot open a file for writing on a remote server like that.
To send a file to the server, the server has to be configured to accept a POST request, and you have to send the file through that endpoint.
If you had a form, would you submit it to "user.com/user/fromLocal.txt"?
You have to build the form data with curl and send it to a server that is ready to accept your request. There are different ways to accomplish that, and the simplest is exactly that: send a form using curl instead of HTML. But you absolutely cannot write a file to a server the way you are doing it.
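A minimal sketch of what that looks like in practice. On PHP >= 5.5 you attach the file with CURLFile; on PHP 5.2 (the asker's version) you would instead prefix the path with "@" in the POSTFIELDS array. The URL is a placeholder for an endpoint that must itself read $_FILES and call move_uploaded_file():

```php
<?php
// Sketch: POST a local file to a hypothetical upload endpoint.
// 'https://user.com/upload.php' is a placeholder; the receiving script
// must read the file from $_FILES['userfile'].
$filepath = 'C:/wamp/www/user/fromLocal.txt';
$ch = curl_init('https://user.com/upload.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    // PHP >= 5.5; on PHP 5.2 use 'userfile' => '@' . $filepath instead
    'userfile' => new CURLFile($filepath, 'text/plain', 'fromLocal.txt'),
));
$response = curl_exec($ch);   // false on failure; check curl_error($ch)
curl_close($ch);
```

The key difference from the question's code: the file travels in the request body as multipart form data, rather than being "written" to a remote path with fopen.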

PHP curl unable to access other servers via VPN

I have a server connected to my company via OpenVPN and I am trying to retrieve JSON from another server using PHP's curl command.
Running the most stripped back version of the code in PHP's interactive mode from the command line (php -a), works fine:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'status.int.companyname.com');
curl_exec($ch);
{...expected json response after 2 seconds}
However, when I run this from within Apache, the script just hangs.
Setting some timeouts and outputting the error:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
echo 'Error: '.curl_error($ch);
Gives the error as
Error: Resolving timed out after 10517 milliseconds
Initially I thought this was a DNS-related issue, but changing the URL to use the IP address or specifying the address to resolve to
curl_setopt($ch, CURLOPT_RESOLVE, [ 'status.int.companyname.com:80:10.10.0.5']);
yielded a connection error:
Error: Connection timed out after 10001 milliseconds
My next thought was that Apache was using the wrong interface. The VPN is the default gateway and works fine for everything else, but I tried forcing the interface with
curl_setopt($ch, CURLOPT_INTERFACE, '10.33.0.105');
This made no difference. After reading a few other posts, I tried adding the following in various combinations, but it also made no difference.
curl_setopt($ch, CURLOPT_DNS_USE_GLOBAL_CACHE, false);
curl_setopt($ch, CURLOPT_DNS_CACHE_TIMEOUT, 2);
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
file_get_contents has the same problem.
I don't understand it: curl on the command line works, and PHP in interactive mode works fine. It just doesn't work when the PHP script is executed via Apache.
Other things I've tried:
starting Apache after the VPN is connected
setting Apache to only bind to IPv4
Resolv.conf contains one nameserver, which is the correct one for the company VPN
The differences between the CLI and Apache versions of php.ini are as expected:
-disable_functions =
+disable_functions = pcntl_alarm,pcntl_fork,pcntl_waitpid,pcntl_wait,pcntl_wifexited,pcntl_wifstopped,pcntl_wifsignaled,pcntl_wifcontinued,pcntl_wexitstatus,pcntl_wtermsig,pcntl_wstopsig,pcntl_signal,pcntl_signal_dispatch,pcntl_get_last_error,pcntl_strerror,pcntl_sigprocmask,pcntl_sigwaitinfo,pcntl_sigtimedwait,pcntl_exec,pcntl_getpriority,pcntl_setpriority,
-expose_php = On
+expose_php = Off
-memory_limit = -1
+memory_limit = 128M
Has anyone seen this before? I've lost a few hours of my day on this now and it's driving me mad!
Thanks

Send data from one server to another by curl or any other way

I have two servers.
The first (Local) is the server which has the script.
The second (Remote) is the server which has uploaded files only (just for storage).
Now I'm confused about how to upload the files. I can think of two approaches and I don't know which is better.
The approaches:
Upload the file to the local server first and run all the security checks there (file size, file type, and so on), then upload it again to the second (remote) server.
Upload the file directly to the second (remote) server, run all the checks there, then send the file information to the first (local) server to store in the database.
Or is there another way that is better than both?
I have tried the second approach: send the file directly to the remote server, and after everything is OK, the remote server sends all the information about the uploaded file to the local server via curl.
For example:
The following code is in a PHP file on the second (remote) server:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localServer/script/receiveReques.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_TIMEOUT, 2);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('fileName' => 'filename.zip', 'size' => 1000, 'path'=> 'http://remoteserver.com/files/filenmae.zip'));
curl_exec($ch);
curl_close($ch);
But the problem is that I don't know how to get the information that was sent.
To summarise, my questions are:
What is the best way to do this?
How do I receive the file information sent by curl from the second (remote) server?
You have
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
so all you need to do is use the data it returns:
$data = curl_exec($ch);
All of the response data is now in $data.
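As for the receiving side: the fields sent via CURLOPT_POSTFIELDS arrive in $_POST of receiveReques.php on the local server. A rough sketch (the helper name is made up, not from the question) of how that script could read them and echo a reply, which then ends up in $data on the remote side thanks to CURLOPT_RETURNTRANSFER:

```php
<?php
// Sketch for receiveReques.php on the local server: read the fields
// posted by the remote server and return a confirmation string.
function receiveFileInfo(array $post)
{
    if (!isset($post['fileName'], $post['size'], $post['path'])) {
        return 'ERROR: missing fields';
    }
    $fileName = $post['fileName'];
    $size     = (int) $post['size'];
    $path     = $post['path'];
    // ... store $fileName, $size and $path in the database here ...
    return "OK: stored $fileName ($size bytes) from $path";
}

echo receiveFileInfo($_POST);   // this output becomes $data on the remote side
```

Note that the 2-second CURLOPT_TIMEOUT in the question's code leaves very little time for the local server to respond; you may want to raise it.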

How to set a timeout on PHP5 curl calls? Published CURL options do not seem to work

We've written a script that pulls data from an external server. If that server goes down, we don't want our server stuck waiting for the data, since we process a lot of it and don't want things bogged down. To address this, we're trying to time out our curl calls if they take more than a couple of hundred milliseconds.
I found documentation saying that CURLOPT_TIMEOUT_MS and CURLOPT_CONNECTTIMEOUT_MS should be available in my version of PHP and libcurl, but the call does not seem to time out, even if I set the timeout to 1 ms.
$url = "http://www.cnn.com";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER,0); //Change this to a 1 to return headers
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 1);
$data = curl_exec($ch);
curl_close($ch);
Does anyone know what we're doing wrong or another way to do this?
Saw this in unresponsive dns server and curl multi timeouts not working:
"...We have had some times where a site that we pull information from has had its DNS server become unresponsive. When this happens, the timeouts set in curl (PHP bindings) do not work as expected. It times out after 1 min 14 sec with 'Could not resolve host: www.yahoo.com (Domain name not found)'. To make this happen in a test env we modify /etc/resolv.conf to have a nameserver that does not exist (nameserver 1.1.1.1). No matter what they are set at (CURLOPT_CONNECTTIMEOUT, CURLOPT_CONNECTTIMEOUT_MS, CURLOPT_TIMEOUT, CURLOPT_TIMEOUT_MS), they don't time out when we can't get to the DNS server. I use curl_multi because we have multiple sources that we pull info from at the same time. The example below makes one call for simplicity. And as a side note, curl_errno does not return an error code even though there was an error. Not sure why..."
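One commonly reported cause of sub-second timeouts being ignored: when libcurl is built without the asynchronous DNS resolver, it implements timeouts with SIGALRM, which cannot represent values below one second and misbehaves outside the main thread. Setting CURLOPT_NOSIGNAL is the usual workaround; a sketch, under that assumption:

```php
<?php
// Sketch: millisecond timeouts typically need CURLOPT_NOSIGNAL on
// libcurl builds that use the signal-based (blocking) DNS resolver.
$ch = curl_init('http://www.cnn.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);            // disable SIGALRM-based timeouts
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 200);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 400);
$data = curl_exec($ch);
if ($data === false && curl_errno($ch) === 28) {  // 28 = CURLE_OPERATION_TIMEDOUT
    // the request exceeded the millisecond limits above
}
curl_close($ch);
```

This does not fix the unresponsive-nameserver case described in the quote, though; for that, a libcurl built with the c-ares (threaded) resolver is generally needed so that DNS lookups honour the configured timeouts.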
