I have a relatively simple script like the following:
<?php
$url = "localhost:2222/test.html";
echo "*** URL ***\n";
echo $url . "\n";
echo "***********\n";
echo "** whoami *\n";
echo exec('whoami');
echo "* Output **\n";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
echo $output;
When I execute it on the command line, it works - I get the meager results from within test.html.
When I run this script by loading up the built-in PHP server and browsing to the script, it hangs. No output to the screen, nothing written to the logs.
I read that user permissions can sometimes get in the way, so I used whoami to confirm that the user who started the built-in PHP server is the same one who executed the script on the command line; they are.
safe_mode is off, disable_functions is set to nothing. I can exec other commands successfully (like the whoami).
What else should I check for? Does the built-in PHP server run as some other user when it fulfills a request, perhaps?
The PHP built-in development web server is a very simple, single-threaded test server. It cannot handle two requests at once. You're trying to retrieve a file from that same server in a separate request, so you're running into a deadlock: the first request is waiting for the second to complete, but the second request cannot be handled while the first is still running.
Since PHP 7.4, the environment variable PHP_CLI_SERVER_WORKERS allows concurrent requests by spawning multiple PHP workers on the same port of the built-in web server. It is considered experimental; see the docs.
With it set, a script that is itself being served can send requests back to the same server without hanging.
PHP_CLI_SERVER_WORKERS=10 php -S ...
Works with Laravel as well:
PHP_CLI_SERVER_WORKERS=10 php artisan serve
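On versions without worker support, or as a general safety net, you can at least make the deadlock fail fast instead of hanging forever by putting a timeout on the request in the question's script; a minimal sketch, with arbitrary 10-second limits:
// A timeout turns the deadlock into a visible cURL error instead of a silent hang
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // max seconds for the whole transfer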
I think the problem is in your $url. It should look like this: $url = "http://localhost:2222/test.html"; or $url = "http://localhost/test.html";. I think that will solve your problem. Thanks for your question. Best of luck.
Related
I have call-cli.php, which I am executing via the command line.
Code of call-cli.php:
echo "File 1 ".php_sapi_name(); // returns cli
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost/curltest/step1.php?productId=12");
$response = curl_exec($ch);
curl_close($ch);
Code of step1.php:
echo " step1 ".php_sapi_name(); // returns apache2handler
if(php_sapi_name()==='cli') {
// To do execute code regarding cli
}
if(php_sapi_name()==='apache2handler') {
// To do execute code regarding apache2handler
}
When the admin runs step1.php via the browser, it should execute the apache2handler branch; when it is run via CLI, the other one.
I am getting productId from call-cli.php, so I need to invoke cURL from call-cli.php.
So I want to know: is there any way for the cURL request made from the CLI file to make step1.php return cli instead of apache2handler, or is there any other suggestion?
No. cURL requests step1.php over HTTP like a browser would, and that request is handled by the web server, Apache in this case, so php_sapi_name() reports apache2handler.
A PHP script will only report cli if it is executed from a terminal.
The cURL call in call-cli.php is made from the CLI, but since you are echoing the result of step1.php, which is served via Apache, you see apache2handler in your output.
If you call step1.php via the command line (e.g. php step1.php), you will see that it also returns cli.
I'm not sure what you are trying to achieve, so please clarify your question.
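If the goal is for step1.php to know that the request originated from the CLI script, one option is to send a marker along with the request; a minimal sketch, where the header name X-Invoked-From is made up for illustration:
<?php
// call-cli.php: tag the request with a custom (hypothetical) header
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost/curltest/step1.php?productId=12");
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Invoked-From: cli'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$response = curl_exec($ch);
curl_close($ch);

// step1.php: branch on the marker instead of php_sapi_name()
if (isset($_SERVER['HTTP_X_INVOKED_FROM']) && $_SERVER['HTTP_X_INVOKED_FROM'] === 'cli') {
    // request came from the CLI caller
} else {
    // ordinary browser request
}
Note that any client can send such a header, so treat it as a hint, not as authentication.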
So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver = shell_exec("curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver;
The curious thing is that when I run this php script from command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is administrative privileges. When you run the command directly, you are running it as root, and so the command gets executed. But when you run the command through PHP, it runs as the web server user, and by default that user does not have the privileges to run such shell_exec commands.
You would have to change the shell_exec settings through cPanel or the Apache config file. However, giving shell_exec access to the web user is not recommended, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
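For reference, here is a minimal sketch of the same upload using PHP's cURL extension. The field name file, the local path, and http://otherserver are taken from the question; CURLFile requires PHP 5.5+ (on older versions the '@/path' prefix in CURLOPT_POSTFIELDS served the same purpose):
<?php
// Rough equivalent of: curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://otherserver');
curl_setopt($ch, CURLOPT_POST, 1);
// CURLFile attaches the file as a multipart/form-data upload
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('uploads/2013-7-24-17-31-43-29097-flash.wav'),
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$ver = curl_exec($ch);
curl_close($ch);
echo $ver;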
I am seeing a very bizarre problem with a PHP application I am building.
I have two virtual hosts on my development server (Windows 7 64-bit): sometestsite.com and endpoint.sometestsite.com.
In my hosts file, I configured sometestsite.com and endpoint.sometestsite.com to point to 127.0.0.1.
Everything worked when the server was running Apache 2.4.2 with PHP 5.4.9 as an FCGI module.
I then removed Apache and installed nginx-1.2.5 (Windows build). I got php-cgi.exe running as a service and everything seemed to work fine.
The problem is that a cURL call from sometestsite.com to endpoint.sometestsite.com that previously worked now times out.
I then moved that piece of code by itself to a small PHP file for testing:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://endpoint.sometestsite.com/test');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('provider' => urlencode('provider'),
                                           'key' => urlencode('asdf')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//Execute and get the data back
$result = curl_exec($ch);
var_dump($result);
This is what I receive in the PHP logs:
PHP Fatal error: Maximum execution time of 30 seconds exceeded in D:\www\test5.php on line 22
PHP Stack trace:
PHP 1. {main}() D:\www\test5.php:0
However, if I run the same request using CLI CURL (via Git Bash), it works fine:
$ curl -X POST 'http://endpoint.sometestsite.com/test' -d'provider=provider&key=asdf'
{"test": "OK"}
This is quite strange as the PHP is exactly the same version and has the same configuration as when Apache was used.
I am not sure if this is a web server configuration issue or a problem with PHP's CURL yet.
Can anyone provide some insight/past experiences as to why this is happening?
Nginx does not spawn your php-cgi.exe processes for you. If you came from Apache like me and used mod_fcgid, you will find that you have many php-cgi.exe processes in the system.
Because Nginx does not spawn the PHP process for you, you will need to start the process yourself. In my case, I have php-cgi.exe -b 127.0.0.1:9000 running as a service automatically. Nginx then pushes all requests for PHP to the PHP handler and receives a response.
Problem: PHP-FPM does not work on Windows (as of 5.4.9). FPM is a neat little process manager that sits in the background and manages the spawning and killing of PHP processes when handling requests.
Because this is not possible on Windows, we can only serve one request at a time, similar to the problem experienced here.
In my case, the following happens: Call a page in my application on sometestsite.com which makes a call to php-cgi.exe on 127.0.0.1:9000. Inside, a CURL request calls a page on endpoint.sometestsite.com. However, we are unable to spawn any new PHP processes to serve this second request. The original php-cgi.exe is blocked by serving the request that is running the CURL request. So, we have a deadlock and everything then times out.
The solution I used (it is pretty much a hack) is to use this python script to spawn 10 PHP processes.
You then use an upstream block in nginx (as per the docs for the script) to tell nginx that there are 10 processes available.
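For illustration, such an upstream block might look roughly like this; the ports 9000-9009 are an assumption about where the spawned php-cgi.exe processes listen:
# Hypothetical nginx config: fan PHP requests out over several php-cgi.exe processes
upstream php_backend {
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    # ... one entry per spawned php-cgi.exe process, ten in total
}

server {
    listen 80;
    location ~ \.php$ {
        fastcgi_pass php_backend;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}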
Things then worked perfectly.
Having said that, please do not ever use this in production (you are probably better off running nginx and php-fpm on Linux anyway). If you have a busy site, 10 processes may not be enough. However, it can be hard to know how many processes you need.
However, if you do insist on running nginx with PHP on Windows, consider running PHP-FPM within Cygwin as per this tutorial.
Be sure that you run the script on the console as the same user that is used to run the CGI process. If they are not the same, they may have different permissions. For me, the problem was firewall rules that disallowed opening external connections for the owner of the CGI process.
I am trying to track down an issue with a cURL call in PHP. It works fine in our test environment, but not in our production environment. When I try to execute the cURL function, it just hangs and never ever responds. I have tried making a cURL connection from the command line and the same thing happens.
I'm wondering if cURL logs what is happening somewhere, because I can't figure out what is happening during the time the command is churning and churning. Does anyone know if there is a log that tracks what is happening there?
I think it is connectivity issues, but our IT guy insists I should be able to access it without a problem. Any ideas? I'm running CentOS and PHP 5.1.
Updates: Using verbose mode, I've gotten an error 28 "Connect() Timed Out". I tried extending the timeout to 100 seconds, and limiting the max-redirs to 5, no change. I tried pinging the box, and also got a timeout. So I'm going to present this back to IT and see if they will look at it again. Thanks for all the help, hopefully I'll be back in a half-hour with news that it was their problem.
Update 2: Turns out my box was resolving the server name with the external IP address. When IT gave me the internal IP address and I replaced it in the cURL call, everything worked great. Thanks for all the help everybody.
In your PHP code, you can set the CURLOPT_VERBOSE option:
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);
This then logs to STDERR, or to the file specified using CURLOPT_STDERR (which takes a file pointer):
curl_setopt($curl, CURLOPT_STDERR, $fp);
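Putting the two together, a minimal sketch that writes the verbose transfer log to a file (the URL and the log path curl_debug.log are placeholders):
<?php
$fp = fopen('curl_debug.log', 'w');               // arbitrary log file

$curl = curl_init('http://example.com/');         // placeholder URL
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_VERBOSE, true);        // emit a verbose transfer log...
curl_setopt($curl, CURLOPT_STDERR, $fp);          // ...into the file instead of STDERR

$result = curl_exec($curl);
curl_close($curl);
fclose($fp);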
From the command line, you can use the following switches:
--verbose to report more info to the command line
--trace <file> or --trace-ascii <file> to trace to a file
You can use --trace-time to prepend time stamps to verbose/file outputs
You can also use curl_getinfo() to get information about your specific transfer.
http://in.php.net/manual/en/function.curl-getinfo.php
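Continuing the sketch above, and assuming $curl is still an open handle (call this before curl_close()), a few fields that are useful when hunting a timeout:
$info = curl_getinfo($curl);
echo $info['http_code'] . "\n";              // HTTP status of the transfer
echo $info['total_time'] . "s total\n";      // total transfer time
echo $info['connect_time'] . "s to connect\n";

// curl_errno()/curl_error() explain a failed transfer, e.g. error 28 (timeout)
if (curl_errno($curl)) {
    echo 'cURL error ' . curl_errno($curl) . ': ' . curl_error($curl) . "\n";
}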
Have you tried setting CURLOPT_MAXREDIRS? I've found that sometimes there will be an 'infinite' redirect loop for some websites that a normal browser user doesn't see.
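For example, assuming $ch is an existing cURL handle, capping redirects makes a loop fail fast instead of hanging (5 is an arbitrary limit):
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects at all
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);         // but give up after 5 hops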
If at all possible, try sudo-ing to the user PHP runs under (possibly the one Apache runs under) and running the command there.
The curl problem could have various causes that require user input, for example an untrusted certificate that is stored in the trusted-certificate cache of the root user but not in that of the PHP user. In that case, the command would be waiting for input that never arrives.
Update: this applies only if you run curl externally using exec, so it may not apply here.