I am a noob on cURL - I am trying to authenticate a session with a server according to the documentation provided to me.
When I try the command below from OS X, it works fine and I end up getting a "Session Successfully Authenticated" message.
But when I try the exact same command from Windows (tried on 3 different machines), it gives me an "Authentication failed!" message.
curl --request POST http://xx.yy.zz.nnn/login.php --data 'username=ams' --data 'password=<password>' -c cookie.txt -v
In the verbose output, the only difference I see between the two systems is the content length: Content-Length: 34 on Windows versus Content-Length: 30 on OS X.
When I run cURL from Windows, is there any special consideration that needs to be taken into account when sending data?
Any help/pointer to resolve this is much appreciated. Thanks.
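One possible explanation (my assumption, not confirmed by the verbose output shown here): cmd.exe does not strip single quotes the way Unix shells do, so the four quote characters around the values get sent as part of the POST body, which would account for the 4-byte difference in Content-Length and for the server rejecting the credentials. On Windows, the same request with double quotes would look like this:
curl --request POST http://xx.yy.zz.nnn/login.php --data "username=ams" --data "password=<password>" -c cookie.txt -v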
Is there a way to set curl to something like curl --url localhost/mysite/file.php --returntransfer 1? I have a script that I'd rather execute from the CLI due to the many things it handles, and I'd like to see the output while it's processing. (By the way, I'm accessing a Laravel route as the --url parameter, but I don't think that matters.)
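A minimal sketch, assuming the goal is just to see or capture the response body: on the command line curl prints the body to stdout by default, so there is no returntransfer switch to set. --no-buffer shows the output while the script is still running, and command substitution captures it instead (same URL as above):
curl --no-buffer --url localhost/mysite/file.php
output=$(curl --silent --url localhost/mysite/file.php)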
First, I read some threads by people with similar problems, but none of the answers went beyond export DISPLAY=:0.0 and xauth cookies. So here is my problem, and thanks in advance for your time!
I have developed a little library which renders shelves using OpenGL and GLSL.
Over the last few days I wrapped it in a PHP extension and, surprisingly easily, it works now.
But the problem is that it works only when I execute the PHP script using the extension from the command line:
$ php r100.php (I successfully run this as the http user). The script is in Apache's web root, and if I request it from the browser I get ** CRITICAL **: Unable to open display in Apache's error_log.
So, to make things easier to test and to be sure that the problem is not in the library/extension, at the moment I just want to start xmms with the following PHP script.
<?php
echo shell_exec("xmms");
?>
This, too, works only from the shell.
I've played with the Apache configuration so much now that I really don't know what else to try.
I tried $ xhost + && export DISPLAY=:0.0
In httpd.conf I have these:
SetEnv DISPLAY :0.0
SetEnv XAUTHORITY /home/OpenGL/.Xauthority
So my problem seems to be this:
How can I make apache execute php script with all privileges that the http user has, including the environment?
Additional information:
The http user is in the video and users groups and has a login shell (bash).
I can log in as http and execute scripts with no problem, and I can run GUI programs which show up on display 0.
It seems that apache does not provide the appropriate environment for the script.
I read about some differences between CLI and CGI, but I can't run xmms with php-cgi either...
Any ideas for additional configuration?
Regards
It sounds a bit hazardous, but basically you can even add export DISPLAY=:0.0 to the Apache start-up script (on Linux, /etc/init.d/httpd or /etc/init.d/apache, depending on the distro).
And "xhost +" needs to be run on the account that is connected to the local X server as a user, though I'm wondering how this will work, since the PHP script should only live while the Apache HTTP request is ongoing.
Edit:
If this is some kind of application launcher, you can spawn it with exec("nohup /usr/bin/php script.php &"); now Apache should be released and PHP should continue working in the background.
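A minimal PHP sketch of that idea (assuming /usr/bin/php and script.php exist on the server); redirecting the output matters, because exec() only returns immediately when the backgrounded command's output goes somewhere other than back to the web server:
<?php
// Spawn the worker in the background and detach its output so exec() returns right away.
exec("nohup /usr/bin/php script.php > /dev/null 2>&1 &");
?>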
In your console, allow everyone to use the X server:
xhost +
In your PHP script, set the DISPLAY variable while executing the commands:
DISPLAY=:0 glxgears 2>&1
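Put together in PHP, a minimal sketch using the xmms example from the question (glxgears works the same way):
<?php
// Set DISPLAY inline for the spawned command and capture stderr as well.
echo shell_exec("DISPLAY=:0 xmms 2>&1");
?>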
We changed servers and installed all the necessary software, and we just cannot seem to pinpoint what is going on. A simple cURL request does not return anything. Command-line cURL commands work just fine. We are using a wrapper for cURL utilizing streams. Do PHP streams require any out-of-the-ordinary configuration? We are using the latest LAMP stack.
The var_dump output can be seen at:
http://jinimetrix.com/test.php
Have you used phpinfo(); to verify that cURL is in fact detected?
Just create a test page containing only:
<?php
phpinfo();
?>
and verify that the section entitled "curl" appears.
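Since the question mentions a wrapper built on streams rather than on the curl extension, it may also be worth checking allow_url_fopen; PHP's http:// stream wrapper refuses remote URLs when it is off, while the command-line curl binary is unaffected. A minimal sketch (the URL is a placeholder):
<?php
// The http:// stream wrapper only works when allow_url_fopen is enabled.
var_dump(ini_get('allow_url_fopen'));

// Minimal stream-based request with a short timeout, to compare against the CLI result.
$context = stream_context_create(array('http' => array('timeout' => 10)));
$body = file_get_contents('http://example.com/', false, $context);
var_dump($body === false ? error_get_last() : strlen($body));
?>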
I am trying to track down an issue with a cURL call in PHP. It works fine in our test environment, but not in our production environment. When I try to execute the cURL function, it just hangs and never ever responds. I have tried making a cURL connection from the command line and the same thing happens.
I'm wondering if cURL logs what is happening somewhere, because I can't figure out what is happening during the time the command is churning and churning. Does anyone know if there is a log that tracks what is happening there?
I think it is a connectivity issue, but our IT guy insists I should be able to access it without a problem. Any ideas? I'm running CentOS and PHP 5.1.
Updates: Using verbose mode, I've gotten an error 28, "Connect() timed out". I tried extending the timeout to 100 seconds and limiting max-redirs to 5, with no change. I tried pinging the box and also got a timeout. So I'm going to present this back to IT and see if they will look at it again. Thanks for all the help; hopefully I'll be back in half an hour with news that it was their problem.
Update 2: Turns out my box was resolving the server name with the external IP address. When IT gave me the internal IP address and I replaced it in the cURL call, everything worked great. Thanks for all the help everybody.
In your PHP, you can set the CURLOPT_VERBOSE option:
curl_setopt($curl, CURLOPT_VERBOSE, TRUE);
This then logs to STDERR, or to the file specified using CURLOPT_STDERR (which takes a file pointer):
curl_setopt($curl, CURLOPT_STDERR, $fp);
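For example, a minimal sketch (the URL and log path are placeholders):
<?php
// Log the verbose transfer details to a file for later inspection.
$curl = curl_init('http://example.com/');
$fp = fopen('/tmp/curl_verbose.log', 'w');

curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_VERBOSE, true);
curl_setopt($curl, CURLOPT_STDERR, $fp);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);

$response = curl_exec($curl);
curl_close($curl);
fclose($fp);
?>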
From the command line, you can use the following switches:
--verbose to report more info to the command line
--trace <file> or --trace-ascii <file> to trace to a file
You can use --trace-time to prepend time stamps to verbose/file outputs
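For example (the host is a placeholder):
curl --trace-ascii trace.txt --trace-time http://example.com/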
You can also use curl_getinfo() to get information about your specific transfer.
http://in.php.net/manual/en/function.curl-getinfo.php
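A minimal sketch of the timing fields that matter for a hang (assuming $curl is the cURL handle, after curl_exec() has run and before curl_close()):
<?php
// Inspect where the time went during the transfer.
$info = curl_getinfo($curl);
echo 'HTTP code:      ' . $info['http_code'] . "\n";
echo 'Connect time:   ' . $info['connect_time'] . "\n";
echo 'Total time:     ' . $info['total_time'] . "\n";
echo 'Redirect count: ' . $info['redirect_count'] . "\n";
?>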
Have you tried setting CURLOPT_MAXREDIRS? I've found that sometimes there will be an 'infinite' redirect loop for some websites that a normal browser user doesn't see.
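Something along these lines (a sketch, assuming $curl is your handle; the cap of 5 is arbitrary):
<?php
// Follow redirects, but stop after a fixed number so a redirect loop fails fast instead of hanging.
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($curl, CURLOPT_MAXREDIRS, 5);
?>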
If at all possible, try sudo-ing as the user PHP runs under (possibly the one Apache runs under).
The curl problem could have various causes that require user input, for example an untrusted certificate that is stored in the root user's trusted-certificate cache but not in the PHP user's. In that case, the command would be waiting for input that never arrives.
Update: This applies only if you run curl externally using exec, so it may not apply here.
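For example, assuming the web server runs as apache (it might be www-data or something else on your distro) and using a placeholder URL:
sudo -u apache curl -v http://example.com/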