Is there a way to set curl to something like curl --url localhost/mysite/file.php --returntransfer 1? I have a script that I'd rather execute from the CLI due to the many things it handles, and I'd like to see the output while it's processing. (By the way, I'm accessing a Laravel route as the --url parameter, but I don't think that matters.)
At the moment I'm trying to analyze a PHP application. The profiler starts working and then stops at 1/10. While it runs, the memory usage of the Docker container goes straight up. After the failure there is an entry in the Blackfire log like "Profile data is truncated."
I've tried requesting it with curl from the CLI and with Firefox. If I call the page normally in Firefox or via curl, I get the correct response:
curl --request GET --url 'http://xxx/index.php?eID=contacts&optigemId=1335600' --header 'cookie: fe_typo_user=xxxx' --cookie fe_typo_user=xxx
By chance, do you have any disabled PHP functions in your php.ini or any other PHP configuration file for your domain?
(disable_functions in php.ini, see https://www.php.net/manual/en/ini.core.php#ini.disable-functions)
I had to delete the function opcache_get_status from the list of disabled functions to get Blackfire to work with my PHP configuration on Plesk.
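If you want to check this from code rather than by reading php.ini, PHP exposes the setting through ini_get(). A minimal sketch, assuming nothing beyond a standard PHP install (disabled_functions() is a helper name made up for this example):

```php
<?php
// Sketch: list the functions disabled via disable_functions in php.ini.
// disabled_functions() is a name invented for this example.
function disabled_functions(): array
{
    $raw = (string) ini_get('disable_functions');
    return array_values(array_filter(array_map('trim', explode(',', $raw))));
}

// function_exists() also reports false for disabled functions, so this
// tells you whether opcache_get_status() is actually callable:
var_dump(function_exists('opcache_get_status'));
```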
Cheers.
I have a page (built with a PHP framework) that adds records to a MySQL db in this way:
www.mysite.ext/controller/addRecord.php?id=number
which adds a row to a table with the id number passed via POST, plus other information such as a timestamp, etc.
So, I moved my entire web application to another domain, and all HTTP requests work fine from the old to the new domain.
The only remaining issue is curl: I wrote a bash script (under Linux) that runs curl against this link. Now it obviously does not work, because curl returns a message saying the page was moved.
OK, I edited the curl syntax this way:
#! /bin/sh
link="www.myoldsite.ext/controlloer/addRecord.php?id=number"
curl --request -L GET $link
I added -L to follow the URL to its new location, but curl returns the error in this topic's title.
It would be easier if I could directly modify the link to add the new domain, but I do not have physical access to all the devices.
GET is the default request type for curl. And that's not the way to set it.
curl -X GET ...
That is the way to set GET as the method keyword that curl uses.
It should be noted that curl selects which method to use on its own, depending on what action you ask for: -d will do POST, -I will do HEAD, and so on. If you use the --request / -X option you can change the method keyword curl selects, but you will not modify curl's behavior. This means that if you, for example, use -d "data" to do a POST, you can change the method keyword to PROPFIND with -X and curl will still think it sends a POST. You can change the normal GET to a POST method by simply adding -X POST on a command line like:
curl -X POST http://example.org/
... but curl will still think and act as if it sent a GET so it won't send any request body etc.
More here: http://curl.haxx.se/docs/httpscripting.html#More_on_changed_methods
Again, that's not necessary. Are you sure the link is correct?
How can I check in my PHP script that the script was executed by cron (via the wget utility), and not from a web browser by some user?
There is no reliable solution. Anything wget can do, your browser can do too.
Your best shot is something like pointing wget at http://example.com/mysript.php?accesskey=some+really+obscure+password+that+only+you+should+know and checking for that access key in your script. Of course, if anyone gets hold of the key, this kind of protection is useless, but it's far more dependable than blindly relying on User-Agent sniffing.
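A minimal sketch of that check on the receiving side. The key value is a placeholder and access_allowed() is a helper name made up here:

```php
<?php
// Sketch: server-side check for the shared access key described above.
// access_allowed() and the key value are names made up for this example.
function access_allowed(array $query, string $expected): bool
{
    // hash_equals() compares in constant time, avoiding timing leaks.
    return isset($query['accesskey'])
        && hash_equals($expected, (string) $query['accesskey']);
}

// In the script that cron requests via wget:
// if (!access_allowed($_GET, 'some-really-obscure-password')) {
//     http_response_code(403);
//     exit('forbidden');
// }
```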
A possibility would be to use $argv. Check whether $argv[1] is a certain value, and call the script from crontab with that value as an argument, like so: script.php argument1.
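A sketch of that $argv approach; the crontab line would be something like `php /path/to/script.php argument1`. The token "argument1" follows the answer above, and invoked_with_token() is a helper name made up here:

```php
<?php
// Sketch: treat the script as cron-invoked only when the expected
// first argument is present. invoked_with_token() is a made-up helper.
function invoked_with_token(?array $argv, string $token): bool
{
    return is_array($argv) && isset($argv[1]) && $argv[1] === $token;
}

if (invoked_with_token($argv ?? null, 'argument1')) {
    // ... cron-only work ...
}
```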
Your question is a bit difficult to understand, but I guess you want to make sure a PHP script was requested by wget (which was initiated by cron).
Although it might be more efficient to call the PHP script directly from cron, in this case you could check the server's logs and search for a user agent matching something like wget.
An insecure solution would be to check the headers for the User-Agent:
wget -d http://www.google.com/
---request begin---
GET / HTTP/1.0
User-Agent: Wget/1.12 (linux-gnu)
Accept: */*
Host: www.google.com
Connection: Keep-Alive
---request end---
So you could do:
<?php
$userAgent = $_SERVER['HTTP_USER_AGENT'] ?? '';
if (strstr($userAgent, 'Wget')) {
    // request made by wget
}
You can pass arguments to your script in crontab: http://php.net/manual/reserved.variables.argv.php
Then, by checking for those args, you'll know whether your script was run from the command line or from the web.
EDIT :
Seeing answers, let's make this clear.
Calls made with wget, cURL, or any other HTTP request WON'T PASS ARGS!
Args will only be passed on a local call (like: php script.php arg1 arg2).
Try it out yourself on your server if you aren't sure about it.
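For completeness: if, as suggested above, cron runs the PHP binary directly instead of going through wget, no argument convention is needed at all, because the PHP_SAPI constant already reports how the script was started. A minimal sketch:

```php
<?php
// PHP_SAPI is 'cli' when the script is started from the command line
// (e.g. by cron running `php script.php`); web SAPIs report names like
// 'apache2handler', 'fpm-fcgi' or 'cgi-fcgi' instead.
function is_cli(): bool
{
    return PHP_SAPI === 'cli';
}

if (is_cli()) {
    // ... cron-only work ...
}
```

Note this does not help when cron reaches the script over HTTP via wget; in that case the access-key approach above is the dependable option.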
So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver = shell_exec("curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver;
The curious thing is that when I run this php script from command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly, I get the response I was expecting:
verdict = authentic
(Edit: I should probably mention that if I put some bash code that does not contain curl inside the shell_exec argument, the command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world", provided it exists and is writable. End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP's running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is administrative privileges: when you run the command directly, you are running it as root, and thus the command gets executed. But when you run the command through PHP, it runs as a regular (web server) user, and by default that user does not have the privileges to run such commands.
You have to change the shell_exec settings through cPanel or the Apache config file. But it is not recommended to give shell_exec access to that user, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
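A sketch of the same upload through PHP's curl extension; the URL and file path are taken from the question, and upload_wav() is a helper name made up here:

```php
<?php
// Sketch: the shell command `curl -F file=@... http://otherserver`
// rewritten with PHP's curl extension. upload_wav() is a made-up helper.
function upload_wav(string $path, string $url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_POSTFIELDS, [
        // CURLFile is the PHP 5.5+ equivalent of -F file=@path
        'file' => new CURLFile($path, 'audio/wav', basename($path)),
    ]);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response; // string on success, false on failure
}

// $verdict = upload_wav('uploads/2013-7-24-17-31-43-29097-flash.wav', 'http://otherserver');
```

This also sidesteps the environment differences between the CLI shell and the web server user that cause exactly the kind of failure described in the question.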
I have a homebase script that many other scripts ping for information using cURL. I need to determine the domain names of the callers. Can I do this with tricks just in my homebase script, using PHP?
Hudson
You could send a custom HTTP header with your cURL request that contains the script name, something like:
X-Script-Name: myscript.php
I don't think cURL automatically adds anything about the calling script, so you would have to edit the calling scripts for this.
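A sketch of both sides, assuming you can edit the callers. The X-Script-Name header name and the helper names are made up here; the general PHP behavior is that custom request headers show up in $_SERVER with an HTTP_ prefix, uppercased, with dashes turned into underscores:

```php
<?php
// Caller side (made-up helper): send an identifying header with the request.
function ping_homebase(string $url, string $scriptName)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, ['X-Script-Name: ' . $scriptName]);
    $out = curl_exec($ch);
    curl_close($ch);
    return $out;
}

// Homebase side: "X-Script-Name" arrives as $_SERVER['HTTP_X_SCRIPT_NAME'].
function server_key(string $header): string
{
    return 'HTTP_' . strtoupper(str_replace('-', '_', $header));
}

$caller = $_SERVER[server_key('X-Script-Name')] ?? 'unknown';

// Without touching the callers, the closest thing to a domain name is a
// reverse DNS lookup on the caller's IP: gethostbyaddr($_SERVER['REMOTE_ADDR']).
```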