In a PHP script I want to check certain file types, which I do like this:
$info = explode('/',system("file -bi -- uploads/img/test.gif"));
if ( $info[0])
echo 'ok';
This works just fine as long as the URL has no subdomain (e.g. domain.com), but it is not working under en.domain.com. It appears that the file is simply not found.
On the other hand, if in the same script I check for the existence of the file:
if ( file_exists("uploads/img/test.gif"))
echo 'exists';
The file is found whether the current URL is domain.com or en.domain.com. Why is the file not found when the system() function is used?
Solved
Friends, I finally figured out the solution after nearly three working days. Maybe you can imagine how relieved I am.
The solution was in the PHP options: activating mod_rewrite. It was already activated for the regular domain, but not for the subdomains. I guess I could have thought of this earlier. Even though your answers didn't lead me to the solution, I still thank you for your input, as it pointed in the same directions I had been trying myself over the past days.
The shell command will not necessarily use the same working directory as the PHP script (one may be the user's home directory and the other the webserver's document root, I think). Use absolute path(s) in the shell command and this will probably work how you want it.
You could work out which directory the shell command actually runs in by using pwd, e.g.
echo system('pwd');
Or you could use finfo_open() instead of the shell command, which should avoid the problem entirely.
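A minimal sketch of the finfo approach, assuming the Fileinfo extension is enabled (the uploads/img/test.gif path is the one from the question):

```php
<?php
// Return the MIME type of a file via the Fileinfo extension,
// instead of shelling out to `file -bi`.
function mimeTypeOf(string $path): string
{
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime  = finfo_file($finfo, $path);
    finfo_close($finfo);
    return $mime === false ? '' : $mime;
}

// __DIR__ anchors the path to this script's directory, so the result
// does not depend on the shell's (or webserver's) working directory.
$path = __DIR__ . '/uploads/img/test.gif';
if (file_exists($path) && strpos(mimeTypeOf($path), 'image/') === 0) {
    echo 'ok';
}
```

Because the path is built from __DIR__, the check behaves the same regardless of which (sub)domain served the request.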
Related
So I am trying to get shell_exec to give me the output of the program, but it is not giving me any response when I echo it. I have tried rewriting the output variable without an input variable, and I have tried to provoke errors, hoping an error would give me some sort of output, but still nothing. Here is the code I am having issues with:
<?php
$key = $_GET['key'];
$output = shell_exec("node totp.js ".$key."");
echo "Your output is = $output";
?>
Thanks!
Edit:
I tried spelling out where node is (/usr/bin/node) and where totp.js is (/var/www/html/totp.js), and it gives an error in my log: No such file or directory. I took off everything other than trying to execute node, and I get the same "no such file" error. When I go back to what I had to begin with, I now get an error that the command is not found, which was not appearing before. Is there a way to allow PHP to roam outside of the execution folder, i.e. the htdocs/html folder?
Edit 2:
Tried changing permissions and chowning, and discovered my PHP runs under the user apache, which now owns and has 755 permissions on my entire web folder, and now it recognizes /usr/bin/node... but it has no permissions. I have tried adding permissions in every possible way; I've tried giving it 777, although I knew I would not leave it at that, and it still has no permission to access it. I am clueless, and I guess this has become a Linux issue rather than a PHP issue... lol. Also changing the title of the post, as it is no longer a shell_exec issue; it is an issue with permissions.
Edit 3: Something is royally wrong; it now gives 2 error messages. One is "No such file or directory" and the other is "Permission denied". It gives both, leading me to believe it is running as 2 separate users/in 2 separate locations. I don't know what I should do, so I am going to reset my server, as I have not done much, and start again. Saving the code first to see if that clears it up for me.
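For debugging commands like this, it can help to use absolute paths, escape the user-supplied key, and redirect stderr to stdout so error messages actually reach PHP. A sketch, assuming the /usr/bin/node and /var/www/html/totp.js paths mentioned in the edits above:

```php
<?php
// Build the node command with absolute paths; escapeshellarg() guards
// against shell metacharacters in the user-supplied key, and 2>&1
// redirects stderr into the string shell_exec() returns.
function buildTotpCommand(string $key): string
{
    return '/usr/bin/node /var/www/html/totp.js '
        . escapeshellarg($key) . ' 2>&1';
}

$key = $_GET['key'] ?? '';
$output = shell_exec(buildTotpCommand($key));
echo "Your output is = $output";
```

With 2>&1 in place, "No such file or directory" and "Permission denied" messages show up in $output instead of vanishing into the server's error log.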
I've gone through related posts and found that turning on allow_url_include does the trick. However, when I did this:
(the remote file file.php is at http://www.courierscripts.com)
$content = file_get_contents('http://www.courierscripts.com/folder/file.php');
in my functions.php, I was not able to use the functions from file.php. I also don't want to rename file.php to file.txt, because then everyone could see its source.
Any other way?
If the file is on the same server, use an absolute or relative path to it, not a URL. Otherwise:
Short answer:
No, it's not possible.
Long answer:
It's actually possible under certain conditions, but I bet you won't like them.
It's obviously impossible if you don't have access to the target server (otherwise storing passwords in PHP config files, as WordPress does, would be one big security flaw).
First of all, file_get_contents returns a string. So you could eval() it, but eval is very bad (you can search SO for the reasons why).
OK, suppose you agree to eval whatever comes from that server, even after considering that someone might change the code and do whatever they want on your machine. BUT you make an HTTP request that is handled by a web server (Apache, Nginx or whatever else).
The server knows that *.php files should not be served as static files; they are passed to a handler, for example FastCGI. You can turn that off, for example with RemoveHandler in Apache. But that would let everyone see the source code of the files you expose this way.
So, after removing the handler and eval'ing the result, you could get what you want. But be ready for someone you work with to punch you in the face for doing that ;)
UPD
For code sharing, use Composer instead: create a package and use it as a dependency.
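As a sketch, the consuming project's composer.json would then just declare the shared code as a dependency (the package name acme/shared-functions here is hypothetical):

```json
{
    "require": {
        "acme/shared-functions": "^1.0"
    }
}
```

After `composer install`, a single `require 'vendor/autoload.php';` makes the shared functions available, with no source code ever exposed over HTTP.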
I tried to check for the existence of a file:
$center='412';
$dl ='download/D'.$center.'.zip';
if (file_exists($dl)) {
If I execute this script on localhost, it runs properly and finds the file, but if I execute it on the web server, it goes wrong: the file is not found. So I changed the script as follows:
$center='412';
$dl ='/home/a1527507/public_html/sm/download/D'.$center.'.zip';
if (file_exists($dl)) {
a1527507 is my user ID. It still does not run properly. Is there anything wrong with my script? Thank you for your help.
There are two potential issues here. The first script uses a relative path: it assumes that download/D... is in the same working directory as the executing script. That may not be the case on your server. The second issue is that you are taking
$dl ='http://www.comze.com/sm/download/D'.$center.'.zip';
and checking for its existence on the server, which will most certainly fail, as it's a URL.
I would recommend using an absolute directory so you are sure you're checking the correct location for the file ("/usr/local/nginx/html/downloads..."). That will likely resolve this problem.
I strongly suggest you use the first script listed.
The issue here may also be a lack of directory permissions, so you may need to use your FTP client to reconfigure the permissions for the files in the folder you are searching through. Try 7 for user, 5 for group and 5 for world. Read more here: https://codex.wordpress.org/Changing_File_Permissions
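A sketch of the absolute-path version using __DIR__, assuming the download folder sits next to the executing script (as in the first script above):

```php
<?php
// Build the archive path from this script's own directory so it
// resolves the same way on localhost and on the web server.
function downloadPath(string $center): string
{
    return __DIR__ . '/download/D' . $center . '.zip';
}

$dl = downloadPath('412');
if (file_exists($dl)) {
    echo 'found ' . $dl;
}
```

Unlike the hard-coded /home/a1527507/... version, this keeps working if the account is moved to a different home directory.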
So I need a bit of help with ssh/command lines being run from PHP.
Out of the gate: yes, exec is enabled on the server; it's our own dedicated box.
The setup: I have created a script that uses cPanel's API to create subdomains
for a particular account. That's all fine and dandy, and everything works as expected.
Now - we have a wildcard SSL installed as well.
The long and short of it is this: under the present conditions, if you pulled up
https://user.domain.com you would only see the top root directory for the account,
NOT the directory it should be in. But if you use it without the https, it does what
it should. It's a dedicated box with HostGator, and they installed a script
for me to do what needs to be done, and that works as expected.
The script HostGator installed is located at:
/var/cpanel/userdata/account/mdssl
So the line they gave me to run the script and do what it should do is as follows:
./mdssl clone domain.com user.domain.com
So my question is: since I have never really worked with shell stuff
inside of PHP, would the following line be a workable example of how to execute the script?
shell_exec('/var/cpanel/userdata/account/mdssl clone domain.com user.domain.com');
Or do I just need the exec command? And should the opening line be any different?
I just need to call that line from PHP, pass the 'user' field to it, and have the
script do its thing to clone the SSL setup so that the wildcard features work
properly and the user's account lands in the proper directory.
Any insight is appreciated! :)
Thank you very much.
I've only used exec() or system().
They're always tricky and tend not to behave as expected.
I would start off with
ini_set("display_errors", 1);
exec('/var/cpanel/userdata/account/mdssl clone domain.com user.domain.com', $output);
var_dump($output);
See what the output says or if a PHP error is occurring.
You may also need to step out of whatever default working directory you're in with ../, and make sure the apache user (or whatever user your system runs PHP as) has access to mdssl. You may need to sudo it.
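To pass the 'user' field into the script, a sketch along these lines should work; the mdssl path and domain are the ones from the question, and escapeshellarg() guards against shell metacharacters in the user value:

```php
<?php
// Build the mdssl clone command for a given subdomain user.
// 2>&1 folds stderr into $output so errors are visible from PHP.
function mdsslCloneCommand(string $user): string
{
    $sub = escapeshellarg($user . '.domain.com');
    return '/var/cpanel/userdata/account/mdssl clone domain.com ' . $sub . ' 2>&1';
}

exec(mdsslCloneCommand('user'), $output, $status);
var_dump($output, $status);
```

Checking $status (0 on success) tells you whether the clone actually ran, rather than silently discarding failures.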
I'm working on a program right now that calls the script mail.php located in /var/www/vhosts/company/httpdocs. mail.php is trying to execute require_once dirname(__FILE__).'/../pear/Mail.php' to do an SMTP send, and the require_once is failing. My PEAR directory is located in /var/www/vhosts/company/pear. I then tried to add /var/www/vhosts/company/pear to the include_path, but require_once is still failing.
I decided to take a step back and replace mail.php with a simple script that does file_exists(dirname(__FILE__).'/../pear/Mail.php') and prints the result to a logfile. When I run the script independently, it works fine and returns 1. When the Flash program runs it, it returns nothing. Printing out dirname(__FILE__).'/../pear/Mail.php' returns the same result regardless of whether I run the script independently or the Flash file runs it. I've also tried chmod 777 on the Mail.php PEAR file, but that didn't do anything.
Any ideas on what's going on?
I would bet anything it has to do with two things:
1) Flash/Actionscript normally does not access local file paths.
In other words, it probably isn't even executing the file.
As a compiled client-side module, it needs an actual web-accessible URL. Part of the problem here is the design itself. Try it with an HTTP request from within the ActionScript and you will get better results. If you don't have access to the Flash file... well, tough beans there.
Now, if you are running a mail routine through ActionScript, I would say that is a security risk. You are better off having the ActionScript pass the request to an AJAX receiver routine that checks session credentials and then sends the mail.
2) CWDUP restrictions on the server.
Depending on the server configuration, executables normally do not have access to file paths outside their own root (i.e. an executable cannot call ../another_directory/other_file). Some servers allow this, but many won't.
You may want to make sure your PEAR directory is in your php.ini include_path. That way you don't need any ../ segments in the path at all; PHP will find it in the include directory (which is how PEAR modules normally work).
So rather than using a bunch of dot-dots, try working down from the top:
$mailpath = $_SERVER['DOCUMENT_ROOT'].'/include/mail.php';
As a last resort, you can try copying the Mail.php routine into the same directory and see if that works. If that still fails, then it's your include path to PEAR (as mail.php is probably calling PEAR functions).
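A sketch of the include-path fix at runtime, using the /var/www/vhosts/company/pear directory from the question, so the PEAR module can be loaded by its short name:

```php
<?php
// Prepend the PEAR directory to the include path, then load Mail.php
// the way PEAR expects (by short name, resolved via the include path).
set_include_path(
    '/var/www/vhosts/company/pear' . PATH_SEPARATOR . get_include_path()
);

if (stream_resolve_include_path('Mail.php') !== false) {
    require_once 'Mail.php';
} else {
    error_log('Mail.php not found on include_path: ' . get_include_path());
}
```

Setting this in code (rather than only in php.ini) rules out the case where the webserver and the CLI are reading different php.ini files, which would explain the script working independently but not when invoked by the Flash program.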