I tried to check for the existence of a file:
$center='412';
$dl ='download/D'.$center.'.zip';
if (file_exists($dl)) {
If I run this script on localhost, it works properly and finds the file, but if I run it on the web server, it goes wrong and the file is not found. So I changed the script as follows:
$center='412';
$dl ='/home/a1527507/public_html/sm/download/D'.$center.'.zip';
if (file_exists($dl)) {
a1527507 is my user ID. It still does not run properly. Is there anything wrong with my script? Thank you for your help.
There are 2 potential issues here. The first script uses a relative path. It assumes that 'download/D...' sits under the current working directory of the executing script. That may not be the case on your server. The second issue is that you are taking
$dl ='http://www.comze.com/sm/download/D'.$center.'.zip';
and checking for its existence on the server, which will most certainly fail, because it is a URL, not a filesystem path.
I would recommend that you use an absolute path to make sure you're checking the correct location for the file, e.g. "/usr/local/nginx/html/downloads...". That will likely resolve this problem.
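For example, a minimal sketch that builds an absolute path by anchoring it to the script's own directory (this assumes the download folder sits next to the executing script; if not, substitute the real absolute path):
<?php
// Build an absolute path from the directory of the currently executing script.
// Assumes "download" is a sibling of this file; adjust if your layout differs.
$center = '412';
$dl = __DIR__ . '/download/D' . $center . '.zip';
if (file_exists($dl)) {
    echo 'Found: ' . $dl;
} else {
    echo 'Not found: ' . $dl;
}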
I strongly suggest you use the first script listed.
The issue here may be a lack of directory permissions, so you may need to use your FTP client to reconfigure the permissions for all files in the folder you are searching through. Try 7 for user, 5 for group, and 5 for world. Read more here: https://codex.wordpress.org/Changing_File_Permissions
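If you also have script access, PHP's chmod() can apply the same permissions; note this only works when the PHP user owns the files, which is often not the case on shared hosting, so the FTP route above is usually easier. A rough sketch, using the path from the question:
<?php
// Apply 755 (rwx for user, r-x for group and world), as suggested above,
// to the download directory and to every file inside it.
$dir = '/home/a1527507/public_html/sm/download';
chmod($dir, 0755);
foreach (glob($dir . '/*') as $file) {
    chmod($file, 0755);
}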
I was having problems with my PHP website (SuiteCRM) not being able to log users in, and I found it was due to not being able to write to the sessions directory.
I was able to fix it by creating the directory /tmp/php_sessions and giving the Apache user www-data write permission on it. I see the directory get populated with files as users log in.
However, Ubuntu Xenial clears my /tmp directory on every reboot, so I have to redo this all over again each time. I decided to move my save_path elsewhere.
After changing things in my php.ini file, and restarting Apache, I can check that they are effective by running this simple script:
<?php
echo ini_get("session.save_path");
phpinfo();
?>
This shows me a double confirmation of the new path, first echoing /var/tmp/php_sessions and then, in the middle of all the phpinfo() output, showing the same value as both Local Value and Master Value for the directive session.save_path.
BUT the directory that PHP is actually using is still the first one, /tmp/php_sessions! It seems that my setting is being ignored.
Am I overlooking something? Where could that old setting be buried? Or how can I make the new one effective?
(P.S. - I am not using a redis handler as in another similar SO question)
OK, I solved my own problem, and the general answer is as follows:
There are two more things that can be changing the path and need to be checked:
1) The application's PHP code might be changing the ini directive at runtime; search the code for calls to ini_set('session.save_path', ...).
2) The PHP code might be calling the session_save_path() function to override the ini setting; search the code for that as well (and notice the underscores instead of the dot!).
And the specific answer for my case was that SuiteCRM uses the session_save_path() function to set its path, with a value coming from the file config.php found at the web root. That's where I found the old setting, and changing it solved my problem (for good, I hope).
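For illustration, here is a minimal sketch of the kind of override to look for; this is not SuiteCRM's actual code, and the variable name is made up:
<?php
// Any call like this in application code wins over php.ini,
// provided it runs before session_start().
$sessionDir = '/tmp/php_sessions';   // e.g. a value read from config.php
session_save_path($sessionDir);
session_start();
echo session_save_path();            // reports the overridden path, not the ini value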
In a PHP script I want to check certain file types, which I do like this:
$info = explode('/',system("file -bi -- uploads/img/test.gif"));
if ( $info[0])
echo 'ok';
This works just fine as long as the URL has no subdomain, e.g. domain.com, but it is not working under en.domain.com. It appears that the file is just not found.
On the other hand, if in the same script I check for the existence of the file:
if ( file_exists("uploads/img/test.gif"))
echo 'exists';
The file is found, whether the current URL is domain.com or en.domain.com. Why is the file not found when the system() function is used?
Solved
Friends, I finally figured out the solution after nearly three working days. Maybe you can imagine how relieved I am.
The solution was in the PHP/hosting options: activating mod_rewrite. It was already activated for the regular domain, but not for subdomains. I guess I could have thought of this earlier. Even though your answers didn't lead me to the solution, I still thank you for your input, as it was going in the same direction I had been trying myself over the past days.
The shell command will not necessarily use the same working directory as the PHP code (one will be the user's home directory and the other the web server's, I think). Use absolute path(s) in the shell command and this will probably work the way you want.
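For instance, a sketch of the same check with an absolute, shell-escaped path (assuming the uploads folder sits next to the script):
<?php
// Pass an absolute path to `file` so the result no longer depends on
// whichever working directory the shell happens to be in.
$path = __DIR__ . '/uploads/img/test.gif';
$info = explode('/', system('file -bi -- ' . escapeshellarg($path)));
if ($info[0] === 'image')
    echo 'ok';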
You could work out which directory the script actually runs from under each of the two URLs by using pwd, e.g.
echo system('pwd');
Or you could use finfo_open() instead of the shell command, which should avoid the problem.
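A sketch of the finfo variant, assuming the Fileinfo extension is enabled:
<?php
// Determine the MIME type without shelling out to `file`.
$path  = __DIR__ . '/uploads/img/test.gif';
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $path);   // e.g. "image/gif"
finfo_close($finfo);
$info = explode('/', $mime);
if ($info[0] === 'image')
    echo 'ok';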
I'm trying to get a file's content into a variable, but I don't get anything when I read the file. I used both JFile::read() and file_get_contents(), and both return the same thing: a blank string; not an error, not a boolean value, nothing.
I want to mention that I'm working on a Linux machine (just for 2 days) and recently I changed the permissions for the entire machine to 777 (I don't know if this affects something or not).
Is there a connection between my OS, the permissions, and PHP's file_get_contents()? Or does Joomla restrict file reading?
Also, I want to mention that the file_get_contents() call was added manually by me in the index.php file, and the file I want to read was manually placed in the same folder as index.php.
We had the same problem with one of our clients; it turned out to be a firewall issue. It was very hard to debug. I suggest you check with your networking team.
I am assuming, of course, that you have set the PHP error reporting level to the maximum and that you have checked the error logs.
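For reference, a minimal sketch of turning error reporting all the way up while debugging (development only; the file name is just a placeholder):
<?php
// Surface every warning and notice while investigating the blank result.
error_reporting(E_ALL);
ini_set('display_errors', '1');
$content = file_get_contents(__DIR__ . '/somefile.txt');  // placeholder file name
var_dump($content);  // false on failure, '' only if the file really is empty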
It seems the problem was with my system. I messed it up when I changed the permissions, so I have to re-install it.
Never change the permissions of the entire file system on a Linux machine.
I'm trying to write a PHP script to copy files from the local machine to a server:
$destination_directory = 'I:\path\to\file\\' . $theme_number;
if (!@opendir($destination_directory)) {
    echo 'Sorry, the destination directory could not be found.';
    die();
}
I check the access to the destination folder with that process, and I keep getting the error return. Anyone know what I'm doing wrong? I pretty much have everything else in place. I just don't know how to access this other server.
Addendum: I accepted an answer below because it is technically correct, and I was able to get the Apache server to be accepted by the IIS server. However, for what I was trying to accomplish (giving anyone who used the script unfettered ability to move files to the server), it was infeasible: I would've had to set up specific functionality on each of their computers. It seems the best workaround would be to establish the script on the server you want to copy your files to, and then move them from your local drive to that location by more traditional means. That would require a file server with CGI-exec capabilities, though, which our server did not possess.
I'd guess that you are on Windows and that you have I: mapped to a share such as \\server2\files ...
If so, that's your problem. These mappings are only available to the current user (e.g. the admin account), not to the IUSR account that your PHP is probably running as (assuming IIS). Solution: don't use drive mappings; instead use the full UNC path, i.e. '\\server\share\folder\file.ext'. Also remember that the IUSR account will need access to those shares/folders/files.
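A rough sketch of the UNC variant; the server and share names are the placeholders from this answer, not your real setup (note that backslashes are doubled inside the single-quoted string):
<?php
// Use the full UNC path instead of a mapped drive letter, since drive
// mappings are per-user and invisible to the IIS worker account (IUSR).
$theme_number = '42';  // placeholder value
$destination_directory = '\\\\server2\\files\\path\\to\\file\\' . $theme_number;
if (!@opendir($destination_directory)) {
    echo 'Sorry, the destination directory could not be found.';
    die();
}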
Is this other server accessible via I:\path\to\file\\?
If PHP is reporting an error opening the directory, you might want to make sure it exists and you have access permissions to it.
Also, the double backslash (\\) may be causing problems. Try checking that.
$destination_directory = 'I:/path/to/file/' . $theme_number;
You might also want to look at the FTP functions.
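For completeness, a rough sketch of the FTP route; host, credentials, and paths are all placeholders:
<?php
// Upload a file to the remote server over FTP instead of writing to a share.
$theme_number = '42';                              // placeholder value
$conn = ftp_connect('ftp.example.com');            // placeholder host
if ($conn && ftp_login($conn, 'username', 'password')) {
    ftp_pasv($conn, true);                         // passive mode is often needed behind firewalls
    ftp_put($conn, '/themes/' . $theme_number . '/theme.zip',
            'C:/local/themes/theme.zip', FTP_BINARY);
    ftp_close($conn);
}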
I'm working on a program right now that calls the script mail.php located in /var/www/vhosts/company/httpdocs. mail.php is trying to execute require_once dirname(__FILE__).'/../pear/Mail.php' to do an SMTP send, and the require_once is failing. My PEAR directory is located in /var/www/vhosts/company/pear. I then tried to add /var/www/vhosts/company/pear to the include_path, but require_once is still failing.
I decided to take a step back and replace mail.php with a simple script that does file_exists(dirname(__FILE__).'/../pear/Mail.php') and prints the result to a logfile. When I run the script independently, it works fine and returns 1. When the Flash program runs it, it returns nothing. Printing out dirname(__FILE__).'/../pear/Mail.php' returns the same value regardless of whether I run the script independently or the Flash file runs it. I've also tried chmod 777 on the Mail.php PEAR file, but that didn't do anything.
Any ideas on what's going on?
I would bet anything it has to do with two things:
1) Flash/ActionScript normally does not access local file paths.
In other words, it probably isn't even executing the file.
As a compiled client-side module, it needs an actual web-accessible URL. Part of the problem here is the design itself. Try it with an HTTP request from within the ActionScript and you will have better results. If you don't have access to the Flash file... well, tough beans there.
Now, if you are running a mail routine through ActionScript, I would say that is a security risk. You are better off having the ActionScript pass the request to an AJAX-style receiver script that checks session credentials and then sends the mail.
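To make that concrete, here is a rough sketch of such a receiver script; the file name, session key, recipient, and field names are all made up for illustration:
<?php
// send_mail.php - hypothetical HTTP endpoint the Flash movie would call.
session_start();
// Only allow logged-in users to trigger mail; 'user_id' is an assumed session key.
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not authorized');
}
$to      = 'admin@example.com';  // placeholder recipient
$subject = 'Message from the Flash front end';
$body    = isset($_POST['body']) ? substr($_POST['body'], 0, 2000) : '';
mail($to, $subject, $body);
echo 'sent';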
2) CWDUP restrictions on the server.
Depending on the server configuration, executables normally do not have access to file paths outside their own root (i.e. an executable cannot call ..\another directory\other file). Some servers will allow this, but many won't.
You may want to make sure your PEAR directory is in your php.ini include_path. This way you don't need to climb parent directories in your path at all; PHP will find the file in the include directories (which is normally how PEAR modules work).
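A sketch of what that looks like, using the paths from the question (you can set it in php.ini instead of at runtime):
<?php
// Make the PEAR directory visible on the include path so require_once
// can find Mail.php without any ../ hops.
set_include_path(get_include_path() . PATH_SEPARATOR . '/var/www/vhosts/company/pear');
require_once 'Mail.php';   // resolved via the include path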
So rather than using a bunch of dot-dots, try working down from the top:
$mailpath = $_SERVER['DOCUMENT_ROOT'] . '/include/mail.php';
As a last resort, you can try copying the PEAR Mail.php file into the same directory as your mail.php and see if that works. If that still fails, then it's your include path to PEAR (as mail.php is probably calling PEAR functions).