Access log won't log after replacing with a new log file - php

I'm trying to parse the access log files on my Nginx server.
To parse a file, I simply rename the original access log and immediately create a new access log file so I don't miss anything.
But after replacing the file, Nginx won't log anything to the new file, even though it logs fine right up until I replace it.
Nginx only starts logging to the new file again after I restart Nginx.
I can't see what I'm doing wrong. Any help?
Here is the first bit of the PHP code:
if (rename("access.log", $tempname)) {  // rename the original access log file
    $fp = fopen("access.log", "wb");    // create a new, empty access log
    if ($fp === false) {
        // could not create the new log file
    } else {
        fwrite($fp, $content);
        fclose($fp);
    }
    // parse the renamed file here
}

As I said in my comments, it is probably not possible to remove the file because of the way nginx keeps its log file handle open. My suggestion would be to use the same approach, but without actually removing or renaming the log file. Instead, just copy it and then clear it in place.
Pseudo Code
file = open "nginx.log", READ
new_file = open tmpname, WRITE
new_file.write file.contents
file.close
new_file.close
sys_command "cat /dev/null > nginx.log"
Or using a script
#!/bin/bash
cp nginx.log nginx.backup.log
cat /dev/null > nginx.log
This way you are not destroying the file, and the file handle that nginx holds remains valid.
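In PHP, a minimal sketch of the same copy-then-truncate idea (the file names below are just placeholders for your setup) could look like this:
$logfile  = "access.log";
$tempname = "access." . date("YmdHis") . ".log"; // placeholder temp name

if (copy($logfile, $tempname)) {     // keep a copy to parse later
    $fp = fopen($logfile, "r+");
    if ($fp !== false) {
        ftruncate($fp, 0);           // empty the log in place; nginx keeps its handle
        fclose($fp);
    }
    // parse $tempname here
}
Note that any lines nginx writes between the copy and the truncate are lost, so keep the two steps as close together as possible.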

Related

unable to open file with php on ubuntu machine

I am trying to create a new file on an Ubuntu system using a PHP script,
but when I run the script I get the error
Unable to open file
even though I am sure that the file's path is right and that I have permission to access it. I don't know where the mistake is.
This is my code:
$myfile = fopen('inc/users/future.php', "w") or die("Unable to open file!");
$text = '<?
$host ="' . $host . '";
$user ="' . $db_admin . '";
$pass ="' . $db_password . '";
$db ="' . $database . '";
$myconn;?>';
fwrite($myfile, $text);
fclose($myfile);
The path of this script is
/var/www/html/ghost/index.php
and the path of the file I wish to open is
/var/www/html/ghost/inc/users/future.php
On the other hand, when I run this script on a Windows machine everything works fine.
In your script, use
fopen(dirname(__FILE__) . '/inc/users/future.php', 'w')
This builds the path from the directory your index.php is in. If your script is included from another file, PHP might otherwise resolve the relative path from that file's working directory.
Also check whether the PHP process has sufficient file permissions to read and open the file. Try setting the file to chmod 777 just to test whether that is the issue (do not leave it on 777, though).
Keep in mind that if the file is a symbolic link, the 'w' parameter of fopen will not work.
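As a quick check of both the path and the permissions, a small diagnostic sketch (the path mirrors the question; this is not part of the original answer) could be:
$target = dirname(__FILE__) . '/inc/users/future.php';

if (!is_dir(dirname($target))) {
    die("Directory does not exist: " . dirname($target));
}
if (!is_writable(dirname($target))) {
    die("Directory is not writable by the PHP process");
}
$myfile = fopen($target, "w") or die("Unable to open file!");
fwrite($myfile, "<?php // generated file\n");
fclose($myfile);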

file_get_contents failure on windows

I have a PHP CLI application that creates a file with file_put_contents and then listens for changes to that file. If the filemtime changes, I try to get the content with file_get_contents. It often fails to retrieve the contents on Windows. This baffles me. How is it possible that the process that created the file cannot open it?
I even ran icacls on the folder that the file is in, and the process still fails to read the file that it created.
icacls.exe 'MYFOLDER' /grant MYUSER:(OI)(CI)F /T
Can someone please enlighten me on how to ensure a PHP process can read a file it created?
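For context, a minimal sketch of the create/watch/read pattern described above (not the asker's actual code; the file name is hypothetical). Note that clearstatcache() is needed for filemtime() polling to see changes at all:
$path = __DIR__ . '/watched.txt';        // hypothetical file name
file_put_contents($path, "initial\n");

$last = filemtime($path);
while (true) {
    clearstatcache(true, $path);         // filemtime() results are cached otherwise
    $mtime = filemtime($path);
    if ($mtime !== false && $mtime !== $last) {
        $last = $mtime;
        $data = @file_get_contents($path);  // may fail while another process holds the file
        if ($data !== false) {
            echo "changed: " . strlen($data) . " bytes\n";
        }
    }
    usleep(200000);                      // poll every 200 ms
}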

Trying to write file, file not being written

I am trying to write the contents of a variable to a .php file so that I can access that file later, through require_once, and use that variable.
The directory that I am trying to write to has 755 permissions, and I talked to my hosting provider, who confirmed that the directory is now writable by me.
But the file is not being written. Here is the code that I am trying to write the file with:
error_reporting(-1);
$elasticaObject = var_export($elasticaObject, true);
file_put_contents('theIndex.php', $elasticaObject);
echo "done!";
I get an output of done!done!done!
but the file is not being written: I get a 404 error when I try to visit it, and it does not appear in the directory listing.
You must make sure the user executing the script has write permission on the target file and the directory it is written to.
This is a good guide:
http://www.phpjunkyard.com/tutorials/ftp-chmod-tutorial.php
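To narrow down what is happening, one debugging approach (a sketch, not part of the original answer; it just reuses the file name from the question) is to check where the relative path resolves and whether the write actually succeeds:
error_reporting(-1);
ini_set('display_errors', '1');

echo "cwd: " . getcwd() . "\n";          // 'theIndex.php' is written relative to this
echo "dir writable: " . var_export(is_writable(getcwd()), true) . "\n";

$bytes = file_put_contents('theIndex.php', "<?php\n// test\n");
if ($bytes === false) {
    echo "write failed\n";
} else {
    echo "wrote $bytes bytes to " . realpath('theIndex.php') . "\n";
}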

If while sending a file with "readfile" I modify the file, will the download fail?

If I send a big (200 MB) file to the browser with readfile(), and while the user is downloading it I modify the file, will the download finish successfully? If not, how do I modify large files that are constantly being downloaded?
No, it will not finish successfully if you edit the file data directly. A corrupted file would be received.
However...
In Linux, if you rm or mv a file, any program that already has that file open will continue to access it. It is only once the file is closed that the file is completely released. Therefore, you could safely edit your large file with these steps:
Copy the file away to a temporary name.
Edit the copy.
rm file_being_downloaded; mv new_file file_being_downloaded;
I have not tested this, but that should allow everyone downloading the 200 MB file to receive their copy completely intact, and new downloaders will get your updated version.
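In PHP the same swap could be sketched like this (the paths are placeholders); rename() over an existing file on the same filesystem replaces it atomically, so downloads already in progress keep reading the old inode:
$live = '/path/to/file_being_downloaded';   // placeholder paths
$work = $live . '.new';

copy($live, $work);        // 1. copy the file away to a temporary name
// 2. edit $work here
rename($work, $live);      // 3. replace the live file; open handles still see the old copy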
I can't test this, but most probably the download will get corrupted and readfile() will return false and throw an error. Another option, if readfile() locks the file, is that you won't be able to write to the file until it has finished reading it. I don't think this is the case, though.
If I were you I would duplicate the file, serve the copy and then delete it. You might also want to load the file contents into memory and serve that, but for a 200 MB file this would be impractical...
$file = '/path/to/file.200mb';
$temp = tempnam(sys_get_temp_dir(), time() . '_');
copy($file, $temp);
readfile($temp);
unlink($temp);
I would say it depends on how the download is being handled.
If the file contents are buffered in a separate memory location on the server and you modify the actual file, then, since the user is downloading from that buffered copy, the change shouldn't affect the download.
Your best bet is to put up a new file and then update the code to point at the new file. That way, users who are downloading the old file will continue to do so, but you will be able to transition to the new file.
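A rough sketch of that version-switch idea (the file names are hypothetical); as long as the old file stays on disk, downloads already in flight keep streaming it:
$current = 'downloads/archive-v2.zip';   // update this when a new file is published
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($current));
readfile($current);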

file uploaded via php: no such file or directory

I'm working on a website where users can upload images (PHP 4.3.11). The files are uploaded with no problem, as I can see them in the upload directory and I don't get any error message, but when I try to access the uploaded files via FTP, I get an error: no such file or directory. Sometimes I am able to access a file, and sometimes I get this error. What could be the problem here?
[update]
Thanks for the help, guys. I'm not familiar with the FTP daemon stuff, but I do access my files via FTP using FireFTP. The files are there, but when I try to download them or change their properties, I get the said error. I also tried uploading a file to the folder through FTP, and I was able to download it with no problem.
Here is some of the code I'm working on; it's kind of roundabout, but I'll look at how to improve it.
My working directory is something like this: www.domain.com/register/
and the upload directory is here: www.domain.com/register/uploads/
Users are required to register, and upon sign-up a folder is created for them in the uploads directory. I couldn't find a way to create a folder without having to be in the uploads folder itself, so I redirect to a create-user-folder.php file in the uploads dir.
The file just contains this code:
$user_foldername = rawurldecode($_GET['name']);
mkdir($user_foldername);
header("Location: ../form.php"); // redirect back to the page
I checked, and the created folder's permissions are set to 775.
And here's part of the code I use for uploading (/register/function/function.php):
$path = "../uploads/$user_foldername/";
for ($j = 0; $j < $num_of_uploads; $j++) {
    if (is_uploaded_file($_FILES[$file]['tmp_name'][$j])) {
        $filename = $_FILES[$file]['name'][$j];
        copy($_FILES[$file]['tmp_name'][$j], $path . $filename);
    }
}
I checked using FireFTP, and the files are in the /uploads/user_foldername/ directory with permissions set to 664. The strange thing is that when I try to download the files, sometimes there is no problem at all, but other times the error appears.
[another update]
I added chmod() after the copy() call:
$filename = $_FILES[$file]['name'][$j];
copy($_FILES[$file]['tmp_name'][$j], $path . $filename);
chmod($path . $filename, 0755);
But I still get the error.
Another thing: when I access /register/uploads/user_foldername/ through the URL, I can see and view all of the uploaded files, so how is it that I can't access them via FTP?
Thanks again!
This is either a permission issue, or a configuration error. Here are things you should try:
What are the permissions of the uploaded files? Does the FTP user have access to these files? Have you tried logging in as the user the FTP daemon would use and seeing if you can read the file that way?
Do you really see the correct directory? Have you verified by putting a file in that directory yourself and downloading it? Have you used the ftp command ls to verify the presence of the folder/folders/files?
You might need to chmod the folder the files are in, or in some cases the files themselves.
Try chmodding them to 775.
You can chmod files and folders through PHP itself, with the chmod function, or you could use an FTP program such as FileZilla.
Also check that the intermediate directories are permissioned as 755, as every directory in the path needs to be executable to be traversed.
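For example, from the upload script in the question, something like this (purely illustrative) would set the user's folder to 775 from PHP:
chmod('../uploads/' . $user_foldername, 0775);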
I just figured out the problem. It was all because of the file names having accented characters in them, which explains why I didn't always get the error message :|
<sigh> I should have seen this earlier, but anyway I hope this helps in case someone runs into the same problem.
Thanks again! I really appreciate it :)
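For anyone hitting the same thing, one common workaround (just a sketch, not the original poster's code) is to normalize the uploaded file name before copying it into place:
// Replace anything outside a safe character set so the FTP client and
// the filesystem agree on the name; adjust the whitelist as needed.
function sanitize_filename($name) {
    $clean = preg_replace('/[^A-Za-z0-9._-]+/', '_', $name);
    return $clean === '' ? 'file_' . time() : $clean;
}

// Inside the upload loop from the question:
// $filename = sanitize_filename($_FILES[$file]['name'][$j]);
// copy($_FILES[$file]['tmp_name'][$j], $path . $filename);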
