exec zip root path - php

I'm having some problems zipping a directory.
The following line will do the trick but it also includes the root directory.
exec('zip -r '.$tmp_zip.' '.$filename_no_ext.'/rss-ticker/*');
I only want to zip everything inside the rss-ticker directory, without the root directory being included in the archive.
How can I fix this?
Thanks for your help

What zip program are you using? tar works in the way you want above, and you can gzip the result afterwards.
Alternatively, chdir() to the directory you want to zip, and specify your path as * - that should only get the files in the current working directory.
If you can't get it to work in the way you want to (or even if you can) try the ZIP Extension or this 3rd party library - doing it in pure PHP will make your code more portable.
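For instance, a minimal sketch of the chdir() approach might look like this (it assumes $tmp_zip holds an absolute path for the archive and that the zip binary is available to the web server user):
// Remember the current directory, step into the folder whose contents we want,
// zip everything in it, then step back. The shell expands * to the directory
// contents (note that hidden dot-files are skipped by the glob).
$old_cwd = getcwd();
chdir($filename_no_ext . '/rss-ticker');
exec('zip -r ' . escapeshellarg($tmp_zip) . ' *');
chdir($old_cwd);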

Unless you have an overwhelming reason not to (like a desire to cause yourself much pain and headache), you really should be using the ZipArchive toolkit. There is an example of how to do it at this question.
As to getting zip to work with exec, I noticed two points:
you will definitely need the $filename_no_ext variable in there. If you just have '/rss-ticker/*', the leading slash means zip will assume the folder is at the root of the file system, and I don't think you want that.
You do not need the trailing * to get the command line zip function to work.
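Putting those two points together, the call might look roughly like this sketch (it assumes $filename_no_ext is the directory that contains rss-ticker):
// Keep $filename_no_ext and drop the trailing /* - zip -r recurses on its own.
// The archive will still contain the path and folder names as given; cd into the
// folder first (or use ZipArchive) if you want only its contents at the top level.
exec('zip -r ' . escapeshellarg($tmp_zip) . ' ' . escapeshellarg($filename_no_ext . '/rss-ticker'));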

PHP file_exists returns false but the file does exist

I have seen several similar questions, but no answer worked in my situation, except that it probably has something to do with permissions.
A PHP script served by Apache tells me "unable to open database file".
When I print the path to that file, it returns a valid path, say DBPATH. The file does exist at that location; I gave it and its parent folder 777 rights; I gave them user:user access, where user is the sudoer that all script files belong to. I did the same to the whole htdocs/ folder, just in case.
When I print file_exists(DBPATH), it returns false. It is most likely a matter of permissions, but I don't know what I should change for PHP to have access rights. I tried apache:apache, too. I cannot su apache (user not available).
My scripts are in htdocs/. DBFILE is somewhere outside of it (I tried /tmp/test, all in 777, but no luck either).
No safe_mode, PHP 5.4 freshly installed, CentOS 7.
Please give me at least a clue to help debug this.
For example: how can I check whether my file will be readable by Apache / my PHP script without running the script itself? How can I get the name of the user it is executed as?
Solved, more or less.
To debug, I had the idea to move DBFILE into the same folder where the PHP script lives and check that it could be found there - it could. Then I moved DBFILE one folder at a time along the tree to see where it stopped being found.
It turns out that if even one of the folders in the whole path lacks execute rights for all users (xx5), the file cannot be reached and file_exists() returns false.
So the solution was to create another folder in a totally executable place (/var/www/data/ worked after chmod 755 data), and move the file there.
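For anyone hitting the same thing, a rough debug sketch along these lines shows which user PHP runs as and which parent directory is missing the execute (search) bit; the path below is just an example, and the POSIX extension is assumed to be available:
// Hypothetical debug snippet: print the effective user and check every parent
// directory of the database file for the execute permission.
$dbpath = '/var/www/data/test.db'; // example; use your DBPATH here
if (function_exists('posix_geteuid')) {
    $info = posix_getpwuid(posix_geteuid());
    echo 'PHP runs as: ' . $info['name'] . "\n";
}
$dir = dirname($dbpath);
while ($dir !== '/' && $dir !== '.') {
    echo $dir . ' executable: ' . (is_executable($dir) ? 'yes' : 'NO') . "\n";
    $dir = dirname($dir);
}
echo 'file_exists: ' . var_export(file_exists($dbpath), true) . "\n";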
Do you use an absolute path or a relative path?
file_exists() doesn't work with HTTP addresses (URLs); it only checks the local file system, so use a filesystem path instead - a relative path works fine.
I had the same problem and this fixed it. It was the same problem with unlink().
Example:
$file_relative_path = "./wp-content/uploads/fileDirectory/fileName.jpg";
if (file_exists($file_relative_path)) {
    unlink($file_relative_path);
}
I had a similar problem and was able to solve it thanks to JulienD's answer:
If the execute flag of a directory in the (Linux) file system is not set, PHP will still list that directory with glob() or scandir(). However, a subsequent file_exists() check on the entries in that result list (I wanted to find broken symbolic links) returned false!
So the solution was to set the execute right on the directory, as mentioned by JulienD.

Automatically compress folders

I am creating an automatic backup system. I plan on running a cron job that will, once a week, automatically create a backup and email it to a secure address. (This way, even if the server explodes into a million pieces, I will have a full recent backup of my files that any administrator can access.)
I found this simple method: system('tar -cvzpf backup.tar.gz /path/to/folder');
It works nearly perfectly for what I need. Only problem is there is one directory that I do not want included in the backup. On this website, users upload their own avatars and the directory in which the images are held is inside the directory I want backed up. Because I'm sending this via email I have to keep the folder relatively small, and a couple thousand images add up. Is there any way I could tell the function to ignore this directory and just compress everything else?
find /path/to/folder -not -ipath '*baddirectory*' -print | xargs tar -cvzpf backup.tar.gz
though you might consider passing PHP the full path to all the binaries you use (in the above command: find, xargs, and tar).
From the tar man page:
tar --exclude='/path/to/folder/bad'
So you would get:
system("tar -czpf backup.tar.gz --exclude='/path/to/folder/bad' /path/to/folder");
You can leave the v (verbose) flag out, since you are not watching the output.
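In PHP that might look something like this sketch (the paths are placeholders; escapeshellarg() keeps the quoting sane, and the exclude option goes after the archive name so it is not swallowed by -f):
// Build the tar command in PHP, excluding the avatar directory from the backup.
$archive = '/tmp/backup.tar.gz'; // placeholder paths
$source  = '/path/to/folder';
$exclude = '/path/to/folder/bad';
$cmd = sprintf('tar -czpf %s --exclude=%s %s',
    escapeshellarg($archive), escapeshellarg($exclude), escapeshellarg($source));
system($cmd);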
You can exclude something from being included with the --exclude-tag-all=PATTERN long option, as described in the manual.
Unfortunately I did not find a good example of the pattern.
I guess the following exclude will work:
--exclude-tag-all=/path/foldernotinclude/.
since it should match the "directory" file tag.
With luck another user will comment on the right pattern to use.

Browsing the files on a webserver with php

Is there a way to browse and get the paths to files that are on a web server using PHP? Possibly a jQuery plugin to accomplish this? Or how else can I do this using only PHP?
Edit: I've found some scripts that run standalone on the web server and let you browse the file structure, but I need to know the method they use so that I can implement it in my own script.
Thanks a lot for your suggestions....
To explore the server's filesystem you'll need the directory functions:
chdir — Change directory
chroot — Change the root directory
dir — Return an instance of the Directory class
closedir — Close directory handle
getcwd — Gets the current working directory
opendir — Open directory handle
readdir — Read entry from directory handle
rewinddir — Rewind directory handle
scandir — List files and directories inside the specified path
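As a minimal illustration of those functions, listing the entries of one directory with scandir() could look like this (the path is just an example, and the web server user must be able to read it):
// List the files and subdirectories directly inside one directory.
$dir = '/var/www/html/uploads'; // example path
foreach (scandir($dir) as $entry) {
    if ($entry === '.' || $entry === '..') {
        continue; // skip the dot entries
    }
    $path = $dir . '/' . $entry;
    echo (is_dir($path) ? '[dir]  ' : '[file] ') . $path . "\n";
}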
But if you're looking for some code to read through that does what you seem to want, check out filebrowser. It's no longer in active development, but if I remember correctly it's pretty simple code at the core.
I suggest that you have a look at the code for CKEditor's file manager.
Here's the PHP filesystem reference

Zipping files through shell_exec in PHP: problem with paths

I'm working on a web backup service part of which allows the user to download a zip file containing the files they have selected from a list of what they have backed up.
The zipping process is done via a shell_exec command. For this to work I have given the apache user no-password sudo privileges to the zip command.
My problem is that, since I can't do "sudo cd /path/to/user/files" into the folder containing the user's backed-up files, I need to specify the full absolute path to the files to be zipped, and those paths then end up inside the zip file, which I don't want.
Here's what I mean:
The user, 'test1' has their files backed up which are stored as follows
/home/backups/test1/my pics/pic1.jpg
/home/backups/test1/my pics/pic2.jpg
/home/backups/test1/my pics/pic3.jpg
/home/backups/test1/hello.txt
/home/backups/test1/music/band/song.mp3
When these files are zipped up they keep the full paths but I just want it to show:
my pics/pic1.jpg
my pics/pic2.jpg
my pics/pic3.jpg
hello.txt
music/band/song.mp3
Is there some way to tell zip to truncate the paths, or specify a custom path for each file?
Or is there some way I can change the working directory to the user's root backup folder so I can simply specify relative file paths.
Thanks for reading, hope you got this far and it all makes sense!
I have to slightly question why you are making a potentially dangerous system shell call when there are a number of good PHP zipping classes around. The tutorial here shows how to easily create a class that will output a zip file to the browser. There are also a number of classes on phpclasses.org; this seems to be the best one.
If you have to do it with a system call my suggestions are:
To truncate the path for the file you could use symbolic links.
Could you not increase the permissions on the zip executable so that apache can use it without sudo?
Have you tried using chdir() to change the working directory?
chdir('/home/backups/test1/');
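If you go the ZipArchive route instead, the second argument of addFile() lets you choose the name stored in the archive, which sidesteps the absolute-path problem entirely. A rough sketch (paths are illustrative):
// Store every file under a name relative to the user's backup folder.
$base = '/home/backups/test1';
$zip = new ZipArchive();
$zip->open('/tmp/test1.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($base, FilesystemIterator::SKIP_DOTS));
foreach ($it as $file) {
    if ($file->isFile()) {
        // second argument = path recorded inside the zip (e.g. "my pics/pic1.jpg")
        $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($base) + 1));
    }
}
$zip->close();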
A better idea may be to make a shell script and grant the web server sudo access to that shell script. This may be more secure, too.
#!/bin/bash
#
# Zip up files for a given user
#
# usage: backupToZipFile.sh <username>
cd /home/backups/"$1" || exit 1
tar -czf "/tmp/$1.tgz" .
Also have you considered running sudo as the user you're trying to zip files as, instead of as root? This would be more secure as well.
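Called from PHP, that could look roughly like this (the script location and the matching no-password sudoers entry for the apache user are assumptions on my part):
// Invoke the backup script via sudo; the archive ends up in /tmp/<username>.tgz
// with paths stored relative to the user's backup folder.
$username = 'test1'; // example user
shell_exec('sudo /usr/local/bin/backupToZipFile.sh ' . escapeshellarg($username));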
It seems like a simple option, but according to the man page it's just not there. You could of course symlink the stuff you want to archive in the location where you'll create the zip file (I assume you can actually cd to that directory). With the proper options, zip will not archive the symlinks themselves but the symlinked files instead, using the names of the symlinks.
E.g. create symlinks /tmp/$PID/my pics/pic1.jpg etc, and then zip everything in /tmp/$PID.
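A sketch of that idea (the temp layout mirrors the example above; by default zip archives the targets of the links rather than the links themselves, unless you pass -y/--symlinks):
// Symlink the selected files into a scratch directory, then zip from there so
// only the relative names are recorded in the archive.
$tmp = '/tmp/' . getmypid();
mkdir($tmp . '/my pics', 0700, true);
symlink('/home/backups/test1/my pics/pic1.jpg', $tmp . '/my pics/pic1.jpg');
symlink('/home/backups/test1/hello.txt', $tmp . '/hello.txt');
// ...one symlink per file the user selected...
shell_exec('cd ' . escapeshellarg($tmp) . ' && zip -r /tmp/backup.zip .');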

Why do I have to copy the libmysql.dll to the apache/bin directory to get the PHP extension to load properly?

I'm on a Windows machine. This seems like it should be unnecessary, but when I do it, everything suddenly works. Is there something wrong with my path? Do I need to add something to it to avoid having to copy DLLs?
Apache, like any application, will assume that the file is located in the current working directory (check out http://en.wikipedia.org/wiki/Working_directory). The current working directory is USUALLY the same directory that httpd.exe (the main executable) is in, but it can actually be different if you do something like
C:\Apache2>bin\httpd.exe
In this case the current working directory is C:\Apache2 rather than C:\Apache2\bin.
If the file isn't found there, the application will naturally traverse the PATH environment variable (a semicolon-separated list of paths) to find the file.
Start -> Run -> Type "cmd.exe" and then in the Command Prompt type "echo %PATH%" to see the current path you have.
Finally, if the file wasn't found it will just error out.
As a tip you can actually track what files an application is trying to load and where they load them from by using Process Monitor. http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
I've used this tool to solve load DLL problems in Apache before and other applications as well. Just simply add a filter for the app you are running and have it only sniff out file reads.
I do not know the internals of MySQL and Apache.
My thought is this: internally your application is using libmysql.dll, and it seems the configured path is not right, so it searches the PATH environment variable. apache/bin is in PATH, so it takes the DLL from that path. If the DLL is not present there, I think it fails to load and hence the extension fails.
EDIT: added the solutions that were posted in comments
Try rebooting your machine. I had the same issue with the mysqlpp library. PATH was pointing to the mysql bin dir but it still couldn't find libmysql.dll - Daniel (Jan 26 at 6:55)
Apache might be running with credentials different from your own (almost certainly so if you're running it as a service.) Try placing the dirs in the SYSTEM path, not the USER path. – moocha (18 hours ago)
