In my WordPress site, I have a script that looks at a directory and uploads each image in that directory to a specific post. Right now, this is set to happen each time the user opens the post edit page. I need a way to check if the contents of the directory have changed. That way, I can set the script to not run if the contents of the directory have not changed. Is there a way to accomplish this in PHP?
I would try to make a hash of the directory contents, for example by concatenating the sorted file names with some divider and taking an md5/sha1 hash of the result. You'll need to store that hash somewhere (as a text file in the directory, or in the database).
On each access, calculate the current hash of the directory the same way and compare it with the old (saved) one. Depending on the result, you can take some action...
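A minimal sketch of that idea, assuming the images live in an arbitrary /path/to/images directory and the previous hash is kept in a hidden file next to the script:

```php
<?php
// Build a fingerprint of the directory from its (sorted) file names.
function directory_fingerprint($dir) {
    $names = scandir($dir);                      // includes '.' and '..'
    $names = array_diff($names, array('.', '..'));
    sort($names);                                // order must be stable between runs
    return md5(implode('|', $names));
}

$hashFile = __DIR__ . '/.dir_hash';              // where the previous hash is stored
$current  = directory_fingerprint('/path/to/images');
$previous = file_exists($hashFile) ? file_get_contents($hashFile) : '';

if ($current !== $previous) {
    // directory contents changed: run the upload script here
    file_put_contents($hashFile, $current);
}
```

Note that hashing only the file names detects added, removed or renamed files; if you also need to catch edited files, include each file's size or mtime in the string you hash.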
You can check the modification date of a file with filemtime:
http://php.net/manual/de/function.filemtime.php
A solution is to store the timestamp of the last visit and look for newer files on the next visit.
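A rough sketch of that approach, assuming the last-run timestamp is kept in a small text file (the paths are placeholders):

```php
<?php
$stampFile = __DIR__ . '/.last_run';
$lastRun   = file_exists($stampFile) ? (int) file_get_contents($stampFile) : 0;

$changed = false;
foreach (glob('/path/to/images/*') as $file) {
    if (filemtime($file) > $lastRun) {   // file added or modified since the last run
        $changed = true;
        break;
    }
}

if ($changed) {
    // process the directory here, then remember when we did it
    file_put_contents($stampFile, (string) time());
}
```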
You can open every file in PHP and generate a checksum with MD5, then save the checksums together with time() to a text file in the directory. Next time you can compare your checksums and you will know if anything has changed since the last run.
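For example, a sketch that keeps per-file MD5 checksums in a hidden text file inside the directory (the path and file names are assumptions):

```php
<?php
$dir       = '/path/to/images';
$indexFile = $dir . '/.checksums';

// previously saved checksums, keyed by file name
$old = file_exists($indexFile) ? json_decode(file_get_contents($indexFile), true) : array();

// current checksums
$new = array();
foreach (glob($dir . '/*') as $file) {
    $new[basename($file)] = md5_file($file);
}

if ($new !== $old) {
    // something was added, removed or modified since the last run
    file_put_contents($indexFile, json_encode($new));
}
```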
I have a PHP file that has a $_FILES variable in it, so I obviously have a temporary copy of the uploaded file saved. The problem is that I want to let the user upload the image, validate that it's okay (both via the 1st PHP file), and then allow the user to enter info about the image. From that info I can obtain the name of the image and use move_uploaded_file() to save it (in the second PHP file). The trouble is that the uploaded temp file is, well, temporary, so I can't use it in my second PHP file. Is there any way to get around that? I could call move_uploaded_file() in the first file, but I'm looking for something easier. I still want it to work like a temp folder in the sense that the file is temporary, but I want to keep it for a couple of minutes after the first PHP file finishes...
Thanks.
The first page - the one that accepts the file - will need to use move_uploaded_file(). You cannot escape this requirement.
However, you can use tempnam() to create a new temporary filename and use move_uploaded_file() to copy the file to that name. Then pass that second filename to the second page so it knows where it is.
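A rough sketch of that flow, assuming the upload field is named image and the final name comes from the second form (both names are placeholders):

```php
<?php
// first.php - accept the upload and park it somewhere that survives this request
session_start();

if (is_uploaded_file($_FILES['image']['tmp_name'])) {
    $holding = tempnam(sys_get_temp_dir(), 'upl');
    move_uploaded_file($_FILES['image']['tmp_name'], $holding);

    // remember where it is so the second page can find it
    $_SESSION['pending_upload'] = $holding;
}
```

```php
<?php
// second.php - once the user has submitted the image info
session_start();

$holding   = $_SESSION['pending_upload'];
$finalName = basename($_POST['image_name']);   // assumed form field

rename($holding, '/path/to/images/' . $finalName);
unset($_SESSION['pending_upload']);
```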
The other alternative is to collapse all the input and processing into one POST, so that all the information is entered at the same time the file is uploaded. This approach has always worked for me.
I have hundreds of mp3 files on my server. Each file's modified date is important because it is fetched by PHP's filemtime to represent its upload date (since there's no way to determine an upload time without storing values in a database).
I have come across an audio issue in which all the files need to be normalized and re-uploaded to the server. This would, of course, change the modified date of each file to "today". I need each file to retain its original modified date.
I'm not sure if this is a software-recommendation question or a programming question, so I apologize if this is the wrong .SE site. Is this even possible?
You should be able to set the modified time with touch: http://php.net/manual/en/function.touch.php
This requires PHP > 5.3 and the user running the script (probably your web user unless you run it from the cli) needs to have write permission on the file.
You have two options for implementation:
Store the filenames and their mtimes in temporary storage (either a file or a database table). When you finish the upload, run through all of the files and use touch to reset the mtime.
As you upload the files, check to see if the file already exists. If it does, grab the mtime in a temporary variable, overwrite the file, then touch it with the correct mtime.
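A minimal sketch of the first option, assuming the mp3s live in one directory and the saved mtimes go into a small JSON file next to them:

```php
<?php
$dir = '/var/www/mp3';   // assumed location of the files

// 1) before overwriting: save filename => mtime
$mtimes = array();
foreach (glob($dir . '/*.mp3') as $file) {
    $mtimes[basename($file)] = filemtime($file);
}
file_put_contents($dir . '/.mtimes', json_encode($mtimes));

// 2) after the normalized files are uploaded: restore the saved mtimes
$mtimes = json_decode(file_get_contents($dir . '/.mtimes'), true);
foreach ($mtimes as $name => $mtime) {
    if (file_exists($dir . '/' . $name)) {
        touch($dir . '/' . $name, $mtime);   // second argument sets the modification time
    }
}
```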
I know this isn't the answer you're looking for, but it would make far more sense to start storing this information in a database than relying on the last-modified date. This way you can show your users the date that they need to know and retain the true date of modification.
An approach like this also gives you much more flexibility.
As requested by #Snailer - for the sake of closing the question.
I am working on an application where the user uploads files into a cart, e.g. the user uploads file "A" and then does other work. After some time he uploads another file, file "B". How can I manage or store the file paths? If I just use the move_uploaded_file() function, it can overwrite another user's file that has the same file name.
Thanks.
When I've had this issue, I have used a timestamp added to the filename. Usually I want to cleanse the filename anyway, so I
replace characters I don't like
remove the file extension and check it looks OK (e.g. pdf not exe)
add a timestamp to the filename
put the extension back on
Obviously, this isn't suitable in every instance, but it might give you some ideas.
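A rough sketch of those steps (the extension whitelist and the field name are just examples):

```php
<?php
function safe_upload_name($originalName) {
    $ext  = strtolower(pathinfo($originalName, PATHINFO_EXTENSION));
    $base = pathinfo($originalName, PATHINFO_FILENAME);

    // replace characters I don't like
    $base = preg_replace('/[^A-Za-z0-9_-]/', '_', $base);

    // check the extension looks OK (e.g. pdf not exe)
    $allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf');
    if (!in_array($ext, $allowed)) {
        return false;
    }

    // add a timestamp, then put the extension back on
    return $base . '_' . time() . '.' . $ext;
}

$name = safe_upload_name($_FILES['upload']['name']);
if ($name !== false) {
    move_uploaded_file($_FILES['upload']['tmp_name'], '/path/to/uploads/' . $name);
}
```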
Create a folder at runtime based on the session id; that way only the current user's files go into that folder.
temp_uploads/ (main uploads folder)
temp_uploads/_jhk43543h5h435k3453 (session id folder for user 1)
temp_uploads/_jhk43543h5h435k34tr (session id folder for user 2)
temp_uploads/_jhk43543h5h43trtrtg (session id folder for user 3)
You just need to store the session id for each user, which you are probably already doing anyway.
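A minimal sketch of that layout, assuming the upload field is named file:

```php
<?php
session_start();

// one subfolder per session under temp_uploads/
$base = __DIR__ . '/temp_uploads/_' . session_id();
if (!is_dir($base)) {
    mkdir($base, 0755, true);   // create the session folder on first upload
}

move_uploaded_file(
    $_FILES['file']['tmp_name'],
    $base . '/' . basename($_FILES['file']['name'])
);
```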
happy Coding :)
You can use PHP's time() function to generate a timestamp that you append to the filename so that the names are different. Then you can use a column in the db to store the file paths. You could store all the file paths in the same column, separating each one with ";" or some other character. To get the separate paths back, you can use PHP's explode() function.
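A small sketch of that idea; the field name and the db read/write are placeholders:

```php
<?php
$existingPaths = '';   // value previously read from the db column, e.g. "a.jpg;b.jpg"

// timestamped name so two users can upload files with the same original name
$name = time() . '_' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], '/path/to/uploads/' . $name);

// append the new path to the ";"-separated list and write it back to the db
$paths = ($existingPaths === '') ? $name : $existingPaths . ';' . $name;

// later, to get the individual paths back:
$list = explode(';', $paths);
```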
I will be sending new files over from one computer to another computer. How do I make PHP auto-detect new/updated files in the folders and enter the information inside the files into a MySQL database?
Get all files you already know from the database.
Loop through the directory with readdir (http://www.php.net/manual/de/function.readdir.php).
If the file is known, do nothing.
If the file is not known, add it to the database.
At the end, delete the database entries for files that are no longer in the directory.
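A rough sketch of those steps, assuming a files table with a filename column and a PDO connection (all names are placeholders):

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

// 1) files already known to the database
$known = $pdo->query('SELECT filename FROM files')->fetchAll(PDO::FETCH_COLUMN);

// 2) loop through the directory
$seen = array();
$dir  = opendir('/path/to/incoming');
while (($entry = readdir($dir)) !== false) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    $seen[] = $entry;
    if (!in_array($entry, $known)) {
        // 3) new file: add it to the database
        $stmt = $pdo->prepare('INSERT INTO files (filename) VALUES (?)');
        $stmt->execute(array($entry));
    }
}
closedir($dir);

// 4) remove records for files no longer in the directory
foreach (array_diff($known, $seen) as $gone) {
    $stmt = $pdo->prepare('DELETE FROM files WHERE filename = ?');
    $stmt->execute(array($gone));
}
```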
I would pick a set-up where new files and old files are kept in separate directories.
But if you have no choice, you could check the modification date and match it against your last directory iteration (use filemtime for this).
Don't forget to do some database checking when you process an image though.
Save the timestamp of the last check, and on the next check look at the file info and compare the dates. Better yet, since you store the file contents in a database, check the time each file was modified using filemtime().
You can't have PHP watch the folder continuously. PHP works as a preprocessor and even has an execution time limit (set in the configuration). If you need to do this with PHP, make a PHP script that outputs a web page which uses a meta redirect to itself. Inside the script, loop over the files and query the database for each file name and its modification time: if both match, there's nothing to do; if the file name exists but the time differs, it's an update; otherwise it's a new file.
Basically I have a simple form which users use for uploading files. Files should be stored under the /files/ directory, with some subdirectories so the files are split almost equally, e.g. /files/sub1/sub2/file1.txt.
I also need to avoid storing duplicate files (by filename).
I have my own solution: calculate the sha1 of the filename, take the first 5 characters - abcde for example - and put the file in /files/a/b/c/d/e/. This works well, but leads to situations where one folder contains 4k files and another 6k. Is there any way to make the file counts closer to each other? The maximum file count can be 10k or 10kk.
Thanks for any help.
P.S. Maybe I explained something wrong, so once again :) The task is simple: you have only HTML and PHP (without any db) and a files directory where you should store only the uploaded files, without any data of your own. You should develop a script that stores uploads in the files directory without storing duplicates (by filename) and splits the uploaded files into subdirectories by file count, keeping the count in each directory as close to the others as possible.
I have no idea why you want it that way. But if you REALLY have to do it this way, I would suggest you set a limit on how many bytes are stored in each folder. Every time you have to save a file you open a small log containing:
the current sub-directory
the total number of bytes written to that directory
If necessary you create a new sub-directory (you could use the current timestamp, because it won't repeat) and reset the byte count.
Then you save the file and increment the byte count by the number of bytes written.
I highly doubt it is worth the work, but I do not really know why you want to distribute the files that way.
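A rough sketch of that approach, assuming a small JSON state file and an arbitrary byte limit per subdirectory (the field name and paths are placeholders):

```php
<?php
$base  = __DIR__ . '/files';
$state = $base . '/.state';          // keeps the current sub-directory and its byte count
$limit = 100 * 1024 * 1024;          // max bytes per subdirectory

$info = file_exists($state)
    ? json_decode(file_get_contents($state), true)
    : array('sub' => (string) time(), 'bytes' => 0);

$size = filesize($_FILES['upload']['tmp_name']);

// start a new subdirectory when the current one is "full"
if ($info['bytes'] + $size > $limit) {
    $info = array('sub' => (string) time(), 'bytes' => 0);
}

$dir = $base . '/' . $info['sub'];
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}

$target = $dir . '/' . basename($_FILES['upload']['name']);
if (!file_exists($target)) {         // skip duplicates by filename within this subdirectory
    move_uploaded_file($_FILES['upload']['tmp_name'], $target);
    $info['bytes'] += $size;
    file_put_contents($state, json_encode($info));
}
```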