I'm currently building a web application for managing an association. In that app, users are able to compose emails and send them to different members of the association.
The user can also attach files while writing the email, uploaded via Ajax for a more user-friendly experience. Every time a user wants to upload an image, for instance, an Ajax request is triggered that transfers the file to the server's "temp" folder through a classic file upload form. I then take the file from there using $_FILES and save it in a custom "temp" folder named after a token, so that I can gather all the attachments in one place and re-use them when the user actually sends the email. When the email is sent, the files are moved from the custom "temp" folder to another, immutable location for archiving. Only if the email is sent, that is: if the user quits the page or logs off, PHP deletes the folder and its files.
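For context, my upload handler is essentially this sketch (the field name "attachment" and the folder layout here are simplified):

    <?php
    // Sketch of the Ajax upload handler described above; the field
    // name and folder layout are illustrative.
    session_start();

    // One token-named folder per email being composed.
    if (empty($_SESSION['upload_token'])) {
        $_SESSION['upload_token'] = bin2hex(random_bytes(16));
    }
    $dir = __DIR__ . '/temp/' . $_SESSION['upload_token'];
    if (!is_dir($dir)) {
        mkdir($dir, 0750, true);
    }

    if (isset($_FILES['attachment']) && $_FILES['attachment']['error'] === UPLOAD_ERR_OK) {
        // basename() guards against path tricks in the client-supplied name.
        $name = basename($_FILES['attachment']['name']);
        move_uploaded_file($_FILES['attachment']['tmp_name'], $dir . '/' . $name);
    }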
But sometimes, after creating a new email and uploading some documents, the user will simply move on to another website and never log off or leave the page properly. So, to prevent my server from filling up with ghost temp files, I need a system to delete the leftover files.
I've already thought of a cron task that would run, for instance, every 24 hours and delete every file older than that. But I'd like my solution to be portable and easy to install (PHP only, no particular server setup), so I'd like to know whether I can make PHP automatically run a routine that deletes the files on session timeout or log off.
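The cron variant I have in mind would be something like this sketch (the temp path and the schedule are just examples):

    <?php
    // cleanup.php: deletes token folders untouched for 24 hours;
    // the "temp" path is an assumption.
    // Example crontab entry:  0 * * * * php /var/www/cleanup.php
    $root   = __DIR__ . '/temp';
    $cutoff = time() - 24 * 3600;

    foreach (glob($root . '/*', GLOB_ONLYDIR) as $dir) {
        // The folder's mtime changes whenever files are added or removed.
        if (filemtime($dir) < $cutoff) {
            array_map('unlink', glob($dir . '/*'));
            rmdir($dir);
        }
    }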
I haven't managed to find anything yet, and some help would be appreciated. Is my intended solution actually possible?
I am working on a web application where file uploads and revisions are tracked. Once a project is ready, it is submitted for an approval process. Upon "submit for approval", I want to lock down all of the project's attached files to prevent further changes.
The file uploads are handled by my own simple forms, and the files are tracked in a MySQL db.
Is there any way to set the files as read-only so they cannot be deleted, renamed, moved, etc., but can still be viewed? (I want to prevent changes even via FTP or a cPanel file manager.)
The idea is to protect the integrity of what has been approved. At the very least I will be using .htaccess to prevent the uploads folder from being viewed directly.
Obviously, someone could SSH into the server, sudo su, and do whatever, but I am thinking of the less tech-savvy folks who need a GUI.
You want application-level control, not operating-system-level control. I.e., don't set the files to read-only. Instead, have your application recognize that the project is in "submitted" status, and disable the features that would allow a user to change the files.
If you're worried about somebody changing files with FTP or cPanel outside the project app then you likely have a personnel issue, not a technical one. I'm never a fan of using technology to solve a personnel issue.
That said, to additionally ensure that no OS-level changes have occurred, you can generate a hash of the contents of each component file at the time the project is submitted, and store those hashes in the database along with the project data. This will allow you, at any later date, to determine whether the files have changed since.
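A minimal sketch of that idea; the file_hashes table, its column names and the hard-coded file list are assumptions for illustration:

    <?php
    // Record a content fingerprint for each project file at submit time.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $projectId = 42; // the project being submitted

    // Paths of the project's files, e.g. pulled from your uploads table.
    $projectFiles = ['/var/uploads/42/spec.pdf', '/var/uploads/42/plan.dwg'];

    $stmt = $pdo->prepare(
        'INSERT INTO file_hashes (project_id, path, sha256) VALUES (:p, :path, :h)'
    );
    foreach ($projectFiles as $path) {
        $stmt->execute([
            'p'    => $projectId,
            'path' => $path,
            'h'    => hash_file('sha256', $path), // fingerprint of the contents
        ]);
    }
    // Later: recompute hash_file('sha256', $path) and compare to detect changes.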
The Situation:
I have a page on which a user can enter details and apply for a job (multiple pages, not just one form). During the registration process the user can upload files that are stored in a temporary folder on the server and attached to the application later. During the application process the user can upload additional files, delete the ones they uploaded, etc.
Once the registration is finished successfully, the final files are moved to a user-specific folder whose path I store in my database, so they are attached to the application - everything's fine.
The Problem:
If the registration is not finished successfully (basically it has been cancelled), but files were uploaded, how do I remove those files in a smart way?
When the application has been finished successfully, the active session will be closed. If the application has not been finished, the session will time out and no user will have access to those files anymore.
Thoughts:
Now there are a couple of ideas I can think of, but I am not sure which one is the smartest. The upload is handled via AJAX. I want the files to be uploaded, or at least stored, as soon as they are added to the application, so that they stay attached even if the user moves them around on their hard drive during the process:
1) Clean up after the session has timed out (custom session handler; a sketch follows after this list)
2) Store the files in the browser and only upload them on completion of the application
3) Use a cron job that deletes files older than X days
4) Serialize the files into the session, which will be cleaned up automatically without any modifications.
Ideally I want the temporary files to be gone once the process has finished or has been cancelled.
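To illustrate option 1, here is a rough sketch of a custom session handler; it assumes PHP 8.1+ and one temp folder per session id under uploads_tmp/, both of which are assumptions:

    <?php
    // Option 1 sketch: tie temp-file cleanup to the session lifecycle.
    // Assumes PHP 8.1+ and temp uploads in uploads_tmp/<session_id>/.
    class CleanupSessionHandler extends SessionHandler
    {
        private string $tempRoot = '/var/www/uploads_tmp';

        // Explicit logout: session_destroy() lands here.
        public function destroy(string $id): bool
        {
            $this->removeDir($this->tempRoot . '/' . basename($id));
            return parent::destroy($id);
        }

        // Timed-out sessions are removed in bulk by gc(), which does NOT
        // call destroy() per session, so sweep the temp root here as well.
        public function gc(int $max_lifetime): int|false
        {
            foreach (glob($this->tempRoot . '/*', GLOB_ONLYDIR) as $dir) {
                // The folder's mtime is a rough last-activity marker.
                if (filemtime($dir) < time() - $max_lifetime) {
                    $this->removeDir($dir);
                }
            }
            return parent::gc($max_lifetime);
        }

        private function removeDir(string $dir): void
        {
            if (is_dir($dir)) {
                array_map('unlink', glob($dir . '/*'));
                rmdir($dir);
            }
        }
    }

    session_set_save_handler(new CleanupSessionHandler(), true);
    session_start();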
Store the files with their upload time in a temporary table in the database. On completion, move the files to another (target) table. In a cron/database job, delete records from the temporary table older than 1 day.
You can achieve the same result with files. Make a temp directory, create a subdirectory in it each day, and store that day's files there, e.g.
temp/20150911/some_unique_filename.pdf
In the session, store the full path to the file. On completion, move the files to the target directory. In a cron job, delete directories older than 1 day.
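A sketch of that cron job in PHP; the temp path is an assumption, and the string compare works because the date-named directories sort chronologically:

    <?php
    // Cron sketch for the layout above (temp/YYYYMMDD/...).
    // Run once a day, e.g.:  0 3 * * * php /var/www/cleanup_temp.php
    $tempRoot = __DIR__ . '/temp';
    $cutoff   = date('Ymd', strtotime('-1 day'));

    foreach (glob($tempRoot . '/*', GLOB_ONLYDIR) as $dir) {
        // Directory names are dates, so a plain string compare works.
        if (basename($dir) < $cutoff) {
            array_map('unlink', glob($dir . '/*'));
            rmdir($dir);
        }
    }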
I'm working on a WordPress theme that includes an actual installation script of its own. So this is what happens:
1.) Users download theme.zip from my site.
theme.zip does NOT contain the theme itself; it contains the installation script and all the files required to make the installation successful.
2.) Now the user will upload theme.zip to their server (using the WP dashboard).
3.) Once they've uploaded theme.zip, they will run the installation script, which requires a username and password that are stored in MY sql db.
//the dodgy bit
Now here's what happens in the installation script.
Once the user has entered their username and password, some variables (the user's username, password, and unique ID number) will be sent to a PHP file on my server (using cURL). My server will then look in the SQL db, select a certain row (using the unique ID number sent earlier) and check whether the user's details are correct. If the details are correct, my server will send some variables back (using JSON encode/decode) with a value of TRUE. Once the user's server has received the TRUE value, it will continue. If it receives a value of FALSE, it will stop and throw an error.
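The client side of that handshake looks roughly like this sketch; the endpoint URL, the field names and the {"ok": true} response shape are all made up for illustration:

    <?php
    // Client-side sketch of the verification handshake.
    function verify_licence(string $user, string $pass, string $id): bool
    {
        $ch = curl_init('https://example.com/api/verify.php');
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => http_build_query([
                'username' => $user,
                'password' => $pass,
                'uid'      => $id,
            ]),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 10,
        ]);
        $body = curl_exec($ch);
        curl_close($ch);

        if ($body === false) {
            return false; // network error: treat as not verified
        }
        $data = json_decode($body, true);
        return !empty($data['ok']); // server is assumed to reply {"ok": true}
    }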
Once the user has logged in successfully (my server sends back TRUE), another cURL function will run.
This function will send a unique ID to another PHP file on my server. That PHP file will then make a copy of a folder stored on my server (which contains all the theme's content) and name the copy with the unique ID number, so the duplicate folder will be called "265851654". The PHP script (on my server) will then compress that folder into a .zip. Once the compression is complete, it will send some info (where the newly produced .zip is placed on the server, ready for download) back to the user's server.
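The server-side build step, roughly sketched; instead of physically duplicating the folder first, this zips straight from a master directory, and all paths and field names are illustrative:

    <?php
    // Server-side build sketch: zip theme_master/ into builds/<uid>.zip.
    $uid    = preg_replace('/\D/', '', $_POST['uid'] ?? ''); // digits only
    $source = __DIR__ . '/theme_master';
    $target = __DIR__ . '/builds/' . $uid . '.zip';

    if ($uid !== '') {
        if (!is_dir(__DIR__ . '/builds')) {
            mkdir(__DIR__ . '/builds', 0750, true);
        }
        $zip = new ZipArchive();
        if ($zip->open($target, ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
            $files = new RecursiveIteratorIterator(
                new RecursiveDirectoryIterator($source, FilesystemIterator::SKIP_DOTS)
            );
            foreach ($files as $file) {
                if ($file->isFile()) {
                    // Store entries relative to the master folder.
                    $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($source) + 1));
                }
            }
            $zip->close();
            echo json_encode(['url' => 'https://example.com/builds/' . $uid . '.zip']);
        }
    }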
The user's server will then use the info it received from my server to generate the download link and begin downloading the .zip file. Once the download has finished, another cURL function will be run. This function does the same as the one just explained, but instead of building the .zip ready for download, it deletes it.
Now it does a load of other stuff too, but that's all on the user's side.
Is this safe? This theme will be available to EVERYONE, which means they will be able to see the cURL functions and all the other source code, and edit them as they please.
If it's not safe, could you give me some advice to help prevent those evil people from messing around?
Thanks!
A few things you should keep in mind:
1) Make sure the request within the script calling cURL with certain arguments stops, or is blocked by the server (say, for 1 hour), if identification fails 5 consecutive times.
2) Make sure your cURL script also supports proxy variables like IP, username and password. There are a lot of configurations out there requiring these.
3) Create an md5sum file for each generated download and save it on your disk. Compare the md5sum from the file on disk with the newly created one, and make sure the next user asking for the same file downloads the already-built one instead of triggering a new build, so it won't load the server (a sketch of this follows below).
4) Try to secure your PHP script on the server with 2 distinct identification factors (the way username/password works). This will make it harder for evil people to find a path into your server.
I'm sure there's a lot of other stuff, but that's all that comes to mind right now.
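To illustrate the third point, a sketch of checksum-based caching; the file names are made up and the actual rebuild is left as a comment:

    <?php
    // Rebuild the zip only when the master folder's checksum changes.
    $master  = __DIR__ . '/theme_master';
    $zipFile = __DIR__ . '/theme.zip';
    $sumFile = $zipFile . '.md5';

    // Checksum the folder by hashing the sorted list of per-file md5sums.
    $parts = [];
    $iter  = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($master, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($iter as $file) {
        if ($file->isFile()) {
            $parts[] = $file->getPathname() . ':' . md5_file($file->getPathname());
        }
    }
    sort($parts);
    $current = md5(implode("\n", $parts));

    $stored = is_file($sumFile) ? trim(file_get_contents($sumFile)) : '';
    if ($current !== $stored || !is_file($zipFile)) {
        // ...rebuild $zipFile here (e.g. with ZipArchive)...
        file_put_contents($sumFile, $current); // remember what was built
    }
    // ...serve $zipFile to the requester...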
I currently have a PHP file which allows the user to upload a file. Once they upload the file, it runs a program on that file using MPI.
The problem is that the script says it cannot find the file .mpd.conf (a config file that must be present in the user's home directory). I'm guessing this is because it is running as a different user than myself.
I am using apache2 to serve this webpage. Can anyone help me get this working? I don't know too much about how PHP works.
Although the user can set a lot of things in their .mpd.conf, the reason it's required is just to have a 'secret word' that the launched mpds can agree on. Like (say) Erlang machine cookies, it's just so the various mpd daemons that get launched can make sure they're only contacting the right other mpds.
Presumably your PHP program is launching a script which does the mpirun/mpiexec? If so, you could simply have the script check for the existence of ~/.mpd.conf and, if it doesn't exist, create it containing a line of the form MPD_SECRETWORD=[something-unique-here], and make sure it's created with read/write permissions only for that user.
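A sketch of that check in PHP, run before launching MPI; it assumes PHP 7+ and that getenv('HOME') resolves to the web server user's home directory (e.g. www-data):

    <?php
    // Ensure .mpd.conf exists with a secret word before the MPI launch.
    $conf = getenv('HOME') . '/.mpd.conf';

    if (!is_file($conf)) {
        // Any unique string works; the mpds only need to agree on it.
        file_put_contents($conf, 'MPD_SECRETWORD=' . bin2hex(random_bytes(16)) . "\n");
        chmod($conf, 0600); // read/write for the owning user only
    }

    // ...now launch mpirun/mpiexec, e.g. via shell_exec()...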
I am currently using PHP and Ajax file upload to develop a web application.
The application involves getting files uploaded from the user, as in e.g. an email client or photo gallery. This is the scenario where I got stuck:
When a user uploads some files but closes the browser without submitting, I want to delete those files and move only the relevant ones.
I have tried leaving the files in the /tmp/ folder, where they are given a temporary name by Apache, but when I do the upload I have to move each file immediately, because otherwise it cannot be found at a later stage by referencing that temporary filename.
The reason I would leave them in /tmp/ is that I want to set up a cron job that deletes files in that folder to free up server space.
Am I doing the right thing? Or is there a standard industry approach used by Hotmail, Google, etc.?
You will need another temporary folder which you can manage yourself.
You can upload into this folder you created yourself, called e.g. temp. When the upload is complete, move the temporary file from PHP's tmp folder into your temp folder.
Then, when the submission is done, move the file away into its respective folder.
Have a cron job that runs in the background to remove old files from that folder.
Remember to give PHP, Apache and the cron job permission to access the folder.
Don't rely on industry standards; besides, Microsoft and Google don't use PHP (well, maybe Google, but definitely not Microsoft).
Why not just move it from the tmp/ folder to your own temporary staging folder immediately, keep a reference to it in the DB, and have a cron job that periodically scans the DB for 'staging' files with a timestamp more than X hours in the past and removes them?
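A sketch of that cron script, assuming PDO/MySQL and a made-up staging_files(id, path, uploaded_at) table:

    <?php
    // Remove staging files older than X hours, along with their DB rows.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $cutoff = date('Y-m-d H:i:s', time() - 24 * 3600); // X = 24 hours
    $stmt = $pdo->prepare(
        'SELECT id, path FROM staging_files WHERE uploaded_at < :cutoff'
    );
    $stmt->execute(['cutoff' => $cutoff]);

    $del = $pdo->prepare('DELETE FROM staging_files WHERE id = :id');
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        if (is_file($row['path'])) {
            unlink($row['path']); // remove the orphaned upload
        }
        $del->execute(['id' => $row['id']]);
    }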
I don't know about the big boys, but I guess you can create a database table that holds the temporary file names. The pro of this approach is that you can delete the entry from the temporary-file table even if the browser is closed in the middle of the process, and additionally set up a cron job that deletes the files listed in that table.