I have a problem. People can upload files as attachments on my website's contact/mail-us form (the list of uploaded files is displayed once they're uploaded, etc.). There is an upload limit per session, but if a new session is created (the session ID is more or less random) you can start uploading to the server all over again. So basically you can upload thousands of files until the server is full.
Is there a way to prevent this? You could limit it per IP or per cookie, but I can simply get another one, so the problem remains.
Thank you
I've recently pushed a video uploader for a website I've been working on for months. Since then, we have had two separate instances where the video the user uploaded wasn't completely transferred, resulting in a broken file that won't play. We are not sure why this is occurring, but we have a couple of hunches: the files being uploaded are too big to handle, or the user's upload speed is terrible. I was wondering what I can do to handle situations like these, where the upload conditions are less than optimal.
I upload first to the tmp directory on my Linux server and then move the file to S3. Our system is for end users to upload, and we limit files to 40GB.
I'm currently building a web application for the management of an association. In that app, users are able to compose emails and send them to different members of the association.
While writing an email, the user can also attach files, uploaded via Ajax for a more user-friendly experience. Every time a user wants to upload an image, for instance, an Ajax request is triggered and the file lands in the server's "temp" folder through a classic file-upload form. I then take the file from $_FILES and save it in a custom "temp" folder named after a token, so that I can gather all the attachments there and reuse them when the user actually sends the email. When the email is sent, the files are moved from the custom "temp" folder to another, immutable location for archiving. That only happens if the email is sent; if the user leaves the page or logs off, the folder and files are deleted by PHP.
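For reference, here is a minimal sketch of the Ajax upload handler described above; the folder layout, field name, and token handling are assumptions for illustration, not the actual code:

```php
<?php
// upload_attachment.php - sketch of the Ajax handler: move the uploaded file
// into a token-named temp folder. The "attachment" field name and the temp
// directory layout are assumptions.
session_start();

$token = $_SESSION['email_token'] ?? bin2hex(random_bytes(16));
$_SESSION['email_token'] = $token;

$tempDir = __DIR__ . '/temp/' . $token;
if (!is_dir($tempDir)) {
    mkdir($tempDir, 0700, true);
}

if (isset($_FILES['attachment']) && $_FILES['attachment']['error'] === UPLOAD_ERR_OK) {
    // Keep only the base name to avoid directory traversal via the client-supplied name.
    $name = basename($_FILES['attachment']['name']);
    move_uploaded_file($_FILES['attachment']['tmp_name'], $tempDir . '/' . $name);
    echo json_encode(['ok' => true, 'file' => $name]);
} else {
    http_response_code(400);
    echo json_encode(['ok' => false]);
}
```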
But sometimes, after creating a new email and uploading some documents, the user will simply move on to another website and never log off or leave the page properly. So, to prevent my server from filling up with orphaned temp files, I need a way to delete the leftovers.
I've already thought of a cron task that would run, say, every 24 hours and delete all files older than that. But I'd like my solution to be portable and easy to install (PHP only, no particular server setup), so I'd like to know whether I can make PHP automatically run a routine that deletes the files on session timeout or log-off.
I haven't managed to find anything yet, and some help would be appreciated. Is my intended solution actually possible?
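For comparison, a minimal PHP-only cleanup sketch that could be called opportunistically at the start of any request instead of relying on cron; the base temp directory and the 24-hour threshold are assumptions:

```php
<?php
// cleanup_temp.php - delete token folders whose contents are older than 24 hours.
// The ./temp base directory and the age threshold are assumptions for illustration.

function cleanupTempFolders(string $baseDir, int $maxAgeSeconds = 86400): void
{
    $now = time();
    foreach (glob($baseDir . '/*', GLOB_ONLYDIR) as $dir) {
        if ($now - filemtime($dir) < $maxAgeSeconds) {
            continue; // folder touched recently, keep it
        }
        foreach (glob($dir . '/*') as $file) {
            unlink($file);
        }
        rmdir($dir);
    }
}

// Call it at the top of the upload script, for example.
cleanupTempFolders(__DIR__ . '/temp');
```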
I need to upload an image to another server once it's been submitted by a user. I can see three ways to accomplish this:
1. Upload the image to the primary server and then upload it to another server via FTP.
2. Upload the image to the primary server and store it somewhere in a temp folder; then run a cron script to upload images from the temp folder to another server.
3. Have a PHP script on the other server that accepts images uploaded directly to it.
Please note that I need to store image references and some metadata in the primary server's database. Keeping that in mind makes me automatically drop the third choice from the list, since it's hard to get the uploaded image's file name back from another server once it's there.
The first option is reasonable, since I can generate the file name before uploading the image to the other server, and once it's there I can easily save the required data in the database. The obvious drawback is that it requires extra patience from the user while the image is uploaded first to the primary server and then to the other server.
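A rough sketch of that first option using PHP's built-in FTP functions; the host, credentials, field name, and paths below are placeholders:

```php
<?php
// Sketch of option 1: accept the upload locally, then push it to the second server via FTP.
// Host, credentials, and paths are placeholders, not real values.

$localName = uniqid('img_', true) . '.jpg';   // generate the name before transferring
$localPath = '/var/www/uploads/' . $localName;
move_uploaded_file($_FILES['image']['tmp_name'], $localPath);

$conn = ftp_connect('media.example.com');
if ($conn && ftp_login($conn, 'ftp_user', 'ftp_password')) {
    ftp_pasv($conn, true);
    if (ftp_put($conn, '/images/' . $localName, $localPath, FTP_BINARY)) {
        // Transfer succeeded: save $localName and the metadata in the database here.
    }
    ftp_close($conn);
}
```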
The second method seems the best to me, since the image would be uploaded to the other server without the user noticing. The only thing that bothers me is that the URL of the uploaded image will change once the cron job moves it to the other server; I'm not sure what side effects that might cause later.
Has anybody done something similar before? If yes, please share your experience with me.
Thanks a lot!
My PHP script allows users to upload images and stores them in a directory on the same server. Is there a security or performance advantage to transferring user-uploaded files to a remote server on a different FTP account?
I imagine it's more secure to store uploaded files on a server that doesn't share a directory with my PHP scripts and isn't connected to my database. What are your thoughts? Advice on how to properly use a remote server is welcome.
Note: I plan to use CodeIgniter's FTP class to handle all transfers, and I'm saving image URLs pointing to the remote directory in MySQL.
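For what it's worth, a minimal sketch of that transfer using CodeIgniter's FTP class; the hostname, credentials, paths, and table name are placeholders:

```php
<?php
// Inside a CodeIgniter controller: push a locally uploaded image to the remote FTP account.
// Hostname, credentials, paths, and the "images" table are placeholders; assumes the
// database library is loaded.
$this->load->library('ftp');

$config = array(
    'hostname' => 'media.example.com',
    'username' => 'ftp_user',
    'password' => 'ftp_password',
    'debug'    => true,
);

$this->ftp->connect($config);

// Upload the file and record its remote URL in MySQL.
if ($this->ftp->upload('/tmp/photo.jpg', '/public_html/images/photo.jpg', 'binary', 0644)) {
    $this->db->insert('images', array('url' => 'http://media.example.com/images/photo.jpg'));
}

$this->ftp->close();
```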
It is no more secure than storing them on the same server: users can execute your PHP scripts as long as they know their locations, and they can find a PHP script's location very easily without already knowing it (and without seeing images in the same directory).
You will see a decrease in performance (possibly very noticeable, depending on how often you access the images) and won't see an increase in security.
If you don't want users to be able to execute your PHP files, you can change the read/execute permissions of users in specific directories (or simply keep your PHP files outside the folder containing the photos, so users can access the images without having access to the PHP files).
If you are worried about users uploading files you don't want them to upload, you can limit the types of files they can upload, either as a blacklist ("these file types are not allowed") or a whitelist ("only these file types are allowed"), and check with PHP when the file is uploaded.
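A minimal whitelist check along those lines, assuming a hypothetical image-only upload field called "upload":

```php
<?php
// Whitelist check on upload: only accept a few image types.
// The "upload" field name, the allowed list, and the target path are assumptions.

$allowed = array('image/jpeg', 'image/png', 'image/gif');

// Check the real content type, not just the client-supplied file name.
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $_FILES['upload']['tmp_name']);
finfo_close($finfo);

if (!in_array($mime, $allowed, true)) {
    http_response_code(415);
    exit('File type not allowed.');
}

move_uploaded_file($_FILES['upload']['tmp_name'], '/var/www/photos/' . basename($_FILES['upload']['name']));
```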
I have a file-uploading site that currently runs on a single server, i.e. the same server handles both user uploads and content delivery.
What I want to implement is a CDN (content delivery network). I would like to buy a server farm, and if I had a mechanism to spread files across the different servers, that would balance my load a whole lot better.
However, I have a few questions regarding this:
Assuming my server farm consists of 10 servers for content delivery,
Since the upload script lives at a single location on the user's end, i.e. <form action=upload.php>, it has to reside on a single server, correct? How can I duplicate the script across multiple servers and direct the user's file-upload data to the server with the least load?
How should I decide which files go to which server? During the upload process, should I send each file to a random server? If the user sends 10 files, should I scatter them across random servers? Is there a mechanism to send them to the server with the least load? Is there another algorithm that can help decide which server the files should be sent to?
How will the files be sent from the upload server to the CDN? Using FTP? Wouldn't that introduce additional overhead and a need for error checking, e.g. detecting a broken FTP connection and verifying that the file was transferred successfully?
Assuming you're using an Apache server, there is a module called mod_proxy_balancer. It handles all of the load-balancing work behind the scenes. The user will never know the difference -- except when their downloads and uploads are 10 times faster.
If you use this, you can have a complete copy on each server.
mod_proxy_balancer will handle this for you.
Each server can have its own sub-domain. You will have a database on your 'main' server that maps each download page to the physical server the file is located on. Then an on-the-fly URL is generated using some hashing scheme, which prevents hard-linking to the download and increases your page hits. The hash could mix personal and miscellaneous information, e.g. the user's IP and the time of day. The download server then checks the hash and either accepts or denies the request.
If everything checks out, the download starts; your load is balanced; and the users don't have to worry about any of this behind the scenes stuff.
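A minimal sketch of that kind of tokenized download URL, assuming a shared secret between the main server and the download servers; the host name, path, and expiry window are hypothetical:

```php
<?php
// Tokenized download link: the main server signs the file path, client IP, and an expiry
// time; the download server recomputes the hash and rejects stale or tampered links.
// The shared $secret, the dl3.example.com host, and get.php are hypothetical.

$secret = 'change-me-shared-secret';

function makeDownloadUrl(string $file, string $ip, string $secret): string
{
    $expires = time() + 3600; // link valid for one hour
    $token   = hash_hmac('sha256', $file . '|' . $ip . '|' . $expires, $secret);
    return 'http://dl3.example.com/get.php?file=' . urlencode($file)
         . '&expires=' . $expires . '&token=' . $token;
}

function verifyDownload(string $file, string $ip, int $expires, string $token, string $secret): bool
{
    if ($expires < time()) {
        return false; // link expired
    }
    $expected = hash_hmac('sha256', $file . '|' . $ip . '|' . $expires, $secret);
    return hash_equals($expected, $token);
}
```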
Note: I have done Apache administration and web development, but I have never managed a large CDN, so this is based on what I have seen on other sites and general knowledge. If anyone has something to add or corrections to make, please do.
Update
There are also companies that manage it for you. A simple Google search will get you a list.