I've created a webapp that converts uploaded videos into a compilation. Locally it works perfectly: files are uploaded to a folder inside my project directory, and the script takes every file it finds in that folder and converts them into one compilation.
However, if I want to put this online and have multiple users use the webapp, I obviously can no longer use that single folder. That's where I am stuck: I don't know how to make sure different people can upload their files simultaneously, how the script knows which files to take, and how to delete them once users are done with the site.
I thought maybe something with sessions or with temporarily generated folders (sketched below), but that is just a guess. The entire website is a single page, and I don't want to require users to log in to use the app.
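To make the guess concrete, here is roughly the kind of thing I had in mind (just a sketch; all names are made up, and it assumes PHP 7+ for random_bytes):

<?php
// Sketch: give each visitor their own working directory, keyed to the session.
session_start();

if (empty($_SESSION['upload_dir'])) {
    // random_bytes() makes the folder name unguessable by other users
    $_SESSION['upload_dir'] = sys_get_temp_dir() . '/uploads_' . bin2hex(random_bytes(16));
    mkdir($_SESSION['upload_dir'], 0700, true);
}

// Each upload lands in this user's own folder...
move_uploaded_file(
    $_FILES['video']['tmp_name'],
    $_SESSION['upload_dir'] . '/' . basename($_FILES['video']['name'])
);

// ...the compilation script reads only from $_SESSION['upload_dir'],
// and the whole folder is deleted once the job is done.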
I'm very stuck and hope someone can at least point me in the right direction. Thanks!
Currently I am putting all files in my public directory, which means that anyone can download them.
However, I don't want anyone other than the user who created a file to be able to see it. As of right now, I have no control over that. Maybe if I store the files in another directory, I can use middleware to protect them.
But I'm stuck on where to put the user-uploaded files.
What is the best directory for them? I don't have an external server, I just have a VPS.
Laravel has a storage folder designed especially for this case. It is not accessible from outside your server, so you will have to serve the files from it through Laravel.
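A minimal sketch of what that looks like, assuming Laravel 5.5+, the default local disk (storage/app), and a routes/web.php entry; the route path and middleware are illustrative:

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// Stream a stored file back through Laravel instead of the web server,
// so your authorization logic runs before anything is sent.
Route::get('/files/{path}', function (string $path) {
    abort_unless(Storage::disk('local')->exists($path), 404);

    // Check here that the current user actually owns the file...
    return Storage::disk('local')->download($path);
})->where('path', '.*')->middleware('auth');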
I have a website with a lot of confidential data and custom-made code. I have hired a developer to do the design and some simple PHP integration for me.
To prevent him from seeing all the files, I made a test environment in a subfolder, e.g. mywebsite.com/testfolder.
Now I want him to be able to include the db_test.php, function.php and parameter.php files located in the root folder while executing his scripts (e.g. mywebsite.com/testfolder/mainfile.php), but not to download them (with a PHP script or by any other means). The idea is to prevent him from seeing the code and have him just use it as-is.
This also means that his access to the root folder should be completely restricted except for the files mentioned above.
I have created a test database and a separate user for him, so the database side is secured.
I have also created an FTP user who can only access testfolder over FTP.
What I am concerned about is that he might run a PHP script that reveals all the secrets in the root folder.
I have myself been able to list and download files by running a simple PHP script from testfolder.
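For reference, the script was nothing more elaborate than something like this:

<?php
// Run from inside testfolder, this lists and reads the parent (root) folder.
print_r(scandir(__DIR__ . '/..'));
echo file_get_contents(__DIR__ . '/../db_test.php');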
Please suggest how to make this work, as I am planning to build a virtual team that will work on the website with restricted access to various resources.
RULE NUMBER ONE: never develop on a live project.
You can create a development environment (= web site) somewhere else, put some meaningless files and/or databases there, and allow your developers full access. Then, from time to time, you update your working copy from the repository (you have set up an hg/git repo, haven't you?), review and test the changes, and only then upload the files to your main web site.
I am trying to allow other people to work on my site with me. There are a couple of files/folders that I do not want them to be able to access or see.
One file is dbase.php and the folder is ./crypt/
How do I go about not even letting them see that those files are there?
One of the guys I'm trying to let work on my site says to use PHP's chmod. I looked it up and it does change file permissions, but what stops him from putting
chmod('dbase.php', 0777);
in another file like index.php, changing the permissions of the database file, and then seeing what I have in there? What I'm trying to hide is my database password and a few special variables that run my site: just some things I'm not comfortable letting float around with people I don't 100% know.
Thanks.
If the guys you are wary of need to be able to edit and upload PHP code to your site, and your site's PHP code needs to be able to read the secrets file, there is no solution.
They can always upload code that reads the secret file and outputs its contents.
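For example, all it takes is one line in any file they can upload:

<?php
// Prints the full source of the "protected" file, permissions notwithstanding.
echo file_get_contents(__DIR__ . '/dbase.php');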
I'm practicing file uploads with PHP and was uploading a file numerous times, so I wanted to make sure I wasn't taking up a lot of space on the server. I had a while loop go over every file in the PHP tmp directory, and there were 103,988 entries.
Is this more than normal? I had assumed the tmp directory was for files that are automatically deleted after a certain amount of time. Am I supposed to be managing this folder somehow?
Part of the reason I ask is that I'm writing an app that takes a user's file, changes some things, and serves it back to them. I want the file to be deleted once they leave, but I'm not sure of the best way to do it. Should I have a folder I put all the files in and use cron to delete files older than a certain time?
The general rule is that you should clean up after yourself whenever possible.
If you can't be sure you'll remove temporary files every time, it is a good idea to have a cron job do it for you once in a while.
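For example, a cleanup script like this, scheduled every 15 minutes or so (the folder path and cutoff are made up):

<?php
// cleanup.php -- e.g. crontab entry: */15 * * * * php /path/to/cleanup.php
$dir    = '/var/app/scratch';   // your app's own scratch folder, not the system tmp
$cutoff = time() - 3600;        // delete anything older than one hour

foreach (glob($dir . '/*') as $file) {
    if (is_file($file) && filemtime($file) < $cutoff) {
        unlink($file);
    }
}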
I have a site that allows people to upload files to their account, and the files are displayed in a list. Files for all users are stored on different servers and move around based on how popular they are (it's a file hosting site).
I want to add the ability for users to group files into folders. I could go the conventional route and create physical folders on the hard drive for each user and traverse them as expected. The downside is that a user's files would be bound to a single server; if that server starts running out of space (or many of its files get popular at the same time), it will get very tricky to mitigate.
What I thought about doing is keeping the stateless nature of the files, allowing them to be stored on any of the file servers, and simply storing the folder ID (along with the ID of the user who owns the file) with each file in the database. So when a user decides to move a file, it doesn't get physically moved anywhere; you just change the folder ID in the database.
Is that a good idea? I use PHP and MySQL.
Yes, it is.
I don't see any downside, except maybe more queries to the database, but with proper indexing on the parent folder ID this will probably be faster than accessing the filesystem directly.
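For example (table, columns, and credentials are placeholders):

<?php
// Assumes a `files` table with (id, user_id, folder_id, name, server)
// and an index on (user_id, folder_id).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = 42; $folderId = 7; $fileId = 1234; $newFolderId = 8; // example values

// Listing a folder is one indexed lookup:
$stmt = $pdo->prepare('SELECT id, name, server FROM files WHERE user_id = ? AND folder_id = ?');
$stmt->execute([$userId, $folderId]);
$filesInFolder = $stmt->fetchAll(PDO::FETCH_ASSOC);

// "Moving" a file is just an UPDATE; no file server is touched:
$pdo->prepare('UPDATE files SET folder_id = ? WHERE id = ? AND user_id = ?')
    ->execute([$newFolderId, $fileId, $userId]);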
Forget about folders and let the users tag their files, multiple tags per file. Then let them view files tagged X. This isn't much different to implement than virtual folders but is much more flexible for the users.
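A sketch of the tag variant, with a file_tags join table (file_id, tag) instead of a folder_id column; names are placeholders:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// All of user 42's files tagged 'holiday':
$stmt = $pdo->prepare(
    'SELECT f.id, f.name
       FROM files f
       JOIN file_tags t ON t.file_id = f.id
      WHERE f.user_id = ? AND t.tag = ?'
);
$stmt->execute([42, 'holiday']);
$tagged = $stmt->fetchAll(PDO::FETCH_ASSOC);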