File Management System in PHP

I have my files backed up on a server (docs, videos, music, etc.).
I only have FTP access (so I can't really do anything fancy).
However, I can run PHP and MySQL programs (pretty basic).
I was wondering about a good, complete system that would let me
upload, control and manage the files I have there (including security features),
so that I could edit some docs on the fly, listen to streaming music and upload more files if I wanted to.
Thanks.

I have used this in the past and it's open source, so you can extend it as needed.
http://sourceforge.net/projects/phpwebftp/

You can try:
http://extplorer.sourceforge.net/
or
http://pfn.sourceforge.net/index.php?opc=1&lg=ing
or
http://navphp.sourceforge.net/
I used the first one and it's pretty good. For media previews, the only one I know of is http://www.filerun.com/, but it's not open source.

Related

Safest way to transfer a lot of files between two sites, on a regular basis?

I'm currently working on a project which allows my employees to upload files to a private work management site that I designed; the site then informs a publicly accessible site that new files have been uploaded and which client those files belong to.
I'm trying to figure out the "best" way to go about doing this. Obviously giving my clients access to the work management site would be a terrible idea, but the files still need to be saved on the work management site as well.
When I started working on this feature, I figured I'd just write a cron job on the public site (the site clients access) to download the new files every 24 hours, but it looks like there may be hundreds of files (hundreds of megabytes) every 24 hours, so I'm starting to doubt that design. I'm also a little skeptical of using FTP/SFTP/SCP, as that's a possible security issue. Are there other methods I'm overlooking?
Note: I'm using Code Igniter on the work management site, and Laravel on the public site.
Edit: I should note that both sites will be on the same server, domain, and under the same user. Are there any issues with writing a "wrapper" which basically forwards the file data through a PHP script to hide the actual download location?
SFTP or SCP will be just about as secure as anything. Why are you skeptical of using them? You could build a VPN between the two sites, but that's likely more work and resources than using SFTP or SCP.
Edit: Responding to edited question -- if both sites are on the same server, does that mean they essentially share the same file system (disks)? If so, then it would make sense to simply access the same files from both sites, and write code / configure the client site to display only the files the clients should see and give them only the limited access they should have. It is possible to code this in such a way that the only access they have to the files is through the site's code, yes. For example, this is a standard on/off configuration option in Drupal, if I remember correctly.
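For the "wrapper" approach mentioned in the last edit, a minimal sketch could look like the script below. It assumes the client site keeps a client id in the session and that the shared files live outside the web root; all names and paths here are made up.

    <?php
    // download.php - serve a shared file only to a logged-in client (sketch).
    // The session key, base directory and query parameter are placeholders.
    session_start();

    if (empty($_SESSION['client_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit('Not authorised');
    }

    $baseDir  = realpath('/var/data/shared_uploads/' . (int) $_SESSION['client_id']);
    $fileName = basename(isset($_GET['file']) ? $_GET['file'] : '');
    $fullPath = realpath($baseDir . '/' . $fileName);

    // Refuse anything that resolves outside this client's own directory.
    if ($baseDir === false || $fullPath === false || strpos($fullPath, $baseDir) !== 0) {
        header('HTTP/1.1 404 Not Found');
        exit('File not found');
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($fullPath));
    header('Content-Disposition: attachment; filename="' . $fileName . '"');
    readfile($fullPath);

Because the script is the only way clients can reach the files, the real storage location never appears in any URL.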

Upload music files to a server but restrict downloads?

I am developing a website for a rock band. I want to upload the music to the website and let users listen to the songs, but I want to add some functionality or code so that users are not able to download the mp3 files they are listening to. Suppose someone uses "Internet Download Manager": he or she will be prompted to save the file as soon as it gets to the music file. Is there any way to stop that?
Thanks.
There is always a way to bypass any restriction (either download the file or use an audio recorder).
The protection that iTunes and other music platforms use is to let you listen to a preview of the music (~30 seconds) so you cannot get the entire song.
Well, there is really no way to let someone listen to music without them downloading it. Even if you stream the music, they could still record it.
http://www.codewalkers.com/c/a/Miscellaneous/Using-PHP-to-Stream-MP3-Files-and-Prevent-Illegal-Downloading/
This could be of help to you, though, because it will make it much more difficult to download the music.
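Roughly what the linked article does, as a hedged sketch: keep the files outside the web root, hide the real path behind a PHP script, and only stream to requests that carry a valid session. The track map and session check below are placeholders; this raises the bar but does not make downloading impossible.

    <?php
    // play.php?id=1 - stream an MP3 through PHP so the real file path is never exposed.
    session_start();

    if (empty($_SESSION['user_id'])) {           // only stream to logged-in listeners
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    $tracks = array(                              // files kept outside the web root
        1 => '/var/music/track-one.mp3',
        2 => '/var/music/track-two.mp3',
    );

    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
    if (!isset($tracks[$id])) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }

    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($tracks[$id]));
    header('Content-Disposition: inline; filename="track.mp3"');  // inline, not attachment
    readfile($tracks[$id]);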
As mentioned by others, there is no foolproof way to keep users from downloading the mp3s. Having said that, you could try streaming the content, which will make it much harder for most people.
You can use a streaming server like Red5 for this.
I've been working on the same thing, and what I did was create two (or more) versions of each file. One is the listening file, which in my case was a low-quality MP3 encoded at 112 kbps. The quality is good enough for people to listen to online, but not good enough if they want decent quality on the go with a portable player or the like.
Then I'd also have high-quality MP3 versions (320 kbps) and WAV files, which people could download only if they were logged in.
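The low-bitrate listening copies can be produced in a batch step; assuming ffmpeg is installed on the server, a small PHP wrapper like this would do it (paths are illustrative only):

    <?php
    // make_preview.php - create a 112 kbps "listening" copy of a track (sketch).
    $source  = '/var/music/masters/song.wav';        // high-quality original
    $preview = '/var/music/previews/song-112k.mp3';  // low-quality streaming copy

    $cmd = sprintf(
        'ffmpeg -y -i %s -codec:a libmp3lame -b:a 112k %s 2>&1',
        escapeshellarg($source),
        escapeshellarg($preview)
    );

    exec($cmd, $output, $status);
    if ($status !== 0) {
        echo "Encoding failed:\n" . implode("\n", $output);
    }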
I can't say that I've found a solution which guarantees that people can't get hold of the music if that's what they are after. You could certainly use Red5 or some other streaming solution, but that requires you to spend more time and resources on configuring and maintaining it. Furthermore, I believe Red5/Wowza and the other streaming servers require the client to use Flash to play the music (because the communication is done over RTMP). That rules out iPad/iPhone/iPod users, as Flash can't be used on those devices.
My conclusion was thus that, with the (limited) resources I had, the way to go was to offer playback over HTTP (a.k.a. "progressive HTTP") using the low-quality files.

HTTP Uploads with Resource Forks

I'm building a PHP based upload service for some of our clients. I am using SWFUpload so that I can view the progress of a file as it uploads. I've got it pretty much built, but am running into one last issue before we can release it to the public.
Many (almost all) of our clients are Mac-based and are uploading sets of files that include InDesign files, fonts, Illustrator files, etc. Most of the time the image files are OK, but occasionally (and always with Type 1 fonts) a file will become corrupted because it loses its resource fork.
I understand why this is happening (moving from a multi-fork system to a single-fork system), but I cannot find any elegant solution. In my research the best answer I've found so far is "have the user compress it". I know that works, but it's unreasonable, in our clients' opinion, for us to require them to compress every set of files they are going to send.
Are there any better solutions for keeping those resource forks alive? Of course, I would prefer a solution that is straight javascript/php, but would settle for something that is flash based or (least preferably) java based.
My only requirements for the new solution would be:
View upload progress
User doesn't have to manually compress files
Here's some information about my system
Ubuntu 10.10 Server running a standard LAMP install
PHP5
SWFUpload (whatever the most recent version is)
Uploads handle files. If the browser and the underlying OS are not able to deal with forks in this procedure (mapping any file onto the single-file model used for uploads), then you're bound to what the system's architecture gives you.
Resource fork: The resource fork is a construct of the Mac OS operating system used to store structured data in a file, alongside unstructured data stored within the data fork. A resource fork stores information in a specific form, such as icons, the shapes of windows, definitions of menus and their contents, and application code (machine code).
If that's a blocker for you, you might have chosen the wrong field to work in. Just saying: if you run into systemic limits, there is not much you can do about them, even if you work for graphic designers and Mac users.
SWFUpload would need a feature to deal with forks. For that, Flash would need a feature to deal with forks. For that, the browser would eventually need a feature to deal with forks. And so on.
Beyond this chain, another question remains: how do you deal with forks at all? Since the upload maps one file to one chunk of binary data, how would you map the fork as well? Append it? Add an additional file?
So on the technical level this does not sound easily solvable. All components and systems in the file-input chain would have to support a feature that is commonly not supported at all.
As you can't offer the user something that does not exist, the only thing you can do is make your application more usable or user-friendly, e.g. by providing the right notes at the right time (for instance, when a user selects a Type 1 font for uploading, remind them to include the resource fork as well). Communicating with the user can help, but keep in mind that the user needs to be addressed in language they understand.
So if you know that certain file types have forks, address the issue to someone who can solve it: The user. You can't.
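One small thing the server can do is flag files that have probably lost their fork, so the user can be asked to re-send them zipped. A classic Mac Type 1 printer font keeps its data in the resource fork, so it often arrives as a zero-byte upload; the heuristic below is only a sketch, with placeholder field names and paths.

    <?php
    // upload.php (fragment) - warn about files that likely lost their resource fork.
    $tmp  = $_FILES['userfile']['tmp_name'];
    $name = $_FILES['userfile']['name'];

    if (is_uploaded_file($tmp) && filesize($tmp) === 0) {
        echo "Warning: '" . htmlspecialchars($name) . "' arrived empty - its data was ";
        echo "probably in a resource fork. Please zip it and upload the archive instead.";
    } else {
        move_uploaded_file($tmp, '/var/uploads/' . basename($name));
    }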
You don't have to use swfupload to monitor progress.
Here are some files that demonstrate this: https://github.com/senica/Booger/tree/master/assets/js/jquery-upload
It is not documented very well, but it basically uses the webkitSlice function to upload the files in JavaScript. You can use the callback functions to display the progress of the files.
This would be a javascript/php solution.
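The linked demo covers the client-side JavaScript; on the PHP side a chunked-upload receiver can be as small as appending each slice to a temp file. A sketch, assuming the JavaScript sends each slice as the raw request body; the parameter names are placeholders for whatever your uploader actually sends:

    <?php
    // receive_chunk.php - append one slice of a chunked upload (sketch).
    $name   = basename(isset($_GET['name']) ? $_GET['name'] : 'upload.bin');
    $chunk  = isset($_GET['chunk'])  ? (int) $_GET['chunk']  : 0;
    $chunks = isset($_GET['chunks']) ? (int) $_GET['chunks'] : 1;

    $target = '/var/uploads/tmp/' . $name;

    // The raw request body holds the slice; truncate on the first chunk, append after.
    $in  = fopen('php://input', 'rb');
    $out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    if ($chunk === $chunks - 1) {
        // Last chunk received: move the assembled file to its final location.
        rename($target, '/var/uploads/' . $name);
    }
    echo 'ok';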

Multiple file web uploader that uploads to remote FTP?

After tearing my hair out for the last week, I am looking for some sort of web uploader that allows my customers to upload a bunch of files (often up to 200) and store them on a remote FTP server. What I am looking for is something similar to Uploadify, SWFUpload, etc., but with the ability to upload files via my web page (at my hosting company) and store them on my local FTP server.
It is absolutely critical that the files end up on my local server.
If this is somehow impossible to do, it could also just upload the files to my website via HTML (which Uploadify etc. does) and, after completion, copy the files from the web server to my local FTP.
The closest thing I found was something called FileChunker, and it looked like the perfect solution, BUT it won't let me add multiple files, just one by one.
All help would be greatly appreciated!
Unfortunately I can't give you a concrete answer, but let me say that it should be theoretically possible to do for a Flash or Java application since they can use raw TCP sockets and implement the FTP protocol (but I am not aware of any Flash-based implementation).
If I'm not wrong, all major browsers offer native file upload via FTP by browsing to the FTP directory itself (but you can't influence the visual appearance), just like Windows Explorer can access FTP servers and use them like a network drive.
However, I discourage you from using an FTP server at all. That protocol, with its dual connections and its passive/active modes, often causes problems. It's usually much better to upload via HTTP and implement an HTTP-based file server yourself, which is rather easy after all (but be very careful not to expose too much of your server's file system).
I see no real reason for using FTP unless you really want to allow your users to use their FTP client of choice, but that is contrary to your question.
Hope this helps.
Update: I just noticed the sentence "copy the files from the web server to my local ftp". In case you are really talking about two different servers I would still suggest a HTTP upload and then forward the file to the FTP server via the PHP script (your web server acting as a proxy).
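A sketch of that proxy idea using PHP's built-in FTP functions; the host, login, form field and paths are placeholders:

    <?php
    // upload_and_forward.php - accept an HTTP upload, then push it on to the FTP server.
    $tmp  = $_FILES['userfile']['tmp_name'];
    $name = basename($_FILES['userfile']['name']);

    if (!is_uploaded_file($tmp)) {
        exit('Upload failed');
    }

    $conn = ftp_connect('ftp.example.com');
    if (!$conn || !ftp_login($conn, 'ftp_user', 'ftp_password')) {
        exit('Could not reach the FTP server');
    }

    ftp_pasv($conn, true);                        // passive mode usually works behind NAT
    if (!ftp_put($conn, '/incoming/' . $name, $tmp, FTP_BINARY)) {
        echo 'Transfer to the FTP server failed';
    }
    ftp_close($conn);

This way the FTP credentials stay on the web server and are never exposed to the browser.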
I don't think it's feasible to upload directly from the browser to your FTP server, as you would have to have your credentials more or less visible on the website (e.g. in your JavaScript source).
I once created something similar, but because of that issue I decided to upload via plupload to Amazon S3 and sync the files afterwards via s3sync. The advantages were:
Large file sizes (2 GB+)
One-time tokens for the upload, so there is no need to send credentials to the client
No traffic to your web server (the communication runs client->S3)
Take a look at this thread for an implementation: http://www.plupload.com/punbb/viewtopic.php?id=133
After a wild search I finally found something I could use. This Java applet lets me upload endless amounts of files, zips them down, and I managed to pass a PHP variable into the applet so the zip file is stored with the user's e-mail address as the filename. It cost me $29, but it was well worth it since I now have full control of where the files go and who uploaded them.

Best Practice for Uploading Many (2000+) Images to a Server

I have a general question about this.
When you have a gallery, sometimes people need to upload thousands of images at once. Most likely, this would be done through a .zip file. What is the best way to go about uploading this sort of thing to a server? Many times, servers have timeouts etc. that need to be accounted for. I am wondering what kinds of things I should be looking out for and what the best way is to handle a large number of images being uploaded.
I'm guessing that you would allow a user to upload a zip file (assuming the timeout does not affect you), and this zip file is uploaded to a specific directory; let's assume in this case a directory is created for each user in the system. You would then unzip it on the server and scan the user's folder for any directories containing .jpg, .png or .gif files (etc.) and then import them into a table accordingly, presumably labeled by folder name.
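A rough PHP sketch of that unzip-and-scan step (the question mentions Rails, but the idea is the same; directory names and the extension list are illustrative):

    <?php
    // import_zip.php - unpack one user's zip and collect the image paths (sketch).
    $userDir = '/var/galleries/user_42';
    $zipPath = $userDir . '/upload.zip';

    $zip = new ZipArchive();
    if ($zip->open($zipPath) !== true) {
        exit('Could not open archive');
    }
    $zip->extractTo($userDir . '/extracted');
    $zip->close();

    // Walk the extracted tree and keep only image files, grouped by folder name.
    $images = array();
    $iter = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($userDir . '/extracted')
    );
    foreach ($iter as $file) {
        if ($file->isFile() && preg_match('/\.(jpe?g|png|gif)$/i', $file->getFilename())) {
            $album = basename($file->getPath());   // label by containing folder
            $images[$album][] = $file->getPathname();
        }
    }
    // ...insert $images into the gallery table here.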
What kind of server-side troubles could I run into?
I'm aware that there may be many issues. Even general ideas would be good, so I can then research further. Thanks!
Also, I will be programming in Ruby on Rails, but I think this question applies across any language.
There's no reason why you couldn't handle this kind of thing with a web application. There are a couple of excellent components that would be useful for this:
Uploadify (based on jquery/flash)
plupload (from moxiecode, the tinymce people)
The reason they're useful is that, in the first instance, a Flash component handles the uploads, so you can select groups of files from the file browser window (assuming no one is going to individually select thousands of images!), and with plupload, drag and drop is supported too, along with more platforms.
Once you've got your interface working, the server side stuff just needs to be able to handle individual uploads, associating them with some kind of user account, and from there it should be pretty straightforward.
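Each request from Uploadify/plupload typically carries a single file, so the receiving script can stay small. A sketch; the form field name (commonly 'Filedata' for the Flash uploaders) and the destination path are assumptions to adjust:

    <?php
    // upload.php - handle one image per request from the Flash/JS uploader (sketch).
    session_start();

    $field = isset($_FILES['Filedata']) ? 'Filedata' : 'file';

    if (!isset($_FILES[$field]) || $_FILES[$field]['error'] !== UPLOAD_ERR_OK) {
        header('HTTP/1.1 400 Bad Request');
        exit('Upload error');
    }

    $name = basename($_FILES[$field]['name']);
    if (!preg_match('/\.(jpe?g|png|gif)$/i', $name)) {
        header('HTTP/1.1 415 Unsupported Media Type');
        exit('Not an image');
    }

    // Associate the file with the logged-in account, then move it out of /tmp.
    $dest = '/var/galleries/user_' . (int) $_SESSION['user_id'] . '/' . $name;
    move_uploaded_file($_FILES[$field]['tmp_name'], $dest);
    echo 'ok';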
With regard to server-side issues, that's really a big question: it depends on how many people will be using the application at the same time, the size of the images, and any processing that takes place afterwards. Remember, the files are kept in a temporary location while the script is processing them, and are either deleted upon completion or copied to a final storage location by your script, so space, memory overhead and timeouts could all be an issue.
If the images are massive in size, say RAW or TIFF, then this kind of thing could still work with chunked uploads, but implementing some kind of FTP upload might be easier. It's a bit of a vague question, but there should be plenty here to get you going ;)
With that many images it has to be a serious app, which gives you the liberty to suggest a piece of software running on the client (something like Yahoo Mail or Picasa does) that takes care of 'managing' the upload of images (network interruptions, resume support, etc.).
On the server side, you could process these one at a time (assuming your client is sending them that way), thus keeping it simple.
Take a peek at http://gallery.menalto.com
They have a dozen methods for uploading pictures into galleries.
You can choose the one that suits you.
Either have a client app or some Ajax code that sends the images one by one, preventing timeouts. Alternatively, if this is not available to the public, FTP still works...
I'd suggest a client application (maybe written in AIR or Titanium) or telling your users what FTP is.
deviantArt.com for example offers FTP as an upload method for paying subscribers and it works really well.
Flickr instead has its own app for this, the "Flickr Uploadr".
