After tearing my hair out for the last week, I am looking for some sort of web uploader that allows my customers to upload a batch of files (often up to 200) and store them on a remote FTP server. I am looking for something similar to Uploadify, SWFUpload and such, but it is absolutely critical that it can upload files via my web page (at my hosting company) and store them on my local FTP server.
If this is somehow impossible to do, it could also just upload the files to my web server via HTML (which Uploadify etc. does) and, after completion, copy the files from the web server to my local FTP server.
The closest thing I found was something called FileChunker, and it looked like the perfect solution, BUT it won't let me add multiple files, just one by one.
All help would be greatly appreciated!
Unfortunately I can't give you a concrete answer, but let me say that it should be theoretically possible for a Flash or Java application, since they can use raw TCP sockets and implement the FTP protocol (though I am not aware of any Flash-based implementation).
If I'm not mistaken, all major browsers offer native file upload via FTP by browsing to the FTP directory itself (though you can't influence the visual appearance), just like Windows Explorer can access FTP servers and use them like a network drive.
However, I discourage you from using an FTP server at all. The protocol, with its double connection and its active/passive modes, often causes problems. It's usually much better to upload via HTTP and implement an HTTP-based file server yourself, which is rather easy after all (but be very careful not to expose too much of your server's file system).
I see no real reason for using FTP unless you really want to allow your users to use their FTP client of choice, but that is contrary to your question.
Hope this helps.
Update: I just noticed the sentence "copy the files from the web server to my local ftp". If you are really talking about two different servers, I would still suggest an HTTP upload and then forwarding the file to the FTP server via a PHP script (your web server acting as a proxy).
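A minimal sketch of that proxy idea, assuming PHP's FTP extension is available; the hostname, credentials, and the form field name "userfile" are invented for illustration:

```php
<?php
// Minimal sketch: receive an HTTP upload, then push it on to the FTP server.
// Host, credentials, and the field name "userfile" are placeholders.
if (!isset($_FILES['userfile']) || $_FILES['userfile']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed');
}

$conn = ftp_connect('ftp.example.com');
if (!$conn || !ftp_login($conn, 'ftpuser', 'secret')) {
    http_response_code(502);
    exit('Could not reach the FTP server');
}
ftp_pasv($conn, true); // passive mode tends to survive NAT/firewalls better

$remote = basename($_FILES['userfile']['name']);
if (!ftp_put($conn, $remote, $_FILES['userfile']['tmp_name'], FTP_BINARY)) {
    ftp_close($conn);
    http_response_code(502);
    exit('FTP transfer failed');
}
ftp_close($conn);
echo 'OK';
```

This keeps the FTP credentials on the server, which also addresses the point below about not exposing them to the browser.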
I don't think it's feasible to upload directly from the browser to your FTP server, as you would have to have your credentials more or less visible on the website (e.g. in your JavaScript source).
I once built something similar, but because of that issue I decided to upload via Plupload to Amazon S3 and sync the files afterwards via s3sync. The advantages were:
- Large file sizes (2 GB+)
- One-time tokens for upload, so no credentials are sent to the client
- No traffic through your web server (the communication runs client → S3)
Take a look at this thread for an implementation: http://www.plupload.com/punbb/viewtopic.php?id=133
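For the one-time tokens, the usual trick is a signed POST policy the server hands to the client. A rough sketch using the older signature-v2 scheme; the bucket name and secret key are placeholders:

```php
<?php
// Rough sketch of a signature-v2 S3 POST policy for browser-based uploads.
// Bucket and secret key are placeholders.
$bucket    = 'my-upload-bucket';
$awsSecret = 'AWS_SECRET_ACCESS_KEY';

$policy = base64_encode(json_encode([
    'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600), // valid for 1 hour
    'conditions' => [
        ['bucket' => $bucket],
        ['starts-with', '$key', 'uploads/'],
        ['content-length-range', 0, 2147483648], // allow files up to 2 GB
    ],
]));
$signature = base64_encode(hash_hmac('sha1', $policy, $awsSecret, true));

// Hand $policy and $signature to the uploader; the browser then posts the
// file directly to the bucket, so no file traffic touches your web server.
```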
After a wild search I finally found something that I could use. This Java applet lets me upload endless amounts of files, zips them up, and I managed to pass a PHP variable into the applet so the zip file is stored with the user's e-mail address as the filename. It cost me $29, but it was well worth it since I now have full control of where the files go and who uploaded them.
I have a standard HTML form, including a file input, allowing users of a web application to upload files (pictures, documents or videos).
It technically works, except when it comes to large files...
I have a personal dedicated server on which I can change the PHP configuration to handle larger files;
but this is not a reliable solution, as my client is on shared hosting.
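For reference, the limits I mean are the usual php.ini directives (the values here are only illustrative):

```ini
; Illustrative php.ini settings that cap HTTP uploads
upload_max_filesize = 2G
post_max_size = 2G
max_execution_time = 300
max_input_time = 300
```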
I know HTTP has different limits and is definitely not the best protocol to handle files.
So my objective is to avoid HTTP upload.
I was wondering if there is any way to rely on FTP to upload the selected file.
Any solution to upload files through FTP, directly from the client's browser?
EDIT:
I've read some solutions involving Java applets, but this is not really something I can, or even want to, provide.
And as the files are confidential, using a third-party service is not possible either.
I develop some Python applications, so I know how to do this in Python locally, but I am working with some PHP developers (I know nothing of PHP) who say this can't be done in PHP.

This is the idea: a PHP-driven remote website creates and hosts files. Using a web browser, I want to download a series of folders and files from this website onto the local machine, overwriting any existing files and folders with the same names. So in my browser I click a download button, which asks me to browse to a local or network folder to download the folders and files to.

Currently we just download a single .zip file containing all these files and folders, which we then have to unzip and manually move, copy, paste, etc. Very messy and cumbersome. There must be a better way with PHP and some other language?
No, it's not possible for PHP (a server-side language) to reach into the client machine from the browser and directly manipulate its file system, hard drive, or anything like that. That is not the way it works.
Just think about it for a moment: if that could be accomplished, we would have a serious security threat. For example, we visit a page like somebadassdude.com and it has a PHP script that creates unlimited folders and files to fill up our whole hard drive... and that is the mild case.
Fortunately, browsers don't allow this, by security design.
[Diagram: the browser and the server communicating via HTTP requests and responses]
As the diagram shows, the browser and the server talk to each other through HTTP requests and responses. There is no communication between them like that of a local program running on the client OS. You deal with the user's browser, and there is no way to command the browser to manipulate the client's hard disk; if that could happen, think of the security concern I mentioned before.
To be clearer: your PHP script runs on your server, not on the client machine. It only responds when a user's browser requests a specific resource on your server, and it answers with an HTTP response, which can contain HTML, JSON, a file (to be downloaded or opened by an external program), or whatever.
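For instance, the closest PHP gets to "writing a file on the client" is offering a download; where it lands on disk is entirely the browser's and user's decision. The path and filename here are made up:

```php
<?php
// The most a PHP script can do: offer a file as a download. The browser,
// not the script, decides where (and whether) it is saved on disk.
$archive = '/var/www/exports/bundle.zip'; // placeholder path

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="bundle.zip"');
header('Content-Length: ' . filesize($archive));
readfile($archive);
```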
You have limited options:
If it is something for an intranet or local network, and you have access to that network (locally, or remotely via something like a VPN), you could share a folder over the network. That way you can use a PHP or Python script to create the folders and copy the files into them, without having to download a zip and unzip it manually from the browser.
Use a Java applet. Why? Because a Java applet runs on the client side, so you have access to the user's computer (if the user allows it), and you can certainly manipulate (create, delete, read, etc.) the folders and files on their hard drive. So when the user chooses the files to download, you fire up the Java applet and let it request from the server the files the user has marked. Once the files are downloaded, create or overwrite them on the client machine.
Create and run a program on the client machine instead of a web page; this way you gain the flexibility you need. But of course, it has its own complexities.
So IMHO I think the Java applet may be the best-suited solution for you:
You do not have to change your actual business model much.
It doesn't require a large time investment.
It is cross-platform: Java works on plenty of operating systems, and Java applets work in the most popular browsers.
By the way, I personally dislike Java, but it's a tool, and you have to use the right tool for the job.
Cheers.
I am transferring my WordPress site. I need to transfer all the content from my old server to the new one. I don't have enough bandwidth to first FTP everything to my local PC and then FTP it again to the new server, so I'm asking for a more sophisticated technique.
You should ask your new host for help, as they will have enough bandwidth and probably ample experience with this. They might charge extra for it. Moreover, if both hosts use WHM, they should have a function that lets them transfer directly from the other host, provided it isn't blocked (most hosts block it).
In general, the standard way of transferring anything between two Linux servers is rsync. Here is a simple guide: http://johnveldboom.com/posts/sync-files-between-servers-using-rsync/
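A typical invocation, run on the new server, looks something like this (the host and paths are placeholders):

```
rsync -avz -e ssh user@old-host.example.com:/var/www/wordpress/ /var/www/wordpress/
```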
However, this may be impossible on shared hosting (as well as any other SSH-based transfer).
The only other way I can think of without downloading the files to your computer is to compress everything on the original host (don't forget to delete the archive when you're finished!) and download it from the destination host. You will have to write a small script and put it on the destination server.
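A minimal sketch of such a script on the destination server, assuming allow_url_fopen is enabled; the URL and filename are invented:

```php
<?php
// Pull the archive created on the old host directly server-to-server,
// so nothing passes through your own connection. URL is a placeholder.
$src = 'http://old-host.example.com/site-backup.tar.gz';
$dst = __DIR__ . '/site-backup.tar.gz';

if (!copy($src, $dst)) {
    exit("Download failed\n");
}
echo "Done: " . filesize($dst) . " bytes\n";
```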
I am designing a web-based file-managment system that can be conceptualised as 3 different servers:
The server that hosts the system interface (built in PHP) where users 'upload' and manage files (no actual files are stored here; it's all metadata).
A separate staging server where files are placed to be worked on.
A file-store where the files are stored when they are not being worked on.
All 3 servers will be *nix-based on the same internal network. Users, based in Windows, will use a web interface to create an initial entry for a file on Server 1. This file will be 'uploaded' to Server 3 either from the user's local drive (if the file doesn't currently exist anywhere on the network) or another network drive on the internal network.
My question relates to the best programmatic approach to achieve what I want to do, namely:
When a user uploads a file (selecting the source via a web form) from the network, the file is transferred to Server 3 as an inter-network transfer, rather than passing through the user (which I believe is what would happen if it was sent as a standard HTTP form upload). I know I could set up FTP servers on each machine and attempt to FXP files between locations, but is this preferable to PHP executing a command on Server 1 (which will have global network access), to perform a cross-network transfer that way?
The second problem is that these are very large files we're talking about, at least a gigabyte or two each, and so transfers will not be instant. I need some method of polling the status of the transfer, and returning this to the web interface so that the user knows what is going on.
Alternatively this upload could be left to run asynchronously to the user's current view, but I would still need a method to check the status of the transfer to ensure it completes.
So, if using an FXP solution, how could polling be achieved? If using a file move/copy command from the shell, is any form of polling possible? PHP/JQuery solutions would be very acceptable.
My final part to this question relates to Windows network drive mapping. A user may select a file from an arbitrarily specified mapped drive. Their G:\ may relate to \\server4\some\location\therein, but presumably any drive path given to the server via a web form will only send the G:\ file path. Is there a way to determine the 'real path' of mapped network drives?
Any solution would be used to stage files from Server 3 to Server 2 when the files are being worked on - the emphasis being on these giant files not having to pass through the user's local machine first.
Please let me know if you have comments and I will try to make this question more coherent if it is unclear.
As far as I’m aware (and I could be wrong) there is no standard way to determine the UNC path of a mapped drive from a browser.
The only way to do this would be to have some kind of control within the web page. Could be ActiveX or maybe flash. I’ve seen ActiveX doing this, but not flash.
In the past when designing web based systems that need to know the UNC path of a user’s mapped drive I’ve had to have a translation of drive to UNC path stored server side. I did have a luxury though of knowing which drive would map to what UNC path. If the user can set arbitrary paths then this obviously won’t work.
Ok, as I’m procrastinating and avoiding real work I’ve given this some thought.
I’ll preface this by saying that I’m in no way a Linux expert and the system I’m about to describe has just been thought up off the top of my head and is not something you’d want to put into any kind of production. However, it might help you down the right path.
So, you have 3 servers, the Interface Server (LAMP stack I’m assuming?) your Staging Server and your File Store Server. You will also have Client Machines and Network Shares. For the purpose of this design your Network Shares are hosted on nix boxes that your File Store can scp from.
You’d create your frontend website that tracks and stores information about files etc. This will also hold the details about which files are being copied, which are in Staging and so on.
You'll also need some kind of service running on the File Store Server. I'll call this the File Copy Service. It will be responsible for copying the files from the servers hosting the network shares.
Now, you’ve still got an issue with how you figure out what path the users file is actually on. If you can stop users from mapping their own drives and force them to use consistent drive letters then you could keep a translation of drive letter to UNC path on the server. If you can’t, well I’ll let you figure that out. If you’re in a windows domain you can force the drive mappings using Group Policies.
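If the drive letters can be enforced, the server-side translation can be as dumb as a lookup table. A sketch with entirely invented mappings:

```php
<?php
// Hypothetical server-side translation of enforced drive letters to UNC
// paths. Only workable when mappings are controlled (e.g. via Group Policy).
$driveMap = [
    'G:' => '\\\\server4\\projects',
    'H:' => '\\\\server5\\media',
];

function toUncPath(string $clientPath, array $map): ?string
{
    $drive = strtoupper(substr($clientPath, 0, 2));
    if (!isset($map[$drive])) {
        return null; // unknown drive: reject rather than guess
    }
    // Swap the drive letter for the share and normalise the slashes.
    return $map[$drive] . str_replace('/', '\\', substr($clientPath, 2));
}

// e.g. toUncPath('G:\videos\raw.mov', $driveMap) => \\server4\projects\videos\raw.mov
```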
Anyway, the process for the system would work something like this.
User goes to system and selects a file
The Interface Server takes the file path and calls the File Copy Service on the File Store Server
The File Copy Service connects to the server that hosts the file and initiates the copy. If they're all nix boxes you could easily use something like SCP. Now, I haven't actually looked up how to do it, but I'd be very surprised if you couldn't get a running percentage-complete total out of SCP as it copies. With this running total, the File Copy Service updates the database on the Interface Server with how the copy is doing, so the user can see it from the Interface Server.
The File Copy Service can also be used to move files from the File Store to the staging server.
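A sketch of what the polling side could look like, assuming the File Copy Service writes its progress into a table on the Interface Server; the table and column names are invented:

```php
<?php
// Hypothetical progress endpoint on the Interface Server; jQuery on the
// client can poll this every few seconds. The schema is invented.
$pdo = new PDO('mysql:host=localhost;dbname=filemgr', 'user', 'pass');

$stmt = $pdo->prepare('SELECT bytes_copied, bytes_total FROM transfers WHERE id = ?');
$stmt->execute([(int) $_GET['transfer_id']]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode([
    'percent' => $row
        ? round(100 * $row['bytes_copied'] / max(1, $row['bytes_total']))
        : 0,
]);
```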
As I said, this is very roughly thought out. The above would work, but it all depends a lot on how your systems are set up.
Having said all that though, there must be software that would do this out there. Have you looked?
If I am right, this is the architecture:
1.)
First, let's solve the issue of inter-server transfer.
I would solve it by mounting the file systems of Servers 2 and 3 onto Server 1 via NFS.
https://help.ubuntu.com/8.04/serverguide/network-file-system.html
This way PHP can store files directly on the file system and doesn't need to know which server the files are really on.
In /etc/exports on Servers 2 and 3 (note there must be no space between the client IP and the options):

/directory/with/files 192.168.IPofServer.1(rw,sync)

then run: exportfs -ra

In /etc/fstab on Server 1 (the two exports need distinct mount points):

192.168.IPofServer.2:/var/lib/data/server2/ /directory/with/files/server2 nfs rsize=8192,wsize=8192,timeo=14,intr
192.168.IPofServer.3:/var/lib/data/server3/ /directory/with/files/server3 nfs rsize=8192,wsize=8192,timeo=14,intr

then run: mount -a
2.)
Getting upload progress for really large files.
Here are some possibilities for an HTTP upload progress bar; for a resume function, however, you would have to use a Flash plugin:
http://fineuploader.com/#demo
https://github.com/valums/file-uploader
Or you can build it yourself using the APC extension:
http://www.amwsites.com/blog/2011/01/use-a-combination-of-jquery-php-apc-uploadprogress-to-show-progress-bar-during-an-upload/
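A rough sketch of the APC variant. It assumes apc.rfc1867 = 1 in php.ini and a hidden APC_UPLOAD_PROGRESS field placed before the file input in the upload form:

```php
<?php
// Poll APC's RFC1867 upload-progress data. $_GET['key'] must match the
// value of the form's hidden APC_UPLOAD_PROGRESS field.
$info = apc_fetch('upload_' . $_GET['key']);

header('Content-Type: application/json');
if ($info === false) {
    echo json_encode(['percent' => 0, 'done' => false]);
} else {
    echo json_encode([
        'percent' => round(100 * $info['current'] / max(1, $info['total'])),
        'done'    => (bool) $info['done'],
    ]);
}
```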
3.)
Letting the server load files from the network drive.
I would try a Java applet to figure out the real network path and send it to the server, so the server can fetch the file in the background. But I have never done anything like this before and have no further information.
Okay, here's the rundown.
I'm developing a video hosting site with PHP and jQuery primarily. My client is absolutely set on using Filezilla to upload his videos via FTP instead of allowing me to handle the upload directly with PHP. This makes me mildly psychotic, but hey, the money man calls the shots.
Since I have no way of detecting the completion of those FTP uploads, I set him up with a special FTP account that is rooted in the uploads directory, and designed an interface that monitors that folder and allows him to start transcoding those files for the web at his discretion. This is less than optimal. I'd like to show the progress of currently processing uploads in my mobile app, and hopefully start encoding files automatically when the FTP transfer completes.
So what I'm really asking for is either a method of detecting that a file is still being uploaded, or some way of finding the size of the original file that is being uploaded so that I can do comparisons.
Thanks!
On Linux, you can use fuser to see if a file's in use by a process. A more reliable option would be to use the inotify system, which'll give you realtime notifications of file change events.
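A rough sketch using the PECL inotify extension: IN_CLOSE_WRITE fires when the FTP daemon closes the finished file. The watch path is a placeholder, and note that a resumed transfer would also trigger the event:

```php
<?php
// Watch the FTP upload directory and react when the daemon closes a file.
// Requires the PECL inotify extension; the path is a placeholder.
$fd    = inotify_init();
$watch = inotify_add_watch($fd, '/home/ftpuser/uploads', IN_CLOSE_WRITE);

while (true) {
    $events = inotify_read($fd); // blocks until something happens
    foreach ($events as $event) {
        if ($event['mask'] & IN_CLOSE_WRITE) {
            // The FTP daemon closed the file: likely safe to start transcoding.
            echo "Finished: {$event['name']}\n";
        }
    }
}
```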
In case anyone else is ever wondering: the answer to the question, reached through the comments and confirmed by more research on my end, is that the total size of a file is not passed to the server at all during FTP transfers. Thanks to Jonathan Amend.