How can I set my uploader to save uploaded files into a folder on my computer? I'm hosting a free server on 000webhost.com and I have a simple uploader script.
So I'd like to do something like this:
//specify folder for file upload
$folder = 'C:\Users\Tepa\Desktop\Ohjelmat\Uploads';
In order to have a website save files on your local computer, you would need to install an FTP or other file server on your home computer and send the files to it by IP address or DNS name.
I would suggest instead tackling this from the opposite direction: write a script to pull these files to your local computer rather than opening it up to the outside world. You can set it up as a cron job / scheduled task so that the files are pulled in automatically.
If the files you are interested in are stored in a publicly accessible folder on your host, actually accomplishing this task is fairly trivial (see: cURL or file_get_contents)
You lose the real-time synchronicity, but gain some peace of mind.
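For example, if the uploads land in a publicly accessible folder on the host, a small script run on the home PC (via cron or Task Scheduler) could mirror them down with file_get_contents. The base URL, local directory and file list below are all placeholders:

```php
<?php
// Hypothetical pull script, run on the home PC, that mirrors files from the
// host down to a local folder. URL and directory names are placeholders.
function mirrorUrl(string $baseUrl, string $name): string
{
    // Encode the file name so spaces etc. survive in the URL
    return $baseUrl . rawurlencode($name);
}

function pullFiles(string $baseUrl, string $localDir, array $names): void
{
    foreach ($names as $name) {
        $data = file_get_contents(mirrorUrl($baseUrl, $name));
        if ($data !== false) {
            // basename() guards against path tricks in the file name
            file_put_contents($localDir . '/' . basename($name), $data);
        }
    }
}

// Example call (supply your own list of file names):
// pullFiles('http://example.000webhostapp.com/uploads/',
//           'C:/Users/Tepa/Desktop/Ohjelmat/Uploads', ['photo 1.jpg']);
```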
Related
I am designing a web-based file-management system that can be conceptualised as 3 different servers:
The server that hosts the system interface (built in PHP) where users 'upload' and manage files (no actual files are stored here, it's all meta).
A separate staging server where files are placed to be worked on.
A file-store where the files are stored when they are not being worked on.
All 3 servers will be *nix-based on the same internal network. Users, based in Windows, will use a web interface to create an initial entry for a file on Server 1. This file will be 'uploaded' to Server 3 either from the user's local drive (if the file doesn't currently exist anywhere on the network) or another network drive on the internal network.
My question relates to the best programmatic approach to achieve what I want to do, namely:
When a user uploads a file (selecting the source via a web form) from the network, the file is transferred to Server 3 as an inter-network transfer, rather than passing through the user (which I believe is what would happen if it was sent as a standard HTTP form upload). I know I could set up FTP servers on each machine and attempt to FXP files between locations, but is this preferable to PHP executing a command on Server 1 (which will have global network access), to perform a cross-network transfer that way?
The second problem is that these are very large files we're talking about, at least a gigabyte or two each, and so transfers will not be instant. I need some method of polling the status of the transfer, and returning this to the web interface so that the user knows what is going on.
Alternatively this upload could be left to run asynchronously to the user's current view, but I would still need a method to check the status of the transfer to ensure it completes.
So, if using an FXP solution, how could polling be achieved? If using a file move/copy command from the shell, is any form of polling possible? PHP/JQuery solutions would be very acceptable.
My final part to this question relates to Windows network drive mapping. A user may map a drive, and select a file from, an arbitrarily specified mapped drive. Their G:\ may relate to \\server4\some\location\therein, but presumably any drive path given to the server via a web form will only send the G:\ file path. Is there a way to determine the 'real path' of mapped network drives?
Any solution would be used to stage files from Server 3 to Server 2 when the files are being worked on - the emphasis being on these giant files not having to pass through the user's local machine first.
Please let me know if you have comments and I will try to make this question more coherent if it is unclear.
As far as I’m aware (and I could be wrong) there is no standard way to determine the UNC path of a mapped drive from a browser.
The only way to do this would be to have some kind of control within the web page. It could be ActiveX or maybe Flash. I’ve seen ActiveX do this, but not Flash.
In the past when designing web based systems that need to know the UNC path of a user’s mapped drive I’ve had to have a translation of drive to UNC path stored server side. I did have a luxury though of knowing which drive would map to what UNC path. If the user can set arbitrary paths then this obviously won’t work.
Ok, as I’m procrastinating and avoiding real work I’ve given this some thought.
I’ll preface this by saying that I’m in no way a Linux expert and the system I’m about to describe has just been thought up off the top of my head and is not something you’d want to put into any kind of production. However, it might help you down the right path.
So, you have 3 servers, the Interface Server (LAMP stack I’m assuming?) your Staging Server and your File Store Server. You will also have Client Machines and Network Shares. For the purpose of this design your Network Shares are hosted on nix boxes that your File Store can scp from.
You’d create your frontend website that tracks and stores information about files etc. This will also hold the details about which files are being copied, which are in Staging and so on.
You’ll also need some kind of service running on the File Store Server. I’ll call this the File Copy Service. It will be responsible for copying the files from the servers hosting the network shares.
Now, you’ve still got an issue with how you figure out what path the user’s file is actually on. If you can stop users from mapping their own drives and force them to use consistent drive letters, then you could keep a translation of drive letter to UNC path on the server. If you can’t, well, I’ll let you figure that out. If you’re in a Windows domain you can force the drive mappings using Group Policies.
Anyway, the process for the system would work something like this.
User goes to system and selects a file
The Interface Server takes the file path and calls the File Copy Service on the File Store Server
The File Copy Service connects to the server that hosts the file and initiates the copy. If they’re all nix boxes you could easily use something like SCP. Now, I haven’t actually looked up how to do it, but I’d be very surprised if you can’t get a running percentage complete from SCP as it’s copying. With this running total, the File Copy Service would update the database on the Interface Server with how the copy is doing, so the user can see it from the Interface Server.
The File Copy Service can also be used to move files from the File Store to the staging server.
As I said, this is very roughly thought out. The above would work, but it all depends a lot on how your systems are set up.
Having said all that though, there must be software that would do this out there. Have you looked?
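One hedged way to get the copy-with-progress step: rsync (rather than plain scp) prints a per-file percentage with --progress, which a PHP daemon can parse. The host names, paths and callback below are placeholders, not a tested production design:

```php
<?php
// Rough sketch of the File Copy Service idea: run rsync over SSH and parse
// its --progress output. Host names and paths are placeholders.
function parseRsyncProgress(string $line): ?int
{
    // rsync --progress lines look like: "  1,234,567  42%  1.20MB/s  0:00:10"
    if (preg_match('/\s(\d{1,3})%\s/', $line, $m)) {
        return (int) $m[1];
    }
    return null;
}

function copyWithProgress(string $source, string $dest, callable $onProgress): void
{
    $cmd  = 'rsync --progress ' . escapeshellarg($source) . ' '
          . escapeshellarg($dest) . ' 2>&1';
    $pipe = popen($cmd, 'r');
    while (($line = fgets($pipe)) !== false) {
        $pct = parseRsyncProgress($line);
        if ($pct !== null) {
            $onProgress($pct); // e.g. write the percentage into the Interface Server's DB
        }
    }
    pclose($pipe);
}

// Example: copyWithProgress('user@share-host:/exports/bigfile.bin',
//                           '/filestore/incoming/', fn($p) => error_log("$p%"));
```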
If I am right, this is the architecture:
1.)
First let's solve the issue of "inter-server transfer".
I would solve this issue by mounting the file systems from Servers 2 and 3 on Server 1 via NFS.
https://help.ubuntu.com/8.04/serverguide/network-file-system.html
So PHP can store files directly on the file system and doesn't need to know which server the files really live on.
/etc/exports
of Server 2 + 3
/directory/with/files 192.168.IPofServer.1(rw,sync)
exportfs -ra
/etc/fstab
of Server 1
192.168.IPofServer.2:/directory/with/files /mnt/server2 nfs rsize=8192,wsize=8192,timeo=14,intr
192.168.IPofServer.3:/directory/with/files /mnt/server3 nfs rsize=8192,wsize=8192,timeo=14,intr
mount -a
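With the mounts in place, the upload handler on Server 1 just writes to a local path and the kernel ships the data across. A minimal sketch; the mount point and form field name ("upload") are placeholders:

```php
<?php
// Minimal sketch: store an upload straight onto the NFS-mounted file store.
// Mount point and form field name are placeholders.
function targetPath(string $mountPoint, string $clientName): string
{
    // basename() strips any directory parts a malicious client might send
    return rtrim($mountPoint, '/') . '/' . basename($clientName);
}

function handleUpload(string $mountPoint): bool
{
    $dest = targetPath($mountPoint, $_FILES['upload']['name']);
    return move_uploaded_file($_FILES['upload']['tmp_name'], $dest);
}

// In the receiving script: handleUpload('/mnt/server3');
```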
2.)
Get upload progress for really large files.
Here are some possibilities for showing a progress bar for HTTP uploads.
But for a resume function you would have to use a Flash plugin.
http://fineuploader.com/#demo
https://github.com/valums/file-uploader
Or you can build it yourself using the APC extension:
http://www.amwsites.com/blog/2011/01/use-a-combination-of-jquery-php-apc-uploadprogress-to-show-progress-bar-during-an-upload/
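The APC route works by enabling apc.rfc1867 in php.ini and putting a hidden APC_UPLOAD_PROGRESS field in the form; APC then caches the transfer state under a key you can poll. A sketch of the polling endpoint (the JSON shape is my own choice, not part of APC):

```php
<?php
// Sketch of a progress endpoint polled by jQuery. Assumes apc.rfc1867=1 and a
// hidden APC_UPLOAD_PROGRESS field in the upload form (APC's convention).
function percentDone(int $current, int $total): int
{
    return $total > 0 ? (int) round($current / $total * 100) : 0;
}

function uploadStatus(string $id): array
{
    $status = apc_fetch('upload_' . $id); // key prefix used by apc.rfc1867
    if ($status === false) {
        return ['percent' => 0];
    }
    return ['percent' => percentDone((int) $status['current'], (int) $status['total'])];
}

// progress.php: echo json_encode(uploadStatus($_GET['id'] ?? ''));
```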
3.)
Let the server load files from the network drive.
I would try this with a Java applet to figure out the real network path and send it to the server, so the server can fetch the file in the background.
But I have never done anything like this before and have no further information.
When uploading an image, PHP stores the temp image in a local directory on the server.
Is it possible to change this temp location so it's off the local server?
Reason: I'm using load balancing without sticky sessions and I don't want files to be uploaded to one server and then not be available on another server. Note: I don't necessarily complete the file upload and work on the file in one go.
Preferred temp location would be AWS S3 - also just interested to know if this is possible.
If its not possible I could make the file upload a complete process that also puts the finished file in the final location.
Just interested to know if the PHP temp image/file location can be off the local server?
Thank you.
You can mount the S3 bucket with s3fs on your instances behind the ELB, so that all your uploads are shared between application servers. As for /tmp, don't touch it; since the destination is S3 and it is shared, you don't have to worry.
If you have a lot of uploads, S3 might become a bottleneck. In this case, I suggest setting up a NAS. Personally, I use GlusterFS because it scales well and is very easy to set up. It has replication issues, but if you don't use replicated volumes at all, you are fine.
Other alternatives are Ceph, Sector/Sphere, XtreemFS, Tahoe-LAFS, POHMELFS and many others...
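A rough sketch of the s3fs mount itself; the bucket name, mount point and credentials below are placeholders:

```shell
# credentials file in the format ACCESS_KEY_ID:SECRET_ACCESS_KEY
echo 'AKIAXXXXXXXX:secretXXXXXXXX' > /etc/passwd-s3fs
chmod 600 /etc/passwd-s3fs

# mount the bucket; every app server behind the ELB runs the same mount,
# so an upload written on one server is visible on all of them
mkdir -p /var/www/uploads
s3fs my-upload-bucket /var/www/uploads -o passwd_file=/etc/passwd-s3fs -o allow_other
```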
You can directly upload a file from a client to S3 with some newer technologies as detailed in this post:
http://www.ioncannon.net/programming/1539/direct-browser-uploading-amazon-s3-cors-fileapi-xhr2-and-signed-puts/
Otherwise, I personally would suggest using each server's tmp folder for exactly that: temporary storage. After the file is on your server, you can always upload it to S3, which would then be accessible across all of your load-balanced servers.
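The linked post boils down to signing the PUT server-side and letting the browser upload straight to S3 via CORS. A hedged sketch of the signing step, using the older AWS signature version 2 scheme that the post describes; bucket, key and credentials are placeholders:

```php
<?php
// Sketch: generate a pre-signed S3 PUT URL (AWS signature version 2, as in
// the linked article). Bucket, object key and credentials are placeholders.
function s3PutSignature(string $secret, string $contentType, int $expires, string $resource): string
{
    $stringToSign = "PUT\n\n{$contentType}\n{$expires}\n{$resource}";
    return base64_encode(hash_hmac('sha1', $stringToSign, $secret, true));
}

function s3PutUrl(string $bucket, string $key, string $accessKey, string $secret, string $contentType): string
{
    $expires = time() + 600; // link valid for 10 minutes
    $sig     = s3PutSignature($secret, $contentType, $expires, "/{$bucket}/{$key}");
    return "https://{$bucket}.s3.amazonaws.com/{$key}"
         . "?AWSAccessKeyId={$accessKey}&Expires={$expires}&Signature=" . rawurlencode($sig);
}
```

The browser (via XHR2 / the File API) then PUTs the file body to the returned URL with the same Content-Type.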
I am planning to build a CMS system.
The CMS system will be located and fully administrated at:
www.mycompany.com/customer
The final website will be located at:
www.customer.com
I need to figure out the best way to copy files (eg. images)
from: www.mycompany.com/customer/media
to: www.customer.com/media
Note: The CMS and customer page will be located on different hosts. And I want to build this function using PHP.
Some thoughts:
The optimal solution would be if the two directories could be cloned automatically, no matter how the images are uploaded or updated. Maybe if there were a way to detect changes to www.mycompany.com/customer/media, then www.customer.com/media could be notified about it and send a request to update the image.
A wish would also be that images could only be accessed from www.mycompany.com/customer/media when logged in to the CMS.
Any tips?
You should not use FTP (insecure) or PHP for the replication; try rsync instead.
What is rsync ?
http://en.wikipedia.org/wiki/Rsync
rsync is a software application and network protocol for Unix-like and Windows systems which synchronizes files and directories from one location to another while minimizing data transfer using delta encoding when appropriate. An important feature of rsync not found in most similar programs/protocols is that the mirroring takes place with only one transmission in each direction. rsync can copy or display directory contents and copy files, optionally using compression and recursion.
In other words, it is designed for mirroring or replicating (it is an industry standard).
In general,
set up a public key to allow the source server to SSH into the destination server
set up a cronjob on the source server to run rsync
What does the cronjob do ?
In a nutshell, it should rsync the selected source directory to the destination server.
A quick example:
* * * * * rsync -avz /home/www.mycompany.com/www $HOST:/home/www.customer.com/www

Here /home/www.mycompany.com/www is the source directory on the source server, and $HOST:/home/www.customer.com/www is the destination server and directory.
However, rsync is too involved to describe in a few sentences; you can take a look at:
http://www.cyberciti.biz/tips/linux-use-rsync-transfer-mirror-files-directories.html (as a start)
Other possibilities are to make use of version control software, like:
git
svn
Or make use of a CDN (as #Amir Raminfar mentioned), which is itself already a complete solution for file distribution.
Just make an FTP or SCP call from mycompany.com to customer.com and upload the file.
Alternatively, you can use a CDN. It is probably a little too much work, but it will give you an optimal solution with great performance. You can upload to a common server and have customer.com pull the images from that server. This gives you the benefit of parallel downloads for the assets.
Finally, you can run some kind of web service on customer.com that can accept uploads.
I wouldn't do what you suggested (the notification system), because it won't be exactly instantaneous and some time will pass before the image actually appears.
You can create a DB for www.mycompany.com, but the host must allow remote access to the DB.
Create a crontab job for customer.com which checks for new records in the remote DB. This script could be restricted to registered users.
You should create an htaccess file for the www.mycompany.com images folder to allow access from the site itself and from customer.com to download (i.e. copy) images.
This way, any registered user can download any image, which may not be what you want.
If you need to limit access per image, then you should store the images in the DB and generate an encrypted URL for each image and request. In this case you would need to use the remote DB or a SOAP service to get the images.
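The per-image "encrypted URL" idea can be as simple as a signed, expiring link that customer.com verifies before serving the image. A sketch; the shared secret and URL layout are my own placeholders:

```php
<?php
// Sketch of signed, expiring image links shared between the CMS and the
// customer site. The secret and the /media/ URL layout are placeholders.
function signedImageUrl(string $image, int $expires, string $secret): string
{
    $sig = hash_hmac('sha256', $image . '|' . $expires, $secret);
    return '/media/' . rawurlencode($image) . '?expires=' . $expires . '&sig=' . $sig;
}

function verifySignature(string $image, int $expires, string $sig, string $secret): bool
{
    // reject expired links, then compare signatures in constant time
    return $expires > time()
        && hash_equals(hash_hmac('sha256', $image . '|' . $expires, $secret), $sig);
}
```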
I've hosted a site on a shared hosting server.
I've given permission 776 to a folder; is it possible for someone to upload a file to my server using move_upload_file from his home PC or his own server?
Edit
If I do not provide a front panel or some UI to the user, is it still possible to upload a file?
You use move_uploaded_file (note: upload*ed*) to move/rename files in your PHP scripts on your server. The special thing about move_uploaded_file vs. rename is that it will check whether the file was just uploaded in the same HTTP request. If it wasn't, it will fail with an error.
This is to prevent errors in your script or malicious users from tricking your server into moving any other sort of files around that you didn't intend to move. Using it you can be sure that you're only moving uploaded files out of the temp directory to some other destination.
That's all it does. It does not upload files to some other server. You cannot simply upload files to some other server without that server handling that upload somehow (like through a PHP script, FTP, SCP etc).
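For illustration, a minimal receiving script on your own server; the form field name "userfile" and the target folder are examples:

```php
<?php
// upload.php - runs on the server that receives the POSTed file.
function safeDestination(string $dir, string $clientName): string
{
    // never trust the client-supplied name; basename() drops directory parts
    return rtrim($dir, '/') . '/' . basename($clientName);
}

function receiveUpload(string $dir): string
{
    $tmp = $_FILES['userfile']['tmp_name'] ?? '';
    if (!is_uploaded_file($tmp)) {
        return 'not an uploaded file';   // move_uploaded_file would refuse too
    }
    $dest = safeDestination($dir, $_FILES['userfile']['name']);
    return move_uploaded_file($tmp, $dest) ? 'saved' : 'move failed';
}

// echo receiveUpload(__DIR__ . '/uploads');
```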
Not sure what you're asking exactly.
If you're asking whether you can make an HTML form and have someone hit it from their browser to upload: that depends on what user Apache runs as. You can make an HTML form, catch the upload with PHP, and use move_uploaded_file if whatever user Apache runs as can create a file in that directory.
If you're thinking someone can write a PHP script on another computer and use the function move_uploaded_file to push files to your server, then no, you definitely can't. That's not what that function does. I'd recommend using SCP for something like that.
No, if you do not provide a script which receives the file and moves it, some other user can't upload a file to your server.
All move_uploaded_file does is move a file from the temporary directory on the hard drive to a different location on the same hard drive. It cannot put files on someone else's computer.
Your question is equivalent to asking whether your next door neighbor can copy child pornography onto your home PC's hard drive over the internet. You should be happy that the answer is no.
I'm writing a PHP process that will run on a Unix machine that will need to monitor a remote SMB server and detect new files that are being uploaded to that box via FTP. It's unlikely I'll be able to
It will need to detect:
New files being created
File upload completing
Files being deleted
If it was an NFS share, I'd try using FAM to detect the events, but I can't see a way of doing anything equivalent?
Doesn't sound like something I would use in production. But you could try something like this:
mount the SMB share with Samba on the machine that is running the PHP daemon
use SPL's RecursiveIteratorIterator with DirectoryIterator to collect and maintain a list of all the files and folders on the shared drive
once in a while, refresh the folder list and compare it with the current state: if a file does not exist any more, you know it has been deleted; if there is a new file, put it in the queue and mark it as "being uploaded"
in the next "refresh run", check the queued files: if the file size did not change, the upload probably completed; if the file size changed, put it in the queue again and mark it as "being uploaded"
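The refresh-and-compare steps above could be sketched roughly like this, assuming the SMB share is already mounted at a local path:

```php
<?php
// Sketch of the snapshot-compare loop: map each file path to its size, then
// diff two consecutive snapshots taken by the polling daemon.
function snapshot(string $dir): array
{
    $sizes = [];
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        if ($file->isFile()) {
            $sizes[$file->getPathname()] = $file->getSize();
        }
    }
    return $sizes;
}

function diffSnapshots(array $old, array $new): array
{
    return [
        'created'  => array_keys(array_diff_key($new, $old)),
        'deleted'  => array_keys(array_diff_key($old, $new)),
        // same path, same size as the previous poll => upload probably done;
        // a real daemon would only report files previously marked "being uploaded"
        'complete' => array_keys(array_filter(
            $new,
            fn($size, $path) => isset($old[$path]) && $old[$path] === $size,
            ARRAY_FILTER_USE_BOTH
        )),
    ];
}

// Usage: $old = snapshot('/mnt/smbshare'); sleep(30);
//        $events = diffSnapshots($old, snapshot('/mnt/smbshare'));
```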