How can I create a directory with a PHP script, containing some files, that can be accessed over FTP (with a username and password) but not over HTTP (e.g. by a program such as wget)?
Actually I have multiple users uploading files to the same server (using a file-upload form on a site), and a server-side PHP script creates a separate directory for each user's files. Now I want those files to be downloadable over FTP from a client machine running wget.
Please tell me how I can go about it.
Also, if anyone has a better way of doing it, please suggest.
PHP and FTP are two completely separate things. If you want the files to be accessible via FTP, you need to set up an FTP server on the machine hosting the files.
You can use the Pure-FTPd server. It can authenticate FTP users against rows in a database table, so your PHP script only needs to create the account by writing it into that table.
The URL is http://www.pureftpd.org/project/pure-ftpd
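As a rough sketch, assuming Pure-FTPd is configured with its MySQL backend, your upload script could create the directory and insert the account in one go. The table and column names below must match the queries in your pureftpd-mysql.conf, and all the credentials and paths are placeholders:

<?php
// Hypothetical sketch: create the upload directory and register a
// Pure-FTPd virtual user in MySQL.
$user = 'customer1';
$pass = 'secret';
$dir  = '/home/ftp/' . $user;

// Create the per-user directory the uploads will land in.
if (!is_dir($dir) && !mkdir($dir, 0755, true)) {
    die("Could not create $dir");
}

$pdo = new PDO('mysql:host=localhost;dbname=pureftpd', 'dbuser', 'dbpass');
$stmt = $pdo->prepare(
    'INSERT INTO users (User, Password, Dir, Uid, Gid) VALUES (?, ?, ?, ?, ?)'
);
// Assumes MySQLCrypt is set to "md5" in pureftpd-mysql.conf, and that
// Pure-FTPd maps virtual users to UID/GID 1000.
$stmt->execute([$user, md5($pass), $dir, 1000, 1000]);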
I'm trying to figure out how I could delete a file from a different URL.
The situation is similar to the one below.
E.g.: abc.com/admin/xyz-admin updates the content of xyz.com.
The website xyz.com has folders containing images, and I want to delete files from them via abc.com/admin/xyz-admin. I'm using Remote MySQL to connect to the database.
Thanks!
You need to use FTP (or SFTP) or some protocol like that to log on to xyz.com and delete the files 'manually'.
See the PHP documentation on FTP.
Note that you would need an FTP server installed on xyz.com, and FTP is not a secure protocol.
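A minimal sketch of the FTP approach using PHP's built-in FTP extension (the host, credentials, and file path here are placeholders):

<?php
// Connect to the FTP server running on xyz.com.
$conn = ftp_connect('xyz.com') or die('Could not connect');

if (!ftp_login($conn, 'ftpuser', 'ftppass')) {
    ftp_close($conn);
    die('Login failed');
}

// Delete the remote image; the path is relative to the FTP root.
if (ftp_delete($conn, 'images/old-photo.jpg')) {
    echo "File deleted\n";
} else {
    echo "Could not delete file\n";
}

ftp_close($conn);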
You could also use WebDAV, for which there are PHP libraries such as https://sabre.io/dav/davclient/. Again, you would need a WebDAV server on xyz.com.
Are there any possible solutions in PHP to create a file or directory on the client machine?
Note:
Downloading the file is not a solution for me. I want it so that once the user accesses the website and executes some function, a file or directory is created on the client machine.
Simply put, you can't do that in PHP or any other server-side language.
The reason is simple: server-side applications and scripts only have access to local resources on the machine where they run. So when you run your application on your local computer, everything works as you wish, but because of how HTTP works, and for safety reasons, you cannot access a user's local files.
No, it's not possible, except via Flash, Java applets (not sure), or Microsoft Silverlight, and even for those the user would have to grant permission. But I don't think it is a good idea to store files on the client machine; try another approach :)
I'm not sure how common my request is but here goes:
I have a client who wants to be able to receive files of up to 2GB in size from their customers. The reason why the files are kind of big is because they are graphic design files. Ideally, they would have their customers transfer their files via FTP using an FTP client like Filezilla. However my client has stated that they spend way too much time trying to school people on how to enter FTP credentials into an FTP program and the concept of FTP in general.
So ultimately my client has stated that they would like the customer to be able to use a web interface, which they're already familiar with, to be able to accomplish the same task. For example, they'd like the customer to be able to use a form and hit an upload file button. Simple as that.
At this point I should state that I'm working on a WordPress site on top of a Rackspace Cloud Sites server (shared hosting).
I'm using a WordPress plugin that allows me to do this, and it works for small files but not for files approaching 500MB. After speaking to an RS Cloud tech support person, I've narrowed it down to the temporary directory /tmp/ on the Apache server: the server is not able to write large files into that temporary directory because of server-wide restrictions, which they cannot change to accommodate my needs. They said I would need a dedicated server for that, which is not an option for me.
After doing some thinking though, I've come to the conclusion that it's silly for me to have to upload a file to the server's temporary directory, only to move the file to the ftp server. So this brings me to my question: is it possible for a web based PHP script to send the file directly from the user's machine, bypass the web server, and send it directly to the FTP server?
If not, do you have any other suggestions? Any help would be appreciated. Thanks.
No, it's not possible at all.
My suggestion? Search and learn how to use HTML5 upload for large files.
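For what it's worth, the server-side half of a chunked HTML5 upload stays within small temporary-file limits, because each request only carries one small chunk. A hypothetical sketch (the endpoint name, field names, and paths are all assumptions):

<?php
// upload-chunk.php - hypothetical receiving end of a chunked HTML5
// upload. The browser slices the file with File.slice() and POSTs one
// small chunk at a time, so /tmp/ only ever holds one chunk.
$name  = basename($_POST['filename']);  // strip any path components
$chunk = (int) $_POST['chunk'];         // 0-based chunk index
$dest  = '/var/uploads/' . $name;

// The first chunk truncates the destination file; later chunks append.
$out = fopen($dest, $chunk === 0 ? 'wb' : 'ab');
fwrite($out, file_get_contents($_FILES['data']['tmp_name']));
fclose($out);

echo 'ok';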
It seems like someone found a solution to your problem. Please refer to this question:
Stream FTP download to output
The question was how to stream/pipe a file from FTP through HTTP to the user's browser.
It seems like the answer from @strager is what you need.
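In that spirit, a hedged sketch using PHP's ftp:// stream wrapper (the URL, credentials, and filename are placeholders, and allow_url_fopen must be enabled):

<?php
// Stream a file from the FTP server straight to the browser without
// buffering the whole thing in memory.
$remote = 'ftp://user:pass@ftp.example.com/path/to/file.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.zip"');

// readfile() copies the stream to the output buffer in chunks.
readfile($remote);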
I have a very hard mission to accomplish.
We have a dedicated server outside our company and we also have two servers inside our company.
We need to copy files from server 1 to server 2 programmatically via PHP; the files live in /home/server1/files and should go to /home/server2/files.
When the users select 10 files to group according to some criteria, these files must be sent to another server.
We were using copy when we had only one server, and everything was okay...
but now the system is broken, because we have two servers...
When using one server I could use this:
copy('/home/server/files/file.txt', '/home/server/files/group-1/file.txt');
now it has to be:
copy('/home/server1/files/file.txt', '/home/server2/files/group-1/file.txt');
But I don't know how to send files between servers.
There are quite a few different ways to copy files between servers. I can think of the following:
Send over SSH. You could use PHP's SSH2 extension (see the sketch after this list).
Send via FTP or SFTP. Plain FTP requires an FTP server to be set up on server2 (SFTP just needs SSH), and the code would have to be changed to use it.
Copy via NFS. Requires NFS to be set up; once it is, you could mount server2's filesystem at /home/server2 and hopefully not need to make any programming changes.
Send via a web service call (REST or SOAP). Requires code to be set up on server2 to listen for incoming files.
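Here's a rough sketch of the SSH option using PHP's SSH2 extension (pecl install ssh2); the host, credentials, and paths are placeholders:

<?php
// Open an SSH connection to server2 and authenticate.
$conn = ssh2_connect('server2.example.com', 22);

if (!ssh2_auth_password($conn, 'deploy', 'secret')) {
    die('Authentication failed');
}

// Copy the local file to server2, creating it with mode 0644.
if (ssh2_scp_send($conn,
        '/home/server1/files/file.txt',
        '/home/server2/files/group-1/file.txt',
        0644)) {
    echo "File copied\n";
}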
Say I use PHP's copy() to fetch a file from a site.
If a user is behind a proxy filtering service that has that site blocked, will my site still be able to copy that file for the user?
If not, would a cron job be able to?
And is it the same for file_get_contents()?
Cheers
PHP runs on the server, so it is the web server, not the user's machine, that requests the remote file.
So yes, you can copy a file from a server which a user themselves might not be able to reach directly.
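To illustrate (the URL and paths are placeholders; both calls need allow_url_fopen enabled in php.ini):

<?php
// Both of these run on the web server, so the request to example.com
// comes from the server's network, not from the visitor's machine.
copy('http://example.com/file.pdf', '/tmp/file.pdf');

// file_get_contents() behaves the same way for remote URLs.
$data = file_get_contents('http://example.com/file.pdf');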