On my FTP server, there is a 4 GB XML file, and I want to load the data from that file into a database using PHP.
I know how to connect to FTP and perform basic operations using PHP, but my question is: is it possible to do this without having to download the file first?
Unfortunately no, you cannot "stream" a file over FTP the way you could on, say, a network drive. It's not possible to open that file without first downloading it locally.
This assumes FTP is your only means of accessing the file.
If your FTP server and PHP server are one and the same, you just need to change the path to reference the file's actual location rather than wherever you were downloading it to.
If they are on the same local network, you may be able to use a network path to reach the file.
Otherwise, you will indeed need to download the entire file first.
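Since the whole file has to come down anyway, the practical concern becomes memory: a 4 GB document should be parsed as a stream, not loaded with DOM. A minimal sketch using ftp_get plus XMLReader — the host, credentials, paths, and the `<item>` record element are placeholders, not details from the question:

```php
<?php
// Download the file over FTP, then stream-parse it so the whole
// 4 GB document is never held in memory at once.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);
ftp_get($conn, '/tmp/huge.xml', '/remote/huge.xml', FTP_BINARY);
ftp_close($conn);

$reader = new XMLReader();
$reader->open('/tmp/huge.xml');
while ($reader->read()) {
    // Handle one record element at a time (assumed here to be <item>).
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        $node = simplexml_load_string($reader->readOuterXML());
        // ... insert $node's fields into the database here ...
    }
}
$reader->close();
```

XMLReader keeps only the current node in memory, which is what makes this workable for files far larger than PHP's memory limit.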
I have logic in PHP to upload images to S3. I need to upload the same set of images to an SFTP server as well. As I see it, there are two options: the first is to upload the images from my local server to the SFTP server at the same time as I am uploading them to S3; the other is to write a script that transfers the images from S3 to the SFTP server. I need the same set of images on both the SFTP server and S3.
Of the two approaches, which one is optimal? Is there any other way to approach my requirement? Is there a sample PHP script for local-to-SFTP file transfer? If so, please provide the code.
I cannot say for sure which one is optimal, but I can definitely see a potential issue with option #1. If you perform the second upload (i.e. from your "local" server to the SFTP server) during the first upload, you make PHP wait on that operation before returning the response to the client. This could cause some unnecessary hanging for the user agent connecting to the local server.
I would explore option #2 first. If possible, look into SSHFS. This is a way to mount a remote filesystem over SSH; it uses SFTP to transfer the files. With this approach, all you have to do is write the file once to the local server's filesystem and then again to the mounted remote filesystem, and SSHFS takes care of the transfer for you.
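If mounting isn't an option, the local-to-SFTP script the questioner asked about can be sketched with the phpseclib library (composer require phpseclib/phpseclib). The host, credentials, and paths below are placeholders:

```php
<?php
require 'vendor/autoload.php';

use phpseclib3\Net\SFTP;

// Connect and authenticate to the SFTP server (placeholder credentials).
$sftp = new SFTP('sftp.example.com');
if (!$sftp->login('user', 'password')) {
    exit('SFTP login failed');
}

// Upload a local file; SOURCE_LOCAL_FILE tells put() the second
// argument is a path on disk rather than literal string data.
$sftp->put('/remote/images/photo.jpg', '/local/images/photo.jpg', SFTP::SOURCE_LOCAL_FILE);
```

phpseclib is pure PHP, so it works even where the ssh2 PECL extension isn't installed.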
If I do
$file = 'http://notmywebsite.com/verybigimage.png';
$newfile = 'test.png';
copy($file, $newfile);
will my server download http://notmywebsite.com/verybigimage.png?
If yes, how can I make my user download http://notmywebsite.com/verybigimage.png without my server downloading it?
Yes, copy() will make your server download the file. It's not possible to have the client (the user) download a file and then copy it to the server seamlessly, because the two environments live on different systems. To do that, the client would have to upload the file back to the server after downloading it, which would be a pain for the user and slower than simply letting the server download the file and copy it there, if the ultimate goal is to have a copy of the file on the server.
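If the goal is only for the user to receive the file, and the server does not need its own copy, a redirect avoids the server ever touching the bytes — a minimal sketch:

```php
<?php
// Send the user's browser straight to the remote file; the browser
// downloads it directly and the PHP server never transfers it.
header('Location: http://notmywebsite.com/verybigimage.png');
exit;
```

The trade-off is that the remote URL becomes visible to the user, which may or may not matter for your use case.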
I need to write a script to upload big Files (~2GB+) to a server.
I don't think HTTP is the right way to do this so I want to use (S)FTP.
There are several tutorials on this (using cURL or ftp_connect), and I understand that I have to set several things in php.ini.
But all these tutorials upload the file to a remote server; what I want to do is upload it to the server the script is running on, without having to upload the file to that server over HTTP first.
Is this possible? If so, how would I do that?
HTTP can be the right way to upload large files. You can use resumable.js or a similar library to split the file into "chunks" and then reassemble the file on the server.
If you decide not to go with HTTP and have shell access, I recommend rsync (with the --partial flag), which will do the heavy lifting for you.
I can't figure out how to save an XML file to a remote web server using PHP DOM.
I tried $dom->save('http://www.somedomain.com/file.xml'); but without success. I've set the permissions of file.xml on the remote server to 777, but that doesn't help.
You can't write to a URL, unless the server supports WebDAV or the like. You'd need a script on the server which accepts http file uploads and processes them from there.
As to why this doesn't work, imagine how fun the web would be if anyone could save files to whatever URL they wanted.
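A sketch of the receiving-script approach, in two parts. The receiver.php name and the target URL are illustrative, and a real receiver must authenticate the request and validate the payload before writing anything:

```php
<?php
// receiver.php, deployed on the remote server: accepts a raw POST body
// and writes it to file.xml. Unauthenticated as written -- sketch only.
file_put_contents('file.xml', file_get_contents('php://input'));
```

```php
<?php
// On the local side: serialize the DOM to a string and POST it.
$xml = $dom->saveXML();

$ch = curl_init('http://www.somedomain.com/receiver.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
```

Note that the 777 permissions on file.xml are irrelevant here: the HTTP GET/PUT semantics, not filesystem permissions, are what stopped $dom->save() from working.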
We are storing a final image, back and front proofs, and a thumbnail on one server, but when a user goes to look at their images, the image needs to be transferred from the image server to our web server.
I have tried opening the server connection and transferring the image from where it is to where the .php file that opened the connection is, but I just kept getting errors saying that the stream could not be opened because the directory didn't exist.
I have also tried opening two connections (one to each server) and using get and put to move the image from server to server.
These images can't be stored on an intermediate PC.
Any help or advice on how to do this?
You could try transferring the image directly to the user through an Apache mod_proxy connection. I'm sure there are ways to work out user permissions between the servers.
What is the purpose of the image server you're using? Are you sure you need to transfer the image across to your other server rather than just having the HTML the web server is producing link to the image server?
Other than that, you might want to look into using NFS rather than programmatically transferring the files yourself every time they're requested (assuming you're using some form of *nix on both machines and can create NFS shares).
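If a proxy module or NFS share isn't available, a PHP pass-through on the web server can stream the image to the user without storing it. The internal host name and path scheme below are placeholders, and a real version must restrict which paths can be requested:

```php
<?php
// image.php: fetch the image from the image server over HTTP and
// stream it to the user; nothing is written to the web server's disk.
$name = basename($_GET['img']);  // basename() blocks path traversal
$url  = 'http://imageserver.internal/proofs/' . $name;

header('Content-Type: image/png');
readfile($url);                  // requires allow_url_fopen in php.ini
```

This keeps the image server off the public internet while still serving its files, at the cost of routing every image request through PHP.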