Save XML file to remote server - php

I can't figure out how to save an XML file to a remote web server using PHP DOM.
I tried $dom->save('http://www.somedomain.com/file.xml'); but without success. I've set the permissions of 'file.xml' on the remote server to 777, but that doesn't help.

You can't write to a URL unless the server supports WebDAV or the like. You'd need a script on the server which accepts HTTP file uploads and processes them from there.
As to why this doesn't work, imagine how fun the web would be if anyone could save files to whatever URL they wanted.
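A rough sketch of that approach, assuming a small receiver script on the remote host: the URL, field names and 'shared-secret' token below are made up for illustration, and the XML is sent as an ordinary POST field rather than a multipart file upload.

    <?php
    // Local side: serialize the DOM to a string instead of calling $dom->save() on a URL.
    $dom = new DOMDocument('1.0', 'UTF-8');
    $dom->appendChild($dom->createElement('root', 'hello'));
    $xml = $dom->saveXML();

    $ch = curl_init('http://www.somedomain.com/receive.php');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => ['xml' => $xml, 'token' => 'shared-secret'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $result = curl_exec($ch);
    curl_close($ch);

    // Remote side (receive.php), in essence: check the token, then write the file.
    //   if (($_POST['token'] ?? '') === 'shared-secret') {
    //       file_put_contents(__DIR__ . '/file.xml', $_POST['xml']);
    //   }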

Related

Best way to "pipe" file contents from a remote server, via a 2nd server, output to browser

I have numerous storage servers, and then more "cache" servers which are used to load balance downloads. At the moment I use RSYNC to copy the most popular files from the storage boxes to the cache boxes, then update the DB with the new server IDs, so my script can route the download requests to a random box which has the file.
I'm now looking at better ways to distribute the content, and wondering if it's possible to route requests to any box at random and have the download script check whether the file exists locally. If it doesn't, it would "get" the file contents from the remote storage box and output them in real time to the browser, whilst keeping the file on the cache box, so that the next time the same request is made it can just serve the local copy rather than connecting to the storage box again.
Hope that makes sense(!)
I've been playing around with RSYNC, wget and cURL commands, but I'm struggling to find a way to output the data to the browser as it comes in.
I've also been reading up on reverse proxies with nginx, which sounds like the right route... but it still sounds like they require the entire file to be downloaded from the origin server to the cache server before anything can be output to the client(?). Some of my files are 100GB+ and each server has a 1Gbps bandwidth limit, so at best it would take 100s of seconds to download a file of that size to the cache server before the client sees any data at all. There must be a way to "pipe" the data to the client as it streams?
Is what I'm trying to achieve possible?
You can pipe data without downloading the full file by using streams. One example of downloading a file as a stream is the Guzzle sink feature; one example of serving a file to the client as a stream is the Symfony StreamedResponse. Using those, the following can be done:
Server A has a file the user wants
Server B gets the user request for the file
Server B uses Guzzle to setup a download stream to server A
Server B outputs the StreamedResponse directly to the user
Doing so will serve the download in real time without having to wait for the entire file to be finished. However, I do not know if you can stream to the user and store the file on disk at the same time. There's a stream_copy_to_stream function in PHP which might allow this, but I don't know that for sure.
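A rough sketch of that flow, assuming Guzzle and Symfony HttpFoundation are installed via Composer; the storage host, file name and chunk size are made up, and the sketch uses Guzzle's 'stream' option (rather than a sink) so the body can be read incrementally:

    <?php
    require 'vendor/autoload.php';

    use GuzzleHttp\Client;
    use Symfony\Component\HttpFoundation\StreamedResponse;

    // Server B: open a streaming download from Server A.
    // 'stream' => true keeps Guzzle from buffering the whole body.
    $client = new Client();
    $origin = $client->request('GET', 'http://storage-a.example.com/big-file.bin', [
        'stream' => true,
    ]);
    $body = $origin->getBody();

    // Relay the body to the user chunk by chunk as it arrives.
    $response = new StreamedResponse(function () use ($body) {
        while (!$body->eof()) {
            echo $body->read(8192);
            flush();
        }
    });
    $response->headers->set('Content-Type', 'application/octet-stream');
    $response->send();

As for keeping a copy on the cache box, each chunk read inside the loop could also be written to a local file handle with fwrite() before being echoed, giving the cache its copy in the same pass; that part is an untested suggestion, not something from the answer above.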

Big XML files over FTP

On my FTP server there is a 4 GB XML file, and I want to put data from that file into a database using PHP.
I know how to connect to FTP and do basic operations using PHP, but my question is: is there a possibility to do this without having to download the file first?
Unfortunately no, you cannot "stream" a file over FTP the way you could on, say, a network drive. It's not possible to open that file without downloading it locally first.
That is assuming you can only access the file via FTP.
If your FTP server and PHP server are one and the same, you just need to change the path to reference the FTP location rather than wherever you are downloading to.
If they are on the same local network, you may be able to use a network path to reach the file.
Otherwise, you will indeed need to transfer the entire file first by downloading it.
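Once the file is reachable through a local path (after the download, or directly if the FTP and PHP servers are the same box), a memory-friendly way to push it into the database is a pull parser such as XMLReader. A rough sketch; the <record> element, column and table names are assumptions for illustration:

    <?php
    // Walk the 4 GB file node by node instead of loading it all into memory.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO items (name) VALUES (:name)');

    $reader = new XMLReader();
    $reader->open('/path/to/file.xml');

    while ($reader->read()) {
        if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'record') {
            // Materialize just this one record and insert it.
            $record = simplexml_load_string($reader->readOuterXml());
            $stmt->execute([':name' => (string) $record->name]);
        }
    }

    $reader->close();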

Upload big files with PHP and FTP

I need to write a script to upload big Files (~2GB+) to a server.
I don't think HTTP is the right way to do this, so I want to use (S)FTP.
There are several tutorials about this (using cURL or ftp_connect) and I understand that I have to set several things in php.ini.
But all these tutorials upload the file to a remote server; what I want to do is upload it to the server the script is running on, without having to upload the file to the server over HTTP first.
Is this possible? If so, how would I do that?
HTTP can be the right way to upload large files. You can use resumable.js or a similar library to split the file into "chunks" and then reassemble the file on the server.
If you decide not to go with HTTP and have shell access, I recommend you use rsync (with the --partial flag), which will do the heavy lifting for you.
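On the reassembly side, a server-side chunk handler might look roughly like this. The request fields (upload_id, chunk_index, total_chunks), the upload field name 'chunk' and the target directory are made-up names; a real library such as resumable.js uses its own parameter names.

    <?php
    // Sanitise the client-supplied identifier before using it in a path.
    $uploadId    = preg_replace('/[^A-Za-z0-9_-]/', '', $_POST['upload_id']);
    $chunkIndex  = (int) $_POST['chunk_index'];
    $totalChunks = (int) $_POST['total_chunks'];

    $chunkDir = sys_get_temp_dir() . '/chunks_' . $uploadId;
    if (!is_dir($chunkDir)) {
        mkdir($chunkDir, 0700, true);
    }

    // Store this chunk under its index (client sends indexes 0..total-1).
    move_uploaded_file($_FILES['chunk']['tmp_name'], $chunkDir . '/' . $chunkIndex);

    // Once every chunk has arrived, stitch them back together in order.
    if (count(glob($chunkDir . '/*')) === $totalChunks) {
        $out = fopen('/var/uploads/' . $uploadId . '.bin', 'wb');
        for ($i = 0; $i < $totalChunks; $i++) {
            $in = fopen($chunkDir . '/' . $i, 'rb');
            stream_copy_to_stream($in, $out);
            fclose($in);
            unlink($chunkDir . '/' . $i);
        }
        fclose($out);
        rmdir($chunkDir);
    }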

CURL + $GLOBALS["HTTP_RAW_POST_DATA"] in PHP

I'm using CURL to upload files to a service.
Currently I'm getting the file content with $GLOBALS["HTTP_RAW_POST_DATA"] and then saving it on my server.
After that, I'm using CURLOPT_POSTFIELDS with the file's full path.
Is there a way to send the file content directly, without saving it on my server, as if I saved it?
Or is there a way to upload a photo from a Flash app to a Facebook album without saving it on the server?
Thanks
If you are uploading data you might consider using the file upload mechanism in PHP (http://php.net/manual/en/features.file-upload.php). It handles file uploads automatically.
If you want to redirect the upload to another (third-party) service without needing to be in the chain of commands (i.e. user -> 3rd party server), you might want to look into AJAX. AFAIK, when you upload a file using PHP/forms, the file will be uploaded to your PHP temp directory, and there is no way to prevent this because:
1. To access the file, it needs to be on the server (PHP executes on the server, meaning it cannot execute on the user's side)
2. I do not believe any user would want you to access files on their computer, nor would you be able to do so (firewall, AV); if that were possible it would be a major security issue
As I said above, what you want to look into is AJAX (I used jQuery, and its AJAX methods are very simple). Because AJAX is JavaScript executed on the user's side, it can run on their machine and initiate a connection to any URL. This way you can access the service directly without submitting the file to your server.
Here is an example AJAX uploader (you can Google for more):
http://valums.com/ajax-upload/
Hope this helps
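On the cURL side of the original question: if the target service accepts a raw request body rather than a multipart file field, the content can be forwarded without ever writing it to disk. A rough sketch; the URL and content type are assumptions, and the body is buffered in memory (not on disk):

    <?php
    // php://input is the modern replacement for $HTTP_RAW_POST_DATA.
    $raw = file_get_contents('php://input');

    $ch = curl_init('https://api.example.com/upload');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $raw,
        CURLOPT_HTTPHEADER     => ['Content-Type: application/octet-stream'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $result = curl_exec($ch);
    curl_close($ch);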

PHP Server to Server Transfers

We are storing a final image, back and front proofs, and a thumbnail on one server, but what I need is this: when the user goes to look at their images, the image needs to be transferred from the image server onto our web server.
I have tried opening the server connection and then transferring the image from where it is to where the .php file that opened the connection is, but I just kept getting errors saying that the stream could not be opened because the directory didn't exist.
I have also tried opening two connections (one for each server) and using get and put to move the file from server to server.
These images can't be stored on an intermediate PC.
Any help or advice on how to do this?
You could try transferring the image directly to the user through an Apache mod_proxy connection. I'm sure there are ways to work out user permissions between the servers.
What is the purpose of the image server you're using? Are you sure you need to transfer the image across to your other server rather than just having the HTML the web server is producing link to the image server?
Other than that, you might want to look into using NFS rather than programmatically transferring the files yourself every time they're requested (assuming you're using some form of *nix on both machines and can create NFS shares).
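If NFS or mod_proxy isn't an option, a thin PHP pass-through on the web server can avoid copying the file across first. A rough sketch, assuming allow_url_fopen is enabled and the image server is reachable over HTTP; the host, path and content type are made up:

    <?php
    // Read the image from the image server and stream it straight to the browser.
    $remote = 'http://images.internal.example.com/proofs/front_12345.jpg';

    $in = @fopen($remote, 'rb');
    if ($in === false) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: image/jpeg');
    fpassthru($in); // send the remote stream to the client without a temp copy
    fclose($in);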
