I would like to know if anyone has a simple PHP script which I can run as a cron job to open an XML file from a URL I have (on a different server) and re-save it in a directory on my own server.
The URL is an FTP URL with a username and password.
Easy:
file_put_contents('/path/to/new.xml', file_get_contents('ftp://user:password@remoteserver.com/file.xml'));
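For a cron job you may want a slightly more defensive version. A minimal sketch, assuming the same placeholder URL and destination path as above:

<?php
// fetch_xml.php -- run from cron, e.g.: */15 * * * * php /path/to/fetch_xml.php
$url  = 'ftp://user:password@remoteserver.com/file.xml'; // placeholder credentials
$dest = '/path/to/new.xml';

$data = file_get_contents($url);
if ($data === false) {
    error_log("fetch_xml: could not read $url");
    exit(1);
}
if (file_put_contents($dest, $data) === false) {
    error_log("fetch_xml: could not write $dest");
    exit(1);
}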
Is it possible to send a file from a local URL?
I need to upload files in PHP from a local path on the user's machine, e.g. I open the web page with:
www ... upload.php?url=c://...file.jpg
From the url GET parameter, I would read the file on the PC and upload it, without using HTML or anything else that requires choosing a file; just the file URL.
Important: it can only work this way. It will not be possible to choose the file during the upload; there will only be a POST or GET with the local URL.
I researched and found nothing related. If anyone can help, I'd appreciate it.
I think it is impossible for the server to get a client file using only the file path.
But maybe you can use JavaScript and FileSystemObject to read the file, turn it into a stream, and POST it to the server.
And you must know that FSO needs a high security permission and may be disabled in users' browsers.
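If you do manage to stream the file from the client, the server side is simple. A minimal sketch of a receiving endpoint, assuming a hypothetical receiver.php that takes the filename as a query parameter and reads the raw POST body:

<?php
// receiver.php -- hypothetical endpoint for the streamed upload described above.
$name = basename($_GET['name'] ?? 'upload.bin'); // strip path components from the name
$in   = fopen('php://input', 'rb');              // raw POST body
$out  = fopen(__DIR__ . '/uploads/' . $name, 'wb');
stream_copy_to_stream($in, $out);                // copy the stream to disk
fclose($in);
fclose($out);
echo 'OK';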
I'm trying to archive a big file using PHP and send it to the browser for download. The problem is the file is located on a remote machine and the only way to get it is via HTTP. So imagine this is my file: https://dropboxcontent.com/user333/3yjdsgf/video1.mp4
It's a direct link and I can download the file using wget, curl, anything. When a user wants to download it, I first fetch the file to the server, then zip it up and then send it to the user. Well, if the file is really large, the user has to sit there waiting for the server to download it before he sees the download dialog box in his browser. Is there a way for me to start the download of the file https://dropboxcontent.com/user333/3yjdsgf/video1.mp4 (let's say I'm downloading it into a local /tmp/video.mp4) and simultaneously start putting it into an archive and streaming it into the user's browser?
I'm using this library to zip it up: https://github.com/barracudanetworks/ArchiveStream-php, which works great, but the bottleneck is still fetching the file to the server's local filesystem.
Here is my code:
$f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4");
$zip->add_file('big/hello.mp4', $f);
The problem is that the line $f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4"); takes too long if the file is really big, and it buffers the entire file in memory before anything is sent to the user.
As suggested in the comments of the original post by Touch Cat Digital Inc, I found the answer here: https://stackoverflow.com/a/6914986/1927991
A chunked stream of the remote file was the answer. Very clever.
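For reference, the idea looks roughly like this. A minimal sketch, assuming ArchiveStream-php's chunked-transfer methods (init_file_stream_transfer / stream_file_part / complete_file_stream) and its ZipArchive class name; verify both against the library version you have installed:

<?php
require 'vendor/autoload.php'; // assumes ArchiveStream-php is installed via Composer

$zip = new \Barracuda\ArchiveStream\ZipArchive('videos.zip');

// Open the remote file as a stream instead of buffering it with file_get_contents().
$src = fopen('https://dropboxcontent.com/user333/3yjdsgf/video1.mp4', 'rb');

// Start the in-archive file, push it through in 1 MB chunks, then finish.
$zip->init_file_stream_transfer('big/hello.mp4', 0); // size may be unknown up front
while (!feof($src)) {
    $zip->stream_file_part(fread($src, 1024 * 1024));
}
$zip->complete_file_stream();

fclose($src);
$zip->finish();

This way the zip bytes reach the browser while the remote download is still in progress.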
I am currently trying to retrieve a file from an FTP server in order to make it available for the user to download. ftp_get() writes it to a path on the local machine, yes, but what I want is that it also shows up in the download history and counts as a "normal" download from the internet, and I haven't figured out how to do this yet. I also tried to link directly to the file with header("Location: ftp://username:password@ftp.server.com/myfile.file"), but this resulted in the browser showing the file's contents (which I didn't want). Did I miss any header parameters? Or is there a completely different way to do this?
You won't be able to "redirect" a user to a file so he can download it using FTP. That is an HTTP thing. Browsers provide FTP features and make them look like HTTP but, in fact, those are different things.
If this file is only accessible through FTP and it is on a remote server, the only way I can imagine to 'redirect' this download to the user is:
Download the file from the FTP server to your application server using PHP's FTP functions;
Send it to the user using PHP and appropriate file headers, something like this: https://stackoverflow.com/a/7263943/2802720
Hope it helps.
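A minimal sketch of both steps; the host, credentials, and file name are placeholders:

<?php
// Step 1: fetch the file from the FTP server to a temporary file.
$conn = ftp_connect('ftp.server.com');        // placeholder host
ftp_login($conn, 'username', 'password');     // placeholder credentials
ftp_pasv($conn, true);                        // passive mode is usually safer behind NAT
$tmp = tempnam(sys_get_temp_dir(), 'dl');
if (!ftp_get($conn, $tmp, 'myfile.file', FTP_BINARY)) {
    ftp_close($conn);
    die('FTP download failed');
}
ftp_close($conn);

// Step 2: send it with download headers so the browser saves the file
// instead of rendering its contents.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="myfile.file"');
header('Content-Length: ' . filesize($tmp));
readfile($tmp);
unlink($tmp);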
I have a URL that downloads a .csv file to my computer, like this:
http://oi62.tinypic.com/2nh0zzt.jpg
The URL is
api.infortisa.com/api/Tarifa/GetFile?user=xxxxxxxxxxxxxxxx
But what I want is to download that file to a folder on the server, or at least open its content so I can save it where I want.
(Uploading that file via FTP every time is a pain.)
I can't access the direct URL of the file, just what the API gives me, and that is just a direct download.
I tried curl and file_get_contents, but nothing seems to work.
How can I download that file in the server instead of my computer?
Any help would be appreciated.
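A minimal sketch using cURL, writing the response straight to a file on the server; the save path is an assumption, and the API URL is the placeholder from the question:

<?php
$url  = 'http://api.infortisa.com/api/Tarifa/GetFile?user=xxxxxxxxxxxxxxxx';
$dest = __DIR__ . '/downloads/tarifa.csv'; // hypothetical save location

$fp = fopen($dest, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body straight to disk
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // the API may redirect to the actual file
if (curl_exec($ch) === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
fclose($fp);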
I'm pretty new to the cURL library so I hope the solution to this isn't too trivial.
Basically I have a remote directory, let's say http://www.something.com/dir. In this directory the following files are present:
file1.xml
file2.xml
file3_active.xml
Is there a way I can get the files whose filename matches the phrase 'active' into a string? Would the solution work both for HTTP and FTP?
EDIT: How can I get the list/array of filenames in a remote dir? If I could do that, I could simply use strpos, get the full filename and use cURL the simple way.
Many Regards,
Andreas
How can I get the list/array of filenames in a remote dir?
Off the top of my head:
Via an FTP dir command, if you have FTP access (see the sketch after this list).
Via a custom PHP (or whatever) script on the remote server which generates a machine-parsable list for you.
Via a shell_exec/popen/ssh2_exec to a shell command like ls or find, run through SSH.
By parsing HTML from a web-server generated directory listing (i.e., as generated by Apache mod_autoindex) on the remote server.
Each of these options is going to require some action on the part of the person hosting the remote server -- so if it's completely out of your control, I think you're SOL.
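For the FTP option, a minimal sketch; the host and credentials are placeholders. It lists the directory with ftp_nlist(), filters names containing 'active', and reads each match into a string:

<?php
$conn = ftp_connect('www.something.com');  // placeholder host
ftp_login($conn, 'user', 'password');      // placeholder credentials
ftp_pasv($conn, true);

$contents = array();
foreach (ftp_nlist($conn, '/dir') as $name) {
    $base = basename($name); // some servers return full paths, others bare names
    if (strpos($base, 'active') !== false) {
        // Read the matching file over the ftp:// stream wrapper.
        $contents[$base] = file_get_contents('ftp://user:password@www.something.com/dir/' . $base);
    }
}
ftp_close($conn);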