Some log files are saved on a cloud server, and they are exposed through a separate load balancer portal for download. If I browse the load balancer URL I can see the files available for download, and clicking a file downloads it. How can these files be downloaded with Linux commands / PHP scripts?
In a Linux terminal you can use the wget command to download a file from a specific URL.
In a PHP script you can use file_get_contents() and file_put_contents() to download the file.
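For example, a minimal PHP sketch (the URL and local path below are placeholders, not your actual setup):

$url = 'https://loadbalancer.example.com/logs/app.log'; // hypothetical file URL on the load balancer
$data = file_get_contents($url);                        // fetch the file over HTTP
if ($data !== false) {
    file_put_contents('/tmp/app.log', $data);           // save it locally
}

From the terminal, the equivalent would simply be wget with the same URL.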
I hope this answer is helpful to you.
I have a Laravel App that can upload file to a folder.
The folder is a symlink to /media/folder_name, that is a mount point of a Windows share (mounted via fstab).
Upload and Download works.
The download action returns a copy of the file:
return response()->file($path_file);
I want the download action to give me the original file (saved in the Windows folder, like: //192.168.1.2/share/folder/file.pdf) instead of a copy.
Is that possible?
After some research: direct links to local resources are blocked in Firefox and Chrome for security reasons.
So the solution is to install the Local Filesystem Links extension for Firefox. In Laravel, return a blank page containing a link to the resource, like: file:////192.168.1.2/folder/file.pdf.
Finally, when you click the PDF link, the original resource is opened with the default program (Adobe Reader, for example).
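A minimal sketch of that idea, assuming a Blade view named local-link (both the view name and the share path are placeholders):

// routes/web.php -- sketch only
Route::get('/open/{file}', function ($file) {
    // Build the file:// link to the Windows share; the path here is an assumption
    $link = 'file:////192.168.1.2/share/folder/' . $file;

    // local-link.blade.php would contain something like:
    //   <a href="{{ $link }}">Open original file</a>
    return view('local-link', ['link' => $link]);
});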
I tried downloading a directory using wget, and it seems to have worked properly. However, all the PHP files are empty. I know they should not be empty, because they show a file size in the web directory listing.
The folder I am trying to download is here:
https://www.isuperman.tw/wp-content/plugins/automatewoo-referrals/
I used these directions to download recursively with wget.
How to download HTTP directory with all files and sub-directories as they appear on the online files/folders list?
Any ideas on why they are downloading blank/empty?
view-source:https://www.isuperman.tw/wp-content/plugins/automatewoo-referrals/automatewoo-referrals.php
Nope, the files really are empty; it doesn't matter what the content of the files is on the server. If you use wget to download a file, wget acts like a browser and gets the parsed PHP output from the server...
and that output appears to be empty.
If you want to download the files with PHP, use FTP, or the server must be configured not to parse these files and to deliver their raw content instead.
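A minimal sketch of the FTP route, assuming you have FTP credentials for the server (the host, user, password, and remote path below are all placeholders):

// Hypothetical credentials -- replace with your own
$conn = ftp_connect('ftp.example.com');
if ($conn && ftp_login($conn, 'username', 'password')) {
    ftp_pasv($conn, true); // passive mode is often needed behind NAT/firewalls
    // Fetch the raw .php source instead of the parsed HTTP output
    ftp_get($conn, '/tmp/automatewoo-referrals.php',
            '/wp-content/plugins/automatewoo-referrals/automatewoo-referrals.php',
            FTP_BINARY);
    ftp_close($conn);
}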
I'm trying to archive a big file using PHP and send it to the browser for download. The problem is the file is located on a remote machine and the only way to get it is via HTTP. So imagine this is my file: https://dropboxcontent.com/user333/3yjdsgf/video1.mp4
It's a direct link, and I can download the file using wget, curl, or anything else. When a user wants to download it, I first fetch the file to the server, then zip it up, and then send it to the user. Well, if the file is really large, the user has to sit there waiting for the server to download it before the download dialog box appears in the browser. Is there a way for me to start downloading the file https://dropboxcontent.com/user333/3yjdsgf/video1.mp4 (let's say into a local /tmp/video.mp4) and simultaneously start putting it into an archive and streaming it to the user's browser?
I'm using this library to zip it up: https://github.com/barracudanetworks/ArchiveStream-php, which works great, but the bottleneck is still fetching the file to the server's local filesystem.
Here is my code:
$f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4");
$zip->add_file('big/hello.mp4', $f);
The problem is that the line $f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4"); takes too long if the file is really big.
As suggested in the comments of the original post by Touch Cat Digital Inc, I found the answer here: https://stackoverflow.com/a/6914986/1927991
A chunked stream of the remote file was the answer. Very clever.
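A rough sketch of the idea: read the remote file in chunks with fopen()/fread() and feed each chunk to the archive stream as it arrives, so nothing is buffered in full. The per-chunk methods shown (init_file_stream_transfer(), stream_file_part(), complete_file_stream()) are how I understand ArchiveStream-php's streaming API; check the library's README before relying on them.

// Open the remote file as a stream instead of loading it all into memory
$remote = fopen('https://dropboxcontent.com/user333/3yjdsgf/video1.mp4', 'rb');

// Tell the archive a streamed file is starting ($file_size is a placeholder)
$zip->init_file_stream_transfer('big/hello.mp4', $file_size);

while (!feof($remote)) {
    // Read ~1 MB at a time and push it straight into the zip stream
    $chunk = fread($remote, 1024 * 1024);
    $zip->stream_file_part($chunk);
}

$zip->complete_file_stream();
fclose($remote);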
I've got the following situation: I have some files with hashed filenames on a CDN. Now I want a PHP script that redirects to the files (for download) and gives them another name. Is there a way without using readfile? The problem with readfile is that it makes no sense to download the file from the CDN to my web server and then download it again from the web server to the local computer.
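For reference, this is the readfile-style proxying the question wants to avoid, where every byte passes through the web server twice (the CDN URL and the download name are placeholders):

// The whole file is pulled through the web server -- exactly the double
// transfer the question is trying to avoid (requires allow_url_fopen)
$cdnUrl = 'https://cdn.example.com/ab12cd34ef56.pdf'; // hashed CDN filename (placeholder)

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"'); // friendly name
readfile($cdnUrl);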
I want to download a torrent using PHP.
So the scenario is that the user uploads a torrent and the server downloads it.
Then the user can download the torrent using their browser.
Don't know if this is the right way for you, but:
Install rtorrent (https://wiki.archlinux.org/index.php/RTorrent)
Configure rtorrent to watch a directory for new torrent files (http://jkt.im/2011/09/28/automatic-torrent-management-with-rtorrent-and-some-helper-scripts/)
rtorrent will download the files and move them into a directory like "done"
A PHP script checks that folder for new content and converts / shows the files on a web page (see the sketch below)
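A minimal sketch of that last step, assuming the rtorrent completed-downloads directory is /var/torrents/done and a separate download.php that streams the chosen file (both are placeholders):

$doneDir = '/var/torrents/done'; // hypothetical rtorrent "done" directory

// List the finished downloads and link to them for the browser
foreach (scandir($doneDir) as $entry) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    // download.php (hypothetical) would stream the requested file to the user
    echo '<a href="download.php?file=' . rawurlencode($entry) . '">'
       . htmlspecialchars($entry) . '</a><br>';
}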
All the functionality you need already exists in rTWi (http://rtwi.jmk.hu/), and it's also open source, so all you need is a little customization.