Is it possible to download a file using Wget? I want to download that file into my default browser's download directory.
It's possible, but it wouldn't achieve the effect you desire.
Running wget would cause the file to be downloaded to the server (and for that, you'd be better off using the cURL library than shelling out to an external binary).
If you want the browser to download it, then you need to output the file from PHP, not save it to a file on the server.
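For completeness, a minimal sketch of that approach (the URL and filename are placeholders, not from the question):

<?php
// Stream a remote file to the browser so it lands in the browser's own
// download directory. Requires allow_url_fopen for the remote readfile().
$url  = 'http://www.example.com/file.zip';  // placeholder URL
$name = basename($url);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($url);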
Try something like this:
shell_exec('wget -P path_to_default_download_directory google.com');
I need to download files in bulk, each 0-2.5 MB, from a URL to my server (Linux CentOS, though it could be any other distro too).
I would like to use wget (if you have another solution, please post it):
My first approach is to test it with only 1 file:
wget -U --load-cookies=cookies.txt "url"
This is the shell response:
The problem is that it doesn't download the file, only an empty HTML page. The necessary cookie is saved in the correct format in the file, and the download works in the browser.
If downloading that one file works, I want to use a txt file with all the URLs (e.g. urls.txt), where the URLs look like the one above with only one parameter changing. I also want it to download maybe 10-100 files at a time.
If you have a solution in PHP or Python for this, it will help me too.
Thank you for your help!
I have solved it now with aria2. It's a great tool for such things.
Basically:
for i in foo bar 42 baz; do
    wget -other -options -here "http://blah/blah?param=$i" -O "$i.txt"
done
Note the -O parameter, which lets you set the output filename: `foo.txt` is a little easier to work with than `data-output?format=blahblahblah`.
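Since the asker mentioned a PHP solution would help too, here is a rough sketch of the same loop using PHP's cURL extension. The output file names are an assumption, and cookies.txt must be in the Netscape format that wget's --load-cookies also expects:

<?php
// Download every URL listed in urls.txt, sending cookies from cookies.txt.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookies.txt'); // read cookies from file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    curl_close($ch);
    if ($data !== false) {
        file_put_contents("file_$i.dat", $data);         // placeholder naming scheme
    }
}

For the "10-100 files at a time" requirement, the curl_multi_* functions can run such transfers in parallel.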
I am trying to write a batch script to download and save an XLS file from a URL. I am able to download the file by using
#ECHO OFF
start /d "C:\Program Files\Internet Explorer" IEXPLORE.EXE url link
exit
I would now like to save these files to a folder or directory.
Any help anyone could provide here would be greatly appreciated.
There are at least two ways to do it.
Like Buddy suggested, use a command-line downloader, for example wget.
You can also run PHP directly from the command line (even without a webserver). See PHP: Command Line PHP on Microsoft Windows in the PHP manual.
Don't run a browser, or - even worse - IE, just to run a PHP script.
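As a sketch of that route, a small PHP script run from the command line can fetch and save the file directly (the URL and output path are placeholders):

<?php
// save_xls.php - download an XLS file and save it locally.
// Run as: php save_xls.php
$url = 'http://www.example.com/report.xls';  // placeholder URL
$out = 'C:\\downloads\\report.xls';          // placeholder output path

$data = file_get_contents($url);             // assumes allow_url_fopen is enabled
if ($data !== false) {
    file_put_contents($out, $data);
}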
I want to send data to my server via SSH. The data is image files that need to be saved into directories on the server.
If I run the script via SSH, how can I get a PHP script to read the image files?
For example, if I used bash I could do
./uploadimage.bash -image.jpg
and that would be that. But my knowledge of PHP is limited. Usually PHP uses the $_FILES array when dealing with files, as far as I know. Do I need to use this if I send the files by SSH?
Run the script via the PHP command-line executable:
php myscript.php image.jpg image2.jpg
You can then get the file names via the $argv array and deal with them any way you please.
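A minimal sketch of such a script (the destination directory is an assumption):

<?php
// myscript.php - handle image files named on the command line.
// $argv[0] is the script name itself, so real arguments start at index 1.
for ($i = 1; $i < $argc; $i++) {
    $file = $argv[$i];
    if (!is_file($file)) {
        fwrite(STDERR, "Skipping, not a file: $file\n");
        continue;
    }
    // Placeholder destination; add your own directory logic here.
    copy($file, '/srv/images/' . basename($file));
}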
It depends on what you need to do with the images.
Do you just need to echo them out to the user?
echo file_get_contents('image.jpg');
Are you meaning to retrieve the command-line variables passed to the script? Use $argv.
myScript.php image.jpg # image.jpg becomes $argv[1]
Do you need to do processing on the images? Use the GD functions.
$image = imagecreatefromjpeg('image.jpg');
// Do processing
imagejpeg($image); // Pass a filename if you want to save it instead of outputting it.
If you want to simply upload a bunch of files to a remote server, your best bet is the scp command rather than PHP:
scp /your/file.png username@host:/remote/path.png
If you execute a PHP script through an SSH session, there's no way to send a file along through that session. You have to copy the file to the remote server first and then execute the PHP script, which can read it with file_get_contents or something like that.
If you are sure that PHP must be the recipient of the file (maybe you want some logic to determine the file name or path), you have to host it on a webserver, and you can use curl to upload the file, like this:
curl -k -F myfile=@image.png foo.php
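On the PHP side, a minimal receiving script might look like this (foo.php and the upload directory are assumptions; the field name myfile matches the curl command above):

<?php
// foo.php - receive the file posted by curl -F myfile=@image.png
if (isset($_FILES['myfile']) && $_FILES['myfile']['error'] === UPLOAD_ERR_OK) {
    // Placeholder target directory; apply your own name/path logic here.
    $dest = '/var/www/uploads/' . basename($_FILES['myfile']['name']);
    move_uploaded_file($_FILES['myfile']['tmp_name'], $dest);
    echo "Saved to $dest\n";
} else {
    http_response_code(400);
    echo "Upload failed\n";
}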
I have a problem. The service I use gives me an .exe file which they claim is in fact a zip archive - a self-extracting archive.
The problem is that I am downloading that file with my PHP app to a server and need to extract it there, without downloading it to a local computer.
I have tried downloading the .exe file to a local computer - on Windows it self-extracts to /temp and then launches a Flash player.
$zip = zip_open($myfile); gives 1 in print_r($zip).
$zip->open() gives no results either.
Changing .exe to .zip doesn't let WinZip or any other unpacker on Windows open it - the .exe cannot be opened by WinZip either.
Now I have no idea how to deal with it. If anybody can advise, please do.
Try executing the program with the system command.
Executing files from an external source you don't trust 100% is never a good idea.
The Info-ZIP version of zip allows you to remove the SFX stub from a self-extracting zip file (with the -J flag), converting it back into a normal zip file.
Source code is freely available.
Making a self-extracting zip file is a matter of prepending a zip file with the SFX binary code, then appending the size of the binary stub to the resulting file. I'm not sure exactly how that data is represented, but a bit of reverse-engineering of the available code should make it clear.
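A sketch of how that could look server-side from PHP, assuming the Info-ZIP zip binary is installed (file names are placeholders):

<?php
// Strip the SFX stub with zip -J, then extract with ZipArchive.
copy('download.exe', 'download.zip');              // work on a copy
shell_exec('zip -J ' . escapeshellarg('download.zip'));

$zip = new ZipArchive();
if ($zip->open('download.zip') === true) {
    $zip->extractTo('extracted/');
    $zip->close();
}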
Well... if your PHP server is Windows you shouldn't have a problem doing it as a system command. Otherwise, it's a little trickier. I hear that the unzip system command will unzip self-extracting zip files, but I don't have access to a Linux box at the moment to try it out.
If you're on shared hosting, chances are you can't do it.
Well, if you think the .exe file will extract its contents when executed, then you can use the exec function to run it, like the one below:
exec("d:\\example\\php\_exe\\1436.exe");
You can also use the system function to run external programs.
And if you wonder what the difference is:
PHP - exec() vs system() vs passthru()
I'm setting up a script so that I can input a URL to a web page and the script will wget the file. Most of the files, however, will be in the *.rar format. Is there any way I can pass the filename to the unrar command to unarchive the files downloaded via wget?
Many, many thanks in advance!
EDIT: I thought about using PHP's explode() function to break up the URL by slashes (/), but that seems a bit hacky.
Rather than forking out to external programs to download and extract the file, you should consider using PHP's own cURL and RAR extensions. You can use the tmpfile() function to create a temporary file, use it as the value of the CURLOPT_FILE option to make cURL save the downloaded file there, and then open that file with the RAR functions to extract the contents.
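A minimal sketch of that approach (the URL and output directory are placeholders; tempnam() is used instead of tmpfile() so the RAR extension can open the file by path):

<?php
// Download a .rar with cURL, then extract it with the PECL rar extension.
$url     = 'http://www.example.com/archive.rar';  // placeholder URL
$tmpPath = tempnam(sys_get_temp_dir(), 'rar');

$fp = fopen($tmpPath, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);              // write response body to $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);

$rar = RarArchive::open($tmpPath);
if ($rar !== false) {
    foreach ($rar->getEntries() as $entry) {
        $entry->extract('extracted/');            // placeholder output directory
    }
    $rar->close();
}
unlink($tmpPath);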
Use basename() to get the filename.
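For example (note that basename() keeps any query string, so strip that with parse_url() first if the URL has one):

<?php
echo basename('http://www.example.com/files/archive.rar'); // prints archive.rar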
@Wyzard gives the best answer. If there's a library that solves your problem, use it instead of forking an external process. It's safer and it's the cleaner solution. PHP's cURL and RAR extensions are good; use them.
However, if you must use wget and unrar, then @rik gives a good answer. wget's -O filename option saves the file as filename, so you don't have to work out the name yourself. I would rather pipe wget's output directly to unrar, though, using wget -q -O - http://www.example.com | unrar.
@Byron's answer is helpful, but you really should not need to use it here. It is, however, better than using explode() as your edit mentions.
wget -O filename URL && unrar x filename