From the command line, wget http://mydomain.com/image.jpg successfully downloads the image.
The image file size is 7KB.
When I run the same command through exec(), system() or passthru(), such as
exec('wget http://mydomain.com/image.jpg');
(or the equivalent with system() or passthru()),
the image file gets created but is only 255 bytes.
Can anybody tell me why this happens?
I would try using the quiet flag (-q) and specifying an output file (-O). It is possible that wget's continual status updates are causing an issue.
Here is an example of grabbing google's logo that is working for me.
<?php system('wget -q http://www.google.com/images/logo_sm.gif -O test2.gif'); ?>
wget v1.12
php v5.3.1
I'm trying to run a PHP script locally that scrapes Google with wget and dumps the HTML into temp.html.
Running this command from the terminal works fine:
wget -O temp.html http://www.google.ca
Running this command from PHP also works fine (so it's not a permissions issue):
shell_exec('touch temp.html');
But running this from PHP does not work (does not create temp.html):
shell_exec('wget -O temp.html http://www.google.ca');
Any suggestions? Wrapping that last command in a var_dump() outputs null.
Thanks!
According to man wget, wget -O temp.html http://google.com takes all documents, concatenates them, and writes everything to temp.html without producing anything on stdout, so PHP's shell_exec() returns nothing (null).
The content of the scraped web page should still be present in temp.html; shell_exec("wget ...") simply does not return anything, because no output is produced.
As you mentioned that the web page you are trying to scrape does not work, maybe it has implemented some sort of bot protection that prevents exactly what you are trying to do.
Edit: You may use -O - to write everything to stdout instead. So shell_exec("wget -O - https://google.com"); should return the content of the requested page to your PHP script.
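A minimal sketch of that suggestion, capturing the page in PHP (using the URL from the question):
<?php
// Write the page to stdout with -O - (quietly, with -q) and capture it via shell_exec().
$html = shell_exec('wget -qO - http://www.google.ca');
var_dump($html); // should now contain the page HTML instead of null
?>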
The simplest solution is to provide the full path to the wget binary, as it seems the user that runs your script does not have the same $PATH as you.
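For example, something like the following, assuming wget lives at /usr/bin/wget (check with which wget):
<?php
// Call wget by its full path so the web server's limited $PATH doesn't matter.
shell_exec('/usr/bin/wget -O temp.html http://www.google.ca');
?>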
How about using file_put_contents & file_get_contents instead? This should work without having to worry about wget.
<?php
$filename = 'temp.html';
$address = 'http://www.google.ca';
file_put_contents($filename,file_get_contents($address));
?>
I'm trying to make a simple PHP script that downloads a video from YouTube. At first I tried some classes I found on the web, but without success, so I decided to use the youtube-dl program and call it from my script.
The big problem is that the process is apparently killed when the page loads in the browser, and the download is interrupted.
The most curious thing is that if I execute the script from the command line (php page.php) it works nicely, but it doesn't work from the browser.
I notice the same thing with the wget command; that process is also killed.
The code is something like:
<?php
exec("youtube-dl -o /var/www/YT/video.flv https://youtube....");
?>
and
<?php
exec("wget http://link");
?>
Both youtube-dl and wget are in the same directory as the script. I also tried redirecting the output to /dev/null and forking the process, but neither worked.
I would try executing it in the background.
<?php
exec("youtube-dl -o /var/www/YT/video.flv https://youtube.... > /dev/null 2>&1 &");
?>
If that works, then what's happening is that your PHP script ends before youtube-dl finishes.
How can I execute the command wkhtmltopdf http://google.com /tmp/test.pdf from the server, i.e. http://localhost/test.php? When I do it from the command line it works. I tried the system() and exec() functions, but they did not work. When I use system('touch /tmp/test'), the file is created. What stops wkhtmltopdf? Is it PHP, or Apache?
Make sure that the user the script runs as knows where the wkhtmltopdf binary is.
You can find out where it is with the which command.
which wkhtmltopdf
Also, you can capture a command's output by assigning the result of system() to a variable; system() returns the last line of the command's output, and its optional second argument gives you the return status.
e.g.
$last_line = system('ls', $return_status);
echo $last_line;
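Putting both suggestions together, a sketch like the following should at least make any failure visible (the /usr/local/bin path is an assumption; use whatever which wkhtmltopdf reports):
<?php
// Run wkhtmltopdf by its full path and capture its output and exit status.
$output = array();
$status = null;
exec('/usr/local/bin/wkhtmltopdf http://google.com /tmp/test.pdf 2>&1', $output, $status);
echo "exit status: $status\n";
echo implode("\n", $output), "\n";
?>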
I need some script to be notified when a file has been successfully downloaded with wget, so I can then upload it to another server. Can this be done?
Thanks!
wget "http://example.com/path_to_file.tar.gz" && php scriptname.php file.tar.gz
Why not perform the download with cURL in PHP instead? Then the PHP script can run in the background and continue when the download is complete.
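A minimal sketch of that approach, reusing the placeholder URL from above (the URL and paths are assumptions):
<?php
// Download the file with cURL, then continue once the transfer is complete.
$url  = 'http://example.com/path_to_file.tar.gz';
$dest = '/tmp/file.tar.gz';

$fp = fopen($dest, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);

if ($ok) {
    // Download complete; upload $dest to the other server here.
}
?>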
I have a script that calls a command to run an ffmpeg conversion on an uploaded video. However, it works only intermittently: sometimes the form finishes submitting and the ffmpeg process is running; at other times, the ffmpeg command fails to run at all. Here is the command I'm running in an exec() call:
ffmpeg -i "uploaded_file -b 450k "converted_file" >/dev/null 2>&1 &
Can anyone explain why this will only work on certain tries and not on others?
What if ffmpeg fails and throws an error? Right now you're sending all output to /dev/null, so you'll never know.
Change >/dev/null to >>/tmp/ffmpeglog to keep a log.
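For example, a sketch of the exec() call with logging instead of discarding output (the file names are the placeholders from the question):
<?php
// Append ffmpeg's output (including errors) to a log so failures can be diagnosed.
exec('ffmpeg -i "uploaded_file" -b 450k "converted_file" >> /tmp/ffmpeglog 2>&1 &');
?>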