I need a notification in some script that a file has been successfully downloaded with wget, because I'll then need to upload it to another server. Can it be done?
Thanks!
wget "http://example.com/path_to_file.tar.gz" && php scriptname.php file.tar.gz
Why not perform the download with cURL in PHP instead? Then the PHP script can run in the background and continue when the download is complete.
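A minimal sketch of that approach, assuming placeholder URL and paths (swap in your own), with the upload step left as a stub:

```php
<?php
// Download the file with cURL instead of shelling out to wget.
// URL and destination path are placeholders.
$url  = 'http://example.com/path_to_file.tar.gz';
$dest = '/tmp/file.tar.gz';

$fp = fopen($dest, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body straight to the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$ok = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
fclose($fp);

if ($ok && $status === 200) {
    // Download finished successfully; upload to the other server here,
    // e.g. with a second cURL request or ftp_put().
}
```

Because the whole flow lives in one script, the "notification" problem disappears: the code after `curl_exec()` simply runs once the download has completed.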
I am working on a batch script that runs a SQL query and saves the result to a file. The file will then be handled by PHP. Is it possible to POST a file from the Windows command line to a PHP site so PHP can handle it with $_FILES['someFile']?
Yes, you can use curl for this.
curl -F someFile=@localfile.sql http://example.org/upload
Or you can use wget. Note that --post-file sends the raw file contents as the request body rather than a multipart form upload, so on the PHP side you would read it from php://input instead of $_FILES:
wget --post-file=file.jpg http://yourdomain.com/target.php
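On the receiving end, a minimal handler for the curl upload might look like this (the field name `someFile` matches the curl example above; the destination directory is an assumption):

```php
<?php
// upload.php - minimal receiver for the multipart POST sent by curl -F.
// The destination directory is a placeholder; make sure it is writable.
if (isset($_FILES['someFile']) && $_FILES['someFile']['error'] === UPLOAD_ERR_OK) {
    $dest = '/var/uploads/' . basename($_FILES['someFile']['name']);
    move_uploaded_file($_FILES['someFile']['tmp_name'], $dest);
    echo 'stored ' . $_FILES['someFile']['name'];
} else {
    http_response_code(400);
    echo 'no file received';
}
```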
My server currently has shell_exec disabled for security reasons, so I cannot run the command below inside my PHP script (which I normally can when using SSH).
echo shell_exec('youtube-dl -f 134 http://youtu.be/8R_e09bOELs');
Are there any workarounds in PHP that let me run the above command without calling shell_exec?
Any ideas are greatly appreciated. TIA
If it's disabled, you can't call it. There are many other ways to download a video from YouTube. Just google: php youtube download.
I want to run a PHP script every 15 minutes using either CURL or WGET.
This PHP file is in a local folder:
/home/x/cron.php
How would I run this using CURL/WGET?
It doesn't work when I try to run
curl /home/x/cron.php
Thank you!
curl and wget are better suited to URLs like http://myhost.com/cron.php
If the script isn't served over HTTP, you would better run it with the PHP CLI:
Ex:
php -q cron.php
Just do something like this:
/usr/bin/php /home/x/cron.php
cURL/wget is for HTTP actions. If your PHP script is on the same system, you don't want to load it over HTTP. (You can, of course, if it is accessible over HTTP, but I don't think that is what you want.) Just call it directly.
Alternatively, you can set the execute permission on your script and throw in a shebang line for PHP:
#!/usr/bin/php
Then, just put your PHP script in crontab directly.
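For example, a crontab entry that runs the script every 15 minutes could look like this (the script path is from the question; the php binary location is an assumption — check it with `which php`):

```
# run every 15 minutes via the PHP CLI
*/15 * * * * /usr/bin/php -q /home/x/cron.php

# or, if cron.php starts with the #!/usr/bin/php shebang and is executable (chmod +x):
*/15 * * * * /home/x/cron.php
```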
If you're using cURL or wget, you'll need to pass the path in as a URL. If you want to run the PHP script on the command line, you'll need to use the PHP CLI.
I want to send ffmpeg's output to a PHP file so I can run a regex on it and write the results to a database. This will allow me to track progress for multiple uploads. Does anyone know how to do this? Can it be done? Currently I can execute a PHP file with parameters after the ffmpeg command, and I can get ffmpeg to write to a txt file, but can I send the output to the PHP file and have it processed as it runs?
execute php file with parameters
&& php /opt/lampp/htdocs/xampp/site/update_db.php ".$parameter1." ".$parameter2.";
Write output to txt file
ffmpeg command and filepath to converted 1> /home/g/Desktop/output.txt 2>&1
Can something like this be done?
ffmpeg command and filepath to converted 1> php /opt/lampp/htdocs/xampp/site/update_db.php ".$output." 2>&1
Yes, you can read STDIN.
http://php.net/manual/en/features.commandline.io-streams.php
If it were me, I'd just execute FFMPEG from within PHP. You have a bit more flexibility that way, but I know that isn't desirable for every application.
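A sketch of the STDIN approach, using the script path from the question: pipe ffmpeg's output (which goes to stderr) into a PHP script that reads it line by line. The regex and the database step are placeholders; note also that ffmpeg separates progress updates with carriage returns, so you may need to split on "\r" as well.

```php
<?php
// update_db.php - read ffmpeg's progress output from STDIN.
// Invoke it like:
//   ffmpeg -i input.avi output.mp4 2>&1 | php /opt/lampp/htdocs/xampp/site/update_db.php
while (($line = fgets(STDIN)) !== false) {
    // ffmpeg progress lines look like "frame=  123 ... time=00:00:05.12 ..."
    if (preg_match('/time=(\d+:\d+:\d+\.\d+)/', $line, $m)) {
        // update the database with the current position $m[1] here
    }
}
```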
You could use exec to call ffmpeg, then read the returned output from its output parameter.
But doing so only lets you get the output once the program has terminated:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
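A minimal example of the exec approach (file names and the regex are placeholders): PHP blocks until ffmpeg exits, then scans the captured lines.

```php
<?php
// Run ffmpeg synchronously and capture its output once it exits.
// ffmpeg writes its progress to stderr, so redirect it to stdout (2>&1).
$cmd = 'ffmpeg -i input.avi output.mp4 2>&1';
exec($cmd, $output, $exitCode);

if ($exitCode === 0) {
    // $output is an array of lines; keep the last timing value seen.
    foreach ($output as $line) {
        if (preg_match('/time=(\d+:\d+:\d+\.\d+)/', $line, $m)) {
            $lastTime = $m[1];
        }
    }
    // write $lastTime to the database here
}
```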
From the command line, wget http://mydomain.com/image.jpg successfully downloads the image.
The image file size is 7KB.
When I embed this code into exec(), system() or passthru(), such as
exec('wget http://mydomain.com/image.jpg');
or the same with system() or passthru()
The image gets created but is only 255 bytes...
Can anybody tell me why this happens?
I would try using quiet mode (-q) and specifying an output file (-O). It is possible that wget's continual status updates are causing an issue.
Here is an example of grabbing google's logo that is working for me.
<?php system('wget -q http://www.google.com/images/logo_sm.gif -O test2.gif'); ?>
wget v1.12
php v5.3.1