I'm trying to grab all the data (text) coming from a URL that is constantly sending text. I tried using PHP, but that would mean having the script running the whole time, which it isn't really made for (I think). So I ended up using a Bash script.
At the moment I use wget (I couldn't get cURL to output the text to a file):
wget --tries=0 --retry-connrefused http://URL/ --output-document=./output.txt
So wget seems to be working pretty well, apart from one thing: every time I restart the script, wget will clear the output.txt file and start filling it again, which isn't what I want. Is there a way to tell wget to append to the txt file?
Also, is this the best way to capture the live stream of data?
Should I use a different language like Python or …?
You can do wget --tries=0 --retry-connrefused $URL -O - >> output.txt.
Explanation: the parameter -O is short for --output-document, and a dash - means standard output.
The line command > file means "write the output of command to file", and command >> file means "append the output of command to file", which is what you want.
cURL doesn't follow redirects by default and outputs nothing if there is a redirect, so I always specify the --location option just in case. If you want to use curl, try:
curl http://example.com --location --silent >> output.txt
The --silent option turns off the progress indicator.
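If you also want curl to keep retrying the connection, the way --tries=0 --retry-connrefused does for wget above, something like this might work (assuming curl 7.52 or newer, which added --retry-connrefused):
curl --location --silent --retry 999 --retry-connrefused http://example.com >> output.txt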
You could try this:
while true
do
    wget -q -O - http://example.com >> filename   # -O - writes the page to standard output, which is appended to filename
    sleep 2                                       # wait 2 seconds between requests
done
curl http://URL/ >> output.txt
The >> redirects the output from curl to output.txt, appending to any data already there. (If it were just > output.txt, it would overwrite the contents of output.txt each time you ran it.)
I am trying to create a tarball from a PHP 5 script under Linux, and I don't really care about the output; the simplest way I have found so far is:
system("tar czf tarball.tgz directory/path &");
However, I would like to background the process. Checking the system() documentation, it mentions having to redirect the output to a file. However,
system("tar czf tarball.tgz directory/path > /dev/null 2>&1");
doesn't help. The system() function does not take a file descriptor... what am I missing?
Testing with these:
Script test.php:
<pre><?php
exec("bash dodate 2>&1 /dev/null &");
system("echo done at \$(date)");
?></pre>
Script ./dodate:
sleep 5
date
I go to my browser and call/refresh the page; it does indeed take 5 seconds, then prints/updates the "done" message.
Thanks
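For the record, the usual fix for the tar example is to combine the two attempts from the question: redirect the output and background the process. PHP's system() and exec() wait for the command unless its output is redirected away, so the command string passed to system() would be something like this sketch:
# both pieces matter: the redirect frees PHP from waiting on the output,
# and the trailing & detaches the process
tar czf tarball.tgz directory/path > /dev/null 2>&1 &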
You "don't have" threads in php. One trick you can do is to do a curl request to another php that does what you want. You'll need to make sure that your curl times out pretty soon, and that the other php doesn't die when the http connection to it is closed by the curl timeout.
You can also read about the topic here: cURL Multi Threading with PHP or cURL Multi Threading?
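A rough shell sketch of that fire-and-forget idea (worker.php is a made-up name; the target script would also need to call ignore_user_abort(true) so it keeps running after the connection is dropped):
# give up on the response after one second; the worker keeps running server-side
curl --max-time 1 --silent http://example.com/worker.php > /dev/null 2>&1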
I want to call an external PHP script with AppleScript and Alfred. Currently I open Safari with the URL of the PHP script, which is very annoying. Is there a way to call the PHP script without opening Safari?
Best regards,
emha
PHP scripts / commands can be run from a standard shell prompt:
do shell script "php -q " & quoted form of posix path of phpScriptPath
EDIT:
You can use cURL from the shell and dump the output. This is kind of like just pinging your script, which I'm guessing is what you want to do. Just replace http://www.google.com with the path to your script. And if you omit >/dev/null 2>&1, you can get cURL's output, which is nice because you can add flags to curl to show headers, do POST/GET requests, etc.
do shell script "curl http://www.google.com >/dev/null 2>&1"
I've been running a simple PHP script (which logs its running time in a text log file). From the browser it runs fine, but when I run it as a scheduled task in my Plesk 10.3.1 panel as follows:
*/5 * * * * php /var/www/vhosts/eblogs.co.uk/httpdocs/frostbox/cron/crone_test.php
It does run every five minutes, but it does not write anything to the text file, and it sends me the following notification message via email:
php [-f from_encoding] [-t to_encoding] [-s string] [files...]
php -l
php -r encoding_alias
-l,--list
lists all available encodings
-r,--resolve encoding_alias
resolve encoding to its (Encode) canonical name
-f,--from from_encoding
when omitted, the current locale will be used
-t,--to to_encoding
when omitted, the current locale will be used
-s,--string string
"string" will be the input instead of STDIN or files
The following are mainly of interest to Encode hackers:
-D,--debug show debug information
-C N | -c | -p check the validity of the input
-S,--scheme scheme use the scheme for conversion
What should I add in the following line?
php /var/www/vhosts/eblogs.co.uk/httpdocs/frostbox/cron/crone_test.php
The text you get back is the usage message of piconv. This has absolutely nothing to do with PHP, the scripting language. What you probably want to do is one of the following:
Alternative 1: Using the PHP command line interpreter
You need to call the actual php interpreter on your system. This might be /usr/bin/php5, so your crontab line would look like
*/5 * * * * /usr/bin/php5 /var/www/vhosts/eblogs.co.uk/httpdocs/frostbox/cron/crone_test.php
However, not every setup has this command-line interpreter installed; it might happen that only the Apache module is installed.
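If you are not sure whether (or where) a CLI interpreter exists on your system, a quick check from a shell might look like this:
# prints the path of the first PHP CLI binary found, if any
command -v php || command -v php5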
Alternative 2: Using an HTTP(S) request
If you don't have the rights to install the command-line tool, check whether wget or curl is installed. If so, you can use one of them to invoke the script by sending a request to the web server.
Using wget:
/usr/bin/wget -O /dev/null 'http://eblogs.co.uk/crone_test.php'
-O /dev/null tells it to save the web page generated by your script to /dev/null. That is a special file which simply discards all data written to it. So this parameter prevents any new file with the web-page contents from being created and left lying around in your server's file system.
Using curl:
/usr/bin/curl -o /dev/null 'http://eblogs.co.uk/crone_test.php'
-o /dev/null has the same function here as the capital -O above for wget.
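Put together with the schedule from the question, the crontab entry would look like:
*/5 * * * * /usr/bin/wget -O /dev/null 'http://eblogs.co.uk/crone_test.php'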
Add the following at the top of the PHP script:
#!/usr/bin/php
<?php
your code here
?>
(assuming php exists at /usr/bin)
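With the shebang in place, you can make the script executable and let cron run it directly, with no interpreter prefix (a sketch, assuming php really is at /usr/bin):
chmod +x /var/www/vhosts/eblogs.co.uk/httpdocs/frostbox/cron/crone_test.php
*/5 * * * * /var/www/vhosts/eblogs.co.uk/httpdocs/frostbox/cron/crone_test.php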
Hope that helps!
You can use "curl" ... like this:
curl "http://eblogs.co.uk/crone_test.php"
How to setup a cron job command to execute an URL?
/usr/bin/wget -q http://www.domain.com/cron_jobs/job1.php >/dev/null 2>&1
Why can't I make this work!? I have tried everything. The PHP script should send an email and create some files, but neither happens.
The command returns this:
Output from command /usr/bin/wget -q http://www.domain.com/cron_jobs/job1.php ..
No output generated
... but it still creates an empty file in /root on each execution!? Why?
Use curl like this:
/usr/bin/curl http://domain.com/page.php
Don't worry about the output; curl writes the page to standard output and never creates a file on disk. (Cron will, however, email you anything that appears on standard output.)
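To keep cron completely quiet, you can also discard the output explicitly (a sketch using the same URL):
# -s hides the progress meter; -o /dev/null discards the page body
/usr/bin/curl -s -o /dev/null http://domain.com/page.php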
I had the same problem. The solution is understanding that wget outputs two things: the results of the URL request AND activity messages about what it is doing.
By default, if you do not specify an output file, it will create one, seemingly named after the file in your URL, in the current folder where wget is run.
If you want to specify a different output file:
-O outputfile.txt
will output the URL results to outputfile.txt, overwriting what's there.
If you wish to append to that file, write to standard output and then append to the file from there. Here's the trick: to write to standard output, use:
-O-
The second dash is in lieu of a filename and tells wget to write the URL results to standard output. Then use the append syntax, >>, to append to a file of your choice:
wget -O- http://www.invisibility.com >>/var/log/invisibility.log
The lower-case -o specifies the location of the activity log, so if you wish to log activity for the URL request, you can:
wget -o /var/log/activity.log http://someurl.com
-q suppresses the output of activity messages, so
wget -q -o /var/log/activity.log http://someurl.com
will not log any activity to the specified file, and I think that is the crux where people get confused.
Remember:
-O is shorthand for --output-document
-o is shorthand for --output-file, which is the activity log.
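Putting the two options together: an invocation that appends the page contents to one file and keeps the activity log in another might look like this (the paths are made up):
wget -O - -o /var/log/wget-activity.log http://someurl.com >> /var/log/page-output.log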
Took me hours to get it working. Thank you to the people who wrote down solutions.
One also needs to make sure to check whether single or double quotes are needed; otherwise wget will parse the URL wrong, leading to error messages.
This worked (using single quotes):
/usr/bin/wget -q -O - 'http://domain.com/cron-file.php'
This gave errors (using double quotes):
/usr/bin/wget -q -O - "http://domain.com/cron-file.php"
Don't know if the /usr/bin/ prefix is needed. I read about different ways of ordering the -q and -O options; note that -O must be immediately followed by its filename argument (here -, for standard output), which is why the order matters. It is hard to find a reliable, definitive source on this subject. So many different examples.
An online wget manual, listing the available options, can be found here (but check with the Linux distro you are using for an up-to-date version):
http://unixhelp.ed.ac.uk/CGI/man-cgi?wget
To use wget to display the HTML:
wget -qO- http://www.example.com
I have a PHP script I want to run every minute to see if there are draft news posts that need to be posted. I was using wget for the cron command in cPanel, but I noticed (after a couple of days) that this was creating a blank file in the main directory every single time it ran. Is there something I need to do to stop that from happening?
Thanks.
When wget runs, by default it generates an output file, from what I remember.
You probably need to use one of wget's options to specify which file it should write its output to -- and use /dev/null as the destination file (it's a "special file" that discards everything written to it).
Judging from man wget, the -O or --output-document option would be a good candidate:
-O file
--output-document=file
The documents will not be written to the appropriate files, but all will be concatenated together and written to file.
so, you might need to use something like this :
wget -O /dev/null http://www.example.com/your-script.php
And, by the way, the output of scripts run from the crontab is often redirected to a logfile -- it can always help.
Something like this might help with that:
wget -O /dev/null http://www.example.com/your-script.php >> /YOUR_PATH_logfile.log
And you might also want to redirect the error output to another file (it can be useful for debugging, the day something goes wrong):
wget -O /dev/null http://www.example.com/your-script.php >>/YOUR_PATH/log-output.log 2>>/YOUR_PATH/log-errors.log