Has anyone ever done Google Analytics tracking from a PHP command line program?
I have a PHP command line program that is run through cron. It will grab data every 5 minutes and I need to track that.
I looked at the GA library, but it seems to require either JavaScript or an <img> tag.
Please enlighten me.
Thank you,
Tee
http://code.google.com/p/serversidegoogleanalytics/
Never used it myself but looks like exactly what you're looking for.
Edit:
Pondering your question again, I realised you were asking specifically about PHP CLI as a command line program. I just wanted to make sure you're aware that the command line is only the entry point: the file you call can include other files and extend classes, so the rest of the program can be full OOP code (as my suggestion is structured).
So using PHP CLI as a command line program isn't really as limiting as your tone suggests.
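If you'd rather not pull in a whole library, the general idea is simply to issue an HTTP request to Google's collection endpoint yourself from the cron script. Here is a minimal sketch; the endpoint and parameters below are the Measurement Protocol ones, which is an assumption on my part rather than what the library above uses, so check them against whichever tracking API you end up with:

<?php
// Minimal sketch: fire a tracking "event" hit at Google Analytics from a CLI script.
// Assumes the Measurement Protocol endpoint/parameters; the tracking ID is a placeholder.
$params = http_build_query(array(
    'v'   => 1,              // protocol version
    'tid' => 'UA-XXXXXXX-Y', // your web property ID (placeholder)
    'cid' => '555',          // an arbitrary client ID for the cron job
    't'   => 'event',        // hit type
    'ec'  => 'cron',         // event category
    'ea'  => 'data-grab',    // event action
));

// Fire-and-forget GET; cURL would work just as well.
file_get_contents('https://www.google-analytics.com/collect?' . $params);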
If you need to track that, just write to a log file:
file_put_contents('/tmp/myscript.log', "running cron\n", FILE_APPEND);
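If you also want to know when each run happened, the same idea with a timestamp (the log path is just an example):

file_put_contents('/tmp/myscript.log', date('c') . " running cron\n", FILE_APPEND);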
Google Analytics is meant to track visits to a website, not cron job activity. :s
I'm trying to work out the best way to automate a series of steps to deploy a web app, and I haven't yet come up with a suitable solution. I would like to:
use google's compiler.jar to minify my JS
use yahoo's yui-compressor.jar to minify my CSS
access a file and change a string so that assets referenced in the header, like "global.css?v=21", get served as the correct version
deploy the app (sftp, mercurial or rsync?) omitting certain directories like "/userfiles"
Can you guys put me on the right track to solve this?
Thank you!
You may want to check out Phing (http://phing.info/); they are in the process of moving servers, so the site may be down this weekend, but it can do everything you're after and is written in PHP.
A quick Google search should bring up plenty of tutorials to get you started.
You can run php from the command line to do all sorts of fun things.
$ php script_name.php arg1 arg2
See: command line, argv, argc, exec
Running PHP from the command line is very fast. I've been doing this a lot lately for various automation tasks.
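For the specific steps you listed, a plain PHP CLI script can glue everything together. Here is a rough sketch, where every path, jar name and rsync flag is an assumption you'd adapt to your own project:

<?php
// deploy.php -- run with: php deploy.php
// Sketch only; file names, paths and the remote target are placeholders.

// 1. Minify JS with Google's Closure Compiler
exec('java -jar compiler.jar --js src/app.js --js_output_file public/app.min.js');

// 2. Minify CSS with YUI Compressor
exec('java -jar yuicompressor.jar src/global.css -o public/global.css');

// 3. Bump the version string so "global.css?v=21" points at the new build
$header = file_get_contents('templates/header.php');
$header = preg_replace('/global\.css\?v=\d+/', 'global.css?v=' . time(), $header);
file_put_contents('templates/header.php', $header);

// 4. Deploy with rsync, skipping the user uploads directory
exec("rsync -avz --exclude 'userfiles' public/ user@example.com:/var/www/app/");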
I generally work on Python projects, so these may or may not be an option for you, but apart from writing your own scripts you could look into the following:
Fabric
Buildout
maven
Need some help here. I have a C++ library for communicating with an embedded module (ArchLinux) via tty. This library was compiled/converted into PHP using SWIG.
The issue is that a sample program written in PHP runs as expected from the command line, but when the same code is used as part of a web page's functionality it fails to execute.
My assumption, based on my limited Linux knowledge, is that the tty requires a console in order to run, which is why it fails as part of a web page?
Does anyone have any ideas as to how I can get this to work? I have read something about using posix_ttyname, but I can't seem to find any code samples that demonstrate its use.
I have attached the offending C++ files, along with a test main.php which works, for review.
Thanks everyone
http://www.mediafire.com/?ctblcvsy86mdg8p
The $argv variable is available only when the script is called from the CLI. If you don't want to change the script so that it can also be called from the web, you can try calling it from another script:
exec('php main.php param');
just like you would from the command line.
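If you go that route, it's worth capturing the output and the exit code so you can see from the browser why the script fails; something along these lines, where the script path and the tty argument are placeholders:

<?php
// Web-facing wrapper: run the CLI script and show what it printed.
exec('php /path/to/main.php /dev/ttyUSB0 2>&1', $output, $exitCode);

echo '<pre>' . htmlspecialchars(implode("\n", $output)) . '</pre>';
if ($exitCode !== 0) {
    echo 'Script exited with code ' . $exitCode;
}

Also bear in mind that the web server typically runs as a different user (e.g. www-data), which may not have permission to open the tty device, so that is worth checking too.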
I am trying to display the images from a PDF document that I uploaded to the server as hyperlinks in PHP, so that if a user clicks on one they get the corresponding document.
Please help me. Thanks in advance!
Use pdfimages, which comes with the open-source xpdf software package (for *nix operating systems). You'll have to call it through exec or the like, then work with the output from PHP. I am not aware of any PHP library that provides this functionality, so you're going to have to experiment.
EDIT
You mentioned that you aren't experienced with PHP... I thought I'd add that this isn't a quick-and-easy type of task, you certainly aren't going to find a bunch of tutorials around the internet for this.
To get started, you'll have to install the xpdf package on your server. There's a lot of different ways to do this depending on which OS you've got.
After that is set up, you'll be using the command line to execute a program on your server; you'll want to capture the output of that command in PHP and work with it further. So initially, you'll want to work out exactly what your command line will look like, as well as what the output looks like and means. Do this from the command line; don't worry about the PHP part yet. In this case, your output is going to be a list of the image files extracted from a given PDF, and your command line call will look something like "pdfimages mypdf.pdf". Play around, find out what happens.
After you work out exactly what command line you need to send and what the command does, you can focus on the PHP angle. In a nutshell, you want PHP to execute the exact command that you've already worked out. Look at the manual for exec for information on how to call a command line program and get the output back. Write your script to make the correct call and show the call's output.
Next, move on to doing something with that output. I presume you'll want to somehow store the extracted images in a web-accessible place, put them in the database, show them to the user, etc. That is the very last stage after you've worked out the initial steps.
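When you get to that point, the PHP side might look roughly like this; the upload path, output directory and URL prefix are all assumptions:

<?php
// Sketch: extract images from an uploaded PDF with pdfimages, then link to them.
// Paths and the "img" prefix are placeholders.
$pdf    = '/var/www/uploads/mypdf.pdf';
$outDir = '/var/www/html/extracted';

// -j saves JPEG data as .jpg; everything else comes out as .ppm/.pbm
exec('pdfimages -j ' . escapeshellarg($pdf) . ' ' . escapeshellarg($outDir . '/img'), $output, $status);

if ($status !== 0) {
    die('pdfimages failed');
}

// pdfimages names its output img-000.jpg, img-001.ppm, ... so list whatever appeared
foreach (glob($outDir . '/img-*') as $file) {
    $name = basename($file);
    echo '<a href="/extracted/' . htmlspecialchars($name) . '">' . htmlspecialchars($name) . '</a><br>';
}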
Good luck!
Don't think that I'm mad; I understand how PHP works!
That being said, I develop personal websites and I usually take advantage of PHP to avoid repetition during the development phase: nothing truly dynamic, only includes for the menus, a couple of foreach loops and the like.
When the development phase ends, I need to deliver the website to the client as plain HTML files. Is there a tool (a crawler?) that can do this for me instead of visiting each page and saving the interpreted HTML?
You can use wget to download recursively all the pages linked.
You can read more about this here: http://en.wikipedia.org/wiki/Wget#Recursive_download
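For example, pointing it at your local development copy with something like the following should leave you with a browsable static tree (check the man page; on older wget versions --adjust-extension is called --html-extension):

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://localhost/mysite/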
If you need something more powerful that recursive wget, httrack works pretty well. http://www.httrack.com/
Pavuk offers much finer control than wget. And will rewrite the URLs in the grabbed pages if required.
If you want to use a crawler, I would go for the mighty wget.
Otherwise you could also use a build tool like make.
You need to create a file named Makefile in the same folder as your PHP files.
It should contain this:
all: 1st_page.html 2nd_page.html 3rd_page.html

1st_page.html: 1st_page.php
	php 1st_page.php > 1st_page.html

2nd_page.html: 2nd_page.php
	php 2nd_page.php > 2nd_page.html

3rd_page.html: 3rd_page.php
	php 3rd_page.php > 3rd_page.html
Note that each php line must be indented with a tab, not spaces.
(See the PHP manual's command line usage page for the php command syntax.)
After that, whenever you want to update your html files just type
make
in your terminal to automatically generate them.
It could seem a lot of work for just a simple job, but make is a very handy tool that you will find useful to automate other tasks as well.
Maybe the command line will help?
If you're on windows, you can use Free Download Manager to crawl a web-site.