Scheduled task (cronjob) with PHP

I'm creating a website that requires a file to be generated and stored on the server periodically (an XML feed for iTunes). The page is generated using ExpressionEngine. I discovered that the website's current server has a very restricted cPanel and doesn't have access to cron.
So I'm considering two options: find an alternative way to access the cron jobs (if they are available), or find an alternative way to create regularly scheduled tasks.
Regarding the first option, how would I go about determining if a server has cron available? I'm not sure how useful this would be anyway since I don't think the server allows shell access (it's a very basic setup for people who aren't tech savvy).
Regarding the second option, a friend mentioned to me that the functionality of cronjobs can just be done in PHP. How would I go about this?
Or am I perhaps overthinking this? The page in ExpressionEngine that outputs the XML file is domain.com/itunes/itunes_feed. It just has some EE tags that output the relevant XML, and the resulting page is in .xml format. Is it enough to just submit that URL to iTunes, or does it have to be a URL to an actual pre-existing file on the server?

Option 1
Simply contact your host and ask whether they support cron jobs and, if so, how to set them up.
Option 2
I only just set up my own cron jobs yesterday.
Create a PHP file that runs the code you want (a minimal sketch follows below).
Set up an account on https://www.easycron.com/
Upload your PHP file to EasyCron.
Set the times at which you would like your PHP code to run.
Simple as that! Does that make sense?
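For reference, the PHP file itself can be very small. A minimal sketch, assuming the feed can simply be fetched from the rendered ExpressionEngine URL and written to disk (the URL and output path are placeholders):

<?php
// generate_feed.php - fetch the rendered EE page and store it as static XML.
$feedUrl  = 'http://domain.com/itunes/itunes_feed';   // your EE template URL
$savePath = '/path/to/webroot/itunes_feed.xml';       // where iTunes will read it

$xml = file_get_contents($feedUrl);  // requires allow_url_fopen; use curl otherwise
if ($xml !== false) {
    file_put_contents($savePath, $xml);
}

EasyCron (or your host's cron, if it turns out they support it) then just needs to run this script on the schedule you choose.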

Related

Cronjob for a zend view?

This isn't the best method for doing the task, but how would you run a cron job for a Zend view?
The view is used to generate a file using an output buffer and then save the file on the server, it runs once a day.
Would it just be a matter of calling the URL of the controller action with curl? The crontab line for a once-a-day run at 23:50 would be:
50 23 * * * curl http://pclite.com/statistics/generate
The application requires authentication, though.
If you are the admin of the server, I would not do it this way.
I would write a PHP script that uses curl to download and save the file. Since you are writing a PHP file, you can simulate the login procedure: put the username and password in the PHP file and make sure the output is saved where you want it.
Then I would use Lynx (a text browser) in the cron job to call this PHP script once a day. That way you don't have to record any username or password in the cron job itself, and the PHP script does whatever grabbing you want.
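A rough sketch of that curl approach, for illustration; the login URL, form field names, and file paths are assumptions and need adapting to the real application:

<?php
// fetch_stats.php - log in, then download and save the protected page.
$cookieJar = '/tmp/cron_cookies.txt';

// Step 1: simulate the login procedure and keep the session cookie.
$ch = curl_init('http://pclite.com/login');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('username' => 'user', 'password' => 'secret'));
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// Step 2: reuse the session cookie to call the generator and save its output.
$ch = curl_init('http://pclite.com/statistics/generate');
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);
file_put_contents('/path/to/output/statistics.html', $output);

The crontab entry then references only the script, never the credentials, e.g. 0 6 * * * lynx -dump http://pclite.com/fetch_stats.php > /dev/null.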
Since you said yourself that this is not the best method for such a task, I won't say it again :D
If the cron job runs on the same server your web server is on, you could check the client IP and skip authentication when they are the same. After all, if an "attacker" can send requests to your application from your own server, you already have a serious security issue.
So, yes: if you skip authentication when the IP is the same, you just need to call the URL.
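A minimal sketch of that check at the top of the action, assuming the cron request really does originate from the same machine:

<?php
// Allow unauthenticated access only for requests from this server itself.
$trusted = array('127.0.0.1', '::1', $_SERVER['SERVER_ADDR']);
if (!in_array($_SERVER['REMOTE_ADDR'], $trusted, true)) {
    header('HTTP/1.0 403 Forbidden');   // everyone else goes through normal auth
    exit;
}
// ... generate and save the file as usual ...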
Like any other class, Zend_View can be instantiated from anywhere, and in particular it can render to a variable. This means that you do not need to invoke the whole web application if all you want to do is render something.
As stated, your other option is to have an entry point into the application and call it to get the output. But if you're just saving a file to the server, it is arguably a better approach to have the cron job be a script that does everything itself. This way you also take some load off your web application. That may not seem so relevant now, but what if in the future you want to call this endpoint several times per day, or for a lot of users?
So you can create a CLI script that includes Zend_View and renders within itself (a sketch follows below). As always with Zend Framework, the implementation choice is left entirely to you.
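A sketch of such a CLI script using the standard Zend_View API; the paths, view script name, and the fetchStatistics() helper are assumptions for illustration:

<?php
// cron_render.php - render the view directly, no controller or dispatch loop.
require_once 'Zend/View.php';   // or rely on your existing autoloader

$view = new Zend_View();
$view->setScriptPath('/path/to/application/views/scripts');
$view->statistics = fetchStatistics();   // hypothetical data-loading helper

$output = $view->render('statistics/generate.phtml');   // renders to a variable
file_put_contents('/path/to/output/statistics.html', $output);

Pointed at by a plain crontab line (50 23 * * * php -f /path/to/cron_render.php), this avoids both HTTP and authentication entirely.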

Dynamically create PDFs and email them

I've got a registration list, and I need to send out a PDF to each person on it. Each email needs to contain a PDF, which has a base version on the server, but each person's copy needs to be personalised with their name/company etc. over the top. This needs to be emailed to each person, which at the moment adds up to 2,500 emails, but can easily be much higher in the future.
I've only just started working on this project, but the problem I've encountered continuously since last week is that the server doesn't seem to be able to handle doing this. Currently the script uses Zend, which allows it to use Zend_Pdf and Zend_Mail to create and email the PDFs. Zend_Mail connects to an SMTP server from smtp.com to do the actual emailing.
Since we have quite a few sites running on the server, we can't afford to have it go down, and when I run the script in batches it can start to go down. The best solution I have so far is running curl from my local machine against the script, which then processes one person. The curl script then calls it again, over and over, in batches. Even this runs into problems at times, and somehow seems to hog memory even after it should be complete (I'm really not sure how).
So what I'm looking for is information on doing this, from libraries, code, information on server setups, anything that can make this much less painful, and much quicker for us to run. I've run out of ideas, and this is something I've not really had to do before (especially at a bulk level).
Thank you.
Edit:
I also forgot to mention that it's using Zend_Barcode::factory() for creating a barcode on the PDF.
The first step I suggest is to work out where the problem lies, if you can. Is it the PDF generation? Is it the emailing? "The server doesn't seem to be able to handle this" doesn't say what is actually failing, and neither does "the server goes down": you need to determine whether you are running out of memory, disk space, time, or something else. That will help you decide whether you need a tweak or a whole new approach to your generation. Because you said that even single manual invocations can fail, you should be able to narrow the problem down to the exact cause of the failure.
If you are running near some resource limit (which might be the case with several sites running), you probably need to offload this capability onto another machine. Your options include:
run the same setup on a new host and adjust your applications to use the new system
run a new setup on a new host
use an external system (such as the mentioned PDFCrowd or Docmosis)
Start with the specifics of the problem. I hope that helps. Please note I work for the company that created Docmosis.
Here are some ideas:
Is there a particular reason this has to run on a web server? Why not run the framework from a different machine, but with the same settings? You might have to create a different controller to handle the command-line version of the request, but there's no fundamental reason it can't work.
If creating PDFs programmatically is giving you a headache, you can use a service instead. In the past, I've used PDFCrowd with good results, and they provide a useful PHP library. You can give them a blob of HTML, using full URLs for any stylesheets and images, and they'll create a PDF for you. The cost per document varies from 0.5 to 4.5 cents depending on your rate plan. There are other services which do the same thing.
If this kind of batch job is a big deal for your company, you might consider an asynchronous job queue like beanstalk (a sketch follows below). You could queue up thousands of these jobs, and a worker script could handle the requests at whatever pace you deem reasonable.
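As an illustration of the queue idea, here is a rough producer/worker sketch using the Pheanstalk client for beanstalkd; the class and method names follow Pheanstalk 2.x and may differ in other versions, and the tube name and payload shape are made up:

<?php
// producer.php - enqueue one job per recipient instead of generating inline.
require_once 'pheanstalk_init.php';
$queue = new Pheanstalk_Pheanstalk('127.0.0.1');
foreach ($recipients as $recipient) {
    $queue->useTube('pdf-email')->put(json_encode($recipient));
}

<?php
// worker.php - a separate long-running CLI process that drains the tube.
require_once 'pheanstalk_init.php';
$queue = new Pheanstalk_Pheanstalk('127.0.0.1');
while ($job = $queue->watch('pdf-email')->ignore('default')->reserve()) {
    $data = json_decode($job->getData(), true);
    // ... build the personalised PDF and email it here ...
    $queue->delete($job);   // remove the job once it has been handled
}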
From my experience, there are two options:
Dynamically generate PDFs using one or more PDF libraries (which can be awfully slow).
OR
Use something like wkhtmltopdf, a simple shell utility that converts HTML to PDF using the WebKit rendering engine and Qt.
Basically, you can loop over n HTML pages and generate PDFs without the overhead of fully dynamic PDF generation!
We've used this to distribute thousands of personalised PDFs on a daily basis, as it quickly converts HTML pages to PDF. There are dependencies, but it works and is less computationally intensive than 'creating' PDFs individually.
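For example, a short sketch of that loop (the paths are placeholders, and the wkhtmltopdf binary must be installed on the machine):

<?php
// Convert every pre-rendered HTML file in a directory to a matching PDF.
foreach (glob('/path/to/html/*.html') as $htmlFile) {
    $pdfFile = '/path/to/pdf/' . basename($htmlFile, '.html') . '.pdf';
    shell_exec('wkhtmltopdf ' . escapeshellarg($htmlFile) . ' ' . escapeshellarg($pdfFile));
}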
Hope this helps.
If you are trying to call the script over HTTP, it will time out based on the max_execution_time specified in php.ini.
You need to write a PHP script that can be run from the command line and then schedule it via a cron job. The script can read one user at a time, put together their PDF file, and email it. After that, you might have to run some performance checks to see whether the server can handle the process.
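A minimal sketch of that kind of command-line script, assuming Zend's autoloader is already set up; the PDO connection details, table layout, and the buildPersonalisedPdf() helper are assumptions for illustration:

<?php
// send_next.php - run from cron; handles one recipient per invocation.
$db  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$row = $db->query('SELECT * FROM registrations WHERE sent = 0 LIMIT 1')
          ->fetch(PDO::FETCH_ASSOC);
if (!$row) {
    exit;   // nothing left to send
}

$pdf = buildPersonalisedPdf($row);   // hypothetical wrapper around Zend_Pdf

$mail = new Zend_Mail();
$mail->addTo($row['email'])
     ->setSubject('Your registration')
     ->setBodyText('Please find your personalised PDF attached.');
$mail->createAttachment($pdf, 'application/pdf');
$mail->send();

$db->prepare('UPDATE registrations SET sent = 1 WHERE id = ?')
   ->execute(array($row['id']));

Because each run handles a single user, memory is released between invocations and the web server is never involved.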

Application level function in PHP (similar to page load in global.asax)

I am currently building an application in PHP using CodeIgniter. Usually, in .NET applications, if I need to execute a particular function only when the application first starts up, I can put the code in Global.asax and that code will be executed then. How do I simulate a similar functionality in PHP?
For example, I have a CSS file that has to be parsed server side (need to Akamai images, so the path is different in development, QA and production). In .NET, instead of parsing the files all the time when a user hits the application, I only do it when the application is restarted. How do I do this in PHP?
One solution, which might be more complex than you're looking for, would be to separate this rewriting functionality from your web server (e.g. have a script that you run manually whenever you need to rebuild the CSS files).
If you don't want to do this manually, you could have a check take place when your application loads that does the CSS replacement and then sets a flag in a flat file or environment variable, which your application would later use to determine that it does not need to rerun the CSS replacement. Just saw someone else's post regarding using Memcache; that would work great for this flagging.
There is no concept of an "application" in PHP. The script is the be-all and end-all of the "application". There's no state maintained when there's no active connection, beyond what gets stored in $_SESSION on a per-user basis.
You can force PHP to load a particular file before starting the first line of code in any particular script with the ini settings auto_prepend_file (and the same for shutdown with auto_append_file).
If you want to cache the processed CSS file, you can write it out and have some other script periodically refresh it. A cron job would fit the bill for this case.
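For the CSS case specifically, the periodically refreshed script could look something like this sketch; the template token, environment detection, and paths are all assumptions:

<?php
// build_css.php - regenerate the environment-specific stylesheet.
$env   = getenv('APP_ENV') ?: 'development';
$hosts = array(
    'development' => '/images',
    'qa'          => 'http://qa-cdn.example.com/images',
    'production'  => 'http://akamai.example.com/images',
);

$css = file_get_contents('/path/to/styles.css.tpl');        // source template
$css = str_replace('{{IMAGE_HOST}}', $hosts[$env], $css);   // swap in the path
file_put_contents('/path/to/public/styles.css', $css);      // static result

Run it once per deploy (or from cron) and every user request is then served the plain static file, with no per-request parsing.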

Automatic file transfer (daily)

Is it possible to automatically download XML files from one server to another on a daily basis with PHP?
The goal is to create a web application in CakePHP that makes use of an XML report that comes from an online accounting server.
So it can be done using a cron job? But is cron supported with PHP?
Where can I configure that cron job?
What kind of code should I write to get the file from the accounting server in the first place?
Are both servers running Linux? Maybe something like rsync would be your best bet, since PHP isn't designed for this kind of task out of the box.
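If both machines do run Linux with SSH access, a single crontab line is enough; the host and paths here are placeholders:

# transfer the XML reports every night at 02:00
0 2 * * * rsync -az user@accounting.example.com:/reports/ /var/www/app/webroot/reports/

If rsync isn't an option and the report is reachable over HTTP, a cron-driven PHP script is a workable fallback (URL and path again placeholders):

<?php
// fetch_report.php - download the daily XML report and store it locally.
$xml = file_get_contents('https://accounting.example.com/reports/daily.xml');
if ($xml !== false) {
    file_put_contents('/var/www/app/webroot/reports/daily.xml', $xml);
}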

PHP - Memcache - HTML Caching

I would like to create a caching system that will bypass some mechanisms in order to improve the performance.
I have some examples:
1-) I have a dynamic PHP page that is updated every hour. The page content is the same for every user. So in this case I can either:
a) create an HTML page that is regenerated every hour. In this case I would like to bypass PHP, so there should be a static page, and when the database is updated a new HTML file is generated. How can I do this? I could create a crontab script that generates the HTML file, but it does not seem like an elegant way.
b) cache the output in memory, so the web server updates the content every hour. I guess I need a memory-cache module for the web server. There is an unofficial memcache module for lighttpd, but it does not seem stable; I have also heard of a memcache module for nginx but don't know whether it is feasible or not. This way seems more elegant and workable, but how? Any ideas? (Again, I would like to bypass PHP in this case.)
Another example: I have a dynamic PHP page that is updated every hour, where only the user-details part is fully dynamic (a user logs in or out and sees his/her status in that section).
Again, how can I create a caching system for this page? I think if I can find a solution for the first example, then I can use AJAX in that part with the same solution. Am I correct?
edit: I guess I did not make myself clear. I would like to bypass PHP completely. The PHP script will be run once an hour; after that, no PHP call will be made. I would like to remove its overhead.
Thanks in advance,
Go with static HTML. Every hour, simply update a static HTML file with your output. You'll want an hourly cron job that runs a PHP script using fopen() and fwrite() to write the file. There's no need to hit PHP to retrieve the page whatsoever. Simply add a .htaccess mod_rewrite rule for that particular page to maintain your current URL naming (see the sketch after the cron example below).
Although not very elegant, static HTML with gzip compression is, to me, more efficient and would use less bandwidth.
An example of using cron to run a PHP script hourly:
// run this command in your console to open the editor
crontab -e
Enter these values:
01 * * * * php -f /path/to/staticHtmlCreater.php > /dev/null
The last portion ensures you will not have any output. This cron would run on the first minute of every hour.
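The generator script itself only needs to fetch the rendered page and write it out, as described above; a sketch with placeholder URL and paths:

<?php
// staticHtmlCreater.php - regenerate the static copy of the hourly page.
$html = file_get_contents('http://example.com/page.php');   // render once via HTTP
if ($html !== false) {
    $fp = fopen('/var/www/html/page.html', 'w');
    fwrite($fp, $html);
    fclose($fp);
}

And a possible mod_rewrite rule to keep the original URL (adjust the pattern to your page):

# .htaccess - serve the pre-built static copy under the old URL
RewriteEngine On
RewriteRule ^page$ /page.html [L]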
UPDATE
Either I missed the section regarding your dynamic user-profile information, or it was added after my initial comment. If you are only using a single server, I would suggest you switch to APC, which provides both opcode caching and a caching mechanism faster than memcached (for a single-server application). If the user's profile data is below the fold (below the user's window view), you could potentially wait to make the AJAX request until the user scrolls down to a specified point. You can see this functionality used on the Facebook status page.
If this is just a single web server, you could just use PHP's APC module to cache the contents of the page. It's not really designed to cache entire pages, but it should do in a pinch.
Edit: I forgot to mention that APC isn't (yet) shipped with PHP, but can be installed from PECL. It will be shipped as part of PHP 6.
A nice way to do it is to have the static content stored in a file. Things would work like this:
your PHP script is called
if your content file was modified more than 1 hour ago (checked with filemtime($yourFile))
re-generate content + store it in the file + send it back to the client
else
send the file contents as is (with readfile($yourFile) or echo file_get_contents($yourFile))
Works great in all cases, even under heavy load.
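A direct translation of those steps into PHP, as a sketch; the cache path, lifetime, and the generatePage() function are placeholders:

<?php
// Serve the page from a file cache, regenerating it at most once an hour.
$cacheFile = '/tmp/page_cache.html';
$maxAge    = 3600;   // one hour, in seconds

if (!file_exists($cacheFile) || filemtime($cacheFile) < time() - $maxAge) {
    $content = generatePage();                  // hypothetical regeneration step
    file_put_contents($cacheFile, $content);    // store for subsequent requests
    echo $content;
} else {
    echo file_get_contents($cacheFile);         // cached copy is still fresh
}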
