I am trying to create a MySQL trigger that will invoke a php script. I have a MySQL server instance running in RDS and would like to use the php script to send a message to my SQS messaging system.
Where do I save the php scripts?
Do I need to install the PHP SDK for SQS on my EC2 instance?
Yes, you can use triggers normally.
Use these steps:
In the "Parameter Groups" menu, change the variable
log_bin_trust_function_creators to "ON" or "1".
Then "Modify" and "Reboot" the instance. This is mandatory!
Verify with the mysql client that the variable was really changed. The command is: SHOW VARIABLES LIKE 'log_bin_trust%'
Now you can create your triggers normally. The main difficulty is checking everything correctly: your instance must actually use that parameter group, and the parameter must be set correctly in it.
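For example (a minimal sketch with placeholder table names, not something from your schema), an AFTER INSERT trigger that records work for a separate PHP process could look like this once the parameter is in effect:

DELIMITER //
CREATE TRIGGER orders_after_insert
AFTER INSERT ON orders
FOR EACH ROW
BEGIN
  -- A trigger cannot call PHP directly; it can only record work to be done,
  -- e.g. in an outbox table that a PHP script later reads and forwards to SQS.
  INSERT INTO sqs_outbox (order_id, payload, created_at)
  VALUES (NEW.id, CONCAT('order:', NEW.id), NOW());
END//
DELIMITER ;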
Forget about triggers, especially on Amazon RDS.
Use a cron to execute a PHP script every 5 minutes (for example) that looks for unsent messages and sends them using the SQS messaging system.
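A minimal sketch of what that cron-driven script could look like (assuming an sqs_outbox table like the one above, the AWS SDK for PHP installed via Composer, and placeholder connection details and queue URL):

<?php
// send_outbox.php - run from cron, e.g. */5 * * * * php /path/to/send_outbox.php
require __DIR__ . '/vendor/autoload.php';

use Aws\Sqs\SqsClient;

$pdo = new PDO('mysql:host=my-rds-endpoint;dbname=mydb', 'user', 'password'); // placeholders
$sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/my-queue';      // placeholder

// Fetch unsent rows, push each one to SQS, and only then mark it as sent.
$rows = $pdo->query('SELECT id, payload FROM sqs_outbox WHERE sent_at IS NULL')
            ->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    $sqs->sendMessage([
        'QueueUrl'    => $queueUrl,
        'MessageBody' => $row['payload'],
    ]);
    $stmt = $pdo->prepare('UPDATE sqs_outbox SET sent_at = NOW() WHERE id = ?');
    $stmt->execute([$row['id']]);
}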
We have a website running on multiple Azure instances – typically between 2 and 5.
There is a PHP script I would like to schedule to run every few minutes on each instance. (It just makes a local copy of data from a system that couldn't handle the load from all our users hitting it in real-time.)
If it were just one instance, that would be easy - I'd use Azure Scheduler to call www.example.com/my-scheduled-task.php every 5 minutes.
But the script needs to run on each instance, so that every instance has a reasonably up-to-date copy of the data. How would you achieve this? I can't work out if it's something in Azure Scheduler, or if I should be looking at some sort of startup script?
You can use a continuous webjob for that.
Just tweak your php script to have a loop and add a sleep of a few minutes between runs of your code.
The continuous WebJob will run on all of your instances, and even if something fails it will be brought back up.
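A rough sketch of that loop (refreshLocalData() is a placeholder for whatever your copy-the-data logic is):

<?php
// run.php - deployed as a continuous WebJob, so it runs on every instance
while (true) {
    try {
        refreshLocalData(); // placeholder: fetch the remote data and store a local copy
    } catch (Exception $e) {
        error_log('Refresh failed: ' . $e->getMessage());
    }
    sleep(5 * 60); // wait five minutes between runs
}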
Per my experience, a PHP WebJob running on each of your web app instances is a good solution, as #AmitApple said. However, I think you can try using a scheduled WebJob with a CRON expression to ensure a fixed start time, rather than a continuous one with a sleep. And please make sure the script can complete within the interval.
You can refer to the section Create a scheduled WebJob using a CRON expression of the doc Run Background tasks with WebJobs to learn how to get started.
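If it helps, the schedule for such a WebJob is typically given as a 6-field CRON expression (seconds first) in a settings.job file placed next to the WebJob script; a sketch for running every five minutes would be:

{
  "schedule": "0 */5 * * * *"
}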
Please see the note in the section Create a continuously running WebJob: https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/#CreateContinuous.
Note:
If your web app runs on more than one instance, a continuously running WebJob will run on all of your instances. On-demand and scheduled WebJobs run on a single instance selected for load balancing by Microsoft Azure.
For Continuous WebJobs to run reliably and on all instances, enable the Always On configuration setting for the web app; otherwise they can stop running when the SCM host site has been idle for too long.
I know we have cron jobs in PHP, but I have a project in the development phase and we won't have cPanel access.
We have a PHP + MSSQL application that needs to check the database every minute, collect the data and send a mail to a store administrator.
How can we do this?
As an alternative to cron jobs, you can implement your function in a file (e.g. /very/secret.php). If your job needs to be secure, make sure the function can only be called when it receives the right parameter, e.g. /very/secret.php?key=long-random.
Then use a free cron job service on the web such as https://www.easycron.com/ or https://www.setcronjob.com/ (just do a web search for "free online cron jobs"). You give them your URL and some configuration, and your job will be executed by them at the specified time.
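A minimal sketch of such a guarded endpoint (the key value and sendStoreReport() are placeholders for your own secret and logic):

<?php
// /very/secret.php - only runs when called with the correct key, e.g. ?key=long-random
$expectedKey = 'long-random'; // placeholder: use a long random string of your own
if (!isset($_GET['key']) || !hash_equals($expectedKey, $_GET['key'])) {
    http_response_code(403);
    exit('Forbidden');
}

sendStoreReport(); // placeholder: query the database and mail the store administrator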
I understand that you need to run a script every minute that checks the database, collects the data, and sends a mail to the store administrator.
This can be done using a cron job if you are on a Linux server; if you are on Windows, you can schedule a task to run that script every minute.
Note: this has nothing to do with cPanel. cPanel just provides a user-friendly GUI for scheduling cron jobs.
If you are not using cPanel, you can do it manually.
If you are using Linux, here is how to add a cron job - http://www.cyberciti.biz/faq/how-do-i-add-jobs-to-cron-under-linux-or-unix-oses/
If you are using Windows, here is how to schedule a task - http://windows.microsoft.com/en-us/windows/schedule-task#1TC=windows-7
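For example, on Linux the crontab entry for an every-minute run could look like this (the script path is a placeholder):

* * * * * /usr/bin/php /var/www/html/check_orders_and_mail.php >> /var/log/check_orders.log 2>&1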
Hi guys! My story: I'm making a PHP application with CodeIgniter. When my page is loaded I can click a button that calls my PHP API, which makes some changes in the database and returns the result (true or false if the change in the database wasn't successful). After the database change I also call a PHP script that sends push notifications to registered Android devices that are stored in my database.
My problem: when there are a lot of registered Android devices it takes a long time to load the page (PHP waits for every GCM request to come back). Is there a way to load the page right after the database changes AND make the GCM requests in the background/asynchronously?
EDIT #1: I am on an Ubuntu server.
There are a number of different ways to address this, but the most common solution is to use some form of a message queue to offload the work to separate processes.
You could just store the messages in a separate table in your database and have a cron script run every few minutes to send those messages (and only delete them from the table once they were successfully sent), or you could look into RabbitMQ, Gearman or beanstalkd, which are designed to be more robust and more easily scaled.
Recommended reading:
http://www.slideshare.net/appdynamics/scaling-php-in-the-real-world-23619565
https://github.com/kr/beanstalkd/wiki/client-libraries
http://www.sitepoint.com/introduction-gearman-multi-tasking-php/
If you can separate the push code into its own standalone script, you could call it with
exec("php /path/to/script.php > /dev/null &");
This should run it in the background (on Linux) without the script that calls it waiting.
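If the standalone script needs to know which record triggered it, you can pass an argument along; a rough sketch (the paths, file names and argument handling are just an illustration):

<?php
// In the controller, after the database change:
$id = 42; // placeholder: the id of the record that was just changed
exec('php /path/to/send_push.php ' . escapeshellarg($id) . ' > /dev/null 2>&1 &');

// In send_push.php:
$id = isset($argv[1]) ? (int) $argv[1] : 0;
// ...load the registered devices related to $id and send the GCM requests here...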
Another option might be to store notifications in the database as a queue and have a script run via cron every N minutes to check the queue and push notifications from it.
I am a new user of the Yii framework, and I have implemented YiiMail, which works well. But now I want to send mails automatically at the end of each month. Is that possible?
Steps are as follows:
Set up a crontab on your server that will run a PHP command. On Windows you can set up a scheduled task which runs <yourdir>/protected/yiic-dev.bat with the proper commands (take a look at CConsoleCommand), which will send the emails. On the production server you do the same, but with php <yourdir>/protected/yiic.php as the cron command, passing the command to run as an argument (if it's a Windows server, just make a scheduled task for <yourdir>/protected/yiic.bat with the proper arguments).
Send the emails in the corresponding command action, using the mail class (see the documentation for YiiMailer).
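A bare-bones sketch of such a console command (the class name, helper method and crontab schedule below are my own placeholders; adjust them to your app):

<?php
// protected/commands/MonthlyMailCommand.php
class MonthlyMailCommand extends CConsoleCommand
{
    public function run($args)
    {
        $this->sendMonthlyEmails();
    }

    private function sendMonthlyEmails()
    {
        // placeholder: query your users, compose the message and send it via YiiMailer
    }
}

// Example crontab entry, running just after each month ends (00:05 on the 1st):
// 5 0 1 * * php /path/to/protected/yiic.php monthlymail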
If there are further problems and you run into more specific issues you can't work around, I suggest you make new questions for them.
Running Fedora, PHP/Gearman/MySQL/Drizzle.
Built Gearman/Drizzle from source, and have the process running on a linux/fedora box. I created the mysql test table, and can see that the Gearman Daemon instance can access/interface with the mysql service. I'm running the Gearman and mysql processes on the same box, using TCP.
When Gearman is started, and points to the MySQL account, I can see the initial select statements in the DEBUG information that's displayed as the Gearman process runs.
However, I'm not sure what I need to do to actually test that a job from the Client is stored in the mysql Table.
I created a test client that replicates the Gearman client/worker "Review" test, which normally works when the worker is running, and ran the client without the worker. I can see in the DEBUG output that the client connects to the Gearman daemon, but when I examine the mysql table, nothing is in it.
So my question really boils down to determining what I need to do to actually be able to see/ensure that jobs/data is really written to the actual mysql table.
Is there a given flag, method to call somewhere to establish that data is to be stored in the mysql table if not processed by the worker? Shouldn't a job be stored in the table, and then removed once it's processed? Or am I missing something in the flow?
http://gearman.org/manual/job_server/#persistent_queues
I guess what you are looking for is:
gearmand --queue-type=mysql --mysql-host=hostname
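One thing worth checking: as far as I know, the persistent queue only stores background jobs; foreground jobs, where the client blocks waiting for the result, are not written to the table. A quick test with the PHP gearman extension, assuming gearmand was started with the MySQL flags above and the function/payload names are just placeholders:

<?php
// Submit a background job with no worker running, then look at the MySQL queue table.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

// doBackground() returns a job handle immediately; the job should then sit in
// the configured MySQL table until a worker picks it up.
$handle = $client->doBackground('reverse', 'Hello persistent queue');
echo "Queued job: $handle\n";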