Best Way to Handle Background Processes - PHP

I'm trying to implement an iCal synchronization service for my project. There are 10,000+ listings (with more to come) in the database, and almost every listing has a Google Calendar iCal URL that needs to be synchronized every 12 hours. Synchronizing a single iCal URL takes about 0.5-1 second, so processing 10,000+ items will take serious time.
I'm not sure how to handle the synchronization process. I'm thinking of using Gearman, but I'm not sure if it's the best way. If I use Gearman, what would be the pros and cons? How would I implement Gearman to handle the iCal synchronization?
I also found Braincrafted's BackgroundProcess library, written in PHP. I'm not sure pure PHP can handle such a heavy workload. I might consider using it too, but I'm still trying to figure out what the best way is.

I'm sure you've solved this by now, but Gearman would be a possible solution. The beauty of Gearman is that you write fairly simple workers, each of which can have a single function: synchronizing one particular iCal URL.
You would then start as many workers as you're comfortable having make simultaneous requests to Google's Calendar API, so you can easily scale performance up or down by starting or killing off worker processes.
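For illustration, a minimal worker could look something like this, assuming the PECL gearman extension is installed; the `sync_ical` function name and the `syncIcalUrl()` helper are made up here, not anything from your code:

```php
<?php
// Minimal Gearman worker sketch. Run as many copies of this script as you want workers.
// syncIcalUrl() is a hypothetical helper that fetches and stores one calendar.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

// Register a single function: synchronize one iCal URL per job.
$worker->addFunction('sync_ical', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    syncIcalUrl($payload['listing_id'], $payload['ical_url']);
});

// Block and process jobs until an error occurs.
while ($worker->work()) {
    if ($worker->returnCode() !== GEARMAN_SUCCESS) {
        break;
    }
}
```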
Updating calendars at regular intervals could be done by having a cron script fetch all calendars that haven't been updated in the last 12 hours and add them to gearmand as background tasks. You can also throttle the number of requests by running the cron script every 15 minutes with a hard limit of, say, 250 calendars per run, which spreads updates out more evenly than a single 12-hour batch.
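As a rough sketch of that cron script (the DSN, table, and column names are assumptions for illustration only):

```php
<?php
// Cron-driven producer sketch: fetch stale calendars and queue them as background jobs.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->query(
    "SELECT id, ical_url FROM listings
     WHERE ical_synced_at < NOW() - INTERVAL 12 HOUR
     ORDER BY ical_synced_at ASC
     LIMIT 250"
);

$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

foreach ($stmt as $row) {
    // doBackground() returns immediately; the workers do the actual fetching.
    $client->doBackground('sync_ical', json_encode([
        'listing_id' => $row['id'],
        'ical_url'   => $row['ical_url'],
    ]));
}
```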
If you need more specific advice on how to implement it with gearmand, you'd have to ask a more specific question with the example code you're using, but there's nothing stopping you from using Gearman for this task.

Related

What is a typical Gearman flow for database modification?

Would appreciate some help understanding typical best practices in carrying out a series of tasks using Gearman in conjunction with PHP (among other things).
Here is the basic scenario:
A user uploads a set of image files through a web-based interface. The PHP code responding to the POST request creates a database entry for each file (mostly with null columns), queues a Gearman job for each file to do the analysis, generates a status page, and exits.
The Gearman worker gets a job for a file and starts a relatively long-running analysis. The result of that analysis is a set of parameters that need to be inserted back into the database record for that file.
My question is, what is the generally accepted method of doing this? Should I use a callback that will ultimately kick off a different php script that is going to do the modification, or should the worker function itself do the database modification?
Everything is currently running on the same machine; I'm planning on using Gearman for background scheduling, rather than for scaling by farming out to different machines, but in any case any of the functions could connect to the database wherever it is.
Any thoughts appreciated; just looking for some insights on how this typically gets structured and what might be considered best practice.
Are you sure you want to use Gearman? I only ask because it was the de facto PHP job server about 15 years ago, but it hasn't been a reliable solution for quite some time. I'm not sure if things have drastically improved in the last 12 months, but the last time I evaluated Gearman, it wasn't production capable.
Now, on to the questions.
what is the generally accepted method of doing this? Should I use a callback that will ultimately kick off a different php script that is going to do the modification, or should the worker function itself do the database modification?
You are going to follow this general pattern with any job queue:
Collect a unit of work. In your case, it will be one of the images plus any information about who that image belongs to: user ID, etc.
Submit the work to the job queue with this information.
Job queue's worker process picks up the work and starts processing it. This is where I would create the database records, since you can then opt not to create them if the job fails.
The job queue is going to track which jobs have completed and usually their completion status. If you are using Gearman, this is the gearmand process. You also need something to pick up work and process it, which I'll refer to as the job worker. The job worker is where the concurrency happens, which I think is what you were referring to when you said "kick off a different php script."

You can just kick off a PHP script at an interval (with supervisord or a cron job) for a kind of poll & fork approach. It's not the most efficient approach, but it doesn't sound like that will really matter for your application's use case. You could also use pcntl_fork or pthreads in PHP to get more control over your concurrent processes and implement a worker pool pattern, but it is much more complicated than just firing off a script (see the sketch after the resource list below). If you are interested in trying to implement some concurrency in PHP, I have a proof-of-concept job worker for beanstalkd available on GitHub that implements a worker pool with both fork and pthreads. I have also included a couple of other resources on the subject of concurrency.
Job Worker (pthreads)
Job Worker (fork)
PHP Daemon Example
PHP IPC Example
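As a very rough illustration of the poll & fork idea mentioned above (my own sketch, not taken from the linked proof-of-concept; fetchPendingJobs() and processJob() are hypothetical helpers, and the pcntl extension is required):

```php
<?php
// Rough poll & fork worker pool sketch.
$maxWorkers = 4;
$running = 0;

foreach (fetchPendingJobs() as $job) {
    if ($running >= $maxWorkers) {
        pcntl_wait($status);      // block until one child exits before forking another
        $running--;
    }

    $pid = pcntl_fork();
    if ($pid === -1) {
        exit('Could not fork');
    }
    if ($pid === 0) {
        processJob($job);         // child: do the analysis and write results to the DB
        exit(0);                  // child must exit, or it will continue the loop
    }
    $running++;                   // parent: keep track of live children
}

while ($running-- > 0) {
    pcntl_wait($status);          // reap any remaining children
}
```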

Run Special Tasks According To User Defined Times

I'm surprised I haven't been able to find something on this here - so if I've just completely missed it, please direct me to the proper thread.
Before I dive into any code, I'm trying to gather some good ideas for handling this situation.
We're developing a website with a list of tasks the user can select for the server to execute on their behalf. Automated emails, text messages, calendar reminders, etc.
I first went down the road of thinking about cron, but since the times and tasks for each user will likely change throughout each day, I figured that involving cron directly for each task wouldn't scale and could get pretty messy and buggy.
My next thought was to run a cron script every night at midnight and generate a task-list for the next day - but I'd still need cron or some sort of cron-like timing daemon to check the list against the time every minute.
I've run through several ideas, but they all seem fairly heavy on polling or processor time. I'd like to find a good lightweight solution that can handle up to several thousand user-defined tasks per day.
I'm working with your basic LAMP7 stack. If anybody has dealt with a similar task, I'm just looking for some good ideas to consider.
Thanks in advance.
You can use a ReactPHP application running in the background on your machine.
Then you can create a simple HTTP server in your ReactPHP application for receiving the user data from your web server (the LAMP7 stack you mentioned). Once you've received that data, you can trigger those events by setting asynchronous timers on the event loop.
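A minimal sketch of that idea, assuming recent react/http, react/socket, and react/event-loop packages installed via Composer; the request payload shape and the runUserTask() helper are made up for illustration:

```php
<?php
// Long-running ReactPHP process: accepts task requests over HTTP and fires them
// later via timers on the event loop.
require __DIR__ . '/vendor/autoload.php';

use Psr\Http\Message\ServerRequestInterface;
use React\EventLoop\Loop;
use React\Http\HttpServer;
use React\Http\Message\Response;
use React\Socket\SocketServer;

$server = new HttpServer(function (ServerRequestInterface $request) {
    $data  = json_decode((string) $request->getBody(), true);
    $delay = max(0, (int) ($data['delay_seconds'] ?? 0));

    // Schedule the task; the response returns immediately while the timer waits.
    Loop::addTimer($delay, function () use ($data) {
        runUserTask($data); // hypothetical: send the email / text / calendar reminder
    });

    return new Response(202, ['Content-Type' => 'application/json'], json_encode(['scheduled' => true]));
});

$server->listen(new SocketServer('127.0.0.1:8080'));
```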

Making an app that executes code after an amount of time

Question
Using PHP & jQuery, how would you execute code after a given amount of time, say 1 month (even after the user has closed the browser, etc.)?
Scenario
I've wanted to build an application that does something after an amount of time specified by the user, sort of like Hootsuite, but I can't get my head around how it would work.
I know you can use Node.js (I struggle to understand and implement this in any of my Laravel projects...), but even then, wouldn't the server be under a lot of stress if, say, 1,000 people had something waiting to be executed for a whole month or even a year while it was still handling other user requests?
I've looked around a bit and cron jobs came up, but that doesn't sound like what I was looking for. I'm not sure; I'd be grateful if anyone can explain how they think I could go about it.
Essentially what you're looking for is a scheduling system. The UNIX cron tool has come up in your searches because it is a scheduling tool: it allows UNIX users to schedule tasks to happen at certain times. Other operating systems also have task schedulers.
Schedulers
The principal implementation strategy for a scheduler is some kind of polling mechanism, i.e., a software component which periodically checks to see if there are any scheduled tasks which are now due to be executed and, if so, executes them.
Implementation strategies
In order to implement something like this you would need a way to store information about scheduled tasks (e.g. when they're supposed to happen, who they belong to, what they're supposed to do). For example, you might use a database management system, or a file on disk.
You would also need a component to do the polling. This could be a daemon process (i.e. a process that is always running in the background) which includes a sleep (or wait or timeout) call so that it checks for scheduled tasks at intervals, rather than constantly (and thereby most likely consuming all the CPU cycles!). Or it could be a program (in PHP if you like) which is itself run by cron on the host system, say every five minutes, which checks for scheduled tasks and then executes them, perhaps in separate processes. If you were to use cron, there are numerous PHP wrappers to help, such as https://packagist.org/packages/peppeocchi/php-cron-scheduler.
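For example, a cron-run checker in plain PHP could be as simple as the sketch below; the scheduled_tasks table, its columns, and the executeTask() dispatcher are assumptions for illustration:

```php
<?php
// Poll-based checker sketch, intended to be run by cron every minute or few minutes.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Find tasks that are due and not yet executed.
$due = $pdo->query(
    "SELECT id, user_id, type, payload FROM scheduled_tasks
     WHERE run_at <= NOW() AND executed_at IS NULL"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($due as $task) {
    executeTask($task); // hypothetical dispatcher: send the email, text, reminder, etc.

    $pdo->prepare("UPDATE scheduled_tasks SET executed_at = NOW() WHERE id = ?")
        ->execute([$task['id']]);
}
```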
Services
However, instead of implementing all this yourself, you may consider making use of an existing service. There seem to be several options, including at least one free (within limits) service: https://atrigger.com/.

Should I use Laravel Queues to manage threads across my application

I am looking to hit multiple 3rd-party APIs to gather information for a user's search query. I am planning to spin off a thread for each API I want to hit to minimize the response time on my end. I also want to limit the number of threads my application can have running at any one time due to memory/CPU concerns.
Since I am using Laravel as my framework, I was trying to accomplish this using Laravel queues, but it seems that I might have trouble getting the response data from the Job.
Are Laravel queues the correct way to tackle this? If so, how do I listen for the job's status and retrieve the data once the job is complete? I see some things that point towards passing a closure to the job, but something just isn't clicking for me.
It depends. A job queue and worker pool might be appropriate if there are a really huge number of API calls to make, especially if those API calls can be very slow. But, I'd try to avoid all that architecture unless you're really sure you need it.
To start, I'd look at doing async requests to the external APIs and try to keep the whole thing in a single process. The Guzzle HTTP client library provides a very programmer-friendly API for doing these kinds of asynchronous requests.
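As a sketch of that approach with Guzzle 7 and guzzle/promises 1.4+ (the URLs are placeholders), concurrent requests look roughly like this:

```php
<?php
// Concurrent requests with Guzzle promises: all three calls run in parallel and the
// process blocks only once, when the combined promise is resolved.
require __DIR__ . '/vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client(['timeout' => 10]);

$promises = [
    'flights' => $client->getAsync('https://api.example.com/flights?q=NYC'),
    'hotels'  => $client->getAsync('https://api.example.org/hotels?q=NYC'),
    'cars'    => $client->getAsync('https://api.example.net/cars?q=NYC'),
];

// settle() waits for all promises, whether they fulfil or reject.
$results = Utils::settle($promises)->wait();

$data = [];
foreach ($results as $name => $result) {
    if ($result['state'] === 'fulfilled') {
        $data[$name] = json_decode((string) $result['value']->getBody(), true);
    } else {
        $data[$name] = null; // that API call failed; decide how to degrade gracefully
    }
}
```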
If the external requests are really numerous or slow, you might consider using a queue. But in that case, you're looking at implementing a bunch of logic to queue all the jobs, then poll until they're done (giving feedback to your user along the way), and finally return the merged result. That may end up being necessary, but I'd start with the simpler implementation I describe above.

How to fire off multiple PHP scripts in the background?

this side of PHP is rather new to me.
I am interested in firing off a large number (25-50) of separate processes from a parent script. I would like the parent script not to wait for these other scripts to complete, AND I would like these other scripts to run in parallel.
Each script would run for a specified amount of time calling a webservice.
Can anyone give me some direction with this? I'm not asking for a coded answer specifically, but I just need some guidance.
Much thanks.
It really depends on what you want to achieve. #Julien's forking method could work, but it is not preferable if your web service calls are data intensive. I'm not saying forking is bad; on the contrary, it works, but with the number of different web services you want to call, you should have a better way to manage things.
Another thing you can do is base this on cron jobs. For example, if you're calling these web services for some users in your app, create a queue: a DB table where you add records that need to be processed. If you are using Cake, use Cake Shells. Then set up cron jobs that call the shells that process these records every now and then. Divide the services into separate queues, at least for those that differ significantly in logic. This way you also divide your risk, because a failure in one of the web service calls won't jeopardise the rest. Have separate logging for each queue, which will let you quickly track down problems; when consuming web services, problems are very often external to your application.
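If you do just want to fire processes off from the parent without waiting (the original question), a rough sketch of a non-blocking launch looks like this; worker.php is a hypothetical script that makes one web service call:

```php
<?php
// Fire-and-forget launcher sketch: starts N child PHP processes in the background
// and returns immediately (Unix-like systems).
$jobs = range(1, 25); // e.g. 25 separate tasks

foreach ($jobs as $jobId) {
    $cmd = sprintf(
        'php %s %d > /dev/null 2>&1 &',
        escapeshellarg(__DIR__ . '/worker.php'),
        $jobId
    );
    exec($cmd); // the trailing & plus redirected output means exec() does not wait
}

echo "Launched " . count($jobs) . " background workers\n";
```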
