How to run a PHP file from another server? - php

I am trying to build a video converter that grabs some files from an HTML form and converts them.
I would like to do the conversion on another server, .11, since I don't want to overload the main server, .10.
I can set up a network folder between the two servers, /media, and have a convert.php on .11 that runs the ffmpeg command.
If I run that PHP file from .10, will the video conversion process take resources from .11 or from .10? It seems to me that .10 will be affected even if the PHP file is on .11.
I could do a cron job, but I really don't want to.
For this project I am using Zend Framework.
Any ideas how to solve this issue?
Thanks

I would definitely recommend implementing a queue for this kind of task. Your queue could simply be a MySQL database that maintains a list of outstanding tasks. The workers can check this database for any tasks to be run.
This will give you far more flexibility in terms of scaling up. Tomorrow, if you decide to add two more worker servers/systems, they will fit seamlessly into the queue model.
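A minimal sketch of such a worker in PHP: the video_jobs table, its columns, and the database credentials are assumptions, and the ffmpeg call is just a placeholder for whatever conversion command you need. Run it on the conversion box (.11), either in a loop or from cron, so the heavy lifting never touches .10.

    <?php
    // Hypothetical queue worker for the conversion server (.11).
    // Assumes a MySQL table roughly like:
    //   CREATE TABLE video_jobs (
    //     id INT AUTO_INCREMENT PRIMARY KEY,
    //     source_path VARCHAR(255),
    //     target_path VARCHAR(255),
    //     status ENUM('pending','running','done','failed') DEFAULT 'pending'
    //   );
    $pdo = new PDO('mysql:host=dbhost;dbname=converter', 'worker', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Claim one pending job inside a transaction so two workers never grab the same task.
    $pdo->beginTransaction();
    $job = $pdo->query("SELECT * FROM video_jobs WHERE status = 'pending' ORDER BY id LIMIT 1 FOR UPDATE")
               ->fetch(PDO::FETCH_ASSOC);
    if ($job) {
        $pdo->prepare("UPDATE video_jobs SET status = 'running' WHERE id = ?")
            ->execute(array($job['id']));
    }
    $pdo->commit();

    if (!$job) {
        exit(0); // nothing queued right now
    }

    // Run ffmpeg locally on .11; the shared /media folder holds the source and target files.
    $cmd = sprintf('ffmpeg -i %s %s 2>&1',
        escapeshellarg($job['source_path']),
        escapeshellarg($job['target_path']));
    exec($cmd, $output, $exitCode);

    $pdo->prepare("UPDATE video_jobs SET status = ? WHERE id = ?")
        ->execute(array($exitCode === 0 ? 'done' : 'failed', $job['id']));

The web app on .10 then only has to INSERT a row when the form is submitted; the conversion itself runs entirely on .11.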

Related

Efficiently Monitor Directory for New Files and Run a PHP script on Windows

Are there any simple ways to run a PHP script when a file is added to a specific directory?
On Linux there are perfect tools for this, like inotify/dnotify, but I can't find anything similar for Windows.
If I run a PHP script that loops infinitely, will that have a significant impact on CPU performance (if all it does is check a folder's contents)?
I read that Win32::ChangeNotify could be used, but I'm a noob at Perl, so I have no idea how to set it up.
The easiest way to manage this would be to create a cron job that runs your script every minute (or however often you wish to check).
Edit - I just read the post again and realized you're using Windows. I suppose you can use Scheduled Tasks to do this.
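If you go the Scheduled Tasks route, the PHP side can be a simple polling script run every minute; a minimal sketch (the watched folder and the state-file name are just examples):

    <?php
    // Poll a folder for new files; remember what has already been seen in a state file.
    $watchDir  = 'C:\\incoming';
    $stateFile = 'C:\\incoming\\seen-files.txt';

    $seen    = file_exists($stateFile) ? file($stateFile, FILE_IGNORE_NEW_LINES) : array();
    $current = array();

    foreach (scandir($watchDir) as $entry) {
        if ($entry === '.' || $entry === '..' || $entry === basename($stateFile)) {
            continue;
        }
        $current[] = $entry;
        if (!in_array($entry, $seen)) {
            // New file detected: hand it off to your processing code.
            echo "Processing new file: $entry\n";
            // process_file($watchDir . '\\' . $entry); // hypothetical handler
        }
    }

    // Remember the current listing for the next run.
    file_put_contents($stateFile, implode("\n", $current));

If you prefer one long-running process instead of Scheduled Tasks, wrap the same body in while (true) { ...; sleep(60); }; a one-minute sleep keeps the CPU impact negligible.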

Building a cron job scheduler [closed]

Currently I'm trying to build a good scheduler system as an interface for setting and editing cron jobs on my system. My system is built with Zend Framework 1.11.11 on a Linux server.
I have 2 main problems that I'd like your suggestions on:
Problem 1: The setup of the application itself
I have 2 ways to run the cron jobs:
The first way is to create a scripts folder with a common bootstrap file in it, where I'll load only the resources that I need. Then for each task I'll create a separate script, and in each script I'll include the bootstrap file (a rough sketch of this bootstrap follows below). Finally, I'll add a cron task in the crontab file for each one of these scripts, something like * * * * * php /path/to/scripts/folder/cronScript_1.php.
The second way is to treat the cron job like a normal request (no special bootstrap) and add a crontab entry for each task, something like * * * * * curl http://www.mydomain.com/module/controller/action.
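Roughly, the common bootstrap I have in mind for the first way would be something like this (the paths and the 'db' resource name are just placeholders; it is the usual Zend_Application setup, only without run()):

    <?php
    // scripts/bootstrap.php -- shared by every cron script in the first approach.
    define('APPLICATION_PATH', realpath(dirname(__FILE__) . '/../application'));
    define('APPLICATION_ENV', getenv('APPLICATION_ENV') ? getenv('APPLICATION_ENV') : 'production');

    set_include_path(implode(PATH_SEPARATOR, array(
        realpath(APPLICATION_PATH . '/../library'),
        get_include_path(),
    )));

    require_once 'Zend/Application.php';

    $application = new Zend_Application(
        APPLICATION_ENV,
        APPLICATION_PATH . '/configs/application.ini'
    );

    // Bootstrap only the resources the scripts actually need, not the full MVC stack.
    $application->getBootstrap()->bootstrap('db');

    <?php
    // scripts/cronScript_1.php
    require_once dirname(__FILE__) . '/bootstrap.php';
    // ... task-specific work goes here ...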
Problem 2: The interface to the application
Adding a cron job can also be done in 2 ways:
For each task there will be an entry in the crontab file. When I want to add a new task I must do it via cPanel or some other means of editing the crontab (which might not be available).
Store the tasks in the database and provide a UI for interacting with it (a grid to add tasks and their configuration). Then write only one cron job in the crontab file that runs every minute. This job selects all jobs from the database and checks whether any of them should run now (each task's scheduled time is stored and compared against the server's current time).
In your opinion, which way is better to implement for each part? Is there a ready-made solution for this that is better overall?
Note
I came across Quartz while searching for a ready-made solution. Is this what I'm looking for, or is it something totally different?
Thanks.
Just my opinion, but I personally like both 1 & 2, depending on what your script is intended to accomplish. For instance, we mostly do 1 with all of our cron entries, as it becomes really easy to look at /etc/crontab and see at a glance when things are supposed to run. However, there are times when a script needs to be called every minute because logic within the script will then figure out what to run in that exact minute (i.e. millions of users that need to be processed continually, so you have a formula for which users to handle in each minute of the hour).
Also take a look at Gearman (http://gearman.org/). It enables you to have cron scripts running on one machine that then slice up the jobs into smaller bits and farm those bits out to other servers for processing. You have full control over how far you want to take the map/reduce aspect of it. It has helped us immensely and allows us to process thousands of algorithm scripts per minute. If we need more power we just spin up more "workhorse" nodes and Gearman automatically detects and utilizes them.
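A rough sketch of that pattern with the pecl gearman extension (the host/port, the function name convert_video, and the payload layout are all just examples):

    <?php
    // Client side (e.g. the cron/web machine): queue a background job and return immediately.
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730); // gearmand host and port
    $client->doBackground('convert_video', json_encode(array(
        'source' => '/media/uploads/clip.avi',
        'target' => '/media/converted/clip.mp4',
    )));

    <?php
    // Worker side (one per "workhorse" node): pick up jobs as they arrive.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('convert_video', function (GearmanJob $job) {
        $payload = json_decode($job->workload(), true);
        exec(sprintf('ffmpeg -i %s %s',
            escapeshellarg($payload['source']),
            escapeshellarg($payload['target'])));
    });
    while ($worker->work());

Adding capacity is then just a matter of starting more worker processes on more machines.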
We currently do everything on the command line and don't use cPanel, Plesk, etc., so I can't attest to what it's like editing the crontab from one of those backends. You may want to consider having one person be the crontab "gatekeeper" on your team. Throw the expected crontab entries into a non-web-accessible folder in your project code. Then, whenever a change to the file is pushed to version control, that person is expected to SSH into the appropriate machine and make the changes. I am not sure of your internal structure, so this may or may not be feasible, but it's a good idea for developers to be able to see the way(s) crontab will be executing scripts.
For Problem 2 (the interface to the application) I've used both methods 1 & 2, and I strongly recommend the second one. It takes quite a bit more upfront work to create the database tables and build the UI, but in the long run it makes adding new jobs much easier. I built the UI for my current company and it's so easy to use that non-technical people (accountants, warehouse supervisors) are able to go in and create jobs.
It's much easier than logging onto the server as root, editing the crontab, remembering the patterns and saving. Plus you won't be known as "the crontab guy" whom everyone comes to whenever they want to add something to the crontab.
As for setting up the application itself, I would have cron call one script and have that script run the rest; that way you only need one cron entry. Just be aware that if running the jobs takes a long time, you need to make sure the script only starts when no other instance is running, otherwise you may end up with the same job running twice.
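A minimal sketch of that single dispatcher script, called once a minute from the only crontab entry (* * * * * php /path/to/scripts/dispatcher.php); the scheduled_jobs table, its columns, and the lock-file path are assumptions:

    <?php
    // Dispatcher: run once a minute by the only crontab entry.
    // Hypothetical table: scheduled_jobs(id, command, run_at DATETIME, last_run DATETIME NULL).

    // flock() guard so a slow run can't overlap with the next one.
    $lock = fopen('/tmp/job-dispatcher.lock', 'c');
    if (!flock($lock, LOCK_EX | LOCK_NB)) {
        exit(0); // the previous run is still busy
    }

    $pdo = new PDO('mysql:host=localhost;dbname=scheduler', 'cron', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $due = $pdo->query(
        "SELECT id, command FROM scheduled_jobs
         WHERE run_at <= NOW() AND (last_run IS NULL OR last_run < run_at)"
    )->fetchAll(PDO::FETCH_ASSOC);

    foreach ($due as $job) {
        exec($job['command'], $output, $exitCode);
        $pdo->prepare("UPDATE scheduled_jobs SET last_run = NOW() WHERE id = ?")
            ->execute(array($job['id']));
    }

    flock($lock, LOCK_UN);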

Can PHP attach to a running Windows process?

Are there any PHP functions/libraries that I can use to attach to a running process under the Windows OS?
I'm playing an abandonware game and I would like to make changes to data in various memory locations whilst the game is running.
The game doesn't use shared memory or IPC. I'm hoping PHP allows me to give it the process ID of the game, and then it can attach using some functions/library that I've not come across.
The GDB debugger is one potential way forward, but I'd like to do everything in PHP if possible.
Any thoughts or ideas will be appreciated.
With PHP running on Windows you are able to create instances of ActiveX/COM objects. But that means writing an ActiveX component (in C++, C#, or whatever) which gets all the information and does all the work; PHP would end up being used for display purposes only, and at that point you could just as well generate your HTML output directly from that ActiveX class / C++ / C# / whatever app.
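To illustrate what that would look like from the PHP side, here is a sketch using PHP's built-in COM class; the ProgID MyGameTools.MemoryEditor and its Attach/ReadInt methods are entirely hypothetical, and you would have to write and register such a component (in C++/C#) yourself:

    <?php
    // Illustration only: PHP can drive a registered COM/ActiveX component on Windows,
    // but the component below does not exist; all the memory work happens inside it.
    $editor = new COM('MyGameTools.MemoryEditor'); // hypothetical ProgID
    $editor->Attach(1234);                         // hypothetical: attach to a process ID
    $value = $editor->ReadInt(0x00A3F0C4);         // hypothetical: read memory at an address
    echo $value, "\n";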
Summary: For such a task PHP is not the best choice.

Using Pygments with PHP (Python in PHP)

Is it possible to use Python (specifically Pygments) with PHP? Currently I have a phpBB forum that I'm developing for, and JS syntax highlighters just haven't been working for me. There's already a GeSHi mod, but I want to develop something myself, just for the experience.
Also, would there be performance issues?
There is now a library for this at:
http://derek.simkowiak.net/pygments-for-php/
Pretty much the only way to perform that integration (with PHP as the dominant language) is to shell out. This means starting Python manually every time you need it.
That can be a little slow if you need to do it a lot. You can mitigate this by generating the syntax highlighting when posts are created or edited, not when they are viewed.
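A sketch of that shell-out using pygmentize (Pygments' command-line tool); it assumes pygmentize is installed and on the PATH, with -l selecting the lexer and -f the output formatter:

    <?php
    // Pipe code to pygmentize and capture the highlighted HTML.
    function highlight_code($code, $language = 'php')
    {
        $cmd = sprintf('pygmentize -l %s -f html', escapeshellarg($language));

        $descriptors = array(
            0 => array('pipe', 'r'), // stdin
            1 => array('pipe', 'w'), // stdout
        );
        $process = proc_open($cmd, $descriptors, $pipes);
        if (!is_resource($process)) {
            return htmlspecialchars($code); // fall back to plain escaping
        }

        fwrite($pipes[0], $code);
        fclose($pipes[0]);
        $html = stream_get_contents($pipes[1]);
        fclose($pipes[1]);
        proc_close($process);

        return $html;
    }

Call it once when a post is saved and store the result, rather than running it on every page view.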
If you're interested in diving into Python, you could write an external script or server application to update new posts with syntax-highlighted code. If it were me, I'd retain the original code in one database column and place the syntax-highlighted version in another.
A simple script to update new posts in batches could run as a cron job at whatever interval you find ideal.
To support a near real-time scenario, you could write a server application that sits and waits to be notified of new posts one at a time. For example, upon processing a new post, the PHP application could send the highlighting application a message through an AMQP queue.
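For the queue hand-off, a sketch of the publishing side using the php-amqplib library (the queue name new_posts, the connection details, and the post ID are just examples):

    <?php
    require_once 'vendor/autoload.php';

    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    // Tell the (separate) highlighting application that an example post needs processing.
    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel    = $connection->channel();
    $channel->queue_declare('new_posts', false, true, false, false);

    $channel->basic_publish(new AMQPMessage('123'), '', 'new_posts');

    $channel->close();
    $connection->close();

The consumer would pop post IDs off the same queue and write the highlighted HTML back to the second column.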

Best way of reliably updating content at a specific time across VARIOUS files and formats?

Let's say I have to push a change live at 8 AM EST on Tuesday, May 18th. This change spans various files:
an XML file
a PHP file
about 30 static HTML files with no PHP processing enabled
All of these are hosted on a Linux server with cron.
Is it reliable to set up a cron job that calls a script which takes these files, for example:
templates/template.php
navigation.xml
specials-hot-deal.html
and appends '-old' to them, while renaming the staged files I'll have on the server ("template-new.php", "navigation-new.xml") to the live names at approximately that time?
Is this reliable, or should I just do it manually? Since I'm not familiar with cron, I'll probably have to test it out today. Any weird cron gotchas I should know about?
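A sketch of that swap script (the file names are taken from the question; the cron line, the assumption that the '-new' copies are already staged, and the timezone handling are all illustrative):

    <?php
    // swap.php -- run once by cron at the release time, e.g.:
    // 0 8 18 5 * php /path/to/swap.php   (08:00 server time on May 18th)
    $files = array(
        'templates/template.php',
        'navigation.xml',
        'specials-hot-deal.html',
    );

    foreach ($files as $file) {
        $info   = pathinfo($file);
        $dir    = ($info['dirname'] === '.') ? '' : $info['dirname'] . '/';
        $base   = $dir . $info['filename'];
        $ext    = '.' . $info['extension'];
        $staged = $base . '-new' . $ext;   // e.g. templates/template-new.php

        if (!file_exists($staged)) {
            echo "Missing $staged, skipping\n";
            continue;
        }
        rename($file, $base . '-old' . $ext); // keep the old live file as a backup
        rename($staged, $file);               // promote the staged file to live
    }

One classic gotcha worth knowing: cron uses the server's timezone, not EST, and an entry that restricts both day-of-month and day-of-week fires when either one matches.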
I'm not sure that using a cron job to deploy changes is the best idea. You may be able to accomplish this, but cron is usually used for automated tasks related to the app, such as sending out mails, deleting things older than a certain date, or removing stray images.
Is your application in version control, such as SVN or Git? Could you do your scheduled deployment through your version control?
Just a thought.
