Can I execute a cURL call from a Postgres trigger? - php

I need to kick off a process when a db field reaches a certain status. I have heard you can execute a cURL call from a db trigger, but Google is not being kind enough to return anything I can use.
So I guess my question is three parts:
Can this be done?
Reference?
Alternative Solution?
Workflow:
a db field is updated with a status; I need to kick off a script/request/process that runs the next step in my workflow (a PHP script) that will pull the record from the db, process another step, then update the db with the results.

You shouldn't use triggers for that: a trigger blocks the transaction, so it will make your database very slow. You would also need to install an untrusted language in Postgres, such as PL/sh, PL/Perl or PL/Python.
There are 2 better solutions for this problem:
have a process that connects to the database and LISTENs for NOTIFY events generated by your trigger; this works almost instantly (see the sketch after this list);
periodically check for new data using, for example, a cron script; this works with a delay.
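A minimal sketch of the LISTEN/NOTIFY listener, assuming a table called orders with a status column and a notification channel named status_channel (all names are illustrative; the trigger side is sketched further down):

    <?php
    // listener.php - run from the command line (e.g. under supervisord), not via the web server.
    // Assumes a trigger on the watched table calls pg_notify('status_channel', ...) whenever
    // a row reaches the interesting status.
    $conn = pg_connect('host=localhost dbname=mydb user=myuser password=secret');
    pg_query($conn, 'LISTEN status_channel');

    while (true) {
        if (pg_get_notify($conn, PGSQL_ASSOC)) {
            // Woken up by NOTIFY: fetch whatever is ready and run the existing PHP step on it.
            $res = pg_query($conn, "SELECT id FROM orders WHERE status = 'ready'");
            while ($row = pg_fetch_assoc($res)) {
                // process_next_step($row['id']);  // hypothetical: the asker's existing PHP step
            }
        } else {
            usleep(200000);                        // nothing pending; sleep 200 ms and check again
        }
    }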

If you can call a shell script (http://plsh.projects.postgresql.org/), you can call curl.
But I get a creepy feeling about the approach... if the remote server goes offline, don't you end up with data inconsistency?
Alternative:
I wouldn't put business logic in triggers, only customized constraints or denormalisation.
Do what you need to do in either the middle tier or stored procedures (a rough sketch of the middle-tier option follows).
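As one possible shape for that middle tier (table, column and URL names are all made up): a small PHP worker picks up rows that reached the status, makes the cURL call, and writes the result back, so the HTTP call never runs inside a database transaction.

    <?php
    // worker.php - middle-tier sketch; run it from cron or a loop, never from a trigger.
    $db = pg_connect('host=localhost dbname=mydb user=myuser password=secret');

    // Pick up rows that reached the interesting status.
    $rows = pg_query($db, "SELECT id, payload FROM orders WHERE status = 'ready'");

    while ($row = pg_fetch_assoc($rows)) {
        $ch = curl_init('https://example.com/next-step');   // hypothetical endpoint
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, ['id' => $row['id'], 'payload' => $row['payload']]);
        $result = curl_exec($ch);
        curl_close($ch);

        // Record the outcome; if the remote server is offline, the row simply stays 'ready'.
        if ($result !== false) {
            pg_query_params($db, 'UPDATE orders SET status = $1, result = $2 WHERE id = $3',
                            ['processed', $result, $row['id']]);
        }
    }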
Regards,
//t

I think what you're looking for is a trigger in Postgres that will run the necessary script. Triggers are explained in the documentation, and the syntax for adding one is covered under CREATE TRIGGER. The trigger type you're looking for appears to be an AFTER UPDATE trigger. As far as I know, the script you run will have to check whether the field has the required status, as Postgres will always run the trigger.
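For illustration only, here is what such a trigger might look like, installed from PHP with pg_query(); the table, column, channel and function names are assumptions, and the trigger function itself checks the status before notifying:

    <?php
    // setup_trigger.php - one-off script that installs an AFTER UPDATE trigger.
    $conn = pg_connect('host=localhost dbname=mydb user=myuser password=secret');

    pg_query($conn, <<<'SQL'
    CREATE OR REPLACE FUNCTION on_status_update() RETURNS trigger AS $body$
    BEGIN
        -- The trigger fires on every update, so the function checks the status itself.
        IF NEW.status = 'ready' THEN
            PERFORM pg_notify('status_channel', NEW.id::text);
        END IF;
        RETURN NEW;
    END
    $body$ LANGUAGE plpgsql;

    CREATE TRIGGER status_changed
        AFTER UPDATE ON orders
        FOR EACH ROW
        EXECUTE PROCEDURE on_status_update();
    SQL
    );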

Related

Insert data with PHP on server with cron (not working)

On the website I have a PHP script which needs to be called in order to collect data into an array, which it then sends to MySQL. I have tried a cron job calling it with www.website.com/script/sc1.php?value="val", and cron says it is done and OK, but it's not (the data is not inserted into MySQL).
Is there any other way for me to do this, or should I use Task Scheduler on Windows or set up a Java or C++ program to call the webpage at a certain time? (It's quite a bother, so I would prefer an easier approach to the issue.)
To recap: cron is not working, or rather won't execute the call (the script takes about 10-12 seconds due to the high amount of data).
Opening it in a browser does work and it executes without issue.
I have tried using cron-job.org, and the response I get for the call is:
Sorry for the edits: the call is passed into a switch statement, and depending on the value of val the proper function is called to insert the data into MySQL.
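Without seeing the script or the exact cron line it is hard to say, but one common gotcha when moving a URL-style call to cron is that the query string (?value="val") only exists for web requests: if cron runs the file with the PHP CLI instead of fetching the URL, $_GET is empty and the switch never matches. A hedged sketch that accepts the value either way (function and value names are assumptions):

    <?php
    // sc1.php - accept "value" from the query string (browser/cron-job.org) or from argv (CLI cron).
    // Example crontab entry: php /path/to/sc1.php val
    $value = $_GET['value'] ?? ($argv[1] ?? null);

    switch ($value) {
        case 'val':
            // insert_data($value);   // hypothetical: the function that collects and inserts the data
            break;
        default:
            error_log('sc1.php called with unknown value: ' . var_export($value, true));
    }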

SQL database table event listener

Long time reader, first time poster!
I have a project that I'm working on and need to know whether anyone has any ideas or tips... So I have an SQL database that I connect to via an ODBC connection, and I only have read access. What I want to do is create some sort of listener for when a particular table is updated, which will then call a PHP script. Much like an event trigger, but I don't have access to the server.
Is there some mysterious PHP library that can handle this or a third party application that can be run on a server and plug into a database?
Thanks!
I ended up using a while(true) loop in my PHP script, called from a jQuery AJAX request, that polls until it finds the record it is looking for. A little 'hackish', but I was under a bit of time pressure! :)
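For anyone else in the same read-only boat, a minimal sketch of that polling approach over ODBC (DSN, table and column names are illustrative); a timeout and a short sleep keep the while(true) from hammering the server:

    <?php
    // poll.php - read-only polling over ODBC; called from the jQuery AJAX request.
    $conn = odbc_connect('MyDSN', 'readonly_user', 'secret');   // hypothetical DSN/credentials
    $id = (int) ($_GET['id'] ?? 0);                             // the record the caller is waiting for
    $deadline = time() + 30;                                    // give up after 30 seconds

    while (time() < $deadline) {
        $stmt = odbc_prepare($conn, 'SELECT id, status FROM orders WHERE id = ?');
        odbc_execute($stmt, [$id]);
        if ($row = odbc_fetch_array($stmt)) {
            header('Content-Type: application/json');
            echo json_encode($row);   // the record appeared: hand it back to the AJAX caller
            exit;
        }
        sleep(2);                     // not there yet; wait a little before polling again
    }

    http_response_code(204);          // nothing within the timeout; the client can retry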

Real Time Event Listener for PHP and mySQL

I am thinking of implementing a real-time event listener for a personal project. Is there a way that, say, if an INSERT, UPDATE, or DELETE SQL query has been issued, MySQL will trigger a PHP file which will in turn process it, like refreshing a page automatically when a new record is found, or when a record has been edited or deleted?
I have been reading up on MySQL triggers but I do not know how to implement this. Thanks!
If what you want to do is refresh the page in an appropriately lazy manner, I suggest you look less at triggering out of MySQL and more at triggering it with AJAX. Waygood has a good link for the latter, but consider the former for simply updating data.
You can update the information on your site by way of long polling. That way you keep a persistent connection open back to your server and can update data whenever the server pushes an update through. When done, simply start the connection again and wait for another update. The most commonly used long-polling technique is probably one with AJAX. Alternatively, for things like cross-domain support, you could go a bit more exotic with script-tag long polling.
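A minimal long-polling sketch, assuming a MySQL table with an auto-increment id; the page sends the last id it has seen, the server holds the request open until something newer turns up or a timeout passes, and the page reconnects straight away:

    <?php
    // longpoll.php - table/column names are illustrative.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
    $lastId = (int) ($_GET['last_id'] ?? 0);   // highest id the client already has
    $deadline = time() + 25;                   // stay under typical web-server timeouts

    header('Content-Type: application/json');

    while (time() < $deadline) {
        $stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > ? ORDER BY id');
        $stmt->execute([$lastId]);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        if ($rows) {
            echo json_encode($rows);           // something new: return it immediately
            exit;
        }
        sleep(1);                              // nothing yet; check again shortly
    }

    echo json_encode([]);                      // timeout: the client simply reconnects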

Background PHP worker

I have a script that takes a while to process; it has to take stuff from the DB and transfer data to other servers.
At the moment I have it run immediately after the form is submitted, and it takes as long as the transfer takes before it says the data has been sent.
I was wondering, is there any way to make it so it does not run the process in front of the client?
I don't want a cron job, as the data needs to be sent at the same time, just without the client waiting on it.
A couple of options:
Exec the PHP script that does the DB work from your webpage but do not wait for the output of the exec (a sketch follows this list). Be VERY careful with this: don't blindly accept any input parameters from the user without sanitising them. I only mention this as an option; I would never do it myself.
Have your DB-updating script running all the time in the background, polling for something to happen that triggers its update. For example, it could check whether /tmp/run.txt exists and start the DB update if it does. You can then create run.txt from your webpage and return without waiting for a response.
Create your DB update script as a daemon.
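A hedged sketch of the first option (script path and parameter name are made up); escapeshellarg() and the integer cast do the sanitising, and the output redirect plus the trailing & stop the page from waiting:

    <?php
    // Form handler - queue the transfer and return to the client straight away.
    $orderId = (int) ($_POST['order_id'] ?? 0);   // never pass raw user input to exec()

    // On Linux/macOS: discard output and background the process so exec() returns immediately.
    exec('php /path/to/transfer.php ' . escapeshellarg((string) $orderId) . ' > /dev/null 2>&1 &');

    echo 'Your data has been queued for transfer.';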
Here are some things you can take a look at:
How much data are you transferring, and by transfer do you mean only copying the data, or are you inserting the data from your db into the destination server and then deleting it from your source?
You can try analyzing your SQL to see if there's any room for optimization.
Then you can check your PHP code as well to see if there's anything, however slight, that might help perform the necessary tasks faster.
Where are the source and destination database servers located (in terms of network, and geographically if you happen to know), and how fast are they able to communicate over the network?

php asynchronous call and getting response from the background job

I have done some Google searching on this topic and couldn't find the answer to my question.
What I want to achieve is the following:
the client makes an asynchronous call to a function on the server
the server runs that function in the background (because that function is time consuming), and the client is not left hanging in the meantime
the client repeatedly makes a call to the server requesting the status of the background job
Can you please give me some advice on resolving my issue?
Thank you very much! ^-^
You are not specifying what language the asynchronous call is in, but I'm assuming PHP on both ends.
I think the most elegant way would be this:
HTML page loads, defines a random key for the operation (e.g. using rand() or an already available session ID [be careful though that the same user could be starting two operations])
HTML page makes an Ajax call to start_process.php
start_process.php uses exec() to launch /path/to/long_process.php; see the User Contributed Notes on exec() for suggestions on how to start a process in the background. Which one is right for you depends mainly on your OS.
long_process.php frequently writes its status into a status file, named after the random key that your Ajax page generated
HTML page makes frequent calls to show_status.php, which reads out the status file and returns the progress (a sketch of start_process.php and show_status.php follows this list).
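A hedged sketch of steps 3-5, using a status file named after the key (script names, paths and the JSON layout are assumptions). start_process.php might look like:

    <?php
    // start_process.php - called via Ajax; the key is the random id generated in step 1.
    $key = preg_replace('/[^A-Za-z0-9]/', '', $_GET['key'] ?? '');   // sanitise before shelling out
    // long_process.php takes the key as $argv[1] and periodically writes JSON to /tmp/status_<key>.
    exec('php /path/to/long_process.php ' . escapeshellarg($key) . ' > /dev/null 2>&1 &');
    echo 'started';

and show_status.php simply reads the file back for the page to display:

    <?php
    // show_status.php - polled by the HTML page.
    $key = preg_replace('/[^A-Za-z0-9]/', '', $_GET['key'] ?? '');
    $file = "/tmp/status_$key";
    header('Content-Type: application/json');
    echo is_readable($file) ? file_get_contents($file) : json_encode(['progress' => 0]);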
Have a Google for long-running PHP processes (be warned that there's a lot of bad advice out there on the topic, including the note referred to by Pekka: it will work on Microsoft but will fail in unpredictable ways on anything else).
You could develop a service which responds to requests over a socket (your client would use fsockopen to connect). A simple way of achieving this would be to use Aleksey Zapparov's Socket server (http://www.phpclasses.org/browse/package/5758.html), which handles requests coming in via a socket; however, since this runs as a single thread it may not be very appropriate for something which requires a lot of processing. Alternatively, if you are using a non-Microsoft system, you could hang your script off [x]inetd; however, you'll need to do some clever stuff to prevent it terminating when the client disconnects.
To keep the thing running after your client disconnects, the PHP code must be run from the standalone PHP executable (not via the webserver): spawn a process in a new process group (see posix_setsid() and pcntl_fork()). To let the client come back and check on progress, the easiest way is to have the server write out its status to somewhere the client can read.
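A rough sketch of the fork-and-detach part (CLI only; it needs the pcntl and posix extensions, and the status-file path is an assumption):

    <?php
    // worker.php - must be started with the standalone CLI binary, not through the web server.
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit(1);                    // fork failed
    } elseif ($pid > 0) {
        exit(0);                    // parent exits immediately; the child keeps running
    }

    posix_setsid();                 // new session/process group: survives the client disconnecting

    // Long-running work; write progress somewhere the web-facing script can read it back.
    for ($i = 1; $i <= 100; $i++) {
        sleep(1);                   // stand-in for one chunk of the real work
        file_put_contents('/tmp/job_status.json', json_encode(['progress' => $i]));
    }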
C.
Ajax call runs the method longRunningMethod() and gets back an identifier (e.g. an id)
Server runs the method and sets a key in e.g. shared memory
Client calls checkTask(id)
Server looks up the key in shared memory and checks for a ready status
[repeat 3 and 4 until longRunningMethod is finished]
longRunningMethod finishes and sets its state to finished in shared memory (sketched below)
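For illustration, a minimal version of steps 2 and 4 using PHP's System V shared memory functions (the key derivation and state values are assumptions; a database row or temp file works the same way):

    <?php
    // Shared-memory status store; include this file from both the worker and the Ajax handler.
    function statusSegment() {
        return shm_attach(ftok(__FILE__, 'j'));   // same segment for every process including this file
    }

    function setTaskState(int $taskId, string $state): void {
        $shm = statusSegment();
        shm_put_var($shm, $taskId, $state);       // steps 2/6: the worker publishes 'running'/'finished'
        shm_detach($shm);
    }

    function checkTask(int $taskId): string {
        $shm = statusSegment();                   // step 4: the Ajax handler reads it back
        $state = shm_has_var($shm, $taskId) ? shm_get_var($shm, $taskId) : 'unknown';
        shm_detach($shm);
        return $state;
    }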
All Ajax calls are by definition asynchronous.
You could (although not a strictly necessary step) use AJAX to instantiate the call, and the script could then create a reference to the status of the background job in shared memory (or even a temporary entry in an SQL table, or even a temp file), in the form of a unique job id.
The script could then kick off your background process and immediately return the job ID to the client.
The client could then call the server repeatedly (via another AJAX interface, for example) to query the status of the job, e.g. "in progress", "complete".
If the background process to be executed is itself written in PHP (e.g. a command line PHP script) then you could pass the job id to it and it could provide meaningful progress updates back to the client (by writing to the same shared memory area, or database table).
If the process to be executed is not itself written in PHP, then I suggest wrapping it in a command-line PHP script so that it can monitor when the process has finished running (and check the output to see if it was successful) and update the status entry for that task appropriately.
Note: using shared memory for this is best practice, but it may not be available if you are on shared hosting, for example. Don't forget that you want a means of cleaning up old status entries, so I would store "started_on"/"completed_on" timestamp values for each one and have it delete stale entries (e.g. ones with a completed_on timestamp more than X minutes old; ideally it should also check for jobs that started some time ago but were never marked as completed, and raise an alert about them).
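A small sketch of that cleanup, assuming the file-based fallback with one JSON status file per job holding started_on/completed_on timestamps (paths and field names are made up):

    <?php
    // cleanup.php - run from cron; removes entries for jobs that finished (or silently died)
    // more than 30 minutes ago.
    foreach (glob('/tmp/job_status_*.json') as $file) {
        $status = json_decode(file_get_contents($file), true) ?: [];
        $done   = $status['completed_on'] ?? null;
        $start  = $status['started_on'] ?? 0;

        if ($done && $done < time() - 1800) {
            unlink($file);                                                 // completed a while ago
        } elseif (!$done && $start < time() - 1800) {
            error_log("Stale job status file, never completed: $file");   // raise some kind of alert
            unlink($file);
        }
    }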
