Transferring a JSON object from a cron job - PHP

I have a complex PHP cron job that retrieves data from an external webpage and joins all the information into one variable that is encoded as JSON. The whole process is very slow and takes a lot of time.
The point is that I need to retrieve the JSON object from my index page, but I don't want to run the whole script there because it would take too long to execute. What I've been doing is telling the cron job to create a new file and write the JSON object to it, and then retrieving the information from that file.
I would like to know if there is a more efficient/simpler way to transfer this information without having to create a new file or execute the script 'manually'. I've heard that you can send information using cURL; the truth is that I've never used this technique before, so I don't know if it would be useful in this situation.

This is a pretty common issue. Long-running tasks shouldn't be executed on page load because it hurts UX. Having your time-intensive PHP script run as a cron job is a great solution.
Perhaps using a DB would be easier still. You can easily use SQLite or a "full-fledged" RDBMS (like MySQL or Postgres) to store your data. It could work something like this:
The time-intensive PHP script runs as a cron job every X minutes and saves its data to your DB instead of a file.
When the user requests the index page, it sends an AJAX request to another PHP script. That script looks for the data in your DB and returns it to the user if it exists.
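For illustration only, here is a minimal sketch of what that second PHP script could look like, assuming the cron job writes its JSON payload into a hypothetical snapshots table in an SQLite file (the table name and file path are made up for this example):

```php
<?php
// data.php - minimal sketch of the endpoint the index page calls via AJAX.
// Assumes the cron job has written its JSON into a "snapshots" table (assumption).
$db = new PDO('sqlite:/var/data/cron_cache.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Grab the most recent row the cron job produced.
$stmt = $db->query('SELECT payload FROM snapshots ORDER BY created_at DESC LIMIT 1');
$row  = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
if ($row === false) {
    echo json_encode(['status' => 'pending']); // cron job has not produced anything yet
} else {
    echo json_encode(['status' => 'ok', 'data' => json_decode($row['payload'], true)]);
}
```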

Related

Insert data with PHP on server with cron (not working)

On my website I have a PHP script which needs to be called in order to collect data, build it into an array, and send it to MySQL. I have tried a cron job calling it with www.website.com/script/sc1.php?value="val"; cron reports that it ran and everything is OK, but it isn't (the data is not inserted into MySQL).
Is there any other way for me to do this, or should I use Task Scheduler on Windows or set up a Java or C++ program to call the webpage at a certain time? (It's quite a bother, so I would prefer an easier approach to the issue.)
To recap: cron is not working, or rather won't execute the call (it finishes in about 10-12 seconds due to the large amount of data).
Opening it in the browser does work, and it executes without issue.
I have tried using cron-job.org and the response I get for the call is:
Sorry for the edits; the call is passed into a switch statement, and if val matches one of the expected values, the proper function is called to insert the data into MySQL.
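For context, the dispatch described above presumably looks something like the following sketch; the accepted value and the function name are assumptions, not taken from the real script:

```php
<?php
// sc1.php - rough sketch of the switch-based dispatch (names are assumptions).
$value = isset($_GET['value']) ? $_GET['value'] : '';

switch ($value) {
    case 'val':
        insertData(); // hypothetical function that collects the data and inserts it into MySQL
        break;
    default:
        http_response_code(400);
        echo 'unknown value';
}
```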

Getting a large amount of data from a very slow external data source

I need to receive a large amount of data from an external source. The problem is that the external source sends data very slowly. The workflow is like this:
The user initiates some process from the app interface (commonly, fetching data from a local XML file). This is a quite fast process.
After that, we need to load information connected with the fetched data from the external source (basically, external statistics for the data from the XML). This is very slow, but the user needs this additional information to continue working. For example, they may perform filtering according to the external data, or something else.
So we need to do it asynchronously. The main idea is to show the external data as it becomes available. The question is how we could organise this async process. Maybe some queues or something else? We're using PHP + MySQL as the backend and jQuery on the front end.
Thanks a lot!
Your two possible strategies are:
Do the streaming on the backend, using a PHP script that curls the large external resource into a database or memcache, and responds to periodic requests for new data by flushing that DB row or cache into the response (sketched after this list).
Do the streaming on the frontend, using a cross-browser JavaScript technique explained in this answer. In Gecko and WebKit, the XmlHttpRequest.onreadystatechange event fires every time new data is received, making it possible to stream data slowly into the JavaScript runtime. In IE, you need to use an iframe workaround, also explained in the Ajax Patterns article linked in the above SO post.
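As a rough sketch of the first (backend) strategy, the endpoint the jQuery front-end polls could look something like this; the table, columns and credentials are assumptions:

```php
<?php
// poll.php - sketch: a worker has been curling the slow external source into a MySQL
// table row by row; hand back whatever has arrived since the client's last request.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// The client sends the id of the last row it has already received.
$lastId = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0;

$stmt = $pdo->prepare('SELECT id, payload FROM external_chunks WHERE id > ? ORDER BY id');
$stmt->execute([$lastId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```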
One possible solution would be to make the cURL call using system() with the output redirected to a file. That way PHP would not hang until the call is finished. From the PHP manual for system():
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
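In practice that could look roughly like the following; the URL and the output path are placeholders:

```php
<?php
// Start the slow download in the background. Redirecting stdout/stderr to a file
// (and appending &) is what stops PHP from waiting for the command to finish.
system("curl -s 'https://example.com/slow-feed' > /tmp/external_data.json 2>&1 &");
echo 'download started';
```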
This would split the data gathering from the user interface. You could then work with the gathered local data by several means, for example:
employ an iframe in the GUI that refreshes itself at some interval and fetches data from the locally stored file (and possibly stores it in the database or whatever),
use jQuery to make AJAX calls to get the data and manipulate it (a tiny endpoint for this is sketched after this list),
use some CGI script that runs in the background, handles the database writes too, and displays the data from the DB directly using one of the above,
dozens more I can't think of now...
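For the iframe/AJAX variants above, a tiny status endpoint could simply expose whatever the background call has written so far; the file path matches the placeholder used in the sketch above:

```php
<?php
// status.php - sketch only: report whether the background download has produced data yet.
$file = '/tmp/external_data.json';

header('Content-Type: application/json');
if (!is_file($file)) {
    echo json_encode(['ready' => false]);
} else {
    echo json_encode(['ready' => true, 'data' => json_decode(file_get_contents($file), true)]);
}
```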

Background PHP worker

I have a script that takes a while to process; it has to take stuff from the DB and transfer data to other servers.
At the moment I have it run immediately after the form is submitted, and the page takes as long as the transfer does before it can say the data has been sent.
I was wondering, is there any way to make it so it does not do the processing in front of the client?
I don't want a cron job, as the data needs to be sent at the same time, just not while the client's page is loading.
A couple of options:
Exec the PHP script that does the DB work from your webpage, but do not wait for the output of the exec (a minimal sketch follows this list). Be VERY careful with this: don't blindly accept any input parameters from the user without sanitising them. I only mention this as an option; I would never do it myself.
Have your DB-updating script running all the time in the background, polling for something to happen that triggers its update. For example, it could check whether /tmp/run.txt exists and start the DB update if it does. You can then create run.txt from your webpage and return without waiting for a response.
Create your DB update script as a daemon.
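A minimal sketch of the first two options; the paths and script names are assumptions:

```php
<?php
// Option 1, sketched: launch the worker and return immediately, without waiting for output.
// Never build this command string from unsanitised user input.
exec('php /path/to/db_update_worker.php > /dev/null 2>&1 &');

// Option 2, sketched: just create the trigger file the always-running background script polls for.
touch('/tmp/run.txt');
```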
Here are some things you can take a look at:
How much data are you transferring, and by "transfer" do you mean simply copying the data, or are you inserting the data from your DB into the destination server and then deleting it from your source?
You can try analyzing your SQL to see if there's any room for optimization.
Then you can check your PHP code as well, to see if there's anything, even something small, that might help perform the necessary tasks faster.
Where are the source and destination database servers located (in terms of network and geography, if you happen to know), and how fast can the source and destination servers communicate over the network?

Is there a way to grab files from other servers and dump them into the DB asynchronously?

I need to create a PHP script that takes lots of URLs via POST, then loads the corresponding files and dumps them into the DB. The thing is that I would like to do it asynchronously, so that if I have 1000 files to get, the script won't hang until all the files are loaded. Also, every time a file is done loading, I need to know, so that I can insert it into the DB.
Any ideas are appreciated.
Split the script into two parts: the first collects the URLs, and the second is a script run in the background (via the shell) that reads the URLs inserted into the database and fetches them.
So basically the process is as follows:
Script1:
Gets the POST
Inserts the URLs into the database
Calls script 2 with shell_exec to run in the background
Script2:
Gets all the URLs from urls_to_download
Fetches the URLs (sequentially or in parallel, up to you)
Does stuff with them
Saves them to the database.
And you are done. The POST in script 1 returns immediately, and script 2 is then running. All that is left for you is to check the status of the URLs (polling the database through AJAX, maybe) if you want to show some information about progress.
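A minimal sketch of script 1, assuming the URLs arrive as an array in the POST body and reusing the urls_to_download table mentioned above (column names, paths and credentials are made up):

```php
<?php
// script1.php - sketch: store the posted URLs, then hand off to script2 in the background.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$urls = isset($_POST['urls']) ? (array) $_POST['urls'] : [];
$stmt = $pdo->prepare('INSERT INTO urls_to_download (url, status) VALUES (?, ?)');
foreach ($urls as $url) {
    $stmt->execute([$url, 'pending']);
}

// Launch script2 in the background so this request returns immediately.
shell_exec('php /path/to/script2.php > /dev/null 2>&1 &');
echo 'queued ' . count($urls) . ' URLs';
```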
PHP is not multithreaded and is strictly synchronous, so you cannot do this using PHP alone.
But you can use another language for this task, for example JavaScript (which is asynchronous). Try Node.js: it is lightning fast and has MySQL bindings ;) Use http.Client to make the requests to the sites.

Possible methods to send the output of a PHP-invoked .exe program (that runs as a separate process, not in PHP) back to the iPhone client

My iPhone client app uploads data to the server, which runs PHP. There is code to invoke a .exe program on the server side from PHP. The .exe program takes the uploaded data and runs in a process of its own, which means the PHP execution ends without waiting for the .exe program to finish. After the .exe program has finished processing the uploaded data and has produced output, I want this output to be sent back to the iPhone.
Normally, if we run the .exe program inside PHP without making it a separate process, we have to wait for the program to finish and can then send the output back to the iPhone client.
By running the .exe program as a separate process, it is impossible to send the data back via the PHP script that invokes it. The question is: if we have the .exe program running in a separate process rather than in the PHP script, what are the possible methods for sending the output back to the iPhone client?
That's a neat problem you've outlined. Let me explain a couple of ideas.
First of all, if you terminate the initial upload request, the only reasonable way to check whether processing is done is to poll every few seconds from the iPhone. Send a request to "get-update.php" every 5 seconds to see if you have data.
By using $_SESSION, you should be able to store a token that will identify the data when it has finished processing.
Regarding the actually process, you may be able to accomplish that in a number of ways. One is to do a fairly standard double-fork, detaching the child process from the parent, so it will continue after the parent exits.
Another (recommended) option would be to write a backend server process that watches your database for requests, fetches them, processes them, and updates the database. So when the initial upload script actually uploads the data, have PHP put it in the database, store the record ID in $_SESSION, and return to the user.
The back end process will notice that there is a record to process, read the data, call the executable, and update the database with the result.
The get-update.php script will read $_SESSION for the record id, and check the database if the data has been processed (or what the status is).
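A minimal sketch of what get-update.php could look like under this scheme; the jobs table, its columns and the credentials are assumptions:

```php
<?php
// get-update.php - sketch: report the status of the record the upload script created.
session_start();
header('Content-Type: application/json');

if (empty($_SESSION['record_id'])) {
    echo json_encode(['status' => 'unknown']);
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$stmt = $pdo->prepare('SELECT status, result FROM jobs WHERE id = ?');
$stmt->execute([$_SESSION['record_id']]);
$job = $stmt->fetch(PDO::FETCH_ASSOC);

echo json_encode($job ?: ['status' => 'unknown']);
```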
If you do not have the ability to run a background process, and you are constrained to using PHP, you could do the double-fork magic and fork off another PHP process to do the database read / exe / database update.
Feel free to comment with questions.
You need (a) a good way to pass the data to the program, and (b) a good way to get the data back.
I would say this is a perfect case for an AJAX snippet frequently polling data from, say, a text file the .exe writes its status in.
The upload script you call could return a unique identifier of some sort to the uploading client. Using that identifier, the client would poll the exe's status (e.g. "does the output file xyz already exist?") until it gets positive feedback.
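That check could be as simple as the following sketch, where the identifier scheme and output directory are assumptions:

```php
<?php
// check.php - sketch of the "does the output file already exist?" style of polling.
// The client passes back the identifier it received from the upload script.
$id   = isset($_GET['id']) ? preg_replace('/[^A-Za-z0-9_-]/', '', $_GET['id']) : '';
$file = "/var/output/{$id}.result";

header('Content-Type: application/json');
if ($id !== '' && is_file($file)) {
    echo json_encode(['done' => true, 'result' => file_get_contents($file)]);
} else {
    echo json_encode(['done' => false]);
}
```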
You're going to have a hard time reconnecting with the iPhone once you've severed the connection. It may be out of coverage, it may have changed IP address, and so on.
Your best bet is to have the iPhone reconnect back to the server and poll for its information.
You could do this by using Apple's Push Notification service, but that's probably overkill, unless you think the data processing is going to take a long time, and/or you want to update the app icon when the processing is done, even if the app isn't running.
Do you expect the user to just be patiently waiting for the result, or are they going to fire off the data, and check back later? If it's only going to take a couple of seconds, you could just have the iPhone app poll for the result after waiting a little while (while displaying a progress indicator).
