Problems with multi-tasking in Flex + PHP

I have a method that calls two PHP services at the same time. Due to the multi-tasking abilities of Flex, I think that each service is called in a different thread.
My problem is this: both services return an Array of Objects from the database, but the second service feeds a DataGrid that has a handler for each record. The handler compares data from both Arrays, and when Flex finishes the second call before the first one, the handler tries to compare data with a null object (the PHP service hasn't responded yet).
Any ideas?
EDIT:
On the day I posted this question, someone gave me an amazing idea, but sadly it seems he deleted his post; I don't know why.
I kept his idea on my mind and I found a solution that fits my design pattern with his idea.
He told me to put a flag telling me if the data was already loaded or not.
So here is what I'm doing now:
I call the first service;
I call the second service;
On the result of the first service, I check the flag on the second service. If it's true, the second data set was already loaded, so I can just store my data in the DataGrid and the handler can be called.
If the flag is false, the second data set wasn't loaded yet, so instead of storing the data in the official dataProvider, I store it in a _temp dataProvider that is NOT bound to the DataGrid. In this case, when the second data set is loaded, a listener event is dispatched to the first service telling it to take the _temp dataProvider and copy it to the official dataProvider.
Personally, I like the solution, and it doesn't break the Table Data Gateway design pattern.
Thanks everyone for the help.

Due to multi-tasking abilities of Flex, I think that each service is called in a different thread.
What makes you think Flex supports multi-threading? It really doesn't: it is single-threaded only.
However, your calls are asynchronous in that when they are sent, the program does not stop to wait for an answer, but listens for a completion event.
Your solution is simple: Send only the first request, wait for it to complete, and then send the second request in the completion handler.
EDIT
To preserve your design pattern, you can apply two different strategies:
Use the Business Delegate pattern: Have a custom class act as the gateway and encapsulate the actual connections to your services. It could then dispatch its own COMPLETE event to trigger the handlers you have. To the DataGrid, it would appear like a single asynchronous call.
Use the Session Facade pattern: Create a single service on the server side, which accepts a single request, calls the referenced services locally, and returns the combined data for both services in a single response.
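As a rough sketch of the Session Facade idea on the PHP side (the names ComboService, fetchFirstDataset, and fetchSecondDataset are placeholders for your real gateway calls):

```php
<?php
// Hypothetical facade: both existing service calls run inside a single
// request, so their ordering is guaranteed and neither result can be
// null when the client receives the response.
class ComboService
{
    public function getCombinedData()
    {
        $first  = $this->fetchFirstDataset();
        $second = $this->fetchSecondDataset();

        // One payload carrying both result sets; the Flex side gets
        // them together and can bind the DataGrid safely.
        return array('first' => $first, 'second' => $second);
    }

    private function fetchFirstDataset()
    {
        return array(); // placeholder for the real database query
    }

    private function fetchSecondDataset()
    {
        return array(); // placeholder for the real database query
    }
}
```

To the DataGrid this looks like one asynchronous call with one completion event, so the null-comparison race disappears.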

Flex doesn't have multi threading, but it can have multiple asynchronous calls at once. You can deal with not knowing which will return first by having each return handler check to make sure that both services have returned before progressing into code that depends on both.

Let us assume you have two services:
FirstService
SecondService

private function init():void
{
    // Call the first service only. (This assumes firstServiceResult and
    // secondServiceResult are wired up as the result handlers for the
    // respective calls, e.g. via the result attributes in your MXML.)
    myService.FirstService();
}

private function firstServiceResult(re:ResultEvent):void
{
    // Handle the results of FirstService (i.e. set the result to an array).
    // Afterwards, call the next service.
    myService.SecondService();
}

private function secondServiceResult(re:ResultEvent):void
{
    // Handle the results of SecondService. Both result sets are now
    // available, so it is safe to compare them.
}

Related

PHP / JavaScript: Calling a PHP Class via AJAX. Multiple Instances of the PHP Class

Maybe it's a stupid question. But anyway here is my problem. I have multiple classes in my project.
At the beginning the constructor of the class Calculate($param1, $param2...) is called.
This Calculate class is called multiple times via jQuery events (click, change, ...) depending on which form field was just filled in. The prices and values are calculated in the background by PHP and shown on the website via AJAX (live, while typing).
The connection between the AJAX calls and the Calculate class is a single file (jsonDataHandler). This file receives the POST values from the AJAX request and returns a JSON string for the website output. So every time I call this jsonDataHandler, a new Calculate object is created with the updated values, never the object that was created first. As you can imagine, this causes multiple problems.
How can I always access the same object, without creating a new one?
EDIT: because of technical reasons, I cannot use sessions..
Here is the php application lifetime:
The browser sends an http request to the web-server
Web-server (for example Apache), accepts the request and launches your php application (in this case your jsonDataHandler file)
Your php application handles the request and generates the output
Your php application terminates
Web-server sends the response generated by php application to the browser
So the application "dies" at the end of each request, you can not create an object which will persist between requests.
Possible workarounds:
Persist the data on the server - use sessions or the database (as you said this is not an option for you)
Persist the data on the client - you still create your object for each request, but you keep additional information client-side to be able to restore the state of your object (see more on this below)
Use something like reactphp to keep your application running persistently (this may also not be an option, because you would need a different environment). A variant of this option: switch to another technology which doesn't re-launch the server-side application on each request (node.js, python+flask, etc.).
So, if you can't persist the data on the server, the relatively simple option is to persist the data on the client.
But this will only work if you need to keep the state of your calculator for each individual client (vs keeping the same state for all clients, in this case you do need to persist data on the server).
The flow with client-side state can be this:
Client sends the first calculation request, for example param1=10
Your scripts responds with value=100
Client-side code stores both param1=10 and param1_value=100 into cookies or browser local storage
Client sends the next calculation, for example param2=20, this time the client-side code finds previous results and sends everything together (param1=10&param1_value=100&param2=20)
On the server you can now re-create the whole sequence of calculations, so you get the same result as if you had a persistent Calculate object
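The replay step might look like this minimal sketch. CalculateStub stands in for the real Calculate class, and the pricing rule inside apply() is invented purely for illustration; only the replay mechanism matters:

```php
<?php
// CalculateStub is a stand-in for the asker's Calculate class; the
// "* 10" pricing rule is a made-up placeholder.
class CalculateStub
{
    private $total = 0;

    public function apply($param)
    {
        $this->total += $param * 10; // placeholder pricing rule
        return $this->total;
    }
}

// jsonDataHandler side: $history is the ordered list of every
// parameter the client has sent so far (previous ones included).
function handleCalculationRequest(array $history)
{
    $calc = new CalculateStub();
    $value = 0;
    foreach ($history as $param) {
        $value = $calc->apply($param); // re-apply in the original order
    }
    return json_encode(array('value' => $value));
}
```

Because the object is rebuilt from the full history on every request, it behaves as if it had persisted, at the cost of resending and replaying the parameters each time.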
Maybe you should try to save the values of the parameters of the Calculate object in a database, and every time you make an AJAX call, take the latest values from the DB.

singleton-registry pattern and object-interaction with ajax

My problem may be very specific, I think. I have already tried to find information about it and viewed tons of sites, with no success. And I'm a newbie in OO PHP. I will try to explain the issue without code samples:
So, I have developed an object-oriented PHP application. My Registry class implements the singleton pattern (only one instance in the whole app) and stores objects that must be accessible in any part of the application. At this point I need to make a jQuery AJAX call to interact with the user without a page reload. But calling a PHP script via AJAX gives me another instance of my Registry class (when I try to use the registry in the called PHP file), and this instance is of course empty (it has no objects in its array). I think this happens because AJAX calls run in different threads (maybe I'm mistaken). Anyway: is there some way to reach the needed functionality using the registry pattern? Or maybe another way to achieve it? I know that I can make my classes static and use the objects statically, but unfortunately I can't do that. I also know about global vars; that's not my way... Any help, please!
So every single request to a PHP application is handled by a separate process on the server. No memory/data is shared across requests. That being the case, the Ajax request to the PHP script will not have access to your other request's info.
In order to handle this, You'll need to keep state of the data you're trying to store in some other way. Store it in the DB, Sessions, etc.
So say you have a "singleton-ish" list of objects that are items available in a store, the number in stock and their meta data.
in pseudo code:
$inventory = array(
    "cars"   => array(SomeCarObject, AnotherCarObject),
    "trucks" => array(SomeTruckObject, AnotherTruckObject)
);
Because this is not kept in shared memory across requests, every time you need to asynchronously operate on this list, you'll need to fetch it (from wherever you're keeping its state), modify it, then save it and respond appropriately.
Your flow might look something like this:
client request $inventory => server fetches state of $inventory, does stuff, resaves => sends $inventory back to client.
You will likely need to add a locking mechanism to the DB to indicate that another process is using it but that's something you'd need to exercise caution with because it can cause other requests to wait for something that is modifying your data to finish before they can finish. You do not want to introduce a race condition. http://en.wikipedia.org/wiki/Race_condition
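The fetch -> modify -> save cycle under an exclusive lock can be sketched like this. flock() on a JSON file stands in for a real DB-level lock (e.g. SELECT ... FOR UPDATE); the function name and file format are assumptions for illustration:

```php
<?php
// Fetch the shared inventory, modify it, and save it back while
// holding an exclusive lock so concurrent requests cannot interleave.
function decrementStock($file, $item)
{
    $fp = fopen($file, 'c+');
    flock($fp, LOCK_EX);                 // other requests wait here

    $raw = stream_get_contents($fp);     // fetch current state
    $inventory = $raw !== '' ? json_decode($raw, true) : array();

    if (isset($inventory[$item]) && $inventory[$item] > 0) {
        $inventory[$item]--;             // modify while holding the lock
    }

    ftruncate($fp, 0);                   // save the new state
    rewind($fp);
    fwrite($fp, json_encode($inventory));

    flock($fp, LOCK_UN);                 // let waiting requests proceed
    fclose($fp);
    return $inventory;
}
```

The lock is held only for the short read-modify-write window, which keeps the blocking other requests experience to a minimum.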

Iron.io Push Queue and Laravel 4 - preventing the queue request from running multiple times

I have a push queue set up to dial quite a few phone numbers and play back a recording - a blast announcement system powered by Twilio. It takes time to iterate through each number and place the call, so I am hoping to use a push queue to speed up the navigation of the app.
I attempted to use Iron.io push queues with Laravel 4 in the past, and it seems that with any task that takes a while to run, or when the HTTP request is slow at first, the code within the fire() method runs multiple times, even with $job->delete().
Here is an example of my queue handler -
class callLotsOfPeople {
    public function fire($job, $data) {
        // Do stuff with data, like calling lots of people... takes time.
        $job->delete();
        // For some reason this method can be called multiple times after
        // a single queue push, resulting in multiple phone calls and
        // angry clients.
    }
}
This might be too late, but I had the same issue. I found a fix and have submitted a pull request to have it included in Laravel 4.1.
Basically, the code changes allow you to pass an $options array like this
Queue::push('MyJob', $message, $queue, array('timeout' => 300));
The functionality is already there in IronMQ.class.php, but I couldn't find an easy way to pass the options in from Laravel. Hopefully they include this; it makes the multiple-job-submission issue go away. :-)
https://github.com/laravel/framework/pull/3555
EDITED: Changed Queue::pull to Queue::push, small typo.
I've yet to leverage Push Queues fully, but a quick glance at IronMQ docs revealed the following:
Long Running Processes - aka 202’s
If you’d like to take some time to process a message, more than the 60 second timeout, you must respond with HTTP status code 202. Be sure to set the “timeout” value when posting your message to the maximum amount of time you’d like your processing to take. If you do not explicitly delete the message before the “timeout” has passed, the message will be retried. To delete the message, check the “Iron-Subscriber-Message-Url” header and send a DELETE request to that URL.
via: http://dev.iron.io/mq/reference/push_queues/#long_running_processes__aka_202s
Now, the timeout isn't something that Laravel seems to support at the moment since the payload is created behind the scenes with no easy access. You can create a Pull Request on the 4.1 branch to implement this functionality specifically for Iron push queues (tip: you'd need to edit both QueueInterface and all Queue drivers' push() function).
As a work-around, maybe you can just $job->delete() from the get-go (rather than after the time-consuming task) and just Queue::push() it again (or some chunk of it) if there are errors? Something like:
class callLotsOfPeople {
    public function fire($job, $data) {
        $job->delete();
        // Do stuff with data, like calling lots of people... takes time.
        if ($error) {
            Queue::push(...);
        }
    }
}
Let me know how it goes, I may have a similar situation in the future and would like to know how you solve it!

Communication between web page and php script triggered from this web page

I have here a myAction function in some controller, and it has one class instance:
public function myAction() {
    ...
    $myAnalyzer = new Analysis();
    $myAnalyzer->analyze();
    ...
}
Let's say this analyze() function takes 10 minutes. That means it blocks my.phtml for 10 minutes, which is unacceptable. What I want is to render my.phtml first and then show intermediate results from analyze() on my.phtml.
function analyze() {
    ...
    foreach ($items as $rv) {
        ...
        ...
        // new result should be stored in db here
    }
}
As far as I know, that's impossible, for there is just one thread in PHP. So I decided to make an AJAX call from my.phtml to run the myAnalyzer instance.
First question: is that right? Can I do it in myAction() without blocking?
OK, now I run myAnalyzer using some script, say worker.php, from my.phtml with the help of javascript or JQuery.
Second question: how can I know when each iteration of the foreach loop ends? In other words, how can I let worker.php send some signal (or event) to my.phtml or to Zend Framework? I do NOT want to update my.phtml on a time basis using a javascript timer. That's all I need to know, since the intermediate data is supposed to be stored in the DB.
Third question: myAnalyzer must stop when the user leaves the page. For that I have this code:
window.onbeforeunload = function(e) {
// killer.php kills myAnalyzer
};
But how can javascript communicate with myAnalyzer? Is there something like a process id? I mean, when worker.php runs myAnalyzer, it registers its process id with the framework, and when the user leaves the page, killer.php stops myAnalyzer using this process id.
I appreciate the help in advance.
First Q.: Yeah, I'm afraid that is correct.
Second Q.: I do not understand what you mean here. See the code example below:
foreach ($data as $item) {
    ...
}
// Code here will be executed only after the foreach loop is done.
Third Q.: Take a look at ignore_user_abort. You can set it to false (it is probably already like that by default) and send something to the client from time to time. Or you can set it to true and check whether the user is still connected with the connection_aborted function. What I mean here is that you can run your worker.php with ajax and configure the request so the browser will not disconnect it because of a timeout (the connection is kept while the user stays on the page), but it will be closed if the user leaves the page.
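A minimal sketch of that abort check inside the long loop (with ignore_user_abort left at false, PHP notices the closed connection when it next sends output, and connection_aborted() then returns non-zero; $items is a placeholder for whatever analyze() iterates over):

```php
<?php
// Long-running worker loop that stops when the client disconnects.
ignore_user_abort(false);
set_time_limit(0);          // the analysis may run for minutes

$items = range(1, 3);       // placeholder work list
$processed = 0;

foreach ($items as $rv) {
    // ... analyze one item and store the intermediate result in the DB ...
    $processed++;

    echo ' ';               // heartbeat byte so PHP checks the socket
    flush();

    if (connection_aborted()) {
        break;              // the user left the page: stop cleanly
    }
}
```

Run from the CLI the connection is never aborted, so all items are processed; under a web server the loop ends as soon as the browser closes the request.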
EDIT:
About second question. There are few options:
1) you may use some shared memory (like memcached, for instance). And call server with another ajax request from time to time. So after each loop is ended - you put some value into memcached and during request you can check that value and build response/update your page based on that value
2) There is such thing like partial response. It is possible to get some piece of response with XMLHTTPRequest, but as I remember - that is not really useful at this moment as it is not supported by many browsers. I do not have any details about this. Never tried to use it, but I know for sure that some browsers allow to process portions of response with XMLHTTPRequest.
3) You can use an invisible iframe to call your worker.php instead of XMLHTTPRequest. In this case you can send pieces of HTML in which you put javascript that calls a function in the parent window, and that function will update your page. This is one of the long-polling COMET implementations, if you want to look up more information. There are some pitfalls (for instance, you may need to ensure that you send a specific number of characters in the response in order to get it executed in some browsers), but it is still usable (some web-browser chats are based on it).
Options 2) and 3) are also good because they will solve your third-question problem automatically. At the same time, 1) may be simpler, but it will not solve the problem in the third question.
One more thing: as you will have a long-running script, you must remember that the session may block the execution of any other requests (if the default file-based PHP session is used, this will happen for sure).
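Option 1) above can be sketched like this: the worker writes its progress to a shared store after each iteration, and a tiny endpoint that the page polls reads it back. A JSON file stands in for memcached here; the function names, file path, and format are assumptions:

```php
<?php
// Shared progress store: worker.php writes, a polled endpoint reads.
function reportProgress($file, $done, $total)
{
    // Called by worker.php at the end of every loop iteration.
    file_put_contents($file, json_encode(
        array('done' => $done, 'total' => $total)
    ), LOCK_EX);
}

function readProgress($file)
{
    // Called by the endpoint the page polls with a small ajax request.
    if (!is_file($file)) {
        return array('done' => 0, 'total' => 0);
    }
    return json_decode(file_get_contents($file), true);
}
```

With memcached the two functions would become a set and a get on the same key; the overall flow stays identical.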

ZendAMF - function calls in quick succession fail

I'm implementing AMF service methods for an flash front-end. Normally things work fine, but we've found that if two methods are called one right after the other, the second call returns null to the flash front-end even though the method actually completes successfully on the PHP end (to verify this, I dump the return data to a file right before I return it).
Has anyone had this behavior before? Is it some setting with ZendAMF?
Maybe wait for confirmation that the first method has finished before calling the second?
I use ZendAMF too. I have noticed that if one call fails, it will trigger a failure message for any other batched calls (Async tokens can be used to get around this).
I would try sending each call one at a time and finding out which one is failing, if any. Personally, I use a piece of software called Charles, an HTTP proxy that allows me to see the contents and error messages of any AMF calls I perform. You could also use Wireshark; either way you would be able to see the exact request sent and any error messages thrown by your backend.
Are you using any transactions in your code (like Doctrine)? Sometimes the code will pass tests and write out correctly, but choke when the transaction gets closed, and end up throwing an error.
It actually turns out the flash side was using the same connection for two function calls. Making separate connections for each call has solved the problem.
