We created a Step Functions state machine with a Wait state to execute a scheduled task based on a user-supplied time.
We call this state machine from PHP code, which creates an execution in the state machine; when the Wait state's countdown elapses, it automatically triggers a Lambda function.
My requirement is that the user can update the time or cancel the event from the PHP application. In that scenario I have to either update the existing scheduled execution to the new time, or delete the existing scheduled execution and create a new one with the latest time.
How can I do this from the PHP application?
Below is my PHP code that creates the event in AWS Step Functions.
$inputData = json_encode(array(
    'invocationTime' => '2022-10-28T13:15:16Z', // the schedule in UTC time
    'userid' => '1233345'
));
$data = array(
    'input' => $inputData,
    'name' => 'Test Charan', // static execution name
    'stateMachineArn' => $awsDataarn // AWS stateMachineArn
);
$inputdataaws = array(
    'http' => array(
        'method' => 'POST',
        'content' => json_encode($data),
        'header' => "x-api-key: " . $awsDataapiKey . "\r\n" .
                    "Content-Type: application/json\r\n"
    )
);
$url = 'https://testcharan.execute-api.us-east-1.amazonaws.com/myapplication/scheduletask'; // AWS endpoint URL
$request = stream_context_create($inputdataaws); // build the request context for the state machine call
$result = file_get_contents($url, false, $request); // send the request and read the response
$response = json_decode($result); // decode the result
The above code creates the event in AWS Step Functions.
How can I update, delete, or abort executions that are in the Running status?
You can't modify a state machine directly at runtime (in my experience). I suggest you:
- manage the scheduling of the execution with EventBridge (https://docs.aws.amazon.com/step-functions/latest/dg/tutorial-cloudwatch-events-target.html)
- create another Lambda that acts as a "trigger"
- manage the state machine execution via the CLI or the SDK (see the sketch below)
See also: How to stop all running Step Functions of a specific state machine?
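If the PHP application has AWS credentials available, the same cancel-and-reschedule flow can be done with the AWS SDK for PHP rather than the CLI. A minimal sketch, assuming the SDK v3 Sfn client and that your application stored the executionArn returned when the execution was started ($existingExecutionArn and $newInvocationTime are placeholders):

require 'vendor/autoload.php';

use Aws\Sfn\SfnClient;

// Assumes the AWS SDK for PHP v3 is installed and credentials are configured.
$sfn = new SfnClient(array(
    'region'  => 'us-east-1',
    'version' => 'latest',
));

// Stop the currently Running, scheduled execution; the ARN must have been
// saved by the application when the execution was started.
$sfn->stopExecution(array(
    'executionArn' => $existingExecutionArn,
    'cause'        => 'User changed the scheduled time',
));

// Start a replacement execution with the updated invocation time.
$sfn->startExecution(array(
    'stateMachineArn' => $awsDataarn,
    'name'            => 'schedule-' . uniqid(), // execution names must be unique
    'input'           => json_encode(array(
        'invocationTime' => $newInvocationTime, // e.g. '2022-11-05T10:00:00Z'
        'userid'         => '1233345',
    )),
));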
It is likely that your use case could benefit from the callback pattern, which causes your state machine execution to wait until an event carrying the $$.Task.Token is received. You can set a timeout by providing HeartbeatSeconds, which will cause the task to fail with States.Timeout.
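For completeness, a minimal sketch of how the PHP side could then complete or cancel such a callback task with the AWS SDK for PHP, assuming the state machine passed $$.Task.Token out to your application and you stored it as $taskToken (a placeholder):

use Aws\Sfn\SfnClient;

$sfn = new SfnClient(array('region' => 'us-east-1', 'version' => 'latest'));

// Resume the waiting execution, e.g. after the user confirmed the new time.
$sfn->sendTaskSuccess(array(
    'taskToken' => $taskToken,
    'output'    => json_encode(array('status' => 'rescheduled')),
));

// Or cancel it, e.g. when the user deletes the event:
// $sfn->sendTaskFailure(array(
//     'taskToken' => $taskToken,
//     'error'     => 'UserCancelled',
//     'cause'     => 'Event deleted from the PHP application',
// ));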
I'm likely missing something incredibly obvious, but in the project I'm working on I have to send many jobs from a CSV of info to be processed asynchronously, and Google App Engine's current way to do this is through its new (beta) Cloud Tasks mechanism.
It accepts a payload as part of the task, so I was going to send a JSON array with each job's pertinent data, except that the only way to set the "Content-Type: application/json" header appears to be during creation of the task object.
I'm using Google's own cloud-tasks 0.5.0 library.
Here is what I've been attempting, since it seems this is how most other non-cURL HTTP POST requests would accept the Content-Type header...
require_once 'vendor/autoload.php';
use Google\Cloud\Tasks\V2beta3\AppEngineHttpQueue;
use Google\Cloud\Tasks\V2beta3\CloudTasksClient;
use Google\Cloud\Tasks\V2beta3\Queue;
use Google\Cloud\Tasks\V2beta3\Task;
<<< ...lots of cruft omitted... >>>
$json_payload = json_encode(
    array(
        "batch" => $operation_time,
        "order" => $csvln[0],
        "customer" => $csvln[1],
        "email" => $csvln[2],
        "salesperson" => $csvln[3]
    )
);

//Create each of the tasks in the queue
$options = [
    'http' => [
        'header' => "Content-type: application/json",
        'method' => 'POST',
        'content' => $json_payload
    ]
];
$task = new Task($options);
Any help would be immensely appreciated!
You can load a task into the Task Queue with a pre-defined payload using an App Engine HTTP Request from the Cloud Tasks PHP Client Library.
After you have defined the Task, you can use the setter methods provided by AppEngineHttpRequest to construct your HTTP object with any required headers. This also allows you to assign the payload.
Below is a simple snippet showing how to attach a task with a payload to the default queue:
use Google\Cloud\Tasks\V2beta3\AppEngineHttpRequest;
use Google\Cloud\Tasks\V2beta3\HttpMethod;
use Google\Cloud\Tasks\V2beta3\Task;

//Preparing the payload
$json_payload = json_encode(
    array(
        "batch" => date("h:i:sa"),
        "order" => "Payload-0000",
        "customer" => "Payload-0001",
        "email" => "Payload-0002",
        "salesperson" => "Payload-0003"
    )
);

//Create and configure the task
$httpR = new AppEngineHttpRequest();
$httpR->setBody($json_payload);
$httpR->setHeaders(['Content-type' => 'application/json']);
$httpR->setHttpMethod(HttpMethod::POST);
$httpR->setRelativeUri("/example_task_handler");

$task = new Task();
$task->setAppEngineHttpRequest($httpR);
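Once the Task is configured, it still needs to be submitted to a queue. A hedged sketch of that step, where the project, location, and queue names are placeholders for your own:

use Google\Cloud\Tasks\V2beta3\CloudTasksClient;

$client = new CloudTasksClient();

// Fully qualified queue name: projects/<project>/locations/<location>/queues/<queue>
$queueName = CloudTasksClient::queueName('my-project-id', 'us-central1', 'default');

// Enqueue the task built above; Cloud Tasks delivers it to /example_task_handler
// with the JSON body and headers set earlier.
$response = $client->createTask($queueName, $task);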
Also consider updating your library, as the current version is v0.86.0, which allows you to assign headers even after the creation of the task object.
I have the following code:
$client = new GuzzleHttp\Client(
    array(
        'base_uri' => 'https://somesite.com'
    )
);

$response = $client->request('POST', '/api', [
    'form_params' => array(
        'action' => 'getusers',
        'api_key' => $_POST['key'],
        'id' => $_POST['id']
    )
]);
When multiple users access the same page running the code above, the other users wait for the first (or most recent) request to finish before their own request loads.
I'm not using any session.
I have tagged curl because Guzzle is built on top of it; maybe it has something to do with this?
Any workaround for this?
Using XHR won't fix it, because the site whose API I'm requesting does not accept other origins.
Check the available PHP processes if you are using PHP-FPM. It has a status page (the setup is described there) where you can get this information.
If all the workers are busy, then clients' requests will wait. You need to increase the number of workers to be able to process more requests at once.
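For example, the relevant PHP-FPM pool settings look roughly like this (the file path and numbers are illustrative, not recommendations):

; e.g. /etc/php/8.1/fpm/pool.d/www.conf (path varies by distribution)
pm = dynamic
pm.max_children = 50       ; maximum number of concurrent workers
pm.start_servers = 10
pm.min_spare_servers = 5
pm.max_spare_servers = 15
pm.status_path = /status   ; exposes the FPM status page mentioned above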
I'm implementing a way to trigger Jenkins jobs and to schedule them with PHP on my site. The way I'm doing the "on-the-fly" trigger is by simply calling the URL of the job with some parameters (I'm also using the Build Token Root Plugin so I can trigger the jobs without authentication).
Example below:
$data = array(
    'job' => 'JOB NAME',
    'token' => 'job_token',
    'parameter1' => 'some parameter',
);
$options = array(
    'method' => 'POST',
    'data' => drupal_http_build_query($data),
    'timeout' => 15,
    'headers' => array('Content-Type' => 'application/x-www-form-urlencoded'),
);
drupal_http_request('http://localhost:8080/buildByToken/buildWithParameters', $options);
I can trigger the job with multiple parameters, but I need to schedule the build. In Jenkins there is the option "Build periodically", but it's not a parameter.
Does anyone know a way to schedule the job through the URL?
Thanks!
You can add ?delay=300secs to the end of the URL to schedule the job to start in five minutes.
Note that sec and secs are currently the only accepted duration units.
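Applied to the request above, that just means appending the parameter to the endpoint URL while the job parameters stay in the POST body (the five-minute delay is only an example):

// Schedule the parameterized build to start in five minutes.
$url = 'http://localhost:8080/buildByToken/buildWithParameters?delay=300secs';
drupal_http_request($url, $options);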
I have an SMS API that enables sending text messages, and I want to use it to send SMSes in bulk.
Users can enter up to 30,000 numbers at once and send SMSes. What I am using might be a really bad approach:
foreach ($targets as $target) {
    sendSms($target, $text, $extra_parms);
}
It takes 10 minutes to process 10,000 requests (SMSes), which is too much. What I want is that when a user clicks the "Send" button, they get a message like:
"Your SMS(s) have been added to queue to be sent"
And all the SMSes should be sent in the background. How can I do that?
Thanks for the help.
Follow the process below:
1. Instead of directly calling the API, insert all the data into your database.
2. Once the numbers and text are added to your DB, show the user the message "Your SMS(s) have been added to queue to be sent".
3. Use a background process which takes the data from the DB and makes asynchronous API requests using http://www.php.net/manual/en/function.curl-multi-exec.php (see the sketch below).
4. Update/delete the processed API records in the database, so next time you fetch only the data that has not been processed yet.
Please note: a normal cURL request is synchronous and waits for the response, which causes the delay.
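A minimal sketch of the curl_multi part, assuming hypothetical fetch_pending_sms(), build_sms_url() and mark_sms_processed() helpers that read unsent rows from the DB, turn a row into an API call, and flag a row as done:

// Fire a batch of SMS API calls in parallel instead of one by one.
$multi   = curl_multi_init();
$handles = array();

foreach (fetch_pending_sms() as $row) {        // hypothetical DB helper
    $ch = curl_init(build_sms_url($row));      // hypothetical URL builder
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($multi, $ch);
    $handles[$row['id']] = $ch;
}

// Run all handles until every transfer has finished.
do {
    $status = curl_multi_exec($multi, $running);
    if ($running) {
        curl_multi_select($multi);             // wait for activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);

// Collect results and mark the rows as processed.
foreach ($handles as $id => $ch) {
    $response = curl_multi_getcontent($ch);
    mark_sms_processed($id, $response);        // hypothetical DB update
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);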
Create a queue (e.g. in the database) and put all your SMS actions there. Create a separate script that sends the SMSes (in batches, or all at once) and have it started periodically (e.g. via cURL) to deal with your queue.
That's what I use:
function fast_post($url, $data) {
    ignore_user_abort(true);
    $ch = curl_init();
    $defaults = array(
        CURLOPT_POST => 1,
        CURLOPT_HEADER => 0,
        CURLOPT_URL => $url,
        CURLOPT_FRESH_CONNECT => 1,
        CURLOPT_RETURNTRANSFER => 1,
        CURLOPT_FORBID_REUSE => 1,
        CURLOPT_TIMEOUT => 1, // short timeout: fire the request without waiting for the full response
        CURLOPT_POSTFIELDS => http_build_query($data)
    );
    curl_setopt_array($ch, $defaults);
    curl_exec($ch);
    curl_close($ch);
}
Note that in this function $data is an array.
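For example, the queue worker could be kicked off without blocking the page; the URL and payload here are placeholders:

// Trigger the background script that drains the SMS queue; the short cURL
// timeout in fast_post() means this call returns almost immediately.
fast_post('https://example.com/process_sms_queue.php', array('batch_size' => 100));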
Since I got tired of repetitively clicking/waiting/clicking in the Amazon Web Services GUI, I needed an EC2 script to:
- Stop the instance specified at the bash command line
- Detach a specified volume
- Create a new volume from a specified snapshot
- Start the instance up again
It can of course be done with the GUI, but it's such a pain. This way I can just let the script run for five minutes while I get coffee, instead of having to attend to it.
Syntax:
php reprovision.php i-xxxx vol-xxxx snap-xxxx
reprovision.php:
<?php
require 'aws.php';
$config = aws_setup();
$ec2Client = \Aws\Ec2\Ec2Client::factory($config);
$stop = $argv[1];
$detach = $argv[2];
$snapshot = $argv[3];
// Stop the instance and give it time to shut down
$ec2Client->stopInstances(array('InstanceIds' => array($stop)));
sleep(60);
// Detach the old volume
$ec2Client->detachVolume(array('VolumeId' => $detach));
sleep(10);
// Create a replacement volume from the snapshot (in the instance's availability zone)
$vol = $ec2Client->createVolume(array('SnapshotId' => $snapshot, 'AvailabilityZone' => 'us-east-1a'));
sleep(10);
// Attach the new volume as the root device (SDK2 results are array-accessible)
$ec2Client->attachVolume(array('VolumeId' => $vol['VolumeId'], 'InstanceId' => $stop, 'Device' => '/dev/sda1'));
sleep(10);
// Start the instance back up
$ec2Client->startInstances(array('InstanceIds' => array($stop)));
'aws_setup()' gets the configuration array used to create the EC2 client on the next line.
The command-line arguments are then assigned to variables.
The next version of the script would ideally use the EC2 wait functions instead of PHP's 'sleep'.
AWS PHP SDK2 EC2 Client API
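For reference, a sketch of what that could look like with the SDK2 waiters, assuming the standard waiter names (InstanceStopped, VolumeAvailable, InstanceRunning) are available in the installed SDK version:

// Replace the fixed sleep() calls with waiters that poll until the
// resource reaches the expected state.
$ec2Client->stopInstances(array('InstanceIds' => array($stop)));
$ec2Client->waitUntil('InstanceStopped', array('InstanceIds' => array($stop)));

$ec2Client->detachVolume(array('VolumeId' => $detach));

$vol = $ec2Client->createVolume(array('SnapshotId' => $snapshot, 'AvailabilityZone' => 'us-east-1a'));
$ec2Client->waitUntil('VolumeAvailable', array('VolumeIds' => array($vol['VolumeId'])));

$ec2Client->attachVolume(array('VolumeId' => $vol['VolumeId'], 'InstanceId' => $stop, 'Device' => '/dev/sda1'));

$ec2Client->startInstances(array('InstanceIds' => array($stop)));
$ec2Client->waitUntil('InstanceRunning', array('InstanceIds' => array($stop)));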