Reprovisioning a server with a PHP AWS SDK2 script - php

Since I got tired of repetitively clicking and waiting in the Amazon Web Services web console, I needed an EC2 script to:
Stop the instance specified on the command line
Detach a specified volume
Create a new volume from a specified snapshot
Attach the new volume to the instance
Start the instance up again
It can of course be done with the GUI, but it's such a pain. This way I can just let the script run for five minutes while I get coffee instead of having to attend to it.

Syntax:
php reprovision.php i-xxxx vol-xxxx snap-xxxx
reprovision.php:
<?php
require 'aws.php';

// Build the EC2 client from the shared configuration array
$config = aws_setup();
$ec2Client = \Aws\Ec2\Ec2Client::factory($config);

// Command-line arguments: instance to stop, volume to detach, snapshot to provision from
$stop = $argv[1];
$detach = $argv[2];
$snapshot = $argv[3];

// Stop the instance and give it time to shut down
$ec2Client->stopInstances(array('InstanceIds' => array($stop)));
sleep(60);

// Detach the old volume
$ec2Client->detachVolume(array('VolumeId' => $detach));
sleep(10);

// Create a replacement volume from the snapshot
$vol = $ec2Client->createVolume(array('SnapshotId' => $snapshot, 'AvailabilityZone' => 'us-east-1a'));
sleep(10);

// Attach the new volume as the root device, then start the instance again
$ec2Client->attachVolume(array('VolumeId' => $vol['VolumeId'], 'InstanceId' => $stop, 'Device' => '/dev/sda1'));
sleep(10);
$ec2Client->startInstances(array('InstanceIds' => array($stop)));
'aws_setup()' returns the configuration array used to create the EC2 client on the following line.
The command-line arguments are then assigned to variables.
The next version of the script would ideally use the EC2 waiters instead of PHP's 'sleep'; a rough sketch of that is below.
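For illustration, here is a minimal sketch of what that could look like, assuming SDK2's generic waitUntil() helper and the InstanceStopped, VolumeAvailable, and InstanceRunning waiter names (check the SDK docs for the exact waiter names in your version):
// Hypothetical replacement for the sleep() calls, using SDK2 waiters
$ec2Client->stopInstances(array('InstanceIds' => array($stop)));
$ec2Client->waitUntil('InstanceStopped', array('InstanceIds' => array($stop)));

$ec2Client->detachVolume(array('VolumeId' => $detach));
$ec2Client->waitUntil('VolumeAvailable', array('VolumeIds' => array($detach)));

$vol = $ec2Client->createVolume(array('SnapshotId' => $snapshot, 'AvailabilityZone' => 'us-east-1a'));
$ec2Client->waitUntil('VolumeAvailable', array('VolumeIds' => array($vol['VolumeId'])));

$ec2Client->attachVolume(array('VolumeId' => $vol['VolumeId'], 'InstanceId' => $stop, 'Device' => '/dev/sda1'));
$ec2Client->startInstances(array('InstanceIds' => array($stop)));
$ec2Client->waitUntil('InstanceRunning', array('InstanceIds' => array($stop)));
This removes the guesswork of fixed sleeps: each waiter polls the API until the resource reaches the expected state.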
AWS PHP SDK2 EC2 Client API

Related

Delete or Stop Scheduled State machines of Step Function

We created a Step Function (with a WAIT state) to execute a scheduled task based on a user-supplied time.
We call this Step Function from PHP code, which creates an entry in the state machine; the Step Function (WAIT state) then triggers a Lambda automatically when the countdown timer is reached.
My requirement is for the user to have the option to update the time or cancel the event from the PHP application. In that scenario I have to update the existing scheduled Step Function event/task to the new time, or delete the existing scheduled event and create a new one with the latest time.
How can I do this from the PHP application?
Below is my PHP code to create the event in the AWS Step Function.
// JSON payload with the schedule time (UTC) and the user id
$inputData = '{"invocationTime" : "2022-10-28T13:15:16Z", "userid" : "1233345"}';
$data = array(
    // This is the schedule in UTC time.
    'input' => $inputData,
    'name' => 'Test Charan',
    // STATIC
    'stateMachineArn' => $awsDataarn // AWS stateMachineArn
);
$inputdataaws = array(
    'http' => array(
        'method' => 'POST',
        'content' => json_encode($data),
        'header' => "x-api-key: " . $awsDataapiKey . "\r\n" .
                    "Content-Type: application/json\r\n"
    )
);
$url = 'https://testcharan.execute-api.us-east-1.amazonaws.com/myapplication/scheduletask'; // AWS endpoint URL
$request = stream_context_create($inputdataaws); // request context for the state machine call
$result = file_get_contents($url, false, $request); // send the request and read the response
$response = json_decode($result); // decode the result
The above code creates the event in the AWS Step Function.
How can I update, delete, or abort events/executions that are in Running status?
You can't modify a state machine execution directly at runtime (in my experience).
I suggest you:
manage the scheduling of the execution with EventBridge (https://docs.aws.amazon.com/step-functions/latest/dg/tutorial-cloudwatch-events-target.html)
create another Lambda that acts as a "trigger"
manage state machine executions through the CLI or SDK, as sketched below
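A minimal sketch of that last option, using the AWS SDK for PHP v3 Step Functions client (the region, the ARN variable, and the cause text are assumptions for illustration):
<?php
require 'vendor/autoload.php';

use Aws\Sfn\SfnClient;

// Illustrative client configuration
$sfn = new SfnClient(array('region' => 'us-east-1', 'version' => 'latest'));

// Find the running executions of the state machine
$running = $sfn->listExecutions(array(
    'stateMachineArn' => $awsDataarn,   // same ARN used when scheduling
    'statusFilter'    => 'RUNNING'
));

// Stop each one (or match on name/input to stop a specific user's event)
foreach ($running['executions'] as $execution) {
    $sfn->stopExecution(array(
        'executionArn' => $execution['executionArn'],
        'cause'        => 'User rescheduled or cancelled the event'
    ));
}
After stopping the old execution you can start a fresh one (with startExecution or your existing API Gateway endpoint) carrying the updated invocationTime.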
How to stop all running Step Functions of a specific state machine?
It is likely that your use case could benefit from the callback pattern, which causes your state machine execution to wait until an event carrying the $$.Task.Token is received. You can set a timeout by providing HeartbeatSeconds, which will cause the task to fail with States.Timeout.
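If you go the callback route, the waiting task could be completed or cancelled from PHP roughly like this (reusing the SfnClient from the sketch above; $taskToken stands for whatever mechanism you use to hand the $$.Task.Token to the application):
// Complete the waiting task
$sfn->sendTaskSuccess(array(
    'taskToken' => $taskToken,
    'output'    => json_encode(array('status' => 'ok'))
));

// Or cancel the scheduled work instead
$sfn->sendTaskFailure(array(
    'taskToken' => $taskToken,
    'error'     => 'UserCancelled',
    'cause'     => 'The user cancelled the scheduled event'
));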

aws-sdk-php v3.44 fails to sendMessage to SQS FIFO Queue: MessageGroupId missing?

I am using the Amazon SDK for PHP version 3.44 (released 2017-11-30). I can connect to my Amazon SQS account and execute the listQueues(), getQueueUrl(), getQueueAttributes(), and receiveMessage() commands just fine. However, the sendMessage() command consistently fails with the following message:
The request must contain the parameter MessageGroupId.
I am most definitely including this parameter. It doesn't seem to matter which version of the aws-sdk-php API I use; this message keeps coming back. Here is my code:
$queue = SqsClient::factory([
    'profile' => $profile,
    'region' => $region,
    'version' => '2012-11-05',
    'credentials' => $credentials,
]);
$queue_list = $queue->listQueues(); // ok
$url = $queue->getQueueUrl(['QueueName' => $queue_name]); // ok
$received = $queue->receiveMessage(['QueueUrl' => $url->get('QueueUrl')]); // ok
$response = $queue->sendMessage([
    'MessageBody' => $message,
    'MessageGroupId' => $message_group_id,
    'QueueUrl' => $url->get('QueueUrl'),
]); // fails with a message indicating MessageGroupId is missing
I have spent several hours searching for a working example of sending a message up to an Amazon SQS FIFO queue through the PHP SDK, and am beginning to believe this is not possible. Has anybody out there been able to get the aws-sdk-php library to work with an SQS FIFO queue?
The first line creates an instance of SqsClient; it does not create an SQS queue. You still need to call $queue->createQueue (see the documentation). For FIFO queues, you need to set "FifoQueue" to "true" and configure "ContentBasedDeduplication" when creating the queue. When you send your message, depending on the ContentBasedDeduplication setting of the queue you created, you may or may not also need to send a "MessageDeduplicationId" along with the "MessageGroupId".
From your code, I can't see how you created the Queue.
Did you enable FIFO queues with the attribute "FifoQueue" => "true"?
Did you set "ContentBasedDeduplication" to "true" or "false"?
Did you name your queue with the extension ".fifo"?
I did all of these things, and configured my queue with ContentBasedDeduplication set to "false". When I send a message, the only other property that I'm sending that you aren't (along with the MessageGroupId) is MessageDeduplicationId. I'm able to send messages to the FIFO queue just fine using SDK 3.44.
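For reference, a minimal sketch of creating a FIFO queue and sending to it with aws-sdk-php v3 could look like the following (the queue name and attribute values are illustrative assumptions):
// Create the FIFO queue; the name must end in ".fifo"
$created = $queue->createQueue([
    'QueueName' => 'my-queue.fifo',            // illustrative name
    'Attributes' => [
        'FifoQueue' => 'true',
        'ContentBasedDeduplication' => 'false',
    ],
]);

// With ContentBasedDeduplication disabled, a MessageDeduplicationId is required
$queue->sendMessage([
    'QueueUrl' => $created->get('QueueUrl'),
    'MessageBody' => $message,
    'MessageGroupId' => $message_group_id,
    'MessageDeduplicationId' => uniqid('', true), // any id that is unique per message
]);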
It looks like Amazon has quietly resolved whatever bug was blocking my API call. I did not change my queue settings or my code. The same API call that resulted in error messages last week now runs just fine.
I ran into this problem on 3.3.0 forever. In my case, I just needed to upgrade to 3.44.2, then pass in MessageDeduplicationId in addition to MessageGroupId. I would probably double check your SDK version if you run into this issue.

Logging to CloudWatch from EC2 instances

My EC2 servers are currently hosting a website that logs each registered user's activity under their own separate log file on the local EC2 instance, say username.log. I'm trying to figure out a way to push log events for these to CloudWatch using the PHP SDK without slowing the application down, AND while still being able to maintain a separate log file for each registered member of my website.
I can't for the life of me figure this out:
OPTION 1: How can I log to CloudWatch asynchronously using the CloudWatch SDK? My PHP application is behaving VERY sluggishly, since each log line takes roughly 100ms to push directly to CloudWatch. Code sample is below.
OPTION 2: Alternatively, how could I configure an installed CloudWatch Agent on EC2 to simply OBSERVE all of my log files, uploading them asynchronously to CloudWatch for me in a separate process? The CloudWatch EC2 Logging Agent requires a static "configuration file" (AWS documentation) on your server which, to my knowledge, needs to list out all of your log files ("log streams") in advance, which I won't be able to predict at server startup. Is there any way around this (i.e., simply observe ALL log files in a directory)? Config file sample is below.
All ideas are welcome here, but I don't want my solution to simply be "throw all your logs into a single file, so that your log names are always predictable".
Thanks in advance!!!
OPTION 1: Logging via SDK (takes ~100ms / logEvent):
// Configuration to use for the CloudWatch client
$sharedConfig = [
    'region' => 'us-east-1',
    'version' => 'latest',
    'http' => [
        'verify' => false
    ]
];

// Create a CloudWatch Logs client
$cwClient = new Aws\CloudWatchLogs\CloudWatchLogsClient($sharedConfig);

// DESCRIBE ANY EXISTING LOG STREAMS / FILES
$create_new_stream = true;
$next_sequence_id = "0";
$result = $cwClient->describeLogStreams([
    'descending' => true,
    'logGroupName' => 'user_logs',
    'logStreamNamePrefix' => $stream,
]);

// Iterate through the results, looking for a stream that already exists with the intended name.
// This is so that we can get the next sequence id ('uploadSequenceToken'), so we can add a line to an existing log file.
foreach ($result->get("logStreams") as $stream_temp) {
    if ($stream_temp['logStreamName'] == $stream) {
        $create_new_stream = false;
        if (array_key_exists('uploadSequenceToken', $stream_temp)) {
            $next_sequence_id = $stream_temp['uploadSequenceToken'];
        }
        break;
    }
}

// CREATE A NEW LOG STREAM / FILE IF NECESSARY
if ($create_new_stream) {
    $result = $cwClient->createLogStream([
        'logGroupName' => 'user_logs',
        'logStreamName' => $stream,
    ]);
}

// PUSH A LINE TO THE LOG *** This step ALONE takes 70-100ms!!! ***
$result = $cwClient->putLogEvents([
    'logGroupName' => 'user_logs',
    'logStreamName' => $stream,
    'logEvents' => [
        [
            'timestamp' => round(microtime(true) * 1000),
            'message' => $msg,
        ],
    ],
    'sequenceToken' => $next_sequence_id
]);
OPTION 2: Logging via CloudWatch Installed Agent (note that the config file below only allows hardcoded, predetermined log names as far as I know):
[general]
state_file = /var/awslogs/state/agent-state
[applog]
file = /var/www/html/logs/applog.log
log_group_name = PP
log_stream_name = applog.log
datetime_format = %Y-%m-%d %H:%M:%S
Looks like we have some good news now... not sure if it's too late!
CloudWatch Log Configuration
So, to answer the question:
Is there any way around this (ie, simply observe ALL log files in a directory)?
Yes, you can specify log files and file paths using wildcards, which gives you some flexibility in configuring where the logs are fetched from and which log streams they are pushed to.
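For example, an awslogs agent stanza using a wildcard might look like this (the section name, paths, and group/stream names are illustrative):
[userlogs]
file = /var/www/html/logs/*.log
log_group_name = user_logs
log_stream_name = {instance_id}-userlogs
datetime_format = %Y-%m-%d %H:%M:%S
Note that, per the agent documentation, a wildcard pattern pushes only the most recently modified matching file, so check whether that behaviour fits the one-stream-per-user requirement.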

MQTT Subscribe with PHP to IBM Bluemix

I want to connect to IBM Bluemix through the MQTT protocol using PHP to subscribe to messages coming from the IoT Foundation.
I use this code:
<?php
require("../phpMQTT.php");

$config = array(
    'org_id' => 't9m318',
    'port' => '1883',
    'app_id' => 'phpmqtt',
    'iotf_api_key' => 'my api key',
    'iotf_api_secret' => 'my api secret',
    'device_id' => 'phpmqtt'
);
$config['server'] = $config['org_id'] . '.messaging.internetofthings.ibmcloud.com';
$config['client_id'] = 'a:' . $config['org_id'] . ':' . $config['app_id'];

$location = array();

// initialize client
$mqtt = new phpMQTT($config['server'], $config['port'], $config['client_id']);
$mqtt->debug = false;

// connect to broker
if (!$mqtt->connect(true, null, $config['iotf_api_key'], $config['iotf_api_secret'])) {
    echo 'ERROR: Could not connect to IoT cloud';
    exit();
}

$topics['iot-2/type/+/id/phpmqtt/evt/+/fmt/json'] =
    array("qos" => 0, "function" => "procmsg");
$mqtt->subscribe($topics, 0);

// process messages
while ($mqtt->proc(true)) {
}

// disconnect
$mqtt->close();

function procmsg($topic, $msg) {
    echo "Msg Received: $msg";
}
?>
But the browser show this message:
Fatal error: Maximum execution time of 30 seconds exceeded in /Library/WebServer/Documents/phpMQTT/phpMQTT.php on line 167
subscribe is not meant to run in the web browser as it has an infinite loop; it is best run from the command line.
If you are using the subscribe method to receive messages, you can look at persistent messages and breaking out of the loop on message receipt.
There is an example of how to use phpMQTT in the web browser in the file web-app.php of this repository: https://github.com/vvaswani/bluemix-iotf-device-tracker
You don't provide very much information about what you want to achieve by doing this; do you want to keep sending messages to the browser until the page is closed in the browser?
Server-Sent Events or WebSockets might be a better bet, and PHP might not be the best choice for this because it uses quite a lot of memory per connection (compared to node.js, for example).
However if you just want to remove the 30 second PHP timeout, then you can use this function:
http://php.net/manual/en/function.set-time-limit.php
Or set max_execution_time in php.ini:
http://php.net/manual/en/info.configuration.php
Setting the maximum execution time to 0 should stop it from timing out.
But be warned that PHP and/or your webserver will have a limited number of concurrent HTTP connections.
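A minimal illustration of the timeout change, placed before the subscribe loop (a value of 0 means no limit, and affects only the current script):
// Disable PHP's execution time limit for this long-running subscriber
set_time_limit(0);

while ($mqtt->proc(true)) {
    // process incoming messages indefinitely
}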

How to see 'syslog' output at app_server running PHP

I'm testing a PHP application on my computer using the app_server launched by "Google App Engine Launcher".
But in its logs I am not seeing the output from the syslog calls in my PHP code.
I've tried the parameters --log_level and --dev_appserver_log_level without any success.
Does anybody know what can be done?
My Google App Engine Launcher is version 1.8.6.
The default configuration should have syslogging enabled so no additional parameters should be necessary when launching app_server. Could you execute a very simple test script and post the output?
<?php
print 'Using syslog() '. (syslog(LOG_DEBUG, 'Testing syslog() functionality') ? 'succeeded' : 'failed');
After executing the above script and getting a positive result message you should find at least one entry in your local syslog.
If you want to read the logs programmatically (source [1]):
you can iterate over messages added by syslog() using AppEngine's LogService API:
use google\appengine\api\log\LogService;
use google\appengine\util as util;

$start = (float) $_GET["start"];
$end = (float) $_GET["end"];

$options = [
    'start_time' => $start * 1e6,
    'end_time' => $end * 1e6,
    'include_app_logs' => true
];

$logs = LogService::fetch($options);
[1] https://developers.google.com/appengine/docs/php/logs/
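To actually print the messages, a rough sketch (assuming the RequestLog::getAppLogs() and AppLog::getMessage() accessors of the LogService API) could look like this:
// Walk the fetched request logs and print each application log message
foreach ($logs as $requestLog) {
    foreach ($requestLog->getAppLogs() as $appLog) {
        echo $appLog->getMessage(), PHP_EOL;
    }
}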
