I have some functions that take a lot of time, and I want them to be executed in the background and to send an email when they're finished.
These functions generate a PDF as a Response and, if possible, I want this PDF to be attached to the email.
This is the standalone function call that takes a lot of time:
$passages = $em->getRepository(PasserColle::class)->calculClassementAction($id, $group);
This function call is included in my Controller, in a function imprimerAction($id, $request) that returns:
return new Response($html2pdf->Output('Classement.pdf'), 200, array('Content-Type' => 'application/pdf'));
I tried to use the Process Component, but I can't make it work since I don't understand what to put in the parentheses:
$process = new Process('ls -lsa');
And I don't know how to get the output I want.
You can use RabbitMQ to do this and this bundle for Symfony.
The concept is simple. You'll have Producers that send messages (in whatever format you want) and Consumers that consume these messages. Messages are published to an exchange and routed to queues where Consumers are waiting for new messages.
In your case, you can produce a message that tells a Consumer to generate the PDF and send the email. Publish the message in JSON format, for example with your $id and $group, and the Consumer will do the rest.
Follow this link, which explains how you can do this.
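For illustration, here is a minimal producer sketch using php-amqplib directly (the linked bundle wraps the same idea); the queue name 'pdf_generation' and the connection details are placeholders, and $id and $group are the values from your controller:

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Connect to RabbitMQ and declare a durable queue (placeholder credentials).
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('pdf_generation', false, true, false, false);

// Publish a JSON message carrying everything the consumer needs.
$payload = json_encode(array('id' => $id, 'group' => $group));
$channel->basic_publish(new AMQPMessage($payload), '', 'pdf_generation');

$channel->close();
$connection->close();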
Otherwise, if you want to use the Process Component, you can simply create a Console Command and then do this: $process = new Process('php bin/console yourcommand') followed by $process->run().
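A rough sketch of that approach (the command name app:generate-ranking is hypothetical; newer versions of the Process component expect an array of arguments, while older ones accept a command string as above):

use Symfony\Component\Process\Process;

// Start the console command in the background so the HTTP response
// can return immediately; run() would block until the command finishes.
$process = new Process(['php', 'bin/console', 'app:generate-ranking', $id, $group]);
$process->start();

Inside the command you would call calculClassementAction(), render the PDF, and send the email with the PDF attached.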
I just hope it helps.
I'm trying to listen for subscription changes (new and existing) of my Google Play app on the server. Here's the code I'm using. This uses the google/cloud-pubsub composer package:
$projectId = 'app-name';
$keyFile = file_get_contents(storage_path('app/app-name.json'));
$pubsub = new PubSubClient([
'projectId' => $projectId,
'keyFile' => json_decode($keyFile, true)
]);
$httpPostRequestBody = file_get_contents('php://input');
$requestData = json_decode($httpPostRequestBody, true);
info(json_encode($requestData));
$message = $pubsub->consume($requestData);
info(json_encode($message));
The code above works, but the problem is that the data I get doesn't match what I'm getting on the app side. Here's a sample of the data:
{
"message":{
"data":"eyJ2ZXJ...",
"messageId":"16797998xxxxxxxxx",
"message_id":"1679799xxxxxxxxx",
"publishTime":"2020-12-15T02:09:23.27Z",
"publish_time":"2020-12-15T02:09:23.27Z"
},
"subscription":"projects\/app-name\/subscriptions\/test-subs"
}
If you base64_decode() the data, you'll get something like this:
{
version: "1.0",
packageName: "com.dev.app",
eventTimeMillis: "1607997631636",
subscriptionNotification: {
version: "1.0",
notificationType: 4,
purchaseToken: "kmloa....",
subscriptionId: "app_subs1"
}
}
This is where I'm expecting the purchaseToken to be the same as the one I'm getting from the client side.
Here's the code in the client-side. I'm using Expo in-app purchases to implement subscriptions:
setPurchaseListener(async ({ responseCode, results, errorCode }) => {
  if (responseCode === IAPResponseCode.OK) {
    const { orderId, purchaseToken, acknowledged } = results[0];
    if (!acknowledged) {
      await instance.post("/subscribe", {
        order_id: orderId,
        order_token: purchaseToken,
        data: JSON.stringify(results[0]),
      });
      finishTransactionAsync(results[0], true);
      alert(
        "You're now subscribed! You can now use the full functionality of the app."
      );
    }
  }
});
I'm expecting the purchaseToken I'm extracting from results[0] to be the same as the one the Google server is returning when it pushes the notification to the endpoint. But it doesn't.
Update
I think my main problem is that I'm assuming all the data I need will be coming from Google Play, so I'm just relying on the data published by Google when a user subscribes in the app.
This isn't actually the one that publishes the message:
await instance.post("/subscribe")
It just updates the database with the purchase token. I could just use this to subscribe the user, but there's no guarantee that the request is legitimate: someone could construct the necessary credentials based on an existing user and pretty much subscribe without paying anything. Plus, this method can't be used to keep the user subscribed. So the data really has to come from Google.
Based on the answer below, I now realize that you're supposed to trigger the publish from your own server and then listen for that? So when I call this from the client:
await instance.post("/subscribe", {
purchaseToken
});
I actually need to publish the message containing the purchase token like so:
$pubsub = new PubSubClient([
'projectId' => $projectId,
]);
$topic = $pubsub->topic($topicName);
$message = [
'purchaseToken' => request('purchaseToken')
];
$topic->publish(['data' => $message]);
Is that what you're saying? The only problem with this approach is how to validate that the purchase token is legitimate, and how to renew the subscription on the server. I have a field that needs to be updated each month so the user stays "subscribed" in the eyes of the server.
Maybe I'm just overcomplicating things by using Pub/Sub. If there's an API I could pull data from regularly (using cron) that lets me keep the user's subscription data updated, that would also be acceptable as an answer.
First of all, I have had a really bad experience with PHP and Pub/Sub because of the PHP PubSubClient. If your script is only waiting for pushes and checking the messages, then remove the pubsub package and handle it with a few lines of code.
Example:
$message = file_get_contents('php://input');
$message = json_decode($message, true);
if (is_array($message)) {
    $message = (isset($message['message']) && isset($message['message']['data'])) ? base64_decode($message['message']['data']) : false;
    if (is_string($message)) {
        $message = json_decode($message, true);
        if (is_array($message)) {
            $type = (isset($message['type'])) ? $message['type'] : null;
            $data = (isset($message['data'])) ? $message['data'] : [];
        }
    }
}
I'm not sure how everything works on your side but if this part publishes the message:
await instance.post("/subscribe", {
order_id: orderId,
order_token: purchaseToken,
data: JSON.stringify(results[0]),
});
It looks like a proxy method for publishing your messages, because the payload sent with it doesn't follow the Pub/Sub message schema, and the final message doesn't look like an IAPQueryResponse either.
If I were in your situation, I would check a few things to debug the problem:
How I publish/read messages to/from Pub/Sub (topic, subscription and message payload)
Write the publish mechanism as described in the Google Pub/Sub publish documentation (see the sketch after this list)
Check that the project, topic and subscription are correct
If everything is set up correctly, compare all the other message data
If the problem persists, try publishing a minimal amount of data to Pub/Sub - just the purchaseToken at first - to check what breaks the messages
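Here is a minimal publish sketch with google/cloud-pubsub, assuming default credentials and a placeholder topic name; note that 'data' must be a string, so structured payloads need json_encode():

use Google\Cloud\PubSub\PubSubClient;

$pubsub = new PubSubClient(['projectId' => 'app-name']);
$topic = $pubsub->topic('your-topic'); // placeholder topic name

// 'data' must be a string, so encode the structured payload as JSON.
$topic->publish([
    'data' => json_encode(['purchaseToken' => request('purchaseToken')]),
]);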
For easier debugging:
Create a pull subscription
When you publish a message, check the pull subscription's messages with "View messages"
To me, the problem is not in Pub/Sub itself but in your implementation of publishing/receiving the messages.
UPDATE 21-12-2020:
Flow:
The customer creates/renews a subscription
You publish to Pub/Sub with authentication
Pub/Sub delivers the message to your analysis application via "push" so you can do your analysis.
If you need information like:
New subscriber count
Renewal count
Active subscription count
You can create your own analysis application, but if you need something more complicated then you have to pick a tool that meets your needs.
You can also get the messages from Pub/Sub with "pull", but there are a few issues I've run into:
The last time I used pull, Pub/Sub returned a random number of messages: with a limit of 50 and more than 50 messages in the queue, I expected to get 50 back, but sometimes Pub/Sub gave me fewer.
Pub/Sub returned messages in random order; there is now an option to use an ordering key, but that's fairly new.
To implement "pull" you have to run cron jobs or something similar; with "push" you receive the message as soon as it's available.
With "pull" you have to depend on a library/package (or whatever it's called in your language), but with "push" you can handle the message with just a few lines of code, as in my PHP example above.
I am trying to set up outgoing calls through the Twilio TaskRouter. I am creating tasks through PHP with all the necessary attributes (instruction, to, from, post_work_activity_sid), but the created task doesn't set up a call between the Twilio Client and the external phone number. I was hoping that tasks created by the program would create a conference call between the worker (browser) and the external client. I keep getting the error shown below. I have an assignment PHP script on my application server which dequeues calls to my workers (browser clients). Currently, incoming calls from an external number to browser clients through TaskRouter work as expected. However, an outbound call generates a task and a reservation is assigned, but Twilio is not able to dequeue the call to a worker.
Is there a way to create a task for a voice call such that the task is created using the TwiML <Enqueue> verb? Or is there a better way of handling outbound calls using the Twilio TaskRouter so that calls are assigned successfully to workers using the browser client?
As per this thread: Can outbound calls be made through Twilio TaskRouter, I tried using the call instruction. I have also gone through the documentation and another Stack Overflow post about the assignment callback URL, but it's not clear and I'm not sure what I could be doing wrong.
Error message:
The dequeue instruction can only be issued on a task that was created using the <Enqueue> TwiML verb
<?php
require_once('TwilioVendor/autoload.php'); // Loads the library
use Twilio\Rest\Client;
$sid = "ACxxxxxxxxxxxxxxxxxxxxxxx";
$token = "xxxxxxxxxxxxxxxxxxxxxxxxxxx";
try {
    $twilio = new Client($sid, $token);
    $task = $twilio->taskrouter->v1
        ->workspaces("WSxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
        ->tasks
        ->create(array(
            "attributes" => json_encode(array(
                //"instruction" => "accept",
                //"instruction" => "conference",
                "instruction" => "call",
                "to" => "client:Bob",
                "from" => "+61123456789",
                "post_work_activity_sid" => "WAxxxxxxxxxxxxxxxxxxxx"
            )),
            "workflowSid" => "WWxxxxxxxxxxxxxxxxxx"
        ));
    print($task->sid);
} catch (Exception $e) {
    echo 'Caught exception: ', $e->getMessage(), "\n";
}
**Assignment Callback code**
<?php
$assignment_instruction = [
'instruction' => 'call','to'=> 'client:Bob',
'from' => '+61xxxxx','url'=>'CRM REST END POINT'
];
header('Content-Type: application/json');
echo json_encode($assignment_instruction);
**CRM REST END POINT TWIML**
<?php
require __DIR__ . '/vendor/autoload.php';
require_once 'TwilioVendor/autoload.php';
use Twilio\Twiml;
$reservationSid = $_REQUEST['rsid'];
header('Content-Type: text/xml');
?>
<?xml version="1.0" encoding="UTF-8"?>
<Response>
<Say voice="woman">You will now be connected to the customer</Say>
<Dial>
<Queue reservationSid="<?php echo $reservationSid; ?>"/>
</Dial>
</Response>
Twilio developer evangelist here.
TaskRouter will only generate calls to your workers when a task is created by the <Enqueue> TwiML verb. Creating a task with the REST API, even if you add call attributes, will not generate a call when you use the dequeue or call instruction.
Instead, you will need to manage the task and call yourself. When your worker is sent the reservation and accepts it, you should use the REST API to create the call, connect it to your browser client and then dial out to the end user.
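As a rough sketch of that approach (not TaskRouter itself): when your reservation-accept handler runs, create the call with the REST API. The client name, number and TwiML URL below are placeholders:

use Twilio\Rest\Client;

$twilio = new Client($sid, $token);

// Call the worker's browser client first; the TwiML fetched from 'url'
// would then <Dial> the end customer's number.
$call = $twilio->calls->create(
    'client:Bob',                                  // to: the worker's browser client
    '+61123456789',                                // from: your Twilio number
    ['url' => 'https://example.com/dial-customer'] // placeholder TwiML endpoint
);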
I know it's somewhat weird to ask something like this, but I'm trying to program a Telegram bot with PHP.
The bot is in a channel (e.g. Channel A) and I'm going to send messages in that channel, so the bot will copy X messages to another channel (Channel B) every Y minutes.
Example:
X = 5
Y = 60
Channel A = ID .....
Channel B = ID .....
So it will COPY 5 messages from A to B every hour...
Can anybody write me a template please? I think I can configure the VPS and webhook stuff (SSL and etc).
If you need to send messages every few minutes and receive messages from the Telegram callback, you should read about queues (ZeroMQ, Redis, Gearman, etc.).
Create daemons. These are your bots. They can read messages from the queue and send callbacks.
Write a controller to receive the callback from Telegram. It takes the message and pushes it to the queue.
Install the Ev or Event extension for PHP. (You can use ReactPHP, which is a simple way to create a timer; see the sketch after these steps.)
Bot1 creates a timer and listens for messages. Once there are more than 5 messages, the timer can push them to the queue for Bot2.
You can use reactphp/zmq and nrk/predis-async to help with this task.
P.S. This is the simplest solution, but you could also use pthreads (instead of creating daemon processes) or a plain socket to send messages to the bot.
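A minimal sketch of the timer part with ReactPHP (assuming react/event-loop is installed via Composer); the forwarding logic is a stub to replace with your own queue reading and Bot API calls:

<?php
require __DIR__ . '/vendor/autoload.php';

use React\EventLoop\Factory;

$loop = Factory::create();

// Every Y minutes (3600 seconds here), forward up to X queued messages.
$loop->addPeriodicTimer(3600, function () {
    // Stub: pop up to 5 messages from your queue and re-post them
    // to Channel B through the Bot API.
    echo "timer fired, forward queued messages here\n";
});

$loop->run();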
If you want to use the webhook approach, you can do this.
Write some sample code like this:
<?php
$texts_from_other_channel = [];
array_push($texts_from_other_channel, $update_array['message']['text']);
$t_size = sizeof($texts_from_other_channel);
for ($i = 0; $i < $t_size; $i++) {
    $post_prs = ['chat_id' => $channel_id, 'text' => $texts_from_other_channel[$i]];
    send_reply($sendmessag_url, $post_prs);
}
?>
Other things, like the send_reply() function or $update_array, are up to you; I leave those to you.
I'm working on a trace logger of sorts that pushes log message requests onto a Queue on a Service Bus, to later be picked up by a worker role which inserts them into the table store. While running on my machine, this works just fine (since I'm the only one using it), but once I put it up on a server to test, it produced the following error:
HTTP_Request2_MessageException: Malformed response: in D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php on line 1013
0 HTTP_Request2_Response->__construct('', true, Object(Net_URL2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:1013
1 HTTP_Request2_Adapter_Socket->readResponse() D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2\Adapter\Socket.php:139
2 HTTP_Request2_Adapter_Socket->sendRequest(Object(HTTP_Request2)) D:\home\site\wwwroot\vendor\pear-pear.php.net\HTTP_Request2\HTTP\Request2.php:939
3 HTTP_Request2->send() D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\Http\HttpClient.php:262
4 WindowsAzure\Common\Internal\Http\HttpClient->send(Array, Object(WindowsAzure\Common\Internal\Http\Url)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\RestProxy.php:141
5 WindowsAzure\Common\Internal\RestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\Common\Internal\ServiceRestProxy.php:86
6 WindowsAzure\Common\Internal\ServiceRestProxy->sendContext(Object(WindowsAzure\Common\Internal\Http\HttpCallContext)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:139
7 WindowsAzure\ServiceBus\ServiceBusRestProxy->sendMessage('<queuename>/mes…', Object(WindowsAzure\ServiceBus\Models\BrokeredMessage)) D:\home\site\wwwroot\vendor\microsoft\windowsazure\WindowsAzure\ServiceBus\ServiceBusRestProxy.php:155
⋮
I've seen previous posts that describe similar issues, namely:
Windows Azure PHP Queue REST Proxy Limit (Stack Overflow)
Operations on HTTPS do not work correctly (GitHub)
These imply that this is a known issue with the PHP Azure Storage libraries, where only a limited number of HTTPS connections are allowed. Before requirements were changed, I was accessing the table store directly, ran into this same issue, and fixed it in the way the first link describes.
The problem is that the Service Bus endpoint in the connection string, unlike Table Store (etc.) connection string endpoints, MUST be 'HTTPS'. Trying to use it with 'HTTP' will return a 400 - Bad Request error.
I was wondering if anyone had any ideas on a potential workaround. Any advice would be greatly appreciated.
Thanks!
EDIT (After Gary Liu's Comment):
Here's the code I use to add items to the queue:
private function logToAzureSB($source, $msg, $severity, $machine)
{
    // Gather all relevant information
    $msgInfo = array(
        "Severity" => $severity,
        "Message" => $msg,
        "Machine" => $machine,
        "Source" => $source
    );
    // Encode it to a JSON string, and add it to a Brokered message.
    $encoded = json_encode($msgInfo);
    $message = new BrokeredMessage($encoded);
    $message->setContentType("application/json");
    // Attempt to push the message onto the Queue
    try
    {
        $this->sbRestProxy->sendQueueMessage($this->azureQueueName, $message);
    }
    catch (ServiceException $e)
    {
        throw new \DatabaseException($e->getMessage(), $e->getCode(), $e->getPrevious());
    }
}
Here, $this->sbRestProxy is a Service Bus REST Proxy, set up when the logging class initializes.
On the receiving end of things, here's the code on the worker role side:
public override void Run()
{
// Initiates the message pump and callback is invoked for each message that is received, calling close on the client will stop the pump.
Client.OnMessage((receivedMessage) =>
{
try
{
// Pull the Message from the received object.
Stream stream = receivedMessage.GetBody<Stream>();
StreamReader reader = new StreamReader(stream);
string message = reader.ReadToEnd();
LoggingMessage mMsg = JsonConvert.DeserializeObject<LoggingMessage>(message);
// Create an entry with the information given.
LogEntry entry = new LogEntry(mMsg);
// Set the Logger to the appropriate table store, and insert the entry into the table.
Logger.InsertIntoLog(entry, mMsg.Service);
}
catch
{
// Handle any message processing specific exceptions here
}
});
CompletedEvent.WaitOne();
}
Where LoggingMessage is a simple object that contains basically the same fields as the message logged in PHP (used for JSON deserialization), LogEntry is a TableEntity which contains these fields as well, and Logger is an instance of a Table Store logger, set up during the worker role's OnStart method.
This was a known issue with the Windows Azure PHP SDK, which hasn't been looked at in a long time, nor has it been fixed. In the time between when I posted this and now, we ended up writing a separate API web service for logging and had our PHP code send JSON strings to it over cURL, which works well enough as a temporary workaround. We're moving off of PHP now, so this won't be an issue for much longer anyway.
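For what it's worth, the cURL workaround looked roughly like this (the endpoint URL and payload fields are illustrative, not the real service):

// Send the log entry as JSON to our own logging web service instead of Service Bus.
$payload = json_encode(array(
    'Severity' => $severity,
    'Message'  => $msg,
    'Machine'  => $machine,
    'Source'   => $source
));
$ch = curl_init('https://logging.example.com/api/log'); // placeholder endpoint
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_HTTPHEADER     => array('Content-Type: application/json'),
    CURLOPT_RETURNTRANSFER => true,
));
$response = curl_exec($ch);
curl_close($ch);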
I'm using the codeigniter-gcm library on top of CodeIgniter to send messages to the Google Cloud Messaging service. It sends the message and the message is received on the mobile device, but if I send multiple messages, only the latest one appears on the device (as if it overwrites the previous ones).
I gather that I might need to create a unique notification ID, but I don't see how that's done anywhere in the codeigniter-gcm documentation or Google's documentation for downstream messages.
Any idea how this should be done?
Here's my code in the codeigniter controller. It is worth mentioning that Google's response contains a different message_id for each time I send a push...
public function index() {
    $this->load->library("gcm");
    $this->gcm->setMessage("Test message sent on " . date("d.m.Y H:i:s"));
    $this->gcm->addRecepient("*****************");
    $this->gcm->setData(array(
        'title' => 'my title',
        'some_key' => 'some_val'
    ));
    $this->gcm->setTtl(false);
    $this->gcm->setGroup(false);

    if ($this->gcm->send())
        echo 'Success for all messages';
    else
        echo 'Some messages have errors';

    print_r($this->gcm->status);
    print_r($this->gcm->messagesStatuses);
}
After three exhausting days I found the solution. I'm posting it here in hope of saving someone else's time...
I had to add a parameter named "notId" to the data object inside the larger JSON object, with a unique integer value (I chose a random integer from a wide range). Why Google didn't include this in their docs beats me...
Here's how my JSON looks now that it creates separate notifications instead of overwriting:
{
"data": {
"some_key":"some_val",
"title":"test title",
"message":"Test message from 30.09.2015 12:57:44",
"notId":14243
},
"registration_ids":["*******"]
}
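For reference, a sketch of how that payload could be produced with the setData() call from the controller in the question (the random range is arbitrary):

// Add a unique notId to the data payload so each push shows up
// as a separate notification instead of replacing the previous one.
$this->gcm->setData(array(
    'title'    => 'test title',
    'some_key' => 'some_val',
    'notId'    => mt_rand(1, 100000) // arbitrary unique integer per push
));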
Edit:
I'm now thinking that the notId parameter is not really determined by Google, but by a plugin I use on the mobile app side.
To give more detail on my environment: my mobile app is developed using PhoneGap, so to get push notifications I use phonegap-plugin-push, and I now see that parameter name in its docs.
I'm kind of lost now as far as explaining the situation, but happy it's no longer a problem for me :-)
You need to pass a unique ID to each notification. Once you have clicked on the notification you use that ID to remove it.
...
mNotificationManager = (NotificationManager) getSystemService(NOTIFICATION_SERVICE);
mNotificationManager.cancel(SIMPLE_NOTFICATION_ID_A);
...
But I'm sure you shouldn't show the user that many notifications at once. You should show a single notification that consolidates info about a group of events, like the Gmail client does, for example. Use Notification.Builder for that purpose.
NotificationCompat.Builder b = new NotificationCompat.Builder(c);
b.setNumber(g_push.Counter)
.setLargeIcon(BitmapFactory.decodeResource(c.getResources(), R.drawable.list_avatar))
.setSmallIcon(R.drawable.ic_stat_example)
.setAutoCancel(true)
.setContentTitle(pushCount > 1 ? c.getString(R.string.stat_messages_title) + pushCount : title)
.setContentText(pushCount > 1 ? push.ProfileID : mess)
.setWhen(g_push.Timestamp)
.setContentIntent(PendingIntent.getActivity(c, 0, it, PendingIntent.FLAG_UPDATE_CURRENT))
.setDeleteIntent(PendingIntent.getBroadcast(c, 0, new Intent(ACTION_CLEAR_NOTIFICATION), PendingIntent.FLAG_CANCEL_CURRENT))
.setDefaults(Notification.DEFAULT_LIGHTS | Notification.DEFAULT_VIBRATE)
.setSound(Uri.parse(prefs.getString(
SharedPreferencesID.PREFERENCE_ID_PUSH_SOUND_URI,
"android.resource://ru.mail.mailapp/raw/new_message_bells")));