Progress bar for Symfony Process - PHP

I have a long-running task which I have incorporated as a Process in my Symfony project. This is how I call the process:
$rootDir = $this->get('kernel')->getRootDir();

$adk_process = new Process(
    'php ../bin/console app:adkaction ' . $numcampaigns . ', ' . $timezone . ',' . $depdate
);
$adk_process->setWorkingDirectory($rootDir);
$adk_process->setOptions(['suppress_errors' => false]);
$adk_process->setTimeout(null);
$adk_process->start();

while ($adk_process->isRunning()) {
    $currprogress = $adk_process->getIncrementalOutput();
    return $this->render('BackEnd/user.html.twig', [
        'form' => $form->createView(),
        'currprogress' => $currprogress
    ]);
}
My process currently does not have any output (it is parsing an XML file and pushing data to the DB). When done, the currprogress variable should be pushed into my .twig template, where it will populate a progress bar.
I need to show the progress of the file parsing (i.e. how many items were processed), as it can be up to 100k lines and the process can run for 2-3 hours.
At the moment I cannot get the incremental output from my process to push it over to my template. What would be the best way to do it?
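For reference, this is roughly the kind of polling setup I'm imagining instead of blocking on isRunning(): the command writes its counter somewhere, and a small endpoint returns it so the page can update the bar via AJAX. The file path, method name and JSON shape below are placeholders, not working code:
// In the console command, periodically record how far the parser has got
// ($processed / $total are whatever the XML loop tracks; the path is a placeholder).
file_put_contents(
    sys_get_temp_dir() . '/adk_progress.json',
    json_encode(['processed' => $processed, 'total' => $total])
);

// In a controller, a tiny endpoint the page can poll to update the progress bar.
public function adkProgress()
{
    $file = sys_get_temp_dir() . '/adk_progress.json';
    $data = is_file($file)
        ? json_decode(file_get_contents($file), true)
        : ['processed' => 0, 'total' => 0];

    return new \Symfony\Component\HttpFoundation\JsonResponse($data);
}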

Updating PHP script data one time per day

I am making a Covid-19 statistics website - https://e-server24.eu/ . Every time somebody enters the website, the PHP script decodes JSON from 3 URLs and stores the data in some variables.
I want to make my website more optimized, so my question is: is there any script that can update the variable data once per day, instead of every time someone accesses the website?
Thanks,
I suggest looking into memory object caching.
Many high-performance PHP web apps use caching extensions (e.g. Memcached, APCu, WinCache), accelerators (e.g. APC, Varnish) and caching DBs like Redis. The setup can be a bit involved, but you can get started with a simple roll-your-own solution (inspired by this):
<?php
function cache_set($key, $val) {
    $val = var_export($val, true);
    // HHVM fails at __set_state, so just use object cast for now
    $val = str_replace('stdClass::__set_state', '(object)', $val);
    // Write to temp file first to ensure atomicity
    $tmp = sys_get_temp_dir() . "/$key." . uniqid('', true) . '.tmp';
    file_put_contents($tmp, '<?php $val = ' . $val . ';', LOCK_EX);
    rename($tmp, sys_get_temp_dir() . "/$key");
}

function cache_get($key) {
    // The cache file sets $val when it exists; suppress the warning when it does not
    @include sys_get_temp_dir() . "/$key";
    return isset($val) ? $val : false;
}
$ttl_hours = 24;
$now = new DateTime();

// Get results from cache if possible; otherwise, retrieve them.
$data = cache_get('my_key');
$last_change = cache_get('my_key_last_mod');

// Age of the cached data in whole hours (days are folded in so a 24-hour TTL actually expires)
$age_hours = ($last_change === false)
    ? PHP_INT_MAX
    : $now->diff($last_change)->days * 24 + $now->diff($last_change)->h;

if ($data === false || $age_hours >= $ttl_hours) {
    // Expensive call to get the actual data; we simply create an object to demonstrate the concept
    $myObj = new stdClass();
    $myObj->name = "John";
    $myObj->age = 30;
    $myObj->city = "New York";
    $data = json_encode($myObj);

    // Add to user cache
    cache_set('my_key', $data);
    $last_change = new DateTime(); // now
    // Add timestamp to user cache
    cache_set('my_key_last_mod', $last_change);
}

echo $data;
Voila.
Furthermore, you could look into client-side caching and many other things. But this should give you an idea.
PS: Most memory cache systems allow you to define a time-to-live (TTL), which makes this more concise. But I wanted to keep this example dependency-free. Cache cleaning was omitted here; simply delete the temp file.
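For instance, with the APCu extension available (an assumption, it is not used in the snippet above), the same daily refresh shrinks to a few lines because the TTL is handled for you:
<?php
// Fetch from APCu; on a miss, rebuild and store with a 24-hour TTL.
$data = apcu_fetch('my_key', $hit);
if (!$hit) {
    $myObj = new stdClass();
    $myObj->name = "John";
    $myObj->age = 30;
    $myObj->city = "New York";
    $data = json_encode($myObj);
    apcu_store('my_key', $data, 24 * 3600); // TTL in seconds
}
echo $data;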
A simple way to do that:
Create a script which will fetch and decode the JSON data and store it in your database.
Then set up a cron job with a time lapse of 24 hours.
And when a user visits your site, fetch the data from your database instead of from your API provider.
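As a rough illustration (the schedule, URL, table and credentials are placeholders, not part of the question), the cron entry plus the refresh script could look like this:
<?php
// update_covid_stats.php - run by cron once a day, e.g.:
//   0 3 * * * /usr/bin/php /var/www/update_covid_stats.php
// Fetch and decode the JSON feed, then store it in the database.
$stats = json_decode(file_get_contents('https://example.com/covid-stats.json'), true);

$pdo = new PDO('mysql:host=localhost;dbname=covid', 'user', 'password');
$stmt = $pdo->prepare(
    'REPLACE INTO daily_stats (stat_date, payload) VALUES (CURDATE(), :payload)'
);
$stmt->execute([':payload' => json_encode($stats)]);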

What is the proper way to monitor PHP execution in the frontend?

I will use an example to demonstrate this.
Assuming I have a MySQL DB where I place paths to files to be uploaded to S3, and a status column where each file is assigned either a pending or uploaded status.
I have a PHP script, upload.php, which I can run with php upload.php and see the output logged to my terminal as the script progresses. I would like to set up a cron job that runs the script at certain intervals, say every 30 minutes, where each time the DB is queried and the files which hold a pending status are processed for upload.
Now, I want to be able to track the progress of the script in the frontend, regardless of its current status (even if there are currently no pending items in the DB).
While I would appreciate any specific suggestion on how to do this, my question is also about best practice - meaning, what is the proper way to do this?
Here's an example of such a script (it's using the Joshcam MysqliDb):
// Get items with a pending status
function get_items_queue() {
    global $db;
    $cols = array("id", "filename");
    $db->where('status', 'pending');
    return $db->get('files', null, $cols);
}

// Upload items to S3
function UploadToS3($filename) {
    if (empty($filename)) {
        return false;
    }
    include_once('/s3/aws-autoloader.php');
    $s3 = new S3Client($somearray); // Some S3 credentials here
    // Print status
    echo $filename . ' is uploading';
    $uploaded = $s3->putObject($somearray); // Uploading to S3
    if ($s3->doesObjectExist($s3_bucket, $filename)) {
        // Print status
        echo $filename . ' was uploaded';
        return true;
    }
    // Print status
    echo 'There has been an issue while uploading ' . $filename;
    return false;
}

// Run the script
$queue_items = get_items_queue();
foreach ($queue_items as $key => $item) {
    $upload = UploadToS3($item['filename']);
    // Some function here that changes the status column for the uploaded item to 'uploaded'
    if ($upload) {
        set_item_queue_status($item['id']);
    }
}
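For completeness, the set_item_queue_status helper referenced above does nothing more than flip the status column. A sketch in the same MysqliDb style (not the exact code):
// Mark an item as uploaded once UploadToS3() reports success
function set_item_queue_status($id) {
    global $db;
    $db->where('id', $id);
    return $db->update('files', array('status' => 'uploaded'));
}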
I ended up setting up an installation of Cronicle from jhuckaby.
It is essentially a cron manager, but what's most important for my case is the live log viewer. This lets me run the script with a cron job at the intervals I defined and watch it execute via the log viewer, while being able to leave and come back at any point to view the currently running task (or any of the previous tasks that ran while I was away).

PHP 5.3 - how to append contents to a large file without loading it into memory

I'm trying to efficiently write a large amount of data to a file in a legacy system without killing the memory (I'm relatively new to PHP). It is only writing 50 customers at a time, but after a while it slows down considerably, so I assume it is keeping the whole file in memory. Any ideas how I can just append to a file and cope with the file getting very large? Code snippet below. Note: I am stuck with PHP 5.3.
do {
    // Tell the collection which page to load.
    $collection->setCurPage($currentPage);
    $collection->load();

    $fp = fopen(Mage::getBaseDir('export') . '/customers.json', 'a');
    foreach ($collection as $customer) {
        // Write each customer as JSON
        fwrite($fp, "," . json_encode($customer->getData()));
        $customerCount++;
    }
    fclose($fp);

    $currentPage++;
    // Make the collection unload the data in memory so it will pick up the next page when load() is called.
    $collection->clear();
    echo memory_get_usage() . "\n";
    echo "Finished page $currentPage of $pages \n";
} while ($currentPage <= $pages);

Queuing and image processing

Hi all, I just have a quick question regarding best practices, and perhaps some help, on queuing and image manipulation.
I'm currently working on a website that allows the user to upload in excess of 10 files at one time. In my experience I've only really handled single uploads or 2-3 max; this site allows the user to upload as many as they like, and then performs image manipulation to create 3 versions of each image at different sizes.
My thought process, and how I've implemented this, goes as follows.
The user goes to the upload form and selects multiple files; these are all uploaded inline and when they have finished the form auto-submits. The uploaded files go directly to a temporary folder in S3. This was done because there are multiple servers in the live environment behind a load balancer, so I was worried that if I uploaded them all to one server and then fired a queue job, it might go to the wrong server and not find the files. It would be great if there were a nicer way of doing this.
When the form is submitted it fires a notification to the queue on iron.io with the data from the form submit, which basically calls the server and starts the processing of images. The code for this is below:
public function fire($job, $data)
{
    set_time_limit(0);
    try {
        if (is_array($data)) {
            foreach ($data['file'] as $x => $file) { // loop through each file uploaded and now save them
                if ($this->images->doesMediaExistInTemporaryFolder($file)) {
                    if ($new_file = $this->images->getMediaFromTemporaryS3Folder($file)) {
                        file_put_contents(app_path() . '/storage/bulk-upload/' . $file, (string) $new_file['Body']);
                        $record_date = false;
                        if ($data['date'][$x] != 'no-date') {
                            if ($new_file['ContentType'] == 'image/jpeg') {
                                $exif_data = @exif_read_data(app_path() . '/storage/bulk-upload/' . $file, 'FILE');
                            }
                            if (!empty($exif_data) && @array_key_exists('DateTime', $exif_data)) {
                                $record_date = $exif_data['DateTime'];
                            } else {
                                $record_date = $data['date'][$x];
                            }
                        }
                        $created_file = new \Symfony\Component\HttpFoundation\File\UploadedFile(app_path() . '/storage/bulk-upload/' . $file, $file, $new_file['ContentType']);
                        $input = array('vehicle_objectId' => $data['vehicle_objectId'], 'type' => $data['type'], 'privacy' => $data['privacy'], 'date' => $record_date);
                        if (file_exists(app_path() . '/storage/bulk-upload/' . $file)) {
                            if ($record = $this->record_repository->save($input, $created_file)) {
                                unlink(app_path() . '/storage/bulk-upload/' . $file);
                                $this->images->deleteMediaFromTemporaryS3(array(array('Key' => $file)));
                            } else {
                                $data['filename'] = $file;
                                \Mail::send('emails.bulk-upload', $data, function($message) {
                                    $message->to('email', 'Daniel Newns')->subject('Bulk upload save issue');
                                });
                            }
                        }
                    }
                }
            }
            $parse = new \ParseRestClient();
            $user = $parse->retrieveCurrentUser($data['pid']);
            if (isset($user->email)) {
                $vehicle_url = \URL::route('vehicles.show', $data['vehicle_objectId']);
                $body = "<p>Hi " . $user->username . "</p><p>Your records have all been created. View them all as part of your vehicle record <a href='" . $vehicle_url . "'>here</a></p>";
                $message = array(
                    'to' => array(array('email' => $user->email)),
                    'from_email' => 'xxxxx',
                    'from_name' => 'xxxxx'
                );
                $template_content = array(array("name" => "share", "content" => $body));
                $response = \Email::messages()->sendTemplate('Bulk_Upload', $template_content, $message);
            }
        }
    } catch (\Exception $e) {
        $message = array(
            'to' => array(array('email' => 'email')),
            'from_email' => 'email',
            'from_name' => 'xxxxxx'
        );
        $content = '<p>' . $e->getMessage() . '</p>';
        $content .= '<p>' . $e->getTraceAsString() . '</p>';
        $template_content = array(array("name" => "share", "content" => $content));
        $response = \Email::messages()->sendTemplate('Content_Share', $template_content, $message);
    }
}
As you can see, it loops through the data returned from the queue and loops through the files. From here it pulls each image from S3 and stores it locally, then checks if there is a date set and works out the created date from either that or the EXIF data. Then it creates the file and saves the record; the save function performs all the resizing required.
My question is really: does anyone have suggestions on how I can improve this? I'm occasionally getting emails from the exception handler saying that it can't find a certain image, as if it hasn't been created locally. Is my method of creating the image locally using file_put_contents the one I should be using, or is there a better way for me to pull the data from S3 and work with it? I've put a number of if statements in to stop things falling through the gaps, etc.
It would be great to hear other people's thoughts on where I have gone wrong here and what I could do to improve this. Perhaps I could store an array of files that don't exist on the first loop and then try again afterwards, as I was thinking it might be a case of the code executing before the image exists. Would that be the case?
Any help would be much appreciated.
Thanks
I'm curious how you are implementing the actual queue for processing the images?
When I have needed a process queue in the past, I created a server daemon using PHP that would check a DB for new images. Each time an image was uploaded I copied the original to a temp location and stored the name and status of the image in the DB. Status was new, processing, or complete. As soon as a server grabbed a file to process from the DB, I updated the status to processing.
I also mounted an S3 bucket to each of my machines and then symlinked it to a local folder, so all files were accessible without having to download them first. The code behaves as if the file is local, even though in the background the image is being downloaded.
However, another solution that lives in the AWS service is their SQS (Simple Queue Service). Use the S3 API together with the SQS API inside your application and you can accomplish what you're trying to do without having to build a server daemon.
I would check out this link here: http://aws.amazon.com/articles/1602?_encoding=UTF8&jiveRedirect=1
They have a pretty good guide on how to do exactly what you're wanting to do using the services above. They recommend using DynamoDB, but you can probably swap it out for any DB that you are already using.
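If you go the SQS route, the polling side is only a few calls with the AWS SDK for PHP. A sketch; the queue URL, region and message layout are placeholders:
<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

// Sketch only: poll the queue, hand each message to your existing processing code, then delete it.
$sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/image-jobs';

$result = $sqs->receiveMessage([
    'QueueUrl'        => $queueUrl,
    'WaitTimeSeconds' => 20, // long polling
]);

foreach ((array) $result->get('Messages') as $message) {
    $job = json_decode($message['Body'], true);
    // ... process the image referenced in $job ...
    $sqs->deleteMessage([
        'QueueUrl'      => $queueUrl,
        'ReceiptHandle' => $message['ReceiptHandle'],
    ]);
}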
Either route you go, you need a DB to track the files and their process status, and to keep track of your files in general. If you are worried you're sometimes running into errors because the file isn't downloaded yet, I would check that the file exists first; if it does, check its file size against the DB, and then determine whether the file is ready to be processed, as in the sketch below. You could run a script in Laravel by hitting a specific URL with a cron job as well.
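In code, that pre-check could look roughly like this (the DB column holding the expected size is an assumption, not something from your schema):
// Sketch: only hand the file to the processing step once it is fully downloaded.
$localPath = app_path() . '/storage/bulk-upload/' . $file;
$expectedSize = $fileRow['size_bytes']; // size recorded in the DB when the upload was queued (assumed column)

clearstatcache(true, $localPath);
if (file_exists($localPath) && filesize($localPath) >= $expectedSize) {
    // safe to process / resize
} else {
    // not ready yet: requeue or retry this file later
}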
Hope this helps!

PHP - Call external programs and show progress

I have a PHP script that takes the fields of a form and makes 3 consecutive program calls, where the output of the first one is the input of the second one, and so on.
The problem is that I need to display the progress of each program call. Nothing too complicated, I just want to show 3 different messages:
System call N waiting.
System call N in execution.
System call N finished.
I'm trying to do that with different PHP functions like exec(), popen() or proc_open(), but with these the browser waits until each call finishes.
The whole set of system calls doesn't take more than 5 minutes, maybe 3 or 4, so it would also be good to place a timer on each call, maybe 1.5 minutes, and if a call takes longer than that, kill the current system call, skip the following calls and show an error message.
Do you have any idea? Maybe a combination of AJAX and JavaScript could be a solution. Thanks in advance.
<?php
/*
System Calls
This file is required in another main script
$projectPath and $projectName defined in the main script
*/
//$mainHome = getcwd();
$home = $projectPath . $projectName;
$temp = $home . "/temp/";
$calls = $temp . "CALLS";
$threads = array();

if (is_dir($temp)) {
    //chdir($temp);
    $FILE = fopen($calls, "r");
    while (($call = fgetcsv($FILE)) !== FALSE) {
        //print_r($call);
        $threads[] = implode(" ", $call);
    }
}
//print_r($threads);

$descriptorspec = array(0 => array("pipe", "r"),
                        1 => array("pipe", "w"),
                        2 => array("file", "./phpError.log", "a")
);

for ($a = 0; $a < count($threads); $a++) {
    print $threads[$a] . "<br/><br/>";
    exec($threads[$a]);
    //$res = proc_open($threads[$a], $descriptorspec, $pipes, $temp);
}
//chdir($mainHome);
?>
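For reference, this is the direction I was considering for the per-call timer and the status messages before settling on something else: run each call with proc_open(), poll it with a time limit, and write the current state to a small file that AJAX could read. The 90-second limit, file names and JSON layout are only placeholders:
<?php
// Sketch only: per-call timeout plus a status file an AJAX poll can read.
// $threads and $temp come from the script above; everything else is a placeholder.
$statusFile = $temp . "status.json";
$timeout = 90; // seconds allowed per call
$spec = array(0 => array("pipe", "r"),
              1 => array("file", $temp . "call_output.log", "a"), // stdout to a log instead of a pipe
              2 => array("file", "./phpError.log", "a"));

foreach ($threads as $n => $cmd) {
    file_put_contents($statusFile, json_encode(array('call' => $n, 'state' => 'in execution')));
    $proc = proc_open($cmd, $spec, $pipes, $temp);
    $start = time();
    while (($status = proc_get_status($proc)) && $status['running']) {
        if (time() - $start > $timeout) {
            proc_terminate($proc); // kill the current system call
            file_put_contents($statusFile, json_encode(array('call' => $n, 'state' => 'error')));
            break 2; // skip the following calls
        }
        usleep(200000); // check 5 times per second
    }
    file_put_contents($statusFile, json_encode(array('call' => $n, 'state' => 'finished')));
}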
Thank you very much! In the end I developed a Python CGI based solution, a bit more static than I had planned, but it meets the expectations of the team.
Greetings
