PDF export freezes when I try to export it - PHP

I'm trying to implement a PDF export. I installed dompdf with Composer, added 'PDF' => Barryvdh\DomPDF\Facade::class to the aliases array and Barryvdh\DomPDF\ServiceProvider::class to the providers array in config/app.php. On my route I have this code:
\Illuminate\Support\Facades\Route::get('/users/{userId}/details/{forReport}/exportPDF', function ($userId) {
    $user = UserHelper::getUser($userId);
    $internal_devices = UserHelper::getUsersAssignedInternalDevices($userId);
    $external_devices = UserHelper::getUsersAssignedExternalDevices($userId);
    $licenses = UserHelper::getUsersAssignedLicenses($userId);
    $delegation_devices = UserHelper::getUsersDevicesOnDelegation($userId);

    $pdf = PDF::loadView(URLHelper::userReportView(), [
        'user' => $user,
        'internal_devices' => $internal_devices,
        'external_devices' => $external_devices,
        'licenses' => $licenses,
        'delegation_devices' => $delegation_devices,
    ]);

    return $pdf->download('My.pdf');
})->name('admin.exportPDF');
When I call this route, I get an execution timeout even though the limit is set to 300 seconds, so the PDF is never generated. Is something wrong with my code?
PS: I did a var_dump of the data and everything looks alright.
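One thing I have not tried yet is returning the plain Blade view instead of the PDF, to check whether the freeze is in the data/template or in dompdf itself; something like this (just a debugging sketch using the same data):

// Temporarily return the rendered Blade view instead of the PDF.
// If this responds quickly, the queries and the template are fine and the
// time is being spent inside dompdf (remote images or CSS fetched over HTTP
// in the template are a common cause, since dompdf loads them while rendering).
return view(URLHelper::userReportView(), [
    'user' => $user,
    'internal_devices' => $internal_devices,
    'external_devices' => $external_devices,
    'licenses' => $licenses,
    'delegation_devices' => $delegation_devices,
]);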


Testing Guzzle request and PDF converter

I'm pretty new to testing, and I was wondering how I could test code like this, or even whether it's worth testing at all.
This is what my code looks like:
public function convert(string $urlToConvert, string $path, array $options = []): void
{
    try {
        $result = $this->getClient()->post(
            sprintf('%s/%s', $this->config->getEndpoint(), self::GOTENBERG_CHROMIUM_ENDPOINT),
            [
                'multipart' => [
                    ['name' => 'url', 'contents' => $urlToConvert],
                    ['name' => 'emulatedMediaType', 'contents' => 'screen'],
                    ...$options
                ]
            ]
        );

        $directory = $this->filesystem->getDirectoryWrite(DirectoryList::MEDIA);
        $directory->writeFile($path, $result->getBody());
    } catch (Exception | GuzzleException $e) {
        $this->logger->error($e, ['context' => 'm2-gotenberg']);
        throw new GotenbergConvertException(__('Failed converting PDF'), $e);
    }
}
getClient() returns a GuzzleHttp client instance.
The process is the following:
Do a request on an endpoint with the URL
Get the response body which is the converted PDF
Create a file with the content given by the response
I don't feel like I can test anything here, or only a very small part of the code.
As for the lines creating the file, that part is done by the framework I'm using.
The only thing I can see is an integration test that actually hits the endpoint to create a real PDF and then deletes it.
Can someone clarify this for me or give me some advice? When using a framework I have trouble deciding what to test without just testing the implementation of my code.
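The best I have come up with so far is mocking the HTTP client with Guzzle's MockHandler and mocking the filesystem, then asserting that the response body is what ends up in writeFile(). This is only a rough sketch; the converter class name and constructor wiring are placeholders and depend on how getClient() is actually injected:

use GuzzleHttp\Client;
use GuzzleHttp\Handler\MockHandler;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Psr7\Response;
use Magento\Framework\Filesystem;
use Magento\Framework\Filesystem\Directory\WriteInterface;
use PHPUnit\Framework\TestCase;

class ConverterTest extends TestCase
{
    public function testConvertWritesResponseBodyToFile(): void
    {
        // Queue a fake Gotenberg response instead of calling the real service.
        $mock = new MockHandler([
            new Response(200, [], '%PDF-1.4 fake pdf content'),
        ]);
        $client = new Client(['handler' => HandlerStack::create($mock)]);

        // Mock the directory write so no real file is created; only assert
        // that the HTTP response body is written to the given path.
        $directory = $this->createMock(WriteInterface::class);
        $directory->expects($this->once())
            ->method('writeFile')
            ->with(
                'some/path.pdf',
                $this->callback(fn ($body) => str_contains((string) $body, '%PDF'))
            );

        $filesystem = $this->createMock(Filesystem::class);
        $filesystem->method('getDirectoryWrite')->willReturn($directory);

        // Placeholder construction: pass the mocked collaborators however
        // your real constructor / getClient() expects them.
        $converter = new GotenbergConverter($client, $filesystem /* , $config, $logger */);

        $converter->convert('https://example.com/page', 'some/path.pdf');
    }
}

The rejected path could be covered the same way by queueing a GuzzleHttp\Exception\RequestException in the MockHandler and asserting that GotenbergConvertException is thrown. Is that a sensible split, with one slow integration test against a real Gotenberg instance kept in a separate suite?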

When using the AWS PHP SDK 3.x is there a way to batch-upload multipart files to S3 in parallel using a getCommand array?

I am working on a process to upload a large number of files to S3, and for my smaller files I am building a list of commands using getCommand to upload them concurrently, like this:
$commands = array();
$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key' => 'filename.ext',
    'Body' => fopen('filepath', 'r'),
));
$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'mybucket',
    'Key' => 'filename_2.ext',
    'Body' => fopen('filepath_2', 'r'),
));
// etc.

try {
    $pool = new CommandPool($s3Client, $commands, [
        'concurrency' => 5,
        'before' => function (CommandInterface $cmd, $iterKey) {
            // Do stuff before the file starts to upload
        },
        'fulfilled' => function (ResultInterface $result, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff after the file is finished uploading
        },
        'rejected' => function (AwsException $reason, $iterKey, PromiseInterface $aggregatePromise) {
            // Do stuff if the file fails to upload
        },
    ]);

    // Initiate the pool transfers
    $promise = $pool->promise();

    // Force the pool to complete synchronously
    $promise->wait();
    $promise->then(function () { echo "All the files have finished uploading!"; });
} catch (Exception $e) {
    echo "Exception Thrown: Failed to upload: ".$e->getMessage()."<br>\n";
}
This works fine for the smaller files, but some of my files are large enough that I'd like them to be uploaded in multiple parts automatically. So, instead of using getCommand('PutObject'), which uploads an entire file in one request, I'd like to use something like getCommand('ObjectUploader') so that the larger files can be broken up as needed. However, when I try getCommand('ObjectUploader') it throws an error saying it doesn't know what to do with that command. I'm guessing the command has a different name, which is why it throws the error, but it's also possible that this simply can't be done through a command pool.
If you've worked on something like this in the past, how have you done it? Or even if you haven't worked on it, I'm open to any ideas you might have.
Thanks!
References:
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_commands.html#command-pool
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-multipart-upload.html#object-uploader
I decided to go in a different direction with this: instead of using an array of concurrent commands, I am now using a set of concurrent MultipartUploader promises, as shown in an example on this page: https://500.keboola.com/parallel-multipart-uploads-to-s3-in-php-61ff03ffc043
Here are the basics:
// Create an array of your file paths
$files = ['file1', 'file2', ...];

// Create an array to hold the promises
$promises = [];

// Create MultipartUploader objects, and add them to the promises array
foreach ($files as $filePath) {
    $uploader = new \Aws\S3\MultipartUploader($s3Client, $filePath, $uploaderOptions);
    $promises[$filePath] = $uploader->promise();
}

// Process the promises once they are all complete
$results = \GuzzleHttp\Promise\unwrap($promises);
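The $uploaderOptions array isn't shown above; the MultipartUploader needs at least a bucket and key, so in practice I build it per file inside the loop, roughly like this (the key scheme and part settings here are just examples):

foreach ($files as $filePath) {
    $uploaderOptions = [
        'bucket' => 'mybucket',
        'key' => basename($filePath),        // pick whatever key scheme you need
        'concurrency' => 5,                  // parts uploaded in parallel per file
        // 'part_size' => 10 * 1024 * 1024,  // optional; defaults to 5 MB
    ];
    $uploader = new \Aws\S3\MultipartUploader($s3Client, $filePath, $uploaderOptions);
    $promises[$filePath] = $uploader->promise();
}

Also note that \GuzzleHttp\Promise\unwrap() throws if any of the uploads fails, so it is worth wrapping that call in a try/catch for \Aws\Exception\MultipartUploadException.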

Exception "The content cannot be set on a BinaryFileResponse instance." thrown only on production server

I want my users to download a .csv file.
The code in my controller works flawlessly on my local machine and on a dev server. However, it doesn't work on the production server.
The framework is Laravel 5.5.44 and the PHP version is 7.2.
$videos = Video::whereNotNull('title')->get();

// First row for labels
$list = array([
    'id' => 'Video ID',
    'director' => 'Director',
    'title' => 'Title',
]);

foreach ($videos as $video) {
    $list[] = [
        'id' => $video->id,
        'director' => $video->director,
        'title' => $video->title,
    ];
}

$today = date('Ymd');
$filename = $today.'-list.csv';

$fp = fopen(storage_path($filename), 'w');
foreach ($list as $fields) {
    fputcsv($fp, $fields);
}
fclose($fp);

return response()->download(storage_path($filename));
When hitting the controller on the production server, instead of downloading the file I get:
prod.ERROR: The content cannot be set on a BinaryFileResponse instance.
I just noticed that the exception is thrown only when the app environment is set to production; I was able to reproduce the behaviour on my local machine by setting APP_ENV=prod in the .env file.
I'm puzzled. I wasn't able to find enough information in the official documentation and I still don't know how to solve this strange issue.
Your after middleware is trying to set content on the response, which you cannot do on a BinaryFileResponse object; any call to setContent() on it raises this error.
You will have to work out whether you actually need the vrkansagara/lara-out-press middleware you are using.
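If you keep an after middleware of your own (or fork that package), the usual guard is to skip file and streamed responses entirely, along these lines (just a sketch):

use Closure;
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\StreamedResponse;

public function handle($request, Closure $next)
{
    $response = $next($request);

    // setContent() is not supported on file/streamed responses,
    // so leave them untouched and only post-process regular responses.
    if ($response instanceof BinaryFileResponse || $response instanceof StreamedResponse) {
        return $response;
    }

    // ... existing post-processing that calls $response->setContent(...)

    return $response;
}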

Laravel CSV Import running out of memory (allowed memory exhausted)

I've got a CSV file of members that I receive once a month that contains ~6000 rows.
I'm (trying to) loop through the CSV file, check if the record already exists in the members table, and if so check whether it's the same data.
Then I insert it into the pending table (with an exists flag where appropriate).
I'm using Laravel and League\CSV to read in the file that is saved in my storage folder:
class ImportController extends Controller
{
    public function import(Request $request) {
        $readDirectory = 'storage/csv/';
        $filename = $request->name;

        $stream = fopen($readDirectory.$filename, 'r');
        $reader = Reader::createFromStream($stream)->setHeaderOffset(0);
        $records = (new Statement())->process($reader);

        // Truncate the imported table prior to import
        Imported::truncate();

        foreach ($records as $record) {
            $email = $record['email'];
            $recordExists = $this->recordExists($email);

            if ($recordExists) {
                // Compare the md5 of the recordArray and the memberArray and skip the record if it's the same.
                $memberArray = $this->getmemberArray($recordExists);
                $recordArray = $this->getRecordArray($record);

                if ($memberArray['hash'] === $recordArray['hash']) {
                    continue;
                }

                $record['exists'] = TRUE;
                $this->write($record);
                continue;
            } else {
                $record['exists'] = FALSE;
                $this->write($record);
                Log::debug("missing: ".$record['URN']);
                continue;
            }
        }
        // End foreach loop

        return redirect()->route('upload.show');
    }

    public function recordExists($email) {
        $member = Member::where('email', 'LIKE', $email)->first();
        if ($member == null) {
            return false;
        }
        return $member;
    }

    public function getmemberArray($member) {
        $memberArray = [
            'email' => $member->email,
            'first_name' => $member->first_name,
            'last_name' => $member->last_name,
            'age_years' => $member->age_years,
            'gender' => $member->gender,
            'address_1' => $member->address_1,
            'address_2' => $member->address_2,
            'address_3' => $member->address_3,
            'town' => $member->town,
            'county' => $member->county,
            'postcode' => $member->postcode,
            'sport_1' => $member->sport_1,
            'sport_2' => $member->sport_2,
        ];
        $memberArray['hash'] = md5(json_encode($memberArray));
        return $memberArray;
    }

    public function getRecordArray($record) {
        $recordArray = [
            'email' => $record['email'],
            'first_name' => $record['first_name'],
            'last_name' => $record['last_name'],
            'age_years' => $record['age_years'],
            'gender' => $record['gender'],
            'address_1' => $record['address_1'],
            'address_2' => $record['address_2'],
            'address_3' => $record['address_3'],
            'town' => $record['town'],
            'county' => $record['county'],
            'postcode' => $record['postcode'],
            'sport_1' => $record['sport_1'],
            'sport_2' => $record['sport_2'],
        ];
        $recordArray['hash'] = md5(json_encode($recordArray));
        return $recordArray;
    }

    public function write($record) {
        $import = [];
        $import['email'] = $record['email'];
        $import['first_name'] = $record['first_name'];
        $import['last_name'] = $record['last_name'];
        $import['age_years'] = $record['age_years'];
        $import['gender'] = $record['gender'];
        $import['address_1'] = $record['address_1'];
        $import['address_2'] = $record['address_2'];
        $import['address_3'] = $record['address_3'];
        $import['town'] = $record['town'];
        $import['county'] = $record['county'];
        $import['postcode'] = $record['postcode'];
        $import['sport_1'] = $record['sport_1'];
        $import['sport_2'] = $record['sport_2'];
        $import['exists'] = $record['exists'];

        DB::table('imported')->insert($import);
        Log::debug($record['email']);
        return TRUE;
    }
}
But I keep getting:
Symfony \ Component \ Debug \ Exception \ FatalErrorException (E_UNKNOWN)
Allowed memory size of 134217728 bytes exhausted (tried to allocate 181321056 bytes)
It works if I use a lot fewer rows in my CSV, but that's not an option.
I was previously writing to the DB using Eloquent's ->save(), but changed it to DB::table()->insert to improve performance.
I've already added the following for testing purposes, but it's still breaking:
set_time_limit(0);
ini_set('max_execution_time', 100000);
ini_set('memory_limit','512m');
Am I missing something? Some kind of memory leak somewhere?
I'm guessing it's keeping the record in memory each time, so is there any way to make it forget after each row?
ALSO:
Is there a way to clear this memory, so that I can edit the code and retry?
Even if I stop and re-run php artisan serve, I still get the same error message.
The problem here is that League\CSV is reading the whole CSV file into memory when you do:
$records = (new Statement())->process($reader);
You should use the chunk method of the Reader like this to only read a specific number of rows at once:
foreach ($reader->chunk(50) as $row) {
    // do whatever
}
The chunk method returns a Generator that you can iterate over. You can find this mentioned here in the documentation.
EDIT: I misread the documentation and recommended the wrong method.
You basically just have to iterate over the $reader itself:
foreach ($reader as $row) {
    print_r($row);
}
Also, if you are using a Mac, or if your CSV was created on one, you need to set the following to successfully read large CSV files:
if (!ini_get('auto_detect_line_endings')) {
    ini_set('auto_detect_line_endings', '1');
}
See this part of the documentation.
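Putting that together with the controller above, the import loop would look roughly like this (a sketch; getRecords() streams one record at a time instead of loading everything through Statement::process()):

$reader = Reader::createFromPath($readDirectory.$filename, 'r');
$reader->setHeaderOffset(0);

foreach ($reader->getRecords() as $record) {
    // $record is an associative array keyed by the header row,
    // processed and written one row at a time.
    $email = $record['email'];
    // ... existing exists/hash check and $this->write($record)
}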
I gather that you are using php artisan serve to run your server. You could try deploying an actual web server, since that is what you will be using in a production environment; Apache, for example, comes bundled with XAMPP for Windows and Linux.
You can look up how to install Apache HTTP Server or Nginx on your operating system. These have better control over memory usage than PHP's built-in server.

How to replace old Amazon S3 bucket details with new ones in an already configured application

I have an existing application that uses an S3 bucket. Now I have created my own new bucket and am trying to configure the application to use it, but I'm unable to identify the issue or figure out how to check where it is coming from.
With the old credentials, data uploads and downloads fine, but with the new ones it does not work.
These are the changes I made in the application:
awsConfig.php
return array(
    'includes' => array('_aws'),
    'services' => array(
        'default_settings' => array(
            'params' => array(
                'key' => 'AKIA***',           // replaced old with my new
                'secret' => '9VB4q0********', // replaced old with my new
            )
        )
    )
);
I can see that the application hits awsController.php first, but there is nothing in the index action:
public function indexAction()
{
    // Is there any way to check from here where the issue is coming from?
}
The same controller has many more functions (shown in an image), and the bucket name is set in the controller itself:
public function initialize()
{
    $this->view->disable();
    $this->s3 = Aws::factory(CommonLibrary.'awsConfig.php')->get('s3')->registerStreamWrapper();
    $this->aws = Aws::factory(CommonLibrary.'awsConfig.php');
    $this->prefix = 'folder/';    // I don't know what this prefix is for.
    $this->bucket = "bucketname"; // my bucket name

    // Get the client from the builder by namespace
    $this->client = $this->aws->get('S3');
My issue is that I can't identify the exact problem. How can I track down where it is going wrong?
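Is there a quick way to test the new credentials and bucket outside the application? I was thinking of something like this, using the same SDK v2 style as the code above (the bucket name and key here are just placeholders):

require 'vendor/autoload.php';

use Aws\Common\Aws;
use Aws\S3\Exception\S3Exception;

$aws = Aws::factory(CommonLibrary.'awsConfig.php'); // same config file as the application
$s3 = $aws->get('s3');

try {
    $result = $s3->putObject(array(
        'Bucket' => 'bucketname',          // the new bucket name
        'Key'    => 'connection-test.txt',
        'Body'   => 'hello',
    ));
    echo "Upload OK: ".$result['ObjectURL'];
} catch (S3Exception $e) {
    // The exception message usually says whether it is a credentials,
    // permissions (bucket policy / IAM), or bucket name / region problem.
    echo "Upload failed: ".$e->getMessage();
}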
