In my existing Laravel project, all tasks are executed sequentially. I have identified a part of the code that can be run in parallel using PHP threads, which would reduce the response time.
So I'm trying to use pthreads (https://github.com/krakjoe/pthreads/tree/PHP5). appserver.io already comes with pthreads, so I'm running the project on appserver.io rather than Apache.
I was able to use pthreads successfully: creating a new PDO connection inside each thread's run() method works, and database interaction via a native query like this is fine:
self::$connection = new PDO('mysql:host=127.0.0.1;dbname=mydb', 'myuser', 'mypass');
But I'm not able to use Laravel's \DB::connection the same way, and I need that working in order to run my code, which is written with ORM-based queries. How can I create a new connection each time inside the run() method?
Can you not dispatch jobs and just have those jobs run in the background (queue them)?
There are two things you need to consider:
1. You should load the vendor file in your threads.
2. Instead of the facade, use a singleton class in your threads.
Hope this will solve the issue.
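Putting both points together, a minimal sketch (assuming the pthreads PHP5 branch and Laravel's illuminate/database Capsule; the class name, table name and connection details below are placeholders, not your actual code):

<?php

use Illuminate\Database\Capsule\Manager as Capsule;

class OrmWorker extends Thread
{
    public function run()
    {
        // 1. load the vendor file inside the thread
        require __DIR__ . '/vendor/autoload.php';

        // 2. build this thread's own connection instead of going through the DB facade
        $capsule = new Capsule();
        $capsule->addConnection([
            'driver'   => 'mysql',
            'host'     => '127.0.0.1',
            'database' => 'mydb',
            'username' => 'myuser',
            'password' => 'mypass',
            'charset'  => 'utf8',
        ]);
        $capsule->bootEloquent();

        // query builder / ORM calls go through the per-thread connection
        $rows = $capsule->getConnection()->table('some_table')->get();
    }
}

$worker = new OrmWorker();
$worker->start();
$worker->join();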
I have a code base written in Laravel 9 that needs to execute multiple requests to one API with 1000 different API key + API secret pairs. Is there a way to actually do this simultaneously and not one after another?
I've come across this thread, Laravel Commands, Pthreads and Closure, but the answer there doesn't look like it is working; in my opinion it only appears to execute everything at the same time because there is a sleep(rand(1,3)) inside the OperatorThreaded class.
Any ideas?
A Laravelly way is to do it like so:
1. Create a RequestYourApi job (php artisan make:job RequestYourApi, or make the class yourself inside App\Jobs); a sketch of such a job follows after these steps.
2. Dispatch any number of jobs using RequestYourApi::dispatch(/* constructor args */) (the arguments are passed to the constructor of this class, and the job is put onto the queue; see https://laravel.com/docs/9.x/queues).
3. Run multiple workers using, for instance, Supervisor (or Laravel Horizon): php artisan queue:work (use --help to see additional options).
4. Scale your workers up or down depending on how fast you need to process things.
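As a rough sketch (not a drop-in implementation: the endpoint URL, header names and property names are placeholders for whatever your API expects), the job could look like this:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Http;

class RequestYourApi implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        public string $apiKey,
        public string $apiSecret
    ) {}

    public function handle(): void
    {
        // one API call per job; the queue workers pick these up in parallel
        Http::withHeaders([
            'X-Api-Key'    => $this->apiKey,      // placeholder header names
            'X-Api-Secret' => $this->apiSecret,
        ])->get('https://api.example.com/endpoint'); // placeholder URL
    }
}

Then dispatch one job per key/secret pair, for example:

foreach ($apiCredentials as [$key, $secret]) { // $apiCredentials: your 1000 key/secret pairs
    RequestYourApi::dispatch($key, $secret);
}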
I've created some migration and seed files using bake, but now the DBA guys say I should use an SQL script to create and populate the tables. Is there a fast way of converting the files, without having to write the script by hand?
Yes, of course. Connect to SQL Server with SQL Profiler and run your current migrations; the script the profiler captures is the converted one. You only need to set the appropriate filter properties in Profiler.
I'm a beginner in Symfony and I want to know if there is any way to control the execution of my functions (I mean stop or run them at any time I want).
I found this example using the Symfony Process library:
use Symfony\Component\Process\Process;
$process = new Process('#command'); // '#command' stands for the shell command to run
$process->start();                  // runs asynchronously
// ... do other things
$process->stop(3, SIGINT);          // ask it to stop; send SIGINT if still running after 3 seconds
But is it necessary to run the process as a command?
Is it similar to pcntl_fork?
If you use the Process component from Symfony, it will launch processes as commands; there are no other options right now. However, you can still use the pcntl PHP functions if you are sure the Process class is not an option for you.
Take a look at the Process class and you will see that internally it uses the same POSIX signal constants that pcntl does, and so on.
https://github.com/symfony/process/blob/master/Process.php
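If you decide to go the pcntl route instead, a bare-bones sketch (requires the pcntl and posix extensions; the child's work loop is just a placeholder) looks like this:

<?php

$pid = pcntl_fork();

if ($pid === 0) {
    // child process: the long-running work you want to be able to stop
    while (true) {
        // ... do the actual work ...
        sleep(1);
    }
}

// parent process
// ... do other things ...
sleep(3);
posix_kill($pid, SIGINT);     // same POSIX signal that Process::stop(3, SIGINT) would send
pcntl_waitpid($pid, $status); // reap the child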
I want to run some functions in parallel in the background.
Is there a nice way to do this in Symfony2, or even with plain PHP?
Or is there only the Symfony\Component\Process\Process?
Can I use this Process with a function? I need the current context (logged-in user and some session data), so it is not possible to move the function out into an external PHP file...
The Symfony2 Process component allows you to run a shell command or execute a PHP script in a different process.
To run an exact function in a thread, take a look at PHP's Thread class (from the pthreads extension).
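A minimal sketch of that Thread approach (pthreads extension required; the class name and the context passed in are only examples of how you could hand over the user and session data):

<?php

class BackgroundTask extends Thread
{
    private $userId;
    private $sessionData;

    public function __construct($userId, array $sessionData)
    {
        // pass the context in explicitly; the parent's session is not shared with the thread
        $this->userId      = $userId;
        $this->sessionData = $sessionData;
    }

    public function run()
    {
        // the function you want to run in parallel, using $this->userId / $this->sessionData
    }
}

$task = new BackgroundTask($user->getId(), $session->all()); // $user and $session come from your own context
$task->start();
// ... keep handling the request ...
$task->join();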
I am trying to implement a simple database using PHP & sqlite on my Linux/Apache server.
I can read from it quite readily, but I cannot perform any UPDATE, DELETE or INSERT actions. The fatal error I get is:
General error: 5 database is locked
As a simple example:
$pdo = new PDO('sqlite:test.sqlite');
$pdo->exec("INSERT INTO menus(id,name,description) VALUES(6,'test','this is a test')");
This waits for a long time (about a minute), and then reports the above error.
I have read a lot of suggestions, many of which suggest that the database or its containing folder should be writable. They are. (Or were. I made them world writable for testing, and restored more reasonable permissions when that failed.)
I have no trouble writing to the database using other techniques, such as the sqlite3 command on Linux and the SQLite Manager add-on in Firefox.
I would welcome any comments on how to make this work.
Please try giving the database file 777 permissions and try again. I suspect it has something to do with permissions, because you are able to modify the database using the sqlite3 program.
If that fails, then take a look at the answers to this question.
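One more thing worth checking along those lines: SQLite also needs write access to the directory containing the database, because it creates a journal file next to it during writes. A quick sanity check (the path is a placeholder):

<?php

$db = __DIR__ . '/test.sqlite';

var_dump(is_writable($db));          // the database file itself
var_dump(is_writable(dirname($db))); // the containing folder, needed for the journal file

// for testing only, as suggested above
chmod($db, 0777);
chmod(dirname($db), 0777);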