Multiple php requests simultaneously, second request doesn't start until first finishes - php

I've got a long running php 7.2 script that is producing a zip file. I want to use a looping ajax call to check on the progress of building the zip file. The second script appears to be locked and doesn't start processing until the first script is completely done.
In fact, the second script doesn't even print the error_log() on line 1 of my index.php routing script until after the first script is completely finished.
The top of my index.php router script:
<?php
error_log('top of index '.$_SERVER['REQUEST_URI']);
This is true even if I'm just requesting static image resources. The error_log() on line 1 doesn't even print until after the long-running script has completely finished.
At first, I believed I was running into Session Locking as described here, but the solution offered there (calling session_write_close()) doesn't seem to work, and I'm wondering if something else is going on, because the second script locks up before line 1 rather than when I try to start the session. The second script doesn't seem to be starting AT ALL. I thought maybe the server was automatically starting the session before line 1, but I checked, and my server has session.auto_start=0.
Is there some server configuration I need to set?
What am I missing? What am I doing wrong?
I'm running this on localhost (Mac) with the built-in PHP server.
php -c /usr/local/etc/php/7.2/php.ini -S 127.0.0.1:8080 index.php

This is because the built-in PHP server uses a single process by default, which prevents it from executing multiple requests simultaneously.
The built-in server supports multiple workers from PHP 7.4 onward (on platforms where fork() is available). See this commit: https://github.com/php/php-src/commit/82effb3fc7bcab0efcc343b3e03355f5f2f663c9
To use it, set the PHP_CLI_SERVER_WORKERS environment variable before starting the server. For example:
PHP_CLI_SERVER_WORKERS=10 php7.4 -S 127.0.0.1:7080 index.php
I recommend upgrading your PHP 7.2 installation if you want to utilize this feature.

The built-in web server is a single-threaded, single-process, synchronous server, meant only for testing purposes. It can only handle a single request at a time, so it cannot serve concurrent requests.

Related

ssh2 execution lifetime on the remote side

I am trying to run a bash script using php-ssh2. The script must run forever in the remote machine (the only way to stop it must be with pkill). The problem is that somehow the connection is closed and the bash script is killed.
nohup, disown and screen... I tried everything and nothing really changed; it simply doesn't keep the script alive.
What can I do?
(I know this is a HUGE security hole, but it's just experimental; the main idea is to use an HTML button to run a bash script on the server machine, using apache2.)
Create a frequent cron job (every minute?) that first checks some kind of flag (e.g. the existence of a certain file) before running the job.
In the PHP code, only raise the flag (create the file).
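A minimal sketch of that split (the flag path, file names, and crontab line are assumptions for illustration):
<?php
// queue-job.php: the web-facing side; it only raises the flag.
touch('/tmp/run-long-job.flag');
echo "Job queued\n";
And the cron side, run every minute:
<?php
// cron-runner.php, from crontab: * * * * * /usr/bin/php /path/to/cron-runner.php
$flag = '/tmp/run-long-job.flag';
if (file_exists($flag)) {
    unlink($flag); // consume the flag so the job starts only once
    // Detach the long-running script so cron isn't held up waiting for it.
    exec('nohup /path/to/forever.sh > /dev/null 2>&1 &');
}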
Does the script generate output? If not, check out the keep alive option of ssh.

What is the difference between a PHP console script and web script?

I have tried reading, and I understand the PHP console to be a command-line interface (CLI) like the one used by Composer. But I do not understand the difference between a web script and a console script, or the use of having the two.
I want to crawl data from a certain link. Should I use a console script or web script and why?
Please explain in the simplest manner possible.
There is no difference between the two. In most instances, the same PHP script will run whether you execute it from the command line or via the web.
There is, however, a difference in the environment the script will execute within. A CLI script is initiated from and executed within your shell on your computer; it is very self-contained. A web script, on the other hand, is (typically) initiated via an HTTP request from a browser, travels over the web to a web server, is executed on that remote server, and a result (typically a web page) is passed back to your browser. In the latter case, special environment variables related to the web request are made available to the script.
It's a bit hard to know which is the best case for your web crawler script without knowing more detail. But I'd say a command line script is what you're after.
One difference between a web page and a CLI instance is the way the script is executed: web pages are loaded via a web server, while CLI scripts are usually executed by the shell used to launch PHP. Because of this, a CLI script does not have access to all the $_SERVER variables a web page does, since there is no HTTP request involved.
CLI scripts are useful for doing background tasks that are not initiated by the web server, for example a cron job that periodically cleans your database, or one that executes queued jobs. Think of CLI scripts as shell scripts: you can write a PHP script instead of a bash one.
The PHP interpreter is the same in both cases, and it's up to you to decide which one suits your needs best: web pages are more common, but if you need your server to do some work without waiting for a web request, then you can go with CLI.
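As a concrete illustration of that environment difference, a script can check which SAPI it is running under; a minimal sketch using the built-in php_sapi_name():
<?php
// Detect whether we are running from the shell or behind a web server.
if (php_sapi_name() === 'cli') {
    echo "Running from the command line; no HTTP request, so no REQUEST_URI.\n";
} else {
    echo "Serving an HTTP request for {$_SERVER['REQUEST_URI']}\n";
}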
Well, basically a console script is the way to go for your task.
The difference is that a web script will block your browser, won't show your progress in real time, etc.
I was able to crawl and download about 6000 images from my beloved anime with a console script, showing the progress status, which is harder with a web script as the browser will buffer the output. You can also chain your scripts and do some cron magic (assuming you are on a *nix box).
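For the crawler itself, a minimal CLI sketch along those lines (the URL list and output names are placeholders):
<?php
// crawl.php: run as `php crawl.php`
$urls = ['https://example.com/a.jpg', 'https://example.com/b.jpg']; // placeholders
foreach ($urls as $i => $url) {
    $data = file_get_contents($url); // fetch the resource
    if ($data === false) {
        fwrite(STDERR, "failed: $url\n");
        continue;
    }
    file_put_contents("image$i.jpg", $data); // save it locally
    // Real-time progress on STDOUT, the part a browser would buffer.
    printf("[%d/%d] saved %s (%d bytes)\n", $i + 1, count($urls), $url, strlen($data));
}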

Persistent PHP socket server

I'm planning the development of a server written in PHP that can service socket requests. I use a free host (Heliohost) for testing, and it has cPanel. So far the only thing I've been able to think of to keep a PHP script always running is a cron job that runs a bash script, which checks ps to see if the PHP script is already running and starts it if it isn't.
Is there a better way? Perhaps a way for a PHP thread to be started on an HTTP request and continue to run in Apache after the request has been serviced?
You will almost certainly not have success running persistent processes from Apache. It is designed to prevent that scenario (though if you can get to the fork(2) system call, it is probably doable). I wouldn't recommend trying it, though.
What would make more sense is if you use a hosting provider that gives you the ability to write your own crontab(5) specifications and run the PHP interpreter directly. Then you could just add a line to your crontab(5) like:
@reboot /path/to/php /path/to/script.php
Your script should probably perform the usual daemonization tasks so that cron(8) isn't stuck waiting for your process to exit.
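A minimal sketch of those daemonization steps (assuming the pcntl and posix extensions, which are commonly enabled in CLI builds):
<?php
// daemon.php: started from cron at boot; detaches itself so cron returns.
$pid = pcntl_fork();
if ($pid < 0) {
    exit(1); // fork failed
} elseif ($pid > 0) {
    exit(0); // parent exits immediately, so cron(8) isn't left waiting
}
posix_setsid(); // child becomes session leader, detached from any terminal

while (true) {
    // ... accept and service socket requests here ...
    sleep(1);
}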

Does PHP run in background when browser is closed?

If I start my browser and run a PHP program (on another server) and then close the browser, the program will still keep running on the server, right?
What if you run the program and then remove the folder on the server (while the program is running)? Assuming it's a single PHP file, will it crash? Is the whole PHP file read into memory before running, or does the system access the file periodically?
First off, when the server receives a request, it will continue to process that request until it finishes its response, even if the browser that made the request closes.
The PHP file is loaded into memory and processed, so deleting the file in the middle of processing will not cause anything to crash.
If, however, halfway through your PHP it references another file that was deleted BEFORE that code is reached, then it may crash (depending on your error handling).
Note, however, that causing PHP to crash will not crash the whole web server.
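A quick way to see the file-deletion behaviour in action (a sketch that deletes its own source file mid-run, assuming permissions allow it):
<?php
// self-delete.php: PHP keeps running after its source file is removed,
// because the script was already read and compiled before execution began.
unlink(__FILE__); // delete this very file
sleep(2);         // keep doing work after the deletion
echo "Still running after my source file was deleted.\n";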
According to the PHP Connection Handling Page:
http://php.net/manual/en/features.connection-handling.php
You can decide whether or not you want a client disconnect to cause your script to be aborted. Sometimes it is handy to always have your scripts run to completion even if there is no remote browser receiving the output.
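The behaviour the manual describes is controlled with ignore_user_abort(); a minimal sketch:
<?php
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // and don't let max_execution_time cut the job short

// ... long-running work continues here regardless of the browser ...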
Of course you can delete the file, or the folder containing the PHP file, as long as it is not directly in use/open on the server.
Otherwise you could never delete files on a web server, as they might always be in use :-)

How many PHP scripts can run concurrently via http request?

I know PHP scripts can run concurrently, but I notice that when I run a script and then run another script after it, the second waits for the first to finish before it does what it has to do.
Is there an Apache config I have to change or do I use a different browser or something?!!
Thanks all
EDIT
If I go to a different browser, I can access the page/PHP script I need! Is something limiting the number of requests per browser? Apache?
Two things come to mind:
HTTP 1.1 recommends a maximum of 2 simultaneous connections to any given domain. This would exhibit the problem where you can only open one page on a site at a time unless you switch browsers.
Edit: This is usually enforced by the client, not the server
If your site uses sessions, you'll find out that only one page will load at a time...
...unless you call session_write_close(), at which point the second page can now open the now-unlocked session file.
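A minimal sketch of that pattern: read what you need from the session, then release the lock before the slow work (the user_id key is just an example):
<?php
session_start(); // acquires the lock on the session file
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
session_write_close(); // release the lock before any slow work

// Long-running work here; other requests from the same browser can now
// open the session and run in parallel.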
If you run PHP under FastCGI, you can probably avoid this restriction.
I haven't had this problem with Apache/PHP. The project I work on runs 3 iframes, each of which is a PHP script, plus the main page; Apache handles all 4 at the same time.
You may want to check your Apache configuration and see how the threading is set up.
This article may help:
http://www.devside.net/articles/apache-performance-tuning
