Client-Server PHP communication - php

How do PHP server pages handle multiple requests from different users?
I am used to C-like languages and they use multithreading. What does PHP use in this case?

The PHP interpreter is generally invoked by a web server (like Apache, lighttpd, ...). The web server then handles the requests (by threading, forking or whatever). Every instance of PHP runs sequentially, so there is no built-in multithreading.
There is a PECL extension, https://github.com/krakjoe/pthreads, which adds multithreading to PHP.
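For illustration, here is a minimal sketch of what that looks like, assuming the pthreads extension is installed on a thread-safe (ZTS) CLI build of PHP:

<?php
// Minimal pthreads sketch; requires the pthreads extension on a
// thread-safe (ZTS) build of PHP and only works from the CLI.
class MyTask extends Thread
{
    public function run()
    {
        // This code executes in a separate thread.
        echo "Hello from a worker thread\n";
    }
}

$task = new MyTask();
$task->start(); // spawn the thread
$task->join();  // wait for it to finish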

That really depends on the server you're using to handle these requests. PHP itself does not handle requests; instead it is used to parse the script and return the result. It is possible to write a web server in PHP, but then again, this all depends on the server you are using.

It's actually a good question. When several users send requests to the same server, no conflict happens because the web server creates an independent process, which you could call a session (not a PHP session) of its own, for each request.
PHP has nothing to do with this because it's only a script which is ultimately executed by the machine.

Related

Node.js run as a program vs PHP

I have been looking at Node.js applications vs PHP, and I found many comparisons of these two server technologies. Most people suggest that the JavaScript V8 engine is much faster than PHP in terms of running speed for a single-file calculation.
I have worked on some JavaScript code for Node.js, and now I have an idea that I'm not sure is correct.
In my opinion, Node.js runs a JavaScript application and listens on a port, so this JavaScript application is a program running on the server computer. Therefore the application code is all kept in the computer's memory: things like global variables are declared and stored once, when Node.js starts the program. So when any new request comes in, the server can use these variables very efficiently.
In PHP, however, the interpreter executes a *.php file per request. So if a request is for www.xx.com/index.php, the program will execute index.php, in which there may be something like
require("globalVariables.php");
and PHP would then go there and declare these variables again. The same idea applies to functions and other objects...
So am I correct in thinking that PHP may not be a good idea when there are many libraries that need to be included?
I have searched for comparisons, but nobody has talked about this.
Thanks
You are comparing different things. PHP depends on Apache or nginx (for example) to serve scripts, while Node.js is a complete server itself.
That's a big difference, because when you load a PHP page, Apache will spawn a thread (or process) and run the script there. In Node, all requests are served by Node.js's single thread.
So PHP and Node.js are different things, but regarding your concern: yes, you can maintain a global context in Node that stays loaded in memory all the time. PHP, on the other hand, loads, runs and exits on every request. But that's not the typical bottleneck: Node.js web applications also have templates that have to be loaded and parsed, database calls, files... The real difference is the way Node.js handles heavy tasks: a single thread for JavaScript, an event queue, and external threads for the filesystem, network and all that slow stuff. Traditional servers spawn a thread or process for each connection, which is a very different approach.
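A rough sketch of that difference from the PHP side (the APCu calls are just one possible way to persist state and are assumed to be available; they are not part of core PHP):

<?php
// Each HTTP request runs this script in a fresh interpreter state,
// so plain variables never survive between requests.
$inMemory = ($inMemory ?? 0) + 1;
echo "In-memory counter: $inMemory\n"; // always prints 1

// Persisting state between requests needs something external to the
// request lifecycle, e.g. the APCu extension (assumed installed here),
// a database, or a cache server. Not atomic, but enough to show the idea.
if (function_exists('apcu_fetch')) {
    $hits = (int) apcu_fetch('hits') + 1;
    apcu_store('hits', $hits);
    echo "Shared counter: $hits\n"; // grows across requests
}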

Does PHP control multitasking?

I'm new to PHP and would like to develop a mobile app that interacts with the server (by pushing and pulling data from the server). Initially I was using Java, but due to financial issues I decided to use PHP, because hosting that supports Java is expensive.
My question is: does PHP handle multitasking? The reason being that I will have thousands of users connected to my server, probably at the same time. I look forward to your answers. Thanks
How should PHP have control over multitasking?
PHP interprets a script at the single point in time when an HTTP request arrives for that script.
PHP does not do multi-threading. It's a single-process-execution kind of scripting language.
However, when set up as a server-side language, it's usually paired with an HTTP server like Apache, IIS or Nginx, which manages several child processes to handle multiple requests. If you set it up like a normal server-side language, on top of one of those HTTP servers, you will have no problem handling a lot of parallel traffic.
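A quick way to see that in action is a tiny script like the one below (the file name pid.php is just a placeholder): under Apache prefork or PHP-FPM, concurrent requests are served by different worker processes, each running its own copy of the interpreter.

<?php
// pid.php - concurrent requests are handled by separate worker
// processes, so this value differs between requests served at once.
echo "Served by PHP process " . getmypid() . "\n";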

file_get_contents/curl blocks other clients

I use file_get_contents/curl to access an API on another server from my PHP script. This API isn't fast and can take up to 10 seconds to respond.
When I try to open two pages on my web site at the same time, both of which use this API, they load one by one, i.e. I have to wait for the first to finish loading before the server starts serving the request for the second page.
I use Apache2 and PHP under Linux.
How can I avoid this behaviour? I don't want to block other clients while one of them accesses this API. Need help!
Thanks.
Yes.
There is this PHP library: http://code.google.com/p/multirequest/ (a library for issuing multiple cURL requests in parallel).
As another solution, you could write a script in a language that supports threading, like Ruby or Python, and just call that script from PHP. Seems rather simple.
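If you'd rather avoid an extra library, PHP's built-in curl_multi API lets one script issue several slow requests concurrently instead of one after the other. A rough sketch (the URLs are placeholders):

<?php
// Fire two slow API calls in parallel rather than sequentially.
$urls = ['https://api.example.com/a', 'https://api.example.com/b'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all handles until every transfer has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);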

Use an Apache web server (or any other web server) to serve content generated by a compiled C program (i.e. an .exe)

Since my web page performs very complex calculations, it's VERY important to have them generated by compiled code, but since I'm doing it for the web I need a few facilities like the ones that come with PHP, such as $_SERVER (to get, for example, the user's IP), $_GET and $_POST.
If there already is a web server like this that passes these things as parameters, for example, that would be easier.
Thanks in advance.
You have two basic options:
Use CGI, which is a well supported system for communicating between web servers and scripts/executables.
Write a module
CGI is simple and nearly universal, but requires a new process to be spawned for each request. There is also FastCGI, which is a bit more complicated but lets processes be reused.
Writing a module is significantly more complicated, but provides better performance.
Perhaps look at http://www.boutell.com/cgic/?
You can either compile your program as a CGI, or bounce your requests through a PHP script and pass whichever values you need in as command-line parameters:
<?php
// escapeshellarg() prevents user input from injecting extra shell commands
passthru("/path/to/my/binary " . escapeshellarg($_SERVER['HTTP_HOST']) . " "
    . escapeshellarg($_GET['aparameter']) . " " . escapeshellarg($_POST['aparameter']));
?>
If you want to go down the CGI route, start here... ;-)

Best method for large file transfer via HTTP(S) using PHP (POST), invoked via shell

I want to set up an automated backup via PHP so that, over HTTP/S, I can "POST" a zip file request to another server and send over a large .zip file. Basically, I want to back up an entire site (and its database) and have a cron job periodically transmit the file via HTTP/S, something like
wget http://www.thissite.com/cron_backup.php?dest=www.othersite.com&file=backup.zip
The appropriate authentication security can be added afterwards...
I prefer HTTP/S because the other site has limited use of FTP and is on a Windows box, so I figure the sure way to communicate with it is via HTTP/S. The other end would have a corresponding PHP script that would store the file.
This process needs to be completely programmatic (i.e. Flash uploaders will not work, as those need a browser; this script will run from a shell session).
Are there any generalized PHP libraries or functions that help with this sort of thing? I'm aware of the PHP script timeout issues, but I can typically alter php.ini to mitigate that.
I'd personally not use wget, and would just run the PHP script from the shell directly for this.
Your PHP script would be called from cron like this:
/usr/local/bin/php /your/script/location.php args here if you want
This way you don't have to worry about yet another program (wget) to handle things. If your settings are the same on each run, just put them in a config file or directly into the PHP script.
The timeout can be handled by this, which lets the PHP script run for an unlimited amount of time:
set_time_limit(0);
Not sure what libraries you're using, but look into cURL to do the POST; it should work fine.
I think the biggest issues that come up would be more server-related and less PHP/script-related, i.e. make sure you have the bandwidth for it, and that your PHP script CAN connect to an outside server.
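A minimal sketch of the cURL POST from the CLI script follows; the URL, field name and receiving endpoint are assumptions, not anything defined in the question:

<?php
// Upload a large zip to the receiving script on the other server.
set_time_limit(0); // let the transfer run as long as it needs

$ch = curl_init('https://www.othersite.com/receive_backup.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    // CURLFile streams the file from disk instead of loading it into a string.
    CURLOPT_POSTFIELDS     => ['backup' => new CURLFile('/path/to/backup.zip')],
]);

$response = curl_exec($ch);
if ($response === false) {
    fwrite(STDERR, 'Upload failed: ' . curl_error($ch) . "\n");
}
curl_close($ch);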
If it's at all possible, I'd stay away from doing large transfers over HTTP. FTP is far from ideal too, but for very different reasons.
Yes, it is possible to do this via FTP, HTTP and HTTPS using cURL, but it does not really solve any of the problems. HTTP is optimized around sending relatively small files in relatively short periods of time; when you stray away from that you end up undermining a lot of the optimization applied to web servers (e.g. with a MaxRequestsPerChild setting you could be artificially extending the life of processes which should have stopped, and there's the interaction between the LimitRequest* settings and max_file_size, not to mention various timeouts and the other limit settings in Apache).
A far more sensible solution is to use rsync over SSH for content/code backups and the appropriate database replication method for the DBMS you are using, e.g. MySQL replication.
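For example, a typical rsync-over-SSH invocation from cron might look like this (paths, user and host are placeholders):
rsync -az -e ssh /var/www/mysite/ backupuser@backup.example.com:/backups/mysite/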
