I'm trying to determine the best approach to providing an Ajax-based terminal using PHP. I haven't attempted to write it yet, but having rolled the idea around, the only way I could see it working would be with two scripts:
Script 1 handles Ajax communication between the server and the client browser. When a request is made to use the terminal, it connects to (or starts as a service and then connects to) Script 2 via a socket.
Script 2 performs the system calls, passing output back to the Ajax script via the socket (a rough sketch of this follows below).
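For what it's worth, a minimal sketch of what Script 2 could look like, assuming a local-only TCP socket on port 9000; the port and the blind shell_exec() pass-through are illustrative assumptions, and real code would need authentication and command filtering:

<?php
// Script 2 (sketch): listen on a local socket, run each received command,
// and send its output back to Script 1 (the Ajax handler).
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if (!$server) {
    die("Could not start socket server: $errstr\n");
}
while ($conn = stream_socket_accept($server, -1)) {
    $cmd = trim(fgets($conn));                  // one command per connection
    fwrite($conn, shell_exec($cmd . ' 2>&1'));  // return stdout and stderr
    fclose($conn);
}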
There are multiple holes I can see in this though, and I'm wondering if anyone has created/seen a set of scripts that can perform these tasks? Any insight would be greatly appreciated!
Thanks :)
Edit: I think I was unclear about a few things. I've found a few scripts that imitate terminals, providing nearly the functionality that I'm looking for, such as AjaxPHPTerm (http://sourceforge.net/projects/ajaxphpterm/)
The problem is that I'm trying to find a method that permits interaction with shell scripts. If a script prompts "Press any key to continue" or "Select option [x]", AjaxPHPTerm just hangs or drops out of the shell script.
That's why I started thinking about sockets or streams; some way of forming a direct I/O stream to the system calls.
HTTP is stateless, and AJAX, sockets or any other technology based on server-generated pages will not magically change that. Whatever tricks you use, it will not be efficient and is simply not worth the effort (in my opinion, at least).
The problem seems to be that AjaxPHPTerm is actually closer to a shell than a terminal (glancing at the code, it seems to do its own CWD handling, and has a simple read-eval-print loop).
Assuming a POSIX-compatible OS on the server, the proper way to implement this would probably be to use the pseudo-terminal facility, so that your web terminal appears as a virtual terminal on the system that running programs can interactively access.
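For illustration, a sketch of attaching a shell to a pseudo-terminal from PHP. This assumes a PHP build whose proc_open() supports the 'pty' descriptor type, which many builds do not; on those, wrapping the command in a utility such as script(1) or expect achieves a similar effect:

<?php
// Sketch: run bash attached to a pseudo-terminal so interactive prompts
// ("Press any key to continue", etc.) behave as in a real terminal.
$descriptors = array(
    0 => array('pty'), // stdin
    1 => array('pty'), // stdout
    2 => array('pty'), // stderr
);
$process = proc_open('bash', $descriptors, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], "ls\n");            // send a command to the shell
    usleep(200000);                       // crude wait for output (sketch only)
    stream_set_blocking($pipes[1], false);
    echo stream_get_contents($pipes[1]);  // read whatever the shell produced
    proc_close($process);
}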
Related
I am programming PHP applications where I need to move processing routines from the client (browser) to the server. We have our own dedicated Windows servers.
Let us say that a shopper buys something and the system has to generate a nice and complex invoice PDF (and do a lot of other things that take some seconds), and after this send it to the client as fast as possible. Right now, I run these time-consuming routines in a hidden iframe and hope the client doesn't break the routine by going to another page. It is not a good solution.
A much better solution would be to trigger some kind of software on the Windows server that does the processing instead of the browser (and does it instantly).
I could use "Scheduled Tasks" in Windows but the quickest it can run is each minute. I need something that can run instantly. Do you know what can do this on a Windows server? Like some kind callback server (software).
To make sure user navigation doesn't stop your script, you can use ignore_user_abort(true);
You can also use exec() to create a background process that runs your PHP script.
Example
exec("start /B php Notify.php");
To learn more about the exec() function, see the PHP manual.
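Putting those two together, a minimal sketch of the pattern; Notify.php is the example worker from above, and the pclose(popen(...)) pairing is a common Windows recipe for detaching the child so the current request returns immediately:

<?php
// Respond to the shopper right away, then hand the slow work (invoice
// generation, etc.) to a background PHP process.
ignore_user_abort(true); // keep running even if the user navigates away

// "start /B" detaches the worker on Windows; pclose(popen(...)) returns
// without waiting for the worker to finish.
pclose(popen('start /B php Notify.php', 'r'));

echo 'Your invoice is being generated and will be sent to you shortly.';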
I have been looking at Node.js applications vs PHP, and I found many comparisons of these two server techniques. Mostly, people suggest that the JavaScript V8 engine is much faster than PHP at raw script execution.
I have worked on some JavaScript code for Node.js, and now I have this idea, though I don't know if it is correct.
As I understand it, Node.js runs a JavaScript application and listens on a port, so this JavaScript application is a program that keeps running on the server. Therefore, the application code is all loaded into the computer's memory; things like global variables are declared and saved once, when Node.js first executes the program. Any new request that comes in can use these variables very efficiently.
In PHP, however, the program executes a *.php file per request. So if a request is for www.xx.com/index.php, the server will execute index.php, which may contain something like
require("globalVariables.php");
Then php.exe would go there and declare these variables again, on every request. The same applies to functions and other objects...
So am I correct in thinking that PHP may not be a good choice when there are many libraries that need to be included?
I have searched for comparisons, but nobody has talked about this.
Thanks
You are comparing different things. PHP depends on a web server such as Apache or nginx to serve scripts, while Node.js is a complete server itself.
That's a big difference: when you load a PHP page, Apache will spawn a thread and run the script there, whereas in Node all requests are served by Node's single JavaScript thread.
So PHP and Node.js are different things, but regarding your concern: yes, you can maintain a global context in Node that stays loaded in memory all the time, while PHP loads, runs and exits on every request. But that's not the typical bottleneck; Node.js web applications also have templates that must be loaded and parsed, database calls, file access... The real difference is the way Node.js handles heavy tasks: a single thread for JavaScript, an event queue, and external threads for filesystem, network and all the other slow stuff. Traditional servers spawn a thread per connection, which is a very different approach.
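To make the per-request point concrete, a tiny PHP illustration (globalVariables.php is the hypothetical include from the question):

<?php
// In the classic PHP model every request starts from a blank slate, so
// this always prints 1. In a long-lived Node.js process, a module-level
// variable would keep its value across requests, since only the request
// handler reruns, not the whole program.
require 'globalVariables.php'; // re-read and re-evaluated on every request
$counter = 0;
$counter = $counter + 1;
echo $counter;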
I'm considering the idea of a browser-based PHP IDE and am curious about the possibility of emulating the command line through the browser, but I'm not familiar enough with developing tools for the CLI to know if it's something that could be done easily or at all. I'd like to do some more investigation, but so far haven't been able to find very many resources on it.
From a high level, my first instinct is to set up a text input which would feed commands to a PHP script via AJAX and return any output onto the page. I'm just not familiar enough with the CLI to know how to interface with it in that context.
I don't need actual code, though that would be useful too, but I'm looking for more of which functions, classes or APIs I should investigate further. Ideally, I would prefer something baked into PHP (assume PHP 5.3) and not a third-party library. How would you tackle this? Are there any resources or projects I should know about?
Edit: The use case for this would be a localhost or development server, not a public facing site.
Call this function through an RPC or a direct POST from JavaScript; it should do things in this order (a sketch of the whole flow follows the list):
Write the PHP code to a file (with a random name) in a folder (with a random name), where it will sit alone, execute, and then be deleted at the end of execution.
The current PHP process must not run the code in that file itself. Instead, it needs exec permissions (safe_mode off) and should launch a separate interpreter with a locked-down configuration: exec('php -c /path/to/security_tight/php.ini ...') (see php -? for the options).
Catch any output and send it back to the browser; that way you are protected from any weird errors. Instead of exec() I recommend popen(), so you can kill the process and manually control the timeout while waiting for it to finish (if you kill the process, you can easily send an error back to the browser).
You need lax/normal security (the same as for the entire IDE backend) for the normal PHP process that runs when called through the browser.
You need strict, paranoid security for the php.ini and the PHP process that runs the temporary script (go ahead and even separate it onto another machine which has no network/internet access and has its state reverted to factory settings every hour, just to be sure).
Don't use eval(); it is not suitable for this scenario. An attacker could jump out into your application and use your current permissions and variable state against you.
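A minimal sketch of that flow, assuming the hardened php.ini path from above; the temp-file handling and timeout logic are simplified (proc_open() with stream_select() would give finer control over killing a runaway process):

<?php
// Write the submitted code to an isolated temp file, run it under a
// locked-down PHP configuration, capture the output, then clean up.
$dir = sys_get_temp_dir() . '/run_' . uniqid();
mkdir($dir, 0700);
$file = $dir . '/script.php';
file_put_contents($file, $_POST['code']);

$cmd = 'php -c /path/to/security_tight/php.ini ' . escapeshellarg($file) . ' 2>&1';
$proc = popen($cmd, 'r');
$output = '';
while (!feof($proc)) {
    $output .= fread($proc, 4096);
}
pclose($proc);

unlink($file);
rmdir($dir);
echo $output; // send the captured output back to the browser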
The basic version would be:
your script outputs a form with a line input
the form action points to your script
the script takes the input from the form and passes it to eval()
any output from eval() is passed back to the browser
the form is output again
The problem is that defined functions and variables are lost between requests.
What you could do is append each line that is entered to your session. Let's say:
$inputline = $_GET['line'];
$_SESSION['script'] .= $inputline . PHP_EOL;
eval($_SESSION['script']);
This way, on each request the full PHP script accumulated so far is executed (and of course you will get the full output).
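Put together, a minimal sketch of that accumulate-and-replay loop (no sandboxing whatsoever, so only for a trusted local development setup):

<?php
// Append each submitted line to the session and replay the whole script
// on every request, echoing the captured output.
session_start();
if (!isset($_SESSION['script'])) {
    $_SESSION['script'] = '';
}
if (isset($_GET['line'])) {
    $_SESSION['script'] .= $_GET['line'] . PHP_EOL;
    ob_start();
    eval($_SESSION['script']);
    echo '<pre>' . htmlspecialchars(ob_get_clean()) . '</pre>';
}
echo '<form method="get"><input name="line"></form>';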
Another option would be to create some kind of daemon (basically an instance of a php -a call) that runs on the server in the background, takes your input from the browser and passes back the output.
You could connect this daemon to two FIFO devices (one for input and one for output) and communicate via simple fopen() calls, as sketched below.
For each user that is using your script, a new daemon process has to be spawned.
Needless to say, it is important to secure your script against abuse.
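A sketch of the browser-facing side of that FIFO idea, assuming the daemon (started separately) reads commands from /tmp/term_in and writes results to /tmp/term_out; both paths are illustrative:

<?php
// Create the FIFOs once (requires the POSIX extension), pass one command
// in, and read one line of output back.
$in = '/tmp/term_in';
$out = '/tmp/term_out';
if (!file_exists($in)) { posix_mkfifo($in, 0600); }
if (!file_exists($out)) { posix_mkfifo($out, 0600); }

$w = fopen($in, 'w');            // blocks until the daemon opens its end
fwrite($w, $_GET['cmd'] . "\n");
fclose($w);

$r = fopen($out, 'r');
echo fgets($r);                  // one line of the daemon's output
fclose($r);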
Recently I read about php.js, a PHP interpreter written in JavaScript, which would let you write and execute PHP code using your browser only. I'm not sure if this is what you need in the end, but it sounds interesting.
We've tested some products at my university for SSH access to our lab servers and used some of the web SSH tools; they basically do exactly what you want. The Shell In A Box project may be bound to any interpreter you like and may be used with an interactive PHP interpreter if desired (on the demo page, they used a basic interpreter). The project could serve as a basis for a true PHP IDE. These tools have the advantage of being able to interact with any console-based editor (e.g. vi, emacs or nano), and of accepting administrative commands (e.g. creating folders, changing ownership or ACLs, or restarting a service).
Mozilla also has a full-featured web-based IDE called Bespin, which is highly extensible and configurable.
Since you stated that the page is not for the public, you of course have to protect it with authentication and SSL to combat session hijacking.
I need to run JavaScript code on the server side using IE8
(the JavaScript works with ActiveX objects),
but I need to run it from the command line, from PHP.
So in short: I will install Apache + PHP on a Windows 2003 server, and PHP will use system() to execute iexplore, loading a page that runs the JavaScript.
I would like to know if this is logically possible, as I can see a number of pitfalls:
PHP might not be able to execute iexplore without a user logged in.
iexplore might not run the javascript correctly to interact with ActiveX objects
iexplore might not quit when the JS has finished running.
I will attempt to make a little test case as soon as I can, but any pointers about this approach would be appreciated.
Edit:
Now I realise that this is a roundabout way of doing things (read: wrong). The goal was to make a Dymo label printer print from a central location rather than from client machines (which is where the JS comes from). The Dymo SDK provides several ways of interacting with their printers, but I'm still looking for a way to use pure PHP. I think it might be possible to use one of their example CLI binaries.
Does the Dymo have a way of interacting with it from the command line? If so, you can easily send commands to it via shell_exec(): http://www.php.net/manual/en/function.shell-exec.php
This is generally the easiest option when you are able to control something via the command line. Sometimes you need a bit more control, however (for interactive command-line programs, for instance), and sometimes the program you want to run isn't even command-line based. In these cases you may need proc_open() (http://www.php.net/manual/en/function.proc-open.php) or exec() (http://www.php.net/manual/en/function.exec.php).
Just make sure that if you use exec() you redirect the output! Failing to do this can cause the program to hang indefinitely.
From the PHP manual:
Note:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
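For example (DymoCmd.exe is a hypothetical stand-in for whatever CLI binary the Dymo SDK actually ships; the redirection is the important part):

<?php
// Redirect output so exec() returns immediately instead of hanging while
// the printer utility runs; "start /B" detaches it on Windows.
exec('start /B DymoCmd.exe print label.label > NUL 2>&1');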
Make sure to update your Service Packs and antivirus definitions; I can foresee many, many potential security issues here.
Keep in mind that JavaScript in IE runs within a webpage context. When you refresh or navigate between pages, the old JavaScript execution state is wiped and a new one begins.
Was there a specific question here?
I want to set up an automated backup via PHP so that, using HTTP/S, I can POST a zip file request to another server and send over a large .zip file. Basically, I want to back up an entire site (and its database) and have a cron job periodically transmit the file over HTTP/S, something like:
wget "http://www.thissite.com/cron_backup.php?dest=www.othersite.com&file=backup.zip"
The appropriate authentication security can be added afterwards...
I prefer HTTP/S because the other site has limited use of FTP and is on a Windows box, so I figure the sure way to communicate with it is via HTTP/S. The other end would have a corresponding PHP script that stores the file.
This process needs to be completely programmatic (i.e. Flash uploaders will not work, as they need a browser; this script will run from a shell session).
Are there any generalized PHP libraries or functions that help with this sort of thing? I'm aware of the PHP script timeout issues, but I can typically alter php.ini to minimize this.
I'd personally not use wget, and just run from the shell directly for this.
Your php script would be called from cron like this:
/usr/local/bin/php /your/script/location.php args here if you want
This way you don't have to worry about yet another program (wget) to handle things. If your settings are the same on each run, just put them in a config file or directly into the PHP script.
The timeout can be handled by this; it makes the PHP script run for an unlimited amount of time:
set_time_limit(0);
Not sure what libraries you're using, but look into cURL to do the POST; it should work fine.
I think the biggest issues that would come up would be more server-related and less PHP/script-related, i.e. make sure you've got the bandwidth for it, and that your PHP script can connect to an outside server.
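A minimal cURL sketch of the POST side; the URL, the field name and receive_backup.php are assumptions, and CURLFile requires PHP 5.5+ (older versions used the '@/path/to/file' syntax instead):

<?php
// POST the zip to the receiving server and print its response.
set_time_limit(0); // allow a long-running transfer

$ch = curl_init('https://www.othersite.com/receive_backup.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('/path/to/backup.zip'),
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);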
If it's at all possible I'd stay away from doing large transfers over HTTP. FTP is far from ideal too, but for very different reasons.
Yes, it is possible to do this via FTP, HTTP and HTTPS using cURL, but it does not really solve any of the problems. HTTP is optimized around sending relatively small files in relatively short periods of time; when you stray away from that you end up undermining a lot of the optimization that's applied to web servers (e.g. with a MaxRequestsPerChild setting you could be artificially extending the life of processes which should have stopped, and there's the interaction between the LimitRequest* settings and max_file_size, not to mention various timeouts and the other limit settings in Apache).
A far more sensible solution is to use rsync over SSH for content/code backups, and the appropriate database replication method for the DBMS you are using, e.g. MySQL replication.
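For instance, a nightly cron job along these lines (paths, hostnames and credentials are placeholders):
mysqldump --all-databases | gzip > /backups/db.sql.gz
rsync -az -e ssh /var/www/site/ user@othersite.example:/backups/site/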