Creating a PHP web server... executing PHP files

I'm making a simple web server using PHP and sockets. Everything is working fine right now (static content only). I'm interested in supporting the execution of PHP files.
How would I go about doing this? I don't really want to use eval($fileContents) since that does not seem very secure. Is there some way that I can use FastCGI sockets or something?
What about PHP-CGI?
I've decided on using FastCGI, so here's a more specific question:
How do I pass files into PHP-CGI and get the output as a string?
php-cgi "phpinfo.php" outputs HTML content like I want.
I understand that I can use sockets but I can't seem to find out what to send into that socket to get the output.
Thanks

What's so insecure about eval($fileContents)? Or the more familiar, but effectively equivalent, include $fileName?
(With proper filename sanitizing, of course.)
Arguably it's no more insane than writing a PHP web server in the first place.

You could exec() the PHP executable from within your web server, though you'd also have to finagle passing the GPC (GET/POST/cookie) data to the script in question.
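A rough sketch of that approach, assuming php-cgi is installed at /usr/bin/php-cgi and handling GET only (the helper name run_php_cgi and the example paths are made up; POST would additionally need CONTENT_LENGTH and the body piped to stdin):

<?php
// Sketch: run a PHP file through php-cgi and capture its output as a string.
function run_php_cgi($scriptPath, $queryString = '')
{
    // php-cgi refuses to run without these CGI variables.
    $env = 'GATEWAY_INTERFACE=CGI/1.1'
         . ' REDIRECT_STATUS=200'
         . ' REQUEST_METHOD=GET'
         . ' SCRIPT_FILENAME=' . escapeshellarg($scriptPath)
         . ' QUERY_STRING=' . escapeshellarg($queryString);

    // The output is CGI headers, then a blank line, then the body.
    $raw = shell_exec('env -i ' . $env . ' /usr/bin/php-cgi');

    // Split off the headers and return just the body.
    $parts = preg_split("/\r?\n\r?\n/", (string) $raw, 2);
    return isset($parts[1]) ? $parts[1] : (string) $raw;
}

echo run_php_cgi('/var/www/phpinfo.php', 'foo=bar');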

Aurelia + PHP: Possible/Recommended?

Before this question gets closed, I know the setup above is possible. I just want clarification on some things.
I just started learning Aurelia because I want to convert one of my projects into a web app. My project is built with HTML + CSS + JavaScript (jQuery) + PHP (MySQL).
I haven't used any sort of framework before.
In the guide, they mention a few ways to set up a web server. I used the HTTP server with Node. Now this is where I need some help understanding a few things.
I don't want to use Node.js. I want to use PHP on the server. Will that work, and how?
When using an Apache server, I know any PHP page is sent to the interpreter, which renders the final HTML. I use XAMPP, and its Apache comes bundled with PHP. Does the HTTP server used by Node come with PHP? Is this even a sensible question?
Now, I know Aurelia is purely front-end. If it is used to make single-page applications, it uses Ajax. So I made the following assumption:
Using Aurelia, the user accesses the root page of the app that the web server sends. After that, Aurelia makes various Ajax requests to the server which will use my PHP files to do database query stuff.
Is that right, or am I missing something? And can I just use XAMPP (Apache) to host my app instead of the server from Node?
Aurelia is a front-end framework; once you export it to any server, it does not rely on any back-end software at all. This means that, with the help of the http-/fetch-client API, you can just call out to your PHP script.
I have an example in my github:
https://github.com/rjpvroegop/randyvroegop.nl-made-with-aurelia
Here I use the http-client to post data to my PHP script, which has a very simple email functionality.
You can see the action inside my view-model in src/pages/contact/index.js.
You can see the PHP script in src/assets/components/contactengine.php.
These work the way they should. Note: you have to change your gulp build if you want your PHP served the way I serve mine, from the dist folder after gulp-watch or gulp-export.
Next to that you can use any back-end functionality you would like, as long as it returns the proper data. This PHP script does that. If you would download my distribution to test this you can simply do the following:
Run gulp export from your terminal in the root folder.
Copy everything from the export folder to your PHP web server.
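And to make the Ajax assumption from the question concrete, here is a hypothetical minimal endpoint (the file name items.php, the table and the credentials are all made up); Aurelia's fetch-client would POST to it and get JSON back:

<?php
// items.php - hypothetical back-end endpoint for an Aurelia fetch-client call.
header('Content-Type: application/json');

// Read the JSON body sent by the http-/fetch-client.
$request  = json_decode(file_get_contents('php://input'), true);
$category = isset($request['category']) ? $request['category'] : '';

// Placeholder MySQL query; credentials and schema are illustrative only.
$pdo  = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('SELECT id, title FROM items WHERE category = ?');
$stmt->execute(array($category));

echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));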

Transfer large data from one server to another

I'm currently trying to transfer large data from one server to another using PHP cURL (posting the data). In some cases the remote server is getting incomplete (corrupted) data.
Is there any other way to achieve this reliably?
EDIT - 1
Using FTP seems like a good idea; would anybody like to argue that it is bad, or that I should avoid it for any reason? (Suggestions from Ed Heal and Neo.)
I would guess your PHP script is timing out. See "How to increase the execution timeout in php?"
Or you could get cURL to run in its own process; call it from a bash script, maybe.
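If that is the cause, a sketch along these lines (the URL and file path are placeholders) lifts both the PHP and the cURL time limits:

<?php
// Sketch: POST a large file with cURL, with timeouts effectively disabled.
set_time_limit(0); // no PHP execution time limit for this script

$ch = curl_init('https://remote.example.com/receive.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'payload' => new CURLFile('/path/to/large-file.bin'), // PHP 5.5+ upload wrapper
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);          // no overall cURL timeout
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // still fail fast if the host is down

$response = curl_exec($ch);
if ($response === false) {
    error_log('Transfer failed: ' . curl_error($ch));
}
curl_close($ch);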
Posting large files is not what HTTP is for. FTP is for transferring files; hence the name.
But if you are stuck on using HTTP, you can take a look at the WebDAV extensions to HTTP. There is a PHP library called SabreDAV that you should take a look at:
http://code.google.com/p/sabredav/
You can even use scp, so that the data transfer is secure as well; you should be able to find libraries for it. PHP's built-in SSH2 functions can also be useful: http://php.net/manual/en/function.ssh2-sftp.php
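A rough sketch with the ssh2 PECL extension (the host, credentials and paths are placeholders):

<?php
// Sketch: copy a file to another server over SSH using the ssh2 PECL extension.
$conn = ssh2_connect('remote.example.com', 22);
ssh2_auth_password($conn, 'username', 'password');

// Either use the scp helper...
ssh2_scp_send($conn, '/local/path/data.tar.gz', '/remote/path/data.tar.gz', 0644);

// ...or stream the file through the SFTP subsystem.
$sftp = ssh2_sftp($conn);
copy('/local/path/data.tar.gz', 'ssh2.sftp://' . intval($sftp) . '/remote/path/data.tar.gz');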
As you say the data is truncated, I would imagine that the server has a file size limitation, i.e. to prevent abuse and denial-of-service attacks.
I would stick to FTP and perhaps compress the files.
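Roughly, that combination could look like this (server details and paths are placeholders; gzencode here reads the whole file into memory, so stream with gzopen/gzwrite for really big files):

<?php
// Sketch: gzip the file first, then upload it over FTP in binary mode.
$source     = '/local/path/data.csv';
$compressed = $source . '.gz';

// Compress to a temporary .gz file.
file_put_contents($compressed, gzencode(file_get_contents($source), 9));

$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'username', 'password');
ftp_pasv($ftp, true); // passive mode is friendlier to firewalls

if (!ftp_put($ftp, '/remote/path/data.csv.gz', $compressed, FTP_BINARY)) {
    error_log('FTP upload failed');
}
ftp_close($ftp);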

Polling a running php cli script

I want to run a php script from the command line that is always running and constantly updating a variable.
I then want any php script that is run in the meantime (probably but not necessarily from the web) to be able to read that variable at any time.
Anyone know how I can do this?
Thanks.
Here, you want some kind of inter-process communication mechanism.
You cannot use a PHP variable for that: these are local to the script they're in.
This means you'll have to use some "external" tool to store your data, such as, to name only a few:
a file
a database (SQLite, MySQL, ...)
some shared-memory segment
In each case, you'll have:
One script that writes to the data store, i.e. your first, always-running script
One or more other scripts that read from the data store
You should write the variable to a file with the CLI script and read from that with the other script.
Be sure to use flock to prevent race conditions.
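A minimal sketch of that pattern, assuming an arbitrary file path of /tmp/shared-value.txt (writer and reader shown together):

<?php
// writer.php - the long-running CLI script keeps rewriting one file.
while (true) {
    $value = date('c'); // stand-in for whatever your script actually computes
    $fp = fopen('/tmp/shared-value.txt', 'c');
    if (flock($fp, LOCK_EX)) {       // exclusive lock while writing
        ftruncate($fp, 0);
        fwrite($fp, $value);
        fflush($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
    sleep(1);
}

<?php
// reader.php - any other script (web or CLI) reads the current value.
$fp    = fopen('/tmp/shared-value.txt', 'r');
$value = '';
if (flock($fp, LOCK_SH)) {           // shared lock while reading
    $value = stream_get_contents($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);
echo $value;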
You can write a PHP socket-based server script that listens on a desired port.
Then your client php script can connect to it either locally or from the web and retrieve any data, including variables.
You can use any simple protocol, either one you design yourself or a well-known one like XML, to transfer variables.
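A bare-bones sketch of that idea, with an arbitrary choice of port 9500 and a plain-text "protocol":

<?php
// server.php - CLI script: keeps a variable updated and serves it over TCP.
$server  = stream_socket_server('tcp://127.0.0.1:9500', $errno, $errstr);
$counter = 0;

while (true) {
    $counter++; // the constantly updating variable

    // Wait up to one second for a client, then loop and keep updating.
    $client = @stream_socket_accept($server, 1);
    if ($client) {
        fwrite($client, (string) $counter);
        fclose($client);
    }
}

<?php
// client.php - any other script connects and reads the current value.
$sock  = stream_socket_client('tcp://127.0.0.1:9500', $errno, $errstr, 5);
$value = stream_get_contents($sock);
fclose($sock);
echo $value;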
Lots of ideas:
1. At set intervals it appends/writes to a file.
2. You use SQLite and write your data to it.
3. You use a small memcached service as your intermediary.
4. You go somewhat crazy, write a socket class, listen on a set port, then make non-blocking calls to check it.
Options 1-2 are probably the simplest.
Option 3 would work great if you need to query the value a lot.
Option 4 would be fun, but might not be worth the effort.
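For option 3, a sketch assuming the Memcached extension is installed and a memcached daemon is running on localhost (the key name is arbitrary):

<?php
// Writer side (the always-running CLI script):
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);
$m->set('my_shared_variable', 42); // whatever your script actually computes

// Reader side (any web or CLI script):
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);
$value = $m->get('my_shared_variable');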

I'm trying to run some PHP scripts as CLI instead of over HTTP. How do I make them play nice?

I'm using some PHP scripts from FeedForAll to join together RSS feeds (RSSmesh) and display them as HTML (RSS2HTML).
Because I intend to run these scripts fairly intensively and don't want the resulting HTTP requests and bandwidth to count towards my hosting quota, I am in the process of moving to running them on the web host's server in an umbrella PHP "batch" script, and calling that script via cron (this is a Linux server, by the way).
Here's a (working) sample request over HTTP:
http://www.mydomain.com/a/rss2htmlcore/rss2html2.php?XMLFILE=http://www.mydomain.com/a/myapp/xmlcache/feed.xml&TEMPLATE=template.html
This will produce the desired HTML output. An example of how I want this to work on the command line:
/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html
This is with the correct shebang line added to "rss2html2-cli.php". I could just as well specify the executable ("/usr/local/bin/php") in the request, I doubt it makes a difference because I am able to run another script (that I wrote myself) either way without problems.
Now, RSS2HTML and RSSmesh are different in that, for starters, they include secondary files -- for example, both include an XML parser script -- and I suspect that this is where I am getting a bit in over my head.
Right now I'm calling exec() from the "umbrella" batch script, like so:
exec("/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html", $output);
But no output is being produced. What's the best way to go about this and what "gotchas" should I keep in mind? Is exec() the right way to approach this? It works fine for the other (simple) script but that writes its own output. For this I want to get the output and write it to a file from within the umbrella script if possible. I've also tried output buffering but to no avail.
Do I need to pay attention to anything specific with regard to the includes? Right now they're specified in the scripts as include_once("FeedForAll_XMLParser.inc.php"); and the specified files are indeed in the same folder.
Further info:
- This is a Linux server.
- I have no direct access to the shell, so I can't test things directly on a command line; everything is via crontab.
- I will admit that support for the FeedForAll scripts leaves a lot to be desired, but I'd like to keep using their scripts if at all possible, if only because I know them and have been using them for a while. I have looked into SimplePie, but the FFA scripts do some things that I've seen no obvious solutions for with SimplePie, like limiting the number of items per individual feed (RSSmesh) or limiting the description length (RSS2HTML).
- Yahoo! Pipes is out; they cache their data for too long for my application.
Should you want to take a look at the code, here are the scripts as txt files. RSS2HTML2 and RSSmesh are the FeedForAll scripts, FeedForAll_XMLParser... is the included parser. Note that I have not yet amended these to handle $argv etc. I have however in "scraper-universal-rss-cli", which works fine with CLI.
If anyone has any thoughts to share on this it would be very much appreciated. Thank you in advance.
I think the $hideErrors = 0; line in rss2html is not helping. Since isset() is used to check whether errors should be displayed, you should comment this line out; setting it to zero does nothing, because a variable set to 0 still counts as set for isset().
Re-run and see if it throws up some errors for you.
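Since there is no shell access, one way to actually see those errors from inside the umbrella script is to capture stderr along with stdout (a sketch; the paths are shortened placeholders):

<?php
// Sketch: capture both stdout and stderr from the CLI call, plus the exit code.
$cmd = '/usr/local/bin/php /path/to/rss2html2-cli.php /path/to/feed.xml /path/to/template.html';

exec($cmd . ' 2>&1', $output, $exitCode);

// Write whatever came back (including error messages) to a file and log the status.
file_put_contents('/path/to/rss2html-output.html', implode("\n", $output));
error_log('rss2html exit code: ' . $exitCode);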
Use wget or curl to issue the request against the local web server. Don't use CLI.
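A sketch of that approach from inside the umbrella script, reusing the HTTP URL that already works (the output path is a placeholder):

<?php
// Sketch: request the script through the local web server and save the HTML it produces.
$url = 'http://www.mydomain.com/a/rss2htmlcore/rss2html2.php'
     . '?XMLFILE=' . urlencode('http://www.mydomain.com/a/myapp/xmlcache/feed.xml')
     . '&TEMPLATE=template.html';

exec('wget -q -O /path/to/output/feed.html ' . escapeshellarg($url), $out, $code);

if ($code !== 0) {
    error_log('wget failed with exit code ' . $code);
}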

Dynamic website on static HTML

I am trying to create a script that will display the contents of a folder on a news ticker, and I was wondering if anyone had a script that could do this. I was thinking probably PHP, but it has not been working for me.
Thanks for the help.
The software I am using is Dreamweaver CS4.
I'm guessing that you have written some PHP, but are trying to run it locally without a server - for PHP to work you need a server. XAMPP is a good bet to do this locally, or you'll need to upload your file to some hosting that supports PHP.
I'm thinking the reason you're having trouble is that you just copied and pasted the code into a .html file and opened it in your local environment.
That said, in order for it to run locally, you have to install PHP and an HTTP server. The code runs on the server side, not the client side. So either get a hosting service that supports PHP, or download and install your own server and PHP.
Also, if you already have the above, the code has to be surrounded by <?php and ?> tags. If you're running it on the CLI, then you need to make sure you give it execute permission and a shebang line pointing to PHP, or run it as php <name of script>.
Last, the code you presented has major security flaws, the first of which is: where will the $dir_path variable be set? Will it be user-supplied, or will you specify it yourself?
Whenever you allow users to view your file system, always make sure you put limits on it. For example, let's say you did this:
www.example.com/newsticker.php?path=/www/files/newsticker
looks innocent enough, but a clever hacker could say let me try....
www.example.com/newsticker.php?path=/
And so forth.
So be careful and don't allow users to specify directories or execute code.
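To make that concrete, here is a hedged sketch of a news ticker listing that is pinned to one folder (the file name newsticker.php, the folder name and the parameter are all hypothetical):

<?php
// newsticker.php - hypothetical listing that is locked to one fixed folder.
$baseDir = realpath(__DIR__ . '/newsticker');   // hard-coded base, not user input

// If a sub-folder may come from the user, verify it cannot escape $baseDir.
$sub       = isset($_GET['path']) ? $_GET['path'] : '';
$requested = realpath($baseDir . '/' . $sub);

if ($requested === false || strpos($requested, $baseDir) !== 0) {
    http_response_code(400);
    exit('Invalid path');
}

foreach (scandir($requested) as $entry) {
    if ($entry === '.' || $entry === '..') {
        continue;
    }
    // Escape file names so they cannot inject HTML into the page.
    echo '<li>' . htmlspecialchars($entry) . '</li>' . PHP_EOL;
}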
