XML with PHP instead of MySQL for first time - php

I've always used MySQL with PHP. On my site I added a chat system that uses MySQL, which as you'd expect is VERY database heavy, so I'm thinking of switching the chat system over to per-user XML files to take the load off the DB.
I have never used XML with PHP before so I was looking for information, and found that the queries are pretty similar to MySQL.
Here are my questions:
Is it the client writing to the file or the server writing to the file?
Do I have to chmod them as 777?
Do I also have to chmod the XML file as 777?
If I do have to set the permissions to 777, does that drastically decrease security and is there any way to tighten that up?
Does anyone have any tutorials they would recommend as well? Most of the stuff I found is from 2003-2005, and I don't know how much has changed since then.
Hope it's cool I ask this question here.
Thanks a bunch
-Sal

First off, it seems like a very bad idea to move something that is very database-intensive to XML; it is going to be even more intensive that way.
To answer your question: it is always the server writing to the server filesystem. You only need to provide write access to the user running Apache (and consequently PHP).
I suggest you stick with MySQL for the chat, but look into DOMDocument to learn more about XML.
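If you do experiment with XML anyway, here is a minimal sketch of appending a chat message to a per-user XML file with DOMDocument. The file path and element names are just placeholders for illustration:

<?php
// Hypothetical per-user chat log; adjust the path and element names to your setup.
$file = 'chat/user_123.xml';

$doc = new DOMDocument('1.0', 'UTF-8');
$doc->preserveWhiteSpace = false;
$doc->formatOutput = true;

if (file_exists($file)) {
    $doc->load($file);
    $root = $doc->documentElement;
} else {
    $root = $doc->createElement('messages');
    $doc->appendChild($root);
}

// The server process (the user Apache/PHP runs as) does the writing, so that
// user needs write permission on the file and directory -- not 777 for everyone.
$msg = $doc->createElement('message');
$msg->setAttribute('time', date('c'));
$msg->appendChild($doc->createTextNode('Hello from PHP'));
$root->appendChild($msg);

$doc->save($file);

Note that the whole document is re-parsed and re-written on every message, which is part of why this will be heavier than the database approach.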

Related

How can I get CPU Usage with PHP on Windows IIS 8.5? (2012 R2)

What I'm looking for is an easy way to get either individual core usage or total CPU usage for the system that the PHP Script is running on.
However, I'm unable to do so. I've looked all over for all manner of solutions, from using perf (with and without passthru) to using winmgmts through COM.
The issue is that some of these will work on Windows if you use Apache, but with IIS the security restrictions stop PHP from being able to use, for example, winmgmts through COM, so I just get back a null object.
How can I solve this? I've honestly tried every solution I can find on the internet, and while there is lots of information about how to raise the permissions, all the guides point to IIS 7 or earlier and are no longer applicable to IIS 8.5; the suggested option changes literally don't exist any more.
If anyone could help me with this I'd be really appreciative. A workaround such as a third-party application that could provide this data would also be acceptable, as long as I can query the data through PHP from a file, over the network, etc. Even an ASP.NET script that I could query? (I don't know anything about ASP.NET, but I could use it for this single thing if it would work.)
Thank you.
I managed to solve this and I hope it helps someone else.
What you must do is convert the folder where your PHP (or asp) will execute to an Application. So the structure will look like this:
Website Name
-> Application Name
Then you want to select the parent folder (the Website Name folder), go to "Basic Settings" in the far-right Actions pane, select "Connect As..." and connect as an Administrator account.
Once you've done this, the application will inherit the credentials you specified on the parent website folder and you'll have full access to perf, WMI and so on.
If you only give the credentials directly to the application it doesn't work, and it also doesn't work if you don't convert the folder where your scripts execute into an application. This is where I was getting tripped up, and the documentation online is very sparse.
I'd like to thank the good people at the phpsysinfo GitHub for their IIS documentation, which put me on the right track about needing to convert a site folder to an application; that was the part of the puzzle I was missing.
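For reference, once the rights are in place, querying CPU load through winmgmts via COM looks roughly like this. This is only a sketch: it assumes the com_dotnet extension is enabled in php.ini and that the identity the application runs under has WMI access.

<?php
// Connect to WMI; requires extension=com_dotnet and sufficient permissions.
$wmi = new COM('winmgmts:{impersonationLevel=impersonate}//./root/CIMV2');

// Ask each processor for its current load percentage.
$cpus = $wmi->ExecQuery('SELECT LoadPercentage FROM Win32_Processor');

$total = 0;
$count = 0;
foreach ($cpus as $cpu) {
    $total += (int) $cpu->LoadPercentage;
    $count++;
}

echo $count ? round($total / $count) . '% average CPU usage' : 'no data';

Without the "Connect As..." step above, this is the point where IIS hands back a null object instead of a usable WMI connection.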

Prevent scripts from stealing password in open source PHP project?

I'm currently developing a PHP framework. Other developers may create modules for the framework. Source code of these modules should reside in the framework directory.
Since the project is open source, modules know the location of the config file, which has the database password in it. How can I protect the password from malicious modules? Note that a module can simply require_once the config file and do harmful things!
Currently I'm storing database passwords in a directory named config and protecting it with a .htaccess file:
<Directory config>
order allow,deny
deny from all
</Directory>
But that is not sufficient to prevent scripts from stealing the password, is it?
I've read the thread How to secure database passwords in PHP? but it did not help me finding the answer.
In PHP, you can't. It's not a sandboxed language; any code you run gets all the permissions of the user it's running under. It can read and write files, execute commands, make network connections, and so on. You must absolutely trust any code you bring into your project to behave well.
If you need security boundaries, you have to implement them yourself through privilege separation. Have each module run in its own process, as a user with very low privileges. Then you need some sort of inter-process communication: that could be OS-level pipes, or separate .php files running as different users and exposed as web services that the user-facing scripts call. Either way, it doesn't fit neatly into the usual way PHP applications work.
Or use another language such as Java, which can offer restricted code with stronger guarantees about what it is allowed to do (see SecurityManager et al).
Unfortunately, PHP is not a very secure language or runtime. However, the best way to protect this sort of information is to keep the username/password in a configuration file outside of your document root. In addition, the modules should use your API to get a database connection rather than creating one of their own based on this file. The config settings should not be global; design this in a fairly OOP style and provide the necessary encapsulation to block unwarranted access.
I've got an idea that may work for you, but it all really depends on what abilities your framework scripts have. For my idea to be plausible security wise you need to essentially create a sandbox for your framework files.
One idea:
What you could do (though it is probably more resource-intensive) is read each module in like you would a text file.
Then you need to identify everywhere their script reads a file; you've got things like fopen and file_get_contents to consider. One thing I'd probably do is tell the users they may only read and write files using file_get_contents and file_put_contents, then use a tool to strip any other file read/write functions (like fopen) out of their script.
Then write your own functions to replace file_get_contents and file_put_contents, and make their script use your functions rather than PHP's. In your file_get_contents replacement you essentially check permissions: are they accessing your config file or not? Return a string saying "access denied" if they are, or use the real file_get_contents to read and return the file if they're not.
As for your file_put_contents replacement, you just need to make sure they're not writing files to your server (they shouldn't be allowed to; imagine what they could do!). Alternatively, you could probably use chmod to stop that from happening.
Once you've essentially rewritten the module in memory, to be secure, you then use the "exec" function to execute it.
This would take a considerable amount of work - but it's the only pure PHP way I can think of.
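To make the read side of that idea concrete, the replacement reader could be little more than a path check in front of the real function. A rough sketch, where the function name and the config location are just examples:

<?php
// Hypothetical wrapper that module code would be rewritten to call
// instead of file_get_contents().
function module_read_file($path)
{
    $real    = realpath($path);
    $blocked = realpath(__DIR__ . '/config'); // directory holding the real config

    // Refuse anything inside the protected config directory.
    if ($real === false || ($blocked !== false && strpos($real, $blocked) === 0)) {
        return 'access denied';
    }

    return file_get_contents($real);
}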
I am not sure if it is possible, but you could maybe build a system that checks the files in the module for any PHP code that tries to include the config file, and then warns the user about it before installing.
However, it really shouldn't be your responsibility in the end.
A very good question with no good answer that I know of, however...
Have you seen runkit? It allows for sandboxing in PHP.
The official version apparently isn't well maintained any more; however, there is a version on GitHub that is quite popular: zenovich/runkit on GitHub.
That said, the best solution is perhaps a community repository where every submission is checked for security issues before being given the OK to use.
Good luck with your project.
Well, I see no problem here.
If it's a module, it can do harmful things by definition, with or without database access: it can delete files, read cookies, and so on.
So you have to either trust these modules (perhaps after reviewing them) or refuse to use modules at all.
Don't include your actual config file in your open source project.
The way I do it is to create just a template config file, config.ini.dist.
When a user downloads your project they have to rename it to config.ini and enter their own configuration information.
This way every user will have their own database connection info like username and password. Also when you update your project and users download your newest version, their own config files will not be overwritten by the one from your program.
This is a very common way to store configuration in open source projects - you distribute a template config file and tell users that they have to rename it and enter their own configuration details.
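As a small illustration of how the framework side can consume that file, here is a sketch using parse_ini_file; the file names follow the .dist convention above, and the section/key names are made up:

<?php
// config.ini is kept out of version control; only config.ini.dist ships with the project.
$configFile = __DIR__ . '/config.ini';

if (!is_file($configFile)) {
    die('Copy config.ini.dist to config.ini and fill in your own credentials.');
}

$config = parse_ini_file($configFile, true); // true = keep [sections]

// e.g. $config['database']['username'], $config['database']['password']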
I don't think there is a way to prevent a module from capturing sensitive data from the framework configuration and sending it to some stranger out there. On the other hand, I don't think it should be your responsibility to protect the user from that happening.
After all, it's the user who decides to install any module, right? In theory it should be them who verifies the module's intent.
Drupal, for example, does nothing in this direction.
There is a worse problem, anyway: what would prevent a nasty module from wiping out your entire database once it is installed?
And, by the way, what could a malicious stranger do with your database password? At the very least you need to secure the database connection anyway, so that only trusted hosts can connect to the database server (an IP/host-based check, for example).

Need an advice on text-processing flow

I've been a research programmer (MATLAB) for most of my programming career, writing things only for myself that run on my own computer. Now I'd like people to be able to submit a comma-delimited text file and get processed text files in return without having to use my computer directly (I only have one MATLAB installation).
I'm thinking this could be done on my web server (XAMPP) over the LAN with a script, in some programming language, that runs on the server. This is what I'm thinking:
have people create comma-delimited text files.
have them go to a site I created on my localhost and submit it via a web form.
have the uploaded file processed in PHP (small files, < 100KB). This involves looking up a MySQL database as well.
have people download the processed files somehow.
Is this a sound system? By "sound" I mean: if you, the expert, wanted to set up this system, are these the steps and tools you would use? I've been learning PHP lately, and it seems like I could do this using PHP, but I'm not sure it's the right tool for the task. The whole thing seems a bit on-the-fly: you upload the file and things are done in PHP memory (from what I've read) instead of the file being stored on my server and the server running a script on that file (is there a difference?!). I would be greatly thankful if you could chime in and give me some pointers on how to do this properly (general ideas, not asking for code).
PHP is most definitely a good tool for something like this. As meteorainer mentioned, PHP offers a pretty simple solution for most of what you need to do, and is much less complicated (in my opinion) than Java or .NET. I also believe it to be much easier to get started with.
As far as pointers go, a lot of what you need to accomplish can be found in the PHP manual itself, along with code samples. For example:
File uploads:
http://php.net/manual/en/features.file-upload.php
CSV Processing:
http://php.net/manual/en/function.fgetcsv.php
or, the method meteorainer mentioned
http://us3.php.net/manual/en/function.explode.php
MySQL Databases:
http://us3.php.net/manual/en/book.mysql.php
http://us3.php.net/manual/en/function.mysql-connect.php
Creating new files:
http://php.net/manual/en/function.fwrite.php
As far as whether or not this is a sound system, that all really depends on what this is going to be used for. I may be wrong, but it sounds like you just need a simple application for a very specific use. If this is the case, I would say it sounds just fine. You can always expand upon it later on if you choose to do so. Adding more security measures, more robust output, things like that. Either way, at the very least, your PHP implementation sounds like pretty good starting point to me.
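Tying those manual pages together, the upload-and-process step might look something like the sketch below. The form field name, directories, and the pass-through "processing" are all placeholders; the database lookup is left as a comment since it depends entirely on your schema.

<?php
// Handles a form like: <input type="file" name="datafile">
$uploadDir = __DIR__ . '/uploads/';
$outputDir = __DIR__ . '/processed/';

if (!is_uploaded_file($_FILES['datafile']['tmp_name'])) {
    die('No file uploaded.');
}

$source = $uploadDir . basename($_FILES['datafile']['name']);
move_uploaded_file($_FILES['datafile']['tmp_name'], $source);

$in  = fopen($source, 'r');
$out = fopen($outputDir . 'result.csv', 'w');

while (($row = fgetcsv($in)) !== false) {
    // ...look things up in MySQL, transform $row, etc....
    fputcsv($out, $row);
}

fclose($in);
fclose($out);

// Then give the user a link to processed/result.csv (or stream it with readfile()).

This also answers the in-memory worry: the uploaded file is stored on the server (move_uploaded_file), and the script then works on that stored file.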
Yeah, PHP can definitely do what you are looking for. You'll be using functions like:
$variablesArray = explode(',', file_get_contents('uploadedfile.csv'));
to bust the CSV open into a useful array and do some storage/math with it. PHP is definitely your bag.
You have other options, like Java and ASP, but IMO Java is far too complicated for what you get out of it, and ASP requires a .NET license and, again, gives you nothing over FREE PHP.

PHP / FTP Client

I'm about to get my hands dirty writing an FTP wrapper for PHP, I just need to perform the basics:
read / write and append to files
list / chmod and delete files / folders
Unfortunately I have only had to mess with FTP within PHP once, to answer this question, and I was somewhat disappointed with the ftp extension, mainly because it isn't trivial to distinguish between files and folders and the overall speed wasn't great.
As far as I know PHP has four distinct ways of interacting with FTP servers:
Pure Socket Implementation
File Wrappers
FTP Extension
CURL Extension
Now, I don't want to code the FTP client protocol myself, so option #1 is out of the equation.
File wrappers are great if I need to do something trivial like getting a single file, but they are extremely slow if I need to perform more complex operations since each call will open its own connection.
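(For the trivial case, the wrapper approach really is a one-liner; the host and credentials here are placeholders:)

$contents = file_get_contents('ftp://user:password@ftp.example.com/path/file.txt');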
That leaves me with the FTP and CURL extensions, and here is where I need some guidance. As I said before I am not a big fan of the FTP extension, on the other hand I've never used CURL to FTP so I can't objectively compare one with the other.
Has anyone ever tried both approaches? What are your thoughts on them? Is the CURL option faster?
Also, are there any alternatives I'm not aware of?
Have you looked at the PEAR package Net_FTP?
I've tried both for one project. I needed to upload a file over an FTPS connection (encryption plus authentication) and then get a response code and some XML back - a kind of XML-RPC exchange. In the end I couldn't even get close to a solution with the php-ftp extension, while everything was accomplished with PHP cURL plus some debugging (CURLOPT_VERBOSE) and configuration. So I vote for cURL; it has been around since 1997 and works great!
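For comparison, a basic FTP upload with the cURL extension looks roughly like this. It is a sketch only: host, credentials, and paths are placeholders, and the commented-out options cover the FTPS and CURLOPT_VERBOSE debugging mentioned above.

<?php
$localFile = '/tmp/report.txt';
$remoteUrl = 'ftp://ftp.example.com/incoming/report.txt';

$fp = fopen($localFile, 'r');

$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_USERPWD, 'username:password');
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localFile));
// curl_setopt($ch, CURLOPT_FTP_SSL, CURLFTPSSL_ALL); // encrypt the session for FTPS
// curl_setopt($ch, CURLOPT_VERBOSE, true);           // protocol-level debugging

if (curl_exec($ch) === false) {
    echo 'Upload failed: ' . curl_error($ch);
}

curl_close($ch);
fclose($fp);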

How to isolate server disaster script in PHP?

Oh my goodness. I never thought that I would need to ask you this, but unfortunately yes, I do!
I have a PHP script of my own that uses ffmpeg-php, and ffmpeg-php is a bastard. For some input it works OK, but for some it crashes my whole PHP process and the server throws Internal Server Error 500. I've tried several times to update ffmpeg-php, ffmpeg itself and so on, but input that works in version 0.5 won't work in 0.6. What I need is to be sure that the rest of the script will be processed correctly. Right now it isn't, because when it comes to running toGDImage() on a movie frame I get Internal Server Error 500 and no feedback from any source about why.
So for the peace of mind of my users I decided that I need to isolate the part of the script that messes with ffmpeg-php. I need a way to ensure that if something goes terribly wrong in that part, the rest will go on.
Try/catch does not work because this is not a warning, nor a fatal error; it is a horrible server disaster. So what are your suggestions?
I'm thinking about putting that code into another file called ffmpeg-php-process.php, calling it via HTTP and reading the result; if it is Internal Server Error 500, I will know that it was not OK.
Or are there any other, more neat ways to isolate disaster scripts in PHP?
PS. Don't tell me that I need to diagnose, debug or find the source of the error. I'm not a damn beginner, and I'm not an ffmpeg dev who can mess with its code. I need to make my users safe now, and that's all I care about right now.
If you're getting a 500 error, it's because an exception of some sort is being thrown at a level lower than that of PHP itself. Unless your code is spinning into some kind of infinite loop or hitting a recursion limit (and especially since it worked with version 0.5), there's a good chance that ffmpeg or ffmpeg-php is crashing and taking the instance of PHP that launched it down with it.
Frankly, there's nothing you can do from PHP.
Your best bet, since you've already got access to the server, would be to write the script in question in a language like Python. There are a ton of ffmpeg Python plugins, so you shouldn't have a difficult time setting that up at all. Call your Python script from PHP and pull in the output from a file. This isolates PHP from your script failing. It also gets you away from ffmpeg-php (which, at least to me, seems like an unholy combination).
If you're dead-set on using PHP (which I don't recommend), you can launch another PHP script using php-cli from your outward-facing PHP script and do the work from there (as you would with Python). Again, I highly recommend that you avoid this.
Hope this helps!
You could spawn a new process containing your php-ffmpeg script. There are functions to do that, proc_open() for instance.
The documentation has a decent example of it:
http://php.net/proc_open
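A stripped-down version of that approach, assuming the fragile ffmpeg-php code has been moved into its own ffmpeg-php-process.php CLI script (the script name and input path are placeholders), might look like this:

<?php
// Run the fragile ffmpeg-php code in its own PHP CLI process so a crash
// cannot take the parent request down with it.
$descriptors = [
    1 => ['pipe', 'w'],  // child stdout
    2 => ['pipe', 'w'],  // child stderr
];

$cmd  = 'php ffmpeg-php-process.php ' . escapeshellarg('/path/to/input.mov');
$proc = proc_open($cmd, $descriptors, $pipes);

if (is_resource($proc)) {
    $output = stream_get_contents($pipes[1]);
    $errors = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);

    $exitCode = proc_close($proc);
    if ($exitCode !== 0) {
        // The child crashed; the main script keeps running and can report it.
        error_log("ffmpeg-php worker failed ($exitCode): $errors");
    }
}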
I have something similar going on with a convoluted, large, bulky legacy PHP email system I support. When it became apparent that the email system was becoming its own beast, we split it off onto its own virtual server entirely. There's no separation like PHYSICAL separation. And hey, virtual servers are cheap...
On the plus side, you can start, restart and generally destroy the separate server with little effect on the rest of your code. It may also have backup benefits (isolating media and logic). Since going this route, we have never taken the main application server down.
However, it does create a connection challenge: rather than working locally, your server is now talking to another one separated by, at the very least, a bit of wire in the same cabinet (hopefully).
