Downloading PHP content from another domain (safe way)?

So, if this question has been asked before, I'm sorry. I'm not exactly sure what to search for.
Introduction:
All the domains I maintain are currently hosted on my server, so I have not run into this problem yet.
I have created a structure, similar to WordPress, for uploading and editing images.
I regularly make changes to the functions and upload them to a single folder. When a user logs in, the contents are automatically downloaded into their folder.
What I am wanting to do:
Now, say I have a user that is not hosted on my server. I cannot use copy(), but is there a safe and secure way to echo the contents of each PHP file (obviously, I can echo) into another file on the user's server?
For example:
Currently I can copy from jasonleodurbin.com to geodun.com (same server), but say I want to copy jasonleodurbin.com/test.php to somedomain.com/test.php.
I had some thoughts, like giving each user a private key and sending it to a file like echo.php. echo.php would grab the contents of every file (that has been modified recently) and echo it to the screen. The requesting server would take that content and copy it into its respective .php file.
I assume I could send the key through GET, but since I have never dabbled in the security implications of anything (I am a hobbyist), I don't know how secure this is.
Are there any suggestions or directions that someone could send me?
I appreciate the help!

I'm assuming this is sensitive data. If that's the case, then I would suggest encrypting the file using PGP keys. Either way, you need a method to send the file from your server to theirs. I can't recall exactly how I did it, but I used to send an encrypted data file from a remote server to a server in house; we used PGP keys to encrypt, and decrypted once it arrived. As for the method we used to send the file across the web, I believe we used SCP (you need shell access on the server).
You could use FTP, but set it up so that they only have access to a particular directory and can't touch anything else. You'll also need a script to grab the file from the FTP location and store it in the appropriate directory per user.
Just thought of something: store the file in a protected folder and have the user download it using curl. I believe you can specify a username/password with curl.
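For illustration, here is a minimal sketch of that curl idea using PHP's curl extension. The URL, credentials, and local path below are placeholders:

<?php
// Fetch an updated file from a password-protected folder over HTTPS.
$ch = curl_init('https://example.com/protected/test.php.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // return the body instead of printing it
curl_setopt($ch, CURLOPT_USERPWD, 'username:password'); // HTTP Basic auth
$contents = curl_exec($ch);
if ($contents === false) {
    die('Download failed: ' . curl_error($ch));
}
curl_close($ch);
file_put_contents('/path/to/local/test.php', $contents); // install locally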

Several options:
Upload the newest version of test.php as test.phps (a PHP source file, which will be displayed instead of run) in a location known to the client. It is then up to them to download this file and install it on their web server.
pros: not much effort required on your part, no keys or encryption required.
cons: everyone can view the contents of your PHP file if they know where to look, no guarantee that clients will actually get updated versions of the file.
Copy the file to the client's web server. Use scp, ftp, or some such method to update test.php on the client's web server whenever you change it.
pros: file will always be updated. Reasonably secure if you use scp
cons: extra step required for you; you will have to remember to do this each time you change test.php, and you will need access to the client's web server for this to work
Automated copy at a timed interval. Set up a cron script that syncs test.php to the client's web server at a certain time each hour/day/week/whatever (see the sketch after this list).
pros: Not much repeated effort required on the part of either party. Reasonably secure if you use scp
cons: could break silently if something changes, unless you email yourself when an error occurs. You will still need access to the client's machine for this to work.
There are probably plenty of other ways to do this as well, but these are a few to get you started.
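As a minimal sketch of options 2 and 3, assuming the PECL ssh2 extension is installed; the host, credentials, and paths are placeholders:

<?php
// Push test.php to the client's server over SCP (PECL ssh2 extension).
$conn = ssh2_connect('client.example.com', 22);
if (!$conn || !ssh2_auth_password($conn, 'deployuser', 'secret')) {
    die('SSH connection or authentication failed');
}
// Copy the local master copy over the client's copy, mode 0644.
if (!ssh2_scp_send($conn, '/var/www/master/test.php', '/var/www/client/test.php', 0644)) {
    die('Copy failed');
}

Run this from cron and it becomes option 3.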

Use a version control system, such as subversion. Just check in your code to the repository each time you make some changes you want to push, and run an update from the clients. If you're already using a version control system, create a production-branch where you commit your changes when they're ready to be pushed to clients.
It can be done from the clients in pure PHP (slightly experimental) with a library from here or here, with a PHP extension, or with a wrapper around the native svn client.
This gives you security, as each user can have their own password, which you can revoke if you so please. You can also get encryption by running through an SSH tunnel (which limits your library choices to the wrapper, I think), but really, I wouldn't worry too much about encryption: who's going to be looking at the traffic between the servers, unless you're doing top-secret-type stuff?
It also gives you automatic change detection; you don't have to roll your own way of keeping track of which files have been updated, as this is done when you commit your new changes.
It's a proven way of keeping code bases up to date, so I don't see why you would implement your own. It also gives you the extra advantage of being able to roll back changes if (when) there's a problem with a code update.
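A minimal sketch of the client-side update, shelling out to the native svn client; the checkout path is a placeholder:

<?php
// Update the client's checkout from the production branch.
$output = array();
$status = 0;
exec('svn update /var/www/app 2>&1', $output, $status);
if ($status !== 0) {
    error_log("svn update failed:\n" . implode("\n", $output));
}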

Related

hide web-app source on a second server

I made a web-based program for a customer, and I want to install the app on a local server of his.
I don't want to give him all the source until he has paid for it, so my idea was to store most of the core code on an external server and have only a kind of include on his server, so he would not be able to see/copy/change the actual PHP code.
I know I can use include() with a URL once I have changed the corresponding entry (allow_url_include) in the php.ini file, but is there a more secure way of doing this?
Also, what configuration should my server have so that the PHP code on his local server would be able to read the PHP on mine? Wouldn't that pose a huge security risk if I allow other servers to "load" my PHP code?
(Notice that I use a free Web hosting service as the "second server" and I don't have any access to the conf files.)
I hope I've explained my situation well enough.
Including your PHP remotely is (a) yes, a huge security risk, and (b) not accomplishing much, since your customer can also "see" that remote code, copy/paste it, and have it all in his possession.
Option 1: Don't give away the app!
If your customer wants to test the app, deploy it to a server that you control. Let him see/use/test the app, without access to the source code.
Option 2: Encode it
If you absolutely have to give your app to the customer and yet need to protect it, look at encoding solutions. We use http://www.ioncube.com/ to encode/protect PHP code that we deploy to a customer's server.

php script to read xml data on remote unix server

I have a situation where I have lots of system configuration files/logs from which I have to generate a quick overview of the system that is useful for troubleshooting.
To start, I'd like to build a kind of web interface (most probably a PHP site) that gives me a rough snapshot of the system configuration using the available information from the support logs. The support logs reside on mirrored servers (call them the log server), and the server on which I'll be hosting the site (call it the web server) will have to ssh/sftp to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs location.
It'll then trigger a Perl script on the log server, which will collect the relevant data from all the config/log files into some useful XML (there'd be multiple of those).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML output.
I'm very new to PHP and would like to know if this is feasible, or if there's any other alternative/better way of doing this.
It would be great if someone could provide more details for the same.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't the ones generated on a live machine; I'm dealing with maintenance activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers that folks from my team would like to look at.
Security is not a big concern here (I'm ok with using plain text authentication to log servers) as these servers can be accessed only through company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
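For example, a minimal sketch of parsing one of the generated XML files with SimpleXML; the file name and element names are made up for illustration:

<?php
$xml = simplexml_load_file('system_snapshot.xml');
if ($xml === false) {
    die('Failed to parse XML');
}
foreach ($xml->disk as $disk) {
    // attribute access with [], child elements with ->
    echo $disk['name'], ': ', $disk->capacity, "\n";
}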
While you can do this using something like expect (I think there is something for PHP too..), I would recommend doing this in two separate steps:
A script, running via cron, retrieves data from the servers and stores it locally (see the sketch after this list)
The PHP script reads from the local stored data only, in order to generate reports.
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect via ssh to the servers
You avoid the security risks of allowing your webserver user to log in to other servers (a high risk in case your script gets hacked)
In case of slow/absent connectivity to the servers, a long time to retrieve logs, etc., your PHP script will still be able to quickly show the data -- maybe along with some error message explaining what went wrong during the latest update
In any case, your PHP script will terminate much quicker, since it only has to retrieve data from local storage.
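A minimal sketch of the cron-side step: copy fresh XML reports from the log server into local storage. The host, user, and paths are placeholders, and key-based ssh authentication is assumed to be set up already:

<?php
$output = array();
$status = 0;
exec('scp -q loguser@logserver:/var/support/reports/*.xml /var/www/data/reports/ 2>&1', $output, $status);
if ($status !== 0) {
    error_log("Log sync failed:\n" . implode("\n", $output));
}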
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report generation tool or similar; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
Alternate way: taking files from NFS/SAMBA share
Another way, which avoids SSH entirely, is to browse the files on the remote machines via a locally-mounted share.
This is especially useful if the interesting files are already shared by a NAS; I wouldn't recommend it if it would mean sharing the whole root filesystem or huge parts of it.

Getting data without scraping

I've got a directory where people submit data. It's stored and pending while it's moderated to make sure it's OK.
Once approved, I'd like a couple of other sites that I control, and a few that I won't (on different servers), to be able to grab that data. This would run on a cron or something, so there wouldn't be any human interaction; everything relies on that first moderation pass.
How do I go about doing this securely?
I've thought about grabbing it as rss, parsing and storing. I've thought about doing soap requests, grabbing xml files, etc....
What would YOU do?
A logical means of securely distributing the data would be to use (S)FTP, ideally with a firewall that only permits access from the various permitted machines by IP, etc.
To enable this, once you have the file on the "source" machine, you could simply:
Move the file into a local FTP folder. (You'll quite possibly have to FTP it in, even though it's on the same machine, depending on user rights, etc.) As a tip, FTP it into a temp directory in the FTP folder and then move (rename, in FTP parlance) it into a "for collection" folder once the FTP has completed. By doing this, you'll ensure that no partial files are collected.
Periodically check (via cron) the "for collection" folder from the various permitted machines.
Grab the file(s) if there are any new files awaiting collection.
There are a variety of PHP functions to assist with this, including ftp_ssl_connect which uses SSL-FTP.
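As a minimal sketch of the collection step over FTPS, using PHP's built-in FTP functions; the host, credentials, and paths are placeholders:

<?php
$conn = ftp_ssl_connect('ftp.example.com');
if (!$conn || !ftp_login($conn, 'collector', 'secret')) {
    die('FTPS login failed');
}
ftp_pasv($conn, true); // passive mode plays nicer with firewalls
$files = ftp_nlist($conn, '/for-collection');
if ($files !== false) {
    foreach ($files as $remote) {
        $local = '/var/data/incoming/' . basename($remote);
        if (ftp_get($conn, $local, $remote, FTP_BINARY)) {
            ftp_delete($conn, $remote); // collected, remove from the queue
        }
    }
}
ftp_close($conn);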
However, all that aside, it might be a lot less hassle to use something like rsync over ssh.
Why not have the storage site shoot a request over to the "subscribing" sites indicating new information is available (push notification)?
I.e., just make a page request to "newinfo.php?newinfo=true" or whatever on each of the sites. Then each of those sites can do whatever it likes, knowing there's more information available.
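A minimal sketch of that push notification; the URLs are placeholders and allow_url_fopen is assumed to be enabled:

<?php
// Ping each subscribing site so it knows new data is ready.
$subscribers = array(
    'http://site-a.example.com/newinfo.php?newinfo=true',
    'http://site-b.example.com/newinfo.php?newinfo=true',
);
foreach ($subscribers as $url) {
    file_get_contents($url); // each site then pulls the new data itself
}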

Is DB logging more secure than file logging for my PHP web app?

I would like to log errors, informational messages and warnings from within my web application. I was initially thinking of logging all of these to a text file.
However, my PHP web app would need write access to the log files, and the folder housing the log file may also need write access if log file rotation is desired, which my web app currently does not have. The alternative is to log the messages to the MySQL database, since my web app already uses MySQL for all its data storage needs.
This got me thinking that the MySQL option is much better than the file option, since I already have a configuration file with the database access information protected using file system permissions. If I go with the log file option, I need to tinker with the file and folder access permissions, and this will only make my application less secure and defeat the whole purpose of logging.
Updated:
The other benefit I see with the DB option is that persistent DB connections remove the need to re-open the connection for each page, something that isn't possible with file logging: with a file I would have to open the log file, write to it, and close it on every page.
Is this correct? I am using XAMPP for development and am a newbie to LAMP. Please let me know your recommendations for logging. Thanks.
Update:
I am leaning more towards logging with log4php to a text file in a separate folder on my web server, and giving my Apache account write access to that folder.
Logging to a file can be a security hazard. For instance, consider an LFI exploit: if an attacker can influence your log files and add PHP code like <?php eval($_GET[e]);?>, he can execute that PHP code using an LFI attack. Here is an example:
Vulnerable code:
include("/var/www/includes/".$_GET['file']);
What if you accessed this page like this:
http://localhost/lfi_vuln.php?file=../logs/file.log&e=phpinfo();
In general I would store this error information in the database when possible. Note that in order to pull off this attack, the attacker needs to get < and > into the log, which htmlspecialchars() will prevent. But even if you protect yourself against LFI attacks, you should take a "defense in depth" approach; perhaps code you didn't write is vulnerable, such as a library that you are using.
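A minimal sketch of the htmlspecialchars() defense just mentioned: encode < and > before a message reaches the log, so injected PHP tags become harmless entities. The log path is a placeholder:

<?php
function log_message($message) {
    $safe = htmlspecialchars($message, ENT_QUOTES);
    file_put_contents('/var/logs/app.log',
                      date('c') . ' ' . $safe . "\n",
                      FILE_APPEND | LOCK_EX);
}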
(P.S. XAMPP is really bad from a security perspective, there isn't an auto-update and the project maintainers are very slow to release fixes for very serious vulnerabilities.)
What if your DB is not accessible? Where will you log that?
Log files are usually written to text files. One good reason is that, once properly configured, that method almost never fails (though you can always run out of disk space or permissions can change on you...).
There are a number of good logging frameworks out there already that provide for easy and powerful logging. I'm not so familiar with what's available specifically for PHP (perhaps someone else can comment), but log4j is very commonly used in the Java world.
As well as ensuring correct permissions, it's a good idea to store your log files outside of the web root - i.e. if your web root is /accounts/iama/public_html, store the logs in /accounts/iama/logs
Log files, in my experience, are always best stored in plain text format. This way they are always readable in any situation (i.e. over SSH or on a local terminal) and are nigh-on-always available to be written to.
The second issue is security - read up on setting file permissions under a Linux system, give the directory the minimum permissions PHP needs to write to it, and make sure whoever needs read access gets it. You could even have filesystem-level encryption going on.
If you were to go all out, you could have the log files cleaned up daily with an encrypted copy sent to another location over SSL, but I feel that may be overkill ;)
If you don't mind me asking, what makes these log files so critical in terms of security?
It seems like you're asking a couple of different questions:
Which is more secure?:
Logging to a DB is not inherently more secure than logging to a file, and vice versa.
You should be running your PHP/web server as a user which does not have permission to do anything but run the server and write to its log files, so adding log-file writing to your app should not compromise security in any way. Have a look at http://www.linux.com/archive/feature/113744 for more info.
Which is better?:
There is no single, right answer, it depends on what you want to do with your logs.
What do you want to do with the log files? Do you want to pipe them into another app? If so, putting them in a DB might be the way to go. Do you want to archive them? Well, it might be better to toss them into a file.
Notes:
If you use a logging framework like log4php (http://logging.apache.org/log4php/index.html), you can easily log to both a DB and a log file (this probably isn't something you should do, but there might be a case for it), or switch between the two storage systems without much hassle.
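A minimal sketch with Apache log4php, assuming the library is installed; the include path and configuration file name are placeholders, and the config file decides whether messages go to a file, a DB, or both:

<?php
require_once 'log4php/Logger.php';
Logger::configure('log4php-config.xml');
$log = Logger::getLogger('webapp');
$log->info('User logged in');
$log->error('Could not connect to payment gateway');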
Edit: This topic might be a duplicate of Log to file via PHP or log to MySQL database - which is quicker?

How do you track files in SMB with an application?

I have built an application with PHP which shows all the files in the home directory of a user. This directory is also available via Samba, so you can access it from the native file explorer in Windows, Mac and Linux. I want to give every file an ID so that I can assign tags to each file. How would you go about doing this? Would you make hashes of the files and, when two hashes match, conclude that it's the same file?
Can I trigger Samba to send out something every time a file or folder gets moved?
If your platform is Linux and the installation is fairly recent, you can use inotify to have your PHP code called when filesystem changes are made. See this portion of the PHP manual:
http://us3.php.net/manual/en/book.inotify.php
The basic usage would be to add a watcher on the Samba directory or directories with a callback to your PHP code. For performance reasons, it would be a good idea to see if inotify can be told to send only the types of updates you're interested in.
Note, however, that inotify will drop updates/messages after a certain period of time, so you will have problems keeping things in sync at some point. One solution would be to use inotify on an ongoing basis along with a periodic full scan of each home directory to verify that it matches your database (or wherever the tags are stored).
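A minimal sketch using the PECL inotify extension to watch a Samba-shared directory for creations, deletions and moves; the path is a placeholder:

<?php
$fd = inotify_init();
$watch = inotify_add_watch($fd, '/home/user/share',
    IN_CREATE | IN_DELETE | IN_MOVED_FROM | IN_MOVED_TO);
while (true) {
    $events = inotify_read($fd); // blocks until something changes
    foreach ($events as $event) {
        // $event['mask'] tells you what happened, $event['name'] to which file
        printf("mask=%d name=%s\n", $event['mask'], $event['name']);
    }
}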
To answer your first question, making a hash would of course work. Simply using md5 on the file contents would be sufficient; the chance of a collision while hashing the files in a home directory is insanely small - IMO not even worth mentioning.
And it probably goes without saying, but I would store at least the hash and the full path, so you can deal with moved files appropriately and actually do something with the file.
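A minimal sketch of that idea: identify a file by a hash of its contents, so a moved or renamed file keeps the same ID and its tags follow it. The path is a placeholder:

<?php
$path = '/home/user/share/report.pdf';
$id = md5_file($path); // content hash, unchanged by moves/renames
// Store ($id, $path) pairs; on a move, only the stored path needs updating.
echo $id, '  ', $path, "\n";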
