PHP script to read XML data on a remote Unix server

I have a situation where I have lots of system configuration files/logs from which I have to generate a quick review of the system that is useful for troubleshooting.
To start, I'd like to build some kind of web interface (most probably a PHP site) that gives a rough snapshot of the system configuration using the available information from the support logs. The support logs reside on mirrored servers (call them log servers), and the server on which I'll be hosting the site (call it the web server) will have to use ssh/sftp to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs' location.
It'll then trigger a Perl script on the log server, which will collect the relevant data from all the config/log files into some useful XML (there'd be multiple of those).
These XML files are then somehow transferred to the web server, and PHP will use them to create the HTML.
I'm very new to PHP and would like to know if this is feasible, or if there's an alternative/better way of doing this.
It would be great if someone could provide more details for the same.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't ones generated on a live machine. I'm dealing with sustenance activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers that folks from my team would like to have a look at.
Security is not a big concern here (I'm OK with using plain-text authentication to the log servers) as these servers can be accessed only through the company's VPN.

Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
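For instance, a minimal sketch (the file name and element names are invented for illustration):

<?php
// Load one of the generated XML files and pull a few fields out of it.
// All names here are hypothetical placeholders.
$xml = simplexml_load_file('system_config.xml');

echo "Hostname: " . $xml->host->name . "\n";

foreach ($xml->volumes->volume as $vol) {
    echo $vol['name'] . ": " . $vol->size . "\n";   // attribute + child element
}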

While you can do this using something like Expect (I think there is something for PHP too...), I would recommend doing this in two separate steps:
A script, running via cron, retrieves data from the servers and stores it locally (see the sketch after the list of benefits below)
The PHP script reads only from the locally stored data in order to generate reports
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect via SSH to the servers
You avoid the security risks related to allowing your web server user to log in to other servers (a high risk in case your script gets hacked)
In case of slow/absent connectivity to the servers, long log-retrieval times, etc., your PHP script will still be able to quickly show the data -- maybe along with an error message explaining what went wrong during the latest update
In any case, your PHP script will terminate much more quickly, since it only has to retrieve data from local storage.
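A minimal sketch of the first step, assuming key-based SSH authentication so no password prompt is needed (all hosts and paths are examples):

<?php
// fetch_logs.php -- run from cron, e.g.:
//   */15 * * * * php /opt/scripts/fetch_logs.php
// Pulls the generated XML files from the log server into a local cache
// that the web-facing PHP script reads from.
$local = '/var/www/logcache/';

exec('scp -q loguser@logserver:/support/logs/*.xml ' . escapeshellarg($local),
     $output, $rc);

// Record when (and how) the last update ran, so the web script can
// warn the user if the data is stale.
file_put_contents($local . 'last_update', date('c') . " rc=$rc\n");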
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report-generation tool or similar; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
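For reference, a sketch along the lines of the manual's usage examples, assuming the PECL expect extension is installed (host, user, password, and file path are placeholders):

<?php
// Run a command on the log server over SSH via the expect:// wrapper.
define('PASSWORD_PROMPT', 1);

ini_set('expect.timeout', 30);

$stream = fopen('expect://ssh loguser@logserver cat /support/logs/summary.xml', 'r');

$cases = array(
    array('password:', PASSWORD_PROMPT),
);

switch (expect_expectl($stream, $cases)) {
    case PASSWORD_PROMPT:
        fwrite($stream, "secret\n");   // plain-text auth, as noted above
        break;
    default:
        die("Error connecting to the remote host!\n");
}

// Everything after the prompt is the command's output.
while ($line = fgets($stream)) {
    echo $line;
}
fclose($stream);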
Alternative way: taking files from an NFS/Samba share
Another way, which avoids SSH entirely, is to browse the files on the remote machines via a locally mounted share.
This is especially useful when the interesting files are already shared by a NAS, though I wouldn't recommend it if that meant sharing the whole root filesystem or huge parts of it.
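With the share mounted, the web-server-side code stays trivial; a sketch (the mount point is made up):

<?php
// Read the support logs straight from the locally mounted NAS share.
foreach (glob('/mnt/supportlogs/*.xml') as $file) {
    $xml = simplexml_load_file($file);
    // ...render the interesting fields as HTML...
}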

Related

PHP displays as plaintext on web server?

I wrote an HTML page using XAMPP on my Windows computer, and everything works great locally. However, I want to host the content on a website, and I have no idea how to do it. Here it is as is, with the PHP code displayed as text instead of actually running.
Does github.io support running PHP at all? I read that it doesn't. If so, where would I be able to host my code so that my PHP and JavaScript could run when I point my browser to the webpage's URL? Also, XAMPP had linked it to a MySQL database, and I am unsure of how to set that up on a server as well.
These seem like simple questions for the beginning web developer, but I scoured Google and couldn't find an answer. Thank you.
To deploy PHP on a server you have to check:
Is PHP available?
Take care of pre tags
Check the version of PHP (the code may still run, with errors, if the PHP parser is on but the version differs)
Configure your wwwroot, httpdocs, or htdocs (the webpage's public folder)
Check your .htaccess if something is crashing
And not every server is Apache, not every server is Node, and so on. Just ask the hosting provider's support, because in some cases you have to "turn on" PHP with some fancy button.
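A quick way to verify the first points is to upload a one-line probe script and open it in the browser:

<?php
// probe.php -- if you see this source as plain text, PHP isn't enabled;
// if PHP runs, this prints the version and the full configuration.
phpinfo();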
Have a nice day!
No, GitHub Pages doesn't run PHP; it's only meant to host static pages. It does support Jekyll, a static site generator (in that it generates static pages once per push, which are then hosted as-is), but that's about all.
PHP/MySQL is only one of many possible sets of web application technologies, so you can't expect it to be everywhere web hosting is. It has to be either explicitly listed on the hosting service's website, or be available for installation in case you get yourself a full-fledged server machine (maybe virtual) to run your website.
Browser-based JavaScript will still be run by the client, since running it isn't the server's responsibility, just delivering it. So it can be hosted on GitHub Pages. Also, third-party services that don't depend on your own server's code execution are usable too: things like commenting systems, searches (you can even make a client-side one!), and analytics.

PHP/MySQL Performance Testing with Just PHP

I'm trying to diagnose a server where the website is loading very slowly, but unfortunately my client has only provided me with FTP access.
I've got FTP access so I can upload PHP scripts, but I can't set up any other server-side tools.
I have access to phpMyAdmin, but not direct access to the MySQL server. It is also, unfortunately, a Windows server (and we've been a Linux shop for over a decade now).
So, if I want to evaluate MySQL and disk-speed performance through PHP on a generic server, what is the best way to do this?
There are already tools like:
https://github.com/raphaelm/php-benchmark or https://github.com/InfinitySoft/php-benchmark
But I'm surprised there isn't something that someone has already set up & configured to just run through and do some basic testing of a server's responsiveness.
Every time we evaluate a new server environment, it's handy to be able to compare it to an existing one quickly to see if there are any anomalies. I guess I'd just hoped that someone else had written up a script to do this already. I know I have, but that was before GitHub, when there wasn't a handy place to post scraps of code like this.
You've probably already done this, but just in case... If I were in your shoes, the first things I'd look at are the indexes on the MySQL tables and the queries in the application. I've seen some sites get huge speed boosts just by fixing a join or adding a missing index.
Don't forget to check the code for performance issues or calls to sleep(). If you haven't yet, it may be helpful to get the code running locally so you can run it through Xdebug.
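If you still want raw numbers, a rough sketch of a benchmark you could upload over FTP (credentials and iteration counts are placeholders):

<?php
// Time a burst of trivial MySQL queries and small disk writes so two
// servers can be compared side by side. Not a rigorous benchmark.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

$start = microtime(true);
for ($i = 0; $i < 1000; $i++) {
    $pdo->query('SELECT 1')->fetchAll();
}
printf("1000 trivial queries: %.3f s\n", microtime(true) - $start);

$start = microtime(true);
$block = str_repeat('x', 8192);            // 8 KB per write
for ($i = 0; $i < 1000; $i++) {
    file_put_contents(__DIR__ . '/bench.tmp', $block);
}
unlink(__DIR__ . '/bench.tmp');
printf("1000 x 8 KB writes:  %.3f s\n", microtime(true) - $start);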

Full Oracle backup from PHP

I maintain a PHP driven web application with Oracle backend. The app interacts with a number of third-party apps so information is managed with a combination of XML files, Microsoft Access databases and HTML forms. There are currently 80 tables with many BLOBs and a pretty good bunch of foreign key relationships. All procedures are carefully explained in a document that (of course) nobody ever reads. The customer was feeling uneasy about his data so he was given an estimate with some improvements that could be made (stuff like adding previews and confirmations in some operations).
Sadly, the customer misinterpreted one of the specs (a partial export to be written in 12 man-hours) and he's expecting a full backup and restore feature that would allow him to save and restore the complete database through a web browser without the DBA intervention.
Before having yet another argument with the client, I'd like to know whether I have any option to actually implement this feature in a timely manner, considering that it doesn't need any refinements (e.g., there is no need to select what to restore).
Production server is a Windows Server 2003 box running PHP/5.2.9. The Oracle server is a remote box running "Oracle9i Release 9.2.0.1.0 - 64bit Production".
(Please note I'm not a DBA so there may be well-known solutions I'm not aware of.)
Oracle is a monster. Once you've read this you'll realise that how you back up the system depends totally on how it has been configured. The short answer is to automate whatever manual process you use - invoke it as a long-running process (since this is MS Windows, prefix the rman command with 'start'), then use polling to detect when it finishes (e.g. wrap rman in a DOS batch file which logs start and end times).
I'd be hard pushed to think of a more difficult problem to provide a generic solution for than Oracle running on top of MS Windows. The latter may be nice for users to click on buttons, but automating anything on it is a PITA.
Have fun :)
Finally, I had the chance to implement a full Oracle backup from PHP in a later project. I used the Oracle Data Pump command-line utilities, available since 10g. In short:
You define an Oracle directory to map a keyword to a physical directory and grant write permission to the app's Oracle user.
You run expdp with the appropriate arguments and get a complete dump in a single file.
To restore a backup, you run impdp.
It's also advisable to run the commands with proc_open() rather than system(), since you can use bypass_shell on Windows and have fine-grained control over the process.
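A sketch of that combination, with the credentials, directory object, and file names all made up:

<?php
// Launch a full Data Pump export and capture its progress output.
$cmd = 'expdp appuser/secret@ORCL FULL=Y DIRECTORY=dump_dir '
     . 'DUMPFILE=full.dmp LOGFILE=full.log';

$descriptors = array(
    0 => array('pipe', 'r'),   // stdin
    1 => array('pipe', 'w'),   // stdout
    2 => array('pipe', 'w'),   // stderr
);

// bypass_shell skips cmd.exe on Windows, avoiding quoting headaches.
$proc = proc_open($cmd, $descriptors, $pipes, null, null,
                  array('bypass_shell' => true));

if (is_resource($proc)) {
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]);   // expdp progress messages
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exit = proc_close($proc);
    echo "expdp finished with exit code $exit\n";
}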
As for this question, the pre-10g alternative is the "exp" / "imp" combo.

Access + MySQL converting to web platform = (PHP + ASP.NET + MySQL)?

I have a database that is written in Access. The Access .mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need to support multi-user access (no more than 10 users at a time).
I need to move this database from being a local one to a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0. I don't have experience putting a database on a web server.
I have an OK VB.NET background. I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
http://img42.imageshack.us/img42/1025/83882488.jpg
Which platform should I use as the front end for this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it? Is that not a smart idea?
Thank you for your help!
If your web server has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g., don't have your web server in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump and restoring it elsewhere.
As far as the frontend, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl, Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so).
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify they actually work.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML.
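As a very small illustration of the pattern (the table and column names are invented), the whole round trip fits in one self-posting script:

<?php
// save.php -- render a form, and on POST write the submitted value
// into MySQL with a prepared statement.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO records (name) VALUES (?)');
    $stmt->execute(array($_POST['name']));
    echo '<p>Saved.</p>';
}
?>
<form method="post" action="">
    <label>Name: <input type="text" name="name"></label>
    <input type="submit" value="Save">
</form>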
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a command-line program called mysqldump that allows you to export your database straight to a text file, which can then be imported on another server with the mysql client. I don't believe the WAMP server comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, then there is also an export feature that will generate a .sql file which can be imported on the web server using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: i.e., I've never been able to get a mysqldump-created file to work with phpMyAdmin and vice versa.
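Concretely, the round trip is just two commands (credentials and paths are made up; they are normally typed at a prompt, wrapped here in PHP's exec() as a sketch):

<?php
// Dump the local database to a file, then (after copying the file to
// the new server) load it there.
exec('mysqldump --user=root --password=secret mydb > mydb.sql');
// ...transfer mydb.sql to the web server, then on that machine:
exec('mysql --user=webuser --password=secret mydb < mydb.sql');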
Good luck!
The link will help you to export and import a MySQL database.
Maybe on a Windows web server there is a way to run Access files; you can check. But anyway, if you have some programming skills, I would say that it is not difficult to create a PHP script which will query your database info and edit it.
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, there is no easy way to replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.

Communication between PHP and application

I'm playing with an embedded Linux device and looking for a way to get my application code to communicate with a web interface. I need to show some status information from the application on the device's web interface, and I would also like to have a way to inform the application of any user actions, like uploaded files, etc. PHP seems to be a good way to make the interface, but the communication part is harder. I have found the following options, but I'm not sure which would be the easiest and most convenient to use.
Sockets. I'd have to enable sockets for PHP first to try this. I don't know if enabling them will take much more space.
Database. Seems like an overkill solution.
Shared file. Seems like a lot of work.
Named pipes. Tried this with some success, but I'm not sure if there will be problems with, for example, simultaneous page loads. Maybe sockets are easier?
What would be the best way to go? Is there something I'm totally missing? How is this done in those numerous commercial Linux based network switches?
I recently did something very similar using sockets, and it worked really well. I had a Java application that communicated with the device and listened on a server socket, and the PHP application was the client.
So in your case, the PHP client would initialize the connection, and then the server can reply with the status of the device.
There's plenty of tutorials on how to do client/server socket communication with most languages, so it shouldn't take too long to figure out.
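The PHP side can stay very small; a sketch with a made-up port and one-line protocol:

<?php
// Connect to the application's status socket and read one reply line.
$fp = fsockopen('127.0.0.1', 9000, $errno, $errstr, 5);
if (!$fp) {
    die("Connection failed: $errstr ($errno)\n");
}
fwrite($fp, "STATUS\n");   // ask the application for its status
echo fgets($fp);           // e.g. "OK uptime=1234\n"
fclose($fp);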
What kind of device is it?
If you work with something like a shared file, how will the device be updated?
How will named pipes run into concurrency problems that sockets will avoid?
In terms of communication from the device to PHP, a file seems perfect. PHP can use something basic like file_get_contents(), and the device can just write to the file. If you're worried about catching the file mid-update, do a quick length check.
In terms of PHP informing the device of what to do, I'm also leaning towards files. Have the device watch a directory, and have the script create a file there with something like file_put_contents($path . uniqid(), $command); That way, should two scripts run at the exact same time, you simply have two files for the device to work with.
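Both directions of that file-based exchange, sketched with example paths:

<?php
// Device -> PHP: the application keeps a status file up to date.
$status = file_get_contents('/var/run/mydevice/status');
echo htmlspecialchars($status);

// PHP -> device: drop a uniquely named command file into a directory
// the application watches; simultaneous requests just create two files.
$spool = '/var/spool/mydevice/';
file_put_contents($spool . uniqid('cmd_', true), "reload-config\n");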
Embedded Linux boxes for routing with a web interface don't use PHP. They use CGI and have shell scripts deliver the web pages.
For getting information from the application to the web interface, the Shared file option seems most reasonable to me. The application can just write information into the file which is read by PHP.
The other way round doesn't look so good at first. PHP supports locking of files, but most probably it doesn't work at a system level. Perhaps one solution is for every PHP script which has information for the application to create its own file (with a unique filename, e.g. based on a timestamp plus a random value). The application could watch a designated directory for these files to pop up and, after processing them, just delete them. For that, the application only needs write permission on the directory (so file ownership is not an issue).
If possible, use shell scripts.
I did something similar: I wrote a video surveillance application. The video part is handled by motion (a great FOSS package). The application is a turn-key solution on standardized hardware, used to monitor slot-machine casinos. It serves as a kiosk system locally and is accessible via the internet. I wrote all the UI code in PHP; the local display is a tightly locked-down KDE desktop with a full-screen browser defaulting to localhost. I used shell scripts to interact with motion and the OS.
On a second thought:
If you can use self-compiled applications on the device: write a simple program that returns the value you want, and use PHP's exec(), passthru(), or system().
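For example, assuming a hypothetical helper binary:

<?php
// Run a small self-compiled helper and embed its output in the page.
// /usr/local/bin/device_status is a made-up name for illustration.
exec('/usr/local/bin/device_status', $output, $rc);
if ($rc === 0) {
    echo htmlspecialchars(implode("\n", $output));
} else {
    echo 'Status unavailable';
}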
