PHP creating .nfs00000 files?

I have a PHP app that works fine for me, on both a test system and a production system.
But another user of my app wrote to me that it creates a lot of .nfs00000* files on his system and slows down page loading.
My app does not create any files on the filesystem; all data is stored in MySQL. So I was really surprised by this. But that user removed my PHP app from his website and the problem disappeared.
I will be honest -- I know nothing about .nfs00000* files and I was not able to find anything useful about them by googling. Can someone please explain what they are, why they are created, and whether I can do anything to avoid creating them?
Thanks, Honza

Maybe this can help:
Under Linux/Unix, if you remove a file that a currently running process still has open, the file isn't really removed. Once the process closes the file, the OS removes the file handle and frees up the disk blocks. This process is complicated slightly when the file that is open and removed is on an NFS-mounted filesystem. Since the process that has the file open is running on one machine (such as a workstation in your office or lab) and the files are on the file server, there has to be some way for the two machines to communicate information about this file. The way NFS does this is with the .nfsNNNN files. If you try to remove one of these files while the original file is still open, it will just reappear with a different number. So, in order to remove the file completely you must kill the process that has it open.
If you want to know what process has this file open, you can use 'lsof .nfs1234'. Note, however, that this will only work on the machine where the process that has the file open is running. So, if your process is running on one machine (e.g. bobac) and you run lsof on some other machine (e.g. silo or prairiedog), you won't see anything.
(Source)
If your app deletes or modifies files while they are still held open, that could be the cause of the problem.
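To make the mechanism concrete, here is a minimal sketch (not taken from the app in question) of the kind of pattern that leaves .nfsNNNN files behind when the directory lives on an NFS mount; the file name and path are placeholders:
    <?php
    // Illustrative sketch only: the file is unlinked while a handle is still
    // open, so the NFS client renames it to .nfsNNNN until the handle closes.
    $tmp = __DIR__ . '/cache.tmp';   // placeholder path on the NFS share
    $fh  = fopen($tmp, 'w');
    fwrite($fh, 'some cached data');

    unlink($tmp);   // the file is "deleted", but $fh is still open
    sleep(60);      // while this runs, an .nfsNNNN file exists in the directory

    fclose($fh);    // only now does the NFS client remove the .nfsNNNN file
Running lsof on one of the .nfsNNNN files (on the machine that actually holds the handle) should point to the Apache/PHP process responsible.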

Related

Sync phpMyAdmin DB's Across Desktops

So I just set up my XAMPP Apache server to load all the documents I create on my Google Drive. For example, if I type 127.0.0.1, it shows me all the web files on my Google Drive. I set this up so I can develop across my laptop, which I use at school, and my desktop, which I use at home, without having to copy files back and forth between computers. This works the way I want it to, but I forgot one thing: how am I supposed to sync the databases that I create? My question is: how can I sync my databases to the cloud or somewhere else so I don't have to export and import every time I switch devices?
Also, I would like to stay away from using hosting, as I won't be online all the time.
The database server (the application itself) expects exclusive access to the data files. If you try to synchronize a data file between two systems, you're going to have issues and probably data loss.
What you could do is synchronize the data directory and make sure you're only running one server at a time. So when you're done working on the laptop, shut down the MySQL server process/service (mysqld), wait for it to finish synchronizing, and then start up the mysqld on the desktop. I suspect this will work, but it's a pretty non-standard usage so anything could happen.
To make it easier, I'd definitely consider writing a wrapper script/batch file that first tests for the presence of a lock file, then (if none exists) creates one and starts mysqld, and on exit makes sure mysqld is stopped before deleting the lock file.
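As a rough illustration of that wrapper idea, here is a sketch in PHP (it could just as well be a batch file); the data-directory path and the mysqld invocation are assumptions and would need to match your XAMPP layout:
    <?php
    // Sketch of a lock-file wrapper around mysqld. The lock lives inside the
    // synced data directory so the other machine can see it. The check is not
    // atomic -- good enough for one person switching machines.
    $dataDir = 'G:/GoogleDrive/mysql-data';          // assumed synced path
    $lock    = $dataDir . '/mysqld.lock';

    if (file_exists($lock)) {
        exit("MySQL seems to be running elsewhere (lock file present).\n");
    }
    file_put_contents($lock, gethostname() . ' ' . date('c'));

    // Remove the lock even if this script is interrupted.
    register_shutdown_function(function () use ($lock) {
        @unlink($lock);
    });

    // Run mysqld in the foreground; when it exits, the shutdown handler fires.
    passthru('mysqld --datadir=' . escapeshellarg($dataDir));
You would run this wrapper instead of starting mysqld directly from xampp-control.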
Anyway, to make this happen you would first stop mysqld everywhere, take the one mysql data directory that you wish to use, copy it to your Google Drive, then edit all of your MySQL configuration files to point to the new data directory instead of the old one. Whether XAMPP makes this more difficult than it should be, I'm not sure, but with stock MySQL it should be pretty trivial.
Remember that just because it's possible doesn't make it a good idea, and likewise that just because it's not a good idea doesn't mean it won't work. So I'm saying it's not a good idea to do this, but if done with proper attention it will "probably" work.
Hope that helps.

Keep Uploaded Files in Sync Across Multiple Servers - PHP Linux

I have a website that currently uses two servers, an application server and a database server. However, the load on the application server is increasing, so we are going to add a second application server.
The problem I have is that the website has users upload files to the server. How do I get the uploaded files onto both servers?
I do not want to store images directly in the database, as our application is database-intensive already.
Is there a way to sync the files across the servers, or is there something else I can do?
Any help would be appreciated.
Thanks
EDIT: I am adding the following links for people that helped me understand this question more:
Synchronize Files on Multiple Servers
and
Keep Uploaded Files in Sync Across Multiple Servers - LAMP
For anyone reading this post: NFS seems to be the better of the two.
NFS will keep files in sync, but you could also use FTP to upload the files to all servers; still, NFS looks like the way to go.
This is a question for Server Fault.
Anyway, I think you should definitely consider getting into the "cloud".
Syncing uploads from one server to another is simply unreliable - you have no idea what kind of errors you can get or why you get them. The syncing process will also put load on both servers. For me, the proper solution is going to the cloud.
Should you choose the syncing method, you have a couple of options:
Use rsync to sync the files you need between the servers.
Use crontab to sync the files every X minutes/hours/days.
Copy the files upon some event (user login, a new upload, etc.); a sketch of this option follows below.
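For the event-based option, here is a minimal sketch; the hostname, the paths, and the form field name are placeholders, and it assumes key-based SSH auth so rsync can run non-interactively:
    <?php
    // Sketch: push a freshly uploaded file to the second application server
    // right after it lands on this one.
    $uploadDir = '/var/www/uploads/';
    $target    = $uploadDir . basename($_FILES['upload']['name']);

    if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
        $cmd = sprintf(
            'rsync -az %s appserver2:%s',
            escapeshellarg($target),
            escapeshellarg($uploadDir)
        );
        exec($cmd, $output, $status);   // non-zero status means the sync failed
        if ($status !== 0) {
            error_log('Upload sync failed: ' . implode("\n", $output));
        }
    }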
I got this answer from Server Fault:
The most appropriate course of action in a situation like this is to break the file share into a separate service of its own. Don't duplicate files if you have a network that can let the files be "everywhere (almost) at once." You can do this through NFS/CIFS or through a proper storage protocol like iSCSI. Mount as local storage in the appropriate directory. Depending on the performance of your network and your storage needs, this could add a couple of undetectable milliseconds to page load time.
So using NFS to share server files would work, OR,
as stated by @kgb, you could designate one single server to hold all uploaded files and have the other servers pull from it (just make sure you run a cron job or something to back up the files).
Most sites solve this problem by using a 3rd party designated file server like Amazon S3 for the user uploads.
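If you go the S3 route, here is a hedged sketch of the upload handler using the AWS SDK for PHP; the bucket name, region, and form field name are placeholders, and credentials are assumed to come from the environment or an instance profile:
    <?php
    // Sketch: store user uploads in S3 instead of on the local filesystem,
    // so every application server sees the same files.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',          // placeholder region
    ]);

    $s3->putObject([
        'Bucket'     => 'my-user-uploads', // placeholder bucket name
        'Key'        => 'uploads/' . basename($_FILES['upload']['name']),
        'SourceFile' => $_FILES['upload']['tmp_name'],
    ]);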
Another answer could be to use a piece of software called BTSync; it is very easy to install and use, and it lets you easily keep files in sync across as many servers as you need. It takes only 3 terminal commands to install and is very efficient.
You could use the DB server for storage... not in the database, I mean; run a web server there too. It is not going to increase CPU load much, but it is going to require more bandwidth.
You could do it with rsync. People have suggested using NFS, but that way you create a single point of failure: if the NFS server goes down, both of your servers are stuck. Correct me if I'm wrong.

Serve a PHP website with PHP files being remote

This is the situation:
I have a LAMP server, which serves HTML, PHP, etc. Now I have a remote folder, somewhere on the web, which holds a directory full of PHP files, images, an MVC folder structure (CodeIgniter), etc.
What I want to do is, instead of downloading those PHP files and uploading them to my LAMP server every time, use them directly and serve them from my LAMP server.
Again, I want the PHP files that live in a folder on another server (where I only have access to a direct link to each individual file) to be served by my LAMP server, so that when I access my website, for instance www.website.com/page1, it picks up the folder structure and all the PHP files from the remote web server and serves them from within my server.
I know this sounds a little complicated, but I'm not sure what to use... Maybe a reverse proxy? Should I download the files directly and keep syncing them? If anyone comes up with a good solution I may even pay that person...
EDIT(1)
Good answers so far... but I think I did not phrase the question well, so here it goes again:
I have access to a "list" of PHP files, and in order to get it I need to authenticate myself using OAuth via PHP. Once authenticated, I can retrieve a list of PHP, HTML, etc. files, each of them having a public URL that anyone can access. The thing is that instead of downloading all the files in that repository and serving them, I want to reuse that repository's web space and just serve those files myself. So basically I want something like symbolic links to URLs, which I think is not possible, or a way to just read the files and run the PHP logic even though the files are elsewhere.
I'm concerned about the security issues involved, but if someone could help me I would be thankful... Also, if you are interested in what I'm doing, I can always use a partner for this project, which I intend to use for charity, but I can still pay that person.
This is not a smart thing to do. You open yourself up to potential security issues, but at a minimum, you will significantly slow your site down.
I would recommend that you simply synchronize the files between the two servers over SSH with a script.
Edit: ManseUK's suggestion of rsync is also a good one.
If you have FTP access to the remote server, you could mount the folder using FUSE and serve it as usual through Apache.
Do you have the ability to mount the remote folder as an NFS volume, or perhaps with SSHFS? If those options are available, either could work for you. You'd mount the remote folder locally and tell your local web server to serve files from that path.
Not that it would be the most efficient setup in the world, but I don't know why you have all this split apart in the first place. ;)
You could write a cron job to grab the remote file list every X minutes/hours/days, store the results locally, and then write a simple script to parse those results upon request. Alternatively, you could still use an NFS or SSHFS mount to read the remote paths in real time and build whatever URLs you need.
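A rough sketch of that cron-driven approach in PHP; the endpoint, the token handling, and the JSON shape are all assumptions, since the question only says a file list is available after an OAuth handshake and that each file has a public URL:
    <?php
    // Sketch: run from cron, fetch the remote file list, and mirror each file
    // into a local cache directory that Apache serves. Requires allow_url_fopen
    // for the copy() call; URLs and paths are placeholders.
    $token    = getenv('REMOTE_API_TOKEN');               // assumed: obtained via OAuth elsewhere
    $listUrl  = 'https://remote.example.com/api/files';   // hypothetical endpoint
    $cacheDir = '/var/www/remote-cache';

    $ch = curl_init($listUrl);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $token],
    ]);
    $files = json_decode(curl_exec($ch), true);
    curl_close($ch);

    if (!is_array($files)) {
        exit("Could not fetch the file list\n");
    }

    foreach ($files as $file) {                           // assumed shape: [['url' => ..., 'path' => ...], ...]
        $target = $cacheDir . '/' . $file['path'];
        if (!is_dir(dirname($target))) {
            mkdir(dirname($target), 0755, true);
        }
        copy($file['url'], $target);                      // each file has a public URL, per the question
    }
Apache then serves /var/www/remote-cache like any other local document root.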

Calling Excel from PHP 5 through COM fails on Windows 7 when Apache started through Task Planner

Hey folks, this question can't be too complicated. Please provide a solution, or at least a way to figure out the ultimate root cause of the problem.
I'm currently writing an application that controls Excel through COM: the app creates a COM-based Excel instance, opens some XLS files, and reads their contents.
Scenario I
On Windows 7, I start Apache and MySQL using xampp-control with system administrator rights. All works as expected. The PHP-based controller script interacts with Excel as expected.
Scenario II
A problem appears if I start Apache and MySQL as "background jobs". Here is how:
I created two jobs using Windows 7 Task Planner. One runs apache_start.bat, the other runs mysql_start.bat.
Both tasks run as SYSTEM with elevated privileges when Windows 7 boots.
Apache and MySQL work as expected. Specifically, Apache serves HTTP requests from clients and PHP is able to talk to MySQL.
But when I call the PHP controller, which calls and interacts with Excel using COM, I receive an error.
The error message comes from Excel [not COM itself] and reads like this:
Excel can't read the specified Excel-file
Excel failed to save the file due to an ill-named worksheet
Interestingly, during the first run of the PHP-based controller script it takes a few seconds to render the error message. Each subsequent run renders the error message immediately.
Windows system logs didn't show a single problem report entry.
Note, that the PHP program and the Apache instance didn't change - except the way Apache was started.
At least the PHP controller script is perfectly able to read the filesystem, since it obtains the paths to the XLS files via scandir() of a certain directory.
Concurrency issues can't be the cause of the problem. A single instance of the specific PHP controller interacts with Excel.
Question
Could someone explain why this happens? Or provide ways to isolate the ultimate cause of the problem (e.g. by means of a PowerShell 2 script)?
UPDATE-1 :: 2011-11-29
As proposed, I switched the Task Planner job from SYSTEM to a conventional user. It works: Apache and MySQL get started and process requests.
Unfortunately, the situation regarding Excel didn't change a bit. I still see the error.
As assumed earlier, the EXCEL COM server starts. I'm able to change various settings (e.g. suppress dialogs) without a problem through the COM-instance.
The problem happens while calling this:
$excelComObject->Workbooks->Open( 'PathToXLSFile' );
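For reference, here is a small sketch (not the asker's actual code; the path is a placeholder) of how the underlying COM error can be surfaced instead of Excel's generic message, since com_exception carries the error code of the failed call:
    <?php
    // Sketch: wrap the COM calls so the real COM error code and message get
    // logged instead of only Excel's generic dialog text.
    try {
        $excel = new COM('Excel.Application');
        $excel->DisplayAlerts = false;                                 // suppress Excel dialogs
        $workbook = $excel->Workbooks->Open('C:\\path\\to\\file.xls'); // placeholder path
    } catch (com_exception $e) {
        error_log('Excel COM failure: ' . $e->getMessage() . ' (code ' . $e->getCode() . ')');
        throw $e;
    }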
UPDATE-2 :: 2011-11-30
Added the accounts USER, GUEST and EVERYONE with read permission to the access control list of the XLS file. No change.
Modified the app so that the PHP part creates a copy of the XLS file as a temporary file and copies the contents of the original file into it, just to ensure that the problem isn't caused by odd file/path names.
Still, the problem persists.
UPDATE-3 :: 2011-12-05
I'm going to drive the Excel COM server methods in such a way that Excel creates a blank file and saves it to /tmp. Let's see whether Excel is unable to read even that file.
Go into the Task Planner and let everything run as a local user. This will probably require that you enter a password, so create one if you don't have one already.
Excel is a user-level application that shouldn't run as SYSTEM. I'm sure there are ways around it, but you should simply let everything run at the correct level.
Having Apache run on the user level isn't a problem.
Try creating the following directories:
C:\Windows\SysWOW64\Config\Systemprofile\Desktop
C:\Windows\System32\Config\Systemprofile\Desktop
It worked for me :-)
http://social.msdn.microsoft.com/Forums/en-US/innovateonoffice/thread/b81a3c4e-62db-488b-af06-44421818ef91
In the past (read: pre Vista) services had an option called "Allow Service to interact with desktop" which allowed services to spawn windows etc. Starting Vista, this is no longer allowed.
I suspect Excel is failing because it can't function under this restriction. Thus, any attempt to run it as a service in your Win7 installation will fail.
You could go back to Windows XP and allow desktop interaction for your Apache process, which I don't really recommend for obvious reasons.
Another approach I would take is to create a PHP script that runs as a regular process and listens on a socket in an infinite loop. Your PHP script that runs under Apache would communicate with the secondary script through the local socket and have the secondary script spawn Excel.
This may sound complicated, but in fact it's not a lot of code, and it fixes a problem you will soon have anyway: you should only have one instance of Excel running, or you may run into problems. The secondary script could queue requests, handing them off to Excel one by one and then taking the next in the queue.
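A sketch of what that secondary script might look like; the port, the one-line-per-request protocol, and the workbook handling are all assumptions, and it needs PHP's com_dotnet extension and must be started as a normally logged-in user:
    <?php
    // Sketch: a long-running CLI worker that owns the single Excel COM
    // instance and serializes requests coming from the web-facing PHP code.
    $server = stream_socket_server('tcp://127.0.0.1:9100', $errno, $errstr);
    if (!$server) {
        die("Cannot listen: $errstr ($errno)\n");
    }

    $excel = new COM('Excel.Application');   // created once, reused for every request
    $excel->DisplayAlerts = false;

    while ($client = stream_socket_accept($server, -1)) {
        $path = trim(fgets($client));        // protocol assumption: one file path per line
        try {
            $workbook = $excel->Workbooks->Open($path);
            $value    = $workbook->Worksheets(1)->Cells(1, 1)->Value;  // example read
            $workbook->Close(false);
            fwrite($client, "OK " . $value . "\n");
        } catch (com_exception $e) {
            fwrite($client, "ERR " . $e->getMessage() . "\n");
        }
        fclose($client);
    }
The script running under Apache would then just fsockopen('127.0.0.1', 9100), write the XLS path, and read back the reply.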

How to optimize upgrading web application?

I maintain a custom PHP application (built for me) that is hosted on a web server. Sometimes I add new features or fix bugs, and after testing locally I upload the changes to the web server. It's not a critical application (it's a game), but most of the time there are some people connected.
The steps that I make to upgrade the application:
Access via FTP (Filezilla)
Upload a .htaccess file that redirects everyone (except my IP) to a mantain.html file
Check that access is denied for every IP except mine.
Backup old code
Upload new code
Go to phpMyAdmin
Backup DB
Execute scripts for the DB
Test that all works fine (if not -> revert the backups)
Remove the .htaccess file
I usually spend an average of 30 minutes doing these steps, and I'm wondering if there is any way to optimize or automate them, or otherwise spend less time. I also know that if I can automate some steps, they will be less error-prone.
Several other answers suggest PHP-specific deployment tools, but being as I'm not very familiar with PHP, I'll offer some general tips. These suggestions may be redundant by some of the other tools already suggested, though.
First off, don't upload a new .htaccess file every time--just keep two of them on your server. Perhaps call them .htaccess-permanent and .htaccess-maintenance. Then create a symlink to the one that ought to be active. Once you've tested that access is properly denied, you don't have to repeat that manual testing phase every single time you do an upgrade.
I'd also write a shell script to do most everything for me. My new work flow would look like this:
Upload new code to server in a directory called new/
Log in to the server via shell, and execute the upgrade script
Test the new site
Run upgrade-finalize
The end.
Now for the interesting part, the upgrade script will do this:
It will delete the .htaccess symlink and re-create it, pointing to .htaccess-maintenance.
It will copy the current code in current/ to backup/
It will back up the DB, using the exact same commands that phpMyAdmin uses
It will move the contents of new/ (which you just uploaded) to current/
It will execute the scripts for the DB
And the upgrade-finalize script will simply:
Delete the .htaccess symlink, and re-create it, pointing to .htaccess-permanent once again
The only possibly tricky part here will be getting the exact commands that phpMyAdmin uses to back up your database, but it's probably a simple mysqldump command, and you can probably get that info from phpMyAdmin, some logs, or similar. Sorry, I don't know enough about phpMyAdmin to help in this specific area.
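To make this more concrete, here is a sketch of what the upgrade script might look like, written as a PHP CLI script since the app is PHP; all paths, the database name, and the credentials are placeholders, and the migrations directory is an assumption about where the DB scripts are uploaded:
    <?php
    // Sketch of the upgrade script described above. Run it from a shell on
    // the server after uploading the new release into new/.
    $root   = '/var/www/mygame';               // placeholder application root
    $dbUser = 'dbuser';                        // placeholder credentials
    $dbPass = 'dbpass';
    $dbName = 'gamedb';
    $stamp  = date('Ymd-His');

    // 1. Point the .htaccess symlink at the maintenance version.
    @unlink("$root/current/.htaccess");
    symlink("$root/.htaccess-maintenance", "$root/current/.htaccess");

    // 2. Back up the current code.
    exec(sprintf('cp -a %s %s',
        escapeshellarg("$root/current"), escapeshellarg("$root/backup-$stamp")));

    // 3. Back up the database (the equivalent of a phpMyAdmin export).
    exec(sprintf('mysqldump -u%s -p%s %s > %s',
        $dbUser, $dbPass, $dbName, escapeshellarg("$root/db-backup-$stamp.sql")));

    // 4. Move the freshly uploaded code into place, keeping the maintenance
    //    symlink untouched.
    exec(sprintf('rsync -a --delete --exclude=.htaccess %s %s',
        escapeshellarg("$root/new/"), escapeshellarg("$root/current/")));

    // 5. Run the DB scripts shipped with the release (assumed location).
    foreach (glob("$root/current/migrations/*.sql") as $migration) {
        exec(sprintf('mysql -u%s -p%s %s < %s',
            $dbUser, $dbPass, $dbName, escapeshellarg($migration)));
    }
The upgrade-finalize step would then just delete the symlink and re-create it pointing at .htaccess-permanent.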
Look into a deployment tool like Capistrano that allows you to automate those steps.
I usually spend an average of 30 minutes doing these steps, and I'm wondering if there is any way to optimize or automate them, or otherwise spend less time.
There are many ways. For starters, steps one through eight can be done in a single shell script. You could check out Phing, an automated deployment/build system. Also, you might want to delve into continuous integration for even more control over how and when the software gets deployed.
Doing this manually is, like you say, asking for trouble.
For starters, you could upload your files into a new webroot and, when done, switch over the DocumentRoot in Apache, leaving the site available during the copy process. For any shared files you could use a symlink to a common folder (e.g. uploaded images, etc.).
You could probably take the backup during operation as well, if you don't care about consistency in the database. For migrations that don't "break" functionality, you could also migrate and test on your new webroot under another hostname, if consistency isn't a problem.
The best option is always to use multiple web servers so that you can take one offline for testing while the other is operational, but you will still have problems with consistency; however, I assume that is not an option, since you don't mention it.
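One common way to implement that webroot switch without editing the Apache config on every release (a variant of the answer above, not literally what it describes) is to point DocumentRoot once at a stable symlink and then swap the symlink atomically; a sketch with placeholder paths:
    <?php
    // Sketch: atomic release switch via a symlink that DocumentRoot points at.
    // Assumes /var/www/current is already a symlink and everything lives on
    // the same filesystem, so rename() is atomic.
    $releases = '/var/www/releases';
    $shared   = '/var/www/shared';
    $current  = '/var/www/current';            // DocumentRoot points here

    $newRelease = $releases . '/' . date('YmdHis');
    // (the new code is assumed to have been uploaded/extracted into $newRelease)

    // Shared uploaded files live outside the release and are symlinked in.
    symlink($shared . '/uploads', $newRelease . '/uploads');

    // Build the new symlink next to the old one, then rename over it, so
    // requests see either the old release or the new one, never a broken path.
    $tmpLink = $current . '.new';
    @unlink($tmpLink);
    symlink($newRelease, $tmpLink);
    rename($tmpLink, $current);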
