I'm using a VPS with CentOS 6.5. I need to create a PHP script that checks whether a file was uploaded to my home/user/public_html/uploads directory, takes its name, and saves it in a DB (I can do this part). I do not know when files will be uploaded to that directory; it could happen at any time. So I'm wondering whether there is a PHP (or any other Linux) way to monitor the uploads folder for any file created inside it and then run the PHP script. I found some articles on Google about things like inotify and auditd, but there are no clear tutorials on how to do that. Can you help me, please?
Thanks
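If the PECL inotify extension can be installed on the VPS (pecl install inotify), one option is a small long-running PHP watcher started from the shell. This is only a sketch under that assumption; the directory path, the event mask, and the database insert are placeholders:

<?php
// Watch the uploads directory using the PECL inotify extension.
// IN_CLOSE_WRITE fires when an uploaded file has been fully written,
// IN_MOVED_TO when a file is moved into the directory.
$dir = '/home/user/public_html/uploads';

$fd = inotify_init();
inotify_add_watch($fd, $dir, IN_CLOSE_WRITE | IN_MOVED_TO);

while (true) {
    // Blocks until at least one event is available.
    $events = inotify_read($fd);
    foreach ($events as $event) {
        if ($event['name'] !== '') {
            $file = $event['name'];
            // ... insert $file into the database here ...
            echo "New upload: $file\n";
        }
    }
}

Run it in the background (for example with nohup, or under a process supervisor) so it keeps running. A cron-driven script that simply scans the directory for files it hasn't seen yet is a simpler, extension-free alternative.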
I created a website using Laravel.
I uploaded the site to my shared server, and the site was running fine.
After that I had to add a library called Intervention. Locally, I used this command and everything worked fine:
php composer.phar require intervention/image
I made changes to the files and everything works fine locally.
Now I need to upload the updated files. Rather than deleting the whole online folder and re-uploading it, I just want to upload the changed files (I know which controller and view files were changed).
But as it is a shared server, there is no way or place to execute the composer update command or fetch that one library (so now I am getting an Intervention Image class not found error).
So, what would be the ideal thing to do in this case?
The easiest option may be to delete the whole folder and upload it again, but the files are almost 500 MB, so I do not want to do that.
Any suggestions for handling this situation (updating Composer libraries after deployment)?
Not really a PHP or Laravel question, but if you're using FTP to upload, there's usually an option to upload only changed files.
For example, in FileZilla you can set the override option in its transfer settings.
I am about to upload a website to a server, but before I upload it I want to run it on my localhost (I'm using XAMPP).
I was given an archive containing the website files; based on the structure of the files I realized TYPO3 is used.
Since I had never used TYPO3 before, I tried installing it by following the steps described here.
After the installation I ran the sample site, and so far I have no problem running it in my web browser.
But when I try to run the website that I need to upload, I get an error.
Here are the files inside the archive.
Here is what I get after extracting it. As you can see, I wasn't able to extract typo3_src, which is the target of index.php, t3lib, and typo3, all of which are symlinks.
Also, typo3 is not a folder, unlike in the sample I ran.
I'm not sure whether the archive is broken/corrupted or whether I need to do something first before it will work. Can someone please help me with this issue?
You do not need to deal with those Unix symlinks at all.
Just figure out, or ask the person who provided you with that system, which version of TYPO3 you need to use.
Then go to https://sourceforge.net/projects/typo3/files/TYPO3%20Source%20and%20Dummy/ and download the appropriate zip package.
Decompress the package, delete all symlinks (typo3, t3lib, index.php) and just copy the corresponding resources from the package.
You then need to deal with the database and database connection settings in typo3conf/localconf.php or typo3conf/LocalConfiguration.php.
If it all works, you can then upload it like this to the server without using any symlinks.
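For older TYPO3 4.x installations, the database settings in typo3conf/localconf.php look roughly like this (the values are placeholders; in 6.x and later the equivalent settings live in the array returned by typo3conf/LocalConfiguration.php):

<?php
// typo3conf/localconf.php (TYPO3 4.x) -- replace with your own credentials
$typo_db_host     = 'localhost';
$typo_db_username = 'db_user';
$typo_db_password = 'db_password';
$typo_db          = 'typo3_database';
?>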
I've been using gulp for front-end development lately and I find it very helpful.
I use XAMPP on Windows for making PHP websites, some of which sometimes include database operations.
Now, I have used gulp-livereload and gulp-connect to start a server for the front end, but that server won't process the PHP files.
All I want is PHP livereload with database access: livereloading, but served via XAMPP's server (since it can process PHP).
Yes, I have found a temporary solution for now.
It's just temporary, but as long as you write correct PHP, you will get the work done. Let me tell you how I do it.
Though right off the start I want to tell you that this is not the proper way to approach this; you should use some MVC framework.
Anyway,
I have XAMPP installed under Windows 8.1, and XAMPP's htdocs is on my C: drive.
I then create a folder inside "htdocs" in which I put all my front-end + back-end code. Basically, gulpfile.js is at the root, there is a components folder in which all the front-end sources reside, and there is another folder at the root called "www" in which I put my index.php.
Then I load gulp-livereload instead of gulp-connect, call the livereload.listen(); method, and add the livereload tags to all the PHP files I want to reload.
It works fine, except when PHP throws an error: until you correct that error, you have to keep reloading the page manually.
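The "tags" I mean are the livereload client script. Assuming gulp-livereload's default port of 35729, each PHP page gets something like this before the closing body tag ($isDev here is just a hypothetical flag so the tag only appears during development):

<?php if ($isDev): ?>
    <!-- livereload client served by gulp-livereload (default port 35729) -->
    <script src="http://localhost:35729/livereload.js"></script>
<?php endif; ?>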
I'm doing a PHP project. Everything was fine doing it on an XP web server, where I'm using IIS, but now I have set up a new PC with Windows 7 and IIS.
The problem is, when I copy my PHP files, which are in a folder (e.g. portal1), from the XP wwwroot to the Windows 7 wwwroot, I can't access them in the browser; it returns an internal server error.
At first I assumed I hadn't properly set up my web server or even PHP, but I have done it a dozen times following tutorials and I'm pretty sure it's all correct.
I have done further research on the folder itself, which has led me to a theory that this has to do with permissions.
When I copy the whole thing directly, it won't run, BUT if I CREATE a folder and subfolders with all the same names as the ones I copied and just put the PHP files into them accordingly, it runs OK!
That has something to do with inherited permissions, I think. How do I overcome this?
I don't want to take the updated work folder from my partner (which is done on XP) every time and have to create a new folder and its subfolders with all the same names on my machine and then copy the PHP files across accordingly. That's a lot of work!
I just want to copy the folder, put it in my wwwroot folder, and run it in the browser without problems. How do I overcome this permission issue?
Any ideas?
By the sound of things, you're working on a project with someone else. Copying and pasting is the absolute worst way of sharing files when working on a project with multiple people. You should really be using version control like Git. That's pretty hard to set up and learn, though. An easy solution that will work for the time being (but won't manage conflicts well) is to use Dropbox. Set up a free Dropbox account and create a folder in it called www or whatever you want. Then install WAMP (way better than IIS) and create an alias to the www folder in your Dropbox. Do this on both machines. Now every time your partner makes a change, it will be instantly reflected on your machine, and vice versa. Easy, free, and it will work while you learn a version control tool.
I know this doesn't address your actual problem, but it should be more helpful to you.
I've been trying to come up with a way to build an automatic update system for a CMS that I'm building, which will potentially be installed on numerous servers (likely with different configurations). What I've come up with is to keep the current version uploaded to my server in a predetermined directory, then have the distributed systems check that directory (on the remote server) every so often to see if a new version has been uploaded. If the version number of the uploaded version is greater than the one that particular system has, it will prompt the admin to update. Then the files will be copied via FTP into a tmp directory, copied from tmp to replace the older versions of each file, the tmp directory is deleted, and the system's version number is increased.
The problem with this is that I haven't found a way to transfer whole directories via PHP's FTP functions. I know I can zip everything and transfer it that way, but I haven't found a reliable way to unzip files in various server environments.
I know I can write my own method that will just go through each directory and transfer over each file one at a time, which is fine. I just wanted to know if someone else had already written something to this effect, or if someone knew of an alternate solution to this problem before I dive into it.
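Something along these lines is what I have in mind (a rough, untested sketch using PHP's built-in FTP functions; the host, credentials, and paths are placeholders):

<?php
// Recursively download a remote directory tree over FTP.
function ftp_mirror($conn, $remoteDir, $localDir)
{
    if (!is_dir($localDir)) {
        mkdir($localDir, 0755, true);
    }

    $items = ftp_nlist($conn, $remoteDir);
    if ($items === false) {
        return;
    }

    foreach ($items as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }

        $remotePath = $remoteDir . '/' . $name;
        $localPath  = $localDir . '/' . $name;

        // ftp_size() returns -1 for directories on most servers.
        if (ftp_size($conn, $remotePath) === -1) {
            ftp_mirror($conn, $remotePath, $localPath);   // recurse into subdirectory
        } else {
            ftp_get($conn, $localPath, $remotePath, FTP_BINARY);
        }
    }
}

$conn = ftp_connect('updates.example.com');
ftp_login($conn, 'username', 'password');
ftp_pasv($conn, true);
ftp_mirror($conn, '/cms/latest', '/path/to/site/tmp');
ftp_close($conn);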
Thanks for your help!
Instead of sending a directory, why not zip the folder to save bandwidth and time, then unzip on the destination servers? (Deleted, since it was not an option for the OP.)
You could use PHAR archives for your app deployment (if you want to be bleeding-edge modern).
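For example, building the release as a PHAR on your own server might look roughly like this (paths are placeholders, and creating archives requires phar.readonly = 0 in php.ini):

<?php
// Package the new CMS version into a single archive.
$phar = new Phar('cms-update.phar');
$phar->buildFromDirectory('/path/to/new-version');
$phar->setStub(Phar::createDefaultStub('index.php'));

// On the destination server the archive can then be unpacked with:
// (new Phar('cms-update.phar'))->extractTo('/path/to/install', null, true);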
Or you could write an update script for Phing that gets the files from your server. You could even do checkouts from SVN then, instead of placing the build in a directory.
Try using exec() to run lftp from the command line:
/usr/bin/lftp -e 'o ftp://ftp.example.com/path/to/remote/directory && mirror --verbose && quit'
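Wrapped in PHP, that might look something like this (the URL and path come from the example above; error handling is minimal):

<?php
// Mirror the remote directory by shelling out to lftp.
$cmd = "/usr/bin/lftp -e 'o ftp://ftp.example.com/path/to/remote/directory && mirror --verbose && quit'";
exec($cmd . ' 2>&1', $output, $status);

if ($status !== 0) {
    // lftp failed; its messages are collected in $output.
    error_log("lftp mirror failed:\n" . implode("\n", $output));
}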