I have a question about NetBeans.
I made a new NetBeans PHP project, configured it with my existing FTP settings, and it downloaded all the files to my local hard drive; so far everything works correctly.
But because I'm using a PHP framework that automatically generates some files on the server, those generated files never get synced to my local hard drive.
So my problem is: as long as I create new files in NetBeans everything works correctly, but files generated outside NetBeans are not picked up.
How can I configure NetBeans so it will sync both ways via FTP?
You can't as far as I know. You have to manually right click on the project/folder you want to update and click Download or Upload. Note that this will overwrite any changes on the receiving side (e.g. downloading files will overwrite local changes and uploading files will overwrite remote file changes).
Alternatively, you can enable the Upload Files On Save or On Run options, but this only covers the local-to-remote direction. You can access this option by right-clicking your project, choosing Properties, and looking under the Run Configuration category.
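If you really need two-way synchronization, you'll have to do it outside the IDE. A rough sketch using the lftp command-line client (the host, credentials, and paths here are placeholders, not anything NetBeans provides):

# pull files generated on the server down to the local project
lftp -u USER,PASS ftp.example.com -e "mirror --only-newer /remote/project /local/project; quit"

# push local edits back up (reverse mirror)
lftp -u USER,PASS ftp.example.com -e "mirror -R --only-newer /local/project /remote/project; quit"

Note that --only-newer decides based on file timestamps, so this is an approximation of a sync, not a true merge.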
I created a website using Laravel.
I uploaded the site to my shared server, and the site was running fine.
After that I had to add a library called Intervention. Locally, I used this command and everything worked fine.
php composer.phar require intervention/image
I made the changes in the files and everything works fine locally.
Now I need to upload the updated files. Rather than deleting the whole online folder and re-uploading everything, I just want to upload the changed files (I know which controller and view files were changed).
But as it is a shared server, there is no way or place to execute the composer update command or otherwise fetch that one library (so I am now getting an Intervention Image class not found error).
So, what would be the ideal thing to do in this case?
The easiest option may be to delete the whole folder and upload it again, but the files are almost 500 MB, so I don't want that.
Any suggestions for handling this situation (updating Composer libraries after deployment)?
Not really a PHP or Laravel question, but if you're using FTP to upload, there's usually an option to upload only changed files.
For example, in FileZilla you can set the default overwrite action (e.g. "Overwrite file if source file newer") in the transfer settings.
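For the Composer part specifically, you don't need to run Composer on the shared server at all; you can upload the artifacts it generated locally. A minimal sketch (assuming your local install works and the server mirrors your project layout):

# run locally:
php composer.phar require intervention/image

# then upload these over FTP along with your changed controllers/views:
#   composer.json
#   composer.lock
#   vendor/            (contains vendor/intervention/ and the regenerated autoload files)

Since requiring a package regenerates vendor/autoload.php and vendor/composer/autoload_*.php, uploading the vendor/ directory (or at least vendor/composer/ plus vendor/intervention/) is what makes the new class visible to the autoloader.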
I am working on a Moodle-based project that I inherited from someone else. Having copied the files into the htdocs folder and started MAMP, the files still don't show in the browser; instead, the browser automatically initiates a download. I may need to change the config file, but since I don't have much experience with PHP and SQL, I'm not sure what exactly to change. My part of the project is to develop the HTML and CSS, but I need to be able to run the site locally first.
What do I need to do to get the files to run locally? The URL I use is localhost:8888/whatever/whatever/index.php
In case someone else runs into the same problem (Apache downloads PHP files instead of executing them), here is what helped me.
The .htaccess file may need changes if the application has changed servers (see the example after these steps).
Delete config.php (or at least rename it if you don't want to remove it) and run the application through the browser. It should automatically start the install process.
To run PHP and SQL I used MAMP.
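As an example of the .htaccess point above: a common cause of Apache serving PHP files as downloads after a server move is a leftover, host-specific handler directive. A hypothetical line of this kind (the exact handler name varies by the previous host) would need to be removed or commented out before MAMP's Apache executes PHP again:

# in the project's .htaccess — host-specific PHP handler left over
# from the old server; comment out or delete it:
# AddHandler application/x-httpd-php55 .php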
In the past, with Eclipse and a PHP server setup, I had it configured so that when I committed changes to the CVS repository, it also saved the actual PHP files on the server. I had this working on another computer (which I can't check now). The repository files were kept in a separate folder: the CVS lives in a folder structure like var/cvs, and my client-facing PHP files are in something like var/www/html/. How would one go about setting something like this up? Right now I use SFTP with FileZilla to change files. It was very convenient before to be able to commit the changes and then check the web to make sure they worked. Currently I have to commit the changes and then also save the file via FTP to see them. I would love to get rid of the SFTP-with-FileZilla step if at all possible.
It sounds to me like you are testing your latest changes on the live website, which is a bad idea: if you inadvertently introduce an error into the files, your website may expose it to the public.
My current workflow is as follows:
I use NetBeans on a local project, which is also the SVN checkout. On most projects I use the NetBeans option "Copy files from source folder to another location" to copy the edited files on save to the local test webserver directory. If the changes work on the local webserver, I commit them to the SVN repository, log in to the live webserver via SSH, and check out the latest revision from the SVN.
So in fact I have four copies of each file:
1. The working copy (a NetBeans project and SVN checkout):
   /home/feeela/projects/xyz/ (editing here only)
2. The test-server copy; NetBeans stores a copy there on each save:
   /var/www/vhosts/xyz/ (127.0.0.1/xyz/)
3. The SVN repository; I manually commit files to it after testing on the local webserver:
   /var/svn/xyz/ (svn commit -m "my last change")
4. The SVN checkout on the live server, which is the actual website:
   /var/www/vhosts/xyz/ (svn update # xyz.com/)
I don't have a clue how to set up the "local copy" feature (which can also point to some other machine) with Eclipse. If someone knows a way to reproduce the above workflow using Eclipse without having to manually sync the files to the test server, I'll be glad to read it here…
You could use a post-commit hook script on the CVS server to update (refresh) a working copy in var/www/html/. Every time you commit, the hook script would then fetch the latest version of the files on the server and put them in var/www/html/.
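CVS doesn't have a hooks/ directory the way Subversion does; commit triggers live in the CVSROOT/loginfo administrative file. A rough sketch following the "keeping a checked out copy" pattern from the CVS manual (the module name, log path, and web root are assumptions for this example):

# line in CVSROOT/loginfo — after a commit to the "www" module,
# refresh the checkout serving the site; the sleep and background
# subshell avoid clashing with CVS's own repository locks:
^www (date; cat; (sleep 2; cd /var/www/html && cvs -q update -d) &) >> /var/log/cvs-deploy.log 2>&1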
I am currently using Aptana to edit files and upload them directly to the webserver whenever I save them. But it's a big project I am working on, and I regularly need to search for a specific line or piece of code across multiple files. It's literally hundreds of files, and Aptana's search function does not work directly over FTP.
Now my question is, is there a way to store all the files locally AND sync them through FTP whenever I save them?
If it is a big project, you should really consider not editing your files live on the server; edit them locally and use some kind of revision control system (CVS, Mercurial, SVN, Git, ClearCase, Perforce, ...). Have you also considered using an external tool, such as rsync or scp, to sync the files from your local copy?
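For example, with rsync over SSH (the paths and host below are placeholders), a one-liner pushes only the files that have changed since the last sync:

# push local edits to the server, transferring only changed files
rsync -avz /path/to/local/project/ user@yourserver:/var/www/project/

You could run that from a file watcher or an editor save hook to approximate "upload on save" while keeping a complete, searchable local copy.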
Do you work in a Windows or Unix environment?
I noticed a strange sync problem. I have my project set up as a remote project and everything works fine. I have it set to upload on save. However, if someone on my server is working on a file and saves it, I don't see this change in NetBeans, and I end up opening an older version of the file and overwriting my colleague's changes when I press save.
Is there a way to have NetBeans check the remote server for the latest file before saving?
thanks
No, NetBeans cannot check the remote file automatically; it's just not set up to do that. Even if it were, you would still run into problems where you would clobber your colleague's changes, or he would clobber yours.
What you have is a basic version control problem, which is best solved by implementing one of the several version control systems out there (e.g., CVS, Subversion, Git, Mercurial, etc.) and then "building" your website out to the server from version control. Short of that, an imperfect solution would be to partition the files such that you are forbidden to edit files assigned to your colleague, and vice-versa.
NetBeans can download the file for you, but only when you tell it to by right-clicking on the file and choosing the Download command. This downloads the file whether it has been updated by someone else or not.
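To illustrate the version-control route suggested above: a common pattern with Git, for example, is a bare repository on the server plus a post-receive hook that checks the files out into the web root, so publishing becomes a single push instead of an FTP save. A minimal sketch (all paths and names are assumptions):

# on the server: create a bare repo and a deploy hook
git init --bare /var/repo/site.git

# contents of /var/repo/site.git/hooks/post-receive (make it executable):
#!/bin/sh
GIT_WORK_TREE=/var/www/html git checkout -f

# on each developer's machine: publish changes with a push
git remote add live ssh://user@yourserver/var/repo/site.git
git push live master

With this setup, both you and your colleague pull before editing and push to deploy, and the version control system flags conflicting edits instead of silently overwriting them.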