I have developed a PHP site in Eclipse on localhost and have just transferred it to a GCP Compute Engine instance. To do this I had to upload the site to a Storage Bucket and then, in the SSH shell for the instance, use gsutil to transfer the files.
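To give an idea of the round trip I'm doing for every change, it's roughly the following (bucket name and paths are placeholders, not my real ones):
gsutil -m cp -r ./mysite gs://my-bucket/mysite        # run locally: push the site to the bucket
gsutil -m cp -r gs://my-bucket/mysite /var/www/html   # run in the instance's SSH shell (may need sudo to write to the web root)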
Now, as far as editing goes, is there a way for me to connect Eclipse to the GCP instance and edit via Eclipse?
I don't want to have to, for every tweak, upload to the storage bucket and then copy the files across to the instance. This would be most tedious.
I have created a firewall rule on the network to allow all traffic from my public IP to the Compute Engine instance (until I can lock it down once I've got the connection)
Thanks
This article describes a procedure for editing remote files in Eclipse:
http://www.chrisdanford.com/blog/2010/05/19/edit-files-directly-over-sftp-in-eclipse-remote-system-explorer/
Quoting that article:
From Eclipse: Help -> Install New Software. “Work with: -All available sites-“. In the search box type “remote system”. Check “Remote System Explorer End-User Runtime”, click Next to proceed with the install.
After the wizard completed, click Window -> Open Perspective -> Remote System Explorer. Right-click in Remote Systems, choose New Connection, type in your details. After you connect, expand “Sftp Files” and you’ll be able to open remote files in the editor.
You can forward ports over SSH using the gcloud SDK:
gcloud compute ssh example-instance \
--project my-project \
--zone us-central1-a \
-- -L 2222:localhost:8888
Once you have this set up, you can connect directly from Eclipse.
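For example, if the goal is SFTP editing in Eclipse, you could forward to the instance's SSH port (22) instead of 8888 and then point Eclipse's SFTP connection (or any SFTP client) at localhost, port 2222; the instance, project and zone names below are just the placeholders from the example above, and the key path assumes gcloud's default generated key:
gcloud compute ssh example-instance \
--project my-project \
--zone us-central1-a \
-- -L 2222:localhost:22
# in another terminal (or in Eclipse's connection dialog) use localhost, port 2222:
sftp -P 2222 -i ~/.ssh/google_compute_engine your_username@localhost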
See details:
https://cloud.google.com/solutions/connecting-securely#Port%20forwarding%20over%20SSH
I have access to the server of our company's website. I access it using PuTTY.
I want to download all the files (index.php, any JPG files for favicons, etc.) used for deployment of the web project. When I asked the programmer of the website to share the full source (HTML, CSS, PHP, jQuery plugins), he said that I can access the entire code from the server.
When I log into the server via PuTTY (private key + SSH) and type "ls", I see that there are "index.php" and "mysql" files, the full contents of which I cannot download.
What are some useful resources for a list of commands?
Which command should I use to download the project folder containing all the files, both code and non-code?
P.S. I do not know if this information is necessary, but the website was deployed using DigitalOcean.
As you are using PuTTY, I'll assume you are working on Windows. There is a GUI tool called WinSCP that works similarly to PuTTY (i.e. over SSH, using private keys and so on) and can be used to access the remote server's filesystem. It has a pretty simple UI which is divided into your local filesystem and the server filesystem. Once you're connected and have reached the files you need, you can just drag and drop them onto your local filesystem side.
If you would like to explore some command-line options, it's basically any tool that does scp. I think PuTTY comes with scp or pscp installed already. You can check by just typing the command in your cmd/PowerShell.
You can download pscp.exe from the PuTTY website and then:
Open cmd.exe and type:
pscp your_username_here@yourcompany.com:/path/to/file "C:\Path\To\The\New File"
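To grab a whole directory (e.g. the web root) rather than a single file, pscp can copy recursively and use your private key; the key path, username and folders below are just placeholders:
pscp -r -i C:\keys\mykey.ppk your_username_here@yourcompany.com:/var/www/html C:\Projects\website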
I have a website on my localhost using WAMP, coded in PHP.
I recently joined Google Cloud Platform and have deployed the LAMP stack on it.
I also set up the MySQL database on it successfully.
But now I am confused about how to upload my files to it.
The OS is Debian 8.
I have been using Bitbucket for some time; is there a way that I can clone the data from there directly to Google Cloud?
Can anyone guide me on how to upload the PHP files there so that I can test my website?
Is there any GUI for that rather than the command line? I am not that good with the command line.
P.S. Ready to give any more relevant info, as I don't know what all data from my side is required to answer this.
When deploying the Google LAMP stack (Click to Deploy) you will automatically create a Google Compute Engine instance - check the Compute Engine / VM instances menu.
Method 1:
Click the SSH button next to the instance name and a new terminal window will open. Make sure you have git installed; if not, install it yourself:
sudo apt-get install git
Locate your Apache web root folder. Usually it's:
cd /var/www/html
Then download your repository with:
git clone https://www.path.to.repository.git
Make sure you use the HTTPS repository URL, not the SSH one. For SSH you would need to have the same SSH key on your instance as on your Bitbucket account; with HTTPS you will be able to log in with your normal credentials.
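Putting Method 1 together, a minimal sketch looks like this, assuming /var/www/html only contains the stack's default index page (the repository URL and names are placeholders):
cd /var/www/html
sudo rm -f index.html                                         # remove the stack's placeholder page, if present
sudo git clone https://bitbucket.org/youruser/yourrepo.git .  # clone straight into the web root
# later, when you have new commits, just update in place instead of re-uploading:
sudo git pull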
Method 2:
You can upload files with SFTP.
First you need to generate a key with PuTTYgen if you don't have one already.
Next, go to the GCP console, click the Compute Engine menu and then the VM instances submenu.
Check the LAMP instance, then click Edit to go to the edit page. Scroll down until you find the SSH keys textbox and paste in the contents of your public key.
Next, use any SFTP client. You can do that from within PhpStorm, FileZilla or PuTTY by selecting your private key and connecting to yourusername@instance_external_ip.
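If you prefer the command line to a GUI, PuTTY's psftp works the same way; the key file name is a placeholder and the username/IP are the same placeholders as above:
psftp -i C:\keys\mykey.ppk yourusername@instance_external_ip
Inside the psftp session, cd /var/www/html followed by put yourfile.php uploads a file (you may need to adjust write permissions on the target folder first).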
Good luck php wizz
Why I'm asking
The Seagate Personal Cloud (Network Attached Storage) I have can only install .rbw (Ruby on Windows) packages when installing 3rd party applications. Apparently, I can't access the cloud's root folder to just install a .tar from either Ubuntu or Windows (I can only access the drives that are shared), so I need to package the application into an RBW.
Conditions
The Seagate Personal Cloud is hooked up to my router, and my computer accesses it over wi-fi. The NAS uses NAS OS. I have a Windows drive and an Ubuntu drive available to try whatever needs to be done. When I connect to it (over 192.168.2.x), I get a web user interface that I log in to, which presents me with the NAS's applications, options, etc. There is no terminal available from the WebUI.
What I'm trying to do
ownCloud is available for download as a 3rd party application, from the WebUI's App Manager, or for download from Seagate's website as a .rbw file. However, the packaged .rbw file is v6, but ownCloud is up to v8 now, and ownCloud can't update from my NAS, because it can't get the permissions (and since I can't access the NAS root folders, I can't seem to chmod it's folders to allow anything to write to it).
What I've tried so far
I've tried accessing it from both Win/Ubu drives over the network - only accesses the shared folders.
And with WebDAV - only accesses the shared folders.
And S/FTP both - only shared folders.
And "SDrive" (a Windows program that comes with it) - only shared folders.
And I've tried "Take Ownership" with Windows, but it doesn't do anything.
And just clicking the "Update" button from ownCloud - it says it doesn't have write access.
So finally, I'm trying to find out how to package a .tar (<--mainly) or .php into Ruby (ownCloud can install with a PHP file). Plus, it would be nice to learn how to package .tar's into Ruby, so that I can just do the same thing with multiple other programs, and submit those .rbw's to each program's maintainers, so that they can provide it to others who need that extension for their NAS.
I'm a basic user, so I'm sorry and will clarify if something here didn't make much sense.
For the Seagate Personal Cloud, RBW files are created within Seagate's virtual machine. See https://www.seagate.com/nasos/SDK/0.7/ for all kinds of information. Taking an old RBW file and changing a few files will not work easily.
If your Seagate device is like mine, where by default the SSH server is turned off and not listed as a service in the Web menus, you can actually turn it on. Access your device as http://my.device.com/?appdev=1 (where my.device.com is your device's address; it might be 192.168.1.??? if you just have an IP number). Go to the Services menu, disable SFTP and then enable SSH. The root password is what you used for your admin account when you first set up the device.
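Once SSH is enabled you should be able to get a root shell on the NAS with any SSH client; the address below is just an example, substitute your device's IP:
ssh root@192.168.1.xxx   # replace 192.168.1.xxx with your device's actual address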
P.S. There is now an RBW file for ownCloud 9.0.2 in the ownCloud forums.
I am totally confused about how to host a dynamic website created using PHP and MySQL in the Amazon cloud.
I went through Amazon S3 and I hosted a static website there!
Then I tried Amazon EC2 and learned some aspects of the concept of a VPC. I thought that dynamic websites are hosted in the Amazon cloud using EC2. I followed some steps and they taught me how to launch a website using Drupal (but I didn't want that!).
I couldn't find any other EC2 tutorials on deploying my own web application.
Then I found AWS Elastic Beanstalk; I uploaded a simple PHP document and I can see that it deployed successfully.
But still, I am not satisfied, because I don't know which is the correct way to deploy my PHP application.
So can anyone direct me on deploying a PHP + MySQL application in AWS?
Depends on your needs. Elastic Beanstalk might be a good option for many apps, but I chose EC2 for my app's backend (using PHP, MySQL and S3 for storage).
Quick steps to get you up and running:
Log into the AWS Management Console and start a new EC2 instance (Windows Server 2012 R2 Base > t2.micro should be good enough for a start!)
At step "6. Configure Security Group", add rules for at least HTTP, HTTPS and RDP (so you can connect via Remote Desktop)
Connect to your new instance via Remote Desktop and install a decent browser (Enable File Downloads in IE's Security Settings and download Chrome or Firefox)
Open the Windows Firewall and add rules for the same ports you opened in the Security Group of your instance in the AWS Management Console. (Right-click on “Inbound Rules”, then select “New Rule…”, or use the command-line sketch after these steps.)
Download and install XAMPP (I put it in C:\xampp)
Open the XAMPP Control panel and install Apache and MySQL as services (so they will start automatically when your instance launches); make sure everything is started up.
Now put your files in C:\xampp\htdocs\ and you're ready to go!
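If you'd rather script the firewall step than click through the wizard, the same inbound rules can be added from an elevated command prompt (ports below assume plain HTTP and HTTPS):
netsh advfirewall firewall add rule name="HTTP inbound" dir=in action=allow protocol=TCP localport=80
netsh advfirewall firewall add rule name="HTTPS inbound" dir=in action=allow protocol=TCP localport=443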
Bonus Steps:
Set up Filezilla FTP Server (and open the required ports in both the instance's security group and the Windows Firewall) so you can upload/download files without having to go through Remote Desktop.
Get an Elastic IP and assign it to your instance, so its IP address will never change (see the CLI sketch after these bonus steps).
Get an SSL certificate so you can use HTTPS
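If you have the AWS CLI installed, the Elastic IP bonus step can also be done from the command line; the instance ID and allocation ID below are placeholders (the allocation ID is returned by the first call):
aws ec2 allocate-address --domain vpc
aws ec2 associate-address --instance-id i-0123456789abcdef0 --allocation-id eipalloc-0123456789abcdef0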
The answer depends on the load that you are expecting and the resources you have to handle all the administration tasks.
If you expect heavy or variable loads, there are many reasons not to deploy a production PHP + MySQL application on a plain EC2 instance.
Here are some of the benefits of deploying to Elastic Beanstalk instead of a manually configured EC2 instance:
You get version control of each deployment.
You can scale up or down automatically if you need more/less instances to handle new load.
You get a load balancer in front of your EC2 instances with a bunch of out-of-the-box "recommended" configurations.
Regarding MySQL, if you go for an Amazon RDS instance you can handle replication, monitoring and automatic backups with pretty low effort. A lot of the configurations you would need to tweak are now available through parameter groups.
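As a rough illustration of how little setup RDS needs, a small MySQL instance with automatic backups enabled can be created with a single CLI call (all identifiers, sizes and credentials below are placeholders):
aws rds create-db-instance \
--db-instance-identifier my-app-db \
--db-instance-class db.t3.micro \
--engine mysql \
--master-username admin \
--master-user-password change-me-please \
--allocated-storage 20 \
--backup-retention-period 7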
On the other hand, if you want to have full control of everything that is going on on your server (which means you have time to monitor, back up and do maintenance tasks, which is not my case :), or if you do not plan to have much traffic, or if you want the least expensive option, you should go with a low-cost EC2 instance.
In my experience (after 2 years of working on AWS with 10 production applications, I'm kind of a regular AWS user), pretty much every customization or change I needed on both RDS and Elastic Beanstalk I was able to tweak and get working, so I'm pretty satisfied with choosing the Elastic Beanstalk + RDS option.
Below are two links I found which are helpful for creating and updating an application with AWS Elastic Beanstalk:
https://aws.amazon.com/getting-started/tutorials/launch-an-app/
https://aws.amazon.com/getting-started/tutorials/update-an-app/
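If you end up on the Elastic Beanstalk route, the EB CLI keeps the deploy loop short; this is only a sketch, assuming the EB CLI is installed and run from your project folder, with placeholder application and environment names:
eb init -p php my-php-app    # register the folder as an EB application on the PHP platform
eb create my-php-env         # create an environment and do the first deployment
eb deploy                    # push subsequent code changes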
I'm trying to create a new PHP project in a situation where our client gave us only a remote desktop connection to their files.
How it usually works in NetBeans:
To put it simply, when a project is created with external sources, we can click on a single file in our project and upload or download it (via FTP).
I'd like to be able to do this also with remote files located on a server which can be reached over our local network but which doesn't have FTP installed.
Extra details
Until now we have been able to work normally with this setup by using Dreamweaver, which allows setting an address located in the local network in the server options.
I would really like to switch to NetBeans and be able to click upload and download on the local files to sync with their server, but I can't find a way to achieve this.
There seems to be no option when creating a new project that allows this kind of setup. Selecting "remote website" seems to allow only FTP sync.
On the other hand, if I select "Local web site" I can select the files in the local network, but NetBeans doesn't allow me to make a local copy of them.
Note 1: the server we're accessing doesn't have FTP installed.
Note 2: the "copy files from sources folder to another location" option isn't really an option for me, since I'd like to keep my local copy separate from what's on the server (and this setup, I think, would just copy the files without giving me any control over them).
Note 3: creating a project with existing sources seems to allow only remote files reachable via FTP.
It would be useful to have some versioning system (Git, Mercurial, SVN...). Can you mount the network drive in Windows? (see here). This would at least allow you to easily create a project from existing sources (although working over the network could be quite slow).
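For the mounting part, a mapped drive can also be created from the command line; the server and share names are placeholders:
net use Z: \\clientserver\projectshare /persistent:yes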
One hacky way I can think of is to:
mount the network drive as described above and map it to some letter, say Z
install a local FTP server (e.g. FileZilla Server)
configure the FileZilla FTP server to use the network drive as its "FTP home" (i.e. the folder which FileZilla will use for saving files sent over FTP) - simply use the Z:\ path to point to the mounted network drive mapped to letter Z
in NetBeans, create a new PHP project from a remote server (where the remote server is actually your local FileZilla server)
In theory, this should work.
Here is a fragment from the NetBeans site:
To set up a NetBeans project for an existing web application:
Choose File > New Project (Ctrl-Shift-N on Windows/Cmd-Shift-N on OS X).
Choose Java Web > Web Application with Existing Sources. Click Next.
In the Name and Location page of the wizard, follow these steps:
In the Location field, enter the folder that contains the web application's source root folders and web page folders.
Type a project name.
(Optional) Change the location of the project folder.
(Optional) Select the Use Dedicated Folder for Storing Libraries checkbox and specify the location for the libraries folder. See Sharing Project Libraries in NetBeans IDE for more information on this option.
(Optional) Select the Set as Main Project checkbox. When you select this option, keyboard shortcuts for commands such as Clean and Build Main Project (Shift-F11) apply to this project.
Click Next to advance to the Server and Settings page of the wizard.
(Optional) Add the project to an existing enterprise application.
Select a server to which to deploy. If the server that you want does not appear, click Add to register the server in the IDE.
Set the source level to the Java version on which you want the application to run.
(Optional) Adjust the context path. By default, the context path is based on the project name.
Click Next to advance to the Existing Sources and Libraries page of the wizard.
Verify all of the fields on the page, such as the values for the Web Pages Folder and Source Package Folders.
Click Finish.