PhpStorm include file outside project from remote host - php

I have created a project from existing files over SSH.
PhpStorm copied this project to a local folder on my computer.
When I use the same construct as on the server,
require '/var/www/libs/log.php';
I get a notice from the IDE like this: Include expression is not resolved.
The project is available on the remote server at this path: /var/www/project
How can I resolve this notice?

You can definitely do this on Ubuntu Linux; I tested it and it works. Here is how:
1. Open Nautilus and connect to your host over SFTP (you can also use other file managers).
2. Navigate to the folder /run/user/xxx/gvfs/, where xxx is your user's UID. Your path will probably be /run/user/1000/gvfs/.
3. Open PhpStorm and add an external library pointing to that folder.
The key point is to mount your remote host folder onto a local folder.
Important: don't include huge external folders from your host in PhpStorm.
By the way, it is better to keep all your files locally.
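If you also want the same require to work both on the server and against the mounted copy, here is a minimal sketch; the LIBS_DIR environment variable and the fallback path are assumptions for illustration, not part of the original setup:
<?php
// Hypothetical sketch: resolve the libs directory from an environment override,
// falling back to the path used on the remote server.
$libsDir = getenv('LIBS_DIR') ?: '/var/www/libs';
require $libsDir . '/log.php';
That way the code keeps the server path by default, and on your workstation you can point LIBS_DIR at the gvfs mount.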

Related

Open directory on network in PhpStorm

I'm working from a Windows 7 PC and my source code is on an Ubuntu PC. I have a shared folder on the Ubuntu computer that I can browse from my Windows computer.
I'm trying to open this directory in PhpStorm via File -> Open Directory. The problem is I can't seem to browse the network using the file browser PhpStorm provides. Also, entering a path like \\my-dev-pc\projects\php\cms just causes it to open my C:\ directory.
Any ideas on how I can solve this? Perhaps there is some configuration that allows PhpStorm to use the default Windows Explorer file browser?
Since your project files are on the network, follow these steps:
Go to File > New Project from Existing Files.
You will then get a dialog with options for connecting to the remote machine/server using FTP/SFTP.
Select that option and then enter your credentials.
There is also an option to select files which are accessible via the network.
This should help you.

OpenShift "www" directory?

My OpenShift rhc client is not working, so I am again using FileZilla over FTP to access my OpenShift account.
Last year I used FileZilla and the "www" directory was found at /var/lib/openshift/54216b58500444bb9d0009d0/app-root/repo/php/. But now no such structure exists; I think OpenShift may have changed the directory structure.
My question is: where do I upload my code so that my scalable app works seamlessly?
You can find the path to your code by looking at the OPENSHIFT_REPO_DIR environment variable. Running echo $OPENSHIFT_REPO_DIR should print the path to your code.
But please remember that the only directory that is persistent across deployments is the data directory (OPENSHIFT_DATA_DIR). Any change you make anywhere else will be lost on your next deployment.
Read more about directory variables.
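If you need these paths from within your PHP code as well, here is a minimal sketch assuming the standard OpenShift environment variables are available; the uploads sub-folder is just an illustrative assumption:
<?php
// Read the OpenShift directory variables; fall back to the script directory
// if they are not set (e.g. when running outside OpenShift).
$repoDir = getenv('OPENSHIFT_REPO_DIR') ?: __DIR__; // deployed code, replaced on each deploy
$dataDir = getenv('OPENSHIFT_DATA_DIR') ?: __DIR__; // persistent storage across deployments
// Example: keep user uploads in the persistent data directory.
$uploadDir = rtrim($dataDir, '/') . '/uploads';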

Open file using a modified address instead of the default address

When XAMPP is installed, we can open a file using a URL like localhost/home.php. Can we open the very same file using a URL like, for example, hamzazafeer.com/home.php or www.hamzazafeer.com/home.php? Is there any way to do this, or can't we change this address?
You can override the domain so that it points to localhost. That means your browser "thinks" the domain is hosted on localhost, so it will fetch the local files instead of calling a remote server. To do that, you have to edit a file named hosts in C:\Windows\System32\drivers\etc. Depending on your operating system and settings it can be a little tricky to edit this file, but you will find specific instructions with a quick Google search.
The line you have to add to this file will look like:
127.0.0.1 hamzazafeer.com
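To confirm the override works, here is a minimal sketch of a check page; the file name check.php and its placement in htdocs are assumptions for illustration:
<?php
// If the hosts override works, this page is served by the local XAMPP
// installation even though the URL in the browser shows hamzazafeer.com.
echo 'Host: ' . ($_SERVER['HTTP_HOST'] ?? 'unknown') . "\n";
echo 'Document root: ' . ($_SERVER['DOCUMENT_ROOT'] ?? 'unknown') . "\n";
Save it under htdocs and browse to hamzazafeer.com/check.php; if the document root points at your XAMPP folder, the override is working.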
Yes and no.
Yes, you can. In a default XAMPP installation, the server always points to some default folder (the htdocs folder in XAMPP's root folder under Windows, /var/www under Linux, etc.). This answer may help you locate this folder in your installation of XAMPP.
You can install (FTP-copy) XAMPP on any hosting that your domain hamzazafeer.com currently points to, properly configure it (both hosting and domain) to point to XAMPP's default websites folder, and you're ready to go.
(BTW: you're mixing certain things up. XAMPP has nothing to do with this -- the document root is an Apache configuration setting, and XAMPP / Apache has nothing to do with this in general; you can point your domain to any folder on any hosting, no matter what server software is used to serve your website.)
No, you shouldn't. XAMPP is, from beginning to end, designed as a localhost-only, test-purpose, developer-only solution. You should never, ever use it to serve production versions of your websites or run it on any publicly accessible hosting. Limit it to your localhost, as XAMPP's creators intended.

Laravel Homestead serve command not creating and connecting to folder(s)

I have tried running the following command, and even though it runs and sets up the nginx blocks OK, it is not linking to the folder in question on the host machine.
serve projects.dev /home/vagrant/Code/projects.dev
When I then list the folders within the Code folder on the guest machine, I only get the folders that were created via the automated YAML config file on initial setup.
It seems not to be creating the folder and/or linking it between the guest and host machines at all.
Running it on an iMac with OS X Mavericks.
Vagrant 1.6.2
VirtualBox 4.3.12
That is strange. What I have is a folder mapped in the "folders" section from my host machine that contains all the sub-folders for my projects in progress, so my nginx sites basically just link to the sub-folders inside this directory. If you have something like that, it "should" show you the folders from your host inside that Code directory in your VM. Maybe post your YAML file so we can check it out?
My problem was that the new Homestead file spells the directory "code" with a lowercase c.
I was upgrading my Homestead, and the old code folder used a capital C, spelled "Code".
I just changed my Homestead.yaml file from "code" to "Code".
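For reference, here is a minimal sketch of the relevant Homestead.yaml sections, assuming the default paths; the important point is that the capitalization of Code matches the folder you actually have on both machines:
folders:
    - map: ~/Code
      to: /home/vagrant/Code
sites:
    - map: projects.dev
      to: /home/vagrant/Code/projects.dev/public
The public sub-folder is a typical Laravel layout and is only an assumption here; adjust it to wherever your site's entry point lives, then run vagrant provision (or vagrant reload --provision) so the mapping takes effect.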

PHP can't move (write) uploaded files to mounted network drive

We develop our PHP-based web applications by editing our working files in a shared, local directory on each of our Windows machines. The (Linux) staging server then mounts our shared drives and serves each of them under a subdomain. E.g. joe.work.com would serve Joe's working directory. We then access our own staging sites by editing our hosts file to point the subdomain to the staging server... and all of that works great!
We're now running into the issue that PHP doesn't seem to have permission to move uploaded files from the tmp directory to a directory inside the mounted directory (which is actually the shared Windows drive...?).
The working directories on the Windows machines are set to allow everyone read/write, and I have tried putting 777 on the mounted directory on the staging machine, but I still get permission denied.
The shared drive, say, \\joes_machine\joes_working_dir, mounts on the staging server at /var/mnt/joe. The uploaded file needs to be moved to /images/common within that mount.
Albeit slightly dumbed down for this example, I'm not doing anything fancy code-wise:
$working_directory = '/var/mnt/joe';
$image_directory = '/images/common/';
// $filename is the (sanitized) target file name, defined elsewhere in the real code
$full_filename = $working_directory . $image_directory . $filename;
if (move_uploaded_file($_FILES['photo']['tmp_name'], $full_filename)) {
    // do some other stuff
}
My error of course is:
Message: move_uploaded_file(/var/mnt/joe/images/common/resulting_filename.jpg):
failed to open stream: Permission denied
Message: move_uploaded_file(): Unable to move '/tmp/phpjsEfBc' to
'/var/mnt/joe/images/common/resulting_filename.jpg'
What am I not understanding about file permissions when a Windows shared drive is mounted over the network by Linux and PHP needs to write to it? I can't seem to find the hang-up!
Once we hit production we won't be using this scheme, but if there's a simple solution that lets us continue with our current development environment, that would be ideal!
After a few hours of facerolling, I finally found what I was missing. The issue was that the credentials supplied couldn't write to the mounted directory as expected. The way I was able to fix this was by editing the mount command as follows:
mount -t cifs //shared/directory /mount/target \
    -o rw,username=connectionuser,password=password,uid=48
So, username and password are the Windows credentials used to connect to the drive, while uid specifies the ID of the local user on the staging server that Apache runs as, so that it can write to the mounted directory.
Previously, I had not specified the uid of the local user, so when Apache was trying to write to the mounted directory it was effectively limited to the Windows credentials (which couldn't write to the 'local' drive).
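For anyone hitting the same error, here is a small diagnostic sketch; the mount path is the one from this question, and the posix_* calls assume PHP's posix extension is available:
<?php
// Check which user PHP runs as and whether it can write to the mounted directory.
$target = '/var/mnt/joe/images/common';
$user = function_exists('posix_geteuid')
    ? posix_getpwuid(posix_geteuid())['name']
    : get_current_user();
echo "PHP is running as: $user\n";
echo is_writable($target) ? "$target is writable\n" : "$target is NOT writable\n";
If the directory shows up as not writable for the Apache user, the uid/gid mount options (as in the command above) are the usual fix for CIFS mounts.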
Hope this is helpful!
