I have a PHP project that I want to run on Google Cloud Run. Since the project also works with files, I thought it would make sense to create a GCS bucket and store the files there.
Locally this has worked so far. However, as soon as I run the project on Cloud Run, I get the following error when accessing GCS:
cURL error 5: Could not resolve proxy: null (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for https://storage.googleapis.com/storage/v1/b/<redacted>/o?delimiter=%2F&includeTrailingDelimiter=true&prefix=public%2Fvar%2Ftmp%2Fthumbnails%2F_default_upload_bucket%2F&prettyPrint=false
If I understand correctly, cURL wants to go through a proxy; however, I have not configured anything like that. I only have a VPC connector on the Cloud Run service, and it is configured so that only requests to private IPs go through the connector.
The framework used is Symfony 5, and the library used to connect is "flysystem" with the "google-cloud-storage" adapter.
Is there something here that I am essentially misunderstanding?
I have been able to solve the problem, although I am still not entirely sure why this was necessary.
Anyway, the following environment variables must be set for this to work:
HTTPS_PROXY = ""
HTTP_PROXY = ""
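In case it helps anyone else: the same workaround can also be applied in code rather than in the service configuration. This is only a minimal sketch, assuming a standard Symfony front controller (public/index.php) and that no real proxy is ever needed; clearing the variables there before any HTTP client is created should have the same effect as setting them on the Cloud Run service.

// Sketch: clear the proxy variables early in public/index.php so that whatever
// layer reads HTTP_PROXY/HTTPS_PROXY from the environment sees them as empty.
putenv('HTTP_PROXY=');
putenv('HTTPS_PROXY=');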
Related
Amazon AWS has an official template for the WordPress configuration file that uses the $_SERVER['SOME_CUSTOM_SYSTEM_VAR'] syntax to pass environment variable values to the application. I have realized that the key in $_SERVER always corresponds to a Linux environment variable, which I can conveniently set up in the web console. That would mean I could also set the same custom variables in my development environment and, on deployment, make no changes to the code in terms of variable adjustments, nor worry about some configuration file that could be mishandled in the deployment process.
In my development environment, things almost work like this. I am working on a Laravel application, and the $_SERVER['SOME_CUSTOM_SYSTEM_VAR'] values, such as those related to the database connection, work fine in console commands. But when running the application in a PHP server for development, those same variables do not work, and I receive the "Undefined index" error.
How do I set up my local development environment so that the local PHP server can understand $_SERVER['SOME_CUSTOM_SYSTEM_VAR'], fetching the variables from the operating system?
Find the location of your local system's php.ini file and update the relevant setting for the server so that the application can actually see the variable. From the command line you have direct access to these kinds of variables because they're loaded into a wrapper; in an application served through the web server you do not.
Edit: You may also consider using your .env file if the value simply needs to be available to your Laravel app.
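If you go the .env route, here is a minimal sketch (the variable name is just the one from the question): put the value in the Laravel .env file and read it through the env() helper, ideally from a config file, instead of $_SERVER.

// In .env (illustrative value):
// SOME_CUSTOM_SYSTEM_VAR=some-value

// In a config/*.php file:
$value = env('SOME_CUSTOM_SYSTEM_VAR'); // falls back to the real environment if the key is not in .env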
What exactly are you trying to set? And what does it get used for?
The situation
I have been developing in PHP and using WAMP for the past two years. Then I came across a module to implement: a chat system, followed by instant notifications. So I went to look it up and found this awesome "Node.js", which allows you to communicate with connected users in real time.
This guy, in the post "nodejs socket.io and php", uploaded a way to integrate Node.js, Socket.IO and PHP without a Node server.
So I downloaded his project (GitHub) and ran it on my computer, but it gave a "connection refused" error on port 8080. So
I went to the Node.js site and installed Node.js on my system (Windows). It automatically updated my environment variables, and I could just go to my command line and run an example project with
path(...)node nodeServer.js
and then run the index file of the project from the shared link, and it starts working. Everything runs smooth and nice.
MY QUESTION
If I cannot run the Node app in the small example project without installing Node.js on my system, then how am I supposed to install Node.js on a live server (Apache) and use the command line to start it?
I know this might be too silly, but I am really new to Node.js, so I don't know whether I can run Node on a live PHP server. If it is possible, can anyone tell me how I can do that? Or is it just an idealized situation that can't be done?
Node.js does not need to be installed alongside Apache. Node.js itself provides a server that listens on a port. You can put Apache or Nginx in front of it as a proxy, but you can also run your application without either of those servers.
Create a file index.js using the code below and run node index.js
var http = require('http');

// Create a plain HTTP server that answers every request with "Hello World".
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(1337, '127.0.0.1'); // listen on localhost only, port 1337

console.log('Server running at http://127.0.0.1:1337/');
Open your browser and enter this URL: http://127.0.0.1:1337/ and you will see "Hello World". In this case Node.js is listening on port 1337.
If you are using a cloud server or VPS or any kind of solution that gives you full control over what gets installed, you can just install Node.js there and run what you need...
https://github.com/joyent/node/wiki/installing-node.js-via-package-manager
Some services will allow you to pick what gets installed... so you just pick Node.js and run it alongside your Apache.
However, if you are using a shared hosting solution, only a limited number of those actually even host Node (if any), and solving this would be almost impossible for you.
Second edit: Sorry for editing twice, but there is a catch with the "no Node.js server" claim in the mentioned Stack Overflow post - there actually is a server, and the post mentions the need to npm install certain modules... This is not the right way to do it, but if you still want to try it, you need Node installed (and npm along with it), then you need to npm install the mentioned packages, add the simple server file quoted in the post, run it, and then you have everything you need for your chat...
If you need some help, ping me, but if this is a time-critical project, rather find some third-party solution... and then learn about this one.
TL;DR: find a hosting service that will give you admin access and handle firewall requests, or self-host with a free DNS subdomain and have a script update your IP so you don't have to pay for a static one.
My Experiences:
You can actually use Node for input/output stream manipulation as well. Look at Gulp and Node for more info. Using Bower and Bluebird on top of a Git project makes setting up apps very easy and quick via Node.
As for using Socket.IO with a Node/WAMP setup, I've actually used this in the past. I had WAMP installed on the server initially, but I used Apache directives to reverse-proxy requests on port 8080 to the Node.js app from the client scripts.
I did have to install Node separately, though, so you'll need something like RamNode maybe (I think they allow hosted apps like IIS/MVC etc. too).
The easiest hosting setup for development, in my opinion, was self-hosting WAMP/Node with a free subdomain from afraid.dns.
Otherwise, RamNode gives you full access to admin features on your VM, I believe. So you may be able to install Node there as long as you request firewall permissions when needed for extra ports (Socket.IO used different ports for its requests on the page, so I didn't have to worry about CORS issues or anything).
I'm trying to send a cURL request from a Windows Server 2008 machine using PHP (version 5.3.12) and keep receiving the error "Could not resolve proxy: http=127.0.0.1; Host not found". As far as I can tell, I'm not using a proxy - CURLOPT_PROXY is not set, I've run netsh winhttp show proxy to make sure there's no system-wide setting in place, and I've even checked all the browsers on my machine to confirm none are configured to use a proxy (just in case this could possibly have an effect). I'm having trouble figuring out why cURL insists on telling me that 1) I'm using a proxy and 2) it can't connect to it.
I'm able to resolve the error by explicitly disabling the use of a proxy via curl_setopt($curl, CURLOPT_PROXY, '');, but this isn't the greatest solution - a lot of the places I use cURL are in libraries, and it'd be a pain (not to mention less than maintainable) to go around and hack this line into all of them. I'd rather find the root cause and fix it there.
If it helps, this has happened to me only with POST requests so far. Command-line cURL (from a Git bash prompt) works fine. These calls also work fine from our dev machine, so it seems to be something specific to my machine.
If I need to apply the above hack, I will, but I thought before I resorted to that I'd ask the good folks of SO - is there anywhere I'm missing that could be configuring the use of a proxy? Let me know if there's any additional helpful info I forgot to add.
cURL relies on environment variables for proxy settings. You can set these on Windows via "Advanced System Settings".
The variables you need to set and/or change for optimum control are "http_proxy", "HTTPS_PROXY", "FTP_PROXY", "ALL_PROXY", and "NO_PROXY".
However, if you just want to avoid using a proxy at all, you can probably get away with creating the system variable http_proxy and setting it to localhost and then additionally creating the system variable NO_PROXY and setting it to *, which will tell cURL to avoid using a proxy for any connection.
In any case, be sure to restart any command windows to force recognition of the change in system environment variables.
[source - cURL Manual]
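A quick way to confirm where the phantom proxy value is coming from is to dump those variables from PHP itself. This is just a diagnostic sketch, not part of the fix:

// Print every proxy-related environment variable the PHP process (and thus cURL) can see.
foreach (array('http_proxy', 'HTTP_PROXY', 'HTTPS_PROXY', 'FTP_PROXY', 'ALL_PROXY', 'NO_PROXY') as $name) {
    printf("%s=%s\n", $name, var_export(getenv($name), true));
}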
Ok, let me first start off by saying that I've only ever dealt with VPN access through windows by setting up a connection through the control panel. It's pretty simple since everything is pretty much a point-and-click setup.
I'm now working on a project where I need to access a computer cloud on a private network (there is no public IP directly to the cloud so it can only be accessed when I'm on the network). My project involves a website that needs to access that cloud "somehow". Because of my lack of experience/knowledge with VPN's through the command line and how to programmatically connect to a VPN, I've hit a mild obstacle that I'm hoping someone here can help me with.
What kind of server-side scripting would I do to get a VPN connection up and running? The website is being hosted on a Linux machine. Is there a "default" VPN utility under Linux that I can call through PHP to establish a connection? If not, I would really appreciate any and all suggestions on how to circumvent this little problem of mine. FYI, the VPN uses PPTP.
Looks like there is a little bit to getting it set up from a shell rather than the GUI, but here are some references that will hopefully help you out.
If you set up the connection and connect it, then when your PHP script attempts to communicate with an IP address on the remote side of the connection, the traffic will go through the PPTP connection. Having the connection always open is probably better than having your PHP script connect every time it needs to do something.
http://ubuntuforums.org/showthread.php?t=1443735 - An easy PPTP client setup
http://pptpclient.sourceforge.net/ - Client you can install and configure
http://www.cyberciti.biz/tips/howto-configure-ubuntu-fedora-linux-pptp-client.html - Walk through of setting up PPTP using the linux PPTP network manager
Hopefully those will help you out a bit. The first one looks like it may be worth trying first.
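Once one of the guides above has the PPTP peer configured (say it is called "myvpn" under /etc/ppp/peers), the PHP side can stay very simple. A hedged sketch, with the peer name and the remote IP purely illustrative:

// Bring the tunnel up via the standard pppd wrapper scripts on Debian/Ubuntu (pon/poff),
// assuming a peer named "myvpn" has already been configured in /etc/ppp/peers.
shell_exec('pon myvpn');
sleep(5); // crude wait for the ppp interface to come up

// Traffic to addresses on the remote side now goes through the tunnel.
$status = file_get_contents('http://10.0.0.15/status'); // illustrative private IP

shell_exec('poff myvpn'); // or leave the connection open, as suggested above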
I can't figure out how I should access the repository from a CakePHP project called Fredistrano (it lets you do CakePHP deploys with a Web 2.0 interface). I have Fredistrano in my web broadcasting directory on a shared Unix web server. When I use TortoiseSVN from my laptop, I have to use svn+ssh://username#domain.com/svnpath/trunk/. I tried using the same thing in Fredistrano, but I keep getting the svn command error "svn: Network connection closed unexpectedly".
I copied and pasted the command svn export --non-interactive --username myusername --password mypwd svn+ssh://myusername#mydomain.com/home/myusername/svn/mydomain.com/trunk tmpDir 2>&1 into my SSH terminal connected to the shared server, and I get a prompt for a password, which I believe is actually a prompt for the SSH password and not the SVN password (see this post). Fredistrano is failing because it can't deal with the SSH password prompt.
I noticed that the example in the Fredistrano documentation uses http://ipaddress/svn/test for the SVN URL. I copied my SVN repository to my web broadcasting directory and tried this, but I get a "connection refused" error. My shared hosting provider is pretty strict, and I doubt I can use that approach.
Is there a way I can get svn+ssh to work with a PHP script like this (Fredistrano is just using shell_exec() to execute svn commands)? Or is there a way I can get plain svn, http, or https working (or any other method that I don't know about)?
I am interested in this problem, too, and I hope that I'm close to the solution.
I haven't tried to put it to work in my application due to lack of time and other high-priority tasks, but I guess that it should look something like this (using proc_open() instead of shell_exec(), so that the script can talk to the svn child process):
$spec = array(0 => array('pipe', 'r'), 1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
$proc = proc_open('svn something svn+ssh://...', $spec, $pipes);
$response = trim(fgets($pipes[1]));      // read the child's first line of output
// [then check if the response contains password prompt text]
if (stripos($response, 'password') !== false) {
    fwrite($pipes[0], "yourpassword\n"); // send the password to the child's stdin
}
// [analyze the next response and see if SVN has returned the requested information - log, info, whatever]
proc_close($proc);
"svn: Network connection closed unexpectedly" most probably means that your host has restricted/forbidden access to other hosts. This might imply using sockets at all (SVN, HTTP, etc.) or maybe only non-HTTP. In this case you should try setting up your SVN server to allow HTTP requests (e.g. using mod_dav_svn for Apache).
This is only a guess - see my comment to your question.
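For completeness, if the host does allow outgoing HTTP and you move the repository behind mod_dav_svn, the export that Fredistrano runs via shell_exec() would then look roughly like this. This is only a sketch reusing the URL from the Fredistrano docs; the credentials and paths are the placeholders from the question:

// Same export as before, but over plain HTTP instead of svn+ssh, so there is no SSH password prompt.
$output = shell_exec('svn export --non-interactive --username myusername --password mypwd http://ipaddress/svn/test/trunk tmpDir 2>&1');
echo $output;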
How do you authenticate from your dev machine to the SVN server? You might be using a key to authenticate (do you have PuTTY's Pageant running?).
Maybe check out the Subversion PHP module (1.0.3) instead of wrapping shell_exec; it requires building from source with phpize, ./configure and make (I just built it against PHP 5.6 and Subversion 1.9.5). The Apache module mod_dav_svn (Subversion via HTTP/HTTPS) is not required for version control; it is rather an optional method of accessing the repository.
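If you do build that extension, the export no longer needs shell_exec() at all. A rough, hedged sketch based on the extension's documented svn_auth_set_parameter() and svn_export() functions (the URL, credentials and target directory are again the placeholders from the question; check the exact signatures against the extension's docs for your version):

// Authenticate and export straight from PHP via the PECL svn extension.
svn_auth_set_parameter(SVN_AUTH_PARAM_DEFAULT_USERNAME, 'myusername');
svn_auth_set_parameter(SVN_AUTH_PARAM_DEFAULT_PASSWORD, 'mypwd');
svn_export('http://ipaddress/svn/test/trunk', SVN_REVISION_HEAD, 'tmpDir', false); // false = export from a URL, not a working copy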