Running AngularJS and Laravel apps together under the same domain - php

So I'm trying to build two separate applications: one used as a backend (Laravel as a REST API) and an Angular application as the client. Eventually these two apps have to work together under the same domain as a single web app.
What I'm trying to accomplish:
My Angular app is a single-page application that boots from index.html; all routes are handled by Angular except /api/*, which should be handled by the Laravel app.
I'm using two separate apps to keep the web app flexible, so I can easily swap the backend framework and technologies and test each app as a stand-alone more easily.
I don't want to use CORS in my response headers because my REST API serves ONLY my Angular app and not other applications, such as a public API for developers.
I want to use a proxy that forwards all requests coming to http://localhost:9100/api/* to http://localhost:9000/api/*.
First, I'm running Laravel on port 9000 with:
php artisan serve --port 9000
And the Angular app on port 9100 by running a gulp task (index.html is in ./src):
var gulp = require('gulp');
var connect = require('gulp-connect');

gulp.task('webserver', function() {
    connect.server({
        root: './src',
        port: 9100,
        middleware: function(connect, o) {
            var url = require('url');
            var proxy = require('proxy-middleware');
            // Forward everything under /api to the Laravel dev server
            var options = url.parse('http://localhost:9000/api');
            options.route = '/api';
            return [proxy(options)];
        }
    });
});
Both apps work perfectly as stand-alones, but when I try to navigate to:
http://localhost:9100/api/v1/comments
I receive the following error:
Error: connect ECONNREFUSED
    at errnoException (net.js:904:11)
    at Object.afterConnect [as oncomplete] (net.js:895:19)
I tried to investigate the cause of this problem; some people say it is connected to the hosts file, so I added the line:
127.0.0.1 localhost
But it doesn't work.
I tried a different gulp task:
var gulp = require('gulp');
var webserver = require('gulp-webserver');

gulp.task('webserver', function() {
    gulp.src('./src')
        .pipe(webserver({
            port: 9100,
            livereload: true,
            open: 'http://localhost:9100',
            proxies: [
                {
                    source: '/api', target: 'http://localhost:9000/api'
                }
            ]
        }));
});
And I receive the exact same error...
My development environment is Windows 10 x64.

Did you try to use http-proxy-middleware instead of proxy-middleware?
I experienced the same error with proxy-middleware. (Gulp browser-sync - redirect API request via proxy)
Error: connect ECONNREFUSED
at errnoException (net.js:904:11)
at Object.afterConnect [as oncomplete] (net.js:895:19)
Ended up creating http-proxy-middleware, which solved the issue in my case.
proxy-middleware somehow didn't work on the corporate network; http-proxy just did. (http-proxy-middleware uses http-proxy to do the actual proxying.)
I guess you are using gulp-webserver; the proxy can be added like this:
var proxyMiddleware = require('http-proxy-middleware');

gulp.task('webserver', function() {
    gulp.src('./src')
        .pipe(webserver({
            port: 9100,
            livereload: true,
            open: 'http://localhost:9100',
            middleware: [proxyMiddleware('/api', {target: 'http://localhost:9000'})]
        }));
});
Never found out why this error is thrown with proxy-middleware in the corporate network ...
Update:
I think this question has been answered: it was a problem with the artisan server; it had to be run this way:
php artisan serve --host 0.0.0.0
Source:
https://github.com/chimurai/http-proxy-middleware/issues/38
https://github.com/chimurai/http-proxy-middleware/issues/21#issuecomment-138132809
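Putting the two pieces together, here is a minimal sketch of a working setup (my assumptions, not the asker's exact files: gulp, gulp-webserver and http-proxy-middleware are installed, and targeting 127.0.0.1 explicitly is just an extra safeguard against localhost resolving to ::1):
// gulpfile.js - minimal sketch
var gulp = require('gulp');
var webserver = require('gulp-webserver');
var proxyMiddleware = require('http-proxy-middleware');

gulp.task('webserver', function() {
    return gulp.src('./src')
        .pipe(webserver({
            port: 9100,
            livereload: true,
            open: 'http://localhost:9100',
            // Proxy /api/* to the Laravel dev server, addressed by IPv4 explicitly
            middleware: [proxyMiddleware('/api', { target: 'http://127.0.0.1:9000' })]
        }));
});
Laravel then has to be started so the proxy can actually reach it, e.g. php artisan serve --host 0.0.0.0 --port 9000 as in the update above.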

Related

Can't log in to Laravel API on staging server (Laravel Forge) via NextJs or Postman

I have developed an API project with Laravel using Sanctum (token) and NextJs for the frontend. I have set things up correctly and everything is working fine on localhost.
I deployed the project on Laravel Forge with one custom subdomain (e.g. api.example.com). I ran php artisan storage:link and php artisan migrate:fresh --seed (with the env set to staging) as per their guide (cd /to the path && artisan command), and this works. FRONTEND_URL in the env has also been updated to the live frontend URL (e.g. nextjs.example.com).
I tried logging in to the backend from NextJs after deploying the backend on Laravel Forge and NextJs on Vercel. https://api.example.com/sanctum/csrf-cookie is working correctly, as it responds to the browser with the XSRF-TOKEN. But it fails on login with a CSRF token mismatch.
Then I tried logging in with Postman and the same thing happens. I can request the csrf-cookie separately but cannot log in to the API backend with the returned token. However, it is working fine on localhost.
This is a piece of my working code on localhost (NextJs):
const csrf = () => axios.get('/sanctum/csrf-cookie');

const login = async (loginDetails, setErrors) => {
    setErrors('');
    await csrf();
    await axios
        .post('/login', loginDetails, {
            headers: {
                'X-XSRF-TOKEN': getCookie('XSRF-TOKEN'),
            },
        })
        .then(async (res) => {
            if (res.data.status === 401) {
                setErrors(res.data.message);
            } else {
                localStorage.setItem('user_token', res.data.data.token);
                setCookie('user_token', res.data.data.token);
                await axios
                    .get('/api/authenticated-user', {
                        headers: {
                            Authorization: `Bearer ${localStorage.getItem('user_token')}`,
                        },
                    })
                    .then((res) => {
                        localStorage.setItem(
                            'user_data',
                            JSON.stringify(res.data.data)
                        );
                    });
                router.push('/dashboard');
            }
        })
        .catch((err) => {
            setErrors(err.response.data.message);
        });
};
I got the solution for this.
The reason it was not working in Postman is that the X-Requested-With: XMLHttpRequest and X-XSRF-TOKEN: {{csrf-token}} headers were missing after I duplicated the request that works on localhost for the live version. If you are experiencing the same issue, please double-check whether those headers are present on the request. After adding them, it started working in Postman, but not in the live NextJs project with the subdomain.
To make it work perfectly on the subdomain, please check this thread, since I created a separate question at the same time to make it clearer for those who wanted to answer.
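For reference, here is a minimal sketch (an assumed setup, not the exact code from the project) of an axios instance that always carries both of those headers; the getCookie helper from the snippet above is replaced by an inline cookie lookup:
// Hypothetical axios setup for illustration; adjust baseURL to your API host.
import axios from 'axios';

const api = axios.create({
    baseURL: 'https://api.example.com',
    withCredentials: true, // send/receive the Sanctum session and XSRF cookies
});

// The header that was missing on the live requests:
api.defaults.headers.common['X-Requested-With'] = 'XMLHttpRequest';

// Mirror the XSRF-TOKEN cookie into the X-XSRF-TOKEN header on every request:
api.interceptors.request.use((config) => {
    const match = document.cookie.match(/(?:^|;\s*)XSRF-TOKEN=([^;]+)/);
    if (match) config.headers['X-XSRF-TOKEN'] = decodeURIComponent(match[1]);
    return config;
});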

laravel 5.6 - connection refused

I am testing a Laravel 5.6 app running on localhost:8000.
I have a separate Vue project (not the one inside Laravel, but an external stand-alone Vue project) running on localhost:8080.
I want the external Vue app to call a route from web.php in Laravel:
Route::get('/stores/{store}', 'StoresController@show');
My code in the external Vue app:
axios.get('http://localhost:8000/stores/68') // <-- I do have a record with id 68 in the database
    .then((response) => {
        console.log(self.postdata);
    }).catch((err) => {
        console.log(err);
    });
And I get this error:
GET http://localhost:8000/stores/68 net::ERR_CONNECTION_REFUSED
I tried with the IP address too:
axios.get('http://192.168.0.142:8000/stores/68')
And I still get this error:
GET http://192.168.0.142:8000/stores/68 net::ERR_CONNECTION_REFUSED
After much reading, I suspect it is because the CSRF token doesn't exist in the external Vue app.
How can I resolve this?
Edit:
Below is a screencap of the error from the console:

Guzzle HTTP from Docker to NodeJS in Host [duplicate]

This question already has answers here:
From inside of a Docker container, how do I connect to the localhost of the machine?
(41 answers)
How to access host port from docker container [duplicate]
(17 answers)
Closed 4 years ago.
Folks, I have a service running on my host machine. It is a NodeJS app with Express. It works fine at "localhost:3000".
Then, in a separate project, I have a Laravel App running fine inside Docker, and I access it at "http://localhost".
Now, my Laravel app needs to call the NodeJS app. I saw in the Docker documentation that I should use "host.docker.internal", since it will resolve to my host machine.
The $this->http property is a Guzzle Client instance.
In my PHP code I have this:
$response = $this->http->request('POST', env($store->remote), [
    'form_params' => [
        'login' => $customer->login,
        'password' => $customer->password,
    ]
]);
If I call the NodeJS app from Postman it works fine, but calling it from that PHP code I get this error:
"message": "Client error: `POST http://host.docker.internal:3000` resulted in a `404 Not Found` response:\n<!DOCTYPE html>\n<html lang=\"en\">\n<head>\n<meta charset=\"utf-8\">\n<title>Error</title>\n</head>\n<body>\n<pre>Cannot POST /</p (truncated...)\n",
"exception": "GuzzleHttp\\Exception\\ClientException",
"file": "/var/www/html/vendor/guzzlehttp/guzzle/src/Exception/RequestException.php",
"line": 113,
Does anyone have any clue how I can call my node app from PHP in Docker?
EDIT
I was wondering whether I should open port 80 and bind it to port 3000 on my PHP instance (since the request runs inside the PHP Docker image). I put this ports attribute in my docker-compose file:
php:
  build: ./docker
  volumes:
    - .:/var/www/html
    - ./.env:/var/www/html/.env
    - ./docker/config/php.ini:/usr/local/etc/php/php.ini
    - ./docker/config/php-fpm.conf:/usr/local/etc/php/php-fpm.conf
    - ./docker/config/xdebug.ini:/usr/local/etc/php/conf.d/docker-php-ext-xdebug.ini
  links:
    - mysql
  ports:
    - "3000:80"
So, port 80 in my PHP instance would bind to my OSX port 3000. But Docker complains port 3000 is in use:
Cannot start service php: b'driver failed programming external connectivity on endpoint project_php_1 (241090....): Error starting userland proxy: Bind for 0.0.0.0:3000 failed: port is already allocated'
Yes! In fact it is allocated. It is allocated by my NodeJS app, which is where I want to go. It looks like I do not understand very well how ports and DNS work inside Docker for Mac.
Any help is much appreciated.
SOLVED
Hey guys, I figured it out. I turned off the Docker container, pointed a regular Apache at my Laravel project, and saw what was happening: CORS.
I already had cors in my Express app, but after configuring it better, it worked!
Here it is, in case anyone stumbles here and needs it:
1) Add cors to your Express app (if you haven't yet).
2) Configure cors for your domains. For now, I will keep it open, but for production apps, please take care and control wisely who can query your app:
// Express app:
app.use(
    cors({
        "origin": "*",
        "methods": "GET,HEAD,PUT,PATCH,POST,DELETE",
        "preflightContinue": false,
        "optionsSuccessStatus": 204
    })
);
app.options('*', cors());
3) Use the host.docker.internal address (in my case host.docker.internal:3000, since my app is running on that port) from PHP to reach your Express app on the OSX host machine. In my case it will be a different domain/IP when it gets to production.
4) Just use Guzzle\Client to make your HTTP call:
$response = $this->http->request('POST', env($store->remote) . '/store-api/customers/login', [
    'json' => [
        "login" => $customer->login,
        "password" => encrypt($customer->password),
    ]
]);
An important point to note: Express expects JSON (in my app, at least), so do NOT use "form_params"; use the "json" option for POST requests.
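For context, here is a minimal sketch of why the "json" option matters on the Express side (the route and field names are taken from the snippet above; everything else is assumed): Guzzle's json option sends an application/json body, and Express only fills req.body from it when a JSON body parser is registered.
// Minimal Express sketch, assumptions only - not the original app.
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors({ origin: '*' }));
app.use(express.json()); // parses application/json bodies; form_params would need express.urlencoded()

app.post('/store-api/customers/login', (req, res) => {
    // With Guzzle's 'json' option the credentials arrive parsed in req.body
    const { login, password } = req.body;
    res.json({ ok: Boolean(login && password) });
});

app.listen(3000);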
At least it was NOT a duplicate of the other answers, as marked by #Phil, because those answers point to the same solution I have already mentioned: use the 'host.docker.internal' address.

Make request to php endpoint in angular4

So I am new to Angular 4. I have a service that is supposed to make a request to a PHP API. I have the following code:
getWeather(keyword: string) {
    return this.http.get("weather.php", {
        params: new HttpParams().set('keyword', keyword),
        headers: new HttpHeaders().set('Method', 'search'),
    });
}
I am getting a 404: localhost:4200/weather.php cannot be found. I am not sure where to place the PHP endpoint file.
Assuming that localhost:4200 is your front-end server, you have to set your PHP server's API path in the call. If your PHP server runs locally on port 8080, you type this.http.get('http://localhost:8080/weather.php',...
You can put it in your src/assets folder and then call it using this.http.get("assets/weather.php", {....
This is because the Angular application is hosted on port 4200 while your PHP file will be served on port 80.
Try using
http://localhost/path/to/weather.php
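For illustration, a sketch of the service method pointed at an absolute URL (the host and port are assumptions; use wherever your PHP API actually runs):
// Hypothetical sketch of the corrected method (types omitted); assumes HttpParams and
// HttpHeaders from @angular/common/http are imported in the service as in the question.
getWeather(keyword) {
    // Absolute URL so the request bypasses the Angular dev server on :4200
    return this.http.get('http://localhost:8080/weather.php', {
        params: new HttpParams().set('keyword', keyword),
        headers: new HttpHeaders().set('Method', 'search'),
    });
}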

Configuring PhpStorm RESTful Client to work with Laravel

I'm trying to build a RESTful API with Laravel using PhpStorm and the artisan server, but when I try to test with the REST Client I receive this error:
For now I have written only the GET method, and I receive correct output in my browser at http://localhost:8000/users.
This is my code:
routes.php
Route::resource('users','UserController');
UserController.php
public function index()
{
    return \Response::json(User::all());
}
I also tried adding json in the request window.
php artisan serve --host 127.0.0.1 should do the trick.
Looks like PhpStorm does a lookup on localhost which results in 127.0.0.1, but php artisan serve binds to your local IPv6 address ::1.
