Random 400 "token_invalid" errors with Laravel / jwt-auth and Angular / Satellizer app - php

I have an Angular app that consumes an API I built in Laravel, and I use jwt-auth for token management and satellizer on the front end to send the token with each request.
My live environment (for both the front end and the API - which will be moved to a different server once the app is finished) at the moment consists of 2 AWS EC2 instances running nginx with a load balancer. Both servers have the same jwt secret key.
However, I randomly get 400 "token_invalid" errors returned from my API, and at the moment I can't work out any pattern to it. It is not one particular API route, nor does it happen on every load of the app. When I get a 400 error from my /clients endpoint, for example, other requests will have returned 200s. Next time, all will return 200s. The time after that I may get a 200 for /clients but a 400 error for /users.
Could this be an issue with my use of a load balancer? The jwt secret key, as I said, is the same on both servers, as all the code is in Git.
I am not using the jwt.refresh middleware.
One other thing to mention is that I don't ever get 400 errors returned when running the app locally via Homestead, only in production.
EDIT - it seems as though logging out (which clears both my user object (basic details only) and the token from local storage), clearing my cache, then logging back in most often causes the error - is this helpful?
Below is an example of one of my api calls.
App.js
.service('ClientsService', function($http, $q, __env) {
    this.index = function () {
        var deferred = $q.defer();
        $http.get(__env.apiUrl + '/clients')
            .then(function successCallback(response) {
                console.log(response.data);
                deferred.resolve(response.data);
            },
            function errorCallback(response) {
                console.log(response);
                deferred.reject(response); // propagate the error instead of leaving the promise pending
            });
        return deferred.promise;
    };
})
ClientsController.js
.controller('ClientsController', function(ClientsService, $stateParams, $mdDialog, $mdToast) {
    var vm = this;
    ClientsService.index().then(function(clients) {
        console.log('ClientsCtrl init');
        vm.clients = clients.data;
    });
    // other controller code
})
I'm really struggling to debug this, so any help would be much appreciated. If any more info is needed, please let me know.

https://github.com/tymondesigns/jwt-auth/issues/1583
solution: use the same jwt secret in .env file

Related

socket.io read browser cookies already set by php [duplicate]

I am trying to use Socket.IO in Node.js, and want the server to give an identity to each of the Socket.IO clients. As the socket code is outside the scope of the HTTP server code, it doesn't have easy access to the request information sent, so I'm assuming it will need to be sent up during the connection. What is the best way to
1) get the information to the server about who is connecting via Socket.IO
2) authenticate who they say they are (I'm currently using Express, if that makes things any easier)
Use connect-redis and have redis as your session store for all authenticated users. Make sure on authentication you send the key (normally req.sessionID) to the client. Have the client store this key in a cookie.
On socket connect (or anytime later) fetch this key from the cookie and send it back to the server. Fetch the session information in redis using this key. (GET key)
Eg:
Server side (with redis as session store):
req.session.regenerate...
res.send({rediskey: req.sessionID});
Client side:
//store the key in a cookie
SetCookie('rediskey', <%= rediskey %>); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
//then when socket is connected, fetch the rediskey from the document.cookie and send it back to server
var socket = new io.Socket();
socket.on('connect', function() {
var rediskey = GetCookie('rediskey'); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
socket.send({rediskey: rediskey});
});
Server side:
//in io.on('connection')
//in io.on('connection')
io.on('connection', function(client) {
    client.on('message', function(message) {
        if (message.rediskey) {
            //fetch session info from redis (stored as a JSON string, so parse it first)
            redisclient.get(message.rediskey, function(e, c) {
                var session = JSON.parse(c);
                client.user_logged_in = session.username;
            });
        }
    });
});
I also liked the way pusherapp does private channels:
A unique socket id is generated and sent to the browser by Pusher. This is sent to your application (1) via an AJAX request which authorizes the user to access the channel against your existing authentication system. If successful, your application returns an authorization string to the browser signed with your Pusher secret. This is sent to Pusher over the WebSocket, which completes the authorization (2) if the authorization string matches.
Socket.io also has a unique socket_id for every socket.
socket.on('connect', function() {
    console.log(socket.transport.sessionid);
});
They used signed authorization strings to authorize users.
I haven't yet mirrored this to socket.io, but I think it could be a pretty interesting concept.
I know this is a bit old, but for future readers, in addition to the approach of parsing the cookie and retrieving the session from storage (e.g. passport.socketio), you might also consider a token-based approach.
In this example I use JSON Web Tokens which are pretty standard. You have to give to the client page the token, in this example imagine an authentication endpoint that returns JWT:
var jwt = require('jsonwebtoken');
// other requires
app.post('/login', function (req, res) {
    // TODO: validate the actual user
    var profile = {
        first_name: 'John',
        last_name: 'Doe',
        email: 'john@doe.com',
        id: 123
    };
    // we are sending the profile in the token
    var token = jwt.sign(profile, jwtSecret, { expiresInMinutes: 60*5 });
    res.json({token: token});
});
Now, your socket.io server can be configured as follows:
var socketioJwt = require('socketio-jwt');
var sio = socketIo.listen(server);
sio.set('authorization', socketioJwt.authorize({
    secret: jwtSecret,
    handshake: true
}));
sio.sockets.on('connection', function (socket) {
    console.log(socket.handshake.decoded_token.email, 'has joined');
    //socket.on('event');
});
The socketio-jwt middleware expects the token in a query string, so from the client you only have to attach it when connecting:
var socket = io.connect('', {
    query: 'token=' + token
});
I wrote a more detailed explanation about this method and cookies here.
Here is my attempt to have the following working:
express: 4.14
socket.io: 1.5
passport (using sessions): 0.3
redis: 2.6 (Really fast data structure to handle sessions; but you can use others like MongoDB too. However, I encourage you to use this for session data + MongoDB to store other persistent data like Users)
Since you might want to add some API requests as well, we'll also use http package to have both HTTP and Web socket working in the same port.
server.js
The following extract only includes everything you need to set the previous technologies up. You can see the complete server.js version which I used in one of my projects here.
import http from 'http';
import express from 'express';
import passport from 'passport';
import Session from 'express-session'; // used below; missing from the original extract
import { createClient as createRedisClient } from 'redis';
import connectRedis from 'connect-redis';
import Socketio from 'socket.io';
// Your own socket handler file, it's optional. Explained below.
import socketConnectionHandler from './sockets';

const app = express();
const server = http.createServer(app);

// Configuration about your Redis session data structure.
const redisClient = createRedisClient();
const RedisStore = connectRedis(Session);
const dbSession = new RedisStore({
    client: redisClient,
    host: 'localhost',
    port: 6379, // Redis default port (27017 is MongoDB's)
    prefix: 'stackoverflow_',
    disableTTL: true
});
// Let's configure Express to use our Redis storage to handle
// sessions as well. You'll probably want Express to handle your
// sessions as well and share the same storage as your socket.io
// does (i.e. for handling AJAX logins).
const session = Session({
    resave: true,
    saveUninitialized: true,
    key: 'SID', // this will be used for the session cookie identifier
    secret: 'secret key',
    store: dbSession
});
app.use(session);
// Let's initialize passport by using their middlewares, which do
// everything pretty much automatically. (You have to configure login
// / register strategies on your own though; see reference 1.)
app.use(passport.initialize());
app.use(passport.session());
// Socket.IO
const io = Socketio(server);
io.use((socket, next) => {
    session(socket.handshake, {}, next);
});
io.on('connection', socketConnectionHandler);
// socket.io is ready; remember that socketConnectionHandler is just the
// name that we gave to our own socket.io handler file (explained
// just after this).

// Start server. This will start both socket.io and our optional
// AJAX API in the given port.
const port = 3000; // Move this onto an environment variable,
                   // it'll look more professional.
server.listen(port);
console.info(`🌐 API listening on port ${port}`);
console.info(`🗲 Socket listening on port ${port}`);
sockets/index.js
Our socketConnectionHandler, I just don't like putting everything inside server.js (even though you perfectly could), especially since this file can end up containing quite a lot of code pretty quickly.
export default function connectionHandler(socket) {
    const userId = socket.handshake.session.passport &&
        socket.handshake.session.passport.user;
    // If the user is not logged in, you might find ^this^
    // socket.handshake.session.passport variable undefined.

    // Give the user a warm welcome.
    console.info(`⚡︎ New connection: ${userId}`);
    socket.emit('Greetings', `Greetings ${userId}`);

    // Handle disconnection.
    socket.on('disconnect', () => {
        if (process.env.NODE_ENV !== 'production') {
            console.info(`⚡︎ Disconnection: ${userId}`);
        }
    });
}
Extra material (client):
Just a very basic version of what the JavaScript socket.io client could be:
import io from 'socket.io-client';

const socketPath = '/socket.io'; // <- Default path.
                                 // But you could configure your server
                                 // to something like /api/socket.io
const socket = io.connect('localhost:3000', { path: socketPath });
socket.on('connect', () => {
    console.info('Connected');
    socket.on('Greetings', (data) => {
        console.info(`Server greeting: ${data}`);
    });
});
socket.on('connect_error', (error) => {
    console.error(`Connection error: ${error}`);
});
References:
I just couldn't reference inside the code, so I moved it here.
1: How to set up your Passport strategies: https://scotch.io/tutorials/easy-node-authentication-setup-and-local#handling-signupregistration
This article (http://simplapi.wordpress.com/2012/04/13/php-and-node-js-session-share-redi/) shows how to
1) store sessions of the HTTP server in Redis (using Predis)
2) get these sessions from Redis in node.js by the session id sent in a cookie
Using this code you are able to get them in socket.io, too.
var io = require('socket.io').listen(8081);
var cookie = require('cookie');
var redis = require('redis'), client = redis.createClient();

io.sockets.on('connection', function (socket) {
    var cookies = cookie.parse(socket.handshake.headers['cookie']);
    console.log(cookies.PHPSESSID);
    client.get('sessions/' + cookies.PHPSESSID, function(err, reply) {
        console.log(JSON.parse(reply));
    });
});
Use session and Redis between client and server.
Server side:
io.use(function(socket, next) {
    // get the session id from the cookie header
    // and match it against the session data in Redis
    console.log(socket.handshake.headers.cookie);
    next();
});
This should do it:
//server side
io.sockets.on('connection', function (con) {
    console.log(con.id);
});
//client side
var socket = io.connect('http://...');
socket.on('connect', function () {
    console.log(socket.id);
});

I'm getting "Unauthenticated." on Laravel Sanctum API

So I created the login and it worked well, but when I want to fetch user posts it returns Unauthenticated, even though I sent the token and used the XSRF cookies; still the same problem.
axios call
axios.defaults.withCredentials = true;
axios.get('/sanctum/csrf-cookie').then(response => {
    axios.get('/api/posts/20', {headers: {Authorization: `Bearer ${localStorage.getItem('userToken')}`}}).then(res => {
        console.log(res);
    });
});
api route
Route::post('login', "UserController@index");
Route::get('posts/{id}', 'PostController@index')->middleware('auth:sanctum');
Please help me, guys; keep in mind I've tried everything out there but nothing works.
I had a problem with the domain. As TEFO said, launching the backend on a virtual host, plus adding configuration to cors.php, solved the problem.

Rate-limiting Guzzle Requests in Symfony

This actually follows on from a previous question I had that, unfortunately, did not receive any answers so I'm not exactly holding my breath for a response but I understand this can be a bit of a tricky issue to solve.
I am currently trying to implement rate limiting on outgoing requests to an external API to match the limit on their end. I have tried to implement a token bucket library (https://github.com/bandwidth-throttle/token-bucket) into the class we are using to manage Guzzle requests for this particular API.
Initially, this seemed to be working as intended but we have now started seeing 429 responses from the API as it no longer seems to be correctly rate limiting the requests.
I have a feeling what is happening is that the number of tokens in the bucket is now being reset every time the API is called due to how Symfony handles services.
I am currently setting up the bucket location, rate and starting amount in the service's constructor:
public function __construct()
{
    $storage = new FileStorage(__DIR__ . "/api.bucket");
    $rate = new Rate(50, Rate::MINUTE);
    $bucket = new TokenBucket(50, $rate, $storage);
    $this->consumer = new BlockingConsumer($bucket);
    $bucket->bootstrap(50);
}
I'm then attempting to consume a token before each request:
public function fetch(): array
{
    try {
        $this->consumer->consume(1);
        $response = $this->client->request(
            'GET', $this->buildQuery(), [
                'query' => array_merge($this->params, ['api_key' => $this->apiKey]),
                'headers' => ['Content-type' => 'application/json']
            ]
        );
    } catch (ServerException $e) {
        // Process Server Exception
    } catch (ClientException $e) {
        // Process Client Exception
    }
    return $this->checkResponse($response);
}
I can't see anything obvious in that which would allow it to request more than 50 times per minute, unless the number of available tokens is being reset on each request.
This is being supplied to a set of repository services that handle converting the data from each endpoint into objects used within the system. Consumers use the appropriate repository to request the data needed to complete their process.
If the number of tokens is being reset by the bootstrap call sitting in the service constructor, where within the Symfony framework should it be moved so that it still works with consumers?
I assume that it should work, but maybe try moving the ->bootstrap(50) call out of the constructor so it doesn't run on every request? I'm not sure, but that could be the reason.
Anyway, it's better to do that only once, as part of your deployment (every time you deploy a new version). It doesn't have anything to do with Symfony, really, because the framework doesn't place any restrictions on your deployment procedure; it depends on how you do the deployment.
P.S. Have you considered just handling 429 errors from the server? IMO you can simply wait (that's what BlockingConsumer does internally) when you receive a 429 error. It's simpler and doesn't require an additional layer in your system.
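A minimal sketch of that 429-handling idea (the injected request function and the Retry-After handling are assumptions; a real client would wire this into Guzzle middleware, but the control flow is the same):

```javascript
// Retry a request when the server answers 429, waiting before each retry.
// `doRequest` is an injected function so the logic can be tested without a network;
// it returns {status, retryAfter} where retryAfter is in seconds (may be undefined).
function requestWithRetry(doRequest, maxRetries, waitFn) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const response = doRequest();
    if (response.status !== 429) {
      return response;
    }
    // Honor Retry-After when present, otherwise back off exponentially (1s, 2s, 4s, ...).
    const waitSeconds = response.retryAfter !== undefined
      ? response.retryAfter
      : Math.pow(2, attempt);
    waitFn(waitSeconds);
  }
  throw new Error('Rate limit: retries exhausted');
}

// Example: the first two calls are throttled, the third succeeds.
const responses = [{ status: 429, retryAfter: 1 }, { status: 429 }, { status: 200 }];
const waits = [];
const result = requestWithRetry(() => responses.shift(), 5, s => waits.push(s));
console.log(result.status); // 200
console.log(waits); // [1, 2] (Retry-After honored, then exponential backoff)
```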
BTW, have you considered nginx's ngx_http_limit_req_module as an alternative solution? It usually ships with nginx by default, so there is nothing extra to install; only a small configuration is required.
You can place an nginx proxy between your code and the target web service and enable limits on it. Then in your code you handle 429s as usual, but the requests will be throttled by your local nginx proxy rather than by the external web service, so the final destination only ever receives a limited number of requests.
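For reference, a minimal nginx fragment for that proxy idea (the zone name, rate, listen port, and upstream host are placeholders to adapt, not values from the question):

```nginx
# Allow at most 50 requests per minute per client, queueing short bursts.
limit_req_zone $binary_remote_addr zone=api_limit:10m rate=50r/m;

server {
    listen 8080;

    location / {
        limit_req zone=api_limit burst=10;
        # Forward the throttled traffic to the external API.
        proxy_pass https://api.example.com;
    }
}
```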
I have found a trick using Guzzle bundle for symfony.
I had to improve a sequential program that sent GET requests to a Google API; in the code example it's a PageSpeed URL.
To respect the rate limit, there is an option to delay the requests before they are sent asynchronously.
The PageSpeed rate limit is 200 requests per minute.
A quick calculation gives 60/200 = 0.3s per request.
Here is the code I tested on 300 URLs, with a great result: no errors, except when the URL passed as a parameter in the GET request returns a 400 HTTP error (Bad Request).
I put a delay of 0.4s, and the average response time was less than 0.2s, whereas the sequential program took more than a minute.
use GuzzleHttp;
use GuzzleHttp\Client;
use GuzzleHttp\Promise\EachPromise;
use GuzzleHttp\Exception\ClientException;

// ... Now inside class code ... //
$client = new GuzzleHttp\Client();
$promises = [];
foreach ($requetes as $i => $google_request) {
    // delay is the trick not to exceed the rate limit (in ms)
    $promises[] = $client->requestAsync('GET', $google_request, ['delay' => 0.4 * $i * 1000]);
}
GuzzleHttp\Promise\each_limit($promises,
    function () { // function returning the number of concurrent requests
        return 100; // 1 or 100 concurrent request(s) don't really change execution time
    },
    // Fulfilled function
    function ($response, $index) use ($urls, $fp) { // $urls holds the url passed as a parameter in the GET request; $fp is a csv file pointer
        $feed = json_decode($response->getBody(), true); // Get array of results
        $this->write_to_csv($feed, $fp, $urls[$index]); // Write to csv
    },
    // Rejected function
    function ($reason, $index) {
        if ($reason instanceof GuzzleHttp\Exception\ClientException) {
            $message = $reason->getMessage();
            var_dump(array("error" => "error", "id" => $index, "message" => $message)); // You could write the errors to a file or database too
        }
    }
)->wait();
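The delay schedule used above can be computed generically as follows (a small helper I'm adding for illustration, not part of the original code; the 200/min rate is PageSpeed's documented limit):

```javascript
// Compute per-request delays (in ms) that spread n requests evenly
// so we stay under ratePerMinute, with an optional safety margin per request.
function staggeredDelays(n, ratePerMinute, marginMs = 0) {
  const intervalMs = (60 / ratePerMinute) * 1000 + marginMs;
  return Array.from({ length: n }, (_, i) => i * intervalMs);
}

console.log(staggeredDelays(4, 200));      // [0, 300, 600, 900]
console.log(staggeredDelays(3, 200, 100)); // [0, 400, 800] -> the 0.4s stagger used above
```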

Laravel and Passport, random 401 errors

I'm writing a single page web application.
I'm using Vue.js in the frontend and Laravel in the backend.
I included Passport token authentication, and I'm getting the auth token by sending
var login_data = {
    client_id : 2,
    client_secret : "SECRET_KEY",
    grant_type : "password",
    username : "mail",
    password : "pass"
}
to the Passport endpoint http://IPADDRESS/oauth/token. Then I authenticate my AJAX requests by including this header:
{ 'Authorization': 'Bearer ' + ACC_TOKEN }
Most of the time everything works fine but sometimes I get 401 unauthorized. Usually, if I simply do it again the request goes through.
I removed the VerifyCsrfToken middleware from Kernel.php and also added the API route to the exceptions so I don't think that's the problem.
The frequency the error appears seems to change from network to network, meaning when connected to certain networks it almost never happens while sometimes it's constant.
I honestly have no idea why this happens.
My problem was in PROJECTDIR/vendor/lcobucci/jwt/src/Signer/Rsa.php.
Here, an openssl function (openssl_get_publickey) sometimes returns something wrong despite the certificate being valid.
I did not manage to find a real solution.
My hack for now is simply changing the code to always return 1.
This does not change the way the token auth works, but it removes the RSA check and the 401 errors caused by this malfunction.

Asana project is created but ajax readystate 0

I have a php application that accesses Asana API. I am able to create a project in Asana. However, the ajax call to the API class is returning a readystate=0.
While troubleshooting in Firebug I also noticed that the network console shows 302, 400 (??), and 200 status codes. I thought a 400 status code indicated an invalid request or a malformed URL, but the project gets created anyway.
Any idea?
Update: More information.
I have an Ajax call to a PHP file, which in turn calls the Asana API to get the auth code and tokens before calling the API services.
I believe I am getting the CORS warning, and hence the readystate=0 and the 400 error. Because the rest of my script proceeds with the token, it was inserting records anyway; however, after the tokens expired (3600 sec), I am now unable to insert records. I know that if I call the PHP file directly it works without the CORS error.
$.ajax({
    type: 'POST',
    url: "oa/asana.php",
    data: {apiprovider: "asana", type: "addnewproject", notes: "notes", name: "name", id: "id", resource: "projects"},
    //dataType: "json",
    //contentType: "application/json",
    success: function(data) {
        console.log(data);
    },
    error: function(error) {
        console.log("asana api error");
        console.log(JSON.stringify(error));
    },
    async: true
});
my php code looks like this.
...$asana = new AsanaAuth(API_KEY, API_SECRET, callbackUrl);
if (!isset($_GET['code'])) {
    $url = $asana->getAuthorizeUrl();
    header('Location:' . $url);
} else {
    $asana->getAccessToken($_GET['code']);
    if ($asana->hasError()) {
        echo 'Error response code: ' . $asana->responseCode;
    } else {
        echo $asana->response;
    }
}
Is there a better way to do this outside of Ajax call?
OK, here's how I fixed it. Partly my understanding was wrong, and I had made a wrong assumption.
This is purely based on my application's needs and may not be applicable to everyone.
I have a settings page where the user clicks on the Asana app to authorize the connection. This is not an Ajax call but a hyperlink to a file, which is also my redirect URI. Within this file I check whether the user has already authorized; if not, I call authorize, get the tokens, and store the refresh token in the database for future use.
In the other parts of my application, when the user clicks on "create an Asana project", I check the validity of the access token; if it has expired, I refresh it and get another token.
Earlier, I was trying to call all of this from one page (which in a normal user flow will never happen in my application). So right now there is no Ajax call for authorization; the refresh-token call and the Asana API endpoints go through Ajax and work fine.
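That expiry check can be sketched as follows (a simplified model; the field names and the refresh callback are assumptions for illustration, not Asana's actual API):

```javascript
// Return a valid access token, refreshing it first when it has expired.
// `stored` holds {accessToken, expiresAt} (epoch seconds); `refreshFn` trades
// the stored refresh token for a new access token and lifetime.
function getValidToken(stored, refreshFn, nowSeconds) {
  if (nowSeconds < stored.expiresAt) {
    return stored.accessToken; // still valid, reuse it
  }
  const fresh = refreshFn(); // e.g. POST the refresh token to the OAuth token endpoint
  stored.accessToken = fresh.accessToken;
  stored.expiresAt = nowSeconds + fresh.expiresIn; // tokens last 3600s per the question
  return stored.accessToken;
}

const stored = { accessToken: 'old', expiresAt: 1000 };
const refreshFn = () => ({ accessToken: 'new', expiresIn: 3600 });
console.log(getValidToken(stored, refreshFn, 999));  // "old" (not yet expired)
console.log(getValidToken(stored, refreshFn, 1000)); // "new" (expired, refreshed)
```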
Hope this helps someone in future.
