I'm trying to forward a request made from the client for a stream, where it keeps the requests originally made from the video player intact:
Connection: Keep-Alive
Range: bytes=0-
...
What I'm Using:
Frontend: Web - ReactJS
Backend: PHP REST API
CDN: AWS CloudFront
Storage: AWS S3
Architecture Graphic
Reason:
I need to be able to authenticate the user with our own JWT middleware through the REST API to validate whether they can access the file.
Constraints:
Cannot use nginx to forward the request, unless there is still a way to authenticate it with the PHP Middleware.
What I've Looked Into:
aws php sdk
I've looked at the AWS PHP SDK, but documentation for this specific functionality seems to be missing.
guzzle + php curl
I'm afraid my knowledge is lacking in terms of what I would need to pass on to CloudFront for this to work.
cloudfront signed url/signature
Unless I'm mistaken, this would not be helpful: the expiration for video access would be set by AWS rather than by the app's REST API, so if the user refreshes their JWT, the signature would not be updated along with it.
why not s3 directly?
S3 doesn't support chunk headers such as Range: bytes=0-100.
Any help or recommendations would be appreciated, even if it means recommending to buy something pre-built to look at how they implemented it.
======= UPDATE: June 29, 2020 =======
After the recommendation from @ChrisWilliams, I ended up creating a script on AWS Lambda@Edge with the following configuration:
Trigger: CloudFront - viewer request
I chose the viewer request trigger because it's the only way to get the GET query parameters from the user's original request.
Function Code:
(Please forgive the very rough code to get things working)
File: index.js
// IMPORTS
const zlib = require('zlib');
const https = require('https');

// HTML ERROR TEMPLATE
const content = `
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Unauthorized Page</title>
</head>
<body>
  <p>Unauthorized!</p>
</body>
</html>
`;

// TRIGGER FUNCTION
exports.handler = async (event, context, callback) => {
  // Getting the original viewer request
  const request = event.Records[0].cf.request;

  // Gzip the HTML error page (const content above) for the error bodies
  const buffer = zlib.gzipSync(content);
  const base64EncodedBody = buffer.toString('base64');

  // Response templates
  const response401 = {
    headers: {
      'content-type': [{ key: 'Content-Type', value: 'text/html; charset=utf-8' }],
      'content-encoding': [{ key: 'Content-Encoding', value: 'gzip' }]
    },
    body: base64EncodedBody,
    bodyEncoding: 'base64',
    status: '401',
    statusDescription: 'Unauthorized'
  };
  const response500 = {
    headers: {
      'content-type': [{ key: 'Content-Type', value: 'text/html; charset=utf-8' }],
      'content-encoding': [{ key: 'Content-Encoding', value: 'gzip' }]
    },
    body: base64EncodedBody,
    bodyEncoding: 'base64',
    status: '500',
    statusDescription: 'Internal Server Error'
  };

  // Validate the token against the auth server
  const results = await new Promise((resolve, reject) => {
    // Expected ?token=ey...
    const req = https.get(`https://myauthserver.com/?${(request && request.querystring) || ''}`, (res) => {
      if (res.statusCode !== 200) {
        return reject(response401);
      }
      return resolve({ status: '200' });
    });
    req.on('error', () => {
      reject(response500);
    });
  }).catch(error => error);

  if (results.status === '200') {
    // Successful request - continue with the rest of the process
    return callback(null, request);
  }

  // Not successful - return the error response (401 or 500)
  callback(null, results);
};
NOTE: You may have to deploy this a few times if typos or syntax errors arise, because of caching. I also recommend testing from different IP addresses to validate access to the content. You will also see 502s if the returned response isn't formatted correctly with the base64EncodedBody.
DOUBLE NOTE: This was after going through AWS tutorials that were outdated or didn't work, and reading comments from multiple devs who couldn't get things working either.
I would suggest using a Lambda@Edge function rather than adding a third stage in front of your CloudFront distribution.
Adding a proxy in front of CloudFront can make debugging harder, and unless the distribution is locked down, someone could bypass the proxy and reach your CloudFront origin directly.
Using a Lambda@Edge function guarantees the JWT is validated on every request: it can be configured either to validate the JWT within the Lambda function directly or to call an endpoint you build for validation. If the JWT is invalid, it can reject the request.
Amazon has a great article with a demo stack that demonstrates how to make use of this.
This question already has answers here:
XMLHttpRequest cannot load XXX No 'Access-Control-Allow-Origin' header
I'm building a ReactJS frontend that gets data from a REST API written in PHP. React is hosted on localhost:3000 but my PHP files are hosted on localhost:80, so I'm not sure how to write the base URL in React; so far it always throws an error.
How can I solve this? Thank you.
error:
Access to XMLHttpRequest at 'http://localhost/reacttest/src/api/read.php' from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
xhr.js:184 GET http://localhost/reacttest/src/api/read.php net::ERR_FAILED
ReactJS:
import React from 'react';
// import logo from './logo.svg';
import './App.css';
import axios from "axios";
const baseUrl = "http://localhost:80/reacttest/src/api";
const sampleGet = async () => {
const result = await axios.get(baseUrl + "/read.php");
console.log(result);
};
const samplePost = async () => {
const result = await axios.post(baseUrl + "/posts", {
sampleData: "nabezap"
});
console.log(result);
};
const sampleDelete = async () => {
const result = await axios.delete(baseUrl + "/posts/4");
console.log(result);
};
function App() {
return (
<div className="App">
<button onClick={sampleGet}>GET</button>
<button onClick={samplePost}>POST</button>
<button onClick={sampleDelete}>DELETE</button>
</div>
);
}
export default App;
read.php:
<?php
header('Content-Type: application/json;charset=utf-8');// all echo statements are json_encode
include('se.php');
include('db.php');
session_start();
$doctordb = new doctorModel; //instantiate database to start using
$result = $doctordb->showDoctorinfo();
if($result == false) {
http_response_code(204); // no content
} elseif(is_array($result)) {
http_response_code(200); //success
echo json_encode($result);
}
?>
api.php:
The base URL is correct (80 is the default HTTP port), and if you check the network tab in dev tools, you'll see the request did in fact go out and received the expected response.
The issue is with your REST API. The clue is in the error:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
This means the server received the request, processed it and returned a response, but it didn't attach an Access-Control-Allow-Origin: http://localhost:3000 header. When your browser receives the response from the API, it checks for this header and refuses JavaScript access to the response data if it's missing. This is normal.
Setting up CORS on your REST API is the way to go. What framework (if any) are you using? I’ll edit this answer with more info once I know.
I am having trouble when I'm trying to initialise a Channel.
I've followed some tutorials provided (https://getstream.io/blog/chat-messaging-with-laravel/, https://getstream.io/blog/tutorial-build-customer-support-chat-with-laravel-vue-and-stream/) that have a stack as mine (Laravel + Vue)
I am already getting the token on the backend, initializing the Client, setting the User and the token on the client.
But when I try to do this.channel.watch(); or even a simple channels query like
const filter = { type: 'messages', id: '1000056864'};
const sort = { last_message_at: -1 };
const channels = await this.client.queryChannels(filter, sort, {
watch: true,
state: true,
});
It will return to me the error as follows:
Access to XMLHttpRequest at 'https://chat-us-east-1.stream-io-api.com/channels/messages/1000056864/query?user_id=62&api_key=2e******e2&connection_id=5983f850-3d50-4ac3-9c06-d9e0fdaf7212' from origin 'http://local.site.test' has been blocked by CORS policy: Request header field x-csrf-token is not allowed by Access-Control-Allow-Headers in preflight response.
Everything is working on the backend, even the equivalent calls.
Based on the error you are receiving, it looks like you are including your CSRF token in all of your AJAX requests. Stream's API servers have a whitelist of headers you can pass; this is to protect developers from sending sensitive data by accident. In this specific case it is arguable that the CSRF token could be on such a whitelist for the sake of ease of use.
Perhaps you are using something like this on your frontend?
$.ajaxSetup({
headers: {
'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content')
}
});
If that's the case, my suggestion is to opt for a more fine-grained solution that only attaches the header to requests going to your own backend, for example with a prefilter:
$.ajaxPrefilter(function (options, originalOptions, jqXHR) {
    // Only attach the CSRF token to your own Laravel routes
    if (options.url.indexOf('/laravel/') === 0) {
        jqXHR.setRequestHeader('X-CSRF-TOKEN', $('meta[name="csrf-token"]').attr('content'));
    }
});
Or make sure that only your Laravel backend receives the CSRF token by isolating the JS code that makes those AJAX calls.
CSRF tokens are not as valuable as session IDs, but they exist to make your application more secure and are not meant to be shared with third parties.
I'm using this package to add Google Cloud Tasks to my project and it works perfectly. The problem is that I can't figure out how to increase the HTTP target request timeout.
Use dispatchDeadline if you are creating a task using nodejs.
Source: https://googleapis.dev/nodejs/tasks/latest/google.cloud.tasks.v2beta3.Task.html
Example implementation:
// npm install --save @google-cloud/tasks
const { CloudTasksClient } = require('@google-cloud/tasks');

const client = new CloudTasksClient();
const project = 'your-project-name';
const queue = 'your-queue-name';
const location = 'us-central1';
const parent = client.queuePath(project, location, queue);
const serviceAccountEmail = 'user@projectname_or_whatever.iam.gserviceaccount.com';
const url = 'http://destination_url';
const payload = JSON.stringify({ user: 'Manuel Solalinde', mode: 'secret mode' });
const body = Buffer.from(payload).toString('base64');

// task creation documentation: https://googleapis.dev/nodejs/tasks/latest/google.cloud.tasks.v2beta3.Task.html
const task = {
  httpRequest: {
    httpMethod: 'POST',
    url: url,
    dispatchDeadline: 30 * 60, // 30 minutes
    body: body,
    headers: { 'Content-type': 'application/json' },
    oidcToken: {
      serviceAccountEmail,
    },
  },
};

// Send create task request.
console.log('Sending task:');
const [response] = await client.createTask({ parent, task });
console.log(`Created task ${response.name}`);
The dispatch_deadline property of the Task object should allow you to extend the request timeout. The default is 10 minutes for HTTP targets.
Cloud Tasks Client Library Documentation for PHP
I can't comment due to lack of reputation, but the first solution is incorrect. dispatch_deadline is part of the task request, not the httpRequest. It should be moved out one level of that object.
task: {
  dispatch_deadline: 200,
  httpRequest: {
  }
}
However, I tried to implement this and unfortunately the request just hangs when you add this flag. My request never goes through to creating a task. I think it is a broken feature.
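For what it's worth, in the v2 proto dispatchDeadline is a Duration message ({ seconds }) on the Task itself, so the placement described above could be sketched like this (the field shape is assumed from the Task proto documentation; this is untested against the live API):

```javascript
// Sketch only: dispatchDeadline at the task level, as a Duration,
// alongside (not inside) httpRequest. URL is a placeholder.
const task = {
  dispatchDeadline: { seconds: 30 * 60 }, // 30 minutes
  httpRequest: {
    httpMethod: 'POST',
    url: 'http://destination_url',
  },
};
```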
I've got a very strange issue.
local hosted PHP Slim App using XAMPP (localhost:4040)
local hosted Angular 4 App using CLI (localhost:4200)
Making API Requests using "Postman" and browser is no problem, everything works fine.
Now I'm integrating the requests into my Angular app using import { Headers, Http } from '@angular/http'; and observables.
const requestUrl = 'http://localhost:4040/register';
const headers = new Headers({
'content-type': 'application/x-www-form-urlencoded'
});
this.http
.get(requestUrl, {headers: headers})
.map(response => response.json())
.subscribe(result => {
console.log(result);
}, error => {
console.log(error);
});
The request always fails with:
Failed to load http://localhost:4040/register: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:4200' is therefore not allowed access.
But: I am definitely sending these headers!
public static function createJsonResponseWithHeaders($response, $requestedData)
{
    // Bail out early if there is nothing to serve
    if (!$requestedData || (is_array($requestedData) && count($requestedData) === 0)) {
        return $response->withStatus(404)->write('Requested data not found or empty! ErrorCode: 011017');
    }

    // Add CORS headers
    $response = $response->withHeader('Access-Control-Allow-Origin', '*');
    $response = $response->withHeader('Access-Control-Allow-Methods', 'GET');

    // Add JSON response and gzip compression headers, then compress the content
    $response = $response->withHeader('Content-type', 'application/json; charset=utf-8');
    $response = $response->withHeader('Content-Encoding', 'gzip');
    $json = json_encode($requestedData, JSON_UNESCAPED_UNICODE | JSON_UNESCAPED_SLASHES | JSON_NUMERIC_CHECK | JSON_PRETTY_PRINT);
    $response->getBody()->write(gzencode($json, 9));

    return $response;
}
What I already tried for solving:
Run Slim App inside a Docker Container to get a different origin than localhost - same behaviour
Add allow-origin-header right on top of the index.php
header('Access-Control-Allow-Origin: *'); - same behaviour
Your requests are blocked because CORS is not set up properly. There are other questions that address this, e.g. How to make CORS enabled requests in Angular 2.
What you should ideally look at is using a proxy that forwards your requests to the API; the latest Angular CLI comes with support for a dev proxy out of the box (see https://github.com/angular/angular-cli/blob/master/docs/documentation/stories/proxy.md). You set it up with a proxy.conf.json that could look like this:
{
"/api": {
"target": "http://localhost:4040",
"secure": false,
"pathRewrite": {"^/api" : ""}
}
}
What this configuration does: any request from Angular to a URI matching /api will be forwarded to localhost:4040, with the /api prefix stripped by pathRewrite.
Note that you will also need to figure out how your app will talk to the API server in a non-dev environment. I have been happy with using Nginx to serve Angular files, and act as proxy for the API.
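For that non-dev setup, a minimal Nginx sketch (paths and ports are illustrative, not from the answer) that serves the Angular build and proxies API calls might look like:

```nginx
server {
    listen 80;

    # Serve the built Angular app
    root /var/www/angular-app/dist;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }

    # Forward API calls to the backend; the trailing slash on
    # proxy_pass strips the /api/ prefix, mirroring pathRewrite
    location /api/ {
        proxy_pass http://localhost:4040/;
        proxy_set_header Host $host;
    }
}
```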
Sorry, my bad. The solution is simple:
The Cache-Control header on the request is apparently not allowed: sending it triggers a CORS preflight (OPTIONS) request that my Slim app never answered with an Access-Control-Allow-Headers header, even though everything worked fine when testing the API with Postman (which doesn't enforce CORS).
I removed the header from the request and everything worked well.
I am currently working on an AngularJS project with a server backend written in PHP. The frontend and backend communicate entirely in JSON, however, there is an export scenario where the server's output is not JSON encoded but instead a (text or binary) file.
The web application cannot just redirect the client's browser to a download URL as the server requires custom headers in the HTTP request (i.e. an API key) to serve the file. Therefore, I am using $http in AngularJS to initiate an AJAX request. Here is what happens:
File generation on the server side (using PHP with Slim framework):
$export = $this->model->export_cards($project_key);
$this->app->response()->status(200);
$this->app->response()->header("Content-Type", "text/plain");
$this->app->response()->header("Content-Disposition", "attachment; filename=\"export.txt\"");
$this->app->response()->header("Last-Modified", date("r"));
$this->app->response()->header("Cache-Control", "cache, must-revalidate");
$this->app->response()->body($export);
$this->app->stop();
This is what happens on the client side (so far):
$http({
method: "get",
url: "/server/projects/cards/export_cards/" + $scope.key,
headers: {
"X-API-Key": session_service.get("api_key")
}
}).then(
function(response)
{
// Success, data received
var data = response.data; // This variable contains the file contents (might be plain text, or even binary)
// How do I get the browser to offer a file download dialog here?
},
function(response)
{
// Error handling
}
);
I successfully receive the file contents in the AngularJS frontend and store them in a variable data. How do I get the browser to display a file download dialog?
The solution must work in Internet Explorer 10+ and reasonably recent versions of Firefox, Chrome and Safari (only desktop versions).
What is the best way to achieve this?
Thank you for your help and let me know if I need to provide any additional information.
Peter
I'm not sure this is possible.
Could you either:
Supply the API key directly as a query parameter, e.g.:
location.href = "/server/projects/cards/export_cards/" + $scope.key + '?api_key=' + session_service.get("api_key");
Or have your API return a temporary, time-expiring URL for the file download, and then use location.href to access that URL.
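Alternatively, since the data is already held in a variable on the client, a commonly used approach (not from the answer above; the function name is illustrative) is to wrap it in a Blob and trigger a download via a temporary object URL, with navigator.msSaveBlob covering IE 10+:

```javascript
// Sketch only (browser code): wrap received data in a Blob and
// trigger a file download dialog.
function saveAsFile(data, filename, mimeType) {
  const blob = new Blob([data], { type: mimeType });

  if (typeof navigator !== 'undefined' && navigator.msSaveBlob) {
    // IE 10+ has its own save API
    navigator.msSaveBlob(blob, filename);
    return;
  }

  // Other browsers: temporary object URL + programmatic click
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  document.body.appendChild(a);
  a.click();
  document.body.removeChild(a);
  URL.revokeObjectURL(url);
}
```

It would be called from the $http success handler as saveAsFile(response.data, 'export.txt', 'text/plain'); for binary content the request should also set responseType: 'arraybuffer' so the data isn't mangled.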