The goal is to make an (empty) HTTP request from Angular 7 to PHP and receive binary data in Angular for use with protobuf3.
More specifically, the binary data (encoded as described at https://developers.google.com/protocol-buffers/docs/encoding) is produced as a string on the PHP side, while the goal on the Angular side is a Uint8Array.
I currently have the following working code:
PHP Code (a simple ProcessWire root template):
header('Content-Type: application/b64-protobuf');
…
echo base64_encode($response->serializeToString());
Angular:
let res = this.httpClient.get(`${this.API_URL}`, { responseType: 'text' });
res.subscribe((data) => {
  let binary_string = atob(data);
  let len = binary_string.length;
  let bytes = new Uint8Array(len);
  for (let i = 0; i < len; i++) {
    bytes[i] = binary_string.charCodeAt(i);
  }
  let parsedResponse = pb.Response.deserializeBinary(bytes);
});
As you can see, I base64-encode the data before sending it, so the transfer is less efficient than it could be (base64 reduces the amount of information per character). I have already tried quite a lot to get raw binary transmission working, but the data always ends up corrupted, i.e. the variable bytes is not identical to the argument of base64_encode.
Still, according to some sources (e.g. "PHP write binary response", "Binary data corrupted from php to AS3 via http"; nobody says it would not be possible), it should be possible.
So my question is: What must change to directly transfer binary data? Is it even possible?
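Some context on the corruption mode: reading a binary body with responseType: 'text' forces a UTF-8 decode, which is lossy for arbitrary bytes. A minimal Node sketch of that lossiness (illustrative only, not the Angular code itself; in Angular the usual direct-binary route is responseType: 'arraybuffer' followed by new Uint8Array(response)):

```javascript
// Sketch: decoding arbitrary binary as UTF-8 text and re-encoding it is lossy.
const original = Buffer.from([0x08, 0x96, 0x01, 0xff]); // protobuf-like raw bytes
const asText = original.toString('utf8');       // invalid sequences become U+FFFD
const roundTrip = Buffer.from(asText, 'utf8');  // no longer the original bytes
console.log(original.length, roundTrip.length); // 4 8
console.log(original.equals(roundTrip));        // false
```

This is exactly the symptom described above: the bytes differ after the text round trip, while the base64 detour avoids the text decode entirely.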
What have I tried?
Using different headers, such as header('Content-Type: binary/octet-stream;'), and using Blob in Angular.
I also tried removing base64_encode from the PHP code and atob from the Angular code. The result: the content of the data is modified somewhere between serializeToString and deserializeBinary(bytes), which is not desired.
I also checked for stray output (e.g. whitespace or a BOM) before <?php.
Specifications:
PHP 7.2.11
Apache 2.4.35
Angular 7.0.2
If further information is needed, just let me know in the comments. I am eager to provide it. Thanks.
Related
I use pako.deflate to compress data in javascript, like this:
js file:
const params = [{id: 5, name: '张三', list: [{code: '10010', type: 'media'}]},{id: 6, name: '李四', list: [{code: '20010', type: 'site'}]}]
let binaryString = pako.deflate(JSON.stringify(params), { to: 'string' })
http.post({data: binaryString})...
and in the web server, I need to decompress that data using PHP.
This is what I do
php file:
$data = $params['data']; // got the right post data
$res = gzinflate(base64_decode($data));
echo $res; //echo false
but $res comes out as false.
What am I missing?
You are base64-decoding at the server - do you actually base64-encode at the client at all prior to posting the data? There is no indication this is occurring in the code posted.
I suspect it is likely you are sending a deflated utf-8 string as your data, and then trying to base64 decode a string which contains a far greater range of characters.
Maybe look at what characters $params['data'] contains - if any of them are outside the base64 range (a-z, A-Z, 0-9, +, /, and possibly trailing =) then I think this would be the problem.
You could then try simply changing the line to:
$res = gzinflate($data);
I seem to be stuck at sending the compressed messages from PHP to NodeJS over Amazon SQS.
Over on the PHP side I have:
$SQS->sendMessage(Array(
    'QueueUrl' => $queueUrl,
    'MessageBody' => 'article',
    'MessageAttributes' => Array(
        'json' => Array(
            'BinaryValue' => bzcompress(json_encode(Array('type' => 'article', 'data' => $vijest))),
            'DataType' => 'Binary'
        )
    )
));
NOTE 1: I also tried putting compressed data directly in the message, but the library gave me an error with some invalid byte data
On the Node side, I have:
body = decodeBzip(message.MessageAttributes.json.BinaryValue);
Where message is from sqs.receiveMessage() call and that part works since it worked for raw (uncompressed messages)
What I am getting is TypeError: improper format
I also tried using:
PHP - Node
gzcompress() - zlib.inflateRaw()
gzdeflate() - zlib.inflate()
gzencode() - zlib.gunzip()
And each of those pairs gave me their version of the same error (essentially, input data is wrong)
Given all that I started to suspect that an error is somewhere in message transmission
What am I doing wrong?
EDIT 1: It seems that the error is somewhere in transmission, since bin2hex() in php and .toString('hex') in Node return totally different values. It seems that Amazon SQS API in PHP transfers BinaryAttribute using base64 but Node fails to decode it. I managed to partially decode it by turning off automatic conversion in amazon aws config file and then manually decoding base64 in node but it still was not able to decode it.
EDIT 2: I managed to accomplish the same thing by using base64_encode() on the PHP side and sending the base64 as the message body (not using MessageAttributes). On the Node side I used new Buffer(messageBody, 'base64') and then decodeBzip on that. It all works, but I would still like to know why MessageAttributes are not working as they should. The extra base64 adds overhead, and I would like to use the service as intended, not via workarounds.
This is what all the SQS libraries do under the hood. You can get the php source code of the SQS library and see for yourself. Binary data will always be base64 encoded (when using MessageAttributes or not, does not matter) as a way to satisfy the API requirement of having form-url-encoded messages.
I do not know how long the data in your $vijest is, but I am willing to bet that after zipping and then base64 encoding it will be bigger than before.
So my answer to you would be two parts (plus a third if you are really stubborn):
When looking at the underlying raw API it is absolutely clear that not using MessageAttributes does NOT add additional overhead from base64. Instead, using MessageAttributes adds some slight additional overhead because of the structure of the data enforced by the SQS php library. So not using MessageAttributes is clearly NOT a workaround and you should do it if you want to zip the data yourself and you got it to work that way.
Because of the nature of an HTTP POST request, it is a very bad idea to compress your data inside your application. The base64 overhead will likely nullify the compression advantage, and you are probably better off sending plain text.
If you absolutely do not believe me or the API spec or the HTTP spec and want to proceed, then I would advise to send a simple short string 'teststring' in the BinaryValue parameter and compare what you sent with what you got. That will make it very easy to understand the transformations the SQS library is doing on the BinaryValue parameter.
gzcompress() would be decoded by zlib.Inflate(). gzdeflate() would be decoded by zlib.InflateRaw(). gzencode() would be decoded by zlib.Gunzip(). So out of the three you listed, two are wrong, but one should work.
I have many ogg & opus files on my server and need to generate json-waveform numeric arrays on an as-needed basis (example below).
Recently I discovered the Node-based waveform-util, which uses ffmpeg/ffprobe to render a JSON waveform, and it works perfectly. I am undecided whether having a Node process constantly running is the optimal solution to my issue.
Since ffmpeg seems to be able to handle anything I can throw at it, I wish to stick with an ffmpeg solution.
I have three questions:
1) Is there a PHP equivalent? I have found a couple that generate PNG images, but none that generates JSON-waveform numeric arrays.
2) Are there any significant advantages to going with the Node-based solution rather than a PHP-based solution (assuming one exists)?
3) Is there a way, using CLI ffmpeg/ffprobe, to generate a JSON waveform? I saw all the -show_ options (-show_data, -show_streams, -show_frames), but nothing looked like it produced what I am looking for.
The JSON waveform needs to be in this format:
[ 0.0002, 0.001, 0.15, 0.14, 0.356 .... ]
thank you all.
It sounds as if there is a conflict with the way my server handles CGI. I am using Virtualmin with the following setting:
PHP script execution mode: CGI wrapper (run as virtual server owner)
After much research, it appears that pure Node.js is more lightweight than shelling out to an executable. I had some success merely by adding a shebang line to call node, but keeping a Node.js script memory-resident is probably the way to go.
For anyone in the future looking to do this with RN:
// convert the file to PCM (16-bit signed little-endian, mono, 1 kHz)
await RNFFmpeg.execute(`-y -i ${filepath} -acodec pcm_s16le -f s16le -ac 1 -ar 1000 ${pcmPath}`)
// you're reading that right: we read the file as base64 only to decode the base64,
// because RN doesn't let us read raw data
const pcmFile = Buffer.from(await RNFS.readFile(pcmPath, 'base64'), 'base64')
let pcmData = []
// byte conversion pulled off Stack Overflow
for (let i = 0; i < pcmFile.length; i += 2) {
  const byteA = pcmFile[i]
  const byteB = pcmFile[i + 1]
  const sign = byteB & (1 << 7)
  let val = (byteA & 0xFF) | ((byteB & 0xFF) << 8) // assemble 16-bit little-endian int
  if (sign) { // if negative
    val = 0xFFFF0000 | val // sign-extend: fill the most significant bits with 1s
  }
  pcmData.push(val)
}
// pcmData is the resulting waveform array
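Outside React Native, with a full Node Buffer available, the manual sign-extension loop can be replaced by readInt16LE; a small sketch with hand-built sample bytes:

```javascript
// Sketch: parse signed 16-bit little-endian PCM with Buffer.readInt16LE
// instead of assembling and sign-extending the bytes by hand.
const pcm = Buffer.from([0x00, 0x80, 0xff, 0x7f, 0x00, 0x00]); // -32768, 32767, 0

const samples = [];
for (let i = 0; i + 1 < pcm.length; i += 2) {
  samples.push(pcm.readInt16LE(i)); // sign extension handled internally
}
console.log(samples); // [ -32768, 32767, 0 ]
```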
This question already has answers here:
Should I embed images as data/base64 in CSS or HTML
(7 answers)
Closed 9 years ago.
Could someone please explain how this works?
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAEAAAABACAYAAACqaXHeAAACDUlEQVR4Xu2Yz6/BQBDHpxoEcfTjVBVx4yjEv+/EQdwa14pTE04OBO+92WSavqoXOuFp+u1JY3d29rvfmQ9r7Xa7L8rxY0EAOAAlgB6Q4x5IaIKgACgACoACoECOFQAGgUFgEBgEBnMMAfwZAgaBQWAQGAQGgcEcK6DG4Pl8ptlsRpfLxcjYarVoOBz+knSz2dB6vU78Lkn7V8S8d8YqAa7XK83ncyoUCjQej2m5XNIPVmkwGFC73TZrypjD4fCQAK+I+ZfBVQLwZlerFXU6Her1eonreJ5HQRAQn2qj0TDukHm1Ws0Ix2O2260RrlQqpYqZtopVAoi1y+UyHY9Hk0O32w3FkI06jkO+74cC8Dh2y36/p8lkQovFgqrVqhFDEzONCCoB5OSk7qMl0Gw2w/Lo9/vmVMUBnGi0zi3Loul0SpVKJXRDmphvF0BOS049+n46nW5sHRVAXMAuiTZObcxnRVA5IN4DJHnXdU3dc+OLP/V63Vhd5haLRVM+0jg1MZ/dPI9XCZDUsbmuxc6SkGxKHCDzGJ2j0cj0A/7Mwti2fUOWR2Km2bxagHgt83sUgfcEkN4RLx0phfjvgEdi/psAaRf+lHmqEviUTWjygAC4EcKNEG6EcCOk6aJZnwsKgAKgACgACmS9k2vyBwVAAVAAFAAFNF0063NBAVAAFAAFQIGsd3JN/qBA3inwDTUHcp+19ttaAAAAAElFTkSuQmCC
And how does this generate an image, and how do I create one? I have found this a lot of times in HTML.
Follow-up question
How does this differ from a URL as a src in terms of loading time and HTTP requests?
Does this make loading faster? How would it scale if I were to use, say, 50 images?
Also: would uploading, converting images to base64, and saving them in the database rather than storing a URL make a site better?
You can use it like this:
<img alt="Embedded Image" src="data:image/png;base64,{base64 encoding}" />
It's used to generate new images, or to store images as plain text. You can read more about base64 encoding here on Wikipedia:
http://nl.wikipedia.org/wiki/Base64
How does it work?
The input characters are converted to binary.
The bits are taken in groups of 6.
Each group is converted to a decimal number between 0 and 63.
For each decimal value n, the character at position n+1 in the base64 character table below is used.
The number of input bits is not always a multiple of 6. In that case, depending on how many bits are missing, 2 or 4 zero bits are appended, and the output is padded with one or two = characters at the end.
Base64 character table
ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/
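The steps above can be sketched as a toy JavaScript encoder (for illustration only; real code should use btoa or Buffer's built-in 'base64' encoding):

```javascript
// Toy base64 encoder following the steps above: bytes -> bits -> 6-bit groups -> table lookup.
const TABLE = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function toyBase64(str) {
  // 1. convert each character (assumed single-byte here) to 8 bits
  let bits = '';
  for (const ch of str) bits += ch.charCodeAt(0).toString(2).padStart(8, '0');
  // 2. pad with zero bits up to a multiple of 6, then take 6-bit groups
  const pad = (6 - (bits.length % 6)) % 6;
  bits += '0'.repeat(pad);
  let out = '';
  for (let i = 0; i < bits.length; i += 6) {
    out += TABLE[parseInt(bits.slice(i, i + 6), 2)]; // 3./4. decimal -> table lookup
  }
  // one '=' for every 2 bits of zero padding
  return out + '='.repeat(pad / 2);
}

console.log(toyBase64('Man')); // TWFu
console.log(toyBase64('M'));   // TQ==
```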
Different languages and usage
PHP
<?php
base64_encode($source);
// Or decode:
base64_decode($source);
Python
>>> import base64
>>> encoded = base64.b64encode(b'data to be encoded')
>>> encoded
b'ZGF0YSB0byBiZSBlbmNvZGVk'
>>> data = base64.b64decode(encoded)
>>> data
b'data to be encoded'
Objective C
// Encoding
NSData *plainData = [plainString dataUsingEncoding:NSUTF8StringEncoding];
NSString *base64String = [plainData base64EncodedStringWithOptions:0];
NSLog(@"%@", base64String); // Zm9v
// Decoding
NSData *decodedData = [[NSData alloc] initWithBase64EncodedString:base64String options:0];
NSString *decodedString = [[NSString alloc] initWithData:decodedData encoding:NSUTF8StringEncoding];
NSLog(@"%@", decodedString); // foo
The bit after the "base64," is a base64 encoded version of the binary png. Since your question is tagged PHP, here's how you would do that in php:
<?php
$img = file_get_contents('img.png');
echo "data:image/png;base64,".base64_encode($img);
How does it generate an image?
First off, the src of the image is recognized by the browser as a data URI. The browser then parses the data URI (see how chrome(ium) does it here), finds that it is a base64-encoded image, and decodes it with a base64 decoder into a binary object, equivalent to any normal image file. This binary object is used subsequently while rendering the page.
How does this differ on a URL as a src in terms of loading time and HTTP request?
Since there are no HTTP requests made and the image data is already in memory data URIs should load significantly faster.
Does this make loading time faster? How would it scale if i am to use, say 50 images?
The page loading time? It depends. A base64-encoded string is about a third larger than the original binary, so more data is transferred with the page load. Also, data-URI images are not cached independently by the browser, which is a clear disadvantage if you show the same image on different pages: you serve the base64 content every time. With a separate image file you could instead set cache headers, serve it once, and let the browser take the image from cache on subsequent page loads. It really depends on your specific usage, but you now know the intricacies of base64-encoded data URIs.
Summing it up
+
Easier to generate/store
Has a fixed charset
Smaller perceived loading times
-
More data transferred
Require decoding by the browser
No caching
Format: data:[<MIME-type>][;charset=<encoding>][;base64],<data>
This method is called the data URI scheme: a URI scheme (Uniform Resource Identifier scheme) that provides a way to include data in-line in web pages as if they were external resources. It is a form of file literal or here document (a file or input-stream literal). This technique allows normally separate elements, such as images and style sheets, to be fetched in a single HTTP request rather than multiple requests, which can be more efficient.
Ref: http://en.wikipedia.org/wiki/Data_URI_scheme
Ref: http://en.wikipedia.org/wiki/Here_document
I'm trying to mimic an application that sends octet streams to and from a server. The data contained in the body looks like raw bytes, and I'm fairly certain the data being sent for each command is static, so I'm hoping to map the bytes to something more readable in my application. For example, I'll have an array entry like "test" => "&^D^^&#*#dgkel", so I can call "test" and get the real bytes that need to be sent.
The trouble is, PHP seems to convert these bytes. I'm not sure if it is an encoding problem or what, but what has been happening is: I'll give it some bytes (for example, �ھ����#�qs��������������������X����������������������������, which I believe has a length of 67), but when I var_dump the HTTP request, PHP says the headers sent contained "Content-Length: 174" or something close to that, and the bytes will look like �ھ����#�qs��������������������X����������������������������
So I'm not really sure how to fix this. Anyone have any ideas? Cheers!
Edit, a little PHP:
$request = new HttpRequest($this->GetMessageURL(), HTTP_METH_POST);
$request->addHeaders($headers);
$request->addRawPostData($buttonMapping[$button]);
$request->send();
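The length jump (67 bytes sent, 174 reported) is the signature of raw bytes being re-encoded as UTF-8: every byte above 0x7F grows to two or three bytes on the wire. A Node sketch of the effect (illustrative; on the PHP side the fix is to keep the payload as a raw binary string and avoid any charset conversion on it):

```javascript
// Sketch: re-encoding raw high bytes as UTF-8 inflates the byte count.
const raw = Buffer.alloc(67, 0xfa);              // 67 raw bytes, all above 0x7F
const asString = raw.toString('latin1');         // one code point per byte
const reEncoded = Buffer.from(asString, 'utf8'); // each code point > 0x7F -> 2 bytes
console.log(raw.length, reEncoded.length);       // 67 134
```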