User stream in Twitter API - PHP

I want to read my own tweets in a little localhost application in JS + PHP.
I know how to read the JSON from api.twitter.com/1/statuses/user_timeline.json?screen_name=myName, but due to the rate limit I need to use a User Stream (https://dev.twitter.com/docs/streaming-api/user-streams).
I have my 4 keys from creating a dev account:
'consumer_key' => '*****',
'consumer_secret' => '*****',
'user_token' => '*******',
'user_secret' => '******',
So I tried this example: https://github.com/themattharris/tmhOAuth/blob/master/examples/userstream.php
I downloaded the library,
ran my MAMP (or WAMP or LAMP),
opened the example, put in my keys,
went to the page...
and nothing, except the browser loader.
Why does this happen?
Is it due to localhost?
Or missing params?
Or a new Twitter restriction?

The Streaming API is not the right tool for a little application; you are better off with the plain REST API.
A Streaming API application is not supposed to be run in a browser; don't forget to set_time_limit(0) and start your .php script from the command line, where it will run forever (you should save tweets to a database so your normal browser scripts can display them). A sketch of that setup follows below.
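For reference, a minimal command-line sketch loosely following the tmhOAuth userstream example. The exact streaming_request() signature can differ between library versions, and save_tweet() is a hypothetical helper standing in for your database insert; treat this as a starting point, not a drop-in script:
<?php
// Run from the command line: php userstream.php
set_time_limit(0); // the stream is long-lived, so remove PHP's execution time limit
require 'tmhOAuth.php';

$tmhOAuth = new tmhOAuth(array(
    'consumer_key'    => '*****',
    'consumer_secret' => '*****',
    'user_token'      => '*****',
    'user_secret'     => '*****',
));

// Called for every message received on the stream.
function my_streaming_callback($data, $length, $metrics) {
    $message = json_decode($data, true);
    if (isset($message['text'])) {
        save_tweet($message); // hypothetical helper: store the tweet in your database
    }
}

$tmhOAuth->streaming_request(
    'GET',
    'https://userstream.twitter.com/1.1/user.json',
    array(),
    'my_streaming_callback'
);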

Related

"no alive nodes found in cluster" while indexing docs

I have a "legacy" php application that we just migrated to run on Google Cloud (Kubernetes Engine). Along with it I also have a ElasticSearch installation (Elastic Cloud on Kubernetes) running. After a few incidents with Kubernetes killing my Elastic Search when we're trying to deploy other services we have come to the conclusion that we should probably not run ES on Kubernetes, at least if are to manage it ourselves. This due to a apparent lack of knowledge for doing it in a robust way.
So our idea is now to move to managed Elastic Cloud instead which was really simple to deploy and start using. However... now that I try to load ES with the data needed for our php application if fails mid-process with the error message no alive nodes found in cluster. Sometimes it happens after less than 1000 "documents" and other times I manage to get 5000+ of them indexed before failure.
This is how I initialize the es client:
$clientBuilder = ClientBuilder::create();
$clientBuilder->setElasticCloudId(ELASTIC_CLOUD_ID);
$clientBuilder->setBasicAuthentication('elastic',ELASTICSEARCH_PW);
$clientBuilder->setRetries(10);
$this->esClient = $clientBuilder->build();
ELASTIC_CLOUD_ID & ELASTICSEARCH_PW are set via environment vars.
The request looks something like:
$params = [
    'index'  => $index,
    'type'   => '_doc',
    'body'   => $body,
    'client' => [
        'timeout'         => 15,
        'connect_timeout' => 30,
        'curl'            => [
            CURLOPT_HTTPHEADER => ['Content-type: application/json'],
        ],
    ],
];
The body and the index depend on how far we get with the "ingestion", but it is generally pretty standard stuff.
All of this works without any real problems when running against our own installation of Elasticsearch in our own GKE cluster.
What I've tried so far is to add the retries and timeouts, but none of that seems to make much of a difference.
We're running:
PHP 7.4
Elasticsearch 7.11
Elasticsearch PHP client 7.12 (via Composer)
If you use WAMP64, this error will occur; you have to use XAMPP instead.
Try the following command in the command prompt. If it runs, there is a problem with your configuration.
curl -u elastic:<password> https://<endpoint>:<port>
(Example for Elastic Cloud)
curl -u elastic:<password> example.es.us-central1.gcp.cloud.es.io:9234
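If the curl check succeeds, the same connectivity can be verified from PHP before any indexing runs. A minimal sketch, assuming the elasticsearch-php client and the same constants as in the question:
<?php
require 'vendor/autoload.php';

use Elasticsearch\ClientBuilder;

// Same configuration as in the question; the constants come from environment vars.
$client = ClientBuilder::create()
    ->setElasticCloudId(ELASTIC_CLOUD_ID)
    ->setBasicAuthentication('elastic', ELASTICSEARCH_PW)
    ->build();

// ping() returns true when at least one node answers. If this already fails,
// the problem is configuration or network, not the indexing code itself.
var_dump($client->ping());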

How to include Google App Engine for PHP in my scripts / autoloading?

I have a website on an Ubuntu webserver (not an app and not hosted on App Engine), and I want to use Google Cloud Storage for the upload/download of large files. I am trying to upload a file directly to Google Cloud Storage, which isn't working (maybe because I made some basic errors).
I have installed the Google Cloud SDK and downloaded and unzipped Google App Engine. If I now include CloudStorageTools.php I get the error:
Class 'google\appengine\CreateUploadURLRequest' not found
My script looks like this:
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;
$options = [ 'gs_bucket_name' => 'test' ];
$upload_url = CloudStorageTools::createUploadUrl( '/test.php' , $options );
If you want to use the functionality of Google App Engine (GAE), you will need to host on GAE, which will likely have a larger impact on your app architecture (it uses a custom Google-compiled PHP version with limited libraries and no local file handling, so all that functionality needs to go into the blobstore or GCS - Google Cloud Storage).
With a PHP app running on Ubuntu, your best bet is to use the google-api-php-client to connect to the Storage JSON API.
Unfortunately the documentation is not very good for PHP. You can check my answer in How to rename or move a file in Google Cloud Storage (PHP API) to see how to GET / COPY / DELETE an object.
To upload, I would suggest retrieving a pre-signed upload URL like so:
// get the google client and an auth token for the request
$gc = \Google::getClient();
if ($gc->isAccessTokenExpired()) {
    $gc->getAuth()->refreshTokenWithAssertion();
}
$googleAccessToken = json_decode($gc->getAccessToken(), true)['access_token'];

// compose the url for the upload-url request
$initUploadURL = "https://www.googleapis.com/upload/storage/v1/b/"
    .$bucket."/o?uploadType=resumable&name="
    .urlencode($file_dest);

// compose the headers
$initUploadHeaders = [
    "Authorization"           => "Bearer ".$googleAccessToken,
    "X-Upload-Content-Type"   => $mimetype,
    "X-Upload-Content-Length" => $filesize,
    "Content-Length"          => 0,
    "Origin"                  => env('APP_ADDRESS')
];

// send the request to retrieve the upload url
$req = $gc->getIo()->makeRequest(new \Google_Http_Request($initUploadURL, 'POST', $initUploadHeaders));

// pre-signed upload url that allows client-side upload
$presigned_upload_URL = $req->getResponseHeader('location');
With that URL sent to your client side, you can use it to PUT the file directly into your bucket with an upload script that generates an appropriate PUT request. Here is an example in AngularJS with ng-file-upload:
file.upload = Upload.http({
    url: uploadurl.url,
    skipAuthorization: true,
    method: 'PUT',
    filename: file.name,
    headers: {
        "Content-Type": file.type !== '' ? file.type : 'application/octet-stream'
    },
    data: file
});
Good luck - GCS is a tough one if you don't want to go Google all the way with App Engine!
The Google API PHP Client allows you to connect to any Google API, including the Cloud Storage API. Here's an example, and here's a getting-started guide.
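For illustration, a minimal sketch of uploading an object with the google-api-php-client (v2-style service classes installed via Composer together with the generated services package). The service-account key path, bucket and object names are placeholders, and a simple non-resumable media upload is assumed:
<?php
require 'vendor/autoload.php';

// Authenticate with a service account key and the read/write storage scope.
$client = new Google_Client();
$client->setAuthConfig('/path/to/service-account.json');
$client->addScope('https://www.googleapis.com/auth/devstorage.read_write');

$storage = new Google_Service_Storage($client);

$object = new Google_Service_Storage_StorageObject();
$object->setName('uploads/test.txt');

// Simple media upload; fine for small files (use resumable uploads for large ones).
$storage->objects->insert('my-bucket', $object, [
    'data'       => file_get_contents('/path/to/test.txt'),
    'uploadType' => 'media',
    'mimeType'   => 'text/plain',
]);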

Slow DynamoDB for PHP session handling

We're using DynamoDB to synchronize sessions between more than one EC2 machine behind ELBs.
We noticed that this method slows the scripts down a lot.
Specifically, I made a JS test that calls 3 different PHP scripts on the server 10 times each.
1) The first one is just an echo timestamp(); and takes about 50ms as round-trip time.
2) The second one is a PHP script that connects through mysqli to the RDS MySQL instance and takes about the same time (50-60ms).
3) The third script uses the DynamoDB session handling method described in the official AWS documentation and takes about 150ms (3 times slower!!).
I'm cleaning the garbage every night (as the documentation says) and the DynamoDB metrics seem OK (attached below).
The code I use is this:
use Aws\DynamoDb\DynamoDbClient;
use Aws\DynamoDb\Session\SessionHandler;

ini_set("session.entropy_file", "/dev/urandom");
ini_set("session.entropy_length", "512");
ini_set('session.gc_probability', 0);

require 'aws.phar';

$dynamoDb = DynamoDbClient::factory(array(
    'key'    => 'XXXXXX',
    'secret' => 'YYYYYY',
    'region' => 'eu-west-1'
));

$sessionHandler = SessionHandler::factory(array(
    'dynamodb_client'          => $dynamoDb,
    'table_name'               => 'sessions',
    'session_lifetime'         => 259200,
    'consistent_read'          => true,
    'locking_strategy'         => null,
    'automatic_gc'             => 0,
    'gc_batch_size'            => 25,
    'max_lock_wait_time'       => 15,
    'min_lock_retry_microtime' => 5000,
    'max_lock_retry_microtime' => 50000,
));

$sessionHandler->register();

session_start();
Am I doing something wrong, or is it normal that it takes all that time to retrieve the session?
Thanks.
Copying correspondence from an AWS engineer in the AWS forums: https://forums.aws.amazon.com/thread.jspa?messageID=597493
Here are a couple of things to check:
Are you running your application on EC2 in the same region as your DynamoDB table?
Have you enabled OPcode caching to ensure that the classes used by the SDK do not need to be loaded from disk and parsed each time your script is run? (A quick way to check is sketched below.)
Using a web server like Apache and connecting to a DynamoDB session store will require a new SSL connection to be established on each request. This is because PHP doesn't (currently) allow you to reuse cURL connection handles between requests. Some database drivers do allow persistent connections between requests, which could account for the performance difference.
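For reference, a quick check that an opcode cache is actually active for the web SAPI (run it through the same server that handles your sessions, not the CLI); this is just a diagnostic sketch:
<?php
// Reports whether OPcache is enabled and how many scripts it currently holds.
if (function_exists('opcache_get_status') && ($status = opcache_get_status(false)) !== false) {
    echo 'OPcache enabled, cached scripts: '
        . $status['opcache_statistics']['num_cached_scripts'] . PHP_EOL;
} else {
    echo 'OPcache is not enabled for this SAPI.' . PHP_EOL;
}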
If you follow up on the AWS forums thread, an AWS engineer should be able to help you with your issue. This thread is also monitored if you want to keep it open.

RiakCS S3 PHP client library

Is there any RiakCS S3 PHP client library out there? The best I could find was the s3cmd command-line client.
Also, I've seen there is a Riak PHP client, but it looks like there is nothing related to S3 in it.
I've installed aws-sdk-php-laravel and used the same credentials as for RiakCS S3, but it doesn't seem to work. Error message below:
The AWS Access Key Id you provided does not exist in our records.
Thank you for any guidance or advice.
Actually, if you are using Riak, it wouldn't be a proxy; it would be a completely different endpoint. So you should do it this way, with the base_url option:
$s3 = S3Client::factory([
    'base_url'       => 'http://127.0.0.1:8080',
    'region'         => 'my-region',
    'key'            => 'my-key',
    'secret'         => 'my-secret',
    'command.params' => ['PathStyle' => true]
]);
Using 'command.params' allows you to set a parameter used in every operation. You will need to use the 'PathStyle' option on every request to make sure the SDK does not move your bucket into the host part of the URL, as it is supposed to do for Amazon S3.
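As a small usage illustration with the client configured above (bucket and key names are hypothetical), a put then goes to http://127.0.0.1:8080/my-bucket/hello.txt rather than a *.s3.amazonaws.com host:
$s3->putObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'hello.txt',
    'Body'   => 'Hello Riak CS',
]);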
This was all talked about in an issue on GitHub.
aws-sdk-php-laravel uses aws-sdk-php, which is hard-coded to use Amazon's URLs. If you want to use that library with Riak CS, you'll need to configure it to use your node as a proxy. According to the config docs, that would be set using:
use Aws\S3\S3Client;

$s3 = S3Client::factory(array(
    'request.options' => array(
        'proxy' => '127.0.0.1:8080'
    )
));
I haven't used Laravel, so I'm not sure where to put that so that it will pass the setting along to the S3Client factory.

How to deal with large files in SabreDAV?

I am using the SabreDAV PHP library to connect to a WebDAV server and download some files, but it is taking forever to download a 1MB file, and I have to download files of up to 1GB from that server. I looked at this link http://code.google.com/p/sabredav/wiki/WorkingWithLargeFiles but it is not helpful, because it tells me that I will get a stream when I do a GET, but that is not the case.
Here is my code:
$settings = array(
    'baseUri'  => 'file url',
    'userName' => 'user',
    'password' => 'pwd'
);

$client = new \Sabre\DAV\Client($settings);
$response = $client->request('GET');
The response is an array with a 'body' key that contains the content of the file. What am I doing wrong? I only need the file for reading. How can I read through the file line by line as quickly as possible?
Thanks in advance.
If it's taking too long just to download a 1MB file, then I think it's not a SabreDAV problem but a problem with your server or network, or perhaps the remote server.
The Google Code link you mentioned just describes a way to transfer very large files; for that you will have to use the stream and fopen approach they mention (a rough sketch follows below), but I think I was able to transfer 1GB files without it, just normally, when I last used it with ownCloud.
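For reference, a rough sketch of that streaming approach using plain cURL against the same WebDAV URL; the URL, credentials and local path are placeholders:
<?php
// Stream the remote file straight to disk instead of buffering it in memory.
$src  = 'https://webdav.example.com/remote/bigfile.bin';
$dest = '/tmp/bigfile.bin';

$fp = fopen($dest, 'wb');
$ch = curl_init($src);
curl_setopt_array($ch, [
    CURLOPT_USERPWD        => 'user:pwd',
    CURLOPT_HTTPAUTH       => CURLAUTH_BASIC,
    CURLOPT_FILE           => $fp, // write the response body directly to the file handle
    CURLOPT_FOLLOWLOCATION => true,
]);
curl_exec($ch);
curl_close($ch);
fclose($fp);

// Then read the downloaded file line by line without loading it all into memory.
$handle = fopen($dest, 'rb');
while (($line = fgets($handle)) !== false) {
    // process $line
}
fclose($handle);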
If you have a VPS/dedicated server, open SSH and use the wget command to test the speed and the time it takes to download that remote file over WebDAV. If it's the same as what it takes with SabreDAV, then it's a server/network problem and not SabreDAV; otherwise, it's a problem with Sabre or your code.
Sorry, but I do not have any code to post to help you, since the problem itself is not clear and there can be more than 10 things causing it.
PS: You need to increase the PHP limits for execution time, max file upload size and max post size too.
