Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
I've got some questions about the Instagram real-time API. First, let me explain what I want to do. I'd like to run a daemon process with a socket connection (or something of the like) to the Instagram API to get a constant feed of photos with a certain tag. We expect a large amount of data at peak times (which is why we want to go real-time). This process will parse the feed and store it in MongoDB.
Secondly, for the front end, I'd like to display all new, live photos in real time, possibly with AJAX or some form of polling on a set interval.
Problem being, I can't find anyone doing this with PHP. All of the resources I have seen use Node.js and Tornado. Has anyone done this with PHP, or does anyone know of a good real-time API demonstration/tutorial to get me started?
Here's the documentation...
http://instagr.am/developer/realtime/
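Not a full PHP daemon, but here is a minimal sketch of the callback side, assuming the (legacy) v1 tag endpoints linked above and the mongodb/mongodb Composer package; the tag name, client ID, and collection names are placeholders. Instagram's push only tells you that something changed, so the callback re-queries the recent media for the tag and upserts it into MongoDB:

```php
<?php
// callback.php - hypothetical endpoint registered as the subscription's callback_url.
// Assumes the legacy Instagram v1 API and the mongodb/mongodb Composer package.
require 'vendor/autoload.php';

$clientId = 'YOUR_CLIENT_ID';   // placeholder
$tag      = 'yourtag';          // placeholder

// The push body is a small JSON array saying *what* changed, not the media itself.
$updates = json_decode(file_get_contents('php://input'), true);
if (!$updates) {
    exit;
}

$mongo  = new MongoDB\Client('mongodb://localhost:27017');
$photos = $mongo->instagram->photos;

// Re-query the recent media for the tag and store each item.
$url      = "https://api.instagram.com/v1/tags/{$tag}/media/recent?client_id={$clientId}";
$response = json_decode(file_get_contents($url), true);

foreach ($response['data'] ?? [] as $media) {
    // Upsert on the media id so repeated pushes don't create duplicates.
    $photos->updateOne(
        ['media_id' => $media['id']],
        ['$set' => [
            'media_id'  => $media['id'],
            'image_url' => $media['images']['standard_resolution']['url'] ?? null,
            'caption'   => $media['caption']['text'] ?? null,
            'created'   => $media['created_time'] ?? null,
        ]],
        ['upsert' => true]
    );
}
```

The front end can then poll a small PHP script on a set interval that returns the newest documents from that collection as JSON; no long-running PHP process is strictly required on the receiving side, since the webhook arrives as an ordinary HTTP request.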
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 3 years ago.
I'm new to Laravel. I want to update the interface every time there is a change in the database, but I think calling a JavaScript function every 5 seconds to poll for updates is not very efficient, so I wonder what the most efficient way to do that is?
Have you considered using some form of WebSocket connection between your backend and your JavaScript-based UI?
Either https://laravel.com/docs/master/broadcasting or https://github.com/beyondcode/laravel-websockets would allow you to notify your UI to request updates from your Laravel backend.
My personal suggestion would be that your JS receives a notification that something has changed and then fetches the updated information via AJAX, so that you are not sending confidential information through Pusher's servers.
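A rough sketch of that pattern (the event and channel names here are made up): the event broadcasts only an ID, and the UI fetches the full record over a normal authenticated AJAX route once it hears it.

```php
<?php
// app/Events/RecordUpdated.php - broadcasts just the record id, never the data itself.
namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;

class RecordUpdated implements ShouldBroadcast
{
    public $recordId;

    public function __construct(int $recordId)
    {
        $this->recordId = $recordId;   // public properties are included in the broadcast payload
    }

    public function broadcastOn()
    {
        // Hypothetical channel name; use a PrivateChannel if even the id is sensitive.
        return new Channel('records');
    }
}
```

Fire it with event(new RecordUpdated($record->id)) after saving, and have Laravel Echo listen on the records channel and trigger an AJAX request to your backend, so the actual data never passes through Pusher (or the laravel-websockets server).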
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 4 years ago.
I am going to scrape 4 websites using Laravel and Goutte.
There are about 900 URLs, and I have no idea how to send them.
(I have already written the crawler code and have no questions about that.)
But I don't know how to send the URLs.
Should I use a queue, a cron job, or something else?
Do you know of any package, tool, or approach? I have no idea how to send 900 URLs, 5 times a day.
If you have already written the crawl code for the websites, you can separate the links and store them in a CSV file. Then write another script that reads an exact number of those URLs from the CSV file and sends them back to the crawler. This is very easy in Ruby with the built-in CSV library.
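Since the question is Laravel-specific, here is a hedged sketch of the queue + scheduler combination: a console command reads the URLs (from a CSV, as suggested above) and dispatches one queued job per URL. CrawlUrl and the CSV path are placeholders standing in for your own Goutte crawler job.

```php
<?php
// app/Console/Commands/DispatchCrawls.php - queues one crawl job per URL in urls.csv.
namespace App\Console\Commands;

use App\Jobs\CrawlUrl;            // hypothetical queued job wrapping your Goutte crawler
use Illuminate\Console\Command;

class DispatchCrawls extends Command
{
    protected $signature   = 'crawl:dispatch';
    protected $description = 'Queue a crawl job for every URL in storage/app/urls.csv';

    public function handle()
    {
        $handle = fopen(storage_path('app/urls.csv'), 'r');

        while (($row = fgetcsv($handle)) !== false) {
            CrawlUrl::dispatch($row[0]);   // one background job per URL
        }

        fclose($handle);
    }
}
```

Schedule it five times a day in app/Console/Kernel.php, for example $schedule->command('crawl:dispatch')->cron('0 0,5,10,15,20 * * *');, and keep a worker running with php artisan queue:work, so the 900 requests are spread across background jobs instead of one long cron run that can time out.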
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
So I have a Shopify website, and Shopify sometimes gives me a visitor log, kind of like this:
"A user from has visited your website,
First page visited: homepage (22 seconds)
About us (1 minute)
Product page (2 minutes)"
I was wondering how I would go about creating something similar to this in PHP/JS.
Thanks in advance!
You could instead use Google Analytics; it has in-depth analytics on visitors. It's just about embedding a JS snippet in your page and you're good to go. If you build your own solution in any language, I'm pretty sure you will end up writing tons of code.
To answer your question:
1. Create a WebSocket connection from your page to your backend.
2. Pass event codes like clicks, refreshes, session IDs, etc. on user actions. Your JS can fire them over an active WebSocket connection; it is pretty lightweight.
To store this information, I would suggest using a NoSQL database; a rough sketch of the collection endpoint follows below.
Hope this helps!
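A minimal sketch of the collection side of that idea, assuming you post events from the page with AJAX or navigator.sendBeacon rather than running a full WebSocket server, and that you store them in MongoDB via the mongodb/mongodb package; the endpoint path and field names are made up.

```php
<?php
// track.php - hypothetical endpoint your page POSTs visit events to.
require 'vendor/autoload.php';

$event = json_decode(file_get_contents('php://input'), true);
if (!$event || empty($event['session_id'])) {
    http_response_code(400);
    exit;
}

$mongo  = new MongoDB\Client('mongodb://localhost:27017');
$visits = $mongo->analytics->visits;

// One document per event: which session, which page, what happened, and when.
$visits->insertOne([
    'session_id' => $event['session_id'],
    'page'       => $event['page']   ?? null,
    'action'     => $event['action'] ?? 'pageview',   // click, refresh, etc.
    'ip'         => $_SERVER['REMOTE_ADDR'] ?? null,
    'time'       => new MongoDB\BSON\UTCDateTime(),
]);
```

Time spent on each page, like in the Shopify log above, can then be derived afterwards as the gap between consecutive events that share the same session_id.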
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
Currently I'm working on a task where I initially need to fetch media from Instagram with a specific hashtag and populate that info into a database. After that, each new media item published on Instagram's side (containing the specific hashtag) needs to be fetched again and populated into the database.
Of course I can accomplish this using a cron job (checking the specific hashtag and the number of media items for that hashtag every 10 minutes, for example), but I'm wondering whether Instagram has "hooks" implemented for this kind of thing. So, for example, if something is published on Instagram's side, a hook would be triggered and a call would be sent to a specific URL provided within the Instagram dev app?
You should check out Instagram's real-time API: http://instagram.com/developer/realtime/
It allows you to subscribe and receive new content via push updates (it uses PubSubHubbub).
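For completeness, a hedged PHP sketch of the two pieces involved, following the legacy v1 docs linked above: creating the tag subscription, and answering the hub.challenge handshake Instagram sends to your callback URL. Client credentials, the hashtag, and the callback URL are placeholders.

```php
<?php
// subscribe.php - one-off script that registers a tag subscription (legacy v1 API).
$ch = curl_init('https://api.instagram.com/v1/subscriptions/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, [
    'client_id'     => 'YOUR_CLIENT_ID',       // placeholder
    'client_secret' => 'YOUR_CLIENT_SECRET',   // placeholder
    'object'        => 'tag',
    'object_id'     => 'yourhashtag',          // the hashtag, without the '#'
    'aspect'        => 'media',
    'verify_token'  => 'someRandomToken',
    'callback_url'  => 'https://example.com/instagram/callback.php',
]);
echo curl_exec($ch);
curl_close($ch);
```

Before any updates arrive, Instagram verifies the callback with a GET request that your endpoint has to echo back (note that PHP rewrites the dot in hub.challenge to an underscore in $_GET):

```php
<?php
// callback.php - subscription verification; POSTs to this same URL are the actual push updates.
if (isset($_GET['hub_challenge']) && ($_GET['hub_verify_token'] ?? '') === 'someRandomToken') {
    echo $_GET['hub_challenge'];
    exit;
}
```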
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
I know how to read and parse XML with PHP, but is there a way to make it advance through pages of XML data automatically? I work for a TV station that is featuring open houses via a real estate XML feed, and I need to show one or two houses at a time for half an hour without requiring someone to sit there and advance the pages themselves. Any thoughts would be appreciated!
You could have PHP output the necessary JavaScript to redirect to the next page.
Here's how to redirect to a new page: "How can I make a redirect page in jQuery/JavaScript?" Combine that with setTimeout() to add a delay.
I'm not sure whether I would rely on this for a live TV feed, though - I'd be too scared of the embarrassment of the browser crashing, or an error popping up....
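A minimal sketch of that approach, assuming SimpleXML and a listing element in the feed; the feed URL, element names, and the 30-second interval are placeholders.

```php
<?php
// slideshow.php - shows two listings per "page" and auto-advances with a JS redirect.
$perPage  = 2;
$page     = max(0, (int) ($_GET['page'] ?? 0));
$interval = 30000;   // milliseconds before advancing to the next page

$xml      = simplexml_load_file('https://example.com/feed.xml');   // placeholder feed URL
$listings = $xml->xpath('//listing');                              // placeholder element name
$total    = count($listings);

foreach (array_slice($listings, $page * $perPage, $perPage) as $listing) {
    echo '<h2>' . htmlspecialchars((string) $listing->address) . '</h2>';
    echo '<p>'  . htmlspecialchars((string) $listing->price)   . '</p>';
}

// Wrap back to the first page when the listings run out.
$next = (($page + 1) * $perPage >= $total) ? 0 : $page + 1;
echo "<script>setTimeout(function () {
    window.location = 'slideshow.php?page={$next}';
}, {$interval});</script>";
```

A <meta http-equiv="refresh" content="30;url=slideshow.php?page=..."> tag would do the same job without depending on JavaScript at all, which may feel a bit safer for an unattended on-air machine.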