Audio streaming with ajax - php

I have a web system (in PHP) in which I must implement audio streaming. I have seen many solutions that rely on WebSockets and Node.js, but for certain reasons I cannot make use of them.
Is there any way to do it with ajax? For example: audio that is recorded, saved on the server, and can then be retrieved by clients.

Based on your question, I don't think ajax is going to be the solution. The reason is that ajax is not a continuous stream, but rather a callback between your end user's client (browser) and the server. Once the ajax call completes, the connection to the server ends. You may be able to find a drop-in jQuery solution to play the audio client side, but the best you'll be able to do with ajax is use it as a means of delivering files, or references to files. For example, you COULD use ajax to call back to the server, get file data, and pass that data to the client-side script that handles the audio playback. Also, not sure why this question got a downvote, as it's a perfectly valid question.
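As a rough sketch of that "deliver a reference to the file" approach in PHP (the recordings/ directory, the .mp3 extension and the endpoint name are all assumptions), the server could expose something like this, and the client-side ajax success handler would simply drop the returned URL into the src of an <audio> element:

<?php
// latest_audio.php - minimal sketch: return a JSON reference to the most
// recently saved recording so an ajax call can load it into an <audio> element.
header('Content-Type: application/json');

$dir   = __DIR__ . '/recordings';   // assumed upload/recording directory
$files = glob($dir . '/*.mp3');     // assumed mp3 recordings

if (!$files) {
    echo json_encode(['file' => null]);
    exit;
}

// Sort by modification time, newest first.
usort($files, function ($a, $b) {
    return filemtime($b) - filemtime($a);
});

echo json_encode([
    'file'    => 'recordings/' . basename($files[0]),
    'updated' => filemtime($files[0]),
]);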

Related

Long Polling: Notifications in every browser

I am starting a long-polling request on every page of my website. It checks the database every second, and if there is something new it marks it as seen and outputs the notification. The calling JavaScript then starts a new ajax request.
Now I have a problem when multiple tabs are open on the website, because only one of them will receive a new notification. The same problem occurs across browsers with the same username logged in...
What would be the smartest way to solve this in a fool-proof way?
Thank you for your input!
I think it is better to avoid browser polling. You will run into browser problems, and your infrastructure would also have to be huge to support it.
Try a server-side push technique like Comet:
Comet is a web application model in which a long-held HTTP request
allows a web server to push data to a browser, without the browser
explicitly requesting it.
Another approach would be to use WebSockets.
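Whichever transport you end up with, the step that actually breaks multiple tabs is marking notifications as "seen" on the server: the first tab that asks consumes the notification. A common fix is to have each tab send the timestamp of the last notification it rendered and return everything newer. A rough PHP sketch (the DSN, table and column names are hypothetical, and created_at is assumed to be a unix timestamp):

<?php
// notifications.php - returns every notification newer than the client-supplied
// timestamp, so every tab and every browser of the same user gets the same data.
header('Content-Type: application/json');
session_start();

$since  = isset($_GET['since']) ? (int) $_GET['since'] : 0;
$userId = $_SESSION['user_id'] ?? 0;

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT id, message, created_at
       FROM notifications
      WHERE user_id = ? AND created_at > ?
      ORDER BY created_at ASC'
);
$stmt->execute([$userId, $since]);

echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

Each tab then remembers the largest created_at it has seen and passes it as the since parameter on its next request; this works the same way whether that request is a plain poll or a long-held Comet request.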

How to share real time updates on a website between users on different computers?

I'm trying to figure out a way for users of a website (say a student and teacher) to share a secure connection where real time updates on one page are viewed by both of them.
From research I've concluded that some of the real time updates could be performed using ajax and javascript.
But I'm stumped as to how users could share a connection where only those two users would see the updates that take place on the website (such as Flash animations on a drawing board). I'm also confused about how you would even begin to set up a connection like this.
I've looked into PHP sessions and cookies, but I'm not sure I'm doing the right research.
Any pointers as to how two specific users could share a secure connection where real-time updates are viewed by the both of them only? I don't want a terse response please. I'm looking for specific details like functions and syntax specific to PHP. I appreciate the help and will rate you up if you give me good answers!
You cannot share a secure connection (e.g. HTTPS); it's one client to one server.
If both clients are logged in and have a background AJAX task running in the browser, is it acceptable to have each client "pull" the same data every few seconds so it is displayed to both users?
This would require the "drawing board" updates to also be sent continuously back to the server, so the updated data can be shared with the other client. I'm sure there will be an event you can use to trigger the post of data (e.g. on mouse up).
If performance is an issue, you'd want a server technology such as Java that is able to keep session state in memory between requests without having to persist it to a database.
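For example, a minimal sketch of such a shared endpoint in PHP (the room parameter, the temp-file storage and the JSON payload are all assumptions; a real app would verify that both users are logged in and would probably use a database):

<?php
// board.php - one client POSTs its latest board state (e.g. on mouse up),
// the other client GETs it every few seconds and redraws from it.
session_start();

$room = preg_replace('/[^a-z0-9]/i', '', $_REQUEST['room'] ?? 'default');
$file = sys_get_temp_dir() . "/board_$room.json";

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Store the raw JSON body sent by the drawing client.
    file_put_contents($file, file_get_contents('php://input'), LOCK_EX);
    http_response_code(204);
    exit;
}

// The polling client fetches the latest shared state.
header('Content-Type: application/json');
echo file_exists($file) ? file_get_contents($file) : '{}';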
You can look at ajax push techniques. I used Comet once, where an administrator posted messages and everybody who was logged in saw the message appear on their screen. I don't know if Comet supports PHP; I only used it with JSP. Just search for "ajax push" on Google.
Flash allows for connections between users, I think they refer to them as sockets.
If you want to use Ajax et al., you need a server-side technology that supports push.
Node is the standard in this regard, and you can set up a Heroku instance for free.
There are others, and you need to learn the tools before you can even begin building the application.
Among the many overviews, this might interest you:
http://arstechnica.com/business/2012/05/say-hello-to-the-real-real-time-web/?1
A few good examples where this is happening:
Google Docs
Etherpad
HTML5 Games: Multi player
Techniques you can use (with varying browser support)
HTML5 WebSockets (Wikipedia; MDN; HTML5 Demo)
Comet (Wikipedia)
Really pushing data from a server to a web browser client (which the server would do when it receives something from another client) is only possible with WebSockets as far as I know. Other mechanisms would require either browser plugins or a stand-alone application.
However, with Comet (through AJAX) you can get really close to pushing data by polling the server periodically for data. Contrary to traditional polling (e.g. where a client asks for data every 5 seconds), with the Comet principle the server will hold that periodic request hostage for, say, up to 30 seconds. The server will not send back data until either it has data or the timeout is reached. That way, during those 30 seconds, any data that the server receives can be pushed instantly to the other clients. Right after that, the client starts a new 30-second request, and so forth.
Although both Comet and WebSockets should work with a PHP backend served by Apache, I'd recommend looking into NodeJS (as the server technology) for this.
There is a lot of information regarding Comet on the internet, I suggest you google it, maybe start at Wikipedia.
The great thing about Comet is that it is more a principle than a technology. It uses what we already have (simple HTTP requests with AJAX), so browser support is very wide.
You could also do a combination, where you use WebSockets if supported and fall back to Comet.
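As an illustration of that long-held request with a plain PHP backend (only a sketch: it reuses the hypothetical board_<room>.json file from the earlier example, holds the request for up to 30 seconds, and each held request ties up a PHP worker, which is one reason NodeJS is recommended above):

<?php
// poll.php - hold the request until the shared state changes or ~30s pass.
session_start();
session_write_close();   // release the session lock so other requests aren't blocked

$room  = preg_replace('/[^a-z0-9]/i', '', $_GET['room'] ?? 'default');
$file  = sys_get_temp_dir() . "/board_$room.json";
$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;   // mtime from the last response

set_time_limit(40);
$deadline = time() + 30;

while (time() < $deadline) {
    clearstatcache();
    $mtime = file_exists($file) ? filemtime($file) : 0;
    if ($mtime > $since) {
        header('Content-Type: application/json');
        echo json_encode([
            'mtime' => $mtime,
            'state' => json_decode(file_get_contents($file)),
        ]);
        exit;
    }
    usleep(500000);   // check twice per second
}

// Nothing new before the timeout; the client just reconnects immediately.
http_response_code(204);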
I'm sure you have looked into this. The suggestion that this can be done via ajax is misleading if it makes you believe that two users of a website can communicate directly via JavaScript.
As you are aware, JavaScript happens on the client, and ajax is essentially 'talking to the server without a page change or refresh'.
The communication between two users of the website has to happen via the server: PHP and some chosen datastore.
Hope that wasn't terse.
cheers, Rob

Getting remote data - PHP or javascript?

What method would you choose for retrieving tweets from Twitter, for example?
Using JSON with jQuery (like http://juitter.com/), where the client makes the request.
Using PHP, e.g. cURL, where the server makes the request.
What are the advantages/disadvantages of both?
Using the server-side approach, you may benefit from caching facilities and an overall faster page load (no roundtrips to Twitter, or any other third-party service, on each request).
The other benefit of the server-side approach is that users coming from countries (or even companies) where Twitter is blocked won't see a big empty pane, or even worse a mangled layout. And even when Twitter goes down, you have a cache somewhere to show the recent tweets.
But the most obvious advantage of the client-side approach is that it is hassle-free.
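A rough sketch of that server-side approach with a simple file cache (the API URL is a placeholder for whatever tweet endpoint you use, and the 5-minute cache window is an arbitrary choice):

<?php
// tweets.php - fetch tweets with cURL and cache them on disk so most page
// loads are served locally, with a stale-cache fallback if the service is down.
$apiUrl    = 'https://api.example.com/tweets?user=someuser';   // placeholder
$cacheFile = sys_get_temp_dir() . '/tweets_cache.json';
$cacheTtl  = 300;   // seconds

if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $cacheTtl) {
    $json = file_get_contents($cacheFile);      // fresh cache: no roundtrip at all
} else {
    $ch = curl_init($apiUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $json = curl_exec($ch);
    curl_close($ch);

    if ($json !== false) {
        file_put_contents($cacheFile, $json, LOCK_EX);
    } elseif (file_exists($cacheFile)) {
        $json = file_get_contents($cacheFile);  // service down: serve the stale copy
    }
}

header('Content-Type: application/json');
echo $json ?: '[]';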
There is a distinction between the two approaches. When making outside HTTP requests from the server side, you retrieve the data to the server; if you want that data to be seen by the client, you have to send it with the next request the client makes to your server. With server-side requests you can make native HTTP requests, including cross-domain requests.
Client-side cross-domain requests retrieve the data directly to the client. You can display it as soon as the request returns, but if you want the data on the server side (e.g. to store the tweets in a database) you have to send it from the client back to the server. JavaScript can't do cross-domain XHR requests, so to work around that you (or the libraries you use) rely on tricks such as iframes, or including JS files that already contain the data you need, etc.
If you need to consume a web service, I'd advise using the backend as the service client, and either doing a periodic pull from the client side or using one of the "Comet" techniques.
I think it depends on how frequently the stream you are pulling in gets updated. If it is updated very frequently, JS is better, because you can run the call continually without refreshing the page; if it updates less frequently, you can pull all the data in using cURL and use JS for animations.
Client-side requests are better when you have to make many requests (e.g. for a public site), as you lower your server load, avoid a bottleneck, and possibly benefit from content-delivery networks (CDNs) caching the requests on behalf of your web clients. You also move some of the liability from yourself to the users of your site, since they are the ones actually accessing the third-party API (which may matter if you have really many requests; some API terms of use even limit the number of requests per time unit, so that client-side requests are the only option for big sites).
Server-side requests have the advantage of not requiring JavaScript to be enabled on the client side and can also be logged easily for statistical purposes or processed further.
using server as "MITM" you may cache, alter, insert data from 3rd part before published to your users. Your users may dislike it, though...
I would prefer the client side for displaying tweets rather than the server side, for the following reasons:
Since tweets are displayed directly in the browser in real time, jQuery fits this scenario well. The page refreshes quickly along with your tweets (using multiple asynchronous ajax calls, of course).
When you take the server-side route, it costs an extra roundtrip to the server and back to the client. Do you need that trip when all you need is to display tweets? Tweets are just social messages (say hello!) and they are not supposed to contain anything harmful.
You generally take the server-side approach when you need to do data validation before displaying on the browser. For tweets, that's not necessary.
// for the client making the request
jQuery.ajax({
    url: 'target_url',
    type: 'get',
    dataType: 'jsonp',
    success: function (data) {
        // do something with data
    }
});
// note: dataType is 'jsonp', not 'json', because calling Twitter from your own
// domain would be blocked by the browser (cross-domain requests are not allowed),
// so you have to use JSONP.

Simulating Browser Clicks In PHP

I want to write a PHP script that performs a routine task in a web app I use. I am trying to figure out the easiest way to submit a form, click a link, and get some information. What's the easiest way to do this (keeping the session open, etc.)?
JavaScript would be a better solution than PHP. You can use it in tandem with PHP to submit a form that references the same page, i.e. <form action='index.php' method='post'>
If the method is GET, then you ought to be able to work it out from the URLs of a few real-world attempts.
If it's POST, then you are probably out of luck unless it's your own web page/app and you know what $_POST data it expects ... unless you find a tool to snoop your HTTP traffic and get the POST info from observing a few real-world examples.
You can use cURL in PHP to simulate submitting data, clicking links, etc., I suppose, but a client-side scripting language like JavaScript (as opposed to a server-side language like PHP) is more suited to what you're describing. I'd need more info to give you a specific example.
You will not be able to directly emulate those events in PHP, as web apps use JavaScript on the client side, and PHP is a different language that operates on the server side.
Firstly, I would see if there is an open API available for the web app you're wondering about, e.g. Gmail: http://code.google.com/apis/gmail/ . Not all APIs can do what the web app can do, so you'll need to check the documentation to make sure the API does what you want and has an easy way to interface with PHP.
The other option is to essentially reverse engineer how the web app communicates with its server. Almost all web apps operate by sending POST or GET HTTP data in some serialized format like XML, JSON, or plain text. You can use something like the Firebug add-on for Firefox to view the POST/GET data. If you know what the server sends to the client and what the client sends to the server, you can write a script using something like cURL to emulate the client in PHP instead of JavaScript. This would take quite a bit of work and probably involves a lot of trial & error.
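As a rough sketch of that cURL emulation (every URL and field name here is hypothetical and would have to be discovered by watching the real requests in Firebug, as described above):

<?php
// Log in once so the session cookie lands in a cookie jar, then perform the
// "routine task" request while reusing that same session.
$cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

// 1. Submit the login form.
$ch = curl_init('https://example.com/login.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'username' => 'me',
        'password' => 'secret',
    ]),
    CURLOPT_COOKIEJAR      => $cookieJar,
    CURLOPT_COOKIEFILE     => $cookieJar,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
]);
curl_exec($ch);
curl_close($ch);

// 2. "Click the link" / perform the task with the same cookies.
$ch = curl_init('https://example.com/do_task.php?id=123');
curl_setopt_array($ch, [
    CURLOPT_COOKIEJAR      => $cookieJar,
    CURLOPT_COOKIEFILE     => $cookieJar,
    CURLOPT_RETURNTRANSFER => true,
]);
$html = curl_exec($ch);
curl_close($ch);

// 3. Parse whatever information you need out of the returned HTML.
echo $html;

Keep in mind that many apps also put a CSRF token in their forms; in that case you first GET the form, extract the token from the HTML, and include it in the POST.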

Real Time Twitter Streaming without constant Polling

I was looking at Twitter Streaming API for getting a real-time feed.
But I don't want it stored on my server.
I just want it pulled from the server and the browser page will retrieve the data from my server's twitter pull URL.
But I want to avoid polling my server every few milliseconds.
Is there a way for my server script to keep pushing data to my browser page?
Just how live do you want it? There are ways to set up sockets, but they can be fairly complex, and still consume their fair share of bandwidth.
Is it acceptable to poll every 5, 10 seconds or so? Every few milliseconds would give you pretty darn "live" results, but I would not be upset as a user if it took a matter of seconds for something to appear on your website. That would be satisfactorily "live" for me.
Check out COMET.
In web development, Comet is a neologism to describe a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it.
I've always wanted to try this method but I haven't gotten around to it:
Hidden IFrame
A basic technique for dynamic web application is to use a hidden IFrame HTML element (an inline frame, which allows a website to embed one HTML document inside another). This invisible IFrame is sent as a chunked block, which implicitly declares it as infinitely long (sometimes called “forever frame”). As events occur, the iframe is gradually filled with script tags, containing JavaScript to be executed in the browser. Because browsers render HTML pages incrementally, each script tag is executed as it is received.[8]
One benefit of the IFrame method is that it works in every common browser. Two downsides of this technique are the lack of a reliable error handling method, and the impossibility of tracking the state of the request calling process.[8]
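A minimal sketch of the server side of such a forever frame in PHP (parent.handleTweet() is an assumed JavaScript function defined on the hosting page, and the data here is faked with a timestamp; a real script would read from your Twitter pull instead):

<?php
// forever_frame.php - the hidden <iframe> on your page points here; each new
// chunk is a <script> tag that the browser executes as soon as it arrives.
set_time_limit(0);
header('Content-Type: text/html; charset=utf-8');

while (ob_get_level() > 0) {
    ob_end_flush();   // disable output buffering so chunks reach the browser
}

echo str_repeat(' ', 1024), "<html><body>\n";   // padding so rendering starts immediately
flush();

while (!connection_aborted()) {
    $tweet = ['text' => 'demo tweet at ' . date('H:i:s')];   // stand-in for real data
    echo '<script>parent.handleTweet(' . json_encode($tweet) . ");</script>\n";
    flush();
    sleep(5);   // in reality, wait for new data instead of a fixed interval
}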
