Download single file with multiple connections (PHP)

Is it possible to download a single file over multiple parallel connections with PHP?

As far as I know, not easily, no.
You would have to build a PHP implementation of whatever protocol, or extension to HTTP, allows a file to be downloaded over multiple connections.
While that may not be impossible, I've never heard of such an implementation, and even if one exists, it stands to reason that doing this in PHP would be a horrible drain on server-side resources.
I recommend you look for solutions on the server side instead (Apache modules and settings, for example).

Not through PHP alone. You may be able to do this using some client-side components (maybe even JavaScript), but I suspect it would be a lot of work. Many companies that distribute software, instead of having the user download installers via HTTP, deliver a small executable with a user interface that then downloads the file with all the optimizations possible. Maybe you could go this way?

You can use the PECL http extension. It has a "range" option for requests, and it can wait for incoming response data in the background (open multiple socket connections, non-blocking function calls). I've even seen callbacks mentioned somewhere in there.
And you want to look into HttpRequestPool. Don't ask me for examples, though. If it's that important for your case, you'll have to write the processing logic yourself.
Or, from a quick Google search:
Parallel HTTP requests in PHP using PECL HTTP classes [Answer: HttpRequestPool class]
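A minimal sketch of the multi-range idea, assuming the old pecl_http v1 API (HttpRequest / HttpRequestPool); the URL and part boundaries are placeholders:

    $url    = 'http://example.com/file.bin';
    $ranges = array('bytes=0-499999', 'bytes=500000-999999');  // two halves, total size assumed known

    $pool  = new HttpRequestPool();
    $parts = array();
    foreach ($ranges as $i => $range) {
        $req = new HttpRequest($url, HttpRequest::METH_GET);
        $req->addHeaders(array('Range' => $range));   // ask for just this slice
        $pool->attach($req);
        $parts[$i] = $req;
    }
    $pool->send();   // all requests go out in parallel; blocks until every one is done

    $data = '';
    foreach ($parts as $req) {             // reassemble in range order, not completion order
        $data .= $req->getResponseBody();
    }
    file_put_contents('file.bin', $data);

Note this only helps if the server honors Range requests (responds with 206 Partial Content); otherwise each connection just receives the whole file.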

Related

Websites on the same server communicating

I want to separate out the API calls my site makes to another install, as the site has become big and buggy with everything lumped together. I would like to know what ways there are, if any, to make two sites communicate when they are on the same server.
I originally thought I could have the client-facing site just include the models from the API site through a custom loader for CodeIgniter, but I am currently leaning towards having the API site take advantage of Laravel, which would obviously rule out loading them directly.
Currently I have some calls which use cURL to POST requests; is this the only way? I was hoping to drop the HTTP calls in favour of something more direct.
As I said in my comments to the question, I'm definitely no expert on this kind of stuff, but my original thinking was that IPC-style stuff could be done, maybe using named pipes.
PHP does allow for this via its POSIX and process control functions. Simply use posix_mkfifo to create a named pipe, and then you should be able to use fopen, fread, etc. (along with the stream_* functions if you need them) to write to and read from the pipe. However, I'm not sure how well that works with a single writer and multiple readers, and it's also probably quite a large change to your code to replace the HTTP calls you currently have.
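A minimal sketch of the named-pipe idea, assuming POSIX support is available (the FIFO path and message format are made up for illustration):

    $fifo = '/tmp/api_ipc_fifo';             // hypothetical pipe location
    if (!file_exists($fifo)) {
        posix_mkfifo($fifo, 0600);
    }

    // Writer side (e.g. the client-facing site):
    $w = fopen($fifo, 'w');                  // blocks until a reader opens the pipe
    fwrite($w, json_encode(array('action' => 'sync')) . "\n");
    fclose($w);

    // Reader side (e.g. the API site, in a separate process):
    $r = fopen($fifo, 'r');                  // blocks until a writer opens the pipe
    $message = json_decode(fgets($r), true); // one newline-delimited message
    fclose($r);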
So the next possibility is that, if you want to stick with HTTP (and don't mind the small overhead of forming HTTP requests, headers, etc.), you should ensure that your server is using local sockets to cut down on transport costs. If your web site's domain name is the same as the server's hostname, this should already be happening (as your /etc/hosts file will have an entry pointing the hostname to 127.0.0.1). If not, all you need to do is add such an entry and, as far as I'm aware, it'll work. At worst you could hardcode 127.0.0.1 in your code (and ensure your webserver responds correctly to these requests), of course.

How does one achieve realtime messaging between different instances of (perhaps different) PHP scripts?

I'm working on feeding the client realtime events using event-streams and HTML5 SSEs client-side.
But some of my events will actually come from form submissions by other clients.
What's the best method for detecting these form submissions (so as to append them to the event-stream script) ASAP (after they occur)?
So essentially, I need realtime cross-script messaging between multiple instances of different scripts instantiated by different clients, analogous to cross-document messaging in JS, but for PHP.
The best I can come up with is to repeatedly poll a subdir of /tmp for notification files, which is a terrible solution.
Often you can use MySQL to play the role of the /tmp dir you were talking about. This is more portable, because the scripts don't have to be on the same server, and the data is kept separate. However, the scripts will have to poll the MySQL table to see whether the other one has written anything. The other option is to open sockets and write back and forth in real time, or to use a prebuilt tool made for exactly this purpose, which I'm fairly sure exists.
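A minimal sketch of the MySQL approach from the event-stream script's point of view; the table name and schema (events: id, payload) are made up for illustration:

    set_time_limit(0);
    header('Content-Type: text/event-stream');
    header('Cache-Control: no-cache');

    $pdo    = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'pass');
    $lastId = 0;

    while (true) {
        $stmt = $pdo->prepare('SELECT id, payload FROM events WHERE id > ? ORDER BY id');
        $stmt->execute(array($lastId));
        foreach ($stmt as $row) {
            echo "id: {$row['id']}\n";           // SSE framing
            echo "data: {$row['payload']}\n\n";
            $lastId = (int) $row['id'];
        }
        @ob_flush();
        flush();
        sleep(1);   // still polling, but centralized, portable, and out of /tmp
    }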
If you want the events to be triggered in near realtime, then you need to handle them synchronously, which means running a daemon. And the simplest way to implement a daemon which can synchronize data across client connections is to use an event-based server. There's a nice implementation of the latter in PHP, and there are plenty of examples of how to daemonize a PHP process on the internet. Then just open a blocking connection to the server from your PHP code / access it via comet.
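For a flavor of what such an event-based server looks like, here is a minimal sketch using stream_select() (the port and broadcast protocol are made up; a real implementation needs buffering and error handling):

    $server  = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
    $clients = array();

    while (true) {
        $read  = array_merge(array($server), $clients);
        $write = $except = null;
        if (stream_select($read, $write, $except, null) < 1) {
            continue;
        }
        foreach ($read as $sock) {
            if ($sock === $server) {
                $clients[] = stream_socket_accept($server);   // new subscriber
            } elseif (($msg = fgets($sock)) !== false) {
                foreach ($clients as $c) {                    // broadcast to everyone
                    fwrite($c, $msg);
                }
            } else {                                          // client disconnected
                $key = array_search($sock, $clients, true);
                if ($key !== false) {
                    unset($clients[$key]);
                }
                fclose($sock);
            }
        }
    }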

PHP Threading and high-latency file access (eg; FTP)

This is a bit complicated, so please don't jump to conclusions, feel free to ask about anything that is not clear enough.
Basically, I have a websocket server written in PHP. Please note that websocket messages are asynchronous, that is, a response to a request might take a lot of time, all the while the client keeps on working (if applicable).
Clients are supposed to ask the server for access to files on other servers. This could be an FTP service, or Dropbox, for that matter.
Here, please take note of two issues: connections should be shared and reused, and the server actually 'freezes' while it does this work, so any further requests are only processed once the server has 'unfrozen'.
Therefore, I thought, why not offload file access (which is what freezes the server) to PHP threads?
The problem here is twofold:
how do I make a connection resource in the main thread (the server) available to the sub threads (not possible with the above threading model)?
what would happen if two threads end up needing the same resource? It's perfectly fine if one is locked until the other one finishes, but we still need to figure out issue #1.
Perhaps my train of thought is all screwed up, if you can find a better solution, I'm eager to hear it out. I've also had the idea of having a PHP thread hosting a connection resource, but it's pretty memory intensive.
PHP has no native thread support. The purpose of PHP is to respond to web requests quickly; that's what the architecture was built for. Various libraries try to emulate threads, but they usually cause more issues than they solve.
In general there are two ways to achieve what you want (the second is sketched below):
Off-load the long processes to an external process. A common approach is using a job system like Gearman: http://php.net/gearman
Use asynchronous operations. Some stream operations provide an "async" flag or a non-blocking mode: http://php.net/stream-set-blocking
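A minimal sketch of the second option, using a non-blocking stream so the server loop never freezes while a remote host is slow (the host is a placeholder; error handling trimmed for brevity):

    $fp = stream_socket_client('tcp://ftp.example.com:21', $errno, $errstr, 5);
    stream_set_blocking($fp, false);        // reads now return immediately

    $buffer = '';
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        if ($chunk === false || $chunk === '') {
            // Nothing ready yet: service websocket clients here instead of freezing.
            usleep(10000);
            continue;
        }
        $buffer .= $chunk;
    }
    fclose($fp);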

jQuery PHP push?

I need to implement real-time page data updates with PHP and jQuery.
(I found www.ape-project.org/ but the site seems to be down.)
Are there any other solutions?
Thanks!
You might want to check out Comet:
Comet is a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it.[1][2] Comet is an umbrella term, encompassing multiple techniques for achieving this interaction. All these methods rely on features included by default in browsers, such as JavaScript, rather than on non-default plugins. The Comet approach differs from the original model of the web, in which a browser requests a complete web page at a time.
http://en.wikipedia.org/wiki/Comet_%28programming%29
If you want to do streaming (sending multiple messages over a single long lived, low latency connection), you probably need a comet server. Check out http://cometdaily.com/maturity.html for details on a variety of server implementations (I am the maintainer of one of them - Meteor).
If you are happy to reconnect after each message is received, you can do without complicated servers and transports and just use long polling - where you make an ajax request and the server simply sleeps until it has something to send back. But you will end up with LOTS of connections hanging off your web server, so if you're using a conventional web server like Apache, make sure it's configured to handle that. By default Apache doesn't like having more than a few hundred concurrent connections.
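A minimal long-polling endpoint sketch; the file-based signalling is purely illustrative (a database check would work the same way):

    set_time_limit(0);
    $since = isset($_GET['since']) ? (int) $_GET['since'] : 0;
    $file  = '/tmp/latest_event.json';          // hypothetical event store

    for ($i = 0; $i < 30; $i++) {               // give up after ~30s so the client reconnects
        clearstatcache();
        if (file_exists($file) && filemtime($file) > $since) {
            header('Content-Type: application/json');
            echo file_get_contents($file);
            exit;
        }
        sleep(1);                               // "sleep until there is something to send"
    }
    header('HTTP/1.1 204 No Content');          // nothing new; the client simply re-polls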
There are lots of solutions for doing this...
It depends on what your data is, and on how it is organized and stored (MySQL?).
Your question is too open-ended to have one real answer.

CURL as a download manager with multiple connections and progress display in PHP

I wanted to use the cURL extension for PHP to create some sort of download manager, and I was wondering if cURL allows me to implement these two features:
1) Multiple connections or multi-part download, just like a typical desktop download manager.
2) Constantly updating the download progress on screen (text or graphical, doesn't matter).
Does cURL for PHP allow any of this? If so, care to provide some hints?
To all the "PHP isn't great for multi-tasking" critics:
Take a step back and consider that you have an awesome multithreading framework at your disposal if you're in a LAMP environment. Use this base architecture to your advantage - i.e. Apache is the multithreading manager, and a damn good one at that.
It is very easy to set up PHP to work in this environment:
Set max_execution_time = 0 to allow scripts to run indefinitely
Set ignore_user_abort = true to allow scripts to keep running even after the client has aborted
Design light-weight single-task REST web services. Design them in such a way that you don't care when they return such as in a queue type system. Writing to the queue is thread-safe and removing from the queue is thread-safe if done with some basic OS-level mutexes.
"forking" the web services is as simple as opening a file:
fclose(fopen("http://somewebservice....php?a1=v1&a2=v2&....")); // Launch a web service and continue...
Not only is this approach multi-threaded, but it is inherently distributed as well. The web service can be local or located across the world. PHP certainly doesn't care.
For a basic system the only thing that holds you back is the number of threads that apache allows. Otherwise your code is ready to take advantage of load-balancing and all the other neat tricks that advanced Apache implementations have to offer.
Too often when developers think "multi-threaded" they think "OMG I have to handle forks and execs and waits and PIDs". And if you design your system that way, you're right: it gets very complicated very quickly. Step back and use what you are given. You've got access to directories? Boom - you've got queues. You can issue web calls? Boom - you've got a multi-threaded (distributed) app. Now just merge the concepts together as your app dictates.
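A minimal sketch of the directory-as-queue idea, relying on the fact that rename() is atomic on a single filesystem (the paths and job format are made up):

    $queueDir = '/var/spool/myapp-queue';       // hypothetical queue directory

    // Producer: enqueue a job. Each job is its own file, so writes never collide.
    function enqueue($queueDir, array $job) {
        $tmp = tempnam($queueDir, 'job');
        file_put_contents($tmp, json_encode($job));
        rename($tmp, $tmp . '.ready');          // atomic on the same filesystem
    }

    // Consumer: claim the next job; rename() fails if another worker claimed it first.
    function dequeue($queueDir) {
        foreach (glob($queueDir . '/*.ready') as $file) {
            $claimed = $file . '.claimed';
            if (@rename($file, $claimed)) {     // the atomic "OS-level mutex"
                $job = json_decode(file_get_contents($claimed), true);
                unlink($claimed);
                return $job;
            }
        }
        return null;                            // queue is empty
    }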
PHP is not multi-threaded and, if you try to force it by means of multiple file calls or forking, the results are usually sub-optimal. I would advise against this. HOWEVER, it would be possible to do something like this with a mix of JS, PHP (probably not cURL, though, but a custom PHP file stream), and long polling.
The curl_multi_xyz() functions, like curl_multi_exec(), allow you to process multiple requests at the same time. Also take a look at CURLOPT_RANGE if you want to download multiple segments of the same file in parallel.
And the callback functions you can set with CURLOPT_READFUNCTION and CURLOPT_WRITEFUNCTION would allow you to send some kind of progress data to the client.
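A minimal sketch combining curl_multi_exec() with CURLOPT_RANGE, assuming the server honors Range requests (the URL and part sizes are placeholders):

    $url    = 'http://example.com/file.bin';
    $ranges = array('0-499999', '500000-999999');  // two halves, total size assumed known

    $mh      = curl_multi_init();
    $handles = array();
    foreach ($ranges as $i => $range) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_RANGE, $range);   // request only this byte range
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    do {
        curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);             // wait for activity, don't busy-loop
        }
    } while ($running > 0);

    // Stitch the parts back together in range order, not completion order.
    $data = '';
    foreach ($handles as $ch) {
        $data .= curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    file_put_contents('file.bin', $data);

Progress display could be layered on top with a write callback that tallies bytes as they arrive; that part is omitted here for brevity.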
It's possible; take a look at curl_multi_init().
No, that is not the case. It is not possible, because the download manager calls the class that handles the download 5 times - that is, 5 PHP class instances.
This is a sample class call:
$tr = new teConnections();
$data = $tr->downloadManager(array('http', 'host', 'path', 'login', 'pass', 'port'), 'file name', 'compression', 'streaming');
