PHP - Architectural advice needed with FTP streaming

I've built an application which interacts with a web camera via FTP, however upon completing it I have a speed issue. The application structure is as follows:
Web Camera -> ProFTPD/MySQL -> PHP
The web camera FTPs images to the ProFTPD server, which is managed via MySQL/PHP. PHP acts as a client for users, and it in turn pulls the latest images from the FTP server.
This works, but is really slow. The problem is on the ProFTPD -> PHP side. Using the standard PHP FTP library it takes around 4 seconds to connect to the FTP server -> do a directory listing -> output the file contents.
The speed issue is due to the authentication part of the process. From what I've seen there's no way of caching/storing/serializing the FTP connection, meaning every request to the server has to open a new connection.
These are the thoughts I've had so far:
1) Have a PHP script run in a while loop with a permanent FTP connection open, but I know PHP isn't designed to be run in this way.
2) Create a daemon running node.js / java etc which is able to keep a permanent ftp connection open, and have PHP interact with that. What I'm worried about with this approach is the extra maintenance involved in writing more code which duplicates the authentication code already written in PHP.
3) ???
Any help / suggestions would be greatly appreciated!

Wouldn't it be wise to schedule the FTP script as a cron job, so that it runs near-permanently or at a given short interval?

Why go the extra mile to read the images via FTP?
If they are on the same server, just read them with PHP directly from the storage directory.
If they are not on the same server, use some mechanism to inform PHP about the latest image (for example via a text file, a GET variable, etc.) and output the image directly over HTTP/FTP.
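For the same-server case, here is a minimal sketch of that idea: pick the newest JPEG out of the camera's drop directory and stream it, with no FTP involved at all. The `/var/ftp/camera` path is an assumption; adjust it to wherever ProFTPD writes the uploads.

```php
<?php
// Same-server case: skip FTP and read the newest image straight from
// the directory the camera uploads into. '/var/ftp/camera' is an
// assumed path -- adjust to wherever ProFTPD writes the files.

function latestImage(string $uploadDir): ?string
{
    $newest = null;
    $newestTime = 0;
    foreach (glob($uploadDir . '/*.jpg') ?: [] as $file) {
        $mtime = filemtime($file);
        if ($mtime > $newestTime) {
            $newestTime = $mtime;
            $newest = $file;
        }
    }
    return $newest;
}

$file = latestImage('/var/ftp/camera');
if ($file !== null) {
    header('Content-Type: image/jpeg');
    readfile($file);
} else {
    http_response_code(404);
}
```

This avoids the 4-second FTP authentication round-trip entirely, since it is just a local directory scan plus a file read.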

Related

Are php files stored on a web server?

I am trying to learn how to use PHP via AJAX and interact with WAMP and databases. I do not understand how this works, and it's slowing my progress significantly.
When I hear/read "upload PHP files to your server", what does that mean? I was under the impression that you included all files (PHP/JS/HTML/etc.) in the same folder locally when putting a website/app online; am I mistaken? Are files stored on the server and then executed when called?
Where should PHP files (specifically scripts to pull and send information) be located? Not understanding this is bottlenecking my progress greatly, so thanks for the help.
A server is a computer with high specifications that keeps running all the time, so that anyone can access it at any time.
Let's use an analogy. If you have worked with languages such as C, C++, or Python, you will have heard that they are high-level languages and need to be converted to machine code before they are executed.
Similarly, on the web, your browser only understands HTML (that is, how to display data on screen).
PHP is a scripting language (which means PHP decides how the program will work).
A database is a location where you can store data for later use (PHP may need to access this data for computing, e.g. to check whether a user is valid).
When you create a website, you want a computer that is available all the time (a server). But servers are expensive, so you rent some space from a company such as GoDaddy. This is like having your own computer: uploading files to the server means putting your website's files on this new computer.
Now suppose you want to access a file on your local computer. What do you do?
C:/myfolder/myfile.php
Similarly, on the server, your website's name plays the role of the drive, so if your PHP file is in the myfolder directory on your server, you reach it at:
www.mywebsite.com/myfolder/myfile.php
When you request the web page www.mywebsite.com, the request goes to the server, which processes the PHP scripts and sends back only the HTML components that the browser understands.

How to connect Android app to mySQL *SPECIFICALLY*

After at least 10 hours of poring over online resources, videos, and tutorials, I have two questions about connecting my Android app with my MySQL database.
Saving files
1) All the tutorials save the php files in C/WAMP/www/hello_php - for example, and so when you go to localhost/hello_php everything works.
--Where do I store my PHP files if I don't want to use localhost? i.e. I want to use my MySQL server's IP address.
--For example, the guy from this video uses this:
HttpPost httppost = new HttpPost("http://192.168.168.0.3/~tahseen#amin/php/getAllCustomers.php");
--I presume the 192.168... is the IP of his server. Where did he save the "getAllCustomers.php" file?
--Note, I am using phpMyAdmin to handle the database.
Existing JDBC code
2) I have already created all the code required to insert/update/delete elements from my DB. I have done this in Java using JDBC in eclipse. My understanding is that connecting my android app to my DB with JDBC is not ideal / unsafe / not recommended.
--Is all the code I wrote useless? i.e. do I have to convert it all to PHP code?
Thanks in advance for your help
The PHP file in your example is stored in the home folder of user 'tahseen#amin', in the subdirectory 'php'. You can put your PHP file anywhere on the server as long as it is accessible via HTTP requests. Usually you would put the files in a subdirectory of the web root folder, which is often /var/www/ on the server.
As far as I know, Android has no built-in support for connecting directly to MySQL databases, so you have to do the queries via PHP (or another programming language, as long as it is accessible as a service on your server). You can then send HTTP requests from your Android application to perform the database operations via the PHP scripts on the server.
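As a sketch of that pattern: a small PHP endpoint that runs the query and returns JSON the Android app can parse over HTTP. The `customers` table and its columns here are assumptions for illustration, not taken from the video; keeping the query in a function makes it testable against any PDO connection.

```php
<?php
// Hypothetical getAllCustomers.php-style endpoint. The 'customers'
// table and its columns are assumptions for illustration.

function customersToJson(PDO $pdo): string
{
    $rows = $pdo->query('SELECT id, name FROM customers')
                ->fetchAll(PDO::FETCH_ASSOC);
    return json_encode($rows);
}

// When served over HTTP (credentials below are placeholders):
// $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
// header('Content-Type: application/json');
// echo customersToJson($pdo);
```

The Android side then issues a plain HTTP request (as in the `HttpPost` snippet above) and decodes the JSON, so no JDBC driver ever runs on the phone.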

Python > PHP - Download files and folders overwriting

I develop some Python applications, so I know how to do this in Python locally, but I am working with some PHP developers (I know nothing of PHP) who say this can't be done in PHP. This is the idea: a PHP-driven remote website creates and hosts files. Using a web browser, I want to download from this website a series of folders and files onto the local machine, overwriting already existing files/folders with the same names. So in my browser I click on a download button, which asks me to browse to a local or network folder to download the folders and files to. Currently we just download a single .zip file containing all these files and folders, which we have to unzip and manually move, copy, paste, etc. It is very messy and cumbersome. Is there a better way with PHP or some other language?
No, it's not possible for PHP (a server-side language) to reach into the client machine from the browser and directly manipulate its file system, hard drive, or anything like that. This is not the way it works.
Just think about it for a moment: if that could be accomplished, we would have a serious security threat. For example, we visit a page like somebadassdude.com, and it has a PHP script that creates unlimited folders and files until our hard drive fills up; and that is a mild example.
Thankfully, browsers don't allow this, by security design.
The browser and the server talk to each other only through HTTP requests and responses. There is no communication channel between them like a local program running on the client OS. You deal with the user's browser, and there is no way to command the browser to manipulate the client's hard disk; if that could happen, consider the security concern mentioned above.
To be clearer: your PHP script runs on your server, not on the client machine. It only responds when a user's browser requests a specific resource on your server, and it answers with an HTTP response, which can contain HTML, JSON, a file (to be downloaded or opened by an external program), or whatever else.
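To make the boundary concrete, the most PHP can do is emit a download response, as in the sketch below; where the browser saves the file afterwards is entirely the user's choice, outside PHP's reach.

```php
<?php
// Sketch: stream a server-side file back as an HTTP download. PHP's
// involvement ends once the bytes are sent; the save location is the
// browser's (i.e. the user's) decision.

function sendDownload(string $path): void
{
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . (string) filesize($path));
    readfile($path);
}
```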
You have limited options:
1) If it is for an intranet or local network, and you have access to that network (locally, or remotely via something like VPN), you could share a folder over the network. That way you can use a PHP or Python script to create the folders and copy the files into place, without having to download a zip and unzip it manually from the browser.
2) Use a Java applet. Why? Because a Java applet runs on the client side, so you have access to the user's computer (if the user allows it), and you can certainly manipulate (create, delete, read, etc.) folders and files on the hard drive. So when the user chooses the files to download, you fire up the Java applet and let it request from the server the files the user has marked. When the files are downloaded, create or overwrite them on the client machine.
3) Create and run a program on the client machine instead of a web page; this gives you the needed flexibility, but of course it has its own complexities.
So IMHO I think the Java applet may be the best-suited solution for you:
It does not require changing your current business model much.
It doesn't require a large time investment.
It is cross-platform: Java works on plenty of operating systems, and Java applets run in the most popular browsers.
By the way, I personally dislike Java, but it's a tool, and you have to use the right tool for the job.
Cheers.

what is the most reliable method to get a file from remote server in php

I am writing a PHP script that will be installed on a users server. I want to pull a file from MY remote server (a zip file) and write it to the local server that the script is running on.
I can do it just fine using file_get_contents and also using Snoopy.
However, the issue is that my script will be distributed to lots of people who will have a variety of servers. They will range from freebie webspace to their own dedicated rack with everything in between :) I can't be certain that the correct options will be enabled for file_get_contents to work and I know that, if certain options are disabled, it will be unlikely that the user will be able to get them enabled.
So, I am thinking that pretty much the only thing I can guarantee is that they will have PHP 4+. What is the best way to pull the remote file, i.e. which way has the best chance of working on such a large range of webservers:
file_get_contents
Snoopy (using fetch)
fsockopen
Any ideas or comments would be MUCH appreciated :)
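One hedged approach to the question above: probe at runtime and fall back, rather than betting on any single mechanism being enabled. The sketch below uses modern PHP syntax for clarity; on a genuinely PHP 4-era host you would drop the type hints.

```php
<?php
// Try the available transports in order of convenience. Returns the
// response body on success, or null if no method worked.

function fetchUrl(string $url): ?string
{
    // file_get_contents needs allow_url_fopen for http:// URLs.
    if (ini_get('allow_url_fopen')) {
        $data = @file_get_contents($url);
        if ($data !== false) {
            return $data;
        }
    }
    // Fall back to cURL if the extension is loaded.
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $data = curl_exec($ch);
        curl_close($ch);
        if ($data !== false) {
            return $data;
        }
    }
    // A hand-rolled fsockopen HTTP request would be the last resort;
    // omitted here for brevity.
    return null;
}
```

fsockopen is the lowest-level option and the hardest to disable, which is why it makes a sensible final fallback when both of the friendlier mechanisms are unavailable.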

php script to read xml data on remote unix server

I have a situation where I have lots of system configurations/logs from which I have to generate a quick review of the system, useful for troubleshooting.
At first I'd like to build a kind of web interface (most probably a PHP site) that gives me a rough snapshot of the system configuration using the available information from support logs. The support logs reside on mirrored servers (call them log servers), and the server on which I'll be hosting the site (call it the web server) will have to use SSH/SFTP to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs location.
It'll then trigger a Perl script on the log server, which will collect the relevant material from all the config/log files into some useful XML (there'd be multiple such files).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML output.
I'm very new to PHP and would like to know if this is feasible, or if there's any alternative/better way of doing it?
It would be great if someone could provide more details for the same.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't generated on a live machine; I'm dealing with sustaining activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers that folks on my team would like to look at.
Security is not a big concern here (I'm ok with using plain text authentication to log servers) as these servers can be accessed only through company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
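A minimal SimpleXML sketch, using a made-up fragment of what such a support-log XML might look like (the element names are assumptions):

```php
<?php
// Parse a (hypothetical) support-log XML snippet with SimpleXML and
// pull a few fields out of it.

$doc = simplexml_load_string(<<<XML
<system>
  <hostname>nas01</hostname>
  <disks>
    <disk name="sda" size="2T"/>
    <disk name="sdb" size="2T"/>
  </disks>
</system>
XML
);

$hostname  = (string) $doc->hostname;               // element text
$diskCount = count($doc->disks->disk);              // number of <disk> nodes
$firstDisk = (string) $doc->disks->disk[0]['name']; // attribute access
```

From values like these it is straightforward to emit the HTML snapshot page.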
While you could do this using something like Expect (I think there is a PHP binding too), I would recommend doing it in two separate steps:
A script, run via cron, retrieves data from the servers and stores it locally.
The PHP script reads only from the locally stored data, in order to generate reports.
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect to the servers via SSH.
You avoid the security risks of allowing your web server's user to log in to other servers (a high risk in case your script gets hacked).
In case of slow or absent connectivity to the servers, a long time needed to retrieve logs, etc., your PHP script will still be able to show the data quickly, perhaps along with an error message explaining what went wrong during the latest update.
In any case, your PHP script will finish much more quickly, since it only has to retrieve data from local storage.
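The cron-side retrieval step might look like this sketch. The host name and paths are assumptions, and it presumes key-based SSH so the script never handles a password:

```php
<?php
// Cron-side fetch: copy the generated XML files from the log server
// into a local cache directory, so the web-facing script only ever
// reads local files.

function buildScpCommand(string $server, string $remoteDir, string $localCache): string
{
    // Quote the remote glob locally; the remote shell expands *.xml.
    return sprintf(
        'scp -q %s %s',
        escapeshellarg($server . ':' . $remoteDir . '/*.xml'),
        escapeshellarg($localCache . '/')
    );
}

// Assumed host and paths for illustration:
$cmd = buildScpCommand('loguser@logserver', '/var/log/support/xml', '/var/cache/loggui');
// In the actual cron job:
// passthru($cmd, $status);
// exit($status);
```

Running this every few minutes from cron keeps the local cache fresh enough for the web interface without the web server ever opening an SSH session itself.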
Update: ssh client via php
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report generation tool; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
Alternate way: taking files from NFS/SAMBA share
Another way, which avoids SSH, is to browse the files on the remote machines via a locally mounted share.
This is especially useful if the interesting files are already shared by a NAS, while I wouldn't recommend it if that would mean sharing the whole root filesystem or huge parts of it.
