After at least 10 hours of poring over online resources, videos, and tutorials, I have two questions about connecting my Android app with my MySQL database.
Saving files
1) All the tutorials save the PHP files in C:\wamp\www\hello_php, for example, so that when you go to localhost/hello_php everything works.
--Where do I store my PHP files if I don't want to use localhost? i.e. I want to use my MySQL server's IP address.
--For example, the guy from this video uses this:
HttpPost httppost = new HttpPost("http://192.168.168.0.3/~tahseen#amin/php/getAllCustomers.php");
--I presume the 192.168... is the IP of his server. Where did he save the "getAllCustomers.php" file?
--Note, I am using phpMyAdmin to handle the database.
Existing JDBC code
2) I have already written all the code required to insert/update/delete elements in my DB. I did this in Java using JDBC in Eclipse. My understanding is that connecting my Android app to my DB with JDBC is not ideal / unsafe / not recommended.
--Is all the code I wrote useless? i.e. do I have to convert it all to PHP code?
Thanks in advance for your help
The PHP file in your example is stored in the home folder of user 'tahseen#amin', in the subdirectory 'php'. You can put your PHP files anywhere on the server as long as they are accessible via HTTP requests. Usually you would put them in a subdirectory of the web root folder, which is typically /var/www/ on the server.
As far as I know, Android has no built-in support for MySQL databases, so you have to run the queries via PHP (or another programming language, as long as it is accessible as a service on your server). Your Android application then sends HTTP requests to perform the database operations via the PHP scripts on the server.
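A minimal sketch of what a script like getAllCustomers.php might look like; the credentials, database name, and customers table here are assumptions for illustration, not taken from the video:

<?php
// getAllCustomers.php -- minimal sketch; host, credentials, database,
// and the `customers` table are placeholders, not from the question.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'shop_db');
if ($db->connect_error) {
    http_response_code(500);
    exit('Database connection failed');
}

$result = $db->query('SELECT id, name FROM customers');
$rows = array();
while ($row = $result->fetch_assoc()) {
    $rows[] = $row;
}

// Return JSON so the Android app can parse the response easily.
header('Content-Type: application/json');
echo json_encode($rows);

Your Android code then requests that URL (as in the HttpPost line above) and parses the JSON, so the SQL you already wrote for JDBC can be reused almost verbatim inside the PHP scripts.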
Related
I am trying to learn how to use PHP via AJAX and interact with WAMP and databases. I do not understand how this works, and it's slowing my progress significantly.
When I hear or read "upload PHP files to your server", what does that mean? I was under the impression that you included all files (PHP/JS/HTML/etc.) in the same local folder when putting a website/app online. Am I mistaken? Are files stored on the server and then executed when called?
Where should PHP files (specifically scripts that pull and send information) be located? Not understanding this is bottlenecking my progress greatly, so thanks for the help.
A server is a computer with high specifications that keeps running all the time, so that anyone can access it at any time.
Let's use an analogy. If you have worked with languages such as C, C++, or Python, you will have heard that they are high-level languages and need to be converted to machine code before they are executed.
Similarly, on the web, your browser only understands HTML (that is, how to display data on screen).
PHP is a scripting language (which means PHP decides how the program will behave on the server).
A database is a location where you can store data for later (PHP may need to access this data, e.g. to check whether a user is valid).
When you create a website, you want a computer that is available all the time (a server). But servers are expensive, so you rent some space from a company such as GoDaddy. This is like having your own computer, and uploading files to the server means putting your website's files on that new computer.
Now suppose you want to access a file on your local computer. What do you do?
C:/myfolder/myfile.php
Similarly, on the server, the 'C' is replaced by your website's name. So if your PHP file is in the myfolder directory on your server:
www.mywebsite.com/myfolder/myfile.php
When you request the web page www.mywebsite.com, the request goes to the server, the server processes the PHP scripts, and it sends back only the HTML components that the browser understands.
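To make that concrete, a hypothetical myfile.php might look like this; the browser never sees the PHP code, only the HTML the server sends back after running it:

<?php
// myfile.php -- hypothetical example. The server runs this code and
// sends only the resulting HTML to the browser.
$visitor = htmlspecialchars($_GET['name'] ?? 'guest');
echo "<html><body>";
echo "<p>Hello, {$visitor}! This page was generated on the server.</p>";
echo "</body></html>";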
I have an application that gets data from a SQL Server database using a PHP script stored on an online server.
I get the data in my iOS app with NSURLConnection: I connect to the script, and the script executes the queries on the server.
My question is: can I store this script on the iPhone or iPad and forget about the online server?
Also, can I execute SQL Server queries without PHP scripts and POST methods?
If your database is static (no updates), then you can use iOS's native Core Data to manage the database locally, and you don't need an internet connection at all.
If your database needs to be updated after you release your application, then you will need a server.
In either case, the programming language on iOS will be Objective-C and C++ (instead of Python).
I searched for a long time for a library that does what I want. I found this library:
https://github.com/martinrybak/SQLClient
Main blog of the project:
http://objcsharp.wordpress.com/2013/10/15/an-open-source-sql-server-library-for-ios/
If you know of other libraries, you can post them in this thread to help people with the same problem.
Regards.
I'm in a situation where I edit the PHP files in the "www" folder of a server from local machines. In my case, all the users across the sites who share the intranet should be able to access the PHP files. Essentially, I'm counting how many times users run an application I developed. My idea is to create a database in WAMP on the server, plus a PHP file that performs the increment. Whenever someone uses my application, it will request that PHP file on the server, and that counts the number of times my application has been used. I'm new to web development and I'm using Apache 2.4. Please let me know if you have a better approach. Thanks in advance.
Say the file is filename.php:
<?php
// filename.php -- increments the hit count once per request.
// Note: the old mysql_* functions are removed in PHP 7+, so the
// mysqli_* equivalents are used here.
$con = mysqli_connect("hostname", "username", "password", "user_db");
$sql = "UPDATE usage_count SET count = count + 1";
mysqli_query($con, $sql);
This code increases the hit count (the count column in the database) each time the file is fetched from a client computer.
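Any HTTP request to that URL triggers the increment, so the application only needs to fetch it once per launch. For example (the host and path are placeholders):

<?php
// Fire-and-forget hit: fetching the URL is enough to bump the counter.
// Host and path are placeholders for wherever filename.php is served.
file_get_contents('http://intranet-server/counter/filename.php');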
I've built an application that interacts with a web camera via FTP, but upon completing it I've hit a speed issue. The application is structured as follows:
Web camera -> ProFTPD/MySQL -> PHP
The web camera FTPs images to the ProFTPD server, which is managed via MySQL/PHP. The PHP acts as a client for users, and it in turn pulls the latest images from the FTP server.
This works, but it's really slow. The problem is on the ProFTPD -> PHP side. Using the standard PHP FTP library, it takes around 4 seconds to connect to the FTP server, do a directory listing, and output the file contents.
The speed issue is due to the authentication part of the process. From what I've seen, there's no way of caching/storing/serializing the FTP connection, meaning every request to the server has to open a new connection.
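For reference, the per-request sequence looks roughly like this (host, credentials, and paths are placeholders); the connect and login steps are where most of the time goes:

<?php
// Sketch of the per-request FTP round trip described above; host,
// credentials, and the /images path are placeholders.
$conn = ftp_connect('ftp.example.com');      // new TCP connection every time
ftp_login($conn, 'camera_user', 'secret');   // authentication: the slow part
$files = ftp_nlist($conn, '/images');        // directory listing
header('Content-Type: image/jpeg');
$out = fopen('php://output', 'w');
// Stream the last-listed file (assuming names sort chronologically).
ftp_fget($conn, $out, end($files), FTP_BINARY);
ftp_close($conn);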
These are the thoughts I've had so far:
1) Have a PHP script run in a while loop with a permanent FTP connection open, but I know PHP isn't designed to be run in this way.
2) Create a daemon running node.js / java etc which is able to keep a permanent ftp connection open, and have PHP interact with that. What I'm worried about with this approach is the extra maintenance involved in writing more code which duplicates the authentication code already written in PHP.
3) ???
Any help/suggestions would be greatly appreciated!
Wouldn't it be wise to schedule it as a cron job, so the FTP script runs nearly permanently or at a given short interval?
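For example, a crontab entry like this (the script path is a placeholder) would fetch the latest images every minute, so the web-facing PHP only ever reads a local copy:

* * * * * /usr/bin/php /var/www/fetch_latest_images.php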
Why go the extra mile to read the images from FTP?
If they are on the same server, just read them via PHP directly from the storage directory.
If they are not on the same server, use some mechanism to inform PHP about the latest image (for example via a text file, a GET variable, etc.) and output the HTTP/FTP image directly.
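For the same-server case, a minimal sketch; the /var/ftp/camera directory and the JPEG assumption are placeholders:

<?php
// latest_image.php -- sketch for the same-server case. The directory
// path and the .jpg extension are assumptions, not from the question.
$files = glob('/var/ftp/camera/*.jpg');
if (!$files) {
    http_response_code(404);
    exit('No images found');
}
// Pick the most recently modified file.
usort($files, function ($a, $b) {
    return filemtime($b) - filemtime($a);
});
header('Content-Type: image/jpeg');
readfile($files[0]);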
I'm in a situation where I have lots of system configurations/logs from which I have to generate a quick overview of the system that is useful for troubleshooting.
To start with, I'd like to build a kind of web interface (most probably a PHP site) that gives a rough snapshot of the system configuration using the information available in the support logs. The support logs reside on mirrored servers (call them the log servers), and the server on which I'll be hosting the site (call it the web server) will have to use SSH/SFTP to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the location of the support logs.
It will then trigger a Perl script on the log server, which will collect the relevant pieces from all the config/log files into some useful XML (there will be multiple such files).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML.
I'm very new to PHP and would like to know whether this is feasible, or whether there's any alternative/better way of doing this.
It would be great if someone could provide more details on this.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't generated on a live machine. I'm dealing with sustenance activities for a NAS storage device, and there will be plenty of support logs coming in from different end customers that folks on my team would like to look at.
Security is not a big concern here (I'm OK with using plain-text authentication to the log servers), as these servers can only be accessed through the company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
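A minimal SimpleXML sketch; report.xml and its element names are made up for illustration:

<?php
// Sketch: parse a hypothetical report.xml produced by the Perl script.
// The file name and element/attribute names are illustrative only.
$xml = simplexml_load_file('report.xml');
if ($xml === false) {
    exit('Could not parse XML');
}
foreach ($xml->disk as $disk) {
    echo "Disk {$disk['name']}: {$disk->usage}% used\n";
}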
While you could do this using something like Expect (I think there is a PHP binding too), I would recommend doing it in two separate steps:
A script, running via cron, retrieves data from the servers and stores it locally.
The PHP script reads only from the locally stored data in order to generate the reports.
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect to the servers via SSH.
You avoid the security risks of allowing your web server user to log in to other servers (a high risk in case your script gets hacked).
In case of slow or absent connectivity to the servers, a long time needed to retrieve logs, etc., your PHP script will still be able to show the data quickly, perhaps along with an error message explaining what went wrong during the latest update.
In any case, your PHP script will finish much more quickly, since it only has to retrieve data from local storage.
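For step 2, a minimal sketch of the report page; the /var/cache/logreports directory and file naming are assumptions:

<?php
// Sketch of the report page: it reads only the locally cached XML
// files fetched earlier by the cron job. Paths and naming are made up.
$cacheDir = '/var/cache/logreports';
foreach (glob($cacheDir . '/*.xml') as $file) {
    $name = htmlspecialchars(basename($file));
    $xml = simplexml_load_file($file);
    if ($xml === false) {
        echo "<p>Could not parse {$name}</p>";
        continue;
    }
    $minutes = intdiv(time() - filemtime($file), 60);
    echo "<h2>{$name}</h2>";
    echo "<p>Last updated {$minutes} minutes ago</p>";
}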
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report-generation tool; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
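Following the pattern in the manual's usage examples, a hedged sketch (host, user, password, and the remote command are placeholders; the expect extension must be installed):

<?php
// Sketch based on the PECL expect examples. Host, user, password,
// and the remote command are placeholders.
define('PASSWORD_PROMPT', 1);
ini_set('expect.timeout', 10);

$stream = fopen('expect://ssh loguser@logserver uptime', 'r');
$cases = array(
    array('password:', PASSWORD_PROMPT),
);

switch (expect_expectl($stream, $cases)) {
    case PASSWORD_PROMPT:
        fwrite($stream, "secret\n");
        while ($line = fgets($stream)) {
            echo $line;   // output from the remote command
        }
        break;
    default:
        die('Unexpected response from SSH');
}
fclose($stream);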
Alternate way: taking files from NFS/SAMBA share
Another way, which avoids using SSH, is to browse the files on the remote machines via a locally mounted share.
This is especially useful if the interesting files are already shared by a NAS, while I wouldn't recommend it if it would mean sharing the whole root filesystem or huge parts of it.