I am after opinions from some expert web developers as to the best solution to my problem.
THE SETUP
In my job, I look after an online shop based on osCommerce, with many additional features and tweaks. It runs on Linux shared hosting and uses PHP and MySQL as its main technologies.
We are going through the process of upgrading our stock and order system, which is Microsoft SQL Server based and runs on a data server on our network (a Dell server PC).
THE PROBLEM
I would like to achieve better integration between the stock/order system and our website, for example getting up-to-date stock information and checking prices, with the eventual aim of linking customer data as well.
MY POSSIBLE SOLUTION
I currently have XAMPP running on our SBS server for development, which I use to test new code, before uploading to the live site.
I have written a PHP file on this server which connects to the SQL Server instance and runs various queries. It takes parameters from the $_GET array, retrieves the result as an associative array, JSON-encodes it and echoes the encoded string.
The website just uses something like this to get and use the result:
$result = file_get_contents('http://SBS-SERVER-IP/getinfo.php?partenquiry=' . urlencode($cleanStringPartNumber));
if ($result !== false) $stock_info = json_decode($result, true); // true decodes straight to an associative array
I thought this would be a decent solution, as the SQL login details (and even the fact that it is SQL Server at all) are not exposed to the website, so if the website were compromised, it shouldn't compromise our internal system.
I will be making sure that the SQL Server login only has SELECT access, as I don't want to update/insert/delete anything that might cause problems with the stock/order system.
Also, I was thinking of caching the results somewhere, so our stock/order system doesn't suffer performance issues.
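Roughly, the bridge script looks something like the sketch below (simplified, with the table/column names, DSN and credentials changed for illustration, plus a sketch of the caching idea; it assumes the pdo_sqlsrv driver):

```php
<?php
// getinfo.php - simplified sketch of the read-only bridge endpoint.
// Table/column names (stock, part_no, qty, price) and the DSN are
// illustrative; adjust for your schema and driver.

// Build a safe cache file path for a part number.
function cache_path(string $part): string
{
    // Hash the part number so unusual characters can't escape the cache dir.
    return sys_get_temp_dir() . '/stock_' . md5($part) . '.json';
}

// Return cached JSON if it is fresher than $ttl seconds, else null.
function cached(string $part, int $ttl = 300): ?string
{
    $file = cache_path($part);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);
    }
    return null;
}

function fetch_stock(PDO $db, string $part): array
{
    // A prepared statement avoids SQL injection; the SQL login is
    // SELECT-only, as planned.
    $stmt = $db->prepare('SELECT part_no, qty, price FROM stock WHERE part_no = ?');
    $stmt->execute([$part]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

if (isset($_GET['partenquiry'])) {
    $part = (string) $_GET['partenquiry'];
    header('Content-Type: application/json');

    if (($json = cached($part)) === null) {
        // The 'sqlsrv:' DSN assumes the pdo_sqlsrv driver is installed.
        $db = new PDO('sqlsrv:Server=SQL-SERVER;Database=stockdb', 'readonly_user', 'secret');
        $json = json_encode(fetch_stock($db, $part));
        file_put_contents(cache_path($part), $json); // refresh the cache
    }
    echo $json;
}
```

The file cache means repeated lookups for the same part number within a few minutes never touch the stock/order system at all.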
I am sure there are many ways of passing the data between the two systems, but I want to be sure I am using a solution that is secure and uses efficient, industry-standard methods.
I see technologies such as cURL, SOAP, XML and JSON mentioned, and wonder if any of these are ideal.
What are your thoughts?
I would use HTTPS if possible to have secure communication between the two servers.
Which technology you use is up to you and the framework you are willing to learn.
JSON is one of the easiest ways to send data from A to B.
SOAP is practically XML, but you don't have to deal with the XML itself.
Using SOAP you can send and store objects.
Using PHP's serialize and unserialize you can transform objects too, send them, and afterwards store the contents of the object in the database.
It's more about preference and development speed.
Learning a new framework takes some time at first but can make you more productive afterwards.
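To make the serialize-vs-JSON point concrete, here is a small round trip of both encodings on the same array. serialize() preserves PHP types exactly but only PHP can read the result; JSON is readable from any language, which suits cross-system transfer.

```php
<?php
// Round-trip comparison of PHP serialize()/unserialize() vs JSON.
$order = ['part' => 'AB-100', 'qty' => 3, 'price' => 9.99];

// serialize() keeps PHP types exactly, but the format is PHP-only.
$phpString = serialize($order);
$fromPhp   = unserialize($phpString);

// JSON is language-neutral, which suits sending data between systems.
$jsonString = json_encode($order);
$fromJson   = json_decode($jsonString, true); // true => associative array

var_dump($fromPhp === $order);  // bool(true)
var_dump($fromJson === $order); // bool(true)
```

Either way the data survives the trip intact; the choice is mostly about who has to read it on the other end.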
Related
I have a client who has a feed of leads with Name, IP, Address, and Opt-In Time/Date, and I want them to be able to post the data to my hosted SQL database. If you are familiar with lead generation you will get what I'm trying to do.
I'd also like to know if it's possible to write a script and place it on my server so that when someone posts a CSV file to it, the script automatically posts the data in the CSV to the SQL server.
Is this possible? And are there any tutorials or reference manuals, sources, etc. I can use to accomplish this?
The answer to your question is Yes.
You can go about this two ways:
Write an API for your database which is consumed by those wishing to search/write/query your database. To do this, you can use any language that you are comfortable with. PHP, XML and Python are not interchangeable: XML is a format specification, describing what the data should look like when it's being transported between two systems, so you can use any programming language that provides XML libraries to write your code. In addition to XML, JSON has emerged as the more popular transport format, especially for mobile and web applications.
The second option is to use a service like Apigee, Google Cloud Endpoints or Mashery, which automate a lot of this process for you. Each requires its own amount of effort (with Google Cloud Endpoints perhaps requiring the most). For example, Apigee will automatically create an API for you as long as you can provide it access to your data source.
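For the CSV part of the question, a minimal sketch of such an upload script might look like this (the field names, table name and DSN are made up for illustration):

```php
<?php
// upload_leads.php - sketch of a script that accepts a posted CSV file and
// inserts each row into the database. Field names (name, ip, address,
// optin_at) and the DSN are illustrative assumptions.

// Parse CSV text into rows of associative arrays, using the first line as header.
function parse_csv(string $text): array
{
    $lines  = array_filter(array_map('trim', explode("\n", $text)), 'strlen');
    $header = str_getcsv(array_shift($lines));
    $rows   = [];
    foreach ($lines as $line) {
        $rows[] = array_combine($header, str_getcsv($line));
    }
    return $rows;
}

function insert_leads(PDO $db, array $rows): int
{
    // One prepared statement, executed per row, keeps the values escaped.
    $stmt = $db->prepare(
        'INSERT INTO leads (name, ip, address, optin_at) VALUES (?, ?, ?, ?)'
    );
    foreach ($rows as $r) {
        $stmt->execute([$r['name'], $r['ip'], $r['address'], $r['optin_at']]);
    }
    return count($rows);
}

// Only act when a file was actually uploaded (i.e. when run as a web endpoint).
if (!empty($_FILES['csv']['tmp_name'])) {
    $db   = new PDO('mysql:host=localhost;dbname=leads', 'user', 'pass');
    $rows = parse_csv(file_get_contents($_FILES['csv']['tmp_name']));
    echo insert_leads($db, $rows) . " rows inserted";
}
```

A real version would also validate each field and authenticate the caller before touching the database.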
I am creating an app for my clients to add to their webpages. However, I am hosting the database that stores the info for this app. All I want to do is run all the queries on my server and somehow pass the $var to their server.
So what I was thinking was to have my PHP page with all the MySQL credentials stored on my server, and give them a code snippet that calls that page and outputs the result, something like
require_once('192.163.163.163/config.php');
But I bet this is the least secure way to do this. I don't want to give anyone access to the central database and I am handling all the requests. Do you guys have any suggestions that I can pull the data off my db and pass it to their server in a $var without opening any doors?
If you can't afford to give away your DB credentials or other internal details of your system, but you need the clients to be able to read data from you, then the only really secure way is to set your system up as an API that the clients can call.
Don't try to combine the two systems into a single app; it will open up holes that cannot be closed.
Creating an API is fairly simple in principle. Just create a suite of normal PHP programs that accept a set of pre-defined arguments and return the data in a pre-defined format that can be easily processed by the calling program, e.g. a JSON structure.
The clients would then simply call your system via an HTTP request. They'd never need to see your code; they wouldn't need to be hosted on the same server, and they wouldn't even need to write their system in the same language as yours.
There's a lot more to it than that -- it is, of course, perfectly easy to write an insecure API as well, and you'll want to read up on how to write a good API to avoid that sort of thing -- but that's your starting point. I hope it helps.
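As a minimal starting point, such an endpoint might look like the sketch below, with one of the basic precautions alluded to above: a shared secret checked with a constant-time comparison. The key value and the canned data are placeholders; a real version would query the database.

```php
<?php
// api.php - sketch of a minimal read-only API endpoint with a shared secret.
// The key value and the returned data are illustrative placeholders.

const API_KEY = 'change-me-to-a-long-random-value';

// Constant-time check so the key can't be guessed byte by byte via timing.
function key_is_valid(?string $given): bool
{
    return $given !== null && hash_equals(API_KEY, $given);
}

// Wrap every response in a predictable envelope the caller can rely on.
function respond(bool $ok, $payload): string
{
    return json_encode($ok ? ['ok' => true, 'data' => $payload]
                           : ['ok' => false, 'error' => $payload]);
}

if (isset($_GET['key'])) {
    header('Content-Type: application/json');
    if (!key_is_valid($_GET['key'])) {
        http_response_code(403);
        echo respond(false, 'invalid key');
    } else {
        // Here you would query your database; a fixed array stands in for it.
        echo respond(true, ['items' => [['id' => 1, 'name' => 'example']]]);
    }
}
```

The client side then only ever sees JSON, never the database.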
I've decided to try out some DB connections for my Android application. However, I need some advice regarding structure.
I came to know that in order to fetch data from a DB with Android, I need to use PHP scripts. I'm a total novice in this area, and to me that sounds a bit awkward. So I figured I could create a Java server which the application connects to; among other things, this server also fetches data from the DB and returns it.
From a performance perspective, what's best: let the application fetch data from the DB itself, or connect to the server, which fetches it for you? How about security? To be clear, I will have a Java server anyhow to take care of other things.
Thanks for any input!
Like others have said, it doesn't matter whether you use PHP or Java. But you're on the right track: most developers create a web service on their HTTP server and the app talks to that, usually in the form of JSON or XML strings over HTTP. A popular choice is the REST architecture, which essentially says that the things you access on the service are resources, and you structure your service around them.
For the PHP vs. Java question, it's really up to you, so pick whichever you can set up faster and are more familiar with. I will say Java has productivity advantages in Android's case, because you can create plain old Java objects as your result models and share that code between the server and your Android client. You get this because you can use something like the Gson library, which serializes and deserializes objects to and from JSON. You can also use Google App Engine to host your Java code.
I would say this problem is largely independent of Android programming.
Assuming that you are returning the extracted data to your device in JSON format, the entire performance question comes down to how quickly Java or PHP can retrieve data from the database, convert it to JSON and send it to the client.
And as far as simple operations are concerned, the efficiency won't matter much in either case; it is just a matter of preference on the developer's part.
I have a dedicated Linux server with my multiplayer game (C++/sockets) on it, and recently I acquired a MySQL "service" for my web pages (I also have a little multiplayer game written in PHP), so I don't need to think about backups, etc.
The MySQL service works wonderfully well for my web pages, but I can't connect from the game server to the database, as they are not on the same internal network.
The only connections allowed are those coming from web pages hosted on the provider's servers.
I have thought of sending requests from the game server to a PHP page that connects to the database and sends back the answer.
This will work for simple requests (like how many HP player X has), but it becomes more complicated for queries with many results (e.g. "SELECT id FROM player").
Is there a standard way to get around this? Is there already proven PHP/C++ code out there? Any other approach (e.g. some sort of fake port forwarding)?
Or should I bite the sour apple and start fiddling with Linux automated backups?
In this case, speed is not an issue, data integrity and reliability is.
Your question basically asks for a proxy for MySQL requests, with a C++ client and PHP as the server. While it is completely possible, and there can be some good solutions (PHP can work with raw sockets, so creating a proxy shouldn't be much of a problem), you have one more limitation: you don't have root access, so you can't create any sockets. That leaves us only with the HTTP protocol.
Executing remote queries over HTTP is possible: for example, http://www.phpclasses.org/package/4000-PHP-Execute-remote-MySQL-queries-across-the-Web.html is an example PHP client and PHP server. But there will always be severe limitations: each separate request will be a separate MySQL connection, so some MySQL features (temporary tables, etc.) will be hard to use properly.
The more complex the queries you want to execute, the more complex your application will get: prepared statements, MySQL escaping, getting the last insert id, etc. all have different transfer formats, so they all have to be implemented on both your C++ client and your PHP server.
The chance of errors will increase too: you will get additional HTTP errors, for instance. One of the reasons MySQL in PHP is so popular is that it is reliable: thousands of developers/applications use it each day, and it can be considered stable. But you can still find bug reports against the MySQL driver for PHP, so imagine how many bugs there will be in some rarely used code. What's more, using rare programs is almost always a bad idea: if you get some obscure error one day, there will be nowhere to look for a solution.
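If you do go the HTTP route, one way to keep those limitations manageable is to whitelist a few named queries on the PHP side and return whole result sets in a single JSON envelope, so a many-row request like "SELECT id FROM player" costs one round trip. A rough sketch (query names, table layout and credentials are illustrative):

```php
<?php
// players.php - sketch of an HTTP relay page for the C++ game server. It runs
// one whitelisted query and returns every row in a single JSON document.
// Query names, table layout and the DSN are illustrative assumptions.

// Whitelist queries by name; the C++ client never sends raw SQL.
const QUERIES = [
    'player_ids' => 'SELECT id FROM player',
    'player_hp'  => 'SELECT hp FROM player WHERE id = ?',
];

// Wrap results so the client can tell "no rows" apart from "request failed".
function envelope(bool $ok, $payload): string
{
    return json_encode($ok ? ['ok' => true, 'rows' => $payload]
                           : ['ok' => false, 'error' => $payload]);
}

function run_named_query(PDO $db, string $name, array $params): string
{
    if (!array_key_exists($name, QUERIES)) {
        return envelope(false, 'unknown query');
    }
    $stmt = $db->prepare(QUERIES[$name]);
    $stmt->execute($params);
    // fetchAll returns the entire result set, however many rows there are.
    return envelope(true, $stmt->fetchAll(PDO::FETCH_ASSOC));
}

if (isset($_GET['q'])) {
    header('Content-Type: application/json');
    $db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');
    echo run_named_query($db, $_GET['q'], isset($_GET['id']) ? [$_GET['id']] : []);
}
```

The C++ side then only has to speak HTTP and parse one JSON document per request, which keeps the rarely-used-code surface small.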
On the other hand, there are a lot of programs/scripts/guides for automatic MySQL backups, and most likely they are already done on your web (PHP) server in one of the well-known ways that you can reproduce yourself.
If your concern is backups, you should definitely go for automated backups; it's very simple. Let's say you want to back up your MySQL database every day at 12 am; use this cron job:
0 0 * * * mysqldump dbname -u username -ppassword > /path/to/store/backup
You can then just download this backup if you want to store it off-site. (Note that putting the password on the command line makes it visible in the process list; storing the credentials in ~/.my.cnf is safer.)
I'm not sure if I understand the question, but I'll give it a shot. You would generate an array of your SQL commands client-side using whatever language you like (make sure to handle escaping). Take that array, encode it using serialize, JSON, etc., and pass the resulting string via POST to the PHP API. On the PHP side:
mysql_connect($server, $username, $pass);
mysql_select_db($db) or die("Cannot select DB.");
$unencoded = unserialize($_POST['input']); // match whatever encoding you used
$cnt = 0;
foreach ($unencoded as $query) {
    mysql_query($query) or die(mysql_error());
    $cnt++;
}
echo "$cnt Queries";
This is going to be dangerous, though, as the above expects fully escaped strings. I would definitely also include some sort of hash validation to avoid exploits. You could also use SOAP requests, but I highly doubt the extension is enabled. Maybe an alternative to the simple array structure would be something a little more complex, which would let you escape the user-generated portions... something like this maybe?
$queries['SELECT'][$cnt] = array( 'cols'=>"*", 'from'=>'table', 'where'=>'condition' );
Then loop through on the PHP side, using mysql_real_escape_string on the values of the array.
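For instance, a sketch of that loop (here the 'where' part is reshaped into column => value pairs so that exactly the user-generated values get escaped; addslashes stands in for mysql_real_escape_string, which needs a live connection):

```php
<?php
// Build SELECT statements from the structured array, escaping only the
// user-generated values. The escaper is passed in as a callable; in
// production you would pass 'mysql_real_escape_string' (or, better,
// switch to prepared statements entirely).
function build_selects(array $queries, callable $escape): array
{
    $sql = [];
    foreach ($queries['SELECT'] ?? [] as $q) {
        $conds = [];
        // Column names are assumed to come from trusted code;
        // only the values are escaped.
        foreach ($q['where'] as $col => $value) {
            $conds[] = $col . " = '" . $escape($value) . "'";
        }
        $sql[] = "SELECT {$q['cols']} FROM {$q['from']} WHERE "
               . implode(' AND ', $conds);
    }
    return $sql;
}

// Example usage with addslashes as a stand-in escaper:
$queries['SELECT'][0] = ['cols' => '*', 'from' => 'users',
                         'where' => ['name' => "O'Brien"]];
$sql = build_selects($queries, 'addslashes');
// $sql[0] is: SELECT * FROM users WHERE name = 'O\'Brien'
```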
I read some nice articles about how to connect to a remote MySQL database from Android.
So the common way of getting data seems to be using some kind of web service (an interface, in this case a PHP script) which queries the DB and renders the result in JSON (or XML) format. Then it's possible to parse this output with Android's JSONObject implementation. So far so good.
Receiving data from the database and showing it in an Android ListView was done in a matter of minutes.
But what is the best practice for writing (inserting) data into tables?
Should a web service be used here too? (Or rather a direct MySQL connection?)
What is the best method to push data to a web service (for example, to insert a new entity into a database), and which format should be used?
In this case I do not use any HTML forms or anything like that to post the parameters. So how do I post these parameters to the PHP script from within the Android app?
Of course this operation should be secure as well. Implementing a data-manipulation mechanism is a bit more risky (in order to keep the DB consistent).
I think many apps use some kind of DB to synchronize data (e.g. high scores).
So there should be a best practice for that.
I would recommend keeping anything database-specific hidden behind a web service.
If you build a dependency on MySQL into your application and later find that you need to change databases, the entire installed base has to be cut over. Think about the logistics of accomplishing that for a few minutes and you'll start to realize it's a nightmare.
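To make that concrete, a sketch of such a write endpoint (the table, field names and credentials are made up; the Android app would POST player and score as ordinary form fields, exactly what an HTML form would send):

```php
<?php
// submit_score.php - sketch of a web-service write endpoint that hides the
// database behind plain HTTP POST. Table/field names and the DSN are
// illustrative assumptions.

// Validate and normalise the incoming parameters; null means "reject".
function read_score(array $post): ?array
{
    $player = trim($post['player'] ?? '');
    $score  = $post['score'] ?? null;
    if ($player === '' || strlen($player) > 32 || !ctype_digit((string) $score)) {
        return null;
    }
    return ['player' => $player, 'score' => (int) $score];
}

function insert_score(PDO $db, array $row): void
{
    // A prepared statement keeps the insert safe from injection.
    $stmt = $db->prepare('INSERT INTO highscores (player, score) VALUES (?, ?)');
    $stmt->execute([$row['player'], $row['score']]);
}

if (($_SERVER['REQUEST_METHOD'] ?? '') === 'POST') {
    header('Content-Type: application/json');
    $row = read_score($_POST);
    if ($row === null) {
        http_response_code(400);
        echo json_encode(['ok' => false, 'error' => 'bad input']);
    } else {
        $db = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');
        insert_score($db, $row);
        echo json_encode(['ok' => true]);
    }
}
```

Because only this script ever talks to MySQL, swapping the database later means changing one file, not every installed client.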
Premiumsoft's Navicat for MySQL comes with an HTTP tunnel (a PHP script) you might be able to use. It basically provides a way to do anything to a MySQL database over HTTP.
I'd just make sure there are no licensing issues if you plan to distribute your app.