Connect to a MySQL database from C++ through PHP

I have a dedicated Linux server with my multiplayer game (C++/sockets) on it, and I recently acquired a MySQL server "service" for my web pages (I also have a little multiplayer game written in PHP) so I don't need to think about backups etc.
The MySQL service works wonderfully well for my web pages, but I can't connect from the game server to the database because they are not on the same internal network.
The only connections allowed must come from web pages hosted on the provider's servers.
I have thought of sending requests from the server (the C++ game server) to a PHP web page that connects to the database and sends back the answer.
This will work for any simple request (like how many HP player X has), but iterative requests (e.g. "SELECT id FROM player") that return many rows are more complicated.
Is there a standard way to circumvent this? Is there already proven PHP/C++ code out there somewhere? Any other way (e.g. some sort of fake port forwarding)?
Or should I bite the bullet and start fiddling with automated Linux backups?
In this case, speed is not an issue; data integrity and reliability are.

What you are describing is basically a proxy for MySQL requests, with a C++ client and PHP as the server. While this is entirely possible, and there are some good solutions (PHP can work with raw sockets, so writing a proxy shouldn't be much of a problem), you have one more limitation: you don't have root access, so you can't create any sockets. That leaves you with only the HTTP protocol.
Executing remote queries over HTTP is possible: for example, http://www.phpclasses.org/package/4000-PHP-Execute-remote-MySQL-queries-across-the-Web.html is an example PHP client and PHP server. But there will always be severe limitations: each request will be a separate MySQL connection, so some MySQL features, such as temporary tables, will be hard to use properly.
The more complex the queries you want to execute, the more complex your application will get: prepared statements, MySQL escaping, getting the last insert id, etc. all need different transfer formats, so they all have to be implemented on both your C++ client and the PHP server.
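To illustrate, here is a minimal sketch of the server side of such a bridge, showing why even "last insert id" needs its own response shape. The query-in-POST layout, the JSON responses, and the 'type' fields are assumptions for illustration, not a published protocol:

// minimal HTTP->MySQL bridge sketch: detect the statement kind and
// shape the JSON response accordingly (credentials are placeholders)
mysql_connect('localhost', 'user', 'pass') or die(json_encode(array('error' => 'connect failed')));
mysql_select_db('game');
$result = mysql_query($_POST['query']);
if ($result === false) {
    echo json_encode(array('error' => mysql_error()));
} elseif ($result === true) {
    // INSERT/UPDATE/DELETE return no rows, but the client may need the insert id
    echo json_encode(array('type' => 'write', 'insert_id' => mysql_insert_id()));
} else {
    // SELECT and friends: send all rows back
    $rows = array();
    while ($row = mysql_fetch_assoc($result)) {
        $rows[] = $row;
    }
    echo json_encode(array('type' => 'rows', 'rows' => $rows));
}

The C++ client then has to parse each of these response shapes separately, which is exactly the per-feature work described above.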
The chance of errors will increase too: you will get additional HTTP errors, etc. One of the reasons MySQL in PHP is so popular is that it is reliable: thousands of developers and applications use it every day, and it can be considered stable. Yet you can still find bug reports about the MySQL driver for PHP, so imagine how many bugs there will be in some rarely used code. Moreover, using rare programs is almost always a bad idea: if you hit some obscure error one day, there will be nowhere to look for a solution.
On the other hand, there are a lot of programs, scripts, and guides for automatic MySQL backups, and most likely your web (PHP) host already does them in one of the well-known ways that you could reproduce yourself.

If your concern is backups, you should definitely go for automated backups; it's very simple. Say you want to back up your MySQL database every day at midnight; use this cron job:
0 0 * * * mysqldump dbname -u username -ppassword > /path/to/store/backup
You can then just download this backup if you want to store it offsite.

I'm not sure if I understand the question, but I'll give it a shot. You would generate an array of your SQL commands client-side using whatever language you like (make sure to handle escaping). Take that array, encode it using serialize(), JSON, etc., and pass the resulting string via POST to the PHP API. On the PHP side:
mysql_connect($server, $username, $pass) or die("Cannot connect.");
mysql_select_db($db) or die("Cannot select DB.");
$unencoded = unserialize($_POST['input']); // match whatever encoding you used
$cnt = 0;
foreach ($unencoded as $query) {
    mysql_query($query) or die(mysql_error());
    $cnt++;
}
echo "$cnt Queries"; // a top-level return outputs nothing; echo sends the count back
This is going to be dangerous, though, as the above expects fully escaped strings. I would definitely also include some sort of hash validation to avoid exploits. You could also use SOAP requests, but I highly doubt the extension is enabled. Maybe an alternative to the simple array structure would be something a little more complex, which would allow you to escape the user-generated portions on the server... something like this, maybe?
$queries['SELECT'][$cnt] = array( 'cols'=>"*", 'from'=>'table', 'where'=>'condition' );
Then loop through on the PHP side, using mysql_real_escape_string() on the values of the array.
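For illustration, a sketch of that server-side loop; the 'col'/'value' split in the WHERE part is a hypothetical refinement of the array above, so that only the user-supplied value gets escaped:

// build each query from the structured array, escaping only the values;
// 'cols', 'from' and the 'col' key must never contain user input, and an
// open mysql_connect() connection is assumed
$queries['SELECT'][] = array(
    'cols'  => 'id, hp',
    'from'  => 'player',
    'where' => array('col' => 'name', 'value' => $_POST['player'])
);
$rows = array();
foreach ($queries['SELECT'] as $q) {
    $value = mysql_real_escape_string($q['where']['value']); // user data
    $sql = "SELECT {$q['cols']} FROM {$q['from']} WHERE {$q['where']['col']} = '$value'";
    $result = mysql_query($sql) or die(mysql_error());
    while ($row = mysql_fetch_assoc($result)) {
        $rows[] = $row;
    }
}
echo serialize($rows); // send the result set back in the same encoding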

Related

Speed of MySQL connection vs PHP file access

Assume that I have a simple VPS setup with LAMP (so with PHP and MySQL on the same server and no other strings attached). And assume that I want to make a self-written ajax chat client on my website.
Obviously, each participant in the conversation would have to listen constantly for new things being said. Since it is very well possible that two or more participants say something in the very same second (and refreshing more than once per second would likely cause insane system load), it seems to me that I would need to store for each participant a list of things that happened since the last refresh.
Which would be the "best" way to do this (in terms of system load)? In the following, an "event" is just any participant saying anything in the chat. Clearly, this could be applied to more general cases as well.
(A) Use MySQL, connecting to the db every second and asking for events WHERE participant_id = $participant_id? (and then deleting all of these so they're only fetched once)
(B) Create a file $participant_id.php and append the events to it (in PHP format so that it can be included), then empty or delete the file at the next refresh?
(C) Does anyone know any other useful alternatives?
An alternative would be to use a socket connection. Each person connected to the socket server daemon would be able to send a message to the daemon; the daemon would then send the message out to all or a subset of subscribers, which makes chat instantaneous with no need to save the data at all.
A good way to create socket connections from the client is Socket.IO. See below.
http://socket.io/
A good technology for creating a socket server daemon is Node.js, a server-side, event-driven, JavaScript-based library. It is very efficient for things like this. See below.
http://nodejs.org/
With both A and B you are still effectively polling. You will either poll MySQL, which really isn't too bad, or you can get notified via select() of a file change, but you will still need to parse the file to see whether the new data is the right stuff.
For conceptual and support ease of use, it is really hard to beat a database, as you won't have to worry about locking semantics. Debugging and message tracking are clean in this structure.
I would, however, recommend you investigate PHP's msg_send() and msg_receive() functions to put this data into an underlying message queue. Your problem looks like a message queueing problem that should be solved by that mechanism.
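A minimal sketch of that approach using PHP's System V IPC functions (requires the sysvmsg extension; the queue key and the message type are arbitrary choices for illustration):

$queue = msg_get_queue(0x1234); // any agreed-upon integer key

// sender: push a chat event onto the queue (serialized automatically)
msg_send($queue, 1, array('from' => 'alice', 'text' => 'hello'));

// receiver: drain pending events without blocking, so the page returns fast
while (msg_receive($queue, 1, $type, 4096, $msg, true, MSG_IPC_NOWAIT)) {
    echo $msg['from'] . ': ' . $msg['text'] . "\n";
}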
Does anyone know any other useful alternatives?
If you are looking for simple solutions in PHP, I can offer two:
Cache
Keep MySQL to store the data, but install APC (the simplest and fastest solution for small servers and applications) or Memcached (better when using several servers). On each read request you check APC/Memcached for your data and query MySQL only if the cache entry is missing or stale. On each write request you insert the data into MySQL and update the cache.
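As a sketch of that read path with APC (the key name, the 60-second TTL, and the player table layout are assumptions):

// cache-aside read: check APC first, fall back to MySQL on a miss
function get_player_hp($id) {
    $key = 'player_hp_' . (int)$id;
    $hp = apc_fetch($key, $success);
    if (!$success) {
        // cache miss: query MySQL and refill the cache
        $res = mysql_query('SELECT hp FROM player WHERE id = ' . (int)$id);
        $row = mysql_fetch_assoc($res);
        $hp = $row['hp'];
        apc_store($key, $hp, 60);
    }
    return $hp;
}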
Other DB
In this case you replace MySQL with a memory-based DB (for example MongoDB), and you don't have to worry about hard disk usage.

Reduce MySQL query amount with jQuery and PHP

I am building a "multiplayer world" with jQuery and PHP. Here is a bit how it works:
Users' characters' positions are taken from a database, and each user is plotted accordingly (the position values are CSS values: left and top)
The user is able to move about using the arrow keys on the keyboard, making their character move using jQuery animations. While this is happening (on each arrow press), the user's position values are inserted into the database and updated.
In order to make this "global" (so users see each other), as you could say, the values need to be updated at once for every user using AJAX
The problem I am having is that I need to continuously call a JavaScript function I wrote, which connects to the MySQL server and grabs values from a database table. This function needs to be called constantly via setInterval(thisFunction, 1000); however, my host just suspended me for overloading the server's resources, and I think this was because of all my MySQL queries. Even after grabbing values from my database repeatedly, I had to insert values every few seconds as well, so I can imagine that would cause a crash over time if enough clients were to log in. How can I reduce the number of queries I am using? Is there another way to do what I need to do? Thank you.
This is essentially the same thing as a chat system in terms of resource usage. Try a search and you'll find many different solutions, including concepts like long polling and memcached. For example, check this thread: Scaling a chat app - short polling vs. long polling (AJAX, PHP)
You should look into long polling - http://en.wikipedia.org/wiki/Push_technology. This method allows you to establish a connection with your server and then update it only when you need to. By the sounds of it, though, you have a pretty intensive thing going on if you want to update on every change, so you may want to look into another way of storing this data. If you're wondering how big companies do it, they tend to have massive numbers of servers to handle the load, but they also use techniques similar to long polling.
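For concreteness, a minimal long-polling endpoint might look like the sketch below; the positions table, the last_id parameter, and the 25-second window are invented for illustration:

// hold the request open until something new appears, then return at once
mysql_connect('localhost', 'user', 'pass') or die('connect failed');
mysql_select_db('game');
set_time_limit(35);
$last_id = (int)$_GET['last_id'];
for ($i = 0; $i < 25; $i++) {
    $res = mysql_query("SELECT id, x, y FROM positions WHERE id > $last_id");
    if (mysql_num_rows($res) > 0) {
        $rows = array();
        while ($row = mysql_fetch_assoc($res)) {
            $rows[] = $row;
        }
        echo json_encode($rows);
        exit; // the client processes the update and reconnects immediately
    }
    sleep(1); // one query per second instead of one HTTP request per second
}
echo json_encode(array()); // timed out with no news; the client just reconnects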
You could store all the positions in memory using memcached (see http://php.net/manual/fr/book.memcached.php) and save them all at once into the database every few seconds (if needed).
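A sketch of that idea with the Memcached extension; the key names, column names, and the flush interval are assumptions:

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// on every arrow-key press: update the cache only, no SQL at all
$mc->set('pos_' . (int)$player_id, array('left' => $x, 'top' => $y));

// in a cron job every few seconds: persist all positions to MySQL in one pass
foreach ($player_ids as $id) {
    $pos = $mc->get('pos_' . (int)$id);
    if ($pos !== false) {
        mysql_query(sprintf(
            'UPDATE players SET left_px = %d, top_px = %d WHERE id = %d',
            $pos['left'], $pos['top'], $id));
    }
}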
You could use web sockets to overcome this problem. Check out this nettuts tutorial.
There is another way: emulate or use actual sockets. Instead of constantly pulling the data (refreshing to check if there are new records), you can push the data over WebSockets, which work in Chrome at the moment (at least to my knowledge; I didn't try it in FF4), or you can use Node.js for leaner long polling. That way, the communication between players will be bi-directional without needing MySQL to store positions.
Check out Tornado.
From their site:
Tornado is an open source version of the scalable, non-blocking web server and tools that power FriendFeed. The FriendFeed application is written using a web framework that looks a bit like web.py or Google's webapp, but with additional tools and optimizations to take advantage of the underlying non-blocking infrastructure.
The framework is distinct from most mainstream web server frameworks (and certainly most Python frameworks) because it is non-blocking and reasonably fast. Because it is non-blocking and uses epoll, it can handle thousands of simultaneous standing connections, which means it is ideal for real-time web services. We built the web server specifically to handle FriendFeed's real-time features — every active user of FriendFeed maintains an open connection to the FriendFeed servers. (For more information on scaling servers to support thousands of clients, see The C10K problem.)

Is it safe to use a MySQL database for recording positions on a multiplayer game?

Would it be safe to use a MySQL database to record positions of players on the screen?
Then every second, Flash retrieves the new position data from the database and sets the players' positions on the map to the new positions?
I'm not sure how slow this would be.
I know PHP.
I know SQL.
I am not very experienced in ActionScript, but I can do basic things like set positions of objects.
I do not know how to retrieve information from a database via Flash.
I do not know how to make Flash send out queries.
Do you think you could give me a bit of help?
It would be safe to use MySQL.
But I strongly recommend against using PHP + MySQL as a game server, or your server will tend to lock up from the influx of requests. The HTTP protocol was not designed for this.
It might take a bit of time, but I would learn a suitable programming language (something like Java or C#) to create a basic server. Then you can store user information in RAM instead of constantly hitting the database. You could also have the server update the database every n minutes, in case the server is shut down and needs to be started back up with the same data.
Look up 'Flash Remoting' for Flash<->server communications. An open-source server-side handler for that is AMFPHP. Flash would send out AMF messages, AMFPHP translates them back into normal PHP data structures, and then your PHP code handles interfacing with the database.
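To give a flavor of it, an AMFPHP service is just a PHP class whose public methods Flash can call remotely; the class name and the positions table below are invented for illustration:

// hypothetical AMFPHP-style service class; AMFPHP exposes public methods
// of such classes to Flash and serializes the return value back to AMF
class PositionService {
    public function getPositions() {
        mysql_connect('localhost', 'user', 'pass');
        mysql_select_db('game');
        $result = mysql_query('SELECT id, x, y FROM positions');
        $rows = array();
        while ($row = mysql_fetch_assoc($result)) {
            $rows[] = $row;
        }
        return $rows; // Flash receives this as an array of objects
    }
}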
You would have PHP act as a controller between your DB and Flash: Flash sends/receives info from PHP, and PHP queries the DB.
Yeah, MySQL is pretty secure, as long as you strip all tags and guard against SQL injection in the strings you receive. And it should be pretty much instant.
However, hundreds of MySQL requests every second will be a lot of bandwidth, although I can't think of any alternatives.

PHP MySQL Website to External SQL System Integration

I am after opinions from some expert web developers as to the best solution to my problem.
THE SETUP
In my job, I look after an online shop, which is based upon osCommerce, with many additional features and tweaks. This is on Linux shared hosting and uses PHP and MySQL as its main technologies.
We are going through the process of upgrading our stock and order system, which is Microsoft SQL based and runs on a data server on our network (a Dell Server PC).
THE PROBLEM
I would like to achieve better integration between the stock/order system and our website, such as to get up-to-date stock information, check prices, etc. With the eventual aim of getting customer data linked as well.
MY POSSIBLE SOLUTION
I currently have XAMPP running on our SBS server for development, which I use to test new code, before uploading to the live site.
I have written a PHP file on this server, which connects to the SQL server and runs various SQL queries, taking in parameters in the $_GET array, retrieving the result as an associative array, JSON encoding the result and echoing the encoded string.
The website just uses something like this to get and use the result:
$result = file_get_contents('http://SBS-SERVER-IP/getinfo.php?partenquiry=' . $cleanStringPartNumber);
if ($result) $stock_info = json_decode($result, true); // decode straight to an associative array
I thought this would be a decent solution, as no logins for the SQL server, nor even the fact that it is SQL, are exposed to the website; so if the website were compromised, it shouldn't compromise our system.
I will be making sure that the login for the SQL server only has access to SELECT data as I don't want to update/insert/delete anything as it may cause problems with the stock/order system.
Also, I was thinking of caching the results somewhere, so our stock/order system doesn't suffer performance issues.
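One simple way to do that caching, sketched below with a file cache; the /tmp path, the five-minute TTL, and the helper name are assumptions:

// hypothetical file cache in front of the bridge so the stock system is
// hit at most once every five minutes per part number
function cached_stock_info($partNumber) {
    $cacheFile = '/tmp/stock_' . md5($partNumber) . '.json';
    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > 300) {
        $json = file_get_contents(
            'http://SBS-SERVER-IP/getinfo.php?partenquiry=' . urlencode($partNumber));
        if ($json !== false) {
            file_put_contents($cacheFile, $json);
        }
    }
    $json = @file_get_contents($cacheFile);
    return $json ? json_decode($json, true) : null;
}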
I am sure there are many ways of passing the data between the 2 systems, but I want to be sure I am using a solution that is secure, using the most efficient and industry-standard methods of carrying this out.
I see technologies such as cURL, SOAP, XML, JSON, etc., and wonder if any of these are ideal.
What are your thoughts?
I would use HTTPS if possible to have secure communication between the two servers.
Which technology you use is up to you and the framework you are willing to learn.
JSON is one of the easiest ways to send data from A to B.
SOAP is practically XML, but you don't have to bother with the XML itself.
Using SOAP you can send and store objects.
Using PHP's serialize() and unserialize() you can transform objects too, send them, and afterwards store the contents of the object in the database.
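For example, a minimal sketch of shipping an object between the two servers with serialize()/unserialize(); the Order class and the idea of POSTing the payload are illustrative, and the class definition must exist on both sides:

// must be defined identically on sender and receiver
class Order {
    public $part;
    public $qty;
}

// sender
$order = new Order();
$order->part = 'AB-123';
$order->qty = 2;
$payload = serialize($order); // POST this string to the other server

// receiver (only unserialize input from a trusted source)
$order = unserialize($payload);
echo $order->part; // "AB-123"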
It's more about preference and development speed.
Learning a new framework takes some time at first but can make you more productive afterwards.

Convincing an IT Manager to allow SQL Server instead of Access

An IT manager is not allowing the use of SQL Server with an ASP.NET website being developed. The current setup being replaced is a PHP site connecting to a Microsoft Access database. I have a few reasons of my own as to why SQL Server should be used, but I would like as many strong arguments as possible (student vs. IT manager). Does anyone have any strong arguments for why SQL Server should be used? Particularly posed towards an IT manager who has stated, "this is the way we have been doing it, and it has been working."
Thanks in advance!
UPDATE
In the interest of 'unloading' this question... If you would recommend keeping Access, when and why?
Do a load test on your finished product and prove that Access isn't meant for powering websites.
Write your code so that you can swap out the database back end easily. Then, when Access fails, migrate your data to a real DB, or MySQL if you have to.
Here are some Microsoft Web server stress tools
For the record, it is possible to send mostly SQL commands to the database and not keep an active connection open, thereby allowing far more than 6 or 7 connections at once; but the fact is that Access just isn't meant to do this. So the "it works fine" point is like saying it is fine to clean your floor with sticky tape: it works, but it isn't the right tool for the job.
UPDATED ANSWER TO UPDATED QUESTION:
Really, the key here is the separation of data access in your code. You should be able to use more or less the same database structure in any number of DBMSs. Things can get complicated, but a map of tables should be universal. Then, should Access not work out, you can switch to a different database.
Access CAN be used on kinda-high-traffic sites. With the SQL-statement-only routines, I was able to build an e-commerce site that did a couple million a year in sales and had 60K visitors a month. It is possible, but maybe not ideal. Those aren't big numbers, but they are the biggest for any site I have been a part of.
Keep Access if the IT manager is too busy to maintain another server, or unwilling to spend time configuring one. Ultimately, guessing does nothing and testing tells you everything you need to know. Test and make decisions based on the results.
Here's a document from Microsoft that might help:
Access vs. Sql Server
Another Article.
My own personal thoughts, Access has no place in an environment that could scale beyond more than two or three concurrent connections. Would you use Excel as the back end?
Your manager has stated the reason he wants to use Access. Are you responsible for designing an alternative? Do you have any reason to think you will benefit from proving your manager wrong? What is your personal upside in this conversation? Are you certain that Access won't be "good enough"? Is the redesigned site going to have heavier or different loads (i.e. more users, or a less efficient design)? I'm not sure you want to be arguing with your manager that you can't implement something that does as well as the old design.
It's going to be a lot easier to let the project fail (if you expect that will be the outcome) and rescue it with SQL Server than to get your manager to concede that you understand the situation better than he does.
Don't forget that for something as small as most Access Databases, you can use SQL Server Express Edition, which is free, so it won't cost you anything.
I found this nice quote as well:
It is not recommended to use an Access database in a production web application. For production purposes, consider connecting to a Microsoft™ SQL Server database using the SqlDataSource or ObjectDataSource controls.
http://quickstarts.asp.net/QuickStartv20/aspnet/doc/ctrlref/data/accessdatasource.aspx
Don't argue; benchmark it. Real data should trump rhetoric (in a rational world, at least! ;-)
Set up test boxes with the two alternatives and run 'em hard. (If you're exposing web services, you can use a tool such as SoapUI for automated stress testing; there are lots of other tools in this space.) Collect stats and analyze them to determine the trade-offs.
One simple reason to use SQL Server instead of a Microsoft Access database: the MS Access DB can become a bottleneck if the DB is used heavily by a lot of users.
Licensing, for one. I doubt he wants to have hundreds of Office licenses (one for each end user that connects to the site). SQL Server has licensing that allows multiple connections at the same time without per-connection licenses.
Not to mention scalability and reliability issues: SQL Server is designed to be used and administered in a 24/7 environment; Access is not.
SQL can scale to squillions of simultaneous connections, Access cannot.
SQL can backup while operating, Access cannot.
SQL is designed as a highly robust data repository, Access is not designed with the same requirements in mind.
Access doesn't deal with multiple users very well (at all?). This means that if you have more than one person trying to access, or especially update, your site, it's very likely to die or at best be very slow.
There's much better tooling around SQL Server (linq to sql or entity framework or any number of ORMs).
SQL express is a much better choice than access for a web site backend and it's free.
Consider the option that maybe he is right: if it is working fine with Access, just leave it like this. If there are scalability problems in the future (the site being used by more than one user simultaneously), then it is his problem, not yours.
Also consider SQLite; it may be better than Access.
Just grab a test suite (or just throw one together):
compare the time taken to create a DB with 1,000,000 entries.
search for an entry in the DB.
vacuum the DB.
delete the DB.
do a couple of the operations you think will be run most often against the DB, a couple of times.
And do it in front of him to compare (write a script). My guess is that either your IT manager is joking, or the site that you are working on is non-critical and he doesn't want to allocate resources (including you).
MS Access is for one desk, one user! I spent a year on a previous project detaching an application (which had grown to enterprise size in terms of users) from Access because of its strange locking behavior and awful performance. SQL Server Express Edition is a good starting point, as echoed in previous posts.
