I am new to making games, and I am writing one for fun using PHP and JavaScript. I am using MySQL to store all of the info for the users of the game. The game is kind of like managing a sports team: you have a few variables (cash, assets, players, staff, etc.) and you take on all the roles of a sports manager. I know this kind of game already exists; this is just a personal challenge.
My question is, what is the best and most efficient way to get information from the database into the game?
1. Do I have to run an SQL query on every page?
2. Do I have to update my database EVERY single time something is updated?
3. Is it possible to get all of the information from the database when the user logs in, let him/her play, then only update the database with the new information when the session is killed?
Sorry for the lack of code; I'm just looking for a starting point, since it would be helpful to know this before I start writing a lot of the game.
Thanks
No, you don't necessarily have to run a MySQL query on every page load. You could store the results of such queries in a cache system such as memcached, or keep necessary data in $_SESSION.
No, you can use the same workarounds as above, but if the user disconnects you may end up with unsaved changes.
Well, you could load the data relevant to the user at login and write your own session handler that saves it back when the session is destroyed. I haven't tried this myself, but I'd say there's a very real risk of losing data if, for example, your server is restarted or PHP's garbage-collection callback is never called.
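For points 1 and 2, here is a minimal sketch of that caching approach: load the user's data into $_SESSION once, work against the session copy on subsequent pages, and still write important changes straight back to MySQL so a disconnect can't lose them. The table and column names (teams, user_id, cash, assets) are made up for the example.

```php
<?php
// Minimal sketch, assuming a hypothetical `teams` table and that the user's id
// was stored in $_SESSION['user_id'] at login.
session_start();

$pdo = new PDO('mysql:host=localhost;dbname=game', 'game_user', 'secret');

if (!isset($_SESSION['team'])) {
    // First page view in this session: fetch from the database once.
    $stmt = $pdo->prepare('SELECT cash, assets FROM teams WHERE user_id = ?');
    $stmt->execute([$_SESSION['user_id']]);
    $_SESSION['team'] = $stmt->fetch(PDO::FETCH_ASSOC);
}

// Game logic works against the cached copy...
$_SESSION['team']['cash'] -= 500; // e.g. the user signs a new player

// ...but important changes are written through immediately, so nothing is
// lost if the session dies before it is cleanly destroyed.
$stmt = $pdo->prepare('UPDATE teams SET cash = ? WHERE user_id = ?');
$stmt->execute([$_SESSION['team']['cash'], $_SESSION['user_id']]);
```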
Overall, I think you may perceive SQL queries as much heavier than they actually are. If your database structure and indexes are set up correctly, your queries and updates shouldn't take longer than about 0.01 seconds each to complete.
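For reference, that last point mostly comes down to indexing the columns you filter on. A quick sketch, again using the hypothetical teams table from above (the index statement is a one-off schema change, not something to run per request):

```php
<?php
// One-off schema change (e.g. in a migration script): index the lookup column.
$pdo = new PDO('mysql:host=localhost;dbname=game', 'game_user', 'secret');
$pdo->exec('CREATE INDEX idx_teams_user_id ON teams (user_id)');

// With the index in place this lookup reads a handful of index pages instead
// of scanning the whole table, so it stays fast even with many rows.
$stmt = $pdo->prepare('SELECT cash, assets FROM teams WHERE user_id = ?');
$stmt->execute([42]);
var_dump($stmt->fetch(PDO::FETCH_ASSOC));
```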
I made a database with a web interface (PHP) for a store, and I made a log system ("history page") so that every change made to the stock is saved. I built it with a MySQL table with 5 columns (date / user / action / product / changes), but what if I used a file "log.txt" that is appended to every time a user performs an action? Which is better / faster, and why?
For text files, unless you create a system that ensures the file is only written to by one thread at a time (like a logging framework would do), you might run into concurrency issues.
A "SQL table" (MS SQL/MySQL/Postgres etc) would be able to handle many concurrent log messages at once. It is however a bit overkill and as your table grows some queries against that table may slow down your database file size will grow too.
Given your scenario (a PHP web app with a history page), SQL is going to be preferable to a text file.
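For the MySQL route, the write itself is a single prepared INSERT. A minimal sketch, assuming a table named stock_log with the five columns described in the question:

```php
<?php
// Minimal sketch of the database-backed log write. The table name (stock_log)
// is assumed; the columns mirror the question: date / user / action / product / changes.
$pdo = new PDO('mysql:host=localhost;dbname=store', 'store_user', 'secret');

$stmt = $pdo->prepare(
    'INSERT INTO stock_log (`log_date`, `user`, `action`, `product`, `changes`)
     VALUES (NOW(), ?, ?, ?, ?)'
);
$stmt->execute(['alice', 'update', 'SKU-123', 'quantity 10 -> 7']);
```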
Both are essentially the same thing, right? Both store the data to a file and read the data back from the file.
Generally I would say if you need to find specific data within the file a database is going to make accessing that data easier and faster. If you only need to append to the data and then read, for example, the last 1000 lines, a text file is going to be easier and faster.
I would recommend using a log utility to write the log to a text file, if you decide to go that route. The log utility will deal with concurrency issues.
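If you do go the text-file route, a library such as Monolog takes care of the locking and formatting. A minimal sketch (installed with composer require monolog/monolog; the file path and channel name are arbitrary):

```php
<?php
// Minimal sketch: append each stock change to log.txt via Monolog.
require 'vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

$log = new Logger('stock');
$log->pushHandler(new StreamHandler(__DIR__ . '/log.txt', Logger::INFO));

// One line per action; the context array is serialized into the log entry.
$log->info('stock change', [
    'user'    => 'alice',
    'action'  => 'update',
    'product' => 'SKU-123',
    'changes' => 'quantity 10 -> 7',
]);
```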
The Problem
I have an app that scrapes data and presents it to the user directly, because of a lack of disk space.
This data is very volatile, it can change within minutes. Much like the stock market.
Since the data changes so often, and it varies from user to user, it is useless to save it in a database.
The Question
I need to sort the data presented to the user, compare it, link it, etc.: a lot of the functionality that a database provides. Yet I cannot save it in said database because of the above conundrums. What should I do?
What I've Thought of Doing So Far
I've tried organizing the data presented to each user using just PHP, but it seems troublesome, fragile and inefficient.
Should I create some sort of virtual table system in MySQL just for data handling? Maybe use a database engine suited to that purpose?
Maybe I could save all the data for each user and have a cron job constantly remove the old data from the database? Seems troublesome.
The Answer
I'd like some implementation ideas from folks who have encountered a similar problem. I do not care for "try all of the above and see what is faster"-type answers.
Thanks all for your help.
If the data is of the type you would store in a database, and you would benefit from being able to query it in ways that are more difficult in PHP, but you just don't want to keep it, you can still use a database. You can create temporary tables, insert the raw data, and query it to get what you want. When you close the database connection, the tables disappear. Even though the script names them the same, the database will actually create a unique set per connection, so each user will have unique data. This solution may not perform as well as you need, so do some testing to see if it's suitable for your situation.
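A minimal sketch of that approach with PDO; the table layout and column names are only illustrative, and $scrapedRows stands in for whatever your scraper returns:

```php
<?php
// Minimal sketch: per-connection temporary table for volatile scraped data.
// The table exists only for this connection and vanishes when it closes.
$pdo = new PDO('mysql:host=localhost;dbname=scraper', 'scraper_user', 'secret');

$pdo->exec(
    'CREATE TEMPORARY TABLE scraped_items (
        name  VARCHAR(255),
        price DECIMAL(10,2),
        seen  DATETIME
    )'
);

// Stand-in for the scraper's output.
$scrapedRows = [
    ['name' => 'Widget A', 'price' => 9.99],
    ['name' => 'Widget B', 'price' => 4.50],
];

$insert = $pdo->prepare('INSERT INTO scraped_items (name, price, seen) VALUES (?, ?, NOW())');
foreach ($scrapedRows as $row) {
    $insert->execute([$row['name'], $row['price']]);
}

// Now SQL does the sorting/comparing/linking, e.g. cheapest first.
$items = $pdo->query('SELECT name, price FROM scraped_items ORDER BY price ASC')
             ->fetchAll(PDO::FETCH_ASSOC);
```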
I'm trying to create a web chat app using AJAX, PHP and MySQL. I'm having trouble with the database structure. Here's what I've thought of:
A users table: contains basic user info
A chat table: contains basic columns like 'to', 'from', 'timestamp', etc.
The problem:
I think this will get pretty messy very quickly, since lots of users will be querying the same table, not to mention some security issues. I want to create a separate table for each conversation instead. Is this a good idea? What would be your preferred structure?
A separate table for each conversation would be very messy indeed. A single table, on the other hand, would get huge and degrade performance with sufficient volume and accumulation.
If you don't need to store each line of conversation in perpetuity in the database, you can simply purge the conversation from the chat lines table once it's over. You'd only need to keep it there if you wanted to search lines in past conversations. (Use other approaches for keeping chat statistics etc.)
You could archive a concatenated/serialized version of the conversation, i.e. the whole lot in one chunk, into a file in the filesystem, or into a separate table with the relevant metadata (users, length, duration, etc.). Then simply reload it whenever an old conversation becomes active again.
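A rough sketch of that archive-and-purge step; the table and column names (chat_lines, chat_archive, conversation_id) are only illustrative:

```php
<?php
// Rough sketch: archive a finished conversation as one serialized chunk,
// then purge its lines from the live chat table.
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'chat_user', 'secret');
$conversationId = 123; // the conversation that just ended

// 1. Pull the whole conversation.
$stmt = $pdo->prepare(
    'SELECT `from`, `to`, `message`, `timestamp`
       FROM chat_lines WHERE conversation_id = ? ORDER BY `timestamp`'
);
$stmt->execute([$conversationId]);
$lines = $stmt->fetchAll(PDO::FETCH_ASSOC);

// 2. Store it as a single chunk plus some metadata.
$archive = $pdo->prepare(
    'INSERT INTO chat_archive (conversation_id, line_count, body) VALUES (?, ?, ?)'
);
$archive->execute([$conversationId, count($lines), json_encode($lines)]);

// 3. Purge the live table so it stays small.
$pdo->prepare('DELETE FROM chat_lines WHERE conversation_id = ?')
    ->execute([$conversationId]);
```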
If you do want to distribute your per-table load, you could e.g. track typical user connections and then generate an adequate amount of group-dedicated tables, or use any other user aggregation algorithm that works. But if you do purge the chat lines table periodically, it'll take a huge volume of usage before database performance will become an issue.
I'm trying to create a web application in PHP, using Symfony.
It's sort of a community creator. My question is: should I make only one database for the whole application, or should I create a new database every time a user creates a "community"? What are the best practices in relation to this? Thanks for all your responses.
Creating a new database every time a user creates a new community is a bad idea in my opinion, for more than a few reasons. I'll give you the most important ones:
It's very unsafe. It means that the database user your application connects with needs a higher level of privileges (beyond the standard CRUD operations) in order to create databases. That is considered bad security practice, as a security flaw in your application could open all of those databases up to all kinds of attacks.
It's hard to maintain. I'm not sure how many communities you expect to get, but imagine that an update of the code requires you to update a given table in all of those databases.
For each database another connection is used, which means that another socket connection is in use. When using persistent connections, this means a lot of database connections may be open at the same time (depending on the scale of the application). This could cause bottlenecks and thus performance issues.
The first and second reasons are by far the most important.
If, however, you feel the need to separate each community from another in some way, I suggest using tables with a prefix for every community. In that case the first reason is mitigated somewhat, as the database user only has to have rights to a single database, and the third reason no longer applies.
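A minimal sketch of that prefix idea; the naming scheme (community_<id>_posts) and the table layout are only an example:

```php
<?php
// Minimal sketch: per-community table prefixes inside a single database.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');

function communityTable(int $communityId, string $table): string
{
    // Build the prefixed identifier from trusted parts only; never concatenate
    // raw user input into a table name.
    return sprintf('community_%d_%s', $communityId, $table);
}

// Run once when a community is created.
$posts = communityTable(42, 'posts');
$pdo->exec("CREATE TABLE IF NOT EXISTS `$posts` (
    id      INT AUTO_INCREMENT PRIMARY KEY,
    author  INT NOT NULL,
    body    TEXT,
    created DATETIME
)");

// Regular queries then target that community's own tables.
$stmt = $pdo->prepare("SELECT author, body FROM `$posts` ORDER BY created DESC LIMIT 20");
$stmt->execute();
```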
I have a system where users can 'like' content. There will likely be many hundreds of these likes going on at once. I'd like it to be AJAX driven so you get an immediate response.
At the moment I have a MySQL table of likes which contains the post_id and user_id, and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way, from storing any of this information in mongodb to take the load off of mysql?
At the moment, I click like and two MySQL queries run: an INSERT into likes and an UPDATE on posts. If I'm in a large-scale environment with a heavy read/write situation, what would be the best way to go?
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. I believe MongoDB's real advantage is its schemaless, JSON-like document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at using Redis to store counters like this (the single INCR command to increase a numeric value is the icing on the cake). In my experience it can handle writes much more efficiently than any other database.
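A minimal sketch of the Redis counter using the phpredis extension (Predis would look almost the same); the key names are made up for the example:

```php
<?php
// Minimal sketch: count likes in Redis, keyed per post.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$postId = 42;
$userId = 7;

// Track who liked the post in a set, which also prevents double-counting...
$isNew = $redis->sAdd("post:$postId:likers", $userId);

// ...and only bump the counter if this user hadn't liked it before.
if ($isNew) {
    $redis->incr("post:$postId:likes");
}

echo $redis->get("post:$postId:likes");

// The MySQL rows can still be written asynchronously (e.g. by a queue worker)
// if you need them as the durable source of truth.
```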