Fastest geolocation (IP to town)? [closed] - php

I am writing a small geolocation service: when a user comes to my site, I need to determine his town from his IP address. So far I have found three ways to solve this problem:
Create a connection from PHP to a MySQL DB and select the town from it.
From PHP, call a CGI script (Perl, C?) that looks up the town in a file of towns and IP addresses.
Use a service like http://ipinfodb.com/ip_location_api.php and get the town from it.
But which way would be the fastest? Minimal time, etc.?
Thanks!

Option 3.
Primarily because of just how much data you'd have to compile together manually to do either option 1 or 2.
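As a rough illustration, a hosted lookup like option 3 usually boils down to a single HTTP request from PHP. The endpoint URL, the key/ip parameters and the cityName response field below are assumptions for the sketch, not ipinfodb's documented interface, so check their docs before relying on them.

<?php
// Hypothetical sketch of option 3: ask a hosted geolocation API for the
// visitor's town. The URL, the "key"/"ip" parameters and the "cityName"
// field are assumptions -- adjust them to whatever ipinfodb documents.
$ip  = $_SERVER['REMOTE_ADDR'];
$url = 'https://api.ipinfodb.com/v3/ip-city/?key=YOUR_API_KEY'
     . '&ip=' . urlencode($ip) . '&format=json';

$response = file_get_contents($url);          // one network round trip per lookup
$data     = $response !== false ? json_decode($response, true) : null;

$town = $data['cityName'] ?? 'unknown';
echo 'Visitor appears to be in: ' . htmlspecialchars($town);

Note that this approach adds an external network round trip to every page view, so in practice you would cache the result (e.g. in the session) after the first lookup.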

There is no easy answer, because a lot depends on unknown factors such as:
Speed of your MySQL DB
Speed of your PHP implementation and the size of the file
Speed of the location_api service
In other words, there are only two ways to find out the answer:
build them all and test (a rough benchmark sketch follows below)
gather all the parameters (speeds, bandwidth, concurrent users of all systems) and calculate/guesstimate.
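If you do go the "build them all and test" route, the comparison can be as simple as timing each approach over the same IPs. The lookupViaMysql, lookupViaFile and lookupViaApi functions below are hypothetical stand-ins for the three options in the question.

<?php
// Minimal benchmark sketch: time a lookup callable over N runs and report
// the average. The three lookup* functions are hypothetical wrappers around
// the MySQL, flat-file and web-API approaches from the question.
function benchmark(callable $lookup, string $ip, int $runs = 100): float
{
    $start = microtime(true);
    for ($i = 0; $i < $runs; $i++) {
        $lookup($ip);
    }
    return (microtime(true) - $start) / $runs; // average seconds per lookup
}

// printf("MySQL lookup: %.5f s\n", benchmark('lookupViaMysql', '8.8.8.8'));
// printf("File lookup:  %.5f s\n", benchmark('lookupViaFile', '8.8.8.8'));
// printf("API lookup:   %.5f s\n", benchmark('lookupViaApi', '8.8.8.8'));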

I've used the MaxMind database for country-level lookups from PHP (there is example code for other languages too). The downloadable database is in a binary format optimised for read speed. Although I've not compared it to importing the data into MySQL and searching with SQL, I have no reason to doubt MaxMind when they say it is faster to use their API and the original data than to go through another means, such as SQL.
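For reference, a city-level lookup against MaxMind's downloadable database looks roughly like this with their current GeoIP2 PHP library (geoip2/geoip2 via Composer); the answer above used the older country-level API, and the database path below is an assumption.

<?php
// Sketch of a local MaxMind lookup using the GeoIP2 PHP library.
// Requires "composer require geoip2/geoip2" and a downloaded
// GeoLite2-City.mmdb file; the path below is an assumption.
require 'vendor/autoload.php';

use GeoIp2\Database\Reader;

$reader = new Reader('/usr/local/share/GeoIP/GeoLite2-City.mmdb');

$record = $reader->city($_SERVER['REMOTE_ADDR']); // throws if the address is not in the database
echo $record->city->name . ', ' . $record->country->isoCode;

Because the database file is read locally, there is no network round trip per request, which is the main reason this tends to beat a hosted API on raw lookup time.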

Related

Need Best database to process huge data [closed]

I have the following situation.
Every day I get 256 GB of product information from different online shops and content providers (e.g. the CNET datasource).
This information can arrive as CSV, XML and TXT files. The files are parsed and stored into MongoDB.
Later the information is transformed to be searchable and indexed into Elasticsearch.
Not all 256 GB of information is different every day: roughly 70% of the information stays the same, and a few fields like Price, Size, Name etc. change frequently.
I am processing the files using PHP.
My problems are:
Parsing the huge data.
Mapping the fields inside the DB (e.g. title is not called title in all online shops; they may name the field Short-Title or something else).
The amount of information grows by gigabytes every day. How do I store and process it all? (Maybe big data tooling, but I am not sure how to use it.)
Searching the information fast despite the huge volume.
Please suggest a suitable database for this problem.
Parsing huge data: Spark is the fastest distributed solution for your need. Even though 70% of the data is the same each day, you still have to process it just to detect the duplicates, and you can do the field mapping in the same pass (a sketch follows below).
Data store: if you are doing any aggregation here, I would recommend HBase/Impala; if each product row is important to you, use Cassandra.
For searching, nothing is faster than Lucene, so use Solr or Elasticsearch, whichever you are comfortable with; both are good.
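On the field-mapping point, one low-tech approach in PHP is to normalise each shop's column names to a single canonical schema before indexing. The shop names, field maps, index name and the use of the official elasticsearch-php client below are assumptions for illustration only.

<?php
// Sketch: map heterogeneous shop field names onto one canonical schema,
// then index the normalised document into Elasticsearch.
// Shop names, field maps and the index name are hypothetical.
require 'vendor/autoload.php';

$fieldMaps = [
    'shopA' => ['Short-Title' => 'title', 'Cost'  => 'price'],
    'shopB' => ['ProductName' => 'title', 'Price' => 'price'],
];

function normalise(array $row, array $map): array
{
    $doc = [];
    foreach ($row as $field => $value) {
        $doc[$map[$field] ?? strtolower($field)] = $value;
    }
    return $doc;
}

$client = Elasticsearch\ClientBuilder::create()->build();

$row = ['Short-Title' => 'USB cable', 'Cost' => '3.99'];   // e.g. one parsed CSV line
$client->index([
    'index' => 'products',
    'id'    => 'shopA-12345',      // stable id so daily re-imports overwrite rather than duplicate
    'body'  => normalise($row, $fieldMaps['shopA']),
]);

Using a stable document id per product is what keeps the mostly-unchanged 70% from piling up as duplicates on each daily import.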

How to make multiple open source web apps use the same database [closed]

I would like to set up an online store and a point-of-sale application for a food co-op.
My preference is PHP/MySQL, but I can't find any project that accomplishes both of these requirements. I was wondering whether it would be possible to use separate store and POS apps and have them use the same product database.
The questions I have about this are:
Is it a bad idea?
Should one of the apps be modified to use the same tables as the other, or should there be a database replication process that maps the fields together (is this a common thing?)?
Is it a bad idea?
The greatest danger might be that if someone successfully attacks your online store, the POS systems might be affected as well, e.g. by a DoS attack. That wouldn't keep me from taking this route, though.
Should one of the apps be modified to use the same tables as the other, or should there be a database replication process that maps the fields together (is this a common thing?)?
If you can get at least one of the two systems to use the product data in read-only mode, then I'd set up a number of views to translate between the different schemata without physically duplicating any data.
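A minimal sketch of that view idea, assuming hypothetical table and column names for the store and POS schemas; the POS app then reads from the view as if it were its own products table.

<?php
// Sketch: expose the store's products table to the POS app through a view
// that renames columns to what the POS schema expects. All table and
// column names here are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=coop', 'user', 'secret');

$pdo->exec("
    CREATE OR REPLACE VIEW pos_products AS
    SELECT
        p.id          AS product_id,
        p.short_title AS name,
        p.unit_price  AS price,
        p.stock_level AS quantity_on_hand
    FROM store_products AS p
");

Because a view is computed on read, there is no replication job to keep in sync, which is the main attraction over copying rows between the two schemas.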

One database per user [closed]

I am developing a PHP application using CodeIgniter. I am planning to split the single MySQL database into multiple SQLite databases: one database (MySQL/PostgreSQL/SQLite) that handles authentication, and one SQLite database per user that holds information related to that user. I do not use any joins, and there will be more reads than writes.
Is it a good idea to split the database into multiple SQLite databases for speed? Also, will it cause problems when scaling to multiple servers? I can use redirection depending on the user to point at the right server.
Edit:
Decided to use MariaDB as my server for all users.
By splitting the data into multiple SQLite databases you will gain not speed but a major headache and time sink. Don't do this unless you know you have to, and can prove it with hard numbers, not hypothetical scenarios.
The advice above applies if the system you're building has some value (will be used commercially, etc.). Of course, if this is just a toy/training project, you're welcome to do whatever you like and learn from it.

Read / write load balancing with php - mongodb vs mysql [closed]

I have a system where users can 'like' content. There will likely be many hundreds of these likes happening at once. I'd like it to be AJAX-driven so you get an immediate response.
At the moment I have a MySQL table of likes, which contains the post_id and user_id, and I have a 'cached' counter on the posts table with the total number of likes - simple so far.
Would I benefit in any way from storing any of this information in MongoDB to take the load off of MySQL?
At the moment, when I click like, two MySQL queries run: an INSERT into likes and an UPDATE on posts. If I'm in a large-scale environment with a heavy read/write load, what would be the best way to go?
Thanks in advance :)
MySQL isn't a good option for something like this, as a large number of writes will cause scaling issues. I believe MongoDB's real advantage is its schemaless, JSON-document-oriented storage, and while it should perform better than MySQL (if set up correctly), I think you should look at using Redis to store counters like this (the single INCR command to increase a numeric value is the cream on top of the cake). In my experience it handles writes much more efficiently than any other database.
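A minimal sketch of that Redis approach using the phpredis extension; the key names and the duplicate-like check via a set are assumptions, not part of the answer above.

<?php
// Sketch: count likes in Redis instead of updating a MySQL counter row.
// Key names ("post:{id}:likes", "post:{id}:likers") are hypothetical.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$postId = 123;
$userId = 456;

// Only count the like if this user hasn't liked the post before.
if ($redis->sAdd("post:$postId:likers", $userId)) {
    $total = $redis->incr("post:$postId:likes");   // atomic INCR, no read-modify-write race
} else {
    $total = $redis->get("post:$postId:likes");
}

header('Content-Type: application/json');
echo json_encode(['likes' => (int) $total]);       // immediate response for the AJAX call

You could still flush these counters back into MySQL periodically (e.g. with a cron job) if the rest of the application expects the cached count on the posts table.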

Retrieving database information on a web page [closed]

This will be a total novice's question, but I am looking for advice.
My apologies: I failed to mention in the post that the database I am working with is MySQL.
I know absolutely nothing about any technology that retrieves information from a database. The only three facts I know are that it can be done with either PHP or HTML5, that I should be able to pick it up, and that I will make many mistakes.
Could the community suggest which would be the better technology to learn, and would anyone be able to suggest a starting point?
Yours in advance
Keith
In order to retrieve database information, you generally only need a database such as MySQL and a client to perform your queries (fetching data from the database).
Your client could be anything: a command-line tool, or a PHP script that opens a connection to your database and performs the desired queries.
Fetching data alone will not get you very far unless you can display that information somewhere, provide access to it, or (if desired) allow users to interact with it.
Basically, if you want to retrieve database information and show it on a website, your minimum requirements would be HTML, a database server, a database (preferably with some data to run some tests with) and some kind of scripting language (such as PHP).
There are numerous tutorials out there on how to make your first steps with this.
Here is one.
Start with PHP + MySQL. There are plenty of manuals and examples on the Internet. Google it.
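As a starting point, the smallest useful version of "PHP + MySQL on a web page" looks roughly like the sketch below; the database name, credentials and the products table are assumptions.

<?php
// Sketch: connect to MySQL with PDO, fetch some rows and print them as HTML.
// Database name, credentials and the "products" table are hypothetical.
$pdo  = new PDO('mysql:host=localhost;dbname=shop;charset=utf8mb4', 'user', 'secret');
$rows = $pdo->query('SELECT name, price FROM products')->fetchAll(PDO::FETCH_ASSOC);
?>
<ul>
<?php foreach ($rows as $row): ?>
    <li><?= htmlspecialchars($row['name']) ?> - <?= htmlspecialchars($row['price']) ?></li>
<?php endforeach; ?>
</ul>

Escaping the output with htmlspecialchars() is the habit worth learning from day one, since it keeps database content from being interpreted as HTML or script.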
