Should I use JSON for my posting and commenting system? [closed] - php

Closed. This question is opinion-based. It is not currently accepting answers.
I am building a social networking website. Should I use JSON instead of a database for storing users' posts and comments? Is it good and secure?

JSON is a format, not a file standard. So you could even think of storing JSON in a DB.
So you will need to answer two questions:
Is a file format better than a real DB format (MySQL-like databases)?
Is the JSON format good for the use I have?
File format
Because you will have to deal with quick access and sometimes complex queries (search engine, statistics, ...), it's clear that a DB system will be much more efficient than a file-system-oriented solution. File creations / openings / writings / closings take a lot of time, and it would be a pain to build search queries. In PHP you would have to open all your files and load them into memory before doing the real work.
Using a database, you could perhaps directly ask the system whether someone with "%john%" in their name has posted something with "%futurama%" in the title.
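For instance, a minimal PDO sketch of that kind of query (the users and posts tables, columns, and credentials here are hypothetical, just to illustrate the idea):

```php
<?php
// Hypothetical schema: users(id, name) and posts(id, author_id, title, body).
// Find posts whose title mentions "futurama", written by users named like "john".
$pdo = new PDO('mysql:host=localhost;dbname=social', 'user', 'password');

$stmt = $pdo->prepare(
    'SELECT u.name, p.title
       FROM posts p
       JOIN users u ON u.id = p.author_id
      WHERE u.name  LIKE :name
        AND p.title LIKE :title'
);
$stmt->execute([':name' => '%john%', ':title' => '%futurama%']);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo $row['name'] . ' posted: ' . $row['title'] . PHP_EOL;
}
```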
So... one point for DB
JSON format stored in DB
One more time, you'd better use the column capabilities of the DB system; by that, I mean using an author_id column, for example. It greatly impacts performance. Otherwise, you would have to write complex queries that work around the downsides of handling the JSON format inside your DB engine.
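A hedged sketch of the difference, assuming MySQL 5.7+ (which has JSON functions) and a hypothetical posts table that has both a real author_id column and a JSON data column:

```php
<?php
// Assumed schema: posts(id, author_id, data JSON); connection details are made up.
$pdo = new PDO('mysql:host=localhost;dbname=social', 'user', 'password');

// Dedicated, indexable column: simple to write and fast to execute.
$byColumn = $pdo->prepare('SELECT id FROM posts WHERE author_id = :author');
$byColumn->execute([':author' => 42]);

// Same filter against a JSON blob: the engine has to parse the JSON for
// every row, and an ordinary index on the column does not help.
$byJson = $pdo->prepare(
    "SELECT id FROM posts WHERE JSON_EXTRACT(data, '$.author_id') = :author"
);
$byJson->execute([':author' => 42]);
```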
One more point for not using JSON
But when should you use JSON?
JSON is great when dealing with APIs. If you need to serve data to an application (e.g. an Angular 2 front end that will query your API), JSON is a native format for exchanging data... but not for storing it. JSON is more often used for transport and streaming purposes.
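As a rough illustration, a minimal PHP endpoint that reads rows from the database and serves them as JSON (table, columns, and credentials are invented for the example):

```php
<?php
// api/posts.php: return the latest posts as JSON for a front end to consume.
header('Content-Type: application/json');

$pdo  = new PDO('mysql:host=localhost;dbname=social', 'user', 'password');
$stmt = $pdo->query('SELECT id, author_id, title, body FROM posts ORDER BY id DESC LIMIT 20');

// The rows live in the database; JSON is only the transport format.
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```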

Related

NSUserDefaults vs Core Data [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
I am creating an app like Snapchat using Swift and PHP as my backend. I am using MySQL as my database. I am currently saving my user information in NSUserDefaults (it is only a couple of variables like username, email, id, etc.). I heard that if I am creating a full-scale app for more than 100k users I am supposed to use Core Data, so I am not sure if I should switch over. As I said, only a small number of variables is stored there; the rest is in my database. Will I notice any speed differences?
It can't logically follow that the number of users downloading your app has any impact on the viability of using CoreData vs. UserDefaults to save someone's username and email. However, it's plausible that there are advantages one way or the other (independent of scalability). Personally, I use UserDefaults to save that information, but can't imagine it makes much of a difference.
However, the fact that you're keeping all other data (user profiles, photos, messages) exclusively in a remote database is quite worrisome. Without any kind of caching, your Snapchat-like app should probably be marketed toward the older generation, keen to reminisce about the days of dial-up, with an app crafted to pull any data (other than username/email) repeatedly from a remote database. For this purpose, you ought to use Core Data or a similar alternative (SQLite, etc.). It's absolutely crucial that you have some sort of local database if you want to scale. And please, dear God, do not attempt to save this type of data to UserDefaults.
Unless, of course, you find my facetious proposition appealing.

Web-based accounting application [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
I want to build a web-based application for an accounting company with a large amount of data that does tasks like the following:
1. A login page.
2. Inquire account details (account owner name, account number, balance, date, other personal and account information, etc.) by entering the primary key (the account number) and then displaying some details to the user.
3. Reports, something like this example of a DataTable; these reports should be exportable to an Excel or PDF file (I don't know if that is possible in some way).
Notes
The system runs on an Oracle database.
I want to know whether using PHP to enhance security and AJAX techniques to improve the performance of querying data is a good way to design my web-based application, or whether I should use something better.
Finally, if you don't mind, could you give me a well-explained example for the above ideas, or for any suggested ideas, specifically an example that connects to an Oracle database?
I searched and found examples that use PHP, AJAX, and jQuery or Bootstrap with MySQL.
I also want to know what the difference between them is and which is better in my case.
I would do it like this, because I feel comfortable with these languages. This kind of application can be created in a lot of languages, so why not go with the ones you are good at:
Backend: PHP, MySQL
Frontend: HTML, CSS, JS, Bootstrap
I would use AJAX if there is a need for posting data and updating elements without a refresh.
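As a rough sketch of the server side of such an AJAX call (the endpoint name, table, and fields are invented; the front-end JavaScript would POST to it and update the page from the JSON response):

```php
<?php
// save_account_note.php: called via AJAX; stores a note and replies with JSON.
header('Content-Type: application/json');

$accountNumber = $_POST['account_number'] ?? null;
$note          = $_POST['note'] ?? '';

if ($accountNumber === null) {
    http_response_code(400);
    echo json_encode(['ok' => false, 'error' => 'account_number is required']);
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=accounting', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO account_notes (account_number, note) VALUES (:acc, :note)');
$stmt->execute([':acc' => $accountNumber, ':note' => $note]);

// The JavaScript caller updates the page element from this response.
echo json_encode(['ok' => true, 'id' => $pdo->lastInsertId()]);
```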
But this is just me; do it with what you feel comfortable with. Just prepare everything before you start: create an ER model for the database based on as much info as you can get, and define the business logic processes and procedures before you start coding so you do not end up writing the same code multiple times.
Hope this helps

PHP - Managing A Lot of Data Without Database [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
The Problem
I have an app that scrapes data and presents it to the user directly, because of a lack of disk space.
This data is very volatile, it can change within minutes. Much like the stock market.
Since the data changes so often, and it varies from user to user, it is useless to save it in a database.
The question
I need to sort the data presented to the user, compare it, link it, etc.: a lot of functions that a database provides. Yet I cannot save it in said database because of the above conundrums, so what should I do?
What I've Thought of Doing So Far
I've tried organizing the data presented to each user using just PHP, but it seems troublesome, fragile, and inefficient.
Should I just create some sort of virtual table system in MySQL just for data handling? Maybe use a good database engine for that purpose?
Maybe I can save all the data for each user but have a cron job constantly remove the old data from the database? Seems troublesome.
The Answer
I'd like some implementation ideas from folks who have encountered a similar problem. I do not care for "try all of the above and see what is faster" type of answers.
Thanks all for your help.
If the data is of the type you would store in a db and you would benefit from being able to query it in ways that are more difficult in PHP, but you just don't want to keep it, you can still use a database. You can create temporary tables, insert raw data, and query it to get what you want. When you close the db connection, the tables disappear. Even though the script names them the same, the database will actually create a unique set per connection so each user will have unique data. This solution may not perform as well as you need so do some testing to see if it's suitable for your situation.
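A hedged sketch of that approach with PDO and MySQL (the table, columns, and the $scrapedRows variable are made up for illustration):

```php
<?php
// A TEMPORARY table is private to this connection and is dropped automatically
// when the connection closes, so each request gets its own scratch space.
$pdo = new PDO('mysql:host=localhost;dbname=scratch', 'user', 'password');

$pdo->exec('CREATE TEMPORARY TABLE quotes (
    symbol     VARCHAR(16)   NOT NULL,
    price      DECIMAL(12,4) NOT NULL,
    fetched_at DATETIME      NOT NULL
)');

// Insert the freshly scraped rows for this user only.
// $scrapedRows is assumed to come from your scraper.
$insert = $pdo->prepare('INSERT INTO quotes (symbol, price, fetched_at) VALUES (?, ?, NOW())');
foreach ($scrapedRows as $row) {
    $insert->execute([$row['symbol'], $row['price']]);
}

// Now sort, filter, and join with SQL instead of hand-rolled PHP.
$top = $pdo->query('SELECT symbol, price FROM quotes ORDER BY price DESC LIMIT 10')
           ->fetchAll(PDO::FETCH_ASSOC);
```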

How to bring parsed files to a unique format [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
I have different sources, like S3 (JSON files) and APIs, and I have to bring all the data into a single, uniform format to store it in the DB.
I tried to parse the files and API responses in my PHP back end, but it is too slow.
Are there any best practices or advice on how to do this the right way?
I'm going to define an interface with all required methods, and a class for every source that implements the interface.
If I will be working with hundreds or thousands of files per hour, is this approach the best way to do it?
P.S. Currently the project is built on top of the Symfony2 framework.
I guess you are forced by a traditional RDBMS to convert all sources to a specific format.
You may use schema-less systems like MongoDB, Cassandra, or even the JSON type in MySQL 5.7 to store three fields: id, source_type and source_json. This way you create several classes that know how to parse each source_type (e.g. S3) and use them accordingly.
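A minimal sketch of combining that storage layout with the interface idea from the question (the interface, class, and field names here are invented and not Symfony2-specific):

```php
<?php
// One normalizer per source; each turns raw input into the same three-field shape
// (id, source_type, source_json) that gets inserted into the storage table.
interface SourceNormalizer
{
    /** @return array{id: string, source_type: string, source_json: string} */
    public function normalize(string $rawPayload): array;
}

final class S3JsonNormalizer implements SourceNormalizer
{
    public function normalize(string $rawPayload): array
    {
        $data = json_decode($rawPayload, true);

        return [
            'id'          => (string) ($data['id'] ?? uniqid('s3_', true)),
            'source_type' => 's3',
            'source_json' => json_encode($data),
        ];
    }
}

final class ApiResponseNormalizer implements SourceNormalizer
{
    public function normalize(string $rawPayload): array
    {
        $data = json_decode($rawPayload, true);

        return [
            'id'          => (string) ($data['uuid'] ?? uniqid('api_', true)),
            'source_type' => 'api',
            'source_json' => json_encode($data),
        ];
    }
}
```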

Parsing JSON vs. Parsing XML in an iOS project [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
In the backend of my iOS project, the admin saves the data into a DB or in an XML file. So whenever he wants, he can simply add an entry.
In the iOS app, I want to retrieve the data.
If I use XML, I can directly parse the XML file, since data are already in XML format (when admin added the value, the XML file got updated).
If I use JSON, I have to connect to the DB, get the result of the query and then encode it into JSON.
So, what do you think would be faster in terms of the response arriving on the phone?
Is there any other option that I didn't take into account?
I have read all of these similar questions:
JSON and XML comparison [closed],
What's better: Json or XML (PHP) [closed],
JSON or XML: Just Decide (April 2012; by Mark Nottingham)
and many more, but I want to ask something specific for my project.
It depends on lots of different things:
amount of data
cpu time needed to generate the data
network bandwidth/latency
mobile phone's hardware
...
But because the mobile network is generally the bottleneck, the less redundant transfer will probably be the most efficient, and that is JSON in this case.
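A quick, hedged way to see that redundancy difference in PHP: serialize the same made-up record both ways and compare the byte counts (exact sizes will vary with your schema).

```php
<?php
// Serialize the same record as JSON and as XML and compare payload sizes.
$record = ['id' => 42, 'title' => 'Hello world', 'author' => 'john', 'likes' => 7];

$json = json_encode($record);

$xml = new SimpleXMLElement('<record/>');
foreach ($record as $key => $value) {
    $xml->addChild($key, (string) $value);
}
$xmlString = $xml->asXML();

printf("JSON: %3d bytes  %s\n", strlen($json), $json);
printf("XML:  %3d bytes  %s\n", strlen($xmlString), trim($xmlString));
```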
