Using the phpfastcache file cache system - PHP

I have a question regarding phpfastcache that I hope someone can answer and explain to me. phpfastcache claims that it can serve 10,000+ visitors, but what is the limit it can serve, if there is any? Let me give you an example so you can answer based on it.
Let's say my website or app has 12 million online users (just an example) and I have to send the same query to the database on every page/app load. Do you think it will be able to handle this number of users? We are using a NoSQL database and our website is behind a CDN. We know that memory is always faster than the file system, but we are using the file system when we use phpfastcache for caching. I hope there is someone who can answer my question and explain things to me for future use.

It doesn't claim that it can serve 10,000 requests! It suggests that it's a great fit if you have 10,000 identical requests to the database. To get real numbers you have to profile your server.
As it's probably an Apache-based one, it will depend on the number of concurrent connections Apache can handle.
What the phpfastcache folks mean is that if you have a page that's constantly hit and that performs the same query over and over again, their software is a great fit for that problem.
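To make the pattern concrete, here is a minimal cache-aside sketch assuming the PSR-6 style API of recent phpfastcache versions; the cache key, the 5-minute TTL and the runExpensiveQuery() helper are placeholders, not part of your setup:

    <?php
    use Phpfastcache\CacheManager;
    use Phpfastcache\Config\ConfigurationOption;

    // 'files' driver: cached results are written to disk instead of memory.
    $cache = CacheManager::getInstance('files', new ConfigurationOption([
        'path' => __DIR__ . '/cache',
    ]));

    $item = $cache->getItem('popular_query');

    if (!$item->isHit()) {
        // Only one request per TTL window actually hits the database;
        // every other identical request is served from the file cache.
        $data = runExpensiveQuery(); // hypothetical helper that runs your NoSQL query
        $item->set($data)->expiresAfter(300); // keep the result for 5 minutes
        $cache->save($item);
    }

    $result = $item->get();

Whether the files driver keeps up with millions of users then comes down to disk throughput and your web server's concurrency limits, which is exactly what profiling will tell you.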

Related

Is my website slow because of the server or because of big data in the MySQL DB?

My website is reqsbook.com.
It is basically a job portal website; it currently has around one lakh (100,000) jobs.
When I search for jobs on my website it performs too slowly. I am using a HostGator cloud server and I have hosted a single domain there.
I compressed my website as much as possible.
I learned through the internet that I should either go for a dedicated server or a local server, meaning I would need to keep a server at my location and maintain the website from there.
I am thinking that even if I take a dedicated server the same problem may repeat, because my website's DB is growing day by day.
If I go for a local server... I don't have any knowledge of this.
Please, someone help me and give me a better idea.
Thank you
Yes, it really is slow. Only cleaner OOP code and proper use of your RDBMS can help you.
Restructure your code so that the same code does not execute many times.
Follow relational database management principles. You can use indexing for fast searching; views, temporary tables and transactions can also help you manage the data more easily.
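As a rough illustration of the indexing advice, the sketch below adds a composite index covering a typical job search; the table and column names (jobs, location, title), the search values and the connection details are assumptions, not your real schema:

    <?php
    // Hypothetical credentials - replace with your own.
    $pdo = new PDO('mysql:host=localhost;dbname=reqsbook', 'dbuser', 'secret');

    // Without an index, a search scans every row of the jobs table.
    // With an index on (location, title), MySQL can jump straight to the matches.
    $pdo->exec('CREATE INDEX idx_jobs_location_title ON jobs (location, title)');

    // Equality on the leading column plus a prefix LIKE can use the index;
    // a leading wildcard such as '%java%' cannot and would need a FULLTEXT index.
    $stmt = $pdo->prepare('SELECT id, title, location FROM jobs WHERE location = ? AND title LIKE ?');
    $stmt->execute(['Hyderabad', 'Java%']);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

Run EXPLAIN on your slow search queries first to confirm which ones actually lack an index before adding any.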

File creation limits

Dear respected developers across the globe,
I'm only seeking knowledge and understanding. Please provide as much information as you can, to help not only me but also others around the world.
I will divide my questions so it's easier to understand what I'm asking for, and so you can answer them individually for better understanding.
My questions are:
a)
I'm running a home server using MAMP PRO on Windows 8 with 32 GB RAM, a 4K SSD and an i7 CPU. My server's DNS is set up with cloudflare.com. When people from the public internet view my site, a PHP script creates a text file named after their username, like {username}.txt, for every username. The same file gets re-created every time the user logs in, to keep the data about them fresh. That was some information about what I'm doing. What I want to understand is: are there any limits? Say 500,000 people try to reach my site at the same time and every user makes my site create a fresh new txt file for them. Will it work? Are there any problems? Please share with me.
b)
Can a text file be viewed by, say, 1,000,000 people at the same time? I'm talking about being viewed, not created, here.
The number of files is not limited at all by the operating system or by PHP.
But it is limited by the file system you save the files in. The exact numbers depend on the type of file system and its configuration. A typical limit is around 32,000 entries in a single directory, but, as mentioned, that can be configured.
What is typically done in such cases is to spread all those files over directories named after substrings of the file name itself. So, for example, the file somegoodguy.txt is saved under /som/ego/odg/somegoodguy.txt. Provided the characters in the names are used more or less evenly, that should keep you from hitting any limits, since the files are spread evenly over many, many folders.
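A minimal sketch of that sharding idea in PHP (the nine-character slicing and the base directory are illustrative choices, not a fixed rule):

    <?php
    /**
     * Build a sharded path such as /som/ego/odg/somegoodguy.txt so that
     * no single directory accumulates too many entries.
     */
    function shardedPath(string $baseDir, string $username): string
    {
        // Pad very short names so there are always nine characters to slice;
        // hashing the name first (e.g. md5($username)) is a common alternative
        // when names are short or unevenly distributed.
        $key = str_pad(strtolower($username), 9, '_');

        $dir = sprintf(
            '%s/%s/%s/%s',
            $baseDir,
            substr($key, 0, 3),
            substr($key, 3, 3),
            substr($key, 6, 3)
        );

        if (!is_dir($dir)) {
            mkdir($dir, 0775, true); // create the nested shard folders on demand
        }

        return $dir . '/' . $username . '.txt';
    }

    // Usage: file_put_contents(shardedPath('/var/data/users', 'somegoodguy'), $freshUserData);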
However:
It is questionable whether that is a good approach at all. File-based storage is not exactly efficient. You get much better performance if you use a database instead: one entry (row) per user in a database table. Accessing that information is really efficient and fast, and you don't have to worry about any limits.
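If you go that route, the per-user text file becomes a single row that is upserted on every login. A hedged sketch, assuming MySQL via PDO; the user_profiles table and its columns are made up for illustration, and username needs a UNIQUE index for the upsert to work:

    <?php
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'secret');

    // One row per user, refreshed on every login - the equivalent of
    // rewriting {username}.txt, but without any directory limits.
    $stmt = $pdo->prepare(
        'INSERT INTO user_profiles (username, data, updated_at)
         VALUES (:username, :data, NOW())
         ON DUPLICATE KEY UPDATE data = VALUES(data), updated_at = NOW()'
    );
    $stmt->execute([
        ':username' => 'somegoodguy',
        ':data'     => json_encode(['last_login' => time()]),
    ]);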

PHP application took longer time to respond [closed]

I have developed an application in PHP. The application has a complete user management system along with a dynamic form creation feature, data import/export features and much more. I am using MySQL as the database. When I was testing the application it worked perfectly fine. Now I have deployed it on the customer's side and almost 50-60 users are running it. It's been two months and now they are facing some problems. They say the application sometimes responds very late, and sometimes it takes a very long time to respond. For example, to use the application a user needs to log in; sometimes the login works perfectly fine and the user can log in easily, and sometimes it takes a long time. I looked into this personally and faced the same problem. Now I am confused about where the actual problem is:
My application
Network speed
Server
Large data in SQL
How can I get any clue as to where the exact problem is?
You're going to need to provide a LOT more information to get a decent answer. And in providing that data, you will almost certainly solve the problem...
In most database-driven applications, the database is the first place where performance issues arise, especially as the data scales. A system that works fine with just a few records in the database can grind to a halt when scaling up...
So the first thing I'd do is look at the database processes while people are using the system, and look for long-running queries. Optimize those queries, and rinse & repeat.
It may also be worth writing debug log statements around your database access logic so you can look at historical performance stats - this is useful if the client tells you that the system slowed down yesterday, but is running fine today.
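One low-tech way to collect those historical stats, assuming the app talks to MySQL through PDO; the 200 ms threshold and the log path are arbitrary placeholders:

    <?php
    /**
     * Run a query and log how long it took, so a slow afternoon can be
     * traced back to specific statements after the fact.
     */
    function timedQuery(PDO $pdo, string $sql, array $params = []): array
    {
        $start = microtime(true);

        $stmt = $pdo->prepare($sql);
        $stmt->execute($params);
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

        $elapsedMs = (microtime(true) - $start) * 1000;
        if ($elapsedMs > 200) { // only log queries slower than 200 ms
            error_log(
                sprintf("[%s] %.1f ms: %s\n", date('c'), $elapsedMs, $sql),
                3,
                '/var/log/app/slow-queries.log'
            );
        }

        return $rows;
    }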
If it isn't the database, you need to look at the PHP performance. You'll need a profiler (try XDebug) and look for bottlenecks in your PHP code.
If it's neither the database nor your PHP code, you may have a configuration problem with your web server software (Apache or whatever you're using). This is hard to debug - you'll need to trawl through the configuration files and look for limits (e.g. "MaxConnections").
If it's not those things, the network may be the problem. This is pretty rare - if the network can't support a web application, then file sharing, email and video conferencing will all suffer too. Ask your client if this is the case. To prove or disprove it, put a decent-sized file on your web server (a 20 MB MP3 file should do it) and test how long it takes to download while the application is running slowly.
If the problem is in your application itself, try optimizing the code. There isn't much more to say on this point, as the code is not provided.
Try pinging the server and check the response time. If it is normal, there is no issue with the network.
Check the server's hardware configuration (for both the application and the MySQL server); if the hardware is too low-spec for the application to run well, upgrade it.
THIS IS MOST LIKELY TO BE THE SOLUTION: if your MySQL database holds a large amount of data, try indexing it. For instance, you had a problem with logging in, so try indexing the usernames in your "users" table.
To be frank, the data provided in the question is insufficient to come up with a concrete solution.
If the speed of your application was fine when you deployed it initially, I suspect that the problem is the database.
Is your database normalized? Do you have the correct indexes? You can try indexing the columns that you use in WHERE clauses.
But, as Abhay Pai said, the data provided is insufficient to solve this problem.

PHP Image Generation

So, for a simple test game, I'm working on generating user images based on their current in-game avatar. I got this idea from Club Penguin and GTA V. They both generate images of the current in-game avatar.
I created a script to simply put a few images together and print out the final image to the client. It's similar to how Club Penguin does it, I believe: http://cdn.avatar.clubpenguin.com/%7B13bcb2a5-2e21-442c-b8e4-10516be6abc6%7D/cp?size=300
As you can see, the penguin is wearing multiple clothing items. The items are each different images located at http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/ (ex: http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/210.png)
Anyway, I've already made the script and all, but I have a few questions.
When going to Club Penguin's or Grand Theft Auto's avatar generator, you'll notice it finishes the request very fast. Even for a new user (so before it has a chance to cache the image, since it hasn't been generated yet), it finishes in under a second.
How could I possibly speed up the image generation process? Right now I'm just using PHP, but I could definitely switch over to another language. I know a few others too and I'm willing to learn. Which language can provide the fastest web-image generator (it has to connect to a database first to grab the user avatar info)?
For server specs, how much RAM and all that fun stuff would be an okay amount? Right now I'm using an OVH cloud server (VPS Cloud 2) to test it and it's fine and all. But, if someone with experience with this could help, what might happen if I started getting a lot more traffic and there were people with 100+ image requests being made per client when they first log in (relationship system that shows their friend's avatar). I'll probably use Cloudflare and other caching tools to help so that most of them get cached for a maximum of 24 hours, but I can't completely rely on that.
tl;dr:
Two main questions:
What's the fastest way to generate avatars on the web (right now I'm using PHP)?
What are some good server specs for around 100+ daily unique clients (at minimum) using this server for generating these avatars?
Edit: Another question, which webserver could process more requests for this? Right now I'm using Apache for this server, but my other servers are using nginx for other API things (like logging users in, getting info, etc).
IMHO, the language is not the bottleneck. PHP is fast enough for real-time processing of small images. You just need the right algorithm. Also, check out bytecode caching engines such as APC, XCache, or even HHVM; they can significantly improve PHP performance.
I think any VPS can do the job until you have more than about 20 concurrent requests. The more clients use the service at the same time, the more RAM you need. You can easily determine your script's memory needs and other performance figures by using a profiler such as XHProf.
Nginx or Lighttpd in FastCGI mode use less RAM than the Apache HTTP server and can handle more concurrent connections. But that's not important until you have many concurrent connections.
Yes, PHP can do this job fast and flexibly (for example, generate.php?size=32).
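For reference, a minimal GD sketch of such a generate.php: it stacks a few transparent PNG layers and streams the result. The layer file names and the 300x300 canvas are assumptions about the setup, and in practice the layer list would come from the database lookup of the user's avatar items:

    <?php
    // generate.php - composite an avatar from transparent PNG layers using GD.
    $layers = ['body.png', 'eyes.png', 'hat.png']; // hypothetical item images

    $canvas = imagecreatetruecolor(300, 300);
    imagealphablending($canvas, false);
    imagesavealpha($canvas, true);
    imagefill($canvas, 0, 0, imagecolorallocatealpha($canvas, 0, 0, 0, 127)); // fully transparent background
    imagealphablending($canvas, true);

    foreach ($layers as $file) {
        $layer = imagecreatefrompng(__DIR__ . '/items/' . $file);
        imagecopy($canvas, $layer, 0, 0, 0, 0, imagesx($layer), imagesy($layer));
        imagedestroy($layer);
    }

    header('Content-Type: image/png');
    header('Cache-Control: public, max-age=86400'); // let a CDN such as Cloudflare cache it for 24 hours
    imagepng($canvas);
    imagedestroy($canvas);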
I only know German hosting providers, but they also have an English interface: www.nitrado.net

Synchronised local copy of MySQL / PHP website

I have a web based CRM coded in PHP, running off a MySQL database. The server is hosted in the same city as the company HQ but the company's internet connection is average (10Mbps down, 2Mbps up, 30ms ping to the server, all on a good day). The boss is happy with the results but now wants it to 'run super fast in the office' but we still need it to be viewable on the internet.
Short of moving the web server from our host and on to the local office network, which isn't a great option because then it would be super slow for everyone outside of the office, does anyone know a way to achieve this? I was thinking of setting up a local copy of the site and having the MySQL databases synchronise, but this sounds like a logistical nightmare.
Any ideas would be much appreciated! Happy to provide more info if needed.
You can set up dual-master replication with MySQL to accomplish this.
I would not attempt it without a fast, reliable line (which it seems you have). I would certainly set up and load-test temporary servers to prove the configuration works.
For more information
http://devel.reinikainen.net/40
http://www.neocodesoftware.com/replication/
I am not joking around here.
Step 1) Have your boss define, in writing, what "super fast" means. This could (and should) include page load times for specific pages.
Step 2) Determine where the speed deficiency actually is. You think you know, but you don't. Measure it and record the results. Use Firebug in Firefox to check page load and transfer times.
Step 3) Identify how you can speed up the app based on the SPECIFIC measurements you took.
