Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question appears to be off-topic because it lacks sufficient information to diagnose the problem. Describe your problem in more detail or include a minimal example in the question itself.
Closed 8 years ago.
I've been messing around with Firebug on a test website and need some help with it.
This page is only 2.7 KB but takes 11.39 seconds to load:
http://puu.sh/8V9sk.png
Would someone point me in the right direction?
Thanks
Hovering over the timeline within Firebug's Net panel provides more detailed information on the network request timings, so you can see which part of the request is slow.
According to your screenshot, the Waiting part (purple) takes the longest, which means that your server-side script takes a long time to execute.
Because client-side debugging tools like Firebug cannot tell you why a server-side script is slow, you need to analyze the problem within the server-side script itself.
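To see the same breakdown outside the browser, you can compute the phases from PHP's cURL timings. This is a minimal sketch; the phase names mirror Firebug's Net panel rather than any official cURL naming, and the sample numbers are invented to resemble the screenshot:

```php
<?php
// Split curl_getinfo() timings into the phases Firebug's Net panel
// shows (DNS, Connect, Waiting, Receiving). All values are in seconds.
function timing_breakdown(array $info): array
{
    return [
        'dns'       => $info['namelookup_time'],
        'connect'   => $info['connect_time'] - $info['namelookup_time'],
        'waiting'   => $info['starttransfer_time'] - $info['pretransfer_time'],
        'receiving' => $info['total_time'] - $info['starttransfer_time'],
    ];
}

// Made-up numbers resembling the screenshot: almost all of the
// 11.39 s is spent waiting for the first byte from the server.
$info = [
    'namelookup_time'    => 0.02,
    'connect_time'       => 0.05,
    'pretransfer_time'   => 0.05,
    'starttransfer_time' => 11.30,
    'total_time'         => 11.39,
];
print_r(timing_breakdown($info));
```

In a real measurement you would pass the array returned by `curl_getinfo($ch)` after `curl_exec($ch)`; a large "waiting" value points at the server-side script, just as the purple bar does in Firebug.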
I ended up letting the site cache all the web scripts, which made the page load quicker. In Moodle this can be done via
Site Administration > Appearance > Settings.
This question does not appear to be about programming within the scope defined in the help center.
Closed 2 years ago.
I have a website that I built using PHP and HTML hosted on GoDaddy. When the website is idle for more than ~30 minutes, it takes around 20 seconds for it to load on the first visit after that.
I am suspicious that GoDaddy puts the server to sleep if there is no activity for ~30 minutes, but GoDaddy support tells me that is not the case.
I have tried to add session_write_close() just in case there was an issue with session locking. I have also tried clearing my browser cache to see if the website was just loading fast from my cache, but that also did nothing.
Any ideas would be much appreciated. Thank you.
You can follow these steps:
Make sure the images on your website are not too big.
Find out whether any JavaScript is taking too long to load.
Find out whether any API call is taking too long to return values.
Check for badly written code, which can also increase loading time.
To find the problem, you can use https://developers.google.com/speed/pagespeed/insights/ or the developer tools in Google Chrome.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 8 years ago.
I'm trying to simulate a web browser in order to log into a secure site, where the site's backend seems to be written in some mix of PHP and ASP.NET, and retrieve some user details.
In order to fit my own project, the simulation results (i.e. the user details) must be returned to a PHP script for processing.
So far I've been working with cURL in PHP to do this, but I've realised that the site is far too complicated to handle with cURL effectively, and this method is far too slow to develop with. What I would like is some sort of browser simulator that can:
Execute JavaScript
Submit forms
Click links
Handle cookies
Handle ASP.NET postbacks
Access the DOM
Basically something that behaves exactly like a real browser, and can return the page source to me.
I've explored the Snoopy class in PHP and Capybara in Ruby. If I don't get any better options I will be forced to implement with one of these.
You have two options:
Use a headless browser. This is basically a browser without any graphical output, which can be controlled from code. Check out Selenium and PhantomJS; there are probably bindings for your language of choice.
Reverse-engineer their site. Perform the login flow and the actions needed to reach the resource you need, and watch the network traffic, for example with Chrome's developer tools. Look at the requests, headers and form data sent to the endpoints in question and emulate them in your code.
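As a rough sketch of that second option with PHP's cURL extension (the URL and form field names below are hypothetical; substitute the ones you observe in the recorded login request):

```php
<?php
// Build the cURL options for an emulated login POST. The cookie jar
// keeps the session cookie so later requests stay authenticated.
function login_options(string $url, string $user, string $pass): array
{
    return [
        CURLOPT_URL            => $url,
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query([
            'username' => $user,   // field names copied from the recorded request
            'password' => $pass,
        ]),
        CURLOPT_COOKIEJAR      => '/tmp/cookies.txt', // persist the session cookie
        CURLOPT_COOKIEFILE     => '/tmp/cookies.txt',
        CURLOPT_FOLLOWLOCATION => true,               // follow the post-login redirect
        CURLOPT_RETURNTRANSFER => true,               // return the page source as a string
    ];
}

$ch = curl_init();
curl_setopt_array($ch, login_options('https://example.com/login', 'me', 'secret'));
// $html = curl_exec($ch);  // uncomment to actually send the request
```

For a site that builds its pages with JavaScript or ASP.NET postbacks this quickly gets painful, which is exactly when the headless-browser option becomes the better choice.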
Closed 9 years ago.
We are currently moving a lot of our code to use the API we've developed instead of making SQL calls directly from our PHP. There will be a lot of functionality to test once this happens. Do you know of a good plugin or piece of software to record and replay an action (such as registering a user, then logging in, posting a comment, etc.)? I know there is software like Selenium, but I've heard it would be more of a hassle to set up than it's worth for what we need.
I basically want to record a script of my actions on our stable build, run that script on the build that uses our newly implemented API and a different database, then compare the two databases to make sure they contain the same data.
Any suggestions would be great. There has to be a Chrome plugin or something, but I haven't been able to find it after a few hours of searching.
If these are web service calls to your API, you can use curl (on the command line or from PHP), or even Guzzle, which is just an HTTP client for communicating with web services. What you are describing is testing your app, which is common. There is nothing trivial or easy about full test coverage, so prepare to spend some time setting this up and working out the kinks.
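The final comparison step you describe can be sketched in PHP along these lines. The table and row shapes here are hypothetical; assume each table has been dumped to an array keyed by its primary key:

```php
<?php
// Diff two table dumps (arrays keyed by primary key) and report which
// rows are missing from one side or differ between the two databases.
function diff_tables(array $old, array $new): array
{
    $diff = [];
    foreach ($old as $id => $row) {
        if (!array_key_exists($id, $new)) {
            $diff[$id] = 'missing in new DB';
        } elseif ($row != $new[$id]) {
            $diff[$id] = 'row differs';
        }
    }
    // Rows that exist only in the new database.
    foreach (array_diff_key($new, $old) as $id => $row) {
        $diff[$id] = 'missing in old DB';
    }
    return $diff;
}

$old = [1 => ['name' => 'alice'], 2 => ['name' => 'bob']];
$new = [1 => ['name' => 'alice'], 3 => ['name' => 'carol']];
// Reports id 2 as missing in the new DB and id 3 as missing in the old DB.
print_r(diff_tables($old, $new));
```

Run the same recorded actions against both builds, dump each table with a plain SELECT ordered by primary key, and feed the two dumps through a diff like this.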
Closed 8 years ago.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
I'm developing an application composed of two elements: a simple PHP server interacting with a SQL database, and a client which, on certain events, sends information to the server to be logged.
I would like a way to verify on the server side that an incoming query really comes from my client, because there is a possibility that someone will decompile the client, see how it connects and sends commands to the server, and use that to inject false data.
I have no idea how to build such a mechanism, because anything I implement in the client could (theoretically) be recovered by decompilation. Or maybe obfuscation is a solution in this case?
If someone is intrepid enough to decompile your client, they will simply write their own client using your mechanism, and there is no way you can distinguish the two. No amount of "authentication" will stop that. For example, if someone gets my private SSH key, it's game over: they are me until that key is revoked.
The best you can do is make it hard for them to decompile, detect intrusion, and limit damage. Some ideas, but you really should consider the attack patterns you expect to face:
Only allow the client to execute certain commands with certain parameters
Do not allow any more than the expected number of commands per time period
Limit the IPs from which certain commands can come
Be able to revoke client privileges on the server
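The second point above can be sketched as a simple server-side rate limiter. This is a minimal in-memory sketch with a hypothetical client ID; in production you would back the hit counts with your database or a cache like APCu:

```php
<?php
// Reject a client that sends more than $limit commands per $window
// seconds. Timestamps are injected so the logic is easy to test.
class RateLimiter
{
    private array $hits = [];

    public function __construct(
        private int $limit,
        private int $window
    ) {}

    public function allow(string $clientId, int $now): bool
    {
        // Drop timestamps that have fallen out of the current window.
        $this->hits[$clientId] = array_filter(
            $this->hits[$clientId] ?? [],
            fn ($t) => $t > $now - $this->window
        );
        if (count($this->hits[$clientId]) >= $this->limit) {
            return false; // over the limit: reject (or at least flag) the command
        }
        $this->hits[$clientId][] = $now;
        return true;
    }
}

$rl = new RateLimiter(limit: 2, window: 60);
var_dump($rl->allow('client-a', 100)); // true
var_dump($rl->allow('client-a', 110)); // true
var_dump($rl->allow('client-a', 120)); // false (3rd hit inside 60 s)
var_dump($rl->allow('client-a', 200)); // true  (old hits have expired)
```

This does not authenticate anything, in line with the point above; it only limits how much damage a forged client can do per time period.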
PS: Expect this question to be closed or at least downvoted. It's not about code, but about design.
Closed 9 years ago.
I built a website using PHP and want to deploy it on the internet. I want to know whether it is better to host it on a web server or in the cloud, and what the pros and cons of each option are.
Resources needed for my site:
PHP
MySQL
Apache or lighttpd
My site is a simple CMS with 10 pages (max).
A web server has fixed hardware specifications, meaning that if too many users try to access it at once, it will fail to answer their requests.
Cloud hosting providers will restrict you in what you can do (what language, what APIs you can access, ...), but they usually allow for automatic scaling, meaning: If the first instance's ("server's") load exceeds a certain limit, a second instance may start automatically to handle half of the load, and so on.
A single server is often sufficient for PHP sites, but you may suffer from the slashdot effect, i.e. a sudden peak of attention to your site may bring it down quickly.
You didn't specify the kind of application you are going to deploy or the resources your application needs. Anyhow, below are the points you need to consider when deciding between cloud and non-cloud hosting:
1) Availability
2) Scalability
3) Security
4) Cost effectiveness
Cons:
1) Security concerns
2) Limited control