Can anyone suggest a good cURL-based PHP browser / spider / crawler / HTTP / file download library?
I just want a tool for downloading content from URLs, like I would do with implode('', file($url)) or file_get_contents(), but it should support timeouts, HTTP response codes, custom headers, etc.
cURL is awesome, and I've been using it for a while in a function I created, but I want an encapsulated third-party library. Surely something like that must exist. I searched, but I couldn't find exactly what I wanted.
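Roughly this kind of wrapper is what I mean (a quick illustrative sketch, not my actual function):

<?php
// Illustrative only: a small cURL helper with timeout, custom headers and status code.
function http_get($url, array $headers = array(), $timeout = 10)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,      // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,      // follow redirects
        CURLOPT_CONNECTTIMEOUT => $timeout,
        CURLOPT_TIMEOUT        => $timeout,
        CURLOPT_HTTPHEADER     => $headers,  // e.g. array('Accept: text/html')
    ));

    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $error  = curl_error($ch);
    curl_close($ch);

    return array('body' => $body, 'status' => $status, 'error' => $error);
}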
You can try http://simplehtmldom.sourceforge.net/
It is a PHP library for parsing/downloading content.
I was using PHPCrawl for my web crawler project. It is a purely standalone library, with no dependency on cURL.
It provides functions to recursively download the content of a given URL, and it supports pattern matching and timeouts. You can then do whatever you want with the retrieved content. It can also give you the HTTP response status, though I'm not sure about custom headers.
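From memory, basic usage looks roughly like this; the class, method and property names are what I recall from the PHPCrawl docs, so check them against the version you download:

<?php
// require 'libs/PHPCrawler.class.php'; // path depends on where you unpacked PHPCrawl

class MyCrawler extends PHPCrawler
{
    // Called once for every document the crawler receives.
    function handleDocumentInfo($DocumentInfo)
    {
        echo $DocumentInfo->url . " returned HTTP " . $DocumentInfo->http_status_code . "\n";
        // $DocumentInfo->content holds the page source for further processing.
    }
}

$crawler = new MyCrawler();
$crawler->setURL("www.example.com");
$crawler->addContentTypeReceiveRule("#text/html#"); // pattern matching on content types
$crawler->go();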
I wrote my own pretty decent function using cURL: http://pastebin.com/4CPaCfMm
It works, but I thought I would look for a more advanced OOP tool that I could use in my projects and that would be maintained and developed further: a library.
I'm trying to simulate a web browser in order to log into a secure site (the site's backend seems to be written in some mix of PHP and ASP.NET) and retrieve some user details.
In order to fit my own project, the simulation results (i.e. the user details) must be returned to a PHP script for processing.
So far I've been working with cURL in PHP to do this, but I've realised that the site is far too complicated to handle with cURL effectively, and this approach is far too slow to develop. What I would like is some sort of browser simulator that can:
Execute JavaScript
Submit forms
Click links
Handle cookies
Handle ASP.NET postbacks
Access the DOM
Basically something that behaves exactly like a real browser, and can return the page source to me.
I've explored the Snoopy class in PHP and Capybara in Ruby. If I don't find any better options I will be forced to implement this with one of them.
You have two options:
Use a headless browser. This is basically a browser without any graphical output that can be controlled from code. Check out Selenium and PhantomJS; there are probably bindings for your language of choice.
Reverse-engineer their site. Perform the login flow and the actions needed to reach the resource you want, and look at the network traffic, for example with Chrome's developer tools. Look at the requests, headers and form data the endpoints require and emulate that in your code (see the sketch below).
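For option 2, a minimal PHP/cURL sketch of replaying a login and then fetching a protected page could look like this (the URLs and field names are placeholders; take the real ones from the traffic you captured):

<?php
// Illustrative sketch only: URLs and field names must come from the real site's traffic.
$cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

// 1. POST the login form, storing any session cookies.
$ch = curl_init('https://example.com/login');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'username' => 'me',
        'password' => 'secret',
        // ASP.NET pages usually also need __VIEWSTATE / __EVENTVALIDATION values,
        // scraped from the login page's HTML first.
    )),
    CURLOPT_COOKIEJAR      => $cookieJar,
    CURLOPT_COOKIEFILE     => $cookieJar,
));
curl_exec($ch);
curl_close($ch);

// 2. Request the page with the user details, reusing the stored cookies.
$ch = curl_init('https://example.com/account/details');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_COOKIEFILE     => $cookieJar,
));
$detailsHtml = curl_exec($ch);
curl_close($ch);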
We are currently moving a lot of our code to use the API we've developed instead of making SQL calls directly from our PHP. There will be a lot of functionality to test once this happens. I was wondering if you know of a good plugin or piece of software to track and replay an action (such as registering a user, logging in, posting a comment, etc.). I know there is software like Selenium, but I've heard it would be more of a hassle to set up than it's worth (for what we need it for).
I basically want to record a script of my actions on our stable build, then run that script on the build that uses our newly implemented API (which uses a different database), then compare the two databases to make sure they contain the same data.
Any suggestions would be great. There has to be a Chrome plugin or something, but I haven't been able to find it after a few hours of searching.
If these are web service calls to your API, you can use curl (on the command line or from within PHP), or even Guzzle, which is just an HTTP client for communicating with web services. What you are describing is testing your app, which is common. There is nothing trivial or easy about full test coverage, so prepare to spend some time setting this up and working out the kinks.
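As a rough illustration of the replay idea with plain cURL in PHP (the endpoints and payloads below are made up; record the real ones from your stable build first):

<?php
// Hypothetical list of recorded actions; in practice you'd capture these from the stable build.
$actions = array(
    array('method' => 'POST', 'path' => '/register', 'data' => array('user' => 'test', 'pass' => 'secret')),
    array('method' => 'POST', 'path' => '/login',    'data' => array('user' => 'test', 'pass' => 'secret')),
    array('method' => 'POST', 'path' => '/comment',  'data' => array('body' => 'hello world')),
);

function replay($baseUrl, array $actions)
{
    foreach ($actions as $action) {
        $ch = curl_init($baseUrl . $action['path']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        if ($action['method'] === 'POST') {
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($action['data']));
        }
        curl_exec($ch);
        curl_close($ch);
    }
}

// Run the same script against both builds, then diff the two databases afterwards.
replay('http://stable.example.com', $actions);
replay('http://api-build.example.com', $actions);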
Imagine a canvas paint tool where you can draw and paint on a website, but, like a chat application, what you draw immediately shows up on your friend's canvas as well. WebSockets would be more than perfect for this. But since my website is hosted by a web host that doesn't support JavaScript on the server, WebSockets is not an option (if I understood that correctly). Is there any other way I could build it that comes close to the efficiency WebSockets provide? Or is my only good solution to host my website on a server that lets me run JavaScript (such as Node.js)?
This is what you are looking for http://tutorialzine.com/2012/08/nodejs-drawing-game/
The next best solution for sharing the canvas data in real time would be AJAX long polling. Simply put, the client makes an AJAX request to the server; if the server has fresh canvas data it returns it immediately, otherwise it keeps the HTTP request open until it has new data to return. Once data is returned, the process is repeated.
Since we are using standard HTTP requests, this won't be as efficient as WebSockets, as every HTTP request carries a bunch of headers which are not needed.
More on long polling - http://en.wikipedia.org/wiki/Comet_%28programming%29
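On the PHP side, the long-polling endpoint could look roughly like this (a minimal sketch; get_canvas_updates_since() stands in for whatever storage lookup you use, e.g. a database table keyed by timestamp):

<?php
// poll.php -- minimal long-polling sketch; get_canvas_updates_since() is a placeholder
// for your own storage lookup (database, file, etc.).
$since   = isset($_GET['since']) ? (int) $_GET['since'] : 0;
$timeout = 25; // seconds to hold the request open before giving up
$start   = time();

while (time() - $start < $timeout) {
    $updates = get_canvas_updates_since($since); // hypothetical helper
    if (!empty($updates)) {
        header('Content-Type: application/json');
        echo json_encode($updates);
        exit;
    }
    usleep(250000); // wait 250 ms before checking again
}

// Nothing new: return an empty result so the client re-polls immediately.
header('Content-Type: application/json');
echo json_encode(array());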
I should add that WebSockets are not specific to Node.js. WebSocket is a protocol that can be implemented in any language. There are libraries for working with WebSockets in a variety of languages, including PHP, which I assume your server supports.
I need to verify if a browser meets the requirements of our website. If one of the requirements is not met, I'd like to display a corresponding warning on the login page. The requirements are the following:
Cookies must be enabled
JavaScript must be enabled
HTML5 must be supported
IE9+
What's your approach to this problem? Can you e.g. recommend an open source library (Javascript, PHP, Zend Framework 2) I can use to do the job? Or is there a good diagnostics website I can refer to for checking browser requirements? Verifying the requirements will save us a lot of support time.
It looks like http://detector.dmolsen.com/ could be of great help. It can also be combined with the well-known http://modernizr.com/.
You can check whether first-party cookies are enabled by setting and reading a cookie from JavaScript. You can do the same for third-party cookies by making an AJAX request to a PHP file.
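A rough sketch of the PHP file such an AJAX check could call (the file name and response format are made up for illustration):

<?php
// cookie_check.php -- hypothetical endpoint hit via AJAX to verify cookies reach the server.
if (!isset($_COOKIE['cookie_check'])) {
    // First visit: try to set the cookie and tell the client to retry once.
    setcookie('cookie_check', '1', time() + 3600);
    echo json_encode(array('cookies' => 'unknown', 'retry' => true));
} else {
    echo json_encode(array('cookies' => 'enabled', 'retry' => false));
}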
HTML5 is a tough one: you can't detect whether "HTML5" as a whole is supported, but you can detect whether individual features are available (http://diveintohtml5.info/detect.html).
If you are using jQuery you can go with
if (jQuery.browser && jQuery.browser.msie && parseInt(jQuery.browser.version, 10) >= 9)
Unfortunately jQuery.browser was removed in jQuery 1.9, so the latter doesn't work with newer jQuery versions.
I'm looking for a tool to draw good-looking Venn diagrams, for use on a Linux-based PHP site, which already employs Flash for graph drawing (Open Flash Chart 2). Free (as in beer or speech) would be nice, but isn't essential.
So it should be one of the following (in my rough order of preference):
Browser based (Flash)
PHP library
Linux command line app
Web service
So far the options I'm aware of are:
Google Charts
Write something myself using PHP GD or Flash
Use Venny. It is very easy to use and goes up to four groups.
Use the Google Chart API.
Write something yourself using PHP GD or Flash.
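If you go the PHP GD route, a minimal two-circle sketch could look like this (sizes and colours are arbitrary; area-proportional diagrams need more maths):

<?php
// Minimal GD sketch: two overlapping, semi-transparent circles saved as a PNG.
$img = imagecreatetruecolor(400, 250);
imagealphablending($img, true);

$white = imagecolorallocate($img, 255, 255, 255);
imagefilledrectangle($img, 0, 0, 400, 250, $white);

// Alpha runs 0-127 in GD; ~60 gives a visible overlap region.
$red  = imagecolorallocatealpha($img, 255, 0, 0, 60);
$blue = imagecolorallocatealpha($img, 0, 0, 255, 60);

imagefilledellipse($img, 150, 125, 180, 180, $red);   // set A
imagefilledellipse($img, 250, 125, 180, 180, $blue);  // set B

$black = imagecolorallocate($img, 0, 0, 0);
imagestring($img, 4, 90, 115, 'A', $black);
imagestring($img, 4, 300, 115, 'B', $black);

imagepng($img, 'venn.png');
imagedestroy($img);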