I've got a couple of affiliate sites and would like to bring together the earnings reports from several Amazon sites into one place, for easier viewing and analysis.
I get the impression that cURL can be used to get external webpage content, which I could then scrape to obtain the necessary info. However, I've hit a wall in trying to log in to the Associates reports using cURL.
Has anyone done this and do you have any advice?
I am working on an open-source project called PHP-OARA. It allows you to get your data from the different networks, and it's part of the AffJet project.
We have solved the problem with Amazon (it wasn't easy), and a PHP class is available to get your data from your Associates account. If you like it, you can even collaborate to make it better.
I hope it helps!
You can do this, but you'll need to make use of cookies with cURL: http://www.electrictoolbox.com/php-curl-cookies/ But I'd be willing to bet some cash that Amazon offers an API to get the data you want, although the last time I dealt with their web services it was a nightmare, probably because I was using the PHP SOAP extension and the Amazon SOAP API.
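As a starting point, here is a minimal sketch of the cookie-based cURL login approach. The login URL and form field names are placeholders, not Amazon's real ones — you'd need to inspect the actual login form to find the action URL and input names.

```php
<?php
// Minimal sketch: POST login credentials and keep session cookies in a
// jar file, so later requests (the report pages) stay authenticated.
// $loginUrl and the $fields keys are placeholders for illustration.

function curlLogin(string $loginUrl, array $fields, string $cookieFile): string
{
    $ch = curl_init($loginUrl);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,  // follow the post-login redirect
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($fields),
        CURLOPT_COOKIEJAR      => $cookieFile, // write session cookies here...
        CURLOPT_COOKIEFILE     => $cookieFile, // ...and send them on later requests
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return (string) $body;
}

// Usage (placeholder URL and field names):
// curlLogin('https://example.com/login',
//           ['email' => 'me@example.com', 'password' => 'secret'],
//           '/tmp/cookies.txt');
// A second request with the same cookie file then fetches the report page.
```

The key detail is pointing both `CURLOPT_COOKIEJAR` and `CURLOPT_COOKIEFILE` at the same file, so the session cookie set at login is replayed on the scraping requests.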
I am somewhat of a beginner in iOS development and am beginning to move from front-end dev to the back-end. I know using JSON with PHP and MySQL is probably the most common way to load table views from databases, but I am wondering if anyone has insight into where and how to start building a database to load image-based, user-generated content into table views (much like Instagram).
I know this is not a programming-specific question; I'm just trying to get some direction on where to pinpoint my efforts. I appreciate any and all input.
Okay, so you're learning how to build a database (MySQL). I found some links that can help you:
http://www.wikihow.com/Create-a-Database-in-MySQL
http://dev.mysql.com/doc/refman/5.1/en/database-use.html
It isn't advisable to store images in tables because you'll encounter problems when it comes to backing up. Save images in a folder on the server and store their references in the tables.
To access these images from your app, you'll need to make a request to an API written in PHP, which queries the database for you. You can find a good tutorial on how to write a simple PHP API script on Google.
From your app, you hit your API with a POST request containing the parameters you defined when writing the PHP script. Your API should return the URL of the image you seek, and you should then be able to download the image in your app.
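A minimal sketch of that kind of PHP endpoint might look like the following. The table name, column names, connection credentials, and base URL are all assumptions for illustration, not a fixed schema.

```php
<?php
// Sketch of a PHP API endpoint: the app POSTs an image id, the script
// looks up the stored file path in MySQL and returns the full URL as JSON.
// Only the relative path lives in the database (see the answer above);
// the public base URL is configuration.

function imageUrl(string $path): string
{
    return 'https://example.com/uploads/' . rawurlencode($path);
}

// Look up one image by id and emit a JSON response for the app.
function serveImage(mysqli $db, int $id): void
{
    $stmt = $db->prepare('SELECT path FROM images WHERE id = ?');
    $stmt->bind_param('i', $id);   // bound parameter avoids SQL injection
    $stmt->execute();
    $stmt->bind_result($path);

    header('Content-Type: application/json');
    if ($stmt->fetch()) {
        echo json_encode(['url' => imageUrl($path)]);
    } else {
        http_response_code(404);
        echo json_encode(['error' => 'not found']);
    }
}

// Entry point (placeholder credentials and parameter name):
// serveImage(new mysqli('localhost', 'user', 'password', 'app'),
//            (int) ($_POST['image_id'] ?? 0));
```

The app then downloads the image from the returned `url`, which keeps large binary data out of the database entirely.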
This is a tutorial on how to make a simple NSURL Connection
http://codewithchris.com/tutorial-how-to-use-ios-nsurlconnection-by-example/
Below is another link that might be of good help. It's a simple tutorial that builds a messaging app that hits an API and gets messages from a database. There is a lot to learn there to fulfill your need.
http://www.ibm.com/developerworks/library/x-ioschat/
As an alternative to the traditional ways, I find Parse very useful. Parse can manage your whole backend, and they have a very powerful free basic plan.
If you're focused on images, dive in here to decide whether Parse fits for you or not: https://parse.com/tutorials/saving-images
Any ideas? I am new to PHP and am having a lot of trouble with cURL and DOMDocument, so please write or show me an example. I was thinking of using DOMDocument, but I can't figure out how to get Amazon to search a user's input from my site and display certain parts of the results, such as price, category, etc.
There are several methods — file_get_contents, the Simple HTML DOM parser (https://simplehtmldom.sourceforge.io/), and cURL — which I've had varying luck with, but eventually Amazon starts flagging my requests with robot checks. I originally used the API, but Amazon locked that down with minimum-traffic rules that I can't meet with my budding web service, rendering it useless.
There is currently no easy or effective way to consistently pull Amazon data, though I'm experimenting with randomizing user agents and using proxies.
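For what it's worth, the user-agent randomization mentioned above can be sketched like this. The user-agent strings are just examples, and rotating them (with or without proxies) only delays robot checks rather than preventing them.

```php
<?php
// Sketch: fetch a page with a randomly chosen user agent via cURL.
// The strings below are example browser user agents, not a curated list.

const USER_AGENTS = [
    'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
    'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15',
    'Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0',
];

function randomUserAgent(): string
{
    return USER_AGENTS[array_rand(USER_AGENTS)];
}

function fetchPage(string $url): string
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERAGENT      => randomUserAgent(),
        // CURLOPT_PROXY       => 'http://proxy.example.com:8080', // optional proxy
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return (string) $body;
}
```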
Use the Product Advertising API instead of scraping. http://docs.aws.amazon.com/AWSECommerceService/latest/DG/ItemSearch.html
The Product Advertising API would actually be the best resource for this, although it gives you limited results, and I believe they may revoke your access if no affiliate transaction occurs within 180 days, so it does limit you to some extent depending on your use. I'm not sure, but I think you may need a professional seller account or an affiliate membership; I'm not 100% on that, but that is my understanding.
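For reference, a signed ItemSearch request for the (legacy) Product Advertising API can be built roughly as below. The signing scheme — sort the parameters, HMAC-SHA256 over `"GET\nhost\npath\nquery"`, base64-encode — follows the linked documentation; the access key, secret, and associate tag shown are obviously fake placeholders.

```php
<?php
// Sketch: build a signed ItemSearch URL for the Product Advertising API.
// All credentials passed in are placeholders for illustration.

function buildItemSearchUrl(string $keywords, string $accessKey,
                            string $secret, string $tag,
                            ?string $timestamp = null): string
{
    $host = 'webservices.amazon.com';
    $path = '/onca/xml';

    $params = [
        'Service'        => 'AWSECommerceService',
        'Operation'      => 'ItemSearch',
        'AWSAccessKeyId' => $accessKey,
        'AssociateTag'   => $tag,
        'SearchIndex'    => 'All',
        'Keywords'       => $keywords,
        'ResponseGroup'  => 'ItemAttributes,Offers',
        'Timestamp'      => $timestamp ?? gmdate('Y-m-d\TH:i:s\Z'),
    ];

    // The API requires the parameters sorted by byte order before signing.
    ksort($params);
    $query = http_build_query($params, '', '&', PHP_QUERY_RFC3986);

    // Sign "GET\nhost\npath\nquery" with HMAC-SHA256, then base64-encode.
    $stringToSign = "GET\n$host\n$path\n$query";
    $signature    = base64_encode(hash_hmac('sha256', $stringToSign, $secret, true));

    return "https://$host$path?$query&Signature=" . rawurlencode($signature);
}
```

Fetching the resulting URL returns XML that can be parsed with `simplexml_load_string()` to pull out price, category, and so on.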
I am trying to create a web application which has functionality similar to Google Alerts (by similar I mean the user can provide an email address for the alert to be sent to, daily or hourly). The only limitation is that it only gives alerts to the user based on a certain keyword or hashtag. I think that I have found the fundamental API needed for this web application.
https://dev.twitter.com/docs/api/1/get/search
The problem is I still don't know all the web technologies needed for this application to work properly. For example, do I have to store all of the searched keywords in a database? Do I have to keep polling with AJAX requests all the time in order to keep my database updated? What if the keyword the user provided is very popular right now and might get thousands of tweets in an hour (not to mention, there might be several emails that request several trending topics)?
By the way, I am trying to build this application using PHP. So please let me know, what kind of techniques I need to learn for such web app (and some references maybe)? Any kind of help will be appreciated. Thanks in advance :)
Regards,
Felix Perdana
I guess you should store users' e-mails and search keywords (or whatever) in the database.
Then your app should make API queries (so it should run on a server) to get the relevant data, and then send that data to all the users.
To understand, here is the algorithm:
1. A user adds his request on a page like http://www.google.ru/alerts
2. You store his e-mail and keyword in the database.
3. Your server runs a script (in a loop, or via cron) which queries Twitter for new data.
4. The script processes the data and sends it to the users' e-mail addresses.
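A rough sketch of that cron step in PHP could look like this. The table and column names, the alert schema, and the exact search endpoint are assumptions for illustration (and note that newer versions of the Twitter API require authentication).

```php
<?php
// Sketch of the cron job: for each stored (email, keyword) pair, query
// the Twitter search API and mail any new results. Schema is invented.

function fetchJson(string $url): array
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);
    return json_decode((string) $body, true) ?: [];
}

// Turn a batch of tweets into a plain-text alert e-mail body.
function formatAlert(string $keyword, array $tweets): string
{
    $lines = ["New tweets for \"$keyword\":", ''];
    foreach ($tweets as $t) {
        $lines[] = '- ' . $t['text'];
    }
    return implode("\n", $lines);
}

// Run from cron, e.g. hourly (placeholder table/column names):
// foreach ($db->query('SELECT email, keyword, last_id FROM alerts') as $row) {
//     $data = fetchJson('https://api.twitter.com/1.1/search/tweets.json?q='
//                       . urlencode($row['keyword']) . '&since_id=' . $row['last_id']);
//     if (!empty($data['statuses'])) {
//         mail($row['email'], 'Alert: ' . $row['keyword'],
//              formatAlert($row['keyword'], $data['statuses']));
//         // ...then update last_id so the same tweets are never sent twice.
//     }
// }
```

Tracking a `since_id` per alert is what keeps the job from re-sending tweets it already delivered, even for very popular keywords.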
Have you ever tried to write code to get all the visitors from cPanel's Latest Visitors?
Is this possible?
A little poking around the cPanel API site showed that what you're looking for is the listlastvisitors function in the ApiStats module (API2). From my memories of the cPanel API, it takes a lot of poking around through an amazing multitude of pages to get the API code to work in PHP, but if you're up for a challenge, here's the module, and information about calling the API functions is here. Poke around the XML API, and check out API2. Good luck, you will need it!
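Under the assumption that you call API2 through the XML-API over HTTPS, building the request URL might look like the sketch below. The module/function names come from the answer above; the host, port, and authentication details are placeholders, so check the cPanel API2 docs for the exact parameters your version expects.

```php
<?php
// Sketch: build an XML-API URL that invokes an API2 function.
// Host and credentials are placeholders; authentication (e.g. an
// Authorization header on the cURL request) is deliberately left out.

function buildApi2Url(string $host, string $user, string $module, string $func): string
{
    $params = http_build_query([
        'cpanel_xmlapi_user'       => $user,
        'cpanel_xmlapi_apiversion' => 2,
        'cpanel_xmlapi_module'     => $module,
        'cpanel_xmlapi_func'       => $func,
    ]);
    return "https://$host:2087/xml-api/cpanel?$params";
}

// $url = buildApi2Url('example.com', 'bob', 'ApiStats', 'listlastvisitors');
// Fetch $url with cURL plus your WHM/cPanel credentials, then parse:
// $visitors = simplexml_load_string($xml);
```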
I am defining specs for a live activity feed on my website. The backend data model is done, but the open question is the actual code development: my development team is unsure of the best way to make the feeds work in real time. Is this purely done by writing custom code, or do we need to use existing frameworks? One suggestion was to use reverse AJAX. Someone mentioned having the client poll the server every x seconds, but I don't like this because it creates unwanted server traffic when there are no updates. A push engine like Lightstreamer, which pushes from server to browser, was also mentioned.
So in the end: what is the way to go? Is it custom code, purely pushing SQL queries, frameworks, platforms, etc.?
My platform is written in PHP (CodeIgniter) and the DB is MySQL.
The activity stream will have lots of activities. There are 42 components on the social network I am developing, and each component has roughly 30 unique activities that can be streamed.
Check out http://www.stream-hub.com/
I have been using superfeedr.com with Rails and I can tell you it works really well. Here are a few facts about it:
Pros
Julien, the lead developer, is very helpful when you encounter a problem.
Immediate push of new entries for feeds that support PubSubHubbub.
JSON responses, which are perfect for parsing however you'd like.
A retrieve API, in case the update callback fails and you need to fetch the latest entries for a given feed.
Cons
Documentation is not up to the standard I would like, so you'll likely end up searching the web for obscure implementation details.
You can't control how often Superfeedr fetches each feed; they use a secret algorithm to determine that.
The web interface allows you to manage your feeds but becomes difficult to use when you subscribe to a lot of them.
The subscription verification mechanism works synchronously, so you need to make sure the object URL is ready for the Superfeedr callback to hit it (they do provide an async option, which does not seem to work well).
Overall I would recommend superfeedr as a good solution for what you need.