What would be the best way to mine Twitter data? [closed] - php

I need to do a MASSIVE data mine. I want to:
Find a user's location
Look at their tweets for specific words in the last two days
Repeat (ideally) for every Twitter user
I've seen R recommended somewhere, but wouldn't really know where to begin.
Happy with a CSV, JSON or SQL endpoint.

As you tagged "python" in your question, I'm going to assume you're OK with it! Twitter lets you access its data via two APIs:
The REST API allows you to make specific user requests (profile, friends, etc.), but it only allows a few queries per hour, so it probably does not meet your "massive data" criterion.
The streaming API delivers tweets matching a search in real time. You can definitely harvest massive data using this API, and if I remember correctly, tweets come with useful info (the user who tweeted, of course, but probably the location too, if enabled).
Tweepy (http://www.tweepy.org/) is a user-friendly Python library implementing both Twitter APIs, with particularly helpful functions for capturing data from the streaming API (see examples here: https://github.com/tweepy/examples).
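A minimal sketch of that streaming approach, using Tweepy's classic StreamListener interface (Tweepy versions before 4.0). All credentials are placeholders you'd generate from Twitter's developer dashboard, and the keywords are just examples:

```python
import tweepy

# Placeholder credentials -- create an app on Twitter's developer site
# and copy the real values here.
CONSUMER_KEY = "..."
CONSUMER_SECRET = "..."
ACCESS_TOKEN = "..."
ACCESS_TOKEN_SECRET = "..."

class KeywordListener(tweepy.StreamListener):
    """Receives tweets from the streaming API as they arrive."""

    def on_status(self, status):
        # Each status carries the tweet text plus user metadata,
        # including user.location when the user has filled it in.
        print(status.user.screen_name, status.user.location, status.text)

    def on_error(self, status_code):
        # Returning False disconnects the stream, e.g. on rate limiting.
        if status_code == 420:
            return False

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)

stream = tweepy.Stream(auth=auth, listener=KeywordListener())
# Filter the live stream for the specific words you care about.
stream.filter(track=["keyword1", "keyword2"])
```

Note that the stream only captures tweets from the moment you connect, so for "the last two days" you would keep it running and write results to CSV or SQL as they arrive.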

Related

How do I program this thing (extract data from another website)? [closed]

I am a union member at an airline company, and I have elementary-level HTML, PHP and MySQL. I have experience programming a library system and a personnel system.
Our union would like to create an online platform; one of its functions is to let our crews calculate their flying allowances and salary easily.
I think, first of all, I need the roster data, so the platform needs to log in to my company's website to extract the roster data, which I can then process with PHP.
Therefore, I wonder if it is possible to write PHP code to "log in to my company website and extract the data".
Or what language would you recommend for this? Maybe I can learn a new language if PHP is not applicable.
Thanks for your attention.
Welcome to Stack Overflow.
To answer your question: yes, you can write this in PHP. And by you, I mean you! None of us knows the site, and there is no way for us to know how it operates.
You may want to check whether the site offers an API. Otherwise, you can load the site via cURL or another HTTP request in PHP and parse the HTML. There are a lot of ways to do this.
PHP is not the only language that can do this: ASP.NET, Perl, Python... In the end, it sounds like you need to collect data that you would normally view in a web browser, so you're going to mimic that behavior with whatever language you choose.
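If it does come down to cURL, here is a minimal sketch, assuming the roster page sits behind an ordinary form-based login. Every URL and form-field name below is hypothetical; inspect your company site's login form to find the real ones:

```php
<?php
// Hypothetical endpoints -- replace with your company site's real URLs.
$loginUrl  = 'https://example-airline.com/login';
$rosterUrl = 'https://example-airline.com/roster';
$cookieJar = tempnam(sys_get_temp_dir(), 'cookies');

// Step 1: POST the login form and store the session cookie.
$ch = curl_init($loginUrl);
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'username' => 'your-username',  // hypothetical field names --
        'password' => 'your-password',  // check the login form's HTML
    ]),
    CURLOPT_COOKIEJAR      => $cookieJar,
    CURLOPT_COOKIEFILE     => $cookieJar,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
curl_close($ch);

// Step 2: request the roster page using the authenticated session.
$ch = curl_init($rosterUrl);
curl_setopt_array($ch, [
    CURLOPT_COOKIEFILE     => $cookieJar,
    CURLOPT_RETURNTRANSFER => true,
]);
$html = curl_exec($ch);
curl_close($ch);

// Step 3: parse the returned HTML for the roster data.
$doc = new DOMDocument();
@$doc->loadHTML($html);  // suppress warnings from imperfect markup
foreach ($doc->getElementsByTagName('td') as $cell) {
    echo trim($cell->textContent), "\n";
}
```

If the site uses CSRF tokens or a JavaScript-driven login, you'd first GET the login page, extract the token, and include it in the POST; the overall pattern stays the same.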

Open Source Reporting and Analytics for PHP (Logistics) [closed]

We're a small logistics company and want to give clients a way to generate custom reports from their data in our shared database (limited by the relations built into the DB), as well as a few standard reports that we will build for them.
We've looked into iDashboards and Logi Analytics, but the price tags are huge, and iDashboards has limitations on user sessions that directly conflict with how we handle user logins (one company might have 10 people using the same username/password), so a "perpetual named session" is pretty much unacceptable.
The ability to generate maps from the data (like a US map) and drill down to the city, county, or zip-code level is also a must (though I would consider software that lets me add this easily).
I've been searching and searching, and have not found anything that looks useful so far; I'm hoping someone out there has used something they liked and can make a recommendation.
If I can provide any more info about our requirements or needs, let me know and I'll gladly edit.
Try DBxtra; the license allows an unlimited number of report-viewer users, and it can also do maps of pivot grids. Besides that, it can connect to several databases and even plain-text (CSV) files.
Take a look at myDBR. It will do everything you listed and much more.

Private Twitter application? [closed]

I'm looking to integrate Twitter into an existing internal system. I need to use several features of the API (such as mentions) that require authorization.
From my understanding, this can only be done with a Twitter application, and as such I've created one.
However, it all seems very public-facing, asking for details like a website URL and an application description, which in my case I do not need or want.
I simply need to authorize my system to make calls to the API. Am I going about it the right way?
If not, is there a particular PHP library or alternative way of getting authorized? I can't imagine I'm the first in this situation.
Thanks!
You'll need to provide these details so Twitter knows what kind of application you're building. If your application is not going to publish any tweets, that's entirely fine; if you're simply making GET requests, your application information won't be published anywhere.
If you don't have a public-facing URL for your application, you can enter a placeholder or a fake URL.
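If you want a concrete PHP starting point, one widely used library is abraham/twitteroauth (https://twitteroauth.com/). A minimal sketch, assuming you've already created the application and generated an access token for your own account from the app's dashboard (all four keys below are placeholders):

```php
<?php
require 'vendor/autoload.php';  // composer require abraham/twitteroauth

use Abraham\TwitterOAuth\TwitterOAuth;

// Placeholder credentials from your application's dashboard. Since the
// app only acts for your own account, you can generate the access token
// there directly -- no public-facing OAuth sign-in flow is needed.
$connection = new TwitterOAuth(
    'CONSUMER_KEY',
    'CONSUMER_SECRET',
    'ACCESS_TOKEN',
    'ACCESS_TOKEN_SECRET'
);

// Example GET request: fetch recent mentions of the authorized account.
$mentions = $connection->get('statuses/mentions_timeline', ['count' => 20]);

foreach ($mentions as $tweet) {
    echo $tweet->user->screen_name, ': ', $tweet->text, "\n";
}
```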

Is there a single PHP API to post to multiple sharing sites (Twitter, Reddit, Linkedin, etc)? [closed]

Is there a single PHP API to post to multiple sharing sites (Twitter, Reddit, Linkedin, YouTube etc), or do I have to use multiple APIs? Or is there an online service that will do this via, say, a REST interface?
Good question. I know a lot of people use HootSuite to update multiple networks; however, I didn't see anything about an API on their site. In searching, I found two that looked OK and offered APIs. They both update ~40 services, though I didn't see Reddit or YouTube. You can check them out; maybe you can link one account to another, like Facebook lets you do with Twitter. Anyway, the sites are:
http://ping.fm
http://hellotxt.com
Ping.fm supports quite a few services and has an API.
There is no single API for all services, so to speak, but there are multiple GitHub projects that do this!
Social Share URLs
Sharer.js
I personally contribute to both of these projects on GitHub! Even though Sharer.js has a "support" button there for contributions, those go to the project owners; I'm only a contributor, not a project owner!
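For many networks you don't need a posting API at all; the Social Share URLs project documents plain share links you can assemble yourself. A minimal PHP sketch (the page URL and text are placeholders, and the endpoints shown are the commonly documented web share URLs):

```php
<?php
// Hypothetical page to share -- swap in your own URL and text.
$url  = 'https://example.com/my-article';
$text = 'An article worth reading';

// Commonly documented web share endpoints (see the Social Share URLs repo).
$shareLinks = [
    'twitter'  => 'https://twitter.com/intent/tweet?' .
                  http_build_query(['url' => $url, 'text' => $text]),
    'reddit'   => 'https://www.reddit.com/submit?' .
                  http_build_query(['url' => $url, 'title' => $text]),
    'linkedin' => 'https://www.linkedin.com/shareArticle?' .
                  http_build_query(['mini' => 'true', 'url' => $url]),
];

foreach ($shareLinks as $site => $link) {
    echo $site, ': ', $link, "\n";
}
```

Keep in mind these links open a pre-filled share dialog for a visitor; posting on a user's behalf without any interaction still requires each site's own API.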

How do sites such as livescore.com work? [closed]

I had an idea to work on a livescore site with some additions; however, before even trying to build one, I would like to know where such sites get their data.
Thanks for any answer.
I've no idea for this particular site, but it seems like there are a couple of viable possibilities:
Subscribe to a live feed of information from a provider, e.g. a broadcaster or similar. I would imagine this would cost a fair bit and be subject to restrictions on use, etc.
Hire people to obtain the information from a variety of sources (including primary sources, i.e. watching several of the games) and update a database manually.
Scrape ("borrow") the information from one or more other sites which do one of the above.
You can do this using a live sports data XML API (google "live sports data api xml"), e.g. http://www.xmlsportsfeeds.com/products/live-scores/. These are essentially paid RSS-style feeds, and they can be quite expensive.
