Extracting Facebook group data with no admin rights - Python

I need to extract some data from a Facebook group for my research. I tried a few approaches and tools, but nothing worked the way I wanted. I want to extract things like comments, the number of likes per post, post content, date of post, etc. I am looking for an R or Python script that can automate this process. I followed many tutorials, but they seem to be outdated since Facebook changed its API. I have come to know that Facebook has its own API for data extraction. Please guide me to do this either in R or in Python.
(Remember - I don't have admin rights to grab the data from the group.)

I found this blog post about Rfacebook helpful. Note that if you don't have admin rights for the group then you're probably out of luck unless you just want to access your own data.
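For the Python side, here is a minimal sketch of what a Graph API pull might look like, assuming you have an access token with the necessary permissions and that the group's feed is accessible to your account at all; the group ID, endpoint, and field names reflect the Graph API of that era and may need adjusting for the current version.

    # Minimal sketch: pull recent posts from an accessible group via the Graph API.
    # The group ID, token, and field names are assumptions; adjust them to whatever
    # the current API version actually exposes.
    import requests

    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"   # user token with the necessary permissions
    GROUP_ID = "GROUP_ID_HERE"           # placeholder group ID

    url = "https://graph.facebook.com/{}/feed".format(GROUP_ID)
    params = {
        "access_token": ACCESS_TOKEN,
        "fields": "message,created_time,likes.summary(true),comments.summary(true)",
        "limit": 100,
    }

    while url:
        resp = requests.get(url, params=params).json()
        for post in resp.get("data", []):
            likes = post.get("likes", {}).get("summary", {}).get("total_count", 0)
            comments = post.get("comments", {}).get("summary", {}).get("total_count", 0)
            print(post.get("created_time"), likes, comments, post.get("message", "")[:80])
        # follow the pagination cursor, if any
        url = resp.get("paging", {}).get("next")
        params = {}  # the "next" URL already carries the query string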

Related

Find a Facebook group ID via graph

I'm writing a script in Python to convert Facebook usernames to user IDs. To get a user ID I simply go to:
https://graph.facebook.com/USERNAME
Is there any way to access this for groups as well?
I have found other questions like this but none give a definitive answer.
Do I need to use oauth?
I have seen several websites that do what I need such as
http://lookup-id.com/
https://www.wallflux.com/facebook_id/
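The same lookup the question describes can be scripted directly; a minimal sketch with requests is below. Whether it works without a token depends on the object and the API version; in general you do need an OAuth access token, and the example name is just a placeholder.

    # Sketch: resolve a username (user, page, or group) to its numeric ID via the Graph API.
    # An OAuth access token is usually required; pass None to try an unauthenticated lookup.
    import requests

    def facebook_id(name, access_token=None):
        params = {"access_token": access_token} if access_token else {}
        resp = requests.get("https://graph.facebook.com/" + name, params=params)
        resp.raise_for_status()
        return resp.json().get("id")

    print(facebook_id("cocacola"))  # placeholder example name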

How do I track how many users visit my website?

I just deployed my first ever web app, and I am curious whether there is an easy way to track every time someone visits my website. I am sure there is, but how?
Easy as pie: use Google Analytics. You just have to include a tiny script in your app's pages.
http://www.google.com/analytics/
PythonAnywhere dev here. You also have your access log, which you can click through to from your web app tab; it shows you the raw data about your visitors. I would personally also use something like Google Analytics, but you don't need to do anything extra just to see your raw visitor data. It's already there.
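If you want numbers rather than raw lines, a rough sketch like the following can tally hits and unique client IPs from an access log; the file path and the "IP is the first field" assumption match the common/combined log format and will need adjusting to your host's actual layout.

    # Rough sketch: count total hits and unique client IPs from a server access log.
    from collections import Counter

    LOG_PATH = "access.log"  # placeholder path; your host exposes its own log files

    hits = 0
    visitors = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            hits += 1
            ip = line.split(" ", 1)[0]  # first whitespace-separated field in common log format
            visitors[ip] += 1

    print("total hits:", hits)
    print("unique visitors:", len(visitors))
    print("top visitors:", visitors.most_common(5))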
I know from my own experience that people are obsessed with traffic, statistics, and tracking other sites' stats, and where there is enough demand there are, of course, sites to satisfy it. I wanted to put those sites and tools together in one list, because at least for me this field was really unclear: I didn't know what Google PageRank, Alexa, Compete, or Technorati rankings meant, and I could go on. I must say these stats are not always precise, but they at least give an overview of how popular a certain page is and how many visitors it gets, and if you compare those stats with your own site's statistics you can get pretty accurate results.
http://www.stuffedweb.com/3-tools-to-track-your-website-visitors/
http://www.1stwebdesigner.com/design/10-ways-how-to-track-site-traffic-popularity-statistics/
I am a huge fan of Cloudflare's analytics. It is super easy to set up, and you don't have to worry about adding a JavaScript blurb to each page. Cloudflare is also able to track all of the things that visit your page without loading the JavaScript.
http://www.cloudflare.com

Is it possible to validate/lint/bleach a piece of code given by analytics/tracking sites like Google Analytics or Piwik?

In the administration interface of the site I make, I provide an area where the webmaster can place tracking codes from external analytics tools. Essentially, these snippets must be included 'as-is', but my concern is that any typo could render the page useless, mess up the HTML, etc.
Is it possible (to some extent) to clean up/validate these snippets so that at least the HTML won't be corrupted?
I'm using Python/Django, but I guess the Django part is somewhat irrelevant to this topic.
Regards
The easy way would be to grab the code from the tracking site and hard-code everything but the unique portion (usually a user ID number), offer the user a choice of approved trackers (with a radio button) and have them paste in their ID, then insert that value when you render the page. If I remember correctly, blogger worked like this before they tied it in directly with analytics via the google account.
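A minimal sketch of that idea in Python is below; the tracker template and ID regex are illustrative assumptions rather than the official snippets. The point is only that the webmaster supplies an ID, never raw HTML/JS.

    # Sketch of the "hard-code the snippet, only accept the ID" idea.
    # The snippet text and ID pattern below are illustrative, not the official ones.
    import re

    TRACKERS = {
        "google_analytics": {
            "id_pattern": re.compile(r"^UA-\d{4,10}-\d{1,4}$"),
            "snippet": (
                "<script>"
                "var _gaq=_gaq||[];_gaq.push(['_setAccount','{tracking_id}']);"
                "_gaq.push(['_trackPageview']);"
                "</script>"
            ),
        },
        # "piwik": {...}  # same pattern for other approved trackers
    }

    def tracking_snippet(tracker, tracking_id):
        """Return safe HTML for an approved tracker, or '' if the ID doesn't validate."""
        cfg = TRACKERS.get(tracker)
        if cfg is None or not cfg["id_pattern"].match(tracking_id):
            return ""
        return cfg["snippet"].format(tracking_id=tracking_id)

In a Django template you would then emit the result with mark_safe (or the safe filter), which is acceptable only because it was built from a hard-coded template plus a validated ID.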

How do I count the number of links shared by a Facebook page?

I am working on a website for which it would be useful to know the number of links shared by a particular facebook page (e.g., http://www.facebook.com/cocacola) so that the user can know whether they are 'liking' a firehose of information or a dribble of goodness. What is the best way to get the number of links/status updates that are shared by a particular page?
+1 for implementations that use python (this is a django website) but any solutions are welcome! I tried using fbconsole to accomplish this but I have come up a little short.
For what it is worth, this unanswered question seems relevant. So does the fact that, as of 2012.04.18, you can export your data to CSV from the Insights management page on the Facebook site. The information is in there; I just don't know how to get it out...
Thanks for your help!
In the event that anyone else finds this useful, I thought I'd post my gist example here. fbconsole makes it fairly simple to extract data through the Facebook Graph API.
The caveat is that it was not terribly easy to programmatically extract data through fbconsole, so I wrote fbconsole.automatically_authenticate to make it much easier to access this information in a systematic way. This addition has not yet been incorporated into the master branch of fbconsole (it was just posted this morning), but it is available here in the meantime for those who are interested.
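For anyone wanting a concrete starting point, here is a rough sketch of counting a page's link posts with fbconsole. The authenticate()/get() calls are the library's interactive helpers as I understand them, and the 'type' field, the params argument, and the permission name are assumptions about the Graph API of that era, so treat this as a sketch rather than a drop-in answer.

    # Sketch: count how many of a page's posts are shared links, via fbconsole.
    import fbconsole

    fbconsole.AUTH_SCOPE = ["read_stream"]  # permission name from that API era (assumption)
    fbconsole.authenticate()

    # One page of results; for a full count keep following posts["paging"]["next"]
    # until it runs out.
    posts = fbconsole.get("/cocacola/posts", {"limit": 100})
    link_count = sum(1 for post in posts.get("data", []) if post.get("type") == "link")
    print("links shared (first page):", link_count)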

Crawler for Twitter social graph via python

I am sorry for asking, but I am new to writing crawlers.
I would like to crawl Twitter for users and the follow relationships among them using Python.
Any recommendation for starting points such as tutorials?
Thank you very much in advance.
I'm a big fan of Tweepy myself - https://github.com/tweepy/tweepy
You'll have to refer to the Twitter docs for the API methods that you're going to need. As far as I know, Tweepy wraps all of them, but I recommend looking at Twitter's own docs to find out which ones you need.
To construct a following/follower graph, you're going to need some of these (a short sketch follows the list):
GET followers/ids - grab followers (in IDs) for a user
GET friends/ids - grab followings (in IDs) for a user
GET users/lookup - grab up to 100 users, specified by IDs
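Here is a minimal sketch of those calls with Tweepy, assuming 3.x-style method names (followers_ids, friends_ids, lookup_users) and placeholder credentials; newer Tweepy releases renamed these methods, so check the docs for whatever version you install.

    # Sketch of the follower/following lookups listed above, using Tweepy 3.x-style names.
    import tweepy

    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth, wait_on_rate_limit=True)

    seed = "twitter"  # placeholder starting account

    follower_ids = api.followers_ids(screen_name=seed)   # GET followers/ids
    friend_ids = api.friends_ids(screen_name=seed)       # GET friends/ids
    print(len(follower_ids), "followers;", len(friend_ids), "followings")

    # GET users/lookup takes at most 100 IDs per call
    for chunk_start in range(0, len(follower_ids), 100):
        users = api.lookup_users(user_ids=follower_ids[chunk_start:chunk_start + 100])
        for user in users:
            print(user.id, user.screen_name)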
Besides reading the Twitter API docs?
A good starting point would be the great Python twitter library by Mike Verdone, which I personally think is the best one (there is also an introduction here).
Also see this question on Stack Overflow.
