I have a problem with the Python library instaloader. It's really cool, but I can't find a method to download a post by URL or post ID. Everything I have found is a terminal command in the official documentation:
instaloader -- -B_K4CykAOtf
But a terminal command isn't an option for me; I need a way to do this from a script. I hope somebody knows the answer, thanks for your attention.
I am trying to download about 6000+ comments from this link, which uses Spot.IM to manage the comments. I saw an earlier solution posted here that requires a Spot.IM token, but the token can only be given by the account manager (I presume it requires a paid account).
Is there any other way to download the comments without the need for a token?
Yes, you can use Selenium with a webdriver for this.
You can follow this link to start:
https://selenium-python.readthedocs.io/getting-started.html
Probably this was already answered somewhere, but my google-fu can't get proper keywords for this.
Ok so, I need to get a file from my site, something like foo.bar/foobar/file.ext. The file is always accessible, but if you aren't Google-authenticated on the site you get a blank file.
How can I get proper authentication with python?
Sorry if this isn't very clear, but it's my first time here... Thanks in advance for your help.
First install the Python SDK here.
Then I would start reading about OAuth here.
They have different examples depending on your use case (e.g. authenticating a user from their browser versus server to server).
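Once the OAuth flow has given you an access token, the download itself is an ordinary HTTP request with an Authorization header. A sketch with the requests library; the URL is the placeholder from the question, the token is a stand-in for whatever the flow returns, and whether the site accepts a bearer token depends on how its authentication is wired up:

```python
import requests

url = "https://foo.bar/foobar/file.ext"  # placeholder from the question
token = "ACCESS_TOKEN"                   # obtained via the OAuth flow

# Build (but don't send) the request, to show its shape.
req = requests.Request(
    "GET", url, headers={"Authorization": f"Bearer {token}"}
).prepare()
print(req.headers["Authorization"])

# To actually fetch the file:
# resp = requests.Session().send(req)
# open("file.ext", "wb").write(resp.content)
```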
I need to input text into the text box on this website:
http://www.link.cs.cmu.edu/link/submit-sentence-4.html
I then require the return page's html to be returned.
I have looked at other solutions, but I am aware that there is no one-size-fits-all answer.
I have seen Selenium, but I don't understand its documentation or how to apply it.
Please help me out, thanks.
BTW, I have some experience with BeautifulSoup, if that helps.
Check out the requests module. It is super easy to use for any kind of HTTP request, and it gives you complete control over any extra headers or form payload data you need to POST to the website.
P.S. If all else fails, make the request in a web browser and copy it as a curl command using the browser's inspector. Then you can run that curl command from a Python script (you might need to install curl on your system if you don't have it) with the parameters from the copied request.
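A sketch of the requests approach for that form. The form field name here is an assumption; check the page's HTML (the `<form>` and `<input>` tags) for the real field names before sending anything:

```python
import requests

url = "http://www.link.cs.cmu.edu/link/submit-sentence-4.html"
payload = {"Sentence": "The quick brown fox jumps."}  # field name is a guess

# Build the POST without sending it, to show the shape of the request.
req = requests.Request("POST", url, data=payload).prepare()
print(req.method, req.body)

# To actually submit and get the result page's HTML back:
# html = requests.post(url, data=payload).text
```

The returned HTML can then be handed to BeautifulSoup, which you already know, to pull out the parse results.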
can anyone help me get sessions or cookies working with my code here:
http://pastebin.com/2Y2tydsF
I have tried a few session modules that I found with Google, but nothing seems to work, or I don't know how to use it.
I have also fiddled with cookies, but with no luck.
Also, what is the difference between the two? What are CGI and WSGI apps, and would my code count as one of them?
Thanks
Use gae-sessions. The source includes demos that show how to use it, including how to integrate with the Users API or RPX/JanRain.
I want to write a Python script that connects to several web servers and posts data to them automatically.
Please show how to post and submit data, for example to Google. I just can't figure out how to do it from the Python documentation. Thank you.
Check the documentation for the urllib2 module, and check out urllib2: The Missing Manual. It's all there.
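A minimal sketch of that approach; in Python 3 the same pieces live in urllib.request and urllib.parse. Passing a data body is what turns the request into a POST. Google's real search endpoint won't accept a scripted POST like this, so treat the URL and field name as purely illustrative:

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Form fields are urlencoded into the request body.
data = urlencode({"q": "python"}).encode("ascii")
req = Request("https://www.google.com/search", data=data)

# A Request with a data body defaults to the POST method.
print(req.get_method())

# response = urlopen(req)   # sends the request; response.read() is the body
```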
If you are open to packages outside the standard library, then mechanize is a good option for such tasks.