Python 3.4 HTTP connection using ip, port, path

I'm trying to establish an HTTP connection but I'm getting the error:
http.client.InvalidURL: numeric port: '80/test/path'
Code:
import http.client
self.conn = http.client.HTTPConnection("192.168.1.1:80/test/path") # "[ip]/[path]", [port] doesn't work either
# some POST and PUT requests
Working code:
import requests
requests.post('http://192.168.1.1:80/test/path', data='testdata')
How can I establish an HTTP connection using ip, port and path?
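With http.client the host and port go to HTTPConnection and the path goes to request(); a minimal sketch using the values from the question:
import http.client

# host and port only; the path is passed per request, not in the constructor
conn = http.client.HTTPConnection("192.168.1.1", 80)
conn.request("POST", "/test/path", body="testdata")
response = conn.getresponse()
print(response.status, response.read())
conn.close()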

Related

Connecting with ftplib via FTP proxy in Python?

I am trying to download files over FTP. It works fine at home, but it doesn't work when I run it through my company's network, so I know it has something to do with the proxy. I have looked at a few posts about proxy issues in Python and have tried to set up a connection through the proxy. It works for a URL but fails when connecting to FTP. Does anyone know a way to do that? Thanks in advance.
Below is my code:
import os
import urllib
import ftplib
from ftplib import FTP
from getpass import getpass
from urllib.request import urlopen, ProxyHandler, HTTPHandler, HTTPBasicAuthHandler, \
build_opener, install_opener
user_proxy = "XXX"
pass_proxy = "YYY"
url_proxy = "ZZZ"
port_proxy = "89"
url_proxy = "ftp://%s:%s#%s:%s" % (user_proxy, pass_proxy, url_proxy, port_proxy)
authinfo = urllib.request.HTTPBasicAuthHandler()
proxy_support = urllib.request.ProxyHandler({"ftp" : url_proxy})
# build a new opener that adds authentication and caching FTP handlers
opener = urllib.request.build_opener(proxy_support, authinfo,
urllib.request.CacheFTPHandler)
# install it
urllib.request.install_opener(opener)
#url works ok
f = urllib.request.urlopen('http://www.google.com/')
print(f.read(500))
urllib.request.install_opener(opener)
#ftp is not working
ftp = ftplib.FTP('ftp:/ba1.geog.umd.edu', 'user', 'burnt_data')
The error message I got:
730 # and socket type values to enum constants.
731 addrlist = []
--> 732 for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
733 af, socktype, proto, canonname, sa = res
734 addrlist.append((_intenum_converter(af, AddressFamily),
gaierror: [Errno 11004] getaddrinfo failed
I can connect via the proxy using FileZilla by selecting custom FTP proxy with specification:
USER %u#%h %s
PASS %p
ACCT %w
FTP Proxy using FileZilla
You are connecting using an FTP proxy.
An FTP proxy cannot handle HTTP, so your test against an http:// URL (www.google.com) is irrelevant and does not prove anything.
An FTP proxy works like an FTP server: you connect to the proxy instead of to the actual server, and then use a special syntax for the username (or other credentials) to specify the actual target FTP server and its credentials. In your case the special username syntax is user#host user_proxy (matching the USER %u#%h %s line in your FileZilla configuration), and the proxy expects the proxy password in the FTP ACCT command.
This should work for your specific case:
host_proxy = '192.168.149.50'
user_proxy = 'XXX'
pass_proxy = 'YYY'
user = 'user'
user_pass = 'burnt_data'
host = 'ba1.geog.umd.edu'
u = "%s#%s %s" % (user, host, user_proxy)
ftp = ftplib.FTP(host_proxy, u, user_pass, pass_proxy)
No other code should be needed (urllib or any other).
If the proxy uses a custom port (not 21), use this:
ftp = ftplib.FTP()
ftp.connect(host_proxy, port_proxy)
ftp.login(u, user_pass, pass_proxy)

Tor + Urllib2 Python

I am trying to use Tor to get a new IP address every time I access a website:
import socks
import socket
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS4, '127.0.0.1', 9151, True)
socket.socket = socks.socksocket
import urllib2
print urllib2.urlopen("http://almien.co.uk/m/tools/net/ip/").read()
I have also tried ports 9150 and 9050.
I keep getting:
socks.ProxyConnectionError: Error connecting to SOCKS4 proxy 127.0.0.1:9151: [Errno 61] Connection refused
Use the stem package to interact with Tor. The official site has tutorials for many different use cases, for example:
https://stem.torproject.org/tutorials/to_russia_with_love.html
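For example, a minimal sketch (assuming the Tor ControlPort is enabled on 9051) that asks the running Tor instance for a new identity; traffic itself still goes through the SOCKS port (9050 for a system Tor, 9150 for Tor Browser):
from stem import Signal
from stem.control import Controller

# connect to Tor's control port and request a new identity / circuit
with Controller.from_port(port=9051) as controller:
    controller.authenticate()        # pass password=... if the control port requires one
    controller.signal(Signal.NEWNYM)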

Sending SSL data over a TCP proxy connection in Python

I am facing the following scenario:
I am forced to use an HTTP proxy to connect to an HTTPS server. For several reasons I need access to the raw data (before encryption) so I am using the socket library instead of one of the HTTP specific libraries.
I thus first connect a TCP socket to the HTTP proxy and issue the connect command.
At this point, the HTTP proxy accepts the connection and seemingly forwards all further data to the target server.
However, if I now try to switch to SSL, I receive
error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol
indicating that the socket attempted the handshake with the HTTP proxy and not with the HTTPS target.
Here's the code I have so far:
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('proxy',9502))
s.send("""CONNECT en.wikipedia.org:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.7; rv:15.0) Gecko/20100101 Firefox/15.0.1
Proxy-Connection: keep-alive
Host: en.wikipedia.org
""")
print s.recv(1000)
ssl = socket.ssl(s, None, None)
ssl.connect(("en.wikipedia.org",443))
What would be the correct way to open an SSL socket to the target server after connecting to the HTTP proxy?
(Note that, in general, it would be easier to use an existing HTTPS library such as PyCurl instead of implementing it all yourself.)
Firstly, don't call your variable ssl. That name is already used by the ssl module, and you don't want to shadow it.
Secondly, don't call connect a second time. You're already connected; what you need is to wrap the existing socket. Since Python doesn't do any certificate verification by default here, you'll need to verify the remote certificate and the host name yourself.
Here are the steps involved:
Establish your plain-text connection and send the CONNECT request, as you're doing in the first few lines.
Read the HTTP response you get back and make sure it has a 200 status code. (You'll need to read the header line by line.)
Use ssl_s = ssl.wrap_socket(s, cert_reqs=ssl.CERT_REQUIRED, ssl_version=ssl.PROTOCOL_TLSv1, ca_certs='/path/to/cabundle.pem') to wrap the socket, then check the host name against the certificate returned by getpeercert() yourself, since wrap_socket does not verify it for you.
Then use ssl_s as if it were your normal socket. Don't call connect again.
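A minimal sketch of those steps on Python 3, using ssl.create_default_context (which verifies the certificate chain and the host name for you); the proxy host and port are the placeholders from the question:
import socket, ssl

proxy_host, proxy_port = 'proxy', 9502        # placeholder proxy from the question
target_host, target_port = 'en.wikipedia.org', 443

# 1. plain TCP connection to the proxy, then the CONNECT request
s = socket.create_connection((proxy_host, proxy_port))
s.sendall(("CONNECT %s:%d HTTP/1.1\r\n"
           "Host: %s:%d\r\n"
           "Proxy-Connection: keep-alive\r\n\r\n"
           % (target_host, target_port, target_host, target_port)).encode("ascii"))

# 2. read the proxy's response headers and make sure the status is 200
reply = b""
while b"\r\n\r\n" not in reply:
    chunk = s.recv(4096)
    if not chunk:
        raise IOError("proxy closed the connection")
    reply += chunk
if b" 200 " not in reply.split(b"\r\n", 1)[0]:
    raise IOError("CONNECT failed: %r" % reply.split(b"\r\n", 1)[0])

# 3. wrap the already-connected socket; certificate and host-name checks are on by default
context = ssl.create_default_context()
ssl_s = context.wrap_socket(s, server_hostname=target_host)

# 4. use ssl_s like a normal socket from here on
ssl_s.sendall(b"GET / HTTP/1.1\r\nHost: en.wikipedia.org\r\nConnection: close\r\n\r\n")
print(ssl_s.recv(1000))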
This works with Python 3:
<proxy> is an IP address or domain name
<port> is 443, 80, or whatever port your proxy is listening on
<endpoint> is the final server you want to connect to via the proxy
<cn> is an optional SNI field your final server could be expecting
import socket, ssl

def getcert_sni_proxy(cn, endpoint, PROXY_ADDR=("<proxy>", <port>)):
    # prepare the CONNECT request
    CONNECT = "CONNECT %s:%s HTTP/1.0\r\nConnection: close\r\n\r\n" % (endpoint, 443)
    # connect to the proxy itself
    conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    conn.connect(PROXY_ADDR)
    conn.send(str.encode(CONNECT))
    conn.recv(4096)
    # set up the SSL layer
    context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    # connect to the final endpoint via the proxy, sending the optional server name (SNI) [cn here]
    sock = context.wrap_socket(conn, server_hostname=cn)
    # retrieve the certificate from the server
    certificate = ssl.DER_cert_to_PEM_cert(sock.getpeercert(True))
    return certificate
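For example, with placeholder values for the proxy:
print(getcert_sni_proxy("en.wikipedia.org", "en.wikipedia.org", PROXY_ADDR=("proxy.example.com", 8080)))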

A socket operation was attempted to an unreachable network in python httplib

I am trying to make a REST client from Django using httplib, but the connection is refused.
I tried the following
import hashlib
import hmac
from django.shortcuts import render_to_response
from django.template import RequestContext

def loginAction(request):
    username = request.POST['email']
    password = request.POST['password']
    import httplib, urllib
    params = urllib.urlencode({'username': username})
    # hash the username here to authenticate
    digest = hmac.new("qnscAdgRlkIhAUPY44oiexBKtQbGY0orf7OV1I50", str(request.POST['password']), hashlib.sha1).hexdigest()
    auth = username + ":" + digest
    headers = {"Content-type": "application/json", "Accept": "text/plain", "Authorization": auth}
    conn = httplib.HTTPConnection("10.0.2.2", 8000)
    conn.request("POST", "/api/ecp/profile/", params, headers)
but it gives the following error:
[Errno 10051] A socket operation was attempted to an unreachable network
What could be the issue?
The error indicates that the machine you are running this script on cannot reach the destination IP address (10.0.2.2): it has no network route configured to that address.
This is a problem with your internal network (10.x.x.x addresses are always private network addresses). If you are running this script on a different network from the machine you are trying to reach, you'll need a public IP address for it instead.
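A quick way to confirm this is a plain socket test run from the machine hosting the Django app, using the same host and port as in the question; it fails with the same error when no route exists:
import socket

try:
    # same destination as in the question: 10.0.2.2, port 8000
    socket.create_connection(("10.0.2.2", 8000), timeout=5).close()
    print("reachable")
except (socket.error, OSError) as exc:
    print("not reachable: %s" % exc)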

How can I pass a SSL certificate to a SOAP server using SOAPpy / Python

I am building a script to access an HTTPS/TLS site that requires an X.509 certificate, which I have as a .pfx file.
I am using SOAPpy 0.12.5 with Python 2.7 and have started off with the code below:
import SOAPpy
url = "192.168.0.1:5001"
server = SOAPpy.SOAPProxy(url)
# I think I need to pass the cert to server here...
server.callSoapRPC(xxxx)
If I try running this it fails with the following message
socket.error: [Errno 10061] No connection could be made because the target machine actively refused it
Any suggestions on how to tie the .pfx certificate to the SOAPProxy?
Thanks
I managed to do it this way:
import SOAPpy
# SOAPpy expects PEM-encoded files here, so the .pfx (PKCS#12) bundle has to be
# converted into a separate certificate file and key file first (e.g. with openssl pkcs12)
SOAPpy.Config.SSL.cert_file = 'cert_file'
SOAPpy.Config.SSL.key_file = 'key_file'
server = SOAPpy.SOAPProxy(url, config=SOAPpy.Config)
