I'm trying to test the response time of web pages hosted on many backends. These hosts sit behind a load balancer, and in front of that I have my domain.com.
I want to run python+selenium against these backends directly, but with a spoofed hostname, without messing with /etc/hosts or running fake DNS servers. Is that possible with pure Selenium drivers?
To illustrate the problem better: with curl I can send a request straight to a backend's IP while overriding the Host header to domain.com, and I'd like to do the same with python+selenium.
If you're on a UNIX system, you can try the approach explained here:
https://unix.stackexchange.com/questions/10438/can-i-create-a-user-specific-hosts-file-to-complement-etc-hosts
Basically you still use a hosts file, but it's per-user: you put it at ~/.hosts and point the HOSTALIASES environment variable at it.
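A minimal sketch of wiring that into a Selenium script, assuming your resolver actually honors HOSTALIASES for the name being looked up (note the alias file maps names to other names, not to IP addresses; the file path and hostnames below are hypothetical):

    import os
    from selenium import webdriver

    # Hypothetical path and hostnames: write a per-user alias file and export
    # HOSTALIASES before the browser starts, so the driver process (and the
    # browser it spawns) inherit the variable.
    alias_file = os.path.expanduser("~/.hosts")
    with open(alias_file, "w") as f:
        f.write("domain.com backend01.internal\n")  # alias -> canonical name

    os.environ["HOSTALIASES"] = alias_file

    driver = webdriver.Firefox()
    driver.get("http://domain.com/")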
In short, no.
Selenium drives browsers using the WebDriver standard, which is by definition limited to interactions with page content. Even though you can provide Selenium with configuration options for the browser, no browser provides control over Host headers or DNS resolution outside of a proxy.
But even if you could initiate a request to a particular IP address with a custom Host header, subsequent requests triggered by the content (redirects, page assets, AJAX calls, etc.) would still be outside your control. They cannot carry a customized Host header, so the browser falls back to standard DNS resolution for them.
Your only options are modifying the local DNS resolution (via /etc/hosts) or providing an alternative (via a proxy).
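If you go the proxy route, a rough sketch might look like this, assuming a local forwarding proxy (mitmproxy, Squid, or similar) is already listening on 127.0.0.1:8080 and configured to send requests for domain.com to the backend under test (addresses and ports are placeholders):

    from selenium import webdriver

    options = webdriver.ChromeOptions()
    # All browser traffic goes through the local proxy; the proxy, not the
    # browser, decides which backend actually answers for domain.com.
    options.add_argument("--proxy-server=http://127.0.0.1:8080")

    driver = webdriver.Chrome(options=options)
    driver.get("https://domain.com/")
    # Rough page-load time in milliseconds, straight from the browser.
    print(driver.execute_script(
        "return performance.timing.loadEventEnd - performance.timing.navigationStart"))
    driver.quit()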
[I've never worked with a Raspberry Pi before; absolute noob in this field.]
I want to make a desktop/mobile app to access a program on a Raspberry Pi. The only task of the app is to send a command and display the received response in a UI. It's meant only for private use, but it should also work outside my local network: as long as I have mobile internet on the phone, it should be possible to reach the program with the app.
Can I achieve this without using any kind of public website? I saw some tutorials that used Flask and other frameworks to do something similar, but I want access to be restricted to the app. There shouldn't be any URL I could type into my browser that gives me a login page or anything like that.
If you know the specific term for what I'm describing here, or, even better, an article/tutorial that covers it, that would be very helpful.
You need two things for that:
Make your Raspberry Pi visible to the outside world. That is typically done by configuring port forwarding in your router. Note that this poses a certain security risk.
Make sure you have a global DNS name for your internet connection. Since your router's public IP may change frequently (depending on your ISP), you need a URL, or rather a DNS entry. There are public dynamic-DNS services that can keep a DNS entry pointed at a changing IP (typically for a fee), and many routers support a protocol for updating such services automatically.
After that, you can program an app that uses the given DNS entry to talk to your Pi.
So no, without a public URL, this is not possible, at least not over the long term. You might be able to go with the public IP of your router, but then your app may fail from one day to the next.
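Once the port forwarding and the DNS entry are in place, the app side can stay very small. A sketch, assuming a dynamic-DNS name and forwarded port (both hypothetical) and a matching listener running on the Pi:

    import socket

    HOST = "mypi.example-ddns.net"  # hypothetical dynamic-DNS name for your router
    PORT = 5000                     # hypothetical port forwarded to the Pi

    def send_command(cmd: str) -> str:
        """Send one command to the Pi and return its reply."""
        with socket.create_connection((HOST, PORT), timeout=10) as sock:
            sock.sendall(cmd.encode() + b"\n")
            return sock.recv(4096).decode()

    print(send_command("status"))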
I configured Tor Browser and Privoxy following https://jarroba.com/anonymous-scraping-by-tor-network/. When I checked my IP with http://icanhazip.com/, my IP address had changed, so it works. But when I tried to scrape the desired website, I got:
You are attempting to access "website" using an anonymous private/proxy network. Please disable that and try accessing the site again.
Tor hides your IP address, but it does not hide the fact that you are using Tor, since Tor exit relays are public knowledge. For example, xmyip.com will tell you whether or not your IP is a Tor IP.
Given the error you received, it looks like that website blocks Tor users, which is a fairly common practice. See "Tor users being actively blocked on some websites" for more details.
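You can see this for yourself from Python. A small sketch, assuming Privoxy is listening on its default 127.0.0.1:8118 and chaining into Tor: the IP the target site sees appears on Tor's publicly downloadable exit list, which is exactly what sites use to block this traffic.

    import requests

    # Hypothetical local setup: Privoxy on 127.0.0.1:8118 forwarding into Tor.
    proxies = {"http": "http://127.0.0.1:8118", "https": "http://127.0.0.1:8118"}

    # The address the target website sees...
    exit_ip = requests.get("https://icanhazip.com", proxies=proxies, timeout=30).text.strip()

    # ...shows up on the public list of Tor exit addresses.
    exit_list = requests.get("https://check.torproject.org/torbulkexitlist", timeout=30).text.split()
    print(exit_ip, "is a known Tor exit" if exit_ip in exit_list else "is not on the exit list")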
I'm writing an application where the user can perform a web search to obtain some information from a particular website.
Everything works well except when I'm connected to the internet through a proxy (it's a corporate proxy).
The thing is, it only works sometimes.
By that I mean: if it stops working, all I have to do is use any web browser (Chrome, IE, etc.) to surf the internet, and then Python's requests start working as before.
The error I get is:
OSError('Tunnel connection failed: 407 Proxy Authentication Required',)
My guess is that some sort of credential gets validated and the proxy tunnel comes back up.
I tried with proxy handlers but it stays the same.
My questions are:
How do I know whether the proxy needs authentication, and if so, how do I authenticate without hardcoding the username and password, since this application will be used by other people?
Is there a way to use the Windows default proxy configuration so it works the way the browsers do?
What do you think happens when I surf the internet in a browser that makes the Python requests start working again?
I tried both requests and urllib.request.
Any help is appreciated.
Thank you!
Check whether a proxy is configured in Chrome; on Windows, Chrome uses the system-wide proxy settings.
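If there is, Python can read the same system-wide settings, and if the proxy demands authentication you can supply credentials at run time instead of hardcoding them. A rough sketch (proxy host, port, and credentials are placeholders):

    import getpass
    import urllib.request
    import requests

    # On Windows this reads the same registry-based proxy settings the browsers
    # use; elsewhere it falls back to the HTTP(S)_PROXY environment variables.
    print(urllib.request.getproxies())

    # For a proxy that answers 407, pass credentials explicitly; prompting the
    # user avoids hardcoding them in the application.
    user = input("Proxy user: ")
    password = getpass.getpass("Proxy password: ")
    proxy_url = f"http://{user}:{password}@proxy.corp.local:8080"  # hypothetical host
    requests.get("https://example.com",
                 proxies={"http": proxy_url, "https": proxy_url},
                 timeout=30)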
I have built a MITM with Python and Scapy. I want the "victim" device to be redirected to a specific page each time it tries to access a website. Any suggestions on how to do it?
*Keep in mind that all the traffic from the device already passes through my machine before being routed.
You can directly answer HTTP requests for pages other than that specific web page with an HTTP redirection (e.g. HTTP 302). Moreover, you should only route packets going to the desired web page and block the rest (you can do that with a firewall such as iptables).
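A minimal sketch of the redirect side, assuming you divert the victim's HTTP traffic to a port you control (for example with an iptables REDIRECT rule to port 8080; the port and target URL are hypothetical):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    TARGET = "http://192.168.1.10/landing.html"  # hypothetical page to send the victim to

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Answer every intercepted request with a 302 pointing at TARGET.
            self.send_response(302)
            self.send_header("Location", TARGET)
            self.end_headers()

        do_POST = do_GET  # handle POSTs the same way

    HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()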
I am trying to make a "proxy" in Python that lets the user route all of their web traffic through a host machine (this is mainly for me and a couple of people I know, to confuse/avoid hackers and/or spies, who would only see web pages and so on coming in through one IP). I have run into several difficulties.
The first is that I would like to be able to use the final, compiled product with Firefox, which can be set to route all of its traffic through an installed proxy program. I don't know what kind of configuration my proxy needs in order to do this.
Second, the way the proxy currently works is by using urllib.request.urlretrieve (yes, it's going to die soon, but I like it) to download a web page onto the host computer (inefficient and slow, but it will only ever serve a maximum of 7-10 clients) and then sending the file to the client. However, this results in things like missing pictures or broken submission forms. What should I be using to get the web pages right? I want things like SSL and video streaming to work, as well as pictures and whatnot.
(wince) this sounds like "security through obscurity", and a lot of unnecessary work.
Just set up an ssh tunnel and proxy your web traffic through that.
See http://www.linuxjournal.com/content/use-ssh-create-http-proxy
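In practice that amounts to little more than starting the tunnel and pointing Firefox at it; a sketch, with the remote host name as a placeholder:

    import subprocess

    # Opens a SOCKS5 proxy on 127.0.0.1:1080 that tunnels everything through the
    # SSH connection (the same as running `ssh -N -D 1080 ...` by hand).
    subprocess.run(["ssh", "-N", "-D", "1080", "user@your-remote-host.example.com"])

    # In Firefox, set a manual SOCKS proxy: host 127.0.0.1, port 1080 (SOCKS v5).
    # The browser then fetches pages itself, so SSL, images, forms, and video
    # streaming all work without any re-downloading on your side.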