You would like to browse web pages through a proxy. If you have configured your browser with a proxy server and that works, you can try this recipe. Otherwise, you can use any of the public proxy servers available on the Internet.
You need access to a proxy server. You can find a free proxy server by searching on Google or any other search engine. Here, for the sake of demonstration, we have used 165.24.10.8.
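If your browser or operating system already has a proxy configured, you can discover it from Python before falling back to a public proxy. As a sketch, the Python 3 equivalent of this lookup is urllib.request.getproxies(), which reads the system proxy settings (for example, the http_proxy and https_proxy environment variables):

```python
import urllib.request

# getproxies() returns a scheme-to-proxy mapping taken from the system
# configuration; it is empty if no proxy is configured.
system_proxies = urllib.request.getproxies()
print(system_proxies)
```

If this prints an entry for 'http' or 'https', you can use that proxy address in the recipe instead of a public one.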
Let us send our HTTP request through a public domain proxy server.
Listing 4.5 explains proxying web requests across a proxy server as follows:
#!/usr/bin/env python
# Python Network Programming Cookbook -- Chapter - 4
# This program is optimized for Python 2.7.
# It may run on any other version with/without modifications.

import urllib

URL = 'https://www.github.com'
PROXY_ADDRESS = "165.24.10.8:8080"

if __name__ == '__main__':
    # The proxies dictionary is keyed by URL scheme, so an entry must
    # match the scheme of URL for the proxy to be used.
    resp = urllib.urlopen(URL, proxies={"http": PROXY_ADDRESS,
                                        "https": PROXY_ADDRESS})
    print "Proxy server returns response headers: %s " % resp.headers
If you run this script, it will show the following output:
$ python 4_5_proxy_web_request.py
Proxy server returns response headers: Server: GitHub.com
Date: Sun, 05 May 2013 16:16:04 GMT
Content-Type: text/html; charset=utf-8
Connection: close
Status: 200 OK
Cache-Control: private, max-age=0, must-revalidate
Strict-Transport-Security: max-age=2592000
X-Frame-Options: deny
Set-Cookie: logged_in=no; domain=.github.com; path=/; expires=Thu, 05-May-2033 16:16:04 GMT; HttpOnly
Set-Cookie: _gh_sess=BAh7...; path=/; expires=Sun, 01-Jan-2023 00:00:00 GMT; secure; HttpOnly
X-Runtime: 8
ETag: "66fcc37865eb05c19b2d15fbb44cd7a9"
Content-Length: 10643
Vary: Accept-Encoding
In this short recipe, we access the social code-sharing site, www.github.com, through a public proxy server found via a Google search. The proxy address is passed in the proxies argument to the urlopen() method of urllib. We then print the HTTP response headers to show that the proxy settings work.
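Note that urllib.urlopen() and its proxies argument exist only in Python 2. On Python 3, the same technique is expressed with urllib.request.ProxyHandler and a custom opener. A minimal sketch, reusing the recipe's placeholder proxy address:

```python
import urllib.request

# Placeholder proxy from the recipe; substitute a live proxy of your own.
PROXY_ADDRESS = "165.24.10.8:8080"

# ProxyHandler maps each URL scheme to the proxy that should carry it.
proxy_handler = urllib.request.ProxyHandler({
    "http": PROXY_ADDRESS,
    "https": PROXY_ADDRESS,
})
opener = urllib.request.build_opener(proxy_handler)

# With a working proxy, this would fetch the page through it:
# resp = opener.open("https://www.github.com", timeout=10)
# print(resp.headers)
```

Building an opener explicitly, rather than relying on environment variables, keeps the proxy choice local to this script.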