
Thread: Python urllib2 problem: Name or service not known

  1. #1
    Join Date
    Aug 2006
    Beans
    37

    Python urllib2 problem: Name or service not known

    Hi all,

    Some time ago I wrote some Python code to read video data from an IP camera connected via a router to a laptop. Now I'm trying to run this code on a different laptop and router combination, but I can't access the camera.
    Some minimal example code:
    Code:
    import urllib2
    
    url = urllib2.urlopen("http://192.168.1.3/-wvhttp-01-/image.cgi")
    This fails and returns the error:
    Code:
    <urlopen error [Errno -2] Name or service not known>
    When I dig further, I find that I cannot access any URL via urllib2.urlopen: not the camera, not the router, not even localhost. Pinging them poses no problem though, and I can access the video feed and the router admin page in Firefox without any issues.
    Searching for the problem turned up this discussion on the Gentoo forums, which didn't provide a solution but did hint that it might be a system-related problem rather than a coding problem.
    I run Kubuntu 11.10 on the laptop; the router is not connected to the internet and serves as the DHCP server for both the laptop and the camera.
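    For anyone hitting the same symptom: since ping works but urlopen doesn't, one quick thing to check is whether urllib is picking up proxy settings from the environment. This is just a diagnostic sketch (Python 3 shown; in Python 2 the same function is urllib.getproxies()), not code from my camera script:
    Code:

```python
# urllib honours http_proxy/https_proxy from the environment, so if ping
# works but urlopen fails, see which proxies urllib thinks it should use.
import os
from urllib.request import getproxies

print(getproxies())                  # e.g. {'http': 'http://10.0.0.1:8080/'}
print(os.environ.get('http_proxy'))  # None if unset in this shell
```

    If this prints a proxy you didn't expect, urlopen is being routed somewhere ping never goes.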
    I.

  2. #2
    Join Date
    Aug 2006
    Beans
    37

    Re: Python urllib2 problem: Name or service not known

    Never mind, it's a proxy problem. I had disabled http_proxy in .bashrc, but apparently it was also set in /etc/bash.bashrc, and I didn't check for that.

    That still left me with the problem of how to get my code to work with the proxy enabled. But I found that one too: just put
    Code:
    import os
    os.environ['http_proxy']=''
    before importing urllib2.
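    There's also a way to bypass the proxy without touching the environment at all, by installing an opener with an empty ProxyHandler. A sketch (Python 3's urllib.request shown; urllib2 in Python 2 has the same ProxyHandler/build_opener/install_opener names):
    Code:

```python
# An empty ProxyHandler means urlopen never consults http_proxy at all,
# so the environment can stay as it is for everything else.
from urllib.request import ProxyHandler, build_opener, install_opener

no_proxy_opener = build_opener(ProxyHandler({}))  # {} = use no proxies
install_opener(no_proxy_opener)                   # urlopen() now goes direct
# from here on, urlopen("http://192.168.1.3/...") connects directly
```

    This keeps the fix local to the script instead of depending on shell configuration.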
    I.

  3. #3
    Join Date
    Dec 2012
    Beans
    1

    Re: Python urllib2 problem: Name or service not known

    hey Iskendar, I added the os.environ line to my program but I'm still getting the same error. I'm running Ubuntu in a virtual machine and my program opens a lot of URLs to scrape song lyrics from a site.

    Code:
    from bs4 import BeautifulSoup
    import os
    os.environ['http_proxy'] = ''
    from urllib.request import urlopen
    from urllib.request import urlretrieve
    import re
    
    base = "http://www.hymnal.net/hymn.php/"
    ns_urls = []
    
    # this while loop places all the urls for all the new songs into a list
    ns = 1
    while ns < 391:
        addr = "ns/" + str(ns)
        ns_urls.append(base + addr)
        ns += 1
    
    ns_songs = []
    count = 1
    # this for loop parses the html found in the urls and places the lyrics into a list
    for url in ns_urls:
        soup = BeautifulSoup(urlopen(url), "html.parser")
        # the counter here is to see on which url the error arises;
        # from what I've seen it changes every time but is usually around 20 - 40.
        print(count)
        count += 1
        for tag in soup.find_all('div'):
            if tag.has_attr('class'):
                if tag['class'][0] == 'main-content':
                    str_tag = str(tag)
                    ns_songs.append(str_tag)
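
    Since your error shows up at a different URL each run (around request 20 - 40), it sounds intermittent rather than a fixed proxy setting, so it may be worth retrying failed fetches. A sketch of a retry wrapper; fetch_with_retry is a hypothetical helper I'm proposing, not part of your code:
    Code:

```python
# Hypothetical helper: re-attempt a fetch on URLError with a short pause,
# since intermittent "Name or service not known" errors often clear on a
# later try. open_fn is passed in so the logic can be tested offline.
import time
from urllib.error import URLError

def fetch_with_retry(open_fn, url, retries=3, delay=1.0):
    for attempt in range(retries):
        try:
            return open_fn(url)
        except URLError:
            if attempt == retries - 1:
                raise               # give up after the last attempt
            time.sleep(delay)       # brief pause before retrying

# in the loop above this would be used as:
#   soup = BeautifulSoup(fetch_with_retry(urlopen, url), "html.parser")
```

    If the errors persist even with retries, the VM's DNS/proxy setup is the more likely culprit than the script.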
