From: steffen at cyberus.ca (Steffen Ries)
Date: 26 Apr 1999 07:59:50 -0400
Subject: timeout on urllib.urlopen?
References: <7g0r7c$gn9$1@nnrp1.dejanews.com> <19990426061959.A18551@toast.internal>
Message-ID: <m3k8uzwq15.fsf@gondolin.beleriand>
Content-Length: 1209
X-UID: 502

jam <jam at newimage.com> writes:
> On Mon, Apr 26, 1999 at 04:48:44AM +0000, Kevin L wrote:
> >
> > I'm trying to use urllib.urlopen() on a big list of urls, some of which are
> > dead (they don't return a 404, just no response). And the function just waits.
> > Is there any way to specify a timeout period for this function? thanks,
....
> attached, please find a short lightly tested module that might do what you
> are looking for.. please let me know if this is what you need. it's a piece
> of code I wrote for a larger application, and it seems to get the job done
> nicely. suggestions for optimizations, etc, accepted.
I once used SIGALRM to force a timeout. Maybe somebody could comment
on that approach?
/steffen
--8<----8<----8<----8<--
import signal
import urllib           # for the elided urlretrieve call below

class TimeOut(Exception):
    """Raised by the SIGALRM handler when the timeout expires."""

def alarmHandler(*args):
    """
    signal handler for SIGALRM, just raise an exception
    """
    raise TimeOut

....

signal.signal(signal.SIGALRM, alarmHandler)
try:
    # set timeout
    signal.alarm(120)
    #... urllib.urlretrieve pages
    signal.alarm(0)      # cancel the alarm once the transfer finishes
except TimeOut:
    # some error handling
    signal.alarm(0)
--8<----8<----8<----8<--
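Wrapped up as a helper around urllib.urlopen itself, the same SIGALRM trick
looks roughly like the sketch below. The name urlopen_with_timeout is only an
illustration; the approach is Unix-only and works only in the main thread.
--8<----8<----8<----8<--
import signal
import urllib

class TimeOut(Exception):
    """Raised when the SIGALRM alarm fires."""

def _alarm_handler(signum, frame):
    raise TimeOut

def urlopen_with_timeout(url, seconds=120):
    """Open url, raising TimeOut if no response arrives within `seconds`."""
    old_handler = signal.signal(signal.SIGALRM, _alarm_handler)
    signal.alarm(seconds)
    try:
        return urllib.urlopen(url)
    finally:
        signal.alarm(0)                             # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)  # restore previous handler
--8<----8<----8<----8<--
Cancelling the alarm and restoring the old handler in the finally clause keeps
a late alarm from interrupting unrelated code. Later Python versions make the
signal dance unnecessary: socket.setdefaulttimeout() or the timeout argument
of urllib2.urlopen / urllib.request.urlopen covers the same case.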
--
steffen at cyberus.ca <> Gravity is a myth -- the Earth sucks!