From: a.mueller at icrf.icnet.uk (Arne Mueller)
Date: Fri, 23 Apr 1999 17:38:51 +0100
Subject: Python too slow for real world
References: <372068E6.16A4A90@icrf.icnet.uk>
Message-ID: <3720A21B.9C62DDB9@icrf.icnet.uk>
X-UID: 283

Hi All,

thanks very much for all the suggestions on how to speed things up and how
to THINK about programming in Python. I got a lot of inspiration from
your replies. However, reading/writing large files line by line remains
the part that slows down the whole process:

from sys import stdout

def rw(input, output):
    # copy input to output one line at a time
    while 1:
        line = input.readline()
        if not line: break
        output.write(line)

f = open('very_large_file', 'r')
rw(f, stdout)

The file I read in contains 2053927 lines, and it takes 382 sec to
read/write it, whereas Perl does it in 15 sec. These simple read/write
functions use the C standard library underneath, don't they? So
readline/write don't seem to be implemented very efficiently ... (?)
I can't read in the whole file as a single block, it's too big, and if
readline/write is slow the program will never get really fast :-(
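
One idea I might try (just a sketch, not benchmarked on the real data;
the name rw_chunked and the 1 MB size hint are arbitrary choices of
mine): read many lines at a time with readlines() and a size hint, and
write them back with writelines(), so there are far fewer Python-level
calls per line:

from sys import stdout

def rw_chunked(input, output, hint=1024*1024):
    # readlines() with a size hint returns roughly that many bytes
    # worth of complete lines per call, so most of the per-line
    # work stays inside the C library
    while 1:
        lines = input.readlines(hint)
        if not lines:
            break
        output.writelines(lines)

f = open('very_large_file', 'r')
rw_chunked(f, stdout)
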
Thanks a lot for the discussion,

Arne