Is a file open in the system? (other than lsof)


bvidinli

Is there a way to find out if a file is open in the system?
Please write if you know a way other than lsof, because lsof is slow for me.
I need a faster way.
I deal with thousands of files, so I need a faster / Python way for this.
Thanks.


 

Chris McAloney

Albert said:
This is not a Python question but an OS question (Python is not going to deliver what the OS doesn't provide). Please first find an alternative way at the OS level, i.e. ask this question in an appropriate OS newsgroup. Once you have found that, you can think about Python support for that alternative.

I agree with Albert that this is very operating-system specific.
Since you mentioned 'lsof', I'll assume that you are at least using a
Unix variant, meaning that the fcntl module will be available to you,
so you can check if the file is already locked.

Beyond that, I think more information on your application would be
necessary before we could give you a solid answer. Do you only need
to know if the file is open, or do you want only the files that are
open for writing? If you only care about the files that are open for
writing, then checking for a write-lock with fcntl will probably do
the trick. Are you planning to check all of the "thousands of files"
individually to determine if they're open? If so, I think it's
unlikely that doing this from Python will actually be faster than a
single 'lsof' call.
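
A minimal sketch of that lock-check idea, assuming advisory locks are what matters (the helper name is made up for illustration; it briefly takes and then releases a non-blocking lock itself, so it detects locks held by other processes rather than files that are merely open):

import fcntl
import os

def is_write_locked(path):
    # Hypothetical helper: True if another process holds a conflicting
    # advisory lock on `path`.  It briefly acquires (and releases) a
    # non-blocking exclusive lock, so it does NOT detect plain opens.
    fd = os.open(path, os.O_RDWR)
    try:
        try:
            fcntl.lockf(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except (IOError, OSError):
            return True             # somebody else holds a lock
        fcntl.lockf(fd, fcntl.LOCK_UN)
        return False
    finally:
        os.close(fd)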

If you're on Linux, you might also want to have a look at the /proc
directory tree ("man proc"), as this is where lsof gets its
information from on Linux machines.
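
For example, the descriptors of the current process can be inspected directly (a small illustrative snippet, not part of Chris's post):

import os

fd_dir = '/proc/self/fd'            # 'self' points at the caller's own PID
for fd in os.listdir(fd_dir):
    try:
        print(os.readlink(os.path.join(fd_dir, fd)))
    except OSError:
        pass                        # the fd opened by listdir itself is already gone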

Chris
 

Thomas Guettler

bvidinli said:
Is there a way to find out if a file is open in the system?
Please write if you know a way other than lsof, because lsof is slow for me.
I need a faster way.
I deal with thousands of files, so I need a faster / Python way for this.
Thanks.

On Linux there are symlinks from /proc/PID/fd to the open
files. You could use this:

#!/usr/bin/env python
import os
# Each numeric entry under /proc is a process; /proc/PID/fd holds one
# symlink per open file descriptor, pointing at the file it refers to.
# (Reading another user's fd directory needs sufficient privileges.)
pids = os.listdir('/proc')
for pid in sorted(pids):
    try:
        int(pid)
    except ValueError:
        continue                    # not a PID directory
    fd_dir = os.path.join('/proc', pid, 'fd')
    for fd_name in os.listdir(fd_dir):
        try:
            link = os.readlink(os.path.join(fd_dir, fd_name))
        except OSError:
            continue                # descriptor disappeared between calls
        print pid, link
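
Since the original problem involves thousands of files, one variation on the same idea (a sketch, not part of Thomas's post; the helper name and sample paths are made up) is to walk /proc once, collect every open path into a set, and then test each of your files with a cheap membership check:

import os

def open_files():
    # Hypothetical helper: one pass over /proc, returning the set of
    # paths currently held open by every process we can inspect.
    open_paths = set()
    for pid in os.listdir('/proc'):
        if not pid.isdigit():
            continue
        fd_dir = os.path.join('/proc', pid, 'fd')
        try:
            fd_names = os.listdir(fd_dir)
        except OSError:
            continue                # process exited, or not enough permission
        for fd_name in fd_names:
            try:
                open_paths.add(os.readlink(os.path.join(fd_dir, fd_name)))
            except OSError:
                continue
    return open_paths

# Usage: one scan of /proc, then thousands of cheap membership tests.
my_files = ['/var/log/syslog', '/tmp/example.txt']   # replace with your own list
currently_open = open_files()
for path in my_files:
    if os.path.realpath(path) in currently_open:
        print(path)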
 
