Well, almost everything is in the title. I have a .dbf file which I would like to copy even while it is locked (being edited) by another program such as DBU.
If I try to open it, or copy it with shutil.copy, I get:
>>> f = open('test.dbf', 'rb')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
IOError: [Errno 13] Permission denied: 'test.dbf'
I know that it is locked at the Windows level because I am unable to copy it with a batch script or with Windows Explorer either. But is there any method to copy such a file?
In general, you can't. Even if you were to circumvent the locking mechanism, another process might be in the middle of writing to the file, and the snapshot you would take may be in an inconsistent state.
Depending on your use case, Volume Shadow Copy might be of relevance.
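If you want to experiment with that route from Python, here is a rough sketch. It is assumption-heavy: it shells out to the built-in wmic tool, which needs an elevated prompt, and the shadow device path below is a placeholder you would have to read from vssadmin list shadows.

import shutil
import subprocess

# Create a shadow copy (point-in-time snapshot) of drive C:.
# Requires administrator rights; wmic prints the new shadow's ID and device path.
subprocess.check_call(['wmic', 'shadowcopy', 'call', 'create', 'Volume=C:\\'])

# Hypothetical device path; look up the real one with 'vssadmin list shadows'.
shadow = r'\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1'

# Copy the locked file out of the frozen snapshot instead of the live volume.
shutil.copyfile(shadow + r'\path\to\test.dbf', r'C:\temp\test.dbf')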
There is a tool from Joakim Schicht, RawCopy, that copies any locked file.
The only issue is that some antivirus engines flag it as malicious, when it is not.
Depending on your use case, this can be a solution.
https://github.com/jschicht/RawCopy
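For reference, an invocation looks roughly like this (check the repo's README for the exact flags; the paths here are made up):

RawCopy.exe /FileNamePath:C:\data\test.dbf /OutputPath:C:\temp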
Abstract:
I am analysing a pcap file containing live malware (for educational purposes) using Wireshark. I managed to extract a few objects from the HTTP stream, including some executables.
During my analysis, I found instances hinting that the Fiesta Exploit Kit was used.
Having Googled a ton, I came across a GitHub repo: https://github.com/0x3a/tools/blob/master/fiesta-payload-decrypter.py
What am I trying to achieve?
I am trying to run the Python script fiesta-payload-decrypter.py against the malicious executable (extracted from the pcap).
What have I done so far?
I've copied the code into a plain-text file and saved it as malwaredecoder.py. The script sits in the same folder (/Download/Investigation/) as the malware.exe I want to run it against.
What's the Problem?
Traceback (most recent call last):
File "malwaredecoder.py", line 51, in <module>
sys.exit(DecryptFiestaPyload(sys.argv[1], sys.argv[2]))
File "malwaredecoder.py", line 27, in DecryptFiestaPyload
fdata = open(inputfile, "rb").read()
IOError: [Errno 2] No such file or directory: '-'
I am running this python script in Kali Linux, and any help would be much appreciated. Thank you.
The script expects two arguments... what are you passing it?
It looks like it expects the arguments to be files, and it is seeing a - (dash) as the input file.
https://github.com/0x3a/tools/blob/master/fiesta-payload-decrypter.py#L44 Here it looks like the first argument is the input file and the second is the output file.
Try running it like this:
python malwaredecoder.py /Download/Investigation/fileImInvestigating.pcap /Download/Investigation/out.pcap
All that said, good luck; that script looks pretty old and was last modified in 2015.
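As a side note, the confusing '-' error happens because the script opens sys.argv[1] directly. A hypothetical guard like this (not part of the original script) would fail fast with a readable usage message instead:

import sys

# Hypothetical argument check; the real script indexes sys.argv without validating it.
if len(sys.argv) != 3:
    sys.exit("usage: python malwaredecoder.py <input-file> <output-file>")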
Yeah, I know this or similar questions have been posted to this forum before, but so far none of the answers has solved my problem.
I have the following code:
with open(filename, 'r', buffering=2000000) as f:
    f.readline()  # takes the header away
    for i, l in enumerate(f):  # count the number of lines
        print('Counting {}'.format(i), end='\r')
The file in question is a 23 GB CSV file. I get the following error:
File "programs\readbigfile.py", line 33, in <module>
for i, l in enumerate(f): # count the number of lines
PermissionError: [Errno 13] Permission denied
The error always happens at line number 1374200. I checked the file with a text editor and there is nothing unusual at that line. This also happened to me with a smaller version of the same file (a few gigabytes less), and then it suddenly worked.
The file is not being used by any other process at all.
Any ideas of why this error occurs in the middle of the file?
PS: I am running this program on a computer with an Intel i5-6500 CPU, 16 GB of memory and an NVIDIA GeForce GTX 750 Ti card.
The system is Windows 10, with Python 3.7.6 x64/Anaconda.
The file is on a local disk, no networking involved.
Whatever it is, I think your code is ok.
My ideas:
do you need this buffering? Did you try removing it completely?
I see you're running on Windows. I don't know if that's important, but many weird issues happen on Windows
if you're accessing the file on a network disk (Samba etc.), maybe it's not fully synced?
are you sure nothing else tries to access this file in the meantime? Excel?
did you try reading this file with csv.reader? I don't think it'd help, though, just wondering
you can try/except, and when the error is raised, check os.stat or os.access to see whether you still have permissions (see the sketch after this list)
maybe printing is at fault; it sounds like a huge file. Did you try without printing? You might want to add if i % 1000 == 0: print(...)
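A minimal sketch of that try/except idea, reusing the filename variable from the question:

import os

with open(filename, 'r') as f:
    f.readline()  # skip the header
    try:
        for i, l in enumerate(f):
            if i % 1000 == 0:
                print('Counting {}'.format(i), end='\r')
    except PermissionError:
        # See whether we still hold read permission at the moment the error fires.
        print(os.stat(filename))
        print('readable:', os.access(filename, os.R_OK))
        raise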
I found out the error was due to a file-writing error, caused either by a bad disk block or by a disk-system failure. The point is, the file had a CRC error somewhere in the middle of it, which I fixed simply by creating the file again. It is a random error, so if you find yourself in the same situation, one of your checks should be the soundness of the file itself.
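One quick way to check the file's soundness is to read it straight through in binary chunks and note the byte offset where the OS starts complaining. A rough sketch (the helper name and chunk size are arbitrary):

def find_bad_offset(path, chunk_size=1024 * 1024):
    # Read the whole file in chunks; return the offset where reading fails, or None.
    with open(path, 'rb') as f:
        offset = 0
        while True:
            try:
                chunk = f.read(chunk_size)
            except OSError as e:
                return offset, e  # the bad region starts somewhere in this chunk
            if not chunk:
                return None  # reached EOF without a read error
            offset += len(chunk)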
I'm using an XBee3 and I want to append data to a file.
I tried the following test script, but I get an EEXIST error if TEST.txt already exists. If the file doesn't exist, it is created on the first run, but I get the same error when I run the script again.
f = open("TEST.txt", 'a')
for a in range(3):
f.write("#EMPTY LINE#\n")
f.close()
Traceback (most recent call last):
File "main", line 1, in
OSError: [Errno 7017] EEXIST
I formatted the XBee's file system, by the way.
It sounds like you're using an 802.15.4, DigiMesh or Zigbee module. The file system in those modules is extremely limited and doesn't allow modifying existing files. There should be documentation on the product that lists those limitations (no rename, no modify/append, only one open file at a time, etc.).
XBee/XBee3 Cellular modules have a fuller file system implementation that allows for renaming files and modifying file contents.
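If you are stuck on one of the limited modules, one possible workaround is to rewrite the file instead of appending to it. This is only a sketch: it assumes uos.remove is available on your build, that deleting and recreating a file is allowed, and that the whole file fits in RAM.

import uos

def append_line(path, line):
    # Read the old contents, if the file exists.
    try:
        with open(path) as f:
            data = f.read()
    except OSError:
        data = ''
    # The file system forbids in-place modification, so delete and recreate.
    try:
        uos.remove(path)
    except OSError:
        pass
    with open(path, 'w') as f:
        f.write(data + line + '\n')

append_line("TEST.txt", "#EMPTY LINE#")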
I'm using Python 2.7 on Windows XP.
My script relies on tempfile.mkstemp and tempfile.mkdtemp to create a lot of files and directories with the following pattern:
_,_tmp = mkstemp(prefix=section,dir=indir,text=True)
<do something with file>
os.close(_)
Running the script always incurs the following error (although the exact line number changes, etc.). The actual file that the script is attempting to open varies.
OSError: [Errno 24] Too many open files: 'path\\to\\most\\recent\\attempt\\to\\open\\file'
Any thoughts on how I might debug this? Also, let me know if you would like additional information. Thanks!
EDIT:
Here's an example of use:
out = os.fdopen(_,'w')
out.write("Something")
out.close()
with open(_) as p:
    p.read()
You probably don't have the same value stored in _ at the time you call os.close(_) as at the time you created the temp file. Try assigning to a named variable instead of _.
It would help you and us if you could provide a very small code snippet that demonstrates the error.
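For illustration, a minimal sketch of that advice, using named variables (and the section/indir values from the question):

import os
from tempfile import mkstemp

fd, tmp_path = mkstemp(prefix=section, dir=indir, text=True)
out = os.fdopen(fd, 'w')   # wrap the descriptor once...
out.write("Something")
out.close()                # ...closing the wrapper also closes the descriptor

with open(tmp_path) as p:  # reopen by path, not by descriptor
    p.read()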
Why not use tempfile.NamedTemporaryFile with delete=False? This lets you work with Python file objects, which is one bonus. Also, it can be used as a context manager (which should take care of all the details, making sure the file is properly closed):
with tempfile.NamedTemporaryFile('w', prefix=section, dir=indir, delete=False) as f:
    pass  # Do something with the file here.
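One caveat: with delete=False you become responsible for removing the file yourself (for example with os.unlink(f.name)) once you are done with it.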
I'm reading a bunch of NetCDF files using the pupynere interface (on Linux). The following code results in an mmap error:
import numpy as np
import os, glob
from pupynere import NetCDFFile as nc
alts = []
vals = []
path='coll_mip'
filter='*.nc'
for infile in glob.glob(os.path.join(path, filter)):
    curData = nc(infile, 'r')
    vals.append(curData.variables['O3.MIXING.RATIO'][:])
    alts.append(curData.variables['ALTITUDE'][:])
    curData.close()
Error:
$ python2.7 /mnt/grid/src/profile/contra.py
Traceback (most recent call last):
File "/mnt/grid/src/profile/contra.py", line 15, in <module>
File "/usr/lib/python2.7/site-packages/pupynere-1.0.13-py2.7.egg/pupynere.py", line 159, in __init__
File "/usr/lib/python2.7/site-packages/pupynere-1.0.13-py2.7.egg/pupynere.py", line 386, in _read
File "/usr/lib/python2.7/site-packages/pupynere-1.0.13-py2.7.egg/pupynere.py", line 446, in _read_var_array
mmap.error: [Errno 24] Too many open files
Interestingly, if I comment out one of the append commands (either will do!), it works! What am I doing wrong? I'm closing the file, right? It is somehow related to the Python lists: I used a different, inefficient approach before (always copying each element) that worked.
PS: ulimit -n yields 1024; the program fails at file number 498.
Maybe related, but the solution doesn't work for me: NumPy and memmap: [Errno 24] Too many open files
My guess is that the mmap.mmap call in pupynere is holding the file descriptor open (or creating a new one). The arrays you append are mmap-backed views, so they keep the mapping alive even after close(); copying them materializes the data in ordinary memory and releases the descriptor. What if you do this:
vals.append(curData.variables['O3.MIXING.RATIO'][:].copy())
alts.append(curData.variables['ALTITUDE'][:].copy())
@corlettk: yeah, since it is Linux, tracing file operations with strace will do:
strace -e trace=file,desc,munmap python2.7 /mnt/grid/src/profile/contra.py
This will show exactly which file is opened when, and even the file descriptors.
You can also use
ulimit -a
to see what limits are currently in effect.
Edit
gdb --args python2.7 /mnt/grid/src/profile/contra.py
(gdb) break dup
(gdb) run
If that results in too many breakpoints prior to the ones related to the mapped files, you might want to run it without breakpoints for a while, break it manually (Ctrl+C) and set the breakpoint during 'normal' operation; that is, if you have enough time for that :)
Once it breaks, inspect the call stack with
(gdb) bt
Hmmm... Maybe, just maybe, with curData might fix it? Just a WILD guess.
EDIT: Does curData have a flush method, perchance? Have you tried calling that before close?
EDIT 2:
Python 2.5's with statement (lifted straight from Understanding Python's "with" statement)
with open("x.txt") as f:
data = f.read()
do something with data
... basically, it ALWAYS closes the resource (much like C#'s using construct).
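If pupynere's NetCDFFile doesn't support the context-manager protocol directly (I haven't checked), contextlib.closing gives you the same guarantee. A sketch against the loop from the question:

from contextlib import closing

for infile in glob.glob(os.path.join(path, filter)):
    # closing() calls curData.close() even if one of the appends raises.
    with closing(nc(infile, 'r')) as curData:
        vals.append(curData.variables['O3.MIXING.RATIO'][:])
        alts.append(curData.variables['ALTITUDE'][:])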
How expensive is the nc() call? If it is cheap enough to run twice on every file, does this work?
for infile in glob.glob(os.path.join(path, filter)):
    curData = nc(infile, 'r')
    vals.append(curData.variables['O3.MIXING.RATIO'][:])
    curData.close()
    curData = nc(infile, 'r')
    alts.append(curData.variables['ALTITUDE'][:])
    curData.close()