MQL5: reading a file while it is being written by another program - Python

I have a program that uses two different languages, MQL5 and Python. As a bridge between the two scripts, I'm using two text files. MQL5 writes a file. Python stands by and periodically checks whether that file exists. If it does, Python reads it, writes another file as a reply, and then deletes the file it has just read. After writing its file, MQL5 goes on standby and periodically checks whether Python has produced a reply. If the reply file exists, it reads it.
Unfortunately, MQL5 keeps trying to read the reply file while it is still being written by Python. This causes MQL5 to throw an error, and if I force it to read anyway, it reads a blank file. Is there any way to avoid this? Is there any way in MQL5 to detect that another program has finished modifying a file?
Below is the code I used to try to handle this problem, to no avail.
// Wait until Python has created the reply file.
while(!FileIsExist("output.txt"))
{
   Sleep(100);   // don't spin at full speed while waiting for the reply
}

// Keep retrying while the writer still holds the file.
int file_handle = INVALID_HANDLE;
do
{
   ResetLastError();
   file_handle = FileOpen("output.txt", FILE_READ|FILE_SHARE_READ|FILE_TXT);
   if(file_handle == INVALID_HANDLE)
      Sleep(100);   // 5004 = ERR_CANNOT_OPEN_FILE: the file is still locked
}
while(file_handle == INVALID_HANDLE && GetLastError() == 5004);
// ...read the reply here, then FileClose(file_handle);
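The root cause is that Python creates the reply file and then keeps writing into it, so there is a window where output.txt exists but is incomplete. A common workaround is to have the Python side write the reply under a temporary name and only rename it to output.txt once the write is finished, so the reader never sees a half-written file. A minimal sketch of that idea (the filenames are the placeholders from the question, it assumes both programs point at the same folder, and os.replace requires Python 3):

import os

def write_reply_atomically(path, text):
    # Write the reply to a temp file, then atomically rename it into place.
    tmp_path = path + ".tmp"
    with open(tmp_path, "w") as f:
        f.write(text)
        f.flush()
        os.fsync(f.fileno())     # make sure the data is on disk before renaming
    os.replace(tmp_path, path)   # atomic on the same volume (Windows and POSIX)

# Example: produce the reply that the MQL5 side is polling for
write_reply_atomically("output.txt", "reply payload")

With this in place, the retry loop above only has to cover the brief moment of the rename itself.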

Related

How to ensure that globally a file can be read only once by my python script

I'm trying to create a script (Script #1) that writes a file that can only be read once globally, and then another script (Script #2) that reads this file only if it was never read before in the world.
Example Situation:
I create a CSV file with my Script #1 and email this CSV file to 10 people, who are on different computers.
All 10 try to run this file with my Script #2:
Expected behaviour:
The first person in the world to run Script #2 with this file gets a message saying they are the first person to read this file and can actually see the content.
The 2nd-10th people who try to read the file get a message saying someone has already read it before and can't access the file.
How can I accomplish this?
This is not something very serious, so I'm not really worried about security of the process, but want it to work.
I'm not giving the whole process because it would be a long explanation, but you can follow the points below:
Encrypt the file first.
Second, the script that reads the file must talk to a server application from which it receives the decryption key, and which records whether the file has been read before. If the file was already read earlier, the server simply does not send the decryption key.
After receiving the key, the script decrypts the file and reads it (a rough sketch of this flow is shown below).
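A minimal sketch of that scheme, using the requests and cryptography packages, and a hypothetical HTTP endpoint /key/<file_id> on your server that returns the decryption key the first time it is asked and an error afterwards (the server itself is not shown; the endpoint name, URL and file_id are made up for illustration):

import requests
from cryptography.fernet import Fernet

# --- Script #1: encrypt the CSV before mailing it out ---
def encrypt_file(plain_path, enc_path, key):
    with open(plain_path, "rb") as src, open(enc_path, "wb") as dst:
        dst.write(Fernet(key).encrypt(src.read()))

# key = Fernet.generate_key()  # stored on the server, never shipped with the file

# --- Script #2: ask the server for the key; it will only answer once ---
def read_once(enc_path, file_id, server="https://example.invalid"):
    resp = requests.get(server + "/key/" + file_id)   # hypothetical endpoint
    if resp.status_code != 200:
        print("Someone has already read this file.")
        return None
    print("You are the first person to read this file.")
    key = resp.content          # the server marks the file as read at this point
    with open(enc_path, "rb") as src:
        return Fernet(key).decrypt(src.read())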

Can I save a text file in python without closing it?

I am writing a program in which I would like to be able to view a log file before the program is complete. I have noticed that, in Python (2.7 and 3), file.write() does not save the file; file.close() does. I don't want to create a million little log files with unique names, but I would like to be able to view the updated log file before the program is finished. How can I do this?
Now, to be clear I am scripting using Ansys Workbench (trying to batch some CFX runs). Here's a link to a tutorial that shows what I'm talking about. They appear to have wrapped python, and by running the script I can send commands to the various modules. When the script is running there is no console onscreen and it appears to be eating all of the print statements, so the only way I can report what's happening is via a file. Also, I don't want to bring a console window up because eventually I will just run the program in batch mode (no interface). But the simulations take a long time to run and I can't wait for the program to finish before checking on what's happening.
You would need this:
file.flush()
# typically the line above is enough; the call below ensures the data actually reaches disk
os.fsync(file.fileno())
Check this: http://docs.python.org/2/library/stdtypes.html#file.flush
file.flush()
Flush the internal buffer, like stdio's fflush(). This may be a no-op on some file-like objects.
Note flush() does not necessarily write the file’s data to disk. Use flush() followed by os.fsync() to ensure this behavior.
EDITED: See this question for a detailed explanation: what exactly is Python's file.flush() doing?
Does file.flush() after each write help? – Hannu
This will write the file to disk immediately:
file.flush()
os.fsync(file.fileno())
According to the documentation https://docs.python.org/2/library/os.html#os.fsync
Force write of file with filedescriptor fd to disk. On Unix, this calls the native fsync() function; on Windows, the MS _commit() function.
If you’re starting with a Python file object f, first do f.flush(), and then do os.fsync(f.fileno()), to ensure that all internal buffers associated with f are written to disk.
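Putting the two calls together, here is a small self-contained sketch of a log writer that makes each line visible to other programs (an editor, tail, etc.) as soon as it is written; log.txt and the messages are just placeholders:

import os
import time

def append_log_line(log_file, message):
    log_file.write(message + "\n")
    log_file.flush()                # push Python's internal buffer to the OS
    os.fsync(log_file.fileno())     # ask the OS to write it to disk as well

with open("log.txt", "a") as log_file:
    for step in range(5):
        append_log_line(log_file, "finished step %d" % step)
        time.sleep(1)               # the file can be inspected between steps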

Beginner Python: Saving an excel file while it is open

I have a simple problem that I hope will have a simple solution.
I am writing Python (2.7) code using the xlwt package to write Excel files. The program takes data and writes it out to a file that is being saved constantly. The problem is that whenever I have the file open to check the data and Python tries to save the file, the program crashes.
Is there any way to make Python save the file while I have it open for reading?
My experience is that sashkello is correct, Excel locks the file. Even OpenOffice/LibreOffice do this. They lock the file on disk and create a temp version as a working copy. ANY program trying to access the open file will be denied by the OS. The reason for this is because many corporations treat Excel files as databases but the users have no understanding of the issues involved in concurrency and synchronisation.
I am on Linux and I get this behaviour (at least when the file is on a Samba share). Look in the same directory as your file: if a file called .~lock.[filename]# exists, then you will be unable to read your file from another program. I'm not sure what enforces this lock, but I suspect it's an NTFS attribute. Note that even a simple cp or cat fails: cp: error reading ‘CATALOGUE.ods’: Input/output error
UPDATE: The actual locking mechanism appears to be "oplocks", a concept connected to Windows shares: http://oreilly.com/openbook/samba/book/ch05_05.html . If the share is managed by Samba, the workaround is to disable locks on certain file types, e.g.:
veto oplock files = /*.xlsx/
If you aren't using a share or NTFS on Linux, then I guess you should be able to read/write the file as long as your script has write permissions. By default only the user who created the file has write access.
WORKAROUND 2: The restriction only seems to apply if you have the file open in Excel/LO as writable; however, LO at least allows you to open a file as read-only (go to File -> Properties -> Security, set Read-Only, then save and re-open the file). I don't know if this will also make it RO for xlwt though.
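Building on the .~lock.[filename]# observation above, a quick pre-flight check from Python might look like the sketch below. The .~lock.<name># pattern is the one LibreOffice/OpenOffice use; Excel keeps a hidden owner file named ~$<name> instead, so treat this as a heuristic rather than a guarantee:

import os

def looks_locked(path):
    # Heuristic: look for the lock/owner files office suites create next to
    # a document they currently have open.
    folder, name = os.path.split(os.path.abspath(path))
    candidates = [
        os.path.join(folder, ".~lock." + name + "#"),   # LibreOffice/OpenOffice
        os.path.join(folder, "~$" + name),              # Excel
    ]
    return any(os.path.exists(c) for c in candidates)

print(looks_locked("CATALOGUE.ods"))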
Hah, funny I ran across your post. I actually just implemented this tonight.
The issue is that an Excel workbook object is either for reading or for writing, not both. You cannot read and write through the same object. So if you have another way to save the data, please use it. I'm in a position where I don't have that option, and you might be too.
You're going to need xlutils; it's the bread and butter for this.
Here's some example code:
import xlrd
from xlutils.copy import copy

wb_filename = 'example.xls'

# Open the existing workbook for reading.
wb_object = xlrd.open_workbook(wb_filename)
# You can read from wb_object to your heart's content.

# When it comes to writing, copy the read-only object and work off the copy.
write_object = copy(wb_object)
# Write to write_object all you want, then save it:
write_object.save(wb_filename)
And that's it. Note that if you read from wb_object, write to the copy, and then read the original again, it won't reflect the changes. You either need to recreate wb_object or keep some sort of table in memory that you track while working through it.

python xlrd check if file is already open

Trying to determine if a certain Excel file is already open. I have a script that opens a template Excel file, writes data to it, and then saves it under a specific formatted name. Now if the person runs the script again and forgets to close the Excel file, I get errors that stop the program, saying it can't save the file because it is already open. Is there a way to check not only whether a program (Excel) is open, but whether a specific file is? That way I can prompt the user to either close the file or save it under another filename.
If the processing time of the input is really small, you do not need to detect this before processing the file. You can easily catch the error that you describe ("I get errors that stop the program saying can't save the file as it is already open") and provide a meaningful error message to the user.
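A minimal sketch of that approach with xlwt; it assumes the save raises IOError (PermissionError on Python 3, which IOError also catches) when Excel still has the target file open, and report.xls is a placeholder name:

import xlwt

book = xlwt.Workbook()
sheet = book.add_sheet("data")
sheet.write(0, 0, "hello")

out_path = "report.xls"
try:
    book.save(out_path)
except IOError:
    # Raised when the file is locked, e.g. still open in Excel.
    print("Cannot save %s - please close it in Excel and run the script again."
          % out_path)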

python saving unicode into file

I'm having some trouble figuring out how to save unicode into a file in Python. I have the following code, and if I run it in a script test.py, it should create a new file called priceinfo.txt and write what's in price_info to the file. But I do not see the file; can anyone enlighten me on what the problem could be?
Thanks a lot!
import codecs

price_info = u'it costs \u20ac 5'
f = codecs.open('priceinfo.txt', 'wb', 'utf-8')
f.write(price_info)
f.close()
I can think of several reasons:
the file gets created, but in a different directory. Be certain what the working directory of the script is.
you don't have permission to create the file, in the directory where you want to create it.
you have some error in your Python script, and it does not get executed at all.
To find out which one it is, run the script in a command window, and check for any error output that you get.
Assuming no error messages from the program (which would be the result of forgetting to import the codecs module), are you sure you're looking in the right place? That code writes priceinfo.txt in the current working directory (IOW are you sure that you're looking inside the working directory?)
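To rule out the "wrong directory" explanation quickly, you can make the script print where the file actually ends up, along these lines:

import codecs
import os

price_info = u'it costs \u20ac 5'
f = codecs.open('priceinfo.txt', 'wb', 'utf-8')
f.write(price_info)
f.close()

# Show the working directory and the file's resolved location.
print(os.getcwd())
print(os.path.abspath('priceinfo.txt'))
print(os.path.exists('priceinfo.txt'))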
