Python long running process crashing without error output

I'm running a Python script that performs the following steps:
Gets a list of hosts from a DB query
For each of those hosts, gets a csv
Processes the csv with Pandas
Inserts the output in elasticsearch
It works well, but because the list of hosts is large in some cases, some runs can take up to 6 days to finish.
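A minimal sketch of such a loop, with hypothetical placeholder names (query_hosts, fetch_csv, process, and the index name) standing in for the real script, might look like:

import pandas as pd
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch()

for host in query_hosts():                  # hypothetical: hosts from the DB query
    frame = pd.read_csv(fetch_csv(host))    # hypothetical: download this host's CSV
    frame = process(frame)                  # hypothetical: the Pandas processing step
    actions = ({"_index": "hosts", "_type": "doc", "_source": row.to_dict()}
               for _, row in frame.iterrows())
    helpers.bulk(es, actions)               # bulk-insert the output into Elasticsearch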
The problem is that sometimes it crashes without much information.
This is what the error looks like in /var/log/messages:
abrt[3769]: Saved core dump of pid 24382 (/usr/bin/python) to /var/spool/abrt/ccpp-2016-02-05-10:33:42-24382 (631136256 bytes)
abrtd: Directory 'ccpp-2016-02-05-10:33:42-24382' creation detected
abrtd: Interpreter crashed, but no packaged script detected: 'python /home/cloud/collection-all-es-commissioned.py'
abrtd: 'post-create' on '/var/spool/abrt/ccpp-2016-02-05-10:33:42-24382' exited with 1
abrtd: Deleting problem directory '/var/spool/abrt/ccpp-2016-02-05-10:33:42-24382'
It happens with Python 2.6 and 2.7 on Oracle Linux.
Any ideas on how to find out the root cause and fix it?
Thanks,
Isaac
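One low-overhead way to narrow down the root cause of a run this long is to log progress and memory usage around each host, so the last log line before the core dump shows where, and in what state, the process died. A minimal sketch using only the standard library (process_host and the hosts list are hypothetical placeholders):

import logging
import resource

logging.basicConfig(filename="collection.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

for host in hosts:
    rss_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss  # peak RSS in kB on Linux
    logging.info("starting %s, peak RSS so far: %d kB", host, rss_kb)
    process_host(host)                                           # hypothetical per-host work

If the peak RSS climbs steadily from host to host, the crash is most likely memory-related; if it stays flat, the last logged host points at the data that triggers it.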

Related

Can I edit the copy of the python script a process is running?

I've started running a Python script that takes about 3 days to finish. My understanding is that Linux processes make a copy of a python script at the time they're created, so further edits on the file won't impact the running process.
However, I just realized I've made a mistake on the code and the processes running for over 2 days are going to crash. Is it possible to edit the copy of the python script the process loaded?
I'm basically trying to figure out if there's a way to fix the mistake I've made without having to restart the script execution.
No, you can't. The interpreter byte-compiles your source code when it first reads it, so updating the file won't change the bytecode that is already running.
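A quick way to see this for yourself (a standalone hypothetical demo, unrelated to the script above): start the loop below, edit the string in the file while it runs, and the output never changes, because the bytecode was compiled when the interpreter first read the file.

import time

MESSAGE = "original"        # edit this value in the file while the script is running

while True:
    print(MESSAGE)          # keeps printing "original"; the running bytecode is unaffected
    time.sleep(5)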

Task Scheduler returns 0xFFFFFFFF in Windows 7

I created a Python script that collects data from a website and generates an Excel file based on a table in that website. I used pyinstaller with -w -F parameters to generate a .exe file.
I ran this file a few times and it worked perfectly, so I decided to use Task Scheduler to run it every hour. After two days of the task running every hour, while I was using the computer, Task Scheduler returned the error 0xFFFFFFFF when it tried to run the .exe, along with a pop-up saying: Failed to "something".
Since I needed data every hour, I ran the file manually and again... it worked!
Is there any way I can fix this? How can I make sure it won't fail again when I leave my computer online for a week and I'm not there to start it manually if it fails?
Here are the settings for the Task Scheduler:
Actions:
Program/script: C:\path1\path2\path3\Script_G1.exe
Start in (optional): C:\path1\path2\path3\
Settings:
Allow task to be run on demand
We had a similar problem where we'd get a 0xFFFFFFFF error when running our custom .exe from Task Scheduler, but it would work fine outside of Task Scheduler.
The workaround was to create a .bat file to run the .exe and make the scheduled task call the .bat file. Not a solution, obviously, but works in a pinch.
We had a similar problem. The program accessed the shared drive F:\SomeFolder\File.log and copied a file from it to a local folder. I had to change the shared-drive path in the program to the full server (UNC) path.
From
F:\SomeFolder\File.log
to
\\serverName\docs\SomeFolder\File.log
and then it worked.
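In Python terms, the same fix amounts to using the UNC path instead of the mapped drive letter, since drive mappings are per-user and may not exist for the account Task Scheduler runs under (the server, share, and destination below are hypothetical placeholders based on the answer above):

import shutil

# src = r"F:\SomeFolder\File.log"                 # mapped drive: may not exist for the task's account
src = r"\\serverName\docs\SomeFolder\File.log"    # full server (UNC) path works regardless of mappings
shutil.copy(src, r"C:\LocalFolder\File.log")      # hypothetical local destination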
In the "Action" check the "Start in (optional)" field. It makes a difference in these situations.

How to find a real cause for Python segmentation fault

Python script generates segmentation fault
I cannot find the source of the problem.
How does one debug a segfault simply in Python?
Are there recommended coding practices to avoid segmentation faults?
I wrote a script (1600 lines) to systematically download CSV files from a source and format the collected data. The script works very well for 100-200 files, but it systematically crashes with a segmentation fault message after a while.
- The crash always happens at the same spot in the script, with no obvious cause.
- I run it on Mac OS X, but the crash also happens on Ubuntu Linux and Debian 9.
- If I run the Pandas routines that crash in my script on single files, they work properly. The crash only happens when I loop the script 100-200 times.
- I checked every variable's content and the constructors (__init__), and they seem to be fine.
The code is too long to paste, but it is available on request.
The expected result is execution to the end; instead, it crashes after 100-200 iterations.
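One simple first step (assuming Python 3.3+; for older interpreters the same API is available from the faulthandler package on PyPI) is to enable faulthandler at the top of the script, so a segfault prints the Python-level traceback of every thread before the process dies, showing which line was executing:

import faulthandler
faulthandler.enable()   # dump tracebacks to stderr on SIGSEGV, SIGFPE, SIGABRT and SIGBUS

# ... the rest of the 1600-line download/format script runs unchanged ...

The same effect is available without touching the code by running the script as python -X faulthandler script.py or by setting the environment variable PYTHONFAULTHANDLER=1.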

Running two .py files in cmd via a .bat?

EDIT 2: Stress-tested the fix; it still failed after creating nearly 1 TB of intermediate files. I changed the .py code to delete intermediate files after the necessary processing, and the stress test succeeded after this change. The original issue was likely memory-related, but I still have no way of proving it.
EDIT: I'm still not sure whether or not it was a memory issue. However, I got the full process to run via the .bat by breaking the second script up into 4 separate .py files and running those instead of the full second script at once. Problem solved for now.
ORIGINAL:
I am running two .py files through cmd via a .bat file. The reason I am doing this is that the first .py is an arcpy script that requires 32-bit Python, and the second .py is a combined PCI and arcpy script that requires 64-bit Python.
The first file runs without issue. The second file gets to a certain point, the same point every time, and I am prompted with a "python.exe has stopped working" dialog box. This point is in the middle of a for loop in the PCI section of the code. I have run the PCI Python code in PCI's own interpreter without issue. I suspect this might be related to memory issues, but I am uncertain and have no idea how to check.
How do I check to see if memory issues are the problem?
.bat code below:
C:/Python27/ArcGIS10.5/python.exe E:/Path/CODE/file1.py
C:/Python27/ArcGISx6410.5/python.exe E:/Path/CODE/file2.py
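One way to check whether memory is the problem is to log the process's resident set size inside the suspect for loop. The stdlib resource module is not available on Windows, so the sketch below uses the third-party psutil package (pip install psutil); the loop and the work function are hypothetical stand-ins for the PCI section of file2.py:

import psutil

proc = psutil.Process()                      # the current python.exe process

for item in items_to_process:                # hypothetical: the loop that crashes
    do_pci_step(item)                        # hypothetical: the per-item PCI work
    rss_mb = proc.memory_info().rss / (1024.0 * 1024.0)
    print("after %s: %.1f MB resident" % (item, rss_mb))

If the resident size climbs steadily before the "stopped working" dialog appears, memory pressure is a likely culprit; if it stays flat, look elsewhere (for example at disk space, as the edits above suggest).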

Force delete windows directory from python [duplicate]

Possible Duplicate:
DirectoryInfo.Delete(True) Doesn't Delete When Folder Structure is Open in Windows Explorer
Folks, I am writing a Python testing harness, and part of the project involves uninstalling an application and then reinstalling it every night. Part of the uninstall task involves deleting the data directory. During the day, testers and developers will be logging into the computer, and occasionally a command prompt and/or log file is left open. This causes my script to fail, as Windows will not let me delete a file while another process has a handle to it.
I would like to know if it is possible to get the pid of the process that is holding onto the file so I can kill the process with WMI or something like that? I would like to do this without logging people out.
If that is impossible, is there a way, from Python, to force all users to log out, so that my script can keep working without waiting for me to show up and kill processes or log users out?
Any suggestions at all are greatly welcome. I am not an experienced windows programmer.
So it looks like Sysinternals provides the solution.
handle.exe does the job.
Do a system call (with the subprocess module in Python, for example) and search the output for the directory you are trying to delete. Get the pid, and then kill the process with another system call to pskill.exe.
I have not written the code yet, but I went through the procedure on the command line and it works. It should be trivial to script this in any programming language that can make system calls to the PsTools.
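A rough sketch of that procedure in Python follows. The parsing is an assumption, since handle.exe's output format varies between versions (it prints lines containing "pid: <number>" for each matching handle), both handle.exe and pskill.exe are assumed to be on PATH, and the target directory is a hypothetical example:

import re
import subprocess

target = r"C:\Program Files\MyApp\data"      # hypothetical directory that won't delete

# handle.exe <fragment> lists every open handle whose path contains the fragment
output = subprocess.check_output(["handle.exe", target])
pids = set(re.findall(r"pid:\s*(\d+)", output.decode("ascii", "ignore")))

for pid in pids:
    subprocess.call(["pskill.exe", pid])     # kill each process holding a handle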
