How do I execute Python OpenCV programs without sudo?

I am trying out Python OpenCV and facing multiple issues (mostly permission-related), as I cannot execute my video capture scripts without sudo. On the other hand, using sudo creates permission issues with the output files when they need to be shared with other processes.
When I'm not using sudo, this is the error message I get when cv2.VideoCapture(0) is called (cv2 being the OpenCV module):
cv2.error: /home/sidmeister/opencv/modules/videoio/src/cap_gstreamer.cpp:818: error: (-2) GStreamer: unable to start pipeline in function cvCaptureFromCAM_GStreamer
Going through the source code, I understand that the gst_element_set_state() function is returning GST_STATE_CHANGE_FAILURE. And, as I understand it, that's a permission issue!
So, circling back to my original point: is there any way to overcome these permission issues?

Add your user to the group video (run this as root or with sudo; log out and back in for the new group membership to take effect):
gpasswd -a sidmeister video
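Once the group change is active, a quick sanity check along these lines (a minimal sketch; /dev/video0 as the camera device is an assumption, adjust to your hardware) confirms the camera opens without sudo:

import os
import cv2

device = "/dev/video0"  # assumption: first V4L2 camera node

# False here usually means the group change has not taken effect yet
# (start a new login session, or use newgrp video).
print("device accessible without sudo:", os.access(device, os.R_OK | os.W_OK))

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Camera could not be opened; check device permissions")
ok, frame = cap.read()
print("grabbed a frame:", ok)
cap.release()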

Related

Why don't any Python commands work in my command line?

I had been using Python 3.7.3 normally on my system (Windows 10). A couple of days ago I noticed that the command prompt wouldn't run any of my Python programs. It did nothing, no response. I thought there was a problem displaying the output stream, so I ran an infinite loop (expecting to terminate the process with Ctrl+C), but again no response. Even the python --version command doesn't work. I uninstalled Python 3.7.3, downloaded the latest Python 3.8.5, and again the same problem. Image from my cmd line.
Please help me out. I use Sublime Text, so I prefer running my code through cmd.
Update: Here is another snapshot of my command line after running the commands from one answer and the comments.
Also, I think the problem lies within the PATH setting, but the paths are already added.
Image of environment variables, image of system variables.
It is possible you failed to add Python to your PATH in the installer.
Image displaying adding Python to your PATH in the installer
You may also have forgotten to disable the path length limit on the final setup screen.
Image displaying increasing the path length limit in the installer
If you navigate to C:\Users\chira\AppData\Local\Programs\Python\Python38-32 and then run python --version, do you get any output?
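If the interpreter does start from that folder, a small check like the following (just a sketch, nothing machine-specific) shows which python is actually running and whether its directory appears on PATH:

import os
import sys

exe_dir = os.path.dirname(sys.executable)
print("running interpreter:", sys.executable)

path_entries = os.environ.get("PATH", "").split(os.pathsep)
on_path = any(os.path.normcase(p.rstrip("\\")) == os.path.normcase(exe_dir) for p in path_entries)
print("interpreter directory on PATH:", on_path)

If that prints False, re-run the installer and tick the option that adds Python to PATH, or add the directory to the user PATH manually.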

Pytest and virtualenv || Failed to configure container: [Errno 1]

I am currently working on a Python program for finding flaky tests by running them multiple times. To achieve this, I'm executing the tests in a virtualenv in random order using pytest.
When I execute the program on a remote machine via a Slurm job, I get the following errors:
2019-11-26 18:18:18,642 - CRITICAL - Failed to configure container: [Errno 1] Creating overlay mount for '/' failed: Operation not permitted. Please use other directory modes, for example '--read-only-dir /'.
2019-11-26 18:18:18,777 - CRITICAL - Cannot execute 'pytest': execution in container failed.
This doesn't happen on my local machine, only on the task started via the Slurm job.
This is my first time working with Python at this level of complexity, so I'm not really sure where to start solving the problem.
Thanks a lot in advance!
I finally figured out that the problem only occurs on the newest version of benchexec.
When my Python program executes run exec --no-container --pytest inside a virtualenv and benchexec is version 2.0 or higher, the error message shown in my original post appears. I simply tell pip to install an older version of benchexec in my virtualenv and, voilà, it works.
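For example, pinning the version inside the virtualenv (the exact bound is an assumption; the post only says that 2.0 and higher fail):
pip install "benchexec<2.0"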
I would have created the tag benchexec but don't have the necessary reputation to do so. Feel free to do it for me!

Windows permission error when deleting joblib memmapping folder in Python

I am trying to run Python code that performs XGBoost training, and I want it to run in parallel to take less time building the model. But I am facing this issue while running the code:
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\<<username>>\\AppData\\Local\\Temp\\joblib_memmapping_folder_85680_9566857635\\85680-1746225537432-968de5958f0642829c37f0f09f0e8b00.pkl'
I have even tried running the Anaconda prompt as an administrator, but it is of no use. As a workaround I have also tried what is suggested in https://github.com/joblib/joblib/issues/806, but even then I am facing the same issue.
Could you please advise?
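For reference, joblib only creates that joblib_memmapping_folder when it memmaps large arrays during parallel calls. A common workaround (an assumption here, not a confirmed fix for this exact setup) is to disable memmapping, or to point it at a folder you control, via joblib.Parallel:

from joblib import Parallel, delayed

def square(x):
    return x * x

# max_nbytes=None disables memmapping, so no joblib_memmapping_folder is
# created under %TEMP%; alternatively pass temp_folder=r"C:\some\writable\dir"
# (hypothetical path) to relocate it while keeping memmapping enabled.
results = Parallel(n_jobs=2, max_nbytes=None)(delayed(square)(i) for i in range(10))
print(results)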

Hive transform with Python library in a folder

I'd like to run a Python script in my TRANSFORM that uses an external library, which I'm unable to install on the cluster.
For tests outside Hive, I could just copy the folder with the main files to my home directory, where the script is, and run it from there with an "import".
For Hive, I don't know how to use this library. I tried importing the files one by one with ADD FILE, and I also tried zipping them and using ADD ARCHIVE, but I keep getting the message:
FAILED: Execution Error, return code 20003 from org.apache.hadoop.hive.ql.exec.MapRedTask. An error occurred when trying to close the Operator running your custom script.
Any suggestions on how I can adapt my script to use these files?
Can you please post the log dump? That would be more helpful for troubleshooting. I came across the same problem, and it turned out that the transformer library I wrote was referring to non-existent classes.
I didn't have enough reputation to put this in as a comment.
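One pattern that sometimes helps with the original question (an assumption, not something confirmed in this thread): ship the library as a zip and prepend it to sys.path at the top of the TRANSFORM script, since Python can import pure-Python packages directly from a zip archive. Whether the zip arrives intact (ADD FILE) or gets unpacked (ADD ARCHIVE) depends on how you ship it, so adjust the path accordingly; mylib.zip and mylib.process are hypothetical names.

#!/usr/bin/env python
import os
import sys

# The added file/archive ends up in the task's working directory.
sys.path.insert(0, os.path.join(os.getcwd(), "mylib.zip"))

import mylib  # hypothetical package living inside mylib.zip

# Hive TRANSFORM streams rows as tab-separated lines on stdin/stdout.
for line in sys.stdin:
    cols = line.rstrip("\n").split("\t")
    print("\t".join(mylib.process(cols)))  # hypothetical function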

Create new files from upload

I'm stuck with a little problem:
I have a website for managing video files for different users. Each user can upload videos to a personal folder, which I don't want to change because I don't want to mix up files from different users. After uploading the video file I call a subprocess which should create a thumbnail. The subprocess fails because of an error in ffmpeg that seems to be related to missing write permissions. The uploaded file and the containing folder belong to www-data.
The code:
command = ("ffmpeg -ss 00:00:10 -i %s -dframes 1 %s -y" % (video_path, image_path)).split()
subprocess.call(command)
ffmpeg seems to run as a different user, because it only works if the target folder has 777 permissions. Otherwise it fails with this message:
av_interleaved_write_frame(): I/O error occurred
Usually that means that input file is truncated and/or corrupted.
If I touch the image file instead of creating it via ffmpeg, it doesn't matter whether the folder has 775 or 777 permissions. The resulting file then also belongs to www-data, which means the subprocess itself runs as www-data, doesn't it?
I thought about creating a subfolder with 777 permissions, but I don't like that for two reasons: the folder would have to be created dynamically, because I want to be able to create new users (and hence new subfolders in my uploads folder), and 777 permissions are not a nice solution anyway.
Do you have any suggestions as to what I have to change so that ffmpeg can write to the folder without opening security holes and without having to touch anything when creating a new user/folder?
I found it!
It was not a permission problem but something strange in error handling: if the code is run from the web server, the resulting image file is discarded when the error occurs; if it is run from the command line, the resulting file remains in the folder.
So basically I changed my command so that no error message appears any more, by using -vframes instead of -dframes (which only worked fine on Windows):
command = ("ffmpeg -ss 00:00:10 -i %s -vframes 1 %s -y" % (video_path, image_path)).split()
Try specifying -vframes 1 as described here
However, for ffmpeg 0.9 and later both -dframes and -vframes are aliases for -frames, so if you use a newer version of ffmpeg, the problem lies somewhere else.
You could run the conversion process asynchronously with Celery. Your worker process could be invoked with the required permissions, and Apache just needs permission to access the communication channel, RabbitMQ for example.
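A minimal sketch of that idea (the broker URL, module name, and the reuse of the same ffmpeg call are assumptions, not taken from the question):

# tasks.py -- run the worker as a user with write access to the upload folders:
#   celery -A tasks worker
import subprocess
from celery import Celery

app = Celery("tasks", broker="amqp://localhost")  # assumed RabbitMQ broker

@app.task
def make_thumbnail(video_path, image_path):
    # Same ffmpeg invocation as above, just executed by the Celery worker
    # instead of the web server process.
    command = ["ffmpeg", "-ss", "00:00:10", "-i", video_path,
               "-vframes", "1", image_path, "-y"]
    return subprocess.call(command)

In the upload view, the web application then only enqueues the job with make_thumbnail.delay(video_path, image_path).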
