Trying to connect to an SMB share via pysmb and getting this error...
smb.smb_structs.OperationFailure: Failed to list on \\\\H021BSBD20\\shared_folder: Unable to connect to shared device
The code I am using looks like...
from smb.SMBConnection import SMBConnection
import json
import pprint
import warnings
pp = pprint.PrettyPrinter(indent=4)
PROJECT_HOME = "/path/to/my/project/"
# load configs
CONF = json.load(open(f"{PROJECT_HOME}/configs/configs.json"))
pp.pprint(CONF)
# list all files in storage smb dir
#https://pysmb.readthedocs.io/en/latest/api/smb_SMBConnection.html#smb.SMBConnection.SMBConnection.listPath
IS_DIRECT_TCP = False
CNXN_PORT = 139 if not IS_DIRECT_TCP else 445
LOCAL_IP = "172.18.4.69"
REMOTE_NAME = "H021BSBD20" # exact name shown as Device Name in System Settings
SERVICE_NAME = "\\\\H021BSBD20\\shared_folder"
REMOTE_IP = "172.18.7.102"
try:
    conn = SMBConnection(CONF['smb_creds']['username'], CONF['smb_creds']['password'],
                         my_name=LOCAL_IP, remote_name=REMOTE_NAME,
                         use_ntlm_v2=True,
                         is_direct_tcp=IS_DIRECT_TCP)
    conn.connect(REMOTE_IP, CNXN_PORT)
except Exception:
    warnings.warn("\n\nFailed to initially connect, attempting again with param use_ntlm_v2=False\n\n")
    conn = SMBConnection(CONF['smb_creds']['username'], CONF['smb_creds']['password'],
                         my_name=LOCAL_IP, remote_name=REMOTE_NAME,
                         use_ntlm_v2=False,
                         is_direct_tcp=IS_DIRECT_TCP)
    conn.connect(REMOTE_IP, CNXN_PORT)
files = conn.listPath(f'{SERVICE_NAME}', '\\')
pp.pprint(files)
Using smbclient on my machine, I can successfully connect to the share by doing...
[root@airflowetl etl]# smbclient -U my_user \\\\H021BSBD20\\shared_folder
The number of backslashes I use in the Python code is meant to produce the same string that works with smbclient (I have tried with fewer backslashes in the code and that has not helped).
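As a quick sanity check (plain Python string handling, nothing pysmb-specific), the escaped literal does evaluate to the same UNC string that smbclient accepts:
# The doubled backslashes in the source collapse to single ones in the value.
unc = "\\\\H021BSBD20\\shared_folder"
print(unc)  # prints \\H021BSBD20\shared_folder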
Note that the user I am using to access the shared folder, both in the Python code and with smbclient, is not able to log on to the machine that hosts the share (they are only allowed to access that particular shared folder, as shown above).
Does anyone know what could be happening here? Any other debugging steps that could be done?
After asking on the github repo Issues section (https://github.com/miketeo/pysmb/issues/169), I was able to fix the problem. It was just due to the argument I was passing for the conn.listPath() service_name param.
When looking closer at the docs for that function (https://pysmb.readthedocs.io/en/latest/api/smb_SMBConnection.html), I saw...
service_name (string/unicode) – the name of the shared folder for the path
Originally, I was only looking at the function signature, which just says service_name, so I assumed it would work the same way as the smbclient command-line tool, where I had been entering the service name as \\\\devicename\\sharename. As the docstring shows, pysmb wants just the share name as the service_name.
So rather than
files = conn.listPath("\\\\H021BSBD20\\shared_folder", '\\')
I do
files = conn.listPath("shared_folder", '\\')
The full refactored snippet is shown below, just for reference.
import argparse
import json
import os
import pprint
import socket
import sys
import traceback
import warnings
from smb.SMBConnection import SMBConnection
def parseArguments():
    # Create argument parser
    parser = argparse.ArgumentParser()

    # Positional mandatory arguments
    parser.add_argument("project_home", help="project home path", type=str)
    parser.add_argument("device_name", help="device (eg. NetBIOS) name in configs of share to process", type=str)

    # Optional arguments
    # parser.add_argument("-dfd", "--data_file_dir",
    #                     help="path to data files dir to be pushed to sink, else source columns based on form_type",
    #                     type=str, default=None)

    # Parse arguments
    args = parser.parse_args()
    return args


args = parseArguments()
for a in args.__dict__:
    print(str(a) + ": " + str(args.__dict__[a]))
pp = pprint.PrettyPrinter(indent=4)
PROJECT_HOME = args.project_home
REMOTE_NAME = args.device_name
# load configs
CONF = json.load(open(f"{PROJECT_HOME}/configs/configs.json"))
CREDS = json.load(open(f"{PROJECT_HOME}/configs/creds.json"))
pp.pprint(CONF)
SMB_CONFS = next(info for info in CONF["smb_server_configs"] if info["device_name"] == args.device_name)
print("\nUsing details for device:")
pp.pprint(SMB_CONFS)
# list all files in storage smb dir
#https://pysmb.readthedocs.io/en/latest/api/smb_SMBConnection.html#smb.SMBConnection.SMBConnection.listPath
IS_DIRECT_TCP = False
CNXN_PORT = 139 if IS_DIRECT_TCP is False else 445
LOCAL_IP = socket.gethostname() #"172.18.4.69"
REMOTE_NAME = SMB_CONFS["device_name"]
SHARE_FOLDER = SMB_CONFS["share_folder"]
REMOTE_IP = socket.gethostbyname(REMOTE_NAME) # "172.18.7.102"
print(LOCAL_IP)
print(REMOTE_NAME)
try:
    conn = SMBConnection(CREDS['smb_creds']['username'], CREDS['smb_creds']['password'],
                         my_name=LOCAL_IP, remote_name=REMOTE_NAME,
                         use_ntlm_v2=False,
                         is_direct_tcp=IS_DIRECT_TCP)
    conn.connect(REMOTE_IP, CNXN_PORT)
except Exception:
    traceback.print_exc()
    warnings.warn("\n\nFailed to initially connect, attempting again with param use_ntlm_v2=True\n\n")
    conn = SMBConnection(CREDS['smb_creds']['username'], CREDS['smb_creds']['password'],
                         my_name=LOCAL_IP, remote_name=REMOTE_NAME,
                         use_ntlm_v2=True,
                         is_direct_tcp=IS_DIRECT_TCP)
    conn.connect(REMOTE_IP, CNXN_PORT)

files = conn.listPath(SHARE_FOLDER, '\\')
if len(files) > 0:
    print("Found listed files")
    for f in files:
        print(f.filename)
else:
    print("No files to list, this likely indicates a problem. Exiting...")
    exit(255)
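For reference, the script above only reads a handful of keys from the two config files. A hypothetical minimal shape for them, shown here as Python literals with placeholder values, would be:
# Hypothetical minimal contents of the two files the script reads; only the
# keys it actually accesses are included.
configs_json = {                       # -> {PROJECT_HOME}/configs/configs.json
    "smb_server_configs": [
        {"device_name": "H021BSBD20", "share_folder": "shared_folder"}
    ]
}
creds_json = {                         # -> {PROJECT_HOME}/configs/creds.json
    "smb_creds": {"username": "my_user", "password": "<password>"}
}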
I basically want to run this command: argo submit -n argo workflows/workflow.yaml -f params.json through the official python SDK.
This example covers how to submit a workflow manifest, but I don't know where to add the input parameter file.
import os
from pprint import pprint
import yaml
from pathlib import Path
import argo_workflows
from argo_workflows.api import workflow_service_api
from argo_workflows.model.io_argoproj_workflow_v1alpha1_workflow_create_request import \
IoArgoprojWorkflowV1alpha1WorkflowCreateRequest
configuration = argo_workflows.Configuration(host="https://localhost:2746")
configuration.verify_ssl = False
with open("workflows/workflow.yaml", "r") as f:
manifest = yaml.safe_load(f)
api_client = argo_workflows.ApiClient(configuration)
api_instance = workflow_service_api.WorkflowServiceApi(api_client)
api_response = api_instance.create_workflow(
namespace="argo",
body=IoArgoprojWorkflowV1alpha1WorkflowCreateRequest(workflow=manifest, _check_type=False),
_check_return_type=False)
pprint(api_response)
Where to pass in the params.json file?
I found this snippet in the docs of WorkflowServiceApi.md (which was apparently too big to render as markdown):
import time
import argo_workflows
from argo_workflows.api import workflow_service_api
from argo_workflows.model.grpc_gateway_runtime_error import GrpcGatewayRuntimeError
from argo_workflows.model.io_argoproj_workflow_v1alpha1_workflow_submit_request import IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest
from argo_workflows.model.io_argoproj_workflow_v1alpha1_workflow import IoArgoprojWorkflowV1alpha1Workflow
# The two models used below were missing from the snippet as pasted; the module
# paths follow the SDK's generated naming convention.
from argo_workflows.model.io_argoproj_workflow_v1alpha1_submit_opts import IoArgoprojWorkflowV1alpha1SubmitOpts
from argo_workflows.model.owner_reference import OwnerReference
from pprint import pprint

# Defining the host is optional and defaults to http://localhost:2746
# See configuration.py for a list of all supported configuration parameters.
configuration = argo_workflows.Configuration(
    host = "http://localhost:2746"
)

# Enter a context with an instance of the API client
with argo_workflows.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = workflow_service_api.WorkflowServiceApi(api_client)
    namespace = "namespace_example"  # str |
    body = IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest(
        namespace="namespace_example",
        resource_kind="resource_kind_example",
        resource_name="resource_name_example",
        submit_options=IoArgoprojWorkflowV1alpha1SubmitOpts(
            annotations="annotations_example",
            dry_run=True,
            entry_point="entry_point_example",
            generate_name="generate_name_example",
            labels="labels_example",
            name="name_example",
            owner_reference=OwnerReference(
                api_version="api_version_example",
                block_owner_deletion=True,
                controller=True,
                kind="kind_example",
                name="name_example",
                uid="uid_example",
            ),
            parameter_file="parameter_file_example",
            parameters=[
                "parameters_example",
            ],
            pod_priority_class_name="pod_priority_class_name_example",
            priority=1,
            server_dry_run=True,
            service_account="service_account_example",
        ),
    )  # IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest |

    # example passing only required values which don't have defaults set
    try:
        api_response = api_instance.submit_workflow(namespace, body)
        pprint(api_response)
    except argo_workflows.ApiException as e:
        print("Exception when calling WorkflowServiceApi->submit_workflow: %s\n" % e)
Have you tried using an IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest? It looks like it has submit_options of type IoArgoprojWorkflowV1alpha1SubmitOpts, which has a parameter_file param.
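Building on that, here is a rough, untested sketch of what the submission could look like. Assumptions to flag: the manifest is assumed to already exist on the cluster as a WorkflowTemplate named my-workflow-template (submit_workflow submits an existing resource rather than an inline manifest, so both the kind and the name are placeholders), params.json is assumed to be a flat JSON object mapping parameter names to values, and the io_argoproj_workflow_v1alpha1_submit_opts module path is inferred from the SDK's generated naming convention:
import json

import argo_workflows
from argo_workflows.api import workflow_service_api
from argo_workflows.model.io_argoproj_workflow_v1alpha1_workflow_submit_request import \
    IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest
from argo_workflows.model.io_argoproj_workflow_v1alpha1_submit_opts import \
    IoArgoprojWorkflowV1alpha1SubmitOpts

# Flatten params.json ({"key": "value", ...}) into the "name=value" strings
# that SubmitOpts.parameters expects (same format as `argo submit -p`).
with open("params.json") as f:
    params = json.load(f)
param_strings = [f"{k}={v}" for k, v in params.items()]

configuration = argo_workflows.Configuration(host="https://localhost:2746")
configuration.verify_ssl = False

with argo_workflows.ApiClient(configuration) as api_client:
    api_instance = workflow_service_api.WorkflowServiceApi(api_client)
    api_response = api_instance.submit_workflow(
        namespace="argo",
        body=IoArgoprojWorkflowV1alpha1WorkflowSubmitRequest(
            resource_kind="WorkflowTemplate",      # assumed: manifest registered on the cluster
            resource_name="my-workflow-template",  # hypothetical template name
            submit_options=IoArgoprojWorkflowV1alpha1SubmitOpts(parameters=param_strings),
        ),
        _check_return_type=False,
    )
    print(api_response)
If you need to keep submitting a local manifest instead, the create_workflow route from the question still applies; in that case you would merge the values from params.json into the manifest's spec.arguments.parameters before sending it.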
I'm using Python 3.8.3 on a MacBook Pro (13-inch, 2016, Two Thunderbolt 3 ports)
I'm currently building an online chat and this is my person.py code:
class Person:
    """
    Represents a person; holds name, socket client, and client addr
    """
    def __init__(self, addr, client):
        self.addr = addr
        self.client = client
        self.name = None

    def set_name(self, name):
        self.name = name

    def __repr__(self):
        return f"Person({self.addr}, {self.name})"
And this is my server.py code:
from threading import Thread
import time
from socket import socket, AF_INET, SOCK_STREAM  # used for SERVER below
from person import Person

# GLOBAL CONSTANTS
HOST = 'localhost'
PORT = 5500
ADDR = (HOST, PORT)
MAX_CONNECTIONS = 10
BUFSIZ = 512

# GLOBAL VARIABLES
persons = []
SERVER = socket(AF_INET, SOCK_STREAM)
SERVER.bind(ADDR)  # set up server


def broadcast(msg, name):
    """
    send new messages to all clients
    :param msg: bytes["utf8"]
    :param name: str
    :return:
    """
    for person in persons:
        client = person.client
        client.send(bytes(name, "utf8") + msg)


def client_communication(person):
    """
    Thread to handle all messages from client
    :param person: Person
    :return: None
    """
    client = person.client

    # get person's name
    name = client.recv(BUFSIZ).decode("utf8")
    person.set_name(name)

    msg = bytes(f"{name} has joined the chat!", "utf8")
    broadcast(msg, "")  # broadcast welcome message

    while True:
        try:
            msg = client.recv(BUFSIZ)
            if msg == bytes("{quit}", "utf8"):
                client.close()
                persons.remove(person)
                # encode to bytes so broadcast() can concatenate with the name prefix
                broadcast(bytes(f"{name} has left the chat...", "utf8"), "")
                print(f"[DISCONNECTED] {name} disconnected")
                break
            else:
                broadcast(msg, name + ": ")
                print(f"{name}: ", msg.decode("utf8"))
        except Exception as e:
            print("[EXCEPTION]", e)
            break


def wait_for_connection():
    """
    Wait for connection from new clients, start new thread once connected
    :return: None
    """
    run = True
    while run:
        try:
            client, addr = SERVER.accept()
            person = Person(addr, client)
            persons.append(person)
            print(f"[CONNECTION] {addr} connected to the server at {time.time()}")
            Thread(target=client_communication, args=(person,)).start()
        except Exception as e:
            print("[EXCEPTION]", e)
            run = False
            print("SERVER CRASHED")


if __name__ == "__main__":
    SERVER.listen(MAX_CONNECTIONS)  # listen for connections
    print("[STARTED] Waiting for connections...")
    ACCEPT_THREAD = Thread(target=wait_for_connection)
    ACCEPT_THREAD.start()
    ACCEPT_THREAD.join()
    SERVER.close()
The problem is, every time I try to run the program, it gives me this error:
from person import Person
ModuleNotFoundError: No module named 'person'
Does somebody know how to solve this problem?
The Problem
Most likely, this is a path problem. Python searches for modules in the installation's lib and site-packages folders, and it also looks relative to where you are running from. So if you run a file from another folder but try to import a module that sits next to that file, Python may look in your running directory instead of the directory of the file you're running. Here are two solutions to this problem.
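If you want to see exactly where Python is looking, print the search path; the directory containing person.py has to appear in this list for the import to succeed:
import sys

# The directories Python searches, in order, when resolving `import person`.
print(sys.path)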
First Solution
You can make the working directory the same as the directory of the file you're running. Given the following file structure:
workingdir
+-- thecode
    +-- server.py
    +-- person.py
The current working directory (i.e. where you run your command from) is workingdir, so a terminal session might look like this:
workingdir % python thecode/server.py
You need to make thecode the working directory instead: run cd thecode to get there, then run python server.py.
Second Solution
You can instead add the file's directory to the Python path. The path is held in sys.path and is rebuilt for each Python run, so it is best to append to it at the very top of your server.py file, before the import, so that person.py is on the path by the time you import it. Use something like the following code to do this.
import sys
import os.path
sys.path.append(os.path.split(os.path.abspath(__file__))[0])
# now you may import Person.
from person import Person
The first two modules, sys and os, are pretty standard and give you the tools to append to the Python path. os.path.abspath gets the absolute path of the current file, in this case server.py. Then os.path.split splits that path into a head (all the directories) and a tail (the filename); taking index [0] keeps the head, i.e. the directory containing the file. Finally, appending that directory to sys.path allows Python to find person.py next to your file.
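If you prefer pathlib, an equivalent spelling of the same idea (append the directory that contains the running file to the search path) is:
import sys
from pathlib import Path

# Same effect as the os.path version above.
sys.path.append(str(Path(__file__).resolve().parent))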
Other Solutions (That probably won't be used in this case)
You can also create a custom Python package and install it using pip. This isn't very useful in this case, since the import problem is just one file; however, if you ever want to create a custom Python project that others may use, this Python docs article will help.
The final solution (which admittedly I used to do, and which isn't the best workaround) is to put your file in a folder that's already on the Python path, most likely the lib folder. For me, running Python 3.8, that path is /Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8; for any Python 3.x version it is the same apart from the version number. Python 2.x sometimes uses different locations.
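Rather than hard-coding those locations, you can also ask your own interpreter where they are:
import site
import sys

# Print the interpreter's site-packages directories and its installation prefix.
print(site.getsitepackages())
print(sys.prefix)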
I am working on a script to transfer some files to a Cisco IOS device using netmiko FileTransfer. Below is the code that I found to accomplish this. However, I can't seem to find where source_file should be, or how to specify where on the host that file lives. How do I specify where to copy the file from?
dest_file_system = 'disk0:/'
source_file = 'test1.txt'  # where should this file live?
dest_file = 'test1.txt'

with FileTransfer(ssh_conn, source_file=source_file, dest_file=dest_file,
                  file_system=dest_file_system) as scp_transfer:
    if not scp_transfer.check_file_exists():
        if not scp_transfer.verify_space_available():
            raise ValueError("Insufficient space available on remote device")
        print("\nTransferring file\n")
        scp_transfer.transfer_file()
From the script's perspective, if you simply want to refer to the file by name, it should be in the same directory that you run the script from; any relative path in the script is resolved against that working directory.
Example 1: file_name.txt is in the same directory you run your script from. In your script, simply call the file by name: source = "file_name.txt".
Example 2: create a folder called test_folder in that same directory and move file_name.txt into it. Your source variable would now need to look like this: source = "test_folder/file_name.txt".
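If you would rather not depend on which directory you happen to run the script from, you can also give source_file an absolute path, or anchor it to the script's own location. A small sketch (the paths here are hypothetical):
import os

# Option 1: an absolute path on the machine running the script.
source_file = "/home/user/transfers/test1.txt"

# Option 2: resolve the file relative to the script itself, so it works no
# matter what the current working directory is.
source_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), "test1.txt")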
Hopefully you have found the solution by now. If not, you can also try netmiko's file_transfer feature, which is a more secure and efficient way to do it.
Try the sample script below.
from getpass import getpass
from netmiko import ConnectHandler, file_transfer

password = getpass()

cisco = {
    "device_type": "cisco_ios",
    "host": "cisco1.twb-tech.com",
    "username": "pyclass",
    "password": password,
}

source_file = "test1.txt"
dest_file = "test1.txt"
direction = "put"
file_system = "flash:"

ssh_conn = ConnectHandler(**cisco)
transfer_dict = file_transfer(
    ssh_conn,
    source_file=source_file,
    dest_file=dest_file,
    file_system=file_system,
    direction=direction,
    overwrite_file=True,
)

print(transfer_dict)
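As a follow-up, the dict returned by file_transfer reports what actually happened; in recent netmiko releases it includes flags along the lines of file_exists, file_transferred and file_verified (check the docs for your installed version), so you can fail loudly if nothing was copied:
# Key names below are from recent netmiko releases; verify against your version.
if not (transfer_dict.get("file_transferred") or transfer_dict.get("file_exists")):
    raise RuntimeError("SCP transfer failed: %s" % transfer_dict)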
Could anyone please help me find a way to get the source code from the environment, the Service Bus Console, or WebLogic?
I created a Python script which exports the JAR, but I need the source code. If I unjar the JAR, I do not get the exact source code: file names are shortened and some code is added automatically in the WSDLs, XQueries, etc. I don't want that.
Here's my WLST Python/Jython script:
from java.io import File
from java.io import FileInputStream
from java.io import FileOutputStream
from java.util import ArrayList
from java.util import Collections
from java.util import Properties
from com.bea.wli.sb.util import EnvValueTypes
from com.bea.wli.config.env import EnvValueQuery
from com.bea.wli.config import Ref
from com.bea.wli.config.customization import Customization
from com.bea.wli.config.customization import FindAndReplaceCustomization
import sys

#=======================================================================================
# Export an OSB project (or the whole domain) to a config JAR
#=======================================================================================
def exportAll(exportConfigFile):
    try:
        print "Loading export config from :", exportConfigFile
        exportConfigProp = loadProps(exportConfigFile)
        adminUrl = exportConfigProp.get("adminUrl")
        exportUser = exportConfigProp.get("exportUser")
        exportPasswd = exportConfigProp.get("exportPassword")
        exportJar = exportConfigProp.get("exportJar")
        customFile = exportConfigProp.get("customizationFile")
        passphrase = exportConfigProp.get("passphrase")
        project = sys.argv[2]
        if project == None:
            project = exportConfigProp.get("project")

        connectToServer(exportUser, exportPasswd, adminUrl)

        ALSBConfigurationMBean = findService("ALSBConfiguration", "com.bea.wli.sb.management.configuration.ALSBConfigurationMBean")
        print "ALSBConfiguration MBean found"
        print "Input project: ", project

        if project == None:
            ref = Ref.DOMAIN
            collection = Collections.singleton(ref)
            if passphrase == None:
                print "Export the config"
                theBytes = ALSBConfigurationMBean.exportProjects(collection, None)
            else:
                print "Export and encrypt the config"
                theBytes = ALSBConfigurationMBean.export(collection, true, passphrase)
        else:
            ref = Ref.makeProjectRef(project)
            print "Export the project", project
            collection = Collections.singleton(ref)
            theBytes = ALSBConfigurationMBean.export(collection, false, None)

        aFile = File(exportJar)
        out = FileOutputStream(aFile)
        out.write(theBytes)
        out.close()
        print "ALSB Configuration file: " + exportJar + " has been exported"

        if customFile != None:
            print collection
            query = EnvValueQuery(None, Collections.singleton(EnvValueTypes.WORK_MANAGER), collection, false, None, false)
            customEnv = FindAndReplaceCustomization('Set the right Work Manager', query, 'Production System Work Manager')
            print 'EnvValueCustomization created'
            customList = ArrayList()
            customList.add(customEnv)
            print customList
            aFile = File(customFile)
            out = FileOutputStream(aFile)
            Customization.toXML(customList, out)
            out.close()
            print "ALSB Dummy Customization file: " + customFile + " has been created"
    except:
        raise

#=======================================================================================
# Utility function to load properties from a config file
#=======================================================================================
def loadProps(configPropFile):
    propInputStream = FileInputStream(configPropFile)
    configProps = Properties()
    configProps.load(propInputStream)
    return configProps

#=======================================================================================
# Connect to the Admin Server
#=======================================================================================
def connectToServer(username, password, url):
    connect(username, password, url)
    domainRuntime()

# EXPORT script init
try:
    exportAll(sys.argv[1])
except:
    print "Unexpected error: ", sys.exc_info()[0]
    dumpStack()
    raise
Any help would be appreciated.
What you get as a result of the export is the deployed unit. Yes, there is some metadata added/modified as a result of the deployment on the OSB runtime (deployment could also mean creating/editing components directly on the servicebus console).
To get it back as "source code" from the exported JAR, you can simply import it back into JDeveloper (12c) or Eclipse with OEPE (11g).
I'm trying to unmount a filesystem that I mounted using FilesystemMount, but I keep getting UnknownMethod exceptions. I've verified that I can call the method on the Device interface via D-Feet, but trying to do it via dbus directly doesn't appear to work at all. I've tried using the following arguments:
''
None
[]
['']
The following code demonstrates the problem:
import dbus
bus = dbus.SystemBus()
proxy = bus.get_object('org.freedesktop.UDisks', '/dev/fd0')
dev = dbus.Interface(proxy, 'org.freedesktop.UDisks.Device')
dev.FilesystemUnmount(['force'])
Exception:
dbus.exceptions.DBusException: org.freedesktop.DBus.Error.UnknownMethod: Method "FilesystemUmount" with signature "as" on interface "org.freedesktop.UDisks.Device" doesn't exist
Turns out that the problem is that FilesystemUnmount will only work against an ObjectPath that UDisks handed out, not against the raw device file path. By adding a check for that and looking the ObjectPath up via FindDeviceByDeviceFile first, I got it to work. See the code below.
import dbus

path = '/dev/fd0'

bus = dbus.SystemBus()
if not isinstance(path, dbus.ObjectPath):
    manager_obj = bus.get_object('org.freedesktop.UDisks',
                                 '/org/freedesktop/UDisks')
    manager = dbus.Interface(manager_obj, 'org.freedesktop.UDisks')
    path = manager.FindDeviceByDeviceFile(path)

proxy = bus.get_object('org.freedesktop.UDisks', path)
dev = dbus.Interface(proxy, 'org.freedesktop.UDisks.Device')
dev.FilesystemUnmount('')
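If you need this in more than one place, the same calls wrap up naturally into a small helper. A sketch (the function name is mine; the options argument matches the "as" signature from the error message, so ['force'] or an empty list both fit):
import dbus

def unmount_device(path, force=False):
    """Unmount a filesystem via UDisks, given a device file or a UDisks ObjectPath."""
    bus = dbus.SystemBus()
    if not isinstance(path, dbus.ObjectPath):
        manager_obj = bus.get_object('org.freedesktop.UDisks',
                                     '/org/freedesktop/UDisks')
        manager = dbus.Interface(manager_obj, 'org.freedesktop.UDisks')
        path = manager.FindDeviceByDeviceFile(path)
    proxy = bus.get_object('org.freedesktop.UDisks', path)
    dev = dbus.Interface(proxy, 'org.freedesktop.UDisks.Device')
    dev.FilesystemUnmount(['force'] if force else [])

unmount_device('/dev/fd0')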