Difficulty creating an XLSM file - python

I'm trying to create an xlsm file using xlwings, or with openpyxl if it's not possible with xlwings.
I'm on a Mac, so I can't use PyWin32.
I'm running the following Python code:
import xlwings as xw
b = xw.Book()
b.save('test_book2.xlsm')
When I run that code I receive a message (screenshot not included here).
When I click 'yes' to that message, I receive the following error:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/aeosa/appscript/reference.py", line 482, in __call__
    return self.AS_appdata.target().event(self._code, params, atts, codecs=self.AS_appdata).send(timeout, sendflags)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/aeosa/aem/aemsend.py", line 92, in send
    raise EventError(errornum, errormsg, eventresult)
aem.aemsend.EventError: Command failed: Parameter error. (-50)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/Users/matthewbell/Desktop/test.py", line 4, in <module>
    b.save('test_book2.xlsm')
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/xlwings/main.py", line 704, in save
    return self.impl.save(path)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/xlwings/_xlmac.py", line 244, in save
    self.xl.save_workbook_as(filename=hfs_path, overwrite=True)
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/aeosa/appscript/reference.py", line 518, in __call__
    raise CommandError(self, (args, kargs), e, self.AS_appdata) from e
appscript.reference.CommandError: Command failed:
    OSERROR: -50
    MESSAGE: Parameter error.
    COMMAND: app(pid=6198).workbooks['Book1'].save_workbook_as(filename='Macintosh HD:Users:matthewbell:Desktop:test_book2.xlsm', overwrite=True)
What can I do to create an xlsm file?
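A note on the openpyxl fallback mentioned in the question: openpyxl cannot add a VBA project to a workbook created from scratch, but it can preserve one from an existing macro-enabled file via keep_vba=True. A minimal sketch, assuming a macro-enabled template named template.xlsm already exists (the file name and cell below are placeholders, not from the original post):

from openpyxl import load_workbook

# keep_vba=True keeps the embedded vbaProject.bin, so the saved copy
# remains a valid macro-enabled workbook.
wb = load_workbook('template.xlsm', keep_vba=True)
ws = wb.active
ws['A1'] = 'hello'
wb.save('test_book2.xlsm')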

Related

Python connection with Hive database in HDInsight

I am trying to create a connection to Hive hosted in an HDInsight cluster through my Python script, and I am getting the error below:
Traceback (most recent call last):
  File "ClassLoader.java", line 357, in java.lang.ClassLoader.loadClass
  File "Launcher.java", line 349, in sun.misc.Launcher$AppClassLoader.loadClass
  File "ClassLoader.java", line 424, in java.lang.ClassLoader.loadClass
  File "URLClassLoader.java", line 382, in java.net.URLClassLoader.findClass
java.lang.ClassNotFoundException: java.lang.ClassNotFoundException: org.apache.thrift.transport.TTransportException
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "org.jpype.JPypeContext.java", line 330, in org.jpype.JPypeContext.callMethod
  File "Method.java", line 498, in java.lang.reflect.Method.invoke
  File "DelegatingMethodAccessorImpl.java", line 43, in sun.reflect.DelegatingMethodAccessorImpl.invoke
  File "NativeMethodAccessorImpl.java", line 62, in sun.reflect.NativeMethodAccessorImpl.invoke
  File "NativeMethodAccessorImpl.java", line -2, in sun.reflect.NativeMethodAccessorImpl.invoke0
  File "DriverManager.java", line 247, in java.sql.DriverManager.getConnection
  File "DriverManager.java", line 664, in java.sql.DriverManager.getConnection
  File "HiveDriver.java", line 105, in org.apache.hive.jdbc.HiveDriver.connect
Exception: Java Exception
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "s.py", line 5, in <module>
    "/root/jdbc/hive-jdbc-1.2.1000.2.6.5.3009-43.jar")
  File "/usr/local/lib64/python3.6/site-packages/jaydebeapi/__init__.py", line 412, in connect
    jconn = _jdbc_connect(jclassname, url, driver_args, jars, libs)
  File "/usr/local/lib64/python3.6/site-packages/jaydebeapi/__init__.py", line 230, in _jdbc_connect_jpype
    return jpype.java.sql.DriverManager.getConnection(url, *dargs)
java.lang.NoClassDefFoundError: java.lang.NoClassDefFoundError: org/apache/thrift/transport/TTransportException
My script is:
import jaydebeapi
conn = jaydebeapi.connect("org.apache.hive.jdbc.HiveDriver",
                          "jdbc:hive2://10.20.30.40:10001/default;transportMode=http;ssl=false;httpPath=/hive2",
                          ["username", "password"],
                          "/root/jdbc/hive-jdbc-1.2.1000.2.6.5.3009-43.jar")
I have exported CLASSPATH with all the jar files.
The error is java.lang.ClassNotFoundException: org.apache.thrift.transport.TTransportException, which indicates that the execution is not able to find the classes it needs from /root/jdbc/hive-jdbc-1.2.1000.2.6.5.3009-43.jar and its dependencies at runtime. I believe the jar is only placed on the host from which you are executing the code. I would suggest placing the jar file in the same directory structure on all the nodes in the cluster, and checking permissions so that the user executing the job has access to that path.
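In the same vein, jaydebeapi's jars argument also accepts a list of paths, so the driver's runtime dependencies can be passed alongside the Hive JDBC jar (libthrift is the jar that provides org.apache.thrift.transport.TTransportException). A sketch, where the dependency jar names and versions are assumptions to be replaced with the ones shipped with your cluster:

import jaydebeapi

# The dependency jars below are illustrative; match them to your cluster's versions.
jars = [
    "/root/jdbc/hive-jdbc-1.2.1000.2.6.5.3009-43.jar",
    "/root/jdbc/libthrift-0.9.3.jar",     # assumed name/version
    "/root/jdbc/hive-service-1.2.1.jar",  # assumed name/version
]

conn = jaydebeapi.connect(
    "org.apache.hive.jdbc.HiveDriver",
    "jdbc:hive2://10.20.30.40:10001/default;transportMode=http;ssl=false;httpPath=/hive2",
    ["username", "password"],
    jars,
)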

tensorflow_hub throwing this error: 'SentencepieceOp' when loading the link

I am trying to run the following line of code in PyCharm, and I have tensorflow_hub installed and imported.
use = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")
Any suggestions for the error below? I need this for my project.
Traceback (most recent call last):
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3820, in _get_op_def
    return self._op_def_cache[type]
KeyError: 'SentencepieceOp'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:/Users/Jon10/OneDrive/Documents/Computer Science/Dissertation/PythonPractice/TFTest/test.py", line 28, in <module>
    use = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_hub\module_v2.py", line 102, in load
    obj = tf_v1.saved_model.load_v2(module_path, tags=tags)
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\saved_model\load.py", line 517, in load
    return load_internal(export_dir, tags)
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\saved_model\load.py", line 541, in load_internal
    export_dir)
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\saved_model\load.py", line 114, in __init__
    meta_graph.graph_def.library))
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\saved_model\function_deserialization.py", line 312, in load_function_def_library
    copy, copy_functions=False)
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\framework\function_def_to_graph.py", line 61, in function_def_to_graph
    fdef, input_shapes, copy_functions)
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\framework\function_def_to_graph.py", line 214, in function_def_to_graph_def
    op_def = ops.get_default_graph()._get_op_def(node_def.op)  # pylint: disable=protected-access
  File "C:\Users\Jon10\miniconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\framework\ops.py", line 3824, in _get_op_def
    c_api.TF_GraphGetOpDef(self._c_graph, compat.as_bytes(type), buf)
tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'SentencepieceOp' in binary running on DESKTOP-..... Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
You need to install tensorflow_text and import it before calling hub.load.
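Concretely, importing tensorflow_text registers the SentencePiece ops with TensorFlow before the saved model is deserialized, which is exactly what the NotFoundError is complaining about. A minimal sketch (pick a tensorflow_text build that matches your TensorFlow version, as mismatched versions can cause further op-registration errors):

# pip install tensorflow_text

import tensorflow_hub as hub
import tensorflow_text  # noqa: F401 -- imported for its side effect of registering the ops

use = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")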

Getting OSError when running script

I am trying to set up the ChirpSDK, but every time I configure and run the code, I get this error:
Traceback (most recent call last):
  File "test.py", line 3, in <module>
    chirp = ChirpSDK()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/chirpsdk/chirpsdk.py", line 395, in __init__
    self.read_chirprc(block)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/chirpsdk/chirpsdk.py", line 501, in read_chirprc
    raise IOError('Could not find a ~/.chirprc file')
OSError: Could not find a ~/.chirprc file
Exception ignored in: <function ChirpSDK.__del__ at 0x10fa31af0>
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/chirpsdk/chirpsdk.py", line 422, in __del__
    self.close()
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/chirpsdk/chirpsdk.py", line 470, in close
    if self._sdk:
AttributeError: 'ChirpSDK' object has no attribute '_sdk'
I realize that the error is saying that my .chirprc file is not being recognized, but I have no idea how to remedy this. I created a .chirprc file in my /Users/username/ path, and named it c.chirprc (as the Chirp getting started article suggests), but I am still getting this error. Is there another part that I am missing? Am I reading the instructions wrong?
Thanks
The Chirp configuration file should be placed at /Users/<username>/.chirprc on macOS. Note that the name must be exactly .chirprc, with a leading dot and nothing before it; a file named c.chirprc will not be found.
If you run ls -l ~/.chirprc in the terminal, do you get any results? If it displays no such file or directory, then you have not created the file correctly.
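If you prefer to check from Python, a small sketch that looks for the file exactly where the SDK does (the error message shows it expands ~/.chirprc):

import os

path = os.path.expanduser('~/.chirprc')
# The SDK raises an IOError when this file is missing, so this mirrors its check.
print(path, 'exists' if os.path.isfile(path) else 'is missing')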

Exception raised at shapefile.py line 1029, in __init__

I get an error from the code that generates my shapefile; this is my code:
import shapefile
w=shapefile.Writer()
w.shapeType
w.field("kolom1","C")
w.field("kolom2","C")
w.record("ngek","satu")
w.record("ngok","dua")
w.point(1,1)
w.point(2,2)
w.save("soal1")
and when I run it I get this error:
python soal1.py
Traceback (most recent call last):
  File "soal1.py", line 2, in <module>
    w=shapefile.Writer()
  File "C:\Users\Fathi-PC\AppData\Local\Programs\Python\Python36\lib\site-packages\shapefile.py", line 1029, in __init__
    raise Exception('Either the target filepath, or any of shp, shx, or dbf must be set to create a shapefile.')
Exception: Either the target filepath, or any of shp, shx, or dbf must be set to create a shapefile.
Can anyone help me?
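The exception message itself points at the fix: in pyshp 2.x the target filepath (or shp/shx/dbf file objects) must be given when the Writer is constructed, and the 1.x save() method was replaced by close(). A sketch of the corrected script, assuming pyshp 2.x:

import shapefile

# pyshp 2.x requires the target path (or shp/shx/dbf streams) up front.
w = shapefile.Writer("soal1", shapeType=shapefile.POINT)
w.field("kolom1", "C")
w.field("kolom2", "C")

# Pair each shape with one record.
w.point(1, 1)
w.record("ngek", "satu")
w.point(2, 2)
w.record("ngok", "dua")

w.close()  # replaces w.save("soal1") from the 1.x API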

Why I am getting file not found exception while copying file on HDFS using Pydoop?

My Python code using the Pydoop API is shown below. I have a text file /home/progen/test.txt. While running the script to copy this file to HDFS, I am getting an error.
import pydoop.hdfs as hdfs
local_path = '/home/progen/test.txt'
hdfs_path = '/spark_user/'
host = 'master'
port = 9000
hdfsobj = hdfs.hdfs(host, port, user='spark', groups=['supergroup'])
hdfsobj.copy(local_path, hdfsobj, hdfs_path)
I am getting the error "File does not exist":
hdfsCopyImpl(src=/home/progen/test.txt, dst=/spark_user/, deleteSource=0): FileUtil#copy error:
java.io.FileNotFoundException: File does not exist: /home/progen/test.txt
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
Traceback (most recent call last):
  File "hdfs.py", line 8, in <module>
    hdfsobj.copy(local_path, hdfsobj, hdfs_path)
  File "/usr/local/lib/python2.7/dist-packages/pydoop/hdfs/fs.py", line 304, in copy
    return self.fs.copy(from_path, to_hdfs, to_path)
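One likely cause, judging from the stack trace (this is an assumption, not a confirmed diagnosis): copy() resolves the source path on the filesystem object it is called on, so /home/progen/test.txt is looked up on HDFS (note DistributedFileSystem.getFileStatus in the trace) rather than on the local disk. A sketch of a fix that opens the local filesystem explicitly (in Pydoop, hdfs.hdfs('', 0) connects to the local filesystem):

import pydoop.hdfs as hdfs

local_path = '/home/progen/test.txt'
hdfs_path = '/spark_user/'

# Open the local filesystem and the remote HDFS instance separately.
localfs = hdfs.hdfs('', 0)  # host='' selects the local filesystem
hdfsobj = hdfs.hdfs('master', 9000, user='spark', groups=['supergroup'])

# Call copy() on the local fs so the source path is resolved locally,
# with HDFS passed as the destination filesystem.
localfs.copy(local_path, hdfsobj, hdfs_path)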
