I launched R in my command-line and typed the following:
install.packages("XML")
After selecting a mirror site, I saw the following output:
trying URL 'https://cloud.r-project.org/src/contrib/XML_3.98-1.4.tar.gz'
Content type 'application/x-gzip' length 1599214 bytes (1.5 MB)
==================================================
downloaded 1.5 MB
* installing *source* package ‘XML’ ...
** package ‘XML’ successfully unpacked and MD5 sums checked
checking for gcc... gcc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking how to run the C preprocessor... gcc -E
checking for sed... /usr/local/Library/ENV/4.3/sed
checking for pkg-config... /usr/local/bin/pkg-config
checking for xml2-config... /Users/richiethomas/anaconda/bin/xml2-config
USE_XML2 = yes
SED_EXTENDED_ARG: -E
Minor 9, Patch 2 for 2.9.2
Located parser file -I/Users/richiethomas/anaconda/include/libxml2/parser.h
Checking for 1.8: -I/Users/richiethomas/anaconda/include/libxml2
Using libxml2.*
checking for gzopen in -lz... yes
checking for xmlParseFile in -lxml2... yes
You are trying to use a version 2.* edition of libxml
but an incompatible library. The header files and library seem to be
mismatched. If you have specified LIBXML_INCDIR, make certain to also
specify an appropriate LIBXML_LIBDIR if the libxml2 library is not in the default
directories.
ERROR: configuration failed for package ‘XML’
* removing ‘/usr/local/lib/R/3.2/site-library/XML’
The downloaded source packages are in
‘/private/var/folders/jy/0cwn40p951xc7f1480z3sxzm0000gn/T/RtmpvWMrkH/downloaded_packages’
Warning message:
In install.packages("XML") :
installation of package ‘XML’ had non-zero exit status
I Googled around and found this link, which suggested running the 'which xmllint' command to find any XML installations that might conflict with the one R depends on. The output when I ran it was:
/Users/richiethomas/anaconda/bin/xmllint
Correct me if I'm wrong, but it appears that my Python installation has an XML dependency which conflicts with the one R wants to download. Is this right? And if so, how can I fix it so that both Python and R are installed on my machine?
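For reference, one way to check which xml2-config is first on my PATH (presumably the one the XML package's configure script picks up) would be something like:
which xml2-config      # shows which copy is found first
xml2-config --version  # shows the libxml2 version it reports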
EDIT: I am using OSX. I ran "brew install libxml2" but Homebrew said it was already installed.
EDIT #2: I tried un-installing and re-installing R via Homebrew, and I am still getting the same error.
EDIT #3: I ran "brew info libxml2" and saw that there is an "--with-python" flag which enables a build with Python support. I also noticed the following:
Generally there are no consequences of this for you. If you build your
own software and it requires this formula, you'll need to add to your
build variables:
LDFLAGS: -L/usr/local/opt/libxml2/lib
CPPFLAGS: -I/usr/local/opt/libxml2/include
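For reference, setting those two variables in the shell is just (a minimal sketch; the paths assume Homebrew's default /usr/local prefix, as shown above):
export LDFLAGS="-L/usr/local/opt/libxml2/lib"
export CPPFLAGS="-I/usr/local/opt/libxml2/include"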
I ran "brew uninstall libxml2" and then "brew install libxml2 --with-python", and then set the above 2 environment variables using the "export" command. Then I re-ran R and again tried to install the XML package. I saw dozens of warnings of the type "passing argument to parameter here" (see below), followed by a non-zero exit code:
* installing *source* package ‘XML’ ...
** package ‘XML’ successfully unpacked and MD5 sums checked
checking for gcc... gcc
checking for C compiler default output file name... a.out
checking whether the C compiler works... yes
checking whether we are cross compiling... no
checking for suffix of executables...
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking how to run the C preprocessor... gcc -E
checking for sed... /usr/local/Library/ENV/4.3/sed
checking for pkg-config... /usr/local/bin/pkg-config
checking for xml2-config... /Users/richiethomas/anaconda/bin/xml2-config
USE_XML2 = yes
SED_EXTENDED_ARG: -E
Minor 9, Patch 2 for 2.9.2
Located parser file -I/Users/richiethomas/anaconda/include/libxml2/parser.h
Checking for 1.8: -I/Users/richiethomas/anaconda/include/libxml2
Using libxml2.*
checking for gzopen in -lz... yes
checking for xmlParseFile in -lxml2... yes
checking for xmlHashSize in -lxml2... yes
Using built-in xmlHashSize
Checking DTD parsing (presence of externalSubset)...
checking for xmlHashSize in -lxml2... yes
Found xmlHashSize
checking for xmlOutputBufferCreateBuffer in -lxml2... yes
have xmlOutputBufferCreateBuffer()
checking for xmlDocDumpFormatMemoryEnc in -lxml2... yes
checking libxml/xmlversion.h usability... yes
checking libxml/xmlversion.h presence... yes
checking for libxml/xmlversion.h... yes
Expat: FALSE
Checking for return type of xmlHashScan element routine.
No return value for xmlHashScan
xmlNs has a context field
Checking for cetype_t enumeration
Using recent version of R with cetype_t enumeration type for encoding
checking for xmlsec1-config... no
nodegc default
xml-debug default
Version has XML_WITH_ZLIB
Version has xmlHasFeature()
****************************************
Configuration information:
Libxml settings
libxml include directory: -I/Users/richiethomas/anaconda/include/libxml2
libxml library directory: -L/Users/richiethomas/anaconda/lib -lxml2 -lz -liconv -lm -lz -lxml2
libxml 2: -DLIBXML2=1
Compilation flags: -DLIBXML -I/Users/richiethomas/anaconda/include/libxml2 -DUSE_EXTERNAL_SUBSET=1 -DROOT_HAS_DTD_NODE=1 -DDUMP_WITH_ENCODING=1 -DUSE_XML_VERSION_H=1 -DXML_ELEMENT_ETYPE=1 -DXML_ATTRIBUTE_ATYPE=1 -DNO_XML_HASH_SCANNER_RETURN=1 -DLIBXML_NAMESPACE_HAS_CONTEXT=1 -DHAVE_R_CETYPE_T=1 -DHAVE_XML_WITH_ZLIB=1 -DHAVE_XML_HAS_FEATURE=1 -DUSE_R=1 -D_R_=1 -DHAVE_VALIDITY=1 -DXML_REF_COUNT_NODES=1
Link flags: -L/Users/richiethomas/anaconda/lib -lxml2 -lz -liconv -lm -lz -lxml2
****************************************
configure: creating ./config.status
config.status: creating src/Makevars
config.status: creating R/supports.R
config.status: creating inst/scripts/RSXML.csh
config.status: creating inst/scripts/RSXML.bsh
** libs
clang -I/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include -DNDEBUG -DLIBXML -I/Users/richiethomas/anaconda/include/libxml2 -DUSE_EXTERNAL_SUBSET=1 -DROOT_HAS_DTD_NODE=1 -DDUMP_WITH_ENCODING=1 -DUSE_XML_VERSION_H=1 -DXML_ELEMENT_ETYPE=1 -DXML_ATTRIBUTE_ATYPE=1 -DNO_XML_HASH_SCANNER_RETURN=1 -DLIBXML_NAMESPACE_HAS_CONTEXT=1 -DHAVE_R_CETYPE_T=1 -DHAVE_XML_WITH_ZLIB=1 -DHAVE_XML_HAS_FEATURE=1 -DUSE_R=1 -D_R_=1 -DHAVE_VALIDITY=1 -DXML_REF_COUNT_NODES=1 -I. -DLIBXML2=1 -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include -I/usr/local/opt/openssl/include -I/usr/local/include -fPIC -g -O2 -c DocParse.c -o DocParse.o
DocParse.c:375:60: warning: passing 'const char *' to parameter of type 'const xmlChar *' (aka 'const unsigned char *') converts between
pointers to integer types with different sign [-Wpointer-sign]
SET_STRING_ELT(VECTOR_ELT(rdoc, FILE_ELEMENT_NAME), 0, ENC_COPY_TO_USER_STRING(doc->name ? XMLCHAR_TO_CHAR(doc->name) : fileName));
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
./Utils.h:235:74: note: expanded from macro 'ENC_COPY_TO_USER_STRING'
#define ENC_COPY_TO_USER_STRING(x) CreateCharSexpWithEncoding(encoding, CHAR_TO_XMLCHAR (x))
^~~~~~~~~~~~~~~~~~~
./Utils.h:12:31: note: expanded from macro 'CHAR_TO_XMLCHAR'
#define CHAR_TO_XMLCHAR(val) ((xmlChar *) val)
^~~~~~~~~~~~~~~~~
./Utils.h:220:73: note: passing argument to parameter 'str' here
SEXP CreateCharSexpWithEncoding(const xmlChar *encoding, const xmlChar *str);
.....
/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include/Rinternals.h:822:28: note: passing argument to parameter here
SEXP Rf_mkChar(const char *);
^
schema.c:122:25: warning: passing 'const char *' to parameter of type 'const xmlChar *' (aka 'const unsigned char *') converts between pointers
to integer types with different sign [-Wpointer-sign]
p = xmlHashLookup(t, CHAR_DEREF(STRING_ELT(name, 0)));
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
./RSCommon.h:140:27: note: expanded from macro 'CHAR_DEREF'
#define CHAR_DEREF(x) CHAR((x))
^~~~~~~~~
/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include/Rinternals.h:440:18: note: expanded from macro 'CHAR'
#define CHAR(x) R_CHAR(x)
^~~~~~~~~
/Users/richiethomas/anaconda/include/libxml2/libxml/hash.h:171:22: note: passing argument to parameter 'name' here
const xmlChar *name);
^
2 warnings generated.
clang -I/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include -DNDEBUG -DLIBXML -I/Users/richiethomas/anaconda/include/libxml2 -DUSE_EXTERNAL_SUBSET=1 -DROOT_HAS_DTD_NODE=1 -DDUMP_WITH_ENCODING=1 -DUSE_XML_VERSION_H=1 -DXML_ELEMENT_ETYPE=1 -DXML_ATTRIBUTE_ATYPE=1 -DNO_XML_HASH_SCANNER_RETURN=1 -DLIBXML_NAMESPACE_HAS_CONTEXT=1 -DHAVE_R_CETYPE_T=1 -DHAVE_XML_WITH_ZLIB=1 -DHAVE_XML_HAS_FEATURE=1 -DUSE_R=1 -D_R_=1 -DHAVE_VALIDITY=1 -DXML_REF_COUNT_NODES=1 -I. -DLIBXML2=1 -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include -I/usr/local/opt/openssl/include -I/usr/local/include -fPIC -g -O2 -c xmlsecurity.c -o xmlsecurity.o
clang -I/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include -DNDEBUG -DLIBXML -I/Users/richiethomas/anaconda/include/libxml2 -DUSE_EXTERNAL_SUBSET=1 -DROOT_HAS_DTD_NODE=1 -DDUMP_WITH_ENCODING=1 -DUSE_XML_VERSION_H=1 -DXML_ELEMENT_ETYPE=1 -DXML_ATTRIBUTE_ATYPE=1 -DNO_XML_HASH_SCANNER_RETURN=1 -DLIBXML_NAMESPACE_HAS_CONTEXT=1 -DHAVE_R_CETYPE_T=1 -DHAVE_XML_WITH_ZLIB=1 -DHAVE_XML_HAS_FEATURE=1 -DUSE_R=1 -D_R_=1 -DHAVE_VALIDITY=1 -DXML_REF_COUNT_NODES=1 -I. -DLIBXML2=1 -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include -I/usr/local/opt/openssl/include -I/usr/local/include -fPIC -g -O2 -c xpath.c -o xpath.o
xpath.c:36:41: warning: passing 'const xmlChar *' (aka 'const unsigned char *') to parameter of type 'const char *' converts between pointers
to integer types with different sign [-Wpointer-sign]
SET_NAMES(ref, ScalarString(mkCharCE(el->name, encoding)));
^~~~~~~~
/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include/Rdefines.h:135:54: note: expanded from macro 'SET_NAMES'
#define SET_NAMES(x, n) setAttrib(x, R_NamesSymbol, n)
^
/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/include/Rinternals.h:889:30: note: passing argument to parameter here
SEXP Rf_mkCharCE(const char *, cetype_t);
^
1 warning generated.
clang -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/usr/local/Cellar/r/3.2.4_1/R.framework/Resources/lib -L/usr/local/opt/gettext/lib -L/usr/local/opt/readline/lib -L/usr/local/opt/openssl/lib -L/usr/local/lib -o XML.so DocParse.o EventParse.o ExpatParse.o HTMLParse.o NodeGC.o RSDTD.o RUtils.o Rcatalog.o Utils.o XMLEventParse.o XMLHashTree.o XMLTree.o fixNS.o libxmlFeatures.o schema.o xmlsecurity.o xpath.o -L/Users/richiethomas/anaconda/lib -lxml2 -lz -liconv -lm -lz -lxml2 -F/usr/local/Cellar/r/3.2.4_1/R.framework/.. -framework R -lintl -Wl,-framework -Wl,CoreFoundation
installing to /usr/local/lib/R/3.2/site-library/XML/libs
** R
** inst
** preparing package for lazy loading
Creating a generic function for ‘source’ from package ‘base’ in package ‘XML’
in method for ‘xmlAttrsToDataFrame’ with signature ‘"AsIs"’: no definition for class “AsIs”
in method for ‘readKeyValueDB’ with signature ‘"AsIs"’: no definition for class “AsIs”
in method for ‘readSolrDoc’ with signature ‘"AsIs"’: no definition for class “AsIs”
** help
*** installing help indices
** building package indices
** testing if installed package can be loaded
Error : .onLoad failed in loadNamespace() for 'XML', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/usr/local/lib/R/3.2/site-library/XML/libs/XML.so':
dlopen(/usr/local/lib/R/3.2/site-library/XML/libs/XML.so, 6): Library not loaded: libxml2.2.dylib
Referenced from: /usr/local/lib/R/3.2/site-library/XML/libs/XML.so
Reason: image not found
Error: loading failed
Execution halted
ERROR: loading failed
* removing ‘/usr/local/lib/R/3.2/site-library/XML’
The downloaded source packages are in
‘/private/var/folders/jy/0cwn40p951xc7f1480z3sxzm0000gn/T/RtmpEi0XEv/downloaded_packages’
Warning message:
In install.packages("XML") :
installation of package ‘XML’ had non-zero exit status
I solved this problem by using anaconda to install XML:
conda install -c r r-xml=3.98_1.5
Solution for Mac OS Catalina (i.e. in zsh).
I'm a novice, but this worked for me.
Make sure you have libxml2 installed via Homebrew. (If you have it, brew list | grep libxml2 should confirm it; otherwise install it with brew install libxml2. Don't have Homebrew? Install it.)
Either create a zsh run-commands file with touch ~/.zshrc, or find the existing one in your home directory with cd ~. (Use ls -a in Terminal to see if it's already in ~.)
Then make sure export PATH="/usr/local/opt/libxml2/bin:$PATH" is in there (i.e. in ~/.zshrc). It probably isn't yet! Add it using sudo nano ~/.zshrc, then save (Ctrl+X, Y, Enter). This makes sure the path to Homebrew's libxml2 is the first one found when looking for libxml2. (FYI, I also added it to ~/.zprofile; I don't know which one is correct.)
OK, now restart your terminal. Then start R in the Terminal by running R. Once R has started, run install.packages("XML"), pick a mirror to download from, and you should be good to go.
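Putting the shell side of these steps together (a sketch; the path assumes Homebrew's default /usr/local prefix):
brew list | grep libxml2 || brew install libxml2        # confirm or install libxml2
echo 'export PATH="/usr/local/opt/libxml2/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc        # or restart the terminal
R                      # then run: install.packages("XML")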
Adding to Travis's answer: using conda install -c r r-xml=3.98_1.5 or conda install -c conda-forge r-xml works.
But make sure to run conda update r-essentials and reopen/restart the R terminal/notebook (whichever applies in your case).
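In shell terms, something like this (a sketch, assuming a conda environment that already has R installed):
conda install -c conda-forge r-xml    # or: conda install -c r r-xml=3.98_1.5
conda update r-essentials
# then reopen/restart the R terminal / notebook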
You explained what is wrong and that helps.
The XML package documentation is here: https://cran.r-project.org/web/packages/XML/index.html
Pay attention to where it says You are trying to use a version 2.* edition of libxml
but an incompatible library.
What system are you using? I use OSX, and when I had a similar issue I ran
brew install libxml2
and then it worked.
I also see you use R 3.0+ and that should be very compatible.
Related
I have a pure Python script that I would like to distribute to systems with unknown Python configuration. Therefore, I would like to compile the Python code to a stand-alone executable.
I run cython --embed ./foo.py without problems giving foo.c. Then, I run
gcc $(python3-config --cflags) $(python3-config --ldflags) ./foo.c
where python3-config --cflags gives
-I/usr/include/python3.5m -I/usr/include/python3.5m -Wno-unused-result -Wsign-compare -g -fdebug-prefix-map=/build/python3.5-MLq5fN/python3.5-3.5.3=. -fstack-protector-strong -Wformat -Werror=format-security -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes
and python3-config --ldflags gives
-L/usr/lib/python3.5/config-3.5m-x86_64-linux-gnu -L/usr/lib -lpython3.5m -lpthread -ldl -lutil -lm -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions
This way I obtain a dynamically linked executable that runs without a problem. ldd a.out yields
linux-vdso.so.1 (0x00007ffcd57fd000)
libpython3.5m.so.1.0 => /usr/lib/x86_64-linux-gnu/libpython3.5m.so.1.0 (0x00007fda76823000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fda76603000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fda763fb000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fda761f3000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fda75eeb000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fda75b4b000)
libexpat.so.1 => /lib/x86_64-linux-gnu/libexpat.so.1 (0x00007fda7591b000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007fda756fb000)
/lib64/ld-linux-x86-64.so.2 (0x00007fda77103000)
Now, I try to add the option -static to gcc, but this results in an error:
/usr/bin/ld: dynamic STT_GNU_IFUNC symbol `strcmp' with pointer equality in `/usr/lib/gcc/x86_64-linux-gnu/6/../../../x86_64-linux-gnu/libc.a(strcmp.o)' can not be used when making an executable; recompile with -fPIE and relink with -pie
collect2: error: ld returned 1 exit status
I checked that all shared libraries given by ldd are also installed as static libraries.
So, is this some incompatibility with the options given by python3-config?
The problems you are seeing clearly come from the linker (gcc starts the linker under the hood; to see it, just run gcc with -v for verbose mode). So let's start with a short reminder of how the linkage process works:
The linker keeps the names of all symbols it still needs to resolve. In the beginning it is only the symbol main. What happens when the linker inspects a library?
If it is a static library, the linker looks at every object file in that library, and if an object file defines some of the symbols being looked for, the whole object file is included (which means some symbols become resolved, but further new unresolved symbols may be added). The linker might need to make multiple passes over a static library.
If it is a shared library, the linker views it as a single huge object file (after all, the library has to be loaded at run time anyway, and there is no need for multiple passes to prune unused symbols): if at least one needed symbol is defined there, the whole library is "linked" (the real linkage happens at run time; this is a kind of dry run); if not, the whole library is discarded and never looked at again.
For example if you link with:
gcc -L/path -lpython3.x <other libs> foo.o
you will get a problem, no matter whether python3.x is a shared or a static lib: when the linker sees it, it looks only for the symbol main, but this symbol is not defined in the python-lib, so the python-lib is discarded and never looked at again. Only when the linker sees the object file foo.o does it realize that the Python symbols are needed, but by then it is already too late.
There is a simple rule to handle this problem: put the object files first! That means:
gcc -L/path foo.o -lpython3.x <other libs>
Now the linker knows what it needs from the python-lib when it first sees it.
There are other ways to achieve a similar result.
A) Let the linker reiterate over a group of archives for as long as at least one new symbol definition is added per sweep:
gcc -L/path -Wl,--start-group -lpython3.x <other libs> foo.o -Wl,--end-group
The linker options -Wl,--start-group and -Wl,--end-group tell the linker to iterate more than once over this group of archives, so the linker gets a second chance (or more) to include symbols. This option can lead to longer link times.
B) Switching on the option --no-as-needed causes a shared library (and only a shared library) to be linked in, no matter whether the symbols defined in that library are needed or not.
gcc -L/path -Wl,--no-as-needed -lpython3.x -Wl,--as-needed <other libs> foo.o
Actually, the default ld behavior is --no-as-needed, but the gcc frontend calls ld with the option --as-needed, so we restore the default by adding --no-as-needed just before the python-library and then switch it off again right after it.
Now to your problem of static linking. I don't think it is advisable to use static versions of all standard libraries (above all glibc); what you should probably do is link only the python-library statically.
The rules of linkage are simple: by default the linker first tries to open the shared version of a library and then the static version. I.e. for the library libmylib and the search paths A and B,
-L/A -L/B -lmylib
it tries to open libraries in the following order:
A/libmylib.so
A/libmylib.a
B/libmylib.so
B/libmylib.a
Thus if folder A has only a static version, that static version is used (no matter whether there is a shared version in folder B).
Because it is quite opaque which library is really used (it depends on the setup of your system), one would usually switch on the linker's logging via -Wl,-verbose to troubleshoot.
By using the option -Bstatic one can enforce the usage of the static version of a library:
gcc foo.o -L/path -Wl,-Bstatic -lpython3.x -Wl,-Bdynamic <other libs> -Wl,-verbose -o foo
Notable things:
foo.o is linked before the libraries.
The static mode is switched off directly after the python-library, so the other libraries are linked dynamically.
And now:
gcc <cflags> L/paths foo.c -Wl,-Bstatic -lpython3.X -Wl,-Bdynamic <other libs> -o foo -Wl,-verbose
...
attempt to open path/libpython3.6m.a succeeded
...
ldd foo shows no dependency on python-lib
./foo
It works!
And yes, if you link against a static glibc (which I don't recommend), you will need to delete -Xlinker -export-dynamic from the command line.
An executable compiled without -Xlinker -export-dynamic will not be able to load some C extensions, which depend on this property of the executable into which they are loaded via dlopen.
Possible issues due to implicit -pie option.
Recent versions of gcc build with the pie option per default. Often, older Python versions were built with an older gcc version, so python-config --cflags misses the now-necessary -no-pie, as it was not needed back then. In this case the linker will produce an error message like:
relocation R_X86_64_32S against symbol `XXXXX' can not be used when
making a PIE object; recompile with -fPIC
In this case, the -no-pie option should be added to <cflags>.
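For example, the full command from above with the extra flag might look like this (a sketch; <cflags>, paths and <other libs> as before):
gcc <cflags> -no-pie -L/paths foo.c -Wl,-Bstatic -lpython3.X -Wl,-Bdynamic <other libs> -o foo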
Is it possible (and how) to use MinGW-w64 for building C extensions for Python or for embedding Python on Windows?
Let's take as example the following cython-extension foo.pyx:
print("foo loaded")
from which the C code can be generated either via cython -3 foo.pyx, or via cython -3 --embed foo.pyx if the interpreter should be embedded.
While the mingw-w64 compiler is not really supported (the only supported Windows compiler is MSVC), it can be used to create C extensions or to embed Python. There is however no guarantee that this won't break in future versions.
distutils does not support mingw-w64, so there is no gain in setting up a setup.py-file - the steps must be performed manually.
First we need some information usually provided by distutils:
Headers: We need the path to the Python includes. For a way to find them see this SO-post (a quick way to query them is also sketched right after this list).
DLL: mingw-w64's linker works differently than MSVC's: the python-dll and not the python-lib is needed. So we need the path to pythonXY.dll, which is usually next to python.exe.
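As a sketch, one way to query the include directory from the Python you are targeting is the standard sysconfig module:
python -c "import sysconfig; print(sysconfig.get_paths()['include'])"
# prints the directory containing Python.h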
Once the C-code is created/generated, the extension can be build via
x86_64-w64-mingw32-gcc -shared foo.c -DMS_WIN64 -O2 <other_options> -I <path_to_python_include> -L <path_to_python_dll> -lpython37 -o foo.pyd
The important details are:
it is probably OK to use only -O2 for optimization and leave <other_options> empty.
It is important to define the MS_WIN64 macro (e.g. via -DMS_WIN64). It must be set in order to build for x64 on Windows, but it works out of the box only for MSVC (defining _WIN64 could have slightly different outcomes):
#ifdef _WIN64
#define MS_WIN64
#endif
If this is not done, then at least for files generated by Cython, the compiler will produce the following error message:
error: enumerator value for ‘__pyx_check_sizeof_voidp’ is not an integer constant
201 | enum { __pyx_check_sizeof_voidp = 1 / (int)(SIZEOF_VOID_P == sizeof(void*)) };
A pyd is just a dll in disguise, thus we need the -shared option, which means a dynamic library (i.e. a shared object in the Linux world) will be created.
It is important that the python library (pythonXY) should be the dll itself and not the lib (see this SO-post). Thus we use the path to pythonXY.dll (in my case python37) and not pythonXY.lib, as would be the case for MSVC.
One should probably add the proper suffix to the resulting pyd file; I use the old convention here for simplicity.
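A quick sanity check that the resulting extension loads (a sketch; assumes foo.pyd is in the current directory and the matching Python 3.7 is on the PATH):
python -c "import foo"
# should print: foo loaded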
Embedded Python:
In this case an executable should be built (e.g. the C file is generated by Cython with the --embed option: cython -3 --embed foo.pyx), and thus the command line looks as follows:
x86_64-w64-mingw32-gcc foo.c -DMS_WIN64 -O2 <other_options> -I <path_to_python_include> -L <path_to_python_dll> -lpython37 -o foo.exe -municode
There are two important differences:
-shared should no longer be used, as the result is no longer a dynamic library (that is what *.pyd-file is after all) but an executable.
-municode is needed, because for Windows, Cython defines int wmain(int argc, wchar_t **argv) instead of int main(int argc, char** argv). Without this option, an error message like
/build/mingw-w64-_1w3Xm/mingw-w64-4.0.4/mingw-w64-crt/crt/crt0_c.c:18: undefined reference to 'WinMain'
collect2: error: ld returned 1 exit status
would appear (see this SO-post for more information).
Note: for the resulting executable to run, a whole Python distribution (and not only the dll) is needed (see also this SO-post); otherwise the resulting executable will abort with an error (either the Python dll, the Python installation, or the site-packages could not be found, depending on the configuration of the machine on which the exe has to run).
mingw-w64 can also be used on Linux for cross-compilation for Windows, see this SO-post.
I tried to install crf++ on my MacBook. I downloaded CRF++-0.58 from https://taku910.github.io/crfpp/#download. Then I followed the instructions on the official website of crf++.
First I entered the folder named CRF++-0.58. Then I typed the following commands in the terminal:
make
sudo make install
cd python
These commands ran well. Then I typed
python setup.py install
The output was as following:
running build
running build_py
running build_ext
building '_CRFPP' extension
gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/anaconda3/include -arch x86_64 -I/anaconda3/include -arch x86_64 -I/anaconda3/include/python3.7m -c CRFPP_wrap.cxx -o build/temp.macosx-10.7-x86_64-3.7/CRFPP_wrap.o
warning: include path for stdlibc++ headers not found; pass '-stdlib=libc++' on
the command line to use the libc++ standard library instead
[-Wstdlibcxx-not-found]
CRFPP_wrap.cxx:2375:23: warning: explicitly assigning value of variable of type
'int' to itself [-Wself-assign]
res = SWIG_AddCast(res);
~~~ ^ ~~~
CRFPP_wrap.cxx:2378:23: warning: explicitly assigning value of variable of type
'int' to itself [-Wself-assign]
res = SWIG_AddCast(res);
~~~ ^ ~~~
CRFPP_wrap.cxx:2900:9: warning: variable 'res' is used uninitialized whenever
'if' condition is true [-Wsometimes-uninitialized]
if (PyType_Ready(tp) < 0)
^~~~~~~~~~~~~~~~~~~~
CRFPP_wrap.cxx:2924:10: note: uninitialized use occurs here
return res;
^~~
CRFPP_wrap.cxx:2900:5: note: remove the 'if' if its condition is always false
if (PyType_Ready(tp) < 0)
^~~~~~~~~~~~~~~~~~~~~~~~~
CRFPP_wrap.cxx:2881:10: note: initialize the variable 'res' to silence this
warning
int res;
^
= 0
CRFPP_wrap.cxx:2981:10: fatal error: 'stdexcept' file not found
#include <stdexcept>
^~~~~~~~~~~
4 warnings and 1 error generated.
error: command 'gcc' failed with exit status 1
Then I searched the error "error: command 'gcc' failed with exit status 1" online and found people who had similar problems. I tried some of their solutions, but none of them worked.
I tried to install python-dev to solve this problem but failed.
(base) localhost:python dxm$ brew install python3-dev
Error: No available formula with the name "python3-dev"
==> Searching for a previously deleted formula (in the last month)...
Warning: homebrew/core is shallow clone. To get complete history run:
git -C "$(brew --repo homebrew/core)" fetch --unshallow
Error: No previously deleted formula found.
==> Searching for similarly named formulae...
Error: No similarly named formulae found.
==> Searching taps...
==> Searching taps on GitHub...
Error: No formulae found in taps.
So how could I solve this problem?
I found another way to install crf++ which seems totally different from what I tried: http://macappstore.org/crf/
After installing it this way, I can use crf++ like this:
crf_train = "crf_learn -f 3 template.txt dg_train.txt dg_model"
os.system(crf_train)
crf_test = "crf_test -m dg_model dg_test.txt -o dg_result.txt"
os.system(crf_test)
I'm trying to compile a Python wrapper to a small C++ library I've written. I've written the following setup.py script to try to use setuptools to compile the wrapper:
from setuptools import setup, Extension
import numpy as np
import os
atmcmodule = Extension(
'atmc',
include_dirs=[np.get_include(), '/usr/local/include'],
libraries=['mcopt', 'c++'], # my C++ library is at ./build/libmcopt.a
library_dirs=[os.path.abspath('./build')],
sources=['atmcmodule.cpp'],
language='c++',
extra_compile_args=['-std=c++11', '-v'],
)
setup(name='tracking',
version='0.1',
description='Particle tracking and MC optimizer module',
ext_modules=[atmcmodule],
)
However, when I run python setup.py build on OS X El Capitan, clang complains about not finding some C++ standard library headers:
In file included from atmcmodule.cpp:7:
In file included from ./mcopt.h:11:
In file included from ./arma_include.h:4:
/usr/local/include/armadillo:54:12: fatal error: 'initializer_list' file not found
#include <initializer_list>
^
1 error generated.
error: command 'gcc' failed with exit status 1
Passing the -v flag to the compiler shows that it is searching the following include paths:
#include <...> search starts here:
/Users/[username]/miniconda3/include
/Users/[username]/miniconda3/lib/python3.4/site-packages/numpy/core/include
/usr/local/include
/Users/[username]/miniconda3/include/python3.4m
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.11.sdk/usr/include/c++/4.2.1
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.11.sdk/usr/include/c++/4.2.1/backward
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../lib/clang/7.0.0/include
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.11.sdk/usr/include
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.11.sdk/System/Library/Frameworks (framework directory)
End of search list.
This apparently doesn't include the path to the C++ standard library headers. If I compile a small test C++ source with the -v option, I can see that clang++ normally also searches the path /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/../include/c++/v1, and if I include this path in the include_dirs option for Extension in my setup.py script, then the extension module compiles correctly and works. However, hard-coding this path into the script doesn't seem like a good solution since this module also needs to work on Linux.
So, my question is how do I properly make setuptools include the required headers?
Update (11/22/2015)
As setuptools tries to compile the extension, it prints the first command it's running:
gcc -fno-strict-aliasing -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/[username]/miniconda3/include -arch x86_64 -I/Users/[username]/miniconda3/lib/python3.4/site-packages/numpy/core/include -I/Users/[username]/Documents/Code/ar40-aug15/monte_carlo/mcopt -I/usr/local/include -I/Users/[username]/miniconda3/include/python3.4m -c /Users/[username]/Documents/Code/ar40-aug15/monte_carlo/atmc/atmcmodule.cpp -o build/temp.macosx-10.5-x86_64-3.4/Users/[username]/Documents/Code/ar40-aug15/monte_carlo/atmc/atmcmodule.o -std=c++11 -fopenmp -v
If I paste this command into a terminal and run it myself, the extension compiles successfully. So I suspect either setuptools is modifying some environment variables I'm not aware of, or it's lying a little about the commands it's actually running.
Setuptools tries to compile C/C++ extension modules with the same flags used to compile the Python interpreter. After checking the flags used to compile my Python install (from Anaconda), I found it was compiling for a minimum Mac OS X version of 10.5. This seems to make it use the GCC libstdc++ instead of clang's libc++ (which supports C++11).
This can be fixed by either setting the environment variable MACOSX_DEPLOYMENT_TARGET to 10.9 (or later), or adding '-mmacosx-version-min=10.9' to extra_compile_args.
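For example, the environment-variable route can be as simple as (a minimal sketch; 10.9 is just an example minimum version):
export MACOSX_DEPLOYMENT_TARGET=10.9
python setup.py build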
I am trying to install mod_wsgi for an updated version of Apache.
I currently have Apache 2.4 installed via /opt/rh/httpd24/root/etc/httpd.
I want to compile it against Apache 2.4, not 2.2. Any help on the syntax here? I can't find any reference to that specific argument.
[root@bmograba mod_wsgi-4.4.13]# ./configure -help
`configure' configures this package to adapt to many kinds of systems.
Usage: ./configure [OPTION]... [VAR=VALUE]...
To assign environment variables (e.g., CC, CFLAGS...), specify them as
VAR=VALUE. See below for descriptions of some of the useful variables.
Defaults for the options are specified in brackets.
Configuration:
-h, --help display this help and exit
--help=short display options specific to this package
--help=recursive display the short help of all the included packages
-V, --version display version information and exit
-q, --quiet, --silent do not print `checking ...' messages
--cache-file=FILE cache test results in FILE [disabled]
-C, --config-cache alias for `--cache-file=config.cache'
-n, --no-create do not create output files
--srcdir=DIR find the sources in DIR [configure dir or `..']
Installation directories:
--prefix=PREFIX install architecture-independent files in PREFIX
[/usr/local]
--exec-prefix=EPREFIX install architecture-dependent files in EPREFIX
[PREFIX]
By default, `make install' will install all the files in
`/usr/local/bin', `/usr/local/lib' etc. You can specify
an installation prefix other than `/usr/local' using `--prefix',
for instance `--prefix=$HOME'.
For better control, use the options below.
Fine tuning of the installation directories:
--bindir=DIR user executables [EPREFIX/bin]
--sbindir=DIR system admin executables [EPREFIX/sbin]
--libexecdir=DIR program executables [EPREFIX/libexec]
--sysconfdir=DIR read-only single-machine data [PREFIX/etc]
--sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com]
--localstatedir=DIR modifiable single-machine data [PREFIX/var]
--libdir=DIR object code libraries [EPREFIX/lib]
--includedir=DIR C header files [PREFIX/include]
--oldincludedir=DIR C header files for non-gcc [/usr/include]
--datarootdir=DIR read-only arch.-independent data root [PREFIX/share]
--datadir=DIR read-only architecture-independent data [DATAROOTDIR]
--infodir=DIR info documentation [DATAROOTDIR/info]
--localedir=DIR locale-dependent data [DATAROOTDIR/locale]
--mandir=DIR man documentation [DATAROOTDIR/man]
--docdir=DIR documentation root [DATAROOTDIR/doc/PACKAGE]
--htmldir=DIR html documentation [DOCDIR]
--dvidir=DIR dvi documentation [DOCDIR]
--pdfdir=DIR pdf documentation [DOCDIR]
--psdir=DIR ps documentation [DOCDIR]
Optional Features:
--disable-option-checking ignore unrecognized --enable/--with options
--disable-FEATURE do not include FEATURE (same as --enable-FEATURE=no)
--enable-FEATURE[=ARG] include FEATURE [ARG=yes]
--enable-framework enable mod_wsgi framework link
--disable-embedded disable mod_wsgi embedded mode
Optional Packages:
--with-PACKAGE[=ARG] use PACKAGE [ARG=yes]
--without-PACKAGE do not use PACKAGE (same as --with-PACKAGE=no)
--with-apxs=NAME name of the apxs executable [[apxs]]
--with-python=NAME name of the python executable [[python]]
Some influential environment variables:
CC C compiler command
CFLAGS C compiler flags
LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries in a
nonstandard directory <lib dir>
LIBS libraries to pass to the linker, e.g. -l<library>
CPPFLAGS (Objective) C/C++ preprocessor flags, e.g. -I<include dir> if
you have headers in a nonstandard directory <include dir>
Use these variables to override the choices made by `configure' or to help
it to find libraries and programs with nonstandard names/locations.
Report bugs to the package provider.
Compiled plain with no arguments:
[root@bmograba mod_wsgi-4.4.13]# ./configure
checking for apxs2... no
checking for apxs... /usr/sbin/apxs
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for prctl... yes
checking Apache version... 2.2.15
checking for python... /usr/bin/python
configure: creating ./config.status
config.status: creating Makefile
Just to see what output the --with-PACKAGE option would produce:
[root@bmograba mod_wsgi-4.4.13]# ./configure --with-PACKAGE
configure: WARNING: you should use --build, --host, --target
configure: WARNING: unrecognized options: --with-PACKAGE
checking for apxs2... no
checking for apxs... /usr/sbin/apxs
checking for httpd-gcc... no
checking for gcc... gcc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether gcc accepts -g... yes
checking for gcc option to accept ISO C89... none needed
checking for prctl... yes
checking Apache version... 2.2.15
checking for python... /usr/bin/python
configure: creating ./config.status
config.status: creating Makefile
configure: WARNING: unrecognized options: --with-PACKAGE
When they say --with-PACKAGE, PACKAGE is just a placeholder for a package name. You aren't supposed to literally put --with-PACKAGE.
Find where the 'apxs' script is inside of the alternate Apache installation. Then run:
./configure --with-apxs=/opt/rh/httpd24/bin/apxs
where that argument is the location of that 'apxs' script.
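The rest of the build then follows the usual configure/make pattern; roughly (a sketch, reusing the apxs path above plus the --with-python option listed in the help output):
./configure --with-apxs=/opt/rh/httpd24/bin/apxs --with-python=/usr/bin/python
make
sudo make install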
https://code.google.com/p/modwsgi/wiki/QuickInstallationGuide#Configuring_The_Source_Code