Suppress UserWarning from torchmetrics - python

When I train a neural network using PyTorch, I get the following warning caused by the torchmetrics library:
/Users/dev/miniconda/envs/pytorch/lib/python3.10/site-packages/torchmetrics/utilities/prints.py:36:
UserWarning: Torchmetrics v0.9 introduced a new argument class
property called full_state_update that has not been set for this
class (SMAPE). The property determines if update by default needs
access to the full metric state. If this is not the case, significant
speedups can be achieved and we recommend setting this to False. We
provide an checking function from torchmetrics.utilities import check_forward_no_full_state that can be used to check if the
full_state_update=True (old and potential slower behaviour, default
for now) or if full_state_update=False can be used safely.
I tried to suppress this warning by using the warnings package in my script:
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
However, the warning is still shown which is probably due to a function in prints.py of torchmetrics:
def _warn(*args: Any, **kwargs: Any) -> None:
    warnings.warn(*args, **kwargs)
Is it possible to get rid of this warning from my script without changing the library code?

Use the -W argument to control how Python deals with warnings. Consider the following simple example; let dowarn.py contain:
import warnings
warnings.warn("I am UserWarning", UserWarning)
warnings.warn("I am FutureWarning", FutureWarning)
then
python dowarn.py
gives
dowarn.py:2: UserWarning: I am UserWarning
warnings.warn("I am UserWarning", UserWarning)
dowarn.py:3: FutureWarning: I am FutureWarning
warnings.warn("I am FutureWarning", FutureWarning)
and
python -W ignore dowarn.py
gives empty output and
python -W ignore::UserWarning dowarn.py
gives
dowarn.py:3: FutureWarning: I am FutureWarning
warnings.warn("I am FutureWarning", FutureWarning)
See the python man page for a discussion of -W values.
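Since the question asks for a fix that lives in the script itself, a message-based filter registered before the metric is constructed also works. This is a sketch: the regex is an assumption based on the warning text quoted above, and the warnings are recorded here only to make the effect observable:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # Ignore only warnings whose text mentions full_state_update,
    # leaving every other UserWarning visible.
    warnings.filterwarnings(
        "ignore",
        message=".*full_state_update.*",  # regex matched against the warning text
        category=UserWarning,
    )
    warnings.warn("full_state_update has not been set", UserWarning)  # filtered
    warnings.warn("some other warning", UserWarning)                  # still shown

print(len(caught))  # 1
```

In a real script you would drop the catch_warnings block and simply call warnings.filterwarnings(...) once, near the top, before constructing the metric.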

Related

scipy overwriting warning filters?

It seems as though some scipy modules are messing with my warning filters. Consider the following code. My understanding is that it should only throw one warning because of the "once" filter I supplied to my custom Warning class. However, the warning after the scipy import gets shown as well.
This is with python 3.7 and scipy 1.6.3.
import warnings
class W(DeprecationWarning): pass
warnings.simplefilter("once", W)
warnings.warn('warning!', W)
warnings.warn('warning!', W)
from scipy import interpolate
warnings.warn('warning!', W)
This only seems to happen when I import certain scipy modules. A generic "import scipy" doesn't do this.
I've narrowed it down to the filters set in scipy.special.sf_error.py and scipy.sparse.__init__.py. I don't see how that code would cause the problem, but it does. When I comment those filters out, my code works as expected.
Am I misunderstanding something? Is there a workaround that doesn't involve overwriting warnings.filterwarnings/warnings.simplefilter?
This is an open Python bug: https://bugs.python.org/issue29672.
Note, in particular, the last part of the comment by Tom Aldcroft:
Even a documentation update would be useful. This could explain not only catch_warnings(), but in general the unexpected feature that if any package anywhere in the stack sets a warning filter, then that globally resets whether a warning has been seen before (via the call to _filters_mutated()).
The code in scipy/special/sf_error.py sets a warning filter, and that causes a global reset of which warnings have been seen before. (If you add another call of warnings.warn('warning!', W) to the end of your sample code, you should see that it does not raise a warning.)
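The reset is easy to reproduce without scipy at all. In this minimal sketch an unrelated filterwarnings call stands in for the scipy import, and the warnings are recorded only so the effect can be observed:

```python
import warnings

class W(DeprecationWarning):
    pass

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("once", W)
    warnings.warn("warning!", W)  # recorded: first occurrence
    warnings.warn("warning!", W)  # suppressed by the "once" filter

    # Any filter mutation (here an unrelated filter, standing in for the
    # scipy import) bumps the internal filters version, which clears the
    # "already seen" registry...
    warnings.filterwarnings("always", category=UserWarning)

    warnings.warn("warning!", W)  # ...so the same warning is recorded again

print(len(caught))  # 2
```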

Deprecation warning for np.ptp

I'm using Python and when using the following code
df['timestamp'] = df.groupby(["id"]).timestamp.transform(np.ptp)
I'm getting the warning FutureWarning: Method .ptp is deprecated and will be removed in a future version. Use numpy.ptp instead.. df is a Pandas DataFrame and timestamp and id are columns. I think np.ptp is causing this warning.
What do I have to change?
It means that the method .ptp is being deprecated in favor of (from what I've read) the function np.ptp(), so you can either silence the warning or replace the method with the function, as NumPy seems to suggest.
If you wish to suppress the warning, you can try:
warnings.filterwarnings('ignore')
or, if it is only FutureWarning you are ignoring:
warnings.simplefilter('ignore', FutureWarning)
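A sketch of the replacement: the DataFrame below is a hypothetical stand-in for the question's df, and the result is written to a new column for readability. Instead of passing np.ptp (which pandas dispatches to the deprecated Series.ptp method), apply the NumPy function to the raw values:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the question's DataFrame
df = pd.DataFrame({"id": [1, 1, 2, 2],
                   "timestamp": [10, 15, 3, 9]})

# Passing np.ptp directly is what triggers the FutureWarning, because
# pandas maps it onto the deprecated Series.ptp method.  Applying the
# NumPy function to the underlying array avoids the method entirely:
df["timestamp_range"] = (df.groupby("id")["timestamp"]
                           .transform(lambda s: np.ptp(s.values)))

print(df["timestamp_range"].tolist())  # [5, 5, 6, 6]
```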

Specifying the message in warnings.simplefilter() filters based on category not message

When we import our_library in Python2, we have coded it to raise a DeprecationWarning once. Here's the representative code.
our_library/__init__.py
def _py2_deprecation_warning():
    py2_warning = ('Python2 support is deprecated and will be removed in '
                   'a future release. Consider switching to Python3.')
    warnings.filterwarnings('once', message=py2_warning)
    warnings.warn(message=py2_warning,
                  category=DeprecationWarning,
                  stacklevel=3,
                  )

def _python_deprecation_warnings():
    if sys.version_info.major == 2:
        _py2_deprecation_warning()

_python_deprecation_warnings()
We deprecated the parameters in a function in our_library. Here's the representative code:
our_library/some_module.py
def some_function(new_param, deprecated_param):
    if deprecated_param:
        param_deprecation_msg = (
            'The parameter "{}" will be removed in a future version of Nilearn. '
            'Please use the parameter "{}" instead.'.format(deprecated_param,
                                                            new_param,
                                                            )
        )
        warnings.warn(category=DeprecationWarning,
                      message=param_deprecation_msg,
                      stacklevel=3)
Then when we import our library, and call that function, like this:
calling_script.py
from our_library.some_module import some_function
some_function(deprecated_param)
We get the Python2 DeprecationWarning but not the Parameter DeprecationWarning.
DeprecationWarning: Python2 support is deprecated and will be removed in a future release. Consider switching to Python3.
_python_deprecation_warnings()
Now, I know I can solve this by using a with warnings.catch_warnings(): block or resetwarnings(). However, I thought that specifying the message explicitly in the Python2 warning would prevent the 'once' filter from being applied to other DeprecationWarnings.
However, that is not the case. Why is that, and how do I make my existing code work without using catch_warnings() or resetwarnings()?
If I change the parameter warning to FutureWarning, I can see it.
Why is the first simplefilter blocking all deprecation messages based on category instead of messages?
UPDATE:
with warnings.catch_warnings(): doesn't seem to work either.
def _py2_deprecation_warning():
    py2_warning = ('Python2 support is deprecated and will be removed in '
                   'a future release. Consider switching to Python3.')
    with warnings.catch_warnings():
        warnings.filterwarnings('once', message=py2_warning)
        warnings.warn(message=py2_warning,
                      category=DeprecationWarning,
                      stacklevel=3,
                      )
Never mind, I had forgotten that DeprecationWarnings are not displayed by default. They must specifically be set to be displayed, which I have not done here.
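For anyone hitting the same thing: DeprecationWarning is swallowed by the default filters when raised from non-__main__ code, so a caller has to opt in before it can observe the warning. A minimal sketch (the deprecated_api function is made up for illustration):

```python
import warnings

def deprecated_api():
    # Stand-in for a library function that warns about its own removal
    warnings.warn("use new_api instead", DeprecationWarning, stacklevel=2)

with warnings.catch_warnings(record=True) as caught:
    # Opt in: without this, the default filters hide DeprecationWarning
    # raised from library code.
    warnings.simplefilter("always", DeprecationWarning)
    deprecated_api()

print(caught[0].category.__name__)  # DeprecationWarning
```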

Suppress warnings for python-xarray

I'm running the following code
positive_values = values.where(values > 0)
In this example values may contain nan elements. I believe that for this reason, I'm getting the following runtime warning:
RuntimeWarning: invalid value encountered in greater_equal if not reflexive
Does xarray have methods of suppressing these warnings?
The warnings module provides the functionality you are looking for.
To suppress all warnings do (see John Coleman's answer for why this is not good practice):
import warnings
warnings.simplefilter("ignore")
# warnings.simplefilter("ignore", category=RuntimeWarning) # for RuntimeWarning only
To make the suppression temporary do it inside the warnings.catch_warnings() context manager:
import warnings
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    positive_values = values.where(values > 0)
The context manager saves the original warning settings prior to entering the context and then sets them back when exiting the context.
As a general rule of thumb, warnings should be heeded rather than suppressed. Either you know what causes the warning or you don't. If you know what causes the warning, there is usually a simple workaround. If you don't know what causes the warning, there is likely a bug. In this case, you can filter out the NaN values explicitly before comparing (note the parentheses: & binds more tightly than >):
positive_values = values.where(values.notnull() & (values > 0))
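The RuntimeWarning itself comes from NumPy's floating-point error machinery rather than the warnings module, so it can also be scoped with np.errstate. A sketch with made-up data (xarray is not needed here, since the comparison bottoms out in NumPy):

```python
import numpy as np

values = np.array([1.5, -2.0, np.nan, 3.0])

# np.errstate temporarily reconfigures NumPy's own error reporting, so
# "invalid value encountered in ..." warnings are silenced only inside
# this block; everything outside it is unaffected.
with np.errstate(invalid="ignore"):
    mask = values > 0   # NaN compares as False, without a warning here

positive_values = np.where(mask, values, np.nan)
```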

Code is working but with a 'DeprecationWarning' - scipy.stats

I wrote this code to calculate the mode and standard deviation for a large sample:
import numpy as np
import csv
import scipy.stats as sp
import math

r = open('stats.txt', 'w')  # file with results
r.write('Data File' + '\t' + 'Mode' + '\t' + 'Std Dev' + '\n')
f = open('data.ls', 'rb')  # file with the data files
for line in f:
    dataf = line.strip()
    data = csv.reader(open(dataf, 'rb'))
    data.next()
    data_list = []
    datacol = []
    data_list.extend(data)
    for rows in data_list:
        datacol.append(math.log10(float(rows[73])))
    m = sp.mode(datacol)
    s = sp.std(datacol)
    r.write(dataf + '\t' + str(m) + '\t' + str(s) + '\n')
    del datacol
    del data_list
This seems to be working well, I think! However, after I run the code there is a message on my terminal, and I am wondering if anybody can tell me what it means:
/usr/lib/python2.6/dist-packages/scipy/stats/stats.py:1328: DeprecationWarning: scipy.stats.std is deprecated; please update your code to use numpy.std.
Please note that:
- numpy.std axis argument defaults to None, not 0
- numpy.std has a ddof argument to replace bias in a more general manner.
scipy.stats.std(a, bias=True) can be replaced by numpy.std(x,
axis=0, ddof=0), scipy.stats.std(a, bias=False) by numpy.std(x, axis=0,
ddof=1).
ddof=1).""", DeprecationWarning)
/usr/lib/python2.6/dist-packages/scipy/stats/stats.py:1304: DeprecationWarning: scipy.stats.var is deprecated; please update your code to use numpy.var.
Please note that:
- numpy.var axis argument defaults to None, not 0
- numpy.var has a ddof argument to replace bias in a more general manner.
scipy.stats.var(a, bias=True) can be replaced by numpy.var(x,
axis=0, ddof=0), scipy.stats.var(a, bias=False) by var(x, axis=0,
ddof=1).
ddof=1).""", DeprecationWarning)
/usr/lib/python2.6/dist-packages/scipy/stats/stats.py:420: DeprecationWarning: scipy.stats.mean is deprecated; please update your code to use numpy.mean.
Please note that:
- numpy.mean axis argument defaults to None, not 0
- numpy.mean has a ddof argument to replace bias in a more general manner.
scipy.stats.mean(a, bias=True) can be replaced by numpy.mean(x,
axis=0, ddof=1).
axis=0, ddof=1).""", DeprecationWarning)
Those are deprecation warnings, which usually mean that your code will work, but may stop working in a future release.
Currently you have the line s = sp.std(datacol). The warning suggests using numpy.std() instead of scipy.stats.std(); making this change should make the warning go away.
If you don't care about the deprecation warning and want to use your code as is, you can suppress it with the warnings module. For example, if you have a function fxn() that generates a DeprecationWarning, you can wrap it like this:
import warnings

with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    fxn()  # this function generates DeprecationWarnings
The DeprecationWarnings don't prevent your code from running properly; they are just warnings that the code you are using will soon be deprecated and that you should update it to the proper syntax.
In this particular case, it stems from inconsistencies between NumPy and SciPy on the default arguments for the var, std... functions/methods. In order to clean things up, it was decided to drop the functions from scipy.stats and use their NumPy counterparts instead.
Of course, just dropping the functions would upset some users whose code would suddenly fail to work. So, the SciPy devs decided to include a DeprecationWarning for a couple of releases, which should leave enough time for everybody to update their code.
In your case, you should check the docstring of scipy.stats.std on your system to see what defaults it uses, and follow the warning's instructions on how to modify your code accordingly.
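For the code in the question, the one-line change spelled out by the warning looks like this. datacol here is a made-up stand-in for the log10 column; which ddof value matches your old default is exactly what the docstring check above is for:

```python
import numpy as np

datacol = [1.0, 2.0, 4.0, 8.0]  # made-up stand-in for the log10 column

# Per the warning text:
#   scipy.stats.std(a, bias=True)  ->  np.std(a, axis=0, ddof=0)
#   scipy.stats.std(a, bias=False) ->  np.std(a, axis=0, ddof=1)
s = np.std(datacol, axis=0, ddof=1)  # sample standard deviation

print(round(s, 4))  # 3.0957
```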
