I have seen functions being passed with teleport_function; is there any way to pass class methods? Or can I execute my class object remotely in some other way?
I could work around this by making the class a service, but that does not fully meet the requirements of my project.
import rpyc
from rpyc.utils.classic import teleport_function

def a(test):
    print(test)

if __name__ == '__main__':
    proxy = rpyc.classic.connect('remote.com')
    launch_command_remote = teleport_function(proxy, a)
    launch_command_remote("Voila")  # executed remotely
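One workaround I am considering (just a sketch, assuming a classic connection; RemoteWorker and CLASS_SRC are illustrative names, not part of my project): send the class source over with conn.execute and instantiate it through conn.namespace, so the methods run on the remote side.
import rpyc

CLASS_SRC = """
class RemoteWorker(object):
    def __init__(self, name):
        self.name = name
    def run(self, arg):
        return "%s handled %s" % (self.name, arg)
"""

if __name__ == '__main__':
    conn = rpyc.classic.connect('remote.com')
    conn.execute(CLASS_SRC)                              # define the class on the remote side
    worker = conn.namespace['RemoteWorker']('worker-1')  # instantiate it remotely
    print(worker.run('Voila'))                           # the method executes remotely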
I am trying to use the hydra tool in my project and would like to use its decorator on class methods.
import hydra
from hydra.core.config_store import ConfigStore
from src.config import RecordingConfig

cs = ConfigStore.instance()
cs.store(name="recording_config", node=RecordingConfig)

class HydraClassTest:
    @hydra.main(config_path="../src/conf/", config_name="conf")
    def __init__(self, conf: RecordingConfig):
        print(conf)

def main():
    HydraClassTest()

if __name__ == "__main__":
    main()
But I get the error
TypeError: __init__() missing 1 required positional argument: 'conf'
Is this intended, and should I pass the configuration to the class from the outside? (For example, by using the decorator on the main function and passing the configuration as a parameter to the initializer; this works.)
Or am I using the decorator in the wrong way?
If it is intended, is there some design reason why one would not want to do it that way?
I have checked that I am otherwise using the decorator correctly by passing the configuration through the main function; that worked.
import hydra
from hydra.core.config_store import ConfigStore
from src.config import RecordingConfig

cs = ConfigStore.instance()
cs.store(name="recording_config", node=RecordingConfig)

class HydraClassTest:
    def __init__(self, conf: RecordingConfig):
        print(conf)

@hydra.main(config_path="../src/conf/", config_name="conf")
def main(conf: RecordingConfig):
    HydraClassTest(conf)

if __name__ == "__main__":
    main()
This gives me the expected result.
@hydra.main() is not appropriate for this use case. It is designed to be used once per application, and it has many side effects (changing the working directory, configuring logging, etc.).
Use the Compose API instead.
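A minimal sketch of what that could look like for the class above, assuming Hydra 1.1+ where compose and initialize are exposed at the top level (in older versions they live in hydra.experimental); the config_path and config_name values simply mirror the ones used earlier:
from hydra import compose, initialize

def main():
    # initialize() sets up Hydra's config search path without the
    # side effects of @hydra.main()
    with initialize(config_path="../src/conf/"):
        conf = compose(config_name="conf")
        HydraClassTest(conf)

if __name__ == "__main__":
    main()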
So, consider a simple library that I am trying to write unit tests for. This library talks to a database and then uses that data to call a SOAP API. I have three modules, and a test file for each module.
dir structure:
./mypkg
../__init__.py
../main.py
../db.py
../api.py
./tests
../test_main.py
../test_db.py
../test_api.py
Code:
# db.py
import mysqlclient

class Db(object):
    def __init__(self):
        self._client = mysqlclient.Client()

    @property
    def data(self):
        return self._client.some_query()

# api.py
import soapclient

class Api(object):
    def __init__(self):
        self._client = soapclient.Client()

    @property
    def call(self):
        return self._client.some_external_call()

# main.py
from db import Db
from api import Api

class MyLib(object):
    def __init__(self):
        self.db = Db()
        self.api = Api()

    def caller(self):
        return self.api.call(self.db.data)
Unit-Tests:
# test_db.py
import mock
from mypkg.db import Db

@mock.patch('mypkg.db.mysqlclient')
def test_db(mysqlclient_mock):
    mysqlclient_mock.Client.return_value.some_query.return_value = {'data': 'data'}
    db = Db()
    assert db.data == {'data': 'data'}

# test_api.py
import mock
from mypkg.api import Api

@mock.patch('mypkg.api.soapclient')
def test_api(soap_mock):
    soap_mock.Client.return_value.some_external_call.return_value = 'foo'
    api = Api()
    assert api.call == 'foo'
In the above example, mypkg.main.MyLib calls mypkg.db.Db() (which uses the third-party mysqlclient) and then mypkg.api.Api() (which uses the third-party soapclient).
I am using mock.patch to patch the third-party libraries, mocking the db and api calls in test_db and test_api separately.
Now my question is: is it recommended to patch these external calls again in test_main, or to simply patch db.Db and api.Api? (This example is pretty simple, but in larger libraries the code becomes cumbersome when patching the external calls again, even with test helper functions that patch the internal libraries.)
Option 1: patch the external libraries again in test_main
# test_main.py
import mock
from mypkg.main import MyLib

@mock.patch('mypkg.db.mysqlclient')
@mock.patch('mypkg.api.soapclient')
def test_main(soap_mock, mysqlclient_mock):
    ml = MyLib()
    soap_mock.Client.return_value.some_external_call = 'foo'
    assert ml.caller() == 'foo'
Option 2: patch the internal libraries
# test_main.py
import mock
from mypkg.main import MyLib

@mock.patch('mypkg.db.Db')
@mock.patch('mypkg.api.Api')
def test_main(api_mock, db_mock):
    ml = MyLib()
    api_mock.return_value = 'foo'
    assert ml.caller() == 'foo'
mock.patch creates a mock version of something where it is imported, not where it lives. This means the string passed to mock.patch has to be the path to the name as it is imported in the module under test. Here's what the patch decorators should look like in test_main.py:
@mock.patch('mypkg.main.Db')
@mock.patch('mypkg.main.Api')
Also, the handles you have on your patched modules (api_mock and db_mock) refer to the classes, not instances of those classes. When you write api_mock.return_value = 'foo', you're telling api_mock to return 'foo' when it gets called, not when an instance of it has a method called on it. Here are the objects in main.py and how they relate to api_mock and db_mock in your test:
Api is a class : api_mock
Api() is an instance : api_mock.return_value
Api().call is an instance method : api_mock.return_value.call
Api().call() is a return value : api_mock.return_value.call.return_value
Db is a class : db_mock
Db() is an instance : db_mock.return_value
Db().data is an attribute : db_mock.return_value.data
test_main.py should therefore look like this:
import mock
from mypkg.main import MyLib

@mock.patch('mypkg.main.Db')
@mock.patch('mypkg.main.Api')
def test_main(api_mock, db_mock):
    ml = MyLib()
    api_mock.return_value.call.return_value = 'foo'
    db_mock.return_value.data = 'some data'  # needed to check that the call to api_mock had the correct arguments
    assert ml.caller() == 'foo'
    api_mock.return_value.call.assert_called_once_with('some data')
The first patch in Option 1 would work great for unit-testing db.py, because it gives the db module a mock version of mysqlclient. Similarly, @mock.patch('mypkg.api.soapclient') belongs in test_api.py.
I can't think of a way Option 2 could help you unit-test anything.
Edited: I was incorrectly referring to classes as modules. db.py and api.py are modules
I come from a Java background and most of my thinking comes from there; I have recently started learning Python. I have a case where I want to create just one connection to Redis and use it everywhere in the project. Here is what my structure and code look like.
module: state.domain_objects.py
import pickle
import redis

class MyRedis():
    global redis_instance

    def __init__(self):
        redis_instance = redis.Redis(host='localhost', port=6379, db=0)
        print("Redis instance created", redis_instance)

    @staticmethod
    def get_instance():
        return redis_instance

    def save_to_redis(self, key, object_to_cache):
        pickleObj = pickle.dumps(object_to_cache)
        redis_instance.set(key, pickleObj)

    def get_from_redis(self, key):
        pickled_obj = redis_instance.get(key)
        return pickle.loads(pickled_obj)
class ABC():
....
Now I want to use this from other modules.
module: service.some_module.py
import datetime

from state.domain_objects import MyRedis
from flask import Flask, request

app = Flask(__name__)

@app.route('/chat/v1/', methods=['GET'])
def chat_service():
    userid = request.args.get('id')
    message_string = request.args.get('message')
    message = Message(message_string, datetime.datetime.now())
    r = MyRedis.get_instance()
    user = r.get(userid)

if __name__ == '__main__':
    global redis_instance
    MyRedis()
    app.run()
When I start the server, the MyRedis() __init__ method gets called and the instance, which I have declared as global, gets created. Still, when the service is called and tries to access it, I get NameError: name 'redis_instance' is not defined. I am sure this is because I am trying to Java-fy the approach, but I am not sure how exactly to achieve it. I read about globals, and my understanding is that a global acts like a single variable for the module, which is why I tried doing it this way. Please help me clear up my confusion. Thanks!
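For what it is worth, here is a minimal sketch of the module-level pattern I think I am trying to imitate (init_redis and the layout are just illustrative, not my actual code): a plain module attribute, assigned once, plays the role of the Java-style singleton, and global is only needed inside the function that rebinds it.
# state/domain_objects.py -- illustrative sketch only
import pickle
import redis

redis_instance = None  # module-level slot that plays the singleton role

def init_redis(host='localhost', port=6379, db=0):
    global redis_instance  # needed here because we rebind the module attribute
    if redis_instance is None:
        redis_instance = redis.Redis(host=host, port=port, db=db)
    return redis_instance

def save_to_redis(key, object_to_cache):
    redis_instance.set(key, pickle.dumps(object_to_cache))

def get_from_redis(key):
    return pickle.loads(redis_instance.get(key))
My understanding is that any other module could then import this and call init_redis() once at startup, and later lookups would reuse the same connection because they all read the same module attribute.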
For the last few weeks, I have been playing a little bit with the web.py framework. As my application is now getting bigger and bigger, I want to restructure the source code and put code fragments into different classes. Now, I don't really know where I should create my object instances if I need them in different web.py classes. Let us assume my source code looks like this:
import web
import myclass

urls = (
    '/', 'index',
    '/test', 'test'
)

# should I make my instance global...
my = myclass.myClass()

class test:
    def __init__(self):
        # ...or should I make my instance local: my = myclass.myClass()
        pass

    def GET(self):
        item = my.getItem()
        return item

    def POST(self):
        pass

class index:
    def __init__(self):
        # ...or should I make my instance local: my = myclass.myClass()
        pass

    def GET(self):
        date = my.getDate()
        return date

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()
Now, I want to access the methods getItem() and getDate() (which belong to the instance my) when the appropriate pages are requested in my web browser. My question is: should I make the instance global, or is it better to make it local? I really don't like global instances, but I don't see any way around one here. Sure, it would be possible to create a local instance, but then a new instance would be created every time the page loads, right? Normally this wouldn't be a problem, but myclass accesses a serial port, so I need to make sure that only one instance is created.
Am I missing something, or is a global instance the only possible way to accomplish this?
After some research, I came to the conclusion that global instances are the way to go here. However, one must be careful with global instances if they are used together with web.py's auto-reload mode. In auto-reload mode, a global instance is created every time a new page loads. If you want to avoid that, you have to use something like this:
import web
import serial

urls = (
    "/(.*)", "test"
)

web.config.debug = True

if web.config.debug:
    # in debug (auto-reload) mode, make sure the global serial instance is only created at start-up
    if not hasattr(web, "_serObj"):
        serObj = serial.Serial(0, 9600, parity=serial.PARITY_NONE)
        web._serObj = serObj
    else:
        serObj = web._serObj

class test:
    def GET(self):
        return "Test"

    def POST(self):
        pass

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()
I need to create a dbus object in python with method names that are decided at runtime.
The code I've tried is basically this:
import dbus
import dbus.service
from dbus.mainloop.glib import DBusGMainLoop
import gobject

DBusGMainLoop(set_as_default=True)
gobject.threads_init()

class greg(dbus.service.Object):
    def __init__(self):
        dbus.service.Object.__init__(self, bus, "/greg")

    @dbus.service.method(
        dbus_interface="com.blah.blah",
        in_signature="",
        out_signature="")
    def dance(self):
        print("*busts a move*")

def func(self):
    pass

func = dbus.service.method(
    dbus_interface="com.blah.blah",
    in_signature="",
    out_signature="")(func)

setattr(greg, "do_nothing", func)

bus = dbus.SystemBus()
busname = dbus.service.BusName("com.blah.blah", bus)
obj = greg()

loop = gobject.MainLoop()
loop.run()
In this case the method 'dance' is available on the interface, but 'do_nothing' is not, and I don't understand why. Is there a way to do what I'm trying to achieve?
I'm guessing that the do_nothing method is available, but not visible. Have you tried to call it blindly?
What is visible is what is returned by the Introspect method, which in turn depends on the _dbus_class_table class attribute, which you therefore need to update to have Introspect return the updated list of D-Bus methods.
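One hedged way to apply that idea (a sketch based on my reading of dbus-python's behaviour, not a documented API: the service metaclass collects decorated methods when the class object is created, so if you build the class after the method name is known, the runtime-named method gets registered; make_greg_class is just an illustrative helper):
def make_greg_class(method_name):
    def impl(self):
        pass
    impl = dbus.service.method(
        dbus_interface="com.blah.blah",
        in_signature="",
        out_signature="")(impl)

    def __init__(self):
        dbus.service.Object.__init__(self, bus, "/greg")

    # fetch dbus-python's service metaclass dynamically and build the class
    # with the runtime-decided method already in its namespace
    metaclass = type(dbus.service.Object)
    return metaclass("greg", (dbus.service.Object,),
                     {"__init__": __init__, method_name: impl})

greg = make_greg_class("do_nothing")
obj = greg()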
func() has no dbus service header, so it is not recognized.
How can you set "do_nothing" to your function when the greg object contains no such attribute?
Check whether the object has the attribute to ensure that your statement will complete successfully:
print(hasattr(greg, "do_nothing"))
Also, it would be appreciated if you could pay more attention to the Python code style guidelines (PEP 8) in the future:
http://www.python.org/dev/peps/pep-0008/