Today I was building a blockchain in Python from this tutorial, since I'm interested in cryptocurrency. I was ready to launch it when I got this error:
Traceback (most recent call last):
  File "blockchain.py", line 9, in <module>
    class Chain(object):
  File "blockchain.py", line 17, in Chain
    blockchain = Chain()
NameError: name 'Chain' is not defined
I don't really know why this is happening; to me there doesn't seem to be a problem. Here is the part of the code where the error occurs:
import hashlib
import json
from textwrap import dedent
from time import time
from hashlib import sha256
from uuid import uuid4
from flask import Flask, jsonify, request


class Chain(object):
    def __init__(self):
        self.chain = []
        self.current_transactions = []
        self.new_block(previous_hash=1, proof=100)

    app = Flask(__name__)
    node_indentifier = str(uuid4()).replace('-', '')
    blockchain = Chain()
If you need more of the code, I don't mind posting more.
Thanks in advance!
It is as simple as indentation.
You wrote

blockchain = Chain()

inside the class. The correct code just moves the last lines back out of the class:
import hashlib
import json
from textwrap import dedent
from time import time
from hashlib import sha256
from uuid import uuid4
from flask import Flask, jsonify, request


class Chain(object):
    def __init__(self):
        self.chain = []
        self.current_transactions = []
        self.new_block(previous_hash=1, proof=100)


app = Flask(__name__)
node_indentifier = str(uuid4()).replace('-', '')
blockchain = Chain()
IIUC you are trying to call blockchain = Chain() outside the class Chain, but your indentation is wrong. This should work:
class Chain(object):
    def __init__(self):
        self.chain = []
        self.current_transactions = []
        self.new_block(previous_hash=1, proof=100)


app = Flask(__name__)
node_indentifier = str(uuid4()).replace('-', '')
blockchain = Chain()
I'm confused about importing custom modules. As you can see in the code below, in main I first import all the libraries needed by everything AND I duplicated those imports in my i_setup_functions.py file. Leaving any of them out of either file created errors. The same goes for the duplicated app = Flask(__name__). I really hope that redundancy is not correct and that there is some simple way to fix this. All I want to do is include setup for sessions, email, data connection, etc. I'm only showing sessions here for simplicity's sake.
BTW: The entire app worked bug-free until I tried to modularize.
Error message:
RuntimeError: The session is unavailable because no secret key was set. Set the secret_key on the application to something unique and secret.
That error message points to a line in a function in the middle of main.py that tries to create a session.
Thanks for any ideas you all can share!
main.py:
from flask import session
import random
from datetime import datetime, timedelta
from i_setup_functions import setup_sessions
app = Flask(__name__)
# is the following line even necessary in either module?
application = app
setup_sessions()
setup_mail()
setup_logging()
[snip]
# Error here:
session["id_user"] = id_user
i_setup_functions.py
from flask import session
import random
from datetime import datetime, timedelta
from i_setup_functions import setup_sessions
app = Flask(__name__)
application = app
def setup_sessions():
    random.seed(datetime.now())
    app.config['SECRET_KEY'] = str(random.randint(1, 500)) + "jibber" + str(random.randint(1, 500)) + "jabber"
    app.permanent_session_lifetime = timedelta(days=30)
    return True
You are creating two (or more?) separate apps and setting the SECRET_KEY on the one that isn't serving your application.
To fix this, remove all app = Flask(__name__) calls from all modules except main.py. Then pass the app you create in main.py to all the places that need it.
from flask import Flask, session
import random
from datetime import datetime, timedelta
from i_setup_functions import setup_sessions

app = Flask(__name__)

setup_sessions(app)
setup_mail(app)
setup_logging(app)

[snip]

session["id_user"] = id_user
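For the module side, a minimal sketch of what i_setup_functions.py could then look like, receiving the app instead of creating its own (one caveat not in the original answer: a secret key generated with random.randint at startup invalidates every existing session on each restart, so a fixed secret value is generally preferable):

# i_setup_functions.py -- sketch: no Flask app is created here
from datetime import timedelta


def setup_sessions(app):
    # configure the app object passed in from main.py
    app.config['SECRET_KEY'] = 'replace-with-a-long-fixed-secret'  # placeholder value
    app.permanent_session_lifetime = timedelta(days=30)
    return True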
I am trying to mock test an endpoint that gets the time and date.
I have viewed several tutorials and the Python docs, but I am still getting stumped by the mock test.
Any help is greatly appreciated.
from flask import Flask, redirect, url_for
import json
import urllib.request
import requests

app = Flask(__name__)


@app.route('/')
def welcome():
    return "Hello"


@app.route('/<zone>')
def Endpoint(zone):
    address = f"http://worldclockapi.com/api/json/{zone}/now"
    response = urllib.request.urlopen(address)
    result = json.loads(response.read())
    time = result['currentDateTime']
    return time


if __name__ == "__main__":
    app.run(debug=True)
My attempt is below.
I think I am still calling the external service.
I want to use a fake JSON string and actually mock with that.
The first test passes when I run it, but I don't think it is a true mock.
#!/usr/bin/python
import unittest
from unittest import TestCase
from unittest.mock import patch, Mock

# name of endpoint program
import question


class TestingMock(TestCase):
    @patch('question.Endpoint')
    def test_call(self, MockTime):
        current = MockTime()
        current.posts.return_value = [
            {"$id": "1", "currentDateTime": "2020-07-17T12:31-04:00", "utcOffset": "-04:00:00"}
        ]
        response = current.posts()
        self.assertIsNotNone(response)
        self.assertIsInstance(response[0], dict)

    @patch('question.Endpoint')
    def test_response(mock_get):
        mock_get.return_value.ok = True
        respond = question.Endpoint()
        assert_is_not_none(respond)


if __name__ == '__main__':
    unittest.main()
You are conflicting with your root URL handler. Try changing @app.route('/<zone>') to @app.route('/time/<zone>'), then navigate to that URL.
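On the mocking itself: the tests above patch question.Endpoint, so the real view function never runs against the fake data. A minimal sketch of one way to mock the HTTP call instead is to patch urllib.request.urlopen as seen from the question module and feed it a fake JSON byte string (the test name, the zone value and the expected timestamp are illustrative, taken from the fake payload above):

import json
import unittest
from unittest.mock import patch, MagicMock

import question


class TestEndpointMocked(unittest.TestCase):
    @patch('question.urllib.request.urlopen')
    def test_endpoint_returns_mocked_time(self, mock_urlopen):
        # fake HTTP response whose read() returns a JSON byte string
        fake_response = MagicMock()
        fake_response.read.return_value = json.dumps(
            {"$id": "1", "currentDateTime": "2020-07-17T12:31-04:00", "utcOffset": "-04:00:00"}
        ).encode()
        mock_urlopen.return_value = fake_response

        result = question.Endpoint("est")
        self.assertEqual(result, "2020-07-17T12:31-04:00")
        mock_urlopen.assert_called_once()


if __name__ == '__main__':
    unittest.main()

This way the assertion exercises the endpoint's real parsing logic while worldclockapi.com is never contacted.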
I programmed a gateway to an OPC UA server with python-opcua.
The gateway subscribes to some values on the OPC UA server. That works well and fast.
Now I want to call a script that writes to the OPC UA server.
In principle, that works too. But because I have to import the whole gateway (and all the opcua stuff), it is very slow...
My question: Is it possible to trigger a function in my class instance without importing everything?
To start e.g. the function setBool(), I have to import Gateway...
#!/usr/bin/env python3.5 -u
# -*- coding: utf-8 -*-

import time
import sys
import logging
from logging.handlers import RotatingFileHandler
from threading import Thread

from opcua import Client
from opcua import ua

from subscribeOpcua import SubscribeOpcua
from cmdHandling import CmdHandling
from keepConnected import KeepConnected


class Gateway(object):
    def __init__(self):
        OPCUA_IP = '1.25.222.222'
        OPCUA_PORT = '4840'
        OPCUA_URL = "opc.tcp://{}:{}".format(OPCUA_IP, str(OPCUA_PORT))
        addr = "OPCUA-URL:{}.".format(OPCUA_URL)

        # Setting up opcua-handler
        self.client = Client(OPCUA_URL)
        self.opcuaHandlers = [SubscribeOpcua()]

        # Connect to opcua
        self.connecter = KeepConnected(self.client, self.opcuaHandlers)
        self.connecter.start()

    def setBool(self, client):
        """Set a boolean variable on the opcua-server."""
        path = ["0:Objects", "2:DeviceSet"...]
        root = client.get_root_node()
        cmd2opcua = root.get_child(path)
        cmd2opcua.set_value(True)


if __name__ == "__main__":
    """Open connecter when gateway is opened directly."""
    connect = Gateway()
The only way to prevent code from running when a module is imported is to put it inside a function:
def import_first_part():
    global re
    global defaultdict
    print('import this first part')
    # the imports happen locally,
    # because when you do `import re` it is actually
    # re = __import__('re')
    import re
    from collections import defaultdict


def import_second_part():
    print('import pandas')
    # This check is really unnecessary: if we import pandas a second time,
    # the already-loaded module object is simply retrieved; the module's code
    # is executed only on the first import in the life of the application.
    if 'pandas' in globals():
        return
    global pandas
    import pandas


def use_regex():
    import_first_part()
    # do something here


if __name__ == '__main__':
    use_regex()
    re.search('x', 'xb')  # works fine
I check whether 'pandas' is already in the global scope before importing it again, but this is really not necessary: when you import a module a second time it is simply retrieved from the module cache, with no heavy work being done again.
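Applied to the question, one option in the same spirit is to keep the write in a small module of its own and defer the opcua import until the function is actually called, so nothing from the gateway (subscriptions, threads, handlers) gets imported. A rough sketch, with the module name and function name made up for illustration:

# set_bool.py -- hypothetical helper, independent of the Gateway class
def set_bool(opcua_url, path):
    # the heavy import happens only when the write is requested,
    # not when this module is imported
    from opcua import Client

    client = Client(opcua_url)
    client.connect()
    try:
        node = client.get_root_node().get_child(path)
        node.set_value(True)
    finally:
        client.disconnect()

Calling set_bool("opc.tcp://1.25.222.222:4840", path) with the browse path from the question would then connect, write and disconnect without pulling in subscribeOpcua, cmdHandling or keepConnected.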
I am using the Tornado framework to load my machine learning model. I have a Popularity class:
import numpy as np
import pandas as pd
from pandas import DataFrame


class Popularity():
    users_data = pd.read_csv('~/Desktop/LatentCollaborativeFiltering/lib/seed_data/ratings.csv')
    movies_data = pd.read_csv('~/Desktop/LatentCollaborativeFiltering/lib/seed_data/movies.csv')
    data = pd.merge(users_data, movies_data, left_on="movieId", right_on="movieId")
    data = pd.DataFrame.sort_values(data, ['userId', 'movieId'], ascending=[0, 1])

    def __init__(self):
        pass

    def favoriteMovies(self, activeUser, N):
        topMovies = pd.DataFrame.sort_values(self.data[self.data.userId == activeUser], ['rating'], ascending=[0])[:N]
        # return the title corresponding to the movies in topMovies
        return list(topMovies.title)

    def recommend_movies(self):
        return "No recommendation"
Now I have another file, build_model.py, that pickles an object of this class:
from __future__ import print_function
import os
from sklearn.externals import joblib
import pandas as pd
import numpy as np
from popularity import Popularity

if __name__ == "__main__":
    popu = Popularity()
    _CUR_DIR = os.path.dirname(os.path.realpath(__file__))
    _SERIALIZATION_DIR = os.path.join(_CUR_DIR)
    if not os.path.exists(_SERIALIZATION_DIR):
        os.makedirs(_SERIALIZATION_DIR)
    model_filename = os.path.join(_SERIALIZATION_DIR, "model.pkl")
    joblib.dump(popu, model_filename)
    print("Successfully Built and Picked into models folder")
This now builds the model and successfully saves it in the same directory as a model.pkl file. But when I load the model in Tornado it gives me the following error:
[I 180702 06:30:44 server:40] Loading Latent Collaborative Filtering model...
Traceback (most recent call last):
  File "run.py", line 7, in <module>
    server.main()
  File "/home/rabin/Desktop/LatentCollaborativeFiltering/movies-api/app/server.py", line 45, in main
    MODELS["recommender"] = pickle.load(infile)
ModuleNotFoundError: No module named 'Popularity'
My server.py file is
# !/usr/bin/env python
# -*- coding: utf-8 -*-

import os
import logging
import logging.config

import tornado.ioloop
import tornado.web
from tornado.options import options

from sklearn.externals import joblib

from app.settings import MODEL_DIR, ROOT_DIR, _CUR_DIR
from app.handler import IndexHandler, IrisPredictionHandler
from app.popularity import Popularity
import pickle

MODELS = {}


def load_model(pickle_filename):
    return joblib.load(pickle_filename)


def main():
    # Get the Port and Debug mode from command line options or default in settings.py
    options.parse_command_line()

    # create logger for app
    logger = logging.getLogger('app')
    logger.setLevel(logging.INFO)

    FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
    logging.basicConfig(format=FORMAT)

    # Load ML Models
    logger.info("Loading Latent Collaborative Filtering model...")
    #MODELS["recommender"] = load_model(os.path.join(_CUR_DIR, 'model.pkl'))
    #MODELS["recommender"] = load_model('model.pkl')
    with open(os.path.join(_CUR_DIR, 'model.pkl'), 'rb') as infile:
        MODELS["recommender"] = pickle.load(infile)

    urls = [
        (r"/$", IndexHandler),
        (r"/api/recommender/(?P<action>[a-zA-Z]+)?", RecommenderHandler,  # action is function in handler
         dict(model=MODELS["recommender"]))
    ]

    # Create Tornado application
    application = tornado.web.Application(
        urls,
        debug=options.debug,
        autoreload=options.debug)

    # Start Server
    logger.info("Starting App on Port: {} with Debug Mode: {}".format(options.port, options.debug))
    application.listen(options.port)
    tornado.ioloop.IOLoop.current().start()
And my handler.py file is
"""
Request Handlers
"""
import tornado.web
from tornado import concurrent
from tornado import gen
from concurrent.futures import ThreadPoolExecutor
from app.base_handler import BaseApiHandler
from app.settings import MAX_MODEL_THREAD_POOL
from app.popularity import Popularity
class IndexHandler(tornado.web.RequestHandler):
"""APP is live"""
def get(self):
self.write("Movie Recommendation System is Live")
def head(self):
self.finish()
class RecommenderHandler(BaseApiHandler):
_thread_pool = ThreadPoolExecutor(max_workers=MAX_MODEL_THREAD_POOL)
def initialize(self, model, *args, **kwargs):
self.model = model
super().initialize(*args, **kwargs)
#concurrent.run_on_executor(executor='_thread_pool')
def _blocking_predict(self, X):
target_values = self.model.favoriteMovies(5,10)
return target_values
#gen.coroutine
def predict(self, data):
if type(data) == dict:
data = [data]
X = []
for item in data:
record = (item.get("user_id"))
X.append(record)
results = yield self._blocking_predict(X)
self.respond(results)
I have searched a lot for a solution but have not found one that works for me.
I cannot load it from the console either.
I've just finished this script. It connects to the main script and the website runs, but it doesn't print a CSV like I want it to. Have I done something wrong in my script? When I try the /dan route it says it can't find the CSV.
import psutil
import sys
import os
import inspect
import socket
from gevent.pywsgi import WSGIServer
from time import gmtime, strftime
from threading import Thread
from flask import (Flask, render_template, session, app, request, redirect,
                   url_for, send_file, send_from_directory)
from flask_cors import CORS, cross_origin
from time import sleep
import requests
import data
import tank
import csv

APP_ROOT = os.path.dirname(os.path.abspath('__file__'))


def funcScript():
    from data import tankOBJ as tank
    with open(os.path.join('F:\csvtester', '\motherwell.csv'), 'w', newline='') as output_file:
        writer = csv.writer(output_file)
        writer.writerow(tank)
    return 'Hello World!'
The function above outputs the CSV, and below is the output route for the web app:
@app.route('/dan')
def downloadDocument():
    try:
        return csvmotherwell.funcScript()
    except Exception as error:
        return event.Error(str(inspect.stack()[1][3]), str(sys.exc_info()[-1].tb_lineno), str(type(error)), str(error))
os.path.join('F:\csvtester', '\motherwell.csv')

doesn't work. The result is 'F:\motherwell.csv', because a component that starts with a backslash is treated as absolute and makes os.path.join discard everything before it except the drive letter. You probably want

os.path.join(r'F:\csvtester', 'motherwell.csv')

which yields:

F:\csvtester\motherwell.csv

(note the raw string prefix, which keeps the backslashes from being interpreted as escape characters)
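Applied to the question's funcScript, a minimal sketch of the corrected version (the F:\csvtester location and the tankOBJ row come from the question):

import csv
import os


def funcScript():
    from data import tankOBJ as tank
    # raw string keeps the backslash literal; no leading backslash on the filename,
    # so os.path.join produces F:\csvtester\motherwell.csv
    csv_path = os.path.join(r'F:\csvtester', 'motherwell.csv')
    with open(csv_path, 'w', newline='') as output_file:
        writer = csv.writer(output_file)
        writer.writerow(tank)
    return 'Hello World!'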