I'm trying to convert the launch time of EC2 instances in AWS into something more friendly using Python 3.
The error that I'm getting says:
datetime(launch_time)
TypeError: 'module' object is not callable
My program is doing this:
import boto3
import time
import datetime

ec2 = boto3.client('ec2')
instance_id = 'i-024b3382f94bce588'

instance = ec2.describe_instances(
    InstanceIds=[instance_id]
)['Reservations'][0]['Instances'][0]

launch_time = instance['LaunchTime']
datetime(launch_time)
launch_time_friendly = launch_time.strftime("%B %d %Y")
print("Server was launched at: ", launch_time_friendly)
How can I get the time the instances were created into a user-friendly format?
There is both a datetime module and a datetime class. You are attempting to call the module:
import datetime
dt = datetime(2019, 3, 1) # This will break!
Instead, you need to either import the class from the module:
from datetime import datetime
dt = datetime(2019, 3, 1) # Okay!
... or import the module and reference the class:
import datetime
dt = datetime.datetime(2019, 3, 1) # Good!
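Applied to your snippet, a minimal sketch of the fix (LaunchTime already comes back from boto3 as a datetime object, so the stray datetime(launch_time) call can simply be dropped):

import boto3

ec2 = boto3.client('ec2')
instance_id = 'i-024b3382f94bce588'

instance = ec2.describe_instances(
    InstanceIds=[instance_id]
)['Reservations'][0]['Instances'][0]

# LaunchTime is already a datetime, so it can be formatted directly
launch_time_friendly = instance['LaunchTime'].strftime("%B %d %Y")
print("Server was launched at:", launch_time_friendly)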
Related
I built an Apache Airflow setup in Docker, and so far it works well, with one problem that persists across all DAGs: they run for the previous interval, not the current one.
For example, if I make a DAG that runs every minute, at 15:08 it triggers the run for 15:07. And if I make a DAG that runs every year, in 2023 it triggers the run for 2022, not the current year.
Is there any way to fix this? Or is it supposed to be that way, and should I just account for it?
Here is the code for some of my DAGs as an example:
from datetime import datetime
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import PythonOperator
import logging
import random
import pandas as pd
import sqlalchemy
from airflow.utils.log.logging_mixin import LoggingMixin
from dateutil.relativedelta import relativedelta
import requests


def test_print(ds, foo, **kwargs):
    start_date = str(ds)
    end_date = str((datetime.strptime(ds, '%Y-%m-%d') + relativedelta(years=1)).date())
    print('HOLIDAYS:')
    print('--------------')
    print('START DATE:' + start_date)
    print('END DATE:' + end_date)
    print('--------------')
    now = ds
    data2send = {'the_date_n_hour': now}
    r = requests.post("http://[BACKEND SERVER]:8199/do_work/", json=data2send)
    print(r.text)
    assert now in r.text
    task_logger = logging.getLogger('airflow.task')
    task_logger.warning(r.text)
    return 'ok'


dag = DAG('test_test', description='test DAG',
          schedule_interval='*/1 * * * *',
          start_date=datetime(2017, 3, 20), catchup=False)

test_operator = PythonOperator(task_id='test_task',
                               python_callable=test_print,
                               dag=dag,
                               provide_context=True,
                               op_kwargs={'foo': 'bar'})

test_operator
from __future__ import print_function

import time
from builtins import range
from pprint import pprint

import airflow
from airflow.models import DAG
from airflow.operators.python_operator import PythonOperator
import sqlalchemy
import pandas as pd
import datetime
import requests
from dateutil.relativedelta import relativedelta

args = {
    'owner': 'airflow',
    "depends_on_past": False,
    "retries": 12,
    "retry_delay": datetime.timedelta(minutes=60)}

dag = DAG(
    dag_id='dag_holidays',
    default_args=args,
    schedule_interval='0 12 1 1 *',
    start_date=datetime.datetime(2013, 1, 1),
    catchup=True)


def get_holidays(ds, gtp_id, **kwargs):
    """Wait a bit so that SQL isn't overwhelmed"""
    holi_start_date = str(ds)
    holi_end_date = str((datetime.datetime.strptime(ds, '%Y-%m-%d') + relativedelta(years=1)).date())
    print('HOLIDAYS:')
    print('--------------')
    print('GTP ID: {}'.format(str(gtp_id)))
    print('START DATE:' + holi_start_date)
    print('END DATE:' + holi_end_date)
    print('--------------')
    r = requests.post("http://[BACKEND SERVER]/load_holidays/",
                      data={'gtp_id': gtp_id, 'start_date': holi_start_date, 'end_date': holi_end_date})
    if 'Error' in r.text:
        raise Exception(r.text)
    else:
        return r.text
    return ds


engine = sqlalchemy.create_engine('[SQL SERVER]')
query_string1 = f"""select gtp_id from gtps"""
all_ids = list(pd.read_sql_query(query_string1, engine).gtp_id)

for i, gtp_id in enumerate(all_ids):
    task = PythonOperator(
        task_id='holidays_' + str(gtp_id),
        python_callable=get_holidays,
        provide_context=True,
        op_kwargs={'gtp_id': gtp_id},
        dag=dag,
    )
    task
Yes, it is supposed to work this way, and it can definitely be a bit confusing at first.
The reason for this behavior is that Airflow was built with a lot of ETL-type processing in mind, and with that pattern you run your DAG on the data of the interval that has just finished.
For example, when your data-processing DAG runs every day at 3am, the data it processes is the data that was collected since 3am the previous day.
This period is called the Data Interval in Airflow terms.
The start of the data interval is the Logical Date (called execution date in earlier versions), which is what is incorporated into the Run ID. I think this is what you are seeing as the previous iteration.
The end of the data interval is the Run After date; this is when the DAG run will actually be scheduled.
When you hover over the Next Run: field in the Airflow UI for a given DAG, you will see all of those dates and timestamps for its next run.
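For example, a minimal sketch of printing those dates from inside a task (assuming Airflow 2.2+, where the task context exposes data_interval_start and data_interval_end):

from airflow.operators.python import PythonOperator

def print_interval(data_interval_start, data_interval_end, **kwargs):
    # data_interval_start is the Logical Date you see in the Run ID;
    # data_interval_end is the Run After date, when the run actually gets scheduled
    print(f"Processing data from {data_interval_start} to {data_interval_end}")

print_task = PythonOperator(
    task_id='print_interval',
    python_callable=print_interval,
    dag=dag,
)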
This guide on scheduling DAGs might be helpful as a reference and it has some examples.
Disclaimer: I work for Astronomer, the company behind the guide I linked. :)
I'm trying to make a trading bot using backtrader. I have been trying to debug this issue but haven't found a solution yet. The code is shown below:
# from ast import Constant
from operator import imod
import os, sys
import config
from binance.client import Client
import backtrader
import pandas as pd
import datetime, time

client = Client(config.Binanceapikey, config.BinancesecretKey)

def GetHistoricalData(howLong):
    # Calculate the timestamps for the binance api function
    untilThisDate = datetime.datetime.now()
    sinceThisDate = untilThisDate - datetime.timedelta(days=howLong)
    # Execute the query from binance - timestamps must be converted to strings !
    candle = client.get_historical_klines("BTCUSDT", Client.KLINE_INTERVAL_1MINUTE, str(sinceThisDate), str(untilThisDate))
    # Create a dataframe to label all the columns returned by binance so we work with them later.
    df = pd.DataFrame(candle, columns=['dateTime', 'open', 'high', 'low', 'close', 'volume', 'closeTime', 'quoteAssetVolume', 'numberOfTrades', 'takerBuyBaseVol', 'takerBuyQuoteVol', 'ignore'])
    # as timestamp is returned in ms, let us convert this back to proper timestamps.
    df.dateTime = pd.to_datetime(df.dateTime, unit='ms').dt.strftime("%Y-%m-%d")
    df.set_index('dateTime', inplace=True)
    # Get rid of columns we do not need
    df = df.drop(['closeTime', 'quoteAssetVolume', 'numberOfTrades', 'takerBuyBaseVol', 'takerBuyQuoteVol', 'ignore'], axis=1)

cerebro = backtrader.Cerebro()
cerebro.broker.set_cash(100000)
cerebro.adddata(GetHistoricalData(1))

print('Starting portfolio value: %.2f' % cerebro.broker.getvalue())
cerebro.run()
print('Final portfolio value: %.2f' % cerebro.broker.getvalue())
The error message is as follows:
File "/TradingBot/tradingBot.py", line 40, in <module>
cerebro.adddata(GetHistoricalData(1))
File "/usr/local/lib/python3.8/site-packages/backtrader/cerebro.py", line 757, in adddata
data._id = next(self._dataid)
AttributeError: 'NoneType' object has no attribute '_id'
Thanks in advance!
You do not have a return in GetHistoricalData, so it is sending None to adddata(). Maybe you need to return the dataframe? If not, clarify your intent.
# from ast import Constant
from operator import imod
import os, sys
import config
from binance.client import Client
import backtrader
import pandas as pd
import datetime, time

client = Client(config.Binanceapikey, config.BinancesecretKey)

def GetHistoricalData(howLong):
    # Calculate the timestamps for the binance api function
    untilThisDate = datetime.datetime.now()
    sinceThisDate = untilThisDate - datetime.timedelta(days=howLong)
    # Execute the query from binance - timestamps must be converted to strings !
    candle = client.get_historical_klines("BTCUSDT", Client.KLINE_INTERVAL_1MINUTE, str(sinceThisDate), str(untilThisDate))
    # Create a dataframe to label all the columns returned by binance so we work with them later.
    df = pd.DataFrame(candle, columns=['dateTime', 'open', 'high', 'low', 'close', 'volume', 'closeTime', 'quoteAssetVolume', 'numberOfTrades', 'takerBuyBaseVol', 'takerBuyQuoteVol', 'ignore'])
    # as timestamp is returned in ms, let us convert this back to proper timestamps.
    df.dateTime = pd.to_datetime(df.dateTime, unit='ms').dt.strftime("%Y-%m-%d")
    df.set_index('dateTime', inplace=True)
    # Get rid of columns we do not need
    df = df.drop(['closeTime', 'quoteAssetVolume', 'numberOfTrades', 'takerBuyBaseVol', 'takerBuyQuoteVol', 'ignore'], axis=1)
    # return the df
    return df

cerebro = backtrader.Cerebro()
cerebro.broker.set_cash(100000)
cerebro.adddata(GetHistoricalData(1))

print('Starting portfolio value: %.2f' % cerebro.broker.getvalue())
cerebro.run()
print('Final portfolio value: %.2f' % cerebro.broker.getvalue())
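One extra caveat beyond the question itself: cerebro.adddata() expects a backtrader data feed rather than a raw pandas DataFrame, so you will probably also need to wrap the frame, for example with the built-in PandasData feed (a sketch, assuming the index has been converted to real datetimes and the price columns to numeric types):

# Wrap the DataFrame in a backtrader feed before handing it to Cerebro
data = backtrader.feeds.PandasData(dataname=GetHistoricalData(1))
cerebro.adddata(data)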
I have this code that builds a report, and I'm trying to work with the date column, but I can't because PyCharm says it can't work with the "Series" format. I'm trying to convert it to a simple datetime but nothing works. Can you help me?
The "DATA" column comes back as datetime64 and I need it to be a normal datetime. How can I do this?
import pyodbc
import pandas as pd
import matplotlib.pyplot as plt
import datetime

class generate_report():
    def __init__(self):
        self.csv = "output.csv"
        self.sql_conn = pyodbc.connect('Trusted_Connection=yes', driver='{SQL Server}',
                                       server='localhost', database='MPWJ_BI')
        self.query = "select * from CTP_EXTRATO_GERAL where HISTORICO = 'Aplicação' order by data"
        self.df = pd.read_sql(self.query, self.sql_conn)
        self.df['DATA'] = pd.to_datetime(self.df['DATA'])
        self.df.to_csv(self.csv)

    def analyze_data(self):
        pd.read_csv(self.csv)
        print(self.df.dtypes)
It depends on what your date looks like. For example:
from datetime import datetime
datetime_object = datetime.strptime('Jun 1 2020 7:31PM', '%b %d %Y %I:%M%p')
Documentation: https://docs.python.org/3/library/datetime.html#datetime.datetime.strptime
EDIT:
To convert from datetime64 to datetime you can do the following:
import datetime
import numpy as np
# Current time UTC
dt = datetime.datetime.utcnow()
# Convert to datetime64
dt64 = np.datetime64(dt)
# convert to epoch
ts = (dt64 - np.datetime64('1970-01-01T00:00:00')) / np.timedelta64(1, 's')
# Convert to datetime
print(datetime.datetime.fromtimestamp(ts))
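If you want to convert a whole pandas column, as with the DATA column in the question, here is a small sketch using pandas' own accessor (assuming you just need plain datetime.datetime objects out of the Series):

import pandas as pd

df = pd.DataFrame({"DATA": pd.to_datetime(["2020-06-01 19:31:00", "2020-06-02 08:15:00"])})
# .dt.to_pydatetime() turns the datetime64[ns] column into ordinary datetime.datetime objects
py_dates = df["DATA"].dt.to_pydatetime()
print(py_dates[0], type(py_dates[0]))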
I am trying to loop through a list of symbols to get rates for various currencies via MT5. I use the code below, but I get a TypeError:
d[i] = [y.close for y in rates1]
TypeError: 'NoneType' object is not iterable
I can't see where I'm going wrong. I would like to use this structure to loop through and create multiple dataframes, and then build a big multi-index of all pairs and times using the same kind of loop. I've not been coding long.
sym = ['GBPUSD','USDJPY','USDCHF','AUDUSD','GBPJPY']

# Copying data to dataframe
d = pd.DataFrame()
for i in sym:
    rates1 = mt5.copy_rates_from(i, mt5.TIMEFRAME_M1, 5)
    d[i] = [y.close for y in rates1]
# -*- coding: utf-8 -*-
"""
Created on Mon Jun 29 18:38:11 2020

@author: DanPc
"""
import pytz
import pandas as pd
import MetaTrader5 as mt5
import time
from datetime import datetime
from threading import Timer
import talib
import numpy as np
import matplotlib as plt
from multiprocessing import Process
import sys
server_name = ""  # ENTER DETAILS HERE
server_num =
password = ""
#------------------------------------------------------------------------------
def actualtime():
    # datetime object containing current date and time
    now = datetime.now()
    dt_string = now.strftime("%d/%m/%Y %H:%M:%S")
    # print("date and time =", dt_string)
    return str(dt_string)

#------------------------------------------------------------------------------
def sync_60sec(op):
    info_time_new = datetime.strptime(str(actualtime()), '%d/%m/%Y %H:%M:%S')
    waiting_time = 60 - info_time_new.second
    t = Timer(waiting_time, op)
    t.start()
    print(actualtime)
#------------------------------------------------------------------------------
def program(symbol):
    if not mt5.initialize(login=server_num, server=server_name, password=password):
        print("initialize() failed, error code =", mt5.last_error())
        quit()

    timezone = pytz.timezone("Etc/UTC")
    utc_from = datetime.now()

    ######### Change here the timeframe 525600
    # Create currency watchlist for which correlation matrix is to be plotted
    sym = ['GBPUSD','USDJPY','USDCHF','AUDUSD','GBPJPY']
    # Copying data to dataframe
    d = pd.DataFrame()
    for i in sym:
        rates1 = mt5.copy_rates_from(i, mt5.TIMEFRAME_M1, 5)
        d[i] = [y.close for y in rates1]
        print(rates1)

    mt5.shutdown()

    if not mt5.initialize():
        print("initialize() failed, error code =", mt5.last_error())
        quit()

    # starting mt5
    if not mt5.initialize(login=server_num, server=server_name, password=password):
        print("initialize() failed, error code =", mt5.last_error())
        quit()

    #------------------------------------------------------------------------------
    # S T A R T I N G   M T 5
    #------------------------------------------------------------------------------
    authorized = mt5.login(server_num, password=password)
    if authorized:
        account_info = mt5.account_info()
        if account_info != None:
            account_info_dict = mt5.account_info()._asdict()
            df = pd.DataFrame(list(account_info_dict.items()), columns=['property', 'value'])
            print("account_info() as dataframe:")
            print(df)
    else:
        print(mt5.last_error)
    mt5.shutdown()
#------------------------------------------------------------------------------
def trading_bot():
    symbol_1 = 'EURUSD'
    symbol_2 = 'EURCAD'
    while True:
        program(symbol_1)
        program(symbol_2)
        time.sleep(59.8)  # it depends on your computer and ping

sync_60sec(trading_bot)
copy_rates_from returns None if there is an error. The documentation suggests calling last_error() to find out what that error is.
(And no, I don't know why copy_rates_from doesn't just raise an exception to indicate the error. Apparently, the module is a thin wrapper around a C library.)
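For instance, a minimal sketch of adding that check inside your loop (assuming d and sym are defined as in your snippet; note that copy_rates_from expects a symbol, a timeframe, a start date and a bar count):

from datetime import datetime

for i in sym:
    rates1 = mt5.copy_rates_from(i, mt5.TIMEFRAME_M1, datetime.now(), 5)
    if rates1 is None:
        # Find out why the request failed instead of crashing on the None
        print("copy_rates_from failed for", i, "error:", mt5.last_error())
        continue
    # copy_rates_from returns a numpy structured array, so access fields by name
    d[i] = [y['close'] for y in rates1]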
I came up with this solution, which creates a dictionary of dataframes:
sym = ["GBPUSD","USDJPY","USDCHF","AUDUSD","GBPJPY"]
# Copying data to dataframe
utc_from = datetime.now()
for i in sym:
rates = {i:pd.DataFrame(mt5.copy_rates_from(i, mt5.TIMEFRAME_M1, utc_from , 60),
columns=['time', 'open', 'low', 'high', 'close', 'tick_volume', 'spread', 'real_volume']) for i in sym}
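If you then want the single table of close prices across all pairs that you mentioned, a small follow-on sketch (assuming the rates dictionary built above):

closes = pd.DataFrame({s: rates[s]['close'] for s in sym})
print(closes.head())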
I ran this when it was noon in Turkey. This is what I got:
2017-12-22 20:11:46.038218+03:00
import pytz
from pytz import timezone
from datetime import datetime
utc_now = datetime.now()
utc = pytz.timezone('UTC')
aware_date = utc.localize(utc_now)
turkey = timezone('Europe/Istanbul')
now_turkey = aware_date.astimezone(turkey)
Why did I get 20:11:46?
Because the base time is wrong: datetime.now() returns local time, not UTC, so localizing it as UTC produces a wrong aware datetime. Just change utc_now = datetime.now() to utc_now = datetime.utcnow() and then it works.
As @RemcoGerlich has said, you should use utcnow() to get UTC.
Whole code:
import pytz
from pytz import timezone
from datetime import datetime
utc_now = datetime.utcnow()
utc = pytz.timezone('UTC')
aware_date = utc.localize(utc_now)
turkey = timezone('Europe/Istanbul')
now_turkey = aware_date.astimezone(turkey)
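As an aside, a shorter equivalent sketch if you only need the current time in Istanbul (passing a tzinfo straight to datetime.now() returns an aware datetime in one step):

from datetime import datetime
import pytz

# datetime.now(tz) handles the UTC conversion and localization for you
now_turkey = datetime.now(pytz.timezone('Europe/Istanbul'))
print(now_turkey)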