Finding the price change between two dates in yfinance - python

I'm trying to find the price change between two dates using yfinance. My code currently shows a matplotlib graph of a selected stock from the start to the end of the Great Recession, but I can't figure out how to get just one day of data and store it in a variable. Is there a way to store the closing price for a specific date, or to get the price change between two dates?
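A minimal sketch of one way to do this with yfinance (the ticker and both dates below are placeholders, and the dates must be days the market was open):

import yfinance as yf

# Pull daily data covering the window of interest
data = yf.Ticker("AAPL").history(start="2007-12-01", end="2009-07-01")
close = data["Close"]

# Closing price on a single trading day
start_price = close.loc["2007-12-03"]
end_price = close.loc["2009-06-29"]

# Price change between the two dates, absolute and in percent
change = end_price - start_price
pct_change = (end_price / start_price - 1) * 100
print(change, pct_change)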

Related

Use TensorFlow model to guess / predict values between 2 points

My question is something I haven't seen addressed anywhere: I've been wondering whether it is possible for a TF model to determine values between two dates that have real / validated values assigned to them.
Here is an example:
Let's take the price of Nickel; here is its chart for the last week:
There is no data for the two following dates: 19/11 and 20/11,
but we have the data points before and after.
So is it possible to use the data from before and after these two points to guess the values for the two missing dates?
Thanks a lot!
It would be possible to create a machine learning model to predict the prices given a dataset of previous prices. Take a look at this post, for instance. You would have to modify it slightly so that it predicts the prices in the gaps given the previous and upcoming prices.
But for the example you gave, assuming the dates are in 2022, those are a Saturday and a Sunday; the stock market is closed on weekends, hence there is no price for the item. Also note that there are other days in the year when no trading occurs (think of holidays), and then there is no price either.
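If a full ML model is overkill, a simpler baseline for gaps like this is plain interpolation in pandas. A minimal sketch (the price values below are invented for illustration):

import pandas as pd

# Toy series with the 19/11 and 20/11 values missing (prices are made up)
prices = pd.Series(
    [23400.0, 23550.0, None, None, 23800.0],
    index=pd.to_datetime(
        ["2022-11-17", "2022-11-18", "2022-11-19", "2022-11-20", "2022-11-21"]
    ),
)

# Linear interpolation weighted by the time gap between points
filled = prices.interpolate(method="time")
print(filled)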

How do I plot a bar graph with the column made by groupby

I'm an Environmental Engineer trying to make a leap into data science, which interests me more.
I'm new to Python. I work at a company that evaluates air quality data, and I think that if I automate the analysis I will save some time.
I imported the CSV files with environmental data from the past month, applied some filters just to make sure the data were okay, and did a groupby to analyse the data day by day (I need that in my report for the regulatory agency).
Step by step, this is what I did:
medias = tabela.groupby(by=["Data"]).mean()
display (tabela)
As you can see, there is a column named Data, but when I check the info it does not recognize Data as a column.
print (medias.info())
How can I solve this? I need to plot some graphs with the concentration of rain and dust per day.
After grouping, please do a reset_index:
medias = tabela.groupby(by=["Data"]).mean()
medias = medias.reset_index()
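After the reset_index, Data is an ordinary column again and can be passed straight to the plotting call. A short sketch of the bar plot (the concentration column name "Poeira" is hypothetical; substitute the real column names from your file):

import matplotlib.pyplot as plt

medias = tabela.groupby(by=["Data"]).mean().reset_index()
# "Poeira" stands in for whichever concentration column you want to plot
medias.plot.bar(x="Data", y="Poeira")
plt.tight_layout()
plt.show()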

Rolling dates in Python Pandas

I have the following data frame. I am trying to group the data into rolling date categories based on the received date. I need the code to work so that when a new piece of data is added to the source, it still falls into these date groups: 0-3 months, 3-6 months, 6-9, 9-12, and 12+. Can someone please walk me through how to do this? I have searched high and low for days and I just don't understand how to do it. I have tried grouping the dates based on receive date, but that doesn't account for new dates being added (picture attached). My ultimate goal is to deploy this in a Dash app. I have a ton of other graphs that are already deployed, but a handful of them need these datetime buckets, and I can't build them until I figure this part out.
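One way to get buckets that stay correct as new rows arrive is to compute the age of each received date relative to today and cut it into the desired ranges. A minimal sketch, assuming the date column is called "Received" (the sample dates are invented):

import pandas as pd

df = pd.DataFrame(
    {"Received": pd.to_datetime(["2024-01-15", "2023-08-02", "2022-11-30"])}
)

today = pd.Timestamp.today().normalize()
age_days = (today - df["Received"]).dt.days

# Approximate month boundaries in days; 12+ months is open-ended
bins = [0, 91, 182, 273, 365, float("inf")]
labels = ["0-3 months", "3-6 months", "6-9 months", "9-12 months", "12+ months"]
df["Bucket"] = pd.cut(age_days, bins=bins, labels=labels, right=False)

Because the age is recomputed against today's date every time the script runs, newly added rows land in the correct group automatically.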

How to perform a function for every day on spreadsheet data of an entire year

I have Fitbit data for an entire year. I am calculating heart rate (HR) variability from the changes in HR over time. The function uses a root mean square: RMSSD = sqrt(average(X:Xn)).
I have already calculated the X values I need for the equation, but I want it to automatically detect the date, run the equation on all the values from that date, then go to the next date on the list and do the same. So, this will theoretically break a year of data down into 365 cells.
I've attached an image of my Excel file. Basically, I want to be able to run the function above on the RRS column for all the data within one day and return one value per day.
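For reference, if the spreadsheet is loaded into Python, the same one-value-per-day calculation can be sketched with pandas (the file name and the "Date"/"RRS" column names are assumptions based on the description):

import numpy as np
import pandas as pd

df = pd.read_excel("fitbit_data.xlsx")            # hypothetical file name
df["Date"] = pd.to_datetime(df["Date"]).dt.date   # keep only the calendar date

# RMSSD per day: square root of the mean of the already-computed RRS values
rmssd_per_day = df.groupby("Date")["RRS"].apply(lambda x: np.sqrt(x.mean()))
print(rmssd_per_day)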
Ahmed, you can perform that same mean calculation using AGGREGATE on any subset of data in SPSS Statistics that you can designate with one variable or a combination of variables. If you have a month variable, you simply break on that month variable if you want to combine all values over the month rather than the day.
In response to the request for an example: in the example data files that come with SPSS Statistics, open the file worldsales.sav. There are three variables in the file: Continent, Product, and Revenue. If you wanted to average (or sum, or apply a variety of other summary functions to) Revenue for each Product, you would choose Data > Aggregate, move the Product variable to the Break Variable(s) box, move Revenue into the Summaries of Variable(s) box, and, if desired, edit the function to change it from MEAN to something else, and perhaps change the name and/or label of the new variable. The new variable can be added back to the dataset (the default) or output to a new dataset or file, using the additional options in the Save section of the dialog box.
If you wanted to aggregate values by Continent, you could use that variable as the break variable instead of Product. If you wanted a distinct value for each Product on each Continent, you could include both variables as break variables, and you'd get one value per combination.

Python time series databases

I will have hourly temperature data for each city in the US, which needs some statistical and plotting post-processing using grandfathered tools in Python. Right now the list has only a few cities, but it will grow to include more.
Each city's hourly temperature data is thus a two-column vector: the first column is the hourly date-time and the second column is the temperature.
What open-source database tool would you suggest? I am trying to avoid keeping this temperature data in a CSV file.
Edit: I also receive a 10-day hourly temperature forecast each day for each city. I want to store the historical forecasts I receive from the vendor every day as well.
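For a workload of this size, one common lightweight option is SQLite through pandas, which avoids CSV files without requiring a database server. A minimal sketch (table and column names, city, and sample values are all made up for illustration):

import sqlite3
import pandas as pd

conn = sqlite3.connect("temperatures.db")

# Append a batch of hourly observations for one city
obs = pd.DataFrame({
    "city": ["Boston", "Boston"],
    "timestamp": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 01:00"]),
    "temp_f": [28.4, 27.9],
})
obs.to_sql("observed", conn, if_exists="append", index=False)

# The daily 10-day forecasts could go into a second table keyed by issue date.
# Read data back for statistics/plotting:
boston = pd.read_sql(
    "SELECT * FROM observed WHERE city = 'Boston'",
    conn,
    parse_dates=["timestamp"],
)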
