I have a MySQL database and I am trying to print all the test results for a specific student. The idea is a command-line tool where I enter the username and it shows his/her test results.
I have already visited this page, but it didn't give me the answer:
optparse and strings
# after connecting to mysql
cursor.execute("select * from database")

def main():
    parser = optparse.OptionParser()
    parser.add_option("-n", "--name", type="string", help="student name")
    (options, args) = parser.parse_args()
    studentinfo = []
    f = open("Index", "r")
    # Index is inside database; it is a folder that holds all kinds of files
Well, the first thing you should do is stop using optparse, as it's deprecated; use argparse instead. The documentation I linked to is quite useful and informative, guiding you through creating a parser and setting up the different options. After reading through it you should have no problem accessing the values passed from the command line.
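As a rough sketch, an argparse version of your option handling might look like this (only the -n/--name option is taken from your script; everything else is boilerplate):

import argparse

def main():
    parser = argparse.ArgumentParser(description="Show a student's test results")
    parser.add_argument("-n", "--name", required=True, help="student name")
    args = parser.parse_args()
    print(args.name)  # the value typed on the command line, e.g. python script.py -n alice

if __name__ == "__main__":
    main()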
However, there are other errors in your script as well that will prevent it from running. First, you can't open a directory with the open() call; you need os.listdir() for that, and then you open and read each file in the resulting list. It is also very much advisable to use a context manager when open()ing files:
import os

index_dir = "/path/to/Index"
for filename in os.listdir(index_dir):
    # os.listdir() returns bare file names, so join them back onto the directory
    with open(os.path.join(index_dir, filename), "r") as f:
        for line in f:
            pass  # do stuff with each line
This way you don't need to worry about closing the file handle later on, and it's just a generally cleaner way of doing things.
You don't provide enough information in your question about how the student scores are actually stored, so I can't be specific there. You'll (I assume) have to connect the data coming out of your database query with the files (and their contents) in the Index directory. I suspect that if the student scores are kept in the DB, then you should retrieve them from the DB with SQL instead of trying to read raw files in the filesystem. You can easily get the name of the student of interest from the command line, feed it into a SQL query that selects that student's rows of test scores, and then process the results with Python to print out a pretty summary, as sketched below.
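A minimal sketch of that last step, assuming a hypothetical test_results table with student_name, test_name and score columns (your real schema will differ; args.name comes from the argparse sketch above):

# the table and column names here are assumptions; adjust them to your schema
cursor.execute(
    "SELECT test_name, score FROM test_results WHERE student_name = %s",
    (args.name,),  # let the driver quote the value instead of formatting the string yourself
)
for test_name, score in cursor.fetchall():
    print("{}: {}".format(test_name, score))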
Good luck!
I am using Python to create a list of groups. It seems to me that, with the code I have for creating the password-protected zip, I can also create the input file in my code, as long as it is created before it is zipped. So I create a txt file, which then needs to be placed in a password-protected zip. With the code I am using below, I get this error message when I try to run it: OSError: error in opening /Users/name/Desktop/Groups.txt for reading. I'm simply not very experienced in this area and don't know how to solve the issue (and I am extremely desperate right now). This is the code I have so far, but it does not work:
# creates .txt file to put in zip folder
with open("Groups.txt", 'w') as file:
    file.write("Each row is a group:" + '\n')
    for row in final_group:
        s = ", ".join(map(str, row))
        file.write(s + '\n')

# creates password protected zip
inpt = "/Users/name/Desktop/Groups.txt"
pre = None
oupt = "/Users/name/Desktop/Groups.zip"
password = "password"
com_lvl = 5
pyminizip.compress(inpt, None, oupt, password, com_lvl)
Could someone help me out here?
The input file is probably in a different directory from the one you read it from, which is why you get the error: Groups.txt is written to the current working directory, but read from your Desktop. You can either:
set inpt = "Groups.txt", so compress() reads from the same working directory you wrote to, or
set with open("/Users/name/Desktop/Groups.txt", 'w') as file: so the file is written to the Desktop path you compress from.
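Put together, a sketch of the second option (the paths, final_group and compression settings are taken from your snippet, just rearranged so the write path and the compress path match):

import pyminizip

txt_path = "/Users/name/Desktop/Groups.txt"
zip_path = "/Users/name/Desktop/Groups.zip"

# write the file to the same path that will later be compressed
with open(txt_path, 'w') as file:
    file.write("Each row is a group:" + '\n')
    for row in final_group:
        file.write(", ".join(map(str, row)) + '\n')

pyminizip.compress(txt_path, None, zip_path, "password", 5)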
I am looking for more information regarding parsing files in Python, specifically from someone's GitHub. For instance, if Person A has a GitHub account with a file with the contents:
name = Person A
my script = scriptA.sh
script output = yarn output
my other files = fileA, fileB
I would want to be able to access this information and store it with my Python script.
This is not for a class or anything, so I am struggling to find good, clear, beginner-level information covering concepts like this. Does anyone have any advice here? I have a basic understanding of parsing and Python, but I want to take it further. Here is some pseudocode I am using to try to brainstorm:
def parseAddFile(filename):
    dict_of_contents = {}  # the parsed contents go into a dictionary
    with open(filename, 'r') as file:
        for line in file:  # read the file line by line
            if "=" in line:
                key, value = line.split("=", 1)  # e.g. "name = Person A" -> "name", "Person A"
                dict_of_contents[key.strip()] = value.strip()
    return dict_of_contents
Just looking to grow my knowledge, not just asking for answers.
I need to read an MSI file and run some queries against it. But it seems that, despite being a standard library module for Python, msilib has poor documentation.
To run queries I have to know the database schema, and I can't find any examples or methods to get it from the file.
Here is my code I'm trying to make work:
import msilib
path = "C:\\Users\\Paul\\Desktop\\my.msi" #I cannot share msi
dbobject = msilib.OpenDatabase(path, msilib.MSIDBOPEN_READONLY)
view = dbobject.OpenView("SELECT FileName FROM File")
rec = view.Execute(None)
r = v.Fetch()
And the rec variable is None. But I can open the MSI file with the InstEd tool and see that the File table is present in the table list and contains a lot of records.
What am I doing wrong?
Your code is suspect, as the last line will throw a NameError in your sample (v is never defined). So let's ignore that line.
The real problem is that view.Execute returns nothing of use. Under the hood, the MsiViewExecute function only returns success or failure. After you call it, you then need to call view.Fetch, which may be what your last line intended to do.
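A sketch of the corrected flow (path is the same variable as in your snippet; the end-of-rows behaviour of Fetch differs between Python versions, so both cases are handled here):

import msilib

dbobject = msilib.OpenDatabase(path, msilib.MSIDBOPEN_READONLY)
view = dbobject.OpenView("SELECT FileName FROM File")
view.Execute(None)              # just runs the query; the return value is not useful
while True:
    try:
        rec = view.Fetch()      # each call returns the next Record
    except msilib.MSIError:     # some versions raise when there are no more rows
        break
    if rec is None:             # others return None instead
        break
    print(rec.GetString(1))     # field indices start at 1
view.Close()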
I'm using Python's ftplib for the first time. My goal is simply to connect to an FTP site, get a directory listing, and then download all files that are newer than a certain date (e.g. all files created or modified within the last 5 days).
This turned out to be a bit more complicated than I expected for a few reasons. Firstly, I've discovered that there is no real "standard" FTP file list format. Most FTP sites conventionally use the UNIX ls format, but this isn't guaranteed.
So, my initial thought was to simply parse the UNIX ls format: it's not so bad after all, and it seems most mainstream FTP servers will use it in response to the LIST command.
This was easy enough to code with Python's ftplib:
import ftplib

def callback(line):
    print(line)

ftp = ftplib.FTP("ftp.example.com")
result = ftp.login(user="myusername", passwd="XXXXXXXX")
dirlist = ftp.retrlines("LIST", callback)
This works, except the problem is that the date given in the UNIX list format returned by the FTP server I'm dealing with doesn't have a year. A typical entry is:
-rw-rw-r-- 1 user user 1505581 Dec 9 21:53 somefile.txt
So the problem here is that I'd have to code in extra logic to sort of "guess" if the date refers to the current year or not. Except really, I'd much rather not code some complex logic like that when it seems so unnecessary - there's no reason the FTP server shouldn't be able to give me the year.
Okay, so after Googling around for some alternative ways to get LIST information, I've found that many FTP servers support the MLST and MLSD commands, which apparently provide a directory listing in a "machine-readable" format, i.e. a list format which is much more amenable to automatic processing. Great. So, I try the following:
dirlist = ftp.sendcmd("MLST")
print(dirlist)
This produces a single line response, giving me data about the current working directory, but NOT a list of files.
250-Start of list for /
modify=20151210094445;perm=flcdmpe;type=cdir;unique=808U6EC0051;UNIX.group=1003;UNIX.mode=0775;UNIX.owner=1229; /
250 End of list
So this looks great, and easy to parse, and it also has a modify date with the year. Except it seems the MLST command is showing information about the directory itself, rather than a listing of files.
So, I've Googled around and read the relevant RFCs, but can't seem to figure out how to get a listing of files in "MLST" format. It seems the MLSD command is what I want, but I get a 425 error when I try that:
File "temp8.py", line 8, in <module>
dirlist = ftp.sendcmd("MLSD")
File "/usr/lib/python3.2/ftplib.py", line 255, in sendcmd
return self.getresp()
File "/usr/lib/python3.2/ftplib.py", line 227, in getresp
raise error_temp(resp)
ftplib.error_temp: 425 Unable to build data connection: Invalid argument
So how can I get a full directory listing in MLST/MLSD format here?
There is another module, ftputil, which is built on top of ftplib and offers many features that emulate os, os.path and shutil. I found it pretty easy to use and robust for this kind of operation; maybe you could give it a try.
As for your purpose, the introductory code in its documentation covers it almost exactly.
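For instance, a rough sketch of the "files modified in the last 5 days" task with ftputil (the host name and credentials are placeholders, and server clock offsets may still need handling):

import time
import ftputil

cutoff = time.time() - 5 * 24 * 3600          # anything modified in the last 5 days

with ftputil.FTPHost("ftp.example.com", "myusername", "XXXXXXXX") as host:
    for name in host.listdir(host.curdir):
        if host.path.isfile(name) and host.path.getmtime(name) >= cutoff:
            host.download(name, name)          # remote name -> same local name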
You could also try ftplib's built-in MLSD support and see if it gets you what you need. A plain ftp.sendcmd("MLSD") fails with a 425 because MLSD sends its results over a data connection, which sendcmd never opens; FTP.mlsd() (available since Python 3.3) sets up the data connection for you:
for name, facts in ftp.mlsd("directory"):
    print(name, facts)
I am working on something similar, where I need to parse the contents of a directory and all subdirectories within it. However, the server I am working with does not allow the MLST command, so I accomplished what I needed by:
parsing the main directory contents,
looping through the main directory contents, and
appending each loop's output to a pandas DataFrame.
import pandas as pd

test = pd.Series(ftp.nlst('/target directory/'))
sub_listings = []
for i in test:
    data_dir = '/target directory/' + i
    sub_listings.append(pd.Series(ftp.nlst(data_dir)))
df_server_content = pd.DataFrame(pd.concat(sub_listings, ignore_index=True))  # DataFrame.append was removed in pandas 2.0
I have more than 40 txt files that need to be loaded into a table in MySQL. Each file contains 3 columns of data, each column holding one specific type of data; the format of every txt file is exactly the same, only the file names vary. First I tried LOAD DATA LOCAL INFILE 'path/*.txt' INTO TABLE xxx, because I thought the *.txt wildcard might let MySQL load all the txt files in the folder, but it turned out it doesn't.
So how can I get MySQL or Python to do this? Or do I need to merge them into one file manually first, then use the LOAD DATA LOCAL INFILE command?
Many thanks!
If you want to avoid merging your text files, you can easily "scan" the folder and run the SQL import query for each file:
import os

for dirpath, dirsInDirpath, filesInDirPath in os.walk("yourFolderContainingTxtFiles"):
    for myfile in filesInDirPath:
        sqlQuery = "LOAD DATA INFILE '%s' INTO TABLE xxxx (col1,col2,...);" % os.path.join(dirpath, myfile)
        # execute the query here using your mysql connector
        # I used string formatting to build the query, but you should use the safe placeholders
        # provided by the mysql api instead of %s, to protect against SQL injections
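For completeness, a rough sketch of what the execute step could look like with mysql-connector-python (the connection parameters and the table/column names are placeholders, not from the question):

import os
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="user", password="secret",
                               database="mydb", allow_local_infile=True)
cur = conn.cursor()

for dirpath, dirnames, filenames in os.walk("yourFolderContainingTxtFiles"):
    for myfile in filenames:
        if not myfile.endswith(".txt"):
            continue
        filepath = os.path.join(dirpath, myfile).replace("\\", "/")  # MySQL prefers forward slashes
        # these paths come from os.walk, not from user input, so direct formatting is tolerable here
        cur.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE xxxx (col1, col2, col3)" % filepath)

conn.commit()
cur.close()
conn.close()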
The simplest and most reliable way is to merge your data into one file. That's fairly easy using Python:
fout = open("out.txt", "a")

# first file:
for line in open("file1.txt"):
    fout.write(line)

# now the rest (NB_FILES is the number of files, so the range must include it):
for num in range(2, NB_FILES + 1):
    f = open("file" + str(num) + ".txt")
    for line in f:
        fout.write(line)
    f.close()  # not strictly needed, but good practice

fout.close()
Then run the command you know (... INFILE ...) to load that one file into MySQL. This works fine as long as the separators between columns are strictly the same in every file. Tabs are best in my opinion ;)
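If the file names don't actually follow the file1.txt ... fileN.txt pattern, a glob-based variant of the same merge (a sketch; the folder path is a placeholder) avoids hard-coding the count:

import glob

with open("out.txt", "w") as fout:
    for path in sorted(glob.glob("path/to/folder/*.txt")):
        with open(path) as f:
            fout.write(f.read())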