I'm developing a web application which creates visualizations of some data.
The data is taken from third parties, using their APIs, and imported into my database. The import will happen only sporadically, so my database will be fairly static.
The visualizations will be dynamically created in JavaScript, using d3.
When thinking about how to pass (and format) the data from the server to the client, I thought I could export it to a .csv file and then load it from JavaScript (d3 has a built-in CSV parser).
This way the csv file doubles as a caching system: it will be regenerated (and therefore the database queried) only if it is older than, say, a week.
My question is: where and how should I save the generated csv file? STATIC_ROOT, MEDIA_ROOT, another hardlinked directory?
Also, do you think the CSV approach is a good idea?
Sorry if these questions seem basic; I literally picked up both Django and d3 less than a week ago.
You can place the file in STATIC_ROOT; that would be a suitable location.
Two thoughts on the side:
Did you think about locking / mutexing the CSV file while it is being written? Or is it not a problem if a client gets half a CSV file when the request comes in at an unlucky moment? (A sketch of one common workaround follows below.)
CSV is not the standard way to transfer a data series to a JS client. I would probably write a JSON array to the file.
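For example, here is a rough sketch of that combination in Django, assuming a hypothetical Measurement model and output path. It writes the JSON array to a temp file and renames it into place, so a request never sees a half-written file:

```python
import json
import os
import tempfile
from pathlib import Path

from myapp.models import Measurement  # hypothetical model name


def export_measurements(path="static/data/measurements.json"):
    """Dump the query results as a JSON array, replacing the old file atomically."""
    rows = list(Measurement.objects.values("date", "value"))

    out = Path(path)
    out.parent.mkdir(parents=True, exist_ok=True)

    # Write to a temp file in the same directory, then rename it into place;
    # os.replace is atomic, so clients never see a half-written file.
    fd, tmp_path = tempfile.mkstemp(dir=out.parent, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(rows, f, default=str)  # default=str handles date objects
    os.replace(tmp_path, out)
```

On the client side, d3 can load the result with d3.json() just as easily as d3.csv().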
In Django, we usually store static files - the files used by the website to render content (like CSS and JS) - under STATIC_ROOT. Files under MEDIA_ROOT are usually media files, like images and videos, that Django lets the webserver serve. I would store the visualization data file under a data directory within my app (which goes under the main Django project directory). This article is a good resource on structuring your Django project.
As for using a CSV file as the data file that drives the visualization, I would prefer exporting your data as JSON, since it is a more compact notation. I would also assume that decoding JSON in JavaScript is faster than parsing CSV, although that will depend on other parameters, like the size and structure of the data in the file.
Related
I am relatively new to web development and very new to using Web2py. The application I am currently working on is intended to take a CSV upload from a user, generate a PDF file based on the contents of the CSV, and then allow the user to download that PDF. As part of this process I need to generate and access several intermediate files that are specific to each individual user (these files would be images, other PDFs, and some text files). I don't need to store these files in a database, since they can be deleted after the session ends, but I am not sure of the best way or place to store these files and keep them separate per session. I thought that maybe the subfolders in the sessions folder would make sense, but I do not know how to dynamically get the path to the correct folder for the current session. Any suggestions pointing me in the right direction are appreciated!
I was getting the error "TypeError: expected string or Unicode object, NoneType found", and ended up storing just a link in the session to the uploaded document (in the db, or perhaps in the upload folder in your case). I would store the upload so processing can proceed normally, and then clear out the values and delete the file if it is not 'approved'.
If the information is not confidential, in similar circumstances I simply write the temporary files under /tmp.
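If you go that route, here is a minimal sketch of one way to keep each session's scratch files separate. The app name and file names are placeholders, and how you obtain the session id depends on your framework:

```python
import os
import tempfile


def session_workdir(session_id):
    """Return a per-session scratch directory under the system temp folder."""
    path = os.path.join(tempfile.gettempdir(), "myapp", session_id)
    os.makedirs(path, exist_ok=True)
    return path


# e.g. write an intermediate file for this session; once the PDF has been
# generated, the whole directory can be removed with shutil.rmtree(workdir).
workdir = session_workdir("some-session-id")
with open(os.path.join(workdir, "intermediate.txt"), "w") as f:
    f.write("scratch data for this session")
```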
I am developing a web application in which users can upload Excel files. I know I can use the OPENROWSET function to read data from Excel into SQL Server, but I am refraining from doing so because this function requires a file path.
It seems kind of indirect, as I am uploading a file to a directory and then telling SQL Server to go look in that directory for the file, instead of just giving SQL Server the file.
The other option would be to read the Excel file into a pandas DataFrame and then use the to_sql function, but pandas' read_excel function is quite slow, and I am fairly sure the other method would be faster.
Which of these two methods is "correct" when handling file uploads from a web application?
If the first method is not frowned upon or "incorrect", then I am almost certain it is faster and will use it. I just want an experienced developer's thoughts or opinions. The web app's backend is Python and Flask.
If I am understanding your question correctly, you are trying to load the contents of an xls(x) file into a SQL Server database. This is actually not trivial to do, as depending on what is in the Excel file you might want to have one table, or more probably multiple tables based on the data. So I would step back for a bit and ask three questions:
What is the data I need to save, and how should it be structured in my SQL tables? Forget about Excel at this point; maybe just examine the first row of data and see how you need to save it.
How do I get the file into my web application? For example, when the user uploads a file you would use a POST form to send the file data to your server, and have your server save that file (for example on S3, in a /tmp folder, or in memory for temporary processing).
Now that you know what your input is (the xls(x) file and its location) and how you need to save your data (the SQL schema), it's time to decide what the best tool for the job is. Pandas is probably not a good fit unless you literally just want to load the file and dump it as-is, with minimal (if any) changes, into a single table. At this point I would suggest something like xlrd if you only have xls files, or openpyxl for xlsx files. This way you can shape your data any way you want and deal with, for example, users entering malformed dates, empty cells (should they default to something?), mismatched types, etc.
In other words, the task you're describing is not trivial at all. It will take quite a bit of planning and designing, and then a good deal of Python code once your design is decided. Feel free to ask more specific questions here if you need to (for example, how to capture the POST data of a file upload, or whatever else you need help with); a rough sketch of the upload and parsing steps follows below.
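To make questions 2 and 3 concrete, here is a rough sketch, not a finished design: a Flask route that accepts the upload, and an openpyxl pass that turns the rows into plain tuples you can insert with whatever SQL layer you use. The form field name, sheet handling, and validation are placeholders.

```python
import os
import tempfile

from flask import Flask, request
from openpyxl import load_workbook
from werkzeug.utils import secure_filename

app = Flask(__name__)


def rows_from_xlsx(path):
    """Yield each data row as a tuple, ready for an INSERT statement."""
    wb = load_workbook(path, read_only=True)
    ws = wb.active
    rows = ws.iter_rows(values_only=True)
    next(rows, None)  # skip the header row
    for row in rows:
        # This is where you handle malformed dates, empty cells, defaults, etc.
        yield row


@app.route("/upload", methods=["POST"])
def upload():
    uploaded = request.files["spreadsheet"]  # field name is illustrative
    tmp_path = os.path.join(tempfile.gettempdir(),
                            secure_filename(uploaded.filename))
    uploaded.save(tmp_path)
    data = list(rows_from_xlsx(tmp_path))
    # ... insert `data` into SQL Server with pyodbc / SQLAlchemy executemany ...
    return f"received {len(data)} rows"
```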
I am writing a bit of code that will take a CSV file input and perform an operation based on its contents. In the admin panel I am designing, the admin should be able to select a CSV file on their local system which my application will then read. The application does not need to store the CSV file, just read from it for a one-time operation.
Any ideas on how to best handle this in Pyramid?
What you want is essentially a file upload, followed by additional processing on the uploaded data. You can create input elements of type "file" in HTML forms to allow uploading of files.
Refer to the cookbook in the Pyramid documentation on file uploads for how to handle the uploaded data on the server side (summarized: use the file-like object request.POST[field_name].file).
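Here is a minimal sketch of such a view, assuming a form field named csv_file and a registered route; everything happens in memory and nothing is stored:

```python
import csv

from pyramid.view import view_config


@view_config(route_name="import_csv", request_method="POST", renderer="json")
def import_csv(request):
    upload = request.POST["csv_file"].file   # file-like object from the form
    upload.seek(0)
    text = upload.read().decode("utf-8")     # the upload arrives as bytes
    rows = list(csv.reader(text.splitlines()))
    # ... perform the one-time operation on `rows` here ...
    return {"rows_processed": len(rows)}
```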
Is there any add-on which will activate automatically while files are being uploaded into a Plone site? It should compress the files and then store them in the site. These can be image files like CAD drawings, or any other type. Irrespective of the file type, beyond a specific size they should get compressed and stored, rather than my compressing the files manually and storing them. I am using Plone 4.1. I am aware of the CSS and JavaScript files which get compressed, but not of uploaded files. I am also aware of the 'image handling' section in 'Site Setup'.
As Maulwurfn says, there is no such add-on, but this would be fairly straightforward for an experienced developer to implement using a custom content type. You will want to be pretty sure that the specific file types you're hoping to store will actually benefit from compression (many modern file formats already include some compression, and simply zipping them won't shrink them much).
Also, unless you implement something complex like a client-side Flash uploader with built-in compression, Plone can only compress files after they've been uploaded, not before, so if you're hoping to make uploads quicker for users, rather than to minimize storage space, you're facing a somewhat more difficult challenge.
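Plone APIs aside, the "compress only if it is over a size threshold and actually shrinks" logic itself is simple; here is a sketch using only the standard library (the threshold is arbitrary):

```python
import zlib

SIZE_THRESHOLD = 1024 * 1024  # 1 MB, purely illustrative


def maybe_compress(data: bytes):
    """Return (payload, compressed_flag); skip files that are small or incompressible."""
    if len(data) < SIZE_THRESHOLD:
        return data, False
    compressed = zlib.compress(data)
    if len(compressed) >= len(data):  # already-compressed formats barely shrink
        return data, False
    return compressed, True
```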
Developing a project of mine, I realize I need some level of persistence across sessions; for example, a user runs the application, changes some preferences, and then closes the app. The next time the user runs the app, whether after a reboot or 15 minutes later, I would like to retain the preferences that were changed.
My question relates to this persistence. Whether programming an application using the Win32 API or the MFC framework, or using newer tools for higher-level languages such as wxPython or wxRuby, how does one maintain the kind of persistence I refer to? Is it done as a temporary file written to disk? Is it saved in some registry setting? Is there some other layer it is stored in that I am unaware of?
I would advise doing it in two steps.
The first step is to save your prefs as a string. For that you can:
a) use any XML lib, or output XML by hand, to produce the string, and read it back from the string in the same way;
b) just use the pickle module to dump your prefs object as a string;
c) somehow generate a string from the prefs which you can read back as prefs, e.g. with YAML, ConfigParser, JSON, etc. JSON is actually a good option, as simplejson makes it very easy.
Once your methods for converting to and from a string are ready, you just need to store the string somewhere persistent and read it back next time. For that you can:
a) use wx.Config, which saves to the registry on Windows and to other places depending on the platform, so you don't have to worry about where it is saved, and you can read the values back in a platform-independent way (if you wish, you can also just use wx.Config directly for saving and reading the prefs);
b) save the prefs string directly to a file in the folder the OS assigns to your app, e.g. the AppData folder on Windows.
The benefit of converting to a string and then using wx.Config to save it is that you can easily change where the data is stored in the future; e.g., if there is ever a need to upload the prefs, you can just upload the prefs string.
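Here is a minimal sketch of that combination with wxPython and JSON; the app name and keys are placeholders:

```python
import json

import wx

app = wx.App(False)          # wx.Config wants a running wx application
config = wx.Config("MyApp")  # registry on Windows, a dotfile/plist elsewhere

prefs = {"theme": "dark", "font_size": 12}

# save: prefs -> string -> wx.Config
config.Write("prefs", json.dumps(prefs))
config.Flush()

# load: wx.Config -> string -> prefs (empty dict if nothing stored yet)
restored = json.loads(config.Read("prefs", "{}"))
```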
There are different methods to do this that have evolved over the years.
These methods include (but are not limited to):
Registry entries
INI files
XML files
Simple binary/text files
Databases
Nowadays, most people do this kind of thing with XML files residing in the user-specific AppData folder. How you do it is your choice: for example, for simple things a database can be overkill, and for huge persisted objects the registry would not be appropriate. You have to look at what you are doing and choose accordingly.
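For instance, here is a rough sketch of the XML-file-in-AppData approach in Python; the app name and element names are placeholders:

```python
import os
import xml.etree.ElementTree as ET

prefs_dir = os.path.join(os.environ.get("APPDATA", "."), "MyApp")
os.makedirs(prefs_dir, exist_ok=True)
prefs_path = os.path.join(prefs_dir, "settings.xml")

# save the preferences as a small XML document
root = ET.Element("settings")
ET.SubElement(root, "theme").text = "dark"
ET.SubElement(root, "font_size").text = "12"
ET.ElementTree(root).write(prefs_path)

# load them back on the next run
loaded = ET.parse(prefs_path).getroot()
theme = loaded.findtext("theme")
```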
Here is a very good discussion on this topic