Hi, my app has had visitors and I'd like to analyze the log file. Can I run a log analyzer program on the log file that Google App Engine allows us to download? Are third-party programs such as webalizer and visitors compatible?
Thank you
You can get a lot of meaningful and derived data out of Appstats instead of downloading and analyzing your log file. You may want to try those server-side stats and see if they fit the bill.
I suggest you use Google Analytics on your web app. If you want to do some sort of server-side visitor analytics (instead of the client-side JavaScript that Google Analytics uses), you'd have to store something in a database (BigTable on GAE) and run your own analytics.
I would like to automate the process of viewing logs in the dashboard and typing out the information (total messages sent in a time period, total errors, CPU usage, memory usage); this task is very time-consuming at the moment.
The info is gathered from the MuleSoft Anypoint Platform. I'm currently thinking of extracting all of the data with Python web scraping, but I don't know how to do that well.
Below is a screenshot of the website I'm trying to get the data from; you can choose to see the logs for a specific time and date. My question is: do I start learning Python web scraping, or is there another way of doing things that I'm just unaware of?
Logs website example
It doesn't make any sense to use web scraping. All services in Anypoint Platform have a REST API. Most of them are documented at https://anypoint.mulesoft.com/exchange/portals/anypoint-platform/. Scraping may break with any minor change to the UI; the REST API is stable.
The screenshot seems to be from Anypoint Monitoring. I see an Anypoint Monitoring Archive API in the catalog. I'm not sure if the API for getting Monitoring dashboard data is documented. You could alternatively use the older CloudHub Dashboards API; it is probably not exactly the same, but it should be a close approximation.
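If you go the REST API route, a minimal sketch in Python with the requests library could look like the following. The login endpoint is the standard Anypoint Platform one; the stats endpoint, the organization/environment IDs and the field names are placeholders you would replace with the documented API you find in Exchange.

    # Sketch: query an Anypoint Platform REST API instead of scraping the UI.
    # The monitoring endpoint below is a placeholder -- look up the exact path
    # for the API you need in the Exchange portal linked above.
    import requests

    BASE = "https://anypoint.mulesoft.com"

    def get_token(username, password):
        resp = requests.post(BASE + "/accounts/login",
                             json={"username": username, "password": password})
        resp.raise_for_status()
        return resp.json()["access_token"]

    def get_stats(token, org_id, env_id):
        headers = {"Authorization": "Bearer " + token}
        # Hypothetical path -- replace with the documented endpoint.
        url = "{0}/monitoring/api/v1/organizations/{1}/environments/{2}/stats".format(
            BASE, org_id, env_id)
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        return resp.json()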
At my company we want to build an application on Google App Engine that will manage user provisioning to Google Apps, but we do not really know which data source to use.
We have two proposals:
a spreadsheet that will contain the users' data; we would use the Spreadsheets API to fetch this data and use it for provisioning;
the Datastore, which would also contain the users' data; in this case we would use the Datastore API.
Please note that my company has 3,493 users, and we do not have a good picture of the advantages and disadvantages of each solution.
Any suggestions please?
If you use the Datastore API, you will also need to build a way to manage the user data in the system.
If you use Spreadsheets, the spreadsheet itself is how you manage the user data, so in that way managing the data is taken care of for you.
The benefit of the Datastore API is that you can integrate user data management seamlessly into your application; Spreadsheet integration would remain separate from your main application.
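If you do go with the Datastore, a minimal sketch of the user entity could look like this (old db API; the property names are only an example, adapt them to your provisioning data):

    from google.appengine.ext import db

    class ProvisionedUser(db.Model):
        email      = db.StringProperty(required=True)
        first_name = db.StringProperty()
        last_name  = db.StringProperty()
        active     = db.BooleanProperty(default=True)
        created    = db.DateTimeProperty(auto_now_add=True)

    # Store one user:
    # ProvisionedUser(email="jane@example.com", first_name="Jane").put()
    # Fetch the users still to be provisioned:
    # users = ProvisionedUser.all().filter("active =", True).fetch(1000)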
What is the best way to monitor website traffic for a Google App Engine hosted website?
It's fairly trivial to put some code in each page handler to record each page request to the datastore, and now (thanks stackoverflow) I have the code to log the referring site.
There's another question on logging traffic using the datastore, but it doesn't consider other options (if there are any).
My concern is that the datastore is expensive. Is there another way? Do people typically implement traffic monitoring, or am I being over-zealous?
If I do implement traffic monitoring via the datastore, what fields are recommended to capture? What's good and/or common practice?
I'd go with: timestamp, page, referer, IP address, and username (if logged in). Any other suggestions?
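For reference, the entity I have in mind would look roughly like this (db API; the names are made up):

    from google.appengine.ext import db

    class PageRequest(db.Model):
        timestamp = db.DateTimeProperty(auto_now_add=True)
        page      = db.StringProperty()
        referer   = db.StringProperty()
        ip        = db.StringProperty()
        username  = db.StringProperty()  # empty if not logged in

    # In each page handler, something like:
    # PageRequest(page=self.request.path,
    #             referer=self.request.referer or "",
    #             ip=self.request.remote_addr,
    #             username=current_user_nickname_or_empty).put()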
All of the items you mention are already logged by the built-in App Engine logger. Why do you need to duplicate that? You can download the logs at regular intervals for analysis if you need to.
People usually use Google Analytics (or something similar), as it does client-side tracking and gives more insight than server-side tracking.
If you only need server-side tracking then analysing the logs should be enough. The problem with the Log API is that it can be expensive, because it does not do real querying: for every log search it goes through all logs (within the range).
You might want to look at Mache, a tool that exports all GAE logs to Google BigQuery which has proper query functionality.
Another option would be to download the logs and analyse them with local tools. GAE logs are in Apache format, so there are plenty of tools available.
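For example, a rough local script for a downloaded log in Apache combined format might look like this (the regex is the usual combined-log pattern and "output.txt" is just the file name used elsewhere in this thread; adjust both to what appcfg actually gives you):

    import re
    from collections import Counter

    # Usual Apache combined log format: ip, identity, user, [time], "request",
    # status, size, "referer", "user agent".
    LINE_RE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
        r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')

    hits = Counter()
    with open("output.txt") as f:
        for line in f:
            m = LINE_RE.match(line)
            if m:
                hits[m.group("path")] += 1

    for path, count in hits.most_common(20):
        print(count, path)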
You can use the logging module, which comes with a separate quota limit.
7 MBytes spanning 69 days (1% of the Retention limit)
I don't know what the limit is, but that's a line from my app, so it seems to be quite large.
You can then add to the log with:

    import logging
    logging.debug("something to store")
If the log does not already contain what you need, you can then read it out locally with:

    appcfg.py --num_days=0 request_logs appname/ output.txt
Anything you write out via System.err.println (or the Python equivalent) will automatically be appended to the App Engine log. So you can create your own logging format, put println's on all your pages, and then download the log and grep for that format. For example, if this is your format:
MYLOG:url:userid:urlparams
then download the log and pipe it through grep ^MYLOG, and it will give you all the traffic for your site.
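If you'd rather stay in Python than use grep, the same filtering is a short script (the MYLOG format and "output.txt" are just the examples used above):

    from collections import Counter

    hits = Counter()
    with open("output.txt") as f:
        for line in f:
            if line.startswith("MYLOG:"):
                hits[line.strip()] += 1   # count identical MYLOG lines

    for entry, count in hits.most_common():
        print(count, entry)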
I would like to generate reports in PDF format with the following scenario: people would enter information on a web site, and after submitting, the data would be transferred to a JasperReports server and a PDF would be created.
Python would be the language of choice for my task.
Is this scenario plausible with current JasperReports software (open source or similar), could it be done, and what would be the steps in the right direction?
Is this scenario plausible with current JasperReports software (open source or similar)?
Yes.
Could it be done?
Yes.
And what would be the steps in the right direction?
Write a web server in Python. Your web server will allow a user to enter information on a web site; after submitting, the data would be transferred to the JasperReports server and a PDF would be created. Your web server would then provide the PDF back to the user.
You need to pick a framework, install the components, write the unit tests, write the code, debug the code and transition the code to production.
It's hard (given the question) to determine what part of this you actually need help with.
Write the interface for the user in the language of your choice. Then, with the data from the user, make a request to the JasperReports Server API to generate the report.
Make sure to account for the time the report may need to generate if you make the request synchronous.
Otherwise, the API allows you to generate a report and poll for its completion. When it's done, just send the file to the user.
If you use the second approach, don't point the client-side AJAX polling at the jasperserver directly, as you might not want it to be accessible from the internet; do the polling in the backend of your app.
More information about the REST web services for JasperReports Server is here: https://community.jaspersoft.com/documentation/jasperreports-server-web-services-guide/v550/rest-web-services-overview
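For the synchronous case, a minimal Python sketch against the rest_v2 reports service could look like this (server URL, report path and credentials are placeholders for your own setup):

    import requests

    JASPER = "http://reports.example.com:8080/jasperserver"
    REPORT = "/reports/samples/MyReport"   # repository path of the report unit
    AUTH = ("jasperadmin", "jasperadmin")  # placeholder credentials

    def run_report_pdf(params):
        url = "{0}/rest_v2/reports{1}.pdf".format(JASPER, REPORT)
        resp = requests.get(url, params=params, auth=AUTH)
        resp.raise_for_status()
        return resp.content  # raw PDF bytes to send back to the user

    # pdf_bytes = run_report_pdf({"customer_id": "42"})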
Good luck! :)
Use JasperReports Server to publish the report and use its REST interface to produce the output. See Render HTML to PDF in Django site, which shows a practical implementation of a Python REST client.
I am working on a Python app which runs on App Engine. Is there a way I can publish the app on each customer's AppSpot account, so that the app uses each customer's own cloud storage, instead of running the app on my AppSpot account with all the users storing their data in my cloud space?
Yes, absolutely.
You just need to have each client create an App Engine account with an application to which you have administrator access. You can adjust the settings on the application to forbid downloads of your code by the other administrators if that's appropriate for your agreement with the client. This also allows the clients to be billed directly for their instances' usage, and makes it completely impossible for data to leak between different clients' instances.
Using multiple applications for multiple clients who are licensing your application almost certainly does not violate part 4.4 of the TOS, although don't take this as legal advice.
No, you cannot do that. The app is hosted and run in the administrator's account, which would be yours. What you can do is release the source code and point your users to install it in their own appspot accounts, just like creating a new application.
I suppose it's not exactly what you need, but it can give you an idea of where to go. Please check out the DryDrop project. It is a small Python application you can ask each user to install on their account; they can then configure it to fetch your site files from your GitHub repo through the webhooks functionality. I haven't tried it, but, in theory, you update your site, commit it to your repo, and all users get your updated application automatically. You can share your thoughts if that works for you.
Maybe. If it's an open source app that you're giving away, you can publish the source and instruct users to upload it to their own accounts.
If you're selling the app, displaying ads or otherwise trying to monetize the service, you probably want to stick with one instance. Using multiple instances to avoid paying for quota usage is a direct violation of the App Engine TOS:
4.4. You may not develop multiple Applications to simulate or act as a single Application or otherwise access the Service in a manner intended to avoid incurring fees.
No. Writing an application that deploys other applications is in violation of the terms of service.
Note that we don't have any 'hard' limits; limits that aren't billing-enabled can be increased on application to us if you provide a reasonable use case.