Automating performance monitoring of a MuleSoft application - Python

I would like to automate the process of viewing logs in a dashboard and typing up the information (total messages sent in a time period, total errors, CPU usage, memory usage); this task is very time-consuming at the moment.
The info is gathered from the MuleSoft Anypoint Platform. I'm currently thinking of extracting all of the data with Python web scraping, but I don't know how to do that well.
You'll find below a screenshot of the website I'm trying to get the data from; you can choose to see the logs for a specific time and date. My question is: do I start learning Python web scraping, or is there another way of doing things that I am just unaware of?
Logs website example

It doesn't make any sense to use web scraping. All services in Anypoint Platform have a REST API. Most of them are documented at https://anypoint.mulesoft.com/exchange/portals/anypoint-platform/. Scraping may break with any minor change to the UI; the REST API is stable.
The screenshot seems to be from Anypoint Monitoring. I see the Anypoint Monitoring Archive API in the catalog. I'm not sure if the API for getting Monitoring Dashboards data is documented. You could alternatively use the older CloudHub Dashboards API. It is probably not exactly the same, but it will be a close approximation.
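If you go the REST route, a thin Python wrapper is enough to get started. Below is a minimal sketch: the /accounts/login endpoint belongs to the Access Management API, but treat the exact paths and response fields as assumptions to verify against the Exchange documentation for your account type (newer orgs use connected-app credentials instead of username/password).

```python
import json
import urllib.request

ANYPOINT = "https://anypoint.mulesoft.com"

def login_request(username, password):
    """Build the POST request that exchanges credentials for a bearer token
    via the Access Management API (verify the endpoint for your org)."""
    body = json.dumps({"username": username, "password": password}).encode()
    return urllib.request.Request(
        f"{ANYPOINT}/accounts/login",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def auth_headers(token):
    """Every subsequent Anypoint REST call carries the bearer token."""
    return {"Authorization": f"Bearer {token}"}

def fetch_json(path, token):
    """GET any Anypoint REST endpoint you find on the Exchange portal."""
    req = urllib.request.Request(f"{ANYPOINT}{path}", headers=auth_headers(token))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

From there, pulling message counts or resource usage is a matter of finding the right Monitoring endpoint on the Exchange portal and passing your time range as query parameters.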

Related

The right API/scrape method to extract tables from a Qlikview app

I'm trying to get some tables with specific filters from this QlikView page, for future analysis: http://transferenciasabertas.planejamento.gov.br/QvAJAXZfc/opendoc.htm?document=painelcidadao.qvw&lang=en-US&host=QVS%40srvbsaiasprd01&anonymous=true
I don't want to do it manually (downloading tables for every filter), so I searched the QlikView website for Python APIs, but only found Qlik Sense APIs for SSE (like this: https://github.com/qlik-oss/server-side-extension).
Is there any chance I could automate the retrieval process I described using Python?
Server-side extensions are used for something else. They extend Qlik's functionality to process data (for example, running statistical functions on top of the displayed data when such functions do not exist in Qlik natively).
Interestingly, the portal link (http://transferenciasabertas.planejamento.gov.br) is a QlikView app that later redirects to Qlik Sense app(s). It seems that anonymous users are allowed on the platform (which makes automating data retrieval easier).
Qlik Sense communicates with the browser via WebSockets. So the answer to your question is: yes. You can use Python to connect to the underlying Qlik Sense Engine, make selections, and get the data back.
The less good news is that I don't think there is a dedicated Python library, so you'll have to send the raw WebSocket requests yourself. The documentation for the Engine API can be found on Qlik's help site.
If you are open to a JS solution, then you can use Qlik's enigma.js library for Engine communication.
The WebSocket traffic can be monitored from the browser (to view what data is being sent/received and its format).
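Since there is no dedicated Python library, you build the Engine API's JSON-RPC frames yourself and send them over a WebSocket (for example with the `websocket-client` package). A minimal frame builder might look like the sketch below; the method names (OpenDoc etc.) come from the Engine API documentation, while the connection details around it (the wss:// URL for your server) are assumptions you'd confirm by watching the browser's WebSocket traffic.

```python
import json
from itertools import count

# Each request needs a unique id so responses can be matched back to calls.
_request_ids = count(1)

def engine_call(method, handle=-1, params=None):
    """Build one JSON-RPC 2.0 frame for the Qlik Engine API.

    Handle -1 addresses the global scope; OpenDoc returns a handle that
    later calls (GetObject, GetHyperCubeData, ...) are sent to."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_request_ids),
        "handle": handle,
        "method": method,
        "params": params or [],
    })
```

You would send `engine_call("OpenDoc", params=["painelcidadao.qvw"])` first, read the returned handle, then issue selection and data calls against that handle, exactly as the browser does.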

How do e-commerce websites integrate with ERP systems?

How does e-commerce usually handle integrations with ERP software?
We are working on a project for a client, who previously planned to use an ERP system that had a REST API.
This API allowed us to:
Place orders
Inform the ERP if the order was paid for
Get order status
Get all of the items available
Check item availability
Get user data
That would allow us to build a fairly complex online store with a lot of features.
Now the client wants to use another ERP system:
http://www.netsuite.com/portal/platform.shtml
I researched it, and the difficulty of integration surprised me: no REST API, some weird SOAP protocol to communicate with the system, and you have to write a lot of logic in SuiteScript. A whole new, different programming language just to build an integration with an online store? Why not just give developers access to an API to place orders and fetch items? And there are almost no docs available online for the thing. People on forums say that the system lacks documentation and that you have to figure it out yourself along the way.
Magento and Shopify integrations are done by third parties and look dodgy. Same thing with SAP ERP. Am I missing something? Why is something as basic as a REST API for e-commerce not available for these systems?
We are developing with Python/Django for the back-end and React.js for the front-end. What is the right way to integrate them with the ERP system?
NetSuite does have a REST API and web services. It's true that "you have to write a lot of logic using SuiteScript", but SuiteScript is just JavaScript, and there are many talented developers out there.
I'm not sure there is a "right way" but there are many ways to connect to the data.
My suggestion would be to contact a partner company, such as SWK Technologies. http://swktech.com
NetSuite has two main APIs, SuiteTalk and SuiteScript.
SuiteTalk is the Web Services API, which is SOAP-based and allows for pulling data from and updating NetSuite. The SuiteScript API is JavaScript-based and allows you to customize accounts and export data at the appropriate event in your business process. The term "SuiteCloud" encompasses all of the APIs and integration tools.
As for documentation, this is mostly available only to clients and partners. If you have a client who provides you with access to their account, you will gain access to the NetSuite Help Center and all relevant documentation.
Your options for integrating with the e-commerce platform depend on the exact platform; they range from webhooks to plain HTTP requests.
You can't say NetSuite is limiting developers; it depends on how you look at it. As I see it, NetSuite provides two main methods for developers: SuiteTalk and SuiteScript. With these, a developer can create their own API and define what kind of access those APIs should have.
SuiteTalk is SOAP-based.
I would suggest using SuiteScript to create your own API, using either a NetSuite RESTlet or a Suitelet.
These have an external-URL feature. By sending a request to this external URL, you can trigger your own custom functions written in SuiteScript. With SuiteScript you can create your own API and define your own functions, i.e. the developer is in full control.
The only problem I see with NetSuite is its high barrier to entry: there is no way to access the NetSuite Help Center without a client/partner/test account.
But obviously, those who need some kind of integration with NetSuite have a NetSuite account.
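To illustrate the external-URL mechanism from the caller's side: a deployed RESTlet gets a URL containing its script and deployment ids, and any HTTP client can POST JSON to it. This is a hedged sketch only; the account-specific domain pattern and the ids below are placeholders, and authentication (token-based OAuth 1.0a on current accounts) is deliberately left to the `headers` argument rather than guessed at here.

```python
import json
import urllib.request

def restlet_url(account_id, script_id, deploy_id):
    """The external URL NetSuite assigns to a deployed RESTlet
    (placeholder pattern -- copy the real URL from the deployment record)."""
    return (f"https://{account_id}.restlets.api.netsuite.com"
            f"/app/site/hosting/restlet.nl?script={script_id}&deploy={deploy_id}")

def call_restlet(url, headers, payload):
    """POST a JSON payload to the RESTlet; the SuiteScript function you
    deployed decides what the payload means and what comes back."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={**headers, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A Django back-end would call `call_restlet(...)` from its order-placement view, keeping the store's own API shape independent of NetSuite's.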

How to load test Django API?

I'm putting the finishing touches on my Django-based Webfaction-hosted server. It has a REST API that will deliver content to mobile devices, and will accept POST requests from a different source every few minutes. So far, the POST part is going fine -- it's been accepting data for a week now with no major issues.
However, I'm worried about what happens when I release the mobile app - I expect a fair number of users and I don't want to run into a dead app if my API can't handle the load from all the GET requests.
How do I load test my Django API? Are there tools available online to simulate several hundred GET requests at once, or should I build a test from scratch?
You can check out Locust.
It is an open-source load-testing tool that will help you test your API as well.

Google App Engine: traffic monitoring

What is the best way to monitor website traffic for a Google App Engine hosted website?
It's fairly trivial to put some code in each page handler to record each page request to the datastore, and now (thanks stackoverflow) I have the code to log the referring site.
There's another question on logging traffic using the datastore, but it doesn't consider other options (if there are any).
My concern is that the datastore is expensive. Is there another way? Do people typically implement traffic monitoring, or am I being over-zealous?
If I do implement traffic monitoring via the datastore, what fields are recommended to capture? What's good and/or common practice?
I'd go with: time-stamp; page; referer; IP address; username (if logged in). Any other suggestions?
All of the items you mention are already logged by the built-in App Engine logger. Why do you need to duplicate that? You can download the logs at regular intervals for analysis if you need.
People usually use Google Analytics (or something similar), as it does client-side tracking and gives more insight than server-side tracking.
If you only need server-side tracking, then analysing logs should be enough. The problem with the Log API is that it can be expensive, because it does not do real querying: for every log search it goes through all logs (within the range).
You might want to look at Mache, a tool that exports all GAE logs to Google BigQuery, which has proper query functionality.
Another option would be to download the logs and analyse them with local tools. GAE logs are in Apache format, so there are plenty of tools available.
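For the local-analysis route, the Apache combined log format is easy to parse with a regular expression. Here is a sketch that tallies requests per path; keep or drop capture groups (referer, user agent) depending on what you want to report.

```python
import re
from collections import Counter

# Apache "combined" log format: ip, identd, user, [timestamp],
# "METHOD path protocol", status, bytes, "referer", "user-agent".
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
    r'(?: "(?P<referer>[^"]*)" "(?P<agent>[^"]*)")?'
)

def page_counts(lines):
    """Tally requests per path; a minimal stand-in for a full log analyser."""
    counts = Counter()
    for line in lines:
        match = LOG_RE.match(line)
        if match:
            counts[match.group("path")] += 1
    return counts
```

Feed it the downloaded log file line by line (`page_counts(open("output.txt"))`) and you get the per-page traffic the question asks about, with no datastore writes at all.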
You can use the logging module and that comes with a separate quota limit.
7 MBytes spanning 69 days (1% of the Retention limit)
I don't know what the limit is but that's a line from my app so it seems to be quite large.
You can then add to the log with:
logging.debug("something to store")
If it does not already contain what you need, read it out locally with:
appcfg.py --num_days=0 request_logs appname/ output.txt
Anything you write out via System.err.println (or the Python equivalent) will automatically be appended to the App Engine log. So, for example, you can create your own logging format, add print statements on all your pages, and then download the log and grep for that format. For example, if this is your format:
MYLOG:url:userid:urlparams
then download the log and pipe it through grep ^MYLOG, and it will give you all the traffic for your site.
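The grep step can equally be done in Python once the log is downloaded. A small parser for the MYLOG:url:userid:urlparams format described above:

```python
from collections import Counter

def parse_mylog(lines, prefix="MYLOG"):
    """Extract (url, userid, urlparams) triples from lines that use the
    MYLOG:url:userid:urlparams custom format; other lines are skipped."""
    hits = []
    for line in lines:
        if line.startswith(prefix + ":"):
            # maxsplit=3 keeps any ':' inside the urlparams intact
            _, url, userid, urlparams = line.rstrip("\n").split(":", 3)
            hits.append((url, userid, urlparams))
    return hits

def traffic_by_url(lines):
    """Per-URL request counts, the headline number for a traffic report."""
    return Counter(url for url, _, _ in parse_mylog(lines))
```

Point it at the downloaded log file and you have per-page traffic and per-user activity without touching the datastore.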

How to live stream a stock price using Python

At first I wanted to find an API, but I have searched on the internet and didn't find anything really helpful.
By "real time" I mean live-streaming the stock price on a webpage without a refresh.
If there is no such API, would the following method be a good way to implement this?
1. Python side: call the Yahoo Finance API to get the most recent price.
2. Browser side: use Ajax to constantly call the server side to get the price and display it. More specifically, I am thinking of using setInterval in jQuery to achieve this.
How does this approach look?
Actually, this is not specific to stock price data; any website that needs to constantly retrieve data from the server side has to consider this problem, for example Google Chat or the Facebook news feed. Can anybody tell me, in general, how to achieve live streaming of data from server to browser?
Another way would be to use a push architecture. You could take a look at APE - Ajax Push Engine.
You could also take a look at Socket.IO, a real-time application framework for Node.js.
Hope this helps!
You should definitely use a push API. These days you should probably use WebSockets: http://www.websocket.org/
You don't want to use a REST API for real time; it's inefficient to constantly "pull" the live price. Instead you want a service that will "push" changes to you whenever new trades are executed on the exchange. This is done with a WebSocket, which is a type of API but is definitely different from a REST API. This article discusses the difference.
Intrinio provides a real-time WebSocket feed, and you can access it via Python using this SDK on GitHub. You can access the same data via REST API using this package in Python. If you try them both, you will see the architecture doesn't make sense with a REST API.
This video shows the trades coming in: trades don't execute on the market at regular intervals; they are completely sporadic. Instead of constantly "asking" the server for the data, it's better to "listen". This is called top of the book, meaning you get the newest trades as they come in from the top.
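Whatever feed you choose, the client side reduces to a small handler that applies each pushed message to an in-memory quote board, which your webpage then renders. The JSON shape below is an assumption for illustration only; a real feed (Intrinio's included) defines its own schema, and the transport would be a WebSocket client library that calls this handler for every incoming frame.

```python
import json

def handle_trade(message, latest):
    """Apply one pushed trade message to the in-memory quote board.

    `message` is a JSON string; the {"ticker": ..., "price": ...} shape
    here is hypothetical -- adapt it to your feed's documented schema."""
    trade = json.loads(message)
    latest[trade["ticker"]] = trade["price"]
    return latest
```

In a push architecture this function runs only when a trade actually happens, which is exactly the difference from setInterval polling: no wasted requests between trades, and no stale prices when trades arrive in bursts.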
