How to Figure Out the Relationships among Webpages in an Individual Website? - python

Normally, when we write test scripts, whether in Robot Framework or Behat, we manually find the locator of each element we care about on every web page using the browser's Developer Tools. We can write the scripts because we already know the flow, or steps, of an individual test scenario. However, I want to find an automated way to extract the relationships among the web pages inside a single website, without any manual input, and generate a script from that information.
The purpose of this question is to find a way to automatically detect the relationships between the pages of a single website, in order to derive test steps and ultimately build an automated test-script generator (producing Robot Framework or Behat scripts) for testing a website built with the Laravel framework, which typically contains many interrelated pages.
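So far, the only automated approach I can think of is to crawl the site myself and record which page links to which, since the link graph already encodes much of the page-to-page relationship. Below is a rough sketch of that idea, assuming requests and BeautifulSoup are available; BASE_URL is just a hypothetical local address for the Laravel app under test, and the sketch only collects the graph, it does not generate any test scripts yet.

    # Rough sketch: breadth-first crawl of one site, recording its internal link graph.
    # requests and BeautifulSoup are assumed dependencies; BASE_URL is hypothetical.
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    BASE_URL = "http://localhost:8000"  # hypothetical Laravel app under test

    def crawl_link_graph(start_url, max_pages=50):
        """Return a dict mapping each visited page URL to the internal URLs it links to."""
        graph = {}
        queue = [start_url]
        seen = {start_url}
        base_host = urlparse(start_url).netloc

        while queue and len(graph) < max_pages:
            url = queue.pop(0)
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue  # skip pages that fail to load
            soup = BeautifulSoup(html, "html.parser")
            links = set()
            for a in soup.find_all("a", href=True):
                target = urljoin(url, a["href"]).split("#")[0]
                if urlparse(target).netloc == base_host:  # stay inside the site
                    links.add(target)
            graph[url] = links
            for target in links:
                if target not in seen:
                    seen.add(target)
                    queue.append(target)
        return graph

    if __name__ == "__main__":
        for page, targets in crawl_link_graph(BASE_URL).items():
            print(page, "->", sorted(targets))

From a graph like this one could, in principle, emit one navigation step per edge (for example a SeleniumLibrary Go To / Click Link pair in Robot Framework), but detecting forms and multi-step flows would still need extra heuristics or manual annotation, which is exactly the part I am unsure about.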
Do you have any ideas about this?
Please share any suggestions in the comments below.

Related

What is the best way to integrate Python scripts into a website?

Specifically, I want to make a website/blog where people can track live data from some of my data-analysis scripts about stocks and such, which I have written in Python. My idea for now is just to display some tables.
I'm quite a novice with web hosting, however, so is it possible to import the files directly into WordPress or Wix or something, or do I have to embed them within the HTML code?
I would prefer using Python because of all the libraries and the stock-data API I use. Is it better to use JavaScript, though?
Lastly, I know about Django and Flask but the design of the website is very important for me too. That is why I want to use something like WordPress or Wix.
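The simplest thing I can picture, and this is only a sketch with placeholder numbers standing in for my real analysis, is to have the Python script regenerate a static HTML table (for example with pandas) that I then paste or upload into a WordPress or Wix HTML block:

    # Sketch only: regenerate a static HTML table from the analysis script.
    # The DataFrame contents are placeholders for the real stock data.
    import pandas as pd

    def build_table():
        df = pd.DataFrame(
            {"Ticker": ["AAPL", "MSFT"], "Signal": ["buy", "hold"], "Price": [190.12, 410.55]}
        )
        return df.to_html(index=False)

    if __name__ == "__main__":
        with open("stock_table.html", "w") as fh:
            fh.write(build_table())

As far as I understand, WordPress and Wix cannot execute Python themselves, so for truly live data I would eventually need a small Flask or Django endpoint (or a host that runs Python) and embed that, rather than a pasted static snippet.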

Convert Python to HTML using Django

I have a Python program that lets customers query a price. Each time, a customer inputs some necessary information and the program calculates and returns the price. Note: during the calculation, the program also needs to query a third-party map service web API (such as the Google Maps API or a similar service) to get some information.
I have a website developed using web-building tools such as Wix and Strikingly. These tools let you customize a web page by simply pasting in a block of HTML code. So I want to explore the possibility of using Django to convert my Python program into HTML (including adding some user interface elements such as a text box and a button), which could then be pasted into the website to form a dedicated page.
I am not sure whether this is doable, especially the part that connects to the third-party map service API. Would Django be able to convert this part to HTML automatically as well? (How does it deal with the API key and the connection?)
Python itself does not run in the browser; it is meant to be the backend in site development, whereas HTML is only the frontend, with no calculation or data fetching. Wix is a frontend tool with some content management; it offers customization, but only on the frontend (HTML/CSS), and there is little more you can do with its content management beyond the built-in table-like features. Trying to reuse the HTML that Wix generates is painful because of its CSS class-name optimization, which makes it hard to scale.
If you don't wish to learn frontend development at all, you could look for another HTML generator tool to produce the frontend code. From there, Django itself is capable of building the entire website, using the HTML you generated as templates and passing the data you've computed into those templates; that is exactly what Django is meant to do. In that case you would need to learn Django itself, which I would recommend if you intend to showcase your project as an interactive program rather than just console output.
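To make that concrete, here is a minimal sketch of the pattern, not a drop-in solution: the price calculation (including any call to the third-party map API) stays in Python on the server, and Django only renders the result into a template. calculate_price() and quote.html are placeholders for your own logic and template.

    # views.py: minimal sketch; calculate_price() and quote.html are
    # placeholders for your own logic and template.
    from django.shortcuts import render

    def calculate_price(origin, destination):
        # The third-party map API would be called here, server-side (e.g. with the
        # requests library), so the API key never reaches the browser.
        # The distance is stubbed for this sketch.
        distance_km = 12.5
        return 5.0 + 1.2 * distance_km

    def quote(request):
        price = None
        if request.method == "POST":
            price = calculate_price(request.POST.get("origin", ""),
                                    request.POST.get("destination", ""))
        return render(request, "quote.html", {"price": price})

This also answers the API-key question: Django does not "convert" the API call into HTML. The call runs on the server when the form is submitted, the key stays server-side, and only the computed price ends up in the page.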
Another alternative is to rewrite your Python code in JavaScript, which can do calculations and fetch from APIs, and JavaScript can be included directly in HTML with the <script> tag.

How to Connect Django with a Python-based Crawler?

Good day folks
Recently, I made a Python-based web crawler that scrapes some news articles, and a Django web page that collects a search title and URL from users.
But I do not know how to connect the Python-based crawler and the Django web page together, so I am looking for any good resources that I can reference.
If anyone knows of a resource I can reference, could you share it?
Thanks
There are numerous ways you could do this.
You could directly integrate them together. Both use Python, so the scraper would just be written as part of Django.
You could have the scraper feed the data to a database and have Django read from that database.
You could build an API that feeds data from the scraper to your Django application.
There are quite a few options for you depending on what you need.
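As a concrete illustration of the second option, here is a sketch of a Django management command that runs the existing scraper and writes straight into the database through the ORM. The Article model and the scrape_articles() helper are made up for the example and would need to match your actual project.

    # articles/management/commands/crawl.py: sketch of option two, running the
    # existing scraper inside Django so it writes through the ORM.
    # Article and scrape_articles() are hypothetical names.
    from django.core.management.base import BaseCommand

    from articles.models import Article            # assumed model with title and url fields
    from articles.scraper import scrape_articles   # the existing crawler, wrapped in a function

    class Command(BaseCommand):
        help = "Run the news crawler and store results via the Django ORM"

        def handle(self, *args, **options):
            # scrape_articles() is assumed to yield dicts like {"title": ..., "url": ...}
            for item in scrape_articles():
                Article.objects.update_or_create(
                    url=item["url"],
                    defaults={"title": item["title"]},
                )
            self.stdout.write(self.style.SUCCESS("Crawl finished"))

You could then run python manage.py crawl by hand, from cron, or from a task queue such as Celery, and your normal Django views simply query Article.objects.all().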

How to Combine HTML + CSS code with a Python function?

I have zero experience with website development, but I am working on a group project and am wondering whether it would be possible to create an interaction between a simple HTML/CSS website and my Python function.
Required functionality:
I need to take a simple string input from a text box on the website, pass it to my Python function, which returns a single list of strings as output, and then pass that list back to the website. I would just like a basic tutorial for achieving this. Please do not link me to the Python CGI documentation, as I have already read it and would like a more basic and descriptive explanation of how this is done. I would really appreciate your help.
First you will need to understand HTTP. It is a text-based protocol.
I assume that by "website" you mean a user agent, like Firefox.
Now, you're talking about an input box; that means you've already handled an HTTP request for your content. In most web applications this would actually be several requests (one for the dynamically generated application HTML, and more for the static CSS and JS files).
CGI is the most basic way to programmatically inspect already parsed HTTP requests and create HTTP responses from objects you've set.
Your application is simple enough that you could probably do all the HTTP parsing yourself to gain a basic understanding of what's going on, but you would still need to know how to write a server that listens on a socket.
To avoid all that, just pick a Python application server that has already implemented all of the above and much more. There are many Python application servers to choose from; for something as simple as this, use one with a small learning curve. Some in this category are labeled "micro-frameworks".
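For example, here is a minimal sketch with Flask, one such micro-framework; my_function() just stands in for your real function that returns a list of strings.

    # Minimal Flask sketch; my_function() stands in for the group project's code.
    from flask import Flask, request, render_template_string

    app = Flask(__name__)

    PAGE = """
    <form method="post">
      <input type="text" name="text">
      <input type="submit" value="Run">
    </form>
    <ul>
      {% for item in results %}<li>{{ item }}</li>{% endfor %}
    </ul>
    """

    def my_function(text):
        # Placeholder: the real function returns a list of strings.
        return [text.upper(), text.lower(), text[::-1]]

    @app.route("/", methods=["GET", "POST"])
    def index():
        results = []
        if request.method == "POST":
            results = my_function(request.form.get("text", ""))
        return render_template_string(PAGE, results=results)

    if __name__ == "__main__":
        app.run(debug=True)

Run it, open http://127.0.0.1:5000/, type a string into the box, and the list your function returns is rendered back on the same page.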
Have you considered creating an app on Google App Engine (https://developers.google.com/appengine/)?
Their Handling Forms tutorial seems to describe your problem:
https://developers.google.com/appengine/docs/python/gettingstartedpython27/handlingforms

Are bots different from crawlers from a Python/Django point of view?

Actually, I am confused by the terminology. I am studying Scrapy, and I think it is for crawling a website and extracting some data.
But I want to make some Python programs that behave the way actual users do; I mean automating tasks.
E.g. go to www.myblah.com, get the cheapest product in some category, and if it is less than my preset amount, send me an email.
Now I don't know whether these types of things count as crawling or something else.
Can I do that in Scrapy, or are there other libraries for doing those kinds of tasks?
Scrapy is a framework that can be used to create a bot or a crawler (aka spider). A crawler is a specific kind of bot, but a bot isn't necessarily a crawler. Crawlers are defined by being designed to explore the graph of pages (nodes) and their embedded URLs (edges), although they may be restricted from following particular URLs.
Automating tasks is the work of a bot. Whether Scrapy will work for that depends on what information is needed and what actions have to be taken. Many sites are heavy on JavaScript these days, so if the bot can't execute JavaScript and correctly handle cookies, it may not be able to get the information it needs for its task. Some web automation tasks may require a browser plug-in or even GUI automation tools.
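For the price-check example in the question, a full crawler is probably overkill; a small bot that fetches one category page is enough, provided the prices appear in the server-rendered HTML rather than being filled in by JavaScript. The sketch below uses requests and BeautifulSoup plus the standard library's smtplib; the URL, the .price CSS selector, the threshold, and the SMTP settings are all hypothetical and would need to match the real site and mail setup.

    # Sketch of a simple price-check bot; URL, selector, threshold and SMTP
    # settings are hypothetical and must be adapted to the real site.
    import smtplib
    from email.message import EmailMessage

    import requests
    from bs4 import BeautifulSoup

    URL = "https://www.myblah.com/category/widgets"   # hypothetical category page
    THRESHOLD = 20.0

    def cheapest_price():
        soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
        prices = [float(tag.text.strip().lstrip("$"))
                  for tag in soup.select(".price")]    # assumed CSS class for prices
        return min(prices) if prices else None

    def notify(price):
        msg = EmailMessage()
        msg["Subject"] = f"Cheapest widget is now {price}"
        msg["From"] = "bot@example.com"
        msg["To"] = "me@example.com"
        msg.set_content(f"The cheapest product dropped to {price}.")
        with smtplib.SMTP("localhost") as server:      # assumes a local mail server
            server.send_message(msg)

    if __name__ == "__main__":
        price = cheapest_price()
        if price is not None and price < THRESHOLD:
            notify(price)

Scheduled with cron (or Windows Task Scheduler), this covers the "act like a user and email me" use case without Scrapy, though Scrapy would be the better fit once you need to follow many category pages.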
