Google Chrome running in background without my knowledge - python

Ever since I started using Python Selenium with the Windows Task Scheduler to automate a script that scrapes the web using Google Chrome, I have had an issue where Chrome shows up in Task Manager and uses 60-70% CPU even though I haven't opened it. This is slowing my laptop significantly and I can't seem to solve it. If I end the Chrome task it's fixed, but after some time Chrome randomly pops up in Task Manager again and slows my laptop. If anyone has a solution to this, it would be a great help.
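A common cause of this is Selenium sessions that are never shut down: if the script exits (or the scheduler kills it) without calling `driver.quit()`, the chromedriver and Chrome processes it spawned stay alive in the background. One way to guard against that is a small context manager that always quits the driver. This is a minimal sketch; `driver_factory` is a hypothetical name for any callable that returns a WebDriver, e.g. `lambda: webdriver.Chrome()`:

```python
from contextlib import contextmanager

@contextmanager
def managed_driver(driver_factory):
    """Run a Selenium session and always shut it down, even on errors.

    driver_factory is assumed to be a callable returning a WebDriver,
    e.g. lambda: webdriver.Chrome().
    """
    driver = driver_factory()
    try:
        yield driver
    finally:
        # quit() ends the session and terminates the chromedriver/Chrome
        # processes it launched; close() alone only closes the window.
        driver.quit()
```

Used as `with managed_driver(lambda: webdriver.Chrome()) as driver: ...`, the browser processes should no longer linger in Task Manager after the scheduled run finishes or crashes.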

Related

Distinguishing Between Selenium Browsers and Manual Browsers

I'm using Python Selenium to execute a few daily tasks overnight. Unfortunately, one of the websites I'm logging into is strangely unreliable. Therefore, I call that task repeatedly until it works. Each time I start over, I have to use a new Chrome window. As a result, there are sometimes several erroneous Chrome windows open in the morning.
I would like to automatically close all these windows once all the tasks are complete. However, I don't want to close all instances of Chrome, because I often keep my normal Chrome windows open overnight. Is there a way to close just the instances of Chrome started by Selenium, even if they are no longer attached to a running script?
I'm not sure how to start!
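One approach is to never let a driver escape the retry loop in the first place: open a fresh driver per attempt and quit it in a `finally` block, so only the Selenium-launched windows are ever closed and your personal Chrome windows are untouched. A sketch, where `make_driver` and `task` are hypothetical names for the driver constructor and the unreliable login job:

```python
def run_with_retries(make_driver, task, max_attempts=5):
    """Retry task with a fresh driver each attempt; always quit the driver.

    make_driver: callable returning a new WebDriver (e.g. webdriver.Chrome)
    task: callable taking the driver and doing the unreliable work
    """
    last_error = None
    for _ in range(max_attempts):
        driver = make_driver()
        try:
            return task(driver)
        except Exception as exc:
            last_error = exc
        finally:
            # closes only this Selenium-started window, never your own
            driver.quit()
    raise last_error
```

With this shape there are no orphaned windows to hunt down in the morning, because every driver the script opens is guaranteed to be quit before the next attempt starts.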

How to keep a Selenium script, which needs to open a headed browser to scrape from, running even after the MacBook's lid is closed?

I am currently working on a project where I am using Selenium to scrape some data from a website. My script opens a headed browser and performs the scraping. The problem is that when I close the lid of my MacBook laptop, the script stops running.
Is there a way to keep the script running even after closing the laptop lid? I have tried searching for a solution online, but I have not found anything that works. I would appreciate any help that can be provided.
Thank you in advance!

Running Selenium Automated Scripts on Raspberry Pi 4

I’m relatively new to the Selenium package and have been using it for a couple of weeks. My current script uses Selenium to scrape data, I analyze the data by running a few tests, and if a data string passes those tests, Python texts me using Twilio. I’m currently using my Mac to run all of this, but I want to run the script every 5 minutes, headless, on a platform that doesn't require keeping my computer on. I have been looking at some potential solutions, and it seems as though running this on a headless Raspberry Pi is the right option. I was wondering if anyone sees any potential problems with doing so, as I haven’t seen a thread where someone uses Twilio. Also, I’ve run into problems trying to set up a cron task to automate it on my Mac because of Selenium, and was wondering whether this will be possible on the Pi (I'm looking at the Raspberry Pi 4). Sorry if this is a little long-winded; I appreciate the help.
Run the script through any CI/CD tool such as Jenkins, GoCD, or GitLab in a scheduled job, so that the script runs every 5 minutes on the specified agent node and you don't have to keep your computer on.
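Whether you schedule with cron on the Pi or a CI tool, one pitfall of a 5-minute interval is runs overlapping when a scrape takes longer than expected, which piles up headless browsers until the Pi runs out of memory. A minimal guard, assuming a Linux host like the Pi (the `/tmp/scraper.lock` path is just an illustrative choice), is an `flock`-based single-instance check at the top of the script:

```python
import fcntl
import sys

def ensure_single_instance(lock_path="/tmp/scraper.lock"):
    """Exit quietly if another run of the script still holds the lock.

    Prevents overlapping cron invocations from piling up headless
    browsers. The returned file object must stay referenced for the
    lifetime of the script, or the lock is released early.
    """
    lock_file = open(lock_path, "w")
    try:
        # non-blocking exclusive lock: fails immediately if already held
        fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        sys.exit(0)  # a previous run is still in progress
    return lock_file
```

The lock is released automatically when the process exits, even on a crash, so a hung run never blocks the schedule permanently once it is killed.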

Selenium Webdriver works fast when network/internet is turned off

I am facing a very peculiar issue with Selenium WebDriver. All of a sudden, my Selenium scripts written in Python have started to execute very slowly in both Firefox and Chrome.
Script execution returns to its normal speed when I manually turn off the network/internet with my laptop's Wi-Fi button after a webpage has loaded. The same scripts run at normal speed on another computer (on my office network) where I connect to the internet through a VPN.
Another example: when I run my scripts on a locally stored webpage file (.htm), they run very fast when the internet is turned off but equally slowly when it is turned on.
I do not use a proxy server.

Selenium Firefox randomly freezes under Debian

I'm running a Selenium test using Firefox-17, and it will randomly "freeze" - the window is visible but completely unresponsive. The mouse cursor that usually shows up when you hover over a link is active over the entire Firefox window, but I cannot actually interact with or click on the page or Firefox's menus. This only happens on the Debian machine, and only with Selenium. I use Firefox-28 for daily browsing, and I've never experienced any issues like this.
The code runs fine for several minutes, but then it always randomly freezes in the middle of requesting a new page. The process must then be force-killed.
Things I've tried:
Using firefox-28 - still freezes at random
Running the same code on my Windows machine - this runs for hours with no problem
Hypothesis:
I'm running the tests with python's multiprocessing. (For debugging purposes I've only been using one master queue that feeds to a single driver instance.)
Could this freezing problem be related to the forking mechanism used by Linux for multiprocessing?
Maybe it's somehow related to http://shallowsky.com/blog/linux/firefox-freeze-and-dbus.html - although I have no problems accessing the BBC podcast referenced in that link
I have other code that runs Firefox with JavaScript disabled, and it hasn't had any issues on this Debian machine. Could this be something to do with the JavaScript engine on Linux?
I finally figured out that this only occurs with JavaScript enabled in Firefox. Fortunately, Chromium doesn't have this JavaScript problem on the same Debian system.
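If the forking hypothesis is still worth testing, one low-effort experiment is to switch multiprocessing to the "spawn" start method, which launches each worker in a fresh interpreter rather than fork()ing the parent and inheriting its open state (D-Bus connections included). A sketch, where `scrape` is just a placeholder for the real Selenium/Firefox task:

```python
import multiprocessing as mp

def scrape(url):
    # placeholder for the real Selenium/Firefox work
    return url.upper()

if __name__ == "__main__":
    # "spawn" starts workers in a clean interpreter; the default fork()
    # on Linux copies the parent's state, which is one suspected source
    # of the random freezes described above
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=1) as pool:
        results = pool.map(scrape, ["http://example.com/a"])
    print(results)
```

If the freezes disappear under "spawn", that points at fork-inherited state rather than Firefox or the page's JavaScript itself.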

Categories

Resources