
TypeError: __init__() got an unexpected keyword argument 'service' error using Python Selenium ChromeDriver with company pac file

I've been struggling with this problem for some time, and now I'm coming back around to it. I'm attempting to use Selenium to scrape data from a URL behind a company proxy that is configured with a PAC file. I'm using ChromeDriver, and my browser picks up the PAC file from its configuration.

I've been trying to use desired_capabilities, but either the documentation is horrible or I'm not grasping something. Originally I was attempting to scrape with BeautifulSoup, which I had working, except the data I need now is rendered by JavaScript, which bs4 can't read.

Below is my code:

import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.proxy import Proxy, ProxyType
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Capabilities that point Chrome at the company PAC file
desired_capabilities = webdriver.DesiredCapabilities.CHROME.copy()

PAC_PROXY = {
    'proxyAutoconfigUrl': 'http://proxy-pac/proxy.pac',
}
proxy = Proxy()
proxy.proxy_autoconfig_url = PAC_PROXY['proxyAutoconfigUrl']
proxy.add_to_capabilities(desired_capabilities)
URL = "https://mor.nlm.nih.gov/RxClass/search?query=ALIMENTARY%20TRACT%20AND%20METABOLISM%7CATC1-4&searchBy=class&sourceIds=a&drugSources=atc1-4%7Catc%2Cepc%7Cdailymed%2Cmeshpa%7Cmesh%2Cdisease%7Cmedrt%2Cchem%7Cdailymed%2Cmoa%7Cdailymed%2Cpe%7Cdailymed%2Cpk%7Cmedrt%2Ctc%7Cfmtsme%2Cva%7Cva%2Cdispos%7Csnomedct%2Cstruct%7Csnomedct%2Cschedule%7Crxnorm"

service = Service(r'C:\Program Files\Chrome Driver\chromedriver.exe')  # raw string so the backslashes aren't treated as escapes
driver = webdriver.Chrome(service=service)
driver.get(URL)
# Note: .requests is provided by the selenium-wire package; a plain selenium driver has no such attribute
print(driver.requests[0].headers, driver.requests[0].response)

WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.CSS_SELECTOR, 'tr.dbsearch')))
print(pd.read_html(driver.page_source)[1].iloc[:,:-1])
pd.read_html(driver.page_source)[1].iloc[:,:-1].to_csv('table.csv',index=False)

I'm not sure why I'm receiving the error:

TypeError: __init__() got an unexpected keyword argument 'service'

even when I have the path added correctly to my system environment variables as shown below:

[screenshot: system environment variables showing the ChromeDriver path]

Essentially what I'm attempting to do is scrape the data in the table from https://mor.nlm.nih.gov/RxClass/search?query=ALIMENTARY%20TRACT%20AND%20METABOLISM%7CATC1-4&searchBy=class&sourceIds=a&drugSources=atc1-4%7Catc%2Cepc%7Cdailymed%2Cmeshpa%7Cmesh%2Cdisease%7Cmedrt%2Cchem%7Cdailymed%2Cmoa%7Cdailymed%2Cpe%7Cdailymed%2Cpk%7Cmedrt%2Ctc%7Cfmtsme%2Cva%7Cva%2Cdispos%7Csnomedct%2Cstruct%7Csnomedct%2Cschedule%7Crxnorm, store it in a pandas DataFrame, and write it to a CSV file.

asked Dec 28 '25 by user1470034

2 Answers

This error means your webdriver.Chrome() doesn't accept a service keyword, which is the case on Selenium v3.x. If you are still using Selenium v3.x then you shouldn't use Service(); in that case the key executable_path is relevant, and the line of code will be:

driver = webdriver.Chrome(executable_path=r'C:\Program Files\Chrome Driver\chromedriver.exe')
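If you're unsure which branch applies, you can check the installed version first (selenium exposes a standard version string):

import selenium
print(selenium.__version__)  # e.g. '3.141.0' -> use executable_path; '4.x.y' -> use Service()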

Else, if you are using Selenium 4 then you have to use Service(), and the key executable_path is no longer relevant. So your lines of code become:

service = Service(r'C:\Program Files\Chrome Driver\chromedriver.exe')
driver = webdriver.Chrome(service=service)
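Since the original question also needs the company PAC file, below is a minimal Selenium 4 sketch of one way to wire it in. The PAC URL and driver path are taken from the question; routing the proxy through set_capability() is one reasonable approach for this setup, not the only option:

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.proxy import Proxy

# Proxy object pointing at the company PAC file (URL from the question)
proxy = Proxy()
proxy.proxy_autoconfig_url = 'http://proxy-pac/proxy.pac'

# Serialize the proxy into a capabilities dict and attach it to the Chrome options
caps = {}
proxy.add_to_capabilities(caps)
options = Options()
options.set_capability('proxy', caps['proxy'])

service = Service(r'C:\Program Files\Chrome Driver\chromedriver.exe')
driver = webdriver.Chrome(service=service, options=options)

If Chrome ignores the proxy capability in your environment, passing the PAC URL as a command-line switch, options.add_argument('--proxy-pac-url=http://proxy-pac/proxy.pac'), is another route worth trying.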
answered Dec 30 '25 by undetected Selenium

I've been having this problem too, since I switched from a pip-installed Jupyter to Anaconda's Jupyter. This worked for me:

from webdriver_manager.chrome import ChromeDriverManager

driver = webdriver.Chrome(ChromeDriverManager().install())

Apparently you don't need the Service object in the Jupyter package.
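Note that passing the driver path positionally like this only works on Selenium 3.x, which likely explains why it worked in that environment. Under Selenium 4 the same webdriver-manager download can be wrapped in a Service; a minimal sketch, assuming the webdriver-manager package is installed:

from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager

# webdriver-manager downloads a chromedriver matching the installed Chrome and returns its path
service = Service(ChromeDriverManager().install())
driver = webdriver.Chrome(service=service)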

answered Dec 30 '25 by tshirtdr1


