Python Web Automation Using Selenium


What is Selenium?

Selenium is a browser automation framework; its Python bindings let you drive a real browser from a script. It allows you to:

  • Automate form filling
  • Click buttons and links
  • Scrape data from dynamic (JavaScript-loaded) websites
  • Perform automated testing

Install Required Tools

pip install selenium

You also need a browser-specific driver such as ChromeDriver, matching your installed Chrome version:

Download from: https://sites.google.com/chromium.org/driver/

Place it in your system path or specify its location.
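
Note: with Selenium 4.6 or newer, Selenium Manager resolves a matching driver automatically, so the manual download is often unnecessary. A minimal sanity check (assuming Chrome is installed):

from selenium import webdriver

driver = webdriver.Chrome()  # Selenium Manager fetches a matching ChromeDriver if none is on PATH
print(driver.capabilities["browserVersion"])
driver.quit()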



Basic Web Automation with Selenium


Step 1: Import and Launch Browser

from selenium import webdriver
from selenium.webdriver.common.by import By
import time

# To point at a specific driver binary instead of relying on PATH:
# from selenium.webdriver.chrome.service import Service
# driver = webdriver.Chrome(service=Service("path/to/chromedriver"))
driver = webdriver.Chrome()
driver.get("https://www.google.com")

Step 2: Search on Google Automatically

search_box = driver.find_element(By.NAME, "q")
search_box.send_keys("Python web automation with Selenium")
search_box.submit()

time.sleep(3)  # crude fixed wait; prefer the explicit waits shown later
print("Title:", driver.title)
driver.quit()


Common Selenium Tasks

Fill Forms Automatically

driver.get("https://example.com/login")
driver.find_element(By.NAME, "username").send_keys("myusername")
driver.find_element(By.NAME, "password").send_keys("mypassword")
driver.find_element(By.ID, "login-btn").click()
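
If a field is pre-filled, call clear() before typing; pressing Enter in the password field is usually equivalent to clicking the login button. A minimal sketch, assuming the same field names as above:

from selenium.webdriver.common.keys import Keys

user = driver.find_element(By.NAME, "username")
user.clear()                 # remove any pre-filled text
user.send_keys("myusername")

password = driver.find_element(By.NAME, "password")
password.send_keys("mypassword")
password.send_keys(Keys.RETURN)  # submit by pressing Enter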

Click Buttons or Links

driver.find_element(By.LINK_TEXT, "Next Page").click()
driver.find_element(By.CLASS_NAME, "submit-button").click()

Extract Dynamic Page Data

elements = driver.find_elements(By.CLASS_NAME, "item-title")
for el in elements:
    print(el.text)
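
Besides visible text, you can read attributes such as link targets with get_attribute(). The a.item-link selector below is an assumption; adjust it to the page you are scraping:

links = driver.find_elements(By.CSS_SELECTOR, "a.item-link")
for link in links:
    print(link.text, "->", link.get_attribute("href"))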

Select Dropdown Options

from selenium.webdriver.support.ui import Select

dropdown = Select(driver.find_element(By.ID, "dropdown-id"))
dropdown.select_by_visible_text("Option 2")
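
Select can also pick options by their value attribute or by position; the values below are illustrative:

dropdown.select_by_value("option-2")  # matches <option value="option-2">
dropdown.select_by_index(0)           # first option in the list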

Take Screenshot

driver.save_screenshot("screenshot.png")


Headless Browser Mode (No GUI)

from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")  # the old options.headless flag is deprecated in recent Selenium versions

driver = webdriver.Chrome(options=options)
driver.get("https://example.com")
print(driver.title)
driver.quit()


Wait for Elements to Load

from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "dynamic-content"))
)
print(element.text)
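
The same pattern works for clicking elements that appear after a delay. The "load-more" ID below is hypothetical:

button = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.ID, "load-more"))
)
button.click()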


Exporting Scraped Data to CSV

import csv

items = driver.find_elements(By.CLASS_NAME, "item")
with open('items.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(["Item Name"])
    for item in items:
        writer.writerow([item.text])
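
To export several fields per row, collect them into dictionaries and use csv.DictWriter. The item-title and item-price child classes below are assumptions; adjust them to the real markup:

import csv

rows = []
for card in driver.find_elements(By.CLASS_NAME, "item"):
    title = card.find_element(By.CLASS_NAME, "item-title").text
    price = card.find_element(By.CLASS_NAME, "item-price").text
    rows.append({"Item Name": title, "Price": price})

with open("items.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Item Name", "Price"])
    writer.writeheader()
    writer.writerows(rows)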


Handling Logins and Sessions

You can maintain login state across sessions by saving the browser's cookies and restoring them later, as sketched below:

cookies = driver.get_cookies()
# Save cookies and reuse them in next session
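
A minimal sketch of saving and restoring cookies with pickle; note that add_cookie() only accepts cookies for the domain the browser is currently on:

import pickle

# After logging in, persist the session cookies
with open("cookies.pkl", "wb") as f:
    pickle.dump(driver.get_cookies(), f)

# In a later session, open the same site first, then restore the cookies
driver.get("https://example.com")
with open("cookies.pkl", "rb") as f:
    for cookie in pickle.load(f):
        driver.add_cookie(cookie)
driver.refresh()  # reload so the restored session takes effect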