Selenium is an excellent tool for testing and even for interacting with third-party data sources. Combined with the scalability and flexibility of Kubernetes, it gives you powerful testing infrastructure.

How do you get started?  

Step 1: Set up your Docker Image with Chrome on Ubuntu 18.04

FROM ubuntu:18.04

#update environment
RUN apt-get -y update
RUN apt-get -y --with-new-pkgs upgrade
RUN apt-get -y autoremove

#install curl and wget (wget is needed to fetch the Chrome package below)
RUN apt-get -y install curl wget

#install chrome
RUN apt-get -y install lsb-release libappindicator3-1
RUN wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
RUN dpkg -i google-chrome-stable_current_amd64.deb || true
RUN apt-get -fy install

#install node via the NodeSource setup script (substitute the Node.js major version you want)
RUN curl -sL https://deb.nodesource.com/setup_14.x | bash -
RUN apt-get -y install nodejs
RUN node --version
RUN npm --version

(Optional) I also like to use PM2 to manage the Node.js process. Add it to the Dockerfile:

#install pm2
RUN npm install pm2 -g --production

You can either install ChromeDriver in the Dockerfile or in your Node.js code.  In this example, I'll install it through the code.

Step 2: Install ChromeDriver and Selenium WebDriver

More than one npm package can install ChromeDriver for you; I prefer one that adds support for the newest ChromeDriver versions more quickly. Install it alongside Selenium WebDriver:

npm i <your preferred ChromeDriver package>
npm i selenium-webdriver

Step 3: Initialize ChromeDriver

This routine handles configuring Chrome for basic web automation, including file downloads.

const RunChromeHeadless = true;
const chromeDriver = require(''); //the ChromeDriver npm package you installed above
const { Builder, logging } = require('selenium-webdriver');
const Chrome = require('selenium-webdriver/chrome');

const getDriver = async (downloadFolder, enablePerformanceLogging = false) => {
  const options = new Chrome.Options();
  options.addArguments('--no-sandbox'); //needed to run as root on ubuntu server
  options.setUserPreferences({
    'plugins.always_open_pdf_externally': true,
    'Browser.setDownloadBehavior': 'allow',
    'download.default_directory': downloadFolder
  });

  if (enablePerformanceLogging) {
    //set performance logs including network events so we can detect / track downloads
    let logPrefs = new logging.Preferences();
    logPrefs.setLevel(logging.Type.PERFORMANCE, logging.Level.ALL);
    options.set('goog:loggingPrefs', logPrefs);
  }

  if (RunChromeHeadless) {
    options.addArguments('--headless');
  }

  const service = new Chrome.ServiceBuilder(chromeDriver.binPath());
  const driver = await new Builder()
    .forBrowser('chrome')
    .setChromeService(service)
    .setChromeOptions(options)
    .build();

  //when running headless, you must enable file downloads manually
  await driver.sendDevToolsCommand('Page.setDownloadBehavior', { behavior: 'allow', downloadPath: downloadFolder });

  const timeouts = {
    implicit: 0,
    pageLoad: 600000, //10 minutes
    script: 300000    //5 minutes
  };
  await driver.manage().setTimeouts(timeouts);
  return driver;
};

module.exports.getDriver = getDriver;

Step 4: Use the Driver

const driverProvider = require('./driverProvider.js');

const loadGoogle = async () => {
  const driver = await driverProvider.getDriver('/tmp/downloads'); //any writable download folder
  try {
    await driver.get('https://www.google.com');
  } finally {
    await driver.quit();
  }
};

Step 5: Configure and Deploy your Docker Image to Kubernetes

Add your Node.js project code to the Dockerfile, build the container, and you are ready to deploy your Docker image to your preferred container host. Options include a VM, Azure Container Instances (ACI), Azure Web Apps, or a managed Kubernetes service (EKS, AKS, GKE).
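
For the Kubernetes route, a Deployment manifest is usually all you need to run the container. Here is a minimal sketch; the image name, registry, and resource numbers are placeholders, and Chrome is memory-hungry, so give it headroom:

```yaml
# deployment.yaml - minimal sketch; image name and resources are placeholders
apiVersion: apps/v1
kind: Deployment
metadata:
  name: selenium-worker
spec:
  replicas: 1
  selector:
    matchLabels:
      app: selenium-worker
  template:
    metadata:
      labels:
        app: selenium-worker
    spec:
      containers:
        - name: selenium-worker
          image: myregistry.azurecr.io/selenium-worker:latest  # your image
          resources:
            requests:
              memory: "1Gi"
              cpu: "500m"
            limits:
              memory: "2Gi"  # Chrome can use a lot of memory under load
```

Apply it with kubectl apply -f deployment.yaml, and scale by raising replicas.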