Associate an Elastic IP Address with your Cluster

It may be helpful to assign a static IP address to your Spark driver for the following use cases:

  • Point your BI tool to a static IP address for JDBC integration.
  • Whitelist a static IP address in a security group to allow your Spark driver to connect to an external database.

Important

This functionality is not available in the Standard Plan. For an alternative solution, see Proxy Traffic Through a NAT gateway.

Step 1: Create an elastic IP address

  1. Create an elastic IP address in AWS. Note this IP address for Step 2. (A scripted alternative using boto3 is sketched just after this list.)

    Note

    Elastic IP addresses cost money while they are not associated with a running instance, so release any elastic IPs you no longer need from your AWS account. (An instance here is an EC2 virtual machine that is part of your Databricks cluster.)

  2. Provision an IAM policy in AWS that allows you to associate an elastic IP address with an instance. Here’s a sample IAM policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "ec2:DescribeAddresses",
            "ec2:AssociateAddress",
            "ec2:DisassociateAddress"
          ],
          "Resource": "*"
        }
      ]
    }
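
If you prefer to script the allocation rather than use the AWS console, a minimal boto3 sketch might look like the following. This is an illustration only: the region name is a placeholder, and it assumes your AWS credentials are already configured.

    # Illustrative sketch: allocate an Elastic IP with boto3 instead of the console.
    # Assumes AWS credentials are configured; the region below is a placeholder.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")

    # Allocate a VPC-scoped Elastic IP address.
    response = ec2.allocate_address(Domain="vpc")
    print("Elastic IP:", response["PublicIp"])
    print("Allocation ID:", response["AllocationId"])  # keep this for Step 2 and for later release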
    

Note

You must rerun the notebook in Step 2 each time you start a new Spark cluster. Alternatively, you can put this code in an init script so that it runs automatically when the cluster launches. After an elastic IP address is attached, the public IP address of the EC2 instance may change, so places that display the IP address, such as the Spark UI or the Clusters API, may show an out-of-date address. Databricks attempts to retain the address of the Spark driver, but this behavior is not guaranteed.

Step 2: Associate the cluster with the elastic IP address
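
The notebook for this step uses the EC2 API to attach the elastic IP address to the cluster's driver instance. As a rough, non-authoritative sketch of what such a notebook cell might contain, here is one approach using boto3. It assumes the cluster's instance profile carries the IAM policy from Step 1, that the boto3 and requests libraries are available on the driver, and that the instance metadata service answers without an IMDSv2 token; the allocation ID and region are placeholders.

    # Illustrative sketch: associate an Elastic IP with this cluster's driver instance.
    # Assumes the instance profile includes the IAM policy from Step 1.
    import boto3
    import requests

    ALLOCATION_ID = "eipalloc-0123456789abcdef0"  # placeholder: your Elastic IP's allocation ID

    # Look up this driver's instance ID from the EC2 instance metadata service.
    # (If your instances enforce IMDSv2, fetch a session token first.)
    instance_id = requests.get(
        "http://169.254.169.254/latest/meta-data/instance-id", timeout=2
    ).text

    ec2 = boto3.client("ec2", region_name="us-west-2")  # placeholder region
    ec2.associate_address(AllocationId=ALLOCATION_ID, InstanceId=instance_id)
    print(f"Associated {ALLOCATION_ID} with driver instance {instance_id}")

As the note above explains, the same code can be packaged as an init script so that it runs automatically each time the cluster launches.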