Jobs Access Control

Note

Access control is available only in the Databricks Operational Security Package.

By default, all users can create and modify jobs unless an administrator enables jobs access control. With jobs access control, individual permissions determine a user’s abilities. This topic describes the individual permissions and how to enable and configure jobs access control.

Job permissions

There are five permission levels for jobs: No Permissions, Can View, Can Manage Run, Is Owner, and Can Manage. The Can Manage permission is reserved for administrators. The table lists the abilities for each permission.

Ability                                      No Permissions  Can View  Can Manage Run  Is Owner  Can Manage (admin)
View job details and settings                x               x         x               x         x
View results, Spark UI, logs of a job run                    x         x               x         x
Run now                                                                x               x         x
Cancel run                                                             x               x         x
Edit job settings                                                                      x         x
Modify permissions                                                                     x         x
Delete job                                                                             x         x
Change owner                                                                                     x
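
In workspaces that also expose the Databricks Permissions REST API (available in later releases), these levels surface as CAN_VIEW, CAN_MANAGE_RUN, IS_OWNER, and CAN_MANAGE, and you can read a job's current ACL programmatically. A minimal sketch in Python, assuming a personal access token in DATABRICKS_TOKEN, the workspace URL in DATABRICKS_HOST, and a hypothetical job ID:

    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ["DATABRICKS_TOKEN"]  # a personal access token
    job_id = 123                            # hypothetical job ID

    # Read the job's access control list (Permissions API; availability
    # depends on your Databricks release).
    resp = requests.get(
        f"{host}/api/2.0/permissions/jobs/{job_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()

    # Each entry pairs a principal with permission levels from the table above.
    for entry in resp.json().get("access_control_list", []):
        principal = entry.get("user_name") or entry.get("group_name")
        levels = [p["permission_level"] for p in entry.get("all_permissions", [])]
        print(principal, levels)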

Note

  • The creator of a job has Is Owner permission.
  • A job cannot have more than one owner.
  • A job cannot have a group as an owner.
  • Jobs triggered through Run Now assume the permissions of the job owner, not the user who issued Run Now. For example, even if job A is configured to run on an existing cluster accessible only to the job owner (user A), a user (user B) with Can Manage Run permission can still start a new run of the job (see the sketch after this list).
  • You can view notebook run results only if you have the Can View or higher permission on the job. This keeps jobs access control intact even if the job notebook is renamed, moved, or deleted.
  • Jobs access control applies to jobs displayed in the Databricks Jobs UI and to their runs. It doesn’t apply to runs spawned by notebook workflows or to runs submitted through the API, whose ACLs are bundled with their notebooks.
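
To make the Run Now behavior concrete, the sketch below triggers a run through the Jobs API as user B, who holds only Can Manage Run; the run itself still executes with the owner's (user A's) permissions. Environment variables are as in the earlier sketch, and the job ID is again hypothetical:

    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]  # user B's token; Can Manage Run suffices

    # Start a new run of the job. Per the note above, the run executes with
    # the job owner's permissions, not user B's.
    resp = requests.post(
        f"{host}/api/2.0/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": 123},  # hypothetical job ID
    )
    resp.raise_for_status()
    print("Started run:", resp.json()["run_id"])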

Note

Jobs access control was introduced in the September 2017 release of Databricks. Customers with cluster access control enabled automatically have jobs access control enabled.

For jobs that existed before September 2017, jobs access control changes behavior for customers who had cluster access control enabled. Previously, access to a job’s run results was coupled to access control on the job notebook: a user could view a notebook job’s run results if the user could view the job notebook. Databricks initializes jobs access control settings to be compatible with those previous settings as follows:

  • Job creators are granted the Is Owner permission and administrators are granted the Can Manage permission.

  • Databricks grants users who can view the job notebook the Can View permission on the job. This preserves the view access control on notebook jobs.

    Can View permission applies to the notebook results of all historical runs. However, it doesn’t apply to clusters that the job created before jobs access control was available. For example, suppose a job has a completed run (say run 1) that created a cluster C1 and ran notebook N1, and the job was later set to run notebook N2. Users with Can View permission can view run 1 but cannot view the Spark UI or driver logs of cluster C1. You can use cluster access control to control access to C1.

Enable jobs access control

  1. Go to the Admin Console.

  2. Select the Access Control tab.

  3. Click the Enable button next to Cluster and Jobs Access Control.

  4. Click Confirm to confirm the change.

Configure job permissions

You must be an administrator or have the Is Owner permission on the job.

  1. Go to the details page for a job.

  2. Click Advanced.

  3. Click the Edit link next to Permissions.

  4. In the pop-up dialog box, assign job permissions via the drop-down menu beside a user’s name.

  5. Click Save Changes.
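
If you prefer to script these assignments, the same permissions can be set through the Permissions REST API, where your release exposes it. A hedged sketch, using a hypothetical job ID and hypothetical principal names; PATCH merges the listed entries into the job's existing ACL rather than replacing it:

    import os
    import requests

    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]  # must belong to an admin or the job owner
    job_id = 123                            # hypothetical job ID

    # Grant Can View to a user and Can Manage Run to a group (names are
    # hypothetical). Use PUT instead of PATCH to replace the entire ACL.
    resp = requests.patch(
        f"{host}/api/2.0/permissions/jobs/{job_id}",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "access_control_list": [
                {"user_name": "user-b@example.com", "permission_level": "CAN_VIEW"},
                {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
            ]
        },
    )
    resp.raise_for_status()

Note that a group can be granted Can Manage Run as above, but, per the notes earlier, a group cannot be made a job's owner.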