Dataproc Console

In the Analytics Environment, the Dataproc Console is used primarily with Spark Jobs Submit to submit and run PySpark or Python jobs on a distributed Hadoop cluster. See "Determining the Applications to Use."
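
A job submitted this way is typically a self-contained PySpark script. The following is a minimal sketch, assuming hypothetical Cloud Storage paths and a column named "segment":

    # Minimal PySpark script suitable for submission as a Dataproc job.
    # The bucket paths and column name below are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("example-job").getOrCreate()

    # Read an input dataset and compute a simple aggregate.
    df = spark.read.parquet("gs://your-bucket/input/")
    counts = df.groupBy("segment").count()

    # Write the results back to Cloud Storage.
    counts.write.mode("overwrite").parquet("gs://your-bucket/output/")

    spark.stop()

A script like this can be submitted from the console's job submission form, or from the command line with gcloud dataproc jobs submit pyspark.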

You can use Spark Jobs Submit for the following tasks with your Safe Haven data:

  • Running ML models (see the sketch after this list)

  • Running long-duration PySpark code to optimize resource usage

  • Creating modules in PySpark
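
As an illustration of the first task above, the sketch below trains a simple Spark MLlib model; the input path, feature columns, and label column are hypothetical placeholders:

    # Minimal MLlib sketch: train a logistic regression model as a Spark job.
    # The input path and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("ml-model-example").getOrCreate()

    # Load a labeled feature table.
    df = spark.read.parquet("gs://your-bucket/features/")

    # Combine feature columns into the single vector column MLlib expects.
    assembler = VectorAssembler(inputCols=["age", "spend"], outputCol="features")
    train = assembler.transform(df)

    # Fit the model and persist it for later scoring.
    model = LogisticRegression(labelCol="label", featuresCol="features").fit(train)
    model.write().overwrite().save("gs://your-bucket/models/lr")

    spark.stop()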

See "Working with Spark Jobs Submit."