Hiding sensitive information provided in the context of your Spark Job
When you run a Job for Apache Spark on a JobServer, if that Job uses a context in which your user password is defined, the JobServer may fail to hide the password in its command-line terminal.
This issue occurs when your Job runs on one of the following clusters:
- Microsoft HD Insight
- Google Cloud Dataproc
- Cloudera Altus
- All the other supported distributions when they run in Yarn cluster mode.
To hide the password, use tRunJob in a Standard Job to orchestrate your Spark Job.
- Create a Standard Job and add tRunJob to it.
- Click the Contexts tab to open its view and load the contexts to be used by your Spark Job.
It is recommended to define the contexts to be used under the Context node in the Repository of the Studio. This way, you can directly import these contexts into your Job.
- Double-click tRunJob to open its Component view.
- Click the ... button next to the Job field.
- From the Context drop-down list, select the context to be used. The contexts that appear in this list are those you imported into this Job in the previous steps.
- Select the Use an independent process to run subjob check box to avoid memory limitation issues.
- Select the Transmit whole context check box to apply all the context variables of this Standard Job to your Spark Job.
- If you need to add some supplementary context variables, use the Context Param table to add them.
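Conceptually, the tRunJob approach works because the parent Job launches the Spark Job as a separate process and controls what reaches the terminal. The following Java sketch illustrates this idea only; the class, the variable names, and the `--context_param` argument format are assumptions for illustration, not Talend's actual generated code.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

// Illustrative sketch: a parent Job passes its whole context to a subjob
// run as an independent process, while masking sensitive variables in
// anything it logs. Names here are hypothetical, not Talend internals.
public class SubjobLauncher {

    // Context variables whose values must never appear in logs or terminals.
    private static final Set<String> SENSITIVE = Set.of("password");

    // Build the child-process arguments: one "--context_param key=value"
    // entry per context variable (argument format is an assumption).
    static List<String> buildArgs(Map<String, String> context) {
        return context.entrySet().stream()
                .map(e -> "--context_param " + e.getKey() + "=" + e.getValue())
                .collect(Collectors.toList());
    }

    // Produce a log-safe rendering of the same context,
    // replacing sensitive values with asterisks.
    static String maskForLog(Map<String, String> context) {
        return context.entrySet().stream()
                .map(e -> e.getKey() + "="
                        + (SENSITIVE.contains(e.getKey()) ? "****" : e.getValue()))
                .collect(Collectors.joining(" "));
    }

    public static void main(String[] args) {
        Map<String, String> context = new LinkedHashMap<>();
        context.put("host", "spark-master");
        context.put("password", "s3cret");

        // Only the masked form is ever printed.
        System.out.println("Launching subjob with: " + maskForLog(context));
        // A real launcher would now start the child JVM, for example with
        // new ProcessBuilder(...).start(), passing buildArgs(context).
    }
}
```

The point of the sketch is the separation: the full context (password included) goes to the child process, while the parent's terminal output only ever sees the masked rendering.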