Spark executor memory not modifiable from UI when starting Jupyter

Description

UPDATE: I figured out that this happens when the executor memory is set below the minimum of 1024MB. However, the value is overridden silently, so it would be better if the UI indicated that values below 1024MB are not allowed.
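For illustration only, a minimal sketch of the kind of check that could surface this limit to the user instead of silently falling back; the helper name and error handling are assumptions, not the actual Hopsworks code:

```python
# Hypothetical sketch (not the actual Hopsworks code): reject executor memory
# below the 1024MB minimum instead of silently overriding it.

MIN_EXECUTOR_MEMORY_MB = 1024  # assumed minimum, based on the observed behaviour


def validate_executor_memory(requested_mb: int) -> int:
    """Return the requested value, or raise so the UI can show an error."""
    if requested_mb < MIN_EXECUTOR_MEMORY_MB:
        raise ValueError(
            f"spark.executor.memory must be at least {MIN_EXECUTOR_MEMORY_MB}MB, "
            f"got {requested_mb}MB"
        )
    return requested_mb
```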

On current master, when starting Jupyter in Spark (static and dynamic) mode, setting the executor memory to something other than 4096MB has no effect: it still defaults back to 4096MB.

The issue is that the REST call does not include the field `spark.executor.memory` in the jobConfig JSON (see screenshot).
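For reference, a hedged sketch of what the jobConfig payload would be expected to carry; the surrounding field names and values are illustrative assumptions, only `spark.executor.memory` is taken from this report:

```python
# Illustrative only: the exact jobConfig schema is an assumption. The point is
# that `spark.executor.memory` should be present and hold the value chosen in
# the UI, whereas in the observed REST call the field is missing entirely.
job_config = {
    "type": "sparkJobConfiguration",          # assumed field name
    "spark.executor.memory": 2048,            # value set in the UI (MB); currently absent from the request
    "spark.dynamicAllocation.enabled": True,  # assumed field name
}
```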

This must be a recently introduced bug, as it was still working on a VM from the 16th of July.

Assignee

Gibson Chikafa

Reporter

Moritz Meister

Labels

Fix versions

Priority

Medium