Currently the default value of the experimentType field in SparkJobConfiguration is EXPERIMENT. This causes problems: when a client sends a JSON configuration object where the experimentType field is absent, the field is set to EXPERIMENT during unmarshalling, causing the job to run with an invalid configuration.
To fix the problem, the default value of the experimentType field should be null, and it should only be set when the job configuration is first created for Jupyter and Jobs. To set the initial value of experimentType I added a new SparkJobConfiguration constructor.
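A minimal sketch of the idea, assuming simplified class and enum definitions (the real Hopsworks classes have more fields): the no-arg constructor used by the unmarshaller leaves experimentType null, while the new constructor sets it explicitly.

```java
// Simplified assumption of the relevant enum; the real one may have more values.
enum ExperimentType { EXPERIMENT }

class SparkJobConfiguration {
    // Default is null, so a JSON payload without the field stays null
    // after unmarshalling instead of silently becoming EXPERIMENT.
    private ExperimentType experimentType = null;

    // No-arg constructor used by the JSON unmarshaller.
    public SparkJobConfiguration() {}

    // New constructor: sets the initial value when the configuration
    // is first created for Jupyter and Jobs.
    public SparkJobConfiguration(ExperimentType experimentType) {
        this.experimentType = experimentType;
    }

    public ExperimentType getExperimentType() {
        return experimentType;
    }
}
```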
When a job is configured, the user sets a binary which is either a .jar, .py or .ipynb file. If the user sets a .jar file we should return a JobConfiguration corresponding to Spark dynamic allocation. If the user sets a .py or .ipynb file we should return a JobConfiguration using the ExperimentType.EXPERIMENT enum.
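The selection logic above could be sketched as follows. This is a self-contained illustration, not the actual Hopsworks code: the class and enum are simplified, and the dynamicAllocationEnabled flag is a hypothetical stand-in for the real dynamic-allocation settings.

```java
// Simplified stand-ins for the real Hopsworks classes.
enum ExperimentType { EXPERIMENT }

class SparkJobConfiguration {
    ExperimentType experimentType;      // null by default
    boolean dynamicAllocationEnabled;   // hypothetical flag for illustration

    SparkJobConfiguration() {}

    SparkJobConfiguration(ExperimentType type) {
        this.experimentType = type;
    }
}

class JobConfigurationFactory {
    // Pick the configuration based on the binary's file extension.
    static SparkJobConfiguration forBinary(String path) {
        if (path.endsWith(".jar")) {
            // .jar: plain Spark job with dynamic allocation, no experiment type.
            SparkJobConfiguration conf = new SparkJobConfiguration();
            conf.dynamicAllocationEnabled = true;
            return conf;
        }
        if (path.endsWith(".py") || path.endsWith(".ipynb")) {
            // .py / .ipynb: experiment job.
            return new SparkJobConfiguration(ExperimentType.EXPERIMENT);
        }
        throw new IllegalArgumentException("Unsupported binary: " + path);
    }
}
```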