[PredictionIO] Spark running multiple applications at the same time?

Basically my problem:

I have created a few engines, trained them, and attempted to deploy them (using a Spark standalone cluster),

and I ran into a problem: currently it only allows me to run 1 application per core, and I have around 30 engines I want to deploy on this machine... how do I make Spark run all the applications at the same time without having 30-40 cores?

(btw, while I was originally running on the built-in Spark that ships with PredictionIO, it had a problem with ports for the Spark UI: it only tried 4040-4056 and then reported that the SparkUI service failed to bind, and I could not deploy more engines... :( ... apparently spark.port.maxRetries defaults to 16, which explains the 4040-4056 range. So I moved to a Spark standalone cluster, as I was told it would solve the issue, and it did; but it created this new problem of me being unable to run the number of engines I want.)

I'm sure I am missing something. Any help from the pros around here?

Example of how I'm executing it:

(pio train works like a charm)
pio train -- --master spark://CL-GRAPH-1VUS01:7077

(the problem is here... the Spark cluster wants to eat all the resources)
pio deploy --ip CL-GRAPH-1vUS01 --port 8008 -- --master spark://CL-GRAPH-1VUS01:7077 --conf spark.port.maxRetries=100
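
(As far as I can tell, the standalone cluster behaves this way because, by default, each Spark application claims every available core; spark.cores.max is the standard Spark setting for capping that per application. Something like the following should let several engines share the cluster; the exact values here are only illustrative:)

pio deploy --ip CL-GRAPH-1vUS01 --port 8008 -- --master spark://CL-GRAPH-1VUS01:7077 --conf spark.cores.max=1 --conf spark.executor.memory=512m --conf spark.port.maxRetries=100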

Figured it out.

:|

I can bypass the Spark cluster and still pass the port-retry conf to the internal built-in Spark in PredictionIO.
Simple, I like it ~ I had tried a similar thing before, but without the -- escape after pio deploy; now it works:

pio deploy --ip CL-GRAPH-1vUS01 --port 8000 -- --conf spark.port.maxRetries=100
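
And since there are ~30 engines, here is a rough sketch of deploying them all against the built-in Spark, each from its own directory on its own port (the engine paths and the count are hypothetical, just to show the idea):

# deploy each engine from its own directory on consecutive ports,
# backgrounded so they all run at once
for i in $(seq 1 30); do
  (cd /path/to/engine-$i && \
   pio deploy --ip CL-GRAPH-1vUS01 --port $((8000 + i)) -- --conf spark.port.maxRetries=100) &
done
wait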

Regards.