Call Scala & Spark code from Python


I have a codebase written in Scala that uses Spark. I'd like a method `foo` to be callable from external Python code:

def foo(a: Int, b: String): String

I've seen that Jython can be used for Java/Python integration, but I think that's overkill here.

Can't I just add a PySpark method that wraps the existing Scala/Spark method?

If not, isn't there a simpler solution that doesn't require a special module like Jython?
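For what it's worth, one common approach is to go through the Py4J gateway that PySpark already ships with: a `SparkSession` exposes the driver JVM as `spark._jvm`, so a Scala `object` packaged in a JAR on the driver classpath can be reached by plain attribute access. Below is a minimal sketch; the package and object names (`com.example.Foo`) and the JAR name are hypothetical placeholders, and the wrapper function is just an illustration, not a Spark API.

```python
# Sketch, assuming the Scala side is compiled into a JAR on the driver
# classpath and foo lives in a (hypothetical) object:
#
#   package com.example
#   object Foo { def foo(a: Int, b: String): String = ... }

def call_scala_foo(jvm, a: int, b: str) -> str:
    """Call the Scala object's method through a Py4J JVM view.

    `jvm` is expected to behave like PySpark's `spark._jvm`:
    attribute access walks the Java/Scala package tree.
    """
    return jvm.com.example.Foo.foo(a, b)

# In a real PySpark session this would look like (not run here):
#
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .config("spark.jars", "my-scala-project.jar")  # hypothetical JAR
#          .getOrCreate())
# result = call_scala_foo(spark._jvm, 1, "bar")
```

Note that `_jvm` is a private attribute, so this is not a stable public API; for a supported path you would instead expose the method through a Spark UDF registered on the Scala side, or run it via `spark-submit`.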

