Call Scala & Spark code from Python
I have a codebase written in Scala that uses Spark. I'd like the method foo
to be callable from external Python code.
def foo(a: Int, b: String): String
I saw here that for Java–Python integration one can use Jython, but I think that's overkill.
Can't I just add a PySpark method that wraps the existing Scala/Spark method?
If not, isn't there a simpler solution that wouldn't need a special module like Jython?
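For what it's worth, here is a minimal sketch of the wrapping approach, using the Py4J gateway that PySpark already ships with. The package and object names (`com.example.Foo`) are hypothetical stand-ins for your own; it assumes the Scala method lives in a companion/top-level `object`, which gives it a static forwarder that Py4J can call directly, and that the jar is on the driver classpath.

```python
def call_scala_foo(spark, a, b):
    """Call a Scala method via Spark's built-in Py4J gateway.

    Hypothetical Scala side, compiled into a jar on the driver
    classpath:

        package com.example
        object Foo {
          def foo(a: Int, b: String): String = ...
        }
    """
    # Methods on a Scala `object` get static forwarders, so Py4J
    # can invoke them on the class without instantiating anything.
    return spark.sparkContext._jvm.com.example.Foo.foo(a, b)
```

You would then launch with something like `pyspark --jars myproject.jar --driver-class-path myproject.jar` and call `call_scala_foo(spark, 1, "x")` from Python. No Jython involved: the Python process talks to the same JVM that Spark is already running.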