In a Jupyter Notebook using the PySpark3 kernel (HDInsight Spark cluster), I get the error "The code failed because of a fatal error: Neither SparkSession nor HiveContext/SqlContext is available", even though I followed all the steps described in "Safely manage Python environment on Azure HDInsight using Script Action": https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-python-package-installation
Could you help me? The best option for me would be to add one new package (spaCy) to the PySpark3 kernel (py35 environment). How can I do this?
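For context, this is a minimal sketch of the check I plan to run in a notebook cell once the session does start, to confirm which interpreter the PySpark3/Livy session is actually using and whether spaCy is importable there (the py35 environment name follows the linked doc's example; the exact env path on your cluster is an assumption):

```python
import sys

# Which interpreter is the remote PySpark3/Livy session using?
# If the script action worked, this should point into the py35 env.
print(sys.executable)
print(sys.version)

# Is spaCy importable in that environment?
try:
    import spacy
    print("spaCy", spacy.__version__, "is available")
except ImportError as err:
    print("spaCy is not installed in this environment:", err)
```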