Unable to instantiate SparkSession with Hive support because Hive classes are not found. #33

@emptyr1

Description

Hi,
I'm having issues with this command:

val spark = (SparkSession
          .builder()
          .appName("interfacing spark sql to hive metastore without configuration file")
          .config("hive.metastore.uris", "thrift://hive-metastorerver-201227-ro-001:903,thrift://hive-metasteserver-201727-ro-002:9083,thrift://hive-metastoreser-201727-ro-003:9083")
          .enableHiveSupport()
          .getOrCreate())

I get this error:
java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
I've added the jars via the interpreter and restarted it, but I'm still getting the error. Any idea how to solve this?
I added org.apache.spark:spark-hive_2.11:2.1.0 under "jdbc".
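For context, that exception is thrown before any connection to the metastore is attempted: in Spark 2.1, `SparkSession.enableHiveSupport()` simply probes the driver's classpath for a Hive support class and fails if it isn't visible. A minimal sketch of that probe, which you can run in the same interpreter to confirm whether `spark-hive` actually loaded (`HiveClasspathCheck` is a hypothetical helper; the class name matches what Spark 2.1's `SparkSession` checks for, but verify against your Spark version):

```scala
import scala.util.Try

// Hypothetical diagnostic mirroring Spark 2.1's internal check:
// enableHiveSupport() only works if this class resolves on the classpath.
object HiveClasspathCheck {
  // Class probed by SparkSession.hiveClassesArePresent in Spark 2.1
  val hiveClass = "org.apache.spark.sql.hive.HiveSessionState"

  def hiveClassesPresent(): Boolean =
    Try(Class.forName(hiveClass)).isSuccess

  def main(args: Array[String]): Unit = {
    if (hiveClassesPresent())
      println("spark-hive is on the classpath")
    else
      println("spark-hive is NOT on the classpath; add spark-hive_2.11 as a dependency of the Spark interpreter")
  }
}
```

If this prints that the class is missing, the jar never reached the Spark driver's JVM. In Zeppelin, dependencies added under the "jdbc" interpreter only load into the jdbc interpreter's process, so the artifact most likely needs to be registered under the spark interpreter's dependencies instead, followed by an interpreter restart.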
