Error in PyCharm: ERROR SparkContext: Failed to add dependencies.jar
I'm running code in a dbconnect conda environment via PyCharm, and I managed to run some code successfully. But then I ran a larger project, hoping it would execute on a remote Databricks cluster, and I get this weird error even though the dependency files are present in the project. Why am I getting this error? The error is at the end of the output; I get it for every jar, but I'm only posting the first one.
22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\na\PycharmProjects\..\dependencies\udfs-1.0-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\name\PycharmProjects\..\dependencies\urlz-1.2-20211108.112618-928-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\name\PycharmProjects\..\dependencies\gremlin-1.0-20211108.112604-912-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 INFO SparkContext: Running Spark version 3.0.1-SNAPSHOT
22/01/01 15:14:27 INFO ResourceUtils: ==============================================================
22/01/01 15:14:27 INFO ResourceUtils: Resources for spark.driver:
22/01/01 15:14:27 INFO ResourceUtils: ==============================================================
22/01/01 15:14:27 INFO SparkContext: Submitted application: dbconnect_session
22/01/01 15:14:27 INFO SecurityManager: Changing view acls to: name
22/01/01 15:14:27 INFO SecurityManager: Changing modify acls to: name
22/01/01 15:14:27 INFO SecurityManager: Changing view acls groups to:
22/01/01 15:14:27 INFO SecurityManager: Changing modify acls groups to:
22/01/01 15:14:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(name); groups with view permissions: Set(); users with modify permissions: Set(name); groups with modify permissions: Set()
22/01/01 15:14:30 INFO Utils: Successfully started service 'sparkDriver' on port 61506.
22/01/01 15:14:30 INFO SparkEnv: Registering MapOutputTracker
22/01/01 15:14:30 INFO SparkEnv: Registering BlockManagerMaster
22/01/01 15:14:30 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/01/01 15:14:30 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/01/01 15:14:30 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/01/01 15:14:30 INFO DiskBlockManager: Created local directory at C:\Users\name\AppData\Local\Temp\blockmgr-963a7542-d01d-4f70-a0fe-ee430175cc27
22/01/01 15:14:30 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
22/01/01 15:14:30 INFO SparkEnv: Registering OutputCommitCoordinator
22/01/01 15:14:30 INFO AsyncProfiler: Cannot create AsyncProfiler instance (unsupported platform)
22/01/01 15:14:30 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set.
22/01/01 15:14:31 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/01/01 15:14:31 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://kubernetes.docker.internal:4040
22/01/01 15:14:31 ERROR SparkContext: Failed to add ../dependencies/udfs-1.0-jar-with-dependencies.jar to Spark environment
java.io.FileNotFoundException: Jar C:\Users\name\PycharmProjects\..\dependencies\udfs-1.0-jar-with-dependencies.jar not found
at org.apache.spark.SparkContext.addLocalJarFile$1(SparkContext.scala:2040)
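For what it's worth, the warnings suggest Spark is resolving the relative `../dependencies/...` entries against the process working directory, which under PyCharm may not be the directory you expect. A minimal diagnostic sketch (the jar list below is hypothetical, copied from the warning lines; adjust it to your project) to print what each entry actually resolves to:

```python
import os

# Hypothetical jar entries, taken from the WARN lines above;
# replace with the actual entries from your Spark config.
jars = [
    "../dependencies/udfs-1.0-jar-with-dependencies.jar",
]

for jar in jars:
    # Relative jar paths are resolved against the current working
    # directory, so show the absolute path and whether it exists.
    resolved = os.path.abspath(jar)
    print(resolved, "exists:", os.path.isfile(resolved))

# If the resolved paths are wrong, passing absolute paths (e.g. via
# spark.jars in the session config) avoids depending on the CWD.
```

If the printed paths don't exist, that would explain the `FileNotFoundException`, independent of anything on the remote cluster.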