Problem
When you have a firewall enabled on your Azure virtual network (VNet injection) and you try to access ADLS using the ADLS Gen1 connector, it fails with the error:
Py4JJavaError: An error occurred while calling o196.parquet.
: java.lang.RuntimeException: Could not find ADLS Token
	at com.databricks.backend.daemon.data.client.adl.AdlCredentialContextTokenProvider$$anonfun$getToken$1.apply(AdlCredentialContextTokenProvider.scala:18)
	at com.databricks.backend.daemon.data.client.adl.AdlCredentialContextTokenProvider$$anonfun$getToken$1.apply(AdlCredentialContextTokenProvider.scala:18)
	at scala.Option.getOrElse(Option.scala:121)
	at com.databricks.backend.daemon.data.client.adl.AdlCredentialContextTokenProvider.getToken(AdlCredentialContextTokenProvider.scala:18)
	at com.microsoft.azure.datalake.store.ADLStoreClient.getAccessToken(ADLStoreClient.java:1036)
	at com.microsoft.azure.datalake.store.HttpTransport.makeSingleCall(HttpTransport.java:177)
	at com.microsoft.azure.datalake.store.HttpTransport.makeCall(HttpTransport.java:91)
	at com.microsoft.azure.datalake.store.Core.getFileStatus(Core.java:655)
	at com.microsoft.azure.datalake.store.ADLStoreClient.getDirectoryEntry(ADLStoreClient.java:735)
	at com.microsoft.azure.datalake.store.ADLStoreClient.getDirectoryEntry(ADLStoreClient.java:718)
	at com.databricks.adl.AdlFileSystem.getFileStatus(AdlFileSystem.java:423)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
	at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:94)
Cause
This is a known issue with the ADLS Gen1 connector. Connecting to ADLS Gen1 with a firewall enabled is not supported.
Solution
Use ADLS Gen2 instead.
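As a rough sketch of the workaround, ADLS Gen2 can be accessed from a Databricks notebook through the ABFS driver (`abfss://`) with an OAuth service principal. All names below (storage account, container, tenant and client IDs, secret scope, and path) are hypothetical placeholders introduced for illustration, not values from this article:

```python
# Sketch: reading Parquet from ADLS Gen2 via the ABFS driver with OAuth.
# Assumes a Databricks notebook where `spark` and `dbutils` are predefined.
# All identifiers below are hypothetical placeholders -- substitute your own.
storage_account = "mystorageaccount"  # hypothetical storage account name
container = "mycontainer"             # hypothetical container name
host = f"{storage_account}.dfs.core.windows.net"

# Standard Hadoop ABFS OAuth configuration keys for service-principal auth.
spark.conf.set(f"fs.azure.account.auth.type.{host}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{host}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{host}",
               "<application-client-id>")
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{host}",
               dbutils.secrets.get(scope="my-scope", key="sp-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{host}",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Read from the Gen2 account instead of the adl:// (Gen1) path.
df = spark.read.parquet(f"abfss://{container}@{host}/path/to/data")
```

With Gen2, firewall rules on the storage account can be combined with virtual network service endpoints or private endpoints, which is the supported way to restrict network access.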