Spark/Phoenix with Kerberos on YARN


I have a Spark (1.4.1) application that runs on a non-Kerberized cluster, and I copied it to an instance where Kerberos is running. The application takes data from HDFS and puts it into Phoenix.

However, it does not work:

    ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
    javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
            at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
            at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:611)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:156)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:737)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:734)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:422)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:734)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:887)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:856)
            at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1200)
            at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
            at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
            at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:50918)
            at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1564)
            at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1502)
            at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524)
            at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553)
            at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704)
            at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
            at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
            at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
            at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:441)
            at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:463)
            at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:815)
            at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1215)
            at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:112)
            at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1902)
            at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:744)
            at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
            at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:304)
            at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:296)
            at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
            at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:294)
            at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1243)
            at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1893)
            at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1862)
            at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
            at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1862)
            at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
            at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
            at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
            at java.sql.DriverManager.getConnection(DriverManager.java:664)
            at java.sql.DriverManager.getConnection(DriverManager.java:208)
            at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:99)
            at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:57)
            at org.apache.phoenix.mapreduce.util.ConnectionUtil.getInputConnection(ConnectionUtil.java:45)
            at org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil.getSelectColumnMetadataList(PhoenixConfigurationUtil.java:263)
            at org.apache.phoenix.spark.PhoenixRDD.toDataFrame(PhoenixRDD.scala:109)
            at org.apache.phoenix.spark.SparkSqlContextFunctions.phoenixTableAsDataFrame(SparkSqlContextFunctions.scala:37)
            at com.bosch.asc.utils.HBaseUtils$.scanPhoenix(HBaseUtils.scala:123)
            at com.bosch.asc.SMTProcess.addLookup(SMTProcess.scala:1125)
            at com.bosch.asc.SMTProcess.saveMountTraceLogToPhoenix(SMTProcess.scala:1039)
            at com.bosch.asc.SMTProcess.runETL(SMTProcess.scala:87)
            at com.bosch.asc.SMTProcessMonitor$delayedInit$body.apply(SMTProcessMonitor.scala:20)
            at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
            at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
            at scala.App$$anonfun$main$1.apply(App.scala:71)
            at scala.App$$anonfun$main$1.apply(App.scala:71)
            at scala.collection.immutable.List.foreach(List.scala:318)
            at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
            at scala.App$class.main(App.scala:71)
            at com.bosch.asc.SMTProcessMonitor$.main(SMTProcessMonitor.scala:5)
            at com.bosch.asc.SMTProcessMonitor.main(SMTProcessMonitor.scala)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:486)
    Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
            at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
            at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
            at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
            at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
            at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
            at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
            at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
            ... 70 more

I have added

export _JAVA_OPTIONS="-Djava.security.krb5.conf=/etc/hadoop/krb5.conf"

in my Spark submission script, but to no avail. Do I have to change my code to allow for authentication? I had assumed that the Kerberos ticket is shared between applications and that the code itself does not change.
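For reference, here is a minimal sketch of how a keytab login could be done in code before the Phoenix connection is opened, assuming a keytab is available on the node; the principal and keytab path below are placeholders:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.security.UserGroupInformation

    // Tell the Hadoop client libraries that Kerberos authentication is in use,
    // then obtain a TGT from the keytab (placeholder values) so the HBase RPC
    // layer has valid credentials when Phoenix connects.
    val hadoopConf = new Configuration()
    hadoopConf.set("hadoop.security.authentication", "kerberos")
    UserGroupInformation.setConfiguration(hadoopConf)
    UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/to/user.keytab")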

In case it helps: I do not see a spark.authenticate option set in the shell when I execute:

sc.getConf.getAll.foreach(println)

See: http://spark.apache.org/docs/latest/security.html
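One thing worth checking: on YARN, spark-submit can handle the Kerberos login itself via the --principal and --keytab options (available since Spark 1.4), in which case YARN can also renew the tickets for long-running jobs. A sketch with placeholder values for the principal, keytab, class, and jar:

    spark-submit --master yarn-cluster \
        --principal user@EXAMPLE.COM \
        --keytab /path/to/user.keytab \
        --class com.example.MyApp myapp.jar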

I have very little experience with Kerberos, so any help is appreciated.

Assuming your cluster is Kerberized, initialize your credentials with:

kinit -kt /path/to/keytab/file user/domain@realm
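
You can then verify that a ticket was obtained, and check its expiry time, by running klist in the same session before launching the application:

    klist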

