[Rdma-spark-discuss] Hi! I have an error in RDMA-Spark.

Xiaoyi Lu@cse.osu luxi at cse.ohio-state.edu
Wed Apr 27 13:04:51 EDT 2016


Hi, Zino,
As we mentioned in our user guide (
http://hibd.cse.ohio-state.edu/static/media/rdma-spark/rdma-spark-0.9.1-userguide.pdf),
you need to add the following two library paths to your spark-defaults.conf:

spark.executor.extraLibraryPath $SPARK_HOME/lib/native/Linux-amd64-64
spark.driver.extraLibraryPath $SPARK_HOME/lib/native/Linux-amd64-64

Please replace '$SPARK_HOME' with the absolute path of the installation on your cluster.
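For example, the two settings could be appended to the conf file like this (a minimal sketch; the install directory below is created under a temp path purely for illustration, and on a real cluster it would be the actual absolute path of the RDMA-Spark installation):

```shell
# Hypothetical install location for illustration only; on a real cluster this
# would be the absolute path where rdma-spark-0.9.1 is installed.
SPARK_INSTALL="$(mktemp -d)/rdma-spark-0.9.1"
mkdir -p "$SPARK_INSTALL/conf"

# Append the native-library paths, expanding the absolute path into the file
# rather than leaving a literal '$SPARK_HOME' placeholder in it.
cat >> "$SPARK_INSTALL/conf/spark-defaults.conf" <<EOF
spark.executor.extraLibraryPath $SPARK_INSTALL/lib/native/Linux-amd64-64
spark.driver.extraLibraryPath $SPARK_INSTALL/lib/native/Linux-amd64-64
EOF

# Sanity check: both settings should now point at an absolute directory.
grep extraLibraryPath "$SPARK_INSTALL/conf/spark-defaults.conf"
```

The same directory must contain the librdmaipcserver native library on every node, since both the driver and the executors load it from this path at startup.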

Thanks,
Xiaoyi

On Tue, Apr 26, 2016 at 4:05 AM, zino lin <linzino7 at gmail.com> wrote:

> Hi:
>   I deployed your RDMA-Spark. We get an error saying there is no
> rdmaipcserver in java.library.path. Do I need to add any path? I'm sure my
> server can run RDMA.
>
>
>
> error log:
>
> paslab@fastnet  ~/zino/rdma-spark-0.9.1-bin
> MASTER=spark://fastnet:7077 ./bin/run-example
> org.apache.spark.examples.GroupByTest 96 65536 4096 96
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 16/04/26 16:49:59 INFO SparkContext: Running Spark version 1.5.1
> 16/04/26 16:49:59 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 16/04/26 16:49:59 INFO SecurityManager: Changing view acls to: paslab
> 16/04/26 16:49:59 INFO SecurityManager: Changing modify acls to: paslab
> 16/04/26 16:49:59 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(paslab); users
> with modify permissions: Set(paslab)
> 16/04/26 16:49:59 INFO Slf4jLogger: Slf4jLogger started
> 16/04/26 16:49:59 INFO Remoting: Starting remoting
> 16/04/26 16:50:00 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://sparkDriver@192.168.1.2:45536]
> 16/04/26 16:50:00 INFO Utils: Successfully started service 'sparkDriver'
> on port 45536.
> 16/04/26 16:50:00 INFO SparkEnv: Registering MapOutputTracker
> 16/04/26 16:50:00 INFO SparkEnv: create RdmaBlockTransferService with 0
> 16/04/26 16:50:00 INFO SparkEnv: Registering BlockManagerMaster
> 16/04/26 16:50:00 INFO DiskBlockManager: Created local directory at
> /tmp/blockmgr-b31827b1-3026-4a08-ba16-4cd0d0048197
> 16/04/26 16:50:00 INFO MemoryStore: MemoryStore started with capacity
> 530.3 MB
> 16/04/26 16:50:00 INFO HttpFileServer: HTTP File server directory is
> /tmp/spark-ed579380-aedb-4798-aff0-9b021592b93e/httpd-22553087-c264-488e-bafe-4ab25b89d138
> 16/04/26 16:50:00 INFO HttpServer: Starting HTTP Server
> 16/04/26 16:50:00 INFO Utils: Successfully started service 'HTTP file
> server' on port 38954.
> 16/04/26 16:50:00 INFO SparkEnv: Registering OutputCommitCoordinator
> 16/04/26 16:50:00 INFO Utils: Successfully started service 'SparkUI' on
> port 4040.
> 16/04/26 16:50:00 INFO SparkUI: Started SparkUI at http://192.168.1.2:4040
> 16/04/26 16:50:00 INFO SparkContext: Added JAR
> file:/home/paslab/zino/rdma-spark-0.9.1-bin/lib/spark-examples-1.5.1-hadoop2.6.0.jar
> at http://192.168.1.2:38954/jars/spark-examples-1.5.1-hadoop2.6.0.jar with
> timestamp 1461660600396
> 16/04/26 16:50:00 WARN MetricsSystem: Using default name DAGScheduler for
> source because spark.app.id is not set.
> 16/04/26 16:50:00 INFO AppClient$ClientEndpoint: Connecting to master
> spark://fastnet:7077...
> 16/04/26 16:50:00 INFO SparkDeploySchedulerBackend: Connected to Spark
> cluster with app ID app-20160426165000-0002
> Exception in thread "main" java.lang.UnsatisfiedLinkError: no
> rdmaipcserver in java.library.path
>         at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1889)
>         at java.lang.Runtime.loadLibrary0(Runtime.java:849)
>         at java.lang.System.loadLibrary(System.java:1088)
>         at org.apache.spark.ipc.rdma.RdmaServer.<clinit>(SourceFile:92)
>         at
> org.apache.spark.network.server.RdmaShuffleServer.<init>(SourceFile:125)
>         at
> org.apache.spark.network.rdma.RdmaBlockTransferService.init(SourceFile:82)
>         at
> org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:199)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:528)
>         at
> org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:37)
>         at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>         at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>         at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/04/26 16:50:00 INFO DiskBlockManager: Shutdown hook called
> 16/04/26 16:50:00 INFO ShutdownHookManager: Shutdown hook called
> 16/04/26 16:50:00 INFO ShutdownHookManager: Deleting directory
> /tmp/spark-ed579380-aedb-4798-aff0-9b021592b93e
>
>
> by   zino
>
> from National Taiwan University, Department of Computer Science &
> Information System
>
> _______________________________________________
> Rdma-spark-discuss mailing list
> Rdma-spark-discuss at cse.ohio-state.edu
> http://mailman.cse.ohio-state.edu/mailman/listinfo/rdma-spark-discuss
>
>

