Error
spark-submit fails with "not found" errors for the dependency packages!
15:54:42.639 [warn] ::::::::::::::::::::::::::::::::::::::::::::::
15:54:42.640 [warn] :: UNRESOLVED DEPENDENCIES ::
15:54:42.640 [warn] ::::::::::::::::::::::::::::::::::::::::::::::
15:54:42.640 [warn] :: net.databinder.dispatch#dispatch-core_2.12;0.11.2: not found
15:54:42.640 [warn] :: org.scalatest#scalatest_2.12;2.2.4: not found
15:54:42.640 [warn] :: org.apache.spark#spark-core_2.12;1.4.0: not found
15:54:42.640 [warn] :: org.elasticsearch#elasticsearch-spark_2.12;20_2.12: not found
15:54:42.641 [warn] ::::::::::::::::::::::::::::::::::::::::::::::
Ivy had loaded the ivysettings.xml bundled inside the ivy jar:
Ivy Default Cache set to: /hanmail/.ivy2/cache
The jars for the packages stored in: /hanmail/.ivy2/jars
:: loading settings :: url = jar:file:/daum/program/hadoop-client-env-v2/target/hadoop-doopey/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.elasticsearch#elasticsearch-spark-20_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-35c7f637-edf9-4399-a22b-869f1d8263e2;1.0
Solution
Pass the ivysettings file to spark-submit via --conf and it works.
The ivysettings.xml contents point to the in-house repository.
IVYSETTINGS=/daum/program/hadoop-client-env-v2/target/hadoop-doopey/spark/jars/ivysettings.xml
spark-submit --packages ${SPARK_DEPENDENCY_PACKAGES} --conf "spark.jars.ivySettings=${IVYSETTINGS}"
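For reference, a minimal sketch of what such an ivysettings.xml can look like. The repository URL below is a placeholder, not the actual in-house repository address, which is site-specific:

```xml
<!-- Sketch: route all resolution through one internal repository.
     The root URL is a hypothetical placeholder. -->
<ivysettings>
  <settings defaultResolver="internal"/>
  <resolvers>
    <!-- m2compatible="true" makes Ivy use the Maven 2 repository layout -->
    <ibiblio name="internal"
             m2compatible="true"
             root="http://maven.example.internal/repository/public/"/>
  </resolvers>
</ivysettings>
```

With spark.jars.ivySettings set, Spark hands this file to Ivy instead of the default one bundled in the ivy jar, so --packages artifacts resolve against the internal repository.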
Result
The dependencies were fetched successfully!
:: loading settings :: file = /daum/program/hadoop-client-env-v2/target/hadoop-doopey/spark/jars/ivysettings.xml
Ivy Default Cache set to: /hanmail/.ivy2/cache
The jars for the packages stored in: /hanmail/.ivy2/jars
:: loading settings :: url = jar:file:/daum/program/hadoop-client-env-v2/target/hadoop-doopey/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.elasticsearch#elasticsearch-spark-20_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-4717d44b-70a1-4e5f-ad6b-1efc0367bd53;1.0
confs: [default]
found org.elasticsearch#elasticsearch-spark-20_2.12;7.12.1 in maven.daumcorp.com
found org.scala-lang#scala-reflect;2.12.8 in maven.daumcorp.com
found org.slf4j#slf4j-api;1.7.6 in maven.daumcorp.com
found commons-logging#commons-logging;1.1.1 in maven.daumcorp.com
found javax.xml.bind#jaxb-api;2.3.1 in maven.daumcorp.com
found com.google.protobuf#protobuf-java;2.5.0 in maven.daumcorp.com
found org.apache.spark#spark-yarn_2.12;2.4.4 in maven.daumcorp.com
downloading http://..../groups/daum-ria-group/org/elasticsearch/elasticsearch-spark-20_2.12/7.12.1/elasticsearch-spark-20_2.12-7.12.1.jar ...
[SUCCESSFUL ] org.elasticsearch#elasticsearch-spark-20_2.12;7.12.1!elasticsearch-spark-20_2.12.jar (61ms)
:: resolution report :: resolve 93530ms :: artifacts dl 64ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 7 | 7 | 7 | 0 || 1 | 1 |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-4717d44b-70a1-4e5f-ad6b-1efc0367bd53
confs: [default]
1 artifacts copied, 0 already retrieved (850kB/9ms)