
What kind of server do you need to set up Hive?

Published: 2024-10-13 09:07:16

① How do I connect to Hive and Hive2 with Kettle?

How to connect to Hive (HiveServer1):
Log in to the server that hosts Hive and run: hive --service hiveserver (this starts the Thrift service).
Open Kettle's database connection dialog and enter the Hive server's IP address, the Hive database you need, and the port number (the default Thrift port is 10000).
Test the connection and you are done; a minimal command-line sketch follows below.
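A minimal sketch of the same setup from the command line, assuming a placeholder host name hive-host and the default database:

# On the Hive server: start the HiveServer1 Thrift service (listens on port 10000 by default)
hive --service hiveserver &
# The values entered in Kettle's connection dialog correspond to a JDBC URL of the form
#   jdbc:hive://hive-host:10000/default   (hive-host and default are placeholders)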
How to connect to Hive2 (HiveServer2) — attempting the same kind of connection may fail with the following error:
Error connecting to database [Hive] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database

Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

at org.pentaho.di.core.database.Database.normalConnect(Database.java:428)
at org.pentaho.di.core.database.Database.connect(Database.java:361)
at org.pentaho.di.core.database.Database.connect(Database.java:314)
at org.pentaho.di.core.database.Database.connect(Database.java:302)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2685)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.test(DatabaseDialog.java:109)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2.test(CreateDatabaseWizardPage2.java:157)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizardPage2$3.widgetSelected(CreateDatabaseWizardPage2.java:147)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.core.database.wizard.CreateDatabaseWizard.createAndRunDatabaseWizard(CreateDatabaseWizard.java:111)
at org.pentaho.di.ui.spoon.Spoon.createDatabaseWizard(Spoon.java:7457)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1297)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7801)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9130)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:638)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:151)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Error connecting to database: (using class org.apache.hadoop.hive.jdbc.HiveDriver)
Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration

at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:573)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:410)
... 43 more
Caused by: java.sql.SQLException: Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration
at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:107)
at org.apache.hive.jdbc.HiveDriver.callWithActiveDriver(HiveDriver.java:121)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:132)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:555)
... 44 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:105)
... 49 more
Caused by: java.lang.RuntimeException: Unable to load JDBC driver of type: hive2
at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:108)
... 54 more
Caused by: java.lang.Exception: JDBC driver of type 'hive2' not supported
at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:104)
... 54 more

The fix for the error above is as follows:
1. Locate the file %KETTLE_HOME%/plugins/pentaho-big-data-plugin/plugin.properties
2. In plugin.properties, change the value: active.hadoop.configuration=hdp13
3. Restart Kettle after making the change
4. Once this is configured, the connection to the corresponding database will succeed (a scripted sketch of steps 1–2 follows below)
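A minimal Unix-style sketch of steps 1–2, followed by an optional connectivity check with beeline. The $KETTLE_HOME path and the host name hive-host are placeholders, and the shim name hdp13 must match the Hadoop distribution you actually run:

# Point the Pentaho big-data plugin at the matching Hadoop shim
sed -i 's/^active.hadoop.configuration=.*/active.hadoop.configuration=hdp13/' \
    "$KETTLE_HOME/plugins/pentaho-big-data-plugin/plugin.properties"
# Optional: confirm HiveServer2 itself is reachable before testing the connection in Spoon
beeline -u jdbc:hive2://hive-host:10000/default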
If you want to use the hadoop-20 shim instead, you also need to add the following jars (a copy sketch follows the list):
hadoop-core-1.2.1.jar
hive-common-0.13.0.jar
hive-jdbc-0.13.0.jar
hive-service-0.13.0.jar
libthrift-0.9.1.jar
slf4j-api-1.7.5.jar
httpclient-4.2.5.jar
httpcore-4.2.5.jar
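A hedged copy sketch for those jars, assuming they sit in the current directory and that the hadoop-20 shim lives under the plugin's hadoop-configurations directory (the usual layout for this plugin; adjust paths to your install):

# Copy the extra client jars into the hadoop-20 shim's lib directory (assumed path)
cp hadoop-core-1.2.1.jar hive-common-0.13.0.jar hive-jdbc-0.13.0.jar hive-service-0.13.0.jar \
   libthrift-0.9.1.jar slf4j-api-1.7.5.jar httpclient-4.2.5.jar httpcore-4.2.5.jar \
   "$KETTLE_HOME/plugins/pentaho-big-data-plugin/hadoop-configurations/hadoop-20/lib/"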

Summary: the benefits of using Hive2 (HiveServer2) are an optimized connection layer, better security, and higher concurrency.

② How do I disable player commands on a Minecraft HiveMC server?

If the permission plugin you are using is GroupManager, the commands are as follows:
/mangaddp Builder -essentials.me
/mangaddp Builder -essentials.afk
/mangaddp Builder -essentials.powertool
Then change Builder to Default and enter the commands once more. These features are provided by the Essentials plugin.

③ What version does The Hive server run, and do you have to use a genuine copy of the game?

The Hive is a Minecraft Bedrock Edition server hosted in Singapore, so latency is relatively low for players in China. The supported version changes continuously, following the current latest official release of Bedrock Edition; Bedrock beta builds and older versions cannot join. You must also use a genuine (international) copy of Minecraft Bedrock Edition to join the server.

④ How do I configure Hive to access Hadoop on another server?

Edit the file /etc/profile and add the following lines:
export HADOOP_HOME=/usr/local/hadoop
export ANT_HOME=$HADOOP_HOME/apache-ant-1.7.1
export PATH=$PATH:/usr/local/hadoop/bin:$JAVA_HOME/bin:$HADOOP_HOME/contrib/hive/bin:$ANT_HOME/bin
export ANT_LIB=$HADOOP_HOME/apache-ant-1.7.1/lib
export HADOOP=$HADOOP_HOME/bin/hadoop
4. Edit the Hive configuration file /usr/local/hadoop/contrib/hive/conf/hive-default.xml. Only one entry needs to change: set its value to /usr/local/hadoop/contrib/hive/lib/hive_hwi.war. (Yesterday I wrote it as "hive-hwi.war"; the browser then only listed a file directory and nothing worked, ugh!)
5. Start the Hive web service: $ hive --service hwi & . The listening port defaults to 9999 and can also be customized in hive-default.xml. The browser URL is http://hadoop:9999/hwi.
Note: Hive tables are stored in HDFS, by default under /user/hive. This path can only be seen through the hadoop shell (the path is /user, not /usr); a quick check is sketched below.
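As a quick check of that note, the path given in this answer can be listed with the hadoop shell (it will not show up in the local filesystem):

# List the default Hive storage path in HDFS
hadoop fs -ls /user/hive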
