
# cdh-package's Introduction

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Welcome to Bigtop!

Bigtop (http://incubator.apache.org/bigtop/) is a project for the development of
packaging and tests of the Apache Hadoop (http://hadoop.apache.org/)
ecosystem, currently in the Apache Incubator.

The primary goal of Bigtop is to build a community around the
packaging and interoperability testing of Hadoop-related
projects. This includes testing at various levels (packaging,
platform, runtime, upgrade, etc...) developed by a community with a
focus on the system as a whole, rather than individual projects.

## Building Bigtop

Packages have been built on Ubuntu 10.10, CentOS 5 and openSUSE
11.4. They can probably be built on other platforms as well.

Building Bigtop requires the following tools:

* Java JDK 1.6
* Apache Forrest 0.8 (requires a 32-bit Java JDK 1.5)
* Apache Ant
* Apache Maven
* git
* subversion
* autoconf
* automake
* liblzo2-dev
* libz-dev
* sharutils
* libfuse-dev

On Debian-based systems one also needs

* build-essential dh-make debhelper devscripts
* reprepro

On openSUSE 11.4+, one also needs to ensure the following are installed:

* relaxngDatatype
* docbook-utils docbook-simple
* asciidoc
* fuse-devel
* docbook5
* docbook5-xsl-stylesheets
* libxml2-devel
* xmlformat
* xmlto
* libxslt
* libopenssl-devel

## Building packages

    $ make [component-name]-[rpm|deb]

## Building local YUM/APT repositories

    $ make [component-name]-[yum|apt]
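As a sketch of how the target names above compose, assuming `hadoop` is one of the component names defined in the top-level makefile (substitute any real component):

```shell
# Bigtop make-target naming scheme; "hadoop" is an assumed component name --
# substitute any component from the top-level makefile.
component="hadoop"
for kind in rpm deb yum apt; do
    # rpm/deb build the packages; yum/apt build local repositories from them
    echo "make ${component}-${kind}"
done
```

The commands are only echoed here to show the naming pattern; run the printed `make` targets from the Bigtop source root.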

## Running the tests

WARNING: since testing packages requires installing them on a live
system, it is highly recommended to use VMs for that.

Testing Bigtop is done using the iTest framework. For more documentation
on iTest, visit the iTest page
(http://cloudera.github.com/bigtop/iTest), but here are two steps to get started:

* install package testing iTest artifacts locally:

        cd test/src/smokes/package/ && mvn install -DskipTests -DskipITs -DperformRelease

* use those locally installed iTest package testing artifacts to run a suite:

        cd test/suites/package/ && mvn clean verify -Dcdh.repo.file.url.CentOS=XXX  -D'org.apache.maven-failsafe-plugin.testInclude=**/TestPackagesReadiness.*'
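Put together, the two steps above compose as in the following sketch; the repo URL is a placeholder assumption (the `XXX` in the command must be replaced with your own CentOS repo file URL), and the commands are only echoed so the plan is visible:

```shell
# Sketch combining the two iTest steps. REPO_URL is a hypothetical placeholder;
# point it at your own CentOS repo file. Commands are echoed, not executed.
REPO_URL="http://example.com/cdh/centos/cdh.repo"
STEP1="mvn install -DskipTests -DskipITs -DperformRelease"
STEP2="mvn clean verify -Dcdh.repo.file.url.CentOS=${REPO_URL} -D'org.apache.maven-failsafe-plugin.testInclude=**/TestPackagesReadiness.*'"
echo "cd test/src/smokes/package && ${STEP1}"
echo "cd test/suites/package && ${STEP2}"
```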

## Contact us!

You can get in touch with us on the Bigtop mailing lists (http://incubator.apache.org/bigtop/mail-lists.html).

# cdh-package's People

Contributors

abayer, bmahe, rvs


# cdh-package's Issues

## When executing a query via presto-client-cli, hiveserver shows `Exception in thread "pool-1-thread-1" java.lang.OutOfMemoryError: Java heap space`

Hi friends:
I currently can't open the page https://groups.google.com/forum/#!forum/presto-users, so I'm asking my question here.
I started hiveserver and presto-server on one machine with the commands below:

    hive --service hiveserver -p 9083
    ./launcher run

When I use the presto-client-cli command `./presto --server localhost:9083 --catalog hive --schema default`, the console shows `presto:default>`; after entering a command such as `show tables`, the console prints `Error running command: java.nio.channels.ClosedChannelException`, and the hiveserver console prints the following:

    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Exception in thread "pool-1-thread-1" java.lang.OutOfMemoryError: Java heap space
            at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
            at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
            at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:27)
            at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
            at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
            at java.lang.Thread.run(Thread.java:662)

My configuration files are below:

node.properties:

    node.environment=production
    node.id=cc4a1bbf-5b98-4935-9fde-2cf1c98e8774
    node.data-dir=/home/hadoop/cloudera-5.0.0/presto-0.56/presto/data

config.properties:

    coordinator=true
    datasources=jmx
    http-server.http.port=8080
    presto-metastore.db.type=h2
    presto-metastore.db.filename=/home/hadoop/cloudera-5.0.0/presto-0.56/presto/db/MetaStore
    task.max-memory=1GB
    discovery-server.enabled=true
    discovery.uri=http://slave4:8080

jvm.config:

    -server
    -Xmx16G
    -XX:+UseConcMarkSweepGC
    -XX:+ExplicitGCInvokesConcurrent
    -XX:+CMSClassUnloadingEnabled
    -XX:+AggressiveOpts
    -XX:+HeapDumpOnOutOfMemoryError
    -XX:OnOutOfMemoryError=kill -9 %p
    -XX:PermSize=150M
    -XX:MaxPermSize=150M
    -XX:ReservedCodeCacheSize=150M
    -Xbootclasspath/p:/home/hadoop/cloudera-5.0.0/presto-0.56/presto-server-0.56/lib/floatingdecimal-0.1.jar

log.properties:

    com.facebook.presto=DEBUG

catalog/hive.properties:

    connector.name=hive-cdh4
    hive.metastore.uri=thrift://master:9083

The Hadoop environment is CDH5 + CDH5 Hive 0.11 + Presto 0.56.

Finally, I increased the Java heap size for the Hive metastore, but it still gives me the same error. Please help me check whether this is a bug in CDH5; I'm out of ideas.

Please help me, thanks.

## CDH 5.10.2's Sqoop component throws an exception when --incremental is used

My current Sqoop version is Sqoop 1.4.6-cdh5.10.2.

When I execute the command below:

    sqoop import \
        --connect jdbc:oracle:thin:@//IP:PORT/XXXX \
        --username ${USER} \
        --password ${PASS} \
        --table USERS \
        --hive-import \
        --hive-table USERS_TMP \
        --incremental lastmodified \
        --check-column UPDATE_TIME \
        --last-value '2017-09-13' \
        --merge-key ID --verbose \
        --target-dir /user/hive/warehouse/USERS_TMP \
        --hive-drop-import-delims \
        --fields-terminated-by '\001' \
        --m 1

It throws the exception below:

    17/09/15 12:42:41 DEBUG sqoop.Sqoop: --incremental lastmodified option for hive imports is not supported. Please remove the parameter --incremental lastmodified.
    --incremental lastmodified option for hive imports is not supported. Please remove the parameter --incremental lastmodified.
            at org.apache.sqoop.tool.BaseSqoopTool.validateHiveOptions(BaseSqoopTool.java:1501)
            at org.apache.sqoop.tool.ImportTool.validateOptions(ImportTool.java:1157)
            at org.apache.sqoop.Sqoop.run(Sqoop.java:137)
            at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
            at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
            at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
            at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
            at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
    --incremental lastmodified option for hive imports is not supported. Please remove the parameter --incremental lastmodified.

I checked the CDH 5.10.2 release notes, which say this is a new feature not included in the Sqoop base version 1.4.6:

[SQOOP-2986] - Add validation check for --hive-import and --incremental lastmodified

Does this enhancement (the validation check) cause the error?

By the way, when this command is executed with Sqoop 1.4.6-cdh5.7.2, it runs correctly with no error.

## CDH for Debian 8 Jessie

Hello,

is CDH for Jessie planned? Could you disclose an approximate date? We've managed to get scm-server and scm-agents up and running, although getting the agent running required a couple of workarounds, which I can disclose if you're interested. We've also managed to bring up monitoring in the manager, but a cluster can't be created; I believe this is because there are no parcels or packages for Jessie.

Thank you.
