teradata / docker-images

Docker images used internally by various Teradata projects for automation, testing, etc.

License: Apache License 2.0
The README does not mention that docker-compose is required to build the images.
See 'get webhook' on each layer's page at microbadger.io (e.g. here).
Maybe you'll find it useful, maybe not :)
https://docs.docker.com/engine/userguide/eng-image/multistage-build/
Cheers! :)
I am trying to build on the latest version of Ubuntu.
azureuser@test:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 20.04.2 LTS
Release: 20.04
Codename: focal
And when I run the make commands, I get these errors:
azureuser@test:~$ make teradatalabs/cdh5-base.dependants
make: *** No rule to make target 'teradatalabs/cdh5-base.dependants'. Stop.
azureuser@test:~$ make teradatalabs/cdh5-hive
make: *** No rule to make target 'teradatalabs/cdh5-hive'. Stop.
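"No rule to make target" usually means the Makefile on the checked-out branch simply does not define that target. One way to see which targets it does define is the well-known `make -qp` database dump (a generic sketch; I am not assuming what targets this repo's Makefile actually contains):

```shell
# Dump make's internal rule database without running anything (-q -p),
# then keep lines that look like explicit rule targets.
# The character class deliberately allows '/' so namespaced targets
# such as teradatalabs/cdh5-hive are not filtered out.
make -qp 2>/dev/null \
  | awk -F':' '/^[a-zA-Z0-9][^$#\t=]*:([^=]|$)/ {print $1}' \
  | sort -u
```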
How can I add Spark to the Hive image?
I am unable to build the Teradata docker image using the following steps on Ubuntu 16.04; the build output is:
Total download size: 507 M
Downloading Packages:
https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/RPMS/x86_64/hadoop-mapreduce-2.6.0%2Bcdh5.9.1%2B1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm: [Errno 12] Timeout on https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/RPMS/x86_64/hadoop-mapreduce-2.6.0%2Bcdh5.9.1%2B1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm: (28, 'Operation too slow. Less than 1 bytes/sec transfered the last 30 seconds')
Trying other mirror.
https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/RPMS/x86_64/hadoop-yarn-2.6.0%2Bcdh5.9.1%2B1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm: [Errno 12] Timeout on https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/RPMS/x86_64/hadoop-yarn-2.6.0%2Bcdh5.9.1%2B1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm: (28, 'Connection time-out')
Trying other mirror.
https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/RPMS/noarch/solr-4.10.3%2Bcdh5.9.1%2B460-1.cdh5.9.1.p0.5.el6.noarch.rpm: [Errno 14] PYCURL ERROR 18 - "transfer closed with 21762312 bytes remaining to read"
Trying other mirror.
Error Downloading Packages:
solr-4.10.3+cdh5.9.1+460-1.cdh5.9.1.p0.5.el6.noarch: failure: RPMS/noarch/solr-4.10.3+cdh5.9.1+460-1.cdh5.9.1.p0.5.el6.noarch.rpm from cloudera-cdh5: [Errno 256] No more mirrors to try.
hadoop-yarn-2.6.0+cdh5.9.1+1889-1.cdh5.9.1.p0.5.el6.x86_64: failure: RPMS/x86_64/hadoop-yarn-2.6.0+cdh5.9.1+1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm from cloudera-cdh5: [Errno 256] No more mirrors to try.
hadoop-mapreduce-2.6.0+cdh5.9.1+1889-1.cdh5.9.1.p0.5.el6.x86_64: failure: RPMS/x86_64/hadoop-mapreduce-2.6.0+cdh5.9.1+1889-1.cdh5.9.1.p0.5.el6.x86_64.rpm from cloudera-cdh5: [Errno 256] No more mirrors to try.
The command '/bin/sh -c wget -nv http://archive.cloudera.com/cdh5/one-click-install/redhat/6/x86_64/cloudera-cdh-5-0.x86_64.rpm && yum --nogpgcheck localinstall -y cloudera-cdh-5-0.x86_64.rpm && rm cloudera-cdh-5-0.x86_64.rpm && rpm --import http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/RPM-GPG-KEY-cloudera && sed -i '/^baseurl=/c\baseurl=https://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/5.9.1/' /etc/yum.repos.d/cloudera-cdh5.repo && yum install -y hive && yum install -y python-setuptools && easy_install pip && pip install supervisor && mkdir /etc/supervisord.d/ && wget -nv http://dl.fedoraproject.org/pub/epel/6/x86_64/python-meld3-0.6.7-1.el6.x86_64.rpm && rpm -ihv python-meld3-0.6.7-1.el6.x86_64.rpm && rm python-meld3-0.6.7-1.el6.x86_64.rpm && yum -y clean all && rm -rf /tmp/* /var/tmp/* && mkdir -p /var/log/hadoop && ln -s /var/log/hadoop-hdfs /var/log/hadoop/hdfs && ln -s /var/log/hadoop-mapreduce /var/log/hadoop/mapreduce && ln -s /var/log/hadoop-yarn /var/log/hadoop/yarn' returned a non-zero code: 1
Command exited with non-zero status 1
0.12user 0.01system 5:33.01elapsed 0%CPU (0avgtext+0avgdata 33488maxresident)k
0inputs+0outputs (0major+2967minor)pagefaults 0swaps
Makefile:240: recipe for target 'teradatalabs/cdh5-base@latest' failed
make: *** [teradatalabs/cdh5-base@latest] Error 1
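The [Errno 12] timeouts suggest the Cloudera archive is merely slow from that network, not that the packages are gone. One workaround sketch (the values below are guesses, not tuned settings) is to raise yum's timeout and retry counts inside the image build, before the `yum install` step:

```shell
# Make yum more tolerant of a slow mirror; 'timeout' and 'retries'
# are standard yum.conf [main] options. This is meant to run inside
# the Docker build (as root), before 'yum install -y hive ...'.
echo "timeout=300" >> /etc/yum.conf
echo "retries=20" >> /etc/yum.conf
```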
Hello,
I'm following up on docker/cli#267, opened by @ebd2. You may be interested in dobi, a project I've been working on to automate docker build tasks.
There's an example of building and tagging multiple images here: https://github.com/dnephin/dobi/blob/master/examples/tag-images/dobi.yaml
Using that example you could extend it with the following config to also push multiple images:
alias=push-images:
  tasks: ['app:push', 'db:push']
  description: "Push all the images"
Let me know if you have any questions or feedback about the project.
Feel free to close this issue at any time if you are not interested.