So what I want to do is use "COPY script.sh script.sh" (copy the script from the host into the container and execute it there), but when the script is executed inside the container, it seems the script is also being executed on the host.
Here is the Dockerfile:
FROM almalinux/almalinux:latest
RUN mkdir /opt/confluent
RUN mkdir /opt/confluent-hub
#Confluent Home
ENV CONFLUENT_HOME=/opt/confluent
ENV KAFKA_CONFIG=$KAFKA_CONFIG
ENV ZOOKEEPER_CONFIG=$ZOOKEEPER_CONFIG
ENV SCHEMA_REGISTRY_CONFIG=$SCHEMA_REGISTRY_CONFIG
ENV CONNECT_CONFIG=$CONNECT_CONFIG
# Zookeeper
ENV ZOOKEEPER_DATA_DIR=$ZOOKEEPER_DATA_DIR
ENV ZOOKEEPER_CLIENT_PORT=$ZOOKEEPER_CLIENT_PORT
#Kafka
ENV BOOTSTRAP_SERVERS=$BOOTSTRAP_SERVERS
ENV KAFKA_SERVER_BROKER_ID=$KAFKA_SERVER_BROKER_ID
ENV ZOOKEEPER_CONNECT_IP_PORT=$ZOOKEEPER_CONNECT_IP_PORT
ENV KAFKA_SERVER_LOG_DIR=$KAFKA_SERVER_LOG_DIR
# schema registry
ENV KAFKASTORE_TOPIC=$KAFKASTORE_TOPIC
ENV PROTOCOL_BOOTSTRAP_SERVERS=$PROTOCOL_BOOTSTRAP_SERVERS
ENV SCHEMA_REGISTRY_GROUP_ID=$SCHEMA_REGISTRY_GROUP_ID
ENV SCHEMA_REGISTRY_LEADER_ELIGIBILITY=$SCHEMA_REGISTRY_LEADER_ELIGIBILITY
# Kafka connect
ENV CONNECT_REST_PORT=$CONNECT_REST_PORT
ENV CONNECT_OFFSETS=$CONNECT_OFFSETS
ENV CONNECT_KEY_CONVERTER=$CONNECT_KEY_CONVERTER
ENV SCHEMA_REGISTRY_URL=$SCHEMA_REGISTRY_URL
ENV CONNECT_VALUE_CONVERTER=$CONNECT_VALUE_CONVERTER
ENV SCHEMA_REGISTRY_LISTENER=$SCHEMA_REGISTRY_LISTENER
ENV CONNECT_PLUGIN_PATH=/usr/share/java/,$CONFLUENT_HOME/share/confluent-hub-components/
# install openjdk8
RUN dnf update -y && dnf install epel-release -y
RUN dnf install wget zip moreutils gettext unzip java-1.8.0-openjdk.x86_64 -y
# install confluent
WORKDIR $CONFLUENT_HOME
RUN wget https://packages.confluent.io/archive/6.1/confluent-community-6.1.1.tar.gz -P .
RUN tar -xvzf confluent-community-6.1.1.tar.gz
RUN mv confluent-6.1.1/* .
RUN rm -rf confluent-6.1.1 confluent-community-6.1.1.tar.gz
# install confluent hub
RUN wget http://client.hub.confluent.io/confluent-hub-client-latest.tar.gz -P /opt/confluent-hub
WORKDIR /opt/confluent-hub
RUN tar -xvzf confluent-hub-client-latest.tar.gz
RUN rm -rf confluent-hub-client-latest.tar.gz
ENV CONFLUENT_HUB /opt/confluent-hub/bin
# Export path
ENV PATH $PATH:$CONFLUENT_HOME:$CONFLUENT_HUB
# install jdbc connector
COPY confluentinc-kafka-connect-jdbc-10.1.0.zip $CONFLUENT_HOME/share/confluent-hub-components/
RUN unzip $CONFLUENT_HOME/share/confluent-hub-components/confluentinc-kafka-connect-jdbc-10.1.0.zip
RUN rm -rf confluentinc-kafka-connect-jdbc-10.1.0.zip
# Copy confluent config into the image
WORKDIR $CONFLUENT_HOME
COPY config/* config/
# startup
COPY startup.sh ./startup.sh
RUN chmod +x ./startup.sh
CMD ./startup.sh
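I build and start everything through docker-compose (file further below); a minimal invocation would be something like:

docker-compose build confluent-community
docker-compose up confluent-community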
Below is startup.sh, which substitutes the environment variables into the config files and then starts the Kafka services. But when it runs inside the container, the script is replacing the values in the config files on the host:
#!/bin/bash
# Substitute environment variables in the actual $CONFLUENT_HOME/config files
envsubst < $CONFLUENT_HOME/config/zookeeper.properties | sponge $CONFLUENT_HOME/config/zookeeper.properties
envsubst < $CONFLUENT_HOME/config/server.properties | sponge $CONFLUENT_HOME/config/server.properties
envsubst < $CONFLUENT_HOME/config/schema-registry.properties | sponge $CONFLUENT_HOME/config/schema-registry.properties
envsubst < $CONFLUENT_HOME/config/connect-avro-standalone.properties | sponge $CONFLUENT_HOME/config/connect-avro-standalone.properties
# start zookeeper
$CONFLUENT_HOME/bin/zookeeper-server-start -daemon $ZOOKEEPER_CONFIG
sleep 2
# start kafka broker
$CONFLUENT_HOME/bin/kafka-server-start -daemon $KAFKA_CONFIG
sleep 2
# start schema registry
$CONFLUENT_HOME/bin/schema-registry-start -daemon $SCHEMA_REGISTRY_CONFIG
sleep 2
# start kafka connect
$CONFLUENT_HOME/bin/connect-standalone -daemon $CONNECT_CONFIG $CONFLUENT_HOME/etc/kafka/connect-file-sink.properties
sleep 2
while :
do
echo "Confluent Running "
sleep 5
done
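To illustrate what the envsubst | sponge lines do, here is a minimal standalone sketch (the file path and variable below are made up purely for illustration and are not part of the real setup):

# hypothetical example: rewrite a properties file in place with the
# current value of an exported environment variable
export ZOOKEEPER_CLIENT_PORT=2181
echo 'clientPort=${ZOOKEEPER_CLIENT_PORT}' > /tmp/example.properties
envsubst < /tmp/example.properties | sponge /tmp/example.properties
cat /tmp/example.properties   # prints: clientPort=2181

sponge (from moreutils) buffers all of stdin before writing, which is what makes it safe to read from and write to the same file in a single pipeline.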
And here is the docker-compose file:
version: "3.9"
services:
  confluent-community:
    build: ./
    environment:
      - KAFKA_CONFIG=$CONFLUENT_HOME/config/server.properties
      - ZOOKEEPER_CONFIG=$CONFLUENT_HOME/config/zookeeper.properties
      - SCHEMA_REGISTRY_CONFIG=$CONFLUENT_HOME/config/schema-registry.properties
      - CONNECT_CONFIG=$CONFLUENT_HOME/config/connect-avro-standalone.properties
      - CONNECT_REST_PORT=8083
      - CONNECT_OFFSETS=$CONFLUENT_HOME/data/connect/connect.offsets
      - CONNECT_KEY_CONVERTER=io.confluent.connect.avro.AvroConverter
      - SCHEMA_REGISTRY_URL=http://localhost:8081
      - CONNECT_VALUE_CONVERTER=io.confluent.connect.avro.AvroConverter
      - SCHEMA_REGISTRY_LISTENER=http://0.0.0.0:8081
      - KAFKASTORE_TOPIC=_schemas
      - SCHEMA_REGISTRY_GROUP_ID=SCHEMA_REGISTRY_A
      - SCHEMA_REGISTRY_LEADER_ELIGIBILITY=true
      - PROTOCOL_BOOTSTRAP_SERVERS=PLAINTEXT://localhost:9092
      - ZOOKEEPER_DATA_DIR=$CONFLUENT_HOME/data/zookeeper
      - ZOOKEEPER_CLIENT_PORT=2181
      - BOOTSTRAP_SERVERS=localhost:9092
      - KAFKA_SERVER_BROKER_ID=0
      - ZOOKEEPER_CONNECT_IP_PORT=localhost:2181
      - KAFKA_SERVER_LOG_DIR=$CONFLUENT_HOME/data/kafka-logs
    # ports:
    #   - "9092:9092"
    #   - "8081:8081"
    #   - "8083:8083"
    network_mode: "host"
    volumes:
      - ~/Documents/confluent/docker-logs:/opt/confluent/logs
      - ~/Documents/confluent/config:/opt/confluent/config
      - ~/Documents/confluent/docker-data:/opt/confluent/data
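For reference, the volumes section above corresponds to plain bind mounts, i.e. roughly the same as running the image directly with (the image name here is just a placeholder):

docker run --network host \
  -v ~/Documents/confluent/docker-logs:/opt/confluent/logs \
  -v ~/Documents/confluent/config:/opt/confluent/config \
  -v ~/Documents/confluent/docker-data:/opt/confluent/data \
  confluent-community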