HELK/docker/helk-spark-worker/scripts/spark-worker-entrypoint.sh

#!/bin/bash
# HELK script: spark-worker-entrypoint.sh
# HELK script description: Starts Spark Worker Service
# HELK build Stage: Alpha
# Author: Roberto Rodriguez (@Cyb3rWard0g)
# License: GPL-3.0
# Reference:
# https://github.com/apache/spark/blob/master/sbin/start-slave.sh (Modified to not execute daemon script)
if [ -z "${SPARK_HOME}" ]; then
  export SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"
fi
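# For example (illustrative path, not necessarily where the HELK image installs
# Spark): if this script lives at /opt/spark/sbin/spark-worker-entrypoint.sh,
# SPARK_HOME resolves to /opt/spark.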
# NOTE: This exact class name is matched downstream by SparkSubmit.
# Any changes need to be reflected there.
CLASS="org.apache.spark.deploy.worker.Worker"
#if [[ $# -lt 1 ]] || [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
  echo "Usage: ./sbin/start-slave.sh [options] <master>"
  pattern="Usage:"
  pattern+="\|Using Spark's default log4j profile:"
  pattern+="\|Registered signal handlers for"
  "${SPARK_HOME}"/bin/spark-class $CLASS --help 2>&1 | grep -v "$pattern" 1>&2
  exit 1
fi
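# e.g. invoking the entrypoint with --help prints the Worker class's supported
# options (cores, memory, ports, ...) with the noisy startup lines filtered
# out by the grep above, then exits 1.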
. "${SPARK_HOME}/sbin/spark-config.sh"
. "${SPARK_HOME}/bin/load-spark-env.sh"
# In the upstream start-slave.sh the first argument is the master URL; HELK
# supplies it via the SPARK_MASTER environment variable instead, so the
# original argument handling is kept commented out for reference.
#MASTER=$1
#shift
# Determine desired worker port
if [ "$SPARK_WORKER_WEBUI_PORT" = "" ]; then
  SPARK_WORKER_WEBUI_PORT=8081
fi
if [ "$SPARK_WORKER_PORT" = "" ]; then
  PORT_FLAG=
  PORT_NUM=
else
  PORT_FLAG="--port"
  PORT_NUM="$SPARK_WORKER_PORT"
fi
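# A minimal sketch of how a caller might set these before the entrypoint runs
# (values are illustrative, not HELK defaults):
#
#   export SPARK_WORKER_PORT=7078        # fixed RPC port -> adds "--port 7078"
#   export SPARK_WORKER_WEBUI_PORT=8082  # override the 8081 web UI default
#   export SPARK_MASTER=spark://helk-spark-master:7077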
"${SPARK_HOME}"/bin/spark-class $CLASS \
  --webui-port "$SPARK_WORKER_WEBUI_PORT" $PORT_FLAG $PORT_NUM $SPARK_MASTER
# The block below is kept (commented out) from the upstream start-slave.sh for
# reference: it starts the appropriate number of workers on one machine via a
# quick local function, which this entrypoint does not use.
#function start_instance {
#  WORKER_NUM=$1
#  shift
#  if [ "$SPARK_WORKER_PORT" = "" ]; then
#    PORT_FLAG=
#    PORT_NUM=
#  else
#    PORT_FLAG="--port"
#    PORT_NUM=$(( $SPARK_WORKER_PORT + $WORKER_NUM - 1 ))
#  fi
#  WEBUI_PORT=$(( $SPARK_WORKER_WEBUI_PORT + $WORKER_NUM - 1 ))
#  $SPARK_HOME/bin/spark-class $CLASS $WORKER_NUM \
#    --webui-port "$WEBUI_PORT" $PORT_FLAG $PORT_NUM $MASTER "$@"
#}
#if [ "$SPARK_WORKER_INSTANCES" = "" ]; then
#  start_instance 1 "$@"
#else
#  for ((i=0; i<$SPARK_WORKER_INSTANCES; i++)); do
#    start_instance $(( 1 + $i )) "$@"
#  done
#fi
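# Usage sketch (image name and values assumed for illustration, not taken from
# the HELK compose files):
#
#   docker run -e SPARK_MASTER=spark://helk-spark-master:7077 \
#     -p 8081:8081 cyb3rward0g/helk-spark-worker
#
# The entrypoint then blocks in the foreground on spark-class until the worker
# process exits, which keeps the container alive.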