This was a 350-point forensics challenge at Santhacklaus 2019 CTF. Its description was as follows:
> "It looks like a naughty developer has been deploying a Docker image on a Santa production server a few days before Christmas.
> He was in a rush and was not able to properly pass all security checks on the built Docker image.
> Would be a shame if this image could give you an SSH access to the production server... http://46.30.204.47"
By browsing the given URL, we come across the following website:
Let’s pull and run the Docker image using the given command `docker run --rm -p 3000:3000 -d santactf/app`.
Now we can get a shell within the container: `docker ps` first to get its id, then `sudo docker exec -u 0 -it de918fe9900e bash`, where `de918fe9900e` is the id we just grabbed.
By inspecting the filesystem, we understand that there is nothing special in it. There is a basic Node.js server running on port 3000, simply rendering a Hello World page (and there are tons of MB of node_modules installed just for this ;-;).
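From the host, this is quickly confirmed (assuming the container from above is still running):

```
# The only exposed service: a Hello World page on port 3000.
curl -s http://localhost:3000
```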
As the description says, in the end we should be able to SSH into the production server which is hosting this container. As there is no other way to interact with this container than the simple webserver, and given that this is a forensics challenge, we should probably look deeper.
Moving on, let’s first look at how this image was built, using the `docker history` command:
```
$ sudo docker history santactf/app
IMAGE          CREATED       CREATED BY                                      SIZE      COMMENT
ddde36e22093   4 days ago    /bin/sh -c #(nop) CMD ["node" "server.js"]     0B
<missing>      4 days ago    /bin/sh -c #(nop) USER node                     0B
<missing>      4 days ago    /bin/sh -c #(nop) COPY file:8b53431519dafa70…   458B
<missing>      4 days ago    /bin/sh -c npm ci                               5.59MB
<missing>      4 days ago    /bin/sh -c #(nop) COPY multi:2f093554c78265f…   12.8kB
<missing>      4 days ago    /bin/sh -c #(nop) WORKDIR /home/node            0B
<missing>      4 days ago    /bin/sh -c #(nop) EXPOSE 3000                   0B
<missing>      4 days ago    /bin/sh -c #(nop) CMD ["node"]                  0B
<missing>      4 days ago    /bin/sh -c #(nop) ENTRYPOINT ["docker-entry…    0B
<missing>      4 days ago    /bin/sh -c #(nop) COPY file:6781e799bed1693e…   116B
<missing>      4 days ago    /bin/sh -c ln -s /usr/local/bin/node /usr/lo…   19B
<missing>      4 days ago    /bin/sh -c ARCH= && dpkgArch="$(dpkg --print…   67.2MB
<missing>      4 days ago    /bin/sh -c #(nop) COPY dir:795933707ce316a31…   18.6kB
<missing>      4 days ago    /bin/sh -c #(nop) ENV NODE_VERSION=12.13.1      0B
<missing>      4 days ago    /bin/sh -c groupadd --gid 1000 node && use…     333kB
<missing>      4 weeks ago   /bin/sh -c set -ex; apt-get update; apt-ge…     562MB
<missing>      4 weeks ago   /bin/sh -c apt-get update && apt-get install…   142MB
<missing>      4 weeks ago   /bin/sh -c set -ex; if ! command -v gpg > /…    7.81MB
<missing>      4 weeks ago   /bin/sh -c apt-get update && apt-get install…   23.2MB
<missing>      4 weeks ago   /bin/sh -c #(nop) CMD ["bash"]                  0B
<missing>      4 weeks ago   /bin/sh -c #(nop) ADD file:152359c10cf61d800…   101MB
```
We see that a bunch of files were manually added to the container filesystem, and some shell commands were executed. Using the `--no-trunc` option to get the full output, we stumble upon one suspicious command:
```
/bin/sh -c ln -s /usr/local/bin/node /usr/local/bin/nodejs && rm /home/node/.bashrc /home/node/.bash_history && rm -rf /usr/share/prod-common
```
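Incidentally, this kind of thing is easy to surface without reading the whole history; a minimal sketch that keeps only the layers whose build command deletes files:

```
# Print every layer's full build command and keep the ones containing "rm " --
# exactly the track-covering we are looking for.
sudo docker history --no-trunc santactf/app | grep 'rm '
```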
It really looks like someone tried to cover his tracks ;).
What if we could retrieve those erased files? After all, they were accessible at an earlier stage of the image build. And since Docker images are made of stacked layers, a file deleted in a later layer is only masked by a whiteout entry: it should still be sitting untouched in an earlier layer.
Let’s try to take a look at the image filesystem from the host. Docker actually has a command for that: `sudo docker save santactf/app -o santactf_app.tar`.
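Note that the saved tarball also ships a `manifest.json` listing the layers in order, which helps map each `layer.tar` back to its `docker history` entry (a quick sketch, assuming `jq` is installed):

```
# Extract just the manifest and print the ordered list of layer tarballs.
tar xf santactf_app.tar manifest.json
jq '.[0].Layers' manifest.json
```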
After fixing permissions (`chmod`) and extracting the tarball, we get 13 directories, each containing three files called `layer.tar`, `json` and `VERSION`.
Extracting the `layer.tar` of the first directory, we find out that it contains pieces of the actual image’s filesystem. By poking around, we come across one directory called `be3d4ffa7682700bcbc51a8655568428c4979c5464169a286208e9e03f7673a5`.
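If poking around by hand gets tedious, a small loop over every `layer.tar` pinpoints the interesting layer directly (a minimal sketch, run from the directory where the tarball was extracted):

```
# Flag every layer that still contains the "deleted" directory.
for layer in */layer.tar; do
    if tar tf "$layer" | grep -q 'usr/share/prod-common'; then
        echo "found in: $layer"
    fi
done
```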
That directory’s `layer.tar` contains the following files:
```
$ tar xvf layer.tar
home/
home/node/
home/node/.bash_history
home/node/.bashrc
usr/
usr/share/
usr/share/prod-common/
usr/share/prod-common/.wh..wh..opq
usr/share/prod-common/dev_081219_backup.zip
usr/share/prod-common/dev_091219_backup.zip
usr/share/prod-common/dev_101219_backup.zip
usr/share/prod-common/dev_111219_backup.zip
usr/share/prod-common/dev_121219_backup.zip
usr/share/prod-common/dev_131219_backup.zip
usr/share/prod-common/dev_141219_backup.zip
usr/share/prod-common/dev_151219_backup.zip
usr/share/prod-common/dev_161219_backup.zip
```
Nice! That looks (com)promising. (The `.wh..wh..opq` entry is an "opaque directory" whiteout marker used by Docker’s layered filesystems, a trace of files being hidden between layers.)
Here is the content of `.bash_history`:
```
exit
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install git make
curl -sL https://deb.nodesource.com/setup_4.x | sudo -E bash -
sudo apt-get install -y nodejs
sudo apt-get install libavahi-compat-libdnssd-dev
sudo npm install -g --unsafe-perm santa2019 hap-nodejs node-gyp
cd /usr/lib/node_modules/santa2019/
cd /usr/lib/node_modules/hap-nodejs/node_modules/mdns
sudo node-gyp BUILDTYPE=Release rebuild
cd ~
santa2019
sudo npm install -g santa2019-nest
cd ~/.santa2019/
ls
cd accessories/
ls
cd ..
cd persist/
ls
cd ..
nano config.json
santa2019
nano config.json
nano config.json
santa2019
cd /etc/default/
sudo nano santa2019
cd /etc/systemd/system/
sudo nano santa2019.service
useradd --system santa2019
sudo useradd --system santa2019
cd /var
mkdir santa2019
sudo mkdir santa2019
cd santa2019/
cd ..
ls -alF
sudo chmod 777 santa2019
ls -alF
systemctl daemon-reload
sudo systemctl daemon-reload
sudo systemctl enable santa2019
sudo systemctl start santa2019
systemctl status santa2019
sudo raspi-config
vncserver
vncpasswd
vncpasswd -type
nano config.json
santa2019
nano config.json
vncpasswd -type Password
vncpasswd -type "Password"
sudo nano /etc/ssh/sshd_config
sudo service restart ssh
sudo service ssh restart
vncserver -kill :1
sudo service vncserver stop
export ARCHIVE_PIN=25362
ls ~/.ssh
cd ~
ssh-keygen -t rsa -C jmding0714@gmail.com
cd ~/.ssh/
ls
santa2019
nano config.json
santa2019
santa2019
ls
nano config.json
santa2019
nano config.json
santa2019
nano config.json
santa2019
pwd
chmod 777 config.json
santa2019
nano config.json
santa2019
nano config.json
santa2019
ifconfig
nano authorized_keys
ls
rm authorized_keys
sudo nano /etc/modprobe.d/raspi-blacklist.conf
reboot
cd ~
ls
mkdir apps
cd apps
crontab -e
ls
./upload_ip_address.sh
nano upload_ip_address.sh
./upload_ip_address.sh
exit
cd /var/santa2019/
ls
cp ~/.santa2019/config.json config.json
nano config.json
reboot
systemctl santa2019 status
systemctl status santa2019
cd /var/santa2019/
ls
chmod 777 config.json
systemctl status santa2019
sudo chown -R santa2019:santa2019 /var/santa2019
vncserver -kill :1
cd $PRODUCTION_DIR
zip --password "$ARCHIVE_PIN" "$PRODUCTION_BACKUP_FILE" id_santa_production*
vncserver -geometry 800x600
sudo chmod 777 -R /var/santa2019
ls /usr/local/bin/
which santa2019
nano /etc/systemd/system/santa2019.service
sudo nano /etc/systemd/system/santa2019.service
reboot
systemctl status santa2019
journalctl santa2019
santa2019
nano config.json
journalctl -u santa2019
exit
nano ~/.ssh/authorized_keys
ssh -p 5700 rudolf-the-reindeer@46.30.204.47
exit
vncserver
raspi-config
sudo raspi-config
vncserver -kill :1
vncserver -geometry 800x600
exit
cd ~/apps/
kls
ls
nano upload_ip_address.sh
rm ip_address.txt
exit
w
write pi pts/0
echo "hi" > /dev/pts/0
exit
sudo apt-get update
.upload_ip_address.sh
./upload_ip_address.sh
ls
rm ip_address.txt
clear
```
We can spot many interesting things here. First, the developer made an encrypted zip with the following command: `zip --password "$ARCHIVE_PIN" "$PRODUCTION_BACKUP_FILE" id_santa_production*`. By scrolling up a bit, we can see that the environment variable `$ARCHIVE_PIN` has been set to `25362`.

Let’s try to get the contents of the zip files with this password:
```
$ ls
dev_081219_backup.zip  dev_101219_backup.zip  dev_121219_backup.zip  dev_141219_backup.zip  dev_161219_backup.zip
dev_091219_backup.zip  dev_111219_backup.zip  dev_131219_backup.zip  dev_151219_backup.zip
$ for z in *.zip; do unzip -P 25362 $z; done
Archive:  dev_081219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_091219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_101219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_111219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_121219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_131219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_141219_backup.zip
  inflating: id_santa_production
  inflating: id_santa_production.pub
Archive:  dev_151219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
Archive:  dev_161219_backup.zip
   skipping: id_santa_production      incorrect password
   skipping: id_santa_production.pub  incorrect password
```
`dev_141219_backup.zip` got extracted! We now have access to an OpenSSH public key, `id_santa_production.pub`, and its corresponding private key, `id_santa_production`.
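As a quick sanity check, `ssh-keygen` can print the recovered key’s size, fingerprint and comment:

```
# Show the bit size, fingerprint and comment of the recovered public key.
ssh-keygen -l -f id_santa_production.pub
```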
In the `.bash_history`, we can also notice that an SSH command was issued: `ssh -p 5700 rudolf-the-reindeer@46.30.204.47`.
Let’s try to connect to this server using the private key we just got:
```
$ chmod 700 id_santa_production*
$ ssh -p 5700 rudolf-the-reindeer@46.30.204.47 -i id_santa_production
Enter passphrase for key 'id_santa_production'
```
Well, the private key is password protected :(.
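One fallback at this point would be to crack the passphrase offline; a hypothetical sketch using John the Ripper (the `ssh2john` helper and the wordlist path are assumptions, and nothing in the challenge actually requires this):

```
# Convert the encrypted private key into a crackable hash, then run a
# dictionary attack against it.
ssh2john id_santa_production > ssh.hash
john --wordlist=/usr/share/wordlists/rockyou.txt ssh.hash
```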
But before reaching for a cracker: as there is no other interesting information in the `.bash_history`, let’s take a look at the `.bashrc` file instead. Here is a reduced version of its content:
```
# ~/.bashrc: executed by bash(1) for non-login shells.
# see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)
...
[Skipped this uninteresting part]
...
# Should make life easier for development
if [ "$NODE_ENV" = "developer_workstation" ]; then
    export PRD_PWD='HoHoHo2020!NorthPole'
fi
...
[Skipped this uninteresting part]
...
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion
```
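In passing: on a bigger dump, grepping the recovered files for secret-looking strings is a faster way to surface this kind of leak (a minimal sketch, run from the extracted layer):

```
# Case-insensitive recursive search for common secret keywords.
grep -riE 'pass|pwd|secret|token' home/node/
```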
Looks like we now have a password! Let’s try it:
```
$ ssh -p 5700 rudolf-the-reindeer@46.30.204.47 -i id_santa_production
Enter passphrase for key 'id_santa_production':
 ___ _ _ _ _____ _ ___ _____ ___
/ __| /_\ | \| |_ _/_\ / __|_ _| __|
\__ \/ _ \| .` | | |/ _ \ | (__ | | | _|
|___/_/ \_\_|\_| |_/_/ \_\ \___| |_| |_|
Well done, the flag is SANTA{NeverTrustDockerImages7263}
You may now log out of this server with "exit"
-bash-5.0$
```
Annnd here it is, our beloved flag.