Painter

EH4X CTF by smothy

Painter - Forensics (500pts)


There was a painter working for a security company who attempted to deceive one of the employees. To do so, he said...


First Look - What We Got

We get a .pcap file. Classic forensics moment. Time to fire up the tools and see what's inside.

$ file pref.pcap
pref.pcap: pcapng capture file - version 1.0

Let's check what protocols we're dealing with:

$ tshark -r pref.pcap -q -z io,phs

Protocol Hierarchy Statistics
frame                 frames:9888 bytes:702048
  usb                 frames:9888 bytes:702048

USB only. No network traffic, no HTTP, no TCP. Just raw USB data. 9888 packets of it.

Me seeing USB forensics: "Ah shit, here we go again" - CJ, probably


Understanding USB HID Data

USB Human Interface Devices (mice, keyboards, tablets) send small data packets for every input event. Since the challenge is about a painter, this is likely mouse movement data - someone literally drew something with their mouse.

Each packet is 7 bytes:

01 00 FF FF F5 FF 00
│  │  └─┬─┘ └─┬─┘ │
│  │    │     │   └─ Wheel (unused)
│  │    │     └───── Y movement (signed 16-bit LE)
│  │    └─────────── X movement (signed 16-bit LE)
│  └──────────────── Padding
└─────────────────── Button state (0x01 = left click held)

Key insight: The button state is 0x01 for ALL packets = left mouse button held down the entire time. The painter was click-dragging continuously - like drawing in MS Paint!
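Decoding one of these packets by hand confirms the layout. A quick standalone sketch using Python's struct module (the sample packet is the one from the diagram above):

```python
import struct

# Decode a single 7-byte HID report from the capture.
packet = bytes.fromhex("0100fffff5ff00")

buttons = packet[0]                       # 0x01 = left button held
dx = struct.unpack('<h', packet[2:4])[0]  # X delta, signed 16-bit LE
dy = struct.unpack('<h', packet[4:6])[0]  # Y delta, signed 16-bit LE

print(buttons, dx, dy)  # 1 -1 -11
```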

If you want to learn more about USB HID descriptors: https://wiki.wireshark.org/USB https://usb.org/document-library/device-class-definition-hid-111


Step 1: Extract the Mouse Data

```bash
$ tshark -r pref.pcap -T fields -e usb.capdata > mouse_data.txt
$ wc -l mouse_data.txt
9888 mouse_data.txt
```

Quick sanity check on the data:

01000000fdff00 -> buttons=01, dx=0,  dy=-3
0100fffff5ff00 -> buttons=01, dx=-1, dy=-11
0100fefff3ff00 -> buttons=01, dx=-2, dy=-13

Looks like small relative mouse movements. We need to sum them up to get absolute positions and plot the path!


Step 2: Plot the Drawing

```python
import struct
from PIL import Image, ImageDraw

# Skip any blank lines tshark emits for packets without capdata
lines = [l for l in open("mouse_data.txt").read().split("\n") if l.strip()]

x, y = 0, 0
points = []

for line in lines:
    data = bytes.fromhex(line.strip())
    # bytes 2-3: X movement (signed 16-bit LE)
    dx = struct.unpack('<h', data[2:4])[0]
    # bytes 4-5: Y movement (signed 16-bit LE)
    dy = struct.unpack('<h', data[4:6])[0]
    x += dx
    y += dy
    points.append((x, y))

# Calculate bounds and create image
xs = [p[0] for p in points]
ys = [p[1] for p in points]
min_x, max_x = min(xs), max(xs)
min_y, max_y = min(ys), max(ys)

pad = 20
img = Image.new("RGB",
    (max_x - min_x + 2*pad, max_y - min_y + 2*pad), "white")
draw = ImageDraw.Draw(img)

for i in range(1, len(points)):
    x0 = points[i-1][0] - min_x + pad
    y0 = points[i-1][1] - min_y + pad
    x1 = points[i][0] - min_x + pad
    y1 = points[i][1] - min_y + pad
    draw.line([(x0, y0), (x1, y1)], fill="black", width=2)

img.save("mouse_drawing.png")
```

Result:

A messy drawing with a speech bubble and handwritten text! But since the mouse button is ALWAYS held down, we get unwanted lines connecting separate strokes, drawn whenever the painter moved between characters.

The drawing looking like my doctor's prescription fr fr 💀


Step 3: Clean It Up - Stroke Segmentation

The trick is: when the painter moves between characters, the mouse moves FAST. When actually drawing, it moves SLOW. We can split strokes by velocity threshold!

```python
# If movement > threshold, it's a "pen lift" (transition between strokes)
threshold = 15

strokes = []
current_stroke = [points[0]]

for i in range(1, len(points)):
    dx = points[i][0] - points[i-1][0]
    dy = points[i][1] - points[i-1][1]
    velocity = abs(dx) + abs(dy)  # Manhattan distance between consecutive points
    if velocity > threshold:
        if len(current_stroke) > 1:
            strokes.append(current_stroke)
        current_stroke = [points[i]]
    else:
        current_stroke.append(points[i])

# Don't forget the final stroke
if len(current_stroke) > 1:
    strokes.append(current_stroke)
```

This gives us 14 separate strokes - individual characters and shapes!
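One way to sanity-check the threshold (not part of the original solve) is to bucket the per-packet Manhattan distances: a good cutoff sits in the gap between slow "drawing" packets and fast "pen lift" packets. A toy sketch with made-up points:

```python
from collections import Counter

# Toy data standing in for the accumulated positions: three small
# drawing moves and one big jump between characters.
points = [(0, 0), (1, 2), (2, 3), (40, 50), (41, 51)]

velocities = [abs(points[i][0] - points[i-1][0]) +
              abs(points[i][1] - points[i-1][1])
              for i in range(1, len(points))]

# Count how many moves fall on each side of the threshold
buckets = Counter("slow" if v <= 15 else "fast" for v in velocities)
print(buckets)  # 3 slow drawing moves, 1 fast pen lift
```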


Step 4: Color-Code the Strokes

By assigning different colors to each stroke, the text becomes MUCH more readable:

```python
import colorsys

for idx, stroke in enumerate(strokes):
    hue = idx / len(strokes)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 0.8)
    color = (int(r*255), int(g*255), int(b*255))
    # draw each stroke in its own color...

Now we can clearly see:

| Part | Color | Content |
| --- | --- | --- |
| Speech bubble | Green + cyan ovals | The "o" letters |
| Top line | Blue (main stroke) | wh4t c0l0ur |
| Bottom line | Pink/red | 15 th3 fl4g |
| Bubble tail | Late strokes | Arrow pointing down |

Step 5: Read the Message

Top line: wh4t c0l0ur

The largest stroke draws: w-h-4-t then space then c-0-l-0-u-r

It's leetspeak! 4 = a, 0 = o → "what colour"

Bottom line: 15 th3 fl4g

Breaking it down character by character (chunk-by-chunk temporal analysis):

| Chunk (packets) | Shape | Character |
| --- | --- | --- |
| 0-300 | dot + vertical stroke | 1 |
| 300-600 | S-curve | 5 |
| 600-1500 | vertical + arch + curve | t, h |
| 1500-2700 | angular 3 shape | 3 |
| 2700-3600 | ovals + angular | f, l, 4 |
| 3600-4365 | vertical + circle + curve | g |

Leetspeak decode: 1=i, 5=s, 3=e, 4=a → "is the flag"
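The substitution is simple enough to automate. A throwaway helper (not part of the original solve) using `str.translate`:

```python
# Map the leetspeak digits used in the drawing back to letters:
# 4=a, 5=s, 3=e, 1=i, 0=o
LEET = str.maketrans("45310", "aseio")

print("wh4t c0l0ur 15 th3 fl4g".translate(LEET))  # what colour is the flag
```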


The Full Message

The painter drew a speech bubble containing:

wh4t c0l0ur 15 th3 fl4g

Decoded: "What colour is the flag?"

A painter at a security company asking "what colour is the flag?" is classic social engineering: pretending to need the information for a painting job while actually trying to extract security details.

Social engineers be like: "I'm just a painter bro, what colour is your security badge? For... painting reasons" 🎨🕵️


Flag

EH4X{wh4t_c0l0ur_15_th3_fl4g}

TL;DR for the Speedrunners

  1. PCAP has USB mouse data (7-byte HID packets)
  2. Extract mouse movements, accumulate X/Y positions
  3. Plot the path → messy handwritten text in a speech bubble
  4. Segment strokes by velocity threshold to separate characters
  5. Color-code strokes to distinguish overlapping text
  6. Read the leetspeak: wh4t c0l0ur 15 th3 fl4g

Tools Used

  • tshark - extract USB data from PCAP
  • Python + Pillow (PIL) - plot mouse movements
  • struct module - parse binary USB HID data
  • Patience and squinting at terrible mouse handwriting 👀

Lessons Learned

  • USB forensics is a real thing in DFIR (Digital Forensics & Incident Response). Keyloggers, mouse recorders, and USB device artifacts are common evidence sources.
  • HID protocol knowledge is useful - mice, keyboards, and game controllers all use standard report formats.
  • Velocity-based segmentation is a legit technique used in handwriting recognition and digital ink analysis.
  • Social engineering is still the #1 attack vector. Even a painter can be a threat actor! 🎨

"The best hackers don't hack computers, they hack people" - Every cybersec talk ever


Writeup by a sleep-deprived CTF player who spent way too long squinting at mouse scribbles ✌️