Distributing source code through image hosting sites - youtube-dl
Also hosted on hashify
Here are some more workarounds for sharing things.
method 1: The complementary .zip and .gif formats
.zip files are read from the end and can have arbitrary bytes stored at the beginning. .gif files are read from the beginning and can have arbitrary bytes stored at the end. Therefore it's simple enough to concatenate a .gif and a .zip file (in that order) and have both continue to work as expected: unzip it and you get the archive's contents; view it and you see the .gif.
This image of scissors, for example, includes the entire source code for youtube-dl. Just download and unzip it, or run
curl https://i.ibb.co/QbkMm29/scissors-enc.gif --output scissors.gif && unzip scissors.gif
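Building such a polyglot yourself is just a concatenation. A minimal sketch in Python, where a hand-rolled 1x1 GIF stands in for a real picture and the filename and payload are made up for illustration:

```python
import io
import zipfile

# A hand-rolled 1x1 black GIF, standing in for a real image.
gif_bytes = (
    b"GIF89a"                                 # header
    b"\x01\x00\x01\x00\x80\x00\x00"           # 1x1 screen, 2-color palette
    b"\x00\x00\x00\xff\xff\xff"               # palette: black, white
    b",\x00\x00\x00\x00\x01\x00\x01\x00\x00"  # image descriptor
    b"\x02\x02\x44\x01\x00"                   # LZW-compressed pixel data
    b";"                                      # trailer
)

# Build a zip archive in memory with one file inside.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("hello.txt", "smuggled payload")

# Concatenate: gif first, zip second -- both formats stay valid.
with open("polyglot.gif", "wb") as f:
    f.write(gif_bytes + buf.getvalue())

# zipfile locates the central directory at the END of the file, so the
# leading gif bytes don't get in the way of extraction.
with zipfile.ZipFile("polyglot.gif") as z:
    print(z.read("hello.txt").decode())  # smuggled payload
```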
(Of course, you can just upload the .zip directly, but a lot of free hosts allow you to upload gifs but not zips. Just note that they might re-encode or compress your images; imgur, for example, will probably turn it into an .mp4, and then your .zip is gone.)
method 2: Steganography
This one is a bit more involved and requires some Python dependencies that can be hard to install, but it produces an image that less obviously contains code.
The script below extracts the youtube-dl source from this image (look familiar? It's a .png instead of a .gif now, with no zip file appended).
The payload was embedded by encoding a base64 version of the .zip into the least significant bit of each pixel of the image. To the human eye you can't see the difference, but each pixel is slightly altered from the original image.
from stegano import lsb
import base64
import requests
from zipfile import ZipFile

# download the stego image
r = requests.get("https://i.ibb.co/cDvsnDK/scissorsenc.png")
with open("scissorsenc.png", "wb") as f:
    f.write(r.content)

# recover the base64 payload hidden in the pixel LSBs
s = lsb.reveal("scissorsenc.png")
with open("youtube-dl.zip", "wb") as f:
    f.write(base64.b64decode(s))

with ZipFile("youtube-dl.zip") as z:
    z.extractall()
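The bit-twiddling behind LSB steganography can be sketched in plain Python, with no image libraries: here `pixels` is just a flat list of ints standing in for real pixel data, and the function names are illustrative, not stegano's API.

```python
def lsb_hide(pixels, payload: bytes):
    """Store each payload bit in the least significant bit of one pixel."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for image")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # clear the LSB, then set it to the bit
    return out

def lsb_reveal(pixels, n_bytes: int) -> bytes:
    """Read n_bytes back out of the pixel LSBs."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k : k + 8]))
        for k in range(0, len(bits), 8)
    )

pixels = list(range(200))    # stand-in for real pixel data
stego = lsb_hide(pixels, b"hi")
print(lsb_reveal(stego, 2))  # b'hi'
```

Each pixel value changes by at most 1, which is why the altered image looks identical to the original.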
With a bit more work, you could upload these images to dozens or hundreds of image sites, and write code that kept searching through a set of known URLs until it found one that hadn't been taken down yet.
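That search loop could look something like this. The mirror URLs are made-up placeholders, and the `fetch` parameter is an assumption of this sketch so the strategy can be exercised without a network:

```python
import urllib.request

def fetch_first_alive(urls, fetch=None, timeout=10):
    """Return the content of the first URL that still answers.

    `fetch` is injectable for testing; by default it does a plain HTTP GET.
    urllib's URLError subclasses OSError, so one except clause covers both
    network failures and HTTP errors.
    """
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
    for url in urls:
        try:
            return fetch(url)
        except OSError:
            continue  # taken down or unreachable; try the next mirror
    raise RuntimeError("every known mirror is down")

# Hypothetical mirror list -- none of these are real.
MIRRORS = [
    "https://example.com/scissors.gif",
    "https://example.net/scissors.gif",
    "https://example.org/scissors.gif",
]
```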
This might be against the terms of service of many sites though, so maybe read those.