This repo contains a framework for messing with images in a variety of ways, in an attempt to make DroneDeploy's photogrammetry engine do weird things.
This doesn't really have much practical purpose, and was really just intended to be a fun experiment.
- Clone this repo
- Create and activate a virtual environment

```bash
python3 -m venv venv
source venv/bin/activate
```

- Install the requirements

```bash
pip install -r requirements.txt
```
For all options, run:

```bash
python3 image_processing.py --help
```
This experiment removes the red, green, or blue color band(s) from an image.
For example, to keep the red and green bands and remove the blue band from a set of images (yielding yellow images), run the following command:
```bash
python3 image_processing.py --experiment color-bands \
  --input <directory with images> --output <output directory> \
  --bands-to-keep rg
```
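Per pixel, dropping a band just means zeroing that channel. A minimal sketch of the idea (my assumption of how it works, not the repo's actual implementation, which presumably operates on whole image files):

```python
def keep_bands(pixel, bands_to_keep):
    """Zero out any RGB channel not named in bands_to_keep (e.g. 'rg')."""
    return tuple(
        value if band in bands_to_keep else 0
        for band, value in zip("rgb", pixel)
    )

# Keeping red and green turns a white pixel yellow.
print(keep_bands((255, 255, 255), "rg"))  # (255, 255, 0)
```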
This experiment converts a certain percentage of input images to monochrome (black and white).
For example, to convert 50% of images to monochrome, run the following command:
```bash
python3 image_processing.py --experiment monochrome \
  --input <directory with images> --output <output directory> \
  --percentage 50
```
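The two ingredients here are picking a random subset of the inputs and converting those to gray. A sketch under my own assumptions (the weights below are the ITU-R BT.601 luma weights that Pillow's "L" mode uses; the repo's actual conversion may differ):

```python
import random

def to_mono(pixel):
    """Convert an RGB pixel to gray using ITU-R BT.601 luma weights."""
    r, g, b = pixel
    luma = round(0.299 * r + 0.587 * g + 0.114 * b)
    return (luma, luma, luma)

def pick_subset(paths, percentage, seed=None):
    """Randomly choose roughly percentage% of the inputs to convert."""
    rng = random.Random(seed)
    k = round(len(paths) * percentage / 100)
    return set(rng.sample(paths, k))

paths = [f"img_{i:03}.jpg" for i in range(100)]
print(len(pick_subset(paths, 50, seed=1)))  # 50
print(to_mono((255, 0, 0)))  # (76, 76, 76)
```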
This experiment flips and mirrors certain percentages of input images.
For example, to flip 50% of images upside-down, and mirror 25% of images horizontally, run the following command:
```bash
python3 image_processing.py --experiment inverted \
  --input <directory with images> --output <output directory> \
  --flip 50 --mirror 25
```
Note: This will result in roughly the following distribution (this is somewhat random):
- 35% of the images will be flipped upside-down
- 15% of the images will be mirrored horizontally
- 10% of the images will be both flipped upside-down and mirrored horizontally
- 40% of the images will be left as-is
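On a raw pixel grid, the two operations are simply row and column reversals. A toy sketch of the transforms themselves (assumed for illustration; the repo presumably uses an imaging library):

```python
def flip_upside_down(rows):
    """Flip an image, represented as a list of pixel rows, vertically."""
    return rows[::-1]

def mirror_horizontally(rows):
    """Mirror each row left-to-right."""
    return [row[::-1] for row in rows]

image = [[1, 2], [3, 4]]
print(flip_upside_down(image))     # [[3, 4], [1, 2]]
print(mirror_horizontally(image))  # [[2, 1], [4, 3]]
```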
This experiment adds random noise to images. The percentage of noise is controlled by the --noise-level parameter, which is an integer between 0 and 100.
For example, to add 25% noise to images, run the following command:
```bash
python3 image_processing.py --experiment noise \
  --input <directory with images> --output <output directory> \
  --noise-level 25
```
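One common way to implement this (assumed here for illustration; the repo may do something different) is per pixel: with probability --noise-level/100, replace the pixel with a random colour.

```python
import random

def add_noise(pixel, noise_level, rng):
    """With probability noise_level/100, replace pixel with a random RGB."""
    if rng.random() < noise_level / 100:
        return tuple(rng.randrange(256) for _ in range(3))
    return pixel

rng = random.Random(0)
pixels = [(0, 0, 0)] * 10_000
noisy = [add_noise(p, 25, rng) for p in pixels]
changed = sum(p != (0, 0, 0) for p in noisy)
print(changed / len(noisy))  # roughly 0.25
```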
This experiment warps the perspective of a certain percentage of input images by randomly selecting 4 points on the image and warping them by a provided --warp-by percentage.
For example, to warp 75% of images by 10% (significant), run the following command:
```bash
python3 image_processing.py --experiment perspective \
  --input <directory with images> --output <output directory> \
  --percentage 75 --warp-by 10
```
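Choosing the warped points can be sketched as jittering the four image corners by up to --warp-by percent of each dimension; the jittered corners would then feed a perspective transform in an imaging library. This sketch (and the function name) is my assumption, not the repo's code:

```python
import random

def jitter_corners(width, height, warp_by, rng):
    """Move each image corner by up to warp_by% of each dimension."""
    dx, dy = width * warp_by / 100, height * warp_by / 100
    corners = [(0, 0), (width, 0), (width, height), (0, height)]
    return [
        (x + rng.uniform(-dx, dx), y + rng.uniform(-dy, dy))
        for x, y in corners
    ]

warped = jitter_corners(1000, 750, 10, random.Random(42))
print(len(warped))  # 4 jittered corner points
```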
This experiment tilts a certain percentage of input images by a random angle between a specified minimum and maximum, in degrees.
For example, to tilt 75% of images by a random angle between -10 and 10 degrees, run the following command:
```bash
python3 image_processing.py --experiment tilt \
  --input <directory with images> --output <output directory> \
  --percentage 75 --max-tilt 10
```
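Tilting by an angle θ amounts to rotating every pixel coordinate around the image centre. A sketch of the underlying math, assuming the angle is drawn uniformly from [-max_tilt, +max_tilt] (an assumption; the repo's sampling may differ):

```python
import math
import random

def random_tilt_angle(max_tilt, rng):
    """Pick a tilt uniformly between -max_tilt and +max_tilt degrees."""
    return rng.uniform(-max_tilt, max_tilt)

def rotate_point(x, y, cx, cy, degrees):
    """Rotate (x, y) around the centre (cx, cy) by the given angle."""
    theta = math.radians(degrees)
    dx, dy = x - cx, y - cy
    return (
        cx + dx * math.cos(theta) - dy * math.sin(theta),
        cy + dx * math.sin(theta) + dy * math.cos(theta),
    )

angle = random_tilt_angle(10, random.Random(0))
x, y = rotate_point(1, 0, 0, 0, 90)
print(round(x, 9), round(y, 9))  # 0.0 1.0
```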
This experiment edits or removes GPS data from a certain percentage of input images.
For example, to remove GPS data from 50% of images, run the following command:
```bash
python3 image_processing.py --experiment set-gps \
  --input <directory with images> --output <output directory> \
  --percentage 50
```
To set the GPS data of 100% of images to 44.7471° N, 85.54547° W, run the following command:
```bash
python3 image_processing.py --experiment set-gps \
  --input <directory with images> --output <output directory> \
  --percentage 100 --lat 44.7471 --lng -85.54547
```
To set the GPS data of 100% of images to 44.7471° N, 85.54547° W, randomly moved within a 50 meter radius, run the following command:
```bash
python3 image_processing.py --experiment set-gps \
  --input <directory with images> --output <output directory> \
  --percentage 100 --lat 44.7471 --lng -85.54547 --max-wiggle 50
```
Note: When --lat and --lng are specified, the altitude is passed through unchanged. Future work may include the ability to modify the altitude.
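The --max-wiggle offset can be sketched as picking a uniform random point within a disc of that radius and converting metres to degrees. The conversion constant and the function below are my assumptions for illustration, not necessarily the repo's code:

```python
import math
import random

METRES_PER_DEG_LAT = 111_320  # approximate; varies slightly with latitude

def wiggle(lat, lng, max_wiggle_m, rng):
    """Move (lat, lng) to a uniform random point within max_wiggle_m metres."""
    r = max_wiggle_m * math.sqrt(rng.random())  # sqrt => uniform over the disc
    theta = rng.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / METRES_PER_DEG_LAT
    dlng = (r * math.sin(theta)) / (METRES_PER_DEG_LAT * math.cos(math.radians(lat)))
    return lat + dlat, lng + dlng

new_lat, new_lng = wiggle(44.7471, -85.54547, 50, random.Random(0))
# Both offsets stay within ~50 m of the original coordinate.
print(abs(new_lat - 44.7471) < 0.0005, abs(new_lng + 85.54547) < 0.0007)  # True True
```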
This experiment removes the pose metadata from a certain percentage of input images.
Specifically, these fields are removed from the EXIF data:
- Orientation: Horizontal (normal)
- Gimbal Degree
- Gimbal Roll Degree
- Gimbal Yaw Degree
- Gimbal Pitch Degree
- Flight Degree
- Flight Roll Degree
- Flight Yaw Degree
- Flight Pitch Degree
For example, to remove pose data from 50% of images, run the following command:
```bash
python3 image_processing.py --experiment no-pose \
  --input <directory with images> --output <output directory> \
  --percentage 50
```
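Conceptually this experiment is just filtering a tag dictionary; real EXIF/XMP writing would go through a tool like exiftool or a library like piexif. A toy sketch, with tag names invented to mirror the list above:

```python
# Hypothetical tag names for illustration; real DJI metadata keys may differ.
POSE_TAGS = {
    "Orientation",
    "GimbalDegree", "GimbalRollDegree", "GimbalYawDegree", "GimbalPitchDegree",
    "FlightDegree", "FlightRollDegree", "FlightYawDegree", "FlightPitchDegree",
}

def strip_pose(exif):
    """Return a copy of the EXIF tag dict with pose-related tags removed."""
    return {tag: value for tag, value in exif.items() if tag not in POSE_TAGS}

exif = {"Orientation": 1, "GimbalPitchDegree": -89.9, "FocalLength": 4.5}
print(strip_pose(exif))  # {'FocalLength': 4.5}
```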
This experiment adds random timestamps to images, between two provided dates.
For example, to add random timestamps between 2024-01-01 and 2024-01-02 to all the images, run the following command:
```bash
python3 image_processing.py --experiment timestamp \
  --input <directory with images> --output <output directory> \
  --start-date 2024-01-01 --end-date 2024-01-02
```
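Drawing a uniform random timestamp between two dates is a one-liner over the interval's length in seconds. A sketch (assumed; the repo's implementation may differ):

```python
import random
from datetime import datetime, timedelta

def random_timestamp(start, end, rng):
    """Pick a datetime uniformly between start and end."""
    span = (end - start).total_seconds()
    return start + timedelta(seconds=rng.uniform(0, span))

start, end = datetime(2024, 1, 1), datetime(2024, 1, 2)
ts = random_timestamp(start, end, random.Random(0))
print(start <= ts <= end)  # True
```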
There is support for automatically uploading modified images to DroneDeploy for processing afterwards. To use this, set the DD_API_KEY environment variable to your DroneDeploy API key, and run any of the experiments with the --dd-project-id flag set to the ID of the project you want to upload to.
If you don't care about the output images, you can omit the --output flag, which will store the images in a temporary directory before uploading.
You can also specify a --dd-plan-name. If one is not provided, a sensible name will be generated based on the experiment and parameters.
For example:
```bash
python3 image_processing.py --experiment no-pose \
  --dd-project-id <project id> \
  --input <directory with images> \
  --percentage 50
```
To try all of the experiments at once, open test.sh, and modify the INPUT_DIR and OUTPUT_BASE_DIR variables to point to the directories with the images you want to process. Then run the script:

```bash
bash test.sh
```
If you don't have a drone or a test dataset, feel free to use these images of Baywatch Resort in Traverse City, MI, flown manually with a Mavic Air 2. The images are licensed under the CC BY (Attribution) license, so play around and have fun! If you do something cool, make a PR!
Here are some samples from the above example dataset, after running them through test.sh!
All of these experiments only affect metadata, so the images look identical to the original.
These are the results of a handful of experiments! Overall, I'm pretty impressed with how well DroneDeploy handled these weird images; even in some truly bizarre situations, it still managed to stitch them together reasonably well given how screwed up the inputs are.
Overall, I'm super impressed with how well this map turned out. It isn't hugely surprising given how detailed I was when flying it, but impressive nevertheless!
Processing Report | View map in DroneDeploy
I'm pretty surprised this one turned out as well as it did, but it makes sense. It's just yellow after all!
Processing Report | View map in DroneDeploy
Similar story to removing blue, but this one turned out just fine, just very red!
Processing Report | View map in DroneDeploy
Not all that surprising: there are gray splotches throughout the map, but it looks like the stitcher had no problem with it!
Processing Report | View map in DroneDeploy
This one is pretty hilarious! I really had no idea what it would do going into this, but the result isn't surprising. That said, the angle of the upside-down model is unexpected.
Processing Report | View map in DroneDeploy
Besides being a bit lighter looking, this one turned out just fine, much like the original.
Processing Report | View map in DroneDeploy
While not bad, there is a noticeable degradation in quality here: we see a large hole in the side of the building, and an overall decrease in quality from Noise - 25%, and certainly from the Original.
Processing Report | View map in DroneDeploy
It feels like I'm finally getting somewhere with breaking photogrammetry! That said, this is still a pretty decent result for some pretty fouled-up images. I'm impressed that it seems to have mostly ignored the black borders around the photos, although you can see some odd coloration on the roof.
Processing Report | View map in DroneDeploy
Seems like perspective warp will really do a number on photogrammetry! This map looks pretty terrible compared to the original!
Processing Report | View map in DroneDeploy
Interestingly, this model turned out surprisingly well! However, you can really clearly see the black boundaries from the photos painted onto the roof. We see a bit of this in the perspective warp experiments, but we see it in full effect here!
Processing Report | View map in DroneDeploy
Unfortunately, DroneDeploy is too smart, and the uploader doesn't accept images with weirdly spread-out timestamps, images without GPS metadata, or images that all have the same GPS location :(.