Nov 15, 2023

ColorBones: Full step-by-step protocol for the visualization and identification of bone discolorations using the ImageJ© plugin DStretch®

  • Tübingen
Open access
Protocol Citation: Dominik Göldner 2023. ColorBones: Full step-by-step protocol for the visualization and identification of bone discolorations using the ImageJ© plugin DStretch®. protocols.io https://dx.doi.org/10.17504/protocols.io.4r3l22ybxl1y/v1

License: This is an open access protocol distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Protocol status: Working
We use this protocol and it's working
Created: November 11, 2023
Last Modified: November 15, 2023
Protocol Integer ID: 90791
Keywords: DStretch, decorrelation stretch, decorrelating, ImageJ, image enhancement, post processing, forensic, archaeology, physical anthropology, bioarchaeology, osteology, imaging, photography, scalebar, colorimetry, colorimetric, color analysis, plugin, open-method, open-source, taphonomy, taphonomic, decay, discoloration, hue, staining, green, algae, copper, artifact, rust, malachite, corrosion, identification, visualization, false color, explorative, experimental, color channels
Disclaimer
The protocol displays digital images of real human remains. Only a small section of the selected specimen was photographed and presented to minimize the exposure of the human remains of this individual. If you are sensitive to such content, it is recommended that you refrain from reading and scrolling through this protocol. Please view and read this protocol with dignity and care. The provenance of the individual has been verified before (refer to Göldner, 2020 and Hämmerling, 2020).

The following disclaimer contents strictly follow that published in Göldner and Deter-Wolf (2023):
Always obtain official permission from descendant communities and curating institutions before capturing, handling, or disseminating images of human remains. Handle the remains with the highest level of care and respect, adhering to best practice guidelines in accordance with scientific and ethical standards.
It is recommended to thoroughly review and test the entire protocol before applying it to your specimens.

If utilized, always cite this protocol appropriately.

Please note that protocols published on Protocols.io are not peer-reviewed. All outlined steps reflect my personal experiences with the employed software and materials. Additionally, be aware that the effectiveness of the DStretch® algorithm is influenced by various factors related to image parameters (e.g., light conditions, equipment used, recording parameters, image quality, etc.). As a result, the enhanced images may vary, and consistency is not guaranteed in most cases. It is also possible that the workflow described in this protocol may not be applicable to your specific data. In such instances, it is recommended to experiment with the software to explore alternative approaches.

There may be situations where DStretch® does not yield satisfactory results for the submitted data. Reasons for such situations may include overall weak color differences, low image resolution, or image artifacts. In such cases, consider options like chemical analysis or more advanced imaging techniques such as multispectral imaging.

For results that are transparent, reproducible, and consistent, it is advisable to use colorchecker cards for the digital color calibration of your photographs. However, this was not tested here, as uncalibrated images of good photographic quality were found to result in satisfactory transformations.
Those with dyschromatopsia may encounter difficulties identifying and comprehending certain colors in decorrelation-stretch output images. In some instances, the Grayscale Mode in DStretch® (not showcased here) could be employed to circumvent this issue.
Abstract
Fresh, undecayed, and healthy bone in animals and humans typically exhibits a greasy, yellowish-white to yellowish-brown color (Dupras and Schultz, 2013). However, bone color can undergo alterations, especially in archaeological or forensic contexts, when exposed to environmental factors. In such cases, bones may display various types of discoloration. These color changes can be attributed to different taphonomic agents during the postmortem deposition of the deceased body. Still, they can also occur during the organism's lifetime and under specific conditions, such as medical treatment or nutrition. Depending on the cause, bone staining can be superficial, but it can also penetrate deeper into the bone structure. Discoloration may be confined to a single bone, parts of it, or it can extend over multiple bones (e.g., in joint areas) or even the entire skeleton. Additionally, discolorations can be (more or less) monochromatic or polychromatic, depending on the number of taphonomic events causing staining, their sequence, and the physicochemical agents involved. Hence, diverse coloration processes can either conceal previously present colors or collaboratively create new color combinations. Furthermore, different processes may result in similar discolorations, complicating a differential diagnosis of their origin. Conversely, colors can fade and lose vibrancy over time, posing challenges to their recognition. Examining color patterns in human skeletal remains as part of taphonomic processes is crucial for collecting and reconstructing information about their depositional history. Ultimately, this information can assist in identifying unknown individuals in medicolegal cases or contribute to the study of human history.

A notable hue in bone staining is the appearance of green colors. This can be induced by the growth of plants, algae, and moss on bones when they are positioned on or near the terrestrial surface (Dupras and Schultz, 2013). Additionally, distinct green discolorations on bones can result from corroding copper-containing artifacts, such as grave goods (e.g., earrings, finger rings, jewelry, weapons, dishes, etc.) or clothing items (e.g., modern zippers and buttons) (Dupras and Schultz, 2013). In rare instances, green bone staining can occur during the lifetime of patients due to prolonged intake of tetra- and minocycline antibiotics (Judge et al., 2018).

Typically, green discolorations tend to be more vivid and distinguishable from the surrounding bone colors. However, if only very small areas are affected, their identification can become challenging for the human eye, depending on factors such as intensity or the contrast with neighboring or underlying colors. This difficulty persists even when using conventional imaging techniques like photographic documentation in the visible light spectrum. While chemical analyses can unveil the composition and cause of such discolorations, the sampling and analytical process are invasive and expensive, often rendering them impractical. Alternatively, more advanced imaging techniques, such as multispectral cameras, could be considered. Nevertheless, this option relies on costly equipment and skilled operators and may not be widely available.

Nonetheless, there are cost-effective and user-friendly methodological alternatives available for enhancing and identifying (green) bone discolorations through standard digital photography. One such option relies on a digital image enhancement technique called decorrelation stretching. Originally developed and utilized in remote sensing by NASA (Gillespie et al., 1986), the decorrelation stretch algorithm was later implemented by Jon Harman (2005) as a plugin called DStretch® for the open-source image processing tool ImageJ© (Schneider et al., 2012). Initially designed to aid the visualization of faded cave and rock art, DStretch® has become widely established in cave and wall art archaeology (e.g., Evans and Mourad, 2018). In recent years, researchers have further explored its alternative applications, e.g., by successfully applying DStretch® to painted pottery (Gonzáles et al., 2019), metal armor (Emmitt et al., 2021), or tattooed skin of mummified human remains (e.g., Austin, 2022; Göldner and Deter-Wolf, 2023). The minimum requirements for utilizing DStretch® are a good camera and a desktop computer or laptop. It's worth noting that theoretically, a modern mobile phone or tablet with an integrated camera could also suffice, as there is a DStretch® app available. However, this may not meet the officially required documentation standards. Overall, DStretch® can be considered a noninvasive, relatively inexpensive, easy, and fast alternative for enhancing and recognizing (green) bone discolorations.

To facilitate the reproduction of the DStretch® procedure on discolored bones, I will present a comprehensive step-by-step protocol using digital photographs of an archaeological example: a female skull with copper-induced discolorations, likely stemming from copper earrings. It's crucial to understand that the application of DStretch® is focused solely on visualizing, identifying, and documenting bone discolorations. It does not provide insight into the exact cause or taphonomic agents involved in the development of these color changes. Additionally, it's important to note that while this protocol is demonstrated using green bone discolorations, it is equally applicable to other hues. However, when applied to colors other than green, certain parameters within the DStretch® application must be adjusted to achieve optimal results.

Refer to the content below for additional background information on this protocol. Also, be aware that I have recently co-published a DStretch® protocol on protocols.io specifically for identifying hard-to-see tattoos on mummified human skin (Göldner and Deter-Wolf, 2023). This initial DStretch® protocol includes valuable information that may be of interest to you, encompassing the mathematical background of the algorithm and more general tips.

Fig. 0.1: Key steps of the image transformation using DStretch®. From left to right: 1) Original input image. 2) YBK transformed image. 3) YBK transformed image after hue shift. 4) Extracted, cropped, and back-projected region of interest (green bone discoloration) onto the original input image.
Image Attribution
All photographs were taken and processed by the author (DG). See "Before starting" section for specimen information.
Guidelines
Texts written in italic refer to software commands.

Supplementary materials:
Additional information is published in the supplementary materials (Göldner and Deter-Wolf, 2023) on Zenodo.org (DOI: 10.5281/zenodo.7607277).


Time required to apply this protocol:

  • Taking and post-processing photographs: around 30 to 90 minutes

  • DStretch® enhancement: 10 minutes

  • Documentation: depending on the complexity of the existing discolorations, up to several hours.
Materials
Hardware:

  • DSLR Camera

  • Camera accessories (e.g., batteries, charger, SD card, external hard drive)

  • Computer or laptop

  • Sun cover (for image collection in the field)

  • Externally directed light source (e.g., studio lights for lab taken images)

  • Matte-black paper or cloth for the background

  • Camera tripod (optional)

  • Remote camera control (optional) 

  • Object stands (optional, e.g., cork ring stand or plastic stands to support positioning the specimen)

  • Scalebar with millimeter scale (black and white, appropriate size depending on the object dimensions; see Göldner 2022)
Software:

  • ImageJ© (Schneider et al. 2012; free; download via https://imagej.nih.gov/ij/download.html). ImageJ© version used in this protocol: 1.53k.
  • DStretch® (Harman 2005; low-cost at US$50; proprietary software; contact Jon Harman via https://www.dstretch.com/ or directly via DStretch@prodigy.net for purchase). DStretch® version used in this protocol: 8.22.
  • IrfanView© (free; proprietary software; download via https://www.irfanview.de/). Version used in this protocol: 4.60.

  • IrfanView© Plugin to convert RAW images to JPEG files (free; proprietary software; download via https://www.irfanview.de/). Version used in this protocol: 4.60.

  • Image processing software of your choice, such as GIMP (free; download via https://www.gimp.org/) or Adobe® Illustrator® or Photoshop® (high-cost; proprietary software; purchase and download via https://www.adobe.com/products/photoshop.html); the latter was used in this protocol (Adobe® Photoshop® CS6). Version used in this protocol: 13.0 x64.
Safety warnings
Attention
The FOROST Skull Photography Protocol (Báez-Molgado et al., 2013) provides a concise summary of valuable information regarding safety precautions for handling osteological subjects during photography.
Refer to the SOAP protocol (Cerasoni & Rodrigues, 2022) for a comprehensive guide on (small) object photography in archaeology, with applicability to human remains.
Ethics statement
Refer to Disclaimer for more information.
Before start
The mathematical foundation of the DStretch® algorithm has been elucidated by other researchers (e.g., Alley, 1996; Campbell, 1996; Gillespie et al., 1986; Gunn et al., 2010; Harman, 2005; Kaur and Sohi, 2017) and further expounded upon by Göldner and Deter-Wolf (2023). The latter reference also provides general tips on DStretch® image enhancements concerning specific colors, advises on the suitable color channels to select within the program, and offers additional references. I found that for green hues, the predefined LAB, LBK, and YBK color channels yielded the best results.

The subsequent instructions for the preparation phase align with those recently outlined by Göldner and Deter-Wolf (2023) and are rephrased here to maintain protocol integrity.

Specimen information:

The cranial specimen selected as an example for this protocol was analyzed by me as part of my bachelor's thesis at the Free University of Berlin, submitted in early 2019 (Göldner, 2019). The aim of the thesis was to anthropologically examine human remains housed in the Berlin collection, considering taphonomic perspectives, in an attempt to infer information about the original and widely undocumented depositional grave contexts. This cranial individual is part of a small archaeological skull series housed at the Anthropological Rudolf-Virchow-Collection (RVS) in Berlin, which is owned by the Berlin Society for Anthropology, Ethnology, and Prehistory (Berliner Gesellschaft für Anthropologie, Ethnologie und Urgeschichte, BGAEU). The inventory number of this individual is RV 3134. Alongside 14 other skulls (together with some mandibles, only two long bones, dog bones, and some grave goods, including copper artifacts), this cranial series was excavated by local workers and history enthusiasts and sent to Berlin without proper archaeological documentation in 1877 from a smaller early medieval Merovingian burial site in Alsheim, Germany.

RV 3134 was anthropologically and archaeologically sexed as female, with an approximate age-at-death between 40 and 60 years. Predominantly located on the mastoid process and close to the ear hole (Meatus acusticus externus), the skull exhibits distinct but locally small and restricted green bone discoloration (Fig. 0.2). Other female individuals from this site group showed similar color changes in the same anatomical locations. Male crania, on the other hand, did not display such green discolorations. As part of the Merovingian grave custom, female individuals were often buried wearing copper-containing earrings, whereas males rarely wore them in graves. It is thus reasonable to assume that the anthropologically assessed sex and the occurrence of green bone discolorations in the ear areas of the female skulls coincide with the former presence of copper earrings in their original graves. Some of the collected grave goods are made of copper.

The results of my bachelor's thesis are published in Göldner (2020). I developed the first version of the current protocol as part of my original thesis. The publication of the original thesis is scheduled for 2024. Permission to use these images was thankfully granted by the curator. For more information on the Alsheim burial site, refer to the thesis summary by Renee Hämmerling, who wrote her own bachelor's thesis about the collection history of the Alsheim artifacts (Hämmerling, 2019, 2020).

Fig. 0.2: Detail image of the left ear area of RV 3134.
Preparation phase
Camera setup:

Prepare your camera and accessories by charging and cleaning them before use. If available, regularly use the digital image sensor cleaning function of your camera.

DSLR cameras, known for producing less image noise, contribute to overall better image enhancement quality. Opt for the highest megapixel setting, a wide focus field, and automatic focus. Set the ISO to the lowest setting (Harman, 2015). Adjust shutter speed and aperture (f-stop or f-number) values to suit the lighting conditions of your scenery. Avoid capturing photographs in the JPEG file format, as it leads to data compression and a loss of quality. Instead, choose the loss-free RAW or uncompressed TIFF file formats. If JPEG is necessary, select the highest quality format and image resolution settings.

Stabilize the camera using a tripod securely positioned on a stable surface. If available for your camera model, use a remote control to trigger the image capture. Both measures help minimize image blurring.

Subject preparation:
Determine the necessity and permission for cleaning the specimen or the region of interest. If cleaning is permitted, handle the subject with utmost care. When capturing images, ensure that the subject is securely positioned in your imaging setup to prevent any movement, falls, or potential damage. Use stands if required to immobilize the subject and maintain safety distances. Refer to the FOROST Skull Photography Protocol (Báez-Molgado et al., 2013) for general recommendations on subject safety provisions. Adhere to the official institution protocols concerning subject handling and protective measures.

Scenery and environmental setting:

Cast a shadow over the subject to block direct sunlight and prevent over-exposure. Simultaneously, strive to evenly illuminate the scenery. Employ studio lights or other light sources to create and control optimal lighting conditions. If needed, consider using a flash for additional lighting.
Scale bar:
Always include a scale bar in your photographs. Instructions on how to insert scale bars in scientific images were previously published on Protocols.io in Göldner 2022.

Background setting:

Better image enhancement results can be achieved with a uniform, monochrome black background, such as a matte-black paper sheet or non-reflective cloth. Colored backgrounds may interfere with the DStretch® transformation process. White backgrounds are impractical, as they may reflect excessive light onto the specimen and the camera sensor, potentially causing over-exposure. Ensure that the background material covers the entire imaging field. If available, consider utilizing a professional photography light tent with studio lights. Alternatively, for instance in the field, you can eliminate the background using image editing software and replace it with a virtual black color (hex color code: #000000) during the post-processing phase (see below in step 8). Use a soft brush or small air blower to remove dust and other particles from the background but be mindful that some institutions may require the collection of these particles for ethical reasons (as debris can contain small fragments of human remains) or evidence-based purposes.
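If you work with many images, a scripted rough pass can complement the manual background removal described above. The following minimal Python sketch is an assumption on my part and not part of the original workflow (the file name and threshold are hypothetical); it simply pushes near-black pixels to pure black (#000000), assuming a dark, matte background, and will usually still need manual touch-up in GIMP or Photoshop.

```python
# Hedged sketch: set near-black background pixels to pure black (#000000).
# Assumes the photograph was taken on a dark, matte background; the threshold
# is illustrative and must be tuned (or the mask drawn manually instead).
import numpy as np
from PIL import Image

img = np.asarray(Image.open("detail_left_ear.jpg").convert("RGB"))  # hypothetical file name
background = img.max(axis=2) < 40        # pixels whose brightest channel is still very dark
cleaned = img.copy()
cleaned[background] = (0, 0, 0)          # replace with pure black (#000000)
Image.fromarray(cleaned).save("detail_left_ear_black_bg.jpg", quality=100)
```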

Imaging phase:
Review your setup once again and initiate the image capture process. Begin by capturing multiple overview images of the entire specimen, followed by detailed images of the specific region(s) of interest. Capture several images from each camera position to mitigate potential issues with blurring, lighting, or focus. After completing the imaging step, duplicate and save the pictures on two separate storage devices to prevent data loss.
Review each image file individually and sort out blurry and insufficient images. If necessary, retake pictures. It is advisable to start the image-checking process directly after the image-taking phase while the setup is still in place. This can reduce time and costs and avoid the effects of daylight changes. Also, try to capture all photographs in one session to minimize the need to rearrange the setup and camera position. If possible, detach the data storage medium (e.g., SD card) from your camera while the camera is still mounted to transfer the data to a computer or laptop for review.

File conversion:
Convert RAW or TIFF images to high-quality JPEG files for compatibility with the ImageJ© software and DStretch® plugin. Utilize a converter program suitable for your camera brand and model's file format. Note that different camera brands employ distinct RAW file formats (e.g., Canon uses CRW, CR2, or CR3 files, Nikon uses NEF files, Olympus uses ORF files, and Sony uses ARW files). A versatile and cost-free file converter program like IrfanView, along with associated plugins, supports various RAW file formats from different camera brands. You can use this software to convert individual images or a set of multiple photographs through the Batch Conversion function in the File menu. Select the highest JPEG quality value (i.e., 100) in the saving and conversion options within the IrfanView Conversion dialog for the final transformed image output. When converting files to a new format, disable the "noise suppression" option if possible. Always keep your original files as backups.
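As a scripted alternative to IrfanView's Batch Conversion, RAW files can also be batch-converted with Python. This is only a sketch: the folder names are hypothetical, and the rawpy and Pillow libraries are assumptions on my part, not part of the original workflow.

```python
# Hedged sketch: batch-convert RAW files to maximum-quality JPEGs.
# rawpy and Pillow are assumed third-party libraries (pip install rawpy pillow);
# the protocol itself uses IrfanView's Batch Conversion instead.
from pathlib import Path
import rawpy
from PIL import Image

raw_dir = Path("raw_images")         # hypothetical input folder
out_dir = Path("jpeg_images")        # hypothetical output folder
out_dir.mkdir(exist_ok=True)

for raw_file in sorted(raw_dir.glob("*")):
    if raw_file.suffix.lower() not in {".cr2", ".cr3", ".nef", ".orf", ".arw"}:
        continue                      # skip non-RAW files
    with rawpy.imread(str(raw_file)) as raw:
        rgb = raw.postprocess()       # demosaic to an 8-bit RGB numpy array
    # Save at the highest JPEG quality and without chroma subsampling.
    Image.fromarray(rgb).save(out_dir / (raw_file.stem + ".jpg"),
                              quality=100, subsampling=0)
```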

Image post-processing

If necessary, utilize image editing software such as GIMP or Adobe Photoshop, and employ tools like the Magic Wand or Wizard Selection Tool to manually eliminate and replace the background (see above in step 5). At this stage, you can also adjust internal image properties, such as contrast, light, and shadow values. However, avoid significant manipulations of the images that could be considered fraudulent. Alternatively, DStretch® also offers built-in options, including the Auto Contrast and Flat buttons in the Main Panel, to fine-tune brightness and contrast. The CB button can be employed to balance the RGB color of the original input image, especially beneficial when dealing with an overall color cast in the photograph (Harman, 2015).
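For intuition only, the sketch below applies a simple gray-world balance to reduce an overall color cast. This is a generic technique and is not claimed to replicate what the CB button does internally; the file name is hypothetical.

```python
# Hedged sketch: gray-world white balance to reduce an overall color cast.
# Generic technique; not claimed to replicate DStretch®'s CB button.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("input.jpg").convert("RGB")).astype(np.float64)  # hypothetical file name
channel_means = img.reshape(-1, 3).mean(axis=0)
gain = channel_means.mean() / channel_means       # scale each channel toward a common gray level
balanced = np.clip(img * gain, 0, 255).astype(np.uint8)
Image.fromarray(balanced).save("balanced.jpg", quality=100)
```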

Photography setting documentation:
For reasons of scientific reproducibility and transparency, it is crucial to thoroughly document the chosen camera parameters and present them alongside your images in publications or as part of the supplementary materials for your studies. These parameters should encompass essential details such as camera type, camera brand, camera model, lens brand, lens type, lens model, ISO, shutter speed, lens-subject-distance, light conditions, optional equipment used, and more. You may use the pre-generated documentation template published as supplementary materials alongside the previous DStretch® transformation protocol by Göldner and Deter-Wolf (2023) named “Photography and Post-Processing Documentation Sheet I” and “Photography and Post-Processing Documentation Sheet II” (DOI: 10.5281/zenodo.7607277). This sheet contains all the necessary fields to document your camera specifications and photography parameter settings reliably, including the DStretch® transformation process. Utilize this template to ensure comprehensive documentation of your imaging process.

Color visualization using DStretch®
Open the ImageJ© software (Fig. 1).
Fig. 1: Opened ImageJ© program interface.
Load the image into ImageJ© (Fig. 2). Either drag and drop the image from its folder onto the ImageJ© tool bar (the blank gray bar beneath the tool icons), or go to File/Open, navigate to the target folder, select the image, and click Open.
Fig. 2: Untransformed (original) input image opened in ImageJ©.
In ImageJ© open the DStretch® plugin via Plugins\DStretch (Fig. 3). The input image will automatically be opened in DStretch®.
Fig. 3: Input image opened in the DStretch® plugin in ImageJ©.
Optional step (not shown here): If a background is present, employ one of ImageJ©'s selection tools to choose a larger portion of the specimen or captured region of interest, displaying a representative amount of its color variation. The precise pixel coordinates and selection dimension values of the chosen area are provided within the gray bar beneath the icon toolbar of the primary ImageJ© menu. To deselect the active selection, click the left mouse button somewhere within the image panel.

Set the Scale value on the left panel side to a value between 10 and 15, which, based on my own experience, works well for most images. Note that depending on factors such as image quality, light conditions, and contrast, a higher or lower scale value might give you overall better results (Fig. 3). The scale parameter determines the magnitude of the transformation. Test different scale values to find the best enhancement strength for your specific photograph.
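For readers who want a feel for what the transformation and its Scale parameter do, here is a minimal, generic decorrelation-stretch sketch in Python (numpy and Pillow assumed; this is not the DStretch® implementation, and the file names are placeholders). The channels are decorrelated via the eigenvectors of their covariance matrix and then stretched to a common target standard deviation, which plays a role loosely analogous to the Scale value; DStretch® additionally performs this in alternative color spaces such as YBK or LAB (see the following steps).

```python
# Minimal decorrelation stretch sketch (generic algorithm after Gillespie et al., 1986;
# not the DStretch® implementation). Works directly in RGB, whereas DStretch® also
# offers alternative color spaces (e.g., YBK, LAB, LBK) before stretching.
import numpy as np
from PIL import Image

def decorrelation_stretch(rgb, target_sigma=50.0):
    """Decorrelate the color channels and stretch each to target_sigma."""
    x = rgb.reshape(-1, 3).astype(np.float64)            # pixels as rows
    mean = x.mean(axis=0)
    cov = np.cov(x - mean, rowvar=False)                  # 3x3 channel covariance
    eigval, eigvec = np.linalg.eigh(cov)                  # decorrelating rotation
    # Scale each decorrelated component to a common standard deviation, then
    # rotate back into RGB space; target_sigma acts loosely like the Scale value.
    stretch = np.diag(target_sigma / np.sqrt(np.maximum(eigval, 1e-10)))
    transform = eigvec @ stretch @ eigvec.T
    y = (x - mean) @ transform.T + mean
    return np.clip(y, 0, 255).reshape(rgb.shape).astype(np.uint8)

img = np.asarray(Image.open("input.jpg").convert("RGB"))  # hypothetical file name
Image.fromarray(decorrelation_stretch(img)).save("stretched.jpg", quality=100)
```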

Color visualization using DStretch® - Quick transformation
For quick and automated DStretch® processing, click on one of the predefined color channels in the Main Panel. For green hues, the LAB, LBK, and YBK channels often reveal effective color separations. However, consider experimenting with different channels, possibly in combination with varying scale values (as explained in the previous step). This short procedure will promptly yield an extreme false-color image that might already provide a good color separation between your target color (e.g., green in this case) and the surrounding specimen surface colors. You can already explore and identify color differences at this stage. In the present example, the YBK channel was selected with the Scale set to 12.5 (Fig. 4). Take note that in the current example, the original green bone discoloration has retained its green hue but with increased vibrancy.

To generate a visually more appealing result, especially for publication purposes, proceed with the subsequent steps. Otherwise, you may skip to the final steps on how to save the transformed image (as explained further below).
Fig. 4: Input image transformed with the YBK color channel option.
DStretch® transformation, color extraction, and back-projection
For further image editing and enhancement of the transformed input image, click on the Expert button (Fig. 4). The image will be opened in a new DStretch® window called Expert (Fig. 5).
Fig. 5: YBK transformed image shown in the Expert panel.
Within the Expert window, click on the Hue Mask Panel button in the center. The image will now be displayed in the Hue Mask Panel window (Fig. 6).
Fig. 6: Input image shown in the Hue Mask Panel.
In the Hue Mask Panel, manually adjust the Hue Shift sliding bar in the panel center to a value that makes the discoloration in the region of interest even more visible and distinct from the surrounding colors (Fig. 7). Hue shift values range between 1 and 360 degrees. Here, some testing might again be required. In the current example, a hue shift value of 175 degrees was found to reveal the highest contrast. The initial green color has been shifted to a purple hue.
Proceed with the steps from this point onward to back-project the highlighted green color onto the original image.
Fig. 7: Hue shifted YBK image. The hue shift is set to 175 degrees.
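Conceptually, the Hue Shift slider rotates every pixel's hue by a fixed angle. Below is a minimal sketch of such a generic hue rotation (matplotlib's vectorized RGB-HSV conversion is used only for convenience; the file names are placeholders and this is not the DStretch® implementation).

```python
# Hedged sketch: rotate the hue of every pixel by a fixed angle (generic hue
# shift, not the DStretch® implementation). Hue is handled here in the 0-1 range.
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
from PIL import Image

def shift_hue(rgb_uint8, shift_deg):
    hsv = rgb_to_hsv(rgb_uint8 / 255.0)                  # H, S, V each in 0-1
    hsv[..., 0] = (hsv[..., 0] + shift_deg / 360.0) % 1.0
    return (hsv_to_rgb(hsv) * 255).astype(np.uint8)

img = np.asarray(Image.open("stretched.jpg").convert("RGB"))  # hypothetical file name
Image.fromarray(shift_hue(img, 175)).save("hue_shifted.jpg", quality=100)
```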
Begin the back-projecting process by slowly sliding your cursor over the pixels within the region of interest to determine the color value range and average HSL value that correspond to the discolored green area.
You can see the pixelwise HSL values in the status bar of the ImageJ© main tool bar panel (Fig. 8). Like the hue shift value from before, HSL values can only range between 1 and 360. The HSL value varies for each pixel, yet pixels of similar colors generally share close HSL values. It is recommended to zoom in on the image, as scattered noise pixels with significantly different HSL values often blend into visually monochromatic areas. Discern these pixels by their distinct hues and avoid them. It might take you some time to determine the lowest and highest color values in this region.
At this point, you have two options:

1) Attempt to estimate a rough mean value of the pixels defining your region of interest and establish a suitable minimum to maximum range around this approximate mean. In the provided example, the approximate mean was 320. Around this mean, endeavor to ascertain a range that encompasses the color variation of the target discoloration using the identified minimum and maximum HSL values. In this case, I found a range of ±20 from a roughly approximated mean HSL value of 320 to be most suitable, i.e., between 300 (Min Hue) and 340 (Max Hue) (a scripted sketch of this estimation is shown below, after Fig. 8).
2) Explore the HSL values across different areas within your region of interest to develop a sense of the range. If you are confident in your selection of the minimum and maximum values, add and subtract 10 to establish a likely range that will yield satisfactory results.

Fig. 8: Determination of the minimum to maximum HSL range in the region of interest. The identified mean value of the green (here purple after transformation) discolored area (red cross) is HSL = 320 (indicated in the ImageJ© interface; red arrow).
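As mentioned in option 1 above, the estimation of a mean hue and a ±20 range can also be scripted. The sketch below is an illustration only: the rectangular region of interest is hypothetical, and in the actual protocol the HSL values are read interactively from the ImageJ© status bar instead.

```python
# Hedged sketch: estimate a mean hue and a ±20 range over a rectangular region
# of interest (ROI). ROI coordinates and file name are hypothetical placeholders.
# Caution: hue is circular (360 wraps to 0), so this simple mean is only valid
# if the ROI hues do not straddle the 0/360 boundary.
import numpy as np
from matplotlib.colors import rgb_to_hsv
from PIL import Image

img = np.asarray(Image.open("hue_shifted.jpg").convert("RGB"))   # hypothetical file name
roi = img[850:950, 1200:1300]                                    # hypothetical ROI (rows, cols)
hue_deg = rgb_to_hsv(roi / 255.0)[..., 0] * 360.0                # hue in degrees

mean_hue = float(hue_deg.mean())
min_hue, max_hue = mean_hue - 20, mean_hue + 20                  # option 1: mean +/- 20
print(f"mean hue ~ {mean_hue:.0f}, suggested range {min_hue:.0f}-{max_hue:.0f}")
```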
Enter your identified minimum and maximum HSL values from the previous step into the Min Hue and Max Hue fields within the Hue Mask Panel, respectively (Fig. 8). Once the parameters have been adjusted, all colored pixels outside this range are removed; only pixels within the range remain, surrounded by a black background.

Fig. 8: Entered Min Hue (here: 300) and Max Hue (here: 340) parameter values (red arrows).
Click on the Do Mask button below the Min to Max Hue fields. This will open a new small window called the Lightness Test (Fig. 9).
Fig. 9: Opened Lightness Test window after pressing the Do Mask button.
Leave the parameters in the Lightness Test window as predefined (S min = 0.00, S max = 1.00, L min = 0.00, L max = 1.00; Fig. 9) and press OK. This process will yield an image, still opened in the Hue Mask Panel, displaying only pixels falling within the indicated minimum to maximum hue range. All other pixels outside this range have been eliminated and replaced with black. However, some pixel clouds may persist outside the target area (Fig. 10). This effect may be attributed to various causes, such as random image noise related to lighting conditions, image quality, or small discolorations that may or may not be related to the discolored area you are primarily focusing on.

Fig. 10: Remaining colored pixels after applying the hue mask within the specified minimum to maximum hue range. Note that not all pixels outside the target area are necessarily eliminated; some remain in addition to the pixels corresponding to the target discoloration.
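To illustrate what the combined hue mask and Lightness Test achieve, here is a generic sketch (not the DStretch® code) that keeps only pixels whose hue falls between Min Hue and Max Hue and whose saturation and lightness/value lie within the given bounds, replacing everything else with black. File names and the min/max values (300-340) follow the placeholders from the earlier sketches and the current example.

```python
# Hedged sketch: keep only pixels whose hue lies between Min Hue and Max Hue
# (here 300-340 degrees) and whose saturation/value fall inside the given
# bounds; everything else is replaced with black. Generic, not DStretch® code.
# Note: HSV "value" is used here as a stand-in for the HSL lightness bound.
import numpy as np
from matplotlib.colors import rgb_to_hsv
from PIL import Image

img = np.asarray(Image.open("hue_shifted.jpg").convert("RGB"))   # hypothetical file name
hsv = rgb_to_hsv(img / 255.0)
hue_deg = hsv[..., 0] * 360.0

mask = ((hue_deg >= 300) & (hue_deg <= 340) &          # Min Hue / Max Hue
        (hsv[..., 1] >= 0.0) & (hsv[..., 1] <= 1.0) &  # saturation bounds (defaults)
        (hsv[..., 2] >= 0.0) & (hsv[..., 2] <= 1.0))   # lightness/value bounds (defaults)

masked = np.where(mask[..., None], img, 0).astype(np.uint8)
Image.fromarray(masked).save("hue_masked.jpg", quality=100)
np.save("hue_mask.npy", mask)   # keep the mask for the cleaning and back-projection sketches
```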
If your range selection and masking result in unwanted smaller or larger, sparse or dense, loose point clouds of floating pixels (artifacts) outside the region of interest, you can filter them out using the Clean button on the right side of the Hue Mask Panel (not demonstrated here). Clicking the Clean button automatically opens a new window called Hue Mask Cleaning, where you can specify the parameters as follows (not pictured in this protocol): maintain the default settings for the second and last parameters (Type = Clean and Number of extra dilations = 1) and refrain from altering them. If you raise the values of the two remaining options (Cleaning strength and Number of iterations), the DStretch® process will eliminate noisy pixels more aggressively. Setting these values excessively high may also affect and reduce the size of the pixel clusters in the region of interest. After specifying the cleaning parameters, press the OK button in the Hue Mask Cleaning panel. This will open another window named Hue Mask, asking you whether to Keep the cleaned image? Press the OK button in this new window to confirm your choice and obtain the cleaned result.
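A scripted analogue of this cleaning step could look like the sketch below. It uses a generic morphological opening and dilation from scipy and does not replicate the actual Hue Mask Cleaning parameters; the mask file is the placeholder saved in the previous sketch.

```python
# Hedged sketch: remove small floating pixel clouds from a binary mask using a
# morphological opening, then grow the surviving regions slightly with one
# dilation. Analogous to, but not identical with, DStretch®'s Clean step.
import numpy as np
from scipy import ndimage

mask = np.load("hue_mask.npy")                           # mask saved in the previous sketch
opened = ndimage.binary_opening(mask, iterations=2)      # more iterations ~ stronger "cleaning strength"
cleaned = ndimage.binary_dilation(opened, iterations=1)  # roughly "number of extra dilations = 1"
np.save("hue_mask_clean.npy", cleaned)
```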

Another cleaning option is to manually select and delete these areas with the cursor using a selection tool from the ImageJ© tool bar, followed by pressing the Delete button on your keyboard (Figs. 11.1 and 11.2). For the manual selection of larger areas, the rectangular selection tool is generally effective, while the polygon selection tool is handy for closely outlining the target area. After manually deleting, click somewhere in the displayed image with the left mouse button to deselect the area. If you obtained only a small number of noisy pixels, you may consider refraining from their deletion, as they will likely not notably affect the final result. Otherwise, continue with the manual deletion until only the pixels belonging to the target area remain (Fig. 11.1).

However, differentiating between areas that should be kept and those that are purely artifacts can be difficult. It might help to base the decision of whether to delete or filter out such floating artifact pixels on the size and consistency of these areas and the pixels they contain.
Figs. 10.1 (left) and 11.2 (right): Manually selected area outside the main discoloration area using the rectangular selection tool (left) and the cleared space after pressing the Delete button (right).
After the cleaning step, you can once again utilize the Hue Shift bar to modify the color of the extracted discolored area to a hue that approximately corresponds to the original hue of the actual discoloration, i.e., green in our example. This will provide a more consistent visualization and display of the final output image. In this case, a value of 150 degrees was used (Fig. 11.2).

Figs. 11.1 (left) and 11.2 (right): Remaining pixels corresponding to the discolored bone area after deletion of the surrounding pixel noise (left). Color-adapted area to closely resemble the original discoloration hue (right).
Next, click on the "Add to HM Out" button in the Hue Mask Panel. This action will open a small window named "Hue Mask," prompting you with the message "Replace HM Out Image?" Click on OK to confirm your choice (Fig. 12). After confirmation, the extracted and enhanced discoloration pixel color information is successfully back-projected to the original untransformed input image (Fig. 13).

Fig. 12: Choice confirmation to replace the original input image.
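Conceptually, the back-projection replaces the pixels of the original photograph inside the cleaned mask with the extracted, recolored pixels. A hedged sketch of that idea, reusing the placeholder files from the earlier sketches (not the DStretch® implementation; all images are assumed to have identical dimensions):

```python
# Hedged sketch: back-project the extracted, recolored discoloration pixels
# onto the original, untransformed input image. File names are placeholders
# carried over from the earlier sketches; this is not the DStretch® code.
import numpy as np
from PIL import Image

original = np.asarray(Image.open("input.jpg").convert("RGB"))        # untransformed photograph
recolored = np.asarray(Image.open("hue_masked.jpg").convert("RGB"))  # extracted/recolored pixels
mask = np.load("hue_mask_clean.npy")                                 # cleaned binary mask

result = np.where(mask[..., None], recolored, original).astype(np.uint8)
Image.fromarray(result).save("backprojected.jpg", quality=100)
```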
Image saving
Save the final image (Fig. 14) as a JPEG file by selecting the Save JPG button located on the right side of the still-opened Hue Mask Panel. This action will prompt a new dialog window titled Save File as Jpeg, where you can adjust the image quality for the JPEG file.
Set the JPEG quality value to 100 to save the image at the highest possible quality (Fig. 13). Within this dialog, you'll also encounter the Save matrix also? option. This feature enables the saving of a TXT file containing numeric metadata information about the applied matrix transformation. While this option may be relevant for applying similar basic transformation operations to a batch of images, it might not be particularly useful. Transparent documentation of the values used and the applied steps, e.g., using the provided documentation sheets in the supplementary materials by Göldner and Deter-Wolf (2023), is more crucial for comprehensive documentation.

Fig. 14: Saving dialog.
To save the JPEG image file, press OK. Navigate to the target folder, name the file, and click on the Save button.
Final image review
Review your saved image (Fig. 14).

Fig. 14: Final image.
Protocol references
Alley, R.E., 1996. Decorrelation Stretch. Version 2.2. Jet Propulsion Laboratory, Pasadena, CA.

Austin, A., 2022. Identifying red ink tattoos using DStretch®: imaging a papal tattoo identified on the arm of a man in the Burns collection. Archaeological and Anthropological Sciences 14, 12. https://doi.org/10.1007/s12520-021-01485-z

Báez-Molgado, S., Hart, K., Najarro, E., Sholts, S.B., Gilbert, W., 2013. Cranium and mandible imaging protocol: A guide for publication-quality digital photography of the skull (FOROST Skull Photography Protocol).
Campbell, N.A., 1996. The decorrelation stretch transformation. International Journal of Remote Sensing 17, 10, 1939-1949. https://doi.org/10.1080/01431169608948749
Cerasoni, J.N., Rodrigues, F.D.N., 2022. Small Object and Artefact Photography - 'SOAP' Protocol V.3. https://doi.org/10.17504/protocols.io.b53zq8p66

Emmitt, J., McAlister, A., Bawden, N., Armstrong, J., 2021. XRF and 3D Modelling on a Composite Etruscan Helmet. Applied Sciences 11, 8026. https://doi.org/10.3390/app11178026
Evans, L., Mourad, A.-F., 2018. DStretch® and Egyptian tomb paintings: A case study from Beni Hassan. Journal of Archaeological Science: Reports 18, 78–84. https://doi.org/10.1016/j.jasrep.2018.01.011

Dupras, T.L., Schultz, J.J., 2013. Taphonomic Bone Staining and Color Changes in Forensic Contexts. In Pokines, J.T., Symes, S.A. (eds.), Manual of Forensic Taphonomy, 315-340. Boca Raton: CRC Press. https://doi.org/10.1201/b15424-13
Gillespie, A.R., Kahle, A.B., Walker, R.E., 1986. Color Enhancement of Highly Correlated Images. I. Decorrelation and HSI Contrast Stretch. Remote Sensing of Environment – An Interdisciplinary Journal 20, 209-235.
Göldner, D., 2020. Anthropologische und taphonomische Analysen zur Befundrekonstruktion der frühmittelalterlichen Alsheim-Schädel in der Berliner Rudolf-Virchow-Sammlung. Mitteilungen der Berliner Gesellschaft für Anthropologie, Ethnologie und Urgeschichte 41 (2020), 171-186. https://doi.org/10.30819/mbgaeu.41.144

Göldner, D., 2022. ArchaeoScale Protocol: Inserting digital scale bars into scientific images – Full step-by-step guideline for anthropological and archaeological specimen photography v2. https://doi.org/10.17504/protocols.io.8epv5j1p6l1b/v1
Göldner, D., 2019. Die frühmittelalterlichen Schädel aus Alsheim in der Rudolf-Virchow-Sammlung. Versuch der Rekontextualisierung eines Gräberfeldes aus interdisziplinarer Sicht. Unpublished bachelor thesis. Institute for Prehistoric Archaeology. Free University Berlin (Berlin 2019).

Göldner, D., Deter-Wolf, A., 2023. DStretch Tattoo Protocol: Full step-by-step protocol for identification and visualization of tattoos on preserved archaeological remains using the ImageJ plugin DStretch. https://doi.org/10.17504/protocols.io.n92ldp52xl5b/v1
Gonzáles, E.R., Pastor, S.C., Casals, J.R., 2019. Lost colours: Photogrammetry, image analysis using the DStretch plugin, and 3-D modelling of post-firing painted pottery from the south west Iberian Peninsula. Digital Applications in Archaeology and Cultural Heritage 13, e00093. https://doi.org/10.1016/j.daach.2019.e00093
Gunn, R.G., Ogleby, C.L., Lee, D., Whear, R.L., 2010. A Method to visually rationalize superimposed pigment motifs. Rock Art Research 27, 2, 131-136.
Hämmerling, R., 2020. Schlecht dokumentiert – so gut wie verloren? Provenienzrecherche und Aufarbeitung einer Altgrabung in Alsheim. Mitteilungen der Berliner Gesellschaft für Anthropologie, Ethnologie und Urgeschichte 41 (2020), 129-145. https://doi.org/10.30819/mbgaeu.41.13
Hämmerling, R., 2019. Schlecht dokumentiert – so gut wie verloren? Provenienzrecherche und Aufarbeitung einer Altgrabung in Alsheim. Unpublished bachelor thesis. Institute for Prehistoric Archaeology. Free University Berlin (Berlin 2019).
Harman, J., 2005. Using Decorrelation Stretch to Enhance Rock Art Images. URL: www.dstretch.com/AlgorithmDescription.pdf (accessed 24.10.2022)
Judge, M.S., Miller, M., Lyons 3rd, M., 2018. Green Bone: Minocycline-Induced Discoloration of Bone Rarely Reported in Foot and Ankle. The Journal of Foot and Ankle Surgery, 57, 4, 801-807. https://doi.org/10.1053/j.jfas.2017.11.009

Kaur, H., Sohi, N., 2017. A novel enhancement method for colored rock art archaeological images. International Journal of Advanced Research in Computer Science, 8, 7, 1163-1167. http://dx.doi.org/10.26483/ijarcs.v8i7.4479
Schneider, C.A., Rasband, W.S., & Eliceiri, K.W. (2012). NIH Image to ImageJ: 25 years of image analysis. Nature Methods, 9, 7, 671-675. https://doi.org/10.1038/nmeth.2089