Does an Azure Backup of a VM bump users off the system during execution?

I haven’t yet been able to find any documentation that discusses this, so I’m hoping to learn from others’ experience here. I have a 1 TB Standard SSD attached to my Windows Server 2019 VM as data disk LUN 0. How long would the Azure Backup policy I’ve just created take to run, and would it cause any connectivity issues for our RDP users while it runs? I suppose I’ll find out eventually, but I’d like to give my users a heads-up if this is known to cause connectivity problems.

Any good resources to bump up my GoPro Hero 7 [Black edition] skills?

Are there any GoPro aficionados out there with good skills? I would like to get my hands on some good resources/tutorials on how to use this equipment that I recently purchased.

Some post-processing tutorials would also be helpful. I am aware of Adobe Lightroom, but do I need something else for GoPro file types?

bash wrapper around ‘git commit’ to automatically bump (Python) package CalVer and create matching CalVer tag on new commit

I’m doing a lot of Python development lately for various (small-)data analysis pipelines at work. I’ve been wrestling with how to robustly and ~automatically version the code at a fine-grained level, so as to provide strong guarantees of reproducibility of a result generated from any particular version of the code, at any particular point in my development process.

I’ve settled on a CalVer approach. Given that I often want to have multiple versions of the code tagged within a single day, I’m using a ~nonstandard $TIMESTAMP format of YYYY.MM.DD.hhmm. (hhmmss seemed like it would be overkill.)
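For reference, this format maps directly onto a strftime pattern, so generating it is a one-liner (the same date invocation the script below uses):

```shell
# Build the CalVer timestamp described above: YYYY.MM.DD.hhmm,
# with every field zero-padded.
TIMESTAMP="$(date '+%Y.%m.%d.%H%M')"
echo "$TIMESTAMP"
```

Because every field is zero-padded, the resulting tags sort chronologically even under plain lexicographic ordering.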

In any event, I want two things to happen every time I commit code to one of these data analysis repos:

  1. Wherever relevant in the package (usually just in the main __init__.py), __version__ should be updated to $TIMESTAMP.
  2. Once the code is committed, a tag named $TIMESTAMP should be applied to the new commit.

Ancillary goals are the usual: easy to configure, minimal likelihood of breaking all the things, and minimal additional cleanup effort in common non-happy-path scenarios.
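Goal 1 boils down to a single in-place sed substitution. A minimal sketch, assuming GNU sed and a __version__ assignment that sits alone on its own line (the file path here is just illustrative):

```shell
TIMESTAMP="$(date '+%Y.%m.%d.%H%M')"

# Replace any existing __version__ assignment with the new timestamp.
# GNU sed's -i edits the file in place; BSD/macOS sed would need -i ''.
sed -i "s/^__version__ = .*$/__version__ = '$TIMESTAMP'/" pkg/__init__.py
```

Lines that don't start with __version__ pass through untouched, so the rest of the module is left alone.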

The following is a bash script I’ve put together for the purpose:

    #! /bin/bash

    export TIMESTAMP="$( date '+%Y.%m.%d.%H%M' )"
    export VERPATH='.verpath'

    if [ ! -e "$VERPATH" ]
    then
      # Complain and exit
      echo "ERROR: Path to files with versions to update must be provided in {repo root}/.verpath"
      echo " "
      exit 1
    fi

    # $VERPATH must contain the paths to the files to be updated with
    # the timestamped version, one per line
    while read VERFILE
    do
      # Cosmetic
      echo ""

      if [ -e "$VERFILE" ]
      then
        # File to be updated with version exists; update and add to commit.
        # Tempfile with old file stored in case of commit cancellation.
        echo "Updating $VERFILE"
        cp "$VERFILE" "$VERFILE.tmp"
        sed -i "s/^__version__ = .*$/__version__ = '$TIMESTAMP'/" "$VERFILE"
        git add "$VERFILE"
      else
        echo "$VERFILE not found!"
      fi
    done < "$VERPATH"

    # Cosmetic
    echo ""

    # So user can see what was updated
    sleep 2s

    # Actually do the commit, passing through any parameters
    git commit "$@"

    # If the commit succeeded, tag HEAD with $TIMESTAMP and delete temp file(s).
    # If the commit failed, restore the prior state of the $VERFILEs.
    if [ "$?" -eq "0" ]
    then
      git tag -f "$TIMESTAMP"

      while read VERFILE
      do
        rm -f "$VERFILE.tmp"
      done < "$VERPATH"
    else
      while read VERFILE
      do
        if [ -e "$VERFILE.tmp" ]
        then
          git reset HEAD "$VERFILE" > /dev/null 2>&1
          rm "$VERFILE"
          mv "$VERFILE.tmp" "$VERFILE"
        fi
      done < "$VERPATH"
    fi

The contents of .verpath in my test repo are:

    pkg/__init__.py
    pkg/__dupe__.py
    pkg/nofile.py

Both pkg/__init__.py and pkg/__dupe__.py exist; pkg/nofile.py does not.

I have *.tmp in my .gitignore so that the $VERFILE.tmp backups don’t show up as untracked files when drafting the commit message.


It works like I want it to: the happy path works great, and it handles aborted commits and nonexistent files listed in .verpath gracefully.

I’m no bash expert, though, so I’m partly concerned about subtle misbehaviors I haven’t thought of. Also, I’m not super thrilled about the use of in-folder temporary files, and per here and here the while read VERFILE ... done < $VERPATH construct has the potential to be fragile if I don’t set it up correctly.
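On that last point, the usual hardening is IFS= read -r (so leading/trailing whitespace and backslashes in paths survive) plus an || [ -n "$VERFILE" ] guard so a final line with no trailing newline is still processed; mktemp would also move the backups out of the working tree entirely. A sketch of what a hardened loop might look like (not the script as posted, just the idiom):

```shell
#!/bin/bash
VERPATH='.verpath'

# IFS= and -r keep whitespace and backslashes in paths intact;
# the || [ -n "$VERFILE" ] clause still processes a last line
# that lacks a trailing newline.
while IFS= read -r VERFILE || [ -n "$VERFILE" ]
do
  if [ -e "$VERFILE" ]
  then
    # mktemp puts the backup outside the repo, so no .gitignore
    # entry is needed and an aborted commit can't leave *.tmp litter.
    BACKUP="$(mktemp)"
    cp "$VERFILE" "$BACKUP"
    echo "Backed up $VERFILE to $BACKUP"
  else
    echo "$VERFILE not found!"
  fi
done < "$VERPATH"
```

With mktemp backups you would also need to record the $VERFILE-to-backup mapping (e.g. in an associative array) so the failed-commit branch can restore the right files.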

How can I bump up shadows in post without overexposing the image?

I’m trying to create a Photoshop/Lightroom RAW processing preset to process 1200 images for photogrammetry. It’s important that the lighting on them is as flat as possible and the shadows get bumped up to reveal details.

On some images, such as this one, pushing up the shadows and pushing down the highlights gives great results, making the lighting flatter and revealing details hidden in the shadows.

On others, especially ones with bright sky in them, such as this one, the exact same settings make the photo far too bright, presumably because the image is brighter on average. Notice that even though the wall is the same in both images, the same settings make it look very different.

Is there a setting that will make both images work? It’s important that I don’t have to manually tweak each photo since there are so many of them. All images were shot with the same ISO, aperture, shutter speed, etc.