Storing users’ private keys to clone git repositories on their behalf

Let’s say I have a multi-user application deployed in a customer’s data center that needs to clone/update git repositories – both “interactively” when a user is creating a new project (which consists of 1-N git repositories) and “in the background” when the app is updating those projects and running analyses on them.

The way I’m doing it right now is that admins configure git with a proper SSH key on the machine where the app is installed, and the key is then used automatically by git for all the required clone/pull operations.

However, there’s a problem with this approach: in some configurations, not all users of the application are supposed to have access to all projects/repositories. But since the git SSH key is “shared”, they can clone any repository accessible with the key (assuming they know its URL), even though they wouldn’t be able to access such repositories normally.

To fix this, I’d need to make sure that users are only able to clone repositories to which they really do have access. I can only think of two possible solutions:

  1. Build a proper integration with all possible Git providers (GitHub, GitLab, Bitbucket, Team Foundation Server, etc.) and authenticate users via these providers’ APIs. Then store users’ access tokens and use them to perform git operations (via HTTPS-style git URLs).
  2. Require each user (capable of creating a new project) to upload their private SSH key, which will then be used to perform all git operations for all projects they create.

Option 1 sounds really complex and isn’t really compatible with the current design and existing installations. Moreover, not all git providers (including custom git servers) may offer suitable authentication options via “access tokens”.

Option 2 means that we’d need to get ssh keys for all users capable of creating a new project and then store those ssh keys somewhere (possibly in a database).
This sounds bad. I’ve read two related questions (How to securely store users' private keys? and Storing User's Private Keys in DB), and they only confirm my worries.
Is this still a reasonable approach to follow in our case, or should we avoid it at all costs? Are there any other options we could use, or extra preventive measures we should implement? (Remember the app won’t be installed on our servers, so what we can do at that level is limited.)

Side note: One other option we thought of would be to make users use “Basic” authentication with git, that is, specify git clone URLs like https://user:token@github.com/org/repo. However, we don’t like this as much, since it’s still difficult to manage and users have to enter credentials in plaintext (which has already proved easy to leak). It is, however, zero effort on our side, so it’s appealing from that point of view :).
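For what it’s worth, option 2 doesn’t require changing repository URLs: git can be pointed at a per-user key file at invocation time via the standard GIT_SSH_COMMAND environment variable. Below is a minimal sketch of that mechanism; the key_for_user helper and the /var/lib/myapp/keys/<username> layout are illustrative assumptions, not part of the app described above.

```shell
# Sketch: run git operations with a per-user SSH key via GIT_SSH_COMMAND.
# The key path layout below is a hypothetical example.
key_for_user() {
    # one private key file per application user
    printf '/var/lib/myapp/keys/%s' "$1"
}

clone_as_user() {
    local user="$1" url="$2" dest="$3"
    # IdentitiesOnly=yes stops ssh from falling back to other keys,
    # so each user can only reach repositories their own key can access
    GIT_SSH_COMMAND="ssh -i $(key_for_user "$user") -o IdentitiesOnly=yes" \
        git clone "$url" "$dest"
}

# Example (hypothetical user and repository):
# clone_as_user alice git@github.com:org/private-repo.git /srv/projects/alice/private-repo
```

This only addresses how a stored per-user key would be used, not how safely it can be stored, which remains the core of the question.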

Bash: Code to sync GitHub repositories: local, remote and forks

It’s a simple bash script but I’m hoping for feedback, advice and examples on how to improve the script and code.

Can you guide me on how to put more checks in the code, and anything else I could add if possible?

This code:

  • Sets IFS variable and backs it up
  • Sets a trap for signals that can kill the script, and a trap on EXIT to run a cleanup function.
  • Then it pushes changes from the local TO the remote repository
  • Then it syncs the local copy FROM the repository/fork
  • Then there is code to update the Fork from the original but that will be used later on other repositories.
  • Then it gives control back to these signals SIGINT SIGQUIT SIGTERM
  • Then it’s set to send an email with the result/status of what’s been run after the exit

Here is the code:

    #!/usr/bin/env bash

    IFS_OLD=$IFS
    IFS=$'\n\t'

    cleanup () {
        if [ -n "$1" ]; then
            echo "Aborted by $1"
        elif [ $status -ne 0 ]; then
            echo "Failure (status $status)"
        else
            echo "Success"
            IFS=$IFS_OLD
            #cd "$HOME" || { echo "cd $HOME failed"; exit 155; }
        fi
    }
    trap 'status=$?; cleanup; exit $status' EXIT
    trap 'trap - HUP; cleanup SIGHUP; kill -HUP $$' HUP

    ########################################################################
    # Sync the local TO the remote
    ########################################################################
    #{
    #{
    #...part of script with redirection...
    #} > file1 2>file2
    # ...and others as appropriate...
    cd /home/kristjan/gitRepo_May2019/ || { echo "Failed to cd to /home/kristjan/gitRepo_May2019/!!!!!!!!"; exit 155; }
    git add -A || { echo "Failed to git add -A: Sync the local TO the remote!!!!!!!!"; exit 155; }
    # only commit and push when git status does not report we are up-to-date
    if ! git status | grep -q 'up-to-date'
    then
        git commit -m 'One small commit for man, one giant leap for melted MacBooks, UNIX and Linux are the best!!!!!!!!!!!!!!!!!!!!!!!!!' || { echo "Failed to git commit -m '......': Sync the local TO the remote!!!!!!!!"; exit 155; }
        git push -u origin master || { echo "Failed to git push -u origin master: Sync the local TO remote!!!!!!!!"; exit 155; }
    fi

    ########################################################################
    # Sync the local copy FROM the original repository/fork (github)
    ########################################################################
    #git remote add upstream https://github.com/ORIGINAL_OWNER/ORIGINAL_REPOSITORY.git
    #git remote add upstream https://github.com/somethingSomething78/C_Programming.git || { echo "Failed to git remote add upstream: sync local copy FROM repo!!!!!!!!"; exit 155; }
    #cd gitRepo_May2019/ || { echo "Failed to cd to the install directory!!!!!!!!"; exit 155; }
    #sleep 2
    git fetch upstream || { echo "Failed to fetch upstream: sync local copy FROM repo!!!!!!!!"; exit 155; }
    git checkout master || { echo "Failed to git checkout master: sync local copy FROM repo!!!!!!!!"; exit 155; }
    git merge upstream/master || { echo "Failed to git merge upstream/master: sync local copy FROM repo!!!!!!!!"; exit 155; }

    ########################################################################
    # Sync the remote fork with the ORIGINAL repository (github)
    # Setting and configuring under a new filename and for other repos
    ########################################################################
    #git clone git@github.com:YOUR-USERNAME/YOUR-FORKED-REPO.git
    #cd into/cloned/fork-repo
    #git remote add upstream git://github.com/ORIGINAL-DEV-USERNAME/REPO-YOU-FORKED-FROM.git
    #git fetch upstream
    #git pull upstream master

    echo "Finished syncing system and remotes!"
    sleep 2

    # Give control back to these signals
    trap - SIGINT SIGQUIT SIGTERM

    #} > file1 2>file2
    # ...and others as appropriate...

    exit 0
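On the request for more checks: a few generic defensive additions could look like the sketch below. Nothing here is specific to the script above, and the function names are made up for illustration; the 155 exit code follows the script’s own convention.

```shell
# Sketch of extra defensive checks for a sync script (names are illustrative)
set -euo pipefail    # abort on errors, unset variables and pipeline failures

require_cmd() {
    # fail early if a needed command is missing from PATH
    command -v "$1" >/dev/null 2>&1 \
        || { echo "missing command: $1" >&2; return 155; }
}

require_repo() {
    # refuse to operate on a directory that is not a git work tree
    git -C "$1" rev-parse --is-inside-work-tree >/dev/null 2>&1 \
        || { echo "not a git repository: $1" >&2; return 155; }
}

# Example usage near the top of the script:
# require_cmd git
# require_repo /home/kristjan/gitRepo_May2019
```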

This is my .git/config for connecting to the servers with SSH keys (you may notice that origin and upstream are the same; that’s because this repository is not a fork but my personal private repo. The code will be put to much better use once I set up syncing my forks):

    [core]
        repositoryformatversion = 0
        filemode = true
        bare = false
        logallrefupdates = true
    [remote "origin"]
        url = ssh://C_Programming:somethingSomething78/C_Programming.git
        fetch = +refs/heads/*:refs/remotes/origin/*
    [branch "master"]
        remote = origin
        merge = refs/heads/master
    [remote "upstream"]
        url = ssh://C_Programming/somethingSomething78/C_Programming.git
        fetch = +refs/heads/*:refs/remotes/upstream/*

I also had to setup ssh key for this particular repo and I have edited my .ssh/config file like this:

    Host C_Programming
        HostName github.com
        User git
        IdentityFile ~/.ssh/id_rsa-GITHUBSCRIPT

Here I am testing the script:

    [08:58:57][kristjan] ~ $ ./gitRepo_May2019.sh | mail -s "Github SYNC System Report: `hostname`" somethingSomething@mail.com
    Already on 'master'

This is the mail it sent afterwards:

Github SYNC System Report: Kundrum Hallur Kristjan Stefansson 09:00 (3 hours ago)
Your branch is up-to-date with ‘origin/master’.

Already up-to-date.

Finished syncing system and remotes!

Success

I put it in cron (crontab -e) on my Debian Stretch 9.9 system:

21 00 * * 7 /bin/bash /home/kristjan/gitRepo_May2019.sh | mail -s "Github SYNC System Report: `hostname`" somethingSomething@mail.com 
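One thing to note about the crontab line above: the pipe into mail(1) hides the script’s exit status, so the report can’t easily say whether the run failed. A small wrapper, sketched here (run_and_report is a made-up name), could capture both the output and the status before mailing:

```shell
run_and_report() {
    # run a command, capture its combined output and exit status,
    # and print a report suitable for piping to mail(1)
    local out rc
    out=$("$@" 2>&1)
    rc=$?
    printf '%s\nExit status: %d\n' "$out" "$rc"
    return "$rc"
}

# Usage (as it might appear in the crontab entry):
# run_and_report /bin/bash /home/kristjan/gitRepo_May2019.sh |
#     mail -s "Github SYNC System Report: $(hostname)" somethingSomething@mail.com
```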

Here are other questions on this site with code for the same or similar purpose:

Rust GitHub repository downloader

Clone GitHub repository using Python

Python script to synchronise your locally cloned fork to its parent github repository

No DHCP after Kali repositories installed in Ubuntu 18.04

I installed the Kali repositories on my Ubuntu 18.04 laptop using katoolin, and since then I haven’t been able to receive DHCP settings from my network. Both my wired and wireless interfaces function fine when a static IP and DNS are configured but I cannot get an IP address or DNS settings from my DHCP server. How do I get dynamic settings back?

Updating from repositories can’t be done securely

When I run sudo apt-get update on Ubuntu 19.04, it results in:

    Err:1 http://lt.archive.ubuntu.com/ubuntu disco InRelease
      Could not connect to lt.archive.ubuntu.com:80 (193.219.61.87). - connect (111: Connection refused)
      Cannot initiate the connection to lt.archive.ubuntu.com:80 (2001:778::87). - connect (101: Network is unreachable)
    Err:2 http://lt.archive.ubuntu.com/ubuntu disco-updates InRelease
      Cannot initiate the connection to lt.archive.ubuntu.com:80 (2001:778::87). - connect (101: Network is unreachable)
    Get:3 http://deb.playonlinux.com trusty InRelease [2,590 B]
    Hit:4 http://security.ubuntu.com/ubuntu disco-security InRelease
    Ign:5 http://ppa.launchpad.net/gnome3-team/gnome3/ubuntu disco InRelease
    Err:6 http://ppa.launchpad.net/gnome3-team/gnome3/ubuntu disco Release
      404 Not Found [IP: 91.189.95.83 80]
    Ign:7 http://download.opensuse.org/repositories/home:/strycore/xUbuntu_16.04 InRelease
    Get:8 http://download.opensuse.org/repositories/home:/strycore/xUbuntu_16.04 Release [986 B]
    Get:9 http://deb.playonlinux.com trusty/main amd64 Packages [564 B]
    Get:10 http://download.opensuse.org/repositories/home:/strycore/xUbuntu_16.04 Release.gpg [481 B]
    Get:11 http://deb.playonlinux.com trusty/main i386 Packages [564 B]
    Get:12 http://download.opensuse.org/repositories/home:/strycore/xUbuntu_16.04 Packages [831 B]
    Reading package lists... Done
    E: The repository 'http://ppa.launchpad.net/gnome3-team/gnome3/ubuntu disco Release' does not have a Release file.
    N: Updating from such a repository can't be done securely, and is therefore disabled by default.
    N: See apt-secure(8) manpage for repository creation and user configuration details.

I cannot install any program with sudo apt-get install. I have tried everything I’ve seen so far, but none of it worked for me. I would appreciate your help. Thank you.

Is it a good idea to share repositories across microservices in Spring Boot Application?

We are migrating a desktop application to a web-based Spring Boot microservices application, with a client-imposed mandate of using their existing MySQL database, so all microservices share a common database.

Since it’s a SQL database, we chose Spring JPA (Hibernate).

During project setup, our architecture team generated entities via Hibernate Tools into a “db-commons” project and also added Spring JPA repositories to this shared library citing reusable code.

Although shared entities sound harmless to me, I vehemently opposed the idea of having shared repositories, as:

  1. It violates the S of SOLID; a microservice should only see and operate on the data it owns.
  2. Developers under pressure would directly use these repositories in other services to modify data owned by other microservices.
  3. It leads to duplicate code and possibly missed validations.
  4. It could lead to concurrency and data issues at scale.

Are my concerns wrong?

If they’re right, did I miss any possible negative impacts (present or future)?

Sorting github repositories by topic

I have many repositories in GitHub and I would like to somehow sort them by topic. I looked for a way to “tag” or “group” repositories but did not find such an option.

One option is to create, for each topic, an “organization” where I will be the only member, and transfer the repositories of this topic to that organization. Does this option have any unwanted side-effects?

Is there a better way to arrange github repositories into topics?

Best way to re-use modules/themes across multiple repositories

We run three websites that share the same core module. We also have a core theme that each of the three websites’ themes uses as its parent.

Right now if I make an edit to the Vendor_Core module or frontend/vendor/default theme then I will manually copy the files across to the other 2 websites which isn’t ideal.

What’s the best way to manage this? I have three separate repositories for the whole Magento project.

Should I move the core module and theme into their own repos?

Stop being auto-subscribed to GitHub repositories from an organisation

I’ve been added to a GitHub “organisation” for some things I do at the day job occasionally (administrative and Debian packaging things). I’m not normally involved in the development of the stuff hosted by that GitHub organisation.

However, when someone creates a new repository on that organisation, I am automatically subscribed to it and need to manually unwatch it, all the time.

How can I stop being automatically subscribed to new repositories—ideally, for one particular organisation only?