Safest option for resizing /home and merging newly freed space with existing free space to form shared partition

I have Ubuntu 18.04.2 installed in dual-boot mode along with Windows 10, and here is my current partition setup.

[screenshot: current partition layout]

What I’d like to do is install Ubuntu 18.04.4 afresh while resizing my /home partition to, say, 4 GB (is this a reasonable size? I do not plan on storing any large files such as audio/video ones) and then combining the freed space with the 2.36 GB currently unallocated to create a shared NTFS partition.

What is my best (safest) course of action? Should I carry out the partition resizing and partition creation steps before I install Ubuntu 18.04.4? I believe I can do this by booting from a GParted live USB. Then I could just install 18.04.4 on the existing / partition while formatting it and the newly resized /home. Is this the right way to do it?

Or will I get a chance to make these changes at the time of the installation of the OS? I have not clicked beyond this screen during the installation process and do not know whether the above will be an option.

[screenshot: Ubuntu installer screen]

Merging individual files from two folders sequentially

I need to merge the contents of two folders sequentially into individual destinations. Here is what I have. A, B, C, D, E, F are individual files.

Folder 1: A B C ...
Folder 2: D E F ...

At the output I need merged files sequentially. So they should be merged like this:

Output: AD BE CF ... 

In other words: the first file from Folder 1 with the first file from Folder 2, the second file from Folder 1 with the second file from Folder 2. In our case AD consists of the contents of files A and D, BE of files B and E, etc.

Both folders have the same number of files. Output can go anywhere. For the sake of simplicity let’s call it Output folder and locate it in the same root as Folder 1 and 2.

Been looking for the solution for two days now. Yikes!

Merging nodes of a DAG

I would like to merge connected nodes with a specific attribute of a directed acyclic graph. After each merge operation, the graph should remain acyclic. Let’s say my graph contains blue nodes and white nodes.

[figure: DAG with blue and white nodes]

A and B are connected, so they may be merged:

[figure: graph after merging A and B]

The above graph is a DAG. Merging node {A,B} with node {D} is illegal, since it creates a cycle!

[figure: merging {A,B} with D creates a cycle]

My current algorithm is based on union-find.

```
for each blue node b:
    Make-Set(b)
merge blue nodes if they are connected
```

The bug is that the output graph contains cycles. How can I avoid merges that create cycles in the graph?
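One safety check worth noting: contracting an edge u → v in a DAG creates a cycle exactly when there is some other path from u to v besides the direct edge (collapsing the two endpoints then closes that path into a loop). A sketch of the test, assuming the graph is stored as a plain adjacency map and the function names are mine:

```javascript
// Returns true if v is reachable from u via some path that does NOT
// use the direct edge u -> v. If such a path exists, merging u and v
// would close a cycle.
function hasAlternatePath(adj, u, v) {
  const stack = (adj[u] || []).filter(w => w !== v); // skip the direct edge
  const seen = new Set(stack);
  while (stack.length) {
    const node = stack.pop();
    if (node === v) return true;
    for (const next of adj[node] || []) {
      if (!seen.has(next)) {
        seen.add(next);
        stack.push(next);
      }
    }
  }
  return false;
}

// Adjacent nodes u -> v may be merged only if the direct edge is the
// sole path between them.
function canMerge(adj, u, v) {
  return !hasAlternatePath(adj, u, v);
}
```

In a union-find loop the check would have to run against the *contracted* graph at each step (re-run it after every accepted merge), since earlier merges can create new alternate paths between the remaining candidates.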

Decoding MP3 file then merging with other PCM streams

I am looking to merge PCM streams containing microphone data with the sound from an MP3 file. The main idea behind this is to cut down on bandwidth.

Now my problem is: how do I decode a given MP3 file chunk by chunk to merge it with my other PCM sources for streaming?

My theoretical solution: first pre-decode the MP3 file on my server and save it as PCM data (storage is cheaper than bandwidth). Only when the server is ready to play the audio do I take chunks of the PCM data (the pre-decoded MP3) and merge/mix them with the other PCM sources.

I am not sure whether this would work and would appreciate a pointer in the right direction.
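The mixing step itself is straightforward once everything is PCM with the same sample rate and channel layout: add the samples and clip. A sketch for 16-bit signed PCM (the function name and the Int16Array representation are my assumptions; real mixers often attenuate each input or use a limiter instead of hard clipping):

```javascript
// Mix two 16-bit signed PCM chunks by sample-wise addition with
// clipping to the valid range. Both chunks are assumed to have the
// same length, sample rate, and channel layout; align them beforehand.
function mixPcm16(a, b) {
  const out = new Int16Array(a.length);
  for (let i = 0; i < a.length; i++) {
    const sum = a[i] + b[i];
    out[i] = Math.max(-32768, Math.min(32767, sum)); // clip to int16 range
  }
  return out;
}
```

Hard clipping distorts when both sources are loud at the same moment; scaling each input by 0.5 before summing is a cheap alternative.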

Does merging with your fetch give you a merit?

In the 1d4chan summary of Changeling: The Lost there is a line that says:

…Or, you can do the “hard but moral” thing and try to merge with it, (since, you know, it’s you), which gives you even more cool stuff and merits.

However, in Autumn Nightmares there is no mention of a Merit received from merging with your fetch. The exact line regarding what you gain from merging with your fetch is:

When it is over, the fetch is gone and the character gains a dot of Clarity, as well as the more-valuable memories from her time in exile.

What I wish to ask is: is there a Merit that represents acquiring those memories, or something similar, mentioned in another book?

16.04 replacing folders instead of merging them

When I try to copy a folder into a directory where an older version of that folder already exists, say when backing up a folder to an external hard drive, this window pops up:

[screenshot: file-conflict dialog offering a Merge option]

Is there a way to also have a Replace option? Because merging will keep all the files in the older folder that I have deleted in the newer one, which is not always preferable.

I use Ubuntu 16.04 with Cinnamon and Nemo as the default file manager.

Merging teams with different front end framework preferences

My company just went through a merger, and as a result my small team has been expanded. On our side we have been using React for several years, and I firmly believe that for us it is the superior choice in every way to Angular, which they have used historically. We both have our own templates, developed over the last few years for various common projects.

Given that the end product of both can be made to look and function almost identically, is it advisable to let developers use whichever framework they are most comfortable with for new projects? In my head, if everyone sticks to some predefined style and data protocols, end users won't know the difference, and everyone is happy.

As an additional complexity, if we do take this approach, should new junior devs be steered toward a majority framework?

Can long lived feature branches be justified by merging master into the feature branch daily

I’ve always considered long-lived feature branches a symptom of some underlying problem, but I’ve recently moved jobs, and the company I am working for now encourages them. They say it’s fine as long as you merge master (well, the dev branch, but whatever branch we all work off of) into the feature branch daily. They say you don’t end up with conflicts that way, or if you do, they are small ones. I don’t personally understand this. Is there any truth behind the idea that the difficulties surrounding long-lived feature branches can be negated by merging master into the branch frequently?

Use LVM for extending a folder’s size by merging it with a folder on an external drive

I basically have this folder where I need to store 1.5 TB. However, I only have 900 GB on my internal drive. How do I configure a folder using LVM so that it tries to fill up the internal drive first, then moves files to the external drive once it runs out of internal drive space?

Note: I want it to regularly check whether the internal drive has free space; if it does, the computer should keep moving files from a designated folder on the external drive back to the internal drive until internal storage fills up.

Merging partial objects while avoiding path collisions and ordering array indexes

I have been creating a small library to copy parts of deeply nested objects through path strings that support wildcards. For that I list all paths and values actually found as ObjectPartials, and in merge-partials these are merged while avoiding path collisions and ordering array indexes. This review is only about that part.

The whole merge-partials.js seems too contrived and confusing. I wanted to have smaller, clearer functions that don't rely so much on side effects, as here the ObjectPartials passed to the functions are being mutated. Maybe there is an even simpler solution that I am not seeing.


```javascript
const set = require('lodash.set');

const InvalidArgError = require('./errors/invalid-arg-error');
const { isUndefined } = require('./utils');

module.exports = class ObjectPartial {
  /**
   * Creates a new ObjectPartial with the given path and its value
   * @param {Array} path
   * @param {any} value
   */
  constructor(path, value) {
    if (!Array.isArray(path) || !path.length || isUndefined(value)) {
      throw new InvalidArgError(
        'No valid path or missing value for ObjectPartial'
      );
    }
    this.path = path.slice();
    this.value = value;
  }

  getPath() {
    return this.path;
  }

  getValue() {
    return this.value;
  }

  /**
   * Merge the partial into a new or an existing object
   * @param {*} object Optional object to merge the partial into
   * @returns Resulting object
   */
  mergeToObject(object = {}) {
    return set(object, this.path, this.value);
  }

  static createFromObject(object, path) {
    if (!path || !Array.isArray(path) || isUndefined(object)) {
      throw new InvalidArgError(
        'Missing path or object for ObjectPartial.createFromObject'
      );
    }
    let curValue = object;

    for (let i = 0; i < path.length; i++) {
      if (isUndefined(curValue[path[i]])) {
        return null;
      }
      curValue = curValue[path[i]];
    }

    return new ObjectPartial(path, curValue);
  }
};
```


```javascript
const { isUndefined, pushAllUnique } = require('./utils');
const InvalidArgError = require('./errors/invalid-arg-error');

/**
 * Creates a map of all the keys on current partials level and its subsequent keys
 * @param {ObjectPartial[]} partials
 * @param {Number} pathIndex
 * @returns Map of each key occurance (*key* (key, indices[], nextKeys[])) to the partials and next keys
 */
function createPossibleCollisionsKeyMap(partials, pathIndex) {
  const result = { possibleColisionKeys: {}, hasNextLevel: false };

  for (let i = 0; i < partials.length; i++) {
    const partialPath = partials[i].getPath() || [];
    const key = partialPath[pathIndex];
    const nextKey = partialPath[pathIndex + 1];

    if (!isUndefined(key)) {
      if (isUndefined(result.possibleColisionKeys[key])) {
        result.possibleColisionKeys[key] = { key, indices: [], nextKeys: [] };
      }
      result.possibleColisionKeys[key].indices.push(i);
      result.possibleColisionKeys[key].nextKeys.push(nextKey);
    }
    if (!isUndefined(nextKey)) {
      result.hasNextLevel = true;
    }
  }

  return result;
}

function hasPossibleKeyCollision(indices, nextKeys) {
  const hasMultiple = indices.length > 1;
  const hasNextUndefined = nextKeys.findIndex(isUndefined) > -1;
  const hasNextNonZeroIndices =
    nextKeys.filter(nextKey => Number.isInteger(nextKey) && nextKey != 0)
      .length > 0;

  return (hasNextUndefined && hasMultiple) || hasNextNonZeroIndices;
}

/**
 * Solves possible key collisions and normalizes array indices (order from 0)
 * @hasSideEffects
 * @param {ObjectPartial[]} partials
 * @param {Array} pathIndex
 * @param {Object} indices.nextKeys possible colision key map
 */
function solveKeyCollision(partials, pathIndex, { indices, nextKeys }) {
  if (!hasPossibleKeyCollision(indices, nextKeys)) {
    return;
  }
  const indicesTransform = [];
  let curNextIndex = 0;

  for (let i = 0; i < indices.length; i++) {
    const partialIndex = indices[i];
    const partialNextKey = nextKeys[i];
    const partialPath = partials[partialIndex].getPath();

    if (isUndefined(partialNextKey)) {
      partialPath.splice(pathIndex + 1, 0, curNextIndex);
      curNextIndex += 1;
    } else if (Number.isInteger(partialNextKey)) {
      let newIndex = indicesTransform[partialNextKey];

      if (isUndefined(newIndex)) {
        newIndex = curNextIndex;
        curNextIndex += 1;
        indicesTransform[partialNextKey] = newIndex;
      }
      partialPath[pathIndex + 1] = newIndex;
    }
  }
}

/**
 * Handles the possible collisions in current section of the object partials
 * @param {ObjectPartial[]} partials
 * @param {*} scanMaps (curLevel, partialIndices[])
 * @returns (hasNextLevel, uniqueKeys)
 */
function handlePathColisions(partials, { curLevel, partialIndices = [] }) {
  const thesePartials = partials.filter((_partial, index) =>
    partialIndices.includes(index)
  );
  const { possibleColisionKeys, hasNextLevel } = createPossibleCollisionsKeyMap(
    thesePartials,
    curLevel
  );
  const uniqueKeys = [];

  Object.keys(possibleColisionKeys).forEach(key => {
    solveKeyCollision(thesePartials, curLevel, possibleColisionKeys[key]);
    uniqueKeys.push(possibleColisionKeys[key].key);
  });

  return { hasNextLevel, uniqueKeys };
}

function createDummyScanMap(partials, level) {
  return {
    curLevel: level,
    partialIndices:, index) => {
      return {
        path: partial.getPath(),
        index
      };
    })
  };
}

/**
 * Gets a part of the ObjectPartials to be checked for possible path collisions
 * @param {ObjectPartial[]} partials
 * @param {Number} level
 * @param {string} parentKey
 * @returns {Object} Object composed of (curLevel,parentKey,partialIndices[] (path,index))
 */
function getScanMap(partials, level, parentKey) {
  const result = createDummyScanMap(partials, level);

  if (isUndefined(parentKey)) {
    result.partialIndices = result.partialIndices
      .filter(value => !isUndefined(value.path[0]))
      .map(value => value.index);
  } else {
    result.partialIndices = result.partialIndices
      .filter(
        value =>
          value.path[level - 1] === parentKey && !isUndefined(value.path[level])
      )
      .map(value => value.index);
  }

  return result;
}

/**
 * Prepares object partials to merge contents and normalize array indexes
 * @hasSideEffects
 * @param {ObjectPartial[]} partials
 */
/* eslint-disable no-loop-func */
function mergePartials(partials) {
  if (!partials || !Array.isArray(partials) || !partials.length) {
    throw new InvalidArgError('Missing ObjectPartial list for mergePartials()');
  }
  let hasNextLevel = true;
  let curLevel = 0;
  let parentKeys;
  let scanMaps;
  let tempResult;

  while (hasNextLevel) {
    if (curLevel === 0) {
      scanMaps = [getScanMap(partials, 0)];
    } else {
      scanMaps = => getScanMap(partials, curLevel, pKey));
    }
    parentKeys = [];

    for (let i = 0; i < scanMaps.length; i++) {
      tempResult = handlePathColisions(partials, scanMaps[i]);
      pushAllUnique(parentKeys, tempResult.uniqueKeys);
      hasNextLevel = false || tempResult.hasNextLevel;
    }
    curLevel += 1;
  }
}

module.exports = mergePartials;
```

I did write some tests for this, and they pass; here is an example of input and output ObjectPartials. I hope the purpose of the whole thing is clear; let me know if I missed something.

```javascript
mergePartials([
  new ObjectPartial(['A', 3, 'BA'], 'example'),
  new ObjectPartial(['A', 3, 'B'], 123),
  new ObjectPartial(['A', 4, 'B'], 456),
  new ObjectPartial(['D'], 'yay'),
  new ObjectPartial(['D', 0], 'nay')
]);
/* -->
[
  new ObjectPartial(['A', 0, 'BA'], 'example'),
  new ObjectPartial(['A', 0, 'B'], 123),
  new ObjectPartial(['A', 1, 'B'], 456),
  new ObjectPartial(['D', 0], 'yay'),
  new ObjectPartial(['D', 1], 'nay')
] */
```