A Rule to copy the user’s picture

In Drupal 7, I’m trying to create a rule that copies the user’s profile picture into an image field on another content type. How can I select the ‘picture’ from a user’s account as the source for the copy? The event is after updating an existing account, which provides the source ‘account’. The File Field Rules module provides the copy action, but it requires a source field, and the user picture isn’t a field.

Another approach I tried is via the file URL. The FileField Sources module allows the use of a remote URL in an image field, but if I try to set the value of the image field with the token [account:picture:url], Rules doesn’t allow it.

There is a similar question here: Rules: Copying field from user's account to field in content type, but the solution there uses an additional field attached to the user, not the default ‘picture’ one.

thanks, d

How can I create a copy of an image that carries my edits within Lightroom Classic CC?

I’m using a workflow in Lightroom Classic CC that I don’t believe is optimal. To get the results I want, I sometimes make some adjustments in Lightroom (fixing exposure, shadows, highlights, camera corrections, etc.), then open the image in Photoshop CC, carrying over the Lightroom changes, and simply save it to generate a TIFF file. Then I go back to Lightroom and work on styling, sometimes using a few select presets that I like.

If I open the image and just apply the preset, the results are far inferior to those from this weird and clunky method. I’ve tried Virtual Copies, but a Virtual Copy’s changes aren’t made on top of my previous edits; they override them.

Is there a better way to do this?

SharePoint 2010 – Copy user field values containing orphaned users

Good day everyone,

I need to copy documents and metadata from a library in site collection A (legacy) to another one in site collection B (brand new) via PowerShell. Luckily, there is only the native metadata, but the SPFieldUser fields (Author, Editor) are giving me a hard time.

Of course, many values refer to people who have since left the company and had their accounts disabled or deleted. And since the source and target site collections are different, they don’t rely on the same User Information List, which makes these values really tricky to migrate.

What I’ve tried so far:

I have no issue copying the files and folders themselves. For the sake of readability, I’ll just show the method that copies metadata from $srcItem (the original SPListItem in site collection A) to $tgtItem (its copy in site collection B).

1. The “brute-force” way

```powershell
function Copy-Metadata($srcItem, $tgtItem) {
    $tgtItem["Created"]  = $srcItem["Created"]
    $tgtItem["Author"]   = $srcItem["Author"]
    $tgtItem["Modified"] = $srcItem["Modified"]
    $tgtItem["Editor"]   = $srcItem["Editor"]
    $tgtItem.Update()
}
```

Of course this doesn’t work properly, because user IDs don’t match between the two site collections.

2. The “classic” way

```powershell
function Copy-Metadata($srcItem, $tgtItem) {
    $tgtItem["Created"]  = $srcItem["Created"]
    $tgtItem["Author"]   = $tgtItem.Web.EnsureUser((New-Object Microsoft.SharePoint.SPFieldUserValue($srcItem.Web, $srcItem["Author"])).User.LoginName)
    $tgtItem["Modified"] = $srcItem["Modified"]
    $tgtItem["Editor"]   = $tgtItem.Web.EnsureUser((New-Object Microsoft.SharePoint.SPFieldUserValue($srcItem.Web, $srcItem["Editor"])).User.LoginName)
    $tgtItem.Update()
}
```

Works great for active users, but raises the following exception for orphaned ones: Exception calling “EnsureUser” with “1” argument(s): “The specified user domain\goneuser could not be found.”

3. The “desperate” way

```powershell
function Copy-Metadata($srcItem, $tgtItem) {
    $tgtItem["Created"]  = $srcItem["Created"]
    $tgtItem["Author"]   = Build-SPFieldUserValue $srcItem $tgtItem "Author"
    $tgtItem["Modified"] = $srcItem["Modified"]
    $tgtItem["Editor"]   = Build-SPFieldUserValue $srcItem $tgtItem "Editor"
    $tgtItem.Update()
}

function Build-SPFieldUserValue($srcItem, $tgtItem, $fieldInternalName) {
    $srcUser = (New-Object Microsoft.SharePoint.SPFieldUserValue($srcItem.Web, $srcItem[$fieldInternalName])).User
    try {
        $tgtUser = $tgtItem.Web.EnsureUser($srcUser.LoginName)
    }
    catch {
        New-SPUser -UserAlias $srcUser.LoginName -DisplayName $srcUser.Name -Web $tgtItem.Web.Url
        $tgtUser = $tgtItem.Web.EnsureUser($srcUser.LoginName)
    }
    return $tgtUser
}
```

Basically, I’m trying to explicitly add the orphaned user to the target site collection’s user cache, but this fails too, since the New-SPUser command queries Active Directory and raises the following exception: New-SPUser : The specified user domain\goneuser could not be found.
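One avenue that may be worth testing (a sketch against the server-side object model, untested here): SPUserCollection.Add writes the entry into the target site’s user list, and unlike EnsureUser or New-SPUser it reportedly does not need to resolve the login against Active Directory, so it can register an orphaned account. The function name and the notes string are illustrative, not an established helper:

```powershell
function Build-OrphanSafeUserValue($srcItem, $tgtWeb, $fieldInternalName) {
    # Resolve the user recorded in the source item
    $srcUser = (New-Object Microsoft.SharePoint.SPFieldUserValue($srcItem.Web, $srcItem[$fieldInternalName])).User
    try {
        # Works for accounts that still exist in AD
        return $tgtWeb.EnsureUser($srcUser.LoginName)
    }
    catch {
        # Add the orphaned login straight into the site's user collection (no AD lookup),
        # then read it back from the collection
        $tgtWeb.SiteUsers.Add($srcUser.LoginName, $srcUser.Email, $srcUser.Name, "migrated orphaned user")
        return $tgtWeb.SiteUsers[$srcUser.LoginName]
    }
}
```

The returned SPUser can then be assigned to the Author/Editor fields as in the "desperate" variant above.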

I’m now out of ideas. Does anyone know how to achieve this? Thank you very much.

Improve file copy performance over VPN

I have two Apple machines: a server and a client. Both are running the newest build of macOS Sierra, 10.12.6. The “server” machine runs macOS Server with the VPN and file-sharing services. Both machines are connected to gigabit internet (1 Gbps up and down) but are located thousands of miles apart. The disk read/write speeds on both ends are above 100 MB/s (800 Mbps).

I connect my client to the server’s VPN (L2TP) and then file-share over AFP. I regularly shuttle large files and folders from the client to the server. However, the highest speed I can get is 5 MB/s (40 Mbps) sustained, with a peak of 12 MB/s. Moving files the other way (server to client, with all transfers initiated on the client machine), I can get as high as 10 MB/s (80 Mbps) sustained, but never higher. Even accounting for the VPN’s overhead, I was hoping for more than, at most, 10% of the rated speed.

My questions are:

1. Is this low speed normal for a VPN (factoring in overhead)?
2. What can I do to improve my transfer performance (for large files, not large folders with lots of small files)?
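On a long-distance link, latency rather than bandwidth usually sets the ceiling: a single TCP stream cannot move more than one window of data per round trip, and L2TP encapsulation adds further overhead. A back-of-envelope check (the 80 ms RTT and 256 KB effective window are assumptions for illustration, not measured values):

```shell
RTT_MS=80        # assumed round-trip time for a link thousands of miles long
WINDOW_KB=256    # assumed effective TCP window for one connection

# Max single-stream throughput in KB/s = window / RTT
echo "$(( WINDOW_KB * 1000 / RTT_MS )) KB/s"   # ~3 MB/s, in line with the observed rates

# Window needed to fill a 1 Gbps link = bandwidth * RTT
echo "$(( 1000 / 8 * RTT_MS )) KB needed"      # 10 MB of in-flight data
```

If numbers like these match a measured `ping` RTT, the usual mitigations are running several transfers in parallel, or using a tool/protocol that opens multiple streams, rather than tuning the VPN itself.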

Is it possible to copy results from ngrok by pure terminal operations?

I currently use ngrok for a reverse tunnel.

Every time ngrok starts up, it hides the current terminal contents and switches to another screen. Then I read the tunnel address with my eyes and type it into another PC for ssh.

Is there an efficient, smarter way to copy the result from that second screen? I tried writing the results to a file, and xclip, but both failed.
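Assuming ngrok 2.x with default settings, the client serves a local inspection API on port 4040 while the tunnel is up, so the public address can be read programmatically instead of off the full-screen UI:

```shell
# Query ngrok's local inspection API and extract the first tunnel's public URL
curl -s http://127.0.0.1:4040/api/tunnels |
  python3 -c 'import sys, json; print(json.load(sys.stdin)["tunnels"][0]["public_url"])'
```

The output can be piped into `xclip -selection clipboard`, or fetched from the other PC over an existing connection, with no manual retyping.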

How do I replace an old copy of my gDoc (Google Sheet) with the new one without messing up the link?

I was told that simply dragging and dropping the new file into the folder should overwrite the old copy; this wasn’t true. Nor is there a “Manage versions” or similar option in the Drive interface’s right-click menu. (I’ve seen old screenshots where it appears in the same context menu as Download, Share, etc.) Within the sheet itself I can see the different versions, but there is nowhere to upload a new one.

The folder is on a Team Drive, but it doesn’t seem to work in my personal folders either.

Craft a Magnetic, Persuasive and Result-Getting Advertising Copy for $10

When you order this gig you will be in possession of a magnetic advertising copy that will skyrocket your profit. I will craft an irresistible advert that will help you realize your business goals in the shortest possible time. You will get these benefits from this gig:

- I will write your advertising copy to meet your specification
- Your prospective customers will be motivated to do business with you
- You will get massive profit as a result of advertising

This is the most result-getting advertising copy you can ever order!

by: masterwriter
Category: Content & Writing


Ansible: copy all local files to target from a local relative directory

With Ansible version 2.7.5 (running locally, on Ubuntu 18.10, with AppArmor disabled):

I have a role ‘udev’ with the structure below:

```
roles/udev
├── defaults
│   └── main.yml
├── files
│   └── udev_rules
│       ├── 99-local.rules
│       └── (potentially other rules files)
└── tasks
    └── main.yml
```

I want to copy all files from files/udev_rules into /etc/udev/rules.d/ (without a “udev_rules” subdirectory being created), so I have a task in udev/tasks/main.yml like so:

```yaml
- name: "Copy local udev rules"
  file:
    dest: "/etc/udev/rules.d/"
    src:  "files/udev_rules/"
    owner: "root"
    group: "root"
```

Running the playbook, I get no errors, but /etc/udev/rules.d remains empty:

```
TASK [udev : Copy local udev rules] ****************************************************************************************
[WARNING]: The src option requires state to be 'link' or 'hard'.  This will become an error in Ansible 2.10
ok: [localhost]
```

I’ve tried various combinations of trailing “/” on the src and dest values. In some cases the playbook runs to completion but copies nothing, or I get the warning above. The Ansible docs for the “src” parameter say this:

Local path to a file to copy to the remote server; can be absolute or relative. If path is a directory, it is copied recursively. In this case, if path ends with “/”, only inside contents of that directory are copied to destination. Otherwise, if it does not end with “/”, the directory itself with all contents is copied. This behavior is similar to Rsync

(my emphasis added)

Update: if I heed the warning and add “state: hard” as a parameter to the task (which is not mentioned as a valid value in the docs), I get the error:

```
TASK [udev : Copy local udev rules] ****************************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "src must be an absolute path"}
```

This is embarrassing, to be honest – it’s basic stuff!
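The warning is the clue: the task above uses the file module, whose src only applies when creating links, while the documentation quoted earlier describes the copy module’s src. A sketch of the same task using copy instead (the mode value is an assumption; inside a role, a relative src is looked up under the role’s files/ directory):

```yaml
- name: "Copy local udev rules"
  copy:
    src: "udev_rules/"          # trailing slash: copy the directory's contents, not the directory
    dest: "/etc/udev/rules.d/"
    owner: "root"
    group: "root"
    mode: "0644"
```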