Why do we need to take the derivative of the activation function in backpropagation?

I was reading this article here: https://towardsdatascience.com/how-does-back-propagation-in-artificial-neural-networks-work-c7cad873ea7.

When he gets to the part where he calculates the loss at every node, he says to use the following formula:

$$\delta_0 = w \cdot \delta_1 \cdot f'(z),$$

“where the values $\delta_0$, $w$ and $f'(z)$ are those of the same unit, while $\delta_1$ is the loss of the unit on the other side of the weighted link.”

And $f$ is the activation function.

He then says:

“You can think of it this way: in order to get the loss of a node (e.g. $z_0$), we multiply the value of its corresponding $f'(z)$ by the loss of the node it is connected to in the next layer ($\delta_1$), and by the weight of the link connecting both nodes.”
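
To convince myself that this formula is just the chain rule in disguise, I wrote a tiny numeric check on a single-path network. The sigmoid activation, the weight/bias/target values and the squared-error loss below are my own choices for illustration (they are not from the article), and I am assuming that a node’s delta means the derivative of the loss with respect to its pre-activation $z$:

```python
import math

# Single path: z0 -> a0 = f(z0) -> z1 = w*a0 + b -> a1 = f(z1) -> loss.
# Check that delta_0 = w * delta_1 * f'(z0) matches a finite-difference
# derivative of the loss with respect to z0.

def f(z):            # activation function (sigmoid, assumed for illustration)
    return 1.0 / (1.0 + math.exp(-z))

def f_prime(z):      # derivative of the sigmoid
    s = f(z)
    return s * (1.0 - s)

w, b, y = 0.7, -0.2, 1.0   # arbitrary weight, bias and target

def loss(z0):
    a0 = f(z0)             # output of the first node
    z1 = w * a0 + b        # pre-activation of the next node
    a1 = f(z1)
    return 0.5 * (a1 - y) ** 2

z0 = 0.3
z1 = w * f(z0) + b
a1 = f(z1)

delta_1 = (a1 - y) * f_prime(z1)      # loss (delta) at the next node
delta_0 = w * delta_1 * f_prime(z0)   # the formula from the article

# Numerical derivative of the loss with respect to z0
eps = 1e-6
numeric = (loss(z0 + eps) - loss(z0 - eps)) / (2 * eps)

print(delta_0, numeric)  # the two values agree to ~6 decimal places
```

The two printed numbers agree, and if I swap the sigmoid for the identity ($f(z) = z$, $f'(z) = 1$) they still do, with the $f'(z)$ factor contributing nothing.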

However, he doesn’t actually explain why we need the derivative term. Where does that term come from and why do we need it?

My idea so far is this:

The fact that the identity activation function makes the term disappear is a hint: the derivative of the identity is 1. A node doesn’t feed into the next one exactly as is; what it passes forward depends on the activation function. When the activation is the identity, the loss just propagates between the two nodes scaled only by the weight. So the activation function has to be factored in somehow, specifically in a way that does nothing when it is the identity, and the derivative is of course one way to do that.

The issue is that this isn’t very rigorous, so I’m looking for a slightly more detailed explanation.
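
My best guess at the missing step is that the formula is literally the chain rule. Assuming $\delta$ denotes the derivative of the loss $L$ with respect to a node’s pre-activation $z$, and assuming a single weighted link between the two nodes, so that $z_1 = w\,f(z_0) + b$:

$$\delta_0 = \frac{\partial L}{\partial z_0} = \frac{\partial L}{\partial z_1}\,\frac{\partial z_1}{\partial z_0} = \delta_1\,\frac{\partial}{\partial z_0}\bigl(w\,f(z_0) + b\bigr) = w\,\delta_1\,f'(z_0).$$

So the $f'(z)$ term would just be the inner derivative from differentiating through the activation, and it equals 1 when $f$ is the identity. Is that the right way to see it, and how does it generalize when a node feeds several nodes in the next layer?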

Why are my plastic credit card and activation code sent separately?

Capital One recently sent my plastic credit card by paper mail and its activation code by a separate paper mail. What security problem does this mitigate? If a rogue element has access to my mailbox or home, they will have both the plastic card and the activation code. The only thing I can think of is that they are preventing rogue elements on their side from having access to both pieces of information at the same time. Or is it something else?

Can’t launch MATLAB after activation on Ubuntu 18.04, licensing error -8,523

wided@wided:/usr/local/MATLAB/MATLAB_Production_Server/R2015a/bin$ sudo ./matlab
License checkout failed.
License Manager Error -8
Make sure the HostID of the license file matches this machine, and that the HostID on the SERVER line matches the HostID of the license file.

Troubleshoot this issue by visiting: http://www.mathworks.com/support/lme/R2015a/8

Diagnostic Information:
Feature: MATLAB
License path: /home/wided/.matlab/R2015a_licenses:/usr/local/MATLAB/MATLAB_Production_Server/R2015a/licenses/license.dat:/usr/local/MATLAB/MATLAB_Production_Server/R2015a/licenses/license_wided_161052_R2015a.lic
Licensing error: -8,523.

wided@wided:/usr/local/MATLAB/MATLAB_Production_Server/R2015a/bin$

Activation of network connection failed – hotspot

Fresh install of Ubuntu 19.04, running alongside an installation of Windows 10 (on different drives). I have an ethernet cable coming into my onboard port, which I use for my main connection. I also have the ANEWKODI wireless adapter plugged in. This device does not support Linux natively, so I installed the drivers I am using by following this guide.

I am trying to set up the hotspot feature that Ubuntu offers, but it is failing. I go into Settings, navigate to “Wi-Fi,” click the three dots, and click “Turn On Wi-Fi Hotspot.” It brings up the following screen temporarily…

[screenshot: the Wi-Fi hotspot setup dialog]

However, after 15-30 seconds, the following error pops up on the top of the screen, and the hotspot turns off.

[screenshot: “Activation of network connection failed” error notification]

Running lshw -c network yields this.

What I’ve tried so far:

I’ve followed the steps listed here, which recommended manually starting the network-manager process; here, which said to disable fast boot in Windows; and here, which suggested reinstalling network-manager. None of them worked.