Minimum cost of “signal” cover in a tree with DP

I’m given a (not necessarily binary) tree. Every node can be assigned a signal with range $i$, which reaches all nodes at most $i$ edges away. The cost of a signal is given by a function $f(n, i)$, where $n$ is the node and $i$ is the signal strength. The cost may vary from node to node; the only assumption one can make is that $f(n, i) \geq f(n, j)$ for $i > j$, i.e. a stronger signal is never cheaper.
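To pin down the definitions, here is a minimal sketch (Python, just for illustration; the tree representation and the helper names `covered_within`, `total_cost` and `covers_tree` are my own, not part of the assignment) of what I mean by coverage and by the cost of an assignment of signal strengths:

```python
from collections import deque

def covered_within(tree, node, radius):
    """All nodes reachable from `node` using at most `radius` edges (BFS)."""
    seen = {node}
    frontier = deque([(node, 0)])
    while frontier:
        v, d = frontier.popleft()
        if d == radius:
            continue
        for w in tree[v]:
            if w not in seen:
                seen.add(w)
                frontier.append((w, d + 1))
    return seen

def total_cost(strengths, f):
    """Cost of an assignment {node: strength}; only assigned nodes emit a signal."""
    return sum(f(n, i) for n, i in strengths.items())

def covers_tree(tree, strengths):
    """True if every node is within range of at least one assigned signal."""
    covered = set()
    for n, i in strengths.items():
        covered |= covered_within(tree, n, i)
    return covered == set(tree)
```

Here `tree` is an adjacency list (`dict` mapping each node to its neighbours), and the goal is the cheapest assignment `strengths` for which `covers_tree` holds.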

I need to find the minimum cost to cover the whole tree.

Example:

[figure: example tree]

For $f(n, i) = (i + 1)^2$, the minimum cost would be 7:

Setting a signal with strength 0 on every node covers the whole tree at a cost of 7 (seven nodes, each costing $1$). Setting a signal with strength 1 on the nodes $b$ and $c$ covers the tree at a cost of 8, and setting a signal with strength 2 on node $a$ results in a cost of 9.
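As a sanity check, the snippet below verifies these three costs on one tree shape that is consistent with the numbers above (a root $a$ with children $b$ and $c$, each of which has two leaf children; the actual figure may differ). It reuses the helpers from the earlier sketch:

```python
# Assumed example tree: a - b, a - c, b - d, b - e, c - f, c - g.
tree = {
    "a": ["b", "c"],
    "b": ["a", "d", "e"],
    "c": ["a", "f", "g"],
    "d": ["b"], "e": ["b"], "f": ["c"], "g": ["c"],
}
f = lambda n, i: (i + 1) ** 2

for strengths in ({n: 0 for n in tree},   # strength 0 on every node
                  {"b": 1, "c": 1},       # strength 1 on b and c
                  {"a": 2}):              # strength 2 on a
    assert covers_tree(tree, strengths)
    print(strengths, "->", total_cost(strengths, f))
# costs printed: 7, 8 and 9
```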

Using dynamic programming, this task should be solvable in $O(n^2)$. Since this is an assignment, I’d be grateful for tips.