Is there a reasonable chance of a well-funded agent obtaining raw traffic over Tor circuits

If an agent has a few middle Tor relays (Am) and a few exit Tor relays (Ae), could they obtain the original traffic of some of the circuits with a reasonable probability?

Let’s assume, without too much loss of generality, that Tor only uses middle-middle-exit circuits and that there are M middle relays and E exit relays.

The probability that such a circuit consists only of nodes this agent controls is then:

P = Am/M * (Am - 1)/(M - 1) * Ae/E 

According to Tor Metrics, there are just short of 7000 relays in total, with almost 2000 of them being exit relays. I will round these figures to 7000 − 2000 = 5000 middle relays and 2000 exit relays.

Assuming the attacker owns 10 middle relays and 10 exit relays, the probability that they control the whole circuit is

P = 10/5000 * 9/4999 * 10/2000 ~= 1.8e-8 

which is very low. However, once you factor in the enormous number of Tor circuits being established (I could not find a reliable figure anywhere; I will gladly edit one in if someone has it), wouldn’t this agent be able to consistently get complete circuits through their relays and, as a consequence, have complete access to the data those circuits were relaying?

I understand that some of the data through the circuits would also be using TLS, but at least some of it should be plaintext.

It may also be worth pointing out that if this is a really well-funded agent, they might have substantially more than 20 relays at their disposal.
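Here is a minimal sketch of the arithmetic, using the rounded relay counts above. The circuits-per-day constant is a made-up placeholder rather than a real Tor Metrics figure; it is only there to show how the expected number of fully compromised circuits would scale with circuit volume and with the attacker’s relay count.

```python
# Sketch of the back-of-the-envelope calculation above. Relay counts are the
# rounded figures from the question; CIRCUITS_PER_DAY is a hypothetical
# placeholder (not a real Tor statistic) used only to illustrate the scaling.

M, E = 5000, 2000  # middle and exit relays in the network (rounded)

def p_full_compromise(am, ae, m=M, e=E):
    """Probability that a middle-middle-exit circuit uses only attacker relays."""
    return (am / m) * ((am - 1) / (m - 1)) * (ae / e)

CIRCUITS_PER_DAY = 1_000_000  # purely illustrative assumption

for am, ae in [(10, 10), (50, 50), (200, 200)]:
    p = p_full_compromise(am, ae)
    print(f"{am} middles, {ae} exits: P = {p:.2e}, "
          f"expected fully compromised circuits/day = {p * CIRCUITS_PER_DAY:.1f}")
```

Even a tiny per-circuit probability turns into a non-trivial expected count once multiplied by a large enough circuit volume, which is exactly the concern raised above.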

Counting circuits with constraints

Please forgive me if this question is trivial; I couldn’t come up with an answer (nor find one elsewhere).

In order to show that there are Boolean functions $ f : \{0,1\}^n \rightarrow \{0,1\}$ which can only be computed by circuits of size $ \Omega(2^n/n)$ , we use a counting argument: there are at most $ O(2^{k \log k})$ circuits of size $ k$ , and $ 2^{2^n}$ such functions.

Suppose that I am interested in counting circuits of size $ k$ that compute different functions. The "simple" counting argument won’t work since it may be possible that two "syntactically" different circuits actually compute the same function. In other words, I want to bound the size of the set: $$ F = \{ f: \{0,1\}^n \rightarrow \{0,1\} \mid f \text{ can be computed using a circuit of size } k \} $$

Then $ |F| < $ the number of circuits of size $ k$ (since any circuit computes one function), but how can I bound $ |F|$ from below? (i.e. $ x <|F|$ )

Circuits and formulas for Clique

Is it correct to say that the Clique Problem is in $ P$ iff there exists a family of Boolean circuits $ C$ to decide Clique whose sizes are bounded by a polynomial? And based on this question, does that imply that there exists an equivalent set of Boolean formulas $ F$ to decide Clique whose sizes are bounded by a polynomial? And if there is such an $ F$ , would there be correct derivations based on propositional logic axioms from any member of $ F$ to the corresponding large naive formula for Clique?

Books for learning about digital logic, circuits, logic design, etc.

I am a computer science student and I have some courses named “Fundamentals of Electronics and Digital Systems”, “Logic Design and Switching Circuits”, and “System Analysis and Design”. I searched for books that might help me with these courses and found one named “Digital Logic and Computer Design” by Mano. I was wondering if anyone could suggest some more books that will help me master these topics. Thanks!

Making complex boolean circuits that give true as output only for a specific combination of boolean inputs

This is my first question on a Stack Exchange website, so please bear with me. I am making challenges for a Jeopardy-style capture-the-flag event at my college, and I came across the minetest challenge in the hardware section of last year’s Google CTF qualifier. A clean and organized solution to this problem has been provided by LiveOverflow.

I would like to design a simpler version of this problem for my college’s CTF event, but I am unable to design a complex circuit that gives a true output only for a specific combination of inputs. I know that a circuit with this functionality is not very difficult to implement and just needs to represent the following logic:

trueinput1 AND trueinput2 AND ... AND NOT falseinput1 AND NOT falseinput2 AND ...

However, I want it to be vast and complicated, so that participants cannot decode its functionality just by visual analysis. Is there any technique to complicate the Boolean logic above and to design a corresponding circuit that looks ugly even for a small number of inputs (32/64)?
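For what it’s worth, here is one possible sketch under my own assumptions (this is not the Google CTF / LiveOverflow construction): XOR-mask the key so it never appears as a plain pattern of NOT gates, then AND-reduce the per-bit comparisons with a shuffled tree built only from NAND gates. All function and wire names below are made up for this example.

```python
import random

def make_checker(key_bits, seed=0):
    """Build a gate list that outputs 1 only when the inputs equal key_bits."""
    rng = random.Random(seed)
    n = len(key_bits)
    gates = []                       # each gate: (output_wire, op, in1, in2)
    mask = [rng.randint(0, 1) for _ in range(n)]

    # Per-bit equality test, XOR-masked so the key does not show up as a
    # plain pattern of inverters on the inputs.
    match = []
    for i in range(n):
        gates.append((f"m{i}", "XOR", f"x{i}", f"const{mask[i]}"))
        gates.append((f"eq{i}", "XNOR", f"m{i}", f"const{key_bits[i] ^ mask[i]}"))
        match.append(f"eq{i}")

    # AND-reduce the per-bit signals with a shuffled tree built only from
    # NAND gates, using AND(a, b) = NAND(NAND(a, b), NAND(a, b)).
    rng.shuffle(match)
    while len(match) > 1:
        a, b = match.pop(), match.pop()
        t = f"t{len(gates)}"
        gates.append((t, "NAND", a, b))
        out = f"o{len(gates)}"
        gates.append((out, "NAND", t, t))
        match.append(out)
    return gates, match[0]

def evaluate(gates, out_wire, inputs):
    """Evaluate a gate list on a dict mapping input wires to 0/1 values."""
    val = dict(inputs, const0=0, const1=1)
    ops = {
        "XOR":  lambda a, b: a ^ b,
        "XNOR": lambda a, b: 1 - (a ^ b),
        "NAND": lambda a, b: 1 - (a & b),
    }
    for name, op, a, b in gates:
        val[name] = ops[op](val[a], val[b])
    return val[out_wire]

if __name__ == "__main__":
    key = [1, 0, 1, 1, 0, 0, 1, 0]
    gates, out = make_checker(key, seed=1)
    right = {f"x{i}": b for i, b in enumerate(key)}
    wrong = dict(right, x3=1 - key[3])
    print(len(gates), evaluate(gates, out, right), evaluate(gates, out, wrong))
```

This only raises the bar against visual inspection; a determined solver can still recover the key by simulating the circuit or handing it to a SAT solver, so for a real challenge you would additionally fold in redundant gates whose effects cancel out, or push the XOR/XNOR gates through NAND identities to make the netlist even harder to eyeball.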

Making asynchronous sequential circuits hazard free

The problem I am stuck on requires me to design a hazard-free asynchronous sequential circuit from a given problem description. I have followed the routine steps as follows:

  • I have obtained the primitive flow table from the problem description
  • I have reduced the flow table using state minimisation routines for incompletely specified FSMs
  • I have assigned the output symbols so as to prevent glitches
  • I have done the state assignment of the reduced flow table

Let us assume that I require three secondary variables $ y_1, y_2, y_3 $ for the state encoding.

Now I have the resulting flow table. I am stuck on how to proceed with hazard checking. I know the general procedure for checking and removing static and dynamic hazards, given a function and some transitions.

In this problem,

  • For static hazards, I guess I should check the K-map for each of $ y_1, y_2, y_3 $ and thereby check for adjacent 1s. Am I right?
  • For dynamic hazards, I really have no clue how to proceed. Descriptive answers would be very helpful.

Thank you in advance for all your answers.
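For the static-hazard part, here is a small, generic sketch (my own illustration, not specific to the flow table in question): it takes the sum-of-products cover you would derive for each excitation variable $ y_i$ and reports adjacent 1-points that are not covered by a single common product term, which is exactly the K-map check described above. The function names are invented for this sketch.

```python
from itertools import product

def covers(term, assignment):
    """True if the product term (dict var -> required value) is 1 under assignment."""
    return all(assignment[v] == val for v, val in term.items())

def static1_hazards(variables, cover):
    """Adjacent pairs of 1-points of the SOP cover not shared by any single term."""
    hazards = []
    for bits in product((0, 1), repeat=len(variables)):
        a = dict(zip(variables, bits))
        if not any(covers(t, a) for t in cover):
            continue                              # a is a 0-point
        for v in variables:
            if a[v] == 1:
                continue                          # look at each pair once (v: 0 -> 1)
            b = dict(a, **{v: 1})
            if not any(covers(t, b) for t in cover):
                continue                          # neighbour is a 0-point
            if not any(covers(t, a) and covers(t, b) for t in cover):
                hazards.append((a, b, v))         # no single term spans the pair
    return hazards

if __name__ == "__main__":
    # Classic example: F = x*y + x'*z has a static-1 hazard when y = z = 1
    # and x toggles, because no single product term covers both sides.
    variables = ["x", "y", "z"]
    cover = [{"x": 1, "y": 1}, {"x": 0, "z": 1}]
    for a, b, v in static1_hazards(variables, cover):
        print(f"static-1 hazard when {v} toggles: {a} <-> {b}")
    # Adding the consensus term y*z, i.e. {"y": 1, "z": 1}, removes it.
```

If I remember the standard result correctly, a two-level AND-OR realization that is free of static-1 hazards for single-input changes is also free of dynamic hazards for such changes, which is why the usual fix is simply to add the missing consensus terms; I would double-check that in Unger or McCluskey before relying on it.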

Will learning about integrated circuits help me be a better computer architect (long-term)?

I do not know if this is the right place to ask this type of question, but here it goes: I am thinking about learning about integrated circuits as part of learning more about computer hardware in general (with a focus on its architecture). Would I be wasting my time learning about integrated circuits? (I am planning to read the following book: “Analysis and Design of Analog Integrated Circuits”, 5th edition.) Thanks for the answers.

Lower bound on number of (different) circuits of given size?

For circuits with $ n$ input bits, we know that, for any size bound $ s$ , there are at most $ O(s(n)^{s(n)}) = O(2^{s(n) \log s(n)})$ circuits of size at most $ s(n)$ (for a fixed circuit encoding).

Say two circuits $ C_1$ and $ C_2$ are different if the function they compute is different, that is, there is an $ n$ -bit string $ x$ such that $ C_1(x) \neq C_2(x)$ . The $ O(s(n)^{s(n)})$ bound above is an upper bound on the number of circuits of a given size. Is there a known lower bound on the number of different circuits with size at most $ s(n)$ ?

Clearly such a bound must be strictly smaller than the $ O(s(n)^{s(n)})$ bound, since there are pairs of circuits with different structures (and even different numbers of gates) that nevertheless compute the same function (i.e., they are not “different” as defined above). But how much smaller can it be?
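Purely as an illustration (not a proof, and with all names made up for this sketch), here is a tiny brute-force experiment that enumerates every circuit with at most $ k$ two-input AND/OR/NAND gates over $ n$ inputs and counts how many distinct functions they compute; the syntactic count explodes in the spirit of the bound above, while the number of distinct truth tables is of course capped at $ 2^{2^n}$ .

```python
from itertools import product

# Each gate is (op, i, j): it applies op to wires i and j, where wires
# 0..n-1 are the inputs and wire n+g is the output of gate g. The circuit's
# output is its last gate.
OPS = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NAND": lambda a, b: 1 - (a & b),
}

def truth_table(circuit, n):
    """Truth table (as a tuple over all 2^n inputs) of the last gate."""
    table = []
    for bits in product((0, 1), repeat=n):
        vals = list(bits)
        for op, i, j in circuit:
            vals.append(OPS[op](vals[i], vals[j]))
        table.append(vals[-1])
    return tuple(table)

def count(n, k):
    """Count circuits with 1..k gates, and the distinct functions they compute."""
    syntactic, functions = 0, set()
    def extend(circuit):
        nonlocal syntactic
        if circuit:
            syntactic += 1
            functions.add(truth_table(circuit, n))
        if len(circuit) == k:
            return
        fan_in = n + len(circuit)        # wires available to the next gate
        for op in OPS:
            for i in range(fan_in):
                for j in range(fan_in):
                    extend(circuit + [(op, i, j)])
    extend([])
    return syntactic, len(functions)

if __name__ == "__main__":
    for k in range(1, 4):
        total, distinct = count(n=2, k=k)
        print(f"k={k}: {total} circuits, {distinct} distinct functions")
```

Even at $ n = 2$ the gap is visible with this gate basis: the 12 one-gate circuits compute only 7 distinct functions.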

Names of circuits that comprise memory

I just had a quick question about memory, and I was looking to get some insight. I’m a student, and one of our homework questions was “name two circuits aside from memory cells that RAM must contain in order to complete fetch and store operations”. I answered the MAR/MDR, which was counted as correct, with a note that technically this isn’t what was asked, as those are both registers.

Would anyone be able to provide some insight into what the actual circuits are that allow RAM to perform these operations? Thanks!